Maccia, Elizabeth S.; and others
An annotated bibliography of 20 items and a discussion of its significance were presented to describe current utilization of subject theories in the construction of an educational theory. Also, a theory model was used to demonstrate construction of a scientific educational theory. The theory model incorporated set theory (S), information theory…
Implications of Information Theory for Computational Modeling of Schizophrenia.
Silverstein, Steven M; Wibral, Michael; Phillips, William A
2017-10-01
Information theory provides a formal framework within which information processing and its disorders can be described. However, information theory has rarely been applied to modeling aspects of the cognitive neuroscience of schizophrenia. The goal of this article is to highlight the benefits of an approach based on information theory, including its recent extensions, for understanding several disrupted neural goal functions as well as related cognitive and symptomatic phenomena in schizophrenia. We begin by demonstrating that foundational concepts from information theory (such as Shannon information, entropy, data compression, block coding, and strategies to increase the signal-to-noise ratio) can be used to provide novel understandings of cognitive impairments in schizophrenia and metrics to evaluate their integrity. We then describe more recent developments in information theory, including the concepts of infomax, coherent infomax, and coding with synergy, to demonstrate how these can be used to develop computational models of schizophrenia-related failures in the tuning of sensory neurons, gain control, perceptual organization, thought organization, selective attention, context processing, predictive coding, and cognitive control. Throughout, we demonstrate how disordered mechanisms may explain both perceptual/cognitive changes and symptom emergence in schizophrenia. Finally, we demonstrate that there is consistency between some information-theoretic concepts and recent discoveries in neurobiology, especially involving the existence of distinct sites for the accumulation of driving input and contextual information prior to their interaction. This convergence can be used to guide future theory, experiment, and treatment development.
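As a concrete, hedged illustration of the foundational quantities this abstract names, the sketch below computes Shannon entropy in bits; the probability distributions are invented for the example, not taken from the article.

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum_i p_i * log2(p_i), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform 4-symbol source is maximally uncertain (2 bits/symbol);
# a skewed source is more predictable, hence more compressible.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
print(shannon_entropy([0.70, 0.10, 0.10, 0.10]))  # ~1.357
```

Lower entropy means a more predictable, more compressible signal, which is the sense in which such metrics can serve as integrity measures for information processing.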
Modeling Routinization in Games: An Information Theory Approach
DEFF Research Database (Denmark)
Wallner, Simon; Pichlmair, Martin; Hecher, Michael
2015-01-01
Routinization is the result of practicing until an action stops being a goal-directed process. This paper formulates a definition of routinization in games based on prior research in the fields of activity theory and practice theory. Routinization is analyzed using the formal model of discrete-time, discrete-space Markov chains and information theory to measure the actual error between the dynamically trained models and the player interaction. Preliminary research supports the hypothesis that Markov chains can be effectively used to model routinization in games. A full study design is presented…
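A minimal sketch of the approach this abstract describes, under assumed inputs (the two-action alphabet and sequences are hypothetical): train a first-order discrete-time Markov chain on a player's action sequence, then measure the model's average per-step surprisal, which drops toward zero as behaviour becomes routinized.

```python
import math
from collections import defaultdict

def train_markov(seq):
    """First-order Markov chain: transition counts -> probabilities."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(seq, seq[1:]):
        counts[a][b] += 1
    return {s: {t: c / sum(row.values()) for t, c in row.items()}
            for s, row in counts.items()}

def mean_surprisal(model, seq):
    """Average -log2 P(next | current), in bits.

    Low values mean the trained model predicts the player well,
    i.e. the interaction looks routinized."""
    eps = 1e-12  # floor for unseen transitions
    terms = [-math.log2(model.get(a, {}).get(b, eps))
             for a, b in zip(seq, seq[1:])]
    return sum(terms) / len(terms)

routine = list("ABABABABAB")          # perfectly repetitive play
model = train_markov(routine)
print(mean_surprisal(model, routine))  # 0.0: fully predictable
```

Evaluating the same model on a sequence that breaks the routine yields a much higher surprisal, which is one way to quantify the "error between the dynamically trained models and the player interaction".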
Automated Physico-Chemical Cell Model Development through Information Theory
Energy Technology Data Exchange (ETDEWEB)
Peter J. Ortoleva
2005-11-29
The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple module workflow we have implemented that consists of a set of interoperable systems biology computational modules.
An information theory-based approach to modeling the information processing of NPP operators
International Nuclear Information System (INIS)
Kim, Jong Hyun; Seong, Poong Hyun
2002-01-01
This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task. The focus will be on i) developing a model for information processing of NPP operators and ii) quantifying the model. To resolve the problems of previous approaches based on information theory, i.e. the problems of single-channel approaches, we first develop an information processing model having multiple stages, which contains information flows. Then the uncertainty of the information is quantified using Conant's model, a kind of information theory
Evaluating hydrological model performance using information theory-based metrics
Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...
Should the model for risk-informed regulation be game theory rather than decision theory?
Bier, Vicki M; Lin, Shi-Woei
2013-02-01
deception), to identify optimal regulatory strategies. Therefore, we believe that the types of regulatory interactions analyzed in this article are better modeled using game theory rather than decision theory. In particular, the goals of this article are to review the relevant literature in game theory and regulatory economics (to stimulate interest in this area among risk analysts), and to present illustrative results showing how the application of game theory can provide useful insights into the theory and practice of risk-informed regulation. © 2012 Society for Risk Analysis.
Quantum biological information theory
Djordjevic, Ivan B
2016-01-01
This book is a self-contained, tutorial-based introduction to quantum information theory and quantum biology. It serves as a single-source reference to the topic for researchers in bioengineering, communications engineering, electrical engineering, applied mathematics, biology, computer science, and physics. The book provides all the essential principles of the quantum biological information theory required to describe the quantum information transfer from DNA to proteins, the sources of genetic noise and genetic errors as well as their effects. Integrates quantum information and quantum biology concepts; Assumes only knowledge of basic concepts of vector algebra at undergraduate level; Provides a thorough introduction to basic concepts of quantum information processing, quantum information theory, and quantum biology; Includes in-depth discussion of the quantum biological channel modelling, quantum biological channel capacity calculation, quantum models of aging, quantum models of evolution, quantum models o...
An information theory model for dissipation in open quantum systems
Rogers, David M.
2017-08-01
This work presents a general model for open quantum systems using an information game along the lines of Jaynes’ original work. It is shown how an energy based reweighting of propagators provides a novel moment generating function at each time point in the process. Derivatives of the generating function give moments of the time derivatives of observables. Aside from the mathematically helpful properties, the ansatz reproduces key physics of stochastic quantum processes. At high temperature, the average density matrix follows the Caldeira-Leggett equation. Its associated Langevin equation clearly demonstrates the emergence of dissipation and decoherence time scales, as well as an additional diffusion due to quantum confinement. A consistent interpretation of these results is that decoherence and wavefunction collapse during measurement are directly related to the degree of environmental noise, and thus occur because of subjective uncertainty of an observer.
Stamovlasis, Dimitrios; Tsaparlis, Georgios
2012-01-01
In this study, we test an information-processing model (IPM) of problem solving in science education, namely the working memory overload model, by applying catastrophe theory. Changes in students' achievement were modeled as discontinuities within a cusp catastrophe model, where working memory capacity was implemented as asymmetry and the degree…
Sheldon, Lisa Kennedy; Ellington, Lee
2008-11-01
This paper is a report of a study to assess the applicability of a theoretical model of social information processing in expanding a nursing theory addressing how nurses respond to patients. Nursing communication affects patient outcomes such as anxiety, adherence to treatments and satisfaction with care. Orlando's theory of nursing process describes nurses' reactions to patients' behaviour as generating a perception, thought and feeling in the nurse and then action by the nurse. A model of social information processing describes the sequential steps in the cognitive processes used to respond to social cues and may be useful in describing the nursing process. Cognitive interviews were conducted in 2006 with a convenience sample of 5 nurses in the United States of America. The data were interpreted using the Crick and Dodge model of social information processing. Themes arising from cognitive interviews validated concepts of the nursing theory and the constructs of the model of social information processing. The interviews revealed that the support of peers was an additional construct involved in the development of communication skills, creation of a database and enhancement of self-efficacy. Models of social information processing enhance understanding of the process of how nurses respond to patients and further develop nursing theories. In combination, the theories are useful in developing research into nurse-patient communication. Future research based on the expansion of nursing theory may identify effective and culturally appropriate nurse response patterns to specific patient interactions with implications for nursing care and patient outcomes.
Cognition to Collaboration: User-Centric Approach and Information Behaviour Theories/Models
Directory of Open Access Journals (Sweden)
Alperen M Aydin
2016-12-01
Aim/Purpose: The objective of this paper is to review the vast literature of user-centric information science and inform about the emerging themes in information behaviour science. Background: The paradigmatic shift from the system-centric to the user-centric approach facilitates research on cognitive and individual information processing. Various information behaviour theories/models emerged. Methodology: Recent information behaviour theories and models are presented. Features, strengths and weaknesses of the models are discussed through the analysis of the information behaviour literature. Contribution: This paper sheds light onto the weaknesses in earlier information behaviour models and stresses (and advocates) the need for research on social information behaviour. Findings: Prominent information behaviour models deal with individual information behaviour. People live in a social world and sort out most of their daily or work problems in groups. However, only seven papers discuss social information behaviour (Scopus search). Recommendations for Practitioners: ICT tools used for inter-organisational sharing should be redesigned for effective information-sharing during disaster/emergency times. Recommendation for Researchers: There are scarce sources on the social side of information behaviour; however, most work tasks are carried out in groups/teams. Impact on Society: In dynamic work contexts like disaster management and health care settings, collaborative information-sharing may result in decreasing the losses. Future Research: A fieldwork will be conducted in the disaster management context investigating inter-organisational information-sharing.
Chang, CC
2012-01-01
Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Sko
Making a difference: incorporating theories of autonomy into models of informed consent.
Delany, C
2008-09-01
Obtaining patients' informed consent is an ethical and legal obligation in healthcare practice. Whilst the law provides prescriptive rules and guidelines, ethical theories of autonomy provide moral foundations. Models of practice of consent have been developed in the bioethical literature to assist in understanding and integrating the ethical theory of autonomy and legal obligations into the clinical process of obtaining a patient's informed consent to treatment. To review four models of consent and analyse the way each model incorporates the ethical meaning of autonomy and how, as a consequence, they might change the actual communicative process of obtaining informed consent within clinical contexts. An iceberg framework of consent is used to conceptualise how ethical theories of autonomy are positioned and underpin the above surface, and visible clinical communication, including associated legal guidelines and ethical rules. Each model of consent is critically reviewed from the perspective of how it might shape the process of informed consent. All four models would alter the process of obtaining consent. Two models provide structure and guidelines for the content and timing of obtaining patients' consent. The two other models rely on an attitudinal shift in clinicians. They provide ideas for consent by focusing on underlying values, attitudes and meaning associated with the ethical meaning of autonomy. The paper concludes that models of practice that explicitly incorporate the underlying ethical meaning of autonomy as their basis provide less prescriptive, but more theoretically rich, guidance for healthcare communicative practices.
Hodges, Wilfrid
1993-01-01
An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.
The Modeling and Complexity of Dynamical Systems by Means of Computation and Information Theories
Directory of Open Access Journals (Sweden)
Robert Logozar
2011-12-01
We present the modeling of dynamical systems and the finding of their complexity indicators by the use of concepts from computation and information theories, within the framework of J. P. Crutchfield's theory of ε-machines. A short formal outline of the ε-machines is given. In this approach, dynamical systems are analyzed directly from the time series that is received from a properly adjusted measuring instrument. The binary strings are parsed through the parse tree, within which morphologically and probabilistically unique subtrees or morphs are recognized as system states. The outline and precise interrelation of the information-theoretic entropies and complexities emanating from the model is given. The paper also serves as a theoretical foundation for the future presentation of the DSA program that implements the ε-machines modeling up to the stochastic finite automata level.
Crupi, Vincenzo; Nelson, Jonathan D; Meder, Björn; Cevolani, Gustavo; Tentori, Katya
2018-06-17
Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics (Hartley, Quadratic, Tsallis, Rényi, and more) are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma-Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism. Copyright © 2018 Cognitive Science Society, Inc.
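A hedged sketch of the Sharma-Mittal framework named in this abstract, using one common parametrization by order r and degree t (the article's exact notation may differ); Shannon, Rényi, and Tsallis entropies fall out as special cases.

```python
import math

def sharma_mittal(probs, order, degree, tol=1e-9):
    """Sharma-Mittal entropy of order r and degree t, in nats.

    Special cases: (r, t) -> (1, 1) gives Shannon; t = 1 gives
    Rényi of order r; t = r gives Tsallis of order r."""
    probs = [p for p in probs if p > 0]
    if abs(order - 1) < tol and abs(degree - 1) < tol:
        return -sum(p * math.log(p) for p in probs)        # Shannon
    if abs(order - 1) < tol:
        # limit r -> 1 with t != 1: (e^{(1-t) H_Shannon} - 1) / (1-t)
        h = -sum(p * math.log(p) for p in probs)
        return (math.exp((1 - degree) * h) - 1) / (1 - degree)
    s = sum(p ** order for p in probs)
    if abs(degree - 1) < tol:
        return math.log(s) / (1 - order)                   # Rényi
    return (s ** ((1 - degree) / (1 - order)) - 1) / (1 - degree)

p = [0.5, 0.5]
print(sharma_mittal(p, 1, 1))   # Shannon: ln 2 ≈ 0.693
print(sharma_mittal(p, 2, 1))   # Rényi order 2: also ln 2 for uniform p
print(sharma_mittal(p, 2, 2))   # Tsallis order 2: 0.5
```

Varying (r, t) over a grid is one way to explore, as the authors do, how different entropy models rank the same candidate queries or diagnostic tests.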
Science and information theory
Brillouin, Léon
1962-01-01
A classic source for exploring the connections between information theory and physics, this text is geared toward upper-level undergraduates and graduate students. The author, a giant of 20th-century mathematics, applies the principles of information theory to a variety of issues, including Maxwell's demon, thermodynamics, and measurement problems. 1962 edition.
Energy Technology Data Exchange (ETDEWEB)
Ridolfi, E.; Napolitano, F., E-mail: francesco.napolitano@uniroma1.it [Sapienza Università di Roma, Dipartimento di Ingegneria Civile, Edile e Ambientale (Italy); Alfonso, L. [Hydroinformatics Chair Group, UNESCO-IHE, Delft (Netherlands); Di Baldassarre, G. [Department of Earth Sciences, Program for Air, Water and Landscape Sciences, Uppsala University (Sweden)
2016-06-08
The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers’ cross-sectional spacing.
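The "maximum information content, minimum redundancy" criterion can be sketched as a greedy joint-entropy maximisation over candidate cross sections. The discretized water-level data below are hypothetical, invented for illustration, not taken from the Grosseto River study.

```python
import math
from collections import Counter

def joint_entropy(series_list):
    """Entropy (bits) of the tuple of discretized series."""
    samples = list(zip(*series_list))
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in Counter(samples).values())

def greedy_select(candidates, k):
    """Greedily pick k locations maximising joint entropy: each new
    location adds the most non-redundant information."""
    chosen = []
    while len(chosen) < k:
        best = max((i for i in candidates if i not in chosen),
                   key=lambda i: joint_entropy(
                       [candidates[j] for j in chosen] + [candidates[i]]))
        chosen.append(best)
    return chosen

# Hypothetical discretized water levels at 4 candidate cross sections
# over 6 simulated flood scenarios.
candidates = {
    "xs1": [0, 0, 1, 1, 2, 2],
    "xs2": [0, 0, 1, 1, 2, 2],   # redundant copy of xs1
    "xs3": [0, 1, 0, 1, 0, 1],
    "xs4": [0, 0, 0, 0, 0, 1],
}
print(greedy_select(candidates, 2))  # picks xs1 then xs3, skipping xs2
```

The greedy step never picks the redundant copy xs2, because it adds no joint entropy beyond xs1; this mirrors the paper's idea of an optimal subset with maximum information and minimum redundancy.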
The use of network theory to model disparate ship design information
Directory of Open Access Journals (Sweden)
Rigterink, Douglas; Piks, Rebecca; Singer, David J.
2014-06-01
This paper introduces the use of network theory to model and analyze disparate ship design information. This work will focus on a ship's distributed systems and their intra- and intersystem structures and interactions. The three systems to be analyzed are: a passageway system, an electrical system, and a fire fighting system. These systems will be analyzed individually using common network metrics to glean information regarding their structures and attributes. The systems will also be subjected to community detection algorithms both separately and as a multiplex network to compare their similarities, differences, and interactions. Network theory will be shown to be useful in the early design stage due to its simplicity and ability to model any shipboard system.
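A minimal sketch of the kind of network metrics the abstract mentions, computed from plain edge lists; the node names and edges below are invented, not taken from the paper.

```python
from collections import defaultdict

def degree_and_density(edges):
    """Node degrees and graph density for an undirected edge list."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    n = len(adj)
    degrees = {node: len(nbrs) for node, nbrs in adj.items()}
    m = sum(degrees.values()) // 2          # each edge counted twice
    density = 2 * m / (n * (n - 1)) if n > 1 else 0.0
    return degrees, density

# Hypothetical fragments of two shipboard systems sharing nodes,
# combined here as a crude multiplex view.
passageways = [("bow", "midship"), ("midship", "stern"),
               ("midship", "deckhouse")]
electrical = [("generator", "midship"), ("midship", "stern")]
deg, rho = degree_and_density(passageways + electrical)
print(deg["midship"])  # 4: midship is the structural hub
```

Comparing such metrics per system, and on the combined multiplex, is the simple early-stage analysis the paper argues network theory makes possible.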
Directory of Open Access Journals (Sweden)
J. Florian Wellmann
2013-04-01
The quantification and analysis of uncertainties is important in all cases where maps and models of uncertain properties are the basis for further decisions. Once these uncertainties are identified, the logical next step is to determine how they can be reduced. Information theory provides a framework for the analysis of spatial uncertainties when different subregions are considered as random variables. In the work presented here, joint entropy, conditional entropy, and mutual information are applied for a detailed analysis of spatial uncertainty correlations. The aim is to determine (i) which areas in a spatial analysis share information, and (ii) where, and by how much, additional information would reduce uncertainties. As an illustration, a typical geological example is evaluated: the case of a subsurface layer with uncertain depth, shape and thickness. Mutual information and multivariate conditional entropies are determined based on multiple simulated model realisations. Even for this simple case, the measures not only provide a clear picture of uncertainties and their correlations but also give detailed insights into the potential reduction of uncertainties at each position, given additional information at a different location. The methods are directly applicable to other types of spatial uncertainty evaluations, especially where multiple realisations of a model simulation are analysed. In summary, the application of information theoretic measures opens up the path to a better understanding of spatial uncertainties, and their relationship to information and prior knowledge, for cases where uncertain property distributions are spatially analysed and visualised in maps and models.
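A hedged sketch of the core computation: estimating mutual information between two uncertain locations from multiple model realisations. The indicator data below are hypothetical, not from the geological example in the paper.

```python
import math
from collections import Counter

def entropy(labels):
    """Plug-in Shannon entropy (bits) of a sequence of discrete labels."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from paired samples."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Hypothetical realisations: a lithology indicator at two locations,
# one sample per simulated model run.
loc_a = [0, 0, 1, 1, 0, 1, 0, 1]
loc_b = [0, 0, 1, 1, 0, 1, 0, 1]   # perfectly correlated with loc_a
print(mutual_information(loc_a, loc_b))  # 1.0 bit: knowing A fixes B
```

High mutual information between two locations means a measurement at one would strongly reduce uncertainty at the other, which is exactly the question the paper's spatial analysis answers position by position.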
Dwivedi, Yogesh K; Schneberger, Scott L
2011-01-01
The overall mission of this book is to provide a comprehensive understanding and coverage of the various theories and models used in IS research. Specifically, it aims to focus on the following key objectives: To describe the various theories and models applicable to studying IS/IT management issues. To outline and describe, for each of the various theories and models, independent and dependent constructs, reference discipline/originating area, originating author(s), seminal articles, level of analysis (i.e. firm, individual, industry) and links with other theories. To provide a critical revie
Pettersson, Rune
2014-01-01
Information design has practical and theoretical components. As an academic discipline we may view information design as a combined discipline, a practical theory, or as a theoretical practice. So far information design has incorporated facts, influences, methods, practices, principles, processes, strategies, and tools from a large number of…
Nakajima, Toshiyuki
2015-12-01
Higher animals act in the world using their external reality models to cope with the uncertain environment. Organisms that have not developed such information-processing organs may also have external reality models built in the form of their biochemical, physiological, and behavioral structures, acquired by natural selection through successful models constructed internally. Organisms subject to illusions would fail to survive in the material universe. How can organisms, or living systems in general, determine the external reality from within? This paper starts with a phenomenological model, in which the self constitutes a reality model developed through the mental processing of phenomena. Then, the it-from-bit concept is formalized using a simple mathematical model. For this formalization, my previous work on an algorithmic process is employed to constitute symbols referring to the external reality, called the inverse causality, with additional improvements to the previous work. Finally, as an extension of this model, the cognizers system model is employed to describe the self as one of many material entities in a world, each of which acts as a subject by responding to the surrounding entities. This model is used to propose a conceptual framework of information theory that can deal with both the qualitative (semantic) and quantitative aspects of the information involved in biological processes. Copyright © 2015 Elsevier Ltd. All rights reserved.
Theories of information behavior
Erdelez, Sandra; McKechnie, Lynne
2005-01-01
This unique book presents authoritative overviews of more than 70 conceptual frameworks for understanding how people seek, manage, share, and use information in different contexts. A practical and readable reference to both well-established and newly proposed theories of information behavior, the book includes contributions from 85 scholars from 10 countries. Each theory description covers origins, propositions, methodological implications, usage, links to related conceptual frameworks, and listings of authoritative primary and secondary references. The introductory chapters explain key concepts, theory–method connections, and the process of theory development.
Dynamic statistical information theory
Institute of Scientific and Technical Information of China (English)
(no author listed)
2006-01-01
In recent years we extended Shannon static statistical information theory to dynamic processes and established a Shannon dynamic statistical information theory, whose core is the evolution law of dynamic entropy and dynamic information. We also proposed a corresponding Boltzmann dynamic statistical information theory. Based on the fact that the state variable evolution equations of the respective dynamic systems, i.e. the Fokker-Planck equation and the Liouville diffusion equation, can be regarded as their information symbol evolution equations, we derived the nonlinear evolution equations of Shannon dynamic entropy density and dynamic information density, and the nonlinear evolution equations of Boltzmann dynamic entropy density and dynamic information density, which describe respectively the evolution law of dynamic entropy and dynamic information. The evolution equations of these two kinds of dynamic entropy and dynamic information show in unison that the time rate of change of dynamic entropy densities is caused by their drift, diffusion and production in state variable space inside the systems and in coordinate space in the transmission processes; and that the time rate of change of dynamic information densities originates from their drift, diffusion and dissipation in state variable space inside the systems and in coordinate space in the transmission processes. Entropy and information have thus been combined with the state of the systems and its law of motion. Furthermore we presented the formulas for the two kinds of entropy production rates and information dissipation rates, and the expressions for the two kinds of drift and diffusion information flows. We proved that the two kinds of information dissipation rates (or the decrease rates of the total information) equal their corresponding entropy production rates (or the increase rates of the total entropy) in the same dynamic system. We obtained the formulas of the two kinds of dynamic mutual information and dynamic channel
Bayesian networks and information theory for audio-visual perception modeling.
Besson, Patricia; Richiardi, Jonas; Bourdin, Christophe; Bringoux, Lionel; Mestre, Daniel R; Vercher, Jean-Louis
2010-09-01
Thanks to their different senses, human observers acquire multiple information coming from their environment. Complex cross-modal interactions occur during this perceptual process. This article proposes a framework to analyze and model these interactions through a rigorous and systematic data-driven process. This requires considering the general relationships between the physical events or factors involved in the process, not only in quantitative terms, but also in terms of the influence of one factor on another. We use tools from information theory and probabilistic reasoning to derive relationships between the random variables of interest, where the central notion is that of conditional independence. Using mutual information analysis to guide the model elicitation process, a probabilistic causal model encoded as a Bayesian network is obtained. We exemplify the method by using data collected in an audio-visual localization task for human subjects, and we show that it yields a well-motivated model with good predictive ability. The model elicitation process offers new prospects for the investigation of the cognitive mechanisms of multisensory perception.
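The central notion of conditional independence can be checked, in a hedged sketch with invented data, by estimating conditional mutual information from samples; a value near zero suggests no direct edge is needed between two variables in the elicited Bayesian network.

```python
import math
from collections import Counter

def h(samples):
    """Plug-in Shannon entropy (bits) of a sequence of hashable samples."""
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in Counter(samples).values())

def cond_mutual_info(xs, ys, zs):
    """I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z), from samples."""
    return (h(list(zip(xs, zs))) + h(list(zip(ys, zs)))
            - h(list(zip(xs, ys, zs))) - h(zs))

# Toy chain X -> Z -> Y: given Z, X carries no extra information about Y,
# so the network needs no direct X-Y edge.
xs = [0, 0, 1, 1, 0, 0, 1, 1]
zs = xs[:]              # Z copies X
ys = zs[:]              # Y copies Z
print(cond_mutual_info(xs, ys, zs))  # 0.0
```

In a real elicitation the estimate is computed from experimental data and compared against a significance threshold rather than exact zero, but the logic is the same.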
Prest, M
1988-01-01
In recent years the interplay between model theory and other branches of mathematics has led to many deep and intriguing results. In this, the first book on the topic, the theme is the interplay between model theory and the theory of modules. The book is intended to be a self-contained introduction to the subject and introduces the requisite model theory and module theory as it is needed. Dr Prest develops the basic ideas concerning what can be said about modules using the information which may be expressed in a first-order language. Later chapters discuss stability-theoretic aspects of module
Constructor theory of information
Deutsch, David; Marletto, Chiara
2015-01-01
We propose a theory of information expressed solely in terms of which transformations of physical systems are possible and which are impossible—i.e. in constructor-theoretic terms. It includes conjectured, exact laws of physics expressing the regularities that allow information to be physically instantiated. Although these laws are directly about information, independently of the details of particular physical instantiations, information is not regarded as an a priori mathematical or logical concept, but as something whose nature and properties are determined by the laws of physics alone. This theory solves a problem at the foundations of existing information theory, namely that information and distinguishability are each defined in terms of the other. It also explains the relationship between classical and quantum information, and reveals the single, constructor-theoretic property underlying the most distinctive phenomena associated with the latter, including the lack of in-principle distinguishability of some states, the impossibility of cloning, the existence of pairs of variables that cannot simultaneously have sharp values, the fact that measurement processes can be both deterministic and unpredictable, the irreducible perturbation caused by measurement, and locally inaccessible information (as in entangled systems). PMID:25663803
Information Theory Analysis of Cascading Process in a Synthetic Model of Fluid Turbulence
Directory of Open Access Journals (Sweden)
Massimo Materassi
2014-02-01
The use of transfer entropy has proven helpful in detecting the direction of dynamical driving in the interaction of two processes, X and Y. In this paper, we present a different normalization for the transfer entropy, which is capable of better detecting the direction of information transfer. This new normalized transfer entropy is applied to the detection of the direction of energy flux transfer in a synthetic model of fluid turbulence, namely the Gledzer–Ohkitana–Yamada shell model. This well-known model reproduces fully developed turbulence in Fourier space, characterized by an energy cascade towards the small scales (large wavenumbers k), so that applying the information-theoretic analysis to its output tests the reliability of the analysis tool rather than exploring the model's physics. As a result, the presence of a direct cascade along the scales in the shell model and the locality of the interactions in wavenumber space come out as expected, indicating the validity of this data-analysis tool. In this context, the normalized version of transfer entropy, which accounts for the difference in the intrinsic randomness of the interacting processes, performs better, discriminating against the wrong conclusions to which the "traditional" transfer entropy would lead.
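For readers unfamiliar with the base quantity, a plug-in estimate of (unnormalized) transfer entropy can be written in a few lines. The binary series, history length of one, and coupling below are my own illustrative assumptions; the paper's specific normalization is not reproduced here.

```python
# Hedged sketch: plug-in transfer entropy on binary series, history length 1.
import math
import random
from collections import Counter

def transfer_entropy(src, dst):
    """TE(src -> dst) in nats: how much src's past helps predict dst's next value."""
    triples = list(zip(dst[1:], dst[:-1], src[:-1]))  # (dst_next, dst_now, src_now)
    n = len(triples)
    p_xyz = Counter(triples)
    p_yz = Counter((y, z) for _, y, z in triples)
    p_xy = Counter((x, y) for x, y, _ in triples)
    p_y = Counter(y for _, y, _ in triples)
    te = 0.0
    for (x, y, z), c in p_xyz.items():
        # p(x|y,z) / p(x|y) expressed via raw counts
        te += (c / n) * math.log((c * p_y[y]) / (p_yz[(y, z)] * p_xy[(x, y)]))
    return te

random.seed(1)
x = [random.randint(0, 1) for _ in range(5000)]
# y copies x with a one-step delay, plus occasional noise flips: x drives y
y = [0] + [xi if random.random() < 0.9 else 1 - xi for xi in x[:-1]]

assert transfer_entropy(x, y) > transfer_entropy(y, x)  # driving direction detected
```

Because x is i.i.d., the reverse-direction estimate is only finite-sample bias, which is exactly the kind of asymmetry the abstract's normalization is meant to sharpen.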
Franceschetti, Massimo
2017-01-01
Understand the relationship between information theory and the physics of wave propagation with this expert guide. Balancing fundamental theory with engineering applications, it describes the mechanism and limits for the representation and communication of information using electromagnetic waves. Information-theoretic laws relating functional approximation and quantum uncertainty principles to entropy, capacity, mutual information, rate distortion, and degrees of freedom of band-limited radiation are derived and explained. Both stochastic and deterministic approaches are explored, and applications for sensing and signal reconstruction, wireless communication, and networks of multiple transmitters and receivers are reviewed. With end-of-chapter exercises and suggestions for further reading enabling in-depth understanding of key concepts, it is the ideal resource for researchers and graduate students in electrical engineering, physics and applied mathematics looking for a fresh perspective on classical informat...
Wilde, Mark M
2017-01-01
Developing many of the major, exciting, pre- and post-millennium developments from the ground up, this book is an ideal entry point for graduate students into quantum information theory. Significant attention is given to quantum mechanics for quantum information theory, and careful studies of the important protocols of teleportation, superdense coding, and entanglement distribution are presented. In this new edition, readers can expect to find over 100 pages of new material, including detailed discussions of Bell's theorem, the CHSH game, Tsirelson's theorem, the axiomatic approach to quantum channels, the definition of the diamond norm and its interpretation, and a proof of the Choi–Kraus theorem. Discussion of the importance of the quantum dynamic capacity formula has been completely revised, and many new exercises and references have been added. This new edition will be welcomed by the upcoming generation of quantum information theorists and the already established community of classical information theo...
DEFF Research Database (Denmark)
Gade, Peter; Gade, Anne Nørkjær; Otrel-Cass, Kathrin
2018-01-01
Building information modelling (BIM) offers great potential for increasing efficiency in the building industry. However, this potential remains largely unfulfilled, substantiating the need for a sound understanding of the existing practices involved in the building design process in order to implement BIM efficiently. The aim of this article is to discuss the unfolding activity of a building design process and the role of BIM as a mediating tool. A case study was carried out to investigate contradictions related to the use of BIM in an inter-organizational design process of a naval rescue station in Denmark. Aspects of the design activity and the development of practices during the project were explored through the lens of Activity Theory. We were able to examine how BIM mediated the production of the building design and how BIM use evolved. The article presents and discusses four main contradictions...
Geometric theory of information
2014-01-01
This book brings together geometric tools and their applications for information analysis. It collects current work on the many uses of information geometry in the interdisciplinary fields of Advanced Signal, Image & Video Processing, Complex Data Modeling and Analysis, Information Ranking and Retrieval, Coding, Cognitive Systems, Optimal Control, Statistics on Manifolds, Machine Learning, Speech/Sound Recognition, and Natural Language Treatment, which are also substantially relevant for industry.
An introduction to information theory
Reza, Fazlollah M
1994-01-01
Graduate-level study for engineering students presents elements of modern probability theory, information theory, coding theory, more. Emphasis on sample space, random variables, capacity, etc. Many reference tables and extensive bibliography. 1961 edition.
Tuai, Cameron K.
2011-01-01
The integration of librarians and technologists to deliver information services represents a new and potentially costly organizational challenge for many library administrators. To understand better how to control the costs of integration, the research presented here will use structural contingency theory to study the coordination of librarians…
Claes, J.; Vanderfeesten, I.T.P.; Gailly, F.; Grefen, P.W.P.J.; Poels, G.
2017-01-01
In their effort to control and manage processes, organizations often create process models. The quality of such models is not always optimal, because it is challenging for a modeler to translate her mental image of the process into a formal process description. In order to support this complex human
Fake News, Conspiracy Theories, and Lies: An Information Laundering Model for Homeland Security
2018-03-01
being distributed. B. INFORMATION LAUNDERING 2.0 MODEL. Like Information Laundering 1.0, Information Laundering 2.0 is built on a metaphor of money laundering. However, unlike the previous model, the new model takes the metaphor a step further, incorporating all three phases of money laundering: placement... we say "laundering." Although there are many ways to describe money laundering, the simplest way is: a process by which one can turn "'dirty' money
How parents process child health and nutrition information: A grounded theory model.
Lovell, Jennifer L
2016-02-01
The aim of the present study was to investigate low-income parents' experiences receiving, making meaning of, and applying sociocultural messages about childhood health and nutrition. Semi-structured interviews were conducted with parents from 16 low-income Early Head Start families. Verbatim interview transcripts, observations, field notes, documentary evidence, and follow-up participant checks were used during grounded theory analysis of the data. Data yielded a potential theoretical model of parental movement toward action involving (a) the culture and context influencing parents, (b) parents' sources of social and cultural messages, (c) parental values and engagement, (d) parental motivation for action, (e) intervening conditions impacting motivation and application, and (f) parent action taken on the individual and social levels. Parent characteristics greatly impacted the ways in which parents understood and applied health and nutrition information. Among other implications, it is recommended that educators and providers focus on a parent's beliefs, values, and cultural preferences regarding food and health behaviors as well as his/her personal/family definition of "health" when framing recommendations and developing interventions. Copyright © 2015 Elsevier Ltd. All rights reserved.
Information theory of molecular systems
Nalewajski, Roman F
2006-01-01
As well as providing a unified outlook on physics, Information Theory (IT) has numerous applications in chemistry and biology owing to its ability to provide a measure of the entropy/information contained within probability distributions and criteria of their information "distance" (similarity) and independence. Information Theory of Molecular Systems applies standard IT to classical problems in the theory of electronic structure and chemical reactivity. The book starts by introducing the basic concepts of modern electronic structure/reactivity theory based upon the Density Functional Theory.
Kopylova, N. S.; Bykova, A. A.; Beregovoy, D. N.
2018-05-01
Based on the landscape-geographical approach, a structural and logical scheme for the Northwestern Federal District Econet has been developed, which can be integrated into the federal and world ecological networks in order to improve the environmental infrastructure of the region. The organization of the Northwestern Federal District Econet on the basis of graph theory, by means of the Quantum GIS geographic information system, is proposed as an effective means of preserving and recreating the unique biodiversity of landscapes and of regulating environmental protection.
Information theory in analytical chemistry
National Research Council Canada - National Science Library
Eckschlager, Karel; Danzer, Klaus
1994-01-01
Contents: The aim of analytical chemistry - Basic concepts of information theory - Identification of components - Qualitative analysis - Quantitative analysis - Multicomponent analysis - Optimum analytical...
Thornberg, Robert
2012-01-01
There is a widespread idea that in grounded theory (GT) research, the researcher has to delay the literature review until the end of the analysis to avoid contamination--a dictum that might turn educational researchers away from GT. Nevertheless, in this article the author (a) problematizes the dictum of delaying a literature review in classic…
International Nuclear Information System (INIS)
Kim, Man Cheol; Seong, Poong Hyun
2006-01-01
To develop operator behavior models such as IDAC, quantitative models for the cognitive activities of nuclear power plant (NPP) operators in abnormal situations are essential. Among them, only a few quantitative models for monitoring and detection have been developed. In this paper, we propose a computational model for the knowledge-driven monitoring (also known as model-driven monitoring) of NPP operators in abnormal situations, based on information theory. The basic assumption of the proposed model is that the probability that an operator shifts his or her attention to an information source is proportional to the expected information from that source. A small experiment performed to evaluate the feasibility of the proposed model shows that its predictions correlate highly with the experimental results. Even though it has been argued that heuristics may play an important role in human reasoning, we believe that the proposed model can provide part of the mathematical basis for developing quantitative models of the knowledge-driven monitoring of NPP operators when operators are assumed to behave very logically.
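The model's basic assumption can be made concrete with a toy calculation. The indicator names and predicted-reading distributions below are hypothetical, invented for illustration; they are not taken from the paper or from IDAC.

```python
# Hedged sketch: if expected information from an indicator is the entropy of its
# predicted reading, attention-shift probabilities are those entropies normalized.
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical predicted reading distributions for three plant indicators.
sources = {
    "steam_generator_level": [0.5, 0.5],   # operator most uncertain here
    "pressurizer_pressure":  [0.9, 0.1],
    "coolant_flow":          [1.0, 0.0],   # fully predictable: no expected info
}
expected_info = {k: entropy(v) for k, v in sources.items()}
total = sum(expected_info.values())
attention = {k: h / total for k, h in expected_info.items()}

assert attention["steam_generator_level"] > attention["pressurizer_pressure"]
assert attention["coolant_flow"] == 0.0  # never worth monitoring a certain reading
```

The design choice mirrors the abstract: attention goes preferentially to the source whose next reading the operator can least predict.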
Genre theory in information studies
Andersen, Jack
2015-01-01
This book highlights the important role genre theory plays within information studies. It illustrates how modern genre studies inform and enrich the study of information, and conversely how the study of information makes its own independent contributions to the study of genre.
Towards an Information Retrieval Theory of Everything
Hiemstra, Djoerd; Lammerink, J.M.W.; Katoen, Joost P.; Kok, J.N.; van de Pol, Jan Cornelis; Raamsdonk, F.
2009-01-01
I present three well-known probabilistic models of information retrieval in tutorial style: The binary independence probabilistic model, the language modeling approach, and Google's page rank. Although all three models are based on probability theory, they are very different in nature. Each model
Information theory and statistics
Kullback, Solomon
1968-01-01
Highly useful text studies logarithmic measures of information and their application to testing statistical hypotheses. Includes numerous worked examples and problems. References. Glossary. Appendix. 1968 2nd, revised edition.
Quantum information theory: technological challenge
International Nuclear Information System (INIS)
Calixto, M.
2001-01-01
The new Quantum Information Theory augurs powerful machines that obey the entangled logic of the subatomic world. Parallelism, entanglement, teleportation, no-cloning and quantum cryptography are typical peculiarities of this novel way of understanding computation. (Author) 24 refs
An informational theory of privacy
Schottmuller, C.; Jann, Ole
2016-01-01
We develop a theory that explains how and when privacy can increase welfare. Without privacy, some individuals misrepresent their preferences, because they will otherwise be statistically discriminated against. This "chilling effect" hurts them individually, and impairs information aggregation. The
A Model of Managerial Effectiveness in Information Security: From Grounded Theory to Empirical Test
National Research Council Canada - National Science Library
Knapp, Kenneth J
2005-01-01
Information security is a critical issue facing organizations worldwide. In order to mitigate risk and protect valuable information, organizations need to operate and manage effective information security programs...
Information theory in molecular biology
Adami, Christoph
2004-01-01
This article introduces the physics of information in the context of molecular biology and genomics. Entropy and information, the two central concepts of Shannon's theory of information and communication, are often confused with each other but play transparent roles when applied to statistical ensembles (i.e., identically prepared sets) of symbolic sequences. Such an approach can distinguish between entropy and information in genes, predict the secondary structure of ribozymes, and detect the...
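The ensemble view of information in genes that the article describes can be sketched in a few lines: the information stored at a site of an alignment is the maximal entropy minus the observed per-site entropy. The toy sequences below are my own illustration, not the article's data.

```python
# Hedged sketch: per-site information in an alignment of identically prepared sequences.
import math
from collections import Counter

def site_information(column, alphabet_size=4):
    """Bits 'stored' at one site: log2|alphabet| minus the observed site entropy."""
    n = len(column)
    h = -sum((c / n) * math.log2(c / n) for c in Counter(column).values())
    return math.log2(alphabet_size) - h

alignment = ["ACGT", "ACGA", "ACGC", "ACTT"]   # hypothetical aligned sequences
columns = list(zip(*alignment))
info = [site_information(col) for col in columns]

assert info[0] == 2.0        # fully conserved site: maximal information
assert info[3] < info[0]     # variable site stores less information
```

This is the sense in which, per the abstract, entropy and information play transparent and distinct roles once one works with statistical ensembles rather than single sequences.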
The theory of quantum information
Watrous, John
2018-01-01
This largely self-contained book on the theory of quantum information focuses on precise mathematical formulations and proofs of fundamental facts that form the foundation of the subject. It is intended for graduate students and researchers in mathematics, computer science, and theoretical physics seeking to develop a thorough understanding of key results, proof techniques, and methodologies that are relevant to a wide range of research topics within the theory of quantum information and computation. The book is accessible to readers with an understanding of basic mathematics, including linear algebra, mathematical analysis, and probability theory. An introductory chapter summarizes these necessary mathematical prerequisites, and starting from this foundation, the book includes clear and complete proofs of all results it presents. Each subsequent chapter includes challenging exercises intended to help readers to develop their own skills for discovering proofs concerning the theory of quantum information.
Han, Q.; Dellaert, B.G.C.; Raaij, van W.F.; Timmermans, H.J.P.
2005-01-01
Existing policy models of optimal guidance strategies are typically concerned with single-objective optimization based on reliable forecasts in terms of the consistency between predicted and observed aggregate activity-travel patterns. The interaction and interdependencies between policy objective
Zhu, Yenan; Hsieh, Yee-Hsee; Dhingra, Rishi R.; Dick, Thomas E.; Jacono, Frank J.; Galán, Roberto F.
2013-02-01
Interactions between oscillators can be investigated with standard tools of time series analysis. However, these methods are insensitive to the directionality of the coupling, i.e., the asymmetry of the interactions. An elegant alternative was proposed by Rosenblum and collaborators [M. G. Rosenblum, L. Cimponeriu, A. Bezerianos, A. Patzak, and R. Mrowka, Phys. Rev. E 65, 041909 (2002); M. G. Rosenblum and A. S. Pikovsky, Phys. Rev. E 64, 045202 (2001)] which consists in fitting the empirical phases to a generic model of two weakly coupled phase oscillators. This allows one to obtain the interaction functions defining the coupling and its directionality. A limitation of this approach is that a solution always exists in the least-squares sense, even in the absence of coupling. To preclude spurious results, we propose a three-step protocol: (1) Determine if a statistical dependency exists in the data by evaluating the mutual information of the phases; (2) if so, compute the interaction functions of the oscillators; and (3) validate the empirical oscillator model by comparing the joint probability of the phases obtained from simulating the model with that of the empirical phases. We apply this protocol to a model of two coupled Stuart-Landau oscillators and show that it reliably detects genuine coupling. We also apply this protocol to investigate cardiorespiratory coupling in anesthetized rats. We observe reciprocal coupling between respiration and heartbeat and that the influence of respiration on the heartbeat is generally much stronger than vice versa. In addition, we find that the vagus nerve mediates coupling in both directions.
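Step (1) of the protocol can be sketched with a plug-in, binned estimator of the mutual information of the phases. The noisy coupled phase oscillators below are my own stand-in for the Stuart-Landau model and the cardiorespiratory data, chosen so one pair is phase-locked and the other independent.

```python
# Hedged sketch: binned mutual information of phase series as a dependency screen.
import math
import random
from collections import Counter

def mutual_information(a, b, bins=8):
    """Plug-in MI (nats) between two phase series binned on [0, 2*pi)."""
    da = [int(bins * x / (2 * math.pi)) % bins for x in a]
    db = [int(bins * x / (2 * math.pi)) % bins for x in b]
    n = len(da)
    pa, pb, pab = Counter(da), Counter(db), Counter(zip(da, db))
    return sum((c / n) * math.log((c * n) / (pa[i] * pb[j]))
               for (i, j), c in pab.items())

random.seed(0)
phi1, phi2, phi3 = [0.0], [0.3], [1.0]
for _ in range(20000):
    p1, p2, p3 = phi1[-1], phi2[-1], phi3[-1]
    # phi2 drives phi1 (coupling strength 0.05 exceeds the frequency mismatch);
    # phi3 is an independent noisy oscillator.
    phi1.append((p1 + 0.10 + 0.05 * math.sin(p2 - p1) + random.gauss(0, 0.02)) % (2 * math.pi))
    phi2.append((p2 + 0.12 + random.gauss(0, 0.02)) % (2 * math.pi))
    phi3.append((p3 + 0.11 + random.gauss(0, 0.02)) % (2 * math.pi))

assert mutual_information(phi1, phi2) > mutual_information(phi1, phi3)
```

Only when this screen detects a dependency would one proceed to fit interaction functions, which is exactly the safeguard the protocol adds against least-squares fits that "succeed" on uncoupled data.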
The Quantitative Theory of Information
DEFF Research Database (Denmark)
Topsøe, Flemming; Harremoës, Peter
2008-01-01
Information Theory as developed by Shannon and followers is becoming more and more important in a number of sciences. The concepts appear to be just the right ones with intuitively appealing operational interpretations. Furthermore, the information theoretical quantities are connected by powerful...
Model for Electromagnetic Information Leakage
Mao Jian; Li Yongmei; Zhang Jiemin; Liu Jinming
2013-01-01
Electromagnetic leakage occurs in working information equipment and can lead to information leakage. In order to discover the nature of information in electromagnetic leakage, this paper combines electromagnetic theory with information theory as an innovative research method. It outlines a systematic model of electromagnetic information leakage, which theoretically describes the process of information leakage, interception, and reproduction based on electromagnetic radiation, and ana...
Quantum information and relativity theory
International Nuclear Information System (INIS)
Peres, Asher; Terno, Daniel R.
2004-01-01
This article discusses the intimate relationship between quantum mechanics, information theory, and relativity theory. Taken together these are the foundations of present-day theoretical physics, and their interrelationship is an essential part of the theory. The acquisition of information from a quantum system by an observer occurs at the interface of classical and quantum physics. The authors review the essential tools needed to describe this interface, i.e., Kraus matrices and positive-operator-valued measures. They then discuss how special relativity imposes severe restrictions on the transfer of information between distant systems and the implications of the fact that quantum entropy is not a Lorentz-covariant concept. This leads to a discussion of how it comes about that Lorentz transformations of reduced density matrices for entangled systems may not be completely positive maps. Quantum field theory is, of course, necessary for a consistent description of interactions. Its structure implies a fundamental tradeoff between detector reliability and localizability. Moreover, general relativity produces new and counterintuitive effects, particularly when black holes (or, more generally, event horizons) are involved. In this more general context the authors discuss how most of the current concepts in quantum information theory may require a reassessment
Information Theory and Plasma Turbulence
International Nuclear Information System (INIS)
Dendy, R. O.
2009-01-01
Information theory, applied directly to measured signals, yields new perspectives on, and quantitative knowledge of, the physics of strongly nonlinear and turbulent phenomena in plasmas. It represents a new and productive element of the topical research programmes that use modern techniques to characterise strongly nonlinear signals from plasmas, and that address global plasma behaviour from a complex systems perspective. We here review some pioneering studies of mutual information in solar wind and magnetospheric plasmas, using techniques tested on standard complex systems.
Models of science dynamics encounters between complexity theory and information sciences
Börner, Katy; Besselaar, Peter
2012-01-01
Models of science dynamics aim to capture the structure and evolution of science. They are developed in an emerging research area in which scholars, scientific institutions and scientific communications become themselves basic objects of research. In order to understand phenomena as diverse as the structure of evolving co-authorship networks or citation diffusion patterns, different models have been developed. They include conceptual models based on historical and ethnographic observations, mathematical descriptions of measurable phenomena, and computational algorithms. Despite its evident importance, the mathematical modeling of science still lacks a unifying framework and a comprehensive research agenda. This book aims to fill this gap, reviewing and describing major threads in the mathematical modeling of science dynamics for a wider academic and professional audience. The model classes presented here cover stochastic and statistical models, game-theoretic approaches, agent-based simulations, population-dy...
On Representation in Information Theory
Directory of Open Access Journals (Sweden)
Joseph E. Brenner
2011-09-01
Semiotics is widely applied in theories of information. Following the original triadic characterization of reality by Peirce, the linguistic processes involved in information (production, transmission, reception, and understanding) would all appear to be interpretable in terms of signs and their relations to their objects. Perhaps the most important of these relations is that of representation: one entity standing for, or representing, some other. For example, an index (one of the three major kinds of signs) is said to represent something by being directly related to its object. My position, however, is that the concept of symbolic representations having such roles in information, as intermediaries, is fraught with the same difficulties as in representational theories of mind. I have proposed an extension of logic to complex real phenomena, including mind and information (Logic in Reality; LIR), most recently at the 4th International Conference on the Foundations of Information Science (Beijing, August 2010). LIR provides explanations for the evolution of complex processes, including information, that do not require any entities other than the processes themselves. In this paper, I discuss the limitations of the standard relation of representation. I argue that more realistic pictures of informational systems can be provided by reference to information as an energetic process, following the categorial ontology of LIR. This approach enables naive, anti-realist conceptions of anti-representationalism to be avoided, and enables an approach to both information and meaning in the same novel logical framework.
Quantum Information Theory - an Invitation
Werner, Reinhard F.
Quantum information and quantum computers have received a lot of public attention recently. Quantum computers have been advertised as a kind of warp drive for computing, and indeed the promise of the algorithms of Shor and Grover is to perform computations which are extremely hard, or even provably impossible, on any merely "classical" computer. In this article I give an account of the basic concepts of quantum information theory, staying as much as possible in the area of general agreement. The article is divided into two parts. The first (up to the end of Sect. 2.5) is mostly in plain English, centered around the exploration of what can or cannot be done with quantum systems as information carriers. The second part, Sect. 2.6, gives a description of the mathematical structures and of some of the tools needed to develop the theory.
Shaw, Gordon L.; Silverman, Dennis J.; Pearson, John C.
1985-04-01
Motivated by V. B. Mountcastle's organizational principle for neocortical function, and by M. E. Fisher's model of physical spin systems, we introduce a cooperative model of the cortical column incorporating an idealized substructure, the trion, which represents a localized group of neurons. Computer studies reveal that typical networks composed of a small number of trions (with symmetric interactions) exhibit striking behavior--e.g., hundreds to thousands of quasi-stable, periodic firing patterns, any of which can be selected out and enhanced with only small changes in interaction strengths by using a Hebb-type algorithm.
Fiene, Richard J.; Woods, Lawrence
Two unanswered questions about child care are: (1) Does compliance with state child care regulations have a positive impact on children? and (2) Have predictors of program quality been identified? This paper explores a research study and related model that have had some success in answering these questions. Section I, a general introduction,…
Quantum information theory mathematical foundation
Hayashi, Masahito
2017-01-01
This graduate textbook provides a unified view of quantum information theory. Clearly explaining the necessary mathematical basis, it merges key topics from both information-theoretic and quantum-mechanical viewpoints and provides lucid explanations of the basic results. Thanks to this unified approach, it makes accessible such advanced topics in quantum communication as quantum teleportation, superdense coding, quantum state transmission (quantum error-correction) and quantum encryption. Since the publication of the preceding book Quantum Information: An Introduction, there have been tremendous strides in the field of quantum information. In particular, the following topics – all of which are addressed here – have seen major advances: quantum state discrimination, quantum channel capacity, bipartite and multipartite entanglement, security analysis on quantum communication, the reverse Shannon theorem and the uncertainty relation. With regard to the analysis of quantum security, the present book employs an impro...
Enhanced Anomaly Detection Via PLS Regression Models and Information Entropy Theory
Harrou, Fouzi; Sun, Ying
2015-12-07
Accurate and effective fault detection and diagnosis of modern engineering systems is crucial for ensuring reliability, safety and maintaining the desired product quality. In this work, we propose an innovative method for detecting small faults in the highly correlated multivariate data. The developed method utilizes partial least square (PLS) method as a modelling framework, and the symmetrized Kullback-Leibler divergence (KLD) as a monitoring index, where it is used to quantify the dissimilarity between probability distributions of current PLS-based residual and reference one obtained using fault-free data. The performance of the PLS-based KLD fault detection algorithm is illustrated and compared to the conventional PLS-based fault detection methods. Using synthetic data, we have demonstrated the greater sensitivity and effectiveness of the developed method over the conventional methods, especially when data are highly correlated and small faults are of interest.
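The monitoring index described above has a closed form for Gaussian residuals. The sketch below is a univariate illustration of the idea (symmetrized Kullback-Leibler divergence between a reference residual distribution and the current one); it is not the paper's multivariate PLS pipeline, and the numbers are invented.

```python
# Hedged sketch: symmetrized KLD between two univariate Gaussians as a fault index.
import math

def sym_kld_gauss(m1, s1, m2, s2):
    """Symmetrized KL divergence between N(m1, s1^2) and N(m2, s2^2)."""
    def kld(ma, sa, mb, sb):
        return math.log(sb / sa) + (sa**2 + (ma - mb)**2) / (2 * sb**2) - 0.5
    return kld(m1, s1, m2, s2) + kld(m2, s2, m1, s1)

# Reference (fault-free) residuals vs. residuals with a small mean shift.
healthy = sym_kld_gauss(0.0, 1.0, 0.02, 1.0)   # near-identical distributions
faulty  = sym_kld_gauss(0.0, 1.0, 0.5, 1.0)    # small sensor bias

assert healthy < 1e-3
assert faulty > healthy   # the index grows with the fault magnitude
```

For equal variances the index reduces to the squared mean shift, which is why even small faults that barely move a conventional residual chart still register.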
Perla, Rocco J.; Carifio, James
2011-01-01
Background: Extending Merton's (1936) work on the consequences of purposive social action, the model, theory and taxonomy outlined here incorporates and formalizes both anticipated and unanticipated research findings in a unified theoretical framework. The model of anticipated research findings was developed initially by Carifio (1975, 1977) and…
Recoverability in quantum information theory
Wilde, Mark
The fact that the quantum relative entropy is non-increasing with respect to quantum physical evolutions lies at the core of many optimality theorems in quantum information theory and has applications in other areas of physics. In this work, we establish improvements of this entropy inequality in the form of physically meaningful remainder terms. One of the main results can be summarized informally as follows: if the decrease in quantum relative entropy between two quantum states after a quantum physical evolution is relatively small, then it is possible to perform a recovery operation, such that one can perfectly recover one state while approximately recovering the other. This can be interpreted as quantifying how well one can reverse a quantum physical evolution. Our proof method is elementary, relying on the method of complex interpolation, basic linear algebra, and the recently introduced Renyi generalization of a relative entropy difference. The theorem has a number of applications in quantum information theory, which have to do with providing physically meaningful improvements to many known entropy inequalities. This is based on arXiv:1505.04661, now accepted for publication in Proceedings of the Royal Society A. I acknowledge support from startup funds from the Department of Physics and Astronomy at LSU, the NSF under Award No. CCF-1350397, and the DARPA Quiness Program through US Army Research Office award W31P4Q-12-1-0019.
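The monotonicity that the talk builds on can be checked numerically in its classical (commutative) warm-up form: relative entropy between two distributions never increases when both pass through the same stochastic channel. This sketch is my own illustration of that special case, not the quantum theorem or its recovery-map refinement.

```python
# Hedged sketch: data-processing inequality for classical relative entropy.
import math

def rel_entropy(p, q):
    """D(p||q) in nats; assumes q is strictly positive wherever p is."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def apply_channel(T, p):
    # T[j][i] = probability of output j given input i (columns sum to 1)
    return [sum(T[j][i] * p[i] for i in range(len(p))) for j in range(len(T))]

p = [0.7, 0.2, 0.1]
q = [0.2, 0.3, 0.5]
T = [[0.8, 0.1, 0.2],
     [0.1, 0.8, 0.2],
     [0.1, 0.1, 0.6]]   # an arbitrary column-stochastic channel

before = rel_entropy(p, q)
after = rel_entropy(apply_channel(T, p), apply_channel(T, q))
assert after <= before   # the channel can only make p and q harder to tell apart
```

The quantum result quantifies how small this gap must be for the evolution to be approximately reversible by a recovery operation; the classical inequality above is the baseline it strengthens.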
Towards an Information Theory of Complex Networks
Dehmer, Matthias; Mehler, Alexander
2011-01-01
For over a decade, complex networks have steadily grown as an important tool across a broad array of academic disciplines, with applications ranging from physics to social media. A tightly organized collection of carefully-selected papers on the subject, Towards an Information Theory of Complex Networks: Statistical Methods and Applications presents theoretical and practical results about information-theoretic and statistical models of complex networks in the natural sciences and humanities. The book's major goal is to advocate and promote a combination of graph-theoretic, information-theoreti
Algesheimer, René; Bagozzi, Richard P.; Dholakia, Utpal M.
2018-01-01
We offer a new conceptualization and measurement models for constructs at the group-level of analysis in small group research. The conceptualization starts with classical notions of group behavior proposed by Tönnies, Simmel, and Weber and then draws upon plural subject theory by philosophers Gilbert and Tuomela to frame a new perspective…
Information theory and the ethylene genetic network.
González-García, José S; Díaz, José
2011-10-01
The original aim of Information Theory (IT) was to solve a purely technical problem: to increase the performance of communication systems, which are constantly affected by interferences that diminish the quality of the transmitted information. That is, the theory deals only with the problem of transmitting with maximal precision the symbols constituting a message. In Shannon's theory messages are characterized only by their probabilities, regardless of their value or meaning. As for its present-day status, it is generally acknowledged that Information Theory has solid mathematical foundations and fruitful, strong links with Physics in both theoretical and experimental areas. However, many applications of Information Theory to Biology are limited to using it as a technical tool to analyze biopolymers, such as DNA, RNA or protein sequences. The main point of discussion about the applicability of IT to explaining the information flow in biological systems is that in a classic communication channel the symbols that make up the coded message are transmitted one by one, independently, through a noisy communication channel, and noise can alter each of the symbols, distorting the message; in contrast, in a genetic communication channel the coded messages are not transmitted in the form of symbols but by signaling cascades. Consequently, the information flow from the emitter to the effector is due to a series of coupled physicochemical processes that must ensure the accurate transmission of the message. In this review we discuss a novel proposal to overcome this difficulty, which consists of modeling gene expression with a stochastic approach that allows Shannon entropy (H) to be used directly to measure the amount of uncertainty that the genetic machinery has in relation to the correct decoding of a message transmitted into the nucleus by a signaling pathway. From the value of H we can define a function I that measures the amount of…
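The use of Shannon entropy H as a measure of decoding uncertainty can be illustrated with a toy calculation. The four-state distributions below are hypothetical and not taken from the ethylene network itself:

```python
import numpy as np

def shannon_entropy(p):
    # H = -sum_i p_i log2 p_i, ignoring zero-probability outcomes
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Hypothetical distributions over four promoter/expression states:
no_signal = [0.25, 0.25, 0.25, 0.25]    # ligand absent: maximal uncertainty
with_signal = [0.85, 0.05, 0.05, 0.05]  # cascade active: message nearly decoded
print(shannon_entropy(no_signal))    # 2.0 bits
print(shannon_entropy(with_signal))  # ≈ 0.85 bits
```

The drop in H between the two cases is the information the signaling pathway has delivered to the genetic machinery.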
Comparing cosmic web classifiers using information theory
International Nuclear Information System (INIS)
Leclercq, Florent; Lavaux, Guilhem; Wandelt, Benjamin; Jasche, Jens
2016-01-01
We introduce a decision scheme for optimally choosing a classifier, which segments the cosmic web into different structure types (voids, sheets, filaments, and clusters). Our framework, based on information theory, accounts for the design aims of different classes of possible applications: (i) parameter inference, (ii) model selection, and (iii) prediction of new observations. As an illustration, we use cosmographic maps of web-types in the Sloan Digital Sky Survey to assess the relative performance of the classifiers T-WEB, DIVA and ORIGAMI for: (i) analyzing the morphology of the cosmic web, (ii) discriminating dark energy models, and (iii) predicting galaxy colors. Our study substantiates a data-supported connection between cosmic web analysis and information theory, and paves the path towards principled design of analysis procedures for the next generation of galaxy surveys. We have made the cosmic web maps, galaxy catalog, and analysis scripts used in this work publicly available.
Chemical Thermodynamics and Information Theory with Applications
Graham, Daniel J
2011-01-01
Thermodynamics and information theory touch every facet of chemistry. However, the physical chemistry curriculum digested by students worldwide is still heavily skewed toward heat/work principles established more than a century ago. Rectifying this situation, Chemical Thermodynamics and Information Theory with Applications explores applications drawn from the intersection of thermodynamics and information theory, two mature and far-reaching fields. In an approach that intertwines information science and chemistry, this book covers: the informational aspects of thermodynamic state equations; the…
Introduction to coding and information theory
Roman, Steven
1997-01-01
This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and such specific codes as Hamming codes, the simplex codes, and many others.
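The coding-theory material the book covers, e.g. Hamming codes, lends itself to a short worked example. The sketch below uses one standard systematic form of the Hamming (7,4) code (other equivalent generator matrices exist) and demonstrates single-error correction via the syndrome:

```python
import numpy as np

# One systematic form of the Hamming (7,4) code: G = [I4 | P], H = [P^T | I3]
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(msg4):
    # systematic encoding: 4 data bits followed by 3 parity bits
    return (np.array(msg4) @ G) % 2

def correct(received):
    # a nonzero syndrome equals the column of H at the flipped position
    s = (H @ received) % 2
    if s.any():
        err = next(i for i in range(7) if np.array_equal(H[:, i], s))
        received = received.copy()
        received[err] ^= 1
    return received

code = encode([1, 0, 1, 1])
noisy = code.copy()
noisy[2] ^= 1                            # flip one bit in the channel
print((correct(noisy) == code).all())    # True: single error corrected
```

Because all seven columns of H are distinct and nonzero, any single-bit error maps to a unique syndrome.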
Directory of Open Access Journals (Sweden)
Mohammad Aghdasi
2011-09-01
In this paper, a practical model is used to identify the most effective rules in information systems. In this model, critical business attributes that fit strategic expectations are first taken into account; these are the attributes whose changes matter most for achieving the strategic expectations. To identify these attributes we utilize rough set theory. Business rules that use critical information attributes in their structure are identified as the most effective business rules. The proposed model helps information system developers identify the scope of effective business rules, decreasing the time and cost of information system maintenance. It also helps business analysts focus on managing critical business attributes in order to achieve a specific goal.
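The rough-set step, finding attributes whose removal destroys the table's ability to discern decisions (the core), can be sketched on a toy decision table. The table and attribute indices below are hypothetical:

```python
# Hypothetical decision table: each row is (condition-attribute values, decision)
rows = [
    ((0, 1, 0), 'yes'),
    ((0, 1, 1), 'yes'),
    ((1, 1, 0), 'no'),
    ((1, 0, 0), 'no'),
]

def consistent(attr_idx):
    # Consistency w.r.t. a subset of attributes: objects indiscernible on
    # those attributes must share the same decision (the rough-set positive
    # region covers the whole table).
    seen = {}
    for cond, dec in rows:
        key = tuple(cond[i] for i in attr_idx)
        if key in seen and seen[key] != dec:
            return False
        seen[key] = dec
    return True

all_attrs = [0, 1, 2]
# Core attributes: removing any one of them breaks consistency, so their
# changes matter most -- candidates for "critical business attributes"
core = [a for a in all_attrs if not consistent([x for x in all_attrs if x != a])]
print(core)  # [0]
```

Rules whose conditions mention the core attributes would then be flagged as the most effective business rules.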
Holman, Gordon D.
1989-01-01
The primary purpose of the Theory and Modeling Group meeting was to identify scientists engaged or interested in theoretical work pertinent to the Max '91 program, and to encourage theorists to pursue modeling which is directly relevant to data which can be expected to result from the program. A list of participants and their institutions is presented. Two solar flare paradigms were discussed during the meeting -- the importance of magnetic reconnection in flares and the applicability of numerical simulation results to solar flare studies.
Galli, Leandro; Knight, Rosemary; Robertson, Steven; Hoile, Elizabeth; Oladapo, Olubukola; Francis, David; Free, Caroline
2014-05-22
Recruitment is a major challenge for many trials; just over half reach their targets and almost a third resort to grant extensions. The economic and societal implications of this shortcoming are significant. Yet, we have a limited understanding of the processes that increase the probability that recruitment targets will be achieved. Accordingly, there is an urgent need to bring analytical rigour to the task of improving recruitment, thereby increasing the likelihood that trials reach their recruitment targets. This paper presents a conceptual framework that can be used to improve recruitment to clinical trials. Using a case-study approach, we reviewed the range of initiatives that had been undertaken to improve recruitment in the txt2stop trial using qualitative (semi-structured interviews with the principal investigator) and quantitative (recruitment) data analysis. Later, the txt2stop recruitment practices were compared to a previous model of marketing a trial and to key constructs in social marketing theory. Post hoc, we developed a recruitment optimisation model to serve as a conceptual framework to improve recruitment to clinical trials. A core premise of the model is that improving recruitment needs to be an iterative, learning process. The model describes three essential activities: i) recruitment phase monitoring, ii) marketing research, and iii) the evaluation of current performance. We describe the initiatives undertaken by the txt2stop trial and the results achieved, as an example of the use of the model. Further research should explore the impact of adopting the recruitment optimisation model when applied to other trials.
Directory of Open Access Journals (Sweden)
N.S.Gonchar
2006-01-01
A new stochastic model of economy is developed that takes into account that the choices of consumers are dependent random fields. Axioms of such a model are formulated. The existence of random fields of consumer choice and of decision making by firms is proved. New notions of conditionally independent random fields and of random fields of evaluation of information by consumers are introduced. Using the above-mentioned random fields, the random fields of consumer choice and decision making by firms are constructed. The theory of economic equilibrium is developed.
Comment on Gallistel: behavior theory and information theory: some parallels.
Nevin, John A
2012-05-01
In this article, Gallistel proposes information theory as an approach to some enduring problems in the study of operant and classical conditioning. Copyright © 2012 Elsevier B.V. All rights reserved.
Client-controlled case information: a general system theory perspective.
Fitch, Dale
2004-07-01
The author proposes a model for client control of case information via the World Wide Web built on principles of general system theory. It incorporates the client into the design, resulting in an information structure that differs from traditional human services information-sharing practices. Referencing general system theory, the concepts of controller and controlled system, as well as entropy and negentropy, are applied to the information flow and autopoietic behavior as they relate to the boundary-maintaining functions of today's organizations. The author's conclusions synthesize general system theory and human services values to lay the foundation for an information-sharing framework for human services in the 21st century.
Information theory and rate distortion theory for communications and compression
Gibson, Jerry
2013-01-01
This book is very specifically targeted to problems in communications and compression by providing the fundamental principles and results in information theory and rate distortion theory for these applications and presenting methods that have proved and will prove useful in analyzing and designing real systems. The chapters contain treatments of entropy, mutual information, lossless source coding, channel capacity, and rate distortion theory; however, it is the selection, ordering, and presentation of the topics within these broad categories that is unique to this concise book. While the cover
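The rate distortion material admits a compact worked example: for a Bernoulli(p) source under Hamming distortion, the rate-distortion function has the closed form R(D) = H_b(p) - H_b(D) for 0 <= D <= min(p, 1-p). A minimal sketch:

```python
import math

def hb(x):
    # binary entropy in bits
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def rate_distortion_bernoulli(p, D):
    # R(D) = H_b(p) - H_b(D) for 0 <= D <= min(p, 1 - p); zero beyond that
    if D >= min(p, 1 - p):
        return 0.0
    return hb(p) - hb(D)

print(rate_distortion_bernoulli(0.5, 0.0))   # 1.0: lossless needs the full entropy
print(rate_distortion_bernoulli(0.5, 0.11))  # ≈ 0.5: 11% bit-error tolerance halves the rate
```

The D = 0 endpoint recovers lossless source coding, illustrating how rate distortion theory generalizes Shannon's noiseless result.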
Financial markets theory equilibrium, efficiency and information
Barucci, Emilio
2017-01-01
This work, now in a thoroughly revised second edition, presents the economic foundations of financial markets theory from a mathematically rigorous standpoint and offers a self-contained critical discussion based on empirical results. It is the only textbook on the subject to include more than two hundred exercises, with detailed solutions to selected exercises. Financial Markets Theory covers classical asset pricing theory in great detail, including utility theory, equilibrium theory, portfolio selection, mean-variance portfolio theory, CAPM, CCAPM, APT, and the Modigliani-Miller theorem. Starting from an analysis of the empirical evidence on the theory, the authors provide a discussion of the relevant literature, pointing out the main advances in classical asset pricing theory and the new approaches designed to address asset pricing puzzles and open problems (e.g., behavioral finance). Later chapters in the book contain more advanced material, including on the role of information in financial markets, non-c...
Cognition and biology: perspectives from information theory.
Wallace, Rodrick
2014-02-01
The intimate relation between biology and cognition can be formally examined through statistical models constrained by the asymptotic limit theorems of communication theory, augmented by methods from statistical mechanics and nonequilibrium thermodynamics. Cognition, often involving submodules that act as information sources, is ubiquitous across the living state. Less metabolic free energy is consumed by permitting crosstalk between biological information sources than by isolating them, leading to evolutionary exaptations that assemble shifting, tunable cognitive arrays at multiple scales and levels of organization to meet dynamic patterns of threat and opportunity. Cognition is thus necessary for life, but it is not sufficient: an organism represents a highly patterned outcome of path-dependent, blind variation, selection, interaction, and chance extinction in the context of an adequate flow of free energy and an environment fit for development. Complex, interacting cognitive processes within an organism both record and instantiate those evolutionary and developmental trajectories.
Econophysics: from Game Theory and Information Theory to Quantum Mechanics
Jimenez, Edward; Moya, Douglas
2005-03-01
Rationality is the universal invariant among human behavior, the physical laws of the universe, and ordered, complex biological systems. Econophysics is both the use of physical concepts in Finance and Economics and the use of Information Economics in Physics. In particular, we show that it is possible to obtain the principles of Quantum Mechanics using Information and Game Theory.
Information Theory for Information Science: Antecedents, Philosophy, and Applications
Losee, Robert M.
2017-01-01
This paper provides an historical overview of the theoretical antecedents leading to information theory, specifically those useful for understanding and teaching information science and systems. Information may be discussed in a philosophical manner and at the same time be measureable. This notion of information can thus be the subject of…
An information theory account of cognitive control.
Fan, Jin
2014-01-01
Our ability to efficiently process information and generate appropriate responses depends on the processes collectively called cognitive control. Despite a considerable focus in the literature on the cognitive control of information processing, neural mechanisms underlying control are still unclear, and have not been characterized by considering the quantity of information to be processed. A novel and comprehensive account of cognitive control is proposed using concepts from information theory, which is concerned with communication system analysis and the quantification of information. This account treats the brain as an information-processing entity where cognitive control and its underlying brain networks play a pivotal role in dealing with conditions of uncertainty. This hypothesis and theory article justifies the validity and properties of such an account and relates experimental findings to the frontoparietal network under the framework of information theory.
Processing Information in Quantum Decision Theory
Yukalov, V. I.; Sornette, D.
2008-01-01
A survey is given summarizing the state of the art of describing information processing in Quantum Decision Theory, which has been recently advanced as a novel variant of decision making, based on the mathematical theory of separable Hilbert spaces. This mathematical structure captures the effect of superposition of composite prospects, including many incorporated intended actions. The theory characterizes entangled decision making, non-commutativity of subsequent decisions, and intention int...
Theory of Neural Information Processing Systems
International Nuclear Information System (INIS)
Galla, Tobias
2006-01-01
It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10^11 neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics, as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kuehn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, still the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard to find in the…
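The Hopfield model mentioned among the statistical mechanics chapters can be demonstrated in a few lines: Hebbian weights store random ±1 patterns, and the deterministic dynamics retrieve a memory from a corrupted cue. The network size, pattern count and seed below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100
patterns = rng.choice([-1, 1], size=(3, N))   # three random ±1 memories

# Hebbian weights with zero self-coupling
W = sum(np.outer(p, p) for p in patterns) / N
np.fill_diagonal(W, 0)

def recall(state, steps=10):
    # deterministic (zero-temperature) synchronous dynamics
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1
    return state

cue = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
cue[flip] *= -1                                # corrupt 10 of 100 neurons
overlap = recall(cue) @ patterns[0] / N
print(overlap)  # close to 1: the stored memory is retrieved
```

With only three patterns in 100 neurons the load is far below the replica-theory capacity (about 0.138 N patterns), so retrieval from a mildly corrupted cue is expected.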
Information theory and coding solved problems
Ivaniš, Predrag
2017-01-01
This book offers a comprehensive overview of information theory and error control coding, using a different approach than that found in the existing literature. The chapters are organized according to the Shannon system model, where one block affects the others. A relatively brief theoretical introduction is provided at the beginning of every chapter, including a few additional examples and explanations, but without any proofs, and a short overview of some aspects of abstract algebra is given at the end of the corresponding chapters. Characteristic complex examples with many illustrations and tables are chosen to provide detailed insight into the nature of each problem. Some limiting cases are presented to illustrate the connections with the theoretical bounds. The numerical values are carefully selected to provide in-depth explanations of the described algorithms. Although the examples in the different chapters can be considered separately, they are mutually connected and the conclusions for one considered proble…
Do Economic Theories Inform Policy?
DEFF Research Database (Denmark)
Bartalevich, Dzmitry
…that address EU antitrust rules and EU merger control. The second article is exploratory; it narrows the focus to EU merger control and employs descriptive network analysis to investigate the overall composition of mergers cleared by the Commission during the period 2004–2015 and attempts to reinforce the results of the analysis in the first article. The third article expands on the findings of the first and second articles and employs inferential network analysis with exponential random graph models to analyze, on the basis of Commission merger cases cleared during the period 2004–2015, whether the Harvard School, the Freiburg School, and considerations for Single Market integration underpin EU merger control, in addition to the influence of the Chicago School. The analysis presented in the articles suggests that the Chicago School has exerted considerable influence over EU competition policy…
The Friction Theory for Viscosity Modeling
DEFF Research Database (Denmark)
Cisneros, Sergio; Zeberg-Mikkelsen, Claus Kjær; Stenby, Erling Halfdan
2001-01-01
In this work the one-parameter friction theory (f-theory) general models have been extended to the viscosity prediction and modeling of characterized oils. It is demonstrated that these simple models, which take advantage of the repulsive and attractive pressure terms of cubic equations of state such as the SRK, PR and PRSV, can provide accurate viscosity prediction and modeling of characterized oils. In the case of light reservoir oils, whose properties are close to those of normal alkanes, the one-parameter f-theory general models can predict the viscosity of these fluids with good accuracy. Yet, in the case when experimental information is available a more accurate modeling can be obtained by means of a simple tuning procedure. A tuned f-theory general model can deliver highly accurate viscosity modeling above the saturation pressure and good prediction of the liquid-phase viscosity at pressures…
Directory of Open Access Journals (Sweden)
Paul Walton
2014-09-01
This paper uses an approach drawn from the ideas of computer systems modelling to produce a model for information itself. The model integrates evolutionary, static and dynamic views of information and highlights the relationship between symbolic content and the physical world. The model includes what information technology practitioners call “non-functional” attributes, which, for information, include information quality and information friction. The concepts developed in the model enable a richer understanding of Floridi’s questions “what is information?” and “the informational circle: how can information be assessed?” (which he numbers P1 and P12).
Towards a critical theory of information
Directory of Open Access Journals (Sweden)
Christian Fuchs
2009-11-01
The debate on redistribution and recognition between critical theorists Nancy Fraser and Axel Honneth gives the opportunity to renew the discussion of the relationship of base and superstructure in critical social theory. Critical information theory needs to be aware of economic, political, and cultural demands that it needs to make in struggles for ending domination and oppression, and of the unifying role that the economy and class play in these demands and struggles. Objective and subjective information concepts are based on the underlying worldview of reification. Reification endangers human existence. Information as process and relation enables political and ethical alternatives that have radical implications for society.
Activity System Theory Approach to Healthcare Information System
Bai, Guohua
2004-01-01
Healthcare information systems are very complex and have to be approached from a systemic perspective. This paper presents an Activity System Theory (AST) approach that integrates system thinking and social psychology. In the first part of the paper, activity system theory is presented; in particular, a recursive model of human activity systems is introduced. A project, ‘Integrated Mobile Information System for Diabetic Healthcare (IMIS)’, is then used to demonstrate a practical application of th…
Information theory based approaches to cellular signaling.
Waltermann, Christian; Klipp, Edda
2011-10-01
Cells interact with their environment and have to react adequately to internal and external changes, such as changes in nutrient composition, physical properties like temperature or osmolarity, and other stresses. More specifically, they must be able to evaluate whether an external change is significant or just within the range of noise. Based on multiple external parameters they have to compute an optimal response. Cellular signaling pathways are considered the major means of information perception and transmission in cells. Here, we review different attempts to quantify information processing at the level of individual cells. We refer to Shannon entropy, mutual information, and informal measures of signaling pathway cross-talk and specificity. Information theory in systems biology has been successfully applied to the identification of optimal pathway structures, to mutual information and entropy as system responses in sensitivity analysis, and to the quantification of input and output information. While the study of information transmission within the framework of information theory in technical systems is an advanced field with high impact in engineering and telecommunication, its application to biological objects and processes is still restricted to specific fields such as neuroscience and structural and molecular biology. However, in systems biology, which deals with a holistic understanding of biochemical systems and cellular signaling, a number of examples of the application of information theory have emerged only recently. This article is part of a Special Issue entitled Systems Biology of Microorganisms. Copyright © 2011 Elsevier B.V. All rights reserved.
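The mutual-information measures reviewed here can be illustrated with a toy input-output table for a signaling pathway. The joint distributions below are hypothetical, contrasting a noisy channel with a more reliable one:

```python
import numpy as np

def mutual_information(joint):
    # I(X;Y) = sum_xy p(x,y) log2 [ p(x,y) / (p(x) p(y)) ]
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    mask = joint > 0
    return float((joint[mask] * np.log2(joint[mask] / (px @ py)[mask])).sum())

# Hypothetical joint distributions: rows = ligand absent/present,
# columns = pathway off/on
noisy_pathway = [[0.30, 0.20],
                 [0.20, 0.30]]
reliable_pathway = [[0.45, 0.05],
                    [0.05, 0.45]]
print(mutual_information(noisy_pathway))     # low: signal barely transmitted
print(mutual_information(reliable_pathway))  # ≈ 0.53 bits
```

In experimental work the joint distribution would be estimated from single-cell input-output measurements rather than assumed.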
Reasonable fermionic quantum information theories require relativity
International Nuclear Information System (INIS)
Friis, Nicolai
2016-01-01
We show that any quantum information theory based on anticommuting operators must be supplemented by a superselection rule deeply rooted in relativity to establish a reasonable notion of entanglement. While quantum information may be encoded in the fermionic Fock space, the unrestricted theory has a peculiar feature: the marginals of bipartite pure states need not have identical entropies, which leads to an ambiguous definition of entanglement. We solve this problem, by proving that it is removed by relativity, i.e., by the parity superselection rule that arises from Lorentz invariance via the spin-statistics connection. Our results hence unveil a fundamental conceptual inseparability of quantum information and the causal structure of relativistic field theory. (paper)
Information theory, spectral geometry, and quantum gravity.
Kempf, Achim; Martin, Robert
2008-01-18
We show that there exists a deep link between the two disciplines of information theory and spectral geometry. This allows us to obtain new results on a well-known quantum gravity motivated natural ultraviolet cutoff which describes an upper bound on the spatial density of information. Concretely, we show that, together with an infrared cutoff, this natural ultraviolet cutoff beautifully reduces the path integral of quantum field theory on curved space to a finite number of ordinary integrations. We then show, in particular, that the subsequent removal of the infrared cutoff is safe.
Ranger, Jochen; Kuhn, Jörg-Tobias; Szardenings, Carsten
2016-05-01
Psychological tests are usually analysed with item response models. Recently, some alternative measurement models have been proposed that were derived from cognitive process models developed in experimental psychology. These models consider the responses but also the response times of the test takers. Two such models are the Q-diffusion model and the D-diffusion model. Both models can be calibrated with the diffIRT package of the R statistical environment via marginal maximum likelihood (MML) estimation. In this manuscript, an alternative approach to model calibration is proposed. The approach is based on weighted least squares estimation and parallels the standard estimation approach in structural equation modelling. Estimates are determined by minimizing the discrepancy between the observed and the implied covariance matrix. The estimator is simple to implement, consistent, and asymptotically normally distributed. Least squares estimation also provides a test of model fit by comparing the observed and implied covariance matrix. The estimator and the test of model fit are evaluated in a simulation study. Although parameter recovery is good, the estimator is less efficient than the MML estimator. © 2016 The British Psychological Society.
Writing, Proofreading and Editing in Information Theory
Directory of Open Access Journals (Sweden)
J. Ricardo Arias-Gonzalez
2018-05-01
Full Text Available Information is a physical entity amenable to description by an abstract theory. The concepts associated with the creation and post-processing of information have not, however, been mathematically established, despite being broadly used in many fields of knowledge. Here, inspired by how information is managed in biomolecular systems, we introduce writing, entailing any bit-string generation, and revision, comprising proofreading and editing, in information chains. Our formalism expands the thermodynamic analysis of stochastic chains made up of material subunits to abstract strings of symbols. We introduce a non-Markovian treatment of operational rules over the symbols of the chain that parallels the physical interactions responsible for memory effects in material chains. Our theory underlies any communication system, ranging from human languages and computer science to gene evolution.
Geometrical identification of quantum and information theories
International Nuclear Information System (INIS)
Caianiello, E.R.
1983-01-01
The interrelation of quantum and information theories is investigated on the basis of the concept of cross-entropy. It is assumed that ''complex information geometry'' may serve as a tool for ''technological transfer'' from one research field to another that is not directly connected with the first. It is pointed out that the ''infinitesimal distance'' ds² and the ''infinitesimal cross-entropy'' dH_c coincide.
Structural information theory and visual form
Leeuwenberg, E.L.J.; Kaernbach, C.; Schroeger, E.; Mueller, H.
2003-01-01
The paper attends to basic characteristics of visual form as approached by Structural information theory, or SIT, (Leeuwenberg, Van der Helm and Van Lier). The introduction provides a global survey of this approach. The main part of the paper focuses on three characteristics of SIT. Each one is made
A THEORY OF MAXIMIZING SENSORY INFORMATION
Hateren, J.H. van
1992-01-01
A theory is developed on the assumption that early sensory processing aims at maximizing the information rate in the channels connecting the sensory system to more central parts of the brain, where it is assumed that these channels are noisy and have a limited dynamic range. Given a stimulus power
Lectures on algebraic model theory
Hart, Bradd
2001-01-01
In recent years, model theory has had remarkable success in solving important problems as well as in shedding new light on our understanding of them. The three lectures collected here present recent developments in three such areas: Anand Pillay on differential fields, Patrick Speissegger on o-minimality and Matthias Clasen and Matthew Valeriote on tame congruence theory.
Model integration and a theory of models
Dolk, Daniel R.; Kottemann, Jeffrey E.
1993-01-01
Model integration extends the scope of model management to include the dimension of manipulation as well. This invariably leads to comparisons with database theory. Model integration is viewed from four perspectives: Organizational, definitional, procedural, and implementational. Strategic modeling is discussed as the organizational motivation for model integration. Schema and process integration are examined as the logical and manipulation counterparts of model integr...
Automated image segmentation using information theory
International Nuclear Information System (INIS)
Hibbard, L.S.
2001-01-01
Full text: Our development of automated contouring of CT images for RT planning is based on maximum a posteriori (MAP) analyses of region textures, edges, and prior shapes, and assumes stationary Gaussian distributions for voxel textures and contour shapes. Since models may not accurately represent image data, it would be advantageous to compute inferences without relying on models. The relative entropy (RE) from information theory can generate inferences based solely on the similarity of probability distributions. The entropy of the distribution of a random variable X is defined as -Σ_x p(x) log₂ p(x) over all the values x which X may assume. The RE (Kullback-Leibler divergence) of two distributions p(X), q(X) over X is Σ_x p(x) log₂ [p(x)/q(x)]. The RE is a kind of 'distance' between p and q, equaling zero when p = q and increasing as p and q become more different. Minimum-error MAP and likelihood-ratio decision rules have RE equivalents: minimum-error decisions are obtained with functions of the differences between the REs of the compared distributions. One applied result is that the contour ideally separating two regions is the one that maximizes the relative entropy of the two regions' intensities. A program was developed that automatically contours the outlines of patients in stereotactic headframes, a situation that most often requires manual drawing. The relative entropy of intensities inside the contour (patient) versus outside (background) was maximized by conjugate gradient descent over the space of parameters of a deformable contour. The figure shows the computed segmentation of a patient from headframe backgrounds. This program is particularly useful for preparing images for multimodal image fusion. Relative entropy and allied measures of distribution similarity provide automated contouring criteria that do not depend on statistical models of image data. This approach should have wide utility in medical image segmentation applications. Copyright (2001) Australasian College of Physical Scientists and
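The relative entropy criterion this abstract uses can be computed directly. A minimal sketch (the intensity histograms are invented for illustration): D(p||q) is zero for identical region statistics and grows as the inside/outside histograms separate, which is what the deformable contour maximizes.

```python
from math import log2

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p||q) in bits for two discrete
    distributions given as aligned probability lists; assumes q > 0
    wherever p > 0."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

inside = [0.7, 0.2, 0.1]   # intensity histogram inside a candidate contour
outside = [0.1, 0.2, 0.7]  # intensity histogram outside it
print(relative_entropy(inside, inside))   # identical regions: divergence 0
print(round(relative_entropy(inside, outside), 3))  # well-separated regions: large
```

A segmentation loop would perturb the contour parameters and keep moves that increase this value, with no Gaussian model of the voxel intensities required.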
Quantum theory informational foundations and foils
Spekkens, Robert
2016-01-01
This book provides the first unified overview of the burgeoning research area at the interface between Quantum Foundations and Quantum Information. Topics include: operational alternatives to quantum theory, information-theoretic reconstructions of the quantum formalism, mathematical frameworks for operational theories, and device-independent features of the set of quantum correlations. Powered by the injection of fresh ideas from the field of Quantum Information and Computation, the foundations of Quantum Mechanics are in the midst of a renaissance. The last two decades have seen an explosion of new results and research directions, attracting broad interest in the scientific community. The variety and number of different approaches, however, makes it challenging for a newcomer to obtain a big picture of the field and of its high-level goals. Here, fourteen original contributions from leading experts in the field cover some of the most promising research directions that have emerged in the new wave of quant...
An information integration theory of consciousness
Directory of Open Access Journals (Sweden)
Tononi Giulio
2004-11-01
Full Text Available Abstract Background Consciousness poses two main problems. The first is understanding the conditions that determine to what extent a system has conscious experience. For instance, why is our consciousness generated by certain parts of our brain, such as the thalamocortical system, and not by other parts, such as the cerebellum? And why are we conscious during wakefulness and much less so during dreamless sleep? The second problem is understanding the conditions that determine what kind of consciousness a system has. For example, why do specific parts of the brain contribute specific qualities to our conscious experience, such as vision and audition? Presentation of the hypothesis This paper presents a theory about what consciousness is and how it can be measured. According to the theory, consciousness corresponds to the capacity of a system to integrate information. This claim is motivated by two key phenomenological properties of consciousness: differentiation – the availability of a very large number of conscious experiences; and integration – the unity of each such experience. The theory states that the quantity of consciousness available to a system can be measured as the Φ value of a complex of elements. Φ is the amount of causally effective information that can be integrated across the informational weakest link of a subset of elements. A complex is a subset of elements with Φ>0 that is not part of a subset of higher Φ. The theory also claims that the quality of consciousness is determined by the informational relationships among the elements of a complex, which are specified by the values of effective information among them. Finally, each particular conscious experience is specified by the value, at any given time, of the variables mediating informational interactions among the elements of a complex. Testing the hypothesis The information integration theory accounts, in a principled manner, for several neurobiological observations
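The "informational weakest link" idea in the abstract above can be illustrated in miniature. The sketch below is a drastically simplified stand-in for Tononi's Φ (the real measure uses perturbation-based effective information, not observed correlations): it takes the minimum mutual information across all bipartitions of a small binary system.

```python
from itertools import combinations
from math import log2

def entropy(probs):
    """Shannon entropy in bits of an iterable of probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

def marginal(joint, idxs):
    """Marginal distribution over the units at positions idxs."""
    out = {}
    for state, p in joint.items():
        key = tuple(state[i] for i in idxs)
        out[key] = out.get(key, 0.0) + p
    return list(out.values())

def phi_toy(joint, n):
    """Mutual information across the weakest bipartition of an n-unit
    system whose state distribution is {state-tuple: prob}."""
    weakest = float("inf")
    for r in range(1, n):
        for part in combinations(range(n), r):
            a = list(part)
            b = [i for i in range(n) if i not in part]
            mi = (entropy(marginal(joint, a)) + entropy(marginal(joint, b))
                  - entropy(joint.values()))
            weakest = min(weakest, mi)
    return weakest

# Three binary units that always agree: every cut severs 1 bit of integration.
chain = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}
# Units 0 and 1 agree but unit 2 is independent: the system falls apart there.
loose = {(0, 0, 0): 0.25, (0, 0, 1): 0.25, (1, 1, 0): 0.25, (1, 1, 1): 0.25}
print(phi_toy(chain, 3), phi_toy(loose, 3))  # 1.0 vs 0.0
```

The second system scores zero despite internal correlation because one cut finds no information crossing it, mirroring the theory's claim that a complex must be integrated across every subset boundary.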
Warped models in string theory
International Nuclear Information System (INIS)
Acharya, B.S.; Benini, F.; Valandro, R.
2006-12-01
Warped models, originating with the ideas of Randall and Sundrum, provide a fascinating extension of the standard model with interesting consequences for the LHC. We investigate in detail how string theory realises such models, with emphasis on fermion localisation and the computation of Yukawa couplings. We find, in contrast to the 5d models, that fermions can be localised anywhere in the extra dimension, and that there are new mechanisms to generate exponential hierarchies amongst the Yukawa couplings. We also suggest a way to distinguish these string theory models with data from the LHC. (author)
Fort, Hugo
2017-01-01
We derive an analytical approximation for making quantitative predictions for ecological communities as a function of the mean intensity of the inter-specific competition and the species richness. This method, with only a fraction of the model parameters (carrying capacities and competition coefficients), is able to predict accurately empirical measurements covering a wide variety of taxa (algae, plants, protozoa).
Algorithmic information theory mathematics of digital information processing
Seibt, Peter
2007-01-01
Treats the mathematics of many important areas in digital information processing. This book covers, in a unified presentation, five topics: Data Compression, Cryptography, Sampling (Signal Theory), Error Control Codes, and Data Reduction. It is useful for teachers, students and practitioners in Electronic Engineering, Computer Science and Mathematics.
Information theory and stochastics for multiscale nonlinear systems
Majda, Andrew J; Grote, Marcus J
2005-01-01
This book introduces mathematicians to the fascinating emerging mathematical interplay between ideas from stochastics and information theory and important practical issues in studying complex multiscale nonlinear systems. It emphasizes the serendipity between modern applied mathematics and applications where rigorous analysis, the development of qualitative and/or asymptotic models, and numerical modeling all interact to explain complex phenomena. After a brief introduction to the emerging issues in multiscale modeling, the book has three main chapters. The first chapter is an introduction to information theory with novel applications to statistical mechanics, predictability, and Jupiter's Red Spot for geophysical flows. The second chapter discusses new mathematical issues regarding fluctuation-dissipation theorems for complex nonlinear systems including information flow, various approximations, and illustrates applications to various mathematical models. The third chapter discusses stochastic modeling of com...
An, Ji-Young
2006-01-01
The purpose of this web-based study was to explain and predict consumers' acceptance and usage behavior of Internet health information and services. Toward this goal, the Information and Communication Technology Acceptance Model (ICTAM) was developed and tested. Individuals who received a flyer through the LISTSERV of HealthGuide were eligible to participate. The study population comprised adults aged eighteen and older who had used Internet health information and services for a minimum of 6 months. For the analyses, SPSS (version 13.0) and AMOS (version 5.0) were employed. More than half of the respondents were women (n = 110, 55%). The average age of the respondents was 35.16 years (S.D. = 10.07). A majority reported at least some college education (n = 126, 63%). All of the observed factors accounted for 75.53% of the total variance explained. The fit indices of the structural model were within an acceptable range: χ²/df = 2.38 (χ² = 1786.31, df = 752); GFI = .71; RMSEA = .08; CFI = .86; NFI = .78. The results of this study provide empirical support for the continued development of ICTAM in the area of health consumers' information and communication technology acceptance.
Quantum information theory and quantum statistics
International Nuclear Information System (INIS)
Petz, D.
2008-01-01
Based on lectures given by the author, this book focuses on providing reliable introductory explanations of key concepts of quantum information theory and quantum statistics - rather than on results. The mathematically rigorous presentation is supported by numerous examples and exercises and by an appendix summarizing the relevant aspects of linear analysis. Assuming that the reader is familiar with the content of standard undergraduate courses in quantum mechanics, probability theory, linear algebra and functional analysis, the book addresses graduate students of mathematics and physics as well as theoretical and mathematical physicists. Conceived as a primer to bridge the gap between statistical physics and quantum information, a field to which the author has contributed significantly himself, it emphasizes concepts and thorough discussions of the fundamental notions to prepare the reader for deeper studies, not least through the selection of well chosen exercises. (orig.)
The informationally-complete quantum theory
Chen, Zeng-Bing
2014-01-01
Quantum mechanics is a cornerstone of our current understanding of nature and extremely successful in describing physics covering a huge range of scales. However, its interpretation remains controversial since the early days of quantum mechanics. What does a quantum state really mean? Is there any way out of the so-called quantum measurement problem? Here we present an informationally-complete quantum theory (ICQT) and the trinary property of nature to beat the above problems. We assume that ...
Information theory of open fragmenting systems
International Nuclear Information System (INIS)
Gulminelli, F.; Juillet, O.; Chomaz, Ph.; Ison, M. J.; Dorso, C. O.
2007-01-01
An information theory description of finite systems explicitly evolving in time is presented. We impose a MaxEnt variational principle on the Shannon entropy at a given time while the constraints are set at a former time. The resulting density matrix contains explicit time odd components in the form of collective flows. As a specific application we consider the dynamics of the expansion in connection with heavy ion experiments. Lattice gas and classical molecular dynamics simulations are shown
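The MaxEnt step the abstract describes, maximizing Shannon entropy subject to constraints, can be shown in miniature. A sketch with invented values (not the authors' density-matrix calculation): for a single mean constraint the MaxEnt solution has Boltzmann-like weights p_i ∝ exp(-λ x_i), and the Lagrange multiplier λ can be found by bisection.

```python
from math import exp

def maxent_with_mean(values, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy distribution over discrete `values` subject to a
    fixed mean: p_i proportional to exp(-lam * x_i), with lam found by
    bisection. Assumes target_mean is attainable within the lam bracket."""
    def mean(lam):
        w = [exp(-lam * x) for x in values]
        z = sum(w)
        return sum(x * wi for x, wi in zip(values, w)) / z

    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean(mid) > target_mean:  # mean(lam) decreases as lam grows
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [exp(-lam * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]

# A 4-state system constrained to mean 1.0 (uniform would give 1.5):
p = maxent_with_mean([0, 1, 2, 3], 1.0)
print([round(pi, 3) for pi in p])
```

The paper's construction differs in that its constraints are set at an earlier time than the entropy being maximized, which is what introduces the time-odd flow terms; the Lagrange-multiplier machinery, however, is the same.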
Final Summary: Genre Theory in Information Studies
DEFF Research Database (Denmark)
Andersen, Jack
2015-01-01
Purpose This chapter offers a re-description of knowledge organization in light of genre and activity theory. Knowledge organization needs a new description in order to account for those activities and practices constituting and causing concrete knowledge organization activity. Genre and activity...... informing and shaping concrete forms of knowledge organization activity. With this, we are able to understand how knowledge organization activity also contributes to constructing genre and activity systems, not merely aiding them.
Model Theory in Algebra, Analysis and Arithmetic
Dries, Lou; Macpherson, H Dugald; Pillay, Anand; Toffalori, Carlo; Wilkie, Alex J
2014-01-01
Presenting recent developments and applications, the book focuses on four main topics in current model theory: 1) the model theory of valued fields; 2) undecidability in arithmetic; 3) NIP theories; and 4) the model theory of real and complex exponentiation. Young researchers in model theory will particularly benefit from the book, as will more senior researchers in other branches of mathematics.
Stahl, Robert J.
This review of the current status of the human information processing model presents the Stahl Perceptual Information Processing and Operations Model (SPInPrOM) as a model of how thinking, memory, and the processing of information take place within the individual learner. A related system, the Domain of Cognition, is presented as an alternative to…
Multimedia information retrieval theory and techniques
Raieli, Roberto
2013-01-01
Novel processing and searching tools for the management of new multimedia documents have been developed. Multimedia Information Retrieval (MMIR) is an organic system made up of Text Retrieval (TR); Visual Retrieval (VR); Video Retrieval (VDR); and Audio Retrieval (AR) systems. So that each type of digital document may be analysed and searched by the elements of language appropriate to its nature, search criteria must be extended. Such an approach is known as Content-Based Information Retrieval (CBIR), and is the core of MMIR. This novel content-based concept of information handling needs to be integrated with more traditional semantics. Multimedia Information Retrieval focuses on the tools of processing and searching applicable to the content-based management of new multimedia documents. Translated from Italian by Giles Smith, the book is divided into two parts. Part one discusses MMIR and related theories, and puts forward new methodologies; part two reviews various experimental and operating MMIR systems, a...
1982-08-01
accomplish the task, (2) the instrumentality of task performance for job outcomes, and (3) the instrumentality of outcomes for need satisfaction. We...in this discussion: effort, performance, outcomes, and needs. In order to present briefly the conventional approach to the Vroom models, another...Presumably, this is the final event in the sequence of effort, performance, outcome, and need satisfaction. The actual research reported in expectancy
Information Foraging Theory: A Framework for Intelligence Analysis
2014-11-01
oceanographic information, human intelligence (HUMINT), open-source intelligence (OSINT), and information provided by other governmental departments [1][5... Abbreviations: HUMINT, Human Intelligence; IFT, Information Foraging Theory; LSA, Latent Semantic Similarity; MVT, Marginal Value Theorem; OFT, Optimal Foraging Theory; OSINT
Modelling Choice of Information Sources
Directory of Open Access Journals (Sweden)
Agha Faisal Habib Pathan
2013-04-01
Full Text Available This paper addresses the significance of traveller information sources, including mono-modal and multimodal websites, for travel decisions. The research follows a decision paradigm developed earlier, involving an information acquisition process for travel choices, and identifies the abstract characteristics of new information sources that deserve further investigation (e.g. by incorporating these in models and studying their significance in model estimation). A Stated Preference experiment is developed and the utility functions are formulated by expanding the travellers' choice set to include different combinations of sources of information. In order to study the underlying choice mechanisms, the resulting variables are examined in models based on different behavioural strategies, including utility maximisation and minimising the regret associated with the foregone alternatives. This research confirmed that RRM (Random Regret Minimisation) theory can fruitfully be used and can provide important insights for behavioural studies. The study also analyses the properties of travel planning websites and establishes a link between travel choices and the content, provenance, design, presence of advertisements, and presentation of information. The results indicate that travellers give particular credence to government-owned sources and put more importance on their own previous experiences than on any other single source of information. Information from multimodal websites is more influential than that on train-only websites. This in turn is more influential than information from friends, while information from coach-only websites is the least influential. A website with less search time, specific information on users' own criteria, and real-time information is regarded as most attractive.
Refreshing Information Literacy: Learning from Recent British Information Literacy Models
Martin, Justine
2013-01-01
Models play an important role in helping practitioners implement and promote information literacy. Over time models can lose relevance with the advances in technology, society, and learning theory. Practitioners and scholars often call for adaptations or transformations of these frameworks to articulate the learning needs in information literacy…
Graphical Model Theory for Wireless Sensor Networks
International Nuclear Information System (INIS)
Davis, William B.
2002-01-01
Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort, and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including: sensor validation and fusion; data compression and channel coding; expert systems, with decentralized data structures, and efficient local queries; pattern classification, and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm.
Public Management Information Systems: Theory and Prescription.
Bozeman, Barry; Bretschneider, Stuart
1986-01-01
The existing theoretical framework for research in management information systems (MIS) is criticized for its lack of attention to the external environment of organizations, and a new framework is developed which better accommodates MIS in public organizations: public management information systems. Four models of publicness that reflect external…
Astrophysical data analysis with information field theory
International Nuclear Information System (INIS)
Enßlin, Torsten
2014-01-01
Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented
Astrophysical data analysis with information field theory
Enßlin, Torsten
2014-12-01
Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.
Astrophysical data analysis with information field theory
Energy Technology Data Exchange (ETDEWEB)
Enßlin, Torsten, E-mail: ensslin@mpa-garching.mpg.de [Max Planck Institut für Astrophysik, Karl-Schwarzschild-Straße 1, D-85748 Garching, Germany and Ludwig-Maximilians-Universität München, Geschwister-Scholl-Platz 1, D-80539 München (Germany)
2014-12-05
Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.
Quantum information theory. Mathematical foundation. 2. ed.
International Nuclear Information System (INIS)
Hayashi, Masahito
2017-01-01
This graduate textbook provides a unified view of quantum information theory. Clearly explaining the necessary mathematical basis, it merges key topics from both information-theoretic and quantum-mechanical viewpoints and provides lucid explanations of the basic results. Thanks to this unified approach, it makes accessible such advanced topics in quantum communication as quantum teleportation, superdense coding, quantum state transmission (quantum error-correction) and quantum encryption. Since the publication of the preceding book Quantum Information: An Introduction, there have been tremendous strides in the field of quantum information. In particular, the following topics - all of which are addressed here - have seen major advances: quantum state discrimination, quantum channel capacity, bipartite and multipartite entanglement, security analysis on quantum communication, reverse Shannon theorem and uncertainty relation. With regard to the analysis of quantum security, the present book employs an improved method for the evaluation of leaked information and identifies a remarkable relation between quantum security and quantum coherence. Taken together, these two improvements allow a better analysis of quantum state transmission. In addition, various types of the newly discovered uncertainty relation are explained. Presenting a wealth of new developments, the book introduces readers to the latest advances and challenges in quantum information. To aid in understanding, each chapter is accompanied by a set of exercises and solutions.
Quantum information theory. Mathematical foundation. 2. ed.
Energy Technology Data Exchange (ETDEWEB)
Hayashi, Masahito [Nagoya Univ. (Japan). Graduate School of Mathematics
2017-07-01
This graduate textbook provides a unified view of quantum information theory. Clearly explaining the necessary mathematical basis, it merges key topics from both information-theoretic and quantum-mechanical viewpoints and provides lucid explanations of the basic results. Thanks to this unified approach, it makes accessible such advanced topics in quantum communication as quantum teleportation, superdense coding, quantum state transmission (quantum error-correction) and quantum encryption. Since the publication of the preceding book Quantum Information: An Introduction, there have been tremendous strides in the field of quantum information. In particular, the following topics - all of which are addressed here - have seen major advances: quantum state discrimination, quantum channel capacity, bipartite and multipartite entanglement, security analysis on quantum communication, reverse Shannon theorem and uncertainty relation. With regard to the analysis of quantum security, the present book employs an improved method for the evaluation of leaked information and identifies a remarkable relation between quantum security and quantum coherence. Taken together, these two improvements allow a better analysis of quantum state transmission. In addition, various types of the newly discovered uncertainty relation are explained. Presenting a wealth of new developments, the book introduces readers to the latest advances and challenges in quantum information. To aid in understanding, each chapter is accompanied by a set of exercises and solutions.
Fundamentals of information theory and coding design
Togneri, Roberto
2003-01-01
In a clear, concise, and modular format, this book introduces the fundamental concepts and mathematics of information and coding theory. The authors emphasize how a code is designed and discuss the main properties and characteristics of different coding algorithms along with strategies for selecting the appropriate codes to meet specific requirements. They provide comprehensive coverage of source and channel coding, address arithmetic, BCH, and Reed-Solomon codes and explore some more advanced topics such as PPM compression and turbo codes. Worked examples and sets of basic and advanced exercises in each chapter reinforce the text's clear explanations of all concepts and methodologies.
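As a taste of the source-coding material such a text covers, here is a minimal Huffman coder (an illustrative sketch, not code from the book). For dyadic symbol probabilities the average code length meets the Shannon entropy bound exactly.

```python
import heapq
from math import log2

def huffman(freqs):
    """Build a binary Huffman code for {symbol: probability};
    returns {symbol: bitstring}. The integer tiebreaker keeps heap
    comparisons away from the (non-comparable) code dicts."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (p1 + p2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman(probs)
avg = sum(p * len(code[s]) for s, p in probs.items())
H = -sum(p * log2(p) for p in probs.values())
print(code, avg, H)  # average length 1.75 bits equals the entropy
```

With non-dyadic probabilities the average length exceeds the entropy by less than one bit per symbol, which is the classic bound such textbooks derive for prefix codes.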
Quantum information theory with Gaussian systems
Energy Technology Data Exchange (ETDEWEB)
Krueger, O.
2006-04-06
This thesis applies ideas and concepts from quantum information theory to continuous-variable systems such as the quantum harmonic oscillator. The focus is on three topics: the cloning of coherent states, Gaussian quantum cellular automata and Gaussian private channels. Cloning was investigated both for finite-dimensional and for continuous-variable systems. We construct a private quantum channel for the sequential encryption of coherent states with a classical key, where the key elements have finite precision. For the case of independent one-mode input states, we explicitly estimate this precision, i.e. the number of key bits needed per input state, in terms of these parameters. (orig.)
Feminist Praxis, Critical Theory and Informal Hierarchies
Directory of Open Access Journals (Sweden)
Eva Giraud
2015-05-01
This article draws on my experiences teaching across two undergraduate media modules in a UK research-intensive institution to explore tactics for combatting both institutional and informal hierarchies within university teaching contexts. Building on Sara Motta’s (2012) exploration of implementing critical pedagogic principles at postgraduate level in an elite university context, I discuss additional tactics for combatting these hierarchies in undergraduate settings, which were developed by transferring insights derived from informal workshops led by the University of Nottingham’s Feminism and Teaching network into the classroom. This discussion is framed in relation to the concepts of “cyborg pedagogies” and “political semiotics of articulation,” derived from the work of Donna Haraway, in order to theorize how these tactics can engender productive relationships between radical pedagogies and critical theory.
Quantum Gravity, Information Theory and the CMB
Kempf, Achim
2018-04-01
We review connections between the metric of spacetime and the quantum fluctuations of fields. We start with the finding that the spacetime metric can be expressed entirely in terms of the 2-point correlator of the fluctuations of quantum fields. We then discuss the open question whether the knowledge of only the spectra of the quantum fluctuations of fields also suffices to determine the spacetime metric. This question is of interest because spectra are geometric invariants and their quantization would, therefore, have the benefit of not requiring the modding out of diffeomorphisms. Further, we discuss the fact that spacetime at the Planck scale need not necessarily be either discrete or continuous. Instead, results from information theory show that spacetime may be simultaneously discrete and continuous in the same way that information can. Finally, we review the recent finding that a covariant natural ultraviolet cutoff at the Planck scale implies a signature in the cosmic microwave background (CMB) that may become observable.
An information theory of image gathering
Fales, Carl L.; Huck, Friedrich O.
1991-01-01
Shannon's mathematical theory of communication is extended to image gathering. Expressions are obtained for the total information that is received with a single image-gathering channel and with parallel channels. It is concluded that the aliased signal components carry information even though these components interfere with the within-passband components in conventional image gathering and restoration, thereby degrading the fidelity and visual quality of the restored image. An examination of the expression for minimum mean-square-error, or Wiener-matrix, restoration from parallel image-gathering channels reveals a method for unscrambling the within-passband and aliased signal components to restore spatial frequencies beyond the sampling passband out to the spatial frequency response cutoff of the optical aperture.
Information theory perspective on network robustness
International Nuclear Information System (INIS)
Schieber, Tiago A.; Carpi, Laura; Frery, Alejandro C.; Rosso, Osvaldo A.; Pardalos, Panos M.; Ravetti, Martín G.
2016-01-01
A crucial challenge in network theory is the study of the robustness of a network when facing a sequence of failures. In this work, we propose a dynamical definition of network robustness based on Information Theory that considers measurements of the structural changes caused by failures of the network's components. Failures are defined here as a temporal process, given as a sequence of events. Robustness is then evaluated by measuring dissimilarities between topologies after each time step of the sequence, providing dynamical information about the topological damage. We thoroughly analyze the efficiency of the method in capturing small perturbations by considering different probability distributions on networks. In particular, we find that distributions based on distances are more consistent in capturing network structural deviations, as they better reflect the consequences of the failures. Theoretical examples and real networks are used to study the performance of this methodology. - Highlights: • A novel methodology to measure the robustness of a network to component failure or targeted attacks is proposed. • The use of the network's distance PDF allows a precise analysis. • The method provides a dynamic robustness profile showing the response of the topology to each failure event. • The measure is capable of detecting the network's critical elements.
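The abstract's core idea, scoring topological damage as a dissimilarity between probability distributions before and after a failure, can be sketched with the Jensen-Shannon divergence on degree distributions. This is an illustrative stand-in: the paper's own distance-based PDF and dissimilarity measure are not reproduced here, and the toy graph is ours.

```python
import math
from collections import Counter

def degree_distribution(adj):
    """Empirical distribution of node degrees for an adjacency dict."""
    degs = [len(neigh) for neigh in adj.values()]
    counts = Counter(degs)
    return {k: c / len(degs) for k, c in counts.items()}

def js_divergence(p, q):
    """Jensen-Shannon divergence in bits; bounded in [0, 1]."""
    support = set(p) | set(q)
    m = {k: 0.5 * (p.get(k, 0) + q.get(k, 0)) for k in support}
    def kl(a):
        return sum(a.get(k, 0) * math.log2(a.get(k, 0) / m[k])
                   for k in support if a.get(k, 0) > 0)
    return 0.5 * kl(p) + 0.5 * kl(q)

# toy network: a star on node 0 plus one extra edge; then node 0 fails
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1}, 3: {0}}
p = degree_distribution(adj)
failed = {u: vs - {0} for u, vs in adj.items() if u != 0}
q = degree_distribution(failed)
damage = js_divergence(p, q)  # larger value = larger topological change
```

Repeating this after each failure in a sequence yields the kind of dynamic robustness profile the highlights describe.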
Minisuperspace models in histories theory
International Nuclear Information System (INIS)
Anastopoulos, Charis; Savvidou, Ntina
2005-01-01
We study the Robertson-Walker minisuperspace model in histories theory, motivated by the results that emerged from the histories approach to general relativity. We examine, in particular, the issue of time reparametrization in such systems. The model is quantized using an adaptation of reduced state space quantization. We finally discuss the classical limit, the implementation of initial cosmological conditions and the estimation of probabilities in the histories context.
Observational information for f(T) theories and dark torsion
Energy Technology Data Exchange (ETDEWEB)
Bengochea, Gabriel R., E-mail: gabriel@iafe.uba.a [Instituto de Astronomia y Fisica del Espacio (IAFE), CC 67, Suc. 28, 1428 Buenos Aires (Argentina)
2011-01-17
In the present work we analyze and compare the information coming from different observational data sets in the context of a class of f(T) theories. We perform a joint analysis with measurements of the most recent type Ia supernovae (SNe Ia), Baryon Acoustic Oscillations (BAO), Cosmic Microwave Background radiation (CMB), Gamma-Ray Burst data (GRBs) and Hubble parameter observations (OHD) to constrain the only new parameter these theories have. It is shown that when the new combined BAO/CMB parameter is used to put constraints, the result is different from previous works. We also show that when we include Observational Hubble Data (OHD) the simpler ΛCDM model is excluded at the one-sigma level, leading the effective equation of state of these theories to be of phantom type. Also, analyzing a tension criterion for SNe Ia and other observational sets, we obtain more consistent and better suited data sets to work with these theories.
Informal Theory: The Ignored Link in Theory-to-Practice
Love, Patrick
2012-01-01
Applying theory to practice in student affairs is dominated by the assumption that formal theory is directly applied to practice. Among the problems with this assumption is that many practitioners believe they must choose between their lived experiences and formal theory, and that graduate students are taught that their experience "does not…
Building information modelling (BIM)
CSIR Research Space (South Africa)
Conradie, Dirk CU
2009-02-01
The concept of a Building Information Model (BIM), also known as a Building Product Model (BPM), is nothing new. A short article on BIM can never cover the entire field, because it is a particularly complex field that is only recently beginning to receive...
Intuitive theories of information: beliefs about the value of redundancy.
Soll, J B
1999-03-01
In many situations, quantity estimates from multiple experts or diagnostic instruments must be collected and combined. Normatively, and all else equal, one should value information sources that are nonredundant, in the sense that correlation in forecast errors should be minimized. Past research on the preference for redundancy has been inconclusive. While some studies have suggested that people correctly place higher value on uncorrelated inputs when collecting estimates, others have shown that people either ignore correlation or, in some cases, even prefer it. The present experiments show that the preference for redundancy depends on one's intuitive theory of information. The most common intuitive theory identified is the Error Tradeoff Model (ETM), which explicitly distinguishes between measurement error and bias. According to ETM, measurement error can only be averaged out by consulting the same source multiple times (normatively false), and bias can only be averaged out by consulting different sources (normatively true). As a result, ETM leads people to prefer redundant estimates when the ratio of measurement error to bias is relatively high. Other participants favored different theories. Some adopted the normative model, while others were reluctant to mathematically average estimates from different sources in any circumstance. In a post hoc analysis, science majors were more likely than others to subscribe to the normative model. While tentative, this result lends insight into how intuitive theories might develop and also has potential ramifications for how statistical concepts such as correlation might best be learned and internalized. Copyright 1999 Academic Press.
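The normative claim in the abstract above, that redundant (correlated) sources are worth less, can be checked with a small Monte Carlo sketch (parameter values are arbitrary illustrations, not from the study): when bias dominates measurement error, averaging two independent sources yields a lower mean squared error than consulting the same biased source twice.

```python
import random
import statistics

random.seed(0)
TRUTH = 100.0
BIAS_SD, NOISE_SD = 5.0, 2.0  # bias dominates measurement error

def estimate(bias):
    """One estimate: truth + source bias + independent measurement error."""
    return TRUTH + bias + random.gauss(0, NOISE_SD)

def mse_of_average(trials, redundant):
    errs = []
    for _ in range(trials):
        if redundant:
            # same source consulted twice: shared bias, fresh noise
            b = random.gauss(0, BIAS_SD)
            avg = (estimate(b) + estimate(b)) / 2
        else:
            # two independent sources: independent biases and noise
            avg = (estimate(random.gauss(0, BIAS_SD)) +
                   estimate(random.gauss(0, BIAS_SD))) / 2
        errs.append((avg - TRUTH) ** 2)
    return statistics.mean(errs)

mse_red = mse_of_average(20000, redundant=True)   # theory: 25 + 4/2 = 27
mse_ind = mse_of_average(20000, redundant=False)  # theory: 25/2 + 4/2 = 14.5
```

Averaging independent sources halves the bias variance as well as the noise variance, which is exactly the component the Error Tradeoff Model wrongly claims redundant sampling can remove.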
Informal Risk Perceptions and Formal Theory
International Nuclear Information System (INIS)
Cayford, Jerry
2001-01-01
Economists have argued persuasively that our goals are wider than just risk minimization, and that they include a prudent weighing of costs and benefits. This economic line of thought recognizes that our policy goals are complex. As we widen the range of goals we are willing to entertain, though, we need to check that the methods we customarily employ are appropriate for the tasks to which we customarily apply them. This paper examines some economic methods of risk assessment, in light of the question of what our policy goals are and should be. Once the question of goals is open, more complexities than just cost intrude: what the public wants and why begs to be addressed. This leads us to the controversial issue of public risk perceptions. We have now examined a number of procedures that experts use to make public policy decisions. Behind all these issues is always the question of social welfare: what actions can we take, what policies should we embrace, to make the world a better place? In many cases, the public and the experts disagree about what the right choice is. In the first section, we saw a possible defense of the experts based on democratic theory: the people's participation, and even their will, can be legitimately set aside in the pursuit of their true interests. If this defense is to work, a great deal of weight rests on the question of the people's interests and the competence and integrity of the experts' pursuit of it. But at the same time, social preferences are ill-defined, and so are not good candidates for rational actor theory. Both the prescriptive legitimacy claim and the very workings of formal theory, we have seen, depend on informal, qualitative, political judgments. Unfortunately, we have also seen a steady pattern of expert reliance on technical procedures even when they were manifestly unsuited to the task. The experts seem so intent on excluding informal thought that they would prefer even a bad quantitative process to a qualitative one.
Foundations of compositional model theory
Czech Academy of Sciences Publication Activity Database
Jiroušek, Radim
2011-01-01
Roč. 40, č. 6 (2011), s. 623-678 ISSN 0308-1079 R&D Projects: GA MŠk 1M0572; GA ČR GA201/09/1891; GA ČR GEICC/08/E010 Institutional research plan: CEZ:AV0Z10750506 Keywords : multidimensional probability distribution * conditional independence * graphical Markov model * composition of distributions Subject RIV: IN - Informatics, Computer Science Impact factor: 0.667, year: 2011 http://library.utia.cas.cz/separaty/2011/MTR/jirousek-foundations of compositional model theory.pdf
Cyber Power Theory First, Then Information Operations
National Research Council Canada - National Science Library
Smart, Antoinette G
2001-01-01
...) seems disconcerting, at least on the surface. Think tanks, government research organizations, and learned individuals have all pointed to the need for a viable theory of IO, yet no such theory has emerged...
Superfield theory and supermatrix model
International Nuclear Information System (INIS)
Park, Jeong-Hyuck
2003-01-01
We study the noncommutative superspace of arbitrary dimensions in a systematic way. Superfield theories on a noncommutative superspace can be formulated in two ways: through the star product formalism and in terms of supermatrices. We elaborate the duality between them by constructing the isomorphism explicitly and relating the superspace integrations of the star product lagrangian or the superpotential to the traces of the supermatrices. We show there exists an interesting fine-tuned commutative limit where the duality can still be maintained. Namely, on the commutative superspace too, there exists a supermatrix model description for the superfield theory. We interpret the result in the context of the wave-particle duality. The dual particles for the superfields in even and odd spacetime dimensions are D-instantons and D0-branes respectively, consistent with the T-duality. (author)
Novel information theory techniques for phonon spectroscopy
International Nuclear Information System (INIS)
Hague, J P
2007-01-01
The maximum entropy method (MEM) and spectral reverse Monte Carlo (SRMC) techniques are applied to the determination of the phonon density of states (PDOS) from heat-capacity data. The approach presented here takes advantage of the standard integral transform relating the PDOS with the specific heat at constant volume. MEM and SRMC are highly successful numerical approaches for inverting integral transforms. The formalism and algorithms necessary to carry out the inversion of specific heat curves are introduced, and where possible, I have concentrated on algorithms and experimental details for practical usage. Simulated data are used to demonstrate the accuracy of the approach. The main strength of the techniques presented here is that the resulting spectra are always physical: the computed PDOS is always positive, and properly applied information theory techniques only show statistically significant detail. The treatment set out here provides a simple, cost-effective and reliable method to determine phonon properties of new materials. In particular, the new technique is expected to be very useful for establishing where interesting phonon modes and properties can be found, before spending time at large scale facilities
Steponavičius, Raimundas; Thennadil, Suresh N
2011-03-15
The effectiveness of a scatter correction approach based on decoupling absorption and scattering effects through the use of the radiative transfer theory to invert a suitable set of measurements is studied by considering a model multicomponent suspension. The method was used in conjunction with partial least-squares regression to build calibration models for estimating the concentration of two types of analytes: an absorbing (nonscattering) species and a particulate (absorbing and scattering) species. The performances of the models built by this approach were compared with those obtained by applying empirical scatter correction approaches to diffuse reflectance, diffuse transmittance, and collimated transmittance measurements. It was found that the method provided appreciable improvement in model performance for the prediction of both types of analytes. The study indicates that, as long as the bulk absorption spectra are accurately extracted, no further empirical preprocessing to remove light scattering effects is required.
National Research Council Canada - National Science Library
Harmon, Scott
2001-01-01
This project accomplished all of its objectives: document a theory of information physics, conduct a workshop on planning experiments to test this theory, and design experiments that validate this theory...
Models in cooperative game theory
Branzei, Rodica; Tijs, Stef
2008-01-01
This book investigates models in cooperative game theory in which the players have the possibility to cooperate partially. In a crisp game the agents are either fully involved or not involved at all in cooperation with some other agents, while in a fuzzy game players are allowed to cooperate at infinitely many different participation levels, varying from non-cooperation to full cooperation. A multi-choice game describes the intermediate case in which each player may have a fixed number of activity levels. Different set and one-point solution concepts for these games are presented. The properties of these solution concepts and their interrelations on several classes of crisp, fuzzy, and multi-choice games are studied. Applications of the investigated models to many economic situations are indicated as well. The second edition is substantially enlarged and contains new results and additional sections in the different chapters, as well as one new chapter.
Critical Theory and Information Studies: A Marcusean Infusion
Pyati, Ajit K.
2006-01-01
In the field of library and information science, also known as information studies, critical theory is often not included in debates about the discipline's theoretical foundations. This paper argues that the critical theory of Herbert Marcuse, in particular, has a significant contribution to make to the field of information studies. Marcuse's…
Field theory and the Standard Model
Energy Technology Data Exchange (ETDEWEB)
Dudas, E [Orsay, LPT (France)
2014-07-01
This brief introduction to Quantum Field Theory and the Standard Model contains the basic building blocks of perturbation theory in quantum field theory, an elementary introduction to gauge theories and the basic classical and quantum features of the electroweak sector of the Standard Model. Some details are given for the theoretical bias concerning the Higgs mass limits, as well as on obscure features of the Standard Model which motivate new physics constructions.
Lattice models and conformal field theories
International Nuclear Information System (INIS)
Saleur, H.
1988-01-01
Theoretical studies concerning the connection between critical physical systems and conformal theories are reviewed. The conformal theory associated with a critical (integrable) lattice model is derived. The derivation of the central charge, critical exponents and torus partition function, using renormalization group arguments, is shown. The quantum group structure in the integrable lattice models and the theory of Virasoro algebra representations are discussed. The relations between off-critical integrable models and conformal theories, in finite geometries, are studied.
Information theoretic resources in quantum theory
Meznaric, Sebastian
Resource identification and quantification is an essential element of both classical and quantum information theory. Entanglement is one of these resources, arising when quantum communication and nonlocal operations are expensive to perform. In the first part of this thesis we quantify the effective entanglement when operations are additionally restricted to account for both fundamental restrictions on operations, such as those arising from superselection rules, and experimental errors arising from imperfections in the apparatus. For an important class of errors we find a linear relationship between the usual and effective higher-dimensional generalizations of concurrence, a measure of entanglement. Following the treatment of effective entanglement, we focus on the related concept of nonlocality in the presence of superselection rules (SSR). Here we propose a scheme that may be used to activate nongenuinely multipartite nonlocality, in that a single copy of a state is not multipartite nonlocal, while two or more copies exhibit nongenuinely multipartite nonlocality. The states used exhibit the more powerful genuinely multipartite nonlocality when SSR are not enforced, but not when they are, raising the question of what is needed for genuinely multipartite nonlocality. We show that whenever the number of particles is insufficient, the degrading of genuinely multipartite to nongenuinely multipartite nonlocality is necessary. While in the first few chapters we focus our attention on understanding the resources present in quantum states, in the final part we turn the picture around and instead treat operations themselves as a resource. We provide our observers with free access to classical operations, i.e. those that cannot detect or generate quantum coherence. We show that the operation of interest can then be used to either generate or detect quantum coherence if and only if it violates a particular commutation relation. Using the relative entropy, the…
Effective field theory and the quark model
International Nuclear Information System (INIS)
Durand, Loyal; Ha, Phuoc; Jaczko, Gregory
2001-01-01
We analyze the connections between the quark model (QM) and the description of hadrons in the low-momentum limit of heavy-baryon effective field theory in QCD. By using a three-flavor-index representation for the effective baryon fields, we show that the 'nonrelativistic' constituent QM for baryon masses and moments is completely equivalent through O(m_s) to a parametrization of the relativistic field theory in a general spin-flavor basis. The flavor and spin variables can be identified with those of effective valence quarks. Conversely, the spin-flavor description clarifies the structure and dynamical interpretation of the chiral expansion in effective field theory, and provides a direct connection between the field theory and the semirelativistic models for hadrons used in successful dynamical calculations. This allows dynamical information to be incorporated directly into the chiral expansion. We find, for example, that the striking success of the additive QM for baryon magnetic moments is a consequence of the relative smallness of the non-additive spin-dependent corrections
Using theories of behaviour change to inform interventions for addictive behaviours.
Webb, Thomas L; Sniehotta, Falko F; Michie, Susan
2010-11-01
This paper reviews a set of theories of behaviour change that are used outside the field of addiction and considers their relevance for this field. Ten theories are reviewed in terms of (i) the main tenets of each theory, (ii) the implications of the theory for promoting change in addictive behaviours and (iii) studies in the field of addiction that have used the theory. An augmented feedback loop model based on Control Theory is used to organize the theories and to show how different interventions might achieve behaviour change. Briefly, each theory provided the following recommendations for intervention: Control Theory: prompt behavioural monitoring, Goal-Setting Theory: set specific and challenging goals, Model of Action Phases: form 'implementation intentions', Strength Model of Self-Control: bolster self-control resources, Social Cognition Models (Protection Motivation Theory, Theory of Planned Behaviour, Health Belief Model): modify relevant cognitions, Elaboration Likelihood Model: consider targets' motivation and ability to process information, Prototype Willingness Model: change perceptions of the prototypical person who engages in behaviour and Social Cognitive Theory: modify self-efficacy. There are a range of theories in the field of behaviour change that can be applied usefully to addiction, each one pointing to a different set of modifiable determinants and/or behaviour change techniques. Studies reporting interventions should describe theoretical basis, behaviour change techniques and mode of delivery accurately so that effective interventions can be understood and replicated. © 2010 The Authors. Journal compilation © 2010 Society for the Study of Addiction.
Directory of Open Access Journals (Sweden)
Abe D Hofman
We propose and test three statistical models for the analysis of children's responses to the balance scale task, a seminal task to study proportional reasoning. We use a latent class modelling approach to formulate a rule-based latent class model (RB LCM) following from a rule-based perspective on proportional reasoning, and a new statistical model, the Weighted Sum Model, following from an information-integration approach. Moreover, a hybrid LCM using item covariates is proposed, combining aspects of both a rule-based and an information-integration perspective. These models are applied to two different datasets: a standard paper-and-pencil test dataset (N = 779) and a dataset collected within an online learning environment that included direct feedback, time pressure, and a reward system (N = 808). For the paper-and-pencil dataset the RB LCM resulted in the best fit, whereas for the online dataset the hybrid LCM provided the best fit. The standard paper-and-pencil dataset yielded more evidence for distinct solution rules than the online dataset, in which quantitative item characteristics are more prominent in determining responses. These results shed new light on the discussion of sequential rule-based and information-integration perspectives on cognitive development.
Halo modelling in chameleon theories
Energy Technology Data Exchange (ETDEWEB)
Lombriser, Lucas; Koyama, Kazuya [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Burnaby Road, Portsmouth, PO1 3FX (United Kingdom); Li, Baojiu, E-mail: lucas.lombriser@port.ac.uk, E-mail: kazuya.koyama@port.ac.uk, E-mail: baojiu.li@durham.ac.uk [Institute for Computational Cosmology, Ogden Centre for Fundamental Physics, Department of Physics, University of Durham, Science Laboratories, South Road, Durham, DH1 3LE (United Kingdom)
2014-03-01
We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations.
Stochastic models: theory and simulation.
Energy Technology Data Exchange (ETDEWEB)
Field, Richard V., Jr.
2008-03-01
Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
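A toy version of the sampling task the report describes is drawing correlated samples of a stationary Gaussian process by factorizing its covariance matrix. This sketch is our illustration, not code from the report; the squared-exponential covariance and the jitter value are assumptions.

```python
import numpy as np

def gaussian_process_samples(t, corr_length, n_samples, rng=None):
    """Draw samples of a zero-mean stationary Gaussian process with a
    squared-exponential covariance, via Cholesky factorization."""
    rng = np.random.default_rng(rng)
    dt = t[:, None] - t[None, :]
    cov = np.exp(-0.5 * (dt / corr_length) ** 2)
    # Small diagonal jitter keeps the matrix numerically positive definite
    L = np.linalg.cholesky(cov + 1e-8 * np.eye(len(t)))
    # Each column is one independent sample path on the grid t
    return L @ rng.standard_normal((len(t), n_samples))
```

Such samples can then serve as inputs or boundary conditions to a deterministic simulation code, as the report suggests.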
Entanglement dynamics in quantum information theory
Energy Technology Data Exchange (ETDEWEB)
Cubitt, T.S.
2007-03-29
This thesis contributes to the theory of entanglement dynamics, that is, the behaviour of entanglement in systems that are evolving with time. Progressively more complex multipartite systems are considered, starting with low-dimensional tripartite systems, whose entanglement dynamics can nonetheless display surprising properties, progressing through larger networks of interacting particles, and finishing with infinitely large lattice models. Firstly, what is perhaps the most basic question in entanglement dynamics is considered: what resources are necessary in order to create entanglement between distant particles? The answer is surprising: sending separable states between the parties is sufficient; entanglement can be created without it being carried by a ''messenger'' particle. The analogous result also holds in the continuous-time case: two particles interacting indirectly via a common ancilla particle can be entangled without the ancilla ever itself becoming entangled. The latter result appears to discount any notion of entanglement flow. However, for pure states, this intuitive idea can be recovered, and even made quantitative. A ''bottleneck'' inequality is derived that relates the entanglement rate of the end particles in a tripartite chain to the entanglement of the middle one. In particular, no entanglement can be created if the middle particle is not entangled. However, although this result can be applied to general interaction networks, it does not capture the full entanglement dynamics of these more complex systems. This is remedied by the derivation of entanglement rate equations, loosely analogous to the rate equations describing a chemical reaction. A complete set of rate equations for a system reflects the full structure of its interaction network, and can be used to prove a lower bound on the scaling with chain length of the time required to entangle the ends of a chain. Finally, in contrast with these more
Entanglement dynamics in quantum information theory
International Nuclear Information System (INIS)
Cubitt, T.S.
2007-01-01
This thesis contributes to the theory of entanglement dynamics, that is, the behaviour of entanglement in systems that are evolving with time. Progressively more complex multipartite systems are considered, starting with low-dimensional tripartite systems, whose entanglement dynamics can nonetheless display surprising properties, progressing through larger networks of interacting particles, and finishing with infinitely large lattice models. Firstly, what is perhaps the most basic question in entanglement dynamics is considered: what resources are necessary in order to create entanglement between distant particles? The answer is surprising: sending separable states between the parties is sufficient; entanglement can be created without it being carried by a ''messenger'' particle. The analogous result also holds in the continuous-time case: two particles interacting indirectly via a common ancilla particle can be entangled without the ancilla ever itself becoming entangled. The latter result appears to discount any notion of entanglement flow. However, for pure states, this intuitive idea can be recovered, and even made quantitative. A ''bottleneck'' inequality is derived that relates the entanglement rate of the end particles in a tripartite chain to the entanglement of the middle one. In particular, no entanglement can be created if the middle particle is not entangled. However, although this result can be applied to general interaction networks, it does not capture the full entanglement dynamics of these more complex systems. This is remedied by the derivation of entanglement rate equations, loosely analogous to the rate equations describing a chemical reaction. A complete set of rate equations for a system reflects the full structure of its interaction network, and can be used to prove a lower bound on the scaling with chain length of the time required to entangle the ends of a chain. Finally, in contrast with these more abstract results, the entanglement and
Vasconcelos, A.C.; Sen, B.A.; Rosa, A.; Ellis, D.
2012-01-01
This paper explores elaborations of Grounded Theory in relation to Arenas/Social Worlds Theory. The notions of arenas and social worlds were present in early applications of Grounded Theory but have not been as widely used or recognised as the general Grounded Theory approach, particularly in the information studies field. The studies discussed here are therefore very unusual in information research. The empirical contexts of these studies are those of (1) the role of discourse in the organisat...
Applying Information Processing Theory to Supervision: An Initial Exploration
Tangen, Jodi L.; Borders, L. DiAnne
2017-01-01
Although clinical supervision is an educational endeavor (Borders & Brown, [Borders, L. D., 2005]), many scholars neglect theories of learning in working with supervisees. The authors describe 1 learning theory--information processing theory (Atkinson & Shiffrin, 1968, 1971; Schunk, 2016)--and the ways its associated interventions may…
Quiver gauge theories and integrable lattice models
International Nuclear Information System (INIS)
Yagi, Junya
2015-01-01
We discuss connections between certain classes of supersymmetric quiver gauge theories and integrable lattice models from the point of view of topological quantum field theories (TQFTs). The relevant classes include 4d N=1 theories known as brane box and brane tiling models, 3d N=2 and 2d N=(2,2) theories obtained from them by compactification, and 2d N=(0,2) theories closely related to these theories. We argue that their supersymmetric indices carry structures of TQFTs equipped with line operators, and as a consequence, are equal to the partition functions of lattice models. The integrability of these models follows from the existence of an extra dimension in the TQFTs, which emerges after the theories are embedded in M-theory. The Yang-Baxter equation expresses the invariance of supersymmetric indices under Seiberg duality and its lower-dimensional analogs.
Why hydrological predictions should be evaluated using information theory
Directory of Open Access Journals (Sweden)
S. V. Weijs
2010-12-01
Full Text Available Probabilistic predictions are becoming increasingly popular in hydrology. Equally important are methods to test such predictions, given the topical debate on uncertainty analysis in hydrology. Also in the special case of hydrological forecasting, there is still discussion about which scores to use for their evaluation. In this paper, we propose to use information theory as the central framework to evaluate predictions. From this perspective, we hope to shed some light on what verification scores measure and should measure. We start from the ''divergence score'', a relative entropy measure that was recently found to be an appropriate measure for forecast quality. An interpretation of a decomposition of this measure provides insight into additive relations between climatological uncertainty, correct information, wrong information and remaining uncertainty. When the score is applied to deterministic forecasts, it follows that these increase uncertainty to infinity. In practice, however, deterministic forecasts tend to be judged far more mildly and are widely used. We resolve this paradoxical result by proposing that deterministic forecasts either are implicitly probabilistic or are implicitly evaluated with an underlying decision problem or utility in mind. We further propose that calibration of models representing a hydrological system should be based on information-theoretical scores, because this allows extracting all information from the observations and avoids learning from information that is not there. Calibration based on maximizing utility for society trains an implicit decision model rather than the forecasting system itself. This inevitably results in a loss or distortion of information in the data and more risk of overfitting, possibly leading to less valuable and informative forecasts. We also show this in an example. The final conclusion is that models should preferably be explicitly probabilistic and calibrated to maximize the
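For a binary event, the divergence score is simply the Kullback-Leibler divergence from the 0/1 observation to the forecast probability, which is -log2 of the probability assigned to what actually happened. A minimal sketch (our illustration, not the authors' code) makes the paper's point about deterministic forecasts explicit: a confident wrong forecast scores infinitely badly.

```python
import numpy as np

def divergence_score(obs, fcst):
    """Mean KL divergence from observed 0/1 outcomes to forecast
    probabilities of the event, in bits per event."""
    obs = np.asarray(obs, dtype=float)
    fcst = np.asarray(fcst, dtype=float)
    with np.errstate(divide="ignore"):
        per_event = np.where(obs == 1.0, -np.log2(fcst), -np.log2(1.0 - fcst))
    return float(np.mean(per_event))

# A sharp, mostly correct probabilistic forecast scores close to zero ...
print(divergence_score([1, 0, 1], [0.9, 0.1, 0.8]))
# ... while a deterministic forecast that is ever wrong scores infinity
print(divergence_score([1], [0.0]))
```

The infinite penalty is exactly the paradox the paper resolves by treating deterministic forecasts as implicitly probabilistic.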
Role of information theoretic uncertainty relations in quantum theory
Energy Technology Data Exchange (ETDEWEB)
Jizba, Petr, E-mail: p.jizba@fjfi.cvut.cz [FNSPE, Czech Technical University in Prague, Břehová 7, 115 19 Praha 1 (Czech Republic); ITP, Freie Universität Berlin, Arnimallee 14, D-14195 Berlin (Germany); Dunningham, Jacob A., E-mail: J.Dunningham@sussex.ac.uk [Department of Physics and Astronomy, University of Sussex, Falmer, Brighton, BN1 9QH (United Kingdom); Joo, Jaewoo, E-mail: j.joo@surrey.ac.uk [Advanced Technology Institute and Department of Physics, University of Surrey, Guildford, GU2 7XH (United Kingdom)
2015-04-15
Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.
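The two-level illustration is easy to reproduce numerically. The sketch below (our construction, not the authors' model) checks the Maassen-Uffink entropic uncertainty relation H(p) + H(q) >= 1 bit for complementary qubit measurements, together with the monotonicity of the Rényi entropy in its order.

```python
import numpy as np

def shannon(p):
    """Shannon entropy in bits."""
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

def renyi(p, alpha):
    """Discrete Rényi entropy of order alpha (alpha != 1), in bits."""
    return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))

# Complementary qubit measurements: the Z basis and the X (Hadamard) basis
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
for theta in np.linspace(0.0, np.pi, 60):
    psi = np.array([np.cos(theta), np.sin(theta)])  # real qubit state
    p = np.abs(psi) ** 2        # Z-basis outcome probabilities
    q = np.abs(H @ psi) ** 2    # X-basis outcome probabilities
    # Maassen-Uffink bound for mutually unbiased bases in d = 2:
    # H(p) + H(q) >= -2 log2 (1/sqrt(2)) = 1 bit
    assert shannon(p) + shannon(q) >= 1.0 - 1e-9
    # Rényi entropy is non-increasing in its order: H_2 <= H_1 (Shannon)
    assert renyi(p, 2.0) <= shannon(p) + 1e-9
```

Equality in the bound is reached for basis states (theta = 0), where the Z outcome is certain and the X outcome is maximally uncertain.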
Role of information theoretic uncertainty relations in quantum theory
International Nuclear Information System (INIS)
Jizba, Petr; Dunningham, Jacob A.; Joo, Jaewoo
2015-01-01
Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed
USING INFORMATION THEORY TO DEFINE A SUSTAINABILITY INDEX
Information theory has many applications in Ecology and Environmental science, such as a biodiversity indicator, as a measure of evolution, a measure of distance from thermodynamic equilibrium, and as a measure of system organization. Fisher Information, in particular, provides a...
Optimizing Sparse Representations of Kinetic Distributions via Information Theory
2017-07-31
Robert Martin and Daniel Eckhardt, Air Force Research Laboratory (AFMC), AFRL/RQRS, 1 Ara Drive, Edwards AFB, CA 93524-7013.
Epistemology as Information Theory: From Leibniz to Omega
Chaitin, G. J.
2005-01-01
In 1686 in his Discours de Metaphysique, Leibniz points out that if an arbitrarily complex theory is permitted then the notion of "theory" becomes vacuous because there is always a theory. This idea is developed in the modern theory of algorithmic information, which deals with the size of computer programs and provides a new view of Gödel's work on incompleteness and Turing's work on uncomputability. Of particular interest is the halting probability Omega, whose bits are irreducible, i.e., ma...
ASSESSING INDIVIDUAL PERFORMANCE ON INFORMATION TECHNOLOGY ADOPTION: A NEW MODEL
Diah Hari Suryaningrum
2012-01-01
This paper aims to propose a new model for assessing individual performance on information technology adoption. The new model was derived from two different theories: the decomposed theory of planned behavior and task-technology fit theory. Although many researchers have tried to expand these theories, some of their efforts may lack sound theoretical assumptions. To overcome this problem and enhance the coherence of the integration, I used a theory from social scien...
Turel, Ofir; Bechara, Antoine
2016-01-01
This study examines a behavioral tripartite model developed in the field of addiction, and applies it here to understanding general and impulsive information technology use. It suggests that technology use is driven by two information-processing brain systems: reflective and impulsive, and that their effects on use are modulated by interoceptive awareness processes. The resultant reflective-impulsive-interoceptive awareness model is tested in two behavioral studies. Both studies employ SEM techniques to time-lagged self-report data from n1 = 300 and n2 = 369 social networking site users. Study 1 demonstrated that temptations augment the effect of habit on technology use, and reduce the effect of satisfaction on use. Study 2 showed that temptations strengthen the effect of habit on impulsive technology use, and weaken the effect of behavioral expectations on impulsive technology use. Hence, the results consistently support the notion that information technology users' behaviors are influenced by reflective and impulsive information processing systems; and that the equilibrium of these systems is determined, at least in part, by one's temptations. These results can serve as a basis for understanding the etiology of modern day addictions.
Rudolf Ahlswede’s lectures on information theory
Althöfer, Ingo; Deppe, Christian; Tamm, Ulrich
Volume 1 : The volume “Storing and Transmitting Data” is based on Rudolf Ahlswede's introductory course on "Information Theory I" and presents an introduction to Shannon Theory. Readers, familiar or unfamiliar with the technical intricacies of Information Theory, will benefit considerably from working through the book; especially Chapter VI with its lively comments and uncensored insider views from the world of science and research offers informative and revealing insights. This is the first of several volumes that will serve as a collected research documentation of Rudolf Ahlswede’s lectures on information theory. Each volume includes comments from an invited well-known expert. Holger Boche contributed his insights in the supplement of the present volume. Classical information processing concerns the main tasks of gaining knowledge and of storing, transmitting and hiding data. The first task is the prime goal of Statistics. For the next two, Shannon presented an impressive mathematical theory called Informat...
New Pathways between Group Theory and Model Theory
Fuchs, László; Goldsmith, Brendan; Strüngmann, Lutz
2017-01-01
This volume focuses on group theory and model theory with a particular emphasis on the interplay of the two areas. The survey papers provide an overview of the developments across group, module, and model theory while the research papers present the most recent study in those same areas. With introductory sections that make the topics easily accessible to students, the papers in this volume will appeal to beginning graduate students and experienced researchers alike. As a whole, this book offers a cross-section view of the areas in group, module, and model theory, covering topics such as DP-minimal groups, Abelian groups, countable 1-transitive trees, and module approximations. The papers in this book are the proceedings of the conference “New Pathways between Group Theory and Model Theory,” which took place February 1-4, 2016, in Mülheim an der Ruhr, Germany, in honor of the editors’ colleague Rüdiger Göbel. This publication is dedicated to Professor Göbel, who passed away in 2014. He was one of th...
Quantum theory from first principles an informational approach
D'Ariano, Giacomo Mauro; Perinotti, Paolo
2017-01-01
Quantum theory is the soul of theoretical physics. It is not just a theory of specific physical systems, but rather a new framework with universal applicability. This book shows how we can reconstruct the theory from six information-theoretical principles, by rebuilding the quantum rules from the bottom up. Step by step, the reader will learn how to master the counterintuitive aspects of the quantum world, and how to efficiently reconstruct quantum information protocols from first principles. Using intuitive graphical notation to represent equations, and with shorter and more efficient derivations, the theory can be understood and assimilated with exceptional ease. Offering a radically new perspective on the field, the book contains an efficient course of quantum theory and quantum information for undergraduates. The book is aimed at researchers, professionals, and students in physics, computer science and philosophy, as well as the curious outsider seeking a deeper understanding of the theory.
Could information theory provide an ecological theory of sensory processing?
Atick, Joseph J
2011-01-01
The sensory pathways of animals are well adapted to processing a special class of signals, namely stimuli from the animal's environment. An important fact about natural stimuli is that they are typically very redundant and hence the sampled representation of these signals formed by the array of sensory cells is inefficient. One could argue for some animals and pathways, as we do in this review, that efficiency of information representation in the nervous system has several evolutionary advantages. Consequently, one might expect that much of the processing in the early levels of these sensory pathways could be dedicated towards recoding incoming signals into a more efficient form. In this review, we explore the principle of efficiency of information representation as a design principle for sensory processing. We give a preliminary discussion on how this principle could be applied in general to predict neural processing and then discuss concretely some neural systems where it recently has been shown to be successful. In particular, we examine the fly's LMC coding strategy and the mammalian retinal coding in the spatial, temporal and chromatic domains.
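The redundancy-reduction principle reviewed here can be sketched in a few lines. Below, each simulated "sensor" sees a mixture of a few common sources, so its inputs are highly redundant, and a whitening transform recodes them into decorrelated, equal-variance channels. The example is our illustration of the principle, not a model from the review.

```python
import numpy as np

def whiten(x):
    """Recode redundant signals into decorrelated, unit-variance channels
    by rotating into the covariance eigenbasis and equalizing variances."""
    x = x - x.mean(axis=0)
    cov = x.T @ x / len(x)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Assumes the covariance is well conditioned (true for this toy data)
    return (x @ eigvecs) / np.sqrt(eigvals)

rng = np.random.default_rng(1)
sources = rng.standard_normal((20000, 3))
mixing = rng.standard_normal((3, 3)) + 2.0 * np.eye(3)  # overlapping receptive fields
sensors = sources @ mixing          # redundant 'sensory' inputs
recoded = whiten(sensors)           # efficient, decorrelated representation
```

After whitening, the sample covariance of the recoded channels is the identity: no channel carries information already present in another, which is the sense in which the representation is efficient.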
A Mathematical Theory of System Information Flow
2016-06-27
i.i.d. is usually quite involved. There are numerous experiments, often using photons, to test Bell’s Inequality recorded in the literature, but the...classical setting. Peter focused on non-locality as an alternative theory and experiments using the CHSH inequality, and devised a statistical procedure...
New approaches in mathematical biology: Information theory and molecular machines
International Nuclear Information System (INIS)
Schneider, T.
1995-01-01
My research uses classical information theory to study genetic systems. Information theory was founded by Claude Shannon in the 1940s and has had an enormous impact on communications engineering and computer sciences. Shannon found a way to measure information. This measure can be used to precisely characterize the sequence conservation at nucleic-acid binding sites. The resulting methods, by completely replacing the use of ''consensus sequences'', provide better models for molecular biologists. An excess of conservation led us to do experimental work on bacteriophage T7 promoters and the F plasmid IncD repeats. The wonderful fidelity of telephone communications and compact disk (CD) music can be traced directly to Shannon's channel capacity theorem. When rederived for molecular biology, this theorem explains the surprising precision of many molecular events. Through connections with the Second Law of Thermodynamics and Maxwell's Demon, this approach also has implications for the development of technology at the molecular level. Discussions of these topics are held on the internet news group bionet.info-theo. (author). (Abstract only)
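The conservation measure behind this approach is simple to compute: at each position of a set of aligned DNA binding sites, the information content is 2 bits minus the Shannon entropy of the base frequencies. The sketch below omits the small-sample correction used in the actual method, and the example sites are hypothetical, not data from the paper.

```python
import numpy as np

def information_content(sites):
    """Per-position information (bits) of aligned DNA binding sites:
    R_i = 2 - H_i, where H_i is the Shannon entropy of the base
    frequencies at position i (small-sample correction omitted)."""
    sites = [s.upper() for s in sites]
    info = []
    for i in range(len(sites[0])):
        counts = np.array([sum(s[i] == b for s in sites) for b in "ACGT"], float)
        p = counts / counts.sum()
        p = p[p > 0]
        h = -np.sum(p * np.log2(p))       # entropy at this position
        info.append(2.0 - h)              # 2 bits is the maximum for DNA
    return info

# Hypothetical aligned sites, loosely resembling a -10 promoter element
print(information_content(["TATAAT", "TATGAT", "TACAAT", "TATAAT"]))
```

A fully conserved position carries the full 2 bits; a position where all four bases are equally likely carries none, which is how an "excess of conservation" shows up quantitatively.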
Galaxy Alignments: Theory, Modelling & Simulations
Kiessling, Alina; Cacciato, Marcello; Joachimi, Benjamin; Kirk, Donnacha; Kitching, Thomas D.; Leonard, Adrienne; Mandelbaum, Rachel; Schäfer, Björn Malte; Sifón, Cristóbal; Brown, Michael L.; Rassat, Anais
2015-11-01
The shapes of galaxies are not randomly oriented on the sky. During the galaxy formation and evolution process, environment has a strong influence, as tidal gravitational fields in the large-scale structure tend to align nearby galaxies. Additionally, events such as galaxy mergers affect the relative alignments of both the shapes and angular momenta of galaxies throughout their history. These "intrinsic galaxy alignments" are known to exist, but are still poorly understood. This review will offer a pedagogical introduction to the current theories that describe intrinsic galaxy alignments, including the apparent difference in intrinsic alignment between early- and late-type galaxies and the latest efforts to model them analytically. It will then describe the ongoing efforts to simulate intrinsic alignments using both N-body and hydrodynamic simulations. Due to the relative youth of this field, there is still much to be done to understand intrinsic galaxy alignments and this review summarises the current state of the field, providing a solid basis for future work.
Applications of model theory to functional analysis
Iovino, Jose
2014-01-01
During the last two decades, methods that originated within mathematical logic have exhibited powerful applications to Banach space theory, particularly set theory and model theory. This volume constitutes the first self-contained introduction to techniques of model theory in Banach space theory. The area of research has grown rapidly since this monograph's first appearance, but much of this material is still not readily available elsewhere. For instance, this volume offers a unified presentation of Krivine's theorem and the Krivine-Maurey theorem on stable Banach spaces, with emphasis on the
Theory of the Concealed Information Test
Verschuere, B.; Ben-Shakhar, G.; Verschuere, B.; Ben-Shakhar, G.; Meijer, E.
2011-01-01
It is now well established that physiological measures can be validly used to detect concealed information. An important challenge is to elucidate the underlying mechanisms of concealed information detection. We review theoretical approaches that can be broadly classified in two major categories:
Assessment of visual communication by information theory
Huck, Friedrich O.; Fales, Carl L.
1994-01-01
This assessment of visual communication integrates the optical design of the image-gathering device with the digital processing for image coding and restoration. Results show that informationally optimized image gathering ordinarily can be relied upon to maximize the information efficiency of decorrelated data and the visual quality of optimally restored images.
Equity trees and graphs via information theory
Harré, M.; Bossomaier, T.
2010-01-01
We investigate the similarities and differences between two measures of the relationship between equities traded in financial markets. Our measures are the correlation coefficients and the mutual information. In the context of financial markets correlation coefficients are well established whereas mutual information has not previously been as well studied despite its theoretically appealing properties. We show that asset trees which are derived from either the correlation coefficients or the mutual information have a mixture of both similarities and differences at the individual equity level and at the macroscopic level. We then extend our consideration from trees to graphs using the "genus 0" condition recently introduced in order to study the networks of equities.
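The contrast between the two measures is easy to demonstrate: correlation only sees linear dependence, while mutual information also detects nonlinear dependence. The sketch below (our illustration with synthetic "returns", not the authors' data or code) uses a plug-in histogram estimator, and includes the standard metric used to turn correlations into asset-tree distances.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram (plug-in) estimate of the mutual information I(X;Y) in bits."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

def tree_distance(rho):
    """Metric used to build asset trees from correlations: d = sqrt(2(1 - rho))."""
    return np.sqrt(2.0 * (1.0 - rho))

rng = np.random.default_rng(2)
x = rng.standard_normal(50_000)
linear = 0.6 * x + 0.8 * rng.standard_normal(50_000)  # linearly related 'returns'
nonlinear = x ** 2                                    # dependent, but uncorrelated

rho_lin = np.corrcoef(x, linear)[0, 1]
rho_non = np.corrcoef(x, nonlinear)[0, 1]             # near zero
mi_lin = mutual_information(x, linear)
mi_non = mutual_information(x, nonlinear)             # clearly positive
```

Here correlation rates the quadratic pair as nearly independent while mutual information does not, which is precisely the theoretically appealing property the abstract refers to.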
Information Theoretic Characterization of Physical Theories with Projective State Space
Zaopo, Marco
2015-08-01
Probabilistic theories are a natural framework to investigate the foundations of quantum theory and possible alternative or deeper theories. In a generic probabilistic theory, states of a physical system are represented as vectors of outcome probabilities and state spaces are convex cones. In this picture the physics of a given theory is related to the geometric shape of the cone of states. In quantum theory, for instance, the shape of the cone of states corresponds to a projective space over complex numbers. In this paper we investigate geometric constraints on the state space of a generic theory imposed by the following information-theoretic requirements: every state of a system that is not completely mixed is perfectly distinguishable from some other state in a single-shot measurement; the information capacity of physical systems is conserved under making mixtures of states. These assumptions guarantee that a generic physical system satisfies a natural principle asserting that the more a state of the system is mixed, the less information can be stored in the system using that state as a logical value. We show that all theories satisfying the above assumptions are such that the shape of their cones of states is that of a projective space over a generic field of numbers. Remarkably, these theories constitute generalizations of quantum theory where the superposition principle holds with coefficients pertaining to a generic field of numbers in place of the complex numbers. If the field of numbers is trivial and contains only one element, we obtain classical theory. This result shows that the superposition principle is quite common among probabilistic theories, while its absence gives evidence of either classical theory or an implausible theory.
Theories, Models and Methodology in Writing Research
Rijlaarsdam, Gert; Bergh, van den Huub; Couzijn, Michel
1996-01-01
Theories, Models and Methodology in Writing Research describes the current state of the art in research on written text production. The chapters in the first part offer contributions to the creation of new theories and models for writing processes. The second part examines specific elements of the
Realism and Antirealism in Informational Foundations of Quantum Theory
Directory of Open Access Journals (Sweden)
Tina Bilban
2014-08-01
Full Text Available Zeilinger-Brukner's informational foundations of quantum theory, a theory based on Zeilinger's foundational principle for quantum mechanics that an elementary system carries one bit of information, explains seemingly unintuitive quantum behavior within a simple theoretical framework. It is based on the notion that no distinction between reality and information can be made, so the two are the same. As critics of the informational foundations of quantum theory show, this antirealistic move traps the theory in tautology: information only refers to itself, while the relationships outside the information, with the help of which the nature of information could be defined, are lost, and the questions "Whose information? Information about what?" cannot be answered. The critics' solution is a return to realism, in which the observer's effects on the information are neglected. We show that the radical antirealism of the informational foundations of quantum theory is not necessary and that a return to realism is not the only way forward. A comprehensive approach that exceeds mere realism and antirealism is also possible: we can consider both sources of the constraints on the information, those coming from the observer and those coming from the observed system/nature/reality. The information is always the observer's information about the observed. Such a comprehensive philosophical approach can still support the theoretical framework of informational foundations of quantum theory: if we take it that one bit is the smallest amount of information in the form of which the observed reality can be grasped by the observer, we can say that an elementary system (grasped and defined as such by the observer) correlates to one bit of information. Our approach thus explains all the features of quantum behavior explained by informational foundations of quantum theory: the wave function and its collapse, entanglement, complementarity and quantum randomness. However, it does
Hiemstra, Djoerd; Göker, Ayse; Davies, John
2009-01-01
Many applications that handle information on the internet would be completely inadequate without the support of information retrieval technology. How would we find information on the world wide web if there were no web search engines? How would we manage our email without spam filtering? Much of the
Study on geo-information modelling
Czech Academy of Sciences Publication Activity Database
Klimešová, Dana
2006-01-01
Vol. 5, No. 5 (2006), pp. 1108-1113. ISSN 1109-2777 Institutional research plan: CEZ:AV0Z10750506 Keywords : control GIS * geo-information modelling * uncertainty * spatial temporal approach Web Services Subject RIV: BC - Control Systems Theory
Affect Theory and Autoethnography in Ordinary Information Systems
DEFF Research Database (Denmark)
Bødker, Mads; Chamberlain, Alan
2016-01-01
This paper uses philosophical theories of affect as a lens for exploring autoethnographic renderings of everyday experience with information technology. Affect theories, in the paper, denote a broad trend in post-humanistic philosophy that explores sensation and feeling as emergent and relational...
Response to Patrick Love's "Informal Theory": A Rejoinder
Evans, Nancy J.; Guido, Florence M.
2012-01-01
This rejoinder to Patrick Love's article, "Informal Theory: The Ignored Link in Theory-to-Practice," which appears earlier in this issue of the "Journal of College Student Development", was written at the invitation of the Editor. In the critique, we point out the weaknesses of many of Love's arguments and propositions. We provide an alternative…
Quantum information theory: a technological challenge; Computacion Cuantica: un reto tecnologico
Energy Technology Data Exchange (ETDEWEB)
Calixto, M.
2001-07-01
The new Quantum Information Theory augurs powerful machines that obey the entangled logic of the subatomic world. Parallelism, entanglement, teleportation, no-cloning and quantum cryptography are typical peculiarities of this novel way of understanding computation. (Author) 24 refs.
Information Processing Theories and the Education of the Gifted.
Rawl, Ruth K.; O'Tuel, Frances S.
1983-01-01
The basic assumptions of information processing theories in cognitive psychology are reviewed, and the application of this approach to problem solving in gifted education is considered. Specific implications are cited on problem selection and instruction giving. (CL)
Information theory and its application to optical communication
Willems, F.M.J.
2017-01-01
The lecture focusses on the foundations of communication which were developed within the field of information theory. Enumerative shaping techniques and the so-called squareroot transform will be discussed in detail.
An introduction to single-user information theory
Alajaji, Fady
2018-01-01
This book presents a succinct and mathematically rigorous treatment of the main pillars of Shannon’s information theory, discussing the fundamental concepts and indispensable results of Shannon’s mathematical theory of communications. It includes five meticulously written core chapters (with accompanying problems), emphasizing the key topics of information measures; lossless and lossy data compression; channel coding; and joint source-channel coding for single-user (point-to-point) communications systems. It also features two appendices covering necessary background material in real analysis and in probability theory and stochastic processes. The book is ideal for a one-semester foundational course on information theory for senior undergraduate and entry-level graduate students in mathematics, statistics, engineering, and computing and information sciences. A comprehensive instructor’s solutions manual is available.
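The core quantities such a course starts from are easy to compute directly. A minimal sketch (illustrative, not taken from the book) of Shannon entropy in bits, the measure that bounds lossless data compression of a memoryless source:

```python
import math

def shannon_entropy(probs, base=2.0):
    """Shannon entropy H(X) = -sum_x p(x) log p(x), in bits by default."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin needs exactly 1 bit per toss; a biased source is compressible
# below 1 bit per symbol on average.
print(shannon_entropy([0.5, 0.5]))            # 1.0
print(round(shannon_entropy([0.9, 0.1]), 3))  # 0.469
```

With base e the same function returns nats, the unit used in the thermodynamic entries later in this list.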
Spatial data modelling and maximum entropy theory
Czech Academy of Sciences Publication Activity Database
Klimešová, Dana; Ocelíková, E.
2005-01-01
Roč. 51, č. 2 (2005), s. 80-83 ISSN 0139-570X Institutional research plan: CEZ:AV0Z10750506 Keywords : spatial data classification * distribution function * error distribution Subject RIV: BD - Theory of Information
Generalized information theory: aims, results, and open problems
International Nuclear Information System (INIS)
Klir, George J.
2004-01-01
The principal purpose of this paper is to present a comprehensive overview of generalized information theory (GIT): a research program whose objective is to develop a broad treatment of uncertainty-based information, not restricted to classical notions of uncertainty. After a brief overview of classical information theories, a broad framework for formalizing uncertainty and the associated uncertainty-based information of a great spectrum of conceivable types is sketched. The various theories of imprecise probabilities that have already been developed within this framework are then surveyed, focusing primarily on some important unifying principles applying to all these theories. This is followed by introducing two higher levels of the theories of imprecise probabilities: (i) the level of measuring the amount of relevant uncertainty (predictive, retrodictive, prescriptive, diagnostic, etc.) in any situation formalizable in each given theory, and (ii) the level of some methodological principles of uncertainty, which are contingent upon the capability to measure uncertainty and the associated uncertainty-based information. Various issues regarding both the measurement of uncertainty and the uncertainty principles are discussed. Again, the focus is on unifying principles applicable to all the theories. Finally, the current status of GIT is assessed and future research in the area is discussed
Advancing Theory? Landscape Archaeology and Geographical Information Systems
Directory of Open Access Journals (Sweden)
Di Hu
2012-05-01
This paper will focus on how Geographical Information Systems (GIS) have been applied in Landscape Archaeology from the late 1980s to the present. GIS, a tool for organising and analysing spatial information, has exploded in popularity, but we still lack a systematic overview of how it has contributed to archaeological theory, specifically Landscape Archaeology. This paper will examine whether and how GIS has advanced archaeological theory through a historical review of its application in archaeology.
BOOK REVIEW: Theory of Neural Information Processing Systems
Galla, Tobias
2006-04-01
It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10¹¹ neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics, as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kühn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, yet the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard to find in the
Prosody's Contribution to Fluency: An Examination of the Theory of Automatic Information Processing
Schrauben, Julie E.
2010-01-01
LaBerge and Samuels' (1974) theory of automatic information processing in reading offers a model that explains how and where the processing of information occurs and the degree to which processing of information occurs. These processes are dependent upon two criteria: accurate word decoding and automatic word recognition. However, LaBerge and…
Brain activity and cognition: a connection from thermodynamics and information theory.
Collell, Guillem; Fauquet, Jordi
2015-01-01
The connection between brain and mind is an important scientific and philosophical question that we are still far from completely understanding. A crucial point to our work is noticing that thermodynamics provides a convenient framework to model brain activity, whereas cognition can be modeled in information-theoretical terms. In fact, several models have been proposed so far from both approaches. A second critical remark is the existence of deep theoretical connections between thermodynamics and information theory. In fact, some well-known authors claim that the laws of thermodynamics are nothing but principles in information theory. Unlike in physics or chemistry, a formalization of the relationship between information and energy is currently lacking in neuroscience. In this paper we propose a framework to connect physical brain and cognitive models by means of the theoretical connections between information theory and thermodynamics. Ultimately, this article aims at providing further insight on the formal relationship between cognition and neural activity.
Applications of quantum information theory to quantum gravity
International Nuclear Information System (INIS)
Smolin, L.
2005-01-01
I describe work by and with Fotini Markopoulou and Olaf Dreyer on the application of quantum information theory to quantum gravity. A particular application to black hole physics is described, which treats the black hole horizon as an open system in interaction with an environment consisting of the degrees of freedom in the bulk spacetime. This allows us to elucidate which quantum states of a general horizon contribute to the entropy of a Schwarzschild black hole. This case serves as an example of how methods from quantum information theory may help to elucidate how the classical limit emerges from a background-independent quantum theory of gravity. (author)
Analyzing complex networks evolution through Information Theory quantifiers
International Nuclear Information System (INIS)
Carpi, Laura C.; Rosso, Osvaldo A.; Saco, Patricia M.; Ravetti, Martin Gomez
2011-01-01
A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region to study the El Niño/Southern Oscillation (ENSO) dynamics. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
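The first quantifier used above, the square root of the Jensen-Shannon divergence, is a bounded metric between probability distributions (for instance, degree distributions of a network at two stages of its evolution). A minimal sketch with invented distributions:

```python
import math

def entropy(p):
    return -sum(x * math.log2(x) for x in p if x > 0)

def jensen_shannon_distance(p, q):
    """sqrt of the Jensen-Shannon divergence (base-2 logs): a metric in [0, 1]."""
    m = [(a + b) / 2 for a, b in zip(p, q)]
    jsd = entropy(m) - (entropy(p) + entropy(q)) / 2
    return math.sqrt(max(jsd, 0.0))  # clamp tiny negative rounding error

p = [0.1, 0.4, 0.5]  # hypothetical degree distribution, network state 1
q = [0.3, 0.3, 0.4]  # hypothetical degree distribution, network state 2
assert jensen_shannon_distance(p, p) == 0.0                            # identity
assert jensen_shannon_distance(p, q) == jensen_shannon_distance(q, p)  # symmetry
print(0.0 < jensen_shannon_distance(p, q) < 1.0)  # True
```

Unlike the KL divergence, this distance is symmetric and finite even when the two distributions have different supports, which is what makes it usable for comparing network states directly.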
Multidimensional Models of Information Need
Yun-jie (Calvin) Xu; Kai Huang (Joseph) Tan
2009-01-01
User studies in information science have recognised relevance as a multidimensional construct. An implication of multidimensional relevance is that a user's information need should be modeled by multiple data structures to represent different relevance dimensions. While the extant literature has attempted to model multiple dimensions of a user's information need, the fundamental assumption that a multidimensional model is better than a uni-dimensional model has not been addressed. This study ...
The g-theorem and quantum information theory
Energy Technology Data Exchange (ETDEWEB)
Casini, Horacio; Landea, Ignacio Salazar; Torroba, Gonzalo [Centro Atómico Bariloche and CONICET,S.C. de Bariloche, Río Negro, R8402AGP (Argentina)
2016-10-25
We study boundary renormalization group flows between boundary conformal field theories in 1+1 dimensions using methods of quantum information theory. We define an entropic g-function for theories with impurities in terms of the relative entanglement entropy, and we prove that this g-function decreases along boundary renormalization group flows. This entropic g-theorem is valid at zero temperature, and is independent from the g-theorem based on the thermal partition function. We also discuss the mutual information in boundary RG flows, and how it encodes the correlations between the impurity and bulk degrees of freedom. Our results provide a quantum-information understanding of (boundary) RG flow as increase of distinguishability between the UV fixed point and the theory along the RG flow.
Information-Theoretic Perspectives on Geophysical Models
Nearing, Grey
2016-04-01
To test any hypothesis about any dynamic system, it is necessary to build a model that places that hypothesis into the context of everything else that we know about the system: initial and boundary conditions and interactions between various governing processes (Hempel and Oppenheim, 1948, Cartwright, 1983). No hypothesis can be tested in isolation, and no hypothesis can be tested without a model (for a geoscience-related discussion see Clark et al., 2011). Science is (currently) fundamentally reductionist in the sense that we seek some small set of governing principles that can explain all phenomena in the universe, and such laws are ontological in the sense that they describe the object under investigation (Davies, 1990 gives several competing perspectives on this claim). However, since we cannot build perfect models of complex systems, any model that does not also contain an epistemological component (i.e., a statement, like a probability distribution, that refers directly to the quality of the information from the model) is falsified immediately (in the sense of Popper, 2002) given only a small number of observations. Models necessarily contain both ontological and epistemological components, and what this means is that the purpose of any robust scientific method is to measure the amount and quality of information provided by models. I believe that any viable philosophy of science must be reducible to this statement. The first step toward a unified theory of scientific models (and therefore a complete philosophy of science) is a quantitative language that applies to both ontological and epistemological questions. Information theory is one such language: Cox's (1946) theorem (see Van Horn, 2003) tells us that probability theory is the (only) calculus that is consistent with Classical Logic (Jaynes, 2003; chapter 1), and information theory is simply the integration of convex transforms of probability ratios (integration reduces density functions to scalar
Goodness-of-Fit Assessment of Item Response Theory Models
Maydeu-Olivares, Alberto
2013-01-01
The article provides an overview of goodness-of-fit assessment methods for item response theory (IRT) models. It is now possible to obtain accurate "p"-values of the overall fit of the model if bivariate information statistics are used. Several alternative approaches are described. As the validity of inferences drawn on the fitted model…
Mohammad-Djafari, Ali
2015-01-01
The main object of this tutorial article is first to review the main inference tools using Bayesian approach, Entropy, Information theory and their corresponding geometries. This review is focused mainly on the ways these tools have been used in data, signal and image processing. After a short introduction of the different quantities related to the Bayes rule, the entropy and the Maximum Entropy Principle (MEP), relative entropy and the Kullback-Leibler divergence, Fisher information, we will study their use in different fields of data and signal processing such as: entropy in source separation, Fisher information in model order selection, different Maximum Entropy based methods in time series spectral estimation and finally, general linear inverse problems.
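The relative entropy (Kullback-Leibler divergence) reviewed in the tutorial can be sketched in a few lines, together with a numerical check of its link to the Maximum Entropy Principle: the divergence from the uniform distribution equals the entropy deficit log2(n) - H(p). The distributions below are illustrative only:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(a * math.log2(a / b) for a, b in zip(p, q) if a > 0)

p = [0.7, 0.2, 0.1]
u = [1/3, 1/3, 1/3]  # the maximum-entropy distribution on 3 outcomes
H = -sum(x * math.log2(x) for x in p)

assert kl_divergence(p, p) == 0.0                              # D(p||p) = 0
assert abs(kl_divergence(p, u) - (math.log2(3) - H)) < 1e-9    # D(p||u) = log2(n) - H(p)
```

The second identity is why maximizing entropy subject to constraints is equivalent to minimizing relative entropy to the uniform prior, the form in which the MEP appears in inverse-problem work.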
Crisis in Context Theory: An Ecological Model
Myer, Rick A.; Moore, Holly B.
2006-01-01
This article outlines a theory for understanding the impact of a crisis on individuals and organizations. Crisis in context theory (CCT) is grounded in an ecological model and based on literature in the field of crisis intervention and on personal experiences of the authors. A graphic representation denotes key components and premises of CCT,…
An information theory framework for dynamic functional domain connectivity.
Vergara, Victor M; Miller, Robyn; Calhoun, Vince
2017-06-01
Dynamic functional network connectivity (dFNC) analyzes the time evolution of coherent activity in the brain. In this technique dynamic changes are considered for the whole brain. This paper proposes an information theory framework to measure information flowing among subsets of functional networks called functional domains. Our method aims at estimating the bits of information contained in and shared among domains. The succession of dynamic functional states is estimated at the domain level. Information quantity is based on the probabilities of observing each dynamic state. Mutual information is then obtained from probabilities across domains; we named this value the cross domain mutual information (CDMI). Strong CDMIs were observed in relation to the subcortical domain. Domains related to sensory input, motor control and the cerebellum form another CDMI cluster. Information flow among other domains was seldom found. Other methods of dynamic connectivity focus on whole-brain dFNC matrices; in the current framework, information theory is applied to states estimated from pairs of multi-network functional domains. Identified CDMI clusters point to known information pathways in the basal ganglia and among areas of sensory input, patterns found in static functional connectivity. In contrast, CDMI across brain areas of higher-level cognitive processing follows a different pattern that indicates scarce information sharing. These findings show that employing information theory to formally measure information flow through brain domains reveals additional features of functional connectivity.
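A toy version of the mutual-information step described above (estimating shared bits from paired sequences of discrete dynamic states; the state sequences here are invented for illustration) might look like:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Mutual information I(X;Y) in bits, estimated from paired discrete state sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Invented dynamic-state sequences for two "domains":
a = [0, 0, 1, 1]  # domain 1 states over time
b = [0, 0, 1, 1]  # identical states: shares all of a's information
c = [0, 1, 0, 1]  # statistically independent of a in this sample
print(mutual_information(a, b))  # 1.0  (= H(a), one bit)
print(mutual_information(a, c))  # 0.0
```

In practice such plug-in estimates are biased upward for short sequences, so real analyses of this kind typically need a bias correction or a null distribution from surrogate data.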
Planting contemporary practice theory in the garden of information science
Huizing, A.; Cavanagh, M.
2011-01-01
Introduction. The purpose of this paper is to introduce to information science in a coherent fashion the core premises of contemporary practice theory, and thus to engage the information research community in further debate and discussion. Method. Contemporary practice-based approaches are
Year 7 Students, Information Literacy, and Transfer: A Grounded Theory
Herring, James E.
2011-01-01
This study examined the views of year 7 students, teacher librarians, and teachers in three state secondary schools in rural New South Wales, Australia, on information literacy and transfer. The aims of the study included the development of a grounded theory in relation to information literacy and transfer in these schools. The study's perspective…
Basile, Kathleen C; Hall, Jeffrey E; Walters, Mikel L
2013-07-01
This study tested resource and feminist-informed theories to explain physical, sexual, psychological, and stalking intimate partner violence (IPV) perpetrated by court-mandated men. Data were obtained from 340 men arrested for physical assault of a partner before their court-ordered treatment. Using path analysis, findings provided partial support for each model. Ineffective arguing and substance-use problems were moderators of resources and perpetration. Dominance mediated early exposures and perpetration in the feminist-informed model. In both models, predictors of stalking were different than those for other types of perpetration. Future studies should replicate this research and determine the utility of combining models.
Information processing theory in the early design stages
DEFF Research Database (Denmark)
Cash, Philip; Kreye, Melanie
2014-01-01
One theory that may be particularly applicable to the early design stages is Information Processing Theory (IPT), as it is linked to the design process with regard to the key concepts considered. IPT states that designers search for information if they perceive uncertainty with regard to the knowledge necessary to solve a design challenge. They then process this information and compare whether the new knowledge they have gained covers the previous knowledge gap. The new knowledge is then shared within the design team to reduce ambiguity with regard to its meaning and to build a shared understanding, reducing perceived uncertainty. In engineering design, uncertainty plays a key role, particularly in the early design stages. Thus, we propose that Information Processing Theory is suitable to describe designer activity in the early design stages.
Route Choice Model Based on Game Theory for Commuters
Directory of Open Access Journals (Sweden)
Licai Yang
2016-06-01
The traffic behaviours of commuters may cause traffic congestion during peak hours. Advanced Traffic Information Systems can provide dynamic information to travellers, but due to a lack of timeliness and comprehensiveness the provided information cannot satisfy travellers' needs. Since the assumptions of the traditional route choice model based on Expected Utility Theory conflict with the actual situation, this paper proposes a route choice model based on Game Theory to provide reliable route choices to commuters in actual situations. The proposed model treats the alternative routes as game players and utilizes the precision of predicted information and familiarity with traffic conditions to build a game. The optimal route can be generated by solving the route choice game for a Nash Equilibrium. Simulations and experimental analysis show that the proposed model describes commuters' routine route choice decisions exactly and that the provided route is reliable.
Constraint theory multidimensional mathematical model management
Friedman, George J
2017-01-01
Packed with new material and research, this second edition of George Friedman’s bestselling Constraint Theory remains an invaluable reference for all engineers, mathematicians, and managers concerned with modeling. As in the first edition, this text analyzes the way Constraint Theory employs bipartite graphs and presents the process of locating the “kernel of constraint” trillions of times faster than brute-force approaches, determining model consistency and computational allowability. Unique in its abundance of topological pictures of the material, this book balances left- and right-brain perceptions to provide a thorough explanation of multidimensional mathematical models. Much of the extended material in this new edition also comes from Phan Phan’s PhD dissertation in 2011, titled “Expanding Constraint Theory to Determine Well-Posedness of Large Mathematical Models.” Praise for the first edition: "Dr. George Friedman is indisputably the father of the very powerful methods of constraint theory...
Information Theory - The Bridge Connecting Bounded Rational Game Theory and Statistical Physics
Wolpert, David H.
2005-01-01
A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality of all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. This paper shows that the same information-theoretic mathematical structure, known as Product Distribution (PD) theory, addresses both issues. In this way, PD theory not only provides a principled formulation of bounded rationality and a set of new types of mean field theory in statistical physics; it also shows that those topics are fundamentally one and the same.
Staircase Models from Affine Toda Field Theory
Dorey, P; Dorey, Patrick; Ravanini, Francesco
1993-01-01
We propose a class of purely elastic scattering theories generalising the staircase model of Al. B. Zamolodchikov, based on the affine Toda field theories for simply-laced Lie algebras g=A,D,E at suitable complex values of their coupling constants. Considering their Thermodynamic Bethe Ansatz equations, we give analytic arguments in support of a conjectured renormalisation group flow visiting the neighbourhood of each W_g minimal model in turn.
Lenses on reading an introduction to theories and models
Tracey, Diane H
2017-01-01
Widely adopted as an ideal introduction to the major models of reading, this text guides students to understand and facilitate children's literacy development. Coverage encompasses the full range of theories that have informed reading instruction and research, from classical thinking to cutting-edge cognitive, social learning, physiological, and affective perspectives. Readers learn how theory shapes instructional decision making and how to critically evaluate the assumptions and beliefs that underlie their own teaching. Pedagogical features include framing and discussion questions, learning a
Reconstructing bidimensional scalar field theory models
International Nuclear Information System (INIS)
Flores, Gabriel H.; Svaiter, N.F.
2001-07-01
In this paper we review how to reconstruct scalar field theories in two-dimensional spacetime starting from solvable Schrödinger equations. Three different Schrödinger potentials are analyzed. We obtain two new models starting from the Morse and Scarf II hyperbolic potentials: the U(θ) = θ² ln²(θ²) model and the U(θ) = θ² cos²(ln(θ²)) model, respectively. (author)
Mathematical models of information and stochastic systems
Kornreich, Philipp
2008-01-01
From ancient soothsayers and astrologists to today's pollsters and economists, probability theory has long been used to predict the future on the basis of past and present knowledge. Mathematical Models of Information and Stochastic Systems shows that the amount of knowledge about a system plays an important role in the mathematical models used to foretell the future of the system. It explains how this known quantity of information is used to derive a system's probabilistic properties. After an introduction, the book presents several basic principles that are employed in the remainder of the t
The use of information theory in evolutionary biology.
Adami, Christoph
2012-05-01
Information is a key concept in evolutionary biology. Information stored in a biological organism's genome is used to generate the organism and to maintain and control it. Information is also that which evolves. When a population adapts to a local environment, information about this environment is fixed in a representative genome. However, when an environment changes, information can be lost. At the same time, information is processed by animal brains to survive in complex environments, and the capacity for information processing also evolves. Here, I review applications of information theory to the evolution of proteins and to the evolution of information processing in simulated agents that adapt to perform a complex task.
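A concrete instance of the idea that environmental information is fixed in a genome is the standard per-site information content of an alignment column (maximum entropy minus observed entropy, the "sequence logo" measure). A sketch with hypothetical DNA columns:

```python
import math
from collections import Counter

def per_site_information(column, alphabet_size=4):
    """Information content of one aligned-sequence position, in bits:
    H_max - H_observed."""
    counts = Counter(column)
    n = sum(counts.values())
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return math.log2(alphabet_size) - h

conserved = list("AAAAAAAA")  # fully conserved site: maximally informative
variable = list("ACGTACGT")   # uniformly variable site: carries no information
print(per_site_information(conserved))  # 2.0
print(per_site_information(variable))   # 0.0
```

Summing this quantity over the sites of an alignment gives an estimate of how many bits about the environment selection has written into the sequence, the kind of genome-level measure reviewed in the article.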
Nonequilibrium thermodynamics and information theory: basic concepts and relaxing dynamics
International Nuclear Information System (INIS)
Altaner, Bernhard
2017-01-01
Thermodynamics is based on the notions of energy and entropy. While energy is the elementary quantity governing physical dynamics, entropy is the fundamental concept in information theory. In this work, starting from first principles, we give a detailed didactic account of the relations between energy and entropy and thus physics and information theory. We show that thermodynamic process inequalities, like the second law, are equivalent to the requirement that an effective description for physical dynamics is strongly relaxing. From the perspective of information theory, strongly relaxing dynamics govern the irreversible convergence of a statistical ensemble towards the maximally non-committal probability distribution that is compatible with thermodynamic equilibrium parameters. In particular, Markov processes that converge to a thermodynamic equilibrium state are strongly relaxing. Our framework generalizes previous results to arbitrary open and driven systems, yielding novel thermodynamic bounds for idealized and real processes. (paper)
A course on basic model theory
Sarbadhikari, Haimanti
2017-01-01
This self-contained book is an exposition of the fundamental ideas of model theory. It presents the necessary background from logic, set theory and other topics of mathematics. Only some degree of mathematical maturity and willingness to assimilate ideas from diverse areas are required. The book can be used for both teaching and self-study, ideally over two semesters. It is primarily aimed at graduate students in mathematical logic who want to specialise in model theory. However, the first two chapters constitute a first introduction to the subject and can be covered in a one-semester course for senior undergraduate students in mathematical logic. The book is also suitable for researchers who wish to use model theory in their work.
Gauge theories and integrable lattice models
International Nuclear Information System (INIS)
Witten, E.
1989-01-01
Investigations of new knot polynomials discovered in the last few years have shown them to be intimately connected with soluble models of two dimensional lattice statistical mechanics. In this paper, these results, which in time may illuminate the whole question of why integrable lattice models exist, are reconsidered from the point of view of three dimensional gauge theory. Expectation values of Wilson lines in three dimensional Chern-Simons gauge theories can be computed by evaluating the partition functions of certain lattice models on finite graphs obtained by projecting the Wilson lines to the plane. The models in question - previously considered in both the knot theory and statistical mechanics literature - are IRF models in which the local Boltzmann weights are the matrix elements of braiding matrices in rational conformal field theories. These matrix elements, in turn, can be represented in three dimensional gauge theory in terms of the expectation value of a certain tetrahedral configuration of Wilson lines. This representation makes manifest a surprising symmetry of the braiding matrix elements in conformal field theory. (orig.)
Spacecraft TT&C and information transmission theory and technologies
Liu, Jiaxing
2015-01-01
Spacecraft TT&C and Information Transmission Theory and Technologies introduces the basic theory of spacecraft TT&C (telemetry, tracking and command) and information transmission. Combining TT&C and information transmission, the book presents several technologies for continuous wave radar, including measurements for range, range rate and angle, analog and digital information transmission, telecommand, telemetry, remote sensing and spread spectrum TT&C. For special problems occurring in the channels for TT&C and information transmission, the book presents radio propagation features and their impact on orbit measurement accuracy, the effects caused by rain attenuation, atmospheric attenuation and multi-path effects, and polarization composition technology. This book can benefit researchers and engineers in the field of spacecraft TT&C and communication systems. Liu Jiaxing is a professor at The 10th Institute of China Electronics Technology Group Corporation.
International Nuclear Information System (INIS)
Kim, Jong Hyun; Seong, Poong Hyun
2002-01-01
This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task under input information overload. We first develop an information processing model with multiple stages that contains information flow. The uncertainty of the information is then quantified using Conant's model, an information-theoretic approach. We also investigate the applicability of this approach to quantifying the information reduction of operators under input information overload.
Probability and information theory, with applications to radar
Woodward, P M; Higinbotham, W
1964-01-01
Electronics and Instrumentation, Second Edition, Volume 3: Probability and Information Theory with Applications to Radar provides information pertinent to developments in research carried out in electronics and applied physics. This book presents the established mathematical techniques that provide the code in which so much of the mathematical theory of electronics and radar is expressed. Organized into eight chapters, this edition begins with an overview of the geometry of probability distributions in which moments play a significant role. This text then examines the mathematical methods in
Conceptual models of information processing
Stewart, L. J.
1983-01-01
The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.
Cluster model in reaction theory
International Nuclear Information System (INIS)
Adhikari, S.K.
1979-01-01
A recent work by Rosenberg on cluster states in reaction theory is reexamined and generalized to include energies above the threshold for breakup into four composite fragments. The problem of elastic scattering between two interacting composite fragments is reduced to an equivalent two-particle problem with an effective potential to be determined by extremum principles. For energies above the threshold for breakup into three or four composite fragments effective few-particle potentials are introduced and the problem is reduced to effective three- and four-particle problems. The equivalent three-particle equation contains effective two- and three-particle potentials. The effective potential in the equivalent four-particle equation has two-, three-, and four-body connected parts and a piece which has two independent two-body connected parts. In the equivalent three-particle problem we show how to include the effect of a weak three-body potential perturbatively. In the equivalent four-body problem an approximate simple calculational scheme is given when one neglects the four-particle potential the effect of which is presumably very small
Textual information access statistical models
Gaussier, Eric
2013-01-01
This book presents statistical models that have recently been developed within several research communities to access information contained in text collections. The problems considered are linked to applications aiming at facilitating information access:- information extraction and retrieval;- text classification and clustering;- opinion mining;- comprehension aids (automatic summarization, machine translation, visualization).In order to give the reader as complete a description as possible, the focus is placed on the probability models used in the applications
Economic Modelling in Institutional Economic Theory
Directory of Open Access Journals (Sweden)
Wadim Strielkowski
2017-06-01
Our paper centers on the formation of a theory of institutional modelling that includes principles and ideas reflecting the laws of societal development within the framework of institutional economic theory. We scrutinize and discuss the scientific principles of institutional modelling that are postulated by the classics of institutional theory and find their way into the basics of institutional economics. We propose scientific ideas concerning new, innovative approaches to institutional modelling. These ideas have been devised and developed on the basis of the results of our own original design, as well as on the formalisation and measurement of economic institutions, their functioning and evolution. Moreover, we consider the applied aspects of the institutional theory of modelling and employ them in our research to formalise our results and maximise the practical outcome of our paper. Our results and findings might be useful for researchers and stakeholders searching for a systematic and comprehensive description of institutional-level modelling, the principles involved in this process and the main provisions of the institutional theory of economic modelling.
Randomized Item Response Theory Models
Fox, Gerardus J.A.
2005-01-01
The randomized response (RR) technique is often used to obtain answers to sensitive questions. A new method is developed to measure latent variables using the RR technique, because direct questioning leads to biased results. Within the RR technique, the probability of the true response is modeled by
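The classical randomized response design underlying this line of work can be illustrated with Warner's scheme (an assumption here; the paper itself embeds RR in an item response model). With probability p the respondent answers the sensitive question truthfully, otherwise answers its negation, so P(yes) = p·π + (1−p)·(1−π) and π can be recovered without knowing any individual's true status:

```python
import random

def simulate_warner(pi_true, p_design, n, rng):
    """Warner's randomized response: with probability p_design the respondent
    answers the sensitive question truthfully; otherwise answers its negation."""
    yes = 0
    for _ in range(n):
        sensitive = rng.random() < pi_true     # respondent's true (hidden) status
        truthful = rng.random() < p_design     # outcome of the randomizing device
        answer = sensitive if truthful else (not sensitive)
        yes += answer
    return yes / n

def estimate_pi(p_yes, p_design):
    """Moment estimator inverting P(yes) = p*pi + (1-p)*(1-pi)."""
    return (p_yes + p_design - 1) / (2 * p_design - 1)

rng = random.Random(42)
p_yes = simulate_warner(pi_true=0.2, p_design=0.7, n=200_000, rng=rng)
pi_hat = estimate_pi(p_yes, p_design=0.7)  # recovers ~0.2
```

The design parameters (pi_true=0.2, p_design=0.7) are arbitrary illustration values.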
Theory of information warfare: basic framework, methodology and conceptual apparatus
Directory of Open Access Journals (Sweden)
Олександр Васильович Курбан
2015-11-01
A comprehensive theoretical study is conducted to determine the basic provisions of the modern theory of information warfare in online social networks. Three basic blocks that systematize the theoretical and methodological basis of the topic are established: information-psychological warfare, offline social networks, and online social networks. For these three blocks, theoretical concepts are defined and a methodological substantiation of information processes within information warfare in online social networks is formed
Topological quantum theories and integrable models
International Nuclear Information System (INIS)
Keski-Vakkuri, E.; Niemi, A.J.; Semenoff, G.; Tirkkonen, O.
1991-01-01
The path-integral generalization of the Duistermaat-Heckman integration formula is investigated for integrable models. It is shown that for models with periodic classical trajectories the path integral reduces to a form similar to the finite-dimensional Duistermaat-Heckman integration formula. This provides a relation between exactness of the stationary-phase approximation and Morse theory. It is also argued that certain integrable models can be related to topological quantum theories. Finally, it is found that in general the stationary-phase approximation presumes that the initial and final configurations are in different polarizations. This is exemplified by the quantization of the SU(2) coadjoint orbit
Untangling the drivers of nonlinear systems with information theory
Wing, S.; Johnson, J.
2017-12-01
Many systems found in nature are nonlinear. The drivers of a system are often nonlinearly correlated with one another, which makes it a challenge to understand the effects of an individual driver. For example, solar wind velocity (Vsw) and density (nsw) are both found to correlate well with radiation belt fluxes and are thought to be drivers of magnetospheric dynamics; however, Vsw is anti-correlated with nsw, which can potentially confuse interpretation of these relationships as causal or coincidental. Information theory can untangle the drivers of these systems, describe the underlying dynamics, and offer constraints to modelers and theorists, leading to better understanding of the systems. Two examples are presented. In the first example, the solar wind drivers of geosynchronous electrons in the energy range 1.8-3.5 MeV are investigated using mutual information (MI), conditional mutual information (CMI), and transfer entropy (TE). The information transfer from Vsw to geosynchronous MeV electron flux (Je) peaks with a lag time (t) of 2 days. As previously reported, Je is anticorrelated with nsw with a lag of 1 day. However, this lag time and anticorrelation can be attributed mainly to the Je(t + 2 days) correlation with Vsw(t) and the nsw(t + 1 day) anticorrelation with Vsw(t). Analyses of solar wind driving of the magnetosphere need to consider the large lag times, up to 3 days, in the (Vsw, nsw) anticorrelation. Using CMI to remove the effects of Vsw, the response of Je to nsw is 30% smaller and has a lag time < 24 hr, suggesting that the loss mechanism due to nsw or solar wind dynamic pressure has to start operating in < 24 hr. nsw transfers about 36% as much information as Vsw (the primary driver) to Je. Nonstationarity in the system dynamics is investigated using windowed TE. When the data are ordered according to high or low transfer entropy, it is possible to understand details of the triangle distribution that has been identified between Je(t + 2
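The lag-time identification described above can be sketched with a plug-in estimate of lagged mutual information on synthetic binary data (not the actual solar wind series; the driver, noise level, and lag of 2 steps are made up). CMI and TE build on the same counting machinery:

```python
import math
import random
from collections import Counter

def mutual_info(xs, ys):
    """Plug-in estimate of I(X;Y) in bits for two discrete sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

rng = random.Random(0)
x = [rng.randint(0, 1) for _ in range(6000)]            # driver series
# response follows the driver with a 2-step lag, 10% noise
y = [0, 0] + [x[t - 2] if rng.random() < 0.9 else rng.randint(0, 1)
              for t in range(2, len(x))]

def lagged_mi(x, y, k):
    """I(X(t-k); Y(t)): pair each y[t] with the driver value k steps earlier."""
    return mutual_info(x[: len(x) - k], y[k:]) if k else mutual_info(x, y)

scores = {k: lagged_mi(x, y, k) for k in range(5)}
best = max(scores, key=scores.get)  # recovers the true lag of 2
```

The information transfer peaks sharply at the true lag, just as the Vsw-to-Je transfer peaks at 2 days in the study.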
Self Modeling: Expanding the Theories of Learning
Dowrick, Peter W.
2012-01-01
Self modeling (SM) offers a unique expansion of learning theory. For several decades, a steady trickle of empirical studies has reported consistent evidence for the efficacy of SM as a procedure for positive behavior change across physical, social, educational, and diagnostic variations. SM became accepted as an extreme case of model similarity;…
Comparison of Predictive Contract Mechanisms from an Information Theory Perspective
Zhang, Xin; Ward, Tomas; McLoone, Seamus
2012-01-01
Inconsistency arises across a Distributed Virtual Environment due to network latency induced by state-change communications. Predictive Contract Mechanisms (PCMs) combat this problem by reducing the number of messages transmitted in return for perceptually tolerable inconsistency. To date there are no methods to quantify the efficiency of PCMs in communicating this reduced state information. This article presents an approach derived from concepts in information theory for a dee...
Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.
Gopnik, Alison; Wellman, Henry M
2012-11-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.
Information richness in construction projects: A critical social theory
Adriaanse, Adriaan Maria; Voordijk, Johannes T.; Greenwood, David
2002-01-01
Two important factors influencing communication in construction projects are the interests of the people involved and the language spoken by the people involved. The objective of the paper is to analyse these factors by using recent insights from information richness theory. The critical
Information Architecture without Internal Theory: An Inductive Design Process.
Haverty, Marsha
2002-01-01
Suggests that information architecture design is primarily an inductive process, partly because it lacks internal theory and partly because it is an activity that supports emergent phenomena (user experiences) from basic design components. Suggests a resemblance to Constructive Induction, a design process that locates the best representational…
The Philosophy of Information as an Underlying and Unifying Theory of Information Science
Tomic, Taeda
2010-01-01
Introduction: Philosophical analyses of theoretical principles underlying these sub-domains reveal philosophy of information as underlying meta-theory of information science. Method: Conceptual research on the knowledge sub-domains in information science and philosophy and analysis of their mutual connection. Analysis: Similarities between…
Security Theorems via Model Theory
Directory of Open Access Journals (Sweden)
Joshua Guttman
2009-11-01
A model-theoretic approach can establish security theorems for cryptographic protocols. Formulas expressing authentication and non-disclosure properties of protocols have a special form: they are quantified implications, for all xs. (phi implies for some ys. psi). Models (interpretations) for these formulas are *skeletons*, partially ordered structures consisting of a number of local protocol behaviors. *Realized* skeletons contain enough local sessions to explain all the behavior, when combined with some possible adversary behaviors. We show two results. (1) If phi is the antecedent of a security goal, then there is a skeleton A_phi such that, for every skeleton B, phi is satisfied in B iff there is a homomorphism from A_phi to B. (2) A protocol enforces for all xs. (phi implies for some ys. psi) iff every realized homomorphic image of A_phi satisfies psi. Hence, to verify a security goal, one can use the Cryptographic Protocol Shapes Analyzer CPSA (TACAS, 2007) to identify minimal realized skeletons, or "shapes," that are homomorphic images of A_phi. If psi holds in each of these shapes, then the goal holds.
INFORMATION MODEL OF SOCIAL TRANSFORMATIONS
Directory of Open Access Journals (Sweden)
Мария Васильевна Комова
2013-09-01
Social transformation is considered as a process of qualitative changes in society, creating a new level of organization in all areas of life, in different social formations and societies of different types of development. The purpose of the study is to create a universal model for studying social transformations based on understanding them as a consequence of information exchange processes in society. After defining the conceptual model of the study, the author uses the following methods: the descriptive method, analysis, synthesis, and comparison. Information, objectively existing in all elements and systems of the material world, is an integral attribute of social transformation as well. The information model of social transformations is based on the definition of social transformation as the change in the information that functions in the society's information space. The study of social transformations is the study of information flows circulating in society and characterized by different spatial, temporal, and structural states. Social transformations are a highly integrated system of social processes and phenomena, the nature, course and consequences of which are affected by factors representing the whole complex of material objects. The integrated information model of social transformations foresees the interaction of the following components: social memory, information space, and the social ideal. To determine the dynamics and intensity of social transformations, the author uses the notions of an "information threshold of social transformations" and "information pressure". Thus, the universal nature of information leads to considering social transformations as a system of information exchange processes. Social transformations can be extended to any episteme actualized by social needs. The establishment of an information threshold makes it possible to simulate the course of social development, to predict the
How to Produce a Transdisciplinary Information Concept for a Universal Theory of Information?
DEFF Research Database (Denmark)
Brier, Søren
2017-01-01
… concept of information as a difference that makes a difference, and in Luhmann's triple autopoietic, communication-based system theory, where information is always a part of a message. Charles Sanders Peirce's pragmaticist semiotics differs from other paradigms in that it integrates logic and information in interpretative semiotics. I therefore suggest alternatively building information theories based on semiotics, from the basic relations of embodied living systems' meaningful cognition and communication. I agree with Peircean biosemiotics that all transdisciplinary information concepts, in order to work across the natural, technical, social and humanistic sciences, must be defined as a part of real relational meaningful sign-processes manifesting as tokens. Thus Peirce's information theory is empirically based in a realistic worldview, which through modern biosemiotics includes all living systems.
Vacation queueing models theory and applications
Tian, Naishuo
2006-01-01
A classical queueing model consists of three parts: arrival process, service process, and queue discipline. A vacation queueing model, however, has an additional part, the vacation process, which is governed by a vacation policy that can be characterized by three aspects: 1) a vacation start-up rule; 2) a vacation termination rule; and 3) a vacation duration distribution. Hence, vacation queueing models are an extension of classical queueing theory. Vacation Queueing Models: Theory and Applications discusses systematically and in detail the many variations of vacation policy. Allowing servers to take vacations makes queueing models more realistic and flexible in studying real-world waiting line systems. Integrated into the book's discussion are a variety of typical vacation model applications that include call centers with multi-task employees, customized manufacturing, telecommunication networks, maintenance activities, etc. Finally, contents are presented in a "theorem and proof" format and it is invaluabl...
A New Theory-to-Practice Model for Student Affairs: Integrating Scholarship, Context, and Reflection
Reason, Robert D.; Kimball, Ezekiel W.
2012-01-01
In this article, we synthesize existing theory-to-practice approaches within the student affairs literature to arrive at a new model that incorporates formal and informal theory, institutional context, and reflective practice. The new model arrives at a balance between the rigor necessary for scholarly theory development and the adaptability…
Quantum field theory and the standard model
Schwartz, Matthew D
2014-01-01
Providing a comprehensive introduction to quantum field theory, this textbook covers the development of particle physics from its foundations to the discovery of the Higgs boson. Its combination of clear physical explanations, with direct connections to experimental data, and mathematical rigor make the subject accessible to students with a wide variety of backgrounds and interests. Assuming only an undergraduate-level understanding of quantum mechanics, the book steadily develops the Standard Model and state-of-the-art calculation techniques. It includes multiple derivations of many important results, with modern methods such as effective field theory and the renormalization group playing a prominent role. Numerous worked examples and end-of-chapter problems enable students to reproduce classic results and to master quantum field theory as it is used today. Based on a course taught by the author over many years, this book is ideal for an introductory to advanced quantum field theory sequence or for independe...
An application of information theory to stochastic classical gravitational fields
Angulo, J.; Angulo, J. C.; Angulo, J. M.
2018-06-01
The objective of this study lies in incorporating concepts developed in information theory (entropy, complexity, etc.) with the aim of quantifying the variation of the uncertainty associated with a stochastic physical system resident in a spatiotemporal region. As an example of application, a relativistic classical gravitational field has been considered, with a stochastic behavior resulting from the effect induced by one or several external perturbation sources. One of the key concepts of the study is the covariance kernel between two points within the chosen region. Using this concept and appropriate criteria, a methodology is proposed to evaluate the change of uncertainty at a given spatiotemporal point, based on available information and efficiently applying the diverse methods that information theory provides. For illustration, a stochastic version of the Einstein equation with an added Gaussian Langevin term is analyzed.
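For a Gaussian model, the covariance kernel evaluated at a set of spatiotemporal points determines the joint uncertainty in closed form: H = 0.5 ln((2 pi e)^n det(Sigma)). A sketch with an illustrative squared-exponential kernel and arbitrary points (not those of the paper):

```python
import math

def gaussian_entropy(cov):
    """Differential entropy (nats) of a multivariate Gaussian with covariance cov:
    H = 0.5 * ln((2*pi*e)^n * det(cov)). Determinant via Cholesky for stability."""
    n = len(cov)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):                      # Cholesky factorization (cov assumed SPD)
        for j in range(i + 1):
            s = cov[i][j] - sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(s) if i == j else s / L[j][j]
    log_det = 2.0 * sum(math.log(L[i][i]) for i in range(n))
    return 0.5 * (n * math.log(2 * math.pi * math.e) + log_det)

# squared-exponential covariance kernel on three spatiotemporal points
pts = [0.0, 1.0, 2.0]
k = lambda a, b: math.exp(-0.5 * (a - b) ** 2)
cov = [[k(a, b) for b in pts] for a in pts]
h = gaussian_entropy(cov)  # correlation lowers entropy below the independent case
```

Perturbing the field (changing the kernel) changes this entropy, which is one way to quantify the variation of uncertainty the abstract describes.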
Entropy and information causality in general probabilistic theories
International Nuclear Information System (INIS)
Barnum, Howard; Leifer, Matthew; Spekkens, Robert; Barrett, Jonathan; Clark, Lisa Orloff; Stepanik, Nicholas; Wilce, Alex; Wilke, Robin
2010-01-01
We investigate the concept of entropy in probabilistic theories more general than quantum mechanics, with particular reference to the notion of information causality (IC) recently proposed by Pawlowski et al (2009 arXiv:0905.2292). We consider two entropic quantities, which we term measurement and mixing entropy. In the context of classical and quantum theory, these coincide, being given by the Shannon and von Neumann entropies, respectively; in general, however, they are very different. In particular, while measurement entropy is easily seen to be concave, mixing entropy need not be. In fact, as we show, mixing entropy is not concave whenever the state space is a non-simplicial polytope. Thus, the condition that measurement and mixing entropies coincide is a strong constraint on possible theories. We call theories with this property monoentropic. Measurement entropy is subadditive, but not in general strongly subadditive. Equivalently, if we define the mutual information between two systems A and B by the usual formula I(A: B)=H(A)+H(B)-H(AB), where H denotes the measurement entropy and AB is a non-signaling composite of A and B, then it can happen that I(A:BC)< I(A:B). This is relevant to IC in the sense of Pawlowski et al: we show that any monoentropic non-signaling theory in which measurement entropy is strongly subadditive, and also satisfies a version of the Holevo bound, is informationally causal, and on the other hand we observe that Popescu-Rohrlich boxes, which violate IC, also violate strong subadditivity. We also explore the interplay between measurement and mixing entropy and various natural conditions on theories that arise in quantum axiomatics.
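In the classical case, where measurement and mixing entropy coincide with the Shannon entropy, the mutual information formula I(A:B) = H(A) + H(B) - H(AB) from the abstract can be computed directly; a toy sketch with a made-up joint distribution:

```python
import math
from collections import Counter

def H(counts):
    """Shannon entropy in bits of an empirical distribution given as counts."""
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# made-up joint samples of two correlated binary systems A and B
pairs = [(0, 0), (0, 0), (1, 1), (1, 1), (0, 1), (1, 0)]
HA = H(Counter(a for a, _ in pairs))
HB = H(Counter(b for _, b in pairs))
HAB = H(Counter(pairs))
I = HA + HB - HAB  # mutual information, as defined in the abstract
```

Classically, subadditivity H(AB) <= H(A) + H(B) guarantees I >= 0; the abstract's point is that in more general probabilistic theories the analogous quantities can behave very differently.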
Alice and Bob Meet Banach
Aubrun, Guillaume
2017-01-01
The quest to build a quantum computer is arguably one of the major scientific and technological challenges of the twenty-first century, and quantum information theory (QIT) provides the mathematical framework for that quest. Over the last dozen or so years, it has become clear that quantum information theory is closely linked to geometric functional analysis (Banach space theory, operator spaces, high-dimensional probability), a field also known as asymptotic geometric analysis (AGA). In a nutshell, asymptotic geometric analysis investigates quantitative properties of convex sets, or other geometric structures, and their approximate symmetries as the dimension becomes large. This makes it especially relevant to quantum theory, where systems consisting of just a few particles naturally lead to models whose dimension is in the thousands, or even in the billions. Alice and Bob Meet Banach is aimed at multiple audiences connected through their interest in the interface of QIT and AGA: at quantum information resea...
Advances in cognitive theory and therapy: the generic cognitive model.
Beck, Aaron T; Haigh, Emily A P
2014-01-01
For over 50 years, Beck's cognitive model has provided an evidence-based way to conceptualize and treat psychological disorders. The generic cognitive model represents a set of common principles that can be applied across the spectrum of psychological disorders. The updated theoretical model provides a framework for addressing significant questions regarding the phenomenology of disorders not explained in previous iterations of the original model. New additions to the theory include continuity of adaptive and maladaptive function, dual information processing, energizing of schemas, and attentional focus. The model includes a theory of modes, an organization of schemas relevant to expectancies, self-evaluations, rules, and memories. A description of the new theoretical model is followed by a presentation of the corresponding applied model, which provides a template for conceptualizing a specific disorder and formulating a case. The focus on beliefs differentiates disorders and provides a target for treatment. A variety of interventions are described.
International Nuclear Information System (INIS)
Schlingemann, D.
1996-10-01
Several two dimensional quantum field theory models have more than one vacuum state. An investigation of superselection sectors in two dimensions from an axiomatic point of view suggests that there should also be states, called soliton or kink states, which interpolate different vacua. Familiar quantum field theory models for which the existence of kink states has been proven are the Sine-Gordon and the φ^4_2 model. In order to establish the existence of kink states for a larger class of models, we investigate the following question: Which sufficient conditions must a pair of vacuum states fulfill, such that an interpolating kink state can be constructed? We discuss the problem in the framework of algebraic quantum field theory, which includes, for example, the P(φ)_2 models. We identify a large class of vacuum states, including the vacua of the P(φ)_2 models, the Yukawa_2-like models and special types of Wess-Zumino models, for which there is a natural way to construct an interpolating kink state. In two space-time dimensions, massive particle states are kink states. We apply the Haag-Ruelle collision theory to kink sectors in order to analyze the asymptotic scattering states. We show that for special configurations of n kinks the scattering states describe n freely moving non-interacting particles. (orig.)
Model selection and inference a practical information-theoretic approach
Burnham, Kenneth P
1998-01-01
This book is unique in that it covers the philosophy of model-based data analysis and an omnibus strategy for the analysis of empirical data. The book introduces information-theoretic approaches and focuses critical attention on a priori modeling and the selection of a good approximating model that best represents the inference supported by the data. Kullback-Leibler information represents a fundamental quantity in science and is Hirotugu Akaike's basis for model selection. The maximized log-likelihood function can be bias-corrected to provide an estimate of expected, relative Kullback-Leibler information. This leads to Akaike's Information Criterion (AIC) and various extensions, and these are relatively simple and easy to use in practice, but little taught in statistics classes and far less understood in the applied sciences than should be the case. The information-theoretic approaches provide a unified and rigorous theory, an extension of likelihood theory, an important application of information theory, and are ...
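The criterion at the heart of the book, AIC = 2k - 2 ln L, can be sketched for comparing two Gaussian-error models (the residuals and parameter counts below are toy values, not from the book):

```python
import math

def aic(log_likelihood, k):
    """Akaike's Information Criterion: AIC = 2k - 2 ln L."""
    return 2 * k - 2 * log_likelihood

def gaussian_loglik(residuals):
    """Maximized log-likelihood of an i.i.d. Gaussian error model,
    with the error variance profiled out at its MLE."""
    n = len(residuals)
    s2 = sum(r * r for r in residuals) / n
    return -0.5 * n * (math.log(2 * math.pi * s2) + 1)

# toy comparison: same data fit by a 2-parameter and a 5-parameter model
res_simple = [0.9, -1.1, 1.0, -0.8, 1.2, -1.0]
res_complex = [0.8, -1.0, 0.9, -0.9, 1.1, -1.0]   # only slightly better fit
aic_simple = aic(gaussian_loglik(res_simple), k=2 + 1)    # +1 for the variance
aic_complex = aic(gaussian_loglik(res_complex), k=5 + 1)
best = "simple" if aic_simple < aic_complex else "complex"
```

The marginal improvement in fit does not pay for the three extra parameters, so AIC selects the simpler model, which is exactly the parsimony trade-off the book formalizes.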
Grounded theory for radiotherapy practitioners: Informing clinical practice
International Nuclear Information System (INIS)
Walsh, N.A.
2010-01-01
Radiotherapy practitioners may be best placed to undertake qualitative research within the context of cancer, due to specialist knowledge of radiation treatment and sensitivity to radiotherapy patient's needs. The grounded theory approach to data collection and analysis is a unique method of identifying a theory directly based on data collected within a clinical context. Research for radiotherapy practitioners is integral to role expansion within the government's directive for evidence-based practice. Due to the paucity of information on qualitative research undertaken by radiotherapy radiographers, this article aims to assess the potential impact of qualitative research on radiotherapy patient and service outcomes.
The future (and past) of quantum theory after the Higgs boson: a quantum-informational viewpoint.
Plotnitsky, Arkady
2016-05-28
Taking as its point of departure the discovery of the Higgs boson, this article considers quantum theory, including quantum field theory, which predicted the Higgs boson, through the combined perspective of quantum information theory and the idea of technology, while also adopting a non-realist interpretation, in 'the spirit of Copenhagen', of quantum theory and quantum phenomena themselves. The article argues that the 'events' in question in fundamental physics, such as the discovery of the Higgs boson (a particularly complex and dramatic, but not essentially different, case), are made possible by the joint workings of three technologies: experimental technology, mathematical technology and, more recently, digital computer technology. The article will consider the role of and the relationships among these technologies, focusing on experimental and mathematical technologies, in quantum mechanics (QM), quantum field theory (QFT) and finite-dimensional quantum theory, with which quantum information theory has been primarily concerned thus far. It will do so, in part, by reassessing the history of quantum theory, beginning with Heisenberg's discovery of QM, in quantum-informational and technological terms. This history, the article argues, is defined by the discoveries of increasingly complex configurations of observed phenomena and the emergence of the increasingly complex mathematical formalism accounting for these phenomena, culminating in the standard model of elementary-particle physics, defining the current state of QFT. © 2016 The Author(s).
DEFF Research Database (Denmark)
Jensen, Tina Blegind; Kjærgaard, Annemette; Svejvig, Per
2009-01-01
Institutional theory has proven to be a central analytical perspective for investigating the role of social and historical structures in information systems (IS) implementation. However, it does not explicitly account for how organisational actors make sense of and enact technologies in their local context. We address this limitation by exploring the potential of combining institutional theory with sensemaking theory to study IS implementation in organisations. We argue that each theoretical perspective has its own explanatory power and that a combination of the two facilitates a much richer interpretation of IS implementation by linking macro- and micro-levels of analysis. To illustrate this, we report from an empirical study of the implementation of an Electronic Patient Record (EPR) system in a clinical setting. Using key constructs from the two theories, our findings address the phenomenon…
Introduction to zeolite theory and modelling
Santen, van R.A.; Graaf, van de B.; Smit, B.; Bekkum, van H.
2001-01-01
A review. Some of the recent advances in zeolite theory and modelling are presented. In particular, the current status of computational chemistry in Brønsted acid zeolite catalysis, molecular dynamics simulations of molecules adsorbed in zeolites, and novel Monte Carlo techniques are discussed to simulate the…
Prospect Theory in the Heterogeneous Agent Model
Czech Academy of Sciences Publication Activity Database
Polach, J.; Kukačka, Jiří
(2018) ISSN 1860-711X R&D Projects: GA ČR(CZ) GBP402/12/G097 Institutional support: RVO:67985556 Keywords : Heterogeneous Agent Model * Prospect Theory * Behavioral finance * Stylized facts Subject RIV: AH - Economics OBOR OECD: Finance Impact factor: 0.931, year: 2016 http://library.utia.cas.cz/separaty/2018/E/kukacka-0488438.pdf
Recursive renormalization group theory based subgrid modeling
Zhou, YE
1991-01-01
Advancing the knowledge and understanding of turbulence theory is addressed. Specific problems to be addressed will include studies of subgrid models to understand the effects of unresolved small scale dynamics on the large scale motion which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulation.
Diagrammatic group theory in quark models
International Nuclear Information System (INIS)
Canning, G.P.
1977-05-01
A simple and systematic diagrammatic method is presented for calculating the numerical factors arising from group theory in quark models: dimensions, Casimir invariants, vector coupling coefficients and especially recoupling coefficients. Some coefficients for the coupling of 3 quark objects are listed for SU(n) and SU(2n). (orig.) [de]
Aligning Grammatical Theories and Language Processing Models
Lewis, Shevaun; Phillips, Colin
2015-01-01
We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…
Modeling decisions information fusion and aggregation operators
Torra, Vicenc
2007-01-01
Information fusion techniques and aggregation operators produce the most comprehensive, specific datum about an entity using data supplied from different sources, thus enabling us to reduce noise, increase accuracy, summarize and extract information, and make decisions. These techniques are applied in fields such as economics, biology and education, while in computer science they are particularly used in fields such as knowledge-based systems, robotics, and data mining. This book covers the underlying science and application issues related to aggregation operators, focusing on tools used in practical applications that involve numerical information. Starting with detailed introductions to information fusion and integration, measurement and probability theory, fuzzy sets, and functional equations, the authors then cover the following topics in detail: synthesis of judgements, fuzzy measures, weighted means and fuzzy integrals, indices and evaluation methods, model selection, and parameter extraction. The method...
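To make the flavour of these aggregation operators concrete, here is a minimal Python sketch of a weighted mean and an ordered weighted averaging (OWA) operator; the readings and weights are invented for illustration and are not taken from the book.

```python
def weighted_mean(values, weights):
    """Weighted arithmetic mean; weights must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(v * w for v, w in zip(values, weights))

def owa(values, weights):
    """OWA operator: weights attach to sorted positions (ranks),
    not to the sources, so it can express min, max, or median."""
    assert abs(sum(weights) - 1.0) < 1e-9
    ordered = sorted(values, reverse=True)  # largest first
    return sum(v * w for v, w in zip(ordered, weights))

# Fusing three (hypothetical) sensor readings into one datum
readings = [0.6, 0.9, 0.3]
print(weighted_mean(readings, [0.5, 0.3, 0.2]))  # source-weighted fusion
print(owa(readings, [0.0, 1.0, 0.0]))            # the median, as an OWA
```

The difference matters in practice: the weighted mean encodes trust in particular sources, while the OWA operator encodes attitudes such as optimism, pessimism, or outlier rejection, independent of which source produced which value.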
Executive Information Systems' Multidimensional Models
Directory of Open Access Journals (Sweden)
2007-01-01
Full Text Available Executive Information Systems are designed to improve the quality of strategic-level management in organizations through a new type of technology and several techniques for extracting, transforming, processing, integrating and presenting data in such a way that the organizational knowledge filters can easily associate with this data and turn it into information for the organization. These technologies are known as Business Intelligence Tools. But in order to build analytic reports for Executive Information Systems (EIS) in an organization, we need to design a multidimensional model based on the business model of the organization. This paper presents some multidimensional models that can be used in EIS development and proposes a new model that is suitable for strategic business requests.
Russian and Chinese Information Warfare: Theory and Practice
2004-06-01
Topics include: integral neurolinguistic programming; placing essential programs into the conscious or subconscious mind; subconscious suggestions that modify human…; generators of special rays; optical systems; neurolinguistic programming; computer psychotechnology; the mass media; audiovisual effects; and special effects.
Information theory, animal communication, and the search for extraterrestrial intelligence
Doyle, Laurance R.; McCowan, Brenda; Johnston, Simon; Hanser, Sean F.
2011-02-01
We present ongoing research in the application of information theory to animal communication systems with the goal of developing additional detectors and estimators for possible extraterrestrial intelligent signals. Regardless of the species, for intelligence (i.e., complex knowledge) to be transmitted certain rules of information theory must still be obeyed. We demonstrate some preliminary results of applying information theory to socially complex marine mammal species (bottlenose dolphins and humpback whales) as well as arboreal squirrel monkeys, because they almost exclusively rely on vocal signals for their communications, producing signals which can be readily characterized by signal analysis. Metrics such as Zipf's Law and higher-order information-entropic structure are emerging as indicators of the communicative complexity characteristic of an "intelligent message" content within these animals' signals, perhaps not surprising given these species' social complexity. In addition to human languages, for comparison we also apply these metrics to pulsar signals—perhaps (arguably) the most "organized" of stellar systems—as an example of astrophysical systems that would have to be distinguished from an extraterrestrial intelligence message by such information theoretic filters. We also look at a message transmitted from Earth (Arecibo Observatory) that contains a lot of meaning but little information in the mathematical sense we define it here. We conclude that the study of non-human communication systems on our own planet can make a valuable contribution to the detection of extraterrestrial intelligence by providing quantitative general measures of communicative complexity. Studying the complex communication systems of other intelligent species on our own planet may also be one of the best ways to deprovincialize our thinking about extraterrestrial communication systems in general.
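The metrics named above, first-order Shannon entropy and the slope of the Zipf rank-frequency distribution, can be sketched in a few lines of Python. This is only an illustration of the quantities, on an invented toy sequence of call types, not the authors' analysis pipeline.

```python
import math
from collections import Counter

def shannon_entropy(seq):
    """First-order Shannon entropy (bits/symbol) of a symbol sequence."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def zipf_slope(seq):
    """Least-squares slope of log-frequency vs log-rank.
    Human languages cluster near -1; a flat (random) repertoire near 0."""
    freqs = sorted(Counter(seq).values(), reverse=True)
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

calls = "aaaaabbbccd"  # toy repertoire: each letter is one call type
print(shannon_entropy(calls))
print(zipf_slope(calls))
```

Real analyses of animal signals add higher-order entropies (entropy rates over n-grams) to capture the sequential structure the abstract refers to; the first-order quantities here are the starting point.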
Wang, Zhihuan
Research on Information Systems (IS) acceptance has focused substantially on extrinsic motivation in workplaces; little is known about the underlying intrinsic motivations of Hedonic IS (HIS) acceptance. This paper proposes a hybrid HIS acceptance model which takes the unique characteristics of HIS and the multiple identities of a HIS user into consideration by integrating hedonic theory and flow theory with the Technology Acceptance Model (TAM). The model was empirically tested by a field survey. The results indicate that emotional responses, imaginal responses, and flow experience are three main determinants of HIS acceptance.
Nonequilibrium thermodynamics and information theory: basic concepts and relaxing dynamics
Altaner, Bernhard
2017-11-01
Thermodynamics is based on the notions of energy and entropy. While energy is the elementary quantity governing physical dynamics, entropy is the fundamental concept in information theory. In this work, starting from first principles, we give a detailed didactic account of the relations between energy and entropy and thus between physics and information theory. We show that thermodynamic process inequalities, like the second law, are equivalent to the requirement that an effective description for physical dynamics is strongly relaxing. From the perspective of information theory, strongly relaxing dynamics govern the irreversible convergence of a statistical ensemble towards the maximally non-committal probability distribution that is compatible with thermodynamic equilibrium parameters. In particular, Markov processes that converge to a thermodynamic equilibrium state are strongly relaxing. Our framework generalizes previous results to arbitrary open and driven systems, yielding novel thermodynamic bounds for idealized and real processes. The article appears in a collection featuring invited work from the best early-career researchers working within the scope of J. Phys. A, part of the Journal of Physics series' 50th anniversary celebrations in 2017; Bernhard Altaner was selected by the Editorial Board of J. Phys. A as an Emerging Talent.
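A toy illustration of "strongly relaxing" dynamics, sketched in Python under invented transition rates (this is a textbook fact about Markov chains, not the paper's formalism): a two-state Markov chain converges to its stationary distribution, and the relative entropy (KL divergence) to that distribution never increases along the dynamics.

```python
import math

# Two-state Markov chain; rates and initial state are illustrative.
P = [[0.9, 0.1],
     [0.2, 0.8]]          # row-stochastic transition matrix
pi = [2 / 3, 1 / 3]       # stationary distribution: pi P = pi

def step(p):
    """One step of the chain: p -> p P."""
    return [sum(p[i] * P[i][j] for i in range(2)) for j in range(2)]

def kl(p, q):
    """Relative entropy D(p || q), in nats."""
    return sum(a * math.log(a / b) for a, b in zip(p, q) if a > 0)

p = [1.0, 0.0]            # start far from equilibrium
divs = []
for _ in range(20):
    divs.append(kl(p, pi))
    p = step(p)

# The KL divergence to the stationary state is monotonically non-increasing
assert all(a >= b - 1e-12 for a, b in zip(divs, divs[1:]))
print(divs[0], divs[-1])
```

The monotone decrease of D(p || pi) under the chain's action is an instance of the data-processing inequality, and it is the information-theoretic face of the irreversible convergence the abstract describes.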
Preservation of information in Fourier theory based deconvolved nuclear spectra
International Nuclear Information System (INIS)
Madan, V.K.; Gopalakrishnan, K.R.; Sharma, R.C.; Rattan, S.S.
1995-01-01
Nuclear spectroscopy is extremely useful to the internal radiation dosimetry for the estimation of body burden due to gamma emitters. Analysis of nuclear spectra is concerned with the extraction of qualitative and quantitative information embedded in the spectra. A spectral deconvolution method based on Fourier theory is probably the simplest method of deconvolving nuclear spectra. It is proved mathematically that the deconvolution method preserves the qualitative information. It is shown by using simulated spectra and an observed gamma ray spectrum that the method preserves the quantitative information. This may provide a novel approach of information extraction from a deconvolved spectrum. The paper discusses the methodology, mathematical analysis, and the results obtained by deconvolving spectra. (author). 6 refs., 2 tabs
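A minimal sketch of the underlying idea, in pure Python with an invented toy spectrum and response function (not the authors' method or data): deconvolution by division in the Fourier domain, with a check that total counts (quantitative information) and peak positions (qualitative information) survive the operation.

```python
import cmath

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N)) for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N)
                for k in range(N)).real / N for n in range(N)]

def convolve(x, h):
    """Circular convolution via pointwise product of transforms."""
    return idft([a * b for a, b in zip(dft(x), dft(h))])

def deconvolve(y, h):
    """Deconvolution via pointwise division of transforms."""
    return idft([yk / hk for yk, hk in zip(dft(y), dft(h))])

# Toy "true" spectrum with two peaks, smeared by a normalized response
true = [0, 0, 10, 0, 0, 0, 4, 0]
resp = [0.6, 0.2, 0, 0, 0, 0, 0, 0.2]   # sums to 1; no spectral zeros
observed = convolve(true, resp)
recovered = deconvolve(observed, resp)

# Quantitative information (total counts) is preserved...
assert abs(sum(recovered) - sum(true)) < 1e-9
# ...and so is qualitative information (the main peak's position)
assert max(range(8), key=lambda i: recovered[i]) == 2
```

In practice the transform of a detector response can vanish or become tiny at some frequencies, so real implementations use a regularized division (e.g. multiplying by the conjugate and adding a small constant to |H|² in the denominator); the toy response here was chosen to be safely invertible.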
A dynamical theory for the Rishon model
International Nuclear Information System (INIS)
Harari, H.; Seiberg, N.
1980-09-01
We propose a composite model for quarks and leptons based on an exact SU(3)_C x SU(3)_H gauge theory and two fundamental J=1/2 fermions: a charged T-rishon and a neutral V-rishon. Quarks, leptons and W-bosons are SU(3)_H-singlet composites of rishons. A dynamically broken effective SU(3)_C x SU(2)_L x SU(2)_R x U(1)_(B-L) gauge theory emerges at the composite level. The theory is ''natural'', anomaly-free, has no fundamental scalar particles, and describes at least three generations of quarks and leptons. Several ''technicolor'' mechanisms are automatically present. (Author)
Polyacetylene and relativistic field-theory models
International Nuclear Information System (INIS)
Bishop, A.R.; Campbell, D.K.; Fesser, K.
1981-01-01
Connections between continuum, mean-field, adiabatic Peierls-Froehlich theory in the half-filled band limit and known field theory results are discussed. Particular attention is given to the phi^4 model and to the solvable N = 2 Gross-Neveu model. The latter is equivalent to the Peierls system at a static, semi-classical level. Based on this equivalence we note the prediction of both kink and polaron solitons in models of trans-(CH)_x. Polarons in cis-(CH)_x are compared with those in the trans isomer. Optical absorption from polarons is described, and general experimental consequences of polarons in (CH)_x and other conjugated polymers are discussed
Information Systems Outsourcing Relationship Model
Directory of Open Access Journals (Sweden)
Richard Flemming
2007-09-01
Increasing attention is being paid to what determines the success of an information systems outsourcing arrangement. The current research aims to provide an improved understanding of the factors influencing the outcome of an information systems outsourcing relationship and to provide a preliminary validation of an extended outsourcing relationship model by interviews with information systems outsourcing professionals in both the client and vendor of a major Australian outsourcing relationship. It also investigates whether the client and the vendor perceive the relationship differently and if so, how they perceive it differently and whether the two perspectives are interrelated.
Theory to practice: the humanbecoming leading-following model.
Ursel, Karen L
2015-01-01
Guided by the humanbecoming leading-following model, the author designed a nursing theories course with the intention of creating a meaningful nursing theory to practice link. The author perceived that with the implementation of Situation-Background-Assessment-Recommendations (SBAR) communication, nursing staff had drifted away from using the Kardex™ in shift-to-shift reporting. Nursing students, faculty, and staff members supported the creation of a theories project which would engage nursing students in the pursuit of clinical excellence. The project chosen was to revise the existing Kardex™ (the predominant nursing communication tool). In the project, guided by a nursing theory, nursing students focused on the unique patient's experience, depicting the specific role of nursing knowledge and the contributions of the registered nurse to the patient's healthcare journey. The emphasis of this theoretical learning was the application of a nursing theory to real-life clinical challenges with communication of relevant, timely, and accurate patient information, recognizing that real problems are often complex and require multi-perspective approaches. This project created learning opportunities where a nursing theory would be chosen by the nursing student clinical group and applied in their clinical specialty area. This practice activity served to broaden student understandings of the role of nursing knowledge and nursing theories in their professional practice. © The Author(s) 2014.
Rao, Zhenhui
2016-01-01
The research reported here investigated the relationship between students' use of language learning strategies and their English proficiency, and then interpreted the data from two models in information-processing theory. Results showed that the students' English proficiency significantly affected their use of learning strategies, with high-level…
Attachment and the Processing of Social Information across the Life Span: Theory and Evidence
Dykas, Matthew J.; Cassidy, Jude
2011-01-01
Researchers have used J. Bowlby's (1969/1982, 1973, 1980, 1988) attachment theory frequently as a basis for examining whether experiences in close personal relationships relate to the processing of social information across childhood, adolescence, and adulthood. We present an integrative life-span-encompassing theoretical model to explain the…
Working memory: theories, models, and controversies.
Baddeley, Alan
2012-01-01
I present an account of the origins and development of the multicomponent approach to working memory, making a distinction between the overall theoretical framework, which has remained relatively stable, and the attempts to build more specific models within this framework. I follow this with a brief discussion of alternative models and their relationship to the framework. I conclude with speculations on further developments and a comment on the value of attempting to apply models and theories beyond the laboratory studies on which they are typically based.
Topos models for physics and topos theory
International Nuclear Information System (INIS)
Wolters, Sander
2014-01-01
What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a “quantum logic” in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos
Critical Theory as a foundation for Pragmatic Information Systems Design
Gerald Benoît
2001-01-01
This paper considers how the perspectives, communication models, and linguistic behaviors of information systems designers and end users differ. A critique of these differences is made by applying Habermas's communicative action principles. An empirical study of human-human information seeking, based on those principles, indicates which behaviors are predictors of successful interactions and are thus candidate behaviors that may be integrated into computerized information systems.
Modeling Human Information Acquisition Strategies
Heuvelink, Annerieke; Klein, Michel C. A.; van Lambalgen, Rianne; Taatgen, Niels A.; Rijn, Hedderik van
2009-01-01
The focus of this paper is the development of a computational model for intelligent agents that decides on whether to acquire required information by retrieving it from memory or by interacting with the world. First, we present a task for which such decisions have to be made. Next, we discuss an
Prospects for advanced RF theory and modeling
International Nuclear Information System (INIS)
Batchelor, D. B.
1999-01-01
This paper represents an attempt to express in print the contents of a rather philosophical review talk. The charge for the talk was not to summarize the present status of the field and what we can do, but to assess what we will need to do in the future and where the gaps are in fulfilling these needs. The objective was to be complete, covering all aspects of theory and modeling in all frequency regimes, although in the end the talk mainly focussed on the ion cyclotron range of frequencies (ICRF). In choosing which areas to develop, it is important to keep in mind who the customers for RF modeling are likely to be and what sorts of tasks they will need for RF to do. This occupies the first part of the paper. Then we examine each of the elements of a complete RF theory and try to identify the kinds of advances needed. (c) 1999 American Institute of Physics
A Membrane Model from Implicit Elasticity Theory
Freed, A. D.; Liao, J.; Einstein, D. R.
2014-01-01
A Fungean solid is derived for membranous materials as a body defined by isotropic response functions whose mathematical structure is that of a Hookean solid where the elastic constants are replaced by functions of state derived from an implicit, thermodynamic, internal-energy function. The theory utilizes Biot’s (1939) definitions for stress and strain that, in 1-dimension, are the stress/strain measures adopted by Fung (1967) when he postulated what is now known as Fung’s law. Our Fungean membrane model is parameterized against a biaxial data set acquired from a porcine pleural membrane subjected to three, sequential, proportional, planar extensions. These data support an isotropic/deviatoric split in the stress and strain-rate hypothesized by our theory. These data also demonstrate that the material response is highly non-linear but, otherwise, mechanically isotropic. These data are described reasonably well by our otherwise simple, four-parameter, material model. PMID:24282079
Attribution models and the Cooperative Game Theory
Cano Berlanga, Sebastian; Vilella, Cori
2017-01-01
The current paper studies the attribution model used by Google Analytics. Precisely, we use cooperative game theory to propose a fair distribution of the revenues among the considered channels, in order to facilitate cooperation and to guarantee stability. We define a transferable utility convex cooperative game from the observed frequencies and we use the Shapley value to allocate the revenues among the different channels. Furthermore, we evaluate the impact of an advertising...
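The allocation the paper relies on can be sketched directly in Python. The channel names and coalition values below are invented for illustration; only the Shapley-value computation itself is standard.

```python
from itertools import permutations
from math import factorial

def shapley(players, value):
    """Shapley value: each player's marginal contribution to the
    coalition, averaged over all arrival orders of the players."""
    contrib = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = set()
        for p in order:
            before = value(frozenset(coalition))
            coalition.add(p)
            contrib[p] += value(frozenset(coalition)) - before
    return {p: c / factorial(len(players)) for p, c in contrib.items()}

# Toy coalition values: conversions attributable to each channel subset
v = {frozenset(): 0,
     frozenset({"search"}): 6,
     frozenset({"display"}): 4,
     frozenset({"search", "display"}): 14}

shares = shapley(["search", "display"], lambda s: v[s])
# Efficiency: the shares exactly split v({"search", "display"}) = 14
assert abs(sum(shares.values()) - 14) < 1e-9
```

For this toy game, "search" receives 8 and "display" 6: each channel is credited with its stand-alone value plus half the synergy (14 - 6 - 4 = 4) they create together. Convexity of the game, as in the paper, additionally guarantees that the allocation lies in the core, so no subset of channels has an incentive to break away.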
MODELS AND THE DYNAMICS OF THEORIES
Directory of Open Access Journals (Sweden)
Paulo Abrantes
2007-12-01
Abstract: This paper gives a historical overview of the ways various trends in the philosophy of science dealt with models and their relationship with the topics of heuristics and theoretical dynamics. First of all, N. Campbell’s account of analogies as components of scientific theories is presented. Next, the notion of ‘model’ in the reconstruction of the structure of scientific theories proposed by logical empiricists is examined. This overview finishes with M. Hesse’s attempts to develop Campbell’s early ideas in terms of an analogical inference. The final part of the paper points to contemporary developments on these issues which adopt a cognitivist perspective. It is indicated how discussions in the cognitive sciences might help to flesh out some of the insights philosophers of science had concerning the role models and analogies play in actual scientific theorizing. Key words: models, analogical reasoning, metaphors in science, the structure of scientific theories, theoretical dynamics, heuristics, scientific discovery.
Conceptual Models and Theory-Embedded Principles on Effective Schooling.
Scheerens, Jaap
1997-01-01
Reviews models and theories on effective schooling. Discusses four rationality-based organization theories and a fifth perspective, chaos theory, as applied to organizational functioning. Discusses theory-embedded principles flowing from these theories: proactive structuring, fit, market mechanisms, cybernetics, and self-organization. The…
Integrated information theory of consciousness: an updated account.
Tononi, G
2012-12-01
This article presents an updated account of integrated information theory of consciousness (IIT) and some of its implications. IIT stems from thought experiments that lead to phenomenological axioms (existence, compositionality, information, integration, exclusion) and corresponding ontological postulates. The information axiom asserts that every experience is specific: it is what it is by differing in its particular way from a large repertoire of alternatives. The integration axiom asserts that each experience is unified: it cannot be reduced to independent components. The exclusion axiom asserts that every experience is definite: it is limited to particular things and not others and flows at a particular speed and resolution. IIT formalizes these intuitions with postulates. The information postulate states that only "differences that make a difference" from the intrinsic perspective of a system matter: a mechanism generates cause-effect information if its present state has selective past causes and selective future effects within a system. The integration postulate states that only information that is irreducible matters: mechanisms generate integrated information only to the extent that the information they generate cannot be partitioned into that generated within independent components. The exclusion postulate states that only maxima of integrated information matter: a mechanism specifies only one maximally irreducible set of past causes and future effects - a concept. A complex is a set of elements specifying a maximally irreducible constellation of concepts, where the maximum is evaluated over elements and at the optimal spatiotemporal scale. Its concepts specify a maximally integrated conceptual information structure or quale, which is identical with an experience. Finally, changes in information integration upon exposure to the environment reflect a system's ability to match the causal structure of the world. After introducing an updated definition of…
Informed consent in neurosurgery--translating ethical theory into action.
Schmitz, Dagmar; Reinacher, Peter C
2006-09-01
Although a main principle of medical ethics and law since the 1970s, standards of informed consent are regarded with great scepticism by many clinicians. By reviewing the reactions to and adoption of this principle of medical ethics in neurosurgery, the characteristic conflicts that emerge between theory and everyday clinical experience are emphasised and a modified conception of informed consent is proposed. The adoption and debate of informed consent in neurosurgery took place in two steps. Firstly, respect for patient autonomy was included into the ethical codes of the professional organisations. Secondly, the legal demands of the principle were questioned by clinicians. Informed consent is mainly interpreted in terms of freedom from interference and absolute autonomy. It lacks a constructive notion of physician-patient interaction in its effort to promote the best interest of the patient, which, however, potentially emerges from a reconsideration of the principle of beneficence. To avoid insufficient legal interpretations, informed consent should be understood in terms of autonomy and beneficence. A continuous interaction between the patient and the given physician is considered as an essential prerequisite for the realisation of the standards of informed consent.
Informed consent in neurosurgery—translating ethical theory into action
Schmitz, Dagmar; Reinacher, Peter C
2006-01-01
Objective Although a main principle of medical ethics and law since the 1970s, standards of informed consent are regarded with great scepticism by many clinicians. Methods By reviewing the reactions to and adoption of this principle of medical ethics in neurosurgery, the characteristic conflicts that emerge between theory and everyday clinical experience are emphasised and a modified conception of informed consent is proposed. Results The adoption and debate of informed consent in neurosurgery took place in two steps. Firstly, respect for patient autonomy was included into the ethical codes of the professional organisations. Secondly, the legal demands of the principle were questioned by clinicians. Informed consent is mainly interpreted in terms of freedom from interference and absolute autonomy. It lacks a constructive notion of physician–patient interaction in its effort to promote the best interest of the patient, which, however, potentially emerges from a reconsideration of the principle of beneficence. Conclusion To avoid insufficient legal interpretations, informed consent should be understood in terms of autonomy and beneficence. A continuous interaction between the patient and the given physician is considered as an essential prerequisite for the realisation of the standards of informed consent. PMID:16943326
Finite Unification: Theory, Models and Predictions
Heinemeyer, S; Zoupanos, G
2011-01-01
All-loop Finite Unified Theories (FUTs) are very interesting N=1 supersymmetric Grand Unified Theories (GUTs) realising an old field theory dream, and moreover have a remarkable predictive power due to the required reduction of couplings. The reduction of the dimensionless couplings in N=1 GUTs is achieved by searching for renormalization group invariant (RGI) relations among them holding beyond the unification scale. Finiteness results from the fact that there exist RGI relations among dimensional couplings that guarantee the vanishing of all beta-functions in certain N=1 GUTs even to all orders. Furthermore developments in the soft supersymmetry breaking sector of N=1 GUTs and FUTs lead to exact RGI relations, i.e. reduction of couplings, in this dimensionful sector of the theory, too. Based on the above theoretical framework phenomenologically consistent FUTs have been constructed. Here we review FUT models based on the SU(5) and SU(3)^3 gauge groups and their predictions. Of particular interest is the Hig...
Properties of some nonlinear Schroedinger equations motivated through information theory
International Nuclear Information System (INIS)
Yuan, Liew Ding; Parwani, Rajesh R
2009-01-01
We update our understanding of nonlinear Schroedinger equations motivated through information theory. In particular we show that a q-deformation of the basic nonlinear equation leads to a perturbative increase in the energy of a system, thus favouring the simplest q = 1 case. Furthermore the energy minimisation criterion is shown to be equivalent, at leading order, to an uncertainty maximisation argument. The special value η = 1/4 for the interpolation parameter, where leading order energy shifts vanish, implies the preservation of existing supersymmetry in nonlinearised supersymmetric quantum mechanics. Physically, η might be encoding relativistic effects.
Surrogate Marker Evaluation from an Information Theory Perspective
Alonso Abad, Ariel; Molenberghs, Geert
2006-01-01
The last 20 years have seen lots of work in the area of surrogate marker validation, partly devoted to frame the evaluation in a multitrial framework, leading to definitions in terms of the quality of trial- and individual-level association between a potential surrogate and a true endpoint (Buyse et al., 2000, Biostatistics 1, 49–67). A drawback is that different settings have led to different measures at the individual level. Here, we use information theory to create a unified framework, lea...
Fang, Song; Ishii, Hideaki
2017-01-01
This book investigates the performance limitation issues in networked feedback systems. The fact that networked feedback systems consist of control and communication devices and systems calls for the integration of control theory and information theory. The primary contributions of this book lie in two aspects: the newly-proposed information-theoretic measures and the newly-discovered control performance limitations. We first propose a number of information notions to facilitate the analysis. Using those notions, classes of performance limitations of networked feedback systems, as well as state estimation systems, are then investigated. In general, the book presents a unique, cohesive treatment of performance limitation issues of networked feedback systems via an information-theoretic approach. This book is believed to be the first to treat the aforementioned subjects systematically and in a unified manner, offering a unique perspective differing from existing books.
An introductory review of information theory in the context of computational neuroscience.
McDonnell, Mark D; Ikeda, Shiro; Manton, Jonathan H
2011-07-01
This article introduces several fundamental concepts in information theory from the perspective of their origins in engineering. Understanding such concepts is important in neuroscience for two reasons. Simply applying formulae from information theory without understanding the assumptions behind their definitions can lead to erroneous results and conclusions. Furthermore, this century will see a convergence of information theory and neuroscience; information theory will expand its foundations to incorporate more comprehensively biological processes thereby helping reveal how neuronal networks achieve their remarkable information processing abilities.
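The foundational quantities the review introduces are simple to compute once a joint distribution is known exactly; the pitfall the authors warn about arises when these formulae are applied blindly to plug-in estimates from limited neural data. A minimal Python sketch, on invented distributions:

```python
import math

def entropy(dist):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint p(x, y)
    given as a 2-D list of probabilities summing to 1."""
    px = [sum(row) for row in joint]             # marginal over x
    py = [sum(col) for col in zip(*joint)]       # marginal over y
    hxy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - hxy

# Perfectly correlated bits carry 1 bit of mutual information...
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))     # 1.0
# ...while independent bits carry none.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]])) # 0.0
```

The formulae assume the true joint distribution; estimating it from finite spike counts biases both entropies and hence the mutual information, which is precisely the kind of unexamined assumption the article cautions against.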
Theory, modeling and simulation: Annual report 1993
Energy Technology Data Exchange (ETDEWEB)
Dunning, T.H. Jr.; Garrett, B.C.
1994-07-01
Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.
Theory, modeling and simulation: Annual report 1993
International Nuclear Information System (INIS)
Dunning, T.H. Jr.; Garrett, B.C.
1994-07-01
Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.
Information flow, causality, and the classical theory of tachyons
International Nuclear Information System (INIS)
Basano, L.
1977-01-01
Causal paradoxes arising in the tachyon theory have been systematically solved by using the reinterpretation principle, as a consequence of which cause and effect no longer retain an absolute meaning. However, even in the tachyon theory, a cause is always seen to chronologically precede its effect, but this is obtained at the price of allowing cause and effect to be interchanged when required. A recent result has shown that this interchangeability of cause and effect must not be unlimited if serious paradoxes are to be avoided. This partial recovery of the classical concept of causality has been expressed by the conjecture that transcendent tachyons cannot be absorbed by a tachyon detector. In this paper the directional properties of the flow of information between two observers in relative motion, and its consequences on the logical self-consistency of the theory of superluminal particles, are analyzed. It is shown that the above conjecture does not provide a satisfactory solution to the problem because it implies that tachyons of any speed cannot be intercepted by the same detector. (author)
International Nuclear Information System (INIS)
Randjbar-Daemi, S.
1987-01-01
The propagation of closed bosonic strings interacting with background gravitational and dilaton fields is reviewed. The string is treated as a quantum field theory on a compact 2-dimensional manifold. The question is posed as to how the conditions for the vanishing trace anomaly and the ensuing background field equations may depend on global features of the manifold. It is shown that to the leading order in σ-model perturbation theory the string loop effects do not modify the gravitational and the dilaton field equations. However for the purely bosonic strings new terms involving the modular parameter of the world sheet are induced by quantum effects which can be absorbed into a re-definition of the background fields. The authors also discuss some aspects of several regularization schemes such as dimensional, Pauli-Villars and the proper-time cut off in an appendix
Building Information Modeling Comprehensive Overview
Directory of Open Access Journals (Sweden)
Sergey Kalinichuk
2015-07-01
The article provides a comprehensive review of the recently accelerated development of information technology within the project market, including industrial, engineering, procurement, and construction. The author's aim is to cover the last decades of growth of information and communication technology in the construction industry, in particular Building Information Modeling, and to show that the problem of choosing an effective project realization method has not only retained its urgency but has become one of the major conditions for intensive technology development. All of this has created a strong impulse toward shortening project duration and has led to the development of various schedule compression techniques, which have become a focus of modern construction.
Information as a Measure of Model Skill
Roulston, M. S.; Smith, L. A.
2002-12-01
Physicist Paul Davies has suggested that rather than the quest for laws that approximate ever more closely to "truth", science should be regarded as the quest for compressibility. The goodness of a model can be judged by the degree to which it allows us to compress data describing the real world. The "logarithmic scoring rule" is a method for evaluating probabilistic predictions of reality that turns this philosophical position into a practical means of model evaluation. This scoring rule measures the information deficit or "ignorance" of someone in possession of the prediction. A more applied viewpoint is that the goodness of a model is determined by its value to a user who must make decisions based upon its predictions. Any form of decision making under uncertainty can be reduced to a gambling scenario. Kelly showed that the value of a probabilistic prediction to a gambler pursuing the maximum return on their bets depends on their "ignorance", as determined from the logarithmic scoring rule, thus demonstrating a one-to-one correspondence between data compression and gambling returns. Thus information theory provides a way to think about model evaluation that is both philosophically satisfying and practically oriented. P.C.W. Davies, in "Complexity, Entropy and the Physics of Information", Proceedings of the Santa Fe Institute, Addison-Wesley 1990; J. Kelly, Bell Sys. Tech. Journal, 35, 916-926, 1956.
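A hedged sketch of the logarithmic scoring rule described above (numpy assumed; the forecast probabilities and event frequency are invented for illustration):

```python
import numpy as np

def ignorance(probs, outcomes):
    """Mean logarithmic score ("ignorance") in bits: -log2 of the
    probability the forecaster assigned to what actually happened."""
    probs = np.asarray(probs, dtype=float)
    outcomes = np.asarray(outcomes)
    p_observed = np.where(outcomes == 1, probs, 1.0 - probs)
    return float(-np.log2(p_observed).mean())

# Binary event (e.g. rain) with true frequency 0.7:
rng = np.random.default_rng(1)
outcomes = (rng.random(20000) < 0.7).astype(int)

sharp = ignorance(np.full(20000, 0.7), outcomes)  # well-calibrated forecast
coin = ignorance(np.full(20000, 0.5), outcomes)   # uninformative forecast

# Kelly's correspondence: betting at fair even odds, the expected log2
# growth of wealth per bet is 1 - ignorance, so the forecaster with
# lower ignorance compounds wealth faster.
print(sharp, coin)
```

The 50/50 forecaster scores exactly 1 bit per event, while the calibrated forecaster approaches the entropy of the event (about 0.88 bits here), illustrating the link between compression and gambling returns.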
Sigma model approach to the heterotic string theory
International Nuclear Information System (INIS)
Sen, A.
1985-09-01
Relation between the equations of motion for the massless fields in the heterotic string theory, and the conformal invariance of the sigma model describing the propagation of the heterotic string in arbitrary background massless fields is discussed. It is emphasized that this sigma model contains complete information about the string theory. Finally, we discuss the extension of the Hull-Witten proof of local gauge and Lorentz invariance of the sigma-model to higher order in α', and the modification of the transformation laws of the antisymmetric tensor field under these symmetries. Presence of anomaly in the naive N = 1/2 supersymmetry transformation is also pointed out in this context. 12 refs
Prolegomena to a theory of nuclear information exchange
International Nuclear Information System (INIS)
Van Nuffelen, Dominique
1997-01-01
From the researcher's point of view, communications with agricultural populations in case of radiological emergency can not be anything else but the application of a theory of nuclear information exchange among social groups. Consequently, it is essential to work out such a theory, the prolegomena of which are exposed in this paper. It describes an experiment conducted at the 'Service de protection contre les radiations ionisantes' (SPRI), Belgium, and proposes an investigation within the scientific knowledge in this matter. The available empirical and theoretical data allow formulating pragmatic recommendations, the principal one being the necessity of creating, in a normal radiological situation, a number of message scenarios adapted to the agricultural populations. The author points out that, in order to be perfectly adapted, these scenarios must be negotiated between emitter and receiver. If this condition is satisfied, the information in case of nuclear emergency will really be an exchange of knowledge between experts and the agricultural population, i.e. a 'communication'.
Hovick, Shelly R
2014-01-01
Although a family health history can be used to assess disease risk and increase health prevention behaviors, research suggests that few people have collected family health information. Guided by the Theory of Motivated Information Management, this study seeks to understand the barriers to and facilitators of interpersonal information seeking about family health history. Individuals who were engaged to be married (N = 306) were surveyed online and in person to understand how factors such as uncertainty, expectations for an information search, efficacy, and anxiety influence decisions and strategies for obtaining family health histories. The results supported the Theory of Motivated Information Management by demonstrating that individuals who experienced uncertainty discrepancies regarding family health history had greater intention to seek information from family members when anxiety was low, outcome expectancy was high, and communication efficacy was positive. Although raising uncertainty about family health history may be an effective tool for health communicators to increase communication among family members, low-anxiety situations may be optimal for information seeking. Health communication messages must also build confidence in people's ability to communicate with family to obtain the needed health information.
Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model
DEFF Research Database (Denmark)
Møller, Niels Framroze
2008-01-01
Examples of simple economic theory models are analyzed as restrictions on the Cointegrated VAR (CVAR). This establishes a correspondence between basic economic concepts and the econometric concepts of the CVAR: the economic relations correspond to cointegrating vectors and exogeneity ... are related to expectations formation, market clearing, nominal rigidities, etc. Finally, the general-partial equilibrium distinction is analyzed.
Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model
DEFF Research Database (Denmark)
Møller, Niels Framroze
2008-01-01
Examples of simple economic theory models are analyzed as restrictions on the Cointegrated VAR (CVAR). This establishes a correspondence between basic economic concepts and the econometric concepts of the CVAR: the economic relations correspond to cointegrating vectors and exogeneity ...; parameters of the CVAR are shown to be interpretable in terms of expectations formation, market clearing, nominal rigidities, etc. The general-partial equilibrium distinction is also discussed.
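As an illustration of the cointegration idea underlying the CVAR (a simplified numpy sketch, not the paper's model): two I(1) series sharing a stochastic trend wander individually, while the estimated cointegrating combination stays stationary.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500
trend = np.cumsum(rng.normal(size=n))             # shared stochastic trend, I(1)
x = trend + rng.normal(scale=0.5, size=n)         # e.g. one macro variable
y = 2.0 * trend + rng.normal(scale=0.5, size=n)   # a second, cointegrated with x

# OLS estimate of the cointegrating relation y = beta*x + c + u:
A = np.column_stack([x, np.ones(n)])
(beta, c), *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - beta * x - c

# The levels wander (large variance), but the cointegrating residual
# stays tightly bounded:
print(round(beta, 2), np.var(y) > 10 * np.var(resid))
```

In CVAR terms, (1, -beta) plays the role of a cointegrating vector; in an economic model the same relation would be read as, say, a long-run money-demand or market-clearing condition.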
EDITORIAL: Focus on Quantum Information and Many-Body Theory
Eisert, Jens; Plenio, Martin B.
2010-02-01
Quantum many-body models describing natural systems or materials and physical systems assembled piece by piece in the laboratory for the purpose of realizing quantum information processing share an important feature: intricate correlations that originate from the coherent interaction between a large number of constituents. In recent years it has become manifest that the cross-fertilization between research devoted to quantum information science and to quantum many-body physics leads to new ideas, methods, tools, and insights in both fields. Issues of criticality, quantum phase transitions, quantum order and magnetism that play a role in one field find relations to the classical simulation of quantum systems, to error correction and fault tolerance thresholds, to channel capacities and to topological quantum computation, to name but a few. The structural similarities of typical problems in both fields and the potential for pooling of ideas then become manifest. Notably, methods and ideas from quantum information have provided fresh approaches to long-standing problems in strongly correlated systems in the condensed matter context, including both numerical methods and conceptual insights. Contents (tensor networks): 'Homogeneous multiscale entanglement renormalization ansatz tensor networks for quantum critical systems' (M Rizzi, S Montangero, P Silvi, V Giovannetti and Rosario Fazio); 'Concatenated tensor network states' (R Hübener, V Nebendahl and W Dür); 'Entanglement renormalization in free bosonic systems: real-space versus momentum-space renormalization group transforms' (G Evenbly and G Vidal); 'Finite-size geometric entanglement from tensor network algorithms' (Qian-Qian Shi, Román Orús, John Ove Fjærestad and Huan-Qiang Zhou); 'Characterizing symmetries in a projected entangled pair state' (D Pérez-García, M Sanz, C E González-Guillén, M M Wolf and J I Cirac); 'Matrix product operator representations' (B Pirvu, V Murg, J I Cirac...
Quantum integrable models of field theory
International Nuclear Information System (INIS)
Faddeev, L.D.
1979-01-01
Fundamental features of the classical method of the inverse problem have been formulated in the form which is convenient for its quantum reformulation. Typical examples are studied which may help to formulate the quantum method of the inverse problem. Examples are considered for interaction with both attraction and repulsion at a finite density. The sine-Gordon model and the XYZ model from the quantum theory of magnetics are briefly examined. It is noted that all the achievements of one-dimensional mathematical physics as applied to exactly solvable quantum models may be put to an extent within the framework of the quantum method of the inverse problem. Unsolved questions are enumerated and perspectives of applying the inverse problem method are shown.
Theory and Model for Martensitic Transformations
DEFF Research Database (Denmark)
Lindgård, Per-Anker; Mouritsen, Ole G.
1986-01-01
Martensitic transformations are shown to be driven by the interplay between two fluctuating strain components. No soft mode is needed, but a central peak occurs representing the dynamics of strain clusters. A two-dimensional magnetic-analog model with the martensitic-transition symmetry is constructed and analyzed by computer simulation and by a theory which accounts for correlation effects. Dramatic precursor effects at the first-order transition are demonstrated. The model is also of relevance for surface reconstruction transitions.
Efficiency and credit ratings: a permutation-information-theory analysis
International Nuclear Information System (INIS)
Bariviera, Aurelio Fernandez; Martinez, Lisana B; Zunino, Luciano; Belén Guercio, M; Rosso, Osvaldo A
2013-01-01
The role of credit rating agencies has been under severe scrutiny after the subprime crisis. In this paper we explore the relationship between credit ratings and informational efficiency of a sample of thirty-nine corporate bonds of US oil and energy companies from April 2008 to November 2012. For this purpose we use a powerful statistical tool, relatively new in the financial literature: the complexity–entropy causality plane. This representation space allows us to graphically classify the different bonds according to their degree of informational efficiency. We find that this classification agrees with the credit ratings assigned by Moody’s. In particular, we detect the formation of two clusters, which correspond to the global categories of investment and speculative grades. Regarding the latter cluster, two subgroups reflect distinct levels of efficiency. Additionally, we also find an intriguing absence of correlation between informational efficiency and firm characteristics. This allows us to conclude that the proposed permutation-information-theory approach provides an alternative practical way to justify bond classification. (paper)
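The complexity–entropy causality plane builds on Bandt–Pompe permutation entropy. A minimal sketch of the entropy half of that construction (numpy assumed; the embedding order and test series are illustrative choices, not the paper's data):

```python
import numpy as np
from itertools import permutations
from math import factorial, log

def permutation_entropy(x, order=3):
    """Normalized Bandt-Pompe permutation entropy in [0, 1]:
    values near 1 indicate an informationally efficient (random-looking)
    series; lower values indicate exploitable temporal structure."""
    x = np.asarray(x)
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(x) - order + 1):
        counts[tuple(np.argsort(x[i:i + order]))] += 1  # ordinal pattern
    p = np.array([c for c in counts.values() if c > 0], dtype=float)
    p /= p.sum()
    return float(-(p * np.log(p)).sum() / log(factorial(order)))

rng = np.random.default_rng(3)
noise = rng.normal(size=3000)            # white noise: entropy close to 1
sine = np.sin(np.arange(3000) / 5.0)     # strongly ordered: well below 1
print(permutation_entropy(noise), permutation_entropy(sine))
```

The causality plane pairs this normalized entropy with a statistical complexity measure; the sketch above covers only the entropy coordinate.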
Economic contract theory tests models of mutualism.
Weyl, E Glen; Frederickson, Megan E; Yu, Douglas W; Pierce, Naomi E
2010-09-07
Although mutualisms are common in all ecological communities and have played key roles in the diversification of life, our current understanding of the evolution of cooperation applies mostly to social behavior within a species. A central question is whether mutualisms persist because hosts have evolved costly punishment of cheaters. Here, we use the economic theory of employment contracts to formulate and distinguish between two mechanisms that have been proposed to prevent cheating in host-symbiont mutualisms, partner fidelity feedback (PFF) and host sanctions (HS). Under PFF, positive feedback between host fitness and symbiont fitness is sufficient to prevent cheating; in contrast, HS posits the necessity of costly punishment to maintain mutualism. A coevolutionary model of mutualism finds that HS are unlikely to evolve de novo, and published data on legume-rhizobia and yucca-moth mutualisms are consistent with PFF and not with HS. Thus, in systems considered to be textbook cases of HS, we find poor support for the theory that hosts have evolved to punish cheating symbionts; instead, we show that even horizontally transmitted mutualisms can be stabilized via PFF. PFF theory may place previously underappreciated constraints on the evolution of mutualism and explain why punishment is far from ubiquitous in nature.
Magnetic flux tube models in superstring theory
Russo, Jorge G
1996-01-01
Superstring models describing curved 4-dimensional magnetic flux tube backgrounds are exactly solvable in terms of free fields. We consider the simplest model of this type (corresponding to the `Kaluza-Klein' Melvin background). Its 2d action has a flat but topologically non-trivial 10-dimensional target space (there is a mixing of the angular coordinate of the 2-plane with an internal compact coordinate). We demonstrate that this theory has broken supersymmetry but is perturbatively stable if the radius R of the internal coordinate is larger than R_0 = \sqrt{2\alpha'}. In the Green-Schwarz formulation the supersymmetry breaking is a consequence of the presence of a flat but non-trivial connection in the fermionic terms in the action. For R < R_0 and for q > R/2\alpha' there appear instabilities corresponding to tachyonic winding states. The torus partition function Z(q,R) is finite for R > R_0 (and vanishes for qR = 2n, n = integer). At the special points qR = 2n (2n+1) the model is equivalent to the free superstring theory compactified on a circle...
Group theory for unified model building
International Nuclear Information System (INIS)
Slansky, R.
1981-01-01
The results gathered here on simple Lie algebras have been selected with attention to the needs of unified model builders who study Yang-Mills theories based on simple, local-symmetry groups that contain as a subgroup the SU(2)_w x U(1)_w x SU(3)_c symmetry of the standard theory of electromagnetic, weak, and strong interactions. The major topics include, after a brief review of the standard model and its unification into a simple group: the use of Dynkin diagrams to analyze the structure of the group generators and to keep track of the weights (quantum numbers) of the representation vectors; an analysis of the subgroup structure of simple groups, including explicit coordinatizations of the projections in weight space; lists of representations, tensor products and branching rules for a number of simple groups; and other details about groups and their representations that are often helpful for surveying unified models, including vector-coupling coefficient calculations. Tabulations of representations, tensor products, and branching rules for E6, SO10, SU6, F4, SO9, SO5, SO8, SO7, SU4, E7, E8, SU8, SO14, SO18, SO22, and, for completeness, SU3 are included. (These tables may have other applications.) Group-theoretical techniques for analyzing symmetry breaking are described in detail and many examples are reviewed, including explicit parameterizations of mass matrices. (orig.)
A matrix model from string field theory
Directory of Open Access Journals (Sweden)
Syoji Zeze
2016-09-01
We demonstrate that a Hermitian matrix model can be derived from level-truncated open string field theory with Chan-Paton factors. The Hermitian matrix is coupled with a scalar and U(N) vectors which are responsible for the D-brane at the tachyon vacuum. The effective potential for the scalar is evaluated both for finite and large N. An increase of the potential height is observed in both cases. The large N matrix integral is identified with a system of N ZZ branes and a ghost FZZT brane.
IMMAN: free software for information theory-based chemometric analysis.
Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo
2015-05-01
The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated to the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
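One of the supervised parameters IMMAN incorporates, information gain with equal-interval discretization, can be sketched in a few lines (a generic illustration of the measure, not IMMAN's actual implementation; numpy assumed):

```python
import numpy as np

def entropy_bits(labels):
    """Shannon entropy of a discrete label vector, in bits."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def information_gain(feature, labels, bins=4):
    """IG(Y; X) = H(Y) - H(Y | X), after equal-interval discretization
    of the continuous feature X into `bins` intervals."""
    edges = np.histogram_bin_edges(feature, bins=bins)
    binned = np.digitize(feature, edges[1:-1])  # interior edges only
    h_cond = 0.0
    for b in np.unique(binned):
        mask = binned == b
        h_cond += mask.mean() * entropy_bits(labels[mask])
    return entropy_bits(labels) - h_cond

rng = np.random.default_rng(5)
y = rng.integers(0, 2, size=2000)                   # binary class labels
informative = y + rng.normal(scale=0.3, size=2000)  # tracks the class
irrelevant = rng.normal(size=2000)                  # unrelated feature
print(information_gain(informative, y), information_gain(irrelevant, y))
```

Ranking features by this score (or by gain ratio or symmetrical uncertainty, which normalize it) is the rank-based supervised selection IMMAN's interface exposes.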
What Density Functional Theory could do for Quantum Information
Mattsson, Ann
2015-03-01
The Hohenberg-Kohn theorem of Density Functional Theory (DFT), and extensions thereof, tells us that all properties of a system of electrons can be determined through their density, which uniquely determines the many-body wave-function. Given access to the appropriate, universal, functionals of the density we would, in theory, be able to determine all observables of any electronic system, without explicit reference to the wave-function. On the other hand, the wave-function is at the core of Quantum Information (QI), with the wave-function of a set of qubits being the central computational resource in a quantum computer. While there is seemingly little overlap between DFT and QI, reliance upon observables forms a key connection. Though the time-evolution of the wave-function and associated phase information is fundamental to quantum computation, the initial and final states of a quantum computer are characterized by observables of the system. While observables can be extracted directly from a system's wave-function, DFT tells us that we may be able to intuit a method for extracting them from its density. In this talk, I will review the fundamentals of DFT and how these principles connect to the world of QI. This will range from DFT's utility in the engineering of physical qubits, to the possibility of using it to efficiently (but approximately) simulate Hamiltonians at the logical level. The apparent paradox of describing algorithms based on the quantum mechanical many-body wave-function with a DFT-like theory based on observables will remain a focus throughout. The ultimate goal of this talk is to initiate a dialog about what DFT could do for QI, in theory and in practice. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Ormandy, Paula
2011-03-01
Key policy drivers worldwide include optimizing patients' roles in managing their care; focusing services around patients' needs and preferences; and providing information to support patients' contributions and choices. The term information need penetrates many policy documents. Information need is espoused as the foundation from which to develop patient-centred or patient-led services. Yet there is no clear definition as to what the term means or how patients' information needs inform and shape information provision and patient care. The assimilation of complex theories originating from information science has much to offer considerations of patient information need within the context of health care. Health-related research often focuses on the content of information patients prefer, not why they need information. This paper extends and applies knowledge of information behaviour to considerations of information need in health, exposing a working definition for patient information need that reiterates the importance of considering the patient's goals and understanding the patient's context/situation. A patient information need is defined as 'recognition that their knowledge is inadequate to satisfy a goal, within the context/situation that they find themselves at a specific point in time'. This typifies the key concepts of national/international health policy, the centrality and importance of the patient. The proposed definition of patient information need provides a conceptual framework to guide health-care practitioners on what to consider and why when meeting the information needs of patients in practice. This creates a solid foundation from which to inform future research. © 2010 The Author. Health Expectations © 2010 Blackwell Publishing Ltd.
On low rank classical groups in string theory, gauge theory and matrix models
International Nuclear Information System (INIS)
Intriligator, Ken; Kraus, Per; Ryzhov, Anton V.; Shigemori, Masaki; Vafa, Cumrun
2004-01-01
We consider N=1 supersymmetric U(N), SO(N), and Sp(N) gauge theories, with two-index tensor matter and added tree-level superpotential, for general breaking patterns of the gauge group. By considering the string theory realization and geometric transitions, we clarify when glueball superfields should be included and extremized, or rather set to zero; this issue arises for unbroken group factors of low rank. The string theory results, which are equivalent to those of the matrix model, refer to a particular UV completion of the gauge theory, which could differ from conventional gauge theory results by residual instanton effects. Often, however, these effects exhibit miraculous cancellations, and the string theory or matrix model results end up agreeing with standard gauge theory. In particular, these string theory considerations explain and remove some apparent discrepancies between gauge theories and matrix models in the literature
An Integrative Behavioral Model of Information Security Policy Compliance
Directory of Open Access Journals (Sweden)
Sang Hoon Kim
2014-01-01
The authors found the behavioral factors that influence the organization members’ compliance with the information security policy in organizations on the basis of neutralization theory, the theory of planned behavior, and protection motivation theory. Depending on the theory of planned behavior, members’ attitudes towards compliance, as well as normative belief and self-efficacy, were believed to determine the intention to comply with the information security policy. Neutralization theory, a prominent theory in criminology, could be expected to provide the explanation for information system security policy violations. Based on the protection motivation theory, it was inferred that the expected efficacy could have an impact on intentions of compliance. By the above logical reasoning, the integrative behavioral model and eight hypotheses could be derived. Data were collected by conducting a survey; 194 out of 207 questionnaires were available. The test of the causal model was conducted by PLS. The reliability, validity, and model fit were found to be statistically significant. The results of the hypotheses tests showed that seven of the eight hypotheses were acceptable. The theoretical implications of this study are as follows: (1) the study is expected to serve as a baseline for future research about organization members’ compliance with the information security policy, (2) the study attempted an interdisciplinary approach by combining psychology and information system security research, and (3) the study suggested concrete operational definitions of influencing factors for information security policy compliance through a comprehensive theoretical review. Also, the study has some practical implications. First, it can provide the guideline to support the successful execution of the strategic establishment for the implement of information system security policies in organizations. Second, it proves that the need of education and training
An integrative behavioral model of information security policy compliance.
Kim, Sang Hoon; Yang, Kyung Hoon; Park, Sunyoung
2014-01-01
The authors identified the behavioral factors that influence organization members' compliance with the information security policy in organizations, on the basis of neutralization theory, the theory of planned behavior, and protection motivation theory. Depending on the theory of planned behavior, members' attitudes towards compliance, as well as normative belief and self-efficacy, were believed to determine the intention to comply with the information security policy. Neutralization theory, a prominent theory in criminology, could be expected to provide the explanation for information system security policy violations. Based on the protection motivation theory, it was inferred that the expected efficacy could have an impact on intentions of compliance. By the above logical reasoning, the integrative behavioral model and eight hypotheses could be derived. Data were collected by conducting a survey; 194 out of 207 questionnaires were usable. The test of the causal model was conducted by PLS. The reliability, validity, and model fit were found to be statistically significant. The results of the hypotheses tests showed that seven of the eight hypotheses were acceptable. The theoretical implications of this study are as follows: (1) the study is expected to serve as a baseline for future research about organization members' compliance with the information security policy, (2) the study attempted an interdisciplinary approach by combining psychology and information system security research, and (3) the study suggested concrete operational definitions of influencing factors for information security policy compliance through a comprehensive theoretical review. Also, the study has some practical implications. First, it can provide a guideline to support the successful execution of the strategic establishment and implementation of information system security policies in organizations. Second, it demonstrates the need for education and training programs suppressing
Information risk and security modeling
Zivic, Predrag
2005-03-01
This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper will analyze technical-level risk and security metrics of Common Criteria/ISO15408, Centre for Internet Security guidelines, NSA configuration guidelines, and the metrics used at this level. The view of IT operational standards on security metrics, such as GMITS/ISO13335 and ITIL/ITMS, and architectural guidelines such as ISO7498-2, will be explained. Business process level standards such as ISO17799, COSO and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO21827, NSA Infosec Assessment and CobiT will be explored and reviewed. For each defined level of security metrics the research presentation will explore the appropriate usage of these standards. The paper will discuss standards approaches to conducting the risk and security metrics. The research findings will demonstrate the need for a common baseline for both risk and security metrics. This paper will show the relation between the attribute-based common baseline and corporate assets and controls for risk and security metrics. It will be shown that such an approach spans all the mentioned standards. The proposed approach's 3D visual presentation and the development of the Information Security Model will be analyzed and postulated. The presentation will clearly demonstrate the benefits of the proposed attribute-based approach and the defined risk and security space for modeling and measuring.
Forewarning model for water pollution risk based on Bayes theory.
Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis
2014-02-01
In order to reduce the losses caused by water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. This model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen the index systems. A hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that sample features better reflect and represent the population. The forewarning level is judged by the maximum probability rule, and local conditions are then used to propose management strategies that can reduce heavy warnings to a lesser degree. This study takes Taihu Basin as an example. After application and verification of the forewarning model for water pollution risk against actual and simulated data from 2000 to 2009, the forewarning level in 2010 is given as a severe warning, which coincides well with the logistic curve. It is shown that the model is rigorous in theory with a flexible method, reasonable in result with a simple structure, and that it has strong logical superiority and regional adaptability, providing a new way of warning of water pollution risk.
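The core updating step this abstract describes (a prior over warning levels combined with sample information to give a posterior, then the maximum probability rule) can be sketched in a few lines. This is a minimal illustration only: the warning levels, prior, and likelihood values are invented, not the paper's actual index system.

```python
import numpy as np

# Hypothetical discrete warning levels and a prior distribution over them
levels = ["no warning", "light", "moderate", "severe"]
prior = np.array([0.40, 0.30, 0.20, 0.10])

# Likelihood of the observed sample (e.g., simulated index values)
# under each warning level; the numbers are illustrative only
likelihood = np.array([0.05, 0.15, 0.30, 0.50])

# Bayes' rule: posterior is proportional to prior times likelihood
posterior = prior * likelihood
posterior /= posterior.sum()

# Maximum probability rule: pick the level with the largest posterior
warning = levels[int(np.argmax(posterior))]
print(dict(zip(levels, posterior.round(3))), warning)
```

Note how the sample information shifts the verdict: the prior alone favors "no warning", but the posterior mass concentrates on "moderate".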
Profile-likelihood Confidence Intervals in Item Response Theory Models.
Chalmers, R Philip; Pek, Jolynn; Liu, Yang
2017-01-01
Confidence intervals (CIs) are fundamental inferential devices which quantify the sampling variability of parameter estimates. In item response theory, CIs have been primarily obtained from large-sample Wald-type approaches based on standard error estimates, derived from the observed or expected information matrix, after parameters have been estimated via maximum likelihood. An alternative approach to constructing CIs is to quantify sampling variability directly from the likelihood function with a technique known as profile-likelihood confidence intervals (PL CIs). In this article, we introduce PL CIs for item response theory models, compare PL CIs to classical large-sample Wald-type CIs, and demonstrate important distinctions among these CIs. CIs are then constructed for parameters directly estimated in the specified model and for transformed parameters which are often obtained post-estimation. Monte Carlo simulation results suggest that PL CIs perform consistently better than Wald-type CIs for both non-transformed and transformed parameters.
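The contrast between Wald-type and profile-likelihood CIs can be sketched with a one-parameter Poisson-rate problem standing in for an item parameter; the data and model below are illustrative, not from the article. With a single parameter the profile likelihood is just the likelihood itself, so the PL CI collects every rate whose likelihood-ratio statistic stays below the chi-square cutoff.

```python
import numpy as np
from math import lgamma, log

# Illustrative data: i.i.d. Poisson counts; the rate parameter stands in
# for a model parameter whose CI we want
x = np.array([2, 4, 3, 5, 1, 3, 4, 2, 3, 3])
n, lam_hat = len(x), float(x.mean())  # MLE of the rate

def loglik(lam):
    # Poisson log-likelihood: sum_i (x_i log lam - lam - log x_i!)
    return float(np.sum(x * log(lam) - lam) - sum(lgamma(k + 1) for k in x))

# Profile-likelihood CI: keep every lam with
# 2 * (l(lam_hat) - l(lam)) <= chi2_{1, 0.95} = 3.841
grid = np.linspace(0.5, 8.0, 4001)
inside = [lam for lam in grid if 2 * (loglik(lam_hat) - loglik(lam)) <= 3.841]
pl_ci = (min(inside), max(inside))

# Wald CI for comparison: lam_hat +/- 1.96 * sqrt(lam_hat / n)
se = (lam_hat / n) ** 0.5
wald_ci = (lam_hat - 1.96 * se, lam_hat + 1.96 * se)
print(pl_ci, wald_ci)
```

The PL interval is asymmetric around the MLE and respects the parameter's support, whereas the Wald interval is forced to be symmetric; this is the kind of distinction the article examines for IRT parameters.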
Application of Chaos Theory to Psychological Models
Blackerby, Rae Fortunato
This dissertation shows that an alternative theoretical approach from physics--chaos theory--offers a viable basis for improved understanding of human beings and their behavior. Chaos theory provides achievable frameworks for potential identification, assessment, and adjustment of human behavior patterns. Most current psychological models fail to address the metaphysical conditions inherent in the human system, thus bringing deep errors to psychological practice and empirical research. Freudian, Jungian and behavioristic perspectives are inadequate psychological models because they assume, either implicitly or explicitly, that the human psychological system is a closed, linear system. On the other hand, Adlerian models that require open systems are likely to be empirically tenable. Logically, models will hold only if the model's assumptions hold. The innovative application of chaotic dynamics to psychological behavior is a promising theoretical development because the application asserts that human systems are open, nonlinear and self-organizing. Chaotic dynamics use nonlinear mathematical relationships among factors that influence human systems. This dissertation explores these mathematical relationships in the context of a sample model of moral behavior using simulated data. Mathematical equations with nonlinear feedback loops describe chaotic systems. Feedback loops govern the equations' value in subsequent calculation iterations. For example, changes in moral behavior are affected by an individual's own self-centeredness, family and community influences, and previous moral behavior choices that feed back to influence future choices. When applying these factors to the chaos equations, the model behaves like other chaotic systems. For example, changes in moral behavior fluctuate in regular patterns, as determined by the values of the individual, family and community factors. In some cases, these fluctuations converge to one value; in other cases, they diverge in
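The kind of nonlinear feedback loop described above, where each iteration's value feeds back into the next, can be illustrated with the logistic map, a standard one-line chaotic system; the moral-behavior factors themselves are not modeled here.

```python
# Logistic map x_{t+1} = r * x_t * (1 - x_t): a minimal nonlinear feedback loop
# in which each state feeds back to determine the next one.
def logistic_trajectory(r, x0=0.5, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

conv = logistic_trajectory(2.8)   # converges to the fixed point 1 - 1/r
chaos = logistic_trajectory(3.9)  # fluctuates chaotically, never settling
print(round(conv[-1], 4), round(1 - 1 / 2.8, 4))
```

The same equation, depending only on the feedback strength r, either converges to one value or keeps diverging, mirroring the two behaviors the abstract reports for the simulated moral-behavior model.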
New Aspects of Probabilistic Forecast Verification Using Information Theory
Tödter, Julian; Ahrens, Bodo
2013-04-01
This work deals with information-theoretical methods in probabilistic forecast verification, particularly concerning ensemble forecasts. Recent findings concerning the "Ignorance Score" are briefly reviewed, then a consistent generalization to continuous forecasts is motivated. For ensemble-generated forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are prominent verification measures for probabilistic forecasts. In particular, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up a natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The useful properties of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its components (reliability, resolution, and uncertainty) for ensemble-generated forecasts. This algorithm can also be used to calculate the decomposition of the more traditional CRPS exactly. The applicability of the "new" measures is demonstrated in a small evaluation study of ensemble-based precipitation forecasts.
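For a binary event, the two base scores compared in this line of work can be computed directly. The sketch below uses invented outcomes and forecast probabilities; the Ignorance Score is the mean negative log2-probability assigned to what actually occurred, of which the Brier Score is a second-order approximation.

```python
import numpy as np

# Invented binary outcomes and the corresponding forecast probabilities
y = np.array([1, 0, 1, 1, 0, 0, 1, 0])
p = np.array([0.8, 0.2, 0.6, 0.9, 0.4, 0.1, 0.7, 0.3])

# Brier Score: mean squared error of the probability forecast
bs = np.mean((p - y) ** 2)

# Ignorance Score: mean negative log2-likelihood of the observed outcome
ign = -np.mean(np.log2(np.where(y == 1, p, 1 - p)))

print(round(bs, 4), round(ign, 4))
```

Unlike the BS, the IGN diverges when an event that was assigned probability zero occurs, which is exactly the stronger penalization of overconfidence that motivates the information-based scores.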
Evaluation of EMG processing techniques using Information Theory.
Farfán, Fernando D; Politti, Julio C; Felice, Carmelo J
2010-11-12
Electromyographic signals can be used in the biomedical engineering and/or rehabilitation fields as potential sources of control for prosthetics and orthotics. In such applications, digital processing techniques are necessary to follow efficiently and effectively the changes in the physiological characteristics produced by a muscular contraction. In this paper, two methods based on information theory are proposed to evaluate the processing techniques. These methods determine the amount of information that a processing technique is able to extract from EMG signals. The processing techniques evaluated with these methods were: absolute mean value (AMV), RMS values, variance values (VAR) and difference absolute mean value (DAMV). EMG signals from the middle deltoid during abduction and adduction movements of the arm in the scapular plane were recorded for static and dynamic contractions. The optimal window length (segmentation), abduction and adduction movements, and inter-electrode distance were also analyzed. Using the optimal segmentation (200 ms and 300 ms in static and dynamic contractions, respectively) the best processing techniques were: RMS, AMV and VAR in static contractions, and only the RMS in dynamic contractions. Using the RMS of the EMG signal, variations in the amount of information between the abduction and adduction movements were observed. Although the evaluation methods proposed here were applied to standard processing techniques, these methods can also be considered as alternative tools to evaluate new processing techniques in different areas of electrophysiology.
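The four time-domain processing techniques evaluated in the paper (AMV, RMS, VAR, DAMV) have simple definitions that can be sketched as follows; the synthetic window below stands in for a real EMG recording, and the window length and sampling rate are illustrative.

```python
import numpy as np

def emg_features(window):
    """Standard time-domain EMG descriptors evaluated on one analysis window."""
    w = np.asarray(window, dtype=float)
    return {
        "AMV": float(np.mean(np.abs(w))),            # absolute mean value
        "RMS": float(np.sqrt(np.mean(w ** 2))),      # root mean square
        "VAR": float(np.var(w)),                     # variance
        "DAMV": float(np.mean(np.abs(np.diff(w)))),  # difference absolute mean value
    }

# Synthetic stand-in for a 200 ms window sampled at 1 kHz (200 samples)
rng = np.random.default_rng(0)
window = rng.normal(0.0, 0.1, 200)
feats = emg_features(window)
print(feats)
```

In practice these features would be computed over sliding windows of the recording (the paper's optimal lengths were 200 ms and 300 ms), and the information-theoretic methods then score how much each feature stream tells about the underlying contraction.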
Evaluation of EMG processing techniques using Information Theory
Directory of Open Access Journals (Sweden)
Felice Carmelo J
2010-11-01
Background Electromyographic signals can be used in the biomedical engineering and/or rehabilitation fields as potential sources of control for prosthetics and orthotics. In such applications, digital processing techniques are necessary to follow efficiently and effectively the changes in the physiological characteristics produced by a muscular contraction. In this paper, two methods based on information theory are proposed to evaluate the processing techniques. Methods These methods determine the amount of information that a processing technique is able to extract from EMG signals. The processing techniques evaluated with these methods were: absolute mean value (AMV), RMS values, variance values (VAR) and difference absolute mean value (DAMV). EMG signals from the middle deltoid during abduction and adduction movements of the arm in the scapular plane were recorded for static and dynamic contractions. The optimal window length (segmentation), abduction and adduction movements, and inter-electrode distance were also analyzed. Results Using the optimal segmentation (200 ms and 300 ms in static and dynamic contractions, respectively) the best processing techniques were: RMS, AMV and VAR in static contractions, and only the RMS in dynamic contractions. Using the RMS of the EMG signal, variations in the amount of information between the abduction and adduction movements were observed. Conclusions Although the evaluation methods proposed here were applied to standard processing techniques, these methods can also be considered as alternative tools to evaluate new processing techniques in different areas of electrophysiology.
Information carriers and (reading them through) information theory in quantum chemistry.
Geerlings, Paul; Borgoo, Alex
2011-01-21
This Perspective discusses the reduction of the electronic wave function via the second-order reduced density matrix to the electron density ρ(r), which is the key ingredient in density functional theory (DFT) as a basic carrier of information. Simplifying further, the 1-normalized density function turns out to contain essentially the same information as ρ(r) and is even of preferred use as an information carrier when discussing the periodic properties along Mendeleev's table where essentially the valence electrons are at stake. The Kullback-Leibler information deficiency turns out to be the most interesting choice to obtain information on the differences in ρ(r) or σ(r) between two systems. To put it otherwise: when looking for the construction of a functional F(AB) = F[ζ(A)(r),ζ(B)(r)] for extracting differences in information from an information carrier ζ(r) (i.e. ρ(r), σ(r)) for two systems A and B the Kullback-Leibler information measure ΔS is a particularly adequate choice. Examples are given, varying from atoms, to molecules and molecular interactions. Quantum similarity of atoms indicates that the shape function based KL information deficiency is the most appropriate tool to retrieve periodicity in the Periodic Table. The dissimilarity of enantiomers for which different information measures are presented at global and local (i.e. molecular and atomic) level leads to an extension of Mezey's holographic density theorem and shows numerical evidence that in a chiral molecule the whole molecule is pervaded by chirality. Finally Kullback-Leibler information profiles are discussed for intra- and intermolecular proton transfer reactions and a simple S(N)2 reaction indicating that the theoretical information profile can be used as a companion to the energy based Hammond postulate to discuss the early or late transition state character of a reaction. All in all this Perspective's answer is positive to the question of whether an even simpler carrier of
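The Kullback-Leibler information deficiency ΔS used above as the functional F(AB) for comparing two information carriers can be sketched for one-dimensional densities on a grid. The two Gaussians below are generic stand-ins for the densities ρ(r) or shape functions σ(r) of systems A and B; none of the chemistry is reproduced.

```python
import numpy as np

def kl_divergence(p, q, dx):
    """Kullback-Leibler information deficiency ΔS(p||q) for densities
    sampled on a common grid with spacing dx."""
    p, q = np.asarray(p), np.asarray(q)
    mask = p > 0  # 0 * log 0 contributes nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])) * dx)

# Two normalized 1-D densities on a grid (illustrative stand-ins for A and B)
x = np.linspace(-10, 10, 2001)
dx = x[1] - x[0]
p = np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)            # N(0, 1)
q = np.exp(-0.5 * (x - 1) ** 2 / 4) / np.sqrt(8 * np.pi)  # N(1, 4)

print(round(kl_divergence(p, q, dx), 4))
```

ΔS is zero only when the two densities coincide and grows with their dissimilarity, which is what makes it usable both for quantum similarity of atoms and for the reaction profiles mentioned above.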
Viola, Lorenza; Tannor, David
2011-08-01
Precisely characterizing and controlling the dynamics of realistic open quantum systems has emerged in recent years as a key challenge across contemporary quantum sciences and technologies, with implications ranging from physics, chemistry and applied mathematics to quantum information processing (QIP) and quantum engineering. Quantum control theory aims to provide both a general dynamical-system framework and a constructive toolbox to meet this challenge. The purpose of this special issue of Journal of Physics B: Atomic, Molecular and Optical Physics is to present a state-of-the-art account of recent advances and current trends in the field, as reflected in two international meetings that were held on the subject over the last summer and which motivated in part the compilation of this volume—the Topical Group: Frontiers in Open Quantum Systems and Quantum Control Theory, held at the Institute for Theoretical Atomic, Molecular and Optical Physics (ITAMP) in Cambridge, Massachusetts (USA), from 1-14 August 2010, and the Safed Workshop on Quantum Decoherence and Thermodynamics Control, held in Safed (Israel), from 22-27 August 2010. Initial developments in quantum control theory date back to (at least) the early 1980s, and have been largely inspired by the well-established mathematical framework for classical dynamical systems. As the above-mentioned meetings made clear, and as the burgeoning body of literature on the subject testifies, quantum control has grown since then well beyond its original boundaries, and has by now evolved into a highly cross-disciplinary field which, while still fast-moving, is also entering a new phase of maturity, sophistication, and integration. Two trends deserve special attention: on the one hand, a growing emphasis on control tasks and methodologies that are specifically motivated by QIP, in addition and in parallel to applications in more traditional areas where quantum coherence is nevertheless vital (such as, for instance
PARFUME Theory and Model basis Report
Energy Technology Data Exchange (ETDEWEB)
Darrell L. Knudson; Gregory K Miller; G.K. Miller; D.A. Petti; J.T. Maki; D.L. Knudson
2009-09-01
The success of gas reactors depends upon the safety and quality of the coated particle fuel. The fuel performance modeling code PARFUME simulates the mechanical, thermal and physico-chemical behavior of fuel particles during irradiation. This report documents the theory and material properties behind various capabilities of the code, which include: 1) various options for calculating CO production and fission product gas release, 2) an analytical solution for stresses in the coating layers that accounts for irradiation-induced creep and swelling of the pyrocarbon layers, 3) a thermal model that calculates a time-dependent temperature profile through a pebble bed sphere or a prismatic block core, as well as through the layers of each analyzed particle, 4) simulation of multi-dimensional particle behavior associated with cracking in the IPyC layer, partial debonding of the IPyC from the SiC, particle asphericity, and kernel migration (or amoeba effect), 5) two independent methods for determining particle failure probabilities, 6) a model for calculating release-to-birth (R/B) ratios of gaseous fission products that accounts for particle failures and uranium contamination in the fuel matrix, and 7) the evaluation of an accident condition, where a particle experiences a sudden change in temperature following a period of normal irradiation. The accident condition entails diffusion of fission products through the particle coating layers and through the fuel matrix to the coolant boundary. This document represents the initial version of the PARFUME Theory and Model Basis Report. More detailed descriptions will be provided in future revisions.
Entropy in quantum information theory - Communication and cryptography
DEFF Research Database (Denmark)
Majenz, Christian
in quantum Shannon theory. While immensely more entanglement-consuming, the variant of port based teleportation is interesting for applications like instantaneous non-local computation and attacks on quantum position-based cryptography. Port based teleportation cannot be implemented perfectly…, for vanishing error. As a byproduct, a new lower bound for the size of the program register for an approximate universal programmable quantum processor is derived. Finally, the mix is completed with a result in quantum cryptography. While quantum key distribution is the most well-known quantum cryptographic… protocol, there has been increased interest in extending the framework of symmetric key cryptography to quantum messages. We give a new definition for information-theoretic quantum non-malleability, strengthening the previous definition by Ambainis et al. We show that quantum non-malleability implies secrecy…
Surrogate marker evaluation from an information theory perspective.
Alonso, Ariel; Molenberghs, Geert
2007-03-01
The last 20 years have seen a great deal of work in the area of surrogate marker validation, partly devoted to framing the evaluation in a multitrial framework, leading to definitions in terms of the quality of trial- and individual-level association between a potential surrogate and a true endpoint (Buyse et al., 2000, Biostatistics 1, 49-67). A drawback is that different settings have led to different measures at the individual level. Here, we use information theory to create a unified framework, leading to a definition of surrogacy with an intuitive interpretation, offering interpretational advantages, and applicable in a wide range of situations. Our method provides better insight into the chances of finding a good surrogate endpoint in a given situation. We further show that some of the previous proposals follow as special cases of our method. We illustrate our methodology using data from a clinical study in psychiatry.
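In this information-theoretic line of work, individual-level surrogacy is tied to the mutual information I(T,S) between true and surrogate endpoints, mapped to the unit interval by a transform of the form R²_h = 1 − exp(−2I). The sketch below assumes (approximately) bivariate normal endpoints, in which case I = −½ log(1 − ρ²) and the measure reduces to the squared correlation; the data are simulated and the reduction serves only as a sanity check of the measure, not as the paper's general method.

```python
import numpy as np

def r2h_gaussian(t, s):
    """Information-theoretic association R²_h = 1 - exp(-2 I(T,S)).
    Under a bivariate-normal assumption I = -0.5 * log(1 - rho²),
    so the measure reduces to the squared correlation."""
    rho = np.corrcoef(t, s)[0, 1]
    mutual_info = -0.5 * np.log(1.0 - rho ** 2)
    return float(1.0 - np.exp(-2.0 * mutual_info))

# Simulated endpoints: s is the surrogate, t the true endpoint
rng = np.random.default_rng(1)
s = rng.normal(size=500)
t = 0.8 * s + 0.6 * rng.normal(size=500)
print(round(r2h_gaussian(t, s), 3))  # close to the true rho² = 0.64
```

The appeal of the transform is that it stays meaningful for non-normal endpoints too, where I(T,S) no longer has a closed form and must be estimated from the likelihoods.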
The use of information theory for the evaluation of biomarkers of aging and physiological age.
Blokh, David; Stambler, Ilia
2017-04-01
The present work explores the application of information theoretical measures, such as entropy and normalized mutual information, for research of biomarkers of aging. The use of information theory affords unique methodological advantages for the study of aging processes, as it allows evaluating non-linear relations between biological parameters, providing the precise quantitative strength of those relations, both for individual and multiple parameters, showing cumulative or synergistic effect. Here we illustrate those capabilities utilizing a dataset on heart disease, including diagnostic parameters routinely available to physicians. The use of information-theoretical methods, utilizing normalized mutual information, revealed the exact amount of information that various diagnostic parameters or their combinations contained about the persons' age. Based on those exact informative values for the correlation of measured parameters with age, we constructed a diagnostic rule (a decision tree) to evaluate physiological age, as compared to chronological age. The present data illustrated that younger subjects suffering from heart disease showed characteristics of people of higher age (higher physiological age). Utilizing information-theoretical measures, with additional data, it may be possible to create further clinically applicable information-theory-based markers and models for the evaluation of physiological age, its relation to age-related diseases and its potential modifications by therapeutic interventions. Copyright © 2017 Elsevier B.V. All rights reserved.
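The normalized-mutual-information idea (how much of the uncertainty about a person's age group a discretized biomarker removes) can be sketched from first principles; the biomarker bins and age groups below are invented toy data, not the heart-disease dataset.

```python
from collections import Counter
from math import log2

def entropy(xs):
    """Shannon entropy (bits) of the empirical distribution of xs."""
    n = len(xs)
    return -sum(c / n * log2(c / n) for c in Counter(xs).values())

def normalized_mutual_info(xs, ys):
    """I(X;Y) normalized by H(Y): the fraction of the age variable's
    uncertainty removed by knowing the (discretized) biomarker."""
    h_x, h_y = entropy(xs), entropy(list(ys))
    h_xy = entropy(list(zip(xs, ys)))
    return (h_x + h_y - h_xy) / h_y

# Invented discretized data: biomarker bins vs. age groups
biomarker = ["low", "low", "mid", "mid", "high", "high", "high", "low"]
age_group = ["young", "young", "young", "old", "old", "old", "old", "young"]
print(round(normalized_mutual_info(biomarker, age_group), 3))  # prints 0.75
```

Because mutual information is defined without any linearity assumption, the same computation captures non-linear biomarker-age relations that a correlation coefficient would miss, which is the methodological advantage the abstract emphasizes.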
Information theory applied to econophysics: stock market behaviors
Vogel, Eugenio E.; Saravia, Gonzalo
2014-08-01
The use of data compressor techniques has made it possible to recognize magnetic transitions and their associated critical temperatures [E.E. Vogel, G. Saravia, V. Cortez, Physica A 391, 1591 (2012)]. In the present paper we introduce some new concepts associated with data recognition and extend the use of these techniques to econophysics to explore the variations of stock market indicators, showing that information theory can help to recognize different regimes. Modifications and further developments of the previously introduced data compressor wlzip are presented, yielding two measurements. Additionally, we introduce an algorithm that allows tuning the number of significant digits on which the data compression is to act, complementing this with an appropriate method to round off the truncation. The application is done to IPSA, the main indicator of the Chilean Stock Market, during the year 2010, due to the availability of quality data and also to consider a rare effect: the earthquake of the 27th of February of that year, which is as of now the sixth strongest earthquake ever recorded by instruments (8.8 on the Richter scale) according to the United States Geological Survey. Along the year 2010 different regimes are recognized. Calm days show larger compression than agitated days, allowing for classification and recognition. The focus then turns to selected days, showing that it is possible to recognize different regimes with the data of the last hour (60 entries), allowing actions to be determined in a safer way. The "day of the week" effect is weakly present, but the "hour of the day" effect is clearly present; its causes and implications are discussed. This effect also establishes the influence of the Asian, European and American stock markets over the smaller Chilean Stock Market. Dynamical studies are then conducted, intended to find a system that can warn in real time about sudden variations of the market; it is found that information theory can be really helpful in this respect.
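The underlying idea (calm regimes compress better than agitated ones, after rounding the series to a chosen precision) can be sketched with a generic compressor such as zlib. The wlzip compressor itself is not reproduced here, so this is a stand-in, and both series are invented.

```python
import random
import zlib

def compressibility(series, digits=3):
    """Compression ratio of the series rendered at fixed precision
    (a crude stand-in for tuning the number of significant digits)."""
    text = ",".join(f"{v:.{digits}f}" for v in series).encode()
    return len(zlib.compress(text)) / len(text)

random.seed(0)
calm = [100.0 + 0.01 * (i % 3) for i in range(500)]             # quiet, repetitive regime
agitated = [100.0 + random.uniform(-5, 5) for _ in range(500)]  # volatile regime

print(compressibility(calm), compressibility(agitated))
```

The calm series compresses to a far smaller fraction of its original size than the agitated one, so the compression ratio itself can serve as a regime indicator, for example over a rolling window of the last hour of quotes.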
Stochastic linear programming models, theory, and computation
Kall, Peter
2011-01-01
This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...
Finding an information concept suited for a universal theory of information.
Brier, Søren
2015-12-01
The view argued in this article is that if we want to define a universal concept of information covering subjective experiential and meaningful cognition - as well as intersubjective meaningful communication in nature, technology, society and life worlds - then the main problem is to decide which epistemological, ontological and philosophy of science framework the concept of information should be based on and integrated in. All the ontological attempts to create objective concepts of information result in concepts that cannot encompass meaning and experience of embodied living and social systems. There is no conclusive evidence that the core of reality across nature, culture, life and mind is purely either mathematical, logical or of a computational nature. Therefore the core of the information concept should not be based only on pure logical or mathematical rationality. We need to include interpretation, signification and meaning construction in our transdisciplinary framework for information as a basic aspect of reality alongside the physical, chemical and molecular biological. Dretske defines information as the content of new, true, meaningful, and understandable knowledge. According to this widely held definition, information in a transdisciplinary theory cannot be 'objective', but has to be relativized in relation to the receiver's knowledge, as also proposed by Floridi. It is difficult to produce a quantitative statement independently of a qualitative analysis based on some sort of relation to the human condition as a semiotic animal. I therefore suggest the alternative of building information theories based on semiotics, starting from the basic relations of embodied living systems' meaningful cognition and communication. I agree with Peircean biosemiotics that all information must be part of real relational sign-processes manifesting as tokens. Copyright © 2015. Published by Elsevier Ltd.
Spiral model pilot project information model
1991-01-01
The objective was an evaluation of the Spiral Model (SM) development approach to allow NASA Marshall to develop an experience base of that software management methodology. A discussion is presented of the Information Model (IM) that was used as part of the SM methodology. A key concept of the SM is the establishment of an IM to be used by management to track the progress of a project. The IM is the set of metrics that is to be measured and reported throughout the life of the project. These metrics measure both the product and the process to ensure the quality of the final delivery item and to ensure the project met programmatic guidelines. The beauty of the SM, along with the IM, is the ability to measure not only the correctness of the specification and implementation of the requirements but to also obtain a measure of customer satisfaction.
Finding an Information Concept Suited for a Universal Theory of Information
DEFF Research Database (Denmark)
Brier, Søren
2015-01-01
…There is no conclusive evidence that the core of reality across nature, culture, life and mind is purely either mathematical, logical or of a computational nature. Therefore the core of the information concept should not be based only on pure logical or mathematical rationality. We need to include interpretation… definition information in a transdisciplinary theory cannot be ‘objective’, but has to be relativized in relation to the receiver's knowledge, as also proposed by Floridi. It is difficult to produce a quantitative statement independently of a qualitative analysis based on some sort of relation to the human…
Theory and approach of information retrievals from electromagnetic scattering and remote sensing
Jin, Ya-Qiu
2006-01-01
Covers several hot topics in current research on electromagnetic scattering and radiative transfer in complex and random media, polarimetric scattering and SAR imagery technology, data validation and information retrieval from space-borne remote sensing, computational electromagnetics, etc. Includes both forward modelling and inverse problems, analytic theory and numerical approaches. Provides an overall summary of the author's work in recent years, and presents some insights for future research topics.
Dunlop, David L.
Reported is another study related to the Project on an Information Memory Model. This study involved using information theory to investigate the concepts of primacy and recency as they were exhibited by ninth-grade science students while processing a biological sorting problem and an immediate, abstract recall task. Two hundred randomly selected…
Inference with minimal Gibbs free energy in information field theory
International Nuclear Information System (INIS)
Ensslin, Torsten A.; Weig, Cornelius
2010-01-01
Non-linear and non-Gaussian signal inference problems are difficult to tackle. Renormalization techniques permit us to construct good estimators for the posterior signal mean within information field theory (IFT), but the approximations and assumptions made are not very obvious. Here we introduce the simple concept of minimal Gibbs free energy to IFT, and show that previous renormalization results emerge naturally. They can be understood as being the Gaussian approximation to the full posterior probability, which has maximal cross information with it. We derive optimized estimators for three applications, to illustrate the usage of the framework: (i) reconstruction of a log-normal signal from Poissonian data with background counts and point spread function, as it is needed for gamma ray astronomy and for cosmography using photometric galaxy redshifts, (ii) inference of a Gaussian signal with unknown spectrum, and (iii) inference of a Poissonian log-normal signal with unknown spectrum, the combination of (i) and (ii). Finally we explain how Gaussian knowledge states constructed by the minimal Gibbs free energy principle at different temperatures can be combined into a more accurate surrogate of the non-Gaussian posterior.
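For a linear response and known Gaussian signal and noise statistics, the Gaussian-approximation posterior mean that these minimal-free-energy estimators generalize is the classical Wiener filter, m = (S⁻¹ + RᵀN⁻¹R)⁻¹ RᵀN⁻¹ d. The numerical sketch below uses an invented Gaussian-kernel prior, a unit response, and white noise; it is a toy of the linear-Gaussian building block, not of the paper's non-Gaussian applications.

```python
import numpy as np

rng = np.random.default_rng(2)
npix = 100
idx = np.arange(npix)

# Invented smooth prior covariance S (Gaussian kernel plus jitter),
# trivial response R, and white-noise covariance N
S = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / 5.0) ** 2) + 1e-6 * np.eye(npix)
R = np.eye(npix)
N = 0.1 * np.eye(npix)

# Draw a signal from the prior and noisy data d = R s + n
s = rng.multivariate_normal(np.zeros(npix), S)
d = R @ s + rng.multivariate_normal(np.zeros(npix), N)

# Wiener-filter posterior mean: m = (S^-1 + R^T N^-1 R)^-1 R^T N^-1 d
Ninv = np.linalg.inv(N)
m = np.linalg.solve(np.linalg.inv(S) + R.T @ Ninv @ R, R.T @ Ninv @ d)

print(float(np.mean((m - s) ** 2)), float(np.mean((d - s) ** 2)))
```

The filter's reconstruction error is well below that of the raw data; the renormalization and minimal-Gibbs-free-energy machinery of the paper is what extends this closed-form Gaussian case to log-normal and Poissonian settings with unknown spectra.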
Generalised perturbation theory and source of information through chemical measurements
International Nuclear Information System (INIS)
Lelek, V.; Marek, T.
2001-01-01
It is important to perform all analyses and collect all information from the operation of the new facility (which the transmutation demonstration unit will surely be), in order to verify that the operation corresponds to the forecast or to correct the equations of the facility. The behaviour of the molten salt reactor, and in particular its system of measurement, is very different from that of a solid fuel reactor. Key information from the long-time kinetics could be nearly on-line knowledge of the fuel composition. In this work it is shown how to include this in the control and how to use such data for the correction of neutron cross-sections for the higher actinides or other characteristics. The problem of safety - the change of the boundary problem to the initial problem - is also mentioned. The problem is transformed into generalised perturbation theory, in which the adjoint function is obtained through the solution of equations whose right-hand side has the form of a source. Such an approach should be a theoretical basis for the calculation of the sensitivity coefficients. (authors)
Hillebrandt, Annika; Barclay, Laurie J
2017-05-01
Studies have indicated that observers can infer information about others' behavioral intentions from others' emotions and use this information in making their own decisions. Integrating emotions as social information (EASI) theory and attribution theory, we argue that the interpersonal effects of emotions are not only influenced by the type of discrete emotion (e.g., anger vs. happiness) but also by the target of the emotion (i.e., how the emotion relates to the situation). We compare the interpersonal effects of emotions that are integral (i.e., related to the situation) versus incidental (i.e., lacking a clear target in the situation) in a negotiation context. Results from 4 studies support our general argument that the target of an opponent's emotion influences the degree to which observers attribute the emotion to their own behavior. These attributions influence observers' inferences regarding the perceived threat of an impasse or cooperativeness of an opponent, which can motivate observers to strategically adjust their behavior. Specifically, emotion target influenced concessions for both anger and happiness (Study 1, N = 254), with perceived threat and cooperativeness mediating the effects of anger and happiness, respectively (Study 2, N = 280). Study 3 (N = 314) demonstrated the mediating role of attributions and moderating role of need for closure. Study 4 (N = 193) outlined how observers' need for cognitive closure influences how they attribute incidental anger. We discuss theoretical implications related to the social influence of emotions as well as practical implications related to the impact of personality on negotiators' biases and behaviors. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Modeling and Optimization : Theory and Applications Conference
Terlaky, Tamás
2017-01-01
This volume contains a selection of contributions that were presented at the Modeling and Optimization: Theory and Applications Conference (MOPTA) held at Lehigh University in Bethlehem, Pennsylvania, USA on August 17-19, 2016. The conference brought together a diverse group of researchers and practitioners, working on both theoretical and practical aspects of continuous or discrete optimization. Topics presented included algorithms for solving convex, network, mixed-integer, nonlinear, and global optimization problems, and addressed the application of deterministic and stochastic optimization techniques in energy, finance, logistics, analytics, health, and other important fields. The contributions contained in this volume represent a sample of these topics and applications and illustrate the broad diversity of ideas discussed at the meeting.
Theory and modelling of nanocarbon phase stability.
Energy Technology Data Exchange (ETDEWEB)
Barnard, A. S.
2006-01-01
The transformation of nanodiamonds into carbon-onions (and vice versa) has been observed experimentally and has been modeled computationally at various levels of sophistication. Also, several analytical theories have been derived to describe the size, temperature and pressure dependence of this phase transition. However, in most cases a pure carbon-onion or nanodiamond is not the final product. More often than not an intermediary is formed, known as a bucky-diamond, with a diamond-like core encased in an onion-like shell. This has prompted a number of studies investigating the relative stability of nanodiamonds, bucky-diamonds, carbon-onions and fullerenes, in various size regimes. Presented here is a review outlining results of numerous theoretical studies examining the phase diagrams and phase stability of carbon nanoparticles, to clarify the complicated relationship between fullerenic and diamond structures at the nanoscale.
Modeling and Optimization : Theory and Applications Conference
Terlaky, Tamás
2015-01-01
This volume contains a selection of contributions that were presented at the Modeling and Optimization: Theory and Applications Conference (MOPTA) held at Lehigh University in Bethlehem, Pennsylvania, USA on August 13-15, 2014. The conference brought together a diverse group of researchers and practitioners, working on both theoretical and practical aspects of continuous or discrete optimization. Topics presented included algorithms for solving convex, network, mixed-integer, nonlinear, and global optimization problems, and addressed the application of deterministic and stochastic optimization techniques in energy, finance, logistics, analytics, healthcare, and other important fields. The contributions contained in this volume represent a sample of these topics and applications and illustrate the broad diversity of ideas discussed at the meeting.
Attention Discrimination: Theory and Field Experiments with Monitoring Information Acquisition
Bartoš, Vojtěch; Bauer, Michal; Chytilová, Julie; Matějka, Filip
2014-01-01
We link two important ideas: attention is scarce and lack of information about an individual drives discrimination in selection decisions. Our model of allocation of costly attention implies that applicants from negatively stereotyped groups face "attention discrimination": less attention in highly selective cherry-picking markets, where more attention helps applicants, and more attention in lemon-dropping markets, where it harms them. To test the prediction, we integrate tools to monitor inf...
Information aspects of type IX cosmological models
International Nuclear Information System (INIS)
Francisco, G.
1987-01-01
A study of the amount of information necessary to localize the trajectory of a dynamical system known as the Mixmaster universe is presented. The main result is that less information is necessary near the cosmological singularity of the system than far away from it. This conclusion is obtained by evolving probability distributions towards the singularity and comparing the associated information functions. Qualitative methods of dynamical systems theory exhibit a phenomenon that might be related to this loss of information. (author)
Game Theory and its Relationship with Linear Programming Models ...
African Journals Online (AJOL)
Game Theory and its Relationship with Linear Programming Models. ... This paper shows that game theory and linear programming problems are closely related subjects, since any computing method devised for ...
Information loss in effective field theory: Entanglement and thermal entropies
Boyanovsky, Daniel
2018-03-01
Integrating out high energy degrees of freedom to yield a low energy effective field theory leads to a loss of information with a concomitant increase in entropy. We obtain the effective field theory of a light scalar field interacting with heavy fields after tracing out the heavy degrees of freedom from the time evolved density matrix. The initial density matrix describes the light field in its ground state and the heavy fields in equilibrium at a common temperature T . For T =0 , we obtain the reduced density matrix in a perturbative expansion; it reveals an emergent mixed state as a consequence of the entanglement between light and heavy fields. We obtain the effective action that determines the time evolution of the reduced density matrix for the light field in a nonperturbative Dyson resummation of one-loop correlations of the heavy fields. The Von-Neumann entanglement entropy associated with the reduced density matrix is obtained for the nonresonant and resonant cases in the asymptotic long time limit. In the nonresonant case the reduced density matrix displays an incipient thermalization albeit with a wave-vector, time and coupling dependent effective temperature as a consequence of memory of initial conditions. The entanglement entropy is time independent and is the thermal entropy for this effective, nonequilibrium temperature. In the resonant case the light field fully thermalizes with the heavy fields, the reduced density matrix loses memory of the initial conditions and the entanglement entropy becomes the thermal entropy of the light field. We discuss the relation between the entanglement entropy ultraviolet divergences and renormalization.
Hosotani model in closed string theory
International Nuclear Information System (INIS)
Shiraishi, Kiyoshi.
1988-11-01
The Hosotani mechanism in closed string theory with current algebra symmetry is described by the (old covariant) operator method. We compare the gauge symmetry breaking mechanism in a string theory with SU(2) symmetry to that in an equivalent compactified closed string theory. We also investigate the difference between the Hosotani mechanism and the Higgs mechanism in closed string theories by calculating a four-point amplitude of 'Higgs' bosons at tree level. (author)
Wang, Lin
2013-01-01
Background: Cultural-historical activity theory is an important theory in modern psychology. In recent years, it has drawn more attention from related disciplines including information science. Argument: This paper argues that activity theory and domain analysis which uses the theory as one of its bases could bring about some important…
The Properties of Model Selection when Retaining Theory Variables
DEFF Research Database (Denmark)
Hendry, David F.; Johansen, Søren
Economic theories are often fitted directly to data to avoid possible model selection biases. We show that when a theory model that specifies the correct set of m relevant exogenous variables, x_t, is embedded within the larger set of m+k candidate variables, (x_t, w_t), selection over the second set by statistical significance can be undertaken without affecting the estimator distribution of the theory parameters. This strategy returns the theory-parameter estimates when the theory is correct, yet protects against the theory being under-specified because some w_t are relevant.
System Dynamics as Model-Based Theory Building
Schwaninger, Markus; Grösser, Stefan N.
2008-01-01
This paper introduces model-based theory building as a feature of system dynamics (SD) with large potential. It presents a systemic approach to actualizing that potential, thereby opening up a new perspective on theory building in the social sciences. The question addressed is if and how SD enables the construction of high-quality theories. This contribution is based on field experiment type projects which have been focused on model-based theory building, specifically the construction of a mi...
A Realizability Model for Impredicative Hoare Type Theory
DEFF Research Database (Denmark)
Petersen, Rasmus Lerchedal; Birkedal, Lars; Nanevski, Alexandar
2008-01-01
We present a denotational model of impredicative Hoare Type Theory, a very expressive dependent type theory in which one can specify and reason about mutable abstract data types. The model ensures soundness of the extension of Hoare Type Theory with impredicative polymorphism; makes the connections to separation logic clear; and provides a basis for investigation of further sound extensions of the theory, in particular equations between computations and types.
Irreducible integrable theories form tensor products of conformal models
International Nuclear Information System (INIS)
Mathur, S.D.; Warner, N.P.
1991-01-01
By using Toda field theories we show that there are perturbations of direct products of conformal theories that lead to irreducible integrable field theories. The same affine Toda theory can be truncated to different quantum integrable models for different choices of the charge at infinity and the coupling. The classification of integrable models that can be obtained in this fashion follows the classification of symmetric spaces of type G/H with rank H = rank G. (orig.)
Visceral obesity and psychosocial stress: a generalised control theory model
Wallace, Rodrick
2016-07-01
The linking of control theory and information theory via the Data Rate Theorem and its generalisations allows for the construction of necessary-conditions statistical models of body mass regulation in the context of interaction with a complex dynamic environment. By focusing on the stress-related induction of central obesity via failure of HPA axis regulation, we explore implications for strategies of prevention and treatment. It rapidly becomes evident that individual-centred biomedical reductionism is an inadequate paradigm. Without mitigation of HPA axis or related dysfunctions arising from social pathologies of power imbalance, economic insecurity, and so on, it is unlikely that permanent changes in visceral obesity for individuals can be maintained without constant therapeutic effort, an expensive, and likely unsustainable, public policy.
Advancing Models and Theories for Digital Behavior Change Interventions.
Hekler, Eric B; Michie, Susan; Pavel, Misha; Rivera, Daniel E; Collins, Linda M; Jimison, Holly B; Garnett, Claire; Parral, Skye; Spruijt-Metz, Donna
2016-11-01
To be suitable for informing digital behavior change interventions, theories and models of behavior change need to capture individual variation and changes over time. The aim of this paper is to provide recommendations for development of models and theories that are informed by, and can inform, digital behavior change interventions based on discussions by international experts, including behavioral, computer, and health scientists and engineers. The proposed framework stipulates the use of a state-space representation to define when, where, for whom, and in what state for that person, an intervention will produce a targeted effect. The "state" is that of the individual based on multiple variables that define the "space" when a mechanism of action may produce the effect. A state-space representation can be used to help guide theorizing and identify crossdisciplinary methodologic strategies for improving measurement, experimental design, and analysis that can feasibly match the complexity of real-world behavior change via digital behavior change interventions. Copyright © 2016 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
Kitazono, Jun; Kanai, Ryota; Oizumi, Masafumi
2018-03-01
The ability to integrate information in the brain is considered to be an essential property for cognition and consciousness. Integrated Information Theory (IIT) hypothesizes that the amount of integrated information ($\Phi$) in the brain is related to the level of consciousness. IIT proposes that to quantify information integration in a system as a whole, integrated information should be measured across the partition of the system at which information loss caused by partitioning is minimized, called the Minimum Information Partition (MIP). The computational cost for exhaustively searching for the MIP grows exponentially with system size, making it difficult to apply IIT to real neural data. It has been previously shown that if a measure of $\Phi$ satisfies a mathematical property, submodularity, the MIP can be found in polynomial order by an optimization algorithm. However, although the first version of $\Phi$ is submodular, the later versions are not. In this study, we empirically explore to what extent the algorithm can be applied to the non-submodular measures of $\Phi$ by evaluating the accuracy of the algorithm on simulated data and real neural data. We find that the algorithm identifies the MIP nearly perfectly even for the non-submodular measures. Our results show that the algorithm allows us to measure $\Phi$ in large systems within a practical amount of time.
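The exhaustive MIP search whose exponential cost motivates the submodularity-based algorithm can be sketched for a small Gaussian system. Here the Gaussian mutual information between the two halves of a bipartition serves as a toy stand-in for $\Phi$; the function names and the omission of IIT's partition normalization are simplifications, not the measures used in the paper.

```python
from itertools import combinations
import numpy as np

def bipartition_mi(cov, part):
    """Gaussian mutual information I(X_A; X_B) between the two halves of
    a bipartition of a zero-mean Gaussian system with covariance cov,
    used as a toy stand-in for integrated information (unnormalized,
    and not any of the official Phi measures)."""
    n = cov.shape[0]
    a = list(part)
    b = [i for i in range(n) if i not in part]
    logdet = lambda idx: np.linalg.slogdet(cov[np.ix_(idx, idx)])[1]
    return 0.5 * (logdet(a) + logdet(b) - logdet(a + b))

def minimum_information_bipartition(cov):
    """Exhaustive MIP search over all bipartitions: O(2^n) cost, which
    is exactly what submodularity-based algorithms avoid."""
    n = cov.shape[0]
    best = None
    for k in range(1, n // 2 + 1):
        for part in combinations(range(n), k):
            phi = bipartition_mi(cov, part)
            if best is None or phi < best[0]:
                best = (phi, set(part))
    return best

# node 2 is only weakly coupled to the strongly coupled pair (0, 1),
# so the minimum-information cut should separate it
cov = np.array([[1.0, 0.8, 0.1],
                [0.8, 1.0, 0.1],
                [0.1, 0.1, 1.0]])
phi, part = minimum_information_bipartition(cov)
assert part == {2} and phi >= 0.0
```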
Application of the evolution theory in modelling of innovation diffusion
Directory of Open Access Journals (Sweden)
Krstić Milan
2016-01-01
The theory of evolution has found numerous analogies and applications in scientific disciplines other than biology. In that sense, the so-called 'memetic evolution' is today widely accepted. Memes represent a complex adaptive system, where one 'meme' represents an evolutionary cultural element, i.e. the smallest unit of information which can be identified and used to explain the evolution process. Among others, the field of innovation has proved to be a suitable area where the theory of evolution can be successfully applied. In this work the authors start from the assumption that it is also possible to apply the theory of evolution to the modelling of the process of innovation diffusion. Based on the theoretical research conducted, the authors conclude that the process of innovation diffusion, in the interpretation of a 'meme', is actually the process of imitation of the 'meme' of innovation. Since during replication certain 'memes' show greater success than others, this eventually leads to their natural selection. For the survival of innovation 'memes', their manifestations are of key importance in the sense of longevity, fruitfulness and faithful replication. The results of the research categorically confirm the assumption that the theory of evolution can be applied to innovation diffusion with the help of innovation 'memes', which opens up perspectives for new research on the subject.
Theory of choice in bandit, information sampling and foraging tasks.
Averbeck, Bruno B
2015-03-01
Decision making has been studied with a wide array of tasks. Here we examine the theoretical structure of bandit, information sampling and foraging tasks. These tasks move beyond tasks where the choice in the current trial does not affect future expected rewards. We have modeled these tasks using Markov decision processes (MDPs). MDPs provide a general framework for modeling tasks in which decisions affect the information on which future choices will be made. Under the assumption that agents are maximizing expected rewards, MDPs provide normative solutions. We find that all three classes of tasks pose choices among actions which trade off immediate and future expected rewards. The tasks drive these trade-offs in unique ways, however. For bandit and information sampling tasks, increasing uncertainty or the time horizon shifts value to actions that pay off in the future. Correspondingly, decreasing uncertainty increases the relative value of actions that pay off immediately. For foraging tasks the time horizon plays the dominant role, as choices do not affect future uncertainty in these tasks.
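The belief-state MDP view of a bandit task can be made concrete with a minimal two-armed example: one arm with a known payoff probability and one Bernoulli arm with a Beta(1,1) prior, solved by backward induction. The specific arms, prior, and function names are illustrative assumptions, not the tasks modeled in the paper.

```python
from functools import lru_cache

P_KNOWN = 0.5  # payoff probability of the arm whose value is known

@lru_cache(maxsize=None)
def value(s, f, t):
    """Optimal expected total reward with t pulls remaining, after s
    successes and f failures on the unknown Bernoulli arm (Beta(1,1)
    prior). The belief (s, f) is the state of a belief-state MDP."""
    if t == 0:
        return 0.0
    p = (s + 1) / (s + f + 2)  # posterior mean of the unknown arm
    # pulling the unknown arm yields reward AND information (belief update)
    explore = p * (1 + value(s + 1, f, t - 1)) + (1 - p) * value(s, f + 1, t - 1)
    # the known arm never changes the belief, so once chosen it is
    # optimal to stick with it for all remaining pulls
    exploit = P_KNOWN * t
    return max(explore, exploit)

# with one pull left the two arms tie; with a longer horizon the
# informative action gains value, exactly the trade-off described above
assert value(0, 0, 1) == 0.5
assert value(0, 0, 2) > 2 * P_KNOWN
```

Lengthening the horizon t shifts value toward the uncertain arm even though its posterior mean equals the known arm's payoff, because early pulls buy information that later choices can exploit.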
Building Information Modeling for Managing Design and Construction
DEFF Research Database (Denmark)
Berard, Ole Bengt
Contractors planning and executing construction work encounter many kinds of problems with design information, such as uncoordinated drawings and specifications, missing relevant information, and late delivery of design information. Research has shown that missing design information and unintended … outcome of construction work. Even though contractors regularly encounter design information problems, these issues are accepted as a condition of doing business, and better design information has yet to be defined. Building information modeling has the inherent promise of improving the quality of design information for work tasks. * Amount of Information – the number of documents and files, and other media, should be appropriate for the scope. The criteria were identified by empirical studies and theory on information quality in the architectural, engineering and construction (AEC) industry and other fields…
Rauscher, Emily A; Hesse, Colin
2014-01-01
Although the importance of being knowledgeable of one's family health history is widely known, very little research has investigated how families communicate about this important topic. This study investigated how young adults seek information from parents about family health history. The authors used the Theory of Motivated Information Management as a framework to understand the process of uncertainty discrepancy and emotion in seeking information about family health history. Results of this study show the Theory of Motivated Information Management to be a good model to explain the process young adults go through in deciding to seek information from parents about family health history. Results also show that emotions other than anxiety can be used with success in the Theory of Motivated Information Management framework.
Parsimonious Language Models for Information Retrieval
Hiemstra, Djoerd; Robertson, Stephen; Zaragoza, Hugo
We systematically investigate a new approach to estimating the parameters of language models for information retrieval, called parsimonious language models. Parsimonious language models explicitly address the relation between levels of language models that are typically used for smoothing. As such,
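The reestimation at the heart of parsimonious language models is an EM loop that credits each term occurrence partly to the document model and partly to the collection (smoothing) model, then renormalizes, concentrating probability mass on terms that distinguish the document. A simplified sketch follows; the dictionary-based representation, default lambda, and iteration count are assumptions made here, not the paper's exact formulation.

```python
def parsimonious_lm(doc_tf, collection_p, lam=0.1, iters=50):
    """EM estimation of a parsimonious document language model.
    doc_tf: term -> count in the document; collection_p: term ->
    probability under the background collection model."""
    total = sum(doc_tf.values())
    p = {t: c / total for t, c in doc_tf.items()}  # start from ML model
    for _ in range(iters):
        # E-step: expected share of each term's count credited to the
        # document model rather than the collection model
        e = {t: doc_tf[t] * lam * p[t] / (lam * p[t] + (1 - lam) * collection_p[t])
             for t in doc_tf}
        # M-step: renormalize the expected counts
        z = sum(e.values())
        p = {t: e[t] / z for t in e}
    return p

# a common word loses mass to a distinctive one
p = parsimonious_lm({"the": 10, "quantum": 5}, {"the": 0.1, "quantum": 0.001})
assert p["quantum"] > p["the"]
```

The common term "the", well explained by the collection model, is parsimoniously discounted even though it dominates the raw counts.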
Information theory in econophysics: stock market and retirement funds
Vogel, Eugenio; Saravia, G.; Astete, J.; Díaz, J.; Erribarren, R.; Riadi, F.
2013-03-01
Information theory can help to recognize magnetic phase transitions, which can be seen as a way of recognizing different regimes. This is achieved by means of zippers specifically designed to compact data in a meaningful way, as is the case for the compressor wlzip. In the present contribution we first apply wlzip to the Chilean stock market, interpreting the compression rates of the files storing the minute-by-minute variation of the IPSA indicator. Agitated days yield poor compression rates while calm days yield high compressibility. We then correlate this behavior with the value of the five retirement funds related to the Chilean economy. It is found that the covariance between the profitability of the retirement funds and the compressibility of the previous day's IPSA values is high for those funds investing in risky stocks. Surprisingly, there seems to be no great difference among the three riskier funds, contrary to what could be expected from the limitations on portfolio composition established by the laws that regulate this market.
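The regime-detection idea, that agitated days compress poorly while calm days compress well, can be illustrated with a generic compressor. wlzip is not publicly available, so zlib stands in here, and the synthetic "IPSA-like" series are invented purely for illustration.

```python
import random
import zlib

def compression_ratio(values, ndigits=2):
    """Crude stand-in for wlzip: serialize a series of minute-by-minute
    index values and measure how well zlib shrinks it. Low ratio means
    highly compressible (calm, repetitive data)."""
    raw = ",".join(f"{v:.{ndigits}f}" for v in values).encode()
    return len(zlib.compress(raw, 9)) / len(raw)

random.seed(0)
minutes = 390  # one trading day
calm = [4200.00 + 0.01 * (i % 3) for i in range(minutes)]        # quiet drift
agitated = [4200.00 + random.gauss(0.0, 5.0) for _ in range(minutes)]  # volatile

# the calm day compresses far better than the agitated day
assert compression_ratio(calm) < compression_ratio(agitated)
```

The compression ratio thus acts as a one-number entropy proxy for the day's regime, which is the quantity the abstract correlates with next-day fund profitability.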
International Nuclear Information System (INIS)
Cooper, F.
1996-01-01
We review the assumptions and domain of applicability of Landau's Hydrodynamical Model. By considering two models of particle production, pair production from strong electric fields and particle production in the linear σ model, we demonstrate that many of Landau's ideas are verified in explicit field theory calculations.
National Research Council Canada - National Science Library
Nolte, Loren
2002-01-01
The hypothesis is that one can use signal detection theory to improve the performance in detecting tumors in the breast by using this theory to develop task-oriented information processing techniques...
Information structures in economics studies in the theory of markets with imperfect information
Nermuth, Manfred
1982-01-01
This book is intended as a contribution to the theory of markets with imperfect information. The subject being nearly limitless, only certain selected topics are discussed. These are outlined in the Introduction (Ch. 0). The remainder of the book is divided into three parts. All results of economic significance are contained in Parts II & III. Part I introduces the main tools for the analysis, in particular the concept of an information structure. Although most of the material presented in Part I is not original, it is hoped that the detailed and self-contained exposition will help the reader to understand not only the following pages, but also the existing technical and variegated literature on markets with imperfect information. The mathematical prerequisites needed but not explained in the text rarely go beyond elementary calculus and probability theory. Whenever more advanced concepts are used, I have made an effort to give an intuitive explanation as well, so that the argument can also be followed o...
Pangenesis as a source of new genetic information. The history of a now disproven theory.
Bergman, Gerald
2006-01-01
Evolution is based on natural selection of existing biological phenotypic traits. Natural selection can only eliminate traits. It cannot create new ones, requiring a theory to explain the origin of new genetic information. The theory of pangenesis was a major attempt to explain the source of new genetic information required to produce phenotypic variety. This theory, advocated by Darwin as the main source of genetic variety, has now been empirically disproved. It is currently a theory mainly of interest to science historians.
Informal meeting on recent developments in field theory
International Nuclear Information System (INIS)
Anon.
1977-12-01
A topical meeting on recent developments in field theory was organized by the International Centre for Theoretical Physics from 21 to 23 November 1977. This publication is a compilation of the abstracts of the lectures given. The major themes of the meeting were the problem of confinement, the quantization of Yang-Mills theories, and the topological aspects of field theories in flat and curved spaces.
Function Model for Community Health Service Information
Yang, Peng; Pan, Feng; Liu, Danhong; Xu, Yongyong
In order to construct a function model of community health service (CHS) information for the development of a CHS information management system, Integration Definition for Function Modeling (IDEF0), an IEEE standard extended from the Structured Analysis and Design Technique (SADT) and now a widely used function modeling method, was used to classify the information from top to bottom. The contents of every level of the model were described and coded. A function model for CHS information was then established, comprising 4 super-classes, 15 classes and 28 sub-classes of business function, 43 business processes and 168 business activities. This model can facilitate information management system development and workflow refinement.
A Model-Driven Development Method for Management Information Systems
Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki
Traditionally, a Management Information System (MIS) has been developed without formal methods. With informal methods, an MIS is developed over its lifecycle without any models, which causes many problems, such as unreliable system design specifications. In order to overcome these problems, a model theory approach was proposed, based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, there is a model-driven development method that can flexibly accommodate changes of business logic or implementation technologies. In model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems that applies the model-driven development method to a component of the model theory approach. An experiment has shown that the effort saved amounts to more than 30% of the total effort.
A Social Information Processing Model of Media Use in Organizations.
Fulk, Janet; And Others
1987-01-01
Presents a model to examine how social influence processes affect individuals' attitudes toward communication media and media use behavior, integrating two research areas: media use patterns as the outcome of objectively rational choices and social information processing theory. Asserts (in a synthesis) that media characteristics and attitudes are…
Optimal item discrimination and maximum information for logistic IRT models
Veerkamp, W.J.J.; Veerkamp, Wim J.J.; Berger, Martijn P.F.; Berger, Martijn
1999-01-01
Items with the highest discrimination parameter values in a logistic item response theory model do not necessarily give maximum information. This paper derives discrimination parameter values, as functions of the guessing parameter and distances between person parameters and item difficulty, that
Item Information in the Rasch Model
Engelen, Ron J.H.; van der Linden, Willem J.; Oosterloo, Sebe J.
1988-01-01
Fisher's information measure for the item difficulty parameter in the Rasch model and its marginal and conditional formulations are investigated. It is shown that expected item information in the unconditional model equals information in the marginal model, provided the assumption of sampling
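For a Rasch item, the Fisher information about ability θ reduces to P(θ)(1 − P(θ)), which peaks at 0.25 exactly when ability equals item difficulty; this also illustrates the point of the preceding record that information depends on the θ-to-difficulty distance, not on discrimination alone. A minimal sketch (the function names are our own):

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch model,
    with ability theta and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def rasch_item_information(theta, b):
    """Fisher information of a Rasch item about theta: P(1 - P),
    maximized (at 0.25) when theta equals the item difficulty b."""
    p = rasch_p(theta, b)
    return p * (1.0 - p)

# information peaks where ability matches difficulty and decays away from it
assert abs(rasch_item_information(0.0, 0.0) - 0.25) < 1e-12
assert rasch_item_information(0.0, 0.0) > rasch_item_information(2.0, 0.0)
```

In adaptive testing this is why items are selected with difficulty near the current ability estimate: they carry the most information about θ.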
Fuzzy Stochastic Optimization Theory, Models and Applications
Wang, Shuming
2012-01-01
Covering in detail both theoretical and practical perspectives, this book is a self-contained and systematic depiction of current fuzzy stochastic optimization that deploys the fuzzy random variable as a core mathematical tool to model the integrated fuzzy random uncertainty. It proceeds in an orderly fashion from the requisite theoretical aspects of the fuzzy random variable to fuzzy stochastic optimization models and their real-life case studies. The volume reflects the fact that randomness and fuzziness (or vagueness) are two major sources of uncertainty in the real world, with significant implications in a number of settings. In industrial engineering, management and economics, the chances are high that decision makers will be confronted with information that is simultaneously probabilistically uncertain and fuzzily imprecise, and optimization in the form of a decision must be made in an environment that is doubly uncertain, characterized by a co-occurrence of randomness and fuzziness. This book begins...
Consumption of Mass Communication--Construction of a Model on Information Consumption Behaviour.
Sepstrup, Preben
A general conceptual model on the consumption of information is introduced. Information as the output of the mass media is treated as a product, and a model on the consumption of this product is developed by merging elements from consumer behavior theory and mass communication theory. Chapter I gives basic assumptions about the individual and the…
Wiio, Osmo A.
A more unified approach to communication theory can evolve through systems modeling of information theory, communication modes, and mass media operations. Such systematic analysis proposes, as is the case here, that information models be based upon combinations of energy changes and exchanges and changes in receiver systems. The mass media is…
Chaos Theory as a Model for Managing Issues and Crises.
Murphy, Priscilla
1996-01-01
Uses chaos theory to model public relations situations in which the salient feature is volatility of public perceptions. Discusses the premises of chaos theory and applies them to issues management, the evolution of interest groups, crises, and rumors. Concludes that chaos theory is useful as an analogy to structure image problems and to raise…
Catastrophe Theory: A Unified Model for Educational Change.
Cryer, Patricia; Elton, Lewis
1990-01-01
Catastrophe Theory and Herzberg's theory of motivation at work were used to create a model of change that unifies and extends Lewin's two separate stage and force field models. This new model is used to analyze the behavior of academics as they adapt to the changing university environment. (Author/MLW)
A Leadership Identity Development Model: Applications from a Grounded Theory
Komives, Susan R.; Mainella, Felicia C.; Longerbeam, Susan D.; Osteen, Laura; Owen, Julie E.
2006-01-01
This article describes a stage-based model of leadership identity development (LID) that resulted from a grounded theory study on developing a leadership identity (Komives, Owen, Longerbeam, Mainella, & Osteen, 2005). The LID model expands on the leadership identity stages, integrates the categories of the grounded theory into the LID model, and…
Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…
Revealing Relationships among Relevant Climate Variables with Information Theory
Knuth, Kevin H.; Golera, Anthony; Curry, Charles T.; Huyser, Karen A.; Wheeler, Kevin R.; Rossow, William B.
2005-01-01
The primary objective of the NASA Earth-Sun Exploration Technology Office is to understand the observed Earth climate variability, thus enabling the determination and prediction of the climate's response to both natural and human-induced forcing. We are currently developing a suite of computational tools that will allow researchers to calculate, from data, a variety of information-theoretic quantities such as mutual information, which can be used to identify relationships among climate variables, and transfer entropy, which indicates the possibility of causal interactions. Our tools estimate these quantities along with their associated error bars, the latter of which is critical for describing the degree of uncertainty in the estimates. This work is based upon optimal binning techniques that we have developed for piecewise-constant, histogram-style models of the underlying density functions. Two useful side benefits have already been discovered. The first allows a researcher to determine whether there exist sufficient data to estimate the underlying probability density. The second permits one to determine an acceptable degree of round-off when compressing data for efficient transfer and storage. We also demonstrate how mutual information and transfer entropy can be applied so as to allow researchers not only to identify relations among climate variables, but also to characterize and quantify their possible causal interactions.
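The histogram ("piecewise-constant") estimators this abstract describes can be sketched minimally: mutual information between two variables, and transfer entropy as a directed indicator of possible causal interaction. The binning below is plain equal-width, not the optimal binning the authors develop, and all data and names are invented for illustration.

```python
import math
import random
from collections import Counter

def mutual_information(x, y, bins=8):
    """Histogram (piecewise-constant) estimate of I(X;Y) in bits."""
    def discretize(s):
        lo, hi = min(s), max(s)
        # equal-width bins; clamp so the maximum value lands in the last bin
        return [min(int((v - lo) / (hi - lo) * bins), bins - 1) for v in s]
    bx, by = discretize(x), discretize(y)
    n = len(x)
    pxy, px, py = Counter(zip(bx, by)), Counter(bx), Counter(by)
    return sum((c / n) * math.log2(c * n / (px[i] * py[j]))
               for (i, j), c in pxy.items())

def transfer_entropy(source, target, bins=8):
    """Histogram estimate of T(source -> target) in bits, history length 1.
    A positive value indicates the source helps predict the target's next
    step beyond the target's own past (a signature of possible causality)."""
    def discretize(s):
        lo, hi = min(s), max(s)
        return [min(int((v - lo) / (hi - lo) * bins), bins - 1) for v in s]
    y, x = discretize(source), discretize(target)
    triples = list(zip(x[1:], x[:-1], y[:-1]))      # (x_next, x_now, y_now)
    n = len(triples)
    c3 = Counter(triples)
    cxy = Counter((xt, yt) for _, xt, yt in triples)
    cxx = Counter((xn, xt) for xn, xt, _ in triples)
    cx = Counter(xt for _, xt, _ in triples)
    return sum((c / n) * math.log2(c * cx[xt] / (cxy[(xt, yt)] * cxx[(xn, xt)]))
               for (xn, xt, yt), c in c3.items())

# toy check: y drives x with a one-step lag, so T(y->x) should dominate
rng = random.Random(42)
y_series = [rng.random() for _ in range(2000)]
x_series = [0.0] + y_series[:-1]
mi_self = mutual_information(y_series, y_series)    # near log2(bins) = 3 bits
te_forward = transfer_entropy(y_series, x_series, bins=4)
te_backward = transfer_entropy(x_series, y_series, bins=4)
```

As the abstract notes, estimates like these carry binning and finite-sample bias, which is why the authors attach error bars.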
Model Information Exchange System (MIXS).
2013-08-01
Many travel demand forecast models operate at state, regional, and local levels. While they share the same physical network in overlapping geographic areas, they use different and uncoordinated modeling networks. This creates difficulties for models ...
Theory and modeling of active brazing.
Energy Technology Data Exchange (ETDEWEB)
van Swol, Frank B.; Miller, James Edward; Lechman, Jeremy B.; Givler, Richard C.
2013-09-01
Active brazes have been used for many years to produce bonds between metal and ceramic objects. By including a relatively small amount of a reactive additive in the braze, one seeks to improve the wetting and spreading behavior of the braze. The additive modifies the substrate, either by a chemical surface reaction or possibly by alloying. By its nature, the joining process with active brazes is a complex nonequilibrium, non-steady-state process that couples chemical reaction and reactant and product diffusion to the rheology and wetting behavior of the braze. Most of these subprocesses take place in the interfacial region, and most are difficult to access by experiment. To improve control over the brazing process, one requires a better understanding of the melting of the active braze, the rate of the chemical reaction, reactant and product diffusion rates, and the nonequilibrium composition-dependent surface tension and viscosity. This report identifies ways in which modeling and theory can assist in improving our understanding.
Domain Theory, Its Models and Concepts
DEFF Research Database (Denmark)
Andreasen, Mogens Myrup; Howard, Thomas J.; Bruun, Hans Peter Lomholt
2014-01-01
Domain Theory is a systems approach for the analysis and synthesis of products. Its basic idea is to view a product as systems of activities, organs and parts and to define structure, elements, behaviour and function in these domains. The theory is a basis for a long line of research contribution...
Uncertainty analysis of an integrated energy system based on information theory
International Nuclear Information System (INIS)
Fu, Xueqian; Sun, Hongbin; Guo, Qinglai; Pan, Zhaoguang; Xiong, Wen; Wang, Li
2017-01-01
Currently, a custom-designed configuration of different renewable technologies named the integrated energy system (IES) has become popular due to its high efficiency, benefiting from complementary multi-energy technologies. This paper proposes an information entropy approach to quantify uncertainty in an integrated energy system based on a stochastic model that drives a power system model derived from an actual network on Barry Island. Due to the complexity of co-behaviours between generators, a copula-based approach is utilized to articulate the dependency structure of the generator outputs with regard to such factors as weather conditions. Correlation coefficients and mutual information, which are effective for assessing the dependence relationships, are applied to judge whether the stochastic IES model is correct. The calculated information values can be used to analyse the impacts of the coupling of power and heat on power flows and heat flows, and this approach will be helpful for improving the operation of IES. - Highlights: • The paper explores uncertainty of an integrated energy system. • The dependent weather model is verified from the perspective of correlativity. • The IES model considers the dependence between power and heat. • The information theory helps analyse the complexity of IES operation. • The application of the model is studied using an operational system on Barry Island.
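The dependence-modelling step in this abstract, a copula articulating how generator outputs co-behave, can be sketched generically. The following is a plain Gaussian-copula sampler with invented marginals and an invented correlation, not the fitted Barry Island model.

```python
import math
import random

def gaussian_copula_pairs(n, rho, seed=0):
    """Draw n pairs (u, v) on [0,1]^2 whose dependence follows a Gaussian
    copula with correlation rho: correlate two standard normals, then map
    each through the standard normal CDF."""
    rng = random.Random(seed)
    ncdf = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    pairs = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        pairs.append((ncdf(z1), ncdf(z2)))
    return pairs

# push the dependent uniforms through invented marginals for two generators
pairs = gaussian_copula_pairs(5000, rho=0.8)
gen_a = [10.0 * u for u, _ in pairs]        # hypothetical 0-10 MW output
gen_b = [6.0 * v * v for _, v in pairs]     # hypothetical nonlinear marginal
```

Separating the dependence structure (the copula) from the marginals is what lets correlation coefficients and mutual information be checked against the stochastic model afterwards.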
Attachment and the processing of social information across the life span: theory and evidence.
Dykas, Matthew J; Cassidy, Jude
2011-01-01
Researchers have used J. Bowlby's (1969/1982, 1973, 1980, 1988) attachment theory frequently as a basis for examining whether experiences in close personal relationships relate to the processing of social information across childhood, adolescence, and adulthood. We present an integrative life-span-encompassing theoretical model to explain the patterns of results that have emerged from these studies. The central proposition is that individuals who possess secure experience-based internal working models of attachment will process--in a relatively open manner--a broad range of positive and negative attachment-relevant social information. Moreover, secure individuals will draw on their positive attachment-related knowledge to process this information in a positively biased schematic way. In contrast, individuals who possess insecure internal working models of attachment will process attachment-relevant social information in one of two ways, depending on whether the information could cause the individual psychological pain. If processing the information is likely to lead to psychological pain, insecure individuals will defensively exclude this information from further processing. If, however, the information is unlikely to lead to psychological pain, then insecure individuals will process this information in a negatively biased schematic fashion that is congruent with their negative attachment-related experiences. In a comprehensive literature review, we describe studies that illustrate these patterns of attachment-related information processing from childhood to adulthood. This review focuses on studies that have examined specific components (e.g., attention and memory) and broader aspects (e.g., attributions) of social information processing. We also provide general conclusions and suggestions for future research.
A simplified computational memory model from information processing
Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang
2016-01-01
This paper proposes a computational model of memory from the viewpoint of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, a meta-memory is defined to represent a neuron or brain cortex on the basis of biology and graph theory, and an intra-modular network is developed with a modeling algorithm that maps nodes and edges; the bi-modular network is then delineated in intra-modular and inter-modular terms. Finally, a polynomial retrieval algorithm is introduced. We simulate the memory phenomena and the functions of memorization and strengthening with information processing algorithms. The theoretical analysis and the simulation results show that the model accords with memory phenomena from an information processing view. PMID:27876847
McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W
2015-03-27
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
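The propagation step this abstract describes can be illustrated generically: draw parameter vectors from a posterior and push them through a cheap surrogate of the expensive calculation. Here the quadratic `toy_observable` stands in for the Skyrme-functional calculation (which the paper emulates with a Gaussian process) and the Gaussian samples stand in for the real posterior; both are invented for illustration.

```python
import random

def toy_observable(params):
    """Invented stand-in for an expensive model evaluation; in the paper's
    setting a Gaussian-process emulator of the density functional plays
    this role."""
    a, b = params
    return 2.0 * a + b * b

def propagate(samples, model):
    """Monte Carlo propagation: evaluate the model on posterior parameter
    samples and summarize the predictive distribution as (mean, std)."""
    preds = [model(p) for p in samples]
    m = sum(preds) / len(preds)
    var = sum((p - m) ** 2 for p in preds) / (len(preds) - 1)
    return m, var ** 0.5

# pretend these were drawn from a Bayesian posterior over two parameters
rng = random.Random(1)
posterior = [(rng.gauss(1.0, 0.1), rng.gauss(0.5, 0.05)) for _ in range(10000)]
pred_mean, pred_std = propagate(posterior, toy_observable)
```

The predictive spread `pred_std` is the theoretical statistical uncertainty on the observable; a weakly constraining new measurement would leave the posterior, and hence this spread, nearly unchanged.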
Energy Technology Data Exchange (ETDEWEB)
McDonnell, J. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schunck, N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Higdon, D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sarich, J. [Argonne National Lab. (ANL), Argonne, IL (United States); Wild, S. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Nazarewicz, W. [Michigan State Univ., East Lansing, MI (United States); Oak Ridge National Lab., Oak Ridge, TN (United States); Univ. of Warsaw, Warsaw (Poland)
2015-03-24
Modelling, Information, Processing, and Control
1989-01-15
have studied the use of transfer function methods to analyze closed-loop systems arising out of certain linear feedback laws, use of transfer...classical Müntz-Szász theory of real exponentials. As a result it is seen that D has domain including that of A^(1/2) in all cases. Further work is...approximation questions, we will discuss the use of transfer function methods to analyze closed-loop systems arising out of certain linear feedback laws
Directory of Open Access Journals (Sweden)
Joshua Rodewald
2016-10-01
Supply networks existing today in many industries can behave as complex adaptive systems, making them more difficult to analyze and assess. Being able to fully understand both the complex static and dynamic structures of a complex adaptive supply network (CASN) is key to being able to make more informed management decisions and prioritize resources and production throughout the network. Previous efforts to model and analyze CASNs have been impeded by the complex, dynamic nature of the systems. However, drawing from other complex adaptive systems sciences, information theory provides a model-free methodology removing many of those barriers, especially concerning complex network structure and dynamics. With minimal information about the network nodes, transfer entropy can be used to reverse engineer the network structure, while local transfer entropy can be used to analyze the network structure's dynamics. Both simulated and real-world networks were analyzed using this methodology. Applying the methodology to CASNs allows the practitioner to capitalize on observations from the highly multidisciplinary field of information theory, which provides insights into a CASN's self-organization, emergence, stability/instability, and distributed computation. This not only provides managers with a more thorough understanding of a system's structure and dynamics for management purposes, but also opens up research opportunities into eventual strategies to monitor and manage emergence and adaption within the environment.
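A toy version of the reverse-engineering idea, assuming binary activity series per node: compute pairwise transfer entropy and keep directed pairs above a threshold. The three-node chain, the 0.9 coupling, and the 0.05 threshold are all invented for illustration.

```python
import math
import random
from collections import Counter

def te_binary(src, dst):
    """Transfer entropy (bits) from src to dst for binary series,
    history length 1."""
    triples = list(zip(dst[1:], dst[:-1], src[:-1]))
    n = len(triples)
    c3, cxy, cxx, cx = Counter(triples), Counter(), Counter(), Counter()
    for xn, xt, yt in triples:
        cxy[(xt, yt)] += 1
        cxx[(xn, xt)] += 1
        cx[xt] += 1
    return sum((c / n) * math.log2(c * cx[xt] / (cxy[(xt, yt)] * cxx[(xn, xt)]))
               for (xn, xt, yt), c in c3.items())

def infer_edges(series, threshold=0.05):
    """Keep directed pairs whose transfer entropy clears the threshold:
    a model-free reconstruction of the network structure."""
    return {(i, j) for i in series for j in series
            if i != j and te_binary(series[i], series[j]) > threshold}

# simulate a 3-node supply chain 0 -> 1 -> 2 with 10% transmission noise
rng = random.Random(7)
s = {0: [rng.randint(0, 1) for _ in range(3000)]}
s[1] = [0] + [b if rng.random() < 0.9 else 1 - b for b in s[0][:-1]]
s[2] = [0] + [b if rng.random() < 0.9 else 1 - b for b in s[1][:-1]]
edges = infer_edges(s)
```

With a one-step history, only the direct links carry significant transfer entropy; the two-step path from node 0 to node 2 does not register, which is the sense in which the structure, not just the correlations, is recovered.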
Uncertainty Quantification of Composite Laminate Damage with the Generalized Information Theory
Energy Technology Data Exchange (ETDEWEB)
J. Lucero; F. Hemez; T. Ross; K.Kline; J.Hundhausen; T. Tippetts
2006-05-01
This work presents a survey of five theories to assess the uncertainty of projectile impact induced damage on multi-layered carbon-epoxy composite plates. Because the types of uncertainty dealt with in this application are multiple (variability, ambiguity, and conflict) and because the data sets collected are sparse, characterizing the amount of delamination damage with probability theory alone is possible but incomplete. This motivates the exploration of methods contained within a broad Generalized Information Theory (GIT) that rely on less restrictive assumptions than probability theory. Probability, fuzzy sets, possibility, and imprecise probability (probability boxes (p-boxes) and Dempster-Shafer) are used to assess the uncertainty in composite plate damage. Furthermore, this work highlights the usefulness of each theory. The purpose of the study is not to compare directly the different GIT methods but to show that they can be deployed on a practical application and to compare the assumptions upon which these theories are based. The data sets consist of experimental measurements and finite element predictions of the amount of delamination and fiber splitting damage as multilayered composite plates are impacted by a projectile at various velocities. The physical experiments consist of using a gas gun to impact suspended plates with a projectile accelerated to prescribed velocities, then, taking ultrasound images of the resulting delamination. The nonlinear, multiple length-scale numerical simulations couple local crack propagation implemented through cohesive zone modeling to global stress-displacement finite element analysis. The assessment of damage uncertainty is performed in three steps by, first, considering the test data only; then, considering the simulation data only; finally, performing an assessment of total uncertainty where test and simulation data sets are combined. This study leads to practical recommendations for reducing the uncertainty and
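Of the GIT methods surveyed, the probability box is the easiest to sketch: from interval-valued measurements carrying equal Dempster-Shafer mass, bound the cumulative distribution from below and above. The delamination intervals here are invented, not the report's data.

```python
def pbox(intervals):
    """Lower/upper CDF bounds (a probability box) from interval-valued
    measurements, each carrying equal Dempster-Shafer mass."""
    n = len(intervals)
    def cdf_lower(x):
        # fraction of intervals that lie entirely at or below x
        return sum(1 for lo, hi in intervals if hi <= x) / n
    def cdf_upper(x):
        # fraction of intervals that could lie at or below x
        return sum(1 for lo, hi in intervals if lo <= x) / n
    return cdf_lower, cdf_upper

# invented delamination areas (cm^2), each known only to within an interval
data = [(1.0, 1.4), (1.2, 1.5), (2.0, 2.6), (2.2, 2.4), (3.0, 3.3)]
cdf_lo, cdf_hi = pbox(data)
```

The gap between the two bounds expresses the ambiguity in the data that a single probability distribution would have to paper over; a sharper instrument narrows the intervals and closes the box.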
Local versus nonlocal information in quantum-information theory: Formalism and phenomena
International Nuclear Information System (INIS)
Horodecki, Michal; Horodecki, Ryszard; Synak-Radtke, Barbara; Horodecki, Pawel; Oppenheim, Jonathan; Sen, Aditi; Sen, Ujjwal
2005-01-01
In spite of many results in quantum information theory, the complex nature of compound systems is far from clear. In general the information is a mixture of local and nonlocal ('quantum') information. It is important from both pragmatic and theoretical points of view to know the relationships between the two components. To make this point more clear, we develop and investigate the quantum-information processing paradigm in which parties sharing a multipartite state distill local information. The amount of information which is lost because the parties must use a classical communication channel is the deficit. This scheme can be viewed as complementary to the notion of distilling entanglement. After reviewing the paradigm in detail, we show that the upper bound for the deficit is given by the relative entropy distance to so-called pseudoclassically correlated states; the lower bound is the relative entropy of entanglement. This implies, in particular, that any entangled state is informationally nonlocal - i.e., has nonzero deficit. We also apply the paradigm to defining the thermodynamical cost of erasing entanglement. We show the cost is bounded from below by relative entropy of entanglement. We demonstrate the existence of several other nonlocal phenomena which can be found using the paradigm of local information. For example, we prove the existence of a form of nonlocality without entanglement and with distinguishability. We analyze the deficit for several classes of multipartite pure states and obtain that in contrast to the GHZ state, the Aharonov state is extremely nonlocal. We also show that there do not exist states for which the deficit is strictly equal to the whole informational content (bound local information). We discuss the relation of the paradigm with measures of classical correlations introduced earlier. It is also proved that in the one-way scenario, the deficit is additive for Bell diagonal states. We then discuss complementary features of
Lin, Fen-Fang; Wang, Ke; Yang, Ning; Yan, Shi-Guang; Zheng, Xin-Yu
2012-02-01
In this paper, the main factors affecting soil quality, such as soil type, land use pattern, lithology type, topography, road, and industry type, were used to obtain the spatial distribution characteristics of regional soil quality precisely: mutual information theory was adopted to select the main environmental factors, and the decision tree algorithm See5.0 was applied to predict the grade of regional soil quality. The main factors affecting regional soil quality were soil type, land use, lithology type, distance to town, distance to water area, altitude, distance to road, and distance to industrial land. The prediction accuracy of the decision tree model with the variables selected by mutual information was obviously higher than that of the model with all variables, and, for the former model, whether of decision tree or of decision rule, the prediction accuracy was higher than 80%. Based on continuous and categorical data, the method of mutual information theory integrated with the decision tree could not only reduce the number of input parameters for the decision tree algorithm, but also predict and assess regional soil quality effectively.
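The screening stage of this two-stage method can be sketched in miniature: compute mutual information between each categorical site attribute and the soil-quality grade, and keep only the top-ranked attributes as decision-tree inputs. The attributes and grades below are invented, and See5.0 itself is a commercial tool not reproduced here.

```python
import math
from collections import Counter

def mi_discrete(xs, ys):
    """Mutual information (bits) between two categorical variables."""
    n = len(xs)
    cxy, cx, cy = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum((c / n) * math.log2(c * n / (cx[x] * cy[y]))
               for (x, y), c in cxy.items())

def select_features(table, labels, k):
    """Rank candidate factors by MI with the quality grade and keep top k."""
    scores = {name: mi_discrete(col, labels) for name, col in table.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]

# invented site attributes and soil-quality grades
grade = ["high", "high", "low", "low", "high", "low", "low", "high"]
factors = {
    "land_use":  ["forest", "forest", "urban", "urban",
                  "forest", "urban", "urban", "forest"],
    "lithology": ["A", "B", "A", "B", "A", "B", "A", "B"],
}
best = select_features(factors, grade, k=1)   # feeds the decision-tree stage
```

Here `land_use` determines the grade exactly (1 bit of MI) while `lithology` is independent of it (0 bits), so only the informative factor survives, which is the mechanism behind the reduced input set and improved accuracy the abstract reports.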
Directory of Energy Information Administration Models 1994
International Nuclear Information System (INIS)
1994-07-01
This directory revises and updates the 1993 directory and includes 15 models of the National Energy Modeling System (NEMS). Three other new models in use by the Energy Information Administration (EIA) have also been included: the Motor Gasoline Market Model (MGMM), Distillate Market Model (DMM), and the Propane Market Model (PPMM). This directory contains descriptions about each model, including title, acronym, purpose, followed by more detailed information on characteristics, uses and requirements. Sources for additional information are identified. Included in this directory are 37 EIA models active as of February 1, 1994
Directory of Open Access Journals (Sweden)
Carol A. Gordon
2009-06-01
Objective – Part I of this paper aims to create a framework for an emerging theory of evidence based information literacy instruction. In order to ground this framework in existing theory, a holistic perspective views inquiry as a learning process that synthesizes information searching and knowledge building. An interdisciplinary approach is taken to relate user-centric information behavior theory and constructivist learning theory that supports this synthesis. The substantive theories that emerge serve as a springboard for emerging theory. A second objective of this paper is to define evidence based information literacy instruction by assessing the suitability of performance based assessment and action research as tools of evidence based practice. Methods – An historical review of research grounded in user-centered information behavior theory and constructivist learning theory establishes a body of existing substantive theory that supports emerging theory for evidence based information literacy instruction within an information-to-knowledge approach. A focused review of the literature presents supporting research for an evidence based pedagogy that is performance assessment based, i.e., information users are immersed in real-world tasks that include formative assessments. An analysis of the meaning of action research in terms of its purpose and methodology establishes its suitability for structuring an evidence based pedagogy. Supporting research tests a training model for school librarians and educators which integrates performance based assessment, as well as action research. Results – Findings of an historical analysis of information behavior theory and constructivist teaching practices, and a literature review that explores teaching models for evidence based information literacy instruction, point to two elements of evidence based information literacy instruction: the micro level of information searching behavior and the macro level of
Big bang models in string theory
Energy Technology Data Exchange (ETDEWEB)
Craps, Ben [Theoretische Natuurkunde, Vrije Universiteit Brussel and The International Solvay Institutes Pleinlaan 2, B-1050 Brussels (Belgium)
2006-11-07
These proceedings are based on lectures delivered at the 'RTN Winter School on Strings, Supergravity and Gauge Theories', CERN, 16-20 January 2006. The school was mainly aimed at PhD students and young postdocs. The lectures start with a brief introduction to spacetime singularities and the string theory resolution of certain static singularities. Then they discuss attempts to resolve cosmological singularities in string theory, mainly focusing on two specific examples: the Milne orbifold and the matrix big bang.
Directory of Open Access Journals (Sweden)
Claes Dahlqvist
2016-12-01
Librarian-teacher cooperation is essential for the integration of information literacy into course syllabi. Therefore, a common theoretical and methodological platform is needed. As librarians at Kristianstad University we have had the opportunity to develop such a platform when teaching information literacy in a basic course for teachers in higher education pedagogy. Information literacy is taught in context with academic writing, distance learning and teaching, and development of course syllabi. Constructive Alignment in Theory: We used constructive alignment in designing our part of the course. John Biggs' ideas tell us that assessment tasks (ATs) should be aligned to what is intended to be learned. Intended learning outcomes (ILOs) specify teaching/learning activities (TLAs) based on the content of learning. TLAs should be designed in ways that enable students to construct knowledge from their own experience. The ILOs for the course are to have arguments for the role of information literacy in higher education and ideas of implementing them in TLAs. The content of learning is for example the concept of information literacy, theoretical perspectives and constructive alignment for integration in course syllabi. TLAs are written pre-lecture reflections on the concept of information literacy, used as a starting point for the three-hour seminar. Learning reflections are written afterwards. The AT is to revise a syllabus (preferably using constructive alignment) for a course the teacher is responsible for, where information literacy must be integrated with the other parts and topics of the course. Constructive Alignment in Practice: Using constructive alignment has taught us that this model serves well as the foundation of the theoretical and methodological platform for librarian-teacher cooperation when integrating information literacy in course syllabi. It contains all important aspects of the integration of information literacy in course
Trajectory Shape Analysis and Anomaly Detection Utilizing Information Theory Tools
Directory of Open Access Journals (Sweden)
Yuejun Guo
2017-06-01
In this paper, we propose to improve trajectory shape analysis by explicitly considering the speed attribute of trajectory data, and to successfully achieve anomaly detection. The shape of an object motion trajectory is modeled using Kernel Density Estimation (KDE), making use of both the angle attribute of the trajectory and the speed of the moving object. An unsupervised clustering algorithm, based on the Information Bottleneck (IB) method, is employed for trajectory learning to obtain an adaptive number of trajectory clusters through maximizing the Mutual Information (MI) between the clustering result and a feature set of the trajectory data. Furthermore, we propose to effectively enhance the performance of IB by taking into account the clustering quality in each iteration of the clustering procedure. The trajectories are determined to be either abnormal (infrequently observed) or normal by a measure based on Shannon entropy. Extensive tests on real-world and synthetic data show that the proposed technique behaves very well and outperforms the state-of-the-art methods.
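The entropy-based abnormality measure can be sketched with a 1-D Gaussian KDE over motion-direction angles: a trajectory whose mean surprisal, -log p under the density of normal motion, is large gets flagged as abnormal. The angles, bandwidth, and threshold are invented; the paper's full method also uses speed and IB clustering.

```python
import math

def kde_logpdf(a, data, bandwidth=0.25):
    """Log-density of a 1-D Gaussian kernel density estimate."""
    norm = 1.0 / (len(data) * bandwidth * math.sqrt(2.0 * math.pi))
    dens = norm * sum(math.exp(-0.5 * ((a - d) / bandwidth) ** 2)
                      for d in data)
    return math.log(dens)

def is_abnormal(angles, normal_data, threshold_nats=4.0):
    """Flag a trajectory whose mean surprisal -log p under the density of
    normal motion directions exceeds a fixed threshold."""
    surprisal = sum(-kde_logpdf(a, normal_data) for a in angles)
    return surprisal / len(angles) > threshold_nats

# invented training set: normal motion directions cluster near 0 rad
normal_angles = [0.1 * ((i % 11) - 5) for i in range(110)]
ok = is_abnormal([0.05, -0.1, 0.2], normal_angles)    # near the cluster
odd = is_abnormal([3.0, 3.1, 2.9], normal_angles)     # opposite heading
```

"Infrequently observed" translates directly into "high surprisal under the learned density", which is what the Shannon-entropy-based measure formalizes.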
The Standard Model is Natural as Magnetic Gauge Theory
DEFF Research Database (Denmark)
Sannino, Francesco
2011-01-01
We suggest that the Standard Model can be viewed as the magnetic dual of a gauge theory featuring only fermionic matter content. We show this by first introducing a Pati-Salam like extension of the Standard Model and then relating it to a possible dual electric theory featuring only fermionic matter. The absence of scalars in the electric theory indicates that the associated magnetic theory is free from quadratic divergences. Our novel solution to the Standard Model hierarchy problem leads also to a new insight on the mystery of the observed number of fundamental fermion generations...
Pre-Game-Theory Based Information Technology (GAMBIT) Study
National Research Council Canada - National Science Library
Polk, Charles
2003-01-01
.... The generic GAMBIT scenario has been characterized as Dynamic Hierarchical Gaming (DHG). Game theory is not yet ready to fully support analysis of DHG, though existing partial analysis suggests that a full treatment is practical in the midterm...
Signal classification using global dynamical models, Part I: Theory
International Nuclear Information System (INIS)
Kadtke, J.; Kremliovsky, M.
1996-01-01
Detection and classification of signals is one of the principal areas of signal processing, and the utilization of nonlinear information has long been considered as a way of improving performance beyond standard linear (e.g. spectral) techniques. Here, we develop a method for using global models of chaotic dynamical systems theory to define a signal classification processing chain, which is sensitive to nonlinear correlations in the data. We use it to demonstrate classification in high noise regimes (negative SNR), and argue that classification probabilities can be directly computed from ensemble statistics in the model coefficient space. We also develop a modification for non-stationary signals (i.e. transients) using non-autonomous ODEs. In Part II of this paper, we demonstrate the analysis on actual open ocean acoustic data from marine biologics.
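The core idea, classification in model-coefficient space, can be illustrated with a linear stand-in: fit a global AR(2) model by least squares and assign the signal to the nearest class template in coefficient space. The paper uses nonlinear global models of chaotic dynamics; the AR(2) fit and the sinusoidal classes below are simplifications invented for illustration.

```python
import math

def ar2_coeffs(series):
    """Least-squares fit of the global linear model
    x_t = a*x_{t-1} + b*x_{t-2}, solved via the 2x2 normal equations."""
    y, u, v = series[2:], series[1:-1], series[:-2]
    suu = sum(i * i for i in u)
    svv = sum(i * i for i in v)
    suv = sum(i * j for i, j in zip(u, v))
    suy = sum(i * j for i, j in zip(u, y))
    svy = sum(i * j for i, j in zip(v, y))
    det = suu * svv - suv * suv
    return (suy * svv - svy * suv) / det, (svy * suu - suy * suv) / det

def classify(series, templates):
    """Assign the signal to the nearest class template in coefficient space."""
    a, b = ar2_coeffs(series)
    return min(templates, key=lambda k: (a - templates[k][0]) ** 2
                                        + (b - templates[k][1]) ** 2)

# two invented classes: a sinusoid of frequency w obeys
# x_t = 2*cos(w)*x_{t-1} - x_{t-2}, so each class is a point (a, b)
templates = {"slow": (2.0 * math.cos(0.3), -1.0),
             "fast": (2.0 * math.cos(1.0), -1.0)}
label = classify([math.sin(1.0 * i + 0.2) for i in range(200)], templates)
```

Replacing the squared distance with an ensemble-derived covariance in coefficient space turns the nearest-template rule into the classification probabilities the abstract mentions.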
Chiral gauged Wess-Zumino-Witten theories and coset models in conformal field theory
International Nuclear Information System (INIS)
Chung, S.; Tye, S.H.
1993-01-01
The Wess-Zumino-Witten (WZW) theory has a global symmetry denoted by G_L ⊗ G_R. In the standard gauged WZW theory, vector gauge fields (i.e., with vector gauge couplings) are in the adjoint representation of the subgroup H ⊂ G. In this paper, we show that, in the conformal limit in two dimensions, there is a gauged WZW theory where the gauge fields are chiral and belong to the subgroups H_L and H_R, where H_L and H_R can be different groups. In the special case where H_L = H_R, the theory is equivalent to the vector gauged WZW theory. For general groups H_L and H_R, an examination of the correlation functions (or more precisely, conformal blocks) shows that the chiral gauged WZW theory is equivalent to (G/H_L)_L ⊗ (G/H_R)_R coset models in conformal field theory
Information modelling and knowledge bases XXV
Tokuda, T; Jaakkola, H; Yoshida, N
2014-01-01
Because of our ever-increasing use of and reliance on technology and information systems, information modelling and knowledge bases continue to be important topics in those academic communities concerned with data handling and computer science. As the information itself becomes more complex, so do the levels of abstraction and the databases themselves. This book is part of the series Information Modelling and Knowledge Bases, which concentrates on a variety of themes in the important domains of conceptual modelling, design and specification of information systems, and multimedia information modelling...
Electroweak theory and the Standard Model
CERN. Geneva; Giudice, Gian Francesco
2004-01-01
The theory of the ElectroWeak (EW) Interactions splits naturally into four sectors, at rather different levels of development and testing. Accordingly, the 5 lectures are organized as follows, with an eye to the future: Lecture 1: The basic structure of the theory; Lecture 2: The gauge sector; Lecture 3: The flavor sector; Lecture 4: The neutrino sector; Lecture 5: The EW symmetry breaking sector.
Statistical Learning Theory: Models, Concepts, and Results
von Luxburg, Ulrike; Schoelkopf, Bernhard
2008-01-01
Statistical learning theory provides the theoretical basis for many of today's machine learning algorithms. In this article we attempt to give a gentle, non-technical overview of the key ideas and insights of statistical learning theory. We target a broad audience, not necessarily machine learning researchers. This paper can serve as a starting point for people who want an overview of the field before diving into technical details.
Information fusion-based approach for studying influence on Twitter using belief theory.
Azaza, Lobna; Kirgizov, Sergey; Savonnet, Marinette; Leclercq, Éric; Gastineau, Nicolas; Faiz, Rim
2016-01-01
Influence on Twitter has recently become a hot research topic, since this micro-blogging service is widely used to share and disseminate information. Some users are more able than others to influence and persuade peers; studying the most influential users thus makes it possible to reach a large-scale information diffusion area, which is very useful in marketing or political campaigns. In this study, we propose a new approach for multi-level influence assessment on multi-relational networks such as Twitter. We define a social graph to model the relationships between users as a multiplex graph where users are represented by nodes and links model the different relations between them (e.g., retweets, mentions, and replies). We explore what the relations between nodes in this graph can reveal about the degree of influence, and propose a generic computational model to assess the influence degree of a given node, based on the conjunctive combination rule from belief functions theory to combine the different types of relations. We apply the proposed method to a large amount of data gathered from Twitter during the 2014 European Elections and deduce the top influential candidates. The results show that our model is flexible enough to consider multiple combinations of interactions according to social scientists' needs or requirements, and that the numerical results of the belief theory are accurate. We also evaluate the approach on the CLEF RepLab 2014 data set and show that it leads to quite interesting results.
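The combination step can be illustrated with a short Python sketch. The relation names and mass values below are hypothetical, and the normalized (Dempster) form of the conjunctive rule from belief functions theory is used; the paper's actual model combines more relation types.

```python
def conjunctive_combine(m1, m2):
    """Normalized conjunctive (Dempster) combination of two mass functions.
    Masses are dicts keyed by frozenset hypotheses over a common frame."""
    combined, conflict = {}, 0.0
    for a, va in m1.items():
        for b, vb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + va * vb
            else:
                conflict += va * vb  # mass assigned to the empty set
    norm = 1.0 - conflict
    return {h: v / norm for h, v in combined.items()}

# Frame of discernment: is the user influential or not?
I = frozenset({'influential'})
N = frozenset({'not_influential'})
T = I | N  # total ignorance

# Hypothetical evidence from two relation types
from_retweets = {I: 0.6, T: 0.4}
from_mentions = {I: 0.5, N: 0.2, T: 0.3}
fused = conjunctive_combine(from_retweets, from_mentions)
```

Evidence from additional relations (replies, etc.) would be fused by applying the same rule repeatedly, since the combination is associative.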
Glass Durability Modeling, Activated Complex Theory (ACT)
International Nuclear Information System (INIS)
CAROL, JANTZEN
2005-01-01
atomic ratios is shown to represent the structural effects of the glass on the dissolution and the formation of activated complexes in the glass leached layer. This provides two different methods by which a linear glass durability model can be formulated: one based on the quasi-crystalline mineral species in a glass, and one based on cation ratios in the glass; both are related to the activated complexes on the surface by the law of mass action. The former would allow a new Thermodynamic Hydration Energy Model to be developed, based on the hydration of the quasi-crystalline mineral species, if all the pertinent thermodynamic data were available. Since the pertinent thermodynamic data are not available, the quasi-crystalline mineral species and the activated complexes can be related to cation ratios in the glass by the law of mass action. The cation ratio model can thus be used by waste form producers to formulate durable glasses based on fundamental structural and activated complex theories. Moreover, a glass durability model based on atomic ratios simplifies HLW glass process control, in that the measured ratios of only a few waste components and glass formers can be used to predict complex HLW glass performance with a high degree of accuracy, e.g. R² ≈ 0.97
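A linear durability model of the shape described above can be sketched computationally. The ratios and response values below are invented for illustration and are not Jantzen's fitted coefficients; the point is only the form: an ordinary least-squares fit of a durability response against a cation ratio, with R² as the accuracy measure.

```python
def linear_fit(x, y):
    """Ordinary least squares y ≈ a + b*x, returning (a, b, R²)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot

# Hypothetical data: cation ratio vs. a (log) dissolution response
ratios = [0.20, 0.35, 0.50, 0.65, 0.80]
response = [-1.9, -1.5, -1.2, -0.8, -0.4]
a, b, r2 = linear_fit(ratios, response)
```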
Solid modeling and applications rapid prototyping, CAD and CAE theory
Um, Dugan
2016-01-01
The lessons in this fundamental text equip students with the theory of Computer Assisted Design (CAD), Computer Assisted Engineering (CAE), and the essentials of Rapid Prototyping, as well as the practical skills needed to apply this understanding in real-world design and manufacturing settings. The book covers three main areas: CAD, CAE, and Rapid Prototyping, each enriched with numerous examples and exercises. In the CAD section, Professor Um outlines the basic concepts of geometric modeling, Hermite and Bezier spline curve theory, and 3-dimensional surface theory, as well as rendering theory. The CAE section explores mesh generation theory, matrix notation for FEM, the stiffness method, and truss equations. And in Rapid Prototyping, the author illustrates stereolithography theory and introduces popular modern RP technologies. Solid Modeling and Applications: Rapid Prototyping, CAD and CAE Theory is ideal for university students in various engineering disciplines as well as design engineers involved in product...
The logical foundations of scientific theories languages, structures, and models
Krause, Decio
2016-01-01
This book addresses the logical aspects of the foundations of scientific theories. Even though the relevance of formal methods in the study of scientific theories is now widely recognized and regaining prominence, the issues covered here are still not generally discussed in philosophy of science. The authors focus mainly on the role played by the underlying formal apparatuses employed in the construction of the models of scientific theories, relating the discussion to the so-called semantic approach to scientific theories. The book describes the role played by this metamathematical framework in three main aspects: considerations of formal languages employed to axiomatize scientific theories, the role of the axiomatic method itself, and the way set-theoretical structures, which play the role of the models of theories, are developed. The authors also discuss the differences and philosophical relevance of the two basic ways of axiomatizing a scientific theory, namely Patrick Suppes’ set theoretical predicate...
Quantum entanglement in non-local games, graph parameters and zero-error information theory
Scarpa, G.
2013-01-01
We study quantum entanglement and some of its applications in graph theory and zero-error information theory. In Chapter 1 we introduce entanglement and other fundamental concepts of quantum theory. In Chapter 2 we address the question of how much quantum correlations generated by entanglement can
Finding Commonalities: Social Information Processing and Domain Theory in the Study of Aggression
Nucci, Larry
2004-01-01
The Arsenio and Lemerise (this issue) proposal integrating social information processing (SIP) and domain theory to study children's aggression is evaluated from a domain theory perspective. Basic tenets of domain theory rendering it compatible with SIP are discussed as well as points of divergence. Focus is directed to the proposition that…
Supersymmetry and String Theory: Beyond the Standard Model
International Nuclear Information System (INIS)
Rocek, Martin
2007-01-01
When I was asked to review Michael Dine's new book, 'Supersymmetry and String Theory', I was pleased to have a chance to read a book by such an established authority on how string theory might become testable. The book is most useful as a list of current topics of interest in modern theoretical physics. It gives a succinct summary of a huge variety of subjects, including the standard model, symmetry, Yang-Mills theory, quantization of gauge theories, the phenomenology of the standard model, the renormalization group, lattice gauge theory, effective field theories, anomalies, instantons, solitons, monopoles, dualities, technicolor, supersymmetry, the minimal supersymmetric standard model, dynamical supersymmetry breaking, extended supersymmetry, Seiberg-Witten theory, general relativity, cosmology, inflation, bosonic string theory, the superstring, the heterotic string, string compactifications, the quintic, string dualities, large extra dimensions, and, in the appendices, Goldstone's theorem, path integrals, and exact beta-functions in supersymmetric gauge theories. Its breadth is both its strength and its weakness: it is not (and could not possibly be) either a definitive reference for experts, where the details of thorny technical issues are carefully explored, or a textbook for graduate students, with detailed pedagogical expositions. As such, it complements rather than replaces the much narrower and more focussed String Theory I and II volumes by Polchinski, with their deep insights, as well as the two older volumes by Green, Schwarz, and Witten, which develop string theory pedagogically. (book review)
COMPLEMENTARITY OF HISTORIC BUILDING INFORMATION MODELLING AND GEOGRAPHIC INFORMATION SYSTEMS
Directory of Open Access Journals (Sweden)
X. Yang
2016-06-01
In this paper, we discuss the potential of integrating semantically rich models from both Building Information Modelling (BIM) and Geographical Information Systems (GIS) to build detailed 3D historic models. BIM contributes to the creation of a digital representation having all physical and functional building characteristics in several dimensions, e.g. XYZ (3D), time, and non-architectural information that is necessary for the construction and management of buildings. GIS has potential in handling and managing spatial data, especially in exploring spatial relationships, and is widely used in urban modelling. However, when considering heritage modelling, the specificity of irregular historical components makes it problematic to create an enriched model from the complex architectural elements obtained from point clouds. Therefore, some open issues limiting historic building 3D modelling are discussed in this paper: how to deal with the complex elements composing historic buildings in BIM and GIS environments, how to build the enriched historic model, and why construct different levels of detail? By solving these problems, conceptualization, documentation and analysis of enriched Historic Building Information Modelling are developed and compared to traditional 3D models aimed primarily at visualization.
Introduction to gauge theories and the Standard Model
de Wit, Bernard
1995-01-01
The conceptual basis of gauge theories is introduced to enable the construction of generic models. Spontaneous symmetry breaking is discussed and its relevance for the renormalization of theories with massive vector fields is explained. Subsequently the standard model is described. When time permits, more practical questions that arise in the evaluation of quantum corrections will be addressed.
A 'theory of everything'? [Extending the Standard Model
International Nuclear Information System (INIS)
Ross, G.G.
1993-01-01
The Standard Model provides us with an amazingly successful theory of the strong, weak and electromagnetic interactions. Despite this, many physicists believe it represents only a step towards understanding the ultimate ''theory of everything''. In this article we describe why the Standard Model is thought to be incomplete and some of the suggestions for its extension. (Author)
Neutron Star Models in Alternative Theories of Gravity
Manolidis, Dimitrios
We study the structure of neutron stars in a broad class of alternative theories of gravity. In particular, we focus on Scalar-Tensor theories and f(R) theories of gravity. We construct static and slowly rotating numerical star models for a set of equations of state, including a polytropic model and more realistic equations of state motivated by nuclear physics. Observable quantities such as masses, radii, etc are calculated for a set of parameters of the theories. Specifically for Scalar-Tensor theories, we also calculate the sensitivities of the mass and moment of inertia of the models to variations in the asymptotic value of the scalar field at infinity. These quantities enter post-Newtonian equations of motion and gravitational waveforms of two body systems that are used for gravitational-wave parameter estimation, in order to test these theories against observations. The construction of numerical models of neutron stars in f(R) theories of gravity has been difficult in the past. Using a new formalism by Jaime, Patino and Salgado we were able to construct models with high interior pressure, namely p_c > ρ_c/3, both for constant density models and models with a polytropic equation of state. Thus, we have shown that earlier objections to f(R) theories on the basis of the inability to construct viable neutron star models are unfounded.
Generalized algebra-valued models of set theory
Löwe, B.; Tarafder, S.
2015-01-01
We generalize the construction of lattice-valued models of set theory due to Takeuti, Titani, Kozawa and Ozawa to a wider class of algebras and show that this yields a model of a paraconsistent logic that validates all axioms of the negation-free fragment of Zermelo-Fraenkel set theory.
A QCD Model Using Generalized Yang-Mills Theory
International Nuclear Information System (INIS)
Wang Dianfu; Song Heshan; Kou Lina
2007-01-01
Generalized Yang-Mills theory has a covariant derivative, which contains both vector and scalar gauge bosons. Based on this theory, we construct a strong interaction model by using the group U(4). By using this U(4) generalized Yang-Mills model, we also obtain a gauge potential solution, which can be used to explain the asymptotic behavior and color confinement.
A review of organizational buyer behaviour models and theories ...
African Journals Online (AJOL)
Over the years, models have been developed, and theories propounded, to explain the behavior of industrial buyers on the one hand and the nature of the dyadic relationship between organizational buyers and sellers on the other hand. This paper is an attempt at a review of the major models and theories in extant ...
Zhu, Wenmin; Jia, Yuanhua
2018-01-01
Based on risk management theory and the PDCA cycle model, the requirements of railway passenger transport safety production are analyzed, and the establishment of a security risk assessment team is proposed to manage risk by FTA (fault tree analysis) with the Delphi method, from both qualitative and quantitative aspects. A safety production committee is also established to carry out performance appraisal, further ensuring the correctness of risk management results, optimizing safety management business processes and improving risk management capabilities. The basic framework and risk information database of a risk management information system for railway passenger transport safety are designed with Ajax, Web Services and SQL technologies. The system realizes functions for risk management, performance appraisal and data management, and provides an efficient and convenient information management platform for railway passenger safety managers.
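The FTA step can be sketched as a small Python example. The gate structure, event names, and probabilities below are hypothetical (not from the paper), and basic events are assumed independent.

```python
def fault_tree_prob(node, basic):
    """Recursively evaluate the top-event probability of a fault tree.
    `node` is either a basic-event name (str) or a tuple
    ('AND' | 'OR', [children]). Basic events are assumed independent."""
    if isinstance(node, str):
        return basic[node]
    gate, children = node
    probs = [fault_tree_prob(child, basic) for child in children]
    if gate == 'AND':
        p = 1.0
        for q in probs:
            p *= q
        return p
    # OR gate: complement of "no child event occurs"
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical tree: a passenger-safety incident occurs if the signal
# fails, or if both the door and its sensor fail together.
top_event = ('OR', [('AND', ['door_fault', 'sensor_fault']), 'signal_fault'])
basic_probs = {'door_fault': 0.1, 'sensor_fault': 0.2, 'signal_fault': 0.05}
p_top = fault_tree_prob(top_event, basic_probs)
```

In practice the Delphi step would supply expert-elicited probabilities for the basic events before the tree is evaluated.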
Optimised Selection of Stroke Biomarker Based on Svm and Information Theory
Directory of Open Access Journals (Sweden)
Wang Xiang
2017-01-01
With the development of molecular biology and gene-engineering technology, gene diagnosis has become an emerging approach in the modern life sciences. Biological markers, recognized as a hot topic in the molecular and gene fields, have important value in early diagnosis, malignant tumor staging, treatment, and therapeutic efficacy evaluation. So far, researchers have not found an effective way to predict and distinguish different types of stroke. In this paper, we aim to optimize stroke biomarker selection and identify effective stroke detection indices based on SVM (support vector machine) and information theory: mutual information analysis and principal component analysis are used to complete the selection of biomarkers, and SVM is then used to verify our model. Using the patient data provided by Xuanwu Hospital, we explore the significant markers of stroke through data analysis. Our model predicts stroke well. We also discuss the effect of each biomarker on the incidence of stroke.
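The mutual-information ranking step can be sketched in Python. The marker names and discretized values below are invented for illustration, not the Xuanwu Hospital data, and the SVM verification stage is omitted.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits for two equal-length discrete sequences."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p_joint / (p_x * p_y), written to avoid forming tiny products
        mi += p_joint * math.log2(p_joint * n * n / (px[x] * py[y]))
    return mi

def rank_biomarkers(features, labels):
    """Rank candidate (discretized) biomarkers by MI with the outcome."""
    return sorted(features,
                  key=lambda k: mutual_information(features[k], labels),
                  reverse=True)

# Hypothetical discretized measurements for two candidate markers
labels = [0, 0, 1, 1]                      # stroke outcome
features = {'marker_a': [0, 0, 1, 1],      # perfectly informative
            'marker_b': [0, 1, 0, 1]}      # independent of outcome
ranking = rank_biomarkers(features, labels)
```

The top-ranked markers would then be passed to PCA and an SVM classifier, as the abstract describes.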
A Participatory Model for Multi-Document Health Information Summarisation
Directory of Open Access Journals (Sweden)
Dinithi Nallaperuma
2017-03-01
Increasing availability of and access to health information has driven a paradigm shift in healthcare provision, as it empowers patients and practitioners alike. Besides awareness, significant time savings and process efficiencies can be achieved through effective summarisation of healthcare information. Relevance and accuracy are key concerns when generating summaries for such documents. Despite advances in automated summarisation approaches, the role of participation has not been explored. In this paper, we propose a new model for multi-document health information summarisation that takes into account the role of participation. The updated IS user participation theory was extended to explicate these roles. The proposed model integrates both extractive and abstractive summarisation processes with continuous participatory inputs to each phase. The model was implemented as a client-server application and evaluated by both domain experts and health information consumers. Results from the evaluation phase indicate that the model is successful in generating relevant and accurate summaries for diverse audiences.
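One way to sketch the extractive phase with a participatory input, assuming simple word-frequency scoring (the paper's actual scoring and participation mechanisms are richer): here the hypothetical `boosts` dict stands in for terms that participants flagged as important.

```python
import re
from collections import Counter

def extractive_summary(text, boosts=None, k=1):
    """Pick the k highest-scoring sentences. Sentences are scored by the
    average corpus frequency of their words, plus participatory boosts
    for terms readers marked as important."""
    boosts = boosts or {}
    sentences = [s.strip() for s in re.split(r'(?<=[.!?])\s+', text) if s.strip()]
    words = re.findall(r'[a-z]+', text.lower())
    freq = Counter(words)

    def score(sentence):
        tokens = re.findall(r'[a-z]+', sentence.lower())
        return sum(freq[t] + boosts.get(t, 0) for t in tokens) / max(len(tokens), 1)

    return sorted(sentences, key=score, reverse=True)[:k]

text = ("Aspirin reduces pain. Aspirin thins blood. "
        "Rest helps recovery.")
# A participant flags 'blood' as important, steering the summary
summary = extractive_summary(text, boosts={'blood': 5}, k=1)
```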
A proposed general model of information behaviour.
Directory of Open Access Journals (Sweden)
2003-01-01
Presents a critical description of Wilson's (1996) global model of information behaviour and proposes major modifications on the basis of research into the information behaviour of managers conducted in Poland. The theoretical analysis and research results suggest that Wilson's model has certain imperfections, both in its conceptual content and in its graphical presentation. The model, for example, cannot be used to describe managers' information behaviour, since managers basically are not the end users of information services from outside the organization or of computerized services, and they acquire information mainly through various intermediaries. Therefore, the model cannot be considered a general model applicable to every category of information users. The proposed new model encompasses the main concepts of Wilson's model, such as: person-in-context, three categories of intervening variables (individual, social and environmental), activating mechanisms, the cyclic character of information behaviours, and the adoption of a multidisciplinary approach to explain them. However, the new model introduces several changes: 1. identification of 'context' with the intervening variables; 2. immersion of the chain of information behaviour in the 'context', to indicate that the context variables influence behaviour at all stages of the process (identification of needs, looking for information, processing and using it); 3. stress on the fact that the activating mechanisms can also occur at all stages of the information acquisition process; 4. introduction of two basic strategies of looking for information: personally and/or using various intermediaries.
The Information Warfare Life Cycle Model
Directory of Open Access Journals (Sweden)
Brett van Niekerk
2011-11-01
Information warfare (IW) is a dynamic and developing concept which comprises a number of disciplines. This paper aims to develop a life cycle model for information warfare that is applicable to all of the constituent disciplines. The model aims to be scalable and applicable to civilian and military incidents where information warfare tactics are employed. Existing information warfare models are discussed, and a new model is developed from the common aspects of these existing models. The proposed model is then applied to a variety of incidents to test its applicability and scalability. The proposed model is shown to be applicable to multiple disciplines of information warfare and to be scalable, thus meeting its objectives.
The Birth of Model Theory: Löwenheim's Theorem in the Frame of the Theory of Relatives
Badesa, Calixto
2008-01-01
Löwenheim's theorem reflects a critical point in the history of mathematical logic, for it marks the birth of model theory--that is, the part of logic that concerns the relationship between formal theories and their models. However, while the original proofs of other, comparably significant theorems are well understood, this is not the case with Löwenheim's theorem. For example, the very result that scholars attribute to Löwenheim today is not the one that Skolem--a logician raised in the algebraic tradition, like Löwenheim--appears to have attributed to him. In The Birth of Model Theory, Cali
Spinal Cord Injury Model System Information Network
The University of Alabama at Birmingham Spinal Cord Injury Model System (UAB-SCIMS) maintains this Information Network as a resource to promote knowledge in the ...
Alfonso, Leonardo; Chacon, Juan; Solomatine, Dimitri
2016-04-01
The EC-FP7 WeSenseIt project proposes the development of a Citizen Observatory of Water, aiming at enhancing environmental monitoring and forecasting with the help of citizens equipped with low-cost sensors and personal devices such as smartphones and smart umbrellas. In this regard, Citizen Observatories may complement the limited availability of data in terms of spatial and temporal density, which is of interest, among other areas, for improving hydraulic and hydrological models. At this point, the following question arises: how can citizens who are part of a citizen observatory be optimally guided so that the data they collect and send are useful for improving modelling and water management? This research proposes a new methodology to identify the optimal location and timing of potential observations coming from moving sensors of hydrological variables. The methodology is based on Information Theory, which has been widely used in hydrometric monitoring design [1-4]. In particular, it uses the concept of Joint Entropy as a measure of the amount of information contained in a set of random variables, which, in our case, correspond to the time series of hydrological variables captured at given locations in a catchment. The methodology presented is a step forward in the state of the art because it solves the multiobjective optimisation problem of simultaneously finding the minimum number of informative and non-redundant sensors needed at a given time, so that the best configuration of monitoring sites is found at every particular moment. To this end, the existing algorithms have been improved to make them efficient. The method is applied to cases in The Netherlands, UK and Italy and shows great potential to complement the existing in-situ monitoring networks. [1] Alfonso, L., A. Lobbrecht, and R. Price (2010a), Information theory-based approach for location of monitoring water level gauges in polders, Water Resour. Res., 46(3), W03528 [2] Alfonso, L., A
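The joint-entropy selection idea can be sketched in Python. The candidate series below are toy discretized observations, and the greedy loop is a simple stand-in for the paper's multiobjective optimiser: each step adds the sensor whose inclusion raises the joint entropy of the selected set the most, which rewards information and penalizes redundancy.

```python
import math
from collections import Counter

def joint_entropy(series):
    """H(X1,...,Xk) in bits from parallel discretized time series."""
    n = len(series[0])
    counts = Counter(zip(*series))          # joint symbol at each time step
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def greedy_select(candidates, k):
    """Greedily pick k sensor names maximizing the joint entropy
    of the selected set of discretized time series."""
    chosen = []
    while len(chosen) < k:
        best = max((name for name in candidates if name not in chosen),
                   key=lambda name: joint_entropy(
                       [candidates[c] for c in chosen + [name]]))
        chosen.append(best)
    return chosen

# Toy discretized water-level series at three candidate sites;
# site B duplicates site A, so it should never be picked alongside it.
candidates = {'A': [0, 0, 1, 1],
              'B': [0, 0, 1, 1],
              'C': [0, 1, 0, 1]}
best_pair = greedy_select(candidates, 2)
```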
Bergeron, Kim; Abdi, Samiya; DeCorby, Kara; Mensah, Gloria; Rempel, Benjamin; Manson, Heather
2017-11-28
There is limited research on capacity building interventions that include theoretical foundations. The purpose of this systematic review is to identify underlying theories, models and frameworks used to support capacity building interventions relevant to public health practice. The aim is to inform and improve capacity building practices and services offered by public health organizations. Four search strategies were used: 1) electronic database searching; 2) reference lists of included papers; 3) key informant consultation; and 4) grey literature searching. Inclusion and exclusion criteria are outlined, with included papers focusing on capacity building, learning plans, or professional development plans in combination with tools, resources, processes, procedures, steps, models, frameworks, or guidelines, described in a public health or healthcare setting, or in non-government, government, or community organizations as they relate to healthcare, and explicitly or implicitly mentioning a theory, model and/or framework that grounds the type of capacity building approach developed. Quality assessment was performed on all included articles. Data analysis included a process for synthesizing, analyzing and presenting descriptive summaries, categorizing theoretical foundations according to which theory, model and/or framework was used and whether the theory, model or framework was implied or explicitly identified. Nineteen articles were included in this review. A total of 28 theories, models and frameworks were identified. Of this number, two theories (Diffusion of Innovations and Transformational Learning), two models (Ecological and Interactive Systems Framework for Dissemination and Implementation) and one framework (Bloom's Taxonomy of Learning) were identified as the most frequently cited. This review identifies specific theories, models and frameworks to support capacity building interventions relevant to public health organizations. It provides public health practitioners
Non-linear σ-models and string theories
International Nuclear Information System (INIS)
Sen, A.
1986-10-01
The connection between σ-models and string theories is discussed, as well as how the σ-models can be used as tools to prove various results in string theories. Closed bosonic string theory in the light cone gauge is very briefly introduced. Then, closed bosonic string theory in the presence of massless background fields is discussed. The light cone gauge is used, and it is shown that in order to obtain a Lorentz invariant theory, the string theory in the presence of background fields must be described by a two-dimensional conformally invariant theory. The resulting constraints on the background fields are found to be the equations of motion of the string theory. The analysis is extended to the case of the heterotic string theory and the superstring theory in the presence of the massless background fields. It is then shown how to use these results to obtain nontrivial solutions to the string field equations. Another application of these results is shown, namely to prove that the effective cosmological constant after compactification vanishes as a consequence of the classical equations of motion of the string theory. 34 refs
Mathematics Education as a Proving-Ground for Information-Processing Theories.
Greer, Brian, Ed.; Verschaffel, Lieven, Ed.
1990-01-01
Five papers discuss the current and potential contributions of information-processing theory to our understanding of mathematical thinking as those contributions affect the practice of mathematics education. It is concluded that information-processing theories need to be supplemented in various ways to more adequately reflect the complexity of…
Toric Methods in F-Theory Model Building
Directory of Open Access Journals (Sweden)
Johanna Knapp
2011-01-01
We discuss recent constructions of global F-theory GUT models and explain how to make use of toric geometry to do calculations within this framework. After introducing the basic properties of global F-theory GUTs, we give a self-contained review of toric geometry and introduce all the tools that are necessary to construct and analyze global F-theory models. We explain how to systematically obtain a large class of compact Calabi-Yau fourfolds which can support F-theory GUTs by using the software package PALP.
Wallace, Rodrick
2018-06-01
Cognition in living entities, and in their social groupings or institutional artifacts, is necessarily as complicated as the environments in which they are embedded, which, for humans, include a particularly rich cultural milieu. The asymptotic limit theorems of information and control theories permit construction of a new class of empirical 'regression-like' statistical models for cognitive developmental processes, their dynamics, and modes of dysfunction. Such models may, as have their simpler analogs, prove useful in the study and remediation of cognitive failure at and across the scales and levels of organization that constitute and drive the phenomena of life. These new models particularly focus on the roles of sociocultural environment and stress, in a large sense, both as trigger for the failure of the regulation of bio-cognition and as 'riverbanks' determining the channels of pathology, with implications across life-course developmental trajectories. We examine the effects of an embedding cultural milieu and its socioeconomic implementations using the 'lenses' of metabolic optimization, control system theory, and an extension of symmetry-breaking appropriate to information systems. A central implication is that most, if not all, human developmental disorders are fundamentally culture-bound syndromes. This has deep implications for both individual treatment and public health policy.
An Evolutionary Game Theory Model of Spontaneous Brain Functioning.
Madeo, Dario; Talarico, Agostino; Pascual-Leone, Alvaro; Mocenni, Chiara; Santarnecchi, Emiliano
2017-11-22
Our brain is a complex system of interconnected regions spontaneously organized into distinct networks. The integration of information between and within these networks is a continuous process that can be observed even when the brain is at rest, i.e. not engaged in any particular task. Moreover, such spontaneous dynamics show predictive value over individual cognitive profile and constitute a potential marker in neurological and psychiatric conditions, making its understanding of fundamental importance in modern neuroscience. Here we present a theoretical and mathematical model based on an extension of evolutionary game theory on networks (EGN), able to capture brain's interregional dynamics by balancing emulative and non-emulative attitudes among brain regions. This results in the net behavior of nodes composing resting-state networks identified using functional magnetic resonance imaging (fMRI), determining their moment-to-moment level of activation and inhibition as expressed by positive and negative shifts in BOLD fMRI signal. By spontaneously generating low-frequency oscillatory behaviors, the EGN model is able to mimic functional connectivity dynamics, approximate fMRI time series on the basis of initial subset of available data, as well as simulate the impact of network lesions and provide evidence of compensation mechanisms across networks. Results suggest evolutionary game theory on networks as a new potential framework for the understanding of human brain network dynamics.
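The abstract above describes brain regions balancing emulative and non-emulative attitudes through evolutionary game theory on networks. As a hedged illustration of the general idea only (not the authors' actual EGN equations), here is a minimal networked replicator-dynamics sketch; the two-strategy coordination game and the three-node network are invented for the example.

```python
def egn_step(x, adj, a, b, dt=0.01):
    """One Euler step of replicator dynamics on a network (a simplified
    sketch of the EGN idea): x[v] is node v's probability of playing
    strategy B in a coordination game where coordinating on A pays `a`
    and coordinating on B pays `b`."""
    dx = [0.0] * len(x)
    for v, nbrs in enumerate(adj):
        for u in nbrs:
            # expected payoff advantage of strategy B against neighbor u
            gain = b * x[u] - a * (1 - x[u])
            dx[v] += x[v] * (1 - x[v]) * gain
    return [min(1.0, max(0.0, xv + dt * dxv)) for xv, dxv in zip(x, dx)]

adj = [[1, 2], [0, 2], [0, 1]]  # a fully connected 3-node network
x = [0.6, 0.7, 0.55]            # initial inclinations toward strategy B
for _ in range(2000):
    x = egn_step(x, adj, a=1.0, b=2.0)
print([round(v, 2) for v in x])  # nodes converge toward the payoff-dominant strategy
```

Oscillatory resting-state behavior in the actual model arises from richer payoff structures and self-regulation terms; this sketch only shows how node-level activation can emerge from local game-theoretic interactions.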
Actor-network Theory and cartography of controversies in Information Science
LOURENÇO, Ramon Fernandes; TOMAÉL, Maria Inês
2018-01-01
The present study aims to discuss the interactions between Actor-network Theory and the Cartography of Controversies method in Information Science research. A literature review was conducted of books, scholarly articles, and other sources addressing Actor-network Theory and the Cartography of Controversies. An understanding of the theoretical assumptions that guide Actor-network Theory allows examining aspects important to Information Science research, seeking to identif...
The urban informal economy in Ethiopia: theory and empirical ...
African Journals Online (AJOL)
Eastern Africa Social Science Research Review ... data to explore the roles and characteristics of the informal sector in urban centers of Ethiopia, ... informal sources, 4) the level of income per person varied sharply among the various sectors.
Quantum Link Models and Quantum Simulation of Gauge Theories
International Nuclear Information System (INIS)
Wiese, U.J.
2015-01-01
This lecture is about Quantum Link Models and Quantum Simulation of Gauge Theories. The lecture consists of four parts. The first part gives a brief history of computing, and pioneers of quantum computing and quantum simulations of quantum spin systems are introduced. The second part is about High-Temperature Superconductors versus QCD, Wilson's Lattice QCD and Abelian Quantum Link Models. The third part deals with Quantum Simulators for Abelian Lattice Gauge Theories and Non-Abelian Quantum Link Models. The last part discusses Quantum Simulators mimicking 'Nuclear' physics and the continuum limit of D-theory models. (nowak)
Topic Models in Information Retrieval
2007-08-01
Information Processing Systems, Cambridge, MA, MIT Press, 2004. Brown, P.F., Della Pietra, V.J., deSouza, P.V., Lai, J.C. and Mercer, R.L., Class-based...2003. http://www.wkap.nl/prod/b/1-4020-1216-0. Croft, W.B., Lucia , T.J., Cringean, J., and Willett, P., Retrieving Documents By Plausible Inference
Reference group theory with implications for information studies: a theoretical essay
Directory of Open Access Journals (Sweden)
E. Murell Dawson
2001-01-01
This article explores the role and implications of reference group theory in relation to the field of library and information science. Reference group theory is based upon the principle that people take the standards of significant others as a basis for making self-appraisals, comparisons, and choices regarding need and use of information. Research that applies concepts of reference group theory to various sectors of library and information studies can provide data useful in enhancing areas such as information-seeking research, special populations, and uses of information. Knowledge gained from such research can help information professionals better understand the role theory plays in examining the ways in which people manage their information and social worlds.
Teaching Qualitative Research: Using Theory to Inform Practice
Sallee, Margaret W.
2010-01-01
This article considers how theories of instructional scaffolding--which call for a skilled expert to teach a novice a new task by breaking it into smaller pieces--might be employed in graduate-level qualitative methods courses. The author discusses how she used instructional scaffolding in the design and delivery of a qualitative methods course…
Information and Uncertainty in the Theory of Monetary Policy
Wagner, Helmut
2007-01-01
Theory and practice of monetary policy have changed significantly over the past three decades. A very important part of today's monetary policy is management of the expectations of private market participants. Publishing and justifying the central bank's best forecast of inflation, output, and the instrument rate is argued to be the most effective way to manage those expectations.
Mc Mahon, Siobhan S; Sim, Aaron; Filippi, Sarah; Johnson, Robert; Liepe, Juliane; Smith, Dominic; Stumpf, Michael P H
2014-11-01
Sensing and responding to the environment are two essential functions that all biological organisms need to master for survival and successful reproduction. Developmental processes are marshalled by a diverse set of signalling and control systems, ranging from systems with simple chemical inputs and outputs to complex molecular and cellular networks with non-linear dynamics. Information theory provides a powerful and convenient framework in which such systems can be studied; but it also provides the means to reconstruct the structure and dynamics of molecular interaction networks underlying physiological and developmental processes. Here we supply a brief description of its basic concepts and introduce some useful tools for systems and developmental biologists. Along with a brief but thorough theoretical primer, we demonstrate the wide applicability and biological application-specific nuances by way of different illustrative vignettes. In particular, we focus on the characterisation of biological information processing efficiency, examining cell-fate decision making processes, gene regulatory network reconstruction, and efficient signal transduction experimental design. Copyright © 2014 Elsevier Ltd. All rights reserved.
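Mutual information is the workhorse quantity behind the kind of information-processing-efficiency analyses described above. A minimal sketch (not taken from the paper) computes I(X; Y) from a joint distribution, illustrated on a binary symmetric channel with 10% flip noise as a stand-in for a noisy signalling pathway.

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits from a joint probability table
    joint[x][y]: I = sum_xy p(x,y) log2( p(x,y) / (p(x) p(y)) )."""
    px = [sum(row) for row in joint]          # marginal over inputs
    py = [sum(col) for col in zip(*joint)]    # marginal over outputs
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# Binary symmetric channel with 10% noise and a uniform input:
# the classic result is I = 1 - H2(0.1) bits.
eps = 0.1
joint = [[0.5 * (1 - eps), 0.5 * eps],
         [0.5 * eps, 0.5 * (1 - eps)]]
print(round(mutual_information(joint), 4))
```

The same estimator, applied to sampled input-output pairs of a signalling system, gives the channel-capacity-style efficiency measures the abstract refers to.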
INFORMATION MODEL OF A GENERAL PRACTITIONER
Directory of Open Access Journals (Sweden)
S. M. Zlepko
2016-06-01
In the paper the authors develop an information model of a family doctor and show its innovation and functionality. The proposed model meets the requirements of the current job description and the criteria of the World Organization of Family Doctors.
Holden, Richard J.; Karsh, Ben-Tzion
2009-01-01
Primary objective: much research and practice related to the design and implementation of information technology in health care has been atheoretical. It is argued that using extant theory to develop testable models of health information technology (HIT) benefits both research and practice. Methods and procedures: several theories of motivation,…
Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but ...
Spin foam model for pure gauge theory coupled to quantum gravity
International Nuclear Information System (INIS)
Oriti, Daniele; Pfeiffer, Hendryk
2002-01-01
We propose a spin foam model for pure gauge fields coupled to Riemannian quantum gravity in four dimensions. The model is formulated for the triangulation of a four-manifold which is given merely combinatorially. The Riemannian Barrett-Crane model provides the gravity sector of our model and dynamically assigns geometric data to the given combinatorial triangulation. The gauge theory sector is a lattice gauge theory living on the same triangulation and obtains from the gravity sector the geometric information which is required to calculate the Yang-Mills action. The model is designed so that one obtains a continuum approximation of the gauge theory sector at an effective level, similarly to the continuum limit of lattice gauge theory, when the typical length scale of gravity is much smaller than the Yang-Mills scale
The Self-Perception Theory vs. a Dynamic Learning Model
Swank, Otto H.
2006-01-01
Several economists have directed our attention to a finding in the social psychological literature that extrinsic motivation may undermine intrinsic motivation. The self-perception (SP) theory developed by Bem (1972) explains this finding. The crux of this theory is that people remember their past decisions and the extrinsic rewards they received, but they do not recall their intrinsic motives. In this paper I show that the SP theory can be modeled as a variant of a conventional dynamic learn...
Computer Models and Automata Theory in Biology and Medicine
Baianu, I C
2004-01-01
The applications of computers to biological and biomedical problem solving goes back to the very beginnings of computer science, automata theory [1], and mathematical biology [2]. With the advent of more versatile and powerful computers, biological and biomedical applications of computers have proliferated so rapidly that it would be virtually impossible to compile a comprehensive review of all developments in this field. Limitations of computer simulations in biology have also come under close scrutiny, and claims have been made that biological systems have limited information processing power [3]. Such general conjectures do not, however, deter biologists and biomedical researchers from developing new computer applications in biology and medicine. Microprocessors are being widely employed in biological laboratories both for automatic data acquisition/processing and modeling; one particular area, which is of great biomedical interest, involves fast digital image processing and is already established for rout...
Restructuring Consciousness –the Psychedelic State in Light of Integrated Information Theory
Directory of Open Access Journals (Sweden)
Andrew Robert Gallimore
2015-06-01
The psychological state elicited by the classic psychedelic drugs, such as LSD and psilocybin, is one of the most fascinating and yet least understood states of consciousness. However, with the advent of modern functional neuroimaging techniques, the effect of these drugs on neural activity is now being revealed, although many of the varied phenomenological features of the psychedelic state remain challenging to explain. Integrated information theory (IIT) is one of the foremost contemporary theories of consciousness, providing a mathematical formalization of both the quantity and quality of conscious experience. This theory can be applied to all known states of consciousness, including the psychedelic state. Using the results of functional neuroimaging data on the psychedelic state, the effects of psychedelic drugs on both the level and structure of consciousness can be explained in terms of the conceptual framework of IIT. This new IIT-based model of the psychedelic state provides an explanation for many of its phenomenological features, including unconstrained cognition, alterations in the structure and meaning of concepts, and a sense of expanded awareness. The model also suggests that while cognitive flexibility, creativity, and imagination are enhanced during the psychedelic state, this occurs at the expense of cause-effect information, degrading the brain's ability to organize, categorize, and differentiate the constituents of conscious experience. Furthermore, the model generates specific predictions that can be tested using a combination of functional imaging techniques, as has been done in the study of levels of consciousness during anesthesia and following brain injury.
Stakeholder theory and reporting information: the case of the performance prism
Directory of Open Access Journals (Sweden)
Bartłomiej Nita
2016-07-01
The aim of the paper is to explain stakeholder theory in the context of performance measurement in integrated reporting. The main research methods used in the article include logical reasoning, critical analysis of the academic literature, and observation. The principal result of the discussion is the statement that stakeholder theory in the field of accounting is reflected in so-called integrated reporting. Moreover, among the large variety of performance measurement methods, such as the balanced scorecard, the performance prism can be considered the only method that fully takes into account the wide range of stakeholders. The analysis leads to the conclusion that development in accounting research takes into account the objectives of an organization in the context of so-called corporate social responsibility, as well as performance reporting oriented towards the communication of the company with its environment and its various stakeholder groups.
Theory and model use in social marketing health interventions.
Luca, Nadina Raluca; Suggs, L Suzanne
2013-01-01
The existing literature suggests that theories and models can serve as valuable frameworks for the design and evaluation of health interventions. However, evidence on the use of theories and models in social marketing interventions is sparse. The purpose of this systematic review is to identify to what extent papers about social marketing health interventions report using theory, which theories are most commonly used, and how theory was used. A systematic search was conducted for articles that reported social marketing interventions for the prevention or management of cancer, diabetes, heart disease, HIV, STDs, and tobacco use, and behaviors related to reproductive health, physical activity, nutrition, and smoking cessation. Articles were published in English, after 1990, reported an evaluation, and met the 6 social marketing benchmarks criteria (behavior change, consumer research, segmentation and targeting, exchange, competition and marketing mix). Twenty-four articles, describing 17 interventions, met the inclusion criteria. Of these 17 interventions, 8 reported using theory and 7 stated how it was used. The transtheoretical model/stages of change was used more often than other theories. Findings highlight an ongoing lack of use or underreporting of the use of theory in social marketing campaigns and reinforce the call to action for applying and reporting theory to guide and evaluate interventions.
Measurement Models for Reasoned Action Theory
Hennessy, Michael; Bleakley, Amy; Fishbein, Martin
2012-01-01
Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are ...
Compilation of information on melter modeling
International Nuclear Information System (INIS)
Eyler, L.L.
1996-03-01
The objective of the task described in this report is to compile information on modeling capabilities for the High-Temperature Melter and the Cold Crucible Melter and issue a modeling capabilities letter report summarizing existing modeling capabilities. The report is to include strategy recommendations for future modeling efforts to support the High Level Waste (HLW) melter development
Lin, Luan; McKerrow, Wilson H; Richards, Bryce; Phonsom, Chukiat; Lawrence, Charles E
2018-03-05
The nearest neighbor model and associated dynamic programming algorithms allow for the efficient estimation of the RNA secondary structure Boltzmann ensemble. However, because a given RNA secondary structure only contains a fraction of the possible helices that could form from a given sequence, the Boltzmann ensemble is multimodal. Several methods exist for clustering structures and finding those modes. However, less focus is given to exploring the underlying reason for this multimodality: the presence of conflicting basepairs. Information theory, or more specifically mutual information, provides a method to identify those basepairs that are key to the secondary structure. To this end we find the most informative basepairs and visualize their effect on the secondary structure. Knowing whether a most informative basepair is present tells us not only the status of that particular pair but also provides a large amount of information about which other pairs are or are not present. We find that a few basepairs account for a large amount of the structural uncertainty. The identification of these pairs indicates small changes to sequence or stability that will have a large effect on structure. We provide a novel algorithm that uses mutual information to identify the key basepairs that lead to a multimodal Boltzmann distribution. We then visualize the effect of these pairs on the overall Boltzmann ensemble.
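The core quantity here is the mutual information between the presence of two basepairs across an ensemble of structures. A hedged sketch of that idea (not the authors' algorithm, which works on the full Boltzmann ensemble) estimates it from a sample of structures, each represented as a set of (i, j) pairs; the two "conflicting helices" below are invented for illustration.

```python
import math
from collections import Counter

def pair_mutual_information(samples, i, j):
    """Mutual information (bits) between the presence of basepairs i and j
    across a sample of secondary structures, each given as a set of pairs.
    High values flag pairs whose status constrains the rest of the fold."""
    n = len(samples)
    counts = Counter((i in s, j in s) for s in samples)
    mi = 0.0
    for (bi, bj), c in counts.items():
        pxy = c / n
        px = sum(v for k, v in counts.items() if k[0] == bi) / n
        py = sum(v for k, v in counts.items() if k[1] == bj) / n
        mi += pxy * math.log2(pxy / (px * py))
    return mi

# Two mutually exclusive basepairs: each sampled structure contains
# pair A or pair B, never both -- a toy model of conflicting helices.
A, B = (1, 20), (5, 15)
samples = [{A}] * 50 + [{B}] * 50
print(pair_mutual_information(samples, A, B))  # 1.0 bit: knowing one pair
                                               # fully determines the other
```

In this toy case the two pairs carry one full bit of information about each other; independent pairs would score near zero, which is how a "most informative basepair" stands out.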
Models based on multichannel R-matrix theory for evaluating light element reactions
International Nuclear Information System (INIS)
Dodder, D.C.; Hale, G.M.; Nisley, R.A.; Witte, K.; Young, P.G.
1975-01-01
Multichannel R-matrix theory has been used as a basis for models for analysis and evaluation of light nuclear systems. These models have the characteristic that data predictions can be made utilizing information derived from other reactions related to the one of primary interest. Several examples are given where such an approach is valid and appropriate. (auth.)
Hiding data - selected topics: Rudolf Ahlswede's lectures on information theory 3
Althöfer, Ingo; Deppe, Christian; Tamm, Ulrich
2016-01-01
Devoted to information security, this volume begins with a short course on cryptography, mainly based on lectures given by Rudolf Ahlswede at the University of Bielefeld in the mid 1990s. It was the second of his cycle of lectures on information theory which opened with an introductory course on basic coding theorems, as covered in Volume 1 of this series. In this third volume, Shannon’s historical work on secrecy systems is detailed, followed by an introduction to an information-theoretic model of wiretap channels, and such important concepts as homophonic coding and authentication. Once the theoretical arguments have been presented, comprehensive technical details of AES are given. Furthermore, a short introduction to the history of public-key cryptology, RSA and El Gamal cryptosystems is provided, followed by a look at the basic theory of elliptic curves, and algorithms for efficient addition in elliptic curves. Lastly, the important topic of “oblivious transfer” is discussed, which is strongly conne...
Information-Processing Models and Curriculum Design
Calfee, Robert C.
1970-01-01
"This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)
Internal Universes in Models of Homotopy Type Theory
DEFF Research Database (Denmark)
Licata, Daniel R.; Orton, Ian; Pitts, Andrew M.
2018-01-01
We show that universes of fibrations in various models of homotopy type theory have an essentially global character: they cannot be described in the internal language of the presheaf topos from which the model is constructed. We get around this problem by extending the internal language with a mo... that the interval in cubical sets does indeed have. This leads to a completely internal development of models of homotopy type theory within what we call crisp type theory.
Theory, modeling, and simulation annual report, 1992
Energy Technology Data Exchange (ETDEWEB)
1993-05-01
This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.
Theories of conduct disorder: a causal modelling analysis
Krol, N.P.C.M.; Morton, J.; Bruyn, E.E.J. De
2004-01-01
Background: If a clinician has to make decisions on diagnosis and treatment, he or she is confronted with a variety of causal theories. In order to compare these theories a neutral terminology and notational system is needed. The Causal Modelling framework involving three levels of description –
Models of Regge behaviour in an asymptotically free theory
International Nuclear Information System (INIS)
Polkinghorne, J.C.
1976-01-01
Two simple Feynman integral models are presented which reproduce the features expected to be of physical importance in the Regge behaviour of asymptotically free theories. Analysis confirms the result, expected on general grounds, that φ³ theory in six dimensions has an essential singularity at l = -1. The extension to gauge theories is discussed. (Auth.)
Internet information triangulation: Design theory and prototype evaluation
Wijnhoven, Alphonsus B.J.M.; Brinkhuis, Michel
2014-01-01
Many discussions exist regarding the credibility of information on the Internet. Similar discussions happen on the interpretation of social scientific research data, for which information triangulation has been proposed as a useful method. In this article, we explore a design theory—consisting of a
Innovations in information retrieval perspectives for theory and practice
Foster, Allen
2011-01-01
The advent of various information retrieval (IR) technologies and approaches to storage and retrieval provide communities with opportunities for mass documentation, digitization, and the recording of information in different forms. This book introduces and contextualizes these developments and looks at supporting research in IR.
On the assessment of visual communication by information theory
Huck, Friedrich O.; Fales, Carl L.
1993-01-01
This assessment of visual communication integrates the optical design of the image-gathering device with the digital processing for image coding and restoration. Results show that informationally optimized image gathering ordinarily can be relied upon to maximize the information efficiency of decorrelated data and the visual quality of optimally restored images.
Theory analysis of the Dental Hygiene Human Needs Conceptual Model.
MacDonald, L; Bowen, D M
2017-11-01
Theories provide a structural knowing about concept relationships, practice intricacies, and intuitions and thus shape the distinct body of the profession. Capturing ways of knowing and being is essential to any profession's practice, education and research. This process defines the phenomenon of the profession - its existence or experience. Theory evaluation is a systematic criterion-based assessment of a specific theory. This study presents a theory analysis of the Dental Hygiene Human Needs Conceptual Model (DH HNCM). Using the Walker and Avant Theory Analysis, a seven-step process, the DH HNCM was analysed and evaluated for its meaningfulness and contribution to dental hygiene. The steps include the following: (i) investigate the origins; (ii) examine relationships of the theory's concepts; (iii) assess the logic of the theory's structure; (iv) consider the usefulness to practice; (v) judge the generalizability; (vi) evaluate the parsimony; and (vii) appraise the testability of the theory. Human needs theory in nursing and Maslow's Hierarchy of Need Theory prompted this theory's development. The DH HNCM depicts four concepts based on the paradigm concepts of the profession: client, health/oral health, environment and dental hygiene actions, and includes eleven validated human needs that evolved over time to eight. It is logical, simplistic, allows scientific predictions and testing, and provides a unique lens for the dental hygiene practitioner. With this model, dental hygienists have entered practice, knowing they enable clients to meet their human needs. For the DH HNCM, theory analysis affirmed that the model is reasonable and insightful and adds to the dental hygiene profession's epistemology and ontology. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Information Theory Broadens the Spectrum of Molecular Ecology and Evolution.
Sherwin, W B; Chao, A; Jost, L; Smouse, P E
2017-12-01
Information or entropy analysis of diversity is used extensively in community ecology, and has recently been exploited for prediction and analysis in molecular ecology and evolution. Information measures belong to a spectrum (or q profile) of measures whose contrasting properties provide a rich summary of diversity, including allelic richness (q=0), Shannon information (q=1), and heterozygosity (q=2). We present the merits of information measures for describing and forecasting molecular variation within and among groups, comparing forecasts with data, and evaluating underlying processes such as dispersal. Importantly, information measures directly link causal processes and divergence outcomes, have straightforward relationship to allele frequency differences (including monotonicity that q=2 lacks), and show additivity across hierarchical layers such as ecology, behaviour, cellular processes, and nongenetic inheritance. Copyright © 2017 Elsevier Ltd. All rights reserved.
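The q profile described above corresponds to the family of Hill numbers, which unify richness (q=0), Shannon information (q=1) and heterozygosity-based diversity (q=2) as "effective numbers of alleles". A small illustrative sketch (the uniform and skewed frequency vectors are our own examples, not data from the paper):

```python
import math

def hill_number(p, q):
    """Effective number of types (Hill number) of order q for a frequency
    vector p: D_q = (sum_i p_i^q)^(1/(1-q)). q=0 gives richness, the
    q->1 limit is exp(Shannon entropy), and q=2 is the inverse of the
    expected homozygosity."""
    p = [x for x in p if x > 0]
    if abs(q - 1.0) < 1e-9:  # q=1 is defined as a limit
        return math.exp(-sum(x * math.log(x) for x in p))
    return sum(x ** q for x in p) ** (1.0 / (1.0 - q))

# Uniform frequencies: every order agrees on the effective number.
uniform = [0.25] * 4
print([round(hill_number(uniform, q), 3) for q in (0, 1, 2)])  # [4.0, 4.0, 4.0]

# Skewed frequencies: higher q discounts rare alleles more strongly,
# so the profile decreases with q.
skewed = [0.7, 0.1, 0.1, 0.1]
print([round(hill_number(skewed, q), 3) for q in (0, 1, 2)])
```

The contrast between the two profiles is exactly the "rich summary of diversity" the abstract refers to: a flat profile signals evenness, a steeply falling one signals dominance by a few alleles.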
Taousser, Fatima; Defoort, Michael; Djemai, Mohamed
2016-01-01
This paper investigates the consensus problem for linear multi-agent system with fixed communication topology in the presence of intermittent communication using the time-scale theory. Since each agent can only obtain relative local information intermittently, the proposed consensus algorithm is based on a discontinuous local interaction rule. The interaction among agents happens at a disjoint set of continuous-time intervals. The closed-loop multi-agent system can be represented using mixed linear continuous-time and linear discrete-time models due to intermittent information transmissions. The time-scale theory provides a powerful tool to combine continuous-time and discrete-time cases and study the consensus protocol under a unified framework. Using this theory, some conditions are derived to achieve exponential consensus under intermittent information transmissions. Simulations are performed to validate the theoretical results.
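The mixed continuous/discrete behavior described above can be illustrated numerically. The following is a hedged sketch only: a plain Euler simulation of consensus dynamics x' = -Lx that run during intermittent communication windows and freeze outside them, not the paper's time-scale formalism; the path-graph Laplacian, windows and initial states are invented for the example.

```python
def consensus_run(x, L, active, dt=0.001, horizon=10.0):
    """Simulate x' = -L x while t lies in an 'active' communication window
    and x' = 0 elsewhere. Time-scale theory treats both regimes in one
    unified framework; here we simply alternate them."""
    t = 0.0
    while t < horizon:
        if any(lo <= t < hi for lo, hi in active):
            # Euler step of the Laplacian consensus flow
            x = [xi - dt * sum(L[i][j] * x[j] for j in range(len(x)))
                 for i, xi in enumerate(x)]
        t += dt
    return x

# Path graph on 3 agents; communication is only live on [0,2) and [4,6).
L = [[1, -1, 0],
     [-1, 2, -1],
     [0, -1, 1]]
x = consensus_run([0.0, 1.0, 5.0], L, active=[(0.0, 2.0), (4.0, 6.0)])
print([round(v, 2) for v in x])  # agents approach agreement near the average 2.0
```

Despite the agents being silent for most of the horizon, the accumulated active time is enough for exponential convergence toward the initial average, which is the qualitative conclusion the paper derives analytically.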
Conceptual Modeling of Time-Varying Information
DEFF Research Database (Denmark)
Gregersen, Heidi; Jensen, Christian S.
2004-01-01
A wide range of database applications manage information that varies over time. Many of the underlying database schemas of these were designed using the Entity-Relationship (ER) model. In the research community as well as in industry, it is common knowledge that the temporal aspects of the mini-world are important, but difficult to capture using the ER model. Several enhancements to the ER model have been proposed in an attempt to support the modeling of temporal aspects of information. Common to the existing temporally extended ER models, few or no specific requirements to the models were given...
Directory of Energy Information Administration models 1996
Energy Technology Data Exchange (ETDEWEB)
NONE
1996-07-01
This directory revises and updates the Directory of Energy Information Administration Models 1995, DOE/EIA-0293(95), Energy Information Administration (EIA), U.S. Department of Energy, July 1995. Four models have been deleted in this directory as they are no longer being used: (1) Market Penetration Model for Ground-Water Heat Pump Systems (MPGWHP); (2) Market Penetration Model for Residential Rooftop PV Systems (MPRESPV-PC); (3) Market Penetration Model for Active and Passive Solar Technologies (MPSOLARPC); and (4) Revenue Requirements Modeling System (RRMS).
Identifying the Source of Misfit in Item Response Theory Models.
Liu, Yang; Maydeu-Olivares, Alberto
2014-01-01
When an item response theory model fails to fit adequately, the items for which the model provides a good fit and those for which it does not must be determined. To this end, we compare the performance of several fit statistics for item pairs with known asymptotic distributions under maximum likelihood estimation of the item parameters: (a) a mean and variance adjustment to bivariate Pearson's X², (b) a bivariate subtable analog to Reiser's (1996) overall goodness-of-fit test, (c) a z statistic for the bivariate residual cross product, and (d) Maydeu-Olivares and Joe's (2006) M2 statistic applied to bivariate subtables. The unadjusted Pearson's X² with heuristically determined degrees of freedom is also included in the comparison. For binary and ordinal data, our simulation results suggest that the z statistic has the best Type I error and power behavior among all the statistics under investigation when the observed information matrix is used in its computation. However, if one has to use the cross-product information, the mean and variance adjusted X² is recommended. We illustrate the use of pairwise fit statistics in 2 real-data examples and discuss possible extensions of the current research in various directions.
Strifler, Lisa; Cardoso, Roberta; McGowan, Jessie; Cogo, Elise; Nincic, Vera; Khan, Paul A; Scott, Alistair; Ghassemi, Marco; MacDonald, Heather; Lai, Yonda; Treister, Victoria; Tricco, Andrea C; Straus, Sharon E
2018-04-13
To conduct a scoping review of knowledge translation (KT) theories, models and frameworks that have been used to guide dissemination or implementation of evidence-based interventions targeted to prevention and/or management of cancer or other chronic diseases. We used a comprehensive multistage search process from 2000-2016, which included traditional bibliographic database searching, searching using names of theories, models and frameworks, and cited reference searching. Two reviewers independently screened the literature and abstracted data. We found 596 studies reporting on the use of 159 KT theories, models or frameworks. A majority (87%) of the identified theories, models or frameworks were used in five or fewer studies, with 60% used once. The theories, models and frameworks were most commonly used to inform planning/design, implementation and evaluation activities, and least commonly used to inform dissemination and sustainability/scalability activities. Twenty-six were used across the full implementation spectrum (from planning/design to sustainability/scalability) either within or across studies. All were used for at least individual-level behavior change, while 48% were used for organization-level, 33% for community-level and 17% for system-level change. We found a significant number of KT theories, models and frameworks with a limited evidence base describing their use. Copyright © 2018. Published by Elsevier Inc.
Directory of Open Access Journals (Sweden)
Ali Mohammad-Djafari
2015-06-01
The main content of this review article is first to review the main inference tools using Bayes' rule, the maximum entropy principle (MEP), information theory, relative entropy and the Kullback–Leibler (KL) divergence, and Fisher information and its corresponding geometries. For each of these tools, the precise context of its use is described. The second part of the paper focuses on the ways these tools have been used in data, signal and image processing and in the inverse problems that arise in different physical sciences and engineering applications. A few examples of the applications are described: entropy in independent component analysis (ICA) and in blind source separation, Fisher information in data model selection, different maximum-entropy-based methods in time series spectral estimation and in linear inverse problems, and, finally, Bayesian inference for general inverse problems. Some original material concerning approximate Bayesian computation (ABC) and, in particular, the variational Bayesian approximation (VBA) methods is also presented. VBA is used to propose an alternative Bayesian computational tool to the classical Markov chain Monte Carlo (MCMC) methods. We will also see that VBA encompasses joint maximum a posteriori (MAP) estimation, as well as the different expectation-maximization (EM) algorithms, as particular cases.
An information maximization model of eye movements
Renninger, Laura Walker; Coughlan, James; Verghese, Preeti; Malik, Jitendra
2005-01-01
We propose a sequential information maximization model as a general strategy for programming eye movements. The model reconstructs high-resolution visual information from a sequence of fixations, taking into account the fall-off in resolution from the fovea to the periphery. From this framework we get a simple rule for predicting fixation sequences: after each fixation, fixate next at the location that minimizes uncertainty (maximizes information) about the stimulus. By comparing our model performance to human eye movement data and to predictions from a saliency and random model, we demonstrate that our model is best at predicting fixation locations. Modeling additional biological constraints will improve the prediction of fixation sequences. Our results suggest that information maximization is a useful principle for programming eye movements.
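The fixation rule described above can be sketched as a greedy loop: after each fixation, pick the location where the remaining uncertainty, discounted by how well previous fixations already resolved it, is largest. This is an illustrative reconstruction, not the authors' code; the exponential acuity fall-off and the `falloff` parameter are assumptions:

```python
import numpy as np

def next_fixation(uncertainty_map, fixations, falloff=0.5):
    """Greedy information-maximization rule (a sketch, not the published model).

    uncertainty_map -- per-location entropy of the current stimulus estimate
    fixations       -- list of (row, col) locations already fixated
    falloff         -- assumed rate at which acuity drops with eccentricity
    """
    h, w = uncertainty_map.shape
    rows, cols = np.mgrid[0:h, 0:w]
    resolved = np.zeros_like(uncertainty_map)
    for fr, fc in fixations:
        dist = np.hypot(rows - fr, cols - fc)
        # each past fixation resolves nearby locations, less so in the periphery
        resolved = np.maximum(resolved, np.exp(-falloff * dist))
    # expected gain: uncertainty still unresolved at each candidate location
    gain = uncertainty_map * (1.0 - resolved)
    return np.unravel_index(np.argmax(gain), gain.shape)
```

Iterating this rule produces a fixation sequence that keeps moving to the currently most informative location.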
A ROADMAP FOR A COMPUTATIONAL THEORY OF THE VALUE OF INFORMATION IN ORIGIN OF LIFE QUESTIONS
Directory of Open Access Journals (Sweden)
Soumya Banerjee
2016-06-01
Information plays a critical role in complex biological systems. Complex systems such as immune systems and ant colonies coordinate heterogeneous components in a decentralized fashion. How do these distributed, decentralized systems function? One key component is how efficiently these complex systems process information: they have an architecture for integrating and processing information coming in from various sources, which points to the value of information in the functioning of different complex biological systems. This article proposes a role for information processing in questions around the origin of life and suggests how computational simulations may yield insights into questions related to the origin of life. Such a computational model of the origin of life would unify thermodynamics with information processing, and we would gain an appreciation of why proteins and nucleotides evolved as the substrate of computation and information processing in the living systems we see on Earth. Answers to questions like these may give us insights into non-carbon-based forms of life that we could search for outside Earth. We hypothesize that carbon-based life forms are only one point on a continuum of life-like systems in the universe. Investigation of the computational substrates that allow information processing is important and could yield insights into: (1) novel non-carbon-based computational substrates that may have "life-like" properties, and (2) how life may have actually originated from non-life on Earth. Life may exist as a continuum between non-life and life, and we may have to revise our notion of life and of how common it is in the universe. Looking at life or life-like phenomena through the lens of information theory may yield a broader view of life.
Extended Nambu models: Their relation to gauge theories
Escobar, C. A.; Urrutia, L. F.
2017-05-01
Yang-Mills theories supplemented by an additional coordinate constraint, which is solved and substituted in the original Lagrangian, provide examples of the so-called Nambu models, in the case where such constraints arise from spontaneous Lorentz symmetry breaking. Some explicit calculations have shown that, after additional conditions are imposed, Nambu models are capable of reproducing the original gauge theories, thus making Lorentz violation unobservable and allowing the interpretation of the corresponding massless gauge bosons as the Goldstone bosons arising from the spontaneous symmetry breaking. A natural question posed by this approach in the realm of gauge theories is to determine under which conditions the recovery of an arbitrary gauge theory from the corresponding Nambu model, defined by a general constraint over the coordinates, becomes possible. We refer to these theories as extended Nambu models (ENM) and emphasize the fact that the defining coordinate constraint is not treated as a standard gauge fixing term. At this level, the mechanism for generating the constraint is irrelevant, and the case of spontaneous Lorentz symmetry breaking is taken only as a motivation, which naturally brings this problem under consideration. Using a nonperturbative Hamiltonian analysis we prove that the ENM yields the original gauge theory after we demand current conservation for all time, together with the imposition of the Gauss law constraints as initial conditions upon the dynamics of the ENM. The Nambu models yielding electrodynamics, Yang-Mills theories and linearized gravity are particular examples of our general approach.
Linear control theory for gene network modeling.
Shin, Yong-Jun; Bleris, Leonidas
2010-09-16
Systems biology is an interdisciplinary field that aims at understanding complex interactions in cells. Here we demonstrate that linear control theory can provide valuable insight and practical tools for the characterization of complex biological networks. We provide the foundation for such analyses through the study of several case studies including cascade and parallel forms, feedback and feedforward loops. We reproduce experimental results and provide rational analysis of the observed behavior. We demonstrate that methods such as the transfer function (frequency domain) and linear state-space (time domain) can be used to predict reliably the properties and transient behavior of complex network topologies and point to specific design strategies for synthetic networks.
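The time-domain state-space analysis mentioned above can be illustrated with a two-stage cascade, a motif analogous to transcription followed by translation. This is a generic linear-systems sketch under assumed parameter values, not the authors' specific model; the forward-Euler integrator is a deliberate simplification:

```python
import numpy as np

def step_response(A, B, C, dt=0.001, t_end=5.0):
    """Simulate y(t) for dx/dt = A x + B u, y = C x, with unit step input u = 1.

    Uses simple forward-Euler integration (adequate for small dt).
    """
    A, B, C = (np.atleast_2d(M).astype(float) for M in (A, B, C))
    x = np.zeros((A.shape[0], 1))
    ys = []
    for _ in np.arange(0.0, t_end, dt):
        x = x + dt * (A @ x + B)  # u(t) = 1, so B u = B
        ys.append(float(C @ x))
    return np.array(ys)

# Two first-order stages in cascade; stage 2 is driven by stage 1:
A = np.array([[-1.0, 0.0],
              [ 2.0, -2.0]])
B = np.array([[1.0],
              [0.0]])
C = np.array([[0.0, 1.0]])      # observe the second stage only
y = step_response(A, B, C)      # settles toward steady state x2 = 1
```

The same system can equivalently be analyzed in the frequency domain through its transfer function, the cascade of the two first-order stages.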
Dependence in probabilistic modeling Dempster-Shafer theory and probability bounds analysis
Energy Technology Data Exchange (ETDEWEB)
Ferson, Scott [Applied Biomathematics, Setauket, NY (United States); Nelsen, Roger B. [Lewis & Clark College, Portland OR (United States); Hajagos, Janos [Applied Biomathematics, Setauket, NY (United States); Berleant, Daniel J. [Iowa State Univ., Ames, IA (United States); Zhang, Jianzhong [Iowa State Univ., Ames, IA (United States); Tucker, W. Troy [Applied Biomathematics, Setauket, NY (United States); Ginzburg, Lev R. [Applied Biomathematics, Setauket, NY (United States); Oberkampf, William L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-05-01
This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.
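When nothing is known about the dependence between two events, the classical Fréchet–Hoeffding bounds give the envelope within which any dependence model, including probability bounds analysis, must lie. A minimal sketch (the function name is illustrative; the report covers much richer dependence models than this):

```python
def frechet_bounds(p_a, p_b):
    """Bounds on P(A and B) when the dependence between A and B is unknown.

    lower -- attained under countermonotonic (maximally negative) dependence
    upper -- attained under comonotonic (maximally positive) dependence
    """
    lower = max(0.0, p_a + p_b - 1.0)
    upper = min(p_a, p_b)
    return lower, upper

lo, hi = frechet_bounds(0.7, 0.6)   # (0.3, 0.6)
# the independence value 0.7 * 0.6 = 0.42 lies inside, but so do many
# other dependence models -- assuming independence can badly understate risk
```

This is the simplest instance of the "bounding approaches for incomplete dependence information" the report reviews: replace a point estimate with an interval guaranteed to contain the truth.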
Polling models : from theory to traffic intersections
Boon, M.A.A.
2011-01-01
The subject of the present monograph is the study of polling models, which are queueing models consisting of multiple queues, cyclically attended by one server. Polling models originated in the late 1950s, but did not receive much attention until the 1980s when an abundance of new applications arose.
Optimization of hydrometric monitoring network in urban drainage systems using information theory.
Yazdi, J
2017-10-01
Regular and continuous monitoring of urban runoff in both quality and quantity aspects is of great importance for controlling and managing surface runoff. Due to the considerable costs of establishing new gauges, optimization of the monitoring network is essential. This research proposes an approach for site selection of new discharge stations in urban areas, based on entropy theory in conjunction with multi-objective optimization tools and numerical models. The modeling framework provides an optimal trade-off between the maximum possible information content and the minimum shared information among stations. This approach was applied to the main surface-water collection system in Tehran to determine new optimal monitoring points under the cost considerations. Experimental results on this drainage network show that the obtained cost-effective designs noticeably outperform the consulting engineers' proposal in terms of both information contents and shared information. The research also determined the highly frequent sites at the Pareto front which might be important for decision makers to give a priority for gauge installation on those locations of the network.
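The entropy objectives described above, maximize information content while minimizing shared information, reduce to marginal entropy and mutual information estimated from discharge records. A histogram-based sketch (the discretization into 10 bins is an assumption; the paper's multi-objective optimization layer is omitted):

```python
import numpy as np

def entropy_bits(series, bins=10):
    """Marginal entropy (bits) of one station's record, via histogram binning."""
    counts, _ = np.histogram(series, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_info_bits(x, y, bins=10):
    """Shared information (bits) between two stations' records."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    mask = pxy > 0
    return np.sum(pxy[mask] * np.log2(pxy[mask] / np.outer(px, py)[mask]))
```

A greedy version of the trade-off would repeatedly add the candidate site with the largest `entropy_bits` minus its total `mutual_info_bits` with already-selected stations; the paper instead resolves the trade-off on a Pareto front under cost constraints.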
The information a history, a theory, a flood
Gleick, James
2011-01-01
Winner of the Royal Society Winton Prize for Science Books 2012, the world's leading prize for popular science writing. We live in the information age. But every era of history has had its own information revolution: the invention of writing, the composition of dictionaries, the creation of the charts that made navigation possible, the discovery of the electronic signal, the cracking of the genetic code. In 'The Information' James Gleick tells the story of how human beings use, transmit and keep what they know. From African talking drums to Wikipedia, from Morse code to the 'bit', it is a fascinating account of the modern age's defining idea and a brilliant exploration of how information has revolutionised our lives.
Reflections on the Right to Information Based on Citizenship Theories
Directory of Open Access Journals (Sweden)
Vitor Gentilli
2007-06-01
In modern societies, structured as representative democracies, all rights are to some extent related to the right to information: the enlargement of participation in citizenship presupposes an enlargement of the right to information as a premise. It is a right which encourages the exercising of citizenship and affords the citizens access to and criticism of the instruments necessary for the full exercising of the group of citizenship rights. The right to information can have characteristics of emancipation or of tutelage. An emancipating right is a right to freedom, a right whose basic presupposition is freedom of choice. Accordingly, the maxim which could sum up the ethical issue of the right to information would be: give maximum publicity to everything which refers to the public sphere, and keep secret that which refers to the private sphere.
Development of a dynamic computational model of social cognitive theory.
Riley, William T; Martin, Cesar A; Rivera, Daniel E; Hekler, Eric B; Adams, Marc A; Buman, Matthew P; Pavel, Misha; King, Abby C
2016-12-01
Social cognitive theory (SCT) is among the most influential theories of behavior change and has been used as the conceptual basis of health behavior interventions for smoking cessation, weight management, and other health behaviors. SCT and other behavior theories were developed primarily to explain differences between individuals, but explanatory theories of within-person behavioral variability are increasingly needed as new technologies allow for intensive longitudinal measures and interventions adapted from these inputs. These within-person explanatory theoretical applications can be modeled as dynamical systems. SCT constructs, such as reciprocal determinism, are inherently dynamical in nature, but SCT has not been modeled as a dynamical system. This paper describes the development of a dynamical system model of SCT using fluid analogies and control systems principles drawn from engineering. Simulations of this model were performed to assess if the model performed as predicted based on theory and empirical studies of SCT. This initial model generates precise and testable quantitative predictions for future intensive longitudinal research. Dynamic modeling approaches provide a rigorous method for advancing health behavior theory development and refinement and for guiding the development of more potent and efficient interventions.
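The fluid-analogy approach mentioned above treats an SCT construct such as self-efficacy as the level in a tank that intervention inputs fill and that drains over time. A minimal single-tank sketch, not the paper's full multi-construct model; the time constant and gain values are assumptions for illustration:

```python
import numpy as np

def simulate_fluid_analogy(inflow, tau=4.0, gain=0.8, dt=1.0):
    """First-order 'fluid tank' of the kind used in control-engineering
    behavior models: d(level)/dt = (gain * input - level) / tau.

    tau  -- assumed time constant (how quickly the construct decays)
    gain -- assumed steady-state response to a sustained unit input
    """
    level = 0.0
    levels = []
    for u in inflow:
        level += dt * (gain * u - level) / tau
        levels.append(level)
    return np.array(levels)

# A sustained intervention input, then withdrawal:
u = np.array([1.0] * 20 + [0.0] * 20)
y = simulate_fluid_analogy(u)
# the level rises toward gain (0.8) during the input and decays afterwards
```

Coupling several such tanks, with one construct's level feeding another's inflow, is how reciprocal determinism becomes a testable dynamical system.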
Contribution to the study of conformal theories and integrable models
International Nuclear Information System (INIS)
Sochen, N.
1992-05-01
The purpose of this thesis is the study of 2-D physics. The main tool is conformal field theory with Kac-Moody and W algebras. This theory describes 2-D models that have translation, rotation and dilatation symmetries at their critical point. Extended conformal theories describe models with a symmetry larger than conformal symmetry. After a review of the methods of conformal theory, the author carries out a detailed study of the form of singular vectors in the sl(2) affine algebra; with this important form, correlation functions can be calculated. The classical W algebra is studied, and the relations between the classical and quantum W algebras are specified. The bosonization method is presented and the sl(2)/sl(2) topological model is studied. The bosonization of the partition functions of different models is described. A program for the classification of rational theories is described, linking rational conformal theories and integrable spin models, and interesting relations between the Boltzmann weights of different models have been found. With these relations, the integrability of models is proved by a direct calculation of their Boltzmann weights.
Tsallis Entropy Theory for Modeling in Water Engineering: A Review
Directory of Open Access Journals (Sweden)
Vijay P. Singh
2017-11-01
Water engineering is an amalgam of engineering (e.g., hydraulics, hydrology, irrigation, ecosystems, environment, water resources) and non-engineering (e.g., social, economic, political) aspects that are needed for planning, designing and managing water systems. These aspects and the associated issues have been dealt with in the literature using different techniques that are based on different concepts and assumptions. A fundamental question that still remains is: can we develop a unifying theory for addressing these? The second law of thermodynamics permits us to develop a theory that helps address these in a unified manner. This theory can be referred to as the entropy theory. The thermodynamic entropy theory is analogous to the Shannon entropy or the information theory. Perhaps the most popular generalization of the Shannon entropy is the Tsallis entropy. The Tsallis entropy has been applied to a wide spectrum of problems in water engineering. This paper provides an overview of Tsallis entropy theory in water engineering. After some basic description of entropy and Tsallis entropy, a review of its applications in water engineering is presented, based on three types of problems: (1) problems requiring entropy maximization; (2) problems requiring coupling Tsallis entropy theory with another theory; and (3) problems involving physical relations.
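The Tsallis entropy named above has a compact closed form, S_q = (1 - Σ p_i^q) / (q - 1), which reduces to the Shannon entropy as q → 1. A minimal sketch of both cases (the function name is illustrative; the review applies this to far richer constrained-maximization problems):

```python
import numpy as np

def tsallis_entropy(p, q=2.0):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1).

    As q -> 1 this recovers the Shannon entropy -sum p_i log p_i.
    """
    p = np.asarray(p, dtype=float)
    if abs(q - 1.0) < 1e-9:
        p = p[p > 0]
        return -np.sum(p * np.log(p))  # Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

uniform = np.full(4, 0.25)
tsallis_entropy(uniform, q=2.0)  # (1 - 4 * 0.0625) / 1 = 0.75
tsallis_entropy(uniform, q=1.0)  # log(4), the Shannon value
```

The extra parameter q is what makes the Tsallis form attractive in water engineering: it can be tuned to match the non-extensive behavior of the system being modeled.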
Three level constraints on conformal field theories and string models
International Nuclear Information System (INIS)
Lewellen, D.C.
1989-05-01
Simple tree level constraints for conformal field theories which follow from the requirement of crossing symmetry of four-point amplitudes are presented, and their utility for probing general properties of string models is briefly illustrated and discussed. 9 refs
Nematic elastomers: from a microscopic model to macroscopic elasticity theory.
Xing, Xiangjun; Pfahl, Stephan; Mukhopadhyay, Swagatam; Goldbart, Paul M; Zippelius, Annette
2008-05-01
A Landau theory is constructed for the gelation transition in cross-linked polymer systems possessing spontaneous nematic ordering, based on symmetry principles and the concept of an order parameter for the amorphous solid state. This theory is substantiated with help of a simple microscopic model of cross-linked dimers. Minimization of the Landau free energy in the presence of nematic order yields the neoclassical theory of the elasticity of nematic elastomers and, in the isotropic limit, the classical theory of isotropic elasticity. These phenomenological theories of elasticity are thereby derived from a microscopic model, and it is furthermore demonstrated that they are universal mean-field descriptions of the elasticity for all chemical gels and vulcanized media.
MATHEMATICAL MODEL FOR CALCULATION OF INFORMATION RISKS FOR INFORMATION AND LOGISTICS SYSTEM
Directory of Open Access Journals (Sweden)
A. G. Korobeynikov
2015-05-01
Subject of research. The paper presents a mathematical model for calculating the information risks that arise during the transport and distribution of material resources under uncertainty. Here, information risk means the danger of losses or damage resulting from the company's use of information technologies. Method. The solution is based on the transport problem in a stochastic formulation, drawing on methods of mathematical modeling, graph theory, probability theory and Markov chains. The mathematical model is built in several stages. At the first stage, the capacity of different sites as a function of time is calculated from the data supplied by the information and logistics system, the weight matrix is formed, and the digraph is constructed. A minimum route covering all specified vertices is then found using Dijkstra's algorithm. At the second stage, systems of Kolmogorov differential equations are formed using the computed route; their solutions give the probabilities that resources are located at a given vertex as a function of time. At the third stage, the overall probability of traversing the whole route as a function of time is calculated using the multiplication theorem of probabilities. The information risk, as a function of time, is defined as the product of the greatest possible damage and the overall probability of traversing the whole route. Information risk is thus measured in units of damage, corresponding to the monetary unit in which the information and logistics system operates. Main results. The operation of the presented mathematical model is shown on a concrete example of transporting material resources, with specified places of shipment and delivery, routes and their capacities, greatest possible damage and admissible risk. The calculations presented on a diagram showed
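The route-finding step of the model above is a standard shortest-path computation on the weighted digraph. A minimal Dijkstra sketch over a hypothetical transport network (the network, node names and weights are invented for illustration; the paper's model additionally layers Kolmogorov equations on the resulting route):

```python
import heapq

def dijkstra(graph, source, target):
    """Cost of the shortest route in a weighted digraph.

    graph maps node -> {neighbor: edge weight}; returns inf if unreachable.
    """
    dist = {source: 0.0}
    heap = [(0.0, source)]
    visited = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node == target:
            return d
        if node in visited:
            continue
        visited.add(node)
        for nbr, w in graph.get(node, {}).items():
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return float("inf")

# Hypothetical network: edge weights stand in for route costs/capacities
net = {"A": {"B": 2, "C": 5}, "B": {"C": 1, "D": 4}, "C": {"D": 1}}
dijkstra(net, "A", "D")  # A -> B -> C -> D, total cost 4
```

In the paper's pipeline, the probability of traversing such a route within a given time, obtained from the Kolmogorov equations, is then multiplied by the greatest possible damage to yield the time-dependent information risk.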