WorldWideScience

Sample records for information theory based

  1. Information theory based approaches to cellular signaling.

    Science.gov (United States)

    Waltermann, Christian; Klipp, Edda

    2011-10-01

Cells interact with their environment and must react adequately to internal and external changes, such as changes in nutrient composition, physical properties like temperature or osmolarity, and other stresses. More specifically, they must be able to evaluate whether an external change is significant or just within the range of noise. Based on multiple external parameters they have to compute an optimal response. Cellular signaling pathways are considered the major means of information perception and transmission in cells. Here, we review different attempts to quantify information processing at the level of individual cells. We refer to Shannon entropy, mutual information, and informal measures of signaling pathway cross-talk and specificity. Information theory in systems biology has been successfully applied to the identification of optimal pathway structures, to mutual information and entropy as system responses in sensitivity analysis, and to the quantification of input and output information. While the study of information transmission within the framework of information theory in technical systems is an advanced field with high impact in engineering and telecommunication, its application to biological objects and processes is still restricted to specific fields such as neuroscience and structural and molecular biology. In systems biology, which deals with a holistic understanding of biochemical systems and cellular signaling, examples of the application of information theory have emerged only recently. This article is part of a Special Issue entitled Systems Biology of Microorganisms.
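The quantities this record names, Shannon entropy and the mutual information between a pathway's input and output, can be illustrated with a minimal sketch. The joint distribution of discretized stimulus and response levels below is invented for illustration and is not from the paper:

```python
import numpy as np

# Hypothetical joint distribution p(stimulus, response) for a signaling
# pathway discretized into 3 input and 3 output levels (illustrative numbers).
p_xy = np.array([[0.20, 0.05, 0.00],
                 [0.05, 0.20, 0.05],
                 [0.00, 0.05, 0.40]])
assert np.isclose(p_xy.sum(), 1.0)

p_x = p_xy.sum(axis=1)          # marginal over stimuli
p_y = p_xy.sum(axis=0)          # marginal over responses

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability cells."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

H_x, H_y, H_xy = entropy(p_x), entropy(p_y), entropy(p_xy.ravel())
mutual_info = H_x + H_y - H_xy   # I(X;Y) = H(X) + H(Y) - H(X,Y)
```

A strongly diagonal joint table, as here, yields a clearly positive mutual information: the response carries information about the stimulus above the noise floor.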

  2. An information theory-based approach to modeling the information processing of NPP operators

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task. The focus is on i) developing a model for the information processing of NPP operators and ii) quantifying the model. To resolve the problems of previous information-theory-based approaches, i.e. the limitations of single-channel approaches, we first develop an information processing model with multiple stages, which contains information flows. The uncertainty of the information is then quantified using Conant's model, a variant of information theory.
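The single-channel limitation mentioned in this record relates to a basic fact about multi-stage processing: by the data-processing inequality, information about the task can only degrade as it passes through successive stages. A minimal sketch of that inequality (the two-stage cascade and its transition probabilities are illustrative assumptions, not Conant's actual model):

```python
import numpy as np

def mutual_info(p_joint):
    """I(A;B) in bits from a joint probability table."""
    pa = p_joint.sum(axis=1, keepdims=True)
    pb = p_joint.sum(axis=0, keepdims=True)
    mask = p_joint > 0
    return float((p_joint[mask] * np.log2(p_joint[mask] / (pa @ pb)[mask])).sum())

p_x = np.array([0.5, 0.5])                      # plant state: alarm / no alarm
stage1 = np.array([[0.9, 0.1], [0.2, 0.8]])     # perception stage: p(y|x)
stage2 = np.array([[0.85, 0.15], [0.1, 0.9]])   # decision stage:   p(z|y)

p_xy = p_x[:, None] * stage1                    # joint over state and percept
p_xz = p_x[:, None] * (stage1 @ stage2)         # joint over state and decision

I_xy, I_xz = mutual_info(p_xy), mutual_info(p_xz)
# Data-processing inequality: the cascade cannot create information.
assert I_xz <= I_xy + 1e-12
```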

  3. Evaluating hydrological model performance using information theory-based metrics

    Science.gov (United States)

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can serve as a complementary tool for hydrologic m...
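One common information theory-based performance score is the mutual information between observed and simulated series after discretization. The sketch below is generic, on synthetic streamflow-like data; the record does not specify which metric or dataset the study actually used:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "observed" streamflow and an imperfect "simulated" series;
# both are illustrative stand-ins.
observed = rng.gamma(shape=2.0, scale=5.0, size=500)
simulated = observed * rng.lognormal(mean=0.0, sigma=0.3, size=500)

def binned(x, n_bins=8):
    # Equal-frequency (quantile) binning keeps every bin populated.
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(x, edges)

def mutual_info_bits(a, b):
    """Mutual information (bits) between two integer-coded series."""
    joint, _, _ = np.histogram2d(a, b, bins=(np.arange(9) - 0.5,) * 2)
    p = joint / joint.sum()
    pa, pb = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
    m = p > 0
    return float((p[m] * np.log2(p[m] / (pa @ pb)[m])).sum())

score = mutual_info_bits(binned(observed), binned(simulated))
```

Unlike squared-error scores, this value is invariant under monotone rescaling of either series, which is one reason such metrics complement accuracy-based ones.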

  4. Information theory of molecular systems

    CERN Document Server

    Nalewajski, Roman F

    2006-01-01

    As well as providing a unified outlook on physics, Information Theory (IT) has numerous applications in chemistry and biology owing to its ability to provide a measure of the entropy/information contained within probability distributions and criteria of their information "distance" (similarity) and independence. Information Theory of Molecular Systems applies standard IT to classical problems in the theory of electronic structure and chemical reactivity. The book starts by introducing the basic concepts of modern electronic structure/reactivity theory based upon the Density Functional Theory

  5. An Emerging Theory for Evidence Based Information Literacy Instruction in School Libraries, Part 1: Building a Foundation

    Directory of Open Access Journals (Sweden)

    Carol A. Gordon

    2009-06-01

    Full Text Available Objective – Part I of this paper aims to create a framework for an emerging theory of evidence based information literacy instruction. In order to ground this framework in existing theory, a holistic perspective views inquiry as a learning process that synthesizes information searching and knowledge building. An interdisciplinary approach is taken to relate user-centric information behavior theory and constructivist learning theory that supports this synthesis. The substantive theories that emerge serve as a springboard for emerging theory. A second objective of this paper is to define evidence based information literacy instruction by assessing the suitability of performance based assessment and action research as tools of evidence based practice. Methods – An historical review of research grounded in user-centered information behavior theory and constructivist learning theory establishes a body of existing substantive theory that supports emerging theory for evidence based information literacy instruction within an information-to-knowledge approach. A focused review of the literature presents supporting research for an evidence based pedagogy that is performance assessment based, i.e., information users are immersed in real-world tasks that include formative assessments. An analysis of the meaning of action research in terms of its purpose and methodology establishes its suitability for structuring an evidence based pedagogy. Supporting research tests a training model for school librarians and educators which integrates performance based assessment, as well as action research. Results – Findings of an historical analysis of information behavior theory and constructivist teaching practices, and a literature review that explores teaching models for evidence based information literacy instruction, point to two elements of evidence based information literacy instruction: the micro level of information searching behavior and the macro level of

  6. Quantum biological information theory

    CERN Document Server

    Djordjevic, Ivan B

    2016-01-01

    This book is a self-contained, tutorial-based introduction to quantum information theory and quantum biology. It serves as a single-source reference to the topic for researchers in bioengineering, communications engineering, electrical engineering, applied mathematics, biology, computer science, and physics. The book provides all the essential principles of the quantum biological information theory required to describe the quantum information transfer from DNA to proteins, the sources of genetic noise and genetic errors as well as their effects. Integrates quantum information and quantum biology concepts; Assumes only knowledge of basic concepts of vector algebra at undergraduate level; Provides a thorough introduction to basic concepts of quantum information processing, quantum information theory, and quantum biology; Includes in-depth discussion of the quantum biological channel modelling, quantum biological channel capacity calculation, quantum models of aging, quantum models of evolution, quantum models o...

  7. Preservation of information in Fourier theory based deconvolved nuclear spectra

    International Nuclear Information System (INIS)

    Madan, V.K.; Gopalakrishnan, K.R.; Sharma, R.C.; Rattan, S.S.

    1995-01-01

    Nuclear spectroscopy is extremely useful in internal radiation dosimetry for estimating the body burden due to gamma emitters. Analysis of nuclear spectra is concerned with the extraction of the qualitative and quantitative information embedded in the spectra. A spectral deconvolution method based on Fourier theory is probably the simplest method of deconvolving nuclear spectra. It is proved mathematically that the deconvolution method preserves the qualitative information. It is shown by using simulated spectra and an observed gamma ray spectrum that the method preserves the quantitative information. This may provide a novel approach to information extraction from a deconvolved spectrum. The paper discusses the methodology, mathematical analysis, and the results obtained by deconvolving spectra. (author). 6 refs., 2 tabs
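A noise-free toy can illustrate the preservation claim: dividing in the Fourier domain leaves the total counts (quantitative information) and the peak location (qualitative information) intact. The spectrum, peak position, and response width below are illustrative assumptions, and real spectra would additionally require noise handling or regularization, which this sketch omits:

```python
import numpy as np

# Toy gamma-ray spectrum: one ideal line at channel 100, broadened by a
# Gaussian detector response (all numbers are illustrative).
n = 256
ch = np.arange(n)
true_spec = np.zeros(n)
true_spec[100] = 1000.0                       # total counts in the line

dist = np.minimum(ch, n - ch)                 # circular distance from channel 0
kernel = np.exp(-0.5 * (dist / 2.0) ** 2)
kernel /= kernel.sum()                        # unit-area response function

K = np.fft.fft(kernel)
observed = np.real(np.fft.ifft(np.fft.fft(true_spec) * K))

# Fourier deconvolution: pointwise division in the frequency domain.
deconvolved = np.real(np.fft.ifft(np.fft.fft(observed) / K))

area_preserved = bool(np.isclose(observed.sum(), deconvolved.sum(), rtol=1e-6))
peak_preserved = int(np.argmax(deconvolved)) == 100
```

The area is preserved exactly because the DC Fourier component is divided by the kernel's unit sum, which is the mechanism behind the quantitative-information claim.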

  8. Optimal cross-sectional sampling for river modelling with bridges: An information theory-based method

    Energy Technology Data Exchange (ETDEWEB)

    Ridolfi, E.; Napolitano, F., E-mail: francesco.napolitano@uniroma1.it [Sapienza Università di Roma, Dipartimento di Ingegneria Civile, Edile e Ambientale (Italy); Alfonso, L. [Hydroinformatics Chair Group, UNESCO-IHE, Delft (Netherlands); Di Baldassarre, G. [Department of Earth Sciences, Program for Air, Water and Landscape Sciences, Uppsala University (Sweden)

    2016-06-08

    The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers’ cross-sectional spacing.
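The selection principle described here, maximum information content with minimum redundancy, is often implemented greedily: repeatedly add the location that maximizes the joint entropy of the selected set. A sketch on synthetic data (the candidate series, correlation structure, and quantization are illustrative assumptions, not the Grosseto River data):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic water levels at 10 candidate cross sections over 200 time steps;
# a shared component makes the candidates partially redundant. All numbers
# are illustrative stand-ins for 1D hydraulic model output.
base = rng.normal(size=(200, 1))
levels = base + 0.3 * rng.normal(size=(200, 10))

# Quantize each series into 4 classes before computing entropies.
quantized = np.digitize(levels, np.quantile(levels, [0.25, 0.5, 0.75]))

def joint_entropy(cols):
    """Joint Shannon entropy (bits) of the selected columns."""
    _, counts = np.unique(quantized[:, cols], axis=0, return_counts=True)
    prob = counts / counts.sum()
    return float(-(prob * np.log2(prob)).sum())

# Greedy selection: each step adds the section that raises the joint entropy
# of the set the most, i.e. maximum new information, minimum redundancy.
selected, remaining = [], list(range(10))
for _ in range(3):
    best = max(remaining, key=lambda c: joint_entropy(selected + [c]))
    selected.append(best)
    remaining.remove(best)
```

Because joint entropy never decreases when a variable is added, the greedy objective is well defined; a redundant neighbour of an already-selected section adds little and is passed over.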

  9. Optimal cross-sectional sampling for river modelling with bridges: An information theory-based method

    International Nuclear Information System (INIS)

    Ridolfi, E.; Napolitano, F.; Alfonso, L.; Di Baldassarre, G.

    2016-01-01

    The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers’ cross-sectional spacing.

  10. Generalized information theory: aims, results, and open problems

    International Nuclear Information System (INIS)

    Klir, George J.

    2004-01-01

    The principal purpose of this paper is to present a comprehensive overview of generalized information theory (GIT): a research program whose objective is to develop a broad treatment of uncertainty-based information, not restricted to classical notions of uncertainty. After a brief overview of classical information theories, a broad framework for formalizing uncertainty and the associated uncertainty-based information of a great spectrum of conceivable types is sketched. The various theories of imprecise probabilities that have already been developed within this framework are then surveyed, focusing primarily on some important unifying principles applying to all these theories. This is followed by introducing two higher levels of the theories of imprecise probabilities: (i) the level of measuring the amount of relevant uncertainty (predictive, retrodictive, prescriptive, diagnostic, etc.) in any situation formalizable in each given theory, and (ii) the level of some methodological principles of uncertainty, which are contingent upon the capability to measure uncertainty and the associated uncertainty-based information. Various issues regarding both the measurement of uncertainty and the uncertainty principles are discussed. Again, the focus is on unifying principles applicable to all the theories. Finally, the current status of GIT is assessed and future research in the area is discussed

  11. How to Produce a Transdisciplinary Information Concept for a Universal Theory of Information?

    DEFF Research Database (Denmark)

    Brier, Søren

    2017-01-01

    …concept of information as a difference that makes a difference and in Luhmann’s triple autopoietic communication based system theory, where information is always a part of a message. Charles Sanders Peirce’s pragmaticist semiotics differs from other paradigms in that it integrates logic and information in interpretative semiotics. I therefore suggest alternatively building information theories based on semiotics from the basic relations of embodied living systems’ meaningful cognition and communication. I agree with Peircean biosemiotics that all transdisciplinary information concepts, in order to work across the natural, technical, social and humanistic sciences, must be defined as a part of real relational meaningful sign-processes manifesting as tokens. Thus Peirce’s information theory is empirically based in a realistic worldview, which through modern biosemiotics includes all living systems.

  12. Towards a critical theory of information

    Directory of Open Access Journals (Sweden)

    Christian Fuchs

    2009-11-01

    The debate on redistribution and recognition between critical theorists Nancy Fraser and Axel Honneth gives the opportunity to renew the discussion of the relationship of base and superstructure in critical social theory. Critical information theory needs to be aware of economic, political, and cultural demands that it needs to make in struggles for ending domination and oppression, and of the unifying role that the economy and class play in these demands and struggles. Objective and subjective information concepts are based on the underlying worldview of reification. Reification endangers human existence. Information as process and relation enables political and ethical alternatives that have radical implications for society.

  13. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
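Two of the supervised parameters named in this record, information gain and symmetrical uncertainty with equal-interval discretization, are standard and can be sketched directly. The data below are synthetic; this is not IMMAN's implementation:

```python
import numpy as np

rng = np.random.default_rng(7)
y = rng.integers(0, 2, size=400)                 # binary class labels
informative = y + rng.normal(0, 0.5, size=400)   # class-correlated feature
noise = rng.normal(size=400)                     # irrelevant feature

def discretize(x, n_bins=5):
    # Equal-interval discretization, as used by the tool described above.
    edges = np.linspace(x.min(), x.max(), n_bins + 1)[1:-1]
    return np.digitize(x, edges)

def entropy(labels):
    _, c = np.unique(labels, return_counts=True)
    p = c / c.sum()
    return float(-(p * np.log2(p)).sum())

def info_gain(x, y):
    """IG(X;Y) = H(Y) - H(Y|X) for discrete x."""
    cond = sum((x == v).mean() * entropy(y[x == v]) for v in np.unique(x))
    return entropy(y) - cond

def symmetrical_uncertainty(x, y):
    """SU = 2 IG / (H(X) + H(Y)), normalized to [0, 1]."""
    return 2 * info_gain(x, y) / (entropy(x) + entropy(y))

ig_good = info_gain(discretize(informative), y)
ig_noise = info_gain(discretize(noise), y)
```

Ranking features by either score places the class-correlated feature above the irrelevant one, which is the essence of rank-based supervised feature selection.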

  14. Rudolf Ahlswede’s lectures on information theory

    CERN Document Server

    Althöfer, Ingo; Deppe, Christian; Tamm, Ulrich

    Volume 1 : The volume “Storing and Transmitting Data” is based on Rudolf Ahlswede's introductory course on "Information Theory I" and presents an introduction to Shannon Theory. Readers, familiar or unfamiliar with the technical intricacies of Information Theory, will benefit considerably from working through the book; especially Chapter VI with its lively comments and uncensored insider views from the world of science and research offers informative and revealing insights. This is the first of several volumes that will serve as a collected research documentation of Rudolf Ahlswede’s lectures on information theory. Each volume includes comments from an invited well-known expert. Holger Boche contributed his insights in the supplement of the present volume. Classical information processing concerns the main tasks of gaining knowledge, storage, transmitting and hiding data. The first task is the prime goal of Statistics. For the next two, Shannon presented an impressive mathematical theory called Informat...

  15. Highly accurate fluorogenic DNA sequencing with information theory-based error correction.

    Science.gov (United States)

    Chen, Zitian; Zhou, Wenxiong; Qiao, Shuo; Kang, Li; Duan, Haifeng; Xie, X Sunney; Huang, Yanyi

    2017-12-01

    Eliminating errors in next-generation DNA sequencing has proved challenging. Here we present error-correction code (ECC) sequencing, a method to greatly improve sequencing accuracy by combining fluorogenic sequencing-by-synthesis (SBS) with an information theory-based error-correction algorithm. ECC embeds redundancy in sequencing reads by creating three orthogonal degenerate sequences, generated by alternate dual-base reactions. This is similar to encoding and decoding strategies that have proved effective in detecting and correcting errors in information communication and storage. We show that, when combined with a fluorogenic SBS chemistry with raw accuracy of 98.1%, ECC sequencing provides single-end, error-free sequences up to 200 bp. ECC approaches should enable accurate identification of extremely rare genomic variations in various applications in biology and medicine.
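The redundancy idea behind the three orthogonal degenerate sequences can be shown with a toy model: each of three dual-base partitions of {A, C, G, T} yields one bit per position, and every valid triple has even parity, so a single corrupted bit is detectable. This is a strong simplification of the published scheme, whose decoder also exploits reaction order and signal quality:

```python
# Three dual-base partitions of the four bases (illustrative encoding).
PARTITIONS = [("AC", "GT"), ("AG", "CT"), ("AT", "CG")]

def encode(base):
    """Map one base to its three partition bits."""
    return tuple(0 if base in low else 1 for low, high in PARTITIONS)

# All four codewords have even parity: A=(0,0,0), C=(0,1,1),
# G=(1,0,1), T=(1,1,0).
CODEBOOK = {encode(b): b for b in "ACGT"}

def decode(bits):
    """Return the base, or None if the parity check flags a read error."""
    if sum(bits) % 2 == 0:
        return CODEBOOK[tuple(bits)]
    return None   # one of the three degenerate reads disagrees
```

Any two of the three reads determine the base; the third acts as the parity check that makes single-read errors visible, which is the sense in which redundancy is "embedded in sequencing reads".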

  16. Cooperative Localization for Multi-AUVs Based on GM-PHD Filters and Information Entropy Theory

    Directory of Open Access Journals (Sweden)

    Lichuan Zhang

    2017-10-01

    Full Text Available Cooperative localization (CL is considered a promising method for underwater localization with respect to multiple autonomous underwater vehicles (multi-AUVs. In this paper, we propose a CL algorithm based on information entropy theory and the probability hypothesis density (PHD filter, aiming to enhance the global localization accuracy of the follower. In the proposed framework, the follower carries lower cost navigation systems, whereas the leaders carry better ones. Meanwhile, the leaders acquire the followers’ observations, including both measurements and clutter. Then, the PHD filters are utilized on the leaders and the results are communicated to the followers. The followers then perform weighted summation based on all received messages and obtain a final positioning result. Based on the information entropy theory and the PHD filter, the follower is able to acquire a precise knowledge of its position.

  17. Dynamic statistical information theory

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    In recent years we extended Shannon static statistical information theory to dynamic processes and established a Shannon dynamic statistical information theory, whose core is the evolution law of dynamic entropy and dynamic information. We also proposed a corresponding Boltzmann dynamic statistical information theory. Based on the fact that the state variable evolution equations of the respective dynamic systems, i.e. the Fokker-Planck equation and the Liouville diffusion equation, can be regarded as their information symbol evolution equations, we derived the nonlinear evolution equations of Shannon dynamic entropy density and dynamic information density and the nonlinear evolution equations of Boltzmann dynamic entropy density and dynamic information density, which describe respectively the evolution law of dynamic entropy and dynamic information. The evolution equations of these two kinds of dynamic entropy and dynamic information densities show in unison that the time rate of change of dynamic entropy densities is caused by their drift, diffusion and production in state variable space inside the systems and in coordinate space in the transmission processes; and that the time rate of change of dynamic information densities originates from their drift, diffusion and dissipation in state variable space inside the systems and in coordinate space in the transmission processes. Entropy and information have been combined with the state and its law of motion of the systems. Furthermore we presented the formulas of the two kinds of entropy production rates and information dissipation rates, and the expressions of the two kinds of drift information flows and diffusion information flows. We proved that the two kinds of information dissipation rates (or the decrease rates of the total information) are equal to their corresponding entropy production rates (or the increase rates of the total entropy) in the same dynamic system. We obtained the formulas of two kinds of dynamic mutual information and dynamic channel
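The entropy-production behaviour described for diffusion-type (Fokker-Planck) evolution can be checked numerically in the simplest case, pure diffusion of a probability density, where Shannon entropy is non-decreasing in time. The grid, diffusion constant, and initial condition below are illustrative:

```python
import numpy as np

# Explicit finite-difference step for dp/dt = D d2p/dx2 on a periodic grid.
n, D, dt, dx = 200, 1.0, 0.1, 1.0        # D*dt/dx^2 = 0.1 <= 0.5: stable
p = np.zeros(n)
p[n // 2] = 1.0                          # sharply localized initial density

def shannon_entropy(p):
    q = p[p > 0]
    return float(-(q * np.log(q)).sum())

entropies = []
for _ in range(300):
    entropies.append(shannon_entropy(p))
    lap = np.roll(p, 1) - 2 * p + np.roll(p, -1)   # discrete Laplacian
    p = p + D * dt / dx**2 * lap                   # probability-conserving step

# Entropy production: H(t) is non-decreasing under free diffusion.
assert all(b >= a - 1e-12 for a, b in zip(entropies, entropies[1:]))
```

Each update is a doubly stochastic averaging of neighbouring cells, so the monotone entropy growth follows from the concavity of the entropy function, a discrete analogue of the entropy production rate discussed above.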

  18. An Emerging Theory for Evidence Based Information Literacy Instruction in School Libraries, Part 2: Building a Culture of Inquiry

    Directory of Open Access Journals (Sweden)

    Carol A. Gordon

    2009-09-01

    Full Text Available Objective – The purpose of this paper is to articulate a theory for the use of action research as a tool of evidence based practice for information literacy instruction in school libraries. The emerging theory is intended to capture the complex phenomenon of information skills teaching as it is embedded in school curricula. Such a theory is needed to support research on the integrated approach to teaching information skills and knowledge construction within the framework of inquiry learning. Part 1 of this paper, in the previous issue, built a foundation for emerging theory, which established user‐centric information behavior and constructivist learning theory as the substantive theory behind evidence based library instruction in schools. Part 2 continues to build on the Information Search Process and Guided Inquiry as foundational to studying the information‐to‐knowledge connection and the concepts of help and intervention characteristic of 21st century school library instruction.Methods – This paper examines the purpose and methodology of action research as a tool of evidence based instruction. This is accomplished through the explication of three components of theory‐building: paradigm, substantive research, and metatheory. Evidence based practice is identified as the paradigm that contributes values and assumptions about school library instruction. It establishes the role of evidence in teaching and learning, linking theory and practice. Action research, as a tool of evidence based practice is defined as the synthesis of authentic learning, or performance‐based assessment practices that continuously generate evidence throughout the inquiry unit of instruction and traditional data collection methods typically used in formal research. This paper adds social psychology theory from Lewin’s work, which contributes methodology from Gestalt psychology, field theory, group dynamics, and change theory. For Lewin the purpose of action

  19. Processing Information in Quantum Decision Theory

    OpenAIRE

    Yukalov, V. I.; Sornette, D.

    2008-01-01

    A survey is given summarizing the state of the art of describing information processing in Quantum Decision Theory, which has been recently advanced as a novel variant of decision making, based on the mathematical theory of separable Hilbert spaces. This mathematical structure captures the effect of superposition of composite prospects, including many incorporated intended actions. The theory characterizes entangled decision making, non-commutativity of subsequent decisions, and intention int...

  20. The g-theorem and quantum information theory

    Energy Technology Data Exchange (ETDEWEB)

    Casini, Horacio; Landea, Ignacio Salazar; Torroba, Gonzalo [Centro Atómico Bariloche and CONICET,S.C. de Bariloche, Río Negro, R8402AGP (Argentina)

    2016-10-25

    We study boundary renormalization group flows between boundary conformal field theories in 1+1 dimensions using methods of quantum information theory. We define an entropic g-function for theories with impurities in terms of the relative entanglement entropy, and we prove that this g-function decreases along boundary renormalization group flows. This entropic g-theorem is valid at zero temperature and is independent of the g-theorem based on the thermal partition function. We also discuss the mutual information in boundary RG flows, and how it encodes the correlations between the impurity and bulk degrees of freedom. Our results provide a quantum-information understanding of (boundary) RG flow as an increase of distinguishability between the UV fixed point and the theory along the RG flow.
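The distinguishability measure underlying this construction is the quantum relative entropy, S(ρ‖σ) = Tr ρ(log ρ − log σ). A generic finite-dimensional illustration of that quantity on toy single-qubit states (unrelated to the boundary CFT setting of the paper):

```python
import numpy as np

def dm_log(rho, eps=1e-12):
    """Matrix logarithm of a density matrix via eigendecomposition."""
    vals, vecs = np.linalg.eigh(rho)
    vals = np.clip(vals, eps, None)       # guard against zero eigenvalues
    return vecs @ np.diag(np.log(vals)) @ vecs.conj().T

def relative_entropy(rho, sigma):
    """S(rho || sigma) = Tr[rho (log rho - log sigma)], in nats."""
    return float(np.real(np.trace(rho @ (dm_log(rho) - dm_log(sigma)))))

# Illustrative states: a mildly mixed qubit vs. the maximally mixed qubit.
rho = np.diag([0.9, 0.1])
sigma = np.eye(2) / 2.0

S = relative_entropy(rho, sigma)
```

Relative entropy is non-negative (Klein's inequality) and vanishes only for identical states, which is what makes it a measure of distinguishability along an RG flow.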

  1. Reasonable fermionic quantum information theories require relativity

    International Nuclear Information System (INIS)

    Friis, Nicolai

    2016-01-01

    We show that any quantum information theory based on anticommuting operators must be supplemented by a superselection rule deeply rooted in relativity to establish a reasonable notion of entanglement. While quantum information may be encoded in the fermionic Fock space, the unrestricted theory has a peculiar feature: the marginals of bipartite pure states need not have identical entropies, which leads to an ambiguous definition of entanglement. We solve this problem by proving that it is removed by relativity, i.e., by the parity superselection rule that arises from Lorentz invariance via the spin-statistics connection. Our results hence unveil a fundamental conceptual inseparability of quantum information and the causal structure of relativistic field theory. (paper)

  2. An information theory criteria based blind method for enumerating active users in DS-CDMA system

    Science.gov (United States)

    Samsami Khodadad, Farid; Abed Hodtani, Ghosheh

    2014-11-01

    In this paper, a new and blind algorithm for active user enumeration in asynchronous direct sequence code division multiple access (DS-CDMA) in a multipath channel scenario is proposed. The proposed method is based on information theory criteria. There are two main categories of information criteria which are widely used in active user enumeration: the Akaike Information Criterion (AIC) and the Minimum Description Length (MDL) criterion. The main difference between these two criteria is their penalty functions. Due to this difference, MDL is a consistent enumerator, which has better performance at higher signal-to-noise ratios (SNR), whereas AIC is preferred at lower SNRs. In the sequel, we propose an SNR-adaptive method based on subspace analysis and a trained genetic algorithm to obtain the performance of both. Moreover, our method uses only a single antenna, unlike the previous methods, which decreases hardware complexity. Simulation results show that the proposed method is capable of estimating the number of active users without any prior knowledge and demonstrate the efficiency of the method.
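The AIC and MDL enumeration criteria contrasted in this record are commonly written in the classical Wax-Kailath form over the eigenvalues of the sample covariance matrix. The sketch below applies that form to synthetic array data; the mixing model and parameters are illustrative, not the paper's DS-CDMA setup:

```python
import numpy as np

rng = np.random.default_rng(3)
p, N, K = 8, 1000, 3                     # sensors, snapshots, true source count
A = rng.normal(size=(p, K))              # illustrative mixing matrix
S = rng.normal(size=(K, N))              # illustrative source signals
X = A @ S + 0.1 * rng.normal(size=(p, N))

eig = np.sort(np.linalg.eigvalsh(X @ X.T / N))[::-1]   # sample eigenvalues

def log_likelihood_term(k):
    tail = eig[k:]                       # the p-k smallest eigenvalues
    ratio = tail.mean() / np.exp(np.log(tail).mean())  # arithmetic/geometric
    return N * (p - k) * np.log(ratio)

# Wax-Kailath style criteria; AIC is written up to an overall factor of 2,
# which does not change the argmin. MDL's log(N) penalty makes it consistent.
aic = [log_likelihood_term(k) + k * (2 * p - k) for k in range(p)]
mdl = [log_likelihood_term(k) + 0.5 * k * (2 * p - k) * np.log(N)
       for k in range(p)]

aic_estimate, mdl_estimate = int(np.argmin(aic)), int(np.argmin(mdl))
```

The heavier MDL penalty is exactly the difference the abstract refers to: at high SNR it suppresses the overestimation AIC is prone to.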

  3. Optimal design of hydrometric monitoring networks with dynamic components based on Information Theory

    Science.gov (United States)

    Alfonso, Leonardo; Chacon, Juan; Solomatine, Dimitri

    2016-04-01

    The EC-FP7 WeSenseIt project proposes the development of a Citizen Observatory of Water, aiming at enhancing environmental monitoring and forecasting with the help of citizens equipped with low-cost sensors and personal devices such as smartphones and smart umbrellas. In this regard, Citizen Observatories may complement the limited data availability in terms of spatial and temporal density, which is of interest, among other areas, for improving hydraulic and hydrological models. At this point, the following question arises: how can citizens, who are part of a citizen observatory, be optimally guided so that the data they collect and send are useful for improving modelling and water management? This research proposes a new methodology to identify the optimal location and timing of potential observations coming from moving sensors of hydrological variables. The methodology is based on Information Theory, which has been widely used in hydrometric monitoring design [1-4]. In particular, it uses the concept of Joint Entropy as a measure of the amount of information contained in a set of random variables, which, in our case, correspond to the time series of hydrological variables captured at given locations in a catchment. The methodology presented is a step forward in the state of the art because it solves the multiobjective optimisation problem of simultaneously finding the minimum number of informative and non-redundant sensors needed for a given time, so that the best configuration of monitoring sites is found at every particular moment in time. To this end, the existing algorithms have been improved to make them efficient. The method is applied to cases in The Netherlands, the UK and Italy and proves to have great potential to complement the existing in-situ monitoring networks. [1] Alfonso, L., A. Lobbrecht, and R. Price (2010a), Information theory-based approach for location of monitoring water level gauges in polders, Water Resour. Res., 46(3), W03528. [2] Alfonso, L., A
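The redundancy that the monitoring design tries to minimise is itself an information-theoretic quantity: for two sensors it is their mutual information, H(a) + H(b) − H(a, b). A sketch on invented quantized gauge data (the 90% duplication rate is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(5)
# Three quantized sensor series: gauge b mostly copies gauge a (redundant),
# gauge c is independent. All data are illustrative.
a = rng.integers(0, 4, size=1000)
b = np.where(rng.random(1000) < 0.9, a, rng.integers(0, 4, size=1000))
c = rng.integers(0, 4, size=1000)

def H(*series):
    """Joint Shannon entropy in bits."""
    _, counts = np.unique(np.stack(series, axis=1), axis=0,
                          return_counts=True)
    prob = counts / counts.sum()
    return float(-(prob * np.log2(prob)).sum())

# Redundancy of a sensor pair = H(a) + H(b) - H(a, b).
redundancy_ab = H(a) + H(b) - H(a, b)    # near-duplicate gauges: high
redundancy_ac = H(a) + H(c) - H(a, c)    # independent gauges: near zero
```

A network design procedure would keep gauge c alongside gauge a and drop gauge b, since b contributes almost no information not already captured.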

  4. Hybrid Multicriteria Group Decision Making Method for Information System Project Selection Based on Intuitionistic Fuzzy Theory

    Directory of Open Access Journals (Sweden)

    Jian Guo

    2013-01-01

    Full Text Available Information system (IS) project selection is of critical importance to every organization in a dynamic competing environment. The aim of this paper is to develop a hybrid multicriteria group decision making approach based on intuitionistic fuzzy theory for IS project selection. The decision makers’ assessment information can be expressed in the form of real numbers, interval-valued numbers, linguistic variables, and intuitionistic fuzzy numbers (IFNs). All of these evaluation values can be transformed into IFNs. The intuitionistic fuzzy weighted averaging (IFWA) operator is utilized to aggregate the individual opinions of decision makers into a group opinion. Intuitionistic fuzzy entropy is used to obtain the entropy weights of the criteria. A TOPSIS method combined with intuitionistic fuzzy sets is proposed to select the appropriate IS project in a group decision making environment. Finally, a numerical example of information system project selection is given to illustrate the application of the hybrid multicriteria group decision making (MCGDM) method based on intuitionistic fuzzy theory and the TOPSIS method.
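The IFWA aggregation step named in this record has a standard closed form: for IFNs (μᵢ, νᵢ) and weights wᵢ, the group opinion is μ = 1 − Π(1 − μᵢ)^wᵢ and ν = Π νᵢ^wᵢ. A sketch with invented ratings and expert weights (not the paper's numerical example):

```python
import numpy as np

# Three decision makers rate one IS project as intuitionistic fuzzy numbers
# (membership mu_i, non-membership nu_i); ratings and weights are invented.
ratings = np.array([[0.7, 0.2],
                    [0.6, 0.3],
                    [0.8, 0.1]])
weights = np.array([0.40, 0.35, 0.25])        # expert weights, sum to 1

# IFWA aggregation: mu = 1 - prod(1 - mu_i)^w_i,  nu = prod(nu_i)^w_i
mu = 1.0 - np.prod((1.0 - ratings[:, 0]) ** weights)
nu = np.prod(ratings[:, 1] ** weights)
hesitancy = 1.0 - mu - nu                     # pi, residual indeterminacy

# The aggregate is again a valid IFN: mu, nu in [0, 1] and mu + nu <= 1.
assert 0.0 <= mu <= 1.0 and 0.0 <= nu <= 1.0 and mu + nu <= 1.0
```

The aggregated IFN per criterion would then feed the entropy weighting and TOPSIS ranking steps described above.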

  5. Uncertainty analysis of an integrated energy system based on information theory

    International Nuclear Information System (INIS)

    Fu, Xueqian; Sun, Hongbin; Guo, Qinglai; Pan, Zhaoguang; Xiong, Wen; Wang, Li

    2017-01-01

    Currently, a custom-designed configuration of different renewable technologies named the integrated energy system (IES) has become popular due to its high efficiency, benefiting from complementary multi-energy technologies. This paper proposes an information entropy approach to quantify uncertainty in an integrated energy system based on a stochastic model that drives a power system model derived from an actual network on Barry Island. Due to the complexity of co-behaviours between generators, a copula-based approach is utilized to articulate the dependency structure of the generator outputs with regard to such factors as weather conditions. Correlation coefficients and mutual information, which are effective for assessing the dependence relationships, are applied to judge whether the stochastic IES model is correct. The calculated information values can be used to analyse the impacts of the coupling of power and heat on power flows and heat flows, and this approach will be helpful for improving the operation of IES. - Highlights: • The paper explores uncertainty of an integrated energy system. • The dependent weather model is verified from the perspective of correlativity. • The IES model considers the dependence between power and heat. • The information theory helps analyse the complexity of IES operation. • The application of the model is studied using an operational system on Barry Island.

  6. Analysis and Comparison of Information Theory-based Distances for Genomic Strings

    Science.gov (United States)

    Balzano, Walter; Cicalese, Ferdinando; Del Sorbo, Maria Rosaria; Vaccaro, Ugo

    2008-07-01

    Genomic string comparison via alignment is widely applied for mining and retrieval of information in biological databases. In some situations, the effectiveness of such alignment-based comparison is still unclear, e.g., for sequences of non-uniform length and with significant shuffling of identical substrings. An alternative approach is one based on information theory distances. Biological information content is stored in very long strings of only four characters. In the last ten years, several entropic measures have been proposed for genomic string analysis. Notwithstanding their individual merit and experimental validation, to the best of our knowledge, there is no direct comparison of these different metrics. We present four of the most representative alignment-free distance measures, based on mutual information. Each one has a different origin and expression. Our comparison involves a kind of unification, reducing the different concepts to a single formalism, so that a phylogenetic tree could be constructed for each of them. The trees produced via these metrics are compared to the ones widely accepted as biologically validated. In general, the results provide further evidence of the reliability of alignment-free distance models. We also observe that one of the metrics appears to be more robust than the other three. We believe that this result can be the object of further research and observation. Many of the experimental results, graphics and tables are available at the following URL: http://people.na.infn.it/˜wbalzano/BIO
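    The record does not specify the four measures; as a hedged illustration of one common alignment-free, information-theoretic distance, the Jensen-Shannon divergence between k-mer frequency distributions can be sketched as follows (the strings are hypothetical):

```python
import math
from collections import Counter

def kmer_dist(s, k=2):
    """Empirical k-mer frequency distribution of a string."""
    counts = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def js_divergence(p, q):
    """Jensen-Shannon divergence (bits): symmetric, bounded in [0, 1]."""
    keys = set(p) | set(q)
    m = {w: 0.5 * (p.get(w, 0.0) + q.get(w, 0.0)) for w in keys}

    def kl(a, b):
        return sum(a[w] * math.log2(a[w] / b[w]) for w in a if a[w] > 0)

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

s1 = "ACGTACGTACGT"
s2 = "ACGTACGAACGT"   # one substitution away from s1
s3 = "TTTTTTTTCCCC"   # very different composition
print(js_divergence(kmer_dist(s1), kmer_dist(s2)))  # small
print(js_divergence(kmer_dist(s1), kmer_dist(s3)))  # larger
```

    Because the distance depends only on substring statistics, it is insensitive to the length mismatches and substring shuffling that degrade alignment-based comparison.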

  7. Realism and Antirealism in Informational Foundations of Quantum Theory

    Directory of Open Access Journals (Sweden)

    Tina Bilban

    2014-08-01

    Zeilinger-Brukner's informational foundations of quantum theory, a theory based on Zeilinger's foundational principle for quantum mechanics that an elementary system carries one bit of information, explains seemingly unintuitive quantum behavior with a simple theoretical framework. It is based on the notion that a distinction between reality and information cannot be made, and therefore they are the same. As critics of the informational foundations of quantum theory show, this antirealistic move traps the theory in tautology, where information refers only to itself, while the relationships outside the information, with the help of which the nature of information could be defined, are lost, and the questions "Whose information? Information about what?" cannot be answered. The critics' solution is a return to realism, where the observer's effects on the information are neglected. We show that the radical antirealism of the informational foundations of quantum theory is not necessary and that the return to realism is not the only way forward. A comprehensive approach that exceeds mere realism and antirealism is also possible: we can consider both sources of the constraints on the information, those coming from the observer and those coming from the observed system/nature/reality. The information is always the observer's information about the observed. Such a comprehensive philosophical approach can still support the theoretical framework of informational foundations of quantum theory: if we take it that one bit is the smallest amount of information in the form of which the observed reality can be grasped by the observer, we can say that an elementary system (grasped and defined as such by the observer) correlates to one bit of information. Our approach thus explains all the features of quantum behavior explained by the informational foundations of quantum theory: the wave function and its collapse, entanglement, complementarity and quantum randomness. 
However, it does

  8. Science and information theory

    CERN Document Server

    Brillouin, Léon

    1962-01-01

    A classic source for exploring the connections between information theory and physics, this text is geared toward upper-level undergraduates and graduate students. The author, a giant of 20th-century mathematics, applies the principles of information theory to a variety of issues, including Maxwell's demon, thermodynamics, and measurement problems. 1962 edition.

  9. Information Design Theories

    Science.gov (United States)

    Pettersson, Rune

    2014-01-01

    Information design has practical and theoretical components. As an academic discipline we may view information design as a combined discipline, a practical theory, or as a theoretical practice. So far information design has incorporated facts, influences, methods, practices, principles, processes, strategies, and tools from a large number of…

  10. Constructor theory of information

    Science.gov (United States)

    Deutsch, David; Marletto, Chiara

    2015-01-01

    We propose a theory of information expressed solely in terms of which transformations of physical systems are possible and which are impossible—i.e. in constructor-theoretic terms. It includes conjectured, exact laws of physics expressing the regularities that allow information to be physically instantiated. Although these laws are directly about information, independently of the details of particular physical instantiations, information is not regarded as an a priori mathematical or logical concept, but as something whose nature and properties are determined by the laws of physics alone. This theory solves a problem at the foundations of existing information theory, namely that information and distinguishability are each defined in terms of the other. It also explains the relationship between classical and quantum information, and reveals the single, constructor-theoretic property underlying the most distinctive phenomena associated with the latter, including the lack of in-principle distinguishability of some states, the impossibility of cloning, the existence of pairs of variables that cannot simultaneously have sharp values, the fact that measurement processes can be both deterministic and unpredictable, the irreducible perturbation caused by measurement, and locally inaccessible information (as in entangled systems). PMID:25663803

  11. Theories of information behavior

    CERN Document Server

    Erdelez, Sandra; McKechnie, Lynne

    2005-01-01

    This unique book presents authoritative overviews of more than 70 conceptual frameworks for understanding how people seek, manage, share, and use information in different contexts. A practical and readable reference to both well-established and newly proposed theories of information behavior, the book includes contributions from 85 scholars from 10 countries. Each theory description covers origins, propositions, methodological implications, usage, links to related conceptual frameworks, and listings of authoritative primary and secondary references. The introductory chapters explain key concepts, theory–method connections, and the process of theory development.

  12. The criteria for selecting a method for unfolding neutron spectra based on the information entropy theory

    International Nuclear Information System (INIS)

    Zhu, Qingjun; Song, Fengquan; Ren, Jie; Chen, Xueyong; Zhou, Bin

    2014-01-01

    To further expand the application of artificial neural networks in the field of neutron spectrometry, criteria for choosing between an artificial neural network and the maximum entropy method for unfolding neutron spectra were presented. The counts of the Bonner spheres for IAEA neutron spectra were used as a database, and the artificial neural network and the maximum entropy method were used to unfold neutron spectra; the mean squares of the spectra were defined as the differences between the desired and unfolded spectra. After the information entropy of each spectrum was calculated using information entropy theory, the relationship between the mean squares of the spectra and the information entropy was acquired. Useful information from the information entropy guided the selection of unfolding methods. Due to the importance of the information entropy, a method for predicting the information entropy from the Bonner spheres' counts was established. The criteria based on information entropy theory can be used to choose between the artificial neural network and maximum entropy unfolding methods, thereby expanding the application of artificial neural networks to unfolding neutron spectra. - Highlights: • Two neutron spectra unfolding methods, ANN and MEM, were compared. • The spectrum's entropy offers useful information for selecting unfolding methods. • For spectra with low entropy, the ANN was generally better than MEM. • The spectrum's entropy was predicted based on the Bonner spheres' counts
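    The information entropy of a spectrum referred to above is simply the Shannon entropy of the normalised bin contents; a minimal sketch (the bin counts are hypothetical):

```python
import math

def spectrum_entropy(counts):
    """Shannon information entropy (bits) of a spectrum's normalised bins."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

flat = [1, 1, 1, 1]      # evenly spread flux: maximal entropy
peaked = [97, 1, 1, 1]   # nearly all flux in one bin: low entropy
print(spectrum_entropy(flat))    # → 2.0
print(spectrum_entropy(peaked))  # much lower
```

    Low-entropy (peaked) spectra are exactly the cases where, per the highlights above, the ANN tended to outperform the maximum entropy method.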

  13. Introduction to coding and information theory

    CERN Document Server

    Roman, Steven

    1997-01-01

    This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and such specific codes as Hamming codes, the simplex codes, and many others.

  14. Implications of Information Theory for Computational Modeling of Schizophrenia.

    Science.gov (United States)

    Silverstein, Steven M; Wibral, Michael; Phillips, William A

    2017-10-01

    Information theory provides a formal framework within which information processing and its disorders can be described. However, information theory has rarely been applied to modeling aspects of the cognitive neuroscience of schizophrenia. The goal of this article is to highlight the benefits of an approach based on information theory, including its recent extensions, for understanding several disrupted neural goal functions as well as related cognitive and symptomatic phenomena in schizophrenia. We begin by demonstrating that foundational concepts from information theory, such as Shannon information, entropy, data compression, block coding, and strategies to increase the signal-to-noise ratio, can be used to provide novel understandings of cognitive impairments in schizophrenia and metrics to evaluate their integrity. We then describe more recent developments in information theory, including the concepts of infomax, coherent infomax, and coding with synergy, to demonstrate how these can be used to develop computational models of schizophrenia-related failures in the tuning of sensory neurons, gain control, perceptual organization, thought organization, selective attention, context processing, predictive coding, and cognitive control. Throughout, we demonstrate how disordered mechanisms may explain both perceptual/cognitive changes and symptom emergence in schizophrenia. Finally, we demonstrate that there is consistency between some information-theoretic concepts and recent discoveries in neurobiology, especially involving the existence of distinct sites for the accumulation of driving input and contextual information prior to their interaction. This convergence can be used to guide future theory, experiment, and treatment development.

  15. Reference group theory with implications for information studies: a theoretical essay

    Directory of Open Access Journals (Sweden)

    E. Murell Dawson

    2001-01-01

    This article explores the role and implications of reference group theory in relation to the field of library and information science. Reference group theory is based upon the principle that people take the standards of significant others as a basis for making self-appraisals, comparisons, and choices regarding the need for and use of information. Research that applies concepts of reference group theory to various sectors of library and information studies can provide data useful in enhancing areas such as information-seeking research, special populations, and uses of information. The implications are promising: knowledge gained from such research can help information professionals better understand the role theory plays in examining the ways in which people manage their information and social worlds.

  16. Finding an information concept suited for a universal theory of information.

    Science.gov (United States)

    Brier, Søren

    2015-12-01

    The view argued in this article is that if we want to define a universal concept of information covering subjective experiential and meaningful cognition - as well as intersubjective meaningful communication in nature, technology, society and life worlds - then the main problem is to decide which epistemological, ontological and philosophy of science framework the concept of information should be based on and integrated in. All the ontological attempts to create objective concepts of information result in concepts that cannot encompass the meaning and experience of embodied living and social systems. There is no conclusive evidence that the core of reality across nature, culture, life and mind is purely mathematical, logical or computational in nature. Therefore the core of the information concept should not be based only on pure logical or mathematical rationality. We need to include interpretation, signification and meaning construction in our transdisciplinary framework for information as a basic aspect of reality alongside the physical, chemical and molecular biological. Dretske defines information as the content of new, true, meaningful, and understandable knowledge. According to this widely held definition, information in a transdisciplinary theory cannot be 'objective', but has to be relativized in relation to the receiver's knowledge, as also proposed by Floridi. It is difficult to produce a quantitative statement independently of a qualitative analysis based on some sort of relation to the human condition as a semiotic animal. I therefore suggest, as an alternative, building information theories based on semiotics, starting from the basic relations of meaningful cognition and communication in embodied living systems. I agree with Peircean biosemiotics that all information must be part of real relational sign-processes manifesting as tokens. Copyright © 2015. Published by Elsevier Ltd.

  17. Quantum information theory

    CERN Document Server

    Wilde, Mark M

    2017-01-01

    Developing many of the major, exciting, pre- and post-millennium developments from the ground up, this book is an ideal entry point for graduate students into quantum information theory. Significant attention is given to quantum mechanics for quantum information theory, and careful studies of the important protocols of teleportation, superdense coding, and entanglement distribution are presented. In this new edition, readers can expect to find over 100 pages of new material, including detailed discussions of Bell's theorem, the CHSH game, Tsirelson's theorem, the axiomatic approach to quantum channels, the definition of the diamond norm and its interpretation, and a proof of the Choi–Kraus theorem. Discussion of the importance of the quantum dynamic capacity formula has been completely revised, and many new exercises and references have been added. This new edition will be welcomed by the upcoming generation of quantum information theorists and the already established community of classical information theo...

  18. Financial markets theory equilibrium, efficiency and information

    CERN Document Server

    Barucci, Emilio

    2017-01-01

    This work, now in a thoroughly revised second edition, presents the economic foundations of financial markets theory from a mathematically rigorous standpoint and offers a self-contained critical discussion based on empirical results. It is the only textbook on the subject to include more than two hundred exercises, with detailed solutions to selected exercises. Financial Markets Theory covers classical asset pricing theory in great detail, including utility theory, equilibrium theory, portfolio selection, mean-variance portfolio theory, CAPM, CCAPM, APT, and the Modigliani-Miller theorem. Starting from an analysis of the empirical evidence on the theory, the authors provide a discussion of the relevant literature, pointing out the main advances in classical asset pricing theory and the new approaches designed to address asset pricing puzzles and open problems (e.g., behavioral finance). Later chapters in the book contain more advanced material, including on the role of information in financial markets, non-c...

  19. Optimised Selection of Stroke Biomarker Based on Svm and Information Theory

    Directory of Open Access Journals (Sweden)

    Wang Xiang

    2017-01-01

    With the development of molecular biology and gene-engineering technology, gene diagnosis has become an emerging approach in the modern life sciences. Biological markers, recognized as a hot topic in the molecular and gene fields, have important value in early diagnosis, malignant tumor staging, treatment and therapeutic efficacy evaluation. So far, researchers have not found an effective way to predict and distinguish different types of stroke. In this paper, we aim to optimize stroke biomarkers and identify effective stroke detection indices based on SVM (support vector machine) and information theory. Mutual information analysis and principal component analysis are used to complete the selection of biomarkers, and an SVM is then used to verify the model. Using the testing data of patients provided by Xuanwu Hospital, we explore the significant markers of stroke through data analysis. Our model predicts stroke well. Finally, we discuss the effect of each biomarker on the incidence of stroke.
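    Mutual-information-based biomarker ranking of the kind described can be sketched as follows (this is not the authors' code; the markers, labels, and binning are hypothetical):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits for two discrete, equal-length sequences."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    return sum(
        (c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in pxy.items()
    )

# Toy data: two binned biomarkers and a stroke (1) / no-stroke (0) label.
labels  = [1, 1, 1, 1, 0, 0, 0, 0]
markers = {
    "marker1": [1, 1, 1, 0, 0, 0, 0, 0],   # tracks the label: informative
    "marker2": [0, 1, 0, 1, 0, 1, 0, 1],   # independent of the label
}
ranking = sorted(markers, key=lambda name: -mutual_information(markers[name], labels))
print(ranking)  # → ['marker1', 'marker2']
```

    The highest-ranked markers would then be passed (after dimensionality reduction) to the SVM classifier; an uninformative marker has mutual information close to zero and drops to the bottom of the ranking.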

  20. An introduction to information theory

    CERN Document Server

    Reza, Fazlollah M

    1994-01-01

    Graduate-level study for engineering students presents elements of modern probability theory, information theory, coding theory, more. Emphasis on sample space, random variables, capacity, etc. Many reference tables and extensive bibliography. 1961 edition.

  1. Multimedia information retrieval theory and techniques

    CERN Document Server

    Raieli, Roberto

    2013-01-01

    Novel processing and searching tools for the management of new multimedia documents have been developed. Multimedia Information Retrieval (MMIR) is an organic system made up of Text Retrieval (TR), Visual Retrieval (VR), Video Retrieval (VDR), and Audio Retrieval (AR) systems. So that each type of digital document may be analysed and searched by the elements of language appropriate to its nature, search criteria must be extended. Such an approach is known as Content-Based Information Retrieval (CBIR), and is the core of MMIR. This novel content-based concept of information handling needs to be integrated with more traditional semantics. Multimedia Information Retrieval focuses on the tools of processing and searching applicable to the content-based management of new multimedia documents. Translated from Italian by Giles Smith, the book is divided into two parts. Part one discusses MMIR and related theories, and puts forward new methodologies; part two reviews various experimental and operating MMIR systems, a...

  2. Self-Instructional Module Based on Cognitive Load Theory: A Study on Information Retention among Trainee Teachers

    Science.gov (United States)

    Ong, Chiek Pin; Tasir, Zaidatun

    2015-01-01

    The aim of the research is to study the information retention among trainee teachers using a self-instructional printed module based on Cognitive Load Theory for learning spreadsheet software. Effective pedagogical considerations integrating the theoretical concepts related to cognitive load are reflected in the design and development of the…

  3. Quantum information and relativity theory

    International Nuclear Information System (INIS)

    Peres, Asher; Terno, Daniel R.

    2004-01-01

    This article discusses the intimate relationship between quantum mechanics, information theory, and relativity theory. Taken together these are the foundations of present-day theoretical physics, and their interrelationship is an essential part of the theory. The acquisition of information from a quantum system by an observer occurs at the interface of classical and quantum physics. The authors review the essential tools needed to describe this interface, i.e., Kraus matrices and positive-operator-valued measures. They then discuss how special relativity imposes severe restrictions on the transfer of information between distant systems and the implications of the fact that quantum entropy is not a Lorentz-covariant concept. This leads to a discussion of how it comes about that Lorentz transformations of reduced density matrices for entangled systems may not be completely positive maps. Quantum field theory is, of course, necessary for a consistent description of interactions. Its structure implies a fundamental tradeoff between detector reliability and localizability. Moreover, general relativity produces new and counterintuitive effects, particularly when black holes (or, more generally, event horizons) are involved. In this more general context the authors discuss how most of the current concepts in quantum information theory may require a reassessment

  4. Selective information seeking: can consumers' avoidance of evidence-based information on colorectal cancer screening be explained by the theory of cognitive dissonance?

    Science.gov (United States)

    Steckelberg, Anke; Kasper, Jürgen; Mühlhauser, Ingrid

    2007-08-27

    Evidence-based patient information (EBPI) is a prerequisite for informed decision-making. However, the presentation of EBPI may lead to irrational reactions causing avoidance, minimisation and devaluation of the information. To explore whether the theory of cognitive dissonance is applicable to medical decision-making and useful to explain these phenomena. 261 volunteers from Hamburg (157 women), ≥50 years old, without a diagnosis of colorectal cancer. DESIGN AND VARIABLES: Within an experiment we simulated information seeking on colorectal cancer screening. Consumers' attitudes towards screening were surveyed using a rating scale from -5 (participate in no way) to +5 (participate unconditionally) (independent variable). Using a cover story, participants were asked to sort 5 article headlines according to their reading preferences. The headlines simulated the pro-to-contra range of content found in the print media on colorectal cancer screening. The dependent variable was the sequence of article headlines. Participants were very much in favour of screening, with scores of 4.0 (0.1) for the faecal occult blood test and 3.3 (0.1) for colonoscopy. Consistent with our hypothesis, we found statistically significant positive correlations between the stimuli in favour of screening and attitudes, and significant negative correlations between the stimuli against screening and attitudes. The theory of cognitive dissonance is applicable to medical decision-making. It may explain some phenomena of irrational reactions to evidence-based patient information.

  5. Geometrical identification of quantum and information theories

    International Nuclear Information System (INIS)

    Caianiello, E.R.

    1983-01-01

    The interrelation of quantum and information theories is investigated on the basis of the concept of cross-entropy. It is assumed that ''complex information geometry'' may serve as a tool for ''technological transfer'' from one research field to another that is not directly connected with the first. It is pointed out that the ''infinitesimal distance'' ds² and the ''infinitesimal cross-entropy'' dH_c coincide.

  6. Towards an Information Retrieval Theory of Everything

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Lammerink, J.M.W.; Katoen, Joost P.; Kok, J.N.; van de Pol, Jan Cornelis; Raamsdonk, F.

    2009-01-01

    I present three well-known probabilistic models of information retrieval in tutorial style: The binary independence probabilistic model, the language modeling approach, and Google's page rank. Although all three models are based on probability theory, they are very different in nature. Each model
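    As an illustration of one of the three models named above, the language modeling approach scores a document by the smoothed probability of its language model generating the query; a minimal sketch with Jelinek-Mercer smoothing (the toy corpus is hypothetical):

```python
import math
from collections import Counter

def lm_score(query, doc, collection, lam=0.5):
    """Query-likelihood score (log space) with Jelinek-Mercer smoothing:
    p(t|d) = lam * tf(t,d)/|d| + (1 - lam) * cf(t)/|C|."""
    d, c = Counter(doc), Counter(collection)
    dlen, clen = len(doc), len(collection)
    score = 0.0
    for term in query:
        p = lam * d[term] / dlen + (1 - lam) * c[term] / clen
        score += math.log(p) if p > 0 else float("-inf")
    return score

# Toy two-document collection.
docs = [["information", "retrieval", "theory"],
        ["cooking", "pasta", "recipes"]]
collection = [t for d in docs for t in d]
query = ["information", "retrieval"]
ranked = sorted(docs, key=lambda d: -lm_score(query, d, collection))
print(ranked[0])  # → ['information', 'retrieval', 'theory']
```

    Smoothing with the collection model keeps a single missing query term from zeroing out a document's score, which is the key practical ingredient of the language modeling approach.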

  7. [Prediction of regional soil quality based on mutual information theory integrated with decision tree algorithm].

    Science.gov (United States)

    Lin, Fen-Fang; Wang, Ke; Yang, Ning; Yan, Shi-Guang; Zheng, Xin-Yu

    2012-02-01

    In this paper, to precisely characterize the spatial distribution of regional soil quality, the main factors affecting it, such as soil type, land use pattern, lithology type, topography, road, and industry type, were considered; mutual information theory was adopted to select the main environmental factors, and the decision tree algorithm See 5.0 was applied to predict the grade of regional soil quality. The main factors affecting regional soil quality were soil type, land use, lithology type, distance to town, distance to water area, altitude, distance to road, and distance to industrial land. The prediction accuracy of the decision tree model with the variables selected by mutual information was clearly higher than that of the model with all variables; for the former model, whether expressed as a decision tree or as decision rules, the prediction accuracy was above 80%. For continuous and categorical data, mutual information theory integrated with the decision tree could not only reduce the number of input parameters for the decision tree algorithm, but also predict and assess regional soil quality effectively.

  8. The theory of quantum information

    CERN Document Server

    Watrous, John

    2018-01-01

    This largely self-contained book on the theory of quantum information focuses on precise mathematical formulations and proofs of fundamental facts that form the foundation of the subject. It is intended for graduate students and researchers in mathematics, computer science, and theoretical physics seeking to develop a thorough understanding of key results, proof techniques, and methodologies that are relevant to a wide range of research topics within the theory of quantum information and computation. The book is accessible to readers with an understanding of basic mathematics, including linear algebra, mathematical analysis, and probability theory. An introductory chapter summarizes these necessary mathematical prerequisites, and starting from this foundation, the book includes clear and complete proofs of all results it presents. Each subsequent chapter includes challenging exercises intended to help readers to develop their own skills for discovering proofs concerning the theory of quantum information.

  9. Quantum information theory and quantum statistics

    International Nuclear Information System (INIS)

    Petz, D.

    2008-01-01

    Based on lectures given by the author, this book focuses on providing reliable introductory explanations of key concepts of quantum information theory and quantum statistics - rather than on results. The mathematically rigorous presentation is supported by numerous examples and exercises and by an appendix summarizing the relevant aspects of linear analysis. Assuming that the reader is familiar with the content of standard undergraduate courses in quantum mechanics, probability theory, linear algebra and functional analysis, the book addresses graduate students of mathematics and physics as well as theoretical and mathematical physicists. Conceived as a primer to bridge the gap between statistical physics and quantum information, a field to which the author has contributed significantly himself, it emphasizes concepts and thorough discussions of the fundamental notions to prepare the reader for deeper studies, not least through the selection of well chosen exercises. (orig.)

  10. Multi-Sensor Building Fire Alarm System with Information Fusion Technology Based on D-S Evidence Theory

    Directory of Open Access Journals (Sweden)

    Qian Ding

    2014-10-01

    Multi-sensor information fusion technology based on Dempster-Shafer evidence theory is applied in a building fire alarm system to realize early detection and alarming. By using multiple sensors to monitor parameters of the fire process, such as light, smoke, temperature, gas and moisture, the range of fire monitoring in space and time is expanded compared with a single-sensor system. The D-S evidence theory is then applied to fuse the information from the multiple sensors with a specific fire model, making the fire alarm more accurate and timely. The proposed method can effectively tolerate failures in the monitoring data, robustly handle conflicting evidence from the multiple sensors, and significantly improve the reliability of fire warning.
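    Dempster's rule of combination, the core of the fusion step described above, can be sketched as follows (the sensor mass assignments are hypothetical):

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments with Dempster's rule.

    m1 and m2 map focal elements (frozensets of hypotheses) to masses;
    mass on intersecting elements is kept, conflicting mass is renormalised away.
    """
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict; evidence cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

FIRE, NO_FIRE = frozenset({"fire"}), frozenset({"no_fire"})
THETA = FIRE | NO_FIRE  # full frame: undecided mass

# Hypothetical mass assignments from a smoke sensor and a temperature sensor.
smoke = {FIRE: 0.7, THETA: 0.3}
temp = {FIRE: 0.6, NO_FIRE: 0.1, THETA: 0.3}
fused = dempster_combine(smoke, temp)
print(fused[FIRE])  # belief in fire rises above either sensor alone
```

    Two weakly confident sensors agreeing on "fire" combine into a much stronger belief (about 0.87 here), which is what enables earlier and more reliable alarms than any single sensor.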

  11. Rolling bearing fault diagnosis based on information fusion using Dempster-Shafer evidence theory

    Science.gov (United States)

    Pei, Di; Yue, Jianhai; Jiao, Jing

    2017-10-01

    This paper presents a fault diagnosis method for rolling bearings based on information fusion. Acceleration sensors are arranged at different positions to acquire bearing vibration data as diagnostic evidence. The Dempster-Shafer (D-S) evidence theory is used to fuse the multi-sensor data and improve diagnostic accuracy. The efficiency of the proposed method is demonstrated on a high-speed train transmission test bench. The experimental results show that the proposed method improves rolling bearing fault diagnosis accuracy compared with traditional signal analysis methods.

  12. Theory-based explanation as intervention.

    Science.gov (United States)

    Weisman, Kara; Markman, Ellen M

    2017-10-01

    Cogent explanations are an indispensable means of providing new information and an essential component of effective education. Beyond this, we argue that there is tremendous untapped potential in using explanations to motivate behavior change. In this article we focus on health interventions. We review four case studies that used carefully tailored explanations to address gaps and misconceptions in people's intuitive theories, providing participants with a conceptual framework for understanding how and why some recommended behavior is an effective way of achieving a health goal. These case studies targeted a variety of health-promoting behaviors: (1) children washing their hands to prevent viral epidemics; (2) parents vaccinating their children to stem the resurgence of infectious diseases; (3) adults completing the full course of an antibiotic prescription to reduce antibiotic resistance; and (4) children eating a variety of healthy foods to improve unhealthy diets. Simply telling people to engage in these behaviors has been largely ineffective; if anything, concern about these issues is mounting. But in each case, teaching participants coherent explanatory frameworks for understanding health recommendations has shown great promise, with such theory-based explanations outperforming state-of-the-art interventions from national health authorities. We contrast theory-based explanations both with simply listing facts, information, and advice and with providing a full-blown educational curriculum, and argue for providing the minimum amount of information required to understand the causal link between a target behavior and a health outcome. We argue that such theory-based explanations lend people the motivation and confidence to act on their new understanding.

  13. Finding an Information Concept Suited for a Universal Theory of Information

    DEFF Research Database (Denmark)

    Brier, Søren

    2015-01-01

    There is no conclusive evidence that the core of reality across nature, culture, life and mind is purely mathematical, logical or computational in nature. Therefore the core of the information concept should not be based only on pure logical or mathematical rationality. We need to include interpretation... the definition of information in a transdisciplinary theory cannot be 'objective', but has to be relativized in relation to the receiver's knowledge, as also proposed by Floridi. It is difficult to produce a quantitative statement independently of a qualitative analysis based on some sort of relation to the human...

  14. Wave theory of information

    CERN Document Server

    Franceschetti, Massimo

    2017-01-01

    Understand the relationship between information theory and the physics of wave propagation with this expert guide. Balancing fundamental theory with engineering applications, it describes the mechanism and limits for the representation and communication of information using electromagnetic waves. Information-theoretic laws relating functional approximation and quantum uncertainty principles to entropy, capacity, mutual information, rate distortion, and degrees of freedom of band-limited radiation are derived and explained. Both stochastic and deterministic approaches are explored, and applications for sensing and signal reconstruction, wireless communication, and networks of multiple transmitters and receivers are reviewed. With end-of-chapter exercises and suggestions for further reading enabling in-depth understanding of key concepts, it is the ideal resource for researchers and graduate students in electrical engineering, physics and applied mathematics looking for a fresh perspective on classical informat...

  15. An information theory framework for dynamic functional domain connectivity.

    Science.gov (United States)

    Vergara, Victor M; Miller, Robyn; Calhoun, Vince

    2017-06-01

    Dynamic functional network connectivity (dFNC) analyzes the time evolution of coherent activity in the brain. In this technique, dynamic changes are considered for the whole brain. This paper proposes an information theory framework to measure information flowing among subsets of functional networks called functional domains. Our method aims at estimating the bits of information contained in and shared among domains. The succession of dynamic functional states is estimated at the domain level. Information quantity is based on the probabilities of observing each dynamic state. Mutual information measurement is then obtained from probabilities across domains. Thus, we named this value the cross domain mutual information (CDMI). Strong CDMIs were observed in relation to the subcortical domain. Domains related to sensorial input, motor control and the cerebellum form another CDMI cluster. Information flow among other domains was seldom found. Other methods of dynamic connectivity focus on whole brain dFNC matrices. In the current framework, information theory is applied to states estimated from pairs of multi-network functional domains. In this context, we apply information theory to measure information flow across functional domains. Identified CDMI clusters point to known information pathways in the basal ganglia and also among areas of sensorial input, patterns found in static functional connectivity. In contrast, CDMI across brain areas of higher-level cognitive processing follows a different pattern that indicates scarce information sharing. These findings show that employing information theory to formally measure information flow through brain domains reveals additional features of functional connectivity. Copyright © 2017 Elsevier B.V. All rights reserved.
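The core quantity here, mutual information between the discrete state sequences of two domains, can be estimated from empirical state probabilities. A minimal sketch (the state sequences below are invented; the paper estimates states from fMRI-derived dFNC):

```python
# Estimate I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) ) from two
# discrete state sequences, as a toy stand-in for cross domain mutual
# information (CDMI) between functional domains.
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    n = len(xs)
    pxy = Counter(zip(xs, ys))          # joint state counts
    px, py = Counter(xs), Counter(ys)   # marginal state counts
    mi = 0.0
    for (x, y), c in pxy.items():
        mi += (c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
    return mi

coupled_a = [0, 1, 0, 1, 0, 1, 0, 1]
coupled_b = [1, 0, 1, 0, 1, 0, 1, 0]   # deterministic function of coupled_a
unrelated = [0, 0, 1, 1, 0, 0, 1, 1]   # statistically independent of coupled_a

print(mutual_information(coupled_a, coupled_b))  # 1.0 bit shared
print(mutual_information(coupled_a, unrelated))  # 0.0 bits shared
```

Perfectly coupled binary state sequences share exactly one bit per time step, while independent sequences share none, which is the intuition behind using strong CDMI values to flag connected domains.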

  16. Modeling Routinization in Games: An Information Theory Approach

    DEFF Research Database (Denmark)

    Wallner, Simon; Pichlmair, Martin; Hecher, Michael

    2015-01-01

    Routinization is the result of practicing until an action stops being a goal-directed process. This paper formulates a definition of routinization in games based on prior research in the fields of activity theory and practice theory. Routinization is analyzed using the formal model of discrete-time, discrete-space Markov chains and information theory to measure the actual error between the dynamically trained models and the player interaction. Preliminary research supports the hypothesis that Markov chains can be effectively used to model routinization in games. A full study design is presented...
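A hedged sketch of the modeling idea (the action alphabet and the surprisal-based error measure are illustrative assumptions, not the paper's exact formulation): train a discrete-time Markov chain on a player's action stream, then score how predictable, i.e. routinized, the behavior has become.

```python
# Train a first-order Markov chain on observed player actions and measure
# routinization as the average surprisal (in bits) of each next action.
from collections import defaultdict
from math import log2

def train_markov(actions):
    counts = defaultdict(lambda: defaultdict(int))
    for prev, nxt in zip(actions, actions[1:]):
        counts[prev][nxt] += 1
    # Normalize transition counts into probabilities
    return {s: {t: c / sum(nxts.values()) for t, c in nxts.items()}
            for s, nxts in counts.items()}

def avg_surprisal(model, actions):
    """Mean -log2 p(next | prev); 0 means fully predictable (routinized)."""
    total, n = 0.0, 0
    for prev, nxt in zip(actions, actions[1:]):
        p = model.get(prev, {}).get(nxt, 0.0)
        if p > 0:
            total += -log2(p)
            n += 1
    return total / n if n else float("inf")

routine = list("ABABABABAB")   # fully routinized alternating play
model = train_markov(routine)
print(avg_surprisal(model, routine))  # 0.0: the model predicts every action
```

A low average surprisal under the trained chain would indicate that play has collapsed into a routine, while goal-directed, exploratory play keeps the surprisal high.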

  17. Selective information seeking: can consumers' avoidance of evidence-based information on colorectal cancer screening be explained by the theory of cognitive dissonance?

    Directory of Open Access Journals (Sweden)

    Mühlhauser, Ingrid

    2007-08-01

    Full Text Available Background: Evidence-based patient information (EBPI) is a prerequisite for informed decision-making. However, presentation of EBPI may lead to irrational reactions causing avoidance, minimisation and devaluation of the information. Objective: To explore whether the theory of cognitive dissonance is applicable to medical decision-making and useful to explain these phenomena. Setting and participants: 261 volunteers from Hamburg (157 women), ≥50 years old, without a diagnosis of colorectal cancer. Design and variables: Within an experiment we simulated information seeking on colorectal cancer screening. Consumers' attitudes towards screening were surveyed using a rating scale from -5 (participate in no way) to +5 (participate unconditionally) (independent variable). Using a cover story, participants were asked to sort 5 article headlines according to their reading preferences. The headlines simulated the pro-to-contra variety of contents to be found in print media about colorectal cancer screening. The dependent variable was the sequence of article headlines. Results: Participants were very much in favour of screening, with scores of 4.0 (0.1) for the faecal occult blood test and 3.3 (0.1) for colonoscopy. In accordance with our hypothesis, we found statistically significant positive correlations between the stimuli in favour of screening and attitudes, and significant negative correlations between the stimuli against screening and attitudes. Conclusion: The theory of cognitive dissonance is applicable to medical decision-making. It may explain some phenomena of irrational reactions to evidence-based patient information.

  18. Chemical Thermodynamics and Information Theory with Applications

    CERN Document Server

    Graham, Daniel J

    2011-01-01

    Thermodynamics and information theory touch every facet of chemistry. However, the physical chemistry curriculum digested by students worldwide is still heavily skewed toward heat/work principles established more than a century ago. Rectifying this situation, Chemical Thermodynamics and Information Theory with Applications explores applications drawn from the intersection of thermodynamics and information theory--two mature and far-reaching fields. In an approach that intertwines information science and chemistry, this book covers: The informational aspects of thermodynamic state equations The

  19. AN EDUCATIONAL THEORY MODEL--(SIGGS), AN INTEGRATION OF SET THEORY, INFORMATION THEORY, AND GRAPH THEORY WITH GENERAL SYSTEMS THEORY.

    Science.gov (United States)

    MACCIA, ELIZABETH S.; AND OTHERS

    AN ANNOTATED BIBLIOGRAPHY OF 20 ITEMS AND A DISCUSSION OF ITS SIGNIFICANCE WAS PRESENTED TO DESCRIBE CURRENT UTILIZATION OF SUBJECT THEORIES IN THE CONSTRUCTION OF AN EDUCATIONAL THEORY. ALSO, A THEORY MODEL WAS USED TO DEMONSTRATE CONSTRUCTION OF A SCIENTIFIC EDUCATIONAL THEORY. THE THEORY MODEL INCORPORATED SET THEORY (S), INFORMATION THEORY…

  20. A Time-Space Domain Information Fusion Method for Specific Emitter Identification Based on Dempster-Shafer Evidence Theory.

    Science.gov (United States)

    Jiang, Wen; Cao, Ying; Yang, Lin; He, Zichang

    2017-08-28

    Specific emitter identification plays an important role in contemporary military affairs. However, most existing specific emitter identification methods have not taken the processing of uncertain information into account. Therefore, this paper proposes a time-space domain information fusion method based on Dempster-Shafer evidence theory, which has the ability to deal with uncertain information in the process of specific emitter identification. In this method, each radar generates a group of evidence based on the information it obtains, and the main task is to fuse the multiple groups of evidence to get a reasonable result. Within the framework of a recursive centralized fusion model, the proposed method incorporates a correlation coefficient, which measures the relevance between bodies of evidence, and a quantum mechanical approach, which is based on the parameters of the radar itself. The simulation results of an illustrative example demonstrate that the proposed method can effectively deal with uncertain information and reach a reasonable recognition result.

  1. Information fusion-based approach for studying influence on Twitter using belief theory.

    Science.gov (United States)

    Azaza, Lobna; Kirgizov, Sergey; Savonnet, Marinette; Leclercq, Éric; Gastineau, Nicolas; Faiz, Rim

    2016-01-01

    Influence on Twitter has recently become a hot research topic, since this micro-blogging service is widely used to share and disseminate information. Some users are more able than others to influence and persuade peers. Thus, studying the most influential users makes it possible to reach a large-scale information diffusion area, something very useful in marketing or political campaigns. In this study, we propose a new approach for multi-level influence assessment on multi-relational networks, such as Twitter. We define a social graph to model the relationships between users as a multiplex graph where users are represented by nodes, and links model the different relations between them (e.g., retweets, mentions, and replies). We explore what the relations between nodes in this graph can reveal about the degree of influence and propose a generic computational model to assess the influence degree of a given node. This is based on the conjunctive combination rule from belief functions theory to combine different types of relations. We evaluate the proposed method on a large amount of data gathered from Twitter during the European Elections 2014 and deduce the top influential candidates. The results show that our model is flexible enough to consider multiple interaction combinations according to social scientists' needs or requirements, and that the numerical results of the belief theory are accurate. We also evaluate the approach on the CLEF RepLab 2014 data set and show that our approach leads to quite interesting results.

  2. Grounded theory for radiotherapy practitioners: Informing clinical practice

    International Nuclear Information System (INIS)

    Walsh, N.A.

    2010-01-01

    Radiotherapy practitioners may be best placed to undertake qualitative research within the context of cancer, due to their specialist knowledge of radiation treatment and sensitivity to radiotherapy patients' needs. The grounded theory approach to data collection and analysis is a unique method of identifying a theory directly based on data collected within a clinical context. Research by radiotherapy practitioners is integral to role expansion within the government's directive for evidence-based practice. Given the paucity of information on qualitative research undertaken by radiotherapy radiographers, this article aims to assess the potential impact of qualitative research on radiotherapy patient and service outcomes.

  3. Cultural-Historical Activity Theory and Domain Analysis: Metatheoretical Implications for Information Science

    Science.gov (United States)

    Wang, Lin

    2013-01-01

    Background: Cultural-historical activity theory is an important theory in modern psychology. In recent years, it has drawn more attention from related disciplines including information science. Argument: This paper argues that activity theory and domain analysis which uses the theory as one of its bases could bring about some important…

  4. Genre theory in information studies

    CERN Document Server

    Andersen, Jack

    2015-01-01

    This book highlights the important role genre theory plays within information studies. It illustrates how modern genre studies inform and enrich the study of information, and conversely how the study of information makes its own independent contributions to the study of genre.

  5. Automated Physico-Chemical Cell Model Development through Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    Peter J. Ortoleva

    2005-11-29

    The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is the optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple-module workflow we have implemented that consists of a set of interoperable systems biology computational modules.

  6. Information theory and rate distortion theory for communications and compression

    CERN Document Server

    Gibson, Jerry

    2013-01-01

    This book is very specifically targeted to problems in communications and compression by providing the fundamental principles and results in information theory and rate distortion theory for these applications and presenting methods that have proved and will prove useful in analyzing and designing real systems. The chapters contain treatments of entropy, mutual information, lossless source coding, channel capacity, and rate distortion theory; however, it is the selection, ordering, and presentation of the topics within these broad categories that is unique to this concise book. While the cover

  7. Event-based criteria in GT-STAF information indices: theory, exploratory diversity analysis and QSPR applications.

    Science.gov (United States)

    Barigye, S J; Marrero-Ponce, Y; Martínez López, Y; Martínez Santiago, O; Torrens, F; García Domenech, R; Galvez, J

    2013-01-01

    Versatile event-based approaches for the definition of novel information theory-based indices (IFIs) are presented. An event in this context is the criterion followed in the "discovery" of molecular substructures, which in turn serve as the basis for the construction of the generalized incidence and relations frequency matrices, Q and F, respectively. From the resultant F, Shannon's, mutual, conditional and joint entropy-based IFIs are computed. In previous reports, an event named connected subgraphs was presented. The present study is an extension of this notion, in which we introduce other events, namely: terminal paths, vertex path incidence, quantum subgraphs, walks of length k, Sach's subgraphs, MACCs, E-state and substructure fingerprints and, finally, Ghose and Crippen atom-types for hydrophobicity and refractivity. Moreover, we define magnitude-based IFIs, introducing the use of the magnitude criterion in the definition of mutual, conditional and joint entropy-based IFIs. We also discuss the use of information-theoretic parameters as a measure of the dissimilarity of codified structural information of molecules. Finally, a comparison of the statistics for QSPR models obtained with the proposed IFIs and DRAGON's molecular descriptors for two physicochemical properties, log P and log K, of 34 derivatives of 2-furylethylenes demonstrates predictive ability similar to or better than that of the latter.
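The Shannon-entropy step common to these indices can be illustrated with a toy frequency vector (the substructure frequencies below are invented; in the paper they come from the relations frequency matrix F built by the chosen event criterion):

```python
# Shannon entropy H = -sum_i p_i * log2(p_i), with p_i = f_i / sum(f),
# computed over hypothetical substructure occurrence frequencies.
from math import log2

def shannon_entropy(freqs):
    total = sum(freqs)
    return -sum((f / total) * log2(f / total) for f in freqs if f > 0)

# Made-up frequencies of four substructure types "discovered" by an event
f = [4, 2, 1, 1]
print(shannon_entropy(f))  # 1.75 bits
```

A molecule whose substructure frequencies are spread evenly yields a high entropy index, while one dominated by a single substructure type yields an index near zero, which is what lets such indices discriminate codified structural information.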

  8. Planting contemporary practice theory in the garden of information science

    NARCIS (Netherlands)

    Huizing, A.; Cavanagh, M.

    2011-01-01

    Introduction. The purpose of this paper is to introduce to information science in a coherent fashion the core premises of contemporary practice theory, and thus to engage the information research community in further debate and discussion. Method. Contemporary practice-based approaches are

  9. Information theory perspective on network robustness

    International Nuclear Information System (INIS)

    Schieber, Tiago A.; Carpi, Laura; Frery, Alejandro C.; Rosso, Osvaldo A.; Pardalos, Panos M.; Ravetti, Martín G.

    2016-01-01

    A crucial challenge in network theory is the study of the robustness of a network facing a sequence of failures. In this work, we propose a dynamical definition of network robustness based on Information Theory that considers measurements of the structural changes caused by failures of the network's components. Failures are defined here as a temporal process in a sequence. Robustness is then evaluated by measuring dissimilarities between topologies after each time step of the sequence, providing dynamical information about the topological damage. We thoroughly analyze the efficiency of the method in capturing small perturbations by considering different probability distributions on networks. In particular, we find that distributions based on distances are more consistent in capturing network structural deviations, as they better reflect the consequences of the failures. Theoretical examples and real networks are used to study the performance of this methodology. - Highlights: • A novel methodology to measure the robustness of a network to component failure or targeted attacks is proposed. • The use of the network's distance PDF allows a precise analysis. • The method provides a dynamic robustness profile showing the response of the topology to each failure event. • The measure is capable of detecting the network's critical elements.
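As a hedged numerical sketch of measuring dissimilarity between distance distributions (the distributions below are invented; the paper compares a network's shortest-path-length PDF before and after each failure, and the specific divergence used here is one common choice, not necessarily the paper's):

```python
# Jensen-Shannon divergence between two probability distributions, used as a
# symmetric, bounded dissimilarity between pre- and post-failure topologies.
from math import log2

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

before = [0.5, 0.3, 0.2]   # toy P(shortest path length = 1, 2, 3)
after = [0.2, 0.3, 0.5]    # after a hub failure, long paths dominate
print(js_divergence(before, before))  # 0.0: no structural change
print(js_divergence(before, after))   # > 0: measurable topological damage
```

Evaluating this dissimilarity after every failure in the sequence yields the kind of dynamic robustness profile the highlights describe: a flat profile near zero for harmless failures and spikes when critical elements are removed.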

  10. The Research on Safety Management Information System of Railway Passenger Based on Risk Management Theory

    Science.gov (United States)

    Zhu, Wenmin; Jia, Yuanhua

    2018-01-01

    Based on risk management theory and the PDCA cycle model, the requirements of railway passenger transport safety production are analyzed, and the establishment of a security risk assessment team is proposed to manage risk by FTA with the Delphi method from both qualitative and quantitative aspects. A safety production committee is also established to carry out performance appraisal, further ensuring the correctness of risk management results, optimizing the safety management business processes and improving risk management capabilities. The basic framework and risk information database of the risk management information system for railway passenger transport safety are designed with Ajax, Web Services and SQL technologies. The system implements functions for risk management, performance appraisal and data management, and provides an efficient and convenient information management platform for railway passenger safety managers.

  11. Workflow management based on information management

    NARCIS (Netherlands)

    Lutters, Diederick; Mentink, R.J.; van Houten, Frederikus J.A.M.; Kals, H.J.J.

    2001-01-01

    In manufacturing processes, the role of the underlying information is of the utmost importance. Based on three different types of integration (function, information and control), as well as the theory of information management and the accompanying information structures, the entire product creation

  12. An introduction to single-user information theory

    CERN Document Server

    Alajaji, Fady

    2018-01-01

    This book presents a succinct and mathematically rigorous treatment of the main pillars of Shannon’s information theory, discussing the fundamental concepts and indispensable results of Shannon’s mathematical theory of communications. It includes five meticulously written core chapters (with accompanying problems), emphasizing the key topics of information measures; lossless and lossy data compression; channel coding; and joint source-channel coding for single-user (point-to-point) communications systems. It also features two appendices covering necessary background material in real analysis and in probability theory and stochastic processes. The book is ideal for a one-semester foundational course on information theory for senior undergraduate and entry-level graduate students in mathematics, statistics, engineering, and computing and information sciences. A comprehensive instructor’s solutions manual is available.

  13. Electricity procurement for large consumers based on Information Gap Decision Theory

    International Nuclear Information System (INIS)

    Zare, Kazem; Moghaddam, Mohsen Parsa; Sheikh El Eslami, Mohammad Kazem

    2010-01-01

    In the competitive electricity market, consumers seek strategies to meet their electricity needs at minimum cost and risk. This paper provides a technique based on Information Gap Decision Theory (IGDT) to assess different procurement strategies for large consumers. Supply sources include bilateral contracts, a limited self-generating facility, and the pool. It is considered that the pool price is uncertain and its volatility around the estimated value is modeled using an IGDT model. The proposed method does not minimize the procurement cost but assesses the risk aversion or risk-taking nature of some procurement strategies with regard to the minimum cost. Using this method, the robustness of experiencing costs higher than the expected one is optimized and the related strategy is determined. The proposed method deals with optimizing the opportunities to take advantage of low procurement costs or low pool prices. A case study is used to illustrate the proposed technique.
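A minimal numerical sketch of the IGDT robustness computation (the demand split, the prices and the linear cost model are invented for illustration): find the largest uncertainty horizon α such that the procurement cost stays below a critical cost even if the pool price rises to (1 + α) times its estimate.

```python
# Toy IGDT robustness sketch: all quantities (demand split, contract and pool
# prices, the linear cost model) are hypothetical illustration values.

def procurement_cost(pool_price, pool_mwh=60.0,
                     contract_mwh=40.0, contract_price=50.0):
    """Cost of meeting demand from a fixed bilateral contract plus the pool."""
    return contract_mwh * contract_price + pool_mwh * pool_price

def robustness(est_price, beta, step=1e-4):
    """Largest uncertainty horizon alpha such that, even at the worst-case
    pool price (1 + alpha) * est_price, total cost does not exceed the
    critical cost (1 + beta) * cost at the estimated price."""
    critical = (1.0 + beta) * procurement_cost(est_price)
    alpha = 0.0
    while procurement_cost((1.0 + alpha + step) * est_price) <= critical:
        alpha += step
    return alpha

# Under these made-up numbers, a 10% cost premium (beta = 0.10) is tolerable
# up to roughly a 16% rise in the pool price
alpha_hat = robustness(est_price=55.0, beta=0.10)
print(round(alpha_hat, 3))
```

A larger α means a more robust strategy: the consumer can absorb a larger pool-price surprise before the cost target is violated, which is exactly the trade-off the paper's risk-averse strategies optimize.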

  14. Signal Detection Theory-Based Information Processing for the Detection of Breast Cancer at Microwave Frequencies

    National Research Council Canada - National Science Library

    Nolte, Loren

    2002-01-01

    The hypothesis is that signal detection theory can be used to improve performance in detecting breast tumors, by applying this theory to develop task-oriented information processing techniques...

  15. Recoverability in quantum information theory

    Science.gov (United States)

    Wilde, Mark

    The fact that the quantum relative entropy is non-increasing with respect to quantum physical evolutions lies at the core of many optimality theorems in quantum information theory and has applications in other areas of physics. In this work, we establish improvements of this entropy inequality in the form of physically meaningful remainder terms. One of the main results can be summarized informally as follows: if the decrease in quantum relative entropy between two quantum states after a quantum physical evolution is relatively small, then it is possible to perform a recovery operation, such that one can perfectly recover one state while approximately recovering the other. This can be interpreted as quantifying how well one can reverse a quantum physical evolution. Our proof method is elementary, relying on the method of complex interpolation, basic linear algebra, and the recently introduced Renyi generalization of a relative entropy difference. The theorem has a number of applications in quantum information theory, which have to do with providing physically meaningful improvements to many known entropy inequalities. This is based on arXiv:1505.04661, now accepted for publication in Proceedings of the Royal Society A. I acknowledge support from startup funds from the Department of Physics and Astronomy at LSU, the NSF under Award No. CCF-1350397, and the DARPA Quiness Program through US Army Research Office award W31P4Q-12-1-0019.
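The main result described informally above can be written out. Following the cited preprint (arXiv:1505.04661), for states ρ and σ and a channel N there exists a recovery channel R (depending on σ and N) such that, with F the quantum fidelity (this is a paraphrase of the stated bound, not a verbatim quotation):

```latex
% Remainder-term strengthening of the monotonicity of quantum relative
% entropy: if the left-hand side is small, then R recovers sigma perfectly
% from N(sigma) and recovers rho from N(rho) approximately (fidelity near 1).
D(\rho \Vert \sigma) - D\big(\mathcal{N}(\rho) \Vert \mathcal{N}(\sigma)\big)
  \;\geq\; -2 \log F\big(\rho, (\mathcal{R} \circ \mathcal{N})(\rho)\big),
\qquad \mathcal{R}\big(\mathcal{N}(\sigma)\big) = \sigma .
```

Since the left-hand side is the decrease in relative entropy under the evolution, a small decrease forces the fidelity term to be close to one, which is the precise sense in which the evolution can be approximately reversed.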

  16. Comparing cosmic web classifiers using information theory

    Energy Technology Data Exchange (ETDEWEB)

    Leclercq, Florent [Institute of Cosmology and Gravitation (ICG), University of Portsmouth, Dennis Sciama Building, Burnaby Road, Portsmouth PO1 3FX (United Kingdom); Lavaux, Guilhem; Wandelt, Benjamin [Institut d' Astrophysique de Paris (IAP), UMR 7095, CNRS – UPMC Université Paris 6, Sorbonne Universités, 98bis boulevard Arago, F-75014 Paris (France); Jasche, Jens, E-mail: florent.leclercq@polytechnique.org, E-mail: lavaux@iap.fr, E-mail: j.jasche@tum.de, E-mail: wandelt@iap.fr [Excellence Cluster Universe, Technische Universität München, Boltzmannstrasse 2, D-85748 Garching (Germany)

    2016-08-01

    We introduce a decision scheme for optimally choosing a classifier, which segments the cosmic web into different structure types (voids, sheets, filaments, and clusters). Our framework, based on information theory, accounts for the design aims of different classes of possible applications: (i) parameter inference, (ii) model selection, and (iii) prediction of new observations. As an illustration, we use cosmographic maps of web-types in the Sloan Digital Sky Survey to assess the relative performance of the classifiers T-WEB, DIVA and ORIGAMI for: (i) analyzing the morphology of the cosmic web, (ii) discriminating dark energy models, and (iii) predicting galaxy colors. Our study substantiates a data-supported connection between cosmic web analysis and information theory, and paves the path towards principled design of analysis procedures for the next generation of galaxy surveys. We have made the cosmic web maps, galaxy catalog, and analysis scripts used in this work publicly available.

  17. Comparing cosmic web classifiers using information theory

    International Nuclear Information System (INIS)

    Leclercq, Florent; Lavaux, Guilhem; Wandelt, Benjamin; Jasche, Jens

    2016-01-01

    We introduce a decision scheme for optimally choosing a classifier, which segments the cosmic web into different structure types (voids, sheets, filaments, and clusters). Our framework, based on information theory, accounts for the design aims of different classes of possible applications: (i) parameter inference, (ii) model selection, and (iii) prediction of new observations. As an illustration, we use cosmographic maps of web-types in the Sloan Digital Sky Survey to assess the relative performance of the classifiers T-WEB, DIVA and ORIGAMI for: (i) analyzing the morphology of the cosmic web, (ii) discriminating dark energy models, and (iii) predicting galaxy colors. Our study substantiates a data-supported connection between cosmic web analysis and information theory, and paves the path towards principled design of analysis procedures for the next generation of galaxy surveys. We have made the cosmic web maps, galaxy catalog, and analysis scripts used in this work publicly available.

  18. Fundamentals of the fuzzy logic-based generalized theory of decisions

    CERN Document Server

    Aliev, Rafik Aziz

    2013-01-01

    Everyday decision making, and decision making in complex human-centric systems, is characterized by imperfect decision-relevant information. The main drawback of existing decision theories is their inability to deal with imperfect information and to model vague preferences. Indeed, a paradigm of non-numerical probabilities in decision making has a long history and arose also in Keynes’s analysis of uncertainty. There is a need for further generalization: a move to decision theories with perception-based imperfect information described in natural language (NL). The languages of new decision models for human-centric systems should not be languages based on binary logic but human-centric computational schemes able to operate on NL-described information. The development of new theories is now possible due to the increased computational power of information processing systems, which allows for computations with imperfect information, particularly imprecise and partially true information, which are much more complex than comput...

  19. An information theory account of cognitive control.

    Science.gov (United States)

    Fan, Jin

    2014-01-01

    Our ability to efficiently process information and generate appropriate responses depends on the processes collectively called cognitive control. Despite a considerable focus in the literature on the cognitive control of information processing, neural mechanisms underlying control are still unclear, and have not been characterized by considering the quantity of information to be processed. A novel and comprehensive account of cognitive control is proposed using concepts from information theory, which is concerned with communication system analysis and the quantification of information. This account treats the brain as an information-processing entity where cognitive control and its underlying brain networks play a pivotal role in dealing with conditions of uncertainty. This hypothesis and theory article justifies the validity and properties of such an account and relates experimental findings to the frontoparietal network under the framework of information theory.

  20. An information theory account of cognitive control

    Directory of Open Access Journals (Sweden)

    Jin eFan

    2014-09-01

    Full Text Available Our ability to efficiently process information and generate appropriate responses depends on the processes collectively called cognitive control. Despite a considerable focus in the literature on the cognitive control of information processing, neural mechanisms underlying control are still unclear, and have not been characterized by considering the quantity of information to be processed. A novel and comprehensive account of cognitive control is proposed using concepts from information theory, which is concerned with communication system analysis and the quantification of information. This account treats the brain as an information-processing entity where cognitive control and its underlying brain networks play a pivotal role in dealing with conditions of uncertainty. This hypothesis and theory article justifies the validity and properties of such an account and relates experimental findings to the frontoparietal network under the framework of information theory.

  1. Electricity procurement for large consumers based on Information Gap Decision Theory

    Energy Technology Data Exchange (ETDEWEB)

    Zare, Kazem; Moghaddam, Mohsen Parsa; Sheikh El Eslami, Mohammad Kazem [Tarbiat Modares University, P.O. Box 14115-111, Tehran (Iran)

    2010-01-15

    In the competitive electricity market, consumers seek strategies to meet their electricity needs at minimum cost and risk. This paper provides a technique based on Information Gap Decision Theory (IGDT) to assess different procurement strategies for large consumers. Supply sources include bilateral contracts, a limited self-generating facility, and the pool. The pool price is taken to be uncertain, and its volatility around the estimated value is modeled with an IGDT uncertainty model. The proposed method does not simply minimize the procurement cost; rather, it assesses the risk aversion or risk-taking nature of procurement strategies with respect to the minimum cost. Using this method, the robustness against experiencing costs higher than expected is optimized and the corresponding strategy is determined. The method also optimizes the opportunity to take advantage of low procurement costs arising from low pool prices. A case study is used to illustrate the proposed technique. (author)
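The IGDT robustness function described above can be sketched numerically. Assume a single-period model in which a fraction of demand is bought from the pool at an uncertain price and the rest under a bilateral contract; all figures are hypothetical, and the one-line closed form replaces the paper's full optimization:

```python
def robustness(x_pool, demand, price_est, price_contract, beta):
    """Max price-uncertainty horizon alpha such that total cost stays below
    (1 + beta) times the cost computed at the estimated pool price."""
    base = demand * (x_pool * price_est + (1 - x_pool) * price_contract)
    critical = (1 + beta) * base
    if x_pool == 0:
        return float("inf")  # no pool exposure: robust to any price rise
    # worst case: pool price rises to price_est * (1 + alpha), so
    # cost(alpha) = base + demand * x_pool * price_est * alpha
    return (critical - base) / (demand * x_pool * price_est)

# hypothetical: 100 MWh demand, estimated pool price 50 $/MWh, contract 55 $/MWh
print(robustness(0.5, 100, 50.0, 55.0, beta=0.10))  # tolerable price rise ~21%
```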

  2. Information Theory for Information Science: Antecedents, Philosophy, and Applications

    Science.gov (United States)

    Losee, Robert M.

    2017-01-01

    This paper provides an historical overview of the theoretical antecedents leading to information theory, specifically those useful for understanding and teaching information science and systems. Information may be discussed in a philosophical manner and at the same time be measureable. This notion of information can thus be the subject of…

  3. Web 2.0 systems supporting childhood chronic disease management: design guidelines based on information behaviour and social learning theories.

    Science.gov (United States)

    Ekberg, Joakim; Ericson, Leni; Timpka, Toomas; Eriksson, Henrik; Nordfeldt, Sam; Hanberger, Lena; Ludvigsson, Johnny

    2010-04-01

    Self-directed learning denotes that the individual is in command of what should be learned and why it is important. In this study, guidelines for the design of Web 2.0 systems supporting diabetic adolescents' everyday learning needs are examined in light of theories of information behaviour and social learning. A Web 2.0 system was developed to support a community of practice, and social learning structures were created to support the building of relations between members at several levels in the community. The features of the system included access to participation in the culture of diabetes management practice, entry to information about the community and about what needs to be learned to be a full practitioner or respected member of the community, and free sharing of information, narratives and experience-based knowledge. After integration with the key elements derived from theories of information behaviour, a preliminary design guideline document was formulated.

  4. Nonequilibrium thermodynamics and information theory: basic concepts and relaxing dynamics

    International Nuclear Information System (INIS)

    Altaner, Bernhard

    2017-01-01

    Thermodynamics is based on the notions of energy and entropy. While energy is the elementary quantity governing physical dynamics, entropy is the fundamental concept in information theory. In this work, starting from first principles, we give a detailed didactic account on the relations between energy and entropy and thus physics and information theory. We show that thermodynamic process inequalities, like the second law, are equivalent to the requirement that an effective description for physical dynamics is strongly relaxing. From the perspective of information theory, strongly relaxing dynamics govern the irreversible convergence of a statistical ensemble towards the maximally non-committal probability distribution that is compatible with thermodynamic equilibrium parameters. In particular, Markov processes that converge to a thermodynamic equilibrium state are strongly relaxing. Our framework generalizes previous results to arbitrary open and driven systems, yielding novel thermodynamic bounds for idealized and real processes. (paper)
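The "maximally non-committal probability distribution compatible with thermodynamic equilibrium parameters" is the Gibbs distribution. A sketch, on hypothetical discrete energy levels, that solves for the inverse temperature matching a prescribed mean energy:

```python
import math

def gibbs(energies, beta):
    """Gibbs/Boltzmann weights: the maximum-entropy distribution
    compatible with a fixed mean energy."""
    w = [math.exp(-beta * e) for e in energies]
    z = sum(w)
    return [x / z for x in w]

def mean_energy(energies, beta):
    return sum(p * e for p, e in zip(gibbs(energies, beta), energies))

def solve_beta(energies, target, lo=-50.0, hi=50.0):
    """Bisection: mean energy decreases monotonically with beta."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_energy(energies, mid) > target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

levels = [0.0, 1.0, 2.0]          # hypothetical energy levels
b = solve_beta(levels, target=0.5)
print(gibbs(levels, b))           # equilibrium distribution with <E> = 0.5
```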

  5. Information carriers and (reading them through) information theory in quantum chemistry.

    Science.gov (United States)

    Geerlings, Paul; Borgoo, Alex

    2011-01-21

    This Perspective discusses the reduction of the electronic wave function via the second-order reduced density matrix to the electron density ρ(r), which is the key ingredient in density functional theory (DFT), as a basic carrier of information. Simplifying further, the 1-normalized density function turns out to contain essentially the same information as ρ(r) and is even preferable as an information carrier when discussing periodic properties along Mendeleev's table, where essentially the valence electrons are at stake. The Kullback-Leibler information deficiency turns out to be the most interesting choice for obtaining information on the differences in ρ(r) or σ(r) between two systems. Put otherwise: when constructing a functional F(AB) = F[ζ(A)(r),ζ(B)(r)] for extracting differences in information from an information carrier ζ(r) (i.e. ρ(r), σ(r)) for two systems A and B, the Kullback-Leibler information measure ΔS is a particularly adequate choice. Examples are given, varying from atoms to molecules and molecular interactions. Quantum similarity of atoms indicates that the shape-function-based KL information deficiency is the most appropriate tool for retrieving periodicity in the Periodic Table. The dissimilarity of enantiomers, for which different information measures are presented at the global and local (i.e. molecular and atomic) level, leads to an extension of Mezey's holographic density theorem and shows numerical evidence that in a chiral molecule the whole molecule is pervaded by chirality. Finally, Kullback-Leibler information profiles are discussed for intra- and intermolecular proton transfer reactions and a simple S(N)2 reaction, indicating that the theoretical information profile can be used as a companion to the energy-based Hammond postulate to discuss the early or late transition state character of a reaction. All in all this Perspective's answer is positive to the question of whether an even simpler carrier of
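The Kullback-Leibler information deficiency used above can be sketched on a discretized carrier. The three-point distributions below are hypothetical stand-ins for normalized shape functions, not data from the Perspective:

```python
import math

def kl_deficiency(pa, pb):
    """Kullback-Leibler information deficiency
    ΔS = Σ p_A ln(p_A / p_B) between two normalized distributions."""
    return sum(a * math.log(a / b) for a, b in zip(pa, pb) if a > 0)

# hypothetical shape functions discretized and normalized on a coarse grid
sigma_A = [0.5, 0.3, 0.2]
sigma_B = [0.4, 0.4, 0.2]
print(kl_deficiency(sigma_A, sigma_B))  # > 0; vanishes only for identical carriers
```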

  6. Information Foraging Theory: A Framework for Intelligence Analysis

    Science.gov (United States)

    2014-11-01

    oceanographic information, human intelligence (HUMINT), open-source intelligence (OSINT), and information provided by other governmental departments [1][5...]. Abbreviations: HUMINT, Human Intelligence; IFT, Information Foraging Theory; LSA, Latent Semantic Similarity; MVT, Marginal Value Theorem; OFT, Optimal Foraging Theory; OSINT, Open-Source Intelligence.

  7. The Quantitative Theory of Information

    DEFF Research Database (Denmark)

    Topsøe, Flemming; Harremoës, Peter

    2008-01-01

    Information Theory as developed by Shannon and followers is becoming more and more important in a number of sciences. The concepts appear to be just the right ones with intuitively appealing operational interpretations. Furthermore, the information theoretical quantities are connected by powerful...

  8. On long-only information-based portfolio diversification framework

    Science.gov (United States)

    Santos, Raphael A.; Takada, Hellinton H.

    2014-12-01

    Using concepts from information theory, it is possible to improve the traditional frameworks for long-only asset allocation. In modern portfolio theory, the investor has two basic procedures: the choice of a portfolio that maximizes the risk-adjusted excess return, or the mixed allocation between the maximum Sharpe portfolio and the risk-free asset. In the literature, the first procedure has already been addressed using information theory. One contribution of this paper is the consideration of the second procedure in the information theory context. The performance of these approaches was compared with three traditional asset allocation methodologies: Markowitz's mean-variance, the resampled mean-variance, and the equally weighted portfolio. Using simulated and real data, the information theory-based methodologies proved more robust to estimation errors.
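One simple way information theory enters long-only allocation is by scoring diversification with the entropy of the weight vector, where exp(H) acts as an effective number of independent bets. A sketch with hypothetical weights (this is an illustration of the general idea, not the paper's specific methodology):

```python
import math

def weight_entropy(weights):
    """Shannon entropy (nats) of long-only portfolio weights;
    exp(H) is an effective number of bets, maximal for equal weights."""
    return -sum(w * math.log(w) for w in weights if w > 0)

equal = [0.25] * 4
concentrated = [0.85, 0.05, 0.05, 0.05]
print(math.exp(weight_entropy(equal)))         # 4 effective assets
print(math.exp(weight_entropy(concentrated)))  # far fewer effective assets
```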

  9. Online dating in Japan: a test of social information processing theory.

    Science.gov (United States)

    Farrer, James; Gavin, Jeff

    2009-08-01

    This study examines the experiences of past and present members of a popular Japanese online dating site in order to explore the extent to which Western-based theories of computer-mediated communication (CMC) and the development of online relationships are relevant to the Japanese online dating experience. Specifically, it examines whether social information processing theory (SIPT) is applicable to Japanese online dating interactions, and how and to what extent Japanese daters overcome the limitations of CMC through the use of contextual and other cues. Thirty-six current members and 27 former members of Match.com Japan completed an online survey. Using issue-based procedures for grounded theory analysis, we found strong support for SIPT. Japanese online daters adapt their efforts to present and acquire social information using the cues that the online dating platform provides, although many of these cues are specific to Japanese social context.

  10. Information theory in analytical chemistry

    National Research Council Canada - National Science Library

    Eckschlager, Karel; Danzer, Klaus

    1994-01-01

    Contents: The aim of analytical chemistry - Basic concepts of information theory - Identification of components - Qualitative analysis - Quantitative analysis - Multicomponent analysis - Optimum analytical...

  11. Information theory in molecular biology

    OpenAIRE

    Adami, Christoph

    2004-01-01

    This article introduces the physics of information in the context of molecular biology and genomics. Entropy and information, the two central concepts of Shannon's theory of information and communication, are often confused with each other but play transparent roles when applied to statistical ensembles (i.e., identically prepared sets) of symbolic sequences. Such an approach can distinguish between entropy and information in genes, predict the secondary structure of ribozymes, and detect the...

  12. An information integration theory of consciousness

    Directory of Open Access Journals (Sweden)

    Tononi Giulio

    2004-11-01

    Full Text Available Abstract Background Consciousness poses two main problems. The first is understanding the conditions that determine to what extent a system has conscious experience. For instance, why is our consciousness generated by certain parts of our brain, such as the thalamocortical system, and not by other parts, such as the cerebellum? And why are we conscious during wakefulness and much less so during dreamless sleep? The second problem is understanding the conditions that determine what kind of consciousness a system has. For example, why do specific parts of the brain contribute specific qualities to our conscious experience, such as vision and audition? Presentation of the hypothesis This paper presents a theory about what consciousness is and how it can be measured. According to the theory, consciousness corresponds to the capacity of a system to integrate information. This claim is motivated by two key phenomenological properties of consciousness: differentiation – the availability of a very large number of conscious experiences; and integration – the unity of each such experience. The theory states that the quantity of consciousness available to a system can be measured as the Φ value of a complex of elements. Φ is the amount of causally effective information that can be integrated across the informational weakest link of a subset of elements. A complex is a subset of elements with Φ>0 that is not part of a subset of higher Φ. The theory also claims that the quality of consciousness is determined by the informational relationships among the elements of a complex, which are specified by the values of effective information among them. Finally, each particular conscious experience is specified by the value, at any given time, of the variables mediating informational interactions among the elements of a complex. Testing the hypothesis The information integration theory accounts, in a principled manner, for several neurobiological observations
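The "causally effective information" underlying Φ builds on mutual information. A sketch computing it for two binary elements from a joint distribution; the toy numbers are illustrative, not from the paper:

```python
import math

def mutual_information(joint):
    """Mutual information (bits) between X and Y from a joint table p[x][y]."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log2(p / (px[i] * py[j]))
    return mi

# two perfectly coupled binary elements share 1 bit of information;
# independent elements share none
coupled = [[0.5, 0.0], [0.0, 0.5]]
independent = [[0.25, 0.25], [0.25, 0.25]]
print(mutual_information(coupled))      # 1.0
print(mutual_information(independent))  # 0.0
```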

  13. Astrophysical data analysis with information field theory

    International Nuclear Information System (INIS)

    Enßlin, Torsten

    2014-01-01

    Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.
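The simplest optimal signal recovery algorithm in this Bayesian setting is the Wiener filter, m = (S^-1 + R^T N^-1 R)^-1 R^T N^-1 d. A one-pixel (scalar) sketch with hypothetical prior, response, and noise values:

```python
def wiener_filter(d, R, S, N):
    """Posterior mean and variance of a Gaussian signal s ~ N(0, S)
    observed as d = R*s + n with noise n ~ N(0, N); scalar case."""
    D = 1.0 / (1.0 / S + R * R / N)  # posterior (propagator) variance
    m = D * R * d / N                # applied to information source j = R d / N
    return m, D

m, D = wiener_filter(d=1.2, R=1.0, S=1.0, N=0.25)
print(m, D)  # the estimate shrinks the data toward the zero prior mean
```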

  14. Astrophysical data analysis with information field theory

    Science.gov (United States)

    Enßlin, Torsten

    2014-12-01

    Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.

  15. Astrophysical data analysis with information field theory

    Energy Technology Data Exchange (ETDEWEB)

    Enßlin, Torsten, E-mail: ensslin@mpa-garching.mpg.de [Max Planck Institut für Astrophysik, Karl-Schwarzschild-Straße 1, D-85748 Garching, Germany and Ludwig-Maximilians-Universität München, Geschwister-Scholl-Platz 1, D-80539 München (Germany)

    2014-12-05

    Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.

  16. Quantum theory from first principles an informational approach

    CERN Document Server

    D'Ariano, Giacomo Mauro; Perinotti, Paolo

    2017-01-01

    Quantum theory is the soul of theoretical physics. It is not just a theory of specific physical systems, but rather a new framework with universal applicability. This book shows how we can reconstruct the theory from six information-theoretical principles, by rebuilding the quantum rules from the bottom up. Step by step, the reader will learn how to master the counterintuitive aspects of the quantum world, and how to efficiently reconstruct quantum information protocols from first principles. Using intuitive graphical notation to represent equations, and with shorter and more efficient derivations, the theory can be understood and assimilated with exceptional ease. Offering a radically new perspective on the field, the book contains an efficient course of quantum theory and quantum information for undergraduates. The book is aimed at researchers, professionals, and students in physics, computer science and philosophy, as well as the curious outsider seeking a deeper understanding of the theory.

  17. Critical Theory and Information Studies: A Marcusean Infusion

    Science.gov (United States)

    Pyati, Ajit K.

    2006-01-01

    In the field of library and information science, also known as information studies, critical theory is often not included in debates about the discipline's theoretical foundations. This paper argues that the critical theory of Herbert Marcuse, in particular, has a significant contribution to make to the field of information studies. Marcuse's…

  18. Information theory and the ethylene genetic network.

    Science.gov (United States)

    González-García, José S; Díaz, José

    2011-10-01

    The original aim of Information Theory (IT) was to solve a purely technical problem: to increase the performance of communication systems, which are constantly affected by interferences that diminish the quality of the transmitted information. That is, the theory deals only with the problem of transmitting with maximal precision the symbols constituting a message. In Shannon's theory messages are characterized only by their probabilities, regardless of their value or meaning. As for its present-day status, it is generally acknowledged that Information Theory has solid mathematical foundations and fruitful, strong links with Physics in both theoretical and experimental areas. However, many applications of Information Theory to Biology are limited to using it as a technical tool to analyze biopolymers, such as DNA, RNA or protein sequences. The main point of discussion about the applicability of IT to explain the information flow in biological systems is that in a classic communication channel the symbols that form the coded message are transmitted one by one, independently, through a noisy communication channel, and noise can alter each of the symbols, distorting the message; in contrast, in a genetic communication channel the coded messages are not transmitted as individual symbols but are carried by signaling cascades. Consequently, the information flow from the emitter to the effector is due to a series of coupled physicochemical processes that must ensure the accurate transmission of the message. In this review we discuss a novel proposal to overcome this difficulty, which consists of modeling gene expression with a stochastic approach that allows Shannon entropy (H) to be used directly to measure the amount of uncertainty that the genetic machinery has in relation to the correct decoding of a message transmitted into the nucleus by a signaling pathway. From the value of H we can define a function I that measures the amount of
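The proposal to measure decoding uncertainty with Shannon entropy H can be sketched as follows. The steady-state distributions over expression states, before and after a signal arrives, are hypothetical placeholders for the stochastic model in the review:

```python
import math

def shannon_entropy(probs):
    """Uncertainty (bits) the decoding machinery has about a message."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# hypothetical distributions over promoter/expression states:
# the ethylene signal narrows the distribution, reducing H
before = [0.25, 0.25, 0.25, 0.25]
after = [0.9, 0.05, 0.03, 0.02]
info_gained = shannon_entropy(before) - shannon_entropy(after)
print(info_gained)  # bits transmitted by the signaling cascade
```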

  19. Quantum Information Theory - an Invitation

    Science.gov (United States)

    Werner, Reinhard F.

    Quantum information and quantum computers have received a lot of public attention recently. Quantum computers have been advertised as a kind of warp drive for computing, and indeed the promise of the algorithms of Shor and Grover is to perform computations which are extremely hard or even provably impossible on any merely ``classical'' computer. In this article I give an account of the basic concepts of quantum information theory, staying as much as possible in the area of general agreement. The article is divided into two parts. The first (up to the end of Sect. 2.5) is mostly in plain English, centered around the exploration of what can or cannot be done with quantum systems as information carriers. The second part, Sect. 2.6, then gives a description of the mathematical structures and of some of the tools needed to develop the theory.

  20. Informed Systems: Enabling Collaborative Evidence Based Organizational Learning

    Directory of Open Access Journals (Sweden)

    Mary M. Somerville

    2015-12-01

    Full Text Available Objective – In response to unrelenting disruptions in academic publishing and higher education ecosystems, the Informed Systems approach supports evidence based professional activities to make decisions and take actions. This conceptual paper presents two core models, Informed Systems Leadership Model and Collaborative Evidence-Based Information Process Model, whereby co-workers learn to make informed decisions by identifying the decisions to be made and the information required for those decisions. This is accomplished through collaborative design and iterative evaluation of workplace systems, relationships, and practices. Over time, increasingly effective and efficient structures and processes for using information to learn further organizational renewal and advance nimble responsiveness amidst dynamically changing circumstances. Methods – The integrated Informed Systems approach to fostering persistent workplace inquiry has its genesis in three theories that together activate and enable robust information usage and organizational learning. The information- and learning-intensive theories of Peter Checkland in England, which advance systems design, stimulate participants’ appreciation during the design process of the potential for using information to learn. Within a co-designed environment, intentional social practices continue workplace learning, described by Christine Bruce in Australia as informed learning enacted through information experiences. In addition, in Japan, Ikujiro Nonaka’s theories foster information exchange processes and knowledge creation activities within and across organizational units. In combination, these theories promote the kind of learning made possible through evolving and transferable capacity to use information to learn through design and usage of collaborative communication systems with associated professional practices. Informed Systems therein draws from three antecedent theories to create an original

  1. Pangenesis as a source of new genetic information. The history of a now disproven theory.

    Science.gov (United States)

    Bergman, Gerald

    2006-01-01

    Evolution is based on natural selection of existing biological phenotypic traits. Natural selection can only eliminate traits. It cannot create new ones, requiring a theory to explain the origin of new genetic information. The theory of pangenesis was a major attempt to explain the source of new genetic information required to produce phenotypic variety. This theory, advocated by Darwin as the main source of genetic variety, has now been empirically disproved. It is currently a theory mainly of interest to science historians.

  2. An application of information theory to stochastic classical gravitational fields

    Science.gov (United States)

    Angulo, J.; Angulo, J. C.; Angulo, J. M.

    2018-06-01

    The objective of this study lies in incorporating concepts developed in Information Theory (entropy, complexity, etc.) with the aim of quantifying the variation of the uncertainty associated with a stochastic physical system resident in a spatiotemporal region. As an example of application, a relativistic classical gravitational field is considered, with stochastic behavior resulting from the effect induced by one or several external perturbation sources. One of the key concepts of the study is the covariance kernel between two points within the chosen region. Using this concept and appropriate criteria, a methodology is proposed to evaluate the change of uncertainty at a given spatiotemporal point, based on available information and efficiently applying the diverse methods that Information Theory provides. For illustration, a stochastic version of the Einstein equation with an added Gaussian Langevin term is analyzed.
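The role of the covariance kernel in quantifying uncertainty change can be sketched for a Gaussian field: observing the field at one point reduces the differential entropy at a nearby point by an amount fixed by the kernel. The squared-exponential kernel values below are hypothetical, not from the study:

```python
import math

def gaussian_entropy(var):
    """Differential entropy (nats) of a 1-D Gaussian with variance var."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

def conditional_variance(k00, k01, k11):
    """Variance at point x0 after observing the field at x1, given the
    covariance-kernel values k(x0,x0), k(x0,x1), k(x1,x1)."""
    return k00 - k01 * k01 / k11

# hypothetical squared-exponential kernel, unit variance and length scale,
# points separated by 0.5
k00, k11 = 1.0, 1.0
k01 = math.exp(-0.5 * 0.5 ** 2)
cv = conditional_variance(k00, k01, k11)
reduction = gaussian_entropy(k00) - gaussian_entropy(cv)
print(reduction)  # uncertainty removed by the observation, in nats
```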

  3. On Representation in Information Theory

    Directory of Open Access Journals (Sweden)

    Joseph E. Brenner

    2011-09-01

    Full Text Available Semiotics is widely applied in theories of information. Following the original triadic characterization of reality by Peirce, the linguistic processes involved in information (production, transmission, reception, and understanding) would all appear to be interpretable in terms of signs and their relations to their objects. Perhaps the most important of these relations is that of representation: one entity standing for or representing some other. For example, an index, one of the three major kinds of signs, is said to represent something by being directly related to its object. My position, however, is that the concept of symbolic representations having such roles in information, as intermediaries, is fraught with the same difficulties as in representational theories of mind. I have proposed an extension of logic to complex real phenomena, including mind and information (Logic in Reality; LIR), most recently at the 4th International Conference on the Foundations of Information Science (Beijing, August 2010). LIR provides explanations for the evolution of complex processes, including information, that do not require any entities other than the processes themselves. In this paper, I discuss the limitations of the standard relation of representation. I argue that more realistic pictures of informational systems can be provided by reference to information as an energetic process, following the categorial ontology of LIR. This approach enables naïve, anti-realist conceptions of anti-representationalism to be avoided, and enables an approach to both information and meaning in the same novel logical framework.

  4. Image matching navigation based on fuzzy information

    Institute of Scientific and Technical Information of China (English)

    田玉龙; 吴伟仁; 田金文; 柳健

    2003-01-01

    In conventional image matching methods, the matching process is based mostly on image statistics. One aspect neglected by these methods is that images also contain much fuzzy information. A new fuzzy matching algorithm based on fuzzy similarity for navigation is presented in this paper. Because fuzzy theory can describe well the fuzzy information contained in images, an image matching method based on fuzzy similarity can be expected to perform well. Experimental results using the matching algorithm based on fuzzy information also demonstrate its reliability and practicability.
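A fuzzy similarity measure of the kind described can be sketched as a min/max ratio over membership grades. The normalization scheme and patch values below are illustrative assumptions, not the authors' algorithm:

```python
def fuzzy_similarity(a, b):
    """Min/max (fuzzy-set) similarity of two membership maps in [0, 1]."""
    num = sum(min(x, y) for x, y in zip(a, b))
    den = sum(max(x, y) for x, y in zip(a, b))
    return num / den if den else 1.0

def memberships(patch):
    """Normalize pixel intensities to fuzzy membership grades in [0, 1]."""
    hi = max(patch)
    return [p / hi for p in patch] if hi else patch

template = memberships([10, 200, 90, 40])   # hypothetical reference patch
candidate = memberships([12, 190, 85, 50])  # hypothetical sensed patch
print(fuzzy_similarity(template, candidate))  # close to 1 for a good match
```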

  5. Actor Network Theory Approach and its Application in Investigating Agricultural Climate Information System

    Directory of Open Access Journals (Sweden)

    Maryam Sharifzadeh

    2013-03-01

    Full Text Available Actor network theory, as a qualitative approach to studying complex social factors and processes of socio-technical interaction, provides new concepts and ideas for understanding the socio-technical nature of information systems. From the actor network theory viewpoint, an agricultural climate information system is a network consisting of actors, actions and information-related processes (production, transformation, storage, retrieval, integration, diffusion and utilization, control and management), and system mechanisms (interfaces and networks). Analysis of such systems embodies the identification of the basic components and structure of the system (nodes: the different sources of information production, extension, and users) and the understanding of how successfully the system works (interaction and links), in order to promote climate knowledge content and improve system performance to reach agricultural development. The present research attempts to introduce actor network theory as a research framework based on a network view of agricultural climate information systems.

  6. A prediction method based on grey system theory in equipment condition based maintenance

    International Nuclear Information System (INIS)

    Yan, Shengyuan; Yan, Shengyuan; Zhang, Hongguo; Zhang, Zhijian; Peng, Minjun; Yang, Ming

    2007-01-01

    Grey prediction is a modeling method based on historical or present, known or indefinite information, which can be used to forecast the development of the eigenvalues of a targeted equipment system and to set up a model from limited information. In this paper, the postulates of grey system theory, including grey generating, the sorts of grey generating, and the grey forecasting model, are introduced first. The concrete application process, which includes grey prediction modeling, grey prediction, error calculation, and the equal dimension and new information approach, is introduced second. A so-called 'Equal Dimension and New Information' (EDNI) technique from grey system theory is adopted in an application case, aiming at improving the accuracy of prediction without increasing the amount of calculation by replacing old data with new ones. The proposed method can provide a new way to handle ever-growing eigenvalue data series effectively, enabling short-time-interval and real-time prediction. The proposed method, based on historical or present, known or indefinite information, was verified by vibration prediction for an induced-draft fan of a boiler at the Yantai Power Station in China, and the results show that the proposed method based on grey system theory is simple and provides high prediction accuracy. It is therefore useful and significant for condition monitoring and safe production management. (authors)
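The GM(1,1) grey model with an EDNI rolling window can be sketched as follows: fit the whitened equation dx1/dt + a*x1 = b on the accumulated (AGO) series by least squares, forecast one step ahead, and keep the window size fixed by dropping the oldest point as each new one arrives. The series values are hypothetical vibration amplitudes:

```python
import math

def gm11_forecast(x):
    """GM(1,1) grey model: fit dx1/dt + a*x1 = b on the accumulated
    series and return the one-step-ahead forecast of the raw series."""
    n = len(x)
    x1 = [sum(x[:i + 1]) for i in range(n)]                  # AGO series
    z = [0.5 * (x1[i] + x1[i + 1]) for i in range(n - 1)]    # mean sequence
    y = x[1:]
    m = n - 1
    # normal equations for [a, b] with rows [-z_k, 1] and targets x_k
    sz, szz = sum(z), sum(v * v for v in z)
    sy, szy = sum(y), sum(v * w for v, w in zip(z, y))
    det = m * szz - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det
    c = x[0] - b / a
    x1_hat = lambda k: c * math.exp(-a * k) + b / a          # AGO response
    return x1_hat(n) - x1_hat(n - 1)                         # inverse AGO

def edni_forecast(series, window=4):
    """'Equal Dimension and New Information': fixed window size,
    oldest point dropped as each new one arrives."""
    return gm11_forecast(series[-window:])

data = [2.87, 3.28, 3.34, 3.59, 3.81]  # hypothetical vibration amplitudes
print(edni_forecast(data))             # continues the rising trend
```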

  7. A computational model for knowledge-driven monitoring of nuclear power plant operators based on information theory

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Seong, Poong Hyun

    2006-01-01

    To develop operator behavior models such as IDAC, quantitative models for the cognitive activities of nuclear power plant (NPP) operators in abnormal situations are essential. Among these, only a few quantitative models for monitoring and detection have been developed. In this paper, we propose a computational model for the knowledge-driven monitoring, also known as model-driven monitoring, of NPP operators in abnormal situations, based on information theory. The basic assumption of the proposed model is that the probability that an operator shifts his or her attention to an information source is proportional to the expected information from that source. A small experiment performed to evaluate the feasibility of the proposed model shows that its predictions correlate highly with the experimental results. Even though it has been argued that heuristics may play an important role in human reasoning, we believe that the proposed model can provide part of the mathematical basis for developing quantitative models of knowledge-driven monitoring of NPP operators when operators are assumed to behave very logically
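The model's basic assumption can be sketched directly: attention shares are proportional to the expected information (here taken as the entropy) of each source. The indicator distributions are hypothetical, and entropy is a simplified stand-in for the paper's expected-information measure:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def attention_shares(sources):
    """Model assumption: probability of attending to an indicator is
    proportional to the expected information it can provide."""
    h = [entropy(p) for p in sources]
    total = sum(h)
    return [v / total for v in h]

# hypothetical operator beliefs about the states of three plant indicators
sources = [
    [0.5, 0.5],    # maximally uncertain indicator: 1 bit
    [0.9, 0.1],    # mostly known: ~0.47 bits
    [0.99, 0.01],  # nearly certain: ~0.08 bits
]
print(attention_shares(sources))  # attention concentrates on the uncertain one
```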

  8. Client-controlled case information: a general system theory perspective.

    Science.gov (United States)

    Fitch, Dale

    2004-07-01

    The author proposes a model for client control of case information via the World Wide Web built on principles of general system theory. It incorporates the client into the design, resulting in an information structure that differs from traditional human services information-sharing practices. Referencing general system theory, the concepts of controller and controlled system, as well as entropy and negentropy, are applied to the information flow and autopoietic behavior as they relate to the boundary-maintaining functions of today's organizations. The author's conclusions synthesize general system theory and human services values to lay the foundation for an information-sharing framework for human services in the 21st century.

  9. Nonequilibrium thermodynamics and information theory: basic concepts and relaxing dynamics

    Science.gov (United States)

    Altaner, Bernhard

    2017-11-01

    Thermodynamics is based on the notions of energy and entropy. While energy is the elementary quantity governing physical dynamics, entropy is the fundamental concept in information theory. In this work, starting from first principles, we give a detailed didactic account of the relations between energy and entropy, and thus between physics and information theory. We show that thermodynamic process inequalities, like the second law, are equivalent to the requirement that an effective description for physical dynamics is strongly relaxing. From the perspective of information theory, strongly relaxing dynamics govern the irreversible convergence of a statistical ensemble towards the maximally non-committal probability distribution that is compatible with thermodynamic equilibrium parameters. In particular, Markov processes that converge to a thermodynamic equilibrium state are strongly relaxing. Our framework generalizes previous results to arbitrary open and driven systems, yielding novel thermodynamic bounds for idealized and real processes. This article appears in a collection featuring invited work from the best early-career researchers working within the scope of J. Phys. A, part of the Journal of Physics series' 50th anniversary celebrations in 2017; Bernhard Altaner was selected by the Editorial Board of J. Phys. A as an Emerging Talent.

  10. Route Choice Model Based on Game Theory for Commuters

    Directory of Open Access Journals (Sweden)

    Licai Yang

    2016-06-01

    Full Text Available The traffic behaviours of commuters may cause traffic congestion during peak hours. An Advanced Traffic Information System can provide dynamic information to travellers, but because this information lacks timeliness and comprehensiveness it cannot fully satisfy travellers' needs. Since the assumptions of the traditional route choice model based on Expected Utility Theory conflict with actual conditions, this paper proposes a route choice model based on Game Theory to provide commuters with reliable route choices in real situations. The proposed model treats the alternative routes as game players and utilizes the precision of predicted information and familiarity with traffic conditions to build a game. The optimal route is generated by solving the route choice game for a Nash Equilibrium. Simulations and experimental analysis show that the proposed model describes commuters' routine route choice decisions exactly and that the provided route is reliable.
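
    The abstract does not give the payoff construction in detail, but the final step, finding a Nash equilibrium of the route choice game, can be illustrated generically. The payoff matrices below are invented for illustration, not taken from the paper:

```python
def pure_nash_equilibria(A, B):
    """Enumerate pure-strategy Nash equilibria of a two-player game.
    A[i][j] and B[i][j] are the payoffs to the row and column players."""
    eqs = []
    for i in range(len(A)):
        for j in range(len(A[0])):
            row_best = all(A[i][j] >= A[k][j] for k in range(len(A)))
            col_best = all(B[i][j] >= B[i][k] for k in range(len(A[0])))
            if row_best and col_best:
                eqs.append((i, j))
    return eqs

# Hypothetical payoffs: two alternative routes scored under two traffic
# information scenarios (higher = better predicted travel utility).
A = [[2, 0], [0, 1]]
B = [[1, 0], [0, 2]]
eqs = pure_nash_equilibria(A, B)  # candidate equilibria of the route game
```

    In a coordination-style game like this one, each equilibrium corresponds to a route choice from which neither "player" has an incentive to deviate.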

  11. The application of foraging theory to the information searching behaviour of general practitioners.

    Science.gov (United States)

    Dwairy, Mai; Dowell, Anthony C; Stahl, Jean-Claude

    2011-08-23

    General Practitioners (GPs) employ strategies to identify and retrieve medical evidence for clinical decision making which take workload and time constraints into account. Optimal Foraging Theory (OFT), initially developed to study animal foraging for food, is used here to explore the information searching behaviour of General Practitioners; this study is the first to apply foraging theory in this context. The study objectives were: 1. To identify the sequence and steps deployed in identifying and retrieving evidence for clinical decision making. 2. To utilise Optimal Foraging Theory to assess the effectiveness and efficiency of General Practitioner information searching. GPs from the Wellington region of New Zealand were asked to document in a pre-formatted logbook the steps and outcomes of an information search linked to their clinical decision making, and to fill in a questionnaire about their personal, practice and information-searching backgrounds. A total of 115/155 eligible GPs returned the background questionnaire, and 71 completed the information search logbook. GPs spent an average of 17.7 minutes addressing their search for clinical information. Their preferred information sources were discussions with colleagues (38% of sources) and books (22%). These were the two most profitable information foraging sources (15.9 min and 9.5 min search time per answer, compared to 34.3 minutes in databases). GPs nearly always accessed another source when unsuccessful (95% after the 1st source), and frequently even when successful (43% after the 2nd source). Use of multiple sources accounted for 41% of searches, and increased search success from 70% to 89%. By consulting, in foraging terms, the most 'profitable' sources of information (colleagues, books), rapidly switching sources when unsuccessful, and frequently double checking, GPs achieve an efficient trade-off between maximizing search success and information reliability, and minimizing searching time.
As predicted by foraging theory, GPs
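
    The search-time figures quoted above translate directly into a foraging-style profitability ranking; this is an illustrative calculation, not part of the study:

```python
# Search time per answer (minutes), as reported in the abstract above.
time_per_answer = {"colleagues": 15.9, "books": 9.5, "databases": 34.3}

# Foraging 'profitability' = answers gained per minute of searching,
# i.e. the reciprocal of the search time per answer.
profitability = {src: 1.0 / t for src, t in time_per_answer.items()}
ranked = sorted(profitability, key=profitability.get, reverse=True)
```

    On these numbers, books are the most profitable patch per minute searched and databases the least, which matches the GPs' observed preference for quick, informal sources.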

  12. Elaborations of grounded theory in information research: arenas/social worlds theory, discourse and situational analysis

    OpenAIRE

    Vasconcelos, A.C.; Sen, B.A.; Rosa, A.; Ellis, D.

    2012-01-01

    This paper explores elaborations of Grounded Theory in relation to Arenas/Social Worlds Theory. The notions of arenas and social worlds were present in early applications of Grounded Theory but have not been as much used or recognised as the general Grounded Theory approach, particularly in the information studies field. The studies discussed here are therefore very unusual in information research. The empirical contexts of these studies are those of (1) the role of discourse in the organisat...

  13. Theory-Based Stakeholder Evaluation

    Science.gov (United States)

    Hansen, Morten Balle; Vedung, Evert

    2010-01-01

    This article introduces a new approach to program theory evaluation called theory-based stakeholder evaluation, or the TSE model for short. Most theory-based approaches are program theory driven, and some are stakeholder oriented as well. Practically all of the latter fuse the program perceptions of the various stakeholder groups into one unitary…

  14. Quantum: information theory: technological challenge

    International Nuclear Information System (INIS)

    Calixto, M.

    2001-01-01

    The new Quantum Information Theory augurs powerful machines that obey the entangled logic of the subatomic world. Parallelism, entanglement, teleportation, no-cloning and quantum cryptography are typical peculiarities of this novel way of understanding computation. (Author) 24 refs

  15. Client-Controlled Case Information: A General System Theory Perspective

    Science.gov (United States)

    Fitch, Dale

    2004-01-01

    The author proposes a model for client control of case information via the World Wide Web built on principles of general system theory. It incorporates the client into the design, resulting in an information structure that differs from traditional human services information-sharing practices. Referencing general system theory, the concepts of…

  16. Algorithmic information theory mathematics of digital information processing

    CERN Document Server

    Seibt, Peter

    2007-01-01

    Treats the Mathematics of many important areas in digital information processing. This book covers, in a unified presentation, five topics: Data Compression, Cryptography, Sampling (Signal Theory), Error Control Codes, Data Reduction. It is useful for teachers, students and practitioners in Electronic Engineering, Computer Science and Mathematics.

  17. An informational theory of privacy

    NARCIS (Netherlands)

    Schottmuller, C.; Jann, Ole

    2016-01-01

    We develop a theory that explains how and when privacy can increase welfare. Without privacy, some individuals misrepresent their preferences, because they will otherwise be statistically discriminated against. This "chilling effect" hurts them individually, and impairs information aggregation. The

  18. Econophysics: from Game Theory and Information Theory to Quantum Mechanics

    Science.gov (United States)

    Jimenez, Edward; Moya, Douglas

    2005-03-01

    Rationality is the universal invariant among human behavior, the physical laws of the universe, and ordered, complex biological systems. Econophysics is both the use of physical concepts in Finance and Economics, and the use of Information Economics in Physics. In particular, we will show that it is possible to obtain the principles of Quantum Mechanics using Information and Game Theory.

  19. Information processing theory in the early design stages

    DEFF Research Database (Denmark)

    Cash, Philip; Kreye, Melanie

    2014-01-01

    One theory that may be particularly applicable to the early design stages is Information Processing Theory (IPT), as it is linked to the design process with regard to the key concepts considered. IPT states that designers search for information if they perceive uncertainty with regard to the knowledge necessary to solve a design challenge. They then process this information and compare whether the new knowledge they have gained covers the previous knowledge gap. The new knowledge is shared within the design team to reduce ambiguity with regard to its meaning and to build a shared understanding, reducing perceived uncertainty. In engineering design, uncertainty plays a key role, particularly in the early design stages. Thus, we propose that Information Processing Theory is suitable to describe designer activity in the early design stages, and we offer suggestions for improvements and support.

  20. Comment on Gallistel: behavior theory and information theory: some parallels.

    Science.gov (United States)

    Nevin, John A

    2012-05-01

    In this article, Gallistel proposes information theory as an approach to some enduring problems in the study of operant and classical conditioning. Copyright © 2012 Elsevier B.V. All rights reserved.

  1. Cognitive Load Theory and the Effects of Transient Information on the Modality Effect

    Science.gov (United States)

    Leahy, Wayne; Sweller, John

    2016-01-01

    Based on cognitive load theory and the "transient information effect," this paper investigated the "modality effect" while interpreting a contour map. The length and complexity of auditory and visual text instructions were manipulated. Experiment 1 indicated that longer audio text information within a presentation was inferior…

  2. Writing, Proofreading and Editing in Information Theory

    Directory of Open Access Journals (Sweden)

    J. Ricardo Arias-Gonzalez

    2018-05-01

    Full Text Available Information is a physical entity amenable to be described by an abstract theory. The concepts associated with the creation and post-processing of the information have not, however, been mathematically established, despite being broadly used in many fields of knowledge. Here, inspired by how information is managed in biomolecular systems, we introduce writing, entailing any bit string generation, and revision, as comprising proofreading and editing, in information chains. Our formalism expands the thermodynamic analysis of stochastic chains made up of material subunits to abstract strings of symbols. We introduce a non-Markovian treatment of operational rules over the symbols of the chain that parallels the physical interactions responsible for memory effects in material chains. Our theory underlies any communication system, ranging from human languages and computer science to gene evolution.

  3. Information theory, spectral geometry, and quantum gravity.

    Science.gov (United States)

    Kempf, Achim; Martin, Robert

    2008-01-18

    We show that there exists a deep link between the two disciplines of information theory and spectral geometry. This allows us to obtain new results on a well-known quantum gravity motivated natural ultraviolet cutoff which describes an upper bound on the spatial density of information. Concretely, we show that, together with an infrared cutoff, this natural ultraviolet cutoff beautifully reduces the path integral of quantum field theory on curved space to a finite number of ordinary integrations. We then show, in particular, that the subsequent removal of the infrared cutoff is safe.

  4. The application of foraging theory to the information searching behaviour of general practitioners

    Directory of Open Access Journals (Sweden)

    Dowell Anthony C

    2011-08-01

    minimizing searching time. As predicted by foraging theory, GPs trade time-consuming evidence-based (electronic) information sources for sources with a higher information reward per unit time searched. Evidence-based practice must accommodate these 'real world' foraging pressures, and Internet resources should evolve to deliver information as effectively as traditional methods of information gathering.

  5. Information Theory - The Bridge Connecting Bounded Rational Game Theory and Statistical Physics

    Science.gov (United States)

    Wolpert, David H.

    2005-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality of all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. This paper shows that the same information-theoretic mathematical structure, known as Product Distribution (PD) theory, addresses both issues. In this, PD theory not only provides a principled formulation of bounded rationality and a set of new types of mean field theory in statistical physics; it also shows that those topics are fundamentally one and the same.
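
    The decoupling idea at the heart of this approach, approximating a joint distribution by a product of its marginals, can be sketched as follows. This is a generic illustration of the factoring step, not Wolpert's PD formalism itself:

```python
import math

def marginals(joint):
    """Row and column marginals of a 2-D joint distribution."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return px, py

def product_approximation(joint):
    """Decoupled approximation q(x, y) = p(x) p(y): the kind of
    factored, far more tractable distribution PD theory works with."""
    px, py = marginals(joint)
    return [[a * b for b in py] for a in px]

def kl_divergence(p, q):
    """KL(p || q) in bits. For a joint versus its product approximation
    this equals the mutual information lost by decoupling."""
    return sum(pi * math.log2(pi / qi)
               for prow, qrow in zip(p, q)
               for pi, qi in zip(prow, qrow) if pi > 0)
```

    For a correlated joint such as [[0.4, 0.1], [0.1, 0.4]] the product approximation is uniform and the KL divergence is positive; for an already-independent joint it is zero.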

  6. Information Theory and Plasma Turbulence

    International Nuclear Information System (INIS)

    Dendy, R. O.

    2009-01-01

    Information theory, applied directly to measured signals, yields new perspectives on, and quantitative knowledge of, the physics of strongly nonlinear and turbulent phenomena in plasmas. It represents a new and productive element of the topical research programmes that use modern techniques to characterise strongly nonlinear signals from plasmas, and that address global plasma behaviour from a complex systems perspective. We here review some pioneering studies of mutual information in solar wind and magnetospheric plasmas, using techniques tested on standard complex systems.
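
    A minimal histogram-based estimate of the mutual information between two measured, discretised signals, the basic quantity such studies compute, might look like this; it is an illustrative sketch, not the authors' code:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Mutual information (bits) between two discretised signals,
    estimated from the empirical joint histogram."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * log2( p(x,y) / (p(x) p(y)) ), with counts over n samples
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi
```

    Identical binary signals share their full 1 bit of information, while signals whose symbols co-occur at chance share none; real plasma time series would first be quantised into such symbol streams.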

  7. Testing components of Rothbard’s theory with the current information system

    Directory of Open Access Journals (Sweden)

    Aurelian Virgil BĂLUŢĂ

    2016-03-01

    Full Text Available The concept of aggression against the property rights of individuals generates a series of developments that allow solutions and options to problems and dilemmas of today's economy: the dynamics of the tax system, the focus on shaping the budget with macro-economic calculations, the protection of competition, and customs policy in the modern era. Confidence in theory in general, and in economic theory especially, is based on the logical and methodological validation of scientific reasoning and on moral aspects. Transforming a theory into a means of changing society is possible only when the theory has been experimentally validated. Economic theory needs confirmation from specialized disciplines such as statistics and accounting. It is possible and necessary for the advantages of radical liberal thinking to be reflected in every company's bookkeeping and in public statistics. As an example, the paper presents the way some components of Rothbard's theory are reflected in the accounting and statistics information system.

  8. Quantum theory informational foundations and foils

    CERN Document Server

    Spekkens, Robert

    2016-01-01

    This book provides the first unified overview of the burgeoning research area at the interface between Quantum Foundations and Quantum Information.  Topics include: operational alternatives to quantum theory, information-theoretic reconstructions of the quantum formalism, mathematical frameworks for operational theories, and device-independent features of the set of quantum correlations. Powered by the injection of fresh ideas from the field of Quantum Information and Computation, the foundations of Quantum Mechanics are in the midst of a renaissance. The last two decades have seen an explosion of new results and research directions, attracting broad interest in the scientific community. The variety and number of different approaches, however, makes it challenging for a newcomer to obtain a big picture of the field and of its high-level goals. Here, fourteen original contributions from leading experts in the field cover some of the most promising research directions that have emerged in the new wave of quant...

  9. Applications of quantum information theory to quantum gravity

    International Nuclear Information System (INIS)

    Smolin, L.

    2005-01-01

    Full text: I describe work by and with Fotini Markopoulou and Olaf Dreyer on the application of quantum information theory to quantum gravity. A particular application to black hole physics is described, which treats the black hole horizon as an open system in interaction with an environment, namely the degrees of freedom in the bulk spacetime. This allows us to elucidate which quantum states of a general horizon contribute to the entropy of a Schwarzschild black hole. This case serves as an example of how methods from quantum information theory may help to elucidate how the classical limit emerges from a background-independent quantum theory of gravity. (author)

  10. Activity System Theory Approach to Healthcare Information System

    OpenAIRE

    Bai, Guohua

    2004-01-01

    Healthcare information systems are very complex and have to be approached from systematic perspectives. This paper presents an Activity System Theory (AST) approach that integrates system thinking and social psychology. In the first part of the paper, activity system theory is presented, and in particular a recursive model of human activity systems is introduced. A project, 'Integrated Mobile Information System for Diabetic Healthcare (IMIS)', is then used to demonstrate a practical application of th...

  11. Theory-based interventions for contraception.

    Science.gov (United States)

    Lopez, Laureen M; Grey, Thomas W; Chen, Mario; Tolley, Elizabeth E; Stockton, Laurie L

    2016-11-23

    The explicit use of theory in research helps expand the knowledge base. Theories and models have been used extensively in HIV-prevention research and in interventions for preventing sexually transmitted infections (STIs). The health behavior field uses many theories or models of change. However, many educational interventions addressing contraception have no explicit theoretical base. To review randomized controlled trials (RCTs) that tested a theoretical approach to inform contraceptive choice and encourage or improve contraceptive use. Through 1 November 2016, we searched for trials that tested a theory-based intervention for improving contraceptive use in PubMed, CENTRAL, POPLINE, Web of Science, ClinicalTrials.gov, and ICTRP. For the initial review, we wrote to investigators to find other trials. Included trials tested a theory-based intervention for improving contraceptive use. Interventions addressed the use of one or more methods for contraception. The reports provided evidence that the intervention was based on a specific theory or model. The primary outcomes were pregnancy and contraceptive choice or use. We assessed titles and abstracts identified during the searches. One author extracted and entered the data into Review Manager; a second author verified accuracy. We examined studies for methodological quality. For unadjusted dichotomous outcomes, we calculated the Mantel-Haenszel odds ratio (OR) with 95% confidence interval (CI). Cluster randomized trials used various methods of accounting for the clustering, such as multilevel modeling. Most reports did not provide information to calculate the effective sample size. Therefore, we presented the results as reported by the investigators. We did not conduct meta-analysis due to varied interventions and outcome measures. We included 10 new trials for a total of 25. Five were conducted outside the USA. Fifteen trials randomly assigned individuals and 10 randomized clusters.
This section focuses on nine trials with high or

  12. Role of information theoretic uncertainty relations in quantum theory

    International Nuclear Information System (INIS)

    Jizba, Petr; Dunningham, Jacob A.; Joo, Jaewoo

    2015-01-01

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed
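
    The Rényi entropy underlying these uncertainty relations is straightforward to compute for a discrete distribution. A small sketch of our own, with the Shannon entropy recovered as the α → 1 limit:

```python
import math

def renyi_entropy(probs, alpha):
    """Rényi entropy H_alpha in bits for a discrete distribution.
    alpha = 1 is the Shannon limit, handled as a separate case."""
    if alpha == 1:
        return -sum(p * math.log2(p) for p in probs if p > 0)
    s = sum(p ** alpha for p in probs if p > 0)
    return math.log2(s) / (1 - alpha)
```

    For the uniform distribution every order gives the same value (log2 of the alphabet size), while for non-uniform distributions H_alpha decreases as alpha grows, which is what makes the family of Rényi-based uncertainty relations strictly richer than the Shannon one.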

  13. Role of information theoretic uncertainty relations in quantum theory

    Energy Technology Data Exchange (ETDEWEB)

    Jizba, Petr, E-mail: p.jizba@fjfi.cvut.cz [FNSPE, Czech Technical University in Prague, Břehová 7, 115 19 Praha 1 (Czech Republic); ITP, Freie Universität Berlin, Arnimallee 14, D-14195 Berlin (Germany); Dunningham, Jacob A., E-mail: J.Dunningham@sussex.ac.uk [Department of Physics and Astronomy, University of Sussex, Falmer, Brighton, BN1 9QH (United Kingdom); Joo, Jaewoo, E-mail: j.joo@surrey.ac.uk [Advanced Technology Institute and Department of Physics, University of Surrey, Guildford, GU2 7XH (United Kingdom)

    2015-04-15

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.

  14. An introductory review of information theory in the context of computational neuroscience.

    Science.gov (United States)

    McDonnell, Mark D; Ikeda, Shiro; Manton, Jonathan H

    2011-07-01

    This article introduces several fundamental concepts in information theory from the perspective of their origins in engineering. Understanding such concepts is important in neuroscience for two reasons. First, simply applying formulae from information theory without understanding the assumptions behind their definitions can lead to erroneous results and conclusions. Second, this century will see a convergence of information theory and neuroscience; information theory will expand its foundations to incorporate biological processes more comprehensively, thereby helping to reveal how neuronal networks achieve their remarkable information processing abilities.

  15. Informal Risk Perceptions and Formal Theory

    International Nuclear Information System (INIS)

    Cayford, Jerry

    2001-01-01

    Economists have argued persuasively that our goals are wider than just risk minimization, and that they include a prudent weighing of costs and benefits. This economic line of thought recognizes that our policy goals are complex. As we widen the range of goals we are willing to entertain, though, we need to check that the methods we customarily employ are appropriate for the tasks to which we customarily apply them. This paper examines some economic methods of risk assessment, in light of the question of what our policy goals are and should be. Once the question of goals is open, more complexities than just cost intrude: what the public wants and why begs to be addressed. This leads us to the controversial issue of public risk perceptions. We have now examined a number of procedures that experts use to make public policy decisions. Behind all these issues is always the question of social welfare: what actions can we take, what policies should we embrace, to make the world a better place? In many cases, the public and the experts disagree about what the right choice is. In the first section, we saw a possible defense of the experts based on democratic theory: the people's participation, and even their will, can be legitimately set aside in the pursuit of their true interests. If this defense is to work, a great deal of weight rests on the question of the people's interests and the competence and integrity of the experts' pursuit of it. But at the same time, social preferences are ill-defined, and so are not good candidates for rational actor theory. Both the prescriptive legitimacy claim and the very workings of formal theory we have seen to depend on informal, qualitative, political judgments. Unfortunately, we have also seen a steady pattern of expert reliance on technical procedures even when they were manifestly unsuited to the task. 
The experts seem so intent on excluding informal thought that they would prefer even a bad quantitative process to a qualitative

  16. Informal Risk Perceptions and Formal Theory

    Energy Technology Data Exchange (ETDEWEB)

    Cayford, Jerry [Resources for the Future, Washington, DC (United States)

    2001-07-01

    Economists have argued persuasively that our goals are wider than just risk minimization, and that they include a prudent weighing of costs and benefits. This economic line of thought recognizes that our policy goals are complex. As we widen the range of goals we are willing to entertain, though, we need to check that the methods we customarily employ are appropriate for the tasks to which we customarily apply them. This paper examines some economic methods of risk assessment, in light of the question of what our policy goals are and should be. Once the question of goals is open, more complexities than just cost intrude: what the public wants and why begs to be addressed. This leads us to the controversial issue of public risk perceptions. We have now examined a number of procedures that experts use to make public policy decisions. Behind all these issues is always the question of social welfare: what actions can we take, what policies should we embrace, to make the world a better place? In many cases, the public and the experts disagree about what the right choice is. In the first section, we saw a possible defense of the experts based on democratic theory: the people's participation, and even their will, can be legitimately set aside in the pursuit of their true interests. If this defense is to work, a great deal of weight rests on the question of the people's interests and the competence and integrity of the experts' pursuit of it. But at the same time, social preferences are ill-defined, and so are not good candidates for rational actor theory. Both the prescriptive legitimacy claim and the very workings of formal theory we have seen to depend on informal, qualitative, political judgments. Unfortunately, we have also seen a steady pattern of expert reliance on technical procedures even when they were manifestly unsuited to the task. The experts seem so intent on excluding informal thought that they would prefer even a bad quantitative process to

  17. Advancing Theory? Landscape Archaeology and Geographical Information Systems

    Directory of Open Access Journals (Sweden)

    Di Hu

    2012-05-01

    Full Text Available This paper will focus on how Geographical Information Systems (GIS) have been applied in Landscape Archaeology from the late 1980s to the present. GIS, a tool for organising and analysing spatial information, has exploded in popularity, but we still lack a systematic overview of how it has contributed to archaeological theory, specifically Landscape Archaeology. This paper will examine whether and how GIS has advanced archaeological theory through a historical review of its application in archaeology.

  18. Epistemology as Information Theory: From Leibniz to Omega

    OpenAIRE

    Chaitin, G. J.

    2005-01-01

    In 1686 in his Discours de Metaphysique, Leibniz points out that if an arbitrarily complex theory is permitted then the notion of "theory" becomes vacuous because there is always a theory. This idea is developed in the modern theory of algorithmic information, which deals with the size of computer programs and provides a new view of Godel's work on incompleteness and Turing's work on uncomputability. Of particular interest is the halting probability Omega, whose bits are irreducible, i.e., ma...

  19. Applying Information Processing Theory to Supervision: An Initial Exploration

    Science.gov (United States)

    Tangen, Jodi L.; Borders, L. DiAnne

    2017-01-01

    Although clinical supervision is an educational endeavor (Borders & Brown, 2005), many scholars neglect theories of learning in working with supervisees. The authors describe 1 learning theory--information processing theory (Atkinson & Shiffrin, 1968, 1971; Schunk, 2016)--and the ways its associated interventions may…

  20. Affect Theory and Autoethnography in Ordinary Information Systems

    DEFF Research Database (Denmark)

    Bødker, Mads; Chamberlain, Alan

    2016-01-01

    This paper uses philosophical theories of affect as a lens for exploring autoethnographic renderings of everyday experience with information technology. Affect theories, in the paper, denote a broad trend in post-humanistic philosophy that explores sensation and feeling as emergent and relational...

  1. Entropy in quantum information theory - Communication and cryptography

    DEFF Research Database (Denmark)

    Majenz, Christian

    in quantum Shannon theory. While immensely more entanglement-consuming, the variant of port based teleportation is interesting for applications like instantaneous non-local computation and attacks on quantum position-based cryptography. Port based teleportation cannot be implemented perfectly... for vanishing error. As a byproduct, a new lower bound for the size of the program register for an approximate universal programmable quantum processor is derived. Finally, the mix is completed with a result in quantum cryptography. While quantum key distribution is the most well-known quantum cryptographic protocol, there has been increased interest in extending the framework of symmetric key cryptography to quantum messages. We give a new definition for information-theoretic quantum non-malleability, strengthening the previous definition by Ambainis et al. We show that quantum non-malleability implies secrecy...

  2. Should the model for risk-informed regulation be game theory rather than decision theory?

    Science.gov (United States)

    Bier, Vicki M; Lin, Shi-Woei

    2013-02-01

    Risk analysts frequently view the regulation of risks as being largely a matter of decision theory. According to this view, risk analysis methods provide information on the likelihood and severity of various possible outcomes; this information should then be assessed using a decision-theoretic approach (such as cost/benefit analysis) to determine whether the risks are acceptable, and whether additional regulation is warranted. However, this view ignores the fact that in many industries (particularly industries that are technologically sophisticated and employ specialized risk and safety experts), risk analyses may be done by regulated firms, not by the regulator. Moreover, those firms may have more knowledge about the levels of safety at their own facilities than the regulator does. This creates a situation in which the regulated firm has both the opportunity-and often also the motive-to provide inaccurate (in particular, favorably biased) risk information to the regulator, and hence the regulator has reason to doubt the accuracy of the risk information provided by regulated parties. Researchers have argued that decision theory is capable of dealing with many such strategic interactions as well as game theory can. This is especially true in two-player, two-stage games in which the follower has a unique best strategy in response to the leader's strategy, as appears to be the case in the situation analyzed in this article. However, even in such cases, we agree with Cox that game-theoretic methods and concepts can still be useful. In particular, the tools of mechanism design, and especially the revelation principle, can simplify the analysis of such games because the revelation principle provides rigorous assurance that it is sufficient to analyze only games in which licensees truthfully report their risk levels, making the problem more manageable. Without that, it would generally be necessary to consider much more complicated forms of strategic behavior (including

  3. Actor-network Theory and cartography of controversies in Information Science

    OpenAIRE

    LOURENÇO, Ramon Fernandes; TOMAÉL, Maria Inês

    2018-01-01

Abstract The present study aims to discuss the interactions between Actor-Network Theory and the Cartography of Controversies method in Information Science research. A literature review was conducted of books, scholarly articles, and any other sources addressing Actor-Network Theory and the Cartography of Controversies. The understanding of the theoretical assumptions that guide Actor-Network Theory allows examining important aspects of Information Science research, seeking to identif...

  4. What Density Functional Theory could do for Quantum Information

    Science.gov (United States)

    Mattsson, Ann

    2015-03-01

The Hohenberg-Kohn theorem of Density Functional Theory (DFT), and extensions thereof, tells us that all properties of a system of electrons can be determined through their density, which uniquely determines the many-body wave-function. Given access to the appropriate, universal, functionals of the density we would, in theory, be able to determine all observables of any electronic system, without explicit reference to the wave-function. On the other hand, the wave-function is at the core of Quantum Information (QI), with the wave-function of a set of qubits being the central computational resource in a quantum computer. While there is seemingly little overlap between DFT and QI, reliance upon observables forms a key connection. Though the time-evolution of the wave-function and associated phase information is fundamental to quantum computation, the initial and final states of a quantum computer are characterized by observables of the system. While observables can be extracted directly from a system's wave-function, DFT tells us that we may be able to intuit a method for extracting them from its density. In this talk, I will review the fundamentals of DFT and how these principles connect to the world of QI. This will range from DFT's utility in the engineering of physical qubits, to the possibility of using it to efficiently (but approximately) simulate Hamiltonians at the logical level. The apparent paradox of describing algorithms based on the quantum mechanical many-body wave-function with a DFT-like theory based on observables will remain a focus throughout. The ultimate goal of this talk is to initiate a dialog about what DFT could do for QI, in theory and in practice. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  5. The Philosophy of Information as an Underlying and Unifying Theory of Information Science

    Science.gov (United States)

    Tomic, Taeda

    2010-01-01

Introduction: Philosophical analyses of the theoretical principles underlying its sub-domains reveal the philosophy of information as an underlying meta-theory of information science. Method: Conceptual research on the knowledge sub-domains in information science and philosophy and analysis of their mutual connection. Analysis: Similarities between…

  6. Understanding women's mammography intentions: a theory-based investigation.

    Science.gov (United States)

    Naito, Mikako; O'Callaghan, Frances V; Morrissey, Shirley

    2009-01-01

    The present study compared the utility of two models (the Theory of Planned Behavior and Protection Motivation Theory) in identifying factors associated with intentions to undertake screening mammography, before and after an intervention. The comparison was made between the unique components of the two models. The effect of including implementation intentions was also investigated. Two hundred and fifty-one women aged 37 to 69 years completed questionnaires at baseline and following the delivery of a standard (control) or a protection motivation theory-based informational intervention. Hierarchical multiple regressions indicated that theory of planned behavior variables were associated with mammography intentions. Results also showed that inclusion of implementation intention in the model significantly increased the association with mammography intentions. The findings suggest that future interventions aiming to increase screening mammography participation should focus on the theory of planned behavior variables and that implementation intention should also be targeted.

  7. An Information Theory-Inspired Strategy for Design of Re-programmable Encrypted Graphene-based Coding Metasurfaces at Terahertz Frequencies.

    Science.gov (United States)

    Momeni, Ali; Rouhi, Kasra; Rajabalipanah, Hamid; Abdolali, Ali

    2018-04-18

    Inspired by the information theory, a new concept of re-programmable encrypted graphene-based coding metasurfaces was investigated at terahertz frequencies. A channel-coding function was proposed to convolutionally record an arbitrary information message onto unrecognizable but recoverable parity beams generated by a phase-encrypted coding metasurface. A single graphene-based reflective cell with dual-mode biasing voltages was designed to act as "0" and "1" meta-atoms, providing broadband opposite reflection phases. By exploiting graphene tunability, the proposed scheme enabled an unprecedented degree of freedom in the real-time mapping of information messages onto multiple parity beams which could not be damaged, altered, and reverse-engineered. Various encryption types such as mirroring, anomalous reflection, multi-beam generation, and scattering diffusion can be dynamically attained via our multifunctional metasurface. Besides, contrary to conventional time-consuming and optimization-based methods, this paper convincingly offers a fast, straightforward, and efficient design of diffusion metasurfaces of arbitrarily large size. Rigorous full-wave simulations corroborated the results where the phase-encrypted metasurfaces exhibited a polarization-insensitive reflectivity less than -10 dB over a broadband frequency range from 1 THz to 1.7 THz. This work reveals new opportunities for the extension of re-programmable THz-coding metasurfaces and may be of interest for reflection-type security systems, computational imaging, and camouflage technology.

  8. Probability and information theory, with applications to radar

    CERN Document Server

    Woodward, P M; Higinbotham, W

    1964-01-01

Electronics and Instrumentation, Second Edition, Volume 3: Probability and Information Theory with Applications to Radar provides information pertinent to developments in research carried out in electronics and applied physics. This book presents the established mathematical techniques that provide the code in which so much of the mathematical theory of electronics and radar is expressed. Organized into eight chapters, this edition begins with an overview of the geometry of probability distributions in which moments play a significant role. This text then examines the mathematical methods in

  9. Information Theoretic Characterization of Physical Theories with Projective State Space

    Science.gov (United States)

    Zaopo, Marco

    2015-08-01

    Probabilistic theories are a natural framework to investigate the foundations of quantum theory and possible alternative or deeper theories. In a generic probabilistic theory, states of a physical system are represented as vectors of outcomes probabilities and state spaces are convex cones. In this picture the physics of a given theory is related to the geometric shape of the cone of states. In quantum theory, for instance, the shape of the cone of states corresponds to a projective space over complex numbers. In this paper we investigate geometric constraints on the state space of a generic theory imposed by the following information theoretic requirements: every non completely mixed state of a system is perfectly distinguishable from some other state in a single shot measurement; information capacity of physical systems is conserved under making mixtures of states. These assumptions guarantee that a generic physical system satisfies a natural principle asserting that the more a state of the system is mixed the less information can be stored in the system using that state as logical value. We show that all theories satisfying the above assumptions are such that the shape of their cones of states is that of a projective space over a generic field of numbers. Remarkably, these theories constitute generalizations of quantum theory where superposition principle holds with coefficients pertaining to a generic field of numbers in place of complex numbers. If the field of numbers is trivial and contains only one element we obtain classical theory. This result tells that superposition principle is quite common among probabilistic theories while its absence gives evidence of either classical theory or an implausible theory.

  10. Using theories of behaviour change to inform interventions for addictive behaviours.

    Science.gov (United States)

    Webb, Thomas L; Sniehotta, Falko F; Michie, Susan

    2010-11-01

    This paper reviews a set of theories of behaviour change that are used outside the field of addiction and considers their relevance for this field. Ten theories are reviewed in terms of (i) the main tenets of each theory, (ii) the implications of the theory for promoting change in addictive behaviours and (iii) studies in the field of addiction that have used the theory. An augmented feedback loop model based on Control Theory is used to organize the theories and to show how different interventions might achieve behaviour change. Briefly, each theory provided the following recommendations for intervention: Control Theory: prompt behavioural monitoring, Goal-Setting Theory: set specific and challenging goals, Model of Action Phases: form 'implementation intentions', Strength Model of Self-Control: bolster self-control resources, Social Cognition Models (Protection Motivation Theory, Theory of Planned Behaviour, Health Belief Model): modify relevant cognitions, Elaboration Likelihood Model: consider targets' motivation and ability to process information, Prototype Willingness Model: change perceptions of the prototypical person who engages in behaviour and Social Cognitive Theory: modify self-efficacy. There are a range of theories in the field of behaviour change that can be applied usefully to addiction, each one pointing to a different set of modifiable determinants and/or behaviour change techniques. Studies reporting interventions should describe theoretical basis, behaviour change techniques and mode of delivery accurately so that effective interventions can be understood and replicated. © 2010 The Authors. Journal compilation © 2010 Society for the Study of Addiction.

  11. Quantum information theory mathematical foundation

    CERN Document Server

    Hayashi, Masahito

    2017-01-01

This graduate textbook provides a unified view of quantum information theory. Clearly explaining the necessary mathematical basis, it merges key topics from both information-theoretic and quantum-mechanical viewpoints and provides lucid explanations of the basic results. Thanks to this unified approach, it makes accessible such advanced topics in quantum communication as quantum teleportation, superdense coding, quantum state transmission (quantum error-correction) and quantum encryption. Since the publication of the preceding book Quantum Information: An Introduction, there have been tremendous strides in the field of quantum information. In particular, the following topics – all of which are addressed here – have seen major advances: quantum state discrimination, quantum channel capacity, bipartite and multipartite entanglement, security analysis on quantum communication, reverse Shannon theorem and uncertainty relation. With regard to the analysis of quantum security, the present book employs an impro...

  12. Consensus for linear multi-agent system with intermittent information transmissions using the time-scale theory

    Science.gov (United States)

    Taousser, Fatima; Defoort, Michael; Djemai, Mohamed

    2016-01-01

    This paper investigates the consensus problem for linear multi-agent system with fixed communication topology in the presence of intermittent communication using the time-scale theory. Since each agent can only obtain relative local information intermittently, the proposed consensus algorithm is based on a discontinuous local interaction rule. The interaction among agents happens at a disjoint set of continuous-time intervals. The closed-loop multi-agent system can be represented using mixed linear continuous-time and linear discrete-time models due to intermittent information transmissions. The time-scale theory provides a powerful tool to combine continuous-time and discrete-time cases and study the consensus protocol under a unified framework. Using this theory, some conditions are derived to achieve exponential consensus under intermittent information transmissions. Simulations are performed to validate the theoretical results.

  13. Discovery and validation of information theory-based transcription factor and cofactor binding site motifs.

    Science.gov (United States)

    Lu, Ruipeng; Mucaki, Eliseos J; Rogan, Peter K

    2017-03-17

    Data from ChIP-seq experiments can derive the genome-wide binding specificities of transcription factors (TFs) and other regulatory proteins. We analyzed 765 ENCODE ChIP-seq peak datasets of 207 human TFs with a novel motif discovery pipeline based on recursive, thresholded entropy minimization. This approach, while obviating the need to compensate for skewed nucleotide composition, distinguishes true binding motifs from noise, quantifies the strengths of individual binding sites based on computed affinity and detects adjacent cofactor binding sites that coordinate with the targets of primary, immunoprecipitated TFs. We obtained contiguous and bipartite information theory-based position weight matrices (iPWMs) for 93 sequence-specific TFs, discovered 23 cofactor motifs for 127 TFs and revealed six high-confidence novel motifs. The reliability and accuracy of these iPWMs were determined via four independent validation methods, including the detection of experimentally proven binding sites, explanation of effects of characterized SNPs, comparison with previously published motifs and statistical analyses. We also predict previously unreported TF coregulatory interactions (e.g. TF complexes). These iPWMs constitute a powerful tool for predicting the effects of sequence variants in known binding sites, performing mutation analysis on regulatory SNPs and predicting previously unrecognized binding sites and target genes. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
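
The per-site information weighting that underlies iPWMs can be illustrated with a minimal, stdlib-only Python sketch. This is not the paper's recursive, thresholded entropy-minimization pipeline; it is the standard calculation on which such matrices rest: the information content of one motif position is the relative entropy of the observed base frequencies against a background distribution, assumed uniform here. The base counts are hypothetical.

```python
import math

def column_information(counts, background=0.25):
    """Information content (bits) of one motif position relative to a
    uniform background: sum over bases of f * log2(f / background)."""
    total = sum(counts.values())
    info = 0.0
    for base, n in counts.items():
        if n:  # absent bases contribute nothing (f * log f -> 0)
            f = n / total
            info += f * math.log2(f / background)
    return info

# Hypothetical aligned binding sites: a fully conserved position carries 2 bits
print(column_information({"A": 8, "C": 0, "G": 0, "T": 0}))  # 2.0
# A position split evenly between two bases carries 1 bit
print(column_information({"A": 4, "C": 0, "G": 4, "T": 0}))  # 1.0
```

Summing this quantity over all positions gives the total information content of a (contiguous) motif, which is what lets individual binding-site strengths be compared on a common bit scale.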

  14. Informing Patients About Placebo Effects: Using Evidence, Theory, and Qualitative Methods to Develop a New Website.

    Science.gov (United States)

    Greville-Harris, Maddy; Bostock, Jennifer; Din, Amy; Graham, Cynthia A; Lewith, George; Liossi, Christina; O'Riordan, Tim; White, Peter; Yardley, Lucy; Bishop, Felicity L

    2016-06-10

    According to established ethical principles and guidelines, patients in clinical trials should be fully informed about the interventions they might receive. However, information about placebo-controlled clinical trials typically focuses on the new intervention being tested and provides limited and at times misleading information about placebos. We aimed to create an informative, scientifically accurate, and engaging website that could be used to improve understanding of placebo effects among patients who might be considering taking part in a placebo-controlled clinical trial. Our approach drew on evidence-, theory-, and person-based intervention development. We used existing evidence and theory about placebo effects to develop content that was scientifically accurate. We used existing evidence and theory of health behavior to ensure our content would be communicated persuasively, to an audience who might currently be ignorant or misinformed about placebo effects. A qualitative 'think aloud' study was conducted in which 10 participants viewed prototypes of the website and spoke their thoughts out loud in the presence of a researcher. The website provides information about 10 key topics and uses text, evidence summaries, quizzes, audio clips of patients' stories, and a short film to convey key messages. Comments from participants in the think aloud study highlighted occasional misunderstandings and off-putting/confusing features. These were addressed by modifying elements of content, style, and navigation to improve participants' experiences of using the website. We have developed an evidence-based website that incorporates theory-based techniques to inform members of the public about placebos and placebo effects. Qualitative research ensured our website was engaging and convincing for our target audience who might not perceive a need to learn about placebo effects. Before using the website in clinical trials, it is necessary to test its effects on key outcomes

  15. Entropy and information causality in general probabilistic theories

    International Nuclear Information System (INIS)

    Barnum, Howard; Leifer, Matthew; Spekkens, Robert; Barrett, Jonathan; Clark, Lisa Orloff; Stepanik, Nicholas; Wilce, Alex; Wilke, Robin

    2010-01-01

    We investigate the concept of entropy in probabilistic theories more general than quantum mechanics, with particular reference to the notion of information causality (IC) recently proposed by Pawlowski et al (2009 arXiv:0905.2292). We consider two entropic quantities, which we term measurement and mixing entropy. In the context of classical and quantum theory, these coincide, being given by the Shannon and von Neumann entropies, respectively; in general, however, they are very different. In particular, while measurement entropy is easily seen to be concave, mixing entropy need not be. In fact, as we show, mixing entropy is not concave whenever the state space is a non-simplicial polytope. Thus, the condition that measurement and mixing entropies coincide is a strong constraint on possible theories. We call theories with this property monoentropic. Measurement entropy is subadditive, but not in general strongly subadditive. Equivalently, if we define the mutual information between two systems A and B by the usual formula I(A: B)=H(A)+H(B)-H(AB), where H denotes the measurement entropy and AB is a non-signaling composite of A and B, then it can happen that I(A:BC)< I(A:B). This is relevant to IC in the sense of Pawlowski et al: we show that any monoentropic non-signaling theory in which measurement entropy is strongly subadditive, and also satisfies a version of the Holevo bound, is informationally causal, and on the other hand we observe that Popescu-Rohrlich boxes, which violate IC, also violate strong subadditivity. We also explore the interplay between measurement and mixing entropy and various natural conditions on theories that arise in quantum axiomatics.
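
The mutual information formula quoted in the abstract, I(A:B) = H(A) + H(B) - H(AB), is straightforward to evaluate in the classical (Shannon) case. A minimal stdlib-only sketch, using a hypothetical joint distribution over two bits:

```python
import math
from collections import defaultdict

def H(dist):
    """Shannon entropy (bits) of a distribution given as {outcome: p}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, axis):
    """Marginal distribution of one subsystem of a joint {(a, b): p}."""
    m = defaultdict(float)
    for outcomes, p in joint.items():
        m[outcomes[axis]] += p
    return dict(m)

def mutual_information(joint):
    """I(A:B) = H(A) + H(B) - H(AB), as in the abstract above."""
    return H(marginal(joint, 0)) + H(marginal(joint, 1)) - H(joint)

# Perfectly correlated bits share 1 bit of mutual information
print(mutual_information({("0", "0"): 0.5, ("1", "1"): 0.5}))  # 1.0
```

For classical distributions I(A:B) is always non-negative; the abstract's point is that in more general probabilistic theories the analogous measurement-entropy quantity can violate monotonicity conditions such as I(A:BC) >= I(A:B).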

  16. The Use of Ideas of Information Theory for Studying “Language” and Intelligence in Ants

    Directory of Open Access Journals (Sweden)

    Zhanna Reznikova

    2009-11-01

Full Text Available In this review we integrate results of a long-term experimental study on ant “language” and intelligence which was fully based on fundamental ideas of Information Theory, such as the Shannon entropy, the Kolmogorov complexity, and Shannon’s equation connecting the length of a message (l) and its frequency (p), i.e., l = –log p for rational communication systems. This approach enabled us to obtain the following important results on ants’ communication and intelligence: (i) to reveal “distant homing” in ants, that is, their ability to transfer information about remote events; (ii) to estimate the rate of information transmission; (iii) to reveal that ants are able to grasp regularities and to use them for “compression” of information; (iv) to reveal that ants are able to transfer to each other information about the number of objects; (v) to discover that ants can add and subtract small numbers. The obtained results show that information theory is not only an excellent mathematical theory, but that many of its results may be considered as laws of Nature.
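
Shannon's relation l = –log p, which the study uses to compare message length with message frequency, can be sketched in a few lines of stdlib Python; the turn-sequence frequencies below are hypothetical, purely for illustration:

```python
import math

def shannon_code_lengths(probs):
    """Optimal message length (bits) for each message of probability p: l = -log2 p."""
    return {msg: -math.log2(p) for msg, p in probs.items()}

def entropy(probs):
    """Shannon entropy: the average optimal message length."""
    return sum(p * -math.log2(p) for p in probs.values())

# Hypothetical frequencies of four turn-sequences in a foraging "message"
freqs = {"LLL": 0.5, "LLR": 0.25, "LRL": 0.125, "LRR": 0.125}
lengths = shannon_code_lengths(freqs)
print(lengths["LLL"])  # 1.0 -> the most frequent message earns the shortest code
print(entropy(freqs))  # 1.75 bits per message on average
```

In a rational communication system, frequent messages get short codes; the experiments cited above test whether the durations of ant "messages" shrink for frequent or regular routes in just this way.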

  17. Exploring a Theory Describing the Physics of Information Systems, Characterizing the Phenomena of Complex Information Systems

    National Research Council Canada - National Science Library

    Harmon, Scott

    2001-01-01

This project accomplished all of its objectives: document a theory of information physics, conduct a workshop on planning experiments to test this theory, and design experiments that validate this theory...

  18. Mathematics Education as a Proving-Ground for Information-Processing Theories.

    Science.gov (United States)

    Greer, Brian, Ed.; Verschaffel, Lieven, Ed.

    1990-01-01

    Five papers discuss the current and potential contributions of information-processing theory to our understanding of mathematical thinking as those contributions affect the practice of mathematics education. It is concluded that information-processing theories need to be supplemented in various ways to more adequately reflect the complexity of…

  19. Bayesian or Laplacien inference, entropy and information theory and information geometry in data and signal processing

    Science.gov (United States)

    Mohammad-Djafari, Ali

    2015-01-01

    The main object of this tutorial article is first to review the main inference tools using Bayesian approach, Entropy, Information theory and their corresponding geometries. This review is focused mainly on the ways these tools have been used in data, signal and image processing. After a short introduction of the different quantities related to the Bayes rule, the entropy and the Maximum Entropy Principle (MEP), relative entropy and the Kullback-Leibler divergence, Fisher information, we will study their use in different fields of data and signal processing such as: entropy in source separation, Fisher information in model order selection, different Maximum Entropy based methods in time series spectral estimation and finally, general linear inverse problems.
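
Two of the quantities this tutorial reviews, the Bayes rule over a finite hypothesis set and the Kullback-Leibler divergence, fit in a short stdlib-only sketch; the prior and likelihood values are hypothetical:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits, for lists of probabilities."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def bayes_update(prior, likelihood):
    """Posterior proportional to prior x likelihood (Bayes rule, finite hypotheses)."""
    unnorm = [pr * li for pr, li in zip(prior, likelihood)]
    z = sum(unnorm)  # the evidence (normalizing constant)
    return [u / z for u in unnorm]

prior = [0.5, 0.5]
likelihood = [0.9, 0.3]                 # P(data | hypothesis), assumed values
posterior = bayes_update(prior, likelihood)
print(posterior)                         # approximately [0.75, 0.25]
print(kl_divergence(posterior, prior))   # bits of information gained from the data
```

The KL divergence from prior to posterior is one natural measure of how informative the data were, which is the thread connecting the Bayesian, entropic, and information-geometric views surveyed in the article.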

  20. Brief Instrumental School-Based Mentoring for Middle School Students: Theory and Impact

    Science.gov (United States)

    McQuillin, Samuel D.; Lyons, Michael D.

    2016-01-01

    This study evaluated the efficacy of an intentionally brief school-based mentoring program. This academic goal-focused mentoring program was developed through a series of iterative randomized controlled trials, and is informed by research in social cognitive theory, cognitive dissonance theory, motivational interviewing, and research in academic…

  1. Pre-Game-Theory Based Information Technology (GAMBIT) Study

    National Research Council Canada - National Science Library

    Polk, Charles

    2003-01-01

    .... The generic GAMBIT scenario has been characterized as Dynamic Hierarchical Gaming (DHG). Game theory is not yet ready to fully support analysis of DHG, though existing partial analysis suggests that a full treatment is practical in the midterm...

  2. Designing theoretically-informed implementation interventions: Fine in theory, but evidence of effectiveness in practice is needed

    Directory of Open Access Journals (Sweden)

    Reeves Scott

    2006-02-01

The Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG) authors assert that a key weakness in implementation research is the unknown applicability of a given intervention outside its original site and problem, and suggest that use of explicit theory offers an effective solution. This assertion is problematic for three primary reasons. First, the presence of an underlying theory does not necessarily ease the task of judging the applicability of a piece of empirical evidence. Second, it is not clear how to translate theory reliably into intervention design, which undoubtedly involves the diluting effect of "common sense." Third, there are many theories, formal and informal, and it is not clear why any one should be given primacy. To determine whether explicitly theory-based interventions are, on average, more effective than those based on implicit theories, pragmatic trials are needed. Until empirical evidence is available showing the superiority of theory-based interventions, the use of theory should not be a criterion for assessing the value of implementation studies by research funders, ethics committees, editors or policy decision makers.

  3. USING INFORMATION THEORY TO DEFINE A SUSTAINABILITY INDEX

    Science.gov (United States)

Information theory has many applications in Ecology and Environmental science, such as its use as a biodiversity indicator, as a measure of evolution, as a measure of distance from thermodynamic equilibrium, and as a measure of system organization. Fisher Information, in particular, provides a...

  4. Computer Support of Groups: Theory-Based Models for GDSS Research

    OpenAIRE

    V. Srinivasan Rao; Sirkka L. Jarvenpaa

    1991-01-01

    Empirical research in the area of computer support of groups is characterized by inconsistent results across studies. This paper attempts to reconcile the inconsistencies by linking the ad hoc reasoning in the studies to existing theories of communication, minority influence and human information processing. Contingency models are then presented based on the theories discussed. The paper concludes by discussing the linkages between the current work and other recently published integrations of...

  5. Response to Patrick Love's "Informal Theory": A Rejoinder

    Science.gov (United States)

    Evans, Nancy J.; Guido, Florence M.

    2012-01-01

    This rejoinder to Patrick Love's article, "Informal Theory: The Ignored Link in Theory-to-Practice," which appears earlier in this issue of the "Journal of College Student Development", was written at the invitation of the Editor. In the critique, we point out the weaknesses of many of Love's arguments and propositions. We provide an alternative…

  6. Generalized phase retrieval algorithm based on information measures

    OpenAIRE

    Shioya, Hiroyuki; Gohara, Kazutoshi

    2006-01-01

    An iterative phase retrieval algorithm based on the maximum entropy method (MEM) is presented. Introducing a new generalized information measure, we derive a novel class of algorithms which includes the conventionally used error reduction algorithm and a MEM-type iterative algorithm which is presented for the first time. These different phase retrieval methods are unified on the basis of the framework of information measures used in information theory.

  7. Hiding data selected topics : Rudolf Ahlswede’s lectures on information theory 3

    CERN Document Server

    Althöfer, Ingo; Deppe, Christian; Tamm, Ulrich

    2016-01-01

    Devoted to information security, this volume begins with a short course on cryptography, mainly based on lectures given by Rudolf Ahlswede at the University of Bielefeld in the mid 1990s. It was the second of his cycle of lectures on information theory which opened with an introductory course on basic coding theorems, as covered in Volume 1 of this series. In this third volume, Shannon’s historical work on secrecy systems is detailed, followed by an introduction to an information-theoretic model of wiretap channels, and such important concepts as homophonic coding and authentication. Once the theoretical arguments have been presented, comprehensive technical details of AES are given. Furthermore, a short introduction to the history of public-key cryptology, RSA and El Gamal cryptosystems is provided, followed by a look at the basic theory of elliptic curves, and algorithms for efficient addition in elliptic curves. Lastly, the important topic of “oblivious transfer” is discussed, which is strongly conne...

  8. Structural information theory and visual form

    NARCIS (Netherlands)

    Leeuwenberg, E.L.J.; Kaernbach, C.; Schroeger, E.; Mueller, H.

    2003-01-01

    The paper attends to basic characteristics of visual form as approached by Structural information theory, or SIT, (Leeuwenberg, Van der Helm and Van Lier). The introduction provides a global survey of this approach. The main part of the paper focuses on three characteristics of SIT. Each one is made

  9. An Intuitionistic Fuzzy Stochastic Decision-Making Method Based on Case-Based Reasoning and Prospect Theory

    Directory of Open Access Journals (Sweden)

    Peng Li

    2017-01-01

According to the case-based reasoning method and prospect theory, this paper mainly focuses on finding a way to obtain decision-makers’ preferences and the criterion weights for stochastic multicriteria decision-making problems and to classify alternatives. Firstly, we construct a new score function for an intuitionistic fuzzy number (IFN) considering the decision-making environment. Then, we aggregate the decision-making information in different natural states according to prospect theory and test decision-making matrices. A mathematical programming model based on a case-based reasoning method is presented to obtain the criterion weights. Moreover, in the original decision-making problem, we integrate all the intuitionistic fuzzy decision-making matrices into an expectation matrix using the expected utility theory and classify or rank the alternatives by the case-based reasoning method. Finally, two illustrative examples are provided to illustrate the implementation process and applicability of the developed method.

  10. Correlation Feature Selection and Mutual Information Theory Based Quantitative Research on Meteorological Impact Factors of Module Temperature for Solar Photovoltaic Systems

    Directory of Open Access Journals (Sweden)

    Yujing Sun

    2016-12-01

    The module temperature is the most important parameter influencing the output power of solar photovoltaic (PV) systems, aside from solar irradiance. In this paper, we focus on interdisciplinary research that combines correlation analysis, mutual information (MI) and heat transfer theory, aiming to determine the correlative relations between different meteorological impact factors (MIFs) and PV module temperature from both qualitative and quantitative aspects. As the first step of this research, the primary MIFs of PV module temperature are identified and confirmed from the perspectives of physical meaning and mathematical analysis, based on the electrical performance and thermal characteristics of PV modules under the PV effect and heat transfer theory. Furthermore, the quantitative influence of the MIFs on PV module temperature is mathematically formulated as several indexes using correlation-based feature selection (CFS) and MI theory to explore the specific impact degrees under four different typical weather statuses named general weather classes (GWCs). Case studies for the proposed methods were conducted using actual measurement data of a 500 kW grid-connected solar PV plant in China. The results not only verify the established knowledge about the main MIFs of PV module temperature but, more importantly, also provide the specific quantitative impact degrees of these three MIFs through CFS- and MI-based measures under the four GWCs.
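    The binned mutual-information estimate underlying such MIF rankings can be sketched as follows. This is a minimal illustration with synthetic data; the variable names, coupling coefficients and bin count are assumptions for the example, not the paper's actual measurements or settings:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Estimate MI (in bits) between two samples via 2-D histogram binning."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                # joint probability table
    px = pxy.sum(axis=1, keepdims=True)      # marginal of x, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)      # marginal of y, shape (1, bins)
    nz = pxy > 0                             # avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
irradiance = rng.uniform(200, 1000, 5000)                        # synthetic W/m^2
module_temp = 25 + 0.03 * irradiance + rng.normal(0, 2, 5000)    # coupled to irradiance
unrelated = rng.normal(0, 1, 5000)                               # independent series

mi_coupled = mutual_information(irradiance, module_temp)
mi_unrelated = mutual_information(irradiance, unrelated)
```

    A coupled pair yields substantially more bits than an independent one, which is exactly the contrast such impact-degree rankings exploit.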

  11. Parametric sensitivity analysis for biochemical reaction networks based on pathwise information theory.

    Science.gov (United States)

    Pantazis, Yannis; Katsoulakis, Markos A; Vlachos, Dionisios G

    2013-10-22

    Stochastic modeling and simulation provide powerful predictive methods for the intrinsic understanding of fundamental mechanisms in complex biochemical networks. Typically, such mathematical models involve networks of coupled jump stochastic processes with a large number of parameters that need to be suitably calibrated against experimental data. In this direction, the parameter sensitivity analysis of reaction networks is an essential mathematical and computational tool, yielding information regarding the robustness and the identifiability of model parameters. However, existing sensitivity analysis approaches such as variants of the finite difference method can have an overwhelming computational cost in models with a high-dimensional parameter space. We develop a sensitivity analysis methodology suitable for complex stochastic reaction networks with a large number of parameters. The proposed approach is based on information theory methods and relies on the quantification of information loss due to parameter perturbations between time-series distributions. For this reason, we need to work on path-space, i.e., the set consisting of all stochastic trajectories, hence the proposed approach is referred to as "pathwise". The pathwise sensitivity analysis method is realized by employing the rigorously derived Relative Entropy Rate, which is directly computable from the propensity functions. A key aspect of the method is that an associated pathwise Fisher Information Matrix (FIM) is defined, which in turn constitutes a gradient-free approach to quantifying parameter sensitivities. The structure of the FIM turns out to be block-diagonal, revealing hidden parameter dependencies and sensitivities in reaction networks. As a gradient-free method, the proposed sensitivity analysis provides a significant advantage when dealing with complex stochastic systems with a large number of parameters. In addition, knowledge of the structure of the FIM can allow one to efficiently address

  12. The use of information theory for the evaluation of biomarkers of aging and physiological age.

    Science.gov (United States)

    Blokh, David; Stambler, Ilia

    2017-04-01

    The present work explores the application of information theoretical measures, such as entropy and normalized mutual information, for research of biomarkers of aging. The use of information theory affords unique methodological advantages for the study of aging processes, as it allows evaluating non-linear relations between biological parameters, providing the precise quantitative strength of those relations, both for individual and multiple parameters, showing cumulative or synergistic effect. Here we illustrate those capabilities utilizing a dataset on heart disease, including diagnostic parameters routinely available to physicians. The use of information-theoretical methods, utilizing normalized mutual information, revealed the exact amount of information that various diagnostic parameters or their combinations contained about the persons' age. Based on those exact informative values for the correlation of measured parameters with age, we constructed a diagnostic rule (a decision tree) to evaluate physiological age, as compared to chronological age. The present data illustrated that younger subjects suffering from heart disease showed characteristics of people of higher age (higher physiological age). Utilizing information-theoretical measures, with additional data, it may be possible to create further clinically applicable information-theory-based markers and models for the evaluation of physiological age, its relation to age-related diseases and its potential modifications by therapeutic interventions. Copyright © 2017 Elsevier B.V. All rights reserved.
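    For discrete-valued diagnostic parameters, normalized mutual information can be sketched as below. The normalization by the geometric mean of the marginal entropies is one common convention, assumed here for illustration; the paper's exact variant is not specified in this record:

```python
import numpy as np
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a sequence of discrete labels."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def normalized_mutual_information(x, y):
    """NMI(X;Y) = I(X;Y) / sqrt(H(X) H(Y)), in [0, 1] for discrete labels."""
    hx, hy = entropy(x), entropy(y)
    hxy = entropy(list(zip(x, y)))   # joint entropy via pair labels
    mi = hx + hy - hxy               # I(X;Y) = H(X) + H(Y) - H(X,Y)
    return mi / np.sqrt(hx * hy) if hx > 0 and hy > 0 else 0.0
```

    A parameter that perfectly separates age groups scores 1; an uninformative one scores 0, which is what makes NMI usable as an exact "informative value" for ranking biomarkers.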

  13. Observational attachment theory-based parenting measures predict children's attachment narratives independently from social learning theory-based measures.

    Science.gov (United States)

    Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen

    2014-01-01

    Conceptually and methodologically distinct models exist for assessing quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 5-6-year-old children from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently rated attachment narrative representations and peer nominations. Results indicated that attachment theory-based and social learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from integration of different theoretical and methodological paradigms.

  14. A theory of maximizing sensory information

    NARCIS (Netherlands)

    Hateren, J.H. van

    1992-01-01

    A theory is developed on the assumption that early sensory processing aims at maximizing the information rate in the channels connecting the sensory system to more central parts of the brain, where it is assumed that these channels are noisy and have a limited dynamic range. Given a stimulus power

  15. Evaluation of the efficiency of computer-aided spectra search systems based on information theory

    International Nuclear Information System (INIS)

    Schaarschmidt, K.

    1979-01-01

    Application of information theory allows objective evaluation of the efficiency of computer-aided spectra search systems. For this purpose, a significant number of search processes must be analyzed. The amount of information gained by computer application is considered as the difference between the entropy of the data bank and a conditional entropy depending on the proportion of unsuccessful search processes and ballast. The influence of the following factors can be estimated: volume, structure, and quality of the spectra collection stored, efficiency of the encoding instruction and the comparing algorithm, and subjective errors involved in the encoding of spectra. The relations derived are applied to two published storage and retrieval systems for infrared spectra. (Auth.)
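    The entropy-difference idea can be illustrated with a toy model. The specific split of the conditional entropy into a hit-list (ballast) term, a failure term, and a binary-entropy term is an assumption made for this sketch, not the paper's exact formula:

```python
import math

def search_information_gain(n_spectra, hitlist_size, failure_rate):
    """Toy estimate: entropy of the data bank minus the entropy remaining after
    a search that, with probability (1 - failure_rate), narrows the answer to
    hitlist_size equally likely candidates (ballast), and otherwise leaves
    all n_spectra possible."""
    h_bank = math.log2(n_spectra)       # prior uncertainty over the data bank
    f = failure_rate
    # uncertainty about whether the search itself succeeded
    h_fail = -(f * math.log2(f) + (1 - f) * math.log2(1 - f)) if 0 < f < 1 else 0.0
    h_cond = (1 - f) * math.log2(hitlist_size) + f * h_bank + h_fail
    return h_bank - h_cond
```

    A perfect search over 4096 spectra (single hit, no failures) recovers the full 12 bits; more ballast or a higher failure rate both reduce the gain, mirroring the factors the abstract lists.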

  16. Information theory and stochastics for multiscale nonlinear systems

    CERN Document Server

    Majda, Andrew J; Grote, Marcus J

    2005-01-01

    This book introduces mathematicians to the fascinating emerging mathematical interplay between ideas from stochastics and information theory and important practical issues in studying complex multiscale nonlinear systems. It emphasizes the serendipity between modern applied mathematics and applications where rigorous analysis, the development of qualitative and/or asymptotic models, and numerical modeling all interact to explain complex phenomena. After a brief introduction to the emerging issues in multiscale modeling, the book has three main chapters. The first chapter is an introduction to information theory with novel applications to statistical mechanics, predictability, and Jupiter's Red Spot for geophysical flows. The second chapter discusses new mathematical issues regarding fluctuation-dissipation theorems for complex nonlinear systems including information flow, various approximations, and illustrates applications to various mathematical models. The third chapter discusses stochastic modeling of com...

  17. Analyzing complex networks evolution through Information Theory quantifiers

    International Nuclear Information System (INIS)

    Carpi, Laura C.; Rosso, Osvaldo A.; Saco, Patricia M.; Ravetti, Martin Gomez

    2011-01-01

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed, the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease and a climate network for the Tropical Pacific region to study the El Nino/Southern Oscillation (ENSO) dynamic. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
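    The first of these quantifiers is straightforward to compute; a minimal sketch, assuming base-2 logarithms (so the value is bounded by 1):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return float(-(p[nz] * np.log2(p[nz])).sum())

def jsd_metric(p, q):
    """Square root of the Jensen-Shannon divergence, a true metric
    between probability distributions (0 for identical, 1 for disjoint)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    jsd = shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))
    return float(np.sqrt(max(jsd, 0.0)))    # clamp tiny negative round-off
```

    Applied to, say, the degree distributions of a network at two time steps, this yields a bounded dissimilarity that can track the evolution of the system.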

  18. Protein Signaling Networks from Single Cell Fluctuations and Information Theory Profiling

    Science.gov (United States)

    Shin, Young Shik; Remacle, F.; Fan, Rong; Hwang, Kiwook; Wei, Wei; Ahmad, Habib; Levine, R.D.; Heath, James R.

    2011-01-01

    Protein signaling networks among cells play critical roles in a host of pathophysiological processes, from inflammation to tumorigenesis. We report on an approach that integrates microfluidic cell handling, in situ protein secretion profiling, and information theory to determine an extracellular protein-signaling network and the role of perturbations. We assayed 12 proteins secreted from human macrophages that were subjected to lipopolysaccharide challenge, which emulates the macrophage-based innate immune responses against Gram-negative bacteria. We characterize the fluctuations in protein secretion of single cells, and of small cell colonies (n = 2, 3, …), as a function of colony size. Measuring the fluctuations permits a validation of the conditions required for the application of a quantitative version of the Le Chatelier's principle, as derived using information theory. This principle provides a quantitative prediction of the role of perturbations and allows a characterization of a protein-protein interaction network. PMID:21575571

  19. Spacecraft TT&C and information transmission theory and technologies

    CERN Document Server

    Liu, Jiaxing

    2015-01-01

    Spacecraft TT&C and Information Transmission Theory and Technologies introduces the basic theory of spacecraft TT&C (telemetry, tracking and command) and information transmission. Combining TT&C and information transmission, the book presents several technologies for continuous wave radar, including measurements of range, range rate and angle, analog and digital information transmission, telecommand, telemetry, remote sensing and spread spectrum TT&C. For special problems occurring in the channels for TT&C and information transmission, the book presents radio propagation features and their impact on orbit measurement accuracy, the effects caused by rain attenuation, atmospheric attenuation and the multi-path effect, and polarization composition technology. This book can benefit researchers and engineers in the field of spacecraft TT&C and communication systems. Liu Jiaxing is a professor at The 10th Institute of China Electronics Technology Group Corporation.

  20. Towards an Information Theory of Complex Networks

    CERN Document Server

    Dehmer, Matthias; Mehler, Alexander

    2011-01-01

    For over a decade, complex networks have steadily grown as an important tool across a broad array of academic disciplines, with applications ranging from physics to social media. A tightly organized collection of carefully-selected papers on the subject, Towards an Information Theory of Complex Networks: Statistical Methods and Applications presents theoretical and practical results about information-theoretic and statistical models of complex networks in the natural sciences and humanities. The book's major goal is to advocate and promote a combination of graph-theoretic, information-theoreti

  1. Quantum Information Biology: From Theory of Open Quantum Systems to Adaptive Dynamics

    Science.gov (United States)

    Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu; Yamato, Ichiro

    This chapter reviews quantum(-like) information biology (QIB). Here biology is treated widely as even covering cognition and its derivatives: psychology and decision making, sociology, and behavioral economics and finances. QIB provides an integrative description of information processing by bio-systems at all scales of life: from proteins and cells to cognition, ecological and social systems. Mathematically QIB is based on the theory of adaptive quantum systems (which also covers open quantum systems). Ideologically QIB is based on the quantum-like (QL) paradigm: complex bio-systems process information in accordance with the laws of quantum information and probability. This paradigm is supported by plenty of statistical bio-data collected at all bio-scales. QIB reflects the two fundamental principles: a) adaptivity; and, b) openness (bio-systems are fundamentally open). In addition, quantum adaptive dynamics provides the most general possible mathematical representation of these principles.

  2. The Foundation Role for Theories of Agency in Understanding Information Systems Design

    Directory of Open Access Journals (Sweden)

    Robert Johnston

    2002-11-01

    In this paper we argue that theories of agency form a foundation upon which we can build a deeper understanding of information systems design. We do so by firstly recognising that information systems are part of purposeful sociotechnical systems and that consequently theories of agency may help in understanding them. We then present two alternative theories of agency (deliberative and situational, mainly drawn from the robotics and artificial intelligence disciplines, and in doing so, we note that existing information system design methods and ontological studies of those methods implicitly adhere to the deliberative theory of agency. We also note that while there are advantages in specific circumstances from utilising the situated theory of agency in designing complex systems, because of their differing ontological commitments, such systems would be difficult to analyse and evaluate using ontologies currently used in information systems. We then provide evidence that such situational information systems can indeed exist, by giving a specific example (the Kanban system, which has emerged from manufacturing practice. We conclude that information systems are likely to benefit from creating design approaches supporting the production of situational systems.

  3. The informationally-complete quantum theory

    OpenAIRE

    Chen, Zeng-Bing

    2014-01-01

    Quantum mechanics is a cornerstone of our current understanding of nature and extremely successful in describing physics covering a huge range of scales. However, its interpretation remains controversial since the early days of quantum mechanics. What does a quantum state really mean? Is there any way out of the so-called quantum measurement problem? Here we present an informationally-complete quantum theory (ICQT) and the trinary property of nature to beat the above problems. We assume that ...

  4. Using information theory to assess the communicative capacity of circulating microRNA.

    Science.gov (United States)

    Finn, Nnenna A; Searles, Charles D

    2013-10-11

    The discovery of extracellular microRNAs (miRNAs) and their transport modalities (i.e., microparticles, exosomes, proteins and lipoproteins) has sparked theories regarding their role in intercellular communication. Here, we assessed the information transfer capacity of different miRNA transport modalities in human serum by utilizing basic principles of information theory. Zipf statistics were calculated for each of the miRNA transport modalities identified in human serum. Our analyses revealed that miRNA-mediated information transfer is redundant, as evidenced by negative Zipf statistics with magnitudes greater than one. In healthy subjects, the potential communicative capacity of miRNA in complex with circulating proteins was significantly lower than that of miRNA encapsulated in circulating microparticles and exosomes. Moreover, the presence of coronary heart disease significantly lowered the communicative capacity of all circulating miRNA transport modalities. To assess the internal organization of circulating miRNA signals, Shannon's zero- and first-order entropies were calculated. Microparticles (MPs) exhibited the lowest Shannon entropic slope, indicating a relatively high capacity for information transfer. Furthermore, compared to the other miRNA transport modalities, MPs appeared to be the most efficient at transferring miRNA to cultured endothelial cells. Taken together, these findings suggest that although all transport modalities have the capacity for miRNA-based information transfer, MPs may be the simplest and most robust way to achieve miRNA-based signal transduction in sera. This study presents a novel method for analyzing the quantitative capacity of miRNA-mediated information transfer while providing insight into the communicative characteristics of distinct circulating miRNA transport modalities. Published by Elsevier Inc.
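    Shannon's zero- and first-order entropies for a symbol sequence can be sketched as follows; this is a generic illustration (the mapping from miRNA profiles to symbols is not specified in this record and is left abstract):

```python
from collections import Counter
import math

def shannon_entropy(seq):
    """Shannon entropy (bits) of a sequence of hashable symbols."""
    counts = Counter(seq)
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def zero_and_first_order_entropy(seq):
    """H0: entropy of single symbols; H1: conditional entropy of a symbol
    given its predecessor, computed as H(pairs) - H(preceding symbols)."""
    pairs = list(zip(seq, seq[1:]))
    h0 = shannon_entropy(seq)
    h1 = shannon_entropy(pairs) - shannon_entropy(seq[:-1])
    return h0, h1
```

    A steep drop from H0 to H1 (a low "entropic slope" in the sense of the abstract) indicates strong internal organization: for a strictly alternating sequence, H1 is zero because each symbol fully determines the next.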

  5. Application of a model of social information processing to nursing theory: how nurses respond to patients.

    Science.gov (United States)

    Sheldon, Lisa Kennedy; Ellington, Lee

    2008-11-01

    This paper is a report of a study to assess the applicability of a theoretical model of social information processing in expanding a nursing theory addressing how nurses respond to patients. Nursing communication affects patient outcomes such as anxiety, adherence to treatments and satisfaction with care. Orlando's theory of nursing process describes nurses' reactions to patients' behaviour as generating a perception, thought and feeling in the nurse and then action by the nurse. A model of social information processing describes the sequential steps in the cognitive processes used to respond to social cues and may be useful in describing the nursing process. Cognitive interviews were conducted in 2006 with a convenience sample of 5 nurses in the United States of America. The data were interpreted using the Crick and Dodge model of social information processing. Themes arising from cognitive interviews validated concepts of the nursing theory and the constructs of the model of social information processing. The interviews revealed that the support of peers was an additional construct involved in the development of communication skills, creation of a database and enhancement of self-efficacy. Models of social information processing enhance understanding of how nurses respond to patients and further develop nursing theory. In combination, the theories are useful in developing research into nurse-patient communication. Future research based on the expansion of nursing theory may identify effective and culturally appropriate nurse response patterns to specific patient interactions with implications for nursing care and patient outcomes.

  6. Towards integrating control and information theories from information-theoretic measures to control performance limitations

    CERN Document Server

    Fang, Song; Ishii, Hideaki

    2017-01-01

    This book investigates the performance limitation issues in networked feedback systems. The fact that networked feedback systems consist of control and communication devices and systems calls for the integration of control theory and information theory. The primary contributions of this book lie in two aspects: the newly-proposed information-theoretic measures and the newly-discovered control performance limitations. We first propose a number of information notions to facilitate the analysis. Using those notions, classes of performance limitations of networked feedback systems, as well as state estimation systems, are then investigated. In general, the book presents a unique, cohesive treatment of performance limitation issues of networked feedback systems via an information-theoretic approach. This book is believed to be the first to treat the aforementioned subjects systematically and in a unified manner, offering a unique perspective differing from existing books.

  7. Analyzing complex networks evolution through Information Theory quantifiers

    Energy Technology Data Exchange (ETDEWEB)

    Carpi, Laura C., E-mail: Laura.Carpi@studentmail.newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Rosso, Osvaldo A., E-mail: rosso@fisica.ufmg.b [Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Chaos and Biology Group, Instituto de Calculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Pabellon II, Ciudad Universitaria, 1428 Ciudad de Buenos Aires (Argentina); Saco, Patricia M., E-mail: Patricia.Saco@newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Hidraulica, Facultad de Ciencias Exactas, Ingenieria y Agrimensura, Universidad Nacional de Rosario, Avenida Pellegrini 250, Rosario (Argentina); Ravetti, Martin Gomez, E-mail: martin.ravetti@dep.ufmg.b [Departamento de Engenharia de Producao, Universidade Federal de Minas Gerais, Av. Antonio Carlos, 6627, Belo Horizonte (31270-901), MG (Brazil)

    2011-01-24

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed, the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease and a climate network for the Tropical Pacific region to study the El Nino/Southern Oscillation (ENSO) dynamic. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.

  8. The Identification of Reasons, Solutions, and Techniques Informing a Theory-Based Intervention Targeting Recreational Sports Participation

    Science.gov (United States)

    St Quinton, Tom; Brunton, Julie A.

    2018-01-01

    Purpose: This study is the 3rd piece of formative research utilizing the theory of planned behavior to inform the development of a behavior change intervention. Focus groups were used to identify reasons for and solutions to previously identified key beliefs in addition to potentially effective behavior change techniques. Method: A purposive…

  9. Understanding family health information seeking: a test of the theory of motivated information management.

    Science.gov (United States)

    Hovick, Shelly R

    2014-01-01

    Although a family health history can be used to assess disease risk and increase health prevention behaviors, research suggests that few people have collected family health information. Guided by the Theory of Motivated Information Management, this study seeks to understand the barriers to and facilitators of interpersonal information seeking about family health history. Individuals who were engaged to be married (N = 306) were surveyed online and in person to understand how factors such as uncertainty, expectations for an information search, efficacy, and anxiety influence decisions and strategies for obtaining family health histories. The results supported the Theory of Motivated Information Management by demonstrating that individuals who experienced uncertainty discrepancies regarding family health history had greater intention to seek information from family members when anxiety was low, outcome expectancy was high, and communication efficacy was positive. Although raising uncertainty about family health history may be an effective tool for health communicators to increase communication among family members, low-anxiety situations may be optimal for information seeking. Health communication messages must also build confidence in people's ability to communicate with family to obtain the needed health information.

  10. A Rolling Element Bearing Fault Diagnosis Approach Based on Multifractal Theory and Gray Relation Theory.

    Science.gov (United States)

    Li, Jingchao; Cao, Yunpeng; Ying, Yulong; Li, Shuying

    2016-01-01

    Bearing failure is one of the dominant causes of failure and breakdowns in rotating machinery, leading to huge economic loss. To address the nonstationary and nonlinear characteristics of bearing vibration signals, as well as the complex distribution of condition-indicating information within them, a novel rolling element bearing fault diagnosis method based on multifractal theory and gray relation theory is proposed in this paper. Firstly, a generalized multifractal dimension algorithm was developed to extract characteristic vectors of fault features from the bearing vibration signals, which can offer more meaningful and distinguishing information reflecting different bearing health statuses than a conventional single fractal dimension. After feature extraction by multifractal dimensions, an adaptive gray relation algorithm was applied to implement automated bearing fault pattern recognition. The experimental results show that the proposed method can identify various bearing fault types as well as severities effectively and accurately.

  11. Systematizing Web Search through a Meta-Cognitive, Systems-Based, Information Structuring Model (McSIS)

    Science.gov (United States)

    Abuhamdieh, Ayman H.; Harder, Joseph T.

    2015-01-01

    This paper proposes a meta-cognitive, systems-based, information structuring model (McSIS) to systematize online information search behavior based on literature review of information-seeking models. The General Systems Theory's (GST) prepositions serve as its framework. Factors influencing information-seekers, such as the individual learning…

  12. Information systems theory

    CERN Document Server

    Dwivedi, Yogesh K; Schneberger, Scott L

    2011-01-01

    The overall mission of this book is to provide a comprehensive understanding and coverage of the various theories and models used in IS research. Specifically, it aims to focus on the following key objectives: To describe the various theories and models applicable to studying IS/IT management issues. To outline and describe, for each of the various theories and models, independent and dependent constructs, reference discipline/originating area, originating author(s), seminal articles, level of analysis (i.e. firm, individual, industry) and links with other theories. To provide a critical revie

  13. Why hydrological predictions should be evaluated using information theory

    Directory of Open Access Journals (Sweden)

    S. V. Weijs

    2010-12-01

    Probabilistic predictions are becoming increasingly popular in hydrology. Equally important are methods to test such predictions, given the topical debate on uncertainty analysis in hydrology. Also in the special case of hydrological forecasting, there is still discussion about which scores to use for their evaluation. In this paper, we propose to use information theory as the central framework to evaluate predictions. From this perspective, we hope to shed some light on what verification scores measure and should measure. We start from the ''divergence score'', a relative entropy measure that was recently found to be an appropriate measure for forecast quality. An interpretation of a decomposition of this measure provides insight into the additive relations between climatological uncertainty, correct information, wrong information and remaining uncertainty. When the score is applied to deterministic forecasts, it follows that these increase uncertainty to infinity. In practice, however, deterministic forecasts tend to be judged far more mildly and are widely used. We resolve this paradoxical result by proposing that deterministic forecasts either are implicitly probabilistic or are implicitly evaluated with an underlying decision problem or utility in mind. We further propose that calibration of models representing a hydrological system should be based on information-theoretical scores, because this allows extracting all information from the observations and avoids learning from information that is not there. Calibration based on maximizing utility for society trains an implicit decision model rather than the forecasting system itself. This inevitably results in a loss or distortion of information in the data and more risk of overfitting, possibly leading to less valuable and informative forecasts. We also show this in an example. The final conclusion is that models should preferably be explicitly probabilistic and calibrated to maximize the
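    For a single categorical observation, the divergence score reduces to the ignorance score. A minimal sketch, assuming base-2 logarithms and treating the verifying observation as a degenerate distribution (this is an illustration of the principle, not the paper's full decomposition):

```python
import math

def divergence_score(observed_idx, forecast):
    """Relative entropy between the verifying observation (a degenerate
    distribution on the observed category) and a probabilistic forecast.
    For a point observation this reduces to the ignorance score
    -log2 p(observed)."""
    p = forecast[observed_idx]
    return math.inf if p == 0.0 else -math.log2(p)
```

    A climatological 50/50 forecast of a binary event costs exactly 1 bit, while a deterministic forecast that turns out wrong (probability 0 on the observed category) scores infinite uncertainty, which is the paradox the abstract discusses.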

  14. Role-based typology of information technology : Model development and assessment.

    NARCIS (Netherlands)

    Zand, F.; Solaimani, H. (Sam); van Beers, C.

    2015-01-01

    Managers aim to explain how and why IT creates business value, recognize their IT-based capabilities, and select the appropriate IT to enhance and leverage those capabilities. This article synthesizes the Organizational Information Processing Theory and Resource-Based View into a descriptive

  15. The use of information theory in evolutionary biology.

    Science.gov (United States)

    Adami, Christoph

    2012-05-01

    Information is a key concept in evolutionary biology. Information stored in a biological organism's genome is used to generate the organism and to maintain and control it. Information is also that which evolves. When a population adapts to a local environment, information about this environment is fixed in a representative genome. However, when an environment changes, information can be lost. At the same time, information is processed by animal brains to survive in complex environments, and the capacity for information processing also evolves. Here, I review applications of information theory to the evolution of proteins and to the evolution of information processing in simulated agents that adapt to perform a complex task. © 2012 New York Academy of Sciences.
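
    The idea that information about an environment is "fixed in a representative genome" is commonly quantified as the per-site information content of a set of aligned sequences. The following is a hedged sketch of that standard measure, with toy data of our own:

    ```python
    import math
    from collections import Counter

    def per_site_information(alignment):
        """Information (bits) stored at each column of an aligned set of
        equal-length DNA sequences: R_i = 2 - H_i, where H_i is the Shannon
        entropy of the symbol distribution in column i and 2 = log2(4) is
        the maximum entropy for a four-letter alphabet."""
        n_sites = len(alignment[0])
        info = []
        for i in range(n_sites):
            counts = Counter(seq[i] for seq in alignment)
            total = sum(counts.values())
            h = -sum((c / total) * math.log2(c / total)
                     for c in counts.values())
            info.append(2.0 - h)
        return info

    # Toy "population": conserved sites carry the full 2 bits,
    # variable sites carry less.
    population = ["ACGT", "ACGA", "ACCT", "ACGT"]
    print(per_site_information(population))
    ```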

  16. Novel theory of the human brain: information-commutation basis of architecture and principles of operation

    Directory of Open Access Journals (Sweden)

    Bryukhovetskiy AS

    2015-02-01

    Full Text Available Andrey S Bryukhovetskiy Center for Biomedical Technologies, Federal Research and Clinical Center for Specialized Types of Medical Assistance and Medical Technologies of the Federal Medical Biological Agency, NeuroVita Clinic of Interventional and Restorative Neurology and Therapy, Moscow, Russia Abstract: Based on the methodology of the informational approach and research of the genome, proteome, and complete transcriptome profiles of different cells in the nervous tissue of the human brain, the author proposes a new theory of information-commutation organization and architecture of the human brain which is an alternative to the conventional systemic connective morphofunctional paradigm of the brain framework. Informational principles of brain operation are defined: the modular principle, holographic principle, principle of systematicity of vertical commutative connection and complexity of horizontal commutative connection, regulatory principle, relay principle, modulation principle, “illumination” principle, principle of personalized memory and intellect, and principle of low energy consumption. The author demonstrates that the cortex functions only as a switchboard and router of information, while information is processed outside the nervous tissue of the brain in the intermeningeal space. The main structural element of information-commutation in the brain is not the neuron, but information-commutation modules that are subdivided into receiver modules, transmitter modules, and subscriber modules, forming a vertical architecture of nervous tissue in the brain as information lines and information channels, and a horizontal architecture as central, intermediate, and peripheral information-commutation platforms. Information in information-commutation modules is transferred by means of the carriers that are characteristic to the specific information level from inductome to genome, transcriptome, proteome, metabolome, secretome, and magnetome

  17. Information theory and its application to optical communication

    NARCIS (Netherlands)

    Willems, F.M.J.

    2017-01-01

    The lecture focuses on the foundations of communication that were developed within the field of information theory. Enumerative shaping techniques and the so-called square-root transform will be discussed in detail.

  18. Brain activity and cognition: a connection from thermodynamics and information theory.

    Science.gov (United States)

    Collell, Guillem; Fauquet, Jordi

    2015-01-01

    The connection between brain and mind is an important scientific and philosophical question that we are still far from completely understanding. A crucial point to our work is noticing that thermodynamics provides a convenient framework to model brain activity, whereas cognition can be modeled in information-theoretical terms. In fact, several models have been proposed so far from both approaches. A second critical remark is the existence of deep theoretical connections between thermodynamics and information theory. In fact, some well-known authors claim that the laws of thermodynamics are nothing but principles in information theory. Unlike in physics or chemistry, a formalization of the relationship between information and energy is currently lacking in neuroscience. In this paper we propose a framework to connect physical brain and cognitive models by means of the theoretical connections between information theory and thermodynamics. Ultimately, this article aims at providing further insight on the formal relationship between cognition and neural activity.
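
    One concrete, well-established bridge between thermodynamics and information theory of the kind the authors build on is Landauer's principle: erasing one bit of information dissipates at least k_B * T * ln 2 of energy. A small illustrative computation (our own example, not taken from the paper):

    ```python
    import math

    K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

    def landauer_bound(temperature_kelvin):
        """Minimum energy (joules) that must be dissipated to erase one bit
        of information: E = k_B * T * ln 2."""
        return K_B * temperature_kelvin * math.log(2)

    # At human body temperature (~310 K), erasing one bit costs at least:
    print(landauer_bound(310.0))  # ~3e-21 J
    ```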


  20. Resource-Based View of Information Systems: Sustainable and Transient Competitive Advantage Perspectives

    Directory of Open Access Journals (Sweden)

    Gaurav Gupta

    2018-01-01

    Full Text Available The resource-based view (RBV, or resource-based theory, is one of the oldest and most influential theories in the field of information systems. This paper contends that it is timely to revisit, reflect on, and reposition RBV to ensure its continued disciplinary relevance and progress. In doing so, this paper (i provides a succinct and sharp evaluation of the conventional RBV of information systems that firms use to establish sustainable competitive advantage, and (ii makes an original contribution by introducing a contemporary RBV of information systems that firms can use to establish transient competitive advantage. Both these contributions should advance the current and future understanding of information systems as (a an internal firm resource, (b a source of competitive advantage, and (c a driver of firm performance.

  1. Information structures in economics studies in the theory of markets with imperfect information

    CERN Document Server

    Nermuth, Manfred

    1982-01-01

    This book is intended as a contribution to the theory of markets with imperfect information. The subject being nearly limitless, only certain selected topics are discussed. These are outlined in the Introduction (Ch. 0). The remainder of the book is divided into three parts. All results of economic significance are contained in Parts II & III. Part I introduces the main tools for the analysis, in particular the concept of an information structure. Although most of the material presented in Part I is not original, it is hoped that the detailed and self-contained exposition will help the reader to understand not only the following pages, but also the existing technical and variegated literature on markets with imperfect information. The mathematical prerequisites needed, but not explained in the text, rarely go beyond elementary calculus and probability theory. Whenever more advanced concepts are used, I have made an effort to give an intuitive explanation as well, so that the argument can also be followed o...

  2. Analytical implications of using practice theory in workplace information literacy research

    DEFF Research Database (Denmark)

    Moring, Camilla Elisabeth; Lloyd, Annemaree

    2013-01-01

    Introduction: This paper considers practice theory and the analytical implications of using this theoretical approach in information literacy research. More precisely, the aim of the paper is to discuss the translation of practice-theoretical assumptions into strategies that frame the analytical focus and interest when researching workplace information literacy. Two practice-theoretical perspectives are selected, one by Theodore Schatzki and one by Etienne Wenger, and their general commonalities and differences are analysed and discussed. Analysis: The two practice theories and their main ideas of what constitutes practices, how practices frame social life and the central concepts used to explain this, are presented. Then the application of the theories within workplace information literacy research is briefly explored. Results and Conclusion: The two theoretical perspectives share some...

  3. Theory-Based Evaluation Meets Ambiguity

    DEFF Research Database (Denmark)

    Dahler-Larsen, Peter

    2017-01-01

    As theory-based evaluation (TBE) engages in situations where multiple stakeholders help develop complex program theory about dynamic phenomena in politically contested settings, it becomes difficult to develop and use program theory without ambiguity. The purpose of this article is to explore ambiguity as a fruitful perspective that helps TBE face current challenges. Literatures in organization theory and political theory are consulted in order to cultivate the concept of ambiguity. Janus variables (which work in two ways) and other ambiguous aspects of program theories are classified and exemplified. Stances towards ambiguity are considered, as are concrete steps that TBE evaluators can take to identify and deal with ambiguity in TBE.

  4. System Dynamics as Model-Based Theory Building

    OpenAIRE

    Schwaninger, Markus; Grösser, Stefan N.

    2008-01-01

    This paper introduces model-based theory building as a feature of system dynamics (SD) with large potential. It presents a systemic approach to actualizing that potential, thereby opening up a new perspective on theory building in the social sciences. The question addressed is whether and how SD enables the construction of high-quality theories. This contribution is based on field-experiment-type projects focused on model-based theory building, specifically the construction of a mi...

  5. A short course in quantum information theory. An approach from theoretical physics. 2. ed.

    International Nuclear Information System (INIS)

    Diosi, Lajos

    2011-01-01

    This short and concise primer takes the vantage point of theoretical physics and the unity of physics. It sets out to strip the burgeoning field of quantum information science to its basics by linking it to universal concepts in physics. An extensive lecture rather than a comprehensive textbook, this volume is based on courses delivered over several years to advanced undergraduate and beginning graduate students, but essentially it addresses anyone with a working knowledge of basic quantum physics. Readers will find these lectures a most adequate entry point for theoretical studies in this field. For the second edition, the author has succeeded in adding many new topics while sticking to the conciseness of the overall approach. A new chapter on qubit thermodynamics has been added, while new sections and subsections have been incorporated in various chapters to deal with weak and time-continuous measurements, period-finding quantum algorithms and quantum error corrections. From the reviews of the first edition: "The best things about this book are its brevity and clarity. In around 100 pages it provides a tutorial introduction to quantum information theory, including problems and solutions... it's worth a look if you want to quickly get up to speed with the language and central concepts of quantum information theory, including the background classical information theory." (Craig Savage, Australian Physics, Vol. 44 (2), 2007). (orig.)

  6. New Aspects of Probabilistic Forecast Verification Using Information Theory

    Science.gov (United States)

    Tödter, Julian; Ahrens, Bodo

    2013-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification, particularly concerning ensemble forecasts. Recent findings concerning the "Ignorance Score" are briefly reviewed, then a consistent generalization to continuous forecasts is motivated. For ensemble-generated forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are prominent verification measures for probabilistic forecasts. Particularly, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up a natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The useful properties of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its components (reliability, resolution, and uncertainty) for ensemble-generated forecasts. This algorithm can also be used to calculate the decomposition of the more traditional CRPS exactly. The applicability of the "new" measures is demonstrated in a small evaluation study of ensemble-based precipitation forecasts.
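
    For binary events, the Brier Score and the Ignorance Score compared in this abstract can be computed directly from forecast probabilities and observed outcomes. An illustrative sketch (variable names are ours; the paper's decompositions are not reproduced here):

    ```python
    import math

    def brier_score(forecast_probs, outcomes):
        """Mean squared difference between the forecast probability of the
        event and the binary outcome (1 = event occurred)."""
        return sum((p - o) ** 2
                   for p, o in zip(forecast_probs, outcomes)) / len(outcomes)

    def ignorance_score(forecast_probs, outcomes):
        """Mean negative log2 probability assigned to what actually
        happened; the BS above is its second-order approximation."""
        total = 0.0
        for p, o in zip(forecast_probs, outcomes):
            p_observed = p if o == 1 else 1.0 - p
            total += -math.log2(p_observed)
        return total / len(outcomes)

    probs = [0.9, 0.7, 0.2, 0.6]  # forecast probabilities of precipitation
    obs = [1, 1, 0, 0]            # what was observed
    print(brier_score(probs, obs))      # 0.125
    print(ignorance_score(probs, obs))  # ~0.578 bits
    ```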

  7. An information theory of image gathering

    Science.gov (United States)

    Fales, Carl L.; Huck, Friedrich O.

    1991-01-01

    Shannon's mathematical theory of communication is extended to image gathering. Expressions are obtained for the total information that is received with a single image-gathering channel and with parallel channels. It is concluded that the aliased signal components carry information even though these components interfere with the within-passband components in conventional image gathering and restoration, thereby degrading the fidelity and visual quality of the restored image. An examination of the expression for minimum mean-square-error, or Wiener-matrix, restoration from parallel image-gathering channels reveals a method for unscrambling the within-passband and aliased signal components to restore spatial frequencies beyond the sampling passband out to the spatial frequency response cutoff of the optical aperture.

  8. Information theory of open fragmenting systems

    International Nuclear Information System (INIS)

    Gulminelli, F.; Juillet, O.; Chomaz, Ph.; Ison, M. J.; Dorso, C. O.

    2007-01-01

    An information theory description of finite systems explicitly evolving in time is presented. We impose a MaxEnt variational principle on the Shannon entropy at a given time while the constraints are set at a former time. The resulting density matrix contains explicit time odd components in the form of collective flows. As a specific application we consider the dynamics of the expansion in connection with heavy ion experiments. Lattice gas and classical molecular dynamics simulations are shown

  9. Applying an expectancy-value model to study motivators for work-task based information seeking

    DEFF Research Database (Denmark)

    Sigaard, Karen Tølbøl; Skov, Mette

    2015-01-01

    on the theory of expectancy-value and on the operationalisation used when the model was first developed. Data for the analysis were collected from a sample of seven informants working as consultants in Danish municipalities. Each participant filled out a questionnaire, kept a log book for a week... for interpersonal and internal sources increased when the task had high-value motivation or low-expectancy motivation or both. Research limitations/implications: The study is based on a relatively small sample and considers only one motivation theory. This should be addressed in future research, along with a broadening of the studied group to involve other professions than municipality consultants. Originality/value: Motivational theories from the field of psychology have been used sparsely in studies of information seeking. This study operationalises and verifies such a theory based on a theoretical adaptation...

  10. Using institutional theory with sensemaking theory: a case study of information system implementation in healthcare

    DEFF Research Database (Denmark)

    Jensen, Tina Blegind; Kjærgaard, Annemette; Svejvig, Per

    2009-01-01

    Institutional theory has proven to be a central analytical perspective for investigating the role of social and historical structures in information systems (IS) implementation. However, it does not explicitly account for how organisational actors make sense of and enact technologies in their local context. We address this limitation by exploring the potential of using institutional theory with sensemaking theory to study IS implementation in organisations. We argue that each theoretical perspective has its own explanatory power and that a combination of the two facilitates a much richer interpretation of IS implementation by linking macro- and micro-levels of analysis. To illustrate this, we report from an empirical study of the implementation of an Electronic Patient Record (EPR) system in a clinical setting. Using key constructs from the two theories, our findings address the phenomenon...

  11. The development and implementation of theory-driven programs capable of addressing poverty-impacted children's health, mental health, and prevention needs: CHAMP and CHAMP+, evidence-informed, family-based interventions to address HIV risk and care.

    Science.gov (United States)

    McKernan McKay, Mary; Alicea, Stacey; Elwyn, Laura; McClain, Zachary R B; Parker, Gary; Small, Latoya A; Mellins, Claude Ann

    2014-01-01

    This article describes a program of prevention and intervention research conducted by the CHAMP (Collaborative HIV prevention and Adolescent Mental health Project; McKay & Paikoff, 2007) investigative team. CHAMP refers to a set of theory-driven, evidence-informed, collaboratively designed, family-based approaches meant to address the prevention, health, and mental health needs of poverty-impacted African American and Latino urban youth who are either at risk for HIV exposure or perinatally infected and at high risk for reinfection and possible transmission. CHAMP approaches are informed by theoretical frameworks that incorporate an understanding of the critical influences of multilevel contextual factors on youth risk taking and engagement in protective health behaviors. Highly influential theories include the triadic theory of influence, social action theory, and ecological developmental perspectives. CHAMP program delivery strategies were developed via a highly collaborative process drawing upon community-based participatory research methods in order to enhance cultural and contextual sensitivity of program content and format. The development and preliminary outcomes associated with a family-based intervention for a new population, perinatally HIV-infected youth and their adult caregivers, referred to as CHAMP+, is described to illustrate the integration of theory, existing evidence, and intensive input from consumers and healthcare providers.

  12. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    Science.gov (United States)

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes of the building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be selected to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses the quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for progress, quality, and safety indexes, and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making analysis and decision. The presented method can offer valuable references for risk computing of building construction projects.
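
    The abstract does not give formulas, but a standard way to turn an index matrix into weights using information entropy is the entropy weight method. The following is a hedged sketch of that general technique (our own toy numbers), not the paper's exact procedure:

    ```python
    import math

    def entropy_weights(index_matrix):
        """Entropy weight method: rows are candidate construction schemes,
        columns are evaluation indexes (e.g. cost, progress, quality,
        safety). Indexes whose scores differ more across schemes carry
        more information and therefore receive larger weights."""
        m = len(index_matrix)        # number of schemes
        n = len(index_matrix[0])     # number of indexes
        raw = []
        for j in range(n):
            col = [row[j] for row in index_matrix]
            s = sum(col)
            probs = [v / s for v in col]
            # Normalized Shannon entropy of the column (1 = no spread).
            h = -sum(p * math.log(p) for p in probs if p > 0) / math.log(m)
            raw.append(1.0 - h)      # divergence degree of index j
        total = sum(raw)
        return [w / total for w in raw]

    # Hypothetical normalized scores of 3 schemes on 4 indexes:
    matrix = [
        [0.8, 0.5, 0.9, 0.6],
        [0.6, 0.5, 0.4, 0.6],
        [0.7, 0.5, 0.2, 0.6],
    ]
    w = entropy_weights(matrix)
    print(w)  # the third column (largest spread) gets the largest weight
    ```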

  13. A Game Theory Based Solution for Security Challenges in CRNs

    Science.gov (United States)

    Poonam; Nagpal, Chander Kumar

    2018-03-01

    Cognitive radio networks (CRNs) are being envisioned to drive the next generation of ad hoc wireless networks due to their ability to provide communications resilience in continuously changing environments through the use of dynamic spectrum access. Conventionally, CRNs are dependent upon the information gathered by other secondary users to ensure the accuracy of spectrum sensing, making them vulnerable to security attacks and leading to the need for security mechanisms like cryptography and trust. However, a typical cryptography based solution is not a viable security solution for CRNs owing to their limited resources. The effectiveness of trust based approaches has always been in question, due to the credibility of secondary trust resources. Game theory, with its ability to optimize in an environment of conflicting interests, can be quite a suitable tool to manage an ad hoc network in the presence of autonomous selfish/malevolent/malicious and attacker nodes. The literature contains several theoretical proposals for augmenting game theory in ad hoc networks without explicit/detailed implementation. This paper implements a game theory based solution in MATLAB-2015 to secure the CRN environment and compares the obtained results with the traditional approaches of trust and cryptography. The simulation results indicate that as time progresses the game theory based solution performs much better, with higher throughput, lower jitter and better identification of selfish/malicious nodes.

  14. Human vision is determined based on information theory

    Science.gov (United States)

    Delgado-Bonal, Alfonso; Martín-Torres, Javier

    2016-11-01

    It is commonly accepted that the evolution of the human eye has been driven by the maximum intensity of the radiation emitted by the Sun. However, the interpretation of the surrounding environment is constrained not only by the amount of energy received but also by the information content of the radiation. Information is related to entropy rather than energy. The human brain follows Bayesian statistical inference for the interpretation of visual space. The maximization of information occurs in the process of maximizing the entropy. Here, we show that the photopic and scotopic vision absorption peaks in humans are determined not only by the intensity but also by the entropy of radiation. We suggest that through the course of evolution, the human eye has not adapted only to the maximum intensity or to the maximum information but to the optimal wavelength for obtaining information. On Earth, the optimal wavelengths for photopic and scotopic vision are 555 nm and 508 nm, respectively, as inferred experimentally. These optimal wavelengths are determined by the temperature of the star (in this case, the Sun) and by the atmospheric composition.
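
    The intensity-only prediction that the authors argue against can be reproduced with Wien's displacement law: for the Sun's effective temperature, the spectral radiance peak (per unit wavelength) falls near 502 nm, close to the scotopic peak (508 nm) but well away from the photopic one (555 nm). A quick illustrative check (constants from CODATA; the entropy-based calculation in the paper is not reproduced here):

    ```python
    WIEN_B = 2.897771955e-3  # Wien displacement constant, m*K

    def intensity_peak_wavelength(temperature_kelvin):
        """Wavelength (nm) at which a black body's spectral radiance per
        unit wavelength peaks, via Wien's displacement law."""
        return WIEN_B / temperature_kelvin * 1e9

    # Solar effective temperature ~5778 K:
    peak = intensity_peak_wavelength(5778.0)
    print(peak)  # ~501.5 nm
    ```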

  15. Product-oriented design theory for digital information services: A literature review.

    NARCIS (Netherlands)

    Wijnhoven, Alphonsus B.J.M.; Kraaijenbrink, Jeroen

    2008-01-01

    Purpose – The purpose of this paper is to give a structured literature review, design concepts, and research propositions related to a product-oriented design theory for information services. Information services facilitate the exchange of information goods with or without transforming these goods.

  16. Critical theory as an approach to the ethics of information security.

    Science.gov (United States)

    Stahl, Bernd Carsten; Doherty, Neil F; Shaw, Mark; Janicke, Helge

    2014-09-01

    Information security can be of high moral value. It can equally be used for immoral purposes and have undesirable consequences. In this paper we suggest that critical theory can facilitate a better understanding of possible ethical issues and can provide support when finding ways of addressing them. The paper argues that critical theory has intrinsic links to ethics and that it is possible to identify concepts frequently used in critical theory to pinpoint ethical concerns. Using the example of UK electronic medical records the paper demonstrates that a critical lens can highlight issues that traditional ethical theories tend to overlook. These are often linked to collective issues such as social and organisational structures, which philosophical ethics with its typical focus on the individual does not tend to emphasise. The paper suggests that this insight can help in developing ways of researching and innovating responsibly in the area of information security.

  17. MaxEnt-Based Ecological Theory: A Template for Integrated Catchment Theory

    Science.gov (United States)

    Harte, J.

    2017-12-01

    The maximum information entropy procedure (MaxEnt) is both a powerful tool for inferring least-biased probability distributions from limited data and a framework for the construction of complex systems theory. The maximum entropy theory of ecology (METE) describes remarkably well widely observed patterns in the distribution, abundance and energetics of individuals and taxa in relatively static ecosystems. An extension to ecosystems undergoing change in response to disturbance or natural succession (DynaMETE) is in progress. I describe the structure of both the static and the dynamic theory and show a range of comparisons with census data. I then propose a generalization of the MaxEnt approach that could provide a framework for a predictive theory of both static and dynamic, fully-coupled, eco-socio-hydrological catchment systems.
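
    At the heart of MaxEnt, and hence of METE, is the inference of a least-biased distribution under constraints. The following self-contained sketch (a toy mean-abundance constraint of our own, not METE's full constraint structure) finds the maximum entropy distribution over discrete abundance classes by solving for the Lagrange multiplier with bisection:

    ```python
    import math

    def maxent_distribution(values, target_mean, lo=-50.0, hi=50.0, tol=1e-12):
        """Maximum Shannon entropy distribution over discrete `values`
        subject to a single mean constraint: p_i proportional to
        exp(-lam * v_i), with lam found by bisection."""
        def mean_for(lam):
            weights = [math.exp(-lam * v) for v in values]
            z = sum(weights)
            return sum(w * v for w, v in zip(weights, values)) / z

        for _ in range(200):
            mid = 0.5 * (lo + hi)
            if mean_for(mid) > target_mean:
                lo = mid   # mean decreases as lam grows, so push lam up
            else:
                hi = mid
            if hi - lo < tol:
                break
        lam = 0.5 * (lo + hi)
        weights = [math.exp(-lam * v) for v in values]
        z = sum(weights)
        return [w / z for w in weights]

    # Abundance classes 1..10 with a constrained mean abundance of 3:
    p = maxent_distribution(list(range(1, 11)), 3.0)
    print(p)  # geometric-like decay over abundance classes
    ```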

  18. Prolegomena to a theory of nuclear information exchange

    International Nuclear Information System (INIS)

    Van Nuffelen, Dominique

    1997-01-01

    From the researcher's point of view, communication with agricultural populations in case of radiological emergency can be nothing other than the application of a theory of nuclear information exchange among social groups. Consequently, it is essential to work out such a theory, the prolegomena of which are exposed in this paper. It describes an experiment conducted at the 'Service de protection contre les radiations ionisantes' - Belgium (SPRI), and proposes an investigation of the scientific knowledge in this matter. The available empirical and theoretical data allow formulating pragmatic recommendations, the principal one being the necessity of creating, under normal radiological conditions, a number of message scenarios adapted to agricultural populations. The author points out that, in order to be perfectly adapted, these scenarios must be negotiated between emitter and receiver. If this condition is satisfied, the information in case of nuclear emergency will really be an exchange of knowledge between experts and the agricultural population, i.e. a 'communication'

  19. Evaluation of EMG processing techniques using Information Theory.

    Science.gov (United States)

    Farfán, Fernando D; Politti, Julio C; Felice, Carmelo J

    2010-11-12

    Electromyographic signals can be used in the biomedical engineering and/or rehabilitation fields as potential sources of control for prosthetics and orthotics. In such applications, digital processing techniques are necessary to follow efficiently and effectively the changes in the physiological characteristics produced by a muscular contraction. In this paper, two methods based on information theory are proposed to evaluate the processing techniques. These methods determine the amount of information that a processing technique is able to extract from EMG signals. The processing techniques evaluated with these methods were: absolute mean value (AMV), RMS values, variance values (VAR) and difference absolute mean value (DAMV). EMG signals from the middle deltoid during abduction and adduction movements of the arm in the scapular plane were recorded, for static and dynamic contractions. The optimal window length (segmentation), abduction and adduction movements and inter-electrode distance were also analyzed. Using the optimal segmentation (200 ms and 300 ms in static and dynamic contractions, respectively) the best processing techniques were: RMS, AMV and VAR in static contractions, and only the RMS in dynamic contractions. Using the RMS of the EMG signal, variations in the amount of information between the abduction and adduction movements were observed. Although the evaluation methods proposed here were applied to standard processing techniques, these methods can also be considered as alternative tools to evaluate new processing techniques in different areas of electrophysiology.
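
    The four amplitude estimators named in the abstract (AMV, RMS, VAR, DAMV) have standard definitions; the sketch below (toy signal and function names are ours) shows how they might be computed per analysis window:

    ```python
    import math

    def window_features(window):
        """The four amplitude estimators evaluated in the study, computed
        on one analysis window of an EMG signal."""
        n = len(window)
        mean = sum(window) / n
        amv = sum(abs(x) for x in window) / n                   # absolute mean value
        rms = math.sqrt(sum(x * x for x in window) / n)         # root mean square
        var = sum((x - mean) ** 2 for x in window) / (n - 1)    # sample variance
        damv = sum(abs(window[i + 1] - window[i])               # difference AMV
                   for i in range(n - 1)) / (n - 1)
        return amv, rms, var, damv

    def segment(signal, window_len):
        """Split a signal into non-overlapping windows, e.g. the number of
        samples spanning 200 ms (static) or 300 ms (dynamic)."""
        return [signal[i:i + window_len]
                for i in range(0, len(signal) - window_len + 1, window_len)]

    # Toy zero-mean "EMG" trace:
    emg = [0.1, -0.3, 0.25, -0.2, 0.4, -0.15, 0.05, -0.35]
    for w in segment(emg, 4):
        print(window_features(w))
    ```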

  20. Rebooting Kirkpatrick: Integrating Information System Theory Into the Evaluation of Web-based Continuing Professional Development Interventions for Interprofessional Education.

    Science.gov (United States)

    Shen, Nelson; Yufe, Shira; Saadatfard, Omid; Sockalingam, Sanjeev; Wiljer, David

    2017-01-01

    Information system research has stressed the importance of theory in understanding how user perceptions can motivate the use and adoption of technology such as web-based continuing professional development programs for interprofessional education (WCPD-IPE). A systematic review was conducted to provide an information system perspective on the current state of WCPD-IPE program evaluation and how current evaluations capture essential theoretical constructs in promoting technology adoption. Six databases were searched to identify studies evaluating WCPD-IPE. Three investigators determined eligibility of the articles. Evaluation items extracted from the studies were assessed using the Kirkpatrick-Barr framework and mapped to the Benefits Evaluation Framework. Thirty-seven eligible studies yielded 362 evaluation items for analysis. Most items (n = 252) were assessed as Kirkpatrick-Barr level 1 (reaction) and were mainly focused on the quality (information, service, and system) and satisfaction dimensions of the Benefits Evaluation Framework. System quality was the least evaluated quality dimension, accounting for 26 items across 13 studies. WCPD-IPE use was reported in 17 studies and its antecedent factors were evaluated in varying degrees of comprehensiveness. Although user reactions were commonly evaluated, greater focus on user perceptions of system quality (ie, functionality and performance), usefulness, and usability of the web-based platform is required. Surprisingly, WCPD-IPE use was reported in less than half of the studies. This is problematic, as use is a prerequisite to realizing any individual, organizational, or societal benefit of WCPD-IPE. This review proposes an integrated framework which accounts for these factors and provides a theoretically grounded guide for future evaluations.

  1. Combination of uncertainty theories and decision-aiding methods for natural risk management in a context of imperfect information

    Science.gov (United States)

    Tacnet, Jean-Marc; Dupouy, Guillaume; Carladous, Simon; Dezert, Jean; Batton-Hubert, Mireille

    2017-04-01

    propagation in numerical modeling using both the classical Monte-Carlo probabilistic approach and the so-called hybrid approach based on possibility theory. The second approach deals with new multi-criteria decision-making methods which consider information imperfection, source reliability, importance and conflict, using fuzzy sets as well as possibility and belief function theories. The implemented methods consider information imperfection propagation and information fusion in total aggregation methods such as AHP (Saaty, 1980), in partial aggregation methods such as the Electre outranking method (see Soft Electre Tri), and in decisions in certain but also risky or uncertain contexts (see the new COWA-ER and FOWA-ER, Cautious and Fuzzy Ordered Weighted Averaging with Evidential Reasoning). For example, the ER-MCDA methodology considers expert assessment as a multi-criteria decision process based on imperfect information provided by more or less heterogeneous, reliable and conflicting sources: it mixes AHP, fuzzy sets theory, possibility theory and belief function theory using the DSmT (Dezert-Smarandache Theory) framework, which provides powerful fusion rules.
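
    The classical Monte-Carlo propagation step mentioned above can be sketched in a few lines. The runout model and the input distributions below are illustrative assumptions standing in for the real numerical model, not the authors' implementation:

```python
import random
import statistics

def runout_distance(volume, friction_angle):
    # Hypothetical debris-flow surrogate model: larger volume and lower
    # friction give a longer runout. Stands in for the real numerical model.
    return 0.5 * volume / friction_angle

random.seed(42)
samples = []
for _ in range(10_000):
    volume = random.uniform(800.0, 1200.0)   # m^3, expert-judged interval
    friction = random.gauss(0.6, 0.05)       # tan(phi), measurement uncertainty
    samples.append(runout_distance(volume, friction))

mean = statistics.fmean(samples)
p95 = sorted(samples)[int(0.95 * len(samples))]
print(f"mean runout ~ {mean:.1f} m, 95th percentile ~ {p95:.1f} m")
```

    The hybrid approach discussed in the abstract would replace the sampled `friction` distribution with a possibility distribution (an interval with a membership function) and propagate it by interval arithmetic at each membership level.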

  2. A Novel Evaluation Method for Building Construction Project Based on Integrated Information Entropy with Reliability Theory

    Directory of Open Access Journals (Sweden)

    Xiao-ping Bai

    2013-01-01

    Full Text Available Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be considered to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses a quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computing of building construction projects.
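
    The entropy-weighting step of such an evaluation can be sketched with the standard entropy weight method; the decision matrix below is hypothetical and stands in for the paper's detailed computing process:

```python
import math

# Hypothetical normalized scores of three construction schemes on the four
# first-order indexes (cost, progress, quality, safety); higher is better.
matrix = [
    [0.7, 0.8, 0.9, 0.6],  # scheme A
    [0.9, 0.6, 0.7, 0.8],  # scheme B
    [0.6, 0.9, 0.8, 0.7],  # scheme C
]

m, n = len(matrix), len(matrix[0])
k = 1.0 / math.log(m)  # normalizing constant so entropy lies in [0, 1]

# Column-wise proportions p_ij, then information entropy e_j per index.
weights = []
for j in range(n):
    col_sum = sum(row[j] for row in matrix)
    p = [row[j] / col_sum for row in matrix]
    e_j = -k * sum(p_i * math.log(p_i) for p_i in p if p_i > 0)
    weights.append(1.0 - e_j)  # divergence degree: low entropy = informative

total = sum(weights)
weights = [w / total for w in weights]  # entropy weights, sum to 1

# Synthesis score per scheme, then rank.
scores = [sum(w * x for w, x in zip(weights, row)) for row in matrix]
best = max(range(m), key=lambda i: scores[i])
print("entropy weights:", [round(w, 3) for w in weights])
print("best scheme index:", best)
```

    Indexes whose values differ little across schemes have high entropy and therefore receive low weight, which is the core idea behind using information entropy for index weighting.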

  3. Estimating security betas using prior information based on firm fundamentals

    NARCIS (Netherlands)

    Cosemans, M.; Frehen, R.; Schotman, P.C.; Bauer, R.

    2010-01-01

    This paper proposes a novel approach for estimating time-varying betas of individual stocks that incorporates prior information based on fundamentals. We shrink the rolling window estimate of beta towards a firm-specific prior that is motivated by asset pricing theory. The prior captures structural
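
    A minimal sketch of this kind of shrinkage estimator, assuming a simple precision-weighted combination; the paper derives its firm-specific prior and weights from fundamentals and asset pricing theory, so the values here are purely illustrative:

```python
# Shrink a rolling-window OLS beta toward a firm-specific prior.
# Precision weighting: the noisier the rolling estimate, the harder
# it is pulled toward the prior.
def shrink_beta(beta_ols, se_ols, beta_prior, se_prior):
    w = (1 / se_ols**2) / (1 / se_ols**2 + 1 / se_prior**2)
    return w * beta_ols + (1 - w) * beta_prior

# Illustrative numbers: a noisy rolling estimate of 1.45 shrunk toward
# a fundamentals-implied prior of 1.10.
beta = shrink_beta(beta_ols=1.45, se_ols=0.30, beta_prior=1.10, se_prior=0.15)
print(round(beta, 3))  # -> 1.17
```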

  4. Wavelet-Based Quantum Field Theory

    Directory of Open Access Journals (Sweden)

    Mikhail V. Altaisky

    2007-11-01

    Full Text Available The Euclidean quantum field theory for the fields $\phi_{\Delta x}(x)$, which depend on both the position $x$ and the resolution $\Delta x$, constructed in SIGMA 2 (2006), 046, on the basis of the continuous wavelet transform, is considered. The Feynman diagrams in such a theory become finite under the assumption that there are no scales in internal lines smaller than the minimal scale of the external lines. This regularisation agrees with the existing calculations of radiative corrections to the electron magnetic moment. The transition from the newly constructed theory to a standard Euclidean field theory is achieved by integration over the scale arguments.

  5. Using Information Theory to Assess the Communicative Capacity of Circulating MicroRNA

    OpenAIRE

    Finn, Nnenna A.; Searles, Charles D.

    2013-01-01

    The discovery of extracellular microRNAs (miRNAs) and their transport modalities (i.e. microparticles, exosomes, proteins and lipoproteins) has sparked theories regarding their role in intercellular communication. Here, we assessed the information transfer capacity of different miRNA transport modalities in human serum by utilizing basic principles of information theory. Zipf Statistics were calculated for each of the miRNA transport modalities identified in human serum. Our analyses revealed...

  6. Information theory, animal communication, and the search for extraterrestrial intelligence

    Science.gov (United States)

    Doyle, Laurance R.; McCowan, Brenda; Johnston, Simon; Hanser, Sean F.

    2011-02-01

    We present ongoing research in the application of information theory to animal communication systems with the goal of developing additional detectors and estimators for possible extraterrestrial intelligent signals. Regardless of the species, for intelligence (i.e., complex knowledge) to be transmitted certain rules of information theory must still be obeyed. We demonstrate some preliminary results of applying information theory to socially complex marine mammal species (bottlenose dolphins and humpback whales) as well as arboreal squirrel monkeys, because they almost exclusively rely on vocal signals for their communications, producing signals which can be readily characterized by signal analysis. Metrics such as Zipf's Law and higher-order information-entropic structure are emerging as indicators of the communicative complexity characteristic of an "intelligent message" content within these animals' signals, perhaps not surprising given these species' social complexity. In addition to human languages, for comparison we also apply these metrics to pulsar signals—perhaps (arguably) the most "organized" of stellar systems—as an example of astrophysical systems that would have to be distinguished from an extraterrestrial intelligence message by such information theoretic filters. We also look at a message transmitted from Earth (Arecibo Observatory) that contains a lot of meaning but little information in the mathematical sense we define it here. We conclude that the study of non-human communication systems on our own planet can make a valuable contribution to the detection of extraterrestrial intelligence by providing quantitative general measures of communicative complexity. Studying the complex communication systems of other intelligent species on our own planet may also be one of the best ways to deprovincialize our thinking about extraterrestrial communication systems in general.
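
    The Zipf's-law and entropy metrics described above can be illustrated on a toy symbol sequence. This is a generic sketch of the statistics, not the authors' analysis pipeline:

```python
import math
from collections import Counter

def zipf_slope(sequence):
    """Least-squares slope of log(frequency) vs. log(rank).

    Human-language-like signals tend to give slopes near -1; a flat or
    steeply falling slope suggests a different repertoire structure.
    """
    freqs = sorted(Counter(sequence).values(), reverse=True)
    xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def shannon_entropy(sequence):
    """First-order Shannon entropy in bits per symbol."""
    counts = Counter(sequence)
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

# Toy "vocalization" sequence whose symbol frequencies roughly follow 1/rank.
seq = "a" * 48 + "b" * 24 + "c" * 16 + "d" * 12 + "e" * 10
print(round(zipf_slope(seq), 2), round(shannon_entropy(seq), 2))
```

    Higher-order entropies (over symbol pairs, triples, and so on) extend the same idea and are what the abstract means by "higher-order information-entropic structure".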

  7. A game theory-based trust measurement model for social networks.

    Science.gov (United States)

    Wang, Yingjie; Cai, Zhipeng; Yin, Guisheng; Gao, Yang; Tong, Xiangrong; Han, Qilong

    2016-01-01

    In social networks, trust is a complex social relationship. Participants in online social networks want to share information and experiences with as many reliable users as possible. However, the modeling of trust is complicated and application dependent: it needs to consider interaction history, recommendations, user behaviors and so on. Therefore, modeling trust is an important focus for online social networks. We propose a game theory-based trust measurement model for social networks. The trust degree is calculated from three aspects (service reliability, feedback effectiveness, and recommendation credibility) to obtain a more accurate result. In addition, to alleviate the free-riding problem, we propose a game theory-based punishment mechanism for specific trust and global trust, respectively. We prove that the proposed trust measurement model is effective and that the free-riding problem can be resolved effectively by adding the proposed punishment mechanism.
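
    A hypothetical sketch of combining the three trust aspects named above. The weights and the punishment rule are illustrative assumptions, not the model from the paper:

```python
# Weighted aggregation of the three trust aspects; weights are assumed.
def trust_degree(service_reliability, feedback_effectiveness,
                 recommendation_credibility, weights=(0.5, 0.3, 0.2)):
    aspects = (service_reliability, feedback_effectiveness,
               recommendation_credibility)
    return sum(w * a for w, a in zip(weights, aspects))

def punish_free_rider(trust, contributed):
    # Free riders consume services without contributing; a multiplicative
    # penalty (assumed factor) makes that strategy unprofitable over time.
    return trust if contributed else trust * 0.5

t = trust_degree(0.9, 0.8, 0.6)
print(round(punish_free_rider(t, contributed=False), 3))  # -> 0.405
```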

  8. Friction Theory Prediction of Crude Oil Viscosity at Reservoir Conditions Based on Dead Oil Properties

    DEFF Research Database (Denmark)

    Cisneros, Sergio; Zeberg-Mikkelsen, Claus Kjær; Stenby, Erling Halfdan

    2003-01-01

    The general one-parameter friction theory (f-theory) models have been further extended to the prediction of the viscosity of real "live" reservoir fluids based on viscosity measurements of the "dead" oil and the compositional information of the live fluid. This work representation of the viscosity...... of real fluids is obtained by a simple one-parameter tuning of a linear equation derived from a general one-parameter f-theory model. Further, this is achieved using simple cubic equations of state (EOS), such as the Peng-Robinson (PR) EOS or the Soave-Redlich-Kwong (SRK) EOS, which are commonly used...... within the oil industry. For the sake of completeness, this work also presents a simple characterization procedure which is based on compositional information of an oil sample. This procedure provides a method for characterizing an oil into a number of compound groups along with the critical constants...

  9. Theory of mind selectively predicts preschoolers' knowledge-based selective word learning.

    Science.gov (United States)

    Brosseau-Liard, Patricia; Penney, Danielle; Poulin-Dubois, Diane

    2015-11-01

    Children can selectively attend to various attributes of a model, such as past accuracy or physical strength, to guide their social learning. There is a debate regarding whether a relation exists between theory-of-mind skills and selective learning. We hypothesized that high performance on theory-of-mind tasks would predict preference for learning new words from accurate informants (an epistemic attribute), but not from physically strong informants (a non-epistemic attribute). Three- and 4-year-olds (N = 65) completed two selective learning tasks, and their theory-of-mind abilities were assessed. As expected, performance on a theory-of-mind battery predicted children's preference to learn from more accurate informants but not from physically stronger informants. Results thus suggest that preschoolers with more advanced theory of mind have a better understanding of knowledge and apply that understanding to guide their selection of informants. This work has important implications for research on children's developing social cognition and early learning. © 2015 The British Psychological Society.

  11. Information Processing Theories and the Education of the Gifted.

    Science.gov (United States)

    Rawl, Ruth K.; O'Tuel, Frances S.

    1983-01-01

    The basic assumptions of information processing theories in cognitive psychology are reviewed, and the application of this approach to problem solving in gifted education is considered. Specific implications are cited on problem selection and instruction giving. (CL)

  12. Application of information and complexity theories to public opinion polls. The case of Greece (2004-2007)

    OpenAIRE

    Panos, C. P.; Chatzisavvas, K. Ch.

    2007-01-01

    A general methodology to study public opinion inspired from information and complexity theories is outlined. It is based on probabilistic data extracted from opinion polls. It gives a quantitative information-theoretic explanation of high job approval of Greek Prime Minister Mr. Constantinos Karamanlis (2004-2007), while the same time series of polls conducted by the company Metron Analysis showed that his party New Democracy (abbr. ND) was slightly higher than the opposition party of PASOK -...
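
    The basic information-theoretic quantity behind such an analysis is the Shannon entropy of a poll's answer distribution; the poll figures below are hypothetical, not the Metron Analysis data:

```python
import math

def shannon_entropy(probs):
    """Entropy (bits) of a poll's answer distribution; higher entropy
    means a more divided, less predictable public opinion."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical approval poll: approve / disapprove / undecided.
early = [0.60, 0.25, 0.15]  # strong approval, lower entropy
late = [0.40, 0.40, 0.20]   # divided opinion, higher entropy
print(round(shannon_entropy(early), 3), round(shannon_entropy(late), 3))
```

    Tracking this entropy over a time series of polls quantifies how concentrated or dispersed opinion is at each point, which is the kind of probabilistic summary the methodology builds on.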

  13. The Scope of Usage-based Theory

    OpenAIRE

    Paul eIbbotson

    2013-01-01

    Usage-based approaches typically draw on a relatively small set of cognitive processes, such as categorization, analogy, and chunking to explain language structure and function. The goal of this paper is to first review the extent to which the “cognitive commitment” of usage-based theory has had success in explaining empirical findings across domains, including language acquisition, processing, and typology. We then look at the overall strengths and weaknesses of usage-based theory and highli...

  14. Study on methods and techniques of aeroradiometric weak information extraction for sandstone-hosted uranium deposits based on GIS

    International Nuclear Information System (INIS)

    Han Shaoyang; Ke Dan; Hou Huiqun

    2005-01-01

    The weak information extraction is one of the important research contents in current sandstone-type uranium prospecting in China. This paper introduces the connotation of aeroradiometric weak information extraction, discusses the formation theories of aeroradiometric weak information, and establishes some effective mathematical models for weak information extraction. The models are implemented on a GIS software platform, and application tests of weak information extraction are completed in known uranium mineralized areas. Research results prove that the prospective areas of sandstone-type uranium deposits can be rapidly delineated by extracting aeroradiometric weak information. (authors)

  15. Application of Bayesian Decision Theory Based on Prior Information in the Multi-Objective Optimization Problem

    Directory of Open Access Journals (Sweden)

    Xia Lei

    2010-12-01

    Full Text Available It is hard for general multi-objective optimization methods to obtain prior information, so how to utilize prior information has been a challenge. This paper analyzes the characteristics of Bayesian decision-making based on the maximum entropy principle and prior information, especially how to effectively improve decision-making reliability when reference samples are deficient. The paper exhibits the effectiveness of the proposed method in a real application, multi-frequency offset estimation in a distributed multiple-input multiple-output system. The simulation results demonstrate that Bayesian decision-making based on prior information has better global searching capability when sampling data are deficient.
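
    The core mechanism can be sketched as follows: with no reference samples, the maximum entropy principle gives a uniform prior over the candidate hypotheses, which Bayes' rule then updates. The hypotheses and likelihoods below are illustrative assumptions, not the paper's estimator:

```python
# Bayesian decision with a maximum-entropy (uniform) prior.
def posterior(priors, likelihoods):
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

# Hypothetical coarse hypotheses about a frequency offset.
hypotheses = ["offset_low", "offset_mid", "offset_high"]
prior = [1 / 3] * 3                # maximum-entropy prior: no samples yet
likelihood = [0.10, 0.70, 0.20]    # assumed P(observed data | hypothesis)

post = posterior(prior, likelihood)
decision = hypotheses[max(range(3), key=lambda i: post[i])]
print(decision, [round(p, 2) for p in post])
```

    As reference samples accumulate, the uniform prior would be replaced by the previous posterior, which is how prior information sharpens later decisions.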

  16. Internet-Based Health Information Consumer Skills Intervention for People Living with HIV/AIDS

    Science.gov (United States)

    Kalichman, Seth C.; Cherry, Charsey; Cain, Demetria; Pope, Howard; Kalichman, Moira; Eaton, Lisa; Weinhardt, Lance; Benotsch, Eric G.

    2006-01-01

    Medical information can improve health, and there is an enormous amount of health information available on the Internet. A randomized clinical trial tested the effectiveness of an intervention based on social-cognitive theory to improve information use among people living with HIV/AIDS. Men and women (N = 448) were placed in either (a) an…

  17. An approach to higher dimensional theories based on lattice gauge theory

    International Nuclear Information System (INIS)

    Murata, M.; So, H.

    2004-01-01

    A higher dimensional lattice space can be decomposed into a number of four-dimensional lattices called layers. The higher dimensional gauge theory on the lattice can be interpreted as four-dimensional gauge theories on the multi-layer system, with interactions between neighboring layers. We propose a new possibility for realizing the continuum limit of a five-dimensional theory based on the properties of the phase diagram.

  18. Concept theory

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2009-01-01

      Concept theory is an extremely broad, interdisciplinary and complex field of research related to many deep fields with very long historical traditions without much consensus. However, information science and knowledge organization cannot avoid relating to theories of concepts. Knowledge...... organizing systems (e.g. classification systems, thesauri and ontologies) should be understood as systems basically organizing concepts and their semantic relations. The same is the case with information retrieval systems. Different theories of concepts have different implications for how to construe......, evaluate and use such systems. Based on "a post-Kuhnian view" of paradigms this paper puts forward arguments that the best understanding and classification of theories of concepts is to view and classify them in accordance with epistemological theories (empiricism, rationalism, historicism and pragmatism...

  19. Theory of Neural Information Processing Systems

    International Nuclear Information System (INIS)

    Galla, Tobias

    2006-01-01

    It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10^11 neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kuehn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, still the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks.
Many dynamical aspects of neural networks are usually hard to find in the

  20. Integration of Information Literacy into the Curriculum: Constructive Alignment from Theory into Practice

    Directory of Open Access Journals (Sweden)

    Claes Dahlqvist

    2016-12-01

    Full Text Available Librarian-teacher cooperation is essential for the integration of information literacy into course syllabi. Therefore, a common theoretical and methodological platform is needed. As librarians at Kristianstad University we have had the opportunity to develop such a platform when teaching information literacy in a basic course for teachers in higher education pedagogy. Information literacy is taught in context with academic writing, distance learning and teaching, and development of course syllabi. Constructive Alignment in Theory: We used constructive alignment in designing our part of the course. John Biggs’ ideas tell us that assessment tasks (ATs) should be aligned to what is intended to be learned. Intended learning outcomes (ILOs) specify teaching/learning activities (TLAs) based on the content of learning. TLAs should be designed in ways that enable students to construct knowledge from their own experience. The ILOs for the course are to have arguments for the role of information literacy in higher education and ideas of implementing them in TLAs. The content of learning is for example the concept of information literacy, theoretical perspectives and constructive alignment for integration in course syllabi. TLAs are written pre-lecture reflections on the concept of information literacy, used as a starting point for the three-hour seminar. Learning reflections are written afterwards. The AT is to revise a syllabus (preferably using constructive alignment) for a course the teacher is responsible for, where information literacy must be integrated with the other parts and topics of the course. Constructive Alignment in Practice: Using constructive alignment has taught us that this model serves well as the foundation of the theoretical and methodological platform for librarian-teacher cooperation when integrating information literacy in course syllabi. It contains all important aspects of the integration of information literacy in course

  1. Theory of information warfare: basic framework, methodology and conceptual apparatus

    Directory of Open Access Journals (Sweden)

    Олександр Васильович Курбан

    2015-11-01

    Full Text Available A comprehensive theoretical study is conducted to determine the basic provisions of the modern theory of information warfare in on-line social networks. Three basic blocks that systematize the theoretical and methodological basis of the topic are established: information-psychological warfare, off-line social networks, and on-line social networks. According to these three blocks, theoretical concepts are defined and a methodological substantiation of information processes within information warfare in on-line social networks is formed.

  2. How to Develop a Multi-Grounded Theory: the evolution of a business process theory

    Directory of Open Access Journals (Sweden)

    Mikael Lind

    2006-05-01

    Full Text Available In the information systems field there is a great need for different theories. Theory development can be performed in different ways – deductively and/or inductively. Different approaches for theory development exist, each with its pros and cons. A combined approach, which builds on inductive as well as deductive thinking, has been put forward – a Multi-Grounded Theory approach. In this paper the evolution of a business process theory is regarded as the development of a multi-grounded theory. This evolution is based on empirical studies, theory-informed conceptual development and the creation of conceptual cohesion. The theoretical development has involved a dialectic approach aiming at a theoretical synthesis based on antagonistic theories. The result of this research process was a multi-grounded business process theory. Multi-grounded means that the theory is empirically, internally and theoretically founded. This business process theory can be used as an aid for business modellers to direct attention towards relevant aspects when business process determination is performed.

  3. Final Summary: Genre Theory in Information Studies

    DEFF Research Database (Denmark)

    Andersen, Jack

    2015-01-01

    Purpose This chapter offers a re-description of knowledge organization in light of genre and activity theory. Knowledge organization needs a new description in order to account for those activities and practices constituting and causing concrete knowledge organization activity. Genre and activity...... informing and shaping concrete forms of knowledge organization activity. With this, we are able to understand how knowledge organization activity also contributes to constructing genre and activity systems, and not only aids them....

  4. Year 7 Students, Information Literacy, and Transfer: A Grounded Theory

    Science.gov (United States)

    Herring, James E.

    2011-01-01

    This study examined the views of year 7 students, teacher librarians, and teachers in three state secondary schools in rural New South Wales, Australia, on information literacy and transfer. The aims of the study included the development of a grounded theory in relation to information literacy and transfer in these schools. The study's perspective…

  5. Evaluation of EMG processing techniques using Information Theory

    Directory of Open Access Journals (Sweden)

    Felice Carmelo J

    2010-11-01

    Full Text Available Abstract Background Electromyographic signals can be used in the biomedical engineering and rehabilitation fields as potential sources of control for prosthetics and orthotics. In such applications, digital processing techniques are necessary to follow efficiently and effectively the changes in the physiological characteristics produced by a muscular contraction. In this paper, two methods based on information theory are proposed to evaluate the processing techniques. Methods These methods determine the amount of information that a processing technique is able to extract from EMG signals. The processing techniques evaluated with these methods were: absolute mean value (AMV), RMS value, variance value (VAR) and difference absolute mean value (DAMV). EMG signals from the middle deltoid during abduction and adduction movements of the arm in the scapular plane were registered, for static and dynamic contractions. The optimal window length (segmentation), abduction and adduction movements, and inter-electrode distance were also analyzed. Results Using the optimal segmentation (200 ms and 300 ms in static and dynamic contractions, respectively), the best processing techniques were: RMS, AMV and VAR in static contractions, and only the RMS in dynamic contractions. Using the RMS of the EMG signal, variations in the amount of information between the abduction and adduction movements were observed. Conclusions Although the evaluation methods proposed here were applied to standard processing techniques, these methods can also be considered as alternative tools to evaluate new processing techniques in different areas of electrophysiology.
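
    The three best-performing features above are simple window statistics, which can be sketched directly; the signal below is a synthetic stand-in for real EMG data:

```python
import math

def window_features(window):
    """AMV, RMS and VAR of one EMG segment, as compared in the study."""
    n = len(window)
    amv = sum(abs(x) for x in window) / n
    rms = math.sqrt(sum(x * x for x in window) / n)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n
    return amv, rms, var

def segment(signal, window_len):
    """Non-overlapping segmentation; the study found ~200-300 ms optimal."""
    return [signal[i:i + window_len]
            for i in range(0, len(signal) - window_len + 1, window_len)]

# Toy EMG-like burst: a zero-mean oscillation whose amplitude grows,
# mimicking increasing muscle activation (illustrative, not real data).
signal = [0.1 * (i // 50 + 1) * math.sin(0.9 * i) for i in range(200)]
for w in segment(signal, 50):
    amv, rms, var = window_features(w)
    print(f"AMV={amv:.3f}  RMS={rms:.3f}  VAR={var:.4f}")
```

    All three features track the growing activation; the study's contribution is to rank such features by how much information (in the Shannon sense) they preserve about the underlying contraction.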

  6. Observational information for f(T) theories and dark torsion

    Energy Technology Data Exchange (ETDEWEB)

    Bengochea, Gabriel R., E-mail: gabriel@iafe.uba.a [Instituto de Astronomia y Fisica del Espacio (IAFE), CC 67, Suc. 28, 1428 Buenos Aires (Argentina)

    2011-01-17

    In the present work we analyze and compare the information coming from different observational data sets in the context of a sort of f(T) theories. We perform a joint analysis with measurements of the most recent type Ia supernovae (SNe Ia), Baryon Acoustic Oscillation (BAO), Cosmic Microwave Background radiation (CMB), Gamma-Ray Bursts data (GRBs) and Hubble parameter observations (OHD) to constraint the only new parameter these theories have. It is shown that when the new combined BAO/CMB parameter is used to put constraints, the result is different from previous works. We also show that when we include Observational Hubble Data (OHD) the simpler {Lambda}CDM model is excluded to one sigma level, leading the effective equation of state of these theories to be of phantom type. Also, analyzing a tension criterion for SNe Ia and other observational sets, we obtain more consistent and better suited data sets to work with these theories.

  7. Uncertainty Quantification of Composite Laminate Damage with the Generalized Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    J. Lucero; F. Hemez; T. Ross; K.Kline; J.Hundhausen; T. Tippetts

    2006-05-01

    This work presents a survey of five theories to assess the uncertainty of projectile impact induced damage on multi-layered carbon-epoxy composite plates. Because the types of uncertainty dealt with in this application are multiple (variability, ambiguity, and conflict) and because the data sets collected are sparse, characterizing the amount of delamination damage with probability theory alone is possible but incomplete. This motivates the exploration of methods contained within a broad Generalized Information Theory (GIT) that rely on less restrictive assumptions than probability theory. Probability, fuzzy sets, possibility, and imprecise probability (probability boxes (p-boxes) and Dempster-Shafer) are used to assess the uncertainty in composite plate damage. Furthermore, this work highlights the usefulness of each theory. The purpose of the study is not to compare directly the different GIT methods but to show that they can be deployed on a practical application and to compare the assumptions upon which these theories are based. The data sets consist of experimental measurements and finite element predictions of the amount of delamination and fiber splitting damage as multilayered composite plates are impacted by a projectile at various velocities. The physical experiments consist of using a gas gun to impact suspended plates with a projectile accelerated to prescribed velocities, then, taking ultrasound images of the resulting delamination. The nonlinear, multiple length-scale numerical simulations couple local crack propagation implemented through cohesive zone modeling to global stress-displacement finite element analysis. The assessment of damage uncertainty is performed in three steps by, first, considering the test data only; then, considering the simulation data only; finally, performing an assessment of total uncertainty where test and simulation data sets are combined. This study leads to practical recommendations for reducing the uncertainty and
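
    One of the surveyed GIT methods, Dempster-Shafer evidence combination, can be sketched on a small frame of discernment. The mass assignments below are illustrative, not the report's data:

```python
from itertools import product

# Minimal Dempster-Shafer combination: focal sets are frozensets over the
# frame of discernment, masses sum to 1 per source.
def dempster_combine(m1, m2):
    combined = {}
    conflict = 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2  # mass assigned to contradictory evidence
    # Renormalize by the non-conflicting mass (Dempster's rule).
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

LOW, HIGH = frozenset({"low"}), frozenset({"high"})
EITHER = LOW | HIGH  # ignorance: mass committed to the whole frame

experiment = {LOW: 0.6, EITHER: 0.4}               # ultrasound evidence
simulation = {LOW: 0.5, HIGH: 0.3, EITHER: 0.2}    # finite element evidence

fused = dempster_combine(experiment, simulation)
print({tuple(sorted(s)): round(w, 3) for s, w in fused.items()})
```

    Allowing mass on the whole frame (`EITHER`) is what distinguishes this from a probability assignment: it represents ignorance explicitly instead of splitting it between alternatives.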

  8. Quantum entanglement in non-local games, graph parameters and zero-error information theory

    NARCIS (Netherlands)

    Scarpa, G.

    2013-01-01

    We study quantum entanglement and some of its applications in graph theory and zero-error information theory. In Chapter 1 we introduce entanglement and other fundamental concepts of quantum theory. In Chapter 2 we address the question of how much quantum correlations generated by entanglement can

  9. An information theory based approach for quantitative evaluation of man-machine interface complexity

    International Nuclear Information System (INIS)

    Kang, Hyun Gook

    1999-02-01

    In complex and high-risk work conditions, especially in nuclear power plants, human understanding of the plant is highly cognitive and thus largely dependent on the effectiveness of the man-machine interface system. In order to provide more effective and reliable operating conditions for future nuclear power plants, developing more credible and easy-to-use evaluation methods will be of great help in designing interface systems more efficiently. In this study, in order to analyze human-machine interactions, I propose the Human-processor Communication (HPC) model, which is based on the information-flow concept. It identifies the information flow around a human-processor. Information flow has two aspects: appearance and content. Based on the HPC model, I propose two kinds of measures for evaluating a user interface from the viewpoint of these two aspects of information flow. They measure the communicative complexity of each aspect. For the evaluation of the appearance aspect, I propose three complexity measures: Operation Complexity, Transition Complexity, and Screen Complexity. Each of these measures has its own physical meaning. Two experiments carried out in this work support the utility of these measures. The result of the quiz-game experiment shows that as the complexity of the task context increases, the usage of the interface system becomes more complex. The experimental results of the three example systems (digital view, LDP-style view, and hierarchy view) show the utility of the proposed complexity measures. For the evaluation of the content aspect, I propose the degree of informational coincidence, R(K, P), as a measure of the usefulness of an alarm-processing system. It is designed to perform user-oriented evaluation based on the informational-entropy concept. It will be especially useful in the early design phase because designers can estimate the usefulness of an alarm system by short calculations instead
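
    The abstract does not give the formulas behind these complexity measures, but the entropy concept they build on can be illustrated with a minimal sketch: treating an operator's interface actions as symbols, Shannon entropy quantifies how unpredictable, and hence how communicatively complex, interface usage is. The function is standard; the action logs and view names below are invented for illustration and are not from Kang's thesis.

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy (bits) of an observed symbol sequence."""
    counts = Counter(symbols)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical logs of operator actions on two interface designs:
# a flat display needing few distinct actions, and a deeper menu hierarchy.
flat_view = ["open", "ack", "open", "ack", "open", "ack"]
deep_view = ["menu", "submenu", "open", "back", "menu", "ack"]

print(shannon_entropy(flat_view))  # 1.0 bit: two equiprobable actions
print(shannon_entropy(deep_view))  # higher: more varied, less predictable usage
```

A measure of this kind rises as task context forces more varied interface operations, matching the quiz-game result reported above.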

  10. An information theory based approach for quantitative evaluation of man-machine interface complexity

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Hyun Gook

    1999-02-15

    In complex and high-risk work conditions, especially in nuclear power plants, human understanding of the plant is highly cognitive and thus largely dependent on the effectiveness of the man-machine interface system. In order to provide more effective and reliable operating conditions for future nuclear power plants, developing more credible and easy-to-use evaluation methods will be of great help in designing interface systems more efficiently. In this study, in order to analyze human-machine interactions, I propose the Human-processor Communication (HPC) model, which is based on the information-flow concept. It identifies the information flow around a human-processor. Information flow has two aspects: appearance and content. Based on the HPC model, I propose two kinds of measures for evaluating a user interface from the viewpoint of these two aspects of information flow. They measure the communicative complexity of each aspect. For the evaluation of the appearance aspect, I propose three complexity measures: Operation Complexity, Transition Complexity, and Screen Complexity. Each of these measures has its own physical meaning. Two experiments carried out in this work support the utility of these measures. The result of the quiz-game experiment shows that as the complexity of the task context increases, the usage of the interface system becomes more complex. The experimental results of the three example systems (digital view, LDP-style view, and hierarchy view) show the utility of the proposed complexity measures. For the evaluation of the content aspect, I propose the degree of informational coincidence, R(K, P), as a measure of the usefulness of an alarm-processing system. It is designed to perform user-oriented evaluation based on the informational-entropy concept. It will be especially useful in the early design phase because designers can estimate the usefulness of an alarm system by short calculations instead

  11. Theory of reasoned action and theory of planned behavior-based dietary interventions in adolescents and young adults: a systematic review

    Directory of Open Access Journals (Sweden)

    Hackman CL

    2014-06-01

    Full Text Available Christine L Hackman, Adam P Knowlden, Department of Health Science, The University of Alabama, Tuscaloosa, AL, USA. Background: Childhood obesity has reached epidemic proportions in many nations around the world. The theory of planned behavior (TPB) and the theory of reasoned action (TRA) have been used to successfully plan and evaluate numerous interventions for many different behaviors. The aim of this study was to systematically review and synthesize TPB- and TRA-based dietary behavior interventions targeting adolescents and young adults. Methods: The following databases were systematically searched to find articles for this review: Academic Search Premier; Cumulative Index to Nursing and Allied Health (CINAHL); Education Resources Information Center (ERIC); Health Source: Nursing/Academic Edition; Cochrane Central Register of Controlled Trials (CENTRAL); and MEDLINE. Inclusion criteria for articles were: (1) primary or secondary interventions, (2) with any quantitative design, (3) published in the English language, (4) between January 2003 and March 2014, (5) that targeted adolescents or young adults, (6) which included dietary change behavior as the outcome, and (7) utilized TPB or TRA. Results: Of the eleven intervention studies evaluated, nine resulted in dietary behavior change that was attributed to the treatment. Additionally, all but one study found there to be a change in at least one construct of TRA or TPB, while one study did not measure constructs. All of the studies utilized some type of quantitative design, with two employing quasi-experimental designs and eight employing randomized controlled trial designs. Among the studies, four utilized technology, including emails, social media posts, information on school websites, web-based activities, audio messages in classrooms, interactive DVDs, and health-related websites. Two studies incorporated goal setting and four employed persuasive communication. Conclusion: Interventions directed toward changing dietary behaviors

  12. Creativity, information, and consciousness: The information dynamics of thinking.

    Science.gov (United States)

    Wiggins, Geraint A

    2018-05-07

    This paper presents a theory of the basic operation of mind, Information Dynamics of Thinking, which is intended for computational implementation and thence empirical testing. It is based on the information theory of Shannon, and treats the mind/brain as an information processing organ that aims to be information-efficient, in that it predicts its world, so as to use information efficiently, and regularly re-represents it, so as to store information efficiently. The theory is presented in context of a background review of various research areas that impinge upon its development. Consequences of the theory and testable hypotheses arising from it are discussed. Copyright © 2018. Published by Elsevier B.V.
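
    The paper's notion of an information-efficient, predictive mind can be caricatured in a few lines: under a simple learned model, well-predicted symbols carry little Shannon surprisal, while unexpected ones carry a lot. This is only an illustrative sketch of the underlying information-theoretic idea, not an implementation of Wiggins's Information Dynamics of Thinking; the bigram model and example sequence are assumptions.

```python
import math
from collections import Counter, defaultdict

def surprisal_profile(sequence):
    """Per-symbol surprisal -log2 p(next | current) under a bigram model
    trained on the same sequence: a toy stand-in for a predictive observer
    that spends little information on what it already expects."""
    bigrams = defaultdict(Counter)
    for a, b in zip(sequence, sequence[1:]):
        bigrams[a][b] += 1
    profile = []
    for a, b in zip(sequence, sequence[1:]):
        total = sum(bigrams[a].values())
        profile.append(-math.log2(bigrams[a][b] / total))
    return profile

seq = list("abababac")
print(surprisal_profile(seq))  # the rare final 'c' carries the most information
```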

  13. Cognition to Collaboration: User-Centric Approach and Information Behaviour Theories/Models

    Directory of Open Access Journals (Sweden)

    Alperen M Aydin

    2016-12-01

    Full Text Available Aim/Purpose: The objective of this paper is to review the vast literature of user-centric information science and report on emerging themes in information behaviour science. Background: The paradigmatic shift from a system-centric to a user-centric approach facilitates research on cognitive and individual information processing. Various information behaviour theories/models have emerged. Methodology: Recent information behaviour theories and models are presented. Features, strengths, and weaknesses of the models are discussed through an analysis of the information behaviour literature. Contribution: This paper sheds light on the weaknesses in earlier information behaviour models and stresses (and advocates) the need for research on social information behaviour. Findings: Prominent information behaviour models deal with individual information behaviour. People live in a social world and sort out most of their daily or work problems in groups. However, only seven papers discuss social information behaviour (Scopus search). Recommendations for Practitioners: ICT tools used for inter-organisational sharing should be redesigned for effective information-sharing during disaster/emergency times. Recommendation for Researchers: There are scarce sources on the social side of information behaviour; however, most work tasks are carried out in groups/teams. Impact on Society: In dynamic work contexts like disaster management and health-care settings, collaborative information-sharing may result in decreasing losses. Future Research: Fieldwork will be conducted in a disaster management context investigating inter-organisational information-sharing.

  14. Workplace-based assessment: raters' performance theories and constructs.

    Science.gov (United States)

    Govaerts, M J B; Van de Wiel, M W J; Schuwirth, L W T; Van der Vleuten, C P M; Muijtjens, A M M

    2013-08-01

    Weaknesses in the nature of rater judgments are generally considered to compromise the utility of workplace-based assessment (WBA). In order to gain insight into the underpinnings of rater behaviours, we investigated how raters form impressions of and make judgments on trainee performance. Using theoretical frameworks of social cognition and person perception, we explored raters' implicit performance theories, use of task-specific performance schemas and the formation of person schemas during WBA. We used think-aloud procedures and verbal protocol analysis to investigate schema-based processing by experienced (N = 18) and inexperienced (N = 16) raters (supervisor-raters in general practice residency training). Qualitative data analysis was used to explore schema content and usage. We quantitatively assessed rater idiosyncrasy in the use of performance schemas and we investigated effects of rater expertise on the use of (task-specific) performance schemas. Raters used different schemas in judging trainee performance. We developed a normative performance theory comprising seventeen inter-related performance dimensions. Levels of rater idiosyncrasy were substantial and unrelated to rater expertise. Experienced raters made significantly more use of task-specific performance schemas compared to inexperienced raters, suggesting more differentiated performance schemas in experienced raters. Most raters started to develop person schemas the moment they began to observe trainee performance. The findings further our understanding of processes underpinning judgment and decision making in WBA. Raters make and justify judgments based on personal theories and performance constructs. Raters' information processing seems to be affected by differences in rater expertise. The results of this study can help to improve rater training, the design of assessment instruments and decision making in WBA.

  15. Value of information-based inspection planning for offshore structures

    DEFF Research Database (Denmark)

    Irman, Arifian Agusta; Thöns, Sebastian; Leira, Bernt J.

    2017-01-01

    Asset integrity and management is an important part of the oil and gas industry, especially for existing offshore structures. With declining oil prices, the production rate is an important factor to be maintained, which makes the integrity of the structures one of the main concerns. A simplified and generic risk-based inspection planning utilizing pre-posterior Bayesian decision analysis has been proposed by Faber et al. [1] and Straub [2]. This paper provides considerations on the theoretical background and a Value of Information analysis for reliability-based inspection planning. The paper starts out with a review of the state-of-the-art RBI planning procedure based on Bayesian decision theory and its application in offshore structure integrity management. An example of the Value of Information approach is illustrated, and directions for further research are pointed out.

  16. Finding Commonalities: Social Information Processing and Domain Theory in the Study of Aggression

    Science.gov (United States)

    Nucci, Larry

    2004-01-01

    The Arsenio and Lemerise (this issue) proposal integrating social information processing (SIP) and domain theory to study children's aggression is evaluated from a domain theory perspective. Basic tenets of domain theory rendering it compatible with SIP are discussed as well as points of divergence. Focus is directed to the proposition that…

  17. Information richness in construction projects: A critical social theory

    NARCIS (Netherlands)

    Adriaanse, Adriaan Maria; Voordijk, Johannes T.; Greenwood, David

    2002-01-01

    Two important factors influencing the communication in construction projects are the interests of the people involved and the language spoken by the people involved. The objective of the paper is to analyse these factors by using recent insights in the information richness theory. The critical

  18. Quantum information theory. Mathematical foundation. 2. ed.

    Energy Technology Data Exchange (ETDEWEB)

    Hayashi, Masahito [Nagoya Univ. (Japan). Graduate School of Mathematics

    2017-07-01

    This graduate textbook provides a unified view of quantum information theory. Clearly explaining the necessary mathematical basis, it merges key topics from both information-theoretic and quantum-mechanical viewpoints and provides lucid explanations of the basic results. Thanks to this unified approach, it makes accessible such advanced topics in quantum communication as quantum teleportation, superdense coding, quantum state transmission (quantum error-correction) and quantum encryption. Since the publication of the preceding book Quantum Information: An Introduction, there have been tremendous strides in the field of quantum information. In particular, the following topics - all of which are addressed here - have seen major advances: quantum state discrimination, quantum channel capacity, bipartite and multipartite entanglement, security analysis of quantum communication, the reverse Shannon theorem and the uncertainty relation. With regard to the analysis of quantum security, the present book employs an improved method for the evaluation of leaked information and identifies a remarkable relation between quantum security and quantum coherence. Taken together, these two improvements allow a better analysis of quantum state transmission. In addition, various types of newly discovered uncertainty relations are explained. Presenting a wealth of new developments, the book introduces readers to the latest advances and challenges in quantum information. To aid in understanding, each chapter is accompanied by a set of exercises and solutions.

  19. Quantum information theory. Mathematical foundation. 2. ed.

    International Nuclear Information System (INIS)

    Hayashi, Masahito

    2017-01-01

    This graduate textbook provides a unified view of quantum information theory. Clearly explaining the necessary mathematical basis, it merges key topics from both information-theoretic and quantum-mechanical viewpoints and provides lucid explanations of the basic results. Thanks to this unified approach, it makes accessible such advanced topics in quantum communication as quantum teleportation, superdense coding, quantum state transmission (quantum error-correction) and quantum encryption. Since the publication of the preceding book Quantum Information: An Introduction, there have been tremendous strides in the field of quantum information. In particular, the following topics - all of which are addressed here - have seen major advances: quantum state discrimination, quantum channel capacity, bipartite and multipartite entanglement, security analysis of quantum communication, the reverse Shannon theorem and the uncertainty relation. With regard to the analysis of quantum security, the present book employs an improved method for the evaluation of leaked information and identifies a remarkable relation between quantum security and quantum coherence. Taken together, these two improvements allow a better analysis of quantum state transmission. In addition, various types of newly discovered uncertainty relations are explained. Presenting a wealth of new developments, the book introduces readers to the latest advances and challenges in quantum information. To aid in understanding, each chapter is accompanied by a set of exercises and solutions.

  20. Defining information need in health - assimilating complex theories derived from information science.

    Science.gov (United States)

    Ormandy, Paula

    2011-03-01

    Key policy drivers worldwide include optimizing patients' roles in managing their care; focusing services around patients' needs and preferences; and providing information to support patients' contributions and choices. The term information need penetrates many policy documents. Information need is espoused as the foundation from which to develop patient-centred or patient-led services. Yet there is no clear definition as to what the term means or how patients' information needs inform and shape information provision and patient care. The assimilation of complex theories originating from information science has much to offer considerations of patient information need within the context of health care. Health-related research often focuses on the content of information patients prefer, not why they need information. This paper extends and applies knowledge of information behaviour to considerations of information need in health, exposing a working definition for patient information need that reiterates the importance of considering the patient's goals and understanding the patient's context/situation. A patient information need is defined as 'recognition that their knowledge is inadequate to satisfy a goal, within the context/situation that they find themselves in at a specific point in time'. This typifies the key concepts of national/international health policy, the centrality and importance of the patient. The proposed definition of patient information need provides a conceptual framework to guide health-care practitioners on what to consider and why when meeting the information needs of patients in practice. This creates a solid foundation from which to inform future research. © 2010 The Author. Health Expectations © 2010 Blackwell Publishing Ltd.

  1. Restructuring Consciousness –the Psychedelic State in Light of Integrated Information Theory

    Directory of Open Access Journals (Sweden)

    Andrew Robert Gallimore

    2015-06-01

    Full Text Available The psychological state elicited by the classic psychedelic drugs, such as LSD and psilocybin, is one of the most fascinating and yet least understood states of consciousness. However, with the advent of modern functional neuroimaging techniques, the effect of these drugs on neural activity is now being revealed, although many of the varied phenomenological features of the psychedelic state remain challenging to explain. Integrated information theory (IIT) is one of the foremost contemporary theories of consciousness, providing a mathematical formalization of both the quantity and quality of conscious experience. This theory can be applied to all known states of consciousness, including the psychedelic state. Using the results of functional neuroimaging data on the psychedelic state, the effects of psychedelic drugs on both the level and structure of consciousness can be explained in terms of the conceptual framework of IIT. This new IIT-based model of the psychedelic state provides an explanation for many of its phenomenological features, including unconstrained cognition, alterations in the structure and meaning of concepts and a sense of expanded awareness. This model also suggests that whilst cognitive flexibility, creativity, and imagination are enhanced during the psychedelic state, this occurs at the expense of cause-effect information, as well as degrading the brain’s ability to organize, categorize, and differentiate the constituents of conscious experience. Furthermore, the model generates specific predictions that can be tested using a combination of functional imaging techniques, as has been applied to the study of levels of consciousness during anesthesia and following brain injury.

  2. A re-examination of information seeking behaviour in the context of activity theory

    Directory of Open Access Journals (Sweden)

    Wilson T.D.

    2006-01-01

    Full Text Available Introduction. Activity theory, developed in the USSR as a Marxist alternative to Western psychology, has been applied widely in educational studies and increasingly in human-computer interaction research. Argument. The key elements of activity theory (Motivation, Goal, Activity, Tools, Object, Outcome, Rules, Community and Division of labour) are all directly applicable to the conduct of information behaviour research. An activity-theoretical approach to information behaviour research would provide a sound basis for the elaboration of contextual issues and for discovering organizational and other contradictions that affect information behaviour. It may be used to aid the design and analysis of investigations. Elaboration. The basic ideas of activity theory are outlined and an attempt is made to harmonize different perspectives. A contrast is made between an activity system perspective and an activity process perspective, and a diagrammatic representation of the process perspective is offered. Conclusion. Activity theory is not a predictive theory but a conceptual framework within which different theoretical perspectives may be employed. Typically, it is suggested that several methods of data collection should be employed and that the time frame for investigation should be long enough for the full range of contextual issues to emerge. Activity theory offers not only a useful conceptual framework, but also a coherent terminology to be shared by researchers, and a rapidly developing body of literature in associated disciplines.

  3. Information Architecture without Internal Theory: An Inductive Design Process.

    Science.gov (United States)

    Haverty, Marsha

    2002-01-01

    Suggests that information architecture design is primarily an inductive process, partly because it lacks internal theory and partly because it is an activity that supports emergent phenomena (user experiences) from basic design components. Suggests a resemblance to Constructive Induction, a design process that locates the best representational…

  4. Computer-based theory of strategies

    Energy Technology Data Exchange (ETDEWEB)

    Findler, N V

    1983-01-01

    Some of the objectives and working tools of a new area of study, tentatively called theory of strategies, are described. It is based on the methodology of artificial intelligence, decision theory, operations research and digital gaming. The latter refers to computing activity that incorporates model building, simulation and learning programs in conflict situations. Three long-term projects which aim at automatically analyzing and synthesizing strategies are discussed. 27 references.

  5. Optimization of hydrometric monitoring network in urban drainage systems using information theory.

    Science.gov (United States)

    Yazdi, J

    2017-10-01

    Regular and continuous monitoring of urban runoff in both quality and quantity aspects is of great importance for controlling and managing surface runoff. Due to the considerable costs of establishing new gauges, optimization of the monitoring network is essential. This research proposes an approach for site selection of new discharge stations in urban areas, based on entropy theory in conjunction with multi-objective optimization tools and numerical models. The modeling framework provides an optimal trade-off between the maximum possible information content and the minimum shared information among stations. This approach was applied to the main surface-water collection system in Tehran to determine new optimal monitoring points under the cost considerations. Experimental results on this drainage network show that the obtained cost-effective designs noticeably outperform the consulting engineers' proposal in terms of both information contents and shared information. The research also determined the highly frequent sites at the Pareto front which might be important for decision makers to give a priority for gauge installation on those locations of the network.
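
    The entropy-based trade-off described above, maximum information content against minimum shared information among stations, can be sketched with a toy greedy selection: each step adds the candidate gauge whose discretised record most increases the joint entropy of the chosen set, which rewards informative stations and penalises redundant ones in a single criterion. This is a simplified stand-in for the paper's multi-objective optimization; the station names and discharge classes are invented.

```python
import math
from collections import Counter

def joint_entropy(columns):
    """Joint Shannon entropy (bits) of discretised records, one column per station."""
    rows = list(zip(*columns))
    counts = Counter(rows)
    n = len(rows)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def greedy_select(records, k):
    """Greedily add the station that most increases joint entropy,
    i.e. the one contributing the most new, least-shared information."""
    chosen = []
    remaining = list(records)  # list, not set, for deterministic tie-breaking
    for _ in range(k):
        best = max(remaining, key=lambda s: joint_entropy(
            [records[c] for c in chosen] + [records[s]]))
        chosen.append(best)
        remaining.remove(best)
    return chosen

# Hypothetical discretised discharge classes at four candidate gauges.
records = {
    "A": [0, 0, 1, 1, 2, 2],
    "B": [0, 0, 1, 1, 2, 2],   # duplicates A: fully shared information
    "C": [0, 1, 0, 1, 0, 1],   # independent of A: all new information
    "D": [2, 2, 2, 2, 2, 2],   # constant: zero information content
}
print(greedy_select(records, 2))  # picks A, then C; skips the redundant B
```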

  6. Accounting bases of theory: Why they matter

    Directory of Open Access Journals (Sweden)

    Zafeer Nagdee

    2016-11-01

    Full Text Available It is widely agreed that contemporary accounting practice is largely based on the application of professional accounting standards rather than on the application of sound, academic bases of theory. This has led to uncertainty within the field, which has in turn inhibited the ability of accounting to develop into a more robust academic discipline. In conducting a thematic analysis of existing literature, this study will identify and expand on three key themes which collectively establish the argument that a lacking basis of accounting theory has impaired the scholastic development of accounting practice worldwide. By introducing this argument to the academic community, this study will expose the economic risks associated with accounting’s absent bases of theory and will consequently add value by highlighting the need for additional research into the development, clarification and refinement of accounting theories that will result in more useful accounting practices worldwide.

  7. Quantum information theory with Gaussian systems

    Energy Technology Data Exchange (ETDEWEB)

    Krueger, O.

    2006-04-06

    This thesis applies ideas and concepts from quantum information theory to systems of continuous-variables such as the quantum harmonic oscillator. The focus is on three topics: the cloning of coherent states, Gaussian quantum cellular automata and Gaussian private channels. Cloning was investigated both for finite-dimensional and for continuous-variable systems. We construct a private quantum channel for the sequential encryption of coherent states with a classical key, where the key elements have finite precision. For the case of independent one-mode input states, we explicitly estimate this precision, i.e. the number of key bits needed per input state, in terms of these parameters. (orig.)

  8. Quantum information theory with Gaussian systems

    International Nuclear Information System (INIS)

    Krueger, O.

    2006-01-01

    This thesis applies ideas and concepts from quantum information theory to systems of continuous-variables such as the quantum harmonic oscillator. The focus is on three topics: the cloning of coherent states, Gaussian quantum cellular automata and Gaussian private channels. Cloning was investigated both for finite-dimensional and for continuous-variable systems. We construct a private quantum channel for the sequential encryption of coherent states with a classical key, where the key elements have finite precision. For the case of independent one-mode input states, we explicitly estimate this precision, i.e. the number of key bits needed per input state, in terms of these parameters. (orig.)

  9. Feminist Praxis, Critical Theory and Informal Hierarchies

    Directory of Open Access Journals (Sweden)

    Eva Giraud

    2015-05-01

    Full Text Available This article draws on my experiences teaching across two undergraduate media modules in a UK research-intensive institution to explore tactics for combatting both institutional and informal hierarchies within university teaching contexts. Building on Sara Motta’s (2012) exploration of implementing critical pedagogic principles at postgraduate level in an elite university context, I discuss additional tactics for combatting these hierarchies in undergraduate settings, which were developed by transferring insights derived from informal workshops led by the University of Nottingham’s Feminism and Teaching network into the classroom. This discussion is framed in relation to the concepts of “cyborg pedagogies” and “political semiotics of articulation,” derived from the work of Donna Haraway, in order to theorize how these tactics can engender productive relationships between radical pedagogies and critical theory.

  10. Vocation in theology-based nursing theories.

    Science.gov (United States)

    Lundmark, Mikael

    2007-11-01

    By using the concepts of intrinsicality/extrinsicality as analytic tools, the theology-based nursing theories of Ann Bradshaw and Katie Eriksson are analyzed regarding their explicit and/or implicit understanding of vocation as a motivational factor for nursing. The results show that both theories view intrinsic values as guarantees against reducing nursing practice to mechanistic applications of techniques and as being a way of reinforcing a high ethical standard. The theories explicitly (Bradshaw) or implicitly (Eriksson) advocate a vocational understanding of nursing as being essential for nursing theories. Eriksson's theory has a potential for conceptualizing an understanding of extrinsic and intrinsic motivational factors for nursing but one weakness in the theory could be the risk of slipping over to moral judgments where intrinsic factors are valued as being superior to extrinsic. Bradshaw's theory is more complex and explicit in understanding the concept of vocation and is theologically more plausible, although also more confessional.

  11. Theory-informed design of values clarification methods: a cognitive psychological perspective on patient health-related decision making.

    Science.gov (United States)

    Pieterse, Arwen H; de Vries, Marieke; Kunneman, Marleen; Stiggelbout, Anne M; Feldman-Stewart, Deb

    2013-01-01

    Healthcare decisions, particularly those involving weighing benefits and harms that may significantly affect quality and/or length of life, should reflect patients' preferences. To support patients in making choices, patient decision aids and values clarification methods (VCM) in particular have been developed. VCM intend to help patients to determine the aspects of the choices that are important to their selection of a preferred option. Several types of VCM exist. However, they are often designed without clear reference to theory, which makes it difficult for their development to be systematic and internally coherent. Our goal was to provide theory-informed recommendations for the design of VCM. Process theories of decision making specify components of decision processes, thus, identify particular processes that VCM could aim to facilitate. We conducted a review of the MEDLINE and PsycINFO databases and of references to theories included in retrieved papers, to identify process theories of decision making. We selected a theory if (a) it fulfilled criteria for a process theory; (b) provided a coherent description of the whole process of decision making; and (c) empirical evidence supports at least some of its postulates. Four theories met our criteria: Image Theory, Differentiation and Consolidation theory, Parallel Constraint Satisfaction theory, and Fuzzy-trace Theory. Based on these, we propose that VCM should: help optimize mental representations; encourage considering all potentially appropriate options; delay selection of an initially favoured option; facilitate the retrieval of relevant values from memory; facilitate the comparison of options and their attributes; and offer time to decide. In conclusion, our theory-based design recommendations are explicit and transparent, providing an opportunity to test each in a systematic manner. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Generalised perturbation theory and source of information through chemical measurements

    International Nuclear Information System (INIS)

    Lelek, V.; Marek, T.

    2001-01-01

    It is important to perform all analyses and collect all information from the operation of the new facility (which the transmutation demonstration unit will surely be), both to verify that the operation corresponds to the forecast and to correct the equations describing the facility. The behaviour of the molten salt reactor, and in particular its system of measurement, is very different from that of a solid-fuel reactor. Key information from the long-term kinetics could be the nearly on-line knowledge of the fuel composition. In this work it is shown how to include it in the control and use such data for the correction of neutron cross-sections for the higher actinides or other characteristics. The problem of safety - the change of the boundary problem to the initial problem - is also mentioned. The problem is transformed into generalised perturbation theory, in which the adjoint function is obtained through the solution of equations whose right-hand side has the form of a source. Such an approach should be a theoretical basis for the calculation of the sensitivity coefficients. (authors)

  13. New approaches in mathematical biology: Information theory and molecular machines

    International Nuclear Information System (INIS)

    Schneider, T.

    1995-01-01

    My research uses classical information theory to study genetic systems. Information theory was founded by Claude Shannon in the 1940s and has had an enormous impact on communications engineering and computer science. Shannon found a way to measure information. This measure can be used to precisely characterize the sequence conservation at nucleic-acid binding sites. The resulting methods, by completely replacing the use of ''consensus sequences'', provide better models for molecular biologists. An excess of conservation led us to do experimental work on bacteriophage T7 promoters and the F plasmid IncD repeats. The wonderful fidelity of telephone communications and compact disk (CD) music can be traced directly to Shannon's channel capacity theorem. When rederived for molecular biology, this theorem explains the surprising precision of many molecular events. Through connections with the Second Law of Thermodynamics and Maxwell's Demon, this approach also has implications for the development of technology at the molecular level. Discussions of these topics are held on the internet news group bionet.info-theo. (author). (Abstract only)
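
    The per-position conservation measure described above can be sketched in a few lines (a minimal illustration assuming an equiprobable A/C/G/T background and omitting the small-sample correction used in practice; the example sites are hypothetical):

```python
import math
from collections import Counter

def per_position_information(sites):
    """Information content (bits) at each column of aligned DNA sites,
    relative to an equiprobable A/C/G/T background: R_i = 2 - H_i."""
    n = len(sites)
    info = []
    for column in zip(*sites):
        counts = Counter(column)
        h = -sum((c / n) * math.log2(c / n) for c in counts.values())
        info.append(2.0 - h)
    return info

# Hypothetical aligned binding sites; a fully conserved column carries
# the maximum 2 bits, variable columns carry less.
sites = ["TATAAT", "TATGAT", "TATAAT", "GATACT"]
info = per_position_information(sites)
```

    Summing the per-position values gives the total information of the site, the quantity compared against consensus-sequence models.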

  14. The use of network theory to model disparate ship design information

    Science.gov (United States)

    Rigterink, Douglas; Piks, Rebecca; Singer, David J.

    2014-06-01

    This paper introduces the use of network theory to model and analyze disparate ship design information. This work will focus on a ship's distributed systems and their intra- and intersystem structures and interactions. The three systems to be analyzed are: a passageway system, an electrical system, and a fire fighting system. These systems will be analyzed individually using common network metrics to glean information regarding their structures and attributes. The systems will also be subjected to community detection algorithms, both separately and as a multiplex network, to compare their similarities, differences, and interactions. Network theory will be shown to be useful in the early design stage due to its simplicity and ability to model any shipboard system.
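
    The multiplex view described above can be sketched with plain Python (a toy illustration: the node and edge names are hypothetical, and a real analysis would typically use a graph library such as networkx for metrics and community detection):

```python
from collections import defaultdict

# Hypothetical edge lists for three shipboard system layers; node names are
# illustrative compartments/components, not taken from the paper.
layers = {
    "passageway": [("bow", "mid"), ("mid", "stern"), ("mid", "engine")],
    "electrical": [("generator", "mid"), ("mid", "bow"), ("mid", "stern")],
    "firefighting": [("pump", "mid"), ("mid", "bow")],
}

def degrees(edges):
    """Node degree within one layer (a basic network metric)."""
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return dict(deg)

per_layer = {name: degrees(edges) for name, edges in layers.items()}

# Aggregating degree across layers gives a first multiplex view: nodes with
# high total degree couple several systems at once.
total = defaultdict(int)
for deg in per_layer.values():
    for node, d in deg.items():
        total[node] += d
```

    Even this crude aggregate already flags hub nodes whose failure would affect several distributed systems simultaneously, which is the kind of early-stage insight the paper attributes to network models.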

  15. Quantum computation: a technological challenge; Computacion Cuantica: un reto tecnologico

    Energy Technology Data Exchange (ETDEWEB)

    Calixto, M.

    2001-07-01

    The new Quantum Information Theory augurs powerful machines that obey the entangled logic of the subatomic world. Parallelism, entanglement, teleportation, no-cloning and quantum cryptography are typical peculiarities of this novel way of understanding computation. (Author) 24 refs.

  16. An enstrophy-based linear and nonlinear receptivity theory

    Science.gov (United States)

    Sengupta, Aditi; Suman, V. K.; Sengupta, Tapan K.; Bhaumik, Swagata

    2018-05-01

    In the present research, a new theory of instability based on enstrophy is presented for incompressible flows. Explaining instability through enstrophy is counter-intuitive, as enstrophy has usually been associated with dissipation for the Navier-Stokes equation (NSE). The developed theory is valid for both linear and nonlinear stages of disturbance growth. A previously developed nonlinear theory of incompressible flow instability based on total mechanical energy, described in the work of Sengupta et al. ["Vortex-induced instability of an incompressible wall-bounded shear layer," J. Fluid Mech. 493, 277-286 (2003)], is used for comparison with the present enstrophy-based theory. The equations for disturbance enstrophy and disturbance mechanical energy are derived from the NSE without any simplifying assumptions, in contrast to other classical linear/nonlinear theories. The theory is tested for bypass transition caused by a free-stream convecting vortex over a zero-pressure-gradient boundary layer. We explain the creation of smaller scales in the flow by a cascade of enstrophy, which creates rotationality in general inhomogeneous flows. Linear and nonlinear versions of the theory help explain the vortex-induced instability problem under consideration.

  17. Comparison of Predictive Contract Mechanisms from an Information Theory Perspective

    OpenAIRE

    Zhang, Xin; Ward, Tomas; McLoone, Seamus

    2012-01-01

    Inconsistency arises across a Distributed Virtual Environment due to network latency in the communication of state changes. Predictive Contract Mechanisms (PCMs) combat this problem by reducing the number of messages transmitted in return for perceptually tolerable inconsistency. To date there are no methods to quantify the efficiency of PCMs in communicating this reduced state information. This article presents an approach derived from concepts in information theory for a dee...

  18. An Innovative Thinking-Based Intelligent Information Fusion Algorithm

    Directory of Open Access Journals (Sweden)

    Huimin Lu

    2013-01-01

    This study proposes an intelligent algorithm that can realize information fusion, with reference to research achievements in brain cognitive theory and innovative computation. The algorithm treats knowledge as its core and information fusion as a knowledge-based innovative thinking process. The five key parts of the algorithm - information sense and perception, memory storage, divergent thinking, convergent thinking, and the evaluation system - are simulated and modeled. The algorithm fully develops the innovative thinking skills of knowledge in information fusion and attempts to convert the abstract concepts of brain cognitive science into specific, operable research routes and strategies. The influence of each parameter on algorithm performance is analyzed and compared with that of classical intelligent algorithms through tests. Test results suggest that the proposed algorithm can obtain the optimal solution with fewer objective evaluations, improve optimization effectiveness, and achieve effective fusion of information.

  19. Taking Root: a grounded theory on evidence-based nursing implementation in China.

    Science.gov (United States)

    Cheng, L; Broome, M E; Feng, S; Hu, Y

    2018-06-01

    Evidence-based nursing is widely recognized as the critical foundation for quality care. To develop a middle-range theory on the process of evidence-based nursing implementation in a Chinese context. A grounded theory study using unstructured in-depth individual interviews was conducted with 56 participants who were involved in 24 evidence-based nursing implementation projects in Mainland China from September 2015 to September 2016. A middle-range grounded theory of 'Taking Root' was developed. The theory describes the evidence implementation process as consisting of four components (driving forces, process, outcome, sustainment/regression), three approaches (top-down, bottom-up and outside-in), four implementation strategies (patient-centred, nurses at the heart of change, reaching agreement, collaboration) and two patterns (transformational and adaptive implementation). Certain perspectives may not have been captured, as the retrospective nature of the interviewing technique did not allow for 'real-time' assessment of the actual implementation process. The transferability of the findings requires further exploration, as few participants with negative experiences were recruited. This is the first study to explore evidence-based implementation processes, strategies, approaches and patterns in the Chinese nursing practice context to inform international nursing and health policymaking. The theory of Taking Root describes various approaches to evidence implementation and how implementation can be transformational for the nurses and the settings in which they work. Nursing educators, managers and researchers should work together to improve nurses' readiness for evidence implementation. Healthcare systems need to optimize internal mechanisms and external collaborations to promote nursing practice in line with evidence and to achieve clinical outcomes and sustainability. © 2017 International Council of Nurses.

  20. Russian and Chinese Information Warfare: Theory and Practice

    Science.gov (United States)

    2004-06-01

    [Garbled text extracted from the report's documentation page. Recoverable fragments list topics including neurolinguistic programming, computer psychotechnology, the mass media, audiovisual and special effects, and the placing of suggestions into the conscious or subconscious mind.]

  1. Informal Theory: The Ignored Link in Theory-to-Practice

    Science.gov (United States)

    Love, Patrick

    2012-01-01

    Applying theory to practice in student affairs is dominated by the assumption that formal theory is directly applied to practice. Among the problems with this assumption is that many practitioners believe they must choose between their lived experiences and formal theory, and that graduate students are taught that their experience "does not…

  2. Intuitive theories of information: beliefs about the value of redundancy.

    Science.gov (United States)

    Soll, J B

    1999-03-01

    In many situations, quantity estimates from multiple experts or diagnostic instruments must be collected and combined. Normatively, and all else equal, one should value information sources that are nonredundant, in the sense that correlation in forecast errors should be minimized. Past research on the preference for redundancy has been inconclusive. While some studies have suggested that people correctly place higher value on uncorrelated inputs when collecting estimates, others have shown that people either ignore correlation or, in some cases, even prefer it. The present experiments show that the preference for redundancy depends on one's intuitive theory of information. The most common intuitive theory identified is the Error Tradeoff Model (ETM), which explicitly distinguishes between measurement error and bias. According to ETM, measurement error can only be averaged out by consulting the same source multiple times (normatively false), and bias can only be averaged out by consulting different sources (normatively true). As a result, ETM leads people to prefer redundant estimates when the ratio of measurement error to bias is relatively high. Other participants favored different theories. Some adopted the normative model, while others were reluctant to mathematically average estimates from different sources under any circumstances. In a post hoc analysis, science majors were more likely than others to subscribe to the normative model. While tentative, this result lends insight into how intuitive theories might develop and also has potential ramifications for how statistical concepts such as correlation might best be learned and internalized. Copyright 1999 Academic Press.
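
    The normative argument about redundancy can be made concrete with the textbook formula for the variance of an average of n equally reliable, equicorrelated estimates (a standard statistical result, not taken from the paper): correlated errors largely fail to average out.

```python
def variance_of_average(n, sigma2, rho):
    """Variance of the mean of n unbiased estimates, each with variance
    sigma2 and pairwise error correlation rho:
    Var = sigma2 * (1 + (n - 1) * rho) / n."""
    return sigma2 * (1 + (n - 1) * rho) / n

independent = variance_of_average(4, 1.0, 0.0)  # uncorrelated errors average out
redundant = variance_of_average(4, 1.0, 0.6)    # correlated errors persist
```

    With four estimates, raising the error correlation from 0 to 0.6 nearly triples the variance of the combined estimate, which is why nonredundant sources are normatively preferred.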

  3. Quantum information processing in the radical-pair mechanism: Haberkorn's theory violates the Ozawa entropy bound

    Science.gov (United States)

    Mouloudakis, K.; Kominis, I. K.

    2017-02-01

    Radical-ion-pair reactions, central for understanding the avian magnetic compass and spin transport in photosynthetic reaction centers, were recently shown to be a fruitful paradigm of the new synthesis of quantum information science with biological processes. We show here that the master equation so far constituting the theoretical foundation of spin chemistry violates fundamental bounds for the entropy of quantum systems, in particular the Ozawa bound. In contrast, a recently developed theory based on quantum measurements, quantum coherence measures, and quantum retrodiction, thus exemplifying the paradigm of quantum biology, satisfies the Ozawa bound as well as the Lanford-Robinson bound on information extraction. By considering Groenewold's information, the quantum information extracted during the reaction, we reproduce the known and unravel other magnetic-field effects not conveyed by reaction yields.

  5. Efficient Algorithms for Searching the Minimum Information Partition in Integrated Information Theory

    Science.gov (United States)

    Kitazono, Jun; Kanai, Ryota; Oizumi, Masafumi

    2018-03-01

    The ability to integrate information in the brain is considered to be an essential property for cognition and consciousness. Integrated Information Theory (IIT) hypothesizes that the amount of integrated information (Φ) in the brain is related to the level of consciousness. IIT proposes that to quantify information integration in a system as a whole, integrated information should be measured across the partition of the system at which information loss caused by partitioning is minimized, called the Minimum Information Partition (MIP). The computational cost of exhaustively searching for the MIP grows exponentially with system size, making it difficult to apply IIT to real neural data. It has been previously shown that if a measure of Φ satisfies a mathematical property, submodularity, the MIP can be found in polynomial time by an optimization algorithm. However, although the first version of Φ is submodular, the later versions are not. In this study, we empirically explore to what extent the algorithm can be applied to the non-submodular measures of Φ by evaluating the accuracy of the algorithm on simulated data and real neural data. We find that the algorithm identifies the MIP in a nearly perfect manner even for the non-submodular measures. Our results show that the algorithm allows us to measure Φ in large systems within a practical amount of time.
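
    The exhaustive MIP search can be illustrated on a toy system, using plain mutual information across the cut as a crude stand-in for Φ (the three-unit distribution is hypothetical, and this does not reproduce any of the official IIT measures):

```python
import math
from itertools import combinations, product

UNITS = 3
# Toy joint distribution over three binary units: units 0 and 1 are
# perfectly correlated; unit 2 is independent of both (hypothetical data).
p = {s: (0.25 if s[0] == s[1] else 0.0)
     for s in product((0, 1), repeat=UNITS)}

def mutual_information(p, part):
    """Mutual information (bits) between the units in `part` and the rest:
    a crude surrogate for integrated information across the bipartition."""
    rest = tuple(i for i in range(UNITS) if i not in part)
    pa, pb = {}, {}
    for s, q in p.items():
        a = tuple(s[i] for i in part)
        b = tuple(s[i] for i in rest)
        pa[a] = pa.get(a, 0.0) + q
        pb[b] = pb.get(b, 0.0) + q
    mi = 0.0
    for s, q in p.items():
        if q > 0.0:
            a = tuple(s[i] for i in part)
            b = tuple(s[i] for i in rest)
            mi += q * math.log2(q / (pa[a] * pb[b]))
    return mi

# Exhaustive search over all bipartitions -- the O(2^n) cost that motivates
# the submodularity-based shortcut discussed in the abstract.
cuts = [c for r in range(1, UNITS) for c in combinations(range(UNITS), r)]
mip = min(cuts, key=lambda part: mutual_information(p, part))
# The MIP cuts off the independent unit 2, where nothing is integrated.
```

    Even at three units there are six cuts to check; at the sizes of real neural recordings the exhaustive list is astronomically large, which is what makes polynomial-time search algorithms attractive.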

  6. Intervention mapping protocol for developing a theory-based diabetes self-management education program.

    Science.gov (United States)

    Song, Misoon; Choi, Suyoung; Kim, Se-An; Seo, Kyoungsan; Lee, Soo Jin

    2015-01-01

    Development of behavior theory-based health promotion programs is encouraged with the paradigm shift from contents to behavior outcomes. This article describes the development process of a diabetes self-management program for older Koreans (DSME-OK) using the intervention mapping (IM) protocol. The IM protocol includes needs assessment, defining goals and objectives, identifying theory and determinants, developing a matrix to form change objectives, selecting strategies and methods, structuring the program, and planning for evaluation and pilot testing. The DSME-OK adopted the seven behavior objectives developed by the American Association of Diabetes Educators as behavioral outcomes. The program applied the information-motivation-behavioral skills (IMB) model, and interventions targeted its 3 determinants to change health behaviors. Specific methods were selected to achieve each objective, guided by the IM protocol. As the final step, program evaluation was planned, including a pilot test. The DSME-OK was structured so that the 3 determinants of the IMB model were addressed in each session to achieve the behavior objectives. The program has 12 weekly 90-min sessions tailored for older adults. Using the IM protocol in developing a theory-based self-management program was beneficial in terms of providing a systematic guide to developing theory-based and behavior-outcome-focused health education programs.

  7. Frame Works: Using Metaphor in Theory and Practice in Information Literacy

    Science.gov (United States)

    Holliday, Wendy

    2017-01-01

    The ACRL Framework for Information Literacy in Higher Education generated a large amount of discourse during its development and adoption. All of this discourse is rich in metaphoric language that can be used as a tool for critical reflection on teaching and learning, information literacy, and the nature and role of theory in the practice of…

  8. Quantum-like model of processing of information in the brain based on classical electromagnetic field.

    Science.gov (United States)

    Khrennikov, Andrei

    2011-09-01

    We propose a model of quantum-like (QL) processing of mental information. This model is based on quantum information theory. However, in contrast to models of the "quantum physical brain" reducing mental activity (at least at the highest level) to quantum physical phenomena in the brain, our model matches well with the basic neuronal paradigm of cognitive science. QL information processing is based (surprisingly) on classical electromagnetic signals induced by the joint activity of neurons. This novel approach to quantum information rests on the representation of quantum mechanics as a version of classical signal theory, which was recently elaborated by the author. The brain uses the QL representation (QLR) for working with abstract concepts; concrete images are described by classical information theory. The two processes, classical and QL, are performed in parallel. Moreover, information is actively transmitted from one representation to the other. A QL concept, given in our model by a density operator, can generate a variety of concrete images given by temporal realizations of the corresponding (Gaussian) random signal. This signal has a covariance operator coinciding with the density operator encoding the abstract concept under consideration. The presence of various temporal scales in the brain plays a crucial role in the creation of the QLR in the brain. Moreover, in our model electromagnetic noise produced by neurons is a source of superstrong QL correlations between processes in different spatial domains of the brain; the binding problem is solved on the QL level, but with the aid of the classical background fluctuations. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  9. The future (and past) of quantum theory after the Higgs boson: a quantum-informational viewpoint.

    Science.gov (United States)

    Plotnitsky, Arkady

    2016-05-28

    Taking as its point of departure the discovery of the Higgs boson, this article considers quantum theory, including quantum field theory, which predicted the Higgs boson, through the combined perspective of quantum information theory and the idea of technology, while also adopting a non-realist interpretation, in 'the spirit of Copenhagen', of quantum theory and quantum phenomena themselves. The article argues that the 'events' in question in fundamental physics, such as the discovery of the Higgs boson (a particularly complex and dramatic, but not essentially different, case), are made possible by the joint workings of three technologies: experimental technology, mathematical technology and, more recently, digital computer technology. The article will consider the role of and the relationships among these technologies, focusing on experimental and mathematical technologies, in quantum mechanics (QM), quantum field theory (QFT) and finite-dimensional quantum theory, with which quantum information theory has been primarily concerned thus far. It will do so, in part, by reassessing the history of quantum theory, beginning with Heisenberg's discovery of QM, in quantum-informational and technological terms. This history, the article argues, is defined by the discoveries of increasingly complex configurations of observed phenomena and the emergence of increasingly complex mathematical formalisms accounting for these phenomena, culminating in the standard model of elementary-particle physics, which defines the current state of QFT. © 2016 The Author(s).

  10. Risk Oriented Audit Methodology’s Improvement, based on the Fraud Theory

    Directory of Open Access Journals (Sweden)

    Sergei V. Arzhenovskii

    2016-12-01

    Modern economic development is accompanied by increasing complexity in relationship structures, intercompany transactions, and requirements for disclosure of information on an entity's activities, which expands the opportunities for fraudulent misrepresentation in public companies' reporting. External audit is considered an institution that resists these negative processes. This paper examines, from the standpoint of fraud theory, how to improve the tools for assessing the risk of material misstatement of financial statements due to fraud, and identifies areas in which the methodology for such risk assessment in an audit can be developed on the basis of modern interpretations of fraud theory. Structural-functional, comparative, logical-historical, and deductive-inductive methods of analysis have been used. A retrospective analysis of research directions in the field of fraud theory has been performed, and promising directions for improving the methodology and methods of risk assessment in an audit have been identified. The development of fraud risk assessment procedures in an audit of financial statements is based on the 'fraud diamond' theory. The emergence of new interpretations of fraud theory during the last fifteen years is a consequence of expanding opportunities for corporate fraud, one of the most common forms of which is falsification of financial statements. The changing list of factors that can trigger fraud in a company's public reporting requires improvement of the methods for its identification during an audit, in line with modern modifications of fraud theory.

  13. The application of information theory for the research of aging and aging-related diseases.

    Science.gov (United States)

    Blokh, David; Stambler, Ilia

    2017-10-01

    This article reviews the application of information-theoretical analysis, employing measures of entropy and mutual information, for the study of aging and aging-related diseases. The research of aging and aging-related diseases is particularly suitable for the application of information theory methods, as aging processes and related diseases are multi-parametric, with continuous parameters coexisting alongside discrete parameters, and with the relations between the parameters being as a rule non-linear. Information theory provides unique analytical capabilities for the solution of such problems, with unique advantages over common linear biostatistics. Among the age-related diseases, information theory has been used in the study of neurodegenerative diseases (particularly using EEG time series for diagnosis and prediction), cancer (particularly for establishing individual and combined cancer biomarkers), diabetes (mainly utilizing mutual information to characterize the diseased and aging states), and heart disease (mainly for the analysis of heart rate variability). Few works have employed information theory for the analysis of general aging processes and frailty, as underlying determinants and possible early preclinical diagnostic measures for aging-related diseases. Generally, the use of information-theoretical analysis permits not only establishing the (non-linear) correlations between diagnostic or therapeutic parameters of interest, but may also provide a theoretical insight into the nature of aging and related diseases by establishing the measures of variability, adaptation, regulation or homeostasis, within a system of interest. It may be hoped that the increased use of such measures in research may considerably increase diagnostic and therapeutic capabilities and the fundamental theoretical mathematical understanding of aging and disease. Copyright © 2016 Elsevier Ltd. All rights reserved.
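
    As a minimal sketch of the kind of measure discussed, mutual information between two discretized parameters can be estimated from paired samples (the biomarker data below are hypothetical, and this naive plug-in estimator is biased for small samples):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Mutual information (bits) between two discrete sequences, estimated
    from their empirical joint distribution (naive plug-in estimator)."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# Hypothetical discretized data: a marker that tracks age group exactly
# carries one full bit of information about it.
age_group = ["young", "young", "old", "old"]
biomarker = ["low", "low", "high", "high"]
mi = mutual_information(age_group, biomarker)
```

    Unlike a linear correlation coefficient, this estimate is nonzero for any statistical dependence, linear or not, which is the advantage the review attributes to information-theoretical analysis.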

  14. Novel information theory-based measures for quantifying incongruence among phylogenetic trees.

    Science.gov (United States)

    Salichos, Leonidas; Stamatakis, Alexandros; Rokas, Antonis

    2014-05-01

    Phylogenies inferred from different data matrices often conflict with each other, necessitating the development of measures that quantify this incongruence. Here, we introduce novel measures that use information theory to quantify the degree of conflict or incongruence among all nontrivial bipartitions present in a set of trees. The first measure, internode certainty (IC), calculates the degree of certainty for a given internode by considering the frequency of the bipartition defined by the internode (internal branch) in a given set of trees jointly with that of the most prevalent conflicting bipartition in the same tree set. The second measure, IC All (ICA), calculates the degree of certainty for a given internode by considering the frequency of the bipartition defined by the internode in a given set of trees in conjunction with that of all conflicting bipartitions in the same underlying tree set. Finally, the tree certainty (TC) and TC All (TCA) measures are the sums of IC and ICA values across all internodes of a phylogeny, respectively. IC, ICA, TC, and TCA can be calculated from different types of data that contain nontrivial bipartitions, ranging from bootstrap replicate trees to gene trees or individual characters. Given a set of phylogenetic trees, the IC and ICA values of a given internode reflect its specific degree of incongruence, and the TC and TCA values describe the global degree of incongruence between the trees in the set. All four measures are implemented and freely available in version 8.0.0 and subsequent versions of the widely used program RAxML.
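
    Following the verbal definition above, IC for a single internode can be sketched as one minus the Shannon entropy of the relative frequencies of the internode's bipartition and its most prevalent conflicting bipartition (an illustrative reading of the abstract, not RAxML's implementation):

```python
import math

def internode_certainty(n_bip, n_conflict):
    """IC as described above: 1 plus the sum of p*log2(p) over the relative
    frequencies of the internode's bipartition and its most prevalent
    conflicting bipartition (i.e., 1 minus their Shannon entropy)."""
    total = n_bip + n_conflict
    ic = 1.0
    for n in (n_bip, n_conflict):
        p = n / total
        if p > 0:
            ic += p * math.log2(p)
    return ic

certain = internode_certainty(100, 0)  # no conflict: full certainty
split = internode_certainty(50, 50)    # even conflict: no certainty
```

    Summing such values over all internodes gives a TC-style global score for the tree, with higher totals indicating less incongruence across the tree set.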

  15. A force-matching Stillinger-Weber potential for MoS2: Parameterization and Fisher information theory based sensitivity analysis

    Science.gov (United States)

    Wen, Mingjian; Shirodkar, Sharmila N.; Plecháč, Petr; Kaxiras, Efthimios; Elliott, Ryan S.; Tadmor, Ellad B.

    2017-12-01

    Two-dimensional molybdenum disulfide (MoS2) is a promising material for the next generation of switchable transistors and photodetectors. In order to perform large-scale molecular simulations of the mechanical and thermal behavior of MoS2-based devices, an accurate interatomic potential is required. To this end, we have developed a Stillinger-Weber potential for monolayer MoS2. The potential parameters are optimized to reproduce the geometry (bond lengths and bond angles) of MoS2 in its equilibrium state and to match as closely as possible the forces acting on the atoms along a dynamical trajectory obtained from ab initio molecular dynamics. Verification calculations indicate that the new potential accurately predicts important material properties including the strain dependence of the cohesive energy, the elastic constants, and the linear thermal expansion coefficient. The uncertainty in the potential parameters is determined using a Fisher information theory analysis. It is found that the parameters are fully identified, and none are redundant. In addition, the Fisher information matrix provides uncertainty bounds for predictions of the potential for new properties. As an example, bounds on the average vibrational thickness of a MoS2 monolayer at finite temperature are computed and found to be consistent with the results from a molecular dynamics simulation. The new potential is available through the OpenKIM interatomic potential repository at https://openkim.org/cite/MO_201919462778_000.

  16. Soft Measurement Modeling Based on Chaos Theory for Biochemical Oxygen Demand (BOD)

    Directory of Open Access Journals (Sweden)

    Junfei Qiao

    2016-12-01

    The precision of soft measurement for biochemical oxygen demand (BOD) is always restricted by various factors in the wastewater treatment plant (WWTP). To solve this problem, a new soft measurement modeling method based on chaos theory is proposed and applied to BOD measurement in this paper. Phase space reconstruction (PSR), based on the Takens embedding theorem, is used to extract more information from the limited datasets of the chaotic system. The WWTP is first verified to be a chaotic system by means of the correlation dimension (D), the largest Lyapunov exponent (λ1), and the Kolmogorov entropy (K) of the BOD and other water quality parameter time series. A multivariate chaotic time series modeling method with principal component analysis (PCA) and an artificial neural network (ANN) is then adopted to estimate the value of the effluent BOD. Simulation results show that the proposed approach has higher accuracy and better prediction ability than corresponding modeling approaches not based on chaos theory.
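
    The delay-embedding step of PSR can be sketched as follows (a minimal illustration; in practice the embedding dimension and delay are chosen with methods such as false nearest neighbors and average mutual information, and the series here is a placeholder):

```python
def delay_embed(series, dim, tau):
    """Phase space reconstruction by time-delay embedding (Takens): each
    reconstructed point is (x[t], x[t + tau], ..., x[t + (dim - 1) * tau])."""
    span = (dim - 1) * tau
    return [tuple(series[t + k * tau] for k in range(dim))
            for t in range(len(series) - span)]

# Placeholder scalar time series standing in for a BOD measurement record.
series = [0, 1, 2, 3, 4, 5, 6]
points = delay_embed(series, dim=3, tau=2)
# first reconstructed state vector: (0, 2, 4)
```

    The reconstructed state vectors are what invariants such as the correlation dimension and the largest Lyapunov exponent are then estimated from.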

  17. Local versus nonlocal information in quantum-information theory: Formalism and phenomena

    International Nuclear Information System (INIS)

    Horodecki, Michal; Horodecki, Ryszard; Synak-Radtke, Barbara; Horodecki, Pawel; Oppenheim, Jonathan; Sen, Aditi; Sen, Ujjwal

    2005-01-01

    In spite of many results in quantum information theory, the complex nature of compound systems is far from clear. In general the information is a mixture of local and nonlocal ('quantum') information. It is important from both pragmatic and theoretical points of view to know the relationships between the two components. To make this point more clear, we develop and investigate the quantum-information processing paradigm in which parties sharing a multipartite state distill local information. The amount of information which is lost because the parties must use a classical communication channel is the deficit. This scheme can be viewed as complementary to the notion of distilling entanglement. After reviewing the paradigm in detail, we show that the upper bound for the deficit is given by the relative entropy distance to so-called pseudoclassically correlated states; the lower bound is the relative entropy of entanglement. This implies, in particular, that any entangled state is informationally nonlocal - i.e., has nonzero deficit. We also apply the paradigm to defining the thermodynamical cost of erasing entanglement. We show the cost is bounded from below by relative entropy of entanglement. We demonstrate the existence of several other nonlocal phenomena which can be found using the paradigm of local information. For example, we prove the existence of a form of nonlocality without entanglement and with distinguishability. We analyze the deficit for several classes of multipartite pure states and obtain that in contrast to the GHZ state, the Aharonov state is extremely nonlocal. We also show that there do not exist states for which the deficit is strictly equal to the whole informational content (bound local information). We discuss the relation of the paradigm with measures of classical correlations introduced earlier. It is also proved that in the one-way scenario, the deficit is additive for Bell diagonal states. We then discuss complementary features of

  18. Cognition and biology: perspectives from information theory.

    Science.gov (United States)

    Wallace, Rodrick

    2014-02-01

    The intimate relation between biology and cognition can be formally examined through statistical models constrained by the asymptotic limit theorems of communication theory, augmented by methods from statistical mechanics and nonequilibrium thermodynamics. Cognition, often involving submodules that act as information sources, is ubiquitous across the living state. Less metabolic free energy is consumed by permitting crosstalk between biological information sources than by isolating them, leading to evolutionary exaptations that assemble shifting, tunable cognitive arrays at multiple scales and levels of organization to meet dynamic patterns of threat and opportunity. Cognition is thus necessary for life, but it is not sufficient: an organism represents a highly patterned outcome of path-dependent blind variation, selection, interaction, and chance extinction in the context of an adequate flow of free energy and an environment fit for development. Complex, interacting cognitive processes within an organism both record and instantiate those evolutionary and developmental trajectories.

  19. Towards a conceptual framework for protection of personal information from the perspective of activity theory

    Directory of Open Access Journals (Sweden)

    Tiko Iyamu

    2017-11-01

    Full Text Available Background: Personal information about individuals is stored by organisations, including government agencies. The information is intended to be kept confidential and strictly used for its primary and legitimate purposes. However, that has not always been the case in many South African government agencies and departments. In recent years, personal information about individuals and groups has been illegally leaked for other motives, some of them detrimental. Even though legislation exists, the Protection of Personal Information (POPI) Act, which prohibits such malpractices, illegal leaks have, however, not stopped or decreased. In addition to the adoption of the POPI Act, a more stringent approach is therefore needed in order to improve sanity in the use and management of personal information. Otherwise, the detriment that such malpractices cause to many citizens can only increase. Objectives: The objectives of this study were twofold: (1) to examine and understand the activities that happen with personal information leaks, including why and how information is leaked; and (2) to develop a conceptual framework, including identification of the factors that influence information leaks and breaches in an environment. Method: Qualitative research methods were followed in achieving the objectives of the study. Within the qualitative methods, documents including existing literature were gathered. Activity theory was employed as a lens to guide the analysis. Result: From the analysis, four critical factors were found to influence information leaks and breaches in organisations: (1) information and its value, (2) the roles of society and its compliance with information protection, (3) government and its laws relating to information protection and (4) the need for standardisation of information usage and management within a community. Based on the factors, a conceptual framework was

  20. Geographic information modeling of Econet of Northwestern Federal District territory on graph theory basis

    Science.gov (United States)

    Kopylova, N. S.; Bykova, A. A.; Beregovoy, D. N.

    2018-05-01

    Based on the landscape-geographical approach, a structural and logical scheme for the Northwestern Federal District Econet has been developed, which can be integrated into the federal and world ecological networks in order to improve the environmental infrastructure of the region. The proposed method of organising the Northwestern Federal District Econet on the basis of graph theory, by means of the Quantum GIS geographic information system, is an effective means of preserving and recreating the unique biodiversity of landscapes and of regulating environmental protection.

  1. Fundamentals of information theory and coding design

    CERN Document Server

    Togneri, Roberto

    2003-01-01

    In a clear, concise, and modular format, this book introduces the fundamental concepts and mathematics of information and coding theory. The authors emphasize how a code is designed and discuss the main properties and characteristics of different coding algorithms along with strategies for selecting the appropriate codes to meet specific requirements. They provide comprehensive coverage of source and channel coding, address arithmetic, BCH, and Reed-Solomon codes and explore some more advanced topics such as PPM compression and turbo codes. Worked examples and sets of basic and advanced exercises in each chapter reinforce the text's clear explanations of all concepts and methodologies.
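
As an illustration of the source-coding fundamentals this book covers, here is a minimal sketch (not taken from the book) that builds a Huffman prefix code for a short string and checks it against the Shannon bound H ≤ L̄ < H + 1:

```python
import heapq
from collections import Counter
from math import log2

def huffman_code(freqs):
    """Build a Huffman prefix code for a symbol -> frequency map."""
    # Heap entries: (total frequency, tie-breaker, {symbol: partial codeword}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        # Prepend a bit to every codeword in each merged subtree.
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

text = "abracadabra"
freqs = Counter(text)
code = huffman_code(freqs)
n = len(text)
avg_len = sum(freqs[s] * len(code[s]) for s in freqs) / n
entropy = -sum((f / n) * log2(f / n) for f in freqs.values())
# Source coding bound for an optimal prefix code: H <= avg_len < H + 1
```

The integer tie-breaker keeps the heap from ever comparing the codeword dictionaries; any Huffman tree it produces has the same optimal expected length.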

  2. On divergence of finite measures and their applicability in statistics and information theory

    Czech Academy of Sciences Publication Activity Database

    Vajda, Igor; Stummer, W.

    2009-01-01

    Roč. 44, č. 2 (2009), s. 169-187 ISSN 0233-1888 R&D Projects: GA MŠk(CZ) 1M0572; GA ČR(CZ) GA102/07/1131 Institutional research plan: CEZ:AV0Z10750506 Keywords : Local and global divergences of finite measures * Divergences of sigma-finite measures * Statistical censoring * Pinsker's inequality, Ornstein's distance * Differential power entropies Subject RIV: BD - Theory of Information Impact factor: 0.759, year: 2009 http://library.utia.cas.cz/separaty/2009/SI/vajda-on divergence of finite measures and their applicability in statistics and information theory.pdf
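
Pinsker's inequality, one of the keywords above, lower-bounds the KL divergence by the squared L1 (total variation) distance. A minimal numerical sketch with an arbitrary example pair of distributions (not from the paper):

```python
from math import log

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P||Q) in nats; p_i = 0 terms vanish."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def l1_distance(p, q):
    """L1 distance sum_i |p_i - q_i| (twice the total variation distance)."""
    return sum(abs(pi - qi) for pi, qi in zip(p, q))

p = [0.5, 0.3, 0.2]
q = [0.2, 0.5, 0.3]
# Pinsker's inequality: D(P||Q) >= (1/2) * ||P - Q||_1^2
assert kl_divergence(p, q) >= 0.5 * l1_distance(p, q) ** 2
```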

  3. Theory of reasoned action and theory of planned behavior-based dietary interventions in adolescents and young adults: a systematic review.

    Science.gov (United States)

    Hackman, Christine L; Knowlden, Adam P

    2014-01-01

    Childhood obesity has reached epidemic proportions in many nations around the world. The theory of planned behavior (TPB) and the theory of reasoned action (TRA) have been used to successfully plan and evaluate numerous interventions for many different behaviors. The aim of this study was to systematically review and synthesize TPB- and TRA-based dietary behavior interventions targeting adolescents and young adults. The following databases were systematically searched to find articles for this review: Academic Search Premier; Cumulative Index to Nursing and Allied Health (CINAHL); Education Resources Information Center (ERIC); Health Source: Nursing/Academic Edition; Cochrane Central Register of Controlled Trials (CENTRAL); and MEDLINE. Inclusion criteria for articles were: 1) primary or secondary interventions, 2) with any quantitative design, 3) published in the English language, 4) between January 2003 and March 2014, 5) that targeted adolescents or young adults, 6) which included dietary change behavior as the outcome, and 7) utilized TPB or TRA. Of the eleven intervention studies evaluated, nine resulted in dietary behavior change that was attributed to the treatment. Additionally, all but one study found a change in at least one construct of TRA or TPB, while one study did not measure constructs. All of the studies utilized some type of quantitative design, with two employing quasi-experimental and eight employing randomized controlled trial designs. Among the studies, four utilized technology, including emails, social media posts, information on school websites, web-based activities, audio messages in classrooms, interactive DVDs, and health-related websites. Two studies incorporated goal setting and four employed persuasive communication. Interventions directed toward changing dietary behaviors in adolescents should aim to incorporate multi-faceted, theory-based approaches. Future studies should consider utilizing randomized controlled trial design and

  4. A Novel Abandoned Object Detection System Based on Three-Dimensional Image Information

    Directory of Open Access Journals (Sweden)

    Yiliang Zeng

    2015-03-01

    Full Text Available A new idea for an abandoned object detection system for road traffic surveillance, based on three-dimensional image information, is proposed in this paper to prevent traffic accidents. A novel Binocular Information Reconstruction and Recognition (BIRR) algorithm is presented to implement the idea. As initial detection, suspected abandoned objects are detected by the proposed static foreground region segmentation algorithm, based on surveillance video from a monocular camera. After detection of suspected abandoned objects, three-dimensional (3D) information about the suspected abandoned object is reconstructed by the proposed theory of 3D object information reconstruction from binocular camera images. To determine whether the detected object is hazardous to normal road traffic, the road plane equation and the height of the suspected abandoned object are calculated from the three-dimensional information. Experimental results show that the system achieves fast detection of abandoned objects and can be used for road traffic monitoring and public area surveillance.

  5. Information theory and robotics meet to study predator-prey interactions

    Science.gov (United States)

    Neri, Daniele; Ruberto, Tommaso; Cord-Cruz, Gabrielle; Porfiri, Maurizio

    2017-07-01

    Transfer entropy holds promise to advance our understanding of animal behavior, by affording the identification of causal relationships that underlie animal interactions. A critical step toward the reliable implementation of this powerful information-theoretic concept entails the design of experiments in which causal relationships could be systematically controlled. Here, we put forward a robotics-based experimental approach to test the validity of transfer entropy in the study of predator-prey interactions. We investigate the behavioral response of zebrafish to a fear-evoking robotic stimulus, designed after the morpho-physiology of the red tiger oscar and actuated along preprogrammed trajectories. From the time series of the positions of the zebrafish and the robotic stimulus, we demonstrate that transfer entropy correctly identifies the influence of the stimulus on the focal subject. Building on this evidence, we apply transfer entropy to study the interactions between zebrafish and a live red tiger oscar. The analysis of transfer entropy reveals a change in the direction of the information flow, suggesting a mutual influence between the predator and the prey, where the predator adapts its strategy as a function of the movement of the prey, which, in turn, adjusts its escape as a function of the predator motion. Through the integration of information theory and robotics, this study posits a new approach to study predator-prey interactions in freshwater fish.
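
The core quantity in this abstract, transfer entropy, can be estimated from discrete time series by plug-in frequency counts. A toy sketch (hypothetical data, history length 1, not the authors' code): a series x that copies a random driver series y with one step of delay, so information should flow from y to x and not back:

```python
import random
from collections import Counter
from math import log2

def transfer_entropy(source, target):
    """Plug-in estimate of TE(source -> target) in bits, history length 1."""
    # Triples (x_{t+1}, x_t, y_t) for target x and source y.
    triples = list(zip(target[1:], target[:-1], source[:-1]))
    n = len(triples)
    c_xyz = Counter(triples)
    c_yz = Counter((xt, yt) for _, xt, yt in triples)   # context (x_t, y_t)
    c_xz = Counter((x1, xt) for x1, xt, _ in triples)   # pair (x_{t+1}, x_t)
    c_z = Counter(xt for _, xt, _ in triples)           # history x_t
    te = 0.0
    for (x1, xt, yt), c in c_xyz.items():
        # p(x1,xt,yt) * log2[ p(x1|xt,yt) / p(x1|xt) ], all from counts.
        te += (c / n) * log2((c * c_z[xt]) / (c_yz[(xt, yt)] * c_xz[(x1, xt)]))
    return te

random.seed(0)
y = [random.randint(0, 1) for _ in range(5000)]
x = [0] + y[:-1]                  # x copies y with a one-step delay
te_yx = transfer_entropy(y, x)    # close to 1 bit: y determines x's next state
te_xy = transfer_entropy(x, y)    # close to 0 bits: x adds nothing about y
```

The asymmetry te_yx ≫ te_xy is exactly the directional signature the study exploits to identify which animal (or robot) is driving the interaction.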

  6. Interest in and reactions to genetic risk information: The role of implicit theories and self-affirmation.

    Science.gov (United States)

    Taber, Jennifer M; Klein, William M P; Persky, Susan; Ferrer, Rebecca A; Kaufman, Annette R; Thai, Chan L; Harris, Peter R

    2017-10-01

    Implicit theories reflect core assumptions about whether human attributes are malleable or fixed: Incremental theorists believe a characteristic is malleable whereas entity theorists believe it is fixed. People with entity theories about health may be less likely to engage in risk-mitigating behavior. Spontaneous self-affirmation (e.g., reflecting on one's values when threatened) may lessen defensiveness and unhealthy behaviors associated with fixed beliefs, and reduce the likelihood of responding to health risk information with fixed beliefs. Across two studies conducted in the US from 2012 to 2015, we investigated how self-affirmation and implicit theories about health and body weight were linked to engagement with genetic risk information. In Study 1, participants in a genome sequencing trial (n = 511) completed cross-sectional assessments of implicit theories, self-affirmation, and intentions to learn, share, and use genetic information. In Study 2, overweight women (n = 197) were randomized to receive genetic or behavioral explanations for weight; participants completed surveys assessing implicit theories, self-affirmation, self-efficacy, motivation, and intentions. Fixed beliefs about weight were infrequently endorsed across studies (10.8-15.2%). In Study 1, participants with stronger fixed theories were less interested in learning and using genetic risk information about medically actionable disease; these associations were weaker among participants higher in self-affirmation. In Study 2, among participants given behavioral explanations for weight, stronger fixed theories about weight were associated with lower motivation and intentions to eat a healthy diet. Among participants given genetic explanations, being higher in self-affirmation was associated with less fixed beliefs. Stronger health-related fixed theories may decrease the likelihood of benefiting from genetic information, but less so for people who self-affirm. Published by Elsevier Ltd.

  7. Tracking Theory Building and Use Trends in Selected LIS Journals: More Research is Needed. A review of: Kim, Sung‐Jin, and Dong Y. Jeong. “An Analysis of the Development and Use of Theory in Library and Information Science Research Articles.” Library & Information Science Research 28.4 (Sept. 2006): 548‐62.

    Directory of Open Access Journals (Sweden)

    Carol Perryman

    2007-09-01

    Full Text Available Objective – The authors measure theory incidents occurring in four LIS journals between 1984 and 2003 in order to examine their number and quality and to analyze them by topic. A third objective, only identified later in the text of the study, was to compare theory development and use between Korean and international journals. Research questions asked include whether LIS has its own theoretical base as a discipline, and what characteristics that theoretical framework has. Design – Bibliometric study. Setting – Journal issues selected from four LIS journals for the time span 1984–2003. Subjects – Two international journals, Journal of the American Society for Information Science and Technology (JASIST) and Library and Information Science Research (LISR), were selected based on their high ranking in the Social Sciences Citation Index (SSCI) impact factors. Two Korean journals, Journal of the Korean Society for Information Management (JKSIM) and Journal of the Korean Society for Library and Information Science (JKSLIS), were selected. Methods – After having determined a definition of theory and identified different levels of theory, the authors set up rules for the identification of theory incidents, defined as “events in which the author contributed to the development or the use of theory in his/her own paper” (550). Content analysis of 1661 research articles was performed to measure theory incidents according to the working definitions. Interrater reliability was ensured by conducting independent coding for “subfield classification, identification of theory incidents, and quality measurement” (555), using a sample of 199 articles (random selection not specified), achieving 94–97% interrater reliability. Incidents, once identified, were evaluated for quality using Dubin’s “efficiency of law” criteria, involving measures of relatedness, directionality, co-variation, rate of change, and “profundity,” defined as the

  8. Power Load Prediction Based on Fractal Theory

    OpenAIRE

    Jian-Kai, Liang; Cattani, Carlo; Wan-Qing, Song

    2015-01-01

    The basic theories of load forecasting on the power system are summarized. Fractal theory, which is a new algorithm applied to load forecasting, is introduced. Based on the fractal dimension and fractal interpolation function theories, the correlation algorithms are applied to the model of short-term load forecasting. According to the process of load forecasting, the steps of every process are designed, including load data preprocessing, similar day selecting, short-term load forecasting, and...

  9. VALUE RELEVANCE OF GROUP FINANCIAL STATEMENTS BASED ON ENTITY VERSUS PARENT COMPANY THEORY: EVIDENCE FROM THE LARGEST THREE EUROPEAN CAPITAL MARKETS

    Directory of Open Access Journals (Sweden)

    Müller Victor-Octavian

    2012-07-01

    Full Text Available Financial statements’ main objective is to give information on the financial position, performance and changes in financial position of the reporting entity, which is useful to investors and other users in making economic decisions. In order to be useful, financial information needs to be relevant to the decision-making process of users in general, and investors in particular. Regarding consolidated financial statements, accounting theory knows four perspectives (theories) on which the preparation of those statements can be based, namely the proprietary theory, the parent company theory, the parent company extension theory and the entity theory (Baxter and Spinney, 1975). Of practical importance are especially the parent company extension perspective and the entity perspective. The IASB and FASB decided (within an ED regarding the Improvement of the Conceptual Framework) that consolidated financial statements should be presented from the perspective of the group entity, and not from the perspective of the parent company. However, this support for the entity theory is, to our knowledge, not backed by empirical findings in the academic literature. Therefore, in our paper we set out to contribute empirical arguments toward an actual answer to the question of the superior market value relevance of one of the two concurrent perspectives (theories). We carried out an empirical association study on the market value relevance of consolidated financial statements based on the entity theory and on the parent company (extension) theory, searching for an answer to the above question. In this sense, we pursued an analysis of the market value relevance of consolidated accounting information (based on the two perspectives) of listed entities between 2003 and 2008 on the three largest European stock exchanges (London, Paris and Frankfurt). The obtained results showed that a “restrained” entity perspective, which would combine

  10. Alice and Bob meet Banach the interface of asymptotic geometric analysis and quantum information theory

    CERN Document Server

    Aubrun, Guillaume

    2017-01-01

    The quest to build a quantum computer is arguably one of the major scientific and technological challenges of the twenty-first century, and quantum information theory (QIT) provides the mathematical framework for that quest. Over the last dozen or so years, it has become clear that quantum information theory is closely linked to geometric functional analysis (Banach space theory, operator spaces, high-dimensional probability), a field also known as asymptotic geometric analysis (AGA). In a nutshell, asymptotic geometric analysis investigates quantitative properties of convex sets, or other geometric structures, and their approximate symmetries as the dimension becomes large. This makes it especially relevant to quantum theory, where systems consisting of just a few particles naturally lead to models whose dimension is in the thousands, or even in the billions. Alice and Bob Meet Banach is aimed at multiple audiences connected through their interest in the interface of QIT and AGA: at quantum information resea...

  11. Surrogate Marker Evaluation from an Information Theory Perspective

    OpenAIRE

    Alonso Abad, Ariel; Molenberghs, Geert

    2006-01-01

    The last 20 years have seen much work in the area of surrogate marker validation, partly devoted to framing the evaluation in a multitrial context, leading to definitions in terms of the quality of trial- and individual-level association between a potential surrogate and a true endpoint (Buyse et al., 2000, Biostatistics 1, 49–67). A drawback is that different settings have led to different measures at the individual level. Here, we use information theory to create a unified framework, lea...

  12. Theory-Based Stakeholder Evaluation – applied. Competing Stakeholder Theories in the Quality Management of Primary Education

    DEFF Research Database (Denmark)

    Hansen, Morten Balle; Heilesen, J. B.

    In the broader context of evaluation design, this paper examines and compares the pros and cons of a theory-based approach to evaluation (TBE) with the Theory-Based Stakeholder Evaluation (TSE) model introduced by Morten Balle Hansen and Evert Vedung (Hansen and Vedung 2010). While most approaches … to TBE construct one unitary theory of the program (Coryn et al. 2011), the TSE model emphasizes the importance of keeping the theories of diverse stakeholders apart. This paper applies the TSE model to an evaluation study conducted by the Danish Evaluation Institute (EVA) of the Danish system of quality … model, as an alternative to traditional program theory evaluation.

  13. Information flow, causality, and the classical theory of tachyons

    International Nuclear Information System (INIS)

    Basano, L.

    1977-01-01

    Causal paradoxes arising in tachyon theory have been systematically solved by using the reinterpretation principle, as a consequence of which cause and effect no longer retain an absolute meaning. However, even in tachyon theory, a cause is always seen to chronologically precede its effect, but this is obtained at the price of allowing cause and effect to be interchanged when required. A recent result has shown that this interchangeability of cause and effect must not be unlimited if serious paradoxes are to be avoided. This partial recovery of the classical concept of causality has been expressed by the conjecture that transcendent tachyons cannot be absorbed by a tachyon detector. In this paper the directional properties of the flow of information between two observers in relative motion, and their consequences for the logical self-consistency of the theory of superluminal particles, are analyzed. It is shown that the above conjecture does not provide a satisfactory solution to the problem because it implies that tachyons of any speed cannot be intercepted by the same detector. (author)

  14. Prosody's Contribution to Fluency: An Examination of the Theory of Automatic Information Processing

    Science.gov (United States)

    Schrauben, Julie E.

    2010-01-01

    LaBerge and Samuels' (1974) theory of automatic information processing in reading offers a model that explains how and where the processing of information occurs and the degree to which processing of information occurs. These processes are dependent upon two criteria: accurate word decoding and automatic word recognition. However, LaBerge and…

  15. Exploratory Study Based on Stakeholder Theory in the Development of Accounting Information Systems in the Catholic Church: A Case Study in the Archdiocese of Semarang, Indonesia

    Directory of Open Access Journals (Sweden)

    Siswanto Fransiscus Asisi Joko

    2017-01-01

    Full Text Available This study aims to find a strategy for the development of a computer-based accounting information system in the church. With an exploratory study based on stakeholder theory, this study identifies the financial information needed for decision-making by the parish priest, the parish treasurer, and the team of economists at the Archdiocese of Semarang (AS). This research was conducted using qualitative and quantitative approaches. The qualitative method was applied through a focus group discussion with the economist team in AS (the users who have major influence on the development of the system). In addition, the quantitative method was applied to the parish treasurers (the users who have great interest in the system development). The results showed that the parish treasurers have high perceived usefulness, perceived ease of use, perceived relevance, and self-efficacy toward the accounting information system (AIS) for the parish. This study demonstrates the benefits of a bottom-up strategy, based on stakeholder analysis, in the development of AIS in the Catholic Church of AS.

  16. Quantum Gravity, Information Theory and the CMB

    Science.gov (United States)

    Kempf, Achim

    2018-04-01

    We review connections between the metric of spacetime and the quantum fluctuations of fields. We start with the finding that the spacetime metric can be expressed entirely in terms of the 2-point correlator of the fluctuations of quantum fields. We then discuss the open question whether the knowledge of only the spectra of the quantum fluctuations of fields also suffices to determine the spacetime metric. This question is of interest because spectra are geometric invariants and their quantization would, therefore, have the benefit of not requiring the modding out of diffeomorphisms. Further, we discuss the fact that spacetime at the Planck scale need not necessarily be either discrete or continuous. Instead, results from information theory show that spacetime may be simultaneously discrete and continuous in the same way that information can. Finally, we review the recent finding that a covariant natural ultraviolet cutoff at the Planck scale implies a signature in the cosmic microwave background (CMB) that may become observable.

  17. Demystifying theory and its use in improvement

    Science.gov (United States)

    Davidoff, Frank; Dixon-Woods, Mary; Leviton, Laura; Michie, Susan

    2015-01-01

    The role and value of theory in improvement work in healthcare has been seriously underrecognised. We join others in proposing that more informed use of theory can strengthen improvement programmes and facilitate the evaluation of their effectiveness. Many professionals, including improvement practitioners, are unfortunately mystified—and alienated—by theory, which discourages them from using it in their work. In an effort to demystify theory we make the point in this paper that, far from being discretionary or superfluous, theory (‘reason-giving’), both informal and formal, is intimately woven into virtually all human endeavour. We explore the special characteristics of grand, mid-range and programme theory; consider the consequences of misusing theory or failing to use it; review the process of developing and applying programme theory; examine some emerging criteria of ‘good’ theory; and emphasise the value, as well as the challenge, of combining informal experience-based theory with formal, publicly developed theory. We conclude that although informal theory is always at work in improvement, practitioners are often not aware of it or do not make it explicit. The germane issue for improvement practitioners, therefore, is not whether they use theory but whether they make explicit the particular theory or theories, informal and formal, they actually use. PMID:25616279

  18. Graph-based linear scaling electronic structure theory

    Energy Technology Data Exchange (ETDEWEB)

    Niklasson, Anders M. N., E-mail: amn@lanl.gov; Negre, Christian F. A.; Cawkwell, Marc J.; Swart, Pieter J.; Germann, Timothy C.; Bock, Nicolas [Theoretical Division, Los Alamos National Laboratory, Los Alamos, New Mexico 87545 (United States); Mniszewski, Susan M.; Mohd-Yusof, Jamal; Wall, Michael E.; Djidjev, Hristo [Computer, Computational, and Statistical Sciences Division, Los Alamos National Laboratory, Los Alamos, New Mexico 87545 (United States); Rubensson, Emanuel H. [Division of Scientific Computing, Department of Information Technology, Uppsala University, Box 337, SE-751 05 Uppsala (Sweden)

    2016-06-21

    We show how graph theory can be combined with quantum theory to calculate the electronic structure of large complex systems. The graph formalism is general and applicable to a broad range of electronic structure methods and materials, including challenging systems such as biomolecules. The methodology combines well-controlled accuracy, low computational cost, and natural low-communication parallelism. This combination addresses substantial shortcomings of linear scaling electronic structure theory, in particular with respect to quantum-based molecular dynamics simulations.
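
A toy illustration of the general idea, under the assumption (not the authors' actual formalism) that the thresholded sparsity pattern of the Hamiltonian is treated as a graph: its connected components define independent sub-problems that can be solved in parallel, which is the source of the low-communication parallelism described above.

```python
from collections import deque

def sparsity_components(h, threshold=1e-6):
    """Partition orbital indices into independent subsystems by treating
    |H_ij| > threshold as a graph edge and finding BFS components."""
    n = len(h)
    adj = {i: [j for j in range(n) if j != i and abs(h[i][j]) > threshold]
           for i in range(n)}
    seen, parts = set(), []
    for start in range(n):
        if start in seen:
            continue
        comp, queue = [], deque([start])
        seen.add(start)
        while queue:
            i = queue.popleft()
            comp.append(i)
            for j in adj[i]:
                if j not in seen:
                    seen.add(j)
                    queue.append(j)
        parts.append(sorted(comp))
    return parts

# Two uncoupled 2x2 blocks: the graph splits the 4-orbital problem in two,
# so each block can be handled by a separate worker with no communication.
H = [[1.0, 0.2, 0.0, 0.0],
     [0.2, 1.5, 0.0, 0.0],
     [0.0, 0.0, 2.0, 0.3],
     [0.0, 0.0, 0.3, 2.5]]
```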

  19. Smoking Beliefs Among Chinese Secondary School Students: A Theory-Based Qualitative Study.

    Science.gov (United States)

    Zhao, Xiang; White, Katherine M; Young, Ross McD; Obst, Patricia L

    2018-02-07

    China has the world's greatest number of smokers, but theory-based smoking interventions are rare. To develop an effective intervention, understanding the determinants of Chinese adolescent smoking is crucial. The Theory of Planned Behavior (TPB) is empirically supported to predict, and to help inform, intervention strategies to change health-related behaviors. Based on the TPB, eliciting shared smoking beliefs among adolescents can inform future intervention designs for this at-risk population. We investigated the beliefs of six focus groups (N = 30) at one senior secondary school in Kunming, Yunnan Province, China. We used semi-structured questions based on the TPB framework, including prompts about behavioral (advantages and disadvantages), normative (important referents), and control (barriers and facilitators) beliefs. Following the Consensual Qualitative Research (CQR) methodology, data were discussed until consensus was reached. Auditing was undertaken by an external researcher. Seven domains (advantages, disadvantages, approvers, disapprovers, facilitators, barriers, and smoker images) were examined. Smoking as a gendered behavior, smoking as influenced by cultural and environmental contexts, smoking as a strategy to cope with stress, and awareness of the harm of smoking are highlighted themes across domains. The data suggest an extended TPB framework as an appropriate approach to adopt when addressing smoking beliefs among the target population. These beliefs can be utilized to inform future school-based interventions and public health campaigns targeting smoking among Chinese adolescents. A modified TPB approach has potential for future smoking interventions among Chinese adolescents. Beliefs elicited in this study form a strong basis for designing a location- and population-specific antismoking programme. © The Author 2017. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights

  20. Entropy, Information Theory, Information Geometry and Bayesian Inference in Data, Signal and Image Processing and Inverse Problems

    Directory of Open Access Journals (Sweden)

    Ali Mohammad-Djafari

    2015-06-01

    Full Text Available The main content of this review article is first to review the main inference tools using Bayes' rule, the maximum entropy principle (MEP), information theory, relative entropy and the Kullback–Leibler (KL) divergence, Fisher information and its corresponding geometries. For each of these tools, the precise context of their use is described. The second part of the paper focuses on the ways these tools have been used in data, signal and image processing and in the inverse problems that arise in different physical sciences and engineering applications. A few example applications are described: entropy in independent component analysis (ICA) and in blind source separation, Fisher information in data model selection, different maximum entropy-based methods in time series spectral estimation and in linear inverse problems and, finally, Bayesian inference for general inverse problems. Some original material concerning approximate Bayesian computation (ABC) and, in particular, the variational Bayesian approximation (VBA) methods is also presented. VBA is used to propose an alternative Bayesian computational tool to the classical Markov chain Monte Carlo (MCMC) methods. We will also see that VBA encompasses joint maximum a posteriori (MAP) estimation, as well as the different expectation-maximization (EM) algorithms, as particular cases.
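
As a concrete instance of the maximum entropy principle (MEP) mentioned in this abstract, consider the classic die problem: among all distributions on the faces {1,…,6} with a prescribed mean, the entropy maximizer is exponential in the face value, p_i ∝ exp(λ·i), with λ fixed by the mean constraint. A minimal sketch (not from the article), solving for λ by bisection:

```python
from math import exp

def maxent_die(target_mean, tol=1e-12):
    """Maximum-entropy distribution on faces 1..6 with a fixed mean.
    The maximizer has the exponential-family form p_i ∝ exp(lam * i);
    lam is found by bisection, since the mean is increasing in lam."""
    def mean(lam):
        w = [exp(lam * i) for i in range(1, 7)]
        z = sum(w)
        return sum(i * wi for i, wi in zip(range(1, 7), w)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [exp(lam * i) for i in range(1, 7)]
    z = sum(w)
    return [wi / z for wi in w]

# Mean 4.5 > 3.5, so lam > 0 and probability mass tilts toward high faces.
p = maxent_die(4.5)
```

For target mean 3.5 the result reduces to the uniform distribution (λ = 0), which is the usual sanity check for this construction.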

  1. Making a difference: incorporating theories of autonomy into models of informed consent.

    Science.gov (United States)

    Delany, C

    2008-09-01

    Obtaining patients' informed consent is an ethical and legal obligation in healthcare practice. Whilst the law provides prescriptive rules and guidelines, ethical theories of autonomy provide moral foundations. Models of practice of consent have been developed in the bioethical literature to assist in understanding and integrating the ethical theory of autonomy and legal obligations into the clinical process of obtaining a patient's informed consent to treatment. To review four models of consent and analyse the way each model incorporates the ethical meaning of autonomy and how, as a consequence, they might change the actual communicative process of obtaining informed consent within clinical contexts. An iceberg framework of consent is used to conceptualise how ethical theories of autonomy are positioned beneath, and underpin, the above-surface and visible clinical communication, including associated legal guidelines and ethical rules. Each model of consent is critically reviewed from the perspective of how it might shape the process of informed consent. All four models would alter the process of obtaining consent. Two models provide structure and guidelines for the content and timing of obtaining patients' consent. The other two models rely on an attitudinal shift in clinicians: they provide ideas for consent by focusing on the underlying values, attitudes and meaning associated with the ethical meaning of autonomy. The paper concludes that models of practice that explicitly incorporate the underlying ethical meaning of autonomy as their basis provide less prescriptive, but more theoretically rich, guidance for healthcare communicative practices.

  2. Action-Based Jurisprudence: Praxeological Legal Theory in Relation to Economic Theory, Ethics, and Legal Practice

    Directory of Open Access Journals (Sweden)

    Konrad Graf

    2011-08-01

    Full Text Available Action-based legal theory is a discrete branch of praxeology and the basis of an emerging school of jurisprudence related to, but distinct from, natural law. Legal theory and economic theory share content that is part of praxeology itself: the action axiom, the a priori of argumentation, universalizable property theory, and counterfactual-deductive methodology. Praxeological property-norm justification is separate from the strictly ethical “ought” question of selecting ends in an action context. Examples of action-based jurisprudence are found in existing “Austro-libertarian” literature. Legal theory and legal practice must remain distinct and work closely together if justice is to be found in real cases. Legal theorizing was shaped in religious ethical contexts, which contributed to confused field boundaries between law and ethics. The carrot and stick influence of rulers on theorists has distorted conventional economics and jurisprudence in particular directions over the course of centuries. An action-based approach is relatively immune to such sources of distortion in its methods and conclusions, but has tended historically to be marginalized from conventional institutions for this same reason.

  3. Anticipated detection of favorable periods for wind energy production by means of information theory

    Science.gov (United States)

    Vogel, Eugenio; Saravia, Gonzalo; Kobe, Sigismund; Schumann, Rolf; Schuster, Rolf

    Managing the electric power produced by different sources requires mixing the different response times they present. Coal burning, for instance, presents large time lags until operational conditions are reached, while hydroelectric generation can react within seconds or a few minutes to reach the desired productivity. Wind energy production (WEP) can be fed instantaneously to the network to save fuels with low thermal inertia (gas burning, for instance), but this source presents sudden variations within a few hours. We report here, for the first time, a method based on information theory to handle WEP. This method has previously been successful in detecting dynamical changes in magnetic transitions and variations of stock markets. An algorithm called wlzip, based on information recognition, is used to recognize the information content of a time series. We make use of publicly available energy data in Germany to simulate real applications. After a calibration process, the system can recognize, directly on the WEP data, the onset of favorable periods of a desired strength. Optimization can yield a few hours of anticipation, which is enough to control the mixture of WEP with other energy sources, thus saving fuels.

  4. Practice Evaluation Strategies Among Social Workers: Why an Evidence-Informed Dual-Process Theory Still Matters.

    Science.gov (United States)

    Davis, Thomas D

    2017-01-01

    Practice evaluation strategies range in style from the formal-analytic tools of single-subject designs, rapid assessment instruments, algorithmic steps in evidence-informed practice, and computer software applications, to the informal-interactive tools of clinical supervision, consultation with colleagues, use of client feedback, and clinical experience. The purpose of this article is to provide practice researchers in social work with an evidence-informed theory that is capable of explaining both how and why social workers use practice evaluation strategies to self-monitor the effectiveness of their interventions in terms of client change. The author delineates the theoretical contours and consequences of what is called dual-process theory. Drawing on evidence-informed advances in the cognitive and social neurosciences, the author identifies among everyday social workers a theoretically stable, informal-interactive tool preference that is a cognitively necessary, sufficient, and stand-alone preference that requires neither the supplementation nor balance of formal-analytic tools. The author's delineation of dual-process theory represents a theoretical contribution in the century-old attempt to understand how and why social workers evaluate their practice the way they do.

  5. A spread willingness computing-based information dissemination model.

    Science.gov (United States)

    Huang, Haojing; Cui, Zhiming; Zhang, Shukui

    2014-01-01

    This paper constructs an information dissemination model for social networks based on spread willingness computing. The model takes into account the impact of node degree and of the dissemination mechanism, combines complex network theory with the dynamics of infectious diseases, and establishes the dynamical evolution equations. The equations characterize the evolutionary relationship between the different types of nodes over time. The spread willingness computation incorporates three factors that affect a user's spreading behavior: the strength of the relationship between nodes, identification with the views expressed, and frequency of contact. Simulation results show that nodes of different degrees exhibit the same trend in the network, and that even when a node's degree is very small, a large-scale spread of information remains likely. The weaker the relationship between nodes, the higher the probability of view selection and the higher the frequency of contact with the information, so that information spreads rapidly and reaches a wide audience. As the dissemination probability and immunity probability change, the speed of information dissemination changes accordingly. The findings match the features of social networks and can help in mastering user behavior and in understanding and analyzing the characteristics of information dissemination in social networks.
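    The abstract does not reproduce the paper's evolution equations, but the epidemic analogy it draws can be sketched as a susceptible/informed/removed simulation on a random graph. Here a single `spread_prob` stands in for the three willingness factors (tie strength, view identification, contact frequency), and all names and parameter values are made up for illustration:

```python
import random

random.seed(42)

def simulate(n_nodes, avg_degree, spread_prob, immune_prob, steps):
    # wire each node to ~avg_degree random neighbours (undirected graph)
    neighbours = {i: set() for i in range(n_nodes)}
    for i in range(n_nodes):
        for j in random.sample(range(n_nodes), avg_degree):
            if i != j:
                neighbours[i].add(j)
                neighbours[j].add(i)
    state = {i: "S" for i in range(n_nodes)}  # S=susceptible, I=informed, R=removed
    state[0] = "I"                            # seed the message at one node
    for _ in range(steps):
        for node in [n for n, s in state.items() if s == "I"]:
            for nb in neighbours[node]:
                if state[nb] == "S" and random.random() < spread_prob:
                    state[nb] = "I"           # neighbour decides to repost
            if random.random() < immune_prob:
                state[node] = "R"             # loses interest, stops spreading
    return sum(1 for s in state.values() if s != "S")  # nodes the message reached

reached = simulate(n_nodes=200, avg_degree=4, spread_prob=0.3,
                   immune_prob=0.1, steps=20)
print(f"{reached} of 200 nodes reached by the message")
```

    Varying `spread_prob` and `immune_prob` in this toy reproduces the qualitative behavior the abstract reports: the dissemination speed and final reach change with both probabilities.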

  6. Research on Disaster Early Warning and Disaster Relief Integrated Service System Based on Block Data Theory

    Science.gov (United States)

    Yang, J.; Zhang, H.; Wang, C.; Tang, D.

    2018-04-01

    With the continuous development of the social economy, the interaction between mankind and nature has become increasingly evident, and disastrous global catastrophes have occurred from time to time, causing huge losses of life and property. All governments recognize the importance of establishing disaster early-warning and release mechanisms, and improving the comprehensive service level of emergency response and disaster relief is likewise an urgent issue. However, disaster early-warning and emergency-relief information is usually generated by different departments, and the diverse data sources, difficult integration, and limited release speed have long been unsolved problems. Block data is the aggregation of various distributed (point) and segmented big data on a specific platform such that they produce a continuous aggregation effect; block data theory is therefore a good solution to the cross-sectoral, cross-platform sharing and integration of disaster information. This paper discusses the integrated service mechanism of disaster information aggregation and disaster relief based on block data theory and introduces a location-based integrated service system for disaster early warning and disaster relief.

  7. Contraction theory based adaptive synchronization of chaotic systems

    International Nuclear Information System (INIS)

    Sharma, B.B.; Kar, I.N.

    2009-01-01

    Contraction theory based stability analysis exploits the incremental behavior of trajectories of a system with respect to each other. Application of contraction theory provides an alternative way for stability analysis of nonlinear systems. This paper considers the design of a control law for synchronization of certain class of chaotic systems based on backstepping technique. The controller is selected so as to make the error dynamics between the two systems contracting. Synchronization problem with and without uncertainty in system parameters is discussed and necessary stability proofs are worked out using contraction theory. Suitable adaptation laws for unknown parameters are proposed based on the contraction principle. The numerical simulations verify the synchronization of the chaotic systems. Also parameter estimates converge to their true values with the proposed adaptation laws.
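    The flavor of the result can be conveyed with a much simpler stand-in than the paper's backstepping and adaptive design: a slave Lorenz system driven by plain linear error feedback, whose error dynamics become contracting once the coupling gain dominates the system's Jacobian. The gain and initial conditions below are illustrative, not from the paper:

```python
def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz system."""
    x, y, z = s
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def euler_step(s, ds, dt):
    return tuple(si + dsi * dt for si, dsi in zip(s, ds))

dt, k = 0.001, 50.0        # step size and coupling gain (illustrative values)
m = (1.0, 1.0, 1.0)        # master state
s = (-5.0, 7.0, 20.0)      # slave starts far from the master

for _ in range(50000):     # integrate 50 time units
    dm = lorenz(m)
    # control u = k*(m - s) added to every slave component; for large
    # enough k the error dynamics e' ~ (J - k I) e are contracting
    ds = tuple(di + k * (mi - si) for di, mi, si in zip(lorenz(s), m, s))
    m, s = euler_step(m, dm, dt), euler_step(s, ds, dt)

err = max(abs(mi - si) for mi, si in zip(m, s))
print(f"final synchronization error: {err:.2e}")
```

    The paper's contribution is precisely to avoid such a brute-force full-state gain: backstepping yields a structured controller, and contraction analysis supplies the stability proof and the adaptation laws for unknown parameters.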

  8. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    Science.gov (United States)

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  9. Unifying ecology and macroevolution with individual-based theory.

    Science.gov (United States)

    Rosindell, James; Harmon, Luke J; Etienne, Rampal S

    2015-05-01

    A contemporary goal in both ecology and evolutionary biology is to develop theory that transcends the boundary between the two disciplines, to understand phenomena that cannot be explained by either field in isolation. This is challenging because macroevolution typically uses lineage-based models, whereas ecology often focuses on individual organisms. Here, we develop a new parsimonious individual-based theory by adding mild selection to the neutral theory of biodiversity. We show that this model generates realistic phylogenies showing a slowdown in diversification and also improves on the ecological predictions of neutral theory by explaining the occurrence of very common species. Moreover, we find the distribution of individual fitness changes over time, with average fitness increasing at a pace that depends positively on community size. Consequently, large communities tend to produce fitter species than smaller communities. These findings have broad implications beyond biodiversity theory, potentially impacting, for example, invasion biology and paleontology. © 2015 The Authors. Ecology Letters published by John Wiley & Sons Ltd and CNRS.

  10. Use of theory in computer-based interventions to reduce alcohol use among adolescents and young adults: a systematic review.

    Science.gov (United States)

    Tebb, Kathleen P; Erenrich, Rebecca K; Jasik, Carolyn Bradner; Berna, Mark S; Lester, James C; Ozer, Elizabeth M

    2016-06-17

    Alcohol use and binge drinking among adolescents and young adults remain frequent causes of preventable injuries, disease, and death, and there has been growing attention to computer-based modes of intervention delivery to prevent/reduce alcohol use. Research suggests that health interventions grounded in established theory are more effective than those with no theoretical basis. The goal of this study was to conduct a literature review of computer-based interventions (CBIs) designed to address alcohol use among adolescents and young adults (aged 12-21 years) and examine the extent to which CBIs use theories of behavior change in their development and evaluations. This study also provides an update on extant CBIs addressing alcohol use among youth and their effectiveness. Between November and December of 2014, a literature review of CBIs aimed at preventing or reducing alcohol in PsychINFO, PubMed, and Google Scholar was conducted. The use of theory in each CBI was examined using a modified version of the classification system developed by Painter et al. (Ann Behav Med 35:358-362, 2008). The search yielded 600 unique articles, of which 500 were excluded because they did not meet the inclusion criteria. The 100 remaining articles were retained for analyses. Many articles were written about a single intervention; thus, the search revealed a total of 42 unique CBIs. In examining the use of theory, 22 CBIs (52 %) explicitly named one or more theoretical frameworks. Primary theories mentioned were social cognitive theory, the transtheoretical model, the theory of planned behavior and reasoned action, and the health belief model. Less than half (48 %) did not use theory, but mentioned either use of a theoretical construct (such as self-efficacy) or an intervention technique (e.g., manipulating social norms). Only a few articles provided detailed information about how the theory was applied to the CBI; the vast majority included little to no information. Given the importance of theory in

  11. Use of theory in computer-based interventions to reduce alcohol use among adolescents and young adults: a systematic review

    Directory of Open Access Journals (Sweden)

    Kathleen P. Tebb

    2016-06-01

    Full Text Available Abstract Background Alcohol use and binge drinking among adolescents and young adults remain frequent causes of preventable injuries, disease, and death, and there has been growing attention to computer-based modes of intervention delivery to prevent/reduce alcohol use. Research suggests that health interventions grounded in established theory are more effective than those with no theoretical basis. The goal of this study was to conduct a literature review of computer-based interventions (CBIs) designed to address alcohol use among adolescents and young adults (aged 12–21 years) and examine the extent to which CBIs use theories of behavior change in their development and evaluations. This study also provides an update on extant CBIs addressing alcohol use among youth and their effectiveness. Methods Between November and December of 2014, a literature review of CBIs aimed at preventing or reducing alcohol in PsychINFO, PubMed, and Google Scholar was conducted. The use of theory in each CBI was examined using a modified version of the classification system developed by Painter et al. (Ann Behav Med 35:358–362, 2008). Results The search yielded 600 unique articles, of which 500 were excluded because they did not meet the inclusion criteria. The 100 remaining articles were retained for analyses. Many articles were written about a single intervention; thus, the search revealed a total of 42 unique CBIs. In examining the use of theory, 22 CBIs (52 %) explicitly named one or more theoretical frameworks. Primary theories mentioned were social cognitive theory, the transtheoretical model, the theory of planned behavior and reasoned action, and the health belief model. Less than half (48 %) did not use theory, but mentioned either use of a theoretical construct (such as self-efficacy) or an intervention technique (e.g., manipulating social norms). Only a few articles provided detailed information about how the theory was applied to the CBI; the vast majority included little

  12. Geospatial Information Service System Based on GeoSOT Grid & Encoding

    Directory of Open Access Journals (Sweden)

    LI Shizhong

    2016-12-01

    Full Text Available With the rapid development of space and earth observation technology, it is important to establish a multi-source, multi-scale and unified cross-platform reference for global data. In practice, the production and maintenance of geospatial data are scattered across different units, and the standard of the data grid varies between departments and systems, producing a disunity of standards across historical periods and organizations. For the geospatial information security library of the national high-resolution earth observation program, there are demands for global display, associated retrieval, template applications and other integrated services for geospatial data. Based on the GeoSOT grid and encoding theory system, a data subdivision and organization solution for managing the geospatial information security library under a globally unified grid encoding has been proposed, and system-level analyses, research and designs have been carried out. The experimental results show that the data organization and management method based on GeoSOT can significantly improve the overall efficiency of the geospatial information security service system.
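    The idea of a single multi-scale global grid code can be illustrated with a generic Z-order (quadtree) key over quantized longitude/latitude. This is a hypothetical stand-in, not the actual GeoSOT subdivision, which follows its own degree-minute-second hierarchy; the coordinates and function name are made up:

```python
def grid_code(lon, lat, level):
    """Interleave the bits of quantized lon/lat into one quadtree key."""
    # quantize each coordinate to `level` bits over its full global range
    qx = int((lon + 180.0) / 360.0 * (1 << level))
    qy = int((lat + 90.0) / 180.0 * (1 << level))
    code = 0
    for bit in reversed(range(level)):
        # one quadrant choice (2 bits) per subdivision level
        code = (code << 2) | (((qy >> bit) & 1) << 1) | ((qx >> bit) & 1)
    return code

# nearby points share a code prefix, so one code serves indexing,
# associated retrieval and multi-scale display at once
a = grid_code(116.40, 39.90, level=16)
b = grid_code(116.41, 39.91, level=16)
print(bin(a), bin(b))
```

    Truncating such a key by two bits per level yields the enclosing coarser cell, which is what makes a single encoding usable across scales.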

  13. Advancing the literature on designing audit and feedback interventions: identifying theory-informed hypotheses.

    Science.gov (United States)

    Colquhoun, Heather L; Carroll, Kelly; Eva, Kevin W; Grimshaw, Jeremy M; Ivers, Noah; Michie, Susan; Sales, Anne; Brehaut, Jamie C

    2017-09-29

    Audit and feedback (A&F) is a common strategy for helping health providers to implement evidence into practice. Despite being extensively studied, health care A&F interventions remain variably effective, with overall effect sizes that have not improved since 2003. Contributing to this stagnation is the fact that most health care A&F interventions have largely been designed without being informed by theoretical understanding from the behavioral and social sciences. To determine if the trend can be improved, the objective of this study was to develop a list of testable, theory-informed hypotheses about how to design more effective A&F interventions. Using purposive sampling, semi-structured 60-90-min telephone interviews were conducted with experts in theories related to A&F from a range of fields (e.g., cognitive, health and organizational psychology, medical decision-making, economics). Guided by detailed descriptions of A&F interventions from the health care literature, interviewees described how they would approach the problem of designing improved A&F interventions. Specific, theory-informed hypotheses about the conditions for effective design and delivery of A&F interventions were elicited from the interviews. The resulting hypotheses were assigned by three coders working independently into themes, and categories of themes, in an iterative process. We conducted 28 interviews and identified 313 theory-informed hypotheses, which were placed into 30 themes. The 30 themes included hypotheses related to the following five categories: A&F recipient (seven themes), content of the A&F (ten themes), process of delivery of the A&F (six themes), behavior that was the focus of the A&F (three themes), and other (four themes). We have identified a set of testable, theory-informed hypotheses from a broad range of behavioral and social science that suggest conditions for more effective A&F interventions. This work demonstrates the breadth of perspectives about A&F from non

  14. Reflections on the Right to Information Based on Citizenship Theories

    Directory of Open Access Journals (Sweden)

    Vitor Gentilli

    2007-06-01

    Full Text Available In modern societies, structured as representative democracies, all rights are to some extent related to the right to information: the enlargement of participation in citizenship presupposes an enlargement of the right to information as a premise. It is a right which encourages the exercising of citizenship and affords citizens access to, and criticism of, the instruments necessary for the full exercise of the set of citizenship rights. The right to information can have characteristics of emancipation or of tutelage. An emancipating right is a right to freedom, a right whose basic presupposition is freedom of choice. Accordingly, the maxim which could sum up the ethical issue of the right to information would be: give maximum publicity to everything which refers to the public sphere and keep secret that which refers to the private sphere.

  15. Informed consent in neurosurgery--translating ethical theory into action.

    Science.gov (United States)

    Schmitz, Dagmar; Reinacher, Peter C

    2006-09-01

    Although a main principle of medical ethics and law since the 1970s, standards of informed consent are regarded with great scepticism by many clinicians. By reviewing the reactions to and adoption of this principle of medical ethics in neurosurgery, the characteristic conflicts that emerge between theory and everyday clinical experience are emphasised and a modified conception of informed consent is proposed. The adoption and debate of informed consent in neurosurgery took place in two steps. Firstly, respect for patient autonomy was included in the ethical codes of the professional organisations. Secondly, the legal demands of the principle were questioned by clinicians. Informed consent is mainly interpreted in terms of freedom from interference and absolute autonomy. It lacks a constructive notion of physician-patient interaction in its effort to promote the best interest of the patient, which, however, potentially emerges from a reconsideration of the principle of beneficence. To avoid insufficient legal interpretations, informed consent should be understood in terms of autonomy and beneficence. A continuous interaction between the patient and the given physician is considered an essential prerequisite for the realisation of the standards of informed consent.

  16. Informed consent in neurosurgery—translating ethical theory into action

    Science.gov (United States)

    Schmitz, Dagmar; Reinacher, Peter C

    2006-01-01

    Objective Although a main principle of medical ethics and law since the 1970s, standards of informed consent are regarded with great scepticism by many clinicians. Methods By reviewing the reactions to and adoption of this principle of medical ethics in neurosurgery, the characteristic conflicts that emerge between theory and everyday clinical experience are emphasised and a modified conception of informed consent is proposed. Results The adoption and debate of informed consent in neurosurgery took place in two steps. Firstly, respect for patient autonomy was included in the ethical codes of the professional organisations. Secondly, the legal demands of the principle were questioned by clinicians. Informed consent is mainly interpreted in terms of freedom from interference and absolute autonomy. It lacks a constructive notion of physician–patient interaction in its effort to promote the best interest of the patient, which, however, potentially emerges from a reconsideration of the principle of beneficence. Conclusion To avoid insufficient legal interpretations, informed consent should be understood in terms of autonomy and beneficence. A continuous interaction between the patient and the given physician is considered an essential prerequisite for the realisation of the standards of informed consent. PMID:16943326

  17. Critical Theory-Based Approaches in Geography Teaching Departments in Turkey

    Science.gov (United States)

    Bilgili, Münür

    2018-01-01

    The aim of this study is to understand the relationships between critical theory-based approaches and its implementations in geography teaching departments in Turkey. Critical theory dates back to 1930s and has developed over time aiming to deal with institutions, culture and society through critical lens. Currently, critical theory-based research…

  18. Ownership as an Issue in Data and Information Sharing: a philosophically based review

    Directory of Open Access Journals (Sweden)

    Dennis Hart

    2002-11-01

    Full Text Available It has long been an aim of information management and information systems development to enable more effective and efficient data and information sharing within organisations. A commonplace assertion has been that data and information belong, or should belong, to the organisation as a whole as opposed to any individual or stakeholder within it. Nevertheless, despite the potential benefits of data and information sharing within organisations, efforts to achieve it have typically run into more difficulty than expected and have frequently been less successful than the technological capabilities would, at least prima facie, allow. This paper is based on the proposition that perceptions of ownership can have an important influence on data and information sharing behaviour, and explores philosophical theories of ownership and property with the aim of better understanding the origins of such behaviour. It is further proposed that what are here called “implicit” theories of information ownership on the part of different individuals or parties within an organisation can lead to varying perceptions as to who is the legitimate owner of particular data or information, and that this view is illuminating of the difficulties that have often been experienced in trying to achieve effective organisational data and information sharing.

  19. A theory-informed approach to mental health care capacity building for pharmacists.

    Science.gov (United States)

    Murphy, Andrea L; Gardner, David M; Kutcher, Stan P; Martin-Misener, Ruth

    2014-01-01

    Pharmacists are knowledgeable, accessible health care professionals who can provide services that improve outcomes in mental health care. Various challenges and opportunities can exist in pharmacy practice to hinder or support pharmacists' efforts. We used a theory-informed approach to development and implementation of a capacity-building program to enhance pharmacists' roles in mental health care. Theories and frameworks including the Consolidated Framework for Implementation Research, the Theoretical Domains Framework, and the Behaviour Change Wheel were used to inform the conceptualization, development, and implementation of a capacity-building program to enhance pharmacists' roles in mental health care. The More Than Meds program was developed and implemented through an iterative process. The main program components included: an education and training day; use of a train-the-trainer approach from partnerships with pharmacists and people with lived experience of mental illness; development of a community of practice through email communications, a website, and a newsletter; and use of educational outreach delivered by pharmacists. Theories and frameworks used throughout the program's development and implementation facilitated a means to conceptualize the component parts of the program as well as its overall presence as a whole from inception through evolution in implementation. Using theoretical foundations for the program enabled critical consideration and understanding of issues related to trialability and adaptability of the program. Theory was essential to the underlying development and implementation of a capacity-building program for enhancing services by pharmacists for people with lived experience of mental illness. Lessons learned from the development and implementation of this program are informing current research and evolution of the program.

  20. Quantum Information, computation and cryptography. An introductory survey of theory, technology and experiments

    International Nuclear Information System (INIS)

    Benatti, Fabio; Fannes, Mark; Floreanini, Roberto; Petritis, Dimitri

    2010-01-01

    This multi-authored textbook addresses graduate students with a background in physics, mathematics or computer science. No research experience is necessary. Consequently, rather than comprehensively reviewing the vast body of knowledge and literature gathered in the past twenty years, this book concentrates on a number of carefully selected aspects of quantum information theory and technology. Given the highly interdisciplinary nature of the subject, the multi-authored approach brings together different points of view from various renowned experts, providing a coherent picture of the subject matter. The book consists of ten chapters and includes examples, problems, and exercises. The first five present the mathematical tools required for a full comprehension of various aspects of quantum mechanics, classical information, and coding theory. Chapter 6 deals with the manipulation and transmission of information in the quantum realm. Chapters 7 and 8 discuss experimental implementations of quantum information ideas using photons and atoms. Finally, chapters 9 and 10 address ground-breaking applications in cryptography and computation. (orig.)

  1. A Performance-Based Instructional Theory

    Science.gov (United States)

    Lawson, Tom E.

    1974-01-01

    The rationale for a performance-based instructional theory has arisen from significant advances during the past several years in instructional psychology. Four major areas of concern are: analysis of subject-matter content in terms of performance competencies, diagnosis of pre-instructional behavior, formulation of an instructional…

  2. Impact of the Cybernetic Law of Requisite Variety on a Theory of Information Science.

    Science.gov (United States)

    Heilprin, Laurence B.

    The search for an integrated, comprehensive theory of information science (IS) has so far been unsuccessful. The appearance of a theory has been retarded by one central constraint: the large number of disciplines concerned with human communication. Cross-disciplinary interdependence occurs in two ways: theoretical relation of IS phenomena to a given…

  3. Model for Electromagnetic Information Leakage

    OpenAIRE

    Mao Jian; Li Yongmei; Zhang Jiemin; Liu Jinming

    2013-01-01

    Electromagnetic leakage occurs in operating information equipment and can lead to information leakage. In order to discover the nature of information in electromagnetic leakage, this paper combined electromagnetic theory with information theory as an innovative research method. It outlines a systematic model of electromagnetic information leakage, which theoretically describes the process of information leakage, intercept and reproduction based on electromagnetic radiation, and ana...

  4. Automated image segmentation using information theory

    International Nuclear Information System (INIS)

    Hibbard, L.S.

    2001-01-01

    Full text: Our development of automated contouring of CT images for RT planning is based on maximum a posteriori (MAP) analyses of region textures, edges, and prior shapes, and assumes stationary Gaussian distributions for voxel textures and contour shapes. Since models may not accurately represent image data, it would be advantageous to compute inferences without relying on models. The relative entropy (RE) from information theory can generate inferences based solely on the similarity of probability distributions. The entropy of the distribution of a random variable X is defined as -Σ_x p(x) log_2 p(x) over all values x which X may assume. The RE (Kullback-Leibler divergence) of two distributions p(X), q(X) over X is Σ_x p(x) log_2 [p(x)/q(x)]. The RE is a kind of 'distance' between p and q, equaling zero when p=q and increasing as p and q become more different. Minimum-error MAP and likelihood ratio decision rules have RE equivalents: minimum-error decisions obtain with functions of the differences between REs of compared distributions. One applied result is that the contour ideally separating two regions is the one that maximizes the relative entropy of the two regions' intensities. A program was developed that automatically contours the outlines of patients in stereotactic headframes, a situation that most often requires manual drawing. The relative entropy of intensities inside the contour (patient) versus outside (background) was maximized by conjugate gradient descent over the space of parameters of a deformable contour. The figure shows the computed segmentation of a patient from headframe backgrounds. This program is particularly useful for preparing images for multimodal image fusion. Relative entropy and allied measures of distribution similarity provide automated contouring criteria that do not depend on statistical models of image data. This approach should have wide utility in medical image segmentation applications. Copyright (2001) Australasian College of Physical Scientists and
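The relative-entropy criterion described above can be sketched in a few lines. The snippet below is a hedged illustration, not the paper's implementation: it computes the Kullback-Leibler divergence in bits between intensity histograms inside and outside a candidate contour, using synthetic "patient" and "background" intensities as stand-ins for real CT data.

```python
import numpy as np

def relative_entropy(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p||q) in bits between two
    discrete distributions given as (unnormalized) histograms."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    mask = p > 0  # terms with p(x)=0 contribute nothing
    return float(np.sum(p[mask] * np.log2((p[mask] + eps) / (q[mask] + eps))))

# Toy example: intensity histograms inside vs. outside a candidate contour.
rng = np.random.default_rng(0)
inside = rng.normal(120, 10, 5000)   # brighter "patient" voxels
outside = rng.normal(40, 10, 5000)   # darker background voxels
edges = np.linspace(0, 255, 64)
p, _ = np.histogram(inside, bins=edges)
q, _ = np.histogram(outside, bins=edges)

print(relative_entropy(p, p))      # → 0.0: identical distributions
print(relative_entropy(p, q) > 0)  # → True: distinct regions give positive RE
```

A contour optimizer would move the contour parameters (e.g. by conjugate gradient descent, as in the abstract) to maximize this quantity between the two regions.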

  5. Towards a Structurational Theory of Information Systems: a substantive case analysis

    DEFF Research Database (Denmark)

    Rose, Jeremy; Hackney, R. H

    2003-01-01

    This paper employs the analysis of an interpretive case study within a Regional Train Operating Company (RTOC) to arrive at theoretical understandings of Information Systems (IS). Giddens’ ‘structuration theory’ is developed which offers an account of structure and agency; social practices develo...

  6. Informed Grounded Theory

    Science.gov (United States)

    Thornberg, Robert

    2012-01-01

    There is a widespread idea that in grounded theory (GT) research, the researcher has to delay the literature review until the end of the analysis to avoid contamination--a dictum that might turn educational researchers away from GT. Nevertheless, in this article the author (a) problematizes the dictum of delaying a literature review in classic…

  7. BOOK REVIEW: Theory of Neural Information Processing Systems

    Science.gov (United States)

    Galla, Tobias

    2006-04-01

    It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10^11 neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kühn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, still the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard to find in the

  8. Using findings in multimedia learning to inform technology-based behavioral health interventions.

    Science.gov (United States)

    Aronson, Ian David; Marsch, Lisa A; Acosta, Michelle C

    2013-09-01

    Clinicians and researchers are increasingly using technology-based behavioral health interventions to improve intervention effectiveness and to reach underserved populations. However, these interventions are rarely informed by evidence-based findings of how technology can be optimized to promote acquisition of key skills and information. At the same time, experts in multimedia learning generally do not apply their findings to health education or conduct research in clinical contexts. This paper presents an overview of some key aspects of multimedia learning research that may allow those developing health interventions to apply informational technology with the same rigor as behavioral science content. We synthesized empirical multimedia learning literature from 1992 to 2011. We identified key findings and suggested a framework for integrating technology with educational and behavioral science theory. A scientific, evidence-driven approach to developing technology-based interventions can yield greater effectiveness, improved fidelity, increased outcomes, and better client service.

  9. The theory of reasoned action and intention to seek cancer information.

    Science.gov (United States)

    Ross, Levi; Kohler, Connie L; Grimley, Diane M; Anderson-Lewis, Charkarra

    2007-01-01

    To evaluate the applicability of the theory of reasoned action to explain men's intentions to seek prostate cancer information. Three hundred randomly selected African American men participated in telephone interviews. Correlational and regression analyses were conducted to examine relationships among measures. All relationships were significant in regression analyses. Attitudes and subjective norm were significantly related to intentions. Indirect measures of beliefs derived from elicitation research were associated with direct measures of attitude and subjective norms. The data are sufficiently clear to support the applicability of the theory for this behavioral domain with African American men and suggest several important areas for future research.

  10. Analytical Implications of Using Practice Theory in Workplace Information Literacy Research

    Science.gov (United States)

    Moring, Camilla; Lloyd, Annemaree

    2013-01-01

    Introduction: This paper considers practice theory and the analytical implications of using this theoretical approach in information literacy research. More precisely the aim of the paper is to discuss the translation of practice theoretical assumptions into strategies that frame the analytical focus and interest when researching workplace…

  11. Language Learning Strategies and English Proficiency: Interpretations from Information-Processing Theory

    Science.gov (United States)

    Rao, Zhenhui

    2016-01-01

    The research reported here investigated the relationship between students' use of language learning strategies and their English proficiency, and then interpreted the data from two models in information-processing theory. Results showed that the students' English proficiency significantly affected their use of learning strategies, with high-level…

  12. The contribution of process tracing to theory-based evaluations of complex aid instruments

    DEFF Research Database (Denmark)

    Beach, Derek; Schmitt, Johannes

    2015-01-01

    studies in demanding settings. For the specific task of evaluating the governance effectiveness of budget support interventions, we developed a more fine-grained causal mechanism for a subset of the comprehensive program theory of budget support. Moreover, based on the informal use of Bayesian logic, we...... remedy some of the problems at hand in much case-study research and increase the inferential leverage in complex within-case evaluation studies....

  13. Epistemic Information in Stratified M-Spaces

    Directory of Open Access Journals (Sweden)

    Mark Burgin

    2011-12-01

    Full Text Available Information is usually related to knowledge. However, the recent development of information theory demonstrated that information is a much broader concept, being actually present in and virtually related to everything. As a result, many unknown types and kinds of information have been discovered. Nevertheless, information that acts on knowledge, bringing new and updating existing knowledge, is of primary importance to people. It is called epistemic information, which is studied in this paper based on the general theory of information and further developing its mathematical stratum. As a synthetic approach, which reveals the essence of information, organizing and encompassing all main directions in information theory, the general theory of information provides efficient means for such a study. Different types of information dynamics representation use tools of mathematical disciplines such as the theory of categories, functional analysis, mathematical logic and algebra. Here we employ algebraic structures for exploration of information and knowledge dynamics. In Introduction (Section 1, we discuss previous studies of epistemic information. Section 2 gives a compressed description of the parametric phenomenological definition of information in the general theory of information. In Section 3, anthropic information, which is received, exchanged, processed and used by people is singled out and studied based on the Componential Triune Brain model. One of the basic forms of anthropic information called epistemic information, which is related to knowledge, is analyzed in Section 4. Mathematical models of epistemic information are studied in Section 5. In Conclusion, some open problems related to epistemic information are given.

  14. The Scope of Usage-based Theory

    Directory of Open Access Journals (Sweden)

    Paul Ibbotson

    2013-05-01

    Full Text Available Usage-based approaches typically draw on a relatively small set of cognitive processes, such as categorization, analogy and chunking to explain language structure and function. The goal of this paper is to first review the extent to which the ‘cognitive commitment’ of usage-based theory has had success in explaining empirical findings across domains, including language acquisition, processing and typology. We then look at the overall strengths and weaknesses of usage-based theory and highlight where there are significant debates. Finally, we draw special attention to a set of culturally generated structural patterns that seem to lie beyond the explanation of core usage-based cognitive processes. In this context we draw a distinction between cognition permitting language structure versus cognition entailing language structure. As well as addressing the need for greater clarity on the mechanisms of generalizations and the fundamental units of grammar, we suggest that integrating culturally generated structures within existing cognitive models of use will generate tighter predictions about how language works.

  15. A Practice-Based Theory of Healing Through Therapeutic Touch: Advancing Holistic Nursing Practice.

    Science.gov (United States)

    Hanley, Mary Anne; Coppa, Denise; Shields, Deborah

    2017-08-01

    For nearly 50 years, Therapeutic Touch (TT) has contributed to advancing holistic nursing practice and has been recognized as a uniquely human approach to healing. This narrative explores the development of a practice-based theory of healing through TT, which occurred between 2010 and 2016. Through the in-depth self-inquiry of participatory reflective dialogue in concert with constant narrative analysis, TT practitioners revealed the meaning of healing within the context of their TT practice. As the community of TT experts participated in an iterative process of small group and community dialogues with analysis and synthesis of emerging themes, the assumptions and concepts central to a theory of healing emerged, were clarified and verified. Exemplars of practice illustrate the concepts. A model of the theory of healing illuminates the movement and relationship among concepts and evolved over time. Feedback from nursing and inter-professional practitioners indicate that the theory of healing, while situated within the context of TT, may be useful in advancing holistic nursing practice, informing healing and caring approaches, stimulating research and education, and contributing to future transformations in health care.

  16. Optimizing Sparse Representations of Kinetic Distributions via Information Theory

    Science.gov (United States)

    2017-07-31

    Robert Martin and Daniel Eckhardt, Air Force Research Laboratory (AFMC), AFRL/RQRS, 1 Ara Drive, Edwards AFB, CA 93524-7013. The work addresses sparse representations of kinetic distributions that preserve quantities such as momentum, energy, and physical entropy.

  17. A statistical mechanical interpretation of algorithmic information theory: Total statistical mechanical interpretation based on physical argument

    International Nuclear Information System (INIS)

    Tadaki, Kohtaro

    2010-01-01

    The statistical mechanical interpretation of algorithmic information theory (AIT, for short) was introduced and developed by our former works [K. Tadaki, Local Proceedings of CiE 2008, pp. 425-434, 2008] and [K. Tadaki, Proceedings of LFCS'09, Springer's LNCS, vol. 5407, pp. 422-440, 2009], where we introduced the notion of thermodynamic quantities, such as the partition function Z(T), free energy F(T), energy E(T), statistical mechanical entropy S(T), and specific heat C(T), into AIT. We then discovered that, in this interpretation, the temperature T equals the partial randomness of the values of all these thermodynamic quantities, where the notion of partial randomness is a stronger representation of the compression rate by means of program-size complexity. Furthermore, we showed that this situation holds for the temperature T itself, which is one of the most typical thermodynamic quantities. Namely, we showed that, for each of the thermodynamic quantities Z(T), F(T), E(T), and S(T) above, the computability of its value at temperature T gives a sufficient condition for T ∈ (0,1) to satisfy the condition that the partial randomness of T equals T. In this paper, based on a physical argument on the same level of mathematical strictness as normal statistical mechanics in physics, we develop a total statistical mechanical interpretation of AIT which actualizes a perfect correspondence to normal statistical mechanics. We do this by identifying a microcanonical ensemble in the framework of AIT. As a result, we clarify the statistical mechanical meaning of the thermodynamic quantities of AIT.
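As a rough, hedged illustration of the quantities above: Tadaki's partition function at temperature T sums 2^(-|p|/T) over the halting programs p of an optimal prefix-free machine. The snippet below approximates Z(T) over a small, hypothetical prefix-free set of program lengths (illustrative only, not the actual machine), showing that Z(T) stays below 1 and shrinks as T decreases.

```python
def partition_function(program_lengths, T):
    """Finite truncation of Z(T) = sum over programs p of 2**(-|p|/T),
    here over a toy prefix-free set described by its code lengths."""
    return sum(2 ** (-length / T) for length in program_lengths)

# Lengths of a small prefix-free code, e.g. {0, 10, 110, 1110, ...}
lengths = [1, 2, 3, 4, 5, 6, 7, 8]

z1 = partition_function(lengths, 1.0)      # at T=1 this mirrors Chaitin's Omega
z_half = partition_function(lengths, 0.5)  # lower temperature suppresses long programs
print(z1 < 1.0, z_half < z1)  # → True True
```

By the Kraft inequality the full sum at T=1 is at most 1, which is why the T=1 value plays the role of a halting probability.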

  18. Computer-based teaching module design: principles derived from learning theories.

    Science.gov (United States)

    Lau, K H Vincent

    2014-03-01

    The computer-based teaching module (CBTM), which has recently gained prominence in medical education, is a teaching format in which a multimedia program serves as a single source for knowledge acquisition rather than playing an adjunctive role as it does in computer-assisted learning (CAL). Despite empirical validation in the past decade, there is limited research into the optimisation of CBTM design. This review aims to summarise research in classic and modern multimedia-specific learning theories applied to computer learning, and to collapse the findings into a set of design principles to guide the development of CBTMs. Scopus was searched for: (i) studies of classic cognitivism, constructivism and behaviourism theories (search terms: 'cognitive theory' OR 'constructivism theory' OR 'behaviourism theory' AND 'e-learning' OR 'web-based learning') and their sub-theories applied to computer learning, and (ii) recent studies of modern learning theories applied to computer learning (search terms: 'learning theory' AND 'e-learning' OR 'web-based learning') for articles published between 1990 and 2012. The first search identified 29 studies, dominated in topic by the cognitive load, elaboration and scaffolding theories. The second search identified 139 studies, with diverse topics in connectivism, discovery and technical scaffolding. Based on their relative representation in the literature, the applications of these theories were collapsed into a list of CBTM design principles. Ten principles were identified and categorised into three levels of design: the global level (managing objectives, framing, minimising technical load); the rhetoric level (optimising modality, making modality explicit, scaffolding, elaboration, spaced repeating), and the detail level (managing text, managing devices). This review examined the literature in the application of learning theories to CAL to develop a set of principles that guide CBTM design. 
Further research will enable educators to

  19. Applied information science, engineering and technology selected topics from the field of production information engineering and IT for manufacturing : theory and practice

    CERN Document Server

    Tóth, Tibor

    2014-01-01

    The objective of the book is to give a selection from the papers, which summarize several important results obtained within the framework of the József Hatvany Doctoral School operating at the University of Miskolc, Hungary. In accordance with the three main research areas of the Doctoral School established for Information Science, Engineering and Technology, the papers can be classified into three groups. They are as follows: (1) Applied Computational Science; (2) Production Information Engineering (IT for Manufacturing included); (3) Material Stream Systems and IT for Logistics. As regards the first area, some papers deal with special issues of algorithms theory and its applications, with computing algorithms for engineering tasks, as well as certain issues of data base systems and knowledge intensive systems. Related to the second research area, the focus is on Production Information Engineering with special regard to discrete production processes. In the second research area the papers show some new inte...

  20. Continuing Bonds in Bereavement: An Attachment Theory Based Perspective

    Science.gov (United States)

    Field, Nigel P.; Gao, Beryl; Paderna, Lisa

    2005-01-01

    An attachment theory based perspective on the continuing bond to the deceased (CB) is proposed. The value of attachment theory in specifying the normative course of CB expression and in identifying adaptive versus maladaptive variants of CB expression based on their deviation from this normative course is outlined. The role of individual…

  1. A Spread Willingness Computing-Based Information Dissemination Model

    Science.gov (United States)

    Cui, Zhiming; Zhang, Shukui

    2014-01-01

    This paper constructs a spread-willingness-computing-based information dissemination model for social networks. The model takes into account the impact of node degree and the dissemination mechanism, combining complex network theory and the dynamics of infectious diseases, and further establishes the dynamical evolution equations. The equations characterize the evolutionary relationship between different types of nodes over time. The spread willingness computing comprises three factors that affect a user's spreading behavior: the strength of the relationship between nodes, view identity, and frequency of contact. Simulation results show that nodes of different degrees exhibit the same trend in the network, and even if the degree of a node is very small, there is still a likelihood of large-scale information dissemination. The weaker the relationship between nodes, the higher the probability of view selection and the higher the frequency of contact with information, so that information spreads rapidly and leads to a wide range of dissemination. As the dissemination probability and immune probability change, the speed of information dissemination changes accordingly. The findings match social networking features and can help to master the behavior of users and to understand and analyze the characteristics of information dissemination in social networks. PMID:25110738
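The infectious-disease-style dynamics described above can be caricatured with a simple simulation. The sketch below is a hedged, SIR-style approximation on a random graph; the node count, degree, and probabilities are illustrative stand-ins, and the paper's spread-willingness factors are collapsed into a single transmission probability `p_spread`.

```python
import random

def simulate_spread(n=200, k=6, p_spread=0.3, p_immune=0.1, steps=30, seed=1):
    """SIR-style information dissemination on a sparse random graph.
    States: 'S' unaware, 'I' spreading, 'R' immune (stopped spreading)."""
    rng = random.Random(seed)
    # Erdos-Renyi-like neighbor lists with average degree ~k
    neighbors = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < k / n:
                neighbors[i].append(j)
                neighbors[j].append(i)
    state = ['S'] * n
    state[0] = 'I'  # a single seed node starts the cascade
    reached = []
    for _ in range(steps):
        new_state = state[:]
        for i in range(n):
            if state[i] != 'I':
                continue
            for j in neighbors[i]:
                if state[j] == 'S' and rng.random() < p_spread:
                    new_state[j] = 'I'  # neighbor accepts and re-spreads
            if rng.random() < p_immune:
                new_state[i] = 'R'  # node loses interest in spreading
        state = new_state
        reached.append(sum(s != 'S' for s in state))  # nodes reached so far
    return reached

reach = simulate_spread()
print(reach[-1] >= reach[0])  # → True: the reached set never shrinks
```

Sweeping `p_spread` and `p_immune` in such a sketch reproduces the qualitative observation that dissemination speed tracks the dissemination and immune probabilities.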

  2. A Spread Willingness Computing-Based Information Dissemination Model

    Directory of Open Access Journals (Sweden)

    Haojing Huang

    2014-01-01

    Full Text Available This paper constructs a spread-willingness-computing-based information dissemination model for social networks. The model takes into account the impact of node degree and the dissemination mechanism, combining complex network theory and the dynamics of infectious diseases, and further establishes the dynamical evolution equations. The equations characterize the evolutionary relationship between different types of nodes over time. The spread willingness computing comprises three factors that affect a user's spreading behavior: the strength of the relationship between nodes, view identity, and frequency of contact. Simulation results show that nodes of different degrees exhibit the same trend in the network, and even if the degree of a node is very small, there is still a likelihood of large-scale information dissemination. The weaker the relationship between nodes, the higher the probability of view selection and the higher the frequency of contact with information, so that information spreads rapidly and leads to a wide range of dissemination. As the dissemination probability and immune probability change, the speed of information dissemination changes accordingly. The findings match social networking features and can help to master the behavior of users and to understand and analyze the characteristics of information dissemination in social networks.

  3. Efficiency and credit ratings: a permutation-information-theory analysis

    International Nuclear Information System (INIS)

    Bariviera, Aurelio Fernandez; Martinez, Lisana B; Zunino, Luciano; Belén Guercio, M; Rosso, Osvaldo A

    2013-01-01

    The role of credit rating agencies has been under severe scrutiny after the subprime crisis. In this paper we explore the relationship between credit ratings and informational efficiency of a sample of thirty-nine corporate bonds of US oil and energy companies from April 2008 to November 2012. For this purpose we use a powerful statistical tool, relatively new in the financial literature: the complexity–entropy causality plane. This representation space allows us to graphically classify the different bonds according to their degree of informational efficiency. We find that this classification agrees with the credit ratings assigned by Moody’s. In particular, we detect the formation of two clusters, which correspond to the global categories of investment and speculative grades. Regarding the latter cluster, two subgroups reflect distinct levels of efficiency. Additionally, we also find an intriguing absence of correlation between informational efficiency and firm characteristics. This allows us to conclude that the proposed permutation-information-theory approach provides an alternative practical way to justify bond classification. (paper)
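The complexity–entropy causality plane is built on the Bandt–Pompe symbolization of a time series. As a hedged sketch of its entropy axis only (the statistical-complexity axis is omitted), the snippet below computes normalized permutation entropy; the test series and embedding dimension are illustrative, not the paper's bond-return data.

```python
import random
from collections import Counter
from math import log, factorial

def permutation_entropy(series, d=3):
    """Normalized Bandt-Pompe permutation entropy: 0 for a fully
    ordered series, near 1 for white noise (embedding dimension d)."""
    # Map each length-d window to its ordinal pattern (ranking of values).
    patterns = Counter(
        tuple(sorted(range(d), key=lambda k: series[i + k]))
        for i in range(len(series) - d + 1)
    )
    total = sum(patterns.values())
    h = sum(-c / total * log(c / total) for c in patterns.values())
    return h / log(factorial(d))  # normalize by the maximum entropy log(d!)

random.seed(0)
noise = [random.random() for _ in range(5000)]  # informationally efficient proxy
trend = list(range(5000))                       # perfectly predictable series

print(permutation_entropy(trend))         # → 0.0: one ordinal pattern only
print(permutation_entropy(noise) > 0.95)  # → True: noise is near-maximal
```

In this picture, a bond whose returns behave like the `noise` series sits near the efficient corner of the plane, while predictable dynamics pull it away.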

  4. From motivation and cognition theories to everyday applications and back again: the case of product-integrated information and feedback

    Energy Technology Data Exchange (ETDEWEB)

    McCalley, L.T. [Technical Univ. Eindhoven (Netherlands)]

    2003-07-01

    Various moderators of the relationship of goal setting and feedback are explored in four examples of applied empirical research. A selection of theoretical frameworks borrowed from varied disciplines guided the studies and are discussed in terms of their value to the particular questions investigated. The experiments all entailed the use of product-integrated energy feedback and illustrate a progressive understanding of how goals, feedback and other information provided to the user can generate or support better energy conservation. Experiment 1 exemplifies the successful use of combining goal setting and feedback and provides a basic understanding of the interaction from the perspectives of goal setting theory and Feedback Intervention Theory (FIT). Experiment 2 compares FIT to another, fundamentally different, cognitive framework, and the minimal justification principle. The study gives insight into how goals and feedback work through attention focus and the goal hierarchy to guide behaviour, the role of attitude in this process, and offers evidence that FIT better accounts for task specific conservation behaviour. Experiment 3 addresses the role of goals and information in strategy planning through the perspective of goal setting theory. Results of this study suggest the need for more development of the basic theory and illustrate the strong motivational properties of having a goal. Experiment 4 investigates a more fundamental process, anchoring bias, taken from decision theory and the theory of rational choice. This experiment was based again on FIT and provided further evidence of behavioural control through the focus of attention at a particular level of the goal hierarchy.

  5. From motivation and cognition theories to everyday applications and back again. The case of product-integrated information and feedback

    Energy Technology Data Exchange (ETDEWEB)

    McCalley, L.T. [Technical University Eindhoven/TUE, Eindhoven (Netherlands)]

    2003-07-01

    Various moderators of the relationship of goal setting and feedback are explored in four examples of applied empirical research. A selection of theoretical frameworks borrowed from varied disciplines guided the studies and are discussed in terms of their value to the particular questions investigated. The experiments all entailed the use of product-integrated energy feedback and illustrate a progressive understanding of how goals, feedback and other information provided to the user can generate or support better energy conservation. Experiment 1 exemplifies the successful use of combining goal setting and feedback and provides a basic understanding of the interaction from the perspectives of goal setting theory and Feedback Intervention Theory (FIT). Experiment 2 compares FIT to another, fundamentally different, cognitive framework, and the minimal justification principle. The study gives insight into how goals and feedback work through attention focus and the goal hierarchy to guide behaviour, the role of attitude in this process, and offers evidence that FIT better accounts for task specific conservation behaviour. Experiment 3 addresses the role of goals and information in strategy planning through the perspective of goal setting theory. Results of this study suggest the need for more development of the basic theory and illustrate the strong motivational properties of having a goal. Experiment 4 investigates a more fundamental process, anchoring bias, taken from decision theory and the theory of rational choice. This experiment was based again on FIT and provided further evidence of behavioural control through the focus of attention at a particular level of the goal hierarchy.

  6. Switching theory-based steganographic system for JPEG images

    Science.gov (United States)

    Cherukuri, Ravindranath C.; Agaian, Sos S.

    2007-04-01

    Cellular communications constitute a significant portion of the global telecommunications market; therefore, the need for secure communication over mobile platforms has increased exponentially. Steganography, the art of hiding critical data in an innocuous signal, answers this need. JPEG is one of the most commonly used formats for storing and transmitting images on the web, and pictures captured by mobile cameras are mostly in JPEG format. In this article, we introduce a switching theory based steganographic system for JPEG images that is applicable to both mobile and computer platforms. The proposed algorithm uses the fact that the energy distribution among the quantized AC coefficients varies from block to block and coefficient to coefficient. Existing approaches are effective with a subset of these coefficients, but they prove ineffective when employed over all the coefficients. We therefore propose an approach that treats each set of AC coefficients within a different framework, enhancing the overall performance. The proposed system offers high capacity and embedding efficiency simultaneously while withstanding simple statistical attacks. In addition, the embedded information can be retrieved without prior knowledge of the cover image. Simulation results demonstrate an improved embedding capacity over existing algorithms while maintaining a high embedding efficiency and preserving the statistics of the JPEG image after hiding information.
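As a point of reference for coefficient-domain embedding (not the switching-theory scheme itself, which the abstract does not specify in detail), a minimal parity-based sketch over quantized AC coefficients might look like this; the coefficient values and message bits are made up.

```python
def embed_bits(ac, bits):
    """Embed message bits in nonzero quantized AC coefficients by forcing
    each coefficient's magnitude parity to carry one bit (baseline sketch)."""
    out = list(ac)
    j = 0
    for i, c in enumerate(out):
        if c == 0 or j >= len(bits):
            continue  # zeros are skipped to preserve JPEG run-length statistics
        mag = abs(c)
        if mag % 2 != bits[j]:
            mag += 1  # adjust magnitude upward so it never becomes zero
        out[i] = mag if c > 0 else -mag
        j += 1
    return out

def extract_bits(ac, n):
    """Recover the first n bits from the parities of nonzero coefficients."""
    return [abs(c) % 2 for c in ac if c != 0][:n]

msg = [1, 0, 1, 1, 0, 0, 1, 0]
cover = [3, 0, -2, 5, 0, 0, -1, 4, 2, -6, 1, 0, 7]  # toy quantized AC block
stego = embed_bits(cover, msg)
print(extract_bits(stego, len(msg)) == msg)  # → True
```

Note that extraction here needs no cover image, matching the blind-retrieval property the abstract claims, though a real scheme would also have to control the statistical footprint of the parity adjustments.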

  7. Forewarning model for water pollution risk based on Bayes theory.

    Science.gov (United States)

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

    In order to reduce losses from water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. The model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen the index systems. A hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that the samples' features better reflect and represent the totals. The forewarning level is judged by the maximum-probability rule, and management strategies are then proposed based on local conditions to reduce heavy warnings to a lesser degree. This study takes the Taihu Basin as an example. After application and verification of the forewarning model for water pollution risk against actual and simulated data from 2000 to 2009, the forewarning level in 2010 is given as a severe warning, which coincides well with the logistic curve. The model is rigorous in theory and flexible in method, with reasonable results and a simple structure, and it has strong logical superiority and regional adaptability, providing a new way to warn of water pollution risk.
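The maximum-probability rule over Bayesian posteriors can be sketched as follows. The warning levels, priors, and Gaussian likelihoods below are hypothetical stand-ins, not the paper's calibrated Taihu Basin model.

```python
from math import exp, pi, sqrt

def gaussian(x, mu, sigma):
    """Gaussian probability density used as a toy likelihood."""
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

# Hypothetical warning levels: prior probability and the typical
# pollution-index value observed under each level (illustrative numbers).
levels = {
    "light":  {"prior": 0.5, "mu": 0.2, "sigma": 0.1},
    "medium": {"prior": 0.3, "mu": 0.5, "sigma": 0.1},
    "severe": {"prior": 0.2, "mu": 0.8, "sigma": 0.1},
}

def posterior(observed_index):
    """P(level | observation) via Bayes' theorem: prior times likelihood,
    normalized over all levels."""
    joint = {k: v["prior"] * gaussian(observed_index, v["mu"], v["sigma"])
             for k, v in levels.items()}
    z = sum(joint.values())
    return {k: j / z for k, j in joint.items()}

post = posterior(0.75)               # a high observed pollution index
print(max(post, key=post.get))       # → severe (maximum-probability rule)
```

A real application would replace the toy likelihoods with distributions fitted to hydrological simulations and historical samples, as the abstract describes.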

  8. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    Energy Technology Data Exchange (ETDEWEB)

    McDonnell, J. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schunck, N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Higdon, D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sarich, J. [Argonne National Lab. (ANL), Argonne, IL (United States); Wild, S. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Nazarewicz, W. [Michigan State Univ., East Lansing, MI (United States); Oak Ridge National Lab., Oak Ridge, TN (United States); Univ. of Warsaw, Warsaw (Poland)

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
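
    The propagation step can be sketched with plain Monte Carlo: draw parameters from an assumed Gaussian posterior and push each draw through a toy mass model (the actual analysis uses a Gaussian-process emulator over the Skyrme functional parameters; the model and all numbers here are invented):

```python
import random

random.seed(0)

def toy_mass_model(a, b, n_neutrons):
    """Stand-in for a DFT mass prediction as a function of two parameters."""
    return a * n_neutrons + b

# Assumed posterior means and standard deviations for (a, b)
post = {"a": (8.0, 0.1), "b": (20.0, 2.0)}

# Propagate 10,000 posterior draws through the model
preds = []
for _ in range(10_000):
    a = random.gauss(*post["a"])
    b = random.gauss(*post["b"])
    preds.append(toy_mass_model(a, b, n_neutrons=50))

mean = sum(preds) / len(preds)
std  = (sum((p - mean) ** 2 for p in preds) / len(preds)) ** 0.5
print(round(mean, 1), round(std, 1))   # predictive mean and uncertainty
```

    The spread of `preds` is the theoretical statistical uncertainty on the prediction; a tight posterior on the parameters translates into a tight predictive band.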

  9. Dynamical theory of subconstituents based on ternary algebras

    International Nuclear Information System (INIS)

    Bars, I.; Guenaydin, M.

    1980-01-01

    We propose a dynamical theory of possible fundamental constituents of matter. Our scheme is based on (super) ternary algebras which are building blocks of Lie (super) algebras. Elementary fields, called "ternons," are associated with the elements of a (super) ternary algebra. Effective gauge bosons, "quarks," and "leptons" are constructed as composite fields from ternons. We propose two- and four-dimensional (super) ternon theories whose structures are closely related to CP_N and Yang-Mills theories and their supersymmetric extensions. We conjecture that at large distances (low energies) the ternon theories dynamically produce effective gauge theories and thus may be capable of explaining the present particle-physics phenomenology. Such a scenario is valid in two dimensions.

  10. Looking to the future of new media in health marketing: deriving propositions based on traditional theories.

    Science.gov (United States)

    Della, Lindsay J; Eroglu, Dogan; Bernhardt, Jay M; Edgerton, Erin; Nall, Janice

    2008-01-01

    Market trend data show that the media marketplace continues to rapidly evolve. Recent research shows that substantial portions of the U.S. media population are "new media" users. Today, more than ever before, media consumers are exposed to multiple media at the same point in time, encouraged to participate in media content generation, and challenged to learn, access, and use the new media that are continually entering the market. These media trends have strong implications for how consumers of health information access, process, and retain health-related knowledge. In this article we review traditional information processing models and theories of interpersonal and mass media access and consumption. We make several theory-based propositions for how traditional information processing and media consumption concepts will function as new media usage continues to increase. These propositions are supported by new media usage data from the Centers for Disease Control and Prevention's entry into the new media market (e.g., podcasting, virtual events, blogging, and webinars). Based on these propositions, we conclude by presenting both opportunities and challenges that public health communicators and marketers will face in the future.

  11. Information and information flow an introduction

    CERN Document Server

    Bremer, Manuel

    2004-01-01

    This book is conceived as an introductory text into the theory of syntactic and semantic information, and information flow. Syntactic information theory is concerned with the information contained in the very fact that some signal has a non-random structure. Semantic information theory is concerned with the meaning or information content of messages and the like. The theory of information flow is concerned with deriving some piece of information from another. The main part will take us to situation semantics as a foundation of modern approaches in information theory. We give a brief overview o

  12. The Prediction of Item Parameters Based on Classical Test Theory and Latent Trait Theory

    Science.gov (United States)

    Anil, Duygu

    2008-01-01

    In this study, the predictive power of item characteristics based on experts' predictions, for conditions in which try-out practices cannot be applied, was examined against item characteristics computed according to classical test theory and the two-parameter logistic model of latent trait theory. The study was carried out on 9914 randomly selected students…
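
    The two-parameter logistic (2PL) model referenced above gives the probability of a correct response as a function of ability theta, item discrimination a, and item difficulty b; a minimal implementation:

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: P(correct | theta) = 1 / (1 + e^{-a(theta-b)})."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# At theta == b the probability is 0.5 regardless of discrimination;
# a larger `a` makes the curve steeper around b.
print(p_correct(0.0, a=1.2, b=0.0))           # -> 0.5
print(round(p_correct(1.0, a=1.2, b=0.0), 3))
```

    Experts' predictions of a and b can then be compared with the values estimated from response data by checking how closely the implied response curves agree.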

  13. Expanding resource theory and feminist-informed theory to explain intimate partner violence perpetration by court-ordered men.

    Science.gov (United States)

    Basile, Kathleen C; Hall, Jeffrey E; Walters, Mikel L

    2013-07-01

    This study tested resource and feminist-informed theories to explain physical, sexual, psychological, and stalking intimate partner violence (IPV) perpetrated by court-mandated men. Data were obtained from 340 men arrested for physical assault of a partner before their court-ordered treatment. Using path analysis, findings provided partial support for each model. Ineffective arguing and substance-use problems were moderators of resources and perpetration. Dominance mediated early exposures and perpetration in the feminist-informed model. In both models, predictors of stalking were different than those for other types of perpetration. Future studies should replicate this research and determine the utility of combining models.

  14. Position-specific prediction of methylation sites from sequence conservation based on information theory.

    Science.gov (United States)

    Shi, Yinan; Guo, Yanzhi; Hu, Yayun; Li, Menglong

    2015-07-23

    Protein methylation plays vital roles in many biological processes and has been implicated in various human diseases. To fully understand the mechanisms underlying methylation for use in drug design and work in methylation-related diseases, an initial but crucial step is to identify methylation sites. The use of high-throughput bioinformatics methods has become imperative to predict methylation sites. In this study, we developed a novel method that is based only on sequence conservation to predict protein methylation sites. Conservation difference profiles between methylated and non-methylated peptides were constructed by the information entropy (IE) in a wider neighbor interval around the methylation sites that fully incorporated all of the environmental information. Then, the distinctive neighbor residues were identified by the importance scores of information gain (IG). The most representative model was constructed by support vector machine (SVM) for Arginine and Lysine methylation, respectively. This model yielded a promising result on both the benchmark dataset and independent test set. The model was used to screen the entire human proteome, and many unknown substrates were identified. These results indicate that our method can serve as a useful supplement to elucidate the mechanism of protein methylation and facilitate hypothesis-driven experimental design and validation.
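
    The two conservation measures named above can be sketched as follows (the toy peptide windows and labels are invented; in the published method such scores feed an SVM):

```python
import math
from collections import Counter

def entropy(symbols):
    """Shannon entropy (bits) of a list of symbols - the IE measure."""
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in Counter(symbols).values())

def info_gain(seqs, labels, pos):
    """IG of a position: H(labels) - H(labels | residue at pos)."""
    base, cond = entropy(labels), 0.0
    for res in set(s[pos] for s in seqs):
        sub = [l for s, l in zip(seqs, labels) if s[pos] == res]
        cond += len(sub) / len(seqs) * entropy(sub)
    return base - cond

seqs   = ["ARKS", "GRKT", "ARQS", "GRQT"]   # peptide windows around the site
labels = [1, 1, 0, 0]                       # 1 = methylated, 0 = not
print(round(entropy([s[2] for s in seqs]), 3))  # IE at position 2
print(round(info_gain(seqs, labels, 2), 3))     # IG of position 2
```

    Positions with high information gain are the "distinctive neighbor residues": here position 2 perfectly separates the two classes, so its IG equals the full label entropy.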

  15. Integrated information theory of consciousness: an updated account.

    Science.gov (United States)

    Tononi, G

    2012-12-01

    This article presents an updated account of the integrated information theory of consciousness (IIT) and some of its implications. IIT stems from thought experiments that lead to phenomenological axioms (existence, compositionality, information, integration, exclusion) and corresponding ontological postulates. The information axiom asserts that every experience is specific - it is what it is by differing in its particular way from a large repertoire of alternatives. The integration axiom asserts that each experience is unified - it cannot be reduced to independent components. The exclusion axiom asserts that every experience is definite - it is limited to particular things and not others and flows at a particular speed and resolution. IIT formalizes these intuitions with postulates. The information postulate states that only "differences that make a difference" from the intrinsic perspective of a system matter: a mechanism generates cause-effect information if its present state has selective past causes and selective future effects within a system. The integration postulate states that only information that is irreducible matters: mechanisms generate integrated information only to the extent that the information they generate cannot be partitioned into that generated within independent components. The exclusion postulate states that only maxima of integrated information matter: a mechanism specifies only one maximally irreducible set of past causes and future effects - a concept. A complex is a set of elements specifying a maximally irreducible constellation of concepts, where the maximum is evaluated over elements and at the optimal spatiotemporal scale. Its concepts specify a maximally integrated conceptual information structure or quale, which is identical with an experience. Finally, changes in information integration upon exposure to the environment reflect a system's ability to match the causal structure of the world. After introducing an updated definition of
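
    As a toy numerical illustration of irreducibility (not the full phi of IIT, which involves cause-effect repertoires and a search over partitions and spatiotemporal scales), one can measure the information in the joint state of a two-unit system that is lost when the system is cut into independent parts, i.e. its mutual information:

```python
import math

# P(x, y) for two binary units; the numbers are illustrative
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

# Marginal distributions of each unit
px = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

# Mutual information: KL divergence between the joint distribution and the
# product of marginals - zero iff the units are statistically independent
mi = sum(p * math.log2(p / (px[x] * py[y]))
         for (x, y), p in joint.items() if p > 0)
print(round(mi, 3))   # bits not reducible to the independent parts
```

    If the joint distribution factorized exactly into the product of marginals, `mi` would be zero: nothing would be lost by the partition, and in IIT's terms the system would generate no integrated information.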

  16. Investigating uncertainty and emotions in conversations about family health history: a test of the theory of motivated information management.

    Science.gov (United States)

    Rauscher, Emily A; Hesse, Colin

    2014-01-01

    Although the importance of being knowledgeable of one's family health history is widely known, very little research has investigated how families communicate about this important topic. This study investigated how young adults seek information from parents about family health history. The authors used the Theory of Motivated Information Management as a framework to understand the process of uncertainty discrepancy and emotion in seeking information about family health history. Results of this study show the Theory of Motivated Information Management to be a good model to explain the process young adults go through in deciding to seek information from parents about family health history. Results also show that emotions other than anxiety can be used with success in the Theory of Motivated Information Management framework.

  17. A density functional theory-based chemical potential equalisation

    Indian Academy of Sciences (India)

    A chemical potential equalisation scheme is proposed for the calculation of these quantities and hence the dipole polarizability within the framework of density functional theory based linear response theory. The resulting polarizability is expressed in terms of the contributions from individual atoms in the molecule. A few ...

  18. Neighborhood Hypergraph Based Classification Algorithm for Incomplete Information System

    Directory of Open Access Journals (Sweden)

    Feng Hu

    2015-01-01

    Full Text Available The problem of classification in incomplete information systems is a hot issue in intelligent information processing. The hypergraph is a new intelligent method for machine learning. However, it is hard to process an incomplete information system with the traditional hypergraph, for two reasons: (1) the hyperedges are generated randomly in the traditional hypergraph model; (2) the existing methods are unsuitable for incomplete information systems because of their missing values. In this paper, we propose a novel classification algorithm for incomplete information systems based on the hypergraph model and rough set theory. First, we initialize the hypergraph. Second, we classify the training set by the neighborhood hypergraph. Third, under the guidance of rough set theory, we replace the poor hyperedges. After that, we obtain a good classifier. The proposed approach is tested on 15 data sets from the UCI machine learning repository and compared with existing methods such as C4.5, SVM, NaiveBayes, and KNN. The experimental results show that the proposed algorithm performs better in terms of Precision, Recall, AUC, and F-measure.
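
    The neighborhood computation for an incomplete information system can be sketched with the standard rough-set tolerance relation, under which two objects are indistinguishable if they agree on every attribute where both have values ('*' marks a missing value; the data are invented):

```python
MISSING = "*"

def tolerant(x, y):
    """Tolerance relation: agree on all attributes where neither is missing."""
    return all(a == b or MISSING in (a, b) for a, b in zip(x, y))

def neighborhood(objects, i):
    """Indices of all objects tolerant with object i (its neighborhood)."""
    return [j for j, y in enumerate(objects) if tolerant(objects[i], y)]

objects = [
    ("high", "*",    "yes"),
    ("high", "low",  "yes"),
    ("low",  "low",  "*"),
    ("low",  "high", "no"),
]
print(neighborhood(objects, 0))   # object 1 matches object 0 despite the '*'
```

    These neighborhoods are what a neighborhood hypergraph builds its hyperedges from, which is how missing values stop being an obstacle to classification.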

  19. Measuring Engagement in Later Life Activities: Rasch-Based Scenario Scales for Work, Caregiving, Informal Helping, and Volunteering

    Science.gov (United States)

    Ludlow, Larry H.; Matz-Costa, Christina; Johnson, Clair; Brown, Melissa; Besen, Elyssa; James, Jacquelyn B.

    2014-01-01

    The development of Rasch-based "comparative engagement scenarios" based on Guttman's facet theory and sentence mapping procedures is described. The scenario scales measuring engagement in work, caregiving, informal helping, and volunteering illuminate the lived experiences of role involvement among older adults and offer multiple…

  20. Learning Theory Foundations of Simulation-Based Mastery Learning.

    Science.gov (United States)

    McGaghie, William C; Harris, Ilene B

    2018-06-01

    Simulation-based mastery learning (SBML), like all education interventions, has learning theory foundations. Recognition and comprehension of SBML learning theory foundations are essential for thoughtful education program development, research, and scholarship. We begin with a description of SBML followed by a section on the importance of learning theory foundations to shape and direct SBML education and research. We then discuss three principal learning theory conceptual frameworks that are associated with SBML-behavioral, constructivist, social cognitive-and their contributions to SBML thought and practice. We then discuss how the three learning theory frameworks converge in the course of planning, conducting, and evaluating SBML education programs in the health professions. Convergence of these learning theory frameworks is illustrated by a description of an SBML education and research program in advanced cardiac life support. We conclude with a brief coda.

  1. Towards socio-material approaches in simulation-based education: lessons from complexity theory.

    Science.gov (United States)

    Fenwick, Tara; Dahlgren, Madeleine Abrandt

    2015-04-01

    Review studies of simulation-based education (SBE) consistently point out that theory-driven research is lacking. The literature to date is dominated by discourses of fidelity and authenticity - creating the 'real' - with a strong focus on the development of clinical procedural skills. Little of this writing incorporates the theory and research proliferating in professional studies more broadly, which show how professional learning is embodied, relational and situated in socio-material relations. A key concern for medical educators is how to better prepare students for the unpredictable and dynamic ambiguity of professional practice; this has stimulated the movement towards socio-material theories in education that address precisely this question. Among the various socio-material theories that are informing new developments in professional education, complexity theory has been of particular importance for medical educators interested in updating current practices. This paper outlines key elements of complexity theory, illustrated with examples from empirical study, to argue its particular relevance for improving SBE. Complexity theory can make visible important material dynamics, and their problematic consequences, that are not often noticed in simulated experiences in medical training. It also offers conceptual tools that can be put to practical use. This paper focuses on concepts of emergence, attunement, disturbance and experimentation. These suggest useful new approaches for designing simulated settings and scenarios, and for effective pedagogies before, during and following simulation sessions. Socio-material approaches such as complexity theory are spreading through research and practice in many aspects of professional education across disciplines. Here, we argue for the transformative potential of complexity theory in medical education using simulation as our focus.
Complexity tools open questions about the socio-material contradictions inherent in

  2. Information theory and coding solved problems

    CERN Document Server

    Ivaniš, Predrag

    2017-01-01

    This book offers a comprehensive overview of information theory and error control coding, using a different approach than the existing literature. The chapters are organized according to the Shannon system model, where one block affects the others. A relatively brief theoretical introduction is provided at the beginning of every chapter, including a few additional examples and explanations but without any proofs, and a short overview of some aspects of abstract algebra is given at the end of the corresponding chapters. Characteristic complex examples with many illustrations and tables are chosen to provide detailed insight into the nature of the problem. Some limiting cases are presented to illustrate the connections with the theoretical bounds. The numerical values are carefully selected to provide in-depth explanations of the described algorithms. Although the examples in the different chapters can be considered separately, they are mutually connected and the conclusions for one considered proble...

  3. What Communication Theories Can Teach the Designer of Computer-Based Training.

    Science.gov (United States)

    Larsen, Ronald E.

    1985-01-01

    Reviews characteristics of computer-based training (CBT) that make application of communication theories appropriate and presents principles from communication theory (e.g., general systems theory, symbolic interactionism, rule theories, and interpersonal communication theories) to illustrate how CBT developers can profitably apply them to…

  4. Improving health equity through theory-informed evaluations: a look at housing first strategies, cross-sectoral health programs, and prostitution policy.

    Science.gov (United States)

    Dunn, James R; van der Meulen, Emily; O'Campo, Patricia; Muntaner, Carles

    2013-02-01

    The emergent realist perspective on evaluation is instructive in the quest to use theory-informed evaluations to reduce health inequities. This perspective suggests that in addition to knowing whether a program works, it is imperative to know 'what works for whom in what circumstances and in what respects, and how?' (Pawson & Tilley, 1997). This addresses the important issue of heterogeneity of effect, in other words, that programs have different effects for different people, potentially even exacerbating inequities and worsening the situation of marginalized groups. But in addition, the realist perspective implies that a program may not only have a greater or lesser effect, but even for the same effect, it may work by way of a different mechanism, about which we must theorize, for different groups. For this reason, theory, and theory-based evaluations are critical to health equity. We present here three examples of evaluations with a focus on program theories and their links to inequalities. All three examples illustrate the importance of theory-based evaluations in reducing health inequities. We offer these examples from a wide variety of settings to illustrate that the problem of which we write is not an exception to usual practice. The 'Housing First' model of supportive housing for people with severe mental illness is based on a theory of the role of housing in living with mental illness that has a number of elements that directly contradict the theory underlying the dominant model. Multisectoral action theories form the basis for the second example on Venezuela's revolutionary national Barrio Adentro health improvement program. Finally, decriminalization of prostitution and related health and safety policies in New Zealand illustrate how evaluations can play an important role in both refining the theory and contributing to improved policy interventions to address inequalities. The theoretically driven and transformative nature of these interventions create

  5. Comparing integral and incidental emotions: Testing insights from emotions as social information theory and attribution theory.

    Science.gov (United States)

    Hillebrandt, Annika; Barclay, Laurie J

    2017-05-01

    Studies have indicated that observers can infer information about others' behavioral intentions from others' emotions and use this information in making their own decisions. Integrating emotions as social information (EASI) theory and attribution theory, we argue that the interpersonal effects of emotions are not only influenced by the type of discrete emotion (e.g., anger vs. happiness) but also by the target of the emotion (i.e., how the emotion relates to the situation). We compare the interpersonal effects of emotions that are integral (i.e., related to the situation) versus incidental (i.e., lacking a clear target in the situation) in a negotiation context. Results from 4 studies support our general argument that the target of an opponent's emotion influences the degree to which observers attribute the emotion to their own behavior. These attributions influence observers' inferences regarding the perceived threat of an impasse or cooperativeness of an opponent, which can motivate observers to strategically adjust their behavior. Specifically, emotion target influenced concessions for both anger and happiness (Study 1, N = 254), with perceived threat and cooperativeness mediating the effects of anger and happiness, respectively (Study 2, N = 280). Study 3 (N = 314) demonstrated the mediating role of attributions and moderating role of need for closure. Study 4 (N = 193) outlined how observers' need for cognitive closure influences how they attribute incidental anger. We discuss theoretical implications related to the social influence of emotions as well as practical implications related to the impact of personality on negotiators' biases and behaviors. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. A study of driver's route choice behavior based on evolutionary game theory.

    Science.gov (United States)

    Jiang, Xiaowei; Ji, Yanjie; Du, Muqing; Deng, Wei

    2014-01-01

    This paper proposes a route choice analytic method that embeds cumulative prospect theory in evolutionary game theory to analyze how drivers adjust their route choice behavior under the influence of traffic information. A simulated network with two alternative routes and one variable message sign is built to illustrate the method. We assume that the drivers in the transportation system are boundedly rational and that the traffic information they receive is incomplete. An evolutionary game model is constructed to describe the evolution of the drivers' route choice decision-making behavior. We conclude that traffic information plays an important role in route choice behavior. The drivers' route decision-making process develops towards different evolutionarily stable states under different transportation situations. The analysis also demonstrates that employing cumulative prospect theory and evolutionary game theory to study drivers' route choice behavior is effective. This analytic method provides academic support and suggestions for traffic guidance systems and may optimize travel efficiency to a certain extent.
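
    The evolutionary dynamic can be sketched with replicator dynamics on a two-route network (cost functions, initial share, and step size are invented for illustration; payoff is negative travel cost):

```python
def step(p, dt=0.1):
    """One replicator-dynamics update of the share p choosing route 1."""
    cost1 = 10 + 20 * p            # route 1 cost grows with its own share
    cost2 = 15 + 10 * (1 - p)      # route 2 cost grows with share 1 - p
    f1, f2 = -cost1, -cost2        # payoffs
    avg = p * f1 + (1 - p) * f2    # population-average payoff
    return p + dt * p * (f1 - avg)

p = 0.9                            # nearly everyone starts on route 1
for _ in range(200):
    p = step(p)
print(round(p, 3))                 # evolutionarily stable share on route 1
```

    The population settles where the two route costs equalize (here p = 0.5), which is the evolutionarily stable state the abstract refers to; changing the cost functions to reflect the information drivers receive shifts this equilibrium.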

  7. A ROADMAP FOR A COMPUTATIONAL THEORY OF THE VALUE OF INFORMATION IN ORIGIN OF LIFE QUESTIONS

    Directory of Open Access Journals (Sweden)

    Soumya Banerjee

    2016-06-01

    Full Text Available Information plays a critical role in complex biological systems. Complex systems like immune systems and ant colonies coordinate heterogeneous components in a decentralized fashion. How do these distributed, decentralized systems function? One key component is how these complex systems efficiently process information: they have an architecture for integrating and processing information coming in from various sources, which points to the value of information in the functioning of different complex biological systems. This article proposes a role for information processing in questions around the origin of life and suggests how computational simulations may yield insights into questions related to the origin of life. Such a computational model of the origin of life would unify thermodynamics with information processing, and we would gain an appreciation of why proteins and nucleotides evolved as the substrate of computation and information processing in the living systems we see on Earth. Answers to questions like these may give us insights into non-carbon-based forms of life that we could search for outside Earth. We hypothesize that carbon-based life forms are only one point on a continuum of life-like systems in the universe. Investigating the computational substrates that allow information processing is important and could yield insights into: (1) novel non-carbon-based computational substrates that may have "life-like" properties; and (2) how life may have actually originated from non-life on Earth. Life may exist on a continuum between non-life and life, and we may have to revise our notion of life and of how common it is in the universe. Looking at life or life-like phenomena through the lens of information theory may yield a broader view of life.

  8. MOTIVATING ENGLISH TEACHERS BASED ON THE BASIC NEEDS THEORY AND AN EXPECTANCY THEORY

    Directory of Open Access Journals (Sweden)

    Hidayatus Sholihah

    2017-08-01

    Full Text Available There are two main motivation theories: the hierarchy of basic needs theory and expectancy theory. In the hierarchy of basic needs theory, Maslow states that the basic needs directing behaviour are structured into a hierarchy of five levels. The first is physiological needs, such as salary, bonuses, or working conditions. The second is safety needs, such as a safe job environment, job security, or health cover. The third is social needs, such as unions and teamwork. The fourth is self-esteem, such as receiving an award, medal, certificate, or other recognition. The last is self-actualization, for example, being given the opportunity to share knowledge, skills, and experience. The main weaknesses of this theory are that spiritual needs are absent from the basic human needs, and that different levels of needs may have to be satisfied at the same time, rather than in hierarchical order. The second motivation theory is expectancy theory, which rests on three main factors. First, English teachers will be motivated to work harder if they perceive their own competences as adequate to their job. Second, individual motivation depends on the rewards given when they finish a particular job. Finally, it also depends on how they regard the rewards the job offers. Expectancy theory is a good theory; however, it is not easy to implement, because principals would need to provide various types of rewards to satisfy the expectations of their English teachers. Considering the strengths and weaknesses of these two theories, it is better to combine both in practice to obtain more effective results.

  9. How cultural evolutionary theory can inform social psychology and vice versa.

    Science.gov (United States)

    Mesoudi, Alex

    2009-10-01

    Cultural evolutionary theory is an interdisciplinary field in which human culture is viewed as a Darwinian process of variation, competition, and inheritance, and the tools, methods, and theories developed by evolutionary biologists to study genetic evolution are adapted to study cultural change. It is argued here that an integration of the theories and findings of mainstream social psychology and of cultural evolutionary theory can be mutually beneficial. Social psychology provides cultural evolution with a set of empirically verified microevolutionary cultural processes, such as conformity, model-based biases, and content biases, that are responsible for specific patterns of cultural change. Cultural evolutionary theory provides social psychology with ultimate explanations for, and an understanding of the population-level consequences of, many social psychological phenomena, such as social learning, conformity, social comparison, and intergroup processes, as well as linking social psychology with other social science disciplines such as cultural anthropology, archaeology, and sociology.

  10. Attachment-based family therapy for depressed and suicidal adolescents: theory, clinical model and empirical support.

    Science.gov (United States)

    Ewing, E Stephanie Krauthamer; Diamond, Guy; Levy, Suzanne

    2015-01-01

    Attachment-Based Family Therapy (ABFT) is a manualized family-based intervention designed for working with depressed adolescents, including those at risk for suicide, and their families. It is an empirically informed and supported treatment. ABFT has its theoretical underpinnings in attachment theory and clinical roots in structural family therapy and emotion focused therapies. ABFT relies on a transactional model that aims to transform the quality of adolescent-parent attachment, as a means of providing the adolescent with a more secure relationship that can support them during challenging times generally, and the crises related to suicidal thinking and behavior, specifically. This article reviews: (1) the theoretical foundations of ABFT (attachment theory, models of emotional development); (2) the ABFT clinical model, including training and supervision factors; and (3) empirical support.

  11. Using attachment theory to inform the design and delivery of mental health services: a systematic review of the literature.

    Science.gov (United States)

    Bucci, Sandra; Roberts, Nicola H; Danquah, Adam N; Berry, Katherine

    2015-03-01

    The aim of this review was to propose and describe the design and delivery of an attachment-informed general mental health service. We systematically searched the PsycINFO, MEDLINE, Web of Knowledge, COPAC, CINAHL, and Science Direct databases from 1960 to 2013. We also searched reference lists of relevant papers and directly contacted authors in the field. Literature describing attachment theory and its applicability in designing and delivering general mental health services was synthesized using thematic analysis. Papers published in English, books or chapters in edited books that described applying attachment theory in designing and delivering mental health services for adults and adolescents were included in the review. Of the 1,105 articles identified, 14 met inclusion criteria for the review. Eight key themes, and four subthemes, were extracted and organized to reflect the experience of a service user moving through the mental health system. Key themes extracted were as follows: service policy and evaluation; referrals; assessment and formulation; intervention; support for staff; support for carers; moving on; and potential service benefits. Papers reviewed suggested that service users with severe mental health problems have attachment needs that should be met in general mental health services. Attachment theory provides a useful framework to inform the design and delivery of general mental health services. The resource implications for services are discussed, as are limitations of the review and recommendations for future research. Attachment theory should be used to inform the design and delivery of general mental health services. Mental health services should evaluate the extent to which they meet service users' attachment needs. Attachment-informed mental health services should assess outcomes, including cost-effectiveness over time. Papers included in this review focus on long-stay residential care or secure services and there is a limited experimental…

  12. A Corpus-Based Discourse Information Analysis of Chinese EFL Learners' Autonomy in Legal Case Brief Writing

    Science.gov (United States)

    Chen, Jinshi

    2017-01-01

    Legal case brief writing is pedagogically important yet insufficiently discussed for Chinese EFL learners majoring in law. Based on the process genre approach and discourse information theory (DIT), the present study designs a corpus-based analytical model for Chinese EFL learners' autonomy in legal case brief writing and explores the process of case…

  13. Opera house acoustics based on subjective preference theory

    CERN Document Server

    Ando, Yoichi

    2015-01-01

    This book focuses on opera house acoustics based on subjective preference theory; it targets researchers in acoustics and vision who are working in physics, psychology, and brain physiology. The book helps readers understand subjective attributes in relation to objective parameters, based on a powerful and workable model of the auditory system. It is reconfirmed here that the well-known Helmholtz theory, which was based on a peripheral model of the auditory system, does not adequately describe pitch, timbre, and duration, nor the spatial sensations described in this book, nor overall responses such as the subjective preference of sound fields and the annoyance of environmental noise.

  14. Task-Based Language Teaching and Expansive Learning Theory

    Science.gov (United States)

    Robertson, Margaret

    2014-01-01

    Task-Based Language Teaching (TBLT) has become increasingly recognized as an effective pedagogy, but its location in generalized sociocultural theories of learning has led to misunderstandings and criticism. The purpose of this article is to explain the congruence between TBLT and Expansive Learning Theory and the benefits of doing so. The merit…

  15. From motivation and cognition theories to everyday applications and back again: the case of product-integrated information and feedback

    International Nuclear Information System (INIS)

    McCalley, L.T.

    2006-01-01

    Various moderators of the relationship between goal setting and feedback are explored in four examples of applied empirical research. A selection of theoretical frameworks adapted from varied disciplines guided the studies, and these are discussed in terms of their value to the particular questions investigated. The experiments all entailed the use of product-integrated energy feedback and illustrate a progressive understanding of how goals, feedback, and other information provided to the user can generate or support better energy conservation. Experiment 1 exemplifies the successful combination of goal setting and feedback, and provides a basic understanding of the interaction from the perspectives of goal setting theory and feedback intervention theory (FIT). Experiment 2 compares FIT to another, fundamentally different cognitive framework, the minimal justification principle. The study gives insight into how goals and feedback work through attention focus and the goal hierarchy to guide behavior and the role of attitude in this process, and offers evidence that FIT better accounts for task-specific conservation behavior. Experiment 3 addresses the role of goals and information in strategy planning through the perspective of goal setting theory. Results of this study suggest the need for further development of the basic theory and illustrate the strong motivational properties of having a goal. Experiment 4 investigates a more fundamental process, anchoring bias, taken from decision theory and the theory of rational choice. This experiment was again based on FIT and provided further evidence of behavioral control through the focus of attention at a particular level of the goal hierarchy. Findings are discussed in terms of potential energy savings and policy development impact.

  16. From motivation and cognition theories to everyday applications and back again: the case of product-integrated information and feedback

    Energy Technology Data Exchange (ETDEWEB)

    McCalley, L.T. [Technical University Eindhoven/TUE, Den Dolech 2, P.O. Box 513, Eindhoven 5600 MB (Netherlands)

    2006-01-01

    Various moderators of the relationship between goal setting and feedback are explored in four examples of applied empirical research. A selection of theoretical frameworks adapted from varied disciplines guided the studies, and these are discussed in terms of their value to the particular questions investigated. The experiments all entailed the use of product-integrated energy feedback and illustrate a progressive understanding of how goals, feedback, and other information provided to the user can generate or support better energy conservation. Experiment 1 exemplifies the successful combination of goal setting and feedback, and provides a basic understanding of the interaction from the perspectives of goal setting theory and feedback intervention theory (FIT). Experiment 2 compares FIT to another, fundamentally different cognitive framework, the minimal justification principle. The study gives insight into how goals and feedback work through attention focus and the goal hierarchy to guide behavior and the role of attitude in this process, and offers evidence that FIT better accounts for task-specific conservation behavior. Experiment 3 addresses the role of goals and information in strategy planning through the perspective of goal setting theory. Results of this study suggest the need for further development of the basic theory and illustrate the strong motivational properties of having a goal. Experiment 4 investigates a more fundamental process, anchoring bias, taken from decision theory and the theory of rational choice. This experiment was again based on FIT and provided further evidence of behavioral control through the focus of attention at a particular level of the goal hierarchy. Findings are discussed in terms of potential energy savings and policy development impact. (author)

  17. Models for Theory-Based M.A. and Ph.D. Programs.

    Science.gov (United States)

    Botan, Carl; Vasquez, Gabriel

    1999-01-01

    Presents work accomplished at the 1998 National Communication Association Summer Conference. Outlines reasons for theory-based education in public relations. Presents an integrated model of student outcomes, curriculum, pedagogy, and assessment for theory-based master's and doctoral programs, including assumptions made and rationale for such…

  18. Attachment and the processing of social information across the life span: theory and evidence.

    Science.gov (United States)

    Dykas, Matthew J; Cassidy, Jude

    2011-01-01

    Researchers have used J. Bowlby's (1969/1982, 1973, 1980, 1988) attachment theory frequently as a basis for examining whether experiences in close personal relationships relate to the processing of social information across childhood, adolescence, and adulthood. We present an integrative life-span-encompassing theoretical model to explain the patterns of results that have emerged from these studies. The central proposition is that individuals who possess secure experience-based internal working models of attachment will process--in a relatively open manner--a broad range of positive and negative attachment-relevant social information. Moreover, secure individuals will draw on their positive attachment-related knowledge to process this information in a positively biased schematic way. In contrast, individuals who possess insecure internal working models of attachment will process attachment-relevant social information in one of two ways, depending on whether the information could cause the individual psychological pain. If processing the information is likely to lead to psychological pain, insecure individuals will defensively exclude this information from further processing. If, however, the information is unlikely to lead to psychological pain, then insecure individuals will process this information in a negatively biased schematic fashion that is congruent with their negative attachment-related experiences. In a comprehensive literature review, we describe studies that illustrate these patterns of attachment-related information processing from childhood to adulthood. This review focuses on studies that have examined specific components (e.g., attention and memory) and broader aspects (e.g., attributions) of social information processing. We also provide general conclusions and suggestions for future research.

  19. The Nature of the Chemical Process. 1. Symmetry Evolution – Revised Information Theory, Similarity Principle and Ugly Symmetry

    Directory of Open Access Journals (Sweden)

    Shu-Kun Lin

    2001-03-01

    Full Text Available Abstract: Symmetry is a measure of indistinguishability. Similarity is a continuous measure of imperfect symmetry. Lewis' remark that “gain of entropy means loss of information” defines the relationship of entropy and information. Three laws of information theory have been proposed. Labeling by introducing nonsymmetry and formatting by introducing symmetry are defined. The function L (L = ln w, where w is the number of microstates), or the sum of entropy and information, L = S + I, of the universe is a constant (the first law of information theory). The entropy S of the universe tends toward a maximum (the second law of information theory). For a perfect symmetric static structure, the information is zero and the static entropy is the maximum (the third law of information theory). Based on the Gibbs inequality and the second law of the revised information theory we have proved the similarity principle (a continuous higher similarity-higher entropy relation after the rejection of the Gibbs paradox and proved the Curie-Rosen symmetry principle (a higher symmetry-higher stability relation as a special case of the similarity principle. The principles of information minimization and potential energy minimization are compared. Entropy is the degree of symmetry and information is the degree of nonsymmetry. There are two kinds of symmetries: dynamic and static symmetries. Any kind of symmetry will define an entropy and, corresponding to the dynamic and static symmetries, there are static entropy and dynamic entropy. Entropy in thermodynamics is a special kind of dynamic entropy. Any spontaneous process will evolve towards the highest possible symmetry, either dynamic or static or both. Therefore the revised information theory can be applied to characterizing all kinds of structural stability and process spontaneity. Some examples in chemical physics have been given. Spontaneous processes of all kinds of molecular…
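
    The three laws summarized in the abstract can be written compactly in the abstract's own symbols (a paraphrase for orientation, not the authors' notation):

    ```latex
    \begin{align*}
      L &= \ln w = S + I = \text{const.} && \text{(first law: $S+I$ of the universe is constant)}\\
      \Delta S &\ge 0 && \text{(second law: entropy tends toward a maximum)}\\
      I &= 0,\quad S = S_{\max} = \ln w && \text{(third law: perfect symmetric static structure)}
    \end{align*}
    ```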

  20. Reconstructing Historical Changes in Watersheds from Environmental Records: An Information Theory Approach

    Science.gov (United States)

    Guerrero, F. J.; Hatten, J. A.; Ruddell, B.; Penaranda, V.; Murillo, P.

    2015-12-01

    About 20% of the world's population lives in watersheds that suffer from water shortage. This situation has complex causes associated with historical changes in watersheds. However, disentangling the role of key drivers of water availability, such as climate change or land use practices, is challenging. Part of the difficulty is that historical analysis is basically a process of empirical reconstruction from available environmental records (e.g., sediment cores or long-term hydrologic time series). We developed a mathematical approach, based on information theory, for historical reconstructions in watersheds. We analyze spectral entropies calculated directly from sediment cores or indirectly from long-term hydrologic time series. Spectral entropy measures changes in Shannon's information of natural patterns (e.g., particle size distributions in lake bottoms or streamflow regimes) as they respond to different drivers. We illustrate the application of our approach with two case studies: the reconstruction of a time series of historical changes from a sediment core, and the detection of hydrologic alterations in watersheds associated with climate and forestry activities. In the first case we calculated spectral entropies from 700 sediment layers encompassing 1500 years of history in Loon Lake (Southern Oregon). In the second case, we calculated annual spectral entropies from daily discharge for the last 45 years in two experimental watersheds at the H. J. Andrews LTER site (Oregon Cascades). In Loon Lake our approach separated, without supervision, earthquakes from landslides and floods. It can also help to improve age models for sedimentary layers. At the H. J. Andrews sites our approach identified hydrological alterations following a complete clear cut in 1975. It is also helpful for identifying potential long-term impacts of these forestry activities, enhanced by climate change. Our results suggest that spectral entropy is central for translating between…
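
    As an illustration of the quantity underlying this approach, here is a minimal sketch of normalized spectral entropy for a discrete signal (a generic plug-in estimate for demonstration, not the authors' exact estimator):

    ```python
    import cmath
    import math
    import random

    def power_spectrum(x):
        """Discrete power spectrum |X_k|^2 for k = 1 .. n/2 (naive DFT)."""
        n = len(x)
        return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))) ** 2
                for k in range(1, n // 2 + 1)]

    def spectral_entropy(x):
        """Shannon entropy of the normalized power spectrum, scaled to [0, 1]."""
        psd = power_spectrum(x)
        total = sum(psd)
        p = [v / total for v in psd]
        h = -sum(q * math.log2(q) for q in p if q > 0)
        return h / math.log2(len(p))

    n = 128
    # A pure tone concentrates power in one frequency bin (entropy near 0);
    # white noise spreads power across all bins (entropy near 1).
    tone = [math.sin(2 * math.pi * 8 * t / n) for t in range(n)]
    rng = random.Random(0)
    noise = [rng.gauss(0, 1) for _ in range(n)]
    print(spectral_entropy(tone) < spectral_entropy(noise))
    ```

    A regular, predictable regime thus yields low spectral entropy, while a disturbed, broadband regime yields high spectral entropy, which is what makes the measure usable as a change detector in long records.
    
    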

  1. Robust Feature Selection from Microarray Data Based on Cooperative Game Theory and Qualitative Mutual Information

    Directory of Open Access Journals (Sweden)

    Atiyeh Mortazavi

    2016-01-01

    Full Text Available High dimensionality of microarray data sets may lead to low efficiency and overfitting. In this paper, a multiphase cooperative game theoretic feature selection approach is proposed for microarray data classification. In the first phase, due to high dimension of microarray data sets, the features are reduced using one of the two filter-based feature selection methods, namely, mutual information and Fisher ratio. In the second phase, Shapley index is used to evaluate the power of each feature. The main innovation of the proposed approach is to employ Qualitative Mutual Information (QMI for this purpose. The idea of Qualitative Mutual Information causes the selected features to have more stability and this stability helps to deal with the problem of data imbalance and scarcity. In the third phase, a forward selection scheme is applied which uses a scoring function to weight each feature. The performance of the proposed method is compared with other popular feature selection algorithms such as Fisher ratio, minimum redundancy maximum relevance, and previous works on cooperative game based feature selection. The average classification accuracy on eleven microarray data sets shows that the proposed method improves both average accuracy and average stability compared to other approaches.
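
    The filter phase described above ranks features by mutual information with the class label; a minimal plug-in estimate for discrete data can be sketched as follows (an illustration of the criterion only; the paper's QMI variant and Shapley computation are omitted):

    ```python
    import math
    from collections import Counter

    def mutual_information(xs, ys):
        """Plug-in estimate of I(X; Y) in bits for two discrete sequences."""
        n = len(xs)
        px, py = Counter(xs), Counter(ys)
        pxy = Counter(zip(xs, ys))
        return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
                   for (x, y), c in pxy.items())

    # Toy discretized "expression" vectors against a binary class label:
    label     = [1, 1, 1, 1, 0, 0, 0, 0]
    gene_good = [1, 1, 1, 1, 0, 0, 0, 0]   # perfectly predictive of the label
    gene_bad  = [1, 0, 1, 0, 1, 0, 1, 0]   # independent of the label
    print(mutual_information(gene_good, label))  # 1.0 bit
    print(mutual_information(gene_bad, label))   # 0.0 bits
    ```

    Ranking thousands of genes by this score and keeping the top few is exactly the kind of dimensionality reduction the first phase performs before the game-theoretic steps.
    
    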

  2. Information Seeking in Uncertainty Management Theory: Exposure to Information About Medical Uncertainty and Information-Processing Orientation as Predictors of Uncertainty Management Success.

    Science.gov (United States)

    Rains, Stephen A; Tukachinsky, Riva

    2015-01-01

    Uncertainty management theory outlines the processes through which individuals cope with health-related uncertainty. Information seeking has been frequently documented as an important uncertainty management strategy. The reported study investigates exposure to specific types of medical information during a search, and one's information-processing orientation as predictors of successful uncertainty management (i.e., a reduction in the discrepancy between the level of uncertainty one feels and the level one desires). A lab study was conducted in which participants were primed to feel more or less certain about skin cancer and then were allowed to search the World Wide Web for skin cancer information. Participants' search behavior was recorded and content analyzed. The results indicate that exposure to two health communication constructs that pervade medical forms of uncertainty (i.e., severity and susceptibility) and information-processing orientation predicted uncertainty management success.

  3. Academic Primer Series: Eight Key Papers about Education Theory.

    Science.gov (United States)

    Gottlieb, Michael; Boysen-Osborn, Megan; Chan, Teresa M; Krzyzaniak, Sara M; Pineda, Nicolas; Spector, Jordan; Sherbino, Jonathan

    2017-02-01

    Many teachers adopt instructional methods based on assumptions of best practices without attention to or knowledge of supporting education theory. Familiarity with a variety of theories informs education that is efficient, strategic, and evidence-based. As part of the Academic Life in Emergency Medicine Faculty Incubator Program, a list of key education theories for junior faculty was developed. A list of key papers on theories relevant to medical education was generated using an expert panel, a virtual community of practice synthetic discussion, and a social media call for resources. A three-round, Delphi-informed voting methodology including novice and expert educators produced a rank order of the top papers. These educators identified 34 unique papers. Eleven papers described the general use of education theory, while 23 papers focused on a specific theory. The top three papers on general education theories and top five papers on specific education theory were selected and summarized. The relevance of each paper for junior faculty and faculty developers is also presented. This paper presents a reading list of key papers for junior faculty in medical education roles. Three papers about general education theories and five papers about specific educational theories are identified and annotated. These papers may help provide foundational knowledge in education theory to inform junior faculty teaching practice.

  4. Consensus based on learning game theory with a UAV rendezvous application

    Directory of Open Access Journals (Sweden)

    Zhongjie Lin

    2015-02-01

    Full Text Available Multi-agent cooperation problems are becoming more and more attractive in both civilian and military applications. In multi-agent cooperation problems, different network topologies determine different manners of cooperation between agents. A centralized system directly controls the operation of each agent with information flowing from a single centre, while in a distributed system, agents operate separately under certain communication protocols. In this paper, a systematic distributed optimization approach is established based on a learning game algorithm. The convergence of the algorithm is proven within the game theory framework. Two typical consensus problems are analyzed with the proposed algorithm. The contributions of this work are threefold. First, the designed algorithm inherits the properties of learning game theory for problem simplification and proof of convergence. Second, the behaviour of learning endows the algorithm with robustness and autonomy. Third, with the proposed algorithm, consensus problems are analyzed from a novel perspective.
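
    For context, the baseline dynamics that such consensus algorithms generalize can be sketched with the classic linear averaging update (a standard textbook iteration, not the learning-game algorithm proposed in the paper):

    ```python
    def consensus_step(values, neighbors, eps=0.3):
        """One synchronous step of x_i <- x_i + eps * sum_j (x_j - x_i),
        summing over each agent i's neighbors j (eps below 1/max_degree)."""
        return [x + eps * sum(values[j] - x for j in neighbors[i])
                for i, x in enumerate(values)]

    # Four agents on a ring topology; repeated steps drive every state to
    # the average of the initial values (3.0 here), i.e. consensus.
    neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
    x = [0.0, 2.0, 4.0, 6.0]
    for _ in range(100):
        x = consensus_step(x, neighbors)
    print(x)  # every entry close to 3.0
    ```

    Each agent updates using only its neighbors' states, which is the distributed setting the paper contrasts with centralized control.
    
    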

  5. Information and meaning: revisiting Shannon's theory of communication and extending it to address today's technical problems.

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, Travis LaDell

    2009-12-01

    This paper has three goals. The first is to review Shannon's theory of information and the subsequent advances leading to today's statistics-based text analysis algorithms, showing that the semantics of the text is neglected. The second goal is to propose an extension of Shannon's original model that can take semantics into account, where the 'semantics' of a message is understood in terms of the intended or actual changes it produces in the recipient. The third goal is to propose several lines of research that naturally fall out of the proposed model. Each computational approach to solving some problem rests on an underlying model or set of models that describe how key phenomena in the real world are represented and how they are manipulated. These models are both liberating and constraining. They are liberating in that they suggest a path of development for new tools and algorithms. They are constraining in that they intentionally ignore other potential paths of development. Modern statistics-based text analysis algorithms have a specific intellectual history and a set of underlying models rooted in Shannon's theory of communication. For Shannon, language is treated as a stochastic generator of symbol sequences. Shannon himself, subsequently Weaver, and at least one of his predecessors are all explicit in their decision to exclude semantics from their models. This rejection of semantics as 'irrelevant to the engineering problem' is elegant and, combined with developments particularly by Salton and subsequently by Latent Semantic Analysis, has led to a whole collection of powerful algorithms and an industry for data mining technologies. However, the kinds of problems currently facing us go beyond what can be accounted for by this stochastic model. Today's problems increasingly focus on the semantics of specific pieces of information. And although progress is being made with the old models, it seems natural to develop or…
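
    Shannon's stochastic-source view mentioned above can be made concrete with a zeroth-order entropy estimate over symbols. Note that this deliberately ignores semantics, which is exactly the limitation the paper discusses (an illustrative sketch only):

    ```python
    import math
    from collections import Counter

    def entropy_bits(text):
        """Shannon entropy in bits/symbol of the empirical symbol
        distribution, treating the text as a zeroth-order stochastic source."""
        n = len(text)
        h = 0.0
        for count in Counter(text).values():
            p = count / n
            h -= p * math.log2(p)
        return h

    print(entropy_bits("aaaaaaaa"))  # 0.0: a deterministic source carries no surprise
    print(entropy_bits("abababab"))  # 1.0: two equiprobable symbols, one bit each
    # Identical entropy, very different meaning: the measure is blind to semantics.
    print(entropy_bits("send help") == entropy_bits("hped lsne"))
    ```

    The anagram in the last line has exactly the same symbol statistics, hence the same entropy, even though its meaning is destroyed; this is the gap the proposed semantic extension aims to address.
    
    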

  6. Feasibility study of molecular memory device based on DNA using methylation to store information

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Liming; Al-Dirini, Feras [Department of Electrical and Electronic Engineering, The University of Melbourne, Parkville 3010 (Australia); Center for Neural Engineering (CfNE), The University of Melbourne, Carlton 3053 (Australia); National ICT Australia, The University of Melbourne, Parkville 3010 (Australia); Qiu, Wanzhi; Skafidas, Efstratios, E-mail: sskaf@unimelb.edu.au [Department of Electrical and Electronic Engineering, The University of Melbourne, Parkville 3010 (Australia); Center for Neural Engineering (CfNE), The University of Melbourne, Carlton 3053 (Australia); Hossain, Faruque M. [Center for Neural Engineering (CfNE), The University of Melbourne, Carlton 3053 (Australia); Evans, Robin [Department of Electrical and Electronic Engineering, The University of Melbourne, Parkville 3010 (Australia)

    2016-07-14

    DNA, because of its robustness and dense information storage capability, has been proposed as a potential candidate for next-generation storage media. However, encoding information into the DNA sequence requires molecular synthesis technology, which to date is costly and prone to synthesis errors. Reading the DNA strand information is also complex. Ideally, DNA storage will provide methods for modifying stored information. Here, we conduct a feasibility study investigating the use of the DNA 5-methylcytosine (5mC) methylation state as a molecular memory to store information. We propose a new 1-bit memory device and study, based on the density functional theory and non-equilibrium Green's function method, the feasibility of electrically reading the information. Our results show that changes to methylation states lead to changes in the peak of negative differential resistance which can be used to interrogate memory state. Our work demonstrates a new memory concept based on methylation state which can be beneficial in the design of next generation DNA based molecular electronic memory devices.

  7. Feasibility study of molecular memory device based on DNA using methylation to store information

    International Nuclear Information System (INIS)

    Jiang, Liming; Al-Dirini, Feras; Qiu, Wanzhi; Skafidas, Efstratios; Hossain, Faruque M.; Evans, Robin

    2016-01-01

    DNA, because of its robustness and dense information storage capability, has been proposed as a potential candidate for next-generation storage media. However, encoding information into the DNA sequence requires molecular synthesis technology, which to date is costly and prone to synthesis errors. Reading the DNA strand information is also complex. Ideally, DNA storage will provide methods for modifying stored information. Here, we conduct a feasibility study investigating the use of the DNA 5-methylcytosine (5mC) methylation state as a molecular memory to store information. We propose a new 1-bit memory device and study, based on the density functional theory and non-equilibrium Green's function method, the feasibility of electrically reading the information. Our results show that changes to methylation states lead to changes in the peak of negative differential resistance which can be used to interrogate memory state. Our work demonstrates a new memory concept based on methylation state which can be beneficial in the design of next generation DNA based molecular electronic memory devices.

  8. Semantic Mining based on graph theory and ontologies. Case Study: Cell Signaling Pathways

    Directory of Open Access Journals (Sweden)

    Carlos R. Rangel

    2016-08-01

    Full Text Available In this paper we use concepts from graph theory and cellular biology, represented as ontologies, to carry out semantic mining tasks on signaling pathway networks. Specifically, the paper describes the semantic enrichment of signaling pathway networks. A cell signaling network describes basic cellular activities and their interactions. The main contribution of this paper is in the signaling pathway research area: it proposes a new technique to analyze and understand how changes in these networks may affect the transmission and flow of information, which produces diseases such as cancer and diabetes. Our approach is based on three concepts from graph theory (modularity, clustering, and centrality) frequently used in social network analysis. Our approach consists of two phases: the first uses the graph theory concepts to determine the cellular groups in the network, which we call communities; the second uses ontologies for the semantic enrichment of the cellular communities. The measures taken from graph theory allow us to determine the set of cells that are close (for example, in a disease) and the main cells in each community. We analyze our approach in two cases: TGF-β and Alzheimer's disease.
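
    Two of the graph measures named above can be illustrated on a toy interaction network: a hand-rolled sketch with hypothetical node names, not the authors' pipeline (modularity-based community detection is omitted for brevity):

    ```python
    def degree_centrality(adj):
        """Fraction of other nodes each node is directly connected to."""
        n = len(adj)
        return {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}

    def clustering_coefficient(adj, v):
        """Fraction of pairs of v's neighbours that are themselves linked."""
        nbrs = sorted(adj[v])
        k = len(nbrs)
        if k < 2:
            return 0.0
        links = sum(1 for i, a in enumerate(nbrs) for b in nbrs[i + 1:]
                    if b in adj[a])
        return 2 * links / (k * (k - 1))

    # Hypothetical signaling nodes: 'R' (a receptor) is a hub; two of its
    # downstream partners, 'K1' and 'K2', also interact with each other.
    adj = {'R': {'K1', 'K2', 'TF'}, 'K1': {'R', 'K2'},
           'K2': {'R', 'K1'}, 'TF': {'R'}}
    print(degree_centrality(adj)['R'])       # 1.0: linked to every other node
    print(clustering_coefficient(adj, 'R'))  # 1/3: one of three neighbour pairs linked
    ```

    High-centrality nodes single out the "main cells" in a community, while clustering picks out tightly interacting groups, which is how these measures support the community-finding phase.
    
    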

  9. Translation Theory 'Translated'

    DEFF Research Database (Denmark)

    Wæraas, Arild; Nielsen, Jeppe

    2016-01-01

    Translation theory has proved to be a versatile analytical lens used by scholars working from different traditions. On the basis of a systematic literature review, this study adds to our understanding of the ‘translations’ of translation theory by identifying the distinguishing features of the most common theoretical approaches to translation within the organization and management discipline: actor-network theory, knowledge-based theory, and Scandinavian institutionalism. Although each of these approaches already has borne much fruit in research, the literature is diverse and somewhat fragmented, but also overlapping. We discuss the ways in which the three versions of translation theory may be combined and enrich each other so as to inform future research, thereby offering a more complete understanding of translation in and across organizational settings.

  10. STUDENTS’ GEOMETRIC THINKING BASED ON VAN HIELE’S THEORY

    Directory of Open Access Journals (Sweden)

    Harina Fitriyani

    2018-02-01

    Full Text Available The current study aims to identify the development level of students' geometric thinking in the mathematics education department of Universitas Ahmad Dahlan, based on van Hiele's theory. This is a descriptive qualitative study with 129 student respondents. In addition to the researchers themselves, the instrument used in this study is a test consisting of 25 multiple-choice questions. The data were analyzed using Miles and Huberman's model. The results show that 30.65% of students were at the pre-visualization level, 21.51% at the visualization level, 29.03% at the analysis level, 16.67% at the informal deduction level, 2.15% at the deduction level, and 0.00% at the rigor level. Furthermore, the findings indicated transition levels between the development levels of geometric thinking, namely pre-analysis, pre-informal deduction, pre-deduction, and pre-rigor, at 20%, 13.44%, 6.45%, and 1.08% respectively. In addition, 40.32% of students were difficult to classify and 4.3% could not be identified.

  11. INFORMATIONAL-METHODICAL SUPPORT OF THE COURSE «MATHEMATICAL LOGIC AND THEORY OF ALGORITHMS»

    Directory of Open Access Journals (Sweden)

    Y. I. Sinko

    2010-06-01

    Full Text Available In this article the basic principles are examined of the technique for training future mathematics teachers in the foundations of mathematical logic and the theory of algorithms at Kherson State University with the use of information technologies. A general description is given of the functioning of the methodical system for learning mathematical logic with the use of information technologies, in the variant where the information technologies are represented by the integrated specialized educational software environment «MatLog».

  12. Jigsaw Cooperative Learning: Acid-Base Theories

    Science.gov (United States)

    Tarhan, Leman; Sesen, Burcin Acar

    2012-01-01

    This study focused on investigating the effectiveness of jigsaw cooperative learning instruction on first-year undergraduates' understanding of acid-base theories. Undergraduates' opinions about jigsaw cooperative learning instruction were also investigated. The participants of this study were 38 first-year undergraduates in chemistry education…

  13. Unity-Based Diversity: System Approach to Defining Information

    Directory of Open Access Journals (Sweden)

    Yixin Zhong

    2011-07-01

    Full Text Available What is information? This is the first question that information science should answer clearly. However, definitions of information have become so diversified that people question whether there is any unity among the diversity, leading to doubts about whether it is possible to establish a unified theory of information at all. To answer this question, a system approach to defining information is introduced in this paper. It is shown that the unity of information definitions can be maintained with this approach. As a by-product, an important concept, the information eco-system, is also obtained.

  14. Communication Theory.

    Science.gov (United States)

    Penland, Patrick R.

    Three papers are presented which delineate the foundation of theory and principles which underlie the research and instructional approach to communications at the Graduate School of Library and Information Science, University of Pittsburgh. Cybernetic principles provide the integration, and validation is based in part on a situation-producing…

  15. Centralizing Data Management with Considerations of Uncertainty and Information-Based Flexibility

    OpenAIRE

    Velu, Chander K.; Madnick, Stuart E.; Van Alstyne, Marshall W.

    2013-01-01

    This paper applies the theory of real options to analyze how the value of information-based flexibility should affect the decision to centralize or decentralize data management under low and high uncertainty. This study makes two main contributions. First, we show that in the presence of low uncertainty, centralization of data management decisions creates more total surplus for the firm as the similarity of business units increases. In contrast, in the presence of high uncertainty, centraliza...

  16. Behavioral change theories can inform the prediction of young adults' adoption of a plant-based diet.

    Science.gov (United States)

    Wyker, Brett A; Davison, Kirsten K

    2010-01-01

    Drawing on the Theory of Planned Behavior (TPB) and the Transtheoretical Model (TTM), this study (1) examines links between stages of change for following a plant-based diet (PBD) and consuming more fruits and vegetables (FV); (2) tests an integrated theoretical model predicting intention to follow a PBD; and (3) identifies associated salient beliefs. Cross-sectional. Large public university in the northeastern United States. 204 college students. TPB and TTM constructs were assessed using validated scales. Outcome, normative, and control beliefs were measured using open-ended questions. The overlap between stages of change for FV consumption and adopting a PBD was assessed using Spearman rank correlation analysis and cross-tab comparisons. The proposed model predicting adoption of a PBD was tested using structural equation modeling (SEM). Salient beliefs were coded using automatic response coding software. No association was found between stages of change for FV consumption and following a PBD. Results from SEM analyses provided support for the proposed model predicting intention to follow a PBD. Gender differences in salient beliefs for following a PBD were found. Results demonstrate the potential for effective theory-driven and stage-tailored public health interventions to promote PBDs. Copyright 2010 Society for Nutrition Education. Published by Elsevier Inc. All rights reserved.

  17. New frontiers in information and production systems modelling and analysis incentive mechanisms, competence management, knowledge-based production

    CERN Document Server

    Novikov, Dmitry; Bakhtadze, Natalia; Zaikin, Oleg

    2016-01-01

This book demonstrates how to apply modern approaches to complex system control in practical applications involving knowledge-based systems. The dimensions of knowledge-based systems are extended by incorporating new perspectives from control theory, multimodal systems and simulation methods. The book is divided into three parts: theory, production system and information system applications. One of its main focuses is on an agent-based approach to complex system analysis. Moreover, specialised forms of knowledge-based systems (like e-learning, social network, and production systems) are introduced with a new formal approach to knowledge system modelling. The book, which offers a valuable resource for researchers engaged in complex system analysis, is the result of a unique cooperation between scientists from applied computer science (mainly from Poland) and leading system control theory researchers from the Russian Academy of Sciences’ Trapeznikov Institute of Control Sciences.

  18. Evidence of improved fluid management in patients receiving haemodialysis following a self-affirmation theory-based intervention: A randomised controlled trial.

    Science.gov (United States)

    Wileman, Vari; Chilcot, Joseph; Armitage, Christopher J; Farrington, Ken; Wellsted, David M; Norton, Sam; Davenport, Andrew; Franklin, Gail; Da Silva Gane, Maria; Horne, Robert; Almond, Mike

    2016-01-01

    Haemodialysis patients are at risk of serious health complications; yet, treatment non-adherence remains high. Warnings about health risks associated with non-adherence may trigger defensive reactions. We studied whether an intervention based on self-affirmation theory reduced resistance to health-risk information and improved fluid treatment adherence. In a cluster randomised controlled trial, 91 patients either self-affirmed or completed a matched control task before reading about the health-risks associated with inadequate fluid control. Patients' perceptions of the health-risk information, intention and self-efficacy to control fluid were assessed immediately after presentation of health-risk information. Interdialytic weight gain (IDWG), excess fluid removed during haemodialysis, is a clinical measure of fluid treatment adherence. IDWG data were collected up to 12 months post-intervention. Self-affirmed patients had significantly reduced IDWG levels over 12 months. However, contrary to predictions derived from self-affirmation theory, self-affirmed participants and controls did not differ in their evaluation of the health-risk information, intention to control fluid or self-efficacy. A low-cost, high-reach health intervention based on self-affirmation theory was shown to reduce IDWG over a 12-month period, but the mechanism by which this apparent behaviour change occurred is uncertain. Further work is still required to identify mediators of the observed effects.

  19. Workshop on The Functional Analysis of Quantum Information Theory : a Collection of Notes Based on Lectures by Gilles Pisier, K. R. Parthasarathy, Vern Paulsen and Andreas Winter

    CERN Document Server

    Gupta, Ved Prakash; Sunder, V S

    2015-01-01

    This book provides readers with a concise introduction to current studies on operator-algebras and their generalizations, operator spaces and operator systems, with a special focus on their application in quantum information science. This basic framework for the mathematical formulation of quantum information can be traced back to the mathematical work of John von Neumann, one of the pioneers of operator algebras, which forms the underpinning of most current mathematical treatments of the quantum theory, besides being one of the most dynamic areas of twentieth century functional analysis. Today, von Neumann’s foresight finds expression in the rapidly growing field of quantum information theory. These notes gather the content of lectures given by a very distinguished group of mathematicians and quantum information theorists, held at the IMSc in Chennai some years ago, and great care has been taken to present the material as a primer on the subject matter. Starting from the basic definitions of operator space...

  20. Understanding Casual-Leisure Information Behaviour

    DEFF Research Database (Denmark)

    Elsweiler, David; Wilson, Max L.; Lunn, Brian Kirkegaard

    2011-01-01

Originally grounded in library and information science, the majority of information behaviour and information-seeking theories focus on task-based scenarios where users try to resolve information needs. While other theories exist, such as how people unexpectedly encounter information, for example, they are typically related back to tasks, motivated by work or personal goals. This chapter, however, focuses on casual-leisure scenarios that are typically motivated by hedonistic needs rather than information needs, where people engage in searching behaviours for pleasure rather than to find information. […] The results of these two studies are then used to define an initial model of casual-leisure information behaviour, which highlights the key differences between casual-leisure scenarios and typical information behaviour theory. The chapter concludes by discussing how this new model of casual…

  1. A mathematical method for verifying the validity of measured information about the flows of energy resources based on the state estimation theory

    Science.gov (United States)

    Pazderin, A. V.; Sof'in, V. V.; Samoylenko, V. O.

    2015-11-01

Efforts aimed at improving energy efficiency in all branches of the fuel and energy complex should begin with setting up a high-tech automated system for monitoring and accounting energy resources. Malfunctions and failures in the measurement and information parts of this system may distort commercial measurements of energy resources and create financial risks for power supplying organizations. In addition, measurement errors may stem from intentional distortion of measurements aimed at reducing payments on the consumer's side, which leads to commercial losses of energy resources. The article presents a universal mathematical method, based on the state estimation theory, for verifying the validity of measurement information in networks for transporting energy resources such as electricity, heat, petroleum, and gas. The energy resource transportation network is represented by a graph whose nodes correspond to producers and consumers and whose branches stand for transportation mains (power lines, pipelines, and heat network elements). The main idea of state estimation is to obtain calculated analogs of the energy resource flows for all available measurements. Unlike "raw" measurements, which contain inaccuracies, the calculated flows of energy resources, called estimates, fully satisfy all the state equations describing the energy resource transportation network; the state equations written in terms of the calculated estimates are free from residuals. The difference between a measurement and its calculated analog (estimate) is called the estimation residual. Large estimation residuals are an indicator of high errors in particular energy resource measurements. By using the presented method it is possible to improve the validity of energy resource measurements, to estimate the transportation network observability, and to eliminate
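The core computation the abstract describes, a least-squares state estimate whose residuals flag suspect meters, can be sketched on a toy network. The topology, meter names and numbers below are invented for illustration and are not from the paper:

```python
import numpy as np

# Toy network: producer -> node A -> consumer, plus a tap at node A.
# State x = (f1, f2): flow into A and flow out of A; the tap draws f1 - f2.
# Three meters give z1 ~ f1, z2 ~ f2, z3 ~ f1 - f2, i.e. z = H x + e.
H = np.array([[1.0,  0.0],
              [0.0,  1.0],
              [1.0, -1.0]])
z = np.array([100.5, 80.2, 19.9])          # readings, consistent within noise

# Least-squares state estimate: the "calculated analogs" of the flows.
x_hat, *_ = np.linalg.lstsq(H, z, rcond=None)

# Estimation residuals: measurement minus its calculated analog.
residuals = z - H @ x_hat
```

With consistent readings all residuals stay small; replacing `z[2]` with a corrupted value (say 35.0) would produce one dominant residual, pointing at the faulty meter.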

  2. Learning Styles of Baccalaureate Nursing Students and Attitudes toward Theory-Based Nursing.

    Science.gov (United States)

    Laschinger, Heather K.; Boss, Marvin K.

    1989-01-01

    The personal and environmental factors related to undergraduate and post-RN nursing students' attitudes toward theory-based nursing from Kolb's experiential learning theory perspective were investigated. Learning style and environmental press perceptions were found to be related to attitudes toward theory-based nursing. (Author/MLW)

  3. Making Theory Come Alive through Practice-based Design Research

    DEFF Research Database (Denmark)

    Markussen, Thomas; Knutz, Eva; Rind Christensen, Poul

The aim of this paper is to demonstrate how practice-based design research is able not only to challenge, but also to push toward further development of, some of the basic assumptions in emotion theories as used within design research. In so doing, we wish to increase knowledge on a central epistemological question for design research, namely how practice-based design research can be a vehicle for the construction of new theory for design research.

  4. A Christian faith-based recovery theory: understanding God as sponsor.

    Science.gov (United States)

    Timmons, Shirley M

    2012-12-01

This article reports the development of a substantive theory to explain an evangelical Christian-based process of recovery from addiction. Faith-based, 12-step, mutual aid programs can improve drug abstinence by offering (a) an intervention option alone and/or in conjunction with secular programs and (b) an opportunity for religious involvement. Although the literature on religion, spirituality, and addiction is voluminous, traditional 12-step programs fail to explain the mechanism that underpins the process of Christian-based recovery (CR). This pilot study used grounded theory to explore and describe the essence of recovery of 10 former crack cocaine-addicted persons voluntarily enrolled in a CR program. Data were collected from in-depth interviews during 4 months of 2008. Audiotapes were transcribed verbatim, and the constant comparative method was used to analyze the data, resulting in the basic social process theory, understanding God as sponsor. The theory was determined through writing theoretical memos that generated the key elements that allow persons to recover: acknowledging God-centered crises, communicating with God, and planning for the future. Findings from this preliminary study identify important factors that can help persons in recovery sustain sobriety, and can help program administrators benefit from theory that guides the development of evidence-based addiction interventions.

  5. Theory-Based Evaluation Meets Ambiguity: The Role of Janus Variables

    Science.gov (United States)

    Dahler-Larsen, Peter

    2018-01-01

    As theory-based evaluation (TBE) engages in situations where multiple stakeholders help develop complex program theory about dynamic phenomena in politically contested settings, it becomes difficult to develop and use program theory without ambiguity. The purpose of this article is to explore ambiguity as a fruitful perspective that helps TBE face…

  6. Towards a general theory of neural computation based on prediction by single neurons.

    Directory of Open Access Journals (Sweden)

    Christopher D Fiorillo

Although there has been tremendous progress in understanding the mechanics of the nervous system, there has not been a general theory of its computational function. Here I present a theory that relates the established biophysical properties of single generic neurons to principles of Bayesian probability theory, reinforcement learning and efficient coding. I suggest that this theory addresses the general computational problem facing the nervous system. Each neuron is proposed to mirror the function of the whole system in learning to predict aspects of the world related to future reward. According to the model, a typical neuron receives current information about the state of the world from a subset of its excitatory synaptic inputs, and prior information from its other inputs. Prior information would be contributed by synaptic inputs representing distinct regions of space, and by different types of non-synaptic, voltage-regulated channels representing distinct periods of the past. The neuron's membrane voltage is proposed to signal the difference between current and prior information ("prediction error" or "surprise"). A neuron would apply a Hebbian plasticity rule to select those excitatory inputs that are the most closely correlated with reward but are the least predictable, since unpredictable inputs provide the neuron with the most "new" information about future reward. To minimize the error in its predictions and to respond only when excitation is "new and surprising," the neuron selects amongst its prior information sources through an anti-Hebbian rule. The unique inputs of a mature neuron would therefore result from learning about spatial and temporal patterns in its local environment, and by extension, the external world. Thus the theory describes how the structure of the mature nervous system could reflect the structure of the external world, and how the complexity and intelligence of the system might develop from a population of
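The central idea, a voltage that signals the gap between current input and a learned prior prediction, can be caricatured with a one-weight delta rule. This is my own toy construction under strong simplifying assumptions, not the author's model:

```python
import random

random.seed(1)
w_prior = 0.0                      # weight on the prior (predictive) input
eta = 0.05                         # learning rate
for _ in range(2000):
    x = random.gauss(1.0, 0.1)     # current excitatory input
    prior = 1.0                    # prior information source (predicts x's mean)
    v = x - w_prior * prior        # "membrane voltage" ~ prediction error
    w_prior += eta * v * prior     # learning drives the predictable part of v to zero
```

After learning, `w_prior` settles near 1.0, so the voltage responds mainly to the unpredictable ("new and surprising") component of the input rather than to its predictable mean.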

  7. Hamiltonian theories quantization based on a probability operator

    International Nuclear Information System (INIS)

    Entral'go, E.E.

    1986-01-01

A quantization method based on a linear mapping of classical coordinate-momentum-time functions Λ(q,p,t) to quantum operators in a space of quantum states ψ is considered. The probability operator satisfies a system of equations representing the principles of dynamical and canonical correspondence between the classical and quantum theories. Quantization based on a probability operator leads to a quantum theory with a nonnegative joint coordinate-momentum distribution function for any state ψ. The main consequences of quantum mechanics with a probability operator are discussed in comparison with the generally accepted quantum and classical theories. It is shown that a probability operator gives rise to some new notions, called ''subquantum'' ones. Hence the quantum theory with a probability operator does not pretend to a complete description of physical reality in terms of classical variables and for this reason contains no problems like the Einstein-Podolsky-Rosen paradox. The results for some concrete problems are given: a free particle, a harmonic oscillator, and an electron in the Coulomb field. These results suggest the possibility of an experimental verification of quantization based on a probability operator.

  8. Theory and research in audiology education: understanding and representing complexity through informed methodological decisions.

    Science.gov (United States)

    Ng, Stella L

    2013-05-01

    The discipline of audiology has the opportunity to embark on research in education from an informed perspective, learning from professions that began this journey decades ago. The goal of this article is to position our discipline as a new member in the academic field of health professional education (HPE), with much to learn and contribute. In this article, I discuss the need for theory in informing HPE research. I also stress the importance of balancing our research goals by selecting appropriate methodologies for relevant research questions, to ensure that we respect the complexity of social processes inherent in HPE. Examples of relevant research questions are used to illustrate the need to consider alternative methodologies and to rethink the traditional hierarchy of evidence. I also provide an example of the thought processes and decisions that informed the design of an educational research study using a constructivist grounded theory methodology. As audiology enters the scholarly field of HPE, we need to arm ourselves with some of the knowledge and perspective that informs the field. Thus, we need to broaden our conceptions of what we consider to be appropriate styles of academic writing, relevant research questions, and valid evidence. Also, if we are to embark on qualitative inquiry into audiology education (or other audiology topics), we need to ensure that we conduct this research with an adequate understanding of the theories and methodologies informing such approaches. We must strive to conduct high quality, rigorous qualitative research more often than uninformed, generic qualitative research. These goals are imperative to the advancement of the theoretical landscape of audiology education and evolving the place of audiology in the field of HPE. American Academy of Audiology.

  9. Information Theory for Correlation Analysis and Estimation of Uncertainty Reduction in Maps and Models

    Directory of Open Access Journals (Sweden)

    J. Florian Wellmann

    2013-04-01

The quantification and analysis of uncertainties is important in all cases where maps and models of uncertain properties are the basis for further decisions. Once these uncertainties are identified, the logical next step is to determine how they can be reduced. Information theory provides a framework for the analysis of spatial uncertainties when different subregions are considered as random variables. In the work presented here, joint entropy, conditional entropy, and mutual information are applied for a detailed analysis of spatial uncertainty correlations. The aim is to determine (i) which areas in a spatial analysis share information, and (ii) where, and by how much, additional information would reduce uncertainties. As an illustration, a typical geological example is evaluated: the case of a subsurface layer with uncertain depth, shape and thickness. Mutual information and multivariate conditional entropies are determined based on multiple simulated model realisations. Even for this simple case, the measures not only provide a clear picture of uncertainties and their correlations but also give detailed insights into the potential reduction of uncertainties at each position, given additional information at a different location. The methods are directly applicable to other types of spatial uncertainty evaluations, especially where multiple realisations of a model simulation are analysed. In summary, the application of information theoretic measures opens up the path to a better understanding of spatial uncertainties, and their relationship to information and prior knowledge, for cases where uncertain property distributions are spatially analysed and visualised in maps and models.
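The three measures named above can be estimated directly from model realisations. A minimal sketch, with invented categorical labels standing in for the paper's geological simulations:

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy (bits) of the empirical distribution of samples."""
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in Counter(samples).values())

# Hypothetical realisations: a property simulated at two subsurface
# locations A and B across 8 model runs (labels are made up).
A = ["sand", "sand", "shale", "shale", "sand", "shale", "sand", "shale"]
B = ["wet",  "wet",  "dry",   "dry",   "wet",  "dry",   "dry",  "dry"]

H_A, H_B = entropy(A), entropy(B)
H_AB = entropy(list(zip(A, B)))          # joint entropy H(A, B)
mutual_info = H_A + H_B - H_AB           # information shared by the locations
H_A_given_B = H_AB - H_B                 # uncertainty left at A once B is known
```

`H_A_given_B` quantifies exactly the question posed in the abstract: by how much would observing location B reduce the uncertainty at location A.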

  10. Generalization of information-based concepts in forecast verification

    Science.gov (United States)

    Tödter, J.; Ahrens, B.

    2012-04-01

This work deals with information-theoretical methods in probabilistic forecast verification. Recent findings concerning the Ignorance Score are briefly reviewed, and the generalization to continuous forecasts is shown. For ensemble forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are the prominent verification measures for probabilistic forecasts. Particularly attractive are their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts. Information theory sets up the natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The applicability and usefulness of the conceptually appealing CRIGN is illustrated, together with an algorithm to evaluate its components, reliability, resolution, and uncertainty, for ensemble-generated forecasts. This is also directly applicable to the more traditional CRPS.
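For binary events the two base scores are easy to compute side by side. A small sketch with made-up rain forecasts (the numbers are illustrative only):

```python
import math

# Hypothetical forecasts (probability of rain) and observed outcomes (1 = rain).
forecasts = [0.9, 0.2, 0.7, 0.4]
outcomes  = [1,   0,   1,   1]
n = len(forecasts)

# Brier Score: mean squared difference between forecast probability and outcome.
brier = sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / n

# Ignorance Score: mean negative log2 of the probability assigned to what occurred.
ign = sum(-math.log2(p if o == 1 else 1.0 - p)
          for p, o in zip(forecasts, outcomes)) / n
```

Both scores are negatively oriented (lower is better); the logarithm makes IGN punish confident misses far more sharply than the quadratic BS, which is one practical face of the second-order-approximation relationship mentioned above.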

  11. A Study of Driver’s Route Choice Behavior Based on Evolutionary Game Theory

    Directory of Open Access Journals (Sweden)

    Xiaowei Jiang

    2014-01-01

This paper proposes a route choice analytic method that embeds cumulative prospect theory in evolutionary game theory to analyze how drivers adjust their route choice behaviors under the influence of traffic information. A simulated network with two alternative routes and one variable message sign is built to illustrate the analytic method. We assume that the drivers in the transportation system are boundedly rational and that the traffic information they receive is incomplete. An evolutionary game model is constructed to describe the evolutionary process of the drivers' route choice decision-making behaviors. We conclude that traffic information plays an important role in route choice behavior: the drivers' route decision-making process evolves towards different evolutionarily stable states under different transportation situations. The analysis results also demonstrate that employing cumulative prospect theory and evolutionary game theory to study drivers' route choice behavior is effective. This analytic method provides academic support and suggestions for traffic guidance systems, and may optimize travel efficiency to a certain extent.
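The evolutionary-game part can be illustrated with plain replicator dynamics on a two-route network. The cost functions and numbers below are my own toy choices, and the cumulative-prospect weighting of the paper is omitted:

```python
# Travel cost on each route grows with the share of drivers using it.
def cost(share, free_flow, congestion):
    return free_flow * (1.0 + congestion * share)

p = 0.9                                    # initial share of drivers on route 1
for _ in range(1000):
    c1 = cost(p, free_flow=10.0, congestion=1.0)
    c2 = cost(1.0 - p, free_flow=12.0, congestion=0.5)
    avg = p * c1 + (1.0 - p) * c2
    p += 0.01 * p * (avg - c1)             # replicator update: cheaper routes gain share
```

With these numbers the share converges to p = 0.5, where both routes cost 15.0: an evolutionarily stable state in which no driver can do better by switching.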

  12. Applying Shannon's information theory to bacterial and phage genomes and metagenomes

    Science.gov (United States)

    Akhter, Sajia; Bailey, Barbara A.; Salamon, Peter; Aziz, Ramy K.; Edwards, Robert A.

    2013-01-01

    All sequence data contain inherent information that can be measured by Shannon's uncertainty theory. Such measurement is valuable in evaluating large data sets, such as metagenomic libraries, to prioritize their analysis and annotation, thus saving computational resources. Here, Shannon's index of complete phage and bacterial genomes was examined. The information content of a genome was found to be highly dependent on the genome length, GC content, and sequence word size. In metagenomic sequences, the amount of information correlated with the number of matches found by comparison to sequence databases. A sequence with more information (higher uncertainty) has a higher probability of being significantly similar to other sequences in the database. Measuring uncertainty may be used for rapid screening for sequences with matches in available database, prioritizing computational resources, and indicating which sequences with no known similarities are likely to be important for more detailed analysis.
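The word-based Shannon index described here fits in a few lines. The sequences below are toy examples, not from the study's data:

```python
import math
from collections import Counter

def shannon_index(seq, k):
    """Shannon entropy (bits) of the k-mer ("word") distribution of seq."""
    words = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    n = len(words)
    return -sum(c / n * math.log2(c / n) for c in Counter(words).values())

# A repetitive sequence carries less information per word than a mixed one.
repetitive = "ATATATATATATATAT"
mixed      = "ATGCGTACGTTAGCCA"
low, high = shannon_index(repetitive, 2), shannon_index(mixed, 2)
```

Varying `k` changes the index, consistent with the word-size dependence the abstract reports; GC content and length enter through the word distribution in the same way.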

  13. Information theory and signal transduction systems: from molecular information processing to network inference.

    Science.gov (United States)

    Mc Mahon, Siobhan S; Sim, Aaron; Filippi, Sarah; Johnson, Robert; Liepe, Juliane; Smith, Dominic; Stumpf, Michael P H

    2014-11-01

    Sensing and responding to the environment are two essential functions that all biological organisms need to master for survival and successful reproduction. Developmental processes are marshalled by a diverse set of signalling and control systems, ranging from systems with simple chemical inputs and outputs to complex molecular and cellular networks with non-linear dynamics. Information theory provides a powerful and convenient framework in which such systems can be studied; but it also provides the means to reconstruct the structure and dynamics of molecular interaction networks underlying physiological and developmental processes. Here we supply a brief description of its basic concepts and introduce some useful tools for systems and developmental biologists. Along with a brief but thorough theoretical primer, we demonstrate the wide applicability and biological application-specific nuances by way of different illustrative vignettes. In particular, we focus on the characterisation of biological information processing efficiency, examining cell-fate decision making processes, gene regulatory network reconstruction, and efficient signal transduction experimental design. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Cognitive Effects of Mindfulness Training: Results of a Pilot Study Based on a Theory Driven Approach.

    Science.gov (United States)

    Wimmer, Lena; Bellingrath, Silja; von Stockhausen, Lisa

    2016-01-01

    The present paper reports a pilot study which tested cognitive effects of mindfulness practice in a theory-driven approach. Thirty-four fifth graders received either a mindfulness training which was based on the mindfulness-based stress reduction approach (experimental group), a concentration training (active control group), or no treatment (passive control group). Based on the operational definition of mindfulness by Bishop et al. (2004), effects on sustained attention, cognitive flexibility, cognitive inhibition, and data-driven as opposed to schema-based information processing were predicted. These abilities were assessed in a pre-post design by means of a vigilance test, a reversible figures test, the Wisconsin Card Sorting Test, a Stroop test, a visual search task, and a recognition task of prototypical faces. Results suggest that the mindfulness training specifically improved cognitive inhibition and data-driven information processing.

  15. Cognitive effects of mindfulness training: Results of a pilot study based on a theory driven approach

    Directory of Open Access Journals (Sweden)

    Lena Wimmer

    2016-07-01

The present paper reports a pilot study which tested cognitive effects of mindfulness practice in a theory-driven approach. Thirty-four fifth graders received either a mindfulness training which was based on the mindfulness-based stress reduction approach (experimental group), a concentration training (active control group), or no treatment (passive control group). Based on the operational definition of mindfulness by Bishop et al. (2004), effects on sustained attention, cognitive flexibility, cognitive inhibition and data-driven as opposed to schema-based information processing were predicted. These abilities were assessed in a pre-post design by means of a vigilance test, a reversible figures test, the Wisconsin Card Sorting Test, a Stroop test, a visual search task, and a recognition task of prototypical faces. Results suggest that the mindfulness training specifically improved cognitive inhibition and data-driven information processing.

  16. A novel string field theory solving string theory by liberating left and right movers

    International Nuclear Information System (INIS)

    Nielsen, Holger B.; Ninomiya, Masao

    2014-01-01

We put forward ideas for a novel string field theory based on making some “objects” that essentially describe “liberated” left- and right-mover fields X_L^μ(τ+σ) and X_R^μ(τ−σ) on the string. Our novel string field theory is completely different from any other string theory, in so far as a “null set” of information in the string field theory Fock space has been removed relative to the usual string field theories. So our theory is definitely new. The main progress is that we manage to make our novel string field theory provide the correct mass square spectrum for the string. We finally suggest how to obtain the Veneziano amplitude in our model.

  17. Mobile applications for weight management: theory-based content analysis.

    Science.gov (United States)

    Azar, Kristen M J; Lesser, Lenard I; Laing, Brian Y; Stephens, Janna; Aurora, Magi S; Burke, Lora E; Palaniappan, Latha P

    2013-11-01

    The use of smartphone applications (apps) to assist with weight management is increasingly prevalent, but the quality of these apps is not well characterized. The goal of the study was to evaluate diet/nutrition and anthropometric tracking apps based on incorporation of features consistent with theories of behavior change. A comparative, descriptive assessment was conducted of the top-rated free apps in the Health and Fitness category available in the iTunes App Store. Health and Fitness apps (N=200) were evaluated using predetermined inclusion/exclusion criteria and categorized based on commonality in functionality, features, and developer description. Four researchers then evaluated the two most popular apps in each category using two instruments: one based on traditional behavioral theory (score range: 0-100) and the other on the Fogg Behavioral Model (score range: 0-6). Data collection and analysis occurred in November 2012. Eligible apps (n=23) were divided into five categories: (1) diet tracking; (2) healthy cooking; (3) weight/anthropometric tracking; (4) grocery decision making; and (5) restaurant decision making. The mean behavioral theory score was 8.1 (SD=4.2); the mean persuasive technology score was 1.9 (SD=1.7). The top-rated app on both scales was Lose It! by Fitnow Inc. All apps received low overall scores for inclusion of behavioral theory-based strategies. © 2013 American Journal of Preventive Medicine.

  18. Informational Closed-Loop Coding-Decoding Control Concept as the Base of the Living or Organized Systems Theory

    Science.gov (United States)

    Kirvelis, Dobilas; Beitas, Kastytis

    2008-10-01

The aim of this work is to show that the essence of life and living systems is their organization as bioinformational technology on the basis of informational anticipatory control. Principal paradigmatic and structural schemes of the functional organization of life (organisms and their systems) are constructed on the basis of systemic analysis and synthesis of the main phenomenological features of the living world. Life is based on functional elements that implement engineering procedures of closed-loop coding-decoding control (CL-CDC). The phenomenon of natural bioinformational control appeared and developed on Earth 3-4 billion years ago, when life originated as a result of chemical and, later, biological evolution. The informatics paradigm considers the physical and chemical transformations of energy and matter in organized systems as flows that are controlled, and the signals as means for purposive informational control programs. Social and technical technological systems, as informational control systems, are a later phenomenon engineered by man. Information emerges in organized systems as a necessary component of control technology. Generalized schemes of functional organization at the levels of the cell, the organism, and the brain neocortex, as the highest biosystem with CL-CDC, are presented. The CL-CDC concept expands the understanding of bioinformatics.
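The closed-loop coding-decoding idea can be caricatured as a control loop whose feedback path passes through an information channel: a sensor codes the state into a coarse symbol and the controller decodes it before acting. This is my own minimal illustration, not the authors' scheme:

```python
# Quantizing "coder": state -> symbol (a lossy information channel).
def code(x, step=0.5):
    return round(x / step)

# "Decoder": symbol -> estimated state used by the controller.
def decode(symbol, step=0.5):
    return symbol * step

target, x = 10.0, 0.0
for _ in range(100):
    estimate = decode(code(x))         # feedback travels as coded information
    x += 0.2 * (target - estimate)     # corrective action closes the loop
```

The loop settles only to within the quantization step of the target, showing how the coding stage limits what the control can achieve.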

  19. Informal information for web-based engineering catalogues

    Science.gov (United States)

    Allen, Richard D.; Culley, Stephen J.; Hicks, Ben J.

    2001-10-01

Success is highly dependent on the ability of a company to efficiently produce optimal designs. In order to achieve this, companies must minimize time to market and possess the ability to make fully informed decisions in the early phase of the design process. Such decisions may include the choice of component and suppliers, as well as cost and maintenance considerations. Computer modeling and electronic catalogues are becoming the preferred medium for the selection and design of mechanical components. In utilizing these techniques, the designer demands the capability to identify, evaluate and select mechanical components both quantitatively and qualitatively. Quantitative decisions generally encompass performance data included in the formal catalogue representation. It is in the area of qualitative decisions that the use of what the authors call 'Informal Information' is of crucial importance. Thus, 'Informal Information' must often be incorporated into the selection process and selection systems. This would enable more informed decisions to be made more quickly, without the need for information retrieval via discussion with colleagues in the design environment. This paper provides an overview of the use of electronic information in the design of mechanical systems, including a discussion of the limitations of current technology. The importance of Informal Information is discussed and the requirements for association with web-based electronic catalogues are developed. The system is based on a flexible XML schema and enables the storage, classification and recall of Informal Information packets. Furthermore, a strategy for the inclusion of Informal Information is proposed, and an example case is used to illustrate the benefits.
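The paper does not publish its XML schema, so the packet layout below is entirely invented; it only illustrates the storage-classification-recall cycle for an Informal Information packet using the standard library:

```python
import xml.etree.ElementTree as ET

# Hypothetical packet: every element and attribute name here is made up.
packet = ET.Element("informal_info", attrib={"component": "deep-groove bearing"})
ET.SubElement(packet, "source").text = "design-team discussion"
ET.SubElement(packet, "classification").text = "maintenance"
ET.SubElement(packet, "note").text = "Supplier lead times tend to slip in Q4."

# Storage: serialise the packet; recall: parse it back and query by tag.
xml_text = ET.tostring(packet, encoding="unicode")
recalled = ET.fromstring(xml_text).findtext("classification")
```

A catalogue front end could attach such packets to component records, so qualitative notes travel alongside the formal performance data.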

  20. Applied research of quantum information based on linear optics

    International Nuclear Information System (INIS)

    Xu, Xiao-Ye

    2016-01-01

    This thesis reports on outstanding work in two main subfields of quantum information science: one involves the quantum measurement problem, and the other concerns quantum simulation. The thesis proposes using a polarization-based displaced Sagnac-type interferometer to achieve partial collapse measurement and its reversal, and presents the first experimental verification of the nonlocality of the partial collapse measurement and its reversal. All of the experiments are carried out in the linear optical system, one of the earliest experimental systems to employ quantum communication and quantum information processing. The thesis argues that quantum measurement can yield quantum entanglement recovery, which is demonstrated by using the frequency freedom to simulate the environment. Based on the weak measurement theory, the author proposes that white light can be used to precisely estimate phase, and effectively demonstrates that the imaginary part of the weak value can be introduced by means of weak measurement evolution. Lastly, a nine-order polarization-based displaced Sagnac-type interferometer employing bulk optics is constructed to perform quantum simulation of the Landau-Zener evolution, and by tuning the system Hamiltonian, the first experiment to research the Kibble-Zurek mechanism in non-equilibrium kinetics processes is carried out in the linear optical system.

  1. Applied research of quantum information based on linear optics

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Xiao-Ye

    2016-08-01

    This thesis reports on outstanding work in two main subfields of quantum information science: one involves the quantum measurement problem, and the other concerns quantum simulation. The thesis proposes using a polarization-based displaced Sagnac-type interferometer to achieve partial collapse measurement and its reversal, and presents the first experimental verification of the nonlocality of the partial collapse measurement and its reversal. All of the experiments are carried out in the linear optical system, one of the earliest experimental systems to employ quantum communication and quantum information processing. The thesis argues that quantum measurement can yield quantum entanglement recovery, which is demonstrated by using the frequency freedom to simulate the environment. Based on the weak measurement theory, the author proposes that white light can be used to precisely estimate phase, and effectively demonstrates that the imaginary part of the weak value can be introduced by means of weak measurement evolution. Lastly, a nine-order polarization-based displaced Sagnac-type interferometer employing bulk optics is constructed to perform quantum simulation of the Landau-Zener evolution, and by tuning the system Hamiltonian, the first experiment to research the Kibble-Zurek mechanism in non-equilibrium kinetics processes is carried out in the linear optical system.

  2. Activity-Based Design as a Way to Bridge Artifacts, Professions, and Theories

    DEFF Research Database (Denmark)

    Brynskov, Martin

    2007-01-01

This paper will focus on the challenges in designing pervasive computing technology for children’s play, taking into account current trends in popular culture. In search of theoretical support for this work I have been exploring an activity-based approach called ‘habitats’ to describe the conditions for play. The habitat types – among them informational and pragmatic – together with the ability to describe their relations are a useful platform for practitioners and theorists who are forced to span a heterogeneous mash-up of technologies, theories, and professions.

  3. Evaluating accounting information systems that support multiple GAAP reporting using Normalized Systems Theory

    NARCIS (Netherlands)

    Vanhoof, E.; Huysmans, P.; Aerts, Walter; Verelst, J.; Aveiro, D.; Tribolet, J.; Gouveia, D.

    2014-01-01

    This paper uses a mixed methods approach of design science and case study research to evaluate structures of Accounting Information Systems (AIS) that report in multiple Generally Accepted Accounting Principles (GAAP), using Normalized Systems Theory (NST). To comply with regulation, many companies

  4. A four stage approach for ontology-based health information system design.

    Science.gov (United States)

    Kuziemsky, Craig E; Lau, Francis

    2010-11-01

    To describe and illustrate a four stage methodological approach to capture user knowledge in a biomedical domain area, use that knowledge to design an ontology, and then implement and evaluate the ontology as a health information system (HIS). A hybrid participatory design-grounded theory (GT-PD) method was used to obtain data and code them for ontology development. Prototyping was used to implement the ontology as a computer-based tool. Usability testing evaluated the computer-based tool. An empirically derived domain ontology and set of three problem-solving approaches were developed as a formalized model of the concepts and categories from the GT coding. The ontology and problem-solving approaches were used to design and implement a HIS that tested favorably in usability testing. The four stage approach illustrated in this paper is useful for designing and implementing an ontology as the basis for a HIS. The approach extends existing ontology development methodologies by providing an empirical basis for theory incorporated into ontology design. Copyright © 2010 Elsevier B.V. All rights reserved.

  5. Generating information-rich high-throughput experimental materials genomes using functional clustering via multitree genetic programming and information theory.

    Science.gov (United States)

    Suram, Santosh K; Haber, Joel A; Jin, Jian; Gregoire, John M

    2015-04-13

High-throughput experimental methodologies are capable of synthesizing, screening and characterizing vast arrays of combinatorial material libraries at a very rapid rate. These methodologies strategically employ tiered screening wherein the number of compositions screened decreases as the complexity, and very often the scientific information obtained from a screening experiment, increases. The algorithm used for down-selection of samples from a higher-throughput screening experiment to a lower-throughput one is vital in achieving information-rich experimental materials genomes. The fundamental science of materials discovery lies in the establishment of composition-structure-property relationships, motivating the development of advanced down-selection algorithms which consider the information value of the selected compositions, as opposed to simply selecting the best-performing compositions from a high-throughput experiment. Identification of property fields (composition regions with distinct composition-property relationships) in high-throughput data enables down-selection algorithms to employ advanced selection strategies, such as the selection of representative compositions from each field or of compositions that span the composition space of the highest-performing field. Such strategies would greatly enhance the generation of data-driven discoveries. We introduce an informatics-based clustering of composition-property functional relationships using a combination of information theory and multitree genetic programming concepts for identification of property fields in a composition library. We demonstrate our approach using a complex synthetic composition-property map for a 5 at.% step ternary library consisting of four distinct property fields, and finally explore the application of this methodology for capturing relationships between composition and catalytic activity for the oxygen evolution reaction for 5429 catalyst compositions in a …
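The information-theoretic ingredient of such functional clustering can be illustrated in isolation: the mutual information (in bits) between discretized composition bins and property bins. This toy version shows only the entropy computation, not the multitree genetic programming the authors combine it with:

```python
# Mutual information between discretized composition bins and property bins,
# computed from raw (composition_bin, property_bin) pairs. Toy data only;
# the paper's algorithm couples such measures with genetic programming.
from collections import Counter
from math import log2

def mutual_information(pairs):
    n = len(pairs)
    joint = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in joint.items())

# Perfectly dependent composition/property bins carry 1 bit of information:
pairs = [(0, "low"), (0, "low"), (1, "high"), (1, "high")]
print(mutual_information(pairs))  # 1.0
```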

  6. Condition Evaluation of Storage Equipment Based on Improved D-S Evidence Theory

    Directory of Open Access Journals (Sweden)

    Zhang Xiao-yu

    2017-01-01

Assessment and prediction of storage equipment condition has always been a difficult aspect of PHM technology. Current condition evaluation of equipment lacks defined state levels, and a single test datum cannot reflect changes in the equipment's state. To solve this problem, this paper proposes an evaluation method based on improved D-S evidence theory. Firstly, the analytic hierarchy process (AHP) is used to establish a hierarchical structure model of the equipment and to divide the qualified state into four grades. Then the test data are compared with the last test value, the historical test mean value and the standard value, respectively. A triangular fuzzy function is used to calculate the index membership degrees, and D-S evidence theory is applied to fuse the information from multiple sources, achieving real-time state assessment of such equipment. Finally, the model is applied to a servo mechanism. The results show that this method performs well in condition evaluation for storage equipment.
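The triangular fuzzy function used to grade an index value can be sketched directly; the breakpoints (a, b, c) below are made-up numbers, not the paper's calibration:

```python
# Triangular membership function: degree to which a test value x belongs to
# a fuzzy state grade peaking at b, with support (a, c). Breakpoints are
# illustrative, not taken from the paper.
def triangular(x, a, b, c):
    """Membership degree of x in a triangular fuzzy set peaking at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# A test value right at the peak belongs fully to the grade:
print(triangular(5.0, 2.0, 5.0, 8.0))  # 1.0
print(triangular(6.5, 2.0, 5.0, 8.0))  # 0.5
```

Degrees like these, computed per index, are what D-S combination then fuses into a single state assessment.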

  7. The Theory-based Influence of Map Features on Risk Beliefs: Self-reports of What is Seen and Understood for Maps Depicting an Environmental Health Hazard

    OpenAIRE

    Severtson, Dolores J.; Vatovec, Christine

    2012-01-01

    Theory-based research is needed to understand how maps of environmental health risk information influence risk beliefs and protective behavior. Using theoretical concepts from multiple fields of study including visual cognition, semiotics, health behavior, and learning and memory supports a comprehensive assessment of this influence. We report results from thirteen cognitive interviews that provide theory-based insights into how visual features influenced what participants saw ...

  8. Game Theory Based Security in Wireless Body Area Network with Stackelberg Security Equilibrium.

    Science.gov (United States)

    Somasundaram, M; Sivakumar, R

    2015-01-01

    Wireless Body Area Network (WBAN) is effectively used in healthcare to increase the value of the patient's life and also the value of healthcare services. The biosensor based approach in medical care system makes it difficult to respond to the patients with minimal response time. The medical care unit does not deploy the accessing of ubiquitous broadband connections full time and hence the level of security will not be high always. The security issue also arises in monitoring the user body function records. Most of the systems on the Wireless Body Area Network are not effective in facing the security deployment issues. To access the patient's information with higher security on WBAN, Game Theory with Stackelberg Security Equilibrium (GTSSE) is proposed in this paper. GTSSE mechanism takes all the players into account. The patients are monitored by placing the power position authority initially. The position authority in GTSSE is the organizer and all the other players react to the organizer decision. Based on our proposed approach, experiment has been conducted on factors such as security ratio based on patient's health information, system flexibility level, energy consumption rate, and information loss rate. Stackelberg Security considerably improves the strength of solution with higher security.

  9. Game Theory Based Security in Wireless Body Area Network with Stackelberg Security Equilibrium

    Science.gov (United States)

    Somasundaram, M.; Sivakumar, R.

    2015-01-01

    Wireless Body Area Network (WBAN) is effectively used in healthcare to increase the value of the patient's life and also the value of healthcare services. The biosensor based approach in medical care system makes it difficult to respond to the patients with minimal response time. The medical care unit does not deploy the accessing of ubiquitous broadband connections full time and hence the level of security will not be high always. The security issue also arises in monitoring the user body function records. Most of the systems on the Wireless Body Area Network are not effective in facing the security deployment issues. To access the patient's information with higher security on WBAN, Game Theory with Stackelberg Security Equilibrium (GTSSE) is proposed in this paper. GTSSE mechanism takes all the players into account. The patients are monitored by placing the power position authority initially. The position authority in GTSSE is the organizer and all the other players react to the organizer decision. Based on our proposed approach, experiment has been conducted on factors such as security ratio based on patient's health information, system flexibility level, energy consumption rate, and information loss rate. Stackelberg Security considerably improves the strength of solution with higher security. PMID:26759829

  10. 'Sustaining Place' - a grounded theory of how informal carers of people with dementia manage alterations to relationships within their social worlds.

    Science.gov (United States)

    Daly, Louise; McCarron, Mary; Higgins, Agnes; McCallion, Philip

    2013-02-01

This paper presents a theory explaining the processes used by informal carers of people with dementia to manage alterations to their own, and the person with dementia's, relationships with and places within their social worlds. Informal carers provide the majority of care to people with dementia. A great deal of international informal dementia care research is available, much of which elucidates the content, impacts and consequences of the informal caring role and the coping mechanisms that carers use. However, the socially situated experiences and processes integral to informal caring in dementia have not yet been robustly accounted for. A classic grounded theory approach was used as it is designed for research enquiries that aim to generate theory illustrating social patterns of action used to address an identified problem. Thirty interviews were conducted with 31 participants between 2006-2008. The theory was conceptualised from the data using the concurrent methods of theoretical sampling, constant comparative analysis, memo writing and theoretical sensitivity. Informal carers' main concern was identified as 'Living on the fringes', which was stimulated by dementia-related stigma and living a different life. The theory of 'Sustaining Place' explains the social pattern of actions employed by informal carers to manage this problem on behalf of themselves and the person with dementia. The theory of 'Sustaining Place' identifies an imperative for nurses, other formal carers and society to engage in actions to support and enable social connectedness, social inclusion and citizenship for informal carers and people with dementia. 'Sustaining Place' facilitates enhanced understanding of the complex and socially situated nature of informal dementia care through its portrayal of informal carers as social agents and can be used to guide nurses to better support those who live with dementia. © 2012 Blackwell Publishing Ltd.

  11. Support vector machines optimization based theory, algorithms, and extensions

    CERN Document Server

    Deng, Naiyang; Zhang, Chunhua

    2013-01-01

Support Vector Machines: Optimization Based Theory, Algorithms, and Extensions presents an accessible treatment of the two main components of support vector machines (SVMs) – classification problems and regression problems. The book emphasizes the close connection between optimization theory and SVMs, since optimization is one of the pillars on which SVMs are built. The authors share insight on many of their research achievements. They give a precise interpretation of statistical learning theory for C-support vector classification. They also discuss regularized twi
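The optimization-SVM connection the book emphasizes can be seen in miniature: a linear soft-margin SVM is just subgradient descent on the regularized hinge loss. The toy data and hyperparameters below are illustrative only, not the book's algorithms:

```python
# Minimal linear SVM trained by subgradient descent on the regularized hinge
# loss -- the optimization problem underlying C-support vector
# classification. Data and hyperparameters are made up for illustration.
data = [((2.0, 2.0), 1), ((3.0, 1.5), 1),
        ((-2.0, -1.0), -1), ((-1.5, -3.0), -1)]
w, b = [0.0, 0.0], 0.0
lam, lr = 0.01, 0.1          # regularization strength, learning rate

for _ in range(200):
    for (x1, x2), y in data:
        if y * (w[0] * x1 + w[1] * x2 + b) < 1:   # margin violated: hinge active
            w[0] += lr * (y * x1 - lam * w[0])
            w[1] += lr * (y * x2 - lam * w[1])
            b += lr * y
        else:                                      # margin satisfied: only shrink w
            w[0] -= lr * lam * w[0]
            w[1] -= lr * lam * w[1]

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b >= 0 else -1

print([predict(x1, x2) == y for (x1, x2), y in data])  # [True, True, True, True]
```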

  12. Properties of hypothesis testing techniques and (Bayesian) model selection for exploration-based and theory-based (order-restricted) hypotheses.

    Science.gov (United States)

    Kuiper, Rebecca M; Nederhoff, Tim; Klugkist, Irene

    2015-05-01

    In this paper, the performance of six types of techniques for comparisons of means is examined. These six emerge from the distinction between the method employed (hypothesis testing, model selection using information criteria, or Bayesian model selection) and the set of hypotheses that is investigated (a classical, exploration-based set of hypotheses containing equality constraints on the means, or a theory-based limited set of hypotheses with equality and/or order restrictions). A simulation study is conducted to examine the performance of these techniques. We demonstrate that, if one has specific, a priori specified hypotheses, confirmation (i.e., investigating theory-based hypotheses) has advantages over exploration (i.e., examining all possible equality-constrained hypotheses). Furthermore, examining reasonable order-restricted hypotheses has more power to detect the true effect/non-null hypothesis than evaluating only equality restrictions. Additionally, when investigating more than one theory-based hypothesis, model selection is preferred over hypothesis testing. Because of the first two results, we further examine the techniques that are able to evaluate order restrictions in a confirmatory fashion by examining their performance when the homogeneity of variance assumption is violated. Results show that the techniques are robust to heterogeneity when the sample sizes are equal. When the sample sizes are unequal, the performance is affected by heterogeneity. The size and direction of the deviations from the baseline, where there is no heterogeneity, depend on the effect size (of the means) and on the trend in the group variances with respect to the ordering of the group sizes. Importantly, the deviations are less pronounced when the group variances and sizes exhibit the same trend (e.g., are both increasing with group number). © 2014 The British Psychological Society.
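"Model selection using information criteria" can be made concrete with a tiny example: AIC computed from residual sums of squares under Gaussian errors, comparing a one-mean model against a two-group-means model. The data are made up:

```python
# Toy model selection with an information criterion: AIC from residual sums
# of squares (Gaussian errors), one common mean vs one mean per group.
# Data are fabricated so the group difference is real.
from math import log

g1, g2 = [1.0, 2.0, 3.0], [7.0, 8.0, 9.0]
both = g1 + g2
n = len(both)

def rss(values, mean):
    return sum((v - mean) ** 2 for v in values)

def aic(rss_value, k):                 # k = number of fitted mean parameters
    return n * log(rss_value / n) + 2 * k

aic_one = aic(rss(both, sum(both) / n), 1)
aic_two = aic(rss(g1, sum(g1) / 3) + rss(g2, sum(g2) / 3), 2)
print(aic_two < aic_one)  # True: the group-difference model is preferred
```

The two-mean model pays a penalty of 2 for its extra parameter but reduces the residual sum of squares enough to win.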

  13. Introduction to the theory of bases

    CERN Document Server

    Marti, Jürg T

    1969-01-01

Since the publication of Banach's treatise on the theory of linear operators, the literature on the theory of bases in topological vector spaces has grown enormously. Much of this literature has for its origin a question raised in Banach's book, the question whether every separable Banach space possesses a basis or not. The notion of a basis employed here is a generalization of that of a Hamel basis for a finite dimensional vector space. For a vector space X of infinite dimension, the concept of a basis is closely related to the convergence of the series which uniquely correspond to each point of X. Thus there are different types of bases for X, according to the topology imposed on X and the chosen type of convergence for the series. Although almost four decades have elapsed since Banach's query, the conjectured existence of a basis for every separable Banach space is not yet proved. On the other hand, no counter examples have been found to show the existence of a special Banach space having no basis. Howe...

  14. Increasing Bellevue School District's elementary teachers' capacity for teaching inquiry-based science: Using ideas from contemporary learning theory to inform professional development

    Science.gov (United States)

    Maury, Tracy Anne

    This Capstone project examined how leaders in the Bellevue School District can increase elementary teachers' capacity for teaching inquiry-based science through the use of professional learning activities that are grounded in ideas from human learning theory. A framework for professional development was constructed and from that framework, a set of professional learning activities were developed as a means to support teacher learning while project participants piloted new curriculum called the Isopod Habitat Challenge. Teachers in the project increased their understanding of the learning theory principles of preconceptions and metacognition. Teachers did not increase their understanding of the principle of learning with understanding, although they did articulate the significance of engaging children in student-led inquiry cycles. Data from the curriculum revision and professional development project coupled with ideas from learning theory, cognition and policy implementation, and learning community literatures suggest Bellevue's leaders can encourage peer-to-peer interaction, link professional development to teachers' daily practice, and capitalize on technology as ways to increase elementary teachers' capacity for teaching inquiry-based science. These lessons also have significance for supporting teacher learning and efficacy in other subject areas and at other levels in the system.

  15. FUSION SEGMENTATION METHOD BASED ON FUZZY THEORY FOR COLOR IMAGES

    Directory of Open Access Journals (Sweden)

    J. Zhao

    2017-09-01

The image segmentation method based on the two-dimensional histogram segments an image according to thresholds on the intensity of the target pixel and the average intensity of its neighborhood. This method is essentially a hard-decision method. Owing to the uncertainty in labeling pixels around the threshold, a hard-decision method can easily produce wrong segmentation results. Therefore, a fusion segmentation method based on fuzzy theory is proposed in this paper. We use membership functions to model the uncertainties on each color channel of the color image, and then segment the color image by fuzzy reasoning. The experimental results show that the proposed method obtains better segmentation results on both natural scene images and optical remote sensing images compared with the traditional thresholding method. The fusion method in this paper can provide new ideas for information extraction from optical remote sensing images and polarimetric SAR images.

  16. Nano-resonator frequency response based on strain gradient theory

    International Nuclear Information System (INIS)

    Miandoab, Ehsan Maani; Yousefi-Koma, Aghil; Pishkenari, Hossein Nejat; Fathi, Mohammad

    2014-01-01

    This paper aims to explore the dynamic behaviour of a nano-resonator under ac and dc excitation using strain gradient theory. To achieve this goal, the partial differential equation of nano-beam vibration is first converted to an ordinary differential equation by the Galerkin projection method and the lumped model is derived. Lumped parameters of the nano-resonator, such as linear and nonlinear springs and damper coefficients, are compared with those of classical theory and it is demonstrated that beams with smaller thickness display greater deviation from classical parameters. Stable and unstable equilibrium points based on classic and non-classical theories are also compared. The results show that, regarding the applied dc voltage, the dynamic behaviours expected by classical and non-classical theories are significantly different, such that one theory predicts the un-deformed shape as the stable condition, while the other theory predicts that the beam will experience bi-stability. To obtain the frequency response of the nano-resonator, a general equation including cubic and quadratic nonlinearities in addition to parametric electrostatic excitation terms is derived, and the analytical solution is determined using a second-order multiple scales method. Based on frequency response analysis, the softening and hardening effects given by two theories are investigated and compared, and it is observed that neglecting the size effect can lead to two completely different predictions in the dynamic behaviour of the resonators. The findings of this article can be helpful in the design and characterization of the size-dependent dynamic behaviour of resonators on small scales. (paper)

  17. Whatever the cost? Information integration in memory-based inferences depends on cognitive effort.

    Science.gov (United States)

    Hilbig, Benjamin E; Michalkiewicz, Martha; Castela, Marta; Pohl, Rüdiger F; Erdfelder, Edgar

    2015-05-01

    One of the most prominent models of probabilistic inferences from memory is the simple recognition heuristic (RH). The RH theory assumes that judgments are based on recognition in isolation, such that other information is ignored. However, some prior research has shown that available knowledge is not generally ignored. In line with the notion of adaptive strategy selection--and, thus, a trade-off between accuracy and effort--we hypothesized that information integration crucially depends on how easily accessible information beyond recognition is, how much confidence decision makers have in this information, and how (cognitively) costly it is to acquire it. In three experiments, we thus manipulated (a) the availability of information beyond recognition, (b) the subjective usefulness of this information, and (c) the cognitive costs associated with acquiring this information. In line with the predictions, we found that RH use decreased substantially, the more easily and confidently information beyond recognition could be integrated, and increased substantially with increasing cognitive costs.

  18. Evidence Combination From an Evolutionary Game Theory Perspective.

    Science.gov (United States)

    Deng, Xinyang; Han, Deqiang; Dezert, Jean; Deng, Yong; Shyr, Yu

    2016-09-01

Dempster-Shafer evidence theory is a primary methodology for multisource information fusion because it is good at dealing with uncertain information. This theory provides Dempster's rule of combination to synthesize multiple evidences from various information sources. However, in some cases, counter-intuitive results may be obtained based on that combination rule. Numerous new or improved methods have been proposed to suppress these counter-intuitive results based on perspectives such as minimizing the information loss or deviation. Inspired by evolutionary game theory, this paper considers a biological and evolutionary perspective to study the combination of evidences. An evolutionary combination rule (ECR) is proposed to help find the most biologically supported proposition in a multievidence system. Within the proposed ECR, we develop a Jaccard matrix game to formalize the interaction between propositions in evidences, and utilize the replicator dynamics to mimic the evolution of propositions. Experimental results show that the proposed ECR can effectively suppress the counter-intuitive behaviors appearing in typical paradoxes of evidence theory, compared with many existing methods. Properties of the ECR, such as the solution's stability and convergence, have been mathematically proved as well.
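Dempster's rule of combination, the baseline the ECR improves on, can be stated compactly: masses on intersecting focal elements multiply and accumulate, mass falling on the empty set is the conflict K, and the result is renormalized by 1-K. The mass values below are illustrative:

```python
# Dempster's rule of combination for two mass functions over the frame
# {A, B}, with frozensets as focal elements. Mass values are illustrative.
def dempster(m1, m2):
    combined, conflict = {}, 0.0
    for s1, v1 in m1.items():
        for s2, v2 in m2.items():
            inter = s1 & s2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2            # mass falling on the empty set
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

A, B, AB = frozenset("A"), frozenset("B"), frozenset("AB")
m1 = {A: 0.6, B: 0.3, AB: 0.1}
m2 = {A: 0.5, B: 0.2, AB: 0.3}
m = dempster(m1, m2)
print(round(m[A], 3))  # 0.726
```

The counter-intuitive cases the paper targets arise when the conflict K approaches 1 and the renormalization amplifies tiny residual agreements.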

  19. Improving a health information system for real-time data entries: An action research project using socio-technical systems theory.

    Science.gov (United States)

    Adaba, Godfried Bakiyem; Kebebew, Yohannes

    2018-03-01

    This paper presents the findings of an action research (AR) project to improve a health information system (HIS) at the Operating Theater Department (OTD) of a National Health Service (NHS) hospital in South East England, the UK. Informed by socio-technical systems (STS) theory, AR was used to design an intervention to enhance an existing patient administration system (PAS) to enable data entries in real time while contributing to the literature. The study analyzed qualitative data collected through interviews, participant observations, and document reviews. The study found that the design of the PAS was unsuitable to the work of the three units of the OTD. Based on the diagnoses and STS theory, the project developed and implemented a successful intervention to enhance the legacy system for data entries in real time. The study demonstrates the value of AR from a socio-technical perspective for improving existing systems in healthcare settings. The steps adopted in this study could be applied to improve similar systems. A follow-up study will be essential to assess the sustainability of the improved system.

  20. Labor Informality: General Causes

    Directory of Open Access Journals (Sweden)

    Gustavo Sandoval Betancour

    2016-04-01

The article examines the main causes of labor informality in order to verify the validity of classical theories that explain unemployment in market economies and its relationship to informality. Methodologically, the empirical part of the project was based on international statistics, comparing the evolution of labor market structure in a combined sample of highly industrialized countries and less industrialized ones. The empirical evidence supports the conclusion that classical economic theory of Marxist origin is insufficient to explain the causes of unemployment in contemporary market economies, and that it also fails to satisfactorily explain informality. On the contrary, we conclude that the theory in question is more relevant for explaining informality in centrally planned economies, where this phenomenon has been present even more significantly than in free market economies.

  1. Experimental status of unified theories

    International Nuclear Information System (INIS)

    Bilen'kij, S.M.

    1979-01-01

The standard SU(2)xU(1) theory is discussed. It is based on the assumption that the left-handed components of fields form doublets and the right-handed ones singlets. From the weak interaction Lagrangian, an expression is obtained for the effective Hamiltonian describing neutrino-lepton processes. The results of the discussion of the experimental status of the unified theories of weak and electromagnetic interactions are in agreement with the simplest version of the unified theories, the Weinberg-Salam theory. It is noted that the accuracy of the experiments (not exceeding 20%) is insufficient and that no information is available on the diagonal terms of the Hamiltonian.

  2. Students' Experience of Information Entrepreneurship: A Grounded Theory Study [Pengalaman Mahasiswa dalam Melakukan Wirausaha Informasi: Sebuah Penelitian Grounded Theory]

    Directory of Open Access Journals (Sweden)

    Afdini Rihlatul Mahmudah

    2016-11-01

The main focus of this paper is to explore students' understanding of information entrepreneurship. This is a qualitative study using grounded theory; the method was selected to suit the purpose of the research, namely to build a theory based on the views of respondents. Data were obtained through interviews, observation, and document review, and analyzed following a constructivist model of data analysis, in the form of data transcription and interpretation through open coding, axial coding, and selective coding. Grounded theory was chosen in order to understand, in depth, information entrepreneurship as practised mainly by students. The study also covers information entrepreneurship and information entrepreneurial activity from the perspective of students. The conclusion of this study is a definition of the information entrepreneur: work in the field of information that can be done independently or in groups, with flexibility in the time and place of work, in the form of information search activities, creating web and library systems, and consulting for libraries, teachers, and librarians, in order to make money, gain experience and knowledge, and develop social activities.

  3. An activity theory analysis of boundary objects in cross-border information systems development for disaster management

    NARCIS (Netherlands)

    Bharosa, N.; Lee, J.; Janssen, M.; Rao, H.R.

    2012-01-01

One of the main challenges in cross-border disaster management is the development and use of information systems that cater to the needs of heterogeneous relief agencies, policies, activities and cultures. Drawing upon activity theory, this paper examines cross-border information systems development

  4. Hand Vein Images Enhancement Based on Local Gray-level Information Histogram

    Directory of Open Access Journals (Sweden)

    Jun Wang

    2015-06-01

    Full Text Available Based on histogram equalization theory, this paper presents a novel concept of the histogram to realize contrast enhancement of hand vein images while avoiding the loss of topological vein structure and the introduction of fake vein information. Firstly, we propose the concept of the gray-level information histogram, whose fundamental characteristic is that the amplitudes of its components objectively reflect the contribution of the gray levels and their information to the representation of image content. Then, we propose a histogram equalization method composed of an automatic histogram separation module and an intensity transformation module: the separation module combines the proposed prompt multiple-threshold procedure with an optimum peak signal-to-noise ratio (PSNR) calculation to separate the histogram into small-scale details, and the intensity transformation module enhances the vein images for each generated sub-histogram while preserving the vein topological structure and gray information. Experimental results show that the proposed method achieves extremely good contrast enhancement.
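    The paper's gray-level information histogram is not fully specified in this record. As a point of reference, the conventional global histogram equalization the method builds on can be sketched as follows; the synthetic low-contrast image and gray range are illustrative assumptions, not data from the paper.

```python
import numpy as np

def equalize_histogram(image, levels=256):
    """Conventional global histogram equalization: map each gray level
    through the normalized cumulative histogram (CDF)."""
    hist, _ = np.histogram(image, bins=levels, range=(0, levels))
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()                      # first nonzero CDF value
    scale = (cdf - cdf_min) / (cdf[-1] - cdf_min) * (levels - 1)
    lut = np.clip(np.round(scale), 0, levels - 1).astype(np.uint8)
    return lut[image]

# Hypothetical low-contrast "vein" image: gray values clustered in [100, 140].
rng = np.random.default_rng(0)
img = rng.integers(100, 141, size=(64, 64)).astype(np.uint8)
out = equalize_histogram(img)
print(img.min(), img.max(), "->", out.min(), out.max())
```

    This global variant stretches the clustered range to the full gray scale; the paper's contribution is precisely to replace the plain frequency counts with information-weighted ones and equalize per sub-histogram.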

  5. Information theory applied to econophysics: stock market behaviors

    Science.gov (United States)

    Vogel, Eugenio E.; Saravia, Gonzalo

    2014-08-01

    The use of data-compressor techniques has made it possible to recognize magnetic transitions and their associated critical temperatures [E.E. Vogel, G. Saravia, V. Cortez, Physica A 391, 1591 (2012)]. In the present paper we introduce some new concepts associated with data recognition and extend the use of these techniques to econophysics, exploring the variations of stock-market indicators and showing that information theory can help to recognize different regimes. Modifications and further developments of the previously introduced data compressor wlzip are presented, yielding two measurements. Additionally, we introduce an algorithm that allows tuning the number of significant digits on which the data compression acts, complemented with an appropriate method to round off the truncation. The application is made to IPSA, the main indicator of the Chilean stock market, during the year 2010, chosen for the availability of quality data and also to consider a rare event: the earthquake of 27 February of that year, as of now the sixth strongest earthquake ever recorded by instruments (8.8 on the Richter scale) according to the United States Geological Survey. Along the year 2010 different regimes are recognized: calm days show larger compression than agitated days, allowing for classification and recognition. The focus then turns to selected days, showing that different regimes can be recognized from the data of the last hour (60 entries), allowing actions to be determined in a safer way. The "day of the week" effect is weakly present, but the "hour of the day" effect is clearly present; its causes and implications are discussed. This effect also establishes the influence of the Asian, European and American stock markets on the smaller Chilean stock market. Finally, dynamical studies are conducted to search for a system that can help detect sudden variations of the market in real time; information theory proves really helpful in this respect.
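    The wlzip compressor is the authors' own tool and is not reproduced in this record. The underlying idea, that a calm (regular) series compresses better than an agitated one, can be sketched with the standard zlib compressor as a stand-in; the two random-walk series and the digit count are synthetic assumptions.

```python
import zlib
import numpy as np

def compression_ratio(series, digits=2):
    """Compress a rounded, text-encoded series; a smaller ratio means more
    regular data. `digits` plays the role of tuning the number of significant
    digits (zlib stands in for the authors' wlzip compressor)."""
    text = ",".join(f"{x:.{digits}f}" for x in series).encode()
    return len(zlib.compress(text)) / len(text)

rng = np.random.default_rng(1)
calm = np.cumsum(rng.normal(0, 0.01, 1000)) + 100      # low-volatility index
agitated = np.cumsum(rng.normal(0, 1.0, 1000)) + 100   # high-volatility index

print(compression_ratio(calm), compression_ratio(agitated))
# the calm series compresses more (lower ratio) than the agitated one
```

    Classifying days by such a ratio is the essence of the regime recognition described above, although the paper's actual measurements come from wlzip on intraday IPSA data.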

  6. Commitment-based action: Rational choice theory and contrapreferential choice

    Directory of Open Access Journals (Sweden)

    Radovanović Bojana

    2014-01-01

    Full Text Available This paper focuses on Sen's concept of contrapreferential choice, which Sen developed in order to overcome weaknesses of rational choice theory. According to rational choice theory, a decision-maker can always be seen as someone who maximises utility, and each choice he makes as the one that brings him the highest level of personal wellbeing. Sen argues that in some situations we choose alternatives that bring us a lower level of wellbeing than we could achieve had we chosen some other alternative available to us. This happens when we base our decisions on moral principles, when we act out of duty; Sen calls such action commitment-based action. When we act out of commitment we actually neglect our preferences and thus, as Sen argues, make a contrapreferential choice. This paper shows that, contrary to Sen, commitment-based action can be explained within the framework of rational choice theory. However, when every choice we make can be explained within that framework, when the maximisation principle can be read into everything we do, then the variety of our motives and traits is lost, and the explanatory power of rational choice theory becomes questionable. [Projects of the Ministry of Science of the Republic of Serbia, no. 47009: European integration and socio-economic changes of the Serbian economy on the road to the EU, and no. 179015: Challenges and prospects of structural change in Serbia: Strategic directions of economic development and alignment with EU requirements]

  7. System parameter identification information criteria and algorithms

    CERN Document Server

    Chen, Badong; Hu, Jinchun; Principe, Jose C

    2013-01-01

    Recently, criterion functions based on information-theoretic measures (entropy, mutual information, information divergence) have attracted attention and become an emerging area of study in the signal processing and system identification domains. This book presents a systematic framework for system identification and information processing, investigating system identification from an information-theoretic point of view. The book is divided into six chapters, which cover the information needed to understand the theory and application of system parameter identification. The authors' research pr

  8. Grey-Theory-Based Optimization Model of Emergency Logistics Considering Time Uncertainty.

    Science.gov (United States)

    Qiu, Bao-Jian; Zhang, Jiang-Hua; Qi, Yuan-Tao; Liu, Yang

    2015-01-01

    Natural disasters have occurred frequently in recent years, causing huge casualties and property losses, and emergency logistics problems are receiving more and more attention. This paper studies an emergency logistics problem with multiple centers, multiple commodities, and a single affected point. Considering that paths near the disaster point may be damaged, that information on the state of the paths is incomplete, and that travel times are uncertain, we establish a nonlinear programming model whose objective function is the maximization of the time-satisfaction degree. To overcome the drawbacks of incomplete information and uncertain travel time, the paper first evaluates the multiple roads of the transportation network based on grey theory and selects the reliable and optimal path. The original model is then simplified under the scenario in which the vehicle only follows the optimal path from the emergency logistics center to the affected point, and is solved with the Lingo software. Numerical experiments show the feasibility and effectiveness of the proposed method.

  9. Aligning Theory and Design: The Development of an Online Learning Intervention to Teach Evidence-based Practice for Maximal Reach.

    Science.gov (United States)

    Delagran, Louise; Vihstadt, Corrie; Evans, Roni

    2015-09-01

    Online educational interventions to teach evidence-based practice (EBP) are a promising mechanism for overcoming some of the barriers to incorporating research into practice. However, attention must be paid to aligning strategies with adult learning theories to achieve optimal outcomes. We describe the development of a series of short self-study modules, each covering a small set of learning objectives. Our approach, informed by design-based research (DBR), involved 6 phases: analysis, design, design evaluation, redesign, development/implementation, and evaluation. Participants were faculty and students in 3 health programs at a complementary and integrative educational institution. We chose a reusable learning object approach that allowed us to apply 4 main learning theories: events of instruction, cognitive load, dual processing, and ARCS (attention, relevance, confidence, satisfaction). A formative design evaluation suggested that the identified theories and instructional approaches were likely to facilitate learning and motivation. Summative evaluation was based on a student survey (N=116) that addressed how these theories supported learning. Results suggest that, overall, the selected theories helped students learn. The DBR approach allowed us to evaluate the specific intervention and theories for general applicability. This process also helped us define and document the intervention at a level of detail that covers almost all the proposed Guideline for Reporting Evidence-based practice Educational intervention and Teaching (GREET) items. This thorough description will facilitate the interpretation of future research and implementation of the intervention. Our approach can also serve as a model for others considering online EBP intervention development.

  10. Towards evidence-based palliative care in nursing homes in Sweden: a qualitative study informed by the organizational readiness to change theory.

    Science.gov (United States)

    Nilsen, Per; Wallerstedt, Birgitta; Behm, Lina; Ahlström, Gerd

    2018-01-04

    Sweden has a policy of supporting older people to live a normal life at home for as long as possible. Therefore, it is often the oldest, most frail people who move into nursing homes. Nursing home staff are expected to meet the existential needs of the residents, yet conversations about death and dying tend to cause emotional strain. This study explores organizational readiness to implement palliative care based on evidence-based guidelines in nursing homes in Sweden. The aim was to identify barriers and facilitators to implementing evidence-based palliative care in nursing homes. Interviews were carried out with 20 managers from 20 nursing homes in two municipalities who had participated along with staff members in seminars aimed at conveying knowledge and skills of relevance for providing evidence-based palliative care. Two managers responsible for all elderly care in each municipality were also interviewed. The questions were informed by the theory of Organizational Readiness for Change (ORC). ORC was also used as a framework to analyze the data by means of categorizing barriers and facilitators for implementing evidence-based palliative care. Analysis of the data yielded ten factors (i.e., sub-categories) acting as facilitators and/or barriers. Four factors constituted barriers: the staff's beliefs in their capabilities to face dying residents, their attitudes to changes at work as well as the resources and time required. Five factors functioned as either facilitators or barriers because there was considerable variation with regard to the staff's competence and confidence, motivation, and attitudes to work in general, as well as the managers' plans and decisional latitude concerning efforts to develop evidence-based palliative care. Leadership was a facilitator to implementing evidence-based palliative care. 
There is a limited organizational readiness to develop evidence-based palliative care as a result of variation in the nursing home staff's change efficacy

  11. Enhancing Student Learning in Knowledge-Based Courses: Integrating Team-Based Learning in Mass Communication Theory Classes

    Science.gov (United States)

    Han, Gang; Newell, Jay

    2014-01-01

    This study explores the adoption of the team-based learning (TBL) method in knowledge-based and theory-oriented journalism and mass communication (J&MC) courses. It first reviews the origin and concept of TBL, the relevant theories, and then introduces the TBL method and implementation, including procedures and assessments, employed in an…

  12. Properties of some nonlinear Schroedinger equations motivated through information theory

    International Nuclear Information System (INIS)

    Yuan, Liew Ding; Parwani, Rajesh R

    2009-01-01

    We update our understanding of nonlinear Schroedinger equations motivated through information theory. In particular we show that a q-deformation of the basic nonlinear equation leads to a perturbative increase in the energy of a system, thus favouring the simplest q = 1 case. Furthermore the energy minimisation criterion is shown to be equivalent, at leading order, to an uncertainty maximisation argument. The special value η = 1/4 for the interpolation parameter, where leading order energy shifts vanish, implies the preservation of existing supersymmetry in nonlinearised supersymmetric quantum mechanics. Physically, η might be encoding relativistic effects.

  13. A Theory for Educational Research: Socialisation Theory and Symbolic Interaction

    Science.gov (United States)

    Potts, Anthony

    2015-01-01

    This article develops a theory of socialisation based on the Chicago School of symbolic interactionism but infused with new and important insights offered by contemporary scholars and their writings on roles and relationships in the twenty first century and life in the informational, network and global world. While still rooted in the seminal…

  14. Cyber Power Theory First, Then Information Operations

    National Research Council Canada - National Science Library

    Smart, Antoinette G

    2001-01-01

    ...) seems disconcerting, at least on the surface. Think tanks, government research organizations, and learned individuals have all pointed to the need for a viable theory of IO, yet no such theory has emerged...

  15. Energy information data base: subject thesaurus

    International Nuclear Information System (INIS)

    1979-10-01

    The technical staff of the DOE Technical Information Center, during its subject indexing activities, develops and structures a vocabulary that allows consistent machine storage and retrieval of information necessary to the accomplishment of the DOE mission. This thesaurus incorporates that structured vocabulary. The terminology of this thesaurus is used for the subject control of information announced in DOE Energy Research Abstracts, Energy Abstracts for Policy Analysis, Solar Energy Update, Geothermal Energy Update, Fossil Energy Update, Fusion Energy Update, and Energy Conservation Update. This terminology also facilitates subject searching of the DOE energy information data base, a research in progress data base, a general and practical energy information data base, power reactor docket information data base, nuclear science abstracts data base, and the federal energy information data base on the DOE on-line retrieval system, RECON. The rapid expansion of the DOE's activities will result in a concomitant thesaurus expansion as information relating to new activities is indexed. Only the terms used in the indexing of documents at the Technical Information Center to date are included

  16. Needs Adapted Data Presentation in e-Information Tools

    DEFF Research Database (Denmark)

    Bergenholtz, Henning; Bothma, Theo

    2011-01-01

    In the current debate about the status of lexicography there are at least three quite different opinions: 1. Lexicography does not have and does not need any kind of own theory but can use all relevant linguistic theories; 2. Lexicography needs a theory special for the lexicographical praxis, but...... needs in the information society partly using the function theory of lexicography. The paper will briefly address issues regarding information overload and information stress showing how commercial systems try to address this by means of relevance ranking based on system relevance....

  17. Deconstructing dementia and delirium hospital practice: using cultural historical activity theory to inform education approaches.

    Science.gov (United States)

    Teodorczuk, Andrew; Mukaetova-Ladinska, Elizabeta; Corbett, Sally; Welfare, Mark

    2015-08-01

    Older patients with dementia and delirium receive suboptimal hospital care. Policy calls for more effective education to address this, though there is little consensus on what this entails. The purpose of this clarification study is to explore how practice gaps are constructed in relation to managing the confused hospitalised older patient, with the intent of informing educational processes in the workplace beyond traditional approaches such as training. Adopting grounded theory as a research method and working within a social constructionist paradigm, we explored the practice gaps of 15 healthcare professionals by interview and conducted five focus groups with patients, carers and liaison mental health professionals. Data were thematically analysed by constant comparison, and theoretical sampling was undertaken until saturation was reached. Categories were identified and pragmatic concepts developed, grounded within the data. Findings were then further analysed using cultural historical activity theory as a deductive lens. Practice gaps in relation to managing the confused older patient are determined by factors operating at the individual (knowledge and skill gaps, personal philosophy, task-based practice), team (leadership, time and ward environmental factors) and organisational (power relationships, dominance of the medical model, fragmentation of care services) levels. Conceptually, practice appeared to be influenced by socio-cultural ward factors and compounded by a failure to join up existing "patient" knowledge amongst professionals. Applying cultural historical activity theory to further illuminate the findings, the central object is defined as learning about the patient and the mediating artifacts are the care relationships. The overarching medical dominance emerges as an important cultural historical factor at play, and staff rules and divisions of labour are exposed. Lastly, key contradictions and tensions in the system that work against learning about the patient are

  18. Bayesian networks and information theory for audio-visual perception modeling.

    Science.gov (United States)

    Besson, Patricia; Richiardi, Jonas; Bourdin, Christophe; Bringoux, Lionel; Mestre, Daniel R; Vercher, Jean-Louis

    2010-09-01

    Thanks to their different senses, human observers acquire multiple pieces of information from their environment, and complex cross-modal interactions occur during this perceptual process. This article proposes a framework to analyze and model these interactions through a rigorous and systematic data-driven process. This requires considering the general relationships between the physical events or factors involved in the process, not only in quantitative terms but also in terms of the influence of one factor on another. We use tools from information theory and probabilistic reasoning to derive relationships between the random variables of interest, where the central notion is that of conditional independence. Using mutual information analysis to guide the model elicitation process, a probabilistic causal model encoded as a Bayesian network is obtained. We exemplify the method on data collected in an audio-visual localization task with human subjects, and show that it yields a well-motivated model with good predictive ability. The model elicitation process offers new prospects for the investigation of the cognitive mechanisms of multisensory perception.
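    The article's elicitation pipeline and audio-visual dataset are not reproduced in this record. A minimal plug-in estimate of the mutual information that guides such an analysis, on synthetic paired samples, might look like this; the bin count, sample sizes, and variable names are assumptions.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in estimate of I(X;Y) in bits from paired samples via a 2-D
    histogram (a simple stand-in for the mutual information analysis that
    guides model elicitation)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)               # marginal of X
    py = pxy.sum(axis=0, keepdims=True)               # marginal of Y
    nz = pxy > 0                                      # avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
a = rng.normal(size=5000)
coupled = a + 0.3 * rng.normal(size=5000)             # dependent "modality"
independent = rng.normal(size=5000)                   # unrelated signal

print(mutual_information(a, coupled), mutual_information(a, independent))
```

    A large estimate between two variables argues against their conditional independence, which is exactly the kind of evidence used when adding or omitting edges in the Bayesian network.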

  19. Theoretical frameworks informing family-based child and adolescent obesity interventions

    DEFF Research Database (Denmark)

    Alulis, Sarah; Grabowski, Dan

    2017-01-01

    into focus. However, the use of theoretical frameworks to strengthen these interventions is rare and very uneven. OBJECTIVE AND METHOD: To conduct a qualitative meta-synthesis of family-based interventions for child and adolescent obesity to identify the theoretical frameworks applied, thus understanding how...... inconsistencies and a significant void between research results and health care practice. Based on the analysis, this article proposes three themes to be used as focus points when designing future interventions and when selecting theories for the development of solid, theory-based frameworks for application...... cognitive, self-efficacy and Family Systems Theory appeared most frequently. The remaining 24 were classified as theory-related as theoretical elements of self-monitoring; stimulus control, reinforcement and modelling were used. CONCLUSION: The designs of family-based interventions reveal numerous...

  20. The Modeling and Complexity of Dynamical Systems by Means of Computation and Information Theories

    Directory of Open Access Journals (Sweden)

    Robert Logozar

    2011-12-01

    Full Text Available We present the modeling of dynamical systems and the finding of their complexity indicators by the use of concepts from computation and information theories, within the framework of J. P. Crutchfield's theory of ε-machines. A short formal outline of the ε-machines is given. In this approach, dynamical systems are analyzed directly from the time series received from a properly adjusted measuring instrument. The binary strings are parsed through the parse tree, within which morphologically and probabilistically unique subtrees or morphs are recognized as system states. The outline and precise interrelation of the information-theoretic entropies and complexities emanating from the model are given. The paper also serves as a theoretical foundation for the future presentation of the DSA program that implements ε-machines modeling up to the stochastic finite automata level.

  1. Extending Theory-Based Quantitative Predictions to New Health Behaviors.

    Science.gov (United States)

    Brick, Leslie Ann D; Velicer, Wayne F; Redding, Colleen A; Rossi, Joseph S; Prochaska, James O

    2016-04-01

    Traditional null hypothesis significance testing suffers from many limitations and is poorly adapted to theory testing. A proposed alternative approach, called Testing Theory-based Quantitative Predictions, uses effect size estimates and confidence intervals to directly test predictions based on theory. This paper replicates findings from previous smoking studies and extends the approach to diet and sun protection behaviors using baseline data from a Transtheoretical Model behavioral intervention (N = 5407). Effect size predictions were developed using two methods: (1) applying refined effect size estimates from previous smoking research or (2) using predictions developed by an expert panel. Thirteen of 15 predictions were confirmed for smoking. For diet, 7 of 14 predictions were confirmed using the smoking-based predictions and 6 of 16 using the expert panel predictions. For sun protection, 3 of 11 predictions were confirmed using the smoking-based predictions and 5 of 19 using the expert panel predictions. Both the expert panel predictions and the smoking-based predictions predicted effect sizes poorly for the diet and sun protection constructs. Future studies should aim to use previous empirical data to generate predictions whenever possible. The best results occur when there have been several iterations of predictions for a behavior, as with smoking, demonstrating that expected values begin to converge on the population effect size. Overall, the study supports the necessity of strengthening and revising theory with empirical data.

  2. Application of the Theory of Constraints in Project Based Structures

    OpenAIRE

    Martynas Sarapinas; Vytautas Pranas Sūdžius

    2011-01-01

    The article deals with the application of the Theory of Constraints (TOC) in project management. It offers a short introduction to TOC as a project-management method and a deep analysis of project-management specifics using TOC: TOC-based project planning, timetable management, task synchronization, project control and the “relay runner work ethic”. Moreover, the article compares traditional and TOC-based project management theories and emphasizes the main be...

  3. Investigating the Learning-Theory Foundations of Game-Based Learning: A Meta-Analysis

    Science.gov (United States)

    Wu, W-H.; Hsiao, H-C.; Wu, P-L.; Lin, C-H.; Huang, S-H.

    2012-01-01

    Past studies on the issue of learning-theory foundations in game-based learning stressed the importance of establishing learning-theory foundation and provided an exploratory examination of established learning theories. However, we found research seldom addressed the development of the use or failure to use learning-theory foundations and…

  4. From Anakin Skywalker to Darth Vader: understanding «Star Wars» based on Theodore Millon´s theory of personality pathology

    Directory of Open Access Journals (Sweden)

    Lucas de Francisco CARVALHO

    2017-09-01

    Full Text Available The aim of this work was to investigate Anakin Skywalker (also known as Darth Vader) psychologically, using a non-systematic idiographic clinical analysis based on Theodore Millon's theory, which has solid theoretical and empirical bases for pathological personality traits and personality disorders. The character Anakin Skywalker lends himself to this analysis, since the films show fragments of his childhood, adolescence and adult life, making an analysis of his psychological development viable. Based on Millon's theory and the information from the movies, we conclude with a possible pathological personality functioning for the character.

  5. Information-theoretic metamodel of organizational evolution

    Science.gov (United States)

    Sepulveda, Alfredo

    2011-12-01

    Social organizations are abstractly modeled by holarchies---self-similar connected networks---and by intelligent complex adaptive multiagent systems---large networks of autonomous reasoning agents interacting via scaled processes. However, little is known of how information shapes evolution in such organizations, a gap that can lead to misleading analytics. The research problem addressed in this study was the ineffective manner in which classical model-predict-control methods used in business analytics attempt to describe organization evolution. The purpose of the study was to construct an effective metamodel for organization evolution based on a proposed complex adaptive structure---the info-holarchy. Theoretical foundations of this study were holarchies, complex adaptive systems, evolutionary theory, and quantum mechanics, among other recently developed physical and information theories. The research questions addressed how information evolution patterns gleaned from the study's inductive metamodel more aptly explain volatility in organizations. In this study, a hybrid grounded theory based on abstract inductive extensions of information theories was utilized as the research methodology. An overarching heuristic metamodel was framed from the theoretical analysis of the properties of these extension theories and applied to business, neural, and computational entities. This metamodel resulted in the synthesis of a metaphor for, and generalization of, organization evolution, serving as the recommended analytical tool for viewing business dynamics in future applications. This study may manifest positive social change through a fundamental understanding of complexity in business derived from general information theories, resulting in more effective management.

  6. Learning Theory Bases of Communicative Methodology and the Notional/Functional Syllabus

    OpenAIRE

    Jacqueline D., Beebe

    1992-01-01

    This paper examines the learning theories that underlie the philosophy and practices known as communicative language teaching methodology. These theories are identified first as a reaction against the behavioristic learning theory of audiolingualism. Approaches to syllabus design based on both the "weak" version of communicative language teaching-learning to use the second language-and the "strong" version-using the second language to learn it-are examined. The application of cognitive theory...

  7. Category Theory as a Formal Mathematical Foundation for Model-Based Systems Engineering

    KAUST Repository

    Mabrok, Mohamed; Ryan, Michael J.

    2017-01-01

    In this paper, we introduce Category Theory as a formal foundation for model-based systems engineering. A generalised view of the system based on category theory is presented, where any system can be considered as a category. The objects

  8. Human Capital as a Challenge for Economics Theory

    Directory of Open Access Journals (Sweden)

    Barbara Wyrzykowska

    2014-12-01

    Full Text Available The issue of human capital is increasingly attracting the attention of both theorists and practitioners, because at present human resources play a decisive role in the creation of competitive economies and business entities. Human capital and knowledge are becoming key factors in the area of entity competitiveness. Consequently, human capital is currently being analysed in a multi-faceted way in the context of numerous economic theories. The aim of this study is to summarize, analyse, and synthesise the information published on the subject of the theory of human capital and to present new theories and scientific paradigms. The theories presented in this study show that employees constitute the basic capital of modern organizations. One of the contemporary paradigms of modern management is the concept of knowledge-based economy and the paradigm of information technology. This article is based on literature studies and theoretical reflections of the author.

  9. Continuing bonds in bereavement: an attachment theory based perspective.

    Science.gov (United States)

    Field, Nigel P; Gao, Beryl; Paderna, Lisa

    2005-05-01

    An attachment theory based perspective on the continuing bond to the deceased (CB) is proposed. The value of attachment theory in specifying the normative course of CB expression and in identifying adaptive versus maladaptive variants of CB expression based on their deviation from this normative course is outlined. The role of individual differences in attachment security on effective versus ineffective use of CB in coping with bereavement also is addressed. Finally, the moderating influence of type of loss (e.g., death of a spouse vs. child), culture, and religion on type of CB expression within an overarching attachment framework is discussed.

  10. Information-Based Analysis of Data Assimilation (Invited)

    Science.gov (United States)

    Nearing, G. S.; Gupta, H. V.; Crow, W. T.; Gong, W.

    2013-12-01

    Data assimilation is defined as the Bayesian conditioning of uncertain model simulations on observations for the purpose of reducing uncertainty about model states. Practical data assimilation methods make the application of Bayes' law tractable either by employing assumptions about the prior, posterior and likelihood distributions (e.g., the Kalman family of filters) or by using resampling methods (e.g., the bootstrap filter). We propose to quantify the efficiency of these approximations in an OSSE setting using information theory and, in an OSSE or real-world validation setting, to measure the amount - and more importantly, the quality - of information extracted from observations during data assimilation. To analyze DA assumptions, uncertainty is quantified as the Shannon-type entropy of a discretized probability distribution. The maximum amount of information that can be extracted from observations about model states is the mutual information between states and observations, which is equal to the reduction in entropy in our estimate of the state due to Bayesian filtering. The difference between this potential and the actual reduction in entropy due to Kalman (or other) filtering measures the inefficiency of the filter assumptions. Residual uncertainty in DA posterior state estimates can be attributed to three sources: (i) non-injectivity of the observation operator, (ii) noise in the observations, and (iii) filter approximations. The contribution of each of these sources is measurable in an OSSE setting. The amount of information extracted from observations by data assimilation (or system identification, including parameter estimation) can also be measured by Shannon's theory. Since practical filters are approximations of Bayes' law, it is important to know whether the information that is extracted from observations by a filter is reliable. We define information as either good or bad, and propose to measure these two types of information using partial
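    The key quantity above, the maximum entropy reduction any filter can extract from an observation, is the mutual information between state and observation, computable as the prior entropy minus the expected posterior entropy. A toy discrete illustration follows; the joint table over three discretized states and two observation outcomes is entirely hypothetical.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a discrete probability vector."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Hypothetical joint table P(X, Y): rows = discretized states, cols = observations.
pxy = np.array([[0.30, 0.05],
                [0.05, 0.30],
                [0.10, 0.20]])
px = pxy.sum(axis=1)                       # prior over states
py = pxy.sum(axis=0)                       # marginal over observations

h_prior = entropy(px)
# Expected posterior entropy H(X|Y): entropy of each column of P(X|Y),
# weighted by P(Y).
h_post = sum(py[j] * entropy(pxy[:, j] / py[j]) for j in range(len(py)))
print(h_prior, h_post, h_prior - h_post)   # the difference is I(X;Y)
```

    Comparing this mutual information with the entropy reduction actually delivered by a practical filter is exactly the efficiency diagnostic described in the abstract.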

  11. Research on electricity consumption forecast based on mutual information and random forests algorithm

    Science.gov (United States)

    Shi, Jing; Shi, Yunli; Tan, Jian; Zhu, Lei; Li, Hu

    2018-02-01

    Traditional power forecasting models cannot efficiently take various factors into account, nor identify the relevant factors. In this paper, mutual information from information theory and the random forests algorithm from artificial intelligence are introduced into medium- and long-term electricity demand prediction. Mutual information can identify the highly related factors based on the average mutual information between a variety of variables and electricity demand; different industries may be highly associated with different variables. The random forests algorithm was used to build forecasting models for the different industries according to their correlation factors. Electricity consumption data from Jiangsu Province is taken as a practical example, and the above methods are compared with methods that disregard mutual information and industry differences. The simulation results show that the proposed method is scientific, effective, and provides higher prediction accuracy.
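
The screening step described above can be sketched with synthetic data (the variable names, the 0.05-bit threshold, and the histogram estimator are my assumptions, not the paper's actual setup): mutual information ranks candidate drivers of demand before any forecasting model is fitted.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram estimate of I(X;Y) in bits between two 1-D samples."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
n = 2000
gdp = rng.normal(size=n)            # hypothetical strong driver
temperature = rng.normal(size=n)    # hypothetical weak driver
noise_factor = rng.normal(size=n)   # unrelated candidate
demand = 2.0 * gdp + 0.5 * temperature + 0.1 * rng.normal(size=n)

candidates = {"gdp": gdp, "temperature": temperature, "noise": noise_factor}
scores = {name: mutual_information(x, demand) for name, x in candidates.items()}
# Keep only factors whose estimated MI clears an (arbitrary) threshold.
selected = [name for name, s in sorted(scores.items(), key=lambda kv: -kv[1])
            if s > 0.05]
print(scores, selected)
```

The selected factors would then feed a per-industry regression model (the paper uses random forests) rather than one undifferentiated model.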

  12. A Theory-Based Contextual Nutrition Education Manual Enhanced Nutrition Teaching Skill.

    Science.gov (United States)

    Kupolati, Mojisola D; MacIntyre, Una E; Gericke, Gerda J

    2018-01-01

    Background: A theory-based contextual nutrition education manual (NEM) may enhance effective teaching of nutrition in schools. School nutrition education should lead to the realization of such benefits as improved health and scholarly achievement, leading to manpower development and, consequently, national development. The purpose of the study was to develop a contextual NEM for teachers of Grade 5 and 6 learners in the Bronkhorstspruit district, South Africa, and to assess teachers' perceptions of using the manual for teaching nutrition. Methods: This descriptive case study used an interpretivist paradigm. The study involved teachers (N = 6) who taught nutrition in Life Skills (LS) and Natural Science and Technology (NST) in a randomly selected primary school in the Bronkhorstspruit district. Findings from a nutrition education needs assessment were integrated with the constructs of the Social cognitive theory (SCT) and the Meaningful learning model (MLM) and the existing curriculum of the Department of Basic Education (DoBE) to develop a contextual NEM. The manual was used by the teachers to teach nutrition to Grades 5 and 6 learners during the 2015 academic year as a pilot project. A focus group discussion (FGD) was conducted with teachers to gauge their perceptions of the usefulness of the NEM. Data were analyzed using the thematic approach of the framework method for qualitative research. Results: Teachers described the NEM as rich in information and easy to use, and perceived the supporting materials and activities as effective. The goal-setting activities contained in the NEM were deemed to be ineffective. Teachers felt that they did not have enough time to teach all the important things that the learners needed to know. Conclusion: Teachers perceived the NEM as helpful toward improving their nutrition teaching skills. The NEM template may furthermore guide teachers in planning theory-based nutrition lessons.

  13. The Theory of Optimal Taxation

    DEFF Research Database (Denmark)

    Sørensen, Peter Birch

    The paper discusses the implications of optimal tax theory for the debates on uniform commodity taxation and neutral capital income taxation. While strong administrative and political economy arguments in favor of uniform and neutral taxation remain, recent advances in optimal tax theory suggest...... that the information needed to implement the differentiated taxation prescribed by optimal tax theory may be easier to obtain than previously believed. The paper also points to the strong similarity between optimal commodity tax rules and the rules for optimal source-based capital income taxation...

  14. The theory of optimal taxation

    DEFF Research Database (Denmark)

    Sørensen, Peter Birch

    2007-01-01

    The paper discusses the implications of optimal tax theory for the debates on uniform commodity taxation and neutral capital income taxation. While strong administrative and political economy arguments in favor of uniform and neutral taxation remain, recent advances in optimal tax theory suggest...... that the information needed to implement the differentiated taxation prescribed by optimal tax theory may be easier to obtain than previously believed. The paper also points to the strong similarity between optimal commodity tax rules and the rules for optimal source-based capital income taxation...

  15. A novel method of range measuring for a mobile robot based on multi-sensor information fusion

    International Nuclear Information System (INIS)

    Zhang Yi; Luo Yuan; Wang Jifeng

    2005-01-01

    Traditional range measurement for a mobile robot is based on a sonar sensor. Because working environments vary, it is very difficult to obtain high precision using any single method of range measurement. A hybrid sonar sensor and laser scanner method is therefore put forward to overcome these shortcomings. A novel fusion model is proposed based on the basic theory and methods of information fusion, and an optimal measurement result is obtained by fusing information from the different sensors. After a large number of experiments and performance analysis, it can be concluded that the laser scanner and sonar method with multi-sensor information fusion has higher precision than the sonar-only method, and that this holds across different environments.
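
The paper's fusion model is not reproduced here, but the standard minimum-variance (inverse-variance weighted) combination conveys why fusing sonar and laser readings beats either sensor alone; the numeric values below are illustrative only:

```python
def fuse_ranges(z_sonar, var_sonar, z_laser, var_laser):
    """Minimum-variance fusion of two independent range readings.

    Each reading is weighted by the inverse of its noise variance, so the
    fused estimate leans toward the more precise sensor, and the fused
    variance is always smaller than either input variance.
    """
    w_sonar = 1.0 / var_sonar
    w_laser = 1.0 / var_laser
    z_fused = (w_sonar * z_sonar + w_laser * z_laser) / (w_sonar + w_laser)
    var_fused = 1.0 / (w_sonar + w_laser)
    return z_fused, var_fused

# A laser scanner is typically far more precise than sonar, so the fused
# range lands close to the laser reading while still using the sonar one.
z, var = fuse_ranges(z_sonar=2.10, var_sonar=0.04,
                     z_laser=2.02, var_laser=0.0025)
```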

  16. Investigating an approach to the alliance based on interpersonal defense theory.

    Science.gov (United States)

    Westerman, Michael A; Muran, J Christopher

    2017-09-01

    Notwithstanding consistent findings of significant relationships between the alliance and outcome, questions remain to be answered about the relatively small magnitude of those correlations, the mechanisms underlying the association, and how to conceptualize the alliance construct. We conducted a preliminary study of an approach to the alliance based on interpersonal defense theory, which is an interpersonal reconceptualization of defense processes, to investigate the promise of this alternative approach as a way to address the outstanding issues. We employed qualitative, theory-building case study methodology, closely examining alliance processes at four time points in the treatment of a case in terms of a case formulation based on interpersonal defense theory. The results suggested that our approach made it possible to recognize key processes in the alliance and that it helps explain how the alliance influences outcome. Our analyses also provided a rich set of concrete illustrations of the alliance phenomena identified by the theory. The findings suggest that an approach to the alliance based on interpersonal defense theory holds promise. However, although the qualitative method we employed has advantages, it also has limitations. We offer suggestions about how future qualitative and quantitative investigations could build on this study.

  17. Cognitive load theory: implications of cognitive load theory on the design of learning

    NARCIS (Netherlands)

    Kirschner, P.A.

    2002-01-01

    Cognitive load theory (CLT) can provide guidelines to assist in the presentation of information in a manner that encourages learner activities that optimise intellectual performance. It is based on a cognitive architecture that consists of a limited working memory, with partly independent

  18. The Influence of Base Rate and Case Information on Health-Risk Perceptions: A Unified Model of Self-Positivity and Self-Negativity

    OpenAIRE

    Dengfeng Yan; Jaideep Sengupta

    2013-01-01

    This research examines how consumers use base rate (e.g., disease prevalence in a population) and case information (e.g., an individual's disease symptoms) to estimate health risks. Drawing on construal level theory, we propose that consumers' reliance on base rate (case information) will be enhanced (weakened) by psychological distance. A corollary of this premise is that self-positivity (i.e., underestimating self-risk vs. other-risk) is likely when the disease base rate is high but the cas...

  19. Outline of a Theory of Truth as Correctness for Semantic Information

    Directory of Open Access Journals (Sweden)

    Luciano Floridi

    2009-11-01

    Full Text Available The article develops a correctness theory of truth (CTT) for semantic information. After the introduction, in section two, semantic information is shown to be translatable into propositional semantic information (i). In section three, i is polarised into a query (Q) and a result (R), qualified by a specific context, a level of abstraction and a purpose. This polarization is normalised in section four, where [Q + R] is transformed into a Boolean question and its relative yes/no answer [Q + A]. This completes the reduction of the truth of i to the correctness of A. In sections five and six, it is argued that (1) A is the correct answer to Q if and only if (2) A correctly saturates (in a Fregean sense) Q by verifying and validating it (in the computer science’s sense of “verification” and “validation”); that (2) is the case if and only if (3) [Q + A] generates an adequate model (m) of the relevant system (s) identified by Q; that (3) is the case if and only if (4) m is a proxy of s (in the computer science’s sense of “proxy”) and (5) proximal access to m commutes with the distal access to s (in the category theory’s sense of “commutation”); and that (5) is the case if and only if (6) reading/writing (accessing, in the computer science’s technical sense of the term) m enables one to read/write (access) s. The last section draws a general conclusion about the nature of CTT as a theory for systems designers, not just systems users.

  20. Introduction to quantum information science

    CERN Document Server

    Hayashi, Masahito; Kawachi, Akinori; Kimura, Gen; Ogawa, Tomohiro

    2015-01-01

    This book presents the basics of quantum information, e.g., the foundations of quantum theory, quantum algorithms, quantum entanglement, quantum entropies, quantum coding, quantum error correction and quantum cryptography. The required knowledge is only elementary calculus and linear algebra, so the book can be understood by undergraduate students. In order to study quantum information, one usually has to study the foundations of quantum theory. This book describes them from a more operational viewpoint, which is suitable for quantum information, whereas traditional textbooks of quantum theory lack this viewpoint. The book uses Shor's algorithm, Grover's algorithm and the Deutsch-Jozsa algorithm as basic algorithms. To treat several topics in quantum information, this book covers several kinds of information quantities in quantum systems, including von Neumann entropy. The limits of several kinds of quantum information processing are given. As important quantum protocols, this book contains quantum teleport...
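
As a small illustration of one quantity the book covers (a generic sketch, not an excerpt from the book), the von Neumann entropy of a density matrix can be computed from its eigenvalues. A Bell state is pure, so its entropy is zero, while tracing out one qubit leaves a maximally mixed state carrying one full bit:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed via the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]   # drop numerically-zero eigenvalues
    return float(-np.sum(evals * np.log2(evals)))

# Bell state |Phi+> = (|00> + |11>) / sqrt(2)
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(phi, phi.conj())

# Partial trace over the second qubit: reshape to (a, b, a', b') and
# trace the b indices, leaving the 2x2 reduced state of qubit A.
rho_a = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

print(von_neumann_entropy(rho))    # pure state: 0 bits
print(von_neumann_entropy(rho_a))  # maximally mixed qubit: 1 bit
```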