WorldWideScience

Sample records for information theory based

  1. Information theory based approaches to cellular signaling.

    Science.gov (United States)

    Waltermann, Christian; Klipp, Edda

    2011-10-01

    Cells interact with their environment and have to react adequately to internal and external changes, such as changes in nutrient composition, physical properties like temperature or osmolarity, and other stresses. More specifically, they must be able to evaluate whether the external change is significant or just in the range of noise. Based on multiple external parameters they have to compute an optimal response. Cellular signaling pathways are considered the major means of information perception and transmission in cells. Here, we review different attempts to quantify information processing on the level of individual cells. We refer to Shannon entropy, mutual information, and informal measures of signaling pathway cross-talk and specificity. Information theory in systems biology has been successfully applied to the identification of optimal pathway structures, to mutual information and entropy as system responses in sensitivity analysis, and to the quantification of input and output information. While the study of information transmission within the framework of information theory in technical systems is an advanced field with high impact in engineering and telecommunication, its application to biological objects and processes is still restricted to specific fields such as neuroscience, structural biology, and molecular biology. However, in systems biology, which deals with a holistic understanding of biochemical systems and cellular signaling, examples of the application of information theory have emerged only recently. This article is part of a Special Issue entitled Systems Biology of Microorganisms. Copyright © 2011 Elsevier B.V. All rights reserved.
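
The mutual-information measure mentioned in this record can be illustrated with a minimal sketch (not code from the paper): treating a signaling pathway as a noisy channel between an input X (e.g. ligand level) and an output Y (e.g. reporter activity), mutual information in bits follows from the joint distribution. The two channel tables below are invented for illustration.

```python
import numpy as np

def mutual_information(joint):
    """Mutual information I(X;Y) in bits from a joint probability table."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal of input X
    py = joint.sum(axis=0, keepdims=True)   # marginal of output Y
    nz = joint > 0                          # skip log(0) terms
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

# A noiseless binary channel transmits exactly 1 bit ...
noiseless = [[0.5, 0.0], [0.0, 0.5]]
# ... while a channel whose output is independent of its input transmits none.
noisy = [[0.25, 0.25], [0.25, 0.25]]
```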

  2. Evaluating hydrological model performance using information theory-based metrics

    Science.gov (United States)

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...

  3. An information theory-based approach to modeling the information processing of NPP operators

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task. The focus will be on i) developing a model for the information processing of NPP operators and ii) quantifying the model. To resolve the problems of previous approaches based on information theory, i.e. the problems of single-channel approaches, we first develop an information processing model having multiple stages, which contains information flows. Then the uncertainty of the information is quantified using Conant's model, a kind of information theory

  4. Preservation of information in Fourier theory based deconvolved nuclear spectra

    International Nuclear Information System (INIS)

    Madan, V.K.; Gopalakrishnan, K.R.; Sharma, R.C.; Rattan, S.S.

    1995-01-01

    Nuclear spectroscopy is extremely useful to internal radiation dosimetry for the estimation of body burden due to gamma emitters. Analysis of nuclear spectra is concerned with the extraction of the qualitative and quantitative information embedded in the spectra. A spectral deconvolution method based on Fourier theory is probably the simplest method of deconvolving nuclear spectra. It is proved mathematically that the deconvolution method preserves the qualitative information. It is shown by using simulated spectra and an observed gamma ray spectrum that the method preserves the quantitative information. This may provide a novel approach to information extraction from a deconvolved spectrum. The paper discusses the methodology, mathematical analysis, and the results obtained by deconvolving spectra. (author). 6 refs., 2 tabs
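
As a hedged sketch of the kind of Fourier-theory deconvolution this record describes (not the authors' actual algorithm), one can divide spectra in the frequency domain with a small regularization term; the Gaussian detector response and the single-line test spectrum below are invented data. Note that both the peak position (qualitative information) and the total counts (quantitative information) survive the round trip.

```python
import numpy as np

def fft_deconvolve(observed, response, eps=1e-6):
    """Deconvolve a spectrum by spectral division in the Fourier domain.

    `response` is the detector response (same length as `observed`,
    circular convolution assumed); `eps` guards near-zero frequencies.
    """
    O = np.fft.fft(observed)
    R = np.fft.fft(response)
    return np.real(np.fft.ifft(O * np.conj(R) / (np.abs(R) ** 2 + eps)))

# Sanity check: blur a single photopeak with a Gaussian response, recover it.
n = 64
true = np.zeros(n)
true[20] = 100.0                              # one line, 100 counts
x = np.arange(n)
resp = np.exp(-0.5 * ((x - n // 2) / 2.0) ** 2)
resp /= resp.sum()
resp = np.roll(resp, -n // 2)                 # center the response at channel 0
blurred = np.real(np.fft.ifft(np.fft.fft(true) * np.fft.fft(resp)))
recovered = fft_deconvolve(blurred, resp)
```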

  5. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis named IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing-value processing, dataset partitioning, and browsing. Moreover, single-parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks such as dimensionality reduction, feature ranking, and comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
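
The supervised measures named in this record (information gain and symmetrical uncertainty after equal-interval discretization) can be sketched as follows; this is an illustrative reimplementation, not IMMAN's own code:

```python
import numpy as np

def entropy(labels):
    """Shannon entropy (bits) of a discrete label sequence."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def discretize(x, bins=4):
    """Equal-interval discretization of a continuous feature."""
    x = np.asarray(x, dtype=float)
    edges = np.linspace(x.min(), x.max(), bins + 1)
    return np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)

def info_gain(feature, target, bins=4):
    """IG(X; Y) = H(Y) - H(Y|X) for a discretized feature."""
    x = discretize(feature, bins)
    y = np.asarray(target)
    h_cond = sum((x == v).mean() * entropy(y[x == v]) for v in np.unique(x))
    return entropy(y) - h_cond

def symmetrical_uncertainty(feature, target, bins=4):
    """SU = 2*IG / (H(X) + H(Y)), normalized to [0, 1]."""
    x = discretize(feature, bins)
    return 2 * info_gain(feature, target, bins) / (entropy(x) + entropy(target))
```

A feature that perfectly separates a binary target scores SU = 1; an irrelevant feature scores near 0.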

  6. Information theory

    CERN Document Server

    Alencar, Marcelo S

    2014-01-01

    During the last decade we have witnessed rapid developments of computer networks and Internet technologies along with dramatic improvements in the processing power of personal computers. These developments make Interactive Distance Education a reality. By designing and deploying distributed and collaborative applications running on computers disseminated over the Internet, distance educators can reach remote learners, overcoming the time and distance constraints. Besides the necessary theoretical base provided by lectures and written materials, hands-on experience provided by physical laboratories is a vital part of engineering education. It helps engineering students become effective professionals. Such instruction not only provides the students with knowledge of the physical equipment but also adds the important dimension of group work and collaboration. However, laboratories are expensive to set up and maintain, and they require long hours of daily staffing. Due to budget limitations, many universities and c...

  7. Pre-Game-Theory Based Information Technology (GAMBIT) Study

    National Research Council Canada - National Science Library

    Polk, Charles

    2003-01-01

    .... The generic GAMBIT scenario has been characterized as Dynamic Hierarchical Gaming (DHG). Game theory is not yet ready to fully support analysis of DHG, though existing partial analysis suggests that a full treatment is practical in the midterm...

  8. Quantum biological information theory

    CERN Document Server

    Djordjevic, Ivan B

    2016-01-01

    This book is a self-contained, tutorial-based introduction to quantum information theory and quantum biology. It serves as a single-source reference to the topic for researchers in bioengineering, communications engineering, electrical engineering, applied mathematics, biology, computer science, and physics. The book provides all the essential principles of the quantum biological information theory required to describe the quantum information transfer from DNA to proteins, the sources of genetic noise and genetic errors as well as their effects. Integrates quantum information and quantum biology concepts; Assumes only knowledge of basic concepts of vector algebra at undergraduate level; Provides a thorough introduction to basic concepts of quantum information processing, quantum information theory, and quantum biology; Includes in-depth discussion of the quantum biological channel modelling, quantum biological channel capacity calculation, quantum models of aging, quantum models of evolution, quantum models o...

  9. Reflections on the Right to Information Based on Citizenship Theories

    Directory of Open Access Journals (Sweden)

    Vitor Gentilli

    2007-06-01

    In modern societies, structured as representative democracies, all rights to some extent are related to the right to information: the enlargement of participation in citizenship presupposes an enlargement of the right to information as a premise. It is a right which encourages the exercising of citizenship and affords the citizens access to and criticism of the instruments necessary for the full exercising of the group of citizenship rights. The right to information can have characteristics of emancipation or of tutelage. An emancipating right is a right to freedom, a right whose basic presupposition is freedom of choice. Accordingly, the maxim which could sum up the ethical issue of the right to information would be: give maximum publicity to everything which refers to the public sphere and keep secret that which refers to the private sphere.

  10. Human vision is determined based on information theory

    Science.gov (United States)

    Delgado-Bonal, Alfonso; Martín-Torres, Javier

    2016-11-01

    It is commonly accepted that the evolution of the human eye has been driven by the maximum intensity of the radiation emitted by the Sun. However, the interpretation of the surrounding environment is constrained not only by the amount of energy received but also by the information content of the radiation. Information is related to entropy rather than energy. The human brain follows Bayesian statistical inference for the interpretation of visual space. The maximization of information occurs in the process of maximizing the entropy. Here, we show that the photopic and scotopic vision absorption peaks in humans are determined not only by the intensity but also by the entropy of radiation. We suggest that through the course of evolution, the human eye has not adapted only to the maximum intensity or to the maximum information but to the optimal wavelength for obtaining information. On Earth, the optimal wavelengths for photopic and scotopic vision are 555 nm and 508 nm, respectively, as inferred experimentally. These optimal wavelengths are determined by the temperature of the star (in this case, the Sun) and by the atmospheric composition.
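
A rough numerical aside (not from the paper): intensity alone, via Wien's displacement law, puts the solar emission peak near 501 nm, close to the scotopic 508 nm but well short of the photopic 555 nm, which is why an entropy/information argument like the one above is needed. The solar temperature below is the commonly quoted effective value.

```python
# Wien's displacement law gives the wavelength of maximum blackbody intensity.
WIEN_B = 2.898e-3   # m*K, Wien's displacement constant
T_SUN = 5778.0      # K, effective solar temperature (assumed)

lambda_peak_nm = WIEN_B / T_SUN * 1e9   # peak wavelength in nanometres
```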

  11. Efficient Information Hiding Based on Theory of Numbers

    Directory of Open Access Journals (Sweden)

    Yanjun Liu

    2018-01-01

    Data hiding is an efficient technique that conceals secret data in a digital medium. In 2006, Zhang and Wang proposed a data hiding scheme called exploiting modification direction (EMD), which has become a milestone in the field of data hiding. In recent years, many EMD-type data hiding schemes have been developed, but their embedding capacity remains restricted. In this paper, a novel data hiding scheme based on the combination of the Chinese remainder theorem (CRT) and a new extraction function is proposed. In the proposed scheme, the cover image is divided into non-overlapping pixel groups for embedding to increase the embedding capacity. Experimental results show that the embedding capacity of the proposed scheme is significantly higher (greater than 2.5 bpp) than previously proposed schemes while ensuring very good visual quality of the stego image. In addition, a security analysis is given to show that the proposed scheme can resist visual attack.
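
The Chinese remainder theorem underpinning the proposed scheme can be sketched in a few lines; this is a generic CRT split/recover example, not the paper's embedding and extraction procedure, and the secret value and moduli are invented.

```python
from functools import reduce

def crt(residues, moduli):
    """Chinese remainder theorem: the unique x (mod product of moduli)
    with x = r_i (mod m_i), for pairwise coprime moduli."""
    M = reduce(lambda a, b: a * b, moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(..., -1, m): modular inverse
    return x % M

# A hider could split a secret value into residues modulo coprime bases
# and later recover it exactly (toy values):
secret = 123
moduli = [5, 7, 9]                     # pairwise coprime, product 315 > secret
residues = [secret % m for m in moduli]
recovered = crt(residues, moduli)
```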

  12. Dynamic statistical information theory

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    In recent years we extended Shannon's static statistical information theory to dynamic processes and established a Shannon dynamic statistical information theory, whose core is the evolution law of dynamic entropy and dynamic information. We also proposed a corresponding Boltzmann dynamic statistical information theory. Based on the fact that the state variable evolution equations of the respective dynamic systems, i.e. the Fokker-Planck equation and the Liouville diffusion equation, can be regarded as their information symbol evolution equations, we derived the nonlinear evolution equations of Shannon dynamic entropy density and dynamic information density and the nonlinear evolution equations of Boltzmann dynamic entropy density and dynamic information density, which describe respectively the evolution law of dynamic entropy and dynamic information. The evolution equations of these two kinds of dynamic entropy and dynamic information show in unison that the time rate of change of dynamic entropy densities is caused by their drift, diffusion and production in state variable space inside the systems and in coordinate space in the transmission processes; and that the time rate of change of dynamic information densities originates from their drift, diffusion and dissipation in state variable space inside the systems and in coordinate space in the transmission processes. Entropy and information have thus been combined with the state of the systems and its law of motion. Furthermore, we presented the formulas of the two kinds of entropy production rates and information dissipation rates, and the expressions of the two kinds of drift information flows and diffusion information flows. We proved that the two kinds of information dissipation rates (or the decrease rates of the total information) are equal to their corresponding entropy production rates (or the increase rates of the total entropy) in the same dynamic system. We obtained the formulas of two kinds of dynamic mutual information and dynamic channel

  13. Information theory of molecular systems

    CERN Document Server

    Nalewajski, Roman F

    2006-01-01

    As well as providing a unified outlook on physics, Information Theory (IT) has numerous applications in chemistry and biology owing to its ability to provide a measure of the entropy/information contained within probability distributions and criteria of their information "distance" (similarity) and independence. Information Theory of Molecular Systems applies standard IT to classical problems in the theory of electronic structure and chemical reactivity. The book starts by introducing the basic concepts of modern electronic structure/reactivity theory based upon the Density Functional Theory

  14. Optimal cross-sectional sampling for river modelling with bridges: An information theory-based method

    Energy Technology Data Exchange (ETDEWEB)

    Ridolfi, E.; Napolitano, F., E-mail: francesco.napolitano@uniroma1.it [Sapienza Università di Roma, Dipartimento di Ingegneria Civile, Edile e Ambientale (Italy); Alfonso, L. [Hydroinformatics Chair Group, UNESCO-IHE, Delft (Netherlands); Di Baldassarre, G. [Department of Earth Sciences, Program for Air, Water and Landscape Sciences, Uppsala University (Sweden)

    2016-06-08

    The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers’ cross-sectional spacing.

  15. Optimal cross-sectional sampling for river modelling with bridges: An information theory-based method

    International Nuclear Information System (INIS)

    Ridolfi, E.; Napolitano, F.; Alfonso, L.; Di Baldassarre, G.

    2016-01-01

    The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers’ cross-sectional spacing.

  16. The criteria for selecting a method for unfolding neutron spectra based on the information entropy theory

    International Nuclear Information System (INIS)

    Zhu, Qingjun; Song, Fengquan; Ren, Jie; Chen, Xueyong; Zhou, Bin

    2014-01-01

    To further expand the application of an artificial neural network in the field of neutron spectrometry, the criteria for choosing between an artificial neural network and the maximum entropy method for the purpose of unfolding neutron spectra were presented. The counts of the Bonner spheres for IAEA neutron spectra were used as a database, and the artificial neural network and the maximum entropy method were used to unfold neutron spectra; the mean squares of the spectra were defined as the differences between the desired and unfolded spectra. After the information entropy of each spectrum was calculated using information entropy theory, the relationship between the mean squares of the spectra and the information entropy was acquired. Useful information from the information entropy guided the selection of unfolding methods. Due to the importance of the information entropy, a method for predicting the information entropy from the Bonner spheres' counts was established. The criteria based on information entropy theory can be used to choose between the artificial neural network and maximum entropy unfolding methods. The application of an artificial neural network to unfold neutron spectra was expanded. - Highlights: • Two neutron spectra unfolding methods, ANN and MEM, were compared. • The spectrum's entropy offers useful information for selecting unfolding methods. • For the spectrum with low entropy, the ANN was generally better than MEM. • The spectrum's entropy was predicted based on the Bonner spheres' counts
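
The information entropy of a spectrum used as the selection criterion here can be computed by normalizing channel counts into a probability distribution; this is an illustrative sketch, not the authors' code.

```python
import numpy as np

def spectrum_entropy(counts):
    """Shannon information entropy (bits) of a spectrum, treating the
    normalized channel counts as a probability distribution."""
    c = np.asarray(counts, dtype=float)
    p = c / c.sum()
    p = p[p > 0]                       # zero-count channels contribute nothing
    return float(-(p * np.log2(p)).sum())
```

A flat spectrum over 8 channels has 3 bits of entropy; a spectrum concentrated in one channel has 0.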

  17. Cooperative Localization for Multi-AUVs Based on GM-PHD Filters and Information Entropy Theory

    Directory of Open Access Journals (Sweden)

    Lichuan Zhang

    2017-10-01

    Cooperative localization (CL) is considered a promising method for underwater localization of multiple autonomous underwater vehicles (multi-AUVs). In this paper, we propose a CL algorithm based on information entropy theory and the probability hypothesis density (PHD) filter, aiming to enhance the global localization accuracy of the follower. In the proposed framework, the followers carry lower-cost navigation systems, whereas the leaders carry better ones. Meanwhile, the leaders acquire the followers' observations, including both measurements and clutter. Then, the PHD filters are run on the leaders and the results are communicated to the followers. The followers then perform weighted summation based on all received messages and obtain a final positioning result. Based on information entropy theory and the PHD filter, the follower is able to acquire precise knowledge of its position.

  18. Science and information theory

    CERN Document Server

    Brillouin, Léon

    1962-01-01

    A classic source for exploring the connections between information theory and physics, this text is geared toward upper-level undergraduates and graduate students. The author, a giant of 20th-century mathematics, applies the principles of information theory to a variety of issues, including Maxwell's demon, thermodynamics, and measurement problems. 1962 edition.

  19. Signal Detection Theory-Based Information Processing for the Detection of Breast Cancer at Microwave Frequencies

    National Research Council Canada - National Science Library

    Nolte, Loren

    2002-01-01

    The hypothesis is that signal detection theory can be used to improve the performance of breast tumor detection by developing task-oriented information processing techniques...

  20. Information Design Theories

    Science.gov (United States)

    Pettersson, Rune

    2014-01-01

    Information design has practical and theoretical components. As an academic discipline we may view information design as a combined discipline, a practical theory, or as a theoretical practice. So far information design has incorporated facts, influences, methods, practices, principles, processes, strategies, and tools from a large number of…

  1. Theories of information behavior

    CERN Document Server

    Erdelez, Sandra; McKechnie, Lynne

    2005-01-01

    This unique book presents authoritative overviews of more than 70 conceptual frameworks for understanding how people seek, manage, share, and use information in different contexts. A practical and readable reference to both well-established and newly proposed theories of information behavior, the book includes contributions from 85 scholars from 10 countries. Each theory description covers origins, propositions, methodological implications, usage, links to related conceptual frameworks, and listings of authoritative primary and secondary references. The introductory chapters explain key concepts, theory–method connections, and the process of theory development.

  2. Hybrid Multicriteria Group Decision Making Method for Information System Project Selection Based on Intuitionistic Fuzzy Theory

    Directory of Open Access Journals (Sweden)

    Jian Guo

    2013-01-01

    Information system (IS) project selection is of critical importance to every organization in a dynamic, competitive environment. The aim of this paper is to develop a hybrid multicriteria group decision making approach based on intuitionistic fuzzy theory for IS project selection. The decision makers' assessment information can be expressed in the form of real numbers, interval-valued numbers, linguistic variables, and intuitionistic fuzzy numbers (IFNs). All of these pieces of evaluation information can be transformed to the form of IFNs. The intuitionistic fuzzy weighted averaging (IFWA) operator is utilized to aggregate individual opinions of decision makers into a group opinion. Intuitionistic fuzzy entropy is used to obtain the entropy weights of the criteria. The TOPSIS method combined with intuitionistic fuzzy sets is proposed to select an appropriate IS project in a group decision making environment. Finally, a numerical example of information system project selection is given to illustrate the application of the hybrid multicriteria group decision making (MCGDM) method based on intuitionistic fuzzy theory and the TOPSIS method.
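
The IFWA aggregation step can be sketched under the standard (μ, ν) membership/non-membership representation of IFNs (an illustrative implementation of Xu's operator, not the paper's code):

```python
def ifwa(ifns, weights):
    """Intuitionistic fuzzy weighted averaging of (mu, nu) pairs,
    with weights summing to 1:
    mu_agg = 1 - prod((1 - mu_i)^w_i),  nu_agg = prod(nu_i^w_i)."""
    mu_prod = 1.0
    nu_prod = 1.0
    for (mu, nu), w in zip(ifns, weights):
        mu_prod *= (1.0 - mu) ** w
        nu_prod *= nu ** w
    return (1.0 - mu_prod, nu_prod)
```

Aggregating identical opinions returns the same IFN, as expected of an averaging operator.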

  3. An information theory criteria based blind method for enumerating active users in DS-CDMA system

    Science.gov (United States)

    Samsami Khodadad, Farid; Abed Hodtani, Ghosheh

    2014-11-01

    In this paper, a new blind algorithm for active user enumeration in asynchronous direct sequence code division multiple access (DS-CDMA) in a multipath channel scenario is proposed. The proposed method is based on information theory criteria. There are two main categories of information criteria which are widely used in active user enumeration: the Akaike Information Criterion (AIC) and the Minimum Description Length (MDL) criterion. The main difference between these two criteria is their penalty functions. Due to this difference, MDL is a consistent enumerator which has better performance at higher signal-to-noise ratios (SNR), whereas AIC is preferred at lower SNRs. In the sequel, we propose an SNR-compliant method, based on subspace analysis and a training genetic algorithm, to obtain the performance of both. Moreover, our method uses only a single antenna, in contrast to previous methods, which decreases hardware complexity. Simulation results show that the proposed method is capable of estimating the number of active users without any prior knowledge, and demonstrate the efficiency of the method.
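
AIC/MDL enumeration of this kind is commonly implemented with the Wax-Kailath formulas on the sorted eigenvalues of the sample covariance matrix; the sketch below is a generic version of those criteria (an assumed form, not the authors' combined SNR-compliant method):

```python
import numpy as np

def enumerate_sources(eigvals, n_snapshots, criterion="MDL"):
    """Wax-Kailath model-order estimation: return the k minimizing AIC(k)
    or MDL(k), computed from sorted covariance eigenvalues."""
    lam = np.sort(np.asarray(eigvals, dtype=float))[::-1]
    M, N = len(lam), n_snapshots
    scores = []
    for k in range(M):
        tail = lam[k:]                          # M-k smallest eigenvalues
        g = np.exp(np.mean(np.log(tail)))       # geometric mean
        a = np.mean(tail)                       # arithmetic mean
        ll = -N * (M - k) * np.log(g / a)       # log-likelihood term (>= 0)
        if criterion == "AIC":
            scores.append(2 * ll + 2 * k * (2 * M - k))
        else:                                   # MDL penalty grows with log N
            scores.append(ll + 0.5 * k * (2 * M - k) * np.log(N))
    return int(np.argmin(scores))
```

With two dominant eigenvalues above a flat noise floor, both criteria report two active sources.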

  4. The Research on Safety Management Information System of Railway Passenger Based on Risk Management Theory

    Science.gov (United States)

    Zhu, Wenmin; Jia, Yuanhua

    2018-01-01

    Based on risk management theory and the PDCA cycle model, the requirements of railway passenger transport safety production are analyzed, and the establishment of a security risk assessment team is proposed to manage risk by FTA with Delphi from both qualitative and quantitative aspects. A safety production committee is also established to carry out performance appraisal, which further ensures the correctness of risk management results, optimizes the safety management business processes, and improves risk management capabilities. The basic framework and risk information database of the risk management information system for railway passenger transport safety are designed with Ajax, Web Services, and SQL technologies. The system implements functions for risk management, performance appraisal, and data management, and provides an efficient and convenient information management platform for railway passenger safety managers.

  5. Optimal design of hydrometric monitoring networks with dynamic components based on Information Theory

    Science.gov (United States)

    Alfonso, Leonardo; Chacon, Juan; Solomatine, Dimitri

    2016-04-01

    The EC-FP7 WeSenseIt project proposes the development of a Citizen Observatory of Water, aiming at enhancing environmental monitoring and forecasting with the help of citizens equipped with low-cost sensors and personal devices such as smartphones and smart umbrellas. In this regard, Citizen Observatories may complement the limited data availability in terms of spatial and temporal density, which is of interest, among other areas, to improve hydraulic and hydrological models. At this point, the following question arises: how can citizens, who are part of a citizen observatory, be optimally guided so that the data they collect and send is useful to improve modelling and water management? This research proposes a new methodology to identify the optimal location and timing of potential observations coming from moving sensors of hydrological variables. The methodology is based on Information Theory, which has been widely used in hydrometric monitoring design [1-4]. In particular, it uses the concept of Joint Entropy as a measure of the amount of information contained in a set of random variables, which, in our case, correspond to the time series of hydrological variables captured at given locations in a catchment. The methodology presented is a step forward in the state of the art because it solves the multiobjective optimisation problem of getting simultaneously the minimum number of informative and non-redundant sensors needed for a given time, so that the best configuration of monitoring sites is found at every particular moment in time. To this end, the existing algorithms have been improved to make them efficient. The method is applied to cases in The Netherlands, UK and Italy and proves to have great potential to complement the existing in-situ monitoring networks. [1] Alfonso, L., A. Lobbrecht, and R. Price (2010a), Information theory-based approach for location of monitoring water level gauges in polders, Water Resour. Res., 46(3), W03528 [2] Alfonso, L., A
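
The joint-entropy objective described in this abstract can be illustrated with a greedy selection over discretized time series (a toy sketch, not the project's optimized multiobjective algorithm): adding a redundant sensor leaves the joint entropy unchanged, so informative, non-redundant locations are picked first.

```python
import numpy as np

def joint_entropy(columns):
    """Joint Shannon entropy (bits) of discretized time series,
    one series per column of a 2-D integer array (rows = time steps)."""
    arr = np.asarray(columns)
    _, counts = np.unique(arr, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def greedy_select(series, k):
    """Greedily pick k columns (sensor locations) maximizing joint entropy:
    maximum information content, implicitly penalizing redundancy."""
    chosen = []
    remaining = list(range(series.shape[1]))
    for _ in range(k):
        best = max(remaining,
                   key=lambda j: joint_entropy(series[:, chosen + [j]]))
        chosen.append(best)
        remaining.remove(best)
    return chosen
```

With sensor 1 duplicating sensor 0 and sensor 2 independent, the greedy pick is sensors 0 and 2.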

  6. Highly accurate fluorogenic DNA sequencing with information theory-based error correction.

    Science.gov (United States)

    Chen, Zitian; Zhou, Wenxiong; Qiao, Shuo; Kang, Li; Duan, Haifeng; Xie, X Sunney; Huang, Yanyi

    2017-12-01

    Eliminating errors in next-generation DNA sequencing has proved challenging. Here we present error-correction code (ECC) sequencing, a method to greatly improve sequencing accuracy by combining fluorogenic sequencing-by-synthesis (SBS) with an information theory-based error-correction algorithm. ECC embeds redundancy in sequencing reads by creating three orthogonal degenerate sequences, generated by alternate dual-base reactions. This is similar to encoding and decoding strategies that have proved effective in detecting and correcting errors in information communication and storage. We show that, when combined with a fluorogenic SBS chemistry with raw accuracy of 98.1%, ECC sequencing provides single-end, error-free sequences up to 200 bp. ECC approaches should enable accurate identification of extremely rare genomic variations in various applications in biology and medicine.

  7. Rolling bearing fault diagnosis based on information fusion using Dempster-Shafer evidence theory

    Science.gov (United States)

    Pei, Di; Yue, Jianhai; Jiao, Jing

    2017-10-01

    This paper presents a fault diagnosis method for rolling bearings based on information fusion. Acceleration sensors are arranged at different positions to get bearing vibration data as diagnostic evidence. The Dempster-Shafer (D-S) evidence theory is used to fuse multi-sensor data to improve diagnostic accuracy. The efficiency of the proposed method is demonstrated on a high-speed train transmission test bench. The experimental results show that the proposed method improves rolling bearing fault diagnosis accuracy compared with traditional signal analysis methods.
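
Dempster's rule of combination, which fuses basic probability assignments such as those from two acceleration sensors, can be sketched generically (the fault labels and mass values below are invented):

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability assignments
    whose focal elements are frozensets of hypotheses."""
    combined = {}
    conflict = 0.0
    for a, pa in m1.items():
        for b, pb in m2.items():
            inter = a & b
            if inter:                       # agreeing (intersecting) evidence
                combined[inter] = combined.get(inter, 0.0) + pa * pb
            else:                           # conflicting mass is discarded...
                conflict += pa * pb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be fused")
    # ...and the remaining mass is renormalized.
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two sensors both lean toward an 'inner race' fault (hypothetical labels):
F = frozenset
m1 = {F({"inner"}): 0.6, F({"outer"}): 0.1, F({"inner", "outer"}): 0.3}
m2 = {F({"inner"}): 0.5, F({"outer"}): 0.2, F({"inner", "outer"}): 0.3}
fused = dempster_combine(m1, m2)
```

Fusion sharpens the shared verdict: the combined mass on the inner-race fault exceeds either sensor's individual belief.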

  8. Constructor theory of information

    Science.gov (United States)

    Deutsch, David; Marletto, Chiara

    2015-01-01

    We propose a theory of information expressed solely in terms of which transformations of physical systems are possible and which are impossible—i.e. in constructor-theoretic terms. It includes conjectured, exact laws of physics expressing the regularities that allow information to be physically instantiated. Although these laws are directly about information, independently of the details of particular physical instantiations, information is not regarded as an a priori mathematical or logical concept, but as something whose nature and properties are determined by the laws of physics alone. This theory solves a problem at the foundations of existing information theory, namely that information and distinguishability are each defined in terms of the other. It also explains the relationship between classical and quantum information, and reveals the single, constructor-theoretic property underlying the most distinctive phenomena associated with the latter, including the lack of in-principle distinguishability of some states, the impossibility of cloning, the existence of pairs of variables that cannot simultaneously have sharp values, the fact that measurement processes can be both deterministic and unpredictable, the irreducible perturbation caused by measurement, and locally inaccessible information (as in entangled systems). PMID:25663803

  9. Analysis and Comparison of Information Theory-based Distances for Genomic Strings

    Science.gov (United States)

    Balzano, Walter; Cicalese, Ferdinando; Del Sorbo, Maria Rosaria; Vaccaro, Ugo

    2008-07-01

    Genomic string comparison via alignment is widely applied for mining and retrieval of information in biological databases. In some situations, the effectiveness of such alignment-based comparison is still unclear, e.g., for sequences of non-uniform length or with significant shuffling of identical substrings. An alternative approach is one based on information theory distances. Biological information content is stored in very long strings of only four characters. In the last ten years, several entropic measures have been proposed for genomic string analysis. Notwithstanding their individual merit and experimental validation, to the best of our knowledge there is no direct comparison of these different metrics. We present four of the most representative alignment-free distance measures based on mutual information. Each one has a different origin and expression. Our comparison involves a rearrangement that reduces the different concepts to a unique formalism, making it possible to construct a phylogenetic tree for each of them. The trees produced via these metrics are compared to those widely accepted as biologically validated. In general, the results provided further evidence of the reliability of the alignment-free distance models. We also observe that one of the metrics appears to be more robust than the other three. We believe this result can be the object of further research and observation. Many of the experimental results, graphics, and tables are available at the following URL: http://people.na.infn.it/˜wbalzano/BIO

  10. [Prediction of regional soil quality based on mutual information theory integrated with decision tree algorithm].

    Science.gov (United States)

    Lin, Fen-Fang; Wang, Ke; Yang, Ning; Yan, Shi-Guang; Zheng, Xin-Yu

    2012-02-01

    To precisely obtain the spatial distribution characteristics of regional soil quality, this paper considered the main factors that affect soil quality, such as soil type, land use pattern, lithology type, topography, road, and industry type; mutual information theory was adopted to select the main environmental factors, and the decision tree algorithm See5.0 was applied to predict the grade of regional soil quality. The main factors affecting regional soil quality were soil type, land use, lithology type, distance to town, distance to water area, altitude, distance to road, and distance to industrial land. The prediction accuracy of the decision tree model with the variables selected by mutual information was obviously higher than that of the model with all variables, and for the former model, whether using decision trees or decision rules, the prediction accuracy was above 80%. Based on the continuous and categorical data, the method of mutual information theory integrated with a decision tree could not only reduce the number of input parameters for the decision tree algorithm, but also predict and assess regional soil quality effectively.
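
    The mutual-information screening step can be illustrated with a toy plug-in estimate from paired samples. The factor names and sample values below are hypothetical, chosen only to show that an informative factor scores higher than an uninformative one.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired discrete samples."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        # c/n is the joint probability; c*n/(px*py) is its ratio to
        # the product of marginals.
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi

# Hypothetical samples: environmental factors vs. soil-quality grade.
land_use = ["paddy", "paddy", "forest", "forest", "urban", "urban"]
grade    = ["high",  "high",  "high",   "mid",    "low",   "low"]
noise    = ["a", "b", "a", "b", "a", "b"]  # uninformative factor

# The factor with higher mutual information would be kept as a
# decision-tree input; the other would be dropped.
mi_land = mutual_information(land_use, grade)
mi_noise = mutual_information(noise, grade)
```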

  11. Optimised Selection of Stroke Biomarker Based on Svm and Information Theory

    Directory of Open Access Journals (Sweden)

    Wang Xiang

    2017-01-01

    Full Text Available With the development of molecular biology and gene-engineering technology, gene diagnosis has become an emerging approach in the modern life sciences. Biological markers, recognized as a hot topic in the molecular and gene fields, have important value in early diagnosis, malignant tumor staging, treatment, and therapeutic efficacy evaluation. So far, researchers have not found any effective way to predict and distinguish different types of stroke. In this paper, we aim to optimize stroke biomarkers and identify effective stroke detection indexes based on SVM (support vector machine) and information theory. We use mutual information analysis and principal component analysis to complete the selection of biomarkers, and then use SVM to verify our model. Using the testing data of patients provided by Xuanwu Hospital, we explore the significant markers of stroke through data analysis. Our model predicts stroke well. We then discuss the effects of each biomarker on the incidence of stroke.

  12. Wave theory of information

    CERN Document Server

    Franceschetti, Massimo

    2017-01-01

    Understand the relationship between information theory and the physics of wave propagation with this expert guide. Balancing fundamental theory with engineering applications, it describes the mechanism and limits for the representation and communication of information using electromagnetic waves. Information-theoretic laws relating functional approximation and quantum uncertainty principles to entropy, capacity, mutual information, rate distortion, and degrees of freedom of band-limited radiation are derived and explained. Both stochastic and deterministic approaches are explored, and applications for sensing and signal reconstruction, wireless communication, and networks of multiple transmitters and receivers are reviewed. With end-of-chapter exercises and suggestions for further reading enabling in-depth understanding of key concepts, it is the ideal resource for researchers and graduate students in electrical engineering, physics and applied mathematics looking for a fresh perspective on classical information theory.

  13. Quantum information theory

    CERN Document Server

    Wilde, Mark M

    2017-01-01

    Developing many of the major, exciting, pre- and post-millennium developments from the ground up, this book is an ideal entry point for graduate students into quantum information theory. Significant attention is given to quantum mechanics for quantum information theory, and careful studies of the important protocols of teleportation, superdense coding, and entanglement distribution are presented. In this new edition, readers can expect to find over 100 pages of new material, including detailed discussions of Bell's theorem, the CHSH game, Tsirelson's theorem, the axiomatic approach to quantum channels, the definition of the diamond norm and its interpretation, and a proof of the Choi–Kraus theorem. Discussion of the importance of the quantum dynamic capacity formula has been completely revised, and many new exercises and references have been added. This new edition will be welcomed by the upcoming generation of quantum information theorists and the already established community of classical information theorists.

  14. Uncertainty analysis of an integrated energy system based on information theory

    International Nuclear Information System (INIS)

    Fu, Xueqian; Sun, Hongbin; Guo, Qinglai; Pan, Zhaoguang; Xiong, Wen; Wang, Li

    2017-01-01

    Currently, a custom-designed configuration of different renewable technologies named the integrated energy system (IES) has become popular due to its high efficiency, benefiting from complementary multi-energy technologies. This paper proposes an information entropy approach to quantify uncertainty in an integrated energy system based on a stochastic model that drives a power system model derived from an actual network on Barry Island. Due to the complexity of co-behaviours between generators, a copula-based approach is utilized to articulate the dependency structure of the generator outputs with regard to such factors as weather conditions. Correlation coefficients and mutual information, which are effective for assessing the dependence relationships, are applied to judge whether the stochastic IES model is correct. The calculated information values can be used to analyse the impacts of the coupling of power and heat on power flows and heat flows, and this approach will be helpful for improving the operation of IES. - Highlights: • The paper explores uncertainty of an integrated energy system. • The dependent weather model is verified from the perspective of correlativity. • The IES model considers the dependence between power and heat. • The information theory helps analyse the complexity of IES operation. • The application of the model is studied using an operational system on Barry Island.

  15. Information fusion-based approach for studying influence on Twitter using belief theory.

    Science.gov (United States)

    Azaza, Lobna; Kirgizov, Sergey; Savonnet, Marinette; Leclercq, Éric; Gastineau, Nicolas; Faiz, Rim

    2016-01-01

    Influence in Twitter has recently become a hot research topic, since this micro-blogging service is widely used to share and disseminate information. Some users are more able than others to influence and persuade peers. Thus, studying the most influential users makes it possible to reach a large-scale information diffusion area, something very useful in marketing or political campaigns. In this study, we propose a new approach for multi-level influence assessment on multi-relational networks such as Twitter. We define a social graph to model the relationships between users as a multiplex graph where users are represented by nodes and links model the different relations between them (e.g., retweets, mentions, and replies). We explore what relations between nodes in this graph could reveal about the degree of influence and propose a generic computational model to assess the influence degree of a given node. This is based on the conjunctive combination rule from belief functions theory to combine different types of relations. We apply the proposed method to a large amount of data gathered from Twitter during the European Elections 2014 and deduce the top influential candidates. The results show that our model is flexible enough to consider multiple interaction combinations according to social scientists' needs or requirements, and that the numerical results of the belief theory are accurate. We also evaluate the approach on the CLEF RepLab 2014 data set and show that it leads to quite interesting results.

  16. An Emerging Theory for Evidence Based Information Literacy Instruction in School Libraries, Part 1: Building a Foundation

    Directory of Open Access Journals (Sweden)

    Carol A. Gordon

    2009-06-01

    Full Text Available Objective – Part I of this paper aims to create a framework for an emerging theory of evidence based information literacy instruction. In order to ground this framework in existing theory, a holistic perspective views inquiry as a learning process that synthesizes information searching and knowledge building. An interdisciplinary approach is taken to relate user-centric information behavior theory and constructivist learning theory that supports this synthesis. The substantive theories that emerge serve as a springboard for emerging theory. A second objective of this paper is to define evidence based information literacy instruction by assessing the suitability of performance based assessment and action research as tools of evidence based practice.Methods – An historical review of research grounded in user-centered information behavior theory and constructivist learning theory establishes a body of existing substantive theory that supports emerging theory for evidence based information literacy instruction within an information-to-knowledge approach. A focused review of the literature presents supporting research for an evidence based pedagogy that is performance assessment based, i.e., information users are immersed in real-world tasks that include formative assessments. An analysis of the meaning of action research in terms of its purpose and methodology establishes its suitability for structuring an evidence based pedagogy. Supporting research tests a training model for school librarians and educators which integrates performance based assessment, as well as action research. Results – Findings of an historical analysis of information behavior theory and constructivist teaching practices, and a literature review that explores teaching models for evidence based information literacy instruction, point to two elements of evidence based information literacy instruction: the micro level of information searching behavior and the macro level of

  17. Electricity procurement for large consumers based on Information Gap Decision Theory

    Energy Technology Data Exchange (ETDEWEB)

    Zare, Kazem; Moghaddam, Mohsen Parsa; Sheikh El Eslami, Mohammad Kazem [Tarbiat Modares University, P.O. Box 14115-111, Tehran (Iran)

    2010-01-15

    In the competitive electricity market, consumers seek strategies to meet their electricity needs at minimum cost and risk. This paper provides a technique based on Information Gap Decision Theory (IGDT) to assess different procurement strategies for large consumers. Supply sources include bilateral contracts, a limited self-generating facility, and the pool. It is considered that the pool price is uncertain and its volatility around the estimated value is modeled using an IGDT model. The proposed method does not minimize the procurement cost but assesses the risk aversion or risk-taking nature of some procurement strategies with regard to the minimum cost. Using this method, the robustness of experiencing costs higher than the expected one is optimized and the related strategy is determined. The proposed method deals with optimizing the opportunities to take advantage of low procurement costs or low pool prices. A case study is used to illustrate the proposed technique. (author)
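
    The IGDT robustness optimization described here can be sketched for the simplest case of a single uncertain pool price. All numbers (quantities, contract price, estimated pool price, cost tolerance) below are hypothetical, and the closed-form solution assumes the worst case is simply the price rising by the full deviation fraction.

```python
# Minimal IGDT robustness sketch for a large consumer's procurement strategy.

def robustness(q_pool, q_contract, price_contract, pool_price_est, cost_critical):
    """Largest fractional pool-price deviation alpha such that even the
    worst case (pool price rising to (1+alpha)*estimate) keeps the total
    cost at or below the critical cost the consumer tolerates."""
    base = q_contract * price_contract + q_pool * pool_price_est
    if base > cost_critical or q_pool == 0:
        return 0.0
    # Worst-case cost: base + alpha * q_pool * pool_price_est <= cost_critical
    return (cost_critical - base) / (q_pool * pool_price_est)

# Strategy: 60 MWh from a bilateral contract at 50 $/MWh, 40 MWh from the
# pool at an estimated 45 $/MWh; tolerate costs up to 10% above expected.
expected = 60 * 50 + 40 * 45
alpha_hat = robustness(40, 60, 50, 45, 1.10 * expected)
```

    A larger `alpha_hat` marks a more risk-averse strategy: shifting more energy into the fixed-price contract raises the pool-price deviation the consumer can absorb before exceeding the critical cost.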

  18. Electricity procurement for large consumers based on Information Gap Decision Theory

    International Nuclear Information System (INIS)

    Zare, Kazem; Moghaddam, Mohsen Parsa; Sheikh El Eslami, Mohammad Kazem

    2010-01-01

    In the competitive electricity market, consumers seek strategies to meet their electricity needs at minimum cost and risk. This paper provides a technique based on Information Gap Decision Theory (IGDT) to assess different procurement strategies for large consumers. Supply sources include bilateral contracts, a limited self-generating facility, and the pool. It is considered that the pool price is uncertain and its volatility around the estimated value is modeled using an IGDT model. The proposed method does not minimize the procurement cost but assesses the risk aversion or risk-taking nature of some procurement strategies with regard to the minimum cost. Using this method, the robustness of experiencing costs higher than the expected one is optimized and the related strategy is determined. The proposed method deals with optimizing the opportunities to take advantage of low procurement costs or low pool prices. A case study is used to illustrate the proposed technique.

  19. Discovery and validation of information theory-based transcription factor and cofactor binding site motifs.

    Science.gov (United States)

    Lu, Ruipeng; Mucaki, Eliseos J; Rogan, Peter K

    2017-03-17

    Data from ChIP-seq experiments can derive the genome-wide binding specificities of transcription factors (TFs) and other regulatory proteins. We analyzed 765 ENCODE ChIP-seq peak datasets of 207 human TFs with a novel motif discovery pipeline based on recursive, thresholded entropy minimization. This approach, while obviating the need to compensate for skewed nucleotide composition, distinguishes true binding motifs from noise, quantifies the strengths of individual binding sites based on computed affinity and detects adjacent cofactor binding sites that coordinate with the targets of primary, immunoprecipitated TFs. We obtained contiguous and bipartite information theory-based position weight matrices (iPWMs) for 93 sequence-specific TFs, discovered 23 cofactor motifs for 127 TFs and revealed six high-confidence novel motifs. The reliability and accuracy of these iPWMs were determined via four independent validation methods, including the detection of experimentally proven binding sites, explanation of effects of characterized SNPs, comparison with previously published motifs and statistical analyses. We also predict previously unreported TF coregulatory interactions (e.g. TF complexes). These iPWMs constitute a powerful tool for predicting the effects of sequence variants in known binding sites, performing mutation analysis on regulatory SNPs and predicting previously unrecognized binding sites and target genes. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
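
    The per-position information content underlying an information theory-based PWM can be sketched as follows. The aligned sites are a made-up toy alignment, the background is assumed uniform, and the small-sample correction used in real pipelines is omitted.

```python
import math

def pwm_information(sites):
    """Per-position information content R_i (bits) of aligned DNA binding
    sites against a uniform background, without small-sample correction."""
    n = len(sites)
    info = []
    for col in zip(*sites):
        r = 2.0  # log2(4): maximum entropy of a DNA position
        for base in "ACGT":
            p = col.count(base) / n
            if p > 0:
                r += p * math.log2(p)  # subtract observed entropy
        info.append(r)
    return info

# Hypothetical aligned binding sites for an illustrative TF (not ENCODE data).
sites = ["TATA", "TATA", "TACA", "TATA"]
info = pwm_information(sites)
total_bits = sum(info)  # overall information of the binding-site model
```

    Fully conserved positions contribute the maximum 2 bits each, while the partially conserved third position contributes less; summing the columns gives the total binding-site information used to score individual sites.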

  20. An introduction to information theory

    CERN Document Server

    Reza, Fazlollah M

    1994-01-01

    Graduate-level study for engineering students presents elements of modern probability theory, information theory, coding theory, more. Emphasis on sample space, random variables, capacity, etc. Many reference tables and extensive bibliography. 1961 edition.

  1. Parametric sensitivity analysis for biochemical reaction networks based on pathwise information theory.

    Science.gov (United States)

    Pantazis, Yannis; Katsoulakis, Markos A; Vlachos, Dionisios G

    2013-10-22

    Stochastic modeling and simulation provide powerful predictive methods for the intrinsic understanding of fundamental mechanisms in complex biochemical networks. Typically, such mathematical models involve networks of coupled jump stochastic processes with a large number of parameters that need to be suitably calibrated against experimental data. In this direction, the parameter sensitivity analysis of reaction networks is an essential mathematical and computational tool, yielding information regarding the robustness and the identifiability of model parameters. However, existing sensitivity analysis approaches such as variants of the finite difference method can have an overwhelming computational cost in models with a high-dimensional parameter space. We develop a sensitivity analysis methodology suitable for complex stochastic reaction networks with a large number of parameters. The proposed approach is based on Information Theory methods and relies on the quantification of information loss due to parameter perturbations between time-series distributions. For this reason, we need to work on path-space, i.e., the set consisting of all stochastic trajectories, hence the proposed approach is referred to as "pathwise". The pathwise sensitivity analysis method is realized by employing the rigorously-derived Relative Entropy Rate, which is directly computable from the propensity functions. A key aspect of the method is that an associated pathwise Fisher Information Matrix (FIM) is defined, which in turn constitutes a gradient-free approach to quantifying parameter sensitivities. The structure of the FIM turns out to be block-diagonal, revealing hidden parameter dependencies and sensitivities in reaction networks. As a gradient-free method, the proposed sensitivity analysis provides a significant advantage when dealing with complex stochastic systems with a large number of parameters. In addition, knowledge of the structure of the FIM can allow one to efficiently address
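
    The core idea of measuring sensitivity as relative entropy between nominal and perturbed distributions can be sketched at the level of a single discrete distribution. The two-reaction probabilities and the perturbation below are hypothetical, and this one-step analogue stands in for the full method, which works with entropy rates on path space.

```python
import math

def relative_entropy(p, q):
    """D(p || q) in nats for discrete distributions given as lists."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def sensitivity(probs, perturb, eps=1e-3):
    """Gradient-free sensitivity proxy: for small eps,
    D(p || p_eps) ~ 0.5 * FIM * eps**2, so 2*D/eps**2 estimates the
    Fisher-information term for that parameter direction."""
    p_eps = perturb(probs, eps)
    return 2.0 * relative_entropy(probs, p_eps) / eps**2

# Hypothetical two-reaction network: probabilities that the next event
# is reaction 1 vs reaction 2. Perturbing rate constant k1 rescales the
# first propensity and renormalizes.
def perturb_k1(p, eps):
    a1, a2 = p[0] * (1 + eps), p[1]
    z = a1 + a2
    return [a1 / z, a2 / z]

p = [0.2, 0.8]
s = sensitivity(p, perturb_k1)  # Fisher-information estimate for k1
```

    For this perturbation the exact small-eps limit is p1*(1-p1) = 0.16, so the estimate quantifies how strongly the event distribution responds to k1 without computing any gradients.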

  2. An Emerging Theory for Evidence Based Information Literacy Instruction in School Libraries, Part 2: Building a Culture of Inquiry

    Directory of Open Access Journals (Sweden)

    Carol A. Gordon

    2009-09-01

    Full Text Available Objective – The purpose of this paper is to articulate a theory for the use of action research as a tool of evidence based practice for information literacy instruction in school libraries. The emerging theory is intended to capture the complex phenomenon of information skills teaching as it is embedded in school curricula. Such a theory is needed to support research on the integrated approach to teaching information skills and knowledge construction within the framework of inquiry learning. Part 1 of this paper, in the previous issue, built a foundation for emerging theory, which established user‐centric information behavior and constructivist learning theory as the substantive theory behind evidence based library instruction in schools. Part 2 continues to build on the Information Search Process and Guided Inquiry as foundational to studying the information‐to‐knowledge connection and the concepts of help and intervention characteristic of 21st century school library instruction.Methods – This paper examines the purpose and methodology of action research as a tool of evidence based instruction. This is accomplished through the explication of three components of theory‐building: paradigm, substantive research, and metatheory. Evidence based practice is identified as the paradigm that contributes values and assumptions about school library instruction. It establishes the role of evidence in teaching and learning, linking theory and practice. Action research, as a tool of evidence based practice is defined as the synthesis of authentic learning, or performance‐based assessment practices that continuously generate evidence throughout the inquiry unit of instruction and traditional data collection methods typically used in formal research. This paper adds social psychology theory from Lewin’s work, which contributes methodology from Gestalt psychology, field theory, group dynamics, and change theory. For Lewin the purpose of action

  3. A statistical mechanical interpretation of algorithmic information theory: Total statistical mechanical interpretation based on physical argument

    International Nuclear Information System (INIS)

    Tadaki, Kohtaro

    2010-01-01

    The statistical mechanical interpretation of algorithmic information theory (AIT, for short) was introduced and developed by our former works [K. Tadaki, Local Proceedings of CiE 2008, pp. 425-434, 2008] and [K. Tadaki, Proceedings of LFCS'09, Springer's LNCS, vol. 5407, pp. 422-440, 2009], where we introduced the notion of thermodynamic quantities, such as partition function Z(T), free energy F(T), energy E(T), statistical mechanical entropy S(T), and specific heat C(T), into AIT. We then discovered that, in the interpretation, the temperature T equals the partial randomness of the values of all these thermodynamic quantities, where the notion of partial randomness is a stronger representation of the compression rate by means of program-size complexity. Furthermore, we showed that this situation holds for the temperature T itself, which is one of the most typical thermodynamic quantities. Namely, we showed that, for each of the thermodynamic quantities Z(T), F(T), E(T), and S(T) above, the computability of its value at temperature T gives a sufficient condition for T in (0,1) to satisfy the condition that the partial randomness of T equals T. In this paper, based on a physical argument on the same level of mathematical strictness as normal statistical mechanics in physics, we develop a total statistical mechanical interpretation of AIT which actualizes a perfect correspondence to normal statistical mechanics. We do this by identifying a microcanonical ensemble in the framework of AIT. As a result, we clarify the statistical mechanical meaning of the thermodynamic quantities of AIT.

  4. Self-Instructional Module Based on Cognitive Load Theory: A Study on Information Retention among Trainee Teachers

    Science.gov (United States)

    Ong, Chiek Pin; Tasir, Zaidatun

    2015-01-01

    The aim of the research is to study the information retention among trainee teachers using a self-instructional printed module based on Cognitive Load Theory for learning spreadsheet software. Effective pedagogical considerations integrating the theoretical concepts related to cognitive load are reflected in the design and development of the…

  5. Application of Bayesian Decision Theory Based on Prior Information in the Multi-Objective Optimization Problem

    Directory of Open Access Journals (Sweden)

    Xia Lei

    2010-12-01

    Full Text Available It is hard for general multi-objective optimization methods to obtain prior information; how to utilize prior information has thus been a challenge. This paper analyzes the characteristics of Bayesian decision-making based on the maximum entropy principle and prior information, especially how to effectively improve decision-making reliability when reference samples are deficient. The paper demonstrates the effectiveness of the proposed method using a real application of multi-frequency offset estimation in a distributed multiple-input multiple-output system. The simulation results demonstrate that Bayesian decision-making based on prior information has better global searching capability when sampling data is deficient.

  6. A Novel Evaluation Method for Building Construction Project Based on Integrated Information Entropy with Reliability Theory

    Directory of Open Access Journals (Sweden)

    Xiao-ping Bai

    2013-01-01

    Full Text Available Selecting construction schemes of a building engineering project is a complex multiobjective optimization decision process in which many indexes need to be selected to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses the quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computing of building construction projects.
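
    The entropy-based weighting behind this kind of index matrix can be sketched with the standard entropy-weight method. The scheme scores below are invented for illustration and assume all indexes have already been normalized so that higher is better.

```python
import math

def entropy_weights(matrix):
    """Entropy-weight method: rows = candidate schemes, columns = indexes.
    Indexes whose scores vary more across schemes carry less entropy and
    therefore receive more weight."""
    m, n = len(matrix), len(matrix[0])
    divergence = []
    for j in range(n):
        col = [row[j] for row in matrix]
        s = sum(col)
        ps = [v / s for v in col]
        e = -sum(p * math.log(p) for p in ps if p > 0) / math.log(m)
        divergence.append(1.0 - e)  # degree of divergence of index j
    total = sum(divergence)
    return [d / total for d in divergence]

# Hypothetical normalized scores of 3 construction schemes on the four
# first-order indexes: cost, progress, quality, safety.
scores = [
    [0.9, 0.6, 0.8, 0.7],
    [0.5, 0.9, 0.6, 0.8],
    [0.7, 0.7, 0.9, 0.6],
]
w = entropy_weights(scores)
synthesis = [sum(wj * row[j] for j, wj in enumerate(w)) for row in scores]
best_scheme = max(range(len(scores)), key=lambda i: synthesis[i])
```

    Sorting the schemes by synthesis score reproduces the "computing the synthesis score, sorting all selected schemes" step of the abstract, with the weights derived from the data rather than assigned subjectively.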

  7. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    Science.gov (United States)

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes of a building engineering project is a complex multiobjective optimization decision process in which many indexes need to be selected to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses the quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computing of building construction projects.

  8. Information systems theory

    CERN Document Server

    Dwivedi, Yogesh K; Schneberger, Scott L

    2011-01-01

    The overall mission of this book is to provide a comprehensive understanding and coverage of the various theories and models used in IS research. Specifically, it aims to focus on the following key objectives: To describe the various theories and models applicable to studying IS/IT management issues. To outline and describe, for each of the various theories and models, independent and dependent constructs, reference discipline/originating area, originating author(s), seminal articles, level of analysis (i.e. firm, individual, industry) and links with other theories. To provide a critical review.

  9. Evaluation of the efficiency of computer-aided spectra search systems based on information theory

    International Nuclear Information System (INIS)

    Schaarschmidt, K.

    1979-01-01

    Application of information theory allows objective evaluation of the efficiency of computer-aided spectra search systems. For this purpose, a significant number of search processes must be analyzed. The amount of information gained by computer application is considered as the difference between the entropy of the data bank and a conditional entropy depending on the proportion of unsuccessful search processes and ballast. The influence of the following factors can be estimated: volume, structure, and quality of the spectra collection stored, efficiency of the encoding instruction and the comparing algorithm, and subjective errors involved in the encoding of spectra. The relations derived are applied to two published storage and retrieval systems for infrared spectra. (Auth.)
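
    The entropy-difference idea in this record can be sketched numerically. The formula below is a simplified reading of it (equiprobable data bank entries, failures teaching nothing, successes narrowing the answer to a hit list), and all parameter values are hypothetical.

```python
import math

def search_information_gain(n_spectra, p_fail, ballast):
    """Information gained per search (bits): entropy of the data bank
    minus the residual conditional entropy.
    n_spectra: entries in the bank (assumed equiprobable);
    p_fail: fraction of unsuccessful search processes;
    ballast: average number of candidate spectra returned on a hit."""
    h_bank = math.log2(n_spectra)
    # Success narrows the uncertainty to the returned candidates;
    # failure leaves the full entropy of the bank.
    h_residual = (1 - p_fail) * math.log2(ballast) + p_fail * h_bank
    return h_bank - h_residual

gain = search_information_gain(n_spectra=50000, p_fail=0.1, ballast=8)
```

    With these assumed figures a search delivers roughly 11.3 of the bank's 15.6 bits, and the same expression shows how growing ballast or a rising failure rate erodes the system's efficiency.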

  10. A Time-Space Domain Information Fusion Method for Specific Emitter Identification Based on Dempster-Shafer Evidence Theory.

    Science.gov (United States)

    Jiang, Wen; Cao, Ying; Yang, Lin; He, Zichang

    2017-08-28

    Specific emitter identification plays an important role in contemporary military affairs. However, most existing specific emitter identification methods have not taken into account the processing of uncertain information. Therefore, this paper proposes a time-space domain information fusion method based on Dempster-Shafer evidence theory, which is able to deal with uncertain information in the process of specific emitter identification. In this paper, each radar generates a group of evidence based on the information it obtains, and our main task is to fuse the multiple groups of evidence to get a reasonable result. Within the framework of a recursive centralized fusion model, the proposed method incorporates a correlation coefficient, which measures the relevance between pieces of evidence, and a quantum mechanical approach based on the parameters of the radar itself. The simulation results of an illustrative example demonstrate that the proposed method can effectively deal with uncertain information and reach a reasonable recognition result.

  11. Web 2.0 systems supporting childhood chronic disease management: design guidelines based on information behaviour and social learning theories.

    Science.gov (United States)

    Ekberg, Joakim; Ericson, Leni; Timpka, Toomas; Eriksson, Henrik; Nordfeldt, Sam; Hanberger, Lena; Ludvigsson, Johnny

    2010-04-01

    Self-directed learning denotes that the individual is in command of what should be learned and why it is important. In this study, guidelines for the design of Web 2.0 systems for supporting diabetic adolescents' everyday learning needs are examined in light of theories about information behaviour and social learning. A Web 2.0 system was developed to support a community of practice, and social learning structures were created to support the building of relations between members on several levels in the community. The features of the system included access to participation in the culture of diabetes management practice, entry to information about the community and about what needs to be learned to be a full practitioner or respected member in the community, and free sharing of information, narratives and experience-based knowledge. After integration with the key elements derived from theories of information behaviour, a preliminary design guideline document was formulated.

  12. Information theory in analytical chemistry

    National Research Council Canada - National Science Library

    Eckschlager, Karel; Danzer, Klaus

    1994-01-01

    Contents: The aim of analytical chemistry - Basic concepts of information theory - Identification of components - Qualitative analysis - Quantitative analysis - Multicomponent analysis - Optimum analytical...

  13. Informed Grounded Theory

    Science.gov (United States)

    Thornberg, Robert

    2012-01-01

    There is a widespread idea that in grounded theory (GT) research, the researcher has to delay the literature review until the end of the analysis to avoid contamination--a dictum that might turn educational researchers away from GT. Nevertheless, in this article the author (a) problematizes the dictum of delaying a literature review in classic…

  14. Genre theory in information studies

    CERN Document Server

    Andersen, Jack

    2015-01-01

    This book highlights the important role genre theory plays within information studies. It illustrates how modern genre studies inform and enrich the study of information, and conversely how the study of information makes its own independent contributions to the study of genre.

  15. Geometric theory of information

    CERN Document Server

    2014-01-01

    This book brings together geometric tools and their applications for information analysis. It collects current and emerging uses of information geometry manifolds in the interdisciplinary fields of Advanced Signal, Image & Video Processing, Complex Data Modeling and Analysis, Information Ranking and Retrieval, Coding, Cognitive Systems, Optimal Control, Statistics on Manifolds, Machine Learning, Speech/Sound Recognition and Natural Language Treatment, all of which are also substantially relevant for industry.

  16. Recoverability in quantum information theory

    Science.gov (United States)

    Wilde, Mark

    The fact that the quantum relative entropy is non-increasing with respect to quantum physical evolutions lies at the core of many optimality theorems in quantum information theory and has applications in other areas of physics. In this work, we establish improvements of this entropy inequality in the form of physically meaningful remainder terms. One of the main results can be summarized informally as follows: if the decrease in quantum relative entropy between two quantum states after a quantum physical evolution is relatively small, then it is possible to perform a recovery operation, such that one can perfectly recover one state while approximately recovering the other. This can be interpreted as quantifying how well one can reverse a quantum physical evolution. Our proof method is elementary, relying on the method of complex interpolation, basic linear algebra, and the recently introduced Renyi generalization of a relative entropy difference. The theorem has a number of applications in quantum information theory, which have to do with providing physically meaningful improvements to many known entropy inequalities. This is based on arXiv:1505.04661, now accepted for publication in Proceedings of the Royal Society A. I acknowledge support from startup funds from the Department of Physics and Astronomy at LSU, the NSF under Award No. CCF-1350397, and the DARPA Quiness Program through US Army Research Office award W31P4Q-12-1-0019.
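A minimal LaTeX sketch of the inequality this abstract describes (the exact remainder term in arXiv:1505.04661 involves rotated Petz recovery maps; this is the informal form):

```latex
% Monotonicity of quantum relative entropy under a channel \mathcal{N}:
D(\rho\|\sigma) \geq D(\mathcal{N}(\rho)\|\mathcal{N}(\sigma))
% Refinement with a recovery remainder term (informal form):
D(\rho\|\sigma) - D(\mathcal{N}(\rho)\|\mathcal{N}(\sigma))
  \geq -2\log F\bigl(\rho, \mathcal{R}(\mathcal{N}(\rho))\bigr)
% where the recovery channel \mathcal{R} perfectly recovers \sigma:
\mathcal{R}(\mathcal{N}(\sigma)) = \sigma
% and F denotes the fidelity F(\rho,\omega) = \|\sqrt{\rho}\sqrt{\omega}\|_1.
```

When the decrease in relative entropy is small, the fidelity is close to 1, so the recovery channel that perfectly restores one state also approximately restores the other.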

  17. Information theory and statistics

    CERN Document Server

    Kullback, Solomon

    1968-01-01

    Highly useful text studies logarithmic measures of information and their application to testing statistical hypotheses. Includes numerous worked examples and problems. References. Glossary. Appendix. 1968 2nd, revised edition.

  18. Multi-Sensor Building Fire Alarm System with Information Fusion Technology Based on D-S Evidence Theory

    Directory of Open Access Journals (Sweden)

    Qian Ding

    2014-10-01

    Multi-sensor and information fusion technology based on Dempster-Shafer evidence theory is applied in a building fire alarm system to realize early detection and alarming. By using multiple sensors to monitor the parameters of the fire process, such as light, smoke, temperature, gas and moisture, the range of fire monitoring in space and time is expanded compared with a single-sensor system. The D-S evidence theory is then applied to fuse the information from the multiple sensors with the specific fire model, making the fire alarm more accurate and timely. The proposed method can effectively avoid failures of the monitoring data, deal robustly with conflicting evidence from the multiple sensors and significantly improve the reliability of fire warning.
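The Dempster-Shafer combination step that this abstract relies on can be sketched in a few lines of Python. The sensor names and mass assignments below are hypothetical, not values from the cited system:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts: frozenset -> mass) with Dempster's
    rule: multiply masses of intersecting hypotheses and renormalize by the
    non-conflicting mass 1 - K."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # K: mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict; evidence cannot be combined")
    k = 1.0 - conflict
    return {h: m / k for h, m in combined.items()}

FIRE, NOFIRE = "fire", "no_fire"
theta = frozenset({FIRE, NOFIRE})  # frame of discernment
# Hypothetical readings: each sensor leaves some mass on "don't know" (theta)
m_smoke = {frozenset({FIRE}): 0.6, frozenset({NOFIRE}): 0.1, theta: 0.3}
m_temp = {frozenset({FIRE}): 0.7, frozenset({NOFIRE}): 0.1, theta: 0.2}
fused = dempster_combine(m_smoke, m_temp)
```

Two weakly committed sensors reinforce each other: the fused belief in "fire" exceeds either sensor's individual assignment, which is exactly how the multi-sensor alarm becomes more decisive than any single detector.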

  19. Robust Feature Selection from Microarray Data Based on Cooperative Game Theory and Qualitative Mutual Information

    Directory of Open Access Journals (Sweden)

    Atiyeh Mortazavi

    2016-01-01

    High dimensionality of microarray data sets may lead to low efficiency and overfitting. In this paper, a multiphase cooperative game theoretic feature selection approach is proposed for microarray data classification. In the first phase, due to the high dimension of microarray data sets, the features are reduced using one of two filter-based feature selection methods, namely mutual information and the Fisher ratio. In the second phase, the Shapley index is used to evaluate the power of each feature. The main innovation of the proposed approach is to employ Qualitative Mutual Information (QMI) for this purpose. Qualitative Mutual Information makes the selected features more stable, and this stability helps to deal with the problem of data imbalance and scarcity. In the third phase, a forward selection scheme is applied which uses a scoring function to weight each feature. The performance of the proposed method is compared with other popular feature selection algorithms such as the Fisher ratio, minimum redundancy maximum relevance, and previous work on cooperative game based feature selection. The average classification accuracy on eleven microarray data sets shows that the proposed method improves both average accuracy and average stability compared to other approaches.
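A minimal sketch of the first (filter) phase described above: score each feature by its mutual information with the class label, then greedily take the best one as the first step of a forward selection. The data here are synthetic, and the paper's Shapley-index and QMI phases are not reproduced:

```python
import numpy as np
from collections import Counter

def mutual_information(x, y):
    """Mutual information (bits) between two discrete 1-D arrays,
    estimated from empirical frequencies."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    mi = 0.0
    for (a, b), c in pxy.items():
        pab = c / n
        mi += pab * np.log2(pab / ((px[a] / n) * (py[b] / n)))
    return mi

# Hypothetical discretized expression matrix (rows = samples, cols = genes)
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 100)        # binary class labels
X = rng.integers(0, 3, (100, 5))   # 5 discretized features, mostly noise
X[:, 2] = y                        # feature 2 made perfectly informative
scores = [mutual_information(X[:, j], y) for j in range(X.shape[1])]
best = int(np.argmax(scores))      # greedy first pick of forward selection
```

Since feature 2 duplicates the label, its score equals the label entropy (about 1 bit) while the noise features score near zero, so the greedy pick recovers it.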

  20. An information theory based approach for quantitative evaluation of man-machine interface complexity

    International Nuclear Information System (INIS)

    Kang, Hyun Gook

    1999-02-01

    In complex and high-risk work conditions, especially such as in nuclear power plants, human understanding of the plant is highly cognitive and thus largely dependent on the effectiveness of the man-machine interface system. In order to provide more effective and reliable operating conditions for future nuclear power plants, developing more credible and easy-to-use evaluation methods will afford great help in designing interface systems in a more efficient manner. In this study, in order to analyze human-machine interactions, I propose the Human-processor Communication (HPC) model, which is based on the information flow concept. It identifies the information flow around a human-processor. Information flow has two aspects: appearance and content. Based on the HPC model, I propose two kinds of measures for evaluating a user interface from the viewpoint of these two aspects of information flow. They measure the communicative complexity of each aspect. For the evaluation of the appearance aspect, I propose three complexity measures: Operation Complexity, Transition Complexity, and Screen Complexity. Each of these measures has its own physical meaning. Two experiments carried out in this work support the utility of these measures. The result of the quiz game experiment shows that as the complexity of the task context increases, the usage of the interface system becomes more complex. The experimental results of the three example systems (digital view, LDP style view and hierarchy view) show the utility of the proposed complexity measures. For the evaluation of the content aspect, I propose the degree of informational coincidence, R(K, P), as a measure of the usefulness of an alarm-processing system. It is designed to perform user-oriented evaluation based on the informational entropy concept. It will be especially useful in early design phases because designers can estimate the usefulness of an alarm system by short calculations instead

  1. An information theory based approach for quantitative evaluation of man-machine interface complexity

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Hyun Gook

    1999-02-15

    In complex and high-risk work conditions, especially such as in nuclear power plants, human understanding of the plant is highly cognitive and thus largely dependent on the effectiveness of the man-machine interface system. In order to provide more effective and reliable operating conditions for future nuclear power plants, developing more credible and easy-to-use evaluation methods will afford great help in designing interface systems in a more efficient manner. In this study, in order to analyze human-machine interactions, I propose the Human-processor Communication (HPC) model, which is based on the information flow concept. It identifies the information flow around a human-processor. Information flow has two aspects: appearance and content. Based on the HPC model, I propose two kinds of measures for evaluating a user interface from the viewpoint of these two aspects of information flow. They measure the communicative complexity of each aspect. For the evaluation of the appearance aspect, I propose three complexity measures: Operation Complexity, Transition Complexity, and Screen Complexity. Each of these measures has its own physical meaning. Two experiments carried out in this work support the utility of these measures. The result of the quiz game experiment shows that as the complexity of the task context increases, the usage of the interface system becomes more complex. The experimental results of the three example systems (digital view, LDP style view and hierarchy view) show the utility of the proposed complexity measures. For the evaluation of the content aspect, I propose the degree of informational coincidence, R(K, P), as a measure of the usefulness of an alarm-processing system. It is designed to perform user-oriented evaluation based on the informational entropy concept. It will be especially useful in early design phases because designers can estimate the usefulness of an alarm system by short calculations instead

  2. Quantum: information theory: technological challenge

    International Nuclear Information System (INIS)

    Calixto, M.

    2001-01-01

    The new Quantum Information Theory augurs powerful machines that obey the entangled logic of the subatomic world. Parallelism, entanglement, teleportation, no-cloning and quantum cryptography are typical peculiarities of this novel way of understanding computation. (Author) 24 refs

  3. An informational theory of privacy

    NARCIS (Netherlands)

    Schottmuller, C.; Jann, Ole

    2016-01-01

    We develop a theory that explains how and when privacy can increase welfare. Without privacy, some individuals misrepresent their preferences, because they will otherwise be statistically discriminated against. This "chilling effect" hurts them individually, and impairs information aggregation. The

  4. Position-specific prediction of methylation sites from sequence conservation based on information theory.

    Science.gov (United States)

    Shi, Yinan; Guo, Yanzhi; Hu, Yayun; Li, Menglong

    2015-07-23

    Protein methylation plays vital roles in many biological processes and has been implicated in various human diseases. To fully understand the mechanisms underlying methylation for use in drug design and work in methylation-related diseases, an initial but crucial step is to identify methylation sites. The use of high-throughput bioinformatics methods has become imperative to predict methylation sites. In this study, we developed a novel method that is based only on sequence conservation to predict protein methylation sites. Conservation difference profiles between methylated and non-methylated peptides were constructed by the information entropy (IE) in a wider neighbor interval around the methylation sites that fully incorporated all of the environmental information. Then, the distinctive neighbor residues were identified by the importance scores of information gain (IG). The most representative model was constructed by support vector machine (SVM) for Arginine and Lysine methylation, respectively. This model yielded a promising result on both the benchmark dataset and independent test set. The model was used to screen the entire human proteome, and many unknown substrates were identified. These results indicate that our method can serve as a useful supplement to elucidate the mechanism of protein methylation and facilitate hypothesis-driven experimental design and validation.
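The conservation-difference profile described above rests on per-position Shannon entropy. A toy sketch, assuming short aligned windows around the candidate site; the peptides below are invented for illustration:

```python
import math
from collections import Counter

def column_entropy(seqs, pos):
    """Shannon entropy (bits) of the residues at one position of
    equal-length aligned peptides; 0 means perfect conservation."""
    counts = Counter(s[pos] for s in seqs)
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical 5-residue windows centered on a candidate lysine (position 2)
methylated = ["GAKSF", "GAKSY", "GAKTF", "GAKSF"]
non_methylated = ["PLKQE", "VRKMD", "ILKAN", "TWKGH"]

# Conservation-difference profile: a large positive value marks a position
# that is conserved among methylated peptides but variable elsewhere
profile = [column_entropy(non_methylated, i) - column_entropy(methylated, i)
           for i in range(5)]
```

Positions with large profile values are the distinctive neighbor residues; in the paper these are ranked by information gain and fed to the SVM.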

  5. Novel information theory-based measures for quantifying incongruence among phylogenetic trees.

    Science.gov (United States)

    Salichos, Leonidas; Stamatakis, Alexandros; Rokas, Antonis

    2014-05-01

    Phylogenies inferred from different data matrices often conflict with each other necessitating the development of measures that quantify this incongruence. Here, we introduce novel measures that use information theory to quantify the degree of conflict or incongruence among all nontrivial bipartitions present in a set of trees. The first measure, internode certainty (IC), calculates the degree of certainty for a given internode by considering the frequency of the bipartition defined by the internode (internal branch) in a given set of trees jointly with that of the most prevalent conflicting bipartition in the same tree set. The second measure, IC All (ICA), calculates the degree of certainty for a given internode by considering the frequency of the bipartition defined by the internode in a given set of trees in conjunction with that of all conflicting bipartitions in the same underlying tree set. Finally, the tree certainty (TC) and TC All (TCA) measures are the sum of IC and ICA values across all internodes of a phylogeny, respectively. IC, ICA, TC, and TCA can be calculated from different types of data that contain nontrivial bipartitions, including from bootstrap replicate trees to gene trees or individual characters. Given a set of phylogenetic trees, the IC and ICA values of a given internode reflect its specific degree of incongruence, and the TC and TCA values describe the global degree of incongruence between trees in the set. All four measures are implemented and freely available in version 8.0.0 and subsequent versions of the widely used program RAxML.
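A small sketch of the IC definition described above: one minus the entropy of the two competing bipartition frequencies after normalizing them to sum to 1 (frequencies here are illustrative):

```python
import math

def internode_certainty(f_bipartition, f_conflict):
    """IC for an internal branch, from the frequency of its bipartition and
    that of the most prevalent conflicting bipartition in the tree set.
    IC = 1 when there is no conflict; IC = 0 when the two bipartitions are
    equally frequent (maximal incongruence)."""
    total = f_bipartition + f_conflict
    ic = 1.0
    for f in (f_bipartition, f_conflict):
        p = f / total
        if p > 0:
            ic += p * math.log2(p)  # subtract the entropy contribution
    return ic

# e.g. a bipartition seen in 95 of 100 gene trees vs. a conflict seen in 5
ic = internode_certainty(95, 5)
```

TC is then simply the sum of IC over all internodes of the phylogeny; ICA/TCA extend the entropy to all conflicting bipartitions rather than only the most prevalent one.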

  6. Information theory in molecular biology

    OpenAIRE

    Adami, Christoph

    2004-01-01

    This article introduces the physics of information in the context of molecular biology and genomics. Entropy and information, the two central concepts of Shannon's theory of information and communication, are often confused with each other but play transparent roles when applied to statistical ensembles (i.e., identically prepared sets) of symbolic sequences. Such an approach can distinguish between entropy and information in genes, predict the secondary structure of ribozymes, and detect the...

  7. The theory of quantum information

    CERN Document Server

    Watrous, John

    2018-01-01

    This largely self-contained book on the theory of quantum information focuses on precise mathematical formulations and proofs of fundamental facts that form the foundation of the subject. It is intended for graduate students and researchers in mathematics, computer science, and theoretical physics seeking to develop a thorough understanding of key results, proof techniques, and methodologies that are relevant to a wide range of research topics within the theory of quantum information and computation. The book is accessible to readers with an understanding of basic mathematics, including linear algebra, mathematical analysis, and probability theory. An introductory chapter summarizes these necessary mathematical prerequisites, and starting from this foundation, the book includes clear and complete proofs of all results it presents. Each subsequent chapter includes challenging exercises intended to help readers to develop their own skills for discovering proofs concerning the theory of quantum information.

  8. Selective information seeking: can consumers' avoidance of evidence-based information on colorectal cancer screening be explained by the theory of cognitive dissonance?

    Directory of Open Access Journals (Sweden)

    Mühlhauser, Ingrid

    2007-08-01

    Background: Evidence-based patient information (EBPI) is a prerequisite for informed decision-making. However, presentation of EBPI may lead to irrational reactions causing avoidance, minimisation and devaluation of the information. Objective: To explore whether the theory of cognitive dissonance is applicable to medical decision-making and useful to explain these phenomena. Setting and participants: 261 volunteers from Hamburg (157 women), ≥50 years old, without diagnosis of colorectal cancer. Design and variables: Within an experiment we simulated information seeking on colorectal cancer screening. Consumers' attitudes towards screening were surveyed using a rating scale from -5 (participate in no way) to +5 (participate unconditionally) (independent variable). Using a cover story, participants were asked to sort 5 article headlines according to their reading preferences. The headlines simulated the pro-to-contra variety of contents to be found in print media about colorectal cancer screening. The dependent variable was the sequence of article headlines. Results: Participants were very much in favour of screening, with scores of 4.0 (0.1) for the faecal occult blood test and 3.3 (0.1) for colonoscopy. According to our hypothesis, we found statistically significant positive correlations between the stimuli in favour of screening and attitudes, and significant negative correlations between the stimuli against screening and attitudes. Conclusion: The theory of cognitive dissonance is applicable to medical decision-making. It may explain some phenomena of irrational reactions to evidence-based patient information.

  9. Selective information seeking: can consumers' avoidance of evidence-based information on colorectal cancer screening be explained by the theory of cognitive dissonance?

    Science.gov (United States)

    Steckelberg, Anke; Kasper, Jürgen; Mühlhauser, Ingrid

    2007-08-27

    Evidence-based patient information (EBPI) is a prerequisite for informed decision-making. However, presentation of EBPI may lead to irrational reactions causing avoidance, minimisation and devaluation of the information. To explore whether the theory of cognitive dissonance is applicable to medical decision-making and useful to explain these phenomena. 261 volunteers from Hamburg (157 women), ≥50 years old, without diagnosis of colorectal cancer. DESIGN AND VARIABLES: Within an experiment we simulated information seeking on colorectal cancer screening. Consumers' attitudes towards screening were surveyed using a rating scale from -5 (participate in no way) to +5 (participate unconditionally) (independent variable). Using a cover story, participants were asked to sort 5 article headlines according to their reading preferences. The headlines simulated the pro-to-contra variety of contents to be found in print media about colorectal cancer screening. The dependent variable was the sequence of article headlines. Participants were very much in favour of screening, with scores of 4.0 (0.1) for the faecal occult blood test and 3.3 (0.1) for colonoscopy. According to our hypothesis, we found statistically significant positive correlations between the stimuli in favour of screening and attitudes, and significant negative correlations between the stimuli against screening and attitudes. The theory of cognitive dissonance is applicable to medical decision-making. It may explain some phenomena of irrational reactions to evidence-based patient information.

  10. Financial markets theory equilibrium, efficiency and information

    CERN Document Server

    Barucci, Emilio

    2017-01-01

    This work, now in a thoroughly revised second edition, presents the economic foundations of financial markets theory from a mathematically rigorous standpoint and offers a self-contained critical discussion based on empirical results. It is the only textbook on the subject to include more than two hundred exercises, with detailed solutions to selected exercises. Financial Markets Theory covers classical asset pricing theory in great detail, including utility theory, equilibrium theory, portfolio selection, mean-variance portfolio theory, CAPM, CCAPM, APT, and the Modigliani-Miller theorem. Starting from an analysis of the empirical evidence on the theory, the authors provide a discussion of the relevant literature, pointing out the main advances in classical asset pricing theory and the new approaches designed to address asset pricing puzzles and open problems (e.g., behavioral finance). Later chapters in the book contain more advanced material, including on the role of information in financial markets, non-c...

  11. Processing Information in Quantum Decision Theory

    OpenAIRE

    Yukalov, V. I.; Sornette, D.

    2008-01-01

    A survey is given summarizing the state of the art of describing information processing in Quantum Decision Theory, which has been recently advanced as a novel variant of decision making, based on the mathematical theory of separable Hilbert spaces. This mathematical structure captures the effect of superposition of composite prospects, including many incorporated intended actions. The theory characterizes entangled decision making, non-commutativity of subsequent decisions, and intention int...

  12. The Quantitative Theory of Information

    DEFF Research Database (Denmark)

    Topsøe, Flemming; Harremoës, Peter

    2008-01-01

    Information Theory as developed by Shannon and followers is becoming more and more important in a number of sciences. The concepts appear to be just the right ones with intuitively appealing operational interpretations. Furthermore, the information theoretical quantities are connected by powerful...

  13. Event-based criteria in GT-STAF information indices: theory, exploratory diversity analysis and QSPR applications.

    Science.gov (United States)

    Barigye, S J; Marrero-Ponce, Y; Martínez López, Y; Martínez Santiago, O; Torrens, F; García Domenech, R; Galvez, J

    2013-01-01

    Versatile event-based approaches for the definition of novel information theory-based indices (IFIs) are presented. An event in this context is the criterion followed in the "discovery" of molecular substructures, which in turn serve as the basis for the construction of the generalized incidence and relations frequency matrices, Q and F, respectively. From the resultant F, Shannon's, mutual, conditional and joint entropy-based IFIs are computed. In previous reports, an event named connected subgraphs was presented. The present study is an extension of this notion, in which we introduce other events, namely: terminal paths, vertex path incidence, quantum subgraphs, walks of length k, Sach's subgraphs, MACCs, E-state and substructure fingerprints and, finally, Ghose and Crippen atom-types for hydrophobicity and refractivity. Moreover, we define magnitude-based IFIs, introducing the use of the magnitude criterion in the definition of mutual, conditional and joint entropy-based IFIs. We also discuss the use of information-theoretic parameters as a measure of the dissimilarity of codified structural information of molecules. Finally, a comparison of the statistics for QSPR models obtained with the proposed IFIs and DRAGON's molecular descriptors for two physicochemical properties, log P and log K, of 34 derivatives of 2-furylethylenes demonstrates similar or better predictive ability than the latter.

  14. Quantum information and relativity theory

    International Nuclear Information System (INIS)

    Peres, Asher; Terno, Daniel R.

    2004-01-01

    This article discusses the intimate relationship between quantum mechanics, information theory, and relativity theory. Taken together these are the foundations of present-day theoretical physics, and their interrelationship is an essential part of the theory. The acquisition of information from a quantum system by an observer occurs at the interface of classical and quantum physics. The authors review the essential tools needed to describe this interface, i.e., Kraus matrices and positive-operator-valued measures. They then discuss how special relativity imposes severe restrictions on the transfer of information between distant systems and the implications of the fact that quantum entropy is not a Lorentz-covariant concept. This leads to a discussion of how it comes about that Lorentz transformations of reduced density matrices for entangled systems may not be completely positive maps. Quantum field theory is, of course, necessary for a consistent description of interactions. Its structure implies a fundamental tradeoff between detector reliability and localizability. Moreover, general relativity produces new and counterintuitive effects, particularly when black holes (or, more generally, event horizons) are involved. In this more general context the authors discuss how most of the current concepts in quantum information theory may require a reassessment

  15. Towards a critical theory of information

    Directory of Open Access Journals (Sweden)

    Christian Fuchs

    2009-11-01

    The debate on redistribution and recognition between critical theorists Nancy Fraser and Axel Honneth gives the opportunity to renew the discussion of the relationship of base and superstructure in critical social theory. Critical information theory needs to be aware of economic, political, and cultural demands that it needs to make in struggles for ending domination and oppression, and of the unifying role that the economy and class play in these demands and struggles. Objective and subjective information concepts are based on the underlying worldview of reification. Reification endangers human existence. Information as process and relation enables political and ethical alternatives that have radical implications for society.

  16. Information Theory and Plasma Turbulence

    International Nuclear Information System (INIS)

    Dendy, R. O.

    2009-01-01

    Information theory, applied directly to measured signals, yields new perspectives on, and quantitative knowledge of, the physics of strongly nonlinear and turbulent phenomena in plasmas. It represents a new and productive element of the topical research programmes that use modern techniques to characterise strongly nonlinear signals from plasmas, and that address global plasma behaviour from a complex systems perspective. We here review some pioneering studies of mutual information in solar wind and magnetospheric plasmas, using techniques tested on standard complex systems.

  17. Information, Understanding, and Influence: An Agency Theory Strategy for Air Base Communications and Cyberspace Support

    Science.gov (United States)

    2014-05-01

    high speed data network. The system went operational in 1963 and would go on to grow into the Department of Defense Automated Digital Information...decisions to a single agent. For example, in wholesale merchandise trade, manufacturer agents (common agent) represent the potentially conflicting...operations. “It goes without saying that subversion, espionage, and sabotage— digitally facilitated or not—may accompany military operations,” Rid

  18. A computational model for knowledge-driven monitoring of nuclear power plant operators based on information theory

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Seong, Poong Hyun

    2006-01-01

    To develop operator behavior models such as IDAC, quantitative models for the cognitive activities of nuclear power plant (NPP) operators in abnormal situations are essential. Among them, only a few quantitative models for monitoring and detection have been developed. In this paper, we propose a computational model for the knowledge-driven monitoring, also known as model-driven monitoring, of NPP operators in abnormal situations, based on information theory. The basic assumption of the proposed model is that the probability that an operator shifts his or her attention to an information source is proportional to the expected information from that information source. A small experiment performed to evaluate the feasibility of the proposed model shows that the predictions made by the proposed model have high correlations with the experimental results. Even though it has been argued that heuristics might play an important role in human reasoning, we believe that the proposed model can provide part of the mathematical basis for developing quantitative models for knowledge-driven monitoring of NPP operators when operators are assumed to behave very logically.
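The model's basic assumption, attention shifting in proportion to expected information, can be illustrated in a few lines of Python. The indicator names and outcome distributions below are invented, not taken from the cited experiment:

```python
import math

def expected_information(probs):
    """Shannon entropy (bits) over an indicator's possible readings: the
    information an operator expects to gain by checking that indicator."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical operator-assessed outcome distributions for three indicators
indicators = {
    "pressurizer_level": [0.5, 0.5],  # most uncertain -> most informative
    "SG_pressure": [0.9, 0.1],
    "containment_temp": [1.0],        # outcome already known -> nothing to learn
}
h = {name: expected_information(p) for name, p in indicators.items()}
total = sum(h.values())
# Attention-shift probabilities proportional to expected information
attention = {name: v / total for name, v in h.items()}
```

Under this assumption the operator never checks an indicator whose reading is already certain, and is most likely to look at the most uncertain one, which is the behavior the proposed model predicts.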

  19. On Representation in Information Theory

    Directory of Open Access Journals (Sweden)

    Joseph E. Brenner

    2011-09-01

    Semiotics is widely applied in theories of information. Following the original triadic characterization of reality by Peirce, the linguistic processes involved in information—production, transmission, reception, and understanding—would all appear to be interpretable in terms of signs and their relations to their objects. Perhaps the most important of these relations is that of representation: one entity standing for or representing some other. For example, an index—one of the three major kinds of signs—is said to represent something by being directly related to its object. My position, however, is that the concept of symbolic representations having such roles in information, as intermediaries, is fraught with the same difficulties as in representational theories of mind. I have proposed an extension of logic to complex real phenomena, including mind and information (Logic in Reality; LIR), most recently at the 4th International Conference on the Foundations of Information Science (Beijing, August 2010). LIR provides explanations for the evolution of complex processes, including information, that do not require any entities other than the processes themselves. In this paper, I discuss the limitations of the standard relation of representation. I argue that more realistic pictures of informational systems can be provided by reference to information as an energetic process, following the categorial ontology of LIR. This approach enables naïve, anti-realist conceptions of anti-representationalism to be avoided, and enables an approach to both information and meaning in the same novel logical framework.

  20. Quantum Information Theory - an Invitation

    Science.gov (United States)

    Werner, Reinhard F.

    Quantum information and quantum computers have received a lot of public attention recently. Quantum computers have been advertised as a kind of warp drive for computing, and indeed the promise of the algorithms of Shor and Grover is to perform computations which are extremely hard or even provably impossible on any merely "classical" computer. In this article I give an account of the basic concepts of quantum information theory, staying as much as possible in the area of general agreement. The article is divided into two parts. The first (up to the end of Sect. 2.5) is mostly in plain English, centered around the exploration of what can or cannot be done with quantum systems as information carriers. The second part, Sect. 2.6, then gives a description of the mathematical structures and of some of the tools needed to develop the theory.

  1. An Information Theory-Based Approach to Assessing the Sustainability and Stability of an Island System

    Science.gov (United States)

    It is well-documented that a sustainable system is based on environmental stewardship, economic viability and social equity. What is often overlooked is the need for continuity such that desirable system behavior is maintained with mechanisms in place that facilitate the ability ...

  2. Modelling and Analysis of Automobile Vibration System Based on Fuzzy Theory under Different Road Excitation Information

    Directory of Open Access Journals (Sweden)

    Xue-wen Chen

    2018-01-01

    Full Text Available A fuzzy increment controller is designed for the vibration system of an automobile active suspension with seven degrees of freedom (DOF). For decreasing vibration, an active control force is obtained from a Proportion-Integration-Differentiation (PID) controller. The controller's parameters are adjusted by a fuzzy increment controller with self-modifying parameter functions, which adopts as input variables, based on 49 fuzzy control rules, the deviation and its rate of change between the body's vertical vibration velocity and the desired value at the front and rear suspension positions. Using Simulink, the fuzzy increment controller is validated under different road excitations, such as white noise input with four-wheel correlation in the time domain, sinusoidal input, and pulse input on a C-grade road surface. The simulation results show that the proposed controller obviously reduces vehicle vibration compared to other independent control types in performance indexes such as the root mean square value of the body's vertical vibration acceleration, pitching, and rolling angular acceleration.
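    The control scheme this abstract describes, a PID controller whose gains are retuned online by a fuzzy increment rule base, can be sketched as follows. This is a minimal illustrative stand-in, not the paper's implementation: the `_increments` rule is a crude surrogate for the 49-rule fuzzy table, and the plant model, gains, and bounds are all hypothetical.

    ```python
    class FuzzyIncrementPID:
        """Sketch of a PID controller whose gains receive small online
        increments derived from the error and its rate of change,
        standing in for a fuzzy inference table (all values illustrative)."""

        def __init__(self, kp=2.0, ki=0.5, kd=0.1, dt=0.01):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_err = 0.0

        def _increments(self, e, de):
            # Crude stand-in for fuzzy inference: larger error and
            # error-rate magnitudes give larger, bounded gain corrections.
            clip = lambda v: max(-0.02, min(0.02, v))
            return clip(0.05 * abs(e)), clip(0.005 * abs(e)), clip(0.005 * abs(de))

        def control(self, setpoint, measured):
            e = setpoint - measured
            de = (e - self.prev_err) / self.dt
            dkp, dki, dkd = self._increments(e, de)
            self.kp += dkp
            self.ki += dki
            self.kd += dkd
            self.integral += e * self.dt
            self.prev_err = e
            return self.kp * e + self.ki * self.integral + self.kd * de

    # Drive a toy first-order "suspension" state toward zero displacement.
    x, pid = 1.0, FuzzyIncrementPID()
    for _ in range(2000):
        u = pid.control(0.0, x)
        x += (-0.5 * x + 0.1 * u) * pid.dt   # hypothetical plant dynamics
    ```

    The point of the sketch is only the structure: the PID output law stays classical, while a separate rule layer nudges the gains each step from the same error signals it already observes.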

  3. Quantum information theory mathematical foundation

    CERN Document Server

    Hayashi, Masahito

    2017-01-01

    This graduate textbook provides a unified view of quantum information theory. Clearly explaining the necessary mathematical basis, it merges key topics from both information-theoretic and quantum-mechanical viewpoints and provides lucid explanations of the basic results. Thanks to this unified approach, it makes accessible such advanced topics in quantum communication as quantum teleportation, superdense coding, quantum state transmission (quantum error-correction) and quantum encryption. Since the publication of the preceding book Quantum Information: An Introduction, there have been tremendous strides in the field of quantum information. In particular, the following topics – all of which are addressed here – have seen major advances: quantum state discrimination, quantum channel capacity, bipartite and multipartite entanglement, security analysis on quantum communication, reverse Shannon theorem and uncertainty relation. With regard to the analysis of quantum security, the present book employs an impro...

  4. Rebooting Kirkpatrick: Integrating Information System Theory Into the Evaluation of Web-based Continuing Professional Development Interventions for Interprofessional Education.

    Science.gov (United States)

    Shen, Nelson; Yufe, Shira; Saadatfard, Omid; Sockalingam, Sanjeev; Wiljer, David

    2017-01-01

    Information system research has stressed the importance of theory in understanding how user perceptions can motivate the use and adoption of technology such as web-based continuing professional development programs for interprofessional education (WCPD-IPE). A systematic review was conducted to provide an information system perspective on the current state of WCPD-IPE program evaluation and how current evaluations capture essential theoretical constructs in promoting technology adoption. Six databases were searched to identify studies evaluating WCPD-IPE. Three investigators determined eligibility of the articles. Evaluation items extracted from the studies were assessed using the Kirkpatrick-Barr framework and mapped to the Benefits Evaluation Framework. Thirty-seven eligible studies yielded 362 evaluation items for analysis. Most items (n = 252) were assessed as Kirkpatrick-Barr level 1 (reaction) and were mainly focused on the quality (information, service, and system) and satisfaction dimensions of the Benefits Evaluation. System quality was the least evaluated quality dimension, accounting for 26 items across 13 studies. WCPD-IPE use was reported in 17 studies and its antecedent factors were evaluated in varying degrees of comprehensiveness. Although user reactions were commonly evaluated, greater focus on user perceptions of system quality (ie, functionality and performance), usefulness, and usability of the web-based platform is required. Surprisingly, WCPD-IPE use was reported in less than half of the studies. This is problematic as use is a prerequisite to realizing any individual, organizational, or societal benefit of WCPD-IPE. This review proposes an integrated framework which accounts for these factors and provides a theoretically grounded guide for future evaluations.

  5. Towards an Information Retrieval Theory of Everything

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Lammerink, J.M.W.; Katoen, Joost P.; Kok, J.N.; van de Pol, Jan Cornelis; Raamsdonk, F.

    2009-01-01

    I present three well-known probabilistic models of information retrieval in tutorial style: The binary independence probabilistic model, the language modeling approach, and Google's page rank. Although all three models are based on probability theory, they are very different in nature. Each model

  6. Reasonable fermionic quantum information theories require relativity

    International Nuclear Information System (INIS)

    Friis, Nicolai

    2016-01-01

    We show that any quantum information theory based on anticommuting operators must be supplemented by a superselection rule deeply rooted in relativity to establish a reasonable notion of entanglement. While quantum information may be encoded in the fermionic Fock space, the unrestricted theory has a peculiar feature: the marginals of bipartite pure states need not have identical entropies, which leads to an ambiguous definition of entanglement. We solve this problem by proving that it is removed by relativity, i.e., by the parity superselection rule that arises from Lorentz invariance via the spin-statistics connection. Our results hence unveil a fundamental conceptual inseparability of quantum information and the causal structure of relativistic field theory. (paper)

  7. The Identification of Reasons, Solutions, and Techniques Informing a Theory-Based Intervention Targeting Recreational Sports Participation

    Science.gov (United States)

    St Quinton, Tom; Brunton, Julie A.

    2018-01-01

    Purpose: This study is the 3rd piece of formative research utilizing the theory of planned behavior to inform the development of a behavior change intervention. Focus groups were used to identify reasons for and solutions to previously identified key beliefs in addition to potentially effective behavior change techniques. Method: A purposive…

  8. Geometrical identification of quantum and information theories

    International Nuclear Information System (INIS)

    Caianiello, E.R.

    1983-01-01

    The interrelation of quantum and information theories is investigated on the basis of the concept of cross-entropy. It is assumed that ''complex information geometry'' may serve as a tool for ''technological transfer'' from one research field to another that is not directly connected with the first. It is pointed out that the ''infinitesimal distance'' ds² and the ''infinitesimal cross-entropy'' dH_c coincide

  9. AN EDUCATIONAL THEORY MODEL--(SIGGS), AN INTEGRATION OF SET THEORY, INFORMATION THEORY, AND GRAPH THEORY WITH GENERAL SYSTEMS THEORY.

    Science.gov (United States)

    MACCIA, ELIZABETH S.; AND OTHERS

    AN ANNOTATED BIBLIOGRAPHY OF 20 ITEMS AND A DISCUSSION OF ITS SIGNIFICANCE WAS PRESENTED TO DESCRIBE CURRENT UTILIZATION OF SUBJECT THEORIES IN THE CONSTRUCTION OF AN EDUCATIONAL THEORY. ALSO, A THEORY MODEL WAS USED TO DEMONSTRATE CONSTRUCTION OF A SCIENTIFIC EDUCATIONAL THEORY. THE THEORY MODEL INCORPORATED SET THEORY (S), INFORMATION THEORY…

  10. Chemical Thermodynamics and Information Theory with Applications

    CERN Document Server

    Graham, Daniel J

    2011-01-01

    Thermodynamics and information theory touch every facet of chemistry. However, the physical chemistry curriculum digested by students worldwide is still heavily skewed toward heat/work principles established more than a century ago. Rectifying this situation, Chemical Thermodynamics and Information Theory with Applications explores applications drawn from the intersection of thermodynamics and information theory--two mature and far-reaching fields. In an approach that intertwines information science and chemistry, this book covers: The informational aspects of thermodynamic state equations The

  11. Multimedia information retrieval theory and techniques

    CERN Document Server

    Raieli, Roberto

    2013-01-01

    Novel processing and searching tools for the management of new multimedia documents have been developed. Multimedia Information Retrieval (MMIR) is an organic system made up of Text Retrieval (TR); Visual Retrieval (VR); Video Retrieval (VDR); and Audio Retrieval (AR) systems. So that each type of digital document may be analysed and searched by the elements of language appropriate to its nature, search criteria must be extended. Such an approach is known as Content-Based Information Retrieval (CBIR), and is the core of MMIR. This novel content-based concept of information handling needs to be integrated with more traditional semantics. Multimedia Information Retrieval focuses on the tools of processing and searching applicable to the content-based management of new multimedia documents. Translated from Italian by Giles Smith, the book is divided into two parts. Part one discusses MMIR and related theories, and puts forward new methodologies; part two reviews various experimental and operating MMIR systems, a...

  12. Introduction to coding and information theory

    CERN Document Server

    Roman, Steven

    1997-01-01

    This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and such specific codes as Hamming codes, the simplex codes, and many others.
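    The entropy and noiseless-coding ideas summarized here can be made concrete with a short computation; the sample message and resulting probabilities below are purely illustrative. Shannon's Noiseless Coding Theorem says the source entropy H(X) = -Σ p log₂ p lower-bounds the average length (in bits per symbol) of any uniquely decodable code, while a naive fixed-length code needs ⌈log₂|alphabet|⌉ bits.

    ```python
    import math
    from collections import Counter

    def shannon_entropy(probabilities):
        """H(X) = -sum(p * log2 p), in bits per symbol; zero-probability
        symbols contribute nothing."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # Empirical source: symbol frequencies from a sample message.
    message = "abracadabra"
    counts = Counter(message)
    probs = [c / len(message) for c in counts.values()]

    h = shannon_entropy(probs)                    # entropy of the source
    fixed = math.ceil(math.log2(len(counts)))     # bits for a fixed-length code
    ```

    For this message the entropy comes out just above 2 bits/symbol against 3 bits for a fixed-length code over its 5-symbol alphabet, which is exactly the kind of gap a Huffman-style variable-length code exploits.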

  13. Comparing cosmic web classifiers using information theory

    International Nuclear Information System (INIS)

    Leclercq, Florent; Lavaux, Guilhem; Wandelt, Benjamin; Jasche, Jens

    2016-01-01

    We introduce a decision scheme for optimally choosing a classifier, which segments the cosmic web into different structure types (voids, sheets, filaments, and clusters). Our framework, based on information theory, accounts for the design aims of different classes of possible applications: (i) parameter inference, (ii) model selection, and (iii) prediction of new observations. As an illustration, we use cosmographic maps of web-types in the Sloan Digital Sky Survey to assess the relative performance of the classifiers T-WEB, DIVA and ORIGAMI for: (i) analyzing the morphology of the cosmic web, (ii) discriminating dark energy models, and (iii) predicting galaxy colors. Our study substantiates a data-supported connection between cosmic web analysis and information theory, and paves the path towards principled design of analysis procedures for the next generation of galaxy surveys. We have made the cosmic web maps, galaxy catalog, and analysis scripts used in this work publicly available.

  14. Comparing cosmic web classifiers using information theory

    Energy Technology Data Exchange (ETDEWEB)

    Leclercq, Florent [Institute of Cosmology and Gravitation (ICG), University of Portsmouth, Dennis Sciama Building, Burnaby Road, Portsmouth PO1 3FX (United Kingdom); Lavaux, Guilhem; Wandelt, Benjamin [Institut d' Astrophysique de Paris (IAP), UMR 7095, CNRS – UPMC Université Paris 6, Sorbonne Universités, 98bis boulevard Arago, F-75014 Paris (France); Jasche, Jens, E-mail: florent.leclercq@polytechnique.org, E-mail: lavaux@iap.fr, E-mail: j.jasche@tum.de, E-mail: wandelt@iap.fr [Excellence Cluster Universe, Technische Universität München, Boltzmannstrasse 2, D-85748 Garching (Germany)

    2016-08-01

    We introduce a decision scheme for optimally choosing a classifier, which segments the cosmic web into different structure types (voids, sheets, filaments, and clusters). Our framework, based on information theory, accounts for the design aims of different classes of possible applications: (i) parameter inference, (ii) model selection, and (iii) prediction of new observations. As an illustration, we use cosmographic maps of web-types in the Sloan Digital Sky Survey to assess the relative performance of the classifiers T-WEB, DIVA and ORIGAMI for: (i) analyzing the morphology of the cosmic web, (ii) discriminating dark energy models, and (iii) predicting galaxy colors. Our study substantiates a data-supported connection between cosmic web analysis and information theory, and paves the path towards principled design of analysis procedures for the next generation of galaxy surveys. We have made the cosmic web maps, galaxy catalog, and analysis scripts used in this work publicly available.

  15. Quantum information theory and quantum statistics

    International Nuclear Information System (INIS)

    Petz, D.

    2008-01-01

    Based on lectures given by the author, this book focuses on providing reliable introductory explanations of key concepts of quantum information theory and quantum statistics - rather than on results. The mathematically rigorous presentation is supported by numerous examples and exercises and by an appendix summarizing the relevant aspects of linear analysis. Assuming that the reader is familiar with the content of standard undergraduate courses in quantum mechanics, probability theory, linear algebra and functional analysis, the book addresses graduate students of mathematics and physics as well as theoretical and mathematical physicists. Conceived as a primer to bridge the gap between statistical physics and quantum information, a field to which the author has contributed significantly himself, it emphasizes concepts and thorough discussions of the fundamental notions to prepare the reader for deeper studies, not least through the selection of well chosen exercises. (orig.)

  16. Theory-Based Stakeholder Evaluation

    Science.gov (United States)

    Hansen, Morten Balle; Vedung, Evert

    2010-01-01

    This article introduces a new approach to program theory evaluation called theory-based stakeholder evaluation or the TSE model for short. Most theory-based approaches are program theory driven and some are stakeholder oriented as well. Practically, all of the latter fuse the program perceptions of the various stakeholder groups into one unitary…

  17. Behavioral change theories can inform the prediction of young adults' adoption of a plant-based diet.

    Science.gov (United States)

    Wyker, Brett A; Davison, Kirsten K

    2010-01-01

    Drawing on the Theory of Planned Behavior (TPB) and the Transtheoretical Model (TTM), this study (1) examines links between stages of change for following a plant-based diet (PBD) and consuming more fruits and vegetables (FV); (2) tests an integrated theoretical model predicting intention to follow a PBD; and (3) identifies associated salient beliefs. Cross-sectional. Large public university in the northeastern United States. 204 college students. TPB and TTM constructs were assessed using validated scales. Outcome, normative, and control beliefs were measured using open-ended questions. The overlap between stages of change for FV consumption and adopting a PBD was assessed using Spearman rank correlation analysis and cross-tab comparisons. The proposed model predicting adoption of a PBD was tested using structural equation modeling (SEM). Salient beliefs were coded using automatic response coding software. No association was found between stages of change for FV consumption and following a PBD. Results from SEM analyses provided support for the proposed model predicting intention to follow a PBD. Gender differences in salient beliefs for following a PBD were found. Results demonstrate the potential for effective theory-driven and stage-tailored public health interventions to promote PBDs. Copyright 2010 Society for Nutrition Education. Published by Elsevier Inc. All rights reserved.

  18. Astrophysical data analysis with information field theory

    International Nuclear Information System (INIS)

    Enßlin, Torsten

    2014-01-01

    Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented

  19. Astrophysical data analysis with information field theory

    Science.gov (United States)

    Enßlin, Torsten

    2014-12-01

    Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.

  20. Astrophysical data analysis with information field theory

    Energy Technology Data Exchange (ETDEWEB)

    Enßlin, Torsten, E-mail: ensslin@mpa-garching.mpg.de [Max Planck Institut für Astrophysik, Karl-Schwarzschild-Straße 1, D-85748 Garching, Germany and Ludwig-Maximilians-Universität München, Geschwister-Scholl-Platz 1, D-80539 München (Germany)

    2014-12-05

    Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.

  1. Comment on Gallistel: behavior theory and information theory: some parallels.

    Science.gov (United States)

    Nevin, John A

    2012-05-01

    In this article, Gallistel proposes information theory as an approach to some enduring problems in the study of operant and classical conditioning. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Informational Closed-Loop Coding-Decoding Control Concept as the Base of the Living or Organized Systems Theory

    Science.gov (United States)

    Kirvelis, Dobilas; Beitas, Kastytis

    2008-10-01

    The aim of this work is to show that the essence of life and living systems is their organization as bioinformational technology on the basis of informational anticipatory control. Principal paradigmatic and structural schemes of the functional organization of life (organisms and their systems) are constructed on the basis of systemic analysis and synthesis of the main phenomenological features of the living world. Life is based on functional elements that implement engineering procedures of closed-loop coding-decoding control (CL-CDC). The phenomenon of natural bioinformational control appeared and developed on the Earth 3-4 billion years ago, when life originated as a result of chemical and later biological evolution. The informatics paradigm considers the physical and chemical transformations of energy and matter in organized systems as flows that are controlled, and the signals as means for purposive informational control programs. Social and technical technological systems as informational control systems are a later phenomenon engineered by man. Information emerges in organized systems as a necessary component of control technology. Generalized schemes of functional organization on the levels of the cell, the organism, and the brain neocortex, as the highest biosystem with CL-CDC, are presented. The CL-CDC concept expands the understanding of bioinformatics.

  3. Information theory and rate distortion theory for communications and compression

    CERN Document Server

    Gibson, Jerry

    2013-01-01

    This book is very specifically targeted to problems in communications and compression by providing the fundamental principles and results in information theory and rate distortion theory for these applications and presenting methods that have proved and will prove useful in analyzing and designing real systems. The chapters contain treatments of entropy, mutual information, lossless source coding, channel capacity, and rate distortion theory; however, it is the selection, ordering, and presentation of the topics within these broad categories that is unique to this concise book. While the cover

  4. Information theory perspective on network robustness

    International Nuclear Information System (INIS)

    Schieber, Tiago A.; Carpi, Laura; Frery, Alejandro C.; Rosso, Osvaldo A.; Pardalos, Panos M.; Ravetti, Martín G.

    2016-01-01

    A crucial challenge in network theory is the study of the robustness of a network when facing a sequence of failures. In this work, we propose a dynamical definition of network robustness based on Information Theory, that considers measurements of the structural changes caused by failures of the network's components. Failures are defined here as a temporal process defined in a sequence. Robustness is then evaluated by measuring dissimilarities between topologies after each time step of the sequence, providing dynamical information about the topological damage. We thoroughly analyze the efficiency of the method in capturing small perturbations by considering different probability distributions on networks. In particular, we find that distributions based on distances are more consistent in capturing network structural deviations, as they better reflect the consequences of the failures. Theoretical examples and real networks are used to study the performance of this methodology. - Highlights: • A novel methodology to measure the robustness of a network to component failure or targeted attacks is proposed. • The use of the network's distance PDF allows a precise analysis. • The method provides a dynamic robustness profile showing the response of the topology to each failure event. • The measure is capable of detecting the network's critical elements.
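    The distance-distribution idea in this abstract can be sketched in a few lines: build the probability distribution of shortest-path lengths before and after a failure, then score the damage by an information-theoretic dissimilarity between the two distributions. The Jensen-Shannon divergence used below, and the toy ring network and single-node failure, are illustrative choices, not necessarily the measures or data of the paper.

    ```python
    import math
    from collections import deque, Counter

    def distance_distribution(adj):
        """Probability distribution of shortest-path lengths over all node
        pairs (BFS from every node; unreachable pairs are ignored)."""
        counts = Counter()
        for src in adj:
            dist = {src: 0}
            queue = deque([src])
            while queue:
                u = queue.popleft()
                for v in adj[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        queue.append(v)
            counts.update(d for node, d in dist.items() if node != src)
        total = sum(counts.values())
        return {d: c / total for d, c in counts.items()}

    def js_divergence(p, q):
        """Jensen-Shannon divergence (bits) between discrete distributions."""
        support = set(p) | set(q)
        m = {k: 0.5 * (p.get(k, 0) + q.get(k, 0)) for k in support}
        def kl(a, b):
            return sum(a[k] * math.log2(a[k] / b[k]) for k in a if a[k] > 0)
        return 0.5 * kl(p, m) + 0.5 * kl(q, m)

    # Toy example: a 5-node ring; the "failure" removes node 0 and its links.
    ring = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
    damaged = {i: [j for j in ring[i] if j != 0] for i in ring if i != 0}
    impact = js_divergence(distance_distribution(ring),
                           distance_distribution(damaged))
    ```

    Repeating this after each failure in a sequence yields exactly the kind of dynamic robustness profile the highlights describe.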

  5. Econophysics: from Game Theory and Information Theory to Quantum Mechanics

    Science.gov (United States)

    Jimenez, Edward; Moya, Douglas

    2005-03-01

    Rationality is the universal invariant among human behavior, the physical laws of the universe, and ordered and complex biological systems. Econophysics is both the use of physical concepts in Finance and Economics, and the use of Information Economics in Physics. In particular, we will show that it is possible to obtain the Quantum Mechanics principles using Information and Game Theory.

  6. Information Theory for Information Science: Antecedents, Philosophy, and Applications

    Science.gov (United States)

    Losee, Robert M.

    2017-01-01

    This paper provides an historical overview of the theoretical antecedents leading to information theory, specifically those useful for understanding and teaching information science and systems. Information may be discussed in a philosophical manner and at the same time be measurable. This notion of information can thus be the subject of…

  7. An information theory account of cognitive control.

    Science.gov (United States)

    Fan, Jin

    2014-01-01

    Our ability to efficiently process information and generate appropriate responses depends on the processes collectively called cognitive control. Despite a considerable focus in the literature on the cognitive control of information processing, neural mechanisms underlying control are still unclear, and have not been characterized by considering the quantity of information to be processed. A novel and comprehensive account of cognitive control is proposed using concepts from information theory, which is concerned with communication system analysis and the quantification of information. This account treats the brain as an information-processing entity where cognitive control and its underlying brain networks play a pivotal role in dealing with conditions of uncertainty. This hypothesis and theory article justifies the validity and properties of such an account and relates experimental findings to the frontoparietal network under the framework of information theory.

  8. An information theory account of cognitive control

    Directory of Open Access Journals (Sweden)

    Jin eFan

    2014-09-01

    Full Text Available Our ability to efficiently process information and generate appropriate responses depends on the processes collectively called cognitive control. Despite a considerable focus in the literature on the cognitive control of information processing, neural mechanisms underlying control are still unclear, and have not been characterized by considering the quantity of information to be processed. A novel and comprehensive account of cognitive control is proposed using concepts from information theory, which is concerned with communication system analysis and the quantification of information. This account treats the brain as an information-processing entity where cognitive control and its underlying brain networks play a pivotal role in dealing with conditions of uncertainty. This hypothesis and theory article justifies the validity and properties of such an account and relates experimental findings to the frontoparietal network under the framework of information theory.

  9. Climate-informed flood frequency analysis based on Bayesian theory and teleconnection for the Three Gorges Dam (TGD)

    Science.gov (United States)

    DONG, Q.; Zhang, X.; Lall, U.; Sang, Y. F.; Xie, P.

    2017-12-01

    With global climate change and intensifying human activities, the uncertainties and dangers of floods have increased significantly. However, current flood frequency analysis is still based on the stationarity assumption. This assumption not only limits the benefits of water conservancy projects, but also brings hazards, because it ignores the risk of flooding under climate change. In this paper, we relax the stationarity hypothesis in the flood frequency analysis model based on teleconnection and use the intrinsic relation of flood elements to improve the annual flood frequency results by Bayesian inference approaches. Daily discharges of the Three Gorges Dam (TGD) in 1953-2013 are used as an example. Firstly, according to the linear correlation between the climate indices and the distribution parameters, the prior distributions of peak and volume are established with the selected large-scale climate predictors. After that, by using the copula function and predictands, the conditional probability function of peak and volume is obtained. Then, Bayesian theory links the prior distributions and conditional distributions to give the posterior distributions. We compare the differences under different prior distributions and find the optimal flood frequency distribution model. Finally, we discuss the impact of dynamic flood frequency analysis on the planning and management of hydraulic engineering. The results show that, compared with the prior probability, the posterior probability considering the correlation of the flood elements is more accurate and its uncertainty is smaller. The dynamic flood frequency model has a great impact on the management of existing hydraulic engineering, improving operational benefit and reducing flood risk, but it scarcely influences the planning of hydraulic engineering. The study is helpful for dynamic flood risk management of the TGD, and provide reference for the
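    The climate-informed Bayesian step described above can be illustrated with a deliberately simplified grid computation. Everything below is a hypothetical stand-in for the paper's copula-based model: the linear link between a climate index and the prior location parameter, the Gaussian prior and likelihood, and all numbers are assumptions for illustration only.

    ```python
    import math

    def gaussian(x, mu, sigma):
        """Normal probability density."""
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    # Hypothetical teleconnection: the location parameter of the annual-peak
    # model is linked linearly to a standardized climate index, giving an
    # informative (nonstationary) prior.
    climate_index = 1.2
    prior_mu = 50.0 + 8.0 * climate_index        # assumed linear link
    grid = [prior_mu - 20 + 0.1 * i for i in range(401)]
    prior = [gaussian(m, prior_mu, 10.0) for m in grid]

    # Toy observed annual flood peaks update the prior via Bayes' rule.
    peaks = [72.0, 68.0, 75.0]
    likelihood = [math.prod(gaussian(x, m, 6.0) for x in peaks) for m in grid]

    post = [pr * lk for pr, lk in zip(prior, likelihood)]
    z = sum(post)
    post = [p / z for p in post]
    post_mean = sum(m * p for m, p in zip(grid, post))
    ```

    With observations well above the climate-informed prior, the posterior mean lands between prior and data, and its spread is narrower than either alone, which is the sense in which the abstract's posterior is "more accurate with smaller uncertainty".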

  10. Rudolf Ahlswede’s lectures on information theory

    CERN Document Server

    Althöfer, Ingo; Deppe, Christian; Tamm, Ulrich

    Volume 1 : The volume “Storing and Transmitting Data” is based on Rudolf Ahlswede's introductory course on "Information Theory I" and presents an introduction to Shannon Theory. Readers, familiar or unfamiliar with the technical intricacies of Information Theory, will benefit considerably from working through the book; especially Chapter VI with its lively comments and uncensored insider views from the world of science and research offers informative and revealing insights. This is the first of several volumes that will serve as a collected research documentation of Rudolf Ahlswede’s lectures on information theory. Each volume includes comments from an invited well-known expert. Holger Boche contributed his insights in the supplement of the present volume. Classical information processing concerns the main tasks of gaining knowledge and of storing, transmitting and hiding data. The first task is the prime goal of Statistics. For the next two, Shannon presented an impressive mathematical theory called Informat...

  11. Correlation Feature Selection and Mutual Information Theory Based Quantitative Research on Meteorological Impact Factors of Module Temperature for Solar Photovoltaic Systems

    Directory of Open Access Journals (Sweden)

    Yujing Sun

    2016-12-01

    Full Text Available The module temperature is the most important parameter influencing the output power of solar photovoltaic (PV) systems, aside from solar irradiance. In this paper, we focus on interdisciplinary research that combines correlation analysis, mutual information (MI) and heat transfer theory, aiming to figure out the correlative relations between different meteorological impact factors (MIFs) and PV module temperature from both qualitative and quantitative aspects. The identification and confirmation of the primary MIFs of PV module temperature are investigated as the first step of this research, from the perspective of the physical meaning and mathematical analysis of the electrical performance and thermal characteristics of PV modules based on the PV effect and heat transfer theory. Furthermore, the quantitative influence of the MIFs on PV module temperature is mathematically formulated as several indexes using correlation-based feature selection (CFS) and MI theory, to explore the specific impact degrees under four different typical weather statuses named general weather classes (GWCs). Case studies for the proposed methods were conducted using actual measurement data from a 500 kW grid-connected solar PV plant in China. The results not only verify the knowledge about the main MIFs of PV module temperature but, more importantly, also provide the specific quantitative impact degrees of these MIFs through CFS- and MI-based measures under the four different GWCs.
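    The mutual-information measure this study applies to meteorological factors can be sketched with a simple histogram estimator. The toy irradiance-temperature relation, the noise levels, and the bin count below are illustrative assumptions, not the plant data or the exact estimator from the paper.

    ```python
    import math
    import random
    from collections import Counter

    def mutual_information(x, y, bins=8):
        """Histogram estimate of MI (bits) between two continuous samples."""
        def digitize(values):
            lo, hi = min(values), max(values)
            width = (hi - lo) / bins or 1.0
            return [min(int((v - lo) / width), bins - 1) for v in values]
        bx, by = digitize(x), digitize(y)
        n = len(x)
        pxy = Counter(zip(bx, by))
        px, py = Counter(bx), Counter(by)
        # MI = sum p(i,j) * log2( p(i,j) / (p(i) p(j)) )
        return sum((c / n) * math.log2(c * n / (px[i] * py[j]))
                   for (i, j), c in pxy.items())

    random.seed(0)
    irradiance = [random.uniform(0, 1000) for _ in range(5000)]
    noise = [random.gauss(0, 2) for _ in range(5000)]
    # Toy thermal model: module temperature rises roughly linearly with irradiance.
    temperature = [25 + 0.03 * g + e for g, e in zip(irradiance, noise)]
    unrelated = [random.uniform(0, 10) for _ in range(5000)]

    mi_strong = mutual_information(irradiance, temperature)
    mi_null = mutual_information(unrelated, temperature)
    ```

    Ranking candidate MIFs by such MI scores (here `mi_strong` far exceeds `mi_null`) is the kind of quantitative impact-degree comparison the abstract describes.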

  12. Generalized information theory: aims, results, and open problems

    International Nuclear Information System (INIS)

    Klir, George J.

    2004-01-01

    The principal purpose of this paper is to present a comprehensive overview of generalized information theory (GIT): a research program whose objective is to develop a broad treatment of uncertainty-based information, not restricted to classical notions of uncertainty. After a brief overview of classical information theories, a broad framework for formalizing uncertainty and the associated uncertainty-based information of a great spectrum of conceivable types is sketched. The various theories of imprecise probabilities that have already been developed within this framework are then surveyed, focusing primarily on some important unifying principles applying to all these theories. This is followed by introducing two higher levels of the theories of imprecise probabilities: (i) the level of measuring the amount of relevant uncertainty (predictive, retrodictive, prescriptive, diagnostic, etc.) in any situation formalizable in each given theory, and (ii) the level of some methodological principles of uncertainty, which are contingent upon the capability to measure uncertainty and the associated uncertainty-based information. Various issues regarding both the measurement of uncertainty and the uncertainty principles are discussed. Again, the focus is on unifying principles applicable to all the theories. Finally, the current status of GIT is assessed and future research in the area is discussed

  13. Implications of Information Theory for Computational Modeling of Schizophrenia.

    Science.gov (United States)

    Silverstein, Steven M; Wibral, Michael; Phillips, William A

    2017-10-01

    Information theory provides a formal framework within which information processing and its disorders can be described. However, information theory has rarely been applied to modeling aspects of the cognitive neuroscience of schizophrenia. The goal of this article is to highlight the benefits of an approach based on information theory, including its recent extensions, for understanding several disrupted neural goal functions as well as related cognitive and symptomatic phenomena in schizophrenia. We begin by demonstrating that foundational concepts from information theory, such as Shannon information, entropy, data compression, block coding, and strategies to increase the signal-to-noise ratio, can be used to provide novel understandings of cognitive impairments in schizophrenia and metrics to evaluate their integrity. We then describe more recent developments in information theory, including the concepts of infomax, coherent infomax, and coding with synergy, to demonstrate how these can be used to develop computational models of schizophrenia-related failures in the tuning of sensory neurons, gain control, perceptual organization, thought organization, selective attention, context processing, predictive coding, and cognitive control. Throughout, we demonstrate how disordered mechanisms may explain both perceptual/cognitive changes and symptom emergence in schizophrenia. Finally, we demonstrate that there is consistency between some information-theoretic concepts and recent discoveries in neurobiology, especially involving the existence of distinct sites for the accumulation of driving input and contextual information prior to their interaction. This convergence can be used to guide future theory, experiment, and treatment development.

  14. An Information Theory-Inspired Strategy for Design of Re-programmable Encrypted Graphene-based Coding Metasurfaces at Terahertz Frequencies.

    Science.gov (United States)

    Momeni, Ali; Rouhi, Kasra; Rajabalipanah, Hamid; Abdolali, Ali

    2018-04-18

    Inspired by information theory, a new concept of re-programmable encrypted graphene-based coding metasurfaces was investigated at terahertz frequencies. A channel-coding function was proposed to convolutionally record an arbitrary information message onto unrecognizable but recoverable parity beams generated by a phase-encrypted coding metasurface. A single graphene-based reflective cell with dual-mode biasing voltages was designed to act as "0" and "1" meta-atoms, providing broadband opposite reflection phases. By exploiting graphene tunability, the proposed scheme enabled an unprecedented degree of freedom in the real-time mapping of information messages onto multiple parity beams which could not be damaged, altered, or reverse-engineered. Various encryption types such as mirroring, anomalous reflection, multi-beam generation, and scattering diffusion can be dynamically attained via our multifunctional metasurface. Moreover, contrary to conventional time-consuming, optimization-based methods, this paper offers a fast, straightforward, and efficient design of diffusion metasurfaces of arbitrarily large size. Rigorous full-wave simulations corroborated the results, where the phase-encrypted metasurfaces exhibited a polarization-insensitive reflectivity less than -10 dB over a broadband frequency range from 1 THz to 1.7 THz. This work reveals new opportunities for the extension of re-programmable THz-coding metasurfaces and may be of interest for reflection-type security systems, computational imaging, and camouflage technology.

  15. Information theory, spectral geometry, and quantum gravity.

    Science.gov (United States)

    Kempf, Achim; Martin, Robert

    2008-01-18

    We show that there exists a deep link between the two disciplines of information theory and spectral geometry. This allows us to obtain new results on a well-known quantum gravity motivated natural ultraviolet cutoff which describes an upper bound on the spatial density of information. Concretely, we show that, together with an infrared cutoff, this natural ultraviolet cutoff beautifully reduces the path integral of quantum field theory on curved space to a finite number of ordinary integrations. We then show, in particular, that the subsequent removal of the infrared cutoff is safe.

  16. Informal Risk Perceptions and Formal Theory

    International Nuclear Information System (INIS)

    Cayford, Jerry

    2001-01-01

    Economists have argued persuasively that our goals are wider than just risk minimization, and that they include a prudent weighing of costs and benefits. This economic line of thought recognizes that our policy goals are complex. As we widen the range of goals we are willing to entertain, though, we need to check that the methods we customarily employ are appropriate for the tasks to which we customarily apply them. This paper examines some economic methods of risk assessment, in light of the question of what our policy goals are and should be. Once the question of goals is open, more complexities than just cost intrude: what the public wants and why begs to be addressed. This leads us to the controversial issue of public risk perceptions. We have now examined a number of procedures that experts use to make public policy decisions. Behind all these issues is always the question of social welfare: what actions can we take, what policies should we embrace, to make the world a better place? In many cases, the public and the experts disagree about what the right choice is. In the first section, we saw a possible defense of the experts based on democratic theory: the people's participation, and even their will, can be legitimately set aside in the pursuit of their true interests. If this defense is to work, a great deal of weight rests on the question of the people's interests and the competence and integrity of the experts' pursuit of it. But at the same time, social preferences are ill-defined, and so are not good candidates for rational actor theory. Both the prescriptive legitimacy claim and the very workings of formal theory we have seen to depend on informal, qualitative, political judgments. Unfortunately, we have also seen a steady pattern of expert reliance on technical procedures even when they were manifestly unsuited to the task. 
The experts seem so intent on excluding informal thought that they would prefer even a bad quantitative process to a qualitative

  17. Informal Risk Perceptions and Formal Theory

    Energy Technology Data Exchange (ETDEWEB)

    Cayford, Jerry [Resources for the Future, Washington, DC (United States)

    2001-07-01

    Economists have argued persuasively that our goals are wider than just risk minimization, and that they include a prudent weighing of costs and benefits. This economic line of thought recognizes that our policy goals are complex. As we widen the range of goals we are willing to entertain, though, we need to check that the methods we customarily employ are appropriate for the tasks to which we customarily apply them. This paper examines some economic methods of risk assessment, in light of the question of what our policy goals are and should be. Once the question of goals is open, more complexities than just cost intrude: what the public wants and why begs to be addressed. This leads us to the controversial issue of public risk perceptions. We have now examined a number of procedures that experts use to make public policy decisions. Behind all these issues is always the question of social welfare: what actions can we take, what policies should we embrace, to make the world a better place? In many cases, the public and the experts disagree about what the right choice is. In the first section, we saw a possible defense of the experts based on democratic theory: the people's participation, and even their will, can be legitimately set aside in the pursuit of their true interests. If this defense is to work, a great deal of weight rests on the question of the people's interests and the competence and integrity of the experts' pursuit of it. But at the same time, social preferences are ill-defined, and so are not good candidates for rational actor theory. Both the prescriptive legitimacy claim and the very workings of formal theory we have seen to depend on informal, qualitative, political judgments. Unfortunately, we have also seen a steady pattern of expert reliance on technical procedures even when they were manifestly unsuited to the task. The experts seem so intent on excluding informal thought that they would prefer even a bad quantitative process to

  18. Battling the challenges of training nurses to use information systems through theory-based training material design

    NARCIS (Netherlands)

    M. Galani (Malatsi); P. Yu (Ping); G.W.C. Paas (Fred); P. Chandler (Paul)

    2014-01-01

    textabstractThe attempts to train nurses to effectively use information systems have had mixed results. One problem is that training materials are not adequately designed to guide trainees to gradually learn to use a system without experiencing a heavy cognitive load. This is because training design

  19. Writing, Proofreading and Editing in Information Theory

    Directory of Open Access Journals (Sweden)

    J. Ricardo Arias-Gonzalez

    2018-05-01

    Full Text Available Information is a physical entity amenable to be described by an abstract theory. The concepts associated with the creation and post-processing of the information have not, however, been mathematically established, despite being broadly used in many fields of knowledge. Here, inspired by how information is managed in biomolecular systems, we introduce writing, entailing any bit string generation, and revision, as comprising proofreading and editing, in information chains. Our formalism expands the thermodynamic analysis of stochastic chains made up of material subunits to abstract strings of symbols. We introduce a non-Markovian treatment of operational rules over the symbols of the chain that parallels the physical interactions responsible for memory effects in material chains. Our theory underlies any communication system, ranging from human languages and computer science to gene evolution.

  20. Realism and Antirealism in Informational Foundations of Quantum Theory

    Directory of Open Access Journals (Sweden)

    Tina Bilban

    2014-08-01

    Full Text Available Zeilinger-Brukner's informational foundations of quantum theory, a theory based on Zeilinger's foundational principle for quantum mechanics that an elementary system carries one bit of information, explains seemingly unintuitive quantum behavior with a simple theoretical framework. It is based on the notion that a distinction between reality and information cannot be made; therefore, they are the same. As the critics of the informational foundations of quantum theory show, this antirealistic move traps the theory in a tautology, where information only refers to itself, while the relationships outside the information, with the help of which the nature of information would be defined, are lost, and the questions "Whose information? Information about what?" cannot be answered. The critics' solution is a return to realism, where the observer's effects on the information are neglected. We show that the radical antirealism of the informational foundations of quantum theory is not necessary and that the return to realism is not the only way forward. A comprehensive approach that exceeds mere realism and antirealism is also possible: we can consider both sources of the constraints on the information, those coming from the observer and those coming from the observed system/nature/reality. The information is always the observer's information about the observed. Such a comprehensive philosophical approach can still support the theoretical framework of the informational foundations of quantum theory: if we take it that one bit is the smallest amount of information in the form of which the observed reality can be grasped by the observer, we can say that an elementary system (grasped and defined as such by the observer) correlates to one bit of information. Our approach thus explains all the features of the quantum behavior explained by the informational foundations of quantum theory: the wave function and its collapse, entanglement, complementarity and quantum randomness. 
However, it does

  1. Planting contemporary practice theory in the garden of information science

    NARCIS (Netherlands)

    Huizing, A.; Cavanagh, M.

    2011-01-01

    Introduction. The purpose of this paper is to introduce to information science in a coherent fashion the core premises of contemporary practice theory, and thus to engage the information research community in further debate and discussion. Method. Contemporary practice-based approaches are

  2. Information theory and the ethylene genetic network.

    Science.gov (United States)

    González-García, José S; Díaz, José

    2011-10-01

    The original aim of Information Theory (IT) was to solve a purely technical problem: to increase the performance of communication systems, which are constantly affected by interferences that diminish the quality of the transmitted information. That is, the theory deals only with the problem of transmitting with maximal precision the symbols constituting a message. In Shannon's theory messages are characterized only by their probabilities, regardless of their value or meaning. As for its present-day status, it is generally acknowledged that Information Theory has solid mathematical foundations and fruitful, strong links with Physics in both theoretical and experimental areas. However, many applications of Information Theory to Biology are limited to using it as a technical tool to analyze biopolymers, such as DNA, RNA or protein sequences. The main point of discussion about the applicability of IT to explain the information flow in biological systems is that in a classic communication channel the symbols that make up the coded message are transmitted one by one, independently, through a noisy communication channel, and noise can alter each of the symbols, distorting the message; in contrast, in a genetic communication channel the coded messages are not transmitted in the form of symbols but by signaling cascades. Consequently, the information flow from the emitter to the effector is due to a series of coupled physicochemical processes that must ensure the accurate transmission of the message. In this review we discuss a novel proposal to overcome this difficulty, which consists of modeling gene expression with a stochastic approach that allows Shannon entropy (H) to be directly used to measure the amount of uncertainty that the genetic machinery has in relation to the correct decoding of a message transmitted into the nucleus by a signaling pathway. From the value of H we can define a function I that measures the amount of
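The Shannon entropy H invoked in this abstract is straightforward to compute once a probability distribution over decoding outcomes is given. A minimal sketch (the two example distributions are invented for illustration, not taken from the ethylene network model):

```python
import numpy as np

def shannon_entropy(p):
    """H(p) = -sum_i p_i log2 p_i, in bits; zero-probability outcomes contribute 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                         # drop zeros before taking the log
    return float(-(p * np.log2(p)).sum())

# Hypothetical two-state decoding distributions (made-up numbers):
print(shannon_entropy([0.5, 0.5]))      # maximal uncertainty for 2 states -> 1.0 bit
print(shannon_entropy([0.99, 0.01]))    # message almost surely decoded correctly (low H)
```

Low H means the genetic machinery has little residual uncertainty about which message was transmitted; high H means decoding is nearly a coin flip.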

  3. Exploratory Study Based on Stakeholder Theory in the Development of Accounting Information Systems in the Catholic Church: A Case Study in the Archdiocese of Semarang, Indonesia

    Directory of Open Access Journals (Sweden)

    Siswanto Fransiscus Asisi Joko

    2017-01-01

    Full Text Available This study aims to find a strategy for the development of a computer-based accounting information system in the church. Through an exploratory study based on stakeholder theory, it identifies the financial information needed for decision-making by the parish priests, the parish treasurers, and the team of economists at the archdiocese of Semarang (AS). This research was conducted using both qualitative and quantitative approaches. The qualitative method consisted of a focus group discussion with the economist team in AS (the users who have major influence on the development of the system). In addition, a quantitative method was applied to the parish treasurers (the users who have a great interest in the system development). The results showed that the parish treasurers have high perceived usefulness, perceived ease of use, perceived relevance, and self-efficacy toward the accounting information system (AIS) for the parish. This study provides evidence of the benefits of a bottom-up strategy based on stakeholder analysis in the development of AIS in the Catholic Church of AS.

  4. Structural information theory and visual form

    NARCIS (Netherlands)

    Leeuwenberg, E.L.J.; Kaernbach, C.; Schroeger, E.; Mueller, H.

    2003-01-01

    The paper attends to basic characteristics of visual form as approached by Structural information theory, or SIT, (Leeuwenberg, Van der Helm and Van Lier). The introduction provides a global survey of this approach. The main part of the paper focuses on three characteristics of SIT. Each one is made

  5. A THEORY OF MAXIMIZING SENSORY INFORMATION

    NARCIS (Netherlands)

    Hateren, J.H. van

    1992-01-01

    A theory is developed on the assumption that early sensory processing aims at maximizing the information rate in the channels connecting the sensory system to more central parts of the brain, where it is assumed that these channels are noisy and have a limited dynamic range. Given a stimulus power

  6. Quantum theory informational foundations and foils

    CERN Document Server

    Spekkens, Robert

    2016-01-01

    This book provides the first unified overview of the burgeoning research area at the interface between Quantum Foundations and Quantum Information.  Topics include: operational alternatives to quantum theory, information-theoretic reconstructions of the quantum formalism, mathematical frameworks for operational theories, and device-independent features of the set of quantum correlations. Powered by the injection of fresh ideas from the field of Quantum Information and Computation, the foundations of Quantum Mechanics are in the midst of a renaissance. The last two decades have seen an explosion of new results and research directions, attracting broad interest in the scientific community. The variety and number of different approaches, however, makes it challenging for a newcomer to obtain a big picture of the field and of its high-level goals. Here, fourteen original contributions from leading experts in the field cover some of the most promising research directions that have emerged in the new wave of quant...

  7. Towards an Information Theory of Complex Networks

    CERN Document Server

    Dehmer, Matthias; Mehler, Alexander

    2011-01-01

    For over a decade, complex networks have steadily grown as an important tool across a broad array of academic disciplines, with applications ranging from physics to social media. A tightly organized collection of carefully-selected papers on the subject, Towards an Information Theory of Complex Networks: Statistical Methods and Applications presents theoretical and practical results about information-theoretic and statistical models of complex networks in the natural sciences and humanities. The book's major goal is to advocate and promote a combination of graph-theoretic, information-theoreti

  8. An information integration theory of consciousness

    Directory of Open Access Journals (Sweden)

    Tononi Giulio

    2004-11-01

    Full Text Available Abstract Background Consciousness poses two main problems. The first is understanding the conditions that determine to what extent a system has conscious experience. For instance, why is our consciousness generated by certain parts of our brain, such as the thalamocortical system, and not by other parts, such as the cerebellum? And why are we conscious during wakefulness and much less so during dreamless sleep? The second problem is understanding the conditions that determine what kind of consciousness a system has. For example, why do specific parts of the brain contribute specific qualities to our conscious experience, such as vision and audition? Presentation of the hypothesis This paper presents a theory about what consciousness is and how it can be measured. According to the theory, consciousness corresponds to the capacity of a system to integrate information. This claim is motivated by two key phenomenological properties of consciousness: differentiation – the availability of a very large number of conscious experiences; and integration – the unity of each such experience. The theory states that the quantity of consciousness available to a system can be measured as the Φ value of a complex of elements. Φ is the amount of causally effective information that can be integrated across the informational weakest link of a subset of elements. A complex is a subset of elements with Φ>0 that is not part of a subset of higher Φ. The theory also claims that the quality of consciousness is determined by the informational relationships among the elements of a complex, which are specified by the values of effective information among them. Finally, each particular conscious experience is specified by the value, at any given time, of the variables mediating informational interactions among the elements of a complex. Testing the hypothesis The information integration theory accounts, in a principled manner, for several neurobiological observations

  9. Algorithmic information theory mathematics of digital information processing

    CERN Document Server

    Seibt, Peter

    2007-01-01

    Treats the Mathematics of many important areas in digital information processing. This book covers, in a unified presentation, five topics: Data Compression, Cryptography, Sampling (Signal Theory), Error Control Codes, Data Reduction. It is useful for teachers, students and practitioners in Electronic Engineering, Computer Science and Mathematics.

  10. The g-theorem and quantum information theory

    Energy Technology Data Exchange (ETDEWEB)

    Casini, Horacio; Landea, Ignacio Salazar; Torroba, Gonzalo [Centro Atómico Bariloche and CONICET,S.C. de Bariloche, Río Negro, R8402AGP (Argentina)

    2016-10-25

    We study boundary renormalization group flows between boundary conformal field theories in 1+1 dimensions using methods of quantum information theory. We define an entropic g-function for theories with impurities in terms of the relative entanglement entropy, and we prove that this g-function decreases along boundary renormalization group flows. This entropic g-theorem is valid at zero temperature, and is independent from the g-theorem based on the thermal partition function. We also discuss the mutual information in boundary RG flows, and how it encodes the correlations between the impurity and bulk degrees of freedom. Our results provide a quantum-information understanding of (boundary) RG flow as increase of distinguishability between the UV fixed point and the theory along the RG flow.

  11. How to Produce a Transdisciplinary Information Concept for a Universal Theory of Information?

    DEFF Research Database (Denmark)

    Brier, Søren

    2017-01-01

    …the concept of information as a difference that makes a difference, and in Luhmann's triple autopoietic communication-based system theory, where information is always a part of a message. Charles Sanders Peirce's pragmaticist semiotics differs from other paradigms in that it integrates logic and information in interpretative semiotics. I therefore suggest alternatively building information theories based on semiotics, from the basic relations of embodied living systems' meaningful cognition and communication. I agree with Peircean biosemiotics that all transdisciplinary information concepts, in order to work across the natural, technical, social and humanistic sciences, must be defined as a part of real relational meaningful sign-processes manifesting as tokens. Thus Peirce's information theory is empirically based in a realistic worldview, which through modern biosemiotics includes all living systems.

  12. Limited information estimation of the diffusion-based item response theory model for responses and response times.

    Science.gov (United States)

    Ranger, Jochen; Kuhn, Jörg-Tobias; Szardenings, Carsten

    2016-05-01

    Psychological tests are usually analysed with item response models. Recently, some alternative measurement models have been proposed that were derived from cognitive process models developed in experimental psychology. These models consider the responses but also the response times of the test takers. Two such models are the Q-diffusion model and the D-diffusion model. Both models can be calibrated with the diffIRT package of the R statistical environment via marginal maximum likelihood (MML) estimation. In this manuscript, an alternative approach to model calibration is proposed. The approach is based on weighted least squares estimation and parallels the standard estimation approach in structural equation modelling. Estimates are determined by minimizing the discrepancy between the observed and the implied covariance matrix. The estimator is simple to implement, consistent, and asymptotically normally distributed. Least squares estimation also provides a test of model fit by comparing the observed and implied covariance matrix. The estimator and the test of model fit are evaluated in a simulation study. Although parameter recovery is good, the estimator is less efficient than the MML estimator. © 2016 The British Psychological Society.
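The estimation idea described in this abstract, choosing parameters to minimize the discrepancy between the observed and the model-implied covariance matrix, can be sketched generically. The toy one-factor model below is an assumption made purely for illustration; it is not the Q- or D-diffusion model from the diffIRT package:

```python
import numpy as np
from scipy.optimize import minimize

# Stand-in model: Sigma(theta) = lambda lambda' + diag(psi^2).
# Illustrates discrepancy minimisation only, not the diffusion IRT model itself.
def implied_cov(theta, p):
    lam, psi = theta[:p], theta[p:]
    return np.outer(lam, lam) + np.diag(psi ** 2)   # psi**2 keeps variances positive

def uls_discrepancy(theta, S, p):
    R = S - implied_cov(theta, p)                   # residual covariance matrix
    return np.sum(R ** 2)                           # unweighted least squares criterion

p = 4
rng = np.random.default_rng(0)
true_lam = np.array([0.8, 0.7, 0.6, 0.5])
X = rng.normal(size=(2000, 1)) @ true_lam[None, :] + rng.normal(0, 0.5, (2000, p))
S = np.cov(X, rowvar=False)                         # observed covariance matrix

theta0 = np.concatenate([np.full(p, 0.5), np.full(p, 0.5)])
fit = minimize(uls_discrepancy, theta0, args=(S, p), method="L-BFGS-B")
print(np.abs(fit.x[:p]))                            # loading estimates, close to true_lam
```

A weighted variant replaces the plain sum of squares with a weight matrix on the residual covariances; the fitted-versus-observed discrepancy at the minimum also serves as a test of model fit, as the abstract notes.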

  13. A mathematical method for verifying the validity of measured information about the flows of energy resources based on the state estimation theory

    Science.gov (United States)

    Pazderin, A. V.; Sof'in, V. V.; Samoylenko, V. O.

    2015-11-01

    Efforts aimed at improving energy efficiency in all branches of the fuel and energy complex should begin with setting up a high-tech automated system for monitoring and accounting energy resources. Malfunctions and failures in the measurement and information parts of this system may distort commercial measurements of energy resources and lead to financial risks for power supplying organizations. In addition, measurement errors may be connected with intentional distortion of measurements aimed at reducing payment for energy resources on the consumer's side, which leads to commercial losses of energy resources. The article presents a universal mathematical method for verifying the validity of measurement information in networks for transporting energy resources, such as electricity, heat, petroleum, and gas, based on state estimation theory. The energy resource transportation network is represented by a graph whose nodes correspond to producers and consumers and whose branches stand for transportation mains (power lines, pipelines, and heat network elements). The main idea of state estimation is to obtain calculated analogs of the energy resources for all available measurements. Unlike "raw" measurements, which contain inaccuracies, the calculated flows of energy resources, called estimates, fully satisfy all the state equations describing the energy resource transportation network. The state equations written in terms of the calculated estimates are thus free from residuals. The difference between a measurement and its calculated analog (estimate) is called, in estimation theory, an estimation residual. Large values of the estimation residuals indicate large errors in particular energy resource measurements. By using the presented method it is possible to improve the validity of energy resource measurements, to estimate the transportation network observability, to eliminate
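The state-estimation logic this abstract describes, computing weighted least squares estimates that satisfy the network's balance equations and then inspecting the residuals, can be illustrated on a deliberately tiny, hypothetical junction (all flows, meter variances, and errors below are invented):

```python
import numpy as np

# Hypothetical network: flow f1 enters a junction and splits into f2 and f3,
# so conservation gives f3 = f1 - f2. State vector x = (f1, f2); all three
# flows are metered, giving redundant measurements z = H x + noise.
H = np.array([[1.0, 0.0],    # meter on f1
              [0.0, 1.0],    # meter on f2
              [1.0, -1.0]])  # meter on f3 = f1 - f2
W = np.diag([1 / 0.5**2] * 3)                 # weights = 1 / measurement variance

true_x = np.array([100.0, 60.0])              # true flows (made-up units)
z = H @ true_x + np.array([0.3, -0.2, 8.0])   # third meter has a gross error

x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)   # WLS state estimate
residuals = z - H @ x_hat                           # estimation residuals
print(x_hat)
print(residuals)   # residuals far exceed the 0.5 meter noise, flagging bad data
```

With only one redundant measurement the residuals flag that *some* measurement is bad; localizing *which* one requires more redundancy and normalized residual tests, which is exactly why the abstract ties validity checking to network observability.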

  14. Automated image segmentation using information theory

    International Nuclear Information System (INIS)

    Hibbard, L.S.

    2001-01-01

    Full text: Our development of automated contouring of CT images for RT planning is based on maximum a posteriori (MAP) analyses of region textures, edges, and prior shapes, and assumes stationary Gaussian distributions for voxel textures and contour shapes. Since models may not accurately represent image data, it would be advantageous to compute inferences without relying on models. The relative entropy (RE) from information theory can generate inferences based solely on the similarity of probability distributions. The entropy of a distribution of a random variable X is defined as -Σ_x p(x) log2 p(x), over all the values x which X may assume. The RE (Kullback-Leibler divergence) of two distributions p(X), q(X), over X is Σ_x p(x) log2 [p(x)/q(x)]. The RE is a kind of 'distance' between p and q, equaling zero when p = q and increasing as p and q become more different. Minimum-error MAP and likelihood ratio decision rules have RE equivalents: minimum error decisions obtain with functions of the differences between REs of compared distributions. One applied result is that the contour ideally separating two regions is the one that maximizes the relative entropy of the two regions' intensities. A program was developed that automatically contours the outlines of patients in stereotactic headframes, a situation most often requiring manual drawing. The relative entropy of intensities inside the contour (patient) versus outside (background) was maximized by conjugate gradient descent over the space of parameters of a deformable contour. The figure shows the computed segmentation of a patient from headframe backgrounds. This program is particularly useful for preparing images for multimodal image fusion. Relative entropy and allied measures of distribution similarity provide automated contouring criteria that do not depend on statistical models of image data. This approach should have wide utility in medical image segmentation applications. Copyright (2001) Australasian College of Physical Scientists and
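The relative entropy used here is the standard Kullback-Leibler divergence. A minimal numeric sketch (the two four-bin "intensity histograms" are invented stand-ins, not the paper's CT data):

```python
import numpy as np

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p||q) = sum_x p(x) log2(p(x)/q(x)), in bits."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    nz = p > 0                       # terms with p(x) = 0 contribute nothing
    return float((p[nz] * np.log2(p[nz] / q[nz])).sum())

# Hypothetical intensity histograms inside/outside a trial contour (made-up):
inside = np.array([0.1, 0.2, 0.4, 0.3])
outside = np.array([0.4, 0.3, 0.2, 0.1])
print(relative_entropy(inside, inside))    # identical distributions -> 0.0
print(relative_entropy(inside, outside))   # positive, grows as the regions differ
```

A contour optimizer, as described in the abstract, would adjust the contour parameters to push this divergence between the inside and outside histograms as high as possible.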

  15. The informationally-complete quantum theory

    OpenAIRE

    Chen, Zeng-Bing

    2014-01-01

    Quantum mechanics is a cornerstone of our current understanding of nature and extremely successful in describing physics covering a huge range of scales. However, its interpretation remains controversial since the early days of quantum mechanics. What does a quantum state really mean? Is there any way out of the so-called quantum measurement problem? Here we present an informationally-complete quantum theory (ICQT) and the trinary property of nature to beat the above problems. We assume that ...

  16. Information theory of open fragmenting systems

    International Nuclear Information System (INIS)

    Gulminelli, F.; Juillet, O.; Chomaz, Ph.; Ison, M. J.; Dorso, C. O.

    2007-01-01

    An information theory description of finite systems explicitly evolving in time is presented. We impose a MaxEnt variational principle on the Shannon entropy at a given time while the constraints are set at a former time. The resulting density matrix contains explicit time-odd components in the form of collective flows. As a specific application we consider the dynamics of the expansion in connection with heavy ion experiments. Lattice gas and classical molecular dynamics simulations are shown

  17. Final Summary: Genre Theory in Information Studies

    DEFF Research Database (Denmark)

    Andersen, Jack

    2015-01-01

    Purpose This chapter offers a re-description of knowledge organization in light of genre and activity theory. Knowledge organization needs a new description in order to account for those activities and practices constituting and causing concrete knowledge organization activity. Genre and activity...... informing and shaping concrete forms of knowledge organization activity. With this, we are able to understand how knowledge organization activity also contributes to construct genre and activity systems and not only aid them....

  18. A force-matching Stillinger-Weber potential for MoS2: Parameterization and Fisher information theory based sensitivity analysis

    Science.gov (United States)

    Wen, Mingjian; Shirodkar, Sharmila N.; Plecháč, Petr; Kaxiras, Efthimios; Elliott, Ryan S.; Tadmor, Ellad B.

    2017-12-01

    Two-dimensional molybdenum disulfide (MoS2) is a promising material for the next generation of switchable transistors and photodetectors. In order to perform large-scale molecular simulations of the mechanical and thermal behavior of MoS2-based devices, an accurate interatomic potential is required. To this end, we have developed a Stillinger-Weber potential for monolayer MoS2. The potential parameters are optimized to reproduce the geometry (bond lengths and bond angles) of MoS2 in its equilibrium state and to match as closely as possible the forces acting on the atoms along a dynamical trajectory obtained from ab initio molecular dynamics. Verification calculations indicate that the new potential accurately predicts important material properties including the strain dependence of the cohesive energy, the elastic constants, and the linear thermal expansion coefficient. The uncertainty in the potential parameters is determined using a Fisher information theory analysis. It is found that the parameters are fully identified, and none are redundant. In addition, the Fisher information matrix provides uncertainty bounds for predictions of the potential for new properties. As an example, bounds on the average vibrational thickness of a MoS2 monolayer at finite temperature are computed and found to be consistent with the results from a molecular dynamics simulation. The new potential is available through the OpenKIM interatomic potential repository at https://openkim.org/cite/MO_201919462778_000.
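    The Fisher information analysis described above can be sketched for a toy model. The snippet below is an illustration, not the authors' MoS2 code: it builds the Fisher information matrix F_ij = (1/σ²) Σ_k (∂F_k/∂θ_i)(∂F_k/∂θ_j) for a hypothetical two-parameter force model with Gaussian noise, checks identifiability (a nonsingular FIM), and reads off Cramér-Rao variance bounds:

    ```python
    import math

    def model(x, a, b):
        # Hypothetical two-parameter "force" model standing in for the potential's predictions
        return a * x + b * x ** 3

    def fisher_information(xs, a, b, sigma=0.1, h=1e-6):
        """F_ij = (1/sigma^2) * sum_k dF_k/dtheta_i * dF_k/dtheta_j, via finite differences."""
        F = [[0.0, 0.0], [0.0, 0.0]]
        for x in xs:
            grad = [(model(x, a + h, b) - model(x, a - h, b)) / (2 * h),
                    (model(x, a, b + h) - model(x, a, b - h)) / (2 * h)]
            for i in range(2):
                for j in range(2):
                    F[i][j] += grad[i] * grad[j] / sigma ** 2
        return F

    xs = [0.5, 1.0, 1.5, 2.0]                    # sample configurations
    F = fisher_information(xs, a=1.0, b=0.2)
    det = F[0][0] * F[1][1] - F[0][1] * F[1][0]
    print(det > 0)  # True: the FIM is nonsingular, so both parameters are identifiable

    # Cramer-Rao lower bounds on the parameter variances: diagonal of F^{-1}
    var_a, var_b = F[1][1] / det, F[0][0] / det
    print(var_a > 0 and var_b > 0)  # True
    ```

    A singular (or nearly singular) FIM would instead signal redundant parameters, the situation the abstract reports is absent for the new potential.
    
    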

  19. Modeling Routinization in Games: An Information Theory Approach

    DEFF Research Database (Denmark)

    Wallner, Simon; Pichlmair, Martin; Hecher, Michael

    2015-01-01

    Routinization is the result of practicing until an action stops being a goal-directed process. This paper formulates a definition of routinization in games based on prior research in the fields of activity theory and practice theory. Routinization is analyzed using the formal model of discrete-time, discrete-space Markov chains and information theory to measure the actual error between the dynamically trained models and the player interaction. Preliminary research supports the hypothesis that Markov chains can be effectively used to model routinization in games. A full study design is presented...
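    The modeling approach described here can be sketched as follows: train a first-order Markov chain on a sequence of player actions, then score how surprising the observed transitions are under the trained model. The surprisal-based error measure is an illustrative choice, not necessarily the paper's exact metric; low average surprisal indicates routinized, predictable play.

    ```python
    from collections import defaultdict
    import math

    def train_markov(actions):
        """Estimate first-order transition probabilities P(next | current) from a play trace."""
        counts = defaultdict(lambda: defaultdict(int))
        for cur, nxt in zip(actions, actions[1:]):
            counts[cur][nxt] += 1
        return {cur: {nxt: k / sum(nxts.values()) for nxt, k in nxts.items()}
                for cur, nxts in counts.items()}

    def mean_surprisal(actions, P, eps=1e-9):
        """Average -log2 P(observed transition), in bits; low values mean routinized play."""
        bits = [-math.log2(P.get(cur, {}).get(nxt, eps))
                for cur, nxt in zip(actions, actions[1:])]
        return sum(bits) / len(bits)

    routine = list("ABAB" * 10)               # a fully predictable action loop
    P = train_markov(routine)
    print(mean_surprisal(routine, P) == 0.0)  # True: the model predicts every step
    ```

    Less repetitive traces yield strictly positive surprisal, so the measure separates routinized from exploratory interaction.
    
    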

  20. Information Foraging Theory: A Framework for Intelligence Analysis

    Science.gov (United States)

    2014-11-01

    oceanographic information, human intelligence (HUMINT), open-source intelligence (OSINT), and information provided by other governmental departments [1][5... Human Intelligence; IFT Information Foraging Theory; LSA Latent Semantic Similarity; MVT Marginal Value Theorem; OFT Optimal Foraging Theory; OSINT

  1. Workshop on The Functional Analysis of Quantum Information Theory : a Collection of Notes Based on Lectures by Gilles Pisier, K. R. Parthasarathy, Vern Paulsen and Andreas Winter

    CERN Document Server

    Gupta, Ved Prakash; Sunder, V S

    2015-01-01

    This book provides readers with a concise introduction to current studies on operator-algebras and their generalizations, operator spaces and operator systems, with a special focus on their application in quantum information science. This basic framework for the mathematical formulation of quantum information can be traced back to the mathematical work of John von Neumann, one of the pioneers of operator algebras, which forms the underpinning of most current mathematical treatments of the quantum theory, besides being one of the most dynamic areas of twentieth century functional analysis. Today, von Neumann’s foresight finds expression in the rapidly growing field of quantum information theory. These notes gather the content of lectures given by a very distinguished group of mathematicians and quantum information theorists, held at the IMSc in Chennai some years ago, and great care has been taken to present the material as a primer on the subject matter. Starting from the basic definitions of operator space...

  2. An information theory framework for dynamic functional domain connectivity.

    Science.gov (United States)

    Vergara, Victor M; Miller, Robyn; Calhoun, Vince

    2017-06-01

    Dynamic functional network connectivity (dFNC) analyzes the time evolution of coherent activity in the brain. In this technique dynamic changes are considered for the whole brain. This paper proposes an information theory framework to measure information flowing among subsets of functional networks called functional domains. Our method aims at estimating the bits of information contained and shared among domains. The succession of dynamic functional states is estimated at the domain level. Information quantity is based on the probabilities of observing each dynamic state. Mutual information measurement is then obtained from probabilities across domains. Thus, we named this value the cross domain mutual information (CDMI). Strong CDMIs were observed in relation to the subcortical domain. Domains related to sensorial input, motor control and the cerebellum form another CDMI cluster. Information flow among other domains was seldom found. Other methods of dynamic connectivity focus on whole brain dFNC matrices. In the current framework, information theory is applied to states estimated from pairs of multi-network functional domains. In this context, we apply information theory to measure information flow across functional domains. Identified CDMI clusters point to known information pathways in the basal ganglia and also among areas of sensorial input, patterns found in static functional connectivity. In contrast, CDMI across brain areas of higher level cognitive processing follows a different pattern that indicates scarce information sharing. These findings show that employing information theory to formally measure information flow through brain domains reveals additional features of functional connectivity. Copyright © 2017 Elsevier B.V. All rights reserved.
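    In a minimal sketch, the cross domain mutual information (CDMI) idea reduces to estimating mutual information from the empirical joint distribution of two domains' state sequences. The example below uses hypothetical binary state labels, not the paper's fMRI-derived states:

    ```python
    from collections import Counter
    import math

    def mutual_information(xs, ys):
        """I(X;Y) = sum_{x,y} p(x,y) log2[ p(x,y) / (p(x) p(y)) ], in bits,
        estimated from two paired state sequences."""
        n = len(xs)
        pxy = Counter(zip(xs, ys))
        px, py = Counter(xs), Counter(ys)
        return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
                   for (x, y), c in pxy.items())

    # Hypothetical dynamic-state labels for functional domains over eight time windows
    subcortical = [0, 0, 1, 1, 0, 0, 1, 1]
    motor       = [0, 0, 1, 1, 0, 0, 1, 1]   # perfectly coupled to subcortical
    independent = [0, 1, 0, 1, 0, 1, 0, 1]   # uncoupled state sequence

    print(mutual_information(subcortical, motor))        # 1.0 bit shared
    print(mutual_information(subcortical, independent))  # 0.0 bits shared
    ```

    Strong CDMI clusters, like the subcortical one reported above, would correspond to pairs with high estimated mutual information.
    
    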

  3. Towards evidence-based palliative care in nursing homes in Sweden: a qualitative study informed by the organizational readiness to change theory.

    Science.gov (United States)

    Nilsen, Per; Wallerstedt, Birgitta; Behm, Lina; Ahlström, Gerd

    2018-01-04

    Sweden has a policy of supporting older people to live a normal life at home for as long as possible. Therefore, it is often the oldest, most frail people who move into nursing homes. Nursing home staff are expected to meet the existential needs of the residents, yet conversations about death and dying tend to cause emotional strain. This study explores organizational readiness to implement palliative care based on evidence-based guidelines in nursing homes in Sweden. The aim was to identify barriers and facilitators to implementing evidence-based palliative care in nursing homes. Interviews were carried out with 20 managers from 20 nursing homes in two municipalities who had participated along with staff members in seminars aimed at conveying knowledge and skills of relevance for providing evidence-based palliative care. Two managers responsible for all elderly care in each municipality were also interviewed. The questions were informed by the theory of Organizational Readiness for Change (ORC). ORC was also used as a framework to analyze the data by means of categorizing barriers and facilitators for implementing evidence-based palliative care. Analysis of the data yielded ten factors (i.e., sub-categories) acting as facilitators and/or barriers. Four factors constituted barriers: the staff's beliefs in their capabilities to face dying residents, their attitudes to changes at work as well as the resources and time required. Five factors functioned as either facilitators or barriers because there was considerable variation with regard to the staff's competence and confidence, motivation, and attitudes to work in general, as well as the managers' plans and decisional latitude concerning efforts to develop evidence-based palliative care. Leadership was a facilitator to implementing evidence-based palliative care. 
There is a limited organizational readiness to develop evidence-based palliative care as a result of variation in the nursing home staff's change efficacy

  4. Nonequilibrium thermodynamics and information theory: basic concepts and relaxing dynamics

    International Nuclear Information System (INIS)

    Altaner, Bernhard

    2017-01-01

    Thermodynamics is based on the notions of energy and entropy. While energy is the elementary quantity governing physical dynamics, entropy is the fundamental concept in information theory. In this work, starting from first principles, we give a detailed didactic account on the relations between energy and entropy and thus physics and information theory. We show that thermodynamic process inequalities, like the second law, are equivalent to the requirement that an effective description for physical dynamics is strongly relaxing. From the perspective of information theory, strongly relaxing dynamics govern the irreversible convergence of a statistical ensemble towards the maximally non-committal probability distribution that is compatible with thermodynamic equilibrium parameters. In particular, Markov processes that converge to a thermodynamic equilibrium state are strongly relaxing. Our framework generalizes previous results to arbitrary open and driven systems, yielding novel thermodynamic bounds for idealized and real processes. (paper)
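    The point that Markov processes converging to equilibrium are strongly relaxing can be illustrated with a two-state toy chain (an illustration, not the paper's general framework): the relative entropy between the evolving ensemble and the equilibrium distribution decreases monotonically.

    ```python
    import math

    def kl_bits(p, q):
        """Relative entropy D(p||q) in bits."""
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    # Two-state Markov chain whose stationary (equilibrium) distribution is pi = (0.75, 0.25)
    T = [[0.9, 0.1],
         [0.3, 0.7]]
    pi = [0.75, 0.25]

    p = [1.0, 0.0]                     # start the ensemble far from equilibrium
    divs = []
    for _ in range(20):
        divs.append(kl_bits(p, pi))
        p = [sum(p[i] * T[i][j] for i in range(2)) for j in range(2)]

    # Relative entropy to equilibrium never increases along the relaxation
    print(all(a >= b for a, b in zip(divs, divs[1:])))  # True
    ```

    The monotone decay of D(p_t || π) under a Markov chain with stationary distribution π is a standard information-theoretic result, and it is exactly the "strongly relaxing" behavior the abstract invokes.
    
    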

  5. Quantum information theory. Mathematical foundation. 2. ed.

    International Nuclear Information System (INIS)

    Hayashi, Masahito

    2017-01-01

    This graduate textbook provides a unified view of quantum information theory. Clearly explaining the necessary mathematical basis, it merges key topics from both information-theoretic and quantum-mechanical viewpoints and provides lucid explanations of the basic results. Thanks to this unified approach, it makes accessible such advanced topics in quantum communication as quantum teleportation, superdense coding, quantum state transmission (quantum error-correction) and quantum encryption. Since the publication of the preceding book Quantum Information: An Introduction, there have been tremendous strides in the field of quantum information. In particular, the following topics - all of which are addressed here - have seen major advances: quantum state discrimination, quantum channel capacity, bipartite and multipartite entanglement, security analysis on quantum communication, the reverse Shannon theorem and the uncertainty relation. With regard to the analysis of quantum security, the present book employs an improved method for the evaluation of leaked information and identifies a remarkable relation between quantum security and quantum coherence. Taken together, these two improvements allow a better analysis of quantum state transmission. In addition, various types of the newly discovered uncertainty relation are explained. Presenting a wealth of new developments, the book introduces readers to the latest advances and challenges in quantum information. To aid in understanding, each chapter is accompanied by a set of exercises and solutions.

  6. Quantum information theory. Mathematical foundation. 2. ed.

    Energy Technology Data Exchange (ETDEWEB)

    Hayashi, Masahito [Nagoya Univ. (Japan). Graduate School of Mathematics

    2017-07-01

    This graduate textbook provides a unified view of quantum information theory. Clearly explaining the necessary mathematical basis, it merges key topics from both information-theoretic and quantum-mechanical viewpoints and provides lucid explanations of the basic results. Thanks to this unified approach, it makes accessible such advanced topics in quantum communication as quantum teleportation, superdense coding, quantum state transmission (quantum error-correction) and quantum encryption. Since the publication of the preceding book Quantum Information: An Introduction, there have been tremendous strides in the field of quantum information. In particular, the following topics - all of which are addressed here - have seen major advances: quantum state discrimination, quantum channel capacity, bipartite and multipartite entanglement, security analysis on quantum communication, the reverse Shannon theorem and the uncertainty relation. With regard to the analysis of quantum security, the present book employs an improved method for the evaluation of leaked information and identifies a remarkable relation between quantum security and quantum coherence. Taken together, these two improvements allow a better analysis of quantum state transmission. In addition, various types of the newly discovered uncertainty relation are explained. Presenting a wealth of new developments, the book introduces readers to the latest advances and challenges in quantum information. To aid in understanding, each chapter is accompanied by a set of exercises and solutions.

  7. Fundamentals of information theory and coding design

    CERN Document Server

    Togneri, Roberto

    2003-01-01

    In a clear, concise, and modular format, this book introduces the fundamental concepts and mathematics of information and coding theory. The authors emphasize how a code is designed and discuss the main properties and characteristics of different coding algorithms along with strategies for selecting the appropriate codes to meet specific requirements. They provide comprehensive coverage of source and channel coding, address arithmetic, BCH, and Reed-Solomon codes and explore some more advanced topics such as PPM compression and turbo codes. Worked examples and sets of basic and advanced exercises in each chapter reinforce the text's clear explanations of all concepts and methodologies.

  8. Quantum information theory with Gaussian systems

    Energy Technology Data Exchange (ETDEWEB)

    Krueger, O.

    2006-04-06

    This thesis applies ideas and concepts from quantum information theory to systems of continuous-variables such as the quantum harmonic oscillator. The focus is on three topics: the cloning of coherent states, Gaussian quantum cellular automata and Gaussian private channels. Cloning was investigated both for finite-dimensional and for continuous-variable systems. We construct a private quantum channel for the sequential encryption of coherent states with a classical key, where the key elements have finite precision. For the case of independent one-mode input states, we explicitly estimate this precision, i.e. the number of key bits needed per input state, in terms of these parameters. (orig.)

  9. Quantum information theory with Gaussian systems

    International Nuclear Information System (INIS)

    Krueger, O.

    2006-01-01

    This thesis applies ideas and concepts from quantum information theory to systems of continuous-variables such as the quantum harmonic oscillator. The focus is on three topics: the cloning of coherent states, Gaussian quantum cellular automata and Gaussian private channels. Cloning was investigated both for finite-dimensional and for continuous-variable systems. We construct a private quantum channel for the sequential encryption of coherent states with a classical key, where the key elements have finite precision. For the case of independent one-mode input states, we explicitly estimate this precision, i.e. the number of key bits needed per input state, in terms of these parameters. (orig.)

  10. Workflow management based on information management

    NARCIS (Netherlands)

    Lutters, Diederick; Mentink, R.J.; van Houten, Frederikus J.A.M.; Kals, H.J.J.

    2001-01-01

    In manufacturing processes, the role of the underlying information is of the utmost importance. Based on three different types of integration (function, information and control), as well as the theory of information management and the accompanying information structures, the entire product creation

  11. Grounded theory for radiotherapy practitioners: Informing clinical practice

    International Nuclear Information System (INIS)

    Walsh, N.A.

    2010-01-01

    Radiotherapy practitioners may be best placed to undertake qualitative research within the context of cancer, due to specialist knowledge of radiation treatment and sensitivity to radiotherapy patient's needs. The grounded theory approach to data collection and analysis is a unique method of identifying a theory directly based on data collected within a clinical context. Research for radiotherapy practitioners is integral to role expansion within the government's directive for evidence-based practice. Due to the paucity of information on qualitative research undertaken by radiotherapy radiographers, this article aims to assess the potential impact of qualitative research on radiotherapy patient and service outcomes.

  12. Feminist Praxis, Critical Theory and Informal Hierarchies

    Directory of Open Access Journals (Sweden)

    Eva Giraud

    2015-05-01

    Full Text Available This article draws on my experiences teaching across two undergraduate media modules in a UK research-intensive institution to explore tactics for combatting both institutional and informal hierarchies within university teaching contexts. Building on Sara Motta’s (2012) exploration of implementing critical pedagogic principles at postgraduate level in an elite university context, I discuss additional tactics for combatting these hierarchies in undergraduate settings, which were developed by transferring insights derived from informal workshops led by the University of Nottingham’s Feminism and Teaching network into the classroom. This discussion is framed in relation to the concepts of “cyborg pedagogies” and “political semiotics of articulation,” derived from the work of Donna Haraway, in order to theorize how these tactics can engender productive relationships between radical pedagogies and critical theory.

  13. Quantum Gravity, Information Theory and the CMB

    Science.gov (United States)

    Kempf, Achim

    2018-04-01

    We review connections between the metric of spacetime and the quantum fluctuations of fields. We start with the finding that the spacetime metric can be expressed entirely in terms of the 2-point correlator of the fluctuations of quantum fields. We then discuss the open question whether the knowledge of only the spectra of the quantum fluctuations of fields also suffices to determine the spacetime metric. This question is of interest because spectra are geometric invariants and their quantization would, therefore, have the benefit of not requiring the modding out of diffeomorphisms. Further, we discuss the fact that spacetime at the Planck scale need not necessarily be either discrete or continuous. Instead, results from information theory show that spacetime may be simultaneously discrete and continuous in the same way that information can. Finally, we review the recent finding that a covariant natural ultraviolet cutoff at the Planck scale implies a signature in the cosmic microwave background (CMB) that may become observable.

  14. An information theory of image gathering

    Science.gov (United States)

    Fales, Carl L.; Huck, Friedrich O.

    1991-01-01

    Shannon's mathematical theory of communication is extended to image gathering. Expressions are obtained for the total information that is received with a single image-gathering channel and with parallel channels. It is concluded that the aliased signal components carry information even though these components interfere with the within-passband components in conventional image gathering and restoration, thereby degrading the fidelity and visual quality of the restored image. An examination of the expression for minimum mean-square-error, or Wiener-matrix, restoration from parallel image-gathering channels reveals a method for unscrambling the within-passband and aliased signal components to restore spatial frequencies beyond the sampling passband out to the spatial frequency response cutoff of the optical aperture.

  15. Cognition and biology: perspectives from information theory.

    Science.gov (United States)

    Wallace, Rodrick

    2014-02-01

    The intimate relation between biology and cognition can be formally examined through statistical models constrained by the asymptotic limit theorems of communication theory, augmented by methods from statistical mechanics and nonequilibrium thermodynamics. Cognition, often involving submodules that act as information sources, is ubiquitous across the living state. Less metabolic free energy is consumed by permitting crosstalk between biological information sources than by isolating them, leading to evolutionary exaptations that assemble shifting, tunable cognitive arrays at multiple scales, and levels of organization to meet dynamic patterns of threat and opportunity. Cognition is thus necessary for life, but it is not sufficient: An organism represents a highly patterned outcome of path-dependent, blind, variation, selection, interaction, and chance extinction in the context of an adequate flow of free energy and an environment fit for development. Complex, interacting cognitive processes within an organism both record and instantiate those evolutionary and developmental trajectories.

  16. Informal Theory: The Ignored Link in Theory-to-Practice

    Science.gov (United States)

    Love, Patrick

    2012-01-01

    Applying theory to practice in student affairs is dominated by the assumption that formal theory is directly applied to practice. Among the problems with this assumption is that many practitioners believe they must choose between their lived experiences and formal theory, and that graduate students are taught that their experience "does not…

  17. Automated Physico-Chemical Cell Model Development through Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    Peter J. Ortoleva

    2005-11-29

    The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is the optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple-module workflow we have implemented that consists of a set of interoperable systems biology computational modules.

  18. Theory of Neural Information Processing Systems

    International Nuclear Information System (INIS)

    Galla, Tobias

    2006-01-01

    It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10^11 neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kuehn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, still the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. 
Many dynamical aspects of neural networks are usually hard to find in the

  19. Cultural-Historical Activity Theory and Domain Analysis: Metatheoretical Implications for Information Science

    Science.gov (United States)

    Wang, Lin

    2013-01-01

    Background: Cultural-historical activity theory is an important theory in modern psychology. In recent years, it has drawn more attention from related disciplines including information science. Argument: This paper argues that activity theory and domain analysis which uses the theory as one of its bases could bring about some important…

  20. Exploring SVM-based intrusion detection through information entropy theory

    Institute of Scientific and Technical Information of China (English)

    朱文杰; 王强; 翟献军

    2013-01-01

    In traditional SVM-based intrusion detection, kernel-function construction and feature selection rely on prior knowledge, which generally leads to low accuracy and poor efficiency. Combining information entropy theory with the SVM algorithm yields an entropy-based SVM intrusion detection algorithm that improves both the accuracy and the efficiency of intrusion detection. The algorithm has two aspects. First, based on the information entropy and variance of the user information contained in each sample, the sample features are unified and measured by whether they fall within a confidence interval. Using the resulting feature confidence vector as a construction parameter of the SVM kernel function preserves the correspondence between the training sample set and the optimal separating hyperplane, while attaining the maximum classification margin required for intrusion detection. Second, using the amount of user information in each sample as a metric, the feature subset is substantially reduced, which not only shrinks the computation but also speeds up classifier training. Experiments show that this algorithm outperforms the traditional SVM algorithm in intrusion detection systems.
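    The entropy-based feature reduction described above can be sketched with a simple information-gain ranking (a generic illustration, not the paper's exact algorithm): features whose values most reduce the entropy of the intrusion labels are retained, and noisy features are dropped before training the classifier.

    ```python
    from collections import Counter
    import math

    def entropy(labels):
        """Shannon entropy of a label sequence, in bits."""
        n = len(labels)
        return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

    def information_gain(feature_vals, labels):
        """Entropy reduction in the class labels from conditioning on one feature."""
        n = len(labels)
        cond = 0.0
        for v in set(feature_vals):
            subset = [l for f, l in zip(feature_vals, labels) if f == v]
            cond += len(subset) / n * entropy(subset)
        return entropy(labels) - cond

    # Hypothetical connection records: feature 0 predicts the intrusion label exactly,
    # feature 1 is pure noise, so the entropy-based ranking should keep feature 0.
    X = [[1, 0], [1, 1], [0, 0], [0, 1], [1, 0], [0, 1]]
    y = [1, 1, 0, 0, 1, 0]
    gains = [information_gain([row[j] for row in X], y) for j in range(2)]
    print(gains[0] > gains[1])  # True
    ```

    Ranking features this way shrinks the training set before the SVM step, which is the efficiency gain the abstract claims.
    
    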

  1. Information theory and coding solved problems

    CERN Document Server

    Ivaniš, Predrag

    2017-01-01

    This book offers a comprehensive overview of information theory and error control coding, using a different approach than that found in the existing literature. The chapters are organized according to the Shannon system model, where one block affects the others. A relatively brief theoretical introduction is provided at the beginning of every chapter, including a few additional examples and explanations, but without any proofs, and a short overview of some aspects of abstract algebra is given at the end of the corresponding chapters. Characteristic complex examples with many illustrations and tables are chosen to provide detailed insights into the nature of the problem. Some limiting cases are presented to illustrate the connections with the theoretical bounds. The numerical values are carefully selected to provide in-depth explanations of the described algorithms. Although the examples in the different chapters can be considered separately, they are mutually connected and the conclusions for one considered proble...

  2. An application of information theory to stochastic classical gravitational fields

    Science.gov (United States)

    Angulo, J.; Angulo, J. C.; Angulo, J. M.

    2018-06-01

    The objective of this study lies in incorporating concepts developed in information theory (entropy, complexity, etc.) with the aim of quantifying the variation of the uncertainty associated with a stochastic physical system residing in a spatiotemporal region. As an example of application, a relativistic classical gravitational field has been considered, with a stochastic behavior resulting from the effect induced by one or several external perturbation sources. One of the key concepts of the study is the covariance kernel between two points within the chosen region. Using this concept and the appropriate criteria, a methodology is proposed to evaluate the change of uncertainty at a given spatiotemporal point, based on available information and efficiently applying the diverse methods that Information Theory provides. For illustration, a stochastic version of the Einstein equation with an added Gaussian Langevin term is analyzed.
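    As a minimal illustration of quantifying uncertainty from a covariance kernel (the kernel and points below are hypothetical, not the paper's relativistic setup), the differential entropy of a Gaussian field evaluated at n points follows from the determinant of their covariance matrix, h = 0.5 log2((2πe)^n det Σ):

    ```python
    import math

    def kernel(d, scale=1.0):
        """Assumed squared-exponential covariance kernel between two field points."""
        return math.exp(-d * d / (2 * scale * scale))

    def gaussian_entropy_bits(cov):
        """Differential entropy of a 2-D Gaussian, h = 0.5*log2((2*pi*e)^2 * det(cov)), in bits."""
        det = cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0]
        return 0.5 * math.log2((2 * math.pi * math.e) ** 2 * det)

    def joint_entropy(d):
        """Entropy of the field evaluated at two points separated by distance d."""
        c = kernel(d)
        return gaussian_entropy_bits([[1.0, c], [c, 1.0]])

    h_near, h_far = joint_entropy(0.5), joint_entropy(2.0)
    print(h_far > h_near)  # True: nearly independent points carry more joint uncertainty
    ```

    Strong covariance between nearby points shrinks det Σ and hence the joint uncertainty, which is the kind of information-theoretic change-of-uncertainty evaluation the study proposes.
    
    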

  3. Cyber Power Theory First, Then Information Operations

    National Research Council Canada - National Science Library

    Smart, Antoinette G

    2001-01-01

    ...) seems disconcerting, at least on the surface. Think tanks, government research organizations, and learned individuals have all pointed to the need for a viable theory of IO, yet no such theory has emerged...

  4. Increasing Bellevue School District's elementary teachers' capacity for teaching inquiry-based science: Using ideas from contemporary learning theory to inform professional development

    Science.gov (United States)

    Maury, Tracy Anne

    This Capstone project examined how leaders in the Bellevue School District can increase elementary teachers' capacity for teaching inquiry-based science through the use of professional learning activities that are grounded in ideas from human learning theory. A framework for professional development was constructed, and from that framework a set of professional learning activities was developed as a means to support teacher learning while project participants piloted new curriculum called the Isopod Habitat Challenge. Teachers in the project increased their understanding of the learning theory principles of preconceptions and metacognition. Teachers did not increase their understanding of the principle of learning with understanding, although they did articulate the significance of engaging children in student-led inquiry cycles. Data from the curriculum revision and professional development project, coupled with ideas from the learning theory, cognition and policy implementation, and learning community literatures, suggest Bellevue's leaders can encourage peer-to-peer interaction, link professional development to teachers' daily practice, and capitalize on technology as ways to increase elementary teachers' capacity for teaching inquiry-based science. These lessons also have significance for supporting teacher learning and efficacy in other subject areas and at other levels in the system.

  5. The development and implementation of theory-driven programs capable of addressing poverty-impacted children's health, mental health, and prevention needs: CHAMP and CHAMP+, evidence-informed, family-based interventions to address HIV risk and care.

    Science.gov (United States)

    McKernan McKay, Mary; Alicea, Stacey; Elwyn, Laura; McClain, Zachary R B; Parker, Gary; Small, Latoya A; Mellins, Claude Ann

    2014-01-01

    This article describes a program of prevention and intervention research conducted by the CHAMP (Collaborative HIV prevention and Adolescent Mental health Project; McKay & Paikoff, 2007) investigative team. CHAMP refers to a set of theory-driven, evidence-informed, collaboratively designed, family-based approaches meant to address the prevention, health, and mental health needs of poverty-impacted African American and Latino urban youth who are either at risk for HIV exposure or perinatally infected and at high risk for reinfection and possible transmission. CHAMP approaches are informed by theoretical frameworks that incorporate an understanding of the critical influences of multilevel contextual factors on youth risk taking and engagement in protective health behaviors. Highly influential theories include the triadic theory of influence, social action theory, and ecological developmental perspectives. CHAMP program delivery strategies were developed via a highly collaborative process drawing upon community-based participatory research methods in order to enhance cultural and contextual sensitivity of program content and format. The development and preliminary outcomes associated with a family-based intervention for a new population, perinatally HIV-infected youth and their adult caregivers, referred to as CHAMP+, are described to illustrate the integration of theory, existing evidence, and intensive input from consumers and healthcare providers.

  6. Novel information theory techniques for phonon spectroscopy

    International Nuclear Information System (INIS)

    Hague, J P

    2007-01-01

    The maximum entropy method (MEM) and spectral reverse Monte Carlo (SRMC) techniques are applied to the determination of the phonon density of states (PDOS) from heat-capacity data. The approach presented here takes advantage of the standard integral transform relating the PDOS with the specific heat at constant volume. MEM and SRMC are highly successful numerical approaches for inverting integral transforms. The formalism and algorithms necessary to carry out the inversion of specific heat curves are introduced, and where possible, I have concentrated on algorithms and experimental details for practical usage. Simulated data are used to demonstrate the accuracy of the approach. The main strength of the techniques presented here is that the resulting spectra are always physical: Computed PDOS is always positive and properly applied information theory techniques only show statistically significant detail. The treatment set out here provides a simple, cost-effective and reliable method to determine phonon properties of new materials. In particular, the new technique is expected to be very useful for establishing where interesting phonon modes and properties can be found, before spending time at large scale facilities
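
The standard integral transform the abstract refers to expresses, in the harmonic approximation, the specific heat as C_V(T) = k_B ∫ g(ω) x² eˣ/(eˣ − 1)² dω with x = ħω/k_BT; MEM and SRMC invert this ill-posed relation. A forward-transform sketch with an assumed Debye-like PDOS (the cutoff frequency is an arbitrary illustrative choice, not from the paper):

```python
import numpy as np

kB = 1.380649e-23       # Boltzmann constant, J/K
hbar = 1.054571817e-34  # reduced Planck constant, J*s

def trapezoid(y, x):
    """Trapezoidal quadrature on a grid."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def debye_pdos(omega, omega_d):
    """Illustrative Debye-like PDOS, g(w) ~ w^2 below a cutoff, normalized to one mode."""
    return np.where(omega <= omega_d, 3.0 * omega**2 / omega_d**3, 0.0)

def heat_capacity(temp, omega, g):
    """Forward transform: C_V(T) = kB * ∫ g(w) x^2 e^x / (e^x - 1)^2 dw, x = ħw/(kB T)."""
    x = hbar * omega / (kB * temp)
    integrand = g * x**2 * np.exp(x) / np.expm1(x)**2
    return kB * trapezoid(integrand, omega)

omega_d = 5e13  # hypothetical Debye cutoff, rad/s
omega = np.linspace(1e10, omega_d, 2000)
g = debye_pdos(omega, omega_d)

for temp in (50.0, 300.0, 5000.0):
    print(f"T = {temp:6.0f} K   C_V per mode = {heat_capacity(temp, omega, g):.3e} J/K")
```

At high temperature the bracketed kernel tends to 1, so C_V per mode approaches k_B (Dulong-Petit); inverting the relation to recover g(ω) from noisy C_V(T) data is what requires the MEM/SRMC machinery of the paper.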

  7. Pangenesis as a source of new genetic information. The history of a now disproven theory.

    Science.gov (United States)

    Bergman, Gerald

    2006-01-01

    Evolution is based on natural selection of existing biological phenotypic traits. Natural selection can only eliminate traits. It cannot create new ones, requiring a theory to explain the origin of new genetic information. The theory of pangenesis was a major attempt to explain the source of new genetic information required to produce phenotypic variety. This theory, advocated by Darwin as the main source of genetic variety, has now been empirically disproved. It is currently a theory mainly of interest to science historians.

  8. Theory-based explanation as intervention.

    Science.gov (United States)

    Weisman, Kara; Markman, Ellen M

    2017-10-01

    Cogent explanations are an indispensable means of providing new information and an essential component of effective education. Beyond this, we argue that there is tremendous untapped potential in using explanations to motivate behavior change. In this article we focus on health interventions. We review four case studies that used carefully tailored explanations to address gaps and misconceptions in people's intuitive theories, providing participants with a conceptual framework for understanding how and why some recommended behavior is an effective way of achieving a health goal. These case studies targeted a variety of health-promoting behaviors: (1) children washing their hands to prevent viral epidemics; (2) parents vaccinating their children to stem the resurgence of infectious diseases; (3) adults completing the full course of an antibiotic prescription to reduce antibiotic resistance; and (4) children eating a variety of healthy foods to improve unhealthy diets. Simply telling people to engage in these behaviors has been largely ineffective-if anything, concern about these issues is mounting. But in each case, teaching participants coherent explanatory frameworks for understanding health recommendations has shown great promise, with such theory-based explanations outperforming state-of-the-art interventions from national health authorities. We contrast theory-based explanations both with simply listing facts, information, and advice and with providing a full-blown educational curriculum, and argue for providing the minimum amount of information required to understand the causal link between a target behavior and a health outcome. We argue that such theory-based explanations lend people the motivation and confidence to act on their new understanding.

  9. Exploring a Theory Describing the Physics of Information Systems, Characterizing the Phenomena of Complex Information Systems

    National Research Council Canada - National Science Library

    Harmon, Scott

    2001-01-01

    This project accomplished all of its objectives: document a theory of information physics, conduct a workshop on planning experiments to test this theory, and design experiments that validate this theory...

  10. Client-Controlled Case Information: A General System Theory Perspective

    Science.gov (United States)

    Fitch, Dale

    2004-01-01

    The author proposes a model for client control of case information via the World Wide Web built on principles of general system theory. It incorporates the client into the design, resulting in an information structure that differs from traditional human services information-sharing practices. Referencing general system theory, the concepts of…

  11. Critical Theory and Information Studies: A Marcusean Infusion

    Science.gov (United States)

    Pyati, Ajit K.

    2006-01-01

    In the field of library and information science, also known as information studies, critical theory is often not included in debates about the discipline's theoretical foundations. This paper argues that the critical theory of Herbert Marcuse, in particular, has a significant contribution to make to the field of information studies. Marcuse's…

  12. Information theoretic resources in quantum theory

    Science.gov (United States)

    Meznaric, Sebastian

    Resource identification and quantification is an essential element of both classical and quantum information theory. Entanglement is one of these resources, arising when quantum communication and nonlocal operations are expensive to perform. In the first part of this thesis we quantify the effective entanglement when operations are additionally restricted to account for both fundamental restrictions on operations, such as those arising from superselection rules, and experimental errors arising from imperfections in the apparatus. For an important class of errors we find a linear relationship between the usual and effective higher dimensional generalization of concurrence, a measure of entanglement. Following the treatment of effective entanglement, we focus on a related concept of nonlocality in the presence of superselection rules (SSR). Here we propose a scheme that may be used to activate nongenuinely multipartite nonlocality, in that a single copy of a state is not multipartite nonlocal, while two or more copies exhibit nongenuinely multipartite nonlocality. The states used exhibit the more powerful genuinely multipartite nonlocality when SSR are not enforced, but not when they are, raising the question of what is needed for genuinely multipartite nonlocality. We show that whenever the number of particles is insufficient, the degrading of genuinely multipartite to nongenuinely multipartite nonlocality is necessary. While in the first few chapters we focus our attention on understanding the resources present in quantum states, in the final part we turn the picture around and instead treat operations themselves as a resource. We provide our observers with free access to classical operations - i.e. those that cannot detect or generate quantum coherence. We show that the operation of interest can then be used to either generate or detect quantum coherence if and only if it violates a particular commutation relation. Using the relative entropy, the
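
Concurrence, the entanglement measure named in the abstract, has a closed form for two qubits (Wootters' formula); the thesis works with a higher-dimensional generalization, but the two-qubit case is easy to sketch:

```python
import numpy as np

def concurrence(rho):
    """Wootters' concurrence of a two-qubit density matrix rho."""
    sy = np.array([[0, -1j], [1j, 0]])
    yy = np.kron(sy, sy)
    rho_tilde = yy @ rho.conj() @ yy              # spin-flipped state
    # square roots of the eigenvalues of rho * rho_tilde, sorted descending
    lam = np.sqrt(np.abs(np.linalg.eigvals(rho @ rho_tilde)))
    lam = np.sort(lam)[::-1]
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

# Maximally entangled Bell state (|00> + |11>)/sqrt(2)
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2.0)
bell = np.outer(phi, phi.conj())

# Separable product state |00><00|
prod = np.zeros((4, 4))
prod[0, 0] = 1.0

print(concurrence(bell))  # → 1.0 (up to numerical error)
print(concurrence(prod))  # → 0.0
```

The measure runs from 0 for separable states to 1 for maximally entangled ones, which is what makes a linear relationship between usual and effective concurrence (as found in the thesis) an operationally meaningful statement.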

  13. Role of information theoretic uncertainty relations in quantum theory

    Energy Technology Data Exchange (ETDEWEB)

    Jizba, Petr, E-mail: p.jizba@fjfi.cvut.cz [FNSPE, Czech Technical University in Prague, Břehová 7, 115 19 Praha 1 (Czech Republic); ITP, Freie Universität Berlin, Arnimallee 14, D-14195 Berlin (Germany); Dunningham, Jacob A., E-mail: J.Dunningham@sussex.ac.uk [Department of Physics and Astronomy, University of Sussex, Falmer, Brighton, BN1 9QH (United Kingdom); Joo, Jaewoo, E-mail: j.joo@surrey.ac.uk [Advanced Technology Institute and Department of Physics, University of Surrey, Guildford, GU2 7XH (United Kingdom)

    2015-04-15

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.
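
The conjugate-index Rényi uncertainty relations reviewed here can be spot-checked numerically: for a qubit measured in the computational and Hadamard bases, the Maassen-Uffink bound reads H_α(p) + H_β(q) ≥ 1 bit whenever 1/α + 1/β = 2. The choice α = 2 and the random-state test below are my own illustration:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) in bits (alpha = 1 falls back to Shannon)."""
    p = p[p > 1e-15]
    if abs(alpha - 1.0) < 1e-12:
        return float(-np.sum(p * np.log2(p)))
    return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))

rng = np.random.default_rng(0)
hadamard = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)

alpha = 2.0
beta = alpha / (2.0 * alpha - 1.0)  # conjugate index: 1/alpha + 1/beta = 2
bound = 1.0  # -2*log2(c) with overlap c = 1/sqrt(2) for these two bases

worst = np.inf
for _ in range(1000):
    psi = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi /= np.linalg.norm(psi)        # random pure qubit state
    p = np.abs(psi) ** 2              # computational-basis probabilities
    q = np.abs(hadamard @ psi) ** 2   # Hadamard-basis probabilities
    worst = min(worst, renyi_entropy(p, alpha) + renyi_entropy(q, beta))

print(round(worst, 4), ">=", bound)  # the bound is saturated by basis states
```

Basis states make the relation tight (one entropy vanishes, the other equals 1 bit), which is the two-level setting in which the paper compares ITURs against the Robertson-Schrödinger relation.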

  14. Role of information theoretic uncertainty relations in quantum theory

    International Nuclear Information System (INIS)

    Jizba, Petr; Dunningham, Jacob A.; Joo, Jaewoo

    2015-01-01

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed

  15. Image matching navigation based on fuzzy information

    Institute of Scientific and Technical Information of China (English)

    田玉龙; 吴伟仁; 田金文; 柳健

    2003-01-01

    In conventional image matching methods, the matching process is based mostly on image statistics. One aspect neglected by all of these methods is the considerable fuzzy information contained in the images. A new fuzzy matching algorithm based on fuzzy similarity for navigation is presented in this paper. Because fuzzy theory is well suited to describing the fuzzy information contained in images, an image matching method based on fuzzy similarity can be expected to perform well. Experimental results using the fuzzy-information matching algorithm demonstrate its reliability and practicability.
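
The abstract does not spell out its similarity measure; a common choice of fuzzy similarity between two gray-level patches, treated as membership maps in [0, 1], is the min/max ratio Σ min(a, b) / Σ max(a, b). A hypothetical template-matching sketch built on that choice (not the paper's algorithm):

```python
import numpy as np

def fuzzy_similarity(a, b):
    """Min/max fuzzy similarity of two membership maps with values in [0, 1]."""
    return float(np.minimum(a, b).sum() / np.maximum(a, b).sum())

def match(image, template):
    """Exhaustive search for the template position maximizing fuzzy similarity."""
    ih, iw = image.shape
    th, tw = template.shape
    best, best_pos = -1.0, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            s = fuzzy_similarity(image[y:y + th, x:x + tw], template)
            if s > best:
                best, best_pos = s, (y, x)
    return best_pos, best

rng = np.random.default_rng(1)
image = rng.random((32, 32))          # gray levels scaled to [0, 1]
template = image[10:18, 5:13].copy()  # a patch cut out of the image itself
pos, score = match(image, template)
print(pos, round(score, 3))  # → (10, 5) 1.0
```

The similarity equals 1 only for identical patches and degrades gracefully under gray-level perturbation, which is the robustness property fuzzy measures are meant to add over pure correlation statistics.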

  16. Nonequilibrium thermodynamics and information theory: basic concepts and relaxing dynamics

    Science.gov (United States)

    Altaner, Bernhard

    2017-11-01

    Thermodynamics is based on the notions of energy and entropy. While energy is the elementary quantity governing physical dynamics, entropy is the fundamental concept in information theory. In this work, starting from first principles, we give a detailed didactic account on the relations between energy and entropy and thus physics and information theory. We show that thermodynamic process inequalities, like the second law, are equivalent to the requirement that an effective description for physical dynamics is strongly relaxing. From the perspective of information theory, strongly relaxing dynamics govern the irreversible convergence of a statistical ensemble towards the maximally non-committal probability distribution that is compatible with thermodynamic equilibrium parameters. In particular, Markov processes that converge to a thermodynamic equilibrium state are strongly relaxing. Our framework generalizes previous results to arbitrary open and driven systems, yielding novel thermodynamic bounds for idealized and real processes.
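
The statement that Markov processes converging to thermodynamic equilibrium are strongly relaxing can be illustrated by the discrete H-theorem: for any Markov chain with stationary distribution π, the relative entropy D(p_t || π) is non-increasing in t. A small sketch (the 3-state chain, its energies and its rates are my own illustrative choices):

```python
import numpy as np

def kl(p, q):
    """Relative entropy D(p || q) in nats."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical 3-state system with Boltzmann equilibrium pi at T = 1
energies = np.array([0.0, 1.0, 2.0])
pi = np.exp(-energies) / np.exp(-energies).sum()

# Metropolis-style transition matrix obeying detailed balance w.r.t. pi
P = np.zeros((3, 3))
for i in range(3):
    for j in range(3):
        if i != j:
            P[i, j] = 0.25 * min(1.0, pi[j] / pi[i])
    P[i, i] = 1.0 - P[i].sum()

p = np.array([1.0, 0.0, 0.0])   # start far from equilibrium
d_prev = kl(p, pi)
for _ in range(50):
    p = p @ P                   # one step of the (row-stochastic) chain
    d = kl(p, pi)
    assert d <= d_prev + 1e-12  # H-theorem: D(p_t || pi) never increases
    d_prev = d

print(p.round(4), f"D = {d_prev:.2e}")
```

The monotone decay of D(p_t || π) toward zero is exactly the "irreversible convergence towards the maximally non-committal distribution" the abstract describes, here in its simplest finite-state form.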

  17. Elaborations of grounded theory in information research: arenas/social worlds theory, discourse and situational analysis

    OpenAIRE

    Vasconcelos, A.C.; Sen, B.A.; Rosa, A.; Ellis, D.

    2012-01-01

    This paper explores elaborations of Grounded Theory in relation to Arenas/Social Worlds Theory. The notions of arenas and social worlds were present in early applications of Grounded Theory but have not been as much used or recognised as the general Grounded Theory approach, particularly in the information studies field. The studies discussed here are therefore very unusual in information research. The empirical contexts of these studies are those of (1) the role of discourse in the organisat...

  18. Applying Information Processing Theory to Supervision: An Initial Exploration

    Science.gov (United States)

    Tangen, Jodi L.; Borders, L. DiAnne

    2017-01-01

    Although clinical supervision is an educational endeavor (Borders & Brown, [Borders, L. D., 2005]), many scholars neglect theories of learning in working with supervisees. The authors describe 1 learning theory--information processing theory (Atkinson & Shiffrin, 1968, 1971; Schunk, 2016)--and the ways its associated interventions may…

  19. USING INFORMATION THEORY TO DEFINE A SUSTAINABILITY INDEX

    Science.gov (United States)

    Information theory has many applications in Ecology and Environmental science, such as a biodiversity indicator, as a measure of evolution, a measure of distance from thermodynamic equilibrium, and as a measure of system organization. Fisher Information, in particular, provides a...

  20. Client-controlled case information: a general system theory perspective.

    Science.gov (United States)

    Fitch, Dale

    2004-07-01

    The author proposes a model for client control of case information via the World Wide Web built on principles of general system theory. It incorporates the client into the design, resulting in an information structure that differs from traditional human services information-sharing practices. Referencing general system theory, the concepts of controller and controlled system, as well as entropy and negentropy, are applied to the information flow and autopoietic behavior as they relate to the boundary-maintaining functions of today's organizations. The author's conclusions synthesize general system theory and human services values to lay the foundation for an information-sharing framework for human services in the 21st century.

  1. The Balance-Scale Task Revisited: A Comparison of Statistical Models for Rule-Based and Information-Integration Theories of Proportional Reasoning.

    Directory of Open Access Journals (Sweden)

    Abe D Hofman

    We propose and test three statistical models for the analysis of children's responses to the balance scale task, a seminal task to study proportional reasoning. We use a latent class modelling approach to formulate a rule-based latent class model (RB LCM), following from a rule-based perspective on proportional reasoning, and a new statistical model, the Weighted Sum Model, following from an information-integration approach. Moreover, a hybrid LCM using item covariates is proposed, combining aspects of both the rule-based and information-integration perspectives. These models are applied to two datasets: a standard paper-and-pencil test dataset (N = 779) and a dataset collected within an online learning environment that included direct feedback, time pressure, and a reward system (N = 808). For the paper-and-pencil dataset the RB LCM resulted in the best fit, whereas for the online dataset the hybrid LCM provided the best fit. The standard paper-and-pencil dataset yielded more evidence for distinct solution rules than the online dataset, in which quantitative item characteristics are more prominent in determining responses. These results shed new light on the debate between sequential rule-based and information-integration perspectives on cognitive development.
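
The deterministic limit of an information-integration account of the balance scale is the torque rule: sum weight × distance on each side and compare. This sketch shows the kind of quantitative integration the Weighted Sum Model elaborates statistically (the statistical model itself is not reproduced here):

```python
def torque_rule(weights_left, dists_left, weights_right, dists_right):
    """Deterministic information-integration rule for the balance scale:
    the side with the larger summed torque (weight * distance) goes down."""
    torque_left = sum(w * d for w, d in zip(weights_left, dists_left))
    torque_right = sum(w * d for w, d in zip(weights_right, dists_right))
    if torque_left > torque_right:
        return "left"
    if torque_right > torque_left:
        return "right"
    return "balance"

# Conflict item: more weight on the left, more distance on the right.
# A child using a simple "more weight wins" rule answers "left";
# integrating both dimensions gives the correct answer.
print(torque_rule([3], [2], [2], [4]))  # → right (torque 6 vs 8)
print(torque_rule([2], [3], [3], [2]))  # → balance (torque 6 vs 6)
```

Conflict items like these are exactly where rule-based and information-integration accounts make different predictions, which is why they carry most of the evidential weight in the model comparison above.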

  2. Epistemology as Information Theory: From Leibniz to Omega

    OpenAIRE

    Chaitin, G. J.

    2005-01-01

    In 1686 in his Discours de Metaphysique, Leibniz points out that if an arbitrarily complex theory is permitted then the notion of "theory" becomes vacuous because there is always a theory. This idea is developed in the modern theory of algorithmic information, which deals with the size of computer programs and provides a new view of Godel's work on incompleteness and Turing's work on uncomputability. Of particular interest is the halting probability Omega, whose bits are irreducible, i.e., ma...

  3. Finding an Information Concept Suited for a Universal Theory of Information

    DEFF Research Database (Denmark)

    Brier, Søren

    2015-01-01

    . There is no conclusive evidence that the core of reality across nature, culture, life and mind is purely either mathematical, logical or of a computational nature. Therefore the core of the information concept should not only be based only on pure logical or mathematical rationality. We need to include interpretation...... definition information in a transdisciplinary theory cannot be ‘objective’, but has to be relativized in relation to the receiver's knowledge, as also proposed by Floridi. It is difficult to produce a quantitative statement independently of a qualitative analysis based on some sort of relation to the human...

  4. Theory-Based Evaluation Meets Ambiguity

    DEFF Research Database (Denmark)

    Dahler-Larsen, Peter

    2017-01-01

    As theory-based evaluation (TBE) engages in situations where multiple stakeholders help develop complex program theory about dynamic phenomena in politically contested settings, it becomes difficult to develop and use program theory without ambiguity. The purpose of this article is to explore...... ambiguity as a fruitful perspective that helps TBE face current challenges. Literatures in organization theory and political theory are consulted in order to cultivate the concept of ambiguity. Janus variables (which work in two ways) and other ambiguous aspects of program theories are classified...... and exemplified. Stances towards ambiguity are considered, as are concrete steps that TBE evaluators can take to identify and deal with ambiguity in TBE....

  5. Quantum theory from first principles an informational approach

    CERN Document Server

    D'Ariano, Giacomo Mauro; Perinotti, Paolo

    2017-01-01

    Quantum theory is the soul of theoretical physics. It is not just a theory of specific physical systems, but rather a new framework with universal applicability. This book shows how we can reconstruct the theory from six information-theoretical principles, by rebuilding the quantum rules from the bottom up. Step by step, the reader will learn how to master the counterintuitive aspects of the quantum world, and how to efficiently reconstruct quantum information protocols from first principles. Using intuitive graphical notation to represent equations, and with shorter and more efficient derivations, the theory can be understood and assimilated with exceptional ease. Offering a radically new perspective on the field, the book contains an efficient course of quantum theory and quantum information for undergraduates. The book is aimed at researchers, professionals, and students in physics, computer science and philosophy, as well as the curious outsider seeking a deeper understanding of the theory.

  6. Why hydrological predictions should be evaluated using information theory

    Directory of Open Access Journals (Sweden)

    S. V. Weijs

    2010-12-01

    Probabilistic predictions are becoming increasingly popular in hydrology. Equally important are methods to test such predictions, given the topical debate on uncertainty analysis in hydrology. Also in the special case of hydrological forecasting, there is still discussion about which scores to use for their evaluation. In this paper, we propose to use information theory as the central framework to evaluate predictions. From this perspective, we hope to shed some light on what verification scores measure and should measure. We start from the "divergence score", a relative entropy measure that was recently found to be an appropriate measure for forecast quality. An interpretation of a decomposition of this measure provides insight into additive relations between climatological uncertainty, correct information, wrong information and remaining uncertainty. When the score is applied to deterministic forecasts, it follows that these increase uncertainty to infinity. In practice, however, deterministic forecasts tend to be judged far more mildly and are widely used. We resolve this paradoxical result by proposing that deterministic forecasts either are implicitly probabilistic or are implicitly evaluated with an underlying decision problem or utility in mind. We further propose that calibration of models representing a hydrological system should be based on information-theoretical scores, because this allows extracting all information from the observations and avoids learning from information that is not there. Calibration based on maximizing utility for society trains an implicit decision model rather than the forecasting system itself. This inevitably results in a loss or distortion of information in the data and more risk of overfitting, possibly leading to less valuable and informative forecasts. We also show this in an example. The final conclusion is that models should preferably be explicitly probabilistic and calibrated to maximize the
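
For a binary event, the divergence score D_KL(observation || forecast) reduces to the ignorance score, −log₂ of the probability the forecast assigned to the outcome that materialized. This makes the abstract's point concrete: a deterministic forecast (probability 0 or 1) scores infinitely badly on its first wrong call. The binary reduction and the toy numbers below are my own illustration:

```python
import numpy as np

def divergence_score(forecast_prob, outcome):
    """D_KL(observation || forecast) for a binary event, in bits.
    With the materialized outcome as a degenerate distribution, this is
    the ignorance score: -log2 of the probability given to what happened."""
    p = forecast_prob if outcome == 1 else 1.0 - forecast_prob
    with np.errstate(divide="ignore"):
        return float(-np.log2(p))

outcomes  = [1, 0, 1, 1, 0]
prob_fcst = [0.8, 0.3, 0.6, 0.9, 0.2]   # hedged probabilistic forecasts
det_fcst  = [1.0, 0.0, 0.0, 1.0, 0.0]   # deterministic, with one wrong call

prob_total = np.mean([divergence_score(f, o) for f, o in zip(prob_fcst, outcomes)])
det_total = np.mean([divergence_score(f, o) for f, o in zip(det_fcst, outcomes)])
print(round(prob_total, 3))  # finite average score
print(det_total)             # → inf: one wrong deterministic forecast dominates
```

The hedged forecaster pays a small, bounded price on every occasion; the deterministic one wins whenever right but is assigned infinite surprise the moment an event it declared impossible occurs, which is the paradox the paper resolves.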

  7. Could information theory provide an ecological theory of sensory processing?

    Science.gov (United States)

    Atick, Joseph J

    2011-01-01

    The sensory pathways of animals are well adapted to processing a special class of signals, namely stimuli from the animal's environment. An important fact about natural stimuli is that they are typically very redundant and hence the sampled representation of these signals formed by the array of sensory cells is inefficient. One could argue for some animals and pathways, as we do in this review, that efficiency of information representation in the nervous system has several evolutionary advantages. Consequently, one might expect that much of the processing in the early levels of these sensory pathways could be dedicated towards recoding incoming signals into a more efficient form. In this review, we explore the principle of efficiency of information representation as a design principle for sensory processing. We give a preliminary discussion on how this principle could be applied in general to predict neural processing and then discuss concretely some neural systems where it recently has been shown to be successful. In particular, we examine the fly's LMC coding strategy and the mammalian retinal coding in the spatial, temporal and chromatic domains.

  8. A Mathematical Theory of System Information Flow

    Science.gov (United States)

    2016-06-27

    i.i.d. is usually quite involved. There are numerous experiments, often using photons, to test Bell's Inequality recorded in the literature, but the...classical setting. Peter focused on non-locality as an alternative theory and experiments using the CHSH inequality, and devised a statistical procedure...

  9. Finding an information concept suited for a universal theory of information.

    Science.gov (United States)

    Brier, Søren

    2015-12-01

    The view argued in this article is that if we want to define a universal concept of information covering subjective experiential and meaningful cognition - as well as intersubjective meaningful communication in nature, technology, society and life worlds - then the main problem is to decide which epistemological, ontological and philosophy of science framework the concept of information should be based on and integrated in. All the ontological attempts to create objective concepts of information result in concepts that cannot encompass the meaning and experience of embodied living and social systems. There is no conclusive evidence that the core of reality across nature, culture, life and mind is purely either mathematical, logical or of a computational nature. Therefore the core of the information concept should not be based only on pure logical or mathematical rationality. We need to include interpretation, signification and meaning construction in our transdisciplinary framework for information as a basic aspect of reality alongside the physical, chemical and molecular biological. Dretske defines information as the content of new, true, meaningful, and understandable knowledge. According to this widely held definition, information in a transdisciplinary theory cannot be 'objective', but has to be relativized in relation to the receiver's knowledge, as also proposed by Floridi. It is difficult to produce a quantitative statement independently of a qualitative analysis based on some sort of relation to the human condition as a semiotic animal. I therefore suggest, as an alternative, building information theories based on semiotics, starting from the basic relations of embodied living systems' meaningful cognition and communication. I agree with Peircean biosemiotics that all information must be part of real relational sign-processes manifesting as tokens. Copyright © 2015. Published by Elsevier Ltd.

  10. Theory of the Concealed Information Test

    NARCIS (Netherlands)

    Verschuere, B.; Ben-Shakhar, G.; Verschuere, B.; Ben-Shakhar, G.; Meijer, E.

    2011-01-01

    It is now well established that physiological measures can be validly used to detect concealed information. An important challenge is to elucidate the underlying mechanisms of concealed information detection. We review theoretical approaches that can be broadly classified in two major categories:

  11. Assessment of visual communication by information theory

    Science.gov (United States)

    Huck, Friedrich O.; Fales, Carl L.

    1994-01-01

    This assessment of visual communication integrates the optical design of the image-gathering device with the digital processing for image coding and restoration. Results show that informationally optimized image gathering ordinarily can be relied upon to maximize the information efficiency of decorrelated data and the visual quality of optimally restored images.

  12. Public Management Information Systems: Theory and Prescription.

    Science.gov (United States)

    Bozeman, Barry; Bretschneider, Stuart

    1986-01-01

    The existing theoretical framework for research in management information systems (MIS) is criticized for its lack of attention to the external environment of organizations, and a new framework is developed which better accommodates MIS in public organizations: public management information systems. Four models of publicness that reflect external…

  13. Entropy in quantum information theory - Communication and cryptography

    DEFF Research Database (Denmark)

    Majenz, Christian

    in quantum Shannon theory. While immensely more entanglement-consuming, the variant of port based teleportation is interesting for applications like instantaneous non-local computation and attacks on quantum position-based cryptography. Port based teleportation cannot be implemented perfectly......, for vanishing error. As a byproduct, a new lower bound for the size of the program register for an approximate universal programmable quantum processor is derived. Finally, the mix is completed with a result in quantum cryptography. While quantum key distribution is the most well-known quantum cryptographic...... protocol, there has been increased interest in extending the framework of symmetric key cryptography to quantum messages. We give a new definition for information-theoretic quantum non-malleability, strengthening the previous definition by Ambainis et al. We show that quantum non-malleability implies secrecy...

  14. Equity trees and graphs via information theory

    Science.gov (United States)

    Harré, M.; Bossomaier, T.

    2010-01-01

    We investigate the similarities and differences between two measures of the relationship between equities traded in financial markets. Our measures are the correlation coefficients and the mutual information. In the context of financial markets correlation coefficients are well established whereas mutual information has not previously been as well studied despite its theoretically appealing properties. We show that asset trees which are derived from either the correlation coefficients or the mutual information have a mixture of both similarities and differences at the individual equity level and at the macroscopic level. We then extend our consideration from trees to graphs using the "genus 0" condition recently introduced in order to study the networks of equities.
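
The two dependence measures compared above can be contrasted on synthetic data: the correlation coefficient feeds the usual asset-tree distance √(2(1 − ρ)), while mutual information can be estimated from a 2-D histogram and, for Gaussian data, checked against the closed form −½ ln(1 − ρ²). The one-factor return model below is an assumption for illustration only:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Two synthetic "equity return" series driven by a common market factor
market = rng.normal(size=n)
r1 = 0.7 * market + rng.normal(size=n)
r2 = 0.7 * market + rng.normal(size=n)

rho = np.corrcoef(r1, r2)[0, 1]
dist = np.sqrt(2.0 * (1.0 - rho))   # tree distance used with correlations

# Histogram (plug-in) estimate of mutual information, in nats
counts, _, _ = np.histogram2d(r1, r2, bins=30)
p_xy = counts / n
p_x = p_xy.sum(axis=1, keepdims=True)
p_y = p_xy.sum(axis=0, keepdims=True)
mask = p_xy > 0
mi_hist = float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask])))

mi_gauss = -0.5 * np.log(1.0 - rho**2)  # exact for bivariate Gaussians
print(round(rho, 3), round(dist, 3), round(mi_hist, 3), round(mi_gauss, 3))
```

For Gaussian returns the two measures carry the same information; the interest of mutual information in the paper is precisely that real returns are not Gaussian, so the histogram estimate can detect nonlinear dependence the correlation coefficient misses.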

  15. Information Theoretic Characterization of Physical Theories with Projective State Space

    Science.gov (United States)

    Zaopo, Marco

    2015-08-01

    Probabilistic theories are a natural framework to investigate the foundations of quantum theory and possible alternative or deeper theories. In a generic probabilistic theory, states of a physical system are represented as vectors of outcome probabilities and state spaces are convex cones. In this picture the physics of a given theory is related to the geometric shape of the cone of states. In quantum theory, for instance, the shape of the cone of states corresponds to a projective space over the complex numbers. In this paper we investigate geometric constraints on the state space of a generic theory imposed by the following information-theoretic requirements: every state of a system that is not completely mixed is perfectly distinguishable from some other state in a single-shot measurement; the information capacity of physical systems is conserved under the mixing of states. These assumptions guarantee that a generic physical system satisfies a natural principle asserting that the more mixed a state of the system is, the less information can be stored in the system using that state as a logical value. We show that all theories satisfying the above assumptions are such that the shape of their cones of states is that of a projective space over a generic field of numbers. Remarkably, these theories constitute generalizations of quantum theory in which the superposition principle holds with coefficients pertaining to a generic field of numbers in place of the complex numbers. If the field of numbers is trivial and contains only one element, we obtain classical theory. This result shows that the superposition principle is quite common among probabilistic theories, while its absence gives evidence of either classical theory or an implausible theory.

  16. Information carriers and (reading them through) information theory in quantum chemistry.

    Science.gov (United States)

    Geerlings, Paul; Borgoo, Alex

    2011-01-21

    This Perspective discusses the reduction of the electronic wave function, via the second-order reduced density matrix, to the electron density ρ(r), which is the key ingredient in density functional theory (DFT), as a basic carrier of information. Simplifying further, the 1-normalized density function (the shape function σ(r)) turns out to contain essentially the same information as ρ(r) and is even preferred as an information carrier when discussing periodic properties along Mendeleev's table, where essentially the valence electrons are at stake. The Kullback-Leibler information deficiency turns out to be the most interesting choice for obtaining information on the differences in ρ(r) or σ(r) between two systems. Put otherwise: when constructing a functional F_AB = F[ζ_A(r), ζ_B(r)] for extracting differences in information from an information carrier ζ(r) (i.e. ρ(r) or σ(r)) for two systems A and B, the Kullback-Leibler information measure ΔS is a particularly adequate choice. Examples are given, varying from atoms to molecules and molecular interactions. Quantum similarity of atoms indicates that the shape-function-based KL information deficiency is the most appropriate tool to retrieve periodicity in the Periodic Table. The dissimilarity of enantiomers, for which different information measures are presented at the global and local (i.e. molecular and atomic) levels, leads to an extension of Mezey's holographic density theorem and shows numerical evidence that in a chiral molecule the whole molecule is pervaded by chirality. Finally, Kullback-Leibler information profiles are discussed for intra- and intermolecular proton transfer reactions and a simple S_N2 reaction, indicating that the theoretical information profile can be used as a companion to the energy-based Hammond postulate to discuss the early or late transition-state character of a reaction. All in all this Perspective's answer is positive to the question of whether an even simpler carrier of
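    Since the Kullback-Leibler information deficiency is central to this record, a minimal discretized sketch may help. The grid, the Gaussian "densities" standing in for the densities of two systems A and B, and the function name are illustrative assumptions, not the authors' systems.

```python
import numpy as np

def kl_deficiency(p, q, eps=1e-12):
    """Kullback-Leibler information deficiency ΔS = Σ p ln(p/q)
    between two discretized, normalized density (or shape) functions."""
    p = np.asarray(p, float) / np.sum(p)
    q = np.asarray(q, float) / np.sum(q)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# One-dimensional Gaussians on a grid standing in for the densities of A and B.
r = np.linspace(-5.0, 5.0, 201)
rho_A = np.exp(-r**2 / 2)
rho_B = np.exp(-(r - 0.5)**2 / 2)

dS_AB = kl_deficiency(rho_A, rho_B)
```

    ΔS is non-negative, vanishes only when the two normalized densities coincide, and in general is not symmetric in A and B, which is why it measures an information *deficiency* of one carrier relative to the other.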

  17. Affect Theory and Autoethnography in Ordinary Information Systems

    DEFF Research Database (Denmark)

    Bødker, Mads; Chamberlain, Alan

    2016-01-01

    This paper uses philosophical theories of affect as a lens for exploring autoethnographic renderings of everyday experience with information technology. Affect theories, in the paper, denote a broad trend in post-humanistic philosophy that explores sensation and feeling as emergent and relational...

  18. Response to Patrick Love's "Informal Theory": A Rejoinder

    Science.gov (United States)

    Evans, Nancy J.; Guido, Florence M.

    2012-01-01

    This rejoinder to Patrick Love's article, "Informal Theory: The Ignored Link in Theory-to-Practice," which appears earlier in this issue of the "Journal of College Student Development", was written at the invitation of the Editor. In the critique, we point out the weaknesses of many of Love's arguments and propositions. We provide an alternative…

  19. Quantum: information theory: technological challenge; Computacion Cuantica: un reto tecnologico

    Energy Technology Data Exchange (ETDEWEB)

    Calixto, M.

    2001-07-01

    The new Quantum Information Theory augurs powerful machines that obey the entangled logic of the subatomic world. Parallelism, entanglement, teleportation, no-cloning and quantum cryptography are typical peculiarities of this novel way of understanding computation. (Author) 24 refs.

  20. Information Processing Theories and the Education of the Gifted.

    Science.gov (United States)

    Rawl, Ruth K.; O'Tuel, Frances S.

    1983-01-01

    The basic assumptions of information processing theories in cognitive psychology are reviewed, and the application of this approach to problem solving in gifted education is considered. Specific implications are cited on problem selection and instruction giving. (CL)

  1. Information theory and its application to optical communication

    NARCIS (Netherlands)

    Willems, F.M.J.

    2017-01-01

    The lecture focuses on the foundations of communication as developed within the field of information theory. Enumerative shaping techniques and the so-called square-root transform are discussed in detail.

  2. An introduction to single-user information theory

    CERN Document Server

    Alajaji, Fady

    2018-01-01

    This book presents a succinct and mathematically rigorous treatment of the main pillars of Shannon’s information theory, discussing the fundamental concepts and indispensable results of Shannon’s mathematical theory of communications. It includes five meticulously written core chapters (with accompanying problems), emphasizing the key topics of information measures; lossless and lossy data compression; channel coding; and joint source-channel coding for single-user (point-to-point) communications systems. It also features two appendices covering necessary background material in real analysis and in probability theory and stochastic processes. The book is ideal for a one-semester foundational course on information theory for senior undergraduate and entry-level graduate students in mathematics, statistics, engineering, and computing and information sciences. A comprehensive instructor’s solutions manual is available.

  3. Reference group theory with implications for information studies: a theoretical essay

    Directory of Open Access Journals (Sweden)

    E. Murell Dawson

    2001-01-01

    Full Text Available This article explores the role and implications of reference group theory in relation to the field of library and information science. Reference group theory is based upon the principle that people take the standards of significant others as a basis for making self-appraisals, comparisons, and choices regarding need and use of information. Research that applies concepts of reference group theory to various sectors of library and information studies can provide data useful in enhancing areas such as information-seeking research, special populations, and uses of information. The implications are promising: knowledge gained from such research can help information professionals better understand the role theory plays in how people manage their information and social worlds.

  4. Vocation in theology-based nursing theories.

    Science.gov (United States)

    Lundmark, Mikael

    2007-11-01

    By using the concepts of intrinsicality/extrinsicality as analytic tools, the theology-based nursing theories of Ann Bradshaw and Katie Eriksson are analyzed regarding their explicit and/or implicit understanding of vocation as a motivational factor for nursing. The results show that both theories view intrinsic values as guarantees against reducing nursing practice to mechanistic applications of techniques and as being a way of reinforcing a high ethical standard. The theories explicitly (Bradshaw) or implicitly (Eriksson) advocate a vocational understanding of nursing as being essential for nursing theories. Eriksson's theory has a potential for conceptualizing an understanding of extrinsic and intrinsic motivational factors for nursing but one weakness in the theory could be the risk of slipping over to moral judgments where intrinsic factors are valued as being superior to extrinsic. Bradshaw's theory is more complex and explicit in understanding the concept of vocation and is theologically more plausible, although also more confessional.

  5. Activity System Theory Approach to Healthcare Information System

    OpenAIRE

    Bai, Guohua

    2004-01-01

    A healthcare information system is a very complex system and has to be approached from systemic perspectives. This paper presents an Activity System Theory (AST) approach that integrates systems thinking and social psychology. In the first part of the paper, activity system theory is presented and, in particular, a recursive model of the human activity system is introduced. A project ‘Integrated Mobile Information System for Diabetic Healthcare (IMIS)’ is then used to demonstrate a practical application of th...

  6. Advancing Theory? Landscape Archaeology and Geographical Information Systems

    Directory of Open Access Journals (Sweden)

    Di Hu

    2012-05-01

    Full Text Available This paper will focus on how Geographical Information Systems (GIS) have been applied in Landscape Archaeology from the late 1980s to the present. GIS, a tool for organising and analysing spatial information, has exploded in popularity, but we still lack a systematic overview of how it has contributed to archaeological theory, specifically Landscape Archaeology. This paper will examine whether and how GIS has advanced archaeological theory through a historical review of its application in archaeology.

  7. Theory-based interventions for contraception.

    Science.gov (United States)

    Lopez, Laureen M; Grey, Thomas W; Chen, Mario; Tolley, Elizabeth E; Stockton, Laurie L

    2016-11-23

    The explicit use of theory in research helps expand the knowledge base. Theories and models have been used extensively in HIV-prevention research and in interventions for preventing sexually transmitted infections (STIs). The health behavior field uses many theories or models of change. However, many educational interventions addressing contraception have no explicit theoretical base. To review randomized controlled trials (RCTs) that tested a theoretical approach to inform contraceptive choice and encourage or improve contraceptive use. To 1 November 2016, we searched for trials that tested a theory-based intervention for improving contraceptive use in PubMed, CENTRAL, POPLINE, Web of Science, ClinicalTrials.gov, and ICTRP. For the initial review, we wrote to investigators to find other trials. Included trials tested a theory-based intervention for improving contraceptive use. Interventions addressed the use of one or more methods for contraception. The reports provided evidence that the intervention was based on a specific theory or model. The primary outcomes were pregnancy and contraceptive choice or use. We assessed titles and abstracts identified during the searches. One author extracted and entered the data into Review Manager; a second author verified accuracy. We examined studies for methodological quality. For unadjusted dichotomous outcomes, we calculated the Mantel-Haenszel odds ratio (OR) with 95% confidence interval (CI). Cluster randomized trials used various methods of accounting for the clustering, such as multilevel modeling. Most reports did not provide information to calculate the effective sample size. Therefore, we presented the results as reported by the investigators. We did not conduct meta-analysis due to varied interventions and outcome measures. We included 10 new trials for a total of 25. Five were conducted outside the USA. Fifteen trials randomly assigned individuals and 10 randomized clusters. This section focuses on nine trials with high or
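    The unadjusted odds ratio mentioned in the methods can be illustrated for a single 2x2 table. Note that this is the simple single-table Woolf (log-scale) approximation with hypothetical counts, not the stratified Mantel-Haenszel estimator the review actually uses; in practice Review Manager performs these calculations.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Woolf (log-scale) 95% CI for one 2x2 table:
    a/b = events/non-events in the intervention arm,
    c/d = events/non-events in the control arm."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: 10/100 pregnancies with the intervention, 20/100 without.
or_, lo, hi = odds_ratio_ci(10, 90, 20, 80)
```
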

  8. Power Load Prediction Based on Fractal Theory

    OpenAIRE

    Jian-Kai, Liang; Cattani, Carlo; Wan-Qing, Song

    2015-01-01

    The basic theories of load forecasting on the power system are summarized. Fractal theory, which is a new algorithm applied to load forecasting, is introduced. Based on the fractal dimension and fractal interpolation function theories, the correlation algorithms are applied to the model of short-term load forecasting. According to the process of load forecasting, the steps of every process are designed, including load data preprocessing, similar day selecting, short-term load forecasting, and...

  9. Applications of quantum information theory to quantum gravity

    International Nuclear Information System (INIS)

    Smolin, L.

    2005-01-01

    Full text: I describe work by and with Fotini Markopoulou and Olaf Dreyer on the application of quantum information theory to quantum gravity. A particular application to black hole physics is described, which treats the black hole horizon as an open system in interaction with an environment consisting of the degrees of freedom in the bulk spacetime. This allows us to elucidate which quantum states of a general horizon contribute to the entropy of a Schwarzschild black hole. This case serves as an example of how methods from quantum information theory may help to elucidate how the classical limit emerges from a background-independent quantum theory of gravity. (author)

  10. New Aspects of Probabilistic Forecast Verification Using Information Theory

    Science.gov (United States)

    Tödter, Julian; Ahrens, Bodo

    2013-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification, particularly concerning ensemble forecasts. Recent findings concerning the "Ignorance Score" are briefly reviewed, and then a consistent generalization to continuous forecasts is motivated. For ensemble-generated forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are prominent verification measures for probabilistic forecasts. Particularly, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up a natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The useful properties of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its components (reliability, resolution, and uncertainty) for ensemble-generated forecasts. This algorithm can also be used to calculate the decomposition of the more traditional CRPS exactly. The applicability of the "new" measures is demonstrated in a small evaluation study of ensemble-based precipitation forecasts.
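    For the binary case, the two scores discussed above can be sketched directly. This toy example (hypothetical forecasts and outcomes, not the study's precipitation data) shows both the Brier Score and the Ignorance Score ranking a sharp, well-calibrated forecast above a flat climatological one:

```python
import numpy as np

def brier_score(p, o):
    """Mean squared difference between forecast probability and 0/1 outcome."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    return float(np.mean((p - o) ** 2))

def ignorance_score(p, o, eps=1e-12):
    """Mean -log2 of the probability assigned to the outcome that occurred."""
    p = np.clip(np.asarray(p, float), eps, 1 - eps)
    o = np.asarray(o, float)
    return float(np.mean(-(o * np.log2(p) + (1 - o) * np.log2(1 - p))))

obs   = np.array([1, 0, 1, 1, 0])            # observed binary events
sharp = np.array([0.9, 0.1, 0.8, 0.9, 0.2])  # confident, calibrated forecast
flat  = np.full(5, 0.5)                      # climatological (no-skill) forecast
```

    A flat 0.5 forecast scores exactly 1 bit of ignorance per event; the IGN is the score whose second-order approximation yields the BS, as noted in the abstract.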

  11. Entanglement dynamics in quantum information theory

    Energy Technology Data Exchange (ETDEWEB)

    Cubitt, T.S.

    2007-03-29

    This thesis contributes to the theory of entanglement dynamics, that is, the behaviour of entanglement in systems that are evolving with time. Progressively more complex multipartite systems are considered, starting with low-dimensional tripartite systems, whose entanglement dynamics can nonetheless display surprising properties, progressing through larger networks of interacting particles, and finishing with infinitely large lattice models. Firstly, what is perhaps the most basic question in entanglement dynamics is considered: what resources are necessary in order to create entanglement between distant particles? The answer is surprising: sending separable states between the parties is sufficient; entanglement can be created without it being carried by a "messenger" particle. The analogous result also holds in the continuous-time case: two particles interacting indirectly via a common ancilla particle can be entangled without the ancilla ever itself becoming entangled. The latter result appears to discount any notion of entanglement flow. However, for pure states, this intuitive idea can be recovered, and even made quantitative. A "bottleneck" inequality is derived that relates the entanglement rate of the end particles in a tripartite chain to the entanglement of the middle one. In particular, no entanglement can be created if the middle particle is not entangled. However, although this result can be applied to general interaction networks, it does not capture the full entanglement dynamics of these more complex systems. This is remedied by the derivation of entanglement rate equations, loosely analogous to the rate equations describing a chemical reaction. A complete set of rate equations for a system reflects the full structure of its interaction network, and can be used to prove a lower bound on the scaling with chain length of the time required to entangle the ends of a chain. Finally, in contrast with these more

  12. Entanglement dynamics in quantum information theory

    International Nuclear Information System (INIS)

    Cubitt, T.S.

    2007-01-01

    This thesis contributes to the theory of entanglement dynamics, that is, the behaviour of entanglement in systems that are evolving with time. Progressively more complex multipartite systems are considered, starting with low-dimensional tripartite systems, whose entanglement dynamics can nonetheless display surprising properties, progressing through larger networks of interacting particles, and finishing with infinitely large lattice models. Firstly, what is perhaps the most basic question in entanglement dynamics is considered: what resources are necessary in order to create entanglement between distant particles? The answer is surprising: sending separable states between the parties is sufficient; entanglement can be created without it being carried by a "messenger" particle. The analogous result also holds in the continuous-time case: two particles interacting indirectly via a common ancilla particle can be entangled without the ancilla ever itself becoming entangled. The latter result appears to discount any notion of entanglement flow. However, for pure states, this intuitive idea can be recovered, and even made quantitative. A "bottleneck" inequality is derived that relates the entanglement rate of the end particles in a tripartite chain to the entanglement of the middle one. In particular, no entanglement can be created if the middle particle is not entangled. However, although this result can be applied to general interaction networks, it does not capture the full entanglement dynamics of these more complex systems. This is remedied by the derivation of entanglement rate equations, loosely analogous to the rate equations describing a chemical reaction. A complete set of rate equations for a system reflects the full structure of its interaction network, and can be used to prove a lower bound on the scaling with chain length of the time required to entangle the ends of a chain. Finally, in contrast with these more abstract results, the entanglement and

  13. Information Based Fault Diagnosis

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Poulsen, Niels Kjølstad

    2008-01-01

    Fault detection and isolation, (FDI) of parametric faults in dynamic systems will be considered in this paper. An active fault diagnosis (AFD) approach is applied. The fault diagnosis will be investigated with respect to different information levels from the external inputs to the systems. These ...

  14. What Density Functional Theory could do for Quantum Information

    Science.gov (United States)

    Mattsson, Ann

    2015-03-01

    The Hohenberg-Kohn theorem of Density Functional Theory (DFT), and extensions thereof, tells us that all properties of a system of electrons can be determined through their density, which uniquely determines the many-body wave-function. Given access to the appropriate universal functionals of the density we would, in theory, be able to determine all observables of any electronic system, without explicit reference to the wave-function. On the other hand, the wave-function is at the core of Quantum Information (QI), with the wave-function of a set of qubits being the central computational resource in a quantum computer. While there is seemingly little overlap between DFT and QI, reliance upon observables forms a key connection. Though the time-evolution of the wave-function and associated phase information is fundamental to quantum computation, the initial and final states of a quantum computer are characterized by observables of the system. While observables can be extracted directly from a system's wave-function, DFT tells us that we may be able to intuit a method for extracting them from its density. In this talk, I will review the fundamentals of DFT and how these principles connect to the world of QI. This will range from DFT's utility in the engineering of physical qubits, to the possibility of using it to efficiently (but approximately) simulate Hamiltonians at the logical level. The apparent paradox of describing algorithms based on the quantum mechanical many-body wave-function with a DFT-like theory based on observables will remain a focus throughout. The ultimate goal of this talk is to initiate a dialog about what DFT could do for QI, in theory and in practice. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  15. The Scope of Usage-based Theory

    OpenAIRE

    Paul eIbbotson

    2013-01-01

    Usage-based approaches typically draw on a relatively small set of cognitive processes, such as categorization, analogy, and chunking to explain language structure and function. The goal of this paper is to first review the extent to which the “cognitive commitment” of usage-based theory has had success in explaining empirical findings across domains, including language acquisition, processing, and typology. We then look at the overall strengths and weaknesses of usage-based theory and highli...

  16. Wavelet-Based Quantum Field Theory

    Directory of Open Access Journals (Sweden)

    Mikhail V. Altaisky

    2007-11-01

    Full Text Available The Euclidean quantum field theory for the fields $\phi_{\Delta x}(x)$, which depend on both the position $x$ and the resolution $\Delta x$, constructed in SIGMA 2 (2006), 046, on the basis of the continuous wavelet transform, is considered. The Feynman diagrams in such a theory become finite under the assumption that there are no scales in internal lines smaller than the minimal scale of the external lines. This regularisation agrees with existing calculations of radiative corrections to the electron magnetic moment. The transition from the newly constructed theory to a standard Euclidean field theory is achieved by integration over the scale arguments.

  17. Analyzing complex networks evolution through Information Theory quantifiers

    International Nuclear Information System (INIS)

    Carpi, Laura C.; Rosso, Osvaldo A.; Saco, Patricia M.; Ravetti, Martin Gomez

    2011-01-01

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed, the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease and a climate network for the Tropical Pacific region to study the El Nino/Southern Oscillation (ENSO) dynamic. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
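    The first quantifier named above can be sketched in a few lines. The toy "degree distributions" below are illustrative stand-ins, not the networks analyzed in the paper:

```python
import numpy as np

def jsd_sqrt(p, q, eps=1e-12):
    """Square root of the Jensen-Shannon divergence (base 2): a metric
    between probability distributions, 0 for identical distributions and
    1 for distributions with disjoint support."""
    p = np.asarray(p, float) / np.sum(p)
    q = np.asarray(q, float) / np.sum(q)
    m = 0.5 * (p + q)  # midpoint distribution
    def kl(a, b):
        return np.sum(a * np.log2((a + eps) / (b + eps)))
    return float(np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m)))

# Toy degree distributions: a near-regular network vs a more heterogeneous one.
p_regular = np.array([0.0, 0.1, 0.8, 0.1, 0.0])
p_random  = np.array([0.1, 0.25, 0.3, 0.25, 0.1])
d = jsd_sqrt(p_regular, p_random)
```

    Because the square root of the JSD is symmetric and satisfies the triangle inequality, distances between successive network states can be compared along an evolution, which is how the quantifier is used in the methodology described above.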

  18. Analyzing complex networks evolution through Information Theory quantifiers

    Energy Technology Data Exchange (ETDEWEB)

    Carpi, Laura C., E-mail: Laura.Carpi@studentmail.newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Rosso, Osvaldo A., E-mail: rosso@fisica.ufmg.b [Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Chaos and Biology Group, Instituto de Calculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Pabellon II, Ciudad Universitaria, 1428 Ciudad de Buenos Aires (Argentina); Saco, Patricia M., E-mail: Patricia.Saco@newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Hidraulica, Facultad de Ciencias Exactas, Ingenieria y Agrimensura, Universidad Nacional de Rosario, Avenida Pellegrini 250, Rosario (Argentina); Ravetti, Martin Gomez, E-mail: martin.ravetti@dep.ufmg.b [Departamento de Engenharia de Producao, Universidade Federal de Minas Gerais, Av. Antonio Carlos, 6627, Belo Horizonte (31270-901), MG (Brazil)

    2011-01-24

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed, the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease and a climate network for the Tropical Pacific region to study the El Nino/Southern Oscillation (ENSO) dynamic. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.

  19. Year 7 Students, Information Literacy, and Transfer: A Grounded Theory

    Science.gov (United States)

    Herring, James E.

    2011-01-01

    This study examined the views of year 7 students, teacher librarians, and teachers in three state secondary schools in rural New South Wales, Australia, on information literacy and transfer. The aims of the study included the development of a grounded theory in relation to information literacy and transfer in these schools. The study's perspective…

  20. Using theories of behaviour change to inform interventions for addictive behaviours.

    Science.gov (United States)

    Webb, Thomas L; Sniehotta, Falko F; Michie, Susan

    2010-11-01

    This paper reviews a set of theories of behaviour change that are used outside the field of addiction and considers their relevance for this field. Ten theories are reviewed in terms of (i) the main tenets of each theory, (ii) the implications of the theory for promoting change in addictive behaviours and (iii) studies in the field of addiction that have used the theory. An augmented feedback loop model based on Control Theory is used to organize the theories and to show how different interventions might achieve behaviour change. Briefly, each theory provided the following recommendations for intervention: Control Theory: prompt behavioural monitoring, Goal-Setting Theory: set specific and challenging goals, Model of Action Phases: form 'implementation intentions', Strength Model of Self-Control: bolster self-control resources, Social Cognition Models (Protection Motivation Theory, Theory of Planned Behaviour, Health Belief Model): modify relevant cognitions, Elaboration Likelihood Model: consider targets' motivation and ability to process information, Prototype Willingness Model: change perceptions of the prototypical person who engages in behaviour and Social Cognitive Theory: modify self-efficacy. There are a range of theories in the field of behaviour change that can be applied usefully to addiction, each one pointing to a different set of modifiable determinants and/or behaviour change techniques. Studies reporting interventions should describe theoretical basis, behaviour change techniques and mode of delivery accurately so that effective interventions can be understood and replicated. © 2010 The Authors. Journal compilation © 2010 Society for the Study of Addiction.

  1. Information processing theory in the early design stages

    DEFF Research Database (Denmark)

    Cash, Philip; Kreye, Melanie

    2014-01-01

    suggestions for improvements and support. One theory that may be particularly applicable to the early design stages is Information Processing Theory (IPT) as it is linked to the design process with regard to the key concepts considered. IPT states that designers search for information if they perceive......, the new knowledge is shared between the design team to reduce ambiguity with regards to its meaning and to build a shared understanding – reducing perceived uncertainty. Thus, we propose that Information-Processing Theory is suitable to describe designer activity in the early design stages...... uncertainty with regard to the knowledge necessary to solve a design challenge. They then process this information and compare if the new knowledge they have gained covers the previous knowledge gap. In engineering design, uncertainty plays a key role, particularly in the early design stages which has been...

  2. Bayesian or Laplacien inference, entropy and information theory and information geometry in data and signal processing

    Science.gov (United States)

    Mohammad-Djafari, Ali

    2015-01-01

    The main object of this tutorial article is to review the main inference tools based on the Bayesian approach, entropy, information theory and their corresponding geometries. The review focuses mainly on the ways these tools have been used in data, signal and image processing. After a short introduction of the relevant quantities (the Bayes rule, entropy and the Maximum Entropy Principle (MEP), relative entropy and the Kullback-Leibler divergence, and Fisher information), we study their use in different fields of data and signal processing, such as entropy in source separation, Fisher information in model order selection, Maximum-Entropy-based methods in time-series spectral estimation and, finally, general linear inverse problems.
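    The entropy and Maximum Entropy Principle quantities reviewed above admit a one-line numerical check. As a generic sketch (not the article's own examples), the simplest instance of the MEP says that with no constraint beyond normalization the uniform distribution maximizes the Shannon entropy:

```python
import numpy as np

def shannon_entropy(p, eps=1e-12):
    """Shannon entropy H(p) = -sum p log2 p, in bits."""
    p = np.asarray(p, float) / np.sum(p)
    return float(-np.sum(p * np.log2(p + eps)))

# Compare the uniform distribution on 8 outcomes against random alternatives.
rng = np.random.default_rng(1)
n = 8
uniform = np.full(n, 1.0 / n)
candidates = [rng.dirichlet(np.ones(n)) for _ in range(100)]
```

    Adding constraints (a fixed mean, a fixed covariance) shifts the maximizer away from the uniform distribution, which is the mechanism the MEP-based spectral-estimation methods in the article exploit.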

  3. Evaluation of EMG processing techniques using Information Theory.

    Science.gov (United States)

    Farfán, Fernando D; Politti, Julio C; Felice, Carmelo J

    2010-11-12

    Electromyographic signals can be used in the biomedical engineering and/or rehabilitation fields as potential sources of control for prosthetics and orthotics. In such applications, digital processing techniques are necessary to follow efficiently and effectively the changes in the physiological characteristics produced by a muscular contraction. In this paper, two methods based on information theory are proposed to evaluate the processing techniques. These methods determine the amount of information that a processing technique is able to extract from EMG signals. The processing techniques evaluated with these methods were: absolute mean value (AMV), RMS value, variance (VAR) and difference absolute mean value (DAMV). EMG signals from the middle deltoid during abduction and adduction movements of the arm in the scapular plane were registered for static and dynamic contractions. The optimal window length (segmentation), abduction and adduction movements and inter-electrode distance were also analyzed. Using the optimal segmentation (200 ms and 300 ms in static and dynamic contractions, respectively) the best processing techniques were: RMS, AMV and VAR in static contractions, and only the RMS in dynamic contractions. Using the RMS of the EMG signal, variations in the amount of information between the abduction and adduction movements were observed. Although the evaluation methods proposed here were applied to standard processing techniques, they can also be considered as alternative tools to evaluate new processing techniques in different areas of electrophysiology.
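    The four amplitude estimators compared in the paper can be sketched as windowed features as follows; the sampling rate, window length and the synthetic signal are illustrative assumptions, not the authors' data.

```python
import math

def amv(x):
    """Absolute mean value."""
    return sum(abs(v) for v in x) / len(x)

def rms(x):
    """Root mean square value."""
    return math.sqrt(sum(v * v for v in x) / len(x))

def var(x):
    """Variance about the window mean."""
    m = sum(x) / len(x)
    return sum((v - m) ** 2 for v in x) / len(x)

def damv(x):
    """Difference absolute mean value: mean absolute first difference."""
    return sum(abs(b - a) for a, b in zip(x, x[1:])) / (len(x) - 1)

def windowed(x, win):
    """Non-overlapping segmentation; e.g. a 200 ms window at 1 kHz is 200 samples."""
    return [x[i:i + win] for i in range(0, len(x) - win + 1, win)]

# Illustrative 1 s 'signal' sampled at 1 kHz; 200 ms windows give 5 segments.
signal = [math.sin(0.05 * n) for n in range(1000)]
features = [(amv(w), rms(w), var(w), damv(w)) for w in windowed(signal, 200)]
```

    Each tuple in `features` is one window's feature vector; sweeping the window length is how the segmentation analysis in the paper would be reproduced.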

  4. Evaluation of EMG processing techniques using Information Theory

    Directory of Open Access Journals (Sweden)

    Felice Carmelo J

    2010-11-01

    Full Text Available Abstract Background Electromyographic signals can be used in the biomedical engineering and/or rehabilitation fields as potential sources of control for prosthetics and orthotics. In such applications, digital processing techniques are necessary to follow efficiently and effectively the changes in the physiological characteristics produced by a muscular contraction. In this paper, two methods based on information theory are proposed to evaluate the processing techniques. Methods These methods determine the amount of information that a processing technique is able to extract from EMG signals. The processing techniques evaluated with these methods were: absolute mean value (AMV), RMS values, variance values (VAR) and difference absolute mean value (DAMV). EMG signals from the middle deltoid during abduction and adduction movements of the arm in the scapular plane were registered for static and dynamic contractions. The optimal window length (segmentation), abduction and adduction movements and inter-electrode distance were also analyzed. Results Using the optimal segmentation (200 ms and 300 ms in static and dynamic contractions, respectively) the best processing techniques were: RMS, AMV and VAR in static contractions, and only the RMS in dynamic contractions. Using the RMS of the EMG signal, variations in the amount of information between the abduction and adduction movements were observed. Conclusions Although the evaluation methods proposed here were applied to standard processing techniques, they can also be considered as alternative tools to evaluate new processing techniques in different areas of electrophysiology.

  5. Information Theory - The Bridge Connecting Bounded Rational Game Theory and Statistical Physics

    Science.gov (United States)

    Wolpert, David H.

    2005-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality of all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. This paper shows that the same information-theoretic mathematical structure, known as Product Distribution (PD) theory, addresses both issues. In this, PD theory not only provides a principled formulation of bounded rationality and a set of new types of mean field theory in statistical physics; it also shows that those topics are fundamentally one and the same.

  6. Design of Patient Satisfaction Evaluation Information System Based on Grey Fuzzy Theory

    Institute of Scientific and Technical Information of China (English)

    刘莎; 曹锦丹

    2011-01-01

    Patient satisfaction is an important tool for evaluating medical service quality and improving hospital work. A patient satisfaction evaluation information system is therefore an extremely important aspect of hospital informatization. However, satisfaction itself is a complex concept involving much uncertainty. To make the patient satisfaction evaluation results more objective and realistic, this paper designed a patient satisfaction evaluation information system based on grey fuzzy theory and introduced the structure, functions and mathematical theory adopted by the system. The system constructs a comprehensive evaluation model combining the grey correlation method with fuzzy theory, assesses satisfaction using a multi-level fuzzy comprehensive evaluation method, and adopts an equal-time-interval GM(1,1) model for grey system forecasting.
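    The grey forecasting component mentioned above can be sketched with the standard textbook construction of the GM(1,1) model; the observation series below is invented for illustration and is not data from the paper.

```python
import math

def gm11_forecast(x0, steps=1):
    """Equal-interval GM(1,1) grey forecast (textbook sketch).
    x0: positive observation sequence; returns fitted values plus `steps` forecasts."""
    n = len(x0)
    # 1-AGO: accumulated generating operation
    x1 = [sum(x0[:k + 1]) for k in range(n)]
    # Background values: consecutive means of the accumulated sequence
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]
    y = x0[1:]
    # Least squares for a, b in the grey equation x0(k) + a*z(k) = b
    m = len(z)
    sz, sy = sum(z), sum(y)
    szz = sum(v * v for v in z)
    szy = sum(zi * yi for zi, yi in zip(z, y))
    det = m * szz - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det
    # Time-response function, then inverse AGO to recover the series
    def x1_hat(k):  # k is a 1-based index
        return (x0[0] - b / a) * math.exp(-a * (k - 1)) + b / a
    return [x0[0]] + [x1_hat(k) - x1_hat(k - 1) for k in range(2, n + steps + 1)]
```

    On a roughly exponential series such as [10, 12, 14.4, 17.28] (20% growth), the one-step forecast comes out close to the true continuation of about 20.7, which is the behaviour GM(1,1) is designed for.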

  7. Accounting bases of theory: Why they matter

    Directory of Open Access Journals (Sweden)

    Zafeer Nagdee

    2016-11-01

    Full Text Available It is widely agreed that contemporary accounting practice is largely based on the application of professional accounting standards rather than on the application of sound, academic bases of theory. This has led to uncertainty within the field, which has in turn inhibited the ability of accounting to develop into a more robust academic discipline. In conducting a thematic analysis of existing literature, this study will identify and expand on three key themes which collectively establish the argument that the lack of a sound basis of accounting theory has impaired the scholastic development of accounting practice worldwide. By introducing this argument to the academic community, this study will expose the economic risks associated with accounting's absent bases of theory and will consequently add value by highlighting the need for additional research into the development, clarification and refinement of accounting theories that will result in more useful accounting practices worldwide.

  8. The use of information theory in evolutionary biology.

    Science.gov (United States)

    Adami, Christoph

    2012-05-01

    Information is a key concept in evolutionary biology. Information stored in a biological organism's genome is used to generate the organism and to maintain and control it. Information is also that which evolves. When a population adapts to a local environment, information about this environment is fixed in a representative genome. However, when an environment changes, information can be lost. At the same time, information is processed by animal brains to survive in complex environments, and the capacity for information processing also evolves. Here, I review applications of information theory to the evolution of proteins and to the evolution of information processing in simulated agents that adapt to perform a complex task. © 2012 New York Academy of Sciences.

  9. Spacecraft TT&C and information transmission theory and technologies

    CERN Document Server

    Liu, Jiaxing

    2015-01-01

    Spacecraft TT&C and Information Transmission Theory and Technologies introduces the basic theory of spacecraft TT&C (telemetry, tracking and command) and information transmission. Combining TT&C and information transmission, the book presents several technologies for continuous wave radar, including measurements of range, range rate and angle, analog and digital information transmission, telecommand, telemetry, remote sensing and spread-spectrum TT&C. For special problems occurring in the channels for TT&C and information transmission, the book presents radio propagation features and their impact on orbit measurement accuracy, the effects caused by rain attenuation, atmospheric attenuation and the multi-path effect, and polarization composition technology. This book can benefit researchers and engineers in the field of spacecraft TT&C and communication systems. Liu Jiaxing is a professor at The 10th Institute of China Electronics Technology Group Corporation.

  10. Probability and information theory, with applications to radar

    CERN Document Server

    Woodward, P M; Higinbotham, W

    1964-01-01

    Electronics and Instrumentation, Second Edition, Volume 3: Probability and Information Theory with Applications to Radar provides information pertinent to the development of research carried out in electronics and applied physics. This book presents the established mathematical techniques that provide the code in which so much of the mathematical theory of electronics and radar is expressed. Organized into eight chapters, this edition begins with an overview of the geometry of probability distributions in which moments play a significant role. This text then examines the mathematical methods in

  11. Generalised perturbation theory and source of information through chemical measurements

    International Nuclear Information System (INIS)

    Lelek, V.; Marek, T.

    2001-01-01

    It is important to make all analyses and collect all information from the work of the new facility (which the transmutation demonstration unit will surely be) to be sure that the operation corresponds to the forecast, or to correct the equations of the facility. The behaviour of the molten salt reactor, and in particular its system of measurement, is very different from that of the solid-fuel reactor. Key information from the long-term kinetics could be the nearly on-line knowledge of the fuel composition. In this work it is shown how to include it in the control and how to use such data for the correction of neutron cross-sections for the higher actinides or other characteristics. The problem of safety, the change from a boundary-value problem to an initial-value problem, is also mentioned. The problem is transformed into generalised perturbation theory, in which the adjoint function is obtained through the solution of equations whose right-hand side has the form of a source. Such an approach should be a theoretical basis for the calculation of the sensitivity coefficients. (authors)

  12. Computer-based theory of strategies

    Energy Technology Data Exchange (ETDEWEB)

    Findler, N V

    1983-01-01

    Some of the objectives and working tools of a new area of study, tentatively called theory of strategies, are described. It is based on the methodology of artificial intelligence, decision theory, operations research and digital gaming. The latter refers to computing activity that incorporates model building, simulation and learning programs in conflict situations. Three long-term projects which aim at automatically analyzing and synthesizing strategies are discussed. 27 references.

  13. Understanding women's mammography intentions: a theory-based investigation.

    Science.gov (United States)

    Naito, Mikako; O'Callaghan, Frances V; Morrissey, Shirley

    2009-01-01

    The present study compared the utility of two models (the Theory of Planned Behavior and Protection Motivation Theory) in identifying factors associated with intentions to undertake screening mammography, before and after an intervention. The comparison was made between the unique components of the two models. The effect of including implementation intentions was also investigated. Two hundred and fifty-one women aged 37 to 69 years completed questionnaires at baseline and following the delivery of a standard (control) or a protection motivation theory-based informational intervention. Hierarchical multiple regressions indicated that theory of planned behavior variables were associated with mammography intentions. Results also showed that inclusion of implementation intention in the model significantly increased the association with mammography intentions. The findings suggest that future interventions aiming to increase screening mammography participation should focus on the theory of planned behavior variables and that implementation intention should also be targeted.

  14. Theory of information warfare: basic framework, methodology and conceptual apparatus

    Directory of Open Access Journals (Sweden)

    Олександр Васильович Курбан

    2015-11-01

    Full Text Available A comprehensive theoretical study was conducted to determine the basic provisions of the modern theory of information warfare in on-line social networks. Three basic blocks that systematize the theoretical and methodological basis of the topic are established: information and psychological warfare, off-line social networks, and on-line social networks. On the basis of these three blocks, theoretical concepts are defined and a methodological substantiation of information processes within information warfare in on-line social networks is formed.

  15. Information in relational data bases

    Energy Technology Data Exchange (ETDEWEB)

    Abhyankar, R B

    1982-01-01

    A new knowledge representation scheme is proposed for representing incomplete information in relational data bases. The knowledge representation scheme introduces a novel convention for negative information based on modal logic and a novel data structure obtained by introducing tuple flags in the relational model of data. Standard and minimal forms are defined for relations conforming to the new data structure. The conventional relational operators select, project and join are redefined so they can be used to manipulate relations containing incomplete information. Conditions are presented for the lossless decomposition of relations containing incomplete information. 20 references.

  16. Observational information for f(T) theories and dark torsion

    Energy Technology Data Exchange (ETDEWEB)

    Bengochea, Gabriel R., E-mail: gabriel@iafe.uba.a [Instituto de Astronomia y Fisica del Espacio (IAFE), CC 67, Suc. 28, 1428 Buenos Aires (Argentina)

    2011-01-17

    In the present work we analyze and compare the information coming from different observational data sets in the context of a class of f(T) theories. We perform a joint analysis with measurements of the most recent type Ia supernovae (SNe Ia), Baryon Acoustic Oscillations (BAO), Cosmic Microwave Background radiation (CMB), Gamma-Ray Burst data (GRBs) and Hubble parameter observations (OHD) to constrain the only new parameter these theories have. It is shown that when the new combined BAO/CMB parameter is used to put constraints, the result differs from previous works. We also show that when we include the Observational Hubble Data (OHD), the simpler ΛCDM model is excluded at the one-sigma level, leading the effective equation of state of these theories to be of phantom type. Also, analyzing a tension criterion for SNe Ia and other observational sets, we obtain more consistent and better-suited data sets to work with these theories.

  17. Cognitive Load Theory and the Effects of Transient Information on the Modality Effect

    Science.gov (United States)

    Leahy, Wayne; Sweller, John

    2016-01-01

    Based on cognitive load theory and the "transient information effect," this paper investigated the "modality effect" while interpreting a contour map. The length and complexity of auditory and visual text instructions were manipulated. Experiment 1 indicated that longer audio text information within a presentation was inferior…

  18. Comparison of Predictive Contract Mechanisms from an Information Theory Perspective

    OpenAIRE

    Zhang, Xin; Ward, Tomas; McLoone, Seamus

    2012-01-01

    Inconsistency arises across a Distributed Virtual Environment due to network latency induced by state changes communications. Predictive Contract Mechanisms (PCMs) combat this problem through reducing the amount of messages transmitted in return for perceptually tolerable inconsistency. To date there are no methods to quantify the efficiency of PCMs in communicating this reduced state information. This article presents an approach derived from concepts in information theory for a dee...

  19. Information richness in construction projects: A critical social theory

    NARCIS (Netherlands)

    Adriaanse, Adriaan Maria; Voordijk, Johannes T.; Greenwood, David

    2002-01-01

    Two important factors influencing the communication in construction projects are the interests of the people involved and the language spoken by the people involved. The objective of the paper is to analyse these factors by using recent insights in the information richness theory. The critical

  20. Information Architecture without Internal Theory: An Inductive Design Process.

    Science.gov (United States)

    Haverty, Marsha

    2002-01-01

    Suggests that information architecture design is primarily an inductive process, partly because it lacks internal theory and partly because it is an activity that supports emergent phenomena (user experiences) from basic design components. Suggests a resemblance to Constructive Induction, a design process that locates the best representational…

  1. The Philosophy of Information as an Underlying and Unifying Theory of Information Science

    Science.gov (United States)

    Tomic, Taeda

    2010-01-01

    Introduction: Philosophical analyses of theoretical principles underlying these sub-domains reveal philosophy of information as underlying meta-theory of information science. Method: Conceptual research on the knowledge sub-domains in information science and philosophy and analysis of their mutual connection. Analysis: Similarities between…

  2. Route Choice Model Based on Game Theory for Commuters

    Directory of Open Access Journals (Sweden)

    Licai Yang

    2016-06-01

    Full Text Available The traffic behaviours of commuters may cause traffic congestion during peak hours. An Advanced Traffic Information System can provide dynamic information to travellers. Due to a lack of timeliness and comprehensiveness, the provided information cannot satisfy travellers' needs. Since the assumptions of the traditional route choice model based on Expected Utility Theory conflict with the actual situation, a route choice model based on Game Theory is proposed in this paper to provide reliable route choices to commuters in actual situations. The proposed model treats the alternative routes as game players and utilizes the precision of predicted information and familiarity with traffic conditions to build a game. The optimal route can be generated, considering the Nash Equilibrium, by solving the route choice game. Simulations and experimental analysis show that the proposed model can describe the commuters' routine route choice decision exactly and that the provided route is reliable.
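    As a generic sketch of the game-theoretic idea, not the paper's specific route choice game, the snippet below enumerates the pure-strategy Nash equilibria of a small two-player game by checking mutual best responses; the payoff matrices are invented.

```python
def pure_nash(payoff_a, payoff_b):
    """Return all pure-strategy Nash equilibria (i, j) of a bimatrix game.
    payoff_a[i][j] / payoff_b[i][j]: row and column players' payoffs."""
    rows, cols = len(payoff_a), len(payoff_a[0])
    equilibria = []
    for i in range(rows):
        for j in range(cols):
            # (i, j) is an equilibrium iff neither player can gain by deviating alone
            row_best = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(rows))
            col_best = all(payoff_b[i][j] >= payoff_b[i][l] for l in range(cols))
            if row_best and col_best:
                equilibria.append((i, j))
    return equilibria

# Two commuters, two routes; payoffs drop when both pick the same route.
A = [[1, 3],
     [2, 1]]
B = [[1, 2],
     [3, 1]]
print(pure_nash(A, B))  # [(0, 1), (1, 0)]
```

    The two equilibria correspond to the commuters splitting across the routes, which is the kind of stable outcome a route choice game is solved for.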

  3. Information theory and stochastics for multiscale nonlinear systems

    CERN Document Server

    Majda, Andrew J; Grote, Marcus J

    2005-01-01

    This book introduces mathematicians to the fascinating emerging mathematical interplay between ideas from stochastics and information theory and important practical issues in studying complex multiscale nonlinear systems. It emphasizes the serendipity between modern applied mathematics and applications where rigorous analysis, the development of qualitative and/or asymptotic models, and numerical modeling all interact to explain complex phenomena. After a brief introduction to the emerging issues in multiscale modeling, the book has three main chapters. The first chapter is an introduction to information theory with novel applications to statistical mechanics, predictability, and Jupiter's Red Spot for geophysical flows. The second chapter discusses new mathematical issues regarding fluctuation-dissipation theorems for complex nonlinear systems including information flow, various approximations, and illustrates applications to various mathematical models. The third chapter discusses stochastic modeling of com...

  4. Entropy and information causality in general probabilistic theories

    International Nuclear Information System (INIS)

    Barnum, Howard; Leifer, Matthew; Spekkens, Robert; Barrett, Jonathan; Clark, Lisa Orloff; Stepanik, Nicholas; Wilce, Alex; Wilke, Robin

    2010-01-01

    We investigate the concept of entropy in probabilistic theories more general than quantum mechanics, with particular reference to the notion of information causality (IC) recently proposed by Pawlowski et al (2009 arXiv:0905.2292). We consider two entropic quantities, which we term measurement and mixing entropy. In the context of classical and quantum theory, these coincide, being given by the Shannon and von Neumann entropies, respectively; in general, however, they are very different. In particular, while measurement entropy is easily seen to be concave, mixing entropy need not be. In fact, as we show, mixing entropy is not concave whenever the state space is a non-simplicial polytope. Thus, the condition that measurement and mixing entropies coincide is a strong constraint on possible theories. We call theories with this property monoentropic. Measurement entropy is subadditive, but not in general strongly subadditive. Equivalently, if we define the mutual information between two systems A and B by the usual formula I(A: B)=H(A)+H(B)-H(AB), where H denotes the measurement entropy and AB is a non-signaling composite of A and B, then it can happen that I(A:BC)< I(A:B). This is relevant to IC in the sense of Pawlowski et al: we show that any monoentropic non-signaling theory in which measurement entropy is strongly subadditive, and also satisfies a version of the Holevo bound, is informationally causal, and on the other hand we observe that Popescu-Rohrlich boxes, which violate IC, also violate strong subadditivity. We also explore the interplay between measurement and mixing entropy and various natural conditions on theories that arise in quantum axiomatics.
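    In the classical special case the abstract mentions, where measurement and mixing entropy both reduce to the Shannon entropy, the mutual information formula I(A:B) = H(A) + H(B) - H(AB) can be evaluated directly from a joint distribution; the two joint distributions below are standard textbook examples, not taken from the paper.

```python
import math

def H(dist):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def mutual_information(joint):
    """I(A:B) = H(A) + H(B) - H(AB) for a joint distribution matrix."""
    pa = [sum(row) for row in joint]            # marginal of A (rows)
    pb = [sum(col) for col in zip(*joint)]      # marginal of B (columns)
    pab = [p for row in joint for p in row]     # flattened joint
    return H(pa) + H(pb) - H(pab)

# Perfectly correlated bits: one bit of mutual information.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))   # 1.0
# Independent uniform bits: zero mutual information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

    The paper's point is that beyond the classical and quantum cases this formula can behave badly, e.g. I(A:BC) < I(A:B), precisely because measurement entropy need not be strongly subadditive.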

  5. Actor Network Theory Approach and its Application in Investigating Agricultural Climate Information System

    Directory of Open Access Journals (Sweden)

    Maryam Sharifzadeh

    2013-03-01

    Full Text Available Actor network theory, as a qualitative approach to studying complex social factors and the process of socio-technical interaction, provides new concepts and ideas to understand the socio-technical nature of information systems. From the actor network theory viewpoint, an agricultural climate information system is a network consisting of actors, actions, information-related processes (production, transformation, storage, retrieval, integration, diffusion and utilization, control and management) and system mechanisms (interfaces and networks). Analysis of such systems embodies the identification of the basic components and structure of the system (nodes: the different sources of information production, extension, and users) and an understanding of how successfully the system works (interactions and links), in order to promote climate knowledge content and improve system performance to reach agricultural development. The present research attempted to introduce actor network theory as a research framework based on a network view of the agricultural climate information system.

  6. Testing components of Rothbard’s theory with the current information system

    Directory of Open Access Journals (Sweden)

    Aurelian Virgil BĂLUŢĂ

    2016-03-01

    Full Text Available The concept of aggression against the property rights of individuals generates a series of developments that allow solutions and options for problems and dilemmas of today's economy: the dynamics of the tax system, focusing attention on shaping the budget with macro-economic calculations, the protection of competition, and customs policy in the modern era. Confidence in theory in general, and in economic theory especially, is based on the logical and methodological validation of scientific reasoning and on moral aspects. Transforming theory into a means of changing society can only be done when a theory is experimentally validated. Economic theory needs confirmation from specialized disciplines such as statistics and accounting. It is possible and necessary for the advantages of radical liberal thinking to be reflected in every company's bookkeeping and in public statistics. As an example, the paper presents the way some components of Rothbard's theory are reflected in the accounting and statistics information system.

  7. Using institutional theory with sensemaking theory: a case study of information system implementation in healthcare

    DEFF Research Database (Denmark)

    Jensen, Tina Blegind; Kjærgaard, Annemette; Svejvig, Per

    2009-01-01

    Institutional theory has proven to be a central analytical perspective for investigating the role of social and historical structures of information systems (IS) implementation. However, it does not explicitly account for how organisational actors make sense of and enact technologies in their local context. We address this limitation by exploring the potential of using institutional theory with sensemaking theory to study IS implementation in organisations. We argue that each theoretical perspective has its own explanatory power and that a combination of the two facilitates a much richer interpretation of IS implementation by linking macro- and micro-levels of analysis. To illustrate this, we report from an empirical study of the implementation of an Electronic Patient Record (EPR) system in a clinical setting. Using key constructs from the two theories, our findings address the phenomenon...

  8. A Performance-Based Instructional Theory

    Science.gov (United States)

    Lawson, Tom E.

    1974-01-01

    The rationale for a performance-based instructional theory has arisen from significant advances during the past several years in instructional psychology. Four major areas of concern are: analysis of subject-matter content in terms of performance competencies, diagnosis of pre-instructional behavior, formulation of an instructional…

  9. Jigsaw Cooperative Learning: Acid-Base Theories

    Science.gov (United States)

    Tarhan, Leman; Sesen, Burcin Acar

    2012-01-01

    This study focused on investigating the effectiveness of jigsaw cooperative learning instruction on first-year undergraduates' understanding of acid-base theories. Undergraduates' opinions about jigsaw cooperative learning instruction were also investigated. The participants of this study were 38 first-year undergraduates in chemistry education…

  10. Intuitive theories of information: beliefs about the value of redundancy.

    Science.gov (United States)

    Soll, J B

    1999-03-01

    In many situations, quantity estimates from multiple experts or diagnostic instruments must be collected and combined. Normatively, and all else equal, one should value information sources that are nonredundant, in the sense that correlation in forecast errors should be minimized. Past research on the preference for redundancy has been inconclusive. While some studies have suggested that people correctly place higher value on uncorrelated inputs when collecting estimates, others have shown that people either ignore correlation or, in some cases, even prefer it. The present experiments show that the preference for redundancy depends on one's intuitive theory of information. The most common intuitive theory identified is the Error Tradeoff Model (ETM), which explicitly distinguishes between measurement error and bias. According to ETM, measurement error can only be averaged out by consulting the same source multiple times (normatively false), and bias can only be averaged out by consulting different sources (normatively true). As a result, ETM leads people to prefer redundant estimates when the ratio of measurement error to bias is relatively high. Other participants favored different theories. Some adopted the normative model, while others were reluctant to mathematically average estimates from different sources in any circumstance. In a post hoc analysis, science majors were more likely than others to subscribe to the normative model. While tentative, this result lends insight into how intuitive theories might develop and also has potential ramifications for how statistical concepts such as correlation might best be learned and internalized. Copyright 1999 Academic Press.
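    The normative point in the abstract, that redundancy (correlated forecast errors) makes averaging less effective, can be illustrated with a small Monte Carlo sketch; the error model, standard deviation and correlation values below are invented for illustration.

```python
import math
import random

def avg_error(rho, trials=20000, truth=100.0, sd=10.0, seed=1):
    """Mean squared error of the average of two unbiased estimates
    whose errors have correlation rho and equal standard deviation sd."""
    rng = random.Random(seed)
    mse = 0.0
    for _ in range(trials):
        e1 = rng.gauss(0, sd)
        # Construct a second error with correlation rho to the first
        e2 = rho * e1 + math.sqrt(1 - rho ** 2) * rng.gauss(0, sd)
        estimate = truth + (e1 + e2) / 2
        mse += (estimate - truth) ** 2
    return mse / trials

# Theory: Var of the average = sd^2 * (1 + rho) / 2, so redundancy hurts.
print(avg_error(0.0))  # around 50: independent sources
print(avg_error(0.9))  # around 95: highly redundant sources
```

    This is exactly the normative case in the abstract: with sd fixed, the benefit of averaging shrinks as the correlation between sources grows.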

  11. Should the model for risk-informed regulation be game theory rather than decision theory?

    Science.gov (United States)

    Bier, Vicki M; Lin, Shi-Woei

    2013-02-01

    Risk analysts frequently view the regulation of risks as being largely a matter of decision theory. According to this view, risk analysis methods provide information on the likelihood and severity of various possible outcomes; this information should then be assessed using a decision-theoretic approach (such as cost/benefit analysis) to determine whether the risks are acceptable, and whether additional regulation is warranted. However, this view ignores the fact that in many industries (particularly industries that are technologically sophisticated and employ specialized risk and safety experts), risk analyses may be done by regulated firms, not by the regulator. Moreover, those firms may have more knowledge about the levels of safety at their own facilities than the regulator does. This creates a situation in which the regulated firm has both the opportunity, and often also the motive, to provide inaccurate (in particular, favorably biased) risk information to the regulator, and hence the regulator has reason to doubt the accuracy of the risk information provided by regulated parties. Researchers have argued that decision theory is capable of dealing with many such strategic interactions as well as game theory can. This is especially true in two-player, two-stage games in which the follower has a unique best strategy in response to the leader's strategy, as appears to be the case in the situation analyzed in this article. However, even in such cases, we agree with Cox that game-theoretic methods and concepts can still be useful. In particular, the tools of mechanism design, and especially the revelation principle, can simplify the analysis of such games because the revelation principle provides rigorous assurance that it is sufficient to analyze only games in which licensees truthfully report their risk levels, making the problem more manageable. Without that, it would generally be necessary to consider much more complicated forms of strategic behavior (including

  12. Application of a model of social information processing to nursing theory: how nurses respond to patients.

    Science.gov (United States)

    Sheldon, Lisa Kennedy; Ellington, Lee

    2008-11-01

    This paper is a report of a study to assess the applicability of a theoretical model of social information processing in expanding a nursing theory addressing how nurses respond to patients. Nursing communication affects patient outcomes such as anxiety, adherence to treatments and satisfaction with care. Orlando's theory of nursing process describes nurses' reactions to patients' behaviour as generating a perception, thought and feeling in the nurse and then action by the nurse. A model of social information processing describes the sequential steps in the cognitive processes used to respond to social cues and may be useful in describing the nursing process. Cognitive interviews were conducted in 2006 with a convenience sample of 5 nurses in the United States of America. The data were interpreted using the Crick and Dodge model of social information processing. Themes arising from cognitive interviews validated concepts of the nursing theory and the constructs of the model of social information processing. The interviews revealed that the support of peers was an additional construct involved in the development of communication skills, creation of a database and enhancement of self-efficacy. Models of social information processing enhance understanding of the process of how nurses respond to patients and help to develop nursing theories further. In combination, the theories are useful in developing research into nurse-patient communication. Future research based on the expansion of nursing theory may identify effective and culturally appropriate nurse response patterns to specific patient interactions with implications for nursing care and patient outcomes.

  13. Russian and Chinese Information Warfare: Theory and Practice

    Science.gov (United States)

    2004-06-01

    Excerpts (search-result snippets only): Integral neurolinguistic programming • Placing essential programs into the conscious or subconscious mind • Subconscious suggestions that modify human… • Generators of special rays • Optical systems • Neurolinguistic programming • Computer psychotechnology • The mass media • Audiovisual effects • Special effects…

  14. Information theory, animal communication, and the search for extraterrestrial intelligence

    Science.gov (United States)

    Doyle, Laurance R.; McCowan, Brenda; Johnston, Simon; Hanser, Sean F.

    2011-02-01

    We present ongoing research in the application of information theory to animal communication systems with the goal of developing additional detectors and estimators for possible extraterrestrial intelligent signals. Regardless of the species, for intelligence (i.e., complex knowledge) to be transmitted certain rules of information theory must still be obeyed. We demonstrate some preliminary results of applying information theory to socially complex marine mammal species (bottlenose dolphins and humpback whales) as well as arboreal squirrel monkeys, because they almost exclusively rely on vocal signals for their communications, producing signals which can be readily characterized by signal analysis. Metrics such as Zipf's Law and higher-order information-entropic structure are emerging as indicators of the communicative complexity characteristic of an "intelligent message" content within these animals' signals, perhaps not surprising given these species' social complexity. In addition to human languages, for comparison we also apply these metrics to pulsar signals—perhaps (arguably) the most "organized" of stellar systems—as an example of astrophysical systems that would have to be distinguished from an extraterrestrial intelligence message by such information theoretic filters. We also look at a message transmitted from Earth (Arecibo Observatory) that contains a lot of meaning but little information in the mathematical sense we define it here. We conclude that the study of non-human communication systems on our own planet can make a valuable contribution to the detection of extraterrestrial intelligence by providing quantitative general measures of communicative complexity. Studying the complex communication systems of other intelligent species on our own planet may also be one of the best ways to deprovincialize our thinking about extraterrestrial communication systems in general.
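    The entropic metrics the abstract mentions can be illustrated compactly. The following is only a minimal sketch under stated assumptions (the function names and toy sequence are invented, and the authors' actual signal-characterization pipeline is far more involved): it estimates the rank-frequency (Zipf) slope and the first-order Shannon entropy of an already-discretized symbol sequence.

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """First-order Shannon entropy in bits per symbol."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def zipf_slope(symbols):
    """Least-squares slope of log(frequency) vs log(rank); a slope near -1
    is the Zipf-like signature discussed above.  Needs >= 2 distinct symbols."""
    freqs = sorted(Counter(symbols).values(), reverse=True)
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
```

A sequence whose symbol frequencies fall off steeply yields a negative slope; higher-order entropic structure (conditional entropies over n-grams) extends the same idea beyond single symbols.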

  15. On long-only information-based portfolio diversification framework

    Science.gov (United States)

    Santos, Raphael A.; Takada, Hellinton H.

    2014-12-01

    Using the concepts from information theory, it is possible to improve the traditional frameworks for long-only asset allocation. In modern portfolio theory, the investor has two basic procedures: the choice of a portfolio that maximizes its risk-adjusted excess return or the mixed allocation between the maximum Sharpe portfolio and the risk-free asset. In the literature, the first procedure was already addressed using information theory. One contribution of this paper is the consideration of the second procedure in the information theory context. The performance of these approaches was compared with three traditional asset allocation methodologies: the Markowitz's mean-variance, the resampled mean-variance and the equally weighted portfolio. Using simulated and real data, the information theory-based methodologies proved more robust in dealing with estimation errors.
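    The second procedure builds on the classical maximum Sharpe (tangency) portfolio. As a minimal sketch of that classical ingredient only (not the paper's information-theoretic method; the inputs below are invented), the tangency weights for two risky assets are proportional to the inverse covariance matrix applied to the excess-return vector:

```python
def tangency_weights(mu, cov, rf):
    """Maximum-Sharpe (tangency) portfolio for two risky assets:
    w proportional to inverse(cov) @ (mu - rf), normalized to sum to one.
    The 2x2 inverse is written out to keep the sketch dependency-free."""
    a, b = mu[0] - rf, mu[1] - rf
    (s11, s12), (s21, s22) = cov
    det = s11 * s22 - s12 * s21
    w0 = (s22 * a - s12 * b) / det
    w1 = (-s21 * a + s11 * b) / det
    total = w0 + w1
    return (w0 / total, w1 / total)
```

The mixed allocation then splits capital between this portfolio and the risk-free asset according to the investor's risk tolerance; the paper's contribution is choosing that split with information-theoretic tools.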

  16. Isotope-based quantum information

    CERN Document Server

    G Plekhanov, Vladimir

    2012-01-01

    The present book provides an introduction to the main ideas and techniques of the rapidly progressing field of quantum information and quantum computation using isotope-mixed materials. It starts with an introduction to isotope physics and then describes isotope-based quantum information and quantum computation. The ability to manipulate and control electron and/or nucleus spin in semiconductor devices provides a new route to expand the capabilities of inorganic semiconductor-based electronics and to design innovative devices with potential application in quantum computing. One of the major challenges towards these objectives is to develop semiconductor-based systems and architectures in which the spatial distribution of spins and their properties can be controlled. For instance, to eliminate electron spin decoherence resulting from hyperfine interaction due to nuclear spin background, isotopically controlled devices are needed (i.e., nuclear spin-depleted). In other emerging concepts, the control of the spatial...

  17. Isotope-based quantum information

    International Nuclear Information System (INIS)

    Plekhanov, Vladimir G.

    2012-01-01

    The present book provides an introduction to the main ideas and techniques of the rapidly progressing field of quantum information and quantum computation using isotope-mixed materials. It starts with an introduction to isotope physics and then describes isotope-based quantum information and quantum computation. The ability to manipulate and control electron and/or nucleus spin in semiconductor devices provides a new route to expand the capabilities of inorganic semiconductor-based electronics and to design innovative devices with potential application in quantum computing. One of the major challenges towards these objectives is to develop semiconductor-based systems and architectures in which the spatial distribution of spins and their properties can be controlled. For instance, to eliminate electron spin decoherence resulting from hyperfine interaction due to nuclear spin background, isotopically controlled devices are needed (i.e., nuclear spin-depleted). In other emerging concepts, the control of the spatial distribution of isotopes with nuclear spins is a prerequisite to implement the quantum bits (or qubits). Therefore, stable semiconductor isotopes are important elements in the development of solid-state quantum information. Not only are different algorithms of quantum computation discussed, but different models of quantum computers are also presented. With numerous illustrations this small book is of great interest for undergraduate students taking courses in mesoscopic physics or nanoelectronics as well as quantum information, and for academic and industrial researchers working in this field.

  18. Integrated information theory of consciousness: an updated account.

    Science.gov (United States)

    Tononi, G

    2012-12-01

    This article presents an updated account of integrated information theory of consciousness (IIT) and some of its implications. IIT stems from thought experiments that lead to phenomenological axioms (existence, compositionality, information, integration, exclusion) and corresponding ontological postulates. The information axiom asserts that every experience is specific - it is what it is by differing in its particular way from a large repertoire of alternatives. The integration axiom asserts that each experience is unified - it cannot be reduced to independent components. The exclusion axiom asserts that every experience is definite - it is limited to particular things and not others and flows at a particular speed and resolution. IIT formalizes these intuitions with postulates. The information postulate states that only "differences that make a difference" from the intrinsic perspective of a system matter: a mechanism generates cause-effect information if its present state has selective past causes and selective future effects within a system. The integration postulate states that only information that is irreducible matters: mechanisms generate integrated information only to the extent that the information they generate cannot be partitioned into that generated within independent components. The exclusion postulate states that only maxima of integrated information matter: a mechanism specifies only one maximally irreducible set of past causes and future effects - a concept. A complex is a set of elements specifying a maximally irreducible constellation of concepts, where the maximum is evaluated over elements and at the optimal spatiotemporal scale. Its concepts specify a maximally integrated conceptual information structure or quale, which is identical with an experience. Finally, changes in information integration upon exposure to the environment reflect a system's ability to match the causal structure of the world. After introducing an updated definition of

  19. Critical Theory as a foundation for Pragmatic Information Systems Design

    OpenAIRE

    Gerald Benoît

    2001-01-01

    This paper considers how the perspectives, communication models and linguistic behaviors of information-system designers and end-users differ. A critique of these differences is made by applying Habermas's communicative action principles. An empirical study of human-human information seeking, based on those principles, indicates which behaviors are predictors of successful interactions and are therefore candidate behaviors to be integrated into computerized information systems.

  20. Informed consent in neurosurgery--translating ethical theory into action.

    Science.gov (United States)

    Schmitz, Dagmar; Reinacher, Peter C

    2006-09-01

    Although a main principle of medical ethics and law since the 1970s, standards of informed consent are regarded with great scepticism by many clinicians. By reviewing the reactions to and adoption of this principle of medical ethics in neurosurgery, the characteristic conflicts that emerge between theory and everyday clinical experience are emphasised and a modified conception of informed consent is proposed. The adoption and debate of informed consent in neurosurgery took place in two steps. Firstly, respect for patient autonomy was included into the ethical codes of the professional organisations. Secondly, the legal demands of the principle were questioned by clinicians. Informed consent is mainly interpreted in terms of freedom from interference and absolute autonomy. It lacks a constructive notion of physician-patient interaction in its effort to promote the best interest of the patient, which, however, potentially emerges from a reconsideration of the principle of beneficence. To avoid insufficient legal interpretations, informed consent should be understood in terms of autonomy and beneficence. A continuous interaction between the patient and the given physician is considered as an essential prerequisite for the realisation of the standards of informed consent.

  1. Informed consent in neurosurgery—translating ethical theory into action

    Science.gov (United States)

    Schmitz, Dagmar; Reinacher, Peter C

    2006-01-01

    Objective Although a main principle of medical ethics and law since the 1970s, standards of informed consent are regarded with great scepticism by many clinicians. Methods By reviewing the reactions to and adoption of this principle of medical ethics in neurosurgery, the characteristic conflicts that emerge between theory and everyday clinical experience are emphasised and a modified conception of informed consent is proposed. Results The adoption and debate of informed consent in neurosurgery took place in two steps. Firstly, respect for patient autonomy was included into the ethical codes of the professional organisations. Secondly, the legal demands of the principle were questioned by clinicians. Informed consent is mainly interpreted in terms of freedom from interference and absolute autonomy. It lacks a constructive notion of physician–patient interaction in its effort to promote the best interest of the patient, which, however, potentially emerges from a reconsideration of the principle of beneficence. Conclusion To avoid insufficient legal interpretations, informed consent should be understood in terms of autonomy and beneficence. A continuous interaction between the patient and the given physician is considered as an essential prerequisite for the realisation of the standards of informed consent. PMID:16943326

  2. New approaches in mathematical biology: Information theory and molecular machines

    International Nuclear Information System (INIS)

    Schneider, T.

    1995-01-01

    My research uses classical information theory to study genetic systems. Information theory was founded by Claude Shannon in the 1940s and has had an enormous impact on communications engineering and computer sciences. Shannon found a way to measure information. This measure can be used to precisely characterize the sequence conservation at nucleic-acid binding sites. The resulting methods, by completely replacing the use of "consensus sequences", provide better models for molecular biologists. An excess of conservation led us to do experimental work on bacteriophage T7 promoters and the F plasmid IncD repeats. The wonderful fidelity of telephone communications and compact disk (CD) music can be traced directly to Shannon's channel capacity theorem. When rederived for molecular biology, this theorem explains the surprising precision of many molecular events. Through connections with the Second Law of Thermodynamics and Maxwell's Demon, this approach also has implications for the development of technology at the molecular level. Discussions of these topics are held on the internet news group bionet.info-theo. (author). (Abstract only)
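    Applied to aligned binding sites, Shannon's measure amounts to computing 2 − H at each alignment position (2 bits being the maximum for DNA's four-letter alphabet). A minimal sketch of that idea, omitting the small-sample correction used in practice; the toy alignment is invented, not data from the work described:

```python
import math
from collections import Counter

def information_content(aligned_sites):
    """Per-position information (bits) of aligned DNA binding sites:
    R_i = 2 - H_i, where H_i is the entropy of the base distribution at
    position i and 2 bits is the maximum for a four-letter alphabet.
    (The small-sample correction used in practice is omitted here.)"""
    n = len(aligned_sites)
    result = []
    for i in range(len(aligned_sites[0])):
        counts = Counter(site[i] for site in aligned_sites)
        h = -sum(c / n * math.log2(c / n) for c in counts.values())
        result.append(2.0 - h)
    return result

# Invented toy alignment (loosely -10-promoter-like), not data from the article:
sites = ["TATAAT", "TATGAT", "TACAAT", "TATAAT"]
per_position_bits = information_content(sites)
```

A perfectly conserved position carries the full 2 bits; an unconserved one carries about 0, which is what makes this measure a drop-in replacement for consensus sequences.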

  3. A Rolling Element Bearing Fault Diagnosis Approach Based on Multifractal Theory and Gray Relation Theory.

    Science.gov (United States)

    Li, Jingchao; Cao, Yunpeng; Ying, Yulong; Li, Shuying

    2016-01-01

    Bearing failure is one of the dominant causes of failure and breakdowns in rotating machinery, leading to huge economic loss. Aiming at the nonstationary and nonlinear characteristics of bearing vibration signals as well as the complexity of condition-indicating information distribution in the signals, a novel rolling element bearing fault diagnosis method based on multifractal theory and gray relation theory was proposed in the paper. Firstly, a generalized multifractal dimension algorithm was developed to extract the characteristic vectors of fault features from the bearing vibration signals, which can offer more meaningful and distinguishing information reflecting different bearing health status in comparison with conventional single fractal dimension. After feature extraction by multifractal dimensions, an adaptive gray relation algorithm was applied to implement an automated bearing fault pattern recognition. The experimental results show that the proposed method can identify various bearing fault types as well as severities effectively and accurately.
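    A generalized (Rényi) dimension of the kind used for feature extraction can be sketched with a crude box-mass partition function. This is an illustrative toy under stated assumptions (q ≠ 1, signal length divisible by each scale), not the paper's algorithm; real vibration signals require carefully chosen scaling ranges:

```python
import math

def generalized_dimension(signal, q, scales=(2, 4, 8, 16)):
    """Crude generalized (Renyi) dimension D_q of a nonnegative 1-D signal:
    the slope of log(sum_i p_i**q) / (q - 1) versus log(relative box size),
    where p_i is the normalized signal mass in box i.  Assumes q != 1 and
    len(signal) divisible by every scale; a toy, not a production estimator."""
    total = sum(signal)
    xs, ys = [], []
    for boxes in scales:
        width = len(signal) // boxes
        masses = [sum(signal[k * width:(k + 1) * width]) / total for k in range(boxes)]
        chi = sum(p ** q for p in masses if p > 0)   # partition function
        xs.append(math.log(1.0 / boxes))             # log relative box size
        ys.append(math.log(chi) / (q - 1))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
```

A uniform signal gives D_q = 1 for every q, while multifractal signals yield a q-dependent spectrum; evaluating D_q over a range of q is what makes the feature vector more discriminative than a single fractal dimension.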

  4. Online dating in Japan: a test of social information processing theory.

    Science.gov (United States)

    Farrer, James; Gavin, Jeff

    2009-08-01

    This study examines the experiences of past and present members of a popular Japanese online dating site in order to explore the extent to which Western-based theories of computer-mediated communication (CMC) and the development of online relationships are relevant to the Japanese online dating experience. Specifically, it examines whether social information processing theory (SIPT) is applicable to Japanese online dating interactions, and how and to what extent Japanese daters overcome the limitations of CMC through the use of contextual and other cues. Thirty-six current members and 27 former members of Match.com Japan completed an online survey. Using issue-based procedures for grounded theory analysis, we found strong support for SIPT. Japanese online daters adapt their efforts to present and acquire social information using the cues that the online dating platform provides, although many of these cues are specific to Japanese social context.

  5. Optimizing Sparse Representations of Kinetic Distributions via Information Theory

    Science.gov (United States)

    2017-07-31

    Robert Martin and Daniel Eckhardt, Air Force Research Laboratory (AFMC), AFRL/RQRS, 1 Ara Drive, Edwards AFB, CA 93524-7013. Excerpts (search-result snippets only): …momentum, energy, and physical entropy… Research in Industrial Projects for Students… Journal of Computational Physics, vol. 145, no. 1, pp. 382–405, 1998. [7] R. S. Martin, H. Le, D. L. Bilyeu, and S. Gildea, “Plasma model V&V of

  6. Properties of some nonlinear Schroedinger equations motivated through information theory

    International Nuclear Information System (INIS)

    Yuan, Liew Ding; Parwani, Rajesh R

    2009-01-01

    We update our understanding of nonlinear Schroedinger equations motivated through information theory. In particular we show that a q-deformation of the basic nonlinear equation leads to a perturbative increase in the energy of a system, thus favouring the simplest q = 1 case. Furthermore the energy minimisation criterion is shown to be equivalent, at leading order, to an uncertainty maximisation argument. The special value η = 1/4 for the interpolation parameter, where leading order energy shifts vanish, implies the preservation of existing supersymmetry in nonlinearised supersymmetric quantum mechanics. Physically, η might be encoding relativistic effects.

  7. Surrogate Marker Evaluation from an Information Theory Perspective

    OpenAIRE

    Alonso Abad, Ariel; Molenberghs, Geert

    2006-01-01

    The last 20 years have seen lots of work in the area of surrogate marker validation, partly devoted to frame the evaluation in a multitrial framework, leading to definitions in terms of the quality of trial- and individual-level association between a potential surrogate and a true endpoint (Buyse et al., 2000, Biostatistics 1, 49–67). A drawback is that different settings have led to different measures at the individual level. Here, we use information theory to create a unified framework, lea...

  8. Towards integrating control and information theories from information-theoretic measures to control performance limitations

    CERN Document Server

    Fang, Song; Ishii, Hideaki

    2017-01-01

    This book investigates the performance limitation issues in networked feedback systems. The fact that networked feedback systems consist of control and communication devices and systems calls for the integration of control theory and information theory. The primary contributions of this book lie in two aspects: the newly-proposed information-theoretic measures and the newly-discovered control performance limitations. We first propose a number of information notions to facilitate the analysis. Using those notions, classes of performance limitations of networked feedback systems, as well as state estimation systems, are then investigated. In general, the book presents a unique, cohesive treatment of performance limitation issues of networked feedback systems via an information-theoretic approach. This book is believed to be the first to treat the aforementioned subjects systematically and in a unified manner, offering a unique perspective differing from existing books.

  9. An introductory review of information theory in the context of computational neuroscience.

    Science.gov (United States)

    McDonnell, Mark D; Ikeda, Shiro; Manton, Jonathan H

    2011-07-01

    This article introduces several fundamental concepts in information theory from the perspective of their origins in engineering. Understanding such concepts is important in neuroscience for two reasons. First, simply applying formulae from information theory without understanding the assumptions behind their definitions can lead to erroneous results and conclusions. Furthermore, this century will see a convergence of information theory and neuroscience; information theory will expand its foundations to incorporate biological processes more comprehensively, thereby helping to reveal how neuronal networks achieve their remarkable information processing abilities.
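    One formula whose hidden assumptions matter in exactly the way the review warns about is the plug-in mutual-information estimator. A minimal sketch (illustrative only; the upward small-sample bias flagged in the comment is the kind of pitfall the article discusses):

```python
import math

def mutual_information(joint_counts):
    """Plug-in mutual information (bits) from a stimulus-by-response count
    table (rows = stimuli, columns = responses).  Caveat: the naive plug-in
    estimator is biased upward when samples are few relative to table size."""
    total = sum(sum(row) for row in joint_counts)
    px = [sum(row) / total for row in joint_counts]
    py = [sum(row[j] for row in joint_counts) / total
          for j in range(len(joint_counts[0]))]
    mi = 0.0
    for i, row in enumerate(joint_counts):
        for j, c in enumerate(row):
            if c:
                pxy = c / total
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi
```

A perfectly informative 2x2 table yields 1 bit and an independent one yields 0; bias-corrected or shuffling-based estimators are needed before trusting small neural datasets.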

  10. The Scope of Usage-based Theory

    Directory of Open Access Journals (Sweden)

    Paul eIbbotson

    2013-05-01

    Usage-based approaches typically draw on a relatively small set of cognitive processes, such as categorization, analogy and chunking to explain language structure and function. The goal of this paper is to first review the extent to which the ‘cognitive commitment’ of usage-based theory has had success in explaining empirical findings across domains, including language acquisition, processing and typology. We then look at the overall strengths and weaknesses of usage-based theory and highlight where there are significant debates. Finally, we draw special attention to a set of culturally generated structural patterns that seem to lie beyond the explanation of core usage-based cognitive processes. In this context we draw a distinction between cognition permitting language structure versus cognition entailing language structure. As well as addressing the need for greater clarity on the mechanisms of generalizations and the fundamental units of grammar, we suggest that integrating culturally generated structures within existing cognitive models of use will generate tighter predictions about how language works.

  11. Information flow, causality, and the classical theory of tachyons

    International Nuclear Information System (INIS)

    Basano, L.

    1977-01-01

    Causal paradoxes arising in the tachyon theory have been systematically solved by using the reinterpretation principle as a consequence of which cause and effect no longer retain an absolute meaning. However, even in the tachyon theory, a cause is always seen to chronologically precede its effect, but this is obtained at the price of allowing cause and effect to be interchanged when required. A recent result has shown that this interchangeability of cause and effect must not be unlimited if heavy paradoxes are to be avoided. This partial recovery of the classical concept of causality has been expressed by the conjecture that transcendent tachyons cannot be absorbed by a tachyon detector. In this paper the directional properties of the flow of information between two observers in relative motion and its consequences on the logical self-consistency of the theory of superluminal particles are analyzed. It is shown that the above conjecture does not provide a satisfactory solution to the problem because it implies that tachyons of any speed cannot be intercepted by the same detector. (author)

  12. Novel theory of the human brain: information-commutation basis of architecture and principles of operation

    Directory of Open Access Journals (Sweden)

    Bryukhovetskiy AS

    2015-02-01

    Andrey S Bryukhovetskiy Center for Biomedical Technologies, Federal Research and Clinical Center for Specialized Types of Medical Assistance and Medical Technologies of the Federal Medical Biological Agency, NeuroVita Clinic of Interventional and Restorative Neurology and Therapy, Moscow, Russia Abstract: Based on the methodology of the informational approach and research of the genome, proteome, and complete transcriptome profiles of different cells in the nervous tissue of the human brain, the author proposes a new theory of information-commutation organization and architecture of the human brain which is an alternative to the conventional systemic connective morphofunctional paradigm of the brain framework. Informational principles of brain operation are defined: the modular principle, holographic principle, principle of systematicity of vertical commutative connection and complexity of horizontal commutative connection, regulatory principle, relay principle, modulation principle, “illumination” principle, principle of personalized memory and intellect, and principle of low energy consumption. The author demonstrates that the cortex functions only as a switchboard and router of information, while information is processed outside the nervous tissue of the brain in the intermeningeal space. The main structural element of information-commutation in the brain is not the neuron, but information-commutation modules that are subdivided into receiver modules, transmitter modules, and subscriber modules, forming a vertical architecture of nervous tissue in the brain as information lines and information channels, and a horizontal architecture as central, intermediate, and peripheral information-commutation platforms. Information in information-commutation modules is transferred by means of the carriers that are characteristic to the specific information level from inductome to genome, transcriptome, proteome, metabolome, secretome, and magnetome

  13. The Study on Electronic Commerce Information Ethic Problems Base on Information Asymmetry Theory%基于信息不对称的电子商务诚信问题研究

    Institute of Scientific and Technical Information of China (English)

    陈开慧

    2012-01-01

    This paper summarizes information ethics problems in the electronic commerce environment, including privacy, credit crises, and intellectual property. From the perspective of information asymmetry, it analyses how electronic commerce information ethics problems emerge and how they affect the development of electronic commerce. Against adverse selection, it proposes effective measures such as establishing a licensing system, guarantees, and brands; against moral hazard, it proposes establishing third-party payment enterprises and third-party authentication institutions, adopting long-term cooperation mechanisms, and developing goods insurance business, so as to provide a new angle on electronic commerce information ethics problems in China.

  14. Prolegomena to a theory of nuclear information exchange

    International Nuclear Information System (INIS)

    Van Nuffelen, Dominique

    1997-01-01

    From the researcher's point of view, the communications with the agricultural populations in case of radiological emergency can not be anything else but the application of a theory of nuclear information exchange among social groups. Consequently, it is essentially necessary to work out such a theory, the prolegomena of which are exposed in this paper. It describes an experiment conducted at 'Service de protection contre les radiations ionisantes' - Belgium (SPRI), and proposes an investigation within the scientific knowledge in this matter. The available empirical and theoretical data allow formulating pragmatic recommendations, among which the principal one is the necessity of creating, in a normal radiological situation, a number of scenarios of messages adapted to the agricultural populations. The author points out that in order to be perfectly adapted these scenarios must be negotiated between the emitter and the receiver. If this condition is satisfied, the information in case of nuclear emergency will really be an exchange of knowledge between experts and the agricultural population, i.e. a 'communication'

  15. Understanding family health information seeking: a test of the theory of motivated information management.

    Science.gov (United States)

    Hovick, Shelly R

    2014-01-01

    Although a family health history can be used to assess disease risk and increase health prevention behaviors, research suggests that few people have collected family health information. Guided by the Theory of Motivated Information Management, this study seeks to understand the barriers to and facilitators of interpersonal information seeking about family health history. Individuals who were engaged to be married (N = 306) were surveyed online and in person to understand how factors such as uncertainty, expectations for an information search, efficacy, and anxiety influence decisions and strategies for obtaining family health histories. The results supported the Theory of Motivated Information Management by demonstrating that individuals who experienced uncertainty discrepancies regarding family health history had greater intention to seek information from family members when anxiety was low, outcome expectancy was high, and communication efficacy was positive. Although raising uncertainty about family health history may be an effective tool for health communicators to increase communication among family members, low-anxiety situations may be optimal for information seeking. Health communication messages must also build confidence in people's ability to communicate with family to obtain the needed health information.

  16. Efficiency and credit ratings: a permutation-information-theory analysis

    International Nuclear Information System (INIS)

    Bariviera, Aurelio Fernandez; Martinez, Lisana B; Zunino, Luciano; Belén Guercio, M; Rosso, Osvaldo A

    2013-01-01

    The role of credit rating agencies has been under severe scrutiny after the subprime crisis. In this paper we explore the relationship between credit ratings and informational efficiency of a sample of thirty nine corporate bonds of US oil and energy companies from April 2008 to November 2012. For this purpose we use a powerful statistical tool, relatively new in the financial literature: the complexity–entropy causality plane. This representation space allows us to graphically classify the different bonds according to their degree of informational efficiency. We find that this classification agrees with the credit ratings assigned by Moody’s. In particular, we detect the formation of two clusters, which correspond to the global categories of investment and speculative grades. Regarding the latter cluster, two subgroups reflect distinct levels of efficiency. Additionally, we also find an intriguing absence of correlation between informational efficiency and firm characteristics. This allows us to conclude that the proposed permutation-information-theory approach provides an alternative practical way to justify bond classification. (paper)
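    The entropy axis H of the complexity-entropy causality plane is the normalized Bandt-Pompe permutation entropy. A minimal sketch (illustrative only; the plane additionally requires a disequilibrium measure for the complexity axis, which is omitted here):

```python
import math
from itertools import permutations

def permutation_entropy(series, order=3):
    """Normalized Bandt-Pompe permutation entropy: 0 for a fully regular
    series, 1 for a fully random one.  This is the entropy axis H of the
    complexity-entropy causality plane."""
    counts = {p: 0 for p in permutations(range(order))}
    n = len(series) - order + 1
    for i in range(n):
        window = series[i:i + order]
        # ordinal pattern: indices of the window sorted by value
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] += 1
    h = -sum(c / n * math.log2(c / n) for c in counts.values() if c)
    return h / math.log2(math.factorial(order))
```

Efficient (unpredictable) price series sit near H = 1, while structured series sit lower, which is the basis for ranking bonds by informational efficiency.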

  17. The use of information theory for the evaluation of biomarkers of aging and physiological age.

    Science.gov (United States)

    Blokh, David; Stambler, Ilia

    2017-04-01

    The present work explores the application of information theoretical measures, such as entropy and normalized mutual information, for research of biomarkers of aging. The use of information theory affords unique methodological advantages for the study of aging processes, as it allows evaluating non-linear relations between biological parameters, providing the precise quantitative strength of those relations, both for individual and multiple parameters, showing cumulative or synergistic effect. Here we illustrate those capabilities utilizing a dataset on heart disease, including diagnostic parameters routinely available to physicians. The use of information-theoretical methods, utilizing normalized mutual information, revealed the exact amount of information that various diagnostic parameters or their combinations contained about the persons' age. Based on those exact informative values for the correlation of measured parameters with age, we constructed a diagnostic rule (a decision tree) to evaluate physiological age, as compared to chronological age. The present data illustrated that younger subjects suffering from heart disease showed characteristics of people of higher age (higher physiological age). Utilizing information-theoretical measures, with additional data, it may be possible to create further clinically applicable information-theory-based markers and models for the evaluation of physiological age, its relation to age-related diseases and its potential modifications by therapeutic interventions. Copyright © 2017 Elsevier B.V. All rights reserved.
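    Normalized mutual information of the kind described can be computed directly from count data. A minimal sketch under stated assumptions (both variables already discretized; note that several normalization conventions exist in the literature, and the real analysis involves multi-parameter combinations):

```python
import math
from collections import Counter

def entropy_bits(values):
    """Shannon entropy (bits) of a discrete sample."""
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in Counter(values).values())

def normalized_mutual_information(xs, ys):
    """I(X;Y) / H(Y): the share of the uncertainty in Y (e.g. an age group)
    removed by knowing X (e.g. a discretized diagnostic parameter)."""
    n = len(xs)
    joint = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    mi = sum(c / n * math.log2((c / n) / (px[a] / n * py[b] / n))
             for (a, b), c in joint.items())
    return mi / entropy_bits(ys)
```

A value of 1 means the parameter fully determines the age group, 0 means it is uninformative; ranking parameters by this quantity is what drives the decision-tree construction described above.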

  18. A Game Theory Based Solution for Security Challenges in CRNs

    Science.gov (United States)

    Poonam; Nagpal, Chander Kumar

    2018-03-01

    Cognitive radio networks (CRNs) are envisioned to drive the next generation of ad hoc wireless networks through their ability to provide communications resilience in continuously changing environments via dynamic spectrum access. Conventionally, CRNs depend on information gathered by other secondary users to ensure the accuracy of spectrum sensing, making them vulnerable to security attacks and creating the need for security mechanisms such as cryptography and trust. However, a typical cryptography-based solution is not viable for CRNs owing to their limited resources. The effectiveness of trust-based approaches has always been in question due to the credibility of secondary trust resources. Game theory, with its ability to optimize in an environment of conflicting interests, can be quite a suitable tool to manage an ad hoc network in the presence of autonomous selfish/malevolent/malicious and attacker nodes. The literature contains several theoretical proposals for applying game theory in ad hoc networks without explicit/detailed implementation. This paper implements a game theory based solution in MATLAB-2015 to secure the CRN environment and compares the obtained results with the traditional approaches of trust and cryptography. The simulation results indicate that as time progresses the game theory approach performs much better, with higher throughput, lower jitter and better identification of selfish/malicious nodes.

  19. Defining information need in health - assimilating complex theories derived from information science.

    Science.gov (United States)

    Ormandy, Paula

    2011-03-01

    Key policy drivers worldwide include optimizing patients' roles in managing their care; focusing services around patients' needs and preferences; and providing information to support patients' contributions and choices. The term information need penetrates many policy documents. Information need is espoused as the foundation from which to develop patient-centred or patient-led services. Yet there is no clear definition as to what the term means or how patients' information needs inform and shape information provision and patient care. The assimilation of complex theories originating from information science has much to offer considerations of patient information need within the context of health care. Health-related research often focuses on the content of information patients prefer, not why they need information. This paper extends and applies knowledge of information behaviour to considerations of information need in health, exposing a working definition for patient information need that reiterates the importance of considering the patient's goals and understanding the patient's context/situation. A patient information need is defined as 'recognition that their knowledge is inadequate to satisfy a goal, within the context/situation that they find themselves in at a specific point in time'. This typifies the key concepts of national/international health policy, the centrality and importance of the patient. The proposed definition of patient information need provides a conceptual framework to guide health-care practitioners on what to consider and why when meeting the information needs of patients in practice. This creates a solid foundation from which to inform future research. © 2010 The Author. Health Expectations © 2010 Blackwell Publishing Ltd.

  20. Introduction to the theory of bases

    CERN Document Server

    Marti, Jürg T

    1969-01-01

    Since the publication of Banach's treatise on the theory of linear operators, the literature on the theory of bases in topological vector spaces has grown enormously. Much of this literature has for its origin a question raised in Banach's book, the question whether every separable Banach space possesses a basis or not. The notion of a basis employed here is a generalization of that of a Hamel basis for a finite dimensional vector space. For a vector space X of infinite dimension, the concept of a basis is closely related to the convergence of the series which uniquely correspond to each point of X. Thus there are different types of bases for X, according to the topology imposed on X and the chosen type of convergence for the series. Although almost four decades have elapsed since Banach's query, the conjectured existence of a basis for every separable Banach space is not yet proved. On the other hand, no counter examples have been found to show the existence of a special Banach space having no basis. Howe...

  1. Platoon Dispersion Analysis Based on Diffusion Theory

    Directory of Open Access Journals (Sweden)

    Badhrudeen Mohamed

    2017-01-01

    Full Text Available Urbanization and growing demand for travel cause the traffic system to work ineffectively in most urban areas, leading to traffic congestion. Many approaches have been adopted to address this problem, one among them being signal coordination. This can be achieved if the platoon of vehicles discharged at one signal gets green at consecutive signals with minimal delay. However, platoons tend to disperse as they travel, and this dispersion phenomenon should be taken into account for effective signal coordination. Reported studies in this area are from homogeneous and lane-disciplined traffic conditions. This paper analyses the platoon dispersion characteristics under heterogeneous and lane-less traffic conditions. Of the various modeling techniques reported, the approach based on diffusion theory is used in this study. Diffusion-theory-based models have so far assumed the data to follow a normal distribution. In the present study, however, the data were found to follow a lognormal distribution, and hence the implementation was carried out using the lognormal distribution. The parameters of the lognormal distribution were calibrated for the study condition. For comparison purposes, the normal distribution was also calibrated and the results were evaluated. It was found that the model with the lognormal distribution performed better in all cases than the one with the normal distribution.
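
    A minimal simulation of the lognormal dispersion mechanism described above (parameter values are illustrative, not the calibrated ones from the study): each vehicle's travel time to the downstream signal is drawn from a lognormal distribution, and the platoon's time span widens accordingly.

```python
import math
import random
import statistics

random.seed(42)

# Assumed lognormal travel-time parameters (median travel time ~30 s).
mu, sigma = math.log(30.0), 0.25

departures = [i * 1.0 for i in range(20)]   # platoon leaves at 1 s headways
arrivals = sorted(d + random.lognormvariate(mu, sigma) for d in departures)

span_at_departure = departures[-1] - departures[0]
span_at_arrival = arrivals[-1] - arrivals[0]
# Mean delay equals the mean travel time, regardless of arrival reordering.
mean_delay = statistics.mean(arrivals) - statistics.mean(departures)
```

    For signal coordination, the downstream green window would be timed against this arrival distribution rather than against the tighter departure profile.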

  2. Informing Patients About Placebo Effects: Using Evidence, Theory, and Qualitative Methods to Develop a New Website.

    Science.gov (United States)

    Greville-Harris, Maddy; Bostock, Jennifer; Din, Amy; Graham, Cynthia A; Lewith, George; Liossi, Christina; O'Riordan, Tim; White, Peter; Yardley, Lucy; Bishop, Felicity L

    2016-06-10

    According to established ethical principles and guidelines, patients in clinical trials should be fully informed about the interventions they might receive. However, information about placebo-controlled clinical trials typically focuses on the new intervention being tested and provides limited and at times misleading information about placebos. We aimed to create an informative, scientifically accurate, and engaging website that could be used to improve understanding of placebo effects among patients who might be considering taking part in a placebo-controlled clinical trial. Our approach drew on evidence-, theory-, and person-based intervention development. We used existing evidence and theory about placebo effects to develop content that was scientifically accurate. We used existing evidence and theory of health behavior to ensure our content would be communicated persuasively, to an audience who might currently be ignorant or misinformed about placebo effects. A qualitative 'think aloud' study was conducted in which 10 participants viewed prototypes of the website and spoke their thoughts out loud in the presence of a researcher. The website provides information about 10 key topics and uses text, evidence summaries, quizzes, audio clips of patients' stories, and a short film to convey key messages. Comments from participants in the think aloud study highlighted occasional misunderstandings and off-putting/confusing features. These were addressed by modifying elements of content, style, and navigation to improve participants' experiences of using the website. We have developed an evidence-based website that incorporates theory-based techniques to inform members of the public about placebos and placebo effects. Qualitative research ensured our website was engaging and convincing for our target audience who might not perceive a need to learn about placebo effects. Before using the website in clinical trials, it is necessary to test its effects on key outcomes.

  3. BOOK REVIEW: Theory of Neural Information Processing Systems

    Science.gov (United States)

    Galla, Tobias

    2006-04-01

    It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10^11 neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kühn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, still the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard to find in the

  4. EDITORIAL: Focus on Quantum Information and Many-Body Theory

    Science.gov (United States)

    Eisert, Jens; Plenio, Martin B.

    2010-02-01

    Quantum many-body models describing natural systems or materials and physical systems assembled piece by piece in the laboratory for the purpose of realizing quantum information processing share an important feature: intricate correlations that originate from the coherent interaction between a large number of constituents. In recent years it has become manifest that the cross-fertilization between research devoted to quantum information science and to quantum many-body physics leads to new ideas, methods, tools, and insights in both fields. Issues of criticality, quantum phase transitions, quantum order and magnetism that play a role in one field find relations to the classical simulation of quantum systems, to error correction and fault tolerance thresholds, to channel capacities and to topological quantum computation, to name but a few. The structural similarities of typical problems in both fields and the potential for pooling of ideas then become manifest. Notably, methods and ideas from quantum information have provided fresh approaches to long-standing problems in strongly correlated systems in the condensed matter context, including both numerical methods and conceptual insights. Focus on quantum information and many-body theory Contents TENSOR NETWORKS Homogeneous multiscale entanglement renormalization ansatz tensor networks for quantum critical systems M Rizzi, S Montangero, P Silvi, V Giovannetti and Rosario Fazio Concatenated tensor network states R Hübener, V Nebendahl and W Dür Entanglement renormalization in free bosonic systems: real-space versus momentum-space renormalization group transforms G Evenbly and G Vidal Finite-size geometric entanglement from tensor network algorithms Qian-Qian Shi, Román Orús, John Ove Fjærestad and Huan-Qiang Zhou Characterizing symmetries in a projected entangled pair state D Pérez-García, M Sanz, C E González-Guillén, M M Wolf and J I Cirac Matrix product operator representations B Pirvu, V Murg, J I Cirac

  5. EDITORIAL: Quantum control theory for coherence and information dynamics Quantum control theory for coherence and information dynamics

    Science.gov (United States)

    Viola, Lorenza; Tannor, David

    2011-08-01

    Precisely characterizing and controlling the dynamics of realistic open quantum systems has emerged in recent years as a key challenge across contemporary quantum sciences and technologies, with implications ranging from physics, chemistry and applied mathematics to quantum information processing (QIP) and quantum engineering. Quantum control theory aims to provide both a general dynamical-system framework and a constructive toolbox to meet this challenge. The purpose of this special issue of Journal of Physics B: Atomic, Molecular and Optical Physics is to present a state-of-the-art account of recent advances and current trends in the field, as reflected in two international meetings that were held on the subject over the last summer and which motivated in part the compilation of this volume—the Topical Group: Frontiers in Open Quantum Systems and Quantum Control Theory, held at the Institute for Theoretical Atomic, Molecular and Optical Physics (ITAMP) in Cambridge, Massachusetts (USA), from 1-14 August 2010, and the Safed Workshop on Quantum Decoherence and Thermodynamics Control, held in Safed (Israel), from 22-27 August 2010. Initial developments in quantum control theory date back to (at least) the early 1980s, and have been largely inspired by the well-established mathematical framework for classical dynamical systems. As the above-mentioned meetings made clear, and as the burgeoning body of literature on the subject testifies, quantum control has grown since then well beyond its original boundaries, and has by now evolved into a highly cross-disciplinary field which, while still fast-moving, is also entering a new phase of maturity, sophistication, and integration. Two trends deserve special attention: on the one hand, a growing emphasis on control tasks and methodologies that are specifically motivated by QIP, in addition and in parallel to applications in more traditional areas where quantum coherence is nevertheless vital (such as, for instance

  6. Surrogate marker evaluation from an information theory perspective.

    Science.gov (United States)

    Alonso, Ariel; Molenberghs, Geert

    2007-03-01

    The last 20 years have seen lots of work in the area of surrogate marker validation, partly devoted to frame the evaluation in a multitrial framework, leading to definitions in terms of the quality of trial- and individual-level association between a potential surrogate and a true endpoint (Buyse et al., 2000, Biostatistics 1, 49-67). A drawback is that different settings have led to different measures at the individual level. Here, we use information theory to create a unified framework, leading to a definition of surrogacy with an intuitive interpretation, offering interpretational advantages, and applicable in a wide range of situations. Our method provides a better insight into the chances of finding a good surrogate endpoint in a given situation. We further show that some of the previous proposals follow as special cases of our method. We illustrate our methodology using data from a clinical study in psychiatry.
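
    The unified information-theoretic surrogacy measure proposed by the authors is, to my reading, of the form R²ₕ = 1 − exp(−2·I(S,T)), where I is the mutual information between surrogate and true endpoint; treat the sketch below as an interpretation rather than the paper's exact formulation. For jointly normal endpoints it reduces to the familiar squared correlation:

```python
import math

def information_r2(mi_nats):
    """R_h^2 = 1 - exp(-2 I(S, T)): 0 when surrogate and true endpoint are
    independent, approaching 1 as the surrogate becomes fully informative."""
    return 1.0 - math.exp(-2.0 * mi_nats)

def gaussian_mi(rho):
    """Mutual information (nats) of a bivariate normal with correlation rho."""
    return -0.5 * math.log(1.0 - rho ** 2)
```

    For example, `information_r2(gaussian_mi(0.8))` recovers 0.64, the usual R² of a Gaussian surrogate with correlation 0.8, which is how the earlier proposals appear as special cases.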

  7. Information theory applied to econophysics: stock market behaviors

    Science.gov (United States)

    Vogel, Eugenio E.; Saravia, Gonzalo

    2014-08-01

    The use of data compressor techniques has made it possible to recognize magnetic transitions and their associated critical temperatures [E.E. Vogel, G. Saravia, V. Cortez, Physica A 391, 1591 (2012)]. In the present paper we introduce some new concepts associated with data recognition and extend the use of these techniques to econophysics to explore the variations of stock market indicators, showing that information theory can help to recognize different regimes. Modifications and further developments of the previously introduced data compressor wlzip are introduced, yielding two measurements. Additionally, we introduce an algorithm that allows tuning the number of significant digits on which the data compression acts, complementing this with an appropriate method to round off the truncation. The application is done to IPSA, the main indicator of the Chilean Stock Market, during the year 2010, chosen for the availability of quality data and also to consider a rare event: the earthquake of the 27th of February of that year, which is as of now the sixth strongest earthquake ever recorded by instruments (8.8 on the Richter scale) according to the United States Geological Survey. Along the year 2010 different regimes are recognized. Calm days show larger compression than agitated days, allowing for classification and recognition. The focus then turns onto selected days, showing that it is possible to recognize different regimes with the data of the last hour (60 entries), allowing decisions to be made in a safer way. The "day of the week" effect is weakly present, but the "hour of the day" effect is clearly present; its causes and implications are discussed. This effect also establishes the influence of the Asian, European and American stock markets over the smaller Chilean Stock Market. Dynamical studies are then conducted, intended to find a system that can help to detect sudden variations of the market in real time; it is found that information theory can be really helpful in this respect.
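
    The authors' wlzip compressor is not publicly standard, so as a stand-in the same calm-versus-agitated discrimination can be sketched with zlib on a series rounded to a chosen precision (the rounding plays the role of the significant-digit tuning described above):

```python
import random
import zlib

def compression_ratio(values, decimals=1):
    """Compressed/original size of the rounded, serialized series:
    repetitive (calm) regimes compress far better than agitated ones."""
    text = ",".join(f"{v:.{decimals}f}" for v in values).encode()
    return len(zlib.compress(text)) / len(text)

random.seed(0)
calm = [100.0] * 500                                        # flat index values
agitated = [100.0 + random.uniform(-5, 5) for _ in range(500)]
```

    A rolling window of such ratios over intraday data (e.g. the last 60 entries mentioned above) would flag a regime change as the ratio rises.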

  8. Advances in heuristically based generalized perturbation theory

    International Nuclear Information System (INIS)

    Gandini, A.

    1994-01-01

    A distinctive feature of the heuristically based generalized perturbation theory methodology consists in the systematic use of importance conservation concepts. As is well known, this use leads to fundamental reciprocity relationships. The alternative variational and differential approaches, instead, make consistent use of the properties of adjoint functions. The equivalence between the importance and the adjoint functions has been demonstrated in important cases. There are some instances, however, in which the commonly known operators governing the adjoint function are not adequate. In this paper, ways proposed to generalize these rules, as adopted within the heuristic generalized perturbation theory methodology, are illustrated. When applied to the neutron/nuclide field characterizing the core evolution in a power reactor system, in which also an intensive control variable (ρ) is defined, these rules lead to an orthogonality relationship connected to this same control variable. A set of ρ-mode eigenfunctions may be correspondingly defined, and an extended concept of reactivity (generalizing that commonly associated with the multiplication factor) is proposed as more directly indicative of the controllability of a critical reactor system. (author). 25 refs

  9. System Dynamics as Model-Based Theory Building

    OpenAIRE

    Schwaninger, Markus; Grösser, Stefan N.

    2008-01-01

    This paper introduces model-based theory building as a feature of system dynamics (SD) with large potential. It presents a systemic approach to actualizing that potential, thereby opening up a new perspective on theory building in the social sciences. The question addressed is if and how SD enables the construction of high-quality theories. This contribution is based on field experiment type projects which have been focused on model-based theory building, specifically the construction of a mi...

  10. The Use of Ideas of Information Theory for Studying “Language” and Intelligence in Ants

    Directory of Open Access Journals (Sweden)

    Zhanna Reznikova

    2009-11-01

    Full Text Available In this review we integrate results of a long-term experimental study on ant “language” and intelligence which were fully based on fundamental ideas of information theory, such as the Shannon entropy, the Kolmogorov complexity, and Shannon’s equation connecting the length of a message (l) and its frequency (p), i.e., l = –log p, for rational communication systems. This approach enabled us to obtain the following important results on ants’ communication and intelligence: (i) to reveal “distant homing” in ants, that is, their ability to transfer information about remote events; (ii) to estimate the rate of information transmission; (iii) to reveal that ants are able to grasp regularities and to use them for “compression” of information; (iv) to reveal that ants are able to transfer to each other information about the number of objects; (v) to discover that ants can add and subtract small numbers. The obtained results show that information theory is not only an excellent mathematical theory, but that many of its results may be considered as laws of Nature.
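
    The relation l = −log p quoted above says that in a rational communication system frequent messages should be short. A small numeric check, with an invented distribution of route-pattern messages (the patterns and probabilities are illustrative, not the experimental ones):

```python
import math

# Hypothetical frequencies of turn patterns an ant might need to communicate.
frequencies = {"LLL": 0.5, "RRR": 0.25, "LRL": 0.125, "RLR": 0.125}

# Optimal message length in bits for each pattern: l = -log2 p.
optimal_length = {msg: -math.log2(p) for msg, p in frequencies.items()}

# The frequency-weighted average of these lengths is the Shannon entropy.
entropy = sum(p * (-math.log2(p)) for p in frequencies.values())
```

    This is the sense in which the shorter transmission times observed for frequent, regular routes indicate "compression" of information by the ants.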

  11. The application of foraging theory to the information searching behaviour of general practitioners.

    Science.gov (United States)

    Dwairy, Mai; Dowell, Anthony C; Stahl, Jean-Claude

    2011-08-23

    General Practitioners (GPs) employ strategies to identify and retrieve medical evidence for clinical decision making which take workload and time constraints into account. Optimal Foraging Theory (OFT), initially developed to study animal foraging for food, is used to explore the information searching behaviour of General Practitioners. This study is the first to apply foraging theory within this context. Study objectives were: 1. To identify the sequence and steps deployed in identifying and retrieving evidence for clinical decision making. 2. To utilise Optimal Foraging Theory to assess the effectiveness and efficiency of General Practitioner information searching. GPs from the Wellington region of New Zealand were asked to document in a pre-formatted logbook the steps and outcomes of an information search linked to their clinical decision making, and fill in a questionnaire about their personal, practice and information-searching backgrounds. A total of 115/155 eligible GPs returned a background questionnaire, and 71 completed their information search logbook. GPs spent an average of 17.7 minutes addressing their search for clinical information. Their preferred information sources were discussions with colleagues (38% of sources) and books (22%). These were the two most profitable information foraging sources (15.9 min and 9.5 min search time per answer, compared to 34.3 minutes in databases). GPs nearly always accessed another source when unsuccessful (95% after 1st source), and frequently when successful (43% after 2nd source). Use of multiple sources accounted for 41% of searches, and increased search success from 70% to 89%. By consulting in foraging terms the most 'profitable' sources of information (colleagues, books), rapidly switching sources when unsuccessful, and frequently double checking, GPs achieve an efficient trade-off between maximizing search success and information reliability, and minimizing searching time. 
As predicted by foraging theory, GPs
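
    In foraging terms, a source's profitability is its reward rate: answers per minute of search. Using the mean search times per answer reported above (a rough reading of the abstract's figures, not the study's full dataset):

```python
# Minutes of search per answered question, as reported in the abstract.
minutes_per_answer = {"colleagues": 15.9, "books": 9.5, "databases": 34.3}

# Profitability = answers per minute searched (the reciprocal).
profitability = {src: 1.0 / t for src, t in minutes_per_answer.items()}
ranked = sorted(profitability, key=profitability.get, reverse=True)
```

    Optimal foraging predicts that searchers concentrate on the top of this ranking and abandon a patch quickly when it stops yielding, which matches the rapid source-switching the study observed.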

  12. The application of foraging theory to the information searching behaviour of general practitioners

    Directory of Open Access Journals (Sweden)

    Dowell Anthony C

    2011-08-01

    minimizing searching time. As predicted by foraging theory, GPs trade time-consuming evidence-based (electronic) information sources for sources with a higher information reward per unit time searched. Evidence-based practice must accommodate these 'real world' foraging pressures, and Internet resources should evolve to deliver information as effectively as traditional methods of information gathering.

  13. Context based multimedia information retrieval

    DEFF Research Database (Denmark)

    Mølgaard, Lasse Lohilahti

    The large amounts of digital media becoming available require that new approaches are developed for retrieving, navigating and recommending the data to users in a way that reflects how we semantically perceive the content. The thesis investigates ways to retrieve and present content for users...... topics from a large collection of the transcribed speech to improve retrieval of spoken documents. The context modelling is done using a variant of probabilistic latent semantic analysis (PLSA), to extract properties of the textual sources that reflect how humans perceive context. We perform PLSA...... of Wikipedia, as well as text-based semantic similarity. The final aspect investigated is how to include some of the structured data available in Wikipedia to include temporal information. We show that a multiway extension of PLSA makes it possible to extract temporally meaningful topics, better than using...
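
    The thesis's PLSA variant is not specified in this excerpt; as orientation, a minimal EM implementation of standard PLSA, factorizing P(w|d) ≈ Σ_z P(w|z)·P(z|d) on a toy document-term matrix, is sketched below (dimensions and data are invented):

```python
import numpy as np

def plsa(counts, n_topics, n_iter=50, seed=0):
    """Minimal PLSA via EM on a (documents x words) count matrix."""
    rng = np.random.default_rng(seed)
    n_docs, n_words = counts.shape
    p_w_z = rng.random((n_topics, n_words)); p_w_z /= p_w_z.sum(1, keepdims=True)
    p_z_d = rng.random((n_docs, n_topics));  p_z_d /= p_z_d.sum(1, keepdims=True)
    for _ in range(n_iter):
        # E-step: responsibilities P(z | d, w) for every (doc, word) pair.
        joint = p_z_d[:, :, None] * p_w_z[None, :, :]        # shape (d, z, w)
        joint /= joint.sum(1, keepdims=True) + 1e-12
        weighted = counts[:, None, :] * joint                # expected counts
        # M-step: re-estimate both factors from the expected counts.
        p_w_z = weighted.sum(0); p_w_z /= p_w_z.sum(1, keepdims=True) + 1e-12
        p_z_d = weighted.sum(2); p_z_d /= p_z_d.sum(1, keepdims=True) + 1e-12
    return p_w_z, p_z_d

# Two obvious "contexts": docs 0-1 use words 0-1, docs 2-3 use words 2-3.
counts = np.array([[5, 5, 0, 0], [4, 6, 0, 0],
                   [0, 0, 5, 5], [0, 0, 6, 4]], float)
p_w_z, p_z_d = plsa(counts, n_topics=2)
```

    The rows of `p_z_d` give each document's topic mixture, which is what a retrieval or recommendation system would compare across documents.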

  14. Physically based rendering from theory to implementation

    CERN Document Server

    Pharr, Matt

    2010-01-01

    "Physically Based Rendering, 2nd Edition" describes both the mathematical theory behind a modern photorealistic rendering system as well as its practical implementation. A method - known as 'literate programming'- combines human-readable documentation and source code into a single reference that is specifically designed to aid comprehension. The result is a stunning achievement in graphics education. Through the ideas and software in this book, you will learn to design and employ a full-featured rendering system for creating stunning imagery. This book features new sections on subsurface scattering, Metropolis light transport, precomputed light transport, multispectral rendering, and much more. It includes a companion site complete with source code for the rendering system described in the book, with support for Windows, OS X, and Linux. Code and text are tightly woven together through a unique indexing feature that lists each function, variable, and method on the page that they are first described.

  15. Plasma balance equations based on orbit theory

    International Nuclear Information System (INIS)

    Lehnert, B.

    1982-01-01

    A set of plasma balance equations is proposed which is based on orbit theory and the particle distribution function, to provide means for theoretical analysis of a number of finite Larmor radius (FLR) phenomena without use of the Vlasov equation. Several important FLR effects originate from the inhomogeneity of an electric field in the plasma. The exact solution of a simple case shows that this inhomogeneity introduces fundamental changes in the physics of the particle motion. Thus, the periodic Larmor motion (gyration) is shifted in frequency and becomes elliptically polarized. Further, the non-periodic guiding-centre drift obtains additional components, part of which are accelerated such as to make the drift orbits intersect the equipotential surfaces of a static electric field. An attempt is finally made to classify the FLR effects, also with the purpose of identifying phenomena which have so far not been investigated. (author)
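
    For orientation (this is the textbook zeroth-order result, not the paper's FLR-corrected analysis): in uniform fields the guiding centre drifts at v = (E × B)/|B|², and it is precisely this drift that acquires additional components when the electric field is inhomogeneous.

```python
import numpy as np

def exb_drift(E, B):
    """Zeroth-order guiding-centre drift v = (E x B) / |B|^2, SI units."""
    E, B = np.asarray(E, float), np.asarray(B, float)
    return np.cross(E, B) / np.dot(B, B)

# E = 1 kV/m along x, B = 0.1 T along z  ->  drift |E|/|B| = 10 km/s along -y.
v = exb_drift([1.0e3, 0.0, 0.0], [0.0, 0.0, 0.1])
```

    Note the drift is along equipotential surfaces here; the paper's point is that an inhomogeneous E accelerates components of the drift so that orbits intersect those surfaces.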

  16. Generalized phase retrieval algorithm based on information measures

    OpenAIRE

    Shioya, Hiroyuki; Gohara, Kazutoshi

    2006-01-01

    An iterative phase retrieval algorithm based on the maximum entropy method (MEM) is presented. Introducing a new generalized information measure, we derive a novel class of algorithms which includes the conventionally used error reduction algorithm and a MEM-type iterative algorithm which is presented for the first time. These different phase retrieval methods are unified on the basis of the framework of information measures used in information theory.
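
    The conventional error-reduction algorithm mentioned above alternates between enforcing the measured Fourier magnitudes and an object-domain constraint. A compact NumPy sketch (toy object; support-plus-positivity is an assumed object-domain constraint):

```python
import numpy as np

def error_reduction(magnitudes, support, n_iter=200, seed=0):
    """Error-reduction phase retrieval: alternate projections between the
    Fourier-magnitude constraint and the object-domain support constraint."""
    rng = np.random.default_rng(seed)
    phase = np.exp(2j * np.pi * rng.random(magnitudes.shape))  # random start
    g = np.fft.ifft2(magnitudes * phase).real
    for _ in range(n_iter):
        G = np.fft.fft2(g)
        G = magnitudes * np.exp(1j * np.angle(G))   # keep phase, fix magnitude
        g = np.fft.ifft2(G).real
        g = np.where(support & (g > 0), g, 0.0)     # support + positivity
    return g

# Try to recover a small positive object from its Fourier magnitudes alone.
true_obj = np.zeros((16, 16))
true_obj[4:8, 5:9] = 1.0
support = np.zeros((16, 16), bool)
support[2:10, 3:11] = True                  # loose support containing the object
mags = np.abs(np.fft.fft2(true_obj))
recovered = error_reduction(mags, support)
```

    The MEM-type variant in the paper replaces the hard object-domain projection with an update derived from a generalized information measure; the Fourier-magnitude step is shared.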

  17. Inference with minimal Gibbs free energy in information field theory

    International Nuclear Information System (INIS)

    Ensslin, Torsten A.; Weig, Cornelius

    2010-01-01

    Non-linear and non-Gaussian signal inference problems are difficult to tackle. Renormalization techniques permit us to construct good estimators for the posterior signal mean within information field theory (IFT), but the approximations and assumptions made are not very obvious. Here we introduce the simple concept of minimal Gibbs free energy to IFT, and show that previous renormalization results emerge naturally. They can be understood as being the Gaussian approximation to the full posterior probability, which has maximal cross information with it. We derive optimized estimators for three applications, to illustrate the usage of the framework: (i) reconstruction of a log-normal signal from Poissonian data with background counts and point spread function, as it is needed for gamma ray astronomy and for cosmography using photometric galaxy redshifts, (ii) inference of a Gaussian signal with unknown spectrum, and (iii) inference of a Poissonian log-normal signal with unknown spectrum, the combination of (i) and (ii). Finally we explain how Gaussian knowledge states constructed by the minimal Gibbs free energy principle at different temperatures can be combined into a more accurate surrogate of the non-Gaussian posterior.
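
    In the linear-Gaussian limit the Gaussian posterior approximation described above reduces to the classical Wiener filter, with posterior mean m = S Rᵀ (R S Rᵀ + N)⁻¹ d. A small sketch with an assumed smooth prior covariance (illustrative only; not one of the paper's three applications, which are non-Gaussian):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50
x = np.arange(n)

# Assumed smooth signal prior covariance S (Gaussian kernel + jitter),
# identity response R, and white noise covariance N.
S = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 5.0) ** 2) + 1e-6 * np.eye(n)
R = np.eye(n)
N = 0.1 * np.eye(n)

signal = rng.multivariate_normal(np.zeros(n), S)
data = R @ signal + rng.multivariate_normal(np.zeros(n), N)

# Gaussian posterior mean (Wiener filter): m = S R^T (R S R^T + N)^-1 d.
m = S @ R.T @ np.linalg.solve(R @ S @ R.T + N, data)
```

    The minimal Gibbs free energy construction generalizes exactly this Gaussian-posterior-mean idea to non-Gaussian signals and unknown spectra.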

  18. Untangling the drivers of nonlinear systems with information theory

    Science.gov (United States)

    Wing, S.; Johnson, J.

    2017-12-01

    Many systems found in nature are nonlinear. The drivers of the system are often nonlinearly correlated with one another, which makes it a challenge to understand the effects of an individual driver. For example, solar wind velocity (Vsw) and density (nsw) are both found to correlate well with radiation belt fluxes and are thought to be drivers of the magnetospheric dynamics; however, the Vsw is anti-correlated with nsw, which can potentially confuse interpretation of these relationships as causal or coincidental. Information theory can untangle the drivers of these systems, describe the underlying dynamics, and offer constraints to modelers and theorists, leading to better understanding of the systems. Two examples are presented. In the first example, the solar wind drivers of geosynchronous electrons with energy range of 1.8-3.5 MeV are investigated using mutual information (MI), conditional mutual information (CMI), and transfer entropy (TE). The information transfer from Vsw to geosynchronous MeV electron flux (Je) peaks with a lag time (t) of 2 days. As previously reported, Je is anticorrelated with nsw with a lag of 1 day. However, this lag time and anticorrelation can be attributed mainly to the Je(t + 2 days) correlation with Vsw(t) and nsw(t + 1 day) anticorrelation with Vsw(t). Analyses of solar wind driving of the magnetosphere need to consider the large lag times, up to 3 days, in the (Vsw, nsw) anticorrelation. Using CMI to remove the effects of Vsw, the response of Je to nsw is 30% smaller and has a lag time < 24 hr, suggesting that the loss mechanism due to nsw or solar wind dynamic pressure has to start operating in < 24 hr. nsw transfers about 36% as much information as Vsw (the primary driver) to Je. Nonstationarity in the system dynamics is investigated using windowed TE. When the data is ordered according to high or low transfer entropy it is possible to understand details of the triangle distribution that has been identified between Je(t + 2
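
    The lagged dependencies described above can be probed with a simple histogram estimator of I(driver(t); response(t + lag)). The synthetic driver/response pair below (the response echoes the driver two steps later, plus noise) stands in for quantities like Vsw and Je:

```python
import numpy as np

def lagged_mutual_information(x, y, lag, bins=8):
    """Histogram estimate of I(x(t); y(t + lag)) in bits: how much knowing
    the driver now tells you about the response `lag` steps later."""
    x, y = np.asarray(x[:len(x) - lag]), np.asarray(y[lag:])
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(1), pxy.sum(0)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz])).sum())

# Driver, and a response that echoes it 2 steps later with added noise.
rng = np.random.default_rng(3)
driver = rng.standard_normal(5000)
response = np.roll(driver, 2) + 0.3 * rng.standard_normal(5000)

mi_at_lags = {lag: lagged_mutual_information(driver, response, lag)
              for lag in range(5)}
```

    The lag at which the mutual information peaks identifies the response time of the system; conditioning (CMI) and directional transfer entropy refine this to remove confounded drivers, as in the Vsw/nsw example above.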

  19. Comparing integral and incidental emotions: Testing insights from emotions as social information theory and attribution theory.

    Science.gov (United States)

    Hillebrandt, Annika; Barclay, Laurie J

    2017-05-01

    Studies have indicated that observers can infer information about others' behavioral intentions from others' emotions and use this information in making their own decisions. Integrating emotions as social information (EASI) theory and attribution theory, we argue that the interpersonal effects of emotions are not only influenced by the type of discrete emotion (e.g., anger vs. happiness) but also by the target of the emotion (i.e., how the emotion relates to the situation). We compare the interpersonal effects of emotions that are integral (i.e., related to the situation) versus incidental (i.e., lacking a clear target in the situation) in a negotiation context. Results from 4 studies support our general argument that the target of an opponent's emotion influences the degree to which observers attribute the emotion to their own behavior. These attributions influence observers' inferences regarding the perceived threat of an impasse or cooperativeness of an opponent, which can motivate observers to strategically adjust their behavior. Specifically, emotion target influenced concessions for both anger and happiness (Study 1, N = 254), with perceived threat and cooperativeness mediating the effects of anger and happiness, respectively (Study 2, N = 280). Study 3 (N = 314) demonstrated the mediating role of attributions and moderating role of need for closure. Study 4 (N = 193) outlined how observers' need for cognitive closure influences how they attribute incidental anger. We discuss theoretical implications related to the social influence of emotions as well as practical implications related to the impact of personality on negotiators' biases and behaviors. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. Base Information Transport Infrastructure Wired (BITI Wired)

    Science.gov (United States)

    2016-03-01

    2016 Major Automated Information System Annual Report Base Information Transport Infrastructure Wired (BITI Wired) Defense Acquisition Management...Combat Information Transport System program was restructured into two pre-Major Automated Information System (pre-MAIS) components: Information...Major Automated Information System MAIS OE - MAIS Original Estimate MAR – MAIS Annual Report MDA - Milestone Decision Authority MDD - Materiel

  1. Information loss in effective field theory: Entanglement and thermal entropies

    Science.gov (United States)

    Boyanovsky, Daniel

    2018-03-01

    Integrating out high energy degrees of freedom to yield a low energy effective field theory leads to a loss of information with a concomitant increase in entropy. We obtain the effective field theory of a light scalar field interacting with heavy fields after tracing out the heavy degrees of freedom from the time evolved density matrix. The initial density matrix describes the light field in its ground state and the heavy fields in equilibrium at a common temperature T. For T = 0, we obtain the reduced density matrix in a perturbative expansion; it reveals an emergent mixed state as a consequence of the entanglement between light and heavy fields. We obtain the effective action that determines the time evolution of the reduced density matrix for the light field in a nonperturbative Dyson resummation of one-loop correlations of the heavy fields. The von Neumann entanglement entropy associated with the reduced density matrix is obtained for the nonresonant and resonant cases in the asymptotic long time limit. In the nonresonant case the reduced density matrix displays an incipient thermalization, albeit with a wave-vector-, time- and coupling-dependent effective temperature as a consequence of memory of initial conditions. The entanglement entropy is time independent and is the thermal entropy for this effective, nonequilibrium temperature. In the resonant case the light field fully thermalizes with the heavy fields, the reduced density matrix loses memory of the initial conditions and the entanglement entropy becomes the thermal entropy of the light field. We discuss the relation between the entanglement entropy ultraviolet divergences and renormalization.
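As a finite-dimensional toy analog of the entanglement entropy discussed above, one can trace out half of a two-qubit pure state and compute the von Neumann entropy of the reduced density matrix. This is only an illustrative sketch (the paper's setting is field theory); the helper names are made up:

```python
import numpy as np

def partial_trace_qubit(rho, keep=0):
    """Reduced density matrix of one qubit of a two-qubit state (4x4 -> 2x2)."""
    r = rho.reshape(2, 2, 2, 2)   # indices (a, b, a', b') for qubits A, B
    return np.einsum('abcb->ac', r) if keep == 0 else np.einsum('abad->bd', r)

def von_neumann_entropy(rho):
    """S(rho) = -Tr[rho log2 rho], computed from the eigenvalues."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]              # drop numerical zeros
    return float(-(w * np.log2(w)).sum())

# Bell state (|00> + |11>)/sqrt(2): tracing out either qubit leaves a maximally
# mixed state, so the entanglement entropy is exactly 1 bit.
psi = np.zeros(4)
psi[0] = psi[3] = 1 / np.sqrt(2)
rho_A = partial_trace_qubit(np.outer(psi, psi))
```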

  2. Efficient Algorithms for Searching the Minimum Information Partition in Integrated Information Theory

    Science.gov (United States)

    Kitazono, Jun; Kanai, Ryota; Oizumi, Masafumi

    2018-03-01

    The ability to integrate information in the brain is considered to be an essential property for cognition and consciousness. Integrated Information Theory (IIT) hypothesizes that the amount of integrated information ($\Phi$) in the brain is related to the level of consciousness. IIT proposes that to quantify information integration in a system as a whole, integrated information should be measured across the partition of the system at which information loss caused by partitioning is minimized, called the Minimum Information Partition (MIP). The computational cost of exhaustively searching for the MIP grows exponentially with system size, making it difficult to apply IIT to real neural data. It has been previously shown that if a measure of $\Phi$ satisfies a mathematical property, submodularity, the MIP can be found in polynomial time by an optimization algorithm. However, although the first version of $\Phi$ is submodular, the later versions are not. In this study, we empirically explore to what extent the algorithm can be applied to the non-submodular measures of $\Phi$ by evaluating the accuracy of the algorithm in simulated data and real neural data. We find that the algorithm identifies the MIP in a nearly perfect manner even for the non-submodular measures. Our results show that the algorithm allows us to measure $\Phi$ in large systems within a practical amount of time.
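To make the search problem concrete, the sketch below brute-forces the MIP of a small Gaussian system, using the mutual information across the cut as a stand-in for $\Phi$ (the actual IIT measures differ); with submodular measures, Queyranne-style algorithms avoid this exponential enumeration:

```python
import numpy as np
from itertools import combinations

def cut_information(cov, part):
    """Mutual information across a bipartition of a zero-mean Gaussian system:
    I(A;B) = 0.5 * log( det(cov_A) * det(cov_B) / det(cov) )."""
    n = cov.shape[0]
    a = list(part)
    b = [i for i in range(n) if i not in part]
    det = np.linalg.det
    return 0.5 * np.log(det(cov[np.ix_(a, a)]) * det(cov[np.ix_(b, b)]) / det(cov))

def minimum_information_partition(cov):
    """Exhaustive MIP search over all 2**(n-1) - 1 bipartitions.
    Node 0 is fixed in part A so each cut is enumerated once; the cost
    grows exponentially with n, which is the bottleneck discussed above."""
    n = cov.shape[0]
    best_phi, best_part = np.inf, None
    for k in range(0, n - 1):                      # size of A beyond node 0
        for extra in combinations(range(1, n), k):
            part = (0,) + extra
            phi = cut_information(cov, part)
            if phi < best_phi:
                best_phi, best_part = phi, part
    return best_part, best_phi
```

On a covariance with two weakly coupled correlated pairs, the search returns the cut separating the pairs, i.e. the partition across which the least information is lost.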

  3. Quantum Information Biology: From Theory of Open Quantum Systems to Adaptive Dynamics

    Science.gov (United States)

    Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu; Yamato, Ichiro

    This chapter reviews quantum(-like) information biology (QIB). Here biology is treated widely as even covering cognition and its derivatives: psychology and decision making, sociology, and behavioral economics and finances. QIB provides an integrative description of information processing by bio-systems at all scales of life: from proteins and cells to cognition, ecological and social systems. Mathematically QIB is based on the theory of adaptive quantum systems (which covers also open quantum systems). Ideologically QIB is based on the quantum-like (QL) paradigm: complex bio-systems process information in accordance with the laws of quantum information and probability. This paradigm is supported by plenty of statistical bio-data collected at all bio-scales. QIB reflects the two fundamental principles: a) adaptivity; and, b) openness (bio-systems are fundamentally open). In addition, quantum adaptive dynamics provides the most general possible mathematical representation of these principles.

  4. A prediction method based on grey system theory in equipment condition based maintenance

    International Nuclear Information System (INIS)

    Yan, Shengyuan; Yan, Shengyuan; Zhang, Hongguo; Zhang, Zhijian; Peng, Minjun; Yang, Ming

    2007-01-01

    Grey prediction is a modeling method based on historical or present, known or indefinite information, which can be used to forecast the development of the eigenvalues of the targeted equipment system and to set up the model using limited information. In this paper, the postulates of grey system theory, which include grey generating, the sorts of grey generating and the grey forecasting model, are introduced first. The concrete application process, which includes grey prediction modeling, grey prediction, error calculation, and the equal dimension and new information approach, is introduced next. A so-called 'Equal Dimension and New Information' (EDNI) technique from grey system theory is adopted in an application case, aiming at improving the accuracy of prediction without increasing the amount of calculation by replacing old data with new ones. The proposed method offers an effective new way to handle the growth of eigenvalue data in equal-distance, short-time-interval and real-time prediction. The method was verified by the vibration prediction of an induced draft fan of a boiler of the Yantai Power Station in China, and the results show that the proposed method based on grey system theory is simple and provides high prediction accuracy. It is therefore useful and significant for the control and management of production safety. (authors)
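The grey forecasting workflow described above can be sketched with the standard GM(1,1) model plus a fixed-length rolling window for the 'Equal Dimension and New Information' idea. This is a generic textbook formulation of GM(1,1), assuming nothing about the paper's actual implementation:

```python
import numpy as np

def gm11_fit(x0):
    """Fit the GM(1,1) grey model to a short positive series x0."""
    x1 = np.cumsum(x0)                            # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                 # background values
    B = np.column_stack([-z1, np.ones(len(z1))])  # grey differential eq.: x0_k = -a*z1_k + b
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    return a, b

def gm11_forecast(x0, steps=1):
    """Forecast `steps` values ahead from the fitted whitened equation."""
    a, b = gm11_fit(x0)
    n = len(x0)
    k = np.arange(n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.concatenate([[x1_hat[0]], np.diff(x1_hat)])   # inverse AGO
    return x0_hat[n:]

def edni_forecast(series, window=5):
    """'Equal Dimension, New Information': each new observation replaces the
    oldest one, so the model is refitted on a fixed-length window."""
    preds = []
    for t in range(window, len(series)):
        preds.append(gm11_forecast(series[t - window : t], steps=1)[0])
    return np.array(preds)
```

On a near-exponential trend (the regime GM(1,1) is designed for), the one-step forecast tracks the series closely; the EDNI loop keeps the data dimension constant as new measurements arrive, which is the point made in the abstract.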

  5. Information theory in econophysics: stock market and retirement funds

    Science.gov (United States)

    Vogel, Eugenio; Saravia, G.; Astete, J.; Díaz, J.; Erribarren, R.; Riadi, F.

    2013-03-01

    Information theory can help to recognize magnetic phase transitions, which can be seen as a way to recognize different regimes. This is achieved by means of zippers specifically designed to compact data in a meaningful way, as is the case for the compressor wlzip. In the present contribution we first apply wlzip to the Chilean stock market, interpreting the compression rates for the files storing the minute variation of the IPSA indicator. Agitated days yield poor compression rates while calm days yield high compressibility. We then correlate this behavior to the value of the five retirement funds related to the Chilean economy. It is found that the covariance between the profitability of the retirement funds and the compressibility of the IPSA values of the previous day is high for those funds investing in risky stocks. Surprisingly, there seems to be no great difference among the three riskier funds, contrary to what could be expected from the limitations on the portfolio composition established by the laws that regulate this market.
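The regime-detection idea can be imitated with any general-purpose compressor. wlzip is not assumed to be available here, so the sketch below uses zlib as a stand-in; only the qualitative behavior (calm, repetitive series compress better than agitated ones) carries over:

```python
import zlib
import numpy as np

def compressibility(values, decimals=2):
    """Compression ratio (compressed size / original size) of a numeric series,
    after rounding and encoding as text. Lower ratio = more compressible,
    i.e. a calmer, more regular regime."""
    text = ",".join(f"{v:.{decimals}f}" for v in values).encode()
    return len(zlib.compress(text, 9)) / len(text)

# A calm index series (small fluctuations around a level) versus an
# agitated one (large jumps): the calm series compresses much better.
rng = np.random.default_rng(1)
calm = 4000 + 0.5 * rng.normal(size=500)
agitated = 4000 + 50.0 * rng.normal(size=500)
```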

  6. Information structures in economics studies in the theory of markets with imperfect information

    CERN Document Server

    Nermuth, Manfred

    1982-01-01

    This book is intended as a contribution to the theory of markets with imperfect information. The subject being nearly limitless, only certain selected topics are discussed. These are outlined in the Introduction (Ch. 0). The remainder of the book is divided into three parts. All results of economic significance are contained in Parts II & III. Part I introduces the main tools for the analysis, in particular the concept of an information structure. Although most of the material presented in Part I is not original, it is hoped that the detailed and self-contained exposition will help the reader to understand not only the following pages, but also the existing technical and variegated literature on markets with imperfect information. The mathematical prerequisites needed, but not explained in the text rarely go beyond elementary calculus and probability theory. Whenever more advanced concepts are used, I have made an effort to give an intuitive explanation as well, so that the argument can also be followed o...

  7. Fundamentals of the fuzzy logic-based generalized theory of decisions

    CERN Document Server

    Aliev, Rafik Aziz

    2013-01-01

    Everyday decision making and decision making in complex human-centric systems are characterized by imperfect decision-relevant information. The main drawback of existing decision theories is their incapability to deal with imperfect information and to model vague preferences. Actually, a paradigm of non-numerical probabilities in decision making has a long history and arose also in Keynes’s analysis of uncertainty. There is a need for further generalization – a move to decision theories with perception-based imperfect information described in NL. The languages of new decision models for human-centric systems should not be languages based on binary logic but human-centric computational schemes able to operate on NL-described information. Development of new theories is now possible due to an increased computational power of information processing systems which allows for computations with imperfect information, particularly, imprecise and partially true information, which are much more complex than comput...

  8. Brief Instrumental School-Based Mentoring for Middle School Students: Theory and Impact

    Science.gov (United States)

    McQuillin, Samuel D.; Lyons, Michael D.

    2016-01-01

    This study evaluated the efficacy of an intentionally brief school-based mentoring program. This academic goal-focused mentoring program was developed through a series of iterative randomized controlled trials, and is informed by research in social cognitive theory, cognitive dissonance theory, motivational interviewing, and research in academic…

  9. Informal meeting on recent developments in field theory

    International Nuclear Information System (INIS)

    Anon.

    1977-12-01

    A topical meeting on recent developments in field theory was organized by the International Centre for Theoretical Physics from 21 to 23 November 1977. The publication is a compilation of the abstracts of the lectures given. The major themes of the meeting were the problem of confinement, the quantization of Yang-Mills theories and the topological aspects of field theories in flat and curved spaces.

  10. Informal information for web-based engineering catalogues

    Science.gov (United States)

    Allen, Richard D.; Culley, Stephen J.; Hicks, Ben J.

    2001-10-01

    Success is highly dependent on the ability of a company to efficiently produce optimal designs. In order to achieve this, companies must minimize time to market and possess the ability to make fully informed decisions at the early phase of the design process. Such decisions may include the choice of component and suppliers, as well as cost and maintenance considerations. Computer modeling and electronic catalogues are becoming the preferred medium for the selection and design of mechanical components. In utilizing these techniques, the designer demands the capability to identify, evaluate and select mechanical components both quantitatively and qualitatively. Quantitative decisions generally encompass performance data included in the formal catalogue representation. It is in the area of qualitative decisions that the use of what the authors call 'Informal Information' is of crucial importance. Thus, 'Informal Information' must often be incorporated into the selection process and selection systems. This would enable more informed decisions to be made more quickly, without the need for information retrieval via discussion with colleagues in the design environment. This paper provides an overview of the use of electronic information in the design of mechanical systems, including a discussion of limitations of current technology. The importance of Informal Information is discussed and the requirements for association with web-based electronic catalogues are developed. This system is based on a flexible XML schema and enables the storage, classification and recall of Informal Information packets. Furthermore, a strategy for the inclusion of Informal Information is proposed, and an example case is used to illustrate the benefits.

  11. A short course in quantum information theory. An approach from theoretical physics. 2. ed.

    International Nuclear Information System (INIS)

    Diosi, Lajos

    2011-01-01

    This short and concise primer takes the vantage point of theoretical physics and the unity of physics. It sets out to strip the burgeoning field of quantum information science to its basics by linking it to universal concepts in physics. An extensive lecture rather than a comprehensive textbook, this volume is based on courses delivered over several years to advanced undergraduate and beginning graduate students, but essentially it addresses anyone with a working knowledge of basic quantum physics. Readers will find these lectures a most adequate entry point for theoretical studies in this field. For the second edition, the author has succeeded in adding many new topics while sticking to the conciseness of the overall approach. A new chapter on qubit thermodynamics has been added, while new sections and subsections have been incorporated in various chapters to deal with weak and time-continuous measurements, period-finding quantum algorithms and quantum error corrections. From the reviews of the first edition: ''The best things about this book are its brevity and clarity. In around 100 pages it provides a tutorial introduction to quantum information theory, including problems and solutions... it's worth a look if you want to quickly get up to speed with the language and central concepts of quantum information theory, including the background classical information theory.'' (Craig Savage, Australian Physics, Vol. 44 (2), 2007). (orig.)

  12. Protein Signaling Networks from Single Cell Fluctuations and Information Theory Profiling

    Science.gov (United States)

    Shin, Young Shik; Remacle, F.; Fan, Rong; Hwang, Kiwook; Wei, Wei; Ahmad, Habib; Levine, R.D.; Heath, James R.

    2011-01-01

    Protein signaling networks among cells play critical roles in a host of pathophysiological processes, from inflammation to tumorigenesis. We report on an approach that integrates microfluidic cell handling, in situ protein secretion profiling, and information theory to determine an extracellular protein-signaling network and the role of perturbations. We assayed 12 proteins secreted from human macrophages that were subjected to lipopolysaccharide challenge, which emulates the macrophage-based innate immune responses against Gram-negative bacteria. We characterize the fluctuations in protein secretion of single cells, and of small cell colonies (n = 2, 3,···), as a function of colony size. Measuring the fluctuations permits a validation of the conditions required for the application of a quantitative version of Le Chatelier's principle, as derived using information theory. This principle provides a quantitative prediction of the role of perturbations and allows a characterization of a protein-protein interaction network. PMID:21575571

  13. Ontology-based Information Retrieval

    DEFF Research Database (Denmark)

    Styltsvig, Henrik Bulskov

    In this thesis, we will present methods for introducing ontologies in information retrieval. The main hypothesis is that the inclusion of conceptual knowledge such as ontologies in the information retrieval process can contribute to the solution of major problems currently found in information...... retrieval. This utilization of ontologies has a number of challenges. Our focus is on the use of similarity measures derived from the knowledge about relations between concepts in ontologies, the recognition of semantic information in texts and the mapping of this knowledge into the ontologies in use......, as well as how to fuse together the ideas of ontological similarity and ontological indexing into a realistic information retrieval scenario. To achieve the recognition of semantic knowledge in a text, shallow natural language processing is used during indexing that reveals knowledge to the level of noun...

  14. Estimating security betas using prior information based on firm fundamentals

    NARCIS (Netherlands)

    Cosemans, M.; Frehen, R.; Schotman, P.C.; Bauer, R.

    2010-01-01

    This paper proposes a novel approach for estimating time-varying betas of individual stocks that incorporates prior information based on fundamentals. We shrink the rolling window estimate of beta towards a firm-specific prior that is motivated by asset pricing theory. The prior captures structural

  15. Biological information systems: Evolution as cognition-based information management.

    Science.gov (United States)

    Miller, William B

    2018-05-01

    An alternative biological synthesis is presented that conceptualizes evolutionary biology as an epiphenomenon of integrated self-referential information management. Since all biological information has inherent ambiguity, the systematic assessment of information is required by living organisms to maintain self-identity and homeostatic equipoise in confrontation with environmental challenges. Through their self-referential attachment to information space, cells are the cornerstone of biological action. That individualized assessment of information space permits self-referential, self-organizing niche construction. That deployment of information and its subsequent selection enacted the dominant stable unicellular informational architectures whose biological expressions are the prokaryotic, archaeal, and eukaryotic unicellular forms. Multicellularity represents the collective appraisal of equivocal environmental information through a shared information space. This concerted action can be viewed as systematized information management to improve information quality for the maintenance of preferred homeostatic boundaries among the varied participants. When reiterated in successive scales, this same collaborative exchange of information yields macroscopic organisms as obligatory multicellular holobionts. Cognition-Based Evolution (CBE) upholds that assessment of information precedes biological action, and the deployment of information through integrative self-referential niche construction and natural cellular engineering antecedes selection. Therefore, evolutionary biology can be framed as a complex reciprocating interactome that consists of the assessment, communication, deployment and management of information by self-referential organisms at multiple scales in continuous confrontation with environmental stresses. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Information Theory for Gabor Feature Selection for Face Recognition

    Directory of Open Access Journals (Sweden)

    Shen Linlin

    2006-01-01

    Full Text Available A discriminative and robust feature—kernel enhanced informative Gabor feature—is proposed in this paper for face recognition. Mutual information is applied to select a set of informative and nonredundant Gabor features, which are then further enhanced by kernel methods for recognition. Compared with one of the top performing methods in the 2004 Face Verification Competition (FVC2004), our methods demonstrate a clear advantage over existing methods in accuracy, computation efficiency, and memory cost. The proposed method has been fully tested on the FERET database using the FERET evaluation protocol. Significant improvements on three of the test data sets are observed. Compared with the classical Gabor wavelet-based approaches using a huge number of features, our method requires less than 4 milliseconds to retrieve a few hundred features. Due to the substantially reduced feature dimension, only 4 seconds are required to recognize 200 face images. The paper also unified different Gabor filter definitions and proposed a training sample generation algorithm to reduce the effects caused by unbalanced number of samples available in different classes.
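A bare-bones version of mutual-information feature ranking is sketched below. It scores each feature only by its MI with the class label; the paper's method additionally removes redundant Gabor features and applies kernel enhancement, which this sketch omits:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram estimate of I(X;Y) in bits between a continuous feature x
    and discrete class labels y."""
    xd = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    classes = {c: i for i, c in enumerate(np.unique(y))}
    joint = np.zeros((bins, len(classes)))
    for xi, yi in zip(xd, y):
        joint[xi, classes[yi]] += 1
    joint /= joint.sum()
    px = joint.sum(1, keepdims=True)
    py = joint.sum(0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

def select_informative_features(X, y, k=2):
    """Return indices of the k features with the highest MI with the label.
    A relevance-only sketch: no redundancy removal between selected features."""
    scores = [mutual_information(X[:, j], y) for j in range(X.shape[1])]
    return np.argsort(scores)[::-1][:k]
```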

  17. Revealing Relationships among Relevant Climate Variables with Information Theory

    Science.gov (United States)

    Knuth, Kevin H.; Golera, Anthony; Curry, Charles T.; Huyser, Karen A.; Wheeler, Kevin R.; Rossow, William B.

    2005-01-01

    The primary objective of the NASA Earth-Sun Exploration Technology Office is to understand the observed Earth climate variability, thus enabling the determination and prediction of the climate's response to both natural and human-induced forcing. We are currently developing a suite of computational tools that will allow researchers to calculate, from data, a variety of information-theoretic quantities such as mutual information, which can be used to identify relationships among climate variables, and transfer entropy, which indicates the possibility of causal interactions. Our tools estimate these quantities along with their associated error bars, the latter of which is critical for describing the degree of uncertainty in the estimates. This work is based upon optimal binning techniques that we have developed for piecewise-constant, histogram-style models of the underlying density functions. Two useful side benefits have already been discovered. The first allows a researcher to determine whether there exist sufficient data to estimate the underlying probability density. The second permits one to determine an acceptable degree of round-off when compressing data for efficient transfer and storage. We also demonstrate how mutual information and transfer entropy can be applied so as to allow researchers not only to identify relations among climate variables, but also to characterize and quantify their possible causal interactions.
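A minimal version of the mutual-information-with-error-bars computation is sketched below, using a fixed 2-D histogram and a bootstrap standard error in place of the optimal-binning machinery the abstract describes; names and parameters are illustrative assumptions:

```python
import numpy as np

def hist_mutual_information(x, y, bins=10):
    """Histogram (plug-in) estimate of I(X;Y) in bits for two continuous series."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p = joint / joint.sum()
    px = p.sum(1, keepdims=True)
    py = p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

def mi_with_error_bar(x, y, bins=10, n_boot=200, seed=0):
    """MI point estimate plus a bootstrap standard error, a simple stand-in
    for the uncertainty quantification discussed in the abstract."""
    rng = np.random.default_rng(seed)
    n = len(x)
    boots = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)                    # resample with replacement
        boots.append(hist_mutual_information(x[idx], y[idx], bins))
    return hist_mutual_information(x, y, bins), float(np.std(boots))
```

Note that plug-in histogram MI is biased upward for finite samples, which is one reason principled binning and error bars matter when deciding whether two climate variables are genuinely related.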

  18. Information Theory for Gabor Feature Selection for Face Recognition

    Science.gov (United States)

    Shen, Linlin; Bai, Li

    2006-12-01

    A discriminative and robust feature—kernel enhanced informative Gabor feature—is proposed in this paper for face recognition. Mutual information is applied to select a set of informative and nonredundant Gabor features, which are then further enhanced by kernel methods for recognition. Compared with one of the top performing methods in the 2004 Face Verification Competition (FVC2004), our methods demonstrate a clear advantage over existing methods in accuracy, computation efficiency, and memory cost. The proposed method has been fully tested on the FERET database using the FERET evaluation protocol. Significant improvements on three of the test data sets are observed. Compared with the classical Gabor wavelet-based approaches using a huge number of features, our method requires less than 4 milliseconds to retrieve a few hundred features. Due to the substantially reduced feature dimension, only 4 seconds are required to recognize 200 face images. The paper also unified different Gabor filter definitions and proposed a training sample generation algorithm to reduce the effects caused by unbalanced number of samples available in different classes.

  19. A theory-informed, process-oriented Resident Scholarship Program.

    Science.gov (United States)

    Thammasitboon, Satid; Darby, John B; Hair, Amy B; Rose, Karen M; Ward, Mark A; Turner, Teri L; Balmer, Dorene F

    2016-01-01

    The Accreditation Council for Graduate Medical Education requires residency programs to provide curricula for residents to engage in scholarly activities but does not specify particular guidelines for instruction. We propose a Resident Scholarship Program that is framed by the self-determination theory (SDT) and emphasizes the process of scholarly activity versus a scholarly product. The authors report on their longitudinal Resident Scholarship Program, which aimed to support psychological needs central to SDT: autonomy, competence, and relatedness. By addressing those needs in program aims and program components, the program may foster residents' intrinsic motivation to learn and to engage in scholarly activity. To this end, residents' engagement in scholarly processes, and changes in perceived autonomy, competence, and relatedness were assessed. Residents engaged in a range of scholarly projects and expressed positive regard for the program. Compared to before residency, residents felt more confident in the process of scholarly activity, as indicated by increases in perceived autonomy, competence, and relatedness. Scholarly products were accomplished in return for a focus on scholarly process. Based on our experience, and in line with the SDT, supporting residents' autonomy, competence, and relatedness through a process-oriented scholarship program may foster the curiosity, inquisitiveness, and internal motivation to learn that drives scholarly activity and ultimately the production of scholarly products.

  20. A theory-informed, process-oriented Resident Scholarship Program

    Science.gov (United States)

    Thammasitboon, Satid; Darby, John B.; Hair, Amy B.; Rose, Karen M.; Ward, Mark A.; Turner, Teri L.; Balmer, Dorene F.

    2016-01-01

    Background The Accreditation Council for Graduate Medical Education requires residency programs to provide curricula for residents to engage in scholarly activities but does not specify particular guidelines for instruction. We propose a Resident Scholarship Program that is framed by the self-determination theory (SDT) and emphasizes the process of scholarly activity versus a scholarly product. Methods The authors report on their longitudinal Resident Scholarship Program, which aimed to support psychological needs central to SDT: autonomy, competence, and relatedness. By addressing those needs in program aims and program components, the program may foster residents’ intrinsic motivation to learn and to engage in scholarly activity. To this end, residents’ engagement in scholarly processes, and changes in perceived autonomy, competence, and relatedness were assessed. Results Residents engaged in a range of scholarly projects and expressed positive regard for the program. Compared to before residency, residents felt more confident in the process of scholarly activity, as indicated by increases in perceived autonomy, competence, and relatedness. Scholarly products were accomplished in return for a focus on scholarly process. Conclusions Based on our experience, and in line with the SDT, supporting residents’ autonomy, competence, and relatedness through a process-oriented scholarship program may foster the curiosity, inquisitiveness, and internal motivation to learn that drives scholarly activity and ultimately the production of scholarly products. PMID:27306995

  1. Trajectory Shape Analysis and Anomaly Detection Utilizing Information Theory Tools

    Directory of Open Access Journals (Sweden)

    Yuejun Guo

    2017-06-01

    Full Text Available In this paper, we propose to improve trajectory shape analysis by explicitly considering the speed attribute of trajectory data, and to successfully achieve anomaly detection. The shape of object motion trajectory is modeled using Kernel Density Estimation (KDE), making use of both the angle attribute of the trajectory and the speed of the moving object. An unsupervised clustering algorithm, based on the Information Bottleneck (IB) method, is employed for trajectory learning to obtain an adaptive number of trajectory clusters through maximizing the Mutual Information (MI) between the clustering result and a feature set of the trajectory data. Furthermore, we propose to effectively enhance the performance of IB by taking into account the clustering quality in each iteration of the clustering procedure. The trajectories are determined as either abnormal (infrequently observed) or normal by a measure based on Shannon entropy. Extensive tests on real-world and synthetic data show that the proposed technique behaves very well and outperforms the state-of-the-art methods.
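The angle-plus-speed shape model can be sketched with a small hand-rolled Gaussian KDE: fit a density to per-step (angle, speed) features of normal trajectories and score new trajectories by likelihood. The IB clustering stage is omitted, and all names and the bandwidth are illustrative assumptions:

```python
import numpy as np

def trajectory_features(traj):
    """Per-step heading angle and speed of a trajectory given as (n, 2) points."""
    d = np.diff(traj, axis=0)
    speed = np.linalg.norm(d, axis=1)
    angle = np.arctan2(d[:, 1], d[:, 0])
    return np.column_stack([angle, speed])

def kde_log_likelihood(points, data, bandwidth=0.3):
    """Mean log-likelihood of `points` under a 2-D Gaussian KDE fitted to `data`."""
    diff = points[:, None, :] - data[None, :, :]
    k = np.exp(-0.5 * (diff / bandwidth) ** 2).prod(-1)
    dens = k.mean(1) / (2 * np.pi * bandwidth ** 2)   # product-kernel normalization
    return np.log(dens + 1e-300).mean()

def anomaly_score(traj, normal_trajs, bandwidth=0.3):
    """Low likelihood under the KDE of normal motion = high anomaly score."""
    data = np.vstack([trajectory_features(t) for t in normal_trajs])
    return -kde_log_likelihood(trajectory_features(traj), data, bandwidth)
```

A trajectory whose angle/speed profile is rarely observed in the normal set scores far higher than a typical one, which mirrors the "infrequently observed = abnormal" criterion above.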

  2. Quantum information processing in the radical-pair mechanism: Haberkorn's theory violates the Ozawa entropy bound

    Science.gov (United States)

    Mouloudakis, K.; Kominis, I. K.

    2017-02-01

    Radical-ion-pair reactions, central for understanding the avian magnetic compass and spin transport in photosynthetic reaction centers, were recently shown to be a fruitful paradigm of the new synthesis of quantum information science with biological processes. We show here that the master equation so far constituting the theoretical foundation of spin chemistry violates fundamental bounds for the entropy of quantum systems, in particular the Ozawa bound. In contrast, a recently developed theory based on quantum measurements, quantum coherence measures, and quantum retrodiction, thus exemplifying the paradigm of quantum biology, satisfies the Ozawa bound as well as the Lanford-Robinson bound on information extraction. By considering Groenewold's information, the quantum information extracted during the reaction, we reproduce the known and unravel other magnetic-field effects not conveyed by reaction yields.

  4. Local versus nonlocal information in quantum-information theory: Formalism and phenomena

    International Nuclear Information System (INIS)

    Horodecki, Michal; Horodecki, Ryszard; Synak-Radtke, Barbara; Horodecki, Pawel; Oppenheim, Jonathan; Sen, Aditi; Sen, Ujjwal

    2005-01-01

    In spite of many results in quantum information theory, the complex nature of compound systems is far from clear. In general the information is a mixture of local and nonlocal ('quantum') information. It is important from both pragmatic and theoretical points of view to know the relationships between the two components. To make this point more clear, we develop and investigate the quantum-information processing paradigm in which parties sharing a multipartite state distill local information. The amount of information which is lost because the parties must use a classical communication channel is the deficit. This scheme can be viewed as complementary to the notion of distilling entanglement. After reviewing the paradigm in detail, we show that the upper bound for the deficit is given by the relative entropy distance to so-called pseudoclassically correlated states; the lower bound is the relative entropy of entanglement. This implies, in particular, that any entangled state is informationally nonlocal - i.e., has nonzero deficit. We also apply the paradigm to defining the thermodynamical cost of erasing entanglement. We show the cost is bounded from below by relative entropy of entanglement. We demonstrate the existence of several other nonlocal phenomena which can be found using the paradigm of local information. For example, we prove the existence of a form of nonlocality without entanglement and with distinguishability. We analyze the deficit for several classes of multipartite pure states and obtain that in contrast to the GHZ state, the Aharonov state is extremely nonlocal. We also show that there do not exist states for which the deficit is strictly equal to the whole informational content (bound local information). We discuss the relation of the paradigm with measures of classical correlations introduced earlier. It is also proved that in the one-way scenario, the deficit is additive for Bell diagonal states. We then discuss complementary features of

  5. Forewarning model for water pollution risk based on Bayes theory.

    Science.gov (United States)

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

In order to reduce losses caused by water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. The model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen the index systems. A hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that the samples' features better reflect and represent the population. The forewarning level is judged by the maximum probability rule, and local conditions are then considered when proposing management strategies intended to downgrade heavy warnings. This study takes Taihu Basin as an example. After application and verification of the forewarning model for water pollution risk against actual and simulated data from 2000 to 2009, the forewarning level in 2010 is given as a severe warning, which coincides well with the logistic curve. It is shown that the model is theoretically rigorous and methodologically flexible, gives reasonable results with a simple structure, and has strong logical superiority and regional adaptability, providing a new way of warning of water pollution risk.
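The core update described above, posterior proportional to prior times likelihood followed by the maximum-probability rule, can be sketched as follows (the warning levels, prior, and likelihood values are hypothetical illustrations, not the paper's data):

```python
def bayes_update(prior, likelihood):
    """Posterior over discrete warning levels: normalize prior[i] * likelihood[i]."""
    joint = [p * l for p, l in zip(prior, likelihood)]
    z = sum(joint)
    return [j / z for j in joint]

levels = ["light", "moderate", "heavy", "severe"]
prior = [0.4, 0.3, 0.2, 0.1]            # hypothetical historical frequencies
likelihood = [0.05, 0.15, 0.30, 0.50]   # hypothetical P(observed indexes | level)

posterior = bayes_update(prior, likelihood)
# Maximum probability rule: pick the level with the largest posterior mass.
warning = levels[posterior.index(max(posterior))]
```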

  6. Informed Systems: Enabling Collaborative Evidence Based Organizational Learning

    Directory of Open Access Journals (Sweden)

    Mary M. Somerville

    2015-12-01

Objective – In response to unrelenting disruptions in academic publishing and higher education ecosystems, the Informed Systems approach supports evidence based professional activities to make decisions and take actions. This conceptual paper presents two core models, Informed Systems Leadership Model and Collaborative Evidence-Based Information Process Model, whereby co-workers learn to make informed decisions by identifying the decisions to be made and the information required for those decisions. This is accomplished through collaborative design and iterative evaluation of workplace systems, relationships, and practices. Over time, increasingly effective and efficient structures and processes for using information to learn further organizational renewal and advance nimble responsiveness amidst dynamically changing circumstances. Methods – The integrated Informed Systems approach to fostering persistent workplace inquiry has its genesis in three theories that together activate and enable robust information usage and organizational learning. The information- and learning-intensive theories of Peter Checkland in England, which advance systems design, stimulate participants’ appreciation during the design process of the potential for using information to learn. Within a co-designed environment, intentional social practices continue workplace learning, described by Christine Bruce in Australia as informed learning enacted through information experiences. In addition, in Japan, Ikujiro Nonaka’s theories foster information exchange processes and knowledge creation activities within and across organizational units. In combination, these theories promote the kind of learning made possible through evolving and transferable capacity to use information to learn through design and usage of collaborative communication systems with associated professional practices. Informed Systems therein draws from three antecedent theories to create an original

  7. Rule-based Information Integration

    NARCIS (Netherlands)

    de Keijzer, Ander; van Keulen, Maurice

    2005-01-01

    In this report, we show the process of information integration. We specifically discuss the language used for integration. We show that integration consists of two phases, the schema mapping phase and the data integration phase. We formally define transformation rules, conversion, evolution and

  8. Coherent versus Measurement Feedback: Linear Systems Theory for Quantum Information

    Directory of Open Access Journals (Sweden)

    Naoki Yamamoto

    2014-11-01

To control a quantum system via feedback, we generally have two options in choosing a control scheme. One is coherent feedback, which feeds the output field of the system, through a fully quantum device, back to manipulate the system without involving any measurement process. The other is measurement-based feedback, which measures the output field and performs a real-time manipulation on the system based on the measurement results. Both schemes have advantages and disadvantages, depending on the system and the control goal; hence, their comparison in several situations is important. This paper considers a general open linear quantum system with the following specific control goals: backaction evasion, generation of a quantum nondemolition variable, and generation of a decoherence-free subsystem, all of which have important roles in quantum information science. Some no-go theorems are proven, clarifying that those goals cannot be achieved by any measurement-based feedback control. On the other hand, it is shown that for each control goal there exists a coherent feedback controller accomplishing the task. The key idea behind all the results is a system-theoretic characterization of the above three notions in terms of controllability and observability properties or transfer functions of linear systems, consistent with their standard definitions.

  9. FUSION SEGMENTATION METHOD BASED ON FUZZY THEORY FOR COLOR IMAGES

    Directory of Open Access Journals (Sweden)

    J. Zhao

    2017-09-01

The image segmentation method based on a two-dimensional histogram segments the image according to thresholds on the intensity of the target pixel and the average intensity of its neighborhood. This method is essentially a hard-decision method. Due to the uncertainty in labeling pixels around the threshold, a hard-decision method can easily produce wrong segmentation results. Therefore, a fusion segmentation method based on fuzzy theory is proposed in this paper. We use membership functions to model the uncertainties on each color channel of the color image. Then, we segment the color image according to fuzzy reasoning. The experimental results show that the proposed method obtains better segmentation results on both natural scene images and optical remote sensing images compared with the traditional thresholding method. The fusion method in this paper can provide new ideas for information extraction from optical remote sensing images and polarization SAR images.
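The fuzzy alternative to a hard threshold can be illustrated with a minimal sketch. The ramp membership function and average-based channel fusion below are our own illustrative assumptions; the paper's actual membership functions and fuzzy reasoning rules are not specified here:

```python
def membership(intensity, threshold, width=20.0):
    """Soft 'object' membership: instead of a hard 0/1 cut at the threshold,
    ramp linearly from 0 to 1 across [threshold - width, threshold + width]."""
    x = (intensity - (threshold - width)) / (2.0 * width)
    return min(1.0, max(0.0, x))

def fuse_label(pixel_rgb, thresholds):
    """Fuse per-channel memberships by averaging; label 'object' if > 0.5."""
    mu = [membership(v, t) for v, t in zip(pixel_rgb, thresholds)]
    return sum(mu) / len(mu) > 0.5

# Hypothetical per-channel thresholds for an 8-bit RGB image.
is_object = fuse_label((200, 180, 90), (128, 128, 128))
```

A pixel near the threshold on one channel no longer flips the decision by itself; its partial memberships on the other channels are weighed in.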

  10. Integration of Information Literacy into the Curriculum: Constructive Alignment from Theory into Practice

    Directory of Open Access Journals (Sweden)

    Claes Dahlqvist

    2016-12-01

Librarian-teacher cooperation is essential for the integration of information literacy into course syllabi. Therefore, a common theoretical and methodological platform is needed. As librarians at Kristianstad University we have had the opportunity to develop such a platform when teaching information literacy in a basic course for teachers in higher education pedagogy. Information literacy is taught in context with academic writing, distance learning and teaching, and development of course syllabi. Constructive Alignment in Theory: We used constructive alignment in designing our part of the course. John Biggs’ ideas tell us that assessment tasks (ATs) should be aligned to what is intended to be learned. Intended learning outcomes (ILOs) specify teaching/learning activities (TLAs) based on the content of learning. TLAs should be designed in ways that enable students to construct knowledge from their own experience. The ILOs for the course are to have arguments for the role of information literacy in higher education and ideas of implementing them in TLAs. The content of learning is for example the concept of information literacy, theoretical perspectives and constructive alignment for integration in course syllabi. TLAs are written pre-lecture reflections on the concept of information literacy, used as a starting point for the three-hour seminar. Learning reflections are written afterwards. The AT is to revise a syllabus (preferably using constructive alignment) for a course the teacher is responsible for, where information literacy must be integrated with the other parts and topics of the course. Constructive Alignment in Practice: Using constructive alignment has taught us that this model serves well as the foundation of the theoretical and methodological platform for librarian-teacher cooperation when integrating information literacy in course syllabi. It contains all important aspects of the integration of information literacy in course

  11. Towards a conceptual framework for protection of personal information from the perspective of activity theory

    Directory of Open Access Journals (Sweden)

    Tiko Iyamu

    2017-11-01

Background: Personal information about individuals is stored by organisations, including government agencies. The information is intended to be kept confidential and strictly used for its primary and legitimate purposes. However, that has not always been the case in many South African government agencies and departments. In recent years, personal information about individuals and groups has been illegally leaked for other motives, some of them detrimental. Even though there exists legislation, the Protection of Personal Information (POPI) Act, which prohibits such malpractices, illegal leaks of information have, however, not stopped or reduced. In addition to the adoption of the POPI Act, a more stringent approach is therefore needed in order to improve sanity in the use and management of personal information. Otherwise, the detriment that such malpractices cause to many citizens can only increase. Objectives: The objectives of this study were twofold: (1) to examine and understand the activities involved in personal information leaks, including why and how information is leaked; and (2) to develop a conceptual framework, which includes identification of the factors that influence information leaks and breaches in an environment. Method: Qualitative research methods were followed in achieving the objectives of the study. Within the qualitative methods, documents including existing literature were gathered. Activity theory was employed as a lens to guide the analysis. Result: From the analysis, four critical factors were found to influence information leaks and breaches in organisations: (1) information and its value, (2) the roles of society and its compliance with information protection, (3) government and its laws relating to information protection and (4) the need for standardisation of information usage and management within a community. Based on these factors, a conceptual framework was developed.

  12. An information theory model for dissipation in open quantum systems

    Science.gov (United States)

    Rogers, David M.

    2017-08-01

    This work presents a general model for open quantum systems using an information game along the lines of Jaynes’ original work. It is shown how an energy based reweighting of propagators provides a novel moment generating function at each time point in the process. Derivatives of the generating function give moments of the time derivatives of observables. Aside from the mathematically helpful properties, the ansatz reproduces key physics of stochastic quantum processes. At high temperature, the average density matrix follows the Caldeira-Leggett equation. Its associated Langevin equation clearly demonstrates the emergence of dissipation and decoherence time scales, as well as an additional diffusion due to quantum confinement. A consistent interpretation of these results is that decoherence and wavefunction collapse during measurement are directly related to the degree of environmental noise, and thus occur because of subjective uncertainty of an observer.
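The link between the generating function and moments invoked above is, schematically, the standard moment-generating-function identity (notation ours, for a generic observable $A$, not the paper's specific construction):

```latex
M_t(\lambda) = \left\langle e^{\lambda A} \right\rangle_t ,
\qquad
\left. \frac{\partial^n M_t(\lambda)}{\partial \lambda^n} \right|_{\lambda = 0}
= \left\langle A^n \right\rangle_t ,
```

so that successive $\lambda$-derivatives of the reweighted propagator's generating function at $\lambda = 0$ yield the moments of the observable (here, of time derivatives of observables) at each time point.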

  13. Quantum entanglement in non-local games, graph parameters and zero-error information theory

    NARCIS (Netherlands)

    Scarpa, G.

    2013-01-01

    We study quantum entanglement and some of its applications in graph theory and zero-error information theory. In Chapter 1 we introduce entanglement and other fundamental concepts of quantum theory. In Chapter 2 we address the question of how much quantum correlations generated by entanglement can

  14. Finding Commonalities: Social Information Processing and Domain Theory in the Study of Aggression

    Science.gov (United States)

    Nucci, Larry

    2004-01-01

    The Arsenio and Lemerise (this issue) proposal integrating social information processing (SIP) and domain theory to study children's aggression is evaluated from a domain theory perspective. Basic tenets of domain theory rendering it compatible with SIP are discussed as well as points of divergence. Focus is directed to the proposition that…

  15. An approach to higher dimensional theories based on lattice gauge theory

    International Nuclear Information System (INIS)

    Murata, M.; So, H.

    2004-01-01

A higher dimensional lattice space can be decomposed into a number of four-dimensional lattices called layers. The higher dimensional gauge theory on the lattice can then be interpreted as four-dimensional gauge theories on the multi-layer, with interactions between neighboring layers. We propose a new possibility for realizing the continuum limit of a five-dimensional theory based on the properties of the phase diagram

  16. Robust optimization based upon statistical theory.

    Science.gov (United States)

    Sobotta, B; Söhn, M; Alber, M

    2010-08-01

    Organ movement is still the biggest challenge in cancer treatment despite advances in online imaging. Due to the resulting geometric uncertainties, the delivered dose cannot be predicted precisely at treatment planning time. Consequently, all associated dose metrics (e.g., EUD and maxDose) are random variables with a patient-specific probability distribution. The method that the authors propose makes these distributions the basis of the optimization and evaluation process. The authors start from a model of motion derived from patient-specific imaging. On a multitude of geometry instances sampled from this model, a dose metric is evaluated. The resulting pdf of this dose metric is termed outcome distribution. The approach optimizes the shape of the outcome distribution based on its mean and variance. This is in contrast to the conventional optimization of a nominal value (e.g., PTV EUD) computed on a single geometry instance. The mean and variance allow for an estimate of the expected treatment outcome along with the residual uncertainty. Besides being applicable to the target, the proposed method also seamlessly includes the organs at risk (OARs). The likelihood that a given value of a metric is reached in the treatment is predicted quantitatively. This information reveals potential hazards that may occur during the course of the treatment, thus helping the expert to find the right balance between the risk of insufficient normal tissue sparing and the risk of insufficient tumor control. By feeding this information to the optimizer, outcome distributions can be obtained where the probability of exceeding a given OAR maximum and that of falling short of a given target goal can be minimized simultaneously. The method is applicable to any source of residual motion uncertainty in treatment delivery. Any model that quantifies organ movement and deformation in terms of probability distributions can be used as basis for the algorithm. Thus, it can generate dose
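The central step, evaluating a dose metric on many geometry instances sampled from a patient-specific motion model to obtain the outcome distribution and its mean and variance, can be sketched as follows (the toy dose metric and the Gaussian organ-shift model are our own illustrative assumptions, not the authors' clinical models):

```python
import random
import statistics

def outcome_distribution(dose_metric, sample_geometry, n=1000, seed=0):
    """Evaluate a dose metric on n sampled geometry instances.
    The returned (mean, variance) summarize the resulting outcome
    distribution, i.e. the expected treatment outcome and its residual
    uncertainty."""
    rng = random.Random(seed)
    values = [dose_metric(sample_geometry(rng)) for _ in range(n)]
    return statistics.mean(values), statistics.pvariance(values)

# Toy model: organ shift ~ N(0 mm, 3 mm); the metric degrades with |shift|.
mean_dose, var_dose = outcome_distribution(
    dose_metric=lambda shift: 60.0 - 0.5 * abs(shift),
    sample_geometry=lambda rng: rng.gauss(0.0, 3.0),
)
```

An optimizer can then trade off the mean against the variance of such distributions for target and organs at risk, rather than optimizing a nominal value computed on a single geometry.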

  17. Continuing Bonds in Bereavement: An Attachment Theory Based Perspective

    Science.gov (United States)

    Field, Nigel P.; Gao, Beryl; Paderna, Lisa

    2005-01-01

    An attachment theory based perspective on the continuing bond to the deceased (CB) is proposed. The value of attachment theory in specifying the normative course of CB expression and in identifying adaptive versus maladaptive variants of CB expression based on their deviation from this normative course is outlined. The role of individual…

  18. Workplace-based assessment: raters' performance theories and constructs.

    Science.gov (United States)

    Govaerts, M J B; Van de Wiel, M W J; Schuwirth, L W T; Van der Vleuten, C P M; Muijtjens, A M M

    2013-08-01

    Weaknesses in the nature of rater judgments are generally considered to compromise the utility of workplace-based assessment (WBA). In order to gain insight into the underpinnings of rater behaviours, we investigated how raters form impressions of and make judgments on trainee performance. Using theoretical frameworks of social cognition and person perception, we explored raters' implicit performance theories, use of task-specific performance schemas and the formation of person schemas during WBA. We used think-aloud procedures and verbal protocol analysis to investigate schema-based processing by experienced (N = 18) and inexperienced (N = 16) raters (supervisor-raters in general practice residency training). Qualitative data analysis was used to explore schema content and usage. We quantitatively assessed rater idiosyncrasy in the use of performance schemas and we investigated effects of rater expertise on the use of (task-specific) performance schemas. Raters used different schemas in judging trainee performance. We developed a normative performance theory comprising seventeen inter-related performance dimensions. Levels of rater idiosyncrasy were substantial and unrelated to rater expertise. Experienced raters made significantly more use of task-specific performance schemas compared to inexperienced raters, suggesting more differentiated performance schemas in experienced raters. Most raters started to develop person schemas the moment they began to observe trainee performance. The findings further our understanding of processes underpinning judgment and decision making in WBA. Raters make and justify judgments based on personal theories and performance constructs. Raters' information processing seems to be affected by differences in rater expertise. The results of this study can help to improve rater training, the design of assessment instruments and decision making in WBA.

  19. Recursive renormalization group theory based subgrid modeling

    Science.gov (United States)

    Zhou, YE

    1991-01-01

    Advancing the knowledge and understanding of turbulence theory is addressed. Specific problems to be addressed will include studies of subgrid models to understand the effects of unresolved small scale dynamics on the large scale motion which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulation.

  20. Observational attachment theory-based parenting measures predict children's attachment narratives independently from social learning theory-based measures.

    Science.gov (United States)

    Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen

    2014-01-01

Conceptually and methodologically distinct models exist for assessing quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 5-6-year-old children from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently-rated attachment narrative representations and peer nominations. Results indicated that attachment theory-based and social learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from integration of different theoretical and methodological paradigms.

  1. Information Theory for Correlation Analysis and Estimation of Uncertainty Reduction in Maps and Models

    Directory of Open Access Journals (Sweden)

    J. Florian Wellmann

    2013-04-01

The quantification and analysis of uncertainties is important in all cases where maps and models of uncertain properties are the basis for further decisions. Once these uncertainties are identified, the logical next step is to determine how they can be reduced. Information theory provides a framework for the analysis of spatial uncertainties when different subregions are considered as random variables. In the work presented here, joint entropy, conditional entropy, and mutual information are applied for a detailed analysis of spatial uncertainty correlations. The aim is to determine (i) which areas in a spatial analysis share information, and (ii) where, and by how much, additional information would reduce uncertainties. As an illustration, a typical geological example is evaluated: the case of a subsurface layer with uncertain depth, shape and thickness. Mutual information and multivariate conditional entropies are determined based on multiple simulated model realisations. Even for this simple case, the measures not only provide a clear picture of uncertainties and their correlations but also give detailed insights into the potential reduction of uncertainties at each position, given additional information at a different location. The methods are directly applicable to other types of spatial uncertainty evaluations, especially where multiple realisations of a model simulation are analysed. In summary, the application of information theoretic measures opens up the path to a better understanding of spatial uncertainties, and their relationship to information and prior knowledge, for cases where uncertain property distributions are spatially analysed and visualised in maps and models.
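The mutual information between two subregions, estimated from multiple model realisations treated as paired samples of two discrete variables (e.g. the geological unit observed at two map locations in each realisation), can be computed as in this minimal sketch (illustrative, not the authors' implementation):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits from paired samples of two discrete variables.

    Each (xs[k], ys[k]) pair is one model realisation; the empirical
    marginal and joint frequencies approximate p(x), p(y), p(x, y)."""
    n = len(xs)
    px = Counter(xs)
    py = Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum(
        (c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in pxy.items()
    )

# Perfectly correlated locations share 1 bit; independent ones share 0.
i_corr = mutual_information([0, 0, 1, 1], [0, 0, 1, 1])
i_indep = mutual_information([0, 0, 1, 1], [0, 1, 0, 1])
```

High mutual information between two locations indicates that drilling or measuring at one would also reduce uncertainty at the other.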

  2. The Prediction of Item Parameters Based on Classical Test Theory and Latent Trait Theory

    Science.gov (United States)

    Anil, Duygu

    2008-01-01

In this study, the power of experts' predictions of item characteristics, for conditions in which try-out practices cannot be applied, was examined against item characteristics computed according to classical test theory and the two-parameter logistic model of latent trait theory. The study was carried out on 9914 randomly selected students…

  3. Information modelling and knowledge bases XXV

    CERN Document Server

    Tokuda, T; Jaakkola, H; Yoshida, N

    2014-01-01

Because of our ever-increasing use of and reliance on technology and information systems, information modelling and knowledge bases continue to be important topics in those academic communities concerned with data handling and computer science. As the information itself becomes more complex, so do the levels of abstraction and the databases themselves. This book is part of the series Information Modelling and Knowledge Bases, which concentrates on a variety of themes in the important domains of conceptual modeling, design and specification of information systems, and multimedia information modeling

  4. Theory-informed design of values clarification methods: a cognitive psychological perspective on patient health-related decision making.

    Science.gov (United States)

    Pieterse, Arwen H; de Vries, Marieke; Kunneman, Marleen; Stiggelbout, Anne M; Feldman-Stewart, Deb

    2013-01-01

Healthcare decisions, particularly those involving weighing benefits and harms that may significantly affect quality and/or length of life, should reflect patients' preferences. To support patients in making choices, patient decision aids, and values clarification methods (VCMs) in particular, have been developed. VCMs are intended to help patients determine which aspects of a choice are important to their selection of a preferred option. Several types of VCM exist. However, they are often designed without clear reference to theory, which makes it difficult for their development to be systematic and internally coherent. Our goal was to provide theory-informed recommendations for the design of VCMs. Process theories of decision making specify components of decision processes and thus identify particular processes that VCMs could aim to facilitate. We conducted a review of the MEDLINE and PsycINFO databases, and of references to theories included in retrieved papers, to identify process theories of decision making. We selected a theory if (a) it fulfilled criteria for a process theory; (b) it provided a coherent description of the whole process of decision making; and (c) empirical evidence supports at least some of its postulates. Four theories met our criteria: Image Theory, Differentiation and Consolidation theory, Parallel Constraint Satisfaction theory, and Fuzzy-trace Theory. Based on these, we propose that VCMs should: help optimize mental representations; encourage consideration of all potentially appropriate options; delay selection of an initially favoured option; facilitate the retrieval of relevant values from memory; facilitate the comparison of options and their attributes; and offer time to decide. In conclusion, our theory-based design recommendations are explicit and transparent, providing an opportunity to test each in a systematic manner. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. Uncertainty Quantification of Composite Laminate Damage with the Generalized Information Theory

    Energy Technology Data Exchange (ETDEWEB)

J. Lucero; F. Hemez; T. Ross; K. Kline; J. Hundhausen; T. Tippetts

    2006-05-01

    This work presents a survey of five theories to assess the uncertainty of projectile impact induced damage on multi-layered carbon-epoxy composite plates. Because the types of uncertainty dealt with in this application are multiple (variability, ambiguity, and conflict) and because the data sets collected are sparse, characterizing the amount of delamination damage with probability theory alone is possible but incomplete. This motivates the exploration of methods contained within a broad Generalized Information Theory (GIT) that rely on less restrictive assumptions than probability theory. Probability, fuzzy sets, possibility, and imprecise probability (probability boxes (p-boxes) and Dempster-Shafer) are used to assess the uncertainty in composite plate damage. Furthermore, this work highlights the usefulness of each theory. The purpose of the study is not to compare directly the different GIT methods but to show that they can be deployed on a practical application and to compare the assumptions upon which these theories are based. The data sets consist of experimental measurements and finite element predictions of the amount of delamination and fiber splitting damage as multilayered composite plates are impacted by a projectile at various velocities. The physical experiments consist of using a gas gun to impact suspended plates with a projectile accelerated to prescribed velocities, then, taking ultrasound images of the resulting delamination. The nonlinear, multiple length-scale numerical simulations couple local crack propagation implemented through cohesive zone modeling to global stress-displacement finite element analysis. The assessment of damage uncertainty is performed in three steps by, first, considering the test data only; then, considering the simulation data only; finally, performing an assessment of total uncertainty where test and simulation data sets are combined. This study leads to practical recommendations for reducing the uncertainty and

  6. A ROADMAP FOR A COMPUTATIONAL THEORY OF THE VALUE OF INFORMATION IN ORIGIN OF LIFE QUESTIONS

    Directory of Open Access Journals (Sweden)

    Soumya Banerjee

    2016-06-01

    Full Text Available Information plays a critical role in complex biological systems. Complex systems like immune systems and ant colonies co-ordinate heterogeneous components in a decentralized fashion. How do these distributed decentralized systems function? One key component is how these complex systems efficiently process information. These complex systems have an architecture for integrating and processing information coming in from various sources, which points to the value of information in the functioning of different complex biological systems. This article proposes a role for information processing in questions around the origin of life and suggests how computational simulations may yield insights into questions related to the origin of life. Such a computational model of the origin of life would unify thermodynamics with information processing, and we would gain an appreciation of why proteins and nucleotides evolved as the substrate of computation and information processing in the living systems that we see on Earth. Answers to questions like these may give us insights into non-carbon-based forms of life that we could search for outside Earth. We hypothesize that carbon-based life forms are only one point on a continuum of life-like systems in the universe. Investigating the computational substrates that allow information processing is important and could yield insights into: 1) novel non-carbon-based computational substrates that may have “life-like” properties, and 2) how life may have actually originated from non-life on Earth. Life may exist as a continuum between non-life and life, and we may have to revise our notion of life and how common it is in the universe. Looking at life or life-like phenomena through the lens of information theory may yield a broader view of life.

  7. Mathematics Education as a Proving-Ground for Information-Processing Theories.

    Science.gov (United States)

    Greer, Brian, Ed.; Verschaffel, Lieven, Ed.

    1990-01-01

    Five papers discuss the current and potential contributions of information-processing theory to our understanding of mathematical thinking as those contributions affect the practice of mathematics education. It is concluded that information-processing theories need to be supplemented in various ways to more adequately reflect the complexity of…

  8. Actor-network Theory and cartography of controversies in Information Science

    OpenAIRE

    LOURENÇO, Ramon Fernandes; TOMAÉL, Maria Inês

    2018-01-01

    Abstract The present study aims to discuss the interactions between the Actor-network Theory and the Cartography of Controversies method in Information Science research. A literature review was conducted on books, scholarly articles, and any other sources addressing the Actor-network Theory and the Cartography of Controversies. The understanding of the theoretical assumptions that guide the Actor-network Theory allows examining aspects important to Information Science research, seeking to identif...

  9. The urban informal economy in Ethiopia: theory and empirical ...

    African Journals Online (AJOL)

    Eastern Africa Social Science Research Review ... data to explore the roles and characteristics of the informal sector in urban centers of Ethiopia, ... informal sources, 4) the level of income per person varied sharply among the various sectors.

  10. SMD-based numerical stochastic perturbation theory

    Energy Technology Data Exchange (ETDEWEB)

    Dalla Brida, Mattia [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN, Sezione di Milano-Bicocca (Italy); Luescher, Martin [CERN, Theoretical Physics Department, Geneva (Switzerland); AEC, Institute for Theoretical Physics, University of Bern (Switzerland)

    2017-05-15

    The viability of a variant of numerical stochastic perturbation theory, where the Langevin equation is replaced by the SMD algorithm, is examined. In particular, the convergence of the process to a unique stationary state is rigorously established and the use of higher-order symplectic integration schemes is shown to be highly profitable in this context. For illustration, the gradient-flow coupling in finite volume with Schroedinger functional boundary conditions is computed to two-loop (i.e. NNL) order in the SU(3) gauge theory. The scaling behaviour of the algorithm turns out to be rather favourable in this case, which allows the computations to be driven close to the continuum limit. (orig.)

  11. SMD-based numerical stochastic perturbation theory

    International Nuclear Information System (INIS)

    Dalla Brida, Mattia; Luescher, Martin

    2017-01-01

    The viability of a variant of numerical stochastic perturbation theory, where the Langevin equation is replaced by the SMD algorithm, is examined. In particular, the convergence of the process to a unique stationary state is rigorously established and the use of higher-order symplectic integration schemes is shown to be highly profitable in this context. For illustration, the gradient-flow coupling in finite volume with Schroedinger functional boundary conditions is computed to two-loop (i.e. NNL) order in the SU(3) gauge theory. The scaling behaviour of the algorithm turns out to be rather favourable in this case, which allows the computations to be driven close to the continuum limit. (orig.)

  12. SMD-based numerical stochastic perturbation theory

    Science.gov (United States)

    Dalla Brida, Mattia; Lüscher, Martin

    2017-05-01

    The viability of a variant of numerical stochastic perturbation theory, where the Langevin equation is replaced by the SMD algorithm, is examined. In particular, the convergence of the process to a unique stationary state is rigorously established and the use of higher-order symplectic integration schemes is shown to be highly profitable in this context. For illustration, the gradient-flow coupling in finite volume with Schrödinger functional boundary conditions is computed to two-loop (i.e. NNL) order in the SU(3) gauge theory. The scaling behaviour of the algorithm turns out to be rather favourable in this case, which allows the computations to be driven close to the continuum limit.

  13. FDI theories. A location-based approach

    Directory of Open Access Journals (Sweden)

    Popovici, Oana Cristina

    2014-09-01

    Full Text Available Given the importance of FDI for the economic growth of both home and host countries, the aim of this paper is to assess the importance granted to location advantages during the development of FDI theory. We start with the earliest theoretical directions regarding FDI location issues and extend our study to less debated theories that are nonetheless of particular importance for this theme. In this way, we have the opportunity to emphasize the changes in FDI location determinants. We find that one direction of the expansion of FDI theories is the incorporation of new variables on location, although location advantages are barely mentioned in the first explanations of the international activity of firms.

  14. Using information theory to assess the communicative capacity of circulating microRNA.

    Science.gov (United States)

    Finn, Nnenna A; Searles, Charles D

    2013-10-11

    The discovery of extracellular microRNAs (miRNAs) and their transport modalities (i.e., microparticles, exosomes, proteins and lipoproteins) has sparked theories regarding their role in intercellular communication. Here, we assessed the information transfer capacity of different miRNA transport modalities in human serum by utilizing basic principles of information theory. Zipf's statistic was calculated for each of the miRNA transport modalities identified in human serum. Our analyses revealed that miRNA-mediated information transfer is redundant, as evidenced by negative Zipf's statistics with magnitudes greater than one. In healthy subjects, the potential communicative capacity of miRNA in complex with circulating proteins was significantly lower than that of miRNA encapsulated in circulating microparticles and exosomes. Moreover, the presence of coronary heart disease significantly lowered the communicative capacity of all circulating miRNA transport modalities. To assess the internal organization of circulating miRNA signals, Shannon's zero- and first-order entropies were calculated. Microparticles (MPs) exhibited the lowest Shannon entropic slope, indicating a relatively high capacity for information transfer. Furthermore, compared to the other miRNA transport modalities, MPs appeared to be the most efficient at transferring miRNA to cultured endothelial cells. Taken together, these findings suggest that although all transport modalities have the capacity for miRNA-based information transfer, MPs may be the simplest and most robust way to achieve miRNA-based signal transduction in sera. This study presents a novel method for analyzing the quantitative capacity of miRNA-mediated information transfer while providing insight into the communicative characteristics of distinct circulating miRNA transport modalities. Published by Elsevier Inc.
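The two quantities named in this abstract can be illustrated with a short sketch: Zipf's statistic as the least-squares slope of log-frequency versus log-rank, and the zero-order Shannon entropy of a count distribution. The counts below are hypothetical, not the study's data; a slope whose magnitude exceeds one is the redundancy signature the abstract describes.

```python
import math

# Zipf's statistic: least-squares slope of log(frequency) vs. log(rank).
def zipf_slope(counts):
    freqs = sorted(counts, reverse=True)
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Zero-order Shannon entropy of the symbol (here: miRNA species) distribution.
def shannon_entropy(counts):
    total = sum(counts)
    ps = [c / total for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in ps)

counts = [50, 25, 12, 6, 3]   # hypothetical read counts per miRNA species
print(zipf_slope(counts))      # negative; magnitude > 1 indicates redundancy
print(shannon_entropy(counts)) # bits per symbol
```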

  15. The empirical bases of the scientific theory

    International Nuclear Information System (INIS)

    Cook, A.

    1996-01-01

    This paper was written according to a speech given by the author at the French Academy of Sciences in Paris on November 14, 1994. In this educational paper, the author explains the origins and limitations of scientific theories. The aim of science is to build a rational framework that situates the observations we can make about the world in which we live. These observations are determined by human capacities and by the physical world itself. Thus, the structure of our theories is, up to a certain limit, imposed by observational constraints: for example, the relationship between the time evolution equation in quantum mechanics and the definition of the atomic time standard, or between special relativity and the observation of distant events using electromagnetic radiation. A corollary is that several physical systems can be described by representations of abstract groups, which is a possible explanation of the power of mathematics in scientific theories. However, the group representation is not suitable for all natural systems, such as those governed by chaotic dynamics. In this case, and in others in physics and biology, questions exist which cannot be answered by the simple study of the natural world.

  16. Teaching Qualitative Research: Using Theory to Inform Practice

    Science.gov (United States)

    Sallee, Margaret W.

    2010-01-01

    This article considers how theories of instructional scaffolding--which call for a skilled expert to teach a novice a new task by breaking it into smaller pieces--might be employed in graduate-level qualitative methods courses. The author discusses how she used instructional scaffolding in the design and delivery of a qualitative methods course…

  17. Information and Uncertainty in the Theory of Monetary Policy

    OpenAIRE

    Wagner, Helmut

    2007-01-01

    Theory and practice of monetary policy have changed significantly over the past three decades. A very important part of today's monetary policy is management of the expectations of private market participants. Publishing and justifying the central bank's best forecast of inflation, output, and the instrument rate is argued to be the most effective way to manage those expectations.

  18. Geographic information modeling of Econet of Northwestern Federal District territory on graph theory basis

    Science.gov (United States)

    Kopylova, N. S.; Bykova, A. A.; Beregovoy, D. N.

    2018-05-01

    Based on the landscape-geographical approach, a structural and logical scheme for the Northwestern Federal District Econet has been developed, which can be integrated into the federal and world ecological networks in order to improve the environmental infrastructure of the region. A method of organizing the Northwestern Federal District Econet on the basis of graph theory, by means of the Quantum GIS geographic information system, is proposed as an effective means of preserving and recreating the unique biodiversity of landscapes and of regulating environmental protection.

  19. Designing theoretically-informed implementation interventions: Fine in theory, but evidence of effectiveness in practice is needed

    Directory of Open Access Journals (Sweden)

    Reeves Scott

    2006-02-01

    Full Text Available Abstract The Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG) authors assert that a key weakness in implementation research is the unknown applicability of a given intervention outside its original site and problem, and suggest that use of explicit theory offers an effective solution. This assertion is problematic for three primary reasons. First, the presence of an underlying theory does not necessarily ease the task of judging the applicability of a piece of empirical evidence. Second, it is not clear how to translate theory reliably into intervention design, which undoubtedly involves the diluting effect of "common sense." Third, there are many theories, formal and informal, and it is not clear why any one should be given primacy. To determine whether explicitly theory-based interventions are, on average, more effective than those based on implicit theories, pragmatic trials are needed. Until empirical evidence is available showing the superiority of theory-based interventions, the use of theory should not be a basis for assessing the value of implementation studies by research funders, ethics committees, editors or policy decision makers.

  20. Libor at crossroads: Stochastic switching detection using information theory quantifiers

    International Nuclear Information System (INIS)

    Bariviera, Aurelio F.; Guercio, M. Belén; Martinez, Lisana B.; Rosso, Osvaldo A.

    2016-01-01

    Highlights: • 28 time series of Libor rates, classified in seven maturities and four currencies, during the last 14 years, were considered. • The analysis was performed using a novel technique in financial economics: the Complexity–Entropy Causality Plane. • Our analysis unveils an abnormal movement of Libor time series around the period of the 2007 financial crisis. • This alteration in the stochastic dynamics of Libor is contemporaneous with what the press called the “Libor scandal”. - Abstract: This paper studies the 28 time series of Libor rates, classified in seven maturities and four currencies, during the last 14 years. The analysis was performed using a novel technique in financial economics: the Complexity–Entropy Causality Plane. This planar representation allows the discrimination of different stochastic and chaotic regimes. Using a temporal analysis based on moving windows, this paper unveils an abnormal movement of Libor time series around the period of the 2007 financial crisis. This alteration in the stochastic dynamics of Libor is contemporaneous with what the press called the “Libor scandal”, i.e. the manipulation of interest rates carried out by several prime banks. We argue that our methodology is suitable as a market watch mechanism, as it makes visible the temporal reduction in informational efficiency of the market.
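The Complexity–Entropy Causality Plane rests on the Bandt–Pompe symbolisation of a time series. A minimal sketch of the normalised permutation entropy it uses (the plane's horizontal axis) is shown below, applied to illustrative series rather than Libor data; the embedding dimension D=3 and the test series are assumptions of this example.

```python
import math
import random
from itertools import permutations

# Normalised Bandt-Pompe permutation entropy: symbolise each length-d window
# by the ordinal pattern of its values, then take the Shannon entropy of the
# pattern distribution, normalised by log(d!).
def permutation_entropy(series, d=3):
    patterns = {p: 0 for p in permutations(range(d))}
    for i in range(len(series) - d + 1):
        window = series[i:i + d]
        pattern = tuple(sorted(range(d), key=lambda k: window[k]))
        patterns[pattern] += 1
    total = sum(patterns.values())
    ps = [c / total for c in patterns.values() if c > 0]
    h = -sum(p * math.log(p) for p in ps)
    return h / math.log(math.factorial(d))   # in [0, 1]

random.seed(0)
noise = [random.random() for _ in range(5000)]
trend = list(range(100))
print(permutation_entropy(noise))   # close to 1 for white noise
print(permutation_entropy(trend))   # 0 for a monotone series
```

A fully efficient (random-walk-like) market series sits near entropy 1 and low complexity; the paper's finding is that Libor windows drift away from that corner around 2007.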

  1. Support vector machines optimization based theory, algorithms, and extensions

    CERN Document Server

    Deng, Naiyang; Zhang, Chunhua

    2013-01-01

    Support Vector Machines: Optimization Based Theory, Algorithms, and Extensions presents an accessible treatment of the two main components of support vector machines (SVMs): classification problems and regression problems. The book emphasizes the close connection between optimization theory and SVMs, since optimization is one of the pillars on which SVMs are built. The authors share insight on many of their research achievements. They give a precise interpretation of statistical learning theory for C-support vector classification. They also discuss regularized twi

  2. Graph-based linear scaling electronic structure theory

    Energy Technology Data Exchange (ETDEWEB)

    Niklasson, Anders M. N., E-mail: amn@lanl.gov; Negre, Christian F. A.; Cawkwell, Marc J.; Swart, Pieter J.; Germann, Timothy C.; Bock, Nicolas [Theoretical Division, Los Alamos National Laboratory, Los Alamos, New Mexico 87545 (United States); Mniszewski, Susan M.; Mohd-Yusof, Jamal; Wall, Michael E.; Djidjev, Hristo [Computer, Computational, and Statistical Sciences Division, Los Alamos National Laboratory, Los Alamos, New Mexico 87545 (United States); Rubensson, Emanuel H. [Division of Scientific Computing, Department of Information Technology, Uppsala University, Box 337, SE-751 05 Uppsala (Sweden)

    2016-06-21

    We show how graph theory can be combined with quantum theory to calculate the electronic structure of large complex systems. The graph formalism is general and applicable to a broad range of electronic structure methods and materials, including challenging systems such as biomolecules. The methodology combines well-controlled accuracy, low computational cost, and natural low-communication parallelism. This combination addresses substantial shortcomings of linear scaling electronic structure theory, in particular with respect to quantum-based molecular dynamics simulations.

  3. Information theory and signal transduction systems: from molecular information processing to network inference.

    Science.gov (United States)

    Mc Mahon, Siobhan S; Sim, Aaron; Filippi, Sarah; Johnson, Robert; Liepe, Juliane; Smith, Dominic; Stumpf, Michael P H

    2014-11-01

    Sensing and responding to the environment are two essential functions that all biological organisms need to master for survival and successful reproduction. Developmental processes are marshalled by a diverse set of signalling and control systems, ranging from systems with simple chemical inputs and outputs to complex molecular and cellular networks with non-linear dynamics. Information theory provides a powerful and convenient framework in which such systems can be studied; but it also provides the means to reconstruct the structure and dynamics of molecular interaction networks underlying physiological and developmental processes. Here we supply a brief description of its basic concepts and introduce some useful tools for systems and developmental biologists. Along with a brief but thorough theoretical primer, we demonstrate the wide applicability and biological application-specific nuances by way of different illustrative vignettes. In particular, we focus on the characterisation of biological information processing efficiency, examining cell-fate decision making processes, gene regulatory network reconstruction, and efficient signal transduction experimental design. Copyright © 2014 Elsevier Ltd. All rights reserved.
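The central quantity in such analyses, the mutual information I(X;Y) between a pathway's input and output, can be sketched for a discrete channel as follows. The joint distribution below is illustrative (a noisy two-state signalling system), not taken from the review.

```python
import math

# Mutual information I(X;Y) in bits from a discrete joint distribution.
def mutual_information(joint):
    """joint[i][j] = P(X=i, Y=j)."""
    px = [sum(row) for row in joint]                 # marginal of the input
    py = [sum(col) for col in zip(*joint)]           # marginal of the output
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# A noisy two-state pathway: the input state is usually, but not always,
# preserved in the output.
joint = [[0.4, 0.1],
         [0.1, 0.4]]
print(mutual_information(joint))   # bits transmitted per signalling event
```

For an independent input and output, I(X;Y) is zero; for a noiseless binary channel with equiprobable inputs it reaches its maximum of one bit.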

  4. Attachment and the processing of social information across the life span: theory and evidence.

    Science.gov (United States)

    Dykas, Matthew J; Cassidy, Jude

    2011-01-01

    Researchers have used J. Bowlby's (1969/1982, 1973, 1980, 1988) attachment theory frequently as a basis for examining whether experiences in close personal relationships relate to the processing of social information across childhood, adolescence, and adulthood. We present an integrative life-span-encompassing theoretical model to explain the patterns of results that have emerged from these studies. The central proposition is that individuals who possess secure experience-based internal working models of attachment will process--in a relatively open manner--a broad range of positive and negative attachment-relevant social information. Moreover, secure individuals will draw on their positive attachment-related knowledge to process this information in a positively biased schematic way. In contrast, individuals who possess insecure internal working models of attachment will process attachment-relevant social information in one of two ways, depending on whether the information could cause the individual psychological pain. If processing the information is likely to lead to psychological pain, insecure individuals will defensively exclude this information from further processing. If, however, the information is unlikely to lead to psychological pain, then insecure individuals will process this information in a negatively biased schematic fashion that is congruent with their negative attachment-related experiences. In a comprehensive literature review, we describe studies that illustrate these patterns of attachment-related information processing from childhood to adulthood. This review focuses on studies that have examined specific components (e.g., attention and memory) and broader aspects (e.g., attributions) of social information processing. We also provide general conclusions and suggestions for future research.

  5. Stakeholder theory and reporting information: The case of performance prism

    Directory of Open Access Journals (Sweden)

    Bartłomiej Nita

    2016-07-01

    Full Text Available The aim of the paper is to explain the stakeholder theory in the context of performance measurement in integrated reporting. Main research methods used in the article include logical reasoning, critical analysis of academic literature, and observation. The principal result of the discussion is included in the statement that the stakeholder theory in the field of accounting is reflected in the so-called integrated reporting. Moreover, among the large variety of performance measurement methods, such as balanced scorecard and others, the concept of performance prism can be considered as the only method that fully takes into account the wide range of stakeholders. The analysis performed leads to the conclusion that development in accounting research takes into account the objectives of an organization in the context of the so-called corporate social responsibility as well as performance reporting oriented towards the communication of the company with its environment and the various stakeholder groups.

  6. Energy Information Data Base: corporate author entries

    International Nuclear Information System (INIS)

    1980-06-01

    One of the controls for information entered into the data bases created and maintained by the DOE Technical Information Center is the standardized name for the corporate entity or the corporate author. The purpose of Energy Information Data Base: Corporate Author Entries is to provide a means for the consistent citing of the names of organizations in bibliographic records. These entries serve as guides for users of the DOE/RECON computerized data bases who want to locate information originating in particular organizations. The entries in this revision include the corporate entries used in report bibliographic citations since 1973 and list approximately 28,000 corporate sources

  7. MOTIVATING ENGLISH TEACHERS BASED ON THE BASIC NEEDS THEORY AND AN EXPECTANCY THEORY

    Directory of Open Access Journals (Sweden)

    Hidayatus Sholihah

    2017-08-01

    Full Text Available There are two main motivation theories: the hierarchy of basic needs theory and the expectancy theory. In the hierarchy of basic needs theory, Maslow states that the basic needs that direct behaviour are structured into a hierarchy of five levels. The first is physiological needs, such as salary, bonuses, or working conditions. The second is safety needs, such as a safe job environment, job security, or health cover. The third is social needs, such as unions and teamwork. The next is self-esteem, such as receiving an award, medal, certificate, or any other recognition. The last is self-actualization, for example, being given the opportunity to share knowledge, skills, and experience. One weakness of this theory is that it includes no spiritual needs among the basic human needs. Moreover, different levels of needs may have to be satisfied at the same time, rather than in hierarchical order. The second motivation theory is the expectancy theory, which rests on three main factors. First, English teachers will be motivated to work harder if they perceive their own competences as adequate for their job. Second, individual motivation depends on the rewards given when they finish a particular job. Finally, it also depends on how they regard the rewards given for the job that they do. The expectancy theory is a good theory; however, it is not easy to implement, because principals would have to provide various types of rewards to satisfy the expectations of their English teachers. Considering the strengths and weaknesses of these two theories, it is better to combine both of them in practice to obtain more effective results.

  8. Task-Based Language Teaching and Expansive Learning Theory

    Science.gov (United States)

    Robertson, Margaret

    2014-01-01

    Task-Based Language Teaching (TBLT) has become increasingly recognized as an effective pedagogy, but its location in generalized sociocultural theories of learning has led to misunderstandings and criticism. The purpose of this article is to explain the congruence between TBLT and Expansive Learning Theory and the benefits of doing so. The merit…

  9. A density functional theory-based chemical potential equalisation

    Indian Academy of Sciences (India)

    A chemical potential equalisation scheme is proposed for the calculation of these quantities and hence the dipole polarizability within the framework of density functional theory based linear response theory. The resulting polarizability is expressed in terms of the contributions from individual atoms in the molecule. A few ...

  10. Internet information triangulation: Design theory and prototype evaluation

    NARCIS (Netherlands)

    Wijnhoven, Alphonsus B.J.M.; Brinkhuis, Michel

    2014-01-01

    Many discussions exist regarding the credibility of information on the Internet. Similar discussions happen on the interpretation of social scientific research data, for which information triangulation has been proposed as a useful method. In this article, we explore a design theory—consisting of a

  11. Innovations in information retrieval perspectives for theory and practice

    CERN Document Server

    Foster, Allen

    2011-01-01

    The advent of various information retrieval (IR) technologies and approaches to storage and retrieval provide communities with opportunities for mass documentation, digitization, and the recording of information in different forms. This book introduces and contextualizes these developments and looks at supporting research in IR.

  12. On the assessment of visual communication by information theory

    Science.gov (United States)

    Huck, Friedrich O.; Fales, Carl L.

    1993-01-01

    This assessment of visual communication integrates the optical design of the image-gathering device with the digital processing for image coding and restoration. Results show that informationally optimized image gathering ordinarily can be relied upon to maximize the information efficiency of decorrelated data and the visual quality of optimally restored images.

  13. Information Theory Broadens the Spectrum of Molecular Ecology and Evolution.

    Science.gov (United States)

    Sherwin, W B; Chao, A; Jost, L; Smouse, P E

    2017-12-01

    Information or entropy analysis of diversity is used extensively in community ecology, and has recently been exploited for prediction and analysis in molecular ecology and evolution. Information measures belong to a spectrum (or q profile) of measures whose contrasting properties provide a rich summary of diversity, including allelic richness (q=0), Shannon information (q=1), and heterozygosity (q=2). We present the merits of information measures for describing and forecasting molecular variation within and among groups, comparing forecasts with data, and evaluating underlying processes such as dispersal. Importantly, information measures directly link causal processes and divergence outcomes, have a straightforward relationship to allele frequency differences (including a monotonicity that q=2 lacks), and show additivity across hierarchical layers such as ecology, behaviour, cellular processes, and nongenetic inheritance. Copyright © 2017 Elsevier Ltd. All rights reserved.
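The q profile described in this abstract corresponds to the Hill numbers: q=0 gives allelic richness, q=1 the exponential of Shannon entropy, and q=2 the inverse Simpson index (the effective-allele reading of heterozygosity). A minimal sketch, with hypothetical allele frequencies:

```python
import math

# Hill number of order q for a frequency vector; q=1 is taken as the
# limit exp(Shannon entropy), since the general formula is undefined there.
def hill_number(ps, q):
    ps = [p for p in ps if p > 0]
    if abs(q - 1.0) < 1e-12:
        return math.exp(-sum(p * math.log(p) for p in ps))
    return sum(p ** q for p in ps) ** (1.0 / (1.0 - q))

freqs = [0.5, 0.3, 0.2]         # hypothetical allele frequencies at one locus
print(hill_number(freqs, 0))    # richness: 3 alleles
print(hill_number(freqs, 1))    # effective number under Shannon weighting
print(hill_number(freqs, 2))    # inverse Simpson; down-weights rare alleles
```

The three values decrease as q grows, because higher q weights common alleles more heavily; equal frequencies would make all three coincide.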

  14. Energy information data base: subject thesaurus

    International Nuclear Information System (INIS)

    1979-10-01

    The technical staff of the DOE Technical Information Center, during its subject indexing activities, develops and structures a vocabulary that allows consistent machine storage and retrieval of information necessary to the accomplishment of the DOE mission. This thesaurus incorporates that structured vocabulary. The terminology of this thesaurus is used for the subject control of information announced in DOE Energy Research Abstracts, Energy Abstracts for Policy Analysis, Solar Energy Update, Geothermal Energy Update, Fossil Energy Update, Fusion Energy Update, and Energy Conservation Update. This terminology also facilitates subject searching of the DOE energy information data base, a research in progress data base, a general and practical energy information data base, power reactor docket information data base, nuclear science abstracts data base, and the federal energy information data base on the DOE on-line retrieval system, RECON. The rapid expansion of the DOE's activities will result in a concomitant thesaurus expansion as information relating to new activities is indexed. Only the terms used in the indexing of documents at the Technical Information Center to date are included

  15. Switching theory-based steganographic system for JPEG images

    Science.gov (United States)

    Cherukuri, Ravindranath C.; Agaian, Sos S.

    2007-04-01

    Cellular communications constitute a significant portion of the global telecommunications market; therefore, the need for secured communication over a mobile platform has increased exponentially. Steganography is the art of hiding critical data inside an innocuous signal, which answers this need. JPEG is one of the most commonly used formats for storing and transmitting images on the web. In addition, the pictures captured using mobile cameras are mostly in JPEG format. In this article, we introduce a switching theory-based steganographic system for JPEG images which is applicable to mobile and computer platforms. The proposed algorithm uses the fact that the energy distribution among the quantized AC coefficients varies from block to block and coefficient to coefficient. Existing approaches are effective with a part of these coefficients, but show their ineffectiveness when employed over all the coefficients. Therefore, we propose an approach that processes each set of AC coefficients within a different framework, thus enhancing the performance of the approach. The proposed system offers a high capacity and embedding efficiency simultaneously while withstanding simple statistical attacks. In addition, the embedded information can be retrieved without prior knowledge of the cover image. Based on simulation results, the proposed method demonstrates an improved embedding capacity over existing algorithms while maintaining a high embedding efficiency and preserving the statistics of the JPEG image after hiding information.
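The paper's switching-theory scheme is not reproduced here, but the general idea of embedding message bits in the least significant bits of quantized AC coefficients, which it builds on, can be sketched as follows. This is a common baseline rather than the authors' algorithm, and the coefficient values are illustrative; coefficients of magnitude 0 or 1 are skipped, a typical precaution to limit visible distortion.

```python
# Baseline LSB embedding in quantized AC coefficients (sign-preserving).
def embed(coeffs, bits):
    out, k = list(coeffs), 0
    for i, c in enumerate(out):
        if k >= len(bits):
            break
        if abs(c) > 1:  # skip 0 and +/-1 to limit distortion and keep extraction aligned
            out[i] = (c & ~1) | bits[k] if c > 0 else -((-c & ~1) | bits[k])
            k += 1
    return out

def extract(coeffs, n):
    return [abs(c) & 1 for c in coeffs if abs(c) > 1][:n]

ac = [3, 0, -2, 5, 1, -4, 7, 0, 2]   # illustrative quantized AC coefficients
msg = [1, 0, 1, 1, 0]
stego = embed(ac, msg)
print(extract(stego, len(msg)))       # recovers [1, 0, 1, 1, 0]
```

Because embedding never moves a usable coefficient's magnitude below 2, the extractor's skip rule sees the same set of positions as the embedder, so no side channel is needed to recover the message.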

  16. Learning Theory Foundations of Simulation-Based Mastery Learning.

    Science.gov (United States)

    McGaghie, William C; Harris, Ilene B

    2018-06-01

    Simulation-based mastery learning (SBML), like all education interventions, has learning theory foundations. Recognition and comprehension of SBML learning theory foundations are essential for thoughtful education program development, research, and scholarship. We begin with a description of SBML, followed by a section on the importance of learning theory foundations to shape and direct SBML education and research. We then discuss the three principal learning theory conceptual frameworks associated with SBML (behavioral, constructivist, and social cognitive) and their contributions to SBML thought and practice. We go on to discuss how the three learning theory frameworks converge in the course of planning, conducting, and evaluating SBML education programs in the health professions. Convergence of these learning theory frameworks is illustrated by a description of an SBML education and research program in advanced cardiac life support. We conclude with a brief coda.

  17. Hiding data selected topics: Rudolf Ahlswede’s lectures on information theory 3

    CERN Document Server

    Althöfer, Ingo; Deppe, Christian; Tamm, Ulrich

    2016-01-01

    Devoted to information security, this volume begins with a short course on cryptography, mainly based on lectures given by Rudolf Ahlswede at the University of Bielefeld in the mid 1990s. It was the second of his cycle of lectures on information theory which opened with an introductory course on basic coding theorems, as covered in Volume 1 of this series. In this third volume, Shannon’s historical work on secrecy systems is detailed, followed by an introduction to an information-theoretic model of wiretap channels, and such important concepts as homophonic coding and authentication. Once the theoretical arguments have been presented, comprehensive technical details of AES are given. Furthermore, a short introduction to the history of public-key cryptology, RSA and El Gamal cryptosystems is provided, followed by a look at the basic theory of elliptic curves, and algorithms for efficient addition in elliptic curves. Lastly, the important topic of “oblivious transfer” is discussed, which is strongly conne...

  18. The information a history, a theory, a flood

    CERN Document Server

    Gleick, James

    2011-01-01

    Winner of the Royal Society Winton Prize for Science Books 2012, the world's leading prize for popular science writing. We live in the information age. But every era of history has had its own information revolution: the invention of writing, the composition of dictionaries, the creation of the charts that made navigation possible, the discovery of the electronic signal, the cracking of the genetic code. In 'The Information' James Gleick tells the story of how human beings use, transmit and keep what they know. From African talking drums to Wikipedia, from Morse code to the 'bit', it is a fascinating account of the modern age's defining idea and a brilliant exploration of how information has revolutionised our lives.

  19. MaxEnt-Based Ecological Theory: A Template for Integrated Catchment Theory

    Science.gov (United States)

    Harte, J.

    2017-12-01

    The maximum information entropy procedure (MaxEnt) is both a powerful tool for inferring least-biased probability distributions from limited data and a framework for the construction of complex systems theory. The maximum entropy theory of ecology (METE) describes remarkably well widely observed patterns in the distribution, abundance and energetics of individuals and taxa in relatively static ecosystems. An extension to ecosystems undergoing change in response to disturbance or natural succession (DynaMETE) is in progress. I describe the structure of both the static and the dynamic theory and show a range of comparisons with census data. I then propose a generalization of the MaxEnt approach that could provide a framework for a predictive theory of both static and dynamic, fully-coupled, eco-socio-hydrological catchment systems.
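
    The core MaxEnt inference step that METE builds on can be illustrated with a minimal numerical sketch (my own construction, not METE itself): over a finite support with a mean constraint, the least-biased distribution has exponential form p(x) ∝ exp(λx), and the Lagrange multiplier λ can be found by bisection because the constrained mean is monotone in λ.

```python
import math

def maxent(xs, target_mean):
    """Least-biased distribution over xs with the given mean."""
    xs = list(xs)

    def mean(lam):
        w = [math.exp(lam * x) for x in xs]
        z = sum(w)
        return sum(x * wi for x, wi in zip(xs, w)) / z

    lo, hi = -50.0, 50.0
    for _ in range(200):              # bisection on the monotone mean(lam)
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * x) for x in xs]
    z = sum(w)
    return [wi / z for wi in w]

# a die whose mean is constrained to 4.5
p = maxent(range(1, 7), 4.5)
assert abs(sum(x * pi for x, pi in zip(range(1, 7), p)) - 4.5) < 1e-6
# with target 3.5 the constraint is uninformative -> uniform distribution
q = maxent(range(1, 7), 3.5)
assert all(abs(qi - 1/6) < 1e-6 for qi in q)
```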

  20. Consensus for linear multi-agent system with intermittent information transmissions using the time-scale theory

    Science.gov (United States)

    Taousser, Fatima; Defoort, Michael; Djemai, Mohamed

    2016-01-01

    This paper investigates the consensus problem for linear multi-agent system with fixed communication topology in the presence of intermittent communication using the time-scale theory. Since each agent can only obtain relative local information intermittently, the proposed consensus algorithm is based on a discontinuous local interaction rule. The interaction among agents happens at a disjoint set of continuous-time intervals. The closed-loop multi-agent system can be represented using mixed linear continuous-time and linear discrete-time models due to intermittent information transmissions. The time-scale theory provides a powerful tool to combine continuous-time and discrete-time cases and study the consensus protocol under a unified framework. Using this theory, some conditions are derived to achieve exponential consensus under intermittent information transmissions. Simulations are performed to validate the theoretical results.

  1. Using Activity Theory as a Base for Investigating Language Teacher ...

    African Journals Online (AJOL)

    Using Activity Theory as a Base for Investigating Language Teacher Education through Digital Technology. ... how the platform has created tensions, contradictions and transformations.

  2. Towards a Theory-Based Framework for Assessing the ...

    African Journals Online (AJOL)

    The theory-based framework attempts to capture ESD's complexity in terms of the ... projects in teacher education institutions in Botswana, a brief description of the ..... How the 'four pillars of learning' relate to education for sustainable human.

  3. Entropy, Information Theory, Information Geometry and Bayesian Inference in Data, Signal and Image Processing and Inverse Problems

    Directory of Open Access Journals (Sweden)

    Ali Mohammad-Djafari

    2015-06-01

    Full Text Available The main content of this review article is first to review the main inference tools using Bayes rule, the maximum entropy principle (MEP), information theory, relative entropy and the Kullback–Leibler (KL) divergence, Fisher information and its corresponding geometries. For each of these tools, the precise context of their use is described. The second part of the paper is focused on the ways these tools have been used in data, signal and image processing and in the inverse problems, which arise in different physical sciences and engineering applications. A few examples of the applications are described: entropy in independent components analysis (ICA) and in blind source separation, Fisher information in data model selection, different maximum entropy-based methods in time series spectral estimation and in linear inverse problems and, finally, the Bayesian inference for general inverse problems. Some original materials concerning the approximate Bayesian computation (ABC) and, in particular, the variational Bayesian approximation (VBA) methods are also presented. VBA is used for proposing an alternative Bayesian computational tool to the classical Markov chain Monte Carlo (MCMC) methods. We will also see that VBA encompasses joint maximum a posteriori (MAP), as well as the different expectation-maximization (EM) algorithms, as particular cases.
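
    Two of the tools named in this review, Shannon entropy and the KL divergence, are compact enough to sketch numerically (natural logarithms, so values are in nats; the example distributions are made up):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i, in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl(p, q):
    """Kullback-Leibler divergence D(p||q); assumes q > 0 wherever p > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]

assert abs(entropy(uniform) - math.log(4)) < 1e-12  # maximal for 4 outcomes
assert entropy(skewed) < entropy(uniform)
assert kl(uniform, uniform) == 0
assert kl(skewed, uniform) > 0                      # divergence is nonnegative
```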

  4. Modern Resource-Based Theory(ies)

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Stieglitz, Nils

    We survey the resource-based view in strategic management, focusing on its roots in economics. We organize our discussion in terms of the Gavetti and Levinthal distinction between a “high church” and a “low church” resource-based view, and argue that these hitherto rather separate streams...

  5. School-Based Management: Theory and Practice.

    Science.gov (United States)

    George, Patricia, Ed.; Potter, Eugenia Cooper, Ed.

    School-based management (SBM), sometimes called site-based management, is fast becoming the hottest restructuring item in the arsenal of reformers, teachers' unions, governors, and legislators who want to change the traditional ways in which schools and school districts do business. This document comprises three main sections with contributions…

  6. Making Theory Come Alive through Practice-based Design Research

    DEFF Research Database (Denmark)

    Markussen, Thomas; Knutz, Eva; Rind Christensen, Poul

    The aim of this paper is to demonstrate how practice-based design research is able not only to challenge, but also to push toward further development of, some of the basic assumptions in emotion theories as used within design research. In so doing, we wish to increase knowledge on a central...... epistemological question for design research, namely how practice-based design research can be a vehicle for the construction of new theory for design research....

  7. Application of the Theory of Constraints in Project Based Structures

    OpenAIRE

    Martynas Sarapinas; Vytautas Pranas Sūdžius

    2011-01-01

    The article deals with the application of the Theory of Constraints (TOC) in project management. This article involves a short introduction to TOC as a project management method and a deep analysis of project management specialties using the TOC: TOC based project planning, timetable management, task synchronization, project control and the “relay runner work ethic”. Moreover, the article compares traditional and TOC based project management theories and emphasizes the main be...

  8. Application of information and complexity theories to public opinion polls. The case of Greece (2004-2007)

    OpenAIRE

    Panos, C. P.; Chatzisavvas, K. Ch.

    2007-01-01

    A general methodology to study public opinion inspired from information and complexity theories is outlined. It is based on probabilistic data extracted from opinion polls. It gives a quantitative information-theoretic explanation of high job approval of Greek Prime Minister Mr. Constantinos Karamanlis (2004-2007), while the same time series of polls conducted by the company Metron Analysis showed that his party New Democracy (abbr. ND) was slightly higher than the opposition party of PASOK -...

  9. Ground reaction curve based upon block theory

    International Nuclear Information System (INIS)

    Yow, J.L. Jr.; Goodman, R.E.

    1985-09-01

    Discontinuities in a rock mass can intersect an excavation surface to form discrete blocks (keyblocks) which can be unstable. Once a potentially unstable block is identified, the forces affecting it can be calculated to assess its stability. The normal and shear stresses on each block face before displacement are calculated using elastic theory and are modified in a nonlinear way by discontinuity deformations as the keyblock displaces. The stresses are summed into resultant forces to evaluate block stability. Since the resultant forces change with displacement, successive increments of block movement are examined to see whether the block ultimately becomes stable or fails. Two-dimensional (2D) and three-dimensional (3D) analytic models for the stability of simple pyramidal keyblocks were evaluated. Calculated stability is greater for 3D analyses than for 2D analyses. Calculated keyblock stability increases with larger in situ stress magnitudes, larger lateral stress ratios, and larger shear strengths. Discontinuity stiffness controls block displacement more strongly than it does stability itself. Large keyblocks are less stable than small ones, and stability increases as blocks become more slender.

  10. Information retrieval system based on INIS tapes

    International Nuclear Information System (INIS)

    Pultorak, G.

    1976-01-01

    An information retrieval system based on the INIS computer tapes is described. It includes the three main elements of a computerized information system: a data base on a machine-readable medium, a collection of queries which represent the information needs from the data base, and a set of programs by which the actual retrieval is done, according to the user's queries. The system is built for the center's computer, a CDC 3600, and its special features characterize, to a certain degree, the structure of the programs. (author)

  11. Computer Support of Groups: Theory-Based Models for GDSS Research

    OpenAIRE

    V. Srinivasan Rao; Sirkka L. Jarvenpaa

    1991-01-01

    Empirical research in the area of computer support of groups is characterized by inconsistent results across studies. This paper attempts to reconcile the inconsistencies by linking the ad hoc reasoning in the studies to existing theories of communication, minority influence and human information processing. Contingency models are then presented based on the theories discussed. The paper concludes by discussing the linkages between the current work and other recently published integrations of...

  12. Theory of impossible worlds: Toward a physics of information.

    Science.gov (United States)

    Buscema, Paolo Massimo; Sacco, Pier Luigi; Della Torre, Francesca; Massini, Giulia; Breda, Marco; Ferilli, Guido

    2018-05-01

    In this paper, we introduce an innovative approach to the fusion between datasets in terms of attributes and observations, even when they are not related at all. With our technique, starting from datasets representing independent worlds, it is possible to analyze a single global dataset, and transferring each dataset onto the others is always possible. This procedure allows a deeper perspective in the study of a problem, by offering the chance of looking into it from other, independent points of view. Even unrelated datasets create a metaphoric representation of the problem, useful in terms of speed of convergence and predictive results, preserving the fundamental relationships in the data. In order to extract such knowledge, we propose a new learning rule named double backpropagation, by which an auto-encoder concurrently codifies all the different worlds. We test our methodology on different datasets and different issues, to underline the power and flexibility of the Theory of Impossible Worlds.

  13. An Innovative Thinking-Based Intelligent Information Fusion Algorithm

    Directory of Open Access Journals (Sweden)

    Huimin Lu

    2013-01-01

    Full Text Available This study proposes an intelligent algorithm that can realize information fusion with reference to research achievements in brain cognitive theory and innovative computation. The algorithm treats knowledge as its core and information fusion as a knowledge-based innovative thinking process. Furthermore, its five key parts, including information sense and perception, memory storage, divergent thinking, convergent thinking, and the evaluation system, are simulated and modeled. The algorithm fully develops the innovative thinking skills of knowledge in information fusion and attempts to convert the abstract concepts of brain cognitive science into specific and operable research routes and strategies. Furthermore, the influence of each parameter of the algorithm on its performance is analyzed and compared with that of classical intelligent algorithms through tests. Test results suggest that the algorithm proposed in this study can obtain the optimum problem solution with fewer target evaluations, improve optimization effectiveness, and achieve the effective fusion of information.

  14. Product-oriented design theory for digital information services: A literature review.

    NARCIS (Netherlands)

    Wijnhoven, Alphonsus B.J.M.; Kraaijenbrink, Jeroen

    2008-01-01

    Purpose – The purpose of this paper is to give a structured literature review, design concepts, and research propositions related to a product-oriented design theory for information services. Information services facilitate the exchange of information goods with or without transforming these goods.

  15. Prosody's Contribution to Fluency: An Examination of the Theory of Automatic Information Processing

    Science.gov (United States)

    Schrauben, Julie E.

    2010-01-01

    LaBerge and Samuels' (1974) theory of automatic information processing in reading offers a model that explains how and where the processing of information occurs and the degree to which processing of information occurs. These processes are dependent upon two criteria: accurate word decoding and automatic word recognition. However, LaBerge and…

  16. Relation between entropy functional of Keizer and information theory

    International Nuclear Information System (INIS)

    Freidkin, E.S.; Nettleton, R.E.

    1990-01-01

    An equation given by Keizer which relates the second-order functional derivative of the steady-state entropy to the inverse fluctuation correlation function is satisfied by the information-theoretic entropy if the equation is extended to arbitrary nonequilibrium states.

  17. STUDENTS’ GEOMETRIC THINKING BASED ON VAN HIELE’S THEORY

    Directory of Open Access Journals (Sweden)

    Harina Fitriyani

    2018-02-01

    Full Text Available The current study aims to identify the development level of students’ geometric thinking in the mathematics education department of Universitas Ahmad Dahlan, based on van Hiele’s theory. This is a descriptive qualitative study with 129 student respondents. In addition to the researchers, the instrument used in this study is a test consisting of 25 multiple choice questions. The data are analyzed using the Milles and Huberman model. The results show that 30,65% of students were at the pre-visualization level, 21,51% at the visualization level, 29,03% at the analysis level, 16,67% at the informal deduction level, 2,15% at the deduction level, and 0,00% at the rigor level. Furthermore, the findings indicated transition levels between the development levels of geometric thinking, namely pre-analysis, pre-informal deduction, pre-deduction, and pre-rigor, at 20%; 13,44%; 6,45%; and 1,08% respectively. The level of a further 40,32% of students was difficult to determine, and 4,3% of students could not be identified.

  18. An enstrophy-based linear and nonlinear receptivity theory

    Science.gov (United States)

    Sengupta, Aditi; Suman, V. K.; Sengupta, Tapan K.; Bhaumik, Swagata

    2018-05-01

    In the present research, a new theory of instability based on enstrophy is presented for incompressible flows. Explaining instability through enstrophy is counter-intuitive, as it has been usually associated with dissipation for the Navier-Stokes equation (NSE). This developed theory is valid for both linear and nonlinear stages of disturbance growth. A previously developed nonlinear theory of incompressible flow instability based on total mechanical energy described in the work of Sengupta et al. ["Vortex-induced instability of an incompressible wall-bounded shear layer," J. Fluid Mech. 493, 277-286 (2003)] is used to compare with the present enstrophy based theory. The developed equations for disturbance enstrophy and disturbance mechanical energy are derived from NSE without any simplifying assumptions, as compared to other classical linear/nonlinear theories. The theory is tested for bypass transition caused by free stream convecting vortex over a zero pressure gradient boundary layer. We explain the creation of smaller scales in the flow by a cascade of enstrophy, which creates rotationality, in general inhomogeneous flows. Linear and nonlinear versions of the theory help explain the vortex-induced instability problem under consideration.

  19. Impact of the Cybernetic Law of Requisite Variety on a Theory of Information Science.

    Science.gov (United States)

    Heilprin, Laurence B.

    The search for an integrated, comprehensive theory of information science (IS) has so far been unsuccessful. The appearance of a theory has been retarded by one central constraint, the large number of disciplines concerned with human communication. Cross-disciplinary interdependence occurs in two ways: theoretical relation of IS phenomena to a given…

  20. Protection and security of data base information

    Directory of Open Access Journals (Sweden)

    Mariuţa ŞERBAN

    2011-06-01

    Full Text Available Data bases are one of the most important components in every large informatics system which stores and processes data and information. Because data bases contain all of the valuable information about a company, its clients, and its financial activity, they represent one of the key elements in the structure of an organization, which imposes imperatives such as confidentiality, integrity and ease of data access. The current paper discusses the integrity of data bases, which refers to the validity and the coherence of stored data. Usually, integrity is defined in terms of constraints, that is, rules of coherence which the data base cannot infringe. Data base integrity refers to information correctness and involves detecting, correcting and preventing errors that might affect the data comprised by the data bases.

  1. Complementarity of information sent via different bases

    DEFF Research Database (Denmark)

    Wu, Shengjun; Yu, Sixia; Mølmer, Klaus

    2009-01-01

    We discuss quantitatively the complementarity of information transmitted by a quantum system prepared in a basis state in one out of several different mutually unbiased bases (MUBs). We obtain upper bounds on the information available to a receiver who has no knowledge of which MUB was chosen...

  2. Information and Communication Technology and School Based ...

    African Journals Online (AJOL)

    Information and Communication technology and school based assessment (SBA) is a practice that broadens the form, mode, means and scope of assessment in the school, using modern technologies in order to facilitate and enhance learning. This study sought to ascertain the efficacy of Information and Communication ...

  3. INVESTIGATION OF FISCAL AND BUDGETARY POLICIES BASED ON ECONOMIC THEORIES

    Directory of Open Access Journals (Sweden)

    EMILIA CAMPEANU

    2011-04-01

    Full Text Available Empirical analysis of fiscal and budgetary policies cannot be achieved without first knowing how they are viewed in economic theories. This approach is important for indicating the position and implications of fiscal and budgetary policy tools in economic theory, considering their major differences. Therefore, the paper’s aim is to investigate fiscal and budgetary policies based on economic theories such as the neoclassical, Keynesian and neo-Keynesian theories, in order to indicate their divergent points. Once these approaches are known at the level of economic theory, it is easier to establish appropriate measures, taking into consideration the framing of a country’s economy in a certain pattern. This work was supported from the European Social Fund through Sectoral Operational Programme Human Resources Development 2007-2013, project number POSDRU/89/1.5/S/59184 „Performance and excellence in postdoctoral research in Romanian economics science domain” (contract no. 0501/01.11.2010).

  4. Dynamical theory of subconstituents based on ternary algebras

    International Nuclear Information System (INIS)

    Bars, I.; Guenaydin, M.

    1980-01-01

    We propose a dynamical theory of possible fundamental constituents of matter. Our scheme is based on (super) ternary algebras which are building blocks of Lie (super) algebras. Elementary fields, called ''ternons,'' are associated with the elements of a (super) ternary algebra. Effective gauge bosons, ''quarks,'' and ''leptons'' are constructed as composite fields from ternons. We propose two- and four-dimensional (super) ternon theories whose structures are closely related to CP/sub N/ and Yang-Mills theories and their supersymmetric extensions. We conjecture that at large distances (low energies) the ternon theories dynamically produce effective gauge theories and thus may be capable of explaining the present particle-physics phenomenology. Such a scenario is valid in two dimensions

  5. Playing styles based on experiential learning theory

    NARCIS (Netherlands)

    Bontchev, Boyan; Vassileva, Dessislava; Aleksieva-Petrova, Adelina; Petrov, Milen

    2018-01-01

    In recent years, many researchers have reported positive outcomes and effects from applying computer games to the educational process. The main preconditions for an effective game-based learning process include the presence of high learning interest and the desire to study hard. Therefore,

  6. Attention Discrimination: Theory and Field Experiments with Monitoring Information Acquisition

    OpenAIRE

    Bartoš, Vojtěch; Bauer, Michal; Chytilová, Julie; Matějka, Filip

    2014-01-01

    We link two important ideas: attention is scarce and lack of information about an individual drives discrimination in selection decisions. Our model of allocation of costly attention implies that applicants from negatively stereotyped groups face "attention discrimination": less attention in highly selective cherry-picking markets, where more attention helps applicants, and more attention in lemon-dropping markets, where it harms them. To test the prediction, we integrate tools to monitor inf...

  7. Information Technology from Theory to Practice in Higher Education Structure

    OpenAIRE

    Tooraj Sadeghi; Zahra Piroziyan; Mehrdad Ebrahimpur

    2016-01-01

    In the past two decades, the development of higher education has depended on the increased demand for admission to higher education, the development of communication technologies, the need for human resource development, rapid technological changes, and accumulated knowledge and information, which leads to serious challenges and changes in the role of universities and higher education in the new millennium. These dramatic changes move higher education towards universalization and interpreta...

  8. Multiple-User Quantum Information Theory for Optical Communication Channels

    Science.gov (United States)

    2008-06-01

    ...optimal, homodyne, and heterodyne reception versus transmitter power P, with P0 ≡ 2πħc²L²/(AtAr) used for the reference power...the constraints on the available physical resources. In most communication systems, the transfer of information is done by superimposing the

  9. The role of behavioral decision theory for cockpit information management

    Science.gov (United States)

    Jonsson, Jon E.

    1991-01-01

    The focus of this report is the consideration of one form of cognition, judgment and decision making, while examining some information management issues associated with the implementation of new forms of automation. As technology matures and more tasks become suitable to automation, human factors researchers will have to consider the effect that increasing automation will have on operator performance. Current technology allows flight deck designers the opportunity to automate activities involving substantially more cognitive processing.

  10. Theory of choice in bandit, information sampling and foraging tasks.

    Science.gov (United States)

    Averbeck, Bruno B

    2015-03-01

    Decision making has been studied with a wide array of tasks. Here we examine the theoretical structure of bandit, information sampling and foraging tasks. These tasks move beyond tasks where the choice in the current trial does not affect future expected rewards. We have modeled these tasks using Markov decision processes (MDPs). MDPs provide a general framework for modeling tasks in which decisions affect the information on which future choices will be made. Under the assumption that agents are maximizing expected rewards, MDPs provide normative solutions. We find that all three classes of tasks pose choices among actions which trade-off immediate and future expected rewards. The tasks drive these trade-offs in unique ways, however. For bandit and information sampling tasks, increasing uncertainty or the time horizon shifts value to actions that pay-off in the future. Correspondingly, decreasing uncertainty increases the relative value of actions that pay-off immediately. For foraging tasks the time-horizon plays the dominant role, as choices do not affect future uncertainty in these tasks.
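
    The backward-induction solution of such a finite-horizon MDP can be sketched on a toy foraging task; all states, rewards, and parameters below are hypothetical, chosen only to show the trade-off between immediate and future expected rewards that the abstract describes.

```python
# Tiny finite-horizon foraging MDP solved by backward induction: harvest
# a patch whose reward decays with each visit, or pay a travel cost to
# reach a fresh patch. All numbers are illustrative assumptions.

def solve(horizon, decay=0.8, fresh=10.0, travel_cost=2.0, levels=12):
    # state s = number of times the current patch has been harvested
    V = [0.0] * (levels + 1)            # value with 0 steps to go
    policy = []
    for _ in range(horizon):            # sweep from the horizon backwards
        newV, act = [], []
        for s in range(levels + 1):
            s2 = min(s + 1, levels)
            stay = fresh * decay**s + V[s2]     # harvest the depleting patch
            leave = fresh - travel_cost + V[1]  # fresh patch, one harvest used
            newV.append(max(stay, leave))
            act.append('stay' if stay >= leave else 'leave')
        V, policy = newV, act
    return V, policy                    # policy for the first decision

V, policy = solve(horizon=20)
assert policy[0] == 'stay'    # an unharvested patch is worth staying in
assert policy[10] == 'leave'  # a depleted patch is worth abandoning
```

    The same backward-induction loop applies to bandit and information-sampling tasks once the state is taken to encode the agent's current uncertainty rather than patch depletion.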

  11. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    Science.gov (United States)

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  12. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    Energy Technology Data Exchange (ETDEWEB)

    McDonnell, J. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schunck, N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Higdon, D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sarich, J. [Argonne National Lab. (ANL), Argonne, IL (United States); Wild, S. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Nazarewicz, W. [Michigan State Univ., East Lansing, MI (United States); Oak Ridge National Lab., Oak Ridge, TN (United States); Univ. of Warsaw, Warsaw (Poland)

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  13. Using Information Theory to Assess the Communicative Capacity of Circulating MicroRNA

    OpenAIRE

    Finn, Nnenna A.; Searles, Charles D.

    2013-01-01

The discovery of extracellular microRNAs (miRNAs) and their transport modalities (i.e., microparticles, exosomes, proteins and lipoproteins) has sparked theories regarding their role in intercellular communication. Here, we assessed the information transfer capacity of different miRNA transport modalities in human serum by utilizing basic principles of information theory. Zipf statistics were calculated for each of the miRNA transport modalities identified in human serum. Our analyses revealed...
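The Zipf statistic used in analyses like this is, in essence, the slope of a log-log rank-frequency fit. A minimal sketch with invented miRNA read counts (the numbers are illustrative, not from the study):

```python
import numpy as np

# Hypothetical miRNA read counts for one transport modality; the Zipf
# statistic is the slope of log(frequency) versus log(rank).
counts = np.array([500, 240, 160, 120, 95, 80, 70, 60, 55, 50])
freq = np.sort(counts)[::-1] / counts.sum()   # rank-ordered relative frequencies
rank = np.arange(1, len(freq) + 1)
slope, intercept = np.polyfit(np.log(rank), np.log(freq), 1)
print(round(slope, 2))   # a slope near -1 indicates Zipf-like structure
```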

  14. A Unifying Theory of Value Based Management

    OpenAIRE

    Weaver, Samuel C.; Weston, J. Fred

    2003-01-01

    We identify four alternative performance metrics used in value based management (VBM). (1) Basic is an intrinsic value analysis (IVA), the discounted cash flow (DCF) methodology. (2) We show that this framework will be consistent with returns to shareholder (RTS, capital gains plus dividends) measured over appropriate time horizons. (3) Economic profit (EP) [also called economic value added (EVA®)] takes from the DCF free cash flow valuation, net operating profits after taxes (NOPAT), divide...

  15. Energy Information Data Base: serial titles

    International Nuclear Information System (INIS)

    1980-06-01

The Department of Energy Technical Information Center (TIC) is responsible for creating bibliographic data bases that are used in the announcement and retrieval of publications dealing with all phases of energy. The TIC interactive information processing system makes use of a number of computerized authorities so that consistency can be maintained and indexes can be produced. One such authority is the Energy Information Data Base: Serial Titles. This authority contains the full and abbreviated journal title, country of publication, CODEN, and certain codes. This revision replaces previous revisions of this document.

  16. Generating information-rich high-throughput experimental materials genomes using functional clustering via multitree genetic programming and information theory.

    Science.gov (United States)

    Suram, Santosh K; Haber, Joel A; Jin, Jian; Gregoire, John M

    2015-04-13

    High-throughput experimental methodologies are capable of synthesizing, screening and characterizing vast arrays of combinatorial material libraries at a very rapid rate. These methodologies strategically employ tiered screening wherein the number of compositions screened decreases as the complexity, and very often the scientific information obtained from a screening experiment, increases. The algorithm used for down-selection of samples from a higher-throughput screening experiment to a lower-throughput screening experiment is vital in achieving information-rich experimental materials genomes. The fundamental science of material discovery lies in the establishment of composition-structure-property relationships, motivating the development of advanced down-selection algorithms which consider the information value of the selected compositions, as opposed to simply selecting the best performing compositions from a high throughput experiment. Identification of property fields (composition regions with distinct composition-property relationships) in high throughput data enables down-selection algorithms to employ advanced selection strategies, such as the selection of representative compositions from each field or selection of compositions that span the composition space of the highest performing field. Such strategies would greatly enhance the generation of data-driven discoveries. We introduce an informatics-based clustering of composition-property functional relationships using a combination of information theory and multitree genetic programming concepts for identification of property fields in a composition library. We demonstrate our approach using a complex synthetic composition-property map for a 5 at.% step ternary library consisting of four distinct property fields and finally explore the application of this methodology for capturing relationships between composition and catalytic activity for the oxygen evolution reaction for 5429 catalyst compositions in a

  17. Robust Energy Hub Management Using Information Gap Decision Theory

    DEFF Research Database (Denmark)

    Javadi, Mohammad Sadegh; Anvari-Moghaddam, Amjad; Guerrero, Josep M.

    2017-01-01

This paper proposes a robust optimization framework for energy hub management. It is well known that the operation of energy systems can be negatively affected by uncertain parameters, such as stochastic load demand or generation. In this regard, it is of high significance to propose efficient tools in order to deal with uncertainties and to provide reliable operating conditions. On a broader scale, an energy hub includes diverse energy sources for supplying both electrical load and heating/cooling demands with stochastic behaviors. Therefore, this paper utilizes the Information Gap Decision...

  18. Value of information-based inspection planning for offshore structures

    DEFF Research Database (Denmark)

    Irman, Arifian Agusta; Thöns, Sebastian; Leira, Bernt J.

    2017-01-01

    Asset integrity and management is an important part of the oil and gas industry, especially for existing offshore structures. With declining oil price, the production rate is an important factor to be maintained, which makes the integrity of the structures one of the main concerns. Reliability based... A simplified and generic risk-based inspection planning utilizing pre-posterior Bayesian decision analysis had been proposed by Faber et al. [1] and Straub [2]. This paper provides considerations on the theoretical background and a Value of Information analysis... with each inspection strategy. The paper will start out with a review of the state-of-art RBI planning procedure based on Bayesian decision theory and its application in offshore structure integrity management. An example of the Value of Information approach is illustrated and it is pointed to further research...

  19. Restructuring Consciousness –the Psychedelic State in Light of Integrated Information Theory

    Directory of Open Access Journals (Sweden)

    Andrew Robert Gallimore

    2015-06-01

The psychological state elicited by the classic psychedelic drugs, such as LSD and psilocybin, is one of the most fascinating and yet least understood states of consciousness. However, with the advent of modern functional neuroimaging techniques, the effect of these drugs on neural activity is now being revealed, although many of the varied phenomenological features of the psychedelic state remain challenging to explain. Integrated information theory (IIT) is one of the foremost contemporary theories of consciousness, providing a mathematical formalization of both the quantity and quality of conscious experience. This theory can be applied to all known states of consciousness, including the psychedelic state. Using the results of functional neuroimaging data on the psychedelic state, the effects of psychedelic drugs on both the level and structure of consciousness can be explained in terms of the conceptual framework of IIT. This new IIT-based model of the psychedelic state provides an explanation for many of its phenomenological features, including unconstrained cognition, alterations in the structure and meaning of concepts and a sense of expanded awareness. This model also suggests that whilst cognitive flexibility, creativity, and imagination are enhanced during the psychedelic state, this occurs at the expense of cause-effect information, as well as degrading the brain’s ability to organize, categorize, and differentiate the constituents of conscious experience. Furthermore, the model generates specific predictions that can be tested using a combination of functional imaging techniques, as has been applied to the study of levels of consciousness during anesthesia and following brain injury.

  20. Radiographic information theory: correction for x-ray spectral distribution

    International Nuclear Information System (INIS)

    Brodie, I.; Gutcheck, R.A.

    1983-01-01

A more complete computational method is developed to account for the effect of the spectral distribution of the incident x-ray fluence on the minimum exposure required to record a specified information set in a diagnostic radiograph. It is shown that an earlier, less rigorous, but simpler computational technique does not introduce serious errors provided that both a good estimate of the mean energy per photon can be made and the detector does not contain an absorption edge in the spectral range. It is also shown that, to a first approximation, it is immaterial whether the detecting surface counts the number of photons incident from each pixel or measures the energy incident on each pixel. A previous result is confirmed that, for mammography, the present methods of processing data from the detector utilize only a few percent of the incident information, suggesting that techniques can be developed for obtaining mammograms at substantially lower doses than those presently used. When used with film-screen combinations, x-ray tubes with tungsten anodes should require substantially lower exposures than devices using molybdenum anodes, when both are operated at their optimal voltage.

  1. The future (and past) of quantum theory after the Higgs boson: a quantum-informational viewpoint.

    Science.gov (United States)

    Plotnitsky, Arkady

    2016-05-28

Taking as its point of departure the discovery of the Higgs boson, this article considers quantum theory, including quantum field theory, which predicted the Higgs boson, through the combined perspective of quantum information theory and the idea of technology, while also adopting a non-realist interpretation, in 'the spirit of Copenhagen', of quantum theory and quantum phenomena themselves. The article argues that the 'events' in question in fundamental physics, such as the discovery of the Higgs boson (a particularly complex and dramatic, but not essentially different, case), are made possible by the joint workings of three technologies: experimental technology, mathematical technology and, more recently, digital computer technology. The article will consider the role of and the relationships among these technologies, focusing on experimental and mathematical technologies, in quantum mechanics (QM), quantum field theory (QFT) and finite-dimensional quantum theory, with which quantum information theory has been primarily concerned thus far. It will do so, in part, by reassessing the history of quantum theory, beginning with Heisenberg's discovery of QM, in quantum-informational and technological terms. This history, the article argues, is defined by the discoveries of increasingly complex configurations of observed phenomena and the emergence of the increasingly complex mathematical formalism accounting for these phenomena, culminating in the standard model of elementary-particle physics, defining the current state of QFT. © 2016 The Author(s).

  2. Prototype Theory Based Feature Representation for PolSAR Images

    OpenAIRE

    Huang Xiaojing; Yang Xiangli; Huang Pingping; Yang Wen

    2016-01-01

    This study presents a new feature representation approach for Polarimetric Synthetic Aperture Radar (PolSAR) image based on prototype theory. First, multiple prototype sets are generated using prototype theory. Then, regularized logistic regression is used to predict similarities between a test sample and each prototype set. Finally, the PolSAR image feature representation is obtained by ensemble projection. Experimental results of an unsupervised classification of PolSAR images show that our...

  3. Anticipated detection of favorable periods for wind energy production by means of information theory

    Science.gov (United States)

    Vogel, Eugenio; Saravia, Gonzalo; Kobe, Sigismund; Schumann, Rolf; Schuster, Rolf

Managing the electric power produced by different sources requires mixing the different response times they present. Thus, for instance, coal burning presents large time lags until operational conditions are reached, while hydroelectric generation can react in a matter of seconds or a few minutes to reach the desired productivity. Wind energy production (WEP) can be instantaneously fed to the network to save fuels with low thermal inertia (gas burning, for instance), but this source presents sudden variations within a few hours. We report here for the first time a method based on information theory to handle WEP. This method has been successful in detecting dynamical changes in magnetic transitions and variations of stock markets. An algorithm called wlzip based on information recognition is used to recognize the information content of a time series. We make use of publicly available energy data in Germany to simulate real applications. After a calibration process the system can recognize directly on the WEP data the onset of favorable periods of a desired strength. Optimization can lead to a few hours of anticipation, which is enough to control the mixture of WEP with other energy sources, thus saving fuels.
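wlzip itself is not described in detail here, so the sketch below substitutes zlib as a generic compressor to illustrate the underlying idea: the compressed size of a symbolized window of the series serves as an information-content signal, and a jump in that signal marks the onset of a different production regime. The series, symbol alphabet, and window length are all synthetic assumptions:

```python
import zlib
import numpy as np

rng = np.random.default_rng(1)
calm = rng.normal(0.2, 0.02, 300)                              # low, steady production
gusty = 0.6 + 0.3 * np.sin(np.arange(300) / 5) + rng.normal(0, 0.1, 300)
series = np.concatenate([calm, gusty])

# Map the series onto a small symbol alphabet, then use compressed size
# of each sliding window as a proxy for its information content.
symbols = np.digitize(series, np.linspace(series.min(), series.max(), 16))

def info(window):
    return len(zlib.compress(bytes(window.tolist())))

w = 100
profile = [info(symbols[i:i + w]) for i in range(0, len(symbols) - w, 10)]
# Compressed size jumps once windows enter the richer 'gusty' regime,
# which is the kind of onset the calibrated detector would flag.
print(profile[0], profile[-1])
```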

  4. Action-Based Jurisprudence: Praxeological Legal Theory in Relation to Economic Theory, Ethics, and Legal Practice

    Directory of Open Access Journals (Sweden)

    Konrad Graf

    2011-08-01

Action-based legal theory is a discrete branch of praxeology and the basis of an emerging school of jurisprudence related to, but distinct from, natural law. Legal theory and economic theory share content that is part of praxeology itself: the action axiom, the a priori of argumentation, universalizable property theory, and counterfactual-deductive methodology. Praxeological property-norm justification is separate from the strictly ethical “ought” question of selecting ends in an action context. Examples of action-based jurisprudence are found in existing “Austro-libertarian” literature. Legal theory and legal practice must remain distinct and work closely together if justice is to be found in real cases. Legal theorizing was shaped in religious ethical contexts, which contributed to confused field boundaries between law and ethics. The carrot and stick influence of rulers on theorists has distorted conventional economics and jurisprudence in particular directions over the course of centuries. An action-based approach is relatively immune to such sources of distortion in its methods and conclusions, but has tended historically to be marginalized from conventional institutions for this same reason.

  5. Optimization of hydrometric monitoring network in urban drainage systems using information theory.

    Science.gov (United States)

    Yazdi, J

    2017-10-01

Regular and continuous monitoring of urban runoff in both quality and quantity aspects is of great importance for controlling and managing surface runoff. Due to the considerable costs of establishing new gauges, optimization of the monitoring network is essential. This research proposes an approach for site selection of new discharge stations in urban areas, based on entropy theory in conjunction with multi-objective optimization tools and numerical models. The modeling framework provides an optimal trade-off between the maximum possible information content and the minimum shared information among stations. This approach was applied to the main surface-water collection system in Tehran to determine new optimal monitoring points under cost considerations. Experimental results on this drainage network show that the obtained cost-effective designs noticeably outperform the consulting engineers' proposal in terms of both information content and shared information. The research also determined the highly frequent sites at the Pareto front, which might be important for decision makers to give priority for gauge installation at those locations of the network.
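The entropy objectives described above can be sketched with simple histogram estimators: each added station should contribute high marginal entropy while sharing little mutual information with stations already selected. The candidate sites, synthetic series, and greedy selection rule below are illustrative assumptions, not the paper's actual multi-objective setup:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical discharge records at 5 candidate gauge sites; sites 0 and 1
# are nearly redundant, sites 3 and 4 are independent.
base = rng.normal(0, 1, 1000)
sites = np.stack([base,
                  base + rng.normal(0, 0.1, 1000),    # near-duplicate of site 0
                  0.5 * base + rng.normal(0, 1, 1000),
                  rng.normal(0, 1, 1000),
                  rng.normal(0, 1, 1000)])

def entropy(x, bins=8):
    p, _ = np.histogram(x, bins=bins)
    p = p[p > 0] / p.sum()
    return -(p * np.log2(p)).sum()

def mutual_info(x, y, bins=8):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(1), pxy.sum(0)
    nz = pxy > 0
    return (pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz])).sum()

# Greedy: start from the highest-entropy site, then repeatedly add the site
# maximizing own entropy minus information already shared with the network.
selected = [int(np.argmax([entropy(s) for s in sites]))]
while len(selected) < 3:
    def score(j):
        return entropy(sites[j]) - max(mutual_info(sites[j], sites[k]) for k in selected)
    rest = [j for j in range(len(sites)) if j not in selected]
    selected.append(max(rest, key=score))
print(selected)   # the redundant pair (0, 1) should never both be picked
```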

  6. Opera house acoustics based on subjective preference theory

    CERN Document Server

    Ando, Yoichi

    2015-01-01

    This book focuses on opera house acoustics based on subjective preference theory; it targets researchers in acoustics and vision who are working in physics, psychology, and brain physiology. This book helps readers to understand any subjective attributes in relation to objective parameters based on the powerful and workable model of the auditory system. It is reconfirmed here that the well-known Helmholtz theory, which was based on a peripheral model of the auditory system, may not well describe pitch, timbre, and duration as well as the spatial sensations described in this book, nor overall responses such as subjective preference of sound fields and the annoyance of environmental noise.

  7. Brain activity and cognition: a connection from thermodynamics and information theory.

    Science.gov (United States)

    Collell, Guillem; Fauquet, Jordi

    2015-01-01

    The connection between brain and mind is an important scientific and philosophical question that we are still far from completely understanding. A crucial point to our work is noticing that thermodynamics provides a convenient framework to model brain activity, whereas cognition can be modeled in information-theoretical terms. In fact, several models have been proposed so far from both approaches. A second critical remark is the existence of deep theoretical connections between thermodynamics and information theory. In fact, some well-known authors claim that the laws of thermodynamics are nothing but principles in information theory. Unlike in physics or chemistry, a formalization of the relationship between information and energy is currently lacking in neuroscience. In this paper we propose a framework to connect physical brain and cognitive models by means of the theoretical connections between information theory and thermodynamics. Ultimately, this article aims at providing further insight on the formal relationship between cognition and neural activity.
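One concrete, uncontroversial instance of the thermodynamics-information link discussed above is Landauer's principle: erasing one bit of information dissipates at least k_B T ln 2 of energy. A quick calculation at body temperature (the temperature choice is ours, for illustration):

```python
import math

# Landauer's bound: minimum energy to erase one bit at temperature T.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 310.0            # K, roughly human body temperature

e_bit = k_B * T * math.log(2)
print(f"{e_bit:.3e} J per bit erased")   # on the order of 3e-21 J
```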

  9. From motivation and cognition theories to everyday applications and back again: the case of product-integrated information and feedback

    Energy Technology Data Exchange (ETDEWEB)

    McCalley, L.T. [Technical Univ. Eindhoven (Netherlands)

    2003-07-01

    Various moderators of the relationship of goal setting and feedback are explored in four examples of applied empirical research. A selection of theoretical frameworks borrowed from varied disciplines guided the studies and are discussed in terms of their value to the particular questions investigated. The experiments all entailed the use of product-integrated energy feedback and illustrate a progressive understanding of how goals, feedback and other information provided to the user can generate or support better energy conservation. Experiment 1 exemplifies the successful use of combining goal setting and feedback and provides a basic understanding of the interaction from the perspectives of goal setting theory and Feedback Intervention Theory (FIT). Experiment 2 compares FIT to another, fundamentally different, cognitive framework, and the minimal justification principle. The study gives insight into how goals and feedback work through attention focus and the goal hierarchy to guide behaviour, the role of attitude in this process, and offers evidence that FIT better accounts for task specific conservation behaviour. Experiment 3 addresses the role of goals and information in strategy planning through the perspective of goal setting theory. Results of this study suggest the need for more development of the basic theory and illustrate the strong motivational properties of having a goal. Experiment 4 investigates a more fundamental process, anchoring bias, taken from decision theory and the theory of rational choice. This experiment was based again on FIT and provided further evidence of behavioural control through the focus of attention at a particular level of the goal hierarchy.

  11. Alice and Bob meet Banach the interface of asymptotic geometric analysis and quantum information theory

    CERN Document Server

    Aubrun, Guillaume

    2017-01-01

    The quest to build a quantum computer is arguably one of the major scientific and technological challenges of the twenty-first century, and quantum information theory (QIT) provides the mathematical framework for that quest. Over the last dozen or so years, it has become clear that quantum information theory is closely linked to geometric functional analysis (Banach space theory, operator spaces, high-dimensional probability), a field also known as asymptotic geometric analysis (AGA). In a nutshell, asymptotic geometric analysis investigates quantitative properties of convex sets, or other geometric structures, and their approximate symmetries as the dimension becomes large. This makes it especially relevant to quantum theory, where systems consisting of just a few particles naturally lead to models whose dimension is in the thousands, or even in the billions. Alice and Bob Meet Banach is aimed at multiple audiences connected through their interest in the interface of QIT and AGA: at quantum information resea...

  12. Energy Information Data Base: corporate author entries

    International Nuclear Information System (INIS)

    1978-06-01

The DOE Energy Information Data Base has been created and is maintained by the DOE Technical Information Center. One of the controls for information entered into the base is the standardized name of the corporate entity or the corporate author. The purpose of this list of authorized or standardized corporate entries is to provide a means for the consistent citing of the names of organizations in bibliographic records. It also serves as a guide for users who retrieve information from a bibliographic data base and who want to locate information originating in particular organizations. This authority is a combination of entries established by the Technical Information Center and the International Atomic Energy Agency's International Nuclear Information System (INIS). The format calls, in general, for the name of the organization represented by the literature being cataloged to be cited as follows: the largest element, the place, the smallest element, e.g., Brigham Young Univ., Provo, Utah (USA), Dept. of Chemical Engineering. Code numbers are assigned to each entry to provide manipulation by computer. Cross references are used to reflect name changes and invalid entries.

  13. Expanding resource theory and feminist-informed theory to explain intimate partner violence perpetration by court-ordered men.

    Science.gov (United States)

    Basile, Kathleen C; Hall, Jeffrey E; Walters, Mikel L

    2013-07-01

    This study tested resource and feminist-informed theories to explain physical, sexual, psychological, and stalking intimate partner violence (IPV) perpetrated by court-mandated men. Data were obtained from 340 men arrested for physical assault of a partner before their court-ordered treatment. Using path analysis, findings provided partial support for each model. Ineffective arguing and substance-use problems were moderators of resources and perpetration. Dominance mediated early exposures and perpetration in the feminist-informed model. In both models, predictors of stalking were different than those for other types of perpetration. Future studies should replicate this research and determine the utility of combining models.

  14. Cognition to Collaboration: User-Centric Approach and Information Behaviour Theories/Models

    Directory of Open Access Journals (Sweden)

    Alperen M Aydin

    2016-12-01

Aim/Purpose: The objective of this paper is to review the vast literature of user-centric information science and inform about the emerging themes in information behaviour science. Background: The paradigmatic shift from the system-centric to the user-centric approach facilitates research on cognitive and individual information processing. Various information behaviour theories/models emerged. Methodology: Recent information behaviour theories and models are presented. Features, strengths and weaknesses of the models are discussed through the analysis of the information behaviour literature. Contribution: This paper sheds light on the weaknesses in earlier information behaviour models and stresses (and advocates) the need for research on social information behaviour. Findings: Prominent information behaviour models deal with individual information behaviour. People live in a social world and sort out most of their daily or work problems in groups. However, only seven papers discuss social information behaviour (Scopus search). Recommendations for Practitioners: ICT tools used for inter-organisational sharing should be redesigned for effective information-sharing during disaster/emergency times. Recommendation for Researchers: There are scarce sources on the social side of information behaviour; however, most work tasks are carried out in groups/teams. Impact on Society: In dynamic work contexts like disaster management and health care settings, collaborative information-sharing may result in decreasing the losses. Future Research: A fieldwork will be conducted in a disaster management context investigating inter-organisational information-sharing.

  15. Web information retrieval based on ontology

    Science.gov (United States)

    Zhang, Jian

    2013-03-01

The purpose of Information Retrieval (IR) is to find a set of documents that are relevant to a specific information need of a user. The traditional information retrieval model commonly used in commercial search engines is based on a keyword indexing system and Boolean logic queries. One big drawback of traditional information retrieval is that it typically retrieves information without an explicitly defined domain of interest to the users, so that a lot of irrelevant information is returned, burdening the user with picking useful answers out of irrelevant results. In order to tackle this issue, many semantic web information retrieval models have been proposed recently. The main advantage of the Semantic Web is to enhance search mechanisms with the use of ontology mechanisms. In this paper, we present our approach to personalizing a web search engine based on ontology. In addition, key techniques are also discussed in our paper. Compared to previous research, our work concentrates on semantic similarity and the whole process, including query submission and information annotation.
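As a toy illustration of ontology-driven retrieval (the taxonomy and scoring rule below are invented; the paper's actual measures may differ), a shared-ancestor path score can rank terms by conceptual closeness even when no keywords match:

```python
# Tiny hand-built taxonomy: each term maps to its parent concept.
parents = {"poodle": "dog", "dog": "mammal", "cat": "mammal",
           "mammal": "animal", "animal": None}

def ancestors(term):
    chain = []
    while term is not None:
        chain.append(term)
        term = parents[term]
    return chain

def similarity(a, b):
    pa, pb = ancestors(a), ancestors(b)
    common = next(x for x in pa if x in pb)          # lowest common ancestor
    # Shorter combined path to the common ancestor -> higher similarity.
    return 1.0 / (1 + pa.index(common) + pb.index(common))

print(similarity("poodle", "dog"), similarity("poodle", "cat"))
```

A query for "dog" could then retrieve a document mentioning only "poodle", which a pure keyword/Boolean model would miss.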

  16. Modeling acquaintance networks based on balance theory

    Directory of Open Access Journals (Sweden)

    Vukašinović Vida

    2014-09-01

An acquaintance network is a social structure made up of a set of actors and the ties between them. These ties change dynamically as a consequence of incessant interactions between the actors. In this paper we introduce a social network model called the Interaction-Based (IB) model that involves well-known sociological principles. The connections between the actors and the strength of the connections are influenced by the continuous positive and negative interactions between the actors and, vice versa, the future interactions are more likely to happen between the actors that are connected with stronger ties. The model is also inspired by the social behavior of animal species, particularly that of ants in their colony. A model evaluation showed that the IB model turned out to be sparse. The model has a small diameter and an average path length that grows in proportion to the logarithm of the number of vertices. The clustering coefficient is relatively high, and its value stabilizes in larger networks. The degree distributions are slightly right-skewed. In the mature phase of the IB model, i.e., when the number of edges does not change significantly, most of the network properties do not change significantly either. The IB model was found to be the best of all the compared models in simulating the e-mail URV (University Rovira i Virgili of Tarragona) network because the properties of the IB model more closely matched those of the e-mail URV network than the other models.
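The reinforcement mechanism described (ties strengthened or weakened by interactions, with stronger ties attracting future interactions) can be sketched as a toy simulation. This is only our reading of the idea, not the published IB model's actual update rules or parameters:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 30
w = np.full((n, n), 0.01)          # initial weak ties between all actor pairs
np.fill_diagonal(w, 0.0)           # no self-ties

for _ in range(5000):
    i = rng.integers(n)
    p = w[i] / w[i].sum()
    j = rng.choice(n, p=p)                         # stronger ties interact more often
    delta = 0.1 if rng.random() < 0.8 else -0.1    # mostly positive interactions
    w[i, j] = w[j, i] = max(w[i, j] + delta, 0.001)

strong_degree = (w > 0.5).sum(axis=1)   # count of strong ties per actor
print(strong_degree.mean())             # sparse: most possible ties stay weak
```

The rich-get-richer feedback (interaction probability proportional to tie strength) is what produces the sparse, clustered structure the abstract reports.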

  17. The mourning before: can anticipatory grief theory inform family care in adult intensive care?

    Science.gov (United States)

    Coombs, Maureen A

    2010-12-01

    Although anticipatory grief is a much-debated and critiqued bereavement concept, it does offer a way of understanding and exploring expected loss that may be helpful in certain situations. In end-of-life care in adult intensive care units, families often act as proxy decision makers for patients in the transition from curative treatment efforts to planned treatment withdrawal. Despite there being a developed evidence base to inform care of families at this time, few of the clinical studies that provided this evidence were underpinned by bereavement theory. Focusing on end-of-life intensive care practices, this paper integrates work on anticipatory grief and family interventions to present a family-centred framework of care. Through this it is argued that the complex needs of families must be more comprehensively understood by doctors and nurses and that interventions must be more systematically planned to improve quality end-of-life care for families in this setting.

  18. Measuring organizational effectiveness in information and communication technology companies using item response theory.

    Science.gov (United States)

    Trierweiller, Andréa Cristina; Peixe, Blênio César Severo; Tezza, Rafael; Pereira, Vera Lúcia Duarte do Valle; Pacheco, Waldemar; Bornia, Antonio Cezar; de Andrade, Dalton Francisco

    2012-01-01

    The aim of this paper is to measure the effectiveness of Information and Communication Technology (ICT) organizations from the point of view of the manager, using Item Response Theory (IRT). There is a need to verify the effectiveness of these organizations, which are normally associated with complex, dynamic, and competitive environments. In the academic literature, there is disagreement surrounding the concept of organizational effectiveness and its measurement. A construct was elaborated based on dimensions of effectiveness for the construction of the items of the questionnaire, which was then submitted to specialists for evaluation. The approach proved viable for measuring the organizational effectiveness of ICT companies from the manager's point of view using the Two-Parameter Logistic Model (2PLM) of IRT. This modeling permits us to evaluate the quality and properties of each item and to place items and respondents on a single scale, which is not possible when using other similar tools.
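
    The Two-Parameter Logistic Model mentioned above has a simple closed form. A minimal sketch (the discrimination and difficulty values below are hypothetical, not the paper's estimates):

```python
import math

def p_endorse(theta, a, b):
    """Two-Parameter Logistic (2PL) IRT model: probability that a
    respondent with latent trait theta endorses an item with
    discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical items: an "easy" effectiveness indicator (b = -1) and a
# "hard" one (b = 2), both with discrimination a = 1.2.
easy = p_endorse(0.0, 1.2, -1.0)
hard = p_endorse(0.0, 1.2, 2.0)
```

    Because both items and respondents live on the same theta scale, an average respondent (theta = 0) endorses the easy item far more often than the hard one, which is what lets the 2PLM order items and organizations jointly.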

  19. Contraction theory based adaptive synchronization of chaotic systems

    International Nuclear Information System (INIS)

    Sharma, B.B.; Kar, I.N.

    2009-01-01

    Contraction theory based stability analysis exploits the incremental behavior of trajectories of a system with respect to each other. Application of contraction theory provides an alternative way of carrying out stability analysis of nonlinear systems. This paper considers the design of a control law for synchronization of a certain class of chaotic systems based on the backstepping technique. The controller is selected so as to make the error dynamics between the two systems contracting. The synchronization problem with and without uncertainty in system parameters is discussed, and the necessary stability proofs are worked out using contraction theory. Suitable adaptation laws for unknown parameters are proposed based on the contraction principle. Numerical simulations verify the synchronization of the chaotic systems. Parameter estimates also converge to their true values under the proposed adaptation laws.

  20. A density gradient theory based method for surface tension calculations

    DEFF Research Database (Denmark)

    Liang, Xiaodong; Michelsen, Michael Locht; Kontogeorgis, Georgios

    2016-01-01

    The density gradient theory has become a widely used framework for calculating surface tension, within which the same equation of state is used for the interface and the bulk phases, because it is a theoretically sound, consistent and computationally affordable approach. Based on the observation...... that the optimal density path from the geometric mean density gradient theory passes the saddle point of the tangent plane distance to the bulk phases, we propose to estimate surface tension with an approximate density path profile that goes through this saddle point. The linear density gradient theory, which...... assumes linearly distributed densities between the two bulk phases, has also been investigated. Numerical problems do not occur with these density path profiles. These two approximation methods, together with the full density gradient theory, have been used to calculate the surface tension of various...

  1. The Foundation Role for Theories of Agency in Understanding Information Systems Design

    Directory of Open Access Journals (Sweden)

    Robert Johnston

    2002-11-01

    Full Text Available In this paper we argue that theories of agency form a foundation upon which we can build a deeper understanding of information systems design. We do so by firstly recognising that information systems are part of purposeful sociotechnical systems and that consequently theories of agency may help in understanding them. We then present two alternative theories of agency (deliberative and situational), mainly drawn from the robotics and artificial intelligence disciplines, and in doing so we note that existing information system design methods and ontological studies of those methods implicitly adhere to the deliberative theory of agency. We also note that, while there are advantages in specific circumstances to utilising the situated theory of agency in designing complex systems, because of their differing ontological commitments such systems would be difficult to analyse and evaluate using ontologies currently used in information systems. We then provide evidence that such situational information systems can indeed exist, by giving a specific example (the Kanban system), which has emerged from manufacturing practice. We conclude that information systems are likely to benefit from creating design approaches supporting the production of situational systems.

  2. Frame Works: Using Metaphor in Theory and Practice in Information Literacy

    Science.gov (United States)

    Holliday, Wendy

    2017-01-01

    The ACRL Framework for Information Literacy in Higher Education generated a large amount of discourse during its development and adoption. All of this discourse is rich in metaphoric language that can be used as a tool for critical reflection on teaching and learning, information literacy, and the nature and role of theory in the practice of…

  3. Internet-Based Health Information Consumer Skills Intervention for People Living with HIV/AIDS

    Science.gov (United States)

    Kalichman, Seth C.; Cherry, Charsey; Cain, Demetria; Pope, Howard; Kalichman, Moira; Eaton, Lisa; Weinhardt, Lance; Benotsch, Eric G.

    2006-01-01

    Medical information can improve health, and there is an enormous amount of health information available on the Internet. A randomized clinical trial tested the effectiveness of an intervention based on social-cognitive theory to improve information use among people living with HIV/AIDS. Men and women (N = 448) were placed in either (a) an…

  4. Energy Information Data Base: corporate author entries

    International Nuclear Information System (INIS)

    1980-03-01

    One of the controls for information entered into the data bases created and maintained by the DOE Technical Information Center is the standardized name for the corporate entity or the corporate author. The purpose of Energy Information Data Base: Corporate Author Entries (TID-4585-R1) and this supplemental list of authorized or standardized corporate entries is to provide a means for the consistent citing of the names of organizations in bibliographic records. In general, an entry in Corporate Author Entries consists of the seven-digit code number assigned to the particular corporate entity, the two-letter country code, the largest element of the corporate name, the location of the corporate entity, and the smallest element of the corporate name (if provided). This supplement [DOE/TIC-4585-R1(Suppl.5)] contains additions to the base document (TID-4585-R1) and is intended to be used with that publication

  5. Information encryption systems based on Boolean functions

    Directory of Open Access Journals (Sweden)

    Aureliu Zgureanu

    2011-02-01

    Full Text Available An information encryption system based on Boolean functions is proposed. Information processing is done using multidimensional matrices, performing logical operations with these matrices. The high level of security of the system rests on the computational complexity of building systems of Boolean functions that depend on many variables (tens or hundreds). Such a system of functions represents the private key. It varies both during the encryption and decryption of information and during the transition from one message to another.
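
    The idea of keying a cipher with a Boolean function of several variables can be illustrated with a deliberately tiny stream cipher sketch (the filter function f and the register update below are invented for illustration and are far too small to be secure):

```python
def keystream_bit(state):
    """A toy Boolean function of four variables used as the keystream
    filter: f(x1, x2, x3, x4) = (x1 AND x2) XOR x3 XOR x4."""
    x1, x2, x3, x4 = state
    return (x1 & x2) ^ x3 ^ x4

def register_states(seed, n):
    """Step a tiny 4-bit shift register n times, yielding each state."""
    s = list(seed)
    for _ in range(n):
        yield tuple(s)
        s = s[1:] + [s[0] ^ s[3]]

def apply_stream(bits, seed):
    """XOR the message bits with the keystream; the operation is
    self-inverse, so the same call encrypts and decrypts."""
    return [b ^ keystream_bit(st)
            for b, st in zip(bits, register_states(seed, len(bits)))]

message = [1, 0, 1, 1, 0, 0, 1, 0]
key = (1, 0, 1, 1)              # the private seed
cipher = apply_stream(message, key)
plain = apply_stream(cipher, key)
```

    In the system described above, security comes from using functions of tens or hundreds of variables and varying the function system per message, not from the register itself.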

  6. Improving information for community-based adaptation

    Energy Technology Data Exchange (ETDEWEB)

    Huq, Saleemul

    2011-10-15

    Community-based adaptation aims to empower local people to cope with and plan for the impacts of climate change. In a world where knowledge equals power, you could be forgiven for thinking that enabling this type of adaptation boils down to providing local people with information. Conventional approaches to planning adaptation rely on 'expert' advice and credible 'science' from authoritative information providers such as the Intergovernmental Panel on Climate Change. But to truly support the needs of local communities, this information needs to be more site-specific, more user-friendly and more inclusive of traditional knowledge and existing coping practices.

  7. Cognitive performance modeling based on general systems performance theory.

    Science.gov (United States)

    Kondraske, George V

    2010-01-01

    General Systems Performance Theory (GSPT) was initially motivated by problems associated with quantifying different aspects of human performance. It has proved to be invaluable for measurement development and understanding quantitative relationships between human subsystem capacities and performance in complex tasks. It is now desired to bring focus to the application of GSPT to modeling of cognitive system performance. Previous studies involving two complex tasks (i.e., driving and performing laparoscopic surgery) and incorporating measures that are clearly related to cognitive performance (information processing speed and short-term memory capacity) were revisited. A GSPT-derived method of task analysis and performance prediction termed Nonlinear Causal Resource Analysis (NCRA) was employed to determine the demand on basic cognitive performance resources required to support different levels of complex task performance. This approach is presented as a means to determine a cognitive workload profile and the subsequent computation of a single number measure of cognitive workload (CW). Computation of CW may be a viable alternative to measuring it. Various possible "more basic" performance resources that contribute to cognitive system performance are discussed. It is concluded from this preliminary exploration that a GSPT-based approach can contribute to defining cognitive performance models that are useful for both individual subjects and specific groups (e.g., military pilots).

  8. Statistical lamb wave localization based on extreme value theory

    Science.gov (United States)

    Harley, Joel B.

    2018-04-01

    Guided wave localization methods based on delay-and-sum imaging, matched field processing, and other techniques have been designed and researched to create images that locate and describe structural damage. The maximum value of such an image typically represents an estimated damage location. Yet it is often unclear whether this maximum value, or any other value in the image, is a statistically significant indicator of damage. Furthermore, there are currently few, if any, approaches to assess the statistical significance of guided wave localization images. As a result, we present statistical delay-and-sum and statistical matched field processing localization methods to create statistically significant images of damage. Our framework uses constant false alarm rate statistics and extreme value theory to detect damage with little prior information. We demonstrate our methods with in situ guided wave data from an aluminum plate to detect two 0.75 cm diameter holes. Our results show the expected improvement in statistical significance as the number of sensors increases. With seventeen sensors, both methods successfully detect damage with statistical significance.
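
    The extreme value theory step can be sketched as follows: fit a Gumbel distribution to the maxima of damage-free (noise-only) localization images, then declare damage only when an image maximum exceeds a high quantile of that fitted distribution. This is a minimal stdlib sketch with simulated Gaussian noise, not the authors' pipeline:

```python
import math
import random

random.seed(0)

def gumbel_fit(maxima):
    """Method-of-moments fit of a Gumbel distribution to block maxima."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((v - mean) ** 2 for v in maxima) / n
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - 0.5772156649 * beta   # Euler-Mascheroni constant
    return mu, beta

def gumbel_quantile(mu, beta, p):
    """Value exceeded by a noise-only image maximum with probability 1 - p."""
    return mu - beta * math.log(-math.log(p))

# Maxima of 200 simulated damage-free localization images (pure noise).
noise_maxima = [max(random.gauss(0.0, 1.0) for _ in range(100))
                for _ in range(200)]
mu, beta = gumbel_fit(noise_maxima)
threshold = gumbel_quantile(mu, beta, 0.99)   # 1% false-alarm threshold

damage_peak = 8.0   # a strong image maximum, well above the noise model
detected = damage_peak > threshold
```

    The Gumbel family is the natural model here because the maximum of many roughly independent pixel values converges to an extreme value distribution, which is what makes the false-alarm rate controllable.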

  9. A marketing decision support system based on game theory

    Directory of Open Access Journals (Sweden)

    Gordana Dukić

    2008-12-01

    Full Text Available Quantitative methods and models can be applied in numerous spheres of marketing decision-making. The choice of an optimal strategy in product advertising is one of the problems that marketing management often meets. The use of models developed within the framework of game theory makes it significantly easier to find solutions to the conflict situations that arise here. The decision support system presented in this work is based on the supposition that two opposed sides take part in the game. With the aim of improving the decision-making process, the starting model incorporates computer simulation of the percentage changes in market share that represent the elements of the payoff matrix. The random variables that represent them are assumed to follow the normal distribution, and their parameters must be estimated from relevant data. Information technology, computers and adequate software applications occupy a special position in solving and analyzing the suggested model; this kind of application is the basic characteristic of the decision support system.
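
    For a two-player zero-sum advertising game of the kind described, each side's security strategy can be read directly off the payoff matrix. A minimal sketch (the market-share numbers are hypothetical):

```python
def maximin(payoff):
    """Row player's security strategy: the row whose worst case is best."""
    row_mins = [min(row) for row in payoff]
    i = max(range(len(payoff)), key=lambda k: row_mins[k])
    return i, row_mins[i]

def minimax(payoff):
    """Column player's security strategy: the column whose best case
    (for the row player) is smallest."""
    ncols = len(payoff[0])
    col_maxs = [max(row[j] for row in payoff) for j in range(ncols)]
    j = min(range(ncols), key=lambda k: col_maxs[k])
    return j, col_maxs[j]

# Hypothetical percentage market-share changes: rows are our advertising
# strategies, columns are the competitor's responses.
payoff = [
    [2, -1, 3],
    [1,  0, 2],
    [-2, 4, 1],
]
i, v_low = maximin(payoff)
j, v_high = minimax(payoff)
# v_low == v_high would indicate a pure-strategy saddle point; here the
# gap means optimal play requires mixed strategies.
```

    In the system described, the payoff entries themselves would be drawn from fitted normal distributions rather than fixed, so this pure-strategy analysis would be repeated across simulated matrices.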

  10. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory

    Directory of Open Access Journals (Sweden)

    Kaijuan Yuan

    2016-01-01

    Full Text Available Sensor data fusion plays an important role in fault diagnosis. Dempster-Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient at combining evidence from different sensors. However, in situations where the evidence highly conflicts, it may produce a counterintuitive result. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability, but also the dynamic sensor reliability is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method performs better in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods.
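
    Dempster's rule of combination, the core operation the paper builds on, can be sketched for two sensor reports over a frame of two faults (the mass values are hypothetical, and the paper's reliability weighting is not reproduced here):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability
    assignments whose focal elements are frozensets."""
    combined, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb          # mass assigned to the empty set
    k = 1.0 - conflict                   # normalization constant
    return {s: v / k for s, v in combined.items()}, conflict

F1, F2 = frozenset({"F1"}), frozenset({"F2"})
theta = F1 | F2                          # the whole frame (ignorance)
# Two sensor reports about which fault occurred (hypothetical masses).
m1 = {F1: 0.6, F2: 0.1, theta: 0.3}
m2 = {F1: 0.7, F2: 0.2, theta: 0.1}
fused, conflict = dempster_combine(m1, m2)
```

    The conflict term is exactly what the paper's weighted-averaging step mitigates: when conflict approaches 1, the normalization by 1 - conflict amplifies whatever little agreement remains, producing the counterintuitive results the abstract mentions.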

  11. Continuing bonds in bereavement: an attachment theory based perspective.

    Science.gov (United States)

    Field, Nigel P; Gao, Beryl; Paderna, Lisa

    2005-05-01

    An attachment theory based perspective on the continuing bond to the deceased (CB) is proposed. The value of attachment theory in specifying the normative course of CB expression and in identifying adaptive versus maladaptive variants of CB expression based on their deviation from this normative course is outlined. The role of individual differences in attachment security on effective versus ineffective use of CB in coping with bereavement also is addressed. Finally, the moderating influence of type of loss (e.g., death of a spouse vs. child), culture, and religion on type of CB expression within an overarching attachment framework is discussed.

  12. Application of the Theory of Constraints in Project Based Structures

    Directory of Open Access Journals (Sweden)

    Martynas Sarapinas

    2011-04-01

    Full Text Available The article deals with the application of the Theory of Constraints (TOC) in project management. It involves a short introduction to TOC as a project management method and a deeper analysis of project management specifics when using TOC: TOC based project planning, timetable management, task synchronization, project control and the “relay runner work ethic”. Moreover, the article compares traditional and TOC based project management approaches and emphasizes the main benefits obtained as the results of the study. Article in Lithuanian.

  13. Friction Theory Prediction of Crude Oil Viscosity at Reservoir Conditions Based on Dead Oil Properties

    DEFF Research Database (Denmark)

    Cisneros, Sergio; Zeberg-Mikkelsen, Claus Kjær; Stenby, Erling Halfdan

    2003-01-01

    The general one-parameter friction theory (f-theory) models have been further extended to the prediction of the viscosity of real "live" reservoir fluids based on viscosity measurements of the "dead" oil and the compositional information of the live fluid. In this work, the representation of the viscosity...... of real fluids is obtained by a simple one-parameter tuning of a linear equation derived from a general one-parameter f-theory model. Further, this is achieved using simple cubic equations of state (EOS), such as the Peng-Robinson (PR) EOS or the Soave-Redlich-Kwong (SRK) EOS, which are commonly used...... within the oil industry. For the sake of completeness, this work also presents a simple characterization procedure based on the compositional information of an oil sample. This procedure provides a method for characterizing an oil into a number of compound groups along with the critical constants...

  14. Deconstructing dementia and delirium hospital practice: using cultural historical activity theory to inform education approaches.

    Science.gov (United States)

    Teodorczuk, Andrew; Mukaetova-Ladinska, Elizabeta; Corbett, Sally; Welfare, Mark

    2015-08-01

    Older patients with dementia and delirium receive suboptimal hospital care. Policy calls for more effective education to address this, though there is little consensus on what this entails. The purpose of this clarification study is to explore how practice gaps are constructed in relation to managing the confused hospitalised older patient. The intent is to inform educational processes in the workplace beyond traditional approaches such as training. Adopting grounded theory as a research method and working within a social constructionist paradigm, we explored the practice gaps of 15 healthcare professionals by interview and conducted five focus groups with patients, carers and Liaison mental health professionals. Data were thematically analysed by constant comparison, and theoretical sampling was undertaken until saturation was reached. Categories were identified and pragmatic concepts developed grounded within the data. Findings were then further analysed using cultural historical activity theory as a deductive lens. Practice gaps in relation to managing the confused older patient are determined by factors operating at the individual (knowledge and skill gaps, personal philosophy, task based practice), team (leadership, time and ward environmental factors) and organisational (power relationships, dominance of the medical model, fragmentation of care services) levels. Conceptually, practice appeared to be influenced by socio-cultural ward factors and compounded by a failure to join up existing "patient" knowledge amongst professionals. Applying cultural historical activity theory to further illuminate the findings, the central object is defined as learning about the patient and the mediating artifacts are the care relationships. The overarching medical dominance emerges as an important cultural historical factor at play, and staff rules and divisions of labour are exposed. Lastly, key contradictions and tensions in the system that work against learning about the patient are

  15. Information Superiority and Game Theory: The Value of Varying Levels of Information

    National Research Council Canada - National Science Library

    McIntosh, Gary

    2002-01-01

    .... This thesis examines how various levels of information and information superiority affect strategy choices and decision-making in determining the payoff value for opposing forces in a classic zero-sum two-sided contest...

  16. An Intuitionistic Fuzzy Stochastic Decision-Making Method Based on Case-Based Reasoning and Prospect Theory

    Directory of Open Access Journals (Sweden)

    Peng Li

    2017-01-01

    Full Text Available According to the case-based reasoning method and prospect theory, this paper mainly focuses on finding a way to obtain decision-makers’ preferences and the criterion weights for stochastic multicriteria decision-making problems and classify alternatives. Firstly, we construct a new score function for an intuitionistic fuzzy number (IFN considering the decision-making environment. Then, we aggregate the decision-making information in different natural states according to the prospect theory and test decision-making matrices. A mathematical programming model based on a case-based reasoning method is presented to obtain the criterion weights. Moreover, in the original decision-making problem, we integrate all the intuitionistic fuzzy decision-making matrices into an expectation matrix using the expected utility theory and classify or rank the alternatives by the case-based reasoning method. Finally, two illustrative examples are provided to illustrate the implementation process and applicability of the developed method.
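
    The prospect theory machinery the method relies on combines a value function, concave for gains and convex and steeper for losses, with an inverse-S probability weighting function. A minimal sketch using the classic Tversky-Kahneman (1992) parameter estimates (the paper's intuitionistic fuzzy extension is not reproduced here):

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Tversky-Kahneman value function: concave for gains, convex and
    steeper (loss aversion, factor lam) for losses."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

def probability_weight(p, gamma=0.61):
    """Inverse-S probability weighting: small probabilities are
    overweighted, large probabilities underweighted."""
    return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

def prospect_utility(prospect):
    """Overall prospect value of a list of (outcome, probability) pairs."""
    return sum(probability_weight(p) * prospect_value(x) for x, p in prospect)

gain = prospect_value(100)
loss = prospect_value(-100)
coin_flip = prospect_utility([(100, 0.5), (-100, 0.5)])
```

    Loss aversion makes a fair 50/50 gamble over symmetric outcomes unattractive, which is the behavioral asymmetry the decision-making method exploits when aggregating stochastic criteria.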

  17. Information theory and robotics meet to study predator-prey interactions

    Science.gov (United States)

    Neri, Daniele; Ruberto, Tommaso; Cord-Cruz, Gabrielle; Porfiri, Maurizio

    2017-07-01

    Transfer entropy holds promise to advance our understanding of animal behavior, by affording the identification of causal relationships that underlie animal interactions. A critical step toward the reliable implementation of this powerful information-theoretic concept entails the design of experiments in which causal relationships could be systematically controlled. Here, we put forward a robotics-based experimental approach to test the validity of transfer entropy in the study of predator-prey interactions. We investigate the behavioral response of zebrafish to a fear-evoking robotic stimulus, designed after the morpho-physiology of the red tiger oscar and actuated along preprogrammed trajectories. From the time series of the positions of the zebrafish and the robotic stimulus, we demonstrate that transfer entropy correctly identifies the influence of the stimulus on the focal subject. Building on this evidence, we apply transfer entropy to study the interactions between zebrafish and a live red tiger oscar. The analysis of transfer entropy reveals a change in the direction of the information flow, suggesting a mutual influence between the predator and the prey, where the predator adapts its strategy as a function of the movement of the prey, which, in turn, adjusts its escape as a function of the predator motion. Through the integration of information theory and robotics, this study posits a new approach to study predator-prey interactions in freshwater fish.
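
    For discrete time series, transfer entropy TE(X -> Y) measures how much knowing the past of X reduces uncertainty about the next value of Y beyond what Y's own past already provides. A minimal plug-in estimator for binary series with history length 1 (the fish trajectories in the study are continuous and would require symbolization first; the series below are invented):

```python
import math
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(X -> Y), in bits, for discrete
    series with history length 1."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles = Counter(y[:-1])
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_full = c / pairs_yx[(y0, x0)]             # p(y_{t+1} | y_t, x_t)
        p_self = pairs_yy[(y1, y0)] / singles[y0]   # p(y_{t+1} | y_t)
        te += p_joint * math.log2(p_full / p_self)
    return te

# x drives y with a one-step delay (noise-free, purely illustrative).
x = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0, 0]
y = [0] + x[:-1]
te_xy = transfer_entropy(x, y)
te_yx = transfer_entropy(y, x)
```

    Because y merely copies x with a lag, the estimated information flow X -> Y dominates the reverse direction, which is the kind of asymmetry the robotic-stimulus experiments were designed to validate.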

  18. Critical theory as an approach to the ethics of information security.

    Science.gov (United States)

    Stahl, Bernd Carsten; Doherty, Neil F; Shaw, Mark; Janicke, Helge

    2014-09-01

    Information security can be of high moral value. It can equally be used for immoral purposes and have undesirable consequences. In this paper we suggest that critical theory can facilitate a better understanding of possible ethical issues and can provide support when finding ways of addressing them. The paper argues that critical theory has intrinsic links to ethics and that it is possible to identify concepts frequently used in critical theory to pinpoint ethical concerns. Using the example of UK electronic medical records the paper demonstrates that a critical lens can highlight issues that traditional ethical theories tend to overlook. These are often linked to collective issues such as social and organisational structures, which philosophical ethics with its typical focus on the individual does not tend to emphasise. The paper suggests that this insight can help in developing ways of researching and innovating responsibly in the area of information security.

  19. INFORMATIONAL-METHODICAL SUPPORT OF THE COURSE «MATHEMATICAL LOGIC AND THEORY OF ALGORITHMS»

    Directory of Open Access Journals (Sweden)

    Y. I. Sinko

    2010-06-01

    Full Text Available This article examines the basic principles of a technique for training future teachers of mathematics in the foundations of mathematical logic and the theory of algorithms at the Kherson State University with the use of information technologies. A general description is given of the functioning of the methodical system for learning mathematical logic with the use of information technologies, in the variant where the information technologies are represented by «MatLog», an integrated specialized software environment for educational purposes.

  20. Language-based multimedia information retrieval

    NARCIS (Netherlands)

    de Jong, Franciska M.G.; Gauvain, J.L.; Hiemstra, Djoerd; Netter, K.

    2000-01-01

    This paper describes various methods and approaches for language-based multimedia information retrieval, which have been developed in the projects POP-EYE and OLIVE and which will be developed further in the MUMIS project. All of these projects aim at supporting automated indexing of video material.

  1. Nano-resonator frequency response based on strain gradient theory

    International Nuclear Information System (INIS)

    Miandoab, Ehsan Maani; Yousefi-Koma, Aghil; Pishkenari, Hossein Nejat; Fathi, Mohammad

    2014-01-01

    This paper aims to explore the dynamic behaviour of a nano-resonator under ac and dc excitation using strain gradient theory. To achieve this goal, the partial differential equation of nano-beam vibration is first converted to an ordinary differential equation by the Galerkin projection method and the lumped model is derived. Lumped parameters of the nano-resonator, such as linear and nonlinear springs and damper coefficients, are compared with those of classical theory and it is demonstrated that beams with smaller thickness display greater deviation from classical parameters. Stable and unstable equilibrium points based on classic and non-classical theories are also compared. The results show that, regarding the applied dc voltage, the dynamic behaviours expected by classical and non-classical theories are significantly different, such that one theory predicts the un-deformed shape as the stable condition, while the other theory predicts that the beam will experience bi-stability. To obtain the frequency response of the nano-resonator, a general equation including cubic and quadratic nonlinearities in addition to parametric electrostatic excitation terms is derived, and the analytical solution is determined using a second-order multiple scales method. Based on frequency response analysis, the softening and hardening effects given by two theories are investigated and compared, and it is observed that neglecting the size effect can lead to two completely different predictions in the dynamic behaviour of the resonators. The findings of this article can be helpful in the design and characterization of the size-dependent dynamic behaviour of resonators on small scales. (paper)

  2. Unifying ecology and macroevolution with individual-based theory.

    Science.gov (United States)

    Rosindell, James; Harmon, Luke J; Etienne, Rampal S

    2015-05-01

    A contemporary goal in both ecology and evolutionary biology is to develop theory that transcends the boundary between the two disciplines, to understand phenomena that cannot be explained by either field in isolation. This is challenging because macroevolution typically uses lineage-based models, whereas ecology often focuses on individual organisms. Here, we develop a new parsimonious individual-based theory by adding mild selection to the neutral theory of biodiversity. We show that this model generates realistic phylogenies showing a slowdown in diversification and also improves on the ecological predictions of neutral theory by explaining the occurrence of very common species. Moreover, we find the distribution of individual fitness changes over time, with average fitness increasing at a pace that depends positively on community size. Consequently, large communities tend to produce fitter species than smaller communities. These findings have broad implications beyond biodiversity theory, potentially impacting, for example, invasion biology and paleontology. © 2015 The Authors. Ecology Letters published by John Wiley & Sons Ltd and CNRS.
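
    The paper's idea of adding mild selection to neutral theory can be caricatured with a Moran model in which offspring occasionally found new species whose fitness is a small perturbation of the parent's. A minimal sketch (community size, mutation rate and the perturbation scale are arbitrary choices, not the authors' parameters):

```python
import random

random.seed(1)

def moran_step(community, fitness, mutation_rate, next_id):
    """One Moran step: one individual dies uniformly at random; the
    replacement is born to a parent chosen in proportion to fitness.
    With small probability the offspring founds a new species whose
    fitness is the parent's plus a small Gaussian perturbation."""
    dead = random.randrange(len(community))
    total = sum(fitness[s] for s in community)
    r = random.uniform(0.0, total)
    acc, parent = 0.0, community[-1]
    for s in community:
        acc += fitness[s]
        if acc >= r:
            parent = s
            break
    if random.random() < mutation_rate:
        child = next_id
        fitness[child] = fitness[parent] + random.gauss(0.0, 0.01)
        next_id += 1
    else:
        child = parent
    community[dead] = child
    return next_id

community = [0] * 100            # start monomorphic
fitness = {0: 1.0}
next_id = 1
for _ in range(10000):
    next_id = moran_step(community, fitness, 0.02, next_id)

richness = len(set(community))
mean_fitness = sum(fitness[s] for s in community) / len(community)
```

    Setting the perturbation scale to zero recovers the purely neutral model; with it, mean fitness drifts upward over time at a pace that, as the abstract notes, depends on community size.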

  3. A re-examination of information seeking behaviour in the context of activity theory

    Directory of Open Access Journals (Sweden)

    Wilson T.D.

    2006-01-01

    Full Text Available Introduction. Activity theory, developed in the USSR as a Marxist alternative to Western psychology, has been applied widely in educational studies and increasingly in human-computer interaction research. Argument. The key elements of activity theory (Motivation, Goal, Activity, Tools, Object, Outcome, Rules, Community and Division of labour) are all directly applicable to the conduct of information behaviour research. An activity-theoretical approach to information behaviour research would provide a sound basis for the elaboration of contextual issues and for the discovery of organizational and other contradictions that affect information behaviour. It may be used to aid the design and analysis of investigations. Elaboration. The basic ideas of activity theory are outlined and an attempt is made to harmonize different perspectives. A contrast is made between an activity system perspective and an activity process perspective, and a diagrammatic representation of the process perspective is offered. Conclusion. Activity theory is not a predictive theory but a conceptual framework within which different theoretical perspectives may be employed. Typically, it is suggested that several methods of data collection should be employed and that the time frame for investigation should be long enough for the full range of contextual issues to emerge. Activity theory offers not only a useful conceptual framework, but also a coherent terminology to be shared by researchers, and a rapidly developing body of literature in associated disciplines.

  4. A Model of Statistics Performance Based on Achievement Goal Theory.

    Science.gov (United States)

    Bandalos, Deborah L.; Finney, Sara J.; Geske, Jenenne A.

    2003-01-01

    Tests a model of statistics performance based on achievement goal theory. Both learning and performance goals affected achievement indirectly through study strategies, self-efficacy, and test anxiety. Implications of these findings for teaching and learning statistics are discussed. (Contains 47 references, 3 tables, 3 figures, and 1 appendix.)…

  5. Toward an Instructionally Oriented Theory of Example-Based Learning

    Science.gov (United States)

    Renkl, Alexander

    2014-01-01

    Learning from examples is a very effective means of initial cognitive skill acquisition. There is an enormous body of research on the specifics of this learning method. This article presents an instructionally oriented theory of example-based learning that integrates theoretical assumptions and findings from three research areas: learning from…

  6. New MPPT algorithm based on hybrid dynamical theory

    KAUST Repository

    Elmetennani, Shahrazed

    2014-11-01

    This paper presents a new maximum power point tracking algorithm based on hybrid dynamical theory. A multicell converter has been considered as an adaptation stage for the photovoltaic chain. The proposed algorithm is a hybrid automaton switching between eight different operating modes, and it has been validated by simulation tests under different working conditions. © 2014 IEEE.

  7. New unified field theory based on the conformal group

    Energy Technology Data Exchange (ETDEWEB)

    Pessa, E [Rome Univ. (Italy). Ist. di Matematica

    1980-10-01

    Based on a six-dimensional generalization of Maxwell's equations, a new unified theory of the electromagnetic and gravitational field is developed. Additional space-time coordinates are interpreted only as mathematical tools in order to obtain a linear realization of the four-dimensional conformal group.

  8. Technical Note: Application of Decision Theory Based Criteria for ...

    African Journals Online (AJOL)

    Technical Note: Application of Decision Theory Based Criteria for Structural Appraisal of a Building during Construction. ... Nigerian Journal of Technology ... reliability of concrete in a structure during construction, a case study of laboratory block for College of Continuing Education, University of Port Harcourt, Rivers State.

  9. Mobile applications for weight management: theory-based content analysis.

    Science.gov (United States)

    Azar, Kristen M J; Lesser, Lenard I; Laing, Brian Y; Stephens, Janna; Aurora, Magi S; Burke, Lora E; Palaniappan, Latha P

    2013-11-01

    The use of smartphone applications (apps) to assist with weight management is increasingly prevalent, but the quality of these apps is not well characterized. The goal of the study was to evaluate diet/nutrition and anthropometric tracking apps based on incorporation of features consistent with theories of behavior change. A comparative, descriptive assessment was conducted of the top-rated free apps in the Health and Fitness category available in the iTunes App Store. Health and Fitness apps (N=200) were evaluated using predetermined inclusion/exclusion criteria and categorized based on commonality in functionality, features, and developer description. Four researchers then evaluated the two most popular apps in each category using two instruments: one based on traditional behavioral theory (score range: 0-100) and the other on the Fogg Behavioral Model (score range: 0-6). Data collection and analysis occurred in November 2012. Eligible apps (n=23) were divided into five categories: (1) diet tracking; (2) healthy cooking; (3) weight/anthropometric tracking; (4) grocery decision making; and (5) restaurant decision making. The mean behavioral theory score was 8.1 (SD=4.2); the mean persuasive technology score was 1.9 (SD=1.7). The top-rated app on both scales was Lose It! by Fitnow Inc. All apps received low overall scores for inclusion of behavioral theory-based strategies. © 2013 American Journal of Preventive Medicine.

  10. Applications of decision theory to test-based decision making

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1987-01-01

    The use of Bayesian decision theory to solve problems in test-based decision making is discussed. Four basic decision problems are distinguished: (1) selection; (2) mastery; (3) placement; and (4) classification, the situation where each treatment has its own criterion. Each type of decision can be

  11. PhysarumSoft: An update based on rough set theory

    Science.gov (United States)

    Schumann, Andrew; Pancerz, Krzysztof

    2017-07-01

    PhysarumSoft is a software tool consisting of two modules developed for programming Physarum machines and simulating Physarum games, respectively. The paper briefly discusses what has been added since the last version released in 2015. New elements in both modules are based on rough set theory. Rough sets are used to model behaviour of Physarum machines and to describe strategy games.
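
    The abstract does not detail how rough sets model Physarum behaviour; as background, here is a minimal sketch of the core rough-set construction, lower and upper approximations over indiscernibility classes (the toy objects and attributes are invented for illustration):

    ```python
    # Hypothetical sketch: rough-set lower/upper approximation of a target set,
    # the basic construction that rough-set behaviour models build on.
    from collections import defaultdict

    def approximations(universe, attributes, target):
        """Group objects by their attribute vector (indiscernibility classes),
        then collect classes fully inside the target (lower approximation)
        and classes overlapping it (upper approximation)."""
        classes = defaultdict(set)
        for obj in universe:
            classes[tuple(attributes[obj])].add(obj)
        lower, upper = set(), set()
        for cls in classes.values():
            if cls <= target:
                lower |= cls
            if cls & target:
                upper |= cls
        return lower, upper

    # Toy example: four objects described by two binary attributes.
    attrs = {"a": (0, 1), "b": (0, 1), "c": (1, 0), "d": (1, 1)}
    lower, upper = approximations(attrs, attrs, {"a", "c"})
    # "a" and "b" are indiscernible, so "a" appears only in the upper approximation.
    ```

    Objects in the boundary region (upper minus lower) are exactly those the attribute data cannot classify with certainty.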

  12. New MPPT algorithm based on hybrid dynamical theory

    KAUST Repository

    Elmetennani, Shahrazed; Laleg-Kirati, Taous-Meriem; Benmansour, K.; Boucherit, M. S.; Tadjine, M.

    2014-01-01

    This paper presents a new maximum power point tracking algorithm based on hybrid dynamical theory. A multicell converter has been considered as an adaptation stage for the photovoltaic chain. The proposed algorithm is a hybrid automaton switching between eight different operating modes, and it has been validated by simulation tests under different working conditions. © 2014 IEEE.

  13. Unifying ecology and macroevolution with individual-based theory

    NARCIS (Netherlands)

    Rosindell, James; Harmon, Luke J.; Etienne, Rampal S.

    A contemporary goal in both ecology and evolutionary biology is to develop theory that transcends the boundary between the two disciplines, to understand phenomena that cannot be explained by either field in isolation. This is challenging because macroevolution typically uses lineage-based models,

  14. Analytical implications of using practice theory in workplace information literacy research

    DEFF Research Database (Denmark)

    Moring, Camilla Elisabeth; Lloyd, Annemaree

    2013-01-01

    Introduction: This paper considers practice theory and the analytical implications of using this theoretical approach in information literacy research. More precisely the aim of the paper is to discuss the translation of practice theoretical assumptions into strategies that frame the analytical...... focus and interest when researching workplace information literacy. Two practice theoretical perspectives are selected, one by Theodore Schatzki and one by Etienne Wenger, and their general commonalities and differences are analysed and discussed. Analysis: The two practice theories and their main ideas...... of what constitute practices, how practices frame social life and the central concepts used to explain this, are presented. Then the application of the theories within workplace information literacy research is briefly explored. Results and Conclusion: The two theoretical perspectives share some...

  15. Mapping site-based construction workers’ motivation: Expectancy theory approach

    Directory of Open Access Journals (Sweden)

    Parviz Ghoddousi

    2014-03-01

    Full Text Available The aim of this study is to apply a recently proposed model of motivation based on expectancy theory to site-based workers in the construction context and confirm the validity of this model for the construction industry. The study drew upon data from 194 site-based construction workers in Iran to test the proposed model of motivation. To this end, the structural equation modelling (SEM) approach based on the confirmatory factor analysis (CFA) technique was deployed. The study reveals that the proposed model of expectancy theory incorporating five indicators (i.e. intrinsic instrumentality, extrinsic instrumentality, intrinsic valence, extrinsic valence and expectancy) is able to map the process of construction workers’ motivation. Nonetheless, the findings posit that intrinsic indicators could be more effective than extrinsic ones, underlining the necessity of construction managers placing further focus on intrinsic motivators to motivate workers.

  17. Evidence for an expectancy-based theory of avoidance behaviour.

    Science.gov (United States)

    Declercq, Mieke; De Houwer, Jan; Baeyens, Frank

    2008-01-01

    In most studies on avoidance learning, participants receive an aversive unconditioned stimulus after a warning signal is presented, unless the participant performs a particular response. Lovibond (2006) recently proposed a cognitive theory of avoidance learning, according to which avoidance behaviour is a function of both Pavlovian and instrumental conditioning. In line with this theory, we found that avoidance behaviour was based on an integration of acquired knowledge about, on the one hand, the relation between stimuli and, on the other hand, the relation between behaviour and stimuli.

  18. Theory-based Bayesian models of inductive learning and reasoning.

    Science.gov (United States)

    Tenenbaum, Joshua B; Griffiths, Thomas L; Kemp, Charles

    2006-07-01

    Inductive inference allows humans to make powerful generalizations from sparse data when learning about word meanings, unobserved properties, causal relationships, and many other aspects of the world. Traditional accounts of induction emphasize either the power of statistical learning, or the importance of strong constraints from structured domain knowledge, intuitive theories or schemas. We argue that both components are necessary to explain the nature, use and acquisition of human knowledge, and we introduce a theory-based Bayesian framework for modeling inductive learning and reasoning as statistical inferences over structured knowledge representations.
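
    The abstract's combination of statistical learning with structured knowledge can be illustrated by a toy posterior computation over hypotheses (the hypothesis space and "size-principle" likelihood below are standard textbook devices, not the authors' implementation):

    ```python
    # Illustrative sketch: Bayesian induction as posterior inference over
    # structured hypotheses, given sparse examples.
    hypotheses = {
        "even": {2, 4, 6, 8, 10, 12, 14, 16},
        "powers_of_2": {2, 4, 8, 16},
        "multiples_of_4": {4, 8, 12, 16},
    }
    prior = {h: 1 / len(hypotheses) for h in hypotheses}

    def posterior(data):
        """Size-principle likelihood: each example is assumed sampled uniformly
        from the hypothesis' extension, so tighter hypotheses score higher."""
        scores = {}
        for h, ext in hypotheses.items():
            if all(x in ext for x in data):
                scores[h] = prior[h] * (1 / len(ext)) ** len(data)
            else:
                scores[h] = 0.0
        z = sum(scores.values())
        return {h: s / z for h, s in scores.items()}

    post = posterior([4, 8, 16])
    # All three hypotheses are consistent with the data, but the two
    # smaller extensions receive most of the posterior mass.
    ```

    This shows the interplay the abstract describes: domain structure supplies the hypothesis space, and statistics decides among the candidates.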

  19. Web-based Construction Information Management System

    Directory of Open Access Journals (Sweden)

    David Scott

    2012-11-01

    Full Text Available Centralised information systems that are accessible to all parties in a construction project are powerful tools in the quest to improve efficiency and to enhance the flow of information within the construction industry. This report points out the maturity of the necessary IT technology, the availability and the suitability of existing commercial products. Some of these products have been studied and analysed. An evaluation and selection process based on the functions offered in the products and their utility is presented. A survey of local construction personnel has been used to collect typical weighting data and performance criteria used in the evaluation process.
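
    An evaluation and selection process of the kind described, survey-derived criterion weights applied to per-product ratings, reduces to a weighted sum; the criteria, weights and product names below are invented for illustration:

    ```python
    # Hypothetical weighted-sum product selection. Weights (e.g. from a
    # survey) and ratings are assumed values, not the report's data.
    weights = {"document_control": 0.4, "messaging": 0.25, "cost": 0.35}
    products = {
        "SystemA": {"document_control": 4, "messaging": 3, "cost": 2},
        "SystemB": {"document_control": 3, "messaging": 4, "cost": 4},
    }

    def score(ratings):
        """Total utility: criterion weight times the product's rating."""
        return sum(weights[c] * ratings[c] for c in weights)

    best = max(products, key=lambda p: score(products[p]))
    ```

    Here SystemB wins despite a lower document-control rating, because the weights encode the surveyed priorities.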

  20. On divergence of finite measures and their applicability in statistics and information theory

    Czech Academy of Sciences Publication Activity Database

    Vajda, Igor; Stummer, W.

    2009-01-01

    Roč. 44, č. 2 (2009), s. 169-187 ISSN 0233-1888 R&D Projects: GA MŠk(CZ) 1M0572; GA ČR(CZ) GA102/07/1131 Institutional research plan: CEZ:AV0Z10750506 Keywords : Local and global divergences of finite measures * Divergences of sigma-finite measures * Statistical censoring * Pinsker's inequality, Ornstein's distance * Differential power entropies Subject RIV: BD - Theory of Information Impact factor: 0.759, year: 2009 http://library.utia.cas.cz/separaty/2009/SI/vajda-on divergence of finite measures and their applicability in statistics and information theory.pdf
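
    Two of the record's keywords, divergences of finite measures and Pinsker's inequality, can be checked numerically for discrete distributions; this sketch uses the Kullback-Leibler divergence in nats and the bound D(P||Q) >= 2·d_TV(P,Q)², with an invented pair of distributions:

    ```python
    # Numerical check of Pinsker's inequality for two toy distributions.
    import math

    def kl(p, q):
        """Kullback-Leibler divergence D(P||Q) in nats (0*log(0) := 0)."""
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    def total_variation(p, q):
        """Total variation distance: half the L1 distance."""
        return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

    p = [0.5, 0.3, 0.2]
    q = [0.4, 0.4, 0.2]
    assert kl(p, q) >= 2 * total_variation(p, q) ** 2  # Pinsker's inequality
    ```

    Pinsker's inequality is one of the standard tools for bounding statistical distance by divergence, which is what makes divergences useful in the censoring applications the record mentions.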

  1. Extending Theory-Based Quantitative Predictions to New Health Behaviors.

    Science.gov (United States)

    Brick, Leslie Ann D; Velicer, Wayne F; Redding, Colleen A; Rossi, Joseph S; Prochaska, James O

    2016-04-01

    Traditional null hypothesis significance testing suffers many limitations and is poorly adapted to theory testing. A proposed alternative approach, called Testing Theory-based Quantitative Predictions, uses effect size estimates and confidence intervals to directly test predictions based on theory. This paper replicates findings from previous smoking studies and extends the approach to diet and sun protection behaviors using baseline data from a Transtheoretical Model behavioral intervention (N = 5407). Effect size predictions were developed using two methods: (1) applying refined effect size estimates from previous smoking research or (2) using predictions developed by an expert panel. Thirteen of 15 predictions were confirmed for smoking. For diet, 7 of 14 predictions were confirmed using smoking predictions and 6 of 16 using expert panel predictions. For sun protection, 3 of 11 predictions were confirmed using smoking predictions and 5 of 19 using expert panel predictions. Expert panel predictions and smoking-based predictions poorly predicted effect sizes for diet and sun protection constructs. Future studies should aim to use previous empirical data to generate predictions whenever possible. The best results occur when there have been several iterations of predictions for a behavior, such as with smoking, demonstrating that expected values begin to converge on the population effect size. Overall, the study supports the necessity of strengthening and revising theory with empirical data.
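
    The confirmation logic of this approach, in simplified form, is to count a theory-based prediction as confirmed when the predicted effect size falls inside the confidence interval estimated from the data; the construct names and numbers below are hypothetical, not values from the paper:

    ```python
    # Hedged sketch of prediction testing via effect sizes and confidence
    # intervals. All values here are assumed for illustration.
    predictions = {"construct_A": 0.50, "construct_B": 0.20}
    observed_ci = {"construct_A": (0.42, 0.61), "construct_B": (0.31, 0.45)}

    # A prediction is confirmed when it lies within the observed interval.
    confirmed = {
        name: lo <= predictions[name] <= hi
        for name, (lo, hi) in observed_ci.items()
    }
    # construct_A's prediction falls inside its CI; construct_B's does not.
    ```

    Tallies like "13 of 15 predictions confirmed" then follow from counting the `True` entries across constructs.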

  2. Systematizing Web Search through a Meta-Cognitive, Systems-Based, Information Structuring Model (McSIS)

    Science.gov (United States)

    Abuhamdieh, Ayman H.; Harder, Joseph T.

    2015-01-01

    This paper proposes a meta-cognitive, systems-based, information structuring model (McSIS) to systematize online information search behavior based on a literature review of information-seeking models. The General Systems Theory's (GST) propositions serve as its framework. Factors influencing information-seekers, such as the individual learning…

  3. An integrative, experience-based theory of attentional control.

    Science.gov (United States)

    Wilder, Matthew H; Mozer, Michael C; Wickens, Christopher D

    2011-02-09

    Although diverse, theories of visual attention generally share the notion that attention is controlled by some combination of three distinct strategies: (1) exogenous cuing from locally contrasting primitive visual features, such as abrupt onsets or color singletons (e.g., L. Itti, C. Koch, & E. Niebur, 1998), (2) endogenous gain modulation of exogenous activations, used to guide attention to task-relevant features (e.g., V. Navalpakkam & L. Itti, 2007; J. Wolfe, 1994, 2007), and (3) endogenous prediction of likely locations of interest, based on task and scene gist (e.g., A. Torralba, A. Oliva, M. Castelhano, & J. Henderson, 2006). However, little work has been done to synthesize these disparate theories. In this work, we propose a unifying conceptualization in which attention is controlled along two dimensions: the degree of task focus and the contextual scale of operation. Previously proposed strategies, and their combinations, can be viewed as instances of this one mechanism. Thus, this theory serves not as a replacement for existing models but as a means of bringing them into a coherent framework. We present an implementation of this theory and demonstrate its applicability to a wide range of attentional phenomena. The model accounts for key results in visual search with synthetic images and makes reasonable predictions for human eye movements in search tasks involving real-world images. In addition, the theory offers an unusual perspective on attention that places a fundamental emphasis on the role of experience and task-related knowledge.

  4. Knowledge-based information systems in practice

    CERN Document Server

    Jain, Lakhmi; Watada, Junzo; Howlett, Robert

    2015-01-01

    This book contains innovative research from leading researchers who presented their work at the 17th International Conference on Knowledge-Based and Intelligent Information and Engineering Systems, KES 2013, held in Kitakyusha, Japan, in September 2013. The conference provided a competitive field of 236 contributors, from which 38 authors expanded their contributions and only 21 published. A plethora of techniques and innovative applications are represented within this volume. The chapters are organized using four themes. These topics include: data mining, knowledge management, advanced information processes and system modelling applications. Each topic contains multiple contributions and many offer case studies or innovative examples. Anyone that wants to work with information repositories or process knowledge should consider reading one or more chapters focused on their technique of choice. They may also benefit from reading other chapters to assess if an alternative technique represents a more suitable app...

  5. How cultural evolutionary theory can inform social psychology and vice versa.

    Science.gov (United States)

    Mesoudi, Alex

    2009-10-01

    Cultural evolutionary theory is an interdisciplinary field in which human culture is viewed as a Darwinian process of variation, competition, and inheritance, and the tools, methods, and theories developed by evolutionary biologists to study genetic evolution are adapted to study cultural change. It is argued here that an integration of the theories and findings of mainstream social psychology and of cultural evolutionary theory can be mutually beneficial. Social psychology provides cultural evolution with a set of empirically verified microevolutionary cultural processes, such as conformity, model-based biases, and content biases, that are responsible for specific patterns of cultural change. Cultural evolutionary theory provides social psychology with ultimate explanations for, and an understanding of the population-level consequences of, many social psychological phenomena, such as social learning, conformity, social comparison, and intergroup processes, as well as linking social psychology with other social science disciplines such as cultural anthropology, archaeology, and sociology.

  6. Automatic Trading Agent. RMT Based Portfolio Theory and Portfolio Selection

    Science.gov (United States)

    Snarska, M.; Krzych, J.

    2006-11-01

    Portfolio theory is a very powerful tool in modern investment theory. It is helpful in estimating the risk of an investor's portfolio, which arises from lack of information, uncertainty and incomplete knowledge of reality, and which forbids a perfect prediction of future price changes. Despite its many advantages, this tool is not well known and not widely used among investors on the Warsaw Stock Exchange. The main reason for abandoning this method is its high level of complexity and immense calculations. The aim of this paper is to introduce an automatic decision-making system, which allows a single investor to use the complex methods of Modern Portfolio Theory (MPT). The key tool in MPT is the analysis of an empirical covariance matrix. This matrix, obtained from historical data, is biased by such a high amount of statistical uncertainty that it can be seen as random. By bringing into practice the ideas of Random Matrix Theory (RMT), the noise is removed or significantly reduced, so the future risk and return are better estimated and controlled. These concepts are applied to the Warsaw Stock Exchange Simulator {http://gra.onet.pl}. The result of the simulation is an 18% gain, in comparison with a respective 10% loss of the Warsaw Stock Exchange main index WIG.
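
    The RMT noise removal the abstract refers to is commonly done by "clipping" eigenvalues of the empirical correlation matrix that fall below the Marchenko-Pastur upper edge; the paper's exact recipe is not given in the abstract, so this is a sketch of the standard procedure on synthetic data:

    ```python
    # Illustrative RMT cleaning of an empirical correlation matrix:
    # eigenvalues below the Marchenko-Pastur edge are treated as noise
    # and flattened to their average (a common clipping recipe).
    import numpy as np

    rng = np.random.default_rng(0)
    T, N = 500, 50                          # observations x assets
    returns = rng.standard_normal((T, N))   # synthetic return series
    C = np.corrcoef(returns, rowvar=False)  # empirical correlation matrix

    q = N / T
    lambda_max = (1 + np.sqrt(q)) ** 2      # Marchenko-Pastur upper edge
    vals, vecs = np.linalg.eigh(C)

    noise = vals < lambda_max
    vals_clean = vals.copy()
    vals_clean[noise] = vals[noise].mean()  # flatten the noise band
    C_clean = vecs @ np.diag(vals_clean) @ vecs.T
    ```

    Replacing the noise eigenvalues by their mean preserves the trace, so total variance is unchanged while the spurious structure in the noise band is discarded before portfolio optimization.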

  7. Theory-Based Stakeholder Evaluation – applied. Competing Stakeholder Theories in the Quality Management of Primary Education

    DEFF Research Database (Denmark)

    Hansen, Morten Balle; Heilesen, J. B.

    In the broader context of evaluation design, this paper examines and compares pros and cons of a theory-based approach to evaluation (TBE) with the Theory-Based Stakeholder evaluation (TSE) model, introduced by Morten Balle Hansen and Evert Vedung (Hansen and Vedung 2010). While most approaches...... to TBE construct one unitary theory of the program (Coryn et al. 2011), the TSE-model emphasizes the importance of keeping theories of diverse stakeholders apart. This paper applies the TSE-model to an evaluation study conducted by the Danish Evaluation Institute (EVA) of the Danish system of quality......-model, as an alternative to traditional program theory evaluation....

  8. Fast mutual-information-based contrast enhancement

    Science.gov (United States)

    Cao, Gang; Yu, Lifang; Tian, Huawei; Huang, Xianglin; Wang, Yongbin

    2017-07-01

    Recently, T. Celik proposed an effective image contrast enhancement (CE) method based on spatial mutual information and PageRank (SMIRANK). According to state-of-the-art evaluation criteria, it achieves the best visual enhancement quality among existing global CE methods. However, SMIRANK runs much more slowly than its counterparts, such as histogram equalization (HE) and adaptive gamma correction, and low computational complexity is also required of a good CE algorithm. In this paper, we propose a fast SMIRANK algorithm, called FastSMIRANK. It integrates both spatial and gray-level downsampling into the generation of the pixel-value mapping function. Moreover, the computation of rank vectors is sped up by replacing PageRank with a simple yet efficient row-based operation on the mutual information matrix. Extensive experimental results show that the proposed FastSMIRANK accelerates SMIRANK by about 20 times and is even faster than HE, while preserving comparable enhancement quality.
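
    The abstract benchmarks speed against histogram equalization; for reference, here is a minimal HE baseline for an 8-bit grayscale image (FastSMIRANK itself additionally downsamples and operates on a mutual information matrix, which the abstract does not specify in enough detail to reproduce):

    ```python
    # Minimal histogram equalization (HE) baseline for an 8-bit image.
    import numpy as np

    def equalize(img):
        """Map each gray level through the normalized cumulative histogram."""
        hist = np.bincount(img.ravel(), minlength=256)
        cdf = hist.cumsum()
        cdf = cdf / cdf[-1]                       # normalize to [0, 1]
        mapping = np.round(255 * cdf).astype(np.uint8)
        return mapping[img]                       # per-pixel lookup

    # Low-contrast test strip: gray levels confined to [64, 127].
    img = np.tile(np.arange(64, 128, dtype=np.uint8), (32, 1))
    out = equalize(img)
    # The 64 occupied gray levels are stretched across nearly the full range.
    ```

    Like SMIRANK, HE is a global method: it derives a single pixel-value mapping function and applies it to every pixel, which is what makes such methods fast.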

  9. An evaluation of web-based information.

    Science.gov (United States)

    Murphy, Rebecca; Frost, Susie; Webster, Peter; Schmidt, Ulrike

    2004-03-01

    To evaluate the quality of web-based information on the treatment of eating disorders and to investigate potential indicators of content quality. Two search engines were queried to obtain 15 commonly accessed websites about eating disorders. Two reviewers evaluated the characteristics, quality of content, and accountability of the sites. Intercorrelations between variables were calculated. The overall quality of the sites was poor based on the outcome measures used. All quality of content measures correlated with a measure of accountability (Silberg, W.M., Lundberg, G.D., & Mussachio, R.A., 1993). There is a lack of quality information on the treatment of eating disorders on the web. Although accountability criteria may be useful indicators of content quality, there is a need to investigate whether these can be usefully applied to other mental health areas. Copyright 2004 by Wiley Periodicals, Inc. Int J Eat Disord 35: 145-154, 2004.

  10. An Introduction to Black Holes, Information and the String Theory Revolution: The Holographic Universe

    International Nuclear Information System (INIS)

    Israel, W

    2006-01-01

    lead to inconsistencies. Students and non-specialists will welcome this book, which provides an entry into this fascinating realm at a level that can be enjoyed by an enterprising undergraduate. The first chapter introduces the Schwarzschild black hole and the various coordinate systems used for its description. In four brief chapters (29 pages) the authors then manage a clear presentation of the thermal properties of quantum fields in Rindler and Schwarzschild space that skirts the operator formalism of QFT. Two further chapters treat charged black holes and the stretched-horizon description of black hole electrodynamics. Chapter 8, 'The Laws of Nature', explains how information is quantified, the quantum xerox principle and the entanglement entropy of black holes, with a detailed account of how this evolves as the hole evaporates. This sets the stage for a discussion of the black hole information puzzle and the complementarity principle in chapter 9. The pace heats up in the second part of the book, which in 48 pages sketches a variety of topics: Bousso's entropy bound and holography, the AdS/CFT correspondence, a 13 page introduction to string theory and the ideas underlying the string-based derivations of the entropy-area relation for higher-dimensional black holes. This well-planned, stimulating and sometimes provocative book can be enthusiastically recommended. (book review)

  11. Hamiltonian theories quantization based on a probability operator

    International Nuclear Information System (INIS)

    Entral'go, E.E.

    1986-01-01

    The quantization method with a linear mapping of classical coordinate-momentum-time functions Λ(q,p,t) to quantum operators in a space of quantum states ψ is considered. The probability operator satisfies a system of equations representing the principles of dynamical and canonical correspondence between the classical and quantum theories. Quantization based on a probability operator leads to a quantum theory with a nonnegative joint coordinate-momentum distribution function for any state ψ. The main consequences of quantum mechanics with a probability operator are discussed in comparison with the generally accepted quantum and classical theories. It is shown that a probability operator leads to the appearance of some new notions, called ''subquantum'' ones. Hence the quantum theory with a probability operator does not pretend to any complete description of physical reality in terms of classical variables, and for this reason it contains no problems like the Einstein-Podolsky-Rosen paradox. The results of some concrete problems are given: a free particle, a harmonic oscillator, an electron in the Coulomb field. These results give hope for the possibility of an experimental verification of quantization based on a probability operator

  12. Neighborhood Hypergraph Based Classification Algorithm for Incomplete Information System

    Directory of Open Access Journals (Sweden)

    Feng Hu

    2015-01-01

    Full Text Available The problem of classification in incomplete information systems is a hot issue in intelligent information processing. The hypergraph is a new intelligent method for machine learning. However, it is hard to process an incomplete information system with the traditional hypergraph, for two reasons: (1) the hyperedges are generated randomly in the traditional hypergraph model; (2) the existing methods are unsuitable for dealing with incomplete information systems because of their missing values. In this paper, we propose a novel classification algorithm for incomplete information systems based on the hypergraph model and rough set theory. First, we initialize the hypergraph. Second, we classify the training set by the neighborhood hypergraph. Third, under the guidance of rough sets, we replace the poor hyperedges. After that, we obtain a good classifier. The proposed approach is tested on 15 data sets from the UCI machine learning repository. Furthermore, it is compared with some existing methods, such as C4.5, SVM, Naive Bayes, and KNN. The experimental results show that the proposed algorithm has better performance in terms of Precision, Recall, AUC, and F-measure.

  13. A Spread Willingness Computing-Based Information Dissemination Model

    Science.gov (United States)

    Cui, Zhiming; Zhang, Shukui

    2014-01-01

    This paper constructs a spread-willingness-computing-based information dissemination model for social networks. The model takes into account the impact of node degree and the dissemination mechanism, combines complex network theory with the dynamics of infectious diseases, and establishes the dynamical evolution equations. The equations characterize the evolutionary relationship between different types of nodes over time. The spread willingness computation contains three factors that affect a user's spreading behavior: the strength of the relationship between the nodes, views identity, and frequency of contact. Simulation results show that nodes of different degrees follow the same trend in the network, and even if the degree of a node is very small, there is a likelihood of a large area of information dissemination. The weaker the relationship between nodes, the higher the probability of views selection and the higher the frequency of contact with information, so that information spreads rapidly and leads to a wide range of dissemination. As the dissemination probability and immune probability change, the speed of information dissemination changes accordingly. The study matches social networking features and can help to master the behavior of users and to understand and analyze the characteristics of information dissemination in social networks. PMID:25110738
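
    The epidemic-style dynamics the abstract describes can be sketched as discrete-time spreading on a contact network, with a per-edge "spread willingness" combining the three named factors; the random network, the random stand-ins for the factors, and the equal weighting are all assumptions for illustration, not the paper's equations:

    ```python
    # Hedged sketch of spread-willingness-driven dissemination on a network.
    import random

    random.seed(1)
    N = 200
    neighbors = {i: set() for i in range(N)}
    for _ in range(600):                          # random contact network
        a, b = random.sample(range(N), 2)
        neighbors[a].add(b)
        neighbors[b].add(a)

    def willingness(u, v):
        # The three factors named in the abstract (random stand-ins here),
        # combined with an assumed equal weighting.
        tie_strength, views_identity, contact_freq = (random.random() for _ in range(3))
        return (tie_strength + views_identity + contact_freq) / 3

    informed = {0}                                # a single seed node
    for _ in range(30):                           # discrete-time spreading steps
        newly = set()
        for u in informed:
            for v in neighbors[u]:
                if v not in informed and random.random() < willingness(u, v):
                    newly.add(v)
        informed |= newly

    coverage = len(informed) / N
    ```

    Even from a single low-degree seed, the spread typically reaches a large fraction of the connected network, matching the qualitative behavior the simulations report.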

  14. A spread willingness computing-based information dissemination model.

    Science.gov (United States)

    Huang, Haojing; Cui, Zhiming; Zhang, Shukui

    2014-01-01

    This paper constructs a spread-willingness-computing-based information dissemination model for social networks. The model takes into account the impact of node degree and the dissemination mechanism, combines complex network theory with the dynamics of infectious diseases, and establishes the dynamical evolution equations. The equations characterize the evolutionary relationship between different types of nodes over time. The spread willingness computation contains three factors that affect a user's spreading behavior: the strength of the relationship between the nodes, views identity, and frequency of contact. Simulation results show that nodes of different degrees follow the same trend in the network, and even if the degree of a node is very small, there is a likelihood of a large area of information dissemination. The weaker the relationship between nodes, the higher the probability of views selection and the higher the frequency of contact with information, so that information spreads rapidly and leads to a wide range of dissemination. As the dissemination probability and immune probability change, the speed of information dissemination changes accordingly. The study matches social networking features and can help to master the behavior of users and to understand and analyze the characteristics of information dissemination in social networks.

  15. A Spread Willingness Computing-Based Information Dissemination Model

    Directory of Open Access Journals (Sweden)

    Haojing Huang

    2014-01-01

    Full Text Available This paper constructs a spread-willingness-computing-based information dissemination model for social networks. The model takes into account the impact of node degree and the dissemination mechanism, combines complex network theory with the dynamics of infectious diseases, and establishes the dynamical evolution equations. The equations characterize the evolutionary relationship between different types of nodes over time. The spread willingness computation contains three factors that affect a user's spreading behavior: the strength of the relationship between the nodes, views identity, and frequency of contact. Simulation results show that nodes of different degrees follow the same trend in the network, and even if the degree of a node is very small, there is a likelihood of a large area of information dissemination. The weaker the relationship between nodes, the higher the probability of views selection and the higher the frequency of contact with information, so that information spreads rapidly and leads to a wide range of dissemination. As the dissemination probability and immune probability change, the speed of information dissemination changes accordingly. The study matches social networking features and can help to master the behavior of users and to understand and analyze the characteristics of information dissemination in social networks.

  16. Commitment-based action: Rational choice theory and contrapreferential choice

    Directory of Open Access Journals (Sweden)

    Radovanović Bojana

    2014-01-01

    Full Text Available This paper focuses on Sen's concept of contrapreferential choice, which Sen developed in order to overcome weaknesses of rational choice theory. According to rational choice theory, a decision-maker can always be seen as someone who maximises utility, and each choice he makes as the one that brings him the highest level of personal wellbeing. Sen argues that in some situations we choose alternatives that bring us a lower level of wellbeing than we could have achieved had we chosen some other alternative available to us. This happens when we base our decisions on moral principles, when we act out of duty; Sen calls such action commitment-based action. When we act out of commitment we set aside our preferences and thus, as Sen argues, make a contrapreferential choice. This paper shows that, contrary to Sen, commitment-based action can be explained within the framework of rational choice theory. However, when every choice we make can be explained within that framework, when the maximisation principle can be read into everything we do, then the variety of our motives and traits is lost, and the explanatory power of rational choice theory becomes questionable. [Project of the Ministry of Science of the Republic of Serbia, no. 47009: European integration and socio-economic changes of the Serbian economy on the road to the EU, and no. 179015: Challenges and prospects of structural change in Serbia: strategic directions of economic development and alignment with EU requirements]

  17. Information Filtering Based on Users' Negative Opinions

    Science.gov (United States)

    Guo, Qiang; Li, Yang; Liu, Jian-Guo

    2013-05-01

    The process of heat conduction (HC) has recently found application in information filtering [Zhang et al., Phys. Rev. Lett. 99, 154301 (2007)], where it yields high diversity but low accuracy. The classical HC model predicts a user's potentially interesting objects based on the objects the user likes, regardless of negative opinions. In terms of users' rating scores, we present an improved user-based HC (UHC) information model that takes both positive and negative opinions into account. First, the objects rated by users are divided into positive and negative categories; then the predicted interesting and disliked object lists are generated by the UHC model. Finally, the recommendation lists are constructed by filtering the disliked objects out of the interesting lists. Implementing the new model with nine similarity measures, the experimental results on the MovieLens and Netflix datasets show that considering negative opinions greatly enhances the accuracy, measured by the average ranking score, from 0.049 to 0.036 for Netflix and from 0.1025 to 0.0570 for MovieLens, reductions of 26.53% and 44.39%, respectively. Since users prefer to give positive ratings rather than negative ones, the negative opinions carry much more information than the positive ones; negative opinions are therefore very important for understanding users' online collective behaviors and improving the performance of the HC model.
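    As a rough sketch of the heat-conduction idea and the negative-opinion filter (not the paper's exact algorithm: the toy ratings, the like/dislike threshold of 3, the single diffusion pass, and the size of the "top disliked" cut are all assumptions):

    ```python
    import numpy as np

    # Toy rating matrix (users x objects), 0 = unrated, ratings 1-5 (illustrative).
    R = np.array([
        [5, 4, 0, 1, 0],
        [4, 0, 5, 2, 1],
        [0, 5, 4, 0, 2],
        [1, 2, 0, 5, 4],
    ])
    LIKE = R >= 3                  # positive opinions (assumed threshold)
    DISLIKE = (R > 0) & (R < 3)    # negative opinions

    def heat_conduction(A, user):
        """One diffusion round on the user-object bipartite graph:
        user temperatures average over their collected objects,
        object temperatures average over their collecting users."""
        A = A.astype(float)
        f = A[user].copy()                       # initial heat on the user's objects
        deg_u = np.maximum(A.sum(axis=1), 1)     # user degrees (guard zeros)
        deg_o = np.maximum(A.sum(axis=0), 1)     # object degrees
        u = A @ f / deg_u
        return A.T @ u / deg_o

    def recommend(user, k=2):
        hot = heat_conduction(LIKE, user)        # scores seeded by liked objects
        cold = heat_conduction(DISLIKE, user)    # scores seeded by disliked objects
        unseen = R[user] == 0
        cand = [o for o in np.argsort(-hot) if unseen[o]]
        # UHC idea: drop unseen objects that rank high on the disliked side.
        bad = {o for o in np.argsort(-cold)[:1] if unseen[o]}
        return [int(o) for o in cand if o not in bad][:k]

    print(recommend(0))   # → [2, 4]
    ```

    The split into two diffusions is the essential move: the same propagation rule, run once from the liked objects and once from the disliked ones, produces both a ranking and a veto list.
    
    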

  18. Ontology for cell-based geographic information

    Science.gov (United States)

    Zheng, Bin; Huang, Lina; Lu, Xinhai

    2009-10-01

    Inter-operability is a key notion in geographic information science (GIS) for the sharing of geographic information (GI), since it requires seamless translation among different information sources. Ontology is enlisted in GI discovery to settle semantic conflicts because of its natural-language appearance and logical hierarchical structure, which are considered able to provide better context for both human understanding and machine cognition in describing locations and relationships in the geographic world. At present, however, most studies of field ontology are deduced from philosophical themes and are not applicable to the raster expression in GIS, which is a field-like phenomenon but does not physically coincide with the general concept of the philosophical field (a concept that mostly comes from physics). That is why this paper specifically discusses a cell-based GI ontology. The discussion starts with an investigation of the physical characteristics of cell-based raster GI. A unified cell-based GI ontology framework for the recognition of raster objects is then introduced, from which a conceptual interface connecting human epistemology and the computer world, the so-called "endurant-occurrant window", is developed for better raster GI discovery and sharing.

  19. The application of information theory for the research of aging and aging-related diseases.

    Science.gov (United States)

    Blokh, David; Stambler, Ilia

    2017-10-01

    This article reviews the application of information-theoretical analysis, employing measures of entropy and mutual information, to the study of aging and aging-related diseases. This research area is particularly suitable for information theory methods, as aging processes and related diseases are multi-parametric, with continuous parameters coexisting alongside discrete ones, and with relations between the parameters that are, as a rule, non-linear. Information theory provides unique analytical capabilities for the solution of such problems, with advantages over common linear biostatistics. Among the age-related diseases, information theory has been used in the study of neurodegenerative diseases (particularly using EEG time series for diagnosis and prediction), cancer (particularly for establishing individual and combined cancer biomarkers), diabetes (mainly utilizing mutual information to characterize the diseased and aging states), and heart disease (mainly for the analysis of heart rate variability). Few works have employed information theory for the analysis of general aging processes and frailty as underlying determinants of, and possible early preclinical diagnostic measures for, aging-related diseases. Generally, information-theoretical analysis not only permits establishing the (non-linear) correlations between diagnostic or therapeutic parameters of interest, but may also provide theoretical insight into the nature of aging and related diseases by establishing measures of variability, adaptation, regulation or homeostasis within a system of interest. It may be hoped that increased use of such measures in research will considerably advance diagnostic and therapeutic capabilities as well as the fundamental mathematical understanding of aging and disease. Copyright © 2016 Elsevier Ltd. All rights reserved.
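    A minimal example of the kind of measure involved: a plug-in mutual-information estimate between a continuous parameter (say, age) and a biomarker, which picks up dependence, linear or not, that a linear correlation coefficient may miss. The synthetic data, bin count, and estimator choice are illustrative assumptions, not from the review:

    ```python
    import numpy as np

    def mutual_information(x, y, bins=8):
        """Plug-in estimate of I(X;Y) in bits from paired samples,
        via a joint histogram (bin count is an assumed choice)."""
        joint, _, _ = np.histogram2d(x, y, bins=bins)
        p_xy = joint / joint.sum()
        p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X, shape (bins, 1)
        p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y, shape (1, bins)
        nz = p_xy > 0                            # skip empty cells (0 log 0 = 0)
        return float((p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz])).sum())

    rng = np.random.default_rng(0)
    age = rng.uniform(40, 90, 2000)                  # continuous parameter
    marker = 0.05 * age + rng.normal(0, 0.5, 2000)   # biomarker tracking age
    noise = rng.normal(0, 1, 2000)                   # unrelated parameter

    print(mutual_information(age, marker))   # clearly positive
    print(mutual_information(age, noise))    # near zero
    ```

    The same estimator works unchanged when one variable is discrete (e.g. a diagnostic category), which is why mixed continuous/discrete parameter sets, common in aging studies, are a natural fit for this family of measures.
    
    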

  20. Detection of network attacks based on adaptive resonance theory

    Science.gov (United States)

    Bukhanov, D. G.; Polyakov, V. M.

    2018-05-01

    The paper considers an approach to intrusion detection systems based on adaptive resonance theory (ART) neural networks. It proposes an intrusion detection system consisting of two types of program modules. The first module manages the connections of user applications, blocking undesirable ones. The second analyzes incoming network traffic parameters to check for potential network attacks; after an attack is detected, it notifies the required stations over a secure transmission channel. The paper describes an experiment on the detection and recognition of network attacks using a test data set and compares the results with similar experiments carried out by other authors. It presents findings and conclusions on the adequacy of the proposed approach; the obtained results confirm that adaptive resonance theory neural networks are suitable for analyzing network traffic within an intrusion detection system.
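    For the flavor of the technique, here is a minimal ART1-style clusterer for binary feature vectors, written under assumed parameters; the paper does not specify its network configuration, vigilance setting, or traffic features, so everything below is a sketch of the general ART1 scheme:

    ```python
    import numpy as np

    class ART1:
        """Minimal ART1 clusterer for binary feature vectors."""

        def __init__(self, vigilance=0.7, beta=1.0):
            self.rho = vigilance   # vigilance: how strict category matching is
            self.beta = beta       # choice parameter
            self.w = []            # one binary prototype per learned category

        def train(self, x):
            x = np.asarray(x, dtype=float)
            # Rank existing categories by the ART1 choice function
            # |x AND w_j| / (beta + |w_j|), best first.
            order = sorted(
                range(len(self.w)),
                key=lambda j: -np.minimum(x, self.w[j]).sum()
                               / (self.beta + self.w[j].sum()),
            )
            for j in order:
                match = np.minimum(x, self.w[j])
                # Vigilance test: does the prototype cover x closely enough?
                if match.sum() / x.sum() >= self.rho:
                    self.w[j] = match          # fast learning: intersect prototype
                    return j
            self.w.append(x.copy())            # no resonance: new category
            return len(self.w) - 1

    # Toy binary "traffic fingerprints" (illustrative, not from the paper).
    normal = [1, 1, 0, 0, 1, 0]
    scan   = [0, 0, 1, 1, 0, 1]
    net = ART1(vigilance=0.7)
    a = net.train(normal)
    b = net.train(scan)
    c = net.train([1, 1, 0, 0, 1, 1])   # near-normal pattern
    print(a, b, c)                      # → 0 1 0
    ```

    The vigilance test is what makes ART attractive for this setting: a traffic pattern that resonates with no stored category spawns a new one instead of being forced into the nearest cluster, so genuinely novel attack patterns surface as new categories rather than being silently absorbed.
    
    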