WorldWideScience

Sample records for information theoretical methods

  1. Robust recognition via information theoretic learning

    CERN Document Server

    He, Ran; Yuan, Xiaotong; Wang, Liang

    2014-01-01

    This Springer Brief presents a comprehensive review of information theoretic methods for robust recognition. A variety of information theoretic methods have been proffered in the past decade, in a large variety of computer vision applications; this work brings them together and attempts to impart the theory, optimization and usage of information entropy. The authors resort to a new information theoretic concept, correntropy, as a robust measure and apply it to solve robust face recognition and object recognition problems. For computational efficiency, the brief introduces the additive and multip

  2. Inform: Efficient Information-Theoretic Analysis of Collective Behaviors

    Directory of Open Access Journals (Sweden)

    Douglas G. Moore

    2018-06-01

    Full Text Available The study of collective behavior has traditionally relied on a variety of different methodological tools ranging from more theoretical methods such as population or game-theoretic models to empirical ones like Monte Carlo or multi-agent simulations. An approach that is increasingly being explored is the use of information theory as a methodological framework to study the flow of information and the statistical properties of collectives of interacting agents. While a few general purpose toolkits exist, most of the existing software for information theoretic analysis of collective systems is limited in scope. We introduce Inform, an open-source framework for efficient information theoretic analysis that exploits the computational power of a C library while simplifying its use through a variety of wrappers for common higher-level scripting languages. We focus on two such wrappers here: PyInform (Python) and rinform (R). Inform and its wrappers are cross-platform and general-purpose. They include classical information-theoretic measures, measures of information dynamics and information-based methods to study the statistical behavior of collective systems, and expose a lower-level API that allows users to construct measures of their own. We describe the architecture of the Inform framework, study its computational efficiency and use it to analyze three different case studies of collective behavior: biochemical information storage in regenerating planaria, nest-site selection in the ant Temnothorax rugatulus, and collective decision making in multi-agent simulations.
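
    One of the "measures of information dynamics" mentioned above, active information, can be sketched in a few lines of plain Python. This is an illustrative re-implementation from the standard definition (mutual information between a length-k history and the next symbol), not the Inform or PyInform API:

    ```python
    from collections import Counter
    from math import log2

    def entropy(samples):
        """Shannon entropy (bits) of the empirical distribution of hashable samples."""
        n = len(samples)
        return -sum((c / n) * log2(c / n) for c in Counter(samples).values())

    def active_information(series, k=2):
        """I(history; next) = H(history) + H(next) - H(history, next)."""
        pasts = [tuple(series[i:i + k]) for i in range(len(series) - k)]
        nexts = series[k:]
        joint = list(zip(pasts, nexts))
        return entropy(pasts) + entropy(nexts) - entropy(joint)

    # A strictly periodic series is perfectly predictable from its history,
    # so one full bit of the next symbol's uncertainty is resolved.
    periodic = [0, 1] * 50
    print(active_information(periodic, k=2))  # 1.0
    ```

    Libraries such as Inform compute the same quantity far more efficiently in C; the sketch only makes the underlying definition concrete.
    
    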

  3. The equivalence of information-theoretic and likelihood-based methods for neural dimensionality reduction.

    Directory of Open Access Journals (Sweden)

    Ross S Williamson

    2015-04-01

    Full Text Available Stimulus dimensionality-reduction methods in neuroscience seek to identify a low-dimensional space of stimulus features that affect a neuron's probability of spiking. One popular method, known as maximally informative dimensions (MID), uses an information-theoretic quantity known as "single-spike information" to identify this space. Here we examine MID from a model-based perspective. We show that MID is a maximum-likelihood estimator for the parameters of a linear-nonlinear-Poisson (LNP) model, and that the empirical single-spike information corresponds to the normalized log-likelihood under a Poisson model. This equivalence implies that MID does not necessarily find maximally informative stimulus dimensions when spiking is not well described as Poisson. We provide several examples to illustrate this shortcoming, and derive a lower bound on the information lost when spiking is Bernoulli in discrete time bins. To overcome this limitation, we introduce model-based dimensionality reduction methods for neurons with non-Poisson firing statistics, and show that they can be framed equivalently in likelihood-based or information-theoretic terms. Finally, we show how to overcome practical limitations on the number of stimulus dimensions that MID can estimate by constraining the form of the non-parametric nonlinearity in an LNP model. We illustrate these methods with simulations and data from primate visual cortex.
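
    The equivalence claimed above can be stated compactly. The following is a paraphrase of the standard definitions (notation assumed, constants suppressed), not the authors' exact derivation:

    ```latex
    % Single-spike information: divergence between the spike-triggered and
    % raw stimulus distributions, in bits per spike.
    I_{\mathrm{ss}} \;=\; \int p(\mathbf{x}\mid\mathrm{spike})\,
        \log_2 \frac{p(\mathbf{x}\mid\mathrm{spike})}{p(\mathbf{x})}\, d\mathbf{x}

    % LNP model with filter matrix K and rate \lambda(\mathbf{x}) = f(K^\top\mathbf{x}):
    % up to terms independent of K, the Poisson log-likelihood per spike is
    \frac{1}{n_{\mathrm{sp}}}\,\mathcal{L}(K)
        \;\approx\; \frac{1}{n_{\mathrm{sp}}}\sum_{i:\,\mathrm{spike}}
        \log_2 \frac{\lambda(\mathbf{x}_i)}{\bar{\lambda}} ,
    ```

    so maximizing the empirical single-spike information over the filter subspace coincides with maximizing the (normalized) LNP likelihood, which is the equivalence the abstract describes.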

  4. Comparison of information-theoretic to statistical methods for gene-gene interactions in the presence of genetic heterogeneity

    Directory of Open Access Journals (Sweden)

    Sucheston Lara

    2010-09-01

    Full Text Available Abstract Background Multifactorial diseases such as cancer and cardiovascular diseases are caused by the complex interplay between genes and environment. The detection of these interactions remains challenging due to computational limitations. Information theoretic approaches use computationally efficient directed search strategies and thus provide a feasible solution to this problem. However, the power of information theoretic methods for interaction analysis has not been systematically evaluated. In this work, we compare power and Type I error of an information-theoretic approach to existing interaction analysis methods. Methods The k-way interaction information (KWII) metric for identifying variable combinations involved in gene-gene interactions (GGI) was assessed using several simulated data sets under models of genetic heterogeneity driven by susceptibility increasing loci with varying allele frequency, penetrance values and heritability. The power and proportion of false positives of the KWII was compared to multifactor dimensionality reduction (MDR), the restricted partitioning method (RPM) and logistic regression. Results The power of the KWII was considerably greater than MDR on all six simulation models examined. For a given disease prevalence at high values of heritability, the power of both RPM and KWII was greater than 95%. For models with low heritability and/or genetic heterogeneity, the power of the KWII was consistently greater than RPM; the improvements in power for the KWII over RPM ranged from 4.7% to 14.2% for α = 0.001 in the three models at the lowest heritability values examined. KWII performed similarly to logistic regression. Conclusions Information theoretic models are flexible and have excellent power to detect GGI under a variety of conditions that characterize complex diseases.
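
    The KWII metric above reduces, for three variables, to an alternating sum of joint entropies. The sketch below follows that textbook form; it is an illustration of the metric, not the authors' code, and the toy genotype data are invented:

    ```python
    from collections import Counter
    from math import log2

    def entropy(*columns):
        """Joint Shannon entropy (bits) of one or more aligned discrete columns."""
        rows = list(zip(*columns))
        n = len(rows)
        return -sum((c / n) * log2(c / n) for c in Counter(rows).values())

    def kwii3(a, b, p):
        """Three-way k-way interaction information:
        KWII = -sum over subsets T of {a,b,p} of (-1)^(3-|T|) * H(T)."""
        return (-entropy(a) - entropy(b) - entropy(p)
                + entropy(a, b) + entropy(a, p) + entropy(b, p)
                - entropy(a, b, p))

    # Purely epistatic (XOR) phenotype: neither locus is informative alone,
    # but jointly they determine the phenotype, giving a large positive KWII.
    locus1 = [0, 0, 1, 1]
    locus2 = [0, 1, 0, 1]
    pheno = [x ^ y for x, y in zip(locus1, locus2)]
    print(kwii3(locus1, locus2, pheno))  # 1.0
    ```

    A positive KWII flags synergy between the loci with respect to the phenotype, which is why the metric suits gene-gene interaction screens.
    
    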

  5. Information-Theoretic Inference of Large Transcriptional Regulatory Networks

    Directory of Open Access Journals (Sweden)

    Meyer Patrick

    2007-01-01

    Full Text Available The paper presents MRNET, an original method for inferring genetic networks from microarray data. The method is based on maximum relevance/minimum redundancy (MRMR), an effective information-theoretic technique for feature selection in supervised learning. The MRMR principle consists in selecting, among the least redundant variables, the ones that have the highest mutual information with the target. MRNET extends this feature selection principle to networks in order to infer gene-dependence relationships from microarray data. The paper assesses MRNET by benchmarking it against RELNET, CLR, and ARACNE, three state-of-the-art information-theoretic methods for large (up to several thousands of genes) network inference. Experimental results on thirty synthetically generated microarray datasets show that MRNET is competitive with these methods.
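
    The MRMR principle described above can be made concrete with a small greedy selector: each step adds the candidate maximizing relevance (mutual information with the target) minus average redundancy (mutual information with already-selected features). This is an illustrative sketch of the principle, not the MRNET implementation, and the toy data are invented:

    ```python
    from collections import Counter
    from math import log2

    def entropy(*cols):
        rows = list(zip(*cols))
        n = len(rows)
        return -sum((c / n) * log2(c / n) for c in Counter(rows).values())

    def mutual_info(x, y):
        return entropy(x) + entropy(y) - entropy(x, y)

    def mrmr(features, target, k):
        """Greedy max-relevance/min-redundancy selection over discrete features."""
        selected, candidates = [], list(range(len(features)))
        while len(selected) < k:
            def score(i):
                relevance = mutual_info(features[i], target)
                redundancy = (sum(mutual_info(features[i], features[j])
                                  for j in selected) / len(selected)) if selected else 0.0
                return relevance - redundancy
            best = max(candidates, key=score)  # ties resolve to the lowest index
            selected.append(best)
            candidates.remove(best)
        return selected

    # x1 duplicates x0 (pure redundancy); x2 is independent of x0 yet informative.
    x0 = [0, 0, 1, 1]
    x1 = list(x0)
    x2 = [0, 1, 0, 1]
    y = [a | b for a, b in zip(x0, x2)]
    print(mrmr([x0, x1, x2], y, 2))  # [0, 2]
    ```

    The redundancy penalty is what makes the duplicate feature lose to the independent one at the second step, which is exactly the behavior MRNET exploits when pruning indirect gene-gene links.
    
    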

  7. Hash functions and information theoretic security

    DEFF Research Database (Denmark)

    Bagheri, Nasoor; Knudsen, Lars Ramkilde; Naderi, Majid

    2009-01-01

    Information theoretic security is an important security notion in cryptography as it provides a true lower bound for attack complexities. However, in practice attacks often have a higher cost than the information theoretic bound. In this paper we study the relationship between information theoretic...

  8. Toward a Theoretical Framework for Information Science

    Directory of Open Access Journals (Sweden)

    Amanda Spink

    2000-01-01

    Full Text Available Information Science is beginning to develop a theoretical framework for the modeling of users’ interactions with information retrieval (IR) technologies within the more holistic context of human information behavior (Spink, 1998b). This paper addresses the following questions: (1) What is the nature of Information Science? and (2) What theoretical framework and model is most appropriate for Information Science? This paper proposes a theoretical framework for Information Science based on an explication of the processes of human information coordinating behavior and information feedback that facilitate the relationship between human information behavior and human interaction with information retrieval (IR) technologies (Web, digital libraries, etc.).

  9. Information theoretic description of networks

    Science.gov (United States)

    Wilhelm, Thomas; Hollunder, Jens

    2007-11-01

    We present a new information theoretic approach for network characterizations. It is developed to describe the general type of networks with n nodes and L directed and weighted links, i.e., it also works for the simpler undirected and unweighted networks. The new information theoretic measures for network characterizations are based on a transmitter-receiver analogy of effluxes and influxes. Based on these measures, we classify networks as either complex or non-complex and as either democracy or dictatorship networks. Directed networks, in particular, are furthermore classified as either information spreading or information collecting networks. The complexity classification is based on the information theoretic network complexity measure medium articulation (MA). It is proven that special networks with a medium number of links (L ~ n^1.5) show the theoretical maximum complexity MA = (log n)^2/2. A network is complex if its MA is larger than the average MA of appropriately randomized networks: MA > MA_r. A network is of the democracy type if its redundancy R is larger than that of appropriately randomized networks (R > R_r); otherwise it is a dictatorship network. In democracy networks all nodes are, on average, of similar importance, whereas in dictatorship networks some nodes play distinguished roles in network functioning. In other words, democracy networks are characterized by cycling of information (or mass, or energy), while in dictatorship networks there is a straight through-flow from sources to sinks. The classification of directed networks into information spreading and information collecting networks is based on the conditional entropies of the considered networks (H(A/B) = uncertainty of the sender node if the receiver node is known, H(B/A) = uncertainty of the receiver node if the sender node is known): if H(A/B) > H(B/A), it is an information collecting network; otherwise it is an information spreading network. Finally, different real networks (directed and undirected, weighted and unweighted) are classified according to our general scheme.
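
    The spreading-versus-collecting rule stated above depends only on the two conditional entropies of the link distribution, so it is easy to sketch. The code treats each weighted directed link as a (sender, receiver) event; the example network is invented for illustration:

    ```python
    from math import log2

    def conditional_entropies(edges):
        """From weighted directed edges {(sender, receiver): weight}, build the
        joint distribution p(a, b) proportional to the weights and return
        (H(A|B), H(B|A)) in bits."""
        total = sum(edges.values())
        p = {ab: w / total for ab, w in edges.items()}
        pa, pb = {}, {}
        for (a, b), w in p.items():
            pa[a] = pa.get(a, 0.0) + w
            pb[b] = pb.get(b, 0.0) + w
        h_ab = -sum(w * log2(w) for w in p.values())
        h_a = -sum(w * log2(w) for w in pa.values())
        h_b = -sum(w * log2(w) for w in pb.values())
        return h_ab - h_b, h_ab - h_a  # H(A|B), H(B|A)

    def classify(edges):
        h_a_given_b, h_b_given_a = conditional_entropies(edges)
        return "collecting" if h_a_given_b > h_b_given_a else "spreading"

    # A hub broadcasting equally to three sinks: the sender is always known,
    # the receiver is uncertain, so H(B|A) > H(A|B) -> information spreading.
    print(classify({(0, 1): 1.0, (0, 2): 1.0, (0, 3): 1.0}))  # spreading
    ```

    Reversing every edge turns the same topology into an information collecting network, mirroring the symmetry of the rule.
    
    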

  10. An Information-Theoretic-Cluster Visualization for Self-Organizing Maps.

    Science.gov (United States)

    Brito da Silva, Leonardo Enzo; Wunsch, Donald C

    2018-06-01

    Improved data visualization will be a significant tool to enhance cluster analysis. In this paper, an information-theoretic-based method for cluster visualization using self-organizing maps (SOMs) is presented. The information-theoretic visualization (IT-vis) has the same structure as the unified distance matrix, but instead of depicting Euclidean distances between adjacent neurons, it displays the similarity between the distributions associated with adjacent neurons. Each SOM neuron has an associated subset of the data set whose cardinality controls the granularity of the IT-vis and with which the first- and second-order statistics are computed and used to estimate their probability density functions. These are used to calculate the similarity measure, based on Renyi's quadratic cross entropy and cross information potential (CIP). The introduced visualizations combine the low computational cost and kernel estimation properties of the representative CIP and the data structure representation of a single-linkage-based grouping algorithm to generate an enhanced SOM-based visualization. The visual quality of the IT-vis is assessed by comparing it with other visualization methods for several real-world and synthetic benchmark data sets. Thus, this paper also contains a significant literature survey. The experiments demonstrate the IT-vis cluster revealing capabilities, in which cluster boundaries are sharply captured. Additionally, the information-theoretic visualizations are used to perform clustering of the SOM. Compared with other methods, IT-vis of large SOMs yielded the best results in this paper, for which the quality of the final partitions was evaluated using external validity indices.

  11. Information-theoretic metamodel of organizational evolution

    Science.gov (United States)

    Sepulveda, Alfredo

    2011-12-01

    Social organizations are abstractly modeled by holarchies---self-similar connected networks---and intelligent complex adaptive multiagent systems---large networks of autonomous reasoning agents interacting via scaled processes. However, little is known of how information shapes evolution in such organizations, a gap that can lead to misleading analytics. The research problem addressed in this study was the ineffective manner in which classical model-predict-control methods used in business analytics attempt to define organization evolution. The purpose of the study was to construct an effective metamodel for organization evolution based on a proposed complex adaptive structure---the info-holarchy. Theoretical foundations of this study were holarchies, complex adaptive systems, evolutionary theory, and quantum mechanics, among other recently developed physical and information theories. Research questions addressed how information evolution patterns gleaned from the study's inductive metamodel more aptly explained volatility in organizations. In this study, a hybrid grounded theory based on abstract inductive extensions of information theories was utilized as the research methodology. An overarching heuristic metamodel was framed from the theoretical analysis of the properties of these extension theories and applied to business, neural, and computational entities. This metamodel resulted in the synthesis of a metaphor for, and generalization of, organization evolution, serving as the recommended and appropriate analytical tool to view business dynamics for future applications. This study may manifest positive social change through a fundamental understanding of complexity in business from general information theories, resulting in more effective management.

  12. Information theoretic quantification of diagnostic uncertainty.

    Science.gov (United States)

    Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

    2012-01-01

    Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
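
    The two ingredients the abstract combines, Bayes' rule for the post-test probability and entropy as a measure of diagnostic uncertainty, can be sketched together. The sensitivity, specificity and pre-test probability below are illustrative numbers, not values from the paper:

    ```python
    from math import log2

    def binary_entropy(p):
        """Diagnostic uncertainty, in bits, of a disease probability p."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * log2(p) - (1 - p) * log2(1 - p)

    def post_test_probability(pretest, sens, spec, positive=True):
        """Bayes' rule for a dichotomous test result."""
        if positive:
            num = sens * pretest
            den = num + (1 - spec) * (1 - pretest)
        else:
            num = (1 - sens) * pretest
            den = num + spec * (1 - pretest)
        return num / den

    pre = 0.5  # maximally uncertain pre-test probability
    post = post_test_probability(pre, sens=0.9, spec=0.9)
    print(round(post, 3))                 # 0.9
    print(round(binary_entropy(pre), 3))  # 1.0 bit before the test
    print(round(binary_entropy(post), 3)) # 0.469 bits after a positive result
    ```

    In information theoretic terms, the test resolved roughly half a bit of diagnostic uncertainty; sweeping `pre` over a range rather than a point estimate reproduces the kind of analysis the essay advocates.
    
    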

  13. Informing Physics: Jacob Bekenstein and the Informational Turn in Theoretical Physics

    Science.gov (United States)

    Belfer, Israel

    2014-03-01

    In his PhD dissertation in the early 1970s, the Mexican-Israeli theoretical physicist Jacob Bekenstein developed the thermodynamics of black holes using a generalized version of the second law of thermodynamics. This work made it possible for physicists to describe and analyze black holes using information-theoretical concepts. It also helped to transform information theory into a fundamental and foundational concept in theoretical physics. The story of Bekenstein's work—which was initially opposed by many scientists, including Stephen Hawking—highlights the transformation within physics towards an information-oriented scientific mode of theorizing. This "informational turn" amounted to a mild-mannered revolution within physics, revolutionary without being rebellious.

  14. Information Ergonomics A theoretical approach and practical experience in transportation

    CERN Document Server

    Sandl, Peter

    2012-01-01

    The variety and increasing availability of hypermedia information systems, which are used in stationary applications like operators’ consoles as well as mobile systems, e.g. driver information and navigation systems in automobiles form a foundation for the mediatization of the society. From the human engineering point of view this development and the ensuing increased importance of information systems for economic and private needs require careful deliberation of the derivation and application of ergonomics methods particularly in the field of information systems. This book consists of two closely intertwined parts. The first, theoretical part defines the concept of an information system, followed by an explanation of action regulation as well as cognitive theories to describe man information system interaction. A comprehensive description of information ergonomics concludes the theoretical approach. In the second, practically oriented part of this book authors from industry as well as from academic institu...

  15. Exploring super-gaussianity towards robust information-theoretical time delay estimation

    DEFF Research Database (Denmark)

    Petsatodis, Theodoros; Talantzis, Fotios; Boukis, Christos

    2013-01-01

    the effect upon TDE when modeling the source signal with different speech-based distributions. An information theoretical TDE method indirectly encapsulating higher order statistics (HOS) formed the basis of this work. The underlying assumption of Gaussian distributed source has been replaced...

  16. One-dimensional barcode reading: an information theoretic approach

    Science.gov (United States)

    Houni, Karim; Sawaya, Wadih; Delignon, Yves

    2008-03-01

    In the convergence context of identification technology and information-data transmission, the barcode found its place as the simplest and the most pervasive solution for new uses, especially within mobile commerce, bringing youth to this long-lived technology. From a communication theory point of view, a barcode is a singular coding based on a graphical representation of the information to be transmitted. We present an information theoretic approach for 1D image-based barcode reading analysis. With a barcode facing the camera, distortions and acquisition are modeled as a communication channel. The performance of the system is evaluated by means of the average mutual information quantity. On the basis of this theoretical criterion for a reliable transmission, we introduce two new measures: the theoretical depth of field and the theoretical resolution. Simulations illustrate the gain of this approach.

  17. STRUCTURAL AND METHODICAL MODEL OF INCREASING THE LEVEL OF THEORETICAL TRAINING OF CADETS USING INFORMATION AND COMMUNICATION TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    Vladislav V. Bulgakov

    2018-03-01

    Full Text Available Features of training in higher educational institutions of the EMERCOM of Russia system demand the introduction of new educational techniques and technical means directed at intensifying the educational process, giving cadets the opportunity to prepare at any time in an independent mode and improving the quality of their theoretical knowledge. The authors have developed a structural and methodical model for increasing the level of theoretical training of cadets using information and communication technologies. The proposed model, which includes elements to stimulate and enhance cognitive activity, makes it possible to shape the trajectory of a cadet's theoretical training over the entire period of study at the university and to organize systematic independent work as well as objective current and final control of theoretical knowledge. The model consists of three main elements: a base of theoretical questions and the functional modules "teacher" and "cadet". The basis of the model is the base of theoretical questions, developed for all disciplines of specialty 20.05.01 (fire safety). The "teacher" module allows instructors to create theoretical questions of various kinds, edit them or delete them from the database if necessary, and also to create tests and monitor their completion. The "cadet" module provides ample opportunities for theoretical training through independent work, testing for current and final control, a game form of training in the form of a duel, and the presentation of cadets' results as statistics and rankings. The structural and methodical model for increasing the level of theoretical training of cadets is implemented in practice in the form of a multi-level automated system

  18. Information theoretic preattentive saliency

    DEFF Research Database (Denmark)

    Loog, Marco

    2011-01-01

    Employing an information theoretic operational definition of bottom-up attention from the field of computational visual perception a very general expression for saliency is provided. As opposed to many of the current approaches to determining a saliency map there is no need for an explicit data...... of which features, image information is described. We illustrate our result by determining a few specific saliency maps based on particular choices of features. One of them makes the link with the mapping underlying well-known Harris interest points, which is a result recently obtained in isolation...

  19. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing various disciplines frequently produces something profound and far-reaching. Cybernetics is one often-quoted example. The mix of information theory, statistics and computing technology has proved very useful, leading to the recent development of information-theory-based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is a fundamental task in quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur

  20. System identification with information theoretic criteria

    NARCIS (Netherlands)

    A.A. Stoorvogel; J.H. van Schuppen (Jan)

    1995-01-01

    Attention is focused in this paper on the approximation problem of system identification with information theoretic criteria. For a class of problems it is shown that the criterion of mutual information rate is identical to the criterion of exponential-of-quadratic cost and to

  1. Wireless Information-Theoretic Security in an Outdoor Topology with Obstacles: Theoretical Analysis and Experimental Measurements

    Directory of Open Access Journals (Sweden)

    Dagiuklas Tasos

    2011-01-01

    Full Text Available This paper presents a Wireless Information-Theoretic Security (WITS) scheme, which has been recently introduced as a robust physical layer-based security solution, especially for infrastructureless networks. An autonomic network of moving users was implemented via 802.11n nodes of an ad hoc network for an outdoor topology with obstacles. Obstructed-Line-of-Sight (OLOS) and Non-Line-of-Sight (NLOS) propagation scenarios were examined. Low-speed user movement was considered, so that Doppler spread could be discarded. A transmitter and a legitimate receiver exchanged information in the presence of a moving eavesdropper. Average Signal-to-Noise Ratio (SNR) values were acquired for both the main and the wiretap channel, and the Probability of Nonzero Secrecy Capacity was calculated based on the theoretical formula. Experimental results validate the theoretical findings, stressing the importance of user location and mobility schemes on the robustness of Wireless Information-Theoretic Security, and call for further theoretical analysis.
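
    The Probability of Nonzero Secrecy Capacity mentioned above is simply the probability that the main channel's instantaneous SNR exceeds the eavesdropper's. Under quasi-static Rayleigh fading both SNRs are exponentially distributed, and the closed form is gamma_M / (gamma_M + gamma_W). The sketch below checks that by Monte Carlo; the mean SNR values are invented, not the paper's measurements:

    ```python
    import random

    def prob_nonzero_secrecy(snr_main_mean, snr_wiretap_mean,
                             trials=200_000, seed=1):
        """Monte Carlo estimate of P(Cs > 0) = P(SNR_main > SNR_wiretap) with
        exponentially distributed instantaneous SNRs (Rayleigh fading model)."""
        rng = random.Random(seed)
        wins = sum(
            rng.expovariate(1 / snr_main_mean) > rng.expovariate(1 / snr_wiretap_mean)
            for _ in range(trials)
        )
        return wins / trials

    # Closed form for this model: 4 / (4 + 2) = 0.667 (approximately)
    estimate = prob_nonzero_secrecy(4.0, 2.0)
    print(round(estimate, 2))
    ```

    The simulation agrees with the closed form to within Monte Carlo error, which is the theoretical baseline the outdoor measurements in the paper are compared against.
    
    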

  2. Information-Theoretic Properties of Auditory Sequences Dynamically Influence Expectation and Memory.

    Science.gov (United States)

    Agres, Kat; Abdallah, Samer; Pearce, Marcus

    2018-01-01

    A basic function of cognition is to detect regularities in sensory input to facilitate the prediction and recognition of future events. It has been proposed that these implicit expectations arise from an internal predictive coding model, based on knowledge acquired through processes such as statistical learning, but it is unclear how different types of statistical information affect listeners' memory for auditory stimuli. We used a combination of behavioral and computational methods to investigate memory for non-linguistic auditory sequences. Participants repeatedly heard tone sequences varying systematically in their information-theoretic properties. Expectedness ratings of tones were collected during three listening sessions, and a recognition memory test was given after each session. Information-theoretic measures of sequential predictability significantly influenced listeners' expectedness ratings, and variations in these properties had a significant impact on memory performance. Predictable sequences yielded increasingly better memory performance with increasing exposure. Computational simulations using a probabilistic model of auditory expectation suggest that listeners dynamically formed a new, and increasingly accurate, implicit cognitive model of the information-theoretic structure of the sequences throughout the experimental session. Copyright © 2017 Cognitive Science Society, Inc.

  3. Information theoretic learning Renyi's entropy and Kernel perspectives

    CERN Document Server

    Principe, Jose C

    2010-01-01

    This book presents the first cohesive treatment of Information Theoretic Learning (ITL) algorithms to adapt linear or nonlinear learning machines in both supervised and unsupervised paradigms. ITL is a framework where the conventional concepts of second order statistics (covariance, L2 distances, correlation functions) are substituted by scalars and functions with information theoretic underpinnings, respectively entropy, mutual information and correntropy. ITL quantifies the stochastic structure of the data beyond second order statistics for improved performance without using full-blown Bayesi

  4. Information theoretic analysis of edge detection in visual communication

    Science.gov (United States)

    Jiang, Bo; Rahman, Zia-ur

    2010-08-01

    Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the artifacts introduced into the process by the image gathering process. However, experiments show that the image gathering process profoundly impacts the performance of digital image processing and the quality of the resulting images. Huck et al. proposed one definitive theoretic analysis of visual communication channels, where the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. In this paper, we perform an end-to-end information theory based system analysis to assess edge detection methods. We evaluate the performance of the different algorithms as a function of the characteristics of the scene, and the parameters, such as sampling, additive noise etc., that define the image gathering system. The edge detection algorithm is regarded as having high performance only if the information rate from the scene to the edge approaches the maximum possible. This goal can be achieved only by jointly optimizing all processes. People generally use subjective judgment to compare different edge detection methods. There is not a common tool that can be used to evaluate the performance of the different algorithms and to give people a guide for selecting the best algorithm for a given system or scene. Our information-theoretic assessment becomes this new tool, which allows us to compare the different edge detection operators in a common environment.

  5. Information theoretic methods for image processing algorithm optimization

    Science.gov (United States)

    Prokushkin, Sergey F.; Galil, Erez

    2015-01-01

    Modern image processing pipelines (e.g., those used in digital cameras) are full of advanced, highly adaptive filters that often have a large number of tunable parameters (sometimes > 100). This makes the calibration procedure for these filters very complex, and the optimal results barely achievable in the manual calibration; thus an automated approach is a must. We will discuss an information theory based metric for evaluation of algorithm adaptive characteristics ("adaptivity criterion") using noise reduction algorithms as an example. The method allows finding an "orthogonal decomposition" of the filter parameter space into the "filter adaptivity" and "filter strength" directions. This metric can be used as a cost function in automatic filter optimization. Since it is a measure of a physical "information restoration" rather than perceived image quality, it helps to reduce the set of the filter parameters to a smaller subset that is easier for a human operator to tune and achieve a better subjective image quality. With appropriate adjustments, the criterion can be used for assessment of the whole imaging system (sensor plus post-processing).

  6. Information Theoretic-Learning Auto-Encoder

    OpenAIRE

    Santana, Eder; Emigh, Matthew; Principe, Jose C

    2016-01-01

    We propose Information Theoretic-Learning (ITL) divergence measures for variational regularization of neural networks. We also explore ITL-regularized autoencoders as an alternative to variational autoencoding Bayes, adversarial autoencoders, and generative adversarial networks for randomly generating sample data without explicitly defining a partition function. This paper also formalizes generative moment matching networks under the ITL framework.

  7. Information-Theoretic Bounded Rationality and ε-Optimality

    Directory of Open Access Journals (Sweden)

    Daniel A. Braun

    2014-08-01

    Full Text Available Bounded rationality concerns the study of decision makers with limited information processing resources. Previously, the free energy difference functional has been suggested to model bounded rational decision making, as it provides a natural trade-off between an energy or utility function that is to be optimized and information processing costs that are measured by entropic search costs. The main question of this article is how the information-theoretic free energy model relates to simple ε-optimality models of bounded rational decision making, where the decision maker is satisfied with any action in an ε-neighborhood of the optimal utility. We find that the stochastic policies that optimize the free energy trade-off comply with the notion of ε-optimality. Moreover, this optimality criterion even holds when the environment is adversarial. We conclude that the study of bounded rationality based on ε-optimality criteria that abstract away from the particulars of the information processing constraints is compatible with the information-theoretic free energy model of bounded rationality.
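
The free-energy trade-off described above has a closed-form solution: the optimal stochastic policy is a softmax reweighting of a prior by exponentiated utility, and its expected shortfall from the optimum shrinks as the information-processing budget grows. A minimal numerical sketch (the utilities, prior, and inverse temperatures below are illustrative, not taken from the article):

```python
import numpy as np

def free_energy_policy(utilities, prior, beta):
    """Closed-form optimizer of E_p[U] - (1/beta) * KL(p || prior):
    a softmax reweighting of the prior by exponentiated utility."""
    w = prior * np.exp(beta * (utilities - utilities.max()))  # shifted for stability
    return w / w.sum()

utilities = np.array([1.0, 0.9, 0.2])   # illustrative utilities
prior = np.ones(3) / 3                  # uniform prior policy

for beta in (1.0, 5.0, 50.0):
    p = free_energy_policy(utilities, prior, beta)
    eps = utilities.max() - p @ utilities   # expected shortfall from the optimum
    print(f"beta={beta:5.1f}  policy={np.round(p, 3)}  eps={eps:.4f}")
```

As beta grows the policy concentrates on the maximizer and the expected shortfall eps shrinks, matching the sense in which finite-resource policies are ε-optimal.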

  8. Theoretical development of information science: A brief history

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2017-01-01

    This paper presents a brief history of information science (IS) as viewed by the author. The term ‘information science’ goes back to 1955 and evolved in the aftermath of Claude Shannon’s ‘information theory’ (1948), which also inspired research into problems in the fields of library science and documentation. These subjects were a main focus of what became established as ‘information science’, which from 1964 onwards was often termed ‘library and information science’ (LIS). However, the usefulness of Shannon’s information theory as the theoretical foundation of the field has been challenged. Among the strongest “paradigms” in the field is a tradition derived from the Cranfield experiments in the 1960s and the bibliometric research following the publication of Science Citation Index from 1963 and forward. Among the competing theoretical frameworks, ‘the cognitive view’ became influential from the 1970s…

  9. Multivariate information-theoretic measures reveal directed information structure and task relevant changes in fMRI connectivity.

    Science.gov (United States)

    Lizier, Joseph T; Heinzle, Jakob; Horstmann, Annette; Haynes, John-Dylan; Prokopenko, Mikhail

    2011-02-01

    The human brain undertakes highly sophisticated information processing facilitated by the interaction between its sub-regions. We present a novel method for interregional connectivity analysis, using multivariate extensions to the mutual information and transfer entropy. The method allows us to identify the underlying directed information structure between brain regions, and how that structure changes according to behavioral conditions. This method is distinguished in using asymmetric, multivariate, information-theoretical analysis, which captures not only directional and non-linear relationships, but also collective interactions. Importantly, the method is able to estimate multivariate information measures with only relatively little data. We demonstrate the method to analyze functional magnetic resonance imaging time series to establish the directed information structure between brain regions involved in a visuo-motor tracking task. Importantly, this results in a tiered structure, with known movement planning regions driving visual and motor control regions. Also, we examine the changes in this structure as the difficulty of the tracking task is increased. We find that task difficulty modulates the coupling strength between regions of a cortical network involved in movement planning and between motor cortex and the cerebellum which is involved in the fine-tuning of motor control. It is likely these methods will find utility in identifying interregional structure (and experimentally induced changes in this structure) in other cognitive tasks and data modalities.
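
Transfer entropy, the directed measure underlying this kind of connectivity analysis, can be illustrated with a simple plug-in estimator on binary time series. The sketch below uses history length 1 and synthetic data (not fMRI), so it only demonstrates the principle that a driven signal yields nonzero transfer entropy in the driving direction:

```python
import numpy as np
from collections import Counter

def entropy(samples):
    """Plug-in Shannon entropy (bits) of a sequence of hashable symbols."""
    n = len(samples)
    return -sum((c / n) * np.log2(c / n) for c in Counter(samples).values())

def transfer_entropy(src, dst):
    """TE(src -> dst) with history length 1:
    H(dst_{t+1} | dst_t) - H(dst_{t+1} | dst_t, src_t)."""
    d_next, d, s = dst[1:], dst[:-1], src[:-1]
    h_cond_d = entropy(list(zip(d_next, d))) - entropy(d)
    h_cond_ds = entropy(list(zip(d_next, d, s))) - entropy(list(zip(d, s)))
    return h_cond_d - h_cond_ds

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10000).tolist()
y = [0] + x[:-1]                      # y copies x with a one-step lag

print(transfer_entropy(x, y))         # close to 1 bit: x drives y
print(transfer_entropy(y, x))         # close to 0 bits: no reverse flow
```

The asymmetry of the estimate (high in one direction, near zero in the other) is what lets such measures identify directed structure that symmetric correlation cannot.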

  10. Information-theoretic lengths of Jacobi polynomials

    Energy Technology Data Exchange (ETDEWEB)

    Guerrero, A; Dehesa, J S [Departamento de Fisica Atomica, Molecular y Nuclear, Universidad de Granada, Granada (Spain); Sanchez-Moreno, P, E-mail: agmartinez@ugr.e, E-mail: pablos@ugr.e, E-mail: dehesa@ugr.e [Instituto ' Carlos I' de Fisica Teorica y Computacional, Universidad de Granada, Granada (Spain)

    2010-07-30

    The information-theoretic lengths of the Jacobi polynomials P_n^(α,β)(x), which are information-theoretic measures (Rényi, Shannon and Fisher) of their associated Rakhmanov probability density, are investigated. They quantify the spreading of the polynomials along the orthogonality interval [-1, 1] in a complementary but different way than the root-mean-square or standard deviation because, contrary to that measure, they do not refer to any specific point of the interval. Explicit expressions for the Fisher length are given. The Rényi lengths are found by use of the combinatorial multivariable Bell polynomials in terms of the polynomial degree n and the parameters (α, β). The Shannon length, which cannot be calculated exactly because of its logarithmic functional form, is bounded from below by using sharp upper bounds on general densities on [-1, +1] given in terms of various expectation values; moreover, its asymptotics is also pointed out. Finally, several computational issues relative to these three quantities are carefully analyzed.

  11. An Information-Theoretic Approach for Indirect Train Traffic Monitoring Using Building Vibration

    Directory of Open Access Journals (Sweden)

    Susu Xu

    2017-05-01

    Full Text Available This paper introduces an indirect train traffic monitoring method to detect and infer real-time train events based on the vibration response of a nearby building. Monitoring and characterizing traffic events (e.g., train passing, heavy trucks, and traffic) are important for cities seeking to improve the efficiency of transportation systems. Most prior work falls into two categories: (1) methods that require intensive labor to manually record events or (2) systems that require deployment of dedicated sensors. These approaches are difficult and costly to execute and maintain. In addition, most prior work uses dedicated sensors designed for a single purpose, resulting in deployment of multiple sensor systems and further increasing costs. Meanwhile, with the increasing demands of structural health monitoring, many vibration sensors are being deployed in commercial buildings. Traffic events create ground vibration that propagates to nearby building structures, inducing noisy vibration responses. We present an information-theoretic method for train event monitoring using vibration sensors commonly deployed for building health monitoring. The key idea is to represent the wave propagation in a building induced by train traffic as information conveyed in noisy measurement signals. Our technique first uses wavelet analysis to detect train events. Then, by analyzing information exchange patterns of building vibration signals, we infer the category of the events (i.e., southbound or northbound train). Our algorithm is evaluated in an 11-story building where trains pass by frequently. The results show that the method can robustly achieve a train event detection accuracy of up to a 93% true positive rate and an 80% true negative rate. For direction categorization, compared with the traditional signal processing method, our information-theoretic approach reduces categorization error from 32.1% to 12.1%, a 2.5× improvement.

  12. Biometric security from an information-theoretical perspective

    NARCIS (Netherlands)

    Ignatenko, T.; Willems, F.M.J.

    2012-01-01

    In this review, biometric systems are studied from an information theoretical point of view. In the first part biometric authentication systems are studied. The objective of these systems is, observing correlated enrollment and authentication biometric sequences, to generate or convey as large as

  13. Model selection and inference a practical information-theoretic approach

    CERN Document Server

    Burnham, Kenneth P

    1998-01-01

    This book is unique in that it covers the philosophy of model-based data analysis and an omnibus strategy for the analysis of empirical data. The book introduces information theoretic approaches and focuses critical attention on a priori modeling and the selection of a good approximating model that best represents the inference supported by the data. Kullback-Leibler information represents a fundamental quantity in science and is Hirotugu Akaike's basis for model selection. The maximized log-likelihood function can be bias-corrected to provide an estimate of expected, relative Kullback-Leibler information. This leads to Akaike's Information Criterion (AIC) and various extensions, and these are relatively simple and easy to use in practice, but little taught in statistics classes and far less understood in the applied sciences than should be the case. The information theoretic approaches provide a unified and rigorous theory, an extension of likelihood theory, an important application of information theory, and are ...
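
The AIC computation the book builds toward fits in a few lines; the log-likelihoods, parameter counts, and sample size below are made-up numbers for illustration only:

```python
def aic(log_likelihood, k):
    """Akaike's Information Criterion: 2k - 2*ln(L). Lower values indicate
    less expected Kullback-Leibler information loss, i.e., a better model."""
    return 2 * k - 2 * log_likelihood

def aicc(log_likelihood, k, n):
    """Small-sample corrected AIC, one common extension."""
    return aic(log_likelihood, k) + 2 * k * (k + 1) / (n - k - 1)

# Hypothetical fits of two models to the same data set (n = 100 observations):
aic_a = aic(-112.4, 3)   # simpler model
aic_b = aic(-111.9, 5)   # more parameters, only slightly better likelihood
print(aic_a, aic_b)      # 230.8 vs. 233.8 -> the simpler model is preferred
```

The extra two parameters of the second model are not repaid by its small likelihood gain, which is exactly the parsimony trade-off AIC encodes.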

  14. Information Theoretic Tools for Parameter Fitting in Coarse Grained Models

    KAUST Repository

    Kalligiannaki, Evangelia

    2015-01-07

    We study the application of information theoretic tools for model reduction in the case of systems driven by stochastic dynamics out of equilibrium. The model/dimension reduction is achieved by proposing parametrized coarse grained dynamics and finding the optimal parameter set for which the relative entropy rate with respect to the atomistic dynamics is minimized. The minimization problem leads to a generalization of the force matching methods to non-equilibrium systems. A multiplicative noise example reveals the importance of the diffusion coefficient in the optimization problem.

  15. An Information-Theoretic Approach to PMU Placement in Electric Power Systems

    OpenAIRE

    Li, Qiao; Cui, Tao; Weng, Yang; Negi, Rohit; Franchetti, Franz; Ilic, Marija D.

    2012-01-01

    This paper presents an information-theoretic approach to address the phasor measurement unit (PMU) placement problem in electric power systems. Different from the conventional 'topological observability' based approaches, this paper advocates a much more refined, information-theoretic criterion, namely the mutual information (MI) between the PMU measurements and the power system states. The proposed MI criterion can not only include the full system observability as a special case, but also ca...

  16. Information-theoretic decomposition of embodied and situated systems.

    Science.gov (United States)

    Da Rold, Federico

    2018-07-01

    The embodied and situated view of cognition stresses the importance of real-time and nonlinear bodily interaction with the environment for developing concepts and structuring knowledge. In this article, populations of robots controlled by an artificial neural network learn a wall-following task through artificial evolution. At the end of the evolutionary process, time series are recorded from perceptual and motor neurons of selected robots. Information-theoretic measures are estimated on pairings of variables to unveil nonlinear interactions that structure the agent-environment system. Specifically, the mutual information is utilized to quantify the degree of dependence and the transfer entropy to detect the direction of the information flow. Furthermore, the system is analyzed with the local form of such measures, thus capturing the underlying dynamics of information. Results show that different measures are interdependent and complementary in uncovering aspects of the robots' interaction with the environment, as well as characteristics of the functional neural structure. Therefore, the set of information-theoretic measures provides a decomposition of the system, capturing the intricacy of nonlinear relationships that characterize robots' behavior and neural dynamics. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. An integrated organisation-wide data quality management and information governance framework: theoretical underpinnings

    Directory of Open Access Journals (Sweden)

    Siaw-Teng Liaw

    2014-10-01

    Full Text Available Introduction Increasing investment in eHealth aims to improve cost effectiveness and safety of care. Data extraction and aggregation can create new data products to improve professional practice and provide feedback to improve the quality of source data. A previous systematic review concluded that locally relevant clinical indicators and use of clinical record systems could support clinical governance. We aimed to extend and update the review with a theoretical framework. Methods We searched PubMed, Medline, Web of Science, ABI Inform (Proquest) and Business Source Premier (EBSCO) using the terms curation, information ecosystem, data quality management (DQM), data governance, information governance (IG) and data stewardship. We focused on and analysed the scope of DQM and IG processes, theoretical frameworks, and determinants of the processing, quality assurance, presentation and sharing of data across the enterprise. Findings There are good theoretical reasons for integrated governance, but there is variable alignment of DQM, IG and health system objectives across the health enterprise. Ethical constraints exist that require health information ecosystems to process data in ways that are aligned with improving health and system efficiency and ensuring patient safety. Despite an increasingly ‘big-data’ environment, DQM and IG in health services are still fragmented across the data production cycle. We extend current work on DQM and IG with a theoretical framework for integrated IG across the data cycle. Conclusions The dimensions of this theory-based framework would require testing with qualitative and quantitative studies to examine the applicability and utility, along with an evaluation of its impact on data quality across the health enterprise.

  18. Towards integrating control and information theories from information-theoretic measures to control performance limitations

    CERN Document Server

    Fang, Song; Ishii, Hideaki

    2017-01-01

    This book investigates the performance limitation issues in networked feedback systems. The fact that networked feedback systems consist of control and communication devices and systems calls for the integration of control theory and information theory. The primary contributions of this book lie in two aspects: the newly-proposed information-theoretic measures and the newly-discovered control performance limitations. We first propose a number of information notions to facilitate the analysis. Using those notions, classes of performance limitations of networked feedback systems, as well as state estimation systems, are then investigated. In general, the book presents a unique, cohesive treatment of performance limitation issues of networked feedback systems via an information-theoretic approach. This book is believed to be the first to treat the aforementioned subjects systematically and in a unified manner, offering a unique perspective differing from existing books.

  19. Information theoretic bounds for compressed sensing in SAR imaging

    International Nuclear Information System (INIS)

    Jingxiong, Zhang; Ke, Yang; Jianzhong, Guo

    2014-01-01

    Compressed sensing (CS) is a new framework for sampling and reconstructing sparse signals from measurements significantly fewer than those prescribed by the Nyquist rate in the Shannon sampling theorem. This new strategy, applied in various application areas including synthetic aperture radar (SAR), relies on two principles: sparsity, which is related to the signals of interest, and incoherence, which refers to the sensing modality. An important question in CS-based SAR system design concerns the sampling rate necessary and sufficient for exact or approximate recovery of sparse signals. In the literature, bounds on measurements (or sampling rate) in CS have been proposed from the perspective of information theory. However, these information-theoretic bounds need to be reviewed and, if necessary, validated for CS-based SAR imaging, as there are various assumptions made in the derivations of lower and upper bounds on sub-Nyquist sampling rates, which may not hold true in CS-based SAR imaging. In this paper, information-theoretic bounds on sampling rate are analyzed. For this, the SAR measurement system is modeled as an information channel, with channel capacity and rate-distortion characteristics evaluated to enable the determination of sampling rates required for recovery of sparse scenes. Experiments based on simulated data are undertaken to test the theoretic bounds against empirical results about sampling rates required to achieve certain detection error probabilities.

  20. THEORETICAL ASPECTS OF INFORMATIONAL SERVICES REGIONAL MARKET EFFECTIVE DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    I.N. Korabejnikov

    2008-12-01

    Full Text Available The peculiarities and priorities of the informational services regional market formation as a part of network model of the economic development are described in this article. The authors present the classification of the factors which have an influence on the effectiveness of the informational services regional market development. Theoretical aspects of the informational services regional market effective development are shown.

  1. Theoretical information reuse and integration

    CERN Document Server

    Rubin, Stuart

    2016-01-01

    Information Reuse and Integration addresses the efficient extension and creation of knowledge through the exploitation of Kolmogorov complexity in the extraction and application of domain symmetry. Knowledge that seems to be novel can more often than not be recast as the image of a sequence of transformations which yield symmetric knowledge. When the size of those transformations and/or the length of that sequence of transforms exceeds the size of the image, that image is said to be novel or random. It may also be that the new knowledge is random in that no sequence of transforms that produces it exists, or at least none is known. The nine chapters comprising this volume incorporate symmetry, reuse, and integration as overt operational procedures or as operations built into the formal representations of data and operators employed. Either way, the aforementioned theoretical underpinnings of information reuse and integration are supported.

  2. Information-theoretic signatures of biodiversity in the barcoding gene.

    Science.gov (United States)

    Barbosa, Valmir C

    2018-08-14

    Analyzing the information content of DNA, though holding the promise to help quantify how the processes of evolution have led to information gain throughout the ages, has remained an elusive goal. Paradoxically, one of the main reasons for this has been precisely the great diversity of life on the planet: if on the one hand this diversity is a rich source of data for information-content analysis, on the other hand there is so much variation as to make the task unmanageable. During the past decade or so, however, succinct fragments of the COI mitochondrial gene, which is present in all animal phyla and in a few others, have been shown to be useful for species identification through DNA barcoding. A few million such fragments are now publicly available through the BOLD systems initiative, thus providing an unprecedented opportunity for relatively comprehensive information-theoretic analyses of DNA to be attempted. Here we show how a generalized form of total correlation can yield distinctive information-theoretic descriptors of the phyla represented in those fragments. In order to illustrate the potential of this analysis to provide new insight into the evolution of species, we performed principal component analysis on standardized versions of the said descriptors for 23 phyla. Surprisingly, we found that, though based solely on the species represented in the data, the first principal component correlates strongly with the natural logarithm of the number of all known living species for those phyla. The new descriptors thus constitute clear information-theoretic signatures of the processes whereby evolution has given rise to current biodiversity, which suggests their potential usefulness in further related studies. Copyright © 2018 Elsevier Ltd. All rights reserved.
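
The generalized total correlation used here reduces, in its basic form, to the sum of per-position entropies minus the joint entropy of a set of aligned sequences. A toy sketch on made-up four-letter fragments (not actual COI barcodes) shows the plug-in computation:

```python
import numpy as np
from collections import Counter

def entropy(symbols):
    """Plug-in Shannon entropy (bits) of a sequence of hashable symbols."""
    n = len(symbols)
    return -sum((c / n) * np.log2(c / n) for c in Counter(symbols).values())

def total_correlation(fragments):
    """TC = sum of per-position entropies minus the joint entropy,
    estimated over a set of aligned, equal-length sequences."""
    columns = list(zip(*fragments))
    return sum(entropy(col) for col in columns) - entropy(fragments)

# Hypothetical aligned fragments: position 0 and position 3 are perfectly
# correlated, while the middle positions are constant.
frags = ["ACGT", "ACGT", "TCGA", "TCGA"] * 25
print(total_correlation(frags))   # 1.0 bit of redundancy across positions
```

A nonzero value signals statistical dependence among positions; descriptors of this kind are what the study standardizes per phylum before principal component analysis.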

  3. Information-theoretic semi-supervised metric learning via entropy regularization.

    Science.gov (United States)

    Niu, Gang; Dai, Bo; Yamada, Makoto; Sugiyama, Masashi

    2014-08-01

    We propose a general information-theoretic approach to semi-supervised metric learning called SERAPH (SEmi-supervised metRic leArning Paradigm with Hypersparsity) that does not rely on the manifold assumption. Given the probability parameterized by a Mahalanobis distance, we maximize its entropy on labeled data and minimize its entropy on unlabeled data following entropy regularization. For metric learning, entropy regularization improves manifold regularization by considering the dissimilarity information of unlabeled data in the unsupervised part, and hence it allows the supervised and unsupervised parts to be integrated in a natural and meaningful way. Moreover, we regularize SERAPH by trace-norm regularization to encourage low-dimensional projections associated with the distance metric. The nonconvex optimization problem of SERAPH could be solved efficiently and stably by either a gradient projection algorithm or an EM-like iterative algorithm whose M-step is convex. Experiments demonstrate that SERAPH compares favorably with many well-known metric learning methods, and the learned Mahalanobis distance possesses high discriminability even under noisy environments.

  4. The Theoretical Principles of the Organization of Information Systems.

    Science.gov (United States)

    Kulikowski, Juliusz Lech

    A survey of the theoretical problems connected with the organization and design of systems for processing and transmitting information is presented in this article. It gives a definition of Information Systems (IS) and classifies them from various points of view. It discusses briefly the most important aspects of the organization of IS, such as…

  5. Qualitative methods in theoretical physics

    CERN Document Server

    Maslov, Dmitrii

    2018-01-01

    This book comprises a set of tools which allow researchers and students to arrive at a qualitatively correct answer without undertaking lengthy calculations. In general, Qualitative Methods in Theoretical Physics is about combining approximate mathematical methods with fundamental principles of physics: conservation laws and symmetries. Readers will learn how to simplify problems, how to estimate results, and how to apply symmetry arguments and conduct dimensional analysis. A comprehensive problem set is included. The book will appeal to a wide range of students and researchers.

  6. Theoretical aspects of cellular decision-making and information-processing.

    Science.gov (United States)

    Kobayashi, Tetsuya J; Kamimura, Atsushi

    2012-01-01

    Microscopic biological processes have extraordinary complexity and variety at the sub-cellular, intra-cellular, and multi-cellular levels. In dealing with such complex phenomena, conceptual and theoretical frameworks are crucial, as they enable us to understand seemingly different intra- and inter-cellular phenomena from unified viewpoints. Decision-making is one such concept that has attracted much attention recently. Since much cellular behavior can be regarded as a process of taking specific actions in response to external stimuli, decision-making can cover, and has been used to explain, a broad range of different cellular phenomena [Balázsi et al. (Cell 144(6):910, 2011), Zeng et al. (Cell 141(4):682, 2010)]. Decision-making is also closely related to cellular information-processing because appropriate decisions cannot be made without exploiting the information that the external stimuli contain. Efficiency of information transduction and processing by intra-cellular networks determines the amount of information obtained, which in turn limits the efficiency of subsequent decision-making. Furthermore, information-processing itself can serve as another concept that is crucial for understanding biological processes other than decision-making. In this work, we review recent theoretical developments on cellular decision-making and information-processing by focusing on the relation between these two concepts.

  7. Role of information theoretic uncertainty relations in quantum theory

    International Nuclear Information System (INIS)

    Jizba, Petr; Dunningham, Jacob A.; Joo, Jaewoo

    2015-01-01

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed
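
For discrete observables, the simplest uncertainty relation of this kind is the Maassen-Uffink bound H(A) + H(B) ≥ -2 log2 max |⟨a_i|b_j⟩|. A small numerical check for a qubit measured in the computational and Hadamard bases (a plainer setting than the two-energy-level model of the paper, chosen only to make the bound concrete):

```python
import numpy as np

def shannon_bits(p):
    """Shannon entropy (bits) of a probability vector, ignoring zeros."""
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

basis_z = np.eye(2)                                 # computational basis
basis_x = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard basis (rows = bras)

# Maassen-Uffink bound: -2*log2 of the largest overlap between the two bases.
c = np.abs(basis_z @ basis_x.conj().T).max()
bound = -2 * np.log2(c)                             # = 1 bit for these bases

def entropy_sum(psi):
    """H(Z) + H(X) for the measurement outcome distributions of state psi."""
    pz = np.abs(basis_z @ psi) ** 2
    px = np.abs(basis_x @ psi) ** 2
    return shannon_bits(pz) + shannon_bits(px)

theta = 0.3
psi = np.array([np.cos(theta), np.sin(theta)])      # an arbitrary pure state
print(entropy_sum(psi), ">=", bound)
```

For mutually unbiased bases the bound is tight: the state |0⟩ gives H(Z) = 0 and H(X) = 1, exactly meeting the 1-bit floor.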

  8. Role of information theoretic uncertainty relations in quantum theory

    Energy Technology Data Exchange (ETDEWEB)

    Jizba, Petr, E-mail: p.jizba@fjfi.cvut.cz [FNSPE, Czech Technical University in Prague, Břehová 7, 115 19 Praha 1 (Czech Republic); ITP, Freie Universität Berlin, Arnimallee 14, D-14195 Berlin (Germany); Dunningham, Jacob A., E-mail: J.Dunningham@sussex.ac.uk [Department of Physics and Astronomy, University of Sussex, Falmer, Brighton, BN1 9QH (United Kingdom); Joo, Jaewoo, E-mail: j.joo@surrey.ac.uk [Advanced Technology Institute and Department of Physics, University of Surrey, Guildford, GU2 7XH (United Kingdom)

    2015-04-15

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.

  9. Expectancy-Violation and Information-Theoretic Models of Melodic Complexity

    Directory of Open Access Journals (Sweden)

    Tuomas Eerola

    2016-07-01

    Full Text Available The present study assesses two types of models for melodic complexity: one based on expectancy violations and the other related to an information-theoretic account of redundancy in music. Seven different datasets spanning artificial sequences, folk and pop songs were used to refine and assess the models. The refinement eliminated unnecessary components from both types of models. The final analysis pitted three variants of the two model types against each other and could explain 46-74% of the variance in the ratings across the datasets. The most parsimonious models were identified with an information-theoretic criterion. This suggested that the simplified expectancy-violation models were the most efficient for these sets of data. However, the differences between all optimized models were subtle in terms of both performance and simplicity.

  10. Information density converges in dialogue: Towards an information-theoretic model.

    Science.gov (United States)

    Xu, Yang; Reitter, David

    2018-01-01

    The principle of entropy rate constancy (ERC) states that language users distribute information such that words tend to be equally predictable given previous contexts. We examine the applicability of this principle to spoken dialogue, as previous findings primarily rest on written text. The study takes into account the joint-activity nature of dialogue and the topic shift mechanisms that are different from monologue. It examines how the information contributions from the two dialogue partners interactively evolve as the discourse develops. The increase of local sentence-level information density (predicted by ERC) is shown to apply to dialogue overall. However, when the different roles of interlocutors in introducing new topics are identified, their contribution in information content displays a new converging pattern. We draw explanations for this pattern from multiple perspectives: Casting dialogue as an information exchange system would mean that the pattern is the result of two interlocutors maintaining their own context rather than sharing one. Second, we present some empirical evidence that a model of Interactive Alignment may include information density to explain the effect. Third, we argue that building common ground is a process analogous to information convergence. Thus, we put forward an information-theoretic view of dialogue, under which some existing theories of human dialogue may eventually be unified. Copyright © 2017 Elsevier B.V. All rights reserved.
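
Information density in this line of work is word-level surprisal under a language model, averaged per sentence. A minimal sketch using a unigram model with add-one smoothing (real studies use n-gram or neural models; the corpus and sentences here are toy examples):

```python
import math
from collections import Counter

def per_word_entropy(sentence, unigram_counts, total):
    """Average surprisal (bits/word) of a sentence under a unigram model
    with add-one smoothing -- a minimal stand-in for the language models
    used to measure information density."""
    vocab = len(unigram_counts) + 1   # +1 for unseen words
    return sum(-math.log2((unigram_counts[w] + 1) / (total + vocab))
               for w in sentence) / len(sentence)

corpus = "the cat sat on the mat . the dog sat on the rug .".split()
counts = Counter(corpus)
total = len(corpus)

s1 = "the cat sat .".split()
s2 = "the aardvark sat .".split()   # out-of-vocabulary word -> higher density
print(per_word_entropy(s1, counts, total), per_word_entropy(s2, counts, total))
```

Tracking this per-sentence quantity over the course of a dialogue is how convergence or divergence between interlocutors' information contributions is quantified.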

  11. Theoretical Model of Development of Information Competence among Students Enrolled in Elective Courses

    Science.gov (United States)

    Zhumasheva, Anara; Zhumabaeva, Zaida; Sakenov, Janat; Vedilina, Yelena; Zhaxylykova, Nuriya; Sekenova, Balkumis

    2016-01-01

    The current study focuses on the research topic of creating a theoretical model of development of information competence among students enrolled in elective courses. In order to examine specific features of the theoretical model of development of information competence among students enrolled in elective courses, we performed an analysis of…

  12. Theoretical information measurement in nonrelativistic time-dependent approach

    Science.gov (United States)

    Najafizade, S. A.; Hassanabadi, H.; Zarrinkamar, S.

    2018-02-01

    The information-theoretic measures of the time-dependent Schrödinger equation are investigated via the Shannon information entropy, variance and local Fisher quantities. In our calculations, we consider the first two states n = 0, 1 and obtain the position Sx(t) and momentum Sp(t) Shannon entropies as well as the Fisher information Ix(t) in position space and Ip(t) in momentum space. Using the Fourier-transformed wave function, we obtain the results in momentum space. Some interesting features of the information entropy densities ρs(x,t) and γs(p,t), as well as the probability densities ρ(x,t) and γ(p,t) for time-dependent states, are demonstrated. We establish a general relation between the variance and the Fisher information. The Bialynicki-Birula-Mycielski inequality is tested and verified for the states n = 0, 1.
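
    For a Gaussian density, the position-space Shannon entropy has the closed form ½ ln(2πeσ²), which makes a quick numerical sanity check possible. The sketch below is my own illustration, not tied to the paper's system: it integrates S_x = −∫ρ ln ρ dx on a grid and then verifies the Bialynicki-Birula-Mycielski bound S_x + S_p ≥ 1 + ln π, which a Gaussian saturates (with ħ = 1, σ_p = 1/(2σ)).

```python
import numpy as np

def shannon_entropy(rho, x):
    """Differential Shannon entropy S = -∫ rho ln(rho) dx on a grid (trapezoid rule)."""
    return np.trapz(-rho * np.log(rho), x)

sigma = 1.3                                   # width of a Gaussian position density
x = np.linspace(-15, 15, 6001)
rho = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

S_x = shannon_entropy(rho, x)
S_x_exact = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

# Conjugate Gaussian momentum density (hbar = 1): sigma_p = 1/(2 sigma).
# The Bialynicki-Birula-Mycielski inequality S_x + S_p >= 1 + ln(pi) is saturated.
S_p = 0.5 * np.log(2 * np.pi * np.e / (4 * sigma**2))
```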

  13. Information-Theoretical Analysis of EEG Microstate Sequences in Python

    Directory of Open Access Journals (Sweden)

    Frederic von Wegner

    2018-06-01

    We present an open-source Python package to compute information-theoretical quantities for electroencephalographic data. Electroencephalography (EEG) measures the electrical potential generated by the cerebral cortex, and the set of spatial patterns projected by the brain's electrical potential onto the scalp surface can be clustered into a set of representative maps called EEG microstates. Microstate time series are obtained by competitively fitting the microstate maps back into the EEG data set, i.e., by substituting the EEG data at a given time with the label of the microstate that has the highest similarity with the actual EEG topography. As microstate sequences consist of non-metric random variables, e.g., the letters A–D, we recently introduced information-theoretical measures to quantify these time series. In wakeful resting-state EEG recordings, we found new characteristics of microstate sequences such as periodicities related to EEG frequency bands. The algorithms used are here provided as an open-source package and their use is explained in tutorial style. The package is self-contained and the programming style is procedural, focusing on code intelligibility and easy portability. Using a sample EEG file, we demonstrate how to perform EEG microstate segmentation using the modified K-means approach, and how to compute and visualize the recently introduced information-theoretical tests and quantities. The time-lagged mutual information function is derived as a discrete symbolic alternative to the autocorrelation function for metric time series, and confidence intervals are computed from Markov chain surrogate data. The software package provides an open-source extension to the existing implementations of the microstate transform and is specifically designed to analyze resting-state EEG recordings.
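
    The time-lagged mutual information for a symbolic sequence can be sketched in a few lines. The following plug-in estimator is my own minimal illustration, not the package's API: it treats the sequence as samples of the pair (X_t, X_{t+lag}) and computes I from empirical counts.

```python
import math
from collections import Counter

def lagged_mutual_information(seq, lag):
    """Time-lagged mutual information I(X_t; X_{t+lag}) in nats,
    estimated from empirical pair counts of a symbolic sequence."""
    pairs = list(zip(seq[:-lag], seq[lag:]))
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum(c / n * math.log(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# A strictly alternating sequence is fully predictable one step ahead,
# so the lag-1 MI approaches the symbol entropy ln(2).
mi1 = lagged_mutual_information("AB" * 500, 1)
```

For an i.i.d. sequence the same estimator would return values near zero at all lags, which is what makes it a symbolic analogue of the autocorrelation function.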

  14. Theoretical-methodical Fundamentals of industrial marketing research

    OpenAIRE

    Butenko, N.

    2009-01-01

    The article proves the necessity to research theoretical and methodical fundamentals of industrial marketing and defines main key aspects of relationship management with the customers on industrial market.

  15. Information theoretic analysis of canny edge detection in visual communication

    Science.gov (United States)

    Jiang, Bo; Rahman, Zia-ur

    2011-06-01

    In general edge detection evaluation, edge detectors are examined, analyzed, and compared either visually or with a metric for a specific application. This analysis is usually independent of the characteristics of the image-gathering, transmission and display processes that do impact the quality of the acquired image and thus the resulting edge image. We propose a new information-theoretic analysis of edge detection that unites the different components of the visual communication channel and assesses edge detection algorithms in an integrated manner based on Shannon's information theory. An edge detection algorithm is here considered to achieve high performance only if the information rate from the scene to the edge image approaches the maximum possible. Thus, by holding the initial conditions of the visual communication system constant, different edge detection algorithms can be evaluated. This analysis is normally limited to linear shift-invariant filters, so in order to examine the Canny edge operator in our proposed system, we need to estimate its "power spectral density" (PSD). Since the Canny operator is non-linear and shift-variant, we perform the estimation for a set of different system environment conditions using simulations. In this paper we first introduce the PSD of the Canny operator for a range of system parameters. Then, using the estimated PSD, we assess the Canny operator using information-theoretic analysis. The information-theoretic metric is also used to compare the performance of the Canny operator with other edge-detection operators. This also provides a simple tool for selecting appropriate edge-detection algorithms based on system parameters, and for adjusting their parameters to maximize information throughput.

  16. Information-theoretic temporal Bell inequality and quantum computation

    International Nuclear Information System (INIS)

    Morikoshi, Fumiaki

    2006-01-01

    An information-theoretic temporal Bell inequality is formulated to contrast classical and quantum computations. Any classical algorithm satisfies the inequality, while quantum ones can violate it. Therefore, the violation of the inequality is an immediate consequence of the quantumness in the computation. Furthermore, this approach suggests a notion of temporal nonlocality in quantum computation.

  17. A theoretical study on a convergence problem of nodal methods

    Energy Technology Data Exchange (ETDEWEB)

    Shaohong, Z.; Ziyong, L. [Shanghai Jiao Tong Univ., 1954 Hua Shan Road, Shanghai, 200030 (China); Chao, Y. A. [Westinghouse Electric Company, P. O. Box 355, Pittsburgh, PA 15230-0355 (United States)

    2006-07-01

    The effectiveness of modern nodal methods is largely due to their use of information from the analytical flux solution inside a homogeneous node. As a result, the nodal coupling coefficients depend explicitly or implicitly on the evolving eigenvalue of a problem during its solution iteration process. This poses an inherently non-linear matrix eigenvalue iteration problem. This paper points out analytically that, whenever the half wavelength of an evolving node-interior analytic solution becomes smaller than the size of that node, this non-linear iteration problem can become inherently unstable and can theoretically be non-convergent or converge to higher-order harmonics. This phenomenon is confirmed, demonstrated and analyzed via the simplest 1-D problem solved by the simplest analytic nodal method, the Analytic Coarse Mesh Finite Difference (ACMFD, [1]) method. (authors)

  18. Information-Theoretic Inference of Common Ancestors

    Directory of Open Access Journals (Sweden)

    Bastian Steudel

    2015-04-01

    A directed acyclic graph (DAG) partially represents the conditional independence structure among observations of a system if the local Markov condition holds, that is, if every variable is independent of its non-descendants given its parents. In general, there is a whole class of DAGs that represents a given set of conditional independence relations. We are interested in properties of this class that can be derived from observations of a subsystem only. To this end, we prove an information-theoretic inequality that allows for the inference of common ancestors of observed parts in any DAG representing some unknown larger system. More explicitly, we show that a large amount of dependence in terms of mutual information among the observations implies the existence of a common ancestor that distributes this information. Within the causal interpretation of DAGs, our result can be seen as a quantitative extension of Reichenbach’s principle of common cause to more than two variables. Our conclusions are valid also for non-probabilistic observations, such as binary strings, since we state the proof for an axiomatized notion of “mutual information” that includes the stochastic as well as the algorithmic version.

  19. Parametric sensitivity analysis for stochastic molecular systems using information theoretic metrics

    Energy Technology Data Exchange (ETDEWEB)

    Tsourtis, Anastasios, E-mail: tsourtis@uoc.gr [Department of Mathematics and Applied Mathematics, University of Crete, Crete (Greece); Pantazis, Yannis, E-mail: pantazis@math.umass.edu; Katsoulakis, Markos A., E-mail: markos@math.umass.edu [Department of Mathematics and Statistics, University of Massachusetts, Amherst, Massachusetts 01003 (United States); Harmandaris, Vagelis, E-mail: harman@uoc.gr [Department of Mathematics and Applied Mathematics, University of Crete, and Institute of Applied and Computational Mathematics (IACM), Foundation for Research and Technology Hellas (FORTH), GR-70013 Heraklion, Crete (Greece)

    2015-07-07

    In this paper, we present a parametric sensitivity analysis (SA) methodology for continuous time and continuous space Markov processes represented by stochastic differential equations. Particularly, we focus on stochastic molecular dynamics as described by the Langevin equation. The utilized SA method is based on the computation of the information-theoretic (and thermodynamic) quantity of relative entropy rate (RER) and the associated Fisher information matrix (FIM) between path distributions, and it is an extension of the work proposed by Y. Pantazis and M. A. Katsoulakis [J. Chem. Phys. 138, 054115 (2013)]. A major advantage of the pathwise SA method is that both RER and pathwise FIM depend only on averages of the force field; therefore, they are tractable and computable as ergodic averages from a single run of the molecular dynamics simulation both in equilibrium and in non-equilibrium steady state regimes. We validate the performance of the extended SA method on two different stochastic molecular systems, a standard Lennard-Jones fluid and an all-atom methane liquid, and compare the obtained parameter sensitivities with parameter sensitivities on three popular and well-studied observable functions, namely, the radial distribution function, the mean squared displacement, and the pressure. Results show that the RER-based sensitivities are highly correlated with the observable-based sensitivities.

  20. Surface physics theoretical models and experimental methods

    CERN Document Server

    Mamonova, Marina V; Prudnikova, I A

    2016-01-01

    The demands of production, such as thin films in microelectronics, rely on consideration of factors influencing the interaction of dissimilar materials that make contact with their surfaces. Bond formation between surface layers of dissimilar condensed solids-termed adhesion-depends on the nature of the contacting bodies. Thus, it is necessary to determine the characteristics of adhesion interaction of different materials from both applied and fundamental perspectives of surface phenomena. Given the difficulty in obtaining reliable experimental values of the adhesion strength of coatings, the theoretical approach to determining adhesion characteristics becomes more important. Surface Physics: Theoretical Models and Experimental Methods presents straightforward and efficient approaches and methods developed by the authors that enable the calculation of surface and adhesion characteristics for a wide range of materials: metals, alloys, semiconductors, and complex compounds. The authors compare results from the ...

  1. Information Theoretic Tools for Parameter Fitting in Coarse Grained Models

    KAUST Repository

    Kalligiannaki, Evangelia; Harmandaris, Vagelis; Katsoulakis, Markos A.; Plechac, Petr

    2015-01-01

    We study the application of information theoretic tools for model reduction in the case of systems driven by stochastic dynamics out of equilibrium. The model/dimension reduction is considered by proposing parametrized coarse grained dynamics

  2. Information-Theoretic Perspectives on Geophysical Models

    Science.gov (United States)

    Nearing, Grey

    2016-04-01

    practice of science (except by Gong et al., 2013, whose fundamental insight is the basis for this talk), and here I offer two examples of practical methods that scientists might use to approximately measure ontological information. I place this practical discussion in the context of several recent and high-profile experiments that have found that simple out-of-sample statistical models typically (vastly) outperform our most sophisticated terrestrial hydrology models. I offer some perspective on several open questions about how to use these findings to improve our models and understanding of these systems. Cartwright, N. (1983) How the Laws of Physics Lie. New York, NY: Cambridge Univ Press. Clark, M. P., Kavetski, D. and Fenicia, F. (2011) 'Pursuing the method of multiple working hypotheses for hydrological modeling', Water Resources Research, 47(9). Cover, T. M. and Thomas, J. A. (1991) Elements of Information Theory. New York, NY: Wiley-Interscience. Cox, R. T. (1946) 'Probability, frequency and reasonable expectation', American Journal of Physics, 14, pp. 1-13. Csiszár, I. (1972) 'A Class of Measures of Informativity of Observation Channels', Periodica Mathematica Hungarica, 2(1), pp. 191-213. Davies, P. C. W. (1990) 'Why is the physical world so comprehensible', Complexity, entropy and the physics of information, pp. 61-70. Gong, W., Gupta, H. V., Yang, D., Sricharan, K. and Hero, A. O. (2013) 'Estimating Epistemic & Aleatory Uncertainties During Hydrologic Modeling: An Information Theoretic Approach', Water Resources Research, 49(4), pp. 2253-2273. Jaynes, E. T. (2003) Probability Theory: The Logic of Science. New York, NY: Cambridge University Press. Nearing, G. S. and Gupta, H. V. (2015) 'The quantity and quality of information in hydrologic models', Water Resources Research, 51(1), pp. 524-538. Popper, K. R. (2002) The Logic of Scientific Discovery. New York: Routledge. Van Horn, K. S. 
(2003) 'Constructing a logic of plausible inference: a guide to cox's theorem

  3. Information-Theoretic Data Discarding for Dynamic Trees on Data Streams

    Directory of Open Access Journals (Sweden)

    Christoforos Anagnostopoulos

    2013-12-01

    Ubiquitous automated data collection at an unprecedented scale is making available streaming, real-time information flows in a wide variety of settings, transforming both science and industry. Learning algorithms deployed in such contexts often rely on single-pass inference, where the data history is never revisited. Learning may also need to be temporally adaptive to remain up-to-date against unforeseen changes in the data generating mechanism. Online Bayesian inference remains challenged by such transient, evolving data streams. Nonparametric modeling techniques can prove particularly ill-suited, as the complexity of the model is allowed to increase with the sample size. In this work, we take steps to overcome these challenges by porting information theoretic heuristics, such as exponential forgetting and active learning, into a fully Bayesian framework. We showcase our methods by augmenting a modern non-parametric modeling framework, dynamic trees, and illustrate its performance on a number of practical examples. The end product is a powerful streaming regression and classification tool, whose performance compares favorably to the state-of-the-art.
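
    The exponential forgetting mentioned above can be shown with a much simpler conjugate model than dynamic trees. The sketch below is my own illustration of the mechanism, not the authors' implementation: a Beta-Bernoulli posterior whose pseudo-counts are discounted by a factor λ before each update, so the effective sample size stays bounded by 1/(1 − λ) and the model tracks changes in the data-generating mechanism.

```python
class ForgettingBernoulli:
    """Beta-Bernoulli posterior with exponential forgetting: pseudo-counts
    are discounted by lam at each step, bounding the effective sample size
    by 1/(1 - lam) so old observations gradually fade."""
    def __init__(self, a=1.0, b=1.0, lam=0.9):
        self.a, self.b, self.lam = a, b, lam   # Beta(a, b) pseudo-counts

    def update(self, y):
        # Discount the evidence accumulated so far, then add the new observation.
        self.a = self.lam * self.a + (1.0 if y else 0.0)
        self.b = self.lam * self.b + (0.0 if y else 1.0)

    def mean(self):
        return self.a / (self.a + self.b)

model = ForgettingBernoulli(lam=0.9)
for y in [0] * 50 + [1] * 50:   # the underlying success rate jumps from 0 to 1
    model.update(y)
```

After the jump, the posterior mean moves close to 1 within a few dozen observations; without forgetting (λ = 1) it would remain near 0.5.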

  4. Dimensional Information-Theoretic Measurement of Facial Emotion Expressions in Schizophrenia

    Directory of Open Access Journals (Sweden)

    Jihun Hamm

    2014-01-01

    Altered facial expressions of emotions are characteristic impairments in schizophrenia. Ratings of affect have traditionally been limited to clinical rating scales and facial muscle movement analysis, which require extensive training and have limitations based on methodology and ecological validity. To improve reliable assessment of dynamic facial expression changes, we have developed automated measurements of facial emotion expressions based on information-theoretic measures of expressivity: the ambiguity and distinctiveness of facial expressions. These measures were examined in matched groups of persons with schizophrenia (n=28) and healthy controls (n=26) who underwent video acquisition to assess the expressivity of basic emotions (happiness, sadness, anger, fear, and disgust) in evoked conditions. Persons with schizophrenia scored higher on ambiguity, the measure of conditional entropy within the expression of a single emotion, and they scored lower on distinctiveness, the measure of mutual information across expressions of different emotions. The automated measures compared favorably with observer-based ratings. This method can be applied for delineating dynamic emotional expressivity in healthy and clinical populations.
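
    The two quantities can be computed directly from a joint distribution over intended emotions and observed expression patterns. The sketch below is my own illustration of the definitions (conditional entropy and mutual information), not the authors' measurement pipeline.

```python
import math

def entropy(probs):
    """Shannon entropy (nats) of a probability vector."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def ambiguity_distinctiveness(joint):
    """joint[i][j] = p(intended emotion i, observed expression pattern j).
    Ambiguity       = H(expression | emotion)   (conditional entropy)
    Distinctiveness = I(emotion ; expression)   (mutual information)"""
    p_emotion = [sum(row) for row in joint]
    p_expr = [sum(col) for col in zip(*joint)]
    h_joint = entropy([p for row in joint for p in row])
    ambiguity = h_joint - entropy(p_emotion)
    distinctiveness = entropy(p_expr) - ambiguity
    return ambiguity, distinctiveness

# Perfectly distinct expressions: each emotion maps to its own pattern.
amb_lo, dis_hi = ambiguity_distinctiveness([[0.5, 0.0], [0.0, 0.5]])
# Perfectly ambiguous expressions: patterns carry no emotion information.
amb_hi, dis_lo = ambiguity_distinctiveness([[0.25, 0.25], [0.25, 0.25]])
```

The two toy cases bracket the scale: a fully distinct mapping gives zero ambiguity and maximal distinctiveness, and a fully ambiguous one gives the reverse.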

  5. Systems information management: graph theoretical approach

    NARCIS (Netherlands)

    Temel, T.

    2006-01-01

    This study proposes a new method for characterising the underlying information structure of a multi-sector system. A complete characterisation is accomplished by identifying information gaps and cause-effect information pathways in the system, and formulating critical testable hypotheses.

  6. On the information-theoretic approach to Gödel's incompleteness theorem

    OpenAIRE

    D'Abramo, Germano

    2002-01-01

    In this paper we briefly review and analyze three published proofs of Chaitin's theorem, the celebrated information-theoretic version of Gödel's incompleteness theorem. We then discuss our main concern regarding a key step common to all these demonstrations.

  7. Adaptive information-theoretic bounded rational decision-making with parametric priors

    OpenAIRE

    Grau-Moya, Jordi; Braun, Daniel A.

    2015-01-01

    Deviations from rational decision-making due to limited computational resources have been studied in the field of bounded rationality, originally proposed by Herbert Simon. There have been a number of different approaches to model bounded rationality ranging from optimality principles to heuristics. Here we take an information-theoretic approach to bounded rationality, where information-processing costs are measured by the relative entropy between a posterior decision strategy and a given fix...

  8. Nanoscale thermal transport: Theoretical method and application

    Science.gov (United States)

    Zeng, Yu-Jia; Liu, Yue-Yang; Zhou, Wu-Xing; Chen, Ke-Qiu

    2018-03-01

    With the size reduction of nanoscale electronic devices, the heat generated per unit area in integrated circuits increases exponentially, and consequently thermal management in these devices is a very important issue. In addition, the heat generated by electronic devices mostly diffuses into the air as waste heat, which makes thermoelectric energy conversion an important issue as well. In recent years, the thermal transport properties of nanoscale systems have attracted increasing attention in both experiments and theoretical calculations. In this review, we discuss various theoretical simulation methods for investigating thermal transport properties and take a glance at several interesting thermal transport phenomena in nanoscale systems. Our emphasis lies on the advantages and limitations of each calculational method, and on the applications of nanoscale thermal transport and thermoelectric properties. Project supported by the National Key Research and Development Program of China (Grant No. 2017YFB0701602) and the National Natural Science Foundation of China (Grant No. 11674092).

  9. Symbolic interactionism as a theoretical perspective for multiple method research.

    Science.gov (United States)

    Benzies, K M; Allen, M N

    2001-02-01

    Qualitative and quantitative research rely on different epistemological assumptions about the nature of knowledge. However, the majority of nurse researchers who use multiple method designs do not address the problem of differing theoretical perspectives. Traditionally, symbolic interactionism has been viewed as one perspective underpinning qualitative research, but it is also the basis for quantitative studies. Rooted in social psychology, symbolic interactionism has a rich intellectual heritage that spans more than a century. Underlying symbolic interactionism is the major assumption that individuals act on the basis of the meaning that things have for them. The purpose of this paper is to present symbolic interactionism as a theoretical perspective for multiple method designs with the aim of expanding the dialogue about new methodologies. Symbolic interactionism can serve as a theoretical perspective for conceptually clear and soundly implemented multiple method research that will expand the understanding of human health behaviour.

  10. MAIA - Method for Architecture of Information Applied: methodological construct of information processing in complex contexts

    Directory of Open Access Journals (Sweden)

    Ismael de Moura Costa

    2017-04-01

    Introduction: This paper presents the evolution of the MAIA Method for Architecture of Information Applied, its structure, the results obtained, and three practical applications. Objective: To propose a methodological construct for the treatment of complex information, distinguishing information spaces and revealing the configurations inherent to those spaces. Methodology: The argument is developed from theoretical research of an analytical character, using distinction as a way to express concepts. Phenomenology is adopted as the philosophical position, which considers the correlation between Subject↔Object. The research also treats the notion of interpretation as an integrating element for the definition of concepts. From these postulates, the steps to transform information spaces are formulated. Results: The article shows how the method is structured to process information in its contexts, starting from a succession of evolutionary cycles, divided into moments, which in turn unfold into transformation acts. Conclusions: Beyond describing the method's structure, the article presents its possible applications not only as a scientific method, but also as a configuration tool for information spaces and as a generator of ontologies. Finally, it offers a brief summary of the analyses by researchers who have already evaluated the method with respect to these three aspects.

  11. Information-Theoretic Performance Analysis of Sensor Networks via Markov Modeling of Time Series Data.

    Science.gov (United States)

    Li, Yue; Jha, Devesh K; Ray, Asok; Wettergren, Thomas A

    2018-06-01

    This paper presents information-theoretic performance analysis of passive sensor networks for detection of moving targets. The proposed method falls largely under the category of data-level information fusion in sensor networks. To this end, a measure of information contribution for sensors is formulated in a symbolic dynamics framework. The network information state is approximately represented as the largest principal component of the time series collected across the network. To quantify each sensor's contribution for generation of the information content, Markov machine models as well as x-Markov (pronounced as cross-Markov) machine models, conditioned on the network information state, are constructed; the difference between the conditional entropies of these machines is then treated as an approximate measure of information contribution by the respective sensors. The x-Markov models represent the conditional temporal statistics given the network information state. The proposed method has been validated on experimental data collected from a local area network of passive sensors for target detection, where the statistical characteristics of environmental disturbances are similar to those of the target signal in the sense of time scale and texture. A distinctive feature of the proposed algorithm is that the network decisions are independent of the behavior and identity of the individual sensors, which is desirable from computational perspectives. Results are presented to demonstrate the proposed method's efficacy to correctly identify the presence of a target with very low false-alarm rates. The performance of the underlying algorithm is compared with that of a recent data-driven, feature-level information fusion algorithm. It is shown that the proposed algorithm outperforms the other algorithm.
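
    The core idea of measuring a sensor's information contribution as a difference of conditional entropies can be shown with a toy example. The sketch below is my own illustration with invented data, not the paper's D-Markov/x-Markov construction: a "Markov machine" conditions the next symbol only on the current one, an "x-Markov machine" conditions also on the network information state, and the entropy difference is treated as the sensor's contribution.

```python
import math
import random
from collections import Counter

def cond_entropy(pairs):
    """Plug-in estimate of H(Y | X) in nats from (x, y) samples."""
    pairs = list(pairs)
    n = len(pairs)
    nxy = Counter(pairs)
    nx = Counter(x for x, _ in pairs)
    return sum(c / n * math.log(nx[x] / c) for (x, y), c in nxy.items())

random.seed(0)
state = [random.choice("AB") for _ in range(5000)]   # "network information state"
sensor = state[:]                                    # a sensor that tracks the state exactly

# Markov machine: H(x_{t+1} | x_t) -- the sensor alone looks unpredictable.
h_markov = cond_entropy(zip(sensor, sensor[1:]))
# x-Markov machine: conditioning also on the network state removes the uncertainty.
h_xmarkov = cond_entropy(zip(zip(sensor, state[1:]), sensor[1:]))
contribution = h_markov - h_xmarkov
```

Here the sensor copies the state, so conditioning on the state makes it fully predictable and the contribution approaches the full entropy rate ln 2; an irrelevant sensor would yield a contribution near zero.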

  12. An information-theoretic approach to assess practical identifiability of parametric dynamical systems.

    Science.gov (United States)

    Pant, Sanjay; Lombardi, Damiano

    2015-10-01

    A new approach for assessing parameter identifiability of dynamical systems in a Bayesian setting is presented. The concept of Shannon entropy is employed to measure the inherent uncertainty in the parameters. The expected reduction in this uncertainty is seen as the amount of information one expects to gain about the parameters due to the availability of noisy measurements of the dynamical system. Such expected information gain is interpreted in terms of the variance of a hypothetical measurement device that can measure the parameters directly, and is related to practical identifiability of the parameters. If the individual parameters are unidentifiable, correlation between parameter combinations is assessed through conditional mutual information to determine which sets of parameters can be identified together. The information theoretic quantities of entropy and information are evaluated numerically through a combination of Monte Carlo and k-nearest neighbour methods in a non-parametric fashion. Unlike many methods to evaluate identifiability proposed in the literature, the proposed approach takes the measurement-noise into account and is not restricted to any particular noise-structure. Whilst computationally intensive for large dynamical systems, it is easily parallelisable and is non-intrusive as it does not necessitate re-writing of the numerical solvers of the dynamical system. The application of such an approach is presented for a variety of dynamical systems--ranging from systems governed by ordinary differential equations to partial differential equations--and, where possible, validated against results previously published in the literature. Copyright © 2015 Elsevier Inc. All rights reserved.
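
    The k-nearest-neighbour entropy estimation mentioned above can be sketched in one dimension. The code below is my own minimal illustration (1-D, k = 1) of the Kozachenko-Leonenko estimator, not the authors' implementation; the digamma function is replaced by its asymptotic expansion ψ(n) ≈ ln n − 1/(2n), and ψ(1) = −γ.

```python
import numpy as np

def kl_entropy_1d(samples):
    """Kozachenko-Leonenko 1-nearest-neighbour estimate of differential
    entropy (nats) for 1-D data: H ≈ psi(n) - psi(1) + mean(ln 2*r_i),
    where r_i is the distance from sample i to its nearest neighbour."""
    x = np.sort(np.asarray(samples, dtype=float))
    n = len(x)
    gaps = np.diff(x)
    r = np.empty(n)
    r[0], r[-1] = gaps[0], gaps[-1]
    r[1:-1] = np.minimum(gaps[:-1], gaps[1:])
    digamma_n = np.log(n) - 1.0 / (2 * n)        # asymptotic psi(n)
    return digamma_n + np.euler_gamma + np.mean(np.log(2.0 * r))  # psi(1) = -gamma

rng = np.random.default_rng(0)
est = kl_entropy_1d(rng.standard_normal(4000))
exact = 0.5 * np.log(2 * np.pi * np.e)   # differential entropy of N(0, 1)
```

Being non-parametric, the estimator needs no density model of the samples, which is what makes it attractive for the noisy posterior samples arising in identifiability analysis.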

  13. Investigating nurse practitioners in the private sector: a theoretically informed research protocol.

    Science.gov (United States)

    Adams, Margaret; Gardner, Glenn; Yates, Patsy

    2017-06-01

    To report a study protocol, and the theoretical framework (normalisation process theory) that informs it, for a case study investigation of private sector nurse practitioners. Most research evaluating nurse practitioner service is focused on public, mainly acute care environments where nurse practitioner service is well established with strong structures for governance and sustainability. Conversely, there is a lack of clarity in governance for emerging models in the private sector. In a climate of healthcare reform, nurse practitioner service is extending beyond the familiar public health sector. Further research is required to inform knowledge of the practice, operational framework and governance of new nurse practitioner models. The proposed research will use a multiple exploratory case study design to examine private sector nurse practitioner service. Data collection includes interviews, surveys and audits. A sequential mixed-method approach to analysis of each case will be conducted. Findings from within-case analysis will lead to a meta-synthesis across all four cases to gain a holistic understanding of the cases under study: private sector nurse practitioner service. Normalisation process theory will be used to guide the research process, specifically the coding and analysis of data using theory constructs and the relevant components associated with those constructs. This article provides a blueprint for the research and describes a theoretical framework, normalisation process theory, in terms of its flexibility as an analytical framework. Consistent with the goals of best research practice, this study protocol will inform the research community in the field of primary health care about emerging research in this field. Publishing a study protocol ensures researcher fidelity to the analysis plan and supports research collaboration across teams. © 2016 John Wiley & Sons Ltd.

  14. Distinguishing prognostic and predictive biomarkers: An information theoretic approach.

    Science.gov (United States)

    Sechidis, Konstantinos; Papangelou, Konstantinos; Metcalfe, Paul D; Svensson, David; Weatherall, James; Brown, Gavin

    2018-05-02

    The identification of biomarkers to support decision-making is central to personalised medicine, in both clinical and research scenarios. The challenge can be seen in two halves: identifying predictive markers, which guide the development/use of tailored therapies; and identifying prognostic markers, which guide other aspects of care and clinical trial planning, i.e. prognostic markers can be considered as covariates for stratification. Mistakenly assuming a biomarker to be predictive, when it is in fact largely prognostic (and vice-versa) is highly undesirable, and can result in financial, ethical and personal consequences. We present a framework for data-driven ranking of biomarkers on their prognostic/predictive strength, using a novel information theoretic method. This approach provides a natural algebra to discuss and quantify the individual predictive and prognostic strength, in a self-consistent mathematical framework. Our contribution is a novel procedure, INFO+, which naturally distinguishes the prognostic vs predictive role of each biomarker and handles higher order interactions. In a comprehensive empirical evaluation INFO+ outperforms more complex methods, most notably when noise factors dominate, and biomarkers are likely to be falsely identified as predictive, when in fact they are just strongly prognostic. Furthermore, we show that our methods can be 1-3 orders of magnitude faster than competitors, making it useful for biomarker discovery in 'big data' scenarios. Finally, we apply our methods to identify predictive biomarkers on two real clinical trials, and introduce a new graphical representation that provides greater insight into the prognostic and predictive strength of each biomarker. R implementations of the suggested methods are available at https://github.com/sechidis. konstantinos.sechidis@manchester.ac.uk. Supplementary data are available at Bioinformatics online.
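
    The prognostic/predictive distinction has a simple information-theoretic core. The sketch below is not INFO+ (which additionally handles higher-order interactions); it is my own toy illustration in which a marker's marginal mutual information with the outcome, I(X;Y), captures prognostic strength, while the increment gained by conditioning on treatment, I(X;Y|T) − I(X;Y), captures predictive strength.

```python
import math
from collections import Counter
from itertools import product

def mi(pairs):
    """Plug-in mutual information I(X;Y) in nats from (x, y) samples."""
    pairs = list(pairs)
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum(c / n * math.log(c * n / (px[x] * py[y])) for (x, y), c in pxy.items())

def cmi(triples):
    """Conditional mutual information I(X;Y|T) = sum_t p(t) * I(X;Y | T=t)."""
    triples = list(triples)
    by_t = {}
    for x, y, t in triples:
        by_t.setdefault(t, []).append((x, y))
    return sum(len(v) / len(triples) * mi(v) for v in by_t.values())

# Balanced toy trial: binary marker x crossed with binary treatment t.
cells = [(x, t) for x, t in product([0, 1], [0, 1])] * 250

prognostic = [(x, x, t) for x, t in cells]       # outcome y = x regardless of treatment
predictive = [(x, x ^ t, t) for x, t in cells]   # outcome y = x XOR t: marker matters only given t

prog_score_a = mi((x, y) for x, y, t in prognostic)
pred_score_a = cmi(prognostic) - prog_score_a
prog_score_b = mi((x, y) for x, y, t in predictive)
pred_score_b = cmi(predictive) - prog_score_b
```

The purely prognostic marker scores ln 2 marginally and gains nothing from conditioning on treatment, while the purely predictive marker looks independent of outcome marginally yet scores ln 2 once treatment is known, which is exactly the confusion the paper warns against.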

  15. An integrated organisation-wide data quality management and information governance framework: theoretical underpinnings.

    Science.gov (United States)

    Liaw, Siaw-Teng; Pearce, Christopher; Liyanage, Harshana; Liaw, Gladys S S; de Lusignan, Simon

    2014-01-01

    Increasing investment in eHealth aims to improve cost effectiveness and safety of care. Data extraction and aggregation can create new data products to improve professional practice and provide feedback to improve the quality of source data. A previous systematic review concluded that locally relevant clinical indicators and use of clinical record systems could support clinical governance. We aimed to extend and update the review with a theoretical framework. We searched PubMed, Medline, Web of Science, ABI Inform (Proquest) and Business Source Premier (EBSCO) using the terms curation, information ecosystem, data quality management (DQM), data governance, information governance (IG) and data stewardship. We focused on and analysed the scope of DQM and IG processes, theoretical frameworks, and determinants of the processing, quality assurance, presentation and sharing of data across the enterprise. There are good theoretical reasons for integrated governance, but there is variable alignment of DQM, IG and health system objectives across the health enterprise. Ethical constraints exist that require health information ecosystems to process data in ways that are aligned with improving health and system efficiency and ensuring patient safety. Despite an increasingly 'big-data' environment, DQM and IG in health services are still fragmented across the data production cycle. We extend current work on DQM and IG with a theoretical framework for integrated IG across the data cycle. The dimensions of this theory-based framework would require testing with qualitative and quantitative studies to examine the applicability and utility, along with an evaluation of its impact on data quality across the health enterprise.

  16. Toward theoretical understanding of the fertility preservation decision-making process: Examining information processing among young women with cancer

    Science.gov (United States)

    Hershberger, Patricia E.; Finnegan, Lorna; Altfeld, Susan; Lake, Sara; Hirshfeld-Cytron, Jennifer

    2014-01-01

    Background Young women with cancer now face the complex decision about whether to undergo fertility preservation. Yet little is known about how these women process information involved in making this decision. Objective The purpose of this paper is to expand theoretical understanding of the decision-making process by examining aspects of information processing among young women diagnosed with cancer. Methods Using a grounded theory approach, 27 women with cancer participated in individual, semi-structured interviews. Data were coded and analyzed using constant-comparison techniques that were guided by five dimensions within the Contemplate phase of the decision-making process framework. Results In the first dimension, young women acquired information primarily from clinicians and Internet sources. Experiential information, often obtained from peers, occurred in the second dimension. Preferences and values were constructed in the third dimension as women acquired factual, moral, and ethical information. Women desired tailored, personalized information that was specific to their situation in the fourth dimension; however, women struggled with communicating these needs to clinicians. In the fifth dimension, women offered detailed descriptions of clinician behaviors that enhance or impede decisional debriefing. Conclusion Better understanding of theoretical underpinnings surrounding women’s information processes can facilitate decision support and improve clinical care. PMID:24552086

  17. Universality in an information-theoretic motivated nonlinear Schrodinger equation

    International Nuclear Information System (INIS)

    Parwani, R; Tabia, G

    2007-01-01

    Using perturbative methods, we analyse a nonlinear generalization of Schrodinger's equation that had previously been obtained through information-theoretic arguments. We obtain analytical expressions for the leading correction, in terms of the nonlinearity scale, to the energy eigenvalues of the linear Schrodinger equation in the presence of an external potential and observe some generic features. In one space dimension these are (i) for nodeless ground states, the energy shifts are subleading in the nonlinearity parameter compared to the shifts for the excited states; (ii) the shifts for the excited states are due predominantly to contribution from the nodes of the unperturbed wavefunctions, and (iii) the energy shifts for excited states are positive for small values of a regulating parameter and negative at large values, vanishing at a universal critical value that is not manifest in the equation. Some of these features hold true for higher dimensional problems. We also study two exactly solved nonlinear Schrodinger equations so as to contrast our observations. Finally, we comment on the possible significance of our results if the nonlinearity is physically realized
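
The leading correction described above follows the standard first-order pattern ΔE ≈ ⟨ψ|V|ψ⟩ evaluated on the unperturbed eigenstate. A minimal numerical sketch of that computation, using a quartic perturbation of the harmonic oscillator as a stand-in (the paper's information-theoretic nonlinearity is not reproduced here):

```python
import numpy as np

# First-order perturbative energy shift dE = <psi0|V|psi0> for the 1D
# harmonic-oscillator ground state (hbar = m = omega = 1). The quartic
# perturbation V = lam * x^4 is an illustrative stand-in, not the
# information-theoretic nonlinearity of the paper.
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
psi0 = np.pi ** -0.25 * np.exp(-x ** 2 / 2)   # normalized ground state
lam = 0.01
V = lam * x ** 4
dE = np.sum(psi0 ** 2 * V) * dx               # <psi0|V|psi0>
print(round(dE, 6))  # analytic value: lam * <x^4> = 3*lam/4 = 0.0075
```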

  18. About possibilities using of theoretical calculation methods in radioecology

    International Nuclear Information System (INIS)

    Demoukhamedova, S.D.; Aliev, D.I.; Alieva, I.N.

    2002-01-01

retaining the biological activity and changing the herbicide properties and selectivity is determined by the charge distribution in the naphthalene ring. The change in charge distribution in the naphthalene ring was induced by the effect of ionizing radiation. Thus, theoretical calculation methods are capable of providing more detailed information concerning the effect of radiation on ecosystems

  19. A comparison of SAR ATR performance with information theoretic predictions

    Science.gov (United States)

    Blacknell, David

    2003-09-01

    Performance assessment of automatic target detection and recognition algorithms for SAR systems (or indeed any other sensors) is essential if the military utility of the system / algorithm mix is to be quantified. This is a relatively straightforward task if extensive trials data from an existing system is used. However, a crucial requirement is to assess the potential performance of novel systems as a guide to procurement decisions. This task is no longer straightforward since a hypothetical system cannot provide experimental trials data. QinetiQ has previously developed a theoretical technique for classification algorithm performance assessment based on information theory. The purpose of the study presented here has been to validate this approach. To this end, experimental SAR imagery of targets has been collected using the QinetiQ Enhanced Surveillance Radar to allow algorithm performance assessments as a number of parameters are varied. In particular, performance comparisons can be made for (i) resolutions up to 0.1m, (ii) single channel versus polarimetric (iii) targets in the open versus targets in scrubland and (iv) use versus non-use of camouflage. The change in performance as these parameters are varied has been quantified from the experimental imagery whilst the information theoretic approach has been used to predict the expected variation of performance with parameter value. A comparison of these measured and predicted assessments has revealed the strengths and weaknesses of the theoretical technique as will be discussed in the paper.
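
Information-theoretic performance prediction of this kind typically works through quantities such as the mutual information between the target class and the sensed data, which bounds achievable classification accuracy. A minimal sketch, using a hypothetical three-class confusion matrix rather than any QinetiQ trials data:

```python
import numpy as np

# Mutual information between true class and classifier decision,
# estimated from a hypothetical 3-class confusion matrix -- the kind of
# quantity an information-theoretic performance predictor bounds.
C = np.array([[45, 3, 2],
              [4, 40, 6],
              [1, 5, 44]], dtype=float)   # rows: truth, cols: decision
P = C / C.sum()                           # empirical joint distribution
px = P.sum(axis=1, keepdims=True)         # class priors
py = P.sum(axis=0, keepdims=True)         # decision marginals
nz = P > 0
I = (P[nz] * np.log2(P[nz] / (px @ py)[nz])).sum()
accuracy = np.trace(C) / C.sum()
print(f"I = {I:.3f} bits, accuracy = {accuracy:.3f}")
```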

  20. Set-theoretic methods in control

    CERN Document Server

    Blanchini, Franco

    2015-01-01

    The second edition of this monograph describes the set-theoretic approach for the control and analysis of dynamic systems, both from a theoretical and practical standpoint.  This approach is linked to fundamental control problems, such as Lyapunov stability analysis and stabilization, optimal control, control under constraints, persistent disturbance rejection, and uncertain systems analysis and synthesis.  Completely self-contained, this book provides a solid foundation of mathematical techniques and applications, extensive references to the relevant literature, and numerous avenues for further theoretical study. All the material from the first edition has been updated to reflect the most recent developments in the field, and a new chapter on switching systems has been added.  Each chapter contains examples, case studies, and exercises to allow for a better understanding of theoretical concepts by practical application. The mathematical language is kept to the minimum level necessary for the adequate for...

  1. The informal recycling in the international and local context: theoretical Elements

    International Nuclear Information System (INIS)

    Yepes P, Dora Luz

    2002-01-01

This article synthesizes the theoretical aspects of the urban problem of informal recycling in our setting. It is framed within the research project "Alternatives for the strengthening of informal recycling in Medellin", a thesis that seeks to strengthen informal recycling through the study of the factors associated with the labor productivity of informal recyclers. Specifically, the study will identify options for improving their work and aims to propose alternatives to dignify the labor of these people in an integral way, in light of environmental, technical, normative, institutional, social and sustainability precepts. This document describes the theoretical elements on which the research is based, presenting informal recycling in an international context as well as its situation in the national and local environment. From the literature review carried out, it can be said that there is little interest at the international level in improving the working conditions of informal recyclers, apart from the strategies outlined by the International Labour Organization for strengthening the informal economy. In Latin America, it has not been possible to go beyond official rhetoric and the promotion of environmentalist groups, although on policies for the recovery, reuse and recycling of solid waste there has been sustained progress. At the national level, clear strategies to improve the informal work of recyclers are being identified; however, much effort is still needed to carry out the actions committed to under these strategies, even though recycling organizations have gradually been created.

  2. Quantum information theoretical analysis of various constructions for quantum secret sharing

    NARCIS (Netherlands)

    Rietjens, K.P.T.; Schoenmakers, B.; Tuyls, P.T.

    2005-01-01

    Recently, an information theoretical model for quantum secret sharing (QSS) schemes was introduced. By using this model, we prove that pure state quantum threshold schemes (QTS) can be constructed from quantum MDS codes and vice versa. In particular, we consider stabilizer codes and give a

  3. Numerical Methods Application for Reinforced Concrete Elements-Theoretical Approach for Direct Stiffness Matrix Method

    Directory of Open Access Journals (Sweden)

    Sergiu Ciprian Catinas

    2015-07-01

Full Text Available A detailed theoretical and practical investigation of reinforced concrete elements is warranted by the recent techniques and methods implemented in the construction market. Moreover, a theoretical study is in demand for a better and faster approach, owing to the rapid development of computational techniques. This paper presents a study for implementing the direct stiffness matrix method in a static calculation, capable of addressing phenomena related to different stages of loading, rapid changes of cross-section area, and varying physical properties. The method is in demand because nowadays the finite element method (FEM) is the only alternative for such a calculation, and FEM is considered expensive in terms of time and computational resources. The main goal of such a method is to create the moment-curvature diagram for the cross section being analyzed. This paper presents some of the most important techniques, as well as new ideas, for creating the moment-curvature diagram in the cross sections considered.
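
The core of the direct stiffness matrix method is assembling element stiffness matrices into a global system and solving it under the boundary conditions. A minimal sketch for two axial bar elements in series (illustrative material values, not the paper's reinforced-concrete model):

```python
import numpy as np

# Direct stiffness assembly for two axial bar elements in series:
# element stiffness k = EA/L; nodes 0-1-2, node 0 fixed, load at node 2.
E, A = 30e9, 0.01                        # Pa, m^2 (illustrative values)
elements = [(0, 1, 2.0), (1, 2, 2.0)]    # (node_i, node_j, length in m)
K = np.zeros((3, 3))
for i, j, L in elements:
    k = E * A / L
    K[np.ix_([i, j], [i, j])] += k * np.array([[1, -1], [-1, 1]])
F = np.array([0.0, 0.0, 1e5])            # 100 kN axial load at free end
# Apply the boundary condition u0 = 0 by reducing the system.
u = np.zeros(3)
u[1:] = np.linalg.solve(K[1:, 1:], F[1:])
print(u)  # displacements grow linearly along the uniform bar
```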

  4. Theoretical frameworks informing family-based child and adolescent obesity interventions

    DEFF Research Database (Denmark)

    Alulis, Sarah; Grabowski, Dan

    2017-01-01

into focus. However, the use of theoretical frameworks to strengthen these interventions is rare and very uneven. OBJECTIVE AND METHOD: To conduct a qualitative meta-synthesis of family-based interventions for child and adolescent obesity to identify the theoretical frameworks applied, thus understanding how...... inconsistencies and a significant void between research results and health care practice. Based on the analysis, this article proposes three themes to be used as focus points when designing future interventions and when selecting theories for the development of solid, theory-based frameworks for application...... cognitive, self-efficacy and Family Systems Theory appeared most frequently. The remaining 24 were classified as theory-related, as theoretical elements of self-monitoring, stimulus control, reinforcement and modelling were used. CONCLUSION: The designs of family-based interventions reveal numerous

  5. Methods for evaluating information sources

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2012-01-01

    The article briefly presents and discusses 12 different approaches to the evaluation of information sources (for example a Wikipedia entry or a journal article): (1) the checklist approach; (2) classical peer review; (3) modified peer review; (4) evaluation based on examining the coverage...... of controversial views; (5) evidence-based evaluation; (6) comparative studies; (7) author credentials; (8) publisher reputation; (9) journal impact factor; (10) sponsoring: tracing the influence of economic, political, and ideological interests; (11) book reviews and book reviewing; and (12) broader criteria....... Reading a text is often not a simple process. All the methods discussed here are steps on the way on learning how to read, understand, and criticize texts. According to hermeneutics it involves the subjectivity of the reader, and that subjectivity is influenced, more or less, by different theoretical...

  6. Physics Without Physics. The Power of Information-theoretical Principles

    Science.gov (United States)

    D'Ariano, Giacomo Mauro

    2017-01-01

    David Finkelstein was very fond of the new information-theoretic paradigm of physics advocated by John Archibald Wheeler and Richard Feynman. Only recently, however, the paradigm has concretely shown its full power, with the derivation of quantum theory (Chiribella et al., Phys. Rev. A 84:012311, 2011; D'Ariano et al., 2017) and of free quantum field theory (D'Ariano and Perinotti, Phys. Rev. A 90:062106, 2014; Bisio et al., Phys. Rev. A 88:032301, 2013; Bisio et al., Ann. Phys. 354:244, 2015; Bisio et al., Ann. Phys. 368:177, 2016) from informational principles. The paradigm has opened for the first time the possibility of avoiding physical primitives in the axioms of the physical theory, allowing a re-foundation of the whole physics over logically solid grounds. In addition to such methodological value, the new information-theoretic derivation of quantum field theory is particularly interesting for establishing a theoretical framework for quantum gravity, with the idea of obtaining gravity itself as emergent from the quantum information processing, as also suggested by the role played by information in the holographic principle (Susskind, J. Math. Phys. 36:6377, 1995; Bousso, Rev. Mod. Phys. 74:825, 2002). In this paper I review how free quantum field theory is derived without using mechanical primitives, including space-time, special relativity, Hamiltonians, and quantization rules. The theory is simply provided by the simplest quantum algorithm encompassing a countable set of quantum systems whose network of interactions satisfies the three following simple principles: homogeneity, locality, and isotropy. The inherent discrete nature of the informational derivation leads to an extension of quantum field theory in terms of a quantum cellular automata and quantum walks. 
A simple heuristic argument sets the scale to the Planck one, and the currently observed regime where discreteness is not visible is the so-called "relativistic regime" of small wavevectors, which
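
The quantum-walk dynamics mentioned above can be illustrated with a one-dimensional discrete-time walk; the Hadamard coin used below is a standard textbook choice, not the specific Weyl automaton derived in the cited papers:

```python
import numpy as np

# One-dimensional discrete-time quantum walk: a two-component (coin)
# amplitude on a lattice, evolved by a coin unitary followed by a
# coin-conditioned shift. Illustrative Hadamard coin, ballistic spread.
steps, N = 50, 121                   # odd lattice; walker starts at center
psi = np.zeros((N, 2), dtype=complex)
psi[N // 2, 0] = 1.0                 # initial coin state |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
for _ in range(steps):
    psi = psi @ H.T                  # coin toss at every site
    up = np.roll(psi[:, 0], 1)       # coin 0 shifts right
    down = np.roll(psi[:, 1], -1)    # coin 1 shifts left
    psi = np.stack([up, down], axis=1)
prob = (np.abs(psi) ** 2).sum(axis=1)
print(round(prob.sum(), 6))          # unitary evolution conserves probability
```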

  7. Characterising Information Systems in Australia: A Theoretical Framework

    Directory of Open Access Journals (Sweden)

    Gail Ridley

    2006-11-01

    Full Text Available The study reported in this volume aims to investigate the state of the Information Systems academic discipline in Australia from a historical and current perspective, collecting evidence across a range of dimensions. To maximise the strategic potential of the study, the results need to be capable of integration, so that the relationships within and across the dimensions and geographical units are understood. A meaningful theoretical framework will help relate the results of the different dimensions of the study to characterise the discipline in the region, and assist in empowering the Australian IS research community. This paper reviewed literature on the development of disciplines, before deriving a theoretical framework for the broader study reported in this volume. The framework considered the current and past state of IS in Australian universities from the perspective of the development of a discipline. The components of the framework were derived and validated through a thematic analysis of both the IS and non-IS literature. This paper also presents brief vignettes of the development of two other related disciplines. The framework developed in this paper, which has been partly guided by Whitley’s Theory of Scientific Change, has been used to analyse data collated from the Australian states and the Australian Capital Territory. The degree of variation in Australian IS as an indication of its “professionalisation”, the nature of its body of knowledge and its mechanisms of control, will be used to frame the analysis. Research reported in several of the papers that follow in this volume has drawn upon the theoretical framework presented below.

  8. Several foundational and information theoretic implications of Bell’s theorem

    Science.gov (United States)

    Kar, Guruprasad; Banik, Manik

    2016-08-01

In 1935, Albert Einstein and two colleagues, Boris Podolsky and Nathan Rosen (EPR), developed a thought experiment to demonstrate what they felt was a lack of completeness in quantum mechanics (QM). EPR also postulated the existence of a more fundamental theory in which the physical reality of any system would be completely described by the variables/states of that fundamental theory. Such a variable is commonly called a hidden variable, and the theory is called a hidden variable theory (HVT). In 1964, John Bell proposed an empirically verifiable criterion to test for the existence of these HVTs. He derived an inequality which must be satisfied by any theory that fulfills the conditions of locality and reality. He also showed that QM, as it violates this inequality, is incompatible with any local-realistic theory. It was later shown that Bell's inequality (BI) can be derived from different sets of assumptions, and it also finds applications in useful information-theoretic protocols. In this review, we discuss various foundational as well as information-theoretic implications of BI. We also discuss certain restricted features of quantum nonlocality and elaborate on the roles of the uncertainty principle and the complementarity principle in explaining them.
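
The violation Bell identified can be made concrete with the CHSH form of the inequality: for the singlet state the correlation is E(a, b) = -cos(a - b), and the standard measurement angles give |S| = 2√2, exceeding the local-realistic bound of 2:

```python
import numpy as np

# CHSH value for the singlet state, E(a, b) = -cos(a - b).
# Local-realistic theories obey |S| <= 2; quantum mechanics reaches
# 2*sqrt(2) (the Tsirelson bound) at the standard optimal angles.
def E(a, b):
    return -np.cos(a - b)

a, ap = 0.0, np.pi / 2
b, bp = np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(round(abs(S), 3))  # ~2.828 > 2
```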

  9. An Information Theoretic Analysis of Classification Sorting and Cognition by Ninth Grade Children within a Piagetian Setting.

    Science.gov (United States)

    Dunlop, David Livingston

    The purpose of this study was to use an information theoretic memory model to quantitatively investigate classification sorting and recall behaviors of various groups of students. The model provided theorems for the determination of information theoretic measures from which inferences concerning mental processing were made. The basic procedure…

  10. Information-Theoretic Approaches for Evaluating Complex Adaptive Social Simulation Systems

    Energy Technology Data Exchange (ETDEWEB)

    Omitaomu, Olufemi A [ORNL; Ganguly, Auroop R [ORNL; Jiao, Yu [ORNL

    2009-01-01

In this paper, we propose information-theoretic approaches for comparing and evaluating complex agent-based models. In information-theoretic terms, entropy and mutual information are two measures of system complexity. We used entropy as a measure of the regularity of the number of agents in a social class, and mutual information as a measure of the information shared by two social classes. Using our approaches, we compared two analogous agent-based (AB) models developed for a regional-scale social-simulation system. The first AB model, called ABM-1, is a complex AB model built with 10,000 agents in a desktop environment that used aggregate data; the second AB model, ABM-2, was built with 31 million agents on a high-performance computing framework located at Oak Ridge National Laboratory, using fine-resolution data from the LandScan Global Population Database. The initializations were slightly different, with ABM-1 using samples from a probability distribution and ABM-2 using polling data from Gallup for a deterministic initialization. The geographical and temporal domain was present-day Afghanistan, and the end result was the number of agents with one of three behavioral modes (pro-insurgent, neutral, and pro-government) corresponding to the population mindshare. The theories embedded in each model were identical, and the test simulations focused on a test of three leadership theories - legitimacy, coercion, and representative - and two social mobilization theories - social influence and repression. The theories are tied together using the Cobb-Douglas utility function. Based on our results, the hypothesis that performance measures can be developed to compare and contrast AB models appears to be supported. Furthermore, we observed significant bias in the two models. Even so, further tests and investigations are required, not only with a wider class of theories and AB models, but also with additional observed or simulated data and more comprehensive performance measures.
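
The two measures used in the paper are straightforward to compute from a joint distribution of agent states. A sketch with an illustrative count table (hypothetical numbers, not the Afghanistan simulation output):

```python
import numpy as np

# Entropy of behavioral-mode distributions and mutual information between
# two social classes, from an illustrative joint count table: rows are the
# modes of class A, columns the modes of class B (pro-insurgent, neutral,
# pro-government).
counts = np.array([[30, 15, 5],
                   [10, 40, 10],
                   [5, 15, 30]], dtype=float)
P = counts / counts.sum()                 # joint distribution
pa, pb = P.sum(axis=1), P.sum(axis=0)     # marginals per class
H = lambda p: -(p[p > 0] * np.log2(p[p > 0])).sum()
MI = H(pa) + H(pb) - H(P.ravel())         # I(A;B) in bits
print(f"H(A) = {H(pa):.3f} bits, I(A;B) = {MI:.3f} bits")
```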

  11. A short course in quantum information theory. An approach from theoretical physics

    International Nuclear Information System (INIS)

    Diosi, L.

    2007-01-01

    This short and concise primer takes the vantage point of theoretical physics and the unity of physics. It sets out to strip the burgeoning field of quantum information science to its basics by linking it to universal concepts in physics. An extensive lecture rather than a comprehensive textbook, this volume is based on courses delivered over several years to advanced undergraduate and beginning graduate students, but essentially it addresses anyone with a working knowledge of basic quantum physics. Readers will find these lectures a most adequate entry point for theoretical studies in this field. (orig.)

  12. Research on image complexity evaluation method based on color information

    Science.gov (United States)

    Wang, Hao; Duan, Jin; Han, Xue-hui; Xiao, Bo

    2017-11-01

In order to evaluate the complexity of a color image more effectively and to find the connection between image complexity and image information, this paper presents a method for computing image complexity based on color information. The theoretical analysis first divides complexity at the subjective level into three classes: low complexity, medium complexity and high complexity. It then carries out image feature extraction and finally establishes a function between the complexity value and the color feature model. The experimental results show that this evaluation method can objectively reconstruct the complexity of the image from the image features. The results obtained by the proposed method agree well with the complexity perceived by human vision, so the color image complexity measure has a certain reference value.
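
One simple way to turn color information into a complexity value, in the spirit of the paper (its exact feature model is not reproduced here), is the Shannon entropy of a quantized color histogram:

```python
import numpy as np

# Shannon entropy of the quantized color distribution as a simple
# complexity score: a flat image scores 0 bits, a random-color image
# scores close to log2(bins^3) bits.
def color_entropy(img, bins=8):
    """img: H x W x 3 uint8 array; quantize each channel to `bins` levels."""
    q = img.astype(np.uint32) * bins // 256
    codes = q[..., 0] * bins * bins + q[..., 1] * bins + q[..., 2]
    p = np.bincount(codes.ravel(), minlength=bins ** 3) / codes.size
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

rng = np.random.default_rng(0)
flat = np.full((64, 64, 3), 128, dtype=np.uint8)           # one color
noisy = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)  # random colors
print(color_entropy(flat), color_entropy(noisy))  # low vs high complexity
```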

  13. Theoretical framework for government information service delivery to deep rural communities in South Africa

    CSIR Research Space (South Africa)

    Mvelase, PS

    2009-10-01

    Full Text Available This paper reports on a study to determine the information requirements of communities in deep rural areas on government services and how this information can be made available to them. The study then proposes an e-government theoretical framework...

  14. LPI Optimization Framework for Target Tracking in Radar Network Architectures Using Information-Theoretic Criteria

    Directory of Open Access Journals (Sweden)

    Chenguang Shi

    2014-01-01

Full Text Available Widely distributed radar network architectures can provide significant performance improvements for target detection and localization. For a fixed radar network, the achievable target detection performance may go beyond a predetermined threshold with full transmitted power allocation, which is extremely vulnerable in modern electronic warfare. In this paper, we study the problem of low probability of intercept (LPI) design for a radar network and propose two novel LPI optimization schemes based on information-theoretic criteria. For a predefined threshold of target detection, the Schleher intercept factor is minimized by optimizing transmission power allocation among the netted radars in the network. Owing to the lack of an analytical closed-form expression for the receiver operating characteristic (ROC), we employ two information-theoretic criteria, namely the Bhattacharyya distance and the J-divergence, as metrics for target detection performance. The resulting nonconvex and nonlinear LPI optimization problems associated with the different information-theoretic criteria are cast under a unified framework, and a nonlinear-programming-based genetic algorithm (NPGA) is used to tackle the optimization problems in the framework. Numerical simulations demonstrate that our proposed LPI strategies are effective in enhancing the LPI performance of a radar network.
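
The Bhattacharyya distance used as a detection metric has a closed form for Gaussian hypotheses. A sketch for the univariate case (illustrative parameters, not the radar network model):

```python
import numpy as np

# Bhattacharyya distance between two univariate Gaussian hypotheses
# (e.g. target absent vs present) -- a detection-performance metric used
# in place of a closed-form ROC. For N(mu1, s1^2) and N(mu2, s2^2):
#   D_B = (mu1-mu2)^2 / (4*(s1^2+s2^2)) + 0.5*ln((s1^2+s2^2)/(2*s1*s2))
def bhattacharyya_gauss(mu1, s1, mu2, s2):
    return (mu1 - mu2) ** 2 / (4 * (s1 ** 2 + s2 ** 2)) \
        + 0.5 * np.log((s1 ** 2 + s2 ** 2) / (2 * s1 * s2))

# Distance grows with the separation of the hypotheses (i.e. with SNR).
for sep in (0.5, 1.0, 2.0):
    print(sep, round(bhattacharyya_gauss(0.0, 1.0, sep, 1.0), 4))
```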

  15. Open source tools for the information theoretic analysis of neural data

    Directory of Open Access Journals (Sweden)

    Robin A. A Ince

    2010-05-01

    Full Text Available The recent and rapid development of open-source software tools for the analysis of neurophysiological datasets consisting of multiple simultaneous recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and integrate the information obtained at different spatial and temporal scales. In this Review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in Matlab and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.

  16. Open source tools for the information theoretic analysis of neural data.

    Science.gov (United States)

    Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
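
The basic quantity such toolboxes estimate is the mutual information between stimulus and response. A minimal plug-in estimate on synthetic spike counts (real toolboxes add bias corrections that this sketch omits):

```python
import numpy as np

# Plug-in estimate of the mutual information between a binary stimulus
# label and a binned spike-count response, on synthetic Poisson data.
rng = np.random.default_rng(1)
n_trials = 5000
stim = rng.integers(0, 2, n_trials)              # two stimuli, equiprobable
rate = np.where(stim == 0, 2.0, 6.0)             # stimulus-dependent rate
resp = np.minimum(rng.poisson(rate), 9)          # spike counts, clipped at 9
joint = np.zeros((2, 10))
np.add.at(joint, (stim, resp), 1)                # joint histogram
P = joint / joint.sum()
ps = P.sum(axis=1, keepdims=True)                # stimulus marginal
pr = P.sum(axis=0, keepdims=True)                # response marginal
nz = P > 0
I = (P[nz] * np.log2(P[nz] / (ps @ pr)[nz])).sum()
print(f"I(stimulus; response) = {I:.3f} bits")
```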

  17. Optimal information transfer in enzymatic networks: A field theoretic formulation

    Science.gov (United States)

    Samanta, Himadri S.; Hinczewski, Michael; Thirumalai, D.

    2017-07-01

    Signaling in enzymatic networks is typically triggered by environmental fluctuations, resulting in a series of stochastic chemical reactions, leading to corruption of the signal by noise. For example, information flow is initiated by binding of extracellular ligands to receptors, which is transmitted through a cascade involving kinase-phosphatase stochastic chemical reactions. For a class of such networks, we develop a general field-theoretic approach to calculate the error in signal transmission as a function of an appropriate control variable. Application of the theory to a simple push-pull network, a module in the kinase-phosphatase cascade, recovers the exact results for error in signal transmission previously obtained using umbral calculus [Hinczewski and Thirumalai, Phys. Rev. X 4, 041017 (2014), 10.1103/PhysRevX.4.041017]. We illustrate the generality of the theory by studying the minimal errors in noise reduction in a reaction cascade with two connected push-pull modules. Such a cascade behaves as an effective three-species network with a pseudointermediate. In this case, optimal information transfer, resulting in the smallest square of the error between the input and output, occurs with a time delay, which is given by the inverse of the decay rate of the pseudointermediate. Surprisingly, in these examples the minimum error computed using simulations that take nonlinearities and discrete nature of molecules into account coincides with the predictions of a linear theory. In contrast, there are substantial deviations between simulations and predictions of the linear theory in error in signal propagation in an enzymatic push-pull network for a certain range of parameters. Inclusion of second-order perturbative corrections shows that differences between simulations and theoretical predictions are minimized. Our study establishes that a field theoretic formulation of stochastic biological signaling offers a systematic way to understand error propagation in

  18. Theoretically informed Monte Carlo simulation of liquid crystals by sampling of alignment-tensor fields

    Energy Technology Data Exchange (ETDEWEB)

    Armas-Pérez, Julio C.; Londono-Hurtado, Alejandro [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637 (United States); Guzmán, Orlando [Departamento de Física, Universidad Autónoma Metropolitana, Iztapalapa, DF 09340, México (Mexico); Hernández-Ortiz, Juan P. [Departamento de Materiales y Minerales, Universidad Nacional de Colombia, Sede Medellín, Medellín (Colombia); Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637 (United States); Pablo, Juan J. de, E-mail: depablo@uchicago.edu [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637 (United States); Materials Science Division, Argonne National Laboratory, Argonne, Illinois 60439 (United States)

    2015-07-28

    A theoretically informed coarse-grained Monte Carlo method is proposed for studying liquid crystals. The free energy functional of the system is described in the framework of the Landau-de Gennes formalism. The alignment field and its gradients are approximated by finite differences, and the free energy is minimized through a stochastic sampling technique. The validity of the proposed method is established by comparing the results of the proposed approach to those of traditional free energy minimization techniques. Its usefulness is illustrated in the context of three systems, namely, a nematic liquid crystal confined in a slit channel, a nematic liquid crystal droplet, and a chiral liquid crystal in the bulk. It is found that for systems that exhibit multiple metastable morphologies, the proposed Monte Carlo method is generally able to identify lower free energy states that are often missed by traditional approaches. Importantly, the Monte Carlo method identifies such states from random initial configurations, thereby obviating the need for educated initial guesses that can be difficult to formulate.

  19. Theoretically informed Monte Carlo simulation of liquid crystals by sampling of alignment-tensor fields.

    Energy Technology Data Exchange (ETDEWEB)

    Armas-Perez, Julio C.; Londono-Hurtado, Alejandro; Guzman, Orlando; Hernandez-Ortiz, Juan P.; de Pablo, Juan J.

    2015-07-27

    A theoretically informed coarse-grained Monte Carlo method is proposed for studying liquid crystals. The free energy functional of the system is described in the framework of the Landau-de Gennes formalism. The alignment field and its gradients are approximated by finite differences, and the free energy is minimized through a stochastic sampling technique. The validity of the proposed method is established by comparing the results of the proposed approach to those of traditional free energy minimization techniques. Its usefulness is illustrated in the context of three systems, namely, a nematic liquid crystal confined in a slit channel, a nematic liquid crystal droplet, and a chiral liquid crystal in the bulk. It is found that for systems that exhibit multiple metastable morphologies, the proposed Monte Carlo method is generally able to identify lower free energy states that are often missed by traditional approaches. Importantly, the Monte Carlo method identifies such states from random initial configurations, thereby obviating the need for educated initial guesses that can be difficult to formulate.
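
The stochastic sampling step can be sketched with a Metropolis scheme on a discretized free energy. The scalar double-well functional below is a toy stand-in for the Landau-de Gennes Q-tensor free energy (all parameters are illustrative):

```python
import numpy as np

# Metropolis sampling of a scalar order parameter phi on a periodic 1D
# grid with a discretized Landau free energy
#   F = sum_i [ a*phi_i^2 + b*phi_i^4 + kappa*(phi_{i+1}-phi_i)^2 ],
# a toy scalar stand-in for the Landau-de Gennes Q-tensor functional.
rng = np.random.default_rng(2)
a, b, kappa, beta = -1.0, 1.0, 0.5, 50.0
N = 64
phi = rng.uniform(-0.1, 0.1, N)          # random initial configuration

def free_energy(phi):
    grad = np.diff(phi, append=phi[:1])  # periodic finite difference
    return np.sum(a * phi ** 2 + b * phi ** 4 + kappa * grad ** 2)

F = free_energy(phi)
for step in range(30000):
    i = rng.integers(N)
    old = phi[i]
    phi[i] += rng.normal(0, 0.1)         # trial single-site move
    dF = free_energy(phi) - F
    if dF < 0 or rng.random() < np.exp(-beta * dF):
        F += dF                          # accept
    else:
        phi[i] = old                     # reject
print(round(F / N, 3))  # relaxes toward the bulk minimum (-0.25 per site)
```

A production code would compute dF from the local neighborhood instead of re-evaluating the full functional at every move; the global evaluation is kept here for clarity.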

  20. Information Design Theories

    Science.gov (United States)

    Pettersson, Rune

    2014-01-01

    Information design has practical and theoretical components. As an academic discipline we may view information design as a combined discipline, a practical theory, or as a theoretical practice. So far information design has incorporated facts, influences, methods, practices, principles, processes, strategies, and tools from a large number of…

  1. An information-theoretical approach to image resolution applied to neutron imaging detectors based upon individual discriminator signals

    International Nuclear Information System (INIS)

    Clergeau, Jean-Francois; Ferraton, Matthieu; Guerard, Bruno; Khaplanov, Anton; Piscitelli, Francesco; Platz, Martin; Rigal, Jean-Marie; Van Esch, Patrick; Daulle, Thibault

    2013-06-01

    1D or 2D neutron imaging detectors with individual wire or strip readout using discriminators have the advantage of being able to treat several neutron impacts partially overlapping in time, hence reducing global dead time. A single neutron impact usually gives rise to several discriminator signals. In this paper, we introduce an information-theoretical definition of image resolution. Two point-like spots of neutron impacts with a given distance between them act as a source of information (each neutron hit belongs to one spot or the other), and the detector plus signal treatment is regarded as an imperfect communication channel that transmits this information. The maximal mutual information obtained from this channel as a function of the distance between the spots allows one to define a calibration-independent measure of resolution. We then apply this measure to quantify the resolving power of different algorithms treating these individual discriminator signals, which can be implemented in firmware. The method is then applied to different detectors existing at the ILL. Center-of-gravity methods usually improve the resolution over best-wire algorithms, which are the standard way of treating these signals. (authors)
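    The resolution measure described in the abstract can be sketched in a simplified setting: assume the detector response to each spot is Gaussian with width sigma, and that the signal treatment reduces to a midpoint threshold, so the two-spot source seen through the detector becomes a binary symmetric channel. The Gaussian response and the threshold rule are assumptions of this sketch, not details taken from the paper.

```python
import math

def binary_entropy(p):
    """Entropy in bits of a Bernoulli(p) variable."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def normal_cdf(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def spot_mutual_information(d, sigma):
    """Mutual information (bits) between the spot label and a thresholded
    detector readout, for two point spots separated by d and a Gaussian
    detector response of width sigma.  With the decision threshold midway
    between the spots, the channel is binary symmetric with crossover
    probability p = P(noise > d/2), so I = 1 - h2(p)."""
    p = 1.0 - normal_cdf(d / (2.0 * sigma))
    return 1.0 - binary_entropy(p)
```

    The measure rises from 0 bits for coincident spots toward the full 1 bit of the source as the separation grows; a calibration-independent resolution can then be read off as, say, the separation at which half a bit is transmitted.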

  2. An information-theoretical approach to image resolution applied to neutron imaging detectors based upon individual discriminator signals

    Energy Technology Data Exchange (ETDEWEB)

    Clergeau, Jean-Francois; Ferraton, Matthieu; Guerard, Bruno; Khaplanov, Anton; Piscitelli, Francesco; Platz, Martin; Rigal, Jean-Marie; Van Esch, Patrick [Institut Laue Langevin, Neutron Detector Service, Grenoble (France); Daulle, Thibault [PHELMA Grenoble - INP Grenoble (France)

    2013-06-15

    1D or 2D neutron imaging detectors with individual wire or strip readout using discriminators have the advantage of being able to treat several neutron impacts partially overlapping in time, hence reducing global dead time. A single neutron impact usually gives rise to several discriminator signals. In this paper, we introduce an information-theoretical definition of image resolution. Two point-like spots of neutron impacts with a given distance between them act as a source of information (each neutron hit belongs to one spot or the other), and the detector plus signal treatment is regarded as an imperfect communication channel that transmits this information. The maximal mutual information obtained from this channel as a function of the distance between the spots allows one to define a calibration-independent measure of resolution. We then apply this measure to quantify the resolving power of different algorithms treating these individual discriminator signals, which can be implemented in firmware. The method is then applied to different detectors existing at the ILL. Center-of-gravity methods usually improve the resolution over best-wire algorithms, which are the standard way of treating these signals. (authors)

  3. Theoretical Mathematics

    Science.gov (United States)

    Stöltzner, Michael

    Responding to the double-faced influence of string theory on mathematical practice and rigour, the mathematical physicists Arthur Jaffe and Frank Quinn have contemplated the idea that there exists a `theoretical' mathematics (alongside `theoretical' physics) whose basic structures and results still require independent corroboration by mathematical proof. In this paper, I shall take the Jaffe-Quinn debate mainly as a problem of mathematical ontology and analyse it against the backdrop of two philosophical views that are appreciative towards informal mathematical development and conjectural results: Lakatos's methodology of proofs and refutations and John von Neumann's opportunistic reading of Hilbert's axiomatic method. The comparison of both approaches shows that mitigating Lakatos's falsificationism makes his insights about mathematical quasi-ontology more relevant to 20th century mathematics in which new structures are introduced by axiomatisation and not necessarily motivated by informal ancestors. The final section discusses the consequences of string theorists' claim to finality for the theory's mathematical make-up. I argue that ontological reductionism as advocated by particle physicists and the quest for mathematically deeper axioms do not necessarily lead to identical results.

  4. Phenomenological description of selected elementary chemical reaction mechanisms: An information-theoretic study

    International Nuclear Information System (INIS)

    Esquivel, R.O.; Flores-Gallegos, N.; Iuga, C.; Carrera, E.M.; Angulo, J.C.; Antolin, J.

    2010-01-01

    The information-theoretic description of the course of two elementary chemical reactions allows a phenomenological description of the chemical course of the hydrogenic abstraction and the SN2 identity reactions by use of Shannon entropic measures in position and momentum spaces. The analyses reveal their synchronous/asynchronous mechanistic behavior.

  5. Information-theoretic discrepancy based iterative reconstructions (IDIR) for polychromatic x-ray tomography

    International Nuclear Information System (INIS)

    Jang, Kwang Eun; Lee, Jongha; Sung, Younghun; Lee, SeongDeok

    2013-01-01

    Purpose: X-ray photons generated from a typical x-ray source for clinical applications exhibit a broad range of wavelengths, and the interactions between individual particles and biological substances depend on particles' energy levels. Most existing reconstruction methods for transmission tomography, however, neglect this polychromatic nature of measurements and rely on the monochromatic approximation. In this study, we developed a new family of iterative methods that incorporates the exact polychromatic model into tomographic image recovery, which improves the accuracy and quality of reconstruction. Methods: The generalized information-theoretic discrepancy (GID) was employed as a new metric for quantifying the distance between the measured and synthetic data. By using special features of the GID, the objective function for polychromatic reconstruction, which contains a double integral over the wavelength and the trajectory of incident x-rays, was simplified to a paraboloidal form without using the monochromatic approximation. More specifically, the original GID was replaced with a surrogate function with two auxiliary, energy-dependent variables. Subsequently, the alternating minimization technique was applied to solve the double minimization problem. Based on the optimization transfer principle, the objective function was further simplified to the paraboloidal equation, which leads to a closed-form update formula. Numerical experiments on the beam-hardening correction and material-selective reconstruction were conducted to compare and assess the performance of conventional methods and the proposed algorithms. Results: The authors found that the GID determines the distance between its two arguments in a flexible manner. In this study, three groups of GIDs with distinct data representations were considered. The authors demonstrated that one type of GIDs that comprises “raw” data can be viewed as an extension of existing statistical reconstructions; under a

  6. Information-theoretic model selection for optimal prediction of stochastic dynamical systems from data

    Science.gov (United States)

    Darmon, David

    2018-03-01

    In the absence of mechanistic or phenomenological models of real-world systems, data-driven models become necessary. The discovery of various embedding theorems in the 1980s and 1990s motivated a powerful set of tools for analyzing deterministic dynamical systems via delay-coordinate embeddings of observations of their component states. However, in many branches of science, the condition of operational determinism is not satisfied, and stochastic models must be brought to bear. For such stochastic models, the tool set developed for delay-coordinate embedding is no longer appropriate, and a new toolkit must be developed. We present an information-theoretic criterion, the negative log-predictive likelihood, for selecting the embedding dimension for a predictively optimal data-driven model of a stochastic dynamical system. We develop a nonparametric estimator for the negative log-predictive likelihood and compare its performance to a recently proposed criterion based on active information storage. Finally, we show how the output of the model selection procedure can be used to compare candidate predictors for a stochastic system to an information-theoretic lower bound.
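    A minimal sketch of this kind of model selection, assuming a simple histogram (plug-in) predictor with Laplace smoothing in place of the paper's nonparametric estimator: the negative log-predictive likelihood (NLPL) on held-out data is computed for several candidate embedding dimensions, and the dimension with the smallest value is selected. The order-2 binary test process below is an invented example, not data from the paper.

```python
import math
import random

def make_series(n, p_flip=0.1, seed=1):
    """Invented order-2 binary process: x_t copies x_{t-2}, flipped w.p. p_flip."""
    rng = random.Random(seed)
    x = [0, 1]
    for _ in range(n - 2):
        x.append(x[-2] ^ (rng.random() < p_flip))
    return x

def nlpl(train, test, dim):
    """Negative log-predictive likelihood (nats/symbol) of a Laplace-smoothed
    histogram predictor conditioned on the previous `dim` symbols."""
    counts = {}
    for t in range(dim, len(train)):
        past, x = tuple(train[t - dim:t]), train[t]
        c = counts.setdefault(past, [1, 1])  # Laplace prior over {0, 1}
        c[x] += 1
    total, n = 0.0, 0
    for t in range(dim, len(test)):
        past, x = tuple(test[t - dim:t]), test[t]
        c = counts.get(past, [1, 1])
        total -= math.log(c[x] / (c[0] + c[1]))
        n += 1
    return total / n
```

    For this process the even- and odd-indexed subsequences are independent, so a one-symbol past is uninformative (NLPL near ln 2 ≈ 0.693 nats) while a two-symbol past captures the dynamics (NLPL near the entropy rate ≈ 0.325 nats); minimizing NLPL over candidate dimensions therefore recovers the true order.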

  7. An information-theoretic approach to the modeling and analysis of whole-genome bisulfite sequencing data.

    Science.gov (United States)

    Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John

    2018-03-07

    DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high resolution genome-wide methylation profiles. Statistical modeling and analysis is employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied on single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrates a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data. This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of…
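    The two information-theoretic quantities named in the abstract are standard and easy to sketch: the Shannon entropy quantifies the stochasticity of a discrete methylation-level distribution, and the Jensen-Shannon distance compares a test distribution against a reference. The distributions below are placeholders, not WGBS-derived data.

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0.0)

def js_distance(p, q):
    """Jensen-Shannon distance: square root of the JS divergence (bits),
    a bounded, symmetric measure of difference between two distributions."""
    m = [(a + b) / 2.0 for a, b in zip(p, q)]
    jsd = shannon_entropy(m) - (shannon_entropy(p) + shannon_entropy(q)) / 2.0
    return math.sqrt(max(jsd, 0.0))
```

    The distance is 0 for identical methylation distributions and 1 for distributions with disjoint support, which makes it convenient for ranking candidate differentially methylated regions between a test and a reference sample.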

  8. Theoretical Coalescence: A Method to Develop Qualitative Theory: The Example of Enduring.

    Science.gov (United States)

    Morse, Janice M

    Qualitative research is frequently context bound, lacks generalizability, and is limited in scope. The purpose of this article was to describe a method, theoretical coalescence, that provides a strategy for analyzing complex, high-level concepts and for developing generalizable theory. Theoretical coalescence is a method of theoretical expansion and inductive theory development that uses data (rather than themes, categories, and published extracts of data) as the primary source for analysis. Here, using the development of the lay concept of enduring as an example, I explore the scientific development of the concept in multiple settings over many projects and link it within the Praxis Theory of Suffering. As comprehension emerges when conducting theoretical coalescence, it is essential that raw data from various different situations be available for reinterpretation/reanalysis and comparison to identify the essential features of the concept. The concept is then reconstructed, with additional inquiry that builds description and evidence, and is conceptualized to create a more expansive concept and theory. By utilizing apparently diverse data sets from different contexts that are linked by certain characteristics, the essential features of the concept emerge. Such inquiry is divergent and less bound by context, yet purposeful, logical, and with significant pragmatic implications for practice in nursing and beyond our discipline. Theoretical coalescence is a means by which qualitative inquiry is broadened to make an impact, to accommodate new theoretical shifts and concepts, and to make qualitative research applied and accessible in new ways.

  9. Strongly Correlated Systems Theoretical Methods

    CERN Document Server

    Avella, Adolfo

    2012-01-01

    The volume presents, for the very first time, an exhaustive collection of those modern theoretical methods specifically tailored for the analysis of Strongly Correlated Systems. Many novel materials, with functional properties emerging from macroscopic quantum behaviors at the frontier of modern research in physics, chemistry and materials science, belong to this class of systems. Each technique is presented in great detail by its own inventor or by one of the world-wide recognized main contributors. The exposition has a clear pedagogical cut and fully reports on the most relevant case study where the specific technique proved very successful in describing and enlightening the puzzling physics of a particular strongly correlated system. The book is intended for advanced graduate students and post-docs in the field as a textbook and/or main reference, but also for other researchers in the field who appreciate consulting a single, but comprehensive, source or wish to get acquainted, in an as painless as po...

  10. A Game-Theoretic Approach to Information-Flow Control via Protocol Composition

    Directory of Open Access Journals (Sweden)

    Mário S. Alvim

    2018-05-01

    Full Text Available In the inference attacks studied in Quantitative Information Flow (QIF), the attacker typically tries to interfere with the system in the attempt to increase its leakage of secret information. The defender, on the other hand, typically tries to decrease leakage by introducing some controlled noise. This noise introduction can be modeled as a type of protocol composition, i.e., a probabilistic choice among different protocols, and its effect on the amount of leakage depends heavily on whether or not this choice is visible to the attacker. In this work, we consider operators for modeling visible and hidden choice in protocol composition, and we study their algebraic properties. We then formalize the interplay between defender and attacker in a game-theoretic framework adapted to the specific issues of QIF, where the payoff is information leakage. We consider various kinds of leakage games, depending on whether players act simultaneously or sequentially, and on whether or not the choices of the defender are visible to the attacker. In the case of sequential games, the choice of the second player is generally a function of the choice of the first player, and his/her probabilistic choice can be either over the possible functions (mixed strategy) or over the result of the function (behavioral strategy). We show that when the attacker moves first in a sequential game with a hidden choice, then behavioral strategies are more advantageous for the defender than mixed strategies. This contrasts with standard game theory, where the two types of strategies are equivalent. Finally, we establish a hierarchy of these games in terms of their information leakage and provide methods for finding optimal strategies (at the points of equilibrium) for both attacker and defender in the various cases.
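    The effect of visible versus hidden choice on leakage can be sketched with posterior Bayes vulnerability (the attacker's one-try guessing probability), a standard QIF measure. The two example channels below, an identity channel and its column swap, are illustrative assumptions, not taken from the paper.

```python
def bayes_vulnerability(prior, channel):
    """Posterior Bayes vulnerability: the attacker's expected probability of
    guessing the secret in one try after observing the output.
    channel[x][y] is P(y | x)."""
    n_out = len(channel[0])
    return sum(max(prior[x] * channel[x][y] for x in range(len(prior)))
               for y in range(n_out))

def hidden_choice(c1, c2, r):
    """Hidden probabilistic choice: the defender runs c1 w.p. r and c2
    otherwise; the attacker sees only the output, so the channels blend."""
    return [[r * a + (1.0 - r) * b for a, b in zip(row1, row2)]
            for row1, row2 in zip(c1, c2)]

def visible_choice_vulnerability(prior, c1, c2, r):
    """Visible choice: the attacker also learns which channel ran,
    so the vulnerabilities simply average."""
    return (r * bayes_vulnerability(prior, c1)
            + (1.0 - r) * bayes_vulnerability(prior, c2))
```

    With a uniform prior on one secret bit, a fair hidden choice between the identity channel and its column swap blends them into a channel that leaks nothing (vulnerability 0.5, the prior guess), while the same choice made visibly leaks the whole secret (vulnerability 1.0), illustrating why the visibility of the defender's choice matters.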

  11. Information Retrieval Methods in Libraries and Information Centers ...

    African Journals Online (AJOL)

    The volumes of information created, generated and stored are so immense that, without adequate knowledge of information retrieval methods, the retrieval process for an information user would be cumbersome and frustrating. Studies have further revealed that information retrieval methods are essential in information centers ...

  12. Information-theoretic limitations on approximate quantum cloning and broadcasting

    Science.gov (United States)

    Lemm, Marius; Wilde, Mark M.

    2017-07-01

    We prove quantitative limitations on any approximate simultaneous cloning or broadcasting of mixed states. The results are based on information-theoretic (entropic) considerations and generalize the well-known no-cloning and no-broadcasting theorems. We also observe and exploit the fact that the universal cloning machine on the symmetric subspace of n qudits and symmetrized partial trace channels are dual to each other. This duality manifests itself both in the algebraic sense of adjointness of quantum channels and in the operational sense that a universal cloning machine can be used as an approximate recovery channel for a symmetrized partial trace channel and vice versa. The duality extends to give control of the performance of generalized universal quantum cloning machines (UQCMs) on subspaces more general than the symmetric subspace. This gives a way to quantify the usefulness of a priori information in the context of cloning. For example, we can control the performance of an antisymmetric analog of the UQCM in recovering from the loss of n - k fermionic particles.

  13. Theoretical methods and models for mechanical properties of soft biomaterials

    Directory of Open Access Journals (Sweden)

    Zhonggang Feng

    2017-06-01

    Full Text Available We review the most commonly used theoretical methods and models for the mechanical properties of soft biomaterials, which include phenomenological hyperelastic and viscoelastic models, structural biphasic and network models, and the structural alteration theory. We emphasize basic concepts and recent developments. In consideration of the current progress and needs of mechanobiology, we introduce methods and models for tackling micromechanical problems and their applications to cell biology. Finally, the challenges and perspectives in this field are discussed.

  14. Data, Information, Knowledge, Wisdom (DIKW): A Semiotic Theoretical and Empirical Exploration of the Hierarchy and its Quality Dimension

    Directory of Open Access Journals (Sweden)

    Sasa Baskarada

    2013-03-01

    Full Text Available What exactly is the difference between data and information? What is the difference between data quality and information quality; is there any difference between the two? And, what are knowledge and wisdom? Are there such things as knowledge quality and wisdom quality? As these primitives are the most basic axioms of information systems research, it is somewhat surprising that consensus on exact definitions seems to be lacking. This paper presents a theoretical and empirical exploration of the sometimes directly quoted, and often implied, Data, Information, Knowledge, Wisdom (DIKW) hierarchy and its quality dimension. We first review relevant literature from a range of perspectives and develop and contextualise a theoretical DIKW framework through semiotics. The literature review identifies definitional commonalities and divergences from a scholarly perspective; the theoretical discussion contextualises the terms and their relationships within a semiotic framework and proposes relevant definitions grounded in that framework. Next, rooted in Wittgenstein’s ordinary language philosophy, we analyse 20 online news articles for their uses of the terms and present the results of an online focus group discussion comprising 16 information systems experts. The empirical exploration identifies a range of definitional ambiguities from a practical perspective.

  15. Theoretical studies of chemical reaction dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Schatz, G.C. [Argonne National Laboratory, IL (United States)

    1993-12-01

    This collaborative program with the Theoretical Chemistry Group at Argonne involves theoretical studies of gas phase chemical reactions and related energy transfer and photodissociation processes. Many of the reactions studied are of direct relevance to combustion; others are selected because they provide important examples of special dynamical processes, or are of relevance to experimental measurements. Both classical trajectory and quantum reactive scattering methods are used for these studies, and the types of information determined range from thermal rate constants to state-to-state differential cross sections.

  16. State-of-the-Art: Research Theoretical Framework of Information Systems Implementation Research in the Health Sector in Sub-Saharan Africa

    DEFF Research Database (Denmark)

    Tetteh, Godwin Kofi

    2014-01-01

    This study is about the state-of-the-art of reference theories and theoretical frameworks of information systems implementation research in the health industry in Sub-Saharan countries from a process perspective. A process-variance framework (Poole et al., 2000; Markus & Robey, 1988; Shaw & Jarvenpaa, 1997) is employed to examine reference theories employed in research conducted on information systems implementation in the health sector in the Sub-Saharan region and published between 2003 and 2013. Using a number of key words and searching a number of databases (EBSCO, CSA…), … the process theoretical framework to enhance our insight into successful information systems implementation in the region. It is our hope that the process-based theoretical framework will be useful for information system practitioners, organisational managers and researchers in the health sector…

  17. Information theoretical assessment of visual communication with wavelet coding

    Science.gov (United States)

    Rahman, Zia-ur

    1995-06-01

    A visual communication channel can be characterized by the efficiency with which it conveys information, and the quality of the images restored from the transmitted data. Efficient data representation requires the use of constraints of the visual communication channel. Our information theoretic analysis combines the design of the wavelet compression algorithm with the design of the visual communication channel. Shannon's communication theory, Wiener's restoration filter, and the critical design factors of image gathering and display are combined to provide metrics for measuring the efficiency of data transmission, and for quantitatively assessing the visual quality of the restored image. These metrics are: a) the mutual information η between the radiance field and the restored image, and b) the efficiency of the channel, which can be roughly measured as the ratio η/H, where H is the average number of bits being used to transmit the data. Huck et al. (Journal of Visual Communication and Image Representation, Vol. 4, No. 2, 1993) have shown that channels designed to maximize η also maximize… Our assessment provides a framework for designing channels which provide the highest possible visual quality for a given amount of data under the critical design limitations of the image gathering and display devices. Results show that a trade-off exists between the maximum realizable information of the channel and its efficiency: an increase in one leads to a decrease in the other. The final selection of which of these quantities to maximize is, of course, application dependent.

  18. The Ulam Index: Methods of Theoretical Computer Science Help in Identifying Chemical Substances

    Science.gov (United States)

    Beltran, Adriana; Salvador, James

    1997-01-01

    In this paper, we show how methods developed for solving the theoretical computer science problem of graph isomorphism are used in structural chemistry. We also discuss potential applications of these methods to exobiology: the search for life outside Earth.

  19. SAIL: Summation-bAsed Incremental Learning for Information-Theoretic Text Clustering.

    Science.gov (United States)

    Cao, Jie; Wu, Zhiang; Wu, Junjie; Xiong, Hui

    2013-04-01

    Information-theoretic clustering aims to exploit information-theoretic measures as the clustering criteria. A common practice on this topic is the so-called Info-Kmeans, which performs K-means clustering with KL-divergence as the proximity function. While expert efforts on Info-Kmeans have shown promising results, a remaining challenge is to deal with high-dimensional sparse data such as text corpora. Indeed, it is possible that the centroids contain many zero-value features for high-dimensional text vectors, which leads to infinite KL-divergence values and creates a dilemma in assigning objects to centroids during the iteration process of Info-Kmeans. To meet this challenge, in this paper, we propose a Summation-bAsed Incremental Learning (SAIL) algorithm for Info-Kmeans clustering. Specifically, by using an equivalent objective function, SAIL replaces the computation of KL-divergence by the incremental computation of Shannon entropy. This can avoid the zero-feature dilemma caused by the use of KL-divergence. To improve the clustering quality, we further introduce the variable neighborhood search scheme and propose the V-SAIL algorithm, which is then accelerated by a multithreaded scheme in PV-SAIL. Our experimental results on various real-world text collections have shown that, with SAIL as a booster, the clustering performance of Info-Kmeans can be significantly improved. Also, V-SAIL and PV-SAIL indeed help improve the clustering quality at a lower cost of computation.
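    The zero-feature dilemma and the entropy-based workaround can be sketched as follows. KL-divergence to a centroid blows up whenever the centroid has a zero-valued feature that the document does not, whereas an equivalent summation-based Info-Kmeans objective, the size-weighted Shannon entropy of each centroid, stays finite; the identity sum_x KL(x || m_c) = |c|·H(m_c) - sum_x H(x) links the two, with the last term independent of the cluster assignment. The tiny two-feature documents below are invented for illustration.

```python
import math

def kl_divergence(p, q):
    """KL-divergence in bits; infinite if q has a zero where p does not."""
    total = 0.0
    for a, b in zip(p, q):
        if a > 0.0:
            if b == 0.0:
                return math.inf
            total += a * math.log2(a / b)
    return total

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(a * math.log2(a) for a in p if a > 0.0)

def cluster_cost(docs):
    """Equivalent Info-Kmeans cost of one cluster, |c| * H(centroid):
    finite even when the centroid has zero-valued features."""
    n = len(docs)
    centroid = [sum(col) / n for col in zip(*docs)]
    return n * shannon_entropy(centroid)
```

    Assigning a document to a foreign centroid with disjoint support yields an infinite KL value (the dilemma in the iteration process), while the summation form is always finite and can be maintained incrementally as documents join or leave a cluster.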

  20. Nonlocal correlations as an information-theoretic resource

    International Nuclear Information System (INIS)

    Barrett, Jonathan; Massar, Serge; Pironio, Stefano; Linden, Noah; Popescu, Sandu; Roberts, David

    2005-01-01

    It is well known that measurements performed on spatially separated entangled quantum systems can give rise to correlations that are nonlocal, in the sense that a Bell inequality is violated. They cannot, however, be used for superluminal signaling. It is also known that it is possible to write down sets of 'superquantum' correlations that are more nonlocal than is allowed by quantum mechanics, yet are still nonsignaling. Viewed as an information-theoretic resource, superquantum correlations are very powerful at reducing the amount of communication needed for distributed computational tasks. An intriguing question is why quantum mechanics does not allow these more powerful correlations. We aim to shed light on the range of quantum possibilities by placing them within a wider context. With this in mind, we investigate the set of correlations that are constrained only by the no-signaling principle. These correlations form a polytope, which contains the quantum correlations as a (proper) subset. We determine the vertices of the no-signaling polytope in the case that two observers each choose from two possible measurements with d outcomes. We then consider how interconversions between different sorts of correlations may be achieved. Finally, we consider some multipartite examples.
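    For two binary measurements with binary outcomes, the nonlocal vertices of the no-signaling polytope are (up to relabeling) Popescu-Rohrlich (PR) boxes, whose properties can be checked directly. The sketch below verifies that a PR box is nonsignaling yet reaches the algebraic maximum of 4 in the CHSH expression, beyond the quantum bound of 2√2.

```python
def pr_box(a, b, x, y):
    """PR-box distribution P(a, b | x, y): probability 1/2 when the outputs
    satisfy a XOR b = x AND y, and 0 otherwise."""
    return 0.5 if (a ^ b) == (x & y) else 0.0

def correlator(p, x, y):
    """E(x, y) = sum_{a,b} (-1)^(a XOR b) * P(a, b | x, y)."""
    return sum((-1) ** (a ^ b) * p(a, b, x, y) for a in (0, 1) for b in (0, 1))

def chsh(p):
    """CHSH value E(0,0) + E(0,1) + E(1,0) - E(1,1)."""
    return (correlator(p, 0, 0) + correlator(p, 0, 1)
            + correlator(p, 1, 0) - correlator(p, 1, 1))

def is_nonsignaling(p):
    """Each party's marginal must be independent of the other's setting."""
    for x in (0, 1):
        for a in (0, 1):
            marg = [sum(p(a, b, x, y) for b in (0, 1)) for y in (0, 1)]
            if abs(marg[0] - marg[1]) > 1e-12:
                return False
    for y in (0, 1):
        for b in (0, 1):
            marg = [sum(p(a, b, x, y) for a in (0, 1)) for x in (0, 1)]
            if abs(marg[0] - marg[1]) > 1e-12:
                return False
    return True
```

    Local hidden-variable models obey CHSH ≤ 2 and quantum mechanics CHSH ≤ 2√2 ≈ 2.83, so the PR box's value of 4 shows concretely that the no-signaling set strictly contains the quantum set.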

  1. An Everyday and Theoretical Reading of "Perezhivanie" for Informing Research in Early Childhood Education

    Science.gov (United States)

    Fleer, Marilyn

    2016-01-01

    The concept of "perezhivanie" has received increasing attention in recent years. However, a clear understanding of this term has not yet been established. Mostly what is highlighted is the need for more informed theoretical discussion. In this paper, discussions centre on what "perezhivanie" means for research in early…

  2. A short course in quantum information theory. An approach from theoretical physics. 2. ed.

    International Nuclear Information System (INIS)

    Diosi, Lajos

    2011-01-01

    This short and concise primer takes the vantage point of theoretical physics and the unity of physics. It sets out to strip the burgeoning field of quantum information science to its basics by linking it to universal concepts in physics. An extensive lecture rather than a comprehensive textbook, this volume is based on courses delivered over several years to advanced undergraduate and beginning graduate students, but essentially it addresses anyone with a working knowledge of basic quantum physics. Readers will find these lectures a most adequate entry point for theoretical studies in this field. For the second edition, the author has succeeded in adding many new topics while sticking to the conciseness of the overall approach. A new chapter on qubit thermodynamics has been added, while new sections and subsections have been incorporated in various chapters to deal with weak and time-continuous measurements, period-finding quantum algorithms and quantum error corrections. From the reviews of the first edition: "The best things about this book are its brevity and clarity. In around 100 pages it provides a tutorial introduction to quantum information theory, including problems and solutions... it's worth a look if you want to quickly get up to speed with the language and central concepts of quantum information theory, including the background classical information theory." (Craig Savage, Australian Physics, Vol. 44 (2), 2007). (orig.)

  3. Synergy between experimental and theoretical methods in the exploration of homogeneous transition metal catalysis

    DEFF Research Database (Denmark)

    Lupp, Daniel; Christensen, Niels Johan; Fristrup, Peter

    2014-01-01

    In this Perspective, we will focus on the use of both experimental and theoretical methods in the exploration of reaction mechanisms in homogeneous transition metal catalysis. We briefly introduce the use of Hammett studies and kinetic isotope effects (KIE). Both of these techniques can be complemented by computational chemistry, in particular in cases where interpretation of the experimental results is not straightforward. The good correspondence between experiment and theory is only possible due to recent advances within the applied theoretical framework. We therefore also highlight…

  4. Game-theoretic methods for functional response and optimal foraging behavior

    Czech Academy of Sciences Publication Activity Database

    Cressman, R.; Křivan, Vlastimil; Brown, J. S.; Garay, J.

    2014-01-01

    Roč. 9, č. 2 (2014), e88773 E-ISSN 1932-6203 Grant - others:Hungarian National Research Fund(HU) K62000; Hungarian National Research Fund(HU) K67961 Institutional support: RVO:60077344 Keywords : game-theoretic methods Subject RIV: EH - Ecology, Behaviour Impact factor: 3.234, year: 2014 http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0088773

  5. Using a Theoretical Framework to Investigate Whether the HIV/AIDS Information Needs of the AfroAIDSinfo Web Portal Members Are Met: A South African eHealth Study

    Directory of Open Access Journals (Sweden)

    Hendra Van Zyl

    2014-03-01

    eHealth has been identified as a useful approach to disseminating HIV/AIDS information. Together with Consumer Health Informatics (CHI), the Web-to-Public Knowledge Transfer Model (WPKTM) has been applied as a theoretical framework to identify consumer needs for AfroAIDSinfo, a South African Web portal. As part of the CHI practice, regular eSurveys are conducted to determine whether these needs are changing and are continually being met. eSurveys show high rates of satisfaction with the content as well as with the modes of delivery. Consumers regard the information as reliable enough to reuse, both for education and for referencing. Using CHI and the WPKTM as a theoretical framework ensures that the needs of consumers are met and that they find the tailored methods of presenting the information agreeable. Combining ICTs and theories in eHealth interventions, this approach can be expanded to deliver information in other sectors of public health.

  6. Theoretical reflections on the connection between environmental assessment methods and conflict

    International Nuclear Information System (INIS)

    Persson, Jesper

    2006-01-01

    Today there is a great variety of methods for evaluating the environmental impact of plans, programs and projects. But which of these methods should planners and managers choose? This theoretical article explores the connection between conflicts, communication and rationality in assessment methods. It focuses on the form (rationality) and substance of communication, i.e. what we should communicate about. The outcome supports the view that environmental assessments should be based on value- and interest-focused thinking, following a teleological ethic, when goals, alternatives and compensations are to be developed and impacts evaluated.

  7. E-loyalty towards a cancer information website: applying a theoretical framework.

    Science.gov (United States)

    Crutzen, Rik; Beekers, Nienke; van Eenbergen, Mies; Becker, Monique; Jongen, Lilian; van Osch, Liesbeth

    2014-06-01

    To provide more insight into user perceptions related to e-loyalty towards a cancer information website. This is needed to assure adequate provision of high-quality information during the full process of cancer treatment, from diagnosis to aftercare, and is an important first step towards optimizing cancer information websites in order to promote e-loyalty. Participants were cancer patients (n = 63) and informal caregivers (n = 202) who visited a website providing regional information about cancer care for all types of cancer. Subsequently, they filled out a questionnaire assessing e-loyalty towards the website and user perceptions (efficiency, effectiveness, active trust and enjoyment) based on a theoretical framework derived from the field of e-commerce. A structural equation model was constructed to test the relationships between user perceptions and e-loyalty. Participants in general could find the information they were looking for (efficiency), thought it was relevant (effectiveness), thought they could act upon it (active trust) and found the visit itself pleasant (enjoyment). Effectiveness and enjoyment were both positively related to e-loyalty, but this was mediated by active trust. Efficiency was positively related to e-loyalty. The explained variance of e-loyalty was high (R² = 0.70). This study demonstrates that the importance of user perceptions is not limited to fields such as e-commerce but extends to the context of cancer information websites. The high information need among participants might explain the positive relationship between efficiency and e-loyalty. Therefore, cancer information websites need to foster easy search and access of the information provided. Copyright © 2014 John Wiley & Sons, Ltd.

  8. Information theoretic approach to tactile encoding and discrimination

    OpenAIRE

    Saal, Hannes

    2011-01-01

    The human sense of touch integrates feedback from a multitude of touch receptors, but how this information is represented in the neural responses such that it can be extracted quickly and reliably is still largely an open question. At the same time, dexterous robots equipped with touch sensors are becoming more common, necessitating better methods for representing sequentially updated information and new control strategies that aid in extracting relevant features for object man...

  9. Data, Methods, and Theoretical Implications

    Science.gov (United States)

    Hannagan, Rebecca J.; Schneider, Monica C.; Greenlee, Jill S.

    2012-01-01

    Within the subfields of political psychology and the study of gender, the introduction of new data collection efforts, methodologies, and theoretical approaches are transforming our understandings of these two fields and the places at which they intersect. In this article we present an overview of the research that was presented at a National…

  10. A short course in quantum information theory an approach from theoretical physics

    CERN Document Server

    Diosi, Lajos

    2011-01-01

    This short and concise primer takes the vantage point of theoretical physics and the unity of physics. It sets out to strip the burgeoning field of quantum information science to its basics by linking it to universal concepts in physics. An extensive lecture rather than a comprehensive textbook, this volume is based on courses delivered over several years to advanced undergraduate and beginning graduate students, but essentially it addresses anyone with a working knowledge of basic quantum physics. Readers will find these lectures a most adequate entry point for theoretical studies in this field. For the second edition, the authors have succeeded in adding many new topics while sticking to the conciseness of the overall approach. A new chapter on qubit thermodynamics has been added, while new sections and subsections have been incorporated in various chapters to deal with weak and time-continuous measurements, period-finding quantum algorithms and quantum error correction. From the reviews of the first edition...

  11. Theoretical physics 7 quantum mechanics : methods and applications

    CERN Document Server

    Nolting, Wolfgang

    2017-01-01

    This textbook offers a clear and comprehensive introduction to methods and applications in quantum mechanics, one of the core components of undergraduate physics courses. It follows on naturally from the previous volumes in this series, developing further the understanding of quantized states. The first part of the book introduces the quantum theory of angular momentum and approximation methods. More complex themes are covered in the second part of the book, which describes multiple particle systems and scattering theory. Ideally suited to undergraduate students with some grounding in the basics of quantum mechanics, the book is enhanced throughout with learning features such as boxed inserts and chapter summaries, with key mathematical derivations highlighted to aid understanding. The text is supported by numerous worked examples and end-of-chapter problem sets.  About the Theoretical Physics series Translated from the renowned and highly successful German editions, the eight volumes of this seri...

  12. Advanced Numerical and Theoretical Methods for Photonic Crystals and Metamaterials

    Science.gov (United States)

    Felbacq, Didier

    2016-11-01

    This book provides a set of theoretical and numerical tools useful for the study of wave propagation in metamaterials and photonic crystals. While concentrating on electromagnetic waves, most of the material can be used for acoustic (or quantum) waves. For each presented numerical method, numerical code written in MATLAB® is presented. The codes are limited to 2D problems and can be easily translated into Python or Scilab, and used directly with Octave as well.

  13. Information Retrieval Methods in Libraries and Information ...

    African Journals Online (AJOL)


    Without adequate knowledge of information retrieval methods, the retrieval process for an ... discusses the concept of information retrieval, the various information ... Other advantages of automatic indexing are the maintenance of consistency.

  14. Toward theoretical understanding of the fertility preservation decision-making process: examining information processing among young women with cancer.

    Science.gov (United States)

    Hershberger, Patricia E; Finnegan, Lorna; Altfeld, Susan; Lake, Sara; Hirshfeld-Cytron, Jennifer

    2013-01-01

    Young women with cancer now face the complex decision about whether to undergo fertility preservation. Yet little is known about how these women process information involved in making this decision. The purpose of this article is to expand theoretical understanding of the decision-making process by examining aspects of information processing among young women diagnosed with cancer. Using a grounded theory approach, 27 women with cancer participated in individual, semistructured interviews. Data were coded and analyzed using constant-comparison techniques that were guided by 5 dimensions within the Contemplate phase of the decision-making process framework. In the first dimension, young women acquired information primarily from clinicians and Internet sources. Experiential information, often obtained from peers, occurred in the second dimension. Preferences and values were constructed in the third dimension as women acquired factual, moral, and ethical information. Women desired tailored, personalized information that was specific to their situation in the fourth dimension; however, women struggled with communicating these needs to clinicians. In the fifth dimension, women offered detailed descriptions of clinician behaviors that enhance or impede decisional debriefing. Better understanding of theoretical underpinnings surrounding women's information processes can facilitate decision support and improve clinical care.

  15. Information-theoretic approach to uncertainty importance

    International Nuclear Information System (INIS)

    Park, C.K.; Bari, R.A.

    1985-01-01

    A method is presented for importance analysis in probabilistic risk assessments (PRA) for which the results of interest are characterized by full uncertainty distributions and not just point estimates. The method is based on information theory, in which entropy is a measure of the uncertainty of a probability density function. We define the relative uncertainty importance between two events as the ratio of the exponents of the two entropies. For the log-normal and log-uniform distributions the importance measure comprises the median (central tendency) and the logarithm of the error factor (uncertainty). Thus, if accident sequences are ranked this way, and the error factors are not all equal, a different rank order results than if the sequences were ranked by the central tendency measure alone. As an illustration, the relative importance of internal events and in-plant fires was computed on the basis of existing PRA results.
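    The abstract's importance measure can be made concrete with a small numerical sketch. The snippet below is an illustration, not the authors' code: it assumes a log-normal distribution parameterized by its median and error factor (95th percentile over the median, so sigma = ln(EF)/1.645), computes the differential entropy analytically, and takes the ratio of the exponents of two entropies as the relative uncertainty importance.

```python
import math

def lognormal_entropy(median, error_factor, z95=1.6449):
    """Differential entropy of a log-normal pdf parameterized by its
    median and error factor (EF = 95th percentile / median)."""
    mu = math.log(median)
    sigma = math.log(error_factor) / z95
    return mu + 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

def relative_uncertainty_importance(m1, ef1, m2, ef2):
    """Ratio of the exponents of the two entropies (event 1 vs event 2)."""
    return math.exp(lognormal_entropy(m1, ef1)) / math.exp(lognormal_entropy(m2, ef2))

# Two hypothetical accident sequences with equal medians but different
# error factors: the broader distribution dominates despite equal medians.
r = relative_uncertainty_importance(1e-5, 10.0, 1e-5, 3.0)
print(r)  # > 1
```

    With equal medians the ratio reduces to the ratio of the sigmas, which is why ranking by this measure can differ from ranking by central tendency alone.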

  16. A large scale analysis of information-theoretic network complexity measures using chemical structures.

    Directory of Open Access Journals (Sweden)

    Matthias Dehmer

    This paper aims to investigate information-theoretic network complexity measures which have already been intensely used in mathematical and medicinal chemistry, including drug design. Numerous such measures have been developed so far, but many of them lack a meaningful interpretation, e.g., it is unclear which kind of structural information they detect. Therefore, our main contribution is to shed light on the relatedness between selected information measures for graphs by performing a large-scale analysis using chemical networks. Starting from several sets containing real and synthetic chemical structures represented by graphs, we study the relatedness between a classical (partition-based) complexity measure called the topological information content of a graph and others inferred by a different paradigm leading to partition-independent measures. Moreover, we evaluate the uniqueness of network complexity measures numerically. Generally, high uniqueness is an important and desirable property when designing novel topological descriptors having the potential to be applied to large chemical databases.
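    The classical partition-based measure named above, the topological information content, is the Shannon entropy of a vertex partition. The sketch below is an illustrative approximation, not the paper's implementation: it partitions vertices by degree as a simple, computable stand-in for the orbit (symmetry) partition used in the classical definition.

```python
import math
from collections import Counter

def topological_information_content(adj):
    """Shannon entropy (in bits) of a vertex partition of a graph given as
    an adjacency dict. Vertices are grouped by degree as a stand-in for
    the orbit partition of the classical measure."""
    n = len(adj)
    classes = Counter(len(nbrs) for nbrs in adj.values())
    return -sum((k / n) * math.log2(k / n) for k in classes.values())

# A path on 4 vertices: two end vertices (degree 1), two inner (degree 2)
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(topological_information_content(path))  # 1.0 bit: two equal classes

# A 4-cycle is vertex-transitive: a single class, so zero information content
cycle = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(topological_information_content(cycle))
```

    Partition-independent measures, by contrast, are built from quantities such as distance or degree distributions rather than from an equivalence partition of the vertex set.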

  17. Theoretical vs. empirical discriminability: the application of ROC methods to eyewitness identification.

    Science.gov (United States)

    Wixted, John T; Mickes, Laura

    2018-01-01

    Receiver operating characteristic (ROC) analysis was introduced to the field of eyewitness identification 5 years ago. Since that time, it has been both influential and controversial, and the debate has raised an issue about measuring discriminability that is rarely considered. The issue concerns the distinction between empirical discriminability (measured by area under the ROC curve) vs. underlying/theoretical discriminability (measured by d' or variants of it). Under most circumstances, the two measures will agree about a difference between two conditions in terms of discriminability. However, it is possible for them to disagree, and that fact can lead to confusion about which condition actually yields higher discriminability. For example, if the two conditions have implications for real-world practice (e.g., a comparison of competing lineup formats), should a policymaker rely on the area-under-the-curve measure or the theory-based measure? Here, we illustrate the fact that a given empirical ROC yields as many underlying discriminability measures as there are theories that one is willing to take seriously. No matter which theory is correct, for practical purposes, the singular area-under-the-curve measure best identifies the diagnostically superior procedure. For that reason, area under the ROC curve informs policy in a way that underlying theoretical discriminability never can. At the same time, theoretical measures of discriminability are equally important, but for a different reason. Without an adequate theoretical understanding of the relevant task, the field will be in no position to enhance empirical discriminability.
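    The distinction between empirical and theoretical discriminability can be made concrete with a toy computation. The snippet below is a hedged illustration, not the authors' analysis: it computes d' under the equal-variance Gaussian model and the trapezoidal area under a one-point ROC for two hypothetical lineup conditions, and shows that the two measures need not order conditions the same way.

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """Theoretical discriminability under the equal-variance Gaussian model."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

def auc_from_roc(points):
    """Empirical discriminability: trapezoidal area under ROC points given
    as (false-alarm rate, hit rate) pairs including (0, 0) and (1, 1)."""
    pts = sorted(points)
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))

# One hypothetical (hit, false-alarm) operating point per condition
for hr, fa in ((0.8, 0.2), (0.7, 0.1)):
    print(round(d_prime(hr, fa), 3),
          round(auc_from_roc([(0, 0), (fa, hr), (1, 1)]), 3))
# Both conditions yield the same single-point AUC (0.8), yet d' is
# higher for the second condition: the measures can disagree.
```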

  18. Theoretical investigations of the new Cokriging method for variable-fidelity surrogate modeling

    DEFF Research Database (Denmark)

    Zimmermann, Ralf; Bertram, Anna

    2018-01-01

    Cokriging is a variable-fidelity surrogate modeling technique which emulates a target process based on the spatial correlation of sampled data of different levels of fidelity. In this work, we address two theoretical questions associated with the so-called new Cokriging method for variable fidelity...

  19. Linear information retrieval method in X-ray grating-based phase contrast imaging and its interchangeability with tomographic reconstruction

    Science.gov (United States)

    Wu, Z.; Gao, K.; Wang, Z. L.; Shao, Q. G.; Hu, R. F.; Wei, C. X.; Zan, G. B.; Wali, F.; Luo, R. H.; Zhu, P. P.; Tian, Y. C.

    2017-06-01

    In X-ray grating-based phase contrast imaging, information retrieval is necessary for quantitative research, especially for phase tomography. However, numerous and repetitive processes have to be performed for tomographic reconstruction. In this paper, we report a novel information retrieval method, which enables retrieving phase and absorption information by means of a linear combination of two mutually conjugate images. Thanks to the distributive law of multiplication and the commutative and associative laws of addition, the information retrieval can be performed after tomographic reconstruction, thus simplifying the information retrieval procedure dramatically. The theoretical model of this method is established in both parallel beam geometry for the Talbot interferometer and fan beam geometry for the Talbot-Lau interferometer. Numerical experiments are also performed to confirm the feasibility and validity of the proposed method. In addition, we discuss its applicability in cone beam geometry and its advantages compared with other methods. Moreover, this method can also be employed in other differential phase contrast imaging methods, such as diffraction enhanced imaging, non-interferometric imaging, and edge illumination.
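    As a rough illustration of retrieval from two mutually conjugate images, the sketch below assumes a simplified linearized model I± = A(1 ± V·s) for a single pixel, where A is the transmission, V the fringe visibility and s the refraction (differential phase) signal. This is an assumed toy model, not the paper's actual formulation: the sum of the conjugate images carries absorption and their difference carries refraction.

```python
def retrieve_pixel(ip, im, visibility=0.3):
    """Linear-combination retrieval for one pixel from two mutually
    conjugate intensities, under the toy model I± = A(1 ± V*s)."""
    absorption = (ip + im) / 2.0                      # sum -> transmission A
    refraction = (ip - im) / (visibility * (ip + im)) # difference -> signal s
    return absorption, refraction

# Forward-simulate a pixel and recover the two signals
A, s, V = 0.8, 0.05, 0.3
ip = A * (1 + V * s)
im = A * (1 - V * s)
print(retrieve_pixel(ip, im, V))  # recovers A and s up to float error
```

    Because both retrieval formulas are built from sums and differences, the same linear combination can be applied before or after the (linear) tomographic reconstruction step, which is the interchangeability the abstract describes.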

  20. Using a fuzzy comprehensive evaluation method to determine product usability: A proposed theoretical framework.

    Science.gov (United States)

    Zhou, Ronggang; Chan, Alan H S

    2017-01-01

    In order to compare existing usability data to ideal goals or to that for other products, usability practitioners have tried to develop a framework for deriving an integrated metric. However, most current usability methods with this aim rely heavily on human judgment about the various attributes of a product, but often fail to take into account the inherent uncertainties in these judgments in the evaluation process. This paper presents a universal method of usability evaluation by combining the analytic hierarchy process (AHP) and the fuzzy evaluation method. By integrating multiple sources of uncertain information during product usability evaluation, the method proposed here aims to derive an index that is structured hierarchically in terms of the three usability components of effectiveness, efficiency, and user satisfaction of a product. With consideration of the theoretical basis of fuzzy evaluation, a two-layer comprehensive evaluation index was first constructed. After the membership functions were determined by an expert panel, the evaluation appraisals were computed by using the fuzzy comprehensive evaluation technique to characterize fuzzy human judgments. Then, with the use of AHP, the weights of usability components were elicited from these experts. Compared to traditional usability evaluation methods, the major strength of the fuzzy method is that it captures the fuzziness and uncertainties in human judgments and provides an integrated framework that combines the vague judgments from multiple stages of a product evaluation process.
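    The two-stage scheme described above, AHP-derived weights followed by fuzzy comprehensive evaluation, can be sketched as a weighted-average synthesis b = w · R. Every number in the example below is invented for illustration: the weights, the appraisal grades and the membership values are all hypothetical.

```python
def fuzzy_evaluate(weights, membership):
    """Weighted-average fuzzy comprehensive evaluation: b = w . R, where w
    holds attribute weights (e.g. from AHP) and R is the membership matrix
    (rows: attributes, columns: appraisal grades)."""
    n_grades = len(membership[0])
    return [sum(w * row[j] for w, row in zip(weights, membership))
            for j in range(n_grades)]

# Hypothetical usability attributes: effectiveness, efficiency, satisfaction
weights = [0.5, 0.3, 0.2]        # assumed AHP pairwise-comparison result
R = [                            # grades: poor, fair, good (assumed)
    [0.1, 0.3, 0.6],             # effectiveness memberships
    [0.2, 0.5, 0.3],             # efficiency memberships
    [0.1, 0.2, 0.7],             # satisfaction memberships
]
b = fuzzy_evaluate(weights, R)
print(b)                                  # synthesized grade vector
print(max(range(3), key=b.__getitem__))   # index of winning grade -> 2 ("good")
```

    The maximum-membership component of b gives the overall verdict; because each row of R sums to one, b sums to one as well.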

  1. Information-theoretic security proof for quantum-key-distribution protocols

    International Nuclear Information System (INIS)

    Renner, Renato; Gisin, Nicolas; Kraus, Barbara

    2005-01-01

    We present a technique for proving the security of quantum-key-distribution (QKD) protocols. It is based on direct information-theoretic arguments and thus also applies if no equivalent entanglement purification scheme can be found. Using this technique, we investigate a general class of QKD protocols with one-way classical post-processing. We show that, in order to analyze the full security of these protocols, it suffices to consider collective attacks. Indeed, we give new lower and upper bounds on the secret-key rate which only involve entropies of two-qubit density operators and which are thus easy to compute. As an illustration of our results, we analyze the Bennett-Brassard 1984, the six-state, and the Bennett 1992 protocols with one-way error correction and privacy amplification. Surprisingly, the performance of these protocols is increased if one of the parties adds noise to the measurement data before the error correction. In particular, this additional noise makes the protocols more robust against noise in the quantum channel

  2. Information-theoretic security proof for quantum-key-distribution protocols

    Science.gov (United States)

    Renner, Renato; Gisin, Nicolas; Kraus, Barbara

    2005-07-01

    We present a technique for proving the security of quantum-key-distribution (QKD) protocols. It is based on direct information-theoretic arguments and thus also applies if no equivalent entanglement purification scheme can be found. Using this technique, we investigate a general class of QKD protocols with one-way classical post-processing. We show that, in order to analyze the full security of these protocols, it suffices to consider collective attacks. Indeed, we give new lower and upper bounds on the secret-key rate which only involve entropies of two-qubit density operators and which are thus easy to compute. As an illustration of our results, we analyze the Bennett-Brassard 1984, the six-state, and the Bennett 1992 protocols with one-way error correction and privacy amplification. Surprisingly, the performance of these protocols is increased if one of the parties adds noise to the measurement data before the error correction. In particular, this additional noise makes the protocols more robust against noise in the quantum channel.

  3. Information-theoretical analysis of private content identification

    NARCIS (Netherlands)

    Voloshynovskiy, S.; Koval, O.; Beekhof, F.; Farhadzadeh, F.; Holotyak, T.

    2010-01-01

    In recent years, content identification based on digital fingerprinting attracts a lot of attention in different emerging applications. At the same time, the theoretical analysis of digital fingerprinting systems for finite length case remains an open issue. Additionally, privacy leaks caused by

  4. Methods of information processing

    Energy Technology Data Exchange (ETDEWEB)

    Kosarev, Yu G; Gusev, V D

    1978-01-01

    Works are presented on automation systems for editing and publishing operations using methods of processing symbolic information and information contained in training samples (ranking of objectives by promise, classification algorithms for tones and noise). The book will be of interest to specialists in the automation of processing textual information, programming, and pattern recognition.

  5. A theoretical global optimization method for vapor-compression refrigeration systems based on entransy theory

    International Nuclear Information System (INIS)

    Xu, Yun-Chao; Chen, Qun

    2013-01-01

    Vapor-compression refrigeration systems are among the essential energy conversion systems for humankind and consume huge amounts of energy. Many effective methods exist for improving their energy efficiency, but these rely mainly on engineering experience and computer simulations rather than theoretical analysis, owing to the complex and vague physical essence of the processes involved. We propose a theoretical global optimization method based on in-depth physical analysis of the involved processes, i.e., heat transfer analysis for the condenser and evaporator, through introducing the entransy theory, and thermodynamic analysis for the compressor and expansion valve. The integration of heat transfer and thermodynamic analyses forms the overall physical optimization model for the systems, describing the relation between all the unknown parameters and the known conditions, which makes theoretical global optimization possible. With the aid of mathematical conditional extremum solutions, an optimization equation group and the optimal configuration of all the unknown parameters are analytically obtained. Finally, via the optimization of a typical vapor-compression refrigeration system under various working conditions to minimize the total heat transfer area of the heat exchangers, the validity and superiority of the newly proposed optimization method are demonstrated. - Highlights: • A global optimization method for vapor-compression systems is proposed. • Integrating heat transfer and thermodynamic analyses forms the optimization model. • A mathematical relation between design parameters and requirements is derived. • Entransy dissipation is introduced into heat transfer analysis. • The validity of the method is proved via optimization of practical cases

  6. Scientific information processing procedures

    Directory of Open Access Journals (Sweden)

    García, Maylin

    2013-07-01

    Full Text Available The paper systematizes several theoretical view-points on scientific information processing skill. It decomposes the processing skills into sub-skills. Several methods such analysis, synthesis, induction, deduction, document analysis were used to build up a theoretical framework. Interviews and survey to professional being trained and a case study was carried out to evaluate the results. All professional in the sample improved their performance in scientific information processing.

  7. Methods of Organizational Information Security

    Science.gov (United States)

    Martins, José; Dos Santos, Henrique

    The principal objective of this article is to present a literature review of the methods used for information security at the level of organizations. Some of the principal problems are identified, and a first group of relevant dimensions is presented for efficient management of information security. The study is based on a review of some of the most relevant certified articles on this theme, international reports, and the principal norms for the management of information security. From these readings, we identified methods oriented toward risk management, certification norms, and good practices in information security. Some of the norms are oriented toward certification of the product or system, and others toward the processes of the business. There are also studies proposing frameworks that integrate different approaches, founded on norms focused on technologies and processes and taking into consideration the organizational and human environment of organizations. In our perspective, the biggest contribution to information security is the development of a method of information security for an organization in a conflicting environment. Such a method should provide security of information against the possible dimensions of attack that threats could exploit through the vulnerabilities of the organizational assets. It should support the concepts of "network-centric warfare", "information superiority" and "information warfare" developed in the last decade, where information is seen simultaneously as a weapon and as a target.

  8. An investigation on characterizing dense coal-water slurry with ultrasound: theoretical and experimental method

    Energy Technology Data Exchange (ETDEWEB)

    Xue, M.H.; Su, M.X.; Dong, L.L.; Shang, Z.T.; Cai, X.S. [Shanghai University of Science & Technology, Shanghai (China)

    2010-07-01

    Particle size distribution and concentration in particulate two-phase flow are important parameters in a wide variety of industrial areas. For the purpose of online characterization in dense coal-water slurries, ultrasonic methods have many advantages such as avoiding dilution, the capability for being used in real time, and noninvasive testing, while light-based techniques are not capable of providing information because optical methods often require the slurry to be diluted. In this article, the modified Urick equation including temperature modification, which can be used to determine the concentration by means of the measurement of ultrasonic velocity in a coal-water slurry, is evaluated on the basis of theoretical analysis and experimental study. A combination of the coupled-phase model and the Bouguer-Lambert-Beer law is employed in this work, and the attenuation spectrum is measured within the frequency region from 3 to 12 MHz. Particle size distributions of the coal-water slurry at different volume fractions are obtained with the optimum regularization technique. Therefore, the ultrasonic technique presented in this work brings the possibility of using ultrasound for online measurements of dense slurries.
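    The velocity-to-concentration step underlying the (unmodified) Urick model can be sketched as follows: the sound speed follows from volume-weighted mixture density and compressibility, and the volume fraction is recovered by inverting that relation. All material properties below are assumed round numbers for illustration, not values from the study, and the temperature modification described in the article is omitted.

```python
import math

def urick_velocity(phi, rho_p, kappa_p, rho_l, kappa_l):
    """Urick model: sound speed in a suspension from volume-weighted
    density and adiabatic compressibility of particles (p) and liquid (l)."""
    rho = phi * rho_p + (1 - phi) * rho_l
    kappa = phi * kappa_p + (1 - phi) * kappa_l
    return 1.0 / math.sqrt(rho * kappa)

def concentration_from_velocity(c_meas, rho_p, kappa_p, rho_l, kappa_l):
    """Invert the Urick relation for the volume fraction by bisection,
    assuming velocity is monotonic in phi for the given materials."""
    f = lambda phi: urick_velocity(phi, rho_p, kappa_p, rho_l, kappa_l) - c_meas
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if (f(lo) < 0) == (f(mid) < 0):
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical coal/water properties in SI units; round-trip check
rho_c, kap_c = 1350.0, 1.6e-10   # assumed coal density / compressibility
rho_w, kap_w = 1000.0, 4.6e-10   # water near room temperature
c = urick_velocity(0.35, rho_c, kap_c, rho_w, kap_w)
print(concentration_from_velocity(c, rho_c, kap_c, rho_w, kap_w))  # ~0.35
```

    The modified equation evaluated in the article additionally corrects the liquid properties for temperature, which matters because the sound speed of water varies strongly with temperature.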

  9. Theoretical clarity is not “Manicheanism”

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2011-01-01

    It is argued that in order to establish a new theoretical approach to information science it is necessary to express disagreement with some established views. The "social turn" in information science is not just exemplified in relation to the works of Marcia Bates but in relation to many different researchers in the field. Therefore it should not be taken personally, and the debate should focus on the substance. Marcia Bates has contributed considerably to information science. In spite of this some of her theoretical points of departure may be challenged. It is important to seek theoretical clarity, and this may involve a degree of schematic confrontation that should not be confused with theoretical one-sidedness, "Manicheanism" or lack of respect.

  10. Knowledge and information needs of young people with epilepsy and their parents: Mixed-method systematic review

    Directory of Open Access Journals (Sweden)

    Noyes Jane

    2010-12-01

    Background: Young people with neurological impairments such as epilepsy are known to receive less adequate services compared to young people with other long-term conditions. The time around transition to adult services (age 13-19 years) is particularly important in facilitating young people's self-care and ongoing management. There are epilepsy-specific, biological and psycho-social factors that act as barriers and enablers to information exchange and the nurturing of self-care practices. Review objectives were to identify what is known to be effective in delivering information to young people aged 13-19 years with epilepsy and their parents, to describe their experiences of information exchange in healthcare contexts, and to identify factors influencing positive and negative healthcare communication. Methods: The Evidence for Policy and Practice Information Coordinating Centre systematic mixed-method approach was adapted to locate, appraise, extract and synthesise evidence. We used Ley's cognitive hypothetical model of communication and subsequently developed a theoretical framework explaining information exchange in healthcare contexts. Results: Young people and parents believed that healthcare professionals were only interested in medical management. Young people felt that discussions about their epilepsy primarily occurred between professionals and parents. Epilepsy information that young people obtained from parents or from their own efforts increased the risk of epilepsy misconceptions. Accurate epilepsy knowledge aided psychosocial adjustment. There is some evidence that interventions, when delivered in a structured, psycho-educational, age-appropriate way, increased young people's epilepsy knowledge, with a positive trend to improving quality of life. We used mainly qualitative and mixed-method evidence to develop a theoretical framework explaining information exchange in clinical encounters. Conclusions: There is a paucity of evidence

  11. Connectionist Interaction Information Retrieval.

    Science.gov (United States)

    Dominich, Sandor

    2003-01-01

    Discussion of connectionist views for adaptive clustering in information retrieval focuses on a connectionist clustering technique and activation spreading-based information retrieval model using the interaction information retrieval method. Presents theoretical as well as simulation results as regards computational complexity and includes…

  12. An information-theoretic machine learning approach to expression QTL analysis.

    Directory of Open Access Journals (Sweden)

    Tao Huang

    Expression Quantitative Trait Locus (eQTL) analysis is a powerful tool to study the biological mechanisms linking genotype with gene expression. Such analyses can identify genomic locations where genotypic variants influence the expression of genes, both in close proximity to the variant (cis-eQTL) and on other chromosomes (trans-eQTL). Many traditional eQTL methods are based on a linear regression model. In this study, we propose a novel method to identify eQTL associations with information theory and machine learning approaches. Mutual information (MI) is used to describe the association between genetic marker and gene expression. MI can detect both linear and non-linear associations, and it can capture the heterogeneity of the population. Advanced feature selection methods, Maximum Relevance Minimum Redundancy (mRMR) and Incremental Feature Selection (IFS), were applied to optimize the selection of the genes affected by the genetic marker. When we applied our method to a study of apoE-deficient mice, we found that cis-acting eQTLs are stronger than trans-acting eQTLs, but trans-acting eQTLs outnumber cis-acting eQTLs. We compared our results (mRMR.eQTL) with R/qtl and MatrixEQTL (modelLINEAR and modelANOVA). In female mice, 67.9% of mRMR.eQTL results could be confirmed by at least two other methods, while only 14.4% of R/qtl results could be; in male mice, the corresponding figures were 74.1% and 18.2%. Our method provides a new way to identify associations between genetic markers and gene expression. Our software is available in the supporting information.
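    The core association measure, mutual information between a discrete genetic marker and binned expression values, can be computed directly from paired counts. The data below are invented for illustration, and the mRMR/IFS feature-selection stages of the method are not shown.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits from paired discrete samples, e.g. genotype codes
    (0/1/2) and binned expression levels."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# Hypothetical marker/expression data for twelve mice
genotype  = [0, 0, 0, 1, 1, 1, 2, 2, 2, 0, 1, 2]
expr_bins = [0, 0, 0, 1, 1, 1, 2, 2, 2, 0, 1, 2]   # perfectly associated
print(mutual_information(genotype, expr_bins))      # log2(3) ~ 1.585 bits

shuffled = [0, 1, 2, 0, 1, 2, 0, 1, 2, 0, 1, 2]    # no real association
print(mutual_information(genotype, shuffled))       # close to zero
```

    Because MI is estimated from counts, small samples bias it slightly above zero even for unrelated variables, which is why methods like mRMR rank markers relative to each other rather than thresholding raw MI.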

  13. Eclecticism as the foundation of meta-theoretical, mixed methods and interdisciplinary research in social sciences.

    Science.gov (United States)

    Kroos, Karmo

    2012-03-01

    This article examines the value of "eclecticism" as the foundation of meta-theoretical, mixed methods and interdisciplinary research in social sciences. On the basis of the analysis of the historical background of the concept, it is first suggested that eclecticism-based theoretical scholarship in social sciences could benefit from the more systematic research method that has been developed for synthesizing theoretical works under the name metatheorizing. Second, it is suggested that the mixed methods community could base its research approach on philosophical eclecticism instead of pragmatism because the basic idea of eclecticism is much more in sync with the nature of the combined research tradition. Finally, the Kuhnian frame is used to support the argument for interdisciplinary research and, hence, eclecticism in social sciences (rather than making an argument against multiple paradigms). More particularly, it is suggested that integrating the different (inter)disciplinary traditions and schools into one is not necessarily desirable at all in social sciences because of the complexity and openness of the research field. If it is nevertheless attempted, experience in economics suggests that paradigmatic unification comes at a high price.

  14. The Philosophy of Information as an Underlying and Unifying Theory of Information Science

    Science.gov (United States)

    Tomic, Taeda

    2010-01-01

    Introduction: Philosophical analyses of the theoretical principles underlying its sub-domains reveal the philosophy of information as an underlying meta-theory of information science. Method: Conceptual research on the knowledge sub-domains of information science and philosophy and analysis of their mutual connection. Analysis: Similarities between…

  15. Information System Quality Assessment Methods

    OpenAIRE

    Korn, Alexandra

    2014-01-01

    This thesis explores the challenging topic of information system quality assessment, mainly process assessment. The term Information System Quality is defined, and different approaches to defining quality for different domains of information systems are outlined. The main methods of process assessment are surveyed and their relationships described. Process assessment methods are divided into two categories: ISO standards and best practices. The main objective of this w...

  16. Methods for communicating technical information as public information

    International Nuclear Information System (INIS)

    Zara, S.A.

    1987-01-01

    Many challenges face the nuclear industry, especially in the waste management area. One of the biggest challenges is effective communication with the general public. Technical complexity, combined with the public's lack of knowledge and negative emotional response, complicate clear communication of radioactive waste management issues. The purpose of this session is to present and discuss methods for overcoming these obstacles and effectively transmitting technical information as public information. The methods presented encompass audio, visual, and print approaches to message transmission. To support these methods, the author also discusses techniques, based on current research, for improving the communication process

  17. Towards an Information Theory of Complex Networks

    CERN Document Server

    Dehmer, Matthias; Mehler, Alexander

    2011-01-01

    For over a decade, complex networks have steadily grown as an important tool across a broad array of academic disciplines, with applications ranging from physics to social media. A tightly organized collection of carefully-selected papers on the subject, Towards an Information Theory of Complex Networks: Statistical Methods and Applications presents theoretical and practical results about information-theoretic and statistical models of complex networks in the natural sciences and humanities. The book's major goal is to advocate and promote a combination of graph-theoretic, information-theoreti

  18. Modeling business processes: theoretical and practical aspects

    Directory of Open Access Journals (Sweden)

    V.V. Dubininа

    2015-06-01

    Full Text Available The essence of process-oriented enterprise management is examined in the article. The content and types of information technology are analyzed, given the complexity and differentiation of existing methods as well as the specific language and terminology of enterprise business process modeling. The theoretical aspects of business process modeling are reviewed, and modern traditional modeling techniques that have found practical application in visualizing the activity of retailers are studied. The theoretical analysis of modeling methods found that the UFO-toolkit method, developed by Ukrainian scientists, is the most suitable for structural and object analysis of retailers' business processes owing to its integrated systemological capabilities. A visualized simulation model of the as-is business process "sales" of retailers was designed using a combination of UFO-elements, with the aim of further practical formalization and optimization of the given business process.

  19. Experimental Verification of a Jarzynski-Related Information-Theoretic Equality by a Single Trapped Ion.

    Science.gov (United States)

    Xiong, T P; Yan, L L; Zhou, F; Rehan, K; Liang, D F; Chen, L; Yang, W L; Ma, Z H; Feng, M; Vedral, V

    2018-01-05

    Most nonequilibrium processes in thermodynamics are quantified only by inequalities; however, the Jarzynski relation presents a remarkably simple and general equality relating nonequilibrium quantities with the equilibrium free energy, and this equality holds in both the classical and quantum regimes. We report a single-spin test and confirmation of the Jarzynski relation in the quantum regime using a single ultracold ^{40}Ca^{+} ion trapped in a harmonic potential, based on a general information-theoretic equality for a temporal evolution of the system sandwiched between two projective measurements. By considering both initially pure and initially mixed states, we verify, in an exact and fundamental fashion, the nonequilibrium quantum thermodynamics relevant to the mutual information and the Jarzynski equality.
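
    The quantum experiment cannot be reproduced in a few lines, but the equality itself, ⟨e^(-βW)⟩ = e^(-βΔF), is easy to verify numerically in a classical toy setting. The sketch below uses a sudden stiffness quench of a classical harmonic oscillator, with all parameters illustrative:

```python
import numpy as np

# Sudden quench k0 -> k1 of a 1D harmonic oscillator, V(x) = k x^2 / 2.
# For an instantaneous quench the work is W = (k1 - k0) x^2 / 2, with x drawn
# from the initial equilibrium (Boltzmann) distribution at inverse temperature beta.
beta, k0, k1 = 1.0, 1.0, 4.0
rng = np.random.default_rng(42)
x = rng.normal(0.0, 1.0 / np.sqrt(beta * k0), size=200_000)
work = 0.5 * (k1 - k0) * x**2

jarzynski_avg = np.mean(np.exp(-beta * work))   # <exp(-beta W)> over realizations
delta_F = 0.5 / beta * np.log(k1 / k0)          # analytic F1 - F0 for the oscillator
# Both sides equal sqrt(k0/k1) analytically, here 0.5
```

    With 2 x 10^5 samples the two sides agree to roughly three decimal places even though individual work values fluctuate strongly; this rare-event sensitivity is part of what makes experimental tests such as the one above demanding.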

  20. Blogging in Higher Education: Theoretical and Practical Approach

    OpenAIRE

    Gulfidan CAN; Devrim OZDEMIR

    2006-01-01

    In this paper the blogging method, which includes new forms of writing, is advocated as an alternative approach to address frequently asserted problems in higher education, such as product-oriented assessment and the lack of value given to students' writing as a contribution to the discourse of academic disciplines. Both theoretical and research background information is provided to clarify the rationale for using this method in higher education. Furthermore, a recommended way of using this met...

  1. Theoretical Methods of Domain Structures in Ultrathin Ferroelectric Films: A Review

    Directory of Open Access Journals (Sweden)

    Jianyi Liu

    2014-09-01

    Full Text Available This review covers methods and recent developments in the theoretical study of domain structures in ultrathin ferroelectric films. The review begins with an introduction to some basic concepts and theories (e.g., polarization and its modern theory, ferroelectric phase transitions, domain formation, and finite-size effects) that are relevant to the study of domain structures in ultrathin ferroelectric films. Basic techniques and recent progress of a variety of important approaches for domain structure simulation, including first-principles calculation, molecular dynamics, Monte Carlo simulation, the effective Hamiltonian approach and phase field modeling, as well as multiscale simulation, are then elaborated. For each approach, its important features and relative merits over other approaches for modeling domain structures in ultrathin ferroelectric films are discussed. Finally, we review recent theoretical studies on some important issues of domain structures in ultrathin ferroelectric films, with an emphasis on the effects of interfacial electrostatics, boundary conditions and external loads.
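
    Of the simulation approaches listed, Monte Carlo is the simplest to sketch. The toy model below runs Metropolis updates on a two-dimensional Ising lattice as a crude stand-in for up/down polarization domains in a thin film; the Hamiltonian and all parameters are illustrative, far simpler than the first-principles-derived effective Hamiltonians used in actual ferroelectric studies:

```python
import numpy as np

# Metropolis Monte Carlo on a 2D Ising lattice (periodic boundaries).
# Spins +1/-1 mimic up/down polarization; J is the coupling, T the temperature
# (arbitrary units, chosen below the 2D Ising critical point at ~2.27 J).
rng = np.random.default_rng(0)
L, J, T, steps = 32, 1.0, 1.5, 200_000
spins = rng.choice([-1, 1], size=(L, L))

for _ in range(steps):
    i, j = rng.integers(0, L, size=2)
    neighbors = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                 + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
    dE = 2.0 * J * spins[i, j] * neighbors      # energy cost of flipping spin (i, j)
    if dE <= 0 or rng.random() < np.exp(-dE / T):
        spins[i, j] *= -1

# Nearest-neighbour correlation approaches +1 as ordered domains coarsen
nn_corr = np.mean(spins * np.roll(spins, 1, axis=0))
```

    Starting from a random configuration, the lattice quickly develops compact single-polarization domains; real domain-structure simulations add long-range electrostatics, strain and boundary conditions on top of this skeleton.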

  2. Theoretical and numerical investigations into the SPRT method for anomaly detection

    International Nuclear Information System (INIS)

    Schoonewelle, H.; Hagen, T.H.J.J. van der; Hoogenboom, J.E.

    1995-01-01

    The sequential probability ratio test developed by Wald is a powerful method for testing an alternative hypothesis against a null hypothesis, which makes it applicable to anomaly detection. In this paper the method is used to detect a change in the standard deviation of a Gaussian-distributed white noise signal. The false alarm probability, the alarm failure probability and the average time to alarm, which are important parameters for anomaly detection, are determined by simulation and compared with theoretical results. Each of the three parameters is presented as a function of the other two and of the ratio of the standard deviation of the anomalous signal to that of the normal signal. Results show that the method is very well suited for anomaly detection: it can, for example, detect a 50% change in standard deviation within 1 second with false alarm and alarm failure rates of less than once per month. (author)
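
    The test procedure described above is short enough to sketch. The following is a minimal illustration (function name and error-rate settings are hypothetical, not taken from the paper) of Wald's SPRT for a change in the standard deviation of zero-mean Gaussian white noise:

```python
import numpy as np

def sprt_std_change(samples, sigma0, sigma1, alpha=1e-6, beta=1e-6):
    """Wald's SPRT deciding H1: sigma = sigma1 against H0: sigma = sigma0
    for zero-mean Gaussian white noise.
    alpha ~ false alarm probability, beta ~ alarm failure probability.
    Returns (decision, number of samples used)."""
    upper = np.log((1 - beta) / alpha)   # cross upwards   -> accept H1 (alarm)
    lower = np.log(beta / (1 - alpha))   # cross downwards -> accept H0 (normal)
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # Per-sample log-likelihood ratio log[ p1(x) / p0(x) ]
        llr += (np.log(sigma0 / sigma1)
                + 0.5 * x * x * (1.0 / sigma0**2 - 1.0 / sigma1**2))
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)
```

    With alpha = beta = 1e-6, in the spirit of the error rates quoted above, a 50% increase in standard deviation is typically flagged within a few hundred samples.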

  3. Theoretical and numerical investigations into the SPRT method for anomaly detection

    Energy Technology Data Exchange (ETDEWEB)

    Schoonewelle, H.; Hagen, T.H.J.J. van der; Hoogenboom, J.E. [Interuniversitair Reactor Inst., Delft (Netherlands)

    1995-11-01

    The sequential probability ratio test developed by Wald is a powerful method for testing an alternative hypothesis against a null hypothesis, which makes it applicable to anomaly detection. In this paper the method is used to detect a change in the standard deviation of a Gaussian-distributed white noise signal. The false alarm probability, the alarm failure probability and the average time to alarm, which are important parameters for anomaly detection, are determined by simulation and compared with theoretical results. Each of the three parameters is presented as a function of the other two and of the ratio of the standard deviation of the anomalous signal to that of the normal signal. Results show that the method is very well suited for anomaly detection: it can, for example, detect a 50% change in standard deviation within 1 second with false alarm and alarm failure rates of less than once per month. (author).

  4. Perspectives on Cybersecurity Information Sharing among Multiple Stakeholders Using a Decision-Theoretic Approach.

    Science.gov (United States)

    He, Meilin; Devine, Laura; Zhuang, Jun

    2018-02-01

    The government, private sectors, and others users of the Internet are increasingly faced with the risk of cyber incidents. Damage to computer systems and theft of sensitive data caused by cyber attacks have the potential to result in lasting harm to entities under attack, or to society as a whole. The effects of cyber attacks are not always obvious, and detecting them is not a simple proposition. As the U.S. federal government believes that information sharing on cybersecurity issues among organizations is essential to safety, security, and resilience, the importance of trusted information exchange has been emphasized to support public and private decision making by encouraging the creation of the Information Sharing and Analysis Center (ISAC). Through a decision-theoretic approach, this article provides new perspectives on ISAC, and the advent of the new Information Sharing and Analysis Organizations (ISAOs), which are intended to provide similar benefits to organizations that cannot fit easily into the ISAC structure. To help understand the processes of information sharing against cyber threats, this article illustrates 15 representative information sharing structures between ISAC, government, and other participating entities, and provide discussions on the strategic interactions between different stakeholders. This article also identifies the costs of information sharing and information security borne by different parties in this public-private partnership both before and after cyber attacks, as well as the two main benefits. This article provides perspectives on the mechanism of information sharing and some detailed cost-benefit analysis. © 2017 Society for Risk Analysis.

  5. Game-theoretic interference coordination approaches for dynamic spectrum access

    CERN Document Server

    Xu, Yuhua

    2016-01-01

    Written by experts in the field, this book is based on recent research findings in dynamic spectrum access for cognitive radio networks. It establishes a game-theoretic framework and presents cutting-edge technologies for distributed interference coordination. With game-theoretic formulation and the designed distributed learning algorithms, it provides insights into the interactions between multiple decision-makers and the converging stable states. Researchers, scientists and engineers in the field of cognitive radio networks will benefit from the book, which provides valuable information, useful methods and practical algorithms for use in emerging 5G wireless communication.

  6. BRIEF INTRODUCTION TO THEORETICAL INTENTION OF "NEEDLING METHOD FOR TRANQUILLIZATION AND CALMING THE MIND" FOR TREATMENT OF INSOMNIA

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    A set of scientific theories and an effective acupuncture therapy for insomnia, "the needling method for tranquillization and calming the mind", have gradually formed through many years of theoretical and clinical studies. In this paper, the theoretical intention of "the needling method for tranquillization and calming the mind" for the treatment of insomnia is briefly introduced, mainly covering the cause of disease, the pathogenesis, the therapeutic method and the characteristics of prescription composition, in order to provide a new train of thought and a new method for working out scientific and standard prescriptions in the treatment of insomnia.

  7. Computational Study of Chemical Reactivity Using Information-Theoretic Quantities from Density Functional Reactivity Theory for Electrophilic Aromatic Substitution Reactions.

    Science.gov (United States)

    Wu, Wenjie; Wu, Zemin; Rong, Chunying; Lu, Tian; Huang, Ying; Liu, Shubin

    2015-07-23

    The electrophilic aromatic substitution for nitration, halogenation, sulfonation, and acylation is a vastly important category of chemical transformation. Its reactivity and regioselectivity are predominantly determined by the nucleophilicity of the carbon atoms on the aromatic ring, which in turn is immensely influenced by the group attached to the aromatic ring a priori. In this work, taking advantage of recent developments in quantifying nucleophilicity (electrophilicity) with descriptors from the information-theoretic approach in density functional reactivity theory, we examine the reactivity properties of this reaction system from three perspectives. These include scaling patterns of information-theoretic quantities such as Shannon entropy, Fisher information, Ghosh-Berkowitz-Parr entropy and information gain at both molecular and atomic levels; quantitative predictions of the barrier height with both the Hirshfeld charge and information gain; and energetic decomposition analyses of the barrier height for the reactions. To that end, we focused in this work on the identity reaction of the monosubstituted-benzene molecule reacting with hydrogen fluoride using boron trifluoride as the catalyst in the gas phase. We also considered 19 substituting groups, 9 of which are ortho/para directing and the other 9 meta directing, besides the case of R = -H. Similar scaling patterns for these information-theoretic quantities found for stable species elsewhere were disclosed for these reaction systems. We also unveiled novel scaling patterns for information gain at the atomic level. The barrier height of the reactions can reliably be predicted by using both the Hirshfeld charge and information gain at the regioselective carbon atom. The ensuing energy decomposition analysis yields an unambiguous picture of the origin of the barrier height, where we showed that it is the electrostatic interaction that plays the dominant role, while the roles played by exchange-correlation and
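
    The density-based quantities named in this record have straightforward grid estimates. The sketch below evaluates Shannon entropy, information gain (a Kullback-Leibler divergence) and Fisher information for hypothetical one-dimensional model densities; the actual study integrates three-dimensional molecular electron densities, so this only illustrates the definitions:

```python
import numpy as np

# Hypothetical 1D model densities standing in for molecular electron densities.
x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]

def normalized(rho):
    return rho / (rho.sum() * dx)

rho = normalized(np.exp(-x**2))           # "perturbed" density  (a Gaussian)
rho0 = normalized(np.exp(-0.5 * x**2))    # reference density    (a wider Gaussian)

shannon = -(rho * np.log(rho)).sum() * dx             # S  = -Int rho ln rho
info_gain = (rho * np.log(rho / rho0)).sum() * dx     # IG =  Int rho ln(rho/rho0) >= 0
fisher = (np.gradient(rho, x) ** 2 / rho).sum() * dx  # IF =  Int |rho'|^2 / rho
```

    For these Gaussians the three integrals have closed forms (S = ln(pi e)/2, IG = (ln 2 - 1/2)/2, IF = 2), which makes the grid estimates easy to check.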

  8. Information Theoretic Characterization of Physical Theories with Projective State Space

    Science.gov (United States)

    Zaopo, Marco

    2015-08-01

    Probabilistic theories are a natural framework in which to investigate the foundations of quantum theory and possible alternative or deeper theories. In a generic probabilistic theory, states of a physical system are represented as vectors of outcome probabilities, and state spaces are convex cones. In this picture the physics of a given theory is related to the geometric shape of the cone of states. In quantum theory, for instance, the shape of the cone of states corresponds to a projective space over the complex numbers. In this paper we investigate geometric constraints on the state space of a generic theory imposed by the following information-theoretic requirements: every non-completely-mixed state of a system is perfectly distinguishable from some other state in a single-shot measurement, and the information capacity of physical systems is conserved under making mixtures of states. These assumptions guarantee that a generic physical system satisfies a natural principle asserting that the more mixed a state of the system is, the less information can be stored in the system using that state as a logical value. We show that in all theories satisfying the above assumptions, the cone of states has the shape of a projective space over a generic field of numbers. Remarkably, these theories constitute generalizations of quantum theory in which the superposition principle holds with coefficients from a generic field of numbers in place of the complex numbers. If the field of numbers is trivial and contains only one element, we obtain classical theory. This result shows that the superposition principle is quite common among probabilistic theories, while its absence is evidence of either classical theory or an implausible theory.

  9. THE ROLE AND IMPORTANCE OF THEORETICAL PREPARATION ON “PHYSICAL EDUCATION” FOR HIGHSCHOOL STUDENTS

    Directory of Open Access Journals (Sweden)

    DANIEL DOCU AXELERAD

    2009-12-01

    Full Text Available According to the pre-university curriculum, one of the criteria for assessing the level of a subject's acquisition is the quality of the theoretical knowledge. In the basic organizing documents of school physical education, there were and still are stipulated exact requirements regarding the theoretical knowledge expected of students at various education levels. According to these documents, the theoretical knowledge was general knowledge. To the general knowledge is added that pertaining to the basic information of the given subject, information about the means and methods of physical education, information from the domain of prophylactic physical education, etc. Special knowledge is that representing the students' knowledge of the various sports tests provided by the school curriculum, such as the sporting games (volleyball, basketball, football, handball), athletics (running, jumping, throwing) and gymnastics (apparatus and floor exercises). It is here that the means and methods applied in acquiring the compartments listed above are attributed. In the special knowledge category there is also the knowledge related to the means, forms and methods of developing the basic motor qualities (force, speed, flexibility, resistance, skill), as well as the procedures for evaluating them. Nevertheless, regardless of the fact that the normative documents organizing physical education, high school included, provide that theoretical knowledge should be acquired, there is still no actual presentation of the specific requirements and assessment criteria for the level of acquisition. No document specifies the ways to evaluate the volume and quality of the acquired theoretical knowledge, which is why we present here a detailed analysis of the level of acquisition of theoretical knowledge for the "Physical Education" subject by high school students after applying the teaching-learning-evaluation technique on the

  10. A review of Web information seeking research: considerations of method and foci of interest

    Directory of Open Access Journals (Sweden)

    Konstantina Martzoukou

    2005-01-01

    Full Text Available Introduction. This review shows that Web information seeking research suffers from inconsistencies in method and a lack of homogeneity in research foci. Background. Qualitative and quantitative methods are needed to produce a comprehensive view of information seeking. Studies also recommend observation as one of the most fundamental ways of gaining direct knowledge of behaviour. User-centred research emphasises the importance of holistic approaches, which incorporate physical, cognitive, and affective elements. Problems. Comprehensive studies are limited; many approaches are problematic and a consistent methodological framework has not been developed. Research has often failed to ensure appropriate samples that ensure both quantitative validity and qualitative consistency. Typically, observation has been based on simulated rather than real information needs and most studies show little attempt to examine holistically different characteristics of users in the same research schema. Research also deals with various aspects of cognitive style and ability with variant definitions of expertise and different layers of user experience. Finally the effect of social and cultural elements has not been extensively investigated. Conclusion. The existing limitations in method and the plethora of different approaches allow little progress and fewer comparisons across studies. There is urgent need for establishing a theoretical framework on which future studies can be based so that information seeking behaviour can be more holistically understood, and results can be generalised.

  11. Molecular acidity: An accurate description with information-theoretic approach in density functional reactivity theory.

    Science.gov (United States)

    Cao, Xiaofang; Rong, Chunying; Zhong, Aiguo; Lu, Tian; Liu, Shubin

    2018-01-15

    Molecular acidity is one of the important physiochemical properties of a molecular system, yet its accurate calculation and prediction are still an unresolved problem in the literature. In this work, we propose to make use of the quantities from the information-theoretic (IT) approach in density functional reactivity theory and provide an accurate description of molecular acidity from a completely new perspective. To illustrate our point, five different categories of acidic series, singly and doubly substituted benzoic acids, singly substituted benzenesulfinic acids, benzeneseleninic acids, phenols, and alkyl carboxylic acids, have been thoroughly examined. We show that using IT quantities such as Shannon entropy, Fisher information, Ghosh-Berkowitz-Parr entropy, information gain, Onicescu information energy, and relative Rényi entropy, one is able to simultaneously predict experimental pKa values of these different categories of compounds. Because of the universality of the quantities employed in this work, which are all density dependent, our approach should be general and be applicable to other systems as well. © 2017 Wiley Periodicals, Inc.

  12. Principle-theoretic approach of kondo and construction-theoretic formalism of gauge theories

    International Nuclear Information System (INIS)

    Jain, L.C.

    1986-01-01

    Einstein classified various theories in physics as principle-theories and constructive-theories. In this lecture Kondo's approach to microscopic and macroscopic phenomena is analysed for its principle-theoretic pursuit followed by construction. The fundamentals of his theory may be recalled as the tristimulus principle, the observation principle, Kawaguchi spaces, empirical information, an epistemological point of view, unitarity, intrinsicality, and dimensional analysis subject to logical and geometrical achievement. On the other hand, various physicists have evolved constructive gauge theories from a phenomenological, often collective, point of view. Their synthetic method involves fibre bundles and connections, path integrals, as well as other hypothetical structures. These lead towards clarity, completeness and adaptability

  13. Investigation of Means of Mitigating Congestion in Complex, Distributed Network Systems by Optimization Means and Information Theoretic Procedures

    Science.gov (United States)

    2008-02-01

    Frank Mufalli, Rakesh Nagi, Jim Llinas and Sumita Mishra; SUNY at Buffalo (CUBRC), 4455 Genessee Street, Buffalo, NY.

  14. Study on the Reduced Traffic Congestion Method Based on Dynamic Guidance Information

    Science.gov (United States)

    Li, Shu-Bin; Wang, Guang-Min; Wang, Tao; Ren, Hua-Ling; Zhang, Lin

    2018-05-01

    This paper studies how to generate reasonable decision information for travelers in a real network. The problem is complex because travelers' decisions are constrained by different aspects of human behavior. Network conditions can be predicted using advanced dynamic OD (Origin-Destination) estimation techniques. Based on an improved mesoscopic traffic model, predictive dynamic traffic guidance information can be obtained accurately. A consistency algorithm is designed to investigate travelers' decisions by simulating their dynamic response to guidance information. The simulation results show that the proposed method can provide the best guidance information. Further, a case study is conducted to verify the theoretical results and to draw managerial insights into the potential of dynamic guidance strategies for improving traffic performance. Supported by the National Natural Science Foundation of China under Grant Nos. 71471104, 71771019, 71571109, and 71471167; the University Science and Technology Program Funding Projects of Shandong Province under Grant No. J17KA211; the Project of the Public Security Department of Shandong Province under Grant No. GATHT2015-236; the Major Social and Livelihood Special Project of Jinan under Grant No. 20150905

  15. Modern trends in theoretical radiation chemistry development

    International Nuclear Information System (INIS)

    Kaplan, I.G.

    1983-01-01

    The most important trends in the development of radiation chemistry theory are considered. Wide use of electronic computers for modeling different stages of radiolysis, in conjunction with advanced precision experimental methods (picosecond pulse radiolysis, the acceptor-addition method, magnetic methods of detecting interstitial active particles), is noted. Information obtained in photochemistry and molecular spectroscopy, including laser photolysis, is in common use in developing the theory. It is noted that data on processes occurring within less than 10^-12 s can at present be obtained only on the basis of theoretical representations of the mechanism by which ionizing irradiation interacts with the molecular medium. Therefore, special attention in the review is paid to the investigation of primary radiolysis processes. Besides the investigation of primary medium excitation processes, theoretical investigations into the pathways of energy degradation, knocked-out electrons and their subsequent states are continued. It is noted that a considerable number of papers deal with the nature and behaviour of radiation-induced excess electrons in non-polar solutions and solid matrices. Works on the application of diffusion kinetics to radiolysis have been developed in recent years

  16. The value of private patient information in the physician-patient relationship: a game-theoretic account.

    Science.gov (United States)

    De Jaegher, Kris

    2012-01-01

    This paper presents a game-theoretical model of the physician-patient relationship. There is a conflict of interests between physician and patient, in that the physician prefers the patient to always obtain a particular treatment, even if the patient would not consider this treatment in his interest. The patient obtains imperfect cues of whether or not he needs the treatment. The effect of an increase in the quality of the patient's private information is studied, in the form of an improvement in the quality of his cues. It is shown that when the patient's information improves in this sense, he may either become better off or worse off. The precise circumstances under which either result is obtained are derived.

  17. Methods to determine stratification efficiency of thermal energy storage processes–Review and theoretical comparison

    DEFF Research Database (Denmark)

    Haller, Michel; Cruickshank, Chynthia; Streicher, Wolfgang

    2009-01-01

    This paper reviews, from a theoretical point of view, different methods that have been proposed to characterize thermal stratification in energy storages. Specifically, it focuses on methods that can be used to determine the ability of a storage to promote and maintain stratification during charging, storing and discharging, and to represent this ability with a single numerical value, a stratification efficiency, for a given experiment or under given boundary conditions. Existing methods for calculating stratification efficiencies have been applied to hypothetical storage

  18. A method for comparison of experimental and theoretical differential neutron spectra in the Zenith reactor

    International Nuclear Information System (INIS)

    Reed, D.L.; Symons, C.R.

    1965-01-01

    A method of calculation is given which assists the analyses of chopper measurements of spectra from ZENITH and enables complex multigroup theoretical calculations of the spectra to be put into a form which may be compared with experiment. In addition the theory of the cut-off function has been extended to give analytical expressions which take into account the effects of sub-collimators, off centre slits and of a rotor made of a material partially transparent to neutrons. The theoretical cut-off function suggested shows good agreement with experiment. (author)

  19. A method for comparison of experimental and theoretical differential neutron spectra in the Zenith reactor

    Energy Technology Data Exchange (ETDEWEB)

    Reed, D L; Symons, C R [General Reactor Physics Division, Atomic Energy Establishment, Winfrith, Dorchester, Dorset (United Kingdom)

    1965-01-15

    A method of calculation is given which assists the analyses of chopper measurements of spectra from ZENITH and enables complex multigroup theoretical calculations of the spectra to be put into a form which may be compared with experiment. In addition the theory of the cut-off function has been extended to give analytical expressions which take into account the effects of sub-collimators, off centre slits and of a rotor made of a material partially transparent to neutrons. The theoretical cut-off function suggested shows good agreement with experiment. (author)

  20. Three-dimensionality of space and the quantum bit: an information-theoretic approach

    International Nuclear Information System (INIS)

    Müller, Markus P; Masanes, Lluís

    2013-01-01

    It is sometimes pointed out as a curiosity that the state space of quantum two-level systems, i.e. the qubit, and actual physical space are both three-dimensional and Euclidean. In this paper, we suggest an information-theoretic analysis of this relationship, by proving a particular mathematical result: suppose that physics takes place in d spatial dimensions, and that some events happen probabilistically (not assuming quantum theory in any way). Furthermore, suppose there are systems that carry ‘minimal amounts of direction information’, interacting via some continuous reversible time evolution. We prove that this uniquely determines spatial dimension d = 3 and quantum theory on two qubits (including entanglement and unitary time evolution), and that it allows observers to infer local spatial geometry from probability measurements. (paper)

  1. Planning and design of information systems

    CERN Document Server

    Blokdijk, André

    1991-01-01

    Planning and Design of Information Systems provides a theoretical base and a practical method of executing the planning of computerized information systems, and the planning and design of individual applications. The book is organized into five parts, covering the non-technical and nonimplementational part of information systems planning, design, and development. Part I gives the theoretical base for the subsequent parts of the book. It discusses modeling, techniques, notations, boundaries, quality issues and aspects, and decomposition techniques and problems. Part II discusses the needs, prob

  2. Information theoretical methods as discerning quantifiers of the equations of state of neutron stars

    Energy Technology Data Exchange (ETDEWEB)

    Avellar, M.G.B. de, E-mail: mgb.avellar@iag.usp.br [Instituto de Astronomia, Geofísica e Ciências Atmosféricas – Universidade de São Paulo, Rua do Matão 1226, Cidade Universitária, 05508-090, São Paulo, SP (Brazil); Souza, R.A. de, E-mail: rodrigo.souza@usp.br [Instituto de Astronomia, Geofísica e Ciências Atmosféricas – Universidade de São Paulo, Rua do Matão 1226, Cidade Universitária, 05508-090, São Paulo, SP (Brazil); Horvath, J.E., E-mail: foton@iag.usp.br [Instituto de Astronomia, Geofísica e Ciências Atmosféricas – Universidade de São Paulo, Rua do Matão 1226, Cidade Universitária, 05508-090, São Paulo, SP (Brazil); Paret, D.M., E-mail: dmanreza@fisica.uh.cu [Facultad de Física, Universidad de la Habana, San Lázaro y L, Vedado La Habana, 10400 (Cuba)

    2014-11-07

    In this work we use the statistical measures of information entropy, disequilibrium and complexity to discriminate different approaches and parametrizations for different equations of state for quark stars. We confirm the usefulness of such quantities to quantify the role of interactions in such stars. We find that within this approach, a quark matter equation of state such as SU(2) NJL with vectorial coupling and phase transition is slightly favoured and deserves deeper studies. - Highlights: • We used information theory tools to discern different compositions for compact stars. • Hadronic and quark stars analogues behave differently when analyzed with these tools. • The effects of different equations of state are singled out in this work.
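    The three quantifiers named above can be illustrated for a discrete probability distribution (a minimal sketch; the paper applies analogous continuous-case definitions to stellar structure, so the functions and example distributions below are illustrative only):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon information entropy H (natural log) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def disequilibrium(p):
    """Disequilibrium D: squared Euclidean distance from the uniform distribution."""
    p = np.asarray(p, dtype=float)
    return float(np.sum((p - 1.0 / p.size) ** 2))

def lmc_complexity(p):
    """Lopez-Ruiz/Mancini/Calbet statistical complexity C = H * D."""
    return shannon_entropy(p) * disequilibrium(p)

uniform = [0.25, 0.25, 0.25, 0.25]  # maximal entropy, zero disequilibrium
peaked = [0.97, 0.01, 0.01, 0.01]   # low entropy, far from equilibrium
print(lmc_complexity(uniform))      # 0.0: a fully disordered system is not complex
print(lmc_complexity(peaked))
```

    The product C = H·D vanishes both for perfect order (H = 0) and for perfect disorder (D = 0), which is what makes it useful for discriminating equations of state that are neither.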

  3. Some Observations on the Concepts of Information-Theoretic Entropy and Randomness

    Directory of Open Access Journals (Sweden)

    Jonathan D.H. Smith

    2001-02-01

    Full Text Available Abstract: Certain aspects of the history, derivation, and physical application of the information-theoretic entropy concept are discussed. Pre-dating Shannon, the concept is traced back to Pauli. A derivation from first principles is given, without use of approximations. The concept depends on the underlying degree of randomness. In physical applications, this translates to dependence on the experimental apparatus available. An example illustrates how this dependence affects Prigogine's proposal for the use of the Second Law of Thermodynamics as a selection principle for the breaking of time symmetry. The dependence also serves to yield a resolution of the so-called “Gibbs Paradox”. Extension of the concept from the discrete to the continuous case is discussed. The usual extension is shown to be dimensionally incorrect. Correction introduces a reference density, leading to the concept of Kullback entropy. Practical relativistic considerations suggest a possible proper reference density.
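    The dimensional point in the last sentences can be checked numerically for Gaussians (a sketch using standard closed-form expressions; the factor of 1000 is an arbitrary stand-in for a change of units): rescaling the variable shifts the differential entropy by the log of the scale, while the Kullback (relative) entropy against a reference density is invariant.

```python
import numpy as np

def gaussian_diff_entropy(s):
    """Differential entropy of N(0, s^2); its value depends on the units of x."""
    return 0.5 * np.log(2 * np.pi * np.e * s ** 2)

def kl_gaussians(s, m):
    """Kullback divergence D(p||q) for zero-mean Gaussians with std s (p) and m (q)."""
    return np.log(m / s) + s ** 2 / (2 * m ** 2) - 0.5

scale = 1000.0  # e.g. expressing x in millimetres instead of metres
print(gaussian_diff_entropy(scale) - gaussian_diff_entropy(1.0))  # log(1000): unit-dependent
print(kl_gaussians(1.0, 2.0) - kl_gaussians(scale, 2.0 * scale))  # 0.0: unit-free
```

    Here the second (wider) Gaussian plays the role of the reference density against which the first is measured.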

  4. An information-theoretic approach to motor action decoding with a reconfigurable parallel architecture.

    Science.gov (United States)

    Craciun, Stefan; Brockmeier, Austin J; George, Alan D; Lam, Herman; Príncipe, José C

    2011-01-01

    Methods for decoding movements from neural spike counts using adaptive filters often rely on minimizing the mean-squared error. However, for non-Gaussian distribution of errors, this approach is not optimal for performance. Therefore, rather than using probabilistic modeling, we propose an alternate non-parametric approach. In order to extract more structure from the input signal (neuronal spike counts) we propose using minimum error entropy (MEE), an information-theoretic approach that minimizes the error entropy as part of an iterative cost function. However, the disadvantage of using MEE as the cost function for adaptive filters is the increase in computational complexity. In this paper we present a comparison between the decoding performance of the analytic Wiener filter and a linear filter trained with MEE, which is then mapped to a parallel architecture in reconfigurable hardware tailored to the computational needs of the MEE filter. We observe considerable speedup from the hardware design. The adaptation of filter weights for the multiple-input, multiple-output linear filters, necessary in motor decoding, is a highly parallelizable algorithm. It can be decomposed into many independent computational blocks with a parallel architecture readily mapped to a field-programmable gate array (FPGA) and scales to large numbers of neurons. By pipelining and parallelizing independent computations in the algorithm, the proposed parallel architecture has sublinear increases in execution time with respect to both window size and filter order.
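    The contrast between a mean-squared-error (Wiener) solution and the MEE criterion can be sketched on toy data (hypothetical spike counts and parameters, not the paper's recordings or its FPGA design): the MEE cost is a Parzen-window estimate of the quadratic Rényi entropy of the filter errors, and a filter that concentrates the errors scores lower on it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy decoding problem: predict a target from "spike count" inputs with a linear filter.
X = rng.poisson(5.0, size=(400, 4)).astype(float)       # hypothetical spike counts
w_true = np.array([0.5, -0.2, 0.1, 0.3])
y = X @ w_true + rng.laplace(0.0, 0.5, size=400)        # non-Gaussian (Laplacian) errors

def error_entropy(e, sigma=1.0):
    """Parzen-window estimate of the quadratic Renyi entropy of the errors:
    H2 = -log V, where the information potential V is the mean pairwise Gaussian
    kernel between error samples. MEE training descends this cost iteratively."""
    d = e[:, None] - e[None, :]
    V = np.exp(-d ** 2 / (4.0 * sigma ** 2)).mean()
    return -np.log(V)

w_ls, *_ = np.linalg.lstsq(X, y, rcond=None)   # analytic least-squares (Wiener) filter
h_zero = error_entropy(y - X @ np.zeros(4))    # untrained filter: spread-out errors
h_ls = error_entropy(y - X @ w_ls)             # fitted filter: concentrated errors
print(h_ls < h_zero)  # True
```

    Note that the pairwise kernel sums are exactly the independent computations the paper parallelizes: each row of `d` can be evaluated in its own hardware block.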

  5. 31st International Colloquium in Group Theoretical Methods in Physics

    CERN Document Server

    Gazeau, Jean-Pierre; Faci, Sofiane; Micklitz, Tobias; Scherer, Ricardo; Toppan, Francesco

    2017-01-01

    These proceedings record the 31st International Colloquium on Group Theoretical Methods in Physics (“Group 31”). Plenary-invited articles propose new approaches to the moduli spaces in gauge theories (V. Pestun, 2016 Weyl Prize Awardee), the phenomenology of neutrinos in non-commutative space-time, the use of Hardy spaces in quantum physics, contradictions in the use of statistical methods on complex systems, and alternative models of supersymmetry. This volume’s survey articles broaden the colloquia’s scope out into Majorana neutrino behavior, the dynamics of radiating charges, statistical pattern recognition of amino acids, and a variety of applications of gauge theory, among others. This year’s proceedings further honors Bertram Kostant (2016 Wigner Medalist), as well as S.T. Ali and L. Boyle, for their life-long contributions to the math and physics communities. The aim of the ICGTMP is to provide a forum for physicists, mathematicians, and scientists of related disciplines who develop or apply ...

  6. Aligning professional skills and active learning methods: an application for information and communications technology engineering

    Science.gov (United States)

    Llorens, Ariadna; Berbegal-Mirabent, Jasmina; Llinàs-Audet, Xavier

    2017-07-01

    Engineering education is facing new challenges to effectively provide the appropriate skills to future engineering professionals according to market demands. This study proposes a model based on active learning methods, which is expected to facilitate the acquisition of the professional skills most highly valued in the information and communications technology (ICT) market. The theoretical foundations of the study are based on the specific literature on active learning methodologies. The Delphi method is used to establish the fit between learning methods and generic skills required by the ICT sector. An innovative proposition is therefore presented that groups the required skills in relation to the teaching method that best develops them. The qualitative research suggests that a combination of project-based learning and the learning contract is sufficient to ensure a satisfactory skills level for this profile of engineers.

  7. Theoretical foundations of information security investment security companies

    Directory of Open Access Journals (Sweden)

    G.V. Berlyak

    2015-03-01

    Full Text Available The article addresses methodological problems arising from the lack of guidance in the accounting provisions (standards) on how the research object should be reflected in accounting and financial reporting. In this connection, amendments to the accounting provisions (standards) are proposed, which would bring consistency to the accounting treatment of operations involving elements of investment activity. Based on an analysis of users' information needs, indicative blocks of indicators are suggested (corporate finance; assessment of relations with financial institutions; fulfilment of settlement obligations; investment; science and innovation; investment security), and forms of internal accounting controls and improvements to the existing financial statement forms for the investment activities of the enterprise are developed. Using these enterprise reporting forms provides timely and reliable information on the essence and structure of investment security and enables the company to plan effectively and to develop personnel policies for enterprise management.

  8. THEORETICAL FRAMEWORK FOR INFORMATION AND EDUCATIONAL COMPLEX DEVELOPMENT OF AN ACADEMIC DISCIPLINE AT A HIGHER INSTITUTION

    Directory of Open Access Journals (Sweden)

    Evgeniia Nikolaevna Kikot

    2015-05-01

    Full Text Available The organization of the contemporary educational process is becoming increasingly important under the conditions of ICT (information and communication technologies) and e-education. This defines one of the most important methodological and research directions for a university: the creation of an informational-educational course-unit complex as the foundation of the e-University resource. The creation of such a complex rests on the concepts of openness, accessibility, clarity and personalisation, which allow a system of requirements for the complex and its substantive content to be built. Its main functions are identified as informational, educational, controlling and communicative. It is argued that the scientific justification of new structural elements of the complex should include the creation of e-workbooks and e-workshops for organizing theoretical and practical e-conferences. The development of ICT in education that supports e-education presupposes the establishment of distance learning technologies for implementing educational programmes.

  9. Model for Electromagnetic Information Leakage

    OpenAIRE

    Mao Jian; Li Yongmei; Zhang Jiemin; Liu Jinming

    2013-01-01

    Electromagnetic leakage occurs in operating information equipment and can lead to information disclosure. In order to discover the nature of information in electromagnetic leakage, this paper combines electromagnetic theory with information theory as an innovative research method. It outlines a systematic model of electromagnetic information leakage, which theoretically describes the process of information leakage, intercept and reproduction based on electromagnetic radiation, and ana...

  10. The Linear Quadratic Gaussian Multistage Game with Nonclassical Information Pattern Using a Direct Solution Method

    Science.gov (United States)

    Clemens, Joshua William

    Game theory has application across multiple fields, spanning from economic strategy to optimal control of an aircraft and missile on an intercept trajectory. The idea of game theory is fascinating in that we can actually mathematically model real-world scenarios and determine optimal decision making. It may not always be easy to mathematically model certain real-world scenarios, nonetheless, game theory gives us an appreciation for the complexity involved in decision making. This complexity is especially apparent when the players involved have access to different information upon which to base their decision making (a nonclassical information pattern). Here we will focus on the class of adversarial two-player games (sometimes referred to as pursuit-evasion games) with nonclassical information pattern. We present a two-sided (simultaneous) optimization solution method for the two-player linear quadratic Gaussian (LQG) multistage game. This direct solution method allows for further interpretation of each player's decision making (strategy) as compared to previously used formal solution methods. In addition to the optimal control strategies, we present a saddle point proof and we derive an expression for the optimal performance index value. We provide some numerical results in order to further interpret the optimal control strategies and to highlight real-world application of this game-theoretic optimal solution.

  11. Maintenance and methods of forming theoretical knowledge and methodical and practical abilities in area of physical culture for students, future specialists on social work

    Directory of Open Access Journals (Sweden)

    Leyfa A.V.

    2009-12-01

    Full Text Available The article shows the value of theoretical knowledge, methodical and practical studies, and skills in forming the physical activity of students. The level of mastery of the components of physical activity is closely associated with the basic blocks of the students' professional preparation and with their future professional activity. Theoretical knowledge in the discipline «Physical culture» has a definite effect on the depth and breadth of mastery of professional knowledge.

  12. Theoretical and simulation studies of seeding methods

    Energy Technology Data Exchange (ETDEWEB)

    Pellegrini, Claudio [Univ. of California, Los Angeles, CA (United States)

    2017-12-11

    We report the theoretical and experimental studies done with the support of DOE Grant DE-SC0009983 to increase X-ray FEL peak power from the present 20-40 GW level to one or more TW by seeding, undulator tapering, and the new concept of the Double Bunch FEL.

  13. Number theoretic methods in cryptography complexity lower bounds

    CERN Document Server

    Shparlinski, Igor

    1999-01-01

    The book introduces new techniques which imply rigorous lower bounds on the complexity of some number theoretic and cryptographic problems. These methods and techniques are based on bounds of character sums and numbers of solutions of some polynomial equations over finite fields and residue rings. It also contains a number of open problems and proposals for further research. We obtain several lower bounds, exponential in terms of log p, on the degrees and orders of • polynomials; • algebraic functions; • Boolean functions; • linear recurring sequences; coinciding with values of the discrete logarithm modulo a prime p at sufficiently many points (the number of points can be as small as p^{1/2+ε}). These functions are considered over the residue ring modulo p and over the residue ring modulo an arbitrary divisor d of p − 1. The case of d = 2 is of special interest since it corresponds to the representation of the rightmost bit of the discrete logarithm and defines whether the argument is a quadratic...

  14. CONCEPTUAL MODEL OF INFORMATION SYSTEM OF THE AGRICULTURAL ENTERPRISES

    Directory of Open Access Journals (Sweden)

    Uladzimir Buts

    2017-02-01

    Full Text Available Abstract. The research subject is the theoretical and practical issues of using information resources in agricultural business. The research aim is to form a conceptual model of an information system for agricultural enterprises that meets the requirements of sustainable development. Research methods. The work draws on several scientific methods and approaches, including monographic, analytical, and computational-constructive methods and the mathematical and structural-logic simulation of information systems. Research results. Based on an assessment of the results of research on information systems in agribusiness, as reflected in the theoretical review, the author has designed principles for an information system of the agricultural enterprise oriented towards the sustainable development of agribusiness. Sphere of application of the research results. State and regional authorities of economic regulation; agricultural enterprises and farmers.

  15. Information Security Assessment of SMEs as Coursework -- Learning Information Security Management by Doing

    Science.gov (United States)

    Ilvonen, Ilona

    2013-01-01

    Information security management is an area with a lot of theoretical models. The models are designed to guide practitioners in prioritizing management resources in companies. Information security management education should address the gap between the academic ideals and practice. This paper introduces a teaching method that has been in use as…

  16. FUTURE SPECIALIST’S CULTURAL PREPARATION IN THE INFORMATION SOCIETY

    Directory of Open Access Journals (Sweden)

    Tatiana Vinnyk

    2015-10-01

    Full Text Available The article is devoted to the theoretical and methodical foundations of forming a future specialist's information culture as a new form of communication and life in a new ontological reality. It considers the theoretical bases of the interrelation of culture and education, which arise from the nature and genesis of culture as a social and personal phenomenon, and analyzes the cultural characteristics of modernity: the virtual mode of culture's existence and the trend towards the greening of culture. The authors examine the impact of personal factors on the development of culture in the information society, analyze the symbolic nature of activity in the information society, and focus on the virtualization of life as a feature of its culture. A theoretical investigation of the concept of personal information culture is carried out, covering its informative value, structural components and the features of its formation. It is shown that in planning cultural training it is important to consider the principles of continuity, sufficiency, consistency and practical application. Methods of stimulating students to information activity are disclosed. The main directions of the experimental search for the organization of cultural training are determined: improving the content, forms and methods of future primary school teachers' professional training, and the use of innovative pedagogical techniques, computer tools and technologies. The possible results of students' information activity at university are described.

  17. Advanced approaches to intelligent information and database systems

    CERN Document Server

    Boonjing, Veera; Chittayasothorn, Suphamit

    2014-01-01

    This book consists of 35 chapters presenting different theoretical and practical aspects of Intelligent Information and Database Systems. Nowadays both Intelligent and Database Systems are applied in most of the areas of human activities which necessitates further research in these areas. In this book various interesting issues related to the intelligent information models and methods as well as their advanced applications, database systems applications, data models and their analysis, and digital multimedia methods and applications are presented and discussed both from the practical and theoretical points of view. The book is organized in four parts devoted to intelligent systems models and methods, intelligent systems advanced applications, database systems methods and applications, and multimedia systems methods and applications. The book will be interesting for both practitioners and researchers, especially graduate and PhD students of information technology and computer science, as well as more experienced ...

  18. Visual words assignment via information-theoretic manifold embedding.

    Science.gov (United States)

    Deng, Yue; Li, Yipeng; Qian, Yanjun; Ji, Xiangyang; Dai, Qionghai

    2014-10-01

    Codebook-based learning provides a flexible way to extract the contents of an image in a data-driven manner for visual recognition. One central task in such frameworks is codeword assignment, which allocates local image descriptors to the most similar codewords in the dictionary to generate histogram for categorization. Nevertheless, existing assignment approaches, e.g., nearest neighbors strategy (hard assignment) and Gaussian similarity (soft assignment), suffer from two problems: 1) too strong Euclidean assumption and 2) neglecting the label information of the local descriptors. To address the aforementioned two challenges, we propose a graph assignment method with maximal mutual information (GAMI) regularization. GAMI takes the power of manifold structure to better reveal the relationship of massive number of local features by nonlinear graph metric. Meanwhile, the mutual information of descriptor-label pairs is ultimately optimized in the embedding space for the sake of enhancing the discriminant property of the selected codewords. According to such objective, two optimization models, i.e., inexact-GAMI and exact-GAMI, are respectively proposed in this paper. The inexact model can be efficiently solved with a closed-from solution. The stricter exact-GAMI nonparametrically estimates the entropy of descriptor-label pairs in the embedding space and thus leads to a relatively complicated but still trackable optimization. The effectiveness of GAMI models are verified on both the public and our own datasets.

  19. Theoretical nuclear physics

    CERN Document Server

    Blatt, John M

    1979-01-01

    A classic work by two leading physicists and scientific educators endures as an uncommonly clear and cogent investigation and correlation of key aspects of theoretical nuclear physics. It is probably the most widely adopted book on the subject. The authors approach the subject as ""the theoretical concepts, methods, and considerations which have been devised in order to interpret the experimental material and to advance our ability to predict and control nuclear phenomena.""The present volume does not pretend to cover all aspects of theoretical nuclear physics. Its coverage is restricted to

  20. Experimental and Theoretical Methods in Algebra, Geometry and Topology

    CERN Document Server

    Veys, Willem; Bridging Algebra, Geometry, and Topology

    2014-01-01

    Algebra, geometry and topology cover a variety of different, but intimately related research fields in modern mathematics. This book focuses on specific aspects of this interaction. The present volume contains refereed papers which were presented at the International Conference “Experimental and Theoretical Methods in Algebra, Geometry and Topology”, held in Eforie Nord (near Constanta), Romania, during 20-25 June 2013. The conference was devoted to the 60th anniversary of the distinguished Romanian mathematicians Alexandru Dimca and Ştefan Papadima. The selected papers consist of original research work and a survey paper. They are intended for a large audience, including researchers and graduate students interested in algebraic geometry, combinatorics, topology, hyperplane arrangements and commutative algebra. The papers are written by well-known experts from different fields of mathematics, affiliated to universities from all over the word, they cover a broad range of topics and explore the research f...

  1. Theoretical studies of potential energy surfaces and computational methods

    Energy Technology Data Exchange (ETDEWEB)

    Shepard, R. [Argonne National Laboratory, IL (United States)

    1993-12-01

    This project involves the development, implementation, and application of theoretical methods for the calculation and characterization of potential energy surfaces involving molecular species that occur in hydrocarbon combustion. These potential energy surfaces require an accurate and balanced treatment of reactants, intermediates, and products. This difficult challenge is met with general multiconfiguration self-consistent-field (MCSCF) and multireference single- and double-excitation configuration interaction (MRSDCI) methods. In contrast to the more common single-reference electronic structure methods, this approach is capable of describing accurately molecular systems that are highly distorted away from their equilibrium geometries, including reactant, fragment, and transition-state geometries, and of describing regions of the potential surface that are associated with electronic wave functions of widely varying nature. The MCSCF reference wave functions are designed to be sufficiently flexible to describe qualitatively the changes in the electronic structure over the broad range of geometries of interest. The necessary mixing of ionic, covalent, and Rydberg contributions, along with the appropriate treatment of the different electron-spin components (e.g. closed shell, high-spin open-shell, low-spin open shell, radical, diradical, etc.) of the wave functions, are treated correctly at this level. Further treatment of electron correlation effects is included using large scale multireference CI wave functions, particularly including the single and double excitations relative to the MCSCF reference space. This leads to the most flexible and accurate large-scale MRSDCI wave functions that have been used to date in global PES studies.

  2. Method Engineering: Engineering of Information Systems Development Methods and Tools

    NARCIS (Netherlands)

    Brinkkemper, J.N.; Brinkkemper, Sjaak

    1996-01-01

    This paper proposes the term method engineering for the research field of the construction of information systems development methods and tools. Some research issues in method engineering are identified. One major research topic in method engineering is discussed in depth: situational methods, i.e.

  3. Theoretical Proof and Empirical Confirmation of a Continuous Labeling Method Using Naturally 13C-Depleted Carbon Dioxide

    Institute of Scientific and Technical Information of China (English)

    Weixin Cheng; Feike A. Dijkstra

    2007-01-01

    Continuous isotope labeling and tracing is often needed to study the transformation, movement, and allocation of carbon in plant-soil systems. However, existing labeling methods have numerous limitations. The present study introduces a new continuous labeling method using naturally 13C-depleted CO2. We theoretically proved that a stable level of 13C-CO2 abundance in a labeling chamber can be maintained by controlling the rate of CO2-free air injection and the rate of ambient airflow with coupling of automatic control of CO2 concentration using a CO2 analyzer. The theoretical results were tested and confirmed in a 54 day experiment in a plant growth chamber. This new continuous labeling method avoids the use of radioactive 14C or expensive 13C-enriched CO2 required by existing methods and therefore eliminates issues of radiation safety or unaffordable isotope cost, as well as creating new opportunities for short- or long-term labeling experiments under a controlled environment.
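    The mass-balance argument can be sketched with a toy well-mixed-chamber simulation (all flows, volumes and δ13C values below are hypothetical, not the paper's apparatus): the chamber CO2 concentration and δ13C settle to stable flow-weighted values, which is the premise of the theoretical proof.

```python
# Toy Euler simulation of a well-mixed labeling chamber (1-minute steps).
V = 1000.0                                  # chamber volume, L
q_amb, c_amb, d_amb = 20.0, 400e-6, -8.0    # ambient airflow L/min, CO2 fraction, delta13C (per mil)
q_tank, d_tank = 0.01, -35.0                # depleted tank CO2 flow (pure CO2, L/min), delta13C
uptake = 0.005                              # plant CO2 uptake, L/min (non-fractionating here)

c = c_amb * V                               # litres of CO2 in the chamber
d = d_amb                                   # chamber delta13C
for _ in range(10000):
    co2_in = q_amb * c_amb + q_tank                      # CO2 entering per minute
    co2_out = (q_amb + q_tank) * (c / V) + uptake        # CO2 leaving per minute
    d_in = (q_amb * c_amb * d_amb + q_tank * d_tank) / co2_in
    d += co2_in * (d_in - d) / c                         # isotope mass balance
    c += co2_in - co2_out
print(round(c / V * 1e6, 1), round(d, 2))   # steady-state ppm and per mil
```

    With these made-up flows the chamber stabilizes near 650 ppm at the flow-weighted δ13C of about −23 per mil, illustrating why controlling the injection and airflow rates fixes the label level.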

  4. Exploring methods in information literacy research

    CERN Document Server

    Lipu, Suzanne; Lloyd, Annemaree

    2007-01-01

    This book provides an overview of approaches to assist researchers and practitioners to explore ways of undertaking research in the information literacy field. The first chapter provides an introductory overview of research by Dr Kirsty Williamson (author of Research Methods for Students, Academics and Professionals: Information Management and Systems) and this sets the scene for the rest of the chapters where each author explores the key aspects of a specific method and explains how it may be applied in practice. The methods covered include those representing qualitative, quantitative and mix

  5. Almost Free Modules Set-Theoretic Methods

    CERN Document Server

    Eklof, PC

    1990-01-01

    This is an extended treatment of the set-theoretic techniques which have transformed the study of abelian group and module theory over the last 15 years. Part of the book is new work which does not appear elsewhere in any form. In addition, a large body of material which has appeared previously (in scattered and sometimes inaccessible journal articles) has been extensively reworked and in many cases given new and improved proofs. The set theory required is carefully developed with algebraists in mind, and the independence results are derived from explicitly stated axioms. The book contains exe

  6. Method of and System for Information Retrieval

    DEFF Research Database (Denmark)

    2015-01-01

    This invention relates to a system for and a method (100) of searching a collection of digital information (150) comprising a number of digital documents (110), the method comprising receiving or obtaining (102) a search query, the query comprising a number of search terms, searching (103) an index (300) using the search terms, thereby providing information (301) about which digital documents (110) of the collection of digital information (150) contain a given search term and one or more search related metrics (302; 303; 304; 305; 306), and ranking (105) at least a part of the search result. In this way, a method of and a system for information retrieval or searching is readily provided that enhances the searching quality (i.e. the number of relevant documents retrieved and such documents being ranked high) when (also) using queries containing many search terms.
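    As a generic illustration of the index-then-rank pipeline described in the claim (a minimal sketch with made-up documents, not the patented metrics 302-306): an inverted index maps each term to the documents containing it, and documents are ranked by how many query terms they match.

```python
from collections import defaultdict

# Made-up document collection keyed by document id.
docs = {
    110: "information retrieval with many search terms",
    111: "digital documents and search indexes",
    112: "cooking recipes",
}

# Build the inverted index: term -> set of documents containing it.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.split():
        index[term].add(doc_id)

def search(query):
    """Rank documents by the number of query terms they contain (descending)."""
    scores = defaultdict(int)
    for term in query.split():
        for doc_id in index[term]:
            scores[doc_id] += 1
    return sorted(scores, key=lambda doc_id: -scores[doc_id])

print(search("search terms"))  # [110, 111]: doc 110 matches both terms, 111 one
```

    A production system would replace the match count with richer per-term metrics, which is where the claimed search-related metrics would plug in.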

  7. Tools and methods of the formation of the armed violence’ information component

    Directory of Open Access Journals (Sweden)

    A. V. Bader

    2016-10-01

    Thus, we can state that the informational component of armed violence is gradually approaching the theoretically grounded phenomenon of «consistent war». This is a system of outreach and psychological tools aimed at shaping public awareness, conducted through the mass media, culture, the arts and other (psychotropic, psychotronic) tools over a long period of time according to carefully developed scenarios.

  8. Unorthodox theoretical methods

    Energy Technology Data Exchange (ETDEWEB)

    Nedd, Sean [Iowa State Univ., Ames, IA (United States)

    2012-01-01

    The use of the ReaxFF force field to correlate with NMR mobilities of amine catalytic substituents on a mesoporous silica nanosphere surface is considered. The interfacing of the ReaxFF force field within the Surface Integrated Molecular Orbital/Molecular Mechanics (SIMOMM) method, in order to replicate earlier published SIMOMM data and to compare with the ReaxFF data, is discussed. The development of a new correlation consistent Composite Approach (ccCA) is presented, which incorporates the completely renormalized coupled cluster method with singles, doubles and non-iterative triples corrections towards the determination of heats of formation and reaction pathways which contain biradical species.

  9. Locating sensors for detecting source-to-target patterns of special nuclear material smuggling: a spatial information theoretic approach.

    Science.gov (United States)

    Przybyla, Jay; Taylor, Jeffrey; Zhou, Xuesong

    2010-01-01

    In this paper, a spatial information-theoretic model is proposed to locate sensors for detecting source-to-target patterns of special nuclear material (SNM) smuggling. In order to ship the nuclear materials from a source location with SNM production to a target city, the smugglers must employ global and domestic logistics systems. This paper focuses on locating a limited set of fixed and mobile radiation sensors in a transportation network, with the intent to maximize the expected information gain and minimize the estimation error for the subsequent nuclear material detection stage. A Kalman filtering-based framework is adapted to assist the decision-maker in quantifying the network-wide information gain and SNM flow estimation accuracy.
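The "expected information gain" objective in this record has a simple closed form in the scalar linear-Gaussian case: a Kalman measurement update shrinks the estimate's variance, and the entropy reduction is half the log of the variance ratio. The sketch below is a hypothetical one-dimensional illustration of that criterion, not the paper's network-wide model.

```python
import math

def kalman_information_gain(p_prior, r_sensor):
    """Entropy reduction (nats) from one Gaussian measurement update.

    p_prior : prior variance of the flow estimate at a candidate location
    r_sensor: measurement-noise variance of the candidate sensor
    """
    # Scalar Kalman update: posterior variance after one direct observation.
    p_post = p_prior * r_sensor / (p_prior + r_sensor)
    # Gaussian entropy is 0.5*ln(2*pi*e*p), so the gain is half the log variance ratio.
    return 0.5 * math.log(p_prior / p_post)

# A precise sensor (low noise) yields more information than a noisy one.
gain_good = kalman_information_gain(p_prior=4.0, r_sensor=0.5)
gain_poor = kalman_information_gain(p_prior=4.0, r_sensor=8.0)
```

Ranking candidate locations by this gain is one simple way to phrase the "maximize expected information gain" objective; the paper's actual Kalman-filtering framework operates over an entire transportation network.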

  10. Locating Sensors for Detecting Source-to-Target Patterns of Special Nuclear Material Smuggling: A Spatial Information Theoretic Approach

    Directory of Open Access Journals (Sweden)

    Xuesong Zhou

    2010-08-01

    Full Text Available In this paper, a spatial information-theoretic model is proposed to locate sensors for detecting source-to-target patterns of special nuclear material (SNM) smuggling. In order to ship the nuclear materials from a source location with SNM production to a target city, the smugglers must employ global and domestic logistics systems. This paper focuses on locating a limited set of fixed and mobile radiation sensors in a transportation network, with the intent to maximize the expected information gain and minimize the estimation error for the subsequent nuclear material detection stage. A Kalman filtering-based framework is adapted to assist the decision-maker in quantifying the network-wide information gain and SNM flow estimation accuracy.

  11. Implementation of 2D Discrete Wavelet Transform by Number Theoretic Transform and 2D Overlap-Save Method

    Directory of Open Access Journals (Sweden)

    Lina Yang

    2014-01-01

    Full Text Available To reduce the computation complexity of the wavelet transform, this paper presents a novel approach built on two key techniques: (1) the fast number theoretic transform (FNTT), in which linear convolution is replaced by circular convolution, speeding up the computation of the 2D discrete wavelet transform; and (2) the 2D overlap-save method. Directly applying the FNTT to the whole input sequence meets two difficulties: a big modulo obstructs the effective implementation of the FNTT, and a long input sequence slows the computation of the FNTT down. To overcome these deficiencies, a new technique referred to as the 2D overlap-save method is developed. Experiments have been conducted. The fast number theoretic transform and the 2D overlap-save method have been used to implement the dyadic wavelet transform and applied to contour extraction in pattern recognition.
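The two ingredients named above, circular convolution and the number theoretic transform, can be demonstrated together in a toy setting. This is a naive O(n²) transform over Z₁₇ with an illustrative length-4 root of unity, not the paper's fast implementation; the modulus and root are assumptions chosen only to make the example small (modular inverses via `pow` need Python 3.8+).

```python
P, N, W = 17, 4, 13          # modulus, length, primitive N-th root of unity mod P (13**4 % 17 == 1)

def ntt(a, root):
    """Naive number theoretic transform over Z_P (O(n^2), for illustration only)."""
    return [sum(a[j] * pow(root, j * k, P) for j in range(N)) % P for k in range(N)]

def circular_convolve_ntt(a, b):
    # Pointwise product in the NTT domain equals circular convolution in the signal domain.
    fa, fb = ntt(a, W), ntt(b, W)
    prod = [(x * y) % P for x, y in zip(fa, fb)]
    inv_n, inv_w = pow(N, -1, P), pow(W, -1, P)
    return [(inv_n * v) % P for v in ntt(prod, inv_w)]

def circular_convolve_direct(a, b):
    """Reference O(n^2) circular convolution mod P, for checking the transform route."""
    return [sum(a[j] * b[(k - j) % N] for j in range(N)) % P for k in range(N)]

a, b = [1, 2, 3, 4], [5, 6, 7, 8]
```

The convolution theorem is exact here because all arithmetic stays in Z₁₇; the "big modulo" difficulty the abstract mentions arises when the true convolution values must be recovered without wrap-around, forcing a much larger modulus.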

  12. Self-informant Agreement for Personality and Evaluative Person Descriptors: Comparing Methods for Creating Informant Measures.

    Science.gov (United States)

    Simms, Leonard J; Zelazny, Kerry; Yam, Wern How; Gros, Daniel F

    2010-05-01

    Little attention typically is paid to the way self-report measures are translated for use in self-informant agreement studies. We studied two possible methods for creating informant measures: (a) the traditional method in which self-report items were translated from the first- to the third-person and (b) an alternative meta-perceptual method in which informants were directed to rate their perception of the targets' self-perception. We hypothesized that the latter method would yield stronger self-informant agreement for evaluative personality dimensions measured by indirect item markers. We studied these methods in a sample of 303 undergraduate friendship dyads. Results revealed mean-level differences between methods, similar self-informant agreement across methods, stronger agreement for Big Five dimensions than for evaluative dimensions, and incremental validity for meta-perceptual informant rating methods. Limited power reduced the interpretability of several sparse acquaintanceship effects. We conclude that traditional informant methods are appropriate for most personality traits, but meta-perceptual methods may be more appropriate when personality questionnaire items reflect indirect indicators of the trait being measured, which is particularly likely for evaluative traits.

  13. A Theoretical Framework for Soft-Information-Based Synchronization in Iterative (Turbo) Receivers

    Directory of Open Access Journals (Sweden)

    Lottici Vincenzo

    2005-01-01

    Full Text Available This contribution considers turbo synchronization, that is to say, the use of soft data information to estimate parameters like carrier phase, frequency, or timing offsets of a modulated signal within an iterative data demodulator. In turbo synchronization, the receiver exploits the soft decisions computed at each turbo decoding iteration to provide a reliable estimate of some signal parameters. The aim of our paper is to show that such a "turbo-estimation" approach can be regarded as a special case of the expectation-maximization (EM) algorithm. This leads to a general theoretical framework for turbo synchronization that allows one to derive parameter estimation procedures for carrier phase and frequency offset, as well as for timing offset and signal amplitude. The proposed mathematical framework is illustrated by simulation results reported for the particular case of carrier phase and frequency offset estimation of a turbo-coded 16-QAM signal.
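In the idealized limit (perfect soft decisions, no noise), the carrier-phase branch of such an EM-based estimator reduces to correlating the received samples with the expected symbols and taking the angle of the result. The BPSK toy below is a hypothetical illustration of that single M-step, not the paper's full iterative receiver.

```python
import cmath

def phase_estimate(received, soft_symbols):
    """ML carrier-phase estimate given (soft) symbol decisions.

    Correlate received samples against expected symbols; the angle of the
    accumulated correlation is the phase-offset estimate.
    """
    return cmath.phase(sum(r * s.conjugate() for r, s in zip(received, soft_symbols)))

true_phase = 0.3
symbols = [1, -1, -1, 1, 1]                       # BPSK decisions from the decoder
received = [s * cmath.exp(1j * true_phase) for s in symbols]
est = phase_estimate(received, symbols)
```

In a real turbo receiver the `soft_symbols` would be the decoder's a posteriori symbol expectations, refined at every iteration, which is precisely how the EM interpretation arises.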

  14. Method and apparatus for information carrier authentication

    NARCIS (Netherlands)

    2015-01-01

    The present invention relates to a method of enabling authentication of an information carrier, the information carrier comprising a writeable part and a physical token arranged to supply a response upon receiving a challenge, the method comprising the following steps; applying a first challenge to

  15. Intelligent systems: A semiotic perspective. Volume I: Theoretical semiotics

    Energy Technology Data Exchange (ETDEWEB)

    Albus, J.; Meystel, A.; Quintero, R.

    1996-12-31

    This report contains the papers from the Proceedings of the 1996 International Multidisciplinary Conference - Theoretical Semiotics. General topics covered are: semiotics in biology: biologically inspired complex systems; intelligence in constructed complex systems; intelligence of learning and evolution; fuzzy logic and the mechanisms of generalization; information representation for decision making; semantic foundations; syntactics of intelligent systems: the kind of logic available; intelligence of recognition: the semiotic tools; and multiresolutional methods.

  16. Science Academies' Refresher Course on Theoretical Structural ...

    Indian Academy of Sciences (India)

    Science Academies' Refresher Course on Theoretical Structural Geology, Crystallography, Mineralogy, Thermodynamics, Experimental Petrology and Theoretical Geophysics. Information and Announcements, Resonance – Journal of Science Education, Volume 22, Issue 8, August 2017.

  17. Meta-Synthesis of Research on Information Seeking Behaviour

    Science.gov (United States)

    Urquhart, Christine

    2011-01-01

    Introduction: Meta-synthesis methods may help to make more sense of information behaviour research evidence. Aims and objectives: The objectives are to: 1) identify and examine the theoretical research strategies commonly used in information behaviour research; 2) discuss meta-synthesis methods that might be appropriate to the type of research…

  18. Group-theoretical method in the many-beam theory of electron diffraction

    International Nuclear Information System (INIS)

    Kogiso, Motokazu; Takahashi, Hidewo.

    1977-01-01

    A group-theoretical method is developed for the many-beam dynamical theory of the symmetric Laue case. When the incident wave is directed so that the Laue point lies on a symmetric position in the reciprocal lattice, the dispersion matrix in the fundamental equation can be reduced to a block diagonal form. The transformation matrix is composed of column vectors belonging to irreducible representations of the group of the incident wave vector. Without performing reduction, the reduced form of the dispersion matrix is determined from characters of representations. Practical application is made to the case of symmorphic crystals, where general reduced forms and all solvable examples are given in terms of some geometrical factors of reciprocal lattice arrangements. (auth.)
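The reduction described here can be miniaturized: a matrix that commutes with a two-beam swap symmetry becomes block (here fully) diagonal in the symmetry-adapted basis built from the group's irreducible representations. The 2×2 numbers below are a hypothetical toy, not an actual dispersion matrix.

```python
import math

# A matrix commuting with the swap (x, y) -> (y, x): equal diagonal, equal off-diagonal.
a, b = 2.0, 0.7
M = [[a, b], [b, a]]

# Rows of T are the symmetry-adapted basis vectors: the symmetric and antisymmetric
# combinations, i.e. the two one-dimensional irreps of the group {identity, swap}.
s = 1 / math.sqrt(2)
T = [[s, s], [s, -s]]

def transform(T, M):
    """Compute T M T^T for 2x2 matrices (no external libraries needed)."""
    TM = [[sum(T[i][k] * M[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
    return [[sum(TM[i][k] * T[j][k] for k in range(2)) for j in range(2)] for i in range(2)]

D = transform(T, M)   # diagonal with entries a+b and a-b
```

For many-beam diffraction the same projection onto irreps of the group of the incident wave vector splits a large dispersion matrix into smaller blocks, each solvable separately.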

  19. Information-Theoretic Approach May Shed a Light to a Better Understanding and Sustaining the Integrity of Ecological-Societal Systems under Changing Climate

    Science.gov (United States)

    Kim, J.

    2016-12-01

    Considering high levels of uncertainty, epistemological conflicts over facts and values, and a sense of urgency, normal paradigm-driven science will be insufficient to mobilize people and nations toward sustainability. The conceptual framework to bridge societal system dynamics with that of the natural ecosystems in which humanity operates remains deficient. The key to understanding their coevolution is to understand `self-organization.' An information-theoretic approach may shed light on a potential framework which enables us not only to bridge human and nature but also to generate useful knowledge for understanding and sustaining the integrity of ecological-societal systems. How can information theory help understand the interface between ecological systems and social systems? How to delineate self-organizing processes and ensure that they fulfil sustainability? How to evaluate the flow of information from data through models to decision-makers? These are the core questions posed by sustainability science, in which visioneering (i.e., the engineering of vision) is an essential framework. Yet visioneering has neither a quantitative measure nor an information-theoretic framework to work with and teach. This presentation is an attempt to accommodate the framework of self-organizing hierarchical open systems with visioneering into a common information-theoretic framework. A case study is presented with the UN/FAO's communal vision of climate-smart agriculture (CSA) which pursues a trilemma of efficiency, mitigation, and resilience. Challenges of delineating and facilitating self-organizing systems are discussed using transdisciplinary tools such as complex systems thinking, dynamic process network analysis and multi-agent systems modeling. Acknowledgments: This study was supported by the Korea Meteorological Administration Research and Development Program under Grant KMA-2012-0001-A (WISE project).

  20. Collecting Information for Rating Global Assessment of Functioning (GAF): Sources of Information and Methods for Information Collection.

    Science.gov (United States)

    I H, Monrad Aas

    2014-11-01

    Global Assessment of Functioning (GAF) is an assessment instrument that is known worldwide. It is widely used for rating the severity of illness. Results from evaluations in psychiatry should characterize the patients. Rating of GAF is based on collected information. The aim of the study is to identify the factors involved in collecting information that is relevant for rating GAF, and gaps in knowledge where it is likely that further development would play a role for improved scoring. A literature search was conducted with a combination of thorough hand search and search in the bibliographic databases PubMed, PsycINFO, Google Scholar, and Campbell Collaboration Library of Systematic Reviews. Collection of information for rating GAF depends on two fundamental factors: the sources of information and the methods for information collection. Sources of information are patients, informants, health personnel, medical records, letters of referral and police records about violence and substance abuse. Methods for information collection include the many different types of interview - unstructured, semi-structured, structured, interviews for Axis I and II disorders, semistructured interviews for rating GAF, and interviews of informants - as well as instruments for rating symptoms and functioning, and observation. The different sources of information, and methods for collection, frequently result in inconsistencies in the information collected. The variation in collected information, and lack of a generally accepted algorithm for combining collected information, is likely to be important for rated GAF values, but there is a fundamental lack of knowledge about the degree of importance. Research to improve GAF has not reached a high level. Rated GAF values are likely to be influenced by both the sources of information used and the methods employed for information collection, but the lack of research-based information about these influences is fundamental. Further development of…

  1. Method Engineering: Engineering of Information Systems Development Methods and Tools

    OpenAIRE

    Brinkkemper, J.N.; Brinkkemper, Sjaak

    1996-01-01

    This paper proposes the term method engineering for the research field of the construction of information systems development methods and tools. Some research issues in method engineering are identified. One major research topic in method engineering is discussed in depth: situational methods, i.e. the configuration of a project approach that is tuned to the project at hand. A language and support tool for the engineering of situational methods are discussed.

  2. Group theoretical methods in physics. [Tuebingen, July 18-22, 1977

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, P; Rieckers, A

    1978-01-01

    This volume comprises the proceedings of the 6th International Colloquium on Group Theoretical Methods in Physics, held at Tuebingen in July 1977. Invited papers were presented on the following topics: supersymmetry and graded Lie algebras; concepts of order and disorder arising from molecular physics; symplectic structures and many-body physics; symmetry breaking in statistical mechanics and field theory; automata and systems as examples of applied (semi-) group theory; renormalization group; and gauge theories. Summaries are given of the contributed papers, which can be grouped as follows: supersymmetry, symmetry in particles and relativistic physics; symmetry in molecular and solid state physics; broken symmetry and phase transitions; structure of groups and dynamical systems; representations of groups and Lie algebras; and general symmetries, quantization. Those individual papers in scope for the TIC data base are being entered from ATOMINDEX tapes. (RWR)

  3. Parenting Practices of Anxious and Non-Anxious Mothers: A Multi-method Multi-informant Approach

    Science.gov (United States)

    Drake, Kelly L.; Ginsburg, Golda S.

    2012-01-01

    Anxious and non-anxious mothers were compared on theoretically derived parenting and family environment variables (i.e., over-control, warmth, criticism, anxious modeling) using multiple informants and methods. Mother-child dyads completed questionnaires about parenting and were observed during an interactional task. Findings revealed that, after controlling for race and child anxiety, maternal anxiety was associated with less warmth and more anxious modeling based on maternal-report. However, maternal anxiety was not related to any parenting domain based on child-report or independent observer (IO) ratings. Findings are discussed in the context of the impact of maternal anxiety on parenting and suggest that child, rather than maternal, anxiety may have a greater influence on parental behavior. PMID:22639487

  4. Towards a Definition of Serendipity in Information Behaviour

    Science.gov (United States)

    Agarwal, Naresh Kumar

    2015-01-01

    Introduction: Serendipitous or accidental discovery of information has often been neglected in information behaviour models, which tend to focus on information seeking, a more goal-directed behaviour. Method: This theoretical paper seeks to map the conceptual space of serendipity in information behaviour and to arrive at a definition. This is done…

  5. Current and future prospects for the application of systematic theoretical methods to the study of problems in physical oceanography

    Energy Technology Data Exchange (ETDEWEB)

    Constantin, A., E-mail: adrian.constantin@kcl.ac.uk [Department of Mathematics, King's College London, Strand, London WC2R 2LS (United Kingdom); Faculty of Mathematics, University of Vienna, Oskar-Morgenstern-Platz 1, 1090 Vienna (Austria); Johnson, R.S., E-mail: r.s.johnson@ncl.ac.uk [School of Mathematics & Statistics, Newcastle University, Newcastle upon Tyne NE1 7RU (United Kingdom)

    2016-09-07

    Highlights: • Systematic theoretical methods in studies of equatorial ocean dynamics. • Linear wave-current interactions in stratified flows. • Exact solutions – Kelvin waves, azimuthal non-uniform currents. • Three-dimensional nonlinear currents. • Hamiltonian formulation for the governing equations and for structure-preserving/enhancing approximations. - Abstract: This essay is a commentary on the pivotal role of systematic theoretical methods in physical oceanography. At some level, there will always be a conflict between theory and experiment/data collection: Which is pre-eminent? Which should come first? This issue appears to be particularly marked in physical oceanography, to the extreme detriment of the development of the subject. It is our contention that the classical theory of fluids, coupled with methods from the theory of differential equations, can play a significant role in carrying the subject, and our understanding, forward. We outline the philosophy behind a systematic theoretical approach, highlighting some aspects of equatorial ocean dynamics where these methods have already been successful, paving the way for much more in the future and leading, we expect, to the better understanding of this and many other types of ocean flow. We believe that the ideas described here promise to reveal a rich and beautiful dynamical structure.

  6. A group theoretic approach to quantum information

    CERN Document Server

    Hayashi, Masahito

    2017-01-01

    This textbook is the first one addressing quantum information from the viewpoint of group symmetry. Quantum systems have a group symmetrical structure, and this structure enables quantum information processing to be handled systematically. However, there is no other textbook focusing on group symmetry for quantum information, although many textbooks on group representation exist. After the mathematical preparation of quantum information, this book discusses quantum entanglement and its quantification by using group symmetry. Group symmetry drastically simplifies the calculation of several entanglement measures, although these calculations are usually very difficult to handle. This book treats optimal information processes including quantum state estimation, quantum state cloning, estimation of group action, quantum channels, etc. Usually it is very difficult to derive optimal quantum information processes without the asymptotic setting of these topics. However, group symmetry allows one to derive these optimal solutions...

  7. Analysis of methods. [information systems evolution environment

    Science.gov (United States)

    Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity

  8. Development and content validation of the information assessment method for patients and consumers.

    Science.gov (United States)

    Pluye, Pierre; Granikov, Vera; Bartlett, Gillian; Grad, Roland M; Tang, David L; Johnson-Lafleur, Janique; Shulha, Michael; Barbosa Galvão, Maria Cristiane; Ricarte, Ivan Lm; Stephenson, Randolph; Shohet, Linda; Hutsul, Jo-Anne; Repchinsky, Carol A; Rosenberg, Ellen; Burnand, Bernard; Légaré, France; Dunikowski, Lynn; Murray, Susan; Boruff, Jill; Frati, Francesca; Kloda, Lorie; Macaulay, Ann; Lagarde, François; Doray, Geneviève

    2014-02-18

    Online consumer health information addresses health problems, self-care, disease prevention, and health care services and is intended for the general public. Using this information, people can improve their knowledge, participation in health decision-making, and health. However, there are no comprehensive instruments to evaluate the value of health information from a consumer perspective. We collaborated with information providers to develop and validate the Information Assessment Method for all (IAM4all) that can be used to collect feedback from information consumers (including patients), and to enable a two-way knowledge translation between information providers and consumers. Content validation steps were followed to develop the IAM4all questionnaire. The first version was based on a theoretical framework from information science, a critical literature review and prior work. Then, 16 laypersons were interviewed on their experience with online health information and specifically their impression of the IAM4all questionnaire. Based on the summaries and interpretations of interviews, questionnaire items were revised, added, and excluded, thus creating the second version of the questionnaire. Subsequently, a panel of 12 information specialists and 8 health researchers participated in an online survey to rate each questionnaire item for relevance, clarity, representativeness, and specificity. The result of this expert panel contributed to the third, current, version of the questionnaire. The current version of the IAM4all questionnaire is structured by four levels of outcomes of information seeking/receiving: situational relevance, cognitive impact, information use, and health benefits. Following the interviews and the expert panel survey, 9 questionnaire items were confirmed as relevant, clear, representative, and specific. To improve readability and accessibility for users with a lower level of literacy, 19 items were reworded and all inconsistencies in using a

  9. An information theoretic approach to use high-fidelity codes to calibrate low-fidelity codes

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Allison, E-mail: lewis.allison10@gmail.com [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Smith, Ralph [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Williams, Brian [Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Figueroa, Victor [Sandia National Laboratories, Albuquerque, NM 87185 (United States)

    2016-11-01

    For many simulation models, it can be prohibitively expensive or physically infeasible to obtain a complete set of experimental data to calibrate model parameters. In such cases, one can alternatively employ validated higher-fidelity codes to generate simulated data, which can be used to calibrate the lower-fidelity code. In this paper, we employ an information-theoretic framework to determine the reduction in parameter uncertainty that is obtained by evaluating the high-fidelity code at a specific set of design conditions. These conditions are chosen sequentially, based on the amount of information that they contribute to the low-fidelity model parameters. The goal is to employ Bayesian experimental design techniques to minimize the number of high-fidelity code evaluations required to accurately calibrate the low-fidelity model. We illustrate the performance of this framework using heat and diffusion examples, a 1-D kinetic neutron diffusion equation, and a particle transport model, and include initial results from the integration of the high-fidelity thermal-hydraulics code Hydra-TH with a low-fidelity exponential model for the friction correlation factor.
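For a linear-Gaussian surrogate, the expected information gain of one candidate high-fidelity evaluation has a closed form, and the sequential selection the abstract describes amounts to a greedy argmax over candidate design conditions. Everything below (the scalar model, the sensitivities, the noise levels, the condition names) is a stand-in assumption, not the paper's Hydra-TH setup.

```python
import math

def expected_info_gain(sensitivity, sigma_theta, sigma_noise):
    """Mutual information (nats) between a scalar parameter theta and one
    observation y = g*theta + noise under a linear-Gaussian model."""
    return 0.5 * math.log(1.0 + (sensitivity * sigma_theta) ** 2 / sigma_noise ** 2)

# Hypothetical candidate design conditions with assumed sensitivities dg/dtheta.
candidates = {"cond_A": 0.2, "cond_B": 1.5, "cond_C": 0.8}

# Greedy Bayesian design: run the high-fidelity code at the most informative condition.
best = max(candidates,
           key=lambda c: expected_info_gain(candidates[c], sigma_theta=1.0, sigma_noise=0.5))
```

After the chosen run, the posterior on the parameter is updated and the gains are recomputed, which is the sequential loop the abstract refers to.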

  10. INFANTILISM: THEORETICAL CONSTRUCT AND OPERATIONALIZATION

    Directory of Open Access Journals (Sweden)

    Yelena V. Sabelnikova

    2016-01-01

    Full Text Available The aim of the presented research is to define and theoretically operationalize the concept of infantilism and its construct. The content of the theoretical construct «infantilism» is analyzed. Methods. The methods of theoretical research involve analysis and synthesis. The age and content criteria are analysed in the context of childhood and adulthood. The traits which can be interpreted as adult infantile traits are described. Results. The characteristics of adult infantilism in the modern world, taking into account increasing information flows and socio-economic changes, are defined. A definition of the concept «infantilism», including its main features, is given: infantilism is the personal organization that includes features and models of a previous age period not adequate for the real age stage, with emphasis on immaturity of the emotional and volitional sphere. Scientific novelty. The main psychological characteristics of adulthood are described: reflection, the need for work and professional activity, professional self-determination, possession of labor skills, the need for self-realization, and maturity of the emotional and volitional sphere. The following are considered objective characteristics of adulthood: transition to economic and territorial independence from the parental family, and the adoption of new social roles such as worker, spouse, and parent. Two options for operationalizing the concept are distinguished: objective (the presence or absence in a person's real life of objective criteria of adulthood) and subjective (the self-report on the subjective feeling of the presence or lack of psychological characteristics of adulthood). The practical significance lies in the operationalization of the construct «infantilism», which currently has many competing interpretations; this operationalization is necessary for further analysis and for carrying out various research.

  11. Signal correlations in biomass combustion. An information theoretic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ruusunen, M.

    2013-09-01

    Increasing environmental and economic awareness are driving the development of combustion technologies towards efficient biomass use and clean burning. To accomplish these goals, quantitative information about combustion variables is needed. However, for small-scale combustion units the existing monitoring methods are often expensive or complex. This study aimed to quantify correlations between flue gas temperatures and combustion variables, namely typical emission components, heat output, and efficiency. For this, data acquired from four small-scale combustion units and a large circulating fluidised bed boiler was studied. The fuel range varied from wood logs, wood chips, and wood pellets to biomass residue. Original signals and a defined set of their mathematical transformations were applied to data analysis. In order to evaluate the strength of the correlations, a multivariate distance measure based on information theory was derived. The analysis further assessed time-varying signal correlations and relative time delays. Ranking of the analysis results was based on the distance measure. The uniformity of the correlations in the different data sets was studied by comparing the 10-quantiles of the measured signal. The method was validated with two benchmark data sets. The flue gas temperatures and the combustion variables measured carried similar information. The strongest correlations were mainly linear with the transformed signal combinations and explicable by the combustion theory. Remarkably, the results showed uniformity of the correlations across the data sets with several signal transformations. This was also indicated by simulations using a linear model with constant structure to monitor carbon dioxide in flue gas. Acceptable performance was observed according to three validation criteria used to quantify modelling error in each data set. In general, the findings demonstrate that the presented signal transformations enable real-time approximation of the studied…
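A multivariate, information-theory-based distance of the kind this work derives can be illustrated with the variation of information, D(X, Y) = 2H(X, Y) − H(X) − H(Y), computed on discretized signal values. The binning and the toy "flue gas" signals below are purely illustrative assumptions, not the thesis's actual measure or data.

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy (nats) of a discrete sample sequence."""
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in Counter(samples).values())

def variation_of_information(x, y):
    """Information-theoretic distance between two discretized signals.

    Zero iff the two signals induce the same partition of the samples;
    symmetric by construction.
    """
    return 2 * entropy(list(zip(x, y))) - entropy(x) - entropy(y)

temp = [0, 0, 1, 1, 2, 2]       # binned flue gas temperature (toy values)
co2  = [0, 0, 1, 1, 2, 2]       # a perfectly correlated combustion variable
o2   = [2, 0, 1, 2, 0, 1]       # a weakly related variable
```

A small distance between a temperature signal and an emission component is exactly the kind of evidence that supports the study's conclusion that cheap temperature measurements carry combustion information.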

  12. Teaching information seeking

    Directory of Open Access Journals (Sweden)

    Louise Limberg

    2006-01-01

    Full Text Available Introduction. The article argues for a closer association between information seeking research and the practices of teaching information seeking. Findings are presented from a research project on information seeking, didactics and learning (IDOL) investigating librarians' and teachers' experiences of teaching information seeking. Method. Thirteen teachers and five librarians, teaching 12-19 year-old students in three schools, participated. Forty-five interviews were conducted over a period of three years. Analysis. The IDOL project adopted a phenomenographic approach with the purpose of describing patterns of variation in experiences. The findings were also analysed by way of relating them to four competing approaches to the mediation of information literacy. Results. A gap was identified between experiences of teaching content that focused on sources and order, and experiences of assessment criteria applied to students' work that focused on the importance of correct facts and the analysis of information. These findings indicate a highly restricted range of teaching contents when compared with the four theoretical approaches to the mediation of information literacy. Conclusion. Teaching information seeking might be enhanced by a wider repertoire of contents reflecting the more varied theoretical understanding developed in information seeking research, particularly as regards the importance of content and context related to user perspectives.

  13. Application of geo-information science methods in ecotourism exploitation

    Science.gov (United States)

    Dong, Suocheng; Hou, Xiaoli

    2004-11-01

    The application of geo-information science methods in ecotourism development is discussed in this article. Since the 1990s, geo-information science methods, which take the 3S technologies (Geographic Information System, Global Positioning System, and Remote Sensing) as core techniques, have played an important role in resources reconnaissance, data management, environment monitoring, and regional planning. Geo-information science methods can easily analyze and convert geographic spatial data. The application of 3S methods is helpful to sustainable development in tourism. Various assignments are involved in the development of ecotourism, such as reconnaissance of ecotourism resources, drawing of tourism maps, and handling of mass data, as well as tourism information inquiry, employee management, and quality management of products. The utilization of geo-information methods in ecotourism can make development more efficient by promoting the sustainable development of tourism and the protection of the eco-environment.

  14. Derivation of Human Chromatic Discrimination Ability from an Information-Theoretical Notion of Distance in Color Space.

    Science.gov (United States)

    da Fonseca, María; Samengo, Inés

    2016-12-01

    The accuracy with which humans detect chromatic differences varies throughout color space. For example, we are far more precise when discriminating two similar orange stimuli than two similar green stimuli. In order for two colors to be perceived as different, the neurons representing chromatic information must respond differently, and the difference must be larger than the trial-to-trial variability of the response to each separate color. Photoreceptors constitute the first stage in the processing of color information; many more stages are required before humans can consciously report whether two stimuli are perceived as chromatically distinguishable. Therefore, although photoreceptor absorption curves are expected to influence the accuracy of conscious discriminability, there is no reason to believe that they should suffice to explain it. Here we develop information-theoretical tools based on the Fisher metric that demonstrate that photoreceptor absorption properties explain about 87% of the variance of human color discrimination ability, as tested by previous behavioral experiments. In the context of this theory, the bottleneck in chromatic information processing is determined by photoreceptor absorption characteristics. Subsequent encoding stages modify only marginally the chromatic discriminability at the photoreceptor level.
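
    The paper's full machinery is not reproduced in the abstract, but the core idea of an information-theoretic (Fisher) metric on color space can be sketched. Assuming independent Poisson photoreceptor channels, for which the Fisher metric on the mean absorption rates is ds^2 = sum_i dmu_i^2 / mu_i, a discrimination distance between two nearby stimuli can be computed by integrating this line element. The cone absorption rates below are invented for illustration.

```python
import numpy as np

# Hypothetical mean cone absorption rates (L, M, S) for two nearby stimuli.
mu_a = np.array([120.0, 95.0, 30.0])
mu_b = np.array([122.0, 94.0, 31.0])

def fisher_distance(mu1, mu2, steps=1000):
    # Fisher metric for independent Poisson channels: ds^2 = sum_i dmu_i^2 / mu_i.
    # Integrate the line element along the straight path between the stimuli.
    ts = np.linspace(0.0, 1.0, steps)
    mus = mu1[None, :] + ts[:, None] * (mu2 - mu1)[None, :]
    dmu = (mu2 - mu1) / steps
    ds = np.sqrt(np.sum(dmu**2 / mus, axis=1))
    return ds.sum()

print(fisher_distance(mu_a, mu_b))
```

    Regions of color space where the metric assigns a larger distance to the same physical stimulus change correspond to finer discrimination, which is how photoreceptor absorption properties can shape discriminability.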

  15. Entropy Maximization as a Basis for Information Recovery in Dynamic Economic Behavioral Systems

    Directory of Open Access Journals (Sweden)

    George Judge

    2015-02-01

    Full Text Available As a basis for information recovery in open dynamic microeconomic systems, we emphasize the connection between adaptive intelligent behavior, causal entropy maximization and self-organized equilibrium seeking behavior. This entropy-based causal adaptive behavior framework permits the use of information-theoretic methods as a solution basis for the resulting pure and stochastic inverse economic-econometric problems. We cast the information recovery problem in the form of a binary network and suggest information-theoretic methods to recover estimates of the unknown binary behavioral parameters without explicitly sampling the configuration-arrangement of the sample space.
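
    As a toy illustration of entropy maximization as an information-recovery principle (a generic Jaynes-style example, not the paper's causal-entropy framework), the sketch below recovers the maximum-entropy distribution on a finite support subject to a mean constraint, solving for the Lagrange multiplier by bisection.

```python
import numpy as np

def maxent_distribution(support, mean, lo=-50.0, hi=50.0, tol=1e-10):
    # Maximum-entropy distribution on a finite support with a fixed mean:
    # p_i proportional to exp(-lam * x_i); solve for lam by bisection on the
    # mean constraint (the implied mean is monotone decreasing in lam).
    x = np.asarray(support, dtype=float)
    def mean_for(lam):
        w = np.exp(-lam * (x - x.mean()))  # shifted for numerical stability
        p = w / w.sum()
        return p @ x
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = np.exp(-lam * (x - x.mean()))
    return w / w.sum()

p = maxent_distribution(range(6), mean=1.5)
print(p @ np.arange(6))  # ≈ 1.5
```

    Among all distributions consistent with the observed constraint, this is the least committed one, which is the sense in which entropy maximization serves as a basis for information recovery in inverse problems.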

  16. Studying collaborative information seeking: Experiences with three methods

    DEFF Research Database (Denmark)

    Hyldegård, Jette Seiden; Hertzum, Morten; Hansen, Preben

    2015-01-01

    …, however, benefit from a discussion of methodological issues. This chapter describes the application of three methods for collecting and analyzing data in three CIS studies. The three methods are Multidimensional Exploration, used in a CIS study of students' information behavior during a group assignment; Task-structured Observation, used in a CIS study of patent engineers; and Condensed Observation, used in a CIS study of information-systems development. The three methods are presented in the context of the studies for which they were devised, and the experiences gained using the methods are discussed. The chapter shows that different methods can be used for collecting and analyzing data about CIS incidents. Two of the methods focused on tasks and events in work settings, while the third was applied in an educational setting. Commonalities and differences among the methods are discussed to inform decisions…

  17. Theoretical Guidelines for the Development of Reading Comprehension Using Metacognitive Strategies

    Directory of Open Access Journals (Sweden)

    Yelitza del Carmen Morillo Terán

    2016-11-01

    Full Text Available The research aims to generate theoretical guidance for the development of reading comprehension through the use of metacognitive strategies at the IUTEMBI University Institute of Technology "Mario Briceño Iragorry", located in Valera, Trujillo state. The referential theories will be based on Pujol (2010) and Pinzás (2013), among others. A qualitative approach is adopted, developed through the phenomenological method supported by hermeneutics, which enables understanding of the phenomenon in its various manifestations. Key informants will be selected according to the criteria of qualitative research, on the basis that the teachers belong to the aforementioned institution and pledged to participate voluntarily in the research. Accordingly, three teachers and three students from different degree programmes offered by this university are regarded as informants. For the development of the study, an unstructured interview and audio recordings will be used as instruments, and the resulting units of analysis will be studied in depth. The findings allow the generation of the aforementioned theoretical orientations.

  18. Methods of Information Security Based on System Blurring

    Directory of Open Access Journals (Sweden)

    Mikhail Andreevich Styugin

    2016-03-01

    Full Text Available The paper presents a model of a researched system with its own known input, output and set of discrete internal states. Two theoretical objects are defined: a system absolutely protected from research and an absolutely indiscernible data transfer channel. A generalization of the Shannon secrecy principle is made, and the method of system blurring is defined. The theoretical cryptographic strength of the absolutely indiscernible data transfer channel is proved, and its practical unbreakability in the presence of an unreliable pseudo-random number generator is shown. The paper presents a system with channel blurring, named Pseudo IDTC, and shows the asymptotic complexity of breaking this system compared with AES and GOST.

  19. Agile Methods from the Viewpoint of Information

    Directory of Open Access Journals (Sweden)

    Eder Junior Alves

    2017-10-01

    Full Text Available Introduction: Since Paul M. G. Otlet highlighted the term documentation in 1934, proposing how to collect and organize the world's knowledge, much scientific research has directed its observations to the study of Information Science. Methods and techniques have emerged with a world view from the perspective of information. Agile methods follow this trend. Objective: The purpose is to analyze the relevance of information flow to organizations adopting agile methods, understanding how the innovation process is influenced by this practice. Methodology: This is a bibliometric study grounded in a Systematic Literature Review (SLR). The integration of the SLR technique with the Summarize tool is a new methodological proposal. Results: Scrum appears with the highest number of publications in SPELL. In comparison, results from Google Scholar pointed to the importance of practices and team behaviors. In the Science Direct repository, critical success factors in project management and software development are highlighted. Conclusions: It was evident that agile methods are being used as process innovations. The benefits and advantages are evident with the internal and external flow of information. Due to its prevalence in the literature, Scrum deserves attention by firms.

  20. Theoretical study (ab initio and DFT methods) on the acidic dissociation constant of xylenol orange in aqueous solution

    Directory of Open Access Journals (Sweden)

    F. Kiani

    2017-07-01

    Full Text Available Analytical measurement of materials requires exact knowledge of their acid dissociation constant (pKa) values. In recent years, quantum mechanical calculations have been used extensively to study acidities in aqueous solutions, and the results have been compared with experimental values. In this study, a theoretical study of xylenol orange (in water solution) was carried out by ab initio methods. We calculated the pKa values of xylenol orange in water using high-level ab initio (PM3), DFT (HF, B3LYP/6-31+G(d)) and SCRF methods. The experimental determination of these pKa values is a challenge because xylenol orange has a low solubility in water. We considered several ionization reactions and equilibria in water that constitute the indispensable theoretical basis for calculating the pKa values of xylenol orange. The results show that the calculated pKa values are in reasonable agreement with the experimentally determined pKa values. Therefore, this method can be used to predict such properties for indicators, drugs and other important molecules.
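
    The link between a computed deprotonation free energy and a pKa value follows from the thermodynamic relation ΔG = -RT ln K, i.e. pKa = ΔG / (RT ln 10). A minimal sketch of this final conversion step (the ΔG value below is hypothetical, not taken from the paper):

```python
import math

R = 8.314462618      # molar gas constant, J / (mol K)
T = 298.15           # standard temperature, K

def pka_from_dg(dg_kj_per_mol):
    # pKa = ΔG_deprotonation / (RT ln 10), with ΔG given in kJ/mol
    # (the aqueous-phase free energy change of the ionization reaction).
    return dg_kj_per_mol * 1000.0 / (R * T * math.log(10))

# A hypothetical computed deprotonation free energy of 36 kJ/mol:
print(round(pka_from_dg(36.0), 2))  # ≈ 6.31
```

    Errors of a few kJ/mol in the computed ΔG translate into roughly one pKa unit, which is why the choice of level of theory and solvation model matters so much in such studies.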

  1. An Information Theoretic Characterisation of Auditory Encoding

    Science.gov (United States)

    Overath, Tobias; Cusack, Rhodri; Kumar, Sukhbinder; von Kriegstein, Katharina; Warren, Jason D; Grube, Manon; Carlyon, Robert P; Griffiths, Timothy D

    2007-01-01

    The entropy metric derived from information theory provides a means to quantify the amount of information transmitted in acoustic streams like speech or music. By systematically varying the entropy of pitch sequences, we sought brain areas where neural activity and energetic demands increase as a function of entropy. Such a relationship is predicted to occur in an efficient encoding mechanism that uses less computational resource when less information is present in the signal: we specifically tested the hypothesis that such a relationship is present in the planum temporale (PT). In two convergent functional MRI studies, we demonstrated this relationship in PT for encoding, while furthermore showing that a distributed fronto-parietal network for retrieval of acoustic information is independent of entropy. The results establish PT as an efficient neural engine that demands less computational resource to encode redundant signals than those with high information content. PMID:17958472
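
    The entropy metric in question is the standard Shannon entropy of the symbol distribution of a sequence; a minimal sketch for pitch sequences (the example sequences are invented, with one pitch class per character):

```python
import math
from collections import Counter

def sequence_entropy(seq):
    # Plug-in estimate of Shannon entropy (bits per symbol) of a pitch sequence.
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

low = "CCCCCCCCCCCC"    # fully redundant: a single repeated pitch
high = "CDEFGABcdefg"   # maximally varied: every pitch distinct
print(sequence_entropy(low), sequence_entropy(high))  # 0.0 vs log2(12) ≈ 3.585
```

    Systematically varying this quantity across stimulus sequences is what allows neural activity to be regressed against information content, as in the fMRI studies described above.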

  2. Theoretical reflections on the paradigmatic construction of Information Science: considerations about the cognitive and social paradigm(s)

    Directory of Open Access Journals (Sweden)

    Jonathas Luiz Carvalho Silva

    2013-07-01

    Full Text Available This paper presents research on the theoretical and epistemological processes that influence the formation of the cognitive paradigm of Information Science (IS), noting the emergence of the social paradigm within domain analysis and the hermeneutics of information. For this, we adopted the reflections of classical and contemporary authors such as Thomas Kuhn, Boaventura Santos, Capurro, Hjørland and Albrechtsen. We conclude that the cognitive paradigm in IS is a consolidated issue, whereas the social paradigm is still under construction, which will allow the creation of perceptions, interpretations and contributions in order to fill gaps left by other paradigms.

  3. Theoretical study on the inverse modeling of deep body temperature measurement

    International Nuclear Information System (INIS)

    Huang, Ming; Chen, Wenxi

    2012-01-01

    We evaluated the theoretical aspects of monitoring the deep body temperature distribution with the inverse modeling method. A two-dimensional model was built based on anatomical structure to simulate the human abdomen. By integrating biophysical and physiological information, the deep body temperature distribution was estimated from cutaneous surface temperature measurements using an inverse quasilinear method. Simulations were conducted with and without the heat effect of blood perfusion in the muscle and skin layers. The results of the simulations showed consistently that the noise characteristics and arrangement of the temperature sensors were the major factors affecting the accuracy of the inverse solution. With temperature sensors of 0.05 °C systematic error and an optimized 16-sensor arrangement, the inverse method could estimate the deep body temperature distribution with an average absolute error of less than 0.20 °C. The results of this theoretical study suggest that it is possible to reconstruct the deep body temperature distribution with the inverse method and that this approach merits further investigation. (paper)
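
    The abstract does not spell out the inverse quasilinear method itself; the following is a generic sketch of the same class of problem, a Tikhonov-regularised linear inversion of noisy surface readings. The Gaussian-blur forward model, grid size, noise level and regularisation weight are all hypothetical stand-ins, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear forward model: cutaneous temperatures y are a blurred,
# noisy image of the deep temperature distribution x (A is a Gaussian
# smoothing matrix standing in for heat diffusion through tissue).
n = 20
idx = np.arange(n)
A = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / 3.0) ** 2)
A /= A.sum(axis=1, keepdims=True)
x_true = 36.5 + 0.8 * np.exp(-((idx - 10) / 4.0) ** 2)   # deep profile, °C
y = A @ x_true + rng.normal(scale=0.05, size=n)          # 0.05 °C sensor error

# Tikhonov-regularised inversion with a smoothness (second-difference) penalty:
# x_hat = argmin ||A x - y||^2 + alpha ||L x||^2
L = np.diff(np.eye(n), 2, axis=0)
alpha = 1e-3
x_hat = np.linalg.solve(A.T @ A + alpha * L.T @ L, A.T @ y)
print(np.abs(x_hat - x_true).mean())   # mean absolute reconstruction error, °C
```

    The regularisation term is what keeps sensor noise from being amplified into wild oscillations in the estimate, mirroring the paper's finding that sensor noise characteristics dominate the accuracy of the inverse solution.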

  4. Fuzzy Search Method for Higher Education Information Security

    Directory of Open Access Journals (Sweden)

    Grigory Grigorevich Novikov

    2016-03-01

    Full Text Available The main question of this research is how to use the fuzzy search method for the information security of higher education or similar purposes. Many sensitive information leaks occur through the legal publishing of non-classified documents, which is why many intelligence services are so fond of the «mosaic» information collection method. This article is about how to prevent such leaks.

  5. Theoretical methods for the calculation of the multiphoton ionisation cross-section of atoms and molecules

    International Nuclear Information System (INIS)

    Moccia, R.

    1991-01-01

    Some of the available theoretical methods to compute the two-photon ionisation cross-section of many-electron systems are reviewed. In particular, the problems concerning the computation of (i) reliable approximations for the transition matrix elements and the excitation energies; and (ii) accurate results pertaining to the electronic continuum by the use of L² basis functions are considered. (author). 29 refs., 6 figs., 1 tab

  6. Theoretical simulation of the dual-heat-flux method in deep body temperature measurements.

    Science.gov (United States)

    Huang, Ming; Chen, Wenxi

    2010-01-01

    Deep body temperature reveals individual physiological states, and is important in patient monitoring and chronobiological studies. An innovative dual-heat-flux method has been shown experimentally to be competitive with the conventional zero-heat-flow method in its performance, in terms of measurement accuracy and step response to changes in the deep temperature. We have utilized a finite element method to model and simulate the dynamic process of a dual-heat-flux probe in deep body temperature measurements, to validate the fundamental principles of the dual-heat-flux method theoretically, and to acquire a detailed quantitative description of the thermal profile of the dual-heat-flux probe. The simulation results show that the estimated deep body temperature is influenced by the ambient temperature (linearly, at a maximum rate of 0.03 °C/°C) and the blood perfusion rate. The corresponding depth of the estimated temperature in the skin and subcutaneous tissue layer is consistent when using the dual-heat-flux probe. Insights into improving the performance of the dual-heat-flux method, taking into account structural and geometric considerations, were discussed for further studies of dual-heat-flux probes.
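
    The abstract does not state the estimation formula, but the dual-heat-flux principle can be sketched from first principles: two sensor stacks with different thermal resistances both satisfy T_deep = Ts_i + q_i * R_tissue with the same unknown tissue resistance, so R_tissue can be eliminated between the two equations. All readings below are synthetic, fabricated from a known deep temperature for checking.

```python
def deep_body_temperature(ts1, q1, ts2, q2):
    # Dual-heat-flux principle: each stack i measures a skin temperature Ts_i
    # and a heat flux q_i, with T_deep = Ts_i + q_i * R_tissue for the same
    # unknown R_tissue. Eliminating R_tissue between the two equations gives:
    return ts1 + q1 * (ts1 - ts2) / (q2 - q1)

# Synthetic check: fabricate sensor readings from a known T_deep and R_tissue.
t_deep, r = 37.0, 0.05
q1, q2 = 20.0, 32.0                     # heat flux through each stack
ts1, ts2 = t_deep - q1 * r, t_deep - q2 * r
print(deep_body_temperature(ts1, q1, ts2, q2))  # ≈ 37.0
```

    The need for the two stacks to carry clearly different fluxes (q2 != q1) is one reason probe geometry matters, consistent with the structural considerations the simulation study raises.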

  7. Solid surfaces : some theoretical aspects

    International Nuclear Information System (INIS)

    Das, M.P.

    1978-01-01

    An appraisal of the current situation concerning some theoretical aspects of solid surfaces is presented. First, the characterization of surfaces, which involves the surface geometry and atomic composition of both clean and adsorbed surfaces, is discussed. Under this heading, the methods for determining the surface structure (such as low energy electron diffraction, field electron and field ion microscopy, photoemission spectroscopy and atomic scattering) and methods for determining the surface composition by Auger electron spectroscopy are outlined. In the second part, the emphasis is on the electronic structure of clean and adsorbed surfaces. Measurements of ultraviolet and X-ray photoelectron spectra are shown to yield information about the surface electronic structure. In this context, many-body effects such as shake-up and relaxation energies are discussed. Finally, the status of the theory in relation to experiments on angle-resolved and polarization-dependent photoemission is presented. (auth.)

  8. Informational Urbanism

    Directory of Open Access Journals (Sweden)

    Wolfgang G. Stock

    2015-10-01

    Full Text Available Contemporary and future cities are often labeled as "smart cities," "ubiquitous cities," "knowledge cities" and "creative cities." Informational urbanism includes all aspects of information and knowledge with regard to urban regions. "Informational city" is an umbrella term uniting the divergent trends of information-related city research. Informational urbanism is an interdisciplinary endeavor incorporating computer science and information science on the one side, and urbanism, architecture, (city) economics, and (city) sociology on the other. In our research project on informational cities, we visited more than 40 metropolises and smaller towns all over the world. In this paper, we sketch the theoretical background on a journey from Max Weber to the Internet of Things, introduce our research methods, and describe the main results on characteristics of informational cities as prototypical cities of the emerging knowledge society.

  9. Set-Theoretic Approach to Maturity Models

    DEFF Research Database (Denmark)

    Lasrado, Lester Allan

    Despite being widely accepted and applied, maturity models in Information Systems (IS) have been criticized for their lack of theoretical grounding, methodological rigor, empirical validations, and ignorance of multiple and non-linear paths to maturity. This PhD thesis focuses on addressing these criticisms by incorporating recent developments in configuration theory, in particular the application of set-theoretic approaches. The aim is to show the potential of employing a set-theoretic approach for maturity model research and to empirically demonstrate equifinal paths to maturity. Specifically, the thesis offers methodological guidelines consisting of detailed procedures to systematically apply set-theoretic approaches to maturity model research, and provides demonstrations of their application on three datasets. The thesis is a collection of six research papers that are written in a sequential manner. The first paper…
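
    A minimal sketch of one common set-theoretic measure used in such configurational analyses, Ragin's fuzzy-set consistency of "X is sufficient for Y" (the membership scores below are invented for illustration):

```python
import numpy as np

def consistency(x, y):
    # Ragin's fuzzy-set consistency of the subset relation X <= Y
    # ("X is sufficient for Y"): sum(min(x_i, y_i)) / sum(x_i).
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.minimum(x, y).sum() / x.sum()

# Hypothetical fuzzy membership scores of five firms in a condition
# (e.g. "high process maturity") and an outcome (e.g. "high performance").
x = np.array([0.9, 0.8, 0.7, 0.4, 0.2])
y = np.array([1.0, 0.9, 0.6, 0.5, 0.1])
print(round(consistency(x, y), 3))  # 0.933
```

    Configurations whose consistency exceeds a chosen threshold are retained as candidate (possibly equifinal) paths to the outcome, which is the style of analysis the thesis applies to maturity models.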

  10. Fundamental energy limits of SET-based Brownian NAND and half-adder circuits. Preliminary findings from a physical-information-theoretic methodology

    Science.gov (United States)

    Ercan, İlke; Suyabatmaz, Enes

    2018-06-01

    The saturation in the efficiency and performance scaling of conventional electronic technologies is bringing about the development of novel computational paradigms. Brownian circuits are among the promising alternatives that can exploit fluctuations to increase the efficiency of information processing in nanocomputing. A Brownian cellular automaton, where signals propagate randomly and are driven by local transition rules, can be made computationally universal by embedding arbitrary asynchronous circuits on it. One potential realization of such circuits is via single electron tunneling (SET) devices, since SET technology enables simulation of noise and fluctuations in a fashion similar to Brownian search. In this paper, we perform a physical-information-theoretic analysis of the efficiency limitations of Brownian NAND and half-adder circuits implemented using SET technology. The method employed here establishes a solid ground for studying the computational and physical features of this emerging technology on an equal footing, and yields fundamental lower bounds that provide valuable insights into how far its efficiency can be improved in principle. To provide a basis for comparison, we also analyze a NAND gate and half-adder circuit implemented in complementary metal oxide semiconductor technology to show how the fundamental bounds of the Brownian circuits compare against a conventional paradigm.

  11. A Theoretical Approach

    African Journals Online (AJOL)

    NICO

    L-rhamnose and L-fucose: A Theoretical Approach … L-rhamnose and L-fucose, by means of the Monte Carlo conformational search method. The energy of the conformers … which indicates an increased probability for the occurrence of …

  12. A Transient Fault Recognition Method for an AC-DC Hybrid Transmission System Based on MMC Information Fusion

    Directory of Open Access Journals (Sweden)

    Jikai Chen

    2016-12-01

    Full Text Available At present, research into the process of fault disturbance energy transfer in the modular multilevel converter based high voltage direct current system (HVDC-MMC) is still at a preliminary stage. An urgent problem is how to extract and analyze the fault features hidden in the MMC electrical information in further studies of the HVDC system. Aiming at the above, this article analyzes the influence of AC transient disturbances on the electrical signals of the MMC. At the same time, it is found that, after the discrete wavelet packet transform (DWPT), the energy distribution of the electrical signals in the MMC differs between arms within the same frequency bands. Renyi wavelet packet energy entropy (RWPEE) and Renyi wavelet packet time entropy (RWPTE) are proposed and applied to AC transient fault feature extraction from the electrical signals of the MMC. Using the feature extraction results of Renyi wavelet packet entropy (RWPE), a novel recognition method is put forward to recognize AC transient faults using information fusion technology. Theoretical analysis and experimental results show that the proposed method is able to recognize transient AC faults.
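
    A rough sketch of the Renyi wavelet-packet energy entropy idea (not the authors' exact RWPEE/RWPTE definitions): decompose a signal into frequency bands with a Haar wavelet packet, normalize the band energies into a distribution, and take a Renyi entropy of that distribution. A broadband transient should then score higher than a narrow-band steady signal. The signals and parameters are invented.

```python
import numpy as np

def haar_packet_energies(x, levels=3):
    # Full Haar wavelet-packet decomposition; return the relative energy of
    # each terminal node (2**levels frequency bands). Length of x must be
    # divisible by 2**levels.
    nodes = [np.asarray(x, dtype=float)]
    for _ in range(levels):
        nxt = []
        for node in nodes:
            a = (node[0::2] + node[1::2]) / np.sqrt(2)   # approximation half
            d = (node[0::2] - node[1::2]) / np.sqrt(2)   # detail half
            nxt += [a, d]
        nodes = nxt
    e = np.array([np.sum(n**2) for n in nodes])
    return e / e.sum()

def renyi_entropy(p, q=2.0):
    # Renyi entropy of order q (q != 1) of a probability vector, in bits.
    return np.log2(np.sum(p**q)) / (1.0 - q)

t = np.arange(1024) / 1024.0
steady = np.sin(2 * np.pi * 8 * t)   # narrow-band: energy in one band
fault = steady + 0.8 * np.random.default_rng(2).normal(size=t.size)  # broadband
print(renyi_entropy(haar_packet_energies(steady)),
      renyi_entropy(haar_packet_energies(fault)))
```

    A threshold on such an entropy feature (or fusion of several of them across converter arms) is the style of discriminator the paper builds for transient fault recognition.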

  13. Analysis of Emergency Information Management Research Hotspots Based on Bibliometric and Co-occurrence Analysis

    Directory of Open Access Journals (Sweden)

    Zou Qingyun

    2017-04-01

    Full Text Available [Purpose/significance] Emergency information management is an interdisciplinary field of emergency management and information management. Summarizing the major research output helps to strengthen the effective utilization of information resources in emergency management research, and provides references for the follow-up development and practical exploration of emergency information management research. [Method/process] By retrieving the relevant literature from CNKI, this paper used bibliometric and co-word clustering analysis methods to analyze the domestic emergency management research output. [Result/conclusion] Domestic emergency information management research mainly focuses on five hot topics: disaster emergency information management, crisis information disclosure, emergency information management systems, emergency response, and smart emergency management. China should strengthen the emergency management information base for future theoretical research, and build a theoretical framework for emergency information management.

  14. Value of information methods to design a clinical trial in a small population to optimise a health economic utility function.

    Science.gov (United States)

    Pearce, Michael; Hee, Siew Wan; Madan, Jason; Posch, Martin; Day, Simon; Miller, Frank; Zohar, Sarah; Stallard, Nigel

    2018-02-08

    Most confirmatory randomised controlled clinical trials (RCTs) are designed with specified power, usually 80% or 90%, for a hypothesis test conducted at a given significance level, usually 2.5% for a one-sided test. Approval of the experimental treatment by regulatory agencies is then based on the result of such a significance test with other information to balance the risk of adverse events against the benefit of the treatment to future patients. In the setting of a rare disease, recruiting sufficient patients to achieve conventional error rates for clinically reasonable effect sizes may be infeasible, suggesting that the decision-making process should reflect the size of the target population. We considered the use of a decision-theoretic value of information (VOI) method to obtain the optimal sample size and significance level for confirmatory RCTs in a range of settings. We assume the decision maker represents society. For simplicity we assume the primary endpoint to be normally distributed with unknown mean following some normal prior distribution representing information on the anticipated effectiveness of the therapy available before the trial. The method is illustrated by an application in an RCT in haemophilia A. We explicitly specify the utility in terms of improvement in primary outcome and compare this with the costs of treating patients, both financial and in terms of potential harm, during the trial and in the future. The optimal sample size for the clinical trial decreases as the size of the population decreases. For non-zero cost of treating future patients, either monetary or in terms of potential harmful effects, stronger evidence is required for approval as the population size increases, though this is not the case if the costs of treating future patients are ignored. 
Decision-theoretic VOI methods offer a flexible approach with both type I error rate and power (or equivalently trial sample size) depending on the size of the future population for
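
    A Monte-Carlo sketch of the decision-theoretic idea, with an entirely hypothetical utility (population benefit minus per-patient trial cost), prior, and approval rule; this is a toy model, not the authors' haemophilia application.

```python
import numpy as np

rng = np.random.default_rng(3)

def expected_net_gain(n, n_pop=2000.0, cost=1.0, z_crit=1.96, sims=20000):
    # Preposterior Monte Carlo: draw the true effect from its prior, simulate
    # the trial's z-statistic, approve if z > z_crit, and score the benefit to
    # the future population net of the cost of recruiting 2n patients.
    delta = rng.normal(0.3, 0.2, size=sims)        # prior on treatment effect
    se = np.sqrt(2.0 / n)                          # two-arm trial, outcome sd = 1
    z = rng.normal(delta / se, 1.0)                # simulated test statistic
    approve = z > z_crit
    gain = n_pop * np.where(approve, delta, 0.0)   # benefit to future patients
    return gain.mean() - 2.0 * n * cost            # minus total trial cost

candidates = [10, 25, 50, 100, 200, 400]
best = max(candidates, key=expected_net_gain)
print(best)
```

    Re-running the search with a smaller `n_pop` illustrates the paper's headline result: the sample size that maximizes expected net gain shrinks as the treatable population shrinks.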

  15. Theoretical analysis of integral neutron transport equation using collision probability method with quadratic flux approach

    International Nuclear Information System (INIS)

    Shafii, Mohammad Ali; Meidianti, Rahma; Wildian,; Fitriyani, Dian; Tongkukut, Seni H. J.; Arkundato, Artoto

    2014-01-01

    Theoretical analysis of the integral neutron transport equation using the collision probability (CP) method with a quadratic flux approach has been carried out. In general, the solution of the neutron transport equation using the CP method is performed with a flat flux approach. In this research, the CP method is implemented for a cylindrical nuclear fuel cell with the spatial mesh treated with a non-flat flux approach. This means that the neutron flux at any point in the nuclear fuel cell is considered to differ from point to point, following the distribution pattern of a quadratic flux. The result is presented here in the form of the quadratic flux, which gives a better understanding of the real conditions in the cell calculation and serves as a starting point for application in computational calculations

  16. Theoretical analysis of integral neutron transport equation using collision probability method with quadratic flux approach

    Energy Technology Data Exchange (ETDEWEB)

    Shafii, Mohammad Ali, E-mail: mashafii@fmipa.unand.ac.id; Meidianti, Rahma, E-mail: mashafii@fmipa.unand.ac.id; Wildian,, E-mail: mashafii@fmipa.unand.ac.id; Fitriyani, Dian, E-mail: mashafii@fmipa.unand.ac.id [Department of Physics, Andalas University Padang West Sumatera Indonesia (Indonesia); Tongkukut, Seni H. J. [Department of Physics, Sam Ratulangi University Manado North Sulawesi Indonesia (Indonesia); Arkundato, Artoto [Department of Physics, Jember University Jember East Java Indonesia (Indonesia)

    2014-09-30

    Theoretical analysis of the integral neutron transport equation using the collision probability (CP) method with a quadratic flux approach has been carried out. In general, the solution of the neutron transport equation using the CP method is performed with a flat flux approach. In this research, the CP method is implemented for a cylindrical nuclear fuel cell with the spatial mesh treated with a non-flat flux approach. This means that the neutron flux at any point in the nuclear fuel cell is considered to differ from point to point, following the distribution pattern of a quadratic flux. The result is presented here in the form of the quadratic flux, which gives a better understanding of the real conditions in the cell calculation and serves as a starting point for application in computational calculations.

  17. Information–theoretic implications of quantum causal structures

    DEFF Research Database (Denmark)

    Chaves, Rafael; Majenz, Christian; Gross, David

    2015-01-01

    It is a relatively new insight of classical statistics that empirical data can contain information about causation rather than mere correlation. First algorithms have been proposed that are capable of testing whether a presumed causal relationship is compatible with an observed distribution. However, no systematic method is known for treating such problems in a way that generalizes to quantum systems. Here, we describe a general algorithm for computing information-theoretic constraints on the correlations that can arise from a given causal structure, where we allow for quantum systems as well as classical random variables. The general technique is applied to two relevant cases: first, we show that the principle of information causality appears naturally in our framework and go on to generalize and strengthen it. Second, we derive bounds on the correlations that can occur in a networked architecture…
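
    One concrete example of an information-theoretic constraint of this kind (from the classical "triangle" causal structure studied in this line of work) is the inequality I(A;B) + I(A;C) <= H(A). The sketch below evaluates it on a perfectly three-way-correlated distribution, which violates it and is therefore incompatible with the triangle structure; the code is an illustration, not the paper's general algorithm.

```python
import numpy as np

def H(p, axes):
    # Shannon entropy (bits) of the marginal of joint distribution p over
    # the given set of axes.
    m = p.sum(axis=tuple(i for i in range(p.ndim) if i not in axes))
    m = m[m > 0]
    return float(-(m * np.log2(m)).sum())

def triangle_inequality_holds(p):
    # Entropic constraint for the triangle causal structure (no common
    # three-way ancestor): I(A;B) + I(A;C) <= H(A).
    I_ab = H(p, {0}) + H(p, {1}) - H(p, {0, 1})
    I_ac = H(p, {0}) + H(p, {2}) - H(p, {0, 2})
    return I_ab + I_ac <= H(p, {0}) + 1e-9

# Perfect three-way correlation (A = B = C, a fair shared coin):
ghz = np.zeros((2, 2, 2))
ghz[0, 0, 0] = ghz[1, 1, 1] = 0.5
print(triangle_inequality_holds(ghz))  # False: incompatible with the triangle
```

    The paper's contribution is a systematic way of deriving such entropic inequalities for arbitrary causal structures, including ones with quantum nodes.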

  18. Theoretical study of electron transfer mechanism in biological systems with a QM (MRSCI+DFT)/MM method

    Energy Technology Data Exchange (ETDEWEB)

    Takada, Toshikazu [Research Program for Computational Science, RIKEN 2-1, Hirosawa, Wako, Saitama 351-0198 (Japan)

    2007-07-15

    The goal of this project is to understand the charge separation mechanisms in biological systems using molecular orbital theories. Specifically, the charge separation in the photosynthetic reaction center is focused on, since its efficiency in the use of solar energy is extraordinary and the reason for it remains unknown. Here, a QM/MM theoretical scheme is employed to take into account the effects of the surrounding proteins on the pigments. To describe the excited electronic structures, a unified theory combining MRSCI and DFT is newly invented. For atoms in the MM space, a new sampling method based on statistical physics has also been created. Using this theoretical framework, the excited and positively charged states of the special pair, that is, the chlorophyll dimer, are planned to be calculated this year.

  19. Theoretical study of electron transfer mechanism in biological systems with a QM (MRSCI+DFT)/MM method

    International Nuclear Information System (INIS)

    Takada, Toshikazu

    2007-01-01

    The goal of this project is to understand the charge separation mechanisms in biological systems using molecular orbital theories. Specifically, the charge separation in the photosynthetic reaction center is focused on, since its efficiency in the use of solar energy is extraordinary and the reason for it remains unknown. Here, a QM/MM theoretical scheme is employed to take into account the effects of the surrounding proteins on the pigments. To describe the excited electronic structures, a unified theory combining MRSCI and DFT is newly invented. For atoms in the MM space, a new sampling method based on statistical physics has also been created. Using this theoretical framework, the excited and positively charged states of the special pair, that is, the chlorophyll dimer, are planned to be calculated this year

  20. Basic Theoretical Principles Pertaining to Thermal Protection of Oil Transformer

    Directory of Open Access Journals (Sweden)

    O. G. Shirokov

    2008-01-01

Full Text Available The paper formulates basic theoretical principles pertaining to thermal protection of an oil transformer in accordance with the classical theory of relay protection and the theory of diagnostics, with the purpose of unifying the terminological and analytical information presently available on this problem. A classification of abnormal thermal modes of an oil transformer, as well as algorithms and methods for the operation of diagnostic thermal protection of a transformer, is proposed.

  1. Unified Theoretical Frame of a Joint Transmitter-Receiver Reduced Dimensional STAP Method for an Airborne MIMO Radar

    Directory of Open Access Journals (Sweden)

    Guo Yiduo

    2016-10-01

Full Text Available The unified theoretical frame of a joint transmitter-receiver reduced dimensional Space-Time Adaptive Processing (STAP) method is studied for an airborne Multiple-Input Multiple-Output (MIMO) radar. First, based on the diverse characteristics of the transmitted waveform of the airborne MIMO radar, a uniform theoretical frame structure for reduced dimensional joint adaptive STAP is constructed. On this basis, three reduced dimensional STAP fixed structures are established. Finally, three reduced rank STAP algorithms suitable for a MIMO system are presented, corresponding to the three reduced dimensional STAP fixed structures. The simulations indicate that the joint adaptive algorithms have preferable clutter suppression and anti-interference performance.

  2. Discourse Analysis of the Documentary Method as "Key" to Self-Referential Communication Systems? Theoretic-Methodological Basics and Empirical Vignettes

    Directory of Open Access Journals (Sweden)

    Gian-Claudio Gentile

    2010-09-01

Full Text Available Niklas LUHMANN is well known for his deliberate departure from the classical focus on studying individual actions, directing attention instead to the actors' relatedness through so-called (autopoietic) communication systems. While this shift offers a new perspective of observation, the focus on autopoietic systems is simultaneously its biggest methodological obstacle for use in the social and management sciences. The present contribution considers the above shift on a theoretical level and with a specific qualitative method. It argues for a deeper understanding of systemic sense making and its enactment in a systematic and comprehensible way. Central to this approach is its focus on groups. Using group discussions as the method of data collection, and the "documentary method" of Ralf BOHNSACK (2003) as the method of data analysis, the article describes a methodologically grounded way to record the self-referential systems proposed by LUHMANN's system theory. The theoretical considerations of the paper are illustrated by empirical vignettes derived from a research project conducted in Switzerland concerning the social responsibility of business. URN: urn:nbn:de:0114-fqs1003156

  3. Value of information methods to design a clinical trial in a small population to optimise a health economic utility function

    Directory of Open Access Journals (Sweden)

    Michael Pearce

    2018-02-01

Full Text Available Abstract Background Most confirmatory randomised controlled clinical trials (RCTs) are designed with specified power, usually 80% or 90%, for a hypothesis test conducted at a given significance level, usually 2.5% for a one-sided test. Approval of the experimental treatment by regulatory agencies is then based on the result of such a significance test together with other information, balancing the risk of adverse events against the benefit of the treatment to future patients. In the setting of a rare disease, recruiting sufficient patients to achieve conventional error rates for clinically reasonable effect sizes may be infeasible, suggesting that the decision-making process should reflect the size of the target population. Methods We considered the use of a decision-theoretic value of information (VOI) method to obtain the optimal sample size and significance level for confirmatory RCTs in a range of settings. We assume the decision maker represents society. For simplicity we assume the primary endpoint to be normally distributed with unknown mean following some normal prior distribution representing information on the anticipated effectiveness of the therapy available before the trial. The method is illustrated by an application to an RCT in haemophilia A. We explicitly specify the utility in terms of improvement in primary outcome and compare this with the costs of treating patients, both financial and in terms of potential harm, during the trial and in the future. Results The optimal sample size for the clinical trial decreases as the size of the population decreases. For non-zero cost of treating future patients, either monetary or in terms of potential harmful effects, stronger evidence is required for approval as the population size increases, though this is not the case if the costs of treating future patients are ignored. Conclusions Decision-theoretic VOI methods offer a flexible approach with both type I error rate and power (or equivalently
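
The decision-theoretic idea — choose the sample size that maximises expected societal utility, trading the benefit to the future patient population against the cost of running the trial — can be sketched with a toy Monte Carlo model. All priors, costs, and the fixed one-sided z-test approval rule below are illustrative assumptions, not the paper's actual utility function:

```python
import math
import random

def expected_net_utility(n_per_arm, population, mu0=0.2, tau=0.3, sigma=1.0,
                         cost_per_patient=1.0, draws=2000):
    # Monte Carlo sketch: draw a true effect from the prior, simulate the
    # trial estimate, approve on a one-sided z-test, and credit the
    # (possibly negative) effect to every future patient if approved.
    random.seed(0)  # common random numbers across calls
    z_crit = 1.96
    se = sigma * math.sqrt(2 / n_per_arm)  # standard error of the estimate
    total = 0.0
    for _ in range(draws):
        delta = random.gauss(mu0, tau)       # true treatment effect
        estimate = random.gauss(delta, se)   # observed trial result
        approved = estimate / se > z_crit
        total += (delta * population if approved else 0.0)
        total -= cost_per_patient * 2 * n_per_arm  # cost of the trial itself
    return total / draws

sizes = range(10, 200, 10)
best_small = max(sizes, key=lambda n: expected_net_utility(n, population=1_000))
best_large = max(sizes, key=lambda n: expected_net_utility(n, population=100_000))
```

Consistent with the abstract's main result, the optimal sample size shrinks as the target population shrinks, because the fixed trial cost is amortised over fewer future patients.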

  4. 48 CFR 2905.101 - Methods of disseminating information.

    Science.gov (United States)

    2010-10-01

    ... information. 2905.101 Section 2905.101 Federal Acquisition Regulations System DEPARTMENT OF LABOR ACQUISITION PLANNING PUBLICIZING CONTRACT ACTIONS Dissemination of Information 2905.101 Methods of disseminating... dissemination of information concerning procurement actions. The Division of Acquisition Management Services...

  5. Assessment of two theoretical methods to estimate potentiometric titration curves of peptides: comparison with experiment.

    Science.gov (United States)

Makowska, Joanna; Bagińska, Katarzyna; Makowski, Mariusz; Jagielska, Anna; Liwo, Adam; Kasprzykowski, Franciszek; Chmurzyński, Lech; Scheraga, Harold A

    2006-03-09

We compared the ability of two theoretical methods of pH-dependent conformational calculations to reproduce experimental potentiometric titration curves of two model peptides: Ac-K5-NHMe in 95% methanol (MeOH)/5% water mixture and Ac-XX(A)7OO-NH2 (XAO) (where X is diaminobutyric acid, A is alanine, and O is ornithine) in water, methanol (MeOH), and dimethyl sulfoxide (DMSO), respectively. The titration curve of the former was taken from the literature, and the curve of the latter was determined in this work. The first theoretical method involves a conformational search using the electrostatically driven Monte Carlo (EDMC) method with a low-cost energy function (ECEPP/3 plus the SRFOPT surface-solvation model, assuming that all titratable groups are uncharged) and subsequent reevaluation of the free energy at a given pH with the Poisson-Boltzmann equation, considering variable protonation states. In the second procedure, molecular dynamics (MD) simulations are run with the AMBER force field and the generalized Born model of electrostatic solvation, and the protonation states are sampled during constant-pH MD runs. In all three solvents, the first pKa of XAO is strongly downshifted compared to the value for the reference compounds (ethylamine and propylamine, respectively); the water and methanol curves have one, and the DMSO curve has two jumps characteristic of remarkable differences in the dissociation constants of acidic groups. The predicted titration curves of Ac-K5-NHMe are in good agreement with the experimental ones; better agreement is achieved with the MD-based method. The titration curves of XAO in methanol and DMSO, calculated using the MD-based approach, trace the shape of the experimental curves, reproducing the pH jump, while those calculated with the EDMC-based approach and the titration curve in water calculated using the MD-based approach have smooth shapes characteristic of the titration of weak multifunctional acids with small differences

  6. Sharing information: Mixed-methods investigation of brief experiential interprofessional

    Science.gov (United States)

    Cocksedge, Simon; Barr, Nicky; Deakin, Corinne

In UK health policy, ‘sharing good information’ is pivotal to improving care quality, safety, and effectiveness. Nevertheless, educators often neglect this vital communication skill. The consequences of brief communication education interventions for healthcare workers are not yet established. This study investigated a three-hour interprofessional experiential workshop (group work, theoretical input, rehearsal) training healthcare staff in sharing information using a clear structure (PARSLEY). Staff in one UK hospital participated. Questionnaires were completed before, immediately after, and eight weeks after training, with semistructured interviews seven weeks after training. Participants (n=76) were from assorted healthcare occupations (26% non-clinical). Knowledge significantly increased immediately after training. Self-efficacy, outcome expectancy, and motivation to use the structure taught were significantly increased immediately following training and at eight weeks. Respondents at eight weeks (n=35) reported their practice in sharing information had changed within seven days of training. Seven weeks after training, most interviewees (n=13) reported confidently using the PARSLEY structure regularly in varied settings. All had re-evaluated their communication practice. Brief training altered self-reported communication behaviour of healthcare staff, with sustained changes in everyday work. As sharing information is central to communication curricula, health policy, and shared decision-making, the effectiveness of brief teaching interventions has economic and educational implications.

  7. A Theoretical Model of Health Information Technology Usage Behaviour with Implications for Patient Safety

    Science.gov (United States)

    Holden, Richard J.; Karsh, Ben-Tzion

    2009-01-01

    Primary objective: much research and practice related to the design and implementation of information technology in health care has been atheoretical. It is argued that using extant theory to develop testable models of health information technology (HIT) benefits both research and practice. Methods and procedures: several theories of motivation,…

  8. Canonical Information Analysis

    DEFF Research Database (Denmark)

    Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg

    2015-01-01

Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis, introduced here, linear correlation as a measure of association between variables is replaced by the information theoretical, entropy based measure mutual information, which is a much more general measure of association. We make canonical information analysis feasible for large sample problems, including for example multispectral images, due to the use of a fast kernel density estimator for entropy estimation. Canonical information analysis is applied successfully to (1) simple simulated data to illustrate the basic idea and evaluate performance, (2) fusion of weather radar and optical geostationary satellite data in a situation with heavy precipitation, and (3) change detection in optical...

  9. Staying theoretically sensitive when conducting grounded theory research.

    Science.gov (United States)

    Reay, Gudrun; Bouchal, Shelley Raffin; A Rankin, James

    2016-09-01

    Background Grounded theory (GT) is founded on the premise that underlying social patterns can be discovered and conceptualised into theories. The method and need for theoretical sensitivity are best understood in the historical context in which GT was developed. Theoretical sensitivity entails entering the field with no preconceptions, so as to remain open to the data and the emerging theory. Investigators also read literature from other fields to understand various ways to construct theories. Aim To explore the concept of theoretical sensitivity from a classical GT perspective, and discuss the ontological and epistemological foundations of GT. Discussion Difficulties in remaining theoretically sensitive throughout research are discussed and illustrated with examples. Emergence - the idea that theory and substance will emerge from the process of comparing data - and staying open to the data are emphasised. Conclusion Understanding theoretical sensitivity as an underlying guiding principle of GT helps the researcher make sense of important concepts, such as delaying the literature review, emergence and the constant comparative method (simultaneous collection, coding and analysis of data). Implications for practice Theoretical sensitivity and adherence to the GT research method allow researchers to discover theories that can bridge the gap between theory and practice.

  10. Corruption and Economic Development in Nigeria: A Theoretical ...

    African Journals Online (AJOL)

    Corruption and Economic Development in Nigeria: A Theoretical Review. ... By using a theoretical method of analysis, the study reveals that corruption has been a deterrent to economic development in Nigeria. ... Section two discusses the theoretical and conceptual issues in corruption and economic development. Section ...

  11. Method for gathering and summarizing internet information

    Energy Technology Data Exchange (ETDEWEB)

    Potok, Thomas E.; Elmore, Mark Thomas; Reed, Joel Wesley; Treadwell, Jim N.; Samatova, Nagiza Faridovna

    2010-04-06

    A computer method of gathering and summarizing large amounts of information comprises collecting information from a plurality of information sources (14, 51) according to respective maps (52) of the information sources (14), converting the collected information from a storage format to XML-language documents (26, 53) and storing the XML-language documents in a storage medium, searching for documents (55) according to a search query (13) having at least one term and identifying the documents (26) found in the search, and displaying the documents as nodes (33) of a tree structure (32) having links (34) and nodes (33) so as to indicate similarity of the documents to each other.
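
The final display step relies on ranking documents by mutual similarity. A minimal stand-in for that ranking, using cosine similarity over term counts (the patent's actual similarity measure is not reproduced here), might look like this:

```python
import math
from collections import Counter

def tf_vector(text):
    # Bag-of-words term-frequency vector
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-frequency vectors
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = {
    "d1": "reactor safety information systems",
    "d2": "reactor safety analysis",
    "d3": "stock market volatility",
}
query = tf_vector("reactor safety")

# Rank documents by similarity to the query; linking the most similar
# pairs of results would supply the edges of a tree-structure display.
ranked = sorted(docs, key=lambda d: cosine(query, tf_vector(docs[d])),
                reverse=True)
```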

  12. Risk-oriented internal control: The essence, management methods at small enterprises

    OpenAIRE

    Piskunov, V. A.; Manyayeva, V. A.; Tatarovskaya, T. E.; Bychkova, E. Y.

    2016-01-01

The relevance of the research topic stems from the necessity to develop theoretical and methodical provisions on the internal control system and risk-based management at small enterprises, and to prove the feasibility of its implementation using economic-mathematical methods. The purpose of this research is to develop theoretical and methodical approaches to the formation of an internal control system in small businesses, generating reliable and relevant information on the commercial organization's activities...

  13. Vector-Quantization using Information Theoretic Concepts

    DEFF Research Database (Denmark)

    Lehn-Schiøler, Tue; Hegde, Anant; Erdogmus, Deniz

    2005-01-01

The process of representing a large data set with a smaller number of vectors in the best possible way, also known as vector quantization, has been intensively studied in recent years. Very efficient algorithms like the Kohonen Self Organizing Map (SOM) and the Linde Buzo Gray (LBG) algorithm have been devised. In this paper a physical approach to the problem is taken, and it is shown that by considering the processing elements as points moving in a potential field an algorithm equally efficient as the before mentioned can be derived. Unlike SOM and LBG this algorithm has a clear physical interpretation and relies on minimization of a well defined cost-function. It is also shown how the potential field approach can be linked to information theory by use of the Parzen density estimator. In the light of information theory it becomes clear that minimizing the free energy of the system is in fact...

  14. Detecting Network Vulnerabilities Through Graph Theoretical Methods

    Energy Technology Data Exchange (ETDEWEB)

    Cesarz, Patrick; Pomann, Gina-Maria; Torre, Luis de la; Villarosa, Greta; Flournoy, Tamara; Pinar, Ali; Meza, Juan

    2007-09-30

    Identifying vulnerabilities in power networks is an important problem, as even a small number of vulnerable connections can cause billions of dollars in damage to a network. In this paper, we investigate a graph theoretical formulation for identifying vulnerabilities of a network. We first try to find the most critical components in a network by finding an optimal solution for each possible cutsize constraint for the relaxed version of the inhibiting bisection problem, which aims to find loosely coupled subgraphs with significant demand/supply mismatch. Then we investigate finding critical components by finding a flow assignment that minimizes the maximum among flow assignments on all edges. We also report experiments on IEEE 30, IEEE 118, and WSCC 179 benchmark power networks.
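
A brute-force stand-in for the graph-theoretic idea — an edge is critical if removing it disconnects the network — can be sketched as follows (the paper's inhibiting-bisection and min-max flow formulations are more sophisticated than this connectivity check):

```python
from collections import deque

def connected(nodes, edges):
    # Breadth-first search to test whether the graph is one component
    if not nodes:
        return True
    adj = {n: [] for n in nodes}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    seen = {next(iter(nodes))}
    queue = deque(seen)
    while queue:
        for m in adj[queue.popleft()]:
            if m not in seen:
                seen.add(m)
                queue.append(m)
    return len(seen) == len(nodes)

# Toy network: two well-connected clusters joined by a single tie line
nodes = {1, 2, 3, 4, 5, 6}
edges = [(1, 2), (2, 3), (1, 3), (4, 5), (5, 6), (4, 6), (3, 4)]

# An edge is flagged as critical if its removal splits the network
critical = [e for e in edges
            if not connected(nodes, [x for x in edges if x != e])]
```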

  15. Information needs and risk perception as predictors of risk information seeking

    NARCIS (Netherlands)

    ter Huurne, E.F.J.; Gutteling, Jan M.

    2008-01-01

This paper introduces a theoretical framework that describes the importance of the public's information sufficiency, risk perception, and self-efficacy as predictors of intended risk information seeking behaviour. Based on theoretical assumptions, measurement instruments for relevant concepts were

  16. Methods of determining information needs for control

    Energy Technology Data Exchange (ETDEWEB)

    Borkowski, Z.

    1980-01-01

Work has begun in the Main Data Center in the field of mining (Poland) on evaluating and improving methods of determining the information requirements necessary for control. Existing methods are briefly surveyed and their imperfections are shown. The complexity of the characteristics of this problem is pointed out.

  17. Developing theory-informed behaviour change interventions to implement evidence into practice: a systematic approach using the Theoretical Domains Framework

    Directory of Open Access Journals (Sweden)

    French Simon D

    2012-04-01

Full Text Available Abstract Background There is little systematic operational guidance about how best to develop complex interventions to reduce the gap between practice and evidence. This article is one in a Series of articles documenting the development and use of the Theoretical Domains Framework (TDF) to advance the science of implementation research. Methods The intervention was developed considering three main components: theory, evidence, and practical issues. We used a four-step approach, consisting of guiding questions, to direct the choice of the most appropriate components of an implementation intervention: Who needs to do what, differently? Using a theoretical framework, which barriers and enablers need to be addressed? Which intervention components (behaviour change techniques) and mode(s) of delivery could overcome the modifiable barriers and enhance the enablers? And how can behaviour change be measured and understood? Results A complex implementation intervention was designed that aimed to improve acute low back pain management in primary care. We used the TDF to identify the barriers and enablers to the uptake of evidence into practice and to guide the choice of intervention components. These components were then combined into a cohesive intervention. The intervention was delivered via two facilitated interactive small group workshops. We also produced a DVD to distribute to all participants in the intervention group. We chose outcome measures in order to assess the mediating mechanisms of behaviour change. Conclusions We have illustrated a four-step systematic method for developing an intervention designed to change clinical practice based on a theoretical framework. The method of development provides a systematic framework that could be used by others developing complex implementation interventions. While this framework should be iteratively adjusted and refined to suit other contexts and settings, we believe that the four-step process should be

  18. Theoretical explanations for maintenance of behaviour change: a systematic review of behaviour theories

    OpenAIRE

    Kwasnicka, Dominika; Dombrowski, Stephan U; White, Martin; Sniehotta, Falko

    2016-01-01

    ABSTRACT Background: Behaviour change interventions are effective in supporting individuals in achieving temporary behaviour change. Behaviour change maintenance, however, is rarely attained. The aim of this review was to identify and synthesise current theoretical explanations for behaviour change maintenance to inform future research and practice. Methods: Potentially relevant theories were identified through systematic searches of electronic databases (Ovid MEDLINE, Embase, PsycINFO). In a...

  19. A Theoretical Model of Resource-Oriented Music Therapy with Informal Hospice Caregivers during Pre-Bereavement.

    Science.gov (United States)

    Potvin, Noah; Bradt, Joke; Ghetti, Claire

    2018-03-09

    Over the past decade, caregiver pre-bereavement has received increased scholarly and clinical attention across multiple healthcare fields. Pre-bereavement represents a nascent area for music therapy to develop best practices in and an opportunity to establish clinical relevancy in the interdisciplinary team. This study was an exploratory inquiry into the role of music therapy with pre-bereaved informal hospice caregivers. This study intended to articulate (a) what pre-bereavement needs are present for informal hospice caregivers, (b) which of those needs were addressed in music, and (c) the process by which music therapy addressed those needs. A constructivist grounded theory methodology using situational analysis was used. We interviewed 14 currently bereaved informal hospice caregivers who had participated in music therapy with the care recipient. Analysis resulted in a theoretical model of resource-oriented music therapy promoting caregiver resilience. The resource, caregivers' stable caring relationships with care recipients through their pre-illness identities (i.e., spouse, parent, or child), is amplified through music therapy. Engagement with this resource mediates the risk of increased care burden and results in resilience fostering purposefulness and value in caregiving. Resource-oriented music therapy provides a unique clinical avenue for supporting caregivers through pre-bereavement, and was acknowledged by caregivers as a unique and integral hospice service. Within this model, caregivers are better positioned to develop meaning from the experience of providing care through the death of a loved one.

  20. Information in medical treatment courses

    DEFF Research Database (Denmark)

    Møller, Marianne; Hollnagel, Erik; Andersen, Stig Ejdrup

Background Unintended events and suboptimal treatment with medicines are major burdens for patients and health systems all over the world. Information processes have important roles in establishing safe and effective treatment courses. The platform for this Ph.D. study is learning from situations that go well (Safety-II) while having a broad understanding of quality. Objectives The overall purpose is to investigate how information is used as a steering tool for quality in medical treatment courses. In this first part of the study, the role of information on medicine is analyzed in relation to the quality of medical treatment courses. Methods Systems theory, cybernetics (steering, timing and feedback) and a classic communication model are applied as theoretical frames. Two groups of patients and their information providers are studied using qualitative methods. The data analysis focuses...

  1. THEORETICAL APPROACHES TO THE DEFINITION OF THE "INFORMATION RESOURCE"

    OpenAIRE

    I. Netreba

    2014-01-01

Existing approaches to determining the nature of the category "information resource" are detailed and systematized. The relationships between the categories "information resource", "information technology", and "information management system" are revealed. The importance of information resources for the production process at the enterprise is determined.

  2. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated to the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
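
The rank-based selection idea the abstract describes — discretize a descriptor into equal-interval bins, then score it by an information-theoretic criterion such as information gain — can be sketched as follows (a generic illustration, not IMMAN's actual implementation):

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy of a label list, in bits
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def discretize(values, bins=3):
    # Equal-interval discretization, as used for the IMMAN-style parameters
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0
    return [min(int((v - lo) / width), bins - 1) for v in values]

def information_gain(feature, labels, bins=3):
    # H(Y) minus the bin-weighted conditional entropy H(Y | binned feature)
    binned = discretize(feature, bins)
    gain, n = entropy(labels), len(labels)
    for b in set(binned):
        subset = [y for x, y in zip(binned, labels) if x == b]
        gain -= len(subset) / n * entropy(subset)
    return gain

y = [0, 0, 0, 1, 1, 1]
informative = [0.1, 0.2, 0.15, 0.9, 0.85, 0.95]  # separates the classes
noisy = [0.5, 0.1, 0.9, 0.4, 0.2, 0.8]           # does not
```

Ranking descriptors by such scores gives exactly the kind of supervised feature ranking the software exposes; here `information_gain(informative, y)` is 1 bit while the noisy feature scores 0.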

  3. A human-machine interface evaluation method: A difficulty evaluation method in information searching (DEMIS)

    International Nuclear Information System (INIS)

    Ha, Jun Su; Seong, Poong Hyun

    2009-01-01

A human-machine interface (HMI) evaluation method, named the 'difficulty evaluation method in information searching (DEMIS)', is proposed and demonstrated with an experimental study. The DEMIS is based on a human performance model and two measures of attentional-resource effectiveness in monitoring and detection tasks in nuclear power plants (NPPs). Operator competence and HMI design are modeled as the most significant factors in human performance. One of the two effectiveness measures is the fixation-to-importance ratio (FIR), which represents the attentional resource (eye fixations) spent on an information source compared to the importance of that information source. The other measure is selective attention effectiveness (SAE), which incorporates the FIRs for all information sources. The underlying principle of the measures is that an information source should be selectively attended to according to its informational importance. In this study, poor performance in information searching tasks is modeled to be coupled with difficulties caused by poor mental models of operators and/or poor HMI design. Human performance in information searching tasks is evaluated by analyzing the FIR and the SAE. Operator mental models are evaluated by a questionnaire-based method. Difficulties caused by poor HMI design are then evaluated by a focused interview based on the FIR evaluation, and root causes leading to poor performance are identified in a systematic way.
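
A hypothetical numeric illustration of the FIR idea: compare the share of eye fixations an information source receives with its share of informational importance. The source names, numbers, and the deviation-based aggregate used below are illustrative assumptions, not the paper's exact SAE formula:

```python
def fir(fixations, importance, source):
    # Fixation-to-importance ratio: share of fixations on a source
    # divided by its share of total importance (assumed formulation)
    fix_share = fixations[source] / sum(fixations.values())
    imp_share = importance[source] / sum(importance.values())
    return fix_share / imp_share

# Hypothetical monitoring task with three information sources
fixations = {"alarm_panel": 50, "trend_display": 30, "status_lamp": 20}
importance = {"alarm_panel": 0.5, "trend_display": 0.3, "status_lamp": 0.2}

# FIR near 1 means attention matches importance; one simple aggregate
# (an assumption, standing in for SAE) penalises deviations from 1.
sae = 1 - sum(abs(fir(fixations, importance, s) - 1)
              for s in fixations) / len(fixations)
```

In this contrived case attention exactly tracks importance, so every FIR is 1 and the aggregate takes its best value.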

  4. Studying Economic Space: Synthesis of Balance and Game-Theoretic Methods of Modelling

    Directory of Open Access Journals (Sweden)

    Natalia Gennadyevna Zakharchenko

    2015-12-01

Full Text Available The article addresses questions about the development of models used to study economic space. The author proposes a model that combines balance and game-theoretic methods for estimating the system effects of economic agents' interactions in multi-level economic space. The model is applied to study interactions between economic agents that are spatially heterogeneous within the Russian Far East. In the model, the economic space of a region is considered in a territorial dimension (the first level of decomposing space) and also in territorial and product dimensions (the second level of decomposing space). The paper shows the mechanism of system effects formation in the economic space of a region. The author estimates system effects, analyses the real allocation of these effects between economic agents and identifies three types of local industrial markets: with zero, positive and negative system effects

  5. Computer programs of information processing of nuclear physical methods as a demonstration material in studying nuclear physics and numerical methods

    Science.gov (United States)

    Bateev, A. B.; Filippov, V. P.

    2017-01-01

The possibility in principle of using the computer program Univem MS for Mössbauer spectra fitting as demonstration material in studying such disciplines as atomic and nuclear physics and numerical methods is shown in the article. This program works with nuclear-physical parameters such as the isomer (or chemical) shift of the nuclear energy level, the interaction of the nuclear quadrupole moment with the electric field, and that of the magnetic moment with the surrounding magnetic field. The basic processing algorithm in such programs is the least-squares method. The deviation of the values of experimental points on spectra from the value of the theoretical dependence is determined with concrete examples; this value is characterized in numerical methods as the mean square deviation. The shape of the theoretical lines in the program is defined by Gaussian and Lorentzian distributions. The visualization of the studied material on atomic and nuclear physics can be improved by similar programs for Mössbauer spectroscopy, X-ray fluorescence analysis or X-ray diffraction analysis.
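
The least-squares machinery described here — minimising the mean square deviation between experimental points and a Lorentzian line shape — can be illustrated with a brute-force parameter scan on synthetic data (Univem MS itself is not involved; line-shape parameters are invented for the example):

```python
import math

def lorentzian(v, center, width, depth):
    # Absorption-line shape of the kind fitted to Moessbauer spectra;
    # `center` plays the role of the isomer shift
    return 1.0 - depth * (width / 2) ** 2 / ((v - center) ** 2 + (width / 2) ** 2)

def mean_square_deviation(velocities, counts, center, width, depth):
    # The quantity the least-squares method minimises
    return sum((c - lorentzian(v, center, width, depth)) ** 2
               for v, c in zip(velocities, counts)) / len(velocities)

velocities = [i / 10 - 2 for i in range(41)]                   # -2 .. 2 mm/s
counts = [lorentzian(v, 0.3, 0.25, 0.4) for v in velocities]   # synthetic "data"

# Brute-force least-squares scan over the line-centre parameter
best = min((c / 100 - 2 for c in range(401)),
           key=lambda c: mean_square_deviation(velocities, counts, c, 0.25, 0.4))
```

The scan recovers the centre at 0.3 mm/s, the value used to generate the synthetic spectrum; a real fitter refines all parameters simultaneously rather than scanning one.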

  6. Financial time series analysis based on information categorization method

    Science.gov (United States)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

The paper applies the information categorization method to analyze financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply this method to quantify the similarity of different stock markets, and we report the results for the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show the difference in similarity between different stock markets in different time periods, and that the similarity of the two stock markets became larger after these two crises. We also obtain similarity results for 10 stock indices in three areas; the method can distinguish the markets of different areas from the phylogenetic trees. The results show that we can get satisfactory information from financial markets by this method. The information categorization method can be used not only on physiologic time series, but also on financial time series.
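
The pipeline — symbolize each series, describe it by the distribution of short symbol words, and compare markets by a distance between those distributions — can be sketched as follows. The Euclidean distance used here is a simple stand-in; the paper's information-based distance is not reproduced:

```python
import math
from collections import Counter

def symbolize(prices):
    # Encode each day as an up/down move; words of length 2 capture
    # short-range patterns in the series
    moves = ["u" if b > a else "d" for a, b in zip(prices, prices[1:])]
    return ["".join(moves[i:i + 2]) for i in range(len(moves) - 1)]

def distribution(words):
    total = len(words)
    return {w: c / total for w, c in Counter(words).items()}

def distance(p, q):
    # Euclidean distance between word distributions (a stand-in for the
    # information categorization distance)
    keys = set(p) | set(q)
    return math.sqrt(sum((p.get(k, 0) - q.get(k, 0)) ** 2 for k in keys))

trending_a = [1, 2, 3, 4, 5, 6, 7, 8]   # two markets moving alike
trending_b = [2, 3, 4, 5, 6, 7, 8, 9]
choppy = [1, 2, 1, 2, 1, 2, 1, 2]       # a market behaving differently

d_similar = distance(distribution(symbolize(trending_a)),
                     distribution(symbolize(trending_b)))
d_different = distance(distribution(symbolize(trending_a)),
                       distribution(symbolize(choppy)))
```

Feeding such a pairwise distance matrix to hierarchical clustering yields the phylogenetic trees the paper uses to separate markets by area.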

  7. Interactive Nonlinear Multiobjective Optimization Methods

    OpenAIRE

    Miettinen, Kaisa; Hakanen, Jussi; Podkopaev, Dmitry

    2016-01-01

    An overview of interactive methods for solving nonlinear multiobjective optimization problems is given. In interactive methods, the decision maker progressively provides preference information so that the most satisfactory Pareto optimal solution can be found for him or her. The basic features of several methods are introduced and some theoretical results are provided. In addition, references to modifications and applications as well as to other methods are indicated. As the...

  8. Parameters and error of a theoretical model

    International Nuclear Information System (INIS)

    Moeller, P.; Nix, J.R.; Swiatecki, W.

    1986-09-01

    We propose a definition for the error of a theoretical model of the type whose parameters are determined from adjustment to experimental data. By applying a standard statistical method, the maximum-likelihood method, we derive expressions for both the parameters of the theoretical model and its error. We investigate the derived equations by solving them for simulated experimental and theoretical quantities generated by use of random number generators. 2 refs., 4 tabs
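The procedure sketched in the abstract, fitting model parameters to simulated data by maximum likelihood and reading off a model error, can be illustrated as follows. For a Gaussian error model, maximum likelihood reduces to least squares; all numbers are invented for the simulation:

```python
import numpy as np

rng = np.random.default_rng(2)
# Simulated "experimental" quantities: a linear trend plus an intrinsic
# model error sigma_th, generated with a random number generator.
x = np.linspace(0, 10, 200)
sigma_th = 0.5
y = 2.0 + 1.5 * x + rng.normal(scale=sigma_th, size=x.size)

# Maximum likelihood for a Gaussian error model reduces to least squares:
A = np.vstack([np.ones_like(x), x]).T
theta, *_ = np.linalg.lstsq(A, y, rcond=None)   # theta = [intercept, slope]
resid = y - A @ theta
# ML estimate of the model error (here without subtracting experimental error):
sigma_hat = float(np.sqrt(np.mean(resid ** 2)))
```

Solving for simulated data, as the authors do, checks that the recovered parameters and error estimate reproduce the values used to generate the data.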

  9. Information decomposition method to analyze symbolical sequences

    International Nuclear Information System (INIS)

    Korotkov, E.V.; Korotkova, M.A.; Kudryashov, N.A.

    2003-01-01

    The information decomposition (ID) method to analyze symbolical sequences is presented. This method allows us to reveal a latent periodicity of any symbolical sequence. The ID method is shown to have advantages in comparison with application of the Fourier transformation, the wavelet transform and the dynamic programming method to look for latent periodicity. Examples of the latent periods for poetic texts, DNA sequences and amino acids are presented. Possible origin of a latent periodicity for different symbolical sequences is discussed
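One simple stand-in for detecting latent periodicity is the mutual information between symbol identity and position modulo a trial period. The sketch below is not the authors' exact ID statistic, but it shows how a periodic sequence scores far above a random one:

```python
import numpy as np
from collections import Counter

def period_information(seq, period):
    """Mutual information (bits) between symbol identity and position mod `period`.
    A crude stand-in for the information-decomposition statistic."""
    n = len(seq)
    joint = Counter((s, i % period) for i, s in enumerate(seq))
    p_sym, p_pos = Counter(), Counter()
    for (s, r), c in joint.items():
        p_sym[s] += c
        p_pos[r] += c
    info = 0.0
    for (s, r), c in joint.items():
        info += (c / n) * np.log2(c * n / (p_sym[s] * p_pos[r]))
    return float(info)

rng = np.random.default_rng(5)
periodic = "ABC" * 300                                    # perfect period 3
random_seq = "".join(rng.choice(list("ABC"), size=900))   # no latent period

i3_periodic = period_information(periodic, 3)   # close to log2(3) ~ 1.58 bits
i3_random = period_information(random_seq, 3)   # close to 0
```

Scanning the trial period over a range of values and keeping the peaks is the usual way such a statistic is turned into a periodicity detector.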

  10. Research methods in information

    CERN Document Server

    Pickard, Alison Jane

    2013-01-01

    The long-awaited 2nd edition of this best-selling research methods handbook is fully updated and includes brand new coverage of online research methods and techniques, mixed methodology and qualitative analysis. There is an entire chapter contributed by Professor Julie McLeod, Sue Childs and Elizabeth Lomas focusing on research data management, applying evidence from the recent JISC funded 'DATUM' project. The first to focus entirely on the needs of the information and communications community, it guides the would-be researcher through the variety of possibilities open to them under the heading "research" and provides students with the confidence to embark on their dissertations. The focus here is on the 'doing' and although the philosophy and theory of research is explored to provide context, this is essentially a practical exploration of the whole research process with each chapter fully supported by examples and exercises tried and tested over a whole teaching career. The book will take readers through eac...

  11. INFORMATION CULTURE AND INFORMATION SAFETY OF SCHOOLCHILDREN

    Directory of Open Access Journals (Sweden)

    E. G. Belyakova

    2017-01-01

    Full Text Available Introduction. The article is devoted to the problem of the interaction between schoolchildren and the informational risks transmitted via the Internet. Given the lack of external filters on the path of harmful information streams, it is necessary to develop the information culture of schoolchildren: their ability to interpret information on the Internet sensibly and critically, and to choose adequate behaviour models when surfing the Web. The aim of the present research is to analyze the state of informational safety of schoolchildren while using the Internet, and to understand the role of external restrictions and of intrapersonal filtration of harmful Internet content depending on the children's age. Methodology and research methods. The methodology of the research is based on modern methods for considering the problem of personal socialization in the modern information society. A questionnaire of the Internet Initiatives Development Fund (IIDF let the authors define the level of awareness of recipients of the problem under consideration. Results and scientific novelty. The theoretical analysis helped the authors predict how the basic methods of guaranteeing the personal safety of schoolchildren should be combined, taking into account the process of maturing and the accompanying decrease of the external filters that may stop harmful content. The empirical part of the research revealed that external control over a child's presence in the network decreases as the child grows up, against a background of prevalent restrictive attitudes among teachers and parents. The research therefore proposes improving the information culture of schoolchildren from the earliest ages, encouraging them to interpret information on the Internet sensibly and correctly. Practical significance. Practical recommendations to parents and teachers for improving the informational personal safety of schoolchildren are proposed. The relevancy

  12. An Improved Information Hiding Method Based on Sparse Representation

    Directory of Open Access Journals (Sweden)

    Minghai Yao

    2015-01-01

    Full Text Available A novel biometric authentication information hiding method based on the sparse representation is proposed for enhancing the security of biometric information transmitted in the network. In order to make good use of abundant information of the cover image, the sparse representation method is adopted to exploit the correlation between the cover and biometric images. Thus, the biometric image is divided into two parts. The first part is the reconstructed image, and the other part is the residual image. The biometric authentication image cannot be restored by any one part. The residual image and sparse representation coefficients are embedded into the cover image. Then, for the sake of causing much less attention of attackers, the visual attention mechanism is employed to select embedding location and embedding sequence of secret information. Finally, the reversible watermarking algorithm based on histogram is utilized for embedding the secret information. For verifying the validity of the algorithm, the PolyU multispectral palmprint and the CASIA iris databases are used as biometric information. The experimental results show that the proposed method exhibits good security, invisibility, and high capacity.

  13. Theoretical repeatability assessment without repetitive measurements in gradient high-performance liquid chromatography.

    Science.gov (United States)

    Kotani, Akira; Tsutsumi, Risa; Shoji, Asaki; Hayashi, Yuzuru; Kusu, Fumiyo; Yamamoto, Kazuhiro; Hakamata, Hideki

    2016-07-08

    This paper puts forward a time- and material-saving method for evaluating the repeatability of area measurements in gradient HPLC with UV detection (HPLC-UV), based on the function of mutual information (FUMI) theory, which can theoretically provide the measurement standard deviation (SD) and detection limits through the stochastic properties of baseline noise with no recourse to repetitive measurements of real samples. The chromatographic determination of terbinafine hydrochloride and enalapril maleate is taken as an example. The best choice of the number of noise data points, inevitable for the theoretical evaluation, is shown to be 512 data points (10.24s at 50 point/s sampling rate of an A/D converter). Coupled with the relative SD (RSD) of sample injection variability in the instrument used, the theoretical evaluation is proved to give identical values of area measurement RSDs to those estimated by the usual repetitive method (n=6) over a wide concentration range of the analytes within the 95% confidence intervals of the latter RSD. The FUMI theory is not a statistical one, but the "statistical" reliability of its SD estimates (n=1) is observed to be as high as that attained by thirty-one measurements of the same samples (n=31). Copyright © 2016 Elsevier B.V. All rights reserved.
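The core idea, estimating measurement precision from baseline noise alone rather than from repeated injections, can be caricatured in a few lines. The white-noise model, peak area, integration width, and injection RSD below are hypothetical and much cruder than the FUMI theory itself:

```python
import numpy as np

rng = np.random.default_rng(3)
noise = rng.normal(scale=2.0, size=512)   # 512 baseline points (10.24 s at 50 Hz)

sd_noise = noise.std(ddof=1)
peak_area = 1000.0                        # hypothetical peak area (a.u.)
n_points = 40                             # points integrated across the peak
# Area SD from uncorrelated baseline noise (white-noise assumption):
sd_area = sd_noise * np.sqrt(n_points)
rsd_noise = sd_area / peak_area

rsd_injection = 0.005                     # assumed injection-variability RSD
# Independent error sources combine in quadrature:
rsd_total = float(np.sqrt(rsd_noise ** 2 + rsd_injection ** 2))
```

The appeal of the approach is visible even in this toy form: one noise record (n=1) replaces six or more repeated sample injections.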

  14. Satellite, climatological, and theoretical inputs for modeling of the diurnal cycle of fire emissions

    Science.gov (United States)

    Hyer, E. J.; Reid, J. S.; Schmidt, C. C.; Giglio, L.; Prins, E.

    2009-12-01

    The diurnal cycle of fire activity is crucial for accurate simulation of atmospheric effects of fire emissions, especially at finer spatial and temporal scales. Estimating diurnal variability in emissions is also a critical problem for construction of emissions estimates from multiple sensors with variable coverage patterns. An optimal diurnal emissions estimate will use as much information as possible from satellite fire observations, compensate known biases in those observations, and use detailed theoretical models of the diurnal cycle to fill in missing information. As part of ongoing improvements to the Fire Location and Monitoring of Burning Emissions (FLAMBE) fire monitoring system, we evaluated several different methods of integrating observations with different temporal sampling. We used geostationary fire detections from WF_ABBA, fire detection data from MODIS, empirical diurnal cycles from TRMM, and simple theoretical diurnal curves based on surface heating. Our experiments integrated these data in different combinations to estimate the diurnal cycles of emissions for each location and time. Hourly emissions estimates derived using these methods were tested using an aerosol transport model. We present results of this comparison, and discuss the implications of our results for the broader problem of multi-sensor data fusion in fire emissions modeling.

  15. Information-theoretic characterization of dynamic energy systems

    Science.gov (United States)

    Bevis, Troy Lawson

    sources are compounded by the dynamics of the grid itself. Loads are constantly changing, as well as the sources; this can sometimes lead to a quick change in system states. There is a need for a metric to be able to take into consideration all of the factors detailed above; it needs to be able to take into consideration the amount of information that is available in the system and the rate that the information is losing its value. In a dynamic system, the information is only valid for a length of time, and the controller must be able to take into account the decay of currently held information. This thesis will present the information theory metrics in a way that is useful for application to dynamic energy systems. A test case involving synchronization of several generators is presented for analysis and application of the theory. The objective is to synchronize all the generators and connect them to a common bus. As the phase shift of each generator is a random process, the effects of latency and information decay can be directly observed. The results of the experiments clearly show that the expected outcomes are observed and that entropy and information theory is a valid metric for timing requirement extraction.
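The contrast the thesis draws between an uncertain generator phase and a synchronized one can be illustrated with a histogram-based Shannon entropy estimate. The phase distributions below are hypothetical stand-ins for the generators in the test case:

```python
import numpy as np

def shannon_entropy(samples, bins=16):
    """Shannon entropy (bits) of a scalar signal, estimated via a histogram."""
    counts, _ = np.histogram(samples, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(4)
# A free-running generator phase is (nearly) uniformly random -> high entropy;
# a tightly synchronized phase is concentrated near zero -> low entropy.
free_running = rng.uniform(0, 2 * np.pi, size=10000)
synchronized = rng.normal(0.0, 0.01, size=10000)

h_free = shannon_entropy(free_running)   # near the 4-bit maximum for 16 bins
h_sync = shannon_entropy(synchronized)   # noticeably lower
```

In the thesis's framing, the drop in entropy as generators synchronize is exactly the information a controller must acquire, and acquire fast enough that it has not decayed, before closing the breaker to the common bus.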

  16. TiO2 synthesized by microwave assisted solvothermal method: Experimental and theoretical evaluation

    International Nuclear Information System (INIS)

    Moura, K.F.; Maul, J.; Albuquerque, A.R.; Casali, G.P.; Longo, E.; Keyson, D.; Souza, A.G.; Sambrano, J.R.; Santos, I.M.G.

    2014-01-01

    In this study, a microwave assisted solvothermal method was used to synthesize TiO 2 with anatase structure. The synthesis was done using Ti (IV) isopropoxide and ethanol without templates or alkalinizing agents. Changes in structural features were observed with increasing time of synthesis and evaluated using periodic quantum chemical calculations. The anatase phase was obtained after only 1 min of reaction besides a small amount of brookite phase. Experimental Raman spectra are in accordance with the theoretical one. Micrometric spheres constituted by nanometric particles were obtained for synthesis from 1 to 30 min, while spheres and sticks were observed after 60 min. - Graphical abstract: FE-SEM images of anatase obtained with different periods of synthesis associated with the order–disorder degree. - Highlights: • Anatase microspheres were obtained by the microwave assisted hydrothermal method. • Only ethanol and titanium isopropoxide were used as precursors during the synthesis. • Raman spectra and XRD patterns were compared with quantum chemical calculations. • Time of synthesis increased the short-range disorder in one direction and decreased in another

  17. INFORMATION SECURITY RISKS OPTIMIZATION IN CLOUDY SERVICES ON THE BASIS OF LINEAR PROGRAMMING

    Directory of Open Access Journals (Sweden)

    I. A. Zikratov

    2013-01-01

    Full Text Available The paper discusses theoretical aspects of creating secure cloud services for processing information of various confidentiality degrees. A new approach to reasoning about the composition of information security in distributed computing structures is suggested, presenting the problem of risk assessment as an extremal decision-making problem. The application of the linear programming method is shown to minimize the information security risk for a given security performance, in compliance with the economic balance between the maintenance of security facilities and the cost of services. An example is given to illustrate the obtained theoretical results.
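A toy version of such a risk-minimization problem can be written as a linear program. The services, costs, and risk-reduction coefficients below are invented for illustration and do not come from the paper:

```python
import numpy as np
from scipy.optimize import linprog

# Choose protection levels x_i in [0, 1] for three cloud services so that
# total residual risk is minimized subject to a security budget.
risk_reduction = np.array([5.0, 3.0, 8.0])   # risk removed per unit protection
cost           = np.array([2.0, 1.0, 4.0])   # cost per unit protection
budget = 5.0
base_risk = risk_reduction.sum()             # risk with no protection at all

# linprog minimizes, so minimize -(risk removed); residual = base + optimum.
res = linprog(c=-risk_reduction,
              A_ub=[cost], b_ub=[budget],
              bounds=[(0, 1)] * 3, method="highs")
residual_risk = base_risk + res.fun
```

For this data the optimum protects the two best risk-per-cost services fully and the third partially, exactly the economic balance between security spending and service cost that the abstract describes.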

  18. Computational Biomechanics: Theoretical Background and Biological/Biomedical Problems

    CERN Document Server

    Tanaka, Masao; Nakamura, Masanori

    2012-01-01

    Rapid developments have taken place in biological/biomedical measurement and imaging technologies as well as in computer analysis and information technologies. The increase in data obtained with such technologies invites the reader into a virtual world that represents realistic biological tissue or organ structures in digital form and allows for simulation and what is called “in silico medicine.” This volume is the third in a textbook series and covers both the basics of continuum mechanics of biosolids and biofluids and the theoretical core of computational methods for continuum mechanics analyses. Several biomechanics problems are provided for better understanding of computational modeling and analysis. Topics include the mechanics of solid and fluid bodies, fundamental characteristics of biosolids and biofluids, computational methods in biomechanics analysis/simulation, practical problems in orthopedic biomechanics, dental biomechanics, ophthalmic biomechanics, cardiovascular biomechanics, hemodynamics...

  19. Research Investigation of Information Access Methods

    Science.gov (United States)

    Heinrichs, John H.; Sharkey, Thomas W.; Lim, Jeen-Su

    2006-01-01

    This study investigates the satisfaction of library users at Wayne State University who utilize alternative information access methods. The LibQUAL+[TM] desired and perceived satisfaction ratings are used to determine each user's "superiority gap." By focusing limited library resources to address "superiority gap" issues identified by each…

  20. When the Mannequin Dies, Creation and Exploration of a Theoretical Framework Using a Mixed Methods Approach.

    Science.gov (United States)

    Tripathy, Shreepada; Miller, Karen H; Berkenbosch, John W; McKinley, Tara F; Boland, Kimberly A; Brown, Seth A; Calhoun, Aaron W

    2016-06-01

    Controversy exists in the simulation community as to the emotional and educational ramifications of mannequin death due to learner action or inaction. No theoretical framework to guide future investigations of learner actions currently exists. The purpose of our study was to generate a model of the learner experience of mannequin death using a mixed methods approach. The study consisted of an initial focus group phase composed of 11 learners who had previously experienced mannequin death due to action or inaction on the part of learners as defined by Leighton (Clin Simul Nurs. 2009;5(2):e59-e62). Transcripts were analyzed using grounded theory to generate a list of relevant themes that were further organized into a theoretical framework. With the use of this framework, a survey was generated and distributed to additional learners who had experienced mannequin death due to action or inaction. Results were analyzed using a mixed methods approach. Forty-one clinicians completed the survey. A correlation was found between the emotional experience of mannequin death and degree of presession anxiety (P framework. Using the previous approach, we created a model of the effect of mannequin death on the educational and psychological state of learners. We offer the final model as a guide to future research regarding the learner experience of mannequin death.

  1. Theoretical Modelling Methods for Thermal Management of Batteries

    Directory of Open Access Journals (Sweden)

    Bahman Shabani

    2015-09-01

    Full Text Available The main challenge associated with renewable energy generation is the intermittency of the renewable source of power. Because of this, back-up generation sources fuelled by fossil fuels are required. In stationary applications whether it is a back-up diesel generator or connection to the grid, these systems are yet to be truly emissions-free. One solution to the problem is the utilisation of electrochemical energy storage systems (ESS to store the excess renewable energy and then reusing this energy when the renewable energy source is insufficient to meet the demand. The performance of an ESS amongst other things is affected by the design, materials used and the operating temperature of the system. The operating temperature is critical since operating an ESS at low ambient temperatures affects its capacity and charge acceptance while operating the ESS at high ambient temperatures affects its lifetime and suggests safety risks. Safety risks are magnified in renewable energy storage applications given the scale of the ESS required to meet the energy demand. This necessity has propelled significant effort to model the thermal behaviour of ESS. Understanding and modelling the thermal behaviour of these systems is a crucial consideration before designing an efficient thermal management system that would operate safely and extend the lifetime of the ESS. This is vital in order to eliminate intermittency and add value to renewable sources of power. This paper concentrates on reviewing theoretical approaches used to simulate the operating temperatures of ESS and the subsequent endeavours of modelling thermal management systems for these systems. The intent of this review is to present some of the different methods of modelling the thermal behaviour of ESS highlighting the advantages and disadvantages of each approach.

  2. Molecular physics. Theoretical principles and experimental methods

    International Nuclear Information System (INIS)

    Demtroeder, W.

    2005-01-01

    This advanced textbook comprehensively explains important principles of diatomic and polyatomic molecules and their spectra in two separate, distinct parts. The first part concentrates on the theoretical aspects of molecular physics, whereas the second part of the book covers experimental techniques, i.e. laser, Fourier, NMR, and ESR spectroscopies, used in the fields of physics, chemistry, biology, and material science. Appropriate for undergraduate and graduate students in physics and chemistry with a knowledge of atomic physics and familiar with the basics of quantum mechanics. From the contents: - Electronic States of Molecules, - Rotation, Oscillation and Potential Curves of Diatomic Molecules, - The Spectra of Diatomic Molecules, - Molecule Symmetries and Group Theory, - Rotation and Oscillations of Polyatomic Molecules, - Electronic States of Polyatomic Molecules, - The Spectra of Polyatomic Molecules, - Collapse of the Born-Oppenheimer-Approximation, Disturbances in Molecular Spectra, - Molecules in Disturbing Fields, - Van-der-Waals-Molecules and Cluster, - Experimental Techniques in Molecular Physics. (orig.)

  3. 48 CFR 1205.101 - Methods of disseminating information.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Methods of disseminating information. 1205.101 Section 1205.101 Federal Acquisition Regulations System DEPARTMENT OF TRANSPORTATION... disseminating information. (b) The DOT Office of Small and Disadvantaged Business Utilization (S-40), 400 7th...

  4. Classifying and Designing the Educational Methods with Information Communications Technologies

    Directory of Open Access Journals (Sweden)

    I. N. Semenova

    2013-01-01

    Full Text Available The article describes the conceptual apparatus for implementing Information Communications Technologies (ICT) in education. The authors suggest variants of classifying the related teaching methods according to the following combinations of components: types of students' work with information, goals of incorporating ICT into the training process, degrees of individualization, contingent involvement, activity levels and pedagogical field targets, ideology of informational didactics, etc. Each classification can solve educational tasks in the context of a partial paradigm of modern didactics; each kind of method implies a particular combination of activities in the educational environment. The whole spectrum of classifications provides the informational and functional basis for an adequate selection of the necessary teaching methods in accordance with the specified goals and planned results. Potential variants of ICT implementation methods are given for different teaching models.

  5. Information-theoretic analysis of rotational distributions from quantal and quasiclassical computations of reactive and nonreactive scattering

    International Nuclear Information System (INIS)

    Bernstein, R.B.

    1976-01-01

    An information-theoretic approach to the analysis of rotational excitation cross sections was developed by Levine, Bernstein, Johnson, Procaccia, and coworkers and applied to state-to-state cross sections available from numerical computations of reactive and nonreactive scattering (for example, by Wyatt and Kuppermann and their coworkers and by Pack and Pattengill and others). The rotational surprisals are approximately linear in the energy transferred, thereby accounting for the so-called ''exponential gap law'' for rotational relaxation discovered experimentally by Polanyi, Woodall, and Ding. For the ''linear surprisal'' case the unique relation between the surprisal parameter theta/sub R/ and the first moment of the rotational energy distribution provides a link between the pattern of the rotational state distribution and those features of the potential surface which govern the average energy transfer
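The linear-surprisal relation described above can be reproduced numerically: construct a rotational distribution of the exponential-gap form and recover the surprisal parameter from the slope. The prior and the value of θ_R below are illustrative, not taken from the cited computations:

```python
import numpy as np

# Surprisal analysis compares an observed rotational distribution P(j) with a
# statistical prior P0(j); "linear surprisal" means -ln(P/P0) is linear in the
# fraction of energy in rotation, f_R.
j = np.arange(0, 10)
E_rot = 0.01 * j * (j + 1)          # rotational energies (arbitrary units)
f_R = E_rot / E_rot.max()           # fraction of available energy in rotation

P0 = (2 * j + 1).astype(float)      # degeneracy-weighted prior (illustrative)
P0 /= P0.sum()

theta_R = 4.0                       # assumed surprisal parameter
P = P0 * np.exp(-theta_R * f_R)     # exponential-gap-law form
P /= P.sum()

surprisal = -np.log(P / P0)
# Linear up to the normalization constant; the slope recovers theta_R:
slope = np.polyfit(f_R, surprisal, 1)[0]
```

The unique link between θ_R and the first moment of the rotational energy distribution mentioned in the abstract is what lets a single fitted slope summarize a whole state-to-state cross-section table.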

  6. Geometrical Fuzzy Search Method for the Business Information Security Systems

    Directory of Open Access Journals (Sweden)

    Grigory Grigorievich Novikov

    2014-12-01

    Full Text Available The main purpose of the article is to show how one of the new fuzzy search methods can be used for the information security of a business, or for other purposes. Many sensitive information leaks occur through the legal publishing of non-classified documents, which is why intelligence services make such heavy use of the "mosaic" method of information collection. This article is about how to prevent it.

  7. A Model-Driven Development Method for Management Information Systems

    Science.gov (United States)

    Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki

    Traditionally, a Management Information System (MIS) has been developed without using formal methods. With informal methods, an MIS is developed over its lifecycle without any models, which causes many problems, such as unreliable system design specifications. To overcome these problems, a model theory approach was proposed, based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, model-driven development can flexibly respond to changes in business logic or implementation technologies; in model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems that applies model-driven development to a component of the model theory approach. The experiment has shown that the method reduces development effort by more than 30%.

  8. THE THEORETICAL AND METHODICAL APPROACH TO AN ASSESSMENT OF A LEVEL OF DEVELOPMENT OF THE ENTERPRISE IN CONDITIONS OF GLOBALIZATION

    Directory of Open Access Journals (Sweden)

    Tatiana Shved

    2016-11-01

    Full Text Available The subject of this article is the theoretical, methodical and practical aspects of enterprise development in conditions of globalization. The purpose of this research is to provide a theoretical and methodical approach to assessing the level of development of an enterprise, based on the relationship between factors and their influence, illustrating the effect of the internal and external environment in which enterprises function, and indicating the level of development of the enterprise. Methodology. The theoretical basis of the study was the examination and rethinking of the main achievements of world and domestic science on the development of enterprises. To achieve the objectives of the research, the following methods were used: systemic and structural analysis, for forming methodical approaches to selecting the factors influencing the development of enterprises; abstract and logical methods, for formulating conclusions and proposals; and the methods of valuation and expert assessment, for implementing the proposed theoretical and methodical approach to assessing the level of development of an enterprise in conditions of globalization. Results. The result of the research is the proposed theoretical and methodical approach to assessing the level of development of an enterprise in conditions of globalization, which is associated with the idea of enterprise development as a system with inputs (the factors influencing development) and outputs (indicators of the level of enterprise development within these factors). The chosen factors are resources, financial-economic activity, innovation and investment activities, competition, government influence, and foreign trade. The indicators that express these factors are capital productivity, labour productivity and material efficiency within the first factor; the profitability of the activity, the coefficient of current assets, the total liquidity coefficient, financial stability

  9. A Set Theoretical Approach to Maturity Models

    DEFF Research Database (Denmark)

    Lasrado, Lester; Vatrapu, Ravi; Andersen, Kim Normann

    2016-01-01

    Maturity Model research in IS has been criticized for the lack of theoretical grounding, methodological rigor, empirical validations, and ignorance of multiple and non-linear paths to maturity. To address these criticisms, this paper proposes a novel set-theoretical approach to maturity models characterized by equifinality, multiple conjunctural causation, and case diversity. We prescribe methodological guidelines consisting of a six-step procedure to systematically apply set theoretic methods to conceptualize, develop, and empirically derive maturity models and provide a demonstration...

  10. Rethinking the Elementary Science Methods Course: A Case for Content, Pedagogy, and Informal Science Education.

    Science.gov (United States)

    Kelly, Janet

    2000-01-01

    Indicates the importance of using different methods to prepare prospective elementary science teachers. Presents the theoretical and practical rationale for developing a constructivist-based elementary science methods course. Discusses the impact that student knowledge and understanding of science and student attitudes have on…

  11. Formal approach to modeling of modern Information Systems

    Directory of Open Access Journals (Sweden)

    Bálint Molnár

    2016-01-01

    Full Text Available Most recently, the concept of business documents has started to play a double role. On the one hand, a business document (a word-processing text or calculation sheet) can be used as a specification tool; on the other hand, the business document is an immanent constituent of business processes, and thereby an essential component of business Information Systems. The recent tendency is that the majority of documents and their contents within business Information Systems remain in semi-structured format, and a lesser part of documents is transformed into schemas of structured databases. In order to keep the emerging situation in hand, we suggest the creation of (1) a theoretical framework for modeling business Information Systems; and (2) a design method for practical application based on the theoretical model that provides the structuring principles. The modeling approach, which focuses on documents and their interrelationships with business processes, assists in perceiving the activities of modern Information Systems.

  12. Usability Evaluation Methods for Special Interest Internet Information Services

    Directory of Open Access Journals (Sweden)

    Eva-Maria Schön

    2014-06-01

    Full Text Available The internet provides a wide range of scientific information for different areas of research, used by the related scientific communities. Often the design or architecture of these web pages does not correspond to the mental model of their users. As a result the wanted information is difficult to find. Methods established by Usability Engineering and User Experience can help to increase the appeal of scientific internet information services by analyzing the users’ requirements. This paper describes a procedure to analyze and optimize scientific internet information services that can be accomplished with relatively low effort. It consists of a combination of methods that already have been successfully applied to practice: Personas, usability inspections, Online Questionnaire, Kano model and Web Analytics.

  13. Theoretical prediction method of subcooled flow boiling CHF

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Young Min; Chang, Soon Heung [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1999-12-31

    A theoretical critical heat flux (CHF ) model, based on lateral bubble coalescence on the heated wall, is proposed to predict the subcooled flow boiling CHF in a uniformly heated vertical tube. The model is based on the concept that a single layer of bubbles contacted to the heated wall prevents a bulk liquid from reaching the wall at near CHF condition. Comparisons between the model predictions and experimental data result in satisfactory agreement within less than 9.73% root-mean-square error by the appropriate choice of the critical void fraction in the bubbly layer. The present model shows comparable performance with the CHF look-up table of Groeneveld et al.. 28 refs., 11 figs., 1 tab. (Author)

  15. Justification of computational methods for supporting management information systems

    Directory of Open Access Journals (Sweden)

    E. D. Chertov

    2016-01-01

    Full Text Available Summary. Owing to the diversity and complexity of the organizational management tasks of a large enterprise, building a management information system requires establishing interconnected complexes of technical means that implement, in the most efficient way, the collection, transfer, accumulation, and processing of the information needed by decision makers at different levels of the governance process. The main trends in the construction of such integrated complexes can be considered to be: the creation of integrated data processing systems through centralized storage and processing of data arrays; the organization of computer systems with time-sharing; an aggregate-block principle for the integrated complex; and the use of a wide range of peripheral devices with unified information and hardware interfaces. The main attention is paid to systematic research of the complex of technical support, in particular the definition of quality criteria for the operation of the technical complex, the development of methods for analyzing the information base of management information systems, the definition of requirements for technical means, and methods for the structural synthesis of the major subsystems of the complex. The aim is thus to study, on the basis of a systematic approach, the technical complex of a management information system and to develop a number of methods for the analysis and synthesis of such complexes that are suitable for use in engineering design practice. The objective function of the technical complex of a management information system is to gather, transmit, and process specified amounts of information within regulated time intervals and with the required accuracy, while minimizing the reduced costs of establishing and operating the complex. Achieving this objective function requires a particular organization of information interaction

  16. Detection System of HTTP DDoS Attacks in a Cloud Environment Based on Information Theoretic Entropy and Random Forest

    Directory of Open Access Journals (Sweden)

    Mohamed Idhammad

    2018-01-01

    Full Text Available Cloud computing services are often delivered over the HTTP protocol. This facilitates access to services and reduces costs for both providers and end users, but it also increases the exposure of cloud services to HTTP DDoS attacks. HTTP request methods are often used to exploit web servers' vulnerabilities and to create multiple HTTP DDoS attack scenarios, such as low-and-slow or flooding attacks. Existing HTTP DDoS detection systems are challenged by the large volumes of network traffic generated by these attacks, low detection accuracy, and high false positive rates. In this paper we present a detection system for HTTP DDoS attacks in a cloud environment based on information-theoretic entropy and the random forest ensemble learning algorithm. A time-based sliding-window algorithm is used to estimate the entropy of the network header features of the incoming traffic. When the estimated entropy exceeds its normal range, the preprocessing and classification tasks are triggered. To assess the proposed approach, various experiments were performed on the public CIDDS-001 dataset. The approach achieves satisfactory results, with an accuracy of 99.54%, a false positive rate of 0.4%, and a running time of 18.5 s.
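    The core of such a detector, the entropy estimate over a sliding window of a header feature, is compact. As an illustrative sketch (toy feature stream and window size, not the authors' implementation):

    ```python
    import math
    from collections import Counter, deque

    def shannon_entropy(values):
        """Shannon entropy (bits) of the empirical distribution of `values`."""
        counts = Counter(values)
        n = len(values)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    def sliding_entropy(stream, window=50):
        """Yield the entropy of each full sliding window over a feature stream."""
        buf = deque(maxlen=window)
        for v in stream:
            buf.append(v)
            if len(buf) == window:
                yield shannon_entropy(buf)

    # Toy example: source-IP feature; a flooding burst collapses the entropy.
    normal = [f"ip{i % 40}" for i in range(100)]   # diverse legitimate sources
    attack = ["ip_bot"] * 100                      # single flooding source
    ents = list(sliding_entropy(normal + attack, window=50))
    # Entropy drops sharply once the window fills with attack traffic.
    ```

    In a flooding attack the source-address distribution collapses onto a few values, so the window entropy falls below its normal range and triggers the classification stage.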

  17. IDEF method for designing seismic information system in CTBT verification

    International Nuclear Information System (INIS)

    Zheng Xuefeng; Shen Junyi; Jin Ping; Zhang Huimin; Zheng Jiangling; Sun Peng

    2004-01-01

    Seismic information systems are of great importance for improving the capability of CTBT verification. A large amount of funding has been devoted to research in this field in the U.S. and other countries in recent years. However, designing and developing a seismic information system involves various technologies of complex system design. This paper discusses the IDEF0 method for constructing function models and the IDEF1x method for building information models systematically, as well as how both are used in designing a seismic information system for CTBT verification. (authors)

  18. On the use of information technologies in the process of physical education of student youth

    Directory of Open Access Journals (Sweden)

    Andriy Shankovsky

    2017-06-01

    Full Text Available Topicality. Determining how effectively information technology shapes the informational educational environment of students' physical education, both within the education system and beyond, and identifying the general theoretical basis needed for the didactic development of approaches to using information technology in students' physical education, requires specially organized research. Objectives of the study: to analyze and systematize modern scientific and methodological knowledge and the practical experience of domestic and foreign researchers on the use of information technologies in the practice of students' physical education; and to develop the informational and methodical multimedia system “Perfectum corpus” as a means of raising the level of students' theoretical knowledge in the process of physical education. Research results. The results of the ascertaining experiment led us to develop the multimedia information and methodical system “Perfectum corpus” as an auxiliary training tool for use in class and in extracurricular formats, intended for self-study, for raising the level of theoretical knowledge, and for motivating students to exercise. Each module of the system consists of sections that may contain background images, buttons, and other options for visual presentation. Conclusions. The analysis of scientific sources shows that the use of information technologies in the practice of students' physical education opens new opportunities for increasing the efficiency of the physical education process. The developed information-methodical system includes three modules: “Useful to know”, “Practice”, and “Bonus”.

  19. Method of Improving Personal Name Search in Academic Information Service

    Directory of Open Access Journals (Sweden)

    Heejun Han

    2012-12-01

    Full Text Available All academic information on the web or elsewhere has a creator, that is, a subject who produced the information. The subject can be an individual, a group, an institution, or even a nation, depending on the nature of the information. Most information consists of a title, an author, and contents. A paper in the academic information category has metadata including a title, authors, keywords, an abstract, publication data, place of publication, ISSN, and the like. A patent has metadata including the title, applicant, inventor, attorney, IPC, application number, and claims. Most web-based academic information services enable users to search by processing this meta-information, and an important element is searching by the author field, which corresponds to a personal name. This study suggests a method of efficient indexing together with an adjacent-operation result-ranking algorithm that applies phrase-search-based boosting, thus improving the accuracy of personal-name search results. It also describes a method for returning co-authors and related researchers when searching personal names. The method can be applied effectively to provide accurate and supplementary results in academic information services.

  20. An information hiding method based on LSB and tent chaotic map

    Science.gov (United States)

    Song, Jianhua; Ding, Qun

    2011-06-01

    In order to protect information security more effectively, a novel information hiding method based on LSB steganography and the tent chaotic map is proposed. First, the secret message is encrypted with the tent chaotic map; then LSB steganography embeds the encrypted message in the cover image. Compared with traditional image information hiding methods, simulation results indicate that the method greatly improves imperceptibility and security and achieves good results.
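    A minimal sketch of the two-stage scheme, with assumed parameter choices (tent-map parameter, initial condition, byte quantization), since the record does not specify them:

    ```python
    def tent_keystream(x0, n, mu=1.99999):
        """Generate n pseudo-random bytes from the tent map x -> mu*min(x, 1-x)."""
        x, out = x0, []
        for _ in range(n):
            x = mu * (x if x < 0.5 else 1.0 - x)
            out.append(int(x * 256) & 0xFF)
        return bytes(out)

    def embed_lsb(pixels, payload):
        """Hide payload bits in the least-significant bits of a flat pixel list."""
        bits = [(b >> i) & 1 for b in payload for i in range(8)]
        assert len(bits) <= len(pixels), "cover image too small"
        stego = list(pixels)
        for i, bit in enumerate(bits):
            stego[i] = (stego[i] & 0xFE) | bit
        return stego

    def extract_lsb(stego, n_bytes):
        bits = [p & 1 for p in stego[:n_bytes * 8]]
        return bytes(sum(bits[i * 8 + j] << j for j in range(8)) for i in range(n_bytes))

    secret = b"meet at dawn"
    key = tent_keystream(x0=0.3141592653, n=len(secret))     # shared secret key
    cipher = bytes(a ^ b for a, b in zip(secret, key))        # chaotic encryption
    cover = [128] * 4096                  # stand-in for grayscale pixel data
    stego = embed_lsb(cover, cipher)      # LSB steganography on the ciphertext
    recovered = bytes(a ^ b for a, b in zip(extract_lsb(stego, len(secret)), key))
    assert recovered == secret
    ```

    Each cover byte changes by at most one gray level, which is what makes the hiding imperceptible, while the chaotic keystream keeps the embedded bits unintelligible without the key (x0, mu).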

  1. Information-theoretic treatment of tripartite systems and quantum channels

    International Nuclear Information System (INIS)

    Coles, Patrick J.; Yu Li; Gheorghiu, Vlad; Griffiths, Robert B.

    2011-01-01

    A Holevo measure is used to discuss how much information about a given positive operator valued measure (POVM) on system a is present in another system b, and how this influences the presence or absence of information about a different POVM on a in a third system c. The main goal is to extend information theorems for mutually unbiased bases or general bases to arbitrary POVMs, and especially to generalize ''all-or-nothing'' theorems about information located in tripartite systems to the case of partial information, in the form of quantitative inequalities. Some of the inequalities can be viewed as entropic uncertainty relations that apply in the presence of quantum side information, as in recent work by Berta et al. [Nature Physics 6, 659 (2010)]. All of the results also apply to quantum channels: For example, if E accurately transmits certain POVMs, the complementary channel F will necessarily be noisy for certain other POVMs. While the inequalities are valid for mixed states of tripartite systems, restricting to pure states leads to the basis invariance of the difference between the information about a contained in b and c.

  2. Axiomatic Evaluation Method and Content Structure for Information Appliances

    Science.gov (United States)

    Guo, Yinni

    2010-01-01

    Extensive studies have been conducted to determine how best to present information in order to enhance usability, but not what information needs to be presented for effective decision making. Hence, this dissertation addresses the factor structure of the information needed for presentation and proposes a more effective method than…

  3. Adaptation of an Agile Information System Development Method

    NARCIS (Netherlands)

    Aydin, M.N.; Harmsen, A.F.; van Hillegersberg, Jos; Stegwee, R.A.; Siau, K.

    2007-01-01

    Little specific research has been conducted to date on the adaptation of agile information systems development (ISD) methods. This chapter presents the work practice in dealing with the adaptation of such a method in the ISD department of one of the leading financial institutes in Europe. The

  4. "It's the Method, Stupid." Interrelations between Methodological and Theoretical Advances: The Example of Comparing Higher Education Systems Internationally

    Science.gov (United States)

    Hoelscher, Michael

    2017-01-01

    This article argues that strong interrelations between methodological and theoretical advances exist. Progress in, especially comparative, methods may have important impacts on theory evaluation. By using the example of the "Varieties of Capitalism" approach and an international comparison of higher education systems, it can be shown…

  5. Extending the theoretical framework for curriculum integration in pre-clinical medical education

    DEFF Research Database (Denmark)

    Vergel, John; Stentoft, Diana; Montoya, Juny

    2017-01-01

    INTRODUCTION: Curriculum integration is widely discussed in medical education but remains ill defined. Although there is plenty of information on logistical aspects of curriculum integration, little attention has been paid to the contextual issues that emerge from its practice and may complicate students' knowledge integration. Therefore, we aimed to uncover how curriculum integration is manifested through context. METHODS: We collected data from the official curriculum and interviewed ten participants (including curriculum designers, facilitators, and students) in the bachelor's medical program at Aalborg University. We observed various learning activities focused on pre-clinical education. Inspired by grounded theory, we analyzed the information we gathered. RESULTS: The following theoretical constructs emerged after the inductive analysis: 1) curriculum integration complexity is embedded…

  6. Information-seeking Behavior During Residency Is Associated With Quality of Theoretical Learning, Academic Career Achievements, and Evidence-based Medical Practice

    Science.gov (United States)

    Oussalah, Abderrahim; Fournier, Jean-Paul; Guéant, Jean-Louis; Braun, Marc

    2015-01-01

    Abstract Data regarding knowledge acquisition during residency training are sparse. Predictors of theoretical learning quality, academic career achievements and evidence-based medical practice during residency are unknown. We performed a cross-sectional study on residents and attending physicians across several residency programs in 2 French faculties of medicine. We comprehensively evaluated the information-seeking behavior (I-SB) during residency using a standardized questionnaire and looked for independent predictors of theoretical learning quality, academic career achievements, and evidence-based medical practice among I-SB components using multivariate logistic regression analysis. Between February 2013 and May 2013, 338 fellows and attending physicians were included in the study. Textbooks and international medical journals were reported to be used on a regular basis by 24% and 57% of the respondents, respectively. Among the respondents, 47% refer systematically (4.4%) or frequently (42.6%) to published guidelines from scientific societies upon their publication. The median self-reported theoretical learning quality score was 5/10 (interquartile range, 3–6; range, 1–10). A high theoretical learning quality score (upper quartile) was independently and strongly associated with the following I-SB components: systematic reading of clinical guidelines upon their publication (odds ratio [OR], 5.55; 95% confidence interval [CI], 1.77–17.44); having access to a library that offers the leading textbooks of the specialty in the medical department (OR, 2.45, 95% CI, 1.33–4.52); knowledge of the specialty leading textbooks (OR, 2.12; 95% CI, 1.09–4.10); and PubMed search skill score ≥5/10 (OR, 1.94; 95% CI, 1.01–3.73). Research Master (M2) and/or PhD thesis enrolment were independently and strongly associated with the following predictors: PubMed search skill score ≥5/10 (OR, 4.10; 95% CI, 1.46–11.53); knowledge of the leading medical journals of the

  7. Theoretical Foundations of Active Learning

    Science.gov (United States)

    2009-05-01

    I study the informational complexity of active learning in a statistical learning theory framework. Specifically, I derive bounds on the rates of convergence achievable by active learning, under various noise models and under general conditions on the hypothesis class. I also study the theoretical advantages of active learning over passive learning, and develop procedures for transforming passive learning algorithms into active learning algorithms.

  8. An information theoretic model of information processing in the Drosophila olfactory system: the role of inhibitory neurons for system efficiency.

    Science.gov (United States)

    Faghihi, Faramarz; Kolodziejski, Christoph; Fiala, André; Wörgötter, Florentin; Tetzlaff, Christian

    2013-12-20

    Fruit flies (Drosophila melanogaster) rely on their olfactory system to process environmental information. This information has to be transmitted by the olfactory system, without system-relevant loss, to deeper brain areas for learning. Here we study how several parameters of the fly's olfactory system and of the environment influence olfactory information transmission. We designed an abstract model of the antennal lobe, the mushroom body and the inhibitory circuitry. Mutual information between the olfactory environment, simulated in terms of different odor concentrations, and a sub-population of intrinsic mushroom body neurons (Kenyon cells) was calculated to quantify the efficiency of information transmission. With this method we study, on the one hand, the effect of different connectivity rates between olfactory projection neurons and firing thresholds of Kenyon cells; on the other hand, we analyze the influence of inhibition on the mutual information between environment and mushroom body. Our simulations show the expected linear relation between the antennal lobe-to-mushroom body connectivity rate and the Kenyon cell firing threshold that maximizes mutual information, for both low and high odor concentrations. However, contradicting everyday experience, high odor concentrations cause a drastic, and unrealistic, decrease in mutual information for all connectivity rates compared to low concentrations. When inhibition of the mushroom body is included, however, mutual information remains at high levels independent of the other system parameters. This finding points to a pivotal role of inhibition in fly information processing, without which system efficiency would be substantially reduced.
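    The quantity driving this analysis, mutual information between a discrete stimulus and a discrete response, can be estimated from a joint histogram. A hedged toy sketch (synthetic "odor" and "response" sequences, not the paper's network model):

    ```python
    import numpy as np

    def mutual_information(x, y):
        """I(X;Y) in bits from paired discrete samples, via the joint histogram."""
        xs = np.unique(x, return_inverse=True)[1]
        ys = np.unique(y, return_inverse=True)[1]
        joint = np.zeros((xs.max() + 1, ys.max() + 1))
        for a, b in zip(xs, ys):
            joint[a, b] += 1
        p = joint / joint.sum()
        px = p.sum(axis=1, keepdims=True)   # marginal of X
        py = p.sum(axis=0, keepdims=True)   # marginal of Y
        nz = p > 0
        return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

    rng = np.random.default_rng(0)
    odor = rng.integers(0, 4, 5000)        # four odor concentrations
    faithful = odor.copy()                 # responses track the stimulus
    noisy = rng.integers(0, 4, 5000)       # responses ignore the stimulus
    mi_good = mutual_information(odor, faithful)   # near 2 bits (4 equiprobable odors)
    mi_bad = mutual_information(odor, noisy)       # near 0 bits
    ```

    A response population that preserves the stimulus identity yields mutual information close to the stimulus entropy; one that ignores it yields close to zero, which is how transmission efficiency is quantified.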

  9. Developing theory-informed behaviour change interventions to implement evidence into practice: a systematic approach using the Theoretical Domains Framework.

    Science.gov (United States)

    French, Simon D; Green, Sally E; O'Connor, Denise A; McKenzie, Joanne E; Francis, Jill J; Michie, Susan; Buchbinder, Rachelle; Schattner, Peter; Spike, Neil; Grimshaw, Jeremy M

    2012-04-24

    There is little systematic operational guidance about how best to develop complex interventions to reduce the gap between practice and evidence. This article is one in a Series of articles documenting the development and use of the Theoretical Domains Framework (TDF) to advance the science of implementation research. The intervention was developed considering three main components: theory, evidence, and practical issues. We used a four-step approach, consisting of guiding questions, to direct the choice of the most appropriate components of an implementation intervention: Who needs to do what, differently? Using a theoretical framework, which barriers and enablers need to be addressed? Which intervention components (behaviour change techniques and mode(s) of delivery) could overcome the modifiable barriers and enhance the enablers? And how can behaviour change be measured and understood? A complex implementation intervention was designed that aimed to improve acute low back pain management in primary care. We used the TDF to identify the barriers and enablers to the uptake of evidence into practice and to guide the choice of intervention components. These components were then combined into a cohesive intervention. The intervention was delivered via two facilitated interactive small group workshops. We also produced a DVD to distribute to all participants in the intervention group. We chose outcome measures in order to assess the mediating mechanisms of behaviour change. We have illustrated a four-step systematic method for developing an intervention designed to change clinical practice based on a theoretical framework. The method of development provides a systematic framework that could be used by others developing complex implementation interventions. 
While this framework should be iteratively adjusted and refined to suit other contexts and settings, we believe that the four-step process should be maintained as the primary framework to guide researchers through a

  10. How Qualitative Methods Can be Used to Inform Model Development.

    Science.gov (United States)

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-06-01

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  11. Enablers and barriers to physical activity in overweight and obese pregnant women: an analysis informed by the theoretical domains framework and COM-B model.

    Science.gov (United States)

    Flannery, C; McHugh, S; Anaba, A E; Clifford, E; O'Riordan, M; Kenny, L C; McAuliffe, F M; Kearney, P M; Byrne, M

    2018-05-21

    Obesity during pregnancy is associated with increased risk of gestational diabetes mellitus (GDM) and other complications. Physical activity is a modifiable lifestyle factor that may help to prevent these complications, but many women reduce their physical activity levels during pregnancy. Interventions targeting physical activity in pregnancy are ongoing, but few identify the underlying behaviour change mechanisms by which the intervention is expected to work. To enhance intervention effectiveness, recent tools in behavioural science such as the Theoretical Domains Framework (TDF) and the COM-B model (capability, opportunity, motivation and behaviour) have been employed to understand behaviours for intervention development. Using these behaviour change methods, this study aimed to identify the enablers of and barriers to physical activity in overweight and obese pregnant women. Semi-structured interviews were conducted with a purposive sample of overweight and obese women at different stages of pregnancy attending a public antenatal clinic in a large academic maternity hospital in Cork, Ireland. Interviews were recorded and transcribed into NVivo V.10 software. Data analysis followed the framework approach, drawing on the TDF and the COM-B model. Twenty-one themes were identified, and these mapped directly onto the COM-B model of behaviour change and ten of the TDF domains. Having the social opportunity to engage in physical activity was identified as an enabler; pregnant women suggested being active was easier when supported by their partners. Knowledge was a commonly reported barrier, with women lacking information on safe activities during pregnancy and describing the information received from their midwives as 'limited'. Having the physical capability and physical opportunity to carry out physical activity were also identified as barriers; experiencing pain, a lack of time, having other children, and working prevented women from being active. A wide range of barriers

  12. Computational information geometry for image and signal processing

    CERN Document Server

    Critchley, Frank; Dodson, Christopher

    2017-01-01

    This book focuses on the application and development of information geometric methods in the analysis, classification and retrieval of images and signals. It provides introductory chapters to help those new to information geometry and applies the theory to several applications. This area has developed rapidly over recent years, propelled by the major theoretical developments in information geometry, efficient data and image acquisition and the desire to process and interpret large databases of digital information. The book addresses both the transfer of methodology to practitioners involved in database analysis and in its efficient computational implementation.

  13. Model-free information-theoretic approach to infer leadership in pairs of zebrafish.

    Science.gov (United States)

    Butail, Sachit; Mwaffo, Violet; Porfiri, Maurizio

    2016-04-01

    Collective behavior affords several advantages to fish in avoiding predators, foraging, mating, and swimming. Although fish schools have been traditionally considered egalitarian superorganisms, a number of empirical observations suggest the emergence of leadership in gregarious groups. Detecting and classifying leader-follower relationships is central to elucidate the behavioral and physiological causes of leadership and understand its consequences. Here, we demonstrate an information-theoretic approach to infer leadership from positional data of fish swimming. In this framework, we measure social interactions between fish pairs through the mathematical construct of transfer entropy, which quantifies the predictive power of a time series to anticipate another, possibly coupled, time series. We focus on the zebrafish model organism, which is rapidly emerging as a species of choice in preclinical research for its genetic similarity to humans and reduced neurobiological complexity with respect to mammals. To overcome experimental confounds and generate test data sets on which we can thoroughly assess our approach, we adapt and calibrate a data-driven stochastic model of zebrafish motion for the simulation of a coupled dynamical system of zebrafish pairs. In this synthetic data set, the extent and direction of the coupling between the fish are systematically varied across a wide parameter range to demonstrate the accuracy and reliability of transfer entropy in inferring leadership. Our approach is expected to aid in the analysis of collective behavior, providing a data-driven perspective to understand social interactions.
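    Transfer entropy with history length one can be computed directly from empirical counts. The sketch below (binary toy series, not the paper's calibrated zebrafish model) recovers the direction of coupling when a "follower" copies a "leader" with one step of lag:

    ```python
    import numpy as np
    from collections import Counter

    def transfer_entropy(source, target):
        """TE source->target (bits), history length 1, for discrete series."""
        trip = Counter(zip(target[1:], target[:-1], source[:-1]))
        pair_xy = Counter(zip(target[:-1], source[:-1]))
        pair_xx = Counter(zip(target[1:], target[:-1]))
        hist = Counter(target[:-1])
        n = len(target) - 1
        te = 0.0
        for (x1, x0, y0), c in trip.items():
            p_joint = c / n                               # p(x1, x0, y0)
            p_cond_full = c / pair_xy[(x0, y0)]           # p(x1 | x0, y0)
            p_cond_self = pair_xx[(x1, x0)] / hist[x0]    # p(x1 | x0)
            te += p_joint * np.log2(p_cond_full / p_cond_self)
        return te

    rng = np.random.default_rng(1)
    leader = rng.integers(0, 2, 10000)
    follower = np.empty_like(leader)
    follower[0] = 0
    follower[1:] = leader[:-1]            # follower copies leader with lag 1
    te_lf = transfer_entropy(list(leader), list(follower))
    te_fl = transfer_entropy(list(follower), list(leader))
    # te_lf is near 1 bit, te_fl near 0: information flows leader -> follower
    ```

    The asymmetry te_lf >> te_fl is exactly the signature used to classify leader-follower relationships from positional time series.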

  14. Evaluation of two methods for using MR information in PET reconstruction

    International Nuclear Information System (INIS)

    Caldeira, L.; Scheins, J.; Almeida, P.; Herzog, H.

    2013-01-01

    Using magnetic resonance (MR) information in maximum a posteriori (MAP) algorithms for positron emission tomography (PET) image reconstruction has been investigated in recent years. Recently, three methods of introducing this information were evaluated, and the Bowsher prior was considered the best; its main advantage is that it does not require image segmentation. Another widely used method for incorporating MR information relies on boundaries obtained by segmentation, which has also been shown to improve image quality. In this paper, two methods of incorporating MR information in PET reconstruction are compared. After a Bayesian parameter optimization, the reconstructed images were compared using the mean squared error (MSE) and the coefficient of variation (CV). MSE values are 3% lower with the Bowsher prior than with boundaries, and CV values are 10% lower. Both methods performed better, in terms of MSE and CV, than using no prior, i.e., maximum likelihood expectation maximization (MLEM) or MAP without anatomical information. In conclusion, incorporating MR information via the Bowsher prior gives better results in terms of MSE and CV than using boundaries. MAP algorithms again proved effective for noise reduction and convergence, especially when MR information is incorporated. The robustness of the priors with respect to noise and inhomogeneities in the MR image, however, remains to be evaluated.
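    For reference, the no-prior MLEM baseline against which the MAP variants are compared uses a simple multiplicative update. A hedged toy sketch (random 8x4 system matrix and noiseless data; no relation to the actual PET geometry or the Bowsher prior):

    ```python
    import numpy as np

    def mlem(A, y, n_iter=500):
        """Maximum-likelihood EM reconstruction: x <- x * A^T(y / Ax) / A^T 1."""
        x = np.ones(A.shape[1])
        sens = A.T @ np.ones(A.shape[0])      # sensitivity image A^T 1
        for _ in range(n_iter):
            x *= (A.T @ (y / (A @ x))) / sens
        return x

    # Toy 1-D "scanner": 8 detector bins viewing 4 source voxels.
    rng = np.random.default_rng(2)
    A = rng.random((8, 4)) + 0.1              # positive system matrix
    truth = np.array([2.0, 5.0, 1.0, 3.0])
    y = A @ truth                             # noiseless projections
    x = mlem(A, y)
    # x approaches `truth` for noiseless, consistent data
    ```

    A MAP variant would multiply in (or subtract in the log domain) a penalty derived from the prior, e.g. an anatomically weighted smoothness term; the comparison in the abstract is between two ways of building that weighting from MR.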

  15. Summer School organized by the International Centre for Theoretical Physics, Trieste, and the Institute for Information Sciences, University of Tübingen

    CERN Document Server

    Güttinger, Werner; Cin, Mario

    1974-01-01

    This volume is the record and product of the Summer School on the Physics and Mathematics of the Nervous System, held at the International Centre for Theoretical Physics in Trieste from August 21-31, 1973, and jointly organized by the Institute for Information Sciences, University of Tübingen, and by the Centre. The school served to bring biologists, physicists and mathematicians together to exchange ideas about the nervous system and brain, and also to introduce young scientists to the field. The program, attended by more than a hundred scientists, was interdisciplinary both in character and participation. The primary support for the school was provided by the Volkswagen Foundation of West Germany. We are particularly indebted to Drs. G. Gambke, M.-L. Zarnitz, and H. Penschuck of the Foundation for their interest in and help with the project. The school also received major support from the International Centre for Theoretical Physics in Trieste and its sponsoring agencies, including the use of its exce...

  16. Theoretical explanations for maintenance of behaviour change: a systematic review of behaviour theories

    Science.gov (United States)

    Kwasnicka, Dominika; Dombrowski, Stephan U; White, Martin; Sniehotta, Falko

    2016-01-01

    ABSTRACT Background: Behaviour change interventions are effective in supporting individuals in achieving temporary behaviour change. Behaviour change maintenance, however, is rarely attained. The aim of this review was to identify and synthesise current theoretical explanations for behaviour change maintenance to inform future research and practice. Methods: Potentially relevant theories were identified through systematic searches of electronic databases (Ovid MEDLINE, Embase, PsycINFO). In addition, an existing database of 80 theories was searched, and 25 theory experts were consulted. Theories were included if they formulated hypotheses about behaviour change maintenance. Included theories were synthesised thematically to ascertain overarching explanations for behaviour change maintenance. Initial theoretical themes were cross-validated. Findings: One hundred and seventeen behaviour theories were identified, of which 100 met the inclusion criteria. Five overarching, interconnected themes representing theoretical explanations for behaviour change maintenance emerged. Theoretical explanations of behaviour change maintenance focus on the differential nature and role of motives, self-regulation, resources (psychological and physical), habits, and environmental and social influences from initiation to maintenance. Discussion: There are distinct patterns of theoretical explanations for behaviour change and for behaviour change maintenance. The findings from this review can guide the development and evaluation of interventions promoting maintenance of health behaviours and help in the development of an integrated theory of behaviour change maintenance. PMID:26854092

  17. Efficient sensor selection for active information fusion.

    Science.gov (United States)

    Zhang, Yongmian; Ji, Qiang

    2010-06-01

    In our previous paper, we formalized an active information fusion framework based on dynamic Bayesian networks. This paper focuses on a central issue of active information fusion, i.e., the efficient identification of a subset of sensors that are most decision-relevant and cost-effective. Determining the most informative and cost-effective sensors requires an evaluation of all possible subsets of sensors, which is computationally intractable, particularly when an information-theoretic criterion such as mutual information is used. To overcome this challenge, we propose a new quantitative measure of sensor synergy, from which a sensor synergy graph is constructed. Using the sensor synergy graph, we first introduce an alternative measure to multisensor mutual information for characterizing the sensor information gain. We then propose an approximate nonmyopic sensor selection method that can efficiently and near-optimally select a subset of sensors for active fusion. A simulation study demonstrates both the performance and the efficiency of the proposed sensor selection method.
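    The subset-evaluation bottleneck described above is commonly sidestepped with a greedy strategy that adds one sensor at a time by marginal information gain. A minimal sketch of that idea (the two-state world and sensor accuracies below are invented for illustration, not the paper's dynamic Bayesian network model):

```python
import itertools
import math

# Hypothetical two-state world; p_correct[s] is sensor s's probability of
# reporting the true state (invented values for illustration).
p_correct = {"s1": 0.9, "s2": 0.7, "s3": 0.55}
prior = {0: 0.5, 1: 0.5}

def joint(sensors):
    """Joint distribution P(h, readings); sensors conditionally independent given h."""
    dist = {}
    for h, ph in prior.items():
        for readings in itertools.product([0, 1], repeat=len(sensors)):
            p = ph
            for s, r in zip(sensors, readings):
                p *= p_correct[s] if r == h else 1.0 - p_correct[s]
            dist[(h, readings)] = p
    return dist

def mutual_information(sensors):
    """I(H; S) by exhaustive enumeration over all reading combinations."""
    dist = joint(sensors)
    p_read = {}
    for (h, r), p in dist.items():
        p_read[r] = p_read.get(r, 0.0) + p
    return sum(p * math.log2(p / (prior[h] * p_read[r]))
               for (h, r), p in dist.items() if p > 0)

def greedy_select(k):
    """Myopically add the sensor with the largest marginal information gain."""
    chosen, remaining = [], set(p_correct)
    for _ in range(k):
        best = max(remaining, key=lambda s: mutual_information(chosen + [s]))
        chosen.append(best)
        remaining.remove(best)
    return chosen
```

    Because mutual information is computed by exhaustive enumeration here, this only scales to a handful of sensors; the paper's synergy-graph approximation exists precisely to avoid that cost.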

  18. Information Needs and Information Competencies: A Case Study of the Off-Site Supervision of Financial Institutions in Brazil

    Science.gov (United States)

    Miranda, Silvania V.; Tarapanoff, Kira M. A.

    2008-01-01

    Introduction: The paper deals with the identification of the information needs and information competencies of a professional group. Theoretical basis: A theoretical relationship between information needs and information competencies as subjects is proposed. Three dimensions are examined: cognitive, affective and situational. The recognition of an…

  19. Development of methods for theoretical analysis of nuclear reactors (Phase II), I-V, Part IV, Fuel depletion

    International Nuclear Information System (INIS)

    Pop-Jordanov, J.

    1962-10-01

    This report includes the analysis of plutonium isotopes from the U-238 depletion chain. Two theoretical approaches to solving fuel depletion are shown. One results in a system of differential equations that can be solved only by using electronic calculators; the second, the Machinari-Goto method, enables obtaining analytical equations for approximate values of particular nuclei. In addition, differential equations are given for different approximation levels in calculating Pu-239, as well as relations between the released energy and irradiation.

  20. Development and validation of a theoretical test in basic laparoscopy

    DEFF Research Database (Denmark)

    Strandbygaard, Jeanett; Maagaard, Mathilde; Larsen, Christian Rifbjerg

    2013-01-01

    for first-year residents in obstetrics and gynecology. This study therefore aimed to develop and validate a framework for a theoretical knowledge test, a multiple-choice test, in basic theory related to laparoscopy. METHODS: The content of the multiple-choice test was determined by conducting informal conversational interviews with experts in laparoscopy. The subsequent relevance of the test questions was evaluated using the Delphi method involving regional chief physicians. Construct validity was tested by comparing test results from three groups with expected different clinical competence and knowledge ….001). Internal consistency (Cronbach's alpha) was 0.82. There was no evidence of differential item functioning between the three groups tested. CONCLUSIONS: A newly developed knowledge test in basic laparoscopy proved to have content and construct validity. The formula for the development and validation...
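    The internal-consistency statistic quoted above, Cronbach's alpha, is a simple function of item and total-score variances: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A small sketch with invented item scores:

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for `scores`: one row of item scores per respondent."""
    k = len(scores[0])
    item_var = sum(pvariance(item) for item in zip(*scores))  # per-item variances
    total_var = pvariance([sum(row) for row in scores])       # total-score variance
    return k / (k - 1) * (1.0 - item_var / total_var)

# Invented toy data: three respondents answering four items.
data = [
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 0, 0],
]
alpha = cronbach_alpha(data)  # 20/27, roughly 0.741
```

    For a real test such as the one described, each row would hold one participant's scores on all multiple-choice items.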

  1. Information processing in the transcriptional regulatory network of yeast: Functional robustness

    Directory of Open Access Journals (Sweden)

    Dehmer Matthias

    2009-03-01

    Full Text Available Abstract Background Gene networks are considered to represent various aspects of molecular biological systems meaningfully because they naturally provide a systems perspective of molecular interactions. In this respect, a functional understanding of the transcriptional regulatory network is considered key to elucidating the functional organization of an organism. Results In this paper we study the functional robustness of the transcriptional regulatory network of S. cerevisiae. We model the information processing in the network as a first-order Markov chain and study the influence of single gene perturbations on the global, asymptotic communication among genes. Modification of the communication is measured by an information-theoretic measure that allows prediction of genes that are 'fragile' with respect to single gene knockouts. Our results demonstrate that the predicted set of fragile genes contains a statistically significant enrichment of so-called essential genes that are experimentally found to be necessary for vital yeast. Further, a structural analysis of the transcriptional regulatory network reveals significant differences between fragile genes, hub genes and genes with a high betweenness centrality value. Conclusion Our study demonstrates not only that a combination of graph-theoretical, information-theoretical and statistical methods leads to meaningful biological results, but also that such methods allow the study of information processing in gene networks rather than just their structural properties.
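    As a toy illustration of the approach (not the authors' exact measure), one can model information flow on a small network as a first-order Markov chain, simulate a single-gene knockout by rewiring one row of the transition matrix, and score fragility by the Kullback-Leibler divergence between the unperturbed and perturbed asymptotic distributions. The network and the knockout rule below are invented for the sketch:

```python
import math

# Invented 4-gene toy network: row-stochastic matrix, P[i][j] is the
# probability that "information" at gene i moves to gene j.
P = [
    [0.0, 0.5, 0.5, 0.0],
    [0.2, 0.0, 0.4, 0.4],
    [0.5, 0.5, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
]

def stationary(P, iters=1000):
    """Asymptotic distribution of the chain by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def knockout(P, g):
    """Toy knockout rule: gene g loses its targets and transitions uniformly."""
    Q = [row[:] for row in P]
    Q[g] = [1.0 / len(P)] * len(P)
    return Q

def kl(p, q):
    return sum(a * math.log2(a / b) for a, b in zip(p, q) if a > 0)

base = stationary(P)
impact = {g: kl(base, stationary(knockout(P, g))) for g in range(len(P))}
fragile = max(impact, key=impact.get)  # gene whose knockout perturbs communication most
```

    Genes whose knockout produces the largest divergence are flagged as fragile; the paper's analysis works analogously but on the S. cerevisiae network with its own perturbation and communication measures.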

  2. Knowledge acquisition process as an issue in information sciences

    Directory of Open Access Journals (Sweden)

    Boris Bosančić

    2016-07-01

    Full Text Available The paper presents an overview of some problems of information science which are explicitly portrayed in the literature. It covers the following issues: information explosion, information flood and data deluge, information retrieval and the relevance of information, and finally, the problem of scientific communication. The purpose of this paper is to explain why knowledge acquisition can be considered an issue in the information sciences. The existing theoretical foundation within the information sciences, i.e. the DIKW hierarchy and its key concepts (data, information, knowledge and wisdom), is recognized as a symbolic representation as well as the theoretical foundation of the knowledge acquisition process. Moreover, the relationship between the DIKW hierarchy and the knowledge acquisition process seems essential for a stronger foundation of the information sciences in the 'body' of overall human knowledge. In addition, the history of both human and machine knowledge acquisition is considered, and it is proposed that the DIKW hierarchy serve as a symbol of the general knowledge acquisition process, relating equally to human and machine knowledge acquisition. To achieve this goal, it is necessary to modify the existing concept of the DIKW hierarchy. An appropriate modification of the DIKW hierarchy (one is presented in this paper) could result in a much more solid theoretical foundation for the knowledge acquisition process and the information sciences as a whole. The theoretical assumptions on which the knowledge acquisition process may be established as a problem of information science are presented at the end of the paper. The knowledge acquisition process does not necessarily have to be the subject of epistemology. It may establish a stronger link between the concepts of data and knowledge; furthermore, it can be used in the context of scientific research, but on a more primitive level than conducting

  3. Theoretical value of psychological testing.

    Science.gov (United States)

    Shapiro, David

    2012-01-01

    Apart from their diagnostic value, psychological tests, especially the Rorschach test, have an important theoretical value for understanding psychopathology. They present a picture of a living person, in contrast to a picture of forces and agencies within the person. This rests on 2 advantages of tests over the usual psychiatric and psychoanalytic interviews: Tests are ahistorical and they present information primarily of a formal kind.

  4. THEORETICAL ASPECTS OF STATE REGULATION OF NATIONAL MACROECONOMIC ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Yevgen Maslennikov

    2017-09-01

    Full Text Available The question of the economic role of the state has long been posed, and to this day every state solves this problem in its own way, on the basis of accumulated experience and with regard to its customs and traditions. The objective need to include the state in the economic process is determined by such factors as: the need to ensure public reproduction on an extended scale, ensuring the long-term interests of the population, maintaining the balance of socio-economic interests of different population groups in the country, and ensuring the unity and integrity of the country's territorial space. For this reason, the purpose of the paper is to examine the main theoretical aspects of state regulation of the economy in historical retrospect and at the present stage of the development of society. Methodology. The methodological and informational basis of the investigation comprises scientific articles, periodicals and Internet resources. To achieve the goal set, the following general scientific and special methods were used: morphological analysis, system and structural-logical analysis, formalization, analogy, comparison and integration, and the tabular method. Results. As a result of the research, scientific and theoretical grounds for state regulation of the national macroeconomic environment were presented; forms and methods of state regulation, programming as a form of perspective state regulation, and principles of economic planning were considered in more detail. Practical implications. The considered forms of state regulation are relevant under current conditions of management and can be applied by states in accordance with their level of economic development. Value/originality. The authors present innovative forms of state regulation methods and analyze their effectiveness.

  5. Microanalysis in Music Therapy: Introduction and Theoretical basis

    DEFF Research Database (Denmark)

    Wosch, Thomas; Wigram, Tony

    2007-01-01

    In the context of music therapy, microanalysis is the detailed analysis of that short period of time during a music therapy session during which some kind of significant change takes place. These moments are crucial to the therapeutic process, and there is increasing interest amongst music therapists in understanding how they come about and whether there are ways of initiating them. The contributors to this groundbreaking book look at methods of micro process analyses used in a variety of music therapy contexts, both clinical and research-based. They outline their methods, which include using... The book provides a wealth of important theoretical and practical information for music therapy clinicians, educators and students.

  6. Modelling in Accounting. Theoretical and Practical Dimensions

    Directory of Open Access Journals (Sweden)

    Teresa Szot-Gabryś

    2010-10-01

    Full Text Available Accounting in the theoretical approach is a scientific discipline based on specific paradigms. In the practical aspect, accounting manifests itself through the introduction of a system for the measurement of economic quantities operating in a particular business entity. A characteristic of accounting is its flexibility and its ability to adapt to the needs of information recipients. One of the main currents in the development of accounting theory and practice is to bring under economic measurement areas hitherto not covered by any accounting system (this applies, for example, to small businesses, agricultural farms and human capital), which requires the development of appropriate theoretical and practical models. The article illustrates the issue of modelling in accounting based on the example of an accounting model developed for small businesses, i.e. economic entities which are not obliged by law to keep accounting records.

  7. A Theoretical Approach to Information Needs Across Different Healthcare Stakeholders

    Science.gov (United States)

    Raitoharju, Reetta; Aarnio, Eeva

    Increased access to medical information can lead to information overload among both the employees in the healthcare sector as well as among healthcare consumers. Moreover, medical information can be hard to understand for consumers who have no prerequisites for interpreting and understanding it. Information systems (e.g. electronic patient records) are normally designed to meet the demands of one professional group, for instance those of physicians. Therefore, the same information in the same form is presented to all the users of the systems regardless of the actual need or prerequisites. The purpose of this article is to illustrate the differences in information needs across different stakeholders in healthcare. A literature review was conducted to collect examples of these different information needs. Based on the findings the role of more user specific information systems is discussed.

  8. Information Diffusion in Facebook-Like Social Networks Under Information Overload

    Science.gov (United States)

    Li, Pei; Xing, Kai; Wang, Dapeng; Zhang, Xin; Wang, Hui

    2013-07-01

    Research on social networks has received remarkable attention, since many people use social networks to broadcast information and stay connected with their friends. However, due to information overload in social networks, it becomes increasingly difficult for users to find useful information. This paper considers Facebook-like social networks and models the process of information diffusion under information overload. The term view scope is introduced to model the user's information-processing capability under information overload, and the average number of times a message appears in view scopes after it is generated is proposed to characterize the information diffusion efficiency. Through theoretical analysis, we find that factors such as network structure and view scope number have no impact on the information diffusion efficiency, which is a surprising result. To verify the results, we conduct simulations; the simulation results are in perfect agreement with the theoretical analysis.

  9. X-ray fluorescence method for trace analysis and imaging

    International Nuclear Information System (INIS)

    Hayakawa, Shinjiro

    2000-01-01

    X-ray fluorescence analysis has a long history as a conventional bulk elemental analysis technique with medium sensitivity. With the use of synchrotron radiation, however, the x-ray fluorescence method has become a unique analytical technique that can provide trace elemental information with spatial resolution. To obtain quantitative information on trace elemental distributions by the x-ray fluorescence method, a theoretical description of the x-ray fluorescence yield is given. Moreover, methods and instruments for trace characterization with a scanning x-ray microprobe are described. (author)

  10. Towards a Set Theoretical Approach to Big Data Analytics

    DEFF Research Database (Denmark)

    Mukkamala, Raghava Rao; Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    Formal methods, models and tools for social big data analytics are largely limited to graph theoretical approaches such as social network analysis (SNA) informed by relational sociology. There are no other unified modeling approaches to social big data that integrate the conceptual, formal and software realms. In this paper, we first present and discuss a theory and conceptual model of social data. Second, we outline a formal model based on set theory and discuss the semantics of the formal model with a real-world social data example from Facebook. Third, we briefly present and discuss... the application of this technique to the data analysis of big social data collected from the Facebook page of the fast fashion company, H&M.

  11. Information geometric methods for complexity

    Science.gov (United States)

    Felice, Domenico; Cafaro, Carlo; Mancini, Stefano

    2018-03-01

    Research on the use of information geometry (IG) in modern physics has witnessed significant advances recently. In this review article, we report on the utilization of IG methods to define measures of complexity in both classical and, whenever available, quantum physical settings. A paradigmatic example of a dramatic change in complexity is given by phase transitions (PTs). Hence, we review both global and local aspects of PTs described in terms of the scalar curvature of the parameter manifold and the components of the metric tensor, respectively. We also report on the behavior of geodesic paths on the parameter manifold used to gain insight into the dynamics of PTs. Going further, we survey measures of complexity arising in the geometric framework. In particular, we quantify complexity of networks in terms of the Riemannian volume of the parameter space of a statistical manifold associated with a given network. We are also concerned with complexity measures that account for the interactions of a given number of parts of a system that cannot be described in terms of a smaller number of parts of the system. Finally, we investigate complexity measures of entropic motion on curved statistical manifolds that arise from a probabilistic description of physical systems in the presence of limited information. The Kullback-Leibler divergence, the distance to an exponential family and volumes of curved parameter manifolds, are examples of essential IG notions exploited in our discussion of complexity. We conclude by discussing strengths, limits, and possible future applications of IG methods to the physics of complexity.
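    A concrete instance of the IG notions surveyed above: for univariate Gaussians, the Fisher metric in coordinates (mu, sigma) is diag(1/sigma^2, 2/sigma^2), and for a small parameter displacement the Kullback-Leibler divergence is approximated by half the squared Fisher length of the displacement. A quick numerical check:

```python
import math

def kl_gauss(m1, s1, m2, s2):
    """KL divergence (in nats) between N(m1, s1^2) and N(m2, s2^2)."""
    return math.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

def fisher_half_length(dm, ds, s):
    """Half the squared Fisher length of displacement (dm, ds) at scale s;
    the Fisher metric of the Gaussian family is g = diag(1/s^2, 2/s^2)."""
    return 0.5 * (dm**2 + 2 * ds**2) / s**2

kl = kl_gauss(0.0, 1.0, 0.01, 1.02)         # exact divergence
quad = fisher_half_length(0.01, 0.02, 1.0)  # second-order IG approximation
```

    The two quantities agree to second order in the displacement, which is the sense in which the KL divergence induces the Fisher metric on a statistical manifold.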

  12. Neutron thermalization in absorbing infinite homogeneous media: theoretical methods; Methodes theoriques pour l'etude de la thermalisation des neutrons dans les milieux absorbants infinis et homogenes

    Energy Technology Data Exchange (ETDEWEB)

    Cadilhac, M [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1963-11-15

    After a general survey of the theory of neutron thermalization in homogeneous media, a proper formulation is used to introduce a simplified model generalizing both the Horowitz model (generalized heavy free gas approximation) and the proton gas model. With this model, the calculation of spectra reduces to the solution of linear second-order differential equations. Since it depends on two arbitrary functions, the model gives a good approximation of any usual moderator for reactor physics purposes. The choice of these functions is discussed from a theoretical point of view; a method based on the first two moments of the scattering law is investigated. Finally, the possibility of discriminating between models by using experimental information is considered. (author)

  13. Informality as a stepping stone: A search-theoretical assessment of informal sector and government policy

    Directory of Open Access Journals (Sweden)

    Semih Tümen

    2016-09-01

    Full Text Available This paper develops a model of sequential job search to understand the factors determining the effect of tax and enforcement policies on the size (i.e., employment share) of the informal sector. The focus is on the role of informal sector as a stepping stone to formal jobs. I argue that the stepping-stone role of informal jobs is an important concept determining how strongly government policies affect the size of informal sector. I measure the extent of the stepping-stone role with the intensity of skill accumulation in the informal sector. If informal jobs help workers acquire skills, gain expertise, and build professional networks for boosting the chances to switch to a formal job, then the size of informal sector is less sensitive to government policy. In this case, the option value of a job in informal sector will be high and a worker with an informal job will not rush to switch to a formal job when a policy encouraging formal employment is in effect. If, on the other hand, informal sector does not provide satisfactory training opportunities, then the size of informal sector becomes more sensitive to government policy. Calibrating the model to the Brazilian data, I perform numerical exercises confirming that the effect of government policy on the size of informal sector is a decreasing function of the intensity of skill acquisition in the informal sector.

  14. Modelling in Accounting. Theoretical and Practical Dimensions

    OpenAIRE

    Teresa Szot-Gabryś

    2010-01-01

    Accounting in the theoretical approach is a scientific discipline based on specific paradigms. In the practical aspect, accounting manifests itself through the introduction of a system for measurement of economic quantities which operates in a particular business entity. A characteristic of accounting is its flexibility and ability of adaptation to information needs of information recipients. One of the main currents in the development of accounting theory and practice is to cover by economic...

  15. Theoretical Relevance of Neuropsychological Data for Connectionist Modelling

    Directory of Open Access Journals (Sweden)

    Mauricio Iza

    2011-05-01

    Full Text Available The symbolic information-processing paradigm in cognitive psychology has met a growing challenge from neural network models over the past two decades. While neuropsychological evidence has been of great utility to theories concerned with information processing, the real question is whether the less rigid connectionist models provide valid, or enough, information concerning complex cognitive structures. In this work, we will discuss the theoretical implications that neuropsychological data poses for modelling cognitive systems.

  16. How do small groups make decisions? : A theoretical framework to inform the implementation and study of clinical competency committees.

    Science.gov (United States)

    Chahine, Saad; Cristancho, Sayra; Padgett, Jessica; Lingard, Lorelei

    2017-06-01

    In the competency-based medical education (CBME) approach, clinical competency committees are responsible for making decisions about trainees' competence. However, we currently lack a theoretical model for group decision-making to inform this emerging assessment phenomenon. This paper proposes an organizing framework to study and guide the decision-making processes of clinical competency committees. This is an explanatory, non-exhaustive review, tailored to identify relevant theoretical and evidence-based papers related to small group decision-making. The search was conducted using Google Scholar, Web of Science, MEDLINE, ERIC, and PsycINFO for relevant literature. Using a thematic analysis, two researchers (SC & JP) met four times between April and June 2016 to consolidate the literature included in this review. Three theoretical orientations towards group decision-making emerged from the review: schema, constructivist, and social influence. Schema orientations focus on how groups use algorithms for decision-making. Constructivist orientations focus on how groups construct their shared understanding. Social influence orientations focus on how individual members influence the group's perspective on a decision. Moderators of decision-making relevant to all orientations include: guidelines, stressors, authority, and leadership. Clinical competency committees are the mechanisms by which groups of clinicians will be in charge of interpreting multiple assessment data points and coming to a shared decision about trainee competence. The way in which these committees make decisions can have huge implications for trainee progression and, ultimately, patient care. Therefore, there is a pressing need to build the science of how such group decision-making works in practice. This synthesis suggests a preliminary organizing framework that can be used in the implementation and study of clinical competency committees.

  17. Control method for biped locomotion robots based on ZMP information

    International Nuclear Information System (INIS)

    Kume, Etsuo

    1994-01-01

    The Human Acts Simulation Program (HASP) started as a ten-year program of the Computing and Information Systems Center (CISC) at the Japan Atomic Energy Research Institute (JAERI) in 1987. A mechanical design study of biped locomotion robots for patrol and inspection in nuclear facilities is being performed as an item of the research scope. One of the goals of our research is to design a biped locomotion robot for practical use in nuclear facilities. So far, we have studied several dynamic walking patterns. Conventional control methods for biped locomotion robots use program control based on preset walking patterns, and thus lack robustness to dynamic changes of the walking pattern. Therefore, a real-time control method based on dynamic information about the robot's state is necessary for high walking performance. In this study a new control method based on Zero Moment Point (ZMP) information is proposed as a real-time control method. The proposed method is discussed and validated through numerical simulation. (author)
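    For reference, the standard multi-mass Zero Moment Point formula that such ZMP-based controllers evaluate in real time is x_zmp = sum_i m_i((z''_i + g)x_i - x''_i z_i) / sum_i m_i(z''_i + g). A minimal sketch (the masses, positions and accelerations below are arbitrary illustration values):

```python
def zmp_x(masses, xs, zs, ax, az, g=9.81):
    """x-coordinate of the Zero Moment Point for a set of point masses
    (the y-coordinate is analogous):
    x_zmp = sum(m * ((az + g) * x - ax * z)) / sum(m * (az + g))."""
    num = sum(m * ((a_z + g) * x - a_x * z)
              for m, x, z, a_x, a_z in zip(masses, xs, zs, ax, az))
    den = sum(m * (a_z + g) for m, a_z in zip(masses, az))
    return num / den

# Static body (all accelerations zero): the ZMP sits under the centre of mass.
x_static = zmp_x([1.0, 2.0], [0.0, 0.3], [0.8, 1.0], [0.0, 0.0], [0.0, 0.0])  # 0.2
```

    In the static case the ZMP coincides with the ground projection of the centre of mass, which the example above confirms; dynamic stability requires keeping the ZMP inside the support polygon of the feet.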

  18. Sibutramine characterization and solubility, a theoretical study

    Science.gov (United States)

    Aceves-Hernández, Juan M.; Nicolás Vázquez, Inés; Hinojosa-Torres, Jaime; Penieres Carrillo, Guillermo; Arroyo Razo, Gabriel; Miranda Ruvalcaba, René

    2013-04-01

    Solubility data for sibutramine (SBA) in a family of alcohols were obtained at different temperatures. Sibutramine was characterized using thermal analysis and the X-ray diffraction technique. Solubility data were obtained by the saturation method. The van't Hoff equation was used to obtain the theoretical solubility values and the ideal solvent activity coefficient. No polymorphic phenomena were found in the X-ray diffraction analysis, even though this compound is a racemic mixture of (+) and (-) enantiomers. Theoretical calculations showed that the polarisable continuum model was able to reproduce the solubility and stability of the sibutramine molecule in the gas phase, water and a family of alcohols at the B3LYP/6-311++G(d,p) level of theory. The dielectric constant, dipole moment and solubility in water were used as physical parameters in those theoretical calculations to explain this behavior. Experimental and theoretical results were compared and good agreement was obtained. Sibutramine solubility increased from methanol to 1-octanol in both theoretical and experimental results.
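    The van't Hoff relation mentioned above gives the ideal mole-fraction solubility as ln x = -(ΔH_fus/R)(1/T - 1/T_m). A sketch with a hypothetical fusion enthalpy and melting point (illustration values, not measured sibutramine data):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def ideal_solubility(dh_fus, t_melt, t):
    """Ideal mole-fraction solubility from the van't Hoff relation:
    ln x = -(dH_fus / R) * (1/T - 1/T_melt)."""
    return math.exp(-dh_fus / R * (1.0 / t - 1.0 / t_melt))

# Hypothetical fusion enthalpy (J/mol) and melting point (K).
x_cold = ideal_solubility(25000.0, 465.0, 298.15)
x_warm = ideal_solubility(25000.0, 465.0, 318.15)
```

    Consistent with the abstract, the predicted solubility rises with temperature and reaches x = 1 at the melting point; deviations of measured data from this ideal curve are absorbed by the activity coefficient.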

  19. An Order-Theoretic Quantification of Contextuality

    Directory of Open Access Journals (Sweden)

    Ian T. Durham

    2014-09-01

    Full Text Available In this essay, I develop order-theoretic notions of determinism and contextuality on domains and topoi. In the process, I develop a method for quantifying contextuality and show that the order-theoretic sense of contextuality is analogous to the sense embodied in the topos-theoretic statement of the Kochen–Specker theorem. Additionally, I argue that this leads to a relation between the entropy associated with measurements on quantum systems and the second law of thermodynamics. The idea that the second law has its origin in the ordering of quantum states and processes dates to at least 1958 and possibly earlier. The suggestion that the mechanism behind this relation is contextuality is made here for the first time.

  20. Fuel cycle covariance of plutonium and americium separations to repository capacity using information theoretic measures

    International Nuclear Information System (INIS)

    Scopatz, Anthony; Schneider, Erich; Li, Jun; Yim, Man-Sung

    2011-01-01

    A light water reactor / fast reactor symbiotic fuel cycle scenario was modeled and parameterized with thirty independent inputs. By simultaneously and stochastically sampling values for each of these inputs and performing the associated fuel cycle mass-balance calculation, the fuel cycle itself was subjected to Monte Carlo simulation. A novel information-theoretic metric is postulated as a measure of system-wide covariance. This metric is the coefficient of variation of the set of uncertainty coefficients generated from 2D slices of a 3D contingency table. It is then applied to the fuel cycle, taking fast reactor used fuel plutonium and americium separations as independent variables and the capacity of a fully-loaded tuff repository as the response. This set of parameters is known from prior studies to have a strong covariance. When measured against all 435 other input parameter pairs possible, the fast reactor plutonium and americium separations pair was found to be ranked the second most covariant. This verifies that the coefficient of variation metric captures the desired sensitivity-of-sensitivity effects in the nuclear fuel cycle. (author)
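    The metric postulated in the abstract builds on Theil's uncertainty coefficient U(Y|X) = I(X;Y)/H(Y), computed for 2D slices of a 3D contingency table, whose spread is then summarized by the coefficient of variation. A small sketch with an invented 2x2x2 table (the paper's tables are far larger):

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def uncertainty_coefficient(table):
    """Theil's U(Y|X) = I(X;Y) / H(Y) for a 2D count table (rows X, cols Y)."""
    total = float(sum(map(sum, table)))
    joint = [[c / total for c in row] for row in table]
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    h_y = entropy(py)
    h_y_given_x = sum(px[i] * entropy([p / px[i] for p in joint[i]])
                      for i in range(len(px)) if px[i] > 0)
    return (h_y - h_y_given_x) / h_y

def coefficient_of_variation(values):
    mean = sum(values) / len(values)
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    return sd / mean

# Invented 2x2x2 contingency "cube"; 2D slices are taken along the first axis.
cube = [
    [[8, 2], [1, 9]],   # strongly associated slice
    [[5, 5], [6, 4]],   # nearly independent slice
]
us = [uncertainty_coefficient(s) for s in cube]
cv = coefficient_of_variation(us)
```

    A large coefficient of variation across slices signals that the association in one 2D slice depends strongly on the third variable, which is the covariance signature the paper exploits.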

  1. Lightning Talks 2015: Theoretical Division

    Energy Technology Data Exchange (ETDEWEB)

    Shlachter, Jack S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-11-25

    This document is a compilation of slides from a number of student presentations given to LANL Theoretical Division members. The subjects cover the range of activities of the Division, including plasma physics, environmental issues, materials research, bacterial resistance to antibiotics, and computational methods.

  2. Decision-Making under Ambiguity Is Modulated by Visual Framing, but Not by Motor vs. Non-Motor Context. Experiments and an Information-Theoretic Ambiguity Model.

    Science.gov (United States)

    Grau-Moya, Jordi; Ortega, Pedro A; Braun, Daniel A

    2016-01-01

    A number of recent studies have investigated differences in human choice behavior depending on task framing, especially comparing economic decision-making to choice behavior in equivalent sensorimotor tasks. Here we test whether decision-making under ambiguity exhibits effects of task framing in motor vs. non-motor context. In a first experiment, we designed an experience-based urn task with varying degrees of ambiguity and an equivalent motor task where subjects chose between hitting partially occluded targets. In a second experiment, we controlled for the different stimulus design in the two tasks by introducing an urn task with bar stimuli matching those in the motor task. We found ambiguity attitudes to be mainly influenced by stimulus design. In particular, we found that the same subjects tended to be ambiguity-preferring when choosing between ambiguous bar stimuli, but ambiguity-avoiding when choosing between ambiguous urn sample stimuli. In contrast, subjects' choice pattern was not affected by changing from a target hitting task to a non-motor context when keeping the stimulus design unchanged. In both tasks subjects' choice behavior was continuously modulated by the degree of ambiguity. We show that this modulation of behavior can be explained by an information-theoretic model of ambiguity that generalizes Bayes-optimal decision-making by combining Bayesian inference with robust decision-making under model uncertainty. Our results demonstrate the benefits of information-theoretic models of decision-making under varying degrees of ambiguity for a given context, but also demonstrate the sensitivity of ambiguity attitudes across contexts that theoretical models struggle to explain.
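    One standard way to combine Bayesian inference with robustness to model uncertainty, in the spirit of the model described (though not necessarily the authors' exact formulation), is the free-energy value with a KL-divergence penalty on deviations from a reference model: V(a) = -(1/beta) log E_p0[exp(-beta U(a, m))]. A minimal sketch:

```python
import math

def robust_value(utilities, prior, beta):
    """Free-energy value -(1/beta) * log E_p0[exp(-beta * U)]: expected
    utility penalized by the KL divergence of a worst-case model from the
    reference prior p0. beta sets the strength of ambiguity aversion."""
    z = sum(p * math.exp(-beta * u) for p, u in zip(prior, utilities))
    return -math.log(z) / beta

prior = [0.5, 0.5]
utils = [1.0, 0.0]  # one action's payoff under two candidate models

nearly_neutral = robust_value(utils, prior, 0.01)  # close to the mean, 0.5
very_averse = robust_value(utils, prior, 100.0)    # close to the worst case, 0.0
```

    Small beta recovers risk-neutral expected utility; large beta approaches the worst case over models, so beta plays the role of an ambiguity-attitude parameter modulated continuously by the degree of ambiguity.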

  5. Practical Methods for Information Security Risk Management

    Directory of Open Access Journals (Sweden)

    Cristian AMANCEI

    2011-01-01

    Full Text Available The purpose of this paper is to present some directions for performing risk management for information security. The article covers practical methods: a questionnaire that assesses internal control, and an evaluation based on existing controls as part of vulnerability assessment. The methods presented contain all the key elements that contribute to risk management: the elements proposed for the evaluation questionnaire, a list of threats, resource classification and evaluation, the correlation between risks and controls, and residual risk computation.
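    The residual-risk computation mentioned above can be sketched with a common scoring formula: inherent risk (likelihood times impact) scaled by the fraction of risk the existing controls do not mitigate. The paper's exact formula is not given in the abstract, so this is an illustrative assumption.

```python
def residual_risk(threat_likelihood, impact, control_effectiveness):
    """Residual risk after controls: inherent risk (likelihood x impact)
    scaled by the share of risk the existing controls fail to mitigate."""
    inherent = threat_likelihood * impact              # inherent risk score
    return inherent * (1.0 - control_effectiveness)    # unmitigated share

# Example: likely threat (0.8), high impact (9/10), controls 70% effective.
risk = residual_risk(0.8, 9, 0.70)
print(round(risk, 2))  # 2.16
```

    Comparing this residual score against an acceptance threshold is what turns the questionnaire and control evaluation into a risk-treatment decision.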

  6. Theoretical and Methodological Aspects of Assessment of the Adaptation Potential of Personnel

    Directory of Open Access Journals (Sweden)

    Sesina Iryna M.

    2014-02-01

    Full Text Available The article is devoted to the development of theoretical and methodological recommendations for assessing the adaptation potential of employees, an important prerequisite for employee development and for ensuring the competitiveness of an enterprise. It contains the author’s interpretation of adaptation potential as the capacity to adjust to the environment with the aim of achieving the socio-economic goals of an enterprise. Adaptation potential is a property of a person as a performer of labour functions: the ability to master new methods of work, to adjust to new labour conditions and to process information, as well as a communicative property. At the same time, adaptation potential is an aggregate of the motivational, professional, informational and integrative components of a person. For assessing the adaptation potential it is proposed to combine the 360-degree method with the method of paired comparison, which increases the trustworthiness of the results. The author identifies several criteria for assessing the adaptation potential: ratio of professional experience, ratio of official experience, ratio of work efficiency, independence in mastering new methods of work, fast adjustment to new labour conditions, ability to quickly process large volumes of information, mobility, high productivity under different labour conditions, resourcefulness in different production situations, and the ability to form interpersonal relations in a collective, along with psychological features.
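    The paired-comparison method combined with 360-degree ratings above can be sketched as follows. This is a generic sketch of paired-comparison scoring, not the author's scheme; the employees and the adaptability scores are hypothetical.

```python
def paired_comparison_ranking(items, prefer):
    """Method of paired comparison: every pair is compared once and the
    preferred item earns a point; the point totals give a ranking."""
    scores = {item: 0 for item in items}
    for i, a in enumerate(items):
        for b in items[i + 1:]:
            scores[prefer(a, b)] += 1
    return sorted(scores, key=lambda item: -scores[item])

# Hypothetical example: rank three employees by an adaptability rating
# (in practice the rating would aggregate 360-degree feedback).
adaptability = {"A": 0.9, "B": 0.6, "C": 0.7}
ranking = paired_comparison_ranking(list(adaptability),
                                    lambda a, b: max(a, b, key=adaptability.get))
print(ranking)  # ['A', 'C', 'B']
```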

  7. Overview. Department of Theoretical Physics. Section 4

    Energy Technology Data Exchange (ETDEWEB)

    Kwiecinski, J. [Institute of Nuclear Physics, Cracow (Poland)

    1995-12-31

    Research activity of the Department of Theoretical Physics spans a wide variety of problems in theoretical high-energy and elementary particle physics, theoretical nuclear physics, the theory of nuclear matter, quark-gluon plasma and relativistic heavy-ion collisions, theoretical astrophysics, as well as general physics. Theoretical research in high-energy and elementary particle physics is concentrated on the theory of deep inelastic lepton scattering in the region of low x and its phenomenological implications for the ep collider HERA at DESY, on the theory of nonleptonic decays of hadrons, and on low-energy ππ and K-anti-K interactions and scalar meson spectroscopy. The activity in the theory of relativistic heavy-ion collisions is focused on the study of quark condensate fluctuations, on the analysis of critical scattering near the chiral phase transition, and on Bose-Einstein correlations in heavy-ion collisions. Theoretical studies in nuclear physics and in the theory of nuclear matter concern the analysis of models with dynamical symmetry based on the group Sp(6,R) for the description of collective modes of atomic nuclei, the analysis of Goldstone bosons in nuclear matter, and the analysis of the saturation properties of nuclear matter. Research in theoretical astrophysics is mainly devoted to the analysis of the magnetic properties of hadronic matter in neutron stars with a proton admixture. Studies in general physics concern problems related to the Galilean covariance of classical and quantum mechanics. The detailed results obtained in the various fields are summarised in the presented abstracts, together with information about personnel, publications, contributions to conferences, reports, workshops and seminars.

  10. 3D nonrigid medical image registration using a new information theoretic measure

    Science.gov (United States)

    Li, Bicao; Yang, Guanyu; Coatrieux, Jean Louis; Li, Baosheng; Shu, Huazhong

    2015-11-01

    This work presents a novel method for the nonrigid registration of medical images based on the Arimoto entropy, a generalization of the Shannon entropy. The proposed method employs the Jensen-Arimoto divergence as a similarity metric to measure the statistical dependence between medical images. Free-form deformations are adopted as the transformation model, and Parzen window estimation is applied to compute the probability distributions. A penalty term is incorporated into the objective function to smooth the nonrigid transformation. Registration then amounts to minimizing an objective function consisting of a dissimilarity term and the penalty term, which reaches its minimum when the two images are perfectly aligned; the limited-memory BFGS optimization method is used to find the optimal geometric transformation. To validate the performance of the proposed method, experiments on both simulated 3D brain MR images and real 3D thoracic CT data sets were designed and performed within the open source elastix package. For the simulated experiments, the registration errors of 3D brain MR images with various magnitudes of known deformations and different levels of noise were measured. For the real data tests, four 4D thoracic CT data sets from four patients were selected to assess the registration performance of the method, each comprising ten 3D CT images covering an entire respiration cycle. The results were compared with the normalized cross correlation and mutual information methods and show a slight but genuine improvement in registration accuracy.
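    The Arimoto entropy and a Jensen-type divergence built on it can be sketched for discrete histograms as below. This is one common form of the Arimoto entropy, with the divergence constructed by analogy with the Jensen-Shannon divergence (mixture entropy minus mean entropy); the order `alpha = 2` and the example histograms are illustrative, not the paper's settings.

```python
import numpy as np

def arimoto_entropy(p, alpha=2.0):
    """Arimoto entropy of order alpha (alpha > 0, alpha != 1);
    it converges to the Shannon entropy as alpha -> 1."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()  # normalise to a probability distribution
    return alpha / (1.0 - alpha) * ((p ** alpha).sum() ** (1.0 / alpha) - 1.0)

def jensen_arimoto(p, q, alpha=2.0):
    """Jensen-Arimoto divergence: entropy of the equal mixture minus the
    mean of the individual entropies (cf. the Jensen-Shannon divergence)."""
    p = np.asarray(p, dtype=float) / np.sum(p)
    q = np.asarray(q, dtype=float) / np.sum(q)
    m = 0.5 * (p + q)
    return arimoto_entropy(m, alpha) - 0.5 * (arimoto_entropy(p, alpha)
                                              + arimoto_entropy(q, alpha))

p = [0.7, 0.2, 0.1]  # intensity histogram of a reference image region
q = [0.1, 0.2, 0.7]  # histogram of a misaligned moving image region
print(jensen_arimoto(p, p))      # 0.0 for identical histograms
print(jensen_arimoto(p, q) > 0)  # True: the divergence grows with misalignment
```

    In a registration loop this divergence (plus the smoothness penalty) would be the quantity driven to its minimum by the optimizer.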

  12. Information Theory for Information Science: Antecedents, Philosophy, and Applications

    Science.gov (United States)

    Losee, Robert M.

    2017-01-01

    This paper provides an historical overview of the theoretical antecedents leading to information theory, specifically those useful for understanding and teaching information science and systems. Information may be discussed in a philosophical manner and at the same time be measurable. This notion of information can thus be the subject of…

  13. Methods of Certification Testing of PLC Networks for Information Security Compliance

    Directory of Open Access Journals (Sweden)

    A. A. Balaev

    2011-12-01

    Full Text Available The aim of this research was to describe a methodology for auditing PLC networks for compliance with information security requirements. The technique is based on the provisions of the guidance documents of FSTEC of Russia and on model methods for testing informatization objects for compliance with information security requirements.

  14. A Task-Oriented Disaster Information Correlation Method

    Science.gov (United States)

    Linyao, Q.; Zhiqiang, D.; Qing, Z.

    2015-07-01

    With the rapid development of sensor networks and Earth observation technology, a large quantity of disaster-related data is available, such as remotely sensed data, historic data, case data, simulated data, and disaster products. However, current data management and service systems have become increasingly inefficient because of the variety of tasks and the heterogeneity of the data. For emergency task-oriented applications, data searches rely primarily on human experience applied to simple metadata indices, whose high time consumption and low accuracy cannot satisfy the speed and veracity requirements for disaster products. In this paper, a task-oriented correlation method is proposed for efficient disaster data management and intelligent service, with the objectives of 1) putting forward a disaster task ontology and a data ontology to unify the different semantics of multi-source information, 2) identifying the semantic mapping from emergency tasks to multiple data sources on the basis of the uniform description in 1), and 3) linking task-related data automatically and calculating the correlation between each data set and a given task. The method goes beyond traditional static management of disaster data and establishes a basis for intelligent retrieval and active dissemination of disaster information. The case study presented in this paper illustrates the use of the method on an example flood emergency relief task.
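    The abstract does not specify the correlation measure itself; as a minimal illustrative stand-in, the overlap between an ontology-mapped task description and a data set's metadata terms can be scored with a Jaccard index. The term sets and function name below are hypothetical.

```python
def task_data_correlation(task_terms, data_terms):
    """Illustrative task-data correlation: Jaccard overlap between the
    ontology terms of an emergency task and a data set's metadata terms."""
    t, d = set(task_terms), set(data_terms)
    return len(t & d) / len(t | d) if (t or d) else 0.0

# Hypothetical flood-relief task scored against two candidate data sets.
task = {"flood", "water-level", "terrain", "population"}
remote_sensing = {"flood", "water-level", "terrain", "cloud-cover"}
census = {"population", "income"}
print(task_data_correlation(task, remote_sensing))  # 0.6
print(task_data_correlation(task, census))          # 0.2
```

    Ranking data sets by such a score is what would replace the manual, experience-based search the abstract criticizes.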

  15. Theoretical Foundations of Study of Cartography

    Science.gov (United States)

    Talhofer, Václav; Hošková-Mayerová, Šárka

    2018-05-01

    Cartography and geoinformatics are technical fields which deal with the modelling and visualization of the landscape in the form of a map. A theoretical foundation, based mainly on mathematics, must be acquired during the study of cartography and geoinformatics. For these subjects, mathematics is necessary for understanding many procedures connected to modelling the Earth as a celestial body, to ways of projecting it onto a plane, to methods and procedures for modelling the landscape and phenomena in society, and to the visualization of these models in the form of electronic as well as classic paper maps. Not only general mathematics but also its extensions, the differential geometry of curves and surfaces, methods for approximating lines and functional surfaces, mathematical statistics and multi-criteria analysis, are suitable and necessary. Underestimating the significance of mathematical education in cartography and geoinformatics lowers the competence of cartographers and of professionals in geographic information science and technology to solve problems.

  16. A Theoretical Foundation for Tilden's Interpretive Principles.

    Science.gov (United States)

    Hammitt, William E.

    1981-01-01

    Draws from perceptual and cognitive psychology to present a theoretical basis for the principles of interpretation developed by Freeman Tilden. Emphasized is cognitive map theory which holds that information units people receive, code and store are structured into cognitive models intended to represent the environment. (Author/WB)

  17. Promoting mental wellbeing: developing a theoretically and empirically sound complex intervention.

    Science.gov (United States)

    Millar, S L; Donnelly, M

    2014-06-01

    This paper describes the development of a complex intervention to promote mental wellbeing using the revised framework for developing and evaluating complex interventions produced by the UK Medical Research Council (UKMRC). Application of the first two phases of the framework is described: development, and feasibility and piloting. The theoretical case and evidence base were examined analytically to explicate the theoretical and empirical foundations of the intervention. These findings informed the design of a 12-week mental wellbeing promotion programme providing early intervention for people showing signs of mental health difficulties. The programme is based on the theoretical constructs of self-efficacy, self-esteem, purpose in life, resilience and social support and comprises 10 steps. A mixed methods approach was used to conduct a feasibility study with community and voluntary sector service users and in primary care. A significant increase in mental wellbeing was observed following participation in the intervention. Qualitative data corroborated this finding and suggested that the intervention was feasible to deliver and acceptable to participants, facilitators and health professionals. The revised UKMRC framework can be successfully applied to the development of public health interventions.

  18. A Theoretical Bayesian Game Model for the Vendor-Retailer Relation

    Directory of Open Access Journals (Sweden)

    Emil CRIŞAN

    2012-06-01

    Full Text Available We consider an equilibrated supply chain with two equal partners, a vendor and a retailer (also called a newsboy-type products supply chain). The actions of each partner are driven by profit. Given that external influences at the supply chain level affect costs and, consequently, profit, we use a game-theoretic model of the situation that considers costs and demand. At the theoretical level, both symmetric and asymmetric information patterns are considered. At every level of a supply chain there are situations in which external factors (such as inflation or raw material prices) influence the position of each partner, even if information is well shared within the chain. The model we propose considers both the external factors and asymmetric information within a supply chain.
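    The "newsboy" (newsvendor) setting mentioned above has a standard benchmark solution under symmetric information: order up to the critical fractile of the demand distribution. A minimal sketch, assuming normally distributed demand; the distribution, prices and function name are illustrative, not taken from the paper.

```python
from statistics import NormalDist

def newsvendor_quantity(mean, std, price, cost, salvage=0.0):
    """Classic newsvendor order quantity for Normal(mean, std) demand:
    order up to the critical fractile cu / (cu + co), where cu is the
    unit underage cost (lost margin) and co the unit overage cost."""
    cu = price - cost      # margin lost per unit of unmet demand
    co = cost - salvage    # loss per leftover unit
    critical_fractile = cu / (cu + co)
    z = NormalDist().inv_cdf(critical_fractile)  # standard-normal quantile
    return mean + z * std

# Margin 6 vs. overage 4 gives a fractile of 0.6, so the optimal order
# lies slightly above mean demand.
q = newsvendor_quantity(mean=100, std=20, price=10, cost=4)
print(q > 100)  # True
```

    Asymmetric information, as in the paper's Bayesian game, would make `mean` and `std` (or the cost parameters) private beliefs rather than common knowledge.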

  19. Developing a targeted, theory-informed implementation intervention using two theoretical frameworks to address health professional and organisational factors: a case study to improve the management of mild traumatic brain injury in the emergency department.

    Science.gov (United States)

    Tavender, Emma J; Bosch, Marije; Gruen, Russell L; Green, Sally E; Michie, Susan; Brennan, Sue E; Francis, Jill J; Ponsford, Jennie L; Knott, Jonathan C; Meares, Sue; Smyth, Tracy; O'Connor, Denise A

    2015-05-25

    Despite the availability of evidence-based guidelines for the management of mild traumatic brain injury in the emergency department (ED), variations in practice exist. Interventions designed to implement recommended behaviours can reduce this variation. Using theory to inform intervention development is advocated; however, there is no consensus on how to select or apply theory. Integrative theoretical frameworks, based on syntheses of theories and theoretical constructs relevant to implementation, have the potential to assist in the intervention development process. This paper describes the process of applying two theoretical frameworks to investigate the factors influencing recommended behaviours and the choice of behaviour change techniques and modes of delivery for an implementation intervention. A stepped approach was followed: (i) identification of locally applicable and actionable evidence-based recommendations as targets for change, (ii) selection and use of two theoretical frameworks for identifying barriers to and enablers of change (Theoretical Domains Framework and Model of Diffusion of Innovations in Service Organisations) and (iii) identification and operationalisation of intervention components (behaviour change techniques and modes of delivery) to address the barriers and enhance the enablers, informed by theory, evidence and feasibility/acceptability considerations. We illustrate this process in relation to one recommendation, prospective assessment of post-traumatic amnesia (PTA) by ED staff using a validated tool. Four recommendations for managing mild traumatic brain injury were targeted with the intervention. The intervention targeting the PTA recommendation consisted of 14 behaviour change techniques and addressed 6 theoretical domains and 5 organisational domains. The mode of delivery was informed by six Cochrane reviews. It was delivered via five intervention components: (i) local stakeholder meetings, (ii) identification of local opinion

  20. How to Create Business Value through Information Technology (A Case Study on Automotive Production)

    Directory of Open Access Journals (Sweden)

    Kamran Feizi

    2018-03-01

    Full Text Available This study aimed at designing an information technology business value model in order to explain the value of information technology in business and to evaluate the contribution of information technology to organizational performance. For a comprehensive analysis of the subject using qualitative data, a mixed-methods design with a sequential qualitative emphasis was adopted. The first stage of the research strategy was an exploratory single case study; for the second phase, the researchers used a focus group design. Saipa Corporation, one of the best-known firms in the Iranian automotive industry, was selected as the case. Varied qualitative data were gathered through interviews, document review and observation, and were then analyzed. Ultimately, the findings of the study highlighted the items and dimensions of information technology business value. The model relates the dimensions of the relevant theoretical models to the contribution of information technology to building a business. To ensure generalizability, the results of the research were compared with those in the literature review, and the theoretical adequacy of the proposed framework was confirmed.

  1. International Conference on Theoretical and Computational Physics

    CERN Document Server

    2016-01-01

    Int'l Conference on Theoretical and Computational Physics (TCP 2016) will be held from August 24 to 26, 2016 in Xi'an, China. This Conference will cover issues in Theoretical and Computational Physics and is dedicated to creating a stage for exchanging the latest research results and sharing advanced research methods. TCP 2016 will be an important platform for inspiring international and interdisciplinary exchange at the forefront of Theoretical and Computational Physics. The Conference will bring together researchers, engineers, technicians and academicians from all over the world, and we cordially invite you to take this opportunity to join us for academic exchange and to visit the ancient city of Xi'an.

  2. Exact Partial Information Decompositions for Gaussian Systems Based on Dependency Constraints

    Directory of Open Access Journals (Sweden)

    Jim W. Kay

    2018-03-01

    Full Text Available The Partial Information Decomposition, introduced by Williams P. L. et al. (2010), provides a theoretical framework to characterize and quantify the structure of multivariate information sharing. A new method (Idep) has recently been proposed by James R. G. et al. (2017) for computing a two-predictor partial information decomposition over discrete spaces. A lattice of maximum entropy probability models is constructed based on marginal dependency constraints, and the unique information that a particular predictor has about the target is defined as the minimum increase in joint predictor-target mutual information when that particular predictor-target marginal dependency is constrained. Here, we apply the Idep approach to Gaussian systems, for which the marginally constrained maximum entropy models are Gaussian graphical models. Closed-form solutions for the Idep PID are derived for both univariate and multivariate Gaussian systems. Numerical and graphical illustrations are provided, together with practical and theoretical comparisons of the Idep PID with the minimum mutual information partial information decomposition (Immi), which was discussed by Barrett A. B. (2015). The results obtained using Idep appear to be more intuitive than those given by other methods, such as Immi, in which the redundant and unique information components are constrained to depend only on the predictor-target marginal distributions. In particular, it is proved that the Immi method generally produces larger estimates of redundancy and synergy than does the Idep method. In the discussion of the practical examples, the PIDs are complemented by the use of tests of deviance for the comparison of Gaussian graphical models.
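    For jointly Gaussian variables, the mutual informations entering such a decomposition have closed forms in the covariance matrix, which makes the minimum-mutual-information PID (the Immi of Barrett, 2015, where redundancy is the smaller of the two single-predictor informations) easy to sketch. The covariance values below are illustrative; the Idep construction itself is more involved and is not reproduced here.

```python
import numpy as np

def gaussian_mi(cov, x_idx, y_idx):
    """Mutual information I(X;Y) in nats for jointly Gaussian variables,
    computed from the covariance matrix:
    I = 0.5 * log( det(S_X) * det(S_Y) / det(S_XY) )."""
    x, y = list(x_idx), list(y_idx)
    det = lambda idx: np.linalg.det(cov[np.ix_(idx, idx)])
    return 0.5 * np.log(det(x) * det(y) / det(x + y))

def immi_pid(cov, x1=0, x2=1, y=2):
    """Minimum-mutual-information PID for a trivariate Gaussian system:
    redundancy is the smaller single-predictor MI, so the weaker
    predictor carries no unique information by construction."""
    i1 = gaussian_mi(cov, [x1], [y])
    i2 = gaussian_mi(cov, [x2], [y])
    i12 = gaussian_mi(cov, [x1, x2], [y])
    red = min(i1, i2)
    return {"redundancy": red,
            "unique_1": i1 - red,
            "unique_2": i2 - red,
            "synergy": i12 - i1 - i2 + red}

# Illustrative correlations: rho(X1,Y)=0.6, rho(X2,Y)=0.3, rho(X1,X2)=0.2.
cov = np.array([[1.0, 0.2, 0.6],
                [0.2, 1.0, 0.3],
                [0.6, 0.3, 1.0]])
pid = immi_pid(cov)
print(pid["unique_2"])  # 0.0: under I_mmi the weaker predictor has no unique information
```

    That forced zero illustrates the abstract's point: Immi components depend only on the predictor-target marginals, which is exactly the constraint Idep relaxes.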

  3. Human papillomavirus (HPV) information needs: a theoretical framework

    Science.gov (United States)

    Marlow, Laura A V; Wardle, Jane; Waller, Jo; Grant, Nina

    2009-01-01

    Background: With the introduction of human papillomavirus (HPV) testing and vaccination in the UK, health professionals will start to receive questions about the virus from their patients. This study aimed to identify the key questions about HPV that British women will ask when considering having an HPV test or vaccination. Methods: Face-to-face interviews were carried out with 21 women to discover what they wanted to know about HPV. A thematic framework approach was used to analyse the data and identify key themes in women's HPV knowledge requirements. Results: Women's questions about HPV fell into six areas: identity (e.g. What are the symptoms?), cause (e.g. How do you get HPV?), timeline (e.g. How long does it last?), consequences (e.g. Does it always cause cervical cancer?) and control-cure (e.g. Can you prevent infection?). In addition, they asked procedural questions about testing and vaccination (e.g. Where do I get an HPV test?). These mapped well onto the dimensions identified in Leventhal's description of lay models of illness, called the 'Common Sense Model' (CSM). Discussion and conclusions: These results indicated that the majority of the questions women asked about HPV fitted well into the CSM, which therefore provides a structure for women's information needs. The findings could help health professionals understand what questions they may be expected to answer. Framing educational materials using the CSM themes may also help health educators achieve a good fit with what the public want to know. PMID:19126314

  4. Method of sharing mobile unit state information between base station routers

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.; Polakos, Paul Anthony; Rajkumar, Ajay; Sundaram, Ganapathy S.

    2007-01-01

    The present invention provides a method of operating a first base station router. The method may include transmitting state information associated with at least one inactive mobile unit to at least one second base station router. The state information is usable to initiate an active session with the

  6. Archaeological culture and medieval ethnic community: theoretical and methodical problems of correlation (the case of medieval Bulgaria

    Directory of Open Access Journals (Sweden)

    Izmaylov Iskander L.

    2014-09-01

    Full Text Available Problems related to archaeological culture and ethnos comparison in the case of medieval Bulgaria are discussed in the article. According to the author, in recent years it has become evident that the traditional concept and methodology of the study of the Bulgars’ ethnogenesis and ethnic history are in contradiction with the facts accumulated. The methods of “archaeological ethno-genetics”, which dictated solving problems of ethnogenesis of the ancient population belonging to an archaeological culture in direct correlation with ethnicity, are currently being criticized. According to modern ideas about ethnos and ethnicity, ethnicity is based upon identity with a complex hierarchical nature. Contemporary methodology requires proceeding with the integrated study of the problems of ethnogenesis on the basis of archaeology and ethnology. This kind of analysis is based upon the study of the medieval Bulgar mentality as a source of information on key aspects of ethno-political ideas. The analysis of authentic historical sources, historiographical tradition elements and folklore materials makes it possible to reconstruct the basic ideas that were significant for an ethnic group. The archaeological culture of the population of Bulgaria is characterized by two clearly distinguished and interconnected elements – the common Muslim culture and that of the elite military “druzhina” (squad. These elements directly characterize the Bulgar ethno-political community. These theoretical conclusions and empirical research concerning the case of the medieval Bulgars’ ethnogenesis attest to the productivity of ethnological synthesis techniques on an interdisciplinary basis.

  7. Theoretical aspects of spatial-temporal modeling

    CERN Document Server

    Matsui, Tomoko

    2015-01-01

    This book provides a modern introductory tutorial on specialized theoretical aspects of spatial and temporal modeling. The areas covered involve a range of topics which reflect the diversity of this domain of research across a number of quantitative disciplines. For instance, the first chapter provides up-to-date coverage of particle association measures that underpin the theoretical properties of recently developed random set methods in space and time otherwise known as the class of probability hypothesis density framework (PHD filters). The second chapter gives an overview of recent advances in Monte Carlo methods for Bayesian filtering in high-dimensional spaces. In particular, the chapter explains how one may extend classical sequential Monte Carlo methods for filtering and static inference problems to high dimensions and big-data applications. The third chapter presents an overview of generalized families of processes that extend the class of Gaussian process models to heavy-tailed families known as alph...

  8. Theoretical development and first-principles analysis of strongly correlated systems

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Chen [Iowa State Univ., Ames, IA (United States)

    2016-12-17

    A variety of quantum many-body methods have been developed for studying strongly correlated electron systems. We have also proposed a computationally efficient and accurate approach, named the correlation matrix renormalization (CMR) method, to address these challenges. The initial implementation of the CMR method is designed for molecules, which offer theoretical advantages including small system size, transparent mechanisms and strong correlation effects such as bond-breaking processes. The theoretical development and benchmark tests of the CMR method are included in this thesis. Meanwhile, the ground state total energy is the most important property in electronic structure calculations. We also investigated an alternative approach to calculating the total energy, and extended this method to the magnetic anisotropy energy (MAE) of ferromagnetic materials. In addition, another theoretical tool, dynamical mean-field theory (DMFT) on top of DFT, has also been used in electronic structure calculations for an iridium oxide to study the phase transition, which results from an interplay of the d electrons' internal degrees of freedom.

  9. Sentence Comprehension as Mental Simulation: An Information-Theoretic Perspective

    Directory of Open Access Journals (Sweden)

    Gabriella Vigliocco

    2011-11-01

    Full Text Available It has been argued that the mental representation resulting from sentence comprehension is not (just) an abstract symbolic structure but a “mental simulation” of the state of affairs described by the sentence. We present a particular formalization of this theory and show how it gives rise to quantifications of the amount of syntactic and semantic information conveyed by each word in a sentence. These information measures predict simulated word-processing times in a dynamic connectionist model of sentence comprehension as mental simulation. A quantitatively similar relation between information content and reading time is known to be present in human reading-time data.
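
The per-word information quantity underlying results like this is surprisal, -log2 P(word | context). The paper derives its measures from a connectionist simulation model, but the quantity itself can be sketched with a toy maximum-likelihood bigram model (the corpus and probabilities below are purely illustrative):

```python
import math
from collections import Counter

corpus = "the dog chased the cat . the cat saw the dog .".split()

# maximum-likelihood bigram model estimated from the toy corpus
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)

def surprisal(prev, word):
    # -log2 P(word | prev); higher surprisal predicts longer reading times
    return -math.log2(bigrams[(prev, word)] / unigrams[prev])

# "dog" follows "the" on 2 of 4 occurrences -> 1 bit of information
print(surprisal("the", "dog"))
```

In reading-time studies, per-word surprisals like these are regressed against measured reading times; the abstract's claim is that the same relation emerges from the mental-simulation model.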

  10. Structural, vibrational and nuclear magnetic resonance investigations of 4-bromoisoquinoline by experimental and theoretical DFT methods.

    Science.gov (United States)

    Arjunan, V; Thillai Govindaraja, S; Jayapraksh, A; Mohan, S

    2013-04-15

    Quantum chemical calculations of the energy, structural parameters and vibrational wavenumbers of 4-bromoisoquinoline (4BIQ) were carried out with the B3LYP method using the 6-311++G(**), cc-pVTZ and LANL2DZ basis sets. The optimised geometrical parameters obtained by DFT calculations are in good agreement with electron diffraction data. Interpretations of the experimental FTIR and FT-Raman spectra have been reported with the aid of the theoretical wavenumbers. The differences between the observed and scaled wavenumber values of most of the fundamentals are very small. The thermodynamic parameters have also been computed. Electronic properties of the molecule were discussed through the molecular electrostatic potential surface, HOMO-LUMO energy gap and NBO analysis. To provide precise assignments of the (1)H and (13)C NMR spectra, isotropic shieldings and chemical shifts were calculated with the Gauge-Invariant Atomic Orbital (GIAO) method. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. A game theoretic approach to assignment problems

    NARCIS (Netherlands)

    Klijn, F.

    2000-01-01

    Game theory deals with the mathematical modeling and analysis of conflict and cooperation in the interaction of multiple decision makers. This thesis adopts two game theoretic methods to analyze a range of assignment problems that arise in various economic situations. The first method has as

  12. Information loss method to measure node similarity in networks

    Science.gov (United States)

    Li, Yongli; Luo, Peng; Wu, Chong

    2014-09-01

    Similarity measurement for network nodes has received increasing attention in the field of statistical physics. In this paper, we propose an entropy-based information loss method to measure node similarity. The model is built on the idea that less information loss is incurred when two more similar nodes are treated as the same. The proposed method has relatively low algorithmic complexity, making it less time-consuming and more efficient for dealing with large-scale real-world networks. To clarify its availability and accuracy, the new approach was compared with other selected approaches on two artificial examples and on synthetic networks. Furthermore, the proposed method is also successfully applied to predict network evolution and to predict unknown nodes' attributes in two application examples.
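
The entropy-based idea can be illustrated (this is a hedged sketch, not the authors' exact formulation) by scoring two nodes with the extra entropy incurred when their neighbor-degree distributions are merged into one: identical profiles cost zero bits, dissimilar ones cost more:

```python
import math
from collections import Counter

def entropy(p):
    # Shannon entropy (bits) of a probability distribution given as a dict
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def neighbor_dist(graph, node):
    # probability distribution over the degrees of a node's neighbors
    degs = Counter(len(graph[m]) for m in graph[node])
    total = sum(degs.values())
    return {k: v / total for k, v in degs.items()}

def information_loss(graph, a, b):
    pa, pb = neighbor_dist(graph, a), neighbor_dist(graph, b)
    merged = {k: (pa.get(k, 0) + pb.get(k, 0)) / 2 for k in set(pa) | set(pb)}
    # Jensen-Shannon-style loss: extra entropy from treating a and b as one node
    return entropy(merged) - (entropy(pa) + entropy(pb)) / 2

graph = {1: [2, 3], 2: [1, 3], 3: [1, 2, 4], 4: [3]}
# nodes 1 and 2 have identical neighbor-degree profiles -> zero loss
print(information_loss(graph, 1, 2))
```

Lower loss means higher similarity; ranking all pairs by this score gives a similarity ordering over the network's nodes.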

  13. Searching for Suicide Methods: Accessibility of Information About Helium as a Method of Suicide on the Internet.

    Science.gov (United States)

    Gunnell, David; Derges, Jane; Chang, Shu-Sen; Biddle, Lucy

    2015-01-01

    Helium gas suicides have increased in England and Wales; easy-to-access descriptions of this method on the Internet may have contributed to this rise. To investigate the availability of information on using helium as a method of suicide and trends in searching about this method on the Internet. We analyzed trends in (a) Google searching (2004-2014) and (b) hits on a Wikipedia article describing helium as a method of suicide (2013-2014). We also investigated the extent to which helium was described as a method of suicide on web pages and discussion forums identified via Google. We found no evidence of rises in Internet searching about suicide using helium. News stories about helium suicides were associated with increased search activity. The Wikipedia article may have been temporarily altered to increase awareness of suicide using helium around the time of a celebrity suicide. Approximately one third of the links retrieved using Google searches for suicide methods mentioned helium. Information about helium as a suicide method is readily available on the Internet; the Wikipedia article describing its use was highly accessed following celebrity suicides. Availability of online information about this method may contribute to rises in helium suicides.

  14. Theoretical aspects of light meson spectroscopy

    International Nuclear Information System (INIS)

    Barnes, T.; Univ. of Tennessee, Knoxville, TN

    1995-01-01

    In this pedagogical review the authors discuss the theoretical understanding of light hadron spectroscopy in terms of QCD and the quark model. They begin with a summary of the known and surmised properties of QCD and confinement. Following this they review the nonrelativistic quark potential model for q anti q mesons and discuss the quarkonium spectrum and methods for identifying q anti q states. Finally, they review theoretical expectations for non-q anti q states (glueballs, hybrids and multiquark systems) and the status of experimental candidates for these states

  15. Older people in the information society

    Directory of Open Access Journals (Sweden)

    Aleksandra Marcinkiewicz-Wilk

    2016-06-01

    Full Text Available This paper focuses on the situation of older people in the information society. In the theoretical part of the article, the phenomena of population ageing and the information society are described. The paper includes results of research conducted with a qualitative strategy: data were collected with the biographical method and processed with qualitative content analysis. Two older, educationally active people took part in the research. The results show how older people understand the information society and what risks and opportunities they notice in this new reality. The respondents' narratives indicated that education is of crucial importance for participation in the information society: older people who take part in lifelong learning cope better with the new reality than people who do not learn. Based on the research results, we can point out areas of education which should be developed. Moreover, it is visible that the educational activity of older people is very important for adaptation to the information society. The seniors' narratives also indicate reasons for the lack of educational activity among other seniors and, accordingly, what action should be undertaken to prevent the exclusion of older people in this new reality.

  16. Theoretical and experimental investigation of multispectral photoacoustic osteoporosis detection method

    Science.gov (United States)

    Steinberg, Idan; Hershkovich, Hadas Sara; Gannot, Israel; Eyal, Avishay

    2014-03-01

    Osteoporosis is a widespread disorder which has a catastrophic impact on patients' lives and overwhelming related healthcare costs. Recently, we proposed a multispectral photoacoustic (PA) technique for early detection of osteoporosis. Such a technique has great advantages over purely ultrasonic or optical methods, as it allows the deduction of both bone functionality, from the bone absorption spectrum, and bone resistance to fracture, from the characteristics of the ultrasound propagation. We previously demonstrated the propagation of multiple acoustic modes in animal bones in vitro. To further investigate the effects of multiple-wavelength excitation and of induced osteoporosis on the PA signal, a multispectral photoacoustic system is presented. The experimental investigation is based on measuring the interference of multiple acoustic modes. The performance of the system is evaluated, and a simple two-mode theoretical model is fitted to the measured phase signals. The results show that such a PA technique is accurate and repeatable. A multiple-wavelength excitation is then tested. The PA response at different excitation wavelengths reveals that absorption by the different bone constituents has a profound effect on mode generation. The PA response is also measured at a single wavelength before and after induced osteoporosis. Results show that induced osteoporosis alters the measured amplitude and phase in a consistent manner, which allows detection of the onset of osteoporosis. These results suggest that a complete characterization of the bone over a region of both acoustic and optical frequencies might be used as a powerful tool for in-vivo bone evaluation.

  17. Beyond the SCS-CN method: A theoretical framework for spatially lumped rainfall-runoff response

    Science.gov (United States)

    Bartlett, M. S.; Parolari, A. J.; McDonnell, J. J.; Porporato, A.

    2016-06-01

    Since its introduction in 1954, the Soil Conservation Service curve number (SCS-CN) method has become the standard tool, in practice, for estimating an event-based rainfall-runoff response. However, because of its empirical origins, the SCS-CN method is restricted to certain geographic regions and land use types. Moreover, it does not describe the spatial variability of runoff. To move beyond these limitations, we present a new theoretical framework for spatially lumped, event-based rainfall-runoff modeling. In this framework, we describe the spatially lumped runoff model as a point description of runoff that is upscaled to a watershed area based on probability distributions that are representative of watershed heterogeneities. The framework accommodates different runoff concepts and distributions of heterogeneities, and in doing so, it provides an implicit spatial description of runoff variability. Heterogeneity in storage capacity and soil moisture are the basis for upscaling a point runoff response and linking ecohydrological processes to runoff modeling. For the framework, we consider two different runoff responses for fractions of the watershed area: "prethreshold" and "threshold-excess" runoff. These occur before and after infiltration exceeds a storage capacity threshold. Our application of the framework results in a new model (called SCS-CNx) that extends the SCS-CN method with the prethreshold and threshold-excess runoff mechanisms and an implicit spatial description of runoff. We show proof of concept in four forested watersheds and further show that the resulting model may better represent geographic regions and site types that previously have been beyond the scope of the traditional SCS-CN method.
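
For reference, the point-scale relation that the SCS-CNx framework generalizes is the classical curve number formula: potential maximum retention S is derived from the curve number CN, initial abstraction is Ia = λS (λ = 0.2 in the traditional method), and event runoff Q = (P − Ia)² / (P − Ia + S) for rainfall P > Ia, else zero:

```python
def scs_cn_runoff(p_mm, cn, lambda_ia=0.2):
    """Event runoff depth (mm) from the classical SCS-CN method (SI units)."""
    s = 25400.0 / cn - 254.0   # potential maximum retention S (mm)
    ia = lambda_ia * s         # initial abstraction Ia
    if p_mm <= ia:
        return 0.0             # all rainfall absorbed before runoff begins
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# 100 mm storm on a CN = 80 watershed -> roughly 50.5 mm of runoff
print(round(scs_cn_runoff(100.0, 80), 1))
```

In the paper's terms this is the lumped point response; the SCS-CNx extension upscales it over distributions of storage capacity and soil moisture rather than applying it uniformly.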

  18. Micro Ethnographic Research as a Method for Informing Educational Technology Design in Practice

    DEFF Research Database (Denmark)

    Davidsen, Jacob; Vanderlinde, Ruben

    2013-01-01

    Objectives and purposes. This paper describes research on how micro ethnographic classroom studies (Mehan, 1979) of the integration of technology can inform researchers' understanding of teachers' and children's situated acts with technology. Hence, the objective of this paper is to show stories … of the integration of technology from the teachers' and children's perspective. The central research question of the study is: how can researchers of educational technology represent the local and situated action of teachers and children to inform future technologies? Theoretical frameworks. Integrating technology … technology researchers discuss how to bridge the gap between researchers and practitioners (Vanderlinde & Van Braak, 2010). Similarly, there is also a gap between educational technology developers and practitioners. This gap between developers of technology and the users has been described in the Scandinavian

  19. [Lack of access to information on oral health problems among adults: an approach based on the theoretical model for literacy in health].

    Science.gov (United States)

    Roberto, Luana Leal; Noronha, Daniele Durães; Souza, Taiane Oliveira; Miranda, Ellen Janayne Primo; Martins, Andréa Maria Eleutério de Barros Lima; Paula, Alfredo Maurício Batista De; Ferreira, Efigênia Ferreira E; Haikal, Desirée Sant'ana

    2018-03-01

    This study sought to investigate factors associated with the lack of access to information on oral health among adults. It is a cross-sectional study, carried out among 831 adults (35-44 years of age). The dependent variable was access to information on how to avoid oral problems, and the independent variables were gathered into subgroups according to the theoretical model for literacy in health. Binary logistic regression was carried out, and results were corrected by the design effect. It was observed that 37.5% had no access to information about dental problems. The lack of access was higher among adults who had lower per capita income, were dissatisfied with the dental services provided, did not use dental floss, had unsatisfactory physical control of the quality of life, and self-perceived their oral health as fair/poor/very poor. The likelihood of not having access to information about dental problems among those dissatisfied with the dental services used was 3.28 times higher than for those satisfied with the dental services used. Thus, decreased access to information was related to unfavorable conditions among adults. Health services should ensure appropriate information to their users in order to increase health literacy levels and improve satisfaction and equity.

  20. A new theoretical approach to analyze complex processes in cytoskeleton proteins.

    Science.gov (United States)

    Li, Xin; Kolomeisky, Anatoly B

    2014-03-20

    Cytoskeleton proteins are filament structures that support a large number of important biological processes. These dynamic biopolymers exist in nonequilibrium conditions stimulated by hydrolysis chemical reactions in their monomers. Current theoretical methods provide a comprehensive picture of biochemical and biophysical processes in cytoskeleton proteins. However, the description is only qualitative under biologically relevant conditions because the theoretical mean-field models utilized neglect correlations. We develop a new theoretical method to describe dynamic processes in cytoskeleton proteins that takes into account spatial correlations in the chemical composition of these biopolymers. Our approach is based on the analysis of probabilities of different clusters of subunits. It allows us to obtain exact analytical expressions for a variety of dynamic properties of cytoskeleton filaments. By comparing theoretical predictions with Monte Carlo computer simulations, it is shown that our method provides a fully quantitative description of complex dynamic phenomena in cytoskeleton proteins under all conditions.

  1. A Dynamic and Adaptive Selection Radar Tracking Method Based on Information Entropy

    Directory of Open Access Journals (Sweden)

    Ge Jianjun

    2017-12-01

    Full Text Available Nowadays, the battlefield environment has become much more complex and variable. Based on the principle of information entropy, this paper presents a quantitative measure, with a lower bound, of the amount of target information acquired from multiple radar observations, which can be used to adaptively and dynamically organize battlefield detection resources. Furthermore, to minimize this lower bound on the information entropy of the target measurement at every moment, a method is proposed for dynamically and adaptively selecting the radars that carry the most information about the target for tracking. The simulation results indicate that the proposed method achieves higher tracking accuracy than tracking without entropy-based adaptive radar selection.
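
A scalar analogue of the selection principle (an illustrative sketch under simplified Gaussian assumptions, not the paper's algorithm) is to fuse each candidate radar's measurement variance with the prior by Bayesian precision addition and pick the radar whose posterior estimate has the lowest differential entropy:

```python
import math

def gaussian_entropy(var):
    # differential entropy (nats) of a 1-D Gaussian with variance `var`
    return 0.5 * math.log(2 * math.pi * math.e * var)

def select_radar(prior_var, radar_vars):
    """Index of the radar whose measurement minimizes the posterior
    entropy of a scalar Gaussian target estimate (precision fusion)."""
    def posterior_var(rv):
        return 1.0 / (1.0 / prior_var + 1.0 / rv)
    return min(range(len(radar_vars)),
               key=lambda i: gaussian_entropy(posterior_var(radar_vars[i])))

# three radars with different measurement noise; the most precise one wins
print(select_radar(4.0, [2.5, 0.8, 1.6]))
```

Since Gaussian entropy grows monotonically with variance, minimizing posterior entropy here reduces to choosing the most informative (lowest-noise) radar, which is the scalar shadow of the multi-radar entropy bound in the paper.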

  2. Information theoretic resources in quantum theory

    Science.gov (United States)

    Meznaric, Sebastian

    Resource identification and quantification is an essential element of both classical and quantum information theory. Entanglement is one of these resources, arising when quantum communication and nonlocal operations are expensive to perform. In the first part of this thesis we quantify the effective entanglement when operations are additionally restricted to account for both fundamental restrictions on operations, such as those arising from superselection rules, as well as experimental errors arising from the imperfections in the apparatus. For an important class of errors we find a linear relationship between the usual and effective higher dimensional generalization of concurrence, a measure of entanglement. Following the treatment of effective entanglement, we focus on a related concept of nonlocality in the presence of superselection rules (SSR). Here we propose a scheme that may be used to activate nongenuinely multipartite nonlocality, in that a single copy of a state is not multipartite nonlocal, while two or more copies exhibit nongenuinely multipartite nonlocality. The states used exhibit the more powerful genuinely multipartite nonlocality when SSR are not enforced, but not when they are, raising the question of what is needed for genuinely multipartite nonlocality. We show that whenever the number of particles is insufficient, the degrading of genuinely multipartite to nongenuinely multipartite nonlocality is necessary. While in the first few chapters we focus our attention on understanding the resources present in quantum states, in the final part we turn the picture around and instead treat operations themselves as a resource. We provide our observers with free access to classical operations, i.e., those that cannot detect or generate quantum coherence. We show that the operation of interest can then be used to either generate or detect quantum coherence if and only if it violates a particular commutation relation. Using the relative entropy, the

  3. On the road to metallic nanoparticles by rational design: bridging the gap between atomic-level theoretical modeling and reality by total scattering experiments

    Science.gov (United States)

    Prasai, Binay; Wilson, A. R.; Wiley, B. J.; Ren, Y.; Petkov, Valeri

    2015-10-01

    The extent to which current theoretical modeling alone can reveal real-world metallic nanoparticles (NPs) at the atomic level was scrutinized and demonstrated to be insufficient, and it is shown how modeling can be improved by a pragmatic approach involving straightforward experiments. In particular, 4-6 nm silica-supported Au100-xPdx (x = 30, 46 and 58) NPs explored for catalytic applications are characterized structurally by total scattering experiments, including high-energy synchrotron X-ray diffraction (XRD) coupled to atomic pair distribution function (PDF) analysis. Atomic-level models for the NPs are built by molecular dynamics simulations based on the Sutton-Chen (SC) method, an archetype of current theoretical modeling. The models are matched against independent experimental data and are demonstrated to be inaccurate unless their theoretical foundation, i.e. the SC method, is supplemented with basic yet crucial information on the length and strength of metal-to-metal bonds and, when necessary, structural disorder in the actual NPs studied. An atomic PDF-based approach for accessing such information and implementing it in theoretical modeling is put forward. For completeness, the approach is concisely demonstrated on 15 nm water-dispersed Au particles explored for bio-medical applications and 16 nm hexane-dispersed Fe48Pd52 particles explored for magnetic applications as well. It is argued that when "tuned up" against experiments relevant to metals and alloys confined to nanoscale dimensions, such as total scattering coupled to atomic PDF analysis, rather than by mere intuition and/or against data for the respective solids, atomic-level theoretical modeling can provide a sound understanding of the synthesis-structure-property relationships in real-world metallic NPs. Ultimately this can help advance nanoscience and technology a step closer to producing metallic NPs by rational design.
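
The raw ingredient of an atomic PDF is the set of interatomic distances in a structure model; binning them gives the pair-distance histogram that is compared against the experimentally derived PDF. A minimal sketch (the coordinates below are hypothetical, not data from the study):

```python
import itertools
import math

# toy atomic cluster: four atoms at the corners of a 2.8 A square
# (hypothetical coordinates for illustration only)
atoms = [(0.0, 0.0, 0.0), (2.8, 0.0, 0.0), (0.0, 2.8, 0.0), (2.8, 2.8, 0.0)]

def pair_distances(positions):
    """All interatomic distances in the model -- the raw ingredient of an
    atomic pair distribution function (PDF) histogram."""
    return sorted(
        math.dist(a, b) for a, b in itertools.combinations(positions, 2)
    )

# four nearest-neighbor edges at 2.8 A plus two diagonals at 2.8*sqrt(2) A
print(pair_distances(atoms))
```

Matching a model against experiment then amounts to comparing this (suitably normalized and broadened) distance histogram with the measured PDF peaks, which is how bond-length information constrains the SC-type simulations described above.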

  4. Use of theoretical and conceptual frameworks in qualitative research.

    Science.gov (United States)

    Green, Helen Elise

    2014-07-01

    To debate the definition and use of theoretical and conceptual frameworks in qualitative research. There is a paucity of literature to help the novice researcher to understand what theoretical and conceptual frameworks are and how they should be used. This paper acknowledges the interchangeable usage of these terms and researchers' confusion about the differences between the two. It discusses how researchers have used theoretical and conceptual frameworks and the notion of conceptual models. Detail is given about how one researcher incorporated a conceptual framework throughout a research project, the purpose for doing so and how this led to a resultant conceptual model. Concepts from Abbott (1988) and Witz (1992) were used to provide a framework for research involving two case study sites. The framework was used to determine research questions and give direction to interviews and discussions to focus the research. Some research methods do not overtly use a theoretical framework or conceptual framework in their design, but this is implicit and underpins the method design, for example in grounded theory. Other qualitative methods use one or the other to frame the design of a research project or to explain the outcomes. An example is given of how a conceptual framework was used throughout a research project. Theoretical and conceptual frameworks are terms that are regularly used in research but rarely explained. Textbooks should discuss what they are and how they can be used, so novice researchers understand how they can help with research design. Theoretical and conceptual frameworks need to be more clearly understood by researchers and correct terminology used to ensure clarity for novice researchers.

  5. Evaluation of Information Requirements of Reliability Methods in Engineering Design

    DEFF Research Database (Denmark)

    Marini, Vinicius Kaster; Restrepo-Giraldo, John Dairo; Ahmed-Kristensen, Saeema

    2010-01-01

    This paper aims to characterize the information needed to perform methods for robustness and reliability, and to verify their applicability to early design stages. Several methods were evaluated on the support they provide to synthesis in engineering design. Of those methods, FMEA, FTA and HAZOP were selected...

  6. A theoretical framework informing research about the role of stress in the pathophysiology of bipolar disorder.

    Science.gov (United States)

    Brietzke, Elisa; Mansur, Rodrigo Barbachan; Soczynska, Joanna; Powell, Alissa M; McIntyre, Roger S

    2012-10-01

    The staggering illness burden associated with Bipolar Disorder (BD) invites the need for primary prevention strategies. Before preventative strategies can be considered in individuals during a pre-symptomatic period (i.e., at risk), unraveling the mechanistic steps wherein external stress is transduced and interacts with genetic vulnerability in the early stages of BD will be a critical conceptual necessity. Herein we comprehensively review extant studies reporting on stress and bipolar disorder. The overarching aim is to propose a conceptual framework to inform research about the role of stress in the pathophysiology of BD. Computerized databases, i.e. PubMed, PsychInfo, Cochrane Library and Scielo, were searched using the following terms: "bipolar disorder" cross-referenced with "stress", "general reaction to stress", "resilience", "resistance", "recovery", "stress-diathesis", "allostasis", and "hormesis". Data from the literature indicate the existence of several theoretical models for understanding the influence of stress on the pathophysiology of BD, including the classical stress-diathesis model and newer models such as allostasis and hormesis. In addition, molecular mechanisms involved in stress adaptation (resistance, resilience and recovery) can also be translated into research strategies to investigate the impact of stress on the pathophysiology of BD. Most studies are retrospective and/or cross-sectional, do not consider the period of development, assess brain function with only one or a few methodologies, and use animal models which are not always similar to human phenotypes. The interaction between stress and brain development is dynamic and complex. In this article we propose a theoretical model for investigating the role of stress in the pathophysiology of BD, based on the different kinds of stress adaptation response and their putative neurobiological underpinnings. Copyright © 2012 Elsevier Inc. All rights reserved.

  7. A Combined Theoretical and Experimental Study for Silver Electroplating

    Science.gov (United States)

    Liu, Anmin; Ren, Xuefeng; An, Maozhong; Zhang, Jinqiu; Yang, Peixia; Wang, Bo; Zhu, Yongming; Wang, Chong

    2014-01-01

    A novel method combining theoretical and experimental study for environmentally friendly silver electroplating is introduced. Quantum chemical calculations and molecular dynamics (MD) simulations were employed to predict the behaviour and function of the complexing agents. Electronic properties, orbital information, and single point energies of 5,5-dimethylhydantoin (DMH), nicotinic acid (NA), and their silver(I) complexes were provided by quantum chemical calculations based on density functional theory (DFT). Adsorption behaviours of the agents on copper and silver surfaces were investigated using MD simulations. Based on the data from the quantum chemical calculations and MD simulations, we believe that DMH and NA could be promising complexing agents for silver electroplating. The experimental results, including electrochemical measurements and silver electroplating, further confirmed this prediction. This efficient and versatile method thus opens a new window for studying or designing complexing agents for metal electroplating generally and will vigorously promote the level of research in this area.

  9. Innovation in Information Technology: Theoretical and Empirical Study in SMQR Section of Export Import in Automotive Industry

    Science.gov (United States)

    Edi Nugroho Soebandrija, Khristian; Pratama, Yogi

    2014-03-01

    This paper studies innovation in information technology through both theoretical and empirical work, focusing on Shortage Mispacking Quality Report (SMQR) claims in export and import in the automotive industry. It discusses the major aspects of innovation, information technology, performance, and competitive advantage. The empirical study of PT. Astra Honda Motor (AHM) concerns SMQR claims, communication systems, and systems analysis and design. Both the major aspects and the empirical study are introduced briefly in the introduction and discussed in more detail in later sections, in particular in the literature review, which covers classical and current references. The increase in SMQR claims and the communication problems at PT. Astra Daihatsu Motor (PT. ADM), which still relies on email, lengthen the claim settlement time and ultimately lead suppliers to reject SMQR claims. To address this problem, an integrated communication system was designed to manage the SMQR claim communication process between PT. ADM and its suppliers. The system is expected to streamline the claim communication process so that it follows the procedure, meets the target settlement time, and eliminates the difficulties of the previous manual, email-based communication. The system was designed using the systems development life cycle method of Kendall & Kendall (2006); the design covers the SMQR problem communication process, the supplier's judgment process, the claim process, the claim payment process, and the claim monitoring process. After suitable system designs for managing SMQR claims were obtained, the system was implemented, and an improvement in claim communication can be observed.

  10. A Theoretical Modeling of Digital World History: Premises, Paradigm, and Scientific Data Strategy

    Directory of Open Access Journals (Sweden)

    Xudong Wang

    2007-10-01

    Full Text Available Digital World History is a new expression of world history (or perhaps "a new method for world history expression") and a paradigm for describing, studying, and applying world history through virtual informatization and recovery. It is also a comprehensive, systematic study, through dynamic marking, integrated description, and retrieval of human society's evolution and its causality, dependent on the theory and methodology of digitized information. It aims to break the limitation of diachronic language inherent in the process of historical cognition, summation, and recovery; to propose a possible scheme for fusing historical factors in relation to changing history; and to dynamically apply a multiplicity of results, so that the discipline of world history can meet the needs of the information society of the 21st century. In this article, the author uses theoretical modelling methods to produce a blueprint of the quality issue, namely the premises of Digital World History, and a paradigm setting out its foundation and scientific data strategy as a basis for its necessity.

  11. Research on a Method of Geographical Information Service Load Balancing

    Science.gov (United States)

    Li, Heyuan; Li, Yongxing; Xue, Zhiyong; Feng, Tao

    2018-05-01

    With the development of geographical information service technologies, achieving intelligent scheduling and highly concurrent access to geographical information service resources through load balancing is a focus of current study. This paper presents a dynamic load-balancing algorithm. In the algorithm, each type of geographical information service is matched with a corresponding server group; the RED algorithm is then combined with a double-threshold method to judge the load state of each server node; finally, services are scheduled by weighted probability within a given period. An experimental system built on a server cluster demonstrates the effectiveness of the proposed method.
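
    The double-threshold load check and weighted probabilistic scheduling described above can be sketched as follows; the threshold values, server fields, and names are illustrative assumptions, not taken from the paper.

```python
import random

# Sketch of weighted probabilistic scheduling with a double-threshold
# load check. Thresholds and server records are hypothetical.

LOW, HIGH = 0.4, 0.8  # assumed load thresholds

def load_state(load):
    """Classify a node's load using the two thresholds."""
    if load < LOW:
        return "light"
    if load > HIGH:
        return "overloaded"
    return "normal"

def pick_server(servers):
    """Choose a non-overloaded node with probability proportional
    to its spare capacity (1 - load)."""
    candidates = [s for s in servers if load_state(s["load"]) != "overloaded"]
    if not candidates:
        candidates = servers  # every node overloaded: degrade gracefully
    weights = [1.0 - s["load"] + 1e-9 for s in candidates]
    return random.choices(candidates, weights=weights, k=1)[0]

servers = [
    {"name": "gis-1", "load": 0.2},
    {"name": "gis-2", "load": 0.6},
    {"name": "gis-3", "load": 0.9},  # overloaded, normally skipped
]
print(pick_server(servers)["name"])
```

    Lightly loaded nodes are chosen more often, overloaded ones are excluded, and the randomness avoids the herd effect of always picking the single least-loaded node.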

  12. FEATURE SELECTION METHODS BASED ON MUTUAL INFORMATION FOR CLASSIFYING HETEROGENEOUS FEATURES

    Directory of Open Access Journals (Sweden)

    Ratri Enggar Pawening

    2016-06-01

    Full Text Available Datasets with heterogeneous features can yield inappropriate feature selection results, because heterogeneous features are difficult to evaluate concurrently. Feature transformation (FT) is one way to handle subset selection over heterogeneous features, but transforming non-numerical features into numerical ones may introduce redundancy with the original numerical features. In this paper, we propose a method for selecting feature subsets based on mutual information (MI) for classifying heterogeneous features. We use unsupervised feature transformation (UFT) and joint mutual information maximisation (JMIM). UFT transforms non-numerical features into numerical ones; JMIM selects a feature subset taking the class label into account. The transformed and original features are combined, a feature subset is determined with JMIM, and the subset is classified with the support vector machine (SVM) algorithm. Classification accuracy is measured for each number of selected features and compared between the UFT-JMIM and Dummy-JMIM methods. Across all experiments in this study, the average classification accuracy is about 84.47% for UFT-JMIM and about 84.24% for Dummy-JMIM. This result shows that UFT-JMIM can minimise information loss between transformed and original features and select feature subsets that avoid redundant and irrelevant features.
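
    The JMIM-style selection step can be sketched for discrete features. The greedy criterion below, maximising the weakest joint mutual information between a candidate, any already-selected feature, and the label, follows the general JMIM idea; the data handling and details are illustrative, not the authors' implementation.

```python
import numpy as np
from collections import Counter

# Greedy JMIM-style selection for discrete features (illustrative sketch).

def entropy(values):
    """Shannon entropy (bits) of a sequence of hashable values."""
    counts = np.array(list(Counter(values).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def mutual_info(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for discrete variables."""
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

def jmim_select(X, y, k):
    """Pick k features; each new feature maximises its weakest joint
    mutual information with any already-selected feature and the label."""
    n = X.shape[1]
    selected = [int(np.argmax([mutual_info(X[:, j], y) for j in range(n)]))]
    while len(selected) < k:
        best, best_score = None, -1.0
        for j in range(n):
            if j in selected:
                continue
            # treat (X_j, X_s) as one joint discrete variable
            score = min(mutual_info(list(zip(X[:, j], X[:, s])), y)
                        for s in selected)
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected
```

    The joint term is what distinguishes JMIM from simple MI ranking: a feature uninformative on its own can still score well if it is informative together with a feature already selected.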

  13. About application during lectures on protection of the information and information security of the method of "the round table"

    Directory of Open Access Journals (Sweden)

    Simon Zh. Simavoryan

    2011-05-01

    Full Text Available This article analyses the lecture, a passive method of knowledge transfer. Experience in teaching information protection and information security shows that students absorb the material better when an active method of knowledge transfer, the "round table", is applied during lectures.

  14. Information content in B→VV decays and the angular moments method

    International Nuclear Information System (INIS)

    Dighe, A.; Sen, S.

    1998-10-01

    The time-dependent angular distributions of decays of neutral B mesons into two vector mesons contain information about the lifetimes, mass differences, strong and weak phases, form factors, and CP violating quantities. A statistical analysis of the information content is performed by giving the ''information'' a quantitative meaning. It is shown that for some parameters of interest, the information content in time and angular measurements combined may be orders of magnitude more than the information from time measurements alone and hence the angular measurements are highly recommended. The method of angular moments is compared with the (maximum) likelihood method to find that it works almost as well in the region of interest for the one-angle distribution. For the complete three-angle distribution, an estimate of possible statistical errors expected on the observables of interest is obtained. It indicates that the three-angle distribution, unraveled by the method of angular moments, would be able to nail down many quantities of interest and will help in pointing unambiguously to new physics. (author)
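
    As a much-simplified illustration of the method of moments for angular distributions (a toy, not the B→VV formalism itself), consider a one-angle distribution with a single coefficient, recovered from a sample moment:

```python
import random

# Toy method of moments for an angular coefficient: for
# f(x) = (1 + a*x)/2 on [-1, 1] with x = cos(theta), E[x] = a/3,
# so the coefficient is estimated as a_hat = 3 * <x>.

def sample_cos_theta(a, n, rng):
    """Draw n values from f(x) = (1 + a*x)/2 by rejection sampling."""
    out = []
    fmax = (1 + abs(a)) / 2
    while len(out) < n:
        x = rng.uniform(-1, 1)
        if rng.uniform(0, fmax) < (1 + a * x) / 2:
            out.append(x)
    return out

def estimate_a(xs):
    """Moment estimator: a_hat = 3 * sample mean of x."""
    return 3 * sum(xs) / len(xs)

rng = random.Random(7)
xs = sample_cos_theta(0.5, 20000, rng)
print(round(estimate_a(xs), 2))  # close to the true a = 0.5
```

    The real analysis weights events with the orthogonal angular functions of the three-angle distribution instead of plain powers of cos θ, but the principle is the same: each coefficient is read off as a suitably weighted average over the data.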

  15. The theoretical preconditions for problem situation realization while studying information technology at school

    Directory of Open Access Journals (Sweden)

    Ольга Александровна Прусакова

    2012-03-01

    Full Text Available Modern pedagogy and educational practice have developed and implemented various theoretical conceptions, theories, and educational approaches, including humanistic, personality-oriented, activity-oriented, and competence-oriented ones. One such approach to education and personality development is the problem-solving approach.

  16. Consumers’ Acceptance and Use of Information and Communications Technology: A UTAUT and Flow Based Theoretical Model

    Directory of Open Access Journals (Sweden)

    Saleh Alwahaishi

    2013-03-01

    Full Text Available The world has changed a lot in the past years. Rapid advances in technology and changing communication channels have changed the way people work and, for many, where they work from. The Internet and mobile technology, the two most dynamic technological forces in modern information and communications technology (ICT), are converging into one ubiquitous mobile Internet service, which will change the way we both do business and deal with our daily routine activities. As the use of ICT expands globally, there is a need for further research into the cultural aspects and implications of ICT. The acceptance of Information Technology (IT) has become a fundamental part of the research plan for most organizations (Igbaria, 1993). In IT research, numerous theories are used to understand users' adoption of new technologies. Various models have been developed, including the Technology Acceptance Model, the Theory of Reasoned Action, the Theory of Planned Behavior, and, recently, the Unified Theory of Acceptance and Use of Technology. Each of these models has sought to identify the factors which influence a citizen's intention or actual use of information technology. Drawing on the UTAUT model and Flow Theory, this research proposes a new hybrid theoretical framework to identify the factors affecting the acceptance and use of the mobile Internet, as an ICT application, in a consumer context. The proposed model incorporates eight constructs: Performance Expectancy, Effort Expectancy, Facilitating Conditions, Social Influences, Perceived Value, Perceived Playfulness, Attention Focus, and Behavioral Intention. Data collected online from 238 respondents in Saudi Arabia were tested against the research model using the structural equation modeling approach. The proposed model was mostly supported by the empirical data. The findings of this study provide several crucial implications for ICT and, in particular, mobile Internet service practitioners and researchers.

  17. Theoretical Perspectives of How Digital Natives Learn

    Science.gov (United States)

    Kivunja, Charles

    2014-01-01

    Marc Prensky, an authority on teaching and learning, especially with the aid of Information and Communication Technologies, has referred to 21st century children born after 1980 as "Digital Natives". This paper reviews the literature of leaders in the field to shed some light on theoretical perspectives of how Digital Natives learn and how…

  18. Theoretical Approaches to Lignin Chemistry

    OpenAIRE

    Shevchenko, Sergey M.

    1994-01-01

    A critical review is presented of the applications of theoretical methods to the studies of the structure and chemical reactivity of lignin, including simulation of macromolecular properties, conformational calculations, quantum chemical analyses of electronic structure, spectra and chemical reactivity. Modern concepts of spatial organization and chemical reactivity of lignins are discussed.

  19. Actor-network Theory and cartography of controversies in Information Science

    OpenAIRE

    LOURENÇO, Ramon Fernandes; TOMAÉL, Maria Inês

    2018-01-01

    Abstract The present study discusses the interactions between Actor-network Theory and the Cartography of Controversies method in Information Science research. A literature review was conducted of books, scholarly articles, and other sources addressing Actor-network Theory and the Cartography of Controversies. Understanding the theoretical assumptions that guide Actor-network Theory allows examining aspects important to Information Science research, seeking to identif...

  20. Physical Activity Informational Websites: Accuracy, Language Ease, and Fear Appeal

    OpenAIRE

    Paige, Samantha Rose

    2014-01-01

    Introduction. Health information is one of the most common searches on the Internet. Literature supports that, in general, health information readily available to Internet consumers is not accurate, lacks plain language for ease of understanding, and does not incorporate behavior-change theoretical frameworks. The purpose of this study was to evaluate each of these components. Method. Three keywords, "physical activity," "exercise," and "fitness," were entered into four popular search engines...

  1. Discussion of a method for providing general risk information by linking with the nuclear information

    International Nuclear Information System (INIS)

    Shobu, Nobuhiro; Yokomizo, Shirou; Umezawa, Sayaka

    2004-06-01

    'Risk information navigator (http://www.ricotti.jp/risknavi/)', an internet tool for arousing public interest and fostering people's risk literacy, has been developed as content for the official website of Techno Community Square 'RICOTTI' (http://www.ricotti.jp) at TOKAI village. In this report, we classified risk information into the fields 'Health/Daily Life', 'Society/Crime/Disaster', and 'Technology/Environment/Energy' for the internet tool contents. According to these categories, we discussed a method for providing various risk information on general fields by linking it with information on the nuclear field. The web contents are attached to this report on CD-R media. (author)

  2. State of the art/science: Visual methods and information behavior research

    DEFF Research Database (Denmark)

    Hartel, Jenna; Sonnenwald, Diane H.; Lundh, Anna

    2012-01-01

    This panel reports on methodological innovation now underway as information behavior scholars begin to experiment with visual methods. The session launches with a succinct introduction to visual methods by Jenna Hartel and then showcases three exemplar visual research designs. First, Dianne Sonne...... will have gained: knowledge of the state of the art/science of visual methods in information behavior research; an appreciation for the richness the approach brings to the specialty; and a platform to take new visual research designs forward....

  3. [Dissemination of medical information in Europe, the USA and Japan, 1850-1870: focusing on information concerning the hypodermic injection method].

    Science.gov (United States)

    Tsukisawa, Miyoko

    2011-12-01

    Modern medicine was introduced in Japan in the second half of the nineteenth century. In order to investigate this historical process, this paper focuses on the dissemination of information of a new medical technology developed in the mid-nineteenth century; it does so by making comparisons of the access to medical information between Europe, the USA and Japan. The hypodermic injection method was introduced in the clinical field in Europe and the USA as a newly developed therapeutic method during the 1850s and 1870s. This study analyzed information on the medical assessments of this method by clinicians of these periods. The crucial factor in accumulating this information was to develop a worldwide inter-medical communication circle with the aid of the medical journals. Information on the hypodermic injection method was introduced in Japan almost simultaneously with its introduction in Europe and the USA. However, because of the geographical distance and the language barrier, Japanese clinicians lacked access to this worldwide communication circle, and they accepted this new method without adequate medical technology assessments.

  4. A P-value model for theoretical power analysis and its applications in multiple testing procedures

    Directory of Open Access Journals (Sweden)

    Fengqing Zhang

    2016-10-01

    Full Text Available Abstract Background Power analysis is a critical aspect of the design of experiments to detect an effect of a given size. When multiple hypotheses are tested simultaneously, multiplicity adjustments to p-values should be taken into account in power analysis. There are a limited number of studies on power analysis in multiple testing procedures. For some methods, the theoretical analysis is difficult and extensive numerical simulations are often needed, while other methods oversimplify the information under the alternative hypothesis. To this end, this paper aims to develop a new statistical model for power analysis in multiple testing procedures. Methods We propose a step-function-based p-value model under the alternative hypothesis, which is simple enough to perform power analysis without simulations, but not so simple that it loses the information from the alternative hypothesis. The first step is to transform the distributions of different test statistics (e.g., t, chi-square, or F) into distributions of the corresponding p-values. We then use a step function to approximate each p-value distribution by matching its mean and variance. Lastly, the step-function-based p-value model can be used for theoretical power analysis. Results The proposed model is applied to problems in multiple testing procedures. We first show how the most powerful critical constants can be chosen using the step-function-based p-value model. Our model is then applied to the field of multiple testing procedures to explain the assumption of monotonicity of the critical constants. Lastly, we apply our model to a behavioral weight loss and maintenance study to select the optimal critical constants. Conclusions The proposed model is easy to implement and preserves the information from the alternative hypothesis.
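
    The first step described above, transforming a test statistic's distribution under the alternative into a p-value distribution, already yields theoretical power without simulation in simple cases. A minimal sketch for a one-sided z-test (an illustrative special case, not the paper's step-function model):

```python
from statistics import NormalDist

# Theoretical power of a one-sided z-test via the p-value distribution:
# under the alternative Z ~ N(mu, 1), the p-value is p = 1 - Phi(Z), so
# power at level alpha is the CDF of p evaluated at alpha.

_std = NormalDist()

def power_one_sided_z(mu, alpha):
    """P(p <= alpha) when Z ~ N(mu, 1): 1 - Phi(Phi^{-1}(1 - alpha) - mu)."""
    return 1 - _std.cdf(_std.inv_cdf(1 - alpha) - mu)

print(round(power_one_sided_z(2.5, 0.05), 3))  # about 0.80 for mu = 2.5
```

    With mu = 0 this reduces to power = alpha, as it must; the step-function model of the paper plays the same role for test statistics whose p-value distributions are not available in closed form.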

  5. A queer-theoretical approach to community health psychology.

    Science.gov (United States)

    Easpaig, Bróna R Nic Giolla; Fryer, David M; Linn, Seònaid E; Humphrey, Rhianna H

    2014-01-01

    Queer-theoretical resources offer ways of productively rethinking how central concepts such as 'person-context', 'identity' and 'difference' may be understood for community health psychologists. This would require going beyond consideration of the problems with which queer theory is popularly associated to cautiously engage with the aspects of this work relevant to the promotion of collective practice and engaging with processes of marginalisation. In this article, we will draw upon and illustrate the queer-theoretical concepts of 'performativity' and 'cultural intelligibility' before moving towards a preliminary mapping of what a queer-informed approach to community health psychology might involve.

  6. 41st Vietnam National Conference on Theoretical Physics

    International Nuclear Information System (INIS)

    2017-01-01

    Preface The 41st Vietnam National Conference on Theoretical Physics (NCTP-41) was held during 1-4 August 2016 in Nha Trang, Vietnam. The NCTP-41 was organized by the Institute of Physics, Vietnam Academy of Science and Technology (IOP-VAST) with the support of the Vietnamese Theoretical Physics Society (VTPS). This meeting belongs to a series of annual theoretical physics conferences that started in 1976. The conference covered a wide range of theoretical physics topics from 4 major fields: • Particle, nuclear and astro-physics, • Molecular physics, quantum optics and quantum computation, • Condensed matter physics, • Soft matter, biological and interdisciplinary physics. 115 participants attended the conference; 2 invited talks, 22 oral contributions and 75 poster contributions were presented. This volume contains selected papers contributed by the participants. Editors of the NCTP-41 Proceedings: Trinh Xuan Hoang, Hoang Anh Tuan and Vu Ngoc Tuoc. Information about the organizer, sponsor, honorary chair and chair, as well as lists of committees and participants, is available in the PDF (paper)

  7. Information processing among high-performance managers

    Directory of Open Access Journals (Sweden)

    S.C. Garcia-Santos

    2010-01-01

    Full Text Available The purpose of this study was to evaluate the information processing of 43 business managers with superior professional performance. The theoretical framework draws on three models: Henry Mintzberg's Theory of Managerial Roles, the Theory of Information Processing, and John Exner's Rorschach Response Process Model. The participants were evaluated using the Rorschach method. The results show that these managers are able to collect data, evaluate them, and establish rankings properly, while remaining objective and accurate in assessing problems. This information-processing style permits an interpretation of the surrounding world on the basis of a very personal and characteristic cognitive style.

  8. Infantilism: Theoretical Construct and Operationalization

    Science.gov (United States)

    Sabelnikova, Y. V.; Khmeleva, N. L.

    2018-01-01

    The aim of this article is to define and operationalize the construct of infantilism. The methods of theoretical research involve analysis and synthesis. Age and content criteria are analyzed for childhood and adulthood. Infantile traits in an adult are described. Results: The characteristics of adult infantilism in the modern world are defined,…

  9. Assessing Two Theoretical Frameworks of Civic Engagement

    Science.gov (United States)

    García-Cabrero, Benilde; Pérez-Martínez, María Guadalupe; Sandoval-Hernández, Andrés; Caso-Niebla, Joaquín; Díaz-López, Carlos David

    2016-01-01

    The purpose of this study was to empirically test two major theoretical models: a modified version of the social capital model (Pattie, Seyd and Whiteley, 2003), and the Informed Social Engagement Model (Barr and Selman, 2014; Selman and Kwok, 2010), to explain civic participation and civic knowledge of adolescents from Chile, Colombia and Mexico,…

  10. Theoretical study of some aspects of the nucleo-bases reactivity: definition of new theoretical tools for the study of chemical reactivity

    International Nuclear Information System (INIS)

    Labet, V.

    2009-09-01

    In this work, three kinds of nucleo-base damage were studied from a theoretical point of view with quantum chemistry methods based on density functional theory: the spontaneous deamination of cytosine and its derivatives, the formation of tandem lesions induced by hydroxyl radicals in anaerobic medium, and the formation of pyrimidine dimers under UV radiation. The complementary use of quantitative static methods, which allow exploration of the potential energy surface of a chemical reaction, and of 'conceptual DFT' principles yields information about the mechanisms involved and rationalizes the differences in nucleo-base reactivity towards the formation of the same kind of damage. At the same time, a reflection was undertaken on the concept of the asynchronous concerted mechanism, in terms of the physical meaning of the transition state, respect of the Maximum Hardness Principle, and determination of the number of primitive processes involved. Finally, a new local reactivity index was developed, relevant to understanding the reactivity of a molecular system in an excited state. (author)

  11. Deriving harmonised forest information in Europe using remote sensing methods

    DEFF Research Database (Denmark)

    Seebach, Lucia Maria

    the need for harmonised forest information can be satisfied using remote sensing methods. In conclusion, the study showed that it is possible to derive harmonised forest information of high spatial detail in Europe with remote sensing. The study also highlighted the imperative provision of accuracy...

  12. Theoretical interpretation of data from high-energy nuclear collisions

    International Nuclear Information System (INIS)

    Fai, G.

    1988-09-01

    Nuclear collision data at energies ranging from medium to relativistic are interpreted theoretically. The major objective is a better understanding of high-energy heavy-ion collisions, with particular emphasis on the properties of excited nuclear matter. Further progress towards a satisfactory description of excited subsaturation nuclear matter is achieved. The mean free path of a nucleon in nuclear matter, which is a critical parameter in assessing the applicability of certain nuclear collision models, is investigated. Experimental information is used together with theoretical concepts in collaborations with experimentalists in order to learn about the reaction mechanism and about excited nuclear matter properties. In the framework of a more strictly theoretical program development, subnuclear degrees of freedom and nonlinear phenomena in model field theories are studied

  13. Extraction Method for Earthquake-Collapsed Building Information Based on High-Resolution Remote Sensing

    International Nuclear Information System (INIS)

    Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing

    2014-01-01

    At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation, which cannot quickly deliver the precise information needed for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two rapid extraction methods for earthquake-collapsed building information, one based on pixel-oriented theory and one on object-oriented theory. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method; its key idea is to mask, layer by layer, all image information, including that on collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably faster computer processing. For the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy rule system was established for extracting earthquake-collapsed building information. We compared the two sets of results using three criteria: accuracy assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and addresses the problems of 'same object, different spectra' and 'same spectrum, different objects'. With an overall accuracy of 90.38%, it achieves more scientific and accurate results than the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information based on high-resolution remote sensing
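
    The overall accuracy figures quoted above are simply the diagonal sum of a confusion matrix divided by the total sample count. A minimal sketch with hypothetical numbers (not the paper's validation data):

```python
def overall_accuracy(confusion):
    """Overall accuracy: correctly classified (diagonal) over total."""
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return correct / total

# hypothetical collapsed/intact confusion matrix, rows = reference class
cm = [[181, 19],
      [20, 180]]
print(round(overall_accuracy(cm), 4))  # 0.9025
```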

  14. Developing corpus-based translation methods between informal and formal mathematics : project description

    NARCIS (Netherlands)

    Kaliszyk, C.; Urban, J.; Vyskocil, J.; Geuvers, J.H.; Watt, S.M.; Davenport, J.H.; Sexton, A.P.; Sojka, P.; Urban, J.

    2014-01-01

    The goal of this project is to (i) accumulate annotated informal/formal mathematical corpora suitable for training semi-automated translation between informal and formal mathematics by statistical machine-translation methods, (ii) to develop such methods oriented at the formalization task, and in

  15. Decomposition of overlapping protein complexes: A graph theoretical method for analyzing static and dynamic protein associations

    Directory of Open Access Journals (Sweden)

    Guimarães Katia S

    2006-04-01

    Full Text Available Abstract Background Most cellular processes are carried out by multi-protein complexes, groups of proteins that bind together to perform a specific task. Some proteins form stable complexes, while other proteins form transient associations and are part of several complexes at different stages of a cellular process. A better understanding of this higher-order organization of proteins into overlapping complexes is an important step towards unveiling functional and evolutionary mechanisms behind biological networks. Results We propose a new method for identifying and representing overlapping protein complexes (or larger units called functional groups) within a protein interaction network. We develop a graph-theoretical framework that enables automatic construction of such a representation. We illustrate the effectiveness of our method by applying it to the TNFα/NF-κB and pheromone signaling pathways. Conclusion The proposed representation helps in understanding the transitions between functional groups and allows for tracking a protein's path through a cascade of functional groups. Therefore, depending on the nature of the network, our representation is capable of elucidating temporal relations between functional groups. Our results show that the proposed method opens a new avenue for the analysis of protein interaction networks.
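
    One simple graph-theoretical stand-in for overlapping groups (not the authors' construction) is to treat the maximal cliques of the interaction graph as functional groups and record each protein's memberships, which makes the overlaps explicit:

```python
def maximal_cliques(adj):
    """Bron-Kerbosch enumeration of maximal cliques in an undirected
    graph given as {node: set_of_neighbours}."""
    cliques = []
    def expand(r, p, x):
        if not p and not x:
            cliques.append(r)
            return
        for v in list(p):
            expand(r | {v}, p & adj[v], x & adj[v])
            p = p - {v}
            x = x | {v}
    expand(set(), set(adj), set())
    return cliques

# toy interaction network: protein "b" bridges two functional groups
adj = {
    "a": {"b", "c"}, "b": {"a", "c", "d", "e"}, "c": {"a", "b"},
    "d": {"b", "e"}, "e": {"b", "d"},
}
groups = maximal_cliques(adj)
membership = {v: [g for g in groups if v in g] for v in adj}
print(groups)          # two overlapping groups sharing "b"
print(membership["b"])
```

    The membership map is what lets one follow a protein's path through a cascade: "b" appears in both groups, so transitions through "b" connect them.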

  16. Informal Versus Formal Search : Which Yields a Better Pay?

    OpenAIRE

    Semih Tumen

    2015-01-01

    Estimates of the effect of the job contact method, i.e., informal versus formal search, on wage offers vary considerably across studies: some find a positive correlation between getting help from informal connections and obtaining high-paying jobs, while others find a negative one. In this paper, I theoretically investigate the sources of the discrepancies in these empirical results. Using a formal job search framework, I derive an equilibrium wage distribution which reveals that...

  17. Integrated information in discrete dynamical systems: motivation and theoretical framework.

    Directory of Open Access Journals (Sweden)

    David Balduzzi

    2008-06-01

    Full Text Available This paper introduces a time- and state-dependent measure of integrated information, phi, which captures the repertoire of causal states available to a system as a whole. Specifically, phi quantifies how much information is generated (uncertainty is reduced) when a system enters a particular state through causal interactions among its elements, above and beyond the information generated independently by its parts. Such mathematical characterization is motivated by the observation that integrated information captures two key phenomenological properties of consciousness: (i) there is a large repertoire of conscious experiences so that, when one particular experience occurs, it generates a large amount of information by ruling out all the others; and (ii) this information is integrated, in that each experience appears as a whole that cannot be decomposed into independent parts. This paper extends previous work on stationary systems and applies integrated information to discrete networks as a function of their dynamics and causal architecture. An analysis of basic examples indicates the following: (i) phi varies depending on the state entered by a network, being higher if active and inactive elements are balanced and lower if the network is inactive or hyperactive. (ii) phi varies for systems with identical or similar surface dynamics depending on the underlying causal architecture, being low for systems that merely copy or replay activity states. (iii) phi varies as a function of network architecture. High phi values can be obtained by architectures that conjoin functional specialization with functional integration. Strictly modular and homogeneous systems cannot generate high phi because the former lack integration, whereas the latter lack information. Feedforward and lattice architectures are capable of generating high phi but are inefficient. (iv) In Hopfield networks, phi is low for attractor states and neutral states, but increases if the networks
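
    The core quantity, the information generated when a system enters a state, can be illustrated for a tiny deterministic network: with a uniform prior over previous states, entering state s rules out every state outside its preimage. This toy omits the whole-versus-parts comparison that defines phi proper; the dynamics and names below are illustrative, not the paper's examples.

```python
from math import log2

def information_generated(transition, state):
    """Bits generated when a deterministic system enters `state`:
    log2(#states) minus log2(#states whose successor is `state`),
    assuming a uniform prior over the previous state."""
    states = list(transition)
    preimage = [s for s in states if transition[s] == state]
    if not preimage:
        return 0.0  # unreachable state: nothing to account for
    return log2(len(states)) - log2(len(preimage))

# two binary elements; system states are (a, b) tuples
copy_net = {(a, b): (a, b) for a in (0, 1) for b in (0, 1)}         # each element keeps its value
and_net = {(a, b): (a & b, a & b) for a in (0, 1) for b in (0, 1)}  # both take the AND of the pair

print(information_generated(copy_net, (1, 0)))  # 2.0 bits (unique predecessor)
print(information_generated(and_net, (1, 1)))   # 2.0 bits (only (1,1) maps here)
print(information_generated(and_net, (0, 0)))   # ~0.415 bits (3 predecessors)
```

    The copy network generates information too, but each element's own past accounts for it independently; phi additionally subtracts that part-wise information, which is why copy/replay systems score low in the abstract's point (ii).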

  18. Integrated information in discrete dynamical systems: motivation and theoretical framework.

    Science.gov (United States)

    Balduzzi, David; Tononi, Giulio

    2008-06-13

    This paper introduces a time- and state-dependent measure of integrated information, phi, which captures the repertoire of causal states available to a system as a whole. Specifically, phi quantifies how much information is generated (uncertainty is reduced) when a system enters a particular state through causal interactions among its elements, above and beyond the information generated independently by its parts. Such mathematical characterization is motivated by the observation that integrated information captures two key phenomenological properties of consciousness: (i) there is a large repertoire of conscious experiences so that, when one particular experience occurs, it generates a large amount of information by ruling out all the others; and (ii) this information is integrated, in that each experience appears as a whole that cannot be decomposed into independent parts. This paper extends previous work on stationary systems and applies integrated information to discrete networks as a function of their dynamics and causal architecture. An analysis of basic examples indicates the following: (i) phi varies depending on the state entered by a network, being higher if active and inactive elements are balanced and lower if the network is inactive or hyperactive. (ii) phi varies for systems with identical or similar surface dynamics depending on the underlying causal architecture, being low for systems that merely copy or replay activity states. (iii) phi varies as a function of network architecture. High phi values can be obtained by architectures that conjoin functional specialization with functional integration. Strictly modular and homogeneous systems cannot generate high phi because the former lack integration, whereas the latter lack information. Feedforward and lattice architectures are capable of generating high phi but are inefficient. 
(iv) In Hopfield networks, phi is low for attractor states and neutral states, but increases if the networks are optimized
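    The state-dependent flavour of this measure can be illustrated with a toy calculation. The sketch below is only a crude proxy: it compares whole-system past/present mutual information with the sum of the per-node mutual informations under a uniform prior, and omits the perturbational analysis and minimum-information-partition search that define the actual phi; the two-node dynamics are invented examples.

```python
from collections import Counter
from itertools import product
from math import log2

def mutual_information(pairs):
    """I(X;Y) in bits from a list of equiprobable (x, y) pairs."""
    n = len(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    pxy = Counter(pairs)
    return sum(c / n * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def phi_proxy(update):
    """Whole-system past/present MI minus the sum of per-node MIs,
    assuming a uniform prior over the four past states."""
    states = list(product((0, 1), repeat=2))
    whole = mutual_information([(s, update(s)) for s in states])
    parts = sum(mutual_information([(s[i], update(s)[i]) for s in states])
                for i in range(2))
    return whole - parts

# Toy dynamics: an interacting network vs. one that merely copies its state.
xor_shift = lambda s: (s[0] ^ s[1], s[0])   # A' = A xor B, B' = A
copy      = lambda s: (s[0], s[1])          # each node copies itself

print(phi_proxy(xor_shift))  # 2.0 bits: information generated only jointly
print(phi_proxy(copy))       # 0.0 bits: the parts already account for everything
```

Consistent with the abstract's observation, the network that merely copies its state scores zero, while the interacting network generates information above and beyond its parts.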

  19. Identification of source velocities on 3D structures in non-anechoic environments: Theoretical background and experimental validation of the inverse patch transfer functions method

    Science.gov (United States)

    Aucejo, M.; Totaro, N.; Guyader, J.-L.

    2010-08-01

    In noise control, identification of the source velocity field remains a major open problem. Consequently, methods such as nearfield acoustical holography (NAH), principal source projection, the inverse frequency response function and hybrid NAH have been developed. However, these methods require free field conditions that are often difficult to achieve in practice. This article presents an alternative method known as inverse patch transfer functions (iPTF), designed to identify source velocities and developed in the framework of the European SILENCE project. The method is based on the definition of a virtual cavity, the double measurement of the pressure and particle velocity fields on the aperture surfaces of this volume, divided into elementary areas called patches, and the inversion of impedance matrices numerically computed from a modal basis obtained by FEM. Theoretically, the method is applicable to sources with complex 3D geometries, and measurements can be carried out in a non-anechoic environment, even in the presence of other stationary sources outside the virtual cavity. In the present paper, the theoretical background of the iPTF method is described and the results (numerical and experimental) for a source with simple geometry (two baffled pistons driven in antiphase) are presented and discussed.

  20. Psychotherapy Integration via Theoretical Unification

    Directory of Open Access Journals (Sweden)

    Warren W. Tryon

    2017-01-01

    Full Text Available Meaningful psychotherapy integration requires theoretical unification because psychotherapists can only be expected to treat patients with the same diagnoses similarly if they understand these disorders similarly and if they agree on the mechanisms by which effective treatments work. Tryon (in press) has proposed a transtheoretic transdiagnostic psychotherapy based on an Applied Psychological Science (APS) clinical orientation, founded on a BioPsychology Network explanatory system that provides sufficient theoretical unification to support meaningful psychotherapy integration. That proposal focused mainly on making a neuroscience argument. This article makes a different argument for theoretical unification and consequently psychotherapy integration. The strength of theories of psychotherapy, like all theory, is to focus on certain topics, goals, and methods. But this strength is also a weakness because it can blind one to alternative perspectives and thereby promote unnecessary competition among therapies. This article provides a broader perspective based on learning and memory that is consistent with the behavioral, cognitive, cognitive-behavioral, psychodynamic, pharmacologic, and Existential/Humanistic/Experiential clinical orientations. It thereby provides a basis for meaningful psychotherapy integration.

  1. Theoretical model and experimental verification on the PID tracking method using liquid crystal optical phased array

    Science.gov (United States)

    Wang, Xiangru; Xu, Jianhua; Huang, Ziqiang; Wu, Liang; Zhang, Tianyi; Wu, Shuanghong; Qiu, Qi

    2017-02-01

    Liquid crystal optical phased array (LC-OPA) is considered a promising non-mechanical laser deflector because it is fabricated using photolithographic patterning technology, which has been well advanced by the electronics and display industry. As a vital application of LC-OPA, free-space laser communication has demonstrated its merits in communication bandwidth. Before data communication, the ATP (acquisition, tracking and pointing) process takes a relatively long time, creating a bottleneck in free-space laser communication; meanwhile, accurate real-time tracking is critical to maintaining a stable communication link. Liquid crystal, an electro-optic medium with low driving voltage, can serve as the laser beam deflector. This paper presents a fast tracking method using a liquid crystal optical phased array as the beam deflector and a CCD as the beacon light detector. A PID (Proportion Integration Differentiation) loop is introduced as the control algorithm to generate the corresponding steering angle. Theoretical analysis and experimental verification demonstrate that the PID closed-loop system can suppress random attitude vibration and achieve fast, accurate tracking. Theoretical analysis predicts a tracking accuracy better than 6.5 μrad, in reasonable agreement with the experimental result of better than 12.6 μrad obtained after 10 adjustments.
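    The control step can be sketched as a textbook discrete PID loop. The plant model and gains below are invented placeholders, not the paper's LC-OPA/CCD setup; the sketch only shows how the loop drives the steering error toward zero.

```python
class PID:
    """Textbook discrete PID controller (gains are illustrative only)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy plant: the steering angle responds directly to the commanded rate.
target = 1.0          # desired deflection angle (arbitrary units)
angle = 0.0
pid = PID(kp=2.0, ki=0.5, kd=0.05, dt=0.01)
for _ in range(2000):
    angle += pid.update(target - angle) * pid.dt

print(abs(target - angle))  # residual tracking error (small)
```

The integral term removes steady-state error and the derivative term damps overshoot, which is the mechanism the abstract credits for suppressing attitude vibration.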

  2. Do pseudo-absence selection strategies influence species distribution models and their predictions? An information-theoretic approach based on simulated data

    Directory of Open Access Journals (Sweden)

    Guisan Antoine

    2009-04-01

    Full Text Available Abstract Background Multiple logistic regression is precluded from many practical applications in ecology that aim to predict the geographic distributions of species because it requires absence data, which are rarely available or are unreliable. In order to use multiple logistic regression, many studies have simulated "pseudo-absences" through a number of strategies, but it is unknown how the choice of strategy influences models and their geographic predictions of species. In this paper we evaluate the effect of several prevailing pseudo-absence strategies on the predictions of the geographic distribution of a virtual species whose "true" distribution and relationship to three environmental predictors were predefined. We evaluated the effect of using (a) real absences, (b) pseudo-absences selected randomly from the background, and (c) two-step approaches: pseudo-absences selected from low suitability areas predicted by either Ecological Niche Factor Analysis (ENFA) or BIOCLIM. We compared how the choice of pseudo-absence strategy affected model fit, predictive power, and information-theoretic model selection results. Results Models built with true absences had the best predictive power, best discriminatory power, and the "true" model (the one that contained the correct predictors) was supported by the data according to AIC, as expected. Models based on random pseudo-absences had among the lowest fit, but yielded the second highest AUC value (0.97), and the "true" model was also supported by the data. Models based on two-step approaches had intermediate fit, the lowest predictive power, and the "true" model was not supported by the data. Conclusion If ecologists wish to build parsimonious GLM models that will allow them to make robust predictions, a reasonable approach is to use a large number of randomly selected pseudo-absences, and perform model selection based on an information theoretic approach. However, the resulting models can be expected to have
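    The information-theoretic selection step can be sketched as follows; the parameter counts and maximized log-likelihoods are invented placeholders, not values from the study.

```python
from math import exp

def aic(n_params, log_likelihood):
    """Akaike Information Criterion: lower is better."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical fits: (parameters incl. intercept, maximized log-likelihood).
candidates = {
    "true model (3 correct predictors)": (4, -120.5),
    "overfit model (6 predictors)":      (7, -119.8),
    "underfit model (1 predictor)":      (2, -131.2),
}

scores = {name: aic(k, ll) for name, (k, ll) in candidates.items()}
best = min(scores, key=scores.get)

# Akaike weights: relative support for each candidate given the data.
min_aic = min(scores.values())
raw = {name: exp(-0.5 * (s - min_aic)) for name, s in scores.items()}
weights = {name: r / sum(raw.values()) for name, r in raw.items()}

print(best)  # the "true" model wins despite a slightly lower raw fit
```

The penalty term 2k is what lets a parsimonious model with the correct predictors beat a richer model whose extra parameters buy only a marginal likelihood gain.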

  3. Theoretical and methodological basis of the comparative historical and legal method development

    Directory of Open Access Journals (Sweden)

    Д. А. Шигаль

    2015-05-01

    Full Text Available Problem setting. The development of any scientific method always raises questions both about its structural and functional characteristics and its place in the system of scientific methods, and about the practicability of such methodological work. This paper attempts to give a detailed response to the major comments and objections arising in respect of separating the comparative historical legal method as an independent means of special scientific knowledge. Recent research and publications analysis. Analyzing research and publications within the theme of this article, it should be noted that attention to the methodological issues of both general and legal science was in its time paid by such prominent foreign and domestic scholars as I. D. Andreev, Yu. Ya. Baskin, O. L. Bygych, M. A. Damirli, V. V. Ivanov, I. D. Koval'chenko, V. F. Kolomyitsev, D. V. Lukyanov, L. A. Luts, J. Maida, B. G. Mogilnytsky, N. M. Onishchenko, N. M. Parkhomenko, O. V. Petryshyn, S. P. Pogrebnyak, V. I. Synaisky, V. M. Syryh, O. F. Skakun, A. O. Tille, D. I. Feldman and others. Despite the large number of scientific papers in this field, interest within the research community in the methodology of the history of state and law still remains unjustifiably low. Paper objective. The purpose of this paper is to provide a theoretical and methodological rationale for the need to separate and develop the comparative historical legal method, in the form of answers to the most common questions and objections that arise in the research community in this regard. Paper main body. Development of comparative historical legal means of knowledge is well justified because it meets the requirements of scientific method efficiency, whose criteria include the speed of achieving the goal, the ease of use of a given way of scientific knowledge, the universality of research methods, the convenience of the techniques used and so on. Combining the

  4. Interface methods for using intranet portal organizational memory information system.

    Science.gov (United States)

    Ji, Yong Gu; Salvendy, Gavriel

    2004-12-01

    In this paper, an intranet portal is considered as an information infrastructure (organizational memory information system, OMIS) supporting organizational learning. The properties and the hierarchical structure of information and knowledge in an intranet portal OMIS were identified as a problem for navigation tools of an intranet portal interface. The problem relates to navigation and retrieval functions of intranet portal OMIS and is expected to adversely affect user performance, satisfaction, and usefulness. To solve the problem, a conceptual model for navigation tools of an intranet portal interface was proposed and an experiment using a crossover design was conducted with 10 participants. In the experiment, a separate access method (tabbed tree tool) was compared to a unified access method (single tree tool). The results indicate that each information/knowledge repository for which a user has different structural knowledge should be handled separately, with separate access, to increase user satisfaction and the usefulness of the OMIS and to improve user performance in navigation.

  5. Methods for Measuring Productivity in Libraries and Information Centres

    OpenAIRE

    Mohammad Alaaei

    2009-01-01

      Within Information centers, productivity is the result of optimal and effective use of information resources, service quality improvement, increased user satisfaction, pleasantness of working environment, increased motivation and enthusiasm of staff to work better. All contribute to the growth and development of information centers. Thus these centers would need to be familiar with methods employed in productivity measurement. Productivity is one of the criteria for evaluating system perfor...

  6. Intuition in Decision Making –Theoretical and Empirical Aspects

    Directory of Open Access Journals (Sweden)

    Kamila Malewska

    2015-11-01

    Full Text Available In an economy dominated by information and knowledge, analysis ceases to be the sole and sufficient source of knowledge. Managers seek alternative ways of obtaining and interpreting information and knowledge. Here, managerial intuitive potential begins to play an important role. The aim of this paper is to present the issue of intuition in decision making in both theoretical and empirical terms. The first part presents the essence of intuition and its role in management, especially in decision making. Then, the empirical part attempts to identify the intuitive potential of managers and the extent of its use in practical decision making. The case study method was used in order to achieve this goal. The analysis involved a Polish food company, "Fawor", that employs more than 300 workers. These literature and empirical studies in the area of intuition were conducted within the research project "The impact of managerial intuitive potential on the effectiveness of decision making processes", financed by the National Science Centre, Poland (funds allocated on the basis of decision No. DEC-2014/13/D/HS4/01750).

  7. The use of information theory for the evaluation of biomarkers of aging and physiological age.

    Science.gov (United States)

    Blokh, David; Stambler, Ilia

    2017-04-01

    The present work explores the application of information theoretical measures, such as entropy and normalized mutual information, for research of biomarkers of aging. The use of information theory affords unique methodological advantages for the study of aging processes, as it allows evaluating non-linear relations between biological parameters, providing the precise quantitative strength of those relations, both for individual and multiple parameters, showing cumulative or synergistic effect. Here we illustrate those capabilities utilizing a dataset on heart disease, including diagnostic parameters routinely available to physicians. The use of information-theoretical methods, utilizing normalized mutual information, revealed the exact amount of information that various diagnostic parameters or their combinations contained about the persons' age. Based on those exact informative values for the correlation of measured parameters with age, we constructed a diagnostic rule (a decision tree) to evaluate physiological age, as compared to chronological age. The present data illustrated that younger subjects suffering from heart disease showed characteristics of people of higher age (higher physiological age). Utilizing information-theoretical measures, with additional data, it may be possible to create further clinically applicable information-theory-based markers and models for the evaluation of physiological age, its relation to age-related diseases and its potential modifications by therapeutic interventions.
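    A minimal sketch of the entropy and normalized mutual information calculations on made-up categorical records follows; normalizing by the smaller marginal entropy is one common convention and may differ from the normalization used in the paper.

```python
from collections import Counter
from math import log2

def entropy(xs):
    """Shannon entropy in bits of an empirical discrete distribution."""
    n = len(xs)
    return -sum(c / n * log2(c / n) for c in Counter(xs).values())

def mutual_info(xs, ys):
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def normalized_mi(xs, ys):
    """I(X;Y) scaled into [0, 1]; here normalized by min(H(X), H(Y))."""
    h = min(entropy(xs), entropy(ys))
    return mutual_info(xs, ys) / h if h > 0 else 0.0

# Made-up discretized records: age decade vs. a binned diagnostic parameter.
age_group = [3, 3, 4, 4, 5, 5, 6, 6]
marker_a  = [0, 0, 0, 0, 1, 1, 1, 1]   # tracks age coarsely
marker_b  = [0, 1, 0, 1, 0, 1, 0, 1]   # unrelated to age

print(normalized_mi(age_group, marker_a))  # 1.0: fully informative at its resolution
print(normalized_mi(age_group, marker_b))  # 0.0: carries no age information
```

Because mutual information makes no linearity assumption, the same computation captures non-linear and combined-parameter relations, which is the advantage the abstract emphasizes.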

  8. The Trojan horse method in nuclear astrophysics

    International Nuclear Information System (INIS)

    Aliotta, M.; Rolfs, C.; Lattuada, M.; Pellegriti, M.G.; Pizzone, R.G.; Spitaleri, C.; Miljanic, Dj.; Typel, S.; Wolter, H.H.

    2001-01-01

    Because of the Coulomb barrier, reaction cross sections in astrophysics cannot be accessed directly at the relevant Gamow energies, unless very favourable conditions are met (e.g. the LUNA underground experiments). Theoretical extrapolations of available data are then needed to derive the astrophysical S(0)-factor. Various indirect processes have been used in order to obtain additional information on the parameters entering these extrapolations. The Trojan Horse Method is an indirect method which might help to bypass some of the problems typically encountered in direct measurements, namely the presence of the Coulomb barrier and the effect of electron screening. However, a comparison with direct data in an appropriate energy region (e.g. around the Coulomb barrier) is crucial before extending the method to the relevant Gamow energy. Additionally, experimental and theoretical tests are needed to validate the assumptions underlying the method. The application of the Trojan Horse Method to some cases of interest is discussed

  9. Identification of Dynamic Flow Stress Curves Using the Virtual Fields Methods: Theoretical Feasibility Analysis

    Science.gov (United States)

    Leem, Dohyun; Kim, Jin-Hwan; Barlat, Frédéric; Song, Jung Han; Lee, Myoung-Gyu

    2018-03-01

    An inverse approach based on the virtual fields method (VFM) is presented to identify material hardening parameters under dynamic deformation. This dynamic VFM (D-VFM) does not require load information for the parameter identification. Instead, it utilizes acceleration fields in the specimen's gage region. To investigate the feasibility of the proposed inverse approach for dynamic deformation, virtual experiments using dynamic finite element simulations were conducted. The simulations provide all the data necessary for the identification, such as displacement, strain, and acceleration fields. The accuracy of the identification results was evaluated by changing several parameters such as specimen geometry, velocity, and traction boundary conditions. The analysis clearly shows that the D-VFM, which utilizes acceleration fields, can be a good alternative to the conventional identification procedure that uses load information. It was also found that proper deformation conditions are required to generate sufficient acceleration fields during dynamic deformation and enhance the identification accuracy of the D-VFM.
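    The load-free identification idea can be illustrated with a 1D toy problem: for a synthetic standing wave in a linear elastic bar, a virtual field chosen to vanish at the ends removes the unknown boundary tractions from the virtual work balance, so the stiffness is recovered from displacement and acceleration fields alone. The material values and wave ansatz are invented, and linear elasticity replaces the paper's hardening model; this is a sketch of the principle, not the D-VFM itself.

```python
from math import pi, sin, cos, sqrt

# Synthetic "measurement": standing wave u(x,t) = U sin(kx) cos(wt) in a bar.
E_true, rho, L, U = 210e9, 7800.0, 1.0, 1e-4
k = pi / L
w = k * sqrt(E_true / rho)           # dispersion relation of the elastic bar
n = 1001
h = L / (n - 1)
xs = [i * h for i in range(n)]
u = [U * sin(k * x) for x in xs]     # displacement snapshot at t = 0
a = [-w**2 * ui for ui in u]         # acceleration snapshot at t = 0

# Virtual field u*(x) = sin(kx): zero at both ends, so the (unmeasured)
# boundary tractions do no virtual work and no load data is needed.
ustar = [sin(k * x) for x in xs]
dustar = [k * cos(k * x) for x in xs]

def central_diff(f, h):
    g = [0.0] * len(f)
    g[0] = (f[1] - f[0]) / h
    g[-1] = (f[-1] - f[-2]) / h
    for i in range(1, len(f) - 1):
        g[i] = (f[i + 1] - f[i - 1]) / (2 * h)
    return g

def trapz(f, h):
    return h * (sum(f) - 0.5 * (f[0] + f[-1]))

du = central_diff(u, h)
# Virtual work balance: E * int(u' v') dx = -rho * int(a v) dx
internal = trapz([du[i] * dustar[i] for i in range(n)], h)
inertial = trapz([a[i] * ustar[i] for i in range(n)], h)
E_identified = -rho * inertial / internal

print(E_identified / E_true)  # close to 1: stiffness recovered without load data
```

The acceleration field plays the role of the load cell: the inertial virtual work replaces the external-force term in the standard VFM equation.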

  10. Combination of real options and game-theoretic approach in investment analysis

    Science.gov (United States)

    Arasteh, Abdollah

    2016-09-01

    Investments in technology account for a large share of capital investment by major companies, and assessing such investment projects is critical to the efficient allocation of resources. Viewing investment projects as real options, this paper develops a method for assessing technology investment decisions in the joint presence of uncertainty and competition, combining game-theoretic models of strategic market interactions with a real options approach. Several key characteristics underlie the model. First, our study shows how investment strategies depend on competitive interactions: under the force of competition, firms hurry to exercise their options early, and the resulting "hurry equilibrium" destroys the option value of waiting and leads to aggressive investment behavior. Second, we obtain optimal investment policies and critical investment thresholds, which suggests that integration will be unavoidable in some information product markets. The model yields new intuitions about the forces that shape market behavior as observed in the information technology industry. It can be used to specify optimal investment policies for technology innovations and adoptions, multistage R&D, and investment projects in information technology.
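    The interaction between the option value of waiting and preemption risk can be sketched with a one-period binomial example; all figures are invented for illustration and the model is far simpler than the paper's game-theoretic setup.

```python
# One-period binomial project: value turns out high (Vu) or low (Vd) with
# probability p; investing costs I. All numbers are invented.
Vu, Vd, I, p = 150.0, 70.0, 100.0, 0.5

invest_now = p * Vu + (1 - p) * Vd - I   # commit before uncertainty resolves
# Waiting lets the firm skip investment in the bad state (the option value).
wait_monopoly = p * max(Vu - I, 0.0) + (1 - p) * max(Vd - I, 0.0)

# Competition: while waiting, a rival preempts the market with probability q,
# in which case the option expires worthless.
q = 0.7
wait_duopoly = (1 - q) * wait_monopoly

print(invest_now, wait_monopoly, wait_duopoly)
```

Without a rival, waiting dominates (25 vs. 10 here); once preemption risk is priced in, waiting is worth less than investing immediately, reproducing in miniature the "hurry equilibrium" described above.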

  11. Refresher Course in Theoretical Physics at St. Stephen's College ...

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 7; Issue 5. Refresher Course in Theoretical Physics at St. Stephen's College University of Delhi, Delhi. Information and Announcements Volume 7 Issue 5 May 2002 pp 103-103 ...

  12. Association of Trans-theoretical Model (TTM) based Exercise Behavior Change with Body Image Evaluation among Female Iranian Students

    Directory of Open Access Journals (Sweden)

    Sahar Rostami

    2017-03-01

    Full Text Available Background: Body image is a determinant of individual attractiveness and physical activity among young people. This study aimed to assess the association of Trans-theoretical model based exercise behavior change with body image evaluation among female Iranian students. Materials and Methods: This cross-sectional study was conducted in Sanandaj city, Iran in 2016. Using a multistage sampling method, a total of 816 high school female students were included in the study. They completed a three-section questionnaire covering demographic information, Trans-theoretical model constructs and body image evaluation. The obtained data were analysed with SPSS version 21.0. Results: The results showed that more than 60% of participants were in the pre-contemplation and contemplation stages of exercise behavior. The means of perceived self-efficacy, barriers and benefits differed significantly across the stages of exercise behavior change (P

  13. The Padé approximant in theoretical physics

    CERN Document Server

    Baker, George Allen

    1970-01-01

    In this book, we study theoretical and practical aspects of computing methods for mathematical modelling of nonlinear systems. A number of computing techniques are considered, such as methods of operator approximation with any given accuracy; operator interpolation techniques including a non-Lagrange interpolation; methods of system representation subject to constraints associated with concepts of causality, memory and stationarity; methods of system representation with an accuracy that is the best within a given class of models; methods of covariance matrix estimation; methods for low-rank mat

  14. A framework for understanding culture and its relationship to information behaviour: Taiwanese aborigines' information behaviour

    Directory of Open Access Journals (Sweden)

    Nei-Ching Yeh

    2007-01-01

    Full Text Available Introduction. This article proposes a model of culture and its relationship to information behaviour based on two empirical studies of Taiwanese aborigines' information behaviour. Method. The research approach is ethnographic and the material was collected through observations, conversations, questionnaires, interviews and relevant documents. In 2003-2004, the author lived with two Taiwanese aboriginal tribes, the Yami tribe and the Tsau tribe, and conducted forty-two theme-based interviews. Analysis. Data were analysed with the help of software for qualitative analysis (NVivo), where all sentences from both interviews and field notes were coded. The conceptual framework used is the sociology of knowledge. Results. The model of culture and its relationship to information behaviour can show us how to think about the relationship between culture and human information behaviour. The model also identifies its elements, which are habitus, tradition and prejudice, and suggests how we can apply the concepts of information fullness and emptiness to view the relationship between culture and human information behaviour. Conclusion. Theoretically, this research puts forward a new model of information behaviour and focuses on the role and the importance of culture when thinking about and studying human information behaviour. Methodologically, this study demonstrates how an ethnographic research method can contribute to exploring the influence that culture has on human life and the details of the human life world and information behaviour.

  15. Theoretical description and design of nanomaterial slab waveguides: application to compensation of optical diffraction.

    Science.gov (United States)

    Kivijärvi, Ville; Nyman, Markus; Shevchenko, Andriy; Kaivola, Matti

    2018-04-02

    Planar optical waveguides made of designable spatially dispersive nanomaterials can offer new capabilities for nanophotonic components. As an example, a thin slab waveguide can be designed to compensate for optical diffraction and provide divergence-free propagation for strongly focused optical beams. Optical signals in such waveguides can be transferred in narrow channels formed by the light itself. We introduce here a theoretical method for characterization and design of nanostructured waveguides taking into account their inherent spatial dispersion and anisotropy. Using the method, we design a diffraction-compensating slab waveguide that contains only a single layer of silver nanorods. The waveguide shows low propagation loss and broadband diffraction compensation, potentially allowing transfer of optical information at a THz rate.

  16. Methods for model selection in applied science and engineering.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2004-10-01

    Mathematical models are developed and used to study the properties of complex systems and/or modify these systems to satisfy some performance requirements in just about every area of applied science and engineering. A particular reason for developing a model, e.g., performance assessment or design, is referred to as the model use. Our objective is the development of a methodology for selecting a model that is sufficiently accurate for an intended use. Information on the system being modeled is, in general, incomplete, so that there may be two or more models consistent with the available information. The collection of these models is called the class of candidate models. Methods are developed for selecting the optimal member from a class of candidate models for the system. The optimal model depends on the available information, the selected class of candidate models, and the model use. Classical methods for model selection, including the method of maximum likelihood and Bayesian methods, as well as a method employing a decision-theoretic approach, are formulated to select the optimal model for numerous applications. There is no requirement that the candidate models be random. Classical methods for model selection ignore model use and require data to be available. Examples are used to show that these methods can be unreliable when data is limited. The decision-theoretic approach to model selection does not have these limitations, and model use is included through an appropriate utility function. This is especially important when modeling high risk systems, where the consequences of using an inappropriate model for the system can be disastrous. The decision-theoretic method for model selection is developed and applied for a series of complex and diverse applications. These include the selection of the: (1) optimal order of the polynomial chaos approximation for non-Gaussian random variables and stationary stochastic processes, (2) optimal pressure load model to be
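    The contrast between classical and decision-theoretic selection can be sketched with hypothetical candidates: a use-specific expected utility can overturn a ranking based purely on fit. The parameter counts, log-likelihoods and utilities below are invented placeholders, not values from the report.

```python
def aic(k, loglik):
    """Akaike Information Criterion: lower is better."""
    return 2 * k - 2 * loglik

# Hypothetical candidates for a high-risk system: a simple model that fits
# slightly better overall, and a heavier model that captures the failure tail.
candidates = {
    "simple":     {"k": 2, "loglik": -50.0, "expected_utility": -9.0},
    "tail-aware": {"k": 6, "loglik": -49.5, "expected_utility": -1.5},
}

by_aic = min(candidates,
             key=lambda m: aic(candidates[m]["k"], candidates[m]["loglik"]))
by_utility = max(candidates,
                 key=lambda m: candidates[m]["expected_utility"])

print(by_aic, by_utility)  # the two criteria disagree
```

The classical criterion prefers the parsimonious model, while the decision-theoretic criterion, which weighs the consequences of using each model for the intended purpose, prefers the one that models the failure tail; this is the distinction the abstract draws for high-risk systems.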

  17. Towards an Information Theoretic Analysis of Searchable Encryption

    NARCIS (Netherlands)

    Sedghi, S.; Doumen, J.M.; Hartel, Pieter H.; Jonker, Willem

    2008-01-01

    Searchable encryption is a technique that allows a client to store data in encrypted form on a curious server, such that data can be retrieved while leaking a minimal amount of information to the server. Many searchable encryption schemes have been proposed and proved secure in their own

  18. Quantum Wells, Wires and Dots Theoretical and Computational Physics of Semiconductor Nanostructures

    CERN Document Server

    Harrison, Paul

    2011-01-01

    Quantum Wells, Wires and Dots, 3rd Edition is aimed at providing all the essential information, both theoretical and computational, in order that the reader can, starting from essentially nothing, understand how the electronic, optical and transport properties of semiconductor heterostructures are calculated. Completely revised and updated, this text is designed to lead the reader through a series of simple theoretical and computational implementations, and slowly build from solid foundations, to a level where the reader can begin to initiate theoretical investigations or explanations of their

  19. "The Integrity and Obstinacy of Intellectual Creations": Jurgen Habermas and Librarianship's Theoretical Literature

    Science.gov (United States)

    Buschman, John

    2006-01-01

    Librarianship and library and information science (LIS) have long struggled with an ongoing lack of a theoretical and epistemological basis. There have been renewed efforts to explore various theoretical and philosophical positions and their meaning for librarianship and LIS research. This article explores the framework that Jurgen Habermas offers…

  20. Theoretical and Practical Studies on a Possible Genetic Method for Tsetse Fly Control

    Energy Technology Data Exchange (ETDEWEB)

    Curtis, C. F. [Tsetse Research Laboratory, School of Veterinary Science, University Of Bristol, Langford, Bristol (United Kingdom); Hill, W. G. [Institute of Animal Genetics, Edinburgh (United Kingdom)

    1968-06-15

    Chromosome translocations may be useful in pest control because they are a common type of mutation in a variety of organisms and, frequently, the heterozygote is semi-sterile and the homozygote fully fertile. It might be possible to induce such a translocation in a pest species, to breed from a selected ancestral pair of translocation homozygotes a large number of the homozygotes and to release these into a wild population. This would cause the production of heterozygotes in the wild population and hence would reduce the fertility of the population. This reduction would persist for a number of generations. Calculations, based on simplified assumptions, showed that this method of fertility reduction might be more economical than the use of sterilized males. In the present paper a theoretical comparison is made of the translocation and sterilized-male methods for the control of tsetse flies (Glossina sp.). A computer model has been set up which simulates, as far as possible, the known facts about birth, mating and death in a wild tsetse population. The predicted effects of releases of sterilized males and of translocation homozygotes are described and the modifications which would be caused by density-dependent mortality, migration and reduced viability of the translocation genotypes and sterilized males are indicated. It is concluded that to eradicate a well isolated wild population the numbers of translocation homozygotes required might well be considerably less than the number of sterilized males required for the same task. However, immigration into the population would greatly reduce the efficiency of the translocation method. The progress so far in attempting to produce a suitable translocation in Glossina austeni is described. Males have been treated with 5-7 krad of gamma radiation and a number of semi-sterile individuals have been selected from among their progeny. The semi-sterility is inherited and, by analogy with the results in other organisms, is
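    The core dynamics can be sketched with a deterministic one-locus model (heterozygotes half as fertile as either homozygote, random mating each generation); this is a simplification, not the authors' computer model of tsetse birth, mating and death. It shows the threshold behaviour: the translocation spreads to fixation only if released above a 50% allele frequency, and depresses mean fertility while both forms coexist.

```python
def next_generation(p):
    """Deterministic frequency of the translocation allele T after one round
    of random mating, with heterozygotes half as fertile as homozygotes."""
    w_nn, w_nt, w_tt = 1.0, 0.5, 1.0
    f_nn = (1 - p) ** 2 * w_nn        # normal homozygotes, weighted by fertility
    f_nt = 2 * p * (1 - p) * w_nt     # semi-sterile heterozygotes
    f_tt = p ** 2 * w_tt              # translocation homozygotes
    mean_fitness = f_nn + f_nt + f_tt
    return (f_tt + 0.5 * f_nt) / mean_fitness

def frequency_after(p0, generations=200):
    p = p0
    for _ in range(generations):
        p = next_generation(p)
    return p

print(frequency_after(0.6))  # released above the 0.5 threshold: T fixes
print(frequency_after(0.4))  # released below it: T is eliminated
```

The unstable equilibrium at p = 0.5 is why the release size matters: a sufficiently large release pushes the population past the threshold, after which heterozygote semi-sterility removes the remaining wild-type alleles.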

  1. COMPOSITE METHOD OF RELIABILITY RESEARCH FOR HIERARCHICAL MULTILAYER ROUTING SYSTEMS

    Directory of Open Access Journals (Sweden)

    R. B. Tregubov

    2016-09-01

    Full Text Available The paper presents the idea of a research method for hierarchical multilayer routing systems. The method is a composition of methods from graph theory, reliability theory, probability theory, etc. These methods are applied to the solution of particular analysis and optimization tasks and are systematically connected and coordinated with each other through a uniform set-theoretic representation of the object of research. Hierarchical multilayer routing systems are considered as infrastructure facilities (gas and oil pipelines, automobile and railway networks, systems of power supply and communication) that distribute material resources, energy or information with the use of hierarchically nested routing functions. For clarity, the theoretical constructions are illustrated by the task of determining the probability that a specific infocommunication system is in the up state. The author shows that a graph representation of the structure of the object of research can be constructively combined with a logical-probabilistic analysis of its reliability indices through a uniform set-theoretic representation of its elements and of the processes running in them.
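    The logical-probabilistic reliability calculation described above can be sketched in miniature. The sketch below assumes a hypothetical four-link network with two disjoint routes; the component reliabilities and the structure function are illustrative, not taken from the paper:

```python
from itertools import product

def system_up_probability(reliabilities, structure):
    """Probability that the system is up: exhaustive enumeration of
    component states, weighted by their probabilities (the
    logical-probabilistic method in its most direct form)."""
    total = 0.0
    for states in product([0, 1], repeat=len(reliabilities)):
        p = 1.0
        for s, r in zip(states, reliabilities):
            p *= r if s else (1.0 - r)
        if structure(states):
            total += p
    return total

# Hypothetical network: two disjoint routes, each of two links in series.
def two_routes(states):
    a, b, c, d = states
    return (a and b) or (c and d)

p_up = system_up_probability([0.9, 0.9, 0.8, 0.8], two_routes)
```

For this structure the enumeration reproduces the closed form 1 − (1 − 0.9·0.9)(1 − 0.8·0.8) = 0.9316; the enumeration approach, unlike the closed form, carries over unchanged to arbitrary structure functions.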

  2. Classification Method in Integrated Information Network Using Vector Image Comparison

    Directory of Open Access Journals (Sweden)

    Zhou Yuan

    2014-05-01

    Full Text Available A Wireless Integrated Information Network (WMN) consists of integrated information nodes that can gather data, such as images and voice, from their surroundings. Transmitting this information requires substantial resources, which decreases the service time of the network. In this paper we present a Classification Approach based on Vector Image Comparison (VIC) for WMN that improves the service time of the network. Methods for sub-region selection and conversion are also proposed.

  3. Reference group theory with implications for information studies: a theoretical essay

    Directory of Open Access Journals (Sweden)

    E. Murell Dawson

    2001-01-01

    Full Text Available This article explores the role and implications of reference group theory in relation to the field of library and information science. Reference group theory is based upon the principle that people take the standards of significant others as a basis for making self-appraisals, comparisons, and choices regarding need and use of information. Research that applies concepts of reference group theory to various sectors of library and information studies can provide data useful in enhancing areas such as information-seeking research, special populations, and uses of information. Findings suggest that knowledge gained from such research can help information professionals better understand the role theory plays in examining ways in which people manage their information and social worlds.

  4. Reconstructing Information in Large-Scale Structure via Logarithmic Mapping

    Science.gov (United States)

    Szapudi, Istvan

    We propose to develop a new method to extract information from large-scale structure data combining two-point statistics and non-linear transformations; before, this information was available only with substantially more complex higher-order statistical methods. Initially, most of the cosmological information in large-scale structure lies in two-point statistics. With non-linear evolution, some of that useful information leaks into higher-order statistics. The PI and group have shown in a series of theoretical investigations how that leakage occurs, and explained the Fisher information plateau at smaller scales. This plateau means that even as more modes are added to the measurement of the power spectrum, the total cumulative information (loosely speaking, the inverse error bar) is not increasing. Recently we have shown in Neyrinck et al. (2009, 2010) that a logarithmic (and a related Gaussianization or Box-Cox) transformation on the non-linear Dark Matter or galaxy field reconstructs a surprisingly large fraction of this missing Fisher information of the initial conditions. This was predicted by the earlier wave mechanical formulation of gravitational dynamics by Szapudi & Kaiser (2003). The present proposal is focused on working out the theoretical underpinning of the method to a point where it can be used in practice to analyze data. In particular, one needs to deal with the usual real-life issues of galaxy surveys, such as complex geometry, discrete sampling (Poisson or sub-Poisson noise), bias (linear or non-linear, deterministic or stochastic), redshift distortions, projection effects for 2D samples, and the effects of photometric redshift errors. We will develop methods for weak lensing and Sunyaev-Zeldovich power spectra as well, the latter specifically targeting Planck. In addition, we plan to investigate the question of residual higher-order information after the non-linear mapping, and possible applications for cosmology.
Our aim will be to work out
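    The effect of the logarithmic mapping on a skewed field can be illustrated with a toy example. The sketch below uses a lognormal random field as a stand-in for the non-linear density field (an assumption for illustration only; real analyses use simulated or observed dark matter or galaxy fields):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy stand-in for a non-linear density field: lognormal "overdensities".
delta = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)

def skewness(x):
    """Sample skewness: third central moment over variance^(3/2)."""
    x = x - x.mean()
    return (x**3).mean() / (x**2).mean() ** 1.5

# The logarithmic mapping: for a lognormal field this restores Gaussianity,
# so the strongly skewed one-point distribution becomes nearly symmetric.
log_field = np.log(delta)
```

The raw field is heavily skewed (theoretical skewness about 6.2 for sigma = 1), while the log-mapped field is Gaussian by construction, which is the one-point analogue of the information restoration described in the abstract.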

  5. Eighteenth annual West Coast theoretical chemistry conference

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-05-01

    Abstracts are presented from the eighteenth annual west coast theoretical chemistry conference. Topics include molecular simulations; quasiclassical simulations of reactions; photodissociation reactions; molecular dynamics; interface studies; electronic structure; and semiclassical methods for reactive systems.

  6. The theoretical study of passive and active optical devices via planewave based transfer (scattering) matrix method and other approaches

    Energy Technology Data Exchange (ETDEWEB)

    Zhuo, Ye [Iowa State Univ., Ames, IA (United States)

    2011-01-01

    In this thesis, we theoretically study electromagnetic wave propagation in several passive and active optical components and devices, including 2-D photonic crystals, straight and curved waveguides, and organic light-emitting diodes (OLEDs). Several optical designs are also presented, such as organic photovoltaic (OPV) cells and solar concentrators. The first part of the thesis focuses on theoretical investigation. First, the plane-wave-based transfer (scattering) matrix method (TMM) is briefly described, with a short review of photonic crystals and of other numerical methods used to study them (Chapters 1 and 2). Next, TMM itself is investigated in detail and further developed to deal with more complex optical systems. In Chapter 3, TMM is extended to curvilinear coordinates to study curved nanoribbon waveguides. The problem of a curved structure is transformed into an equivalent one of a straight structure with spatially dependent tensors of dielectric constant and magnetic permeability. In Chapter 4, a new set of localized basis orbitals is introduced to locally represent the electromagnetic field in photonic crystals, as an alternative to the planewave basis. The second part of the thesis focuses on the design of optical devices. First, two examples of TMM applications are given: the design of metal grating structures as replacements for ITO to enhance optical absorption in OPV cells (Chapter 6), and the design of the same structures to enhance light extraction from OLEDs (Chapter 7). Next, two design examples using the ray tracing method are given: applying a microlens array to enhance light extraction from OLEDs (Chapter 5), and an all-angle, wide-wavelength design of a solar concentrator (Chapter 8). In summary, this dissertation has extended TMM, making it capable of treating complex optical systems.
Several optical designs by TMM and ray tracing method are also given as a full complement of this

  7. Affinity-based, biophysical methods to detect and analyze ligand binding to recombinant proteins: matching high information content with high throughput.

    Science.gov (United States)

    Holdgate, Geoff A; Anderson, Malcolm; Edfeldt, Fredrik; Geschwindner, Stefan

    2010-10-01

    Affinity-based technologies have become impactful tools to detect, monitor and characterize molecular interactions using recombinant target proteins. This can aid the understanding of biological function by revealing mechanistic details and, even more importantly, enables the identification of new, improved ligands that can modulate the biological activity of those targets in a desired fashion. The selection of the appropriate technology is a key step in that process, as each of the currently available technologies offers a characteristic type of biophysical information about the ligand-binding event. Alongside their indisputable advantages, each of these technologies naturally has restrictions, quite frequently related to the target system to be studied but also to the affinity, solubility and molecular size of the ligands. This paper discusses some of the theoretical and experimental aspects of the most common affinity-based methods, the type of information that can be gained from each approach, the requirements and limitations of working with recombinant proteins on those platforms, and how those can be optimally addressed.
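    As a reminder of the quantitative link between affinity and the binding signal these technologies measure: under a simple 1:1 binding model with ligand in excess, the equilibrium fraction of target occupied follows directly from the dissociation constant Kd. This is a textbook relation, not specific to any platform discussed in the paper:

```python
def fraction_bound(ligand_conc, kd):
    """Equilibrium fraction of target occupied by ligand, assuming
    simple 1:1 binding and free ligand ~ total ligand (ligand in excess).
    ligand_conc and kd must be in the same concentration units."""
    return ligand_conc / (kd + ligand_conc)
```

At a ligand concentration equal to Kd the target is half occupied, and at ten-fold Kd about 90% occupied, which is why the measurable affinity window of each platform constrains the ligands it can characterize.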

  8. Actors’ Competencies or Methods? A Case Study of Successful Information Systems Development

    DEFF Research Database (Denmark)

    Omland, Hans Olav; Nielsen, Peter Axel

    2009-01-01

    The paper examines the relationship between actors’ competencies and their deployment of methods, arguing that this relationship is described over-simplistically and needs a better explanation. Through a case study of a successful information systems development project we identify some central situations where a variety of competencies and methods are exercised. Emphasising the intertwining of competencies and methods, we discuss the character of the intertwining process, how different actors relate to different methods, and how methods may be part of the problem rather than part of the solution to challenges in information systems development. The paper suggests elements for a new model for explaining actors’ competencies and their use of methods.

  9. Theoretical study of the electronic structure of f-element complexes by quantum chemical methods

    International Nuclear Information System (INIS)

    Vetere, V.

    2002-09-01

    This thesis is related to comparative studies of the chemical properties of molecular complexes containing lanthanide or actinide trivalent cations, in the context of nuclear waste disposal. More precisely, our aim was a quantum chemical analysis of the metal-ligand bonding in such species. Various theoretical approaches were compared for the inclusion of correlation (density functional theory, multiconfigurational methods) and of relativistic effects (relativistic scalar and 2-component Hamiltonians, relativistic pseudopotentials). The performance of these methods was checked by comparing computed structural properties to published experimental data on small model systems: lanthanide and actinide tri-halides and X3M-L species (X = F, Cl; M = La, Nd, U; L = NH3, acetonitrile, CO). We have thus shown the good performance of density functionals combined with a quasi-relativistic method, as well as of gradient-corrected functionals associated with relativistic pseudopotentials. In contrast, functionals including some part of exact exchange are less reliable in reproducing experimental trends, and we have given a possible explanation for this result. Then, a detailed analysis of the bonding has allowed us to interpret the discrepancies observed in the structural properties of uranium and lanthanide complexes, based on a covalent contribution to the bonding in the case of uranium(III) which does not exist in the lanthanide(III) homologues. Finally, we have examined more sizeable systems, closer to experimental species, to analyse the influence of the coordination number, the counter-ions and the oxidation state of uranium on the metal-ligand bonding. (author)

  10. Writing Information Literacy Assessment Plans: A Guide to Best Practice

    Directory of Open Access Journals (Sweden)

    Megan Oakleaf

    2010-03-01

    Full Text Available Academic librarians throughout higher education add value to the teaching and learning missions of their institutions through information literacy instruction. To demonstrate the full impact of librarians on students in higher education, librarians need comprehensive information literacy assessment plans, composed of instructional program-level and outcome-level components, that summarize the purpose of information literacy assessment, emphasize the theoretical basis of their assessment efforts, articulate specific information literacy goals and outcomes, describe the major assessment methods and tools used to capture evidence of student learning, report assessment results, and highlight improvements made as a consequence of learning assessment.

  11. METHODS OF POLYMODAL INFORMATION TRANSMISSION

    Directory of Open Access Journals (Sweden)

    O. O. Basov

    2015-03-01

    Full Text Available Research results on the application of existing information transmission methods in polymodal infocommunication systems are presented. The analysis of existing commutation ways and multiplexing schemes has revealed that modern means of telecommunication are capable of delivering polymodal information with the required quality to the customer terminal. Networks with synchronous static time multiplexing consume substantial capacity resources; however, modality synchronization is easier to achieve within that kind of infrastructure. Networks with statistical time multiplexing demand more sophisticated algorithms to support the guaranteed quality of data block delivery, and, because of the stochastic delays of data blocks, modality synchronization during off-line processing is more difficult to provide. Nowadays there are objective preconditions for realizing data networking that is invariant to the transmission technology applied. This capability is defined by the wide application of optical technologies in the transport infrastructure of polymodal infocommunication systems. When the functioning of the customer terminal and the network can be matched, it becomes possible to organize channels that adaptively select the most effective networking technology according to the current volume allocation and the modality types in the messages.

  12. Theoretical Study of Palladium Membrane Reactor Performance During Propane Dehydrogenation Using CFD Method

    Directory of Open Access Journals (Sweden)

    Kamran Ghasemzadeh

    2017-04-01

    Full Text Available This study presents a 2D-axisymmetric computational fluid dynamic (CFD) model to investigate the performance of a Pd membrane reactor (MR) during the propane dehydrogenation process for hydrogen production. The proposed CFD model provides local information on temperature and component concentrations for the driving-force analysis. After investigation of the mesh independency of the CFD model, the CFD results were validated against other modeling data, and good agreement between the CFD results and theoretical data was achieved. In the present model, a tubular reactor with a length of 150 mm was considered, in which Pt-Sn-K/Al2O3 catalyst filled the reaction zone. The effects of the important operating parameter (reaction temperature) on the performance of the membrane reactor were studied in terms of propane conversion and hydrogen yield. The CFD results showed that the suggested MR system presents higher performance during the propane dehydrogenation reaction than that obtained in a conventional reactor (CR). In particular, it was found that by applying the Pd membrane, propane conversion can be increased from 41% to 49%. Moreover, the highest propane conversion (X = 91%) was reached in the case of a Pd-Ag MR. It was also established that the feed flow rate of the MR is one of the most important factors defining the efficiency of the propane dehydrogenation process.
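    The conversion figures quoted above follow the standard definition X = (F_in − F_out)/F_in based on molar flow rates of propane. A minimal sketch (the flow values are hypothetical, chosen only to reproduce the quoted conversions):

```python
def conversion(f_in, f_out):
    """Fractional conversion of a reactant from its molar flow rate
    entering (f_in) and leaving (f_out) the reactor."""
    return (f_in - f_out) / f_in

# Hypothetical propane flows (mol/s) illustrating the reported conversions:
x_cr = conversion(100.0, 59.0)  # conventional reactor, X = 0.41
x_mr = conversion(100.0, 51.0)  # Pd membrane reactor, X = 0.49
```

The membrane raises conversion by continuously removing hydrogen, which shifts the equilibrium-limited dehydrogenation toward the products.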

  13. A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process

    Science.gov (United States)

    Wang, Yi; Tamai, Tetsuo

    2009-01-01

    Since the complexity of software systems continues to grow, most engineers face two serious problems: the state space explosion problem and the problem of how to debug systems. In this paper, we propose a game-theoretic approach to full branching-time model checking on three-valued semantics. Three-valued models and logics provide a successful abstraction that overcomes the state space explosion problem. Game-style model checking that generates counter-examples can guide refinement or identify validated formulas, which addresses the system debugging problem. Furthermore, the output of our game-style method gives engineers significant information for detecting where errors have occurred and what their causes are.

  14. Development and Validation of a Theoretical Test in Endosonography for Pulmonary Diseases

    DEFF Research Database (Denmark)

    Savran, Mona M; Clementsen, Paul Frost; Annema, Jouke T

    2014-01-01

    BACKGROUND: Theoretical testing provides the necessary foundation to perform technical skills. Additionally, testing improves the retention of knowledge. OBJECTIVES: The aims of this study were to develop a multiple-choice test in endosonography for pulmonary diseases and to gather validity evidence for this test. METHODS: Initially, 78 questions were constructed after informal conversational interviews with 4 international experts in endosonography. The clarity and content validity of the questions were tested using a Delphi-like approach. Construct validity was explored by administering … consistently than the novices (p = 0.037) and the intermediates (p … Validity evidence was gathered, and the test demonstrated content and construct validity.

  15. Comparison of two heuristic evaluation methods for evaluating the usability of health information systems.

    Science.gov (United States)

    Khajouei, Reza; Hajesmaeel Gohari, Sadrieh; Mirzaee, Moghaddameh

    2018-04-01

    In addition to following the usual Heuristic Evaluation (HE) method, the usability of health information systems can also be evaluated using a checklist. The objective of this study is to compare the performance of these two methods in identifying usability problems of health information systems. Eight evaluators independently evaluated different parts of a Medical Records Information System using two methods of HE (usual and with a checklist). The two methods were compared in terms of the number of problems identified, problem type, and the severity of identified problems. In all, 192 usability problems were identified by the two methods in the Medical Records Information System. This was significantly higher than the number of usability problems identified by the checklist and usual method (148 and 92, respectively) (p … information systems. The results demonstrated that the checklist method had significantly better performance in terms of the number of identified usability problems; however, the performance of the usual method for identifying problems of higher severity was significantly better. Although the checklist method can be more efficient for less experienced evaluators, wherever usability is critical, the checklist should be used with caution in usability evaluations. Copyright © 2018 Elsevier Inc. All rights reserved.

  16. Information and crystal structure estimation

    International Nuclear Information System (INIS)

    Wilkins, S.W.; Commonwealth Scientific and Industrial Research Organization, Clayton; Varghese, J.N.; Steenstrup, S.

    1984-01-01

    The conceptual foundations of a general information-theoretic approach to X-ray structure estimation are reexamined with a view to clarifying some of the subtleties inherent in the approach and to enhancing the scope of the method. More particularly, general reasons for choosing the minimum of the Shannon-Kullback measure of information as the criterion for inference are discussed, and it is shown that the minimum information (or maximum entropy) principle enters the present treatment of the structure estimation problem in at least two quite separate ways, and that three formally similar but conceptually quite different expressions for relative information appear at different points in the theory. One of these is the general Shannon-Kullback expression, while the second is a derived form pertaining only under the restrictive assumptions of the present stochastic model for allowed structures, and the third is a measure of the additional information involved in accepting a fluctuation relative to an arbitrary mean structure. (orig.)
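    The Shannon-Kullback measure referred to above is, for discrete distributions p and q, D(p‖q) = Σᵢ pᵢ ln(pᵢ/qᵢ); minimum-information inference selects the p that minimizes this relative to a prior q subject to the data constraints. A minimal numerical sketch (illustrative distributions, not structure-factor data):

```python
import math

def kl_divergence(p, q):
    """Shannon-Kullback relative information D(p||q) in nats for
    discrete distributions given as equal-length sequences."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Uniform distribution relative to a skewed prior:
d = kl_divergence([0.5, 0.5], [0.25, 0.75])  # = 0.5 * ln(4/3)
```

D(p‖q) is zero exactly when p = q and grows as p departs from the prior, which is what makes its minimum a natural inference criterion.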

  17. Evaluation of group theoretical characteristics using the symbolic manipulation language MAPLE

    International Nuclear Information System (INIS)

    Taneri, U.; Paldus, J.

    1994-01-01

    Relying on theoretical developments exploiting quasispin and the pseudo-orthogonal group in the Hubbard model of cyclic polyenes, the general expressions for generating polynomials, providing the dimensional information for relevant irreducible representations, were derived. These generating polynomials result from 1-dimensional formulas through rather tedious algebraic manipulations involving ratios of polynomials with fractional powers. It is shown that these expressions may be efficiently handled using the symbolic manipulation language MAPLE and the dimensional information for arbitrary spin, isospin, and quasimomentum obtained. Exploitation of symbolic computation for other group theoretical problems that are relevant in quantum chemical calculations, and their relationship with Gaussian-polynomial-based combinatorial approaches, is also briefly addressed and various possible applications outlined.

  18. Theoretically informed correlates of hepatitis B knowledge among four Asian groups: the health behavior framework.

    Science.gov (United States)

    Maxwell, Annette E; Stewart, Susan L; Glenn, Beth A; Wong, Weng Kee; Yasui, Yutaka; Chang, L Cindy; Taylor, Victoria M; Nguyen, Tung T; Chen, Moon S; Bastani, Roshan

    2012-01-01

    Few studies have examined theoretically informed constructs related to hepatitis B (HBV) testing, and comparisons across studies are challenging due to lack of uniformity in constructs assessed. The present analysis examined relationships among Health Behavior Framework factors across four Asian American groups to advance the development of theory-based interventions for HBV testing in at-risk populations. Data were collected from 2007-2010 as part of baseline surveys during four intervention trials promoting HBV testing among Vietnamese-, Hmong-, Korean- and Cambodian-Americans (n = 1,735). Health Behavior Framework constructs assessed included: awareness of HBV, knowledge of transmission routes, perceived susceptibility, perceived severity, doctor recommendation, stigma of HBV infection, and perceived efficacy of testing. Within each group we assessed associations between our intermediate outcome of knowledge of HBV transmission and other constructs, to assess the concurrent validity of our model and instruments. While the absolute levels for Health Behavior Framework factors varied across groups, relationships between knowledge and other factors were generally consistent. This suggests similarities rather than differences with respect to posited drivers of HBV-related behavior. Our findings indicate that Health Behavior Framework constructs are applicable to diverse ethnic groups and provide preliminary evidence for the construct validity of the Health Behavior Framework.

  19. History of information science

    OpenAIRE

    Buckland, MK; Liu, Z

    1998-01-01

    This informative volume concentrates on the following areas: Historiography of Information Science; Paul Otlet and His Successors; Techniques, Tools, and Systems; People and Organizations; Theoretical Topics; and Literature.

  20. Information-seeking behavior during residency is associated with quality of theoretical learning, academic career achievements, and evidence-based medical practice: a STROBE-compliant article.

    Science.gov (United States)

    Oussalah, Abderrahim; Fournier, Jean-Paul; Guéant, Jean-Louis; Braun, Marc

    2015-02-01

    Data regarding knowledge acquisition during residency training are sparse. Predictors of theoretical learning quality, academic career achievements and evidence-based medical practice during residency are unknown. We performed a cross-sectional study on residents and attending physicians across several residency programs in 2 French faculties of medicine. We comprehensively evaluated the information-seeking behavior (I-SB) during residency using a standardized questionnaire and looked for independent predictors of theoretical learning quality, academic career achievements, and evidence-based medical practice among I-SB components using multivariate logistic regression analysis. Between February 2013 and May 2013, 338 fellows and attending physicians were included in the study. Textbooks and international medical journals were reported to be used on a regular basis by 24% and 57% of the respondents, respectively. Among the respondents, 47% refer systematically (4.4%) or frequently (42.6%) to published guidelines from scientific societies upon their publication. The median self-reported theoretical learning quality score was 5/10 (interquartile range, 3-6; range, 1-10). A high theoretical learning quality score (upper quartile) was independently and strongly associated with the following I-SB components: systematic reading of clinical guidelines upon their publication (odds ratio [OR], 5.55; 95% confidence interval [CI], 1.77-17.44); having access to a library that offers the leading textbooks of the specialty in the medical department (OR, 2.45, 95% CI, 1.33-4.52); knowledge of the specialty leading textbooks (OR, 2.12; 95% CI, 1.09-4.10); and PubMed search skill score ≥5/10 (OR, 1.94; 95% CI, 1.01-3.73). Research Master (M2) and/or PhD thesis enrolment were independently and strongly associated with the following predictors: PubMed search skill score ≥5/10 (OR, 4.10; 95% CI, 1.46-11.53); knowledge of the leading medical journals of the specialty (OR, 3.33; 95
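    The odds ratios and confidence intervals reported above come from multivariate logistic regression; the underlying arithmetic of an odds ratio with a Wald confidence interval can be sketched for a single 2×2 table (the cell counts below are hypothetical, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a, b = outcome present/absent in the exposed group;
    c, d = outcome present/absent in the unexposed group."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 20/10 with/without the outcome among exposed,
# 10/20 among unexposed.
or_, lo, hi = odds_ratio_ci(20, 10, 10, 20)
```

A confidence interval that excludes 1 (as with the predictors reported above) indicates a statistically significant association; adjusted ORs from the multivariate model additionally control for the other predictors.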

  1. Inference of ICF Implosion Core Mix using Experimental Data and Theoretical Mix Modeling

    International Nuclear Information System (INIS)

    Welser-Sherrill, L.; Haynes, D.A.; Mancini, R.C.; Cooley, J.H.; Tommasini, R.; Golovkin, I.E.; Sherrill, M.E.; Haan, S.W.

    2009-01-01

    The mixing between fuel and shell materials in Inertial Confinement Fusion (ICF) implosion cores is a current topic of interest. The goal of this work was to design direct-drive ICF experiments which have varying levels of mix, and subsequently to extract information on mixing directly from the experimental data using spectroscopic techniques. The experimental design was accomplished using hydrodynamic simulations in conjunction with Haan's saturation model, which was used to predict the mix levels of candidate experimental configurations. These theoretical predictions were then compared to the mixing information which was extracted from the experimental data, and it was found that Haan's mix model performed well in predicting trends in the width of the mix layer. With these results, we have contributed to an assessment of the range of validity and predictive capability of the Haan saturation model, as well as increased our confidence in the methods used to extract mixing information from experimental data.

  2. A Two-Radius Circular Array Method: Extracting Independent Information on Phase Velocities of Love Waves From Microtremor Records From a Simple Seismic Array

    Science.gov (United States)

    Tada, T.; Cho, I.; Shinozaki, Y.

    2005-12-01

    We have invented a Two-Radius (TR) circular array method of microtremor exploration, an algorithm that makes it possible to estimate phase velocities of Love waves by analyzing horizontal-component records of microtremors obtained with an array of seismic sensors placed around circumferences of two different radii. The data recording may be done either simultaneously around the two circles or in two separate sessions with sensors distributed around each circle. Both Rayleigh and Love waves are present in the horizontal components of microtremors, but in the data processing of our TR method, all information on the Rayleigh waves ends up cancelled out, and the information on the Love waves alone is left to be analyzed. Also, unlike the popularly used frequency-wavenumber spectral (F-K) method, our TR method does not resolve individual plane-wave components arriving from different directions and analyze their "vector" phase velocities, but instead directly evaluates their "scalar" phase velocities --- phase velocities that contain no information on the arrival direction of waves --- through a mathematical procedure which involves azimuthal averaging. The latter feature leads us to expect that, with our TR method, it is possible to conduct phase velocity analysis with smaller numbers of sensors, with higher stability, and up to longer-wavelength ranges than with the F-K method. With a view to investigating the capabilities and limitations of our TR method in practical implementation to real data, we have deployed circular seismic arrays of different sizes at a test site in Japan where the underground structure is well documented through geophysical exploration. Ten seismic sensors were placed equidistantly around two circumferences, five around each circle, with varying combinations of radii ranging from several meters to several tens of meters, and simultaneous records of microtremors around circles of two different radii were analyzed with our TR method to produce
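    The azimuthal averaging at the heart of such circular-array methods can be illustrated numerically: for a single plane wave, the average over a ring of sensors of the centre-to-rim correlation cos(kr·cos(θ − φ)) depends only on kr, not on the arrival azimuth φ (it equals the Bessel function J0(kr)). The sketch below is a toy demonstration of that direction independence, not the authors' full Love-wave estimator:

```python
import numpy as np

def azimuthal_average(kr, arrival_azimuth, n_sensors=72):
    """Average, over sensors equally spaced on a circle of radius r, of
    cos(kr * cos(theta - phi)): the correlation between the centre and
    the rim for a plane wave of wavenumber k arriving from azimuth phi."""
    theta = 2 * np.pi * np.arange(n_sensors) / n_sensors
    return np.mean(np.cos(kr * np.cos(theta - arrival_azimuth)))
```

Because the averaged quantity depends only on kr = ωr/c, comparing the averages measured at two different radii constrains the scalar phase velocity c without any knowledge of the waves' arrival directions.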

  3. Pathways from Trauma to Psychotic Experiences: A Theoretically Informed Model of Posttraumatic Stress in Psychosis

    Directory of Open Access Journals (Sweden)

    Amy Hardy

    2017-05-01

    Full Text Available In recent years, empirical data and theoretical accounts relating to the relationship between childhood victimization and psychotic experiences have accumulated. Much of this work has focused on co-occurring Posttraumatic Stress Disorder or putative causal mechanisms in isolation from each other. The complexity of posttraumatic stress reactions experienced in psychosis remains poorly understood. This paper therefore attempts to synthesize the current evidence base into a theoretically informed, multifactorial model of posttraumatic stress in psychosis. Three trauma-related vulnerability factors are proposed to give rise to intrusions and to affect how people appraise and cope with them. First, understandable attempts to survive trauma become habitual ways of regulating emotion, manifesting in cognitive-affective, behavioral and interpersonal responses. Second, event memories, consisting of perceptual and episodic representations, are impacted by emotion experienced during trauma. Third, personal semantic memory, specifically appraisals of the self and others, are shaped by event memories. It is proposed these vulnerability factors have the potential to lead to two types of intrusions. The first type is anomalous experiences arising from emotion regulation and/or the generation of novel images derived from trauma memory. The second type is trauma memory intrusions reflecting, to varying degrees, the retrieval of perceptual, episodic and personal semantic representations. It is speculated trauma memory intrusions may be experienced on a continuum from contextualized to fragmented, depending on memory encoding and retrieval. Personal semantic memory will then impact on how intrusions are appraised, with habitual emotion regulation strategies influencing people’s coping responses to these. Three vignettes are outlined to illustrate how the model accounts for different pathways between victimization and psychosis, and implications for therapy are

  4. High-throughput theoretical design of lithium battery materials

    International Nuclear Information System (INIS)

    Ling Shi-Gang; Gao Jian; Xiao Rui-Juan; Chen Li-Quan

    2016-01-01

    The rapid evolution of high-throughput theoretical design schemes to discover new lithium battery materials is reviewed, including high-capacity cathodes, low-strain cathodes, anodes, solid state electrolytes, and electrolyte additives. With the development of efficient theoretical methods and inexpensive computers, high-throughput theoretical calculations have played an increasingly important role in the discovery of new materials. With the help of an automatic simulation flow, many types of materials can be screened, optimized and designed from a structural database according to specific search criteria. In advanced cell technology, new materials for next-generation lithium batteries are of great significance for achieving better performance; representative search criteria include higher energy density, better safety, and faster charge/discharge. (topical review)
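
    The screening workflow described above — filtering a structural database against specific search criteria — can be sketched as follows. The candidate entries, property names, and thresholds are illustrative assumptions, not data from the reviewed studies.

    ```python
    # Hypothetical sketch of one high-throughput screening step: filter candidate
    # structures by simple search criteria (all values below are illustrative).

    candidates = [
        {"formula": "LiFePO4",   "avg_voltage_V": 3.4, "capacity_mAh_g": 170, "volume_change_pct": 6.8},
        {"formula": "LiCoO2",    "avg_voltage_V": 3.9, "capacity_mAh_g": 140, "volume_change_pct": 2.6},
        {"formula": "Li4Ti5O12", "avg_voltage_V": 1.5, "capacity_mAh_g": 175, "volume_change_pct": 0.2},
    ]

    def screen(db, min_voltage, min_capacity, max_volume_change):
        """Return candidates meeting all criteria (e.g. a low-strain cathode search)."""
        return [m for m in db
                if m["avg_voltage_V"] >= min_voltage
                and m["capacity_mAh_g"] >= min_capacity
                and m["volume_change_pct"] <= max_volume_change]

    hits = screen(candidates, min_voltage=3.0, min_capacity=100, max_volume_change=5.0)
    print([m["formula"] for m in hits])  # LiCoO2 passes; LiFePO4 fails the strain cut
    ```

    Real screening pipelines run such filters over tens of thousands of database entries, with each property supplied by an automated first-principles calculation rather than a literal.
    
    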

  5. Effectiveness of Visual Methods in Information Procedures for Stem Cell Recipients and Donors

    Directory of Open Access Journals (Sweden)

    Çağla Sarıtürk

    2017-12-01

    Full Text Available Objective: Obtaining informed consent from hematopoietic stem cell recipients and donors is a critical step in the transplantation process. Anxiety may affect their understanding of the provided information. However, use of audiovisual methods may facilitate understanding. In this prospective randomized study, we investigated the effectiveness of using an audiovisual method of providing information to patients and donors in combination with the standard model. Materials and Methods: A 10-min informational animation was prepared for this purpose. In total, 82 participants were randomly assigned to two groups: group 1 received the additional audiovisual information and group 2 received standard information. A 20-item questionnaire was administered to participants at the end of the informational session. Results: A reliability test and factor analysis showed that the questionnaire was reliable and valid. For all participants, the mean overall satisfaction score was 184.8±19.8 (maximum possible score: 200). However, for satisfaction with information about written informed consent, group 1 scored significantly higher than group 2 (p=0.039). Satisfaction level was not affected by age, education level, or differences between the physicians conducting the informational session. Conclusion: This study shows that using audiovisual tools may contribute to a better understanding of the informed consent procedure and potential risks of stem cell transplantation.
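
    The "reliability test" reported above is commonly Cronbach's alpha; a minimal sketch of that statistic follows. The questionnaire data shown are hypothetical, not the study's.

    ```python
    # Cronbach's alpha: internal-consistency reliability of a k-item questionnaire.
    # alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)

    def cronbach_alpha(items):
        """items: list of k lists, each holding one item's scores across respondents."""
        k, n = len(items), len(items[0])

        def var(xs):  # sample variance
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

        total_scores = [sum(item[j] for item in items) for j in range(n)]
        return k / (k - 1) * (1 - sum(var(item) for item in items) / var(total_scores))

    # Hypothetical 3-item, 5-respondent example
    alpha = cronbach_alpha([[4, 5, 4, 3, 5], [4, 4, 5, 3, 5], [5, 5, 4, 2, 5]])
    ```

    Values of alpha near 1 indicate that the items measure a common construct; perfectly correlated items yield exactly 1.
    
    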

  6. Online adaptive approach for a game-theoretic strategy for complete vehicle energy management

    NARCIS (Netherlands)

    Chen, H.; Kessels, J.T.B.A.; Weiland, S.

    2015-01-01

    This paper introduces an adaptive approach for a game-theoretic strategy on Complete Vehicle Energy Management. The proposed method enhances the game-theoretic approach such that the strategy is able to adapt to real driving behavior. The classical game-theoretic approach relies on one probability

  7. Applying Multiple Methods to Comprehensively Evaluate a Patient Portal’s Effectiveness to Convey Information to Patients

    Science.gov (United States)

    Krist, Alex H; Aycock, Rebecca A; Kreps, Gary L

    2016-01-01

    Background Patient portals have yet to achieve their full potential for enhancing health communication and improving health outcomes. Although the Patient Protection and Affordable Care Act in the United States mandates the utilization of patient portals, and usage continues to rise, their impact has not been as profound as anticipated. Objective The objective of our case study was to evaluate how well portals convey information to patients. To demonstrate how multiple methodologies could be used to evaluate and improve the design of patient-centered portals, we conducted an in-depth evaluation of an exemplar patient-centered portal designed to promote preventive care to consumers. Methods We used 31 critical incident patient interviews, 2 clinician focus groups, and a thematic content analysis to understand patients’ and clinicians’ perspectives, as well as theoretical understandings of the portal’s use. Results We gathered over 140 critical incidents, 71.8% (102/142) negative and 28.2% (40/142) positive. Positive incident categories were (1) instant medical information access, (2) clear health information, and (3) patient vigilance. Negative incident categories were (1) standardized content, (2) desire for direct communication, (3) website functionality, and (4) difficulty interpreting laboratory data. Thematic analysis of the portal’s immediacy resulted in high scores in the attributes enhances understanding (18/23, 78%), personalization (18/24, 75%), and motivates behavior (17/24, 71%), but low levels of interactivity (7/24, 29%) and engagement (2/24, 8%). Two overarching themes emerged to guide portal refinements: (1) communication can be improved with directness and interactivity and (2) perceived personalization must be greater to engage patients. Conclusions Results suggest that simple modifications, such as increased interactivity and personalized messages, can make portals customized, robust, easily accessible, and trusted information sources.

  8. Experimental and theoretical analysis of the rate of solvent equilibration in the hanging drop method of protein crystal growth

    Science.gov (United States)

    Fowlis, William W.; Delucas, Lawrence J.; Twigg, Pamela J.; Howard, Sandra B.; Meehan, Edward J.

    1988-01-01

    The principles of the hanging-drop method of crystal growth are discussed, and the rate of water evaporation in a water droplet (containing protein, buffer, and a precipitating agent) suspended above a well containing a double concentration of precipitating agent is investigated theoretically. It is shown that, on earth, the rate of evaporation may be determined from diffusion theory and the colligative properties of solutions. The parameters affecting the rate of evaporation include the temperature, the vapor pressure of water, the ionization constant of the salt, the volume of the drop, the contact angle between the droplet and the coverslip, the number of moles of salt in the droplet, the number of moles of water and salt in the well, the molar volumes of water and salt, the distance from the droplet to the well, and the coefficient of diffusion of water vapor through air. To test the theoretical equations, hanging-drop experiments were conducted using various reagent concentrations in 25-microliter droplets and measuring the evaporation times at 4 C and 25 C. The results showed good agreement with the theory.
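
    Under the assumptions named above (steady-state vapor diffusion across the drop-to-well gap, ideal-gas vapor, and Raoult's law relating water activity to salt content), the equilibration rate can be sketched numerically. The geometry, mole fractions, and diffusion coefficient below are illustrative placeholders, not the paper's parameter set.

    ```python
    # Back-of-the-envelope sketch: steady-state diffusive water-vapor flux from the
    # hanging drop to the reservoir, J ~ D * A * (c_drop - c_well) / L, with each
    # vapor concentration lowered by dissolved salt according to Raoult's law.
    # All numbers below (geometry, mole fractions, D) are illustrative.

    R = 8.314  # gas constant, J/(mol*K)

    def vapor_concentration(p_sat_pa, temp_k, mole_frac_water):
        """Equilibrium water-vapor concentration above a solution (mol/m^3)."""
        return mole_frac_water * p_sat_pa / (R * temp_k)  # Raoult's law + ideal gas

    def evaporation_rate(p_sat_pa, temp_k, x_drop, x_well, area_m2, gap_m,
                         d_m2s=2.5e-5):
        """Moles of water transported per second from drop to well."""
        c_drop = vapor_concentration(p_sat_pa, temp_k, x_drop)
        c_well = vapor_concentration(p_sat_pa, temp_k, x_well)
        return d_m2s * area_m2 * (c_drop - c_well) / gap_m

    # At 25 C (p_sat ~ 3169 Pa), the well's doubled precipitant concentration gives
    # it a lower water mole fraction, so water diffuses from drop to well.
    rate = evaporation_rate(3169.0, 298.15, x_drop=0.995, x_well=0.990,
                            area_m2=5e-6, gap_m=5e-3)
    ```

    As the drop loses water, its salt concentration rises, x_drop falls toward x_well, and the flux decays toward zero — the equilibration behavior the theoretical analysis describes.
    
    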

  9. Hybrid methods to represent incomplete and uncertain information

    Energy Technology Data Exchange (ETDEWEB)

    Joslyn, C. [NASA Goddard Space Flight Center, Greenbelt, MD (United States)

    1996-12-31

    Decision making is cast in the semiotic context of perception, decision, and action loops. Towards the goal of properly grounding hybrid representations of information and uncertainty from this semiotic perspective, we consider the roles of and relations among the mathematical components of General Information Theory (GIT), particularly among fuzzy sets, possibility theory, probability theory, and random sets. We do so by using a clear distinction between the syntactic, mathematical formalism and the semantic domains of application of each of these fields, placing the emphasis on available measurement and action methods appropriate for each formalism, to which and from which the decision-making process flows.

  10. On the Adaptation of an Agile Information Systems Development Method

    NARCIS (Netherlands)

    Aydin, M.N.; Harmsen, F.; van Slooten, C.; Stegwee, R.A.

    2005-01-01

    Little specific research has been conducted to date on the adaptation of agile information systems development (ISD) methods. This article presents the work practice in dealing with the adaptation of such a method in the ISD department of one of the leading financial institutes in Europe. Two forms

  11. Information-theoretical approach to control of quantum-mechanical systems

    International Nuclear Information System (INIS)

    Kawabata, Shiro

    2003-01-01

    Fundamental limits on the controllability of quantum-mechanical systems are discussed in the light of quantum information theory. It is shown that the amount of entropy reduction that can be extracted from a quantum system by a feedback controller is upper-bounded by the sum of the entropy decrease achievable in open-loop control and the mutual information between the quantum system and the controller. This upper bound sets a fundamental limit on the performance of any quantum controller whose design is based on the possibility of attaining low-entropy states. An application of this approach to quantum error correction is also discussed.
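
    In hedged notation (the symbols here are assumed, not taken from the paper: ΔS denotes achievable entropy reduction, and I the quantum mutual information between system Q and controller C), the stated bound reads:

    ```latex
    \Delta S_{\text{feedback}} \;\le\; \Delta S_{\text{open-loop}} \;+\; I(Q:C)
    ```

    That is, feedback can outperform open-loop control by at most the amount of information the controller acquires about the system.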

  12. Theoretically Guided Analytical Method Development and Validation for the Estimation of Rifampicin in a Mixture of Isoniazid and Pyrazinamide by UV Spectrophotometer.

    Science.gov (United States)

    Khan, Mohammad F; Rita, Shamima A; Kayser, Md Shahidulla; Islam, Md Shariful; Asad, Sharmeen; Bin Rashid, Ridwan; Bari, Md Abdul; Rahman, Muhammed M; Al Aman, D A Anwar; Setu, Nurul I; Banoo, Rebecca; Rashid, Mohammad A

    2017-01-01

    A simple, rapid, economic, accurate, and precise method for the estimation of rifampicin in a mixture of isoniazid and pyrazinamide by UV spectrophotometric technique (guided by the theoretical investigation of physicochemical properties) was developed and validated. Theoretical investigations revealed that isoniazid and pyrazinamide were both freely soluble in water and slightly soluble in ethyl acetate, whereas rifampicin was practically insoluble in water but freely soluble in ethyl acetate. This indicates that ethyl acetate is an effective solvent for the extraction of rifampicin from an aqueous mixture of isoniazid and pyrazinamide. Computational study indicated that a pH range of 6.0-8.0 would favor the extraction of rifampicin. Rifampicin was separated from isoniazid and pyrazinamide at pH 7.4 ± 0.1 by extracting with ethyl acetate. The ethyl acetate extract was then analyzed at a λmax of 344.0 nm. The developed method was validated for linearity, accuracy and precision according to ICH guidelines. The proposed method exhibited good linearity over the concentration range of 2.5-35.0 μg/mL. The intra-day and inter-day precision in terms of % RSD ranged from 1.09 to 1.70% and 1.63 to 2.99%, respectively. The accuracy (in terms of recovery) of the method varied from 96.7 ± 0.9 to 101.1 ± 0.4%. The LOD and LOQ were found to be 0.83 and 2.52 μg/mL, respectively. In addition, the developed method was successfully applied to determine rifampicin in combination brands (with isoniazid and pyrazinamide) available in Bangladesh.
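
    LOD and LOQ figures of this kind conventionally follow the ICH Q2(R1) formulas LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the calibration line and S its slope. A minimal sketch (the calibration data below are illustrative, not the paper's) is:

    ```python
    # ICH-style LOD/LOQ from a least-squares calibration line:
    # LOD = 3.3*sigma/S, LOQ = 10*sigma/S.

    def fit_line(x, y):
        """Ordinary least-squares slope and intercept."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxx = sum((xi - mx) ** 2 for xi in x)
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        slope = sxy / sxx
        return slope, my - slope * mx

    def lod_loq(x, y):
        """Detection and quantitation limits in the units of x."""
        slope, intercept = fit_line(x, y)
        residuals = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
        sigma = (sum(r ** 2 for r in residuals) / (len(x) - 2)) ** 0.5
        return 3.3 * sigma / slope, 10 * sigma / slope

    x = [2.5, 5.0, 10.0, 20.0, 35.0]    # concentrations, ug/mL (illustrative)
    y = [0.05, 0.11, 0.20, 0.41, 0.70]  # absorbances at 344 nm (illustrative)
    lod, loq = lod_loq(x, y)
    ```

    By construction LOQ/LOD = 10/3.3, so the reported pair 0.83 and 2.52 μg/mL is consistent with this convention.
    
    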

  13. Theoretical framework of community education improvement

    Directory of Open Access Journals (Sweden)

    Zaúl Brizuela Castillo

    2015-05-01

    Full Text Available The paper explains the connection between the approach selected for the analysis and development of community education and the contradictions manifested in its theoretical and practical comprehension. As a result, a comprehensive model for community education is devised, describing the theoretical and methodological framework for its improvement. The framework is based on the conscious organization of educative influences in the everyday life of the community, under the coordinated action of the social institutions and organizations that promote its transformation, with the neighborhood assuming a protagonist role in improving quality of life and morals in the context of the updating of the socialist model. The comprehensive model was tested experimentally in District 59 of San Miguel; the transformation of the community was scientifically registered, together with information gathered by means of observation and interviews. The findings proved the pertinence and feasibility of the proposed model.

  14. Automatic spike sorting using tuning information.

    Science.gov (United States)

    Ventura, Valérie

    2009-09-01

    Current spike sorting methods focus on clustering neurons' characteristic spike waveforms. The resulting spike-sorted data are typically used to estimate how covariates of interest modulate the firing rates of neurons. However, when these covariates do modulate the firing rates, they provide information about spikes' identities, which thus far have been ignored for the purpose of spike sorting. This letter describes a novel approach to spike sorting, which incorporates both waveform information and tuning information obtained from the modulation of firing rates. Because it efficiently uses all the available information, this spike sorter yields lower spike misclassification rates than traditional automatic spike sorters. This theoretical result is verified empirically on several examples. The proposed method does not require additional assumptions; only its implementation is different. It essentially consists of performing spike sorting and tuning estimation simultaneously rather than sequentially, as is currently done. We used an expectation-maximization maximum likelihood algorithm to implement the new spike sorter. We present the general form of this algorithm and provide a detailed implementable version under the assumptions that neurons are independent and spike according to Poisson processes. Finally, we uncover a systematic flaw of spike sorting based on waveform information only.
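
    The core idea — that tuning-modulated firing rates carry evidence about a spike's identity — can be illustrated with a toy two-neuron model (all parameters hypothetical): the posterior over neurons multiplies a waveform likelihood by the Poisson rate each neuron's tuning curve implies at the current covariate value.

    ```python
    import math

    # Toy sketch: classify one spike using a Gaussian waveform feature per neuron
    # plus tuning information (a firing rate that depends on a covariate, e.g.
    # movement direction). All templates and tuning curves are illustrative.

    neurons = {
        "A": {"wf_mean": 1.0, "wf_sd": 0.5, "rate": lambda c: 5.0 + 10.0 * c},
        "B": {"wf_mean": 2.0, "wf_sd": 0.5, "rate": lambda c: 15.0 - 10.0 * c},
    }

    def posterior(feature, covariate):
        """Posterior probability of each neuron having fired the observed spike."""
        scores = {}
        for name, n in neurons.items():
            wf = math.exp(-0.5 * ((feature - n["wf_mean"]) / n["wf_sd"]) ** 2) / n["wf_sd"]
            scores[name] = wf * n["rate"](covariate)  # waveform evidence x tuning evidence
        z = sum(scores.values())
        return {k: v / z for k, v in scores.items()}
    ```

    With the waveform feature exactly between the two templates (e.g. 1.5), a waveform-only sorter is at chance, but tuning information breaks the tie: at covariate 1.0, neuron A fires three times faster than B and dominates the posterior. The letter's EM algorithm fits the templates and tuning curves jointly rather than taking them as given.
    
    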

  15. Theoretical studies of densiometric methods using γ-radiation

    International Nuclear Information System (INIS)

    Luebbesmeyer, D.; Wesser, U.

    1975-10-01

    Some conclusions could be drawn from the calculations performed for the practical measuring method to be applied: 1) The incident method for the density measurement of an inhomogeneous two-phase flow is subject to large errors. 2) Should one, due to limited expense, use only two detectors for the measuring chains, then the scattered-beam method is more advantageous than the two-beam method. 3) If three detectors can be used, a greater accuracy can be expected than with the scattered-beam method. 4) The accuracy of all methods increases if a certain homogeneity of part of the flow can be assumed. 5) The most favourable energy region differs between scattered-beam and multi-beam processes. Whereas the scattered-beam method is used to best effect at energies of about 60 keV, owing to the larger scattering cross sections at low radiation energies, the energies for multi-beam methods should exceed 100 keV. 6) If calibration problems are to be kept small, then the multi-beam method is preferable to the scattered-beam method. A good compromise between apparative expenditure and the accuracy to be obtained is the three-beam method with, e.g., 137Cs as a source. (orig./LH)

  16. Adjusting Estimates of the Expected Value of Information for Implementation: Theoretical Framework and Practical Application.

    Science.gov (United States)

    Andronis, Lazaros; Barton, Pelham M

    2016-04-01

    Value of information (VoI) calculations give the expected benefits of decision making under perfect information (EVPI) or sample information (EVSI), typically on the premise that any treatment recommendations made in light of this information will be implemented instantly and fully. This assumption is unlikely to hold in health care; evidence shows that obtaining further information typically leads to "improved" rather than "perfect" implementation. The objective of this work was to present a method of calculating the expected value of further research that accounts for the reality of improved implementation. This work extends an existing conceptual framework by introducing additional states of the world regarding information (sample information, in addition to current and perfect information) and implementation (improved implementation, in addition to current and optimal implementation). The extension allows calculating the "implementation-adjusted" EVSI (IA-EVSI), a measure that accounts for different degrees of implementation. Calculations of implementation-adjusted estimates are illustrated under different scenarios through a stylized case study in non-small cell lung cancer. In the particular case study, the population values for EVSI and IA-EVSI were £25 million and £8 million, respectively; thus, a decision assuming perfect implementation would have overestimated the expected value of research by about £17 million. IA-EVSI was driven by the assumed time horizon and, importantly, the specified rate of change in implementation: the higher the rate, the greater the IA-EVSI and the lower the difference between IA-EVSI and EVSI. Traditionally calculated measures of population VoI rely on unrealistic assumptions about implementation. This article provides a simple framework that accounts for improved, rather than perfect, implementation and offers more realistic estimates of the expected value of research. © The Author(s) 2015.
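
    A hedged numerical sketch of the implementation adjustment (the functional form, uptake rate, and all numbers are illustrative assumptions, not the paper's model): the per-period population value of sample information is scaled by the fraction of patients actually receiving the recommended treatment, which improves toward full uptake at some rate.

    ```python
    # Implementation-adjusted value of sample information: each year's value is
    # weighted by the current implementation level p, which improves toward 1
    # at rate r per year. All parameters below are illustrative.

    def ia_evsi(per_person_evsi, incidence, horizon_years, p0, r):
        """Population value of sample information under improving implementation."""
        total = 0.0
        p = p0  # fraction of patients treated in line with the recommendation
        for _ in range(horizon_years):
            total += per_person_evsi * incidence * p
            p = p + r * (1.0 - p)  # uptake improves but never exceeds 1
        return total

    full = ia_evsi(50.0, 10_000, 10, p0=1.0, r=0.0)  # perfect-implementation EVSI
    adj = ia_evsi(50.0, 10_000, 10, p0=0.2, r=0.3)   # improved implementation
    ```

    As in the case study, the adjusted value is strictly below the perfect-implementation value, and a higher rate of change r narrows the gap.
    
    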

  17. THEORETICAL PRESUPPOSITIONS OF EDUCATION: SOME HISTORICAL REFLECTIONS

    Directory of Open Access Journals (Sweden)

    Rodrigo Regert

    2017-09-01

    Full Text Available Education has always been a much discussed theme, and continues to be so. Based on this idea, the goal of this article is to discuss the theoretical presuppositions of education, beginning with the idea of human intellectual development itself and passing through the Ancient, Medieval, Modern and Contemporary Ages. It is important to point out that the article does not intend to cover the whole of this theme, nor even all of the theoretical presuppositions, which would be impossible; rather, it intends to begin, or at least continue, this discussion. To this end, the research used the descriptive method, with bibliographic technical procedures. We conclude that it is important to discuss the theoretical presuppositions of education throughout history since, without this, it is not even possible to understand current education.

  18. Information-Theoretic Limits on Broadband Multi-Antenna Systems in the Presence of Mutual Coupling

    Science.gov (United States)

    Taluja, Pawandeep Singh

    2011-12-01

    Multiple-input, multiple-output (MIMO) systems have received considerable attention over the last decade due to their ability to provide high throughputs and mitigate multipath fading effects. While most of these benefits are obtained for ideal arrays with large separation between the antennas, practical devices are often constrained in physical dimensions. With smaller inter-element spacings, signal correlation and mutual coupling between the antennas start to degrade the system performance, thereby limiting the deployment of a large number of antennas. Various studies have proposed transceiver designs based on optimal matching networks to compensate for this loss. However, such networks are considered impractical due to their multiport structure and sensitivity to the RF bandwidth of the system. In this dissertation, we investigate two aspects of compact transceiver design. First, we consider simpler architectures that exploit coupling between the antennas, and second, we establish information-theoretic limits of broadband communication systems with closely-spaced antennas. We begin with a receiver model of a diversity antenna selection system and propose novel strategies that make use of inactive elements by virtue of mutual coupling. We then examine the limits on the matching efficiency of a single antenna system using broadband matching theory. Next, we present an extension to this theory for coupled MIMO systems to elucidate the impact of coupling on the RF bandwidth of the system, and derive optimal transceiver designs. Lastly, we summarize the main findings of this dissertation and suggest open problems for future work.

  19. The application of information theory for the research of aging and aging-related diseases.

    Science.gov (United States)

    Blokh, David; Stambler, Ilia

    2017-10-01

    This article reviews the application of information-theoretical analysis, employing measures of entropy and mutual information, for the study of aging and aging-related diseases. The research of aging and aging-related diseases is particularly suitable for the application of information theory methods, as aging processes and related diseases are multi-parametric, with continuous parameters coexisting alongside discrete parameters, and with the relations between the parameters being as a rule non-linear. Information theory provides unique analytical capabilities for the solution of such problems, with unique advantages over common linear biostatistics. Among the age-related diseases, information theory has been used in the study of neurodegenerative diseases (particularly using EEG time series for diagnosis and prediction), cancer (particularly for establishing individual and combined cancer biomarkers), diabetes (mainly utilizing mutual information to characterize the diseased and aging states), and heart disease (mainly for the analysis of heart rate variability). Few works have employed information theory for the analysis of general aging processes and frailty, as underlying determinants and possible early preclinical diagnostic measures for aging-related diseases. Generally, the use of information-theoretical analysis permits not only establishing the (non-linear) correlations between diagnostic or therapeutic parameters of interest, but may also provide a theoretical insight into the nature of aging and related diseases by establishing the measures of variability, adaptation, regulation or homeostasis, within a system of interest. It may be hoped that the increased use of such measures in research may considerably increase diagnostic and therapeutic capabilities and the fundamental theoretical mathematical understanding of aging and disease. Copyright © 2016 Elsevier Ltd. All rights reserved.
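
    The two measures the review centers on — Shannon entropy and mutual information — can be estimated directly from paired discrete observations (for instance, a discretized biomarker against a diagnostic category). A minimal sketch with illustrative data:

    ```python
    import math
    from collections import Counter

    # Shannon entropy and mutual information from paired discrete observations.
    # Unlike a linear correlation coefficient, I(X;Y) captures non-linear
    # dependence as well. Data below are illustrative.

    def entropy(xs):
        """H(X) in bits, from the empirical distribution of xs."""
        n = len(xs)
        return -sum((c / n) * math.log2(c / n) for c in Counter(xs).values())

    def mutual_information(xs, ys):
        """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
        return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

    x = [0, 0, 1, 1, 0, 1, 0, 1]       # e.g. discretized biomarker
    y = [0, 0, 1, 1, 0, 1, 0, 1]       # e.g. disease status, perfectly dependent
    print(mutual_information(x, y))    # → 1.0 bit
    ```

    For independent variables the estimate falls to zero; in practice, plug-in estimates from small samples are biased upward and need correction.
    
    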

  20. Research on the method of measuring space information network capacity in communication service

    Directory of Open Access Journals (Sweden)

    Zhu Shichao

    2017-02-01

    Full Text Available Because of the large spatial and temporal scale of the space information network and its increasing complexity, existing methods for measuring information transmission capacity have been unable to measure existing and future space information networks effectively. In this study, we first established a complex model of the space information network, and measured the capacity of the whole network by analyzing both the data access capability to the network and the data transmission capability within it. Finally, we verified the rationality of the proposed measuring method through collaborative simulation using STK and Matlab.