WorldWideScience

Sample records for information-theoretic distribution test

  1. Goodness-of-fit tests for the Gompertz distribution

    DEFF Research Database (Denmark)

    Lenart, Adam; Missov, Trifon

    The Gompertz distribution is often fitted to lifespan data; however, testing whether the fit satisfies theoretical criteria has been neglected. Here five goodness-of-fit measures are discussed: the Anderson-Darling statistic, the Kullback-Leibler discrimination information, the correlation coefficient test, testing … for the mean of the sample hazard, and a nested test against the generalized extreme value distributions. Along with an application to laboratory rat data, critical values calculated by the empirical distribution of the test statistics are also presented.

  2. Information theoretic quantification of diagnostic uncertainty.

    Science.gov (United States)

    Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

    2012-01-01

    Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
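
    As a concrete illustration of the two frameworks compared in this record, the sketch below applies Bayes' rule to a hypothetical dichotomous test and quantifies the residual diagnostic uncertainty with the binary Shannon entropy before and after the result; the sensitivity, specificity and pre-test probability are invented values, not taken from the paper.

      import math

      def post_test_probability(pretest, sensitivity, specificity, positive=True):
          """Bayes' rule for a dichotomous test result."""
          if positive:
              num = sensitivity * pretest
              den = num + (1 - specificity) * (1 - pretest)
          else:
              num = (1 - sensitivity) * pretest
              den = num + specificity * (1 - pretest)
          return num / den

      def binary_entropy(p):
          """Residual diagnostic uncertainty (bits) for a disease probability p."""
          if p in (0.0, 1.0):
              return 0.0
          return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

      pretest = 0.20  # assumed pre-test probability
      post = post_test_probability(pretest, sensitivity=0.90, specificity=0.80)
      print(f"post-test probability after a positive result: {post:.3f}")
      print(f"uncertainty before: {binary_entropy(pretest):.3f} bits, after: {binary_entropy(post):.3f} bits")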

  3. Information-theoretic security proof for quantum-key-distribution protocols

    International Nuclear Information System (INIS)

    Renner, Renato; Gisin, Nicolas; Kraus, Barbara

    2005-01-01

    We present a technique for proving the security of quantum-key-distribution (QKD) protocols. It is based on direct information-theoretic arguments and thus also applies if no equivalent entanglement purification scheme can be found. Using this technique, we investigate a general class of QKD protocols with one-way classical post-processing. We show that, in order to analyze the full security of these protocols, it suffices to consider collective attacks. Indeed, we give new lower and upper bounds on the secret-key rate which only involve entropies of two-qubit density operators and which are thus easy to compute. As an illustration of our results, we analyze the Bennett-Brassard 1984, the six-state, and the Bennett 1992 protocols with one-way error correction and privacy amplification. Surprisingly, the performance of these protocols is increased if one of the parties adds noise to the measurement data before the error correction. In particular, this additional noise makes the protocols more robust against noise in the quantum channel
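
    For orientation (a standard baseline, not the improved bounds derived in this record), the asymptotic one-way key rate for BB84 with error correction and privacy amplification can be written as

      \[
        r \;\ge\; 1 - 2\,h(Q), \qquad h(Q) = -Q\log_2 Q - (1-Q)\log_2(1-Q),
      \]

    where Q is the quantum bit error rate; the rate stays positive for Q up to roughly 11%. The bounds in this paper tighten this picture, for instance by letting one party deliberately add noise before error correction.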

  4. Information-theoretic security proof for quantum-key-distribution protocols

    Science.gov (United States)

    Renner, Renato; Gisin, Nicolas; Kraus, Barbara

    2005-07-01

    We present a technique for proving the security of quantum-key-distribution (QKD) protocols. It is based on direct information-theoretic arguments and thus also applies if no equivalent entanglement purification scheme can be found. Using this technique, we investigate a general class of QKD protocols with one-way classical post-processing. We show that, in order to analyze the full security of these protocols, it suffices to consider collective attacks. Indeed, we give new lower and upper bounds on the secret-key rate which only involve entropies of two-qubit density operators and which are thus easy to compute. As an illustration of our results, we analyze the Bennett-Brassard 1984, the six-state, and the Bennett 1992 protocols with one-way error correction and privacy amplification. Surprisingly, the performance of these protocols is increased if one of the parties adds noise to the measurement data before the error correction. In particular, this additional noise makes the protocols more robust against noise in the quantum channel.

  5. On precipitation monitoring with theoretical statistical distributions

    Science.gov (United States)

    Cindrić, Ksenija; Juras, Josip; Pasarić, Zoran

    2018-04-01

    A common practice in meteorological drought monitoring is to transform the observed precipitation amounts to the standardised precipitation index (SPI). Though the gamma distribution is usually employed for this purpose, some other distribution may be used, particularly in regions where zero precipitation amounts are recorded frequently. In this study, two distributions are considered alongside the gamma distribution: the compound Poisson exponential distribution (CPE) and the square root normal distribution (SRN). They are fitted to monthly precipitation amounts measured at 24 stations in Croatia over the 55-year period (1961-2015). At five stations, long-term series (1901-2015) are available and have been used for a more detailed investigation. How well the theoretical distributions match the empirical ones is tested by comparing the corresponding empirical and theoretical ratios of the skewness and the coefficient of variation. Furthermore, following the common approach to precipitation monitoring (CLIMAT reports), the comparison of the empirical and theoretical quintiles in the two periods (1961-1990 and 1991-2015) is examined. The results from the present study reveal that it would be more appropriate to implement theoretical distributions in such climate reports, since they provide a better evaluation for monitoring purposes than the currently used empirical distribution. Nevertheless, deciding on an optimal theoretical distribution for different climate regimes and for different time periods is not easy to accomplish. With regard to Croatian stations (covering different climate regimes), the CPE or SRN distribution could also be the right choice in climatological practice, in addition to the gamma distribution.
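
    A minimal sketch of the SPI transformation described in this record, using a gamma fit with a point mass at zero for dry months; the SciPy calls are standard, but the input series is synthetic rather than the Croatian station data.

      import numpy as np
      from scipy import stats

      def spi(monthly_precip):
          """Standardised precipitation index via a gamma fit plus a zero point mass."""
          x = np.asarray(monthly_precip, dtype=float)
          q = np.mean(x == 0.0)                              # probability of a dry month
          shape, loc, scale = stats.gamma.fit(x[x > 0.0], floc=0)
          cdf = q + (1.0 - q) * stats.gamma.cdf(x, shape, loc=0, scale=scale)
          cdf = np.clip(cdf, 1e-6, 1 - 1e-6)                 # keep the normal quantile finite
          return stats.norm.ppf(cdf)                         # equiprobability transform to N(0, 1)

      rng = np.random.default_rng(0)
      sample = np.concatenate([rng.gamma(2.0, 30.0, size=55), np.zeros(5)])  # synthetic monthly totals (mm)
      print(np.round(spi(sample)[:5], 2))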

  6. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing various disciplines frequently produces something profound and far-reaching; cybernetics is an often-quoted example. The mix of information theory, statistics and computing technology has proved very useful, leading to the recent development of information-theory-based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is a fundamental task in quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur...

  7. Theoretical value of psychological testing.

    Science.gov (United States)

    Shapiro, David

    2012-01-01

    Apart from their diagnostic value, psychological tests, especially the Rorschach test, have an important theoretical value for understanding psychopathology. They present a picture of a living person, in contrast to a picture of forces and agencies within the person. This rests on 2 advantages of tests over the usual psychiatric and psychoanalytic interviews: Tests are ahistorical and they present information primarily of a formal kind.

  8. Game-Theoretic Learning in Distributed Control

    KAUST Repository

    Marden, Jason R.

    2018-01-05

    In distributed architecture control problems, there is a collection of interconnected decision-making components that seek to realize desirable collective behaviors through local interactions and by processing local information. Applications range from autonomous vehicles to energy to transportation. One approach to control of such distributed architectures is to view the components as players in a game. In this approach, two design considerations are the components’ incentives and the rules that dictate how components react to the decisions of other components. In game-theoretic language, the incentives are defined through utility functions, and the reaction rules are online learning dynamics. This chapter presents an overview of this approach, covering basic concepts in game theory, special game classes, measures of distributed efficiency, utility design, and online learning rules, all with the interpretation of using game theory as a prescriptive paradigm for distributed control design.
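
    To make the prescriptive recipe concrete, the sketch below runs asynchronous best-response dynamics in a two-resource congestion game, a standard potential game; the utilities and the learning rule are generic textbook choices, not the chapter's own examples.

      import random

      N_AGENTS, RESOURCES = 6, (0, 1)

      def best_response_round(choices):
          """One sweep in which each component switches resource only if that strictly lowers its load."""
          changed = False
          for i in range(N_AGENTS):
              # load each resource would carry if agent i joined it, others held fixed
              load = {r: 1 + sum(1 for j, c in enumerate(choices) if c == r and j != i)
                      for r in RESOURCES}
              best = min(RESOURCES, key=load.get)
              if load[best] < load[choices[i]]:
                  choices[i], changed = best, True
          return changed

      choices = [random.choice(RESOURCES) for _ in range(N_AGENTS)]
      while best_response_round(choices):
          pass
      print("equilibrium loads:", {r: choices.count(r) for r in RESOURCES})

    Because every strict improvement lowers a global potential function, the loop terminates at a pure Nash equilibrium, which is the kind of guarantee utility design aims for in this framework.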

  9. Exploring super-gaussianity towards robust information-theoretical time delay estimation

    DEFF Research Database (Denmark)

    Petsatodis, Theodoros; Talantzis, Fotios; Boukis, Christos

    2013-01-01

    the effect upon TDE when modeling the source signal with different speech-based distributions. An information theoretical TDE method indirectly encapsulating higher order statistics (HOS) formed the basis of this work. The underlying assumption of Gaussian distributed source has been replaced...

  10. Information-theoretic analysis of rotational distributions from quantal and quasiclassical computations of reactive and nonreactive scattering

    International Nuclear Information System (INIS)

    Bernstein, R.B.

    1976-01-01

    An information-theoretic approach to the analysis of rotational excitation cross sections was developed by Levine, Bernstein, Johnson, Procaccia, and coworkers and applied to state-to-state cross sections available from numerical computations of reactive and nonreactive scattering (for example, by Wyatt and Kuppermann and their coworkers and by Pack and Pattengill and others). The rotational surprisals are approximately linear in the energy transferred, thereby accounting for the so-called "exponential gap law" for rotational relaxation discovered experimentally by Polanyi, Woodall, and Ding. For the "linear surprisal" case the unique relation between the surprisal parameter θ_R and the first moment of the rotational energy distribution provides a link between the pattern of the rotational state distribution and those features of the potential surface which govern the average energy transfer.
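
    In its standard form (a generic statement of the method, not this paper's specific fits), the surprisal analysis referred to here writes

      \[
        I(f_R) \;=\; -\ln\frac{P(f_R)}{P^{0}(f_R)} \;\approx\; \lambda_0 + \theta_R\, f_R ,
      \]

    where P(f_R) is the observed probability of depositing a fraction f_R of the available energy in rotation, P^0(f_R) is the statistical (prior) expectation, and a linear surprisal with slope θ_R reproduces the exponential gap behaviour mentioned above.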

  11. Information-Theoretic Approaches for Evaluating Complex Adaptive Social Simulation Systems

    Energy Technology Data Exchange (ETDEWEB)

    Omitaomu, Olufemi A [ORNL]; Ganguly, Auroop R [ORNL]; Jiao, Yu [ORNL]

    2009-01-01

    In this paper, we propose information-theoretic approaches for comparing and evaluating complex agent-based models. In information theoretic terms, entropy and mutual information are two measures of system complexity. We used entropy as a measure of the regularity of the number of agents in a social class, and mutual information as a measure of information shared by two social classes. Using our approaches, we compared two analogous agent-based (AB) models developed for a regional-scale social-simulation system. The first AB model, called ABM-1, is a complex AB model built with 10,000 agents on a desktop environment using aggregate data; the second AB model, ABM-2, was built with 31 million agents on a high-performance computing framework located at Oak Ridge National Laboratory, using fine-resolution data from the LandScan Global Population Database. The initializations were slightly different, with ABM-1 using samples from a probability distribution and ABM-2 using polling data from Gallup for a deterministic initialization. The geographical and temporal domain was present-day Afghanistan, and the end result was the number of agents with one of three behavioral modes (pro-insurgent, neutral, and pro-government) corresponding to the population mindshare. The theories embedded in each model were identical, and the test simulations focused on a test of three leadership theories (legitimacy, coercion, and representative) and two social mobilization theories (social influence and repression). The theories are tied together using the Cobb-Douglas utility function. Based on our results, the hypothesis that performance measures can be developed to compare and contrast AB models appears to be supported. Furthermore, we observed significant bias in the two models. Even so, further tests and investigations are required not only with a wider class of theories and AB models, but also with additional observed or simulated data and more comprehensive performance measures.
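
    A minimal sketch of the two measures used for the comparison, computed from class-membership counts; the data and binning below are invented, not output of the ORNL models.

      import numpy as np

      def entropy_bits(counts):
          """Shannon entropy (bits) of a histogram, e.g. of the number of agents in a social class."""
          p = np.asarray(counts, dtype=float)
          p = p[p > 0] / p.sum()
          return float(-(p * np.log2(p)).sum())

      def mutual_information_bits(joint):
          """Mutual information (bits) between two discretised class-size series."""
          pxy = np.asarray(joint, dtype=float) / np.sum(joint)
          px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0
          return float((pxy[nz] * np.log2(pxy[nz] / (px * py)[nz])).sum())

      # joint histogram of (pro-insurgent, pro-government) class sizes over simulated time steps
      joint = np.array([[30, 5, 1], [4, 40, 6], [1, 7, 26]])
      print(entropy_bits(joint.sum(axis=1)), mutual_information_bits(joint))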

  12. A P-value model for theoretical power analysis and its applications in multiple testing procedures

    Directory of Open Access Journals (Sweden)

    Fengqing Zhang

    2016-10-01

    Full Text Available Abstract Background Power analysis is a critical aspect of the design of experiments to detect an effect of a given size. When multiple hypotheses are tested simultaneously, multiplicity adjustments to p-values should be taken into account in power analysis. There are a limited number of studies on power analysis in multiple testing procedures. For some methods, the theoretical analysis is difficult and extensive numerical simulations are often needed, while other methods oversimplify the information under the alternative hypothesis. To this end, this paper aims to develop a new statistical model for power analysis in multiple testing procedures. Methods We propose a step-function-based p-value model under the alternative hypothesis, which is simple enough to perform power analysis without simulations, but not too simple to lose the information from the alternative hypothesis. The first step is to transform distributions of different test statistics (e.g., t, chi-square or F) to distributions of corresponding p-values. We then use a step function to approximate each of the p-value distributions by matching the mean and variance. Lastly, the step-function-based p-value model can be used for theoretical power analysis. Results The proposed model is applied to problems in multiple testing procedures. We first show how the most powerful critical constants can be chosen using the step-function-based p-value model. Our model is then applied to the field of multiple testing procedures to explain the assumption of monotonicity of the critical constants. Lastly, we apply our model to a behavioral weight loss and maintenance study to select the optimal critical constants. Conclusions The proposed model is easy to implement and preserves the information from the alternative hypothesis.
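
    As a simple point of reference for this kind of calculation (the exact p-value distribution for a one-sided z-test, not the authors' step-function approximation; effect size, sample size and alpha are made up):

      from scipy.stats import norm

      def pvalue_cdf_under_alternative(u, theta):
          """P(p <= u) for a one-sided upper-tailed z-test whose statistic is N(theta, 1) under H1."""
          return 1.0 - norm.cdf(norm.ppf(1.0 - u) - theta)

      theta = 0.5 * 25 ** 0.5      # standardised effect 0.5 with n = 25 observations
      alpha, m = 0.05, 10          # family-wise level and number of hypotheses
      power = pvalue_cdf_under_alternative(alpha / m, theta)   # per-test power under Bonferroni
      print(f"power per true alternative under Bonferroni: {power:.3f}")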

  13. Inform: Efficient Information-Theoretic Analysis of Collective Behaviors

    Directory of Open Access Journals (Sweden)

    Douglas G. Moore

    2018-06-01

    Full Text Available The study of collective behavior has traditionally relied on a variety of different methodological tools ranging from more theoretical methods such as population or game-theoretic models to empirical ones like Monte Carlo or multi-agent simulations. An approach that is increasingly being explored is the use of information theory as a methodological framework to study the flow of information and the statistical properties of collectives of interacting agents. While a few general purpose toolkits exist, most of the existing software for information theoretic analysis of collective systems is limited in scope. We introduce Inform, an open-source framework for efficient information theoretic analysis that exploits the computational power of a C library while simplifying its use through a variety of wrappers for common higher-level scripting languages. We focus on two such wrappers here: PyInform (Python) and rinform (R). Inform and its wrappers are cross-platform and general-purpose. They include classical information-theoretic measures, measures of information dynamics and information-based methods to study the statistical behavior of collective systems, and expose a lower-level API that allows users to construct measures of their own. We describe the architecture of the Inform framework, study its computational efficiency and use it to analyze three different case studies of collective behavior: biochemical information storage in regenerating planaria, nest-site selection in the ant Temnothorax rugatulus, and collective decision making in multi-agent simulations.

  14. Hash functions and information theoretic security

    DEFF Research Database (Denmark)

    Bagheri, Nasoor; Knudsen, Lars Ramkilde; Naderi, Majid

    2009-01-01

    Information theoretic security is an important security notion in cryptography as it provides a true lower bound for attack complexities. However, in practice attacks often have a higher cost than the information theoretic bound. In this paper we study the relationship between information theoretic...

  15. Robust recognition via information theoretic learning

    CERN Document Server

    He, Ran; Yuan, Xiaotong; Wang, Liang

    2014-01-01

    This Springer Brief represents a comprehensive review of information theoretic methods for robust recognition. A variety of information theoretic methods have been proffered in the past decade, in a large variety of computer vision applications; this work brings them together, attempts to impart the theory, optimization and usage of information entropy. The authors resort to a new information theoretic concept, correntropy, as a robust measure and apply it to solve robust face recognition and object recognition problems. For computational efficiency, the brief introduces the additive and multip...

  16. Comparing simulated and theoretical sampling distributions of the U3 person-fit statistic

    NARCIS (Netherlands)

    Emons, W.H.M.; Meijer, R.R.; Sijtsma, K.

    2002-01-01

    The accuracy with which the theoretical sampling distribution of van der Flier's person-fit statistic U3 approaches the empirical U3 sampling distribution is affected by the item discrimination. A simulation study showed that for tests with a moderate or a strong mean item discrimination, the Type I

  17. Role of information theoretic uncertainty relations in quantum theory

    International Nuclear Information System (INIS)

    Jizba, Petr; Dunningham, Jacob A.; Joo, Jaewoo

    2015-01-01

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed
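
    For continuous variables, the standard Shannon-entropy ITUR against which such improvements are measured is the Bialynicki-Birula-Mycielski inequality (quoted here in its usual form; the Rényi-entropy and entropy-power relations developed in the paper generalize this type of bound):

      \[
        S_x + S_p \;\ge\; \ln(e\pi\hbar),
      \]

    where S_x and S_p are the differential entropies of the position- and momentum-space probability densities of the same state.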

  18. Role of information theoretic uncertainty relations in quantum theory

    Energy Technology Data Exchange (ETDEWEB)

    Jizba, Petr, E-mail: p.jizba@fjfi.cvut.cz [FNSPE, Czech Technical University in Prague, Břehová 7, 115 19 Praha 1 (Czech Republic); ITP, Freie Universität Berlin, Arnimallee 14, D-14195 Berlin (Germany); Dunningham, Jacob A., E-mail: J.Dunningham@sussex.ac.uk [Department of Physics and Astronomy, University of Sussex, Falmer, Brighton, BN1 9QH (United Kingdom); Joo, Jaewoo, E-mail: j.joo@surrey.ac.uk [Advanced Technology Institute and Department of Physics, University of Surrey, Guildford, GU2 7XH (United Kingdom)

    2015-04-15

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.

  19. Comparing simulated and theoretical sampling distributions of the U3 person-fit statistic

    NARCIS (Netherlands)

    Emons, Wilco H.M.; Meijer, R.R.; Sijtsma, Klaas

    2002-01-01

    The accuracy with which the theoretical sampling distribution of van der Flier’s person-fit statistic U3 approaches the empirical U3 sampling distribution is affected by the item discrimination. A simulation study showed that for tests with a moderate or a strong mean item discrimination, the Type I

  20. Toward a Theoretical Framework for Information Science

    Directory of Open Access Journals (Sweden)

    Amanda Spink

    2000-01-01

    Full Text Available Information Science is beginning to develop a theoretical framework for the modeling of users’ interactions with information retrieval (IR) technologies within the more holistic context of human information behavior (Spink, 1998b). This paper addresses the following questions: (1) What is the nature of Information Science? and (2) What theoretical framework and model is most appropriate for Information Science? This paper proposes a theoretical framework for Information Science based on an explication of the processes of human information coordinating behavior and information feedback that facilitate the relationship between human information behavior and human interaction with information retrieval (IR) technologies (Web, digital libraries, etc.).

  1. Information theoretic description of networks

    Science.gov (United States)

    Wilhelm, Thomas; Hollunder, Jens

    2007-11-01

    We present a new information theoretic approach for network characterizations. It is developed to describe the general type of networks with n nodes and L directed and weighted links, i.e., it also works for the simpler undirected and unweighted networks. The new information theoretic measures for network characterizations are based on a transmitter-receiver analogy of effluxes and influxes. Based on these measures, we classify networks as either complex or non-complex and as either democracy or dictatorship networks. Directed networks, in particular, are furthermore classified as either information spreading or information collecting networks. The complexity classification is based on the information theoretic network complexity measure medium articulation (MA). It is proven that special networks with a medium number of links (L ∼ n^1.5) show the theoretical maximum complexity MA = (log n)^2/2. A network is complex if its MA is larger than the average MA of appropriately randomized networks: MA > MAr. A network is classified as a democracy or a dictatorship network according to how its redundancy R compares with that of appropriately randomized networks. In democracy networks all nodes are, on average, of similar importance, whereas in dictatorship networks some nodes play distinguished roles in network functioning. In other words, democracy networks are characterized by cycling of information (or mass, or energy), while in dictatorship networks there is a straight through-flow from sources to sinks. The classification of directed networks into information spreading and information collecting networks is based on the conditional entropies of the considered networks (H(A/B) = uncertainty of sender node if receiver node is known, H(B/A) = uncertainty of receiver node if sender node is known): if H(A/B) > H(B/A), it is an information collecting network, otherwise an information spreading network. Finally, different real networks (directed and undirected, weighted and unweighted) are classified according to our general scheme.

  2. Comparing Simulated and Theoretical Sampling Distributions of the U3 Person-Fit Statistic.

    Science.gov (United States)

    Emons, Wilco H. M.; Meijer, Rob R.; Sijtsma, Klaas

    2002-01-01

    Studied whether the theoretical sampling distribution of the U3 person-fit statistic is in agreement with the simulated sampling distribution under different item response theory models and varying item and test characteristics. Simulation results suggest that the use of standard normal deviates for the standardized version of the U3 statistic may…

  3. Do pseudo-absence selection strategies influence species distribution models and their predictions? An information-theoretic approach based on simulated data

    Directory of Open Access Journals (Sweden)

    Guisan Antoine

    2009-04-01

    Full Text Available Abstract Background Multiple logistic regression is precluded from many practical applications in ecology that aim to predict the geographic distributions of species because it requires absence data, which are rarely available or are unreliable. In order to use multiple logistic regression, many studies have simulated "pseudo-absences" through a number of strategies, but it is unknown how the choice of strategy influences models and their geographic predictions of species. In this paper we evaluate the effect of several prevailing pseudo-absence strategies on the predictions of the geographic distribution of a virtual species whose "true" distribution and relationship to three environmental predictors was predefined. We evaluated the effect of using (a) real absences, (b) pseudo-absences selected randomly from the background, and (c) two-step approaches: pseudo-absences selected from low suitability areas predicted by either Ecological Niche Factor Analysis (ENFA) or BIOCLIM. We compared how the choice of pseudo-absence strategy affected model fit, predictive power, and information-theoretic model selection results. Results Models built with true absences had the best predictive power, best discriminatory power, and the "true" model (the one that contained the correct predictors) was supported by the data according to AIC, as expected. Models based on random pseudo-absences had among the lowest fit, but yielded the second highest AUC value (0.97), and the "true" model was also supported by the data. Models based on two-step approaches had intermediate fit, the lowest predictive power, and the "true" model was not supported by the data. Conclusion If ecologists wish to build parsimonious GLM models that will allow them to make robust predictions, a reasonable approach is to use a large number of randomly selected pseudo-absences, and perform model selection based on an information theoretic approach. However, the resulting models can be expected to have

  4. Information density converges in dialogue: Towards an information-theoretic model.

    Science.gov (United States)

    Xu, Yang; Reitter, David

    2018-01-01

    The principle of entropy rate constancy (ERC) states that language users distribute information such that words tend to be equally predictable given previous contexts. We examine the applicability of this principle to spoken dialogue, as previous findings primarily rest on written text. The study takes into account the joint-activity nature of dialogue and the topic shift mechanisms that are different from monologue. It examines how the information contributions from the two dialogue partners interactively evolve as the discourse develops. The increase of local sentence-level information density (predicted by ERC) is shown to apply to dialogue overall. However, when the different roles of interlocutors in introducing new topics are identified, their contribution in information content displays a new converging pattern. We draw explanations to this pattern from multiple perspectives: Casting dialogue as an information exchange system would mean that the pattern is the result of two interlocutors maintaining their own context rather than sharing one. Second, we present some empirical evidence that a model of Interactive Alignment may include information density to explain the effect. Third, we argue that building common ground is a process analogous to information convergence. Thus, we put forward an information-theoretic view of dialogue, under which some existing theories of human dialogue may eventually be unified. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. An Information-Theoretic-Cluster Visualization for Self-Organizing Maps.

    Science.gov (United States)

    Brito da Silva, Leonardo Enzo; Wunsch, Donald C

    2018-06-01

    Improved data visualization will be a significant tool to enhance cluster analysis. In this paper, an information-theoretic-based method for cluster visualization using self-organizing maps (SOMs) is presented. The information-theoretic visualization (IT-vis) has the same structure as the unified distance matrix, but instead of depicting Euclidean distances between adjacent neurons, it displays the similarity between the distributions associated with adjacent neurons. Each SOM neuron has an associated subset of the data set whose cardinality controls the granularity of the IT-vis and with which the first- and second-order statistics are computed and used to estimate their probability density functions. These are used to calculate the similarity measure, based on Renyi's quadratic cross entropy and cross information potential (CIP). The introduced visualizations combine the low computational cost and kernel estimation properties of the representative CIP and the data structure representation of a single-linkage-based grouping algorithm to generate an enhanced SOM-based visualization. The visual quality of the IT-vis is assessed by comparing it with other visualization methods for several real-world and synthetic benchmark data sets. Thus, this paper also contains a significant literature survey. The experiments demonstrate the IT-vis cluster revealing capabilities, in which cluster boundaries are sharply captured. Additionally, the information-theoretic visualizations are used to perform clustering of the SOM. Compared with other methods, IT-vis of large SOMs yielded the best results in this paper, for which the quality of the final partitions was evaluated using external validity indices.
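
    The similarity measure named in this record has a simple Parzen-window form; below is a generic one-dimensional sketch with a Gaussian kernel and synthetic samples (the SOM bookkeeping of the actual IT-vis is omitted).

      import numpy as np

      def cross_information_potential(x, y, sigma=0.5):
          """Parzen estimate of the cross information potential V(p, q) = integral of p(x) q(x) dx."""
          x, y = np.asarray(x, float)[:, None], np.asarray(y, float)[None, :]
          kernel = np.exp(-(x - y) ** 2 / (4 * sigma ** 2)) / np.sqrt(4 * np.pi * sigma ** 2)
          return kernel.mean()

      def renyi_quadratic_cross_entropy(x, y, sigma=0.5):
          """H2(p; q) = -log V(p, q); smaller values indicate more similar neighbouring distributions."""
          return -np.log(cross_information_potential(x, y, sigma))

      rng = np.random.default_rng(1)
      a, b = rng.normal(0.0, 1.0, 200), rng.normal(0.2, 1.0, 200)
      print(renyi_quadratic_cross_entropy(a, b))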

  6. Informing Physics: Jacob Bekenstein and the Informational Turn in Theoretical Physics

    Science.gov (United States)

    Belfer, Israel

    2014-03-01

    In his PhD dissertation in the early 1970s, the Mexican-Israeli theoretical physicist Jacob Bekenstein developed the thermodynamics of black holes using a generalized version of the second law of thermodynamics. This work made it possible for physicists to describe and analyze black holes using information-theoretical concepts. It also helped to transform information theory into a fundamental and foundational concept in theoretical physics. The story of Bekenstein's work—which was initially opposed by many scientists, including Stephen Hawking—highlights the transformation within physics towards an information-oriented scientific mode of theorizing. This "informational turn" amounted to a mild-mannered revolution within physics, revolutionary without being rebellious.

  7. Information-Theoretic Inference of Common Ancestors

    Directory of Open Access Journals (Sweden)

    Bastian Steudel

    2015-04-01

    Full Text Available A directed acyclic graph (DAG) partially represents the conditional independence structure among observations of a system if the local Markov condition holds, that is, if every variable is independent of its non-descendants given its parents. In general, there is a whole class of DAGs that represents a given set of conditional independence relations. We are interested in properties of this class that can be derived from observations of a subsystem only. To this end, we prove an information-theoretic inequality that allows for the inference of common ancestors of observed parts in any DAG representing some unknown larger system. More explicitly, we show that a large amount of dependence in terms of mutual information among the observations implies the existence of a common ancestor that distributes this information. Within the causal interpretation of DAGs, our result can be seen as a quantitative extension of Reichenbach’s principle of common cause to more than two variables. Our conclusions are valid also for non-probabilistic observations, such as binary strings, since we state the proof for an axiomatized notion of “mutual information” that includes the stochastic as well as the algorithmic version.

  8. On distribution reduction and algorithm implementation in inconsistent ordered information systems.

    Science.gov (United States)

    Zhang, Yanqin

    2014-01-01

    As one part of our work on ordered information systems, distribution reduction is studied in inconsistent ordered information systems (OISs). Some important properties of distribution reduction are studied and discussed. The dominance matrix is restated for reduction acquisition in dominance-relation-based information systems. A matrix algorithm for distribution reduction acquisition is presented step by step, and a program implementing the algorithm has been written. The approach provides an effective tool for theoretical research on, and practical applications of, ordered information systems. For more detailed and valid illustration, cases are employed to explain and verify the algorithm and the program, which show the effectiveness of the algorithm in complicated information systems.

  9. Information theoretic bounds for compressed sensing in SAR imaging

    International Nuclear Information System (INIS)

    Jingxiong, Zhang; Ke, Yang; Jianzhong, Guo

    2014-01-01

    Compressed sensing (CS) is a new framework for sampling and reconstructing sparse signals from measurements significantly fewer than those prescribed by Nyquist rate in the Shannon sampling theorem. This new strategy, applied in various application areas including synthetic aperture radar (SAR), relies on two principles: sparsity, which is related to the signals of interest, and incoherence, which refers to the sensing modality. An important question in CS-based SAR system design concerns sampling rate necessary and sufficient for exact or approximate recovery of sparse signals. In the literature, bounds of measurements (or sampling rate) in CS have been proposed from the perspective of information theory. However, these information-theoretic bounds need to be reviewed and, if necessary, validated for CS-based SAR imaging, as there are various assumptions made in the derivations of lower and upper bounds on sub-Nyquist sampling rates, which may not hold true in CS-based SAR imaging. In this paper, information-theoretic bounds of sampling rate will be analyzed. For this, the SAR measurement system is modeled as an information channel, with channel capacity and rate-distortion characteristics evaluated to enable the determination of sampling rates required for recovery of sparse scenes. Experiments based on simulated data will be undertaken to test the theoretic bounds against empirical results about sampling rates required to achieve certain detection error probabilities
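
    For orientation, the best-known sufficient condition of this kind (a standard compressed-sensing result, not the SAR-specific bounds analyzed in the paper) states that a k-sparse scene of dimension n can be recovered from

      \[
        m \;\ge\; C\, k \,\ln(n/k)
      \]

    random measurements, for a constant C that depends on the measurement ensemble and the recovery guarantee; the paper asks how such bounds, and their information-theoretic counterparts, carry over to SAR imaging.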

  10. Theoretical information measurement in nonrelativistic time-dependent approach

    Science.gov (United States)

    Najafizade, S. A.; Hassanabadi, H.; Zarrinkamar, S.

    2018-02-01

    The information-theoretic measures of time-dependent Schrödinger equation are investigated via the Shannon information entropy, variance and local Fisher quantities. In our calculations, we consider the two first states n = 0,1 and obtain the position Sx (t) and momentum Sp (t) Shannon entropies as well as Fisher information Ix (t) in position and momentum Ip (t) spaces. Using the Fourier transformed wave function, we obtain the results in momentum space. Some interesting features of the information entropy densities ρs (x,t) and γs (p,t), as well as the probability densities ρ (x,t) and γ (p,t) for time-dependent states are demonstrated. We establish a general relation between variance and Fisher's information. The Bialynicki-Birula-Mycielski inequality is tested and verified for the states n = 0,1.

  11. One-dimensional barcode reading: an information theoretic approach

    Science.gov (United States)

    Houni, Karim; Sawaya, Wadih; Delignon, Yves

    2008-03-01

    In the convergence context of identification technology and information-data transmission, the barcode found its place as the simplest and the most pervasive solution for new uses, especially within mobile commerce, bringing youth to this long-lived technology. From a communication theory point of view, a barcode is a singular coding based on a graphical representation of the information to be transmitted. We present an information theoretic approach for 1D image-based barcode reading analysis. With a barcode facing the camera, distortions and acquisition are modeled as a communication channel. The performance of the system is evaluated by means of the average mutual information quantity. On the basis of this theoretical criterion for a reliable transmission, we introduce two new measures: the theoretical depth of field and the theoretical resolution. Simulations illustrate the gain of this approach.

  12. Information-Theoretic Properties of Auditory Sequences Dynamically Influence Expectation and Memory.

    Science.gov (United States)

    Agres, Kat; Abdallah, Samer; Pearce, Marcus

    2018-01-01

    A basic function of cognition is to detect regularities in sensory input to facilitate the prediction and recognition of future events. It has been proposed that these implicit expectations arise from an internal predictive coding model, based on knowledge acquired through processes such as statistical learning, but it is unclear how different types of statistical information affect listeners' memory for auditory stimuli. We used a combination of behavioral and computational methods to investigate memory for non-linguistic auditory sequences. Participants repeatedly heard tone sequences varying systematically in their information-theoretic properties. Expectedness ratings of tones were collected during three listening sessions, and a recognition memory test was given after each session. Information-theoretic measures of sequential predictability significantly influenced listeners' expectedness ratings, and variations in these properties had a significant impact on memory performance. Predictable sequences yielded increasingly better memory performance with increasing exposure. Computational simulations using a probabilistic model of auditory expectation suggest that listeners dynamically formed a new, and increasingly accurate, implicit cognitive model of the information-theoretic structure of the sequences throughout the experimental session. Copyright © 2017 Cognitive Science Society, Inc.

  13. Information theoretic preattentive saliency

    DEFF Research Database (Denmark)

    Loog, Marco

    2011-01-01

    Employing an information theoretic operational definition of bottom-up attention from the field of computational visual perception a very general expression for saliency is provided. As opposed to many of the current approaches to determining a saliency map there is no need for an explicit data...... of which features, image information is described. We illustrate our result by determining a few specific saliency maps based on particular choices of features. One of them makes the link with the mapping underlying well-known Harris interest points, which is a result recently obtained in isolation...

  14. System identification with information theoretic criteria

    NARCIS (Netherlands)

    A.A. Stoorvogel; J.H. van Schuppen (Jan)

    1995-01-01

    Attention is focused in this paper on the approximation problem of system identification with information theoretic criteria. For a class of problems it is shown that the criterion of mutual information rate is identical to the criterion of exponential-of-quadratic cost and to

  15. Beyond the 2×2-contingency table: a primer on entropies and mutual information in various scenarios involving m diagnostic categories and n categories of diagnostic tests.

    Science.gov (United States)

    Reibnegger, Gilbert

    2013-10-21

    Usual evaluation tools for diagnostic tests, such as sensitivity/specificity and ROC analyses, are designed for the discrimination between two diagnostic categories, using dichotomous test results. Information theoretical quantities such as mutual information allow in-depth analysis of more complex discrimination problems, including continuous test results, but are rarely used in clinical chemistry. This paper provides a primer on useful information theoretical concepts with a strong focus on typical diagnostic scenarios. Information theoretical concepts are briefly explained. Mathematica CDF documents are provided which compute entropies and mutual information as a function of pretest probabilities and the distribution of test results among the categories, and allow interactive exploration of the behavior of these quantities in comparison with more conventional diagnostic measures. Using data from a previously published study, the application of information theory to practical diagnostic problems involving up to 4×4 contingency tables is demonstrated. Information theoretical concepts are particularly useful for diagnostic problems requiring more than the usual binary classification. Quantitative test results can be properly analyzed, and in contrast to popular concepts such as ROC analysis, the effects of variations of pre-test probabilities of the diagnostic categories can be explicitly taken into account. © 2013 Elsevier B.V. All rights reserved.
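
    A minimal sketch of the central computation, mutual information between diagnostic category and test category from an m-by-n contingency table; the table below is invented for illustration, not data from the cited study.

      import numpy as np

      def diagnostic_information_bits(table):
          """I(diagnosis; test) in bits, plus the pre-test entropy H(diagnosis), from a count table."""
          pxy = np.asarray(table, dtype=float) / np.sum(table)
          px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0
          mi = float((pxy[nz] * np.log2(pxy[nz] / (px * py)[nz])).sum())
          h_dx = float(-(px[px > 0] * np.log2(px[px > 0])).sum())
          return mi, h_dx

      table = [[40, 8, 2],    # rows: three diagnostic categories
               [6, 30, 10],   # columns: three test-result categories
               [3, 9, 42]]
      mi, h_dx = diagnostic_information_bits(table)
      print(f"{mi:.3f} of {h_dx:.3f} bits of diagnostic uncertainty removed by the test")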

  16. 10 CFR 431.198 - Enforcement testing for distribution transformers.

    Science.gov (United States)

    2010-01-01

    10 CFR, Energy — Commercial and Industrial Equipment, Distribution Transformers, Compliance and Enforcement. § 431.198 Enforcement testing for distribution transformers. (a) Test notice. Upon receiving information in writing...

  17. Wireless Information-Theoretic Security in an Outdoor Topology with Obstacles: Theoretical Analysis and Experimental Measurements

    Directory of Open Access Journals (Sweden)

    Dagiuklas Tasos

    2011-01-01

    Full Text Available This paper presents a Wireless Information-Theoretic Security (WITS) scheme, which has been recently introduced as a robust physical layer-based security solution, especially for infrastructureless networks. An autonomic network of moving users was implemented via 802.11n nodes of an ad hoc network for an outdoor topology with obstacles. Obstructed-Line-of-Sight (OLOS) and Non-Line-of-Sight (NLOS) propagation scenarios were examined. Low-speed user movement was considered, so that Doppler spread could be discarded. A transmitter and a legitimate receiver exchanged information in the presence of a moving eavesdropper. Average Signal-to-Noise Ratio (SNR) values were acquired for both the main and the wiretap channel, and the Probability of Nonzero Secrecy Capacity was calculated based on a theoretical formula. Experimental results validate theoretical findings, stressing the importance of user location and mobility schemes on the robustness of Wireless Information-Theoretic Security, and call for further theoretical analysis.
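
    For the common quasi-static Rayleigh-fading model, the theoretical formula referred to takes the simple form (a standard physical-layer-security result; the abstract does not state which fading model the authors assume):

      \[
        \Pr\{C_s > 0\} \;=\; \frac{\bar{\gamma}_M}{\bar{\gamma}_M + \bar{\gamma}_W},
      \]

    where \bar{\gamma}_M and \bar{\gamma}_W are the average SNRs of the main and the wiretap channel, so the legitimate link only has to be better on average, not at every instant.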

  18. Information-Theoretical Analysis of EEG Microstate Sequences in Python

    Directory of Open Access Journals (Sweden)

    Frederic von Wegner

    2018-06-01

    Full Text Available We present an open-source Python package to compute information-theoretical quantities for electroencephalographic data. Electroencephalography (EEG) measures the electrical potential generated by the cerebral cortex, and the set of spatial patterns projected by the brain's electrical potential on the scalp surface can be clustered into a set of representative maps called EEG microstates. Microstate time series are obtained by competitively fitting the microstate maps back into the EEG data set, i.e., by substituting the EEG data at a given time with the label of the microstate that has the highest similarity with the actual EEG topography. As microstate sequences consist of non-metric random variables, e.g., the letters A–D, we recently introduced information-theoretical measures to quantify these time series. In wakeful resting state EEG recordings, we found new characteristics of microstate sequences such as periodicities related to EEG frequency bands. The algorithms used are here provided as an open-source package and their use is explained in a tutorial style. The package is self-contained and the programming style is procedural, focusing on code intelligibility and easy portability. Using a sample EEG file, we demonstrate how to perform EEG microstate segmentation using the modified K-means approach, and how to compute and visualize the recently introduced information-theoretical tests and quantities. The time-lagged mutual information function is derived as a discrete symbolic alternative to the autocorrelation function for metric time series, and confidence intervals are computed from Markov chain surrogate data. The software package provides an open-source extension to the existing implementations of the microstate transform and is specifically designed to analyze resting state EEG recordings.
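
    The time-lagged mutual information mentioned in this record is defined for any symbolic sequence; the generic sketch below does not use the package's own API, and the label sequence is a random surrogate rather than real microstate data.

      import numpy as np

      def lagged_mutual_information(labels, lag):
          """Mutual information (bits) between a symbolic sequence and itself shifted by `lag` samples."""
          codes = np.unique(labels, return_inverse=True)[1]
          x, y = codes[:-lag], codes[lag:]
          k = codes.max() + 1
          joint = np.zeros((k, k))
          np.add.at(joint, (x, y), 1.0)
          pxy = joint / joint.sum()
          px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0
          return float((pxy[nz] * np.log2(pxy[nz] / (px * py)[nz])).sum())

      rng = np.random.default_rng(2)
      seq = rng.choice(list("ABCD"), size=5000)   # surrogate microstate labels
      print([round(lagged_mutual_information(seq, d), 4) for d in (1, 2, 5)])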

  19. Information theoretic learning: Renyi's entropy and kernel perspectives

    CERN Document Server

    Principe, Jose C

    2010-01-01

    This book presents the first cohesive treatment of Information Theoretic Learning (ITL) algorithms to adapt linear or nonlinear learning machines in both supervised and unsupervised paradigms. ITL is a framework where the conventional concepts of second order statistics (covariance, L2 distances, correlation functions) are substituted by scalars and functions with information theoretic underpinnings, respectively entropy, mutual information and correntropy. ITL quantifies the stochastic structure of the data beyond second order statistics for improved performance without using full-blown Bayesi...

  20. Testing the mutual information expansion of entropy with multivariate Gaussian distributions.

    Science.gov (United States)

    Goethe, Martin; Fita, Ignacio; Rubi, J Miguel

    2017-12-14

    The mutual information expansion (MIE) represents an approximation of the configurational entropy in terms of low-dimensional integrals. It is frequently employed to compute entropies from simulation data of large systems, such as macromolecules, for which brute-force evaluation of the full configurational integral is intractable. Here, we test the validity of MIE for systems consisting of more than m = 100 degrees of freedom (dofs). The dofs are distributed according to multivariate Gaussian distributions which were generated from protein structures using a variant of the anisotropic network model. For the Gaussian distributions, we have semi-analytical access to the configurational entropy as well as to all contributions of MIE. This allows us to accurately assess the validity of MIE for different situations. We find that MIE diverges for systems containing long-range correlations which means that the error of consecutive MIE approximations grows with the truncation order n for all tractable n ≪ m. This fact implies severe limitations on the applicability of MIE, which are discussed in the article. For systems with correlations that decay exponentially with distance, MIE represents an asymptotic expansion of entropy, where the first successive MIE approximations approach the exact entropy, while MIE also diverges for larger orders. In this case, MIE serves as a useful entropy expansion when truncated up to a specific truncation order which depends on the correlation length of the system.
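
    For multivariate Gaussians both the exact entropy and the low-order MIE terms are available in closed form, which is what makes such a test possible; a small sketch (the covariance below is random and well conditioned, not one of the protein-derived ones):

      import numpy as np

      def gaussian_entropy(cov):
          """Exact differential entropy (nats) of a zero-mean Gaussian with covariance cov."""
          m = cov.shape[0]
          return 0.5 * (m * np.log(2 * np.pi * np.e) + np.linalg.slogdet(cov)[1])

      def mie_second_order(cov):
          """Second-order MIE: sum of marginal entropies minus sum of pairwise mutual informations."""
          var = np.diag(cov)
          h1 = 0.5 * np.log(2 * np.pi * np.e * var).sum()
          corr = cov / np.sqrt(np.outer(var, var))
          iu = np.triu_indices(cov.shape[0], k=1)
          return h1 - (-0.5 * np.log(1.0 - corr[iu] ** 2)).sum()

      rng = np.random.default_rng(3)
      a = rng.normal(size=(8, 8))
      cov = a @ a.T + 8.0 * np.eye(8)
      print(gaussian_entropy(cov), mie_second_order(cov))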

  1. An Information Theoretic Characterisation of Auditory Encoding

    Science.gov (United States)

    Overath, Tobias; Cusack, Rhodri; Kumar, Sukhbinder; von Kriegstein, Katharina; Warren, Jason D; Grube, Manon; Carlyon, Robert P; Griffiths, Timothy D

    2007-01-01

    The entropy metric derived from information theory provides a means to quantify the amount of information transmitted in acoustic streams like speech or music. By systematically varying the entropy of pitch sequences, we sought brain areas where neural activity and energetic demands increase as a function of entropy. Such a relationship is predicted to occur in an efficient encoding mechanism that uses less computational resource when less information is present in the signal: we specifically tested the hypothesis that such a relationship is present in the planum temporale (PT). In two convergent functional MRI studies, we demonstrated this relationship in PT for encoding, while furthermore showing that a distributed fronto-parietal network for retrieval of acoustic information is independent of entropy. The results establish PT as an efficient neural engine that demands less computational resource to encode redundant signals than those with high information content. PMID:17958472

  2. Modified Distribution-Free Goodness-of-Fit Test Statistic.

    Science.gov (United States)

    Chun, So Yeon; Browne, Michael W; Shapiro, Alexander

    2018-03-01

    Covariance structure analysis and its structural equation modeling extensions have become one of the most widely used methodologies in social sciences such as psychology, education, and economics. An important issue in such analysis is to assess the goodness of fit of a model under analysis. One of the most popular test statistics used in covariance structure analysis is the asymptotically distribution-free (ADF) test statistic introduced by Browne (Br J Math Stat Psychol 37:62-83, 1984). The ADF statistic can be used to test models without any specific distribution assumption (e.g., multivariate normal distribution) of the observed data. Despite its advantage, it has been shown in various empirical studies that unless sample sizes are extremely large, this ADF statistic could perform very poorly in practice. In this paper, we provide a theoretical explanation for this phenomenon and further propose a modified test statistic that improves the performance in samples of realistic size. The proposed statistic deals with the possible ill-conditioning of the involved large-scale covariance matrices.

  3. LPI Optimization Framework for Target Tracking in Radar Network Architectures Using Information-Theoretic Criteria

    Directory of Open Access Journals (Sweden)

    Chenguang Shi

    2014-01-01

    Full Text Available Widely distributed radar network architectures can provide significant performance improvement for target detection and localization. For a fixed radar network, the achievable target detection performance may go beyond a predetermined threshold with full transmitted power allocation, which is extremely vulnerable in modern electronic warfare. In this paper, we study the problem of low probability of intercept (LPI) design for radar networks and propose two novel LPI optimization schemes based on information-theoretic criteria. For a predefined threshold of target detection, the Schleher intercept factor is minimized by optimizing transmission power allocation among netted radars in the network. Due to the lack of an analytical closed-form expression for receiver operation characteristics (ROC), we employ two information-theoretic criteria, namely, Bhattacharyya distance and J-divergence, as the metrics for target detection performance. The resulting nonconvex and nonlinear LPI optimization problems associated with different information-theoretic criteria are cast under a unified framework, and the nonlinear programming based genetic algorithm (NPGA) is used to tackle the optimization problems in the framework. Numerical simulations demonstrate that our proposed LPI strategies are effective in enhancing the LPI performance for radar networks.
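
    The two detection-performance surrogates used in place of a closed-form ROC are, for data distributions p and q under the target-absent and target-present hypotheses (standard definitions, not the paper's radar-specific expressions):

      \[
        B(p, q) \;=\; -\ln \int \sqrt{p(x)\, q(x)}\; dx ,
        \qquad
        J(p, q) \;=\; D_{\mathrm{KL}}(p \,\|\, q) + D_{\mathrm{KL}}(q \,\|\, p) .
      \]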

  4. Development and validation of a theoretical test in basic laparoscopy

    DEFF Research Database (Denmark)

    Strandbygaard, Jeanett; Maagaard, Mathilde; Larsen, Christian Rifbjerg

    2013-01-01

    for first-year residents in obstetrics and gynecology. This study therefore aimed to develop and validate a framework for a theoretical knowledge test, a multiple-choice test, in basic theory related to laparoscopy. METHODS: The content of the multiple-choice test was determined by conducting informal...... conversational interviews with experts in laparoscopy. The subsequent relevance of the test questions was evaluated using the Delphi method involving regional chief physicians. Construct validity was tested by comparing test results from three groups with expected different clinical competence and knowledge.......001). Internal consistency (Cronbach's alpha) was 0.82. There was no evidence of differential item functioning between the three groups tested. CONCLUSIONS: A newly developed knowledge test in basic laparoscopy proved to have content and construct validity. The formula for the development and validation...

  5. Information Theoretic-Learning Auto-Encoder

    OpenAIRE

    Santana, Eder; Emigh, Matthew; Principe, Jose C

    2016-01-01

    We propose Information Theoretic-Learning (ITL) divergence measures for variational regularization of neural networks. We also explore ITL-regularized autoencoders as an alternative to variational autoencoding Bayes, adversarial autoencoders and generative adversarial networks for randomly generating sample data without explicitly defining a partition function. This paper also formalizes generative moment matching networks under the ITL framework.

  6. Investigation of Means of Mitigating Congestion in Complex, Distributed Network Systems by Optimization Means and Information Theoretic Procedures

    Science.gov (United States)

    2008-02-01

    Information Theoretic Proceedures Frank Mufalli Rakesh Nagi Jim Llinas Sumita Mishra SUNY at Buffalo— CUBRC 4455 Genessee Street Buffalo...5f. WORK UNIT NUMBER NY 7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES) SUNY at Buffalo— CUBRC * Paine College ** 4455 Genessee

  7. Information-Theoretic Bounded Rationality and ε-Optimality

    Directory of Open Access Journals (Sweden)

    Daniel A. Braun

    2014-08-01

    Full Text Available Bounded rationality concerns the study of decision makers with limited information processing resources. Previously, the free energy difference functional has been suggested to model bounded rational decision making, as it provides a natural trade-off between an energy or utility function that is to be optimized and information processing costs that are measured by entropic search costs. The main question of this article is how the information-theoretic free energy model relates to simple ε-optimality models of bounded rational decision making, where the decision maker is satisfied with any action in an ε-neighborhood of the optimal utility. We find that the stochastic policies that optimize the free energy trade-off comply with the notion of ε-optimality. Moreover, this optimality criterion even holds when the environment is adversarial. We conclude that the study of bounded rationality based on ε-optimality criteria that abstract away from the particulars of the information processing constraints is compatible with the information-theoretic free energy model of bounded rationality.
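
    In one standard convention (the notation here is generic, not copied from the article), the free energy trade-off and the bounded-rational policy that optimizes it read

      \[
        F[\pi] \;=\; \mathbb{E}_{\pi}[U(a)] - \tfrac{1}{\beta}\, D_{\mathrm{KL}}(\pi \,\|\, \pi_0),
        \qquad
        \pi^{*}(a) \;\propto\; \pi_0(a)\, e^{\beta U(a)},
      \]

    where \pi_0 is a prior (default) policy, U the utility and \beta an inverse temperature that prices information processing: \beta -> infinity recovers the fully rational maximizer, while \beta -> 0 leaves the prior unchanged.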

  8. Prototyping a Distributed Information Retrieval System That Uses Statistical Ranking.

    Science.gov (United States)

    Harman, Donna; And Others

    1991-01-01

    Built using a distributed architecture, this prototype distributed information retrieval system uses statistical ranking techniques to provide better service to the end user. Distributed architecture was shown to be a feasible alternative to centralized or CD-ROM information retrieval, and user testing of the ranking methodology showed both…

  9. Theoretical development of information science: A brief history

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2017-01-01

    the strongest “paradigms” in the field is a tradition derived from the Cranfield experiments in the 1960s and the bibliometric research following the publication of Science Citation Index from 1963 and forward. Among the competing theoretical frameworks, ‘the cognitive view’ became influential from the 1970s......This paper presents a brief history of information science (IS) as viewed by the author. The term ‘information science’ goes back to 1955 and evolved in the aftermath of Claude Shannon’s ‘information theory’ (1948), which also inspired research into problems in fields of library science...... and documentation. These subjects were a main focus of what became established as ‘information science’, which from 1964 onwards was often termed ‘library and information science’ (LIS). However, the usefulness of Shannon’s information theory as the theoretical foundation of the field has been challenged. Among...

  10. Information-Theoretic Perspectives on Geophysical Models

    Science.gov (United States)

    Nearing, Grey

    2016-04-01

    To test any hypothesis about any dynamic system, it is necessary to build a model that places that hypothesis into the context of everything else that we know about the system: initial and boundary conditions and interactions between various governing processes (Hempel and Oppenheim, 1948, Cartwright, 1983). No hypothesis can be tested in isolation, and no hypothesis can be tested without a model (for a geoscience-related discussion see Clark et al., 2011). Science is (currently) fundamentally reductionist in the sense that we seek some small set of governing principles that can explain all phenomena in the universe, and such laws are ontological in the sense that they describe the object under investigation (Davies, 1990 gives several competing perspectives on this claim). However, since we cannot build perfect models of complex systems, any model that does not also contain an epistemological component (i.e., a statement, like a probability distribution, that refers directly to the quality of the information from the model) is falsified immediately (in the sense of Popper, 2002) given only a small number of observations. Models necessarily contain both ontological and epistemological components, and what this means is that the purpose of any robust scientific method is to measure the amount and quality of information provided by models. I believe that any viable philosophy of science must be reducible to this statement. The first step toward a unified theory of scientific models (and therefore a complete philosophy of science) is a quantitative language that applies to both ontological and epistemological questions. Information theory is one such language: Cox's (1946) theorem (see Van Horn, 2003) tells us that probability theory is the (only) calculus that is consistent with Classical Logic (Jaynes, 2003; chapter 1), and information theory is simply the integration of convex transforms of probability ratios (integration reduces density functions to scalar

  11. A Game-theoretical Approach for Distributed Cooperative Control of Autonomous Underwater Vehicles

    KAUST Repository

    Lu, Yimeng

    2018-05-01

    This thesis explores a game-theoretical approach for underwater environmental monitoring applications. We first apply a game-theoretical algorithm to the multi-agent resource coverage problem in drifting environments. Furthermore, the existing utility design and learning process of the algorithm are modified to fit the specific constraints of underwater exploration/monitoring tasks. The revised approach takes the realities of underwater monitoring applications, such as the effect of sea currents, prior knowledge of the resource and occasional communications between agents, into account and adapts to them to reach better performance. As the motivation of this thesis comes from real applications, we place strong emphasis on the implementation phase. A ROS-Gazebo simulation environment was created in preparation for actual tests; the algorithms are implemented in simulations of both the vehicle dynamics and the environment. A multi-agent underwater autonomous robotic system was then developed for hardware tests in real settings, with local controllers making their own decisions. These systems are used for testing the above-mentioned algorithms and for future development of other underwater projects. Finally, other robotics work carried out during this thesis is briefly mentioned, including contributions to the MBZIRC robotics competition and distributed control of UAVs in an adversarial environment.

  12. Information-theoretic lengths of Jacobi polynomials

    Energy Technology Data Exchange (ETDEWEB)

    Guerrero, A; Dehesa, J S [Departamento de Fisica Atomica, Molecular y Nuclear, Universidad de Granada, Granada (Spain); Sanchez-Moreno, P, E-mail: agmartinez@ugr.e, E-mail: pablos@ugr.e, E-mail: dehesa@ugr.e [Instituto ' Carlos I' de Fisica Teorica y Computacional, Universidad de Granada, Granada (Spain)

    2010-07-30

    The information-theoretic lengths of the Jacobi polynomials P_n^(α,β)(x), which are information-theoretic measures (Renyi, Shannon and Fisher) of their associated Rakhmanov probability density, are investigated. They quantify the spreading of the polynomials along the orthogonality interval [-1, 1] in a complementary but different way from the root-mean-square or standard deviation because, contrary to this measure, they do not refer to any specific point of the interval. The explicit expressions of the Fisher length are given. The Renyi lengths are found by the use of the combinatorial multivariable Bell polynomials in terms of the polynomial degree n and the parameters (α, β). The Shannon length, which cannot be exactly calculated because of its logarithmic functional form, is bounded from below by using sharp upper bounds to general densities on [-1, +1] given in terms of various expectation values; moreover, its asymptotics is also pointed out. Finally, several computational issues relative to these three quantities are carefully analyzed.
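
    As a hedged numerical illustration of the Rakhmanov density underlying these measures (quadrature-based, with assumed parameter values; the paper itself derives explicit expressions and bounds rather than numerics), the Shannon length can be estimated as follows.

```python
# Numerical sketch: Shannon length N_S = exp(-integral of rho*ln(rho)) of the
# Rakhmanov density rho(x) proportional to (1-x)^a (1+x)^b [P_n^(a,b)(x)]^2 on [-1, 1].
# Parameter values (n, a, b) are assumed for illustration only.
import numpy as np
from scipy.integrate import quad
from scipy.special import eval_jacobi

def rakhmanov_shannon_length(n, a, b):
    dens = lambda x: (1 - x)**a * (1 + x)**b * eval_jacobi(n, a, b, x)**2
    Z, _ = quad(dens, -1, 1)                                   # normalization by quadrature
    entropy_integrand = lambda x: -(dens(x) / Z) * np.log(dens(x) / Z + 1e-300)
    S, _ = quad(entropy_integrand, -1, 1)                      # differential Shannon entropy
    return np.exp(S)

for n in (1, 2, 5):
    print(n, rakhmanov_shannon_length(n, a=0.5, b=0.5))
```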

  13. Information-theoretic metamodel of organizational evolution

    Science.gov (United States)

    Sepulveda, Alfredo

    2011-12-01

    Social organizations are abstractly modeled by holarchies---self-similar connected networks---and intelligent complex adaptive multiagent systems---large networks of autonomous reasoning agents interacting via scaled processes. However, little is known of how information shapes evolution in such organizations, a gap that can lead to misleading analytics. The research problem addressed in this study was the ineffective manner in which classical model-predict-control methods used in business analytics attempt to define organization evolution. The purpose of the study was to construct an effective metamodel for organization evolution based on a proposed complex adaptive structure---the info-holarchy. Theoretical foundations of this study were holarchies, complex adaptive systems, evolutionary theory, and quantum mechanics, among other recently developed physical and information theories. Research questions addressed how information evolution patterns gleaned from the study's inductive metamodel more aptly explained volatility in organizations. In this study, a hybrid grounded theory based on abstract inductive extensions of information theories was utilized as the research methodology. An overarching heuristic metamodel was framed from the theoretical analysis of the properties of these extension theories and applied to business, neural, and computational entities. This metamodel resulted in the synthesis of a metaphor for, and generalization of, organization evolution, serving as the recommended and appropriate analytical tool to view business dynamics for future applications. This study may manifest positive social change through a fundamental understanding of complexity in business from general information theories, resulting in more effective management.

  14. Development and Validation of a Theoretical Test in Endosonography for Pulmonary Diseases

    DEFF Research Database (Denmark)

    Savran, Mona M; Clementsen, Paul Frost; Annema, Jouke T

    2014-01-01

    evidence for this test. METHODS: Initially, 78 questions were constructed after informal conversational interviews with 4 international experts in endosonography. The clarity and content validity of the questions were tested using a Delphi-like approach. Construct validity was explored by administering......BACKGROUND: Theoretical testing provides the necessary foundation to perform technical skills. Additionally, testing improves the retention of knowledge. OBJECTIVES: The aims of this study were to develop a multiple-choice test in endosonography for pulmonary diseases and to gather validity...... consistently than the novices (p = 0.037) and the intermediates (p …). Validity evidence was gathered, and the test demonstrated content and construct validity....

  15. Information-Theoretic Inference of Large Transcriptional Regulatory Networks

    Directory of Open Access Journals (Sweden)

    Meyer Patrick

    2007-01-01

    Full Text Available The paper presents MRNET, an original method for inferring genetic networks from microarray data. The method is based on maximum relevance/minimum redundancy (MRMR), an effective information-theoretic technique for feature selection in supervised learning. The MRMR principle consists in selecting among the least redundant variables the ones that have the highest mutual information with the target. MRNET extends this feature selection principle to networks in order to infer gene-dependence relationships from microarray data. The paper assesses MRNET by benchmarking it against RELNET, CLR, and ARACNE, three state-of-the-art information-theoretic methods for large (up to several thousands of genes) network inference. Experimental results on thirty synthetically generated microarray datasets show that MRNET is competitive with these methods.
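
    A schematic sketch of the MRMR selection step that MRNET applies per target gene, using plug-in mutual information on binned toy data; the helper names and the synthetic expression matrix are illustrative, not the paper's benchmark.

```python
# Schematic MRMR sketch: pick regulators with high mutual information (MI) to the
# target gene and low MI to the regulators already selected (toy data only).
import numpy as np

def mutual_info(x, y, bins=8):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(1, keepdims=True), pxy.sum(0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def mrmr_regulators(expr, target_idx, k=2):
    """Greedy MRMR: relevance to the target minus mean redundancy with selected genes."""
    candidates = [g for g in range(expr.shape[1]) if g != target_idx]
    relevance = {g: mutual_info(expr[:, g], expr[:, target_idx]) for g in candidates}
    selected = []
    while candidates and len(selected) < k:
        def score(g):
            red = np.mean([mutual_info(expr[:, g], expr[:, s]) for s in selected]) if selected else 0.0
            return relevance[g] - red
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected

rng = np.random.default_rng(0)
g0 = rng.normal(size=500)                                  # hidden driver
expr = np.column_stack([g0, g0 + 0.3 * rng.normal(size=500), rng.normal(size=500),
                        g0 + 0.3 * rng.normal(size=500)])  # toy expression matrix
print(mrmr_regulators(expr, target_idx=1, k=2))            # gene 0 (the driver) ranks first
```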

  16. Information-Theoretic Inference of Large Transcriptional Regulatory Networks

    Directory of Open Access Journals (Sweden)

    Patrick E. Meyer

    2007-06-01

    Full Text Available The paper presents MRNET, an original method for inferring genetic networks from microarray data. The method is based on maximum relevance/minimum redundancy (MRMR), an effective information-theoretic technique for feature selection in supervised learning. The MRMR principle consists in selecting among the least redundant variables the ones that have the highest mutual information with the target. MRNET extends this feature selection principle to networks in order to infer gene-dependence relationships from microarray data. The paper assesses MRNET by benchmarking it against RELNET, CLR, and ARACNE, three state-of-the-art information-theoretic methods for large (up to several thousands of genes) network inference. Experimental results on thirty synthetically generated microarray datasets show that MRNET is competitive with these methods.

  17. An integrated organisation-wide data quality management and information governance framework: theoretical underpinnings.

    Science.gov (United States)

    Liaw, Siaw-Teng; Pearce, Christopher; Liyanage, Harshana; Liaw, Gladys S S; de Lusignan, Simon

    2014-01-01

    Increasing investment in eHealth aims to improve cost effectiveness and safety of care. Data extraction and aggregation can create new data products to improve professional practice and provide feedback to improve the quality of source data. A previous systematic review concluded that locally relevant clinical indicators and use of clinical record systems could support clinical governance. We aimed to extend and update the review with a theoretical framework. We searched PubMed, Medline, Web of Science, ABI Inform (Proquest) and Business Source Premier (EBSCO) using the terms curation, information ecosystem, data quality management (DQM), data governance, information governance (IG) and data stewardship. We focused on and analysed the scope of DQM and IG processes, theoretical frameworks, and determinants of the processing, quality assurance, presentation and sharing of data across the enterprise. There are good theoretical reasons for integrated governance, but there is variable alignment of DQM, IG and health system objectives across the health enterprise. Ethical constraints exist that require health information ecosystems to process data in ways that are aligned with improving health and system efficiency and ensuring patient safety. Despite an increasingly 'big-data' environment, DQM and IG in health services are still fragmented across the data production cycle. We extend current work on DQM and IG with a theoretical framework for integrated IG across the data cycle. The dimensions of this theory-based framework would require testing with qualitative and quantitative studies to examine the applicability and utility, along with an evaluation of its impact on data quality across the health enterprise.

  18. Several foundational and information theoretic implications of Bell’s theorem

    Science.gov (United States)

    Kar, Guruprasad; Banik, Manik

    2016-08-01

    In 1935, Albert Einstein and two colleagues, Boris Podolsky and Nathan Rosen (EPR) developed a thought experiment to demonstrate what they felt was a lack of completeness in quantum mechanics (QM). EPR also postulated the existence of a more fundamental theory in which the physical reality of any system would be completely described by the variables/states of that fundamental theory. This variable is commonly called a hidden variable and the theory a hidden variable theory (HVT). In 1964, John Bell proposed an empirically verifiable criterion to test for the existence of these HVTs. He derived an inequality, which must be satisfied by any theory that fulfills the conditions of locality and reality. He also showed that QM, as it violates this inequality, is incompatible with any local-realistic theory. Later it was shown that Bell’s inequality (BI) can be derived from different sets of assumptions and that it also finds applications in useful information-theoretic protocols. In this review, we discuss various foundational as well as information-theoretic implications of BI. We also discuss a restricted nonlocal feature of quantum nonlocality and elaborate on the role of the uncertainty principle and the complementarity principle in explaining this feature.

  19. Theoretical basis for transfer of laboratory test results of grain size distribution of coal to real object

    Energy Technology Data Exchange (ETDEWEB)

    Sikora, W; Chodura, J [Politechnika Sladska, Gliwice (Poland). Instytut Mechanizacji Gornictwa

    1989-01-01

    Evaluates a method for forecasting size distribution of black coal mined by shearer loaders in one coal seam. Laboratory tests for determining coal comminution during cutting and haulage along the face are analyzed. Methods for forecasting grain size distribution of coal under operational conditions using formulae developed on the basis of laboratory tests are discussed. Recommendations for design of a test stand and test conditions are discussed. A laboratory stand should accurately model operational conditions of coal cutting, especially dimensions of the individual elements of the shearer loader, geometry of the cutting drum and cutting tools, and strength characteristics of the coal seam. 9 refs.

  20. The pressure distribution for biharmonic transmitting array: theoretical study

    Science.gov (United States)

    Baranowska, A.

    2005-03-01

    The aim of the paper is a theoretical analysis of the finite-amplitude wave interaction problem for the biharmonic transmitting array. We assume that the array consists of 16 circular pistons of the same dimensions that are grouped into two sections. Two different arrangements of radiating elements were considered. In this situation, the radiating surface is discontinuous and lacks axial symmetry. The mathematical model was built on the basis of the Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation. To solve the problem the finite-difference method was applied. On-axis pressure amplitude for different frequency waves as a function of distance from the source, transverse pressure distribution of these waves at fixed distances from the source and pressure amplitude distribution for them at fixed planes were examined. In particular, changes of the normalized pressure amplitude at the difference frequency were studied. The paper presents the mathematical model and some results of theoretical investigations obtained for different values of the source parameters.

  1. Effect of distributive mass of spring on power flow in engineering test

    Science.gov (United States)

    Sheng, Meiping; Wang, Ting; Wang, Minqing; Wang, Xiao; Zhao, Xuan

    2018-06-01

    The mass of a spring is usually neglected in theoretical and simulation analyses, although it may be significant in practical engineering. This paper is concerned with the distributive mass of a steel spring used as an isolator to simulate the isolation performance of a water pipe in a heating system. A theoretical derivation of the effect of the spring's distributive mass on vibration is presented, and multiple eigenfrequencies are obtained, which show that distributive mass results in extra modes and more complex impedance properties. Furthermore, numerical simulation visually shows several anti-resonances of the steel spring corresponding to the impedance and power flow curves. When anti-resonances emerge, the spring stores a large amount of energy, which may cause damage and unexpected consequences in practical engineering and needs to be avoided. Finally, experimental tests are conducted, and the results are consistent with the simulation of the spring with distributive mass.
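
    For intuition about why a spring's own mass adds resonances, the snippet below uses the standard fixed-fixed "surge" formula f_n = (n/2) * sqrt(k/m_s) with hypothetical stiffness and mass values; this textbook formula is an assumption here, not necessarily the paper's derivation.

```python
# Quick sketch of spring surge frequencies for a spring fixed at both ends
# (textbook formula, hypothetical parameter values): a massless-spring model has
# none of these modes, which is why distributed mass matters in practice.
import numpy as np

k = 2.0e4        # assumed spring stiffness, N/m
m_s = 0.5        # assumed distributed (coil) mass, kg
for n in range(1, 4):
    f_n = 0.5 * n * np.sqrt(k / m_s)
    print(f"surge mode {n}: {f_n:7.1f} Hz")   # extra resonances introduced by the coil mass
```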

  2. Theoretical Models, Assessment Frameworks and Test Construction.

    Science.gov (United States)

    Chalhoub-Deville, Micheline

    1997-01-01

    Reviews the usefulness of proficiency models influencing second language testing. Findings indicate that several factors contribute to the lack of congruence between models and test construction and make a case for distinguishing between theoretical models. Underscores the significance of an empirical, contextualized and structured approach to the…

  3. An integrated organisation-wide data quality management and information governance framework: theoretical underpinnings

    Directory of Open Access Journals (Sweden)

    Siaw-Teng Liaw

    2014-10-01

    Full Text Available Introduction: Increasing investment in eHealth aims to improve cost effectiveness and safety of care. Data extraction and aggregation can create new data products to improve professional practice and provide feedback to improve the quality of source data. A previous systematic review concluded that locally relevant clinical indicators and use of clinical record systems could support clinical governance. We aimed to extend and update the review with a theoretical framework. Methods: We searched PubMed, Medline, Web of Science, ABI Inform (Proquest) and Business Source Premier (EBSCO) using the terms curation, information ecosystem, data quality management (DQM), data governance, information governance (IG) and data stewardship. We focused on and analysed the scope of DQM and IG processes, theoretical frameworks, and determinants of the processing, quality assurance, presentation and sharing of data across the enterprise. Findings: There are good theoretical reasons for integrated governance, but there is variable alignment of DQM, IG and health system objectives across the health enterprise. Ethical constraints exist that require health information ecosystems to process data in ways that are aligned with improving health and system efficiency and ensuring patient safety. Despite an increasingly ‘big-data’ environment, DQM and IG in health services are still fragmented across the data production cycle. We extend current work on DQM and IG with a theoretical framework for integrated IG across the data cycle. Conclusions: The dimensions of this theory-based framework would require testing with qualitative and quantitative studies to examine the applicability and utility, along with an evaluation of its impact on data quality across the health enterprise.

  4. Biometric security from an information-theoretical perspective

    NARCIS (Netherlands)

    Ignatenko, T.; Willems, F.M.J.

    2012-01-01

    In this review, biometric systems are studied from an information theoretical point of view. In the first part biometric authentication systems are studied. The objective of these systems is, observing correlated enrollment and authentication biometric sequences, to generate or convey as large as

  5. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

    This paper studies the broad class of log-concave probability distributions that arise in the economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of the two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard rates are based on normalized spacings of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics...
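
    An illustrative sketch (not the paper's exact statistic) of a normalized-spacings check for increasing hazard rates: under an exponential (constant-hazard) null the normalized spacings are i.i.d., while an increasing hazard rate makes later spacings stochastically smaller.

```python
# Illustrative sketch: normalized spacings D_i = (n - i + 1) * (X_(i) - X_(i-1)) of
# the order statistics, and a simple pairwise trend statistic for increasing hazard.
import numpy as np

def normalized_spacings(sample):
    x = np.sort(np.asarray(sample))
    n = len(x)
    gaps = np.diff(np.concatenate(([0.0], x)))      # X_(i) - X_(i-1), with X_(0) = 0
    return (n - np.arange(n)) * gaps

def ifr_trend_statistic(sample):
    """Fraction of pairs (i < j) with D_i > D_j; values above 0.5 suggest increasing hazard."""
    d = normalized_spacings(sample)
    i, j = np.triu_indices(len(d), k=1)
    return float(np.mean(d[i] > d[j]))

rng = np.random.default_rng(1)
print(ifr_trend_statistic(rng.exponential(size=200)))    # near 0.5 (constant hazard)
print(ifr_trend_statistic(rng.weibull(2.0, size=200)))   # above 0.5 (increasing hazard)
```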

  6. Model selection and inference a practical information-theoretic approach

    CERN Document Server

    Burnham, Kenneth P

    1998-01-01

    This book is unique in that it covers the philosophy of model-based data analysis and an omnibus strategy for the analysis of empirical data. The book introduces information theoretic approaches and focuses critical attention on a priori modeling and the selection of a good approximating model that best represents the inference supported by the data. Kullback-Leibler information represents a fundamental quantity in science and is Hirotugu Akaike's basis for model selection. The maximized log-likelihood function can be bias-corrected to provide an estimate of expected, relative Kullback-Leibler information. This leads to Akaike's Information Criterion (AIC) and various extensions, and these are relatively simple and easy to use in practice, but little taught in statistics classes and far less understood in the applied sciences than should be the case. The information theoretic approaches provide a unified and rigorous theory, an extension of likelihood theory, an important application of information theory, and are ...
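
    A minimal sketch of AIC-based model selection in the spirit of the book, on hypothetical data: AIC = 2k - 2 ln L, and smaller values indicate less estimated Kullback-Leibler information loss relative to the other candidates.

```python
# Minimal AIC sketch on hypothetical data: fit several candidate distributions
# and rank them by AIC = 2k - 2*log-likelihood (smaller is better).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
data = rng.gamma(shape=3.0, scale=2.0, size=300)      # "unknown" generating process

candidates = {
    "exponential (k=1)": (stats.expon, 1),
    "gamma (k=2)":       (stats.gamma, 2),
    "lognormal (k=2)":   (stats.lognorm, 2),
}
for name, (dist, k) in candidates.items():
    params = dist.fit(data, floc=0)                   # fix location at 0 for simplicity
    loglik = np.sum(dist.logpdf(data, *params))
    print(f"{name:18s}  AIC = {2 * k - 2 * loglik:8.1f}")
```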

  7. Nonlocal correlations as an information-theoretic resource

    International Nuclear Information System (INIS)

    Barrett, Jonathan; Massar, Serge; Pironio, Stefano; Linden, Noah; Popescu, Sandu; Roberts, David

    2005-01-01

    It is well known that measurements performed on spatially separated entangled quantum systems can give rise to correlations that are nonlocal, in the sense that a Bell inequality is violated. They cannot, however, be used for superluminal signaling. It is also known that it is possible to write down sets of 'superquantum' correlations that are more nonlocal than is allowed by quantum mechanics, yet are still nonsignaling. Viewed as an information-theoretic resource, superquantum correlations are very powerful at reducing the amount of communication needed for distributed computational tasks. An intriguing question is why quantum mechanics does not allow these more powerful correlations. We aim to shed light on the range of quantum possibilities by placing them within a wider context. With this in mind, we investigate the set of correlations that are constrained only by the no-signaling principle. These correlations form a polytope, which contains the quantum correlations as a (proper) subset. We determine the vertices of the no-signaling polytope in the case that two observers each choose from two possible measurements with d outcomes. We then consider how interconversions between different sorts of correlations may be achieved. Finally, we consider some multipartite examples
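
    A small sketch contrasting the correlations discussed here via the CHSH combination S = E(0,0) + E(0,1) + E(1,0) - E(1,1): the PR box reaches the no-signaling maximum of 4, while quantum correlations reach only 2*sqrt(2); one standard sign convention for the Bell-state correlator is assumed below.

```python
# CHSH values for three families of correlations: superquantum (PR box), quantum,
# and the local-realistic (Bell) bound.
import numpy as np

def chsh(correlator):
    return correlator(0, 0) + correlator(0, 1) + correlator(1, 0) - correlator(1, 1)

# PR box: outcomes satisfy a XOR b = x AND y, so E(x, y) = 1 unless x = y = 1.
pr_box = lambda x, y: 1.0 if not (x and y) else -1.0
# Optimal quantum strategy (assumed correlator E(x, y) = cos of the analyzer angle difference).
angles_a, angles_b = [0.0, np.pi / 2], [np.pi / 4, -np.pi / 4]
quantum = lambda x, y: np.cos(angles_a[x] - angles_b[y])

print("PR box  :", chsh(pr_box))     # 4, the no-signaling maximum
print("quantum :", chsh(quantum))    # 2*sqrt(2) ~ 2.83, the Tsirelson bound
print("local   :", 2)                # the Bell (local-realistic) bound
```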

  8. An Information-Theoretic Approach to PMU Placement in Electric Power Systems

    OpenAIRE

    Li, Qiao; Cui, Tao; Weng, Yang; Negi, Rohit; Franchetti, Franz; Ilic, Marija D.

    2012-01-01

    This paper presents an information-theoretic approach to address the phasor measurement unit (PMU) placement problem in electric power systems. Different from the conventional 'topological observability' based approaches, this paper advocates a much more refined, information-theoretic criterion, namely the mutual information (MI) between the PMU measurements and the power system states. The proposed MI criterion can not only include the full system observability as a special case, but also ca...
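
    A hedged sketch of the mutual-information criterion under an assumed linear-Gaussian measurement model y_S = H_S x + noise (the toy matrices below are not an actual grid model): PMU locations are added greedily by the gain in I(x; y_S) = 0.5 * logdet(I + H_S P H_S^T / sigma^2).

```python
# Greedy MI-based placement sketch under a linear-Gaussian assumption (toy system).
import numpy as np

def mutual_information(H_rows, P, sigma2):
    H = np.atleast_2d(H_rows)
    M = np.eye(H.shape[0]) + H @ P @ H.T / sigma2
    return 0.5 * np.linalg.slogdet(M)[1]          # 0.5 * log det(M)

def greedy_placement(H, P, sigma2, budget):
    chosen = []
    for _ in range(budget):
        remaining = [i for i in range(H.shape[0]) if i not in chosen]
        gain = lambda i: mutual_information(H[chosen + [i]], P, sigma2)
        chosen.append(max(remaining, key=gain))   # location with the largest MI gain
    return chosen

rng = np.random.default_rng(3)
H = rng.normal(size=(8, 4))       # 8 candidate measurement locations, 4 system states
P = np.eye(4)                     # prior state covariance
print(greedy_placement(H, P, sigma2=0.1, budget=3))
```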

  9. Model-Driven Test Generation of Distributed Systems

    Science.gov (United States)

    Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin

    2012-01-01

    This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.

  10. Information-theoretic decomposition of embodied and situated systems.

    Science.gov (United States)

    Da Rold, Federico

    2018-07-01

    The embodied and situated view of cognition stresses the importance of real-time and nonlinear bodily interaction with the environment for developing concepts and structuring knowledge. In this article, populations of robots controlled by an artificial neural network learn a wall-following task through artificial evolution. At the end of the evolutionary process, time series are recorded from perceptual and motor neurons of selected robots. Information-theoretic measures are estimated on pairings of variables to unveil nonlinear interactions that structure the agent-environment system. Specifically, the mutual information is utilized to quantify the degree of dependence and the transfer entropy to detect the direction of the information flow. Furthermore, the system is analyzed with the local form of such measures, thus capturing the underlying dynamics of information. Results show that different measures are interdependent and complementary in uncovering aspects of the robots' interaction with the environment, as well as characteristics of the functional neural structure. Therefore, the set of information-theoretic measures provides a decomposition of the system, capturing the intricacy of nonlinear relationships that characterize robots' behavior and neural dynamics. Copyright © 2018 Elsevier Ltd. All rights reserved.
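
    A rough sketch of the two measures used in the article, estimated here with simple plug-in (binned) estimators rather than the paper's estimators; the lag-one coupled signals are synthetic stand-ins for the perceptual and motor time series.

```python
# Plug-in estimates of mutual information and transfer entropy
# TE(X -> Y) = I(Y_t ; X_{t-1} | Y_{t-1}) from binned time series (toy data).
import numpy as np

def entropy(*cols, bins=6):
    joint, _ = np.histogramdd(np.column_stack(cols), bins=bins)
    p = joint.ravel() / joint.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def mutual_information(x, y):
    return entropy(x) + entropy(y) - entropy(x, y)

def transfer_entropy(x, y):
    yt, yp, xp = y[1:], y[:-1], x[:-1]            # Y_t, Y_{t-1}, X_{t-1}
    return entropy(yt, yp) + entropy(yp, xp) - entropy(yp) - entropy(yt, yp, xp)

rng = np.random.default_rng(4)
x = rng.normal(size=2000)
y = np.roll(x, 1) + 0.5 * rng.normal(size=2000)   # y driven by x with a one-step lag
print("I(X;Y)   =", round(mutual_information(x, y), 3))
print("TE(X->Y) =", round(transfer_entropy(x, y), 3))
print("TE(Y->X) =", round(transfer_entropy(y, x), 3))   # should be smaller
```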

  11. Parametric sensitivity analysis for stochastic molecular systems using information theoretic metrics

    Energy Technology Data Exchange (ETDEWEB)

    Tsourtis, Anastasios, E-mail: tsourtis@uoc.gr [Department of Mathematics and Applied Mathematics, University of Crete, Crete (Greece); Pantazis, Yannis, E-mail: pantazis@math.umass.edu; Katsoulakis, Markos A., E-mail: markos@math.umass.edu [Department of Mathematics and Statistics, University of Massachusetts, Amherst, Massachusetts 01003 (United States); Harmandaris, Vagelis, E-mail: harman@uoc.gr [Department of Mathematics and Applied Mathematics, University of Crete, and Institute of Applied and Computational Mathematics (IACM), Foundation for Research and Technology Hellas (FORTH), GR-70013 Heraklion, Crete (Greece)

    2015-07-07

    In this paper, we present a parametric sensitivity analysis (SA) methodology for continuous time and continuous space Markov processes represented by stochastic differential equations. In particular, we focus on stochastic molecular dynamics as described by the Langevin equation. The utilized SA method is based on the computation of the information-theoretic (and thermodynamic) quantity of relative entropy rate (RER) and the associated Fisher information matrix (FIM) between path distributions, and it is an extension of the work proposed by Y. Pantazis and M. A. Katsoulakis [J. Chem. Phys. 138, 054115 (2013)]. A major advantage of the pathwise SA method is that both RER and pathwise FIM depend only on averages of the force field; therefore, they are tractable and computable as ergodic averages from a single run of the molecular dynamics simulation both in equilibrium and in non-equilibrium steady state regimes. We validate the performance of the extended SA method on two different molecular stochastic systems, a standard Lennard-Jones fluid and an all-atom methane liquid, and compare the obtained parameter sensitivities with parameter sensitivities on three popular and well-studied observable functions, namely, the radial distribution function, the mean squared displacement, and the pressure. Results show that the RER-based sensitivities are highly correlated with the observable-based sensitivities.

  12. Score distributions in information retrieval

    NARCIS (Netherlands)

    Arampatzis, A.; Robertson, S.; Kamps, J.

    2009-01-01

    We review the history of modeling score distributions, focusing on the mixture of normal-exponential by investigating the theoretical as well as the empirical evidence supporting its use. We discuss previously suggested conditions which valid binary mixture models should satisfy, such as the

  13. Information Ergonomics A theoretical approach and practical experience in transportation

    CERN Document Server

    Sandl, Peter

    2012-01-01

    The variety and increasing availability of hypermedia information systems, which are used in stationary applications like operators’ consoles as well as mobile systems, e.g. driver information and navigation systems in automobiles form a foundation for the mediatization of the society. From the human engineering point of view this development and the ensuing increased importance of information systems for economic and private needs require careful deliberation of the derivation and application of ergonomics methods particularly in the field of information systems. This book consists of two closely intertwined parts. The first, theoretical part defines the concept of an information system, followed by an explanation of action regulation as well as cognitive theories to describe man information system interaction. A comprehensive description of information ergonomics concludes the theoretical approach. In the second, practically oriented part of this book authors from industry as well as from academic institu...

  14. An information-theoretic approach to the modeling and analysis of whole-genome bisulfite sequencing data.

    Science.gov (United States)

    Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John

    2018-03-07

    DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high resolution genome-wide methylation profiles. Statistical modeling and analysis are employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied to single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrates a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data. This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of
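
    As a toy illustration of the Jensen-Shannon distance used for differential analysis (binned methylation-level histograms with made-up probabilities, not the paper's Ising-model posteriors):

```python
# Jensen-Shannon distance (square root of the JS divergence, base 2) between a
# reference and a test methylation distribution; the histograms are hypothetical.
import numpy as np

def js_distance(p, q, base=2):
    p, q = np.asarray(p, float), np.asarray(q, float)
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * (np.log(a[mask] / b[mask]) / np.log(base)))
    return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))

reference = [0.70, 0.20, 0.10]   # hypothetical P(methylation level) in normal tissue
test      = [0.30, 0.30, 0.40]   # shifted distribution in a tumour sample
print(js_distance(reference, test))   # 0 = identical, 1 = maximally different (base 2)
```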

  15. Towards integrating control and information theories from information-theoretic measures to control performance limitations

    CERN Document Server

    Fang, Song; Ishii, Hideaki

    2017-01-01

    This book investigates the performance limitation issues in networked feedback systems. The fact that networked feedback systems consist of control and communication devices and systems calls for the integration of control theory and information theory. The primary contributions of this book lie in two aspects: the newly-proposed information-theoretic measures and the newly-discovered control performance limitations. We first propose a number of information notions to facilitate the analysis. Using those notions, classes of performance limitations of networked feedback systems, as well as state estimation systems, are then investigated. In general, the book presents a unique, cohesive treatment of performance limitation issues of networked feedback systems via an information-theoretic approach. This book is believed to be the first to treat the aforementioned subjects systematically and in a unified manner, offering a unique perspective differing from existing books.

  16. THEORETICAL ASPECTS OF INFORMATIONAL SERVICES REGIONAL MARKET EFFECTIVE DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    I.N. Korabejnikov

    2008-12-01

    Full Text Available The peculiarities and priorities of the formation of the regional market for informational services, as part of a network model of economic development, are described in this article. The authors present a classification of the factors that influence the effectiveness of the development of the regional market for informational services. Theoretical aspects of the effective development of this market are presented.

  17. Theoretical information reuse and integration

    CERN Document Server

    Rubin, Stuart

    2016-01-01

    Information Reuse and Integration addresses the efficient extension and creation of knowledge through the exploitation of Kolmogorov complexity in the extraction and application of domain symmetry. Knowledge that seems to be novel can more often than not be recast as the image of a sequence of transformations, which yield symmetric knowledge. When the size of those transformations and/or the length of that sequence of transforms exceeds the size of the image, then that image is said to be novel or random. It may also be that the new knowledge is random in that no such sequence of transforms producing it exists, or at least none is known. The nine chapters comprising this volume incorporate symmetry, reuse, and integration as overt operational procedures or as operations built into the formal representations of data and operators employed. Either way, the aforementioned theoretical underpinnings of information reuse and integration are supported.

  18. Research on the novel FBG detection system for temperature and strain field distribution

    Science.gov (United States)

    Liu, Zhi-chao; Yang, Jin-hua

    2017-10-01

    In order to collect temperature and strain field distribution information, a novel FBG detection system was designed. The system applies a linearly chirped FBG structure to obtain a large bandwidth. The FBG cover was designed with a linearly varying thickness so that different locations have different responses. It can obtain temperature and strain field distribution information simultaneously from the reflection spectrum. The structure of the FBG cover was designed, its theoretical response function was calculated, and its solution was derived for the strain field distribution. Simulation analysis examined the trends of the temperature and strain field distribution under different strain strengths and loading positions, showing that the strain field distribution can be resolved. FOB100 series equipment was used to measure temperature in the experiments, and JSM-A10 series equipment was used to measure the strain field distribution. The average experimental error was better than 1.1% for temperature and better than 1.3% for strain. Individual errors occurred in the test data when the strain was small. Feasibility is demonstrated by theoretical analysis, simulation and experiment, and the system is well suited to practical applications.

  19. Information-theoretic signatures of biodiversity in the barcoding gene.

    Science.gov (United States)

    Barbosa, Valmir C

    2018-08-14

    Analyzing the information content of DNA, though holding the promise to help quantify how the processes of evolution have led to information gain throughout the ages, has remained an elusive goal. Paradoxically, one of the main reasons for this has been precisely the great diversity of life on the planet: if on the one hand this diversity is a rich source of data for information-content analysis, on the other hand there is so much variation as to make the task unmanageable. During the past decade or so, however, succinct fragments of the COI mitochondrial gene, which is present in all animal phyla and in a few others, have been shown to be useful for species identification through DNA barcoding. A few million such fragments are now publicly available through the BOLD systems initiative, thus providing an unprecedented opportunity for relatively comprehensive information-theoretic analyses of DNA to be attempted. Here we show how a generalized form of total correlation can yield distinctive information-theoretic descriptors of the phyla represented in those fragments. In order to illustrate the potential of this analysis to provide new insight into the evolution of species, we performed principal component analysis on standardized versions of the said descriptors for 23 phyla. Surprisingly, we found that, though based solely on the species represented in the data, the first principal component correlates strongly with the natural logarithm of the number of all known living species for those phyla. The new descriptors thus constitute clear information-theoretic signatures of the processes whereby evolution has given rise to current biodiversity, which suggests their potential usefulness in further related studies. Copyright © 2018 Elsevier Ltd. All rights reserved.
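
    A simplified sketch of the underlying quantity, plug-in total correlation C(X1,...,Xk) = sum_i H(X_i) - H(X_1,...,X_k) over nucleotide positions of equal-length toy fragments; the generalized measure and the BOLD data used in the paper are not reproduced here.

```python
# Plug-in total correlation across nucleotide positions of a few toy "barcode"
# strings (each position is one variable; the whole sequence is the joint symbol).
from collections import Counter
from math import log2

def entropy(symbols):
    counts = Counter(symbols)
    n = sum(counts.values())
    return -sum(c / n * log2(c / n) for c in counts.values())

def total_correlation(seqs):
    positions = list(zip(*seqs))                       # column i = values of X_i
    marginal = sum(entropy(col) for col in positions)  # sum of per-position entropies
    joint = entropy(seqs)                              # entropy of whole sequences
    return marginal - joint

toy_barcodes = ["ACGTAC", "ACGTAC", "ACGTTC", "TGCATG", "TGCATG"]   # hypothetical fragments
print(round(total_correlation(toy_barcodes), 3))
```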

  20. STRUCTURAL AND METHODICAL MODEL OF INCREASING THE LEVEL OF THEORETICAL TRAINING OF CADETS USING INFORMATION AND COMMUNICATION TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    Vladislav V. Bulgakov

    2018-03-01

    Full Text Available The features of training in higher educational institutions of the EMERCOM of Russia system demand the introduction of new educational techniques and technical means aimed at intensifying the educational process, enabling cadets to prepare independently at any time and improving the quality of their theoretical knowledge. The authors have developed a structural and methodological model for increasing the level of theoretical training of cadets using information and communication technologies. The proposed model, which includes elements to stimulate and enhance cognitive activity, makes it possible to generate a trajectory of theoretical training for cadets over the entire period of study at the university, and to organize systematic independent work as well as objective current and final control of theoretical knowledge. The model consists of three main elements: a base of theoretical questions and the functional modules "teacher" and "cadet". The basis of the model is the base of theoretical questions, developed for all disciplines of specialty 20.05.01 (fire safety). The functional module "teacher" allows the instructor to create theoretical questions of various kinds, to edit questions and delete them from the database if necessary, and to create tests and monitor their completion. The functional module "cadet" provides ample opportunities for theoretical training through independent work, testing for current and final control, a game form of training in the form of a duel, and the presentation of cadets' results in the form of statistics and rankings. The structural and methodical model for increasing the level of theoretical training of cadets is implemented in practice in the form of a multi-level automated system

  1. The Theoretical Principles of the Organization of Information Systems.

    Science.gov (United States)

    Kulikowski, Juliusz Lech

    A survey of the theoretical problems connected with the organization and design of systems for processing and transmitting information is presented in this article. It gives a definition of Information Systems (IS) and classifies them from various points of view. It discusses briefly the most important aspects of the organization of IS, such as…

  2. Theoretical aspects of cellular decision-making and information-processing.

    Science.gov (United States)

    Kobayashi, Tetsuya J; Kamimura, Atsushi

    2012-01-01

    Microscopic biological processes have extraordinary complexity and variety at the sub-cellular, intra-cellular, and multi-cellular levels. In dealing with such complex phenomena, conceptual and theoretical frameworks are crucial because they enable us to understand seemingly different intra- and inter-cellular phenomena from unified viewpoints. Decision-making is one such concept that has attracted much attention recently. Since many cellular behaviors can be regarded as processes that produce specific actions in response to external stimuli, decision-making can cover, and has been used to explain, a broad range of different cellular phenomena [Balázsi et al. (Cell 144(6):910, 2011), Zeng et al. (Cell 141(4):682, 2010)]. Decision-making is also closely related to cellular information-processing because appropriate decisions cannot be made without exploiting the information that the external stimuli contain. Efficiency of information transduction and processing by intra-cellular networks determines the amount of information obtained, which in turn limits the efficiency of subsequent decision-making. Furthermore, information-processing itself can serve as another concept that is crucial for understanding biological processes other than decision-making. In this work, we review recent theoretical developments on cellular decision-making and information-processing by focusing on the relation between these two concepts.

  3. Utility of Web search query data in testing theoretical assumptions about mephedrone.

    Science.gov (United States)

    Kapitány-Fövény, Máté; Demetrovics, Zsolt

    2017-05-01

    With growing access to the Internet, people who use drugs and traffickers started to obtain information about novel psychoactive substances (NPS) via online platforms. This paper aims to analyze whether a decreasing Web interest in formerly banned substances (cocaine, heroin, and MDMA) and the legislative status of mephedrone predict Web interest in this NPS. Google Trends was used to measure changes in Web interest in cocaine, heroin, MDMA, and mephedrone. Google search results for mephedrone within the same time frame were analyzed and categorized. Web interest in the classic drugs was found to be more persistent. Regarding geographical distribution, the locations of Web searches for heroin and cocaine were less centralized. The illicit status of mephedrone was a negative predictor of its Web search query rates. The connection between mephedrone-related Web search rates and the legislative status of this substance was significantly mediated by ecstasy-related Web search queries, the number of documentaries, and forum/blog entries about mephedrone. The results might provide support for the hypothesis that mephedrone's popularity was highly correlated with its legal status and that it functioned as a potential substitute for MDMA. Google Trends was found to be a useful tool for testing theoretical assumptions about NPS. Copyright © 2017 John Wiley & Sons, Ltd.

  4. Physics Without Physics. The Power of Information-theoretical Principles

    Science.gov (United States)

    D'Ariano, Giacomo Mauro

    2017-01-01

    David Finkelstein was very fond of the new information-theoretic paradigm of physics advocated by John Archibald Wheeler and Richard Feynman. Only recently, however, the paradigm has concretely shown its full power, with the derivation of quantum theory (Chiribella et al., Phys. Rev. A 84:012311, 2011; D'Ariano et al., 2017) and of free quantum field theory (D'Ariano and Perinotti, Phys. Rev. A 90:062106, 2014; Bisio et al., Phys. Rev. A 88:032301, 2013; Bisio et al., Ann. Phys. 354:244, 2015; Bisio et al., Ann. Phys. 368:177, 2016) from informational principles. The paradigm has opened for the first time the possibility of avoiding physical primitives in the axioms of the physical theory, allowing a re-foundation of the whole physics over logically solid grounds. In addition to such methodological value, the new information-theoretic derivation of quantum field theory is particularly interesting for establishing a theoretical framework for quantum gravity, with the idea of obtaining gravity itself as emergent from the quantum information processing, as also suggested by the role played by information in the holographic principle (Susskind, J. Math. Phys. 36:6377, 1995; Bousso, Rev. Mod. Phys. 74:825, 2002). In this paper I review how free quantum field theory is derived without using mechanical primitives, including space-time, special relativity, Hamiltonians, and quantization rules. The theory is simply provided by the simplest quantum algorithm encompassing a countable set of quantum systems whose network of interactions satisfies the three following simple principles: homogeneity, locality, and isotropy. The inherent discrete nature of the informational derivation leads to an extension of quantum field theory in terms of a quantum cellular automata and quantum walks. A simple heuristic argument sets the scale to the Planck one, and the currently observed regime where discreteness is not visible is the so-called "relativistic regime" of small wavevectors, which

  5. Expectancy-Violation and Information-Theoretic Models of Melodic Complexity

    Directory of Open Access Journals (Sweden)

    Tuomas Eerola

    2016-07-01

    Full Text Available The present study assesses two types of models for melodic complexity: one based on expectancy violations and the other related to an information-theoretic account of redundancy in music. Seven different datasets spanning artificial sequences, folk songs and pop songs were used to refine and assess the models. The refinement eliminated unnecessary components from both types of models. The final analysis pitted three variants of the two model types against each other and could explain 46-74% of the variance in the ratings across the datasets. The most parsimonious models were identified with an information-theoretic criterion. This suggested that the simplified expectancy-violation models were the most efficient for these sets of data. However, the differences between all optimized models were subtle in terms of both performance and simplicity.

  6. Theoretical Model of Development of Information Competence among Students Enrolled in Elective Courses

    Science.gov (United States)

    Zhumasheva, Anara; Zhumabaeva, Zaida; Sakenov, Janat; Vedilina, Yelena; Zhaxylykova, Nuriya; Sekenova, Balkumis

    2016-01-01

    The current study focuses on the research topic of creating a theoretical model of development of information competence among students enrolled in elective courses. In order to examine specific features of the theoretical model of development of information competence among students enrolled in elective courses, we performed an analysis of…

  7. Information theoretic analysis of canny edge detection in visual communication

    Science.gov (United States)

    Jiang, Bo; Rahman, Zia-ur

    2011-06-01

    In general edge detection evaluation, the edge detectors are examined, analyzed, and compared either visually or with a metric for a specific application. This analysis is usually independent of the characteristics of the image-gathering, transmission and display processes that do impact the quality of the acquired image and thus the resulting edge image. We propose a new information theoretic analysis of edge detection that unites the different components of the visual communication channel and assesses edge detection algorithms in an integrated manner based on Shannon's information theory. The edge detection algorithm here is considered to achieve high performance only if the information rate from the scene to the edge approaches the maximum possible. Thus, by setting initial conditions of the visual communication system as constant, different edge detection algorithms could be evaluated. This analysis is normally limited to linear shift-invariant filters, so in order to examine the Canny edge operator in our proposed system, we need to estimate its "power spectral density" (PSD). Since the Canny operator is non-linear and shift variant, we perform the estimation for a set of different system environment conditions using simulations. In our paper we will first introduce the PSD of the Canny operator for a range of system parameters. Then, using the estimated PSD, we will assess the Canny operator using information theoretic analysis. The information-theoretic metric is also used to compare the performance of the Canny operator with other edge-detection operators. This also provides a simple tool for selecting appropriate edge-detection algorithms based on system parameters, and for adjusting their parameters to maximize information throughput.

  8. Information-theoretic temporal Bell inequality and quantum computation

    International Nuclear Information System (INIS)

    Morikoshi, Fumiaki

    2006-01-01

    An information-theoretic temporal Bell inequality is formulated to contrast classical and quantum computations. Any classical algorithm satisfies the inequality, while quantum ones can violate it. Therefore, the violation of the inequality is an immediate consequence of the quantumness in the computation. Furthermore, this approach suggests a notion of temporal nonlocality in quantum computation

  9. Institutional and structural barriers to HIV testing: elements for a theoretical framework.

    Science.gov (United States)

    Meyerson, Beth; Barnes, Priscilla; Emetu, Roberta; Bailey, Marlon; Ohmit, Anita; Gillespie, Anthony

    2014-01-01

    Stigma is a barrier to HIV health seeking, but little is known about institutional and structural expressions of stigma in HIV testing. This study examines evidence of institutional and structural stigma in the HIV testing process. A qualitative, grounded theory study was conducted using secondary data from a 2011 HIV test site evaluation in a Midwestern, moderate-HIV-incidence state. Expressions of structural and institutional stigma were found at over half of the testing sites and at three stages of the HIV testing visit. Examples of structural stigma included social geography, organization, and staff behavior at first encounter and reception, and staff behavior when experiencing the actual HIV test. Institutional stigma was socially expressed through staff behavior at entry/reception and when experiencing the HIV test. The emerging elements demonstrate the potential compounding of stigma experiences with deleterious effect. Study findings may inform future development of a theoretical framework. In practice, findings can guide organizations seeking to reduce HIV testing barriers, as they provide a window into how test seekers experience HIV test sites at first encounter, entry/reception, and at testing stages; and can identify how stigma might be intensified by structural and institutional expressions.

  10. Simulation of Daily Weather Data Using Theoretical Probability Distributions.

    Science.gov (United States)

    Bruhn, J. A.; Fry, W. E.; Fick, G. W.

    1980-09-01

    A computer simulation model was constructed to supply daily weather data to a plant disease management model for potato late blight. In the weather model, Monte Carlo techniques were employed to generate daily values of precipitation, maximum temperature, minimum temperature, minimum relative humidity and total solar radiation. Each weather variable is described by a known theoretical probability distribution, but the values of the parameters describing each distribution are dependent on the occurrence of rainfall. Precipitation occurrence is described by a first-order Markov chain. The amount of rain, given that rain has occurred, is described by a gamma probability distribution. Maximum and minimum temperature are simulated with a trivariate normal probability distribution involving maximum temperature on the previous day, maximum temperature on the current day and minimum temperature on the current day. Parameter values for this distribution are dependent on the occurrence of rain on the previous day. Both minimum relative humidity and total solar radiation are assumed to be normally distributed. The values of the parameters describing the distribution of minimum relative humidity are dependent on rainfall occurrence on the previous day and current day. Parameter values for total solar radiation are dependent on the occurrence of rain on the current day. The assumptions made during model construction were found to be appropriate for actual weather data from Geneva, New York. The performance of the weather model was evaluated by comparing the cumulative frequency distributions of simulated weather data with the distributions of actual weather data from Geneva, New York and Fort Collins, Colorado. For each location, simulated weather data were similar to actual weather data in terms of mean response, variability and autocorrelation. The possible applications of this model when used with models of other components of the agro-ecosystem are discussed.
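
    A condensed sketch of the precipitation component only, with illustrative parameter values rather than those fitted for Geneva or Fort Collins: a first-order Markov chain generates wet/dry occurrence and a gamma draw supplies the amount on wet days.

```python
# Precipitation-only sketch of the weather generator: two-state Markov chain for
# wet/dry occurrence, gamma-distributed rainfall amounts on wet days (toy values).
import numpy as np

rng = np.random.default_rng(5)
p_wet_given_dry, p_wet_given_wet = 0.25, 0.60      # assumed Markov transition probabilities
gamma_shape, gamma_scale = 0.8, 8.0                # assumed rainfall amount parameters (mm)

def simulate_precip(n_days, wet_yesterday=False):
    series = []
    for _ in range(n_days):
        p_wet = p_wet_given_wet if wet_yesterday else p_wet_given_dry
        wet_yesterday = rng.random() < p_wet
        series.append(rng.gamma(gamma_shape, gamma_scale) if wet_yesterday else 0.0)
    return np.array(series)

rain = simulate_precip(365)
print("wet days:", int((rain > 0).sum()), " total (mm):", round(rain.sum(), 1))
```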

  11. Testing can counteract proactive interference by integrating competing information

    Science.gov (United States)

    Wahlheim, Christopher N.

    2015-01-01

    Testing initially learned information before presenting new information has been shown to counteract the deleterious effects of proactive interference by segregating competing sources of information. The present experiments were conducted to demonstrate that testing can also have its effects in part by integrating competing information. Variations of classic A–B, A–D paired-associate learning paradigms were employed that included two lists of word pairs and a cued-recall test. Repeated pairs appeared in both lists (A–B, A–B), control pairs appeared in List 2 only (A–B, C–D), and changed pairs appeared with the same cue in both lists but with different responses (A–B, A–D). The critical manipulation was whether pairs were tested or restudied in an interpolated phase that occurred between Lists 1 and 2. On a final cued-recall test, participants recalled List 2 responses and then indicated when they recollected that responses had earlier changed between lists. The change recollection measure indexed the extent to which competing responses were integrated during List 2. Change was recollected more often for tested than for restudied pairs. Proactive facilitation was obtained in cued recall when change was recollected, whereas proactive interference was obtained when change was not recollected. These results provide evidence that testing counteracted proactive interference in part by making List 1 responses more accessible during List 2, thus promoting integration and increasing later recollection of change. These results have theoretical implications because they show that testing can counteract proactive interference by integrating or segregating competing information. PMID:25120241

  13. Correlated Sources in Distributed Networks--Data Transmission, Common Information Characterization and Inferencing

    Science.gov (United States)

    Liu, Wei

    2011-01-01

    Correlation is often present among observations in a distributed system. This thesis deals with various design issues when correlated data are observed at distributed terminals, including: communicating correlated sources over interference channels, characterizing the common information among dependent random variables, and testing the presence of…

  14. Experimental Verification of a Jarzynski-Related Information-Theoretic Equality by a Single Trapped Ion.

    Science.gov (United States)

    Xiong, T P; Yan, L L; Zhou, F; Rehan, K; Liang, D F; Chen, L; Yang, W L; Ma, Z H; Feng, M; Vedral, V

    2018-01-05

    Most nonequilibrium processes in thermodynamics are quantified only by inequalities; however, the Jarzynski relation presents a remarkably simple and general equality relating nonequilibrium quantities with the equilibrium free energy, and this equality holds in both the classical and quantum regimes. We report a single-spin test and confirmation of the Jarzynski relation in the quantum regime using a single ultracold ^{40}Ca^{+} ion trapped in a harmonic potential, based on a general information-theoretic equality for a temporal evolution of the system sandwiched between two projective measurements. By considering both initially pure and mixed states, respectively, we verify, in an exact and fundamental fashion, the nonequilibrium quantum thermodynamics relevant to the mutual information and Jarzynski equality.
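
    For reference, the Jarzynski equality that the experiment tests can be written in its standard textbook form (this is the general statement, not a formula quoted from the paper itself):

```latex
% Jarzynski equality: the ensemble average of the exponentiated nonequilibrium
% work W equals the exponentiated equilibrium free-energy difference \Delta F.
\begin{equation}
  \left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F},
  \qquad \beta = \frac{1}{k_{\mathrm{B}} T}.
\end{equation}
```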

  15. Information Theoretic Tools for Parameter Fitting in Coarse Grained Models

    KAUST Repository

    Kalligiannaki, Evangelia; Harmandaris, Vagelis; Katsoulakis, Markos A.; Plechac, Petr

    2015-01-01

    We study the application of information theoretic tools for model reduction in the case of systems driven by stochastic dynamics out of equilibrium. The model/dimension reduction is considered by proposing parametrized coarse grained dynamics

  16. Theoretical size distribution of fossil taxa: analysis of a null model

    Directory of Open Access Journals (Sweden)

    Hughes Barry D

    2007-03-01

    Full Text Available Background: This article deals with the theoretical size distribution (of number of sub-taxa) of a fossil taxon arising from a simple null model of macroevolution. Model: New species arise through speciations occurring independently and at random at a fixed probability rate, while extinctions either occur independently and at random (background extinctions) or cataclysmically. In addition, new genera are assumed to arise through speciations of a very radical nature, again assumed to occur independently and at random at a fixed probability rate. Conclusion: The size distributions of the pioneering genus (following a cataclysm) and of derived genera are determined. Also the distribution of the number of genera is considered, along with a comparison of the probability of a monospecific genus with that of a monogeneric family.
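
    The within-genus part of this null model is a linear birth-death process and can be simulated directly. The sketch below is illustrative only: the speciation and background-extinction rates and the time horizon are assumed values, and the cataclysm and genus-founding events of the full model are not reproduced.

```python
# Minimal sketch of the within-genus null model: each species speciates at rate LAMBDA
# and suffers background extinction at rate MU, independently of all others.
# Rates and horizon are illustrative assumptions, not values from the article.
import numpy as np

rng = np.random.default_rng(0)
LAMBDA, MU, T_MAX = 1.0, 0.4, 5.0   # per-species rates and time horizon (arbitrary units)

def genus_size_at(t_max: float) -> int:
    """Gillespie simulation of the species count in one genus, from a single founder."""
    n, t = 1, 0.0
    while n > 0:
        total_rate = n * (LAMBDA + MU)
        t += rng.exponential(1.0 / total_rate)
        if t > t_max:
            break
        n += 1 if rng.random() < LAMBDA / (LAMBDA + MU) else -1
    return n

sizes = np.array([genus_size_at(T_MAX) for _ in range(10_000)])
print("P(monospecific genus | survival) ≈", np.mean(sizes[sizes > 0] == 1))
```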

  17. A game-theoretic approach to real-time system testing

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Li, Shuhao

    2008-01-01

    This paper presents a game-theoretic approach to the testing of uncontrollable real-time systems. By modelling the systems with Timed I/O Game Automata and specifying the test purposes as Timed CTL formulas, we employ a recently developed timed game solver UPPAAL-TIGA to synthesize winning strategies, and then use these strategies to conduct black-box conformance testing of the systems. The testing process is proved to be sound and complete with respect to the given test purposes. Case study and preliminary experimental results indicate that this is a viable approach to uncontrollable timed system testing.

  18. Towards a theoretical determination of the geographical probability distribution of meteoroid impacts on Earth

    Science.gov (United States)

    Zuluaga, Jorge I.; Sucerquia, Mario

    2018-06-01

    Tunguska and Chelyabinsk impact events occurred inside a geographical area of only 3.4 per cent of the Earth's surface. Although two events hardly constitute a statistically significant demonstration of a geographical pattern of impacts, their spatial coincidence is at least tantalizing. To understand if this concurrence reflects an underlying geographical and/or temporal pattern, we must aim at predicting the spatio-temporal distribution of meteoroid impacts on Earth. For this purpose we designed, implemented, and tested a novel numerical technique, `Gravitational Ray Tracing' (GRT), to compute the relative impact probability (RIP) on the surface of any planet. GRT is inspired by the so-called ray-casting techniques used to render realistic images of complex 3D scenes. In this paper we describe the method and the results of testing it at the time of large impact events. Our findings suggest a non-trivial pattern of impact probabilities at any given time on the Earth. Locations at 60-90° from the apex are more prone to impacts, especially at midnight. Counterintuitively, sites close to the apex direction have the lowest RIP, while at the antapex RIPs are slightly larger than average. We present here preliminary maps of RIP at the time of the Tunguska and Chelyabinsk events and found no evidence of a spatial or temporal pattern, suggesting that their coincidence was fortuitous. We apply the GRT method to compute theoretical RIP at the location and time of 394 large fireballs. Although the predicted spatio-temporal impact distribution matches the observed events only marginally, we successfully predict their impact speed distribution.

  19. On the information-theoretic approach to Gödel's incompleteness theorem

    OpenAIRE

    D'Abramo, Germano

    2002-01-01

    In this paper we briefly review and analyze three published proofs of Chaitin's theorem, the celebrated information-theoretic version of Gödel's incompleteness theorem. Then, we discuss our main perplexity concerning a key step common to all these demonstrations.

  20. E-loyalty towards a cancer information website: applying a theoretical framework.

    Science.gov (United States)

    Crutzen, Rik; Beekers, Nienke; van Eenbergen, Mies; Becker, Monique; Jongen, Lilian; van Osch, Liesbeth

    2014-06-01

    To provide more insight into user perceptions related to e-loyalty towards a cancer information website. This is needed to assure adequate provision of high quality information during the full process of cancer treatment, from diagnosis to aftercare, and is an important first step towards optimizing cancer information websites in order to promote e-loyalty. Participants were cancer patients (n = 63) and informal caregivers (n = 202) that visited a website providing regional information about cancer care for all types of cancer. Subsequently, they filled out a questionnaire assessing e-loyalty towards the website and user perceptions (efficiency, effectiveness, active trust and enjoyment) based on a theoretical framework derived from the field of e-commerce. A structural equation model was constructed to test the relationships between user perceptions and e-loyalty. Participants in general could find the information they were looking for (efficiency), thought it was relevant (effectiveness) and that they could act upon it (active trust) and thought the visit itself was pleasant (enjoyment). Effectiveness and enjoyment were both positively related with e-loyalty, but this was mediated by active trust. Efficiency was positively related with e-loyalty. The explained variance of e-loyalty was high (R² = 0.70). This study demonstrates that the importance of user perceptions is not limited to fields such as e-commerce but is also present within the context of cancer information websites. The high information need among participants might explain the positive relationship between efficiency and e-loyalty. Therefore, cancer information websites need to foster easy search and access of information provided. Copyright © 2014 John Wiley & Sons, Ltd.

  1. Formal Specification of Distributed Information Systems

    NARCIS (Netherlands)

    Vis, J.; Brinksma, Hendrik; de By, R.A.; de By, R.A.

    The design of distributed information systems tends to be complex and therefore error-prone. However, in the field of monolithic, i.e. non-distributed, information systems much has already been achieved, and by now, the principles of their design seem to be fairly well-understood. The past decade

  2. Adaptive information-theoretic bounded rational decision-making with parametric priors

    OpenAIRE

    Grau-Moya, Jordi; Braun, Daniel A.

    2015-01-01

    Deviations from rational decision-making due to limited computational resources have been studied in the field of bounded rationality, originally proposed by Herbert Simon. There have been a number of different approaches to model bounded rationality ranging from optimality principles to heuristics. Here we take an information-theoretic approach to bounded rationality, where information-processing costs are measured by the relative entropy between a posterior decision strategy and a given fix...

  3. Decision-theoretic designs for a series of trials with correlated treatment effects using the Sarmanov multivariate beta-binomial distribution.

    Science.gov (United States)

    Hee, Siew Wan; Parsons, Nicholas; Stallard, Nigel

    2018-03-01

    The motivation for the work in this article is the setting in which a number of treatments are available for evaluation in phase II clinical trials and where it may be infeasible to try them concurrently because the intended population is small. This paper introduces an extension of previous work on decision-theoretic designs for a series of phase II trials. The program encompasses a series of sequential phase II trials with interim decision making and a single two-arm phase III trial. The design is based on a hybrid approach where the final analysis of the phase III data is based on a classical frequentist hypothesis test, whereas the trials are designed using a Bayesian decision-theoretic approach in which the unknown treatment effect is assumed to follow a known prior distribution. In addition, as treatments are intended for the same population it is not unrealistic to consider treatment effects to be correlated. Thus, the prior distribution will reflect this. Data from a randomized trial of severe arthritis of the hip are used to test the application of the design. We show that the design on average requires fewer patients in phase II than when the correlation is ignored. Correspondingly, the time required to recommend an efficacious treatment for phase III is quicker. © 2017 The Author. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. An observer-theoretic approach to estimating neutron flux distribution

    International Nuclear Information System (INIS)

    Park, Young Ho; Cho, Nam Zin

    1989-01-01

    State feedback control provides many advantages such as stabilization and improved transient response. However, when the state feedback control is considered for spatial control of a nuclear reactor, it requires complete knowledge of the distributions of the system state variables. This paper describes a method for estimating the flux spatial distribution using only limited flux measurements. It is based on the Luenberger observer in control theory, extended to the distributed parameter systems such as the space-time reactor dynamics equation. The results of the application of the method to simple reactor models showed that the flux distribution is estimated by the observer very efficiently using information from only a few sensors

  5. Information–theoretic implications of quantum causal structures

    DEFF Research Database (Denmark)

    Chaves, Rafael; Majenz, Christian; Gross, David

    2015-01-01

    It is a relatively new insight of classical statistics that empirical data can contain information about causation rather than mere correlation. First algorithms have been proposed that are capable of testing whether a presumed causal relationship is compatible with an observed distribution. However, no systematic method is known for treating such problems in a way that generalizes to quantum systems. Here, we describe a general algorithm for computing information–theoretic constraints on the correlations that can arise from a given causal structure, where we allow for quantum systems as well as classical random variables. The general technique is applied to two relevant cases: first, we show that the principle of information causality appears naturally in our framework and go on to generalize and strengthen it. Second, we derive bounds on the correlations that can occur in a networked architecture …

  6. Theory of the Concealed Information Test

    NARCIS (Netherlands)

    Verschuere, B.; Ben-Shakhar, G.; Verschuere, B.; Ben-Shakhar, G.; Meijer, E.

    2011-01-01

    It is now well established that physiological measures can be validly used to detect concealed information. An important challenge is to elucidate the underlying mechanisms of concealed information detection. We review theoretical approaches that can be broadly classified in two major categories:

  7. A theoretical cost optimization model of reused flowback distribution network of regional shale gas development

    International Nuclear Information System (INIS)

    Li, Huajiao; An, Haizhong; Fang, Wei; Jiang, Meng

    2017-01-01

    The logistics of timing and transporting the flowback generated by each shale gas well to the next well is a major challenge. Because more and more flowback is stored temporarily near the wells and reused in shale gas development, both transportation cost and storage cost are a heavy burden for developers. This research proposed a theoretical cost optimization model to obtain the optimal flowback distribution solution for regional multi-well shale gas development from a holistic perspective. We then used empirical data from the Marcellus Shale for an empirical study. In addition, we compared the optimal flowback distribution solution obtained by considering both transportation and storage costs with the solutions that minimize only the transportation cost or only the storage cost. - Highlights: • A theoretical cost optimization model to obtain the optimal flowback distribution solution. • An empirical study using shale gas data from Bradford County in the Marcellus Shale. • Visualization of optimal flowback distribution solutions under different scenarios. • Transportation cost is the more important factor for reducing total cost. • Helps developers cut the storage and transportation costs of reusing flowback.
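
    The core allocation problem described here can be posed as a small transportation-style linear program. The sketch below is an illustration under assumed data (well counts, capacities and unit costs are invented), not the model or data used in the paper.

```python
# Minimal sketch of a flowback allocation LP: each source well must either truck its
# flowback to a receiving well that can reuse it (per-barrel transport cost) or store
# it on site (per-barrel storage cost). All numbers are invented for illustration.
import numpy as np
from scipy.optimize import linprog

supply = np.array([100.0, 80.0, 60.0])      # flowback produced at 3 source wells (bbl)
reuse_cap = np.array([90.0, 70.0])          # reuse capacity at 2 receiving wells (bbl)
transport = np.array([[2.0, 5.0],           # $/bbl from source i to receiver j
                      [4.0, 1.5],
                      [6.0, 3.0]])
storage = np.array([3.5, 3.5, 3.5])         # $/bbl for temporary on-site storage

n_src, n_dst = transport.shape
n_vars = n_src * n_dst + n_src              # x[i, j] shipments followed by s[i] storage
c = np.concatenate([transport.ravel(), storage])

# Each source must place all of its flowback: sum_j x[i, j] + s[i] = supply[i].
A_eq = np.zeros((n_src, n_vars))
for i in range(n_src):
    A_eq[i, i * n_dst:(i + 1) * n_dst] = 1.0
    A_eq[i, n_src * n_dst + i] = 1.0

# Receivers cannot take more than their reuse capacity: sum_i x[i, j] <= reuse_cap[j].
A_ub = np.zeros((n_dst, n_vars))
for j in range(n_dst):
    for i in range(n_src):
        A_ub[j, i * n_dst + j] = 1.0

res = linprog(c, A_ub=A_ub, b_ub=reuse_cap, A_eq=A_eq, b_eq=supply, bounds=(0, None))
print(f"minimum total cost: ${res.fun:.2f}")
```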

  8. Analysis Of Educational Services Distribution-Based Geographic Information System GIS

    OpenAIRE

    Waleed Lagrab; Noura AKNIN

    2015-01-01

    Abstract This study analyzes the spatial distribution of kindergarten facilities in the study area based on Geographic Information Systems (GIS) in order to test the efficiency of GIS technology for redistributing the existing kindergartens, choosing the best locations in the future, and applying standard criteria for selecting suitable locations for kindergartens. To achieve this goal, the data and information are collected via interviews and comprehensive statistics on the education facil...

  9. Accelerated life testing design using geometric process for pareto distribution

    OpenAIRE

    Mustafa Kamal; Shazia Zarrin; Arif Ul Islam

    2013-01-01

    In this paper the geometric process is used for the analysis of accelerated life testing under constant stress for Pareto Distribution. Assuming that the lifetimes under increasing stress levels form a geometric process, estimates of the parameters are obtained by using the maximum likelihood method for complete data. In addition, asymptotic interval estimates of the parameters of the distribution using Fisher information matrix are also obtained. The statistical properties of the parameters ...

  10. Community structure informs species geographic distributions

    KAUST Repository

    Montesinos-Navarro, Alicia

    2018-05-23

    Understanding what determines species' geographic distributions is crucial for assessing global change threats to biodiversity. Measuring limits on distributions is usually, and necessarily, done with data at large geographic extents and coarse spatial resolution. However, survival of individuals is determined by processes that happen at small spatial scales. The relative abundance of coexisting species (i.e. 'community structure') reflects assembly processes occurring at small scales, and are often available for relatively extensive areas, so could be useful for explaining species distributions. We demonstrate that Bayesian Network Inference (BNI) can overcome several challenges to including community structure into studies of species distributions, despite having been little used to date. We hypothesized that the relative abundance of coexisting species can improve predictions of species distributions. In 1570 assemblages of 68 Mediterranean woody plant species we used BNI to incorporate community structure into Species Distribution Models (SDMs), alongside environmental information. Information on species associations improved SDM predictions of community structure and species distributions moderately, though for some habitat specialists the deviance explained increased by up to 15%. We demonstrate that most species associations (95%) were positive and occurred between species with ecologically similar traits. This suggests that SDM improvement could be because species co-occurrences are a proxy for local ecological processes. Our study shows that Bayesian Networks, when interpreted carefully, can be used to include local conditions into measurements of species' large-scale distributions, and this information can improve the predictions of species distributions.

  11. New information on parton distributions

    International Nuclear Information System (INIS)

    Martin, A.D.; Stirling, W.J.; Roberts, R.G.

    1992-04-01

    New data on structure functions from deep-inelastic scattering provide new information on parton distributions, particularly in the region 0.01 < x < 0.1. We discuss the F_2 data from the New Muon Collaboration (NMC) and its implications for other processes, and the evidence for SU(2) symmetry breaking in the light quark sea. We show that although good fits can be obtained with or without this symmetry breaking, more physically reasonable parton distributions are obtained if we allow d-bar > u-bar at small x. With the inclusion of the latest deep-inelastic data we find α_s(M_Z) = 0.111 (+0.004, −0.005). We also show how W, Z and Drell-Yan production at p-bar p colliders can give information on parton distributions. (Author)

  12. Optimal design of accelerated life tests for an extension of the exponential distribution

    International Nuclear Information System (INIS)

    Haghighi, Firoozeh

    2014-01-01

    Accelerated life tests provide information quickly on the lifetime distribution of the products by testing them at higher than usual levels of stress. In this paper, the lifetime of a product at any level of stress is assumed to have an extension of the exponential distribution. This new family has been recently introduced by Nadarajah and Haghighi (2011 [1]); it can be used as an alternative to the gamma, Weibull and exponentiated exponential distributions. The scale parameter of lifetime distribution at constant stress levels is assumed to be a log-linear function of the stress levels and a cumulative exposure model holds. For this model, the maximum likelihood estimates (MLEs) of the parameters, as well as the Fisher information matrix, are derived. The asymptotic variance of the scale parameter at a design stress is adopted as an optimization objective and its expression formula is provided using the maximum likelihood method. A Monte Carlo simulation study is carried out to examine the performance of these methods. The asymptotic confidence intervals for the parameters and hypothesis test for the parameter of interest are constructed
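
    As a sketch of the lifetime model involved, the code below fits the Nadarajah–Haghighi extension of the exponential distribution by maximum likelihood, assuming its usual parametrization F(t) = 1 − exp{1 − (1 + λt)^α}. It is illustrative only (simulated data, no stress dependence or cumulative-exposure model), not the optimal-design procedure of the paper.

```python
# Minimal MLE sketch for the Nadarajah-Haghighi extension of the exponential distribution,
# assuming f(t) = alpha*lam*(1 + lam*t)**(alpha - 1) * exp(1 - (1 + lam*t)**alpha).
# The sample is simulated; nothing here reproduces the paper's design optimization.
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, t):
    alpha, lam = params
    if alpha <= 0 or lam <= 0:
        return np.inf
    z = 1.0 + lam * t
    return -np.sum(np.log(alpha * lam) + (alpha - 1.0) * np.log(z) + 1.0 - z**alpha)

# Simulate lifetimes by inverse transform: t = ((1 - log(1 - u))**(1/alpha) - 1) / lam.
rng = np.random.default_rng(1)
alpha_true, lam_true = 1.5, 0.2
u = rng.random(500)
t = ((1.0 - np.log(1.0 - u)) ** (1.0 / alpha_true) - 1.0) / lam_true

fit = minimize(neg_log_lik, x0=np.array([1.0, 0.1]), args=(t,), method="Nelder-Mead")
print("MLE (alpha, lambda):", fit.x)
```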

  13. Information theoretic analysis of edge detection in visual communication

    Science.gov (United States)

    Jiang, Bo; Rahman, Zia-ur

    2010-08-01

    Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the artifacts introduced into the process by the image gathering process. However, experiments show that the image gathering process profoundly impacts the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, where the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. In this paper, we perform an end-to-end information theory based system analysis to assess edge detection methods. We evaluate the performance of the different algorithms as a function of the characteristics of the scene, and the parameters, such as sampling, additive noise etc., that define the image gathering system. The edge detection algorithm is regarded as having high performance only if the information rate from the scene to the edge approaches the maximum possible. This goal can be achieved only by jointly optimizing all processes. People generally use subjective judgment to compare different edge detection methods. There is not a common tool that can be used to evaluate the performance of the different algorithms, and to give people a guide for selecting the best algorithm for a given system or scene. Our information-theoretic assessment becomes this new tool, which allows us to compare the different edge detection operators in a common environment.

  14. Using information Theory in Optimal Test Point Selection for Health Management in NASA's Exploration Vehicles

    Science.gov (United States)

    Mehr, Ali Farhang; Tumer, Irem

    2005-01-01

    In this paper, we will present a new methodology that measures the "worth" of deploying an additional testing instrument (sensor) in terms of the amount of information that can be retrieved from such measurement. This quantity is obtained using a probabilistic model of RLV's that has been partially developed in the NASA Ames Research Center. A number of correlated attributes are identified and used to obtain the worth of deploying a sensor in a given test point from an information-theoretic viewpoint. Once the information-theoretic worth of sensors is formulated and incorporated into our general model for IHM performance, the problem can be formulated as a constrained optimization problem where reliability and operational safety of the system as a whole is considered. Although this research is conducted specifically for RLV's, the proposed methodology in its generic form can be easily extended to other domains of systems health monitoring.

  15. A trade-off between local and distributed information processing associated with remote episodic versus semantic memory.

    Science.gov (United States)

    Heisz, Jennifer J; Vakorin, Vasily; Ross, Bernhard; Levine, Brian; McIntosh, Anthony R

    2014-01-01

    Episodic memory and semantic memory produce very different subjective experiences yet rely on overlapping networks of brain regions for processing. Traditional approaches for characterizing functional brain networks emphasize static states of function and thus are blind to the dynamic information processing within and across brain regions. This study used information theoretic measures of entropy to quantify changes in the complexity of the brain's response as measured by magnetoencephalography while participants listened to audio recordings describing past personal episodic and general semantic events. Personal episodic recordings evoked richer subjective mnemonic experiences and more complex brain responses than general semantic recordings. Critically, we observed a trade-off between the relative contribution of local versus distributed entropy, such that personal episodic recordings produced relatively more local entropy whereas general semantic recordings produced relatively more distributed entropy. Changes in the relative contributions of local and distributed entropy to the total complexity of the system provides a potential mechanism that allows the same network of brain regions to represent cognitive information as either specific episodes or more general semantic knowledge.

  16. Game-theoretic interference coordination approaches for dynamic spectrum access

    CERN Document Server

    Xu, Yuhua

    2016-01-01

    Written by experts in the field, this book is based on recent research findings in dynamic spectrum access for cognitive radio networks. It establishes a game-theoretic framework and presents cutting-edge technologies for distributed interference coordination. With game-theoretic formulation and the designed distributed learning algorithms, it provides insights into the interactions between multiple decision-makers and the converging stable states. Researchers, scientists and engineers in the field of cognitive radio networks will benefit from the book, which provides valuable information, useful methods and practical algorithms for use in emerging 5G wireless communication.

  17. Theoretical and experimental investigations on the cooling capacity distributions at the stages in the thermally-coupled two-stage Stirling-type pulse tube cryocooler without external precooling

    Science.gov (United States)

    Tan, Jun; Dang, Haizheng

    2017-03-01

    The two-stage Stirling-type pulse tube cryocooler (SPTC) has advantages in simultaneously providing the cooling powers at two different temperatures, and the capacity in distributing these cooling capacities between the stages is significant to its practical applications. In this paper, a theoretical model of the thermally-coupled two-stage SPTC without external precooling is established based on the electric circuit analogy with considering real gas effects, and the simulations of both the cooling performances and PV power distribution between stages are conducted. The results indicate that the PV power is inversely proportional to the acoustic impedance of each stage, and the cooling capacity distribution is determined by the cold finger cooling efficiency and the PV power into each stage together. The design methods of the cold fingers to achieve both the desired PV power and the cooling capacity distribution between the stages are summarized. The two-stage SPTC is developed and tested based on the above theoretical investigations, and the experimental results show that it can simultaneously achieve 0.69 W at 30 K and 3.1 W at 85 K with an electric input power of 330 W and a reject temperature of 300 K. The consistency between the simulated and the experimental results is observed and the theoretical investigations are experimentally verified.

  18. Testing market informational efficiency of Constanta port operators

    Science.gov (United States)

    Roşca, E.; Popa, M.; Ruscă, F.; Burciu, Ş.

    2015-11-01

    The Romanian capital market is still an emergent one. Following the mass-privatization process and the private investments, three of the most important handling and storage companies acting in Constantza Port (OIL Terminal, Comvex and SOCEP) are listed on the Romanian Stock Exchange. The paper investigates their evolution on the market, identifying the expected rate of return and the components of the shares' risk (specific and systematic). Also, the price evolution could be analyzed through the informational efficiency which instantly reflects the price relevance. The Jarque-Bera normality test regarding the shares' return rate distribution and the Fama test for the informational efficiency are completed for each company. The market price model is taken into consideration for price forecasting, computing the return rate auto-correlations. The results are subject to interpretation considering additional managerial and financial information of the companies' activity.

  19. Building a foundation to study distributed information behaviour

    Directory of Open Access Journals (Sweden)

    Terry L. von Thaden

    2007-01-01

    Full Text Available Introduction. The purpose of this research is to assess information behaviour as it pertains to operational teams in dynamic safety critical operations. Method. In this paper, I describe some of the problems faced by crews on modern flight decks and suggest a framework modelled on Information Science, Human Factors, and Activity Theory research to assess the distribution of information actions, namely information identification, gathering and use, by teams of users in a dynamic, safety critical environment. Analysis. By analysing the information behaviour of crews who have accidents and those who do not, researchers may be able to ascertain how they (fail to) make use of essential, safety critical information in their information environment. The ultimate goal of this research is to differentiate information behaviour among the distinct outcomes. Results. This research affords the possibility to discern differences in distributed information behaviour illustrating that crews who err to the point of an accident appear to practice different distributed information behaviour than those who do not. This foundation serves to operationalise team sense-making through illustrating the social practice of information structuring within the activity of the work environment. Conclusion. The distributed information behaviour framework provides a useful structure to study the patterning and organization of information distributed over space and time, to reach a common goal. This framework may allow researchers and investigators alike to identify critical information activity in the negotiation of meaning in high reliability safety critical work, eventually informing safer practice. This framework is applicable to other domains.

  20. Theoretical investigation of stress distributions in hollow sandcrete

    African Journals Online (AJOL)

    user

    The test thin plate distributes the load on the block and the hollow block is regarded as a two ... Some research works had been done on the relationship between cavity ... The results would help reduce the cost, labour and time necessary to.

  1. Impulse tests on distribution transformers protected by means of spark gaps

    Energy Technology Data Exchange (ETDEWEB)

    Pykaelae, M L; Palva, V [Helsinki Univ. of Technology, Otaniemi (Finland). High Voltage Institute; Niskanen, K [ABB Corporate Research, Vaasa (Finland)

    1998-12-31

    Distribution transformers in rural networks have to cope with transient overvoltages, even with those caused by the direct lightning strokes to the lines. In Finland the 24 kV network conditions, such as wooden pole lines, high soil resistivity and isolated neutral network, lead into fast transient overvoltages. Impulse testing of pole-mounted distribution transformers ({<=} 200 kVA) protected by means of spark gaps were studied. Different failure detection methods were used. Results can be used as background information for standardization work dealing with distribution transformers protected by means of spark gaps. (orig.) 9 refs.

  3. A comparison of SAR ATR performance with information theoretic predictions

    Science.gov (United States)

    Blacknell, David

    2003-09-01

    Performance assessment of automatic target detection and recognition algorithms for SAR systems (or indeed any other sensors) is essential if the military utility of the system / algorithm mix is to be quantified. This is a relatively straightforward task if extensive trials data from an existing system is used. However, a crucial requirement is to assess the potential performance of novel systems as a guide to procurement decisions. This task is no longer straightforward since a hypothetical system cannot provide experimental trials data. QinetiQ has previously developed a theoretical technique for classification algorithm performance assessment based on information theory. The purpose of the study presented here has been to validate this approach. To this end, experimental SAR imagery of targets has been collected using the QinetiQ Enhanced Surveillance Radar to allow algorithm performance assessments as a number of parameters are varied. In particular, performance comparisons can be made for (i) resolutions up to 0.1m, (ii) single channel versus polarimetric (iii) targets in the open versus targets in scrubland and (iv) use versus non-use of camouflage. The change in performance as these parameters are varied has been quantified from the experimental imagery whilst the information theoretic approach has been used to predict the expected variation of performance with parameter value. A comparison of these measured and predicted assessments has revealed the strengths and weaknesses of the theoretical technique as will be discussed in the paper.

  4. Strength of wood versus rate of testing - A theoretical approach

    DEFF Research Database (Denmark)

    Nielsen, Lauge Fuglsang

    2007-01-01

    Strength of wood is normally measured in ramp load experiments. Experience shows that strength increases with increasing rate of testing. This feature is considered theoretically in this paper. It is shown that the influence of testing rate is a phenomenon which depends on the quality of the considered wood. Low quality wood shows lesser influence of testing rate. This observation agrees with the well-known statement made by Borg Madsen that weak wood subjected to a constant load has a longer lifetime than strong wood. In general, the influence of testing rate on strength increases …

  5. A Distributed User Information System

    Science.gov (United States)

    1990-03-01

    Steven D. Miller, Scott Carson, and Leo Mark; Department of Computer Science and Institute for Advanced Computer Studies, University of Maryland, College Park, MD 20742. Abstract: Current user information database technology ...

  6. Theoretical study of rock mass investigation efficiency

    International Nuclear Information System (INIS)

    Holmen, Johan G.; Outters, Nils

    2002-05-01

    The study concerns a mathematical modelling of a fractured rock mass and its investigation by use of theoretical boreholes and rock surfaces, with the purpose of analysing the efficiency (precision) of such investigations and determining the amount of investigation necessary to obtain reliable estimates of the structural-geological parameters of the studied rock mass. The study is not about estimating suitable sample sizes to be used in site investigations. The purpose of the study is to analyse the amount of information necessary for deriving estimates of the geological parameters studied, within defined confidence intervals and at a defined confidence level; in other words, how the confidence in models of the rock mass (considering a selected number of parameters) will change with the amount of information collected from boreholes and surfaces. The study is limited to a selected number of geometrical structural-geological parameters: fracture orientation, i.e. mean direction and dispersion (Fisher Kappa and SRI); different measures of fracture density (P10, P21 and P32); and fracture trace-length and strike distributions as seen on horizontal windows. A numerical Discrete Fracture Network (DFN) was used for representation of the fractured rock mass. The DFN-model was primarily based on the properties of an actual fracture network investigated at the Aespoe Hard Rock Laboratory. The rock mass studied (DFN-model) contained three different fracture sets with different orientations and fracture densities. The rock unit studied was statistically homogeneous. The study includes a limited sensitivity analysis of the properties of the DFN-model. The study is a theoretical and computer-based comparison between samples of fracture properties of a theoretical rock unit and the known true properties of the same unit. The samples are derived from numerically generated boreholes and surfaces that intersect the DFN-network. Two different boreholes are analysed: a vertical borehole and a borehole that is

  7. Theoretical Fundaments for a Turing-Type Test for Virtual Environments

    OpenAIRE

    Sonnenfeld, Nathan

    2016-01-01

    Abstract: Alan Turing proposed the “imitation game”, which has also been called the Turing Test. In the game an interrogator is tasked with telling two similar stimuli apart by asking questions and then, based on the responses, correctly identifying both stimuli. Applying this concept to virtual reality to create a similar test will be important as virtual reality becomes more and more like reality. It is also important to explore the conceptual and theoretical issues with the Turing Test in order...

  8. The informal recycling in the international and local context: theoretical Elements

    International Nuclear Information System (INIS)

    Yepes P, Dora Luz

    2002-01-01

    This article is a synthesis of the theoretical aspects related to the urban problem of informal recycling in our context. It is framed within the research project entitled 'Alternatives for strengthening informal recycling in Medellín', a graduate thesis that seeks to strengthen informal recycling through the study of the factors associated with the labour productivity of informal recyclers. Specifically, the study will identify options for improving their work and aims to propose alternatives to dignify the labour of these people in an integral way, in the light of environmental, technical, normative, institutional, social and sustainability precepts. This document describes the theoretical elements on which the investigation is based, placing informal recycling in an international context and describing its situation in the national and local environment. As a result of the bibliographical review carried out, it can be said that there is little interest at the international level in improving the working conditions of informal recyclers, apart from the strategies outlined by the International Labour Organization with regard to strengthening the informal economy. In Latin America it has not been possible to go beyond official rhetoric and the promotion of environmentalist groups, although on the issue of recovery, reuse and recycling of solid wastes there has been sustained progress. At the national level, clear strategies to improve the informal work of recyclers are being identified; however, many efforts are still lacking to carry out the actions committed to these strategies, even though the creation of recyclers' organizations has been advancing little by little.

  9. Quantum information theoretical analysis of various constructions for quantum secret sharing

    NARCIS (Netherlands)

    Rietjens, K.P.T.; Schoenmakers, B.; Tuyls, P.T.

    2005-01-01

    Recently, an information theoretical model for quantum secret sharing (QSS) schemes was introduced. By using this model, we prove that pure state quantum threshold schemes (QTS) can be constructed from quantum MDS codes and vice versa. In particular, we consider stabilizer codes and give a

  10. Distributed Energy Resources Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — NREL's Distributed Energy Resources Test Facility (DERTF) is a working laboratory for interconnection and systems integration testing. This state-of-the-art facility...

  11. THEORETICAL JUSTIFICATION OF EXPONENTIAL DISTRIBUTION LAW OF DISTANCES BETWEEN STOPS OF CITY PUBLIC TRANSPORT

    Directory of Open Access Journals (Sweden)

    Gorbachov, P.

    2012-06-01

    Full Text Available The paper presents the results of an investigation of the relation between trip distances and the location of stops on a route between places of attraction. A theoretical justification of the usefulness of the exponential distribution with a shift parameter for describing the distances between stops is given.
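
    A minimal sketch of fitting such a shifted exponential to inter-stop distances is shown below; the shift and mean values are invented for illustration, and the closed-form maximum-likelihood estimators are used.

```python
# Minimal sketch (illustrative data, not the paper's) of fitting a shifted exponential
# to distances between adjacent public-transport stops: the shift is the minimum
# admissible spacing and 1/rate is the mean excess over that shift.
import numpy as np

rng = np.random.default_rng(7)
shift_true, mean_excess_true = 250.0, 300.0              # metres, assumed values
distances = shift_true + rng.exponential(mean_excess_true, size=200)

shift_hat = distances.min()                              # MLE of the shift parameter
rate_hat = 1.0 / (distances.mean() - shift_hat)          # MLE of the exponential rate

print(f"shift ≈ {shift_hat:.0f} m, mean spacing ≈ {shift_hat + 1.0 / rate_hat:.0f} m")
```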

  12. Information-theoretic approach to uncertainty importance

    International Nuclear Information System (INIS)

    Park, C.K.; Bari, R.A.

    1985-01-01

    A method is presented for importance analysis in probabilistic risk assessments (PRA) for which the results of interest are characterized by full uncertainty distributions and not just point estimates. The method is based on information theory in which entropy is a measure of uncertainty of a probability density function. We define the relative uncertainty importance between two events as the ratio of the two exponents of the entropies. For the log-normal and log-uniform distributions the importance measure is comprised of the median (central tendency) and of the logarithm of the error factor (uncertainty). Thus, if accident sequences are ranked this way, and the error factors are not all equal, then a different rank order would result than if the sequences were ranked by the central tendency measure alone. As an illustration, the relative importance of internal events and in-plant fires was computed on the basis of existing PRA results
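
    A small sketch of this importance measure for log-normally distributed quantities is given below. It assumes the standard PRA convention that the error factor is the ratio of the 95th percentile to the median (EF = exp(1.645·σ)); the medians and error factors themselves are invented for illustration.

```python
# Minimal sketch of the entropy-based importance measure described above for
# log-normal distributions: the measure is exp(entropy), and the relative uncertainty
# importance of event A over event B is the ratio exp(H_A) / exp(H_B).
# Medians and error factors below are illustrative, not taken from any PRA.
import numpy as np

def lognormal_exp_entropy(median: float, error_factor: float) -> float:
    """exp(differential entropy) of a log-normal given its median and 95% error factor."""
    mu = np.log(median)
    sigma = np.log(error_factor) / 1.645          # assumes EF = exp(1.645 * sigma)
    entropy = mu + 0.5 * np.log(2.0 * np.pi * np.e * sigma**2)
    return np.exp(entropy)

seq_a = lognormal_exp_entropy(median=1e-5, error_factor=10.0)   # e.g. an internal event
seq_b = lognormal_exp_entropy(median=3e-5, error_factor=3.0)    # e.g. an in-plant fire
print("relative uncertainty importance A/B:", seq_a / seq_b)
```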

  13. CLASSIFICATION, DISTRIBUTION AND PRODUCTION OF KNOWLEDGE: THEORETICAL SUMMARY

    Directory of Open Access Journals (Sweden)

    R. A. Tchupin

    2013-01-01

    Full Text Available The paper is devoted to systemizing the main theoretical approaches to classification, distribution and production of knowledge in the global economy. The author focuses on F. Machlup’s knowledge classification and the concept of useful knowledge by J. Mokyr.The interpersonal and public channels of communication and acquisition of knowledge are observed taking into consideration the total changes caused by transition from industrial to postindustrial economy. The paper provides a comparative analysis of the given model and alternative concepts of knowledge generation: finalization of science, strategic research, post-normal science, academic capitalism, post-academic science, and the triple helix concept. The author maintains that the current concepts of knowledge generation reflect the fact of transformation of modern institutional technical environment due to the global technological changes, and increasing contribution of knowledge to the economic development. Accordingly, the roles of the main participants of the given process are changing along with the growing integration of education and science, state and businesses. 

  14. Characterising Information Systems in Australia: A Theoretical Framework

    Directory of Open Access Journals (Sweden)

    Gail Ridley

    2006-11-01

    Full Text Available The study reported in this volume aims to investigate the state of the Information Systems academic discipline in Australia from a historical and current perspective, collecting evidence across a range of dimensions. To maximise the strategic potential of the study, the results need to be capable of integration, so that the relationships within and across the dimensions and geographical units are understood. A meaningful theoretical framework will help relate the results of the different dimensions of the study to characterise the discipline in the region, and assist in empowering the Australian IS research community. This paper reviewed literature on the development of disciplines, before deriving a theoretical framework for the broader study reported in this volume. The framework considered the current and past state of IS in Australian universities from the perspective of the development of a discipline. The components of the framework were derived and validated through a thematic analysis of both the IS and non-IS literature. This paper also presents brief vignettes of the development of two other related disciplines. The framework developed in this paper, which has been partly guided by Whitley’s Theory of Scientific Change, has been used to analyse data collated from the Australian states and the Australian Capital Territory. The degree of variation in Australian IS as an indication of its “professionalisation”, the nature of its body of knowledge and its mechanisms of control, will be used to frame the analysis. Research reported in several of the papers that follow in this volume has drawn upon the theoretical framework presented below.

  15. IASI's sensitivity to near-surface carbon monoxide (CO): Theoretical analyses and retrievals on test cases

    Science.gov (United States)

    Bauduin, Sophie; Clarisse, Lieven; Theunissen, Michael; George, Maya; Hurtmans, Daniel; Clerbaux, Cathy; Coheur, Pierre-François

    2017-03-01

    Separating concentrations of carbon monoxide (CO) in the boundary layer from the rest of the atmosphere with nadir satellite measurements is of particular importance to differentiate emission from transport. Although thermal infrared (TIR) satellite sounders are considered to have limited sensitivity to the composition of the near-surface atmosphere, previous studies show that they can provide information on CO close to the ground in case of high thermal contrast. In this work we investigate the capability of IASI (Infrared Atmospheric Sounding Interferometer) to retrieve near-surface CO concentrations, and we quantitatively assess the influence of thermal contrast on such retrievals. We present a 3-part analysis, which relies on both theoretical forward simulations and retrievals on real data, performed for a large range of negative and positive thermal contrast situations. First, we derive theoretically the IASI detection threshold of CO enhancement in the boundary layer, and we assess its dependence on thermal contrast. Then, using the optimal estimation formalism, we quantify the role of thermal contrast on the error budget and information content of near-surface CO retrievals. We demonstrate that, contrary to what is usually accepted, large negative thermal contrast values (ground cooler than air) lead to a better decorrelation between CO concentrations in the low and the high troposphere than large positive thermal contrast (ground warmer than the air). In the last part of the paper we use Mexico City and Barrow as test cases to contrast our theoretical predictions with real retrievals, and to assess the accuracy of IASI surface CO retrievals through comparisons to ground-based in-situ measurements.

  16. Distributed Administrative Management Information System (DAMIS).

    Science.gov (United States)

    Juckiewicz, Robert; Kroculick, Joseph

    Columbia University's major program to distribute its central administrative data processing to its various schools and departments is described. The Distributed Administrative Management Information System (DAMIS) will link every department and school within the university via microcomputers, terminals, and/or minicomputers to the central…

  17. Explorations In Theoretical Computer Science For Kids (using paper toys)

    DEFF Research Database (Denmark)

    Valente, Andrea

    2004-01-01

    The computational card (c-cards for short) project is a study and realization of an educational tool based on playing cards. C-cards are an educational tool to introduce children aged 8 to 10 (or older) to the concept of computation, seen as manipulation of symbols. The game provides teachers and learners with a physical, tangible metaphor for exploring core concepts of computer science, such as deterministic and probabilistic state machines, frequencies and probability distributions, and the central elements of Shannon's information theory, like information, communication, errors and error detection. Our idea is implemented both with paper cards and by an editor/simulator software (a prototype realized in JavaScript). We also designed the structure of a course in (theoretical) computer science, based on c-cards, and we will test it this summer.

  18. An Information Theoretic Analysis of Classification Sorting and Cognition by Ninth Grade Children within a Piagetian Setting.

    Science.gov (United States)

    Dunlop, David Livingston

    The purpose of this study was to use an information theoretic memory model to quantitatively investigate classification sorting and recall behaviors of various groups of students. The model provided theorems for the determination of information theoretic measures from which inferences concerning mental processing were made. The basic procedure…

  19. Development and validation of a theoretical test in non-anaesthesiologist-administered propofol sedation for gastrointestinal endoscopy

    DEFF Research Database (Denmark)

    Jensen, Jeppe Thue; Savran, Mona Meral; Møller, Ann Merete

    2016-01-01

    OBJECTIVE: Safety with non-anaesthesiologist-administered propofol sedation (NAAP) during gastrointestinal (GI) endoscopy is related to theoretical knowledge. A summative testing of knowledge before attempting supervised nurse-administered propofol sedation (NAPS) in the clinic is advised. The aims of this study were to develop a theoretical test about propofol sedation, to gather validity evidence for the test and to measure the effect of a NAPS-specific training course. MATERIAL AND METHODS: A three-phased psychometric study on multiple choice questionnaire (MCQ) test development, gathering of validity … % increase; p = 0.001 and 0.001, respectively). CONCLUSIONS: Data supported the validity of the developed MCQ test. The NAPS-specific course with pre-course testing adds theoretical knowledge to already well-prepared participants.

  20. Acceptance sampling for attributes via hypothesis testing and the hypergeometric distribution

    Science.gov (United States)

    Samohyl, Robert Wayne

    2017-10-01

    This paper questions some aspects of attribute acceptance sampling in light of the original concepts of hypothesis testing from Neyman and Pearson (NP). Attribute acceptance sampling in industry, as developed by Dodge and Romig (DR), generally follows the international standards of ISO 2859, and similarly the Brazilian standards NBR 5425 to NBR 5427 and the United States Standards ANSI/ASQC Z1.4. The paper evaluates and extends the area of acceptance sampling in two directions. First, by suggesting the use of the hypergeometric distribution to calculate the parameters of sampling plans avoiding the unnecessary use of approximations such as the binomial or Poisson distributions. We show that, under usual conditions, discrepancies can be large. The conclusion is that the hypergeometric distribution, ubiquitously available in commonly used software, is more appropriate than other distributions for acceptance sampling. Second, and more importantly, we elaborate the theory of acceptance sampling in terms of hypothesis testing rigorously following the original concepts of NP. By offering a common theoretical structure, hypothesis testing from NP can produce a better understanding of applications even beyond the usual areas of industry and commerce such as public health and political polling. With the new procedures, both sample size and sample error can be reduced. What is unclear in traditional acceptance sampling is the necessity of linking the acceptable quality limit (AQL) exclusively to the producer and the lot quality percent defective (LTPD) exclusively to the consumer. In reality, the consumer should also be preoccupied with a value of AQL, as should the producer with LTPD. Furthermore, we can also question why type I error is always uniquely associated with the producer as producer risk, and likewise, the same question arises with consumer risk which is necessarily associated with type II error. The resolution of these questions is new to the literature. The
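
    The hypergeometric computation of a single-sampling plan argued for above can be sketched as follows; the lot size, AQL, LTPD and the two risks are assumed values, not figures from the paper.

```python
# Minimal sketch of a single-sampling attribute plan computed directly from the
# hypergeometric distribution (no binomial/Poisson approximation).
# Lot size, AQL, LTPD and the producer/consumer risks are illustrative assumptions.
from scipy.stats import hypergeom

LOT_SIZE = 1000
AQL, LTPD = 0.01, 0.05            # acceptable and limiting fractions defective
ALPHA, BETA = 0.05, 0.10          # producer risk and consumer risk

def prob_accept(n: int, c: int, fraction_defective: float) -> float:
    """P(accept lot) = P(X <= c) with X ~ Hypergeom(lot size, defectives, sample size n)."""
    defectives = round(LOT_SIZE * fraction_defective)
    return hypergeom(LOT_SIZE, defectives, n).cdf(c)

def smallest_plan(max_n: int = LOT_SIZE):
    """Smallest (n, c) meeting both the producer-risk and consumer-risk constraints."""
    for n in range(1, max_n + 1):
        for c in range(0, n + 1):
            if prob_accept(n, c, LTPD) > BETA:
                break                          # a larger c only raises the consumer's risk
            if prob_accept(n, c, AQL) >= 1.0 - ALPHA:
                return n, c
    return None

print(smallest_plan())   # prints a (sample size, acceptance number) pair
```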

  1. Comparison of theoretical proteomes: identification of COGs with conserved and variable pI within the multimodal pI distribution.

    Science.gov (United States)

    Nandi, Soumyadeep; Mehra, Nipun; Lynn, Andrew M; Bhattacharya, Alok

    2005-09-09

    Theoretical proteome analysis, generated by plotting theoretical isoelectric points (pI) against molecular masses of all proteins encoded by the genome, shows a multimodal distribution for pI. This multimodal distribution is an effect of allowed combinations of the charged amino acids, and not due to evolutionary causes. The variation in this distribution can be correlated to the organism's ecological niche. Contributions to this variation may be mapped to individual proteins by studying the variation in pI of orthologs across microorganism genomes. The distribution of ortholog pI values showed trimodal distributions for all prokaryotic genomes analyzed, similar to whole proteome plots. Pairwise analysis of pI variation shows that a few COGs are conserved within, but most vary between, the acidic and basic regions of the distribution, while molecular mass is more highly conserved. At the level of functional grouping of orthologs, five groups vary significantly from the population of orthologs, which is attributed to either conservation at the level of sequences or a bias for either positively or negatively charged residues contributing to the function. Individual COGs conserved in both the acidic and basic regions of the trimodal distribution are identified, and orthologs that best represent the variation in levels of the acidic and basic regions are listed. The analysis of pI distribution by using orthologs provides a basis for resolution of theoretical proteome comparison at the level of individual proteins. Orthologs identified that significantly vary between the major acidic and basic regions may be used as representatives of the variation of the entire proteome.
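
    Computing the theoretical pI and molecular mass that such plots are built from is straightforward with Biopython's standard pK-based routine; the sketch below uses two made-up sequences purely as an illustration.

```python
# Minimal sketch: theoretical isoelectric point (pI) and molecular mass per protein,
# the two quantities plotted in theoretical proteome analyses.
# The sequences are invented examples, not data from the study.
from Bio.SeqUtils.ProtParam import ProteinAnalysis

sequences = {
    "acidic_example": "MDEEDDASEEDDEVLKQ",
    "basic_example": "MKRRKLAHKIRKKLQRA",
}

for name, seq in sequences.items():
    prot = ProteinAnalysis(seq)
    print(f"{name}: pI = {prot.isoelectric_point():.2f}, "
          f"mass = {prot.molecular_weight():.1f} Da")
```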

  2. Using Graph and Vertex Entropy to Compare Empirical Graphs with Theoretical Graph Models

    Directory of Open Access Journals (Sweden)

    Tomasz Kajdanowicz

    2016-09-01

    Full Text Available Over the years, several theoretical graph generation models have been proposed. Among the most prominent are: the Erdős–Rényi random graph model, the Watts–Strogatz small-world model, the Barabási–Albert preferential attachment model, the Price citation model, and many more. Often, researchers working with real-world data are interested in understanding the generative phenomena underlying their empirical graphs. They want to know which of the theoretical graph generation models would most probably generate a particular empirical graph. In other words, they expect some similarity assessment between the empirical graph and graphs artificially created from theoretical graph generation models. Usually, in order to assess the similarity of two graphs, centrality measure distributions are compared. For a theoretical graph model this means comparing the empirical graph to a single realization of the theoretical graph model, where the realization is generated from the given model using an arbitrary set of parameters. The similarity between centrality measure distributions can be measured using standard statistical tests, e.g., the Kolmogorov–Smirnov test of distances between cumulative distributions. However, this approach is both error-prone and leads to incorrect conclusions, as we show in our experiments. Therefore, we propose a new method for graph comparison and type classification that compares the entropies of centrality measure distributions (degree centrality, betweenness centrality, closeness centrality). We demonstrate that our approach can help assign the empirical graph to the most similar theoretical model using a simple unsupervised learning method.
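
    A minimal Python sketch of the idea, assuming NetworkX generators and a plug-in Shannon entropy of the degree-centrality distribution; the empirical graph and all parameter values below are placeholders rather than the authors' setup. The theoretical model whose entropy is closest to the empirical one would be taken as most similar.

    import networkx as nx
    import numpy as np

    def degree_entropy(G, bins=10):
        """Plug-in Shannon entropy (bits) of the degree distribution of G."""
        degrees = np.array([d for _, d in G.degree()], dtype=float)
        hist, _ = np.histogram(degrees, bins=bins)
        p = hist[hist > 0] / hist.sum()
        return float(-(p * np.log2(p)).sum())

    empirical = nx.karate_club_graph()          # stand-in for a real-world graph
    n, m = empirical.number_of_nodes(), empirical.number_of_edges()

    models = {
        "Erdos-Renyi": nx.gnm_random_graph(n, m, seed=1),
        "Watts-Strogatz": nx.watts_strogatz_graph(n, k=4, p=0.1, seed=1),
        "Barabasi-Albert": nx.barabasi_albert_graph(n, m=2, seed=1),
    }

    h_emp = degree_entropy(empirical)
    for name, G in models.items():
        print(f"{name}: |H_model - H_empirical| = {abs(degree_entropy(G) - h_emp):.3f}")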

  3. A short course in quantum information theory. An approach from theoretical physics

    International Nuclear Information System (INIS)

    Diosi, L.

    2007-01-01

    This short and concise primer takes the vantage point of theoretical physics and the unity of physics. It sets out to strip the burgeoning field of quantum information science to its basics by linking it to universal concepts in physics. An extensive lecture rather than a comprehensive textbook, this volume is based on courses delivered over several years to advanced undergraduate and beginning graduate students, but essentially it addresses anyone with a working knowledge of basic quantum physics. Readers will find these lectures a most adequate entry point for theoretical studies in this field. (orig.)

  4. Testing a theoretical model of clinical nurses' intent to stay.

    Science.gov (United States)

    Cowden, Tracy L; Cummings, Greta G

    2015-01-01

    Published theoretical models of nurses' intent to stay (ITS) report inconsistent outcomes, and not all hypothesized models have been adequately tested. Research has focused on cognitive rather than emotional determinants of nurses' ITS. The aim of this study was to empirically verify a complex theoretical model of nurses' ITS that includes both affective and cognitive determinants and to explore the influence of relational leadership on staff nurses' ITS. The study was a correlational, mixed-method, nonexperimental design. A subsample of the Quality Work Environment Study survey data 2009 (n = 415 nurses) was used to test our theoretical model of clinical nurses' ITS as a structural equation model. The model explained 63% of variance in ITS. Organizational commitment, empowerment, and desire to stay were the model concepts with the strongest effects on nurses' ITS. Leadership practices indirectly influenced ITS. How nurses evaluate and respond to their work environment is both an emotional and rational process. Health care organizations need to be cognizant of the influence that nurses' feelings and views of their work setting have on their intention decisions and integrate that knowledge into the development of retention strategies. Leadership practices play an important role in staff nurses' perceptions of the workplace. Identifying the mechanisms by which leadership influences staff nurses' intentions to stay presents additional focus areas for developing retention strategies.

  5. Development, Demonstration, and Field Testing of Enterprise-Wide Distributed Generation Energy Management System: Phase 1 Report

    Energy Technology Data Exchange (ETDEWEB)

    2003-04-01

    This report describes RealEnergy's evolving distributed generation command and control system, called the "Distributed Energy Information System" (DEIS). This system uses algorithms to determine how to operate distributed generation systems efficiently and profitably. The report describes the system and RealEnergy's experiences in installing and applying the system to manage distributed generators for commercial building applications. The report is divided into six tasks. The first five describe the DEIS; the sixth describes RE's regulatory and contractual obligations: Task 1: Define Information and Communications Requirements; Task 2: Develop Command and Control Algorithms for Optimal Dispatch; Task 3: Develop Codes and Modules for Optimal Dispatch Algorithms; Task 4: Test Codes Using Simulated Data; Task 5: Install and Test Energy Management Software; Task 6: Contractual and Regulatory Issues.

  6. Iterative Multiview Side Information for Enhanced Reconstruction in Distributed Video Coding

    Directory of Open Access Journals (Sweden)

    2009-03-01

    Full Text Available Distributed video coding (DVC) is a new paradigm for video compression based on the information-theoretical results of Slepian and Wolf (SW) and Wyner and Ziv (WZ). DVC entails low-complexity encoders as well as separate encoding of correlated video sources. This is particularly attractive for multiview camera systems in video surveillance and camera sensor network applications, where low complexity is required at the encoder. In addition, the separate encoding of the sources implies no communication between the cameras in a practical scenario. This is an advantage since communication is time and power consuming and requires complex networking. In this work, different intercamera estimation techniques for side information (SI) generation are explored and compared in terms of estimation quality, complexity, and rate-distortion (RD) performance. Further, a technique called iterative multiview side information (IMSI) is introduced, where the final SI is used in an iterative reconstruction process. The simulation results show that IMSI significantly improves the RD performance for video with significant motion and activity. Furthermore, DVC outperforms AVC/H.264 Intra for video with average and low motion, but it is still inferior to the Inter No Motion and Inter Motion modes.

  7. Theoretical framework for government information service delivery to deep rural communities in South Africa

    CSIR Research Space (South Africa)

    Mvelase, PS

    2009-10-01

    Full Text Available This paper reports on a study to determine the information requirements of communities in deep rural areas on government services and how this information can be made available to them. The study then proposes an e-government theoretical framework...

  8. Robust and distributed hypothesis testing

    CERN Document Server

    Gül, Gökhan

    2017-01-01

    This book generalizes and extends the available theory in robust and decentralized hypothesis testing. In particular, it presents a robust test for modeling errors which is independent of the assumptions that a sufficiently large number of samples is available and that the distance is the KL-divergence. Here, the distance can be chosen from a much more general model, which includes the KL-divergence as a very special case. This is then extended by various means. A minimax robust test that is robust against both outliers and modeling errors is presented. Minimax robustness properties of the given tests are also explicitly proven for fixed-sample-size and sequential probability ratio tests. The theory of robust detection is extended to robust estimation, and the theory of robust distributed detection is extended to classes of distributions that are not necessarily stochastically bounded. It is shown that the quantization functions for the decision rules can also be chosen to be non-monotone. Finally, the boo...

  9. Comparison of theoretical proteomes: Identification of COGs with conserved and variable pI within the multimodal pI distribution

    Directory of Open Access Journals (Sweden)

    Lynn Andrew M

    2005-09-01

    Full Text Available Abstract Background Theoretical proteome analysis, generated by plotting theoretical isoelectric points (pI) against molecular masses of all proteins encoded by the genome, shows a multimodal distribution for pI. This multimodal distribution is an effect of allowed combinations of the charged amino acids, and not due to evolutionary causes. The variation in this distribution can be correlated to the organism's ecological niche. Contributions to this variation may be mapped to individual proteins by studying the variation in pI of orthologs across microorganism genomes. Results The distribution of ortholog pI values showed trimodal distributions for all prokaryotic genomes analyzed, similar to whole-proteome plots. Pairwise analysis of pI variation shows that a few COGs are conserved within, but most vary between, the acidic and basic regions of the distribution, while molecular mass is more highly conserved. At the level of functional grouping of orthologs, five groups vary significantly from the population of orthologs, which is attributed either to conservation at the level of sequences or to a bias for positively or negatively charged residues contributing to the function. Individual COGs conserved in both the acidic and basic regions of the trimodal distribution are identified, and orthologs that best represent the variation in levels of the acidic and basic regions are listed. Conclusion The analysis of pI distribution by using orthologs provides a basis for resolution of theoretical proteome comparison at the level of individual proteins. Orthologs identified that vary significantly between the major acidic and basic regions may be used as representative of the variation of the entire proteome.

  10. Do bacterial cell numbers follow a theoretical Poisson distribution? Comparison of experimentally obtained numbers of single cells with random number generation via computer simulation.

    Science.gov (United States)

    Koyama, Kento; Hokunan, Hidekazu; Hasegawa, Mayumi; Kawamura, Shuso; Koseki, Shigenobu

    2016-12-01

    We investigated a bacterial sample preparation procedure for single-cell studies. In the present study, we examined whether single bacterial cells obtained via 10-fold dilution followed a theoretical Poisson distribution. Four serotypes of Salmonella enterica, three serotypes of enterohaemorrhagic Escherichia coli and one serotype of Listeria monocytogenes were used as sample bacteria. An inoculum of each serotype was prepared via a 10-fold dilution series to obtain bacterial cell counts with mean values of one or two. To determine whether the experimentally obtained bacterial cell counts follow a theoretical Poisson distribution, a likelihood ratio test was conducted between the experimentally obtained cell counts and a Poisson distribution whose parameter was estimated by maximum likelihood estimation (MLE). The bacterial cell counts of each serotype sufficiently followed a Poisson distribution. Furthermore, to examine the validity of the Poisson parameters obtained from the experimental bacterial cell counts, we compared them with the parameters of a Poisson distribution estimated using random number generation via computer simulation. The Poisson distribution parameters experimentally obtained from bacterial cell counts were within the range of the parameters estimated using the computer simulation. These results demonstrate that the bacterial cell counts of each serotype obtained via 10-fold dilution followed a Poisson distribution. The fact that the frequency of bacterial cell counts follows a Poisson distribution at low numbers can be applied to single-cell studies with a few bacterial cells. In particular, the procedure presented in this study enables us to develop an inactivation model at the single-cell level that can estimate the variability of surviving bacterial numbers during the bacterial death process. Copyright © 2016 Elsevier Ltd. All rights reserved.
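
    A minimal Python sketch of the test described above, using simulated counts and the asymptotic chi-square reference rather than the authors' data and exact procedure; all numbers are assumptions.

    import numpy as np
    from scipy.stats import poisson, chi2

    rng = np.random.default_rng(0)
    counts = rng.poisson(lam=1.5, size=200)   # plays the role of single-cell plate counts

    lam_hat = counts.mean()                   # MLE of the Poisson rate

    # Likelihood ratio (G) statistic: fitted Poisson model vs. the saturated
    # multinomial model over the observed count categories.
    values, freqs = np.unique(counts, return_counts=True)
    loglik_pois = poisson.logpmf(counts, lam_hat).sum()
    loglik_sat = (freqs * np.log(freqs / freqs.sum())).sum()

    G2 = 2.0 * (loglik_sat - loglik_pois)
    df = len(values) - 1 - 1                  # categories - 1 - one fitted parameter
    print(f"G^2 = {G2:.2f}, df = {df}, p = {chi2.sf(G2, df):.3f}")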

  11. Analysis Of Educational Services Distribution-Based Geographic Information System GIS

    Directory of Open Access Journals (Sweden)

    Waleed Lagrab

    2015-03-01

    Full Text Available Abstract This study analyzes the spatial distribution of kindergarten facilities in the study area based on Geographic Information Systems (GIS), in order to test the efficiency of GIS technology for redistributing the existing kindergartens, choosing the best locations in the future, and applying standard criteria for selecting suitable kindergarten sites. To achieve this goal, data and information were collected via interviews and comprehensive statistics on the education facilities in the Mukalla districts in Yemen, which contributed to building a geographic database for the study area. The kindergarten spatial patterns were then analyzed in terms of proximity to each other and to other land uses in the surrounding area, such as streets, highways and factories. The study also measures the concentration, dispersion, clustering and distribution direction of the kindergartens, and it demonstrates the effectiveness of GIS for spatial data analysis. One of the most important findings is that most of the kindergartens established in Mukalla city did not take into account the criteria set by the authorities. Furthermore, almost every district suffers from a shortage in the number of kindergartens, and the pattern of distribution of the existing kindergartens is dominated by spatial dispersion.

  12. Open source tools for the information theoretic analysis of neural data

    Directory of Open Access Journals (Sweden)

    Robin A. A Ince

    2010-05-01

    Full Text Available The recent and rapid development of open-source software tools for the analysis of neurophysiological datasets consisting of multiple simultaneous recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and integrate the information obtained at different spatial and temporal scales. In this Review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in Matlab and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.

  13. Open source tools for the information theoretic analysis of neural data.

    Science.gov (United States)

    Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
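
    A minimal Python sketch (toy data and a plug-in estimator; the toolboxes reviewed here add bias corrections and many other estimators) of the basic quantity such toolboxes compute: the Shannon mutual information between a discrete stimulus label and a binned spike-count response.

    import numpy as np

    def mutual_information(stim, resp):
        """Plug-in estimate of I(S;R) in bits from paired discrete samples."""
        joint, _, _ = np.histogram2d(stim, resp,
                                     bins=[np.unique(stim).size, np.unique(resp).size])
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

    rng = np.random.default_rng(1)
    stimuli = rng.integers(0, 4, size=5000)           # four stimulus classes
    rates = np.array([1.0, 2.0, 4.0, 8.0])[stimuli]   # class-dependent firing rate
    spike_counts = rng.poisson(rates)                 # noisy spike-count responses
    print(f"I(S;R) ~ {mutual_information(stimuli, spike_counts):.3f} bits")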

  14. The equivalence of information-theoretic and likelihood-based methods for neural dimensionality reduction.

    Directory of Open Access Journals (Sweden)

    Ross S Williamson

    2015-04-01

    Full Text Available Stimulus dimensionality-reduction methods in neuroscience seek to identify a low-dimensional space of stimulus features that affect a neuron's probability of spiking. One popular method, known as maximally informative dimensions (MID), uses an information-theoretic quantity known as "single-spike information" to identify this space. Here we examine MID from a model-based perspective. We show that MID is a maximum-likelihood estimator for the parameters of a linear-nonlinear-Poisson (LNP) model, and that the empirical single-spike information corresponds to the normalized log-likelihood under a Poisson model. This equivalence implies that MID does not necessarily find maximally informative stimulus dimensions when spiking is not well described as Poisson. We provide several examples to illustrate this shortcoming, and derive a lower bound on the information lost when spiking is Bernoulli in discrete time bins. To overcome this limitation, we introduce model-based dimensionality reduction methods for neurons with non-Poisson firing statistics, and show that they can be framed equivalently in likelihood-based or information-theoretic terms. Finally, we show how to overcome practical limitations on the number of stimulus dimensions that MID can estimate by constraining the form of the non-parametric nonlinearity in an LNP model. We illustrate these methods with simulations and data from primate visual cortex.

  15. The DELPHI distributed information system for exchanging LEP machine related information

    International Nuclear Information System (INIS)

    Doenszelmann, M.; Gaspar, C.

    1994-01-01

    An information management system was designed and implemented to interchange information between the DELPHI experiment at CERN and the monitoring/control system for the LEP (Large Electron Positron Collider) accelerator. This system is distributed and communicates with many different sources and destinations (LEP) using different types of communication. The system itself communicates internally via a communication system based on a publish-and-subscribe mechanism, DIM (Distributed Information Manager). The information gathered by this system is used for on-line as well as off-line data analysis. Therefore it logs the information to a database and makes it available to operators and users via DUI (DELPHI User Interface). The latter was extended to be capable of displaying ''time-evolution'' plots. It also handles a protocol, implemented using a finite state machine, SMI (State Management Interface), for (semi-)automatic running of the Data Acquisition System and the Slow Controls System. ((orig.))

  16. Optimal information transfer in enzymatic networks: A field theoretic formulation

    Science.gov (United States)

    Samanta, Himadri S.; Hinczewski, Michael; Thirumalai, D.

    2017-07-01

    Signaling in enzymatic networks is typically triggered by environmental fluctuations, resulting in a series of stochastic chemical reactions, leading to corruption of the signal by noise. For example, information flow is initiated by binding of extracellular ligands to receptors, which is transmitted through a cascade involving kinase-phosphatase stochastic chemical reactions. For a class of such networks, we develop a general field-theoretic approach to calculate the error in signal transmission as a function of an appropriate control variable. Application of the theory to a simple push-pull network, a module in the kinase-phosphatase cascade, recovers the exact results for error in signal transmission previously obtained using umbral calculus [Hinczewski and Thirumalai, Phys. Rev. X 4, 041017 (2014), 10.1103/PhysRevX.4.041017]. We illustrate the generality of the theory by studying the minimal errors in noise reduction in a reaction cascade with two connected push-pull modules. Such a cascade behaves as an effective three-species network with a pseudointermediate. In this case, optimal information transfer, resulting in the smallest square of the error between the input and output, occurs with a time delay, which is given by the inverse of the decay rate of the pseudointermediate. Surprisingly, in these examples the minimum error computed using simulations that take nonlinearities and discrete nature of molecules into account coincides with the predictions of a linear theory. In contrast, there are substantial deviations between simulations and predictions of the linear theory in error in signal propagation in an enzymatic push-pull network for a certain range of parameters. Inclusion of second-order perturbative corrections shows that differences between simulations and theoretical predictions are minimized. Our study establishes that a field theoretic formulation of stochastic biological signaling offers a systematic way to understand error propagation in

  17. Online catalog access and distribution of remotely sensed information

    Science.gov (United States)

    Lutton, Stephen M.

    1997-09-01

    Remote sensing is providing voluminous data and value added information products. Electronic sensors, communication electronics, computer software, hardware, and network communications technology have matured to the point where a distributed infrastructure for remotely sensed information is a reality. The amount of remotely sensed data and information is making distributed infrastructure almost a necessity. This infrastructure provides data collection, archiving, cataloging, browsing, processing, and viewing for applications from scientific research to economic, legal, and national security decision making. The remote sensing field is entering a new exciting stage of commercial growth and expansion into the mainstream of government and business decision making. This paper overviews this new distributed infrastructure and then focuses on describing a software system for on-line catalog access and distribution of remotely sensed information.

  18. Improved side information generation for distributed video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2008-01-01

    As a new coding paradigm, distributed video coding (DVC) deals with lossy source coding using side information to exploit the statistics at the decoder to reduce computational demands at the encoder. The performance of DVC highly depends on the quality of side information. With a better side information generation method, fewer bits will be requested from the encoder and more reliable decoded frames will be obtained. In this paper, a side information generation method is introduced to further improve the rate-distortion (RD) performance of transform domain distributed video coding. This algorithm...

  19. A theoretical model evaluating the angular distribution of luminescence emission in X-ray scintillating screens

    International Nuclear Information System (INIS)

    Kandarakis, I.; Cavouras, D.; Nikolopoulos, D.; Episkopakis, A.; Kalivas, N.; Liaparinos, P.; Valais, I.; Kagadis, G.; Kourkoutas, K.; Sianoudis, I.; Dimitropoulos, N.; Nomicos, C.; Panayiotakis, G.

    2006-01-01

    The aim of this study was to examine the angular distribution of the light emitted from radiation-excited scintillators in medical imaging detectors. This distribution diverges from Lambert's cosine law and affects the light emission efficiency of scintillators, hence it also affects the dose burden to the patient. In the present study, the angular distribution was theoretically modeled and was used to fit experimental data on various scintillator materials. Results of calculations revealed that the angular distribution is more directional than that predicted by Lambert's law. Divergence from this law is more pronounced for high values of light attenuation coefficient and thick scintillator layers (screens). This type of divergence reduces light emission efficiency and hence it increases the incident X-ray flux required for a given level of image brightness

  20. Nonlocal approach to the analysis of the stress distribution in granular systems. I. Theoretical framework

    Science.gov (United States)

    Kenkre, V. M.; Scott, J. E.; Pease, E. A.; Hurd, A. J.

    1998-05-01

    A theoretical framework for the analysis of the stress distribution in granular materials is presented. It makes use of a transformation of the vertical spatial coordinate into a formal time variable and the subsequent study of a generally non-Markoffian, i.e., memory-possessing (nonlocal) propagation equation. Previous treatments are obtained as particular cases corresponding to, respectively, wavelike and diffusive limits of the general evolution. Calculations are presented for stress propagation in bounded and unbounded media. They can be used to obtain desired features such as a prescribed stress distribution within the compact.

  1. Auxiliary Heat Exchanger Flow Distribution Test

    International Nuclear Information System (INIS)

    Kaufman, J.S.; Bressler, M.M.

    1983-01-01

    The Auxiliary Heat Exchanger Flow Distribution Test was the first part of a test program to develop a water-cooled (tube-side), compact heat exchanger for removing heat from the circulating gas in a high-temperature gas-cooled reactor (HTGR). Measurements of velocity and pressure were made with various shell side inlet and outlet configurations. A flow configuration was developed which provides acceptable velocity distribution throughout the heat exchanger without adding excessive pressure drop

  2. Scalable Distributed Architectures for Information Retrieval

    National Research Council Canada - National Science Library

    Lu, Zhihong

    1999-01-01

    .... Our distributed architectures exploit parallelism in information retrieval on a cluster of parallel IR servers using symmetric multiprocessors, and use partial collection replication and selection...

  3. Distributed morality in an information society.

    Science.gov (United States)

    Floridi, Luciano

    2013-09-01

    The phenomenon of distributed knowledge is well-known in epistemic logic. In this paper, a similar phenomenon in ethics, somewhat neglected so far, is investigated, namely distributed morality. The article explains the nature of distributed morality, as a feature of moral agency, and explores the implications of its occurrence in advanced information societies. In the course of the analysis, the concept of infraethics is introduced, in order to refer to the ensemble of moral enablers, which, although morally neutral per se, can significantly facilitate or hinder both positive and negative moral behaviours.

  4. Medication errors in residential aged care facilities: a distributed cognition analysis of the information exchange process.

    Science.gov (United States)

    Tariq, Amina; Georgiou, Andrew; Westbrook, Johanna

    2013-05-01

    Medication safety is a pressing concern for residential aged care facilities (RACFs). Retrospective studies in RACF settings identify inadequate communication between RACFs, doctors, hospitals and community pharmacies as the major cause of medication errors. Existing literature offers limited insight about the gaps in the existing information exchange process that may lead to medication errors. The aim of this research was to explicate the cognitive distribution that underlies RACF medication ordering and delivery and to identify gaps in medication-related information exchange which lead to medication errors in RACFs. The study was undertaken in three RACFs in Sydney, Australia. Data were generated through ethnographic field work over a period of five months (May-September 2011). Triangulated analysis of data primarily focused on examining the transformation and exchange of information between different media across the process. The findings of this study highlight the extensive scope and intense nature of information exchange in RACF medication ordering and delivery. Rather than attributing error to individual care providers, the explication of distributed cognition processes enabled the identification of gaps in three information exchange dimensions which potentially contribute to the occurrence of medication errors, namely: (1) design of medication charts, which complicates order processing and record keeping; (2) lack of coordination mechanisms between participants, which results in misalignment of local practices; and (3) reliance on restricted-bandwidth communication channels, mainly telephone and fax, which complicates the information processing requirements. The study demonstrates how the identification of these gaps enhances understanding of medication errors in RACFs. Application of the theoretical lens of distributed cognition can assist in enhancing our understanding of medication errors in RACFs through identification of gaps in information exchange. Understanding...

  5. Phenomenological description of selected elementary chemical reaction mechanisms: An information-theoretic study

    International Nuclear Information System (INIS)

    Esquivel, R.O.; Flores-Gallegos, N.; Iuga, C.; Carrera, E.M.; Angulo, J.C.; Antolin, J.

    2010-01-01

    The information-theoretic description of the course of two elementary chemical reactions allows a phenomenological description of the chemical course of the hydrogenic abstraction and the SN2 identity reactions by use of Shannon entropic measures in position and momentum spaces. The analyses reveal their synchronous/asynchronous mechanistic behavior.

  6. Information-theoretic model selection for optimal prediction of stochastic dynamical systems from data

    Science.gov (United States)

    Darmon, David

    2018-03-01

    In the absence of mechanistic or phenomenological models of real-world systems, data-driven models become necessary. The discovery of various embedding theorems in the 1980s and 1990s motivated a powerful set of tools for analyzing deterministic dynamical systems via delay-coordinate embeddings of observations of their component states. However, in many branches of science, the condition of operational determinism is not satisfied, and stochastic models must be brought to bear. For such stochastic models, the tool set developed for delay-coordinate embedding is no longer appropriate, and a new toolkit must be developed. We present an information-theoretic criterion, the negative log-predictive likelihood, for selecting the embedding dimension for a predictively optimal data-driven model of a stochastic dynamical system. We develop a nonparametric estimator for the negative log-predictive likelihood and compare its performance to a recently proposed criterion based on active information storage. Finally, we show how the output of the model selection procedure can be used to compare candidate predictors for a stochastic system to an information-theoretic lower bound.
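
    A minimal Python sketch of the model-selection idea, using a simple linear-Gaussian predictor on a synthetic AR(2) series instead of the paper's nonparametric estimator; every setting below is an assumption. The selected embedding dimension is the one minimizing the negative log-predictive likelihood on held-out data.

    import numpy as np

    rng = np.random.default_rng(0)
    # AR(2) process: only the two most recent values carry predictive information.
    x = np.zeros(3000)
    for t in range(2, x.size):
        x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal()

    def neg_log_pred_likelihood(series, p, split=2000):
        """Fit a lag-p linear-Gaussian predictor on the first part, score the rest."""
        X = np.column_stack([series[p - k - 1:-k - 1] for k in range(p)])
        y = series[p:]
        Xtr, ytr, Xte, yte = X[:split], y[:split], X[split:], y[split:]
        beta, *_ = np.linalg.lstsq(Xtr, ytr, rcond=None)
        sigma2 = np.mean((ytr - Xtr @ beta) ** 2)
        resid = yte - Xte @ beta
        return 0.5 * np.mean(np.log(2 * np.pi * sigma2) + resid ** 2 / sigma2)

    for p in range(1, 6):
        print(f"embedding dimension {p}: NLPL = {neg_log_pred_likelihood(x, p):.4f}")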

  7. Quantum information and coherence

    CERN Document Server

    Öhberg, Patrik

    2014-01-01

    This book offers an introduction to ten key topics in quantum information science and quantum coherent phenomena, aimed at graduate-student level. The chapters cover some of the most recent developments in this dynamic research field where theoretical and experimental physics, combined with computer science, provide a fascinating arena for groundbreaking new concepts in information processing. The book addresses both the theoretical and experimental aspects of the subject, and clearly demonstrates how progress in experimental techniques has stimulated a great deal of theoretical effort and vice versa. Experiments are shifting from simply preparing and measuring quantum states to controlling and manipulating them, and the book outlines how the first real applications, notably quantum key distribution for secure communication, are starting to emerge. The chapters cover quantum retrodiction, ultracold quantum gases in optical lattices, optomechanics, quantum algorithms, quantum key distribution, quantum cont...

  8. A distributed name resolution system in information centric networks

    Science.gov (United States)

    Elbreiki, Walid; Arlimatti, Shivaleela; Hassan, Suhaidi; Habbal, Adib; Elshaikh, Mohamed

    2016-08-01

    Information Centric Networks (ICN) is a new paradigm that envisages shifting the Internet away from its existing point-to-point architecture to a data-centric one, where communication is based on named information rather than on the hosts that store it. Name resolution is the center of attraction for ICN, where Named Data Objects (NDOs) are used for identifying the information and guiding routing or forwarding inside ICN. Recently, several studies have used a distributed name resolution system (NRS) to overcome the problems of interest flooding, congestion and overloading. Yet the distribution of the NRS has so far been random. How to distribute the NRS is still an important and challenging problem. In this work, we address the problem of NRS distribution by proposing a new mechanism called the Distributed Name Resolution System (DNRS), which considers the time of publishing the NDOs in the NRS. This mechanism partitions the network to distribute the workload among NRSs while increasing storage capacity. In addition, partitioning the network increases the flexibility and scalability of the NRS. We evaluate the effectiveness of our proposed mechanism, which achieves lower end-to-end delay and higher average throughput compared with a random distribution of the NRS, without disturbing the underlying routing or forwarding strategies.

  9. Information-theoretic characterization of dynamic energy systems

    Science.gov (United States)

    Bevis, Troy Lawson

    sources are compounded by the dynamics of the grid itself. Loads are constantly changing, as well as the sources; this can sometimes lead to a quick change in system states. There is a need for a metric able to take into consideration all of the factors detailed above; it needs to take into account the amount of information that is available in the system and the rate at which that information loses its value. In a dynamic system, the information is only valid for a length of time, and the controller must be able to take into account the decay of currently held information. This thesis will present the information theory metrics in a way that is useful for application to dynamic energy systems. A test case involving synchronization of several generators is presented for analysis and application of the theory. The objective is to synchronize all the generators and connect them to a common bus. As the phase shift of each generator is a random process, the effects of latency and information decay can be directly observed. The results of the experiments clearly show that the expected outcomes are observed and that entropy and information theory are valid metrics for timing requirement extraction.

  10. Theoretical study of the dependence of single impurity Anderson model on various parameters within distributional exact diagonalization method

    Science.gov (United States)

    Syaina, L. P.; Majidi, M. A.

    2018-04-01

    The single impurity Anderson model describes a system consisting of non-interacting conduction electrons coupled with a localized orbital having strongly interacting electrons at a particular site. This model has been proven successful in explaining the phenomenon of metal-insulator transition through Anderson localization. Despite the well-understood behaviors of the model, little has been explored theoretically on how the model properties gradually evolve as functions of the hybridization parameter, interaction energy, impurity concentration, and temperature. Here, we propose a theoretical study of those aspects of the single impurity Anderson model using the distributional exact diagonalization method. We solve the model Hamiltonian by randomly generating sampling distributions of some conducting electron energy levels with various numbers of occupying electrons. The resulting eigenvalues and eigenstates are then used to define the local single-particle Green function for each sampled electron energy distribution using the Lehmann representation. Later, we extract the corresponding self-energy of each distribution, then average over all the distributions and construct the local Green function of the system to calculate the density of states. We repeat this procedure for various values of those controllable parameters, and discuss our results in connection with the criteria for the occurrence of the metal-insulator transition in this system.

  11. Information Theoretic Tools for Parameter Fitting in Coarse Grained Models

    KAUST Repository

    Kalligiannaki, Evangelia

    2015-01-07

    We study the application of information theoretic tools for model reduction in the case of systems driven by stochastic dynamics out of equilibrium. The model/dimension reduction is considered by proposing parametrized coarse-grained dynamics and finding the optimal parameter set for which the relative entropy rate with respect to the atomistic dynamics is minimized. The minimization problem leads to a generalization of the force matching methods to non-equilibrium systems. A multiplicative noise example reveals the importance of the diffusion coefficient in the optimization problem.

  12. Information-theoretic semi-supervised metric learning via entropy regularization.

    Science.gov (United States)

    Niu, Gang; Dai, Bo; Yamada, Makoto; Sugiyama, Masashi

    2014-08-01

    We propose a general information-theoretic approach to semi-supervised metric learning called SERAPH (SEmi-supervised metRic leArning Paradigm with Hypersparsity) that does not rely on the manifold assumption. Given the probability parameterized by a Mahalanobis distance, we maximize its entropy on labeled data and minimize its entropy on unlabeled data following entropy regularization. For metric learning, entropy regularization improves manifold regularization by considering the dissimilarity information of unlabeled data in the unsupervised part, and hence it allows the supervised and unsupervised parts to be integrated in a natural and meaningful way. Moreover, we regularize SERAPH by trace-norm regularization to encourage low-dimensional projections associated with the distance metric. The nonconvex optimization problem of SERAPH could be solved efficiently and stably by either a gradient projection algorithm or an EM-like iterative algorithm whose M-step is convex. Experiments demonstrate that SERAPH compares favorably with many well-known metric learning methods, and the learned Mahalanobis distance possesses high discriminability even under noisy environments.

  13. Information Technologies of the Distributed Applications Design

    Directory of Open Access Journals (Sweden)

    Safwan Al SALAIMEH

    2007-01-01

    Full Text Available The questions of distributed systems development based on Java RMI, EJB and J2EE technologies and tools are considered. A comparative analysis is presented which determines the domain in which each of the considered information technologies is most appropriate for the requirements of concrete distributed applications.

  14. Theoretical fringe profiles with crossed Babinet compensators in testing concave aspheric surfaces.

    Science.gov (United States)

    Saxena, A K; Lancelot, J P

    1982-11-15

    This paper presents the theory for the use of crossed Babinet compensators in testing concave aspheric surfaces. Theoretical fringe profiles for a sphere and for an aspheric surface with primary aberration are shown. Advantages of this method are discussed.

  15. Derivation of Human Chromatic Discrimination Ability from an Information-Theoretical Notion of Distance in Color Space.

    Science.gov (United States)

    da Fonseca, María; Samengo, Inés

    2016-12-01

    The accuracy with which humans detect chromatic differences varies throughout color space. For example, we are far more precise when discriminating two similar orange stimuli than two similar green stimuli. In order for two colors to be perceived as different, the neurons representing chromatic information must respond differently, and the difference must be larger than the trial-to-trial variability of the response to each separate color. Photoreceptors constitute the first stage in the processing of color information; many more stages are required before humans can consciously report whether two stimuli are perceived as chromatically distinguishable. Therefore, although photoreceptor absorption curves are expected to influence the accuracy of conscious discriminability, there is no reason to believe that they should suffice to explain it. Here we develop information-theoretical tools based on the Fisher metric that demonstrate that photoreceptor absorption properties explain about 87% of the variance of human color discrimination ability, as tested by previous behavioral experiments. In the context of this theory, the bottleneck in chromatic information processing is determined by photoreceptor absorption characteristics. Subsequent encoding stages modify only marginally the chromatic discriminability at the photoreceptor level.

  16. Box-Cox Test: the theoretical justification and US-China empirical study

    Directory of Open Access Journals (Sweden)

    Tam Bang Vu

    2011-01-01

    Full Text Available In econometrics, the derivation of a theoretical model sometimes leads to two econometric models, each of which can be considered justified on the basis of its own approximation approach. Hence, the choice between the two hinges on applied econometric tools. In this paper, the authors develop a theoretical consumer-maximization model to measure the flow of expenditures on durables, in which depreciation is added to the former classical econometric model. The proposed model was formulated in both linear and logarithmic forms, and Box-Cox tests were used to choose the more appropriate of the two. The proposed model was then applied to historical data from the U.S. and China for a comparative study, and the results are discussed.
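
    A minimal Python sketch of the Box-Cox idea on synthetic data (not the U.S./China series, and applied to the dependent variable alone rather than within the full regression): a maximum-likelihood estimate of lambda near 1 favours the linear form, near 0 the logarithmic form.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Synthetic, roughly lognormal expenditure series, so the log form (lambda ~ 0)
    # should be favoured over the linear form (lambda ~ 1).
    expenditure = np.exp(rng.normal(loc=1.0, scale=0.5, size=500))

    _, lam_hat, (lam_lo, lam_hi) = stats.boxcox(expenditure, alpha=0.05)
    print(f"lambda_hat = {lam_hat:.3f}, 95% CI = ({lam_lo:.3f}, {lam_hi:.3f})")
    print("lambda = 0 (log form) inside CI:   ", lam_lo <= 0.0 <= lam_hi)
    print("lambda = 1 (linear form) inside CI:", lam_lo <= 1.0 <= lam_hi)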

  17. Homogeneity and scale testing of generalized gamma distribution

    International Nuclear Information System (INIS)

    Stehlik, Milan

    2008-01-01

    The aim of this paper is to derive the exact distributions of the likelihood ratio tests of the homogeneity and scale hypotheses when the observations are generalized gamma distributed. The special cases of exponential, Rayleigh, Weibull or gamma distributed observations are discussed exclusively. An analysis of a photoemulsion experiment and a scale test with missing time-to-failure observations are presented to illustrate the applications of the methods discussed
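
    A minimal Python sketch of a scale-homogeneity likelihood ratio test for the exponential special case, using simulated samples and the asymptotic chi-square reference rather than the exact distributions derived in the paper; all values are assumptions.

    import numpy as np
    from scipy.stats import chi2

    rng = np.random.default_rng(0)
    x = rng.exponential(scale=2.0, size=60)   # sample 1
    y = rng.exponential(scale=2.8, size=60)   # sample 2

    def exp_loglik(sample, scale):
        """Exponential log-likelihood with the given scale parameter."""
        return -sample.size * np.log(scale) - sample.sum() / scale

    # MLEs of the scale under the alternative (separate scales) and the null (common scale).
    ll_alt = exp_loglik(x, x.mean()) + exp_loglik(y, y.mean())
    pooled = np.concatenate([x, y]).mean()
    ll_null = exp_loglik(x, pooled) + exp_loglik(y, pooled)

    lr = 2.0 * (ll_alt - ll_null)
    print(f"LR statistic = {lr:.3f}, asymptotic p-value = {chi2.sf(lr, df=1):.4f}")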

  18. The asymmetric distribution of informative face information during gender recognition.

    Science.gov (United States)

    Hu, Fengpei; Hu, Huan; Xu, Lian; Qin, Jungang

    2013-02-01

    Recognition of the gender of a face is important in social interactions. In the current study, the distribution of informative facial information during gender judgment was systematically examined using two methods, the Bubbles and Focus windows techniques. Two experiments found that the most informative information was around the eyes, followed by the mouth and nose. Other parts of the face contributed to gender recognition but were less important. The left side of the face was used more during gender recognition in both experiments. These results show that areas around the eyes are the main source of information for gender judgment and demonstrate perceptual asymmetry with a normal (non-chimeric) face.

  19. Bayesian Approach for Constant-Stress Accelerated Life Testing for Kumaraswamy Weibull Distribution with Censoring

    Directory of Open Access Journals (Sweden)

    Abeer Abd-Alla EL-Helbawy

    2016-09-01

    Full Text Available Accelerated life tests provide quick information on life time distributions by testing materials or products at higher than basic conditional levels of stress, such as pressure, high temperature, vibration, voltage or load, to induce failures. In this paper, the acceleration model assumed is the log-linear model. Constant-stress tests are discussed based on Type I and Type II censoring. The Kumaraswamy Weibull distribution is used. The estimators of the parameters, reliability, hazard rate functions and p-th percentile at normal condition, low stress, and high stress are obtained. In addition, credible intervals for the parameters of the models are constructed. An optimum test plan is designed. Some numerical methods, such as the Laplace and Markov Chain Monte Carlo methods, are used to solve the complicated integrals.

  20. Bayesian Approach for Constant-Stress Accelerated Life Testing for Kumaraswamy Weibull Distribution with Censoring

    Directory of Open Access Journals (Sweden)

    Abeer Abd-Alla EL-Helbawy

    2016-12-01

    Full Text Available Accelerated life tests provide quick information on life time distributions by testing materials or products at higher than basic conditional levels of stress, such as pressure, high temperature, vibration, voltage or load, to induce failures. In this paper, the acceleration model assumed is the log-linear model. Constant-stress tests are discussed based on Type I and Type II censoring. The Kumaraswamy Weibull distribution is used. The estimators of the parameters, reliability, hazard rate functions and p-th percentile at normal condition, low stress, and high stress are obtained. In addition, credible intervals for the parameters of the models are constructed. An optimum test plan is designed. Some numerical methods, such as the Laplace and Markov Chain Monte Carlo methods, are used to solve the complicated integrals.

  1. A brief overview of the distribution test grids with a distributed generation inclusion case study

    Directory of Open Access Journals (Sweden)

    Stanisavljević Aleksandar M.

    2018-01-01

    Full Text Available The paper presents an overview of the electric distribution test grids issued by different technical institutions. They are used for testing different scenarios of grid operation for research, benchmarking, comparison and other purposes. Their types, main characteristics and features, as well as their application possibilities, are shown. Recently, these grids have been modified to include distributed generation. An example of the modification and application of the IEEE 13-bus test feeder for testing the effects of faults, in cases without and with a distributed generator connected to the grid, is presented. [Project of the Serbian Ministry of Education, Science and Technological Development, Grant no. III 042004: Smart Electricity Distribution Grids Based on Distribution Management System and Distributed Generation]

  2. Study of mass and momentum transfer in diesel sprays based on X-ray mass distribution measurements and on a theoretical derivation

    Energy Technology Data Exchange (ETDEWEB)

    Desantes, J.M.; Salvador, F.J.; Lopez, J.J.; Morena, J. de la [Universidad Politecnica de Valencia, CMT-Motores Termicos, Valencia (Spain)

    2011-02-15

    In this paper, a research aimed at quantifying mass and momentum transfer in the near-nozzle field of diesel sprays injected into stagnant ambient air is reported. The study combines X-ray measurements for two different nozzles and axial positions, which provide mass distributions in the spray, with a theoretical model based on momentum flux conservation, which was previously validated. This investigation has allowed the validation of Gaussian profiles for local fuel concentration and velocity near the nozzle exit, as well as the determination of Schmidt number at realistic diesel spray conditions. This information could be very useful for those who are interested in spray modeling, especially at high-pressure injection conditions. (orig.)

  3. Theoretical and experimental stress analyses of ORNL thin-shell cylinder-to-cylinder model 3

    International Nuclear Information System (INIS)

    Gwaltney, R.C.; Bolt, S.E.; Corum, J.M.; Bryson, J.W.

    1975-06-01

    The third in a series of four thin-shell cylinder-to-cylinder models was tested, and the experimentally determined elastic stress distributions were compared with theoretical predictions obtained from a thin-shell finite-element analysis. The models are idealized thin-shell structures consisting of two circular cylindrical shells that intersect at right angles. There are no transitions, reinforcements, or fillets in the junction region. This series of model tests serves two basic purposes: the experimental data provide design information directly applicable to nozzles in cylindrical vessels; and the idealized models provide test results for use in developing and evaluating theoretical analyses applicable to nozzles in cylindrical vessels and to thin piping tees. The cylinder of model 3 had a 10 in. OD and the nozzle had a 1.29 in. OD, giving a d0/D0 ratio of 0.129. The OD/thickness ratios for the cylinder and the nozzle were 50 and 7.68 respectively. Thirteen separate loading cases were analyzed. In each, one end of the cylinder was rigidly held. In addition to an internal pressure loading, three mutually perpendicular force components and three mutually perpendicular moment components were individually applied at the free end of the cylinder and at the end of the nozzle. The experimental stress distributions for all the loadings were obtained using 158 three-gage strain rosettes located on the inner and outer surfaces. The loading cases were also analyzed theoretically using a finite-element shell analysis developed at the University of California, Berkeley. The analysis used flat-plate elements and considered five degrees of freedom per node in the final assembled equations. The comparisons between theory and experiment show reasonably good agreement for this model. (U.S.)

  4. Theoretical and experimental stress analyses of ORNL thin-shell cylinder-to-cylinder model 4

    International Nuclear Information System (INIS)

    Gwaltney, R.C.; Bolt, S.E.; Bryson, J.W.

    1975-06-01

    The last in a series of four thin-shell cylinder-to-cylinder models was tested, and the experimentally determined elastic stress distributions were compared with theoretical predictions obtained from a thin-shell finite-element analysis. The models in the series are idealized thin-shell structures consisting of two circular cylindrical shells that intersect at right angles. There are no transitions, reinforcements, or fillets in the junction region. This series of model tests serves two basic purposes: (1) the experimental data provide design information directly applicable to nozzles in cylindrical vessels, and (2) the idealized models provide test results for use in developing and evaluating theoretical analyses applicable to nozzles in cylindrical vessels and to thin piping tees. The cylinder of model 4 had an outside diameter of 10 in., and the nozzle had an outside diameter of 1.29 in., giving a d0/D0 ratio of 0.129. The OD/thickness ratios were 50 and 20.2 for the cylinder and nozzle respectively. Thirteen separate loading cases were analyzed. For each loading condition one end of the cylinder was rigidly held. In addition to an internal pressure loading, three mutually perpendicular force components and three mutually perpendicular moment components were individually applied at the free end of the cylinder and at the end of the nozzle. The experimental stress distributions for each of the 13 loadings were obtained using 157 three-gage strain rosettes located on the inner and outer surfaces. Each of the 13 loading cases was also analyzed theoretically using a finite-element shell analysis developed at the University of California, Berkeley. The analysis used flat-plate elements and considered five degrees of freedom per node in the final assembled equations. The comparisons between theory and experiment show reasonably good agreement for this model. (U.S.)

  5. Verifying seismic design of nuclear reactors by testing. Volume 2: appendix, theoretical discussions

    International Nuclear Information System (INIS)

    1979-01-01

    Theoretical discussions on seismic design testing are presented under the following appendix headings: system functions, pulse optimization program, system identification, and motion response calculations from inertance measurements of a nuclear power plant

  6. Theoretically informed correlates of hepatitis B knowledge among four Asian groups: the health behavior framework.

    Science.gov (United States)

    Maxwell, Annette E; Stewart, Susan L; Glenn, Beth A; Wong, Weng Kee; Yasui, Yutaka; Chang, L Cindy; Taylor, Victoria M; Nguyen, Tung T; Chen, Moon S; Bastani, Roshan

    2012-01-01

    Few studies have examined theoretically informed constructs related to hepatitis B (HBV) testing, and comparisons across studies are challenging due to lack of uniformity in constructs assessed. The present analysis examined relationships among Health Behavior Framework factors across four Asian American groups to advance the development of theory-based interventions for HBV testing in at-risk populations. Data were collected from 2007-2010 as part of baseline surveys during four intervention trials promoting HBV testing among Vietnamese-, Hmong-, Korean- and Cambodian-Americans (n = 1,735). Health Behavior Framework constructs assessed included: awareness of HBV, knowledge of transmission routes, perceived susceptibility, perceived severity, doctor recommendation, stigma of HBV infection, and perceived efficacy of testing. Within each group we assessed associations between our intermediate outcome of knowledge of HBV transmission and other constructs, to assess the concurrent validity of our model and instruments. While the absolute levels for Health Behavior Framework factors varied across groups, relationships between knowledge and other factors were generally consistent. This suggests similarities rather than differences with respect to posited drivers of HBV-related behavior. Our findings indicate that Health Behavior Framework constructs are applicable to diverse ethnic groups and provide preliminary evidence for the construct validity of the Health Behavior Framework.

  7. Canonical analysis based on mutual information

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2015-01-01

    combinations with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second order statistics. While CCA is ideal for Gaussian data, CIA facilitates...

  8. Distributed video coding with multiple side information

    DEFF Research Database (Denmark)

    Huang, Xin; Brites, C.; Ascenso, J.

    2009-01-01

    Distributed Video Coding (DVC) is a new video coding paradigm which mainly exploits the source statistics at the decoder based on the availability of some decoder side information. The quality of the side information has a major impact on the DVC rate-distortion (RD) performance, in the same way the quality of the predictions had a major impact in predictive video coding. In this paper, a DVC solution exploiting multiple side information is proposed; the multiple side information is generated by frame interpolation and frame extrapolation targeting to improve the side information of a single...

  9. Charge state distribution of 16O from the 4He(12C,16O)γ reaction of astrophysical interest studied both experimentally and theoretically

    International Nuclear Information System (INIS)

    Liu, Shengjin; Sakurai, Makoto; Sagara, Kenshi; Teranishi, Takashi; Fujita, Kunihiro; Yamaguchi, Hiroyuki; Matsuda, Sayaka; Mitsuzumi, Tatsuki; Iwazaki, Makoto; Rosary, Mariya T.; Kato, Daiji; Tolstikhina, I.Yu.

    2014-01-01

    In astrophysics, the 4He(12C,16O)γ reaction plays an important role. At the Kyushu University Tandem accelerator Laboratory (KUTL), the measurement of the 4He(12C,16O)γ cross section is in progress in the energy range relevant to the astrophysical nuclear reaction. Since the charge states of the product 16O ions after passing through the gas target are spread over several values and only one charge state can be measured at the terminal detector, it is necessary to know the charge state distribution of 16O ions passing through the He gas target precisely. Here, we report the charge state distribution of the 16O recoils both experimentally and theoretically. Experimentally, we measured the equilibrium charge state distribution of 16O ions in the windowless helium gas target with the beam energy of primary 16O ions at 7.2, 4.5, and 3.45 MeV at KUTL. The measured charge state fractions followed a Gaussian distribution. Theoretically, we proposed a framework for studying the charge state distribution. The charge state distribution was computed by solving a set of differential equations that includes a series of charge exchange cross sections. For the ionization cross section, the plane-wave Born approximation was applied and modified by taking target atomic screening as a function of momentum transfer into account. For the capture cross section, the continuum distorted wave approximation was applied and the influence of the gas target density on the electron capture process was taken into account. Using the above charge exchange cross sections, the charge state evolution was simulated. At the equilibrium distribution, we compared the theoretical calculation to the experimental data. After taking the density effects in the charge exchange process into account, the theoretical charge state distributions show good agreement with the experimental data. Both experimental and theoretical results are useful for understanding the charge fraction of recoil oxygen created via the 4He(12C,16O)γ reaction

  10. Ray-tracing toroidal axisymmetric devices. 1. theoretical analysis

    International Nuclear Information System (INIS)

    Cardinali, A.; Brambilla, M.

    1981-06-01

    The ray tracing technique for lower hybrid waves is used to obtain information about accessibility, power deposition profiles and, eventually, the electric field distribution. In the first part a critical discussion establishing the meaning and validity of this technique is presented, while in the second part applications to small and to large, fat tokamaks are presented, which support and explain the theoretical arguments

  11. Distributed team cohesion – not an oxymoron. The impact of information and communications technologies on teamness in globally distributed IT projects

    Directory of Open Access Journals (Sweden)

    Olga Stawnicza

    2015-01-01

    Full Text Available Globally distributed IT projects are common practice in today’s globalized world. Typically, project team members work on interdependent tasks, with a common goal to be achieved as one team. However, being split between multiple locations impedes communication among team members and hampers the development of trust. Information and communications media enable communication between geographically distributed project team members and help to create and maintain trust within project units. Communication and trust are particularly significant for fostering a feeling of oneness among project team members. Oneness, also referred to as “teamness”, is repeatedly mentioned as one of the challenges facing global project teams. However, prior literature on teamness is very scarce and its importance is underrepresented. This research contributes to the field in two ways. First, the theoretical study based on a systematic literature review examines available evidence of teamness in globally distributed projects. Secondly, an empirical study based on interviews conducted with global project managers fills the current gap in literature on the link between use of ICT and establishing a sense of team unity. This paper draws practitioners’ attention to the importance of striving for teamness in spite of the geographical distance that exists between project team members.

  12. Information, power, and social rationality

    International Nuclear Information System (INIS)

    Keck, O.

    1993-01-01

    The effect of institutional structures on economic and political processes, and on the results of such processes, is a central problem of economics and political science. This study adopts the fundamental thesis of recent information economics, which maintains that unevenly distributed information has a negative effect on politico-economic systems and that the corresponding individual and collective countermeasures are key to the structures and characteristics of economic and political institutions. The theoretical part presents some relatively simple game theory models to elucidate the basic problems caused by unevenly distributed information in politico-economic systems. The empirical part is dedicated to an international comparative analysis of nuclear energy policies in the United States, Great Britain, France, and Germany. The results reveal that the information economics approach explains how the different institutional arrangements have influenced the policies and their outcomes in the respective countries. The last chapter relates the theoretical and empirical results to further political-science, sociological and economic approaches. Points of contact with, and differences from, combined-federalist, neo-corporatist, bureaucratic-politics and technocratic approaches are discussed. (orig./UA)

  13. Information theoretically secure, enhanced Johnson noise based key distribution over the smart grid with switched filters.

    Science.gov (United States)

    Gonzalez, Elias; Kish, Laszlo B; Balog, Robert S; Enjeti, Prasad

    2013-01-01

    We introduce a protocol with a reconfigurable filter system to create non-overlapping single loops in the smart power grid for the realization of the Kirchhoff-Law-Johnson-(like)-Noise secure key distribution system. The protocol is valid for one-dimensional radial networks (chain-like power line) which are typical of the electricity distribution network between the utility and the customer. The speed of the protocol (the number of steps needed) versus grid size is analyzed. When properly generalized, such a system has the potential to achieve unconditionally secure key distribution over the smart power grid of arbitrary geometrical dimensions.
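
    As a rough illustration of the underlying Kirchhoff-Law-Johnson-Noise idea (not the switched-filter smart-grid protocol of the paper), the Python sketch below simulates idealized bit exchange on a single loop. The resistor values, the coarse noise-level observation, and the bit convention are assumptions made for the example only.

        import random

        R_LOW, R_HIGH = 1e3, 100e3   # hypothetical resistor values (ohms)

        def kljn_round():
            """One idealized KLJN round: each party connects a random resistor."""
            alice = random.choice(("L", "H"))
            bob = random.choice(("L", "H"))
            r = {"L": R_LOW, "H": R_HIGH}
            # The publicly measurable loop noise level scales with the parallel
            # resistance, which is identical for the LH and HL cases.
            level = r[alice] * r[bob] / (r[alice] + r[bob])
            if level in (R_LOW / 2, R_HIGH / 2):   # LL or HH: insecure, discard
                return None
            # Mixed case: each party knows its own resistor, so each can infer
            # the other's and both derive the same secret bit.
            return (1 if alice == "H" else 0, 1 if bob == "L" else 0)

        key_alice, key_bob = [], []
        while len(key_alice) < 32:
            outcome = kljn_round()
            if outcome:
                a_bit, b_bit = outcome
                key_alice.append(a_bit)
                key_bob.append(b_bit)

        assert key_alice == key_bob
        print("shared 32-bit key:", "".join(map(str, key_alice)))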

  14. Information-theoretic limitations on approximate quantum cloning and broadcasting

    Science.gov (United States)

    Lemm, Marius; Wilde, Mark M.

    2017-07-01

    We prove quantitative limitations on any approximate simultaneous cloning or broadcasting of mixed states. The results are based on information-theoretic (entropic) considerations and generalize the well-known no-cloning and no-broadcasting theorems. We also observe and exploit the fact that the universal cloning machine on the symmetric subspace of n qudits and symmetrized partial trace channels are dual to each other. This duality manifests itself both in the algebraic sense of adjointness of quantum channels and in the operational sense that a universal cloning machine can be used as an approximate recovery channel for a symmetrized partial trace channel and vice versa. The duality extends to give control of the performance of generalized universal quantum cloning machines (UQCMs) on subspaces more general than the symmetric subspace. This gives a way to quantify the usefulness of a priori information in the context of cloning. For example, we can control the performance of an antisymmetric analog of the UQCM in recovering from the loss of n-k fermionic particles.

  15. Strengthening Theoretical Testing in Criminology Using Agent-based Modeling.

    Science.gov (United States)

    Johnson, Shane D; Groff, Elizabeth R

    2014-07-01

    The Journal of Research in Crime and Delinquency (JRCD) has published important contributions to both criminological theory and associated empirical tests. In this article, we consider some of the challenges associated with traditional approaches to social science research, and discuss a complementary approach that is gaining popularity, agent-based computational modeling, which may offer new opportunities to strengthen theories of crime and develop insights into phenomena of interest. Two literature reviews are completed. The aim of the first is to identify those articles published in JRCD that have been the most influential and to classify the theoretical perspectives taken. The second is intended to identify those studies that have used an agent-based model (ABM) to examine criminological theories and to identify which theories have been explored. Ecological theories of crime pattern formation have received the most attention from researchers using ABMs, but many other criminological theories are amenable to testing using such methods. Traditional methods of theory development and testing suffer from a number of potential issues that a more systematic use of ABMs (not without its own issues) may help to overcome. ABMs should become another method in the criminologist's toolbox to aid theory testing and falsification.

  16. Equivalence Testing of Complex Particle Size Distribution Profiles Based on Earth Mover's Distance.

    Science.gov (United States)

    Hu, Meng; Jiang, Xiaohui; Absar, Mohammad; Choi, Stephanie; Kozak, Darby; Shen, Meiyu; Weng, Yu-Ting; Zhao, Liang; Lionberger, Robert

    2018-04-12

    Particle size distribution (PSD) is an important property of particulates in drug products. In the evaluation of generic drug products formulated as suspensions, emulsions, and liposomes, the PSD comparisons between a test product and the branded product can provide useful information regarding in vitro and in vivo performance. Historically, the FDA has recommended the population bioequivalence (PBE) statistical approach to compare the PSD descriptors D50 and SPAN from test and reference products to support product equivalence. In this study, the earth mover's distance (EMD) is proposed as a new metric for comparing PSDs, particularly when the PSD profile exhibits a complex distribution (e.g., multiple peaks) that is not accurately described by the D50 and SPAN descriptors. EMD is a statistical metric that measures the discrepancy (distance) between size distribution profiles without a prior assumption of the distribution. PBE is then adopted to perform a statistical test to establish equivalence based on the calculated EMD distances. Simulations show that the proposed EMD-based approach is effective in comparing test and reference profiles for equivalence testing and is superior to commonly used distance measures, e.g., Euclidean and Kolmogorov-Smirnov distances. The proposed approach was demonstrated by evaluating equivalence of cyclosporine ophthalmic emulsion PSDs that were manufactured under different conditions. Our results show that the proposed approach can effectively pass an equivalent product (e.g., reference product against itself) and reject an inequivalent product (e.g., reference product against negative control), thus suggesting its usefulness in supporting bioequivalence determination of a test product against a reference product when both possess multimodal PSDs.
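
    The distance computation itself is easy to reproduce for one-dimensional profiles. The Python sketch below compares two hypothetical bimodal PSD profiles with scipy's one-dimensional Wasserstein distance, which coincides with the earth mover's distance; the size grid and profiles are invented for illustration, and the PBE step applied to the resulting distances is omitted.

        import numpy as np
        from scipy.stats import wasserstein_distance

        sizes = np.linspace(0.1, 10.0, 200)     # particle-size grid (microns), hypothetical
        ref = (0.6 * np.exp(-0.5 * ((sizes - 2.0) / 0.4) ** 2)
               + 0.4 * np.exp(-0.5 * ((sizes - 6.0) / 0.8) ** 2))
        test = (0.5 * np.exp(-0.5 * ((sizes - 2.2) / 0.4) ** 2)
                + 0.5 * np.exp(-0.5 * ((sizes - 6.1) / 0.8) ** 2))
        ref /= ref.sum()                        # normalize both profiles to unit mass
        test /= test.sum()

        # 1-D earth mover's distance, treating each PSD profile as weights on the grid.
        emd = wasserstein_distance(sizes, sizes, u_weights=ref, v_weights=test)
        print(f"EMD between test and reference PSD profiles: {emd:.4f} microns")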

  17. Comparison of information-theoretic to statistical methods for gene-gene interactions in the presence of genetic heterogeneity

    Directory of Open Access Journals (Sweden)

    Sucheston Lara

    2010-09-01

    Full Text Available Abstract Background Multifactorial diseases such as cancer and cardiovascular diseases are caused by the complex interplay between genes and environment. The detection of these interactions remains challenging due to computational limitations. Information theoretic approaches use computationally efficient directed search strategies and thus provide a feasible solution to this problem. However, the power of information theoretic methods for interaction analysis has not been systematically evaluated. In this work, we compare power and Type I error of an information-theoretic approach to existing interaction analysis methods. Methods The k-way interaction information (KWII) metric for identifying variable combinations involved in gene-gene interactions (GGI) was assessed using several simulated data sets under models of genetic heterogeneity driven by susceptibility-increasing loci with varying allele frequency, penetrance values and heritability. The power and proportion of false positives of the KWII was compared to multifactor dimensionality reduction (MDR), the restricted partitioning method (RPM) and logistic regression. Results The power of the KWII was considerably greater than that of MDR on all six simulation models examined. For a given disease prevalence at high values of heritability, the power of both RPM and KWII was greater than 95%. For models with low heritability and/or genetic heterogeneity, the power of the KWII was consistently greater than that of RPM; the improvements in power for the KWII over RPM ranged from 4.7% to 14.2% for α = 0.001 in the three models at the lowest heritability values examined. KWII performed similarly to logistic regression. Conclusions Information theoretic models are flexible and have excellent power to detect GGI under a variety of conditions that characterize complex diseases.
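
    The KWII is built from an alternating sum of Shannon entropies over subsets of the variables (loci plus phenotype). The Python sketch below computes this interaction-information form for discrete data; the toy genotype and phenotype columns are invented, and the paper's exact KWII definition and sign convention may differ.

        from itertools import combinations
        from collections import Counter
        from math import log2

        def entropy(samples):
            """Shannon entropy (bits) of a sequence of hashable outcomes."""
            n = len(samples)
            return -sum((c / n) * log2(c / n) for c in Counter(samples).values())

        def kwii(columns):
            """Alternating-entropy interaction information over the given variables."""
            k = len(columns)
            rows = list(zip(*columns))
            total = 0.0
            for r in range(1, k + 1):
                for subset in combinations(range(k), r):
                    h = entropy([tuple(row[i] for i in subset) for row in rows])
                    total += ((-1) ** r) * h
            return -total   # for two variables this reduces to the mutual information

        # Toy example: two loci (0/1/2 genotypes) and a binary phenotype.
        locus1 = [0, 1, 2, 0, 1, 2, 0, 1, 2, 0, 1, 2]
        locus2 = [0, 0, 1, 1, 2, 2, 0, 0, 1, 1, 2, 2]
        phenotype = [0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
        print("3-way KWII (bits):", round(kwii([locus1, locus2, phenotype]), 4))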

  18. Data, Information, Knowledge, Wisdom (DIKW): A Semiotic Theoretical and Empirical Exploration of the Hierarchy and its Quality Dimension

    Directory of Open Access Journals (Sweden)

    Sasa Baskarada

    2013-03-01

    Full Text Available What exactly is the difference between data and information? What is the difference between data quality and information quality; is there any difference between the two? And, what are knowledge and wisdom? Are there such things as knowledge quality and wisdom quality? As these primitives are the most basic axioms of information systems research, it is somewhat surprising that consensus on exact definitions seems to be lacking. This paper presents a theoretical and empirical exploration of the sometimes directly quoted, and often implied Data, Information, Knowledge, Wisdom (DIKW) hierarchy and its quality dimension. We first review relevant literature from a range of perspectives and develop and contextualise a theoretical DIKW framework through semiotics. The literature review identifies definitional commonalities and divergences from a scholarly perspective; the theoretical discussion contextualises the terms and their relationships within a semiotic framework and proposes relevant definitions grounded in that framework. Next, rooted in Wittgenstein’s ordinary language philosophy, we analyse 20 online news articles for their uses of the terms and present the results of an online focus group discussion comprising 16 information systems experts. The empirical exploration identifies a range of definitional ambiguities from a practical perspective.

  19. Encryption of covert information into multiple statistical distributions

    International Nuclear Information System (INIS)

    Venkatesan, R.C.

    2007-01-01

    A novel strategy to encrypt covert information (code) via unitary projections into the null spaces of ill-conditioned eigenstructures of multiple host statistical distributions, inferred from incomplete constraints, is presented. The host pdf's are inferred using the maximum entropy principle. The projection of the covert information is dependent upon the pdf's of the host statistical distributions. The security of the encryption/decryption strategy is based on the extreme instability of the encoding process. A self-consistent procedure to derive keys for both symmetric and asymmetric cryptography is presented. The advantages of using a multiple pdf model to achieve encryption of covert information are briefly highlighted. Numerical simulations exemplify the efficacy of the model
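
    As a heavily simplified, generic illustration of hiding a code vector in the near-null space of an ill-conditioned structure (the maximum-entropy inference of the host pdf's, the key derivation and the security analysis are not reproduced), a numpy sketch with an entirely hypothetical symmetric matrix:

        import numpy as np

        rng = np.random.default_rng(7)

        # Hypothetical ill-conditioned, symmetric "eigenstructure": three eigenvalues
        # are pushed to nearly zero.
        Q, _ = np.linalg.qr(rng.normal(size=(8, 8)))   # random orthonormal basis
        eigvals = np.array([5.0, 3.0, 2.0, 1.0, 0.5, 1e-10, 1e-10, 1e-10])
        A = Q @ np.diag(eigvals) @ Q.T

        null_basis = Q[:, eigvals < 1e-6]    # near-null-space directions (8 x 3)

        code = np.array([0.3, -1.2, 0.8])    # covert information: three coefficients
        carrier = A @ rng.normal(size=8)     # host vector, lies in the range of A
        stego = carrier + null_basis @ code  # embedding barely perturbs A @ stego

        recovered = null_basis.T @ stego     # the carrier projects to ~0 on the null space
        print(np.round(recovered, 3))        # approximately the embedded code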

  20. Information system for personnel work distribution in Kaunas Maironis gymnasium

    OpenAIRE

    Ivanauskaitė, Eglė

    2005-01-01

    This is a master's degree work in information technology. It investigates the task of distributing teachers' work according to schoolchildren's individual study plans, and analyzes the process of teachers' work distribution and the associated information needs. The information system was designed in MS Visio and implemented with MS SQL Server and VBA.

  1. State-of-the-Art: Research Theoretical Framework of Information Systems Implementation Research in the Health Sector in Sub-Saharan Africa

    DEFF Research Database (Denmark)

    Tetteh, Godwin Kofi

    2014-01-01

    This study is about the state of the art of reference theories and theoretical frameworks of information systems implementation research in the health industry in Sub-Saharan countries from a process perspective. A process-variance framework, Poole et al. (2000), Markus & Robey (1988......) and Shaw & Jarvenpaa (1997), is employed to examine reference theories employed in research conducted on information systems implementation in the health sector in the Sub-Saharan region and published between 2003 and 2013. Using a number of key words and searching on a number of databases, EBSCO, CSA...... the process theoretical framework to enhance our insight into successful information systems implementation in the region. We are optimistic that the process-based theoretical framework will be useful for information system practitioners, organisational managers and researchers in the health sector...

  2. Nash Bargaining Game-Theoretic Framework for Power Control in Distributed Multiple-Radar Architecture Underlying Wireless Communication System

    Directory of Open Access Journals (Sweden)

    Chenguang Shi

    2018-04-01

    Full Text Available This paper presents a novel Nash bargaining solution (NBS)-based cooperative game-theoretic framework for power control in a distributed multiple-radar architecture underlying a wireless communication system. Our primary objective is to minimize the total power consumption of the distributed multiple-radar system (DMRS) with the protection of wireless communication user’s transmission, while guaranteeing each radar’s target detection requirement. A unified cooperative game-theoretic framework is proposed for the optimization problem, where interference power constraints (IPCs) are imposed to protect the communication user’s transmission, and a minimum signal-to-interference-plus-noise ratio (SINR) requirement is employed to provide reliable target detection for each radar. The existence, uniqueness and fairness of the NBS to this cooperative game are proven. An iterative Nash bargaining power control algorithm with low computational complexity and fast convergence is developed and is shown to converge to a Pareto-optimal equilibrium for the cooperative game model. Numerical simulations and analyses are further presented to highlight the advantages and testify to the efficiency of our proposed cooperative game algorithm. It is demonstrated that the distributed algorithm is effective for power control and could protect the communication system with limited implementation overhead.

  3. Quantum cryptography: Theoretical protocols for quantum key distribution and tests of selected commercial QKD systems in commercial fiber networks

    Science.gov (United States)

    Jacak, Monika; Jacak, Janusz; Jóźwiak, Piotr; Jóźwiak, Ireneusz

    2016-06-01

    An overview of the current status of quantum cryptography is given with regard to quantum key distribution (QKD) protocols, implemented on both nonentangled and entangled flying qubits. Two commercial R&D platforms of QKD systems are described (the Clavis II platform by idQuantique, implemented on nonentangled photons, and the EPR S405 Quelle platform by AIT, based on entangled photons) and tested for the feasibility of their usage in commercial TELECOM fiber metropolitan networks. A comparison of the systems' efficiency, stability and resistance to noise and hacker attacks is given, with some suggestions for system improvement, along with an assessment of the two models of QKD.

  4. Comment on ‘Information hidden in the velocity distribution of ions and the exact kinetic Bohm criterion’

    Science.gov (United States)

    Mustafaev, A. S.; Sukhomlinov, V. S.; Timofeev, N. A.

    2018-03-01

    This Comment is devoted to some mathematical inaccuracies made by the authors of the paper ‘Information hidden in the velocity distribution of ions and the exact kinetic Bohm criterion’ (Plasma Sources Science and Technology 26 055003). In the Comment, we show that the range of plasma parameters over which the theoretical results obtained by the authors are valid was defined incorrectly, and we give a more accurate definition of this range. As a result, we show that it is impossible to confirm or refute the feasibility of the Bohm kinetic criterion on the basis of the data of the cited paper.

  5. Testing Agile Information Management Systems with Video Test Client. Case Study - DIMES

    National Research Council Canada - National Science Library

    Yan, Lok K

    2006-01-01

    .... At the core of NCW is the ability for these distributed forces to collaborate and share information, which are the two services provided by information management systems located on the Global Information Grid (GIG...

  6. Finding a minimally informative Dirichlet prior distribution using least squares

    International Nuclear Information System (INIS)

    Kelly, Dana; Atwood, Corwin

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in the form of a standard distribution (e.g., beta, gamma), and so a beta distribution is used as an approximation in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.
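
    The abstract does not spell out the least-squares objective, so the sketch below is only one plausible reading: the Dirichlet mean is pinned to nominal alpha-factor point estimates, and the single remaining degree of freedom (the total concentration) is chosen by least squares so that the marginal beta distributions hit analyst-supplied target 95th percentiles. The alpha-factor values, the targets and the choice of percentile matching are all hypothetical.

        import numpy as np
        from scipy import stats, optimize

        means = np.array([0.95, 0.03, 0.015, 0.005])        # nominal alpha-factors (hypothetical)
        target_p95 = np.array([0.995, 0.10, 0.06, 0.025])   # analyst-supplied targets (hypothetical)

        def objective(log_nu):
            nu = np.exp(log_nu)                              # total concentration, kept positive
            alphas = nu * means                              # Dirichlet parameters matching the means
            p95 = stats.beta.ppf(0.95, alphas, nu - alphas)  # marginal i is beta(alpha_i, nu - alpha_i)
            return float(np.sum((p95 - target_p95) ** 2))

        res = optimize.minimize_scalar(objective, bounds=(np.log(0.5), np.log(50.0)),
                                       method="bounded")
        nu = float(np.exp(res.x))
        print("concentration nu =", round(nu, 3))
        print("Dirichlet parameters:", np.round(nu * means, 4))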

  7. Probabilistic distributions of wind velocity for the evaluation of the wind power potential; Distribuicoes probabilisticas de velocidades do vento para avaliacao do potencial energetico eolico

    Energy Technology Data Exchange (ETDEWEB)

    Vendramini, Elisa Zanuncio

    1986-10-01

    Theoretical models of wind speed distributions provide valuable information about the probability of events related to the variable under study, eliminating the need for a new experiment. The most frequently used distributions have been the Weibull and the Rayleigh. These distributions are examined in the present investigation, as well as the exponential, gamma, chi-square and lognormal distributions. Three years of hourly average wind data recorded by an anemometer installed at the city of Ataliba Leonel, Sao Paulo State, Brazil, were used. From the observed wind speeds, the theoretical relative frequencies were calculated for each of the distributions examined. Results from the Kolmogorov-Smirnov test allow the conclusion that the lognormal distribution fits the wind speed data best, followed by the gamma and Rayleigh distributions. Using the lognormal probability density function, the yearly energy output from a wind generator installed at the site was calculated. 30 refs, 4 figs, 14 tabs
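
    The fitting-and-testing step is straightforward to sketch with scipy. The wind speeds below are synthetic stand-ins for the hourly averages described in the abstract, only four of the six candidate distributions are fitted, and the Kolmogorov-Smirnov p-values are approximate because the parameters are estimated from the same data.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        speeds = 6.0 * rng.weibull(2.0, size=2000) + 0.1   # synthetic wind speeds (m/s)

        candidates = {
            "Weibull":   stats.weibull_min,
            "Rayleigh":  stats.rayleigh,
            "gamma":     stats.gamma,
            "lognormal": stats.lognorm,
        }

        for name, dist in candidates.items():
            params = dist.fit(speeds, floc=0)               # location fixed at zero
            d_stat, p_value = stats.kstest(speeds, dist.name, args=params)
            print(f"{name:10s}  KS D = {d_stat:.4f}   p = {p_value:.3f}")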

  8. Developing and Testing a Theoretical Framework for Computer-Mediated Transparency of Local Governments

    NARCIS (Netherlands)

    Grimmelikhuijsen, S.G.; Welch, E.W.

    2012-01-01

    This article contributes to the emerging literature on transparency by developing and empirically testing a theoretical framework that explains the determinants of local government Web site transparency. It aims to answer the following central question: What institutional factors determine the

  9. Velocity distribution of electrons in time-varying low-temperature plasmas: progress in theoretical procedures over the past 70 years

    Science.gov (United States)

    Makabe, Toshiaki

    2018-03-01

    Time-varying low-temperature plasmas sustained by electrical power at various frequencies have played a key role in the historical development of new technologies, such as gas lasers, ozonizers, micro display panels, dry processing of materials, medical care, and so on, since World War II. Electrons in a time-modulated low-temperature plasma have a characteristic velocity spectrum, i.e. a velocity distribution that depends on the microscopic quantum characteristics of the feed gas molecule and on the external field strength and frequency. To solve and evaluate the time-varying velocity distribution, there are essentially two types of theoretical methods based on the classical, linear Boltzmann equation, namely, the expansion method using orthogonal functions and the non-expansion temporal evolution procedure. Both methods have been developed discontinuously and progressively in synchronization with those technological developments. In this review, we explore the historical development of the theoretical procedures to evaluate the electron velocity distribution in a time-varying low-temperature plasma over the past 70 years.

  10. DIGI-vis: Distributed interactive geospatial information visualization

    KAUST Repository

    Ponto, Kevin

    2010-03-01

    Geospatial information systems provide an abundance of information for researchers and scientists. Unfortunately this type of data can usually only be analyzed a few megapixels at a time, giving researchers a very narrow view into these voluminous data sets. We propose a distributed data gathering and visualization system that allows researchers to view these data at hundreds of megapixels simultaneously. This system allows scientists to view real-time geospatial information at unprecedented levels expediting analysis, interrogation, and discovery. ©2010 IEEE.

  11. Improving behaviour in self-testing (IBIS): Study on frequency of use, consequences, information needs and use, and quality of currently available consumer information (protocol).

    Science.gov (United States)

    Grispen, Janaica E J; Ickenroth, Martine H P; de Vries, Nanne K; Dinant, Geert-Jan; Ronda, Gaby; van der Weijden, Trudy

    2010-08-03

    Self-tests are available to consumers for more than 25 conditions, ranging from infectious diseases to cardiovascular risk factors. Self-tests are defined as in-vitro tests on body materials such as blood, urine, faeces, or saliva that are initiated by consumers to diagnose a particular disorder or risk factor without involving a medical professional. In 2006, 16% of a sample of Dutch Internet users had ever used at least one self-test and 17% intended to use a self-test in the future. The objectives of this study are to determine (1) the frequency of self-test use, (2) the consumers' reasons for using or not using a self-test, (3) the information that is used by self-testers in the different self-test stages and the consumers' interpretation of the quality of this information, (4) the consumers' response to self-test results in terms of their confidence in the result, reassurance by the test result, and follow-up behaviour, (5) the information consumers report to need in the decision making process of using or not using a self-test, and in further management on the basis of the self-test result, and (6) the quality of the currently available consumer information on a selected set of self-tests. Mixed methods study with (1) a cross-sectional study consisting of a two-phase Internet-questionnaire, (2) semi-structured interviews with self-testers and consumers who intend to use a self-test, and (3) the assessment of the quality of consumer information of self-tests. The Health Belief Model and the Theory of Planned Behaviour will serve as the theoretical basis for the questionnaires and the interview topic guides. The self-testing area is still in a state of flux and therefore it is expected that self-test use will increase in the future. To the best of our knowledge, this is the first study which combines quantitative and qualitative research to identify consumers' information needs and use concerning self-testing, and the consumers' actual follow-up behaviour based

  12. Information theoretical assessment of visual communication with wavelet coding

    Science.gov (United States)

    Rahman, Zia-ur

    1995-06-01

    A visual communication channel can be characterized by the efficiency with which it conveys information and by the quality of the images restored from the transmitted data. Efficient data representation requires the use of the constraints of the visual communication channel. Our information theoretic analysis combines the design of the wavelet compression algorithm with the design of the visual communication channel. Shannon's communication theory, Wiener's restoration filter, and the critical design factors of image gathering and display are combined to provide metrics for measuring the efficiency of data transmission and for quantitatively assessing the visual quality of the restored image. These metrics are: a) the mutual information η between the radiance field and the restored image, and b) the efficiency of the channel, which can be roughly measured as the ratio η/H, where H is the average number of bits used to transmit the data. Huck et al. (Journal of Visual Communication and Image Representation, Vol. 4, No. 2, 1993) have shown that channels designed to maximize η also maximize. Our assessment provides a framework for designing channels which provide the highest possible visual quality for a given amount of data under the critical design limitations of the image gathering and display devices. Results show that a trade-off exists between the maximum realizable information of the channel and its efficiency: an increase in one leads to a decrease in the other. The final selection of which of these quantities to maximize is, of course, application dependent.

  13. SAIL: Summation-bAsed Incremental Learning for Information-Theoretic Text Clustering.

    Science.gov (United States)

    Cao, Jie; Wu, Zhiang; Wu, Junjie; Xiong, Hui

    2013-04-01

    Information-theoretic clustering aims to exploit information-theoretic measures as the clustering criteria. A common practice on this topic is the so-called Info-Kmeans, which performs K-means clustering with KL-divergence as the proximity function. While expert efforts on Info-Kmeans have shown promising results, a remaining challenge is to deal with high-dimensional sparse data such as text corpora. Indeed, it is possible that the centroids contain many zero-value features for high-dimensional text vectors, which leads to infinite KL-divergence values and creates a dilemma in assigning objects to centroids during the iteration process of Info-Kmeans. To meet this challenge, in this paper, we propose a Summation-bAsed Incremental Learning (SAIL) algorithm for Info-Kmeans clustering. Specifically, by using an equivalent objective function, SAIL replaces the computation of KL-divergence by the incremental computation of Shannon entropy. This can avoid the zero-feature dilemma caused by the use of KL-divergence. To improve the clustering quality, we further introduce the variable neighborhood search scheme and propose the V-SAIL algorithm, which is then accelerated by a multithreaded scheme in PV-SAIL. Our experimental results on various real-world text collections have shown that, with SAIL as a booster, the clustering performance of Info-Kmeans can be significantly improved. Also, V-SAIL and PV-SAIL indeed help improve the clustering quality at a lower cost of computation.
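
    The baseline that SAIL improves on is easy to sketch. The Python fragment below runs a bare-bones Info-Kmeans on a tiny synthetic term-frequency matrix, using KL-divergence as the proximity function; the epsilon smoothing inside kl() is exactly the kind of workaround for the zero-feature dilemma that the entropy-based SAIL reformulation avoids. The corpus and parameters are invented, and SAIL, V-SAIL and PV-SAIL themselves are not reproduced.

        import numpy as np

        def kl(p, q, eps=1e-12):
            """KL divergence D(p || q) in bits, with light smoothing of q to avoid
            the infinite values caused by zero-valued centroid features."""
            q = np.maximum(q, eps)
            mask = p > 0
            return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

        def info_kmeans(docs, k, n_iter=20, seed=0):
            """Bare-bones Info-Kmeans: K-means with KL divergence as the proximity."""
            rng = np.random.default_rng(seed)
            docs = docs / docs.sum(axis=1, keepdims=True)        # rows as distributions
            centroids = docs[rng.choice(len(docs), size=k, replace=False)]
            for _ in range(n_iter):
                labels = np.array([np.argmin([kl(d, c) for c in centroids]) for d in docs])
                for j in range(k):
                    members = docs[labels == j]
                    if len(members):
                        centroids[j] = members.mean(axis=0)
            return labels, centroids

        # Tiny synthetic "corpus": four documents over a six-term vocabulary, two topics.
        corpus = np.array([[5, 4, 3, 0, 0, 1],
                           [6, 3, 4, 1, 0, 0],
                           [0, 1, 0, 5, 6, 4],
                           [1, 0, 0, 4, 5, 6]], dtype=float)
        labels, _ = info_kmeans(corpus, k=2)
        print("cluster labels:", labels)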

  14. A Bayesian Decision-Theoretic Approach to Logically-Consistent Hypothesis Testing

    Directory of Open Access Journals (Sweden)

    Gustavo Miranda da Silva

    2015-09-01

    Full Text Available This work addresses an important issue regarding the performance of simultaneous test procedures: the construction of multiple tests that at the same time are optimal from a statistical perspective and that also yield logically-consistent results that are easy to communicate to practitioners of statistical methods. For instance, if hypothesis A implies hypothesis B, is it possible to create optimal testing procedures that reject A whenever they reject B? Unfortunately, several standard testing procedures fail in having such logical consistency. Although this has been deeply investigated under a frequentist perspective, the literature lacks analyses under a Bayesian paradigm. In this work, we contribute to the discussion by investigating three rational relationships under a Bayesian decision-theoretic standpoint: coherence, invertibility and union consonance. We characterize and illustrate through simple examples optimal Bayes tests that fulfill each of these requisites separately. We also explore how far one can go by putting these requirements together. We show that although fairly intuitive tests satisfy both coherence and invertibility, no Bayesian testing scheme meets the desiderata as a whole, strengthening the understanding that logical consistency cannot be combined with statistical optimality in general. Finally, we associate Bayesian hypothesis testing with Bayes point estimation procedures. We prove the performance of logically-consistent hypothesis testing by means of a Bayes point estimator to be optimal only under very restrictive conditions.

  15. Theoretical and empirical convergence results for additive congruential random number generators

    Science.gov (United States)

    Wikramaratna, Roy S.

    2010-03-01

    Additive Congruential Random Number (ACORN) generators represent an approach to generating uniformly distributed pseudo-random numbers that is straightforward to implement efficiently for arbitrarily large order and modulus; if it is implemented using integer arithmetic, it becomes possible to generate identical sequences on any machine. This paper briefly reviews existing results concerning ACORN generators and relevant theory concerning sequences that are well distributed mod 1 in k dimensions. It then demonstrates some new theoretical results for ACORN generators implemented in integer arithmetic with modulus M = 2^μ showing that they are a family of generators that converge (in a sense that is defined in the paper) to being well distributed mod 1 in k dimensions, as μ = log2 M tends to infinity. By increasing k, it is possible to increase without limit the number of dimensions in which the resulting sequences approximate to being well distributed. The paper concludes by applying the standard TestU01 test suite to ACORN generators for selected values of the modulus (between 2^60 and 2^150), the order (between 4 and 30) and various odd seed values. On the basis of these and earlier results, it is recommended that an order of at least 9 be used together with an odd seed and modulus equal to 2^(30p), for a small integer value of p. While a choice of p=2 should be adequate for most typical applications, increasing p to 3 or 4 gives a sequence that will consistently pass all the tests in the TestU01 test suite, giving additional confidence in more demanding applications. The results demonstrate that the ACORN generators are a reliable source of uniformly distributed pseudo-random numbers, and that in practice (as suggested by the theoretical convergence results) the quality of the ACORN sequences increases with increasing modulus and order.
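
    The ACORN recurrence itself is only a few lines. The sketch below follows the commonly published definition (a constant odd seed at order zero, then repeated modular prefix sums), using the order and modulus choices recommended in the abstract and an arbitrary seed; it is illustrative rather than a vetted implementation.

        class ACORN:
            """Additive Congruential Random Number generator of a given order with
            modulus M = 2**nbits: Y_m(n) = (Y_{m-1}(n) + Y_m(n-1)) mod M, with the
            zeroth-order term held constant at an odd seed; output is Y_k(n) / M."""

            def __init__(self, order=9, nbits=60, seed=123456789):
                self.M = 1 << nbits
                if seed % 2 == 0:
                    seed += 1                      # the seed must be odd (and below M)
                self.y = [seed % self.M] * (order + 1)

            def next(self):
                for m in range(1, len(self.y)):    # in-place modular prefix sums
                    self.y[m] = (self.y[m] + self.y[m - 1]) % self.M
                return self.y[-1] / self.M         # uniform value in [0, 1)

        gen = ACORN(order=9, nbits=60, seed=987654321)
        print([round(gen.next(), 6) for _ in range(5)])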

  16. Finding a Minimally Informative Dirichlet Prior Distribution Using Least Squares

    International Nuclear Information System (INIS)

    Kelly, Dana; Atwood, Corwin

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in closed form, and so an approximate beta distribution is used in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial aleatory model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.

  17. Multivariate information-theoretic measures reveal directed information structure and task relevant changes in fMRI connectivity.

    Science.gov (United States)

    Lizier, Joseph T; Heinzle, Jakob; Horstmann, Annette; Haynes, John-Dylan; Prokopenko, Mikhail

    2011-02-01

    The human brain undertakes highly sophisticated information processing facilitated by the interaction between its sub-regions. We present a novel method for interregional connectivity analysis, using multivariate extensions to the mutual information and transfer entropy. The method allows us to identify the underlying directed information structure between brain regions, and how that structure changes according to behavioral conditions. This method is distinguished in using asymmetric, multivariate, information-theoretical analysis, which captures not only directional and non-linear relationships, but also collective interactions. Importantly, the method is able to estimate multivariate information measures with only relatively little data. We demonstrate the method to analyze functional magnetic resonance imaging time series to establish the directed information structure between brain regions involved in a visuo-motor tracking task. Importantly, this results in a tiered structure, with known movement planning regions driving visual and motor control regions. Also, we examine the changes in this structure as the difficulty of the tracking task is increased. We find that task difficulty modulates the coupling strength between regions of a cortical network involved in movement planning and between motor cortex and the cerebellum which is involved in the fine-tuning of motor control. It is likely these methods will find utility in identifying interregional structure (and experimentally induced changes in this structure) in other cognitive tasks and data modalities.
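
    The directed, asymmetric quantity at the heart of such analyses is the transfer entropy. The sketch below computes a plain bivariate, discrete transfer entropy with history length one; the multivariate, collective extensions and the estimators needed for continuous fMRI time series are not shown, and the toy series are invented.

        import random
        from collections import Counter
        from math import log2

        def transfer_entropy(x, y):
            """TE(X -> Y) in bits for equal-length discrete series, history length 1:
            sum over (y1, y0, x0) of p(y1, y0, x0) * log2[p(y1 | y0, x0) / p(y1 | y0)]."""
            triples = list(zip(y[1:], y[:-1], x[:-1]))
            n = len(triples)
            c_yyx = Counter(triples)
            c_yx = Counter((y0, x0) for _, y0, x0 in triples)
            c_yy = Counter((y1, y0) for y1, y0, _ in triples)
            c_y = Counter(y0 for _, y0, _ in triples)
            te = 0.0
            for (y1, y0, x0), c in c_yyx.items():
                p_joint = c / n
                p_full = c / c_yx[(y0, x0)]          # p(y1 | y0, x0)
                p_hist = c_yy[(y1, y0)] / c_y[y0]    # p(y1 | y0)
                te += p_joint * log2(p_full / p_hist)
            return te

        # Toy example: y copies x with a one-step delay, so TE(x -> y) is large
        # while TE(y -> x) stays near zero.
        random.seed(0)
        x = [random.randint(0, 1) for _ in range(5000)]
        y = [0] + x[:-1]
        print(round(transfer_entropy(x, y), 3), round(transfer_entropy(y, x), 3))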

  18. A thermodynamic and theoretical view for enzyme regulation.

    Science.gov (United States)

    Zhao, Qinyi

    2015-01-01

    Precise regulation is fundamental to the proper functioning of enzymes in a cell. Current views on this, such as allosteric regulation and the dynamic contribution to enzyme regulation, are experimental models and substantially empirical. Here we propose a theoretical and thermodynamic model of enzyme regulation. The main idea is that enzyme regulation proceeds via regulation of the abundance of the active conformation in the reaction buffer. The theoretical foundation, experimental evidence, and experimental criteria to test our model are discussed and reviewed. We conclude that the basic principles of enzyme regulation are laws of protein thermodynamics, and that regulation can be analyzed using the concept of the distribution curve of active conformations of enzymes.

  19. An Information-Theoretic Approach for Indirect Train Traffic Monitoring Using Building Vibration

    Directory of Open Access Journals (Sweden)

    Susu Xu

    2017-05-01

    Full Text Available This paper introduces an indirect train traffic monitoring method to detect and infer real-time train events based on the vibration response of a nearby building. Monitoring and characterizing traffic events are important for cities to improve the efficiency of transportation systems (e.g., train passing, heavy trucks, and traffic). Most prior work falls into two categories: (1) methods that require intensive labor to manually record events or (2) systems that require deployment of dedicated sensors. These approaches are difficult and costly to execute and maintain. In addition, most prior work uses dedicated sensors designed for a single purpose, resulting in deployment of multiple sensor systems. This further increases costs. Meanwhile, with the increasing demands of structural health monitoring, many vibration sensors are being deployed in commercial buildings. Traffic events create ground vibration that propagates to nearby building structures inducing noisy vibration responses. We present an information-theoretic method for train event monitoring using commonly existing vibration sensors deployed for building health monitoring. The key idea is to represent the wave propagation in a building induced by train traffic as information conveyed in noisy measurement signals. Our technique first uses wavelet analysis to detect train events. Then, by analyzing information exchange patterns of building vibration signals, we infer the category of the events (i.e., southbound or northbound train). Our algorithm is evaluated with an 11-story building where trains pass by frequently. The results show that the method can robustly achieve a train event detection accuracy of up to a 93% true positive rate and an 80% true negative rate. For direction categorization, compared with the traditional signal processing method, our information-theoretic approach reduces categorization error from 32.1% to 12.1%, which is a 2.5× improvement.

  20. Appendix B: Fisher, lynx, wolverine summary of distribution information

    Science.gov (United States)

    Mary Maj

    1994-01-01

    We present maps depicting distributions of fisher, lynx, and wolverine in the western United States since 1961. Comparison of past and current distributions of species can shed light on population persistence, periods of population isolation, meta-population structure, and important connecting landscapes. Information on the distribution of the American marten is not...

  1. An Everyday and Theoretical Reading of "Perezhivanie" for Informing Research in Early Childhood Education

    Science.gov (United States)

    Fleer, Marilyn

    2016-01-01

    The concept of "perezhivanie" has received increasing attention in recent years. However, a clear understanding of this term has not yet been established. Mostly what is highlighted is the need for more informed theoretical discussion. In this paper, discussions centre on what "perezhivanie" means for research in early…

  2. A short course in quantum information theory. An approach from theoretical physics. 2. ed.

    International Nuclear Information System (INIS)

    Diosi, Lajos

    2011-01-01

    This short and concise primer takes the vantage point of theoretical physics and the unity of physics. It sets out to strip the burgeoning field of quantum information science to its basics by linking it to universal concepts in physics. An extensive lecture rather than a comprehensive textbook, this volume is based on courses delivered over several years to advanced undergraduate and beginning graduate students, but essentially it addresses anyone with a working knowledge of basic quantum physics. Readers will find these lectures a most adequate entry point for theoretical studies in this field. For the second edition, the author has succeeded in adding many new topics while sticking to the conciseness of the overall approach. A new chapter on qubit thermodynamics has been added, while new sections and subsections have been incorporated in various chapters to deal with weak and time-continuous measurements, period-finding quantum algorithms and quantum error corrections. From the reviews of the first edition: "The best things about this book are its brevity and clarity. In around 100 pages it provides a tutorial introduction to quantum information theory, including problems and solutions... it's worth a look if you want to quickly get up to speed with the language and central concepts of quantum information theory, including the background classical information theory." (Craig Savage, Australian Physics, Vol. 44 (2), 2007). (orig.)

  3. DIGI-vis: Distributed interactive geospatial information visualization

    KAUST Repository

    Ponto, Kevin; Kuester, Falk

    2010-01-01

    data sets. We propose a distributed data gathering and visualization system that allows researchers to view these data at hundreds of megapixels simultaneously. This system allows scientists to view real-time geospatial information at unprecedented

  4. Towards a distributed information architecture for avionics data

    Science.gov (United States)

    Mattmann, Chris; Freeborn, Dana; Crichton, Dan

    2003-01-01

    Avionics data at the National Aeronautics and Space Administration's (NASA) Jet Propulsion Laboratory (JPL) consists of distributed, unmanaged, and heterogeneous information that is hard for flight system design engineers to find and use on new NASA/JPL missions. The development of a systematic approach for capturing, accessing and sharing avionics data critical to the support of NASA/JPL missions and projects is required. We propose a general information architecture for managing the existing distributed avionics data sources and a method for querying and retrieving avionics data using the Object Oriented Data Technology (OODT) framework. OODT uses XML messaging infrastructure that profiles data products and their locations using the ISO-11179 data model for describing data products. Queries against a common data dictionary (which implements the ISO model) are translated to domain dependent source data models, and distributed data products are returned asynchronously through the OODT middleware. Further work will include the ability to 'plug and play' new manufacturer data sources, which are distributed at avionics component manufacturer locations throughout the United States.

  5. Theoretical study on the inverse modeling of deep body temperature measurement

    International Nuclear Information System (INIS)

    Huang, Ming; Chen, Wenxi

    2012-01-01

    We evaluated the theoretical aspects of monitoring the deep body temperature distribution with the inverse modeling method. A two-dimensional model was built based on anatomical structure to simulate the human abdomen. By integrating biophysical and physiological information, the deep body temperature distribution was estimated from cutaneous surface temperature measurements using an inverse quasilinear method. Simulations were conducted with and without the heat effect of blood perfusion in the muscle and skin layers. The results of the simulations showed consistently that the noise characteristics and arrangement of the temperature sensors were the major factors affecting the accuracy of the inverse solution. With temperature sensors of 0.05 °C systematic error and an optimized 16-sensor arrangement, the inverse method could estimate the deep body temperature distribution with an average absolute error of less than 0.20 °C. The results of this theoretical study suggest that it is possible to reconstruct the deep body temperature distribution with the inverse method and that this approach merits further investigation. (paper)
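
    The inverse step can be illustrated with a generic regularized least-squares inversion. In the sketch below the forward operator mapping interior temperatures to surface readings is a hypothetical smoothing kernel rather than the paper's two-dimensional bioheat model, the 0.05 °C noise mimics the systematic sensor error mentioned in the abstract, and the regularization simply pulls the estimate toward a 37 °C prior.

        import numpy as np

        rng = np.random.default_rng(3)

        n_interior, n_sensors = 20, 16
        depth = np.linspace(0.0, 1.0, n_interior)
        true_deep = 37.0 - 2.0 * depth ** 2            # hypothetical interior profile (deg C)

        # Hypothetical forward model: each surface sensor sees a smoothed average
        # of the interior temperatures.
        positions = np.linspace(0.0, 1.0, n_sensors)
        A = np.exp(-((positions[:, None] - depth[None, :]) ** 2) / 0.02)
        A /= A.sum(axis=1, keepdims=True)

        surface = A @ true_deep + rng.normal(scale=0.05, size=n_sensors)

        # Tikhonov-regularized inverse, shrinking toward a 37 deg C prior profile:
        # minimize ||A t - surface||^2 + lam * ||t - t_prior||^2
        t_prior = np.full(n_interior, 37.0)
        lam = 1e-2
        t_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_interior),
                                A.T @ surface + lam * t_prior)
        print("mean absolute error (deg C):",
              round(float(np.mean(np.abs(t_hat - true_deep))), 3))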

  6. Assessing Two Theoretical Frameworks of Civic Engagement

    Science.gov (United States)

    García-Cabrero, Benilde; Pérez-Martínez, María Guadalupe; Sandoval-Hernández, Andrés; Caso-Niebla, Joaquín; Díaz-López, Carlos David

    2016-01-01

    The purpose of this study was to empirically test two major theoretical models: a modified version of the social capital model (Pattie, Seyd and Whiteley, 2003), and the Informed Social Engagement Model (Barr and Selman, 2014; Selman and Kwok, 2010), to explain civic participation and civic knowledge of adolescents from Chile, Colombia and Mexico,…

  7. Distributed Collaborative Learning Communities Enabled by Information Communication Technology

    NARCIS (Netherlands)

    H.L. Alvarez (Heidi Lee)

    2006-01-01

    How and why can Information Communication Technology (ICT) contribute to enhancing learning in distributed Collaborative Learning Communities (CLCs)? Drawing from relevant theories concerned with the phenomenon of ICT-enabled distributed collaborative learning, this book identifies gaps in

  8. Autonomous Information Fading and Provision to Achieve High Response Time in Distributed Information Systems

    Science.gov (United States)

    Lu, Xiaodong; Arfaoui, Helene; Mori, Kinji

    In the highly dynamic electronic commerce environment, the need for adaptability and rapid response time in information service systems has become increasingly important. In order to cope with the continuously changing conditions of service provision and utilization, the Faded Information Field (FIF) has been proposed. FIF is a distributed information service system architecture, sustained by push/pull mobile agents, that brings high assurance of services through a recursive, demand-oriented provision of the most popular information closer to the users, trading off the cost of information service allocation against the cost of access. In this paper, based on an analysis of the relationship among the user distribution, information provision and access time, we propose a technology for FIF design that resolves the competing requirements of users and providers and improves users' access time. In addition, to achieve dynamic load balancing under changing user preferences, an autonomous information reallocation technology is proposed. We prove the effectiveness of the proposed technology through simulation and comparison with a conventional system.

  9. Shannon Entropy and Mutual Information for Multivariate Skew-Elliptical Distributions

    KAUST Repository

    Arellano-Valle, Reinaldo B.

    2012-02-27

    The entropy and mutual information index are important concepts developed by Shannon in the context of information theory. They have been widely studied in the case of the multivariate normal distribution. We first extend these tools to the full symmetric class of multivariate elliptical distributions and then to the more flexible families of multivariate skew-elliptical distributions. We study in detail the cases of the multivariate skew-normal and skew-t distributions. We implement our findings to the application of the optimal design of an ozone monitoring station network in Santiago de Chile. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.
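
    For the baseline multivariate normal case both quantities have closed forms, which the sketch below evaluates for a hypothetical covariance matrix; the paper's extensions to elliptical and skew-elliptical families, and the monitoring-network application, are not reproduced.

        import numpy as np

        def mvn_entropy(cov):
            """Differential entropy (nats) of N(mu, cov): 0.5 * log((2*pi*e)^d * |cov|)."""
            d = cov.shape[0]
            return 0.5 * (d * np.log(2 * np.pi * np.e) + np.linalg.slogdet(cov)[1])

        def gaussian_mutual_information(cov, k):
            """MI (nats) between the first k components and the rest of a joint Gaussian:
            0.5 * log(|cov_11| * |cov_22| / |cov|)."""
            s11, s22 = cov[:k, :k], cov[k:, k:]
            return 0.5 * (np.linalg.slogdet(s11)[1] + np.linalg.slogdet(s22)[1]
                          - np.linalg.slogdet(cov)[1])

        cov = np.array([[1.0, 0.6, 0.3],
                        [0.6, 1.0, 0.5],
                        [0.3, 0.5, 1.0]])                  # hypothetical covariance
        print("entropy (nats):", round(float(mvn_entropy(cov)), 4))
        print("MI between component 1 and (2, 3):",
              round(float(gaussian_mutual_information(cov, 1)), 4))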

  10. Shannon Entropy and Mutual Information for Multivariate Skew-Elliptical Distributions

    KAUST Repository

    Arellano-Valle, Reinaldo B.; Contreras-Reyes, Javier E.; Genton, Marc G.

    2012-01-01

    The entropy and mutual information index are important concepts developed by Shannon in the context of information theory. They have been widely studied in the case of the multivariate normal distribution. We first extend these tools to the full symmetric class of multivariate elliptical distributions and then to the more flexible families of multivariate skew-elliptical distributions. We study in detail the cases of the multivariate skew-normal and skew-t distributions. We implement our findings to the application of the optimal design of an ozone monitoring station network in Santiago de Chile. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.

  11. Transactive System: Part I: Theoretical Underpinnings of Payoff Functions, Control Decisions, Information Privacy, and Solution Concepts

    Energy Technology Data Exchange (ETDEWEB)

    Lian, Jianming [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zhang, Wei [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sun, Y. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Marinovici, Laurentiu D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kalsi, Karanjit [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Widergren, Steven E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2018-01-17

    new transactive energy system design with demonstrable guarantees on stability and performance. Specifically, the goals are to (1) establish a theoretical basis for evaluating the performance of different transactive systems, (2) devise tools to address canonical problems that exemplify challenges and scenarios of transactive systems, and (3) provide guidelines for design of future transactive systems. This report, Part 1 of a two part series, advances the above-listed research objectives by reviewing existing transactive systems and identifying a theoretical foundation that integrates payoff functions, control decisions, information privacy, and mathematical solution concepts.

  12. Asymptotically Distribution-Free Goodness-of-Fit Testing for Copulas

    NARCIS (Netherlands)

    Can, S.U.; Einmahl, John; Laeven, R.J.A.

    2017-01-01

    Consider a random sample from a continuous multivariate distribution function F with copula C. In order to test the null hypothesis that C belongs to a certain parametric family, we construct a process that is asymptotically distribution-free under H0 and serves as a tests generator. The process is a

  13. 21 CFR 211.165 - Testing and release for distribution.

    Science.gov (United States)

    2010-04-01

    ... (CONTINUED) DRUGS: GENERAL, Current Good Manufacturing Practice for Finished Pharmaceuticals, Laboratory Controls, § 211.165 Testing and release for distribution: (a) For each batch of drug product, there shall be...

  14. Mutual Information Based Analysis for the Distribution of Financial Contagion in Stock Markets

    Directory of Open Access Journals (Sweden)

    Xudong Wang

    2017-01-01

    Full Text Available This paper applies mutual information to research the distribution of financial contagion in global stock markets during the US subprime crisis. First, we symbolize the daily logarithmic stock returns based on their quantiles. Then, the mutual information of the stock indices is calculated and the block bootstrap approach is adopted to test the financial contagion. We analyze not only the contagion distribution during the entire crisis period but also its evolution over different stages by using the sliding window method. The empirical results prove the widespread existence of financial contagion and show that markets impacted by contagion tend to cluster geographically. The distribution of the contagion strength is positively skewed and leptokurtic. The average contagion strength is low at the beginning and then witnesses an uptrend. It has larger values in the middle stage and declines in the late phase of the crisis. Meanwhile, the cross-regional contagion between Europe and America is stronger than that between either America and Asia or Europe and Asia. Europe is found to be the region most deeply impacted by the contagion, whereas Asia is the least affected.
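
    The first two steps, quantile symbolization and the mutual information of a pair of return series, are straightforward to sketch. The synthetic returns below stand in for real index data, and the block bootstrap significance test and the sliding-window analysis are omitted.

        import numpy as np
        from collections import Counter
        from math import log2

        def symbolize(returns, n_bins=4):
            """Map each return to its quantile bin (0 .. n_bins - 1)."""
            cuts = np.quantile(returns, np.linspace(0, 1, n_bins + 1)[1:-1])
            return np.digitize(returns, cuts).tolist()

        def mutual_information(a, b):
            """Mutual information (bits) between two discrete symbol sequences."""
            n = len(a)
            pa, pb, pab = Counter(a), Counter(b), Counter(zip(a, b))
            return sum((c / n) * log2((c / n) / ((pa[x] / n) * (pb[y] / n)))
                       for (x, y), c in pab.items())

        # Synthetic daily log-returns for two markets sharing a common factor.
        rng = np.random.default_rng(1)
        common = rng.normal(size=1000)
        r1 = 0.7 * common + rng.normal(scale=0.7, size=1000)
        r2 = 0.7 * common + rng.normal(scale=0.7, size=1000)

        s1, s2 = symbolize(r1), symbolize(r2)
        print("MI between symbolized return series (bits):",
              round(mutual_information(s1, s2), 4))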

  15. Testing Theoretical Relationships: Factors Influencing Positive Health Practices (PHP) in Filipino College Students

    Science.gov (United States)

    Ayres, Cynthia; Mahat, Ganga; Atkins, Robert

    2013-01-01

    Objective: To examine variables influencing the positive health practices (PHP) of Filipino college students to gain a better understanding of health practices in this ethnic/racial group. A cross-sectional study tested theoretical relationships postulated among (a) PHP, (b) social support (SS), (c) optimism, and (d) acculturation. Participants: A…

  16. A short course in quantum information theory an approach from theoretical physics

    CERN Document Server

    Diosi, Lajos

    2011-01-01

    This short and concise primer takes the vantage point of theoretical physics and the unity of physics. It sets out to strip the burgeoning field of quantum information science to its basics by linking it to universal concepts in physics. An extensive lecture rather than a comprehensive textbook, this volume is based on courses delivered over several years to advanced undergraduate and beginning graduate students, but essentially it addresses anyone with a working knowledge of basic quantum physics. Readers will find these lectures a most adequate entry point for theoretical studies in this field. For the second edition, the author has succeeded in adding many new topics while sticking to the conciseness of the overall approach. A new chapter on qubit thermodynamics has been added, while new sections and subsections have been incorporated in various chapters to deal with weak and time-continuous measurements, period-finding quantum algorithms and quantum error correction. From the reviews of the first edition...

  17. An information-theoretic approach to motor action decoding with a reconfigurable parallel architecture.

    Science.gov (United States)

    Craciun, Stefan; Brockmeier, Austin J; George, Alan D; Lam, Herman; Príncipe, José C

    2011-01-01

    Methods for decoding movements from neural spike counts using adaptive filters often rely on minimizing the mean-squared error. However, for non-Gaussian error distributions this approach is not optimal. Therefore, rather than using probabilistic modeling, we propose an alternative non-parametric approach. In order to extract more structure from the input signal (neuronal spike counts), we propose using minimum error entropy (MEE), an information-theoretic approach that minimizes the error entropy as part of an iterative cost function. However, the disadvantage of using MEE as the cost function for adaptive filters is the increase in computational complexity. In this paper we present a comparison between the decoding performance of the analytic Wiener filter and a linear filter trained with MEE, which is then mapped to a parallel architecture in reconfigurable hardware tailored to the computational needs of the MEE filter. We observe considerable speedup from the hardware design. The adaptation of filter weights for the multiple-input, multiple-output linear filters, necessary in motor decoding, is a highly parallelizable algorithm. It can be decomposed into many independent computational blocks with a parallel architecture readily mapped to a field-programmable gate array (FPGA) and scales to large numbers of neurons. By pipelining and parallelizing independent computations in the algorithm, the proposed parallel architecture has sublinear increases in execution time with respect to both window size and filter order.
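
    The following numpy sketch illustrates the MEE idea named above, training a linear decoding filter by gradient ascent on the Gaussian-kernel information potential of the errors. It is not the paper's FPGA implementation, and the kernel width, step size, epoch count and surrogate data are illustrative assumptions that may need tuning.

```python
import numpy as np

def information_potential(e, sigma):
    """Renyi-quadratic-entropy surrogate: mean Gaussian kernel over error pairs."""
    u = e[:, None] - e[None, :]
    return float(np.mean(np.exp(-u**2 / (2 * sigma**2))))

def mee_refine(X, d, w0, n_epochs=200, lr=0.1, sigma=0.5):
    """Refine filter weights so the errors e = d - Xw have minimum entropy
    (equivalently, maximum information potential)."""
    N = X.shape[0]
    w = w0.copy()
    for _ in range(n_epochs):
        e = d - X @ w
        diff = e[:, None] - e[None, :]                  # pairwise error differences
        a = np.exp(-diff**2 / (2 * sigma**2)) * diff    # kernel-weighted differences
        grad = (a.sum(axis=1) - a.sum(axis=0)) @ X / (N**2 * sigma**2)
        w += lr * grad                                  # ascend the information potential
    return w

# Surrogate decoding problem with non-Gaussian (Laplace) observation noise
rng = np.random.default_rng(0)
X = rng.poisson(3.0, size=(300, 8)).astype(float)       # lagged spike-count inputs
d = X @ rng.normal(size=8) + rng.laplace(scale=0.5, size=300)

w_wiener = np.linalg.lstsq(X, d, rcond=None)[0]         # analytic MSE solution
w_mee = mee_refine(X, d, w_wiener)                      # MEE refinement starting from it
print("information potential (Wiener):", information_potential(d - X @ w_wiener, 0.5))
print("information potential (MEE):   ", information_potential(d - X @ w_mee, 0.5))
```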

  18. Investigating nurse practitioners in the private sector: a theoretically informed research protocol.

    Science.gov (United States)

    Adams, Margaret; Gardner, Glenn; Yates, Patsy

    2017-06-01

    To report a study protocol and the theoretical framework, normalisation process theory, that informs this protocol for a case study investigation of private sector nurse practitioners. Most research evaluating nurse practitioner service is focused on public, mainly acute care environments where nurse practitioner service is well established with strong structures for governance and sustainability. Conversely, there is a lack of clarity in governance for emerging models in the private sector. In a climate of healthcare reform, nurse practitioner service is extending beyond the familiar public health sector. Further research is required to inform knowledge of the practice, operational framework and governance of new nurse practitioner models. The proposed research will use a multiple exploratory case study design to examine private sector nurse practitioner service. Data collection includes interviews, surveys and audits. A sequential mixed method approach to analysis of each case will be conducted. Findings from within-case analysis will lead to a meta-synthesis across all four cases to gain a holistic understanding of the cases under study, private sector nurse practitioner service. Normalisation process theory will be used to guide the research process, specifically coding and analysis of data using theory constructs and the relevant components associated with those constructs. This article provides a blueprint for the research and describes a theoretical framework, normalisation process theory, in terms of its flexibility as an analytical framework. Consistent with the goals of best research practice, this study protocol will inform the research community in the field of primary health care about emerging research in this field. Publishing a study protocol ensures researcher fidelity to the analysis plan and supports research collaboration across teams. © 2016 John Wiley & Sons Ltd.

  19. A Theoretical Investigation of Composite Overwrapped Pressure Vessel (COPV) Mechanics Applied to NASA Full Scale Tests

    Science.gov (United States)

    Thesken, John C.; Murthy, Pappu L. N.; Phoenix, S. L.; Greene, N.; Palko, Joseph L.; Eldridge, Jeffrey; Sutter, James; Saulsberry, R.; Beeson, H.

    2009-01-01

    A theoretical investigation of the factors controlling the stress rupture life of the National Aeronautics and Space Administration's (NASA) composite overwrapped pressure vessels (COPVs) continues. Kevlar (DuPont) fiber overwrapped tanks are of particular concern due to their long usage and the poorly understood stress rupture process in Kevlar filaments. Existing long term data show that the rupture process is a function of stress, temperature and time. However, due to the presence of a load sharing liner, the manufacturing induced residual stresses and the complex mechanical response, the state of actual fiber stress in flight hardware and test articles is not clearly known. This paper is a companion to a previously reported experimental investigation and develops a theoretical framework necessary to design full-scale pathfinder experiments and accurately interpret the experimentally observed deformation and failure mechanisms leading up to static burst in COPVs. The fundamental mechanical response of COPVs is described using linear elasticity and thin shell theory and discussed in comparison to existing experimental observations. These comparisons reveal discrepancies between physical data and the current analytical results and suggest that the vessel's residual stress state and the spatial stress distribution as a function of pressure may be completely different from predictions based upon existing linear elastic analyses. The 3D elasticity of transversely isotropic spherical shells demonstrates that an overly compliant transverse stiffness relative to membrane stiffness can account for some of this by shifting a thin shell problem well into the realm of thick shell response. The use of calibration procedures is demonstrated as calibrated thin shell model results and finite element results are shown to be in good agreement with the experimental results. The successes reported here have led to continuing work with full scale testing of larger NASA COPV

  20. Improving behaviour in self-testing (IBIS): Study on frequency of use, consequences, information needs and use, and quality of currently available consumer information (protocol)

    Directory of Open Access Journals (Sweden)

    de Vries Nanne K

    2010-08-01

    Full Text Available Abstract Background Self-tests are available to consumers for more than 25 conditions, ranging from infectious diseases to cardiovascular risk factors. Self-tests are defined as in-vitro tests on body materials such as blood, urine, faeces, or saliva that are initiated by consumers to diagnose a particular disorder or risk factor without involving a medical professional. In 2006, 16% of a sample of Dutch Internet users had ever used at least one self-test and 17% intended to use a self-test in the future. The objectives of this study are to determine (1) the frequency of self-test use, (2) the consumers' reasons for using or not using a self-test, (3) the information that is used by self-testers in the different self-test stages and the consumers' interpretation of the quality of this information, (4) the consumers' response to self-test results in terms of their confidence in the result, reassurance by the test result, and follow-up behaviour, (5) the information consumers report to need in the decision making process of using or not using a self-test, and in further management on the basis of the self-test result, and (6) the quality of the currently available consumer information on a selected set of self-tests. Methods Mixed methods study with (1) a cross-sectional study consisting of a two-phase Internet-questionnaire, (2) semi-structured interviews with self-testers and consumers who intend to use a self-test, and (3) the assessment of the quality of consumer information of self-tests. The Health Belief Model and the Theory of Planned Behaviour will serve as the theoretical basis for the questionnaires and the interview topic guides. Conclusions The self-testing area is still in a state of flux and therefore it is expected that self-test use will increase in the future. To the best of our knowledge, this is the first study which combines quantitative and qualitative research to identify consumers' information needs and use concerning self-testing

  1. Multiparton interactions and multiparton distributions in QCD

    Energy Technology Data Exchange (ETDEWEB)

    Diehl, Markus

    2011-11-15

    After a brief recapitulation of the general interest of parton densities, we discuss multiple hard interactions and multiparton distributions. We report on recent theoretical progress in their QCD description, on outstanding conceptual problems and on possibilities to use multiparton distributions as a laboratory to test and improve our understanding of hadron structure. (orig.)

  2. Multiparton interactions and multiparton distributions in QCD

    International Nuclear Information System (INIS)

    Diehl, Markus

    2011-11-01

    After a brief recapitulation of the general interest of parton densities, we discuss multiple hard interactions and multiparton distributions. We report on recent theoretical progress in their QCD description, on outstanding conceptual problems and on possibilities to use multiparton distributions as a laboratory to test and improve our understanding of hadron structure. (orig.)

  3. Experimental Entanglement Distribution by Separable States

    Science.gov (United States)

    Vollmer, Christina E.; Schulze, Daniela; Eberle, Tobias; Händchen, Vitus; Fiurášek, Jaromír; Schnabel, Roman

    2013-12-01

    Distribution of entanglement between macroscopically separated parties is crucial for future quantum information networks. Surprisingly, it has been theoretically shown that two distant systems can be entangled by sending a third system that is not entangled with either of them. Here, we experimentally distribute entanglement and successfully prove that our transmitted light beam is indeed not entangled with the parties’ local systems. Our work demonstrates an unexpected variant of entanglement distribution and improves the understanding necessary to engineer multipartite quantum networks.

  4. Information-Theoretic Data Discarding for Dynamic Trees on Data Streams

    Directory of Open Access Journals (Sweden)

    Christoforos Anagnostopoulos

    2013-12-01

    Full Text Available Ubiquitous automated data collection at an unprecedented scale is making available streaming, real-time information flows in a wide variety of settings, transforming both science and industry. Learning algorithms deployed in such contexts often rely on single-pass inference, where the data history is never revisited. Learning may also need to be temporally adaptive to remain up-to-date against unforeseen changes in the data generating mechanism. Online Bayesian inference remains challenged by such transient, evolving data streams. Nonparametric modeling techniques can prove particularly ill-suited, as the complexity of the model is allowed to increase with the sample size. In this work, we take steps to overcome these challenges by porting information theoretic heuristics, such as exponential forgetting and active learning, into a fully Bayesian framework. We showcase our methods by augmenting a modern non-parametric modeling framework, dynamic trees, and illustrate its performance on a number of practical examples. The end product is a powerful streaming regression and classification tool, whose performance compares favorably to the state-of-the-art.
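
    A minimal sketch of the exponential-forgetting heuristic mentioned above, applied to a conjugate Beta-Bernoulli model on a data stream; the dynamic-tree machinery of the paper is not reproduced, and the forgetting factor and the simulated drift are illustrative assumptions.

```python
import numpy as np

class ForgetfulBernoulli:
    """Conjugate Beta-Bernoulli posterior whose pseudo-counts decay each step."""

    def __init__(self, a=1.0, b=1.0, forget=0.98):
        self.a, self.b = a, b      # Beta pseudo-counts
        self.forget = forget       # forgetting factor in (0, 1]

    def update(self, x):
        # Discount the accumulated evidence, then absorb the new observation.
        self.a = self.forget * self.a + x
        self.b = self.forget * self.b + (1 - x)

    @property
    def mean(self):
        return self.a / (self.a + self.b)

# The posterior tracks a drifting success probability on a single-pass stream
rng = np.random.default_rng(0)
model = ForgetfulBernoulli()
for t in range(2000):
    p_true = 0.2 if t < 1000 else 0.8          # abrupt change halfway through
    model.update(int(rng.random() < p_true))
print(f"estimated p after the change: {model.mean:.2f}")
```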

  5. Dimensional Information-Theoretic Measurement of Facial Emotion Expressions in Schizophrenia

    Directory of Open Access Journals (Sweden)

    Jihun Hamm

    2014-01-01

    Full Text Available Altered facial expressions of emotions are characteristic impairments in schizophrenia. Ratings of affect have traditionally been limited to clinical rating scales and facial muscle movement analysis, which require extensive training and have limitations based on methodology and ecological validity. To improve reliable assessment of dynamic facial expression changes, we have developed automated measurements of facial emotion expressions based on information-theoretic measures of expressivity of ambiguity and distinctiveness of facial expressions. These measures were examined in matched groups of persons with schizophrenia (n=28) and healthy controls (n=26) who underwent video acquisition to assess expressivity of basic emotions (happiness, sadness, anger, fear, and disgust) in evoked conditions. Persons with schizophrenia scored higher on ambiguity, the measure of conditional entropy within the expression of a single emotion, and they scored lower on distinctiveness, the measure of mutual information across expressions of different emotions. The automated measures compared favorably with observer-based ratings. This method can be applied for delineating dynamic emotional expressivity in healthy and clinical populations.
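
    A hedged reading of the two measures in code: from a joint count table of (evoked emotion, observed expression state), ambiguity is computed as the conditional entropy H(state | emotion) and distinctiveness as the mutual information I(emotion; state). The count table below is made up for illustration.

```python
import numpy as np

def expressivity_measures(counts):
    """counts[i, j]: number of frames with evoked emotion i showing expression state j."""
    p = counts / counts.sum()
    p_e = p.sum(axis=1)                      # marginal over evoked emotions
    p_s = p.sum(axis=0)                      # marginal over expression states
    nz = p > 0
    h_joint = -np.sum(p[nz] * np.log2(p[nz]))
    h_e = -np.sum(p_e[p_e > 0] * np.log2(p_e[p_e > 0]))
    h_s = -np.sum(p_s[p_s > 0] * np.log2(p_s[p_s > 0]))
    ambiguity = h_joint - h_e                # conditional entropy H(state | emotion)
    distinctiveness = h_e + h_s - h_joint    # mutual information I(emotion; state)
    return ambiguity, distinctiveness

# Made-up count table: rows = evoked emotions, columns = observed expression states
counts = np.array([[40.0, 5.0, 5.0],
                   [6.0, 38.0, 6.0],
                   [4.0, 7.0, 39.0]])
amb, dist = expressivity_measures(counts)
print(f"ambiguity = {amb:.2f} bits, distinctiveness = {dist:.2f} bits")
```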

  6. Statistical test for the distribution of galaxies on plates

    International Nuclear Information System (INIS)

    Garcia Lambas, D.

    1985-01-01

    A statistical test for the distribution of galaxies on plates is presented. We apply the test to synthetic astronomical plates obtained by means of numerical simulation (Garcia Lambas and Sersic 1983) with three different models for the 3-dimensional distribution; comparison with an observational plate suggests the presence of filamentary structure. (author)

  7. An experimental test of the information model for negotiation of biparental care.

    Directory of Open Access Journals (Sweden)

    Jessica Meade

    Full Text Available BACKGROUND: Theoretical modelling of biparental care suggests that it can be a stable strategy if parents partially compensate for changes in behaviour by their partners. In empirical studies, however, parents occasionally match rather than compensate for the actions of their partners. The recently proposed "information model" adds to the earlier theory by factoring in information on brood value and/or need into parental decision-making. This leads to a variety of predicted parental responses following a change in partner work-rate depending on the information available to parents. METHODOLOGY/PRINCIPAL FINDINGS: We experimentally test predictions of the information model using a population of long-tailed tits. We show that parental information on brood need varies systematically through the nestling period and use this variation to predict parental responses to an experimental increase in partner work-rate via playback of extra chick begging calls. When parental information is relatively high, partial compensation is predicted, whereas when parental information is low, a matching response is predicted. CONCLUSIONS/SIGNIFICANCE: We find that although some responses are consistent with predictions, parents match a change in their partner's work-rate more often than expected and we discuss possible explanations for our findings.

  8. Distributed quantum information processing via quantum dot spins

    International Nuclear Information System (INIS)

    Jun, Liu; Qiong, Wang; Le-Man, Kuang; Hao-Sheng, Zeng

    2010-01-01

    We propose a scheme to engineer a non-local two-qubit phase gate between two remote quantum-dot spins. Along with one-qubit local operations, one can in principle perform various types of distributed quantum information processing. The scheme employs a linearly polarised photon interacting one after the other with two remote quantum-dot spins in cavities. Due to the optical spin selection rule, the photon acquires a Faraday rotation after the interaction process. By measuring the polarisation of the final output photon, a non-local two-qubit phase gate between the two remote quantum-dot spins is constituted. Our scheme may have important applications in distributed quantum information processing

  9. A Distributed Flocking Approach for Information Stream Clustering Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL; Potok, Thomas E [ORNL

    2006-01-01

    Intelligence analysts are currently overwhelmed with the amount of information streams generated every day, and there is a lack of comprehensive tools that can analyze these streams in real time. Document clustering analysis plays an important role in improving the accuracy of information retrieval. However, most clustering technologies can only be applied to analyzing static document collections because they normally require a large amount of computational resources and a long time to get accurate results. It is very difficult to cluster dynamically changing text information streams on an individual computer. Our early research has resulted in a dynamic reactive flock clustering algorithm which can continually refine the clustering result and quickly react to changes in document contents. This characteristic makes the algorithm suitable for cluster analysis of dynamically changing document information, such as text information streams. Because of the decentralized character of this algorithm, a distributed approach is a very natural way to increase its clustering speed. In this paper, we present a distributed multi-agent flocking approach for text information stream clustering and discuss the decentralized architectures and communication schemes for load balancing and status information synchronization in this approach.

  10. Reconstructing missing information on precipitation datasets: impact of tails on adopted statistical distributions.

    Science.gov (United States)

    Pedretti, Daniele; Beckie, Roger Daniel

    2014-05-01

    Missing data in hydrological time-series databases are ubiquitous in practical applications, yet it is of fundamental importance to make educated decisions in problems involving exhaustive time-series knowledge. This includes precipitation datasets, since recording or human failures can produce gaps in these time series. For some applications, directly involving the ratio between precipitation and some other quantity, lack of complete information can result in poor understanding of basic physical and chemical dynamics involving precipitated water. For instance, the ratio between precipitation (recharge) and outflow rates at a discharge point of an aquifer (e.g. rivers, pumping wells, lysimeters) can be used to obtain aquifer parameters and thus to constrain model-based predictions. We tested a suite of methodologies to reconstruct missing information in rainfall datasets. The goal was to obtain a suitable and versatile method to reduce the errors given by the lack of data in specific time windows. Our analyses included both a classical chronologically-pairing approach between rainfall stations and a probability-based approach, which accounted for the probability of exceedance of rain depths measured at two or multiple stations. Our analyses proved that it is not clear a priori which method performs best; rather, this selection should be based on the specific statistical properties of the rainfall dataset. In this presentation, our emphasis is to discuss the effects of a few typical parametric distributions used to model the behavior of rainfall. Specifically, we analyzed the role of distributional "tails", which have an important control on the occurrence of extreme rainfall events. The latter strongly affect several hydrological applications, including recharge-discharge relationships. The heavy-tailed distributions we considered were parametric Log-Normal, Generalized Pareto, Generalized Extreme Value and Gamma distributions. The methods were
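
    As a small illustration of fitting one of the heavy-tailed candidates named above, the sketch below fits a Generalized Pareto distribution to synthetic rainfall excesses over a threshold; the threshold choice and the surrogate data are assumptions, not the authors' workflow.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rain = rng.gamma(shape=0.6, scale=8.0, size=5000)       # surrogate daily rainfall (mm)

threshold = np.quantile(rain, 0.95)                     # peaks-over-threshold cutoff
excess = rain[rain > threshold] - threshold

# Fit a Generalized Pareto distribution to the excesses (location fixed at zero)
shape, loc, scale = stats.genpareto.fit(excess, floc=0)

# The fitted tail controls extreme levels, e.g. the ~1-in-1000-observation value
p_exceed = (rain > threshold).mean()
level = threshold + stats.genpareto.ppf(1 - 1e-3 / p_exceed, shape, loc=0, scale=scale)
print(f"xi = {shape:.3f}, sigma = {scale:.2f}, ~1/1000 rainfall level = {level:.1f} mm")
```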

  11. Sodium flow distribution test of the air cooler tubes

    International Nuclear Information System (INIS)

    Uchida, Hiroyuki; Ohta, Hidehisa; Shimazu, Hisashi

    1980-01-01

    In the heat transfer tubes of the air cooler which is installed in the auxiliary core cooling system of the fast breeder prototype plant reactor ''Monju'', sodium freezing may be caused by undercooling the sodium induced by an extremely unbalanced sodium flow in the tubes. Thus, the sodium flow distribution test of the air cooler tubes was performed to examine the flow distribution of the tubes and to estimate the possibility of sodium freezing in the tubes. This test was performed by using a one fourth air cooler model installed in the water flow test facility. As the test results show, the flow distribution from the inlet header to each tube is almost equal at any operating condition, that is, the velocity deviation from normalized mean velocity is less than 6% and sodium freezing does not occur up to 250% air velocity deviation at stand-by condition. It was clear that the proposed air cooler design for the ''Monju'' will have a good sodium flow distribution at any operating condition. (author)

  12. Eliciting Subjective Probability Distributions with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2015-01-01

    We test in a laboratory experiment the theoretical prediction that risk attitudes have a surprisingly small role in distorting reports from true belief distributions. We find evidence consistent with theory in our experiment....

  13. Wild edible plant knowledge, distribution and transmission

    DEFF Research Database (Denmark)

    Turreira Garcia, Nerea; Theilade, Ida; Meilby, Henrik

    2015-01-01

    the distribution, transmission and loss of traditional ecological knowledge (TEK) concerning WEPs used by a Mayan community of Guatemala and to enumerate such plants. Methods: The case study was carried out in a semi-isolated community where part of the population took refuge in the mountains in 1982 … key informants. Information about the theoretical dimension of knowledge was gathered through free listing and a questionnaire survey, while practical skills were assessed using a plant identification test with photographs. All villagers older than 7 years participated in the research (n = 62). … % of the cases, which led to increased knowledge of plants and ability to recognise them. Conclusions: The WEP survey may serve as a reference point and as a useful compilation of knowledge for the community for their current and future generations. This study shows that the elder and the refugees living

  14. Plant management tools tested with a small-scale distributed generation laboratory

    International Nuclear Information System (INIS)

    Ferrari, Mario L.; Traverso, Alberto; Pascenti, Matteo; Massardo, Aristide F.

    2014-01-01

    Highlights: • Thermal grid innovative layouts. • Experimental rig for distributed generation. • Real-time management tool. • Experimental results for plant management. • Comparison with results from a complete optimization software package. - Abstract: Optimization of power generation with smart grids is an important issue for extensive sustainable development of distributed generation. Since an experimental approach is essential for implementing validated optimization software, the TPG research team of the University of Genoa has installed a laboratory facility for carrying out studies on polygeneration grids. The facility consists of two co-generation prime movers based on conventional technology: a 100 kWe gas turbine (mGT) and a 20 kWe internal combustion engine (ICE). The rig's high flexibility allows integration with renewable-source based devices, such as biomass-fed boilers and solar panels. Special attention was devoted to thermal distribution grid design. To ensure the possibility of application in medium-large districts, composed of several buildings including energy users, generators or both, an innovative layout based on two ring pipes was examined. Thermal storage devices were also included in order to have a complete hardware platform suitable for assessing the performance of different management tools. The test presented in this paper was carried out with both the mGT and the ICE connected to this innovative thermal grid, while users were emulated by means of fan coolers controlled by inverters. During this test the plant is controlled by a real-time model capable of calculating a machine performance ranking, which is necessary in order to split power demands between the prime movers (marginal cost decrease objective). A complete optimization tool devised by TPG (ECoMP program) was also used in order to obtain theoretical results considering the same machines and load values. The data obtained with ECoMP were compared with the

  15. Population distribution around the Nevada Test Site, 1984

    International Nuclear Information System (INIS)

    Smith, D.D.; Coogan, J.S.

    1984-08-01

    The Environmental Monitoring Systems Laboratory (EMSL-LV) conducts an offsite radiological safety program outside the boundaries of the Nevada Test Site. As part of this program, the EMSL-LV maintains a comprehensive and current listing of all rural offsite residents and dairy animals within the controllable sectors (areas where the EMSL-LV could implement protective or remedial actions that would assure public safety). This report was produced to give a brief overview of the population distribution and information on the activities within the controllable sectors. Obviously, the number of people in a sector changes depending upon the season of the year and on such diverse factors as the price of minerals, which relates to the opening and closing of mining operations. Currently, the controllable sectors out to 200 kilometers from the Control Point on the NTS are considered to be the entire northeast, north-northeast, north, north-northwest, west-northwest sectors and portions of the east and east-northeast sectors. The west-southwest and south-southwest sectors are considered controllable out to 40 to 80 kilometers. No major population centers or dairy farms lie within these sectors. 7 references, 5 figures, 2 tables

  16. High-Rate Field Demonstration of Large-Alphabet Quantum Key Distribution

    Science.gov (United States)

    2016-10-12

    Quantum key distribution (QKD) enables secure symmetric key exchange for information-theoretically secure communication via one-time pad. When the key rate is limited by the count rate of Bob's detectors (the detector-limited regime), it is advantageous to increase the alphabet size M to encode as much information as possible in each detected photon. (High-rate field demonstration of large-alphabet quantum key distribution; Catherine Lee, Darius Bunandar, Zheshen Zhang, Gregory R. Steinbrecher, et al.; October 12, 2016.)

  17. Adhesion strength of Ni film on Ti substrate characterized by three-point bend test, peel test and theoretic calculation

    International Nuclear Information System (INIS)

    Ren, F.Z.; Liu, P.; Jia, S.G.; Tian, B.H.; Su, J.H.

    2006-01-01

    Electroplating was employed to fabricate the Ni film on the Ti substrate. The adhesion strength of the Ni film on the Ti substrate was determined using the three-point bend technique proposed in standard mechanics testing. The experimental results demonstrate that the interface fracture energies increase markedly with the roughness of the Ti substrates, and are independent of the thickness of the Ni films. Moreover, the adhesion strength of the Ni film on the Ti substrate was also measured by the peel test, and was evaluated by the Miedema model of empirical electron theory. The intrinsic interface fracture energy measured by the three-point bend test is in reasonable agreement with that obtained by the theoretical calculation of the Miedema model, and is roughly comparable to that obtained by the peel test

  18. The Value of Information in Distributed Decision Networks

    Science.gov (United States)

    2016-03-04

    formulation, and then we describe the various results attained. Mathematical description of a Distributed Decision Network under Information Constraints: we define a mathematical framework for networks. Let G = (V, E) be an undirected random network (graph) drawn from a known distribution p_G. … to any linear, combinatorial problem like shortest-path optimization and, further, so long as the original combinatorial problem can be solved in

  19. Toward theoretical understanding of the fertility preservation decision-making process: examining information processing among young women with cancer.

    Science.gov (United States)

    Hershberger, Patricia E; Finnegan, Lorna; Altfeld, Susan; Lake, Sara; Hirshfeld-Cytron, Jennifer

    2013-01-01

    Young women with cancer now face the complex decision about whether to undergo fertility preservation. Yet little is known about how these women process information involved in making this decision. The purpose of this article is to expand theoretical understanding of the decision-making process by examining aspects of information processing among young women diagnosed with cancer. Using a grounded theory approach, 27 women with cancer participated in individual, semistructured interviews. Data were coded and analyzed using constant-comparison techniques that were guided by 5 dimensions within the Contemplate phase of the decision-making process framework. In the first dimension, young women acquired information primarily from clinicians and Internet sources. Experiential information, often obtained from peers, occurred in the second dimension. Preferences and values were constructed in the third dimension as women acquired factual, moral, and ethical information. Women desired tailored, personalized information that was specific to their situation in the fourth dimension; however, women struggled with communicating these needs to clinicians. In the fifth dimension, women offered detailed descriptions of clinician behaviors that enhance or impede decisional debriefing. Better understanding of theoretical underpinnings surrounding women's information processes can facilitate decision support and improve clinical care.

  20. Theoretical analysis of radiographic images by nonstationary Poisson processes

    International Nuclear Information System (INIS)

    Tanaka, Kazuo; Uchida, Suguru; Yamada, Isao.

    1980-01-01

    This paper deals with the noise analysis of radiographic images obtained in the usual fluorescent screen-film system. The theory of nonstationary Poisson processes is applied to the analysis of the radiographic images containing the object information. The ensemble averages, the autocorrelation functions, and the Wiener spectrum densities of the light-energy distribution at the fluorescent screen and of the film optical-density distribution are obtained. The detection characteristics of the system are evaluated theoretically. Numerical examples for a one-dimensional image are shown and the results are compared with those obtained under the assumption that the object image is related to the background noise by an additive process. (author)

  1. A large scale analysis of information-theoretic network complexity measures using chemical structures.

    Directory of Open Access Journals (Sweden)

    Matthias Dehmer

    Full Text Available This paper aims to investigate information-theoretic network complexity measures which have already been used intensively in mathematical and medicinal chemistry, including drug design. Numerous such measures have been developed so far but many of them lack a meaningful interpretation, e.g., we want to examine which kind of structural information they detect. Therefore, our main contribution is to shed light on the relatedness between some selected information measures for graphs by performing a large scale analysis using chemical networks. Starting from several sets containing real and synthetic chemical structures represented by graphs, we study the relatedness between a classical (partition-based) complexity measure called the topological information content of a graph and some others inferred by a different paradigm leading to partition-independent measures. Moreover, we evaluate the uniqueness of network complexity measures numerically. Generally, a high uniqueness is an important and desirable property when designing novel topological descriptors having the potential to be applied to large chemical databases.
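
    A minimal sketch of a partition-based graph entropy in the spirit of the topological information content discussed above. The classical definition partitions vertices into automorphism-group orbits; here vertex degree is used as a simple stand-in partition, so this is an illustration rather than the exact descriptor.

```python
import math
from collections import Counter

def partition_entropy(adjacency):
    """Shannon entropy (bits) of the vertex partition induced by degree.

    adjacency: dict mapping each vertex to the set of its neighbours.
    """
    n = len(adjacency)
    class_sizes = Counter(len(nbrs) for nbrs in adjacency.values()).values()
    return -sum((k / n) * math.log2(k / n) for k in class_sizes)

# Toy molecular-style graph: a 4-cycle with one pendant vertex
g = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2, 4}, 4: {3}}
print(f"partition-based entropy: {partition_entropy(g):.3f} bits")
```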

  2. Goodness-of-fit tests for a heavy tailed distribution

    NARCIS (Netherlands)

    A.J. Koning (Alex); L. Peng (Liang)

    2005-01-01

    For testing whether a distribution function is heavy tailed, we study the Kolmogorov test, Berk-Jones test, score test and their integrated versions. A comparison is conducted via Bahadur efficiency and simulations. The score test and the integrated score test show the best

  3. Modelling Dynamic Forgetting in Distributed Information Systems

    NARCIS (Netherlands)

    N.F. Höning (Nicolas); M.C. Schut

    2010-01-01

    We describe and model a new aspect in the design of distributed information systems. We build upon a previously described problem on the microlevel, which asks how quickly agents should discount (forget) their experience: If they cherish their memories, they can build their reports on

  4. Multipath interference test method for distributed amplifiers

    Science.gov (United States)

    Okada, Takahiro; Aida, Kazuo

    2005-12-01

    A method for testing distributed amplifiers is presented; the multipath interference (MPI) is detected as a beat spectrum between the multipath signal and the direct signal using a binary frequency-shift-keying (FSK) test signal. The lightwave source is composed of a DFB-LD that is directly modulated by a pulse stream passing through an equalizer, and emits an FSK signal with a frequency deviation of about 430 MHz at a repetition rate of 80-100 kHz. The receiver consists of a photodiode and an electrical spectrum analyzer (ESA). The baseband power spectrum peak that appears at the frequency of the FSK deviation can be converted to an amount of MPI using a calibration chart. The test method has improved the minimum detectable MPI to as low as -70 dB, compared with the -50 dB of the conventional test method. The detailed design and performance of the proposed method are discussed, including the MPI simulator for the calibration procedure, computer simulations for evaluating the error caused by the FSK repetition rate and the fiber length under test, and experiments on single-mode fibers and a distributed Raman amplifier.

  5. An Information Theoretic Framework and Self-organizing Agent- based Sensor Network Architecture for Power Plant Condition Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Loparo, Kenneth [Case Western Reserve Univ., Cleveland, OH (United States); Kolacinski, Richard [Case Western Reserve Univ., Cleveland, OH (United States); Threeanaew, Wanchat [Case Western Reserve Univ., Cleveland, OH (United States); Agharazi, Hanieh [Case Western Reserve Univ., Cleveland, OH (United States)

    2017-01-30

    A central goal of the work was to enable both the extraction of all relevant information from sensor data, and the application of information gained from appropriate processing and fusion at the system level to operational control and decision-making at various levels of the control hierarchy through: 1. Exploiting the deep connection between information theory and the thermodynamic formalism, 2. Deployment using distributed intelligent agents with testing and validation in a hardware-in-the-loop simulation environment. Enterprise architectures are the organizing logic for key business processes and IT infrastructure and, while the generality of current definitions provides sufficient flexibility, the current architecture frameworks do not inherently provide the appropriate structure. Of particular concern is that existing architecture frameworks often do not make a distinction between "data" and "information." This work defines an enterprise architecture for health and condition monitoring of power plant equipment and further provides the appropriate foundation for addressing shortcomings in current architecture definition frameworks through the discovery of the information connectivity between the elements of a power generation plant. That is, to identify the correlative structure between available observation streams using informational measures. The principal focus here is on the implementation and testing of an emergent, agent-based algorithm, based on the foraging behavior of ants, for eliciting this structure and on measures for characterizing differences between communication topologies. The elicitation algorithms are applied to data streams produced by a detailed numerical simulation of Alstom’s 1000 MW ultra-super-critical boiler and steam plant. The elicitation algorithm and topology characterization can be based on different informational metrics for detecting connectivity, e.g. mutual information and linear correlation.

  6. Game-Theoretical Design of an Adaptive Distributed Dissemination Protocol for VANETs.

    Science.gov (United States)

    Iza-Paredes, Cristhian; Mezher, Ahmad Mohamad; Aguilar Igartua, Mónica; Forné, Jordi

    2018-01-19

    Road safety applications envisaged for Vehicular Ad Hoc Networks (VANETs) depend largely on the dissemination of warning messages to deliver information to concerned vehicles. The intended applications, as well as some inherent VANET characteristics, make data dissemination an essential service and a challenging task in this kind of networks. This work lays out a decentralized stochastic solution for the data dissemination problem through two game-theoretical mechanisms. Given the non-stationarity induced by a highly dynamic topology, diverse network densities, and intermittent connectivity, a solution for the formulated game requires an adaptive procedure able to exploit the environment changes. Extensive simulations reveal that our proposal excels in terms of number of transmissions, lower end-to-end delay and reduced overhead while maintaining high delivery ratio, compared to other proposals.

  7. A new technique for testing distribution of knowledge and to estimate sampling sufficiency in ethnobiology studies.

    Science.gov (United States)

    Araújo, Thiago Antonio Sousa; Almeida, Alyson Luiz Santos; Melo, Joabe Gomes; Medeiros, Maria Franco Trindade; Ramos, Marcelo Alves; Silva, Rafael Ricardo Vasconcelos; Almeida, Cecília Fátima Castelo Branco Rangel; Albuquerque, Ulysses Paulino

    2012-03-15

    We propose a new quantitative measure that enables the researcher to make decisions and test hypotheses about the distribution of knowledge in a community and estimate the richness and sharing of information among informants. In our study, this measure has two levels of analysis: intracultural and intrafamily. Using data collected in northeastern Brazil, we evaluated how these new estimators of richness and sharing behave for different categories of use. We observed trends in the distribution of the characteristics of informants. We were also able to evaluate how outliers interfere with these analyses and how other analyses may be conducted using these indices, such as determining the distance between the knowledge of a community and that of experts, as well as showing the importance of these individuals' communal knowledge of biological resources. One of the primary applications of these indices is to supply the researcher with an objective tool to evaluate the scope and behavior of the collected data.

  8. Spatial distribution and landuse planning of informal automobile ...

    African Journals Online (AJOL)

    Spatial distribution and landuse planning of informal automobile workshops in Osogbo, ... data pertaining to the activities and other related issues of their workshops. ... The study therefore recommends the establishment of a mechanic complex, ...

  9. Predicting incident size from limited information

    International Nuclear Information System (INIS)

    Englehardt, J.D.

    1995-01-01

    Predicting the size of low-probability, high-consequence natural disasters, industrial accidents, and pollutant releases is often difficult due to limitations in the availability of data on rare events and future circumstances. When incident data are available, they may be difficult to fit with a lognormal distribution. Two Bayesian probability distributions for inferring future incident-size probabilities from limited, indirect, and subjective information are proposed in this paper. The distributions are derived from Pareto distributions that are shown to fit data on different incident types and are justified theoretically. The derived distributions incorporate both inherent variability and uncertainty due to information limitations. Results were analyzed to determine the amount of data needed to predict incident-size probabilities in various situations. Information requirements for incident-size prediction using the methods were low, particularly when the population distribution had a thick tail. Use of the distributions to predict accumulated oil-spill consequences was demonstrated
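
    A hedged sketch of the kind of construction described: a Pareto likelihood for incident sizes with a conjugate Gamma prior on the tail index yields a posterior-predictive exceedance probability that reflects both inherent variability and parameter uncertainty. The prior values and the minimum size x_m below are illustrative assumptions, not the paper's derived distributions.

```python
import numpy as np

def predictive_exceedance(sizes, x, x_m=1.0, prior_shape=1.0, prior_rate=1.0):
    """P(next incident size > x) under a Pareto(x_m, alpha) likelihood with a
    conjugate Gamma(prior_shape, prior_rate) prior on the tail index alpha."""
    sizes = np.asarray(sizes, dtype=float)
    post_shape = prior_shape + sizes.size
    post_rate = prior_rate + np.sum(np.log(sizes / x_m))
    t = np.log(x / x_m)
    # E[(x_m / x)**alpha] under the Gamma posterior = its MGF evaluated at -t
    return (post_rate / (post_rate + t)) ** post_shape

# Limited data on past incident sizes (arbitrary units, all >= x_m = 1)
observed = [1.2, 1.5, 2.3, 4.0, 11.0]
print(f"P(size > 50) = {predictive_exceedance(observed, 50.0):.4f}")
```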

  10. Determination of distribution pattern of the heavy metal concentrations in the potable network of Gachsaran by Geographical Information System (GIS

    Directory of Open Access Journals (Sweden)

    G Paraham

    2013-12-01

    Methods: In this descriptive, cross-sectional study, samples were taken from 11 spots of the drinking water distribution network and tested for the concentrations of 10 metals by the Inductively Coupled Plasma (ICP) method in the summer of 2010. The research data were compared with national and international water standards. Then the distribution map of heavy metal concentrations in the drinking water wells of the region was prepared using the Geographical Information System (GIS) software. Data were analyzed by the Kruskal-Wallis test. Results: In all samples, the average concentrations of heavy metals were: Arsenic 0.54, Cadmium 0.05, Zinc 55.9, Lead 0.18, Copper 0.82, Chromium 1.6, Barium 36.5, Selenium 0.5, Mercury 0.1 and Silver 0.05 micrograms per liter, all less than the water quality standards. Conclusion: Based on the results obtained, it can be concluded that the concentrations of heavy metals in Gachsaran’s drinking water distribution network are not higher than national and international standards and are therefore not harmful to people. Key words: Heavy metals, Distribution network, Gachsaran, Geographical Information System (GIS)

  11. HammerCloud: A Stress Testing System for Distributed Analysis

    CERN Document Server

    van der Ster, Daniel C; Ubeda Garcia, Mario; Paladin, Massimo

    2011-01-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud (HC) is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain at a steady state a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web-interface for historical test results to both evaluate progress and compare sites. HC was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HC has been ...

  12. Change detection in bi-temporal data by canonical information analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2015-01-01

    combinations with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second order statistics. Where CCA is ideal for Gaussian data, CIA facilitates...

  13. Materials Science Research Rack-1 Fire Suppressant Distribution Test Report

    Science.gov (United States)

    Wieland, P. O.

    2002-01-01

    Fire suppressant distribution testing was performed on the Materials Science Research Rack-1 (MSRR-1), a furnace facility payload that will be installed in the U.S. Lab module of the International Space Station. Unlike racks that were tested previously, the MSRR-1 uses the Active Rack Isolation System (ARIS) to reduce vibration on experiments, so the effects of ARIS on fire suppressant distribution were unknown. Two tests were performed to map the distribution of CO2 fire suppressant throughout a mockup of the MSRR-1 designed to have the same component volumes and flowpath restrictions as the flight rack. For the first test, the average maximum CO2 concentration for the rack was 60 percent, achieved within 45 s of discharge initiation, meeting the requirement to reach 50 percent throughout the rack within 1 min. For the second test, one of the experiment mockups was removed to provide a worst-case configuration, and the average maximum CO2 concentration for the rack was 58 percent. Comparing the results of this testing with results from previous testing leads to several general conclusions that can be used to evaluate future racks. The MSRR-1 will meet the requirements for fire suppressant distribution. Primary factors that affect the ability to meet the CO2 distribution requirements are the free air volume in the rack and the total area and distribution of openings in the rack shell. The length of the suppressant flowpath and degree of tortuousness has little correlation with CO2 concentration. The total area of holes in the rack shell could be significantly increased. The free air volume could be significantly increased. To ensure the highest maximum CO2 concentration, the PFE nozzle should be inserted to the stop on the nozzle.

  14. Optimal Information Processing in Biochemical Networks

    Science.gov (United States)

    Wiggins, Chris

    2012-02-01

    A variety of experimental results over the past decades provide examples of near-optimal information processing in biological networks, including in biochemical and transcriptional regulatory networks. Computing information-theoretic quantities requires first choosing or computing the joint probability distribution describing multiple nodes in such a network --- for example, representing the probability distribution of finding an integer copy number of each of two interacting reactants or gene products while respecting the `intrinsic' small copy number noise constraining information transmission at the scale of the cell. I'll give an overview of some recent analytic and numerical work facilitating calculation of such joint distributions and the associated information, which in turn makes possible numerical optimization of information flow in models of noisy regulatory and biochemical networks. Illustrative cases include quantification of form-function relations, ideal design of regulatory cascades, and response to oscillatory driving.
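
    One standard way to numerically optimize information flow through a noisy input-output relation is the Blahut-Arimoto iteration for channel capacity, sketched below; the 3x4 conditional distribution (output state given input state) is a made-up toy, not a model from this abstract.

```python
import numpy as np

def blahut_arimoto(p_y_given_x, n_iter=200):
    """Return (capacity in bits, capacity-achieving input distribution)."""
    def kl_rows(p_x):
        # D( p(y|x) || p(y) ) for every input state x, in bits
        p_y = p_x @ p_y_given_x
        ratio = np.divide(p_y_given_x, p_y, where=p_y > 0,
                          out=np.ones_like(p_y_given_x))
        logs = np.log2(ratio, where=p_y_given_x > 0,
                       out=np.zeros_like(p_y_given_x))
        return np.sum(p_y_given_x * logs, axis=1)

    n_x = p_y_given_x.shape[0]
    p_x = np.full(n_x, 1.0 / n_x)
    for _ in range(n_iter):
        d = kl_rows(p_x)
        p_x = p_x * 2.0 ** d        # reweight inputs toward more informative states
        p_x /= p_x.sum()
    return float(p_x @ kl_rows(p_x)), p_x

# Toy conditional distribution: 3 input states, 4 output (copy-number) states
channel = np.array([[0.7, 0.2, 0.1, 0.0],
                    [0.1, 0.6, 0.2, 0.1],
                    [0.0, 0.1, 0.3, 0.6]])
capacity, p_opt = blahut_arimoto(channel)
print(f"capacity = {capacity:.3f} bits, optimal input distribution = {np.round(p_opt, 3)}")
```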

  15. Validation of the Information/Communications Technology Literacy Test

    Science.gov (United States)

    2016-10-01

    Technical Report 1360, Validation of the Information/Communications Technology Literacy Test, by D. Matthew Trippe (Human Resources Research...). … validate a measure of cyber aptitude, the Information/Communications Technology Literacy Test (ICTL), in predicting trainee performance in Information

  16. Toward theoretical understanding of the fertility preservation decision-making process: Examining information processing among young women with cancer

    Science.gov (United States)

    Hershberger, Patricia E.; Finnegan, Lorna; Altfeld, Susan; Lake, Sara; Hirshfeld-Cytron, Jennifer

    2014-01-01

    Background Young women with cancer now face the complex decision about whether to undergo fertility preservation. Yet little is known about how these women process information involved in making this decision. Objective The purpose of this paper is to expand theoretical understanding of the decision-making process by examining aspects of information processing among young women diagnosed with cancer. Methods Using a grounded theory approach, 27 women with cancer participated in individual, semi-structured interviews. Data were coded and analyzed using constant-comparison techniques that were guided by five dimensions within the Contemplate phase of the decision-making process framework. Results In the first dimension, young women acquired information primarily from clinicians and Internet sources. Experiential information, often obtained from peers, occurred in the second dimension. Preferences and values were constructed in the third dimension as women acquired factual, moral, and ethical information. Women desired tailored, personalized information that was specific to their situation in the fourth dimension; however, women struggled with communicating these needs to clinicians. In the fifth dimension, women offered detailed descriptions of clinician behaviors that enhance or impede decisional debriefing. Conclusion Better understanding of theoretical underpinnings surrounding women’s information processes can facilitate decision support and improve clinical care. PMID:24552086

  17. Development and validation of a theoretical test of proficiency for video-assisted thoracoscopic surgery (VATS) lobectomy

    DEFF Research Database (Denmark)

    Savran, Mona M; Hansen, Henrik Jessen; Horsleben Petersen, René

    2015-01-01

    BACKGROUND: Testing stimulates learning, improves long-term retention, and promotes technical performance. No purpose-orientated test of competence in the theoretical aspects of VATS lobectomy has previously been presented. The purpose of this study was, therefore, to develop and gather validity … performed significantly better than the novices (p

  18. Information-theoretical analysis of private content identification

    NARCIS (Netherlands)

    Voloshynovskiy, S.; Koval, O.; Beekhof, F.; Farhadzadeh, F.; Holotyak, T.

    2010-01-01

    In recent years, content identification based on digital fingerprinting attracts a lot of attention in different emerging applications. At the same time, the theoretical analysis of digital fingerprinting systems for finite length case remains an open issue. Additionally, privacy leaks caused by

  19. A Simple theoretical model for 63Ni betavoltaic battery

    International Nuclear Information System (INIS)

    ZUO, Guoping; ZHOU, Jianliang; KE, Guotu

    2013-01-01

    A numerical simulation of the energy deposition distribution in semiconductors is performed for 63Ni beta particles. Results show that the energy deposition distribution exhibits an approximate exponential decay law. A simple theoretical model is developed for the 63Ni betavoltaic battery based on the distribution characteristics. The correctness of the model is validated against two literature experiments. Results show that the theoretical short-circuit current agrees well with the experimental results, whereas the open-circuit voltage deviates from the experimental results owing to the influence of PN junction defects and the simplification of the source. The theoretical model can be applied to 63Ni and 147Pm betavoltaic batteries. - Highlights: • The energy deposition distribution is found to follow an approximate exponential decay law when beta particles emitted from 63Ni pass through a semiconductor. • A simple theoretical model for the 63Ni betavoltaic battery is constructed based on the exponential decay law. • The theoretical model can be applied to betavoltaic batteries whose radioactive source has an energy spectrum similar to that of 63Ni, such as 147Pm
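
    A small numerical sketch of the exponential-decay deposition law stated above and the short-circuit current it implies; the attenuation length, source activity, average beta energy and pair-creation energy are placeholder values, not the paper's parameters.

```python
import numpy as np

LAMBDA_UM = 1.5        # effective attenuation length of deposited energy (micrometres, placeholder)
ACTIVITY_BQ = 3.7e8    # source activity (about 10 mCi, placeholder)
E_AVG_KEV = 17.4       # average beta energy of 63Ni
W_EHP_EV = 3.6         # energy per electron-hole pair (Si-like value)
Q = 1.602e-19          # elementary charge (C)

def deposited_fraction(depth_um):
    """Fraction of the beta energy deposited within the first depth_um of material,
    assuming the exponential decay law E(x) ~ exp(-x / LAMBDA_UM)."""
    return 1.0 - np.exp(-depth_um / LAMBDA_UM)

def short_circuit_current(collection_depth_um, collection_efficiency=0.8):
    """Rough short-circuit current if carriers created within the collection depth
    are collected with the given efficiency."""
    pairs_per_decay = E_AVG_KEV * 1e3 / W_EHP_EV
    collected = pairs_per_decay * deposited_fraction(collection_depth_um) * collection_efficiency
    return ACTIVITY_BQ * collected * Q   # amperes

for depth in (1.0, 3.0, 10.0):
    print(f"collection depth {depth:4.1f} um -> Isc ~ {short_circuit_current(depth) * 1e9:.0f} nA")
```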

  20. Distributed retrieval practice promotes superior recall of anatomy information.

    Science.gov (United States)

    Dobson, John L; Perez, Jose; Linderholm, Tracy

    2017-07-01

    Effortful retrieval produces greater long-term recall of information when compared to studying (i.e., reading), as do learning sessions that are distributed (i.e., spaced apart) when compared to those that are massed together. Although the retrieval and distributed practice effects are well-established in the cognitive science literature, no studies have examined their additive effect with regard to learning anatomy information. The aim of this study was to determine how the benefits of retrieval practice vary with massed versus distributed learning. Participants used the following strategies to learn sets of skeletal muscle anatomy: (1) studying on three different days over a seven day period (SSSS 7,2,0 ), (2) studying and retrieving on three different days over a seven day period (SRSR 7,2,0 ), (3) studying on two different days over a two day period (SSSSSS 2,0 ), (4) studying and retrieving on two separate days over a two day period (SRSRSR 2,0 ), and (5) studying and retrieving on one day (SRx6 0 ). All strategies consisted of 12 learning phases and lasted exactly 24 minutes. Muscle information retention was assessed via free recall and using repeated measures ANOVAs. A week after learning, the recall scores were 24.72 ± 3.12, 33.88 ± 3.48, 15.51 ± 2.48, 20.72 ± 2.94, and 12.86 ± 2.05 for the SSSS 7,2,0 , SRSR 7,2,0 , SSSSSS 2,0 , SRSRSR 2,0 , and SRx6 0 strategies, respectively. In conclusion, the distributed strategies produced significantly better recall than the massed strategies, the retrieval-based strategies produced significantly better recall than the studying strategies, and the combination of distributed and retrieval practice generated the greatest recall of anatomy information. Anat Sci Educ 10: 339-347. © 2016 American Association of Anatomists.

  1. AGIS: Evolution of Distributed Computing Information system for ATLAS

    CERN Document Server

    Anisenkov, Alexey; The ATLAS collaboration; Alandes Pradillo, Maria; Karavakis, Edward

    2015-01-01

    The variety of the ATLAS Computing Infrastructure requires a central information system to define the topology of computing resources and to store the different parameters and configuration data which are needed by the various ATLAS software components. The ATLAS Grid Information System is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services.

  2. Geotechnical parameter spatial distribution stochastic analysis based on multi-precision information assimilation

    Science.gov (United States)

    Wang, C.; Rubin, Y.

    2014-12-01

    The spatial distribution of the compression modulus Es, an important geotechnical parameter, contributes considerably to the understanding of the underlying geological processes and to the adequate assessment of its mechanical effects on the differential settlement of large continuous structural foundations. These analyses should be derived using an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve such a task, the Es distribution of a stratum of silty clay in region A of the China Expo Center (Shanghai) is studied using the Bayesian-maximum entropy method. This method rigorously and efficiently integrates geotechnical investigations of different precision and sources of uncertainty. Single CPT samplings were modeled as a rational probability density curve by maximum entropy theory. A spatial prior multivariate probability density function (PDF) and a likelihood PDF at the CPT positions were built from the borehole experiments and the potential value at the prediction point; then, after numerical integration over the CPT probability density curves, the posterior probability density curve at the prediction point was calculated within the Bayesian inverse interpolation framework. The results were compared between Gaussian sequential stochastic simulation and the Bayesian method. The differences between single CPT samplings modeled by a normal distribution and by the simulated probability density curve based on maximum entropy theory were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, and that more informative estimates are generated by considering CPT uncertainty at the estimation points. The calculation illustrates the significance of stochastic Es characterization in a stratum and identifies limitations associated with inadequate geostatistical interpolation techniques. These characterization results will provide a multi

  3. Distributed Information Search and Retrieval for Astronomical Resource Discovery and Data Mining

    Science.gov (United States)

    Murtagh, Fionn; Guillaume, Damien

    Information search and retrieval has become by nature a distributed task. We look at tools and techniques which are of importance in this area. Current technological evolution can be summarized as the growing stability and cohesiveness of distributed architectures of searchable objects. The objects themselves are more often than not multimedia, including published articles or grey literature reports, yellow page services, image data, catalogs, presentation and online display materials, and ``operations'' information such as scheduling and publicly accessible proposal information. The evolution towards distributed architectures, protocols and formats, and the direction of our own work, are focussed on in this paper.

  4. Distributed Sensor Network Software Development Testing through Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Brennan, Sean M. [Univ. of New Mexico, Albuquerque, NM (United States)

    2003-12-01

    The distributed sensor network (DSN) presents a novel and highly complex computing platform with difficulties and opportunities that are just beginning to be explored. The potential of sensor networks extends from monitoring for threat reduction, to conducting instant and remote inventories, to ecological surveys. Developing and testing for robust and scalable applications is currently practiced almost exclusively in hardware. The Distributed Sensors Simulator (DSS) is an infrastructure that allows the user to debug and test software for DSNs independent of hardware constraints. The flexibility of DSS allows developers and researchers to investigate topological, phenomenological, networking, robustness and scaling issues, to explore arbitrary algorithms for distributed sensors, and to defeat those algorithms through simulated failure. The user specifies the topology, the environment, the application, and any number of arbitrary failures; DSS provides the virtual environmental embedding.

  5. Probability Distribution and Deviation Information Fusion Driven Support Vector Regression Model and Its Application

    Directory of Open Access Journals (Sweden)

    Changhao Fan

    2017-01-01

    Full Text Available In modeling, only information from the deviation between the output of the support vector regression (SVR) model and the training sample is considered, whereas other prior information about the training sample, such as probability distribution information, is ignored. Probability distribution information describes the overall distribution of sample data in a training sample that contains different degrees of noise and potential outliers, and it helps develop a high-accuracy model. To mine and use the probability distribution information of a training sample, a new support vector regression model that incorporates probability distribution information as weights, the probability-distribution-information-weighted SVR (PDISVR), is proposed. In the PDISVR model, the probability distribution of each sample is considered as its weight and is then introduced into the error coefficient and slack variables of SVR. Thus, the deviation and probability distribution information of the training sample are both used in the PDISVR model to eliminate the influence of noise and outliers in the training sample and to improve predictive performance. Furthermore, examples with different degrees of noise were employed to demonstrate the performance of PDISVR, which was then compared with that of three SVR-based methods. The results showed that PDISVR performs better than the three other methods.
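
    The weighting idea can be illustrated with off-the-shelf tools. The sketch below is not the exact PDISVR formulation: it simply estimates a density value for each training point with a kernel density estimator and passes it to scikit-learn's SVR as a per-sample weight, so that low-density points (likely outliers) influence the fit less. All data and parameter values are invented for illustration.

        # Illustrative analogue of the density-weighting idea (not the exact PDISVR model):
        # weight each training sample by its estimated probability density so that
        # low-density points (likely outliers) carry less weight in the SVR fit.
        import numpy as np
        from scipy.stats import gaussian_kde
        from sklearn.svm import SVR

        rng = np.random.default_rng(0)
        x = np.linspace(0, 10, 200)
        y = np.sin(x) + rng.normal(0, 0.2, x.size)
        y[::25] += rng.normal(0, 2.0, y[::25].size)   # inject a few outliers

        # Estimate the joint density of (x, y) and use it as a per-sample weight.
        density = gaussian_kde(np.vstack([x, y]))(np.vstack([x, y]))
        weights = density / density.max()

        model = SVR(kernel="rbf", C=10.0, epsilon=0.1)
        model.fit(x.reshape(-1, 1), y, sample_weight=weights)
        print("training R^2 with density weights:", model.score(x.reshape(-1, 1), y))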

  6. MODELING OF TECHNICAL CHANNELS OF INFORMATION LEAKAGE AT DISTRIBUTED CONTROL OBJECTS

    Directory of Open Access Journals (Sweden)

    Aleksander Vladimirovich Karpov

    2018-05-01

    Full Text Available The significantly increased requirements for the functioning of distributed control objects cannot be met solely by widening and strengthening security control measures. The first step in ensuring information security at such objects is the analysis of their operating conditions and the modeling of the technical channels of information leakage. Developing models of such channels is essentially the only method for a complete study of their capabilities, and it is aimed at obtaining quantitative assessments of the safe operation of compound objects. These assessments are necessary for deciding on the degree of protection of information from leakage according to the current criterion. The existing models were developed for standard, concentrated objects and allow the level of information security to be evaluated for each channel separately, which entails a significant increase in the required protective resources and in the time needed to assess the information security of the object as a whole. The article deals with a logical-probabilistic method for the security assessment of structurally compound objects. A model of an information leak at distributed control objects is given as an example. It is recommended to use a software package for automated structural-logical modeling of compound systems, which allows the risk of information leakage via the loudspeaker to be evaluated. The possibility of information leakage through technical channels is evaluated, and differential characteristics of the safe operation of the distributed control objects, such as the positive and negative contributions of the initiating events and conditions that cause a leak, are calculated. Purpose. The aim is a quantitative assessment of data risk, which is necessary for justifying a rational composition of organizational and technical protection measures, as well as a variant of the structure of the information security system from a

  7. Parameter estimation of multivariate multiple regression model using bayesian with non-informative Jeffreys’ prior distribution

    Science.gov (United States)

    Saputro, D. R. S.; Amalia, F.; Widyaningsih, P.; Affan, R. C.

    2018-05-01

    The Bayesian method can be used to estimate the parameters of a multivariate multiple regression model. The Bayesian approach involves two distributions: the prior and the posterior. The posterior distribution is influenced by the choice of prior distribution. Jeffreys’ prior is a kind of non-informative prior distribution, used when information about the parameters is not available. The non-informative Jeffreys’ prior is combined with the sample information to produce the posterior distribution, which is then used to estimate the parameters. The purpose of this research is to estimate the parameters of the multivariate regression model using the Bayesian method with the non-informative Jeffreys’ prior distribution. Based on the results and discussion, the parameter estimates of β and Σ are obtained as the expected values of the corresponding marginal posterior distributions, which are multivariate normal for β and inverse Wishart for Σ. However, calculating these expected values involves integrals of functions whose values are difficult to determine analytically. Therefore, an approach is needed in which random samples are generated according to the posterior distribution of each parameter using the Markov chain Monte Carlo (MCMC) Gibbs sampling algorithm.
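
    For the non-informative setting described above, a textbook Gibbs sampler alternates between a matrix-normal draw for the coefficient matrix and an inverse-Wishart draw for the error covariance. The sketch below is a minimal illustration under that standard convention; the paper's exact degrees-of-freedom choice and notation may differ.

        # Minimal Gibbs-sampling sketch for the multivariate regression Y = X B + E,
        # E ~ N(0, Sigma), under a non-informative Jeffreys-type prior. The conditionals
        # used here (matrix normal for B, inverse Wishart for Sigma) are the standard
        # textbook ones; the paper's exact convention may differ.
        import numpy as np
        from scipy.stats import invwishart

        def gibbs_mvreg(Y, X, n_iter=2000):
            n, p = Y.shape
            k = X.shape[1]
            XtX_inv = np.linalg.inv(X.T @ X)
            B_hat = XtX_inv @ X.T @ Y                    # OLS estimate, posterior mean of B
            Sigma = np.cov(Y - X @ B_hat, rowvar=False)  # starting value
            B_draws, Sigma_draws = [], []
            for _ in range(n_iter):
                # Draw vec(B) | Sigma, Y ~ N(vec(B_hat), Sigma kron (X'X)^-1).
                cov = np.kron(Sigma, XtX_inv)
                vecB = np.random.multivariate_normal(B_hat.flatten(order="F"), cov)
                B = vecB.reshape((k, p), order="F")
                # Draw Sigma | B, Y ~ InvWishart(n, S(B)), with S(B) the residual SSP matrix.
                resid = Y - X @ B
                Sigma = invwishart.rvs(df=n, scale=resid.T @ resid)
                B_draws.append(B)
                Sigma_draws.append(Sigma)
            return np.array(B_draws), np.array(Sigma_draws)

        # Usage: B_draws, Sigma_draws = gibbs_mvreg(Y, X) with Y of shape (n, p) and X of shape (n, k);
        # posterior means are approximated by B_draws.mean(axis=0) and Sigma_draws.mean(axis=0).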

  8. Theoretical clarity is not “Manicheanism”

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2011-01-01

    It is argued that in order to establish a new theoretical approach to information science it is necessary to express disagreement with some established views. The “social turn” in information science is not just exemplified in relation to the works of Marcia Bates but in relation to many different researchers in the field. Therefore it should not be taken personally, and the debate should focus on the substance. Marcia Bates has contributed considerably to information science. In spite of this some of her theoretical points of departure may be challenged. It is important to seek theoretical clarity, and this may involve a degree of schematic confrontation that should not be confused with theoretical one-sidedness, “Manicheanism” or lack of respect.

  9. Distributed Systems and Applications of Information Filtering and Retrieval

    CERN Document Server

    Giuliani, Alessandro; Semeraro, Giovanni; DART 2012

    2014-01-01

    This volume focuses on new challenges in distributed Information Filtering and Retrieval. It collects invited chapters and extended research contributions from the special session on Information Filtering and Retrieval: Novel Distributed Systems and Applications (DART) of the 4th International Conference on Knowledge Discovery and Information Retrieval (KDIR 2012), held in Barcelona, Spain, on 4-7 October 2012. The main focus of DART was to discuss and compare suitable novel solutions based on intelligent techniques and applied to real-world applications. The chapters of this book present a comprehensive review of related work and the state of the art. Authors, both practitioners and researchers, shared their results on several topics such as "Multi-Agent Systems", "Natural Language Processing", "Automatic Advertisement", "Customer Interaction Analytics", and "Opinion Mining". Contributions have been carefully reviewed by experts in the area, who also gave useful suggestions to improve the quality of the volume.

  10. Information system for administrating and distributing color images through internet

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The information system for administrating and distributing color images through the Internet ensures the consistent replication of color images, their storage in an on-line database, and their predictable distribution by means of a digitally distributed flow based on the Windows platform and POD (Print On Demand) technology. The consistent replication of color images, independently of the parameters of the processing equipment and of the features of the programs composing the technological flow, is ensured by the standard color management system defined by the ICC (International Color Consortium), which is integrated by the Windows operating system and by the POD technology. The latter minimize the noticeable differences between the colors captured, displayed or printed by various replication equipment and/or edited by various graphical applications. The system's integrated web application ensures the uploading of the color images into an on-line database and their administration and distribution among users via the Internet. To preserve the data expressed by the color images during their transfer along a digitally distributed flow, the software application includes an original tool ensuring the accurate replication of colors on computer displays or when printing them by means of various color printers or presses. For development and use, this application employs a hardware platform based on PC support and a competitive software platform based on the Windows operating system, the .NET development environment and the C# programming language. This information system is beneficial for creators and users of color images, the success of printed or on-line (Internet) publications depending on the sizeable, predictable and accurate replication of the colors employed for the visual expression of information in every field of activity of modern society. The herein introduced information system enables all interested persons to access the

  11. 242A Distributed Control System Year 2000 Acceptance Test Report

    Energy Technology Data Exchange (ETDEWEB)

    TEATS, M.C.

    1999-08-31

    This report documents acceptance test results for the 242-A Evaporator distributed control system upgrade to D/3 version 9.0-2 for year 2000 compliance. This report documents the test results obtained by acceptance testing as directed by procedure HNF-2695. This verification procedure will document the initial testing and evaluation of the potential 242-A Distributed Control System (DCS) operating difficulties across the year 2000 boundary and the calendar adjustments needed for the leap year. Baseline system performance data will be recorded using current, as-is operating system software. Data will also be collected for operating system software that has been modified to correct year 2000 problems. This verification procedure is intended to be generic such that it may be performed on any D/3{trademark} (GSE Process Solutions, Inc.) distributed control system that runs with the VMS{trademark} (Digital Equipment Corporation) operating system. This test may be run on simulation or production systems depending upon facility status. On production systems, DCS outages will occur nine times throughout performance of the test. These outages are expected to last about 10 minutes each.

  12. 242A Distributed Control System Year 2000 Acceptance Test Report

    International Nuclear Information System (INIS)

    TEATS, M.C.

    1999-01-01

    This report documents acceptance test results for the 242-A Evaporator distributed control system upgrade to D/3 version 9.0-2 for year 2000 compliance. This report documents the test results obtained by acceptance testing as directed by procedure HNF-2695. This verification procedure will document the initial testing and evaluation of the potential 242-A Distributed Control System (DCS) operating difficulties across the year 2000 boundary and the calendar adjustments needed for the leap year. Baseline system performance data will be recorded using current, as-is operating system software. Data will also be collected for operating system software that has been modified to correct year 2000 problems. This verification procedure is intended to be generic such that it may be performed on any D/3(trademark) (GSE Process Solutions, Inc.) distributed control system that runs with the VMS(trademark) (Digital Equipment Corporation) operating system. This test may be run on simulation or production systems depending upon facility status. On production systems, DCS outages will occur nine times throughout performance of the test. These outages are expected to last about 10 minutes each

  13. Universality in an information-theoretic motivated nonlinear Schrodinger equation

    International Nuclear Information System (INIS)

    Parwani, R; Tabia, G

    2007-01-01

    Using perturbative methods, we analyse a nonlinear generalization of Schrodinger's equation that had previously been obtained through information-theoretic arguments. We obtain analytical expressions for the leading correction, in terms of the nonlinearity scale, to the energy eigenvalues of the linear Schrodinger equation in the presence of an external potential and observe some generic features. In one space dimension these are (i) for nodeless ground states, the energy shifts are subleading in the nonlinearity parameter compared to the shifts for the excited states; (ii) the shifts for the excited states are due predominantly to contribution from the nodes of the unperturbed wavefunctions, and (iii) the energy shifts for excited states are positive for small values of a regulating parameter and negative at large values, vanishing at a universal critical value that is not manifest in the equation. Some of these features hold true for higher dimensional problems. We also study two exactly solved nonlinear Schrodinger equations so as to contrast our observations. Finally, we comment on the possible significance of our results if the nonlinearity is physically realized

  14. Concerns regarding a call for pluralism of information theory and hypothesis testing

    Science.gov (United States)

    Lukacs, P.M.; Thompson, W.L.; Kendall, W.L.; Gould, W.R.; Doherty, P.F.; Burnham, K.P.; Anderson, D.R.

    2007-01-01

    1. Stephens et al. (2005) argue for 'pluralism' in statistical analysis, combining null hypothesis testing and information-theoretic (I-T) methods. We show that I-T methods are more informative even in single-variable problems and we provide an ecological example. 2. I-T methods allow inferences to be made from multiple models simultaneously. We believe multimodel inference is the future of data analysis, which cannot be achieved with null hypothesis-testing approaches. 3. We argue for a stronger emphasis on critical thinking in science in general and less reliance on exploratory data analysis and data dredging. Deriving alternative hypotheses is central to science; deriving a single interesting science hypothesis and then comparing it to a default null hypothesis (e.g. 'no difference') is not an efficient strategy for gaining knowledge. We think this single-hypothesis strategy has been relied upon too often in the past. 4. We clarify misconceptions presented by Stephens et al. (2005). 5. We think inference should be made about models, directly linked to scientific hypotheses, and their parameters conditioned on data, Prob(Hj | data). I-T methods provide a basis for this inference. Null hypothesis testing merely provides a probability statement about the data conditioned on a null model, Prob(data | H0). 6. Synthesis and applications. I-T methods provide a more informative approach to inference. I-T methods provide a direct measure of evidence for or against hypotheses and a means to consider simultaneously multiple hypotheses as a basis for rigorous inference. Progress in our science can be accelerated if modern methods can be used intelligently; this includes various I-T and Bayesian methods.
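
    One concrete I-T workflow of the kind advocated here is multimodel inference with AIC differences and Akaike weights. The sketch below is a generic illustration (the log-likelihood values and parameter counts are invented), not the authors' ecological example.

        # Rank a set of fitted candidate models by AIC and compute Akaike weights,
        # which can be read as relative evidence for each model/hypothesis.
        import numpy as np

        def akaike_weights(log_likelihoods, n_params):
            """Return AIC values, AIC differences, and Akaike weights for candidate models."""
            aic = np.array([-2.0 * ll + 2.0 * k for ll, k in zip(log_likelihoods, n_params)])
            delta = aic - aic.min()              # AIC differences relative to the best model
            w = np.exp(-0.5 * delta)
            return aic, delta, w / w.sum()

        # Hypothetical example: three candidate models fitted to the same data set.
        aic, delta, weights = akaike_weights(log_likelihoods=[-102.3, -100.1, -99.8],
                                             n_params=[2, 3, 4])
        for i, (d, w) in enumerate(zip(delta, weights), 1):
            print(f"model {i}: delta AIC = {d:5.2f}, Akaike weight = {w:.2f}")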

  15. Mnemonic transmission, social contagion, and emergence of collective memory: Influence of emotional valence, group structure, and information distribution.

    Science.gov (United States)

    Choi, Hae-Yoon; Kensinger, Elizabeth A; Rajaram, Suparna

    2017-09-01

    Social transmission of memory and its consequence on collective memory have generated enduring interdisciplinary interest because of their widespread significance in interpersonal, sociocultural, and political arenas. We tested the influence of 3 key factors (emotional salience of information, group structure, and information distribution) on mnemonic transmission, social contagion, and collective memory. Participants individually studied emotionally salient (negative or positive) and nonemotional (neutral) picture-word pairs that were completely shared, partially shared, or unshared within participant triads, and then completed 3 consecutive recalls in 1 of 3 conditions: individual-individual-individual (control), collaborative-collaborative (identical group; insular structure)-individual, and collaborative-collaborative (reconfigured group; diverse structure)-individual. Collaboration enhanced negative memories especially in insular group structure and especially for shared information, and promoted collective forgetting of positive memories. Diverse group structure reduced this negativity effect. Unequally distributed information led to social contagion that creates false memories; diverse structure propagated a greater variety of false memories whereas insular structure promoted confidence in false recognition and false collective memory. A simultaneous assessment of network structure, information distribution, and emotional valence breaks new ground to specify how network structure shapes the spread of negative memories and false memories, and the emergence of collective memory. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  16. A multivariate rank test for comparing mass size distributions

    KAUST Repository

    Lombard, F.

    2012-04-01

    Particle size analyses of a raw material are commonplace in the mineral processing industry. Knowledge of particle size distributions is crucial in planning milling operations to enable an optimum degree of liberation of valuable mineral phases, to minimize plant losses due to an excess of oversize or undersize material or to attain a size distribution that fits a contractual specification. The problem addressed in the present paper is how to test the equality of two or more underlying size distributions. A distinguishing feature of these size distributions is that they are not based on counts of individual particles. Rather, they are mass size distributions giving the fractions of the total mass of a sampled material lying in each of a number of size intervals. As such, the data are compositional in nature, using the terminology of Aitchison [1], that is, multivariate vectors the components of which add to 100%. In the literature, various versions of Hotelling's T² have been used to compare matched pairs of such compositional data. In this paper, we propose a robust test procedure based on ranks as a competitor to Hotelling's T². In contrast to the latter statistic, the power of the rank test is not unduly affected by the presence of outliers or of zeros among the data. © 2012 Copyright Taylor and Francis Group, LLC.

  17. Multi parton interactions and multi parton distributions in QCD

    International Nuclear Information System (INIS)

    Diehl, M.

    2012-01-01

    After a brief recapitulation of the general interest of parton densities, we discuss multiple hard interactions and multi parton distributions. We report on recent theoretical progress in their QCD description, on outstanding conceptual problems and on possibilities to use multi parton distributions as a laboratory to test and improve our understanding of hadron structure. (author)

  18. Test Information Targeting Strategies for Adaptive Multistage Testing Designs.

    Science.gov (United States)

    Luecht, Richard M.; Burgin, William

    Adaptive multistage testlet (MST) designs appear to be gaining popularity for many large-scale computer-based testing programs. These adaptive MST designs use a modularized configuration of preconstructed testlets and embedded score-routing schemes to prepackage different forms of an adaptive test. The conditional information targeting (CIT)…

  19. Towards an Information Model of Consistency Maintenance in Distributed Interactive Applications

    Directory of Open Access Journals (Sweden)

    Xin Zhang

    2008-01-01

    Full Text Available A novel framework to model and explore predictive contract mechanisms in distributed interactive applications (DIAs) using information theory is proposed. In our model, the entity state update scheme is modelled as an information generation, encoding, and reconstruction process. Such a perspective facilitates a quantitative measurement of state fidelity loss as a result of the distribution protocol. Results from an experimental study on a first-person shooter game are used to illustrate the utility of this measurement process. We contend that our proposed model is a starting point to reframe and analyse consistency maintenance in DIAs as a problem in distributed interactive media compression.

  20. Tiered guidance for risk-informed environmental health and safety testing of nanotechnologies

    International Nuclear Information System (INIS)

    Collier, Zachary A.; Kennedy, Alan J.; Poda, Aimee R.; Cuddy, Michael F.; Moser, Robert D.; MacCuspie, Robert I.; Harmon, Ashley; Plourde, Kenton; Haines, Christopher D.; Steevens, Jeffery A.

    2015-01-01

    Given the rapid emergence of novel technologies containing engineered nanomaterials, there is a need to better understand the potential environmental, health, and safety effects of nanotechnologies before wide-scale deployment. However, the unique properties of nanomaterials and uncertainty regarding applicable test methods have led to a lack of consensus regarding the collection and evaluation of data related to hazard and exposure potentials. Often, overly conservative approaches to characterization and data collection result in prolonged, unfocused, or irrelevant testing, which increases costs and delays deployment. In this paper, we provide a novel testing guidance framework for determining whether a nanotechnology has the potential to release material with nano-specific parameters that pose a risk to humans or the environment. The framework considers methods to categorize nanotechnologies by their structure and within their relevant-use scenarios to inform testing in a time- and resource-limited reality. Based on the precedent of dredged sediment testing, a five-tiered approach is proposed in which opportunities are presented to conclude testing once sufficient risk-related information has been collected, or once it is determined that the technology in question does not require nano-specific scrutiny. A series of screening stages are suggested, covering relevant aspects including size, surface area, distribution, unique behaviors, and release potential. The tiered, adaptive guidance approach allows users to concentrate on collecting the most relevant data, thus accelerating technology deployment while minimizing risk.

  1. Uncovering Bugs in Distributed Storage Systems during Testing (not in Production!)

    OpenAIRE

    Deligiannis, P; McCutchen, M; Thomson, P; Chen, S; Donaldson, AF; Erickson, J; Huang, C; Lal, A; Mudduluru, R; Qadeer, S; Schulte, W

    2016-01-01

    Testing distributed systems is challenging due to multiple sources of nondeterminism. Conventional testing techniques, such as unit, integration and stress testing, are ineffective in preventing serious but subtle bugs from reaching production. Formal techniques, such as TLA+, can only verify high-level specifications of systems at the level of logic-based models, and fall short of checking the actual executable code. In this paper, we present a new methodology for testing distributed systems...

  2. Project W-320 acceptance test report for AY-farm electrical distribution

    International Nuclear Information System (INIS)

    Bevins, R.R.

    1998-01-01

    This Acceptance Test Procedure (ATP) has been prepared to demonstrate that the AY-Farm Electrical Distribution System functions as required by the design criteria. This test is divided into three parts to support the planned construction schedule: Section 8 tests Mini-Power Panel AY102-PPI and the EES; Section 9 tests the SSS support systems; Section 10 tests the SSS and the Multi-Pak Group Control Panel. This test does not include the operation of end-use components (loads) supplied from the distribution system. Tests of the end-use components (loads) will be performed by other W-320 ATPs

  3. Testing the Pareto against the lognormal distributions with the uniformly most powerful unbiased test applied to the distribution of cities.

    Science.gov (United States)

    Malevergne, Yannick; Pisarenko, Vladilen; Sornette, Didier

    2011-03-01

    Fat-tail distributions of sizes abound in natural, physical, economic, and social systems. The lognormal and the power laws have historically competed for recognition with sometimes closely related generating processes and hard-to-distinguish tail properties. This state of affairs is illustrated with the debate between Eeckhout [Amer. Econ. Rev. 94, 1429 (2004)] and Levy [Amer. Econ. Rev. 99, 1672 (2009)] on the validity of Zipf's law for US city sizes. By using a uniformly most powerful unbiased (UMPU) test between the lognormal and the power-law distributions, we show that conclusive results can be achieved to end this debate. We advocate the UMPU test as a systematic tool to address similar controversies in the literature of many disciplines involving power laws, scaling, "fat" or "heavy" tails. In order to demonstrate that our procedure works for data sets other than the US city size distribution, we also briefly present the results obtained for the power-law tail of the distribution of personal identity (ID) losses, which constitute one of the major emergent risks at the interface between cyberspace and reality.
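
    The underlying model-comparison problem can be made concrete with a naive maximum-likelihood comparison of the two candidate distributions on synthetic data. The sketch below is emphatically not the UMPU statistic developed in the paper; it only fits a Pareto and a lognormal model to the same sample and compares log-likelihoods, with all sample sizes and parameters invented.

        # Naive comparison of a Pareto (power-law) fit and a lognormal fit on synthetic data.
        # This is NOT the UMPU test of the paper, only an illustration of the problem it addresses.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        xmin = 1.0
        x = stats.pareto.rvs(b=1.2, scale=xmin, size=2000, random_state=rng)  # synthetic "city sizes"

        # Pareto MLE with xmin fixed: alpha_hat = n / sum(log(x / xmin)).
        alpha_hat = x.size / np.log(x / xmin).sum()
        ll_pareto = np.sum(stats.pareto.logpdf(x, b=alpha_hat, scale=xmin))

        # Lognormal MLE (location fixed at zero).
        s, loc, scale = stats.lognorm.fit(x, floc=0)
        ll_lognorm = np.sum(stats.lognorm.logpdf(x, s, loc=loc, scale=scale))

        print(f"log-likelihood: Pareto {ll_pareto:.1f} vs lognormal {ll_lognorm:.1f}")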

  4. Model-free information-theoretic approach to infer leadership in pairs of zebrafish.

    Science.gov (United States)

    Butail, Sachit; Mwaffo, Violet; Porfiri, Maurizio

    2016-04-01

    Collective behavior affords several advantages to fish in avoiding predators, foraging, mating, and swimming. Although fish schools have been traditionally considered egalitarian superorganisms, a number of empirical observations suggest the emergence of leadership in gregarious groups. Detecting and classifying leader-follower relationships is central to elucidate the behavioral and physiological causes of leadership and understand its consequences. Here, we demonstrate an information-theoretic approach to infer leadership from positional data of fish swimming. In this framework, we measure social interactions between fish pairs through the mathematical construct of transfer entropy, which quantifies the predictive power of a time series to anticipate another, possibly coupled, time series. We focus on the zebrafish model organism, which is rapidly emerging as a species of choice in preclinical research for its genetic similarity to humans and reduced neurobiological complexity with respect to mammals. To overcome experimental confounds and generate test data sets on which we can thoroughly assess our approach, we adapt and calibrate a data-driven stochastic model of zebrafish motion for the simulation of a coupled dynamical system of zebrafish pairs. In this synthetic data set, the extent and direction of the coupling between the fish are systematically varied across a wide parameter range to demonstrate the accuracy and reliability of transfer entropy in inferring leadership. Our approach is expected to aid in the analysis of collective behavior, providing a data-driven perspective to understand social interactions.
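
    Transfer entropy itself can be illustrated with a minimal plug-in estimator on coarsely discretized series. The sketch below uses a history length of one and quantile binning on invented data; it is a toy version of the quantity, not the calibrated estimator applied to the zebrafish model in the paper.

        # Minimal plug-in estimator of transfer entropy T(X -> Y) on discretized series,
        # with history length 1. Toy illustration only; all data are synthetic.
        import numpy as np
        from collections import Counter

        def transfer_entropy(x, y, bins=4):
            """Plug-in estimate of T(X -> Y) in bits, with history length 1."""
            xd = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
            yd = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
            triples = Counter(zip(yd[1:], yd[:-1], xd[:-1]))   # (y_{t+1}, y_t, x_t)
            pairs_yx = Counter(zip(yd[:-1], xd[:-1]))
            pairs_yy = Counter(zip(yd[1:], yd[:-1]))
            singles = Counter(yd[:-1])
            n = len(yd) - 1
            te = 0.0
            for (y1, y0, x0), c in triples.items():
                p_joint = c / n
                p_cond_full = c / pairs_yx[(y0, x0)]            # p(y_{t+1} | y_t, x_t)
                p_cond_self = pairs_yy[(y1, y0)] / singles[y0]  # p(y_{t+1} | y_t)
                te += p_joint * np.log2(p_cond_full / p_cond_self)
            return te

        # Toy example: y follows x with a one-step lag, so T(X->Y) should exceed T(Y->X),
        # which is the kind of asymmetry used to infer a leader-follower relationship.
        rng = np.random.default_rng(0)
        x = rng.normal(size=5000)
        y = 0.8 * np.roll(x, 1) + 0.2 * rng.normal(size=5000)
        print("T(X->Y) =", round(transfer_entropy(x, y), 3),
              " T(Y->X) =", round(transfer_entropy(y, x), 3))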

  5. Local Information as a Resource in Distributed Quantum Systems

    Science.gov (United States)

    Horodecki, Michał; Horodecki, Karol; Horodecki, Paweł; Horodecki, Ryszard; Oppenheim, Jonathan; Sende, Aditi; Sen, Ujjwal

    2003-03-01

    A new paradigm for distributed quantum systems where information is a valuable resource is developed. After finding a unique measure for information, we construct a scheme for its manipulation in analogy with entanglement theory. In this scheme, instead of maximally entangled states, two parties distill local states. We show that, surprisingly, the main tools of entanglement theory are general enough to work in this opposite scheme. Up to plausible assumptions, we show that the amount of information that must be lost during the protocol of concentration of local information can be expressed as the relative entropy distance from some special set of states.

  6. Perspectives on Cybersecurity Information Sharing among Multiple Stakeholders Using a Decision-Theoretic Approach.

    Science.gov (United States)

    He, Meilin; Devine, Laura; Zhuang, Jun

    2018-02-01

    The government, private sectors, and other users of the Internet are increasingly faced with the risk of cyber incidents. Damage to computer systems and theft of sensitive data caused by cyber attacks have the potential to result in lasting harm to entities under attack, or to society as a whole. The effects of cyber attacks are not always obvious, and detecting them is not a simple proposition. As the U.S. federal government believes that information sharing on cybersecurity issues among organizations is essential to safety, security, and resilience, the importance of trusted information exchange has been emphasized to support public and private decision making by encouraging the creation of the Information Sharing and Analysis Center (ISAC). Through a decision-theoretic approach, this article provides new perspectives on ISAC, and the advent of the new Information Sharing and Analysis Organizations (ISAOs), which are intended to provide similar benefits to organizations that cannot fit easily into the ISAC structure. To help understand the processes of information sharing against cyber threats, this article illustrates 15 representative information sharing structures between ISAC, government, and other participating entities, and provides discussions of the strategic interactions between different stakeholders. This article also identifies the costs of information sharing and information security borne by different parties in this public-private partnership both before and after cyber attacks, as well as the two main benefits. This article provides perspectives on the mechanism of information sharing and some detailed cost-benefit analysis. © 2017 Society for Risk Analysis.

  7. Game-Theoretic Learning in Distributed Control

    KAUST Repository

    Marden, Jason R.; Shamma, Jeff S.

    2018-01-01

    from autonomous vehicles to energy to transportation. One approach to control of such distributed architectures is to view the components as players in a game. In this approach, two design considerations are the components’ incentives and the rules

  8. Learning to merge search results for efficient Distributed Information Retrieval

    NARCIS (Netherlands)

    Tjin-Kam-Jet, Kien; Hiemstra, Djoerd

    2010-01-01

    Merging search results from different servers is a major problem in Distributed Information Retrieval. We used Regression-SVM and Ranking-SVM which would learn a function that merges results based on information that is readily available: i.e. the ranks, titles, summaries and URLs contained in the

  9. HammerCloud: A Stress Testing System for Distributed Analysis

    International Nuclear Information System (INIS)

    Ster, Daniel C van der; García, Mario Úbeda; Paladin, Massimo; Elmsheuser, Johannes

    2011-01-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain a predefined number of jobs running at steady state at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large-scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).

  10. Machine Learning with Squared-Loss Mutual Information

    Directory of Open Access Journals (Sweden)

    Masashi Sugiyama

    2012-12-01

    Full Text Available Mutual information (MI) is useful for detecting statistical independence between random variables, and it has been successfully applied to solving various machine learning problems. Recently, an alternative to MI called squared-loss MI (SMI) was introduced. While ordinary MI is the Kullback–Leibler divergence from the joint distribution to the product of the marginal distributions, SMI is its Pearson divergence variant. Because both divergences belong to the f-divergence family, they share similar theoretical properties. However, a notable advantage of SMI is that it can be approximated from data in a computationally more efficient and numerically more stable way than ordinary MI. In this article, we review recent developments in SMI approximation based on direct density-ratio estimation and SMI-based machine learning techniques such as independence testing, dimensionality reduction, canonical dependency analysis, independent component analysis, object matching, clustering, and causal inference.
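
    For reference, the two quantities are usually written as follows in the SMI literature (notation ours; the article's exact conventions may differ):

        \mathrm{MI}(X;Y)  = \iint p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}\,dx\,dy
        \mathrm{SMI}(X;Y) = \frac{1}{2}\iint p(x)\,p(y)\left(\frac{p(x,y)}{p(x)\,p(y)} - 1\right)^{2} dx\,dy

    Both are non-negative and vanish exactly when X and Y are statistically independent, which is what makes them usable as independence measures.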

  11. Test Expectancy and Memory for Important Information

    Science.gov (United States)

    Middlebrooks, Catherine D.; Murayama, Kou; Castel, Alan D.

    2017-01-01

    Prior research suggests that learners study and remember information differently depending upon the type of test they expect to later receive. The current experiments investigate how testing expectations impact the study of and memory for valuable information. Participants studied lists of words ranging in value from 1 to 10 points with the goal…

  12. RECONSTRUCTING REDSHIFT DISTRIBUTIONS WITH CROSS-CORRELATIONS: TESTS AND AN OPTIMIZED RECIPE

    International Nuclear Information System (INIS)

    Matthews, Daniel J.; Newman, Jeffrey A.

    2010-01-01

    Many of the cosmological tests to be performed by planned dark energy experiments will require extremely well-characterized photometric redshift measurements. Current estimates for cosmic shear are that the true mean redshift of the objects in each photo-z bin must be known to better than 0.002(1 + z), and the width of the bin must be known to ∼0.003(1 + z) if errors in cosmological measurements are not to be degraded significantly. A conventional approach is to calibrate these photometric redshifts with large sets of spectroscopic redshifts. However, at the depths probed by Stage III surveys (such as DES), let alone Stage IV (LSST, JDEM, and Euclid), existing large redshift samples have all been highly (25%-60%) incomplete, with a strong dependence of success rate on both redshift and galaxy properties. A powerful alternative approach is to exploit the clustering of galaxies to perform photometric redshift calibrations. Measuring the two-point angular cross-correlation between objects in some photometric redshift bin and objects with known spectroscopic redshift, as a function of the spectroscopic z, allows the true redshift distribution of a photometric sample to be reconstructed in detail, even if it includes objects too faint for spectroscopy or if spectroscopic samples are highly incomplete. We test this technique using mock DEEP2 Galaxy Redshift survey light cones constructed from the Millennium Simulation semi-analytic galaxy catalogs. From this realistic test, which incorporates the effects of galaxy bias evolution and cosmic variance, we find that the true redshift distribution of a photometric sample can, in fact, be determined accurately with cross-correlation techniques. We also compare the empirical error in the reconstruction of redshift distributions to previous analytic predictions, finding that additional components must be included in error budgets to match the simulation results. This extra error contribution is small for surveys that sample

  13. Information access, income distribution, and the Environmental Kuznets Curve

    Energy Technology Data Exchange (ETDEWEB)

    Bimonte, Salvatore [Department of Political Economy, University of Siena, Piazza S. Francesco 7, 53100 Siena (Italy)

    2002-04-01

    Recent empirical studies have tested the hypothesis of an Environmental Kuznets Curve (EKC) focusing primarily on the relationship between per capita income and certain types of pollutant emissions. Given the stock-nature of many pollution problems, emissions only partially account for the environmental impacts. Moreover, almost all of the studies have given consideration to little more than income levels as explanatory variables. This paper empirically tests the hypothesis of the EKC existence for a stock-sensitive indicator, that is, the percentage of protected area (PA) within national territory. It does theorize that economic growth is a necessary condition in order to better address environmental issues. But it also stresses that other variables (income distribution, education, information accessibility) may play a fundamental role in determining environmental quality. Contrary to other studies that mainly focus on the calculation of the income level corresponding to the transition point, this paper is more concerned with the calculation of environmental quality corresponding to that transition point, that is, the minimum level of environmental quality that a country is willing to accept. This paper highlights the idea that if the transition point is determined primarily by income level, social policies determine the level of environmental quality corresponding to that point.

  14. Information access, income distribution, and the Environmental Kuznets Curve

    International Nuclear Information System (INIS)

    Bimonte, Salvatore

    2002-01-01

    Recent empirical studies have tested the hypothesis of an Environmental Kuznets Curve (EKC) focusing primarily on the relationship between per capita income and certain types of pollutant emissions. Given the stock-nature of many pollution problems, emissions only partially account for the environmental impacts. Moreover, almost all of the studies have given consideration to little more than income levels as explanatory variables. This paper empirically tests the hypothesis of the EKC existence for a stock-sensitive indicator, that is, the percentage of protected area (PA) within national territory. It does theorize that economic growth is a necessary condition in order to better address environmental issues. But it also stresses that other variables (income distribution, education, information accessibility) may play a fundamental role in determining environmental quality. Contrary to other studies that mainly focus on the calculation of the income level corresponding to the transition point, this paper is more concerned with the calculation of environmental quality corresponding to that transition point, that is, the minimum level of environmental quality that a country is willing to accept. This paper highlights the idea that if the transition point is determined primarily by income level, social policies determine the level of environmental quality corresponding to that point

  15. Computational Study of Chemical Reactivity Using Information-Theoretic Quantities from Density Functional Reactivity Theory for Electrophilic Aromatic Substitution Reactions.

    Science.gov (United States)

    Wu, Wenjie; Wu, Zemin; Rong, Chunying; Lu, Tian; Huang, Ying; Liu, Shubin

    2015-07-23

    The electrophilic aromatic substitution for nitration, halogenation, sulfonation, and acylation is a vastly important category of chemical transformation. Its reactivity and regioselectivity is predominantly determined by nucleophilicity of carbon atoms on the aromatic ring, which in return is immensely influenced by the group that is attached to the aromatic ring a priori. In this work, taking advantage of recent developments in quantifying nucleophilicity (electrophilicity) with descriptors from the information-theoretic approach in density functional reactivity theory, we examine the reactivity properties of this reaction system from three perspectives. These include scaling patterns of information-theoretic quantities such as Shannon entropy, Fisher information, Ghosh-Berkowitz-Parr entropy and information gain at both molecular and atomic levels, quantitative predictions of the barrier height with both Hirshfeld charge and information gain, and energetic decomposition analyses of the barrier height for the reactions. To that end, we focused in this work on the identity reaction of the monosubstituted-benzene molecule reacting with hydrogen fluoride using boron trifluoride as the catalyst in the gas phase. We also considered 19 substituting groups, 9 of which are ortho/para directing and the other 9 meta directing, besides the case of R = -H. Similar scaling patterns for these information-theoretic quantities found for stable species elsewhere were disclosed for these reactions systems. We also unveiled novel scaling patterns for information gain at the atomic level. The barrier height of the reactions can reliably be predicted by using both the Hirshfeld charge and information gain at the regioselective carbon atom. The energy decomposition analysis ensued yields an unambiguous picture about the origin of the barrier height, where we showed that it is the electrostatic interaction that plays the dominant role, while the roles played by exchange-correlation and
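
    For orientation, three of the information-theoretic quantities named above are commonly written, for an electron density ρ(r) and a reference density ρ0(r), as follows (standard forms; not necessarily the exact working equations of the paper):

        S_S = -\int \rho(\mathbf{r}) \ln \rho(\mathbf{r}) \, d\mathbf{r}
        I_F = \int \frac{|\nabla \rho(\mathbf{r})|^{2}}{\rho(\mathbf{r})} \, d\mathbf{r}
        I_G = \int \rho(\mathbf{r}) \ln \frac{\rho(\mathbf{r})}{\rho_{0}(\mathbf{r})} \, d\mathbf{r}

    Here S_S is the Shannon entropy, I_F the Fisher information, and I_G the information gain (Kullback-Leibler divergence) relative to the reference density.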

  16. Distribution of model-based multipoint heterogeneity lod scores.

    Science.gov (United States)

    Xing, Chao; Morris, Nathan; Xing, Guan

    2010-12-01

    The distribution of two-point heterogeneity lod scores (HLOD) has been intensively investigated because the conventional χ² approximation to the likelihood ratio test is not directly applicable. However, there was no study investigating the distribution of the multipoint HLOD despite its wide application. Here we want to point out that, compared with the two-point HLOD, the multipoint HLOD essentially tests for homogeneity given linkage and follows a relatively simple limiting distribution ½χ²₀ + ½χ²₁, which can be obtained by established statistical theory. We further examine the theoretical result by simulation studies. © 2010 Wiley-Liss, Inc.
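
    Because the χ²₀ component is a point mass at zero, a p-value under this limiting mixture reduces to half of a one-degree-of-freedom tail probability. A minimal sketch, assuming the usual 2 ln(10) conversion from the LOD scale to a likelihood-ratio scale:

        # p-value under the 1/2*chi2(0) + 1/2*chi2(1) limiting mixture quoted in the abstract.
        # chi2(0) is a point mass at zero, so for an observed statistic t > 0 the p-value is
        # half the chi2(1) tail probability. The 2*ln(10) factor converts a LOD-scale score
        # to the likelihood-ratio (chi-square) scale.
        from math import log
        from scipy.stats import chi2

        def hlod_pvalue(hlod):
            t = 2.0 * log(10.0) * hlod
            return 0.5 * chi2.sf(t, df=1) if t > 0 else 1.0

        print(hlod_pvalue(3.0))   # e.g. HLOD = 3 gives p on the order of 1e-4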

  17. Agent paradigm and services technology for distributed Information Sources

    Directory of Open Access Journals (Sweden)

    Hakima Mellah

    2011-10-01

    Full Text Available The complexity of information arises from interacting information sources (IS) and could be better exploited with respect to the relevance of information. In a distributed IS system, relevant information has content that is connected with other contents in the information network and is used for a certain purpose. The key point of the proposed model is to contribute to information system agility through a three-dimensional view involving the content, the use and the structure. This reflects the relevance of information complexity and of effective methodologies based on the self-organization principle to manage that complexity. This contribution is primarily focused on presenting some factors that lead to and trigger self-organization in a Service Oriented Architecture (SOA), and on how a self-organization mechanism can be integrated into it.

  18. An Object-Oriented Information Model for Policy-based Management of Distributed Applications

    NARCIS (Netherlands)

    Diaz, G.; Gay, V.C.J.; Horlait, E.; Hamza, M.H.

    2002-01-01

    This paper presents an object-oriented information model to support policy-based management for distributed multimedia applications. The information base contains application-level information about the users, the applications, and their profile. Our information model is described in detail and

  19. Information Theoretic Characterization of Physical Theories with Projective State Space

    Science.gov (United States)

    Zaopo, Marco

    2015-08-01

    Probabilistic theories are a natural framework to investigate the foundations of quantum theory and possible alternative or deeper theories. In a generic probabilistic theory, states of a physical system are represented as vectors of outcome probabilities and state spaces are convex cones. In this picture the physics of a given theory is related to the geometric shape of the cone of states. In quantum theory, for instance, the shape of the cone of states corresponds to a projective space over complex numbers. In this paper we investigate geometric constraints on the state space of a generic theory imposed by the following information theoretic requirements: every non-completely-mixed state of a system is perfectly distinguishable from some other state in a single-shot measurement; information capacity of physical systems is conserved under making mixtures of states. These assumptions guarantee that a generic physical system satisfies a natural principle asserting that the more a state of the system is mixed, the less information can be stored in the system using that state as a logical value. We show that all theories satisfying the above assumptions are such that the shape of their cones of states is that of a projective space over a generic field of numbers. Remarkably, these theories constitute generalizations of quantum theory where the superposition principle holds with coefficients pertaining to a generic field of numbers in place of complex numbers. If the field of numbers is trivial and contains only one element we obtain classical theory. This result shows that the superposition principle is quite common among probabilistic theories, while its absence is evidence of either classical theory or an implausible theory.

  20. Symmetric Blind Information Reconciliation for Quantum Key Distribution

    International Nuclear Information System (INIS)

    Kiktenko, Evgeniy O.

    2017-01-01

    Quantum key distribution (QKD) is a quantum-proof key-exchange scheme which is fast approaching the communication industry. An essential component in QKD is the information reconciliation step, which is used for correcting the quantum-channel noise errors. The recently suggested blind-reconciliation technique, based on low-density parity-check codes, offers remarkable prospects for efficient information reconciliation without an a priori quantum bit error rate estimation. We suggest an improvement of the blind-information-reconciliation protocol that promotes a significant increase in the efficiency of the procedure and reduces its interactivity. The proposed technique is based on introducing symmetry in the operations of the parties and on considering the results of unsuccessful belief-propagation decodings.

  1. Geometry of the q-exponential distribution with dependent competing risks and accelerated life testing

    Science.gov (United States)

    Zhang, Fode; Shi, Yimin; Wang, Ruibing

    2017-02-01

    In the information geometry suggested by Amari (1985) and Amari et al. (1987), a parametric statistical model can be regarded as a differentiable manifold with the parameter space as a coordinate system. Noting that the q-exponential distribution plays an important role in Tsallis statistics (see Tsallis, 2009), this paper investigates the geometry of the q-exponential distribution with dependent competing risks and accelerated life testing (ALT). A copula function based on the q-exponential function, which can be considered a generalized Gumbel copula, is discussed to illustrate the structure of the dependent random variables. Employing two iterative algorithms, simulation results are given to compare the performance of the estimates and the levels of association under different hybrid progressive censoring schemes (HPCSs).
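
    For orientation, the Tsallis q-exponential function and the q-exponential density are commonly written as below; the paper's parameterization may differ:

        e_q(x) = \bigl[\,1 + (1-q)\,x\,\bigr]_{+}^{1/(1-q)} \quad (q \neq 1)
        f(x;\lambda,q) = (2-q)\,\lambda\, e_q(-\lambda x), \qquad x \ge 0,\ 1 \le q < 2

    As q tends to 1 the q-exponential reduces to the ordinary exponential function, so the q-exponential distribution contains the exponential distribution as a limiting case.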

  2. Photon path distribution and optical responses of turbid media: theoretical analysis based on the microscopic Beer-Lambert law.

    Science.gov (United States)

    Tsuchiya, Y

    2001-08-01

    A concise theoretical treatment has been developed to describe the optical responses of a highly scattering inhomogeneous medium using functions of the photon path distribution (PPD). The treatment is based on the microscopic Beer-Lambert law and has been found to yield a complete set of optical responses for time- and frequency-domain measurements. The PPD is defined for possible photons having a total zigzag pathlength of l between the points of light input and detection. Such a distribution is independent of the absorption properties of the medium and can be uniquely determined for the medium under quantification. Therefore, the PPD can be calculated with an imaginary reference medium having the same optical properties as the medium under quantification except for the absence of absorption. One of the advantages of this method is that the optical responses (the total attenuation, the mean pathlength, etc.) are expressed as functions of the PPD and the absorption distribution.
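
    The relation described above is often written as a Laplace-transform-like weighting of the absorption-free path-length distribution P(ℓ) by the absorption coefficient μ_a, with the mean pathlength obtained from the derivative of the log attenuation (a standard form, not necessarily the author's exact notation):

        I(\mu_a) = I_0 \int_0^{\infty} P(\ell)\, e^{-\mu_a \ell}\, d\ell
        \langle L \rangle = -\frac{\partial}{\partial \mu_a} \ln I(\mu_a)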

  3. Molecular acidity: An accurate description with information-theoretic approach in density functional reactivity theory.

    Science.gov (United States)

    Cao, Xiaofang; Rong, Chunying; Zhong, Aiguo; Lu, Tian; Liu, Shubin

    2018-01-15

    Molecular acidity is one of the important physiochemical properties of a molecular system, yet its accurate calculation and prediction are still an unresolved problem in the literature. In this work, we propose to make use of the quantities from the information-theoretic (IT) approach in density functional reactivity theory and provide an accurate description of molecular acidity from a completely new perspective. To illustrate our point, five different categories of acidic series, singly and doubly substituted benzoic acids, singly substituted benzenesulfinic acids, benzeneseleninic acids, phenols, and alkyl carboxylic acids, have been thoroughly examined. We show that using IT quantities such as Shannon entropy, Fisher information, Ghosh-Berkowitz-Parr entropy, information gain, Onicescu information energy, and relative Rényi entropy, one is able to simultaneously predict experimental pKa values of these different categories of compounds. Because of the universality of the quantities employed in this work, which are all density dependent, our approach should be general and be applicable to other systems as well. © 2017 Wiley Periodicals, Inc.

  4. A general algorithm for distributing information in a graph

    OpenAIRE

    Aji, Srinivas M.; McEliece, Robert J.

    1997-01-01

    We present a general “message-passing” algorithm for distributing information in a graph. This algorithm may help us to understand the approximate correctness of both the Gallager-Tanner-Wiberg algorithm, and the turbo-decoding algorithm.

  5. Adaptive Metropolis Sampling with Product Distributions

    Science.gov (United States)

    Wolpert, David H.; Lee, Chiu Fan

    2005-01-01

    The Metropolis-Hastings (MH) algorithm is a way to sample a provided target distribution pi(x). It works by repeatedly sampling a separate proposal distribution T(x,x') to generate a random walk {x(t)}. We consider a modification of the MH algorithm in which T is dynamically updated during the walk. The update at time t uses the samples {x(t') : t' < t} to estimate the product distribution that has the least Kullback-Leibler distance to pi. That estimate is the information-theoretically optimal mean-field approximation to pi. We demonstrate through computer experiments that our algorithm produces samples that are superior to those of the conventional MH algorithm.
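
    The adaptive idea can be sketched with a toy independence sampler whose factorized (product-form) Gaussian proposal is periodically refit to the walk history, which approximates the product of the target's marginals. This is an illustrative stand-in rather than the paper's exact update rule, and all parameter values and the example target are invented.

        # Toy Metropolis-Hastings sampler with an adaptive product-form (independence) proposal.
        # The proposal is periodically refit to the history of the walk, coordinate by coordinate,
        # as a rough stand-in for the KL-optimal mean-field approximation discussed in the abstract.
        import numpy as np

        def log_target(x):
            # Example target: a correlated 2-D Gaussian (invented for illustration).
            cov_inv = np.linalg.inv(np.array([[1.0, 0.8], [0.8, 1.0]]))
            return -0.5 * x @ cov_inv @ x

        def adaptive_product_mh(n_steps=20000, adapt_every=500, dim=2, seed=0):
            rng = np.random.default_rng(seed)
            mu, sigma = np.zeros(dim), np.ones(dim)   # parameters of the product proposal
            x = np.zeros(dim)
            chain = []
            for t in range(1, n_steps + 1):
                prop = rng.normal(mu, sigma)          # independence proposal prod_i N(mu_i, sigma_i^2)
                log_q = lambda z: np.sum(-0.5 * ((z - mu) / sigma) ** 2 - np.log(sigma))
                # Acceptance ratio for an independence sampler: pi(x') q(x) / (pi(x) q(x')).
                log_alpha = log_target(prop) + log_q(x) - log_target(x) - log_q(prop)
                if np.log(rng.uniform()) < log_alpha:
                    x = prop
                chain.append(x)
                if t % adapt_every == 0:              # refit the product proposal to the walk so far
                    hist = np.array(chain)
                    mu, sigma = hist.mean(axis=0), hist.std(axis=0) + 1e-3
            return np.array(chain)

        samples = adaptive_product_mh()
        print("sample mean:", samples.mean(axis=0))
        print("sample covariance:\n", np.cov(samples.T))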

  6. Forecasting an invasive species’ distribution with global distribution data, local data, and physiological information

    Science.gov (United States)

    Jarnevich, Catherine S.; Young, Nicholas E.; Talbert, Marian; Talbert, Colin

    2018-01-01

    Understanding invasive species distributions and potential invasions often requires broad‐scale information on the environmental tolerances of the species. Further, resource managers are often faced with knowing these broad‐scale relationships as well as nuanced environmental factors related to their landscape that influence where an invasive species occurs and potentially could occur. Using invasive buffelgrass (Cenchrus ciliaris), we developed global models and local models for Saguaro National Park, Arizona, USA, based on location records and literature on physiological tolerances to environmental factors to investigate whether environmental relationships of a species at a global scale are also important at local scales. In addition to correlative models with five commonly used algorithms, we also developed a model using a priori user‐defined relationships between occurrence and environmental characteristics based on a literature review. All correlative models at both scales performed well based on statistical evaluations. The user‐defined curves closely matched those produced by the correlative models, indicating that the correlative models may be capturing mechanisms driving the distribution of buffelgrass. Given climate projections for the region, both global and local models indicate that conditions at Saguaro National Park may become more suitable for buffelgrass. Combining global and local data with correlative models and physiological information provided a holistic approach to forecasting invasive species distributions.

  7. An information-theoretic approach to assess practical identifiability of parametric dynamical systems.

    Science.gov (United States)

    Pant, Sanjay; Lombardi, Damiano

    2015-10-01

    A new approach for assessing parameter identifiability of dynamical systems in a Bayesian setting is presented. The concept of Shannon entropy is employed to measure the inherent uncertainty in the parameters. The expected reduction in this uncertainty is seen as the amount of information one expects to gain about the parameters due to the availability of noisy measurements of the dynamical system. Such expected information gain is interpreted in terms of the variance of a hypothetical measurement device that can measure the parameters directly, and is related to practical identifiability of the parameters. If the individual parameters are unidentifiable, correlation between parameter combinations is assessed through conditional mutual information to determine which sets of parameters can be identified together. The information theoretic quantities of entropy and information are evaluated numerically through a combination of Monte Carlo and k-nearest neighbour methods in a non-parametric fashion. Unlike many methods to evaluate identifiability proposed in the literature, the proposed approach takes the measurement-noise into account and is not restricted to any particular noise-structure. Whilst computationally intensive for large dynamical systems, it is easily parallelisable and is non-intrusive as it does not necessitate re-writing of the numerical solvers of the dynamical system. The application of such an approach is presented for a variety of dynamical systems--ranging from systems governed by ordinary differential equations to partial differential equations--and, where possible, validated against results previously published in the literature. Copyright © 2015 Elsevier Inc. All rights reserved.
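
    The entropies mentioned above are typically evaluated with estimators of the Kozachenko-Leonenko type; a generic k-nearest-neighbour sketch is shown below. It is a standard non-parametric estimator, not necessarily the authors' exact implementation.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gamma

def knn_entropy(samples, k=3):
    """Kozachenko-Leonenko k-NN estimate of differential entropy (in nats)."""
    x = np.asarray(samples, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # distance to the k-th neighbour (query returns the point itself at index 0)
    r = tree.query(x, k=k + 1)[0][:, k]
    log_vd = (d / 2) * np.log(np.pi) - np.log(gamma(d / 2 + 1))  # log unit-ball volume
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(r + 1e-300))

# Example: a standard 2-D Gaussian has true entropy 0.5*d*log(2*pi*e) ~ 2.84 nats.
# print(knn_entropy(np.random.default_rng(1).standard_normal((5000, 2))))
```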

  8. Theoretical investigation of metal magnetic memory testing technique for detection of magnetic flux leakage signals from buried defect

    Science.gov (United States)

    Xu, Kunshan; Qiu, Xingqi; Tian, Xiaoshuai

    2018-01-01

    The metal magnetic memory testing (MMMT) technique has been extensively applied in various fields because of its unique advantages of easy operation, low cost and high efficiency. However, very limited theoretical research has been conducted on application of MMMT to buried defects. To promote study in this area, the equivalent magnetic charge method is employed to establish a self-magnetic flux leakage (SMFL) model of a buried defect. Theoretical results based on the established model successfully capture basic characteristics of the SMFL signals of buried defects, as confirmed via experiment. In particular, the newly developed model can calculate the buried depth of a defect based on the SMFL signals obtained via testing. The results show that the new model can successfully assess the characteristics of buried defects, which is valuable in the application of MMMT in non-destructive testing.

  9. Spatially-Explicit Bayesian Information Entropy Metrics for Calibrating Landscape Transformation Models

    Directory of Open Access Journals (Sweden)

    Kostas Alexandridis

    2013-06-01

    Full Text Available Assessing spatial model performance often presents challenges related to the choice and suitability of traditional statistical methods in capturing the true validity and dynamics of the predicted outcomes. The stochastic nature of many of our contemporary spatial models of land use change necessitates the testing and development of new and innovative methodologies in statistical spatial assessment. In many cases, spatial model performance depends critically on the spatially-explicit prior distributions, characteristics, availability and prevalence of the variables and factors under study. This study explores the statistical spatial characteristics of statistical model assessment of modeling land use change dynamics in a seven-county study area in South-Eastern Wisconsin during the historical period of 1963–1990. The artificial neural network-based Land Transformation Model (LTM) predictions are used to compare simulated with historical land use transformations in urban/suburban landscapes. We introduce a range of Bayesian information entropy statistical spatial metrics for assessing the model performance across multiple simulation testing runs. Bayesian entropic estimates of model performance are compared against information-theoretic stochastic entropy estimates and theoretically-derived accuracy assessments. We argue for the critical role of informational uncertainty across different scales of spatial resolution in informing spatial landscape model assessment. Our analysis reveals how incorporation of spatial and landscape information asymmetry estimates can improve our stochastic assessments of spatial model predictions. Finally, our study shows how spatially-explicit entropic classification accuracy estimates can work closely with dynamic modeling methodologies in improving our scientific understanding of landscape change as a complex adaptive system and process.

  10. Constructing Common Information Space across Distributed Emergency Medical Teams

    DEFF Research Database (Denmark)

    Zhang, Zhan; Sarcevic, Aleksandra; Bossen, Claus

    2017-01-01

    This paper examines coordination and real-time information sharing across four emergency medical teams in a high-risk and distributed setting as they provide care to critically injured patients within the first hour after injury. Through multiple field studies we explored how common understanding...... of critical patient data is established across these heterogeneous teams and what coordination mechanisms are being used to support information sharing and interpretation. To analyze the data, we drew on the concept of Common Information Spaces (CIS). Our results showed that teams faced many challenges...... in achieving efficient information sharing and coordination, including difficulties in locating and assembling team members, communicating and interpreting information from the field, and accommodating differences in team goals and information needs, all while having minimal technology support. We reflect...

  11. Research on Human Dynamics of Information Release of WeChat Users

    OpenAIRE

    Zhang, Juliang; Zhang, Shengtai; Duo, Fan; Wang, Feifei

    2017-01-01

    The information release behavior of WeChat users is influenced by many factors, and studying the rules of the behavior of users in WeChat can provide theoretical help for the dynamic research of mobile social network users. By crawling WeChat moments information of nine users within 5 years, we used the human behavioral dynamics system to analyze users' behavior. The results show that the information distribution behavior of WeChat users is consistent with the power-law distribution for a cer...

  12. Multirobot autonomous landmine detection using distributed multisensor information aggregation

    Science.gov (United States)

    Jumadinova, Janyl; Dasgupta, Prithviraj

    2012-06-01

    We consider the problem of distributed sensor information fusion by multiple autonomous robots within the context of landmine detection. We assume that different landmines can be composed of different types of material and robots are equipped with different types of sensors, while each robot has only one type of landmine detection sensor on it. We introduce a novel technique that uses a market-based information aggregation mechanism called a prediction market. Each robot is provided with a software agent that uses sensory input of the robot and performs calculations of the prediction market technique. The result of the agent's calculations is a 'belief' representing the confidence of the agent in identifying the object as a landmine. The beliefs from different robots are aggregated by the market mechanism and passed on to a decision maker agent. The decision maker agent uses this aggregate belief information about a potential landmine and makes decisions about which other robots should be deployed to its location, so that the landmine can be confirmed rapidly and accurately. Our experimental results show that, for identical data distributions and settings, using our prediction market-based information aggregation technique increases the accuracy of object classification favorably as compared to two other commonly used techniques.

  13. Test report light duty utility arm power distribution system (PDS)

    International Nuclear Information System (INIS)

    Clark, D.A.

    1996-01-01

    The Light Duty Utility Arm (LDUA) Power Distribution System has completed vendor and post-delivery acceptance testing. The Power Distribution System has been found to be acceptable and is now ready for integration with the overall LDUA system

  14. Data-driven approach for assessing utility of medical tests using electronic medical records.

    Science.gov (United States)

    Skrøvseth, Stein Olav; Augestad, Knut Magne; Ebadollahi, Shahram

    2015-02-01

    To precisely define the utility of tests in a clinical pathway through data-driven analysis of the electronic medical record (EMR). The information content was defined in terms of the entropy of the expected value of the test related to a given outcome. A kernel density classifier was used to estimate the necessary distributions. To validate the method, we used data from the EMR of the gastrointestinal department at a university hospital. Blood tests from patients undergoing gastrointestinal surgery were analyzed with respect to a second surgery within 30 days of the index surgery. The information content is clearly reflected in the patient pathway for certain combinations of tests and outcomes. C-reactive protein tests coupled to anastomosis leakage, a severe complication, show a clear pattern of information gain through the patient trajectory, where the greatest gain from the test is 3-4 days post index surgery. We have defined the information content in a data-driven and information theoretic way such that the utility of a test can be precisely defined. The results reflect clinical knowledge. In the case we studied, the tests carry little negative impact. The general approach can be expanded to cases that carry a substantial negative impact, such as in certain radiological techniques. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
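
    A minimal sketch of the quantity described, under the assumption of a continuous test value and a binary outcome: the information gain of the test is its mutual information with the outcome, with class-conditional densities estimated by a Gaussian kernel density estimator. The names and the discretization grid are illustrative, not the paper's implementation.

```python
import numpy as np
from scipy.stats import gaussian_kde

def test_information_gain(values, outcomes, grid_size=512):
    """Mutual information (in bits) between a continuous test value and a binary outcome."""
    values = np.asarray(values, dtype=float)
    outcomes = np.asarray(outcomes, dtype=bool)
    p1 = outcomes.mean()                          # prior probability of the outcome
    grid = np.linspace(values.min(), values.max(), grid_size)
    f1 = gaussian_kde(values[outcomes])(grid)     # p(value | outcome = 1)
    f0 = gaussian_kde(values[~outcomes])(grid)    # p(value | outcome = 0)
    mix = p1 * f1 + (1 - p1) * f0                 # marginal p(value)
    dv = grid[1] - grid[0]
    # I(value; outcome) = sum_c p(c) * integral p(v|c) log2[p(v|c)/p(v)] dv
    kl1 = np.sum(f1 * np.log2((f1 + 1e-12) / (mix + 1e-12))) * dv
    kl0 = np.sum(f0 * np.log2((f0 + 1e-12) / (mix + 1e-12))) * dv
    return p1 * kl1 + (1 - p1) * kl0

# Evaluated per post-operative day, this traces when a test (e.g. CRP) is most informative.
```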

  15. Decision-Making under Ambiguity Is Modulated by Visual Framing, but Not by Motor vs. Non-Motor Context. Experiments and an Information-Theoretic Ambiguity Model.

    Science.gov (United States)

    Grau-Moya, Jordi; Ortega, Pedro A; Braun, Daniel A

    2016-01-01

    A number of recent studies have investigated differences in human choice behavior depending on task framing, especially comparing economic decision-making to choice behavior in equivalent sensorimotor tasks. Here we test whether decision-making under ambiguity exhibits effects of task framing in motor vs. non-motor context. In a first experiment, we designed an experience-based urn task with varying degrees of ambiguity and an equivalent motor task where subjects chose between hitting partially occluded targets. In a second experiment, we controlled for the different stimulus design in the two tasks by introducing an urn task with bar stimuli matching those in the motor task. We found ambiguity attitudes to be mainly influenced by stimulus design. In particular, we found that the same subjects tended to be ambiguity-preferring when choosing between ambiguous bar stimuli, but ambiguity-avoiding when choosing between ambiguous urn sample stimuli. In contrast, subjects' choice pattern was not affected by changing from a target hitting task to a non-motor context when keeping the stimulus design unchanged. In both tasks subjects' choice behavior was continuously modulated by the degree of ambiguity. We show that this modulation of behavior can be explained by an information-theoretic model of ambiguity that generalizes Bayes-optimal decision-making by combining Bayesian inference with robust decision-making under model uncertainty. Our results demonstrate the benefits of information-theoretic models of decision-making under varying degrees of ambiguity for a given context, but also demonstrate the sensitivity of ambiguity attitudes across contexts that theoretical models struggle to explain.

  16. Decision-Making under Ambiguity Is Modulated by Visual Framing, but Not by Motor vs. Non-Motor Context. Experiments and an Information-Theoretic Ambiguity Model.

    Directory of Open Access Journals (Sweden)

    Jordi Grau-Moya

    Full Text Available A number of recent studies have investigated differences in human choice behavior depending on task framing, especially comparing economic decision-making to choice behavior in equivalent sensorimotor tasks. Here we test whether decision-making under ambiguity exhibits effects of task framing in motor vs. non-motor context. In a first experiment, we designed an experience-based urn task with varying degrees of ambiguity and an equivalent motor task where subjects chose between hitting partially occluded targets. In a second experiment, we controlled for the different stimulus design in the two tasks by introducing an urn task with bar stimuli matching those in the motor task. We found ambiguity attitudes to be mainly influenced by stimulus design. In particular, we found that the same subjects tended to be ambiguity-preferring when choosing between ambiguous bar stimuli, but ambiguity-avoiding when choosing between ambiguous urn sample stimuli. In contrast, subjects' choice pattern was not affected by changing from a target hitting task to a non-motor context when keeping the stimulus design unchanged. In both tasks subjects' choice behavior was continuously modulated by the degree of ambiguity. We show that this modulation of behavior can be explained by an information-theoretic model of ambiguity that generalizes Bayes-optimal decision-making by combining Bayesian inference with robust decision-making under model uncertainty. Our results demonstrate the benefits of information-theoretic models of decision-making under varying degrees of ambiguity for a given context, but also demonstrate the sensitivity of ambiguity attitudes across contexts that theoretical models struggle to explain.

  17. Decision-Making under Ambiguity Is Modulated by Visual Framing, but Not by Motor vs. Non-Motor Context. Experiments and an Information-Theoretic Ambiguity Model

    Science.gov (United States)

    Grau-Moya, Jordi; Ortega, Pedro A.; Braun, Daniel A.

    2016-01-01

    A number of recent studies have investigated differences in human choice behavior depending on task framing, especially comparing economic decision-making to choice behavior in equivalent sensorimotor tasks. Here we test whether decision-making under ambiguity exhibits effects of task framing in motor vs. non-motor context. In a first experiment, we designed an experience-based urn task with varying degrees of ambiguity and an equivalent motor task where subjects chose between hitting partially occluded targets. In a second experiment, we controlled for the different stimulus design in the two tasks by introducing an urn task with bar stimuli matching those in the motor task. We found ambiguity attitudes to be mainly influenced by stimulus design. In particular, we found that the same subjects tended to be ambiguity-preferring when choosing between ambiguous bar stimuli, but ambiguity-avoiding when choosing between ambiguous urn sample stimuli. In contrast, subjects’ choice pattern was not affected by changing from a target hitting task to a non-motor context when keeping the stimulus design unchanged. In both tasks subjects’ choice behavior was continuously modulated by the degree of ambiguity. We show that this modulation of behavior can be explained by an information-theoretic model of ambiguity that generalizes Bayes-optimal decision-making by combining Bayesian inference with robust decision-making under model uncertainty. Our results demonstrate the benefits of information-theoretic models of decision-making under varying degrees of ambiguity for a given context, but also demonstrate the sensitivity of ambiguity attitudes across contexts that theoretical models struggle to explain. PMID:27124723
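
    A generic KL-regularized form of robust decision-making of the kind the abstracts above allude to, stated in illustrative notation (not necessarily the authors' exact model): for action a, latent state theta with Bayesian belief p(theta), utility U and ambiguity parameter beta > 0,

```latex
\[
V_\beta(a) \;=\; \min_{q}\Big[\, \mathbb{E}_{q(\theta)}\,U(a,\theta)
   \;+\; \tfrac{1}{\beta}\, D_{\mathrm{KL}}\big(q \,\|\, p\big) \Big]
   \;=\; -\tfrac{1}{\beta}\,\ln \mathbb{E}_{p(\theta)}\big[e^{-\beta\, U(a,\theta)}\big].
\]
```

    As beta approaches 0 this recovers Bayes-optimal expected utility, large beta approaches worst-case (ambiguity-averse) behaviour, and the closed-form expression with negative beta yields the ambiguity-seeking counterpart.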

  18. Theoretical models to predict the mechanical behavior of thick composite tubes

    Directory of Open Access Journals (Sweden)

    Volnei Tita

    2012-02-01

    Full Text Available This paper shows theoretical models (analytical formulations to predict the mechanical behavior of thick composite tubes and how some parameters can influence this behavior. Thus, firstly, it was developed the analytical formulations for a pressurized tube made of composite material with a single thick ply and only one lamination angle. For this case, the stress distribution and the displacement fields are investigated as function of different lamination angles and reinforcement volume fractions. The results obtained by the theoretical model are physic consistent and coherent with the literature information. After that, the previous formulations are extended in order to predict the mechanical behavior of a thick laminated tube. Both analytical formulations are implemented as a computational tool via Matlab code. The results obtained by the computational tool are compared to the finite element analyses, and the stress distribution is considered coherent. Moreover, the engineering computational tool is used to perform failure analysis, using different types of failure criteria, which identifies the damaged ply and the mode of failure.

  19. Texture side information generation for distributed coding of video-plus-depth

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Raket, Lars Lau; Zamarin, Marco

    2013-01-01

    We consider distributed video coding in a monoview video-plus-depth scenario, aiming at coding textures jointly with their corresponding depth stream. Distributed Video Coding (DVC) is a video coding paradigm in which the complexity is shifted from the encoder to the decoder. The Side Information...... components) is strongly correlated, so the additional depth information may be used to generate more accurate SI for the texture stream, increasing the efficiency of the system. In this paper we propose various methods for accurate texture SI generation, comparing them with other state-of-the-art solutions...

  20. Probability of error in information-hiding protocols

    NARCIS (Netherlands)

    Chatzikokolakis, K.; Palamidessi, C.; Panangaden, P.

    2007-01-01

    Randomized protocols for hiding private information can fruitfully be regarded as noisy channels in the information-theoretic sense, and the inference of the concealed information can be regarded as a hypothesis-testing problem. We consider the Bayesian approach to the problem, and investigate the
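
    The record is truncated, but the hypothesis-testing view it sketches has a standard quantitative core: the adversary's best strategy is the MAP rule, whose probability of error follows directly from the prior and the channel matrix. A generic illustration (not the paper's specific bounds):

```python
import numpy as np

def bayes_error(prior, channel):
    """Probability of error of the MAP adversary guessing the secret X from Y:
    P_e = 1 - sum_y max_x p(x) p(y|x)."""
    prior = np.asarray(prior, dtype=float)       # shape (|X|,)
    channel = np.asarray(channel, dtype=float)   # shape (|X|, |Y|), rows sum to 1
    joint = prior[:, None] * channel             # p(x, y)
    return 1.0 - joint.max(axis=0).sum()

# Example: binary secret, slightly leaky channel.
print(bayes_error([0.5, 0.5], [[0.9, 0.1], [0.2, 0.8]]))  # -> 0.15
```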

  1. Pharmacogenetic testing, informed consent and the problem of secondary information.

    Science.gov (United States)

    Netzer, Christian; Biller-Andorno, Nikola

    2004-08-01

    Numerous benefits for patients have been predicted if prescribing decisions were routinely accompanied by pharmacogenetic testing. So far, little attention has been paid to the possibility that the routine application of this new technology could result in considerable harm to patients. This article emphasises that pharmacogenetic testing shares both the opportunities and the pitfalls with 'conventional' disease-genetic testing. It demonstrates that performing pharmacogenetic tests as well as interpreting the results are extraordinarily complex issues requiring a high level of expertise. It further argues that pharmacogenetic testing can have a huge impact on clinical decisions and may influence the therapeutic strategy as well as the clinical monitoring of a patient. This view challenges the predominant paradigm that pharmacogenetic testing will predict patients' responses to medicines, but that it will not provide any other significant disease-specific predictive information about the patient or family members. The article also questions published proposals to reduce the consent procedure for pharmacogenetic testing to a simple statement that the physician wishes to test a sample of the patient's DNA to see if a drug will be safe or whether it will work, and presents an alternative model that is better suited to protect patient's interests and to obtain meaningful informed consent. The paper concludes by outlining conditions for the application of pharmacogenetic testing in clinical practice in a way that can make full use of its potential benefits while minimising possible harm to patients and their families.

  2. The distribution and use of information and communication ...

    African Journals Online (AJOL)

    The study determined the distribution and use of Information and Communication Technology (ICT) in teaching and learning in some faculties in the University of Ghana. The study specifically looks at the availability of ICT laboratories in the faculties, the purposes ICT is used for, as well as the challenges in its use. Survey of 300 ...

  3. Fault-Tolerant Consensus of Multi-Agent System With Distributed Adaptive Protocol.

    Science.gov (United States)

    Chen, Shun; Ho, Daniel W C; Li, Lulu; Liu, Ming

    2015-10-01

    In this paper, fault-tolerant consensus in a multi-agent system using a distributed adaptive protocol is investigated. First, distributed adaptive online updating strategies for some parameters are proposed based on local information about the network structure. Then, under the online updated parameters, a distributed adaptive protocol is developed to compensate for the fault effects and the uncertainty effects in the leaderless multi-agent system. Based on the local state information of neighboring agents, a distributed updating protocol gain is developed, which leads to a fully distributed continuous adaptive fault-tolerant consensus protocol design for the leaderless multi-agent system. Furthermore, a distributed fault-tolerant leader-follower consensus protocol for the multi-agent system is constructed by the proposed adaptive method. Finally, a simulation example is given to illustrate the effectiveness of the theoretical analysis.

  4. Community problem-solving framed as a distributed information use environment: bridging research and practice

    Directory of Open Access Journals (Sweden)

    Joan C. Durrance

    2006-01-01

    Full Text Available Introduction. This article results from a qualitative study of (1) information behavior in community problem-solving framed as a distributed information use environment and (2) approaches used by a best-practice library to anticipate information needs associated with community problem solving. Method. Several approaches to data collection were used - focus groups, interviews, observation of community and library meetings, and analysis of supporting documents. We focused first on the information behaviour of community groups. Finding that the library supported these activities, we sought to understand its approach. Analysis. Data were coded thematically for both information behaviour concepts and themes germane to problem-solving activity. A grounded theory approach was taken to capture aspects of the library staff's practice. Themes evolved from the data; supporting documentation - reports, articles and library communication - was also coded. Results. The study showed (1) how information use environment components (people, setting, problems, problem resolutions) combine in this distributed information use environment to determine specific information needs and uses; and (2) how the library contributed to the viability of this distributed information use environment. Conclusion. Community problem solving, here explicated as a distributed IUE, is likely to be seen in multiple communities. The library model presented demonstrates that by reshaping its information practice within the framework of an information use environment, a library can anticipate community information needs as they are generated and where they are most relevant.

  5. The value of private patient information in the physician-patient relationship: a game-theoretic account.

    Science.gov (United States)

    De Jaegher, Kris

    2012-01-01

    This paper presents a game-theoretical model of the physician-patient relationship. There is a conflict of interests between physician and patient, in that the physician prefers the patient to always obtain a particular treatment, even if the patient would not consider this treatment in his interest. The patient obtains imperfect cues of whether or not he needs the treatment. The effect of an increase in the quality of the patient's private information is studied, in the form of an improvement in the quality of his cues. It is shown that when the patient's information improves in this sense, he may either become better off or worse off. The precise circumstances under which either result is obtained are derived.

  6. Theoretical frameworks informing family-based child and adolescent obesity interventions

    DEFF Research Database (Denmark)

    Alulis, Sarah; Grabowski, Dan

    2017-01-01

    into focus. However, the use of theoretical frameworks to strengthen these interventions is rare and very uneven. OBJECTIVE AND METHOD: To conduct a qualitative meta-synthesis of family-based interventions for child and adolescent obesity to identify the theoretical frameworks applied, thus understanding how...... inconsistencies and a significant void between research results and health care practice. Based on the analysis, this article proposes three themes to be used as focus points when designing future interventions and when selecting theories for the development of solid, theory-based frameworks for application...... cognitive, self-efficacy and Family Systems Theory appeared most frequently. The remaining 24 were classified as theory-related as theoretical elements of self-monitoring; stimulus control, reinforcement and modelling were used. CONCLUSION: The designs of family-based interventions reveal numerous...

  7. Geographically distributed hybrid testing & collaboration between geotechnical centrifuge and structures laboratories

    Science.gov (United States)

    Ojaghi, Mobin; Martínez, Ignacio Lamata; Dietz, Matt S.; Williams, Martin S.; Blakeborough, Anthony; Crewe, Adam J.; Taylor, Colin A.; Madabhushi, S. P. Gopal; Haigh, Stuart K.

    2018-01-01

    Distributed Hybrid Testing (DHT) is an experimental technique designed to capitalise on advances in modern networking infrastructure to overcome traditional laboratory capacity limitations. By coupling the heterogeneous test apparatus and computational resources of geographically distributed laboratories, DHT provides the means to take on complex, multi-disciplinary challenges with new forms of communication and collaboration. To introduce the opportunity and practicability afforded by DHT, here an exemplar multi-site test is addressed in which a dedicated fibre network and suite of custom software is used to connect the geotechnical centrifuge at the University of Cambridge with a variety of structural dynamics loading apparatus at the University of Oxford and the University of Bristol. While centrifuge time-scaling prevents real-time rates of loading in this test, such experiments may be used to gain valuable insights into physical phenomena, test procedure and accuracy. These and other related experiments have led to the development of the real-time DHT technique and the creation of a flexible framework that aims to facilitate future distributed tests within the UK and beyond. As a further example, a real-time DHT experiment between structural labs using this framework for testing across the Internet is also presented.

  8. Three-dimensionality of space and the quantum bit: an information-theoretic approach

    International Nuclear Information System (INIS)

    Müller, Markus P; Masanes, Lluís

    2013-01-01

    It is sometimes pointed out as a curiosity that the state space of quantum two-level systems, i.e. the qubit, and actual physical space are both three-dimensional and Euclidean. In this paper, we suggest an information-theoretic analysis of this relationship, by proving a particular mathematical result: suppose that physics takes place in d spatial dimensions, and that some events happen probabilistically (not assuming quantum theory in any way). Furthermore, suppose there are systems that carry ‘minimal amounts of direction information’, interacting via some continuous reversible time evolution. We prove that this uniquely determines spatial dimension d = 3 and quantum theory on two qubits (including entanglement and unitary time evolution), and that it allows observers to infer local spatial geometry from probability measurements. (paper)

  9. Determinant Factors of Rural Income Distribution with Special Reference to Information and Communication Technology

    Directory of Open Access Journals (Sweden)

    Hamid Sepehrdoust

    2014-06-01

    Full Text Available The aim of this study is to evaluate the impact of information and communication technology development on the economic development and income distribution of rural communities, and to answer the question of whether the development of information and communication technologies in rural areas could improve income distribution in these communities. To this end, data on 30 provinces of the country during 2000-2009 were analysed using the panel data method. The results confirm Kuznets's inverted-U theory with respect to economic growth and income distribution and show that information and communication technology development has improved income distribution and economic justice in the country's rural communities. The negative and significant coefficient (-0.15) of the number of computer users among rural households shows that the development of information and communication technologies in rural areas of the country acts as a factor improving income distribution in these communities. The model estimation also showed a significant and positive effect of urbanization and unemployment on the dependent variable. This means that with rising unemployment, the condition of income distribution worsened in rural communities during the period of study.

  10. Numerical distribution functions of fractional unit root and cointegration tests

    DEFF Research Database (Denmark)

    MacKinnon, James G.; Nielsen, Morten Ørregaard

    We calculate numerically the asymptotic distribution functions of likelihood ratio tests for fractional unit roots and cointegration rank. Because these distributions depend on a real-valued parameter, b, which must be estimated, simple tabulation is not feasible. Partly due to the presence...

  11. Statistical Tests for Frequency Distribution of Mean Gravity Anomalies

    African Journals Online (AJOL)

    The hypothesis that a very large number of 1° x 1° mean gravity anomalies are normally distributed has been rejected at the 5% significance level based on the χ² and the unit normal deviate tests. However, the 50 equal area mean anomalies derived from the 1° x 1° data have been found to be normally distributed at the same ...

  12. Real World Awareness in Distributed Organizations: A View on Informal Processes

    Directory of Open Access Journals (Sweden)

    Eldar Sultanow

    2011-06-01

    Full Text Available Geographically distributed development has to deal with the challenge of maintaining awareness to a far greater extent than locally concentrated development. Awareness denotes the state of being informed, combined with an understanding of the project-related activities, states or relationships of each individual employee within a given group as a whole. In multifarious offices, where social interaction is necessary in order to distribute and locate information and experts, awareness becomes a concurrent process, which increases the need for easy ways for staff to access this information, deferred or decentralized, in a formalized and problem-oriented way. Although the subject of awareness has greatly increased in importance, there is extensive disagreement about how this transparency can be conceptually and technically implemented [1]. This paper introduces a model for visualizing and navigating this information in three tiers using semantic networks, GIS and Web3D.

  13. A Game-Theoretic Approach to Information-Flow Control via Protocol Composition

    Directory of Open Access Journals (Sweden)

    Mário S. Alvim

    2018-05-01

    Full Text Available In the inference attacks studied in Quantitative Information Flow (QIF), the attacker typically tries to interfere with the system in the attempt to increase its leakage of secret information. The defender, on the other hand, typically tries to decrease leakage by introducing some controlled noise. This noise introduction can be modeled as a type of protocol composition, i.e., a probabilistic choice among different protocols, and its effect on the amount of leakage depends heavily on whether or not this choice is visible to the attacker. In this work, we consider operators for modeling visible and hidden choice in protocol composition, and we study their algebraic properties. We then formalize the interplay between defender and attacker in a game-theoretic framework adapted to the specific issues of QIF, where the payoff is information leakage. We consider various kinds of leakage games, depending on whether players act simultaneously or sequentially, and on whether or not the choices of the defender are visible to the attacker. In the case of sequential games, the choice of the second player is generally a function of the choice of the first player, and his/her probabilistic choice can be either over the possible functions (a mixed strategy) or it can be on the result of the function (a behavioral strategy). We show that when the attacker moves first in a sequential game with a hidden choice, then behavioral strategies are more advantageous for the defender than mixed strategies. This contrasts with the standard game theory, where the two types of strategies are equivalent. Finally, we establish a hierarchy of these games in terms of their information leakage and provide methods for finding optimal strategies (at the points of equilibrium) for both attacker and defender in the various cases.
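
    A minimal numerical illustration of the two composition operators and their effect on multiplicative Bayes leakage, using the standard channel-matrix representation of QIF; the toy channels and prior are made up for the example and are not taken from the paper.

```python
import numpy as np

def bayes_vulnerability(prior, C):
    """Posterior Bayes vulnerability V(prior, C) = sum_y max_x prior_x * C[x, y]."""
    return (np.asarray(prior)[:, None] * np.asarray(C)).max(axis=0).sum()

def hidden_choice(C1, C2, p):
    """Attacker does not see which protocol ran: channel matrices are averaged."""
    return p * np.asarray(C1) + (1 - p) * np.asarray(C2)

def visible_choice(C1, C2, p):
    """Attacker sees which protocol ran: outputs stay disjoint (block matrix)."""
    return np.hstack([p * np.asarray(C1), (1 - p) * np.asarray(C2)])

prior = np.array([0.5, 0.5])
C1 = np.array([[1.0, 0.0], [0.0, 1.0]])   # reveals the secret
C2 = np.array([[0.0, 1.0], [1.0, 0.0]])   # reveals the flipped secret
for name, C in [("hidden", hidden_choice(C1, C2, 0.5)),
                ("visible", visible_choice(C1, C2, 0.5))]:
    leak = bayes_vulnerability(prior, C) / prior.max()   # multiplicative leakage
    print(name, round(leak, 3))
```

    In this example the hidden choice leaks nothing (the two channels cancel on average, leakage 1.0) while the visible choice reveals the secret completely (leakage 2.0), illustrating why visibility of the defender's choice matters.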

  14. Industry sector analysis, Mexico: Electric power production and distribution equipment. Export Trade Information

    International Nuclear Information System (INIS)

    Wood, J.S.; Miller, R.W.

    1988-09-01

    The Industry Sector Analyses (I.S.A.) for electric power production and distribution equipment contains statistical and narrative information on projected market demand, end-users, receptivity of Mexican consumers to U.S. products, the competitive situation - Mexican production, total import market, U.S. market position, foreign competition, and competitive factors, and market access - Mexican tariffs, non-tariff barriers, standards, taxes and distribution channels. The I.S.A. provides the United States industry with meaningful information regarding the Mexican market for electric power production and distribution equipment

  15. Bayesian Prior Probability Distributions for Internal Dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Miller, G.; Inkret, W.C.; Little, T.T.; Martz, H.F.; Schillaci, M.E

    2001-07-01

    The problem of choosing a prior distribution for the Bayesian interpretation of measurements (specifically internal dosimetry measurements) is considered using a theoretical analysis and by examining historical tritium and plutonium urine bioassay data from Los Alamos. Two models for the prior probability distribution are proposed: (1) the log-normal distribution, when there is some additional information to determine the scale of the true result, and (2) the 'alpha' distribution (a simplified variant of the gamma distribution) when there is not. These models have been incorporated into version 3 of the Bayesian internal dosimetric code in use at Los Alamos (downloadable from our web site). Plutonium internal dosimetry at Los Alamos is now being done using prior probability distribution parameters determined self-consistently from population averages of Los Alamos data. (author)
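
    A minimal grid-based sketch of the kind of Bayesian interpretation described, combining a log-normal prior on the true (non-negative) bioassay quantity with a Gaussian measurement-error model; all parameter values are placeholders, not Los Alamos calibration data, and the 'alpha' prior variant is not shown.

```python
import numpy as np

def posterior_true_result(measurement, sigma_meas, prior_median, prior_gsd, n=2000):
    """Grid posterior over the true bioassay quantity: log-normal prior x Gaussian likelihood."""
    mu, s = np.log(prior_median), np.log(prior_gsd)
    x = np.linspace(1e-6, prior_median * 50, n)                  # true-value grid
    prior = np.exp(-0.5 * ((np.log(x) - mu) / s) ** 2) / x       # log-normal (unnormalized)
    like = np.exp(-0.5 * ((measurement - x) / sigma_meas) ** 2)  # measurement model
    post = prior * like
    dx = x[1] - x[0]
    post /= post.sum() * dx                                      # normalize on the grid
    mean = np.sum(x * post) * dx
    return x, post, mean

# e.g. a urine bioassay result of 2.0 (arbitrary units) with measurement sigma 1.0:
# x, post, mean = posterior_true_result(2.0, 1.0, prior_median=0.5, prior_gsd=3.0)
```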

  16. Application of a truncated normal failure distribution in reliability testing

    Science.gov (United States)

    Groves, C., Jr.

    1968-01-01

    The statistical truncated normal distribution function is applied as a time-to-failure distribution in equipment reliability estimation. Age-dependent characteristics of the truncated function provide a basis for formulating a system of high-reliability testing that effectively merges statistical, engineering, and cost considerations.
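
    A short sketch of the idea, assuming the time-to-failure is modelled as a normal distribution truncated at zero; the reliability function and the age-dependent hazard rate then follow directly (the numbers are placeholders).

```python
import numpy as np
from scipy.stats import truncnorm

def truncated_normal_failure(mu, sigma):
    """Time-to-failure model: normal(mu, sigma) truncated to t >= 0."""
    a = (0.0 - mu) / sigma            # lower truncation point in standard units
    return truncnorm(a, np.inf, loc=mu, scale=sigma)

T = truncated_normal_failure(mu=1000.0, sigma=400.0)   # e.g. hours
t = 500.0
reliability = T.sf(t)                 # R(t) = P(T > t)
hazard = T.pdf(t) / T.sf(t)           # age-dependent hazard rate
print(round(reliability, 3), round(hazard, 6))
```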

  17. Systems information management: graph theoretical approach

    NARCIS (Netherlands)

    Temel, T.

    2006-01-01

    This study proposes a new method for characterising the underlying information structure of a multi-sector system. A complete characterisation is accomplished by identifying information gaps and cause-effect information pathways in the system, and formulating critical testable hypotheses.

  18. Methods for testing transport models

    International Nuclear Information System (INIS)

    Singer, C.; Cox, D.

    1993-01-01

    This report documents progress to date under a three-year contract for developing "Methods for Testing Transport Models." The work described includes (1) choice of best methods for producing "code emulators" for analysis of very large global energy confinement databases, (2) recent applications of stratified regressions for treating individual measurement errors as well as calibration/modeling errors randomly distributed across various tokamaks, (3) Bayesian methods for utilizing prior information due to previous empirical and/or theoretical analyses, (4) extension of code emulator methodology to profile data, (5) application of nonlinear least squares estimators to simulation of profile data, (6) development of more sophisticated statistical methods for handling profile data, (7) acquisition of a much larger experimental database, and (8) extensive exploratory simulation work on a large variety of discharges using recently improved models for transport theories and boundary conditions. From all of this work, it has been possible to define a complete methodology for testing new sets of reference transport models against much larger multi-institutional databases

  19. Some Observations on the Concepts of Information-Theoretic Entropy and Randomness

    Directory of Open Access Journals (Sweden)

    Jonathan D.H. Smith

    2001-02-01

    Full Text Available Abstract: Certain aspects of the history, derivation, and physical application of the information-theoretic entropy concept are discussed. Pre-dating Shannon, the concept is traced back to Pauli. A derivation from first principles is given, without use of approximations. The concept depends on the underlying degree of randomness. In physical applications, this translates to dependence on the experimental apparatus available. An example illustrates how this dependence affects Prigogine's proposal for the use of the Second Law of Thermodynamics as a selection principle for the breaking of time symmetry. The dependence also serves to yield a resolution of the so-called "Gibbs Paradox." Extension of the concept from the discrete to the continuous case is discussed. The usual extension is shown to be dimensionally incorrect. Correction introduces a reference density, leading to the concept of Kullback entropy. Practical relativistic considerations suggest a possible proper reference density.
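
    The dimensional point made above can be stated compactly: for a continuous density p(x) of a variable x carrying physical units, the argument of the logarithm in the differential entropy is not dimensionless, whereas introducing a reference density m(x) gives the dimensionless Kullback form:

```latex
\[
H[p] = -\int p(x)\,\ln p(x)\,dx
\quad\text{(argument of the log has units of } 1/x\text{)},
\qquad
K[p\,\|\,m] = \int p(x)\,\ln\frac{p(x)}{m(x)}\,dx
\quad\text{(dimensionless)}.
\]
```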

  20. 10 CFR 431.193 - Test procedures for measuring energy consumption of distribution transformers.

    Science.gov (United States)

    2010-01-01

    10 CFR § 431.193 (Title 10, Energy; 2010 edition), Department of Energy, Energy Conservation program: Test procedures for measuring energy consumption of distribution transformers. The test...

  1. Human papillomavirus (HPV) information needs: a theoretical framework

    Science.gov (United States)

    Marlow, Laura A V; Wardle, Jane; Waller, Jo; Grant, Nina

    2009-01-01

    Background With the introduction of human papillomavirus (HPV) testing and vaccination in the UK, health professionals will start to receive questions about the virus from their patients. This study aimed to identify the key questions about HPV that British women will ask when considering having an HPV test or vaccination. Methods Face-to-face interviews were carried out with 21 women to discover what they wanted to know about HPV. A thematic framework approach was used to analyse the data and identify key themes in women's HPV knowledge requirements. Results Women's questions about HPV fell into six areas: identity (e.g. What are the symptoms?), cause (e.g. How do you get HPV?), timeline (e.g. How long does it last?), consequences (e.g. Does it always cause cervical cancer?) and control-cure (e.g. Can you prevent infection?). In addition, they asked procedural questions about testing and vaccination (e.g. Where do I get an HPV test?). These mapped well onto the dimensions identified in Leventhal's description of lay models of illness, called the 'Common Sense Model' (CSM). Discussion and conclusions These results indicated that the majority of the questions women asked about HPV fitted well into the CSM, which therefore provides a structure for women's information needs. The findings could help health professionals understand what questions they may be expected to answer. Framing educational materials using the CSM themes may also help health educators achieve a good fit with what the public want to know. PMID:19126314

  2. Testing an alternate informed consent process.

    Science.gov (United States)

    Yates, Bernice C; Dodendorf, Diane; Lane, Judy; LaFramboise, Louise; Pozehl, Bunny; Duncan, Kathleen; Knodel, Kendra

    2009-01-01

    One of the main problems in conducting clinical trials is low participation rate due to potential participants' misunderstanding of the rationale for the clinical trial or perceptions of loss of control over treatment decisions. The objective of this study was to test an alternate informed consent process in cardiac rehabilitation participants that involved the use of a multimedia flip chart to describe a future randomized clinical trial and then asked, hypothetically, if they would participate in the future trial. An attractive and inviting visual presentation of the study was created in the form of a 23-page flip chart that included 24 color photographs displaying information about the purpose of the study, similarities and differences between the two treatment groups, and the data collection process. We tested the flip chart in 35 cardiac rehabilitation participants. Participants were asked if they would participate in this future study on two occasions: immediately after the description of the flip chart and 24 hours later, after reading through the informed consent document. Participants were also asked their perceptions of the flip chart and consent process. Of the 35 participants surveyed, 19 (54%) indicated that they would participate in the future study. No participant changed his or her decision 24 hours later after reading the full consent form. The participation rate improved 145% over that of an earlier feasibility study where the recruitment rate was 22%. Most participants stated that the flip chart was helpful and informative and that the photographs were effective in communicating the purpose of the study. Participation rates could be enhanced in future clinical trials by using a visual presentation to explain and describe the study as part of the informed consent process. More research is needed to test alternate methods of obtaining informed consent.

  3. Theoretical test of Jarzynski's equality for reversible volume-switching processes of an ideal gas system.

    Science.gov (United States)

    Sung, Jaeyoung

    2007-07-01

    We present an exact theoretical test of Jarzynski's equality (JE) for reversible volume-switching processes of an ideal gas system. The exact analysis shows that the prediction of JE for the free energy difference is the same as the work done on the gas system during the reversible process, which depends on the shape of the path of the reversible volume-switching process.
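
    For reference, Jarzynski's equality relates the work W performed along realizations of a switching process to the equilibrium free-energy difference; for a reversible process the work distribution collapses onto a single value, which is the case analysed above:

```latex
\[
\big\langle e^{-\beta W} \big\rangle = e^{-\beta \Delta F},
\qquad \beta = \frac{1}{k_B T},
\qquad\text{so that}\qquad
\Delta F = -k_B T \,\ln \big\langle e^{-W/(k_B T)} \big\rangle .
\]
```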

  4. Exploiting the information content of hydrological "outliers" for goodness-of-fit testing

    Directory of Open Access Journals (Sweden)

    F. Laio

    2010-10-01

    Full Text Available Validation of probabilistic models based on goodness-of-fit tests is an essential step for the frequency analysis of extreme events. The outcome of standard testing techniques, however, is mainly determined by the behavior of the hypothetical model, F_X(x), in the central part of the distribution, while the behavior in the tails of the distribution, which is indeed very relevant in hydrological applications, is relatively unimportant for the results of the tests. The maximum-value test, originally proposed as a technique for outlier detection, is a suitable, but seldom applied, technique that addresses this problem. The test is specifically targeted to verify if the maximum (or minimum) values in the sample are consistent with the hypothesis that the distribution F_X(x) is the real parent distribution. The application of this test is hindered by the fact that the critical values for the test should be numerically obtained when the parameters of F_X(x) are estimated on the same sample used for verification, which is the standard situation in hydrological applications. We propose here a simple, analytically explicit, technique to suitably account for this effect, based on the application of censored L-moments estimators of the parameters. We demonstrate, with an application that uses artificially generated samples, the superiority of this modified maximum-value test with respect to the standard version of the test. We also show that the test has comparable or larger power with respect to other goodness-of-fit tests (e.g., the chi-squared test, Anderson-Darling test, Fung and Paul test), in particular when dealing with small samples (sample size lower than 20–25) and when the parent distribution is similar to the distribution being tested.
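
    A minimal sketch of the basic maximum-value test for the simple case in which the hypothesized F_X is fully specified: under the null the sample maximum has CDF F_X(x)^n, so the critical value at level alpha is F_X^{-1}((1-alpha)^{1/n}). The paper's actual contribution, censored L-moment estimation to account for parameters fitted on the same sample, is not reproduced here.

```python
import numpy as np
from scipy.stats import gumbel_r

def max_value_test(sample, dist, alpha=0.05):
    """Reject H0 (dist is the parent) if the sample maximum exceeds the (1-alpha)
    quantile of the null distribution of the maximum, F^n."""
    n = len(sample)
    critical = dist.ppf((1.0 - alpha) ** (1.0 / n))
    return sample.max() > critical, critical

# Example with a fully specified Gumbel null (parameters NOT estimated from the sample):
rng = np.random.default_rng(0)
x = gumbel_r.rvs(loc=10.0, scale=5.0, size=30, random_state=rng)
reject, crit = max_value_test(x, gumbel_r(loc=10.0, scale=5.0))
print(reject, round(crit, 2))
```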

  5. Theoretical Issues

    Energy Technology Data Exchange (ETDEWEB)

    Marc Vanderhaeghen

    2007-04-01

    The theoretical issues in the interpretation of the precision measurements of the nucleon-to-Delta transition by means of electromagnetic probes are highlighted. The results of these measurements are confronted with the state-of-the-art calculations based on chiral effective-field theories (EFT), lattice QCD, large-Nc relations, perturbative QCD, and QCD-inspired models. The link of the nucleon-to-Delta form factors to generalized parton distributions (GPDs) is also discussed.

  6. Assessment of the Nevada Test Site as a Site for Distributed Resource Testing and Project Plan: March 2002

    Energy Technology Data Exchange (ETDEWEB)

    Horgan, S.; Iannucci, J.; Whitaker, C.; Cibulka, L.; Erdman, W.

    2002-05-01

    The objective of this project was to evaluate the Nevada Test Site (NTS) as a location for performing dedicated, in-depth testing of distributed resources (DR) integrated with the electric distribution system. In this large scale testing, it is desired to operate multiple DRs and loads in an actual operating environment, in a series of controlled tests to concentrate on issues of interest to the DR community. This report includes an inventory of existing facilities at NTS, an assessment of site attributes in relation to DR testing requirements, and an evaluation of the feasibility and cost of upgrades to the site that would make it a fully qualified DR testing facility.

  7. A CLASS OF DISTRIBUTION-FREE TESTS FOR INDEPENDENCE AGAINST POSITIVE QUADRANT DEPENDENCE

    Directory of Open Access Journals (Sweden)

    Parameshwar V Pandit

    2014-02-01

    Full Text Available A class of distribution-free tests based on a convex combination of two U-statistics is considered for testing independence against positive quadrant dependence. The class of tests proposed by Kochar and Gupta (1987) and Kendall's test are members of the proposed class. The performance of the proposed class is evaluated in terms of Pitman asymptotic relative efficiency for the Block-Basu (1974) model and the Woodworth family of distributions. It has been observed that some members of the class perform better than the existing tests in the literature. Unbiasedness and consistency of the proposed class of tests have been established.
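
    Kendall's test, cited above as a member of the proposed class, illustrates the one-sided testing problem: under independence the tau U-statistic is asymptotically normal with variance 2(2n+5)/(9n(n-1)), and large positive values point towards positive quadrant dependence. The sketch below is generic; a convex combination with a second U-statistic, as in the proposed class, would replace `tau`.

```python
import numpy as np
from scipy.stats import norm

def kendall_pqd_test(x, y):
    """One-sided test of independence against positive quadrant dependence (Kendall's tau)."""
    x, y = np.asarray(x), np.asarray(y)
    n = len(x)
    sgn = np.sign(np.subtract.outer(x, x)) * np.sign(np.subtract.outer(y, y))
    tau = sgn[np.triu_indices(n, 1)].mean()          # U-statistic estimate of tau
    var = 2.0 * (2 * n + 5) / (9.0 * n * (n - 1))    # null variance of tau
    z = tau / np.sqrt(var)
    return tau, 1.0 - norm.cdf(z)                    # one-sided p-value

# Example: positively dependent pairs should yield a small p-value.
# rng = np.random.default_rng(1); u = rng.random(50)
# print(kendall_pqd_test(u, u + 0.1 * rng.random(50)))
```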

  8. INFORMATION SECURITY RISKS OPTIMIZATION IN CLOUDY SERVICES ON THE BASIS OF LINEAR PROGRAMMING

    Directory of Open Access Journals (Sweden)

    I. A. Zikratov

    2013-01-01

    Full Text Available The paper discusses theoretical aspects of creating secure cloud services for processing information of various degrees of confidentiality. A new approach to reasoning about the composition of information security in distributed computing structures is suggested, presenting the problem of risk assessment as an extremal decision-making problem. It is shown that the linear programming method can be applied to minimize information security risk for a given security performance, while maintaining the economic balance between the cost of security facilities and the cost of services. An example is given to illustrate the obtained theoretical results.
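
    A toy linear program in the spirit of the abstract, minimizing residual information-security risk subject to a budget for security controls; all coefficients are made up for illustration and the formulation is not taken from the paper.

```python
import numpy as np
from scipy.optimize import linprog

# x_i = protection level bought for service i (continuous, 0..1)
risk_reduction = np.array([5.0, 3.0, 8.0])   # risk removed per unit of protection
cost = np.array([2.0, 1.0, 4.0])             # cost per unit of protection
budget = 5.0
baseline_risk = 20.0

# minimize residual risk = baseline_risk - sum(risk_reduction * x)
res = linprog(c=-risk_reduction,                 # equivalent to maximizing the reduction
              A_ub=[cost], b_ub=[budget],        # economic balance (budget) constraint
              bounds=[(0, 1)] * 3, method="highs")
print("protection levels:", np.round(res.x, 2),
      "residual risk:", round(baseline_risk + res.fun, 2))
```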

  9. About possibilities using of theoretical calculation methods in radioecology

    International Nuclear Information System (INIS)

    Demoukhamedova, S.D.; Aliev, D.I.; Alieva, I.N.

    2002-01-01

    retaining the biological activity while changing the herbicide properties and selectivity is determined by the charge distribution in the naphthalene ring. The change of charge distribution in the naphthalene ring was induced by the effect of ionizing radiation. Thus, theoretical calculation methods are capable of providing more detailed information concerning the effect of radiation on the ecosystem

  10. In-core flow rate distribution measurement test of the JOYO irradiation core

    International Nuclear Information System (INIS)

    Suzuki, Toshihiro; Isozaki, Kazunori; Suzuki, Soju

    1996-01-01

    A flow rate distribution measurement test was carried out for the JOYO irradiation core (the MK-II core) after the 29th duty cycle operation. The main objective of the test was to confirm the proper flow rate distribution in the final phase of the MK-II core. The flow rate at the outlet of each subassembly was measured by a permanent-magnet flowmeter inserted through the fuel exchange hole in the rotating plug. This is the third test in the MK-II core, ten years after the last test (1985). A total of 550 subassemblies were exchanged and the accumulated reactor operation time reached 38,000 hours since the previous test. In conclusion, it was confirmed that the flow rate distribution has remained suitable in the final phase of the MK-II core. (author)

  11. Testing the anisotropy in the angular distribution of Fermi/GBM gamma-ray bursts

    Science.gov (United States)

    Tarnopolski, M.

    2017-12-01

    Gamma-ray bursts (GRBs) were confirmed to be of extragalactic origin due to their isotropic angular distribution, combined with the fact that they exhibited an intensity distribution that deviated strongly from the -3/2 power law. This finding was later confirmed with the first redshift, equal to at least z = 0.835, measured for GRB970508. Despite this result, the data from CGRO/BATSE and Swift/BAT indicate that long GRBs are indeed distributed isotropically, but the distribution of short GRBs is anisotropic. Fermi/GBM has detected 1669 GRBs to date, and their sky distribution is examined in this paper. A number of statistical tests are applied: nearest neighbour analysis, fractal dimension, dipole and quadrupole moments of the distribution function decomposed into spherical harmonics, binomial test and the two-point angular correlation function. Monte Carlo benchmark testing of each test is performed in order to evaluate its reliability. It is found that short GRBs are distributed anisotropically in the sky, and long ones have an isotropic distribution. The probability that these results are not a chance occurrence is equal to at least 99.98 per cent and 30.68 per cent for short and long GRBs, respectively. The cosmological context of this finding and its relation to large-scale structures is discussed.
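
    A minimal Monte Carlo version of one of the statistics listed above: the dipole of the observed sky distribution compared against isotropic simulated skies. Coordinates are assumed to be given as unit vectors; the implementation is generic rather than the paper's exact pipeline.

```python
import numpy as np

def dipole(unit_vectors):
    """|mean of unit vectors|: ~0 for an isotropic set, larger for anisotropic samples."""
    return np.linalg.norm(np.mean(unit_vectors, axis=0))

def isotropy_p_value(unit_vectors, n_mc=10000, rng=np.random.default_rng(0)):
    """Fraction of isotropic Monte Carlo skies with a dipole at least as large as observed."""
    n = len(unit_vectors)
    d_obs = dipole(unit_vectors)
    count = 0
    for _ in range(n_mc):
        v = rng.standard_normal((n, 3))
        v /= np.linalg.norm(v, axis=1, keepdims=True)   # uniform points on the sphere
        count += dipole(v) >= d_obs
    return d_obs, count / n_mc

# e.g. d, p = isotropy_p_value(grb_unit_vectors)   # grb_unit_vectors: (N, 3) array
```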

  12. Posterior cerebral artery Wada test: sodium amytal distribution and functional deficits

    Energy Technology Data Exchange (ETDEWEB)

    Urbach, H.; Schild, H.H. [Dept. of Radiology/Neuroradiology, Univ. of Bonn (Germany); Klemm, E.; Biersack, H.J. [Bonn Univ. (Germany). Klinik fuer Nuklearmedizin; Linke, D.B.; Behrends, K.; Schramm, J. [Dept. of Neurosurgery, Univ. of Bonn (Germany)

    2001-04-01

    Inadequate sodium amytal delivery to the posterior hippocampus during the intracarotid Wada test has led to development of selective tests. Our purpose was to show the sodium amytal distribution in the posterior cerebral artery (PCA) Wada test and to relate it to functional deficits during the test. We simultaneously injected 80 mg sodium amytal and 14.8 MBq 99mTc-hexamethylpropyleneamine oxime (HMPAO) into the P2-segment of the PCA in 14 patients with temporal lobe epilepsy. To show the skull, we injected 116 MBq 99mTc-HDP intravenously. Sodium amytal distribution was determined by high-resolution single-photon emission computed tomography (SPECT). In all patients, HMPAO was distributed throughout the parahippocampal gyrus and hippocampus; it was also seen in the occipital lobe in all cases and in the thalamus in 11. Eleven patients were awake and cooperative; one was slightly uncooperative due to speech comprehension difficulties and perseveration. All patients showed contralateral hemianopia during the test. Four patients had nominal dysphasia for 1-3 min. None developed motor deficits or had permanent neurological deficits. Neurological deficits due to inactivation of extrahippocampal areas thus do not grossly interfere with neuropsychological testing during the test. (orig.)

  13. THEORETICAL FRAMEWORK FOR INFORMATION AND EDUCATIONAL COMPLEX DEVELOPMENT OF AN ACADEMIC DISCIPLINE AT A HIGHER INSTITUTION

    Directory of Open Access Journals (Sweden)

    Evgeniia Nikolaevna Kikot

    2015-05-01

    Full Text Available The organization of the contemporary educational process is becoming increasingly important under conditions of widespread use of information and communication technologies (ICT) and e-learning. This defines one of the most important methodological and research directions at the university: the creation of an informational-educational complex for an academic discipline as the foundation of the e-University resource. The creation of the informational-educational complex is founded on the concepts of openness, accessibility, clarity and personalisation, which allow a system of requirements for the complex and its content to be built. The main functions of the informational-educational complex are identified: informational, educational, controlling and communicative. It is argued that the scientific justification of new structural elements of the complex should include the development and introduction of e-workbooks and e-workshops in order to organise theoretical and practical e-conferences. The development of ICT in education that enables e-education presupposes the establishment of distance learning technologies for the implementation of educational programmes.

  14. Radiometric compensation for cooperative distributed multi-projection system through 2-DOF distributed control.

    Science.gov (United States)

    Tsukamoto, Jun; Iwai, Daisuke; Kashima, Kenji

    2015-11-01

    This paper proposes a novel radiometric compensation technique for a cooperative projection system based on distributed optimization. To achieve high scalability and robustness, we assume a cooperative projection environment such that (1) each projector has no information about the other projectors or the target images, (2) the camera has no information about the projectors either, while having the target images, and (3) only broadcast communication from the camera to the projectors is allowed, in order to suppress the data transfer bandwidth. To this end, we first investigate a distributed-optimization-based feedback mechanism that is suitable for the required decentralized information processing environment. Next, we show that this mechanism works well for still image projection, but not necessarily for moving images, due to a lack of dynamic responsiveness. To overcome this issue, we propose to implement an additional feedforward mechanism. Such a two-degree-of-freedom (2-DOF) control structure is well known in the control engineering community as a typical method for enhancing disturbance rejection and reference tracking capability simultaneously. We theoretically guarantee and experimentally demonstrate that this 2-DOF structure yields moving-image projection accuracy exceeding the best performance achievable by the distributed optimization mechanism alone.
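
    To make the 2-DOF idea concrete, the following is a minimal sketch assuming a scalar per-pixel gain model: a feedforward term from a nominal inverse model combined with integral feedback from camera measurements. The gain values and update rule are illustrative assumptions, not the paper's projector-camera model.

```python
# Hedged sketch of a 2-DOF (feedforward + integral feedback) intensity tracker.
# The scalar gain model and parameter values are illustrative assumptions.
import numpy as np

def track_reference(reference, true_gain=0.8, nominal_gain=1.0,
                    feedback_gain=0.5, steps_per_frame=1):
    """Per-pixel intensity tracking with feedforward plus integral feedback."""
    u_fb = 0.0                      # accumulated feedback correction
    outputs = []
    for r in reference:             # r: target intensity for the current frame
        u_ff = r / nominal_gain     # feedforward from the nominal inverse model
        for _ in range(steps_per_frame):
            y = true_gain * (u_ff + u_fb)    # measured (camera) intensity
            u_fb += feedback_gain * (r - y)  # integral action removes residual error
        outputs.append(y)
    return np.array(outputs)

# Feedforward alone leaves a steady error when the nominal model is wrong;
# the feedback term drives the error toward zero over successive frames.
print(track_reference([1.0, 1.0, 1.0, 0.5, 0.5]))
```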

  15. On the Bayes risk in information-hiding protocols

    NARCIS (Netherlands)

    Chatzikokolakis, K.; Palamidessi, C.; Panangaden, P.

    2008-01-01

    Randomized protocols for hiding private information can be regarded as noisy channels in the information-theoretic sense, and the inference of the concealed information can be regarded as a hypothesis-testing problem. We consider the Bayesian approach to the problem, and investigate the probability

  16. Current research and potential applications of the Concealed Information Test: An overview

    Directory of Open Access Journals (Sweden)

    Gershon Ben-Shakhar

    2012-09-01

    Full Text Available Research interest in psychophysiological detection of deception has significantly increased since the September 11 terror attack in the USA. In particular, the Concealed Information Test (CIT), designed to detect memory traces that can connect suspects to a certain crime, has been extensively studied. In this paper I will briefly review several psychophysiological detection paradigms that have been studied, with a focus on the CIT. The theoretical background of the CIT, its strengths and weaknesses, and its potential applications, as well as research findings related to its validity (based on a recent meta-analytic study), will be discussed. Several novel research directions, with a focus on factors that may affect CIT detection in realistic settings (e.g., memory for crime details; the effect of emotional stress during crime execution), will be described. Additionally, research focusing on mal-intentions and attempts to detect terror networks using information gathered from groups of suspects, using both the standard CIT and the searching CIT, will be reviewed. Finally, implications of current research for the actual application of the CIT will be discussed and several recommendations that can enhance the use of the CIT will be made.

  17. Extending multivariate distance matrix regression with an effect size measure and the asymptotic null distribution of the test statistic.

    Science.gov (United States)

    McArtor, Daniel B; Lubke, Gitta H; Bergeman, C S

    2017-12-01

    Person-centered methods are useful for studying individual differences in terms of (dis)similarities between response profiles on multivariate outcomes. Multivariate distance matrix regression (MDMR) tests the significance of associations of response profile (dis)similarities and a set of predictors using permutation tests. This paper extends MDMR by deriving and empirically validating the asymptotic null distribution of its test statistic, and by proposing an effect size for individual outcome variables, which is shown to recover true associations. These extensions alleviate the computational burden of permutation tests currently used in MDMR and render more informative results, thus making MDMR accessible to new research domains.
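
    For reference, below is a minimal sketch of the permutation-based MDMR test that the derived asymptotic null distribution is intended to replace, assuming a pseudo-F statistic computed from a Gower-centred distance matrix (after McArdle and Anderson); degrees-of-freedom constants are omitted since they cancel in the permutation comparison.

```python
# Hedged sketch of a permutation MDMR test: pseudo-F from a Gower-centred
# distance matrix and a predictor hat matrix.
import numpy as np

def mdmr_pseudo_f(distance_matrix, X):
    n = distance_matrix.shape[0]
    A = -0.5 * distance_matrix ** 2
    C = np.eye(n) - np.ones((n, n)) / n          # centring matrix
    G = C @ A @ C                                # Gower-centred inner products
    H = X @ np.linalg.pinv(X.T @ X) @ X.T        # hat matrix of predictors
    return np.trace(H @ G @ H) / np.trace((np.eye(n) - H) @ G @ (np.eye(n) - H))

def mdmr_permutation_p(distance_matrix, X, n_perm=999, rng=np.random.default_rng(1)):
    obs = mdmr_pseudo_f(distance_matrix, X)
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(X.shape[0])       # permute predictor rows
        if mdmr_pseudo_f(distance_matrix, X[perm]) >= obs:
            count += 1
    return (count + 1) / (n_perm + 1)
```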

  18. Information spreading in Delay Tolerant Networks based on nodes' behaviors

    Science.gov (United States)

    Wu, Yahui; Deng, Su; Huang, Hongbin

    2014-07-01

    Information spreading in DTNs (Delay Tolerant Networks) adopts a store-carry-forward method, in which nodes receive messages from other nodes directly. However, it is hard to judge whether the information is safe in this communication mode. In this case, a node may observe other nodes' behaviors. At present, there is no theoretical model to describe how the nodes' trust level varies. In addition, due to the uncertainty of connectivity in a DTN, it is hard for a node to obtain the global state of the network. Therefore, a rational model of a node's trust level should be a function of the node's own observations. For example, if a node finds k nodes carrying a message, it may trust the information with probability p(k). This paper does not explore the real distribution of p(k), but instead presents a unifying theoretical framework to evaluate the performance of information spreading in the above case. This framework is an extension of the traditional SI (susceptible-infected) model, and is applicable when p(k) follows any distribution. Simulations based on both synthetic and real motion traces show the accuracy of the framework. Finally, we explore the impact of the nodes' behaviors under certain special distributions through numerical results.
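
    A minimal sketch of such an extended SI dynamic follows, assuming a homogeneous random-contact model and an illustrative trust function p(k); neither is taken from the paper.

```python
# Hedged sketch of extended SI spreading: nodes meet pairwise at random, count
# how many carriers they have observed, and start carrying the message with
# probability p(k) after the k-th observation. Contact model and p(k) are assumptions.
import numpy as np

def simulate_spreading(n_nodes=200, t_max=2000,
                       p=lambda k: min(1.0, 0.2 * k), seed=0):
    rng = np.random.default_rng(seed)
    carrier = np.zeros(n_nodes, dtype=bool)
    carrier[0] = True                             # source node
    observations = np.zeros(n_nodes, dtype=int)
    fraction = []
    for _ in range(t_max):
        i, j = rng.integers(0, n_nodes, size=2)   # a random pairwise contact
        for a, b in ((i, j), (j, i)):
            if carrier[b] and not carrier[a]:
                observations[a] += 1
                if rng.random() < p(observations[a]):
                    carrier[a] = True             # node a now trusts and carries it
        fraction.append(carrier.mean())
    return np.array(fraction)

# final fraction of carriers under p(k) = min(1, 0.2 k)
print(simulate_spreading()[-1])
```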

  19. An experimental and theoretical investigation of the valence orbital momentum distributions and binding energy spectra of nitrogen

    International Nuclear Information System (INIS)

    Cook, J.P.D.; Pascual, R.; Weigold, E.

    1989-05-01

    A detailed electron momentum spectroscopy (EMS) and many-body theoretical study of the complete valence region of N2 was carried out. The 1500 eV EMS momentum distributions provide a sensitive test for orbital wavefunctions of SCF calculations and of correlation effects. The outermost 3σg orbital is more sharply peaked at the origin than predicted by the orbital wavefunction. The inner valence 2σg orbital is severely split, with spectroscopic strength ranging from 34 eV to over 60 eV in binding energy. The results of the present extended-basis 1p Green's function calculations, as well as those of several previous many-body calculations, are only in semiquantitative agreement with this. There is a 2σu pole at 25 eV with a pole strength of approximately 0.067, in agreement with the results of many-body calculations. There is significant 2σu and/or 1πu strength and little 2σg strength in the region 26-34 eV. Poles observed at 29 and 32 eV, previously attributed to the 2σg orbital, are shown to be largely 2σu in character. The many-body calculations predict too much 2σg strength in the region 26-34 eV. 29 refs., 1 tab., 16 figs

  20. An investigation on characterizing dense coal-water slurry with ultrasound: theoretical and experimental method

    Energy Technology Data Exchange (ETDEWEB)

    Xue, M.H.; Su, M.X.; Dong, L.L.; Shang, Z.T.; Cai, X.S. [Shanghai University of Science & Technology, Shanghai (China)

    2010-07-01

    Particle size distribution and concentration in particulate two-phase flow are important parameters in a wide variety of industrial areas. For the purpose of online characterization in dense coal-water slurries, ultrasonic methods have many advantages such as avoiding dilution, the capability for being used in real time, and noninvasive testing, while light-based techniques are not capable of providing information because optical methods often require the slurry to be diluted. In this article, the modified Urick equation including temperature modification, which can be used to determine the concentration by means of the measurement of ultrasonic velocity in a coal-water slurry, is evaluated on the basis of theoretical analysis and experimental study. A combination of the coupled-phase model and the Bouguer-Lambert-Beer law is employed in this work, and the attenuation spectrum is measured within the frequency region from 3 to 12 MHz. Particle size distributions of the coal-water slurry at different volume fractions are obtained with the optimum regularization technique. Therefore, the ultrasonic technique presented in this work brings the possibility of using ultrasound for online measurements of dense slurries.
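
    As a hedged illustration of the velocity-based concentration measurement, the sketch below evaluates a plain Urick relation (without the paper's temperature modification) and inverts it numerically; the material properties are placeholder values, not measured slurry data.

```python
# Hedged sketch of the Urick-type relation: sound speed in a suspension from
# volume-weighted density and compressibility, inverted on a grid for the
# solids volume fraction. Property values are illustrative placeholders.
import numpy as np

def urick_speed(phi, rho_s=1400.0, kappa_s=2.0e-10, rho_w=1000.0, kappa_w=4.6e-10):
    """Sound speed (m/s) for solids volume fraction phi via the Urick equation."""
    rho = phi * rho_s + (1.0 - phi) * rho_w          # effective density
    kappa = phi * kappa_s + (1.0 - phi) * kappa_w    # effective compressibility
    return 1.0 / np.sqrt(rho * kappa)

def concentration_from_speed(c_measured, **props):
    """Invert the Urick relation numerically to estimate the volume fraction."""
    phi_grid = np.linspace(0.0, 0.6, 601)
    speeds = urick_speed(phi_grid, **props)
    return phi_grid[np.argmin(np.abs(speeds - c_measured))]

print(concentration_from_speed(urick_speed(0.35)))   # recovers ~0.35
```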

  1. Distributed Rocket Engine Testing Health Monitoring System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The on-ground and Distributed Rocket Engine Testing Health Monitoring System (DiRETHMS) provides a system architecture and software tools for performing diagnostics...

  2. Distributed Rocket Engine Testing Health Monitoring System, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Leveraging the Phase I achievements of the Distributed Rocket Engine Testing Health Monitoring System (DiRETHMS) including its software toolsets and system building...

  3. Preventive chemotherapy in human helminthiasis: theoretical and operational aspects

    Science.gov (United States)

    Chitsulo, L.; Engels, D.; Savioli, L.

    2017-01-01

    Preventive chemotherapy (PC), the large-scale distribution of anthelminthic drugs to population groups at risk, is the core intervention recommended by the WHO for reducing morbidity and transmission of the four main helminth infections, namely lymphatic filariasis, onchocerciasis, schistosomiasis and soil-transmitted helminthiasis. The strategy is widely implemented worldwide but its general theoretical foundations have not been described so far in a comprehensive and cohesive manner. Starting from the information available on the biological and epidemiological characteristics of helminth infections, as well as from the experience generated by disease control and elimination interventions across the world, we extrapolate the fundamentals and synthesise the principles that regulate PC and justify its implementation as a sound and essential public health intervention. The outline of the theoretical aspects of PC contributes to a thorough understanding of the different facets of this strategy and helps comprehend opportunities and limits of control and elimination interventions directed against helminth infections. PMID:22040463

  4. Cassini Information Management System in Distributed Operations Collaboration and Cassini Science Planning

    Science.gov (United States)

    Equils, Douglas J.

    2008-01-01

    Launched on October 15, 1997, the Cassini-Huygens spacecraft began its ambitious journey to the Saturnian system with a complex suite of 12 scientific instruments, and another 6 instruments aboard the European Space Agency's Huygens probe. Over the next 6 1/2 years, Cassini would continue its relatively simple cruise phase operations, flying past Venus, Earth, and Jupiter. However, following Saturn Orbit Insertion (SOI), Cassini would become involved in a complex series of tasks that required detailed resource management, distributed operations collaboration, and a database for capturing science objectives. Collectively, these needs were met through a web-based software tool designed to help with the Cassini uplink process and ultimately used to generate more robust sequences for spacecraft operations. In 2001, in conjunction with the Southwest Research Institute (SwRI) and later Venustar Software and Engineering Inc., the Cassini Information Management System (CIMS) was released, which enabled the Cassini spacecraft and science planning teams to perform complex information management and team collaboration between scientists and engineers in 17 countries. Originally tailored to help manage the science planning uplink process, CIMS has been actively evolving since its inception to meet the changing and growing needs of the Cassini uplink team and effectively reduce mission risk through a series of resource management validation algorithms. These algorithms have been implemented in the web-based software tool to identify potential sequence conflicts early in the science planning process. CIMS mitigates these sequence conflicts through identification of timing incongruities, pointing inconsistencies, flight rule violations, data volume issues, and by assisting in Deep Space Network (DSN) coverage analysis. In preparation for extended mission operations, CIMS has also evolved further to assist in the planning and coordination of the dual playback redundancy of

  5. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2016-04-01

    Full Text Available Separating two probability distributions from a mixture model that is made up of the combinations of the two is essential to a wide range of applications. For example, in information retrieval (IR, there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM was proposed to approximate the relevance distribution, by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF, where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM’s theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence, the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence. Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF approach. We prove that MMF also complies with the linear combination assumption, and then, DSM’s linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results demonstrate the advantages of our DSM approach.
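
    A minimal sketch of the linear-combination assumption behind DSM follows, assuming the mixture is lam * seed + (1 - lam) * target and estimating lam as the largest weight that keeps the recovered distribution non-negative; this is a simplified stand-in for the paper's estimation procedure.

```python
# Hedged sketch of linear distribution separation: recover the target
# distribution from a mixture and a known seed (irrelevance) distribution.
import numpy as np

def separate(mixture, seed, eps=1e-12):
    mixture = np.asarray(mixture, dtype=float)
    seed = np.asarray(seed, dtype=float)
    # largest lam such that mixture - lam * seed stays non-negative everywhere
    ratios = mixture / np.maximum(seed, eps)
    lam = min(1.0 - 1e-6, ratios.min())
    target = (mixture - lam * seed) / (1.0 - lam)
    return lam, target / target.sum()

# toy term distributions over a 4-word vocabulary
seed    = np.array([0.40, 0.30, 0.20, 0.10])   # irrelevance (seed) distribution
target  = np.array([0.00, 0.15, 0.35, 0.50])   # true relevance distribution
mixture = 0.6 * seed + 0.4 * target
# the target assigns zero probability to one term, so the maximal feasible
# weight equals the true lam = 0.6 and the target is recovered exactly
print(separate(mixture, seed))
```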

  6. Real time testing of intelligent relays for synchronous distributed generation islanding detection

    Science.gov (United States)

    Zhuang, Davy

    As electric power systems continue to grow to meet ever-increasing energy demand, their security, reliability, and sustainability requirements also become more stringent. The deployment of distributed energy resources (DER), including generation and storage, in conventional passive distribution feeders gives rise to integration problems involving protection and unintentional islanding. Distributed generators need to be disconnected for safety reasons when islanded or isolated from the main feeder, as distributed generator islanding may create hazards to utility and third-party personnel, and possibly damage the distribution system infrastructure, including the distributed generators themselves. This thesis compares several key performance indicators of a newly developed intelligent islanding detection relay against islanding detection devices currently used by the industry. The intelligent relay employs multivariable analysis and data mining methods to arrive at decision trees that contain both the protection handles and the settings. A test methodology is developed to assess the performance of these intelligent relays in a real time simulation environment using a generic model based on a real-life distribution feeder. The methodology demonstrates the applicability and potential advantages of the intelligent relay by running a large number of tests reflecting a multitude of system operating conditions. The testing indicates that the intelligent relay often outperforms the frequency, voltage and rate-of-change-of-frequency relays currently used for islanding detection, while respecting the islanding detection time constraints imposed by standing distributed generator interconnection guidelines.

  7. Information-Theoretic Performance Analysis of Sensor Networks via Markov Modeling of Time Series Data.

    Science.gov (United States)

    Li, Yue; Jha, Devesh K; Ray, Asok; Wettergren, Thomas A

    2018-06-01

    This paper presents information-theoretic performance analysis of passive sensor networks for detection of moving targets. The proposed method falls largely under the category of data-level information fusion in sensor networks. To this end, a measure of information contribution for sensors is formulated in a symbolic dynamics framework. The network information state is approximately represented as the largest principal component of the time series collected across the network. To quantify each sensor's contribution for generation of the information content, Markov machine models as well as x-Markov (pronounced as cross-Markov) machine models, conditioned on the network information state, are constructed; the difference between the conditional entropies of these machines is then treated as an approximate measure of information contribution by the respective sensors. The x-Markov models represent the conditional temporal statistics given the network information state. The proposed method has been validated on experimental data collected from a local area network of passive sensors for target detection, where the statistical characteristics of environmental disturbances are similar to those of the target signal in the sense of time scale and texture. A distinctive feature of the proposed algorithm is that the network decisions are independent of the behavior and identity of the individual sensors, which is desirable from computational perspectives. Results are presented to demonstrate the proposed method's efficacy to correctly identify the presence of a target with very low false-alarm rates. The performance of the underlying algorithm is compared with that of a recent data-driven, feature-level information fusion algorithm. It is shown that the proposed algorithm outperforms the other algorithm.
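
    As a hedged illustration of the entropy-difference measure described above, the sketch below estimates H(X_t | X_{t-1}) from a sensor's symbol sequence and H(X_t | X_{t-1}, S_t) from a cross-Markov machine conditioned on a network state sequence; the symbolization and toy data are assumptions, not the paper's experimental pipeline.

```python
# Hedged sketch: a sensor's information contribution as the drop in conditional
# entropy between its own Markov machine and a cross-Markov machine conditioned
# on the network information state. Toy data only.
import numpy as np

def entropy_rate(symbols, conditioning, n_symbols, n_states):
    """H(X_t | X_{t-1}, S_t) estimated from joint counts (natural log)."""
    counts = np.zeros((n_states, n_symbols, n_symbols))
    for prev, cur, state in zip(symbols[:-1], symbols[1:], conditioning[1:]):
        counts[state, prev, cur] += 1.0
    total = counts.sum()
    h = 0.0
    for s in range(n_states):
        for prev in range(n_symbols):
            row = counts[s, prev]
            if row.sum() == 0:
                continue
            p_context = row.sum() / total
            p_next = row / row.sum()
            nz = p_next > 0
            h -= p_context * np.sum(p_next[nz] * np.log(p_next[nz]))
    return h

def information_contribution(symbols, network_state, n_symbols, n_states):
    trivial = np.zeros_like(network_state)           # unconditioned Markov machine
    h_markov = entropy_rate(symbols, trivial, n_symbols, 1)
    h_xmarkov = entropy_rate(symbols, network_state, n_symbols, n_states)
    return h_markov - h_xmarkov                      # >= 0 in expectation

rng = np.random.default_rng(4)
state = rng.integers(0, 2, 500)                      # toy network information state
sensor = (state + rng.integers(0, 2, 500)) % 3       # toy sensor symbols, partly state-driven
print(information_contribution(sensor, state, n_symbols=3, n_states=2))
```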

  8. Evaluation of the information servicing in a distributed learning ...

    African Journals Online (AJOL)

    The authors' main idea is to organize a distributed learning environment (DLE) based on information and communication resources of global network in combination with the technologies for virtual reality and 3D simulation. In this reason a conceptual model of the DLE architecture and learning processes is defined, and ...

  9. A study of fatigue life distribution of butt-welded joints

    International Nuclear Information System (INIS)

    Sakai, Tatsuo; Fujitani, Keizo; Kikuchi, Toshiro; Tanaka, Takao.

    1981-01-01

    Various kinds of welded joints are being used in many structures such as ships, bridges and constructions. It is important in reliability analysis of such structures to clarify the statistical fatigue properties of the welded joints. In this study, fatigue tests were carried out on the butt-welded joints of SM50A steel and a theoretical interpretation of the fatigue life distribution was attempted, assuming that a butt-welded joint is composed of a number of sliced specimens with different fatigue strengths. Main results obtained are summarized as follows: (1) The median crack initiation life of the butt-welded joint specimens coincided with that of the sliced specimens, when the crack initiation was defined by a 0.2 mm crack in the sliced specimens or the equivalent state of stress intensity factor in the joint specimens. (2) The distribution of crack initiation lives of the butt-welded joints can be theoretically derived by combining the concept of extreme-value distributions with a distribution model for the number of fatigue cracks. The theoretical distribution of crack initiation lives of the joints is in good agreement with the general trend of the experimental results. (3) If the distribution of crack initiation lives and the crack growth law are given experimentally, one can obtain analytically the distribution of final fatigue lives. The fatigue life distribution of the sliced specimens can be explained by the theory established in this study. (author)
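
    A minimal sketch of the weakest-link reasoning used in result (2) follows, assuming Weibull-distributed lives for the sliced specimens; the parameters and the number of slices m are illustrative, not fitted values from the tests.

```python
# Hedged sketch of the weakest-link idea: if a joint behaves like m independent
# sliced specimens, its crack-initiation life is the minimum of m specimen
# lives, so F_joint(n) = 1 - (1 - F_slice(n))**m. Parameters are illustrative.
import numpy as np

def joint_initiation_cdf(n_cycles, m=20, shape=2.5, scale=2.0e5):
    f_slice = 1.0 - np.exp(-(np.asarray(n_cycles, float) / scale) ** shape)
    return 1.0 - (1.0 - f_slice) ** m

cycles = np.array([2e4, 5e4, 1e5, 2e5])
print(joint_initiation_cdf(cycles))   # CDF shifts to shorter lives as m grows
```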

  10. Information Modeling for Direct Control of Distributed Energy Resources

    DEFF Research Database (Denmark)

    Biegel, Benjamin; Andersen, Palle; Stoustrup, Jakob

    2013-01-01

    We present an architecture for an unbundled liberalized electricity market system where a virtual power plant (VPP) is able to control a number of distributed energy resources (DERs) directly through a two-way communication link. The aggregator who operates the VPP utilizes the accumulated...... a desired accumulated response. In this paper, we design such an information model based on the markets that the aggregator participates in and based on the flexibility characteristics of the remote controlled DERs. The information model is constructed in a modular manner making the interface suitable...

  11. Electron energy distribution in a weakly ionized plasma

    International Nuclear Information System (INIS)

    Cesari, C.

    1967-03-01

    The aim of this work is to determine, from both the theoretical and experimental points of view, the type of distribution function for the electronic energies existing in a positive-column type cold laboratory plasma having an ionization rate of between 10^-6 and 10^-7. The theoretical analysis, based on the imperfect Lorentz model and taking into account inelastic collisions, is developed from the Boltzmann equation. The experimental method which we have employed for making an electrostatic analysis of the electronic energies makes use of a Langmuir probe used in conjunction with a transistorized electronic device. A comparison between the experimental and theoretical results yields information concerning the mechanisms governing electronic energy transfer on a microscopic scale. (author) [fr]

  12. DAIDS: a Distributed, Agent-based Information Dissemination System

    Directory of Open Access Journals (Sweden)

    Pete Haglich

    2007-10-01

    Full Text Available The Distributed Agent-Based Information Dissemination System (DAIDS) concept was motivated by the need to share information among the members of a military tactical team in an atmosphere of extremely limited or intermittent bandwidth. The DAIDS approach recognizes that in many cases communications limitations will preclude the complete sharing of all tactical information between the members of the tactical team. Communications may be limited by obstructions to the line of sight between platforms, electronic warfare, environmental conditions, or simply contention from other users of that bandwidth. Since it may not be possible to achieve a complete information exchange, it is important to prioritize transmissions so that the most critical information from the standpoint of the recipient is disseminated first. The challenge is to be able to determine which elements of information are the most important to each teammate. The key innovation of the DAIDS concept is the use of software proxy agents to represent the information needs of the recipient of the information. The DAIDS approach uses these proxy agents to evaluate the content of a message in accordance with the context and information needs of the recipient platform (the agent's principal) and prioritize the message for dissemination. In our research we implemented this approach and demonstrated that it reduces transmission times for critical tactical reports by up to a factor of 30 under severe bandwidth limitations.

  13. statistical tests for frequency distribution of mean gravity anomalies

    African Journals Online (AJOL)

    ES Obe

    1980-03-01

    Mar 1, 1980 ... STATISTICAL TESTS FOR FREQUENCY DISTRIBUTION OF MEAN GRAVITY ANOMALIES. By ... approach. Kaula [1,2] discussed the method of applying statistical techniques in the ..... mathematical foundation of physical ...

  14. Sharing Year 2000 Testing Information on DOD Information Technology Systems

    National Research Council Canada - National Science Library

    1998-01-01

    The audit objective was to determine whether planning for year 2000 testing is adequate to ensure that mission critical DoD information technology systems will continue to operate properly after the year 2000...

  15. Optimising metadata workflows in a distributed information environment

    OpenAIRE

    Robertson, R. John; Barton, Jane

    2005-01-01

    The different purposes present within a distributed information environment create the potential for repositories to enhance their metadata by capitalising on the diversity of metadata available for any given object. This paper presents three conceptual reference models required to achieve this optimisation of metadata workflow: the ecology of repositories, the object lifecycle model, and the metadata lifecycle model. It suggests a methodology for developing the metadata lifecycle model, and ...

  16. Spatial distribution of soil contamination by 137Cs and 239,240Pu in the village of Dolon near the Semipalatinsk nuclear test site: new information on traces of the radioactive plume from the 29 August 1949 nuclear test.

    Science.gov (United States)

    Yamamoto, M; Tomita, J; Sakaguchi, A; Imanaka, T; Fukutani, S; Endo, S; Tanaka, K; Hoshi, M; Gusev, B I; Apsalikov, A N

    2008-04-01

    The village of Dolon located about 60 km northeast from the border of the Semipalatinsk Nuclear Test Site in Kazakhstan is one of the most affected inhabited settlements as a result of nuclear tests by the former USSR. Radioactive contamination in Dolon was mainly caused by the first USSR nuclear test on 29 August 1949. As part of the efforts to reconstruct the radiation dose in Dolon, Cs and Pu in soil samples collected from 26 locations in the vicinity of and within the village were measured to determine the width and position of the center-axis of the radioactive plume that passed over the village from the 29 August 1949 nuclear test. Measured soil inventories of Cs and Pu were plotted as a function of the distance from the supposed center-axis of the plume. A clear shape similar to a Gaussian function was observed in their spatial distributions with each maximum around a center-axis. It was suggested that the plume width that contaminated Dolon was at most 10 km and the real center-axis of the radioactive plume passed 0.7-0.9 km north of the supposed centerline. A peak-like shape with the maximum near the center-axis was also observed in the spatial distribution of the Pu/Cs activity ratio, which may reflect the fractionation effect between Pu and Cs during the deposition process. These results support the recently reported results. The data obtained here will provide useful information on the efforts to estimate radiation dose in Dolon as reliably as possible. Health Phys. 94(4):328-337; 2008.

  17. Benford's law and the FSD distribution of economic behavioral micro data

    Science.gov (United States)

    Villas-Boas, Sofia B.; Fu, Qiuzi; Judge, George

    2017-11-01

    In this paper, we focus on the first significant digit (FSD) distribution of European micro income data and use information-theoretic, entropy-based methods to investigate the degree to which Benford's FSD law is consistent with the nature of these economic behavioral systems. We demonstrate that Benford's law is not an empirical phenomenon that occurs only in important distributions in physical statistics, but that it also arises in self-organizing dynamic economic behavioral systems. The empirical likelihood member of the minimum divergence-entropy family is used to recover country-based income FSD probability density functions and to demonstrate the implications of using a Benford prior reference distribution in economic behavioral system information recovery.
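
    As a hedged illustration, the sketch below computes an empirical first-significant-digit distribution and its Kullback-Leibler divergence from the Benford reference; the lognormal toy sample stands in for the European micro income data, and the paper's empirical-likelihood machinery is not reproduced.

```python
# Hedged sketch: empirical FSD distribution compared with the Benford reference
# via KL divergence. The lognormal sample is a placeholder for income micro data.
import numpy as np

def first_significant_digit(x):
    x = np.abs(np.asarray(x, dtype=float))
    x = x[x > 0]
    return np.floor(x / 10.0 ** np.floor(np.log10(x))).astype(int)

def fsd_distribution(x):
    digits = first_significant_digit(x)
    return np.array([(digits == d).mean() for d in range(1, 10)])

benford = np.log10(1.0 + 1.0 / np.arange(1, 10))        # Benford reference P(d)
sample = np.random.default_rng(3).lognormal(mean=3.0, sigma=1.0, size=50000)
empirical = fsd_distribution(sample)
kl = np.sum(empirical * np.log(np.maximum(empirical, 1e-12) / benford))
print(empirical.round(3), kl)
```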

  18. Consumers’ Acceptance and Use of Information and Communications Technology: A UTAUT and Flow Based Theoretical Model

    Directory of Open Access Journals (Sweden)

    Saleh Alwahaishi

    2013-03-01

    Full Text Available The world has changed a lot in the past years. The rapid advances in technology and the changing of the communication channels have changed the way people work and, for many, where do they work from. The Internet and mobile technology, the two most dynamic technological forces in modern information and communications technology (ICT are converging into one ubiquitous mobile Internet service, which will change our way of both doing business and dealing with our daily routine activities. As the use of ICT expands globally, there is need for further research into cultural aspects and implications of ICT. The acceptance of Information Technology (IT has become a fundamental part of the research plan for most organizations (Igbaria 1993. In IT research, numerous theories are used to understand users’ adoption of new technologies. Various models were developed including the Technology Acceptance Model, Theory of Reasoned Action, Theory of Planned Behavior, and recently, the Unified Theory of Acceptance and Use of Technology. Each of these models has sought to identify the factors which influence a citizen’s intention or actual use of information technology. Drawing on the UTAUT model and Flow Theory, this research composes a new hybrid theoretical framework to identify the factors affecting the acceptance and use of Mobile Internet -as an ICT application- in a consumer context. The proposed model incorporates eight constructs: Performance Expectancy, Effort Expectancy, Facilitating Conditions, Social Influences, Perceived Value, Perceived Playfulness, Attention Focus, and Behavioral intention. Data collected online from 238 respondents in Saudi Arabia were tested against the research model, using the structural equation modeling approach. The proposed model was mostly supported by the empirical data. The findings of this study provide several crucial implications for ICT and, in particular, mobile Internet service practitioners and researchers

  19. Theoretical predictions for vehicular headways and their clusters

    Science.gov (United States)

    Krbálek, Milan

    2013-11-01

    This paper presents a derivation of analytical predictions for steady-state distributions of net time gaps among clusters of vehicles moving inside a traffic stream. Using the thermodynamic socio-physical traffic model with short-ranged repulsion between particles (originally introduced in Krbálek and Helbing 2004 Physica A 333 370), we first derive the time-clearance distribution in the model and confront it with the theoretical criteria for the acceptability of analytical clearance distributions. Subsequently, the approximating statistical distributions for the so-called time multi-clearances are calculated by means of the theory of functional convolutions. Moreover, all the theoretical assumptions used in the above calculations are evaluated by statistical analysis of traffic data. The mathematical predictions obtained in this paper are thoroughly compared with relevant empirical quantities and discussed in the context of traffic theory.
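
    A minimal sketch of the functional-convolution step follows, assuming a generic gamma-shaped single-clearance density on a grid as a stand-in for the model's clearance distribution.

```python
# Hedged sketch: the density of a time multi-clearance (sum of m successive
# clearances) obtained by repeated numerical convolution of a single-clearance
# density. The gamma-shaped density is a stand-in for the model's one.
import numpy as np

def multi_clearance_density(single_pdf, dt, m):
    pdf = single_pdf.copy()
    for _ in range(m - 1):
        pdf = np.convolve(pdf, single_pdf) * dt    # density of a sum of clearances
    return pdf

t = np.arange(0.0, 20.0, 0.01)
dt = t[1] - t[0]
single = (t ** 2) * np.exp(-t)                     # toy clearance density (unnormalised)
single /= np.trapz(single, dx=dt)                  # normalise on the grid
triple = multi_clearance_density(single, dt, 3)
print(np.trapz(triple, dx=dt))                     # ~1: still a probability density
```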

  20. Information Propagation on Permissionless Blockchains

    OpenAIRE

    Ersoy, Oguzhan; Ren, Zhijie; Erkin, Zekeriya; Lagendijk, Reginald L.

    2017-01-01

    Blockchain technology, as a decentralized and non-hierarchical platform, has the potential to replace centralized systems. Yet, there are several challenges inherent in the blockchain structure. One of the deficiencies of existing blockchains is the lack of a convenient information propagation technique that enhances incentive-compatibility and bandwidth efficiency. The transition from a centralized system to a distributed one brings along game-theoretical concerns. Especially for the permissionless bloc...

  1. Jensen divergence based on Fisher’s information

    International Nuclear Information System (INIS)

    Sánchez-Moreno, P; Zarzo, A; Dehesa, J S

    2012-01-01

    The measure of Jensen–Fisher divergence between probability distributions is introduced and its theoretical grounds set up. This quantity, in contrast to the remaining Jensen divergences, grasps the fluctuations of the probability distributions because it is controlled by the (local) Fisher information, which is a gradient functional of the distribution. So it is appropriate and informative when studying the similarity of distributions, mainly for those having oscillatory character. The new Jensen–Fisher divergence shares with the Jensen–Shannon divergence the following properties: non-negativity, additivity when applied to an arbitrary number of probability densities, symmetry under exchange of these densities, vanishing under certain conditions and definiteness even when these densities present non-common zeros. Moreover, the Jensen–Fisher divergence is shown to be expressed in terms of the relative Fisher information as the Jensen–Shannon divergence does in terms of the Kullback–Leibler or relative Shannon entropy. Finally, the Jensen–Shannon and Jensen–Fisher divergences are compared for the following three large, non-trivial and qualitatively different families of probability distributions: the sinusoidal, generalized gamma-like and Rakhmanov–Hermite distributions, which are closely related to the quantum-mechanical probability densities of numerous physical systems. (paper)
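
    To illustrate the stated analogy numerically, the sketch below evaluates the Jensen-Shannon divergence from Kullback-Leibler terms and a Jensen-Fisher-style quantity from relative Fisher information, both against the half-and-half mixture; the Gaussian test densities are placeholders, and the precise definition follows the paper.

```python
# Hedged numerical sketch: Jensen-Shannon from KL terms next to a Jensen-Fisher-
# style quantity built from relative Fisher information
# F(p||q) = integral of p * (d/dx ln(p/q))^2, both against m = (p+q)/2.
import numpy as np

x = np.linspace(-10, 10, 4001)
dx = x[1] - x[0]

def gaussian(mu, sigma):
    g = np.exp(-0.5 * ((x - mu) / sigma) ** 2)
    return g / np.trapz(g, dx=dx)

def kl(p, q):
    return np.trapz(p * np.log(p / q), dx=dx)

def relative_fisher(p, q):
    grad = np.gradient(np.log(p / q), dx)
    return np.trapz(p * grad ** 2, dx=dx)

p, q = gaussian(-1.0, 1.0), gaussian(1.5, 0.8)
m = 0.5 * (p + q)
jensen_shannon = 0.5 * kl(p, m) + 0.5 * kl(q, m)
jensen_fisher = 0.5 * relative_fisher(p, m) + 0.5 * relative_fisher(q, m)
print(jensen_shannon, jensen_fisher)
```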

  2. Theoretical Mathematics

    Science.gov (United States)

    Stöltzner, Michael

    Answering to the double-faced influence of string theory on mathematical practice and rigour, the mathematical physicists Arthur Jaffe and Frank Quinn have contemplated the idea that there exists a `theoretical' mathematics (alongside `theoretical' physics) whose basic structures and results still require independent corroboration by mathematical proof. In this paper, I shall take the Jaffe-Quinn debate mainly as a problem of mathematical ontology and analyse it against the backdrop of two philosophical views that are appreciative towards informal mathematical development and conjectural results: Lakatos's methodology of proofs and refutations and John von Neumann's opportunistic reading of Hilbert's axiomatic method. The comparison of both approaches shows that mitigating Lakatos's falsificationism makes his insights about mathematical quasi-ontology more relevant to 20th century mathematics in which new structures are introduced by axiomatisation and not necessarily motivated by informal ancestors. The final section discusses the consequences of string theorists' claim to finality for the theory's mathematical make-up. I argue that ontological reductionism as advocated by particle physicists and the quest for mathematically deeper axioms do not necessarily lead to identical results.

  3. Word categorization from distributional information: frames confer more than the sum of their (Bigram) parts.

    Science.gov (United States)

    Mintz, Toben H; Wang, Felix Hao; Li, Jia

    2014-12-01

    Grammatical categories, such as noun and verb, are the building blocks of syntactic structure and the components that govern the grammatical patterns of language. However, in many languages words are not explicitly marked with their category information, hence a critical part of acquiring a language is categorizing the words. Computational analyses of child-directed speech have shown that distributional information-information about how words pattern with one another in sentences-could be a useful source of initial category information. Yet questions remain as to whether learners use this kind of information, and if so, what kinds of distributional patterns facilitate categorization. In this paper we investigated how adults exposed to an artificial language use distributional information to categorize words. We compared training situations in which target words occurred in frames (i.e., surrounded by two words that frequently co-occur) against situations in which target words occurred in simpler bigram contexts (where an immediately adjacent word provides the context for categorization). We found that learners categorized words together when they occurred in similar frame contexts, but not when they occurred in similar bigram contexts. These findings are particularly relevant because they accord with computational investigations showing that frame contexts provide accurate category information cross-linguistically. We discuss these findings in the context of prior research on distribution-based categorization and the broader implications for the role of distributional categorization in language acquisition. Copyright © 2014 Elsevier Inc. All rights reserved.
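
    A minimal sketch of frame-based distributional grouping follows, assuming a toy corpus; it only illustrates how words sharing a (preceding, following) frame are categorized together, not the experimental design.

```python
# Hedged sketch: group words by the (preceding, following) word frames they
# occur in, as opposed to single-word bigram contexts. Toy corpus only.
from collections import defaultdict

corpus = ("you want to eat it . you want to see it . "
          "I want to push it . you like to eat it . I like to see it .").split()

frame_to_words = defaultdict(set)
for prev, word, nxt in zip(corpus, corpus[1:], corpus[2:]):
    frame_to_words[(prev, nxt)].add(word)

# words sharing a frequent frame such as ('to', 'it') are categorized together
for frame, words in frame_to_words.items():
    if len(words) > 1:
        print(frame, sorted(words))
```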

  4. Preliminary investigation on determination of radionuclide distribution in field tracing test site

    International Nuclear Information System (INIS)

    Tanaka, Tadao; Mukai, Masayuki; Takebe, Shinichi; Guo Zede; Li Shushen; Kamiyama, Hideo.

    1993-12-01

    Field tracing tests for radionuclide migration have been conducted using 3H, 60Co, 85Sr and 134Cs in the natural unsaturated loess zone at the field test site of the China Institute for Radiation Protection. It is necessary to obtain reliable distribution data for the radionuclides in the test site in order to evaluate exactly the migration behavior of the radionuclides in situ. A suitable method to determine the distribution was proposed on the basis of a preliminary discussion of the soil sampling method at the test site and of the analytical method for radioactivity in the soils. (author)

  5. Distributing Congestion Management System Information Using the World Wide Web

    Science.gov (United States)

    1997-01-01

    The Internet is a unique medium for the distribution of information, and it provides a tremendous opportunity to take advantage of people's innate interest in transportation issues as they relate to their own lives. In particular, the World Wide Web (...

  6. Confronting Theoretical Predictions With Experimental Data; Fitting Strategy For Multi-Dimensional Distributions

    Directory of Open Access Journals (Sweden)

    Tomasz Przedziński

    2015-01-01

    Full Text Available After developing a Resonance Chiral Lagrangian (RχL) model to describe hadronic τ lepton decays [18], the model was confronted with experimental data. This was accomplished using a fitting framework developed to take into account the complexity of the model and to ensure the numerical stability of the algorithms used in the fitting. Since the model used in the fit contained 15 parameters and only three 1-dimensional distributions were available, we could expect multiple local minima or even whole regions of equal potential to appear. Our methods had to explore the whole parameter space thoroughly and ensure, as far as possible, that the result is a global minimum. This paper focuses on the technical aspects of the fitting strategy used. The first approach was based on the re-weighting algorithm published in [17] and produced results in around two weeks. A later approach, with an improved theoretical model and a simple parallelization algorithm based on Inter-Process Communication (IPC) methods of the UNIX system, reduced the computation time to 2-3 days. Additional approximations were introduced to the model, decreasing the time needed to obtain preliminary results to 8 hours. This allowed the results to be better validated, leading to a more robust analysis published in [12].

  7. Vascular plants of the Nevada Test Site and Central-Southern Nevada: ecologic and geographic distributions

    Energy Technology Data Exchange (ETDEWEB)

    Beatley, J.C.

    1976-01-01

    The physical environment of the Nevada Test Site and surrounding area is described with regard to physiography, geology, soils, and climate. A discussion of plant associations is given for the Mojave Desert, Transition Desert, and Great Basin Desert. The vegetation of disturbed sites is discussed with regard to introduced species as well as endangered and threatened species. Collections of vascular plants were made during 1959 to 1975. The plants, belonging to 1093 taxa and 98 families are listed together with information concerning ecologic and geographic distributions. Indexes to families, genera, and species are included. (HLW)

  8. Talking Points: Women's Information Needs for Informed Decision-Making About Noninvasive Prenatal Testing for Down Syndrome.

    Science.gov (United States)

    Dane, Aimée C; Peterson, Madelyn; Miller, Yvette D

    2018-03-17

    Adequate knowledge is a vital component of informed decision-making; however, we do not know what information women value when making decisions about noninvasive prenatal testing (NIPT). The current study aimed to identify women's information needs for decision-making about NIPT as a first-tier, non-contingent test with out-of-pocket expense and, in turn, inform best practice by specifying the information that should be prioritized when providing pre-test counseling to women in a time-limited scenario or space-limited decision support tool. We asked women (N = 242) in Australia to indicate the importance of knowing 24 information items when making a decision about NIPT and to choose two information items they would most value. Our findings suggest that women value having complete information when making decisions about NIPT. Information about the accuracy of NIPT and the pros and cons of NIPT compared to other screening and invasive tests were perceived to be most important. The findings of this study can be used to maximize the usefulness of time-limited discussions or space-limited decision support tools, but should not be routinely relied upon as a replacement for provision of full and tailored information when feasible.

  9. Source distribution dependent scatter correction for PVI

    International Nuclear Information System (INIS)

    Barney, J.S.; Harrop, R.; Dykstra, C.J.

    1993-01-01

    Source distribution dependent scatter correction methods which incorporate different amounts of information about the source position and material distribution have been developed and tested. The techniques use image to projection integral transformation incorporating varying degrees of information on the distribution of scattering material, or convolution subtraction methods, with some information about the scattering material included in one of the convolution methods. To test the techniques, the authors apply them to data generated by Monte Carlo simulations which use geometric shapes or a voxelized density map to model the scattering material. Source position and material distribution have been found to have some effect on scatter correction. An image to projection method which incorporates a density map produces accurate scatter correction but is computationally expensive. Simpler methods, both image to projection and convolution, can also provide effective scatter correction
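
    As a hedged illustration of the convolution-subtraction family mentioned above, the sketch below estimates the scatter component of a one-dimensional projection with a broad kernel and a scatter fraction, both of which are illustrative assumptions rather than values from the Monte Carlo study.

```python
# Hedged sketch of a generic convolution-subtraction scatter correction:
# estimate the scatter in a 1-D projection by convolving it with a broad kernel,
# scale by a scatter fraction, and subtract. Kernel shape and fraction are assumptions.
import numpy as np

def convolution_subtraction(projection, scatter_fraction=0.3, kernel_width=15.0):
    r = np.arange(-60, 61)
    kernel = np.exp(-np.abs(r) / kernel_width)
    kernel /= kernel.sum()
    scatter_estimate = scatter_fraction * np.convolve(projection, kernel, mode="same")
    return np.clip(projection - scatter_estimate, 0.0, None)

# toy projection: a sharp source profile sitting on a broad scatter background
bins = np.arange(128)
measured = (np.exp(-0.5 * ((bins - 64) / 3.0) ** 2)
            + 0.2 * np.exp(-0.5 * ((bins - 64) / 30.0) ** 2))
print(convolution_subtraction(measured).max())
```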

  10. 'Universal' Distribution of Interearthquake Times Explained

    International Nuclear Information System (INIS)

    Saichev, A.; Sornette, D.

    2006-01-01

    We propose a simple theory for the 'universal' scaling law previously reported for the distributions of waiting times between earthquakes. It is based on a largely used benchmark model of seismicity, which just assumes no difference in the physics of foreshocks, mainshocks, and aftershocks. Our theoretical calculations provide good fits to the data and show that universality is only approximate. We conclude that the distributions of interevent times do not reveal more information than what is already known from the Gutenberg-Richter and the Omori power laws. Our results reinforce the view that triggering earthquakes by other earthquakes is a key physical mechanism to understand seismicity

  11. Transport at basin scales: 1. Theoretical framework

    Directory of Open Access Journals (Sweden)

    A. Rinaldo

    2006-01-01

    Full Text Available The paper describes the theoretical framework for a class of general continuous models of the hydrologic response including both flow and transport of reactive solutes. The approach orders theoretical results that appeared in disparate fields into a coherent theoretical framework for both hydrologic flow and transport. In this paper we focus on the Lagrangian description of the carrier hydrologic runoff and of the processes embedding catchment-scale generation and transport of matter carried by runoff. The former defines travel time distributions, while the latter defines lifetime distributions, here thought of as contact times between mobile and immobile phases. Contact times are assumed to control mass transfer in a well-mixed approximation, appropriate in cases, like basin-scale transport phenomena, where the characteristic size of the injection areas is much larger than that of heterogeneous features. As a result, we define general mass-response functions of catchments which extend geomorphologic theories of the hydrologic response to the transport of matter. A set of examples is provided to clarify the theoretical results towards a computational framework for generalized applications, described in a companion paper.

  12. Backward Dependencies and in-Situ wh-Questions as Test Cases on How to Approach Experimental Linguistics Research That Pursues Theoretical Linguistics Questions

    Science.gov (United States)

    Pablos, Leticia; Doetjes, Jenny; Cheng, Lisa L.-S.

    2018-01-01

    The empirical study of language is a young field in contemporary linguistics. This being the case, and following a natural development process, the field is currently at a stage where different research methods and experimental approaches are being put into question in terms of their validity. Without pretending to provide an answer with respect to the best way to conduct linguistics related experimental research, in this article we aim at examining the process that researchers follow in the design and implementation of experimental linguistics research with a goal to validate specific theoretical linguistic analyses. First, we discuss the general challenges that experimental work faces in finding a compromise between addressing theoretically relevant questions and being able to implement these questions in a specific controlled experimental paradigm. We discuss the Granularity Mismatch Problem (Poeppel and Embick, 2005) which addresses the challenges that research that is trying to bridge the representations and computations of language and their psycholinguistic/neurolinguistic evidence faces, and the basic assumptions that interdisciplinary research needs to consider due to the different conceptual granularity of the objects under study. To illustrate the practical implications of the points addressed, we compare two approaches to perform linguistic experimental research by reviewing a number of our own studies strongly grounded on theoretically informed questions. First, we show how linguistic phenomena similar at a conceptual level can be tested within the same language using measurement of event-related potentials (ERP) by discussing results from two ERP experiments on the processing of long-distance backward dependencies that involve coreference and negative polarity items respectively in Dutch. Second, we examine how the same linguistic phenomenon can be tested in different languages using reading time measures by discussing the outcome of four self

  13. Backward Dependencies and in-Situ wh-Questions as Test Cases on How to Approach Experimental Linguistics Research That Pursues Theoretical Linguistics Questions.

    Science.gov (United States)

    Pablos, Leticia; Doetjes, Jenny; Cheng, Lisa L-S

    2017-01-01

    The empirical study of language is a young field in contemporary linguistics. This being the case, and following a natural development process, the field is currently at a stage where different research methods and experimental approaches are being put into question in terms of their validity. Without pretending to provide an answer with respect to the best way to conduct linguistics related experimental research, in this article we aim at examining the process that researchers follow in the design and implementation of experimental linguistics research with a goal to validate specific theoretical linguistic analyses. First, we discuss the general challenges that experimental work faces in finding a compromise between addressing theoretically relevant questions and being able to implement these questions in a specific controlled experimental paradigm. We discuss the Granularity Mismatch Problem (Poeppel and Embick, 2005) which addresses the challenges that research that is trying to bridge the representations and computations of language and their psycholinguistic/neurolinguistic evidence faces, and the basic assumptions that interdisciplinary research needs to consider due to the different conceptual granularity of the objects under study. To illustrate the practical implications of the points addressed, we compare two approaches to perform linguistic experimental research by reviewing a number of our own studies strongly grounded on theoretically informed questions. First, we show how linguistic phenomena similar at a conceptual level can be tested within the same language using measurement of event-related potentials (ERP) by discussing results from two ERP experiments on the processing of long-distance backward dependencies that involve coreference and negative polarity items respectively in Dutch. Second, we examine how the same linguistic phenomenon can be tested in different languages using reading time measures by discussing the outcome of four self

  14. Backward Dependencies and in-Situ wh-Questions as Test Cases on How to Approach Experimental Linguistics Research That Pursues Theoretical Linguistics Questions

    Directory of Open Access Journals (Sweden)

    Leticia Pablos

    2018-01-01

    Full Text Available The empirical study of language is a young field in contemporary linguistics. This being the case, and following a natural development process, the field is currently at a stage where different research methods and experimental approaches are being put into question in terms of their validity. Without pretending to provide an answer with respect to the best way to conduct linguistics related experimental research, in this article we aim at examining the process that researchers follow in the design and implementation of experimental linguistics research with a goal to validate specific theoretical linguistic analyses. First, we discuss the general challenges that experimental work faces in finding a compromise between addressing theoretically relevant questions and being able to implement these questions in a specific controlled experimental paradigm. We discuss the Granularity Mismatch Problem (Poeppel and Embick, 2005 which addresses the challenges that research that is trying to bridge the representations and computations of language and their psycholinguistic/neurolinguistic evidence faces, and the basic assumptions that interdisciplinary research needs to consider due to the different conceptual granularity of the objects under study. To illustrate the practical implications of the points addressed, we compare two approaches to perform linguistic experimental research by reviewing a number of our own studies strongly grounded on theoretically informed questions. First, we show how linguistic phenomena similar at a conceptual level can be tested within the same language using measurement of event-related potentials (ERP by discussing results from two ERP experiments on the processing of long-distance backward dependencies that involve coreference and negative polarity items respectively in Dutch. Second, we examine how the same linguistic phenomenon can be tested in different languages using reading time measures by discussing the outcome of

  15. Neural complexity: A graph theoretic interpretation

    Science.gov (United States)

    Barnett, L.; Buckley, C. L.; Bullock, S.

    2011-04-01

    One of the central challenges facing modern neuroscience is to explain the ability of the nervous system to coherently integrate information across distinct functional modules in the absence of a central executive. To this end, Tononi [Proc. Natl. Acad. Sci. USA 91, 5033 (1994)] proposed a measure of neural complexity that purports to capture this property based on mutual information between complementary subsets of a system. Neural complexity, so defined, is one of a family of information theoretic metrics developed to measure the balance between the segregation and integration of a system’s dynamics. One key question arising for such measures involves understanding how they are influenced by network topology. Sporns [Cereb. Cortex 10, 127 (2000)] employed numerical models in order to determine the dependence of neural complexity on the topological features of a network. However, a complete picture has yet to be established. While De Lucia [Phys. Rev. E 71, 016114 (2005)] made the first attempts at an analytical account of this relationship, their work utilized a formulation of neural complexity that, we argue, did not reflect the intuitions of the original work. In this paper we start by describing weighted connection matrices formed by applying a random continuous weight distribution to binary adjacency matrices. This allows us to derive an approximation for neural complexity in terms of the moments of the weight distribution and elementary graph motifs. In particular, we explicitly establish a dependency of neural complexity on cyclic graph motifs.
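
    For concreteness, a minimal sketch of the Tononi-Sporns-Edelman complexity under a Gaussian assumption follows, computing C_N(X) as the sum over subset sizes k of the difference between the average subset entropy and (k/n) times the total entropy; the example covariance matrix is an illustrative weighted connection matrix, not one analysed in the paper.

```python
# Hedged sketch of neural complexity under a Gaussian assumption, so that subset
# entropies follow from the covariance matrix:
# C_N(X) = sum_k [ <H(X_j^k)>_j - (k/n) H(X) ]. Example covariance is illustrative.
import numpy as np
from itertools import combinations

def gaussian_entropy(cov):
    k = cov.shape[0]
    return 0.5 * np.log(((2 * np.pi * np.e) ** k) * np.linalg.det(cov))

def neural_complexity(cov):
    n = cov.shape[0]
    h_total = gaussian_entropy(cov)
    c = 0.0
    for k in range(1, n):                      # the k = n term contributes zero
        subset_entropies = [gaussian_entropy(cov[np.ix_(idx, idx)])
                            for idx in combinations(range(n), k)]
        c += np.mean(subset_entropies) - (k / n) * h_total
    return c

# weakly coupled 6-node system: identity covariance plus small symmetric weights
rng = np.random.default_rng(2)
w = 0.1 * rng.random((6, 6)); w = (w + w.T) / 2; np.fill_diagonal(w, 0.0)
print(neural_complexity(np.eye(6) + w))
```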

  16. Strategy for a Rock Mechanics Site Descriptive Model. Development and testing of the theoretical approach

    Energy Technology Data Exchange (ETDEWEB)

    Staub, Isabelle; Fredriksson, Anders; Outters, Nils [Golder Associates AB, Uppsala (Sweden)

    2002-05-01

    In the purpose of studying the possibilities of a Deep Repository for spent fuel, the Swedish Nuclear and Fuel Management Company (SKB) is currently planning for Site Investigations. Data collected from these Site Investigations are interpreted and analysed to achieve the full Site Description, which is built up of models from all the disciplines that are considered of importance for the Site Description. One of these models is the Rock Mechanical Descriptive Model,which would be developed for any site in hard crystalline rock, and is a combination and evaluation of the characterisation of rock mass by means of empirical relationships and a theoretical approach based on numerical modelling. The present report describes the theoretical approach. The characterisation of the mechanical properties of the rock mass, viewed as a unit consisting of intact rock and fractures, is achieved by numerical simulations with following input parameters: initial stresses, fracture geometry, distribution of rock mechanical properties, such as deformation and strength parameters, for the intact rock and for the fractures. The numerical modelling was performed with the two-dimensional code UDEC, and the rock block models were generated from 2D trace sections extracted from the 3D Discrete Fracture Network (DFN) model. Assumptions and uncertainties related to the set-up of the model are considered. The numerical model was set-up to simulate a plain strain-loading test. Different boundary conditions were applied on the model for simulating stress conditions (I) in the undisturbed rock mass, and (II) at the proximity of a tunnel. In order to assess the reliability of the model sensitivity analyses have been conducted on some rock block models for defining the dependency of mechanical properties to in situ stresses, the influence of boundary conditions, rock material and joint constitutive models used to simulate the behaviour of intact rock and fractures, domain size and anisotropy. To

  17. Strategy for a Rock Mechanics Site Descriptive Model. Development and testing of the theoretical approach

    International Nuclear Information System (INIS)

    Staub, Isabelle; Fredriksson, Anders; Outters, Nils

    2002-05-01

    For the purpose of studying the feasibility of a deep repository for spent fuel, the Swedish Nuclear Fuel and Waste Management Company (SKB) is currently planning for Site Investigations. Data collected from these Site Investigations are interpreted and analysed to achieve the full Site Description, which is built up of models from all the disciplines that are considered of importance for the Site Description. One of these models is the Rock Mechanical Descriptive Model, which would be developed for any site in hard crystalline rock, and is a combination and evaluation of the characterisation of the rock mass by means of empirical relationships and a theoretical approach based on numerical modelling. The present report describes the theoretical approach. The characterisation of the mechanical properties of the rock mass, viewed as a unit consisting of intact rock and fractures, is achieved by numerical simulations with the following input parameters: initial stresses, fracture geometry, and the distribution of rock mechanical properties, such as deformation and strength parameters, for the intact rock and for the fractures. The numerical modelling was performed with the two-dimensional code UDEC, and the rock block models were generated from 2D trace sections extracted from the 3D Discrete Fracture Network (DFN) model. Assumptions and uncertainties related to the set-up of the model are considered. The numerical model was set up to simulate a plane strain loading test. Different boundary conditions were applied to the model for simulating stress conditions (I) in the undisturbed rock mass and (II) in the proximity of a tunnel. In order to assess the reliability of the model, sensitivity analyses have been conducted on some rock block models to determine the dependency of the mechanical properties on in situ stresses, the influence of boundary conditions, the rock material and joint constitutive models used to simulate the behaviour of intact rock and fractures, domain size and anisotropy.

  18. 17 CFR 242.603 - Distribution, consolidation, and display of information with respect to quotations for and...

    Science.gov (United States)

    2010-04-01

    ..., and display of information with respect to quotations for and transactions in NMS stocks. 242.603... with respect to quotations for and transactions in NMS stocks. (a) Distribution of information. (1) Any... source, that distributes information with respect to quotations for or transactions in an NMS stock to a...

  19. Secure Retrieval of FFTF Testing, Design, and Operating Information

    International Nuclear Information System (INIS)

    Butner, R. Scott; Wootan, David W.; Omberg, Ronald P.; Makenas, Bruce J.; Nielsen, Deborah

    2009-01-01

    One of the goals of the Advanced Fuel Cycle Initiative (AFCI) is to preserve the knowledge that has been gained in the United States on Liquid Metal Reactors (LMR). In addition, preserving LMR information and knowledge is part of a larger international collaborative activity conducted under the auspices of the International Atomic Energy Agency (IAEA). A similar program is being conducted for EBR-II at the Idaho National Laboratory (INL) and international programs are also in progress. Knowledge preservation at the FFTF is focused on the areas of design, construction, startup, and operation of the reactor. As the primary function of the FFTF was testing, the focus is also on preserving information obtained from irradiation testing of fuels and materials. This information will be invaluable when, at a later date, international decisions are made to pursue new LMRs. In the interim, this information may be of potential use for international exchanges with other LMR programs around the world. At least as important in the United States, which is emphasizing large-scale computer simulation and modeling, this information provides the basis for creating benchmarks for validating and testing these large-scale computer programs. Although the preservation activity with respect to FFTF information as discussed below is still underway, the team of authors above is currently retrieving and providing experimental and design information to the LMR modeling and simulation efforts for use in validating their computer models. On the Hanford Site, the FFTF reactor plant is one of the facilities intended for decontamination and decommissioning consistent with the cleanup mission on this site. The reactor facility has been deactivated and is being maintained in a cold and dark minimal surveillance and maintenance mode until final decommissioning is pursued. In order to ensure protection of information at risk, the program to date has focused on sequestering and secure retrieval

  20. The Space Station Module Power Management and Distribution automation test bed

    Science.gov (United States)

    Lollar, Louis F.

    1991-01-01

    The Space Station Module Power Management And Distribution (SSM/PMAD) automation test bed project was begun at NASA/Marshall Space Flight Center (MSFC) in the mid-1980s to develop an autonomous, user-supportive power management and distribution test bed simulating the Space Station Freedom Hab/Lab modules. As the test bed has matured, many new technologies and projects have been added. The author focuses on three primary areas. The first area is the overall accomplishments of the test bed itself. These include a much-improved user interface, a more efficient expert system scheduler, improved communication among the three expert systems, and initial work on adding intermediate levels of autonomy. The second area is the addition of a more realistic power source to the SSM/PMAD test bed; this project is called the Large Autonomous Spacecraft Electrical Power System (LASEPS). The third area is the completion of a virtual link between the SSM/PMAD test bed at MSFC and the Autonomous Power Expert at Lewis Research Center.

  1. Distributed Information and Control system reliability enhancement by fog-computing concept application

    Science.gov (United States)

    Melnik, E. V.; Klimenko, A. B.; Ivanov, D. Ya

    2018-03-01

    The paper focuses on the reliability of information and control systems. The authors propose a new integrated approach to enhancing information and control system reliability by applying elements of the fog-computing concept. The proposed approach consists of a complex of optimization problems to be solved. These problems are: estimation of the computational complexity that can be shifted to the edge of the network and the fog layer, distribution of computations among the data processing elements, and distribution of computations among the sensors. These problems, together with some simulation results and discussion, are formulated and presented in this paper.

  2. Goodness of Fit Test and Test of Independence by Entropy

    Directory of Open Access Journals (Sweden)

    M. Sharifdoost

    2009-06-01

    Full Text Available To test whether a set of data has a specific distribution or not, we can use the goodness of fit test. This test can be done with the Pearson X²-statistic or the likelihood ratio statistic G², which are asymptotically equal, and also with the Kolmogorov-Smirnov statistic for continuous distributions. In this paper, we introduce a new test statistic for the goodness of fit test which is based on entropy distance, and which can be applied for large sample sizes. We compare this new statistic with the classical test statistics X², G², and Tn by some simulation studies. We conclude that the new statistic is more sensitive than the usual statistics to the rejection of distributions which are very close to the desired distribution. Also, for testing independence, a new test statistic based on mutual information is introduced.
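    For readers who want a concrete reference point for the classical statistics mentioned above, the following minimal sketch computes the Pearson X² and likelihood-ratio G² statistics for binned data against a hypothesized distribution; the bin counts and probabilities are illustrative, and the entropy-distance statistic introduced in the paper is not reproduced here.

        import numpy as np
        from scipy.stats import chi2

        def pearson_x2(observed, expected):
            return np.sum((observed - expected) ** 2 / expected)

        def likelihood_ratio_g2(observed, expected):
            mask = observed > 0                      # 0 * log(0) is treated as 0
            return 2 * np.sum(observed[mask] * np.log(observed[mask] / expected[mask]))

        observed = np.array([18, 24, 21, 17, 20])    # illustrative bin counts
        p0 = np.full(5, 0.2)                         # hypothesized uniform distribution
        expected = observed.sum() * p0

        df = len(observed) - 1
        for name, stat in [("X2", pearson_x2(observed, expected)),
                           ("G2", likelihood_ratio_g2(observed, expected))]:
            print(f"{name} = {stat:.3f}, p = {chi2.sf(stat, df):.3f}")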

  3. Distributed analysis functional testing using GangaRobot in the ATLAS experiment

    Science.gov (United States)

    Legger, Federica; ATLAS Collaboration

    2011-12-01

    Automated distributed analysis tests are necessary to ensure smooth operations of the ATLAS grid resources. The HammerCloud framework allows for easy definition, submission and monitoring of grid test applications. Both functional and stress test applications can be defined in HammerCloud. Stress tests are large-scale tests meant to verify the behaviour of sites under heavy load. Functional tests are light user applications running at each site with high frequency, to ensure that the site functionalities are available at all times. Success or failure rates of these test jobs are individually monitored. Test definitions and results are stored in a database and made available to users and site administrators through a web interface. In this work we present the recent developments of the GangaRobot framework. GangaRobot monitors the outcome of functional tests, creates a blacklist of sites failing the tests, and exports the results to the ATLAS Site Status Board (SSB) and to the Service Availability Monitor (SAM), providing on the one hand a fast way to identify systematic or temporary site failures, and on the other hand allowing for an effective distribution of the workload on the available resources.

  4. Towards a theory of tiered testing.

    Science.gov (United States)

    Hansson, Sven Ove; Rudén, Christina

    2007-06-01

    Tiered testing is an essential part of any resource-efficient strategy for the toxicity testing of a large number of chemicals, which is required for instance in the risk management of general (industrial) chemicals. In spite of this, no general theory seems to be available for the combination of single tests into efficient tiered testing systems. A first outline of such a theory is developed. It is argued that chemical, toxicological, and decision-theoretical knowledge should be combined in the construction of such a theory. A decision-theoretical approach for the optimization of test systems is introduced. It is based on expected utility maximization with simplified assumptions covering factual and value-related information that is usually missing in the development of test systems.

  5. Probabilistic Decision Making with Spikes: From ISI Distributions to Behaviour via Information Gain.

    Directory of Open Access Journals (Sweden)

    Javier A Caballero

    Full Text Available Computational theories of decision making in the brain usually assume that sensory 'evidence' is accumulated supporting a number of hypotheses, and that the first accumulator to reach threshold triggers a decision in favour of its associated hypothesis. However, the evidence is often assumed to occur as a continuous process whose origins are somewhat abstract, with no direct link to the neural signals - action potentials or 'spikes' - that must ultimately form the substrate for decision making in the brain. Here we introduce a new variant of the well-known multi-hypothesis sequential probability ratio test (MSPRT) for decision making whose evidence observations consist of the basic unit of neural signalling - the inter-spike interval (ISI) - and which is based on a new form of the likelihood function. We dub this mechanism s-MSPRT and show its precise form for a range of realistic ISI distributions with positive support. In this way we show that, at the level of spikes, the refractory period may actually facilitate shorter decision times, and that the mechanism is robust against poor choice of the hypothesized data distribution. We show that s-MSPRT performance is related to the Kullback-Leibler divergence (KLD) or information gain between ISI distributions, through which we are able to link neural signalling to psychophysical observation at the behavioural level. Thus, we find the mean information needed for a decision is constant, thereby offering an account of Hick's law (relating decision time to the number of choices). Further, the mean decision time of s-MSPRT shows a power law dependence on the KLD, offering an account of Piéron's law (relating reaction time to stimulus intensity). These results show the foundations for a research programme in which spike train analysis can be made the basis for predictions about behavior in multi-alternative choice tasks.

  6. Probabilistic Decision Making with Spikes: From ISI Distributions to Behaviour via Information Gain.

    Science.gov (United States)

    Caballero, Javier A; Lepora, Nathan F; Gurney, Kevin N

    2015-01-01

    Computational theories of decision making in the brain usually assume that sensory 'evidence' is accumulated supporting a number of hypotheses, and that the first accumulator to reach threshold triggers a decision in favour of its associated hypothesis. However, the evidence is often assumed to occur as a continuous process whose origins are somewhat abstract, with no direct link to the neural signals - action potentials or 'spikes' - that must ultimately form the substrate for decision making in the brain. Here we introduce a new variant of the well-known multi-hypothesis sequential probability ratio test (MSPRT) for decision making whose evidence observations consist of the basic unit of neural signalling - the inter-spike interval (ISI) - and which is based on a new form of the likelihood function. We dub this mechanism s-MSPRT and show its precise form for a range of realistic ISI distributions with positive support. In this way we show that, at the level of spikes, the refractory period may actually facilitate shorter decision times, and that the mechanism is robust against poor choice of the hypothesized data distribution. We show that s-MSPRT performance is related to the Kullback-Leibler divergence (KLD) or information gain between ISI distributions, through which we are able to link neural signalling to psychophysical observation at the behavioural level. Thus, we find the mean information needed for a decision is constant, thereby offering an account of Hick's law (relating decision time to the number of choices). Further, the mean decision time of s-MSPRT shows a power law dependence on the KLD offering an account of Piéron's law (relating reaction time to stimulus intensity). These results show the foundations for a research programme in which spike train analysis can be made the basis for predictions about behavior in multi-alternative choice tasks.
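    A minimal sketch of MSPRT-style evidence accumulation over inter-spike intervals follows; the use of a gamma ISI likelihood, the rate parameters, and the decision threshold are illustrative assumptions rather than the exact s-MSPRT likelihood derived in the paper.

        import numpy as np
        from scipy.stats import gamma

        rng = np.random.default_rng(1)

        # Hypotheses: each channel's ISIs follow a gamma distribution with a different mean.
        shapes = [2.0, 2.0, 2.0]
        scales = [0.010, 0.015, 0.020]     # seconds; hypothesis 0 = fastest firing (illustrative)
        true_h = 0

        log_post = np.log(np.full(3, 1 / 3))   # uniform prior over the three hypotheses
        threshold = np.log(0.99)               # decide when one posterior exceeds 0.99

        t = 0.0
        while log_post.max() < threshold:
            isi = gamma.rvs(shapes[true_h], scale=scales[true_h], random_state=rng)
            t += isi
            log_post += gamma.logpdf(isi, shapes, scale=scales)   # accumulate evidence
            log_post -= np.logaddexp.reduce(log_post)             # renormalize posteriors

        print(f"decision: H{log_post.argmax()} after {t*1000:.1f} ms of spiking")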

  7. Locating sensors for detecting source-to-target patterns of special nuclear material smuggling: a spatial information theoretic approach.

    Science.gov (United States)

    Przybyla, Jay; Taylor, Jeffrey; Zhou, Xuesong

    2010-01-01

    In this paper, a spatial information-theoretic model is proposed to locate sensors for detecting source-to-target patterns of special nuclear material (SNM) smuggling. In order to ship the nuclear materials from a source location with SNM production to a target city, the smugglers must employ global and domestic logistics systems. This paper focuses on locating a limited set of fixed and mobile radiation sensors in a transportation network, with the intent to maximize the expected information gain and minimize the estimation error for the subsequent nuclear material detection stage. A Kalman filtering-based framework is adapted to assist the decision-maker in quantifying the network-wide information gain and SNM flow estimation accuracy.
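    The information gain quantified within such a Kalman filtering framework can be illustrated, under Gaussian assumptions, as the reduction in differential entropy from the prior to the posterior covariance after a measurement update; the flow-state dimension, noise levels, and sensor placement below are made-up examples, not values from the paper.

        import numpy as np

        def kalman_update(x, P, z, H, R):
            """Standard Kalman measurement update; returns posterior state and covariance."""
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x_post = x + K @ (z - H @ x)
            P_post = (np.eye(len(x)) - K @ H) @ P
            return x_post, P_post

        def information_gain(P_prior, P_post):
            """Entropy reduction (nats) of a Gaussian estimate: 0.5*log(det(P_prior)/det(P_post))."""
            return 0.5 * np.log(np.linalg.det(P_prior) / np.linalg.det(P_post))

        # Illustrative 3-link flow estimate observed by a single sensor on link 0.
        x = np.zeros(3)
        P = np.diag([4.0, 4.0, 4.0])
        H = np.array([[1.0, 0.0, 0.0]])
        R = np.array([[0.5]])
        z = np.array([1.2])

        x_new, P_new = kalman_update(x, P, z, H, R)
        print(f"information gain from this sensor placement: {information_gain(P, P_new):.3f} nats")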

  8. Locating Sensors for Detecting Source-to-Target Patterns of Special Nuclear Material Smuggling: A Spatial Information Theoretic Approach

    Directory of Open Access Journals (Sweden)

    Xuesong Zhou

    2010-08-01

    Full Text Available In this paper, a spatial information-theoretic model is proposed to locate sensors for detecting source-to-target patterns of special nuclear material (SNM) smuggling. In order to ship the nuclear materials from a source location with SNM production to a target city, the smugglers must employ global and domestic logistics systems. This paper focuses on locating a limited set of fixed and mobile radiation sensors in a transportation network, with the intent to maximize the expected information gain and minimize the estimation error for the subsequent nuclear material detection stage. A Kalman filtering-based framework is adapted to assist the decision-maker in quantifying the network-wide information gain and SNM flow estimation accuracy.

  9. Factors affecting daughters distribution among progeny testing Holstein bulls

    Directory of Open Access Journals (Sweden)

    Martino Cassandro

    2012-01-01

    Full Text Available The aim of this study was to investigate factors influencing the number of daughters of Holstein bulls during progeny testing, using data provided by the Italian Holstein Friesian Cattle Breeders Association. The hypothesis is that there are no differences among artificial insemination studs (AIS) in the distribution of daughters among progeny testing bulls. For each bull and beginning from 21 months of age, the distribution of daughters over the progeny testing period was calculated. Data were available on 1973 bulls born between 1986 and 2004, progeny tested in Italy and with at least 4 paternal half-sibs. On average, bulls exited the genetic centre at 11.3±1.1 months and reached their first official genetic proof at 58.0±3.1 months of age. An analysis of variance was performed on the cumulative frequency of daughters at 24, 36, 48, and 60 months. The generalized linear model included the fixed effects of year of birth of the bull (18 levels), artificial insemination stud (4 levels) and sire of bull (137 levels). All effects significantly affected the variability of the studied traits. Artificial insemination stud was the most important source of variation, followed by year of birth and sire of bull. Significant differences among AI studs exist, probably reflecting different strategies adopted during progeny testing.

  10. Distributed and dynamic intracellular organization of extracellular information.

    Science.gov (United States)

    Granados, Alejandro A; Pietsch, Julian M J; Cepeda-Humerez, Sarah A; Farquhar, Iseabail L; Tkačik, Gašper; Swain, Peter S

    2018-06-05

    Although cells respond specifically to environments, how environmental identity is encoded intracellularly is not understood. Here, we study this organization of information in budding yeast by estimating the mutual information between environmental transitions and the dynamics of nuclear translocation for 10 transcription factors. Our method of estimation is general, scalable, and based on decoding from single cells. The dynamics of the transcription factors are necessary to encode the highest amounts of extracellular information, and we show that information is transduced through two channels: Generalists (Msn2/4, Tod6 and Dot6, Maf1, and Sfp1) can encode the nature of multiple stresses, but only if stress is high; specialists (Hog1, Yap1, and Mig1/2) encode one particular stress, but do so more quickly and for a wider range of magnitudes. In particular, Dot6 encodes almost as much information as Msn2, the master regulator of the environmental stress response. Each transcription factor reports differently, and it is only their collective behavior that distinguishes between multiple environmental states. Changes in the dynamics of the localization of transcription factors thus constitute a precise, distributed internal representation of extracellular change. We predict that such multidimensional representations are common in cellular decision-making.

  11. Failure-censored accelerated life test sampling plans for Weibull distribution under expected test time constraint

    International Nuclear Information System (INIS)

    Bai, D.S.; Chun, Y.R.; Kim, J.G.

    1995-01-01

    This paper considers the design of life-test sampling plans based on failure-censored accelerated life tests. The lifetime distribution of products is assumed to be Weibull with a scale parameter that is a log linear function of a (possibly transformed) stress. Two levels of stress higher than the use condition stress, high and low, are used. Sampling plans with equal expected test times at high and low test stresses which satisfy the producer's and consumer's risk requirements and minimize the asymptotic variance of the test statistic used to decide lot acceptability are obtained. The properties of the proposed life-test sampling plans are investigated

  12. Multi-UAV Doppler Information Fusion for Target Tracking Based on Distributed High Degrees Information Filters

    Directory of Open Access Journals (Sweden)

    Hamza Benzerrouk

    2018-03-01

    Full Text Available Multi-Unmanned Aerial Vehicle (UAV) Doppler-based target tracking has not been widely investigated, specifically when using modern nonlinear information filters. A high-degree Gauss–Hermite information filter, as well as a seventh-degree cubature information filter (CIF), is developed to improve the fifth-degree and third-degree CIFs proposed in the most recent related literature. These algorithms are applied to maneuvering target tracking based on Radar Doppler range/range rate signals. To achieve this purpose, different measurement models such as range-only, range rate, and bearing-only tracking are used in the simulations. In this paper, the mobile sensor target tracking problem is addressed and solved by a higher-degree class of quadrature information filters (HQIFs). A centralized fusion architecture based on distributed information filtering is proposed, and yielded excellent results. Three high dynamic UAVs are simulated with synchronized Doppler measurement broadcasted in parallel channels to the control center for global information fusion. Interesting results are obtained, with the superiority of certain classes of higher-degree quadrature information filters.

  13. Theoretical method for determining particle distribution functions of classical systems

    International Nuclear Information System (INIS)

    Johnson, E.

    1980-01-01

    An equation which involves the triplet distribution function and the three-particle direct correlation function is obtained. This equation was derived using an analogue of the Ornstein--Zernike equation. The new equation is used to develop a variational method for obtaining the triplet distribution function of uniform one-component atomic fluids from the pair distribution function. The variational method may be used with the first and second equations in the YBG hierarchy to obtain pair and triplet distribution functions. It should be easy to generalize the results to the n-particle distribution function

  14. A Two-Level Cache for Distributed Information Retrieval in Search Engines

    Directory of Open Access Journals (Sweden)

    Weizhe Zhang

    2013-01-01

    Full Text Available To improve the performance of distributed information retrieval in search engines, we propose a two-level cache structure based on the queries of the users’ logs. We extract the highest rank queries of users from the static cache, in which the queries are the most popular. We adopt the dynamic cache as an auxiliary to optimize the distribution of the cache data. We propose a distribution strategy of the cache data. The experiments prove that the hit rate, the efficiency, and the time consumption of the two-level cache have advantages compared with other structures of cache.

  15. A two-level cache for distributed information retrieval in search engines.

    Science.gov (United States)

    Zhang, Weizhe; He, Hui; Ye, Jianwei

    2013-01-01

    To improve the performance of distributed information retrieval in search engines, we propose a two-level cache structure based on the queries of the users' logs. We extract the highest rank queries of users from the static cache, in which the queries are the most popular. We adopt the dynamic cache as an auxiliary to optimize the distribution of the cache data. We propose a distribution strategy of the cache data. The experiments prove that the hit rate, the efficiency, and the time consumption of the two-level cache have advantages compared with other structures of cache.

  16. A Theoretical Framework for Soft-Information-Based Synchronization in Iterative (Turbo Receivers

    Directory of Open Access Journals (Sweden)

    Lottici Vincenzo

    2005-01-01

    Full Text Available This contribution considers turbo synchronization, that is to say, the use of soft data information to estimate parameters like carrier phase, frequency, or timing offsets of a modulated signal within an iterative data demodulator. In turbo synchronization, the receiver exploits the soft decisions computed at each turbo decoding iteration to provide a reliable estimate of some signal parameters. The aim of our paper is to show that such a "turbo-estimation" approach can be regarded as a special case of the expectation-maximization (EM) algorithm. This leads to a general theoretical framework for turbo synchronization that allows parameter estimation procedures to be derived for carrier phase and frequency offset, as well as for timing offset and signal amplitude. The proposed mathematical framework is illustrated by simulation results reported for the particular case of carrier phase and frequency offset estimation of a turbo-coded 16-QAM signal.

  17. Distributed Cooperative Secondary Control of Microgrids Using Feedback Linearization

    DEFF Research Database (Denmark)

    Bidram, Ali; Davoudi, Ali; Lewis, Frank

    2013-01-01

    This paper proposes a secondary voltage control of microgrids based on the distributed cooperative control of multi-agent systems. The proposed secondary control is fully distributed; each distributed generator (DG) only requires its own information and the information of some neighbors. ... parameters can be tuned to obtain a desired response speed. The effectiveness of the proposed control methodology is verified by the simulation of a microgrid test system.

  18. Distributional Inference

    NARCIS (Netherlands)

    Kroese, A.H.; van der Meulen, E.A.; Poortema, Klaas; Schaafsma, W.

    1995-01-01

    The making of statistical inferences in distributional form is conceptually complicated because the epistemic 'probabilities' assigned are mixtures of fact and fiction. In this respect they are essentially different from 'physical' or 'frequency-theoretic' probabilities. The distributional form is

  19. Wealth of information derivable from Evaporation Residue (ER) angular momentum distributions

    International Nuclear Information System (INIS)

    Madhavan, N.

    2016-01-01

    Understanding fusion-fission dynamics is possible by studying the fission process, or, alternatively, by studying the complementary fusion-evaporation process. Though the latter method is difficult to implement, requiring sophisticated recoil separators/spectrometers for selecting the ERs in the direction of the primary beam, it provides more clarity with better accuracy and is indispensable for probing the pre-saddle region in heavy nuclei. Super Heavy Element (SHE) search crucially depends on understanding the fusion-fission process, the choice of entrance channel and excitation energy of the Compound Nucleus (CN), the ER cross-section and, more importantly, the angular momenta populated in ERs which survive fission. The measurement of ER angular momentum distributions, through a coincidence technique involving a large gamma multiplicity detector array and a recoil separator, throws up a wealth of information, such as nuclear viscosity effects, limits of stability of ERs, shape changes at high spins, a snapshot of a frozen set of barriers obtained in a single-shot experiment, and indirect information about the onset of quasi-fission processes. There is a paucity of experimental data with regard to angular momentum distributions in heavy nuclei due to experimental constraints. In this talk, the variety of information which could be derived through experimental ER angular momentum distributions will be elaborated with examples from work carried out at IUAC using advanced experimental facilities. (author)

  20. Impact of peak electricity demand in distribution grids: a stress test

    NARCIS (Netherlands)

    Hoogsteen, Gerwin; Molderink, Albert; Hurink, Johann L.; Smit, Gerardus Johannes Maria; Schuring, Friso; Kootstra, Ben

    2015-01-01

    The number of (hybrid) electric vehicles is growing, leading to a higher demand for electricity in distribution grids. To investigate the effects of the expected peak demand on distribution grids, a stress test with 15 electric vehicles in a single street is conducted and described in this paper.

  1. Value-Added Taxes, Chain Effects, and Informality

    OpenAIRE

    Áureo de Paula; Jose A. Scheinkman

    2010-01-01

    This paper investigates determinants of informal economic activity. We present an equilibrium model of informality and test its implications using a survey of 48,000+ small firms in Brazil. We define informality as tax avoidance; firms in the informal sector avoid tax payments but suffer other limitations. A novel theoretical contribution in this model is the role of value added taxes in transmitting informality. It predicts that the informality of a firm is correlated to the informality of f...

  2. Science Academies' Refresher Course on Theoretical Structural ...

    Indian Academy of Sciences (India)

    Science Academies' Refresher Course on Theoretical Structural Geology, Crystallography, Mineralogy, Thermodynamics, Experimental Petrology and Theoretical Geophysics. Information and Announcements, Resonance – Journal of Science Education, Volume 22, Issue 8, August 2017.

  3. Modification of Kolmogorov-Smirnov test for DNA content data analysis through distribution alignment.

    Science.gov (United States)

    Huang, Shuguang; Yeo, Adeline A; Li, Shuyu Dan

    2007-10-01

    The Kolmogorov-Smirnov (K-S) test is a statistical method often used for comparing two distributions. In high-throughput screening (HTS) studies, such distributions usually arise from the phenotype of independent cell populations. However, the K-S test has been criticized for being overly sensitive in applications, and it often detects a statistically significant difference that is not biologically meaningful. One major reason is that systematic drift among the distributions is common in HTS studies, due to factors such as instrument variation, plate edge effects, and accidental differences in sample handling. In particular, in high-content cellular imaging experiments, the location shift can be dramatic since some compounds are themselves fluorescent. This oversensitivity of the K-S test is particularly pronounced in cellular assays where the sample sizes are very large (usually several thousand). In this paper, a modified K-S test is proposed to deal with the nonspecific location-shift problem in HTS studies. Specifically, we propose that the distributions are "normalized" by density curve alignment before the K-S test is conducted. In applications to simulation data and real experimental data, the results show that the proposed method has improved specificity.
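    A minimal sketch of the general idea, removing a nonspecific location shift before applying the two-sample K-S test, is given below; alignment is done here by median-centering, a simplification of the density-curve alignment the authors propose, and the data are simulated.

        import numpy as np
        from scipy.stats import ks_2samp

        rng = np.random.default_rng(42)

        # Two large "cell populations" with the same shape but a systematic location drift.
        control = rng.normal(loc=0.0, scale=1.0, size=5000)
        treated = rng.normal(loc=0.3, scale=1.0, size=5000)   # drift only, no shape change

        naive = ks_2samp(control, treated)

        # Align the distributions (median-centering as a stand-in for density-curve alignment).
        aligned = ks_2samp(control - np.median(control), treated - np.median(treated))

        print(f"naive K-S:   D = {naive.statistic:.3f}, p = {naive.pvalue:.2e}")
        print(f"aligned K-S: D = {aligned.statistic:.3f}, p = {aligned.pvalue:.2e}")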

  4. Information-Theoretic Approach May Shed a Light to a Better Understanding and Sustaining the Integrity of Ecological-Societal Systems under Changing Climate

    Science.gov (United States)

    Kim, J.

    2016-12-01

    Considering high levels of uncertainty, epistemological conflicts over facts and values, and a sense of urgency, normal paradigm-driven science will be insufficient to mobilize people and nations toward sustainability. The conceptual framework to bridge societal system dynamics with that of the natural ecosystems in which humanity operates remains deficient. The key to understanding their coevolution is to understand 'self-organization.' An information-theoretic approach may shed light on a potential framework which enables us not only to bridge human and natural systems but also to generate useful knowledge for understanding and sustaining the integrity of ecological-societal systems. How can information theory help us understand the interface between ecological systems and social systems? How can we delineate self-organizing processes and ensure that they fulfil sustainability? How can we evaluate the flow of information from data through models to decision-makers? These are the core questions posed by sustainability science, in which visioneering (i.e., the engineering of vision) is an essential framework. Yet visioneering has neither a quantitative measure nor an information-theoretic framework to work with and teach. This presentation is an attempt to accommodate the framework of self-organizing hierarchical open systems and visioneering within a common information-theoretic framework. A case study is presented with the UN/FAO's communal vision of climate-smart agriculture (CSA), which pursues a trilemma of efficiency, mitigation, and resilience. Challenges of delineating and facilitating self-organizing systems are discussed using transdisciplinary tools such as complex systems thinking, dynamic process network analysis and multi-agent systems modeling. Acknowledgments: This study was supported by the Korea Meteorological Administration Research and Development Program under Grant KMA-2012-0001-A (WISE project).

  5. Toward an Integrative Theoretical Framework for Explaining Beliefs about Wife Beating: A Study among Students of Nursing from Turkey

    Science.gov (United States)

    Haj-Yahia, Muhammad M.; Uysal, Aynur

    2011-01-01

    An integrative theoretical framework was tested as the basis for explaining beliefs about wife beating among Turkish nursing students. Based on a survey design, 406 nursing students (404 females) in all 4 years of undergraduate studies completed a self-administered questionnaire. Questionnaires were distributed and collected from the participants…

  6. Coordinating Information and Decisions of Hierarchical Distributed Decision Units in Crises

    National Research Council Canada - National Science Library

    Rose, Gerald

    1997-01-01

    A program of research is described. The research addressed decision making by distributed decision makers using either consensus or leader structures and confronted by both routine tasks and different kinds of information system crisis...

  7. Ensemble distribution for immiscible two-phase flow in porous media.

    Science.gov (United States)

    Savani, Isha; Bedeaux, Dick; Kjelstrup, Signe; Vassvik, Morten; Sinha, Santanu; Hansen, Alex

    2017-02-01

    We construct an ensemble distribution to describe steady immiscible two-phase flow of two incompressible fluids in a porous medium. The system is found to be ergodic. The distribution is used to compute macroscopic flow parameters. In particular, we find an expression for the overall mobility of the system from the ensemble distribution. The entropy production at the scale of the porous medium is shown to give the expected product of the average flow and its driving force, obtained from a black-box description. We test numerically some of the central theoretical results.

  8. Experimental, computational and theoretical studies of δ′ phase coarsening in Al–Li alloys

    International Nuclear Information System (INIS)

    Pletcher, B.A.; Wang, K.G.; Glicksman, M.E.

    2012-01-01

    Experimental characterization of microstructure evolution in three binary Al–Li alloys provides critical tests of both diffusion screening theory and multiparticle diffusion simulations, which predict late-stage phase-coarsening kinetics. Particle size distributions, growth kinetics and maximum particle sizes obtained using quantitative, centered dark-field transmission electron microscopy are compared quantitatively with theoretical and computational predictions. We also demonstrate the dependence on δ′ precipitate volume fraction of the rate constant for coarsening and the microstructure’s maximum particle size, both of which remained undetermined for this alloy system for nearly a half century. Our experiments show quantitatively that the diffusion-screening theoretical description of phase coarsening yields reasonable kinetic predictions, and that useful simulations of microstructure evolution are obtained via multiparticle diffusion. The tested theory and simulation method will provide useful tools for future design of two-phase alloys for elevated temperature applications.

  9. LEDA RF distribution system design and component test results

    International Nuclear Information System (INIS)

    Roybal, W.T.; Rees, D.E.; Borchert, H.L.; McCarthy, M.; Toole, L.

    1998-01-01

    The 350 MHz and 700 MHz RF distribution systems for the Low Energy Demonstration Accelerator (LEDA) have been designed and are currently being installed at Los Alamos National Laboratory. Since 350 MHz is a familiar frequency used at other accelerator facilities, most of the major high-power components were available. The 700 MHz, 1.0 MW, CW RF delivery system designed for LEDA is a new development. Therefore, high-power circulators, waterloads, phase shifters, switches, and harmonic filters had to be designed and built for this application. The final Accelerator Production of Tritium (APT) RF distribution system design will be based on much of the same technology as the LEDA systems and will incorporate many of the RF components tested for LEDA. Low-power and high-power tests performed on various components of these LEDA systems and their results are presented here

  10. Web-based Distributed Medical Information System for Chronic Viral Hepatitis

    Science.gov (United States)

    Yang, Ying; Qin, Tuan-fa; Jiang, Jian-ning; Lu, Hui; Ma, Zong-e.; Meng, Hong-chang

    2008-11-01

    To provide long-term dynamic monitoring of the chronically ill, especially patients with HBV, we built a distributed Medical Information System for Chronic Viral Hepatitis (MISCHV). The Web-based system architecture and its functions are described, and its extensive application and important role are also presented.

  11. Binomial probability distribution model-based protein identification algorithm for tandem mass spectrometry utilizing peak intensity information.

    Science.gov (United States)

    Xiao, Chuan-Le; Chen, Xiao-Zhou; Du, Yang-Li; Sun, Xuesong; Zhang, Gong; He, Qing-Yu

    2013-01-04

    Mass spectrometry has become one of the most important technologies in proteomic analysis. Tandem mass spectrometry (LC-MS/MS) is a major tool for the analysis of peptide mixtures from protein samples. The key step of MS data processing is the identification of peptides from experimental spectra by searching public sequence databases. Although a number of algorithms to identify peptides from MS/MS data have been already proposed, e.g. Sequest, OMSSA, X!Tandem, Mascot, etc., they are mainly based on statistical models considering only peak-matches between experimental and theoretical spectra, but not peak intensity information. Moreover, different algorithms gave different results from the same MS data, implying their probable incompleteness and questionable reproducibility. We developed a novel peptide identification algorithm, ProVerB, based on a binomial probability distribution model of protein tandem mass spectrometry combined with a new scoring function, making full use of peak intensity information and, thus, enhancing the ability of identification. Compared with Mascot, Sequest, and SQID, ProVerB identified significantly more peptides from LC-MS/MS data sets than the current algorithms at 1% False Discovery Rate (FDR) and provided more confident peptide identifications. ProVerB is also compatible with various platforms and experimental data sets, showing its robustness and versatility. The open-source program ProVerB is available at http://bioinformatics.jnu.edu.cn/software/proverb/ .
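    The binomial scoring idea can be illustrated with a toy calculation: if each of n theoretical fragment peaks matches an experimental peak by chance with probability p, the binomial survival function gives the probability of at least k matches occurring at random; the numbers below are illustrative, and ProVerB's intensity-weighted scoring function is not reproduced here.

        from scipy.stats import binom

        n_theoretical_peaks = 20      # fragment ions predicted for a candidate peptide (illustrative)
        k_matched = 12                # peaks matched within the mass tolerance
        p_random_match = 0.05         # chance probability of a single random match (illustrative)

        # P(at least k matches by chance): smaller values mean a more confident identification.
        p_value = binom.sf(k_matched - 1, n_theoretical_peaks, p_random_match)
        print(f"probability of >= {k_matched} random matches: {p_value:.3e}")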

  12. Construction Of A Computerised Information-Processing Test Battery

    Directory of Open Access Journals (Sweden)

    Johann M. Schepers

    2002-09-01

    Full Text Available The primary goal of the study was to construct a computerised information-processing test battery to measure choice reaction time for up to and including six bits of information, to measure discrimination reaction time with regard to colour patterns and form patterns, to measure rate of information processing with regard to perceptual stimuli and conceptual reasoning, and to develop a suitable scoring system for the respective tests. The battery of tests was applied to 58 pilots. Summary: The main goal of the study was to construct a computerised information-processing test battery to measure choice reaction time for up to and including six bits of information, to measure discrimination reaction time with regard to colour patterns and form patterns, to measure rate of information processing with regard to perceptual stimuli and conceptual reasoning, and to develop a suitable scoring system for the respective tests. The battery of tests was applied to 58 pilots.

  13. Integrating non-animal test information into an adaptive testing strategy - skin sensitization proof of concept case.

    Science.gov (United States)

    Jaworska, Joanna; Harol, Artsiom; Kern, Petra S; Gerberick, G Frank

    2011-01-01

    There is an urgent need to develop data integration and testing strategy frameworks allowing interpretation of results from animal alternative test batteries. To this end, we developed a Bayesian Network Integrated Testing Strategy (BN ITS) with the goal of estimating skin sensitization hazard as a test case of previously developed concepts (Jaworska et al., 2010). The BN ITS combines in silico, in chemico, and in vitro data related to skin penetration, peptide reactivity, and dendritic cell activation, and guides testing strategy by Value of Information (VoI). The approach offers novel insights into testing strategies: there is no single best testing strategy; rather, the optimal sequence of tests depends on the information at hand and is chemical-specific. Thus, a single generic set of tests as a replacement strategy is unlikely to be most effective. BN ITS offers the possibility of evaluating the impact of generating additional data on reducing uncertainty in the target information before testing is commenced.

  14. Preventive chemotherapy in human helminthiasis: theoretical and operational aspects.

    Science.gov (United States)

    Gabrielli, A-F; Montresor, A; Chitsulo, L; Engels, D; Savioli, L

    2011-12-01

    Preventive chemotherapy (PC), the large-scale distribution of anthelminthic drugs to population groups at risk, is the core intervention recommended by the WHO for reducing morbidity and transmission of the four main helminth infections, namely lymphatic filariasis, onchocerciasis, schistosomiasis and soil-transmitted helminthiasis. The strategy is widely implemented worldwide but its general theoretical foundations have not been described so far in a comprehensive and cohesive manner. Starting from the information available on the biological and epidemiological characteristics of helminth infections, as well as from the experience generated by disease control and elimination interventions across the world, we extrapolate the fundamentals and synthesise the principles that regulate PC and justify its implementation as a sound and essential public health intervention. The outline of the theoretical aspects of PC contributes to a thorough understanding of the different facets of this strategy and helps comprehend opportunities and limits of control and elimination interventions directed against helminth infections. Copyright © 2011 Royal Society of Tropical Medicine and Hygiene. Published by Elsevier Ltd. All rights reserved.

  15. Theoretical and experimental analysis of daylight performance for various shading systems

    Energy Technology Data Exchange (ETDEWEB)

    Tsangrassoulis, A [Group Building Environmental Studies, Lab. of Meteorology, Dept. of Applied Physics, Univ. of Athens (Greece); Santamouris, M [Group Building Environmental Studies, Lab. of Meteorology, Dept. of Applied Physics, Univ. of Athens (Greece); Asimakopoulos, D [Group Building Environmental Studies, Lab. of Meteorology, Dept. of Applied Physics, Univ. of Athens (Greece)

    1997-12-31

    The daylight coefficient approach is used for the theoretical analysis of various shading systems. Once a set of these coefficients has been calculated, it is very easy to calculate the illuminance in the interior of a room under various sky luminance distributions. The present paper examines a method based on daylight coefficients to evaluate daylight in the interior of a room. The method is compared with existing radiosity and ray-tracing methods. The examined method is experimentally validated using measurements obtained in a PASSYS test cell equipped with shading devices. (orig.)
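    In the daylight coefficient approach referred to above, interior illuminance is typically obtained by weighting the luminance of discretized sky patches with precomputed coefficients, roughly E = Σ_i DC_i · L_i · ΔΩ_i; the patch values in the short sketch below are placeholders, not measured data from the paper.

        import numpy as np

        # Illustrative daylight-coefficient evaluation for one interior point:
        # E = sum_i DC_i * L_i * dOmega_i over discretized sky patches (placeholder values).
        daylight_coeff = np.array([0.012, 0.018, 0.009, 0.004])   # per sky patch, dimensionless
        luminance = np.array([8000.0, 12000.0, 6000.0, 3000.0])   # cd/m^2, example sky
        solid_angle = np.full(4, 2 * np.pi / 4)                    # sr, crude equal-area patches

        illuminance = np.sum(daylight_coeff * luminance * solid_angle)
        print(f"interior illuminance ~ {illuminance:.0f} lx")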

  16. Theoretical parameter histories of dynamic tests during power commissioning of Mochovce units power level up to 100 % Nnom

    International Nuclear Information System (INIS)

    Jagrik, J.; Mraz, M.; Rapant, M.; Stefanovic, P.; Kotasek, J.; Gieci, A.; Macko, J.; Mosny, J.

    1998-01-01

    Theoretical histories of selected parameters for dynamic tests carried out in the course of power commissioning of Mochovce Unit 1 at the power level of 100% N_nom are presented in the report. The expected histories were developed based on calculations performed with the simulator at the Nuclear Power Plants Research Institute Trnava, Inc., the simulator at EGU Praha, and the simulator at the Mochovce plant, as well as on similar tests at both the Bohunice and Dukovany plants

  17. A group theoretic approach to quantum information

    CERN Document Server

    Hayashi, Masahito

    2017-01-01

    This textbook is the first one addressing quantum information from the viewpoint of group symmetry. Quantum systems have a group symmetrical structure. This structure enables quantum information processing to be handled systematically. However, there is no other textbook focusing on group symmetry for quantum information, although many textbooks on group representation exist. After the mathematical preparation of quantum information, this book discusses quantum entanglement and its quantification by using group symmetry. Group symmetry drastically simplifies the calculation of several entanglement measures, although their calculation is usually very difficult to handle. This book treats optimal information processes including quantum state estimation, quantum state cloning, estimation of group actions and quantum channels, etc. Usually it is very difficult to derive the optimal quantum information processes without the asymptotic setting of these topics. However, group symmetry allows one to derive these optimal solu...

  18. A Generic Danish Distribution Grid Model for Smart Grid Technology Testing

    DEFF Research Database (Denmark)

    Cha, Seung-Tae; Wu, Qiuwei; Østergaard, Jacob

    2012-01-01

    This paper describes the development of a generic Danish distribution grid model for smart grid technology testing based on the Bornholm power system. The frequency dependent network equivalent (FDNE) method has been used in order to accurately preserve the desired properties and characteristics ... by comparing the transient response of the original Bornholm power system model and the developed generic model under significant fault conditions. The results clearly show that the equivalent generic distribution grid model retains the dynamic characteristics of the original system, and can be used as a generic Smart Grid benchmark model for testing purposes.

  19. Population decay time and distribution of exciton states analyzed by rate equations based on theoretical phononic and electron-collisional rate coefficients

    Science.gov (United States)

    Oki, Kensuke; Ma, Bei; Ishitani, Yoshihiro

    2017-11-01

    Population distributions and transition fluxes of the A exciton in bulk GaN are theoretically analyzed using rate equations for states with principal quantum number n up to 5 and the continuum. These rate equations consist of terms for radiative, electron-collisional, and phononic processes. The dependence of the rate coefficients on temperature is revealed on the basis of the collisional-radiative model of hydrogen plasma for the electron-collisional processes and a theoretical formulation using Fermi's "golden rule" for the phononic processes. The respective effects of the variations in electron, exciton, and lattice temperatures are exhibited. This analysis forms a basis for discussing non-thermal-equilibrium states of carrier-exciton-phonon dynamics. It is found that exciton dissociation is enhanced even below 150 K, mainly by the increase in the lattice temperature. When the thermal-equilibrium temperature increases, the population fluxes between the states of n > 1 and the continuum become more dominant. Below 20 K, a severe deviation from the Saha-Boltzmann distribution occurs owing to the interband excitation flux being higher than the excitation flux from the 1S state. The population decay time of the 1S state at 300 K is more than ten times longer than the recombination lifetime of excitons with kinetic energy but without the upper levels (n > 1 and the continuum). This phenomenon is caused by a shift of the population distribution to the upper levels. This phonon-exciton-radiation model gives insights into the limitations of conventional analyses such as the ABC model, the Arrhenius plot, the two-level model (n = 1 and the continuum), and the neglect of the upper levels.

  20. Exact Partial Information Decompositions for Gaussian Systems Based on Dependency Constraints

    Directory of Open Access Journals (Sweden)

    Jim W. Kay

    2018-03-01

    Full Text Available The Partial Information Decomposition, introduced by Williams P. L. et al. (2010), provides a theoretical framework to characterize and quantify the structure of multivariate information sharing. A new method (I_dep) has recently been proposed by James R. G. et al. (2017) for computing a two-predictor partial information decomposition over discrete spaces. A lattice of maximum entropy probability models is constructed based on marginal dependency constraints, and the unique information that a particular predictor has about the target is defined as the minimum increase in joint predictor-target mutual information when that particular predictor-target marginal dependency is constrained. Here, we apply the I_dep approach to Gaussian systems, for which the marginally constrained maximum entropy models are Gaussian graphical models. Closed form solutions for the I_dep PID are derived for both univariate and multivariate Gaussian systems. Numerical and graphical illustrations are provided, together with practical and theoretical comparisons of the I_dep PID with the minimum mutual information partial information decomposition (I_mmi), which was discussed by Barrett A. B. (2015). The results obtained using I_dep appear to be more intuitive than those given with other methods, such as I_mmi, in which the redundant and unique information components are constrained to depend only on the predictor-target marginal distributions. In particular, it is proved that the I_mmi method generally produces larger estimates of redundancy and synergy than does the I_dep method. In discussion of the practical examples, the PIDs are complemented by the use of tests of deviance for the comparison of Gaussian graphical models.
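    For orientation on the quantities being compared, the short sketch below computes the Gaussian mutual information terms and an I_mmi-style decomposition (redundancy taken as the minimum of the two predictor-target informations) from a covariance matrix; the covariance values are illustrative, and the closed-form I_dep solution of the paper is not implemented here.

        import numpy as np

        def gaussian_mi(cov, a, b):
            """I(X_a; X_b) in nats for a zero-mean Gaussian with covariance cov."""
            det = lambda idx: np.linalg.det(cov[np.ix_(idx, idx)])
            return 0.5 * np.log(det(a) * det(b) / det(a + b))

        # Variables ordered (X1, X2, Y); illustrative covariance matrix.
        cov = np.array([[1.0, 0.3, 0.6],
                        [0.3, 1.0, 0.5],
                        [0.6, 0.5, 1.0]])

        i1 = gaussian_mi(cov, [0], [2])        # I(X1; Y)
        i2 = gaussian_mi(cov, [1], [2])        # I(X2; Y)
        i12 = gaussian_mi(cov, [0, 1], [2])    # I(X1, X2; Y)

        redundancy = min(i1, i2)               # I_mmi-style redundancy (Barrett, 2015)
        unique1 = i1 - redundancy
        unique2 = i2 - redundancy
        synergy = i12 - unique1 - unique2 - redundancy

        print(f"R = {redundancy:.3f}, U1 = {unique1:.3f}, U2 = {unique2:.3f}, S = {synergy:.3f}")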

  1. An information theoretic approach to use high-fidelity codes to calibrate low-fidelity codes

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Allison, E-mail: lewis.allison10@gmail.com [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Smith, Ralph [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Williams, Brian [Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Figueroa, Victor [Sandia National Laboratories, Albuquerque, NM 87185 (United States)

    2016-11-01

    For many simulation models, it can be prohibitively expensive or physically infeasible to obtain a complete set of experimental data to calibrate model parameters. In such cases, one can alternatively employ validated higher-fidelity codes to generate simulated data, which can be used to calibrate the lower-fidelity code. In this paper, we employ an information-theoretic framework to determine the reduction in parameter uncertainty that is obtained by evaluating the high-fidelity code at a specific set of design conditions. These conditions are chosen sequentially, based on the amount of information that they contribute to the low-fidelity model parameters. The goal is to employ Bayesian experimental design techniques to minimize the number of high-fidelity code evaluations required to accurately calibrate the low-fidelity model. We illustrate the performance of this framework using heat and diffusion examples, a 1-D kinetic neutron diffusion equation, and a particle transport model, and include initial results from the integration of the high-fidelity thermal-hydraulics code Hydra-TH with a low-fidelity exponential model for the friction correlation factor.
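    A simplified sketch of the sequential design idea follows: candidate design conditions are ranked by a Monte Carlo estimate of the expected entropy reduction in a (discretized) parameter posterior, and the best one is "run" on an emulated high-fidelity code; the linear low-fidelity model, noise level, and candidate set are all illustrative assumptions, not the Hydra-TH coupling described in the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative low-fidelity model y = theta * x, calibrated against a "high-fidelity"
        # code emulated here by theta_true * x plus noise; all numbers are made up.
        theta_grid = np.linspace(0.0, 2.0, 201)
        prior = np.full_like(theta_grid, 1.0 / len(theta_grid))
        sigma = 0.1
        theta_true = 1.3

        def posterior(prior, x, y):
            like = np.exp(-0.5 * ((y - theta_grid * x) / sigma) ** 2)
            post = prior * like
            return post / post.sum()

        def entropy(p):
            p = p[p > 0]
            return -np.sum(p * np.log(p))

        def expected_information_gain(prior, x, n_mc=200):
            """Monte Carlo estimate of the expected entropy reduction from evaluating the
            high-fidelity code at design condition x (prior-predictive average)."""
            gains = []
            for theta in rng.choice(theta_grid, size=n_mc, p=prior):
                y = theta * x + rng.normal(0.0, sigma)
                gains.append(entropy(prior) - entropy(posterior(prior, x, y)))
            return np.mean(gains)

        candidates = [0.2, 0.5, 1.0, 2.0, 4.0]
        for _ in range(3):   # choose three high-fidelity runs sequentially
            eig = [expected_information_gain(prior, x) for x in candidates]
            x_star = candidates[int(np.argmax(eig))]
            y_obs = theta_true * x_star + rng.normal(0.0, sigma)   # "run" the high-fidelity code
            prior = posterior(prior, x_star, y_obs)
            mean = np.sum(prior * theta_grid)
            sd = np.sqrt(np.sum(prior * (theta_grid - mean) ** 2))
            print(f"chose x = {x_star}, posterior sd = {sd:.4f}")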

  2. A Distributed Multi-Agent System for Collaborative Information Management and Learning

    Science.gov (United States)

    Chen, James R.; Wolfe, Shawn R.; Wragg, Stephen D.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In this paper, we present DIAMS, a system of distributed, collaborative agents to help users access, manage, share and exchange information. A DIAMS personal agent helps its owner find information most relevant to current needs. It provides tools and utilities for users to manage their information repositories with dynamic organization and virtual views. Flexible hierarchical display is integrated with indexed query search to support effective information access. Automatic indexing methods are employed to support user queries and communication between agents. Contents of a repository are kept in object-oriented storage to facilitate information sharing. Collaboration between users is aided by easy sharing utilities as well as automated information exchange. Matchmaker agents are designed to establish connections between users with similar interests and expertise. DIAMS agents provide needed services for users to share and learn information from one another on the World Wide Web.

  3. Integrated Power Flow and Short Circuit Calculation Method for Distribution Network with Inverter Based Distributed Generation

    Directory of Open Access Journals (Sweden)

    Shan Yang

    2016-01-01

    Full Text Available Power flow calculation and short circuit calculation are the basis of theoretical research for distribution networks with inverter-based distributed generation. The similarity of the equivalent model for inverter-based distributed generation during normal and fault conditions of the distribution network, and the differences between power flow and short circuit calculation, are analyzed in this paper. An integrated power flow and short circuit calculation method for distribution networks with inverter-based distributed generation is then proposed. The proposed method represents the inverter-based distributed generation as an Iθ bus, which makes it suitable for calculating the power flow of a distribution network with current-limited inverter-based distributed generation. The low voltage ride-through capability of inverter-based distributed generation can be considered as well. Finally, tests of power flow and short circuit current calculation are performed on a 33-bus distribution network. The results of the proposed method are compared with those of the traditional method and the simulation method, which verifies the effectiveness of the integrated method suggested in this paper.

  4. Weighted Lomax distribution.

    Science.gov (United States)

    Kilany, N M

    2016-01-01

    The Lomax distribution (Pareto Type-II) is widely applicable in reliability and life-testing problems in engineering, as well as in survival analysis as an alternative distribution. In this paper, the Weighted Lomax distribution is proposed and studied. The density function and its behavior, moments, hazard and survival functions, mean residual life and reversed failure rate, extreme value distributions, and order statistics are derived and studied. The parameters of this distribution are estimated by the method of moments and by maximum likelihood, and the observed information matrix is derived. Moreover, simulation schemes are derived. Finally, an application of the model to a real data set is presented and compared with some other well-known distributions.
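
    The exact weight function and data set used in the record above are not reproduced here, so the sketch below is only a hedged illustration of fitting a weighted Lomax density by maximum likelihood: it assumes a length-biased weight w(x) = x, giving f_w(x) = x f(x)/E[X], and fits it to a synthetic Lomax sample with scipy.

```python
# Minimal sketch: maximum-likelihood fit of a weighted (length-biased) Lomax
# density, f_w(x) = x f(x) / E[X]. The weight w(x) = x and the synthetic data are
# illustrative assumptions; the paper's exact weight function may differ.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import lomax

rng = np.random.default_rng(1)
data = lomax.rvs(c=3.0, scale=2.0, size=500, random_state=rng)   # hypothetical sample

def neg_log_lik(params, x):
    alpha, lam = params
    if alpha <= 1.0 or lam <= 0.0:        # alpha > 1 needed for a finite mean (weight x)
        return np.inf
    # length-biased Lomax log-density: log[alpha*(alpha-1)*x/lam^2 * (1 + x/lam)^-(alpha+1)]
    logpdf = (np.log(alpha) + np.log(alpha - 1.0) + np.log(x)
              - 2.0 * np.log(lam) - (alpha + 1.0) * np.log1p(x / lam))
    return -np.sum(logpdf)

res = minimize(neg_log_lik, x0=np.array([2.0, 1.0]), args=(data,), method="Nelder-Mead")
alpha_hat, lam_hat = res.x
print(f"alpha_hat = {alpha_hat:.3f}, lambda_hat = {lam_hat:.3f}, -logL = {res.fun:.1f}")
```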

  5. Moisture distribution in sludges based on different testing methods

    Institute of Scientific and Technical Information of China (English)

    Wenyi Deng; Xiaodong Li; Jianhua Yan; Fei Wang; Yong Chi; Kefa Cen

    2011-01-01

    Moisture distributions in municipal sewage sludge, printing and dyeing sludge, and paper mill sludge were experimentally studied using four different methods, i.e., a drying test, a thermogravimetric-differential thermal analysis (TG-DTA) test, a thermogravimetric-differential scanning calorimetry (TG-DSC) test, and a water activity test. The results indicated that the moisture in the mechanically dewatered sludges consisted of interstitial water, surface water and bound water. The interstitial water accounted for more than 50% wet basis (wb) of the total moisture content. The bond strength of sludge moisture increased with decreasing moisture content, especially when the moisture content was lower than 50% wb. Furthermore, a comparison among the four testing methods was presented. The advantage of the drying test was its ability to quantify free water, interstitial water, surface water and bound water, while the TG-DSC, TG-DTA and water activity tests were capable of determining the bond strength of moisture in sludge. It was found that the results from the TG-DSC and TG-DTA tests are more persuasive than those from the water activity test.

  6. A theoretical model for prediction of deposition efficiency in cold spraying

    International Nuclear Information System (INIS)

    Li Changjiu; Li Wenya; Wang Yuyue; Yang Guanjun; Fukanuma, H.

    2005-01-01

    The deposition behavior of a spray particle stream with a particle size distribution was theoretically examined for cold spraying in terms of deposition efficiency as a function of particle parameters and spray angle. A theoretical relation was established between the deposition efficiency and the spray angle. Experiments were conducted by measuring the deposition efficiency at different driving gas conditions and different spray angles using gas-atomized copper powder. It was found that the theoretically estimated results agreed reasonably well with the experimental ones. Based on the theoretical model and experimental results, it was revealed that the distribution of particle velocity resulting from the particle size distribution significantly influences the deposition efficiency in cold spraying. It was necessary for the majority of particles to achieve a velocity higher than the critical velocity in order to improve the deposition efficiency. The normal component of the particle velocity contributed to the deposition of the particle under off-normal spray conditions. The deposition efficiency of sprayed particles decreased owing to the decrease of the normal velocity component as spraying was performed at an off-normal angle
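
    As a hedged illustration of the relation described above (not the paper's model or fitted constants), the sketch below estimates deposition efficiency as the mass fraction of particles whose normal velocity component exceeds a critical velocity, using an assumed lognormal size distribution and a made-up size-velocity relation, as the spray angle moves away from normal.

```python
# Minimal sketch: deposition efficiency as the mass fraction of particles whose
# normal velocity component exceeds a critical velocity. The size distribution,
# the size-velocity relation, and v_crit below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
d = rng.lognormal(mean=np.log(25e-6), sigma=0.35, size=200_000)   # particle diameters (m)

def impact_velocity(d):
    """Hypothetical size-velocity relation: smaller particles are accelerated more."""
    return 650.0 * (d / 25e-6) ** -0.4        # m/s

v_crit = 560.0                                 # assumed critical velocity, m/s

def deposition_efficiency(off_normal_deg):
    v_normal = impact_velocity(d) * np.cos(np.deg2rad(off_normal_deg))  # normal component
    mass = d ** 3                                                       # particle mass ~ d^3
    return mass[v_normal >= v_crit].sum() / mass.sum()

for angle in (0, 10, 20, 30):
    print(f"off-normal angle {angle:2d} deg: deposition efficiency = {deposition_efficiency(angle):.2%}")
```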

  7. Adulthood Social Class and Union Interest: A First Test of a Theoretical Model.

    Science.gov (United States)

    Mellor, Steven

    2016-10-02

    A serial mediation model of union interest was tested. Based on theoretical notes provided by Mellor and Golay (in press), adulthood social class was positioned as a predictor of willingness to join a labor union, with success/failure attributions at work and willingness to share work goals positioned as intervening variables. Data from U.S. nonunion employees (N = 560) suggested full mediation after effects were adjusted for childhood social class. In sequence, adulthood social class predicted success/failure attributions at work, success/failure attributions at work predicted willingness to share work goals, and willingness to share work goals predicted willingness to join. Implications for socioeconomic status (SES) research and union expansion are discussed.

  8. Extracting information on the spatial variability in erosion rate stored in detrital cooling age distributions in river sands

    Science.gov (United States)

    Braun, Jean; Gemignani, Lorenzo; van der Beek, Peter

    2018-03-01

    One of the main purposes of detrital thermochronology is to provide constraints on the regional-scale exhumation rate and its spatial variability in actively eroding mountain ranges. Procedures that use cooling age distributions coupled with hypsometry and thermal models have been developed in order to extract quantitative estimates of erosion rate and its spatial distribution, assuming steady state between tectonic uplift and erosion. This hypothesis precludes the use of these procedures to assess the likely transient response of mountain belts to changes in tectonic or climatic forcing. Other methods are based on a priori knowledge of the in situ distribution of ages to interpret the detrital age distributions. In this paper, we describe a simple method that, using the observed detrital mineral age distributions collected along a river, allows us to extract information about the relative distribution of erosion rates in an eroding catchment without relying on a steady-state assumption, the value of thermal parameters or a priori knowledge of in situ age distributions. The model is based on a relatively low number of parameters describing lithological variability among the various sub-catchments and their sizes and only uses the raw ages. The method we propose is tested against synthetic age distributions to demonstrate its accuracy and the optimum conditions for its use. In order to illustrate the method, we invert age distributions collected along the main trunk of the Tsangpo-Siang-Brahmaputra river system in the eastern Himalaya. From the inversion of the cooling age distributions we predict present-day erosion rates of the catchments along the Tsangpo-Siang-Brahmaputra river system, as well as some of its tributaries. We show that detrital age distributions contain dual information about present-day erosion rate, i.e., from the predicted distribution of surface ages within each catchment and from the relative contribution of any given catchment to the
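
    The paper's inversion deliberately avoids a priori in-situ age distributions, so the sketch below is only a generic, hedged illustration of the underlying mixing idea: a downstream detrital age histogram is expressed as a non-negative mixture of synthetic per-catchment age distributions, and the relative contributions (proportional to erosion rate, area and mineral fertility) are recovered by non-negative least squares.

```python
# Minimal sketch (not the paper's algorithm, which avoids a priori in-situ age
# distributions): express a downstream detrital age histogram as a non-negative
# mixture of per-catchment age distributions and recover the relative
# contributions by non-negative least squares. All data below are synthetic.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)
edges = np.linspace(0.0, 20.0, 41)                       # age bins (Myr)
centres = (4.0, 9.0, 15.0)                               # synthetic catchment age peaks

# per-catchment cooling-age distributions (columns of the mixing matrix)
A = np.column_stack([np.histogram(rng.normal(mu, 1.0, 5000), edges, density=True)[0]
                     for mu in centres])

true_w = np.array([0.5, 0.2, 0.3])                       # true relative contributions
labels = rng.choice(3, size=300, p=true_w)               # 300 dated grains in the river sand
sample = np.concatenate([rng.normal(centres[k], 1.0, int((labels == k).sum()))
                         for k in range(3)])
b = np.histogram(sample, edges, density=True)[0]

w, _ = nnls(A, b)                                        # non-negative mixture weights
w /= w.sum()
print("true contributions     :", true_w)
print("recovered contributions:", np.round(w, 3))
```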

  9. Extracting information on the spatial variability in erosion rate stored in detrital cooling age distributions in river sands

    Directory of Open Access Journals (Sweden)

    J. Braun

    2018-03-01

    Full Text Available One of the main purposes of detrital thermochronology is to provide constraints on the regional-scale exhumation rate and its spatial variability in actively eroding mountain ranges. Procedures that use cooling age distributions coupled with hypsometry and thermal models have been developed in order to extract quantitative estimates of erosion rate and its spatial distribution, assuming steady state between tectonic uplift and erosion. This hypothesis precludes the use of these procedures to assess the likely transient response of mountain belts to changes in tectonic or climatic forcing. Other methods are based on a priori knowledge of the in situ distribution of ages to interpret the detrital age distributions. In this paper, we describe a simple method that, using the observed detrital mineral age distributions collected along a river, allows us to extract information about the relative distribution of erosion rates in an eroding catchment without relying on a steady-state assumption, the value of thermal parameters or a priori knowledge of in situ age distributions. The model is based on a relatively low number of parameters describing lithological variability among the various sub-catchments and their sizes and only uses the raw ages. The method we propose is tested against synthetic age distributions to demonstrate its accuracy and the optimum conditions for its use. In order to illustrate the method, we invert age distributions collected along the main trunk of the Tsangpo–Siang–Brahmaputra river system in the eastern Himalaya. From the inversion of the cooling age distributions we predict present-day erosion rates of the catchments along the Tsangpo–Siang–Brahmaputra river system, as well as some of its tributaries. We show that detrital age distributions contain dual information about present-day erosion rate, i.e., from the predicted distribution of surface ages within each catchment and from the relative contribution of

  10. Interevent time distribution in seismicity: A theoretical approach

    International Nuclear Information System (INIS)

    Molchan, G.

    2004-09-01

    This paper presents an analysis of the distribution of the time τ between two consecutive events in a stationary point process. The study is motivated by the discovery of unified scaling laws for τ for the case of seismic events. We demonstrate that these laws cannot exist simultaneously in a seismogenic area. Under very natural assumptions we show that if, after rescaling to ensure Eτ = 1, the interevent time has a universal distribution F, then F must be exponential. In other words, Corral's unified scaling law cannot exist in the whole range of time. In the framework of a general cluster model we discuss the parameterization of an empirical unified law and the physical meaning of the parameters involved

  11. The Application of Hardware in the Loop Testing for Distributed Engine Control

    Science.gov (United States)

    Thomas, George L.; Culley, Dennis E.; Brand, Alex

    2016-01-01

    The essence of a distributed control system is the modular partitioning of control function across a hardware implementation. This type of control architecture requires embedding electronics in a multitude of control element nodes for the execution of those functions, and their integration as a unified system. As the field of distributed aeropropulsion control moves toward reality, questions about building and validating these systems remain. This paper focuses on the development of hardware-in-the-loop (HIL) test techniques for distributed aero engine control, and the application of HIL testing as it pertains to potential advanced engine control applications that may now be possible due to the intelligent capability embedded in the nodes.

  12. Pedestrian headform testing: inferring performance at impact speeds and for headform masses not tested, and estimating average performance in a range of real-world conditions.

    Science.gov (United States)

    Hutchinson, T Paul; Anderson, Robert W G; Searson, Daniel J

    2012-01-01

    Tests are routinely conducted where instrumented headforms are projected at the fronts of cars to assess pedestrian safety. Better information would be obtained by accounting for performance over the range of expected impact conditions in the field. Moreover, methods will be required to integrate the assessment of secondary safety performance with primary safety systems that reduce the speeds of impacts. Thus, we discuss how to estimate performance over a range of impact conditions from performance in one test and how this information can be combined with information on the probability of different impact speeds to provide a balanced assessment of pedestrian safety. Theoretical consideration is given to two distinct aspects of impact safety performance: the test impact severity (measured by the head injury criterion, HIC) at a speed at which a structure does not bottom out and the speed at which bottoming out occurs. Further considerations are given to an injury risk function, the distribution of impact speeds likely in the field, and the effect of primary safety systems on impact speeds. These are used to calculate curves that estimate injuriousness for combinations of test HIC, bottoming out speed, and alternative distributions of impact speeds. The injuriousness of a structure that may be struck by the head of a pedestrian depends not only on the result of the impact test but also on the bottoming-out speed and the distribution of impact speeds. Example calculations indicate that the relationship between the test HIC and injuriousness extends over a larger range than is presently used by the European New Car Assessment Programme (Euro NCAP), that bottoming out at speeds only slightly higher than the test speed can significantly increase the injuriousness of an impact location and that effective primary safety systems that reduce impact speeds significantly modify the relationship between the test HIC and injuriousness. Present testing regimes do not take fully into
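
    As a hedged illustration of the averaging described above (all functional forms and constants are made-up, not the paper's fitted values), the sketch below scales a test HIC across impact speeds with a power law, applies a penalty above an assumed bottoming-out speed, and averages an illustrative logistic injury-risk curve over a normal distribution of impact speeds, with and without a primary-safety speed reduction.

```python
# Minimal sketch: averaging head-injury risk over a distribution of impact speeds,
# given a test HIC at a reference speed, a power-law speed scaling, and a penalty
# once the structure bottoms out. All functional forms and constants here are
# illustrative assumptions, not the paper's fitted values.
import numpy as np

def hic_at_speed(v, hic_test, v_test=40.0, exponent=2.5, v_bottom=50.0, factor=3.0):
    """Scale the test HIC to other speeds; inflate it once bottoming out occurs."""
    hic = hic_test * (v / v_test) ** exponent
    return np.where(v >= v_bottom, factor * hic, hic)

def injury_risk(hic):
    """Illustrative logistic risk curve for serious head injury."""
    return 1.0 / (1.0 + np.exp(-(np.log(hic) - np.log(1400.0)) / 0.35))

def mean_injuriousness(hic_test, v_mean, v_sd, n=200_000, seed=4):
    v = np.random.default_rng(seed).normal(v_mean, v_sd, n).clip(min=1.0)  # impact speeds (km/h)
    return float(injury_risk(hic_at_speed(v, hic_test)).mean())

for hic_test in (800, 1200, 1600):
    base = mean_injuriousness(hic_test, v_mean=40.0, v_sd=10.0)
    braked = mean_injuriousness(hic_test, v_mean=32.0, v_sd=10.0)   # with a primary safety system
    print(f"test HIC {hic_test}: mean risk {base:.3f} -> {braked:.3f} after speed reduction")
```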

  13. Quantum key distribution with hacking countermeasures and long term field trial.

    Science.gov (United States)

    Dixon, A R; Dynes, J F; Lucamarini, M; Fröhlich, B; Sharpe, A W; Plews, A; Tam, W; Yuan, Z L; Tanizawa, Y; Sato, H; Kawamura, S; Fujiwara, M; Sasaki, M; Shields, A J

    2017-05-16

    Quantum key distribution's (QKD's) central and unique claim is information theoretic security. However there is an increasing understanding that the security of a QKD system relies not only on theoretical security proofs, but also on how closely the physical system matches the theoretical models and prevents attacks due to discrepancies. These side channel or hacking attacks exploit physical devices which do not necessarily behave precisely as the theory expects. As such there is a need for QKD systems to be demonstrated to provide security both in the theoretical and physical implementation. We report here a QKD system designed with this goal in mind, providing a more resilient target against possible hacking attacks including Trojan horse, detector blinding, phase randomisation and photon number splitting attacks. The QKD system was installed into a 45 km link of a metropolitan telecom network for a 2.5 month period, during which time the system operated continuously and distributed 1.33 Tbits of secure key data with a stable secure key rate over 200 kbit/s. In addition security is demonstrated against coherent attacks that are more general than the collective class of attacks usually considered.

  14. Distributed learning process: principles of design and implementation

    Directory of Open Access Journals (Sweden)

    G. N. Boychenko

    2016-01-01

    Full Text Available At the present stage, broad use of information and communication technologies (ICT) in educational practice is one of the leading trends in the development of the global education system. This trend has led to a transformation of instructional interaction models. Scientists have developed the theory of distributed cognition (Salomon, G.; Hutchins, E.) and of distributed education and training (Fiore, S. M.; Salas, E.; Oblinger, D. G.; Barone, C. A.; Hawkins, B. L.). The educational process is based on two sub-processes, learning and teaching, separated in time and space, which are aimed at organizing flexible interactions between learners, teachers and educational content located in different, non-centralized places. The purpose of this design research is to find a solution to the problem of formalizing the design and realization of a distributed learning process, which is significant in instructional design. The solution to this problem should take into account the specifics of distributed interactions between team members, who become the collective subject of distributed cognition in the distributed learning process. This makes it necessary to design the roles and functions of the individual team members performing distributed educational activities. Personal educational objectives should be determined by decomposing team objectives into the functional roles of its members, taking into account the personal and learning needs and interests of students. Theoretical and empirical methods used in the study: theoretical analysis of philosophical, psychological, and pedagogical literature on the issue; analysis of international standards in the e-learning domain; exploration of the practical use of distributed learning in the academic and corporate sectors; generalization, abstraction, cognitive modelling, and ontology engineering methods. The result of the research is a methodology for the design and implementation of a distributed learning process based on the competency approach. Methodology proposed by

  15. Comparison of minute distribution frequency for anesthesia start and end times from an anesthesia information management system and paper records.

    Science.gov (United States)

    Phelps, Michael; Latif, Asad; Thomsen, Robert; Slodzinski, Martin; Raghavan, Rahul; Paul, Sharon Leigh; Stonemetz, Jerry

    2017-08-01

    Use of an anesthesia information management system (AIMS) has been reported to improve accuracy of recorded information. We tested the hypothesis that analyzing the distribution of times charted on paper and computerized records could reveal possible rounding errors, and that this effect could be modulated by differences in the user interface for documenting certain event times with an AIMS. We compared the frequency distribution of start and end times for anesthesia cases completed with paper records and an AIMS. Paper anesthesia records had significantly more times ending with "0" and "5" compared to those from the AIMS (p < 0.001). For case start times, AIMS still exhibited end-digit preference, with times whose last digits had significantly higher frequencies of "0" and "5" than other integers. This effect, however, was attenuated compared to that for paper anesthesia records. For case end times, the distribution of minutes recorded with AIMS was almost evenly distributed, unlike those from paper records that still showed significant end-digit preference. The accuracy of anesthesia case start times and case end times, as inferred by statistical analysis of the distribution of the times, is enhanced with the use of an AIMS. Furthermore, the differences in AIMS user interface for documenting case start and case end times likely affects the degree of end-digit preference, and likely accuracy, of those times.
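
    A simple way to quantify the end-digit preference discussed above is a chi-square test of the last digit of the recorded minutes against a uniform expectation. The sketch below applies this to synthetic "paper" and "AIMS" minute data; the simulated digit frequencies are illustrative, not the study's values.

```python
# Minimal sketch: testing recorded case times for end-digit preference by comparing
# the frequencies of the minutes' last digits against a uniform expectation. The
# synthetic "paper" and "AIMS" samples below are illustrative, not the study data.
import numpy as np
from scipy.stats import chisquare

def end_digit_test(minutes):
    """minutes: array of the minute component (0-59) of recorded times."""
    last_digit = np.asarray(minutes) % 10
    observed = np.bincount(last_digit, minlength=10)
    stat, p = chisquare(observed)              # H0: all ten last digits equally likely
    return observed, stat, p

rng = np.random.default_rng(5)
paper = rng.choice([0, 5], size=300, p=[0.6, 0.4]) + 10 * rng.integers(0, 6, 300)  # rounded charting
aims = rng.integers(0, 60, size=300)                                               # automated timestamps

for name, minutes in (("paper", paper), ("AIMS ", aims)):
    obs, stat, p = end_digit_test(minutes)
    print(f"{name}: last-digit counts {obs}, chi2 = {stat:.1f}, p = {p:.3g}")
```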

  16. INFORMATION-MEASURING TEST SYSTEM OF DIESEL LOCOMOTIVE HYDRAULIC TRANSMISSIONS

    Directory of Open Access Journals (Sweden)

    I. V. Zhukovytskyy

    2015-08-01

    Full Text Available Purpose. The article describes the development of an information-measuring test system for diesel locomotive hydraulic transmissions, which makes it possible to obtain baseline data for further studies to determine the technical condition of the transmissions. The improvement of factory technology for post-repair testing of hydraulic transmissions, by automating the existing test stands according to the specifications of diesel locomotive repair enterprises, was analyzed. The analysis is based on a detailed review of existing foreign information-measuring test systems for the hydraulic transmissions of diesel locomotives, BelAZ earthmovers, aircraft tugs, slag cars, trucks, BelAZ wheel dozers, some brands of tractors, etc. The problem of creating such a system is addressed starting from the possibility of automating the existing hydraulic transmission test stand at the Dnipropetrovsk Diesel Locomotive Repair Plant "Promteplovoz". Methodology. The researchers proposed a method for creating a microprocessor-based automated stand-testing system for diesel locomotive hydraulic transmissions under locomotive plant conditions, justifying the selection of the necessary sensors as well as the hardware and software required for information-measuring systems. Findings. Based on the analysis, the need to improve the plant's hydraulic transmission stand testing by creating a microprocessor testing system was substantiated, supported by the experience of developing such systems abroad. Further research should aim to improve the accuracy and frequency of data collection by adopting more modern and reliable sensors together with software filtering of electromagnetic and other interference. Originality. The

  17. Patients’ Acceptance of Smartphone Health Technology for Chronic Disease Management: A Theoretical Model and Empirical Test

    Science.gov (United States)

    Dou, Kaili; Yu, Ping; Liu, Fang; Guan, YingPing; Li, Zhenye; Ji, Yumeng; Du, Ningkai; Lu, Xudong; Duan, Huilong

    2017-01-01

    Background Chronic disease patients often face multiple challenges from difficult comorbidities. Smartphone health technology can be used to help them manage their conditions only if they accept and use the technology. Objective The aim of this study was to develop and test a theoretical model to predict and explain the factors influencing patients’ acceptance of smartphone health technology for chronic disease management. Methods Multiple theories and factors that may influence patients’ acceptance of smartphone health technology have been reviewed. A hybrid theoretical model was built based on the technology acceptance model, dual-factor model, health belief model, and the factors identified from interviews that might influence patients’ acceptance of smartphone health technology for chronic disease management. Data were collected from patient questionnaire surveys and computer log records about 157 hypertensive patients’ actual use of a smartphone health app. The partial least square method was used to test the theoretical model. Results The model accounted for .412 of the variance in patients’ intention to adopt the smartphone health technology. Intention to use accounted for .111 of the variance in actual use and had a significant weak relationship with the latter. Perceived ease of use was affected by patients’ smartphone usage experience, relationship with doctor, and self-efficacy. Although without a significant effect on intention to use, perceived ease of use had a significant positive influence on perceived usefulness. Relationship with doctor and perceived health threat had significant positive effects on perceived usefulness, countering the negative influence of resistance to change. Perceived usefulness, perceived health threat, and resistance to change significantly predicted patients’ intentions to use the technology. Age and gender had no significant influence on patients’ acceptance of smartphone technology. The study also

  18. Experimental apparatus to test air trap valves

    Science.gov (United States)

    Lemos De Lucca, Y. de F.; de Aquino, G. A.; Filho, J. G. D.

    2010-08-01

    It is known that the presence of trapped air within water distribution pipes can lead to irregular operation or even damage to the distribution systems and their components. The presence of trapped air may occur while the pipes are being filled with water, or while the pumping systems are in operation. The formation of large air pockets can produce the water hammer phenomenon, instability and loss of pressure in the water distribution networks. As a result, it can overload the pumps, increase the consumption of electricity, and damage the pumping system. In order to avoid its formation, all of the trapped air should be removed through "air trap valves". In Brazil, manufacturers frequently have unreliable sizing charts, which cause malfunctioning of the "air trap valves". These malfunctions have caused accidents with substantial damage. The construction of a test facility will provide a foundation of technical information that will be used to help make decisions when designing a system of pipelines where "air trap valves" are used. To achieve this, all of the valve characteristics (geometric, mechanical, hydraulic and dynamic) should be determined. This paper aims to describe and analyze the experimental apparatus and test procedure to be used to test "air trap valves". The experimental apparatus and test facility will be located at the University of Campinas, Brazil at the College of Civil Engineering, Architecture, and Urbanism in the Hydraulics and Fluid Mechanics laboratory. The experimental apparatus will comprise various components (pumps, steel pipes, butterfly valves to control the discharge, flow meter and reservoirs) and instrumentation (pressure transducers, anemometer and proximity sensor). It should be emphasized that all theoretical and experimental procedures should be defined while taking into consideration flow parameters and fluid properties that influence the tests.

  19. Experimental apparatus to test air trap valves

    Energy Technology Data Exchange (ETDEWEB)

    Lemos De Lucca, Y de F [CTH-DAEE-USP/FAAP/UNICAMP (Brazil); Aquino, G A de [SABESP/UNICAMP (Brazil); Filho, J G D, E-mail: yvone.lucca@gmail.co [Water Resources Department, University of Campinas-UNICAMP, Av. Albert Einstein, 951, Cidade Universitaria-Barao Geraldo-Campinas, S.P., 13083-852 (Brazil)

    2010-08-15

    It is known that the presence of trapped air within water distribution pipes can lead to irregular operation or even damage to the distribution systems and their components. The presence of trapped air may occur while the pipes are being filled with water, or while the pumping systems are in operation. The formation of large air pockets can produce the water hammer phenomenon, instability and loss of pressure in the water distribution networks. As a result, it can overload the pumps, increase the consumption of electricity, and damage the pumping system. In order to avoid its formation, all of the trapped air should be removed through 'air trap valves'. In Brazil, manufacturers frequently have unreliable sizing charts, which cause malfunctioning of the 'air trap valves'. These malfunctions have caused accidents with substantial damage. The construction of a test facility will provide a foundation of technical information that will be used to help make decisions when designing a system of pipelines where 'air trap valves' are used. To achieve this, all of the valve characteristics (geometric, mechanical, hydraulic and dynamic) should be determined. This paper aims to describe and analyze the experimental apparatus and test procedure to be used to test 'air trap valves'. The experimental apparatus and test facility will be located at the University of Campinas, Brazil at the College of Civil Engineering, Architecture, and Urbanism in the Hydraulics and Fluid Mechanics laboratory. The experimental apparatus will comprise various components (pumps, steel pipes, butterfly valves to control the discharge, flow meter and reservoirs) and instrumentation (pressure transducers, anemometer and proximity sensor). It should be emphasized that all theoretical and experimental procedures should be defined while taking into consideration flow parameters and fluid properties that influence the tests.

  20. Final comparison report on ISP-35: Nupec hydrogen mixing and distribution test (Test M-7-1)

    International Nuclear Information System (INIS)

    1994-12-01

    This final comparison report summarizes the results of the OECD/CSNI-sponsored ISP-35 exercise, which was based on NUPEC's Hydrogen Mixing and Distribution Test M-7-1. Twelve organizations from 10 different countries took part in the exercise. For the ISP-35 test, a steam/light gas (helium) mixture was released into the lower region of a simplified model of a PWR containment. At the same time, the dome cooling spray was also activated. The transient time histories for gas temperature and concentrations were recorded for each of the 25 compartments of the model containment. The wall temperatures as well as the dome pressure were also recorded. The ISP-35 participants simulated the test conditions and attempted to predict the time histories using their accident analysis codes. Results of these analyses are presented, and comparisons are made between the experimental data and the calculated data. In general, predictions for pressure, helium concentration and gas distribution patterns were achieved with acceptable accuracy

  1. Information system architecture to support transparent access to distributed, heterogeneous data sources

    International Nuclear Information System (INIS)

    Brown, J.C.

    1994-08-01

    Quality situation assessment and decision making require access to multiple sources of data and information. Insufficient accessibility to data exists for many large corporations and Government agencies. By utilizing current advances in computer technology, today's situation analysts have a wealth of information at their disposal. There are many potential solutions to the information accessibility problem using today's technology. The United States Department of Energy (US-DOE) faced this problem when dealing with one class of problem in the US. The result of their efforts has been the creation of the Tank Waste Information Network System -- TWINS. The TWINS solution combines many technologies to address problems in several areas such as User Interfaces, Transparent Access to Multiple Data Sources, and Integrated Data Access. Data related to the complex is currently distributed throughout several US-DOE installations. Over time, each installation has adopted its own set of standards related to information management. Heterogeneous hardware and software platforms exist both across the complex and within a single installation. Standards for information management vary between US-DOE mission areas within installations. These factors contribute to the complexity of accessing information in a manner that enhances the performance and decision making process of the analysts. This paper presents one approach taken by the DOE to resolve the problem of distributed, heterogeneous, multi-media information management for the HLW Tank complex. The information system architecture developed for the DOE by the TWINS effort is one that is adaptable to other problem domains and uses

  2. Theoretical and testing performance of an innovative indirect evaporative chiller

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Yi; Xie, Xiaoyun [Department of Building Science and Technology, Tsinghua University, Beijing (China)

    2010-12-15

    An indirect evaporative chiller is a device used to produce chilled water at a temperature between the wet bulb temperature and dew point of the outdoor air, which can be used in building HVAC systems. This article presents a theoretical analysis and practical performance of an innovative indirect evaporative chiller. First, the process of the indirect evaporative chiller is introduced; then, the matching characteristics of the process are presented and analyzed. It can be shown that the process that produces cold water by using dry air is a nearly reversible process, so the ideal chilled water temperature produced by the indirect evaporative chiller can be set close to the dew point temperature of the chiller's inlet air. After the indirect evaporative chiller was designed, simulations were done to analyze the output water temperature, the cooling efficiency relative to the inlet dew point temperature, and the COP that the chiller can achieve. The first installation of an indirect evaporative chiller of this kind has been run for 5 years in a building in the city of Shihezi. The tested output water temperature of the chiller is around 14-20 °C, which lies between the outdoor wet bulb temperature and the dew point. The tested COP_r,s of the developed indirect evaporative chiller reaches 9.1. Compared with ordinary air conditioning systems, the indirect evaporative chiller can save more than 40% in energy consumption, because the only energy consumed is by pumps and fans. An added bonus is that the indirect evaporative chiller uses no CFCs, which pollute the atmosphere. The tested internal parameters, such as the water-air flow rate ratio and the heat transfer area for each heat transfer process inside the chiller, were analyzed and compared with the design values. The tested indoor air conditions, with a room temperature of 23-27 °C and relative humidity of 50-70%, proved that the developed practical indirect evaporative chiller

  3. On-line test of power distribution prediction system for boiling water reactors

    International Nuclear Information System (INIS)

    Nishizawa, Y.; Kiguchi, T.; Kobayashi, S.; Takumi, K.; Tanaka, H.; Tsutsumi, R.; Yokomi, M.

    1982-01-01

    A power distribution prediction system for boiling water reactors has been developed and its on-line performance test has proceeded at an operating commercial reactor. This system predicts the power distribution or thermal margin in advance of control rod operations and core flow rate change. This system consists of an on-line computer system, an operator's console with a color cathode-ray tube, and plant data input devices. The main functions of this system are present power distribution monitoring, power distribution prediction, and power-up trajectory prediction. The calculation method is based on a simplified nuclear thermal-hydraulic calculation, which is combined with a method of model identification to the actual reactor core state. It has been ascertained by the on-line test that the predicted power distribution (readings of traversing in-core probe) agrees with the measured data within 6% root-mean-square. The computing time required for one prediction calculation step is less than or equal to 1.5 min by an HIDIC-80 on-line computer

  4. Synthesis, spectroscopic and structural characterization of 5-benzoyl-4-phenyl-2-methylthio-1H-pyrimidine with theoretical calculations using density functional theory

    Science.gov (United States)

    İnkaya, Ersin; Dinçer, Muharrem; Şahan, Emine; Yıldırım, İsmail

    2013-10-01

    In this paper, we report a combined experimental and theoretical investigation of the molecular structure and spectroscopic parameters (FT-IR, 1H NMR, 13C NMR) of 5-benzoyl-4-phenyl-2-methylthio-1H-pyrimidine. The compound crystallizes in the triclinic space group P-1 with Z = 2. The molecular geometry was also optimized using the density functional theory (DFT/B3LYP) method with the 6-311G(d,p) and 6-311++G(d,p) basis sets in the ground state and compared with the experimental data. All the assignments of the theoretical frequencies were performed using potential energy distributions with the VEDA 4 program. Information about the size, shape, charge density distribution and sites of chemical reactivity of the molecules has been obtained by mapping the electron density isosurface with the electrostatic potential (ESP). In addition, the non-linear optical properties of the title compound were computed at the B3LYP/6-311++G(d,p) level. The theoretical results showed excellent agreement with the experimental values.

  5. Testing nuclear parton distributions with pA collisions at the LHC

    CERN Document Server

    Quiroga-Arias, Paloma; Wiedemann, Urs Achim

    2010-01-01

    Global perturbative QCD analyses, based on large data sets from electron-proton and hadron collider experiments, provide tight constraints on the parton distribution function (PDF) in the proton. The extension of these analyses to nuclear parton distributions (nPDF) has attracted much interest in recent years. nPDFs are needed as benchmarks for the characterization of hot QCD matter in nucleus-nucleus collisions, and attract further interest since they may show novel signatures of non-linear density-dependent QCD evolution. However, it is not known from first principles whether the factorization of long-range phenomena into process-independent parton distribution, which underlies global PDF extractions for the proton, extends to nuclear effects. As a consequence, assessing the reliability of nPDFs for benchmark calculations goes beyond testing the numerical accuracy of their extraction and requires phenomenological tests of the factorization assumption. Here we argue that a proton-nucleus collision program at...

  6. Pathways from Trauma to Psychotic Experiences: A Theoretically Informed Model of Posttraumatic Stress in Psychosis

    Directory of Open Access Journals (Sweden)

    Amy Hardy

    2017-05-01

    Full Text Available In recent years, empirical data and theoretical accounts relating to the relationship between childhood victimization and psychotic experiences have accumulated. Much of this work has focused on co-occurring Posttraumatic Stress Disorder or putative causal mechanisms in isolation from each other. The complexity of posttraumatic stress reactions experienced in psychosis remains poorly understood. This paper therefore attempts to synthesize the current evidence base into a theoretically informed, multifactorial model of posttraumatic stress in psychosis. Three trauma-related vulnerability factors are proposed to give rise to intrusions and to affect how people appraise and cope with them. First, understandable attempts to survive trauma become habitual ways of regulating emotion, manifesting in cognitive-affective, behavioral and interpersonal responses. Second, event memories, consisting of perceptual and episodic representations, are impacted by emotion experienced during trauma. Third, personal semantic memory, specifically appraisals of the self and others, are shaped by event memories. It is proposed these vulnerability factors have the potential to lead to two types of intrusions. The first type is anomalous experiences arising from emotion regulation and/or the generation of novel images derived from trauma memory. The second type is trauma memory intrusions reflecting, to varying degrees, the retrieval of perceptual, episodic and personal semantic representations. It is speculated trauma memory intrusions may be experienced on a continuum from contextualized to fragmented, depending on memory encoding and retrieval. Personal semantic memory will then impact on how intrusions are appraised, with habitual emotion regulation strategies influencing people’s coping responses to these. Three vignettes are outlined to illustrate how the model accounts for different pathways between victimization and psychosis, and implications for therapy are

  7. Theoretical and practical program in the non-destructive testing by eddy currents - the first level

    International Nuclear Information System (INIS)

    Shaaban, H.I.; Addarwish, J.M.A.

    2014-11-01

    Testing by eddy currents is one of the non-destructive tests that use electromagnetic properties as the basis of the testing procedure; several other methods use the same principle, including remote field testing and the magnetic flux leakage test. Eddy currents are electrical currents moving in a circular path; they take their name from the eddies that form when a liquid or gas moves in a circular path around obstacles in its track. They are generated in the material by a varying magnetic field. Non-destructive testing by eddy currents is a technique used for the detection of defects and discontinuities in a material; it relies on the generation of small eddy currents in the part to be examined, provided that this part is made of an electrically conducting material. This technique and its scientific basis are explained in this book, as are the devices used in this technique and how to use them in detail. The book contains twelve chapters: Introduction to non-destructive testing - Engineering materials and their mechanical characteristics - Electrical and magnetic characteristics of engineering materials - Introduction to testing by eddy currents - Factors affecting eddy currents - Basis of electrical circuits used in eddy current testing devices - Probes for eddy current testing - Eddy current testing devices (theoretical) - Analysis of the examination results of testing by eddy currents: techniques and applications - Applications of testing by eddy currents - Eddy current testing devices (practical) - Practical lessons for the first level in testing by eddy currents.

  8. Theoretical analysis of the distribution of isolated particles in totally asymmetric exclusion processes: Application to mRNA translation rate estimation

    Science.gov (United States)

    Dao Duc, Khanh; Saleem, Zain H.; Song, Yun S.

    2018-01-01

    The Totally Asymmetric Exclusion Process (TASEP) is a classical stochastic model for describing the transport of interacting particles, such as ribosomes moving along the messenger ribonucleic acid (mRNA) during translation. Although this model has been widely studied in the past, the extent of collision between particles and the average distance between a particle to its nearest neighbor have not been quantified explicitly. We provide here a theoretical analysis of such quantities via the distribution of isolated particles. In the classical form of the model in which each particle occupies only a single site, we obtain an exact analytic solution using the matrix ansatz. We then employ a refined mean-field approach to extend the analysis to a generalized TASEP with particles of an arbitrary size. Our theoretical study has direct applications in mRNA translation and the interpretation of experimental ribosome profiling data. In particular, our analysis of data from Saccharomyces cerevisiae suggests a potential bias against the detection of nearby ribosomes with a gap distance of less than approximately three codons, which leads to some ambiguity in estimating the initiation rate and protein production flux for a substantial fraction of genes. Despite such ambiguity, however, we demonstrate theoretically that the interference rate associated with collisions can be robustly estimated and show that approximately 1% of the translating ribosomes get obstructed.
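
    As a hedged, purely numerical companion to the analysis described above (a direct Monte Carlo simulation, not the matrix-ansatz or refined mean-field calculation), the sketch below runs a single-site TASEP with open boundaries under random-sequential updates and measures the probability that a particle is isolated, i.e., has both neighbouring sites empty.

```python
# Minimal sketch: Monte Carlo simulation of a single-site TASEP with open
# boundaries (injection rate alpha, extraction rate beta, random-sequential
# updates), measuring the probability that a particle is isolated, i.e. has both
# neighbouring sites empty. Illustrative only; not the matrix-ansatz calculation.
import numpy as np

def isolated_fraction(L=200, alpha=0.25, beta=0.7, sweeps=3000, burn_in=1000, seed=6):
    rng = np.random.default_rng(seed)
    occ = np.zeros(L, dtype=bool)
    samples = []
    for sweep in range(sweeps):
        for _ in range(L + 1):                       # one sweep = L+1 attempted updates
            i = rng.integers(-1, L)                  # -1 stands for the injection move
            if i == -1:
                if not occ[0] and rng.random() < alpha:
                    occ[0] = True                    # inject at the left boundary
            elif i == L - 1:
                if occ[-1] and rng.random() < beta:
                    occ[-1] = False                  # extract at the right boundary
            elif occ[i] and not occ[i + 1]:
                occ[i], occ[i + 1] = False, True     # hop one site to the right
        if sweep >= burn_in:
            interior = occ[1:-1] & ~occ[:-2] & ~occ[2:]   # particles with both neighbours empty
            samples.append(interior.sum() / max(occ[1:-1].sum(), 1))
    return float(np.mean(samples))

frac = isolated_fraction()
print(f"isolated-particle fraction = {frac:.3f} "
      f"(an uncorrelated gas at density 0.25 would give {(1 - 0.25) ** 2:.3f})")
```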

  9. Parton distributions for the LHC Run II

    CERN Document Server

    Ball, Richard D.; Carrazza, Stefano; Deans, Christopher S.; Del Debbio, Luigi; Forte, Stefano; Guffanti, Alberto; Hartland, Nathan P.; Latorre, José I.; Rojo, Juan; Ubiali, Maria

    2015-01-01

    We present NNPDF3.0, the first set of parton distribution functions (PDFs) determined with a methodology validated by a closure test. NNPDF3.0 uses a global dataset including HERA-II deep-inelastic inclusive cross-sections, the combined HERA charm data, jet production from ATLAS and CMS, vector boson rapidity and transverse momentum distributions from ATLAS, CMS and LHCb, W+c data from CMS and top quark pair production total cross sections from ATLAS and CMS. Results are based on LO, NLO and NNLO QCD theory and also include electroweak corrections. To validate our methodology, we show that PDFs determined from pseudo-data generated from a known underlying law correctly reproduce the statistical distributions expected on the basis of the assumed experimental uncertainties. This closure test ensures that our methodological uncertainties are negligible in comparison to the generic theoretical and experimental uncertainties of PDF determination. This enables us to determine with confidence PDFs at different pertu...
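
    The closure-test logic described above can be illustrated on a toy problem: generate pseudo-data from a known underlying law with the assumed experimental uncertainties, refit many times, and check that the fitted parameters are unbiased and that their spread matches the fit-reported uncertainties. The quadratic "law" below is a made-up stand-in for a PDF parametrization.

```python
# Minimal sketch of a closure test: generate pseudo-data from a known underlying
# "law" with assumed experimental uncertainties, refit many times, and check that
# the fitted parameters are unbiased and that their spread matches the
# fit-reported uncertainties. The quadratic law is a toy stand-in for a PDF fit.
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(0.0, 1.0, 30)
true_c = np.array([-1.5, 2.0, 1.0])                 # highest power first (polyfit ordering)
truth = np.polyval(true_c, x)                       # known underlying law
sigma = 0.05 * np.ones_like(x)                      # assumed experimental uncertainties

coefs = []
for _ in range(500):                                # pseudo-experiments
    pseudo = truth + sigma * rng.standard_normal(x.size)
    c, cov = np.polyfit(x, pseudo, deg=2, w=1.0 / sigma, cov="unscaled")
    coefs.append(c)
coefs = np.array(coefs)

print("parameter bias              :", np.round(coefs.mean(axis=0) - true_c, 4))
print("spread over pseudo-data fits:", np.round(coefs.std(axis=0), 4))
print("fit-reported uncertainties  :", np.round(np.sqrt(np.diag(cov)), 4))
```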

  10. Goodness-of-Fit Tests for Generalized Normal Distribution for Use in Hydrological Frequency Analysis

    Science.gov (United States)

    Das, Samiran

    2018-04-01

    The use of three-parameter generalized normal (GNO) as a hydrological frequency distribution is well recognized, but its application is limited due to unavailability of popular goodness-of-fit (GOF) test statistics. This study develops popular empirical distribution function (EDF)-based test statistics to investigate the goodness-of-fit of the GNO distribution. The focus is on the case most relevant to the hydrologist, namely, that in which the parameter values are unidentified and estimated from a sample using the method of L-moments. The widely used EDF tests such as Kolmogorov-Smirnov, Cramer von Mises, and Anderson-Darling (AD) are considered in this study. A modified version of AD, namely, the Modified Anderson-Darling (MAD) test, is also considered and its performance is assessed against other EDF tests using a power study that incorporates six specific Wakeby distributions (WA-1, WA-2, WA-3, WA-4, WA-5, and WA-6) as the alternative distributions. The critical values of the proposed test statistics are approximated using Monte Carlo techniques and are summarized in chart and regression equation form to show the dependence of shape parameter and sample size. The performance results obtained from the power study suggest that the AD and a variant of the MAD (MAD-L) are the most powerful tests. Finally, the study performs case studies involving annual maximum flow data of selected gauged sites from Irish and US catchments to show the application of the derived critical values and recommends further assessments to be carried out on flow data sets of rivers with various hydrological regimes.
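
    Since the three-parameter GNO with L-moment estimation is not available in scipy, the sketch below uses a two-parameter lognormal fitted by maximum likelihood as a stand-in, showing only the general Monte Carlo procedure for approximating critical values of the Anderson-Darling statistic when parameters are re-estimated from each simulated sample.

```python
# Minimal sketch: Monte Carlo approximation of critical values for the
# Anderson-Darling statistic when the distribution's parameters are re-estimated
# from each sample. The three-parameter GNO fitted by L-moments is not in scipy,
# so a two-parameter lognormal fitted by maximum likelihood is used as a stand-in.
import numpy as np
from scipy import stats

def anderson_darling(x, cdf):
    x = np.sort(x)
    n = x.size
    u = np.clip(cdf(x), 1e-12, 1.0 - 1e-12)
    i = np.arange(1, n + 1)
    return -n - np.mean((2 * i - 1) * (np.log(u) + np.log(1.0 - u[::-1])))

def critical_values(n=50, n_sim=2000, probs=(0.90, 0.95, 0.99), seed=8):
    rng = np.random.default_rng(seed)
    a2 = np.empty(n_sim)
    for k in range(n_sim):
        x = stats.lognorm.rvs(s=0.6, size=n, random_state=rng)
        s_hat, loc_hat, scale_hat = stats.lognorm.fit(x, floc=0)   # refit on every sample
        a2[k] = anderson_darling(x, lambda v: stats.lognorm.cdf(v, s_hat, loc_hat, scale_hat))
    return {p: round(float(np.quantile(a2, p)), 3) for p in probs}

print(critical_values())
```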

  11. Designing Better Graphs by Including Distributional Information and Integrating Words, Numbers, and Images

    Science.gov (United States)

    Lane, David M.; Sandor, Aniko

    2009-01-01

    Statistical graphs are commonly used in scientific publications. Unfortunately, graphs in psychology journals rarely portray distributional information beyond central tendency, and few graphs portray inferential statistics. Moreover, those that do portray inferential information generally do not portray it in a way that is useful for interpreting…

  12. Theoretical Studies of Small-System Thermodynamics in Energetic Materials

    Science.gov (United States)

    2016-01-06

    This is a comprehensive theoretical research program to investigate the fundamental principles of small-system thermodynamics (a.k.a. nanothermodynamics). The proposed work is motivated by our desire to better understand the fundamental dynamics and thermodynamics of... (Final report: Theoretical Studies of Small-System Thermodynamics in Energetic Materials; approved for public release, distribution unlimited.)

  13. Photoelectron angular distribution from free SiO2 nanoparticles as a probe of elastic electron scattering.

    Science.gov (United States)

    Antonsson, E; Langer, B; Halfpap, I; Gottwald, J; Rühl, E

    2017-06-28

    In order to gain quantitative information on the surface composition of nanoparticles from X-ray photoelectron spectroscopy, a detailed understanding of photoelectron transport phenomena in these samples is needed. Theoretical results on the elastic and inelastic scattering have been reported, but a rigorous experimental verification is lacking. We report in this work on the photoelectron angular distribution from free SiO 2 nanoparticles (d = 122 ± 9 nm) after ionization by soft X-rays above the Si 2p and O 1s absorption edges, which gives insight into the relative importance of elastic and inelastic scattering channels in the sample particles. The photoelectron angular anisotropy is found to be lower for photoemission from SiO 2 nanoparticles than that expected from the theoretical values for the isolated Si and O atoms in the photoelectron kinetic energy range 20-380 eV. The reduced angular anisotropy is explained by elastic scattering of the outgoing photoelectrons from neighboring atoms, smearing out the atomic distribution. Photoelectron angular distributions yield detailed information on photoelectron elastic scattering processes allowing for a quantification of the number of elastic scattering events the photoelectrons have undergone prior to leaving the sample. The interpretation of the experimental photoelectron angular distributions is complemented by Monte Carlo simulations, which take inelastic and elastic photoelectron scattering into account using theoretical values for the scattering cross sections. The results of the simulations reproduce the experimental photoelectron angular distributions and provide further support for the assignment that elastic and inelastic electron scattering processes need to be considered.
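
    As a hedged illustration of the effect discussed above (not the paper's Monte Carlo, which uses theoretical scattering cross sections), the sketch below starts photoelectrons from a dipole angular distribution 1 + beta*P2(cos theta), applies a Poisson-distributed number of elastic collisions with a Henyey-Greenstein phase function as a generic stand-in, and reports the resulting effective anisotropy parameter.

```python
# Minimal sketch: Monte Carlo estimate of how elastic scattering washes out the
# photoelectron angular anisotropy. Assumptions (not the paper's inputs): an
# initial dipole distribution 1 + beta*P2(cos theta), a Poisson-distributed number
# of elastic collisions, and a Henyey-Greenstein phase function (asymmetry g != 0)
# as a generic stand-in for the true differential elastic cross section.
import numpy as np

rng = np.random.default_rng(9)

def sample_dipole_costheta(beta, n):
    """Rejection-sample cos(theta) from f ~ 1 + beta * P2(cos theta)."""
    out = np.empty(n)
    filled = 0
    fmax = 1.0 + abs(beta)
    while filled < n:
        c = rng.uniform(-1.0, 1.0, n)
        f = 1.0 + beta * 0.5 * (3.0 * c ** 2 - 1.0)
        keep = c[rng.uniform(0.0, fmax, n) < f]
        take = min(n - filled, keep.size)
        out[filled:filled + take] = keep[:take]
        filled += take
    return out

def sample_hg_coschi(g, n):
    """Henyey-Greenstein scattering-angle cosines (asymmetry parameter g != 0)."""
    u = rng.uniform(size=n)
    return (1.0 + g ** 2 - ((1.0 - g ** 2) / (1.0 - g + 2.0 * g * u)) ** 2) / (2.0 * g)

def effective_beta(beta0=1.5, mean_collisions=1.0, g=0.6, n=200_000):
    cos_t = sample_dipole_costheta(beta0, n)
    n_coll = rng.poisson(mean_collisions, n)
    for _ in range(int(n_coll.max())):
        active = n_coll > 0
        cos_chi = sample_hg_coschi(g, int(active.sum()))
        phi = rng.uniform(0.0, 2.0 * np.pi, int(active.sum()))
        sin_t = np.sqrt(1.0 - cos_t[active] ** 2)
        sin_chi = np.sqrt(np.clip(1.0 - cos_chi ** 2, 0.0, None))
        # rotate each active electron by its scattering angle about a random azimuth
        cos_t[active] = np.clip(cos_t[active] * cos_chi + sin_t * sin_chi * np.cos(phi), -1.0, 1.0)
        n_coll[active] -= 1
    return 5.0 * np.mean(0.5 * (3.0 * cos_t ** 2 - 1.0))   # beta_eff = 5 <P2(cos theta)>

for lam in (0.0, 0.5, 1.0, 2.0):
    print(f"mean elastic collisions {lam}: effective beta = {effective_beta(mean_collisions=lam):.2f}")
```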

  14. THE ROLE AND IMPORTANCE OF THEORETICAL PREPARATION ON “PHYSICAL EDUCATION” FOR HIGHSCHOOL STUDENTS

    Directory of Open Access Journals (Sweden)

    DANIEL DOCU AXELERAD

    2009-12-01

    Full Text Available According to the pre-university curriculum, one of the criteria for assessing the level of acquisition of a subject is the quality of the theoretical knowledge. The basic organizing documents of school physical education stipulated, and still stipulate, exact requirements regarding the theoretical knowledge expected of students at various education levels. According to these documents, theoretical knowledge was general knowledge, to which is added knowledge pertaining to the basic information of the given subject, information about the means and methods of physical education, information from the domain of prophylactic physical education, etc. Special knowledge is the students' knowledge of the various sports tests provided by the school curriculum, such as the sporting games (volleyball, basketball, football, handball), athletics (running, jumping, throwing) and gymnastics (apparatus and floor exercises), together with the means and methods applied in acquiring these components. The special knowledge category also includes knowledge of the means, forms and methods used to develop the basic motor qualities (force, speed, flexibility, resistance, skills), as well as the procedures for evaluating them. Nevertheless, even though the normative documents organizing physical education, high school included, provide that theoretical knowledge should be acquired, there is no actual presentation of the specific requirements and the assessment criteria for the level of acquisition. No document specifies how to evaluate the volume and quality of the acquired theoretical knowledge, which is why we present here a detailed analysis of the level of acquisition of theoretical knowledge for the "Physical Education" subject by high school students after applying the teaching-learning-evaluation technique on the

  15. Theoretical reflections on the paradigmatic construction of Information Science: considerations about the (s paradigm (s cognitive (s and social

    Directory of Open Access Journals (Sweden)

    Jonathas Luiz Carvalho Silva

    2013-07-01

    Full Text Available This paper presents research on the theoretical and epistemological processes that influence the formation of the cognitive paradigm of Information Science (IS), noting the emergence of the social paradigm within domain analysis and the hermeneutics of information. To this end, we draw on the reflections of classical and contemporary authors such as Thomas Kuhn, Boaventura Santos, Capurro, Hjørland and Albrechtsen. We conclude that the cognitive paradigm in IS is a consolidated matter, whereas the social paradigm is still under construction, which will allow the creation of perceptions, interpretations and contributions that fill gaps left by other paradigms.

  16. Implementation of Web-based Information Systems in Distributed Organizations

    DEFF Research Database (Denmark)

    Bødker, Keld; Pors, Jens Kaaber; Simonsen, Jesper

    2004-01-01

    This article presents results elicited from studies conducted in relation to implementing a web-based information system throughout a large distributed organization. We demonstrate the kind of expectations and conditions for change that management face in relation to open-ended, configurable......, and context specific web-based information systems like Lotus QuickPlace. Our synthesis from the empirical findings is related to two recent models, the improvisational change management model suggested by Orlikowski and Hofman (1997), and Gallivan's (2001) model for organizational adoption and assimilation....... In line with comparable approaches from the knowledge management area (Dixon 2000; Markus 2001), we relate to, refine, and operationalize the models from an overall organizational view by identifying and characterizing four different and general implementation contexts...

  17. Preliminary Calculations of Bypass Flow Distribution in a Multi-Block Air Test

    International Nuclear Information System (INIS)

    Kim, Min Hwan; Tak, Nam Il

    2011-01-01

    The development of a methodology for the bypass flow assessment in a prismatic VHTR (Very High Temperature Reactor) core has been conducted at KAERI. A preliminary estimation of the variation of the local bypass flow gap size between graphite blocks in the NHDD core was carried out. With the predicted gap sizes, their influence on the bypass flow distribution and the core hot spot was assessed. Due to the complexity of gap distributions, a system thermo-fluid analysis code is suggested as a tool for the core thermo-fluid analysis, the model and correlations of which should be validated. In order to generate data for validating the bypass flow analysis model, an experimental facility for a multi-block air test was constructed at Seoul National University (SNU). This study is focused on the preliminary evaluation of flow distribution in the test section to understand how the flow is distributed and to help the selection of experimental cases. A commercial CFD code, ANSYS CFX, is used for the analyses

  18. Proceedings of the Technology Roadmap Workshop on Communication and Control Systems for Distributed Energy Implementation and Testing

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2002-05-01

    More than 50 experts from energy and information technology industries, Federal and State government agencies, universities, and National Laboratories participated in the “Communication and Control Systems for Distributed Energy Implementation and Testing Workshop” in Reston, Virginia, on May 14-15, 2002. This was a unique workshop in that, for the first time, representatives from the information technology sector and those from energy-related industries, Federal and State government agencies, universities, and National Laboratories, gathered to discuss these issues and develop a set of action-oriented implementation strategies. A planning committee of industry, consultant, and government representatives laid the groundwork for the workshop by identifying key participants and developing an appropriate agenda. This document reflects the ideas and priorities discussed by workshop participants.

  19. 78 FR 2992 - Agency Information Collection Activities; Proposed Collection; Comment Request; Distribution of...

    Science.gov (United States)

    2013-01-15

    ... consequence analyses (OCA) as well as other elements of the risk management program. On August 5, 1999, the...). The Act required the President to promulgate regulations on the distribution of OCA information (CAA... responsibility to promulgate regulations to govern the dissemination of OCA information to the public. The final...

  20. 3D nonrigid medical image registration using a new information theoretic measure

    Science.gov (United States)

    Li, Bicao; Yang, Guanyu; Coatrieux, Jean Louis; Li, Baosheng; Shu, Huazhong

    2015-11-01

    This work presents a novel method for the nonrigid registration of medical images based on the Arimoto entropy, a generalization of the Shannon entropy. The proposed method employs the Jensen-Arimoto divergence measure as a similarity metric to quantify the statistical dependence between medical images. Free-form deformations were adopted as the transformation model and Parzen window estimation was applied to compute the probability distributions. A penalty term is incorporated into the objective function to smooth the nonrigid transformation. The goal of registration is to minimize an objective function consisting of a dissimilarity term and a penalty term, which is minimal when the two deformed images are well aligned; the limited-memory BFGS method is used for the optimization, yielding the optimal geometric transformation. To validate the performance of the proposed method, experiments on both simulated 3D brain MR images and real 3D thoracic CT data sets were designed and carried out with the open-source elastix package. For the simulated experiments, the registration errors of 3D brain MR images with various magnitudes of known deformations and different levels of noise were measured. For the real data tests, four data sets of 4D thoracic CT from four patients were selected to assess the registration performance of the method, each 4D CT data set comprising ten 3D CT images covering an entire respiration cycle. These results were compared with the normalized cross correlation and mutual information methods and show a slight but real improvement in registration accuracy.
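
    As a hedged illustration of the similarity measure described above (the Arimoto entropy form and the "joint versus product of marginals" construction below are common choices that may differ in detail from the paper's formulation, and no deformation model or optimizer is included), the sketch below computes a Jensen-Arimoto style dependence score between two images from a Parzen-smoothed joint intensity histogram.

```python
# Minimal sketch: a Jensen-Arimoto style dependence measure between two images,
# computed from a Parzen-smoothed joint intensity histogram. The Arimoto entropy
# form and the "joint vs product of marginals" comparison are assumptions that may
# differ in detail from the paper's exact formulation.
import numpy as np
from scipy.ndimage import gaussian_filter

def arimoto_entropy(p, alpha=1.5):
    """H_alpha(p) = alpha/(1-alpha) * ((sum p^alpha)^(1/alpha) - 1)."""
    p = p[p > 0]
    return alpha / (1.0 - alpha) * (np.sum(p ** alpha) ** (1.0 / alpha) - 1.0)

def jensen_arimoto(p, q, alpha=1.5):
    m = 0.5 * (p + q)
    return arimoto_entropy(m, alpha) - 0.5 * (arimoto_entropy(p, alpha) + arimoto_entropy(q, alpha))

def ja_dependence(img_a, img_b, bins=32, parzen_sigma=1.0, alpha=1.5):
    hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    joint = gaussian_filter(hist, parzen_sigma)            # Parzen-window smoothing
    joint /= joint.sum()
    product = np.outer(joint.sum(axis=1), joint.sum(axis=0))
    return jensen_arimoto(joint.ravel(), product.ravel(), alpha)

rng = np.random.default_rng(10)
fixed = rng.random((64, 64))
aligned = fixed + 0.05 * rng.standard_normal((64, 64))      # strongly related image
shuffled = rng.permutation(fixed.ravel()).reshape(64, 64)   # unrelated image
print("aligned  :", round(ja_dependence(fixed, aligned), 4))
print("unrelated:", round(ja_dependence(fixed, shuffled), 4))
```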