WorldWideScience

Sample records for information-theoretic-based thermo-statistical approach

  1. Model selection and inference a practical information-theoretic approach

    CERN Document Server

    Burnham, Kenneth P

    1998-01-01

    This book is unique in that it covers the philosophy of model-based data analysis and an omnibus strategy for the analysis of empirical data. The book introduces information theoretic approaches and focuses critical attention on a priori modeling and the selection of a good approximating model that best represents the inference supported by the data. Kullback-Leibler information represents a fundamental quantity in science and is Hirotugu Akaike's basis for model selection. The maximized log-likelihood function can be bias-corrected to provide an estimate of expected, relative Kullback-Leibler information. This leads to Akaike's Information Criterion (AIC) and various extensions; these are relatively simple and easy to use in practice, but little taught in statistics classes and far less understood in the applied sciences than should be the case. The information theoretic approaches provide a unified and rigorous theory, an extension of likelihood theory, an important application of information theory, and are ...
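
    As a concrete illustration of the criterion described above, the following minimal Python sketch computes AIC and its small-sample correction AICc from a model's maximized log-likelihood; the numeric log-likelihoods and parameter counts are hypothetical.

      def aic(log_likelihood, k):
          # AIC = -2 log L + 2k: an estimate of expected relative
          # Kullback-Leibler information loss, up to an additive constant
          return -2.0 * log_likelihood + 2.0 * k

      def aicc(log_likelihood, k, n):
          # Small-sample corrected AIC, recommended when n / k is small
          return aic(log_likelihood, k) + 2.0 * k * (k + 1) / (n - k - 1)

      # Rank two candidate models fitted to the same n = 50 observations
      scores = {"model_A": aicc(-120.3, k=3, n=50), "model_B": aicc(-118.9, k=5, n=50)}
      best = min(scores, key=scores.get)  # smaller AICc = less estimated information loss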

  2. One-dimensional barcode reading: an information theoretic approach

    Science.gov (United States)

    Houni, Karim; Sawaya, Wadih; Delignon, Yves

    2008-03-01

    In the convergence context of identification technology and information-data transmission, the barcode found its place as the simplest and the most pervasive solution for new uses, especially within mobile commerce, bringing youth to this long-lived technology. From a communication theory point of view, a barcode is a singular coding based on a graphical representation of the information to be transmitted. We present an information theoretic approach for 1D image-based barcode reading analysis. With a barcode facing the camera, distortions and acquisition are modeled as a communication channel. The performance of the system is evaluated by means of the average mutual information quantity. On the basis of this theoretical criterion for a reliable transmission, we introduce two new measures: the theoretical depth of field and the theoretical resolution. Simulations illustrate the gain of this approach.
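
    To make the channel view concrete, here is a minimal sketch (not the authors' optical model) that treats a black/white barcode module read through a blurred, noisy imager as a binary channel and evaluates the average mutual information; the crossover probability eps is a hypothetical stand-in for the distortion and acquisition model.

      import numpy as np

      def average_mutual_information(p_x, p_y_given_x):
          # I(X;Y) in bits for input distribution p_x and channel matrix p(y|x)
          p_xy = p_x[:, None] * p_y_given_x          # joint p(x, y)
          p_y = p_xy.sum(axis=0)                     # output marginal
          with np.errstate(divide="ignore", invalid="ignore"):
              terms = p_xy * np.log2(p_xy / (p_x[:, None] * p_y[None, :]))
          return float(np.nansum(terms))

      eps = 0.05  # probability that blur + noise flips a module's color
      channel = np.array([[1 - eps, eps], [eps, 1 - eps]])
      print(average_mutual_information(np.array([0.5, 0.5]), channel))  # ~0.71 bits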

  3. Thermo-dynamical contours of electronic-vibrational spectra simulated using the statistical quantum-mechanical methods

    DEFF Research Database (Denmark)

    Pomogaev, Vladimir; Pomogaeva, Anna; Avramov, Pavel

    2011-01-01

    Three polycyclic organic molecules in various solvents focused on thermo-dynamical aspects were theoretically investigated using the recently developed statistical quantum mechanical/classical molecular dynamics method for simulating electronic-vibrational spectra. The absorption bands of estradiol...

  4. Statistical optimization of thermo-alkali stable xylanase production from Bacillus tequilensis strain ARMATI

    Directory of Open Access Journals (Sweden)

    Ameer Khusro

    2016-07-01

    Conclusions: The cellulase-free xylanase was alkali-tolerant and thermo-stable, with potential applicability at industrial scale. This statistical approach made a major contribution to enhancing enzyme production from the isolate by optimizing the independent factors, and represents a first report on the enhanced production of thermo-alkali-stable cellulase-free xylanase from B. tequilensis.

  5. An Information-Theoretic Approach to PMU Placement in Electric Power Systems

    OpenAIRE

    Li, Qiao; Cui, Tao; Weng, Yang; Negi, Rohit; Franchetti, Franz; Ilic, Marija D.

    2012-01-01

    This paper presents an information-theoretic approach to address the phasor measurement unit (PMU) placement problem in electric power systems. Different from the conventional 'topological observability' based approaches, this paper advocates a much more refined, information-theoretic criterion, namely the mutual information (MI) between the PMU measurements and the power system states. The proposed MI criterion can not only include the full system observability as a special case, but also ca...
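
    The abstract does not spell out the full formulation, so the sketch below assumes a linear-Gaussian measurement model y = Hx + n, under which the mutual information between states and a measurement subset has a closed log-determinant form and greedy selection is a natural placement heuristic; the matrix H and all parameters are illustrative, not taken from the paper.

      import numpy as np

      def gaussian_mi(H, prior_cov, noise_var, subset):
          # I(x; y_S) = 0.5 * logdet(I + H_S Sigma_x H_S^T / sigma^2)
          Hs = H[list(subset), :]
          M = np.eye(len(subset)) + Hs @ prior_cov @ Hs.T / noise_var
          return 0.5 * np.linalg.slogdet(M)[1]

      def greedy_placement(H, prior_cov, noise_var, budget):
          chosen, candidates = [], set(range(H.shape[0]))
          for _ in range(budget):
              best = max(candidates,
                         key=lambda i: gaussian_mi(H, prior_cov, noise_var, chosen + [i]))
              chosen.append(best)
              candidates.remove(best)
          return chosen

      rng = np.random.default_rng(0)
      H = rng.normal(size=(12, 6))  # hypothetical bus-to-state sensitivities
      print(greedy_placement(H, np.eye(6), noise_var=0.1, budget=3))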

  6. Statistical approach for selection of biologically informative genes.

    Science.gov (United States)

    Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N

    2018-05-20

    Selection of informative genes from high-dimensional gene expression data has emerged as an important research area in genomics. Many of the gene selection techniques proposed so far are based on either a relevancy or a redundancy measure. Further, the performance of these techniques has been judged through post-selection classification accuracy computed with a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biologically sufficient criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique against 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g., subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple-criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes which are more biologically relevant. The proposed technique is also found to be quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that under the multiple-criteria decision-making setup, the proposed technique is best for informative gene selection over the available alternatives. Based on the proposed approach, an R package, BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. This study provides a practical guide to selecting statistical techniques for informative gene selection.
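
    Boot-MRMR itself ships as the R package cited above; as a language-neutral illustration of the underlying max-relevance/min-redundancy idea, the following simplified Python sketch uses absolute correlation in place of the paper's relevance and redundancy measures and omits the bootstrap layer.

      import numpy as np

      def mrmr_select(X, y, n_select):
          # Greedily pick genes with high relevance to the class labels y
          # and low redundancy with the genes already selected
          n_genes = X.shape[1]
          relevance = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_genes)])
          selected, remaining = [], list(range(n_genes))
          for _ in range(n_select):
              def score(j):
                  if not selected:
                      return relevance[j]
                  redundancy = np.mean([abs(np.corrcoef(X[:, j], X[:, k])[0, 1])
                                        for k in selected])
                  return relevance[j] - redundancy
              best = max(remaining, key=score)
              selected.append(best)
              remaining.remove(best)
          return selected

      rng = np.random.default_rng(0)
      X = rng.normal(size=(60, 200))               # 60 subjects x 200 genes
      y = (X[:, 0] + X[:, 1] > 0).astype(float)    # labels driven by genes 0 and 1
      print(mrmr_select(X, y, n_select=5))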

  7. Inform: Efficient Information-Theoretic Analysis of Collective Behaviors

    Directory of Open Access Journals (Sweden)

    Douglas G. Moore

    2018-06-01

    Full Text Available The study of collective behavior has traditionally relied on a variety of different methodological tools ranging from more theoretical methods such as population or game-theoretic models to empirical ones like Monte Carlo or multi-agent simulations. An approach that is increasingly being explored is the use of information theory as a methodological framework to study the flow of information and the statistical properties of collectives of interacting agents. While a few general purpose toolkits exist, most of the existing software for information theoretic analysis of collective systems is limited in scope. We introduce Inform, an open-source framework for efficient information theoretic analysis that exploits the computational power of a C library while simplifying its use through a variety of wrappers for common higher-level scripting languages. We focus on two such wrappers here: PyInform (Python) and rinform (R). Inform and its wrappers are cross-platform and general-purpose. They include classical information-theoretic measures, measures of information dynamics and information-based methods to study the statistical behavior of collective systems, and expose a lower-level API that allows users to construct measures of their own. We describe the architecture of the Inform framework, study its computational efficiency and use it to analyze three different case studies of collective behavior: biochemical information storage in regenerating planaria, nest-site selection in the ant Temnothorax rugatulus, and collective decision making in multi-agent simulations.
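
    Rather than guessing at the exact PyInform/rinform call signatures, the following self-contained Python sketch illustrates one measure of information dynamics in the library's scope: a plug-in estimate of transfer entropy for discrete time series.

      import numpy as np
      from collections import Counter

      def transfer_entropy(source, target, k=1):
          # T_{source -> target} in bits with target history length k,
          # estimated from joint pattern counts
          counts = Counter()
          for t in range(k, len(target)):
              past = tuple(target[t - k:t])
              counts[(past, source[t - 1], target[t])] += 1
          total = sum(counts.values())
          te = 0.0
          for (past, s, x), n in counts.items():
              p_joint = n / total
              p_past = sum(m for (p, _, _), m in counts.items() if p == past) / total
              p_past_s = sum(m for (p, q, _), m in counts.items()
                             if p == past and q == s) / total
              p_past_x = sum(m for (p, _, r), m in counts.items()
                             if p == past and r == x) / total
              te += p_joint * np.log2(p_joint * p_past / (p_past_s * p_past_x))
          return te

      rng = np.random.default_rng(2)
      x = rng.integers(0, 2, 2000)
      y = np.roll(x, 1)                 # y copies x with a one-step delay
      print(transfer_entropy(x, y))     # ~1 bit: x drives y
      print(transfer_entropy(y, x))     # ~0 bits: no influence back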

  8. An information-theoretic approach to the modeling and analysis of whole-genome bisulfite sequencing data.

    Science.gov (United States)

    Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John

    2018-03-07

    DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high resolution genome-wide methylation profiles. Statistical modeling and analysis are employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied to single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrates a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data. This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of
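
    As a small illustration of the comparison step, the sketch below computes the Jensen-Shannon distance between two distributions of methylation level within a region; the distributions are hypothetical stand-ins for those the paper derives from its Ising joint probability model fitted to WGBS reads.

      import numpy as np

      def jensen_shannon_distance(p, q):
          # Square root of the JS divergence; base-2 logs keep it in [0, 1]
          p, q = np.asarray(p, float), np.asarray(q, float)
          m = 0.5 * (p + q)
          def kl(a, b):
              mask = a > 0
              return float(np.sum(a[mask] * np.log2(a[mask] / b[mask])))
          return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))

      test = [0.10, 0.15, 0.25, 0.50]   # hypothetical test-sample distribution
      ref  = [0.40, 0.30, 0.20, 0.10]   # hypothetical reference distribution
      print(jensen_shannon_distance(test, ref))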

  9. Nonequilibrium statistical averages and thermo field dynamics

    International Nuclear Information System (INIS)

    Marinaro, A.; Scarpetta, Q.

    1984-01-01

    An extension of thermo field dynamics is proposed, which permits the computation of nonequilibrium statistical averages. The Brownian motion of a quantum oscillator is treated as an example. In conclusion, it is pointed out that the proposed procedure for computing time-dependent statistical averages gives the correct two-point Green function for the damped oscillator. A simple extension can be used to compute two-point Green functions of free particles.

  10. A new formalism for non extensive physical systems: Tsallis Thermo statistics

    International Nuclear Information System (INIS)

    Tirnakli, U.; Bueyuekkilic, F.; Demirhan, D.

    1999-01-01

    Although Boltzmann-Gibbs (BG) statistics provides a suitable tool which enables us to handle a large number of physical systems satisfactorily, it has some basic restrictions. Recently a nonextensive thermostatistics was proposed by C. Tsallis to handle nonextensive physical systems, and up to now, besides the generalization of some of the conventional concepts, the formalism has been successful in a number of physical applications. In this study, our aim is to introduce Tsallis thermostatistics in some detail and to emphasize its achievements on physical systems, noting the recent developments along this line.
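
    For reference, the generalized entropy at the core of the formalism, with entropic index q and Boltzmann constant k, is

      S_q = k \, \frac{1 - \sum_i p_i^{\,q}}{q - 1}, \qquad \lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i = S_{BG},

    and for two independent subsystems A and B it obeys the pseudo-additivity rule that marks nonextensivity:

      S_q(A + B) = S_q(A) + S_q(B) + \frac{1 - q}{k} \, S_q(A) \, S_q(B).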

  11. Experimental and theoretical studies of buoyant-thermo capillary flow

    International Nuclear Information System (INIS)

    Favre, E.; Blumenfeld, L.; Soubbaramayer

    1996-01-01

    In the AVLIS process, uranium metal is evaporated using a high power electron gun. We have previously discussed the power balance equation in the electron beam evaporation process and pointed out, among the loss terms, the importance of the power loss due to the convective flow in the molten pool driven by buoyancy and thermocapillarity. An empirical formula has been derived from model experiments with cerium to estimate the latter power loss, and that formula can be used practically in engineering calculations. In order to complement the empirical approach, a more fundamental research program of theoretical and experimental studies has been carried out at CEA, France, with the objective of understanding the basic phenomena (heat transport, flow instabilities, turbulence, etc.) occurring in a convective flow in a liquid layer locally heated on its free surface.

  12. Information-Theoretic Approaches for Evaluating Complex Adaptive Social Simulation Systems

    Energy Technology Data Exchange (ETDEWEB)

    Omitaomu, Olufemi A [ORNL; Ganguly, Auroop R [ORNL; Jiao, Yu [ORNL

    2009-01-01

    In this paper, we propose information-theoretic approaches for comparing and evaluating complex agent-based models. In information-theoretic terms, entropy and mutual information are two measures of system complexity. We used entropy as a measure of the regularity of the number of agents in a social class, and mutual information as a measure of information shared by two social classes. Using our approaches, we compared two analogous agent-based (AB) models developed for a regional-scale social-simulation system. The first AB model, called ABM-1, is a complex AB model built with 10,000 agents on a desktop environment using aggregate data; the second AB model, ABM-2, was built with 31 million agents on a high-performance computing framework located at Oak Ridge National Laboratory, with fine-resolution data from the LandScan Global Population Database. The initializations were slightly different, with ABM-1 using samples from a probability distribution and ABM-2 using polling data from Gallup for a deterministic initialization. The geographical and temporal domain was present-day Afghanistan, and the end result was the number of agents with one of three behavioral modes (pro-insurgent, neutral, and pro-government) corresponding to the population mindshare. The theories embedded in each model were identical, and the test simulations focused on a test of three leadership theories (legitimacy, coercion, and representative) and two social mobilization theories (social influence and repression). The theories are tied together using the Cobb-Douglas utility function. Based on our results, the hypothesis that performance measures can be developed to compare and contrast AB models appears to be supported. Furthermore, we observed significant bias in the two models. Even so, further tests and investigations are required, not only with a wider class of theories and AB models, but also with additional observed or simulated data and more comprehensive performance measures.
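
    A minimal sketch of the two measures named above: the entropy of a class-occupancy distribution and the mutual information between two models' outputs. The joint frequency table is hypothetical, not taken from the study.

      import numpy as np

      def entropy(p):
          p = np.asarray(p, float).ravel()
          p = p[p > 0]
          return float(-np.sum(p * np.log2(p)))

      def mutual_information(joint):
          # I(X;Y) = H(X) + H(Y) - H(X,Y) from a joint probability table
          joint = np.asarray(joint, float)
          return entropy(joint.sum(axis=1)) + entropy(joint.sum(axis=0)) - entropy(joint)

      # Hypothetical joint frequencies of behavioral modes (pro-insurgent,
      # neutral, pro-government) across ABM-1 (rows) and ABM-2 (columns)
      joint = np.array([[0.10, 0.05, 0.02],
                        [0.05, 0.30, 0.08],
                        [0.02, 0.08, 0.30]])
      print(mutual_information(joint))  # bits shared by the two models' outputs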

  13. Thermo-driven microcrawlers fabricated via a microfluidic approach

    International Nuclear Information System (INIS)

    Wang Wei; Yao Chen; Zhang Maojie; Ju Xiaojie; Xie Rui; Chu Liangyin

    2013-01-01

    A novel thermo-driven microcrawler that can transform thermal stimuli into directional mechanical motion is developed by a simple microfluidic approach together with emulsion-template synthesis. The microcrawler is designed with a thermo-responsive poly(N-isopropylacrylamide) (PNIPAM) hydrogel body and a bell-like structure with an eccentric cavity. The asymmetric shrinking–swelling circulation of the microcrawlers enables a thermo-driven locomotion responding to repeated temperature changes, which provides a novel model with symmetry breaking principle for designing biomimetic soft microrobots. The microfluidic approach offers a novel and promising platform for design and fabrication of biomimetic soft microrobots. (paper)

  14. Information Ergonomics A theoretical approach and practical experience in transportation

    CERN Document Server

    Sandl, Peter

    2012-01-01

    The variety and increasing availability of hypermedia information systems, which are used in stationary applications like operators' consoles as well as mobile systems, e.g. driver information and navigation systems in automobiles, form a foundation for the mediatization of society. From the human engineering point of view, this development and the ensuing increased importance of information systems for economic and private needs require careful deliberation of the derivation and application of ergonomics methods, particularly in the field of information systems. This book consists of two closely intertwined parts. The first, theoretical part defines the concept of an information system, followed by an explanation of action regulation as well as cognitive theories to describe man-information-system interaction. A comprehensive description of information ergonomics concludes the theoretical approach. In the second, practically oriented part of this book, authors from industry as well as from academic institu...

  15. Information theoretic description of networks

    Science.gov (United States)

    Wilhelm, Thomas; Hollunder, Jens

    2007-11-01

    We present a new information theoretic approach for network characterizations. It is developed to describe the general type of networks with n nodes and L directed and weighted links, i.e., it also works for the simpler undirected and unweighted networks. The new information theoretic measures for network characterizations are based on a transmitter-receiver analogy of effluxes and influxes. Based on these measures, we classify networks as either complex or non-complex and as either democracy or dictatorship networks. Directed networks, in particular, are furthermore classified as either information spreading or information collecting networks. The complexity classification is based on the information theoretic network complexity measure medium articulation (MA). It is proven that special networks with a medium number of links (L ~ n^1.5) show the theoretical maximum complexity MA = (log n)^2 / 2. A network is complex if its MA is larger than the average MA of appropriately randomized networks: MA > MA_r. A network is of the democracy type if its redundancy R is larger than that of appropriately randomized networks (R > R_r); otherwise it is a dictatorship network. In democracy networks all nodes are, on average, of similar importance, whereas in dictatorship networks some nodes play distinguished roles in network functioning. In other words, democracy networks are characterized by cycling of information (or mass, or energy), while in dictatorship networks there is a straight through-flow from sources to sinks. The classification of directed networks into information spreading and information collecting networks is based on the conditional entropies of the considered networks (H(A/B) = uncertainty of the sender node if the receiver node is known; H(B/A) = uncertainty of the receiver node if the sender node is known): if H(A/B) > H(B/A), it is an information collecting network; otherwise it is an information spreading network. Finally, different real networks (directed and undirected, weighted and unweighted) are classified according to our general scheme.
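
    A minimal sketch of the transmitter-receiver analogy under an assumed convention: with W[i, j] the flux from node i to node j, normalizing by the total flux gives a joint sender/receiver distribution from which the conditional entropies, and hence the spreading/collecting classification, follow.

      import numpy as np

      def flow_entropies(W):
          p = W / W.sum()                              # joint p(sender, receiver)
          p_out, p_in = p.sum(axis=1), p.sum(axis=0)   # sender and receiver marginals
          H = lambda x: float(-np.sum(x[x > 0] * np.log2(x[x > 0])))
          H_AB = H(p.ravel())
          H_A_given_B = H_AB - H(p_in)   # uncertainty of sender given receiver
          H_B_given_A = H_AB - H(p_out)  # uncertainty of receiver given sender
          kind = "collecting" if H_A_given_B > H_B_given_A else "spreading"
          return H_A_given_B, H_B_given_A, kind

      W = np.array([[0, 2, 1],
                    [0, 0, 3],
                    [1, 0, 0]], dtype=float)  # toy directed, weighted flux matrix
      print(flow_entropies(W))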

  16. Dynamic statistical information theory

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    ... capacities reflecting the dynamic dissipation characteristics of the transmission processes, which reduce to their maxima, the existing static mutual information and static channel capacity, in the limit where the ratio of channel length to information transmission rate approaches zero. All these unified and rigorous theoretical formulas and results are derived from the evolution equations of dynamic information and dynamic entropy without any extra assumptions. In this review, we give an overview of the above main ideas, methods and results, and discuss the similarities and differences between the two kinds of dynamic statistical information theories.

  17. The equivalence of information-theoretic and likelihood-based methods for neural dimensionality reduction.

    Directory of Open Access Journals (Sweden)

    Ross S Williamson

    2015-04-01

    Full Text Available Stimulus dimensionality-reduction methods in neuroscience seek to identify a low-dimensional space of stimulus features that affect a neuron's probability of spiking. One popular method, known as maximally informative dimensions (MID), uses an information-theoretic quantity known as "single-spike information" to identify this space. Here we examine MID from a model-based perspective. We show that MID is a maximum-likelihood estimator for the parameters of a linear-nonlinear-Poisson (LNP) model, and that the empirical single-spike information corresponds to the normalized log-likelihood under a Poisson model. This equivalence implies that MID does not necessarily find maximally informative stimulus dimensions when spiking is not well described as Poisson. We provide several examples to illustrate this shortcoming, and derive a lower bound on the information lost when spiking is Bernoulli in discrete time bins. To overcome this limitation, we introduce model-based dimensionality reduction methods for neurons with non-Poisson firing statistics, and show that they can be framed equivalently in likelihood-based or information-theoretic terms. Finally, we show how to overcome practical limitations on the number of stimulus dimensions that MID can estimate by constraining the form of the non-parametric nonlinearity in an LNP model. We illustrate these methods with simulations and data from primate visual cortex.
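
    The equivalence can be made concrete with a small sketch: the per-spike normalized Poisson log-likelihood of an LNP model, the quantity the paper relates to the empirical single-spike information. The exponential nonlinearity, filter, and simulated data below are illustrative assumptions, not the paper's fitted models.

      import numpy as np

      def lnp_log_likelihood(X, n, w, f, dt):
          # Poisson log-likelihood of binned spike counts n with rate_t = f(w . x_t);
          # the log(n_t!) term is dropped since it does not depend on the parameters
          rates = f(X @ w)
          return float(np.sum(n * np.log(rates * dt) - rates * dt))

      rng = np.random.default_rng(1)
      X = rng.normal(size=(1000, 8))            # hypothetical stimulus segments
      w_true = 0.3 * rng.normal(size=8)
      dt = 0.01
      n = rng.poisson(np.exp(X @ w_true) * dt)  # simulated spike counts
      print(lnp_log_likelihood(X, n, w_true, np.exp, dt) / max(n.sum(), 1))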

  18. Do pseudo-absence selection strategies influence species distribution models and their predictions? An information-theoretic approach based on simulated data

    Directory of Open Access Journals (Sweden)

    Guisan Antoine

    2009-04-01

    Full Text Available Abstract Background Multiple logistic regression is precluded from many practical applications in ecology that aim to predict the geographic distributions of species because it requires absence data, which are rarely available or are unreliable. In order to use multiple logistic regression, many studies have simulated "pseudo-absences" through a number of strategies, but it is unknown how the choice of strategy influences models and their geographic predictions of species. In this paper we evaluate the effect of several prevailing pseudo-absence strategies on the predictions of the geographic distribution of a virtual species whose "true" distribution and relationship to three environmental predictors were predefined. We evaluated the effect of using (a) real absences, (b) pseudo-absences selected randomly from the background, and (c) two-step approaches: pseudo-absences selected from low suitability areas predicted by either Ecological Niche Factor Analysis (ENFA) or BIOCLIM. We compared how the choice of pseudo-absence strategy affected model fit, predictive power, and information-theoretic model selection results. Results Models built with true absences had the best predictive power, best discriminatory power, and the "true" model (the one that contained the correct predictors) was supported by the data according to AIC, as expected. Models based on random pseudo-absences had among the lowest fit, but yielded the second highest AUC value (0.97), and the "true" model was also supported by the data. Models based on two-step approaches had intermediate fit, the lowest predictive power, and the "true" model was not supported by the data. Conclusion If ecologists wish to build parsimonious GLM models that will allow them to make robust predictions, a reasonable approach is to use a large number of randomly selected pseudo-absences, and perform model selection based on an information theoretic approach. However, the resulting models can be expected to have

  19. Thermo-electro-chemical storage (TECS) of solar energy

    International Nuclear Information System (INIS)

    Wenger, Erez; Epstein, Michael; Kribus, Abraham

    2017-01-01

    Highlights: • A solar plant with thermally regenerative battery unifies energy conversion and storage. • Storage is a flow battery with thermo-chemical charging and electro-chemical discharging. • Sodium-sulfur and zinc-air systems are investigated as candidate storage materials. • Theoretical solar to electricity efficiencies of over 60% are predicted. • Charging temperature can be lowered with hybrid carbothermic reduction. - Abstract: A new approach for solar electricity generation and storage is proposed, based on the concept of thermally regenerative batteries. Concentrated sunlight is used for external thermo-chemical charging of a flow battery, and electricity is produced by conventional electro-chemical discharge of the battery. The battery replaces the steam turbine, currently used in commercial concentrated solar power (CSP) plants, potentially leading to much higher conversion efficiency. This approach offers potential performance, cost and operational advantages compared to existing solar technologies, and to existing storage solutions for management of an electrical grid with a significant contribution of intermittent solar electricity generation. Here we analyze the theoretical conversion efficiency for new thermo-electro-chemical storage (TECS) plant schemes based on the electro-chemical systems of sodium-sulfur (Na-S) and zinc-air. The thermodynamic upper limit of solar to electricity conversion efficiency for an ideal TECS cycle is about 60% for Na-S at reactor temperature of 1550 K, and 65% for the zinc-air system at 1750 K, both under sunlight concentration of 3000. A hybrid process with carbothermic reduction in the zinc-air system reaches 60% theoretical efficiency at the more practical conditions of reaction temperature <1200 K and concentration <1000. Practical TECS plant efficiency, estimated from these upper limits, may then be much higher compared to existing solar electricity technologies. The technical and economical

  20. Information systems development of analysis company financial state based on the expert-statistical approach

    Directory of Open Access Journals (Sweden)

    M. N. Ivliev

    2016-01-01

    Full Text Available The work is devoted to methods of analyzing a company's financial condition, including aggregated ratings. It is proposed to use a generalized solvency and liquidity indicator and a capital structure composite index. Mathematically, the generalized index is a weighted sum of characteristic variables, with weighting factors characterizing the relative importance of the individual characteristics. It is proposed to select the significant features from a set of standard financial ratios calculated from enterprises' balance sheets. To obtain the values of the weighting factors, it is proposed to use one of the expert-statistical approaches, the analytic hierarchy process. The method is as follows: the most important characteristic is chosen, and the experts determine the degree of preference for the main feature based on a linguistic scale. Then a matrix of pairwise comparisons is compiled from the assigned ranks, which characterizes the relative importance of the attributes. The required coefficients are determined as the elements of a priority vector, the principal eigenvector of the matrix of pairwise comparisons. The paper proposes a mechanism for finding the fields for the analysis of rating numbers. In addition, the paper proposes a method for the statistical evaluation of the balance sheets of various companies by calculating mutual correlation matrices. Based on the considered mathematical methods for determining quantitative characteristics of enterprises' financial and economic activities, algorithms, information support and software were developed to implement various systems of economic analysis.
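
    A minimal sketch of the weighting step described above: priority weights are extracted from a pairwise-comparison matrix as its principal eigenvector, the standard analytic hierarchy process computation. The expert judgments below are hypothetical.

      import numpy as np

      def ahp_weights(pairwise):
          # Normalized principal eigenvector of the pairwise-comparison matrix
          vals, vecs = np.linalg.eig(np.asarray(pairwise, float))
          v = np.real(vecs[:, np.argmax(np.real(vals))])
          return v / v.sum()

      # Hypothetical judgments: ratio 1 is 3x as important as ratio 2
      # and 5x as important as ratio 3
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])
      print(ahp_weights(A))  # weighting factors for the composite index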

  1. Information theoretic preattentive saliency

    DEFF Research Database (Denmark)

    Loog, Marco

    2011-01-01

    Employing an information theoretic operational definition of bottom-up attention from the field of computational visual perception, a very general expression for saliency is provided. As opposed to many of the current approaches to determining a saliency map there is no need for an explicit data ... of which features, image information is described. We illustrate our result by determining a few specific saliency maps based on particular choices of features. One of them makes the link with the mapping underlying well-known Harris interest points, which is a result recently obtained in isolation...

  2. Hedonic approaches based on spatial econometrics and spatial statistics: application to evaluation of project benefits

    Science.gov (United States)

    Tsutsumi, Morito; Seya, Hajime

    2009-12-01

    This study discusses the theoretical foundation of the application of spatial hedonic approaches—the hedonic approach employing spatial econometrics and/or spatial statistics—to benefits evaluation. The study highlights the limitations of the spatial econometrics approach since it uses a spatial weight matrix that is not employed by the spatial statistics approach. Further, the study presents empirical analyses by applying the Spatial Autoregressive Error Model (SAEM), which is based on the spatial econometrics approach, and the Spatial Process Model (SPM), which is based on the spatial statistics approach. SPMs are conducted based on both isotropy and anisotropy and applied to different mesh sizes. The empirical analysis reveals that the estimated benefits are quite different, especially between isotropic and anisotropic SPM and between isotropic SPM and SAEM; the estimated benefits are similar for SAEM and anisotropic SPM. The study demonstrates that the mesh size does not affect the estimated amount of benefits. Finally, the study provides a confidence interval for the estimated benefits and raises an issue with regard to benefit evaluation.

  3. Permutation statistical methods an integrated approach

    CERN Document Server

    Berry, Kenneth J; Johnston, Janis E

    2016-01-01

    This research monograph provides a synthesis of a number of statistical tests and measures, which, at first consideration, appear disjoint and unrelated. Numerous comparisons of permutation and classical statistical methods are presented, and the two methods are compared via probability values and, where appropriate, measures of effect size. Permutation statistical methods, compared to classical statistical methods, do not rely on theoretical distributions, avoid the usual assumptions of normality and homogeneity of variance, and depend only on the data at hand. This text takes a unique approach to explaining statistics by integrating a large variety of statistical methods, and establishing the rigor of a topic that to many may seem to be a nascent field in statistics. This topic is new in that it took modern computing power to make permutation methods available to people working in the mainstream of research. This research monograph addresses a statistically-informed audience, and can also easily serve as a ...
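
    A minimal sketch of the book's core idea: a two-sample permutation test whose p-value depends only on the data at hand, with no normality or homogeneity-of-variance assumptions. The simulated samples are illustrative.

      import numpy as np

      def permutation_test(x, y, n_perm=10000, rng=None):
          # Two-sided permutation test on the difference in sample means
          rng = rng or np.random.default_rng()
          observed = np.mean(x) - np.mean(y)
          pooled = np.concatenate([x, y])
          count = 0
          for _ in range(n_perm):
              rng.shuffle(pooled)
              diff = np.mean(pooled[:len(x)]) - np.mean(pooled[len(x):])
              if abs(diff) >= abs(observed):
                  count += 1
          return (count + 1) / (n_perm + 1)   # add-one to avoid a zero p-value

      rng = np.random.default_rng(7)
      print(permutation_test(rng.normal(0.8, 1, 20), rng.normal(0.0, 1, 20), rng=rng))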

  4. Information theoretic learning Renyi's entropy and Kernel perspectives

    CERN Document Server

    Principe, Jose C

    2010-01-01

    This book presents the first cohesive treatment of Information Theoretic Learning (ITL) algorithms to adapt linear or nonlinear learning machines in both supervised and unsupervised paradigms. ITL is a framework where the conventional concepts of second order statistics (covariance, L2 distances, correlation functions) are substituted by scalars and functions with information theoretic underpinnings, respectively entropy, mutual information and correntropy. ITL quantifies the stochastic structure of the data beyond second order statistics for improved performance without using full-blown Bayesi

  5. An Information-Theoretic Approach for Indirect Train Traffic Monitoring Using Building Vibration

    Directory of Open Access Journals (Sweden)

    Susu Xu

    2017-05-01

    Full Text Available This paper introduces an indirect train traffic monitoring method to detect and infer real-time train events based on the vibration response of a nearby building. Monitoring and characterizing traffic events are important for cities seeking to improve the efficiency of transportation systems (e.g., train passing, heavy trucks, and traffic). Most prior work falls into two categories: (1) methods that require intensive labor to manually record events, or (2) systems that require deployment of dedicated sensors. These approaches are difficult and costly to execute and maintain. In addition, most prior work uses dedicated sensors designed for a single purpose, resulting in deployment of multiple sensor systems and further increasing costs. Meanwhile, with the increasing demands of structural health monitoring, many vibration sensors are being deployed in commercial buildings. Traffic events create ground vibration that propagates to nearby building structures, inducing noisy vibration responses. We present an information-theoretic method for train event monitoring using commonly existing vibration sensors deployed for building health monitoring. The key idea is to represent the wave propagation in a building induced by train traffic as information conveyed in noisy measurement signals. Our technique first uses wavelet analysis to detect train events. Then, by analyzing information exchange patterns of building vibration signals, we infer the category of the events (i.e., southbound or northbound train). Our algorithm is evaluated with an 11-story building where trains pass by frequently. The results show that the method can robustly achieve a train event detection accuracy of up to a 93% true positive rate and an 80% true negative rate. For direction categorization, compared with the traditional signal processing method, our information-theoretic approach reduces categorization error from 32.1% to 12.1%, a 2.5× improvement.

  6. A short course in quantum information theory an approach from theoretical physics

    CERN Document Server

    Diosi, Lajos

    2011-01-01

    This short and concise primer takes the vantage point of theoretical physics and the unity of physics. It sets out to strip the burgeoning field of quantum information science to its basics by linking it to universal concepts in physics. An extensive lecture rather than a comprehensive textbook, this volume is based on courses delivered over several years to advanced undergraduate and beginning graduate students, but essentially it addresses anyone with a working knowledge of basic quantum physics. Readers will find these lectures a most adequate entry point for theoretical studies in this field. For the second edition, the author has succeeded in adding many new topics while sticking to the conciseness of the overall approach. A new chapter on qubit thermodynamics has been added, while new sections and subsections have been incorporated in various chapters to deal with weak and time-continuous measurements, period-finding quantum algorithms and quantum error corrections. From the reviews of the first edition...

  7. Anatomy of the Higgs fits: A first guide to statistical treatments of the theoretical uncertainties

    Directory of Open Access Journals (Sweden)

    Sylvain Fichet

    2016-04-01

    Full Text Available The studies of the Higgs boson couplings based on the recent and upcoming LHC data open up a new window on physics beyond the Standard Model. In this paper, we propose a statistical guide to the consistent treatment of the theoretical uncertainties entering the Higgs rate fits. Both the Bayesian and frequentist approaches are systematically analysed in a unified formalism. We present analytical expressions for the marginal likelihoods, useful to implement simultaneously the experimental and theoretical uncertainties. We review the various origins of the theoretical errors (QCD, EFT, PDF, production mode contamination…). All these individual uncertainties are thoroughly combined with the help of moment-based considerations. The theoretical correlations among Higgs detection channels appear to affect the location and size of the best-fit regions in the space of Higgs couplings. We discuss the recurrent question of the shape of the prior distributions for the individual theoretical errors and find that a nearly Gaussian prior arises from the error combinations. We also develop the bias approach, which is an alternative to marginalisation providing more conservative results. The statistical framework to apply the bias principle is introduced and two realisations of the bias are proposed. Finally, depending on the statistical treatment, the Standard Model prediction for the Higgs signal strengths is found to lie within either the 68% or 95% confidence level region obtained from the latest analyses of the 7 and 8 TeV LHC datasets.

  8. Anatomy of the Higgs fits: A first guide to statistical treatments of the theoretical uncertainties

    Science.gov (United States)

    Fichet, Sylvain; Moreau, Grégory

    2016-04-01

    The studies of the Higgs boson couplings based on the recent and upcoming LHC data open up a new window on physics beyond the Standard Model. In this paper, we propose a statistical guide to the consistent treatment of the theoretical uncertainties entering the Higgs rate fits. Both the Bayesian and frequentist approaches are systematically analysed in a unified formalism. We present analytical expressions for the marginal likelihoods, useful to implement simultaneously the experimental and theoretical uncertainties. We review the various origins of the theoretical errors (QCD, EFT, PDF, production mode contamination…). All these individual uncertainties are thoroughly combined with the help of moment-based considerations. The theoretical correlations among Higgs detection channels appear to affect the location and size of the best-fit regions in the space of Higgs couplings. We discuss the recurrent question of the shape of the prior distributions for the individual theoretical errors and find that a nearly Gaussian prior arises from the error combinations. We also develop the bias approach, which is an alternative to marginalisation providing more conservative results. The statistical framework to apply the bias principle is introduced and two realisations of the bias are proposed. Finally, depending on the statistical treatment, the Standard Model prediction for the Higgs signal strengths is found to lie within either the 68% or 95% confidence level region obtained from the latest analyses of the 7 and 8 TeV LHC datasets.

  9. Exploring super-gaussianity towards robust information-theoretical time delay estimation

    DEFF Research Database (Denmark)

    Petsatodis, Theodoros; Talantzis, Fotios; Boukis, Christos

    2013-01-01

    the effect upon TDE when modeling the source signal with different speech-based distributions. An information theoretical TDE method indirectly encapsulating higher order statistics (HOS) formed the basis of this work. The underlying assumption of Gaussian distributed source has been replaced...

  10. Comparison of information-theoretic to statistical methods for gene-gene interactions in the presence of genetic heterogeneity

    Directory of Open Access Journals (Sweden)

    Sucheston Lara

    2010-09-01

    Full Text Available Abstract Background Multifactorial diseases such as cancer and cardiovascular diseases are caused by the complex interplay between genes and environment. The detection of these interactions remains challenging due to computational limitations. Information theoretic approaches use computationally efficient directed search strategies and thus provide a feasible solution to this problem. However, the power of information theoretic methods for interaction analysis has not been systematically evaluated. In this work, we compare the power and Type I error of an information-theoretic approach to existing interaction analysis methods. Methods The k-way interaction information (KWII) metric for identifying variable combinations involved in gene-gene interactions (GGI) was assessed using several simulated data sets under models of genetic heterogeneity driven by susceptibility-increasing loci with varying allele frequency, penetrance values and heritability. The power and proportion of false positives of the KWII were compared to multifactor dimensionality reduction (MDR), the restricted partitioning method (RPM) and logistic regression. Results The power of the KWII was considerably greater than MDR on all six simulation models examined. For a given disease prevalence at high values of heritability, the power of both RPM and KWII was greater than 95%. For models with low heritability and/or genetic heterogeneity, the power of the KWII was consistently greater than RPM; the improvements in power for the KWII over RPM ranged from 4.7% to 14.2% for α = 0.001 in the three models at the lowest heritability values examined. The KWII performed similarly to logistic regression. Conclusions Information theoretic models are flexible and have excellent power to detect GGI under a variety of conditions that characterize complex diseases.
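
    A minimal sketch of the KWII itself, computed from plug-in entropies over all variable subsets; for two variables it reduces to mutual information, and a positive value on the toy epistatic example below signals a synergistic gene-gene interaction. The simulated genotypes and phenotype model are illustrative, not the paper's simulation models.

      import numpy as np
      from itertools import combinations
      from collections import Counter

      def entropy(columns):
          counts = Counter(zip(*columns))
          n = sum(counts.values())
          p = np.array(list(counts.values())) / n
          return float(-np.sum(p * np.log2(p)))

      def kwii(variables):
          # KWII = -sum over nonempty subsets T of (-1)^(k - |T|) H(T)
          k = len(variables)
          total = 0.0
          for r in range(1, k + 1):
              for subset in combinations(variables, r):
                  total -= (-1) ** (k - r) * entropy(list(subset))
          return total

      rng = np.random.default_rng(3)
      snp1 = rng.integers(0, 3, 500)
      snp2 = rng.integers(0, 3, 500)
      phenotype = ((snp1 + snp2) % 2 == 0).astype(int)   # purely epistatic toy model
      print(kwii([snp1, snp2, phenotype]))               # positive => interaction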

  11. Quantum information theory and quantum statistics

    International Nuclear Information System (INIS)

    Petz, D.

    2008-01-01

    Based on lectures given by the author, this book focuses on providing reliable introductory explanations of key concepts of quantum information theory and quantum statistics - rather than on results. The mathematically rigorous presentation is supported by numerous examples and exercises and by an appendix summarizing the relevant aspects of linear analysis. Assuming that the reader is familiar with the content of standard undergraduate courses in quantum mechanics, probability theory, linear algebra and functional analysis, the book addresses graduate students of mathematics and physics as well as theoretical and mathematical physicists. Conceived as a primer to bridge the gap between statistical physics and quantum information, a field to which the author has contributed significantly himself, it emphasizes concepts and thorough discussions of the fundamental notions to prepare the reader for deeper studies, not least through the selection of well chosen exercises. (orig.)

  12. Near-field NanoThermoMechanical memory

    International Nuclear Information System (INIS)

    Elzouka, Mahmoud; Ndao, Sidy

    2014-01-01

    In this letter, we introduce the concept of NanoThermoMechanical Memory. Unlike electronic memory, a NanoThermoMechanical memory device uses heat instead of electricity to record, store, and recover data. Memory function is achieved through the coupling of near-field thermal radiation and thermal expansion, resulting in negative differential thermal resistance and thermal latching. Here, we demonstrate theoretically, via numerical modeling, the concept of near-field thermal radiation enabled negative differential thermal resistance that achieves bistable states. Design and implementation of a practical silicon-based NanoThermoMechanical memory device are proposed, along with a study of its dynamic response under write/read cycles. With more than 50% of the world's energy losses being in the form of heat, along with the ever-increasing need to develop computer technologies which can operate in harsh environments (e.g., very high temperatures), NanoThermoMechanical memory and logic devices may hold the answer.

  13. Thermo-mechanical analysis of RMP coil system for EAST tokamak

    International Nuclear Information System (INIS)

    Wang, Songke; Ji, Xiang; Song, Yuntao; Zhang, Shanwen; Wang, Zhongwei; Sun, Youwen; Qi, Minzhong; Liu, Xufeng; Wang, Shengming; Yao, Damao

    2014-01-01

    Highlights: • Thermal design requirements for EAST RMP coils are summarized. • Cooling parameters based on both theoretical and numerical solutions are determined. • A compromise between thermal design and structural design is made on the number of turns. • Thermo-mechanical calculations are made to validate the structural performance. - Abstract: Resonant magnetic perturbation (RMP) has been proved to be an efficient approach to edge localized mode (ELM) control, resistive wall mode (RWM) control, and error field correction (EFC); the RMP coil system under design for the EAST tokamak will realize these multiple functions. This paper focuses on the thermo-mechanical analysis of the EAST RMP coil system on the basis of a sensitivity analysis; both normal and off-normal working conditions are considered. The most characteristic coil set is chosen and fully modelled by means of the three-dimensional (3D) finite element method; the thermo-hydraulic and thermal-structural performances are investigated thoroughly, both locally and globally. A compromise is made between thermal performance and structural design requirements, and the results indicate that the optimized design is feasible and reasonable.

  14. An approach to build knowledge base for reactor accident diagnostic system using statistical method

    International Nuclear Information System (INIS)

    Kohsaka, Atsuo; Yokobayashi, Masao; Matsumoto, Kiyoshi; Fujii, Minoru

    1988-01-01

    In the development of a rule-based expert system, one of the key issues is how to build a knowledge base (KB). A systematic approach has been attempted for building an objective KB efficiently. The approach is based on the concept that a prototype KB should first be generated in a systematic way and then modified and/or improved by experts for practical use. The statistical method, Factor Analysis, was applied to build a prototype KB for the JAERI expert system DISKET, using source information obtained from a PWR simulator. The prototype KB was obtained, and inference with this KB was performed against several types of transients. In each diagnosis, the transient type was well identified. From this study, it is concluded that the statistical method used is useful for building a prototype knowledge base. (author)

  15. A short course in quantum information theory. An approach from theoretical physics. 2. ed.

    International Nuclear Information System (INIS)

    Diosi, Lajos

    2011-01-01

    This short and concise primer takes the vantage point of theoretical physics and the unity of physics. It sets out to strip the burgeoning field of quantum information science to its basics by linking it to universal concepts in physics. An extensive lecture rather than a comprehensive textbook, this volume is based on courses delivered over several years to advanced undergraduate and beginning graduate students, but essentially it addresses anyone with a working knowledge of basic quantum physics. Readers will find these lectures a most adequate entry point for theoretical studies in this field. For the second edition, the author has succeeded in adding many new topics while sticking to the conciseness of the overall approach. A new chapter on qubit thermodynamics has been added, while new sections and subsections have been incorporated in various chapters to deal with weak and time-continuous measurements, period-finding quantum algorithms and quantum error corrections. From the reviews of the first edition: ''The best things about this book are its brevity and clarity. In around 100 pages it provides a tutorial introduction to quantum information theory, including problems and solutions... it's worth a look if you want to quickly get up to speed with the language and central concepts of quantum information theory, including the background classical information theory.'' (Craig Savage, Australian Physics, Vol. 44 (2), 2007). (orig.)

  16. Information-Theoretic Properties of Auditory Sequences Dynamically Influence Expectation and Memory.

    Science.gov (United States)

    Agres, Kat; Abdallah, Samer; Pearce, Marcus

    2018-01-01

    A basic function of cognition is to detect regularities in sensory input to facilitate the prediction and recognition of future events. It has been proposed that these implicit expectations arise from an internal predictive coding model, based on knowledge acquired through processes such as statistical learning, but it is unclear how different types of statistical information affect listeners' memory for auditory stimuli. We used a combination of behavioral and computational methods to investigate memory for non-linguistic auditory sequences. Participants repeatedly heard tone sequences varying systematically in their information-theoretic properties. Expectedness ratings of tones were collected during three listening sessions, and a recognition memory test was given after each session. Information-theoretic measures of sequential predictability significantly influenced listeners' expectedness ratings, and variations in these properties had a significant impact on memory performance. Predictable sequences yielded increasingly better memory performance with increasing exposure. Computational simulations using a probabilistic model of auditory expectation suggest that listeners dynamically formed a new, and increasingly accurate, implicit cognitive model of the information-theoretic structure of the sequences throughout the experimental session. Copyright © 2017 Cognitive Science Society, Inc.

  17. Set-Theoretic Approach to Maturity Models

    DEFF Research Database (Denmark)

    Lasrado, Lester Allan

    Despite being widely accepted and applied, maturity models in Information Systems (IS) have been criticized for a lack of theoretical grounding, methodological rigor, empirical validation, and ignorance of multiple and non-linear paths to maturity. This PhD thesis focuses on addressing ... these criticisms by incorporating recent developments in configuration theory, in particular the application of set-theoretic approaches. The aim is to show the potential of employing a set-theoretic approach for maturity model research and to empirically demonstrate equifinal paths to maturity. Specifically ... methodological guidelines consisting of detailed procedures to systematically apply set-theoretic approaches for maturity model research, and provides demonstrations of its application on three datasets. The thesis is a collection of six research papers that are written in a sequential manner. The first paper ...

  18. Analyzing Statistical Mediation with Multiple Informants: A New Approach with an Application in Clinical Psychology.

    Science.gov (United States)

    Papa, Lesther A; Litson, Kaylee; Lockhart, Ginger; Chassin, Laurie; Geiser, Christian

    2015-01-01

    Testing mediation models is critical for identifying potential variables that need to be targeted to effectively change one or more outcome variables. In addition, it is now common practice for clinicians to use multiple informant (MI) data in studies of statistical mediation. By coupling the use of MI data with statistical mediation analysis, clinical researchers can combine the benefits of both techniques. Integrating the information from MIs into a statistical mediation model creates various methodological and practical challenges. The authors review prior methodological approaches to MI mediation analysis in clinical research and propose a new latent variable approach that overcomes some limitations of prior approaches. An application of the new approach to mother, father, and child reports of impulsivity, frustration tolerance, and externalizing problems (N = 454) is presented. The results showed that frustration tolerance mediated the relationship between impulsivity and externalizing problems. The new approach allows for a more comprehensive and effective use of MI data when testing mediation models.

  19. Information-theoretic discrepancy based iterative reconstructions (IDIR) for polychromatic x-ray tomography

    International Nuclear Information System (INIS)

    Jang, Kwang Eun; Lee, Jongha; Sung, Younghun; Lee, SeongDeok

    2013-01-01

    ... Under a particular condition, the GID is equivalent to the Poisson log-likelihood function. The newly proposed GIDs of the other two categories consist of log-transformed measurements, which have the advantage of imposing linearized penalties over multiple discrepancies. For all proposed variants of the GID, the aforementioned strategy was used to obtain a closed-form update equation. Even though it is based on the exact polychromatic model, the derived algorithm bears a structural resemblance to conventional methods based on the monochromatic approximation. The authors named the proposed approach information-theoretic discrepancy based iterative reconstructions (IDIR). In numerical experiments, IDIR with raw data converged faster than previously known statistical reconstruction methods. IDIR with log-transformed data exhibited superior reconstruction quality and faster convergence speed compared with conventional methods and their variants. Conclusions: The authors' new framework for tomographic reconstruction allows iterative inversion of the polychromatic data model. The primary departure from traditional iterative reconstruction was the employment of the GID as a new metric for quantifying the inconsistency between the measured and synthetic data. The proposed methods outperformed not only conventional methods based on the monochromatic approximation but also those based on the polychromatic model. The authors have observed that the GID is a very flexible means to design an objective function for iterative reconstructions. Hence, the authors expect that the proposed IDIR framework will also be applicable to other challenging tasks.

  20. Analyzing Statistical Mediation with Multiple Informants: A New Approach with an Application in Clinical Psychology

    Directory of Open Access Journals (Sweden)

    Lesther ePapa

    2015-11-01

    Full Text Available Testing mediation models is critical for identifying potential variables that need to be targeted to effectively change one or more outcome variables. In addition, it is now common practice for clinicians to use multiple informant (MI) data in studies of statistical mediation. By coupling the use of MI data with statistical mediation analysis, clinical researchers can combine the benefits of both techniques. Integrating the information from MIs into a statistical mediation model creates various methodological and practical challenges. The authors review prior methodological approaches to MI mediation analysis in clinical research and propose a new latent variable approach that overcomes some limitations of prior approaches. An application of the new approach to mother, father, and child reports of impulsivity, frustration tolerance, and externalizing problems (N = 454) is presented. The results showed that frustration tolerance mediated the relationship between impulsivity and externalizing problems. Advantages and limitations of the new approach are discussed. The new approach can help clinical researchers overcome limitations of prior techniques. It allows for a more comprehensive and effective use of MI data when testing mediation models.

  1. Exploring complex dynamics in multi agent-based intelligent systems: Theoretical and experimental approaches using the Multi Agent-based Behavioral Economic Landscape (MABEL) model

    Science.gov (United States)

    Alexandridis, Konstantinos T.

    This dissertation adopts a holistic and detailed approach to modeling spatially explicit agent-based artificial intelligent systems, using the Multi Agent-based Behavioral Economic Landscape (MABEL) model. The research questions addressed stem from the need to understand and analyze the real-world patterns and dynamics of land use change from a coupled human-environmental systems perspective. Describes the systemic, mathematical, statistical, socio-economic and spatial dynamics of the MABEL modeling framework, and provides a wide array of cross-disciplinary modeling applications within the research, decision-making and policy domains. Establishes the symbolic properties of the MABEL model as a Markov decision process, analyzes the decision-theoretic utility and optimization attributes of agents towards comprising statistically and spatially optimal policies and actions, and explores the probabilistic character of the agents' decision-making and inference mechanisms via the use of Bayesian belief and decision networks. Develops and describes a Monte Carlo methodology for experimental replications of agents' decisions regarding complex spatial parcel acquisition and learning. Recognizes the gap in spatially-explicit accuracy assessment techniques for complex spatial models, and proposes an ensemble of statistical tools designed to address this problem. Advanced information assessment techniques such as the Receiver-Operator Characteristic curve, the impurity entropy and Gini functions, and the Bayesian classification functions are proposed. The theoretical foundation for modular Bayesian inference in spatially-explicit multi-agent artificial intelligent systems, and the ensembles of cognitive and scenario assessment modular tools built for the MABEL model, are provided. Emphasizes modularity and robustness as valuable qualitative modeling attributes, and examines the role of robust intelligent modeling as a tool for improving policy-decisions related to land

  2. Advancing Empirical Approaches to the Concept of Resilience: A Critical Examination of Panarchy, Ecological Information, and Statistical Evidence

    Directory of Open Access Journals (Sweden)

    Ali Kharrazi

    2016-09-01

    Despite its ambiguities, the concept of resilience is of critical importance to researchers, practitioners, and policy-makers in dealing with dynamic socio-ecological systems. In this paper, we critically examine the three empirical approaches of (i) panarchy; (ii) ecological information-based network analysis; and (iii) statistical evidence of resilience against three criteria determined for achieving a comprehensive understanding and application of this concept. These criteria are the ability: (1) to reflect a system's adaptability to shocks; (2) to integrate social and environmental dimensions; and (3) to evaluate system-level trade-offs. Our findings show that none of the three currently applied approaches are strong in handling all three criteria. Panarchy is strong in the first two criteria but has difficulty with normative trade-offs. The ecological information-based approach is strongest in evaluating trade-offs but relies on common dimensions that lead to over-simplifications in integrating the social and environmental dimensions. Statistical evidence provides suggestions that are simplest and easiest to act upon but are generally weak in all three criteria. This analysis confirms the value of these approaches in specific instances but also the need for further research in advancing empirical approaches to the concept of resilience.

  3. An Information-Theoretic-Cluster Visualization for Self-Organizing Maps.

    Science.gov (United States)

    Brito da Silva, Leonardo Enzo; Wunsch, Donald C

    2018-06-01

    Improved data visualization will be a significant tool to enhance cluster analysis. In this paper, an information-theoretic-based method for cluster visualization using self-organizing maps (SOMs) is presented. The information-theoretic visualization (IT-vis) has the same structure as the unified distance matrix, but instead of depicting Euclidean distances between adjacent neurons, it displays the similarity between the distributions associated with adjacent neurons. Each SOM neuron has an associated subset of the data set whose cardinality controls the granularity of the IT-vis and with which the first- and second-order statistics are computed and used to estimate their probability density functions. These are used to calculate the similarity measure, based on Renyi's quadratic cross entropy and cross information potential (CIP). The introduced visualizations combine the low computational cost and kernel estimation properties of the representative CIP and the data structure representation of a single-linkage-based grouping algorithm to generate an enhanced SOM-based visualization. The visual quality of the IT-vis is assessed by comparing it with other visualization methods for several real-world and synthetic benchmark data sets. Thus, this paper also contains a significant literature survey. The experiments demonstrate the IT-vis cluster revealing capabilities, in which cluster boundaries are sharply captured. Additionally, the information-theoretic visualizations are used to perform clustering of the SOM. Compared with other methods, IT-vis of large SOMs yielded the best results in this paper, for which the quality of the final partitions was evaluated using external validity indices.
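
    As a rough illustration of the similarity measure named above, the following sketch estimates the cross information potential (CIP) between the data subsets of two adjacent neurons with Gaussian Parzen kernels and reports Renyi's quadratic cross entropy as -log(CIP); the shared bandwidth and the Gaussian test data are assumptions, not values from the paper.

    ```python
    # Cross information potential between the data subsets of two adjacent
    # SOM neurons, estimated with Gaussian Parzen kernels; the IT-vis
    # similarity is based on Renyi's quadratic cross entropy -log(CIP).
    import numpy as np

    def cip(a, b, sigma=0.5):
        """Mean pairwise convolved-Gaussian kernel between a (Na,d) and b (Nb,d)."""
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)   # pairwise sq. dists
        v = 2.0 * sigma ** 2                                  # var of G_sigma * G_sigma
        k = np.exp(-d2 / (2 * v)) / (2 * np.pi * v) ** (a.shape[1] / 2)
        return k.mean()

    rng = np.random.default_rng(1)
    a = rng.normal(0.0, 1.0, (200, 2))   # data mapped to neuron i
    b = rng.normal(0.5, 1.0, (220, 2))   # data mapped to adjacent neuron j
    print("Renyi quadratic cross entropy -log(CIP):", -np.log(cip(a, b)))
    ```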

  4. Identifying Statistical Dependence in Genomic Sequences via Mutual Information Estimates

    Directory of Open Access Journals (Sweden)

    Wojciech Szpankowski

    2007-12-01

    Questions of understanding and quantifying the representation and amount of information in organisms have become a central part of biological research, as they potentially hold the key to fundamental advances. In this paper, we demonstrate the use of information-theoretic tools for the task of identifying segments of biomolecules (DNA or RNA) that are statistically correlated. We develop a precise and reliable methodology, based on the notion of mutual information, for finding and extracting statistical as well as structural dependencies. A simple threshold function is defined, and its use in quantifying the level of significance of dependencies between biological segments is explored. These tools are used in two specific applications. First, they are used for the identification of correlations between different parts of the maize zmSRp32 gene. There, we find significant dependencies between the 5′ untranslated region in zmSRp32 and its alternatively spliced exons. This observation may indicate the presence of as-yet unknown alternative splicing mechanisms or structural scaffolds. Second, using data from the FBI's combined DNA index system (CODIS), we demonstrate that our approach is particularly well suited for the problem of discovering short tandem repeats—an application of importance in genetic profiling.
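
    A minimal sketch of the underlying statistic: a plug-in estimate of the mutual information between two aligned symbolic segments, with a shuffled control playing the role of a significance baseline. The synthetic sequences and the coupling probability are illustrative assumptions.

    ```python
    # Plug-in mutual information between two aligned symbolic (e.g., DNA)
    # segments; large MI relative to a shuffled control flags dependence.
    import numpy as np
    from collections import Counter

    def mutual_information(seg1, seg2):
        n = len(seg1)
        pxy = Counter(zip(seg1, seg2))
        px, py = Counter(seg1), Counter(seg2)
        return sum((c / n) * np.log2((c / n) / ((px[x] / n) * (py[y] / n)))
                   for (x, y), c in pxy.items())

    rng = np.random.default_rng(2)
    s1 = rng.choice(list("ACGT"), 1000)
    # second segment copies the first with probability 0.3 (dependence)
    s2 = np.where(rng.random(1000) < 0.3, s1, rng.choice(list("ACGT"), 1000))
    print("MI (bits/symbol):   ", mutual_information(s1, s2))
    print("MI, shuffled control:", mutual_information(s1, rng.permutation(s2)))
    ```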

  5. A short course in quantum information theory. An approach from theoretical physics

    International Nuclear Information System (INIS)

    Diosi, L.

    2007-01-01

    This short and concise primer takes the vantage point of theoretical physics and the unity of physics. It sets out to strip the burgeoning field of quantum information science to its basics by linking it to universal concepts in physics. An extensive lecture rather than a comprehensive textbook, this volume is based on courses delivered over several years to advanced undergraduate and beginning graduate students, but essentially it addresses anyone with a working knowledge of basic quantum physics. Readers will find these lectures a most adequate entry point for theoretical studies in this field. (orig.)

  6. Laser thermal effect on silicon nitride ceramic based on thermo-chemical reaction with temperature-dependent thermo-physical parameters

    International Nuclear Information System (INIS)

    Pan, A.F.; Wang, W.J.; Mei, X.S.; Wang, K.D.; Zhao, W.Q.; Li, T.Q.

    2016-01-01

    Highlights: • A two-dimensional thermo-chemical reaction model is creatively built. • The thermal conductivity and heat capacity of β-Si₃N₄ are computed accurately. • An appropriate thermo-chemical reaction rate is fitted and the reaction element length is set to assure convergence. • The deepest ablated position was not the center of the ablated area, due to plasma absorption. • The simulation results demonstrate that the thermo-chemical process cannot be simplified to a physical phase transition. - Abstract: In this study, a two-dimensional thermo-chemical reaction model with temperature-dependent thermo-physical parameters for Si₃N₄ under a 10 ns laser was developed to investigate the ablated size, volume and surface morphology after a single pulse. For the model parameters, the thermal conductivity and heat capacity of β-Si₃N₄ were obtained from first-principles calculations. The thermo-chemical reaction rate was fitted by collision theory, and the reaction element length was then deduced using the relationship between reaction rate and temperature distribution. Furthermore, plasma absorption related to energy loss was approximated as a function of the electron concentration in Si₃N₄. It turned out that the theoretical ablated volume and radius increased and then remained constant with increasing laser energy, and that the maximum ablated depth was not at the center of the ablated zone. Moreover, the surface maximum temperature of Si₃N₄ was verified to be above 3000 K within the pulse duration, much higher than its thermal decomposition temperature of 1800 K, which indicated that Si₃N₄ was not ablated directly above the thermal decomposition temperature. Meanwhile, single pulse ablation of Si₃N₄ was performed at different powers using a TEM₀₀ 10 ns pulse Nd:YAG laser to validate the model. The model showed a satisfactory consistence between the experimental data and numerical predictions, presenting a new modeling technology that may significantly increase the

  7. Physics of thermo-acoustic sound generation

    Science.gov (United States)

    Daschewski, M.; Boehm, R.; Prager, J.; Kreutzbruck, M.; Harrer, A.

    2013-09-01

    We present a generalized analytical model of thermo-acoustic sound generation based on the analysis of thermally induced energy density fluctuations and their propagation into the adjacent matter. The model provides exact analytical prediction of the sound pressure generated in fluids and solids; consequently, it can be applied to arbitrary thermal power sources such as thermophones, plasma firings, laser beams, and chemical reactions. Unlike existing approaches, our description also includes acoustic near-field effects and sound-field attenuation. Analytical results are compared with measurements of sound pressures generated by thermo-acoustic transducers in air for frequencies up to 1 MHz. The tested transducers consist of titanium and indium tin oxide coatings on quartz glass and polycarbonate substrates. The model reveals that thermo-acoustic efficiency increases linearly with the supplied thermal power and quadratically with thermal excitation frequency. Comparison of the efficiency of our thermo-acoustic transducers with those of piezoelectric-based airborne ultrasound transducers using impulse excitation showed comparable sound pressure values. The present results show that thermo-acoustic transducers can be applied as broadband, non-resonant, high-performance ultrasound sources.

  8. Analytical Expressions for Thermo-Osmotic Permeability of Clays

    Science.gov (United States)

    Gonçalvès, J.; Ji Yu, C.; Matray, J.-M.; Tremosa, J.

    2018-01-01

    In this study, a new formulation for the thermo-osmotic permeability of natural pore solutions containing monovalent and divalent cations is proposed. The mathematical formulation proposed here is based on the theoretical framework supporting thermo-osmosis which relies on water structure alteration in the pore space of surface-charged materials caused by solid-fluid electrochemical interactions. The ionic content balancing the surface charge of clay minerals causes a disruption in the hydrogen bond network when more structured water is present at the clay surface. Analytical expressions based on our heuristic model are proposed and compared to the available data for NaCl solutions. It is shown that the introduction of divalent cations reduces the thermo-osmotic permeability by one third compared to the monovalent case. The analytical expressions provided here can be used to advantage for safety calculations in deep underground nuclear waste repositories.

  9. Theoretical approaches to the steady-state statistical physics of interacting dissipative units

    Science.gov (United States)

    Bertin, Eric

    2017-02-01

    The aim of this review is to provide a concise overview of some of the generic approaches that have been developed to deal with the statistical description of large systems of interacting dissipative ‘units’. The latter notion includes, e.g. inelastic grains, active or self-propelled particles, bubbles in a foam, low-dimensional dynamical systems like driven oscillators, or even spatially extended modes like Fourier modes of the velocity field in a fluid. We first review methods based on the statistical properties of a single unit, starting with elementary mean-field approximations, either static or dynamic, that describe a unit embedded in a ‘self-consistent’ environment. We then discuss how this basic mean-field approach can be extended to account for spatial dependences, in the form of space-dependent mean-field Fokker-Planck equations, for example. We also briefly review the use of kinetic theory in the framework of the Boltzmann equation, which is an appropriate description for dilute systems. We then turn to descriptions in terms of the full N-body distribution, starting from exact solutions of one-dimensional models, using a matrix-product ansatz method when correlations are present. Since exactly solvable models are scarce, we also present some approximation methods which can be used to determine the N-body distribution in a large system of dissipative units. These methods include the Edwards approach for dense granular matter and the approximate treatment of multiparticle Langevin equations with colored noise, which models systems of self-propelled particles. Throughout this review, emphasis is put on methodological aspects of the statistical modeling and on formal similarities between different physical problems, rather than on the specific behavior of a given system.
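
    As a concrete instance of the last class of methods mentioned above, the following sketch integrates an overdamped particle driven by exponentially correlated (Ornstein-Uhlenbeck) colored noise with an Euler-Maruyama scheme; the parameters are arbitrary illustrative choices, not tied to a specific model in the review.

    ```python
    # Euler-Maruyama integration of a particle in a harmonic trap driven by
    # Ornstein-Uhlenbeck colored noise -- a minimal instance of the
    # "multiparticle Langevin equations with colored noise" mentioned above.
    import numpy as np

    rng = np.random.default_rng(3)
    dt, steps = 1e-3, 200_000
    tau, D, gamma = 0.1, 1.0, 1.0    # noise correlation time, strength, trap stiffness
    x, eta = 0.0, 0.0
    xs = np.empty(steps)
    for i in range(steps):
        # OU noise: d(eta) = -(eta/tau) dt + (sqrt(2D)/tau) dW
        eta += (-eta / tau) * dt + np.sqrt(2 * D) / tau * np.sqrt(dt) * rng.normal()
        x += (-gamma * x + eta) * dt          # overdamped particle
        xs[i] = x
    print("stationary variance of x:", xs[steps // 2:].var())
    ```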

  10. Theoretical information measurement in nonrelativistic time-dependent approach

    Science.gov (United States)

    Najafizade, S. A.; Hassanabadi, H.; Zarrinkamar, S.

    2018-02-01

    The information-theoretic measures of the time-dependent Schrödinger equation are investigated via the Shannon information entropy, variance and local Fisher quantities. In our calculations, we consider the first two states n = 0, 1 and obtain the position S_x(t) and momentum S_p(t) Shannon entropies as well as the Fisher information I_x(t) in position and I_p(t) in momentum space. Using the Fourier-transformed wave function, we obtain the results in momentum space. Some interesting features of the information entropy densities ρ_s(x,t) and γ_s(p,t), as well as the probability densities ρ(x,t) and γ(p,t) for time-dependent states, are demonstrated. We establish a general relation between variance and Fisher information. The Bialynicki-Birula-Mycielski inequality is tested and verified for the states n = 0, 1.
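
    The flavor of such calculations can be reproduced numerically. The sketch below evaluates S_x and S_p for a Gaussian wave packet on a grid (FFT for momentum space) and checks the Bialynicki-Birula-Mycielski bound S_x + S_p >= 1 + ln(pi) in one dimension with hbar = 1; the Gaussian test state is an assumption, not the paper's system.

    ```python
    # Numerical check of the Bialynicki-Birula--Mycielski inequality for a
    # Gaussian wave packet (which saturates the bound), hbar = 1, 1D.
    import numpy as np

    N, L = 4096, 40.0
    x = np.linspace(-L / 2, L / 2, N, endpoint=False)
    dx = x[1] - x[0]
    psi = np.pi ** -0.25 * np.exp(-x ** 2 / 2)        # normalized Gaussian

    rho = np.abs(psi) ** 2
    Sx = -np.sum(rho * np.log(rho + 1e-300)) * dx     # position entropy

    phi = np.fft.fft(psi) * dx / np.sqrt(2 * np.pi)   # continuum normalization
    gamma = np.abs(phi) ** 2
    dp = 2 * np.pi / (N * dx)                         # momentum grid spacing
    Sp = -np.sum(gamma * np.log(gamma + 1e-300)) * dp # momentum entropy

    print(Sx + Sp, ">=", 1 + np.log(np.pi))           # equality for Gaussians
    ```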

  11. The census categorization of ethnic identity: Between theoretical comprehensions and statistical practice

    Directory of Open Access Journals (Sweden)

    Knežević Aleksandar

    2016-01-01

    considered as a base of ethno-demographic studies of the French and Serbian populations, in which different theoretical concepts of ethnic identity created two different approaches to the ethno-statistical census process. In every Serbian census from the middle of the 19th century until today, the central topics were questions about the static ethno-cultural categories of religion and mother tongue, while the question about ethnicity was asked in every census taken after the Second World War. On the other hand, although it has the longest census tradition in Europe, official ethno-statistical evidence of the population in France has long been determined by a constant rejection of ethnic categorization and by the absence of questions about primary ethnic marks of the population. Instead, the supranational concept of legal nationality has become the central spot while, as a feature of the political integration of immigrants, the question about state of birth started to appear in the census. The main dilemma regarding the relations between prevailing theoretical concepts of ethnic identity and official statistical practice, based on the Serbian and French models, still remains. Therefore, this paper shows the two-way influence of ethno-statistical categorization and the real ethnic structure, and opens a discussion on whether ethnic identities are defined by statistics or whether ethnic identity defines the official statistics. [Project of the Ministry of Science of the Republic of Serbia, no. III 47006: Research of demographic phenomena in the function of public policies in Serbia

  12. An information-theoretic approach to assess practical identifiability of parametric dynamical systems.

    Science.gov (United States)

    Pant, Sanjay; Lombardi, Damiano

    2015-10-01

    A new approach for assessing parameter identifiability of dynamical systems in a Bayesian setting is presented. The concept of Shannon entropy is employed to measure the inherent uncertainty in the parameters. The expected reduction in this uncertainty is seen as the amount of information one expects to gain about the parameters due to the availability of noisy measurements of the dynamical system. Such expected information gain is interpreted in terms of the variance of a hypothetical measurement device that can measure the parameters directly, and is related to practical identifiability of the parameters. If the individual parameters are unidentifiable, correlation between parameter combinations is assessed through conditional mutual information to determine which sets of parameters can be identified together. The information theoretic quantities of entropy and information are evaluated numerically through a combination of Monte Carlo and k-nearest neighbour methods in a non-parametric fashion. Unlike many methods to evaluate identifiability proposed in the literature, the proposed approach takes the measurement-noise into account and is not restricted to any particular noise-structure. Whilst computationally intensive for large dynamical systems, it is easily parallelisable and is non-intrusive as it does not necessitate re-writing of the numerical solvers of the dynamical system. The application of such an approach is presented for a variety of dynamical systems--ranging from systems governed by ordinary differential equations to partial differential equations--and, where possible, validated against results previously published in the literature. Copyright © 2015 Elsevier Inc. All rights reserved.
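
    A sketch of the kind of non-parametric estimator involved: the Kozachenko-Leonenko k-nearest-neighbour estimator of differential entropy, checked against the closed form for a Gaussian. The sample data and the choice k = 4 are illustrative; this is not the authors' code.

    ```python
    # Kozachenko-Leonenko k-NN estimator of differential (Shannon) entropy,
    # the non-parametric building block combined with Monte Carlo above.
    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.special import digamma, gammaln

    def knn_entropy(samples, k=4):
        n, d = samples.shape
        tree = cKDTree(samples)
        eps, _ = tree.query(samples, k=k + 1)   # index 0 is the point itself
        eps = eps[:, -1]                        # distance to k-th neighbour
        log_cd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)  # unit-ball volume
        return digamma(n) - digamma(k) + log_cd + d * np.mean(np.log(eps))

    rng = np.random.default_rng(4)
    x = rng.normal(size=(5000, 2))
    print("kNN estimate:         ", knn_entropy(x))
    print("true Gaussian entropy:", 0.5 * np.log((2 * np.pi * np.e) ** 2))
    ```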

  13. Statistical analysis of questionnaires a unified approach based on R and Stata

    CERN Document Server

    Bartolucci, Francesco; Gnaldi, Michela

    2015-01-01

    Statistical Analysis of Questionnaires: A Unified Approach Based on R and Stata presents special statistical methods for analyzing data collected by questionnaires. The book takes an applied approach to testing and measurement tasks, mirroring the growing use of statistical methods and software in education, psychology, sociology, and other fields. It is suitable for graduate students in applied statistics and psychometrics and practitioners in education, health, and marketing. The book covers the foundations of classical test theory (CTT), test reliability, validity

  14. INNOVATIVE APPROACH TO EDUCATION AND TEACHING OF STATISTICS

    Directory of Open Access Journals (Sweden)

    Andrea Jindrová

    2010-06-01

    Educational and tutorial programs are developing together with the changing world of information technology, and it is necessary to adapt to and accept its new possibilities and needs. The use of online learning tools can amplify our teaching resources and create new types of learning opportunities that did not exist in the pre-Internet age. The world is full of information, which needs to be constantly updated. Virtualisation of study materials enables us to update and manage them quickly and easily. As an advantage, we see an asynchronous approach towards learning materials that can be tailored to students' needs and adjusted according to their time and availability. The specificity of statistical learning lies in the variety of statistical programs. The high technical demands of these programs require tutorials (instructional presentations), which can help students learn how to use them efficiently. An instructional presentation may be understood as a demonstration of how a statistical software program works. This is one of the options that students may use to simplify the utilization of, control of, and navigation through the statistical system. Thanks to instructional presentations, students will be able to transfer their theoretical statistical knowledge into practical situations and real life and, therefore, improve their personal development. The goal of this tutorial is to show an innovative approach to the learning of statistics at the Czech University of Life Sciences. The use of presentations and their benefits for students was evaluated according to results obtained from a questionnaire survey completed by students of the 4th grade of the Faculty of Economics and Management. The aim of this pilot survey was to evaluate the benefits of these instructional presentations, and the students' interest in using them. The information obtained was used as essential data for the evaluation of the efficiency of this new approach.

  15. Statistical sampling approaches for soil monitoring

    NARCIS (Netherlands)

    Brus, D.J.

    2014-01-01

    This paper describes three statistical sampling approaches for regional soil monitoring, a design-based, a model-based and a hybrid approach. In the model-based approach a space-time model is exploited to predict global statistical parameters of interest such as the space-time mean. In the hybrid

  16. Thermo-emf of cermet films based on rare earth borides

    International Nuclear Information System (INIS)

    Islamgaliev, R.K.; Zyrin, A.V.; Shulishova, O.I.; Shcherbak, I.A

    1987-01-01

    The thermo-emf and electric conductivity of granulated films which contain a solid solution of europium and praseodymium borides, Eu₀.₅Pr₀.₅B₆, as the conducting phase, and a glass-crystal binder based on alumino-magnesium fluorosilicates as the dielectric phase, are studied within the temperature range of 100-1100 K. The thermo-emf of the films has a negative sign within the temperature range of 100-500 K and does not exceed 5 μV/K in absolute value, which is close to the thermo-emf of the conducting phase. The negative sign and small value of the thermo-emf indicate that charge transfer in the granulated films is by electrons. The contribution of each component to the overall thermo-emf differs at high temperatures in different temperature ranges and depends on the individual physico-chemical properties of the materials used

  17. Toward a Theoretical Framework for Information Science

    Directory of Open Access Journals (Sweden)

    Amanda Spink

    2000-01-01

    Information Science is beginning to develop a theoretical framework for the modeling of users' interactions with information retrieval (IR) technologies within the more holistic context of human information behavior (Spink, 1998b). This paper addresses the following questions: (1) What is the nature of Information Science? and (2) What theoretical framework and model is most appropriate for Information Science? This paper proposes a theoretical framework for Information Science based on an explication of the processes of human information coordinating behavior and information feedback that facilitate the relationship between human information behavior and human interaction with information retrieval (IR) technologies (Web, digital libraries, etc.).

  18. Fast reactor safety and computational thermo-fluid dynamics approaches

    International Nuclear Information System (INIS)

    Ninokata, Hisashi; Shimizu, Takeshi

    1993-01-01

    This article provides a brief description of the safety principles on which liquid metal cooled fast breeder reactors (LMFBRs) are based and the roles of computation in safety practices. A number of thermohydraulics models have been developed to date that successfully describe several of the important types of fluid and material motion encountered in the analysis of postulated accidents in LMFBRs. Most of these models use a mixture of implicit and explicit numerical solution techniques in solving a set of conservation equations formulated in Eulerian coordinates, with special techniques included for specific situations. Typical computational thermo-fluid dynamics approaches are discussed, in particular for analyses of the physical phenomena relevant to fuel subassembly thermohydraulics design and those that involve describing the motion of molten materials in the core over a large scale. (orig.)

  19. Quantum theoretical physics is statistical and relativistic

    International Nuclear Information System (INIS)

    Harding, C.

    1980-01-01

    A new theoretical framework for quantum mechanics is presented. It is based on strictly deterministic behavior of single systems. The conventional QM equation, however, is found to describe the statistical results of many classical systems. It will be seen, moreover, that a rigorous synthesis of our theory requires relativistic kinematics. So, QM is not only a classical statistical theory; it is, of necessity, a relativistic theory. The equation of the theory does not just duplicate QM, it indicates an inherent nonlinearity in QM which is subject to experimental verification. It is shown, therefore, that conventional QM is a corollary of classical deterministic principles. It is suggested that this concept of nature conflicts with that prevalent in modern physics. (author)

  20. Molecular acidity: An accurate description with information-theoretic approach in density functional reactivity theory.

    Science.gov (United States)

    Cao, Xiaofang; Rong, Chunying; Zhong, Aiguo; Lu, Tian; Liu, Shubin

    2018-01-15

    Molecular acidity is one of the important physicochemical properties of a molecular system, yet its accurate calculation and prediction are still an unresolved problem in the literature. In this work, we propose to make use of the quantities from the information-theoretic (IT) approach in density functional reactivity theory and provide an accurate description of molecular acidity from a completely new perspective. To illustrate our point, five different categories of acidic series, singly and doubly substituted benzoic acids, singly substituted benzenesulfinic acids, benzeneseleninic acids, phenols, and alkyl carboxylic acids, have been thoroughly examined. We show that using IT quantities such as Shannon entropy, Fisher information, Ghosh-Berkowitz-Parr entropy, information gain, Onicescu information energy, and relative Rényi entropy, one is able to simultaneously predict experimental pKa values of these different categories of compounds. Because of the universality of the quantities employed in this work, which are all density dependent, our approach should be general and be applicable to other systems as well. © 2017 Wiley Periodicals, Inc.
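
    Two of the density-dependent IT quantities can be illustrated on a simple model density. The sketch below evaluates the Shannon entropy S = -∫ρ ln ρ dr and the Fisher information I = ∫|∇ρ|²/ρ dr for a hydrogen-like 1s density on a radial grid; the density is a textbook stand-in, not one of the paper's molecules.

    ```python
    # Shannon entropy and Fisher information of a hydrogen-like 1s density,
    # evaluated on a radial grid; analytic values: S = 3 + ln(pi) - 3 ln(Z),
    # I = 4 Z^2.
    import numpy as np

    r = np.linspace(1e-4, 30.0, 20000)
    dr = r[1] - r[0]
    Z = 1.0
    rho = Z ** 3 / np.pi * np.exp(-2 * Z * r)   # normalized 1s density
    w = 4 * np.pi * r ** 2                      # radial volume element

    S = -np.sum(w * rho * np.log(rho)) * dr     # Shannon entropy
    drho = np.gradient(rho, r)
    I = np.sum(w * drho ** 2 / rho) * dr        # Fisher information

    print("norm      :", np.sum(w * rho) * dr)  # ~ 1
    print("Shannon S :", S)                     # ~ 3 + ln(pi) for Z = 1
    print("Fisher I  :", I)                     # ~ 4 Z^2 = 4
    ```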

  1. The role of nonequilibrium thermo-mechanical statistics in modern technologies and industrial processes: an overview

    OpenAIRE

    Rodrigues, Clóves G.; Silva, Antônio A. P.; Silva, Carlos A. B.; Vasconcellos, Áurea R.; Ramos, J. Galvão; Luzzi, Roberto

    2010-01-01

    The nowadays notable development of all the modern technology, fundamental for the progress and well being of world society, imposes a great deal of stress in the realm of basic Physics, more precisely on Thermo-Statistics. We do face situations in electronics and optoelectronics involving physical-chemical systems far-removed-from equilibrium, where ultrafast (in pico- and femto-second scale) and non-linear processes are present. Further, we need to be aware of the rapid unfolding of nano-te...

  2. Information-theoretical approach to control of quantum-mechanical systems

    International Nuclear Information System (INIS)

    Kawabata, Shiro

    2003-01-01

    Fundamental limits on the controllability of quantum mechanical systems are discussed in the light of quantum information theory. It is shown that the amount of entropy reduction that can be extracted from a quantum system by a feedback controller is upper bounded by the sum of the decrease of entropy achievable in open-loop control and the mutual information between the quantum system and the controller. This upper bound sets a fundamental limit on the performance of any quantum controller whose design is based on the possibility of attaining low-entropy states. An application of this approach pertaining to quantum error correction is also discussed
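
    To make the bound's ingredients concrete, the sketch below computes von Neumann entropies and the quantum mutual information I(S:C) for a toy two-qubit system-plus-controller state; the state is a generic illustration, not the paper's control model.

    ```python
    # Von Neumann entropies and quantum mutual information I(S:C) for a
    # two-qubit state; I(S:C) is the term bounding the extra entropy
    # reduction that feedback can achieve over open-loop control.
    import numpy as np

    def vn_entropy(rho):
        vals = np.linalg.eigvalsh(rho)
        vals = vals[vals > 1e-12]
        return float(-(vals * np.log2(vals)).sum())

    def partial_trace(rho, keep):
        r = rho.reshape(2, 2, 2, 2)     # indices: (i j, k l), system = i,k
        return np.einsum('ijik->jk', r) if keep == 1 else np.einsum('ijkj->ik', r)

    # classically correlated state: equal mixture of |00> and |11>
    rho_sc = np.zeros((4, 4))
    rho_sc[0, 0] = rho_sc[3, 3] = 0.5

    S_s = vn_entropy(partial_trace(rho_sc, keep=0))   # system entropy
    S_c = vn_entropy(partial_trace(rho_sc, keep=1))   # controller entropy
    S_sc = vn_entropy(rho_sc)                         # joint entropy
    print("I(S:C) =", S_s + S_c - S_sc, "bits")       # 1 bit for this state
    ```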

  3. Complexity characterization in a probabilistic approach to dynamical systems through information geometry and inductive inference

    International Nuclear Information System (INIS)

    Ali, S A; Kim, D-H; Cafaro, C; Giffin, A

    2012-01-01

    Information geometric techniques and inductive inference methods hold great promise for solving computational problems of interest in classical and quantum physics, especially with regard to complexity characterization of dynamical systems in terms of their probabilistic description on curved statistical manifolds. In this paper, we investigate the possibility of describing the macroscopic behavior of complex systems in terms of the underlying statistical structure of their microscopic degrees of freedom by the use of statistical inductive inference and information geometry. We review the maximum relative entropy formalism and the theoretical structure of the information geometrodynamical approach to chaos on statistical manifolds M_S. Special focus is devoted to a description of the roles played by the sectional curvature K_{M_S}, the Jacobi field intensity J_{M_S} and the information geometrodynamical entropy S_{M_S}. These quantities serve as powerful information-geometric complexity measures of information-constrained dynamics associated with arbitrary chaotic and regular systems defined on M_S. Finally, the application of such information-geometric techniques to several theoretical models is presented.

  4. Multivariate meta-analysis: a robust approach based on the theory of U-statistic.

    Science.gov (United States)

    Ma, Yan; Mazumdar, Madhu

    2011-10-30

    Meta-analysis is the methodology for combining findings from similar research studies asking the same question. When the question of interest involves multiple outcomes, multivariate meta-analysis is used to synthesize the outcomes simultaneously, taking into account the correlation between them. Likelihood-based approaches, in particular the restricted maximum likelihood (REML) method, are commonly utilized in this context. REML assumes a multivariate normal distribution for the random-effects model. This assumption is difficult to verify, especially for meta-analyses with a small number of component studies. The use of REML also requires iterative estimation of parameters, needing moderately high computation time, especially when the dimension of outcomes is large. A multivariate method of moments (MMM) is available and has been shown to perform as well as REML. However, there is a lack of information on the performance of these two methods when the true data distribution is far from normality. In this paper, we propose a new nonparametric and non-iterative method for multivariate meta-analysis on the basis of the theory of U-statistics and compare the properties of these three procedures under both normal and skewed data through simulation studies. It is shown that the effect on estimates from REML because of non-normal data distribution is marginal and that the estimates from MMM and U-statistic-based approaches are very similar. Therefore, we conclude that for performing multivariate meta-analysis, the U-statistic estimation procedure is a viable alternative to REML and MMM. Easy implementation of all three methods is illustrated by their application to data from two published meta-analyses from the fields of hip fracture and periodontal disease. We discuss ideas for future research based on U-statistics for testing the significance of between-study heterogeneity and for extending the work to the meta-regression setting. Copyright © 2011 John Wiley & Sons, Ltd.

  5. Finite temperature dynamics of a Holstein polaron: The thermo-field dynamics approach

    Science.gov (United States)

    Chen, Lipeng; Zhao, Yang

    2017-12-01

    Combining the multiple Davydov D2 Ansatz with the method of thermo-field dynamics, we study finite temperature dynamics of a Holstein polaron on a lattice. It has been demonstrated, using the hierarchy equations of motion method as a benchmark, that our approach provides an efficient, robust description of finite temperature dynamics of the Holstein polaron in the simultaneous presence of diagonal and off-diagonal exciton-phonon coupling. The method of thermo-field dynamics handles temperature effects in the Hilbert space with key numerical advantages over other treatments of finite-temperature dynamics based on quantum master equations in the Liouville space or wave function propagation with Monte Carlo importance sampling. While for weak to moderate diagonal coupling temperature increases inhibit polaron mobility, it is found that off-diagonal coupling induces phonon-assisted transport that dominates at high temperatures. Results on the mean square displacements show that band-like transport features dominate the diagonal coupling cases, and there exists a crossover from band-like to hopping transport with increasing temperature when including off-diagonal coupling. As a proof of concept, our theory provides a unified treatment of coherent and incoherent transport in molecular crystals and is applicable to any temperature.

  6. Theoretical physics 8 statistical physics

    CERN Document Server

    Nolting, Wolfgang

    2018-01-01

    This textbook offers a clear and comprehensive introduction to statistical physics, one of the core components of advanced undergraduate physics courses. It follows on naturally from the previous volumes in this series, using methods of probability theory and statistics to solve physical problems. The first part of the book gives a detailed overview on classical statistical physics and introduces all mathematical tools needed. The second part of the book covers topics related to quantized states, gives a thorough introduction to quantum statistics, followed by a concise treatment of quantum gases. Ideally suited to undergraduate students with some grounding in quantum mechanics, the book is enhanced throughout with learning features such as boxed inserts and chapter summaries, with key mathematical derivations highlighted to aid understanding. The text is supported by numerous worked examples and end of chapter problem sets. About the Theoretical Physics series Translated from the renowned and highly successf...

  7. On nonlinear thermo-electro-elasticity.

    Science.gov (United States)

    Mehnert, Markus; Hossain, Mokarram; Steinmann, Paul

    2016-06-01

    Electro-active polymers (EAPs) for large actuations are nowadays well-known and promising candidates for producing sensors, actuators and generators. In general, polymeric materials are sensitive to differential temperature histories. During experimental characterizations of EAPs under electro-mechanically coupled loads, it is difficult to maintain constant temperature not only because of an external differential temperature history but also because of the changes in internal temperature caused by the application of high electric loads. In this contribution, a thermo-electro-mechanically coupled constitutive framework is proposed based on the total energy approach. Departing from relevant laws of thermodynamics, thermodynamically consistent constitutive equations are formulated. To demonstrate the performance of the proposed thermo-electro-mechanically coupled framework, a frequently used non-homogeneous boundary-value problem, i.e. the extension and inflation of a cylindrical tube, is solved analytically. The results illustrate the influence of various thermo-electro-mechanical couplings.

  8. Studies in Theoretical and Applied Statistics

    CERN Document Server

    Pratesi, Monica; Ruiz-Gazen, Anne

    2018-01-01

    This book includes a wide selection of the papers presented at the 48th Scientific Meeting of the Italian Statistical Society (SIS2016), held in Salerno on 8-10 June 2016. Covering a wide variety of topics ranging from modern data sources and survey design issues to measuring sustainable development, it provides a comprehensive overview of the current Italian scientific research in the fields of open data and big data in public administration and official statistics, survey sampling, ordinal and symbolic data, statistical models and methods for network data, time series forecasting, spatial analysis, environmental statistics, economic and financial data analysis, statistics in the education system, and sustainable development. Intended for researchers interested in theoretical and empirical issues, this volume provides interesting starting points for further research.

  9. Comparing simulated and theoretical sampling distributions of the U3 person-fit statistic

    NARCIS (Netherlands)

    Emons, W.H.M.; Meijer, R.R.; Sijtsma, K.

    2002-01-01

    The accuracy with which the theoretical sampling distribution of van der Flier's person-fit statistic U3 approaches the empirical U3 sampling distribution is affected by the item discrimination. A simulation study showed that for tests with a moderate or a strong mean item discrimination, the Type I

  10. A quantum information approach to statistical mechanics

    International Nuclear Information System (INIS)

    Cuevas, G.

    2011-01-01

    The field of quantum information and computation harnesses and exploits the properties of quantum mechanics to perform tasks more efficiently than is possible classically, or that may only be possible in the quantum world. Its findings and techniques have been applied to a number of fields, such as the study of entanglement in strongly correlated systems, new simulation techniques for many-body physics or, generally, to quantum optics. This thesis aims at broadening the scope of quantum information theory by applying it to problems in statistical mechanics. We focus on classical spin models, which are toy models used in a variety of systems, ranging from magnetism and neural networks to quantum gravity. We tackle these models using quantum information tools from three different angles. First, we show how the partition function of a class of widely different classical spin models (models in different dimensions, with different types of many-body interactions, different symmetries, etc.) can be mapped to the partition function of a single model. We prove this by first establishing a relation between partition functions and quantum states, and then transforming the corresponding quantum states to each other. Second, we give efficient quantum algorithms to estimate the partition function of various classical spin models, such as the Ising or the Potts model. The proof is based on a relation between partition functions and quantum circuits, which allows us to determine the quantum computational complexity of the partition function by studying the corresponding quantum circuit. Finally, we outline the possibility of applying quantum information concepts and tools to certain models of discrete quantum gravity. The latter provide a natural route to generalize our results, insofar as the central quantity has the form of a partition function, and as classical spin models are used as toy models of matter. (author)
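
    For orientation, the classical quantity at stake can be computed by brute force for tiny systems. The following sketch enumerates the partition function of a short 1D Ising chain and checks it against the transfer-matrix closed form; the parameters are arbitrary.

    ```python
    # Exact partition function of a small 1D Ising chain by enumeration --
    # the classical-spin quantity targeted by the quantum algorithms above.
    import itertools
    import numpy as np

    def partition_function(N=10, J=1.0, beta=0.5, periodic=True):
        Z = 0.0
        for spins in itertools.product([-1, 1], repeat=N):
            pairs = range(N if periodic else N - 1)
            E = -J * sum(spins[i] * spins[(i + 1) % N] for i in pairs)
            Z += np.exp(-beta * E)
        return Z

    # closed form for the periodic chain: Z = (2 cosh bJ)^N + (2 sinh bJ)^N
    b, J, N = 0.5, 1.0, 10
    print("enumeration :", partition_function(N, J, b))
    print("closed form :", (2 * np.cosh(b * J)) ** N + (2 * np.sinh(b * J)) ** N)
    ```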

  11. On the information-theoretic approach to Gödel's incompleteness theorem

    OpenAIRE

    D'Abramo, Germano

    2002-01-01

    In this paper we briefly review and analyze three published proofs of Chaitin's theorem, the celebrated information-theoretic version of Gödel's incompleteness theorem. Then, we discuss our main perplexity concerning a key step common to all these demonstrations.

  12. Comparing simulated and theoretical sampling distributions of the U3 person-fit statistic

    NARCIS (Netherlands)

    Emons, Wilco H.M.; Meijer, R.R.; Sijtsma, Klaas

    2002-01-01

    The accuracy with which the theoretical sampling distribution of van der Flier’s person-fit statistic U3 approaches the empirical U3 sampling distribution is affected by the item discrimination. A simulation study showed that for tests with a moderate or a strong mean item discrimination, the Type I

  13. THEORETICAL APPROACHES TO THE DEFINITION OF THE "INFORMATION RESOURCE"

    OpenAIRE

    I. Netreba

    2014-01-01

    Existing approaches to determining the nature of the category "information resource" are detailed and systematized. The relationships between the categories "information resource", "information technology" and "information management system" are revealed. The importance of information resources for the production process at the enterprise is determined.

  14. Distribution of apparent activation energy counterparts during thermo - And thermo-oxidative degradation of Aronia melanocarpa (black chokeberry).

    Science.gov (United States)

    Janković, Bojan; Marinović-Cincović, Milena; Janković, Marija

    2017-09-01

    The kinetics of degradation of Aronia melanocarpa fresh fruits in argon and air atmospheres were investigated. The investigation was based on probability distributions of the apparent activation energies of counterparts (ε_a). Isoconversional analysis results indicated that the degradation process in an inert atmosphere was governed by decomposition reactions of esterified compounds. Also, based on the same kinetics approach, it was assumed that in an air atmosphere the primary compounds in the degradation pathways could be anthocyanins, which undergo rapid chemical reactions. A new model of reactivity demonstrated that, under inert atmospheres, expectation values for ε_a occurred at levels of statistical probability. These values corresponded to decomposition processes in which polyphenolic compounds might be involved. The ε_a values obeyed the laws of the binomial distribution. It was established that, for thermo-oxidative degradation, the Poisson distribution represented a very successful approximation for ε_a values where there was additional mechanistic complexity and the binomial distribution was no longer valid. Copyright © 2017 Elsevier Ltd. All rights reserved.
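
    The Poisson-as-limit-of-binomial argument can be illustrated directly: for many trials with a small per-trial probability, binomial counts are nearly Poisson with the same mean. The parameters below are illustrative, not fitted to the degradation data.

    ```python
    # Poisson approximation to the binomial: for n trials with small p,
    # Binomial(n, p) ~ Poisson(n * p) -- the approximation invoked for the
    # thermo-oxidative case above.
    import numpy as np
    from scipy.stats import binom, poisson

    n, p = 200, 0.02
    k = np.arange(0, 15)
    pb, pp = binom.pmf(k, n, p), poisson.pmf(k, n * p)
    print(" k  binomial  poisson")
    for ki, b_i, p_i in zip(k, pb, pp):
        print(f"{ki:2d}  {b_i:.5f}   {p_i:.5f}")
    print("max abs difference:", np.abs(pb - pp).max())
    ```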

  15. Diffusion-Based Density-Equalizing Maps: an Interdisciplinary Approach to Visualizing Homicide Rates and Other Georeferenced Statistical Data

    Science.gov (United States)

    Mazzitello, Karina I.; Candia, Julián

    2012-12-01

    In every country, public and private agencies allocate extensive funding to collect large-scale statistical data, which in turn are studied and analyzed in order to determine local, regional, national, and international policies regarding all aspects relevant to the welfare of society. One important aspect of that process is the visualization of statistical data with embedded geographical information, which most often relies on archaic methods such as maps colored according to graded scales. In this work, we apply nonstandard visualization techniques based on physical principles. We illustrate the method with recent statistics on homicide rates in Brazil and their correlation to other publicly available data. This physics-based approach provides a novel tool that can be used by interdisciplinary teams investigating statistics and model projections in a variety of fields such as economics and gross domestic product research, public health and epidemiology, sociodemographics, political science, business and marketing, and many others.
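
    A one-dimensional caricature of a diffusion-based density-equalizing map, in the spirit of the Gastner-Newman construction: the density diffuses toward uniformity while tracked boundary points are advected with velocity v = -(dρ/dx)/ρ, so dense regions expand. The grid, time step and "city" density are illustrative assumptions, and the real algorithm is two-dimensional.

    ```python
    # 1D sketch of a diffusion-based density-equalizing map: diffuse the
    # density and advect boundary points with v = -(d rho/dx)/rho.
    import numpy as np

    nx, dt, steps = 400, 3e-6, 20000
    x = np.linspace(0.0, 1.0, nx)
    dx = x[1] - x[0]                       # dt/dx^2 < 0.5 keeps diffusion stable
    rho = 1.0 + 4.0 * np.exp(-((x - 0.5) / 0.08) ** 2)   # dense "city" at 0.5
    pts = np.array([0.35, 0.45, 0.5, 0.55, 0.65])        # region boundaries

    for _ in range(steps):
        v = -np.gradient(rho, dx) / rho
        pts += np.interp(pts, x, v) * dt                 # advect boundaries
        rho[1:-1] += dt / dx ** 2 * (rho[2:] - 2 * rho[1:-1] + rho[:-2])
        rho[0], rho[-1] = rho[1], rho[-2]                # reflecting boundaries

    print("equalized boundaries:", np.round(pts, 3))     # dense region expanded
    ```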

  16. Applications of statistical physics and information theory to the analysis of DNA sequences

    Science.gov (United States)

    Grosse, Ivo

    2000-10-01

    DNA carries the genetic information of most living organisms, and the aim of genome projects is to uncover that genetic information. One basic task in the analysis of DNA sequences is the recognition of protein coding genes. Powerful computer programs for gene recognition have been developed, but most of them are based on statistical patterns that vary from species to species. In this thesis I address the question of whether there exist universal statistical patterns that are different in coding and noncoding DNA of all living species, regardless of their phylogenetic origin. In search of such species-independent patterns I study the mutual information function of genomic DNA sequences, and find that it shows persistent period-three oscillations. To understand the biological origin of the observed period-three oscillations, I compare the mutual information function of genomic DNA sequences to the mutual information function of stochastic model sequences. I find that the pseudo-exon model is able to reproduce the mutual information function of genomic DNA sequences. Moreover, I find that a generalization of the pseudo-exon model can connect the existence and the functional form of long-range correlations to the presence and the length distributions of coding and noncoding regions. Based on these theoretical studies I am able to find an information-theoretical quantity, the average mutual information (AMI), whose probability distributions are significantly different in coding and noncoding DNA, while they are almost identical in all studied species. These findings show that there exist universal statistical patterns that are different in coding and noncoding DNA of all studied species, and they suggest that the AMI may be used to identify genes in different living species, irrespective of their taxonomic origin.
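
    The central diagnostic, the mutual information function I(k) between symbols k positions apart, is easy to sketch. The code below computes I(k) for a synthetic sequence with codon-like position-dependent base usage, which reproduces the period-three peaks discussed above; the composition probabilities are invented for illustration.

    ```python
    # Mutual information function I(k) of a symbolic sequence; a synthetic
    # codon-structured sequence shows peaks at k = 3, 6, 9, ...
    import numpy as np
    from collections import Counter

    def mi_function(seq, kmax=12):
        out = []
        for k in range(1, kmax + 1):
            pairs = list(zip(seq[:-k], seq[k:]))
            n = len(pairs)
            pxy = Counter(pairs)
            px, py = Counter(seq[:-k]), Counter(seq[k:])
            mi = sum((c / n) * np.log2(c / n / (px[a] / n * py[b] / n))
                     for (a, b), c in pxy.items())
            out.append(mi)
        return out

    rng = np.random.default_rng(5)
    # position-within-codon biased base usage (bases encoded as 0..3)
    probs = [[.4, .3, .2, .1], [.1, .4, .3, .2], [.25, .25, .25, .25]]
    seq = [rng.choice(4, p=probs[i % 3]) for i in range(30000)]
    for k, mi in enumerate(mi_function(seq), 1):
        print(k, round(mi, 5))   # largest values at k = 3, 6, 9, ...
    ```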

  17. Coupled transient thermo-fluid/thermal-stress analysis approach in a VTBM setting

    International Nuclear Information System (INIS)

    Ying, A.; Narula, M.; Zhang, H.; Abdou, M.

    2008-01-01

    A virtual test blanket module (VTBM) has been envisioned as a utility to aid in streamlining and optimizing the US ITER TBM design effort by providing an integrated multi-code, multi-physics modeling environment. Within this effort, an integrated simulation approach is being developed for TBM design calculations and performance evaluation. Particularly, integrated thermo-fluid/thermal-stress analysis is important for enabling TBM design and performance calculations. In this paper, procedures involved in transient coupled thermo-fluid/thermal-stress analysis are investigated. The established procedure is applied to study the impact of pulsed operational phenomenon on the thermal-stress response of the TBM first wall. A two-way coupling between the thermal strain and temperature field is also studied, in the context of a change in thermal conductivity of the beryllium pebble bed in a solid breeder blanket TBM due to thermal strain. The temperature field determines the thermal strain in beryllium, which in turn changes the temperature field. Iterative thermo-fluid/thermal strain calculations have been applied to both steady-state and pulsed operation conditions. All calculations have been carried out in three dimensions with representative MCAD models, including all the TBM components in their entirety

  18. Wireless Information-Theoretic Security in an Outdoor Topology with Obstacles: Theoretical Analysis and Experimental Measurements

    Directory of Open Access Journals (Sweden)

    Dagiuklas Tasos

    2011-01-01

    This paper presents a Wireless Information-Theoretic Security (WITS) scheme, which has been recently introduced as a robust physical layer-based security solution, especially for infrastructureless networks. An autonomic network of moving users was implemented via 802.11n nodes of an ad hoc network for an outdoor topology with obstacles. Obstructed-Line-of-Sight (OLOS) and Non-Line-of-Sight (NLOS) propagation scenarios were examined. Low-speed user movement was considered, so that Doppler spread could be discarded. A transmitter and a legitimate receiver exchanged information in the presence of a moving eavesdropper. Average Signal-to-Noise Ratio (SNR) values were acquired for both the main and the wiretap channel, and the Probability of Nonzero Secrecy Capacity was calculated based on the theoretical formula. Experimental results validate the theoretical findings, stressing the importance of user location and mobility schemes on the robustness of Wireless Information-Theoretic Security, and call for further theoretical analysis.
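
    A sketch of the calculation described above, assuming the quasi-static Rayleigh-fading result of Barros and Rodrigues (2006), P(Cs > 0) = γ̄_M/(γ̄_M + γ̄_W), with γ̄_M, γ̄_W the average SNRs of the main and wiretap channels, cross-checked by Monte Carlo; the SNR values are illustrative, not the measured ones.

    ```python
    # Probability of non-zero secrecy capacity over quasi-static Rayleigh
    # fading: P(Cs > 0) = g_M / (g_M + g_W) for average SNRs g_M, g_W.
    import numpy as np

    def p_nonzero_secrecy(g_main_db, g_wire_db):
        gm, gw = 10 ** (g_main_db / 10), 10 ** (g_wire_db / 10)
        return gm / (gm + gw)

    rng = np.random.default_rng(6)
    gm_db, gw_db = 15.0, 10.0
    gm, gw = 10 ** (gm_db / 10), 10 ** (gw_db / 10)
    # instantaneous SNRs are exponentially distributed under Rayleigh fading
    snr_m = rng.exponential(gm, 1_000_000)
    snr_w = rng.exponential(gw, 1_000_000)
    print("closed form :", p_nonzero_secrecy(gm_db, gw_db))
    print("Monte Carlo :", np.mean(snr_m > snr_w))   # Cs > 0 iff SNR_M > SNR_W
    ```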

  19. Informing Evidence Based Decisions: Usage Statistics for Online Journal Databases

    Directory of Open Access Journals (Sweden)

    Alexei Botchkarev

    2017-06-01

    Abstract Objective – The primary objective was to examine online journal database usage statistics for a provincial ministry of health in the context of evidence based decision-making. In addition, the study highlights implementation of the Journal Access Centre (JAC) that is housed and powered by the Ontario Ministry of Health and Long-Term Care (MOHLTC) to inform health systems policy-making. Methods – This was a prospective case study using descriptive analysis of the JAC usage statistics of journal articles from January 2009 to September 2013. Results – JAC enables ministry employees to access approximately 12,000 journals with full-text articles. JAC usage statistics for the 2011-2012 calendar years demonstrate a steady level of activity in terms of searches, with monthly averages of 5,129. In 2009-2013, a total of 4,759 journal titles were accessed including 1,675 journals with full-text. Usage statistics demonstrate that the actual consumption was over 12,790 full-text downloaded articles or approximately 2,700 articles annually. Conclusion – JAC’s steady level of activities, revealed by the study, reflects continuous demand for JAC services and products. It testifies that access to online journal databases has become part of routine government knowledge management processes. MOHLTC’s broad area of responsibilities with dynamically changing priorities translates into the diverse information needs of its employees and a large set of required journals. Usage statistics indicate that MOHLTC information needs cannot be mapped to a reasonably compact set of “core” journals with a subsequent subscription to those.

  20. Thermo-fluid behaviour of periodic cellular metals

    CERN Document Server

    Lu, Tian Jian; Wen, Ting

    2013-01-01

    Thermo-Fluid Behaviour of Periodic Cellular Metals introduces the study of coupled thermo-fluid behaviour of cellular metals with periodic structure in response to thermal loads, which is an interdisciplinary research area that requires a concurrent-engineering approach. The book, for the first time, systematically adopts experimental, numerical, and analytical approaches and presents the fluid flow and heat transfer in periodic cellular metals under forced convection conditions, aiming to establish structure-property relationships for tailoring material structures to achieve properties and performance levels that are customized for defined multifunctional applications. The book, as a textbook and reference book, is intended for both academic and industrial readers, including graduate students, researchers and engineers. Dr. Tian Jian Lu is a professor at the School of Aerospace, Xi’an Jiaotong University, Xi’an, China. Dr. Feng Xu is a professor at the Key Laboratory of Biomedical Information Engineering o

  1. Evaluation of thermo-mechanical properties data of carbon-based plasma facing materials

    International Nuclear Information System (INIS)

    Ulrickson, M.; Barabash, V.R.; Matera, R.; Roedig, M.; Smith, J.J.; Janev, R.K.

    1991-03-01

    This Report contains the proceedings, results and conclusions of the work done and the analysis performed during the IAEA Consultants' Meeting on "Evaluation of thermo-mechanical properties data of carbon-based plasma facing materials", convened on December 17-21, 1990, at the IAEA Headquarters in Vienna. Although the prime objective of the meeting was to critically assess the available thermo-mechanical properties data for certain types of carbon-based fusion relevant materials, the work of the meeting went well beyond this task. The meeting participants discussed in depth the scope and structure of the IAEA material properties database, the format of data presentation, and the most appropriate computerized system for data storage, retrieval, exchange and management. The existing IAEA ALADDIN system was adopted as a convenient tool for this purpose, and specific ALADDIN labelling schemes and dictionaries were established for the material properties data. An ALADDIN-formatted test-file for the thermo-physical and thermo-mechanical properties of pyrolytic graphite is appended to this Report for illustrative purposes. (author)

  2. Inert gas narcosis has no influence on thermo-tactile sensation.

    Science.gov (United States)

    Jakovljević, Miroljub; Vidmar, Gaj; Mekjavic, Igor B

    2012-05-01

    The contribution of skin thermal sensors to sensing developing hypothermia under inert gas narcosis is not known. Such information is vital for understanding the impact of narcosis on behavioural thermoregulation, diver safety and the judgment of thermal (dis)comfort in the hyperbaric environment. This study therefore aimed at establishing the effects of a normoxic concentration of 30% nitrous oxide (N₂O) on thermo-tactile threshold sensation by studying 16 subjects [eight females and eight males; eight sensitive (S) and eight non-sensitive (NS) to N₂O]. Their mean (SD) age was 22.1 (1.8) years, weight 72.8 (15.3) kg, height 1.75 (0.10) m and body mass index 23.8 (3.8) kg m⁻². Quantitative thermo-tactile sensory testing was performed on the forearm, upper arm and thigh under two experimental conditions: breathing air (air trial) and breathing a normoxic mixture of 30% N₂O (N₂O trial), in mixed sequence. Differences in thermo-tactile sensitivity thresholds between the two groups of subjects in the two experimental conditions were analysed by 3-way mixed-model analysis of covariance. There were no statistically significant differences in thermo-tactile thresholds either between the air and N₂O trials, or between the S and NS groups, or between females and males, or with respect to body mass index. Some clinically insignificant lowering of thresholds occurred only for warm thermo-tactile thresholds on the upper arm and thigh. The results indicated that a normoxic mixture of 30% N₂O had no influence on thermo-tactile sensation in normothermia.

  3. Student-Centered Instruction in a Theoretical Statistics Course

    Science.gov (United States)

    Bates Prins, Samantha C.

    2009-01-01

    This paper provides an example of how student-centered instruction can be used in a theoretical statistics class. The author taught a two-semester undergraduate probability and mathematical statistics sequence using primarily teacher-centered instruction in the first semester and primarily student-centered instruction in the second semester.

  4. The Balance-Scale Task Revisited: A Comparison of Statistical Models for Rule-Based and Information-Integration Theories of Proportional Reasoning.

    Directory of Open Access Journals (Sweden)

    Abe D Hofman

    We propose and test three statistical models for the analysis of children's responses to the balance scale task, a seminal task for studying proportional reasoning. We use a latent class modelling approach to formulate a rule-based latent class model (RB LCM), following from a rule-based perspective on proportional reasoning, and a new statistical model, the Weighted Sum Model, following from an information-integration approach. Moreover, a hybrid LCM using item covariates is proposed, combining aspects of both the rule-based and information-integration perspectives. These models are applied to two different datasets, a standard paper-and-pencil test dataset (N = 779), and a dataset collected within an online learning environment that included direct feedback, time-pressure, and a reward system (N = 808). For the paper-and-pencil dataset the RB LCM resulted in the best fit, whereas for the online dataset the hybrid LCM provided the best fit. The standard paper-and-pencil dataset yielded more evidence for distinct solution rules than the online dataset, in which quantitative item characteristics are more prominent in determining responses. These results shed new light on the discussion between sequential rule-based and information-integration perspectives on cognitive development.

  5. FEATURES OF THE APPLICATION OF STATISTICAL INDICATORS OF SCHEDULED FLIGHTS OF AIRCRAFT

    Directory of Open Access Journals (Sweden)

    2016-01-01

    The possibilities of increasing the effectiveness of safety management of regular aircraft operations on the basis of a systematic approach, under normal operating conditions, are considered. These new opportunities within the airline are based on the integration of the Flight Safety Management System with the quality management system. So far, however, these possibilities have been practically unimplemented due to the limited application of statistical methods. A necessary condition for the implementation of the proposed approach is the use of statistical results from flight data obtained through flight quality monitoring. The properties and peculiarities of the application of statistical indicators of flight parameters during the monitoring of flight data are analyzed. It is shown that the main statistical indicators of the controlled process are averages and variations. The features of the application of theoretical models of mathematical statistics in the analysis of flight information are indicated. It is noted that in practice the theoretical models often do not fit the framework of their application because of violations of the initial assumptions. Recommendations are given for the integrated use of statistical indicators in the ongoing quality control of flights. Ultimately, the article concludes that the proposed approach makes it possible, on the basis of knowledge about the dynamics of statistical indicators of the controlled flight process, to identify hazards and to develop new safety indicators based on aircraft flight operation data.
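
    A minimal sketch of monitoring mean-type statistical indicators: Shewhart-style 3-sigma limits on per-flight sample means of one parameter, derived from a reference fleet and used to flag drifting flights. The data are simulated and the scheme is deliberately simplified relative to operational flight data monitoring.

    ```python
    # Shewhart-style control limits on per-flight sample means of a single
    # flight parameter; flights whose mean drifts outside the limits are
    # flagged. Simulated data, simplified illustration.
    import numpy as np

    rng = np.random.default_rng(7)
    reference = rng.normal(0.0, 1.0, (200, 60))     # 200 flights x 60 samples
    mu0 = reference.mean(axis=1).mean()             # center line
    s0 = reference.mean(axis=1).std(ddof=1)         # spread of flight means

    new_flights = np.vstack([rng.normal(0.0, 1.0, (20, 60)),
                             rng.normal(0.8, 1.0, (3, 60))])   # 3 degraded
    means = new_flights.mean(axis=1)
    flags = np.abs(means - mu0) > 3 * s0            # 3-sigma limits
    print("flagged flights:", np.flatnonzero(flags))
    ```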

  6. A Statistical Approach for Gain Bandwidth Prediction of Phoenix-Cell Based Reflectarrays

    Directory of Open Access Journals (Sweden)

    Hassan Salti

    2018-01-01

    Full Text Available A new statistical approach to predict the gain bandwidth of Phoenix-cell based reflectarrays is proposed. It combines the effects of both main factors that limit the bandwidth of reflectarrays: spatial phase delays and intrinsic bandwidth of radiating cells. As an illustration, the proposed approach is successfully applied to two reflectarrays based on new Phoenix cells.

  7. Thermo-elastic optical coherence tomography.

    Science.gov (United States)

    Wang, Tianshi; Pfeiffer, Tom; Wu, Min; Wieser, Wolfgang; Amenta, Gaetano; Draxinger, Wolfgang; van der Steen, Antonius F W; Huber, Robert; Soest, Gijs van

    2017-09-01

    The absorption of nanosecond laser pulses induces rapid thermo-elastic deformation in tissue. A sub-micrometer scale displacement occurs within a few microseconds after the pulse arrival. In this Letter, we investigate the laser-induced thermo-elastic deformation using a 1.5 MHz phase-sensitive optical coherence tomography (OCT) system. A displacement image can be reconstructed, which enables a new modality of phase-sensitive OCT, called thermo-elastic OCT. An analysis of the results shows that the optical absorption is a dominating factor for the displacement. Thermo-elastic OCT is capable of visualizing inclusions that do not appear on the structural OCT image, providing additional tissue type information.
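
    For context, phase-sensitive OCT systems such as the one described typically convert a measured phase shift into axial displacement via the standard relation dz = lambda0*dphi/(4*pi*n); a hedged sketch with illustrative wavelength and refractive-index values (the Letter's exact processing may differ):

```python
import numpy as np

def displacement_from_phase(delta_phi_rad, center_wavelength_m=1.3e-6, n_tissue=1.38):
    """Axial displacement from an OCT phase shift via dz = lambda0*dphi/(4*pi*n).

    This is the textbook phase-sensitive OCT relation; the wavelength and
    refractive index are illustrative assumptions, not values from the Letter.
    """
    return center_wavelength_m * delta_phi_rad / (4.0 * np.pi * n_tissue)

# A pi/2 phase shift corresponds to a sub-micrometer displacement:
print(displacement_from_phase(np.pi / 2))  # ~1.2e-7 m, i.e. roughly 118 nm
```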

  8. Information theoretic analysis of canny edge detection in visual communication

    Science.gov (United States)

    Jiang, Bo; Rahman, Zia-ur

    2011-06-01

    In general edge detection evaluation, the edge detectors are examined, analyzed, and compared either visually or with a metric for a specific application. This analysis is usually independent of the characteristics of the image-gathering, transmission and display processes that do impact the quality of the acquired image and thus, the resulting edge image. We propose a new information theoretic analysis of edge detection that unites the different components of the visual communication channel and assesses edge detection algorithms in an integrated manner based on Shannon's information theory. The edge detection algorithm here is considered to achieve high performance only if the information rate from the scene to the edge approaches the maximum possible. Thus, by setting initial conditions of the visual communication system as constant, different edge detection algorithms could be evaluated. This analysis is normally limited to linear shift-invariant filters so in order to examine the Canny edge operator in our proposed system, we need to estimate its "power spectral density" (PSD). Since the Canny operator is non-linear and shift variant, we perform the estimation for a set of different system environment conditions using simulations. In our paper we will first introduce the PSD of the Canny operator for a range of system parameters. Then, using the estimated PSD, we will assess the Canny operator using information theoretic analysis. The information-theoretic metric is also used to compare the performance of the Canny operator with other edge-detection operators. This also provides a simple tool for selecting appropriate edge-detection algorithms based on system parameters, and for adjusting their parameters to maximize information throughput.

  9. Quantification of uncertainties in turbulence modeling: A comparison of physics-based and random matrix theoretic approaches

    International Nuclear Information System (INIS)

    Wang, Jian-Xun; Sun, Rui; Xiao, Heng

    2016-01-01

    Highlights: • Compared physics-based and random matrix methods to quantify RANS model uncertainty. • Demonstrated applications of both methods in channel flow over periodic hills. • Examined the amount of information introduced in the physics-based approach. • Discussed implications to modeling turbulence in both near-wall and separated regions. - Abstract: Numerical models based on Reynolds-Averaged Navier-Stokes (RANS) equations are widely used in engineering turbulence modeling. However, the RANS predictions have large model-form uncertainties for many complex flows, e.g., those with non-parallel shear layers or strong mean flow curvature. Quantification of these large uncertainties originating from the modeled Reynolds stresses has attracted attention in the turbulence modeling community. Recently, a physics-based Bayesian framework for quantifying model-form uncertainties has been proposed with successful applications to several flows. Nonetheless, how to specify proper priors without introducing unwarranted, artificial information remains challenging to the current form of the physics-based approach. Another recently proposed method based on random matrix theory provides the prior distributions with maximum entropy, which is an alternative for model-form uncertainty quantification in RANS simulations. This method has better mathematical rigorousness and provides the most non-committal prior distributions without introducing artificial constraints. On the other hand, the physics-based approach has the advantages of being more flexible to incorporate available physical insights. In this work, we compare and discuss the advantages and disadvantages of the two approaches on model-form uncertainty quantification. In addition, we utilize the random matrix theoretic approach to assess and possibly improve the specification of priors used in the physics-based approach. The comparison is conducted through a test case using a canonical flow, the flow past

  10. The Practicality of Statistical Physics Handout Based on KKNI and the Constructivist Approach

    Science.gov (United States)

    Sari, S. Y.; Afrizon, R.

    2018-04-01

    Statistical physics lectures show that: 1) the performance of lecturers, social climate, students' competence and the soft skills needed at work are in the "enough" category, 2) students find it difficult to follow statistical physics lectures because the material is abstract, 3) 40.72% of students need more support in the form of repetition, practice questions and structured tasks, and 4) the depth of the statistical physics material needs to be improved gradually and in a structured way. This indicates that learning materials in accordance with The Indonesian National Qualification Framework or Kerangka Kualifikasi Nasional Indonesia (KKNI), together with an appropriate learning approach, are needed to help lecturers and students in lectures. The author has designed statistical physics handouts which have very valid criteria (90.89%) according to expert judgment. In addition, the practical level of the handouts also needs to be considered so that they are easy to use, interesting and efficient in lectures. The purpose of this research is to determine the practical level of a statistical physics handout based on KKNI and a constructivist approach. This research is part of a research and development effort using the 4-D model developed by Thiagarajan and has reached the development-test phase of the Development stage. Data were collected using a questionnaire distributed to lecturers and students and analyzed with descriptive techniques in the form of percentages. The analysis of the questionnaire shows that the statistical physics handout has very practical criteria. The conclusion of this study is that statistical physics handouts based on KKNI and a constructivist approach are practical for use in lectures.

  11. Daniel Goodman’s empirical approach to Bayesian statistics

    Science.gov (United States)

    Gerrodette, Tim; Ward, Eric; Taylor, Rebecca L.; Schwarz, Lisa K.; Eguchi, Tomoharu; Wade, Paul; Himes Boor, Gina

    2016-01-01

    Bayesian statistics, in contrast to classical statistics, uses probability to represent uncertainty about the state of knowledge. Bayesian statistics has often been associated with the idea that knowledge is subjective and that a probability distribution represents a personal degree of belief. Dr. Daniel Goodman considered this viewpoint problematic for issues of public policy. He sought to ground his Bayesian approach in data, and advocated the construction of a prior as an empirical histogram of “similar” cases. In this way, the posterior distribution that results from a Bayesian analysis combined comparable previous data with case-specific current data, using Bayes’ formula. Goodman championed such a data-based approach, but he acknowledged that it was difficult in practice. If based on a true representation of our knowledge and uncertainty, Goodman argued that risk assessment and decision-making could be an exact science, despite the uncertainties. In his view, Bayesian statistics is a critical component of this science because a Bayesian analysis produces the probabilities of future outcomes. Indeed, Goodman maintained that the Bayesian machinery, following the rules of conditional probability, offered the best legitimate inference from available data. We give an example of an informative prior in a recent study of Steller sea lion spatial use patterns in Alaska.
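
    A minimal sketch of the empirical-prior idea described above, assuming a made-up set of previous estimates and binomial current data; the grid, bin count, and all numbers are illustrative, not Goodman's:

```python
import numpy as np
from scipy import stats

# Hypothetical survival probabilities estimated in previous, "similar" studies.
past_estimates = np.array([0.62, 0.70, 0.66, 0.74, 0.58, 0.69, 0.64, 0.71])

# Empirical prior: a histogram of the past estimates over a probability grid.
grid = np.linspace(0.01, 0.99, 99)
hist, edges = np.histogram(past_estimates, bins=10, range=(0.0, 1.0))
prior = hist[np.clip(np.digitize(grid, edges) - 1, 0, 9)].astype(float)
prior += 1e-6            # keep support everywhere
prior /= prior.sum()

# Case-specific current data: 33 survivors out of 50 animals (made-up numbers).
likelihood = stats.binom.pmf(33, 50, grid)

# Bayes' formula on the grid: posterior combines prior cases with current data.
posterior = prior * likelihood
posterior /= posterior.sum()
print("posterior mean survival:", np.sum(grid * posterior))
```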

  12. Which Type of Risk Information to Use for Whom? Moderating Role of Outcome-Relevant Involvement in the Effects of Statistical and Exemplified Risk Information on Risk Perceptions.

    Science.gov (United States)

    So, Jiyeon; Jeong, Se-Hoon; Hwang, Yoori

    2017-04-01

    The extant empirical research examining the effectiveness of statistical and exemplar-based health information is largely inconsistent. Under the premise that the inconsistency may be due to an unacknowledged moderator (O'Keefe, 2002), this study examined a moderating role of outcome-relevant involvement (Johnson & Eagly, 1989) in the effects of statistical and exemplified risk information on risk perception. Consistent with predictions based on elaboration likelihood model (Petty & Cacioppo, 1984), findings from an experiment (N = 237) concerning alcohol consumption risks showed that statistical risk information predicted risk perceptions of individuals with high, rather than low, involvement, while exemplified risk information predicted risk perceptions of those with low, rather than high, involvement. Moreover, statistical risk information contributed to negative attitude toward drinking via increased risk perception only for highly involved individuals, while exemplified risk information influenced the attitude through the same mechanism only for individuals with low involvement. Theoretical and practical implications for health risk communication are discussed.

  13. Effects of Microstructural Variability on Thermo-Mechanical Properties of a Woven Ceramic Matrix Composite

    Science.gov (United States)

    Goldsmith, Marlana B.; Sankar, Bhavani V.; Haftka, Raphael T.; Goldberg, Robert K.

    2013-01-01

    The objectives of this paper include identifying important architectural parameters that describe the SiC/SiC five-harness satin weave composite and characterizing the statistical distributions and correlations of those parameters from photomicrographs of various cross sections. In addition, realistic artificial cross sections of a 2D representative volume element (RVE) are generated reflecting the variability found in the photomicrographs, which are used to determine the effects of architectural variability on the thermo-mechanical properties. Lastly, preliminary information is obtained on the sensitivity of thermo-mechanical properties to architectural variations. Finite element analysis is used in combination with a response surface and it is shown that the present method is effective in determining the effects of architectural variability on thermo-mechanical properties.

  14. Thermo-msf-parser: an open source Java library to parse and visualize Thermo Proteome Discoverer msf files.

    Science.gov (United States)

    Colaert, Niklaas; Barsnes, Harald; Vaudel, Marc; Helsens, Kenny; Timmerman, Evy; Sickmann, Albert; Gevaert, Kris; Martens, Lennart

    2011-08-05

    The Thermo Proteome Discoverer program integrates both peptide identification and quantification into a single workflow for peptide-centric proteomics. Furthermore, its close integration with Thermo mass spectrometers has made it increasingly popular in the field. Here, we present a Java library to parse the msf files that constitute the output of Proteome Discoverer. The parser is also implemented as a graphical user interface allowing convenient access to the information found in the msf files, and in Rover, a program to analyze and validate quantitative proteomics information. All code, binaries, and documentation are freely available at http://thermo-msf-parser.googlecode.com.

  15. Theoretical analysis and experimental investigation on performance of the thermal shield of accelerator cryomodules by thermo-siphon cooling of liquid nitrogen

    Science.gov (United States)

    Datta, T. S.; Kar, S.; Kumar, M.; Choudhury, A.; Chacko, J.; Antony, J.; Babu, S.; Sahu, S. K.

    2015-12-01

    Five beam line cryomodules with a total of 27 superconducting Radio Frequency (RF) cavities are installed and commissioned at IUAC to enhance the energy of heavy ions from the 15 UD Pelletron. To reduce the heat load at 4.2 K, a liquid nitrogen (LN2) cooled intermediate thermal shield is used for all these cryomodules. For the three linac cryomodules, forced-flow LN2 cooling is used, and for the superbuncher and rebuncher, thermo-siphon cooling is incorporated. It is noticed that the shield temperature of the superbuncher varies from 90 K to 110 K with respect to the liquid nitrogen level. The temperature difference cannot be explained by the basic concept of thermo-siphon cooling with the heat load on the up-flow line alone. A simple thermo-siphon experimental setup is developed to simulate the thermal shield temperature profile. The mass flow rate of liquid nitrogen is measured for different heat loads on the up-flow line and for different liquid levels. It is noticed that a small heat load on the down-flow line has a significant effect on the mass flow rate. The present paper investigates the data generated from the thermo-siphon experimental setup, and a theoretical analysis is presented to validate the measured temperature profile of the cryomodule shield.

  16. Improving statistical reasoning theoretical models and practical implications

    CERN Document Server

    Sedlmeier, Peter

    1999-01-01

    This book focuses on how statistical reasoning works and on training programs that can exploit people's natural cognitive capabilities to improve their statistical reasoning. Training programs that take into account findings from evolutionary psychology and instructional theory are shown to have substantially larger effects that are more stable over time than previous training regimens. The theoretical implications are traced in a neural network model of human performance on statistical reasoning problems. This book appeals to judgment and decision making researchers and other cognitive scientists, as well as to teachers of statistics and probabilistic reasoning.

  17. Statistical inference based on divergence measures

    CERN Document Server

    Pardo, Leandro

    2005-01-01

    The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach.Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data with emphasis on problems in contingency tables and loglinear models using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, prese...
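
    As a small illustration of the divergence-statistic family the book covers, SciPy's power_divergence evaluates members of the Cressie-Read power-divergence family on a vector of contingency-style counts (the counts below are made up):

```python
import numpy as np
from scipy import stats

observed = np.array([18, 34, 21, 27])        # made-up contingency counts
expected = np.full(4, observed.sum() / 4.0)  # null: uniform cell probabilities

# Members of the Cressie-Read power-divergence family; each statistic is
# asymptotically chi-square with 3 degrees of freedom under the null.
for lam in ["pearson", "log-likelihood", "cressie-read"]:
    stat, p = stats.power_divergence(observed, expected, lambda_=lam)
    print(f"{lam:>15}: stat={stat:.3f}, p={p:.3f}")
```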

  18. Towards a Set Theoretical Approach to Big Data Analytics

    DEFF Research Database (Denmark)

    Mukkamala, Raghava Rao; Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    Formal methods, models and tools for social big data analytics are largely limited to graph theoretical approaches such as social network analysis (SNA) informed by relational sociology. There are no other unified modeling approaches to social big data that integrate the conceptual, formal and software realms. In this paper, we first present and discuss a theory and conceptual model of social data. Second, we outline a formal model based on set theory and discuss the semantics of the formal model with a real-world social data example from Facebook. Third, we briefly present and discuss the application of this technique to the data analysis of big social data collected from the Facebook page of the fast fashion company, H&M.

  19. Transcatheter hepatic arterial thermo-chemotherapy and thermo-lipiodol embolization for the treatment of hepatic metastases from colorectal carcinoma

    International Nuclear Information System (INIS)

    Wang Xuan; Chen Xiaofei

    2009-01-01

    Objective: To evaluate the clinical efficacy of transcatheter hepatic arterial thermo-chemotherapy and thermo-lipiodol embolization in the treatment of hepatic metastases from colorectal carcinoma. Methods: Sixty-eight cases with hepatic metastases from colorectal carcinoma were equally and randomly divided into two groups. The patients in the study group were treated with transcatheter hepatic arterial thermo-chemotherapy and thermo-lipiodol embolization, while the patients in the control group were treated with conventional (normal temperature) transcatheter hepatic arterial chemotherapy and lipiodol embolization. Results: The effective rate of the study group and the control group was 65% (22/34) and 32% (11/34) respectively; the difference between the two groups was statistically significant (P<0.05). No significant difference in the postoperative changes of hepatic function tests was found between the two groups. The survival rate at 6, 12, 18 and 24 months after the treatment was 100%, 82%, 44% and 18% respectively in the study group, and 91%, 47%, 15% and 6% respectively in the control group. Conclusion: Transcatheter hepatic arterial thermo-chemotherapy and thermo-lipiodol embolization is an effective and safe treatment for hepatic metastases from colorectal carcinoma and causes no obvious damage to hepatic function. (authors)

  20. Hash functions and information theoretic security

    DEFF Research Database (Denmark)

    Bagheri, Nasoor; Knudsen, Lars Ramkilde; Naderi, Majid

    2009-01-01

    Information theoretic security is an important security notion in cryptography as it provides a true lower bound for attack complexities. However, in practice attacks often have a higher cost than the information theoretic bound. In this paper we study the relationship between information theoretic...

  1. Thermo-Fluid Dynamics of Two-Phase Flow

    CERN Document Server

    Ishii, Mamoru

    2011-01-01

    "Thermo-fluid Dynamics of Two-Phase Flow, Second Edition" is focused on the fundamental physics of two-phase flow. The authors present the detailed theoretical foundation of multi-phase flow thermo-fluid dynamics as they apply to: Nuclear reactor transient and accident analysis; Energy systems; Power generation systems; Chemical reactors and process systems; Space propulsion; Transport processes. This edition features updates on two-phase flow formulation and constitutive equations and CFD simulation codes such as FLUENT and CFX, new coverage of the lift force model, which is of part

  2. Fundamental topics for thermo-elastic stress analyses

    International Nuclear Information System (INIS)

    Biermann, M.

    1989-01-01

    This paper delivers a consistent collection of theoretical fundamentals needed to perform rather sound experimental stress analyses on thermo-elastic materials. An exposition of important concepts of symmetry and so-called peer groups, yielding the very base for a rational description of materials, goes ahead and is followed by an introduction to the constitutive theory of simple materials. Neat distinction is made between stress contributions determined by deformational and thermal impressions, on the one part, and stress constraints not accessible to strain gauging, on the other part. The mathematical formalism required for establishing constitutive equations is coherently developed from scratch and aided, albeit not subrogated, by intuition. The main intention goes to turning some of the recent advances in the nonlinear field theories of thermomechanics to practical account. A full success therein, obviously, results under the restriction to thermo-elasticity. In adverting to more particular subjects, the elementary static effects of nonlinear isotropic elasticity are pointed out. Due allowance is made for thermal effects likely to occur in heat conducting materials also beyond the isothermal or isentropic limit cases. Linearization of the constitutive equations for anisotropic thermo-elastic materials is then shown to entail the formulas of the classical theory. (orig./MM) [de]

  3. Theory and modeling of cylindrical thermo-acoustic transduction

    Energy Technology Data Exchange (ETDEWEB)

    Tong, Lihong, E-mail: lhtong@ecjtu.edu.cn [School of Civil Engineering and Architecture, East China Jiaotong University, Nanchang, Jiangxi (China); Lim, C.W. [Department of Architecture and Civil Engineering, City University of Hong Kong, Kowloon, Hong Kong SAR (China); Zhao, Xiushao; Geng, Daxing [School of Civil Engineering and Architecture, East China Jiaotong University, Nanchang, Jiangxi (China)

    2016-06-03

    Models both for solid and thinfilm-solid cylindrical thermo-acoustic transductions are proposed and the corresponding acoustic pressure solutions are obtained. The acoustic pressure for an individual carbon nanotube (CNT) as a function of input power is investigated analytically and it is verified by comparing with the published experimental data. Further numerical analyses on the acoustic pressure response and characteristics for varying input frequency and distance are also examined both for solid and thinfilm-solid cylindrical thermo-acoustic transductions. Through detailed theoretical and numerical studies on the acoustic pressure solution for thinfilm-solid cylindrical transduction, it is concluded that a solid with smaller thermal conductivity favors improved acoustic performance. In general, the proposed models are applicable to a variety of cylindrical thermo-acoustic devices performing in different gaseous media. - Highlights: • Theory and modeling both for solid and thinfilm-solid cylindrical thermo-acoustic transductions are proposed. • The modeling is verified by comparing with the published experimental data. • Acoustic response characteristics of cylindrical thermo-acoustic transductions are predicted by the proposed model.

  4. Perspectives on Cybersecurity Information Sharing among Multiple Stakeholders Using a Decision-Theoretic Approach.

    Science.gov (United States)

    He, Meilin; Devine, Laura; Zhuang, Jun

    2018-02-01

    The government, private sectors, and other users of the Internet are increasingly faced with the risk of cyber incidents. Damage to computer systems and theft of sensitive data caused by cyber attacks have the potential to result in lasting harm to entities under attack, or to society as a whole. The effects of cyber attacks are not always obvious, and detecting them is not a simple proposition. As the U.S. federal government believes that information sharing on cybersecurity issues among organizations is essential to safety, security, and resilience, the importance of trusted information exchange has been emphasized to support public and private decision making by encouraging the creation of the Information Sharing and Analysis Center (ISAC). Through a decision-theoretic approach, this article provides new perspectives on ISAC, and the advent of the new Information Sharing and Analysis Organizations (ISAOs), which are intended to provide similar benefits to organizations that cannot fit easily into the ISAC structure. To help understand the processes of information sharing against cyber threats, this article illustrates 15 representative information sharing structures between ISAC, government, and other participating entities, and provides discussions on the strategic interactions between different stakeholders. This article also identifies the costs of information sharing and information security borne by different parties in this public-private partnership both before and after cyber attacks, as well as the two main benefits. This article provides perspectives on the mechanism of information sharing and some detailed cost-benefit analysis. © 2017 Society for Risk Analysis.

  5. A comparison of SAR ATR performance with information theoretic predictions

    Science.gov (United States)

    Blacknell, David

    2003-09-01

    Performance assessment of automatic target detection and recognition algorithms for SAR systems (or indeed any other sensors) is essential if the military utility of the system / algorithm mix is to be quantified. This is a relatively straightforward task if extensive trials data from an existing system is used. However, a crucial requirement is to assess the potential performance of novel systems as a guide to procurement decisions. This task is no longer straightforward since a hypothetical system cannot provide experimental trials data. QinetiQ has previously developed a theoretical technique for classification algorithm performance assessment based on information theory. The purpose of the study presented here has been to validate this approach. To this end, experimental SAR imagery of targets has been collected using the QinetiQ Enhanced Surveillance Radar to allow algorithm performance assessments as a number of parameters are varied. In particular, performance comparisons can be made for (i) resolutions up to 0.1m, (ii) single channel versus polarimetric (iii) targets in the open versus targets in scrubland and (iv) use versus non-use of camouflage. The change in performance as these parameters are varied has been quantified from the experimental imagery whilst the information theoretic approach has been used to predict the expected variation of performance with parameter value. A comparison of these measured and predicted assessments has revealed the strengths and weaknesses of the theoretical technique as will be discussed in the paper.

  6. Theoretical frameworks informing family-based child and adolescent obesity interventions

    DEFF Research Database (Denmark)

    Alulis, Sarah; Grabowski, Dan

    2017-01-01

    ... into focus. However, the use of theoretical frameworks to strengthen these interventions is rare and very uneven. OBJECTIVE AND METHOD: To conduct a qualitative meta-synthesis of family-based interventions for child and adolescent obesity to identify the theoretical frameworks applied, thus understanding how ... cognitive, self-efficacy and Family Systems Theory appeared most frequently. The remaining 24 were classified as theory-related, as theoretical elements of self-monitoring, stimulus control, reinforcement and modelling were used. CONCLUSION: The designs of family-based interventions reveal numerous inconsistencies and a significant void between research results and health care practice. Based on the analysis, this article proposes three themes to be used as focus points when designing future interventions and when selecting theories for the development of solid, theory-based frameworks for application ...

  7. Status of the LMFBR thermo- and fluid-dynamic activities at KFK

    International Nuclear Information System (INIS)

    Hoffmann, H.; Hofmann, F.; Rehme, K.

    1979-01-01

    The aim of the thermo- and fluiddynamic analysis is to determine the spatial velocity and temperature distributions in LMFBR-core elements with high accuracy. Knowledge of these data is a necessary prerequisite for determining the mechanical behavior of fuel rods and of structural material. Three cases are distinguished: nominal geometry and steady state conditions; non-nominal geometry and quasi-steady state conditions; nominal geometry and non-steady state conditions. The present situation for the design calculations of fuel elements is based mainly on undisturbed normal operation. Most of the thermo- and fluiddynamic activities performed under the Fast Breeder Programme at KFK are related to this case. The present status of theoretical and experimental research work, briefly presented in this paper, can be subdivided into the following main topics: 1. Physical and mathematical modelling of single phase rod bundle thermo- and fluiddynamics, 2. Experimental investigations on heat transfer and fluid flow in rod bundles

  8. MASKED AREAS IN SHEAR PEAK STATISTICS: A FORWARD MODELING APPROACH

    International Nuclear Information System (INIS)

    Bard, D.; Kratochvil, J. M.; Dawson, W.

    2016-01-01

    The statistics of shear peaks have been shown to provide valuable cosmological information beyond the power spectrum, and will be an important constraint of models of cosmology in forthcoming astronomical surveys. Surveys include masked areas due to bright stars, bad pixels, etc., which must be accounted for in producing constraints on cosmology from shear maps. We advocate a forward-modeling approach, where the impacts of masking and other survey artifacts are accounted for in the theoretical prediction of cosmological parameters, rather than correcting survey data to remove them. We use masks based on the Deep Lens Survey, and explore the impact of up to 37% of the survey area being masked on LSST and DES-scale surveys. By reconstructing maps of aperture mass the masking effect is smoothed out, resulting in up to 14% smaller statistical uncertainties compared to simply reducing the survey area by the masked area. We show that, even in the presence of large survey masks, the bias in cosmological parameter estimation produced in the forward-modeling process is ≈1%, dominated by bias caused by limited simulation volume. We also explore how this potential bias scales with survey area and evaluate how much small survey areas are impacted by the differences in cosmological structure in the data and simulated volumes, due to cosmic variance.

  9. Advances in statistical multisource-multitarget information fusion

    CERN Document Server

    Mahler, Ronald PS

    2014-01-01

    This is the sequel to the 2007 Artech House bestselling title, Statistical Multisource-Multitarget Information Fusion. That earlier book was a comprehensive resource for an in-depth understanding of finite-set statistics (FISST), a unified, systematic, and Bayesian approach to information fusion. The cardinalized probability hypothesis density (CPHD) filter, which was first systematically described in the earlier book, has since become a standard multitarget detection and tracking technique, especially in research and development. Since 2007, FISST has inspired a considerable amount of research.

  10. Approaches to Learning Information Literacy: A Phenomenographic Study

    Science.gov (United States)

    Diehm, Rae-Anne; Lupton, Mandy

    2012-01-01

    This paper reports on an empirical study that explores the ways students approach learning to find and use information. Based on interviews with 15 education students in an Australian university, this study uses phenomenography as its methodological and theoretical basis. The study reveals that students use three main strategies for learning…

  11. Understanding Fast and Robust Thermo-osmotic Flows through Carbon Nanotube Membranes: Thermodynamics Meets Hydrodynamics.

    Science.gov (United States)

    Fu, Li; Merabia, Samy; Joly, Laurent

    2018-04-19

    Following our recent theoretical prediction of the giant thermo-osmotic response of the water-graphene interface, we explore the practical implementation of waste heat harvesting with carbon-based membranes, focusing on model membranes of carbon nanotubes (CNT). To that aim, we combine molecular dynamics simulations and an analytical model considering the details of hydrodynamics in the membrane and at the tube entrances. The analytical model and the simulation results match quantitatively, highlighting the need to take into account both thermodynamics and hydrodynamics to predict thermo-osmotic flows through membranes. We show that, despite viscous entrance effects and a thermal short-circuit mechanism, CNT membranes can generate very fast thermo-osmotic flows, which can overcome the osmotic pressure of seawater. We then show that in small tubes confinement has a complex effect on the flow and can even reverse the flow direction. Beyond CNT membranes, our analytical model can guide the search for other membranes to generate fast and robust thermo-osmotic flows.

  12. Passing Decisions in Football: Introducing an Empirical Approach to Estimating the Effects of Perceptual Information and Associative Knowledge.

    Science.gov (United States)

    Steiner, Silvan

    2018-01-01

    The importance of various information sources in decision-making in interactive team sports is debated. While some highlight the role of the perceptual information provided by the current game context, others point to the role of knowledge-based information that athletes have regarding their team environment. Recently, an integrative perspective considering the simultaneous involvement of both of these information sources in decision-making in interactive team sports has been presented. In a theoretical example concerning passing decisions, the simultaneous involvement of perceptual and knowledge-based information has been illustrated. However, no precast method of determining the contribution of these two information sources empirically has been provided. The aim of this article is to bridge this gap and present a statistical approach to estimating the effects of perceptual information and associative knowledge on passing decisions. To this end, a sample dataset of scenario-based passing decisions is analyzed. This article shows how the effects of perceivable team positionings and athletes' knowledge about their fellow team members on passing decisions can be estimated. Ways of transferring this approach to real-world situations and implications for future research using more representative designs are presented.

  14. An information theory-based approach to modeling the information processing of NPP operators

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task. The focus will be on i) developing a model for the information processing of NPP operators and ii) quantifying the model. To resolve the problems of the previous approaches based on information theory, i.e. the problems of single-channel approaches, we first develop an information processing model having multiple stages, which contains information flows. Then the uncertainty of the information is quantified using Conant's model, which is grounded in information theory.

  15. Adaptive information-theoretic bounded rational decision-making with parametric priors

    OpenAIRE

    Grau-Moya, Jordi; Braun, Daniel A.

    2015-01-01

    Deviations from rational decision-making due to limited computational resources have been studied in the field of bounded rationality, originally proposed by Herbert Simon. There have been a number of different approaches to model bounded rationality ranging from optimality principles to heuristics. Here we take an information-theoretic approach to bounded rationality, where information-processing costs are measured by the relative entropy between a posterior decision strategy and a given fix...

  16. The circular thermo-phoretic spectrometer (CTSM), a new device for the study of thermophoresis: application to fractal soot particles

    International Nuclear Information System (INIS)

    Brugiere, E.

    2012-01-01

    This work aims to improve the understanding of soot particle deposition by thermophoresis. In order to show the influence of the morphology of a fractal aggregate on its thermo-phoretic behavior, a new experimental device has been developed: the SpectroMetre Thermophoretique Circulaire (SMTC). This instrument is used to measure the mean thermo-phoretic velocity of particles selected between a hot plate and a cold plate, thanks to a transfer function based on the geometry of the radial flow differential mobility analyser RF-DMA or SMEC (Spectrometre de Mobilite Electrique Circulaire). For the experimental validation, effective thermo-phoretic velocities of monodispersed spherical latex particles with diameters ranging from 64 nm to 500 nm, under a temperature gradient equal to 50 750 K/m, are measured and compared with theoretical values. The good agreement between the experimental results and the theoretical values of Beresnev and Chernyak (1995) allows us to validate the operation of the instrument. We then compare the experimental thermo-phoretic velocities obtained with the SMTC for spherical particles and for aggregates produced by a combustion aerosol generator. Contrary to the results obtained with the PSL particles, we observe that the thermo-phoretic velocity of aggregates increases with the electrical mobility diameter. Thanks to a morphological study of the aggregates, we show that the thermo-phoretic velocity depends on the number of primary particles in the aggregate. These experimental results confirm, for the first time, the theoretical data of Mackowski (2006) obtained by a Monte Carlo simulation. Moreover, a comparison with the experimental results of Messerer et al. (2003) shows that the thermo-phoretic velocity of aggregates seems independent of the primary particle size. (author)

  17. Thermal effects in the hadronic and photonic multiplicity distributions and correlations: a thermo-field dynamic approach

    International Nuclear Information System (INIS)

    Bambah, Bindu A.; Mogurampally, Naveen Kumar

    2016-01-01

    The existence of the Quark Gluon Plasma (QGP) requires that in the collision of heavy ions an initial fireball is formed which has a lifetime larger than the typical hadronic time scale of 10⁻²³ sec, and that the temperature and volume of the fireball are sufficient to ensure that the Quark-Hadron phase transition predicted by statistical QCD is achieved. The pions and photons emitted from this hot fireball may then carry information on the temperature and lifetime of the emitting region, and this may manifest itself in the correlation functions and multiplicities, which can be modified by finite temperature. Thus it is important to find ways of incorporating finite-temperature effects in multiplicity distributions and correlations. The thermo-field formalism is particularly useful in the description of parametric dynamical systems in which squeezing of quantum fluctuations is important.

  18. Dynamic thermo-hydraulic model of district cooling networks

    International Nuclear Information System (INIS)

    Oppelt, Thomas; Urbaneck, Thorsten; Gross, Ulrich; Platzer, Bernd

    2016-01-01

    Highlights: • A dynamic thermo-hydraulic model for district cooling networks is presented. • The thermal modelling is based on water segment tracking (Lagrangian approach). • Thus, numerical errors and balance inaccuracies are avoided. • Verification and validation studies proved the reliability of the model. - Abstract: In the present paper, the dynamic thermo-hydraulic model ISENA is presented which can be applied for answering different questions occurring in design and operation of district cooling networks—e.g. related to economic and energy efficiency. The network model consists of a quasistatic hydraulic model and a transient thermal model based on tracking water segments through the whole network (Lagrangian method). Applying this approach, numerical errors and balance inaccuracies can be avoided which leads to a higher quality of results compared to other network models. Verification and validation calculations are presented in order to show that ISENA provides reliable results and is suitable for practical application.

  19. Register-based statistics statistical methods for administrative data

    CERN Document Server

    Wallgren, Anders

    2014-01-01

    This book provides a comprehensive and up to date treatment of theory and practical implementation in Register-based statistics. It begins by defining the area, before explaining how to structure such systems, as well as detailing alternative approaches. It explains how to create statistical registers, how to implement quality assurance, and the use of IT systems for register-based statistics. Further to this, clear details are given about the practicalities of implementing such statistical methods, such as protection of privacy and the coordination and coherence of such an undertaking. Thi

  20. Model-free information-theoretic approach to infer leadership in pairs of zebrafish.

    Science.gov (United States)

    Butail, Sachit; Mwaffo, Violet; Porfiri, Maurizio

    2016-04-01

    Collective behavior affords several advantages to fish in avoiding predators, foraging, mating, and swimming. Although fish schools have been traditionally considered egalitarian superorganisms, a number of empirical observations suggest the emergence of leadership in gregarious groups. Detecting and classifying leader-follower relationships is central to elucidate the behavioral and physiological causes of leadership and understand its consequences. Here, we demonstrate an information-theoretic approach to infer leadership from positional data of fish swimming. In this framework, we measure social interactions between fish pairs through the mathematical construct of transfer entropy, which quantifies the predictive power of a time series to anticipate another, possibly coupled, time series. We focus on the zebrafish model organism, which is rapidly emerging as a species of choice in preclinical research for its genetic similarity to humans and reduced neurobiological complexity with respect to mammals. To overcome experimental confounds and generate test data sets on which we can thoroughly assess our approach, we adapt and calibrate a data-driven stochastic model of zebrafish motion for the simulation of a coupled dynamical system of zebrafish pairs. In this synthetic data set, the extent and direction of the coupling between the fish are systematically varied across a wide parameter range to demonstrate the accuracy and reliability of transfer entropy in inferring leadership. Our approach is expected to aid in the analysis of collective behavior, providing a data-driven perspective to understand social interactions.
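
    A plug-in histogram sketch of the transfer-entropy computation described above, assuming coarse quantile discretization and a synthetic leader-follower pair; the paper's calibrated zebrafish motion model is not reproduced:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=4):
    """Plug-in estimate (in bits) of TE(Y -> X) =
    sum p(x1, x0, y0) * log2[ p(x1 | x0, y0) / p(x1 | x0) ]
    over quantile-discretized time series."""
    cuts = np.linspace(0, 1, bins + 1)[1:-1]
    xd = np.digitize(x, np.quantile(x, cuts))
    yd = np.digitize(y, np.quantile(y, cuts))
    triples = Counter(zip(xd[1:], xd[:-1], yd[:-1]))  # (x_{t+1}, x_t, y_t)
    joint_past = Counter(zip(xd[:-1], yd[:-1]))       # (x_t, y_t)
    pairs = Counter(zip(xd[1:], xd[:-1]))             # (x_{t+1}, x_t)
    margin = Counter(xd[:-1])                         # x_t
    n = len(xd) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_cond_full = c / joint_past[(x0, y0)]        # p(x1 | x0, y0)
        p_cond_self = pairs[(x1, x0)] / margin[x0]    # p(x1 | x0)
        te += (c / n) * np.log2(p_cond_full / p_cond_self)
    return te

rng = np.random.default_rng(0)
leader = rng.normal(size=2000).cumsum()                       # random-walk "leader"
follower = np.roll(leader, 3) + rng.normal(scale=0.5, size=2000)  # lagged noisy copy
print("TE leader  -> follower:", transfer_entropy(follower, leader))
print("TE follower-> leader  :", transfer_entropy(leader, follower))
```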

  1. A Formal Approach for RT-DVS Algorithms Evaluation Based on Statistical Model Checking

    Directory of Open Access Journals (Sweden)

    Shengxin Dai

    2015-01-01

    Full Text Available Energy saving is a crucial concern in embedded real time systems. Many RT-DVS algorithms have been proposed to save energy while preserving deadline guarantees. This paper presents a novel approach to evaluate RT-DVS algorithms using statistical model checking. A scalable framework is proposed for RT-DVS algorithms evaluation, in which the relevant components are modeled as stochastic timed automata, and the evaluation metrics including utilization bound, energy efficiency, battery awareness, and temperature awareness are expressed as statistical queries. Evaluation of these metrics is performed by verifying the corresponding queries using UPPAAL-SMC and analyzing the statistical information provided by the tool. We demonstrate the applicability of our framework via a case study of five classical RT-DVS algorithms.
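
    Statistical model checking of the kind UPPAAL-SMC performs estimates the probability that a property holds by sampling simulation runs; a common way to fix the number of runs for an (epsilon, delta) accuracy guarantee is the Chernoff-Hoeffding bound N >= ln(2/delta)/(2*epsilon^2). A toy sketch with a stand-in stochastic model, not the paper's automata or the UPPAAL-SMC tool itself:

```python
import math
import random

def runs_needed(eps=0.02, delta=0.01):
    """Chernoff-Hoeffding bound: P(|estimate - p| >= eps) <= delta."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))

def one_run(deadline=10.0):
    """Toy stand-in for a stochastic timed automaton: two tasks with random
    execution times; the monitored 'property' is meeting the deadline."""
    t = random.expovariate(1 / 3.0) + random.uniform(2.0, 6.0)
    return t <= deadline

random.seed(7)
n = runs_needed()
estimate = sum(one_run() for _ in range(n)) / n
print(f"{n} runs, P(deadline met) ~= {estimate:.3f}")
```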

  2. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems.  Sampling-based simulation techniques are now an invaluable tool for exploring statistical models.  This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods.  It also includes some advanced met
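
    A minimal example of the kind of sampling-based technique such a book introduces: inverse-transform sampling, where applying the inverse CDF to uniform variates yields draws from the target distribution (exponential here):

```python
import numpy as np

# Inverse-transform sampling: if U ~ Uniform(0,1), then F^{-1}(U) has CDF F.
# For Exponential(rate), F^{-1}(u) = -log(1 - u) / rate.
rng = np.random.default_rng(42)
rate = 2.0
u = rng.uniform(size=100_000)
samples = -np.log(1.0 - u) / rate

print("sample mean:", samples.mean())   # ~0.5  = 1/rate
print("sample var :", samples.var())    # ~0.25 = 1/rate**2
```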

  3. Selecting the right statistical model for analysis of insect count data by using information theoretic measures.

    Science.gov (United States)

    Sileshi, G

    2006-10-01

    Researchers and regulatory agencies often make statistical inferences from insect count data using modelling approaches that assume homogeneous variance. Such models do not allow for formal appraisal of variability which in its different forms is the subject of interest in ecology. Therefore, the objectives of this paper were to (i) compare models suitable for handling variance heterogeneity and (ii) select optimal models to ensure valid statistical inferences from insect count data. The log-normal, standard Poisson, Poisson corrected for overdispersion, zero-inflated Poisson, the negative binomial distribution and zero-inflated negative binomial models were compared using six count datasets on foliage-dwelling insects and five families of soil-dwelling insects. Akaike's and Schwarz Bayesian information criteria were used for comparing the various models. Over 50% of the counts were zeros even in locally abundant species such as Ootheca bennigseni Weise, Mesoplatys ochroptera Stål and Diaecoderus spp. The Poisson model after correction for overdispersion and the standard negative binomial distribution model provided better description of the probability distribution of seven out of the 11 insects than the log-normal, standard Poisson, zero-inflated Poisson or zero-inflated negative binomial models. It is concluded that excess zeros and variance heterogeneity are common data phenomena in insect counts. If not properly modelled, these properties can invalidate the normal distribution assumptions resulting in biased estimation of ecological effects and jeopardizing the integrity of the scientific inferences. Therefore, it is recommended that statistical models appropriate for handling these data properties be selected using objective criteria to ensure efficient statistical inference.
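
    A hedged sketch of the model-comparison workflow the paper describes, fitting intercept-only Poisson and negative binomial models with statsmodels and comparing information criteria; zero-inflated variants live in statsmodels.discrete.count_model. The counts are simulated overdispersed data, not the paper's insect counts:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
# Overdispersed, zero-heavy synthetic counts (placeholder for field data).
counts = rng.negative_binomial(n=0.8, p=0.25, size=200)

X = np.ones((len(counts), 1))  # intercept-only models
poisson_fit = sm.Poisson(counts, X).fit(disp=0)
negbin_fit = sm.NegativeBinomial(counts, X).fit(disp=0)

# Lower AIC/BIC indicates the preferred model for these data.
for name, fit in [("Poisson", poisson_fit), ("NegBin", negbin_fit)]:
    print(f"{name:>8}: AIC={fit.aic:.1f}  BIC={fit.bic:.1f}")
```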

  4. A statistic to estimate the variance of the histogram-based mutual information estimator based on dependent pairs of observations

    NARCIS (Netherlands)

    Moddemeijer, R

    In the case of two signals with independent pairs of observations (x(n),y(n)) a statistic to estimate the variance of the histogram-based mutual information estimator has been derived earlier. We present such a statistic for dependent pairs. To derive this statistic it is necessary to avail of a
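
    For reference, the histogram-based mutual information estimator the statistic refers to is a plug-in estimate over a 2-D histogram; a minimal sketch (the bin count is arbitrary, and the estimator's bias and variance corrections are omitted):

```python
import numpy as np

def mutual_information_hist(x, y, bins=16):
    """Plug-in histogram estimate of I(X;Y) in nats."""
    counts, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = counts / counts.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(5)
x = rng.normal(size=5000)
y = 0.8 * x + 0.6 * rng.normal(size=5000)   # correlated pair, rho = 0.8
# For bivariate normals I = -0.5*ln(1 - rho^2) ~ 0.51 nats here.
print(mutual_information_hist(x, y))
```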

  5. Information-theoretic temporal Bell inequality and quantum computation

    International Nuclear Information System (INIS)

    Morikoshi, Fumiaki

    2006-01-01

    An information-theoretic temporal Bell inequality is formulated to contrast classical and quantum computations. Any classical algorithm satisfies the inequality, while quantum ones can violate it. Therefore, the violation of the inequality is an immediate consequence of the quantumness in the computation. Furthermore, this approach suggests a notion of temporal nonlocality in quantum computation

  6. Assessing socioeconomic vulnerability to dengue fever in Cali, Colombia: statistical vs expert-based modeling.

    Science.gov (United States)

    Hagenlocher, Michael; Delmelle, Eric; Casas, Irene; Kienberger, Stefan

    2013-08-14

    As a result of changes in climatic conditions and greater resistance to insecticides, many regions across the globe, including Colombia, have been facing a resurgence of vector-borne diseases, and dengue fever in particular. Timely information on both (1) the spatial distribution of the disease, and (2) prevailing vulnerabilities of the population is needed to adequately plan targeted preventive intervention. We propose a methodology for the spatial assessment of current socioeconomic vulnerabilities to dengue fever in Cali, a tropical urban environment of Colombia. Based on a set of socioeconomic and demographic indicators derived from census data and ancillary geospatial datasets, we develop a spatial approach for both expert-based and purely statistical-based modeling of current vulnerability levels across 340 neighborhoods of the city using a Geographic Information System (GIS). The results of both approaches are comparatively evaluated by means of spatial statistics. A web-based approach is proposed to facilitate the visualization and the dissemination of the output vulnerability index to the community. The statistical and the expert-based modeling approaches exhibit a high concordance, both globally and spatially. The expert-based approach indicates a slightly higher vulnerability mean (0.53) and vulnerability median (0.56) across all neighborhoods, compared to the purely statistical approach (mean = 0.48; median = 0.49). Both approaches reveal that high values of vulnerability tend to cluster in the eastern, north-eastern, and western part of the city. These are poor neighborhoods with high percentages of young (i.e., ... local expertise, statistical approaches could be used, with caution. By decomposing identified vulnerability "hotspots" into their underlying factors, our approach provides valuable information on both (1) the location of neighborhoods, and (2) vulnerability factors that should be given priority in the context of targeted intervention.

  7. Robust recognition via information theoretic learning

    CERN Document Server

    He, Ran; Yuan, Xiaotong; Wang, Liang

    2014-01-01

    This Springer Brief represents a comprehensive review of information theoretic methods for robust recognition. A variety of information theoretic methods have been proffered in the past decade, in a large variety of computer vision applications; this work brings them together and attempts to impart the theory, optimization and usage of information entropy. The authors resort to a new information theoretic concept, correntropy, as a robust measure and apply it to solve robust face recognition and object recognition problems. For computational efficiency, the brief introduces the additive and multip

  8. Margins of freedom: a field-theoretic approach to class-based health dispositions and practices.

    Science.gov (United States)

    Burnett, Patrick John; Veenstra, Gerry

    2017-09-01

    Pierre Bourdieu's theory of practice situates social practices in the relational interplay between experiential mental phenomena (habitus), resources (capitals) and objective social structures (fields). When applied to class-based practices in particular, the overarching field of power within which social classes are potentially made manifest is the primary field of interest. Applying relational statistical techniques to original survey data from Toronto and Vancouver, Canada, we investigated whether smoking, engaging in physical activity and consuming fruit and vegetables are dispersed in a three-dimensional field of power shaped by economic and cultural capitals and cultural dispositions and practices. We find that aesthetic dispositions and flexibility of developing and established dispositions are associated with positioning in the Canadian field of power and embedded in the logics of the health practices dispersed in the field. From this field-theoretic perspective, behavioural change requires the disruption of existing relations of harmony between the habitus of agents, the fields within which the practices are enacted and the capitals that inform and enforce the mores and regularities of the fields. The three-dimensional model can be explored at: http://relational-health.ca/margins-freedom. © 2017 Foundation for the Sociology of Health & Illness.

  9. A Theoretical Approach

    African Journals Online (AJOL)

    NICO

    L-rhamnose and L-fucose: A Theoretical Approach ... L-rhamnose and L-fucose, by means of the Monte Carlo conformational search method. The energy of the conformers ... which indicates an increased probability for the occurrence of ...

  10. Statistical approach to thermal evolution of neutron stars

    International Nuclear Information System (INIS)

    Beznogov, M V; Yakovlev, D G

    2015-01-01

    Studying the thermal evolution of neutron stars (NSs) is one of a few ways to investigate the properties of superdense matter in their cores. We study the cooling of isolated NSs (INSs) and the deep crustal heating of transiently accreting NSs in X-ray transients (XRTs, binary systems with low-mass companions). Currently, nearly 50 such NSs are observed, and one can apply statistical methods to analyze the whole dataset. We propose a method for such analysis based on thermal evolution theory for individual stars and on averaging the results over NS mass distributions. We calculate the distributions of INSs and accreting NSs (ANSs) in XRTs over cooling and heating diagrams respectively. By comparing theoretical and observational distributions one can infer information on the physical properties of superdense matter and on the mass distributions of INSs and ANSs. (paper)

  11. Fisher statistics for analysis of diffusion tensor directional information.

    Science.gov (United States)

    Hutchinson, Elizabeth B; Rutecki, Paul A; Alexander, Andrew L; Sutula, Thomas P

    2012-04-30

    A statistical approach is presented for the quantitative analysis of diffusion tensor imaging (DTI) directional information using Fisher statistics, which were originally developed for the analysis of vectors in the field of paleomagnetism. In this framework, descriptive and inferential statistics have been formulated based on the Fisher probability density function, a spherical analogue of the normal distribution. The Fisher approach was evaluated for investigation of rat brain DTI maps to characterize tissue orientation in the corpus callosum, fornix, and hilus of the dorsal hippocampal dentate gyrus, and to compare directional properties in these regions following status epilepticus (SE) or traumatic brain injury (TBI) with values in healthy brains. Direction vectors were determined for each region of interest (ROI) for each brain sample and Fisher statistics were applied to calculate the mean direction vector and variance parameters in the corpus callosum, fornix, and dentate gyrus of normal rats and rats that experienced TBI or SE. Hypothesis testing was performed by calculation of Watson's F-statistic and associated p-value giving the likelihood that grouped observations were from the same directional distribution. In the fornix and midline corpus callosum, no directional differences were detected between groups; however, in the hilus, significant (p ... statistical comparison of tissue structural orientation. Copyright © 2012 Elsevier B.V. All rights reserved.
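
    A minimal sketch of the Fisher-statistics quantities named above (mean direction, resultant length R, concentration kappa, and the (1-p) confidence cone), using the classical paleomagnetism formulas; the data are synthetic stand-ins for DTI principal directions, and Watson's F-test is omitted:

```python
import numpy as np

def fisher_stats(vectors, p=0.05):
    """Mean direction, resultant length R, concentration kappa, and the
    (1-p) confidence cone for unit vectors, after Fisher (1953) as used
    in paleomagnetism. The formulas assume reasonably concentrated data."""
    v = np.asarray(vectors, dtype=float)
    v /= np.linalg.norm(v, axis=1, keepdims=True)   # ensure unit vectors
    n = len(v)
    resultant = v.sum(axis=0)
    R = np.linalg.norm(resultant)
    mean_dir = resultant / R
    kappa = (n - 1) / (n - R)                       # large-kappa approximation
    cos_a = 1.0 - (n - R) / R * ((1.0 / p) ** (1.0 / (n - 1)) - 1.0)
    alpha = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return mean_dir, R, kappa, alpha

rng = np.random.default_rng(11)
# Synthetic principal directions: a tight cluster around the z-axis.
dirs = rng.normal([0.0, 0.0, 1.0], 0.1, size=(30, 3))
mean_dir, R, kappa, a95 = fisher_stats(dirs)
print("mean direction:", np.round(mean_dir, 3))
print(f"R={R:.2f}, kappa={kappa:.1f}, alpha95={a95:.2f} deg")
```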

  12. An information-theoretic approach to motor action decoding with a reconfigurable parallel architecture.

    Science.gov (United States)

    Craciun, Stefan; Brockmeier, Austin J; George, Alan D; Lam, Herman; Príncipe, José C

    2011-01-01

    Methods for decoding movements from neural spike counts using adaptive filters often rely on minimizing the mean-squared error. However, for non-Gaussian distributions of errors, this approach is not optimal. Therefore, rather than using probabilistic modeling, we propose an alternative non-parametric approach. In order to extract more structure from the input signal (neuronal spike counts) we propose using minimum error entropy (MEE), an information-theoretic approach that minimizes the error entropy as part of an iterative cost function. However, the disadvantage of using MEE as the cost function for adaptive filters is the increase in computational complexity. In this paper we present a comparison between the decoding performance of the analytic Wiener filter and a linear filter trained with MEE, which is then mapped to a parallel architecture in reconfigurable hardware tailored to the computational needs of the MEE filter. We observe considerable speedup from the hardware design. The adaptation of filter weights for the multiple-input, multiple-output linear filters necessary in motor decoding is a highly parallelizable algorithm. It can be decomposed into many independent computational blocks with a parallel architecture readily mapped to a field-programmable gate array (FPGA) and scales to large numbers of neurons. By pipelining and parallelizing independent computations in the algorithm, the proposed parallel architecture has sublinear increases in execution time with respect to both window size and filter order.
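
    To make the MEE cost concrete, the following minimal sketch (in Python, not the authors' FPGA implementation; the window length, kernel width and learning rate are illustrative assumptions) trains a single-input linear filter by gradient ascent on the quadratic information potential of the errors, which is equivalent to minimizing Rényi's quadratic error entropy:

```python
import numpy as np

def mee_train(x, d, order=5, lr=0.05, sigma=1.0, epochs=50):
    """Train linear filter weights by ascending the information potential
    V(e) = mean_ij exp(-(e_i - e_j)^2 / (2 sigma^2)), i.e. by descending
    Renyi's quadratic entropy of the errors."""
    n = len(d) - order
    Z = np.array([x[i:i + order] for i in range(n)])  # sliding input windows
    y = d[order:]
    w = np.zeros(order)
    for _ in range(epochs):
        e = y - Z @ w                                # errors under current weights
        de = e[:, None] - e[None, :]                 # pairwise error differences
        k = np.exp(-de**2 / (2 * sigma**2))          # Gaussian kernel matrix
        dz = Z[:, None, :] - Z[None, :, :]           # pairwise input differences
        grad = (k[..., None] * de[..., None] * dz).mean(axis=(0, 1)) / sigma**2
        w += lr * grad                               # gradient ascent on V
    # Entropy is insensitive to a constant error offset; recentre with a bias.
    bias = np.mean(y - Z @ w)
    return w, bias
```

    Each entry of the pairwise kernel matrix is independent of the others, which is exactly the parallelism the reconfigurable-hardware design exploits.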

  13. SciLab Based Remote Control of Thermo-Optical Plant

    Directory of Open Access Journals (Sweden)

    Miroslav Jano

    2011-11-01

    Full Text Available The paper deals with the web-based implementation of the control system of a thermo-optical plant. The control of the plant is based on the SciLab software, which was not originally designed for web-based applications. The paper shows a possible way to circumvent this limitation. The ultimate goal is to enable remotely controlled experiments using SciLab. The paper also describes possible tools for communication and control of the real plant and visualization of results.

  14. Properties of permutation-based gene tests and controlling type 1 error using a summary statistic based gene test.

    Science.gov (United States)

    Swanson, David M; Blacker, Deborah; Alchawa, Taofik; Ludwig, Kerstin U; Mangold, Elisabeth; Lange, Christoph

    2013-11-07

    The advent of genome-wide association studies has led to many novel disease-SNP associations, opening the door to focused study on their biological underpinnings. Because of the importance of analyzing these associations, numerous statistical methods have been devoted to them. However, fewer methods have attempted to associate entire genes or genomic regions with outcomes, which is potentially more useful knowledge from a biological perspective, and those methods currently implemented are often permutation-based. One property of some permutation-based tests is that their power varies as a function of whether significant markers are in regions of linkage disequilibrium (LD) or not, which we show from a theoretical perspective. We therefore develop two methods for quantifying the degree of association between a genomic region and outcome, whose power does not vary as a function of LD structure. One method uses dimension reduction to "filter" redundant information when significant LD exists in the region, while the other, called the summary-statistic test, controls for LD by scaling marker Z-statistics using knowledge of the correlation matrix of markers. An advantage of this latter test is that it does not require the original data, but only their Z-statistics from univariate regressions and an estimate of the correlation structure of markers, and we show how to modify the test to protect the type 1 error rate when the correlation structure of markers is misspecified. We apply these methods to sequence data of oral cleft and compare our results to previously proposed gene tests, in particular permutation-based ones. We evaluate the versatility of the modification of the summary-statistic test since the specification of correlation structure between markers can be inaccurate. We find a significant association in the sequence data between the 8q24 region and oral cleft using our dimension reduction approach and a borderline significant association using the summary-statistic test.
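
    As a hedged illustration of the summary-statistic idea (this is one standard construction for combining correlated Z-statistics, not necessarily the exact published test), the quadratic form below is chi-square distributed under the null when the marker correlation matrix is correctly specified; the ridge term is an illustrative guard against misspecification:

```python
import numpy as np
from scipy import stats

def summary_stat_gene_test(z, R, ridge=1e-3):
    """Gene-level test from per-marker Z-statistics and an estimate R of
    the marker correlation (LD) matrix. Under the null, z ~ N(0, R), so
    the quadratic form z' R^{-1} z is chi-square with k degrees of freedom."""
    k = len(z)
    R_reg = R + ridge * np.eye(k)          # regularize the LD estimate
    stat = z @ np.linalg.solve(R_reg, z)   # z' R^{-1} z
    return stat, stats.chi2.sf(stat, df=k)
```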

  15. Information-theoretic metamodel of organizational evolution

    Science.gov (United States)

    Sepulveda, Alfredo

    2011-12-01

    Social organizations are abstractly modeled by holarchies---self-similar connected networks---and intelligent complex adaptive multiagent systems---large networks of autonomous reasoning agents interacting via scaled processes. However, little is known of how information shapes evolution in such organizations, a gap that can lead to misleading analytics. The research problem addressed in this study was the ineffective manner in which classical model-predict-control methods used in business analytics attempt to define organization evolution. The purpose of the study was to construct an effective metamodel for organization evolution based on a proposed complex adaptive structure---the info-holarchy. Theoretical foundations of this study were holarchies, complex adaptive systems, evolutionary theory, and quantum mechanics, among other recently developed physical and information theories. Research questions addressed how information evolution patterns gleaned from the study's inductive metamodel more aptly explained volatility in organizations. In this study, a hybrid grounded theory based on abstract inductive extensions of information theories was utilized as the research methodology. An overarching heuristic metamodel was framed from the theoretical analysis of the properties of these extension theories and applied to business, neural, and computational entities. This metamodel resulted in the synthesis of a metaphor for, and generalization of, organization evolution, serving as the recommended and appropriate analytical tool to view business dynamics for future applications. This study may manifest positive social change through a fundamental understanding of complexity in business from general information theories, resulting in more effective management.

  16. Statistical physics of networks, information and complex systems

    Energy Technology Data Exchange (ETDEWEB)

    Ecke, Robert E [Los Alamos National Laboratory

    2009-01-01

    In this project we explore the mathematical methods and concepts of statistical physics that are finding abundant applications across the scientific and technological spectrum from soft condensed matter systems and bioinformatics to economic and social systems. Our approach exploits the considerable similarity of concepts between statistical physics and computer science, allowing for a powerful multi-disciplinary approach that draws its strength from cross-fertilization and multiple interactions of researchers with different backgrounds. The work on this project takes advantage of the newly appreciated connection between computer science and statistics and addresses important problems in data storage, decoding, optimization, the information processing properties of the brain, the interface between quantum and classical information science, the verification of large software programs, modeling of complex systems including disease epidemiology, resource distribution issues, and the nature of highly fluctuating complex systems. Common themes that the project has been emphasizing are (i) neural computation, (ii) network theory and its applications, and (iii) a statistical physics approach to information theory. The project's efforts focus on the general problem of optimization and variational techniques, algorithm development and information theoretic approaches to quantum systems. These efforts are responsible for fruitful collaborations and the nucleation of science efforts that span multiple divisions such as EES, CCS, D, T, ISR and P. This project supports the DOE mission in Energy Security and Nuclear Non-Proliferation by developing novel information science tools for communication, sensing, and interacting complex networks such as the internet or energy distribution system. The work also supports programs in Threat Reduction and Homeland Security.

  17. Distinguishing prognostic and predictive biomarkers: An information theoretic approach.

    Science.gov (United States)

    Sechidis, Konstantinos; Papangelou, Konstantinos; Metcalfe, Paul D; Svensson, David; Weatherall, James; Brown, Gavin

    2018-05-02

    The identification of biomarkers to support decision-making is central to personalised medicine, in both clinical and research scenarios. The challenge can be seen in two halves: identifying predictive markers, which guide the development/use of tailored therapies; and identifying prognostic markers, which guide other aspects of care and clinical trial planning, i.e. prognostic markers can be considered as covariates for stratification. Mistakenly assuming a biomarker to be predictive when it is in fact largely prognostic (and vice-versa) is highly undesirable, and can result in financial, ethical and personal consequences. We present a framework for data-driven ranking of biomarkers on their prognostic/predictive strength, using a novel information theoretic method. This approach provides a natural algebra to discuss and quantify the individual predictive and prognostic strength, in a self-consistent mathematical framework. Our contribution is a novel procedure, INFO+, which naturally distinguishes the prognostic vs predictive role of each biomarker and handles higher order interactions. In a comprehensive empirical evaluation, INFO+ outperforms more complex methods, most notably when noise factors dominate and biomarkers are likely to be falsely identified as predictive when in fact they are just strongly prognostic. Furthermore, we show that our methods can be 1-3 orders of magnitude faster than competitors, making them useful for biomarker discovery in 'big data' scenarios. Finally, we apply our methods to identify predictive biomarkers on two real clinical trials, and introduce a new graphical representation that provides greater insight into the prognostic and predictive strength of each biomarker. R implementations of the suggested methods are available at https://github.com/sechidis. konstantinos.sechidis@manchester.ac.uk. Supplementary data are available at Bioinformatics online.
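
    The sketch below computes basic information-theoretic ingredients that such a ranking could build on; it is not the INFO+ procedure itself. Prognostic strength is proxied by the mutual information I(X;Y) between biomarker and outcome, and predictive strength by the biomarker-treatment interaction I(X;Y|T) - I(X;Y), using plug-in estimates for discretized variables; all names are illustrative:

```python
import numpy as np

def mi(x, y):
    """Plug-in mutual information (bits) for discrete variables."""
    _, xi = np.unique(x, return_inverse=True)
    _, yi = np.unique(y, return_inverse=True)
    p = np.bincount(xi * (yi.max() + 1) + yi,
                    minlength=(xi.max() + 1) * (yi.max() + 1)).astype(float)
    p = p.reshape(xi.max() + 1, yi.max() + 1) / len(x)
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

def cmi(x, y, t):
    """I(X;Y|T) for a discrete treatment indicator T."""
    return sum((t == v).mean() * mi(x[t == v], y[t == v])
               for v in np.unique(t))

def rank_biomarkers(biomarkers, outcome, treatment):
    """Return (prognostic proxy, predictive proxy) per biomarker."""
    return [(mi(b, outcome), cmi(b, outcome, treatment) - mi(b, outcome))
            for b in biomarkers]
```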

  18. Towards integrating control and information theories from information-theoretic measures to control performance limitations

    CERN Document Server

    Fang, Song; Ishii, Hideaki

    2017-01-01

    This book investigates the performance limitation issues in networked feedback systems. The fact that networked feedback systems consist of control and communication devices and systems calls for the integration of control theory and information theory. The primary contributions of this book lie in two aspects: the newly-proposed information-theoretic measures and the newly-discovered control performance limitations. We first propose a number of information notions to facilitate the analysis. Using those notions, classes of performance limitations of networked feedback systems, as well as state estimation systems, are then investigated. In general, the book presents a unique, cohesive treatment of performance limitation issues of networked feedback systems via an information-theoretic approach. This book is believed to be the first to treat the aforementioned subjects systematically and in a unified manner, offering a unique perspective differing from existing books.

  19. Information theoretic bounds for compressed sensing in SAR imaging

    International Nuclear Information System (INIS)

    Jingxiong, Zhang; Ke, Yang; Jianzhong, Guo

    2014-01-01

    Compressed sensing (CS) is a new framework for sampling and reconstructing sparse signals from measurements significantly fewer than those prescribed by Nyquist rate in the Shannon sampling theorem. This new strategy, applied in various application areas including synthetic aperture radar (SAR), relies on two principles: sparsity, which is related to the signals of interest, and incoherence, which refers to the sensing modality. An important question in CS-based SAR system design concerns sampling rate necessary and sufficient for exact or approximate recovery of sparse signals. In the literature, bounds of measurements (or sampling rate) in CS have been proposed from the perspective of information theory. However, these information-theoretic bounds need to be reviewed and, if necessary, validated for CS-based SAR imaging, as there are various assumptions made in the derivations of lower and upper bounds on sub-Nyquist sampling rates, which may not hold true in CS-based SAR imaging. In this paper, information-theoretic bounds of sampling rate will be analyzed. For this, the SAR measurement system is modeled as an information channel, with channel capacity and rate-distortion characteristics evaluated to enable the determination of sampling rates required for recovery of sparse scenes. Experiments based on simulated data will be undertaken to test the theoretic bounds against empirical results about sampling rates required to achieve certain detection error probabilities
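
    For orientation, the canonical sufficiency bound from the general CS literature (not specific to SAR, and resting on exactly the kinds of assumptions the paper scrutinizes) states that stable recovery of a k-sparse scene of dimension n from random measurements typically requires

```latex
m \;\gtrsim\; C \, k \, \log\!\left(\frac{n}{k}\right)
```

    measurements, where C is a modest constant depending on the sensing matrix. The paper's channel-capacity and rate-distortion analysis asks when bounds of this form remain valid for SAR measurement models.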

  20. Formal approach to modeling of modern Information Systems

    Directory of Open Access Journals (Sweden)

    Bálint Molnár

    2016-01-01

    Full Text Available Most recently, the concept of business documents has started to play a double role. On one hand, a business document (word processing text or calculation sheet) can be used as a specification tool; on the other hand, the business document is an immanent constituent of business processes, and thereby an essential component of business Information Systems. The recent tendency is that the majority of documents and their contents within business Information Systems remain in semi-structured format and a lesser part of documents is transformed into schemas of structured databases. In order to keep the emerging situation under control, we suggest the creation of (1) a theoretical framework for modeling business Information Systems; and (2) a design method for practical application based on the theoretical model that provides the structuring principles. The modeling approach that focuses on documents and their interrelationships with business processes assists in perceiving the activities of modern Information Systems.

  1. A queer-theoretical approach to community health psychology.

    Science.gov (United States)

    Easpaig, Bróna R Nic Giolla; Fryer, David M; Linn, Seònaid E; Humphrey, Rhianna H

    2014-01-01

    Queer-theoretical resources offer ways of productively rethinking how central concepts such as 'person-context', 'identity' and 'difference' may be understood for community health psychologists. This would require going beyond consideration of the problems with which queer theory is popularly associated to cautiously engage with the aspects of this work relevant to the promotion of collective practice and engaging with processes of marginalisation. In this article, we will draw upon and illustrate the queer-theoretical concepts of 'performativity' and 'cultural intelligibility' before moving towards a preliminary mapping of what a queer-informed approach to community health psychology might involve.

  2. CONTEMPORARY APPROACHES OF COMPANY PERFORMANCE ANALYSIS BASED ON RELEVANT FINANCIAL INFORMATION

    Directory of Open Access Journals (Sweden)

    Sziki Klara

    2012-12-01

    Full Text Available In this paper we chose to present two components of the financial statements: the profit and loss account and the cash flow statement. These summary documents and the different indicators calculated based on them allow us to formulate assessments of the performance and profitability of various functions and levels of the company's activity. This paper aims to support the hypothesis that the accounting information presented in the profit and loss account and in the cash flow statement is an appropriate source for assessing company performance. The purpose of this research is to answer the question linked to the main hypothesis: is it the profit and loss statement or the cash flow statement that better reflects the performance of a business? Based on the specialty literature studied, we took a conceptual, analytical and practical approach to the term performance, reviewing some terminological acceptations of the term as well as the main indicators of performance analysis based on the profit and loss account and the cash flow statement: aggregated indicators, also known as intermediary balances of administration, economic rate of return, rate of financial profitability, rate of return through cash flows, operating cash flow rate, and rate of generating operating cash out of gross operating result. At the same time we compared the profit and loss account and the cash flow statement, outlining the main advantages and disadvantages of these documents. To support the above theoretical assessments, we analyzed these indicators based on information from the financial statements of SC Sinteza SA, a company in Bihor county listed on the Bucharest Stock Exchange.

  3. Circular codes revisited: a statistical approach.

    Science.gov (United States)

    Gonzalez, D L; Giannerini, S; Rosa, R

    2011-04-21

    In 1996 Arquès and Michel [1996. A complementary circular code in the protein coding genes. J. Theor. Biol. 182, 45-58] discovered the existence of a common circular code in eukaryote and prokaryote genomes. Since then, circular code theory has provoked great interest and undergone rapid development. In this paper we discuss some theoretical issues related to the synchronization properties of coding sequences and circular codes, with particular emphasis on the problem of retrieval and maintenance of the reading frame. Motivated by the theoretical discussion, we adopt a rigorous statistical approach in order to try to answer different questions. First, we investigate the covering capability of the whole class of 216 self-complementary, C(3) maximal codes with respect to a large set of coding sequences. The results indicate that, on average, the code proposed by Arquès and Michel has the best covering capability but, still, there exists great variability among sequences. Second, we focus on this code and explore the role played by the proportion of the bases by means of a hierarchy of permutation tests. The results show the existence of a sort of optimization mechanism such that coding sequences are tailored so as to maximize or minimize the coverage of circular codes on specific reading frames. Such optimization clearly relates the function of circular codes to reading frame synchronization. Copyright © 2011 Elsevier Ltd. All rights reserved.
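
    A minimal sketch of the coverage computation described above: for a given codon set, count the fraction of trinucleotides falling inside the set in each of the three reading frames. The two-codon set below is a toy placeholder; the actual self-complementary C(3) maximal code of Arquès and Michel consists of 20 specific codons listed in the original paper.

```python
def frame_coverage(code, seq):
    """Fraction of codons of `seq`, read in each of the three frames,
    that belong to `code` (a set of trinucleotides). A circular code is
    expected to score highest in frame 0 for protein-coding sequences."""
    cov = []
    for f in range(3):
        codons = [seq[i:i + 3] for i in range(f, len(seq) - 2, 3)]
        cov.append(sum(c in code for c in codons) / max(len(codons), 1))
    return cov

toy_code = {"AAC", "GAG"}  # hypothetical placeholder set
print(frame_coverage(toy_code, "AACGAGAACGAGAAC"))  # -> [1.0, 0.0, 0.0]
```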

  4. Conference: Statistical Physics and Biological Information

    International Nuclear Information System (INIS)

    Gross, David J.; Hwa, Terence

    2001-01-01

    In the spring of 2001, the Institute for Theoretical Physics ran a 6 month scientific program on Statistical Physics and Biological Information. This program was organized by Walter Fitch (UC Irvine), Terence Hwa (UC San Diego), Luca Peliti (University Federico II), Naples Gary Stormo (Washington University School of Medicine) and Chao Tang (NEC). Overall scientific supervision was provided by David Gross, Director, ITP. The ITP has an online conference/program proceeding which consists of audio and transparencies of almost all of the talks held during this program. Over 100 talks are available on the site at http://online.kitp.ucsb.edu/online/infobio01/

  5. Canonical analysis based on mutual information

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2015-01-01

    This work replaces the correlation measure of classical canonical correlation analysis (CCA), which seeks maximally correlated linear combinations of two sets of variables, with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second order statistics. While CCA is ideal for Gaussian data, CIA facilitates the analysis of variables with arbitrary distributions.
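
    A toy sketch of the CIA idea under strong simplifying assumptions (two two-dimensional variable sets, unit-norm projections parameterized by angle, plug-in histogram MI; the authors' estimator and optimization are more sophisticated): search projection directions exhaustively and keep the pair maximizing MI rather than correlation.

```python
import numpy as np

def hist_mi(u, v, bins=16):
    """Plug-in mutual information between two 1-D samples (2-D histogram)."""
    p = np.histogram2d(u, v, bins=bins)[0]
    p /= p.sum()
    pu, pv = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (pu @ pv)[nz])).sum())

def cia_toy(X, Y, n_angles=60):
    """Grid-search unit projections of two (n, 2) data sets for maximal MI."""
    angles = np.linspace(0, np.pi, n_angles, endpoint=False)
    best = max(((hist_mi(X @ [np.cos(a), np.sin(a)],
                         Y @ [np.cos(b), np.sin(b)]), a, b)
                for a in angles for b in angles), key=lambda t: t[0])
    return best  # (MI, projection angle for X, projection angle for Y)
```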

  6. SOCIOLOGICAL UNDERSTANDING OF INTERNET: THEORETICAL APPROACHES TO THE NETWORK ANALYSIS

    Directory of Open Access Journals (Sweden)

    D. E. Dobrinskaya

    2016-01-01

    Full Text Available Internet studies are carried out by various scientific disciplines and from different research perspectives. Sociological studies of the Internet deal with a new technology, a revolutionary means of mass communication, and a social space. There is a set of research difficulties associated with the Internet. Firstly, the high speed and wide spread of Internet technologies’ development. Secondly, the collection and filtration of materials concerned with Internet studies. Lastly, the development of new conceptual categories capable of reflecting the impact of Internet development on the contemporary world. In that regard, the question of how the “network” category is used is essential. Network is the basis of Internet functioning, on the one hand. On the other hand, network is the ground for almost all social interactions in modern society, so such a society is called the network society. Three theoretical network approaches are most relevant to Internet research: network society theory, social network analysis and actor-network theory. Each of these theoretical approaches contributes to the study of the Internet. They shape various images of interactions between human beings in their entirety and dynamics. All these approaches also provide information about the nature of these interactions.

  7. An information-theoretical approach to image resolution applied to neutron imaging detectors based upon individual discriminator signals

    International Nuclear Information System (INIS)

    Clergeau, Jean-Francois; Ferraton, Matthieu; Guerard, Bruno; Khaplanov, Anton; Piscitelli, Francesco; Platz, Martin; Rigal, Jean-Marie; Van Esch, Patrick; Daulle, Thibault

    2013-06-01

    1D or 2D neutron imaging detectors with individual wire or strip readout using discriminators have the advantage of being able to treat several neutron impacts partially overlapping in time, hence reducing global dead time. A single neutron impact usually gives rise to several discriminator signals. In this paper, we introduce an information-theoretical definition of image resolution. Two point-like spots of neutron impacts with a given distance between them act as a source of information (each neutron hit belongs to one spot or the other), and the detector plus signal treatment is regarded as an imperfect communication channel that transmits this information. The maximal mutual information obtained from this channel as a function of the distance between the spots allows one to define a calibration-independent measure of resolution. We then apply this measure to quantify the resolving power of different algorithms treating these individual discriminator signals which can be implemented in firmware. The method is then applied to different detectors existing at the ILL. Center-of-gravity methods usually improve the resolution over best-wire algorithms, which are the standard way of treating these signals. (authors)

  8. An information-theoretical approach to image resolution applied to neutron imaging detectors based upon individual discriminator signals

    Energy Technology Data Exchange (ETDEWEB)

    Clergeau, Jean-Francois; Ferraton, Matthieu; Guerard, Bruno; Khaplanov, Anton; Piscitelli, Francesco; Platz, Martin; Rigal, Jean-Marie; Van Esch, Patrick [Institut Laue Langevin, Neutron Detector Service, Grenoble (France); Daulle, Thibault [PHELMA Grenoble - INP Grenoble (France)

    2013-06-15

    1D or 2D neutron imaging detectors with individual wire or strip readout using discriminators have the advantage of being able to treat several neutron impacts partially overlapping in time, hence reducing global dead time. A single neutron impact usually gives rise to several discriminator signals. In this paper, we introduce an information-theoretical definition of image resolution. Two point-like spots of neutron impacts with a given distance between them act as a source of information (each neutron hit belongs to one spot or the other), and the detector plus signal treatment is regarded as an imperfect communication channel that transmits this information. The maximal mutual information obtained from this channel as a function of the distance between the spots allows one to define a calibration-independent measure of resolution. We then apply this measure to quantify the resolving power of different algorithms treating these individual discriminator signals which can be implemented in firmware. The method is then applied to different detectors existing at the ILL. Center-of-gravity methods usually improve the resolution over best-wire algorithms, which are the standard way of treating these signals. (authors)

  9. Information-Theoretic Bounded Rationality and ε-Optimality

    Directory of Open Access Journals (Sweden)

    Daniel A. Braun

    2014-08-01

    Full Text Available Bounded rationality concerns the study of decision makers with limited information processing resources. Previously, the free energy difference functional has been suggested to model bounded rational decision making, as it provides a natural trade-off between an energy or utility function that is to be optimized and information processing costs that are measured by entropic search costs. The main question of this article is how the information-theoretic free energy model relates to simple ε-optimality models of bounded rational decision making, where the decision maker is satisfied with any action in an ε-neighborhood of the optimal utility. We find that the stochastic policies that optimize the free energy trade-off comply with the notion of ε-optimality. Moreover, this optimality criterion even holds when the environment is adversarial. We conclude that the study of bounded rationality based on ε-optimality criteria that abstract away from the particulars of the information processing constraints is compatible with the information-theoretic free energy model of bounded rationality.
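
    The free-energy policy and its ε-optimality are easy to reproduce numerically. A minimal sketch for a discrete action set: the maximizer of E_p[U] - (1/β) KL(p||p0) is the Boltzmann-like distribution below, and the expected-utility shortfall ε shrinks as the inverse temperature β (i.e., the information-processing budget) grows.

```python
import numpy as np

def free_energy_policy(U, p0, beta):
    """Policy maximizing E_p[U] - (1/beta) * KL(p || p0)."""
    w = p0 * np.exp(beta * (U - U.max()))  # shift for numerical stability
    return w / w.sum()

U = np.array([1.0, 0.9, 0.2])              # utilities of three actions
p0 = np.ones(3) / 3                        # prior over actions
for beta in (1.0, 10.0, 100.0):
    p = free_energy_policy(U, p0, beta)
    gap = U.max() - p @ U                  # epsilon: expected utility shortfall
    print(beta, p.round(3), round(gap, 4)) # gap shrinks as beta grows
```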

  10. Statistical Data Processing with R – Metadata Driven Approach

    Directory of Open Access Journals (Sweden)

    Rudi SELJAK

    2016-06-01

    Full Text Available In recent years the Statistical Office of the Republic of Slovenia has put a lot of effort into re-designing its statistical process. We replaced the classical stove-pipe oriented production system with general software solutions based on the metadata driven approach. This means that one general program code, which is parametrized with process metadata, is used for data processing for a particular survey. Currently, the general program code is entirely based on SAS macros, but in the future we would like to explore how successfully the statistical software R can be used for this approach. The paper describes the metadata driven principle for data validation, the generic software solution, and the main issues connected with the use of the statistical software R for this approach.
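
    The metadata-driven principle is language-agnostic; the sketch below illustrates it in Python (the production code is SAS macros, with R as the candidate replacement). Process metadata, here a hypothetical rule list, parameterizes one generic validation engine that any survey can reuse.

```python
rules = [  # process metadata: one generic engine, per-survey parameters
    {"field": "age",      "check": lambda v: 0 <= v <= 120},
    {"field": "turnover", "check": lambda v: v >= 0},
]

def validate(record, rules):
    """Return the fields of one survey record that fail their metadata rule."""
    return [r["field"] for r in rules if not r["check"](record[r["field"]])]

print(validate({"age": 134, "turnover": 5000}, rules))  # -> ['age']
```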

  11. Information-Theoretic Inference of Large Transcriptional Regulatory Networks

    Directory of Open Access Journals (Sweden)

    Meyer Patrick

    2007-01-01

    Full Text Available The paper presents MRNET, an original method for inferring genetic networks from microarray data. The method is based on maximum relevance/minimum redundancy (MRMR, an effective information-theoretic technique for feature selection in supervised learning. The MRMR principle consists in selecting among the least redundant variables the ones that have the highest mutual information with the target. MRNET extends this feature selection principle to networks in order to infer gene-dependence relationships from microarray data. The paper assesses MRNET by benchmarking it against RELNET, CLR, and ARACNE, three state-of-the-art information-theoretic methods for large (up to several thousands of genes network inference. Experimental results on thirty synthetically generated microarray datasets show that MRNET is competitive with these methods.
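
    A compact sketch of the MRMR selection step that MRNET applies per target gene (plug-in MI on discretized expression values; the actual package's estimators and stopping rules may differ): greedily pick the variable with maximal relevance to the target, penalized by its average redundancy with the variables already selected.

```python
import numpy as np

def mi(x, y):
    """Plug-in mutual information for integer-coded discrete variables."""
    _, xi = np.unique(x, return_inverse=True)
    _, yi = np.unique(y, return_inverse=True)
    p = np.zeros((xi.max() + 1, yi.max() + 1))
    np.add.at(p, (xi, yi), 1.0)
    p /= p.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

def mrmr(features, target, k):
    """Greedy maximum-relevance / minimum-redundancy selection."""
    selected, remaining = [], list(range(len(features)))
    while len(selected) < k and remaining:
        def score(j):
            red = (np.mean([mi(features[j], features[s]) for s in selected])
                   if selected else 0.0)
            return mi(features[j], target) - red
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```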

  13. Study of the thermo-mechanical performances of the IFMIF-EVEDA Lithium Test Loop target assembly

    Energy Technology Data Exchange (ETDEWEB)

    Di Maio, P.A., E-mail: dimaio@din.unipa.it [Dipartimento dell' Energia, Universita di Palermo, Viale delle Scienze, 90128 Palermo (Italy); Arena, P.; Bongiovi, G. [Dipartimento dell' Energia, Universita di Palermo, Viale delle Scienze, 90128 Palermo (Italy); Giammusso, R.; Micciche, G.; Tincani, A. [ENEA C. R. Brasimone, 40032 Camugnano, Bologna (Italy)

    2012-08-15

    Highlights: • IFMIF-EVEDA target assembly thermo-mechanical behavior has been investigated. • Finite element method has been followed and a commercial code has been used. • Nominal, design and pressure test steady state scenarios and start-up transient conditions have been investigated. • Steady state results have shown that back-plate yielding may occur only under the design scenario. • Transient analysis has indicated that TA start-up lasts for ~60 h. - Abstract: Within the framework of the IFMIF R and D program and in close cooperation with ENEA-Brasimone, at the Department of Energy of the University of Palermo a research campaign has been launched to investigate the thermo-mechanical behavior of the target assembly under both steady state and start-up transient conditions. A theoretical approach based on the finite element method (FEM) has been followed and a well-known commercial code has been adopted. A realistic 3D FEM model of the target assembly has been set up and optimized by running a mesh independency analysis. A proper set of loads and boundary conditions, mainly concerned with radiation heat transfer between the target assembly external walls and the inner walls of its containment vessel, have been considered, and the target assembly thermo-mechanical behavior under nominal, design and pressure test steady state scenarios and start-up transient conditions has been investigated. Results are herewith reported and discussed.

  14. Improving information extraction using a probability-based approach

    DEFF Research Database (Denmark)

    Kim, S.; Ahmed, Saeema; Wallace, K.

    2007-01-01

    Information plays a crucial role during the entire life-cycle of a product. It has been shown that engineers frequently consult colleagues to obtain the information they require to solve problems. However, the industrial world is now more transient and key personnel move to other companies ... or retire. It is becoming essential to retrieve vital information from archived product documents, if it is available. There is, therefore, great interest in ways of extracting relevant and sharable information from documents. A keyword-based search is commonly used, but studies have shown ... To improve the recall while maintaining the high precision, a learning approach that makes identification decisions based on a probability model, rather than simply looking up the presence of pre-defined variations, looks promising. This paper presents the results of developing such a probability-based entity recognition approach.

  15. Information categorization approach to literary authorship disputes

    Science.gov (United States)

    Yang, Albert C.-C.; Peng, C.-K.; Yien, H.-W.; Goldberger, Ary L.

    2003-11-01

    Scientific analysis of the linguistic styles of different authors has generated considerable interest. We present a generic approach to measuring the similarity of two symbolic sequences that requires minimal background knowledge about a given human language. Our analysis is based on word rank order-frequency statistics and phylogenetic tree construction. We demonstrate the applicability of this method to historic authorship questions related to the classic Chinese novel “The Dream of the Red Chamber,” to the plays of William Shakespeare, and to the Federalist papers. This method may also provide a simple approach to other large databases based on their information content.
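
    A minimal sketch in the spirit of the rank order-frequency comparison (the authors' exact similarity measure and tree construction are described in the paper; the top-200 cutoff and averaging rule here are illustrative assumptions): build each text's word rank profile and average the rank displacement of shared words. The resulting pairwise distances can feed a phylogenetic tree builder such as neighbor joining.

```python
from collections import Counter

def rank_profile(text, top=200):
    """Rank order of the most frequent words in a text."""
    words = text.lower().split()
    return {w: r for r, (w, _) in enumerate(Counter(words).most_common(top))}

def rank_distance(text_a, text_b, top=200):
    """Average rank displacement of shared words; smaller suggests
    more similar word-usage styles."""
    ra, rb = rank_profile(text_a, top), rank_profile(text_b, top)
    shared = set(ra) & set(rb)
    if not shared:
        return float("inf")
    return sum(abs(ra[w] - rb[w]) for w in shared) / len(shared)
```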

  16. Information theoretic analysis of edge detection in visual communication

    Science.gov (United States)

    Jiang, Bo; Rahman, Zia-ur

    2010-08-01

    Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the artifacts introduced into the process by image gathering. However, experiments show that the image gathering process profoundly impacts the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, where the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. In this paper, we perform an end-to-end, information-theory-based system analysis to assess edge detection methods. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and the parameters, such as sampling, additive noise, etc., that define the image gathering system. The edge detection algorithm is regarded as having high performance only if the information rate from the scene to the edge approaches the maximum possible. This goal can be achieved only by jointly optimizing all processes. People generally use subjective judgment to compare different edge detection methods. There has been no common tool for evaluating the performance of the different algorithms and guiding the selection of the best algorithm for a given system or scene. Our information-theoretic assessment provides such a tool, allowing us to compare the different edge detection operators in a common environment.

  17. Modeling and forecasting energy consumption for heterogeneous buildings using a physical–statistical approach

    International Nuclear Information System (INIS)

    Lü, Xiaoshu; Lu, Tao; Kibert, Charles J.; Viljanen, Martti

    2015-01-01

    Highlights: • This paper presents a new modeling method to forecast energy demands. • The model is based on physical–statistical approach to improving forecast accuracy. • A new method is proposed to address the heterogeneity challenge. • Comparison with measurements shows accurate forecasts of the model. • The first physical–statistical/heterogeneous building energy modeling approach is proposed and validated. - Abstract: Energy consumption forecasting is a critical and necessary input to planning and controlling energy usage in the building sector which accounts for 40% of the world’s energy use and the world’s greatest fraction of greenhouse gas emissions. However, due to the diversity and complexity of buildings as well as the random nature of weather conditions, energy consumption and loads are stochastic and difficult to predict. This paper presents a new methodology for energy demand forecasting that addresses the heterogeneity challenges in energy modeling of buildings. The new method is based on a physical–statistical approach designed to account for building heterogeneity to improve forecast accuracy. The physical model provides a theoretical input to characterize the underlying physical mechanism of energy flows. Then stochastic parameters are introduced into the physical model and the statistical time series model is formulated to reflect model uncertainties and individual heterogeneity in buildings. A new method of model generalization based on a convex hull technique is further derived to parameterize the individual-level model parameters for consistent model coefficients while maintaining satisfactory modeling accuracy for heterogeneous buildings. The proposed method and its validation are presented in detail for four different sports buildings with field measurements. The results show that the proposed methodology and model can provide a considerable improvement in forecasting accuracy
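
    A minimal sketch of the physical-statistical composition (illustrative only: a heating degree-day term stands in for the physical model and an AR(1) residual for the statistical layer; the paper's models are considerably richer):

```python
import numpy as np

def fit_physical_statistical(T_out, load, T_base=18.0):
    """Two-stage fit: a physical degree-day term captures heat loss, then
    an AR(1) model absorbs the remaining building-specific variation."""
    hdd = np.maximum(T_base - T_out, 0.0)            # heating degree term
    A = np.column_stack([np.ones_like(hdd), hdd])
    coef, *_ = np.linalg.lstsq(A, load, rcond=None)  # physical regression
    resid = load - A @ coef
    phi = (resid[1:] @ resid[:-1]) / (resid[:-1] @ resid[:-1])  # AR(1) coeff
    return coef, phi, resid

def forecast_next(coef, phi, T_out_next, last_resid, T_base=18.0):
    """One-step-ahead forecast: physical term plus AR(1) residual carryover."""
    hdd = max(T_base - T_out_next, 0.0)
    return coef[0] + coef[1] * hdd + phi * last_resid
```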

  18. Thermo Techno Modern Analytical Equipment for Research and Industrial Laboratories

    Directory of Open Access Journals (Sweden)

    Khokhlov, S.V.

    2014-03-01

    Full Text Available A brief overview of some models of Thermo Techno analytical equipment and possible areas of their application is given. Thermo Techno Company was created in 2000 as a part of the representative office of the international corporation Thermo Fisher Scientific, the world leader in manufacturing analytical equipment. Thermo Techno is unique in its integrated approach to solving user problems, which includes a series of steps: setting the analytical task, selection of effective analysis methods, sample delivery and preparation, as well as data transmission and archiving.

  19. Thermo Wigner operator in thermo field dynamics: its introduction and application

    International Nuclear Information System (INIS)

    Fan Hongyi; Jiang Nianquan

    2008-01-01

    Because in thermo-field dynamics (TFD) the thermo-operator has a neat expression in the thermo-entangled state representation, we need to introduce the thermo-Wigner operator (THWO) in the same representation. We derive the THWO in a direct way, which brings much convenience to calculating the Wigner functions of thermo states in TFD. We also discuss the condition for the existence of a wavefunction corresponding to a given Wigner function in the context of TFD by using the explicit form of the THWO.

  20. a Statistical Dynamic Approach to Structural Evolution of Complex Capital Market Systems

    Science.gov (United States)

    Shao, Xiao; Chai, Li H.

    As an important part of modern financial systems, the capital market has played a crucial role in diverse social resource allocations and economic exchanges. Going beyond traditional models and/or theories based on neoclassical economics, and considering capital markets as typical complex open systems, this paper attempts to develop a new approach to overcome some shortcomings of the available research. By defining the generalized entropy of capital market systems, a theoretical model and nonlinear dynamic equation on the operations of capital markets are proposed from statistical dynamic perspectives. The US security market from 1995 to 2001 is then simulated and analyzed as a typical case. Some instructive results are discussed and summarized.

  1. A BIM-based approach to reusing construction firm’s management information

    Directory of Open Access Journals (Sweden)

    Zhiliang Ma

    2012-12-01

    Full Text Available Nowadays most construction firms have begun to use information management systems in their business to work more efficiently. At the same time, a lot of management information is being accumulated and some of the information can be reused to support the decision-making. Up to now, the information has not been reused so effectively in construction firms as expected. This paper introduces a new approach to reusing construction firm’s management information, which is based on BIM (Building Information Modeling technology. In the paper, the current approaches are reviewed at first, and then the framework of the new approach is described. Next, the key issues of the new approach are clarified. Finally, a use case of the new approach is demonstrated. It is concluded that the new approach can be used in construction firms to better reuse the accumulated management information.

  2. The CASE Project: Evaluation of Case-Based Approaches to Learning and Teaching in Statistics Service Courses

    Science.gov (United States)

    Fawcett, Lee

    2017-01-01

    The CASE project (Case-based Approaches to Statistics Education; see www.mas.ncl.ac.uk/~nlf8/innovation) was established to investigate how the use of real-life, discipline-specific case study material in Statistics service courses could improve student engagement, motivation, and confidence. Ultimately, the project aims to promote deep learning…

  3. A statistical mechanical interpretation of algorithmic information theory: Total statistical mechanical interpretation based on physical argument

    International Nuclear Information System (INIS)

    Tadaki, Kohtaro

    2010-01-01

    The statistical mechanical interpretation of algorithmic information theory (AIT, for short) was introduced and developed by our former works [K. Tadaki, Local Proceedings of CiE 2008, pp. 425-434, 2008] and [K. Tadaki, Proceedings of LFCS'09, Springer's LNCS, vol. 5407, pp. 422-440, 2009], where we introduced the notion of thermodynamic quantities, such as partition function Z(T), free energy F(T), energy E(T), statistical mechanical entropy S(T), and specific heat C(T), into AIT. We then discovered that, in this interpretation, the temperature T equals the partial randomness of the values of all these thermodynamic quantities, where the notion of partial randomness is a stronger representation of the compression rate by means of program-size complexity. Furthermore, we showed that this situation holds for the temperature T itself, which is one of the most typical thermodynamic quantities. Namely, we showed that, for each of the thermodynamic quantities Z(T), F(T), E(T), and S(T) above, the computability of its value at temperature T gives a sufficient condition for T ∈ (0,1) to satisfy the condition that the partial randomness of T equals T. In this paper, based on a physical argument on the same level of mathematical strictness as normal statistical mechanics in physics, we develop a total statistical mechanical interpretation of AIT which actualizes a perfect correspondence to normal statistical mechanics. We do this by identifying a microcanonical ensemble in the framework of AIT. As a result, we clarify the statistical mechanical meaning of the thermodynamic quantities of AIT.
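
    By analogy with normal statistical mechanics, with program length |p| playing the role of energy and an optimal prefix-free machine U supplying the states, these quantities take the following form (a sketch of the standard analogues; Z(T) generalizes Chaitin's Ω, recovered at T = 1, and the precise definitions are in the cited papers):

```latex
Z(T) = \sum_{p \,\in\, \mathrm{dom}\,U} 2^{-|p|/T}, \qquad
F(T) = -T \log_2 Z(T), \qquad
E(T) = \frac{1}{Z(T)} \sum_{p \,\in\, \mathrm{dom}\,U} |p|\, 2^{-|p|/T}, \qquad
S(T) = \frac{E(T) - F(T)}{T}.
```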

  4. On the analysis of genome-wide association studies in family-based designs: a universal, robust analysis approach and an application to four genome-wide association studies.

    Directory of Open Access Journals (Sweden)

    Sungho Won

    2009-11-01

    Full Text Available For genome-wide association studies in family-based designs, we propose a new, universally applicable approach. The new test statistic exploits all available information about the association, while, by virtue of its design, it maintains the same robustness against population admixture as traditional family-based approaches that are based exclusively on the within-family information. The approach is suitable for the analysis of almost any trait type, e.g. binary, continuous, time-to-onset, multivariate, etc., and combinations of those. We use simulation studies to verify all theoretically derived properties of the approach, estimate its power, and compare it with other standard approaches. We illustrate the practical implications of the new analysis method by an application to a lung-function phenotype, forced expiratory volume in one second (FEV1), in 4 genome-wide association studies.

  5. A novel genome-information content-based statistic for genome-wide association analysis designed for next-generation sequencing data.

    Science.gov (United States)

    Luo, Li; Zhu, Yun; Xiong, Momiao

    2012-06-01

    The genome-wide association studies (GWAS) designed for next-generation sequencing data involve testing association of genomic variants, including common, low frequency, and rare variants. The current strategies for association studies are well developed for identifying association of common variants with common diseases, but may be ill-suited when large amounts of allelic heterogeneity are present in sequence data. Recently, group tests that analyze collective frequency differences between cases and controls have shifted the current variant-by-variant analysis paradigm for GWAS of common variants to the collective testing of multiple variants in the association analysis of rare variants. However, group tests ignore differences in genetic effects among SNPs at different genomic locations. As an alternative to group tests, we developed a novel genome-information content-based statistic for testing association of the entire allele frequency spectrum of genomic variation with disease. To evaluate the performance of the proposed statistic, we use large-scale simulations based on whole genome low coverage pilot data in the 1000 Genomes Project to calculate the type 1 error rates and power of seven alternative statistics: a genome-information content-based statistic, the generalized T(2), the collapsing method, the combined multivariate and collapsing (CMC) method, the individual χ(2) test, the weighted-sum statistic, and the variable threshold statistic. Finally, we apply the seven statistics to a published resequencing dataset from the ANGPTL3, ANGPTL4, ANGPTL5, and ANGPTL6 genes in the Dallas Heart Study. We report that the genome-information content-based statistic has significantly improved type 1 error rates and higher power than the other six statistics in both simulated and empirical datasets.

  6. Information-theoretic approach to lead-lag effect on financial markets

    Science.gov (United States)

    Fiedor, Paweł

    2014-08-01

    Recently the interest of researchers has shifted from the analysis of synchronous relationships of financial instruments to the analysis of more meaningful asynchronous relationships. Both types of analysis are concentrated mostly on Pearson's correlation coefficient and consequently intraday lead-lag relationships (where one of the variables in a pair is time-lagged) are also associated with them. Under the Efficient-Market Hypothesis such relationships are not possible as all information is embedded in the prices, but in real markets we find such dependencies. In this paper we analyse lead-lag relationships of financial instruments and extend known methodology by using mutual information instead of Pearson's correlation coefficient. Mutual information is not only a more general measure, sensitive to non-linear dependencies, but also can lead to a simpler procedure of statistical validation of links between financial instruments. We analyse lagged relationships using New York Stock Exchange 100 data not only on an intraday level, but also for daily stock returns, which have usually been ignored.
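
    A minimal sketch of the lagged-MI measurement (histogram plug-in estimate; bin and surrogate counts are illustrative, and the paper's estimator and validation scheme may differ). A peak of lagged_mi(x, y, lag) at some lag > 0, well above the shuffled-surrogate null, suggests that x leads y:

```python
import numpy as np

def lagged_mi(x, y, lag, bins=8):
    """Mutual information between x(t) and y(t + lag) via a 2-D histogram."""
    if lag > 0:
        u, v = x[:-lag], y[lag:]
    elif lag < 0:
        u, v = x[-lag:], y[:lag]
    else:
        u, v = x, y
    p = np.histogram2d(u, v, bins=bins)[0]
    p /= p.sum()
    pu, pv = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (pu @ pv)[nz])).sum())

def mi_null(x, y, lag, n_surr=200, seed=0):
    """Null distribution of the lagged MI from shuffled surrogate series."""
    rng = np.random.default_rng(seed)
    return np.array([lagged_mi(rng.permutation(x), y, lag)
                     for _ in range(n_surr)])
```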

  7. A Theoretical Framework for Soft-Information-Based Synchronization in Iterative (Turbo Receivers

    Directory of Open Access Journals (Sweden)

    Lottici Vincenzo

    2005-01-01

    Full Text Available This contribution considers turbo synchronization, that is to say, the use of soft data information to estimate parameters like the carrier phase, frequency, or timing offsets of a modulated signal within an iterative data demodulator. In turbo synchronization, the receiver exploits the soft decisions computed at each turbo decoding iteration to provide a reliable estimate of some signal parameters. The aim of our paper is to show that such a "turbo-estimation" approach can be regarded as a special case of the expectation-maximization (EM) algorithm. This leads to a general theoretical framework for turbo synchronization that allows one to derive parameter estimation procedures for carrier phase and frequency offset, as well as for timing offset and signal amplitude. The proposed mathematical framework is illustrated by simulation results reported for the particular case of carrier phase and frequency offset estimation of a turbo-coded 16-QAM signal.
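
    The claimed connection is the standard EM iteration specialized to synchronization: with r the received signal, a the transmitted symbols (the latent data) and θ the parameter to estimate (carrier phase, frequency, or timing offset), each turbo iteration performs

```latex
\hat{\theta}^{(k+1)} \;=\; \arg\max_{\theta}\;
\sum_{\mathbf{a}} P\!\left(\mathbf{a} \mid \mathbf{r}, \hat{\theta}^{(k)}\right)
\log p\!\left(\mathbf{r} \mid \mathbf{a}, \theta\right),
```

    where the posterior symbol probabilities are exactly the soft decisions delivered by the turbo decoder at iteration k.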

  8. A gauge-theoretic approach to gravity.

    Science.gov (United States)

    Krasnov, Kirill

    2012-08-08

    Einstein's general relativity (GR) is a dynamical theory of the space-time metric. We describe an approach in which GR becomes an SU(2) gauge theory. We start at the linearized level and show how a gauge-theoretic Lagrangian for non-interacting massless spin two particles (gravitons) takes a much more simple and compact form than in the standard metric description. Moreover, in contrast to the GR situation, the gauge theory Lagrangian is convex. We then proceed with a formulation of the full nonlinear theory. The equivalence to the metric-based GR holds only at the level of solutions of the field equations, that is, on-shell. The gauge-theoretic approach also makes it clear that GR is not the only interacting theory of massless spin two particles, in spite of the GR uniqueness theorems available in the metric description. Thus, there is an infinite-parameter class of gravity theories all describing just two propagating polarizations of the graviton. We describe how matter can be coupled to gravity in this formulation and, in particular, how both the gravity and Yang-Mills arise as sectors of a general diffeomorphism-invariant gauge theory. We finish by outlining a possible scenario of the ultraviolet completion of quantum gravity within this approach.

  9. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA

  10. Thermo-responsive cell culture carrier: Effects on macrophage functionality and detachment efficiency.

    Science.gov (United States)

    Rennert, Knut; Nitschke, Mirko; Wallert, Maria; Keune, Natalie; Raasch, Martin; Lorkowski, Stefan; Mosig, Alexander S

    2017-01-01

    Harvesting cultivated macrophages for tissue engineering purposes by enzymatic digestion of cell adhesion molecules can potentially result in unintended activation, altered function, or behavior of these cells. Thermo-responsive polymer is a promising tool that allows for gentle macrophage detachment without artificial activation prior to subculture within engineered tissue constructs. We therefore characterized different species of thermo-responsive polymers for their suitability as cell substrate and to mediate gentle macrophage detachment by temperature shift. Primary human monocyte- and THP-1-derived macrophages were cultured on thermo-responsive polymers and characterized for phagocytosis and cytokine secretion in response to lipopolysaccharide stimulation. We found that both cell types differentially respond in dependence of culture and stimulation on thermo-responsive polymers. In contrast to THP-1 macrophages, primary monocyte-derived macrophages showed no signs of impaired viability, artificial activation, or altered functionality due to culture on thermo-responsive polymers compared to conventional cell culture. Our study demonstrates that along with commercially available UpCell carriers, two other thermo-responsive polymers based on poly(vinyl methyl ether) blends are attractive candidates for differentiation and gentle detachment of primary monocyte-derived macrophages. In summary, we observed similar functionality and viability of primary monocyte-derived macrophages cultured on thermo-responsive polymers compared to standard cell culture surfaces. While this first generation of custom-made thermo-responsive polymers does not yet outperform standard culture approaches, our results are very promising and provide the basis for exploiting the unique advantages offered by custom-made thermo-responsive polymers to further improve macrophage culture and recovery in the future, including the covalent binding of signaling molecules and the reduction of

  11. The role of quantum information in thermodynamics—a topical review

    International Nuclear Information System (INIS)

    Goold, John; Huber, Marcus; Riera, Arnau; Skrzypczyk, Paul; Rio, Lídia del

    2016-01-01

    This topical review article gives an overview of the interplay between quantum information theory and thermodynamics of quantum systems. We focus on several trending topics including the foundations of statistical mechanics, resource theories, entanglement in thermodynamic settings, fluctuation theorems and thermal machines. This is not a comprehensive review of the diverse field of quantum thermodynamics; rather, it is a convenient entry point for the thermo-curious information theorist. Furthermore this review should facilitate the unification and understanding of different interdisciplinary approaches emerging in research groups around the world. (topical review)

  13. Blogging in Higher Education: Theoretical and Practical Approach

    OpenAIRE

    Gulfidan CAN; Devrim OZDEMIR

    2006-01-01

    In this paper the blogging method, which includes new forms of writing, is supported as an alternative approach to address the frequently asserted problems in higher education such as product-oriented assessment and lack of value given to students' writing as contribution to the discourse of the academic disciplines. Both theoretical and research background information is provided to clarify the rationale of using this method in higher education. Furthermore, recommended way of using this met...

  14. Enhanced pathway efficiency of Saccharomyces cerevisiae by introducing thermo-tolerant devices.

    Science.gov (United States)

    Liu, Yueqin; Zhang, Genli; Sun, Huan; Sun, Xiangying; Jiang, Nisi; Rasool, Aamir; Lin, Zhanglin; Li, Chun

    2014-10-01

    In this study, thermo-tolerant devices consisting of heat shock genes from thermophiles were designed and introduced into Saccharomyces cerevisiae to improve its thermo-tolerance. Among ten engineered thermo-tolerant yeasts, T.te-TTE2469, T.te-GroS2 and T.te-IbpA displayed over 25% higher cell density and 1.5-4-fold higher cell viability compared with the control. Physiological characterization of the thermo-tolerant strains revealed that better cell wall integrity, higher trehalose content and enhanced metabolic energy were preserved by the thermo-tolerant devices. An engineered thermo-tolerant strain was used to investigate the impact of the thermo-tolerant device on pathway efficiency by introducing the β-amyrin synthesis pathway; it showed a 28.1% increase in β-amyrin titer, a growth temperature range broadened to 28-35°C, and a fermentation period shortened by 72 h. The results indicated that implanting heat shock proteins from thermophiles into S. cerevisiae would be an efficient approach to improving its thermo-tolerance. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Reconstructing Macroeconomics Based on Statistical Physics

    Science.gov (United States)

    Aoki, Masanao; Yoshikawa, Hiroshi

    We believe that the time has come to integrate the new approach based on statistical physics, or econophysics, into macroeconomics. Toward this goal, there must be more dialogue between physicists and economists. In this paper, we argue that there is no reason why the methods of statistical physics, so successful in many fields of natural science, cannot be usefully applied to macroeconomics, which is meant to analyze the macroeconomy comprising a large number of economic agents. It is, in fact, odd to regard the macroeconomy as a homothetic enlargement of the representative micro agent. We trust in the bright future of the new approach to macroeconomics based on statistical physics.

  16. Statistical methods of combining information: Applications to sensor data fusion

    Energy Technology Data Exchange (ETDEWEB)

    Burr, T.

    1996-12-31

    This paper reviews some statistical approaches to combining information from multiple sources. Promising new approaches will be described, and potential applications to combining not-so-different data sources such as sensor data will be discussed. Experiences with one real data set are described.
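
    As an elementary illustration of what "combining information" can mean in the sensor-fusion setting, the sketch below fuses independent Gaussian measurements of one quantity by inverse-variance weighting; this is a generic textbook device, not one of the specific approaches reviewed in the paper:

    ```python
    import numpy as np

    def fuse(estimates, variances):
        """Inverse-variance weighted fusion of independent Gaussian estimates.

        Returns the fused estimate and its (smaller) variance.
        """
        w = 1.0 / np.asarray(variances, dtype=float)   # precision weights
        var_fused = 1.0 / w.sum()                      # fused variance
        x_fused = var_fused * (w * np.asarray(estimates, dtype=float)).sum()
        return x_fused, var_fused

    # Three sensors measuring the same temperature with different noise levels
    x, v = fuse([20.1, 19.6, 20.4], [0.5, 0.2, 1.0])
    print(f"fused estimate = {x:.2f}, fused variance = {v:.3f}")
    ```

    The fused variance is always smaller than the best individual one, which is the simplest quantitative sense in which combining sources adds information.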

  17. Role of information theoretic uncertainty relations in quantum theory

    International Nuclear Information System (INIS)

    Jizba, Petr; Dunningham, Jacob A.; Joo, Jaewoo

    2015-01-01

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed

  18. Role of information theoretic uncertainty relations in quantum theory

    Energy Technology Data Exchange (ETDEWEB)

    Jizba, Petr, E-mail: p.jizba@fjfi.cvut.cz [FNSPE, Czech Technical University in Prague, Břehová 7, 115 19 Praha 1 (Czech Republic); ITP, Freie Universität Berlin, Arnimallee 14, D-14195 Berlin (Germany); Dunningham, Jacob A., E-mail: J.Dunningham@sussex.ac.uk [Department of Physics and Astronomy, University of Sussex, Falmer, Brighton, BN1 9QH (United Kingdom); Joo, Jaewoo, E-mail: j.joo@surrey.ac.uk [Advanced Technology Institute and Department of Physics, University of Surrey, Guildford, GU2 7XH (United Kingdom)

    2015-04-15

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.

  19. Investigation on thermo-acoustic instability dynamic characteristics of hydrocarbon fuel flowing in scramjet cooling channel based on wavelet entropy method

    Science.gov (United States)

    Zan, Hao; Li, Haowei; Jiang, Yuguang; Wu, Meng; Zhou, Weixing; Bao, Wen

    2018-06-01

    As part of our efforts to further improve regenerative cooling technology in scramjets, experiments on the thermo-acoustic instability dynamics of flowing hydrocarbon fuel were conducted in horizontal circular tubes under different conditions. The experimental results indicate that there is a developing process from thermo-acoustic stability to instability. To gain a deeper understanding of this developing process, the method of Multi-scale Shannon Wavelet Entropy (MSWE), based on a Wavelet Transform Correlation Filter (WTCF) and Multi-Scale Shannon Entropy (MSE), is adopted in this paper. The results demonstrate that the development of thermo-acoustic instability out of noise and weak signals is well detected by the MSWE method, and that the stable regime, the developing process and the instability can be distinguished. These properties make the method particularly powerful for early warning of thermo-acoustic instability of hydrocarbon fuel flowing in scramjet cooling channels. The mass flow rate and the inlet pressure influence the development of the thermo-acoustic instability. This investigation of thermo-acoustic instability dynamics at supercritical pressure based on the wavelet entropy method offers guidance for the control of the scramjet fuel supply, helping to secure stable fuel flow in the regenerative cooling system.
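
    A minimal sketch of a wavelet-entropy indicator in the spirit of MSWE is given below (window length, wavelet and decomposition depth are illustrative choices, and the WTCF pre-filtering step is omitted):

    ```python
    import numpy as np
    import pywt  # PyWavelets

    def wavelet_entropy(x, wavelet="db4", level=5):
        """Shannon entropy of the relative wavelet energies of one signal window."""
        coeffs = pywt.wavedec(x, wavelet, level=level)
        energies = np.array([np.sum(c ** 2) for c in coeffs])
        p = energies / energies.sum()          # relative energy per scale
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def entropy_track(signal, win=1024, step=256):
        """Slide a window along a pressure trace; a systematic shift in entropy
        can flag the transition from broadband noise to organised oscillation."""
        return [wavelet_entropy(signal[i:i + win])
                for i in range(0, len(signal) - win, step)]
    ```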

  20. Statistical Angular Resolution Limit for Ultrawideband MIMO Noise Radar

    Directory of Open Access Journals (Sweden)

    Xiaoli Zhou

    2015-01-01

    Full Text Available The two-dimensional angular resolution limit (ARL) of elevation and azimuth for MIMO radar with ultrawideband (UWB) noise waveforms is investigated using statistical resolution theory. First, the signal model of monostatic UWB MIMO noise radar is established in a 3D reference frame. Then, the statistical angular resolution limits (SARLs) of two closely spaced targets are derived using the detection-theoretic and estimation-theoretic approaches, respectively. The detection-theoretic approach is based on the generalized likelihood ratio test (GLRT) with given probabilities of false alarm and detection, while the estimation-theoretic approach is based on Smith’s criterion, which involves the Cramér-Rao lower bound (CRLB). Furthermore, the relationship between the two approaches is presented, and the factors affecting the SARL, that is, detection parameters, transmit waveforms, array geometry, signal-to-noise ratio (SNR), and target parameters (i.e., radar cross section (RCS) and direction), are analyzed. Compared with the conventional radar resolution theory defined by the ambiguity function, the SARL reflects the practical resolution ability of radar and can provide an optimization criterion for radar system design.
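
    As a compact rendering of the estimation-theoretic side, Smith's criterion defines the resolution limit implicitly: a separation is resolvable once it exceeds its own Cramér-Rao standard deviation. The following one-parameter form is a generic statement of the criterion, not the paper's full two-dimensional derivation:

    $$\delta_{\mathrm{SARL}} = \min\{\delta > 0 : \delta^{2} \ge \mathrm{CRLB}(\delta)\},$$

    so the SARL is the fixed point of $\delta^{2} = \mathrm{CRLB}(\delta)$; below it, the estimation error of the separation is larger than the separation itself and the two targets cannot be reliably resolved.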

  1. Information theoretic quantification of diagnostic uncertainty.

    Science.gov (United States)

    Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

    2012-01-01

    Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes' theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
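
    To make the information-theoretic reading concrete, the sketch below combines Bayes' rule with binary Shannon entropy: it computes the post-test disease probabilities for each possible result and the expected reduction in diagnostic uncertainty (all numbers are illustrative):

    ```python
    import math

    def h2(p):
        """Binary Shannon entropy in bits."""
        return 0.0 if p in (0.0, 1.0) else -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

    def posttest(pre, sens, spec):
        p_pos = sens * pre + (1 - spec) * (1 - pre)      # P(test positive)
        post_pos = sens * pre / p_pos                    # P(disease | positive)
        post_neg = (1 - sens) * pre / (1 - p_pos)        # P(disease | negative)
        return p_pos, post_pos, post_neg

    pre, sens, spec = 0.10, 0.90, 0.80                   # illustrative values
    p_pos, post_pos, post_neg = posttest(pre, sens, spec)
    expected_post = p_pos * h2(post_pos) + (1 - p_pos) * h2(post_neg)
    print(f"pre-test entropy  : {h2(pre):.3f} bits")
    print(f"expected post-test: {expected_post:.3f} bits")
    print(f"information gain  : {h2(pre) - expected_post:.3f} bits")  # mutual information of test and disease
    ```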

  2. Informing Physics: Jacob Bekenstein and the Informational Turn in Theoretical Physics

    Science.gov (United States)

    Belfer, Israel

    2014-03-01

    In his PhD dissertation in the early 1970s, the Mexican-Israeli theoretical physicist Jacob Bekenstein developed the thermodynamics of black holes using a generalized version of the second law of thermodynamics. This work made it possible for physicists to describe and analyze black holes using information-theoretical concepts. It also helped to transform information theory into a fundamental and foundational concept in theoretical physics. The story of Bekenstein's work—which was initially opposed by many scientists, including Stephen Hawking—highlights the transformation within physics towards an information-oriented scientific mode of theorizing. This "informational turn" amounted to a mild-mannered revolution within physics, revolutionary without being rebellious.

  3. Multi-scale modeling of the thermo-hydro- mechanical behaviour of heterogeneous materials. Application to cement-based materials under severe loads

    International Nuclear Information System (INIS)

    Grondin, Frederic Alain

    2005-01-01

    The modeling work presented here concerns the thermo-hydro-mechanical behaviour of porous materials based on hydraulic binders, such as concrete, High Performance Concrete or, more generally, cement-based materials. This work builds on the Digital Concrete model of the finite element code Symphonie, developed at the Scientific and Technical Centre for Building (CSTB), coupled with homogenization methods to obtain macroscopic behaviour laws derived from Micro-Macro relations. Both macroscopic and microscopic scales of investigation were exploited by simulation in order to allow a fine understanding of the behaviour of cement-based materials under thermal, hydrous and mechanical loads. It appears necessary to take into account various scales of modeling: in order to study the behaviour of the structure, we are led to reduce the scale of investigation and study the material itself more closely. The research presented suggests a new approach for identifying the multi-physics behaviour of materials by simulation. Complementing the purely experimental approach, based on observations of samples with measurements of the apparent parameters at the macroscopic scale, this new approach provides a fine analysis of the elementary mechanisms acting within the material. These elementary mechanisms are at the origin of the evolution of the macroscopic parameters measured in experimental tests. In this work, the coefficients of the thermo-hydro-mechanical behaviour law of porous materials and the equivalent hydraulic conductivity were obtained by a multi-scale approach. Applications were carried out on the study of the damaged behaviour of cement-based materials, with the objective of determining the elasticity tensor and the permeability tensor of a High Performance Concrete at high temperatures under a mechanical load. Also, the study of the strain evolution of cement-based materials at low

  4. New advances in the statistical parton distributions approach*

    Directory of Open Access Journals (Sweden)

    Soffer Jacques

    2016-01-01

    Full Text Available The quantum statistical parton distributions approach proposed more than one decade ago is revisited by considering a larger set of recent and accurate Deep Inelastic Scattering experimental results. It enables us to improve the description of the data by means of a new determination of the parton distributions. This global next-to-leading order QCD analysis leads to a good description of several structure functions, involving unpolarized parton distributions and helicity distributions, in terms of a rather small number of free parameters. There are many serious challenging issues. The predictions of this theoretical approach will be tested for single-jet production and charge asymmetry in W± production in p̄p and pp collisions up to LHC energies, using recent data and also for forthcoming experimental results.

  5. Systems information management: graph theoretical approach

    NARCIS (Netherlands)

    Temel, T.

    2006-01-01

    This study proposes a new method for characterising the underlying information structure of a multi-sector system. A complete characterisation is accomplished by identifying information gaps and cause-effect information pathways in the system, and formulating critical testable hypotheses.

  6. How cells engulf: a review of theoretical approaches to phagocytosis

    Science.gov (United States)

    Richards, David M.; Endres, Robert G.

    2017-12-01

    Phagocytosis is a fascinating process whereby a cell surrounds and engulfs particles such as bacteria and dead cells. This is crucial both for single-cell organisms (as a way of acquiring nutrients) and as part of the immune system (to destroy foreign invaders). This whole process is hugely complex and involves multiple coordinated events such as membrane remodelling, receptor motion, cytoskeleton reorganisation and intracellular signalling. Because of this, phagocytosis is an excellent system for theoretical study, benefiting from biophysical approaches combined with mathematical modelling. Here, we review these theoretical approaches and discuss the recent mathematical and computational models, including models based on receptors, models focusing on the forces involved, and models employing energetic considerations. Along the way, we highlight a beautiful connection to the physics of phase transitions, consider the role of stochasticity, and examine links between phagocytosis and other types of endocytosis. We cover the recently discovered multistage nature of phagocytosis, showing that the size of the phagocytic cup grows in distinct stages, with an initial slow stage followed by a much quicker second stage starting around half engulfment. We also address the issue of target shape dependence, which is relevant to both pathogen infection and drug delivery, covering both one-dimensional and two-dimensional results. Throughout, we pay particular attention to recent experimental techniques that continue to inform the theoretical studies and provide a means to test model predictions. Finally, we discuss population models, connections to other biological processes, and how physics and modelling will continue to play a key role in future work in this area.

  7. Thermal energy storage using thermo-chemical heat pump

    International Nuclear Information System (INIS)

    Hamdan, M.A.; Rossides, S.D.; Haj Khalil, R.

    2013-01-01

    Highlights: ► Understanding of the performance of a thermo-chemical heat pump. ► Tool for storing thermal energy. ► Parameters that affect the amount of thermal energy stored. ► Lithium chloride has a better effect on storing thermal energy. - Abstract: A theoretical study was performed to investigate the potential of storing thermal energy using a heat pump, a thermo-chemical storage system consisting of water as the sorbate and sodium chloride as the sorbent. The effect of different parameters, namely the amount of water vaporized from the evaporator, the initial system temperature and the type of salt, on the temperature increase of the salt, and hence on the performance of the thermo-chemical heat pump, was investigated. It was found that the performance of the heat pump improves with the initial system temperature, with the amount of water vaporized and with the water remaining in the system. Finally, it was also found that lithium chloride salt has a stronger effect on the performance of the heat pump than sodium chloride.
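
    A back-of-the-envelope sketch of the temperature rise produced when the salt sorbs water vapour (a lumped, loss-free energy balance; every property value below is an assumed placeholder, not the paper's data):

    ```python
    # Lumped energy balance for a thermo-chemical (sorption) heat store:
    # the heat released by sorbing m_w kg of water vapour warms the salt bed.
    dh_sorption = 2.8e6   # J per kg of water sorbed (assumed order of magnitude)
    m_w = 0.05            # kg of water vapour taken up from the evaporator
    m_salt = 1.0          # kg of salt (e.g. NaCl or LiCl)
    cp_salt = 870.0       # J/(kg K), assumed effective heat capacity of the bed
    T0 = 25.0             # deg C, initial system temperature

    q = m_w * dh_sorption                # heat released by sorption
    dT = q / (m_salt * cp_salt)          # idealized temperature rise of the bed
    print(f"released heat: {q/1e3:.0f} kJ -> temperature rise: {dT:.0f} K from {T0} C")
    ```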

  8. Strategists and Non-Strategists in Austrian Enterprises—Statistical Approaches

    Science.gov (United States)

    Duller, Christine

    2011-09-01

    The purpose of this work is to determine, with a modern statistical approach, which variables can indicate whether an arbitrary enterprise uses strategic management as its basic business concept. "Strategic management is an ongoing process that evaluates and controls the business and the industries in which the company is involved; assesses its competitors and sets goals and strategies to meet all existing and potential competitors; and then reassesses each strategy annually or quarterly (i.e. regularly) to determine how it has been implemented and whether it has succeeded or needs replacement by a new strategy to meet changed circumstances, new technology, new competitors, a new economic environment or a new social, financial or political environment." [12] In Austria 70% to 80% of all enterprises can be classified as family firms. In the literature, the empirically untested hypothesis can be found that family firms tend to have less formalised management accounting systems than non-family enterprises. But it is unknown whether the use of strategic management accounting systems is influenced more by structure (family or non-family enterprise) or by size (number of employees). Therefore, the goal is to split enterprises into two subgroups, namely strategists and non-strategists, and to obtain information on the variables of influence (size, structure, branches, etc.). Two statistical approaches are used: on the one hand, a classical cluster analysis is implemented to derive the two subgroups; on the other hand, a latent class model is built for this problem. After a description of the theoretical background, first results of both strategies are compared.
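
    A minimal illustration of the first of the two approaches, a two-cluster k-means on firm-level indicators (the feature names are hypothetical and the latent class model is not reproduced):

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    # Hypothetical firm-level data: [employees, family_firm (0/1), formal_planning_score]
    X = np.array([[12, 1, 0.2], [250, 0, 0.9], [40, 1, 0.4],
                  [500, 0, 0.8], [8, 1, 0.1], [120, 0, 0.7]])

    Xs = StandardScaler().fit_transform(X)   # otherwise firm size dominates the distance
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Xs)
    print(labels)   # tentative split into 'strategists' vs 'non-strategists'
    ```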

  9. Thermo-electrical systems for the generation of electricity

    International Nuclear Information System (INIS)

    Bitschi, A.; Froehlich, K.

    2010-01-01

    This article takes a look at theoretical models concerning thermo-electrical systems for the generation of electricity and demonstrations of technology actually realised. The potentials available and developments are discussed. The efficient use of energy along the whole generation and supply chain, as well as the use of renewable energy sources are considered as being two decisive factors in the attainment of a sustainable energy supply system. The large amount of unused waste heat available today in energy generation, industrial processes, transport systems and public buildings is commented on. Thermo-electric conversion systems are discussed and work being done on the subject at the Swiss Federal Institute of Technology in Zurich is discussed. The findings are discussed and results are presented in graphical form

  10. Radiology information system: a workflow-based approach

    International Nuclear Information System (INIS)

    Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; Aalst, W.M.P. van der

    2009-01-01

    Introducing workflow management technology in healthcare seems promising for dealing with the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study a method of developing a workflow-based information system for a radiology department as a use case. First, a workflow model of the typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. Legacy systems could be treated as special components, which also corresponded to tasks and were integrated by converting non-workflow-aware interfaces to standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of changing processes and that it enhanced process management in the department. It can also provide a more workflow-aware integration method compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing a radiology information system with more flexibility, more process management functionality and more workflow-aware integration. The work of this paper is an initial endeavor in introducing workflow management technology into healthcare. (orig.)

  11. Steady shear characteristic and behavior of magneto-thermo-elasticity of isotropic MR elastomers

    International Nuclear Information System (INIS)

    Gao, Wei; Wang, Xingzhe

    2016-01-01

    The magneto-thermo-elastic steady shear behavior of isotropic smart composites of a silicone rubber matrix randomly filled with ferromagnetic particles, commonly referred to as magnetorheological (MR) elastomers, is investigated experimentally and theoretically in the present study. Strip specimens of the MR elastomer composite with different ferromagnetic particle concentrations were fabricated and subjected to lap-shear tests under both magnetic and thermal fields. It is shown that the magneto-thermo-elastic shear modulus of the MR elastomer is markedly enhanced with the volume fraction of ferromagnetic particles and the applied external magnetic field, while the shear modulus decreases with the environmental temperature. To qualitatively elucidate the magneto-thermo-elastic shear performance of this kind of magnetic smart composite, a modified hyperelastic constitutive model is suggested, taking into account the influence of the magnetic field and temperature on the magnetic potential energy and strain energy. The theoretical predictions of the stress–strain behavior for different applied magnetic fields and environmental temperatures are compared to experimental observations and show good agreement. (paper)

  12. An Explicit Approach Toward Modeling Thermo-Coupled Deformation Behaviors of SMPs

    Directory of Open Access Journals (Sweden)

    Hao Li

    2017-03-01

    Full Text Available A new elastoplastic J2-flow model with thermal effects is proposed for simulating thermo-coupled finite deformation behaviors of shape memory polymers. In this new model, an elastic potential evolving with the development of plastic flow is incorporated to characterize the stress-softening effect at unloading and, moreover, thermo-induced plastic flow is introduced to represent the strain recovery effect at heating. It is shown that any given test data for both effects may be accurately simulated by means of direct and explicit procedures. Numerical examples of model predictions compare well with test data in the literature.

  13. Child education and management: theoretical approaches on legislation

    Directory of Open Access Journals (Sweden)

    Rúbia Borges

    2017-11-01

    Full Text Available The aim of this work was to investigate theoretical approaches regarding daycare centers and management, considering childhood education for different audiences, such as children and babies, from the childhood perspective. Using a qualitative approach, this research is bibliographical and reflects on official documents about the theme. The research was developed through an analysis of Brazilian educational laws, starting with the Federal Constitution (FC), the Law of Guidelines and Bases for National Education (LGB), the National Curriculum Guidelines and the National Education Plan (ENP). The results point to generalist legislation that allows a certain autonomy in education. However, there is a need to deepen theoretical and practical studies on the reality of institutions whose paramount purpose is education, in order to offer quality education that attends to the needs of the audiences of these institutions.

  14. Locating sensors for detecting source-to-target patterns of special nuclear material smuggling: a spatial information theoretic approach.

    Science.gov (United States)

    Przybyla, Jay; Taylor, Jeffrey; Zhou, Xuesong

    2010-01-01

    In this paper, a spatial information-theoretic model is proposed to locate sensors for detecting source-to-target patterns of special nuclear material (SNM) smuggling. In order to ship the nuclear materials from a source location with SNM production to a target city, the smugglers must employ global and domestic logistics systems. This paper focuses on locating a limited set of fixed and mobile radiation sensors in a transportation network, with the intent to maximize the expected information gain and minimize the estimation error for the subsequent nuclear material detection stage. A Kalman filtering-based framework is adapted to assist the decision-maker in quantifying the network-wide information gain and SNM flow estimation accuracy.

  15. Locating Sensors for Detecting Source-to-Target Patterns of Special Nuclear Material Smuggling: A Spatial Information Theoretic Approach

    Directory of Open Access Journals (Sweden)

    Xuesong Zhou

    2010-08-01

    Full Text Available In this paper, a spatial information-theoretic model is proposed to locate sensors for detecting source-to-target patterns of special nuclear material (SNM) smuggling. In order to ship the nuclear materials from a source location with SNM production to a target city, the smugglers must employ global and domestic logistics systems. This paper focuses on locating a limited set of fixed and mobile radiation sensors in a transportation network, with the intent to maximize the expected information gain and minimize the estimation error for the subsequent nuclear material detection stage. A Kalman filtering-based framework is adapted to assist the decision-maker in quantifying the network-wide information gain and SNM flow estimation accuracy.
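
    To give a flavor of the Kalman filtering-based quantification, the toy sketch below scores candidate sensor placements by the entropy reduction (log-determinant drop) that one measurement update produces in the flow-estimate covariance; the network model and all matrices are illustrative assumptions, not the paper's formulation:

    ```python
    import numpy as np

    def info_gain(P, H, R):
        """Entropy reduction (nats) of a Gaussian state after observing z = Hx + v."""
        S = H @ P @ H.T + R                        # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
        P_post = (np.eye(P.shape[0]) - K @ H) @ P  # posterior covariance
        _, logdet_prior = np.linalg.slogdet(P)
        _, logdet_post = np.linalg.slogdet(P_post)
        return 0.5 * (logdet_prior - logdet_post)

    P = np.diag([4.0, 1.0, 9.0])                   # prior uncertainty on 3 link flows
    R = np.array([[0.5]])                          # sensor noise variance
    candidates = {f"link {i}": np.eye(3)[i:i + 1] for i in range(3)}
    best = max(candidates, key=lambda k: info_gain(P, candidates[k], R))
    print({k: round(info_gain(P, H, R), 3) for k, H in candidates.items()}, "->", best)
    ```

    Unsurprisingly, the single-sensor optimum lands on the link with the largest prior flow uncertainty; with sensor sets, the same score ranks combinations.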

  16. An information theoretic approach to use high-fidelity codes to calibrate low-fidelity codes

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Allison, E-mail: lewis.allison10@gmail.com [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Smith, Ralph [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Williams, Brian [Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Figueroa, Victor [Sandia National Laboratories, Albuquerque, NM 87185 (United States)

    2016-11-01

    For many simulation models, it can be prohibitively expensive or physically infeasible to obtain a complete set of experimental data to calibrate model parameters. In such cases, one can alternatively employ validated higher-fidelity codes to generate simulated data, which can be used to calibrate the lower-fidelity code. In this paper, we employ an information-theoretic framework to determine the reduction in parameter uncertainty that is obtained by evaluating the high-fidelity code at a specific set of design conditions. These conditions are chosen sequentially, based on the amount of information that they contribute to the low-fidelity model parameters. The goal is to employ Bayesian experimental design techniques to minimize the number of high-fidelity code evaluations required to accurately calibrate the low-fidelity model. We illustrate the performance of this framework using heat and diffusion examples, a 1-D kinetic neutron diffusion equation, and a particle transport model, and include initial results from the integration of the high-fidelity thermal-hydraulics code Hydra-TH with a low-fidelity exponential model for the friction correlation factor.
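
    The sequential design loop can be sketched for the simplest tractable case, a linear-Gaussian low-fidelity model: at each step the design condition with the largest expected information gain is selected and "evaluated" (a stand-in function plays the high-fidelity code here; the basis, noise level and candidate grid are all assumptions):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    sigma2 = 0.05 ** 2                        # assumed high-fidelity observation noise

    def features(x):
        """Basis of the (assumed) linear low-fidelity model y = phi(x) . theta."""
        return np.array([1.0, x, x ** 2])

    def expected_gain(x, Sigma):
        """Expected information gain (nats) of one observation at condition x."""
        phi = features(x)
        return 0.5 * np.log(1.0 + phi @ Sigma @ phi / sigma2)

    Sigma = np.eye(3)                         # prior covariance of the parameters
    theta = np.zeros(3)                       # prior mean
    candidates = np.linspace(0.0, 1.0, 21)    # admissible design conditions

    for step in range(4):
        x_star = max(candidates, key=lambda x: expected_gain(x, Sigma))
        y = np.sin(2 * x_star) + rng.normal(0.0, np.sqrt(sigma2))  # stand-in for the high-fidelity code
        phi = features(x_star)
        S = phi @ Sigma @ phi + sigma2        # innovation variance (scalar)
        K = Sigma @ phi / S                   # gain vector
        theta = theta + K * (y - phi @ theta)     # conjugate Gaussian update of the mean
        Sigma = Sigma - np.outer(K, phi @ Sigma)  # ... and of the covariance
        print(f"step {step}: evaluate the high-fidelity code at x = {x_star:.2f}")
    ```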

  17. Principle-theoretic approach of kondo and construction-theoretic formalism of gauge theories

    International Nuclear Information System (INIS)

    Jain, L.C.

    1986-01-01

    Einstein classified the various theories of physics as principle-theories and constructive-theories. In this lecture, Kondo's approach to microscopic and macroscopic phenomena is analysed for its principle-theoretic pursuit, followed by construction. The fundamentals of his theory may be recalled as the Tristimulus principle, the Observation principle, Kawaguchi spaces, empirical information, the epistemological point of view, unitarity, intrinsicality, and dimensional analysis subject to logical and geometrical achievement. On the other hand, various physicists have evolved constructive gauge theories from a phenomenological, often collective, point of view. Their synthetic method involves fibre bundles and connections, path integrals, as well as other hypothetical structures. They lead towards clarity, completeness and adaptability.

  18. Multivariate statistical analysis a high-dimensional approach

    CERN Document Server

    Serdobolskii, V

    2000-01-01

    In the last few decades the accumulation of large amounts of information in numerous applications has stimulated an increased interest in multivariate analysis. Computer technologies allow one to use multi-dimensional and multi-parametric models successfully. At the same time, an interest arose in statistical analysis with a deficiency of sample data. Nevertheless, it is difficult to describe the recent state of affairs in applied multivariate methods as satisfactory. Unimprovable (dominating) statistical procedures are still unknown except for a few specific cases. The simplest problem of estimating the mean vector with minimum quadratic risk is unsolved, even for normal distributions. Commonly used standard linear multivariate procedures based on the inversion of sample covariance matrices can lead to unstable results or provide no solution in dependence of data. Programs included in standard statistical packages cannot process 'multi-collinear data' and there are no theoretical recommen- ...

  19. Theoretical statistics of zero-age cataclysmic variables

    International Nuclear Information System (INIS)

    Politano, M.J.

    1988-01-01

    The distribution of the white dwarf masses, the distribution of the mass ratios and the distribution of the orbital periods in cataclysmic variables which are forming at the present time are calculated. These systems are referred to as zero-age cataclysmic variables. The results show that 60% of the systems being formed contain helium white dwarfs and 40% contain carbon-oxygen white dwarfs. The mean white dwarf mass in those systems containing helium white dwarfs is 0.34. The mean white dwarf mass in those systems containing carbon-oxygen white dwarfs is 0.75. The orbital period distribution identifies four main classes of zero-age cataclysmic variables: (1) short-period systems containing helium white dwarfs, (2) systems containing carbon-oxygen white dwarfs whose secondaries are convectively stable against rapid mass transfer to the white dwarf, (3) systems containing carbon-oxygen white dwarfs whose secondaries are radiatively stable against rapid mass transfer to the white dwarf and (4) long-period systems with evolved secondaries. The white dwarf mass distribution in zero-age cataclysmic variables has direct application to the calculation of the frequency of outburst in classical novae as a function of the mass of the white dwarf. The method developed in this thesis to calculate the distributions of the orbital parameters in zero-age cataclysmic variables can be used to calculate theoretical statistics of any class of binary systems. This method provides a theoretical framework from which to investigate the statistical properties and the evolution of the orbital parameters of binary systems.

  20. Applying Bayesian Statistics to Educational Evaluation. Theoretical Paper No. 62.

    Science.gov (United States)

    Brumet, Michael E.

    Bayesian statistical inference is unfamiliar to many educational evaluators. While the classical model is useful in educational research, it is not as useful in evaluation because of the need to identify solutions to practical problems based on a wide spectrum of information. The reason Bayesian analysis is effective for decision making is that it…

  1. Information-theoretic approach to uncertainty importance

    International Nuclear Information System (INIS)

    Park, C.K.; Bari, R.A.

    1985-01-01

    A method is presented for importance analysis in probabilistic risk assessments (PRA) for which the results of interest are characterized by full uncertainty distributions and not just point estimates. The method is based on information theory, in which entropy is a measure of the uncertainty of a probability density function. We define the relative uncertainty importance between two events as the ratio of the two exponents of the entropies. For the log-normal and log-uniform distributions, the importance measure comprises the median (central tendency) and the logarithm of the error factor (uncertainty). Thus, if accident sequences are ranked this way, and the error factors are not all equal, then a different rank order would result than if the sequences were ranked by the central tendency measure alone. As an illustration, the relative importance of internal events and in-plant fires was computed on the basis of existing PRA results.
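
    A short numerical rendering of the definition for log-normal uncertainties (the parameter values are invented; the conversion of a 95% error factor to σ follows the usual PRA convention EF = e^{1.645σ}):

    ```python
    import numpy as np
    from scipy import stats

    def entropy_lognormal(median, error_factor):
        """Differential entropy h of a log-normal given its median and 95% error
        factor (EF = 95th percentile / median, a common PRA convention)."""
        mu = np.log(median)
        sigma = np.log(error_factor) / 1.645
        return float(stats.lognorm(s=sigma, scale=np.exp(mu)).entropy())

    # Relative uncertainty importance of event A over B: ratio of exp(entropies)
    hA = entropy_lognormal(median=1e-4, error_factor=10.0)
    hB = entropy_lognormal(median=3e-4, error_factor=3.0)
    print(f"relative uncertainty importance A:B = {np.exp(hA - hB):.2f}")
    ```

    For the log-normal, h = mu + 0.5*ln(2*pi*e*sigma^2), so exp(h) factors into the median e^mu and a term monotone in ln(EF), which is exactly the median-plus-error-factor decomposition the abstract describes.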

  2. Solar-driven thermo- and electrochemical degradation of nitrobenzene in wastewater: Adaptation and adoption of solar STEP concept

    International Nuclear Information System (INIS)

    Gu, Di; Shao, Nan; Zhu, Yanji; Wu, Hongjun; Wang, Baohui

    2017-01-01

    Highlights: • STEP for NB treatment was established without input of energy and chemicals. • Treatment of NB was theoretically and experimentally studied by STEP. • The results demonstrated that STEP is more efficient than classical AOPs. • The mechanism of STEP was illustratively presented for NB wastewater. - Abstract: The STEP concept has successfully been demonstrated for driving chemical reactions by the utilization of solar heat and electricity, minimizing the use of fossil energy while maximizing the rate of thermo- and electrochemical reactions both thermodynamically and kinetically. This pioneering investigation experimentally shows that the STEP concept can be adapted and adopted efficiently for the degradation of nitrobenzene. Employing theoretical calculation and temperature-dependent cyclic voltammetry, the degradation potential of nitrobenzene was found to decrease appreciably, with a greatly increased current, as the temperature was raised. Compared with conventional electrochemical methods, a high efficiency and fast degradation rate were displayed, owing to the combined action of thermo- and electrochemical effects and the switch from indirect to direct electrochemical oxidation of nitrobenzene. A clear conclusion on the mechanism of nitrobenzene degradation by STEP can be schematically proposed and discussed through the combination of thermo- and electrochemistry, based on the analysis of the HPLC, UV–vis and degradation data. This theory and experiment provide a pilot scheme for the treatment of nitrobenzene wastewater with high efficiency, clean operation and a low carbon footprint, with no input of energy or chemicals other than solar energy.

  3. Solar-driven thermo- and electrochemical degradation of nitrobenzene in wastewater: Adaptation and adoption of solar STEP concept

    Energy Technology Data Exchange (ETDEWEB)

    Gu, Di; Shao, Nan; Zhu, Yanji; Wu, Hongjun; Wang, Baohui, E-mail: wangbh@nepu.edu.cn

    2017-01-05

    Highlights: • STEP for NB treatment was established without input of energy and chemicals. • Treatment of NB was theoretically and experimentally studied by STEP. • The results demonstrated that STEP is more efficient than classical AOPs. • The mechanism of STEP was illustratively presented for NB wastewater. - Abstract: The STEP concept has successfully been demonstrated for driving chemical reactions by the utilization of solar heat and electricity, minimizing the use of fossil energy while maximizing the rate of thermo- and electrochemical reactions both thermodynamically and kinetically. This pioneering investigation experimentally shows that the STEP concept can be adapted and adopted efficiently for the degradation of nitrobenzene. Employing theoretical calculation and temperature-dependent cyclic voltammetry, the degradation potential of nitrobenzene was found to decrease appreciably, with a greatly increased current, as the temperature was raised. Compared with conventional electrochemical methods, a high efficiency and fast degradation rate were displayed, owing to the combined action of thermo- and electrochemical effects and the switch from indirect to direct electrochemical oxidation of nitrobenzene. A clear conclusion on the mechanism of nitrobenzene degradation by STEP can be schematically proposed and discussed through the combination of thermo- and electrochemistry, based on the analysis of the HPLC, UV–vis and degradation data. This theory and experiment provide a pilot scheme for the treatment of nitrobenzene wastewater with high efficiency, clean operation and a low carbon footprint, with no input of energy or chemicals other than solar energy.

  4. Statistical performance and information content of time lag analysis and redundancy analysis in time series modeling.

    Science.gov (United States)

    Angeler, David G; Viedma, Olga; Moreno, José M

    2009-11-01

    Time lag analysis (TLA) is a distance-based approach used to study temporal dynamics of ecological communities by measuring community dissimilarity over increasing time lags. Despite its increased use in recent years, its performance in comparison with other more direct methods (i.e., canonical ordination) has not been evaluated. This study fills this gap using extensive simulations and real data sets from experimental temporary ponds (true zooplankton communities) and landscape studies (landscape categories as pseudo-communities) that differ in community structure and anthropogenic stress history. Modeling time with a principal coordinate of neighborhood matrices (PCNM) approach, the canonical ordination technique (redundancy analysis; RDA) consistently outperformed the other statistical tests (i.e., TLAs, Mantel test, and RDA based on linear time trends) using all real data. In addition, the RDA-PCNM revealed different patterns of temporal change, and the strength of each individual time pattern, in terms of adjusted variance explained, could be evaluated. It also identified species contributions to these patterns of temporal change. This additional information is not provided by distance-based methods. The simulation study revealed better Type I error properties of the canonical ordination techniques compared with the distance-based approaches when no deterministic component of change was imposed on the communities. The simulation also revealed that strong emphasis on uniform deterministic change and low variability at other temporal scales is needed to result in decreased statistical power of the RDA-PCNM approach relative to the other methods. Based on the statistical performance of and information content provided by RDA-PCNM models, this technique serves ecologists as a powerful tool for modeling temporal change of ecological (pseudo-) communities.
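
    For readers new to TLA, its core computation is compact: regress all pairwise community dissimilarities on the square-root-transformed time lag (a minimal sketch with synthetic data; the RDA-PCNM alternative requires a full ordination package and is not shown):

    ```python
    import numpy as np
    from itertools import combinations
    from scipy.spatial.distance import braycurtis
    from scipy.stats import linregress

    def time_lag_analysis(Y):
        """Y: (n_times, n_species) abundances at regular intervals. Regresses
        Bray-Curtis dissimilarity on sqrt(time lag); a significant positive
        slope suggests directional community change."""
        lags, dissims = [], []
        for i, j in combinations(range(len(Y)), 2):
            lags.append(np.sqrt(j - i))
            dissims.append(braycurtis(Y[i], Y[j]))
        return linregress(lags, dissims)

    rng = np.random.default_rng(1)
    Y = np.cumsum(rng.poisson(2, size=(20, 6)), axis=0)  # toy drifting community
    res = time_lag_analysis(Y)
    print(f"slope = {res.slope:.3f}, p = {res.pvalue:.4f}")
    ```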

  5. A statistical approach for water movement in the unsaturated zone

    International Nuclear Information System (INIS)

    Tielin Zang.

    1991-01-01

    This thesis presents a statistical approach for estimating and analyzing the downward transport pattern and distribution of soil water by the use of pattern analysis of space-time correlation structures. This approach, called the Space-Time Correlation Field, is mainly based on the analysis of correlation functions simultaneously in the space and time domains. The overall purpose of this work is to derive an alternative statistical procedure for soil moisture analysis that does not involve detailed information on hydraulic parameters, and to visualize the dynamics of soil water variability in the space and time domains. A numerical model using the method of characteristics is employed to provide hypothetical time series for use in the statistical method, which is, after verification and calibration, applied to field-measured time series. The results of the application show that the space-time correlation fields reveal the effects of soil layers with different hydraulic properties and the boundaries between them. It is concluded that the approach offers special advantages when visualizing time- and space-dependent properties simultaneously. It can be used to investigate the hydrological response of soil water dynamics and characteristics in different dimensions (space and time) and scales. This approach can be used to identify the dominant component in unsaturated flow systems. It is possible to estimate the pattern and the downward propagation rate of moisture movement in the soil profile. Small-scale soil heterogeneities can be identified by the correlation field. Since the correlation field technique gives a statistical measure of the dependent property that varies within the space-time field, it is possible to interpolate the fields to points where observations are not available, estimating spatial or temporal averages from discrete observations. (au)
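
    The propagation-rate idea can be illustrated with a lagged cross-correlation between moisture series at two depths: the lag at which the correlation peaks, combined with the depth separation, gives a crude downward propagation rate (synthetic series; the thesis' space-time correlation field generalizes this to all depth pairs and lags):

    ```python
    import numpy as np

    def peak_lag(upper, lower, max_lag):
        """Lag (in sampling steps) at which corr(upper(t), lower(t+lag)) peaks."""
        u = (upper - upper.mean()) / upper.std()
        l = (lower - lower.mean()) / lower.std()
        corrs = [np.mean(u[:-lag or None] * l[lag:]) for lag in range(max_lag)]
        return int(np.argmax(corrs)), max(corrs)

    rng = np.random.default_rng(2)
    sig = rng.normal(size=400).cumsum()                      # wetting signal at 10 cm depth
    deeper = np.roll(sig, 12) + 0.3 * rng.normal(size=400)   # same signal, delayed, at 50 cm
    lag, r = peak_lag(sig, deeper, max_lag=40)
    print(f"peak correlation r = {r:.2f} at lag {lag} steps -> ~{40/lag:.1f} cm per step")
    ```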

  6. METHODOLOGICAL APPROACH TO ANALYSIS AND EVALUATION OF INFORMATION PROTECTION IN INFORMATION SYSTEMS BASED ON VULNERABILITY DANGER

    Directory of Open Access Journals (Sweden)

    Y. M. Krotiuk

    2008-01-01

    Full Text Available The paper considers a methodological approach to the analysis and estimation of information security in information systems, based on the analysis of vulnerabilities and the extent of their hazard. By vulnerability hazard is meant the complexity of exploiting the vulnerability as part of an information system. The necessary and sufficient conditions for exploiting a vulnerability have been determined in the paper. The paper proposes a generalized model for attack realization which is used as a basis for constructing an attack realization model for the exploitation of a particular vulnerability. A criterion for estimating information protection in information systems, based on the estimation of vulnerability hazard, is formulated in the paper. The proposed approach allows one to obtain a quantitative estimation of information system security on the basis of the proposed schemes for the realization of typical attacks for the distinguished classes of vulnerabilities. The methodological approach is used for choosing variants for the realization of protection mechanisms in information systems, as well as for the estimation of information safety in operating information systems.

  7. Information Retrieval and Graph Analysis Approaches for Book Recommendation

    OpenAIRE

    Chahinez Benkoussas; Patrice Bellot

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models: probabilistic models such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval ...
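
    Since the approach leans on PageRank, the standard power-iteration form on a small book-link graph is sketched below (the adjacency matrix is invented; the paper's graph construction and its combination with the InL2/language-model scores are not reproduced):

    ```python
    import numpy as np

    def pagerank(A, d=0.85, tol=1e-10):
        """Power iteration for PageRank on adjacency matrix A (A[i, j] = 1: i links j)."""
        n = A.shape[0]
        out = A.sum(axis=1, keepdims=True)
        out[out == 0] = 1                 # crude guard against divide-by-zero for dangling nodes
        M = (A / out).T                   # column-stochastic transition matrix
        r = np.full(n, 1.0 / n)
        while True:
            r_new = (1 - d) / n + d * M @ r
            if np.abs(r_new - r).sum() < tol:
                return r_new
            r = r_new

    # 4 books with citation/similarity links between them
    A = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    print(pagerank(A).round(3))           # scores usable to re-rank retrieval results
    ```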

  8. Enhancement of the thermo-optical response of silver nanoparticles due to surface plasmon resonance

    Science.gov (United States)

    Hashemi Zadeh, Sakineh; Rashidi-Huyeh, Majid; Palpant, Bruno

    2017-10-01

    Owing to their remarkable optical properties, noble metal nanoparticles have been proposed for many applications. Controlling the temperature dependence of these properties is therefore of great relevance. In this paper, we investigate the thermo-optical properties of silver nanoparticles. Different silver nanocolloids were prepared with different surface plasmon resonance modes. The thermo-extinction spectra of the colloidal solutions were then evaluated by measuring the extinction spectra at different temperatures. This reveals a typical peak-valley profile around each surface plasmon resonance mode. Mie theory was used to study theoretically the impact of nanoparticle size on the thermo-optical properties. The results allow us to interpret the experimental findings properly.

  9. A new theoretical approach to adsorption desorption behavior of Ga on GaAs surfaces

    Science.gov (United States)

    Kangawa, Y.; Ito, T.; Taguchi, A.; Shiraishi, K.; Ohachi, T.

    2001-11-01

    We propose a new theoretical approach for studying the adsorption-desorption behavior of atoms on semiconductor surfaces. The new theoretical approach, based on ab initio calculations, incorporates the free energy of the gas phase; we can therefore calculate how adsorption and desorption depend on growth temperature and beam equivalent pressure (BEP). The versatility of the new theoretical approach was confirmed by calculating the Ga adsorption-desorption transition temperatures and transition BEPs on the GaAs(0 0 1)-(4×2)β2 Ga-rich surface. This new approach makes it feasible to predict how adsorption and desorption depend on the growth conditions.

  10. Tetraphenylpyrimidine-Based AIEgens: Facile Preparation, Theoretical Investigation and Practical Application

    Directory of Open Access Journals (Sweden)

    Junkai Liu

    2017-10-01

    Full Text Available Aggregation-induced emission (AIE) has become a hot research area and tremendous numbers of AIE-active luminogens (AIEgens) have been generated. To further promote the development of AIE, new AIEgens are highly desirable. Herein, new AIEgens based on tetraphenylpyrimidine (TPPM) are rationally designed according to the AIE mechanism of restriction of intramolecular motion, and facilely prepared under mild reaction conditions. The photophysical properties of the generated TPPM, TPPM-4M and TPPM-4P are systematically investigated and the results show that they feature aggregation-enhanced emission (AEE) characteristics. Theoretical study shows that the high-frequency bending vibrations in the central pyrimidine ring of TPPM derivatives dominate the nonradiative decay channels. Thanks to the AEE feature, their aggregates can be used to detect explosives with super-amplification quenching effects, and their sensing ability is higher than that of the typical AIE-active tetraphenylethene. It is anticipated that TPPM derivatives could serve as a new type of widely used AIEgen, given their facile preparation and good thermo-, photo- and chemostabilities.

  11. An integrated approach to develop, validate and operate thermo-physiological human simulator for the development of protective clothing.

    Science.gov (United States)

    Psikuta, Agnes; Koelblen, Barbara; Mert, Emel; Fontana, Piero; Annaheim, Simon

    2017-12-07

    Following the growing interest in the further development of manikins to simulate human thermal behaviour more adequately, thermo-physiological human simulators have been developed by coupling a thermal sweating manikin with a thermo-physiology model. Despite their availability and obvious advantages, the number of studies involving these devices is only marginal, which plausibly results from the high complexity of the development and evaluation process and the need for multi-disciplinary expertise. The aim of this paper is to present an integrated approach to develop, validate and operate such devices, including the technical challenges and limitations of thermo-physiological human simulators, their application and measurement protocol, a strategy for setting test scenarios, and a comparison to standard methods and human studies, including details which have not been published so far. A physical manikin controlled by a human thermoregulation model overcame the limitations of mathematical clothing models and provided a complementary method to investigate thermal interactions between the human body, protective clothing, and its environment. The opportunities offered by these devices include not only the realistic assessment of protective clothing assemblies and equipment but also potential applications in many research fields ranging from biometeorology, the automotive industry, environmental engineering, and urban climate to clinical and safety applications.

  12. Thermo-mechanical constitutive modeling of unsaturated clays based on the critical state concepts

    Directory of Open Access Journals (Sweden)

    Saeed Tourchi

    2015-04-01

    Full Text Available A thermo-mechanical constitutive model for unsaturated clays is constructed based on the existing model for saturated clays originally proposed by the authors. The saturated clay model was formulated in the framework of critical state soil mechanics and the modified Cam-clay model. The existing model has been generalized to simulate the experimentally observed behavior of unsaturated clays by introducing Bishop's stress and suction as independent stress parameters and modifying the hardening rule and yield criterion to take into account the role of suction. According to previous studies, an increase in temperature causes a reduction in specific volume. A reduction in suction (wetting) for a given confining stress may induce an irreversible volumetric compression (collapse), whereas an increase in suction (drying) raises the specific volume, i.e., moves the normal consolidation line (NCL) to higher values of void ratio. However, some experimental data support the assumption that this change depends on the stress level of the soil element. A generalized approach considering the effect of stress level on the magnitude of the thermal dependency of clays in the compression plane is proposed in this study. The number of modeling parameters is kept to a minimum, and they all have clear physical interpretations, to facilitate the usefulness of the model for practical applications. A step-by-step procedure used for parameter calibration is also described. The model is finally evaluated using a comprehensive set of experimental data on the thermo-mechanical behavior of unsaturated soils.
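
    For reference, the Bishop's stress mentioned above is commonly written in the following textbook form (the particular choice of the effective stress parameter $\chi$, often taken as the degree of saturation, is an assumption here, since the abstract does not specify it):

    $$\sigma'_{ij} = (\sigma_{ij} - u_a\,\delta_{ij}) + \chi\,(u_a - u_w)\,\delta_{ij}, \qquad s = u_a - u_w,$$

    where $u_a$ and $u_w$ are the pore-air and pore-water pressures, $s$ is the matric suction used as the second independent stress variable, and $\delta_{ij}$ is the Kronecker delta.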

  13. Multi-scale modeling of the thermo-mechanical behavior of particle-based composites

    International Nuclear Information System (INIS)

    Di Paola, F.

    2010-01-01

    The aim of this work was to perform numerical simulations of the thermal and mechanical behavior of a particle-based nuclear fuel. This is a refractory composite material made of UO2 spherical particles which are coated with two layers of pyrocarbon and embedded in a graphite matrix at a high volume fraction (45%). The objective was to develop a multi-scale model of this composite material which can estimate its mean behavior as well as the heterogeneity of the local mechanical variables. The first part of this work was dedicated to modeling the microstructure in 3D. To do this, we developed tools to generate random distributions of spheres and meshes, and to characterize the morphology of the microstructure, for the finite element code Cast3M. A hundred numerical samples of the composite were created. The second part was devoted to the characterization of the thermo-elastic behavior by finite element modeling of the samples. We studied the influence of different modeling parameters, one of them being the boundary conditions. We proposed a method to remove the boundary-condition effects from the computed solution by analyzing it on an internal sub-volume of the sample obtained by erosion. We then determined the effective properties (elastic moduli, thermal conductivity and thermal expansion) and the stress distribution within the matrix. Finally, in the third part we proposed a multi-scale model to determine the mean values and the variance and covariance of the local mechanical variables for any macroscopic load. This statistical approach has been used to estimate the intra-phase distribution of these variables in the composite material. (author) [fr]

  14. Multi-scale modeling of the thermo-mechanical behavior of particle-based composites

    International Nuclear Information System (INIS)

    Di Paola, F.

    2010-11-01

    The aim of this work was to perform numerical simulations of the thermal and mechanical behavior of a particle-based nuclear fuel. This is a refractory composite material made of UO2 spherical particles which are coated with two layers of pyrocarbon and embedded in a graphite matrix at a high volume fraction (45 %). The objective was to develop a multi-scale model of this composite material which can estimate its mean behavior as well as the heterogeneity of the local mechanical variables. The first part of this work was dedicated to modeling the microstructure in 3D. To do this, we developed tools to generate random distributions of spheres and meshes, and to characterize the morphology of the microstructure, for the finite element code Cast3M. A hundred numerical samples of the composite were created. The second part was devoted to the characterization of the thermo-elastic behavior by finite element modeling of the samples. We studied the influence of different modeling parameters, one of them being the boundary conditions. We proposed a method to remove the boundary-condition effects from the computed solution by analyzing it on an internal sub-volume of the sample obtained by erosion. We then determined the effective properties (elastic moduli, thermal conductivity and thermal expansion) and the stress distribution within the matrix. Finally, in the third part we proposed a multi-scale model to determine the mean values and the variance and covariance of the local mechanical variables for any macroscopic load. This statistical approach has been used to estimate the intra-phase distribution of these variables in the composite material. (author)

  15. Theoretical approaches to elections defining

    OpenAIRE

    Natalya V. Lebedeva

    2011-01-01

    Theoretical approaches to defining elections develop the nature, essence and content of elections, and help to determine their place and role as one of the major institutions of national law in a democratic system.

  16. Data for effects of lanthanum complex on the thermo-oxidative aging of natural rubber

    Directory of Open Access Journals (Sweden)

    Wei Zheng

    2015-12-01

    Full Text Available Novel mixed antioxidants composed of the antioxidant IPPD and a lanthanum (La) complex were added as a filler to form natural rubber (NR) composites. By mechanical testing, Fourier transform infrared spectroscopy with attenuated total reflectance (FTIR-ATR) and thermogravimetric analysis (TGA), a set of data, including the mechanical properties, the variation of internal groups and the thermal and thermo-oxidative decompositions of NR, was presented in this data article. The data accompanying its research article [1] studied the thermo-oxidative aging properties of NR in detail. Density functional theory (DFT) calculations were also used to help study the thermo-oxidative aging mechanism of NR. The data revealed that this new rare-earth antioxidant could indeed enhance the thermo-oxidative aging resistance of NR, which is associated with a functional mechanism different from that of the pure antioxidant IPPD.

  17. Data for effects of lanthanum complex on the thermo-oxidative aging of natural rubber.

    Science.gov (United States)

    Zheng, Wei; Liu, Li; Zhao, Xiuying; He, Jingwei; Wang, Ao; Chan, Tung W; Wu, Sizhu

    2015-12-01

    Novel mixed antioxidants composed of the antioxidant IPPD and a lanthanum (La) complex were added as a filler to form natural rubber (NR) composites. By mechanical testing, Fourier transform infrared spectroscopy with attenuated total reflectance (FTIR-ATR) and thermogravimetric analysis (TGA), a set of data, including the mechanical properties, the variation of internal groups and the thermal and thermo-oxidative decompositions of NR, was presented in this data article. The data accompanying its research article [1] studied the thermo-oxidative aging properties of NR in detail. Density functional theory (DFT) calculations were also used to help study the thermo-oxidative aging mechanism of NR. The data revealed that this new rare-earth antioxidant could indeed enhance the thermo-oxidative aging resistance of NR, which is associated with a functional mechanism different from that of the pure antioxidant IPPD.

  18. Data for effects of lanthanum complex on the thermo-oxidative aging of natural rubber

    Science.gov (United States)

    Zheng, Wei; Liu, Li; Zhao, Xiuying; He, Jingwei; Wang, Ao; Chan, Tung W.; Wu, Sizhu

    2015-01-01

    Novel mixed antioxidants composed of the antioxidant IPPD and a lanthanum (La) complex were added as a filler to form natural rubber (NR) composites. By mechanical testing, Fourier transform infrared spectroscopy with attenuated total reflectance (FTIR-ATR) and thermogravimetric analysis (TGA), a set of data, including the mechanical properties, the variation of internal groups and the thermal and thermo-oxidative decompositions of NR, was presented in this data article. The data accompanying its research article [1] studied the thermo-oxidative aging properties of NR in detail. Density functional theory (DFT) calculations were also used to help study the thermo-oxidative aging mechanism of NR. The data revealed that this new rare-earth antioxidant could indeed enhance the thermo-oxidative aging resistance of NR, which is associated with a functional mechanism different from that of the pure antioxidant IPPD. PMID:26693513

  19. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing various disciplines frequently produces something profound and far-reaching. Cybernetics is an often-quoted example. The mix of information theory, statistics and computing technology has proved very useful, leading to the recent development of information-theory based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is a fundamental task in many fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur

  20. Reducing the memory size in the study of statistical properties of the pseudo-random number generators, focused on solving problems of cryptographic information protection

    International Nuclear Information System (INIS)

    Chugunkov, I.V.

    2014-01-01

    The report describes an approach based on calculating the number of missing sets, which makes it possible to reduce the memory usage needed to implement statistical tests. Information about the procedure for estimating the test statistics derived using this approach is also provided [ru

  1. Comparison of subset-based local and FE-based global digital image correlation: Theoretical error analysis and validation

    KAUST Repository

    Pan, B.

    2016-03-22

    Subset-based local and finite-element-based (FE-based) global digital image correlation (DIC) approaches are the two primary image matching algorithms widely used for full-field displacement mapping. Very recently, the performances of these different DIC approaches have been experimentally investigated using numerical and real-world experimental tests. The results have shown that in typical cases, where the subset (element) size is no less than a few pixels and the local deformation within a subset (element) can be well approximated by the adopted shape functions, the subset-based local DIC outperforms FE-based global DIC approaches because the former provides slightly smaller root-mean-square errors and offers much higher computation efficiency. Here we investigate the theoretical origin and lay a solid theoretical basis for the previous comparison. We assume that systematic errors due to imperfect intensity interpolation and undermatched shape functions are negligibly small, and perform a theoretical analysis of the random errors or standard deviation (SD) errors in the displacements measured by two local DIC approaches (i.e., a subset-based local DIC and an element-based local DIC) and two FE-based global DIC approaches (i.e., Q4-DIC and Q8-DIC). The equations that govern the random errors in the displacements measured by these local and global DIC approaches are theoretically derived. The correctness of the theoretically predicted SD errors is validated through numerical translation tests under various noise levels. We demonstrate that the SD errors induced by the Q4-element-based local DIC, the global Q4-DIC and the global Q8-DIC are 4, 1.8-2.2 and 1.2-1.6 times greater, respectively, than that associated with the subset-based local DIC, which is consistent with our conclusions from previous work. © 2016 Elsevier Ltd. All rights reserved.

  2. Thermo effect of chemical reaction in irreversible electrochemical systems

    International Nuclear Information System (INIS)

    Tran Vinh Quy; Nguyen Tang

    1989-01-01

    From the first law of thermodynamics, expressions for the statistical calculation of 'fundamental' and 'thermo-chemical' thermal effects are obtained. In addition, a method for calculating the thermal effect of chemical reactions in non-equilibrium electrochemical systems is discussed in detail. (author). 7 refs

  3. Targeting Fear of Spiders with Control-, Acceptance-, and Information-Based Approaches

    Science.gov (United States)

    Wagener, Alexandra L.; Zettle, Robert D.

    2011-01-01

    The relative impact of control-, acceptance-, and information-based approaches in targeting a midlevel fear of spiders among college students was evaluated. Participants listened to a brief protocol presenting one of the three approaches before completing the Perceived-Threat Behavioral Approach Test (PT-BAT; Cochrane, Barnes-Holmes, &…

  4. Segmentation of human skull in MRI using statistical shape information from CT data.

    Science.gov (United States)

    Wang, Defeng; Shi, Lin; Chu, Winnie C W; Cheng, Jack C Y; Heng, Pheng Ann

    2009-09-01

    The aim of this study was to automatically segment the skull from MRI data using a model-based three-dimensional segmentation scheme. The study exploited the statistical anatomy extracted from the CT data of a group of subjects by constructing an active shape model of the skull surfaces. To construct a reliable shape model, a novel approach was proposed to optimize the automatic landmarking on the coupled surfaces (i.e., the skull vault) by minimizing a description length that incorporates local thickness information. This model was then used to locate the skull shape in MRI of a different group of patients. Compared with performing landmarking separately on the coupled surfaces, the proposed landmarking method constructed models with better generalization ability and specificity. Segmentation accuracy was measured by the Dice coefficient and the set difference, and compared with a method based on mathematical morphology operations. The proposed approach, using an active shape model based on the statistical skull anatomy present in head CT data, contributes to more reliable segmentation of the skull from MRI data.
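
    As an illustration of the overlap metric mentioned above, the following minimal Python sketch computes the Dice coefficient between two binary segmentation masks; the toy 4x4 masks are hypothetical stand-ins for an automatic skull mask and its reference.

      import numpy as np

      def dice_coefficient(seg_a, seg_b):
          # Dice overlap between two binary segmentation masks.
          a = np.asarray(seg_a, dtype=bool)
          b = np.asarray(seg_b, dtype=bool)
          total = a.sum() + b.sum()
          if total == 0:
              return 1.0  # both masks empty: define overlap as perfect
          return 2.0 * np.logical_and(a, b).sum() / total

      # Toy masks standing in for an automatic and a reference segmentation
      auto_mask = np.zeros((4, 4), dtype=bool); auto_mask[1:3, 1:3] = True
      ref_mask = np.zeros((4, 4), dtype=bool);  ref_mask[1:3, 1:4] = True
      print("Dice = %.3f" % dice_coefficient(auto_mask, ref_mask))  # Dice = 0.800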

  5. Statistical lamb wave localization based on extreme value theory

    Science.gov (United States)

    Harley, Joel B.

    2018-04-01

    Guided wave localization methods based on delay-and-sum imaging, matched field processing, and other techniques have been designed and researched to create images that locate and describe structural damage. The maximum value of these images typically represents an estimated damage location. Yet, it is often unclear whether this maximum value, or any other value in the image, is a statistically significant indicator of damage. Furthermore, there are currently few, if any, approaches to assess the statistical significance of guided wave localization images. As a result, we present statistical delay-and-sum and statistical matched field processing localization methods to create statistically significant images of damage. Our framework uses constant false alarm rate statistics and extreme value theory to detect damage with little prior information. We demonstrate our methods with in situ guided wave data from an aluminum plate to detect two 0.75 cm diameter holes. Our results show the expected improvement in statistical significance as the number of sensors increases. With seventeen sensors, both methods successfully detect damage with statistical significance.
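
    To make the extreme value idea concrete, here is a minimal sketch (not the authors' code) that fits a generalized extreme value distribution to the maxima of hypothetical baseline localization images and derives a fixed false-alarm-rate detection threshold; all data and parameters are invented for illustration.

      import numpy as np
      from scipy.stats import genextreme

      rng = np.random.default_rng(0)

      # Stand-in data: maximum pixel values of localization images computed
      # on damage-free (baseline) measurements.
      baseline_maxima = rng.gumbel(loc=1.0, scale=0.2, size=200)

      # Fit a generalized extreme value distribution to the baseline maxima
      shape, loc, scale = genextreme.fit(baseline_maxima)

      # Threshold for a 1% constant false alarm rate: only image maxima
      # above it are declared statistically significant damage indications
      threshold = genextreme.ppf(0.99, shape, loc=loc, scale=scale)
      new_image_max = 2.1  # hypothetical maximum of a new localization image
      print("damage" if new_image_max > threshold else "no significant damage")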

  6. Partial Least Square Approach to Second Order Factor in Behavioural Study of Accounting Information System

    Directory of Open Access Journals (Sweden)

    Ibrahim Mohd Tarmizi

    2017-01-01

    Full Text Available Theories are developed to explain an observed phenomenon in an effort to understand why and how things happen. Theories thus use latent variables to estimate conceptual parameters. The level of abstraction depends partly on the complexity of the theoretical model explaining the phenomenon. The combination of directly measured variables leads to the formation of a first-order factor. Theoretical underpinnings supporting the existence of higher-order components, together with statistical evidence pointing to their presence, allow researchers to investigate a phenomenon at both aggregated and disaggregated dimensions. As partial least squares (PLS) gains traction in theory development, the behavioural accounting discipline in general should exploit the flexibility of PLS to work with higher-order factors. However, technical guides are scarce. Therefore, this article presents a PLS approach to validating a higher-order factor on statistical grounds using an accounting information system dataset.

  7. On precipitation monitoring with theoretical statistical distributions

    Science.gov (United States)

    Cindrić, Ksenija; Juras, Josip; Pasarić, Zoran

    2018-04-01

    A common practice in meteorological drought monitoring is to transform the observed precipitation amounts to the standardised precipitation index (SPI). Though the gamma distribution is usually employed for this purpose, other distributions may be used, particularly in regions where zero precipitation amounts are recorded frequently. In this study, two distributions are considered alongside the gamma distribution: the compound Poisson exponential distribution (CPE) and the square root normal distribution (SRN). They are fitted to monthly precipitation amounts measured at 24 stations in Croatia over a 55-year period (1961-2015). At five stations, long-term series (1901-2015) are available, and these have been used for a more detailed investigation. The fit of the theoretical distributions to the empirical ones is tested by comparing the corresponding empirical and theoretical ratios of the skewness and the coefficient of variation. Furthermore, following the common approach to precipitation monitoring (CLIMAT reports), the comparison of the empirical and theoretical quintiles in the two periods (1961-1990 and 1991-2015) is examined. The results from the present study reveal that it would be more appropriate to implement theoretical distributions in such climate reports, since they provide a better evaluation for monitoring purposes than the current empirical distribution. Nevertheless, deciding on an optimal theoretical distribution for different climate regimes and for different time periods is not easy to accomplish. With regard to the Croatian stations (covering different climate regimes), the CPE or SRN distribution could also be the right choice in climatological practice, in addition to the gamma distribution.
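
    As a rough illustration of the SPI transform discussed above, the sketch below fits a gamma distribution to the positive monthly amounts, treats zero months as a point mass, and maps the resulting cumulative probabilities to a standard normal; the data and the zero-handling convention are illustrative assumptions, not the paper's exact procedure.

      import numpy as np
      from scipy.stats import gamma, norm

      def spi(monthly_precip):
          # Standardised precipitation index: gamma fit on wet months plus a
          # point mass for zero-precipitation months (one common convention).
          x = np.asarray(monthly_precip, dtype=float)
          wet = x[x > 0]
          p_zero = 1.0 - wet.size / x.size
          a, _, scale = gamma.fit(wet, floc=0)  # shape/scale of the gamma fit
          cdf = np.where(x > 0,
                         p_zero + (1.0 - p_zero) * gamma.cdf(x, a, scale=scale),
                         p_zero / 2.0)          # midpoint rule for zero months
          return norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))  # map to standard normal

      precip = np.array([0.0, 12.5, 48.0, 80.3, 33.1, 0.0, 5.2, 120.7])
      print(np.round(spi(precip), 2))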

  8. An information-theoretic machine learning approach to expression QTL analysis.

    Directory of Open Access Journals (Sweden)

    Tao Huang

    Full Text Available Expression Quantitative Trait Locus (eQTL) analysis is a powerful tool to study the biological mechanisms linking the genotype with gene expression. Such analyses can identify genomic locations where genotypic variants influence the expression of genes, both in close proximity to the variant (cis-eQTL) and on other chromosomes (trans-eQTL). Many traditional eQTL methods are based on a linear regression model. In this study, we propose a novel method to identify eQTL associations with information theory and machine learning approaches. Mutual Information (MI) is used to describe the association between a genetic marker and gene expression. MI can detect both linear and non-linear associations, and it can also capture the heterogeneity of the population. Advanced feature selection methods, Maximum Relevance Minimum Redundancy (mRMR) and Incremental Feature Selection (IFS), were applied to optimize the selection of the genes affected by the genetic marker. When we applied our method to a study of apoE-deficient mice, we found that the cis-acting eQTLs are stronger than trans-acting eQTLs, but that there are more trans-acting eQTLs than cis-acting eQTLs. We compared our results (mRMR.eQTL) with R/qtl and MatrixEQTL (modelLINEAR and modelANOVA). In female mice, 67.9% of mRMR.eQTL results could be confirmed by at least two other methods, while only 14.4% of R/qtl results could be so confirmed. In male mice, 74.1% of mRMR.eQTL results could be confirmed by at least two other methods, while only 18.2% of R/qtl results could be so confirmed. Our method provides a new way to identify associations between genetic markers and gene expression. Our software is available in the supporting information.
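
    The core quantity here is a plug-in mutual information estimate between a discrete genotype and a (binned) expression level; the following self-contained sketch shows one way to compute it from joint counts, with entirely hypothetical toy data.

      import numpy as np

      def mutual_information_bits(x, y):
          # Plug-in estimate of I(X;Y) in bits from the joint counts of two
          # discrete variables coded as small non-negative integers.
          joint = np.zeros((x.max() + 1, y.max() + 1))
          for xi, yi in zip(x, y):
              joint[xi, yi] += 1
          joint /= joint.sum()
          px = joint.sum(axis=1, keepdims=True)   # marginal of X
          py = joint.sum(axis=0, keepdims=True)   # marginal of Y
          nz = joint > 0
          return float((joint[nz] * np.log2(joint[nz] / (px * py)[nz])).sum())

      # Hypothetical data: genotypes coded 0/1/2, expression binned into terciles
      genotype = np.array([0, 0, 1, 1, 2, 2, 2, 0, 1, 2])
      expr_bin = np.array([0, 0, 1, 1, 2, 2, 2, 0, 1, 2])
      print("MI = %.2f bits" % mutual_information_bits(genotype, expr_bin))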

  9. Three-dimensional Reconstruction and Homogenization of Heterogeneous Materials Using Statistical Correlation Functions and FEM

    Energy Technology Data Exchange (ETDEWEB)

    Baniassadi, Majid; Mortazavi, Behzad; Hamedani, Amani; Garmestani, Hamid; Ahzi, Said; Fathi-Torbaghan, Madjid; Ruch, David; Khaleel, Mohammad A.

    2012-01-31

    In this study, a previously developed reconstruction methodology is extended to the three-dimensional reconstruction of a three-phase microstructure, based on two-point correlation functions and two-point cluster functions. The reconstruction process has been implemented based on a hybrid stochastic methodology for simulating the virtual microstructure. While the different phases of the heterogeneous medium are represented by different cells, the growth of these cells is controlled by optimizing parameters such as rotation, shrinkage, translation, distribution and growth rates of the cells. Based on the reconstructed microstructure, the finite element method (FEM) was used to compute the effective elastic modulus and effective thermal conductivity. A statistical approach, based on two-point correlation functions, was also used to directly estimate the effective properties of the developed microstructures. Good agreement between the results predicted by the FEM analysis and by the statistical methods was found, confirming the efficiency of the statistical methods for predicting the thermo-mechanical properties of three-phase composites.
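
    For readers unfamiliar with the statistical descriptors used here, the sketch below computes the two-point probability function of a periodic binary microstructure via FFT autocorrelation; the random 64x64 phase map is a made-up stand-in, and real work would average over many samples and treat boundaries explicitly.

      import numpy as np

      def two_point_probability(phase):
          # S2(r) of a periodic binary phase map via the autocorrelation
          # theorem: S2 = IFFT(|FFT(phase)|^2) / number of voxels.
          f = np.fft.fftn(np.asarray(phase, dtype=float))
          return np.fft.ifftn(f * np.conj(f)).real / phase.size

      # Made-up 64x64 two-phase microstructure (1 = inclusion, 0 = matrix)
      rng = np.random.default_rng(1)
      micro = (rng.random((64, 64)) < 0.45).astype(int)
      s2 = two_point_probability(micro)
      print("S2(0) = %.3f (equals the volume fraction)" % s2[0, 0])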

  10. Theoretical approaches to elections defining

    Directory of Open Access Journals (Sweden)

    Natalya V. Lebedeva

    2011-01-01

    Full Text Available Theoretical approaches to defining elections develop the nature, essence and content of elections, and help to determine their place and role as one of the major institutions of national law in a democratic system.

  11. Thermo-hydraulic and structural analysis for finger-based concept of ITER blanket first wall

    International Nuclear Information System (INIS)

    Kim, Byoung-Yoon; Ahn, Hee-Jae

    2011-01-01

    The blanket first wall is one of the main plasma-facing components in the ITER tokamak. A finger-type first wall was proposed by the ITER Organization during the current design development. In this concept, each first wall module is composed of a beam and twenty fingers. The main function of the first wall is to efficiently remove the high heat flux loading from the fusion plasma during operation. Therefore, the thermal and structural performance of the proposed finger-based first wall design concept should be investigated. Various case studies were performed for a unit finger model considering different loading conditions. The finite element model was made for half of a module, using symmetric boundary conditions to reduce the computational effort. The thermo-hydraulic analysis was performed to obtain the pressure drop and temperature profiles. The structural analysis was then carried out using the maximum temperature distribution obtained in the thermo-hydraulic analysis. Finally, a transient thermo-hydraulic analysis was performed for the generic first wall module to obtain the temperature evolution history, considering cyclic heat flux loading with nuclear heating. The thermo-mechanical analysis was then performed at the time step when the maximum temperature gradient occurred. A stress analysis was also performed for the component with a finger and a beam, to check the residual stress of the component after thermal shrinkage assembly.

  12. Detection System of HTTP DDoS Attacks in a Cloud Environment Based on Information Theoretic Entropy and Random Forest

    Directory of Open Access Journals (Sweden)

    Mohamed Idhammad

    2018-01-01

    Full Text Available Cloud computing services are often delivered through the HTTP protocol. This facilitates access to services and reduces costs for both providers and end-users. However, it also increases the vulnerability of Cloud services to HTTP DDoS attacks. HTTP request methods are often used to exploit web servers' vulnerabilities and create multiple scenarios of HTTP DDoS attack, such as Low and Slow or Flooding attacks. Existing HTTP DDoS detection systems are challenged by the large volumes of network traffic generated by these attacks, low detection accuracy, and high false positive rates. In this paper we present a detection system for HTTP DDoS attacks in a Cloud environment based on information-theoretic entropy and the Random Forest ensemble learning algorithm. A time-based sliding window algorithm is used to estimate the entropy of the network header features of the incoming network traffic. When the estimated entropy exceeds its normal range, the preprocessing and classification tasks are triggered. To assess the proposed approach, various experiments were performed on the CIDDS-001 public dataset. The proposed approach achieves satisfactory results with an accuracy of 99.54%, a FPR of 0.4%, and a running time of 18.5 s.
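
    A minimal sketch of the sliding-window entropy idea follows: it streams a categorical header feature (here invented source-IP labels) through a fixed-length window and yields the window's Shannon entropy, which collapses when a few sources dominate. The window size and the attack pattern are illustrative assumptions, not values from the paper.

      import math
      from collections import Counter, deque

      def sliding_entropy(stream, window=100):
          # Shannon entropy (bits) of a categorical header feature over a
          # fixed-length sliding window.
          win, counts = deque(), Counter()
          for value in stream:
              win.append(value)
              counts[value] += 1
              if len(win) > window:
                  old = win.popleft()
                  counts[old] -= 1
                  if counts[old] == 0:
                      del counts[old]
              n = len(win)
              yield -sum(c / n * math.log2(c / n) for c in counts.values())

      # Invented stream: 50 balanced sources, then one dominating source --
      # the entropy collapse flags a flooding-style pattern.
      traffic = ["ip%d" % (i % 50) for i in range(500)] + ["attacker"] * 300
      for t, h in enumerate(sliding_entropy(traffic)):
          if t in (499, 799):
              print("t=%d  H=%.2f bits" % (t, h))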

  13. Coordination of the National Statistical System in the Information Security Context

    Directory of Open Access Journals (Sweden)

    O. H.

    2017-12-01

    Full Text Available The need to build a national statistical system (NSS) as the framework for the coordination of statistical work is substantiated. The NSS is defined on the basis of a system approach. It is emphasized that the essential conditions underlying the NSS are strategic planning, reliance on internationally adopted methods and due consideration of the country-specific environment. The role of state coordination policy in organizing statistical activities within the NSS framework is highlighted, and the key objectives of an integrated national policy on the coordination of statistical activities are given. Threats arising from the non-existence of an NSS in a country are shown: an “irregular” pattern of statistical activities resulting from the absence of common legal, methodological and organizational grounds; high costs of the finished information product in parallel with its low quality; and the impossibility of administering statistical information security in a coherent manner, i.e. complying with the rules on data confidentiality, preventing the intentional distortion of information, and complying with the rules for handling data constituting state secrets. An extensive review of NSS functional objectives is made: to ensure the systematic development of official statistics; to ensure the confidentiality and protection of individual data; to establish interdepartmental mechanisms for the control and protection of secret statistical information; and to broaden and regulate access to statistical data and their effective use. The need to create a National Statistical Commission is substantiated.

  14. A Theoretical Analysis of the Mission Statement Based on the Axiological Approach

    Directory of Open Access Journals (Sweden)

    Marius-Costel EŞI

    2016-12-01

    Full Text Available This work focuses on a theoretical analysis of the formulation of the mission statement of business organizations in relation to the idea of an organizational axiological core. On the one hand, we consider corporate social responsibility (CSR), which, in our view, must be brought into direct connection both with moral entrepreneurship (which should support the philosophical perspective on the statement of a business organization's mission) and with purely economic entrepreneurship based on profit maximization (which should support the pragmatic perspective). On the other hand, an analysis of the moral concepts which should underpin business becomes fundamental, in our view, insofar as the specific social value of social entrepreneurship is evidenced. Our approach therefore highlights a number of epistemic explanations in relation to the actual practice dimension.

  15. Visual wetness perception based on image color statistics.

    Science.gov (United States)

    Sawayama, Masataka; Adelson, Edward H; Nishida, Shin'ya

    2017-05-01

    Color vision provides humans and animals with the abilities to discriminate colors based on the wavelength composition of light and to determine the location and identity of objects of interest in cluttered scenes (e.g., ripe fruit among foliage). However, we argue that color vision can inform us about much more than color alone. Since a trichromatic image carries more information about the optical properties of a scene than a monochromatic image does, color can help us recognize complex material qualities. Here we show that human vision uses color statistics of an image for the perception of an ecologically important surface condition (i.e., wetness). Psychophysical experiments showed that overall enhancement of chromatic saturation, combined with a luminance tone change that increases the darkness and glossiness of the image, tended to make dry scenes look wetter. Theoretical analysis along with image analysis of real objects indicated that our image transformation, which we call the wetness enhancing transformation, is consistent with actual optical changes produced by surface wetting. Furthermore, we found that the wetness enhancing transformation operator was more effective for the images with many colors (large hue entropy) than for those with few colors (small hue entropy). The hue entropy may be used to separate surface wetness from other surface states having similar optical properties. While surface wetness and surface color might seem to be independent, there are higher order color statistics that can influence wetness judgments, in accord with the ecological statistics. The present findings indicate that the visual system uses color image statistics in an elegant way to help estimate the complex physical status of a scene.
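
    The described image transformation can be sketched in a few lines: boost chromatic saturation and darken the luminance tone curve. The snippet below is a loose illustration of that idea, not the authors' wetness enhancing transformation; the gain and gamma values are arbitrary assumptions.

      import numpy as np
      from matplotlib.colors import rgb_to_hsv, hsv_to_rgb

      def wetness_enhance(rgb, sat_gain=1.8, gamma=1.6):
          # Loose sketch: boost chromatic saturation and darken the luminance
          # tone curve; rgb values are floats in [0, 1].
          hsv = rgb_to_hsv(np.asarray(rgb, dtype=float))
          hsv[..., 1] = np.clip(hsv[..., 1] * sat_gain, 0.0, 1.0)  # saturation
          hsv[..., 2] = hsv[..., 2] ** gamma                       # darker tones
          return hsv_to_rgb(hsv)

      dry = np.random.default_rng(2).random((8, 8, 3))  # hypothetical "dry" patch
      wet = wetness_enhance(dry)
      print("mean value: %.2f -> %.2f" % (dry.mean(), wet.mean()))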

  16. Approaches on information presented in different brazilian periodicals from the area of information science

    Directory of Open Access Journals (Sweden)

    Nadia Aurora Vanti

    2013-04-01

    Full Text Available This article aims to map the approaches to information presented in different Brazilian periodicals from the area of Information Science, with regard to three conceptual orientations: business information, citizenship information and information for emancipation. The methodological approach encompassed a literature review and qualitative and quantitative analysis. We conclude that the concept of information adopted in the articles analyzed varies according to the theoretical framework addressed by the authors, and that each framework is identified by its own set of terms. It was also possible to observe that the most recurrent focus in the analyzed journals was business information.

  17. Statistical properties of quantum entanglement and information entropy

    International Nuclear Information System (INIS)

    Abdel-Aty, M.M.A.

    2007-03-01

    Key words: entropy, entanglement, atom-field interaction, trapped ions, cold atoms, information entropy. Objects of research: pure-state entanglement, the entropy squeezing mazer. The aim of the work: to study new entanglement features and new measures for both pure and mixed states of particle-field interaction, as well as the impact of information entropy on quantum information theory. Method of investigation: methods of theoretical physics and applied mathematics (statistical physics, quantum optics) are used. Results obtained and their novelty: all the results of the dissertation are new and many new features have been discovered. In particular, the most general case of pure-state entanglement has been introduced. Although various special aspects of the quantum entropy have been investigated previously, the general features of the dynamics, when a multi-level system and a common environment are considered, have not been treated before, and our work therefore fills a gap in the literature. Specifically: 1) a new entanglement measure based on quantum mutual entropy (mixed-state entanglement), which we call DEM, has been introduced; 2) a new treatment of the atomic information entropy in higher-level systems has been presented, and the problem has been completely solved for a three-level system; 3) a new solution for the interaction between ultra-cold atoms and a cavity field has been found; 4) some new models of the atom-field interaction have been adopted. Practical value: the subject is theoretical in character. Application area: the results can be used in quantum computer development, as well as for further developments in quantum information and quantum communications. (author)

  18. ε-Polylysine-based thermo-responsive adsorbents for immunoglobulin adsorption-desorption under mild conditions.

    Science.gov (United States)

    Maruyama, Masashi; Shibuya, Keisuke

    2017-08-22

    Thermo-responsive adsorbents for immunoglobulin G (IgG) employing ε-polylysine (EPL) as a polymer backbone were developed. The introduction of mercaptoethylpyridine (MEP) as an IgG-binding ligand and the hydrophobization of side chains afforded thermo-responsive IgG adsorbents whose thermo-responsive IgG desorption ratio was up to 88% (EPL/MEP derivative 3m). Changes in the surface densities of active MEP groups, caused by thermal conformational changes of the adsorbents, play a key role in IgG desorption. Although a trade-off between IgG adsorption capacity and IgG desorption ratio was observed, the present study offers a novel molecular design for thermo-responsive adsorbents with high synthetic accessibility and potentially low toxicity.

  19. Experimental design techniques in statistical practice a practical software-based approach

    CERN Document Server

    Gardiner, W P

    1998-01-01

    Provides an introduction to the diverse subject area of experimental design, with many practical and applicable exercises to help the reader understand, present and analyse the data. The pragmatic approach offers technical training in the use of designs, and teaches statistical and non-statistical skills in the design and analysis of project studies throughout science and industry. Discusses one-factor designs and blocking designs, factorial experimental designs, Taguchi methods and response surface methods, among other topics.

  20. Information-Theoretical Analysis of EEG Microstate Sequences in Python

    Directory of Open Access Journals (Sweden)

    Frederic von Wegner

    2018-06-01

    Full Text Available We present an open-source Python package to compute information-theoretical quantities for electroencephalographic data. Electroencephalography (EEG) measures the electrical potential generated by the cerebral cortex and the set of spatial patterns projected by the brain's electrical potential on the scalp surface can be clustered into a set of representative maps called EEG microstates. Microstate time series are obtained by competitively fitting the microstate maps back into the EEG data set, i.e., by substituting the EEG data at a given time with the label of the microstate that has the highest similarity with the actual EEG topography. As microstate sequences consist of non-metric random variables, e.g., the letters A–D, we recently introduced information-theoretical measures to quantify these time series. In wakeful resting state EEG recordings, we found new characteristics of microstate sequences such as periodicities related to EEG frequency bands. The algorithms used are here provided as an open-source package and their use is explained in a tutorial style. The package is self-contained and the programming style is procedural, focusing on code intelligibility and easy portability. Using a sample EEG file, we demonstrate how to perform EEG microstate segmentation using the modified K-means approach, and how to compute and visualize the recently introduced information-theoretical tests and quantities. The time-lagged mutual information function is derived as a discrete symbolic alternative to the autocorrelation function for metric time series and confidence intervals are computed from Markov chain surrogate data. The software package provides an open-source extension to the existing implementations of the microstate transform and is specifically designed to analyze resting state EEG recordings.
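
    As a flavor of the information-theoretical quantities involved, here is a compact sketch (written independently of the package itself) that computes the Shannon entropy and a time-lagged mutual information of a symbolic label sequence; the random A–D sequence is synthetic.

      import numpy as np
      from collections import Counter

      def shannon_entropy_bits(seq):
          # Shannon entropy of a symbolic sequence such as microstate labels A-D.
          n = len(seq)
          return -sum(c / n * np.log2(c / n) for c in Counter(seq).values())

      def lagged_mutual_information_bits(seq, lag):
          # Time-lagged mutual information between seq[t] and seq[t+lag], a
          # symbolic counterpart of the autocorrelation function.
          pairs = Counter(zip(seq[:-lag], seq[lag:]))
          n = sum(pairs.values())
          px, py = Counter(seq[:-lag]), Counter(seq[lag:])
          return sum(c / n * np.log2((c / n) / ((px[a] / n) * (py[b] / n)))
                     for (a, b), c in pairs.items())

      rng = np.random.default_rng(3)
      labels = "".join(rng.choice(list("ABCD"), size=1000))  # synthetic sequence
      print("H = %.2f bits, I(lag=1) = %.3f bits"
            % (shannon_entropy_bits(labels),
               lagged_mutual_information_bits(labels, 1)))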

  1. Game theoretic approaches for spectrum redistribution

    CERN Document Server

    Wu, Fan

    2014-01-01

    This brief examines issues of spectrum allocation for the limited resources of radio spectrum. It uses a game-theoretic perspective, in which the nodes in the wireless network are rational and always pursue their own objectives. It provides a systematic study of the approaches that can guarantee the system's convergence at an equilibrium state, in which the system performance is optimal or sub-optimal. The author provides a short tutorial on game theory, explains game-theoretic channel allocation in clique and in multi-hop wireless networks and explores challenges in designing game-theoretic m

  2. Information Theory - The Bridge Connecting Bounded Rational Game Theory and Statistical Physics

    Science.gov (United States)

    Wolpert, David H.

    2005-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality of all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. This paper shows that the same information-theoretic mathematical structure, known as Product Distribution (PD) theory, addresses both issues. In this, PD theory not only provides a principled formulation of bounded rationality and a set of new types of mean field theory in statistical physics; it also shows that those topics are fundamentally one and the same.

  3. Quantum statistics of stimulated Raman and hyper-Raman scattering by master equation approach

    International Nuclear Information System (INIS)

    Gupta, P.S.; Dash, J.

    1991-01-01

    A quantum theoretical density matrix formalism of stimulated Raman and hyper-Raman scattering using a master equation approach is presented. The atomic system is described by two energy levels. The effects of the upper-level population and the cavity loss are incorporated. The photon statistics, coherence characteristics and the build-up of the Stokes field are investigated. (author). 8 figs., 5 refs

  4. A Theoretical Approach to Information Needs Across Different Healthcare Stakeholders

    Science.gov (United States)

    Raitoharju, Reetta; Aarnio, Eeva

    Increased access to medical information can lead to information overload among employees in the healthcare sector as well as among healthcare consumers. Moreover, medical information can be hard to understand for consumers who lack the prerequisites for interpreting it. Information systems (e.g. electronic patient records) are normally designed to meet the demands of one professional group, for instance physicians. Therefore, the same information in the same form is presented to all users of the systems, regardless of their actual needs or prerequisites. The purpose of this article is to illustrate the differences in information needs across different stakeholders in healthcare. A literature review was conducted to collect examples of these different information needs. Based on the findings, the role of more user-specific information systems is discussed.

  5. PRIS-STATISTICS: Power Reactor Information System Statistical Reports. User's Manual

    International Nuclear Information System (INIS)

    2013-01-01

    The IAEA developed the Power Reactor Information System (PRIS)-Statistics application to assist PRIS end users with generating statistical reports from PRIS data. Statistical reports provide an overview of the status, specification and performance results of every nuclear power reactor in the world. This user's manual was prepared to facilitate the use of the PRIS-Statistics application and to provide guidelines and detailed information for each report in the application. Statistical reports support analyses of nuclear power development and strategies, and the evaluation of nuclear power plant performance. The PRIS database can be used for comprehensive trend analyses and benchmarking against best performers and industrial standards.

  6. Relational autonomy in informed consent (RAIC) as an ethics of care approach to the concept of informed consent.

    Science.gov (United States)

    Osuji, Peter I

    2018-03-01

    The perspectives of the dominant Western ethical theories have dominated the concepts of autonomy and informed consent for many years. Recently this dominant understanding has been challenged by the ethics of care, which, although it also emanates from the West, presents a more nuanced concept: relational autonomy, which is more faithful to our human experience. By paying particular attention to relational autonomy, particularity, and the process approach to ethical deliberation in the ethics of care, this paper seeks to construct a concept of informed consent from the perspective of the ethics of care, here called relational autonomy-in-informed consent (RAIC). This provides a broader theoretical basis for informed consent beyond the usual theoretical perspectives, which are distinctly Western. Care ethics provides such a broader basis because it appeals to a global perspective that encompasses lessons from other cultures, and this will help to enrich current ideas about the bioethics principles of autonomy and informed consent. This objective will be achieved by exploring the ethics of care's emphasis on relationships based on a universal experience of caring, and by contrasting its concept of autonomy as relational with the understanding of autonomy in the approaches of the dominant moral theories, which reflect a rational, individualistic, and rights-oriented autonomy of American liberalism.

  7. Expectancy-Violation and Information-Theoretic Models of Melodic Complexity

    Directory of Open Access Journals (Sweden)

    Tuomas Eerola

    2016-07-01

    Full Text Available The present study assesses two types of models of melodic complexity: one based on expectancy violations and the other related to an information-theoretic account of redundancy in music. Seven different datasets spanning artificial sequences, folk and pop songs were used to refine and assess the models. The refinement eliminated unnecessary components from both types of models. The final analysis pitted three variants of the two model types against each other and could explain 46-74% of the variance in the ratings across the datasets. The most parsimonious models were identified with an information-theoretic criterion, which suggested that the simplified expectancy-violation models were the most efficient for these sets of data. However, the differences between all optimized models were subtle in terms of both performance and simplicity.

  8. Statistical inference an integrated Bayesian/likelihood approach

    CERN Document Server

    Aitkin, Murray

    2010-01-01

    Filling a gap in current Bayesian theory, Statistical Inference: An Integrated Bayesian/Likelihood Approach presents a unified Bayesian treatment of parameter inference and model comparisons that can be used with simple diffuse prior specifications. This novel approach provides new solutions to difficult model comparison problems and offers direct Bayesian counterparts of frequentist t-tests and other standard statistical methods for hypothesis testing.After an overview of the competing theories of statistical inference, the book introduces the Bayes/likelihood approach used throughout. It pre

  9. Information-Theoretic Approach May Shed a Light to a Better Understanding and Sustaining the Integrity of Ecological-Societal Systems under Changing Climate

    Science.gov (United States)

    Kim, J.

    2016-12-01

    Considering high levels of uncertainty, epistemological conflicts over facts and values, and a sense of urgency, normal paradigm-driven science will be insufficient to mobilize people and nations toward sustainability. The conceptual framework to bridge the societal system dynamics with those of the natural ecosystems in which humanity operates remains deficient. The key to understanding their coevolution is to understand 'self-organization.' An information-theoretic approach may shed light on this by providing a potential framework which enables us not only to bridge humans and nature but also to generate useful knowledge for understanding and sustaining the integrity of ecological-societal systems. How can information theory help us understand the interface between ecological systems and social systems? How can self-organizing processes be delineated and ensured to fulfil sustainability? How should the flow of information from data through models to decision-makers be evaluated? These are the core questions posed by sustainability science, in which visioneering (i.e., the engineering of vision) is an essential framework. Yet visioneering has neither a quantitative measure nor an information-theoretic framework to work with and teach. This presentation is an attempt to accommodate the framework of self-organizing hierarchical open systems with visioneering into a common information-theoretic framework. A case study is presented with the UN/FAO's communal vision of climate-smart agriculture (CSA), which pursues a trilemma of efficiency, mitigation, and resilience. Challenges of delineating and facilitating self-organizing systems are discussed using transdisciplinary tools such as complex systems thinking, dynamic process network analysis and multi-agent systems modeling. Acknowledgments: This study was supported by the Korea Meteorological Administration Research and Development Program under Grant KMA-2012-0001-A (WISE project).

  10. Game-theoretic interference coordination approaches for dynamic spectrum access

    CERN Document Server

    Xu, Yuhua

    2016-01-01

    Written by experts in the field, this book is based on recent research findings in dynamic spectrum access for cognitive radio networks. It establishes a game-theoretic framework and presents cutting-edge technologies for distributed interference coordination. With game-theoretic formulation and the designed distributed learning algorithms, it provides insights into the interactions between multiple decision-makers and the converging stable states. Researchers, scientists and engineers in the field of cognitive radio networks will benefit from the book, which provides valuable information, useful methods and practical algorithms for use in emerging 5G wireless communication.

  11. Reflections on the conceptualization and operationalization of a set-theoretic approach to employee motivation and performance research

    Directory of Open Access Journals (Sweden)

    James Christopher Ryan

    2017-01-01

    Full Text Available This commentary offers a reflection on the conceptualizations in Lee and Raschke's (2016) proposal for a set-theoretic approach to employee motivation and organizational performance. The commentary is informed by the author's own operationalization of set-theoretic research on employee motivation, which occurred contemporaneously with the work of Lee and Raschke. Observations on the state of current research on employee motivation, the development of motivation theory, and future directions for set-theoretic approaches to employee motivation and performance are offered.

  12. Natural disaster risk analysis for critical infrastructure systems: An approach based on statistical learning theory

    International Nuclear Information System (INIS)

    Guikema, Seth D.

    2009-01-01

    Probabilistic risk analysis has historically been developed for situations in which measured data about the overall reliability of a system are limited and expert knowledge is the best source of information available. There continue to be a number of important problem areas characterized by a lack of hard data. However, in other important problem areas the emergence of information technology has transformed the situation from one characterized by little data to one characterized by data overabundance. Natural disaster risk assessments for events impacting large-scale, critical infrastructure systems such as electric power distribution systems, transportation systems, water supply systems, and natural gas supply systems are important examples of problems characterized by data overabundance. There are often substantial amounts of information collected and archived about the behavior of these systems over time. Yet it can be difficult to effectively utilize these large data sets for risk assessment. Using this information for estimating the probability or consequences of system failure requires a different approach and analysis paradigm than risk analysis for data-poor systems does. Statistical learning theory, a diverse set of methods designed to draw inferences from large, complex data sets, can provide a basis for risk analysis for data-rich systems. This paper provides an overview of statistical learning theory methods and discusses their potential for greater use in risk analysis

  13. Use of a smart phone based thermo camera for skin prick allergy testing: a feasibility study (Conference Presentation)

    Science.gov (United States)

    Barla, Lindi; Verdaasdonk, Rudolf M.; Rustemeyer, Thomas; Klaessens, John; van der Veen, Albert

    2016-02-01

    Allergy testing is usually performed by exposing the skin to small quantities of potential allergens on the inner forearm and scratching the protective epidermis to increase exposure. After 15 minutes the dermatologist performs a visual check for swelling and erythema, which is subjective and difficult for, e.g., dark skin types. A small smartphone-based thermal camera (FLIR One) was used to obtain quantitative images in a feasibility study of 17 patients. Directly after allergen exposure on the forearm, thermal images were captured at 30-second intervals and processed into a time-lapse movie over 15 minutes. Taking the 'subjective' reading of the dermatologist as the gold standard, in 11/17 patients (65%) the evaluation of the dermatologist was confirmed by the thermal camera, including 5 of 6 patients without an allergic response. In 7 patients thermal imaging showed additional spots. Of the 342 sites tested, the dermatologist detected 47 allergies, of which 28 (60%) were confirmed by thermal imaging, while thermal imaging showed 12 additional spots. The method can be improved with dedicated acquisition software and better registration between normal and thermal images. The lymphatic reaction seems to shift from the original puncture site. The interpretation of the thermal images is still subjective, since collecting quantitative data is difficult due to patient motion during the 15 minutes. Although not yet conclusive, thermal imaging with a smartphone-based camera shows promise for improving the sensitivity and selectivity of allergy testing.

  14. Thermo-economic optimization of an endoreversible four-heat-reservoir absorption-refrigerator

    International Nuclear Information System (INIS)

    Qin Xiaoyong; Chen Lingen; Sun Fengrui; Wu Chih

    2005-01-01

    Based on an endoreversible four-heat-reservoir absorption-refrigeration-cycle model, the optimal thermo-economic performance of an absorption refrigerator is analyzed and optimized, assuming a linear (Newtonian) heat-transfer law. The optimal relation between the thermo-economic criterion and the coefficient of performance (COP), the maximum thermo-economic criterion, and the COP and specific cooling load at the maximum thermo-economic criterion of the cycle are derived using finite-time thermodynamics. Moreover, the effects of the cycle parameters on the thermo-economic performance of the cycle are studied through numerical examples

  15. Wigner Function of Thermo-Invariant Coherent State

    International Nuclear Information System (INIS)

    Xue-Fen, Xu; Shi-Qun, Zhu

    2008-01-01

    By using the thermal Wigner operator of thermo-field dynamics in the coherent thermal state |ξ) representation and the technique of integration within an ordered product of operators, the Wigner function of the thermo-invariant coherent state |z,ℵ> is derived. The nonclassical properties of the state |z,ℵ> are discussed based on the negativity of the Wigner function. (general)

  16. Encryption of covert information into multiple statistical distributions

    International Nuclear Information System (INIS)

    Venkatesan, R.C.

    2007-01-01

    A novel strategy to encrypt covert information (code) via unitary projections into the null spaces of ill-conditioned eigenstructures of multiple host statistical distributions, inferred from incomplete constraints, is presented. The host pdf's are inferred using the maximum entropy principle. The projection of the covert information is dependent upon the pdf's of the host statistical distributions. The security of the encryption/decryption strategy is based on the extreme instability of the encoding process. A self-consistent procedure to derive keys for both symmetric and asymmetric cryptography is presented. The advantages of using a multiple pdf model to achieve encryption of covert information are briefly highlighted. Numerical simulations exemplify the efficacy of the model

  17. Solar-driven thermo- and electrochemical degradation of nitrobenzene in wastewater: Adaptation and adoption of solar STEP concept.

    Science.gov (United States)

    Gu, Di; Shao, Nan; Zhu, Yanji; Wu, Hongjun; Wang, Baohui

    2017-01-05

    The STEP concept has been successfully demonstrated for driving chemical reactions by the utilization of solar heat and electricity, minimizing fossil energy use while maximizing the rate of thermo- and electrochemical reactions both thermodynamically and kinetically. This pioneering investigation shows experimentally that the STEP concept can be adapted and adopted efficiently for the degradation of nitrobenzene. By employing theoretical calculations and temperature-dependent cyclic voltammetry, the degradation potential of nitrobenzene was found to decrease markedly as the temperature increased, while the current was greatly lifted. Compared with conventional electrochemical methods, high efficiency and a fast degradation rate were clearly displayed, owing to the combined action of thermo- and electrochemical effects and the switch from indirect to direct electrochemical oxidation of nitrobenzene. A clear mechanism for nitrobenzene degradation by the STEP can be schematically proposed and discussed through the combination of thermo- and electrochemistry, based on the analysis of the HPLC and UV-vis data and the degradation data. This theory and experiment provide a pilot for the treatment of nitrobenzene wastewater with high efficiency, clean operation and a low carbon footprint, powered by solar energy without any other input of energy or chemicals. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. System identification with information theoretic criteria

    NARCIS (Netherlands)

    A.A. Stoorvogel; J.H. van Schuppen (Jan)

    1995-01-01

    Attention is focused in this paper on the approximation problem of system identification with information theoretic criteria. For a class of problems it is shown that the criterion of mutual information rate is identical to the criterion of exponential-of-quadratic cost and to

  19. Use of sensitivity-information for the adaptive simulation of thermo-hydraulic system codes

    International Nuclear Information System (INIS)

    Kerner, Alexander M.

    2011-01-01

    Within the scope of this thesis, methods for the online adaptation of dynamical plant simulations of a thermal-hydraulic system code to measurement data are developed. The approaches described are mainly based on the use of sensitivity information in different areas: statistical sensitivity measures are used to identify the parameters to be adapted, and online sensitivities are used for the parameter adjustment itself. For the parameter adjustment, the method of a 'system-adapted heuristic adaptation with partial separation' (SAHAT) was developed, which combines certain variants of parameter estimation and control with supporting procedures to solve the basic problems. The applicability of the methods is shown by adaptive simulations of a PKL-III experiment and by selected transients in a nuclear power plant. Finally, the main perspectives for the application of a tracking simulator to a system code are identified.

  20. A quantitative approach to measure road network information based on edge diversity

    Science.gov (United States)

    Wu, Xun; Zhang, Hong; Lan, Tian; Cao, Weiwei; He, Jing

    2015-12-01

    The measurement of map information has been one of the key issues in assessing cartographic quality and map generalization algorithms. It is also important for developing efficient approaches to transferring geospatial information. The road network is the most common linear object in the real world. Approximate descriptions of road network information will benefit road map generalization, navigation map production and urban planning. Most current approaches focus on node diversity and suppose that all edges are the same, which is inconsistent with real-life conditions and thus limits their ability to measure network information. As real-life traffic flows are directed and of different magnitudes, the original undirected vector road map was first converted to a directed topographic connectivity map. Then, in consideration of preferential attachment in complex network studies and the rich-club phenomenon in social networks, from and to weights were assigned to each edge. The from weight of a given edge is defined as the connectivity of its end node relative to the sum of the connectivities of all the neighbors of the edge's from node. After obtaining the from and to weights of each edge, the edge information, node information and whole-network structure information entropies can be obtained based on information theory. The approach has been applied to several 1-square-mile road network samples. Results show that information entropies based on edge diversity successfully describe the structural differences between road networks. This approach complements current map information measurements and can be extended to measure other kinds of geographical objects.
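
    The sketch below illustrates the general idea on a tiny made-up directed road graph: it derives a from weight for each edge from node connectivities (our reading of the definition above, not the authors' exact formula) and then computes a Shannon entropy over the normalized edge weights.

      import math
      import networkx as nx

      # Hypothetical directed road graph standing in for a topographic
      # connectivity map derived from an undirected road map.
      G = nx.DiGraph([("a", "b"), ("b", "c"), ("c", "a"), ("b", "d"), ("d", "b")])

      def from_weight(G, u, v):
          # Our reading of the abstract's definition: connectivity of the end
          # node v relative to the total connectivity of u's successors.
          total = sum(G.degree(w) for w in G.successors(u))
          return G.degree(v) / total

      weights = [from_weight(G, u, v) for u, v in G.edges]
      s = sum(weights)
      H = -sum(w / s * math.log2(w / s) for w in weights)  # edge entropy
      print("network edge-information entropy = %.2f bits" % H)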

  1. Quantum teleportation and entanglement. A hybrid approach to optical quantum information procesing

    Energy Technology Data Exchange (ETDEWEB)

    Furusawa, Akira [Tokyo Univ. (Japan). Dept. of Applied Physics; Loock, Peter van [Erlangen-Nuernberg Univ. (Germany). Lehrstuhl fuer Optik

    2011-07-01

    Unique in that it is jointly written by an experimentalist and a theorist, this monograph presents universal quantum computation based on quantum teleportation as an elementary subroutine and multi-party entanglement as a universal resource. Optical approaches to measurement-based quantum computation are also described, including schemes for quantum error correction, with most of the experiments carried out by the authors themselves. Ranging from the theoretical background to the details of the experimental realization, the book describes results and advances in the field, backed by numerous illustrations of the authors' experimental setups. Aimed at researchers, physicists, and graduate and PhD students in physics, theoretical quantum optics, quantum mechanics, and quantum information. (orig.)

  2. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  3. Theoretical Approaches to Political Communication.

    Science.gov (United States)

    Chesebro, James W.

    Political communication appears to be emerging as a theoretical and methodological academic area of research within both speech-communication and political science. Five complementary approaches to political science (Machiavellian, iconic, ritualistic, confirmational, and dramatistic) may be viewed as a series of variations which emphasize the…

  4. UNCERTAINTY IN NEOCLASSICAL AND KEYNESIAN THEORETICAL APPROACHES: A BEHAVIOURAL PERSPECTIVE

    Directory of Open Access Journals (Sweden)

    Sinziana BALTATESCU

    2015-11-01

    Full Text Available The "mainstream" neoclassical assumptions about human economic behavior are currently challenged both by behavioural research on human behaviour and by other theoretical approaches which, in the context of the recent economic and financial crisis, find arguments to reinforce their theoretical statements. The neoclassical "perfect rationality" assumption is the most criticized and provokes the mainstream theoretical approach into revisiting its theoretical framework in order to re-state the validity of its economic models. Uncertainty seems, in this context, to be the concept that allows other theoretical approaches to take into consideration a more realistic individual from the psychological perspective. This paper presents a comparison between the neoclassical and Keynesian approaches to uncertainty, considering the behavioural arguments and challenges addressed to the mainstream theory.

  5. WOMEN, FOOTBALL AND EUROPEAN INTEGRATION. AIMS AND QUESTIONS, METHODOLOGICAL AND THEORETICAL APPROACHES

    Directory of Open Access Journals (Sweden)

    Gertrud Pfister

    2013-12-01

    Full Text Available The aim of this article is to introduce a new research topic and provide information about a European research project focusing on football as a means of European integration. Using the results of available studies by the author and other scholars, it discusses whether and how women can participate in football cultures and contribute to a European identity. Based on theoretical approaches to national identity, gender and socialization, as well as on the analysis of various intersections between gender, football and fandom, it can be concluded that women are still outsiders in the world of football and that it is doubtful whether female players and fans will contribute decisively to Europeanization processes.

  6. Chemical entity recognition in patents by combining dictionary-based and statistical approaches

    Science.gov (United States)

    Akhondi, Saber A.; Pons, Ewoud; Afzal, Zubair; van Haagen, Herman; Becker, Benedikt F.H.; Hettne, Kristina M.; van Mulligen, Erik M.; Kors, Jan A.

    2016-01-01

    We describe the development of a chemical entity recognition system and its application in the CHEMDNER-patent track of BioCreative 2015. This community challenge includes a Chemical Entity Mention in Patents (CEMP) recognition task and a Chemical Passage Detection (CPD) classification task. We addressed both tasks by an ensemble system that combines a dictionary-based approach with a statistical one. For this purpose the performance of several lexical resources was assessed using Peregrine, our open-source indexing engine. We combined our dictionary-based results on the patent corpus with the results of tmChem, a chemical recognizer using a conditional random field classifier. To improve the performance of tmChem, we utilized three additional features, viz. part-of-speech tags, lemmas and word-vector clusters. When evaluated on the training data, our final system obtained an F-score of 85.21% for the CEMP task, and an accuracy of 91.53% for the CPD task. On the test set, the best system ranked sixth among 21 teams for CEMP with an F-score of 86.82%, and second among nine teams for CPD with an accuracy of 94.23%. The differences in performance between the best ensemble system and the statistical system separately were small. Database URL: http://biosemantics.org/chemdner-patents PMID:27141091

  7. A System Theoretical Inspired Approach to Knowledge Construction

    DEFF Research Database (Denmark)

    Mathiasen, Helle

    2008-01-01

    Abstract The aim of this paper is to discuss the relation between teaching and learning. The point of departure is that teaching environments (communication forums) are potential facilitators of learning processes and knowledge construction. The paper presents a theoretical framework for discussing the student's knowledge construction in the light of operative constructivism, inspired by the German sociologist N. Luhmann's system theoretical approach to epistemology. Taking observations as operations based on distinction and indication (selection), contingency becomes a fundamental condition in learning processes, and a condition which teaching must address as far as teaching strives to stimulate non-random learning outcomes. Thus learning outcomes, understood as the individual learner's knowledge construction, cannot be directly predicted from events and characteristics in the environment. This has...

  8. Comparing Simulated and Theoretical Sampling Distributions of the U3 Person-Fit Statistic.

    Science.gov (United States)

    Emons, Wilco H. M.; Meijer, Rob R.; Sijtsma, Klaas

    2002-01-01

    Studied whether the theoretical sampling distribution of the U3 person-fit statistic is in agreement with the simulated sampling distribution under different item response theory models and varying item and test characteristics. Simulation results suggest that the use of standard normal deviates for the standardized version of the U3 statistic may…

  9. Inkjet-Printed Biofunctional Thermo-Plasmonic Interfaces for Patterned Neuromodulation.

    Science.gov (United States)

    Kang, Hongki; Lee, Gu-Haeng; Jung, Hyunjun; Lee, Jee Woong; Nam, Yoonkey

    2018-02-27

    Localized heat generation by the thermo-plasmonic effect of metal nanoparticles has great potential in biomedical engineering research. Precise patterning of the nanoparticles using inkjet printing can enable the application of the thermo-plasmonic effect in a well-controlled way (shape and intensity). However, a universally applicable inkjet printing process that allows good control in patterning and assembly of nanoparticles with good biocompatibility is missing. Here we developed inkjet-printing-based biofunctional thermo-plasmonic interfaces that can modulate biological activities. We found that inkjet printing of plasmonic nanoparticles on a polyelectrolyte layer-by-layer substrate coating enables high-quality, biocompatible thermo-plasmonic interfaces across various substrates (rigid/flexible, hydrophobic/hydrophilic) by induced contact line pinning and electrostatically assisted nanoparticle assembly. We experimentally confirmed that the generated heat from the inkjet-printed thermo-plasmonic patterns can be applied in micrometer resolution over a large area. Lastly, we demonstrated that the patterned thermo-plasmonic effect from the inkjet-printed gold nanorods can selectively modulate neuronal network activities. This inkjet printing process therefore can be a universal method for biofunctional thermo-plasmonic interfaces in various bioengineering applications.

  10. Information-theoretic signatures of biodiversity in the barcoding gene.

    Science.gov (United States)

    Barbosa, Valmir C

    2018-08-14

    Analyzing the information content of DNA, though holding the promise to help quantify how the processes of evolution have led to information gain throughout the ages, has remained an elusive goal. Paradoxically, one of the main reasons for this has been precisely the great diversity of life on the planet: if on the one hand this diversity is a rich source of data for information-content analysis, on the other hand there is so much variation as to make the task unmanageable. During the past decade or so, however, succinct fragments of the COI mitochondrial gene, which is present in all animal phyla and in a few others, have been shown to be useful for species identification through DNA barcoding. A few million such fragments are now publicly available through the BOLD systems initiative, thus providing an unprecedented opportunity for relatively comprehensive information-theoretic analyses of DNA to be attempted. Here we show how a generalized form of total correlation can yield distinctive information-theoretic descriptors of the phyla represented in those fragments. In order to illustrate the potential of this analysis to provide new insight into the evolution of species, we performed principal component analysis on standardized versions of the said descriptors for 23 phyla. Surprisingly, we found that, though based solely on the species represented in the data, the first principal component correlates strongly with the natural logarithm of the number of all known living species for those phyla. The new descriptors thus constitute clear information-theoretic signatures of the processes whereby evolution has given rise to current biodiversity, which suggests their potential usefulness in further related studies. Copyright © 2018 Elsevier Ltd. All rights reserved.
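
    The descriptors rest on total correlation, the multivariate generalization of mutual information. Below is a minimal sketch, assuming toy aligned fragments and the standard (non-generalized) form C = Σ H(X_i) − H(X_1,…,X_n), with each alignment column treated as one variable.

```python
import math
from collections import Counter

# Toy aligned COI-like fragments (one string per specimen).
fragments = ["ACGT", "ACGA", "ACGT", "ATGT", "ACGA"]

def entropy(samples):
    # Plug-in Shannon entropy (bits) of an empirical sample.
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in Counter(samples).values())

# Marginal entropies, one per alignment column.
columns = list(zip(*fragments))
marginal = sum(entropy(col) for col in columns)

# Joint entropy of the whole fragments.
joint = entropy(fragments)

# Total correlation: information shared across positions.
C = marginal - joint
print(f"total correlation: {C:.3f} bits")
```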

  11. A BIM-based approach to reusing construction firm’s management information

    OpenAIRE

    Zhiliang Ma

    2012-01-01

    Nowadays most construction firms have begun to use information management systems in their business to work more efficiently. At the same time, a lot of management information is being accumulated, and some of it can be reused to support decision-making. Up to now, the information has not been reused as effectively in construction firms as expected. This paper introduces a new approach to reusing construction firm’s management information, which is based on BIM (Building Inf...

  12. VO{sub 2}-like thermo-optical switching effect in one-dimensional nonlinear defective photonic crystals

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Juan, E-mail: juanzhang@staff.shu.edu.cn, E-mail: ywang@siom.ac.cn; Zhang, Rongjun [Key Laboratory of Specialty Fiber Optics and Optical Access Networks, School of Communication and Information Engineering, Shanghai University, Shanghai 200072 (China); Wang, Yang, E-mail: juanzhang@staff.shu.edu.cn, E-mail: ywang@siom.ac.cn [Key Laboratory of High Power Laser Materials, Shanghai Institute of Optics and Fine Mechanics, Chinese Academy of Sciences, Shanghai 201800 (China)

    2015-06-07

    A new approach to achieving VO{sub 2}-like thermo-optical switching in a one-dimensional photonic crystal through the combination of thermo-optical and optical Kerr effects was proposed and numerically demonstrated in this study. The switching temperature and the hysteresis width can be tuned over a wide temperature range. Steep transition, high optical contrast, and low pumping power can be achieved at the same time. This kind of one-dimensional photonic-crystal-based bistable switch will be low-cost, easy to fabricate, and versatile in practical applications compared with traditional VO{sub 2}-type ones.

  13. A group theoretic approach to quantum information

    CERN Document Server

    Hayashi, Masahito

    2017-01-01

    This textbook is the first to address quantum information from the viewpoint of group symmetry. Quantum systems have a group symmetrical structure, which enables quantum information processing to be handled systematically. However, although many textbooks cover group representations, no other textbook focuses on group symmetry for quantum information. After the mathematical preparation of quantum information, this book discusses quantum entanglement and its quantification by using group symmetry. Group symmetry drastically simplifies the calculation of several entanglement measures, although these calculations are usually very difficult to handle. The book treats optimal information processes including quantum state estimation, quantum state cloning, estimation of group action, quantum channels etc. Usually it is very difficult to derive optimal quantum information processes without the asymptotic setting of these topics. However, group symmetry allows one to derive these optimal solu...

  14. A statistical approach for segregating cognitive task stages from multivariate fMRI BOLD time series

    Directory of Open Access Journals (Sweden)

    Charmaine eDemanuele

    2015-10-01

    Full Text Available Multivariate pattern analysis can reveal new information from neuroimaging data to illuminate human cognition and its disturbances. Here, we develop a methodological approach, based on multivariate statistical/machine learning and time series analysis, to discern cognitive processing stages from fMRI blood oxygenation level dependent (BOLD) time series. We apply this method to data recorded from a group of healthy adults whilst performing a virtual reality version of the delayed win-shift radial arm maze task. This task has been frequently used to study working memory and decision making in rodents. Using linear classifiers and multivariate test statistics in conjunction with time series bootstraps, we show that different cognitive stages of the task, as defined by the experimenter, namely, the encoding/retrieval, choice, reward and delay stages, can be statistically discriminated from the BOLD time series in brain areas relevant for decision making and working memory. Discrimination of these task stages was significantly reduced during poor behavioral performance in dorsolateral prefrontal cortex (DLPFC), but not in the primary visual cortex (V1). Experimenter-defined dissection of time series into class labels based on task structure was confirmed by an unsupervised, bottom-up approach based on Hidden Markov Models. Furthermore, we show that different groupings of recorded time points into cognitive event classes can be used to test hypotheses about the specific cognitive role of a given brain region during task execution. We found that whilst the DLPFC strongly differentiated between task stages associated with different memory loads but not between different visual-spatial aspects, the reverse was true for V1. Our methodology illustrates how different aspects of cognitive information processing during one and the same task can be separated and attributed to specific brain regions based on information contained in multivariate patterns of voxel...
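
    A minimal sketch of the stage-decoding idea, with synthetic data standing in for the BOLD patterns; this is not the authors' full pipeline, which additionally uses time series bootstraps and multivariate test statistics, and every value below is invented.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic multivariate "BOLD" patterns: 200 time points x 50 voxels,
# labelled with one of four task stages (0=encoding, 1=choice, 2=reward, 3=delay).
X = rng.normal(size=(200, 50))
y = rng.integers(0, 4, size=200)
X += 0.5 * np.eye(4)[y] @ rng.normal(size=(4, 50))  # add stage-specific signal

# Cross-validated accuracy of a linear classifier on the labelled stages.
acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()

# Permutation baseline: shuffle labels to approximate chance level.
chance = cross_val_score(LinearDiscriminantAnalysis(), X,
                         rng.permutation(y), cv=5).mean()
print(f"stage decoding accuracy: {acc:.2f} (chance ~ {chance:.2f})")
```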

  15. Unity-Based Diversity: System Approach to Defining Information

    Directory of Open Access Journals (Sweden)

    Yixin Zhong

    2011-07-01

    Full Text Available What is information? This is the first question that information science should answer clearly. However, the definitions of information have been so diversified that people are questioning if there is any unity among the diversity, leading to a suspicion on whether it is possible to establish a unified theory of information or not. To answer this question, a system approach to defining information is introduced in this paper. It is proved that the unity of information definitions can be maintained with this approach. As a by-product, an important concept, the information eco-system, was also achieved.

  16. The spread of scientific information: insights from the web usage statistics in PLoS article-level metrics.

    Directory of Open Access Journals (Sweden)

    Koon-Kiu Yan

    Full Text Available The presence of web-based communities is a distinctive signature of Web 2.0. The web-based feature means that information propagation within each community is highly facilitated, promoting complex collective dynamics in view of information exchange. In this work, we focus on a community of scientists and study, in particular, how the awareness of a scientific paper is spread. Our work is based on the web usage statistics obtained from the PLoS Article Level Metrics dataset compiled by PLoS. The cumulative number of HTML views was found to follow a long tail distribution which is reasonably well-fitted by a lognormal one. We modeled the diffusion of information by a random multiplicative process, and thus extracted the rates of information spread at different stages after the publication of a paper. We found that the spread of information displays two distinct decay regimes: a rapid downfall in the first month after publication, and a gradual power law decay afterwards. We identified these two regimes with two distinct driving processes: a short-term behavior driven by the fame of a paper, and a long-term behavior consistent with citation statistics. The patterns of information spread were found to be remarkably similar in data from different journals, but there are intrinsic differences for different types of web usage (HTML views and PDF downloads versus XML. These similarities and differences shed light on the theoretical understanding of different complex systems, as well as a better design of the corresponding web applications that is of high potential marketing impact.

  17. The spread of scientific information: insights from the web usage statistics in PLoS article-level metrics.

    Science.gov (United States)

    Yan, Koon-Kiu; Gerstein, Mark

    2011-01-01

    The presence of web-based communities is a distinctive signature of Web 2.0. The web-based feature means that information propagation within each community is highly facilitated, promoting complex collective dynamics in view of information exchange. In this work, we focus on a community of scientists and study, in particular, how the awareness of a scientific paper is spread. Our work is based on the web usage statistics obtained from the PLoS Article Level Metrics dataset compiled by PLoS. The cumulative number of HTML views was found to follow a long tail distribution which is reasonably well-fitted by a lognormal one. We modeled the diffusion of information by a random multiplicative process, and thus extracted the rates of information spread at different stages after the publication of a paper. We found that the spread of information displays two distinct decay regimes: a rapid downfall in the first month after publication, and a gradual power law decay afterwards. We identified these two regimes with two distinct driving processes: a short-term behavior driven by the fame of a paper, and a long-term behavior consistent with citation statistics. The patterns of information spread were found to be remarkably similar in data from different journals, but there are intrinsic differences for different types of web usage (HTML views and PDF downloads versus XML). These similarities and differences shed light on the theoretical understanding of different complex systems, as well as a better design of the corresponding web applications that is of high potential marketing impact.
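
    The lognormal long tail follows from the random multiplicative model: if each period multiplies the view count by an independent random factor, log-counts perform a random walk. A toy simulation under that assumption, with all rates invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random multiplicative growth of cumulative views for 10,000 papers:
# each day multiplies the count by a random factor, so log-counts sum
# independent increments and the final counts are approximately lognormal.
n_papers, n_days = 10_000, 60
log_views = np.log(10.0) + rng.normal(loc=0.05, scale=0.3,
                                      size=(n_papers, n_days)).sum(axis=1)
views = np.exp(log_views)

# Fit a lognormal by matching the mean/std of log-counts.
mu, sigma = log_views.mean(), log_views.std()
print(f"lognormal fit: mu={mu:.2f}, sigma={sigma:.2f}, "
      f"median views ~ {np.exp(mu):.0f}")
```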

  18. A global approach to estimate irrigated areas - a comparison between different data and statistics

    Science.gov (United States)

    Meier, Jonas; Zabel, Florian; Mauser, Wolfram

    2018-02-01

    Agriculture is the largest global consumer of water. Irrigated areas constitute 40 % of the total area used for agricultural production (FAO, 2014a). Information on their spatial distribution is highly relevant for regional water management and food security. Spatial information on irrigation is highly important for policy and decision makers, who are facing the transition towards more efficient sustainable agriculture. However, the mapping of irrigated areas still represents a challenge for land use classifications, and existing global data sets differ strongly in their results. The following study tests an existing irrigation map based on statistics and extends the irrigated area using ancillary data. The approach processes and analyzes multi-temporal normalized difference vegetation index (NDVI) SPOT-VGT data and agricultural suitability data - both at a spatial resolution of 30 arcsec - incrementally in a multiple decision tree. It covers the period from 1999 to 2012. The results globally show an 18 % larger irrigated area than existing approaches based on statistical data. The largest differences compared to the official national statistics are found in Asia and particularly in China and India. The additional areas are mainly identified within already known irrigated regions where irrigation is denser than previously estimated. The validation with global and regional products shows the large divergence of existing data sets with respect to size and distribution of irrigated areas, caused by spatial resolution, the considered time period, and the input data and assumptions made.

  19. Genetic programming based models in plant tissue culture: An addendum to traditional statistical approach.

    Science.gov (United States)

    Mridula, Meenu R; Nair, Ashalatha S; Kumar, K Satheesh

    2018-02-01

    In this paper, we compared the efficacy of an observation-based modeling approach using a genetic algorithm with regular statistical analysis as an alternative methodology in plant research. Preliminary experimental data on in vitro rooting were taken for this study with an aim to understand the effect of charcoal and naphthalene acetic acid (NAA) on successful rooting and also to optimize the two variables for maximum result. Observation-based modelling, as well as the traditional approach, could identify NAA as a critical factor in rooting of the plantlets under the experimental conditions employed. Symbolic regression analysis using the software deployed here optimised the treatments studied and was successful in identifying the complex non-linear interaction among the variables, with minimal preliminary data. The presence of charcoal in the culture medium has a significant impact on root generation by reducing basal callus mass formation. Such an approach is advantageous for establishing in vitro culture protocols, as these models have significant potential for saving time and expenditure in plant tissue culture laboratories, and it further reduces the need for a specialised background.

  20. Statistical inference an integrated approach

    CERN Document Server

    Migon, Helio S; Louzada, Francisco

    2014-01-01

    Introduction: Information; The concept of probability; Assessing subjective probabilities; An example; Linear algebra and probability; Notation; Outline of the book. Elements of Inference: Common statistical models; Likelihood-based functions; Bayes theorem; Exchangeability; Sufficiency and exponential family; Parameter elimination. Prior Distribution: Entirely subjective specification; Specification through functional forms; Conjugacy with the exponential family; Non-informative priors; Hierarchical priors. Estimation: Introduction to decision theory; Bayesian point estimation; Classical point estimation; Empirical Bayes estimation; Comparison of estimators; Interval estimation; Estimation in the Normal model. Approximating Methods: The general problem of inference; Optimization techniques; Asymptotic theory; Other analytical approximations; Numerical integration methods; Simulation methods. Hypothesis Testing: Introduction; Classical hypothesis testing; Bayesian hypothesis testing; Hypothesis testing and confidence intervals; Asymptotic tests. Prediction...

  1. Fiber Optic Thermo-Hygrometers for Soil Moisture Monitoring.

    Science.gov (United States)

    Leone, Marco; Principe, Sofia; Consales, Marco; Parente, Roberto; Laudati, Armando; Caliro, Stefano; Cutolo, Antonello; Cusano, Andrea

    2017-06-20

    This work deals with the fabrication, prototyping, and experimental validation of a fiber optic thermo-hygrometer-based soil moisture sensor, useful for rainfall-induced landslide prevention applications. In particular, we recently proposed a new generation of fiber Bragg grating (FBGs)-based soil moisture sensors for irrigation purposes. This device was realized by integrating, inside a customized aluminum protection package, a FBG thermo-hygrometer with a polymer micro-porous membrane. Here, we first verify the limitations, in terms of the volumetric water content (VWC) measuring range, of this first version of the soil moisture sensor for its exploitation in landslide prevention applications. Successively, we present the development, prototyping, and experimental validation of a novel, optimized version of a soil VWC sensor, still based on a FBG thermo-hygrometer, but able to reliably monitor, continuously and in real-time, VWC values up to 37% when buried in the soil.

  2. Understanding latent structures of clinical information logistics: A bottom-up approach for model building and validating the workflow composite score.

    Science.gov (United States)

    Esdar, Moritz; Hübner, Ursula; Liebe, Jan-David; Hüsers, Jens; Thye, Johannes

    2017-01-01

    Clinical information logistics is a construct that aims to describe and explain various phenomena of information provision to drive clinical processes. It can be measured by the workflow composite score, an aggregated indicator of the degree of IT support in clinical processes. This study primarily aimed to investigate the yet unknown empirical patterns constituting this construct. The second goal was to derive a data-driven weighting scheme for the constituents of the workflow composite score and to contrast this scheme with a literature based, top-down procedure. This approach should finally test the validity and robustness of the workflow composite score. Based on secondary data from 183 German hospitals, a tiered factor analytic approach (confirmatory and subsequent exploratory factor analysis) was pursued. A weighting scheme, which was based on factor loadings obtained in the analyses, was put into practice. We were able to identify five statistically significant factors of clinical information logistics that accounted for 63% of the overall variance. These factors were "flow of data and information", "mobility", "clinical decision support and patient safety", "electronic patient record" and "integration and distribution". The system of weights derived from the factor loadings resulted in values for the workflow composite score that differed only slightly from the score values that had been previously published based on a top-down approach. Our findings give insight into the internal composition of clinical information logistics both in terms of factors and weights. They also allowed us to propose a coherent model of clinical information logistics from a technical perspective that joins empirical findings with theoretical knowledge. Despite the new scheme of weights applied to the calculation of the workflow composite score, the score behaved robustly, which is yet another hint of its validity and therefore its usefulness. Copyright © 2016 Elsevier Ireland
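
    A sketch of how a factor-loading-derived weighting scheme might be turned into a composite score; the indicator names, loadings, and per-indicator scores below are hypothetical stand-ins, not the study's published values.

```python
import numpy as np

# Hypothetical factor loadings for five clinical-information-logistics
# indicators (names and values invented for illustration).
indicators = ["data_flow", "mobility", "decision_support",
              "patient_record", "integration"]
loadings = np.array([0.81, 0.62, 0.74, 0.69, 0.77])
hospital_scores = np.array([0.9, 0.4, 0.7, 0.8, 0.6])  # per-indicator IT support

# Loading-derived weights, normalized to sum to one.
weights = loadings / loadings.sum()

# Workflow composite score as the weighted mean of indicator scores.
composite = float(weights @ hospital_scores)
print(f"workflow composite score: {composite:.3f}")
```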

  3. A Set Theoretical Approach to Maturity Models

    DEFF Research Database (Denmark)

    Lasrado, Lester; Vatrapu, Ravi; Andersen, Kim Normann

    2016-01-01

    Maturity Model research in IS has been criticized for the lack of theoretical grounding, methodological rigor, empirical validations, and ignorance of multiple and non-linear paths to maturity. To address these criticisms, this paper proposes a novel set-theoretical approach to maturity models characterized by equifinality, multiple conjunctural causation, and case diversity. We prescribe methodological guidelines consisting of a six-step procedure to systematically apply set theoretic methods to conceptualize, develop, and empirically derive maturity models and provide a demonstration...

  4. The duration of uncertain times: audiovisual information about intervals is integrated in a statistically optimal fashion.

    Directory of Open Access Journals (Sweden)

    Jess Hartcher-O'Brien

    Full Text Available Often multisensory information is integrated in a statistically optimal fashion where each sensory source is weighted according to its precision. This integration scheme is statistically optimal because it theoretically results in unbiased perceptual estimates with the highest precision possible. There is a current lack of consensus about how the nervous system processes multiple sensory cues to elapsed time. In order to shed light upon this, we adopt a computational approach to pinpoint the integration strategy underlying duration estimation of audio/visual stimuli. One of the assumptions of our computational approach is that the multisensory signals redundantly specify the same stimulus property. Our results clearly show that despite claims to the contrary, perceived duration is the result of an optimal weighting process, similar to that adopted for estimates of space. That is, participants weight the audio and visual information to arrive at the most precise, single duration estimate possible. The work also disentangles how different integration strategies - i.e. considering the time of onset/offset of signals - might alter the final estimate. As such we provide the first concrete evidence of an optimal integration strategy in human duration estimates.
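
    The optimal weighting scheme referred to here is the standard inverse-variance (maximum-likelihood) cue combination rule. A worked toy example, with the duration estimates and precisions invented:

```python
import numpy as np

# Maximum-likelihood (inverse-variance) integration of an auditory and a
# visual duration estimate; values are illustrative only.
d_audio, sigma_audio = 1.10, 0.05    # seconds; audition is usually more precise
d_visual, sigma_visual = 1.30, 0.15

# Each cue is weighted by its relative precision (1/variance).
w_a = (1 / sigma_audio**2) / (1 / sigma_audio**2 + 1 / sigma_visual**2)
w_v = 1 - w_a

d_hat = w_a * d_audio + w_v * d_visual
sigma_hat = np.sqrt(1 / (1 / sigma_audio**2 + 1 / sigma_visual**2))
print(f"integrated duration: {d_hat:.3f} s, sd {sigma_hat:.3f} s")
# The fused sd is below either single-cue sd: the statistically optimal gain.
```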

  5. Research methodology in dentistry: Part II — The relevance of statistics in research

    Science.gov (United States)

    Krithikadatta, Jogikalmat; Valarmathi, Srinivasan

    2012-01-01

    The lifeline of original research depends on adept statistical analysis. However, there have been reports of statistical misconduct in studies that could arise from an inadequate understanding of the fundamentals of statistics. There have been several reports on this across the medical and dental literature. This article aims at encouraging the reader to approach statistics from its logic rather than its theoretical perspective. The article also provides information on statistical misuse in the Journal of Conservative Dentistry between the years 2008 and 2011. PMID:22876003

  6. Security of statistical data bases: invasion of privacy through attribute correlational modeling

    Energy Technology Data Exchange (ETDEWEB)

    Palley, M.A.

    1985-01-01

    This study develops, defines, and applies a statistical technique for the compromise of confidential information in a statistical data base. Attribute Correlational Modeling (ACM) recognizes that the information contained in a statistical data base represents real world statistical phenomena. As such, ACM assumes correlational behavior among the database attributes. ACM proceeds to compromise confidential information through creation of a regression model, where the confidential attribute is treated as the dependent variable. The typical statistical data base may preclude the direct application of regression. In this scenario, the research introduces the notion of a synthetic data base, created through legitimate queries of the actual data base, and through proportional random variation of responses to these queries. The synthetic data base is constructed to resemble the actual data base as closely as possible in a statistical sense. ACM then applies regression analysis to the synthetic data base, and utilizes the derived model to estimate confidential information in the actual database.
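
    A schematic of the ACM compromise under stated assumptions: a hypothetical database, mean-queries with proportional random variation forming the synthetic database, and ordinary least squares standing in for the regression step. All attribute names and numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical statistical database: public attributes (age, tenure) and a
# confidential one (salary) reachable only through aggregate queries.
n = 500
age = rng.uniform(25, 60, n)
tenure = rng.uniform(0, 30, n)
salary = 20_000 + 900 * age + 400 * tenure + rng.normal(0, 2_000, n)

def mean_query(values, idx, noise=0.02):
    # A legitimate mean-query over a subset, with proportional random
    # variation added to the response.
    return values[idx].mean() * (1 + rng.normal(0, noise))

# Build the synthetic database from many such queries.
rows = []
for _ in range(200):
    idx = rng.choice(n, size=25, replace=False)
    rows.append([mean_query(age, idx), mean_query(tenure, idx),
                 mean_query(salary, idx)])
synthetic = np.array(rows)

# ACM step: regress the confidential attribute on the public ones using
# only the synthetic (query-derived) data.
X = np.column_stack([np.ones(len(synthetic)), synthetic[:, 0], synthetic[:, 1]])
coef, *_ = np.linalg.lstsq(X, synthetic[:, 2], rcond=None)
print("inferred model: salary ~ "
      f"{coef[0]:.0f} + {coef[1]:.0f}*age + {coef[2]:.0f}*tenure")
```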

  7. Thermo Scientific Ozone Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The primary measurement output from the Thermo Scientific Ozone Analyzer is the concentration of the analyte (O3) reported at 1-s resolution in units of ppbv in ambient air. Note that because of internal pneumatic switching limitations the instrument only makes an independent measurement every 4 seconds. Thus, the same concentration number is repeated roughly 4 times at the uniform, monotonic 1-s time base used in the AOS systems. Accompanying instrument outputs include sample temperatures, flows, chamber pressure, lamp intensities and a multiplicity of housekeeping information. There is also a field for operator comments made at any time while data is being collected.

  8. Thermo-hydraulic characteristics of ship propulsion reactor in the conditions of ship motions and safety assessment

    International Nuclear Information System (INIS)

    Kobayashi, Michiyuki; Murata, Hiroyuki; Sawada, Kenichi; Inasaka, Fujio; Aya, Izuo; Shiozaki, Koki

    1999-01-01

    Experimental data and other information on the thermo-hydraulic characteristics of integrated ship propulsion reactors, accumulated by the Ship Research Institute together with recent cooperative results, were input into the nuclear ship engineering simulation system. This was done not only to contribute to an improvement study on the next ship reactor, by carrying out general analysis and evaluation of motion characteristics under ship body motion conditions, safety at accidents, and so on, but also to investigate and prepare measures for applying the fundamental experimental results to the safety countermeasures of nuclear ships. In fiscal year 1997, the work covered three topics. On the safety of nuclear ships carrying integrated ship propulsion reactors, experimental data on unstable flow analysis and related information were added to the general database program, aiming at its development into an intelligent database program. On the effect of pulsating flow on the thermo-hydraulic characteristics of ship propulsion reactors, after a pulsating-flow visualization experiment, the apparatus was converted to a heat transfer type, and numerical analysis of pulsating flow was conducted once the validity of the numerical analysis code had been confirmed by comparison with the visualization results. On the thermo-hydraulic behavior in the containment vessel during an accident of an active safety type ship propulsion reactor, a flashing vibration test was performed with new apparatus, pressurized beyond the previous fiscal year's level, to examine the effects of parameters such as the radius and length of the exhaust nozzle and the pool water temperature. (G.K.)

  9. Statistical and machine learning approaches for network analysis

    CERN Document Server

    Dehmer, Matthias

    2012-01-01

    Explore the multidisciplinary nature of complex networks through machine learning techniques Statistical and Machine Learning Approaches for Network Analysis provides an accessible framework for structurally analyzing graphs by bringing together known and novel approaches on graph classes and graph measures for classification. By providing different approaches based on experimental data, the book uniquely sets itself apart from the current literature by exploring the application of machine learning techniques to various types of complex networks. Comprised of chapters written by internation

  10. Theoretical Approaches to Coping

    Directory of Open Access Journals (Sweden)

    Sofia Zyga

    2013-01-01

    Full Text Available Introduction: Dealing with stress requires conscious effort; it cannot be perceived as equal to an individual's spontaneous reactions. The intentional management of stress must not be confused with defense mechanisms. Coping differs from adjustment in that the latter is more general, has a broader meaning and includes diverse ways of facing a difficulty. Aim: An exploration of the definition of the term "coping", the function of the coping process as well as its differentiation from other similar meanings through a literature review. Methodology: Three theoretical approaches to coping are introduced: the psychoanalytic approach; approaching by characteristics; and the Lazarus and Folkman interactive model. Results: The strategic methods of the coping approaches are described, and the article ends with a review of the approaches covering the functioning of the stress-coping process, the classification types of coping strategies in stress-inducing situations, and a criticism of coping approaches. Conclusions: The comparison of coping in different situations is difficult, if not impossible. The coping process is a slow process, so an individual may select one method of coping under one set of circumstances and a different strategy at some other time. Such selection of strategies takes place as the situation changes.

  11. A novel approach for choosing summary statistics in approximate Bayesian computation.

    Science.gov (United States)

    Aeschbacher, Simon; Beaumont, Mark A; Futschik, Andreas

    2012-11-01

    The choice of summary statistics is a crucial step in approximate Bayesian computation (ABC). Since statistics are often not sufficient, this choice involves a trade-off between loss of information and reduction of dimensionality. The latter may increase the efficiency of ABC. Here, we propose an approach for choosing summary statistics based on boosting, a technique from the machine-learning literature. We consider different types of boosting and compare them to partial least-squares regression as an alternative. To mitigate the lack of sufficiency, we also propose an approach for choosing summary statistics locally, in the putative neighborhood of the true parameter value. We study a demographic model motivated by the reintroduction of Alpine ibex (Capra ibex) into the Swiss Alps. The parameters of interest are the mean and standard deviation across microsatellites of the scaled ancestral mutation rate (θ(anc) = 4N(e)u) and the proportion of males obtaining access to matings per breeding season (ω). By simulation, we assess the properties of the posterior distribution obtained with the various methods. According to our criteria, ABC with summary statistics chosen locally via boosting with the L(2)-loss performs best. Applying that method to the ibex data, we estimate θ(anc)≈ 1.288 and find that most of the variation across loci of the ancestral mutation rate u is between 7.7 × 10(-4) and 3.5 × 10(-3) per locus per generation. The proportion of males with access to matings is estimated as ω≈ 0.21, which is in good agreement with recent independent estimates.
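
    A sketch of boosting-based summary-statistic selection under the L2 loss, using scikit-learn's gradient boosting as a stand-in for the authors' implementation; the simulated parameters and candidate statistics are invented.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(3)

# ABC training set: draw parameters from the prior, simulate data, and
# compute candidate summary statistics (here: synthetic stand-ins).
n_sim, n_stats = 2_000, 12
theta = rng.uniform(0, 2, n_sim)             # parameter of interest
stats = rng.normal(size=(n_sim, n_stats))    # candidate summaries
stats[:, 0] += 2.0 * theta                   # only a few are informative
stats[:, 3] += 0.8 * theta**2

# Boosting with squared-error (L2) loss regresses theta on the candidate
# statistics; feature importances rank the statistics' informativeness.
gb = GradientBoostingRegressor(loss="squared_error", n_estimators=200)
gb.fit(stats, theta)
ranked = np.argsort(gb.feature_importances_)[::-1]
print("statistics ranked by importance:", ranked[:5])
```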

  12. Statistical models for expert judgement and wear prediction

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1994-01-01

    This thesis studies the statistical analysis of expert judgements and prediction of wear. The point of view adopted is the one of information theory and Bayesian statistics. A general Bayesian framework for analyzing both the expert judgements and wear prediction is presented. Information theoretic interpretations are given for some averaging techniques used in the determination of consensus distributions. Further, information theoretic models are compared with a Bayesian model. The general Bayesian framework is then applied in analyzing expert judgements based on ordinal comparisons. In this context, the value of information lost in the ordinal comparison process is analyzed by applying decision theoretic concepts. As a generalization of the Bayesian framework, stochastic filtering models for wear prediction are formulated. These models utilize the information from condition monitoring measurements in updating the residual life distribution of mechanical components. Finally, the application of stochastic control models in optimizing operational strategies for inspected components are studied. Monte-Carlo simulation methods, such as the Gibbs sampler and the stochastic quasi-gradient method, are applied in the determination of posterior distributions and in the solution of stochastic optimization problems. (orig.) (57 refs., 7 figs., 1 tab.)

  13. Statistical Measures for Usage-Based Linguistics

    Science.gov (United States)

    Gries, Stefan Th.; Ellis, Nick C.

    2015-01-01

    The advent of usage-/exemplar-based approaches has resulted in a major change in the theoretical landscape of linguistics, but also in the range of methodologies that are brought to bear on the study of language acquisition/learning, structure, and use. In particular, methods from corpus linguistics are now frequently used to study distributional…

  14. A Process Mining Based Service Composition Approach for Mobile Information Systems

    Directory of Open Access Journals (Sweden)

    Chengxi Huang

    2017-01-01

    Full Text Available Due to the growing trend of applying big data and cloud computing technologies in information systems, it is becoming an important issue to handle the connection between large-scale data and the associated business processes in the Internet of Everything (IoE) environment. Service composition, a widely used phase in system development, has some limits when the complexity of the relationships among data increases. Considering the expanding scale and variety of devices in mobile information systems, a process mining based service composition approach is proposed in this paper in order to improve the adaptiveness and efficiency of compositions. Firstly, preprocessing is conducted to extract existing service execution information from server-side logs. Then process mining algorithms are applied to discover the overall event sequence from the preprocessed data. After that, a scene-based service composition is applied to aggregate scene information and relocate services of the system. Finally, a case study applying the work in a mobile medical application shows that the approach is practical and valuable in improving service composition adaptiveness and efficiency.

  15. Materiality in a Practice-Based Approach

    Science.gov (United States)

    Svabo, Connie

    2009-01-01

    Purpose: The paper aims to provide an overview of the vocabulary for materiality which is used by practice-based approaches to organizational knowing. Design/methodology/approach: The overview is theoretically generated and is based on the anthology Knowing in Organizations: A Practice-based Approach edited by Nicolini, Gherardi and Yanow. The…

  16. A coupled thermo-mechanical pseudo inverse approach for preform design in forging

    Science.gov (United States)

    Thomas, Anoop Ebey; Abbes, Boussad; Li, Yu Ming; Abbes, Fazilay; Guo, Ying-Qiao; Duval, Jean-Louis

    2017-10-01

    Hot forging is a process used to form difficult-to-form materials as well as to achieve complex geometries. This is possible due to the reduction of yield stress at high temperatures and a subsequent increase in formability. Numerical methods have been used to predict the material yield and the stress/strain states of the final product. The Pseudo Inverse Approach (PIA), developed in the context of cold forming, provides a quick estimate of the stress and strain fields in the final product for a given initial shape. In this paper, PIA is extended to include the thermal effects on the forging process. A Johnson-Cook thermo-viscoplastic material law is considered, and a staggered scheme is employed for the coupling between the mechanical and thermal problems. The results are compared with available commercial codes to show the efficiency and the limitations of PIA.
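
    For reference, the Johnson-Cook thermo-viscoplastic law mentioned above multiplies strain-hardening, strain-rate, and thermal-softening terms. A sketch with placeholder material constants (not the paper's values):

```python
import numpy as np

def johnson_cook_stress(eps, eps_rate, T,
                        A=300e6, B=500e6, n=0.3, C=0.02, m=1.0,
                        eps_rate0=1.0, T_room=293.0, T_melt=1700.0):
    """Johnson-Cook flow stress (Pa); material constants are placeholders.

    sigma = (A + B*eps^n) * (1 + C*ln(eps_rate/eps_rate0)) * (1 - T*^m),
    with homologous temperature T* = (T - T_room) / (T_melt - T_room).
    """
    T_star = (T - T_room) / (T_melt - T_room)
    return (A + B * eps**n) * (1 + C * np.log(eps_rate / eps_rate0)) \
           * (1 - np.clip(T_star, 0.0, 1.0)**m)

# Thermal softening at forging temperature versus room temperature.
print(johnson_cook_stress(0.2, 10.0, 293.0) / 1e6, "MPa at 293 K")
print(johnson_cook_stress(0.2, 10.0, 1200.0) / 1e6, "MPa at 1200 K")
```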

  17. Feature-Based Statistical Analysis of Combustion Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion

  18. Chemical entity recognition in patents by combining dictionary-based and statistical approaches.

    Science.gov (United States)

    Akhondi, Saber A; Pons, Ewoud; Afzal, Zubair; van Haagen, Herman; Becker, Benedikt F H; Hettne, Kristina M; van Mulligen, Erik M; Kors, Jan A

    2016-01-01

    We describe the development of a chemical entity recognition system and its application in the CHEMDNER-patent track of BioCreative 2015. This community challenge includes a Chemical Entity Mention in Patents (CEMP) recognition task and a Chemical Passage Detection (CPD) classification task. We addressed both tasks by an ensemble system that combines a dictionary-based approach with a statistical one. For this purpose the performance of several lexical resources was assessed using Peregrine, our open-source indexing engine. We combined our dictionary-based results on the patent corpus with the results of tmChem, a chemical recognizer using a conditional random field classifier. To improve the performance of tmChem, we utilized three additional features, viz. part-of-speech tags, lemmas and word-vector clusters. When evaluated on the training data, our final system obtained an F-score of 85.21% for the CEMP task, and an accuracy of 91.53% for the CPD task. On the test set, the best system ranked sixth among 21 teams for CEMP with an F-score of 86.82%, and second among nine teams for CPD with an accuracy of 94.23%. The differences in performance between the best ensemble system and the statistical system separately were small. Database URL: http://biosemantics.org/chemdner-patents. © The Author(s) 2016. Published by Oxford University Press.

  19. A Game-Theoretic Approach to Information-Flow Control via Protocol Composition

    Directory of Open Access Journals (Sweden)

    Mário S. Alvim

    2018-05-01

    Full Text Available In the inference attacks studied in Quantitative Information Flow (QIF), the attacker typically tries to interfere with the system in the attempt to increase its leakage of secret information. The defender, on the other hand, typically tries to decrease leakage by introducing some controlled noise. This noise introduction can be modeled as a type of protocol composition, i.e., a probabilistic choice among different protocols, and its effect on the amount of leakage depends heavily on whether or not this choice is visible to the attacker. In this work, we consider operators for modeling visible and hidden choice in protocol composition, and we study their algebraic properties. We then formalize the interplay between defender and attacker in a game-theoretic framework adapted to the specific issues of QIF, where the payoff is information leakage. We consider various kinds of leakage games, depending on whether players act simultaneously or sequentially, and on whether or not the choices of the defender are visible to the attacker. In the case of sequential games, the choice of the second player is generally a function of the choice of the first player, and his/her probabilistic choice can be either over the possible functions (mixed strategy) or over the result of the function (behavioral strategy). We show that when the attacker moves first in a sequential game with a hidden choice, then behavioral strategies are more advantageous for the defender than mixed strategies. This contrasts with standard game theory, where the two types of strategies are equivalent. Finally, we establish a hierarchy of these games in terms of their information leakage and provide methods for finding optimal strategies (at the points of equilibrium) for both attacker and defender in the various cases.
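
    The visible/hidden distinction can be made concrete with posterior Bayes vulnerability, a standard QIF leakage measure. A toy computation with invented channels, showing that hiding the defender's choice can only help the defender:

```python
import numpy as np

# Prior over two secret values; two channels (rows: secrets, cols: observables).
prior = np.array([0.5, 0.5])
C1 = np.array([[0.9, 0.1],
               [0.2, 0.8]])
C2 = np.array([[0.1, 0.9],
               [0.8, 0.2]])   # an informative channel with outputs "flipped"
p = 0.5                       # defender picks C1 with probability p

def posterior_vulnerability(prior, C):
    # Bayes vulnerability after observing the channel output:
    # sum over outputs of the max joint probability.
    joint = prior[:, None] * C
    return joint.max(axis=0).sum()

# Visible choice: the attacker learns which channel was used.
v_visible = p * posterior_vulnerability(prior, C1) \
          + (1 - p) * posterior_vulnerability(prior, C2)

# Hidden choice: the attacker only sees the output of the mixed channel.
v_hidden = posterior_vulnerability(prior, p * C1 + (1 - p) * C2)

print(f"visible-choice vulnerability: {v_visible:.3f}")
print(f"hidden-choice  vulnerability: {v_hidden:.3f}")  # never larger
```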

  20. Event-based criteria in GT-STAF information indices: theory, exploratory diversity analysis and QSPR applications.

    Science.gov (United States)

    Barigye, S J; Marrero-Ponce, Y; Martínez López, Y; Martínez Santiago, O; Torrens, F; García Domenech, R; Galvez, J

    2013-01-01

    Versatile event-based approaches for the definition of novel information theory-based indices (IFIs) are presented. An event in this context is the criterion followed in the "discovery" of molecular substructures, which in turn serve as basis for the construction of the generalized incidence and relations frequency matrices, Q and F, respectively. From the resultant F, Shannon's, mutual, conditional and joint entropy-based IFIs are computed. In previous reports, an event named connected subgraphs was presented. The present study is an extension of this notion, in which we introduce other events, namely: terminal paths, vertex path incidence, quantum subgraphs, walks of length k, Sach's subgraphs, MACCs, E-state and substructure fingerprints and, finally, Ghose and Crippen atom-types for hydrophobicity and refractivity. Moreover, we define magnitude-based IFIs, introducing the use of the magnitude criterion in the definition of mutual, conditional and joint entropy-based IFIs. We also discuss the use of information-theoretic parameters as a measure of the dissimilarity of codified structural information of molecules. Finally, a comparison of the statistics for QSPR models obtained with the proposed IFIs and DRAGON's molecular descriptors for two physicochemical properties log P and log K of 34 derivatives of 2-furylethylenes demonstrates similar to better predictive ability than the latter.

  1. The role of information systems in management decision making - a theoretical approach

    Directory of Open Access Journals (Sweden)

    PhD. Associate Professor Department of Management & Informatics Mihane Berisha-Namani

    2010-12-01

    Full Text Available In modern conditions of globalisation and development of information technology, information processing activities have come to be seen as essential to the success of businesses and organizations. Information has become essential for making decisions and a crucial asset in organisations, whereas information systems are the technology required for information processing. The application of information systems technology in business and organisations has opened up new possibilities for running and managing organisations, and has improved management decision making. The purpose of this paper is to give an understanding of the role that information systems play in management decision making and to discuss how managers of organisations can make the best use of information systems. The paper starts by identifying the functions of management and managerial roles and continues with information systems usage at three levels of decision making. It specifically addresses the ways in which information systems can help managers reduce uncertainty in decision making and includes some important implications of information systems usage for managers. Thus, this study provides a framework for the effective use of information systems generally and offers an alternative approach to investigating the impact that information systems technology has on management decision making specifically

  2. Theoretical Approaches to Nuclear Proliferation

    Directory of Open Access Journals (Sweden)

    Konstantin S. Tarasov

    2015-01-01

    Full Text Available This article analyses discussions between representatives of three schools in the theory of international relations - realism, liberalism and constructivism - on the driving factors of nuclear proliferation. The paper examines major theoretical approaches, outlined in the studies of Russian and foreign scientists, to the causes of nuclear weapons development, while unveiling their advantages and limitations. Much of the article is devoted to alternative approaches, particularly the role of mathematical modeling in assessing proliferation risks. The analysis also reveals a variety of approaches to nuclear weapons acquisition, as well as the absence of a comprehensive proliferation theory. Based on the research results, the study uncovers major factors both favoring and impeding nuclear proliferation. The author shows that the lack of consensus between realists, liberals and constructivists on the nature of proliferation has led a number of scientists to attempt to explain the nuclear rationale by drawing on the insights of more than one school in the theory of IR. Detailed study of the proliferation puzzle contributes to a greater understanding of contemporary international realities, helps to identify mechanisms that are most likely to deter states from obtaining nuclear weapons, and is of the utmost importance in predicting the short- and long-term security environment. Furthermore, analysis of the existing scientific literature on nuclear proliferation helps to determine the future research agenda of the subject at hand.

  3. Stability of nanofluids: Molecular dynamic approach and experimental study

    International Nuclear Information System (INIS)

    Farzaneh, H.; Behzadmehr, A.; Yaghoubi, M.; Samimi, A.; Sarvari, S.M.H.

    2016-01-01

    Highlights: • Nanofluid stability is investigated and discussed. • A molecular dynamic approach, considering different forces on the nanoparticles, is adopted. • Stability diagrams are presented for different thermo-fluid conditions. • An experimental investigation is carried out to confirm the theoretical approach. - Abstract: Nanofluids as volumetric absorbent in solar energy conversion devices or as working fluid in different heat exchangers have been proposed by various researchers. However, dispersion stability of nanofluids is an important issue that must be well addressed before any industrial applications. Conditions such as severe temperature gradient, high temperature of heat transfer fluid, nanoparticle mean diameters and types of nanoparticles and base fluid are among the most effective parameters on the stability of nanofluid. A molecular dynamic approach, considering kinetic energy of nanoparticles and DLVO potential energy between nanoparticles, is adopted to study the nanofluid stability for different nanofluids at different working conditions. Different forces such as Brownian, thermophoresis, drag and DLVO are considered to introduce the stability diagrams. The latter presents the conditions for which a nanofluid can be stable. In addition an experimental investigation is carried out to find a stable nanofluid and to show the validity of the theoretical approach. There is a good agreement between the experimental and theoretical results that confirms the validity of our theoretical approach.
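
    The DLVO energy referred to above is the sum of van der Waals attraction and electric double-layer repulsion between nanoparticles. A rough sketch in the Derjaguin approximation for equal spheres, with all material parameters as illustrative placeholders for a water-based nanofluid:

```python
import numpy as np

kB, e, eps0 = 1.380649e-23, 1.602176634e-19, 8.8541878128e-12

def dlvo_potential(h, radius=25e-9, T=350.0, hamaker=1e-20,
                   zeta=0.03, ionic_strength=1e-3, eps_r=70.0):
    """Sphere-sphere DLVO potential (J) vs. surface separation h (m).

    Derjaguin approximation for equal spheres; all parameter values are
    illustrative placeholders, not measured nanofluid properties.
    """
    n0 = 1e3 * ionic_strength * 6.02214076e23                 # ions per m^3
    kappa = np.sqrt(2 * n0 * e**2 / (eps_r * eps0 * kB * T))  # inverse Debye length
    v_vdw = -hamaker * radius / (12 * h)                      # attraction
    v_edl = 2 * np.pi * eps_r * eps0 * radius * zeta**2 \
            * np.exp(-kappa * h)                              # repulsion
    return v_vdw + v_edl

# A nanofluid is commonly taken as kinetically stable when the energy
# barrier is large compared with the particles' thermal energy kT.
h = np.linspace(0.3e-9, 30e-9, 200)
barrier = dlvo_potential(h).max() / (kB * 350.0)
print(f"energy barrier ~ {barrier:.1f} kT")
```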

  4. A poly(glycerol sebacate) based photo/thermo dual curable biodegradable and biocompatible polymer for biomedical applications.

    Science.gov (United States)

    Wang, Min; Lei, Dong; Liu, Zenghe; Chen, Shuo; Sun, Lijie; Lv, Ziying; Huang, Peng; Jiang, Zhongxing; You, Zhengwei

    2017-10-01

    Due to its mechanical properties, which mimic those of soft tissues, along with excellent biocompatibility and biodegradability, poly(glycerol sebacate) (PGS) has emerged as a representative bioelastomer and been widely used in biomedical engineering. However, the typical curing of PGS needs high temperature (>120 °C), high vacuum (>1 Torr), and long duration (>12 h), which limits its further applications. Accordingly, we designed, synthesized and characterized a photo/thermo dual curable polymer based on PGS. Treatment of PGS with 2-isocyanatoethyl methacrylate without additional reagents readily produced a methacrylated PGS (PGS-IM). Photo-curing of PGS-IM for 10 min at room temperature using the salt leaching method efficiently produced porous scaffolds with a thickness up to 1 mm. PGS-IM was amenable to thermo-curing as well. The combination of photo and thermo curing provides a further way to modulate the properties of the resultant porous scaffolds. Interestingly, photo-cured scaffolds exhibited hierarchical porous structures carrying extensive micropores with diameters from several to hundreds of micrometers. All the scaffolds showed good elasticity and biodegradability. In addition, PGS-IM exhibited good compatibility with L929 fibroblast cells. We expect this new PGS-based biomaterial will have a wide range of biomedical applications.

  5. Influence of the Lubricant Thermo-Piezo-Viscous Property on Hydrostatic Bearings in Oil Hydraulics

    DEFF Research Database (Denmark)

    Johansen, Per; Roemer, Daniel Beck; Andersen, Torben O.

    2016-01-01

    adds to the discrepancy of such simple design approach. In this paper the hydrostatic pressure force calculation is reviewed in terms of thermohydrodynamic (THD) lubrication theory, and simple analytical approximations of the hydrostatic pressure force, incorporating the piezo-viscous and thermo...... of these analytical approximations are explored in order to clarify the limits of application. In conclusion, it is found that the spatial gradient of the thermal field on the bearing surface is the significant factor in the thermo-viscous effect on the hydrostatic pressure profile, which leads to the conclusion...... that design engineers need to understand the thermodynamics of hydrostatic bearings, when using the conventional simple analytical approach, neglecting thermo-piezo-viscosity, in hydrostatic pressure force calculations....

  6. Statistical Symbolic Execution with Informed Sampling

    Science.gov (United States)

    Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco

    2014-01-01

    Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space, and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis, and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
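
    As a rough illustration of the Bayesian estimation step described above, the sketch below maintains a Beta posterior over the probability of reaching a target event from Monte Carlo path samples, and folds in the probability mass of paths already resolved exactly and pruned. The `sample_path` callable and the toy 3% hit rate are assumptions for the example, not details of Symbolic PathFinder.

```python
import random
from scipy import stats

def estimate_target_probability(sample_path, n_samples=10_000,
                                exact_pruned_mass=0.0, alpha=1.0, beta=1.0):
    """Beta-Bernoulli estimate of the probability of reaching a target
    event. sample_path() draws one program path at random and returns
    True if the path reaches the target; exact_pruned_mass is the
    probability already accounted for by the partial exact analysis."""
    hits = sum(sample_path() for _ in range(n_samples))
    posterior = stats.beta(alpha + hits, beta + n_samples - hits)
    scale = 1.0 - exact_pruned_mass          # remaining (sampled) mass
    lo, hi = posterior.interval(0.95)        # 95% credible interval
    return posterior.mean() * scale, (lo * scale, hi * scale)

# Toy "program" that violates the assertion on 3% of sampled paths:
mean, ci = estimate_target_probability(lambda: random.random() < 0.03)
print(f"P(target) ~ {mean:.4f}, 95% CI {ci}")
```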

  7. Three-dimensionality of space and the quantum bit: an information-theoretic approach

    International Nuclear Information System (INIS)

    Müller, Markus P; Masanes, Lluís

    2013-01-01

    It is sometimes pointed out as a curiosity that the state space of quantum two-level systems, i.e. the qubit, and actual physical space are both three-dimensional and Euclidean. In this paper, we suggest an information-theoretic analysis of this relationship, by proving a particular mathematical result: suppose that physics takes place in d spatial dimensions, and that some events happen probabilistically (not assuming quantum theory in any way). Furthermore, suppose there are systems that carry ‘minimal amounts of direction information’, interacting via some continuous reversible time evolution. We prove that this uniquely determines spatial dimension d = 3 and quantum theory on two qubits (including entanglement and unitary time evolution), and that it allows observers to infer local spatial geometry from probability measurements. (paper)

  8. Information theory and statistics

    CERN Document Server

    Kullback, Solomon

    1968-01-01

    Highly useful text studies logarithmic measures of information and their application to testing statistical hypotheses. Includes numerous worked examples and problems. References. Glossary. Appendix. 1968 2nd, revised edition.

  9. A thermo-responsive and photo-polymerizable chondroitin sulfate-based hydrogel for 3D printing applications

    NARCIS (Netherlands)

    Abbadessa, A.; Blokzijl, M. M.; Mouser, V. H. M.; Marica, P.; Malda, J.; Hennink, W. E.; Vermonden, T.

    2016-01-01

    The aim of this study was to design a hydrogel system based on methacrylated chondroitin sulfate (CSMA) and a thermo-sensitive poly(N-(2-hydroxypropyl) methacrylamide-mono/dilactate)-polyethylene glycol triblock copolymer (M15P10) as a suitable material for additive manufacturing of scaffolds. CSMA

  10. A Markov game theoretic data fusion approach for cyber situational awareness

    Science.gov (United States)

    Shen, Dan; Chen, Genshe; Cruz, Jose B., Jr.; Haynes, Leonard; Kruger, Martin; Blasch, Erik

    2007-04-01

    This paper proposes an innovative data-fusion/data-mining game theoretic situation awareness and impact assessment approach for cyber network defense. Alerts generated by Intrusion Detection Sensors (IDSs) or Intrusion Prevention Sensors (IPSs) are fed into the data refinement (Level 0) and object assessment (L1) data fusion components. High-level situation/threat assessment (L2/L3) data fusion based on a Markov game model and Hierarchical Entity Aggregation (HEA) is proposed to refine the primitive predictions generated by adaptive feature/pattern recognition and to capture new unknown features. A Markov (stochastic) game method is used to estimate the belief of each possible cyber attack pattern. Game theory captures the nature of cyber conflicts: determination of the attacking-force strategies is tightly coupled to determination of the defense-force strategies and vice versa. Also, Markov game theory deals with the uncertainty and incompleteness of the available information. A software tool is developed to demonstrate the performance of the high-level information fusion for cyber network defense situation awareness, and a simulation example shows the enhanced understanding of cyber-network defense.
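
    The coupling between attacker and defender strategies noted in the abstract can be made concrete in the simplest single-stage case: a zero-sum matrix game whose equilibrium mixed strategy is found by linear programming. The sketch below is a generic solver under that simplification (the payoff matrix is hypothetical); the paper's Markov game adds states and transitions on top of this.

```python
import numpy as np
from scipy.optimize import linprog

def solve_zero_sum(payoff):
    """Defender's maximin mixed strategy over the rows of a zero-sum
    payoff matrix (payoff[i, j] = defender's gain when the defender
    plays row i and the attacker plays column j)."""
    m, n = payoff.shape
    c = np.zeros(m + 1)
    c[-1] = -1.0                                    # maximize game value v
    A_ub = np.hstack([-payoff.T, np.ones((n, 1))])  # v <= x' A e_j for all j
    b_ub = np.zeros(n)
    A_eq = np.ones((1, m + 1))
    A_eq[0, -1] = 0.0                               # probabilities sum to 1
    res = linprog(c, A_ub, b_ub, A_eq, np.ones(1),
                  bounds=[(0, None)] * m + [(None, None)])
    return res.x[:m], res.x[-1]

# Hypothetical 2-defense x 3-attack payoff matrix:
strategy, value = solve_zero_sum(np.array([[3.0, 0.0, 1.0],
                                           [1.0, 2.0, 0.0]]))
print(strategy, value)
```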

  11. Thermo-mechanical properties of polystyrene-based shape memory nanocomposites

    NARCIS (Netherlands)

    Xu, B.; Fu, Y.Q.; Ahmad, M.; Luo, J.K.; Huang, W.M.; Kraft, A.; Reuben, R.; Pei, Y.T.; Chen, Zhenguo; Hosson, J.Th.M. De

    2010-01-01

    Shape memory nanocomposites were fabricated using chemically cross-linked polystyrene (PS) copolymer as a matrix and different nanofillers (including alumina, silica and clay) as the reinforcing agents. Their thermo-mechanical properties and shape memory effects were characterized. Experimental

  12. The dynamics of alliances. A game theoretical approach

    NARCIS (Netherlands)

    Ridder, A. de

    2007-01-01

    In this dissertation, Annelies de Ridder presents a game theoretical approach to strategic alliances. More specifically, the dynamics of and within alliances have been studied. To do so, four new models have been developed in the game theoretical tradition. Both coalition theory and strategic game

  13. Information Retrieval and Graph Analysis Approaches for Book Recommendation

    Directory of Open Access Journals (Sweden)

    Chahinez Benkoussas

    2015-01-01

    Full Text Available A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models: probabilistic models such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach applied to a related-document network composed of social links. We call the Directed Graph of Documents (DGD) a network constructed from documents and the social information provided by each of them. Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrate that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.

  14. Information Retrieval and Graph Analysis Approaches for Book Recommendation.

    Science.gov (United States)

    Benkoussas, Chahinez; Bellot, Patrice

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models: probabilistic models such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach applied to a related-document network composed of social links. We call the Directed Graph of Documents (DGD) a network constructed from documents and the social information provided by each of them. Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrate that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.
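
    A minimal sketch of the two ingredients these twin records describe: a linear interpolation of two retrieval models' scores, and a PageRank re-ranking over a directed graph of documents built from social links. The weights `lam` and `mu`, and the use of networkx, are assumptions for illustration; the paper's exact fusion scheme may differ.

```python
import networkx as nx

def interpolated_score(doc, inl2_scores, lm_scores, lam=0.5):
    """Interpolated combination of two retrieval models' scores."""
    return lam * inl2_scores.get(doc, 0.0) + (1 - lam) * lm_scores.get(doc, 0.0)

def rerank_with_pagerank(run_scores, social_links, mu=0.8):
    """Re-rank an initial run by mixing its scores with PageRank values
    computed on the Directed Graph of Documents (DGD)."""
    dgd = nx.DiGraph(social_links)           # edges: (doc_a, doc_b)
    pr = nx.pagerank(dgd, alpha=0.85)
    return sorted(run_scores,
                  key=lambda d: mu * run_scores[d] + (1 - mu) * pr.get(d, 0.0),
                  reverse=True)

# Toy usage with three documents and two social links:
scores = {"b1": 0.9, "b2": 0.7, "b3": 0.6}
print(rerank_with_pagerank(scores, [("b2", "b3"), ("b1", "b3")]))
```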

  15. STRUCTURAL AND METHODICAL MODEL OF INCREASING THE LEVEL OF THEORETICAL TRAINING OF CADETS USING INFORMATION AND COMMUNICATION TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    Vladislav V. Bulgakov

    2018-03-01

    Full Text Available Features of training in higher educational institutions of the EMERCOM of Russia system demand the introduction of new educational techniques and technical means directed at intensifying the educational process, providing cadets the opportunity to prepare independently at any time and improving the quality of their theoretical knowledge. The authors have developed a structural and methodical model for increasing the level of theoretical training of cadets using information and communication technologies. The proposed structural and methodical model, which includes elements to stimulate and enhance cognitive activity, makes it possible to shape the trajectory of the cadets' theoretical training for the entire period of study at the university and to organize systematic independent work as well as objective current and final control of theoretical knowledge. The structural and methodical model for improving the level of theoretical training consists of three main elements: the base of theoretical questions and the functional modules "teacher" and "cadet". The basis of the model is the base of theoretical questions, developed for all disciplines of specialty 20.05.01 (fire safety). The functional module "teacher" allows instructors to create theoretical questions of various kinds, edit questions and delete them from the database if necessary, as well as create tests and monitor their completion. The functional module "cadet" provides ample opportunities for theoretical training through independent work, testing for current and final control, a game form of training in the form of a duel, and the presentation of cadets' results in the form of statistics and rankings. The structural and methodical model for increasing the level of theoretical training of cadets has been implemented in practice in the form of a multi-level automated system

  16. Clinical application of transcatheter arterial thermo-chemotherapy and thermo-lipiodol embolization in treatment of hepatocellular carcinoma

    International Nuclear Information System (INIS)

    Wang Xuan; Chen Xiaofei; Dong Weihua

    2007-01-01

    Objective: To evaluate the clinical efficacy of thermo-chemotherapy and thermo-lipiodol embolization in the treatment of primary hepatocellular carcinoma (PHC). Methods: One hundred and sixteen cases of PHC were divided into three groups. Group A (38 cases) was treated with normal-temperature chemotherapy and normal-temperature lipiodol, group B (40 cases) with thermo-chemotherapy and normal-temperature lipiodol, and group C (38 cases) with thermo-chemotherapy and thermo-lipiodol. Groups B and C together were called the thermotherapy groups. Results: In the thermotherapy groups, the rates of tumor size reduction were significantly greater than those in the normal group. There were no significant differences in hepatic function tests among the three groups. The 6-, 12-, 18-, and 24-month survival rates of the normal group and the thermotherapy groups were 97%, 58%, 39% and 18%, versus 99%, 79%, 57% and 36%, respectively. No significant differences were found in the rates of tumor size reduction and survival rates between group B and group C. Conclusion: Thermo-chemotherapy and thermo-embolization have a significant effect on PHC without conspicuous damage to liver function. (authors)

  17. Gauging Skills of Hospital Security Personnel: a Statistically-driven, Questionnaire-based Approach.

    Science.gov (United States)

    Rinkoo, Arvind Vashishta; Mishra, Shubhra; Rahesuddin; Nabi, Tauqeer; Chandra, Vidha; Chandra, Hem

    2013-01-01

    This study aims to gauge the technical and soft skills of hospital security personnel so as to enable prioritization of their training needs. A cross-sectional questionnaire-based study was conducted in December 2011. Two separate predesigned and pretested questionnaires were used for gauging the soft skills and technical skills of the security personnel. Extensive statistical analysis, including multivariate analysis (Pillai-Bartlett trace along with multi-factorial ANOVA) and post-hoc tests (Bonferroni test), was applied. The 143 participants performed better on the soft skills front, with an average score of 6.43 and a standard deviation of 1.40. The average technical skills score was 5.09 with a standard deviation of 1.44. The study revealed a need for formal hands-on training, with greater emphasis on technical skills. Multivariate analysis of the available data further helped in identifying 20 security personnel who should be prioritized for soft skills training and a group of 36 security personnel who should receive maximum attention during technical skills training. This statistically driven approach can be used as a prototype by healthcare delivery institutions worldwide, after situation-specific customizations, to identify the training needs of any category of healthcare staff.
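
    The post-hoc step in such an analysis often reduces to pairwise comparisons at a Bonferroni-adjusted significance level. A small sketch of that step (group labels and scores are made up; the paper's actual factors and software are not specified here):

```python
from itertools import combinations
from scipy import stats

def bonferroni_pairwise(groups, alpha=0.05):
    """Pairwise two-sample t-tests with a Bonferroni-adjusted alpha.
    groups maps a group label to a list of skill scores."""
    pairs = list(combinations(groups, 2))
    adj_alpha = alpha / len(pairs)        # Bonferroni correction
    return adj_alpha, {
        (a, b): stats.ttest_ind(groups[a], groups[b]).pvalue
        for a, b in pairs
    }

# Hypothetical soft-skill scores for three duty posts:
adj, pvals = bonferroni_pairwise({
    "gate": [6.1, 6.8, 5.9, 7.0],
    "ward": [5.2, 5.5, 6.0, 5.1],
    "icu":  [6.9, 7.2, 6.5, 7.1]})
print(adj, {k: p < adj for k, p in pvals.items()})
```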

  18. Self-cleaned electrochemical protein imprinting biosensor basing on a thermo-responsive memory hydrogel.

    Science.gov (United States)

    Wei, Yubo; Zeng, Qiang; Hu, Qiong; Wang, Min; Tao, Jia; Wang, Lishi

    2018-01-15

    Herein, a self-cleaned electrochemical protein imprinting biosensor based on a thermo-responsive memory hydrogel was constructed on a glassy carbon electrode (GCE) with a free radical polymerization method. Combining the advantages of thermo-responsive molecularly imprinted polymers and electrochemistry, the resulting biosensor presents a novel self-cleaning ability for bovine serum albumin (BSA) in aqueous media. Acting as a temperature-controlled gate, the hydrogel film undergoes the adsorption and desorption of BSA based on a reversible structural change under external temperature stimuli. In particular, these processes have been revealed by the response of cyclic voltammetry (CV) and electrochemical impedance spectroscopy (EIS) of the electroactive [Fe(CN)6]3-/4- couple. The results are supported by evidence from scanning electron microscopy (SEM) and contact angle measurements. Under the optimal conditions, a wide detection range from 0.02 μmol L-1 to 10 μmol L-1 with a detection limit of 0.012 μmol L-1 (S/N = 3) was obtained for BSA. The proposed BSA sensor also possesses high selectivity, excellent stability, acceptable recovery and good reproducibility in its practical applications. Copyright © 2017. Published by Elsevier B.V.

  19. Thermo-economic design optimization of parabolic trough solar plants for industrial process heat applications with memetic algorithms

    International Nuclear Information System (INIS)

    Silva, R.; Berenguel, M.; Pérez, M.; Fernández-Garcia, A.

    2014-01-01

    Highlights: • A thermo-economic optimization of a parabolic-trough solar plant for industrial process heat applications is developed. • An analysis of the influence of economic cost functions on optimal design point location is presented. • A multi-objective optimization approach to the design routine is proposed. • A sensitivity analysis of the optimal point location to economic, operational, and ambient conditions is developed. • Design optimization of a parabolic trough plant for a reference industrial application is developed. - Abstract: A thermo-economic design optimization of a parabolic trough solar plant for industrial processes with memetic algorithms is developed. The design domain variables considered in the optimization routine are the number of collectors in series, the number of collector rows, the row spacing, and the storage volume. Life cycle savings, levelized cost of energy, and payback time objective functions are compared to study their influence on optimal design point location. Furthermore, a multi-objective optimization approach is proposed to analyze the design problem from a multi-criteria economic point of view. An extensive set of optimization cases is performed to estimate the influence of fuel price trend, plant location, demand profile, operation conditions, solar field orientation, and radiation uncertainty on the optimal design. The results show that thermo-economic design optimization based on short-term criteria such as payback time leads to smaller plants with higher solar field efficiencies and smaller solar fractions, while optimization criteria based on the long-term performance of the plants, such as life cycle savings, lead to the reverse conclusion. The role of plant location and the future evolution of gas prices in the thermo-economic performance of the solar plant have also been analyzed. Thermo-economic optimization of a parabolic trough solar plant design for the reference industrial
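
    A memetic algorithm is a genetic search whose offspring are refined by local search before rejoining the population. The sketch below shows that loop over the four design variables named in the abstract, with a black-box `cost` standing in for the plant's thermo-economic objective; the operators, rates, and the continuous treatment of integer variables are all simplifying assumptions.

```python
import random

def memetic_optimize(cost, bounds, pop_size=30, generations=200,
                     mutation=0.1, local_steps=5):
    """Minimal memetic algorithm: genetic loop plus hill-climbing local
    search. `cost` would be a thermo-economic objective (e.g., levelized
    cost of energy) evaluated by a plant model; bounds give the ranges
    of the design variables (collectors in series, rows, row spacing,
    storage volume), treated as continuous here for simplicity."""
    def clip(x):
        return [min(max(v, lo), hi) for v, (lo, hi) in zip(x, bounds)]

    def local_search(x):                      # the "memetic" refinement
        best, best_c = x, cost(x)
        for _ in range(local_steps):
            cand = clip([v + random.gauss(0, mutation) for v in best])
            c = cost(cand)
            if c < best_c:
                best, best_c = cand, c
        return best

    pop = [clip([random.uniform(lo, hi) for lo, hi in bounds])
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = clip([(u + v) / 2 + random.gauss(0, mutation)
                          for u, v in zip(a, b)])
            children.append(local_search(child))  # refine before inserting
        pop = parents + children
    return min(pop, key=cost)

# Toy usage: four design variables with a made-up convex objective.
best = memetic_optimize(lambda x: sum((v - 3.0) ** 2 for v in x),
                        bounds=[(1, 10), (1, 50), (5, 20), (0, 100)])
print(best)
```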

  20. A Game Theoretic Approach to Nuclear Security Analysis against Insider Threat

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kyonam; Kim, So Young; Yim, Mansung [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Schneider, Erich [Univ. of Texas at Austin, Texas (United States)

    2014-05-15

    As individuals with authorized access to a facility and system who use their trusted position for unauthorized purposes, insiders are able to take advantage of their access rights and knowledge of a facility to bypass dedicated security measures. They can also capitalize on their knowledge to exploit any vulnerabilities in safety-related systems, with cyber security of safety-critical information technology systems offering an important example of the 3S interface. While this Probabilistic Risk Assessment (PRA) approach is appropriate for describing fundamentally random events like component failure of a safety system, it does not capture the adversary's intentions, nor does it account for adversarial response and adaptation to defensive investments. To address these issues of intentionality and interactions, this study adopts a game theoretic approach. The interaction between defender and adversary is modeled as a two-person Stackelberg game. The optimal strategy of both players is found from the equilibrium of this game. A defender strategy consists of a set of design modifications and/or post-construction security upgrades. An attacker strategy involves selection of a target as well as a pathway to that target. In this study, application of the game theoretic approach is demonstrated using a simplified test case problem. Novel to our approach is the modeling of insider threat that affects the non-detection probability of an adversary. The game-theoretic approach has the advantage of modelling an intelligent adversary who has an intention and complete knowledge of the facility. In this study, we analyzed the expected adversarial path and security upgrades with a limited budget with insider threat modeled as increasing the non-detection probability. Our test case problem categorized three groups of adversary paths assisted by insiders and derived the largest insider threat in terms of the budget for security upgrades. Certainly more work needs to be done to

  1. A Game Theoretic Approach to Nuclear Security Analysis against Insider Threat

    International Nuclear Information System (INIS)

    Kim, Kyonam; Kim, So Young; Yim, Mansung; Schneider, Erich

    2014-01-01

    As individuals with authorized access to a facility and system who use their trusted position for unauthorized purposes, insiders are able to take advantage of their access rights and knowledge of a facility to bypass dedicated security measures. They can also capitalize on their knowledge to exploit any vulnerabilities in safety-related systems, with cyber security of safety-critical information technology systems offering an important example of the 3S interface. While this Probabilistic Risk Assessment (PRA) approach is appropriate for describing fundamentally random events like component failure of a safety system, it does not capture the adversary's intentions, nor does it account for adversarial response and adaptation to defensive investments. To address these issues of intentionality and interactions, this study adopts a game theoretic approach. The interaction between defender and adversary is modeled as a two-person Stackelberg game. The optimal strategy of both players is found from the equilibrium of this game. A defender strategy consists of a set of design modifications and/or post-construction security upgrades. An attacker strategy involves selection of a target as well as a pathway to that target. In this study, application of the game theoretic approach is demonstrated using a simplified test case problem. Novel to our approach is the modeling of insider threat that affects the non-detection probability of an adversary. The game-theoretic approach has the advantage of modelling an intelligent adversary who has an intention and complete knowledge of the facility. In this study, we analyzed the expected adversarial path and security upgrades with a limited budget with insider threat modeled as increasing the non-detection probability. Our test case problem categorized three groups of adversary paths assisted by insiders and derived the largest insider threat in terms of the budget for security upgrades. Certainly more work needs to be done to
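
    The two-person Stackelberg structure described in these two records can be sketched by brute-force enumeration for small cases: the defender commits to an upgrade set within budget, the attacker then best-responds with the path of highest success (non-detection) probability, and the defender keeps the commitment that minimizes that best response. All names and call signatures below are illustrative assumptions, not the paper's model.

```python
from itertools import combinations

def stackelberg_defense(upgrades, paths, success_prob, cost, budget):
    """Enumerative Stackelberg sketch: success_prob(path, chosen) gives
    the attacker's success probability under the chosen upgrade set;
    cost[u] is the cost of upgrade u."""
    best_set, best_outcome = frozenset(), 1.0
    for r in range(len(upgrades) + 1):
        for chosen in combinations(upgrades, r):
            if sum(cost[u] for u in chosen) > budget:
                continue                       # infeasible commitment
            # Attacker best response: most attractive remaining path.
            attacker_value = max(success_prob(p, chosen) for p in paths)
            if attacker_value < best_outcome:
                best_set, best_outcome = frozenset(chosen), attacker_value
    return best_set, best_outcome

# Toy usage: two upgrades, two paths, additive detection effects.
cost = {"cameras": 2, "portal": 3}
effect = {("p1", "cameras"): 0.4, ("p2", "cameras"): 0.1,
          ("p1", "portal"): 0.1, ("p2", "portal"): 0.5}
prob = lambda p, chosen: max(
    0.9 - sum(effect.get((p, u), 0.0) for u in chosen), 0.05)
print(stackelberg_defense(list(cost), ["p1", "p2"], prob, cost, budget=5))
```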

  2. Information dynamics and open systems classical and quantum approach

    CERN Document Server

    Ingarden, R S; Ohya, M

    1997-01-01

    This book aims to present an information-theoretical approach to thermodynamics and its generalisations On the one hand, it generalises the concept of `information thermodynamics' to that of `information dynamics' in order to stress applications outside thermal phenomena On the other hand, it is a synthesis of the dynamics of state change and the theory of complexity, which provide a common framework to treat both physical and nonphysical systems together Both classical and quantum systems are discussed, and two appendices are included to explain principal definitions and some important aspects of the theory of Hilbert spaces and operator algebras The concept of higher-order temperatures is explained and applied to biological and linguistic systems The theory of open systems is presented in a new, much more general form Audience This volume is intended mainly for theoretical and mathematical physicists, but also for mathematicians, experimental physicists, physical chemists, theoretical biologists, communicat...

  3. Rweb:Web-based Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Jeff Banfield

    1999-03-01

    Full Text Available Rweb is a freely accessible statistical analysis environment that is delivered through the World Wide Web (WWW). It is based on R, a well known statistical analysis package. The only requirement to run the basic Rweb interface is a WWW browser that supports forms. If you want graphical output you must, of course, have a browser that supports graphics. The interface provides access to WWW accessible data sets, so you may run Rweb on your own data. Rweb can provide a four-window statistical computing environment (code input, text output, graphical output, and error information) through browsers that support Javascript. There is also a set of point and click modules under development for use in introductory statistics courses.

  4. Theoretical approaches to social innovation – A critical literature review

    NARCIS (Netherlands)

    Butzin, A.; Davis, A.; Domanski, D.; Dhondt, S.; Howaldt, J.; Kaletka, C.; Kesselring, A.; Kopp, R.; Millard, J.; Oeij, P.; Rehfeld, D.; Schaper-Rinkel, P.; Schwartz, M.; Scoppetta, A.; Wagner-Luptacik, P.; Weber, M.

    2014-01-01

    The SI-DRIVE report “Theoretical approaches to Social Innovation – A Critical Literature Review” delivers a comprehensive overview on the state of the art of theoretically relevant building blocks for advancing a theoretical understanding of social innovation. It collects different theoretical

  5. A theoretical approach to artificial intelligence systems in medicine.

    Science.gov (United States)

    Spyropoulos, B; Papagounos, G

    1995-10-01

    The various theoretical models of disease, the nosology accepted by the medical community and the prevalent logic of diagnosis determine both the medical approach as well as the development of the relevant technology, including the structure and function of the A.I. systems involved. A.I. systems in medicine, in addition to the specific parameters which enable them to reach a diagnostic and/or therapeutic proposal, implicitly entail theoretical assumptions and socio-cultural attitudes which prejudice the orientation and the final outcome of the procedure. The various models (causal, probabilistic, case-based, etc.) are critically examined and their ethical and methodological limitations are brought to light. The lack of a self-consistent theoretical framework in medicine, the multi-faceted character of the human organism as well as the non-explicit nature of the theoretical assumptions involved in A.I. systems restrict them to the role of decision-supporting "instruments" rather than decision-making "devices". This supporting role and, especially, the important function which A.I. systems should have in the structure, the methods and the content of medical education underscore the need for further research into the theoretical aspects and the actual development of such systems.

  6. Enhanced Thermo-Optical Switching of Paraffin-Wax Composite Spots under Laser Heating.

    Science.gov (United States)

    Said, Asmaa; Salah, Abeer; Fattah, Gamal Abdel

    2017-05-12

    Thermo-optical switches are of particular significance in communications networks where increasingly high switching speeds are required. Phase change materials (PCMs), in particular those based on paraffin wax, provide a wealth of exciting applications with unusual thermally-induced switching properties, limited only by paraffin's rather low thermal conductivity. In this paper, the use of different carbon fillers as thermal conductivity enhancers for paraffin has been investigated, and a novel structure based on a spot of paraffin wax as a thermo-optic switch is presented. Thermo-optical switching parameters are enhanced with the addition of graphite and graphene, due to the extreme thermal conductivity of the carbon fillers. Differential scanning calorimetry (DSC) and scanning electron microscopy (SEM) are performed on the paraffin wax composites, and specific heat capacities are calculated based on the DSC measurements. Thermo-optical switching based on transmission is measured as a function of the host concentration under conventional electric heating and laser heating of the paraffin-carbon filler composites. Further enhancements in thermo-optical switching parameters are studied under Nd:YAG laser heating. This novel structure can be used in future networks with huge bandwidth requirements and in electric-noise-free remote aerial laser switching applications.

  7. Information Theoretic-Learning Auto-Encoder

    OpenAIRE

    Santana, Eder; Emigh, Matthew; Principe, Jose C

    2016-01-01

    We propose Information Theoretic-Learning (ITL) divergence measures for variational regularization of neural networks. We also explore ITL-regularized autoencoders as an alternative to variational autoencoding Bayes, adversarial autoencoders and generative adversarial networks for randomly generating sample data without explicitly defining a partition function. This paper also formalizes generative moment matching networks under the ITL framework.
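
    One standard ITL divergence is the Cauchy-Schwarz divergence estimated with Parzen windows over sample batches; a regularizer of this kind could be added to an autoencoder's reconstruction loss to pull encoded samples toward a prior. A minimal numpy sketch, assuming a Gaussian kernel of width `sigma` (the paper's exact divergence and training setup are not reproduced here):

```python
import numpy as np

def gaussian_gram(x, y, sigma):
    """Gram matrix of a Gaussian kernel between two sample batches."""
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def cauchy_schwarz_divergence(x, y, sigma=1.0):
    """Parzen-window estimator of the Cauchy-Schwarz divergence; the
    kernel normalization constants cancel in the ratio."""
    vxx = gaussian_gram(x, x, sigma).mean()   # information potential of x
    vyy = gaussian_gram(y, y, sigma).mean()
    vxy = gaussian_gram(x, y, sigma).mean()   # cross information potential
    return -np.log(vxy ** 2 / (vxx * vyy))

# Toy check: divergence grows as two Gaussian batches are separated.
rng = np.random.default_rng(0)
z = rng.normal(0.0, 1.0, size=(256, 2))
for shift in (0.0, 1.0, 2.0):
    zp = rng.normal(shift, 1.0, size=(256, 2))
    print(shift, cauchy_schwarz_divergence(z, zp))
```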

  8. Radiotherapy problem under fuzzy theoretic approach

    International Nuclear Information System (INIS)

    Ammar, E.E.; Hussein, M.L.

    2003-01-01

    A fuzzy set theoretic approach is used for the radiotherapy problem. The problem involves two goals: the first is to maximize the fraction of surviving normal cells and the second is to minimize the fraction of surviving tumor cells. The theory of fuzzy sets has been employed to formulate and solve the problem. A linguistic variable approach is used for treating the first goal. The solutions obtained by the modified approach are always efficient and represent the best compromise. A sensitivity analysis of the solutions with respect to the differential weights is given

  9. A statistical approach to plasma profile analysis

    International Nuclear Information System (INIS)

    Kardaun, O.J.W.F.; McCarthy, P.J.; Lackner, K.; Riedel, K.S.

    1990-05-01

    A general statistical approach to the parameterisation and analysis of tokamak profiles is presented. The modelling of the profile dependence on both the radius and the plasma parameters is discussed, and pertinent estimation methods, classical as well as robust, are reviewed. Special attention is given to statistical tests for discriminating between the various models, and to the construction of confidence intervals for the parameterised profiles and the associated global quantities. The statistical approach is shown to provide a rigorous basis for the empirical testing of plasma profile invariance. (orig.)

  10. A thermo-responsive and photo-polymerizable chondroitin sulfate-based hydrogel for 3D printing applications

    NARCIS (Netherlands)

    Abbadessa, A.; Blokzijl, M. M.; Mouser, V. H. M.; Marica, P.; Malda, J.; Hennink, W. E.; Vermonden, T.

    2016-01-01

    The aim of this study was to design a hydrogel system based on methacrylated chondroitin sulfate (CSMA) and a thermo-sensitive poly(N-(2-hydroxypropyl) methacrylamide-mono/dilactate)-polyethylene glycol triblock copolymer (M15P10) as a suitable material for additive manufacturing of scaffolds. CSMA

  11. A Theoretical Explanation of Marital Conflicts by Paradigmatic Approach

    Directory of Open Access Journals (Sweden)

    اسماعیل جهانی دولت آباد

    2017-06-01

    Full Text Available Due to the economic, social and cultural changes of recent decades, and the consequent alterations in the form and duties of families and in individuals' expectations of marriage, the institutions of family and marriage face far more challenges and conflicts than in past years. Fragile marital relationships, conflicts and divorce are results of this situation in Iran. Accordingly, the present study, designed as a meta-analysis and deduction based on concept analysis and reconceptualization of recent studies, seeks to put forward a different, more adequate paradigm for explaining marital conflicts. This paradigm relies on various theoretical approaches, particularly the theory of symbolic interactionism as the main explanatory means, and applies the concept of “Marital Paradigm” as the missing element in previous studies in this field. It explains marital conflicts between couples as paradigmatic conflicts; its main idea is that marital conflict is not the result of one or more fixed and specified factors, but the product of an encounter between opposing (or different) paradigms.

  12. Sonication-Induced Modification of Carbon Nanotubes: Effect on the Rheological and Thermo-Oxidative Behaviour of Polymer-Based Nanocomposites.

    Science.gov (United States)

    Arrigo, Rossella; Teresi, Rosalia; Gambarotti, Cristian; Parisi, Filippo; Lazzara, Giuseppe; Dintcheva, Nadka Tzankova

    2018-03-05

    The aim of this work is the investigation of the effect of ultrasound treatment on the structural characteristics of carbon nanotubes (CNTs) and the consequent influence that the shortening induced by sonication exerts on the morphology, rheological behaviour and thermo-oxidative resistance of ultra-high molecular weight polyethylene (UHMWPE)-based nanocomposites. First, CNTs have been subjected to sonication for different time intervals, and the spectroscopic and morphological analyses performed reveal that a dramatic decrease of the CNTs' original length occurs with increased sonication time. The reduction of the initial length of the CNTs strongly affects the nanocomposite rheological behaviour, which progressively changes from solid-like to liquid-like as the CNT sonication time increases. The study of the thermo-oxidative behaviour of the investigated nanocomposites reveals that CNT sonication has a detrimental effect on the thermo-oxidative stability of the nanocomposites, especially for long exposure times. The worsening of the thermo-oxidative resistance of the nanocomposites containing sonicated CNTs could be attributed to the lower thermal conductivity of low-aspect-ratio CNTs, which causes an increase of the local temperature at the polymer/nanofiller interphase, with the consequent acceleration of the degradative phenomena.

  13. Sonication-Induced Modification of Carbon Nanotubes: Effect on the Rheological and Thermo-Oxidative Behaviour of Polymer-Based Nanocomposites

    Science.gov (United States)

    Teresi, Rosalia; Gambarotti, Cristian; Dintcheva, Nadka Tzankova

    2018-01-01

    The aim of this work is the investigation of the effect of ultrasound treatment on the structural characteristics of carbon nanotubes (CNTs) and the consequent influence that the shortening induced by sonication exerts on the morphology, rheological behaviour and thermo-oxidative resistance of ultra-high molecular weight polyethylene (UHMWPE)-based nanocomposites. First, CNTs have been subjected to sonication for different time intervals, and the spectroscopic and morphological analyses performed reveal that a dramatic decrease of the CNTs' original length occurs with increased sonication time. The reduction of the initial length of the CNTs strongly affects the nanocomposite rheological behaviour, which progressively changes from solid-like to liquid-like as the CNT sonication time increases. The study of the thermo-oxidative behaviour of the investigated nanocomposites reveals that CNT sonication has a detrimental effect on the thermo-oxidative stability of the nanocomposites, especially for long exposure times. The worsening of the thermo-oxidative resistance of the nanocomposites containing sonicated CNTs could be attributed to the lower thermal conductivity of low-aspect-ratio CNTs, which causes an increase of the local temperature at the polymer/nanofiller interphase, with the consequent acceleration of the degradative phenomena. PMID:29510595

  14. Kinetic and theoretical studies of novel biodegradable thermo-sensitive xerogels based on PEG/PVP/silica for sustained release of enrofloxacin

    Science.gov (United States)

    Ebadi, Azra; Rafati, Amir Abbas; Bavafa, Sadeghali; Mohammadi, Masoumah

    2017-12-01

    This study involves the synthesis of a new silica-based colloidal hybrid system. In this new hybrid system, poly(ethylene glycol) (PEG) and the thermo-sensitive amphiphilic biocompatible poly(vinyl pyrrolidone) (PVP) were used to create suitable storage for hydrophobic drugs. The possibility of using variable PVP/PEG molar ratios to modulate the drug release rate from silica nanoparticles was a primary goal of the current research. In addition, an investigation of the drug release kinetics was conducted. To achieve this, silica nanoparticles were synthesized in poly(ethylene glycol) (PEG) and poly(vinyl pyrrolidone) (PVP) solution incorporated with enrofloxacin (EFX) (as a model hydrophobic drug), using a simple synthetic strategy for hybrid materials which avoided waste and multi-step processes. The impacts of PVP/PEG molar ratios, temperature, and pH of the release medium on the release kinetics were investigated. The physicochemical properties of the drug-loaded composites were studied by Fourier transform infrared (FT-IR) spectra, scanning electron microscopy (SEM), and thermogravimetric analysis (TGA). In vitro drug release studies demonstrated that the drug release rate, which was evaluated by analyzing the experimental data with seven kinetic models in a primarily non-Fickian diffusion-controlled process, aligned well with both the Ritger-Peppas and Sahlin-Peppas equations.
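
    The Ritger-Peppas model mentioned above is the power law M_t/M_inf = k * t^n, where the exponent n distinguishes Fickian from anomalous (non-Fickian) transport. A sketch of fitting it to cumulative release data with scipy (the data points below are invented for illustration, not the paper's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

def ritger_peppas(t, k, n):
    """Ritger-Peppas power law: fractional release M_t/M_inf = k * t**n."""
    return k * t ** n

# Hypothetical release data: time (h) and cumulative fraction released.
t = np.array([1, 2, 4, 8, 12, 24], dtype=float)
frac = np.array([0.18, 0.27, 0.40, 0.58, 0.68, 0.85])

(k, n), _ = curve_fit(ritger_peppas, t, frac, p0=(0.1, 0.5))
print(f"k = {k:.3f}, n = {n:.2f}")
# For spheres, n well above ~0.43 indicates anomalous (non-Fickian) transport.
```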

  15. Supramolecular structure, phase behavior and thermo-rheological properties of a poly (L-lactide-co-ε-caprolactone) statistical copolymer.

    Science.gov (United States)

    Ugartemendia, Jone M; Muñoz, M E; Santamaria, A; Sarasua, J R

    2015-08-01

    PLAcoCL samples, both unaged, termed PLAcoCLu, and aged over time, termed PLAcoCLa, were prepared and analyzed to study the phase structure, morphology, and their evolution under non-quiescent conditions. X-ray diffraction, differential scanning calorimetry and atomic force microscopy were complemented with thermo-rheological measurements to reveal that PLAcoCL evolves over time from a single amorphous metastable state to a three-phase system, made up of two compositionally different amorphous phases and a crystalline phase. The supramolecular arrangements developed during aging lead to complex rheological behaviour in the PLAcoCLa copolymer: around Tt = 131 °C, thermo-rheological complexity and a peculiar reduction in chain mobility were observed, but at T > Tt the thermo-rheological response of a homogeneous system was recorded. In comparison with the latter, the PLLA/PCL 70:30 physical blend counterpart showed double amorphous phase behaviour at all temperatures, supporting the hypothesis that phase separation in the PLAcoCLa copolymer is caused by the crystallization of polylactide segment blocks during aging. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. SYSTEMATIZATION OF SCIENTIFIC APPROACHES TO THE INTERPRETATION OF INFORMATION ECONOMY

    Directory of Open Access Journals (Sweden)

    Nataliya Kholiavko

    2017-09-01

    Full Text Available The purpose of the article is to analyse and systematize scientific approaches to the interpretation of the essence of the information economy. The research object: scientific approaches to the interpretation of the essence of the concept of “information economy”. The theoretical and methodological base of the research consists of works by D. Bell, Z. Brzezinski, J. Baudrillard, M. Castells, Yoneji Masuda, F. Machlup, M. Porat and A. Toffler devoted to the development of the information society. Analysis of recent research and publications reveals growing scientific interest in the formation of an information-type economy. The essence and features of the information economy are considered in the scientific papers of scholars such as Hrynkevych S., Iliash A., Krystynevych S., Malyk I., Nikolaiev Ye., Bazhal Yu., Tolstiakov R., Shkarlet S., Fedulova L., Chukhno A., and others. Research methods: analysis, content analysis, synthesis, system method. Growing scientific interest in the formation of the information economy has led to an increase in the number of publications on this topic; the plurality of scientific approaches to the essence of the information economy underscores the need for their systematization. The theory of the information economy logically follows from, and is a component of, the theory of the development of the information society. Alongside it, there is an approach in the scientific literature according to which scholars identify the concepts of information society and information economy. In our opinion, this approach is not well balanced, since the term “information society” is inherently wider than “information economy”. The latter can be considered an inherent component of the information society, whose development is determined by a number of specific factors. In other words, these terms are closely interrelated and they should be investigated in the context of links between them

  17. Construction of database server system for fuel thermo-physical properties

    International Nuclear Information System (INIS)

    Park, Chang Je; Kang, Kwon Ho; Song, Kee Chan

    2003-12-01

    To evaluate various fuels in nuclear reactors, not only mechanical properties but also thermo-physical properties are required as some of the most important inputs for fuel performance code systems. The main objective of this study is to build a database system for fuel thermo-physical properties; a PC-based hardware system has been constructed for easy public use, with visualization provided through a web-based server system. This report deals with the hardware and software used in the database server system for nuclear fuel thermo-physical properties. Opening the database of fuel properties to the public is expected to make nuclear fuel data much easier to obtain, and to support the research and development of various fuels in the nuclear industry. Furthermore, the proposed models of nuclear fuel thermo-physical properties will be readily usable in fuel performance code systems.

  18. A holistic approach to thermodynamic analysis of photo-thermo-electrical processes in a photovoltaic cell

    International Nuclear Information System (INIS)

    Bicer, Yusuf; Dincer, Ibrahim; Zamfirescu, Calin

    2016-01-01

    Highlights: • A novel approach for energy and exergy analyses of a photovoltaic cell is presented. • Photonic, thermal and electrical sub-processes are identified. • The irreversibilities caused by the photo-thermo-electrical processes are assessed. • Energy and exergy efficiencies are determined for comparison purposes. - Abstract: In this study, a novel approach for energy and exergy analyses of a photovoltaic (PV) cell is presented, and the exergy destructions within the relevant optical, thermal and electrical processes are quantified. The present study uses a holistic approach to cover all processes and their interactions inside a PV cell, such as photonic processes (photon transmission, reflection and spectral absorption, and background blackbody radiation emission at cell temperature), electrical processes (electron excitation to create a photocurrent, electron-hole recombination, and electrical power transmission to an external load) and thermal processes (internal heat generation by shunt and series resistances, and heat dissipation by conduction-convection). A physical model which considers the highly complex interaction and interdependence among these processes is introduced based on energy and exergy balances, completed by various constitutive equations, including correlations for the convective heat transfer coefficient and the photocurrent dependence of the spectral distribution of the quantum efficiency. The irreversibilities caused by the processes are assessed in terms of the relative magnitudes of the exergy destructions. The largest exergy destruction occurs in the photocurrent generation process of the PV generator, followed by the light absorption process in the wafer. The overall energy and exergy efficiencies are then determined based on the novel model for seven different atmospheric and ecological conditions. The lowest and highest exergy efficiencies of the PV cell are calculated as 9.3% and 14% for two sample locations, Oshawa in Canada and Emirdag in Turkey, respectively
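
    At the level of the overall balance, the exergy efficiency reported above is the electrical output (pure exergy) divided by the exergy content of the incident radiation; a common way to price the latter is Petela's factor. A small sketch under that simplification (the apparent sun temperature and the numbers in the example are assumptions; the paper's cell-level model is far more detailed):

```python
def pv_exergy_efficiency(p_out, irradiance, area, t_ambient, t_sun=5778.0):
    """Exergy efficiency of a PV cell as electrical output over the
    exergy of the incident solar radiation, using Petela's relation
    for the radiation exergy factor."""
    ratio = t_ambient / t_sun
    petela = 1.0 - (4.0 / 3.0) * ratio + (1.0 / 3.0) * ratio ** 4
    solar_exergy = irradiance * area * petela          # W
    return p_out / solar_exergy

# E.g., a 0.15 m^2 cell delivering 20 W under 1000 W/m^2 at 298 K:
print(pv_exergy_efficiency(20.0, 1000.0, 0.15, 298.0))   # ~0.14
```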

  19. Enhanced statistical damage identification using frequency-shift information with tunable piezoelectric transducer circuitry

    International Nuclear Information System (INIS)

    Zhao, J; Tang, J; Wang, K W

    2008-01-01

    The frequency-shift-based damage detection method offers advantages such as global detection capability and easy implementation, but also suffers from drawbacks that include low detection accuracy and sensitivity and the difficulty of identifying damage from a small number of measurable frequencies. Moreover, the damage detection/identification performance is inevitably affected by uncertainty/variations in the baseline model. In this research, we investigate an enhanced statistical damage identification method using tunable piezoelectric transducer circuitry. The tunable piezoelectric transducer circuitry can lead to much richer information on frequency shifts before and after damage occurrence. The circuitry elements, meanwhile, can be directly and accurately measured and thus can be considered uncertainty-free. A statistical damage identification algorithm is formulated which can identify both the mean and variance of the elemental property change. Our analysis indicates that integrating the tunable piezoelectric transducer circuitry can significantly enhance the robustness of the frequency-shift-based damage identification approach under uncertainty and noise
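
    In the simplest linearized setting, measured frequency shifts relate to elemental stiffness changes through a sensitivity matrix, and tuning the circuitry effectively stacks extra rows onto that system. A least-squares sketch of that idea, with statistics taken over repeated measurements (the sensitivity matrix is assumed given; this is not the paper's exact algorithm):

```python
import numpy as np

def identify_damage(freq_shifts, sensitivity):
    """Least-squares damage identification: measured shifts df relate to
    elemental stiffness changes dk through df ~= S dk."""
    dk, *_ = np.linalg.lstsq(sensitivity, freq_shifts, rcond=None)
    return dk

def damage_statistics(shift_samples, sensitivity):
    """Mean and variance of the identified damage over repeated
    measurements, echoing the statistical identification of both the
    mean and variance of the elemental property change."""
    dks = np.array([identify_damage(s, sensitivity) for s in shift_samples])
    return dks.mean(axis=0), dks.var(axis=0)

# Toy usage: 4 measured frequency shifts, 2 elemental stiffness changes.
S = np.array([[1.2, 0.3], [0.4, 1.1], [0.9, 0.5], [0.2, 1.4]])
true_dk = np.array([-0.05, 0.0])            # 5% stiffness loss in element 1
samples = [S @ true_dk + 0.002 * np.random.randn(4) for _ in range(200)]
print(damage_statistics(samples, S))
```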

  20. Optimization of Investment Planning Based on Game-Theoretic Approach

    Directory of Open Access Journals (Sweden)

    Elena Vladimirovna Butsenko

    2018-03-01

    Full Text Available The game-theoretic approach has vast potential for solving economic problems. On the other hand, the theory of games itself can be enriched by studies of real decision-making problems. Hence, this study is aimed at developing and testing a game-theoretic technique to optimize the management of investment planning. This technique makes it possible to forecast the results and manage the processes of investment planning. The proposed method of optimizing the management of investment planning allows choosing the best development strategy for an enterprise. This technique uses the "game with nature" model, with the Wald criterion, the maximax criterion and the Hurwicz criterion as decision criteria. The article presents a new algorithm for constructing the proposed econometric method to optimize investment project management. This algorithm combines the methods of matrix games. Furthermore, I show the implementation of this technique in a block diagram. The algorithm includes the formation of the initial data and the elements of the payment matrix, as well as the definition of maximin, maximax, compromise and optimal management strategies. The methodology is tested on the example of the passenger transportation enterprise of the Sverdlovsk Railway in Ekaterinburg. The application of the proposed methodology and the corresponding algorithm made it possible to obtain an optimal price strategy for transporting passengers in one direction of traffic. This price strategy contributes to an increase in the company's income with minimal risk from the launch of this direction. The obtained results and conclusions show the effectiveness of using the developed methodology for optimizing the management of investment processes in an enterprise. The results of the research can be used as a basis for the development of an appropriate tool and applied by any economic entity in its investment activities.
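
    These "game with nature" criteria reduce to simple row statistics of the payoff matrix. A sketch with an invented fare-income matrix (rows are pricing strategies, columns are demand states; the optimism coefficient in the Hurwicz criterion is an analyst's choice, not a value from the paper):

```python
import numpy as np

def decision_criteria(payoff, hurwicz_alpha=0.5):
    """Row index chosen by each 'game with nature' criterion for a
    payoff matrix (rows: strategies, columns: states of nature)."""
    wald = payoff.min(axis=1).argmax()        # maximin: best worst case
    maximax = payoff.max(axis=1).argmax()     # optimistic criterion
    hurwicz = (hurwicz_alpha * payoff.max(axis=1)
               + (1 - hurwicz_alpha) * payoff.min(axis=1)).argmax()
    return {"wald": wald, "maximax": maximax, "hurwicz": hurwicz}

# Hypothetical income per pricing strategy under three demand states:
fares = np.array([[120, 80, 40],
                  [100, 90, 60],
                  [ 70, 70, 70]])
print(decision_criteria(fares))   # e.g., Wald picks the risk-free row 2
```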

  1. Thermo-magneto-elastoplastic coupling model of metal magnetic memory testing method for ferromagnetic materials

    Science.gov (United States)

    Shi, Pengpeng; Zhang, Pengcheng; Jin, Ke; Chen, Zhenmao; Zheng, Xiaojing

    2018-04-01

    Metal magnetic memory (MMM) testing (also known as micro-magnetic testing) is a new non-destructive electromagnetic testing method that can diagnose ferromagnetic materials at an early stage by measuring the MMM signal directly on the material surface. Previous experiments have shown that many factors affect MMM signals, in particular the temperature, the elastoplastic state, and a complex environmental magnetic field. However, the fact that there have been only a few studies of either how these factors affect the signals or the physical coupling mechanisms among them seriously limits the industrial applications of MMM testing. In this paper, a nonlinear constitutive relation for a ferromagnetic material considering the influences of temperature and elastoplastic state is established under a weak magnetic field and is used to establish a nonlinear thermo-magneto-elastoplastic coupling model of MMM testing. Comparison with experimental data verifies that the proposed theoretical model can accurately describe the thermo-magneto-elastoplastic coupling influence on MMM signals. The proposed theoretical model can predict MMM signals in a complex environment and so is expected to provide a theoretical basis for improving the degree of quantification in MMM testing.

  2. Thermo-optical Properties of Nanofluids

    International Nuclear Information System (INIS)

    Ortega, Maria Alejandra; Echevarria, Lorenzo; Rodriguez, Luis; Castillo, Jimmy; Fernandez, Alberto

    2008-01-01

    In this work, we report the thermo-optical properties of nanofluids. Spherical gold nanoparticles obtained by laser ablation in condensed media were characterized using thermal lens spectroscopy in SDS-water solution, pumping at 532 nm with a 10 ns pulsed Nd-YAG laser system. The nanoparticles obtained by laser ablation were stabilized over time by surfactants (sodium dodecyl sulfate, SDS) at different molar concentrations. The morphology and size of the gold nanoparticles were determined by transmission electron microscopy (TEM). The plasmonic resonance bands of the gold nanoparticles are responsible for the optical absorption at this wavelength; the position of the absorption maximum and the width of the band in the UV-Visible spectra are given by the morphological characteristics of these systems. Thermo-optical constants such as thermal diffusivity, thermal conductivity and dn/dT are functions of nanoparticle size and the dielectric constant of the medium. Existing theoretical models do not describe these relations completely because it is not possible to separate the contributions due to nanoparticle size, form factor and dielectric constant. The thermal lens signal obtained is also dependent on nanoparticle size. This methodology can be used to evaluate nanofluids and to characterize nanoparticles in different media. These results are expected to have an impact in bioimaging, biosensors and other technological applications such as cooling systems.

  3. Security of Heterogeneous Content in Cloud Based Library Information Systems Using an Ontology Based Approach

    Directory of Open Access Journals (Sweden)

    Mihai DOINEA

    2014-01-01

    Full Text Available As in any domain that involves the use of software, library information systems take advantage of cloud computing. The paper highlights the main aspects of cloud-based systems, describing some public solutions provided by the most important players on the market. Topics related to content security in cloud-based services are tackled in order to emphasize the requirements that must be met by these types of systems. A cloud-based implementation of an Information Library System is presented, and some adjacent tools that are used together with it to provide digital content and metadata links are described. In a cloud-based Information Library System, security is approached by means of ontologies. Aspects such as content security in terms of digital rights are presented, and a methodology for security optimization is proposed.

  4. Theoretical development of information science: A brief history

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2017-01-01

    the strongest “paradigms” in the field is a tradition derived from the Cranfield experiments in the 1960s and the bibliometric research following the publication of Science Citation Index from 1963 and forward. Among the competing theoretical frameworks, ‘the cognitive view’ became influential from the 1970s......This paper presents a brief history of information science (IS) as viewed by the author. The term ‘information science’ goes back to 1955 and evolved in the aftermath of Claude Shannon’s ‘information theory’ (1948), which also inspired research into problems in fields of library science...... and documentation. These subjects were a main focus of what became established as ‘information science’, which from 1964 onwards was often termed ‘library and information science’ (LIS). However, the usefulness of Shannon’s information theory as the theoretical foundation of the field has been challenged. Among...

  5. A practical model-based statistical approach for generating functional test cases: application in the automotive industry

    OpenAIRE

    Awédikian , Roy; Yannou , Bernard

    2012-01-01

    With the growing complexity of industrial software applications, industry is looking for efficient and practical methods to validate software. This paper develops a model-based statistical testing approach that automatically generates online and offline test cases for embedded software. It discusses an integrated framework that combines solutions for three major software testing research questions: (i) how to select test inputs; (ii) how to predict the expected...

  6. Understanding employee motivation and organizational performance: Arguments for a set-theoretic approach

    Directory of Open Access Journals (Sweden)

    Michael T. Lee

    2016-09-01

    Full Text Available Empirical evidence demonstrates that motivated employees mean better organizational performance. The objective of this conceptual paper is to articulate the progress that has been made in understanding employee motivation and organizational performance, and to suggest how theory concerning employee motivation and organizational performance may be advanced. We acknowledge the existing limitations of theory development and suggest an alternative research approach. Current motivation theory development is based on conventional quantitative analysis (e.g., multiple regression analysis, structural equation modeling). Since researchers are interested in context and in understanding this social phenomenon holistically, they think in terms of combinations and configurations of a set of pertinent variables. We suggest that researchers take a set-theoretic approach to complement existing conventional quantitative analysis. To advance current thinking, we propose a set-theoretic approach to leverage employee motivation for organizational performance.

  7. Information processing in bacteria: memory, computation, and statistical physics: a key issues review

    International Nuclear Information System (INIS)

    Lan, Ganhui; Tu, Yuhai

    2016-01-01

    preserving information, it does not reveal the underlying mechanism that leads to the observed input-output relationship, nor does it tell us much about which information is important for the organism and how biological systems use information to carry out specific functions. To do that, we need to develop models of the biological machineries, e.g. biochemical networks and neural networks, to understand the dynamics of biological information processes. This is a much more difficult task. It requires deep knowledge of the underlying biological network—the main players (nodes) and their interactions (links)—in sufficient detail to build a model with predictive power, as well as quantitative input-output measurements of the system under different perturbations (both genetic variations and different external conditions) to test the model predictions and guide further development of the model. Due to the recent growth of biological knowledge thanks in part to high throughput methods (sequencing, gene expression microarray, etc.) and the development of quantitative in vivo techniques such as various fluorescence technologies, these requirements are starting to be realized in different biological systems. The possible close interaction between quantitative experimentation and theoretical modeling has made systems biology an attractive field for physicists interested in quantitative biology. In this review, we describe some of the recent work in developing a quantitative predictive model of bacterial chemotaxis, which can be considered as the hydrogen atom of systems biology. Using statistical physics approaches, such as the Ising model and Langevin equation, we study how bacteria, such as E. coli, sense and amplify external signals, how they keep a working memory of the stimuli, and how they use these data to compute the chemical gradient. In particular, we will describe how E. coli cells avoid cross-talk in a heterogeneous receptor cluster to keep a ligand-specific memory. We will also

  10. New robust statistical procedures for the polytomous logistic regression models.

    Science.gov (United States)

    Castilla, Elena; Ghosh, Abhik; Martin, Nirian; Pardo, Leandro

    2018-05-17

    This article derives a new family of estimators, namely the minimum density power divergence estimators, as a robust generalization of the maximum likelihood estimator for the polytomous logistic regression model. Based on these estimators, a family of Wald-type test statistics for linear hypotheses is introduced. Robustness properties of both the proposed estimators and the test statistics are theoretically studied through the classical influence function analysis. Appropriate real-life examples are presented to justify the need for suitable robust statistical procedures in place of the likelihood-based inference for the polytomous logistic regression model. The validity of the theoretical results established in the article is further confirmed empirically through suitable simulation studies. Finally, an approach for the data-driven selection of the robustness tuning parameter is proposed with empirical justifications. © 2018, The International Biometric Society.
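
    For orientation, the divergence these estimators minimize has the standard density power divergence form of Basu et al. (notation ours), between the true density g and the model density f_θ:

        \[
        d_{\alpha}(g, f_{\theta}) = \int \Big\{ f_{\theta}^{1+\alpha}(x)
          - \Big(1 + \tfrac{1}{\alpha}\Big)\, g(x)\, f_{\theta}^{\alpha}(x)
          + \tfrac{1}{\alpha}\, g^{1+\alpha}(x) \Big\}\, \mathrm{d}x, \qquad \alpha > 0.
        \]

    As α → 0 the criterion reduces to the Kullback-Leibler divergence, recovering the maximum likelihood estimator; larger α downweights outlying observations, trading some efficiency for robustness.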

  11. Thermo-mechanical modeling of the obduction process based on the Oman ophiolite case

    OpenAIRE

    Duretz, Thibault; Agard, Philippe; Yamato, Philippe; Ducassou, Céline; Burov, Evgenii; Gerya, T. V.

    2016-01-01

    International audience; Obduction emplaces regional-scale fragments of oceanic lithosphere (ophiolites) over continental lithosphere margins of much lower density. For this reason, the mechanisms responsible for obduction remain enigmatic in the framework of plate tectonics. We present two-dimensional (2D) thermo-mechanical models of obduction and investigate possible dynamics and physical controls of this process. Model geometry and boundary conditions are based on available geological and g...

  12. Statistical mechanics of superconductivity

    CERN Document Server

    Kita, Takafumi

    2015-01-01

    This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensable for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau’s Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...

  13. [Lack of access to information on oral health problems among adults: an approach based on the theoretical model for literacy in health].

    Science.gov (United States)

    Roberto, Luana Leal; Noronha, Daniele Durães; Souza, Taiane Oliveira; Miranda, Ellen Janayne Primo; Martins, Andréa Maria Eleutério de Barros Lima; Paula, Alfredo Maurício Batista De; Ferreira, Efigênia Ferreira E; Haikal, Desirée Sant'ana

    2018-03-01

    This study sought to investigate factors associated with the lack of access to information on oral health among adults. It is a cross-sectional study, carried out among 831 adults (35-44 years of age). The dependent variable was access to information on how to avoid oral problems, and the independent variables were gathered into subgroups according to the theoretical model for literacy in health. Binary logistic regression was carried out, and results were corrected by the design effect. It was observed that 37.5% had no access to information about dental problems. The lack of access was higher among adults who had lower per capita income, were dissatisfied with the dental services provided, did not use dental floss, had unsatisfactory physical control of the quality of life, and self-perceived their oral health as fair/poor/very poor. The likelihood of not having access to information about dental problems among those dissatisfied with the dental services used was 3.28 times higher than for those satisfied with the dental services used. Thus, decreased access to information was related to unfavorable conditions among adults. Health services should ensure appropriate information to their users in order to increase health literacy levels and improve satisfaction and equity.

  14. System of National Accounts as an Information Base for Tax Statistics

    Directory of Open Access Journals (Sweden)

    A. E. Lyapin

    2017-01-01

    Full Text Available The article is devoted to those aspects of the system of national accounts which together perform the role of an information base for tax statistics. In our time, the tax system is one of the main subjects of discussion about the methods and directions of its reform. Taxes are one of the main factors of regulation of the economy and act as an incentive for its development. Analysis of tax revenues to budgets of different levels makes it possible to assess tax collection and the tax burden for various industries. From the amount of tax revenue one can judge the scale of reproductive processes in the country. It should be noted that taxes receive a special treatment in the SNA. As mentioned earlier, taxes on products are treated in the SNA as a form of income. At the same time, most economists prefer to consider them as consumption taxes, and taxes on various financial transactions (for example, taxes on the purchase/sale of securities) are treated as taxes on production, including in cases when no services are involved. It would be rational to revise and amend those parts of the SNA associated with the interpretation of taxes and subsidies, to ensure better understanding and compliance with user needs. Taxes are an integral part of any state and an indispensable element of the economic relations of any society. In turn, taxes and the budget are inextricably linked, as these relations have a clearly expressed, objective bilateral character. Taxes are the main group of budget revenues, which makes it possible to finance all government agencies and expenditure items, as well as to subsidize the institutional units that make up the SNA sector “non-financial corporations”. The second side of the story is that taxes are a part of the money taken from producers and households. The total mass of taxes depends on the composition of taxes, tax rates, the tax base and the scope of benefits. The bulk of tax revenues also depends on possible changes in

  15. A new theoretical approach to analyze complex processes in cytoskeleton proteins.

    Science.gov (United States)

    Li, Xin; Kolomeisky, Anatoly B

    2014-03-20

    Cytoskeleton proteins are filament structures that support a large number of important biological processes. These dynamic biopolymers exist in nonequilibrium conditions stimulated by hydrolysis chemical reactions in their monomers. Current theoretical methods provide a comprehensive picture of biochemical and biophysical processes in cytoskeleton proteins. However, the description is only qualitative under biologically relevant conditions because the theoretical mean-field models utilized neglect correlations. We develop a new theoretical method to describe dynamic processes in cytoskeleton proteins that takes into account spatial correlations in the chemical composition of these biopolymers. Our approach is based on the analysis of probabilities of different clusters of subunits. It allows us to obtain exact analytical expressions for a variety of dynamic properties of cytoskeleton filaments. By comparing theoretical predictions with Monte Carlo computer simulations, it is shown that our method provides a fully quantitative description of complex dynamic phenomena in cytoskeleton proteins under all conditions.

  16. Thermo-plasmonics of Irradiated Metallic Nanostructures

    DEFF Research Database (Denmark)

    Ma, Haiyan

    Thermo-plasmonics is an emerging field in photonics which aims at harnessing the kinetic energy of light to generate nanoscopic sources of heat. Localized surface plasmons (LSP) supported by metallic nanostructures greatly enhance the interactions of light with the structure. By engineering... delivery, nano-surgeries and thermo-transportations. Apart from generating well-controlled temperature increase in functional thermo-plasmonic devices, thermo-plasmonics can also be used in understanding complex phenomena in thermodynamics by creating drastic temperature gradients which are not accessible... using conventional techniques. In this thesis, we present novel experimental and numerical tools to characterize thermo-plasmonic devices in a biologically relevant environment, and explore the thermodiffusion properties and measure thermophoretic forces for particles in temperature gradients ranging...

  17. Effect of microencapsulated phase change materials on the thermo-mechanical properties of poly(methyl-methacrylate) based biomaterials.

    Science.gov (United States)

    De Santis, Roberto; Ambrogi, Veronica; Carfagna, Cosimo; Ambrosio, Luigi; Nicolais, Luigi

    2006-12-01

    Microencapsulated paraffin-based phase change materials (PCMs) have been incorporated into a poly(methyl-methacrylate) (PMMA) matrix in order to enhance its thermo-mechanical properties. Calorimetric and mechanical analyses were carried out and the thermo-regulating potential of PMMA/PCM composites investigated. Results indicate that the PCM phase has a negligible effect on the glass transition temperature of the PMMA matrix, and that the thermal regulating capability spans around body temperature, absorbing or releasing thermal energy of up to 30 J/g. One effect of the PCM phase in the cement is the reduction of the peak temperature developed during the exothermic reaction.

  18. A population-based approach to background discrimination in particle physics

    International Nuclear Information System (INIS)

    Colecchia, Federico

    2012-01-01

    Background properties in experimental particle physics are typically estimated from control samples corresponding to large numbers of events. This can provide precise knowledge of average background distributions, but typically does not take into account statistical fluctuations in a data set of interest. A novel approach based on mixture model decomposition is presented, as a way to extract additional information about statistical fluctuations from a given data set with a view to improving on knowledge of background distributions obtained from control samples. Events are treated as heterogeneous populations comprising particles originating from different processes, and individual particles are mapped to a process of interest on a probabilistic basis. The proposed approach makes it possible to estimate features of the background distributions from the data, and to extract information about statistical fluctuations that would otherwise be lost using traditional supervised classifiers trained on high-statistics control samples. A feasibility study on Monte Carlo is presented, together with a comparison with existing techniques. Finally, the prospects for the development of tools for intensive offline analysis of individual interesting events at the Large Hadron Collider are discussed.
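
    A hedged sketch of the mixture-decomposition idea (the two-dimensional features and Gaussian components are our toy assumptions; the paper's observables and mixture form may differ):

        import numpy as np
        from sklearn.mixture import GaussianMixture

        # Toy "event": particles from a background-like process and a
        # signal-like process, each described by two hypothetical features.
        rng = np.random.default_rng(0)
        bkg = rng.normal(0.0, 1.0, size=(800, 2))
        sig = rng.normal(2.0, 0.5, size=(200, 2))
        event = np.vstack([bkg, sig])

        # Decompose the heterogeneous population as a two-component mixture
        # and map each particle to a process on a probabilistic basis.
        gmm = GaussianMixture(n_components=2, random_state=0).fit(event)
        posteriors = gmm.predict_proba(event)   # per-particle process weights

    Fitting on the data set of interest, rather than on a high-statistics control sample, is what lets the method pick up the event-by-event statistical fluctuations the abstract emphasizes.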

  19. Statistical mechanics of low-density parity-check codes

    Energy Technology Data Exchange (ETDEWEB)

    Kabashima, Yoshiyuki [Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama 2268502 (Japan); Saad, David [Neural Computing Research Group, Aston University, Birmingham B4 7ET (United Kingdom)

    2004-02-13

    We review recent theoretical progress on the statistical mechanics of error correcting codes, focusing on low-density parity-check (LDPC) codes in general, and on Gallager and MacKay-Neal codes in particular. By exploiting the relation between LDPC codes and Ising spin systems with multi-spin interactions, one can carry out a statistical mechanics based analysis that determines the practical and theoretical limitations of various code constructions, corresponding to dynamical and thermodynamical transitions, respectively, as well as the behaviour of error-exponents averaged over the corresponding code ensemble as a function of channel noise. We also contrast the results obtained using methods of statistical mechanics with those derived in the information theory literature, and show how these methods can be generalized to include other channel types and related communication problems. (topical review)
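
    The code-to-spin-system mapping the review exploits can be stated compactly (standard correspondence, our notation): writing each bit as a spin S_i = (-1)^{x_i}, every parity check μ of the LDPC matrix becomes a multi-spin coupling over the set of bits L(μ) it involves,

        \[
        H(\mathbf{S}) = -\sum_{\mu=1}^{M} J_{\mu} \prod_{i \in \mathcal{L}(\mu)} S_i ,
          \qquad S_i = (-1)^{x_i} \in \{\pm 1\},
        \]

    so decoding amounts to finding low-energy spin configurations of H at the effective temperature set by the channel noise (the Nishimori condition), which is where the dynamical and thermodynamical transitions mentioned above appear.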

  1. Predicting energy performance of a net-zero energy building: A statistical approach

    International Nuclear Information System (INIS)

    Kneifel, Joshua; Webb, David

    2016-01-01

    Highlights: • A regression model is applied to actual energy data from a net-zero energy building. • The model is validated through a rigorous statistical analysis. • Comparisons are made between model predictions and those of a physics-based model. • The model is a viable baseline for evaluating future models from the energy data. - Abstract: Performance-based building requirements have become more prevalent because they give freedom in building design while still maintaining or exceeding the energy performance required by prescriptive-based requirements. In order to determine if building designs reach target energy efficiency improvements, it is necessary to estimate the energy performance of a building using predictive models and different weather conditions. Physics-based whole building energy simulation modeling is the most common approach. However, these physics-based models include underlying assumptions and require significant amounts of information in order to specify the input parameter values. An alternative approach to test the performance of a building is to develop a statistically derived predictive regression model using post-occupancy data that can accurately predict energy consumption and production based on a few common weather-based factors, thus requiring less information than simulation models. A regression model based on measured data should be able to predict energy performance of a building for a given day as long as the weather conditions are similar to those during the data collection time frame. This article uses data from the National Institute of Standards and Technology (NIST) Net-Zero Energy Residential Test Facility (NZERTF) to develop and validate a regression model to predict the energy performance of the NZERTF using two weather variables aggregated to the daily level, applies the model to estimate the energy performance of hypothetical NZERTFs located in different cities in the Mixed-Humid Climate Zone, and compares these
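
    A minimal sketch of the statistical approach described, on synthetic data (the two weather variables, heating degree-days and daily insolation, and all coefficients are our assumptions, not the NZERTF model's):

        import numpy as np

        rng = np.random.default_rng(1)
        days = 365
        hdd = rng.gamma(2.0, 5.0, days)        # daily heating degree-days (assumed)
        solar = rng.uniform(1.0, 7.0, days)    # daily insolation, kWh/m^2 (assumed)
        energy = 5.0 + 1.8 * hdd - 2.2 * solar + rng.normal(0.0, 2.0, days)

        # Ordinary least squares on the two daily-aggregated weather variables.
        X = np.column_stack([np.ones(days), hdd, solar])
        beta, *_ = np.linalg.lstsq(X, energy, rcond=None)
        pred = X @ beta    # daily energy predictions for validation

    Once fitted and validated, such a model can be driven with another city's daily weather series, which is how the hypothetical-location comparison above works.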

  2. Risk prediction model: Statistical and artificial neural network approach

    Science.gov (United States)

    Paiman, Nuur Azreen; Hariri, Azian; Masood, Ibrahim

    2017-04-01

    Prediction models are increasingly gaining popularity and have been used in numerous areas of study to complement and support clinical reasoning and decision making. The adoption of such models assists physicians' decision making and individuals' behavior, and consequently improves individual outcomes and the cost-effectiveness of care. The objective of this paper is to review articles related to risk prediction models in order to understand the suitable approach, development and validation process of such models. A qualitative review of the aims, methods and significant main outcomes of nineteen published articles that developed risk prediction models in numerous fields was done. This paper also reviews how researchers develop and validate risk prediction models based on statistical and artificial neural network approaches. From the review, some methodological recommendations for developing and validating prediction models are highlighted. According to the studies reviewed, the artificial neural network approach to developing prediction models was more accurate than the statistical approach. However, currently only limited published literature discusses which approach is more accurate for risk prediction model development.

  3. Information-theoretic lengths of Jacobi polynomials

    Energy Technology Data Exchange (ETDEWEB)

    Guerrero, A; Dehesa, J S [Departamento de Fisica Atomica, Molecular y Nuclear, Universidad de Granada, Granada (Spain); Sanchez-Moreno, P, E-mail: agmartinez@ugr.e, E-mail: pablos@ugr.e, E-mail: dehesa@ugr.e [Instituto ' Carlos I' de Fisica Teorica y Computacional, Universidad de Granada, Granada (Spain)

    2010-07-30

    The information-theoretic lengths of the Jacobi polynomials P_n^(α,β)(x), which are information-theoretic measures (Renyi, Shannon and Fisher) of their associated Rakhmanov probability density, are investigated. They quantify the spreading of the polynomials along the orthogonality interval [-1, 1] in a complementary but different way to the root-mean-square or standard deviation because, contrary to this measure, they do not refer to any specific point of the interval. The explicit expressions of the Fisher length are given. The Renyi lengths are found by the use of the combinatorial multivariable Bell polynomials in terms of the polynomial degree n and the parameters (α, β). The Shannon length, which cannot be exactly calculated because of its logarithmic functional form, is bounded from below by using sharp upper bounds to general densities on [-1, +1] given in terms of various expectation values; moreover, its asymptotics is also pointed out. Finally, several computational issues relative to these three quantities are carefully analyzed.
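
    For reference, the quantities involved have the following commonly used definitions (notation ours): the Rakhmanov density of the orthonormal polynomial is

        \[
        \rho_n(x) = \big[\hat{P}_n^{(\alpha,\beta)}(x)\big]^2 (1-x)^{\alpha}(1+x)^{\beta},
          \qquad x \in [-1, 1],
        \]

    and the Renyi, Shannon and Fisher lengths are, respectively,

        \[
        \mathcal{L}_q[\rho_n] = \Big(\int \rho_n^{\,q}\Big)^{\frac{1}{1-q}}, \qquad
        \mathcal{N}[\rho_n] = \exp\Big(-\int \rho_n \ln \rho_n\Big), \qquad
        (\delta x)_F = \Big(\int \frac{[\rho_n']^2}{\rho_n}\Big)^{-1/2},
        \]

    all of which carry units of length on the orthogonality interval, unlike entropies or Fisher information themselves.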

  4. Thermo-mechanical constitutive modeling of unsaturated clays based on the critical state concepts

    OpenAIRE

    Tourchi, Saeed; Hamidi, Amir

    2015-01-01

    A thermo-mechanical constitutive model for unsaturated clays is constructed based on the existing model for saturated clays originally proposed by the authors. The saturated clays model was formulated in the framework of critical state soil mechanics and modified Cam-clay model. The existing model has been generalized to simulate the experimentally observed behavior of unsaturated clays by introducing Bishop's stress and suction as independent stress parameters and modifying the hardening rul...

  5. Integrated approach for stress based lifing of aero gas turbine blades

    Science.gov (United States)

    Abu, Abdullahi Obonyegba

    In order to analyse the turbine blade life, the damage due to the combined thermal and mechanical loads should be adequately accounted for. This is more challenging when detailed component geometry is limited. Therefore, a compromise between the level of geometric detail and the complexity of the lifing method to be implemented would be necessary. This research focuses on how the life assessment of aero engine turbine blades can be done, considering the balance between available design inputs and an adequate level of fidelity. Accordingly, the thesis contributes to developing a generic turbine blade lifing method that is based on the engine thermodynamic cycle, as well as integrating critical design/technological factors and operational parameters that influence the aero engine blade life. To this end, thermo-mechanical fatigue was identified as the critical damage phenomenon driving the life of the turbine blade. The developed approach integrates software tools and numerical models created using the minimum design information typically available at the early design stages. Using finite element analysis of an idealised blade geometry, the approach captures relevant impacts of thermal gradients and thermal stresses that contribute to the thermo-mechanical fatigue damage on the gas turbine blade. The blade life is evaluated using the Neu/Sehitoglu thermo-mechanical fatigue model that considers damage accumulation due to fatigue, oxidation, and creep. The leading edge is examined as a critical part of the blade to estimate the damage severity for different design factors and operational parameters. The outputs of the research can be used to better understand how the environment and the operating conditions of the aircraft affect the blade life consumption and therefore what is the impact on the maintenance cost and the availability of the propulsion system. This research also finds that the environmental (oxidation) effect drives the blade life and the blade coolant
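
    The Neu/Sehitoglu model mentioned here combines the three damage modes additively; schematically (the standard form of the model, with all constitutive details and coefficients omitted):

        \[
        \frac{1}{N_f} \;=\; \frac{1}{N_f^{\mathrm{fat}}} \;+\; \frac{1}{N_f^{\mathrm{ox}}}
          \;+\; \frac{1}{N_f^{\mathrm{creep}}},
        \]

    so the predicted blade life N_f is dominated by whichever of the fatigue, oxidation or creep terms is largest under the local thermo-mechanical cycle, consistent with the finding above that oxidation drives the life.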

  6. Diagonalization of propagators in thermo field dynamics for relativistic quantum fields

    International Nuclear Information System (INIS)

    Henning, P.A.; Umezawa, H.

    1992-09-01

    Two-point functions for interacting quantum fields in statistical systems can be diagonalized by matrix transformations. It is shown that, within the framework of time-dependent Thermo Field Dynamics, this diagonalization can be understood as a thermal Bogoliubov transformation to non-interacting statistical quasi-particles. The condition for their unperturbed propagation relates these states to the thermodynamic properties of the system: It requires global equilibrium for stationary situations, or specifies the time evolution according to a kinetic equation. (orig.)
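
    For a free bosonic field, the structure referred to can be written schematically (a textbook-style sketch in our notation; conventions for the thermal doublet vary across the literature):

        \[
        G(\omega) = B^{-1}(n)
          \begin{pmatrix} \Delta(\omega) & 0 \\ 0 & \Delta^{*}(\omega) \end{pmatrix}
          B(n), \qquad
        B(n) = \begin{pmatrix} 1+n & -n \\ -1 & 1 \end{pmatrix},
        \]

    where Δ is the usual Feynman propagator and the occupation number n parametrizes the thermal Bogoliubov matrix B; demanding that the quasi-particle propagator stay diagonal in the interacting theory is what ties n to global equilibrium or to a kinetic equation.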

  7. Online adaptive approach for a game-theoretic strategy for complete vehicle energy management

    NARCIS (Netherlands)

    Chen, H.; Kessels, J.T.B.A.; Weiland, S.

    2015-01-01

    This paper introduces an adaptive approach for a game-theoretic strategy on Complete Vehicle Energy Management. The proposed method enhances the game-theoretic approach such that the strategy is able to adapt to real driving behavior. The classical game-theoretic approach relies on one probability

  8. Objective and subjective measures of exercise intensity during thermo-neutral and hot yoga.

    Science.gov (United States)

    Boyd, Corinne N; Lannan, Stephanie M; Zuhl, Micah N; Mora-Rodriguez, Ricardo; Nelson, Rachael K

    2018-04-01

    While hot yoga has gained enormous popularity in recent years, owing in part to increased environmental challenge associated with exercise in the heat, it is not clear whether hot yoga is more vigorous than thermo-neutral yoga. Therefore, the aim of this study was to determine objective and subjective measures of exercise intensity during constant intensity yoga in a hot and thermo-neutral environment. Using a randomized, crossover design, 14 participants completed 2 identical ∼20-min yoga sessions in a hot (35.3 ± 0.8 °C; humidity: 20.5% ± 1.4%) and thermo-neutral (22.1 ± 0.2 °C; humidity: 27.8% ± 1.6%) environment. Oxygen consumption and heart rate (HR) were recorded as objective measures (percentage of maximal oxygen consumption and percentage of maximal HR (%HRmax)) and rating of perceived exertion (RPE) was recorded as a subjective measure of exercise intensity. There was no difference in exercise intensity based on percentage of maximal oxygen consumption during hot versus thermo-neutral yoga (30.9% ± 2.3% vs. 30.5% ± 1.8%, p = 0.68). However, exercise intensity was significantly higher during hot versus thermo-neutral yoga based on %HRmax (67.0% ± 2.3% vs. 60.8% ± 1.9%, p = 0.01) and RPE (12 ± 1 vs. 11 ± 1, p = 0.04). According to established exercise intensities, hot yoga was classified as light-intensity exercise based on percentage of maximal oxygen consumption but moderate-intensity exercise based on %HRmax and RPE while thermo-neutral yoga was classified as light-intensity exercise based on percentage of maximal oxygen uptake, %HRmax, and RPE. Despite the added hemodynamic stress and perception that yoga is more strenuous in a hot environment, we observed similar oxygen consumption during hot versus thermo-neutral yoga, classifying both exercise modalities as light-intensity exercise.

  9. A multimodal wave spectrum-based approach for statistical downscaling of local wave climate

    Science.gov (United States)

    Hegermiller, Christie; Antolinez, Jose A A; Rueda, Ana C.; Camus, Paula; Perez, Jorge; Erikson, Li; Barnard, Patrick; Mendez, Fernando J.

    2017-01-01

    Characterization of wave climate by bulk wave parameters is insufficient for many coastal studies, including those focused on assessing coastal hazards and long-term wave climate influences on coastal evolution. This issue is particularly relevant for studies using statistical downscaling of atmospheric fields to local wave conditions, which are often multimodal in large ocean basins (e.g. the Pacific). Swell may be generated in vastly different wave generation regions, yielding complex wave spectra that are inadequately represented by a single set of bulk wave parameters. Furthermore, the relationship between atmospheric systems and local wave conditions is complicated by variations in arrival time of wave groups from different parts of the basin. Here, we address these two challenges by improving upon the spatiotemporal definition of the atmospheric predictor used in statistical downscaling of local wave climate. The improved methodology separates the local wave spectrum into “wave families,” defined by spectral peaks and discrete generation regions, and relates atmospheric conditions in distant regions of the ocean basin to local wave conditions by incorporating travel times computed from effective energy flux across the ocean basin. When applied to locations with multimodal wave spectra, including Southern California and Trujillo, Peru, the new methodology improves the ability of the statistical model to project significant wave height, peak period, and direction for each wave family, retaining more information from the full wave spectrum. This work is the basis of statistical downscaling by weather types, which has recently been applied to coastal flooding and morphodynamic applications.
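
    The travel-time ingredient can be made concrete with the deep-water group velocity of linear wave theory, c_g = gT/(4π); the period and distance below are illustrative only:

        import math

        g = 9.81       # gravity, m/s^2
        T = 14.0       # swell period, s (illustrative)
        d = 8.0e6      # distance from generation region to site, m (illustrative)

        c_g = g * T / (4.0 * math.pi)    # deep-water group velocity, m/s
        lag_days = d / c_g / 86400.0     # lag applied to the atmospheric predictor
        print(f"c_g = {c_g:.1f} m/s, travel time = {lag_days:.1f} days")

    Long-period swell thus arrives roughly a week or more after the generating storm, which is why the predictor fields must be lagged per wave family.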

  10. The person-oriented approach: A short theoretical and practical guide

    Directory of Open Access Journals (Sweden)

    Lars R. Bergman

    2014-05-01

    Full Text Available A short overview of the person-oriented approach is given as a guide to the researcher interested in carrying out person-oriented research. Theoretical, methodological, and practical considerations of the approach are discussed. First, some historical roots are traced, followed by a description of the holistic-interactionistic research paradigm, which provided the general framework for the development of the modern person-oriented approach. The approach has both a theoretical and a methodological facet and, after presenting its key theoretical tenets, an overview is given of some common person-oriented methods. Central to the person-oriented approach is a system view with its components together forming a pattern regarded as indivisible. This pattern should be understood and studied as a whole, not broken up into pieces (variables) that are studied as separate entities. Hence, usually methodological tools are used by which whole patterns are analysed (e.g. cluster analysis). An empirical example is given where the pattern development of school grades is studied.

  11. Probabilistic properties of the date of maximum river flow, an approach based on circular statistics in lowland, highland and mountainous catchment

    Science.gov (United States)

    Rutkowska, Agnieszka; Kohnová, Silvia; Banasik, Kazimierz

    2018-04-01

    Probabilistic properties of dates of winter, summer and annual maximum flows were studied using circular statistics in three catchments differing in topographic conditions: a lowland, highland and mountainous catchment. The circular measures of location and dispersion were used in the long-term samples of dates of maxima. A mixture of von Mises distributions was assumed as the theoretical distribution function of the date of winter, summer and annual maximum flow. The number of components was selected on the basis of the corrected Akaike Information Criterion and the parameters were estimated by means of the Maximum Likelihood method. The goodness of fit was assessed using both the correlation between quantiles and versions of Kuiper's and Watson's tests. Results show that the number of components varied between catchments and was different for seasonal and annual maxima. Differences between catchments in circular characteristics were explained using climatic factors such as precipitation and temperature. Further studies may include grouping catchments based on the similarity between circular distribution functions and the linkage between dates of maximum precipitation and maximum flow.
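
    A minimal illustration of the circular treatment of dates (toy data; fitting the von Mises mixture itself needs an EM loop or an off-the-shelf package, not shown here):

        import numpy as np

        doy = np.array([350, 360, 5, 15, 340, 2])   # dates of winter maxima
        theta = 2.0 * np.pi * doy / 365.0           # map day-of-year to an angle

        C, S = np.cos(theta).sum(), np.sin(theta).sum()
        mean_doy = 365.0 * (np.arctan2(S, C) % (2.0 * np.pi)) / (2.0 * np.pi)
        R = np.hypot(C, S) / doy.size   # mean resultant length: 1 = concentrated

    The circular mean correctly lands near the turn of the year, where a naive arithmetic mean of day-of-year values would fail; R plays the role of the dispersion measure.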

  12. Statistical method application to knowledge base building for reactor accident diagnostic system

    International Nuclear Information System (INIS)

    Yoshida, Kazuo; Yokobayashi, Masao; Matsumoto, Kiyoshi; Kohsaka, Atsuo

    1989-01-01

    In the development of a knowledge-based expert system, one of the key issues is how to build the knowledge base (KB) in an efficient way while keeping the objectivity of the KB. In order to solve this issue, an approach has been proposed to build a prototype KB systematically by a statistical method, factor analysis. For the verification of this approach, factor analysis was applied to build a prototype KB for the JAERI expert system DISKET. To this end, alarm and process information was generated by a PWR simulator and factor analysis was applied to this information to define a taxonomy of accident hypotheses and to extract rules for each hypothesis. The prototype KB thus built was tested through inference against several types of transients, including double failures. In each diagnosis, the transient type was well identified. Furthermore, newly introduced standards for rule extraction showed good effects on the enhancement of the performance of the prototype KB. (author)

  13. Physics of thermo-nuclear fusion and the ITER project; La physique de la fusion thermonucleaire et le projet ITER

    Energy Technology Data Exchange (ETDEWEB)

    Garin, P [CEA Cadarache, Dept. de Recherches sur la Fusion Controlee - DRFC, 13 - Saint-Paul-lez-Durance (France)

    2003-01-01

    This document gathers the slides of the 6 contributions to the workshop 'the physics of thermo-nuclear fusion and the ITER project': 1) the feasibility of magnetic confinement and the issue of heat recovery, 2) heating and current generation in tokamaks, 3) the physics of wall-plasma interaction, 4) recent results at JET, 5) inertial confinement and fast ignition, and 6) the technology of fusion machines based on magnetic confinement. This document presents the principles of thermo-nuclear fusion machines and gives a lot of technical information about JET, Tore-Supra and ITER.

  14. PROCESS-BASED LEARNING: TOWARDS THEORETICAL AND LECTURE-BASED COURSEWORK IN STUDIO STYLE

    Directory of Open Access Journals (Sweden)

    Hatem Ezzat Nabih

    2010-07-01

    Full Text Available This article presents a process-based learning approach to design education where theoretical coursework is taught in studio-style. Lecture-based coursework is sometimes regarded as lacking in challenge and broadening the gap between theory and practice. Furthermore, lecture-based curricula tend to be detached from the studio and deny students from applying their theoretically gained knowledge. Following the belief that student motivation is increased by establishing a higher level of autonomy in the learning process, I argue for a design education that links theory with applied design work within the studio setting. By synthesizing principles of Constructivist Learning and Problem-Based Learning, PBL students are given greater autonomy by being actively involved in their education. Accordingly, I argue for a studio setting that incorporates learning in studio style by presenting three design applications involving students in investigation and experimentation in order to self-experience the design process.

  15. Artificial intelligence approaches in statistics

    International Nuclear Information System (INIS)

    Phelps, R.I.; Musgrove, P.B.

    1986-01-01

    The role of pattern recognition and knowledge representation methods from Artificial Intelligence within statistics is considered. Two areas of potential use are identified and one, data exploration, is used to illustrate the possibilities. A method is presented to identify and separate overlapping groups within cluster analysis, using an AI approach. The potential of such "intelligent" approaches is stressed.

  16. Theoretical Aspects of the Patterns Recognition Statistical Theory Used for Developing the Diagnosis Algorithms for Complicated Technical Systems

    Science.gov (United States)

    Obozov, A. A.; Serpik, I. N.; Mihalchenko, G. S.; Fedyaeva, G. A.

    2017-01-01

    In the article, the problem of applying pattern recognition (a relatively young area of engineering cybernetics) to the analysis of complicated technical systems is examined. It is shown that a statistical approach could be the most effective for hard-to-distinguish situations. The different recognition algorithms are based on the Bayes approach, which estimates the posterior probabilities of a certain event and an assumed error. Application of the statistical approach to pattern recognition is possible for solving the problem of technical diagnosis of complicated systems, particularly high-powered marine diesel engines.
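
    Schematically, the Bayes rule underlying such algorithms assigns an observed feature vector x to the fault class ω with the largest posterior probability (standard form, our notation):

        \[
        P(\omega_j \mid \mathbf{x}) =
          \frac{p(\mathbf{x} \mid \omega_j)\, P(\omega_j)}
               {\sum_k p(\mathbf{x} \mid \omega_k)\, P(\omega_k)},
        \]

    and deciding for the class maximizing this posterior minimizes the expected misclassification risk, which is what makes the approach attractive for hard-to-distinguish diagnostic situations.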

  17. Statistical Language Models and Information Retrieval: Natural Language Processing Really Meets Retrieval

    NARCIS (Netherlands)

    Hiemstra, Djoerd; de Jong, Franciska M.G.

    2001-01-01

    Traditionally, natural language processing techniques for information retrieval have always been studied outside the framework of formal models of information retrieval. In this article, we introduce a new formal model of information retrieval based on the application of statistical language models.
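
    A toy query-likelihood scorer in the spirit of such language-model retrieval (the corpus, the smoothing weight and the linear interpolation against the collection model are illustrative assumptions, not the article's exact formulation):

        from collections import Counter

        docs = {"d1": "information retrieval with language models".split(),
                "d2": "statistical language processing for retrieval".split()}
        coll = Counter(w for words in docs.values() for w in words)
        N = sum(coll.values())
        lam = 0.8   # interpolation weight between document and collection model

        def score(query, doc):
            tf, dl = Counter(doc), len(doc)
            p = 1.0
            for w in query.split():
                p *= lam * tf[w] / dl + (1.0 - lam) * coll[w] / N
            return p

        ranking = sorted(docs, reverse=True,
                         key=lambda d: score("statistical retrieval", docs[d]))

    Smoothing against the collection keeps the score nonzero for query terms missing from a document, the language-model analogue of inverse document frequency weighting.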

  18. Bayesian Information Criterion as an Alternative way of Statistical Inference

    Directory of Open Access Journals (Sweden)

    Nadejda Yu. Gubanova

    2012-05-01

    Full Text Available The article treats the Bayesian information criterion as an alternative to traditional methods of statistical inference based on null hypothesis significance testing (NHST). A comparison of ANOVA and BIC results for a psychological experiment is discussed.
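
    For reference, the criterion in its standard form (sign conventions vary across texts):

        \[
        \mathrm{BIC} = k \ln n - 2 \ln \hat{L},
        \]

    where k is the number of free parameters, n the sample size and L̂ the maximized likelihood; the model with the smallest BIC is preferred, and the difference in BIC between two models approximates minus twice the logarithm of their Bayes factor.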

  19. THEORETICAL AND METHODOLOGICAL APPROACHES TO THE STUDY OF THE IMPACT OF INFORMATION TECHNOLOGY ON SOCIAL CONNECTIONS AMONG YOUTH

    Directory of Open Access Journals (Sweden)

    Sofia Alexandrovna Zverkova

    2015-11-01

    Full Text Available The relevance of this topic stems from the virtualization of communication in modern society, especially among young people, which affects social relations and social support services. A more in-depth study of the network virtualization of social relations in society is needed, given the ambiguous consequences of this phenomenon among the youth. Purpose. To analyze classical and contemporary theoretical and methodological approaches to the study of social ties and social support in the context of technological progress. Results. The article presents a sociological analysis of theoretical and methodological approaches to the study of interaction and social support among youth through strong and weak social ties, in cyberspace and in the real world. Practical implications. The analysis opens opportunities for examining social relations in various fields of sociology, such as the sociology of youth and the sociology of communications.

  20. Risk map for cutaneous leishmaniasis in Ethiopia based on environmental factors as revealed by geographical information systems and statistics

    Directory of Open Access Journals (Sweden)

    Ahmed Seid

    2014-05-01

    Full Text Available Cutaneous leishmaniasis (CL) is a neglected tropical disease strongly associated with poverty. Treatment is problematic and no vaccine is available. Ethiopia has seen new outbreaks in areas previously not known to be endemic, often with co-infection by the human immunodeficiency virus (HIV), with rates reaching 5.6% of the cases. The present study concerns the development of a risk model based on environmental factors using geographical information systems (GIS), statistical analysis and modelling. The odds ratio (OR) of bivariate and multivariate logistic regression was used to evaluate the relative importance of environmental factors, accepting P ≤ 0.056 as the inclusion level for the model's environmental variables. When estimating risk from the viewpoint of geographical surface, slope, elevation and annual rainfall were found to be good predictors of CL presence based on both probabilistic and weighted overlay approaches. However, when considering Ethiopia as a whole, a minor difference was observed between the two methods, with the probabilistic technique giving a 22.5% estimate, while that of the weighted overlay approach was 19.5%. Calculating the population according to the land surface estimated by the latter method, the total Ethiopian population at risk for CL was estimated at 28,955,035, mainly including people in the highlands of the regional states of Amhara, Oromia, Tigray and the Southern Nations, Nationalities and Peoples’ Region, one of the nine ethnic divisions in Ethiopia. Our environmental risk model provided an overall prediction accuracy of 90.4%. The approach proposed here can be replicated for other diseases to facilitate the implementation of evidence-based, integrated disease control activities.

  1. Thermo-mechanical behaviour modelling of particle fuels using a multi-scale approach

    International Nuclear Information System (INIS)

    Blanc, V.

    2009-12-01

    Particle fuels are made of a few thousand spheres, one millimeter in diameter, composed of uranium oxide coated with confinement layers, which are embedded in a graphite matrix to form the fuel element. The aim of this study is to develop a new simulation tool for the thermo-mechanical behaviour of those fuels under irradiation which is able to predict finely the local loadings on the particles. We chose to use the FE² (finite element squared) method, in which two different discretization scales are used: a macroscopic homogeneous structure whose properties at each integration point are computed on a second, heterogeneous microstructure, the Representative Volume Element (RVE). The first part of this work concerns the definition of this RVE. A morphological indicator based on the minimal distance between sphere centers permits selecting random sets of microstructures. The elastic macroscopic response of the RVE, computed by finite elements, has been compared to an analytical model. Thermal and mechanical representativeness indicators of local loadings have been built from the particle failure modes. A statistical study of those criteria on a hundred RVEs showed the importance of choosing a representative microstructure. In this perspective, an empirical model linking the morphological indicator to the mechanical indicator has been developed. The second part of the work deals with the two-scale transition method, which is based on periodic homogenization. Considering a linear thermal problem with a heat source under steady-state conditions, it was shown that the heterogeneity of the heat source requires a second-order method to localize the thermal field finely. The mechanical non-linear problem has been treated by using the iterative Cast3M algorithm, substituting a finite element computation on the RVE for the integration of the behaviour law. This algorithm has been validated and coupled with the thermal resolution in order to compute a radiation loading. A computation on a complete fuel element
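
    For context, the standard scale-transition relations that FE²-type methods enforce (our notation): the macroscopic stress at an integration point is the volume average of the microscopic field over the RVE V, and the Hill-Mandel condition guarantees energetic consistency between the scales:

        \[
        \bar{\boldsymbol{\sigma}} = \frac{1}{|V|} \int_{V} \boldsymbol{\sigma}\, \mathrm{d}V,
        \qquad
        \bar{\boldsymbol{\sigma}} : \delta\bar{\boldsymbol{\varepsilon}}
          = \frac{1}{|V|} \int_{V} \boldsymbol{\sigma} : \delta\boldsymbol{\varepsilon}\, \mathrm{d}V .
        \]

    In practice each macroscopic integration point thus drives a full boundary-value problem on the RVE, which is why the representativeness of the chosen microstructure matters so much.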

  2. Computational and Game-Theoretic Approaches for Modeling Bounded Rationality

    NARCIS (Netherlands)

    L. Waltman (Ludo)

    2011-01-01

    This thesis studies various computational and game-theoretic approaches to economic modeling. Unlike traditional approaches to economic modeling, the approaches studied in this thesis do not rely on the assumption that economic agents behave in a fully rational way. Instead, economic

  3. What's statistical about learning? Insights from modelling statistical learning as a set of memory processes.

    Science.gov (United States)

    Thiessen, Erik D

    2017-01-05

    Statistical learning has been studied in a variety of different tasks, including word segmentation, object identification, category learning, artificial grammar learning and serial reaction time tasks (e.g. Saffran et al. 1996 Science 274, 1926-1928; Orban et al. 2008 Proceedings of the National Academy of Sciences 105, 2745-2750; Thiessen & Yee 2010 Child Development 81, 1287-1303; Saffran 2002 Journal of Memory and Language 47, 172-196; Misyak & Christiansen 2012 Language Learning 62, 302-331). The difference among these tasks raises questions about whether they all depend on the same kinds of underlying processes and computations, or whether they are tapping into different underlying mechanisms. Prior theoretical approaches to statistical learning have often tried to explain or model learning in a single task. However, in many cases these approaches appear inadequate to explain performance in multiple tasks. For example, explaining word segmentation via the computation of sequential statistics (such as transitional probability) provides little insight into the nature of sensitivity to regularities among simultaneously presented features. In this article, we will present a formal computational approach that we believe is a good candidate to provide a unifying framework to explore and explain learning in a wide variety of statistical learning tasks. This framework suggests that statistical learning arises from a set of processes that are inherent in memory systems, including activation, interference, integration of information and forgetting (e.g. Perruchet & Vinter 1998 Journal of Memory and Language 39, 246-263; Thiessen et al. 2013 Psychological Bulletin 139, 792-814). From this perspective, statistical learning does not involve explicit computation of statistics, but rather the extraction of elements of the input into memory traces, and subsequent integration across those memory traces that emphasize consistent information (Thiessen and Pavlik
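
    The sequential statistic mentioned here, transitional probability, is easy to state concretely: TP(y|x) = freq(xy)/freq(x). A toy syllable stream in the style of the segmentation studies cited above (our illustration):

        from collections import Counter

        stream = "golabupadotigolabutupiropadotigolabu"
        sylls = [stream[i:i + 2] for i in range(0, len(stream), 2)]

        pair_counts = Counter(zip(sylls, sylls[1:]))
        first_counts = Counter(sylls[:-1])
        tp = {(x, y): n / first_counts[x] for (x, y), n in pair_counts.items()}
        # e.g. tp[('go', 'la')] == 1.0 (within word), tp[('bu', 'pa')] == 0.5
        # (between words): the dip in TP marks a candidate word boundary.

    The article's point is that such explicit statistics can instead emerge from memory traces and their integration, rather than being computed directly.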

  4. Statistical comparison of a hybrid approach with approximate and exact inference models for Fusion 2+

    Science.gov (United States)

    Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew

    2007-04-01

    One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian Networks, and Possibility Theory, in the form of Fuzzy Logic systems, has recently been introduced to provide a rigorous framework for high-level inference. In previous research, the theoretical basis and benefits of the hybrid approach have been developed. However, a concrete experimental comparison of the hybrid framework with traditional fusion methods, demonstrating and quantifying this benefit, has been lacking. The goal of this research, therefore, is to provide a statistical comparison of the accuracy and performance of hybrid network theory with pure Bayesian and Fuzzy systems and with an inexact Bayesian system approximated using Particle Filtering. To accomplish this task, domain specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo Simulation, in comparison to situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed, to quantify the benefit of hybrid inference over other fusion tools.

  5. A game theoretic approach to a finite-time disturbance attenuation problem

    Science.gov (United States)

    Rhee, Ihnseok; Speyer, Jason L.

    1991-01-01

    A disturbance attenuation problem over a finite-time interval is considered by a game theoretic approach where the control, restricted to a function of the measurement history, plays against adversaries composed of the process and measurement disturbances, and the initial state. A zero-sum game, formulated as a quadratic cost criterion subject to linear time-varying dynamics and measurements, is solved by a calculus of variation technique. By first maximizing the quadratic cost criterion with respect to the process disturbance and initial state, a full information game between the control and the measurement residual subject to the estimator dynamics results. The resulting solution produces an n-dimensional compensator which expresses the controller as a linear combination of the measurement history. A disturbance attenuation problem is solved based on the results of the game problem. For time-invariant systems it is shown that under certain conditions the time-varying controller becomes time-invariant on the infinite-time interval. The resulting controller satisfies an H(infinity) norm bound.

  6. Giant thermo-optical relaxation oscillations in millimeter-size whispering gallery mode disk resonators.

    Science.gov (United States)

    Diallo, Souleymane; Lin, Guoping; Chembo, Yanne K

    2015-08-15

    In this Letter, we show that giant thermo-optical oscillations can be triggered in millimeter (mm)-size whispering gallery mode (WGM) disk resonators when they are pumped by a resonant continuous-wave laser. Our resonator is an ultrahigh-Q barium fluoride cavity that features a positive thermo-optic coefficient and a negative thermo-elastic coefficient. We demonstrate for the first time, to our knowledge, that the complex interplay between these two thermal coefficients and the intrinsic Kerr nonlinearity yields very sharp slow-fast relaxation oscillations with a slow timescale that can be exceptionally large, typically of the order of 1 s. We use a time-domain model to gain insight into this instability, and we find that the experimental and theoretical results are in excellent agreement. The understanding of these thermal effects is an essential requirement for every WGM-related application and our study demonstrates that even in the case of mm-size resonators, such effects can still be accurately analyzed using nonlinear time-domain models.

  7. A four stage approach for ontology-based health information system design.

    Science.gov (United States)

    Kuziemsky, Craig E; Lau, Francis

    2010-11-01

    To describe and illustrate a four stage methodological approach to capture user knowledge in a biomedical domain area, use that knowledge to design an ontology, and then implement and evaluate the ontology as a health information system (HIS). A hybrid participatory design-grounded theory (GT-PD) method was used to obtain data and code them for ontology development. Prototyping was used to implement the ontology as a computer-based tool. Usability testing evaluated the computer-based tool. An empirically derived domain ontology and set of three problem-solving approaches were developed as a formalized model of the concepts and categories from the GT coding. The ontology and problem-solving approaches were used to design and implement a HIS that tested favorably in usability testing. The four stage approach illustrated in this paper is useful for designing and implementing an ontology as the basis for a HIS. The approach extends existing ontology development methodologies by providing an empirical basis for theory incorporated into ontology design. Copyright © 2010 Elsevier B.V. All rights reserved.

  8. Toward a risk-based approach to the assessment of the surety of information systems

    Energy Technology Data Exchange (ETDEWEB)

    Wyss, G.D.; Fletcher, S.K.; Halbgewachs, R.D.; Jansma, R.M.; Lim, J.J.; Murphy, M.; Sands, P.D.

    1995-03-01

    Traditional approaches to the assessment of information systems have treated system security, system reliability, data integrity, and application functionality as separate disciplines. However, each area's requirements and solutions have a profound impact on the successful implementation of the other areas. A better approach is to assess the "surety" of an information system, which is defined as ensuring the "correct" operation of an information system by incorporating appropriate levels of safety, functionality, confidentiality, availability, and integrity. Information surety examines the combined impact of design alternatives on all of these areas. We propose a modelling approach that combines aspects of fault trees and influence diagrams for assessing information surety requirements under a risk assessment framework. This approach allows tradeoffs to be based on quantitative importance measures such as risk reduction while maintaining the modelling flexibility of the influence diagram paradigm. This paper presents an overview of the modelling method and a sample application problem.

  9. LPI Optimization Framework for Target Tracking in Radar Network Architectures Using Information-Theoretic Criteria

    Directory of Open Access Journals (Sweden)

    Chenguang Shi

    2014-01-01

    Full Text Available Widely distributed radar network architectures can provide significant performance improvement for target detection and localization. For a fixed radar network, the achievable target detection performance may go beyond a predetermined threshold with full transmitted power allocation, which is extremely vulnerable in modern electronic warfare. In this paper, we study the problem of low probability of intercept (LPI design for radar network and propose two novel LPI optimization schemes based on information-theoretic criteria. For a predefined threshold of target detection, Schleher intercept factor is minimized by optimizing transmission power allocation among netted radars in the network. Due to the lack of analytical closed-form expression for receiver operation characteristics (ROC, we employ two information-theoretic criteria, namely, Bhattacharyya distance and J-divergence as the metrics for target detection performance. The resulting nonconvex and nonlinear LPI optimization problems associated with different information-theoretic criteria are cast under a unified framework, and the nonlinear programming based genetic algorithm (NPGA is used to tackle the optimization problems in the framework. Numerical simulations demonstrate that our proposed LPI strategies are effective in enhancing the LPI performance for radar network.
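
    The two information-theoretic surrogates used in place of the ROC have standard definitions (our notation), with p_0 and p_1 the observation densities under the target-absent and target-present hypotheses:

        \[
        B(p_0, p_1) = -\ln \int \sqrt{p_0(x)\, p_1(x)}\, \mathrm{d}x, \qquad
        J(p_0, p_1) = \int \big(p_1(x) - p_0(x)\big)\,
          \ln \frac{p_1(x)}{p_0(x)}\, \mathrm{d}x .
        \]

    Both grow as the hypotheses become more separable, so holding the detection performance above a threshold translates into a lower bound on B or J while the transmit power, and hence the Schleher intercept factor, is minimized.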

  10. SOCIOLOGICAL UNDERSTANDING OF INTERNET: THEORETICAL APPROACHES TO THE NETWORK ANALYSIS

    Directory of Open Access Journals (Sweden)

    D. E. Dobrinskaya

    2016-01-01

    Full Text Available The network is an efficient way of analysing social structure for contemporary sociologists. It gives broad opportunities for detailed and fruitful research on different patterns of ties and social relations through quantitative analytical methods and the visualization of network models. The network metaphor is used as the most representative tool for describing a new type of society, characterized by flexibility, decentralization and individualization. The network organizational form has become the dominant form in modern societies. The network is also used as a mode of inquiry. Three theoretical network approaches are the most relevant to Internet research: social network analysis, “network society” theory and actor-network theory. Each theoretical approach has its own notion of the network, and their distinct methodological and theoretical features contribute to Internet studies in different ways. The article presents a brief overview of these network approaches. The overview demonstrates the absence of a unified semantic space for the category of “network”. This fact, in turn, points to the need for a detailed analysis of these approaches to reveal their theoretical and empirical possibilities in application to Internet studies.

  11. Preservation of Newspapers: Theoretical Approaches and Practical Achievements

    Science.gov (United States)

    Hasenay, Damir; Krtalic, Maja

    2010-01-01

    The preservation of newspapers is the main topic of this paper. A theoretical overview of newspaper preservation is given, with an emphasis on the importance of a systematic and comprehensive approach. Efficient newspaper preservation implies understanding the meaning of preservation in general, as well as understanding specific approaches,…

  12. Magnetic resonance guided focalized ultrasound thermo-ablation: A promising oncologic local therapy

    International Nuclear Information System (INIS)

    Iannessi, A.; Doyen, J.; Leysalle, A.; Thyss, A.

    2014-01-01

    Pain management of bone metastases usually relies on systemic and local therapy. Even though radiation is nowadays the gold standard for painful metastases, innovative minimally invasive treatment approaches have been developed because some patients do not respond [1]. Indeed, cementoplasty and thermo-ablations such as radiofrequency or cryotherapy have been shown to be effective against pain [2-4]. Among thermo-therapies, magnetic resonance guided focalized ultrasound is now a new non-invasive option for bone pain palliation. (authors)

  13. Information trimming: Sufficient statistics, mutual information, and predictability from effective channel states

    Science.gov (United States)

    James, Ryan G.; Mahoney, John R.; Crutchfield, James P.

    2017-06-01

    One of the most basic characterizations of the relationship between two random variables, X and Y , is the value of their mutual information. Unfortunately, calculating it analytically and estimating it empirically are often stymied by the extremely large dimension of the variables. One might hope to replace such a high-dimensional variable by a smaller one that preserves its relationship with the other. It is well known that either X (or Y ) can be replaced by its minimal sufficient statistic about Y (or X ) while preserving the mutual information. While intuitively reasonable, it is not obvious or straightforward that both variables can be replaced simultaneously. We demonstrate that this is in fact possible: the information X 's minimal sufficient statistic preserves about Y is exactly the information that Y 's minimal sufficient statistic preserves about X . We call this procedure information trimming. As an important corollary, we consider the case where one variable is a stochastic process' past and the other its future. In this case, the mutual information is the channel transmission rate between the channel's effective states. That is, the past-future mutual information (the excess entropy) is the amount of information about the future that can be predicted using the past. Translating our result about minimal sufficient statistics, this is equivalent to the mutual information between the forward- and reverse-time causal states of computational mechanics. We close by discussing multivariate extensions to this use of minimal sufficient statistics.
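
    The core claim above — that replacing a variable by its minimal sufficient statistic preserves mutual information — can be checked numerically on a small joint distribution. In the Python sketch below (the joint pmf is an illustrative assumption), two x-outcomes share the same conditional p(y|x), so merging them is exactly the trimming step, and the mutual information is unchanged:

        import numpy as np

        def mutual_information(p_xy):
            # I(X;Y) in bits from a joint pmf (rows: x, cols: y)
            px = p_xy.sum(axis=1, keepdims=True)
            py = p_xy.sum(axis=0, keepdims=True)
            mask = p_xy > 0
            return np.sum(p_xy[mask] * np.log2(p_xy[mask] / (px @ py)[mask]))

        # Joint pmf in which x=0 and x=1 induce the same conditional p(y|x):
        # the minimal sufficient statistic merges them into one effective state
        p_xy = np.array([[0.10, 0.10],
                         [0.20, 0.20],
                         [0.30, 0.10]])

        trimmed = np.vstack([p_xy[0] + p_xy[1], p_xy[2]])  # merge equivalent rows
        print(mutual_information(p_xy), mutual_information(trimmed))  # equal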

  14. Excimer laser micropatterning of freestanding thermo-responsive hydrogel layers for cells-on-chip applications

    International Nuclear Information System (INIS)

    Santaniello, Tommaso; Milani, Paolo; Lenardi, Cristina; Martello, Federico; Tocchio, Alessandro; Gassa, Federico; Webb, Patrick

    2012-01-01

    We report a novel, reliable and repeatable manufacturing protocol for the realization of micro-patterned freestanding hydrogel layers based on thermo-responsive poly-(N-isopropyl)acrylamide (PNIPAAm), which have the potential to be employed as temperature-triggered smart surfaces for cells-on-chip applications. PNIPAAm-based films with controlled mechanical properties and different thicknesses (100–300 µm) were prepared by injection compression moulding at room temperature. A 9 × 9 array of 20 µm diameter through-holes is machined by means of a KrF excimer laser on dry PNIPAAm films which are physically attached to flat polyvinyl chloride (PVC) substrates. Machining parameters, such as fluence and number of shots, are optimized in order to achieve highly resolved features. Micro-structured freestanding films are then easily obtained after the hydrogels are detached from the PVC by gradually promoting film swelling in ethanol. In the water-swollen state of PNIPAAm, the machined holes’ diameter approaches a slightly larger value (30 µm), in accordance with the measured hydrogel swelling ratio. Thermo-responsive behaviour and through-hole tapering are characterized by metrology measurements using an optical inverted and a confocal microscope setup, respectively. After the temperature of the freestanding films is raised above 32 °C, we observe that the whole through-hole array shrinks, reducing the holes’ diameter to less than half its original size (about 15 µm) as a consequence of film dehydration. Different holes’ diameters (10 and 30 µm) are also obtained on dry hydrogel employing suitable projection masks, showing similar shrinking behaviour when hydrated and subjected to thermo-response tests. Thermo-responsive PNIPAAm-based freestanding layers could then be integrated with other suitable micro-fabricated thermoplastic components in order to preliminarily test their feasibility in operating as temperature

  15. Information-theoretic semi-supervised metric learning via entropy regularization.

    Science.gov (United States)

    Niu, Gang; Dai, Bo; Yamada, Makoto; Sugiyama, Masashi

    2014-08-01

    We propose a general information-theoretic approach to semi-supervised metric learning called SERAPH (SEmi-supervised metRic leArning Paradigm with Hypersparsity) that does not rely on the manifold assumption. Given the probability parameterized by a Mahalanobis distance, we maximize its entropy on labeled data and minimize its entropy on unlabeled data following entropy regularization. For metric learning, entropy regularization improves manifold regularization by considering the dissimilarity information of unlabeled data in the unsupervised part, and hence it allows the supervised and unsupervised parts to be integrated in a natural and meaningful way. Moreover, we regularize SERAPH by trace-norm regularization to encourage low-dimensional projections associated with the distance metric. The nonconvex optimization problem of SERAPH could be solved efficiently and stably by either a gradient projection algorithm or an EM-like iterative algorithm whose M-step is convex. Experiments demonstrate that SERAPH compares favorably with many well-known metric learning methods, and the learned Mahalanobis distance possesses high discriminability even under noisy environments.

  16. Value of information-based inspection planning for offshore structures

    DEFF Research Database (Denmark)

    Irman, Arifian Agusta; Thöns, Sebastian; Leira, Bernt J.

    2017-01-01

    Asset integrity and management is an important part of the oil and gas industry, especially for existing offshore structures. With declining oil prices, the production rate is an important factor to be maintained, which makes the integrity of the structures one of the main concerns. Reliability-based … with each inspection strategy. A simplified and generic risk-based inspection planning utilizing pre-posterior Bayesian decision analysis has been proposed by Faber et al. [1] and Straub [2]. This paper provides considerations on the theoretical background and a Value of Information analysis …-based inspection planning. The paper starts out with a review of the state-of-the-art RBI planning procedure based on Bayesian decision theory and its application in offshore structure integrity management. An example of the Value of Information approach is illustrated, and further research is pointed to …

  17. Informed Systems: Enabling Collaborative Evidence Based Organizational Learning

    Directory of Open Access Journals (Sweden)

    Mary M. Somerville

    2015-12-01

    theoretical approach. Results – Over time and with practice, as co-workers design and enact information-focused and evidence based learning experiences, they learn the way to decision-making and action-taking. Increasingly more complex experiences of information exchange, sense making, and knowledge creation, well supported by workplace communication systems and professional practices, further dialogue and reflection and thereby enrich analysis and interpretation of complexities and interdependencies. Conclusions – Research projects and evaluation studies conducted since 2003 demonstrate the transformative potential of the holistic Informed Systems approach to creating robust workplace learning environments. Leaders are responsible for design of workplace environments supportive of well contextualized, information-rich conversations. Co-workers revisit both the nature of organizational information and the purpose of organizational work. As colleagues better understand the complexities of the organization and its situation, they learn to diagnose problems and identify consequences, guided by Informed Systems models. Systemic activity and process models activate collaborative evidence based information processes within enabling conditions for thought leadership and workplace learning that recognize learning is social. Enabling communication systems and professional practices therefore intentionally catalyze and support collegial inquiry to co-create information experiences and organizational knowledge through evidence based practice to enliven capacity, inform decisions, produce improvements, and sustain relationships. The Informed Systems approach is thereby a contribution to professional practice and workplace renewal through evidence based decision-making and action-taking in contemporary organizations.

  18. Dramaturgical and Music-Theoretical Approaches to Improvisation Pedagogy

    Science.gov (United States)

    Huovinen, Erkki; Tenkanen, Atte; Kuusinen, Vesa-Pekka

    2011-01-01

    The aim of this article is to assess the relative merits of two approaches to teaching musical improvisation: a music-theoretical approach, focusing on chords and scales, and a "dramaturgical" one, emphasizing questions of balance, variation and tension. Adult students of music pedagogy, with limited previous experience in improvisation,…

  19. Reliable fault detection and diagnosis of photovoltaic systems based on statistical monitoring approaches

    KAUST Repository

    Harrou, Fouzi; Sun, Ying; Taghezouit, Bilal; Saidi, Ahmed; Hamlati, Mohamed-Elkarim

    2017-01-01

    This study reports the development of an innovative fault detection and diagnosis scheme to monitor the direct current (DC) side of photovoltaic (PV) systems. Towards this end, we propose a statistical approach that exploits the advantages of one

  20. Textual information access statistical models

    CERN Document Server

    Gaussier, Eric

    2013-01-01

    This book presents statistical models that have recently been developed within several research communities to access information contained in text collections. The problems considered are linked to applications aiming at facilitating information access: information extraction and retrieval; text classification and clustering; opinion mining; comprehension aids (automatic summarization, machine translation, visualization). In order to give the reader as complete a description as possible, the focus is placed on the probability models used in the applications.

  1. Statistical decisions under nonparametric a priori information

    International Nuclear Information System (INIS)

    Chilingaryan, A.A.

    1985-01-01

    The basic module of an applied program package for statistical analysis of the ANI experiment data is described. By means of this module, the tasks of choosing the theoretical model that most adequately fits the experimental data, selecting events of a definite type, and identifying elementary particles are carried out. To solve these problems, Bayesian rules, the leave-one-out test and KNN (K Nearest Neighbour) adaptive density estimation are utilized.
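
    A minimal sketch of the kind of KNN adaptive density estimation mentioned above, applied to a Bayesian two-class decision with equal priors (the one-dimensional data and all parameter values are illustrative assumptions, not the package's actual code):

        import numpy as np

        def knn_density(x, sample, k=10):
            # k-NN density estimate in 1D: p(x) ~ k / (n * volume of the
            # smallest interval around x that contains k sample points)
            r = np.sort(np.abs(sample - x))[k - 1]   # distance to k-th neighbour
            return k / (len(sample) * 2 * r)

        rng = np.random.default_rng(0)
        sig = rng.normal(2.0, 0.5, 1000)   # "signal" training events
        bkg = rng.normal(0.0, 1.0, 1000)   # "background" training events

        x = 1.2
        # Bayes rule with equal priors: classify by the larger class density
        p_sig, p_bkg = knn_density(x, sig), knn_density(x, bkg)
        print("signal" if p_sig > p_bkg else "background", p_sig, p_bkg)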

  2. A generalised porous medium approach to study thermo-fluid dynamics in human eyes.

    Science.gov (United States)

    Mauro, Alessandro; Massarotti, Nicola; Salahudeen, Mohamed; Romano, Mario R; Romano, Vito; Nithiarasu, Perumal

    2018-03-22

    The present work describes the application of the generalised porous medium model to study heat and fluid flow in healthy and glaucomatous eyes of different subject specimens, considering the presence of ocular cavities and porous tissues. The 2D computational model, implemented into the open-source software OpenFOAM, has been verified against benchmark data for mixed convection in domains partially filled with a porous medium. The verified model has been employed to simulate the thermo-fluid dynamic phenomena occurring in the anterior section of four patient-specific human eyes, considering the presence of the anterior chamber (AC), trabecular meshwork (TM), Schlemm's canal (SC), and collector channels (CC). The computational domains of the eye are extracted from tomographic images. The dependence of TM porosity and permeability on intraocular pressure (IOP) has been analysed in detail, and the differences between healthy and glaucomatous eye conditions have been highlighted, proving that the different physiological conditions of patients have a significant influence on the thermo-fluid dynamic phenomena. The influence of different eye positions (supine and standing) on thermo-fluid dynamic variables has also been investigated: results are presented in terms of velocity, pressure, temperature, friction coefficient and local Nusselt number. The results clearly indicate that porosity and permeability of the TM are two important parameters that affect eye pressure distribution. Graphical abstract: Velocity contours and vectors for healthy eyes (top) and glaucomatous eyes (bottom) in the standing position.

  3. Insights into the Mechanism and Kinetics of Thermo-Oxidative Degradation of HFPE High Performance Polymer.

    Science.gov (United States)

    Kunnikuruvan, Sooraj; Parandekar, Priya V; Prakash, Om; Tsotsis, Thomas K; Nair, Nisanth N

    2016-06-02

    The growing need for materials having high thermo-oxidative stability makes the design and development of high performance materials an active area of research. Fluorination of the polymer backbone is a widely applied strategy to improve various properties of a polymer, most importantly its thermo-oxidative stability. Many of these fluorinated polymers are known to have thermo-oxidative stability up to 700 K. However, for space and aerospace applications, it is important to improve their thermo-oxidative stability beyond 700 K. Molecular-level details of the thermo-oxidative degradation of such polymers can provide vital information for improving them. In this spirit, we have applied quantum mechanical and microkinetic analysis to scrutinize the mechanism and kinetics of the thermo-oxidative degradation of a fluorinated polymer with a phenylethenyl end-cap, HFPE. This study gives insight into the thermo-oxidative degradation of HFPE and explains most of the experimental observations on the thermo-oxidative degradation of this polymer. Thermolysis of the C-CF3 bond in the dianhydride component (6FDA) of HFPE is found to be the rate-determining step of the degradation. The reaction pathways responsible for the experimentally observed weight loss of the polymer are also scrutinized. On the basis of these results, we propose a modification of the HFPE polymer to improve its thermo-oxidative stability.

  4. The Metabolic Basis of Pollen Thermo-Tolerance: Perspectives for Breeding

    Directory of Open Access Journals (Sweden)

    Marine J. Paupière

    2014-09-01

    Full Text Available Crop production is highly sensitive to elevated temperatures. A rise of a few degrees above the optimum growing temperature can lead to a dramatic yield loss. A predicted increase of 1–3 degrees in the twenty-first century urges breeders to develop thermo-tolerant crops, i.e. crops tolerant to high temperatures. Breeding for thermo-tolerance is a challenge due to the low heritability of this trait. A better understanding of heat stress tolerance and the development of reliable methods to phenotype thermo-tolerance are key factors for a successful breeding approach. Plant reproduction is the most temperature-sensitive process in the plant life cycle. More precisely, pollen quality is strongly affected by heat stress conditions. High temperature leads to a decrease of pollen viability which is directly correlated with a loss of fruit production. The reduction in pollen viability is associated with changes in the level and composition of several (groups of) metabolites, which play an important role in pollen development, for example by contributing to pollen nutrition or by providing protection against environmental stresses. This review aims to underline the importance of maintaining metabolite homeostasis during pollen development, in order to produce mature and fertile pollen under high temperature. The review will give an overview of the current state of the art on the role of various pollen metabolites in pollen homeostasis and thermo-tolerance. Their possible use as metabolic markers to assist breeding programs for plant thermo-tolerance will be discussed.

  5. The metabolic basis of pollen thermo-tolerance: perspectives for breeding.

    Science.gov (United States)

    Paupière, Marine J; van Heusden, Adriaan W; Bovy, Arnaud G

    2014-09-30

    Crop production is highly sensitive to elevated temperatures. A rise of a few degrees above the optimum growing temperature can lead to a dramatic yield loss. A predicted increase of 1–3 degrees in the twenty-first century urges breeders to develop thermo-tolerant crops, i.e. crops tolerant to high temperatures. Breeding for thermo-tolerance is a challenge due to the low heritability of this trait. A better understanding of heat stress tolerance and the development of reliable methods to phenotype thermo-tolerance are key factors for a successful breeding approach. Plant reproduction is the most temperature-sensitive process in the plant life cycle. More precisely, pollen quality is strongly affected by heat stress conditions. High temperature leads to a decrease of pollen viability which is directly correlated with a loss of fruit production. The reduction in pollen viability is associated with changes in the level and composition of several (groups of) metabolites, which play an important role in pollen development, for example by contributing to pollen nutrition or by providing protection against environmental stresses. This review aims to underline the importance of maintaining metabolite homeostasis during pollen development, in order to produce mature and fertile pollen under high temperature. The review will give an overview of the current state of the art on the role of various pollen metabolites in pollen homeostasis and thermo-tolerance. Their possible use as metabolic markers to assist breeding programs for plant thermo-tolerance will be discussed.

  6. The success or failure of management information systems: A theoretical approach

    Energy Technology Data Exchange (ETDEWEB)

    Curlee, T.R.; Tonn, B.T.

    1987-03-01

    Work has been done by various disciplines to address the reasons why modern, computerized management information systems either succeed or fail. However, the studies are not based on a well-defined conceptual framework and the focus has been narrow. This report presents a comprehensive conceptual framework of how an information system is used within an organization. This framework not only suggests how the use of an information system may translate into productivity improvements for the implementing organization but also helps to identify why a system may succeed or fail. A major aspect of the model is its distinction between the objectives of the organization in its decision to implement an information system and the objectives of the individual employees who are to use the system. A divergence between these objectives can lead to system underutilization or misuse at the expense of the organization's overall productivity.

  7. A thermo-responsive and photo-polymerizable chondroitin sulfate-based hydrogel for 3D printing applications.

    Science.gov (United States)

    Abbadessa, A; Blokzijl, M M; Mouser, V H M; Marica, P; Malda, J; Hennink, W E; Vermonden, T

    2016-09-20

    The aim of this study was to design a hydrogel system based on methacrylated chondroitin sulfate (CSMA) and a thermo-sensitive poly(N-(2-hydroxypropyl) methacrylamide-mono/dilactate)-polyethylene glycol triblock copolymer (M15P10) as a suitable material for additive manufacturing of scaffolds. CSMA was synthesized by reaction of chondroitin sulfate with glycidyl methacrylate (GMA) in dimethylsulfoxide at 50 °C and its degree of methacrylation was tunable up to 48.5%, by changing reaction time and GMA feed. Unlike polymer solutions composed of CSMA alone (20% w/w), mixtures based on 2% w/w of CSMA and 18% of M15P10 showed strain-softening, thermo-sensitive and shear-thinning properties more pronounced than those found for polymer solutions based on M15P10 alone. Additionally, they displayed a yield stress of 19.2 ± 7.0 Pa. The 3D printing of this hydrogel resulted in the generation of constructs with tailorable porosity and good handling properties. Finally, embedded chondrogenic cells remained viable and proliferating over a culture period of 6 days. The hydrogel described herein represents a promising biomaterial for cartilage 3D printing applications. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Statistical analysis of management data

    CERN Document Server

    Gatignon, Hubert

    2013-01-01

    This book offers a comprehensive approach to multivariate statistical analyses. It provides theoretical knowledge of the concepts underlying the most important multivariate techniques and an overview of actual applications.

  9. Modelling diversity in building occupant behaviour: a novel statistical approach

    DEFF Research Database (Denmark)

    Haldi, Frédéric; Calì, Davide; Andersen, Rune Korsholm

    2016-01-01

    We propose an advanced modelling framework to predict the scope and effects of behavioural diversity regarding building occupant actions on window openings, shading devices and lighting. We develop a statistical approach based on generalised linear mixed models to account for the longitudinal nat...
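
    A minimal sketch of a random-intercept mixed model of the kind described, using statsmodels (the data, variable names and the continuous response are illustrative stand-ins; the paper's models are generalised linear mixed models for actions such as window opening):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(4)
        # Hypothetical longitudinal data: daily shading-device usage per occupant,
        # with occupant-specific (random) intercepts capturing behavioural diversity
        occ = np.repeat(np.arange(30), 50)
        temp = rng.uniform(15, 30, occ.size)
        usage = (0.2 + 0.02 * temp
                 + rng.normal(0, 0.1, 30)[occ]          # per-occupant intercept
                 + rng.normal(0, 0.05, occ.size))        # residual noise
        df = pd.DataFrame({"occupant": occ, "temp": temp, "usage": usage})

        # Random-intercept mixed model: fixed effect of temperature,
        # random effect per occupant (a linear stand-in for the paper's GLMMs)
        model = smf.mixedlm("usage ~ temp", df, groups=df["occupant"]).fit()
        print(model.summary())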

  10. Bearing Fault Diagnosis Based on Statistical Locally Linear Embedding.

    Science.gov (United States)

    Wang, Xiang; Zheng, Yuan; Zhao, Zhenzhou; Wang, Jinping

    2015-07-06

    Fault diagnosis is essentially a kind of pattern recognition. The measured signal samples usually lie on nonlinear low-dimensional manifolds embedded in the high-dimensional signal space, so implementing feature extraction and dimensionality reduction to improve recognition performance is a crucial task. In this paper a novel machinery fault diagnosis approach is proposed, based on a statistical locally linear embedding (S-LLE) algorithm that extends LLE by exploiting fault class label information. The approach first extracts intrinsic manifold features from the high-dimensional feature vectors obtained from vibration signals by time-domain, frequency-domain and empirical mode decomposition (EMD) feature extraction, and then translates the complex mode space into a salient low-dimensional feature space by the manifold learning algorithm S-LLE, which outperforms other feature reduction methods such as PCA, LDA and LLE. Finally, pattern classification and fault diagnosis are carried out easily and rapidly by a classifier in the reduced feature space. Rolling bearing fault signals are used to validate the proposed fault diagnosis approach. The results indicate that the proposed approach obviously improves the classification performance of fault pattern recognition and outperforms the other traditional approaches.
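
    Since S-LLE itself is the authors' supervised extension, the sketch below uses plain (unsupervised) LLE from scikit-learn followed by a k-NN classifier in the reduced space; the synthetic 60-dimensional features stand in for the vibration-signal features and are purely illustrative:

        import numpy as np
        from sklearn.manifold import LocallyLinearEmbedding
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        # Stand-in for high-dimensional vibration features (e.g. time-domain,
        # frequency-domain and EMD statistics): 3 fault classes, 60-D vectors
        X = np.vstack([rng.normal(c, 1.0, (100, 60)) for c in (0.0, 1.0, 2.0)])
        y = np.repeat([0, 1, 2], 100)

        # Unsupervised LLE stands in for S-LLE, which additionally uses labels
        lle = LocallyLinearEmbedding(n_neighbors=12, n_components=3)
        Z = lle.fit_transform(X)

        Z_tr, Z_te, y_tr, y_te = train_test_split(Z, y, random_state=0)
        clf = KNeighborsClassifier(n_neighbors=5).fit(Z_tr, y_tr)
        print("accuracy in reduced space:", clf.score(Z_te, y_te))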

  11. EVOLUTION OF THEORETICAL APPROACHES TO THE DEFINITION OF THE CATEGORY “PERSONNEL POTENTIAL”

    Directory of Open Access Journals (Sweden)

    Аlexandra Deshchenko

    2016-02-01

    Full Text Available The article describes the evolution of theoretical approaches to the definition of the category «personnel potential», based on an analysis of approaches to defining the conceptual apparatus of labor economics, including such categories as labor force, labor resources, labor potential, human resources and human capital, as treated by different authors. The evolution of these terms is analysed in accordance with the stages of development of a society.

  12. Theoretical bases analysis of scientific prediction on marketing principles

    OpenAIRE

    A.S. Rosohata

    2012-01-01

    The article presents an overview of the categorical apparatus of scientific prediction and of the theoretical foundations of scientific forecasting, which are an integral part of the effective management of economic activities. Approaches to prediction taken by scientists in different fields of social science are reviewed, and modifications of the categories of scientific prediction based on the principles of marketing are proposed.

  13. Information-Theoretic Perspectives on Geophysical Models

    Science.gov (United States)

    Nearing, Grey

    2016-04-01

    practice of science (except by Gong et al., 2013, whose fundamental insight is the basis for this talk), and here I offer two examples of practical methods that scientists might use to approximately measure ontological information. I place this practical discussion in the context of several recent and high-profile experiments that have found that simple out-of-sample statistical models typically (vastly) outperform our most sophisticated terrestrial hydrology models. I offer some perspective on several open questions about how to use these findings to improve our models and understanding of these systems. Cartwright, N. (1983) How the Laws of Physics Lie. New York, NY: Cambridge Univ Press. Clark, M. P., Kavetski, D. and Fenicia, F. (2011) 'Pursuing the method of multiple working hypotheses for hydrological modeling', Water Resources Research, 47(9). Cover, T. M. and Thomas, J. A. (1991) Elements of Information Theory. New York, NY: Wiley-Interscience. Cox, R. T. (1946) 'Probability, frequency and reasonable expectation', American Journal of Physics, 14, pp. 1-13. Csiszár, I. (1972) 'A Class of Measures of Informativity of Observation Channels', Periodica Mathematica Hungarica, 2(1), pp. 191-213. Davies, P. C. W. (1990) 'Why is the physical world so comprehensible', Complexity, entropy and the physics of information, pp. 61-70. Gong, W., Gupta, H. V., Yang, D., Sricharan, K. and Hero, A. O. (2013) 'Estimating Epistemic & Aleatory Uncertainties During Hydrologic Modeling: An Information Theoretic Approach', Water Resources Research, 49(4), pp. 2253-2273. Jaynes, E. T. (2003) Probability Theory: The Logic of Science. New York, NY: Cambridge University Press. Nearing, G. S. and Gupta, H. V. (2015) 'The quantity and quality of information in hydrologic models', Water Resources Research, 51(1), pp. 524-538. Popper, K. R. (2002) The Logic of Scientific Discovery. New York: Routledge. Van Horn, K. S. (2003) 'Constructing a logic of plausible inference: a guide to cox's theorem

  14. Laser based thermo-conductometry as an approach to determine ribbon solid fraction off-line and in-line.

    Science.gov (United States)

    Wiedey, Raphael; Šibanc, Rok; Kleinebudde, Peter

    2018-06-06

    Ribbon solid fraction is one of the most important quality attributes in roll compaction/dry granulation. Accurate and precise determination is challenging, and no in-line measurement tool has yet been generally accepted. In this study, a new analytical tool with potential off-line as well as in-line applicability is described. It is based on the thermal conductivity of the compacted material, which is known to depend on the solid fraction. A laser diode was used to heat the ribbon at a single point, and the heat propagation was monitored by infrared thermography. After performing a Gaussian fit of the transverse ribbon profile, the scale parameter σ correlated with ribbon solid fraction in off-line as well as in-line studies. Accurate predictions of the solid fraction were possible for a relevant range of process settings. Drug stability was not affected, as could be demonstrated for the model drug nifedipine. The application of this technique was limited when using certain fillers and when working at higher roll speeds. This study showed the potential of this new technique and is a starting point for the additional work needed to overcome these challenges. Copyright © 2018 Elsevier B.V. All rights reserved.
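
    A minimal sketch of the Gaussian-fit step: fit a transverse temperature profile, extract the scale parameter σ, and map it to solid fraction through a calibration line (the profile data and calibration coefficients are invented for illustration; an actual calibration must come from off-line reference measurements):

        import numpy as np
        from scipy.optimize import curve_fit

        def gaussian(x, a, mu, sigma, offset):
            return a * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) + offset

        # Hypothetical transverse temperature profile across the ribbon (mm vs K)
        x = np.linspace(-5, 5, 101)
        true_sigma = 1.8   # broader spread => faster lateral heat conduction
        T = (gaussian(x, 4.0, 0.0, true_sigma, 295.0)
             + np.random.default_rng(2).normal(0, 0.05, x.size))

        p, _ = curve_fit(gaussian, x, T, p0=[3.0, 0.0, 1.0, 295.0])
        sigma_fit = abs(p[2])

        # Illustrative calibration: solid fraction assumed linear in sigma over
        # the calibrated range (coefficients are made up for this sketch)
        solid_fraction = 0.45 + 0.15 * sigma_fit
        print(sigma_fit, solid_fraction)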

  15. Monitor-Based Statistical Model Checking for Weighted Metric Temporal Logic

    DEFF Research Database (Denmark)

    Bulychev, Petr; David, Alexandre; Larsen, Kim Guldstrand

    2012-01-01

    We present a novel approach and implementation for analysing weighted timed automata (WTA) with respect to the weighted metric temporal logic (WMTL≤). Based on a stochastic semantics of WTAs, we apply statistical model checking (SMC) to estimate and test probabilities of satisfaction with desi…

  16. Did Life Emerge in Thermo-Acidic Conditions?

    Science.gov (United States)

    Holmes, D. S.

    2017-12-01

    There is widespread, but not unanimous, agreement that life emerged in hot conditions by exploiting redox and pH disequilibria found on early earth. Although there are several hypotheses to explain the postulated pH disequilibria, few of these consider that life evolved at very low pH (biological evolution. This presentation will evaluate the pros and cons of the hypothesis that the early evolution of life occurred in thermo-acidic conditions. Such environments are thought to have been abundant on early earth and were probably rich in hydrogen and soluble metals including iron and sulfur that could have served as sources and sinks of electrons. Extant thermo-acidophiles thrive in such conditions. Low pH environments are rich in protons that are the major drivers of energy conservation by coupling to phosphorylation in virtually all organisms on earth; this may be a "biochemical fossil" reflecting the use of protons (low pH) in primitive energy conservation. It has also been proposed that acidic conditions favored the evolution of an RNA world with expanded catalytic activities. On the other hand, the idea that life emerged in thermo-acidic conditions can be challenged because of the proposed difficulties of folding and stabilizing proteins simultaneously exposed to high temperature and low pH. In addition, although thermo-acidophiles root to the base of the phylogenetic tree of life, consistent with the proposition that they evolved early, yet there are problems of interpretation of their subsequent evolution that cloud this simplistic phylogenetic view. We propose solutions to these problems and hypothesize that life evolved in thermo-acidic conditions.

  17. Challenges and Approaches to Statistical Design and Inference in High Dimensional Investigations

    Science.gov (United States)

    Garrett, Karen A.; Allison, David B.

    2015-01-01

    Advances in modern technologies have facilitated high-dimensional experiments (HDEs) that generate tremendous amounts of genomic, proteomic, and other “omic” data. HDEs involving whole-genome sequences and polymorphisms, expression levels of genes, protein abundance measurements, and combinations thereof have become a vanguard for new analytic approaches to the analysis of HDE data. Such situations demand creative approaches to the processes of statistical inference, estimation, prediction, classification, and study design. The novel and challenging biological questions asked from HDE data have resulted in many specialized analytic techniques being developed. This chapter discusses some of the unique statistical challenges facing investigators studying high-dimensional biology, and describes some approaches being developed by statistical scientists. We have included some focus on the increasing interest in questions involving testing multiple propositions simultaneously, appropriate inferential indicators for the types of questions biologists are interested in, and the need for replication of results across independent studies, investigators, and settings. A key consideration inherent throughout is the challenge in providing methods that a statistician judges to be sound and a biologist finds informative.

  18. Challenges and approaches to statistical design and inference in high-dimensional investigations.

    Science.gov (United States)

    Gadbury, Gary L; Garrett, Karen A; Allison, David B

    2009-01-01

    Advances in modern technologies have facilitated high-dimensional experiments (HDEs) that generate tremendous amounts of genomic, proteomic, and other "omic" data. HDEs involving whole-genome sequences and polymorphisms, expression levels of genes, protein abundance measurements, and combinations thereof have become a vanguard for new analytic approaches to the analysis of HDE data. Such situations demand creative approaches to the processes of statistical inference, estimation, prediction, classification, and study design. The novel and challenging biological questions asked from HDE data have resulted in many specialized analytic techniques being developed. This chapter discusses some of the unique statistical challenges facing investigators studying high-dimensional biology and describes some approaches being developed by statistical scientists. We have included some focus on the increasing interest in questions involving testing multiple propositions simultaneously, appropriate inferential indicators for the types of questions biologists are interested in, and the need for replication of results across independent studies, investigators, and settings. A key consideration inherent throughout is the challenge in providing methods that a statistician judges to be sound and a biologist finds informative.

  19. A statistical-based approach for fault detection and diagnosis in a photovoltaic system

    KAUST Repository

    Garoudja, Elyes; Harrou, Fouzi; Sun, Ying; Kara, Kamel; Chouder, Aissa; Silvestre, Santiago

    2017-01-01

    This paper reports a development of a statistical approach for fault detection and diagnosis in a PV system. Specifically, the overarching goal of this work is to early detect and identify faults on the DC side of a PV system (e.g., short

  20. Statistical and theoretical research

    International Nuclear Information System (INIS)

    Anon.

    1983-01-01

    Significant accomplishments include the creation of field designs to detect population impacts, new census procedures for small mammals, and methods for designing studies to determine where, and in what quantity, a contaminant is present over certain landscapes. A book describing these statistical methods is currently being written and will apply to a variety of environmental contaminants, including radionuclides. PNL scientists also have devised an analytical method for predicting the success of field experiments on wild populations. Two highlights of current research are the discoveries that populations of free-roaming horse herds can double in four years and that grizzly bear populations may be substantially smaller than once thought. As stray horses become a public nuisance at DOE and other large Federal sites, it is important to determine their number. Similar statistical theory can be readily applied to other situations where wild animals are a problem of concern to other government agencies. Another book, on statistical aspects of radionuclide studies, is written specifically for researchers in radioecology

  1. Statistical modelling approach to derive quantitative nanowastes classification index; estimation of nanomaterials exposure

    CSIR Research Space (South Africa)

    Ntaka, L

    2013-08-01

    Full Text Available In this work, statistical inference approaches, specifically non-parametric bootstrapping and linear modelling, were applied. Data used to develop the model were sourced from the literature. 104 data points with information on aggregation, natural organic matter...

  2. A theoretical approach to sputtering due to molecular ion bombardment, 1

    International Nuclear Information System (INIS)

    Karashima, Shosuke; Ootoshi, Tsukuru; Kamiyama, Masahide; Kim, Pil-Hyon; Namba, Susumu.

    1981-01-01

    A shock wave model is proposed to explain theoretically the non-linear effects in sputtering phenomena under molecular ion bombardment. In this theory the sputtering processes are separated into two parts: one due to linear effects and the other due to non-linear effects. The treatment of the linear part is based on the statistical model of Schwarz and Helms concerning a broad range of atomic collision cascades. The non-linear part is treated by a shock-wave model of overlapping cascades, and useful equations for calculating the sputtering yields and the dynamical quantities in the system are derived. (author)

  3. Mission-profile-based stress analysis of bond-wires in SiC power modules

    DEFF Research Database (Denmark)

    Bahman, Amir Sajjad; Iannuzzo, Francesco; Blaabjerg, Frede

    2016-01-01

    This paper proposes a novel mission-profile-based reliability analysis approach for stress on bond wires in Silicon Carbide (SiC) MOSFET power modules using statistics and thermo-mechanical FEM analysis. In the proposed approach, both the operational and environmental thermal stresses are taken into account. The approach uses a two-dimensional statistical analysis of the operating conditions in a real one-year mission profile, sampled in time frames 5 minutes long. For every statistical bin corresponding to a given operating condition, the junction temperature evolution is estimated by a thermal network, and the mechanical stress on the bond wires is then extracted by finite-element simulations. In the final step, the considered mission profile is translated into a stress sequence to be used for rainflow counting and lifetime estimation.
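
    A minimal sketch of the junction-temperature estimation step, using a one-cell Foster thermal network driven by a square power-loss profile (R, tau and the load values are illustrative assumptions, not the paper's module data); the resulting temperature swing ΔTj is what a rainflow counting algorithm would consume for lifetime estimation:

        import numpy as np

        # One-cell Foster network: Zth(t) = R * (1 - exp(-t/tau)), i.e.
        # tau * dT/dt = R*P - T, discretised with an explicit Euler step
        R, tau, dt = 0.5, 0.8, 0.01          # K/W, s, s (illustrative)
        P = np.where((np.arange(3000) * dt) % 2 < 1, 40.0, 5.0)  # 40 W / 5 W cycle

        T = np.zeros(P.size)                  # junction temperature rise over ambient
        for k in range(P.size - 1):
            T[k + 1] = T[k] + dt / tau * (R * P[k] - T[k])

        dT = T.max() - T.min()                # swing feeding the lifetime model
        print(T.max(), dT)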

  4. Understanding Short-Term Nonmigrating Tidal Variability in the Ionospheric Dynamo Region from SABER Using Information Theory and Bayesian Statistics

    Science.gov (United States)

    Kumari, K.; Oberheide, J.

    2017-12-01

    Nonmigrating tidal diagnostics of SABER temperature observations in the ionospheric dynamo region reveal a large amount of variability on time-scales of a few days to weeks. In this paper, we discuss the physical reasons for the observed short-term tidal variability using a novel approach based on information theory and Bayesian statistics. We diagnose short-term tidal variability as a function of season, QBO, ENSO, solar cycle and other drivers using time-dependent probability density functions, Shannon entropy and Kullback-Leibler divergence. The statistical significance of the approach and its predictive capability are exemplified using SABER tidal diagnostics, with emphasis on the responses to the QBO and solar cycle. Implications for F-region plasma density will be discussed.
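
    A minimal sketch of the two information measures named above, applied to histogram estimates of a tidal-amplitude distribution in two hypothetical forcing states (the gamma-distributed samples are illustrative stand-ins for SABER diagnostics):

        import numpy as np

        def shannon_entropy(p):
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        def kl_divergence(p, q):
            mask = p > 0                      # convention: 0 * log(0/q) = 0
            return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

        # Hypothetical pdfs of a tidal amplitude in two QBO phases, as histograms
        rng = np.random.default_rng(3)
        bins = np.linspace(0, 10, 21)
        p, _ = np.histogram(rng.gamma(4.0, 1.0, 5000), bins=bins)
        q, _ = np.histogram(rng.gamma(5.0, 1.0, 5000), bins=bins)
        p, q = p / p.sum(), (q + 1) / (q + 1).sum()   # smooth q to avoid zeros

        print(shannon_entropy(p), kl_divergence(p, q))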

  5. Memory-dependent derivatives theory of thermo-viscoelasticity involving two-temperature

    Energy Technology Data Exchange (ETDEWEB)

    Ezzat, M. A. [Alexandria University, Alexandria (Egypt); El-Bary, A. A. [Arab Academy for Science and Technology, Alexandria (Egypt)

    2015-10-15

    A new model of two-temperature generalized thermo-viscoelasticity theory based on memory-dependent derivatives is constructed. The equations of the new model are applied to a one-dimensional problem of a half-space. The bounding surface is taken to be traction free and subjected to a time-dependent thermal shock. The Laplace transform technique is used, and a direct approach is applied to obtain exact formulas for heat flux, temperature, stresses, displacement and strain in the Laplace transform domain. The approach is then applied to the problem at hand to obtain the solution in complete form. The variables considered are presented graphically and discussed.
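
    For reference, the first-order memory-dependent derivative commonly used in this literature is a kernel-weighted average of the ordinary derivative over a sliding memory window, where ω is the time delay and K a differentiable kernel chosen on the interval:

        D_\omega f(t) = \frac{1}{\omega} \int_{t-\omega}^{t} K(t-\xi)\, f'(\xi)\, d\xi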

  6. Statistical mechanics of learning: A variational approach for real data

    International Nuclear Information System (INIS)

    Malzahn, Doerthe; Opper, Manfred

    2002-01-01

    Using a variational technique, we generalize the statistical physics approach of learning from random examples to make it applicable to real data. We demonstrate the validity and relevance of our method by computing approximate estimators for generalization errors that are based on training data alone

  7. An Information-Based Approach to Precision Analysis of Indoor WLAN Localization Using Location Fingerprint

    Directory of Open Access Journals (Sweden)

    Mu Zhou

    2015-12-01

    Full Text Available In this paper, we propose a novel information-based approach to precision analysis of indoor wireless local area network (WLAN) localization using location fingerprints. First of all, by using the Fisher information matrix (FIM), we derive the fundamental limit of WLAN fingerprint-based localization precision, considering different signal distributions in characterizing the variation of received signal strengths (RSSs) in the target environment. After that, we explore the relationship between the localization precision and access point (AP) placement, which can provide valuable suggestions for the design of a highly precise localization system. Second, we adopt the heuristic simulated annealing (SA) algorithm to optimize the AP locations for the sake of approaching the fundamental limit of localization precision. Finally, extensive simulations and experiments are conducted in both regular line-of-sight (LOS) and irregular non-line-of-sight (NLOS) environments to demonstrate that the proposed approach can not only effectively improve the WLAN fingerprint-based localization precision, but also reduce the time overhead.
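
    A minimal sketch of the FIM-based precision bound for RSS fingerprinting, using the standard log-distance path-loss model with log-normal shadowing (the path-loss exponent, shadowing deviation and AP layout below are illustrative assumptions); the trace of the inverse FIM gives the position-error bound that an SA-based AP placement would then minimise:

        import numpy as np

        def rss_fim(target, aps, gamma=3.0, sigma_db=4.0):
            # Fisher information matrix for a 2-D position from RSS readings
            # under log-distance path loss with log-normal shadowing:
            # J = (10*gamma/(sigma*ln10))^2 * sum_i (u_i u_i^T) / d_i^2
            c = (10.0 * gamma / (sigma_db * np.log(10))) ** 2
            J = np.zeros((2, 2))
            for ap in aps:
                d = target - ap
                r2 = d @ d
                J += c * np.outer(d, d) / (r2 ** 2)   # (u u^T)/d^2 = (d d^T)/d^4
            return J

        aps = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])   # assumed layout
        target = np.array([4.0, 6.0])
        crlb = np.trace(np.linalg.inv(rss_fim(target, aps)))     # bound on MSE
        print("CRLB (m^2):", crlb)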

  8. Rank-based permutation approaches for non-parametric factorial designs.

    Science.gov (United States)

    Umlauft, Maria; Konietschke, Frank; Pauly, Markus

    2017-11-01

    Inference methods for null hypotheses formulated in terms of distribution functions in general non-parametric factorial designs are studied. The methods can be applied to continuous, ordinal or even ordered categorical data in a unified way, and are based only on ranks. In this set-up Wald-type statistics and ANOVA-type statistics are the current state of the art. The first method is asymptotically exact but a rather liberal statistical testing procedure for small to moderate sample size, while the latter is only an approximation which does not possess the correct asymptotic α level under the null. To bridge these gaps, a novel permutation approach is proposed which can be seen as a flexible generalization of the Kruskal-Wallis test to all kinds of factorial designs with independent observations. It is proven that the permutation principle is asymptotically correct while keeping its finite exactness property when data are exchangeable. The results of extensive simulation studies foster these theoretical findings. A real data set exemplifies its applicability. © 2017 The British Psychological Society.
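
    A minimal sketch of the flavour of rank-based permutation test described — here a permutation version of the Kruskal-Wallis statistic for a one-way layout (the proposed method covers general factorial designs; the data below are illustrative):

        import numpy as np
        from scipy.stats import kruskal

        def permutation_kruskal(groups, n_perm=5000, seed=0):
            # Permutation p-value for the Kruskal-Wallis statistic: pool the
            # data, reshuffle group labels, and compare reshuffled statistics
            rng = np.random.default_rng(seed)
            sizes = [len(g) for g in groups]
            pooled = np.concatenate(groups)
            h_obs = kruskal(*groups).statistic
            count = 0
            for _ in range(n_perm):
                perm = rng.permutation(pooled)
                parts = np.split(perm, np.cumsum(sizes)[:-1])
                count += kruskal(*parts).statistic >= h_obs
            return (count + 1) / (n_perm + 1)

        rng = np.random.default_rng(1)
        a, b, c = rng.normal(0, 1, 15), rng.normal(0.8, 1, 12), rng.normal(0, 2, 18)
        print(permutation_kruskal([a, b, c]))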

  9. A statistical approach to instrument calibration

    Science.gov (United States)

    Robert R. Ziemer; David Strauss

    1978-01-01

    It has been found that two instruments will yield different numerical values when used to measure identical points. A statistical approach is presented that can be used to approximate the error associated with the calibration of instruments. Included are standard statistical tests that can be used to determine if a number of successive calibrations of the...

  10. Thermo-sensitive intelligent track membrane

    International Nuclear Information System (INIS)

    Pang Deling; Ren Lihua; Qian Zhilin; Huang Gang; Zhang Jinhua

    1999-01-01

    Using the thermo-sensitive functional material N-isopropylacrylamide (NIPAAm) as monomer and nuclear track microporous membrane (NTMM) as the base material, a thermo-sensitive intelligent track membrane (TsITM) has been prepared by over-oxidization and pre-irradiation grafting techniques. The TsITM can be used to make a micro-switch controlled by temperature and to adjust particle screening and osmosis. To obtain sub-micron responsive grafted track pores, only a very thin thermo-sensitive layer is needed. The TsITM pores are capable of swelling and shrinking rapidly and respond sensitively to temperature

  11. Intelligent cognitive radio jamming - a game-theoretical approach

    Science.gov (United States)

    Dabcevic, Kresimir; Betancourt, Alejandro; Marcenaro, Lucio; Regazzoni, Carlo S.

    2014-12-01

    Cognitive radio (CR) promises to be a solution to spectrum underutilization problems. However, security issues pertaining to cognitive radio technology are still an understudied topic. One of the prevailing such issues is intelligent radio frequency (RF) jamming, where adversaries are able to exploit the on-the-fly reconfigurability and learning mechanisms of cognitive radios in order to devise and deploy advanced jamming tactics. In this paper, we use a game-theoretical approach to analyze jamming/anti-jamming behavior between cognitive radio systems. A non-zero-sum game with incomplete information on an opponent's strategy and payoff is modelled as an extension of a Markov decision process (MDP). Learning algorithms based on adaptive payoff play and fictitious play are considered. A combination of frequency hopping and power alteration is deployed as an anti-jamming scheme. A real-life software-defined radio (SDR) platform is used in order to perform measurements useful for quantifying the jamming impacts, as well as to infer relevant hardware-related properties. The results of these measurements are then used as parameters for the modelled jamming/anti-jamming game and are compared to the Nash equilibrium of the game. Simulation results indicate, among other things, the benefit provided to the jammer when it employs a spectrum sensing algorithm in proactive frequency hopping and power alteration schemes.
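
    A minimal sketch of fictitious play on a toy channel-hopping jamming game (the channels, payoffs and best-response structure are illustrative assumptions, far simpler than the MDP extension in the paper): each player best-responds to the opponent's empirical action frequencies, and in this symmetric game the mixtures converge toward uniform channel hopping:

        import numpy as np

        # Two-player game: transmitter picks a channel, jammer picks a channel.
        # Payoff to the transmitter is 1 on a miss, 0 when jammed (illustrative)
        n_channels = 4
        payoff = 1.0 - np.eye(n_channels)     # row: transmitter, col: jammer

        counts_tx = np.ones(n_channels)       # fictitious-play action counts
        counts_jam = np.ones(n_channels)
        for _ in range(20000):
            # Each side best-responds to the opponent's empirical mixture
            tx = np.argmax(payoff @ (counts_jam / counts_jam.sum()))
            jam = np.argmin((counts_tx / counts_tx.sum()) @ payoff)
            counts_tx[tx] += 1
            counts_jam[jam] += 1

        print(counts_tx / counts_tx.sum())    # approaches uniform mixing (1/4 each)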

  12. METHODOLOGICAL APPROACHES TO ORGANIZATION OF SAFE INFORMATION AND EDUCATIONAL ENVIRONMENT OF THE UNIVERSITY

    Directory of Open Access Journals (Sweden)

    A. N. Privalov

    2017-01-01

    Full Text Available Introduction. One of the tendencies of modern higher education is the ubiquitous use of information and communication technologies. At the same time, the functioning of the electronic information and educational environment (IEE) of the university should rest on appropriate technical means together with the condition of its information security. The aim of the research is to conceptualize the problem of the rational organization of a safe information and educational environment for a higher education institution, wherein reliable protection is provided for its infrastructure, for the personal and unique information of student and teacher, and for the virtual space of their educational interaction. Methodology and research methods. The systems approach is the key approach to the organization of a safe educational environment of the university. In the authors' view, personal-activity and functional approaches are expedient in the design and development of a safe IEE. Socio-historical and theoretical-methodological analysis, modeling, and the study and synthesis of experience of effective application of the systems approach in professional educational organizations are used. Results and scientific novelty. The concept of a «safe information and educational environment of the university» is specified, wherein the first word expresses the predominant quality of the system. Creating a safe information environment in professional educational organizations provides a convenient and safe educational environment for the professional training of university students. The components of and directions for the organization of a safe IEE are highlighted. Practical recommendations for its design and successful functioning are given. Practical significance. The materials of the present research can be of use to managers and administrative employees of educational organizations.

  13. Biometric security from an information-theoretical perspective

    NARCIS (Netherlands)

    Ignatenko, T.; Willems, F.M.J.

    2012-01-01

    In this review, biometric systems are studied from an information theoretical point of view. In the first part biometric authentication systems are studied. The objective of these systems is, observing correlated enrollment and authentication biometric sequences, to generate or convey as large as

  14. New Theoretical Approach Integrated Education and Technology

    Science.gov (United States)

    Ding, Gang

    2010-01-01

    The paper focuses on exploring a new theoretical approach in education arising with the development of online learning technology, from e-learning to u-learning and virtual reality technology, and points out possibilities such as constructing a new teaching ecological system, ubiquitous educational awareness with ubiquitous technology, and changing the…

  15. THE INCREASE OF ENTERPRISES’ INNOVATIVE DEVELOPMENT BASED ON THE NETWORK APPROACH

    Directory of Open Access Journals (Sweden)

    Olena Gudz

    2018-01-01

    Full Text Available The purpose of the paper is to study the role and problems of the innovative development of domestic enterprises and to discover the factors that influence these processes. Methodology. The methodology of the study was based on logical and historical methods, methods of the system-functional approach, methods of scientific abstraction, systematization, grouping, generalization and formalization, analysis and synthesis, economic and statistical methods, and the methods of questioning and peer review. Results. The essence of the network approach is studied and the expediency of its use substantiated; its capabilities and limitations are outlined, the effectiveness of network innovation structures is determined, and proposals are developed for activating the innovative development of enterprises in new dimensions of the economic space on the basis of the network approach. Practical implications. The proposed measures will promote the activation of innovative development for domestic enterprises, improve the quality of business chains, competitiveness and management structures, and support the development of new market segments. Value/originality. The information background for the paper was the official data of the State Statistics Service of Ukraine, statistical and financial statements of enterprises, rating estimates by the international agency Bloomberg Rankings, the analytical report “Global Innovation Index” (World Intellectual Property Organization, WIPO), the report of the European Innovation Scoreboard, scientific publications of domestic and foreign researchers, normative reference literature, analytical and logical generalizations and observations of the authors, and Internet information resources.

  16. Life course approach in social epidemiology: an overview, application and future implications.

    Science.gov (United States)

    Cable, Noriko

    2014-01-01

    The application of the life course approach to social epidemiology has helped epidemiologists theoretically examine social gradients in population health. Longitudinal data with rich contextual information collected repeatedly and advanced statistical approaches have made this challenging task easier. This review paper provides an overview of the life course approach in epidemiology, its research application, and future challenges. In summary, a systematic approach to methods, including theoretically guided measurement of socioeconomic position, would assist researchers in gathering evidence for reducing social gradients in health, and collaboration across individual disciplines will make this task achievable.

  17. Statistical inference for remote sensing-based estimates of net deforestation

    Science.gov (United States)

    Ronald E. McRoberts; Brian F. Walters

    2012-01-01

    Statistical inference requires expression of an estimate in probabilistic terms, usually in the form of a confidence interval. An approach to constructing confidence intervals for remote sensing-based estimates of net deforestation is illustrated. The approach is based on post-classification methods using two independent forest/non-forest classifications because...

  18. Thermo-elastic optical coherence tomography

    NARCIS (Netherlands)

    Wang, Tianshi; Pfeiffer, Tom; Wu, Min; Wieser, Wolfgang; Amenta, Gaetano; Draxinger, Wolfgang; van der Steen, A.F.W.; Huber, Robert; Van Soest, Gijs

    2017-01-01

    The absorption of nanosecond laser pulses induces rapid thermo-elastic deformation in tissue. A sub-micrometer scale displacement occurs within a few microseconds after the pulse arrival. In this Letter, we investigate the laser-induced thermo-elastic deformation using a 1.5 MHz phase-sensitive

  19. Multiple point statistical simulation using uncertain (soft) conditional data

    Science.gov (United States)

    Hansen, Thomas Mejer; Vu, Le Thanh; Mosegaard, Klaus; Cordua, Knud Skou

    2018-05-01

    Geostatistical simulation methods have been used to quantify the spatial variability of reservoir models since the 1980s. In the last two decades, state-of-the-art simulation methods have changed from being based on covariance-based two-point statistics to multiple-point statistics (MPS), which allow the simulation of more realistic Earth structures. In addition, increasing amounts of geo-information (geophysical, geological, etc.) from multiple sources are being collected. This poses the problem of integrating these different sources of information, such that decisions related to reservoir models can be taken on as informed a basis as possible. In principle, though difficult in practice, this can be achieved using computationally expensive Monte Carlo methods. Here we investigate the use of sequential-simulation-based MPS methods conditional to uncertain (soft) data as a computationally efficient alternative. First, it is demonstrated that current implementations of sequential simulation based on MPS (e.g. SNESIM, ENESIM and Direct Sampling) do not account properly for uncertain conditional information, due to a combination of using only co-located information and a random simulation path. Then, we suggest two approaches that better account for the available uncertain information. The first makes use of a preferential simulation path, where more informed model parameters are visited preferentially to less informed ones. The second approach involves using non-co-located uncertain information. For different types of available data, these approaches are demonstrated to produce simulation results similar to those obtained by the general Monte Carlo based approach. These methods allow MPS simulation to condition properly to uncertain (soft) data, and hence provide a computationally attractive approach for integrating information about a reservoir model.

  20. Randomised controlled trial of a theoretically grounded tailored intervention to diffuse evidence-based public health practice [ISRCTN23257060

    Directory of Open Access Journals (Sweden)

    Nordheim Lena

    2003-03-01

    Abstract. Background: Previous studies have shown that Norwegian public health physicians do not systematically and explicitly use scientific evidence in their practice. They work in an environment that does not encourage the integration of this information in decision-making. In this study we investigate whether a theoretically grounded tailored intervention to diffuse evidence-based public health practice increases the physicians' use of research information. Methods: 148 self-selected public health physicians were randomised to an intervention group (n = 73) and a control group (n = 75). The intervention group received a multifaceted intervention while the control group received a letter declaring that they had access to library services. Baseline assessments before the intervention and post-testing immediately at the end of a 1.5-year intervention period were conducted. The intervention was theoretically based and consisted of a workshop in evidence-based public health, a newsletter, and access to a specially designed information service, to relevant databases, and to an electronic discussion list. The main outcome measure was behaviour as measured by the use of research in different documents. Results: The intervention did not demonstrate any evidence of effects on the objective behaviour outcomes. We found, however, a statistically significant difference between the two groups for both knowledge scores: a mean difference of 0.4 (95% CI: 0.2–0.6) in the score for knowledge about EBM resources and a mean difference of 0.2 (95% CI: 0.0–0.3) in the score for conceptual knowledge of importance for critical appraisal. There were no statistically significant differences in attitude, self-efficacy, decision-to-adopt or job-satisfaction scales. There were no significant differences in Cochrane Library searching after controlling for baseline values and characteristics. Conclusion: Though demonstrating an effect on knowledge, the study failed to provide support for

  1. BEHAVIORAL INPUTS TO THE THEORETICAL APPROACH OF THE ECONOMIC CRISIS

    Directory of Open Access Journals (Sweden)

    Sinziana BALTATESCU

    2015-09-01

    The current economic and financial crisis gave room for theoretical debates to reemerge. The economic reality challenged the mainstream neoclassical approach, leaving the opportunity for the Austrian School, Post-Keynesianism or Institutionalists to bring forward theories that seem to better explain the economic crisis and thus leave space for more efficient economic policies to result. In this context, the main assumptions of the mainstream theoretical approach are challenged and reevaluated; behavioral economics is one of the main challengers. Although it has not yet developed into an integrated school of thought, behavioral economics brings new elements into the framework of economic thinking. How the main theoretical approaches integrate these new elements, and whether this process will narrow the theory or enrich it to be more comprehensive, are questions this paper tries to answer, or at least to leave room for an answer.

  2. General thermo-elastic solution of radially heterogeneous, spherically isotropic rotating sphere

    Energy Technology Data Exchange (ETDEWEB)

    Bayat, Yahya; Ekhteraei Toussi, Hamid [Ferdowsi University of Mashhad, Mashhad (Iran, Islamic Republic of)

    2015-06-15

    A thick-walled rotating spherical object made of transversely isotropic functionally graded materials (FGMs) with general types of thermo-mechanical boundary conditions is studied. The thermo-mechanical governing equations, consisting of decoupled thermal and mechanical equations, are presented. The centrifugal body forces of the rotation are considered in the modeling phase. The unsymmetrical thermo-mechanical boundary conditions and rotational body forces are expressed in terms of Legendre series. The series method is also implemented in the solution of the resulting equations. The solutions are checked against the known literature and FEM-based solutions from the ABAQUS software. The effects of anisotropy and heterogeneity are studied through case studies and the results are presented in different figures. The newly developed series-form solution is applicable to rotating, transversely isotropic FGM spherical vessels having nonsymmetrical thermo-mechanical boundary conditions.

  3. Sensitivity analyses of biodiesel thermo-physical properties under diesel engine conditions

    DEFF Research Database (Denmark)

    Cheng, Xinwei; Ng, Hoon Kiat; Gan, Suyin

    2016-01-01

    This reported work investigates the sensitivities of spray and soot developments to the change of thermo-physical properties for coconut and soybean methyl esters, using two-dimensional computational fluid dynamics fuel spray modelling. The choice of test fuels made was due to their contrasting...... saturation-unsaturation compositions. The sensitivity analyses for non-reacting and reacting sprays were carried out against a total of 12 thermo-physical properties, at an ambient temperature of 900 K and density of 22.8 kg/m3. For the sensitivity analyses, all the thermo-physical properties were set...... as the baseline case and each property was individually replaced by that of diesel. The significance of individual thermo-physical property was determined based on the deviations found in predictions such as liquid penetration, ignition delay period and peak soot concentration when compared to those of baseline...

  4. Identifying consumer's needs of health information technology through an innovative participatory design approach among English- and Spanish-speaking urban older adults.

    Science.gov (United States)

    Lucero, R; Sheehan, B; Yen, P; Velez, O; Nobile-Hernandez, D; Tiase, V

    2014-01-01

    We describe an innovative community-centered participatory design approach, Consumer-centered Participatory Design (C2PD), and the results of applying C2PD to design and develop a web-based fall prevention system. We conducted focus groups and design sessions with English- and Spanish-speaking community-dwelling older adults. Focus group data were summarized and used to inform the context of the design sessions. Descriptive content analysis methods were used to develop categorical descriptions of design session informants' needs related to information technology. The C2PD approach enabled the assessment and identification of informants' needs of health information technology (HIT) that informed the development of a fall prevention system. We learned that our informants needed a system that provides variation in functions/content; differentiates between actionable/non-actionable information/structures; and contains sensory cues that support wide-ranging and complex tasks in a varied, simple, and clear interface to facilitate self-management. The C2PD approach provides community-based organizations, academic researchers, and commercial entities with a systematic, theoretically informed approach to develop HIT innovations. Our community-centered participatory design approach focuses on consumers' technology needs while taking into account core public health functions.

  5. Linear mixed models a practical guide using statistical software

    CERN Document Server

    West, Brady T; Galecki, Andrzej T

    2006-01-01

    Simplifying the often confusing array of software programs for fitting linear mixed models (LMMs), Linear Mixed Models: A Practical Guide Using Statistical Software provides a basic introduction to primary concepts, notation, software implementation, model interpretation, and visualization of clustered and longitudinal data. This easy-to-navigate reference details the use of procedures for fitting LMMs in five popular statistical software packages: SAS, SPSS, Stata, R/S-plus, and HLM. The authors introduce basic theoretical concepts, present a heuristic approach to fitting LMMs based on bo
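
    As a flavour of what fitting an LMM looks like in practice, the sketch below fits a random-intercept model to toy clustered data with Python's statsmodels (one of many packages offering LMMs; the data and model here are invented for illustration):

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Toy clustered data: 20 subjects, 5 repeated measures each.
    rng = np.random.default_rng(0)
    subj = np.repeat(np.arange(20), 5)
    x = rng.normal(size=100)
    y = 1.0 + 0.5 * x + rng.normal(size=20)[subj] + rng.normal(scale=0.3, size=100)
    df = pd.DataFrame({"y": y, "x": x, "subject": subj})

    # Random-intercept model: y ~ x with a subject-level random effect.
    fit = smf.mixedlm("y ~ x", df, groups=df["subject"]).fit()
    print(fit.summary())
    ```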

  6. Statistical approaches to the analysis of point count data: A little extra information can go a long way

    Science.gov (United States)

    Farnsworth, G.L.; Nichols, J.D.; Sauer, J.R.; Fancy, S.G.; Pollock, K.H.; Shriner, S.A.; Simons, T.R.; Ralph, C. John; Rich, Terrell D.

    2005-01-01

    Point counts are a standard sampling procedure for many bird species, but lingering concerns still exist about the quality of information produced from the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point counts in favor of more intensive approaches to counting. However, over the past few years a variety of statistical and methodological developments have begun to provide practical ways of overcoming some of the problems with point counts. We describe some of these approaches, and show how they can be integrated into standard point count protocols to greatly enhance the quality of the information. Several tools now exist for estimation of detection probability of birds during counts, including distance sampling, double observer methods, time-depletion (removal) methods, and hybrid methods that combine these approaches. Many counts are conducted in habitats that make auditory detection of birds much more likely than visual detection. As a framework for understanding detection probability during such counts, we propose separating two components of the probability a bird is detected during a count into (1) the probability a bird vocalizes during the count and (2) the probability this vocalization is detected by an observer. In addition, we propose that some measure of the area sampled during a count is necessary for valid inferences about bird populations. This can be done by employing fixed-radius counts or more sophisticated distance-sampling models. We recommend any studies employing point counts be designed to estimate detection probability and to include a measure of the area sampled.
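
    To make the proposed decomposition concrete, the toy sketch below (our own illustration, with made-up numbers) combines the two components of detection probability with a fixed-radius sampled area to turn a raw count into a naive density estimate:

    ```python
    import math

    def density_estimate(count, p_vocalize, p_heard, radius_m):
        """Naive density estimator for a fixed-radius point count, assuming
        detection decomposes as p = Pr(bird vocalizes) * Pr(vocalization heard)."""
        p = p_vocalize * p_heard
        area_ha = math.pi * radius_m ** 2 / 10_000.0   # sampled area in hectares
        return count / (p * area_ha)                   # birds per hectare

    # Hypothetical example values, purely for illustration:
    print(density_estimate(count=12, p_vocalize=0.7, p_heard=0.8, radius_m=100))
    ```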

  7. Partial discharge transients: The field theoretical approach

    DEFF Research Database (Denmark)

    McAllister, Iain Wilson; Crichton, George C

    1998-01-01

    Up until the mid-1980s the theory of partial discharge transients was essentially static. This situation had arisen because of the fixation with the concept of void capacitance and the use of circuit theory to address what is in essence a field problem. Pedersen rejected this approach and instead...... began to apply field theory to the problem of partial discharge transients. In the present paper, the contributions of Pedersen using the field theoretical approach will be reviewed and discussed....

  8. Concepts and recent advances in generalized information measures and statistics

    CERN Document Server

    Kowalski, Andres M

    2013-01-01

    Since the introduction of the information measure widely known as Shannon entropy, quantifiers based on information theory and concepts such as entropic forms and statistical complexities have proven to be useful in diverse scientific research fields. This book contains introductory tutorials suitable for the general reader, together with chapters dedicated to the basic concepts of the most frequently employed information measures or quantifiers and their recent applications to different areas, including physics, biology, medicine, economics, communication and social sciences. As these quantif

  9. Intraplate seismicity in Canada: a graph theoretic approach to data analysis and interpretation

    Directory of Open Access Journals (Sweden)

    K. Vasudevan

    2010-10-01

    Intraplate seismicity occurs in central and northern Canada, but the underlying origin and dynamics remain poorly understood. Here, we apply a graph theoretic approach to characterize the statistical structure of the spatiotemporal clustering exhibited by intraplate seismicity, a direct consequence of the underlying nonlinear dynamics. Using a recently proposed definition of "recurrences" based on record-breaking processes (Davidsen et al., 2006, 2008), we have constructed directed graphs using catalogue data for three selected regions (Region 1: 45°−48° N/74°−80° W; Region 2: 51°−55° N/77°−83° W; and Region 3: 56°−70° N/65°−95° W), with attributes drawn from the location, origin time and magnitude of the events. Based on comparisons with a null model derived from a Poisson distribution or Monte Carlo shuffling of the catalogue data, our results provide strong evidence in support of spatiotemporal correlations of seismicity in all three regions considered. Similar evidence for spatiotemporal clustering has been documented using seismicity catalogues for southern California, suggesting possible similarities in the underlying earthquake dynamics of both regions despite huge differences in the variability of seismic activity.
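
    A minimal sketch of the graph construction, assuming the record-breaking recurrence definition of Davidsen et al. (event j is a recurrence of an earlier event i if no intermediate event occurred closer to i); the function name and the restriction to epicentral distance are our simplifications:

    ```python
    import numpy as np

    def recurrence_graph(xy):
        """Directed record-breaking recurrence edges: event j (j > i) is a
        recurrence of event i if no intermediate event occurred closer to i.
        xy: (n, 2) array of epicentre coordinates in time order."""
        edges = []
        for i in range(len(xy) - 1):
            record = np.inf
            for j in range(i + 1, len(xy)):
                d = np.hypot(xy[j, 0] - xy[i, 0], xy[j, 1] - xy[i, 1])
                if d < record:            # new distance record => edge i -> j
                    edges.append((i, j))
                    record = d
        return edges

    rng = np.random.default_rng(42)
    print(len(recurrence_graph(rng.uniform(size=(200, 2)))))  # number of edges
    ```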

  10. Uniting statistical and individual-based approaches for animal movement modelling.

    Science.gov (United States)

    Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel

    2014-01-01

    The dynamic nature of their internal states and the environment directly shapes animals' spatial behaviours and gives rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, because internal states are practically impossible to access through field work and current statistical models cannot produce dynamic outputs. To address these issues, we developed a robust method which combines statistical and individual-based modelling (IBM). Using a statistical technique for forward modelling of the IBM has the advantage of being faster to parameterize than a pure inverse modelling technique and allows for robust selection of parameters. Using GPS locations from caribou monitored in Québec, caribou movements were modelled based on generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then empirically parameterized using Step Selection Functions. The final IBM was validated using both k-fold cross-validation and emergent-pattern validation, and was tested for two different scenarios with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system and adequately provide projections on future possible states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems.

  11. Modelling in Accounting. Theoretical and Practical Dimensions

    OpenAIRE

    Teresa Szot -Gabryś

    2010-01-01

    Accounting in the theoretical approach is a scientific discipline based on specific paradigms. In the practical aspect, accounting manifests itself through the introduction of a system for the measurement of economic quantities which operates in a particular business entity. A characteristic of accounting is its flexibility and ability to adapt to the information needs of its recipients. One of the main currents in the development of accounting theory and practice is to cover by economic...

  12. Information-theoretic decomposition of embodied and situated systems.

    Science.gov (United States)

    Da Rold, Federico

    2018-07-01

    The embodied and situated view of cognition stresses the importance of real-time and nonlinear bodily interaction with the environment for developing concepts and structuring knowledge. In this article, populations of robots controlled by an artificial neural network learn a wall-following task through artificial evolution. At the end of the evolutionary process, time series are recorded from perceptual and motor neurons of selected robots. Information-theoretic measures are estimated on pairings of variables to unveil nonlinear interactions that structure the agent-environment system. Specifically, the mutual information is utilized to quantify the degree of dependence and the transfer entropy to detect the direction of the information flow. Furthermore, the system is analyzed with the local form of such measures, thus capturing the underlying dynamics of information. Results show that different measures are interdependent and complementary in uncovering aspects of the robots' interaction with the environment, as well as characteristics of the functional neural structure. Therefore, the set of information-theoretic measures provides a decomposition of the system, capturing the intricacy of nonlinear relationships that characterize robots' behavior and neural dynamics.
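
    For readers unfamiliar with the two measures, the sketch below gives minimal plug-in estimators for mutual information and first-order transfer entropy on discretized (symbolized) time series; it is a generic illustration, not the estimators used in the paper:

    ```python
    from collections import Counter
    from math import log2

    def entropy(seq):
        """Plug-in Shannon entropy (bits) of a sequence of hashable symbols."""
        seq = list(seq)
        n = len(seq)
        return -sum(c / n * log2(c / n) for c in Counter(seq).values())

    def mutual_information(x, y):
        """I(X;Y) = H(X) + H(Y) - H(X,Y), from paired symbol sequences."""
        return entropy(x) + entropy(y) - entropy(zip(x, y))

    def transfer_entropy(src, dst):
        """First-order transfer entropy T(src -> dst) =
        H(dst_{t+1} | dst_t) - H(dst_{t+1} | dst_t, src_t)."""
        d1, d0, s0 = dst[1:], dst[:-1], src[:-1]
        h_cond_d = entropy(zip(d1, d0)) - entropy(d0)
        h_cond_ds = entropy(zip(d1, d0, s0)) - entropy(zip(d0, s0))
        return h_cond_d - h_cond_ds
    ```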

  13. Net analyte signal based statistical quality control

    NARCIS (Netherlands)

    Skibsted, E.T.S.; Boelens, H.F.M.; Westerhuis, J.A.; Smilde, A.K.; Broad, N.W.; Rees, D.R.; Witte, D.T.

    2005-01-01

    Net analyte signal statistical quality control (NAS-SQC) is a new methodology to perform multivariate product quality monitoring based on the net analyte signal approach. The main advantage of NAS-SQC is that the systematic variation in the product due to the analyte (or property) of interest is

  14. Supply chain collaboration: A Game-theoretic approach to profit allocation

    Energy Technology Data Exchange (ETDEWEB)

    Ponte, B.; Fernández, I.; Rosillo, R.; Parreño, J.; García, N.

    2016-07-01

    Purpose: This paper aims to develop a theoretical framework for profit allocation, as a mechanism for aligning incentives, in collaborative supply chains. Design/methodology/approach: The issue of profit distribution is approached from a game-theoretic perspective. We use the nucleolus concept. The framework is illustrated through a numerical example based on the Beer Game scenario. Findings: The nucleolus offers a powerful perspective to tackle this problem, as it takes into consideration the bargaining power of the different echelons. We show that this framework outperforms classical alternatives. Research limitations/implications: The allocation of the overall supply chain profit is analyzed from a static perspective. Considering the dynamic nature of the problem would be an interesting next step. Practical implications: We provide evidence of drawbacks derived from classical solutions to the profit allocation problem. Real-world collaborative supply chains need of robust mechanisms like the one tackled in this work to align incentives from the various actors. Originality/value: Adopting an efficient collaborative solution is a major challenge for supply chains, since it is a wide and complex process that requires an appropriate scheme. Within this framework, profit allocation is essential.
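
    To illustrate the machinery involved, the sketch below solves the least-core linear program, which is the first lexicographic step toward the nucleolus, for a hypothetical three-echelon chain (the characteristic-function values are invented; the full nucleolus additionally requires iteratively fixing tight coalitions and re-minimizing):

    ```python
    from itertools import combinations

    from scipy.optimize import linprog

    players = (0, 1, 2)
    # Illustrative characteristic function (numbers invented for the example):
    v = {(0,): 10, (1,): 10, (2,): 10,
         (0, 1): 40, (0, 2): 30, (1, 2): 40, (0, 1, 2): 100}

    # Variables: payoffs x0, x1, x2 and the maximum excess eps.
    # Least-core LP: min eps  s.t.  v(S) - x(S) <= eps for proper coalitions S,
    #                               x0 + x1 + x2 = v(N)  (efficiency).
    A_ub, b_ub = [], []
    for r in (1, 2):
        for S in combinations(players, r):
            A_ub.append([-1.0 if i in S else 0.0 for i in players] + [-1.0])
            b_ub.append(-v[S])
    res = linprog(c=[0, 0, 0, 1], A_ub=A_ub, b_ub=b_ub,
                  A_eq=[[1, 1, 1, 0]], b_eq=[v[players]],
                  bounds=[(None, None)] * 4)
    x0, x1, x2, eps = res.x
    print(f"least-core allocation: ({x0:.1f}, {x1:.1f}, {x2:.1f}), "
          f"max excess {eps:.1f}")
    ```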

  16. Vanadium supersaturated silicon system: a theoretical and experimental approach

    Science.gov (United States)

    Garcia-Hemme, Eric; García, Gregorio; Palacios, Pablo; Montero, Daniel; García-Hernansanz, Rodrigo; Gonzalez-Diaz, Germán; Wahnon, Perla

    2017-12-01

    The effect of high-dose vanadium ion implantation and pulsed laser annealing on the crystal structure and sub-bandgap optical absorption features of V-supersaturated silicon samples has been studied through the combination of experimental and theoretical approaches. Interest in V-supersaturated Si focuses on its potential as a material having a new band within the Si bandgap. Rutherford backscattering spectrometry measurements and formation energies computed through quantum calculations provide evidence that V atoms are mainly located at interstitial positions. The response of the sub-bandgap spectral photoconductance extends far into the infrared region of the spectrum. Theoretical simulations (based on density functional theory and many-body perturbation theory in the GW approximation) bring to light that, in addition to V atoms at interstitial positions, Si defects should also be taken into account in explaining the experimental profile of the spectral photoconductance. The combination of experimental and theoretical methods provides evidence that the improved spectral photoconductance up to 6.2 µm (0.2 eV) is due to new sub-bandgap transitions, for which the new band due to V atoms within the Si bandgap plays an essential role. This enables the use of V-supersaturated silicon in the third generation of photovoltaic devices.

  17. Bayesian analysis of systems with random chemical composition: renormalization-group approach to Dirichlet distributions and the statistical theory of dilution.

    Science.gov (United States)

    Vlad, Marcel Ovidiu; Tsuchiya, Masa; Oefner, Peter; Ross, John

    2002-01-01

    We investigate the statistical properties of systems with random chemical composition and try to obtain a theoretical derivation of the self-similar Dirichlet distribution, which is used empirically in molecular biology, environmental chemistry, and geochemistry. We consider a system made up of many chemical species and assume that the statistical distribution of the abundance of each chemical species in the system is the result of a succession of a variable number of random dilution events, which can be described by using the renormalization-group theory. A Bayesian approach is used for evaluating the probability density of the chemical composition of the system in terms of the probability densities of the abundances of the different chemical species. We show that for large cascades of dilution events, the probability density of the composition vector of the system is given by a self-similar probability density of the Dirichlet type. We also give an alternative formal derivation for the Dirichlet law based on the maximum entropy approach, by assuming that the average values of the chemical potentials of different species, expressed in terms of molar fractions, are constant. Although the maximum entropy approach leads formally to the Dirichlet distribution, it does not clarify the physical origin of the Dirichlet statistics and has serious limitations. The random theory of dilution provides a physical picture for the emergence of Dirichlet statistics and makes it possible to investigate its validity range. We discuss the implications of our theory in molecular biology, geochemistry, and environmental science.
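
    The standard constructive route to the Dirichlet distribution, consistent with the dilution-cascade picture described here, is normalization of independent Gamma variables; a minimal numerical check:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Independent Gamma(alpha_i) variables, normalised to unit sum, give a
    # Dirichlet(alpha) composition vector.
    alpha = np.array([0.5, 1.0, 2.0])
    g = rng.gamma(shape=alpha, scale=1.0, size=(100_000, 3))
    composition = g / g.sum(axis=1, keepdims=True)

    # Empirical means match the theoretical alpha / sum(alpha).
    print(composition.mean(axis=0), alpha / alpha.sum())
    ```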

  18. A Statistical Ontology-Based Approach to Ranking for Multiword Search

    Science.gov (United States)

    Kim, Jinwoo

    2013-01-01

    Keyword search is a prominent data retrieval method for the Web, largely because the simple and efficient nature of keyword processing allows a large amount of information to be searched with fast response. However, keyword search approaches do not formally capture the clear meaning of a keyword query and fail to address the semantic relationships…

  19. Optimal information transfer in enzymatic networks: A field theoretic formulation

    Science.gov (United States)

    Samanta, Himadri S.; Hinczewski, Michael; Thirumalai, D.

    2017-07-01

    Signaling in enzymatic networks is typically triggered by environmental fluctuations, resulting in a series of stochastic chemical reactions, leading to corruption of the signal by noise. For example, information flow is initiated by binding of extracellular ligands to receptors, which is transmitted through a cascade involving kinase-phosphatase stochastic chemical reactions. For a class of such networks, we develop a general field-theoretic approach to calculate the error in signal transmission as a function of an appropriate control variable. Application of the theory to a simple push-pull network, a module in the kinase-phosphatase cascade, recovers the exact results for error in signal transmission previously obtained using umbral calculus [Hinczewski and Thirumalai, Phys. Rev. X 4, 041017 (2014), 10.1103/PhysRevX.4.041017]. We illustrate the generality of the theory by studying the minimal errors in noise reduction in a reaction cascade with two connected push-pull modules. Such a cascade behaves as an effective three-species network with a pseudointermediate. In this case, optimal information transfer, resulting in the smallest square of the error between the input and output, occurs with a time delay, which is given by the inverse of the decay rate of the pseudointermediate. Surprisingly, in these examples the minimum error computed using simulations that take nonlinearities and discrete nature of molecules into account coincides with the predictions of a linear theory. In contrast, there are substantial deviations between simulations and predictions of the linear theory in error in signal propagation in an enzymatic push-pull network for a certain range of parameters. Inclusion of second-order perturbative corrections shows that differences between simulations and theoretical predictions are minimized. Our study establishes that a field theoretic formulation of stochastic biological signaling offers a systematic way to understand error propagation in

  20. Numerical modelling in building thermo-aeraulics: from CFD modelling to a hybrid finite volume / zonal approach

    Energy Technology Data Exchange (ETDEWEB)

    Bellivier, A.

    2004-05-15

    For 3D modelling of building thermo-aeraulics using field codes, it is necessary to reduce the computing time in order to model increasingly larger volumes. The solution suggested in this study is to couple two modelling approaches: a zonal approach and a CFD approach. The first part of the work carried out is the setting up of a simplified CFD model. We propose rules for the use of coarse grids, a constant effective viscosity law and adapted coefficients for heat exchange in the framework of building thermo-aeraulics. The second part of this work concerns the creation of fluid Macro-Elements and their coupling with a finite-volume CFD calculation. Depending on the boundary conditions of the problem, a local description of the driving flow is proposed via the installation and use of semi-empirical evolution laws. The Macro-Element is then inserted into the CFD computation: the values of velocity calculated by the evolution laws are imposed on the CFD cells corresponding to the Macro-Element. We use these two approaches on five cases representative of thermo-aeraulics in buildings. The results are compared with experimental data and with traditional RANS simulations. We highlight the significant saving in computing time that our approach allows while preserving a good quality of numerical results. (author)

  1. An integrated organisation-wide data quality management and information governance framework: theoretical underpinnings.

    Science.gov (United States)

    Liaw, Siaw-Teng; Pearce, Christopher; Liyanage, Harshana; Liaw, Gladys S S; de Lusignan, Simon

    2014-01-01

    Increasing investment in eHealth aims to improve cost effectiveness and safety of care. Data extraction and aggregation can create new data products to improve professional practice and provide feedback to improve the quality of source data. A previous systematic review concluded that locally relevant clinical indicators and use of clinical record systems could support clinical governance. We aimed to extend and update the review with a theoretical framework. We searched PubMed, Medline, Web of Science, ABI Inform (Proquest) and Business Source Premier (EBSCO) using the terms curation, information ecosystem, data quality management (DQM), data governance, information governance (IG) and data stewardship. We focused on and analysed the scope of DQM and IG processes, theoretical frameworks, and determinants of the processing, quality assurance, presentation and sharing of data across the enterprise. There are good theoretical reasons for integrated governance, but there is variable alignment of DQM, IG and health system objectives across the health enterprise. Ethical constraints exist that require health information ecosystems to process data in ways that are aligned with improving health and system efficiency and ensuring patient safety. Despite an increasingly 'big-data' environment, DQM and IG in health services are still fragmented across the data production cycle. We extend current work on DQM and IG with a theoretical framework for integrated IG across the data cycle. The dimensions of this theory-based framework would require testing with qualitative and quantitative studies to examine the applicability and utility, along with an evaluation of its impact on data quality across the health enterprise.

  2. ANALYSIS OF THEORETICAL AND METHODOLOGICAL APPROACHES TO DESIGN OF ELECTRONIC TEXTBOOKS FOR STUDENTS OF HIGHER AGRICULTURAL EDUCATIONAL INSTITUTIONS

    Directory of Open Access Journals (Sweden)

    Olena Yu. Balalaieva

    2017-06-01

    The article deals with theoretical and methodological approaches to the design of electronic textbooks, in particular the systems, competence-based, activity-based, personality-oriented and technological approaches, which together reflect the general trends in the formation of a new educational paradigm. Its distinctive features lie in constructing a heuristic searching model of the learning process, focusing on developmental teaching, knowledge integration, the development of skills for independent information search and processing, and the technification of the learning process. The term approach is used in this study in a broad sense, as a synthesis of the basic ideas, views and principles that determine the overall research strategy. The main provisions of modern approaches to design are not antagonistic; they should be applied in combination, taking into account the advantages of each of them and levelling out their shortcomings, to develop an optimal concept of the electronic textbook. The model of electronic textbook design and the components of a methodology for its use based on these approaches are described.

  3. Theoretical evaluation of the detectability of random lesions in bayesian emission reconstruction

    International Nuclear Information System (INIS)

    Qi, Jinyi

    2003-01-01

    Detecting cancerous lesions is an important task in positron emission tomography (PET). Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed to deal with the low signal-to-noise ratio in the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the prior parameters in Bayesian reconstruction control the resolution-noise trade-off and hence affect the detectability of lesions in reconstructed images. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Most research has been based on Monte Carlo simulations, which are very time consuming. Building on recent progress in the theoretical analysis of the image properties of statistical reconstructions and the development of numerical observers, here we develop a theoretical approach for fast computation of lesion detectability in Bayesian reconstruction. The results can be used to choose the optimum hyperparameter for maximum lesion detectability. New in this work is the use of theoretical expressions that explicitly model the statistical variation of the lesion and background without assuming that the object variation is (locally) stationary. The theoretical results are validated using Monte Carlo simulations. The comparisons show good agreement between the theoretical predictions and the Monte Carlo results
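
    The hyperparameter in question is the weight β in the generic penalized-maximum-likelihood (MAP) objective, quoted here in its standard form (the abstract does not specify the likelihood or penalty):

    ```latex
    \hat{x} \;=\; \arg\max_{x \ge 0} \Bigl[ \log L(y \mid x) \;-\; \beta\, R(x) \Bigr]
    ```

    where L(y|x) is the data likelihood and R(x) a roughness penalty; larger β suppresses noise at the cost of resolution, and the paper's contribution is choosing β to maximize lesion detectability.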

  4. A Rights-based Approach to Information in Humanitarian Assistance.

    Science.gov (United States)

    Scarnecchia, Daniel P; Raymond, Nathaniel A; Greenwood, Faine; Howarth, Caitlin; Poole, Danielle N

    2017-09-20

    Crisis-affected populations and humanitarian aid providers are both becoming increasingly reliant on information and communication technologies (ICTs) for finding and provisioning aid. This is exposing critical, unaddressed gaps in the legal and ethical frameworks that traditionally defined and governed the professional conduct of humanitarian action. The most acute of these gaps is a lack of clarity about what human rights people have regarding information in disaster, and the corresponding obligations incumbent upon governments and aid providers. This need is lent urgency by emerging evidence demonstrating that the use of these technologies in crisis response may be, in some cases, causing harm to the very populations they intend to serve. Preventing and mitigating these harms, while also working to responsibly ensure access to the benefits of information during crises, requires a rights-based framework to guide humanitarian operations. In this brief report, we provide a commentary that accompanies our report, the Signal Code: A Human Rights Approach to Information During Crisis, where we have identified five rights pertaining to the use of information and data during crisis which are grounded in current international human rights and customary law. It is our belief that the continued relevance of the humanitarian project, as it grows increasingly dependent on the use of data and ICTs, urgently requires a discussion of these rights and corresponding obligations.

  5. Topics in theoretical and applied statistics

    CERN Document Server

    Giommi, Andrea

    2016-01-01

    This book highlights the latest research findings from the 46th International Meeting of the Italian Statistical Society (SIS) in Rome, during which both methodological and applied statistical research was discussed. This selection of fully peer-reviewed papers, originally presented at the meeting, addresses a broad range of topics, including the theory of statistical inference; data mining and multivariate statistical analysis; survey methodologies; analysis of social, demographic and health data; and economic statistics and econometrics.

  6. Use of thermo-coagulation as an alternative treatment modality in a 'screen-and-treat' programme of cervical screening in rural Malawi.

    Science.gov (United States)

    Campbell, Christine; Kafwafwa, Savel; Brown, Hilary; Walker, Graeme; Madetsa, Belito; Deeny, Miriam; Kabota, Beatrice; Morton, David; Ter Haar, Reynier; Grant, Liz; Cubie, Heather A

    2016-08-15

    The incidence of cervical cancer in Malawi is the highest in the world and projected to increase in the absence of interventions. Although government policy supports screening using visual inspection with acetic acid (VIA), screening provision is limited due to lack of infrastructure, trained personnel, and the cost and availability of gas for cryotherapy. Recently, thermo-coagulation has been acknowledged as a safe and acceptable procedure suitable for low-resource settings. We introduced thermo-coagulation for treatment of VIA-positive lesions as an alternative to cryotherapy within a VIA-based cervical screening service, coupled with appropriate, sustainable pathways of care for women with high-grade lesions and cancers. Detailed planning was undertaken for VIA clinics, and approvals were obtained from the Ministry of Health and Regional and Village Chiefs. Educational resources were developed. Thermo-coagulators were introduced into hospital and health centre settings, with theoretical and practical training in the safe use and maintenance of the equipment. A total of 7,088 previously unscreened women attended VIA clinics between October 2013 and March 2015. Screening clinics were held daily in the hospital and weekly in the health centres. Overall, VIA positivity was 6.1%. Almost 90% received same-day treatment in the hospital setting, and 3- to 6-month cure rates of more than 90% were observed. Thermo-coagulation proved feasible and acceptable in this setting. Effective implementation requires comprehensive training and provider support, ongoing competency assessment, and quality assurance and improvement audit. Thermo-coagulation offers an effective alternative to cryotherapy and encouraged VIA screening of many more women.

  7. Information-Theoretic Performance Analysis of Sensor Networks via Markov Modeling of Time Series Data.

    Science.gov (United States)

    Li, Yue; Jha, Devesh K; Ray, Asok; Wettergren, Thomas A

    2018-06-01

    This paper presents information-theoretic performance analysis of passive sensor networks for detection of moving targets. The proposed method falls largely under the category of data-level information fusion in sensor networks. To this end, a measure of information contribution for sensors is formulated in a symbolic dynamics framework. The network information state is approximately represented as the largest principal component of the time series collected across the network. To quantify each sensor's contribution for generation of the information content, Markov machine models as well as x-Markov (pronounced as cross-Markov) machine models, conditioned on the network information state, are constructed; the difference between the conditional entropies of these machines is then treated as an approximate measure of information contribution by the respective sensors. The x-Markov models represent the conditional temporal statistics given the network information state. The proposed method has been validated on experimental data collected from a local area network of passive sensors for target detection, where the statistical characteristics of environmental disturbances are similar to those of the target signal in the sense of time scale and texture. A distinctive feature of the proposed algorithm is that the network decisions are independent of the behavior and identity of the individual sensors, which is desirable from computational perspectives. Results are presented to demonstrate the proposed method's efficacy to correctly identify the presence of a target with very low false-alarm rates. The performance of the underlying algorithm is compared with that of a recent data-driven, feature-level information fusion algorithm. It is shown that the proposed algorithm outperforms the other algorithm.
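
    A minimal sketch of the entropy computation underlying such a measure: the plug-in conditional entropy of a first-order Markov machine estimated from a symbol sequence (a generic illustration of ours; the paper's measure is the difference between this quantity for the Markov machine and for the x-Markov machine conditioned on the network information state):

    ```python
    from collections import Counter
    from math import log2

    def markov_conditional_entropy(symbols):
        """Entropy rate H(s_{t+1} | s_t), in bits, of the first-order Markov
        machine estimated from a symbol sequence by plug-in counts."""
        pairs = Counter(zip(symbols[:-1], symbols[1:]))
        firsts = Counter(symbols[:-1])
        n = len(symbols) - 1
        return -sum(c / n * log2(c / firsts[a]) for (a, b), c in pairs.items())

    # Each state transitions to 'a' or 'b' with equal frequency => 1 bit.
    print(markov_conditional_entropy("aabbaabbaabb"))
    ```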

  8. Thermo-electric transport in gauge/gravity models with momentum dissipation

    Science.gov (United States)

    Amoretti, Andrea; Braggio, Alessandro; Maggiore, Nicola; Magnoli, Nicodemo; Musso, Daniele

    2014-09-01

    We present a systematic definition and analysis of the thermo-electric linear response in gauge/gravity systems focusing especially on models with massive gravity in the bulk and therefore momentum dissipation in the dual field theory. A precise treatment of finite counter-terms proves to be essential to yield a consistent physical picture whose hydrodynamic and beyond-hydrodynamics behaviors noticeably match with field theoretical expectations. The model furnishes a possible gauge/gravity description of the crossover from the quantum-critical to the disorder-dominated Fermi-liquid behaviors, as expected in graphene.

  9. Information theoretic approach to tactile encoding and discrimination

    OpenAIRE

    Saal, Hannes

    2011-01-01

    The human sense of touch integrates feedback from a multitude of touch receptors, but how this information is represented in the neural responses such that it can be extracted quickly and reliably is still largely an open question. At the same time, dexterous robots equipped with touch sensors are becoming more common, necessitating better methods for representing sequentially updated information and new control strategies that aid in extracting relevant features for object man...

  10. Finite Element Modeling of Thermo Creep Processes Using Runge-Kutta Method

    Directory of Open Access Journals (Sweden)

    Yu. I. Dimitrienko

    2015-01-01

    Thermo creep deformations of most heat-resistant alloys, as a rule, depend nonlinearly on stresses and are practically non-reversible. Therefore, to calculate the behaviour of these materials, the theory of plastic flow is most widely used. Finite-element computations of the stress-strain state of structures accounting for thermo creep deformations have so far been performed using the main commercial software, including the ANSYS package. However, in most cases, to solve the nonlinear creep equations one applies explicit or implicit methods based on the Euler approximation of the time derivatives. The Euler method is sufficiently efficient in terms of random access memory, but it is costly in computation time and does not always provide the required accuracy for creep deformation computations. The paper offers a finite-element algorithm to solve a three-dimensional problem of thermo creep based on Runge-Kutta finite-difference schemes of different orders with respect to time. It shows a numerical test example solving the problem of thermo creep of a beam under tensile loading. The computed results demonstrate that using the Runge-Kutta method with increasing accuracy order allows a more accurate solution to be obtained: increasing the accuracy order by one decreases the relative error by approximately an order of magnitude as well. The developed algorithm proves to be efficient enough and can be recommended for solving more complicated problems of thermo creep of structures.
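
    As a toy version of the time integration discussed here, the sketch below advances a scalar Norton-type creep law with the classic fourth-order Runge-Kutta scheme; the material constants and the true-stress correction are made-up illustration values, not those of the paper:

    ```python
    def creep_rate(eps, sigma0=100e6, A=1e-45, n=5.0):
        """Illustrative Norton power-law rate with a true-stress correction
        sigma = sigma0 * (1 + eps); A, n, sigma0 are made-up values."""
        return A * (sigma0 * (1.0 + eps)) ** n

    def rk4_creep(t_end=3600.0, dt=10.0):
        """Classic 4th-order Runge-Kutta integration of d(eps)/dt = f(eps)."""
        eps, t = 0.0, 0.0
        while t < t_end - 1e-12:
            k1 = creep_rate(eps)
            k2 = creep_rate(eps + 0.5 * dt * k1)
            k3 = creep_rate(eps + 0.5 * dt * k2)
            k4 = creep_rate(eps + dt * k3)
            eps += dt * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0
            t += dt
        return eps

    print(f"creep strain after 1 h: {rk4_creep():.4e}")
    ```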

  11. GENUS STATISTICS USING THE DELAUNAY TESSELLATION FIELD ESTIMATION METHOD. I. TESTS WITH THE MILLENNIUM SIMULATION AND THE SDSS DR7

    International Nuclear Information System (INIS)

    Zhang Youcai; Yang Xiaohu; Springel, Volker

    2010-01-01

    We study the topology of cosmic large-scale structure through the genus statistics, using galaxy catalogs generated from the Millennium Simulation and observational data from the latest Sloan Digital Sky Survey Data Release (SDSS DR7). We introduce a new method for constructing galaxy density fields and for measuring the genus statistics of its isodensity surfaces. It is based on a Delaunay tessellation field estimation (DTFE) technique that allows the definition of a piece-wise continuous density field and the exact computation of the topology of its polygonal isodensity contours, without introducing any free numerical parameter. Besides this new approach, we also employ the traditional approaches of smoothing the galaxy distribution with a Gaussian of fixed width, or by adaptively smoothing with a kernel that encloses a constant number of neighboring galaxies. Our results show that the Delaunay-based method extracts the largest amount of topological information. Unlike the traditional approach for genus statistics, it is able to discriminate between the different theoretical galaxy catalogs analyzed here, both in real space and in redshift space, even though they are based on the same underlying simulation model. In particular, the DTFE approach detects with high confidence a discrepancy of one of the semi-analytic models studied here compared with the SDSS data, while the other models are found to be consistent.
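
    For reference, the benchmark against which measured genus curves are usually compared is the analytic genus density of a Gaussian random field, quoted here as standard background (it is not part of the abstract):

    ```latex
    g(\nu) \;=\; \frac{1}{(2\pi)^2}
        \left( \frac{\langle k^2 \rangle}{3} \right)^{3/2}
        \left( 1 - \nu^2 \right) e^{-\nu^2/2}
    ```

    where ν is the isodensity threshold in units of the rms density fluctuation and ⟨k²⟩ is the second moment of the smoothed power spectrum; departures of a measured genus curve from this shape signal non-Gaussian topology.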

  12. Biomass thermo-conversion. Research trends

    International Nuclear Information System (INIS)

    Rodriguez Machin, Lizet; Perez Bermudez, Raul; Quintana Perez, Candido Enrique; Ocanna Guevara, Victor Samuel; Duffus Scott, Alejandro

    2011-01-01

    This paper reviews the state of the art in order to identify the main trends in the processes of thermo-conversion of biomass into fuels and other chemicals. In Cuba, wood accounts for 19% of the total biomass supply and sugar cane bagasse and straw for 80%, which is why research in the country should be directed primarily toward these. The methods for energy production from biomass can be grouped into two classes: thermo-chemical and biological conversion routes. The technology of thermo-chemical conversion includes three subclasses: pyrolysis, gasification, and direct liquefaction. Although pyrolysis is still under development, it has received special attention in the current energy scenario, because it can convert biomass directly into solid, liquid and gaseous products by thermal decomposition in the absence of oxygen. The gasification of biomass is a thermal treatment where large quantities of gaseous products and small quantities of char and ash are produced. In Cuba, studies of biomass thermo-conversion are limited to slow pyrolysis and gasification; gaseous fuels from biomass are mainly obtained by digestion (biogas). (author)

  13. Reconstructing missing information on precipitation datasets: impact of tails on adopted statistical distributions.

    Science.gov (United States)

    Pedretti, Daniele; Beckie, Roger Daniel

    2014-05-01

    Missing data in hydrological time-series databases are ubiquitous in practical applications, yet it is of fundamental importance to make educated decisions in problems requiring exhaustive time-series knowledge. This includes precipitation datasets, since recording or human failures can produce gaps in these time series. For some applications, directly involving the ratio between precipitation and some other quantity, lack of complete information can result in poor understanding of the basic physical and chemical dynamics involving precipitated water. For instance, the ratio between precipitation (recharge) and outflow rates at a discharge point of an aquifer (e.g. rivers, pumping wells, lysimeters) can be used to obtain aquifer parameters and thus to constrain model-based predictions. We tested a suite of methodologies to reconstruct missing information in rainfall datasets. The goal was to obtain a suitable and versatile method to reduce the errors caused by the lack of data in specific time windows. Our analyses included both a classical chronological-pairing approach between rainfall stations and a probability-based approach, which accounted for the probability of exceedance of rain depths measured at two or multiple stations. Our analyses proved that it is not clear a priori which method delivers the best results. Rather, this selection should be based on the specific statistical properties of the rainfall dataset. In this presentation, our emphasis is on discussing the effects of a few typical parametric distributions used to model the behavior of rainfall. Specifically, we analyzed the role of distributional "tails", which have an important control on the occurrence of extreme rainfall events. The latter strongly affect several hydrological applications, including recharge-discharge relationships. The heavy-tailed distributions we considered were the parametric Log-Normal, Generalized Pareto, Generalized Extreme Value and Gamma distributions. The methods were
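
    A minimal sketch of such a comparison with scipy.stats, fitting the four candidate families to (here synthetic) wet-day rain depths and ranking them by AIC; the data and the unconstrained use of .fit are illustrative simplifications:

    ```python
    import numpy as np
    from scipy import stats

    # Daily rain depths (mm) on wet days -- synthetic stand-in data.
    rng = np.random.default_rng(1)
    rain = rng.gamma(shape=0.8, scale=12.0, size=2000)

    candidates = {
        "gamma": stats.gamma, "lognorm": stats.lognorm,
        "genpareto": stats.genpareto, "genextreme": stats.genextreme,
    }
    for name, dist in candidates.items():
        params = dist.fit(rain)                      # unconstrained MLE fit
        ll = np.sum(dist.logpdf(rain, *params))      # log-likelihood
        aic = 2 * len(params) - 2 * ll
        print(f"{name:10s} AIC = {aic:10.1f}")
    ```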

  14. A THEORETICAL APPROACH TO THE TRANSITION FROM A RESOURCE BASED TO A KNOWLEDGE-ECONOMY

    Directory of Open Access Journals (Sweden)

    Diana GIOACASI

    2015-09-01

    Economic development and the emergence of new technologies have changed the perspective on the factors that generate added value. The transition from a resource-dependent economy to one focused on intangible, non-financial factors has progressed in a gradual manner and took place under the influence of globalization and the internet boom. The aim of this article is to provide a theoretical approach to this phenomenon from the perspective of the temporal evolution of enterprise resources.

  15. Twistor-theoretic approach to topological field theories

    International Nuclear Information System (INIS)

    Ito, Kei.

    1991-12-01

    The two-dimensional topological field theory which describes a four-dimensional self-dual space-time (gravitational instanton) as a target space, which we constructed before, is shown to be deeply connected with Penrose's 'twistor theory'. The relations are presented in detail. Thus our theory offers a 'twistor theoretic' approach to topological field theories. (author)

  16. Applications of chitosan-based thermo-sensitive copolymers for harvesting living cell sheet

    International Nuclear Information System (INIS)

    Chen, J.-P.; Yang, T.-F.

    2008-01-01

    A thermo-sensitive chitosan-based copolymer hydrogel was used for harvesting living cell sheets. The hydrogel was tested by culturing 3T3 cells at 37 deg. C and incubating the confluent cells at 20 deg. C for spontaneous detachment of the cell sheets from the hydrogel surface without enzyme treatment. Results from cell viability assays and microscopy observations demonstrated that cells could attach to the hydrogel surface and maintain high viability and proliferation ability. Cell detachment efficiency from the hydrogel was about 80%. The detached cell sheet retained high viability and could proliferate again after being transferred to a new culture surface

  17. Combination of statistical and physically based methods to assess shallow slide susceptibility at the basin scale

    Science.gov (United States)

    Oliveira, Sérgio C.; Zêzere, José L.; Lajas, Sara; Melo, Raquel

    2017-07-01

    Approaches used to assess shallow slide susceptibility at the basin scale are conceptually different depending on the use of statistical or physically based methods. The former are based on the assumption that the same causes are more likely to produce the same effects, whereas the latter are based on the comparison between forces which tend to promote movement along the slope and the counteracting forces that are resistant to motion. Within this general framework, this work tests two hypotheses: (i) although conceptually and methodologically distinct, the statistical and deterministic methods generate similar shallow slide susceptibility results regarding the model's predictive capacity and spatial agreement; and (ii) the combination of shallow slide susceptibility maps obtained with statistical and physically based methods, for the same study area, generate a more reliable susceptibility model for shallow slide occurrence. These hypotheses were tested at a small test site (13.9 km2) located north of Lisbon (Portugal), using a statistical method (the information value method, IV) and a physically based method (the infinite slope method, IS). The landslide susceptibility maps produced with the statistical and deterministic methods were combined into a new landslide susceptibility map. The latter was based on a set of integration rules defined by the cross tabulation of the susceptibility classes of both maps and analysis of the corresponding contingency tables. The results demonstrate a higher predictive capacity of the new shallow slide susceptibility map, which combines the independent results obtained with statistical and physically based models. Moreover, the combination of the two models allowed the identification of areas where the results of the information value and the infinite slope methods are contradictory. Thus, these areas were classified as uncertain and deserve additional investigation at a more detailed scale.
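
    Minimal sketches of the two ingredients, in their textbook forms (the variable names and the simple slope-parallel seepage assumption are ours, not from the paper):

    ```python
    import numpy as np

    def information_value(class_ids, landslide, n_classes):
        """Information value per predictor class:
        IV_i = ln( (S_i / N_i) / (S / N) ), with S_i landslide cells and
        N_i total cells in class i."""
        N, S = len(class_ids), landslide.sum()
        iv = np.zeros(n_classes)
        for i in range(n_classes):
            mask = class_ids == i
            Ni, Si = mask.sum(), landslide[mask].sum()
            iv[i] = np.log((Si / Ni) / (S / N)) if Si > 0 and Ni > 0 else 0.0
        return iv

    def infinite_slope_fs(c, phi_deg, gamma, z, m, beta_deg, gamma_w=9.81):
        """Infinite-slope factor of safety with slope-parallel seepage:
        FS = [c + (gamma - gamma_w*m) * z * cos^2(beta) * tan(phi)]
             / (gamma * z * sin(beta) * cos(beta))."""
        b, p = np.radians(beta_deg), np.radians(phi_deg)
        num = c + (gamma - gamma_w * m) * z * np.cos(b) ** 2 * np.tan(p)
        return num / (gamma * z * np.sin(b) * np.cos(b))
    ```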

  18. Statistics of Local Extremes

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Bierbooms, W.; Hansen, Kurt Schaldemose

    2003-01-01

    A theoretical expression for the probability density function associated with local extremes of a stochastic process is presented. The expression is based on the lower four statistical moments and a bandwidth parameter. The theoretical expression is subsequently verified by comparison with simulated...

  19. Modern applied U-statistics

    CERN Document Server

    Kowalski, Jeanne

    2008-01-01

    A timely and applied approach to the newly discovered methods and applications of U-statistics. Built on years of collaborative research and academic experience, Modern Applied U-Statistics successfully presents a thorough introduction to the theory of U-statistics using in-depth examples and applications that address contemporary areas of study including biomedical and psychosocial research. Utilizing a "learn by example" approach, this book provides an accessible, yet in-depth, treatment of U-statistics, as well as addresses key concepts in asymptotic theory by integrating translational and cross-disciplinary research. The authors begin with an introduction of the essential and theoretical foundations of U-statistics such as the notion of convergence in probability and distribution, basic convergence results, stochastic O's, inference theory, generalized estimating equations, as well as the definition and asymptotic properties of U-statistics. With an emphasis on nonparametric applications when and where applic...

  20. Thermo-mechanical design and testing of a microbalance for space applications

    Science.gov (United States)

    Scaccabarozzi, Diego; Saggin, Bortolino; Tarabini, Marco; Palomba, Ernesto; Longobardo, Andrea; Zampetti, Emiliano

    2014-12-01

    This work focuses on the thermo-mechanical design of the microbalance used for the VISTA (Volatile In Situ Thermogravimetry Analyzer) sensor. VISTA has been designed to operate in situ in different space environments (asteroids, Mars, icy satellites). In this paper we focus on its application on Mars, where the expected environmental conditions are the most challenging for the thermo-mechanical design. The microbalance holding system has been designed to ensure piezoelectric crystal integrity against the high vibration levels during launch and landing and to cope with the unavoidable thermo-elastic differential displacements due to CTE and temperature differences between the microbalance elements. The crystal holding system, based on three symmetrical titanium supports, provides also the electrical connections needed for crystal actuation, microbalance heating and temperature measurement on the electrode area. On the microbalance crystal surfaces the electrodes, a micro film heater (optimized to perform thermo-gravimetric analysis up to 400 °C) and a resistive thermometer are deposited through a vacuum sputtering process. A mockup of the system has been manufactured and tested at the expected vibration levels and the thermal control effectiveness has been verified in thermo-vacuum environment.

  1. Categorical data processing for real estate objects valuation using statistical analysis

    Science.gov (United States)

    Parygin, D. S.; Malikov, V. P.; Golubev, A. V.; Sadovnikova, N. P.; Petrova, T. M.; Finogeev, A. G.

    2018-05-01

    Theoretical and practical approaches to the use of statistical methods for studying various properties of infrastructure objects are analyzed in the paper. Methods of forecasting the value of objects are considered. A method for coding categorical variables describing properties of real estate objects is proposed. The analysis of the results of modeling the price of real estate objects using regression analysis and an algorithm based on a comparative approach is carried out.
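
    The abstract does not give the coding scheme itself, but a common baseline for nominal attributes is one-hot (dummy) coding; a toy sketch with invented attribute names and prices:

    ```python
    import pandas as pd
    from sklearn.linear_model import LinearRegression

    # Toy listing table; column names and values are invented for illustration.
    df = pd.DataFrame({
        "district":  ["centre", "north", "centre", "south", "north"],
        "wall_type": ["brick", "panel", "brick", "brick", "panel"],
        "area_m2":   [54, 38, 61, 47, 40],
        "price":     [90_000, 52_000, 99_000, 61_000, 55_000],
    })

    # One-hot coding turns each category into an indicator column,
    # making nominal attributes usable in a regression model.
    X = pd.get_dummies(df[["district", "wall_type", "area_m2"]],
                       columns=["district", "wall_type"], drop_first=True)
    model = LinearRegression().fit(X, df["price"])
    print(dict(zip(X.columns, model.coef_.round(0))))
    ```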

  2. Physical modeling and characterization of thermo-acoustic loudspeakers made of silver nano-wire films

    Science.gov (United States)

    La Torraca, P.; Larcher, L.; Bobinger, M.; Pavan, P.; Seeber, B.; Lugli, P.

    2017-06-01

    Recent developments of ultra-low heat capacity nanostructured materials have revived interest in thermo-acoustic (TA) loudspeaker technology, which offers important advantages over classical dynamic loudspeakers: lower cost and weight, flexibility, conformability to surfaces of various shapes, and transparency. The development of the TA loudspeaker technology requires accurate physical models connecting the material properties to the thermal and acoustic performance of the speaker. We present here a combined theoretical and experimental analysis of TA loudspeakers, where the electro-thermal and the thermo-acoustic transductions are handled separately, thus allowing an in-depth description of both the pressure and temperature dynamics. The electro-thermal transduction is analyzed by accounting for all the heat flow processes taking place between the TA loudspeaker and the surrounding environment, with focus on their frequency dependence. The thermo-acoustic conversion is studied by solving the coupled thermo-acoustic equations, derived from the Navier-Stokes equations, and by exploiting the Huygens-Fresnel principle to decompose the TA loudspeaker surface into a dense set of TA point sources. A general formulation of the 3D pressure field is derived by summing up the TA point source contributions via a Rayleigh integral. The model is validated against temperature and sound pressure level measured on a TA loudspeaker sample made of a silver nanowire random network deposited on a polyimide substrate. A good agreement is found between measurements and simulations, demonstrating that the model is capable of connecting material properties to the thermo-acoustic performance of the device, thus providing a valuable tool for the design and optimization of TA loudspeakers.
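
    The decomposition into thermo-acoustic point sources summed via a Rayleigh integral can be illustrated numerically. The sketch below, with invented geometry and source strengths (none taken from the paper), discretizes a small membrane into monopole sources and sums their contributions at an on-axis field point.

    ```python
    # A minimal numerical sketch of the Rayleigh-integral idea described
    # above: decompose a radiating surface into point sources and sum their
    # contributions to get the pressure at a field point.  Source strength
    # and geometry are illustrative assumptions, not the paper's model.
    import numpy as np

    def pressure_at(field_point, src_points, src_strength, k):
        """Sum monopole contributions exp(-jkr)/r over all surface points."""
        r = np.linalg.norm(src_points - field_point, axis=1)
        return np.sum(src_strength * np.exp(-1j * k * r) / (2 * np.pi * r))

    # Discretize a 2 cm x 2 cm square membrane into a grid of point sources.
    xs = np.linspace(-0.01, 0.01, 40)
    gx, gy = np.meshgrid(xs, xs)
    sources = np.column_stack([gx.ravel(), gy.ravel(), np.zeros(gx.size)])

    f = 10_000.0                      # assumed drive frequency, Hz
    c = 343.0                         # speed of sound in air, m/s
    k = 2 * np.pi * f / c             # wavenumber
    q = 1e-6                          # per-source strength (arbitrary units)

    p = pressure_at(np.array([0.0, 0.0, 0.5]), sources, q, k)
    print("pressure magnitude at 0.5 m on-axis:", abs(p))
    ```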

  3. Gross greenhouse gas fluxes from hydro-power reservoir compared to thermo-power plants

    International Nuclear Information System (INIS)

    Santos, Marco Aurelio dos; Pinguelli Rosa, Luiz; Sikar, Bohdan; Sikar, Elizabeth; Santos, Ednaldo Oliveira dos

    2006-01-01

    This paper presents the findings of gross carbon dioxide and methane emission measurements in several Brazilian hydro-reservoirs, compared with thermo-power generation. The term 'gross emissions' refers to gas flux measurements from the reservoir surface that do not deduct pre-impoundment emissions from natural bodies such as the river channel, seasonal flooding and terrestrial ecosystems; net emissions result from deducting these pre-existing emissions from those of the reservoir. A power dam emits biogenic gases such as CO2 and CH4. However, studies comparing gas emissions (gross emissions) from the reservoir surface with emissions from thermo-power generation technologies show that the hydro-based option gives better results in most cases analyzed. In this study, measurements were carried out in the Miranda, Barra Bonita, Segredo, Tres Marias, Xingo, Samuel and Tucurui reservoirs, located in two different climatological regimes. Additional data from measurements taken at the Itaipu and Serra da Mesa reservoirs were also used. Comparisons were also made between emissions from hydro-power plants and their thermo-based equivalents. Bearing in mind that the estimated values for hydro-power plants include emissions that are not entirely anthropogenic, the hydro-power plants studied generally posted lower emissions than their equivalent thermo-based counterparts. Hydro-power complexes with greater power densities (capacity/area flooded, W/m2), such as Itaipu, Xingo, Segredo and Miranda, perform best, well above thermo-power plants using state-of-the-art technology (combined cycle fueled by natural gas, with 50% efficiency). On the other hand, some hydro-power complexes with low power density perform only slightly better than, or even worse than, their thermo-power counterparts.

  4. Basics of modern mathematical statistics

    CERN Document Server

    Spokoiny, Vladimir

    2015-01-01

    This textbook provides a unified and self-contained presentation of the main approaches to and ideas of mathematical statistics. It collects the basic mathematical ideas and tools needed as a basis for more serious studies or even independent research in statistics. The majority of existing textbooks in mathematical statistics follow the classical asymptotic framework. Yet, as modern statistics has changed rapidly in recent years, new methods and approaches have appeared. The emphasis is on finite sample behavior, large parameter dimensions, and model misspecifications. The present book provides a fully self-contained introduction to the world of modern mathematical statistics, collecting the basic knowledge, concepts and findings needed for doing further research in the modern theoretical and applied statistics. This textbook is primarily intended for graduate and postdoc students and young researchers who are interested in modern statistical methods.

  5. Uncertainties of flood frequency estimation approaches based on continuous simulation using data resampling

    Science.gov (United States)

    Arnaud, Patrick; Cantet, Philippe; Odry, Jean

    2017-11-01

    Flood frequency analyses (FFAs) are needed for flood risk management. Many methods exist, ranging from classical purely statistical approaches to more complex approaches based on process simulation. The results of these methods are associated with uncertainties that are sometimes difficult to estimate due to the complexity of the approaches or the number of parameters, especially for process simulation. This is the case for the simulation-based FFA approach called SHYREG presented in this paper, in which a rainfall generator is coupled with a simple rainfall-runoff model; we attempt to estimate the uncertainties arising from the estimation of the seven parameters needed to compute flood frequencies. The six parameters of the rainfall generator are mean values, so their theoretical distribution is known and can be used to estimate the generator uncertainties. In contrast, the theoretical distribution of the single hydrological model parameter is unknown; consequently, a bootstrap method is applied to estimate the calibration uncertainties. The propagation of uncertainty from the rainfall generator to the hydrological model is also taken into account. This method is applied to 1112 basins throughout France. Uncertainties coming from the SHYREG method and from purely statistical approaches are compared, and the results are discussed according to the length of the recorded observations, basin size and basin location. Uncertainties of the SHYREG method decrease as the basin size increases or as the length of the recorded flow increases. Moreover, the results show that the confidence intervals of the SHYREG method are relatively small despite the complexity of the method and the number of parameters (seven). This is due to the stability of the parameters and takes into account the dependence of uncertainties due to the rainfall model and the hydrological calibration. Indeed, the uncertainties on the flow quantiles are on the same order of magnitude as those associated with...
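
    The bootstrap step for the single hydrological parameter can be sketched in a few lines. The code below uses a deliberately trivial stand-in model and synthetic flows, not SHYREG itself: it resamples the record with replacement, recalibrates the parameter on each resample, and reads a 95% confidence interval off the bootstrap distribution.

    ```python
    # A minimal sketch of the bootstrap step described above: resample the
    # observed flows, recalibrate the single hydrological parameter on each
    # resample, and read a confidence interval off the calibrated values.
    # The "model" here is a deliberately trivial stand-in, not SHYREG.
    import numpy as np

    rng = np.random.default_rng(42)
    observed_flows = rng.gumbel(loc=100.0, scale=30.0, size=40)  # synthetic record

    def calibrate(flows):
        """Stand-in calibration: fit one parameter (here, a scale factor)."""
        return flows.mean() / 100.0

    n_boot = 2000
    boot_params = np.array([
        calibrate(rng.choice(observed_flows, size=len(observed_flows), replace=True))
        for _ in range(n_boot)
    ])

    lo, hi = np.percentile(boot_params, [2.5, 97.5])
    print(f"parameter: {calibrate(observed_flows):.3f}, 95% CI: [{lo:.3f}, {hi:.3f}]")
    ```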

  6. Functional integral approach to classical statistical dynamics

    International Nuclear Information System (INIS)

    Jensen, R.V.

    1980-04-01

    A functional integral method is developed for the statistical solution of nonlinear stochastic differential equations which arise in classical dynamics. The functional integral approach provides a very natural and elegant derivation of the statistical dynamical equations that have been derived using the operator formalism of Martin, Siggia, and Rose

  7. Success Determination by Innovation: A Theoretical Approach in Marketing

    Directory of Open Access Journals (Sweden)

    Raj Kumar Gautam

    2012-10-01

    Full Text Available The paper aims to identify the main issues in marketing that need the immediate attention of marketers. The importance of innovation in marketing is highlighted, and the elements of the marketing mix are related to innovative and creative ideas. The study is based on secondary data; various research papers and articles were studied to develop an innovative approach to marketing. Innovative marketing ideas relating to business lead generation, product, price, distribution, promotion and revenue generation are highlighted in the paper. All suggestions are theoretical and may have relevance and implications for marketers.

  9. Thermo-hydrodynamic lubrication in hydrodynamic bearings

    CERN Document Server

    Bonneau, Dominique; Souchet, Dominique

    2014-01-01

    This series provides the elements necessary for the development and validation of numerical prediction models for hydrodynamic bearings. This book describes thermo-hydrodynamic and thermo-elasto-hydrodynamic lubrication. The algorithms are methodically detailed and each section is thoroughly illustrated.

  10. Statistical approach to quantum field theory. An introduction

    International Nuclear Information System (INIS)

    Wipf, Andreas

    2013-01-01

    Based on course-tested notes and pedagogical in style, authored by a leading researcher in the field, this book contains end-of-chapter problems and listings of short, useful computer programs. Over the past few decades the powerful methods of statistical physics and Euclidean quantum field theory have moved closer together, with common tools based on the use of path integrals. The interpretation of Euclidean field theories as particular systems of statistical physics has opened up new avenues for understanding strongly coupled quantum systems or quantum field theories at zero or finite temperatures. Accordingly, the first chapters of this book contain a self-contained introduction to path integrals in Euclidean quantum mechanics and statistical mechanics. The resulting high-dimensional integrals can be estimated with the help of Monte Carlo simulations based on Markov processes. The most commonly used algorithms are presented in detail so as to prepare the reader for the use of high-performance computers as an "experimental" tool for this burgeoning field of theoretical physics. Several chapters are then devoted to an introduction to simple lattice field theories and a variety of spin systems with discrete and continuous spins, where the ubiquitous Ising model serves as an ideal guide for introducing the fascinating area of phase transitions. As an alternative to the lattice formulation of quantum field theories, variants of the flexible renormalization group methods are discussed in detail. Since, according to our present-day knowledge, all fundamental interactions in nature are described by gauge theories, the remaining chapters of the book deal with gauge theories without and with matter. This text is based on course-tested notes for graduate students and, as...
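
    The Markov-chain Monte Carlo machinery the book introduces can be illustrated with the Ising model it uses as a guide. Below is a minimal Metropolis sketch for the 2D Ising model; the lattice size, temperature and sweep count are illustrative choices, not the book's.

    ```python
    # A minimal Metropolis sketch for the 2D Ising model, the "ideal guide"
    # the text mentions; lattice size, temperature and sweep count are
    # illustrative choices.
    import numpy as np

    rng = np.random.default_rng(7)
    L, T, sweeps = 32, 2.269, 400          # T near the critical temperature
    spins = rng.choice([-1, 1], size=(L, L))

    for _ in range(sweeps * L * L):
        i, j = rng.integers(0, L, size=2)
        # Energy change from flipping spin (i, j), periodic boundaries.
        nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nn
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1

    print("magnetization per spin:", abs(spins.mean()))
    ```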

  11. Statistical physics of vaccination

    Science.gov (United States)

    Wang, Zhen; Bauch, Chris T.; Bhattacharyya, Samit; d'Onofrio, Alberto; Manfredi, Piero; Perc, Matjaž; Perra, Nicola; Salathé, Marcel; Zhao, Dawei

    2016-12-01

    Historically, infectious diseases caused considerable damage to human societies, and they continue to do so today. To help reduce their impact, mathematical models of disease transmission have been studied to help understand disease dynamics and inform prevention strategies. Vaccination, one of the most important preventive measures of modern times, is of great interest both theoretically and empirically. In contrast to traditional approaches, recent research increasingly explores the pivotal implications of individual behavior and heterogeneous contact patterns in populations. Our report reviews the developmental arc of theoretical epidemiology with emphasis on vaccination, as it led from classical models assuming homogeneously mixing (mean-field) populations and ignoring human behavior, to recent models that account for behavioral feedback and/or population spatial/social structure. Many of the methods used originated in statistical physics, such as lattice and network models, and their associated analytical frameworks. Similarly, the feedback loop between vaccinating behavior and disease propagation forms a coupled nonlinear system with analogs in physics. We also review the new paradigm of digital epidemiology, wherein sources of digital data such as online social media are mined for high-resolution information on epidemiologically relevant individual behavior. Armed with the tools and concepts of statistical physics, and further assisted by new sources of digital data, models that capture nonlinear interactions between behavior and disease dynamics offer a novel way of modeling real-world phenomena, and can help improve health outcomes. We conclude the review by discussing open problems in the field and promising directions for future research.
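
    One of the simplest models in this family, an SIR epidemic on a random contact network with a pre-vaccinated fraction of nodes, can be sketched as follows. All parameters (network size, connection probability, vaccination coverage, transmissibility) are illustrative assumptions, not values from the review.

    ```python
    # A minimal sketch of the kind of network model the review surveys:
    # a discrete-time SIR epidemic on a random graph in which a fraction of
    # nodes is vaccinated (removed) in advance.  Parameters are illustrative.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(3)
    G = nx.erdos_renyi_graph(n=2000, p=0.004, seed=3)

    vaccinated = set(rng.choice(G.number_of_nodes(), size=600, replace=False))
    infected = {next(iter(set(G.nodes) - vaccinated))}   # one seed case
    recovered = set(vaccinated)            # vaccinated nodes cannot transmit
    beta = 0.3                             # per-contact infection probability

    while infected:
        new_infected = set()
        for u in infected:
            for v in G.neighbors(u):
                if v not in infected and v not in recovered and rng.random() < beta:
                    new_infected.add(v)
        recovered |= infected
        infected = new_infected

    print("final size:", len(recovered) - len(vaccinated), "of",
          G.number_of_nodes() - len(vaccinated), "unvaccinated nodes")
    ```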

  12. A Survey of Game Theoretic Approaches to Modelling Decision-Making in Information Warfare Scenarios

    Directory of Open Access Journals (Sweden)

    Kathryn Merrick

    2016-07-01

    Full Text Available Our increasing dependence on information technologies and autonomous systems has escalated international concern for information- and cyber-security in the face of politically, socially and religiously motivated cyber-attacks. Information warfare tactics that interfere with the flow of information can challenge the survival of individuals and groups. It is increasingly important that both humans and machines can make decisions that ensure the trustworthiness of information, communication and autonomous systems. Subsequently, an important research direction is concerned with modelling decision-making processes. One approach to this involves modelling decision-making scenarios as games using game theory. This paper presents a survey of information warfare literature, with the purpose of identifying games that model different types of information warfare operations. Our contribution is a systematic identification and classification of information warfare games, as a basis for modelling decision-making by humans and machines in such scenarios. We also present a taxonomy of games that map to information warfare and cyber crime problems as a precursor to future research on decision-making in such scenarios. We identify and discuss open research questions including the role of behavioural game theory in modelling human decision making and the role of machine decision-making in information warfare scenarios.
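
    A minimal instance of the game-theoretic modelling the survey classifies is a two-player zero-sum attacker-versus-defender game, solvable by linear programming. The payoff matrix below is invented for illustration; the survey itself covers a much broader taxonomy of games.

    ```python
    # A minimal sketch of one modelling idea from the survey: a two-player
    # zero-sum "attacker vs. defender" game over information assets, solved
    # for the defender's optimal mixed strategy by linear programming.
    # The payoff matrix is invented for illustration.
    import numpy as np
    from scipy.optimize import linprog

    # payoff[i, j]: defender's payoff when defending asset i
    # while the attacker strikes asset j.
    payoff = np.array([
        [ 3.0, -1.0, -2.0],
        [-1.0,  2.0, -1.0],
        [-2.0, -1.0,  4.0],
    ])
    m, n = payoff.shape

    # Variables: defender's mixed strategy x (m values) and game value v.
    # Maximize v subject to payoff.T @ x >= v, sum(x) == 1, x >= 0.
    c = np.zeros(m + 1); c[-1] = -1.0                       # minimize -v
    A_ub = np.hstack([-payoff.T, np.ones((n, 1))])          # v - payoff.T @ x <= 0
    b_ub = np.zeros(n)
    A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])
    b_eq = [1.0]
    bounds = [(0, None)] * m + [(None, None)]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    print("defender strategy:", np.round(res.x[:m], 3),
          "game value:", round(res.x[-1], 3))
    ```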

  13. Association of Trans-theoretical Model (TTM) based Exercise Behavior Change with Body Image Evaluation among Female Iranian Students

    Directory of Open Access Journals (Sweden)

    Sahar Rostami

    2017-03-01

    Full Text Available Background: Body image is a determinant of individual attractiveness and physical activity among young people. This study aimed to assess the association of Trans-theoretical model based exercise behavior change with body image evaluation among female Iranian students. Materials and Methods: This cross-sectional study was conducted in Sanandaj city, Iran in 2016. Using a multistage sampling method, a total of 816 high school female students were included in the study. They completed a three-section questionnaire covering demographic information, Trans-theoretical model constructs and body image evaluation. The obtained data were fed into SPSS version 21.0. Results: The results showed that more than 60% of participants were in the pre-contemplation and contemplation stages of exercise behavior. The means of perceived self-efficacy, barriers and benefits were found to have a statistically significant difference across the stages of exercise behavior change (P

  14. A User-Centered Approach to Adaptive Hypertext Based on an Information Relevance Model

    Science.gov (United States)

    Mathe, Nathalie; Chen, James

    1994-01-01

    Rapid and effective access to information in large electronic documentation systems can be facilitated if information relevant in an individual user's context can be automatically supplied to this user. However, most of this knowledge on contextual relevance is not found within the contents of documents; rather, it is established incrementally by users during information access. We propose a new model for interactively learning contextual relevance during information retrieval, and incrementally adapting retrieved information to individual user profiles. The model, called a relevance network, records the relevance of references based on user feedback for specific queries and user profiles. It also generalizes such knowledge to later derive relevant references for similar queries and profiles. The relevance network lets users filter information by context of relevance. Compared to other approaches, it does not require any prior knowledge or training. More importantly, our approach to adaptivity is user-centered. It facilitates acceptance and understanding by users by giving them shared control over the adaptation without disturbing their primary task. Users easily control when to adapt and when to use the adapted system. Lastly, the model is independent of the particular application used to access information, and supports sharing of adaptations among users.
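
    A toy version of such a relevance network can be sketched as follows. The data structures and the token-overlap generalization rule are illustrative assumptions about the model's behavior, not its published specification.

    ```python
    # A minimal sketch of the relevance-network idea described above:
    # record user feedback per (profile, query, reference) triple and
    # generalize to similar queries by term overlap.  All structures are
    # illustrative assumptions, not the published model.
    from collections import defaultdict

    feedback = defaultdict(float)   # (profile, query terms, reference) -> score

    def record(profile, query, reference, relevant):
        """Store user feedback for a specific query and user profile."""
        key = (profile, frozenset(query.lower().split()), reference)
        feedback[key] += 1.0 if relevant else -1.0

    def suggest(profile, query, min_overlap=0.5):
        """Generalize recorded relevance to similar queries via term overlap."""
        terms = set(query.lower().split())
        scores = defaultdict(float)
        for (prof, q_terms, ref), score in feedback.items():
            overlap = len(terms & q_terms) / len(terms | q_terms)
            if prof == profile and overlap >= min_overlap:
                scores[ref] += overlap * score
        return sorted(scores, key=scores.get, reverse=True)

    record("engineer", "turbine blade fatigue", "doc-112", relevant=True)
    record("engineer", "turbine blade coating", "doc-258", relevant=True)
    record("manager",  "turbine blade fatigue", "doc-301", relevant=True)
    print(suggest("engineer", "blade fatigue cracks"))   # -> ['doc-112']
    ```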

  15. An integrated organisation-wide data quality management and information governance framework: theoretical underpinnings

    Directory of Open Access Journals (Sweden)

    Siaw-Teng Liaw

    2014-10-01

    Full Text Available Introduction: Increasing investment in eHealth aims to improve cost effectiveness and safety of care. Data extraction and aggregation can create new data products to improve professional practice and provide feedback to improve the quality of source data. A previous systematic review concluded that locally relevant clinical indicators and use of clinical record systems could support clinical governance. We aimed to extend and update the review with a theoretical framework. Methods: We searched PubMed, Medline, Web of Science, ABI Inform (Proquest) and Business Source Premier (EBSCO) using the terms curation, information ecosystem, data quality management (DQM), data governance, information governance (IG) and data stewardship. We focused on and analysed the scope of DQM and IG processes, theoretical frameworks, and determinants of the processing, quality assurance, presentation and sharing of data across the enterprise. Findings: There are good theoretical reasons for integrated governance, but there is variable alignment of DQM, IG and health system objectives across the health enterprise. Ethical constraints exist that require health information ecosystems to process data in ways that are aligned with improving health and system efficiency and ensuring patient safety. Despite an increasingly ‘big-data’ environment, DQM and IG in health services are still fragmented across the data production cycle. We extend current work on DQM and IG with a theoretical framework for integrated IG across the data cycle. Conclusions: The dimensions of this theory-based framework would require testing with qualitative and quantitative studies to examine the applicability and utility, along with an evaluation of its impact on data quality across the health enterprise.

  16. Assessment of statistical methods used in library-based approaches to microbial source tracking.

    Science.gov (United States)

    Ritter, Kerry J; Carruthers, Ethan; Carson, C Andrew; Ellender, R D; Harwood, Valerie J; Kingsley, Kyle; Nakatsu, Cindy; Sadowsky, Michael; Shear, Brian; West, Brian; Whitlock, John E; Wiggins, Bruce A; Wilbur, Jayson D

    2003-12-01

    Several commonly used statistical methods for fingerprint identification in microbial source tracking (MST) were examined to assess the effectiveness of pattern-matching algorithms to correctly identify sources. Although numerous statistical methods have been employed for source identification, no widespread consensus exists as to which is most appropriate. A large-scale comparison of several MST methods, using identical fecal sources, presented a unique opportunity to assess the utility of several popular statistical methods. These included discriminant analysis, nearest neighbour analysis, maximum similarity and average similarity, along with several measures of distance or similarity. Threshold criteria for excluding uncertain or poorly matched isolates from final analysis were also examined for their ability to reduce false positives and increase prediction success. Six independent libraries used in the study were constructed from indicator bacteria isolated from fecal materials of humans, seagulls, cows and dogs. Three of these libraries were constructed using the rep-PCR technique and three relied on antibiotic resistance analysis (ARA). Five of the libraries were constructed using Escherichia coli and one using Enterococcus spp. (ARA). Overall, the outcome of this study suggests a high degree of variability across statistical methods. Despite large differences in correct classification rates among the statistical methods, no single statistical approach emerged as superior. Thresholds failed to consistently increase rates of correct classification and improvement was often associated with substantial effective sample size reduction. Recommendations are provided to aid in selecting appropriate analyses for these types of data.
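
    The basic pattern-matching setup compared in the study, matching an isolate's fingerprint against a source library with a similarity threshold for rejection, can be sketched as follows. The fingerprints, similarity measure and threshold are synthetic stand-ins, not the study's libraries or criteria.

    ```python
    # A minimal sketch of the pattern-matching setup compared above:
    # classify fingerprint profiles against a host-source library with a
    # nearest-neighbour rule, and exclude isolates whose best match falls
    # below a similarity threshold.  Data and threshold are synthetic.
    import numpy as np

    rng = np.random.default_rng(11)
    sources = ["human", "gull", "cow", "dog"]

    # Library: 50 fingerprint profiles (20 binary bands) per source.
    prototypes = rng.random((4, 20))
    library = np.vstack([(rng.random((50, 20)) < p).astype(float)
                         for p in prototypes])
    labels = np.repeat(sources, 50)

    def classify(profile, threshold=0.8):
        """Nearest neighbour by simple matching similarity, with rejection."""
        similarity = (library == profile).mean(axis=1)
        best = similarity.argmax()
        if similarity[best] < threshold:
            return "unclassified"      # uncertain isolates are excluded
        return labels[best]

    isolate = (rng.random(20) < prototypes[2]).astype(float)  # "cow"-like
    print(classify(isolate))
    ```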

  17. THEORETICAL ASPECTS OF INFORMATIONAL SERVICES REGIONAL MARKET EFFECTIVE DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    I.N. Korabejnikov

    2008-12-01

    Full Text Available The peculiarities and priorities of the formation of a regional market for informational services, as part of a network model of economic development, are described in this article. The authors present a classification of the factors that influence the effectiveness of the development of the regional informational services market. Theoretical aspects of the effective development of such a market are shown.

  18. The informal recycling in the international and local context: theoretical Elements

    International Nuclear Information System (INIS)

    Yepes P, Dora Luz

    2002-01-01

    This article synthesizes the theoretical aspects of the urban problem of informal recycling in our setting. It is framed within the research project 'Alternatives for the strengthening of informal recycling in Medellin', a graduate thesis that seeks to strengthen informal recycling through the study of the factors associated with the labor productivity of informal recyclers. Specifically, the study will identify options for improving their work and will propose alternatives to dignify the labor of these people in light of environmental, technical, normative, institutional, social and sustainability precepts. This document describes the theoretical elements on which the investigation is based, placing informal recycling within an international context and describing its situation at the national and local levels. As a result of the bibliographical review carried out, it can be said that there is little interest at the international level in improving the working conditions of informal recyclers, apart from the strategies outlined by the International Labour Organization for strengthening the informal economy. In Latin America, it has not been possible to go beyond official rhetoric and the promotion of environmentalist groups, although there has been sustained progress on policies for the recovery, reuse and recycling of solid wastes. At the national level, clear strategies to improve the informal work of recyclers are being identified; however, many efforts are still needed to carry out the actions committed to under these strategies, even though recycling organizations have gradually been created.

  19. Theoretical information reuse and integration

    CERN Document Server

    Rubin, Stuart

    2016-01-01

    Information Reuse and Integration addresses the efficient extension and creation of knowledge through the exploitation of Kolmogorov complexity in the extraction and application of domain symmetry. Knowledge that seems to be novel can more often than not be recast as the image of a sequence of transformations which yield symmetric knowledge. When the size of those transformations and/or the length of that sequence of transforms exceeds the size of the image, that image is said to be novel or random. It may also be that the new knowledge is random in that no sequence of transforms producing it exists, or at least none is known. The nine chapters comprising this volume incorporate symmetry, reuse, and integration as overt operational procedures or as operations built into the formal representations of data and operators employed. Either way, the aforementioned theoretical underpinnings of information reuse and integration are supported.

  20. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    Science.gov (United States)

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
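
    The emulator-plus-Bayesian-inference workflow can be sketched compactly. Below, a Gaussian process is trained on a few evaluations of a toy stand-in for an expensive calculation, and a random-walk Metropolis chain samples the posterior of its single parameter; nothing here reproduces the paper's Skyrme-functional setup.

    ```python
    # A minimal sketch of the workflow described above: emulate an expensive
    # model with a Gaussian process, then sample the posterior of its
    # parameter with a random-walk Metropolis chain.  The "model", data and
    # prior are toy stand-ins, not the nuclear DFT calculations.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def expensive_model(theta):           # stand-in for a DFT calculation
        return np.sin(3 * theta) + 0.5 * theta

    # Train the emulator on a few "expensive" evaluations.
    theta_train = np.linspace(-2, 2, 12).reshape(-1, 1)
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5))
    gp.fit(theta_train, expensive_model(theta_train.ravel()))

    obs, sigma = 0.9, 0.1                 # one synthetic measurement

    def log_post(theta):
        if abs(theta) > 2:                # flat prior on [-2, 2]
            return -np.inf
        pred = gp.predict(np.array([[theta]]))[0]
        return -0.5 * ((obs - pred) / sigma) ** 2

    rng = np.random.default_rng(5)
    chain, theta = [], 0.0
    lp = log_post(theta)
    for _ in range(5_000):
        prop = theta + 0.2 * rng.standard_normal()
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
            theta, lp = prop, lp_prop
        chain.append(theta)

    posterior = np.array(chain[1000:])            # discard burn-in
    print("posterior mean:", posterior.mean(), "sd:", posterior.std())
    ```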