WorldWideScience

Sample records for quantification theory type

  1. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  2. A Study of Tongue and Pulse Diagnosis in Traditional Korean Medicine for Stroke Patients Based on Quantification Theory Type II

    Directory of Open Access Journals (Sweden)

    Mi Mi Ko

    2013-01-01

    In traditional Korean medicine (TKM), pattern identification (PI) diagnosis is important for treating diseases. The aim of this study was to comprehensively investigate the relationship between the PI type and tongue diagnosis or pulse diagnosis variables. The study included 1,879 stroke patients who were admitted to 12 oriental medical university hospitals from June 2006 through March 2009. The status of the pulse and tongue was examined in each patient. Additionally, to investigate relatively important indicators related to specialist PI, the quantification theory type II analysis was performed regarding the PI type. In the first axis quantification of the external criteria, the Qi-deficiency and the Yin-deficiency patterns were located in the negative direction, while the dampness-phlegm (DP) and fire-heat patterns were located in the positive direction. The explanatory variable with the greatest impact on the assessment was a fine pulse. In the second axis quantification, the external criteria were divided into either the DP or non-DP patterns. The slippery pulse exhibited the greatest effect on the division. This study attempted to build a model using a statistical method to objectively quantify PI and various indicators that constitute the unique diagnosis system of TKM. These results should assist the development of future diagnostic standards in stroke PI.
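
    The "first axis" and "second axis" quantifications above come from Hayashi's quantification theory type II, which in modern terms is essentially a discriminant analysis on dummy-coded categorical predictors. A minimal Python sketch of that idea (hypothetical records, not the study's tongue/pulse data):

```python
# Sketch: quantification theory type II viewed as linear discriminant
# analysis on dummy-coded categorical predictors (hypothetical records,
# not the study's tongue/pulse data).
import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

df = pd.DataFrame({
    "pulse":  ["fine", "slippery", "fine", "rapid", "slippery", "fine"],
    "tongue": ["pale", "red", "pale", "red", "pale", "red"],
    "PI":     ["Qi-deficiency", "dampness-phlegm", "Yin-deficiency",
               "fire-heat", "dampness-phlegm", "Yin-deficiency"],
})

X = pd.get_dummies(df[["pulse", "tongue"]])   # 0/1 indicator coding
lda = LinearDiscriminantAnalysis().fit(X, df["PI"])

# Scores on the discriminant axes play the role of the first- and
# second-axis quantifications of the external criteria.
print(lda.transform(X)[:, :2])
```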

  3. Quantification analysis of CT for aphasic patients

    International Nuclear Information System (INIS)

    Watanabe, Shunzo; Ooyama, Hiroshi; Hojo, Kei; Tasaki, Hiroichi; Hanazono, Toshihide; Sato, Tokijiro; Metoki, Hirobumi; Totsuka, Motokichi; Oosumi, Noboru.

    1987-01-01

    Using a microcomputer, the locus and extent of the lesions, as demonstrated by computed tomography, for 44 aphasic patients with various types of aphasia were superimposed onto standardized matrices, composed of 10 slices with 3000 points (50 by 60). The relationships between the foci of the lesions and types of aphasia were investigated on the slices numbered 3, 4, 5, and 6 using a quantification theory, Type 3 (pattern analysis). Some types of regularities were observed on Slices 3, 4, 5, and 6. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated on the 1st component and the 2nd component of the quantification theory, Type 3. On the other hand, the group with global aphasia existed between the group with Broca's aphasia and that with Wernicke's aphasia. The group of patients with amnestic aphasia had no specific findings, and the group with conduction aphasia existed near those with Wernicke's aphasia. The above results serve to establish the quantification theory, Type 2 (discrimination analysis) and the quantification theory, Type 1 (regression analysis). (author)

  4. Quantification analysis of CT for aphasic patients

    Energy Technology Data Exchange (ETDEWEB)

    Watanabe, S.; Ooyama, H.; Hojo, K.; Tasaki, H.; Hanazono, T.; Sato, T.; Metoki, H.; Totsuka, M.; Oosumi, N.

    1987-02-01

    Using a microcomputer, the locus and extent of the lesions, as demonstrated by computed tomography, for 44 aphasic patients with various types of aphasia were superimposed onto standardized matrices, composed of 10 slices with 3000 points (50 by 60). The relationships between the foci of the lesions and types of aphasia were investigated on the slices numbered 3, 4, 5, and 6 using a quantification theory, Type 3 (pattern analysis). Some types of regularities were observed on slices 3, 4, 5, and 6. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated on the 1st component and the 2nd component of the quantification theory, Type 3. On the other hand, the group with global aphasia existed between the group with Broca's aphasia and that with Wernicke's aphasia. The group of patients with amnestic aphasia had no specific findings, and the group with conduction aphasia existed near those with Wernicke's aphasia. The above results serve to establish the quantification theory, Type 2 (discrimination analysis) and the quantification theory, Type 1 (regression analysis).
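
    Quantification theory, Type 3 ("pattern analysis") is closely related to correspondence analysis, or dual scaling, of an incidence matrix — here patients by lesion points. A minimal sketch under that interpretation, on a synthetic 0/1 matrix rather than the actual CT data:

```python
# Sketch: quantification theory type III as dual scaling (correspondence
# analysis) of a patients-by-lesion-points 0/1 incidence matrix.
# Synthetic data; the study used 50 x 60 point matrices per CT slice.
import numpy as np

rng = np.random.default_rng(0)
F = rng.integers(0, 2, size=(44, 60)).astype(float)  # 44 patients x 60 points

P = F / F.sum()                       # correspondence matrix
r, c = P.sum(axis=1), P.sum(axis=0)   # row/column masses
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))

U, sv, Vt = np.linalg.svd(S, full_matrices=False)
# Patient scores on the 1st and 2nd components, the axes on which the
# Broca/Wernicke groups would separate.
row_scores = (U[:, :2] * sv[:2]) / np.sqrt(r)[:, None]
print(row_scores[:5])
```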

  5. Application of the third theory of quantification in coal and gas outburst forecast

    Energy Technology Data Exchange (ETDEWEB)

    Wu, C.; Qin, Y.; Zhang, X. [China University of Mining and Technology, Xuzhou (China). School of Resource and Geoscience Engineering]

    2004-12-01

    The essential principles of the third theory of quantification are discussed. The concept and calculation method of reaction degree are put forward, which extend the range of application and scientific rigour of the primary reaction. Taking the Zhongmacun mine as an example, and on the basis of a synthetic analysis of the rules of gas geology and a survey of the geological factors affecting coal and gas outburst, the paper adopts the method of combining statistical units with the third theory of quantification, screens out 8 sensitive geological factors from 11 geological indexes, and carries out gas geology regionalization of the exploited area of Zhongmacun according to the research results. The practice shows that it is feasible to apply the third theory of quantification to gas geology, which offers a new approach to screening the sensitive geological factors for gas outburst forecasting. 3 refs., 3 figs., 3 tabs.

  6. Quantification of margins and mixed uncertainties using evidence theory and stochastic expansions

    International Nuclear Information System (INIS)

    Shah, Harsheel; Hosder, Serhat; Winter, Tyler

    2015-01-01

    The objective of this paper is to implement Dempster–Shafer Theory of Evidence (DSTE) in the presence of mixed (aleatory and multiple sources of epistemic) uncertainty to the reliability and performance assessment of complex engineering systems through the use of quantification of margins and uncertainties (QMU) methodology. This study focuses on quantifying the simulation uncertainties, both in the design condition and the performance boundaries along with the determination of margins. To address the possibility of multiple sources and intervals for epistemic uncertainty characterization, DSTE is used for uncertainty quantification. An approach to incorporate aleatory uncertainty in Dempster–Shafer structures is presented by discretizing the aleatory variable distributions into sets of intervals. In view of excessive computational costs for large scale applications and repetitive simulations needed for DSTE analysis, a stochastic response surface based on point-collocation non-intrusive polynomial chaos (NIPC) has been implemented as the surrogate for the model response. The technique is demonstrated on a model problem with non-linear analytical functions representing the outputs and performance boundaries of two coupled systems. Finally, the QMU approach is demonstrated on a multi-disciplinary analysis of a high speed civil transport (HSCT). - Highlights: • Quantification of margins and uncertainties (QMU) methodology with evidence theory. • Treatment of both inherent and epistemic uncertainties within evidence theory. • Stochastic expansions for representation of performance metrics and boundaries. • Demonstration of QMU on an analytical problem. • QMU analysis applied to an aerospace system (high speed civil transport)
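
    The Dempster–Shafer bookkeeping underlying DSTE assigns basic probability masses to interval focal elements; belief and plausibility of a query interval then bracket the probability of a response threshold. A small illustrative sketch (made-up focal elements and masses, not the paper's HSCT model):

```python
# Sketch: belief/plausibility bounds from a Dempster-Shafer structure of
# interval focal elements (illustrative masses, not the HSCT model).
focal = [((0.0, 2.0), 0.3), ((1.5, 3.0), 0.5), ((2.5, 4.0), 0.2)]

def belief(lo, hi):
    # Mass of focal elements entirely inside [lo, hi].
    return sum(m for (a, b), m in focal if lo <= a and b <= hi)

def plausibility(lo, hi):
    # Mass of focal elements intersecting [lo, hi].
    return sum(m for (a, b), m in focal if b >= lo and a <= hi)

# Bounds on P(response <= 2.0): belief <= P <= plausibility.
print(belief(float("-inf"), 2.0), plausibility(float("-inf"), 2.0))
```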

  7. Guarded Cubical Type Theory

    DEFF Research Database (Denmark)

    Birkedal, Lars; Bizjak, Aleš; Clouston, Ranald

    2016-01-01

    This paper improves the treatment of equality in guarded dependent type theory (GDTT), by combining it with cubical type theory (CTT). GDTT is an extensional type theory with guarded recursive types, which are useful for building models of program logics, and for programming and reasoning...... with coinductive types. We wish to implement GDTT with decidable type-checking, while still supporting non-trivial equality proofs that reason about the extensions of guarded recursive constructions. CTT is a variation of Martin-Löf type theory in which the identity type is replaced by abstract paths between...... terms. CTT provides a computational interpretation of functional extensionality, is conjectured to have decidable type checking, and has an implemented type-checker. Our new type theory, called guarded cubical type theory, provides a computational interpretation of extensionality for guarded recursive...

  8. Guarded Cubical Type Theory

    DEFF Research Database (Denmark)

    Birkedal, Lars; Bizjak, Aleš; Clouston, Ranald

    2016-01-01

    This paper improves the treatment of equality in guarded dependent type theory (GDTT), by combining it with cubical type theory (CTT). GDTT is an extensional type theory with guarded recursive types, which are useful for building models of program logics, and for programming and reasoning...... with coinductive types. We wish to implement GDTT with decidable type checking, while still supporting non-trivial equality proofs that reason about the extensions of guarded recursive constructions. CTT is a variation of Martin-Löf type theory in which the identity type is replaced by abstract paths between...... terms. CTT provides a computational interpretation of functional extensionality, enjoys canonicity for the natural numbers type, and is conjectured to support decidable type-checking. Our new type theory, guarded cubical type theory (GCTT), provides a computational interpretation of extensionality...

  9. Standard Error Computations for Uncertainty Quantification in Inverse Problems: Asymptotic Theory vs. Bootstrapping.

    Science.gov (United States)

    Banks, H T; Holm, Kathleen; Robbins, Danielle

    2010-11-01

    We computationally investigate two approaches for uncertainty quantification in inverse problems for nonlinear parameter-dependent dynamical systems. We compare the bootstrapping and asymptotic theory approaches for problems involving data with several noise forms and levels. We consider both constant-variance absolute error data and relative error data, which produce non-constant variance, in our parameter estimation formulations. We compare and contrast parameter estimates, standard errors, confidence intervals, and computational times for both bootstrapping and asymptotic theory methods.
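
    Schematically, the two approaches compared above look as follows for a generic nonlinear least-squares fit: asymptotic theory reads standard errors off the Jacobian-based covariance, while the bootstrap refits on resampled residuals. A toy sketch (simple exponential decay, not the paper's dynamical systems):

```python
# Sketch: asymptotic vs. bootstrap standard errors for a nonlinear
# least-squares fit (toy exponential decay, not the paper's systems).
import numpy as np
from scipy.optimize import curve_fit

def model(t, a, k):
    return a * np.exp(-k * t)

rng = np.random.default_rng(1)
t = np.linspace(0, 5, 50)
y = model(t, 2.0, 0.7) + rng.normal(0, 0.05, t.size)

# Asymptotic theory: curve_fit's covariance is sigma^2 (J^T J)^{-1}.
theta, cov = curve_fit(model, t, y, p0=(1.0, 1.0))
se_asym = np.sqrt(np.diag(cov))

# Bootstrap: resample residuals, refit, take the spread of estimates.
resid = y - model(t, *theta)
boot = []
for _ in range(500):
    y_b = model(t, *theta) + rng.choice(resid, size=resid.size, replace=True)
    th_b, _ = curve_fit(model, t, y_b, p0=theta)
    boot.append(th_b)
se_boot = np.std(boot, axis=0)
print(se_asym, se_boot)
```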

  10. Guarded dependent type theory with coinductive types

    DEFF Research Database (Denmark)

    Bizjak, Aleš; Grathwohl, Hans Bugge; Clouston, Ranald

    2016-01-01

    We present guarded dependent type theory, gDTT, an extensional dependent type theory with a 'later' modality and clock quantifiers for programming and proving with guarded recursive and coinductive types. The later modality is used to ensure the productivity of recursive definitions in a modular......, type based, way. Clock quantifiers are used for controlled elimination of the later modality and for encoding coinductive types using guarded recursive types. Key to the development of gDTT are novel type and term formers involving what we call 'delayed substitutions'. These generalise the applicative...... functor rules for the later modality considered in earlier work, and are crucial for programming and proving with dependent types. We show soundness of the type theory with respect to a denotational model....

  11. Stack semantics of type theory

    DEFF Research Database (Denmark)

    Coquand, Thierry; Mannaa, Bassel; Ruch, Fabian

    2017-01-01

    We give a model of dependent type theory with one univalent universe and propositional truncation, interpreting a type as a stack, generalizing the groupoid model of type theory. As an application, we show that countable choice cannot be proved in dependent type theory with one univalent universe...

  12. Linear contextual modal type theory

    DEFF Research Database (Denmark)

    Schack-Nielsen, Anders; Schürmann, Carsten

    Abstract. When one implements a logical framework based on linear type theory, for example the Celf system [?], one is immediately confronted with questions about their equational theory and how to deal with logic variables. In this paper, we propose linear contextual modal type theory that gives...... a mathematical account of the nature of logic variables. Our type theory is conservative over intuitionistic contextual modal type theory proposed by Nanevski, Pfenning, and Pientka. Our main contributions include a mechanically checked proof of soundness and a working implementation....

  13. Treatise on intuitionistic type theory

    CERN Document Server

    Granström, Johan Georg

    2011-01-01

    Intuitionistic type theory can be described, somewhat boldly, as a fulfillment of the dream of a universal language for science.  In particular, intuitionistic type theory is a foundation for mathematics and a programming language.

  14. Modalities in homotopy type theory

    DEFF Research Database (Denmark)

    Rijke, Egbert; Shulman, Michael; Spitters, Bas

    2017-01-01

    Univalent homotopy type theory (HoTT) may be seen as a language for the category of ∞-groupoids. It is being developed as a new foundation for mathematics and as an internal language for (elementary) higher toposes. We develop the theory of factorization systems, reflective subuniverses......, and modalities in homotopy type theory, including their construction using a "localization" higher inductive type. This produces in particular the (n-connected, n-truncated) factorization system as well as internal presentations of subtoposes, through lex modalities. We also develop the semantics...

  15. Orbifolds of M-theory and type II string theories in two dimensions

    International Nuclear Information System (INIS)

    Roy, S.

    1997-01-01

    We consider several orbifold compactifications of M-theory and their corresponding type II duals in two space-time dimensions. In particular, we show that while the orbifold compactification of M-theory on T^9/J_9 is dual to the orbifold compactification of type IIB string theory on T^8/I_8, the same orbifold T^8/I_8 of type IIA string theory is dual to M-theory compactified on a smooth product manifold K3 x T^5. Similarly, while the orbifold compactification of M-theory on (K3 x T^5)/σ·J_5 is dual to the orbifold compactification of type IIB string theory on (K3 x T^4)/σ·I_4, the same orbifold of type IIA string theory is dual to the orbifold T^4 x (K3 x S^1)/σ·J_1 of M-theory. The spectra of various orbifold compactifications of M-theory and type II string theories on both sides are compared, giving evidence in favor of these duality conjectures. We also comment on a connection between the Dasgupta-Mukhi-Witten conjecture and the Dabholkar-Park-Sen conjecture for the six-dimensional orbifold models of type IIB string theory and M-theory. (orig.)

  16. Aerosol-type retrieval and uncertainty quantification from OMI data

    Science.gov (United States)

    Kauppi, Anu; Kolmonen, Pekka; Laine, Marko; Tamminen, Johanna

    2017-11-01

    We discuss uncertainty quantification for aerosol-type selection in satellite-based atmospheric aerosol retrieval. The retrieval procedure uses precalculated aerosol microphysical models stored in look-up tables (LUTs) and top-of-atmosphere (TOA) spectral reflectance measurements to solve the aerosol characteristics. The forward model approximations cause systematic differences between the modelled and observed reflectance. Acknowledging this model discrepancy as a source of uncertainty allows us to produce more realistic uncertainty estimates and assists the selection of the most appropriate LUTs for each individual retrieval. This paper focuses on the aerosol microphysical model selection and characterisation of uncertainty in the retrieved aerosol type and aerosol optical depth (AOD). The concept of model evidence is used as a tool for model comparison. The method is based on a Bayesian inference approach, in which all uncertainties are described as a posterior probability distribution. When there is no single best-matching aerosol microphysical model, we use a statistical technique based on Bayesian model averaging to combine AOD posterior probability densities of the best-fitting models to obtain an averaged AOD estimate. We also determine the shared evidence of the best-matching models of a certain main aerosol type in order to quantify how plausible it is that it represents the underlying atmospheric aerosol conditions. The developed method is applied to Ozone Monitoring Instrument (OMI) measurements using a multiwavelength approach for retrieving the aerosol type and AOD estimate with uncertainty quantification for cloud-free over-land pixels. Several larger pixel set areas were studied in order to investigate the robustness of the developed method. We evaluated the retrieved AOD by comparison with ground-based measurements at example sites. We found that the uncertainty of AOD expressed by posterior probability distribution reflects the difficulty in model

  17. Aerosol-type retrieval and uncertainty quantification from OMI data

    Directory of Open Access Journals (Sweden)

    A. Kauppi

    2017-11-01

    We discuss uncertainty quantification for aerosol-type selection in satellite-based atmospheric aerosol retrieval. The retrieval procedure uses precalculated aerosol microphysical models stored in look-up tables (LUTs) and top-of-atmosphere (TOA) spectral reflectance measurements to solve the aerosol characteristics. The forward model approximations cause systematic differences between the modelled and observed reflectance. Acknowledging this model discrepancy as a source of uncertainty allows us to produce more realistic uncertainty estimates and assists the selection of the most appropriate LUTs for each individual retrieval. This paper focuses on the aerosol microphysical model selection and characterisation of uncertainty in the retrieved aerosol type and aerosol optical depth (AOD). The concept of model evidence is used as a tool for model comparison. The method is based on a Bayesian inference approach, in which all uncertainties are described as a posterior probability distribution. When there is no single best-matching aerosol microphysical model, we use a statistical technique based on Bayesian model averaging to combine AOD posterior probability densities of the best-fitting models to obtain an averaged AOD estimate. We also determine the shared evidence of the best-matching models of a certain main aerosol type in order to quantify how plausible it is that it represents the underlying atmospheric aerosol conditions. The developed method is applied to Ozone Monitoring Instrument (OMI) measurements using a multiwavelength approach for retrieving the aerosol type and AOD estimate with uncertainty quantification for cloud-free over-land pixels. Several larger pixel set areas were studied in order to investigate the robustness of the developed method. We evaluated the retrieved AOD by comparison with ground-based measurements at example sites. We found that the uncertainty of AOD expressed by posterior probability distribution reflects the
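
    The Bayesian model averaging step described above can be sketched generically: per-model evidences become weights, and the weighted per-model AOD posteriors combine into one averaged density. Illustrative Gaussians and evidence values stand in for the actual OMI LUT posteriors:

```python
# Sketch: Bayesian model averaging over aerosol microphysical models
# (illustrative Gaussian AOD posteriors and evidences, not OMI LUTs).
import numpy as np
from scipy.stats import norm

models = {
    "weakly_absorbing": {"evidence": 0.8e-3, "post": norm(0.30, 0.04)},
    "biomass_burning":  {"evidence": 1.1e-3, "post": norm(0.38, 0.06)},
    "desert_dust":      {"evidence": 0.2e-3, "post": norm(0.55, 0.10)},
}

z = sum(m["evidence"] for m in models.values())
weights = {k: m["evidence"] / z for k, m in models.items()}

aod = np.linspace(0.0, 1.0, 501)
bma_pdf = sum(w * models[k]["post"].pdf(aod) for k, w in weights.items())

da = aod[1] - aod[0]
print("BMA mean AOD:", (aod * bma_pdf).sum() * da)
print("model weights (shared evidence):", weights)
```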

  18. Denotational semantics for guarded dependent type theory

    DEFF Research Database (Denmark)

    Bizjak, Aleš; Møgelberg, Rasmus Ejlers

    2018-01-01

    We present a new model of Guarded Dependent Type Theory (GDTT), a type theory with guarded recursion and multiple clocks in which one can program with, and reason about coinductive types. Productivity of recursively defined coinductive programs and proofs is encoded in types using guarded recursion......, crucial for programming with coinductive types, types must be interpreted as presheaves orthogonal to the object of clocks. In the case of dependent types, this translates to a unique lifting condition similar to the one found in homotopy theoretic models of type theory. Since the universes defined...... by inclusions of clock variable contexts commute on the nose with type operations on the universes....

  19. A Realizability Model for Impredicative Hoare Type Theory

    DEFF Research Database (Denmark)

    Petersen, Rasmus Lerchedal; Birkedal, Lars; Nanevski, Alexandar

    2008-01-01

    We present a denotational model of impredicative Hoare Type Theory, a very expressive dependent type theory in which one can specify and reason about mutable abstract data types. The model ensures soundness of the extension of Hoare Type Theory with impredicative polymorphism; makes the connections...... to separation logic clear, and provides a basis for investigation of further sound extensions of the theory, in particular equations between computations and types....

  20. Aspects of type 0 string theory

    CERN Document Server

    Blumenhagen, R; Kumar, A; Lüst, Dieter

    2000-01-01

    A construction of compact tachyon-free orientifolds of the non-supersymmetric Type 0B string theory is presented. Moreover, we study effective non-supersymmetric gauge theories arising on self-dual D3-branes in Type 0B orbifolds and orientifolds.

  1. The Independence of Markov's Principle in Type Theory

    DEFF Research Database (Denmark)

    Coquand, Thierry; Mannaa, Bassel

    2017-01-01

    In this paper, we show that Markov's principle is not derivable in dependent type theory with natural numbers and one universe. One way to prove this would be to remark that Markov's principle does not hold in a sheaf model of type theory over Cantor space, since Markov's principle does not hold for the generic point of this model. Instead we design an extension of type theory, which intuitively extends type theory by the addition of a generic point of Cantor space. We then show the consistency of this extension by a normalization argument. Markov's principle does not hold in this extension, and it follows that it cannot be proved in type theory.

  2. A computable type theory for control systems

    NARCIS (Netherlands)

    P.J. Collins (Pieter); L. Guo; J. Baillieul

    2009-01-01

    In this paper, we develop a theory of computable types suitable for the study of control systems. The theory uses type-two effectivity as the underlying computational model, but we quickly develop a type system which can be manipulated abstractly, but for which all allowable operations

  3. Activity Theory as a Theoretical Framework for Health Self-Quantification: A Systematic Review of Empirical Studies.

    Science.gov (United States)

    Almalki, Manal; Gray, Kathleen; Martin-Sanchez, Fernando

    2016-05-27

    Self-quantification (SQ) is a way of working in which, by using tracking tools, people aim to collect, manage, and reflect on personal health data to gain a better understanding of their own body, health behavior, and interaction with the world around them. However, health SQ lacks a formal framework for describing the self-quantifiers' activities and their contextual components or constructs to pursue these health-related goals. Establishing such a framework is important because it is the first step to operationalize health SQ fully. This may in turn help to achieve the aims of health professionals and researchers who seek to make or study changes in the self-quantifiers' health systematically. The aim of this study was to review studies on health SQ in order to answer the following questions: What are the general features of the work and the particular activities that self-quantifiers perform to achieve their health objectives? What constructs of health SQ have been identified in the scientific literature? How have these studies described such constructs? How would it be possible to model these constructs theoretically to characterize the work of health SQ? A systematic review of peer-reviewed literature was conducted. A total of 26 empirical studies were included. The content of these studies was thematically analyzed using Activity Theory as an organizing framework. The literature provided varying descriptions of health SQ as data-driven and objective-oriented work mediated by SQ tools. From the literature, we identified two types of SQ work: work on data (ie, data management activities) and work with data (ie, health management activities). Using Activity Theory, these activities could be characterized into 6 constructs: users, tracking tools, health objectives, division of work, community or group setting, and SQ plan and rules. We could not find a reference to any single study that accounted for all these activities and constructs of health SQ activity.

  4. Quantification of fibre type regionalisation: an analysis of lower hindlimb muscles in the rat

    NARCIS (Netherlands)

    Wang, LC; Kernell, D

    Newly developed concepts and methods for the quantification of fibre type regionalisation were used for comparison between all muscles traversing the ankle of the rat lower hindlimb (n = 13). For each muscle, cross-sections from the proximodistal midlevel were stained for myofibrillar ATPase and

  5. Type classes for mathematics in type theory

    OpenAIRE

    Spitters, Bas; Van der Weegen, Eelis

    2011-01-01

    The introduction of first-class type classes in the Coq system calls for re-examination of the basic interfaces used for mathematical formalization in type theory. We present a new set of type classes for mathematics and take full advantage of their unique features to make practical a particularly flexible approach formerly thought infeasible. Thus, we address both traditional proof engineering challenges as well as new ones resulting from our ambition to build upon this development a library...

  6. Fixed point theory in metric type spaces

    CERN Document Server

    Agarwal, Ravi P; O’Regan, Donal; Roldán-López-de-Hierro, Antonio Francisco

    2015-01-01

    Written by a team of leading experts in the field, this volume presents a self-contained account of the theory, techniques and results in metric type spaces (in particular in G-metric spaces); that is, the text approaches this important area of fixed point analysis beginning from the basic ideas of metric space topology. The text is structured so that it leads the reader from preliminaries and historical notes on metric spaces (in particular G-metric spaces) and on mappings, to Banach type contraction theorems in metric type spaces, fixed point theory in partially ordered G-metric spaces, fixed point theory for expansive mappings in metric type spaces, generalizations, present results and techniques in a very general abstract setting and framework. Fixed point theory is one of the major research areas in nonlinear analysis. This is partly due to the fact that in many real world problems fixed point theory is the basic mathematical tool used to establish the existence of solutions to problems which arise natur...

  7. Verifying design patterns in Hoare Type Theory

    DEFF Research Database (Denmark)

    Svendsen, Kasper; Buisse, Alexandre; Birkedal, Lars

    In this technical report we document our experiments formally verifying three design patterns in Hoare Type Theory.

  8. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.

  9. Massive deformations of Type IIA theory within double field theory

    Science.gov (United States)

    Çatal-Özer, Aybike

    2018-02-01

    We obtain massive deformations of Type IIA supergravity theory through duality twisted reductions of Double Field Theory (DFT) of massless Type II strings. The mass deformation is induced through the reduction of the DFT of the RR sector. Such reductions are determined by a twist element belonging to Spin^+(10,10), which is the duality group of the DFT of the RR sector. We determine the form of the twists and give particular examples of twist matrices, for which a massive deformation of Type IIA theory can be obtained. In one of the cases, requirement of gauge invariance of the RR sector implies that the dilaton field must pick up a linear dependence on one of the dual coordinates. In another case, the choice of the twist matrix violates the weak and the strong constraints explicitly in the internal doubled space.

  10. Internal Universes in Models of Homotopy Type Theory

    DEFF Research Database (Denmark)

    Licata, Daniel R.; Orton, Ian; Pitts, Andrew M.

    2018-01-01

    We show that universes of fibrations in various models of homotopy type theory have an essentially global character: they cannot be described in the internal language of the presheaf topos from which the model is constructed. We get around this problem by extending the internal language with a mo...... that the interval in cubical sets does indeed have. This leads to a completely internal development of models of homotopy type theory within what we call crisp type theory.

  11. Hoare type theory, polymorphism and separation

    DEFF Research Database (Denmark)

    Nanevski, Alexandar; Morrisett, J. Gregory; Birkedal, Lars

    2008-01-01

    We consider the problem of reconciling a dependently typed functional language with imperative features such as mutable higher-order state, pointer aliasing, and nontermination. We propose Hoare type theory (HTT), which incorporates Hoare-style specifications into types, making it possible to sta...

  12. Applicability of Information Theory to the Quantification of Responses to Anthropogenic Noise by Southeast Alaskan Humpback Whales

    Directory of Open Access Journals (Sweden)

    J. Ellen Blue

    2008-05-01

    We assess the effectiveness of applying information theory to the characterization and quantification of the effects of anthropogenic vessel noise on humpback whale (Megaptera novaeangliae) vocal behavior in and around Glacier Bay, Alaska. Vessel noise has the potential to interfere with the complex vocal behavior of these humpback whales, which could have direct consequences on their feeding behavior and thus ultimately on their health and reproduction. Humpback whale feeding calls recorded during conditions of high vessel-generated noise and lower levels of background noise are compared for differences in acoustic structure, use, and organization using information theoretic measures. We apply information theory in a self-referential manner (i.e., orders of entropy) to quantify the changes in signaling behavior. We then compare this with the reduction in channel capacity due to noise in Glacier Bay itself, treating it as a (Gaussian) noisy channel. We find that high vessel noise is associated with an increase in the rate and repetitiveness of sequential use of feeding call types in our averaged sample of humpback whale vocalizations, indicating that vessel noise may be modifying the patterns of use of feeding calls by the endangered humpback whales in Southeast Alaska. The information theoretic approach suggested herein can provide a reliable quantitative measure of such relationships and may also be adapted for wider application to many species where environmental noise is thought to be a problem.
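
    The "orders of entropy" referred to above are estimable directly from n-gram frequencies of the call-type sequence: first-order entropy from call frequencies, and second-order entropy as the conditional entropy given the preceding call. A toy sketch (made-up sequence, not the Glacier Bay recordings):

```python
# Sketch: first- and second-order entropy estimates of a call-type
# sequence from n-gram frequencies (toy sequence, not the recordings).
import math
from collections import Counter

calls = list("ABABABCABABABCAAB")  # hypothetical feeding-call sequence

def entropy(counts):
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

h1 = entropy(Counter(calls))  # first order: call-type frequencies

# Second order: conditional entropy H(X_t | X_{t-1}) = H(pairs) - H(singles).
pairs = Counter(zip(calls, calls[1:]))
h2 = entropy(pairs) - entropy(Counter(calls[:-1]))
print(f"H1 = {h1:.3f} bits, H2 = {h2:.3f} bits/symbol")
```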

  13. Revised theory of Pierce-type electron guns

    International Nuclear Information System (INIS)

    Sar-El, H.Z.

    1982-01-01

    Attempts to date to obtain the shape of the beam-forming electrodes of various Pierce-type electron guns are briefly discussed, with emphasis on the many discrepancies in the results of previous works. A revised theory of Pierce-type electron guns is proposed. The shapes of the beam-forming electrodes for all known configurations of Pierce guns were computed on the basis of the proposed theory. (orig.)

  14. Heterotic/Type-II duality and its field theory avatars

    International Nuclear Information System (INIS)

    Kiritsis, Elias

    1999-01-01

    In these lecture notes, I will describe heterotic/type-II duality in six and four dimensions. When supersymmetry is the maximal N=4 it will be shown that the duality reduces in the field theory limit to the Montonen-Olive duality of N=4 Super Yang-Mills theory. We will consider further compactifications of type II theory on Calabi-Yau manifolds. We will understand the physical meaning of geometric conifold singularities and the dynamics of conifold transitions. When the CY manifold is a K3 fibration we will argue that the type-II ground-state is dual to the heterotic theory compactified on K3 x T^2. This allows an exact computation of the low-energy effective action. Taking the field theory limit, α' → 0, we will recover the Seiberg-Witten non-perturbative solution of N=2 gauge theory

  15. Bell-type quantum field theories

    International Nuclear Information System (INIS)

    Duerr, Detlef; Goldstein, Sheldon; Tumulka, Roderich; Zanghi, Nino

    2005-01-01

    In his paper (1986 Beables for quantum field theory Phys. Rep. 137 49-54) John S Bell proposed how to associate particle trajectories with a lattice quantum field theory, yielding what can be regarded as a |Ψ|^2-distributed Markov process on the appropriate configuration space. A similar process can be defined in the continuum, for more or less any regularized quantum field theory; we call such processes Bell-type quantum field theories. We describe methods for explicitly constructing these processes. These concern, in addition to the definition of the Markov processes, the efficient calculation of jump rates, how to obtain the process from the processes corresponding to the free and interaction Hamiltonian alone, and how to obtain the free process from the free Hamiltonian or, alternatively, from the one-particle process by a construction analogous to 'second quantization'. As an example, we consider the process for a second quantized Dirac field in an external electromagnetic field. (topical review)

  16. Multi-level Contextual Type Theory

    Directory of Open Access Journals (Sweden)

    Mathieu Boespflug

    2011-10-01

    Contextual type theory distinguishes between bound variables and meta-variables to write potentially incomplete terms in the presence of binders. It has found good use as a framework for concise explanations of higher-order unification, for characterizing holes in proofs, and for developing a foundation for programming with higher-order abstract syntax, as embodied by the programming and reasoning environment Beluga. However, to reason about these applications, we need to introduce meta^2-variables to characterize the dependency on meta-variables and bound variables. In other words, we must go beyond a two-level system granting only bound variables and meta-variables. In this paper we generalize contextual type theory to n levels for arbitrary n, so as to obtain a formal system offering bound variables, meta-variables and so on all the way to meta^n-variables. We obtain a uniform account by collapsing all these different kinds of variables into a single notion of variable indexed by some level k. We give a decidable bi-directional type system which characterizes beta-eta-normal forms together with a generalized substitution operation.

  17. Exotic dual of type II double field theory

    Directory of Open Access Journals (Sweden)

    Eric A. Bergshoeff

    2017-04-01

    We perform an exotic dualization of the Ramond–Ramond fields in type II double field theory, in which they are encoded in a Majorana–Weyl spinor of O(D,D). Starting from a first-order master action, the dual theory in terms of a tensor–spinor of O(D,D) is determined. This tensor–spinor is subject to an exotic version of the (self-)duality constraint needed for a democratic formulation. We show that in components, reducing O(D,D) to GL(D), one obtains the expected exotically dual theory in terms of mixed Young tableaux fields. To this end, we generalize exotic dualizations to self-dual fields, such as the 4-form in type IIB string theory.

  18. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    Science.gov (United States)

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  19. Predominant Lactobacillus species types of vaginal microbiota in pregnant Korean women: quantification of the five Lactobacillus species and two anaerobes.

    Science.gov (United States)

    Kim, Jeong Hyun; Yoo, Seung Min; Sohn, Yong Hak; Jin, Chan Hee; Yang, Yun Suk; Hwang, In Taek; Oh, Kwan Young

    2017-10-01

    To investigate the predominant Lactobacillus species types (LSTs) of vaginal microbiota in pregnant Korean women by quantifying five Lactobacillus species and two anaerobes. In all, 168 pregnant Korean women under antenatal care at Eulji University Hospital and local clinics were enrolled in the prospective cohort study during pregnancy (10-14 weeks). Vaginal samples were collected with Eswab for quantitative polymerase chain reaction (qPCR) and stored in a -80 °C freezer. qPCR was performed for five Lactobacillus species and two anaerobes. To identify the predominant LSTs, quantifications were analyzed by the Cluster and Tree View programs of Eisen Lab. The quantifications were also compared among the classified groups. L. crispatus and L. iners were most commonly found in pregnant Korean women, followed by L. gasseri and L. jensenii; L. vaginalis was nearly absent. Five types (four predominant LSTs and one predominant anaerobe type without predominant Lactobacillus species) were classified. Five predominant LSTs were identified in vaginal microbiota of pregnant Korean women. L. crispatus and L. iners predominant types comprised a large proportion.

  20. Development of Primer-Probe Energy Transfer real-time PCR for the detection and quantification of porcine circovirus type 2

    DEFF Research Database (Denmark)

    Balint, Adam; Tenk, Miklós; Deim, Zoltán

    2009-01-01

    A real-time PCR assay, based on Primer-Probe Energy Transfer (PriProET), was developed to improve the detection and quantification of porcine circovirus type 2 (PCV2). PCV2 is recognised as the essential infectious agent in post-weaning multisystemic wasting syndrome (PMWS) and has been associated...... in different organs. The data obtained in this study correlate with those described earlier; namely, the viral load in 1 ml plasma and in 500 ng tissue DNA exceeds 10^7 copies in the case of PMWS. The results indicate that the new assay provides a specific, sensitive and robust tool for the improved detection...... and quantification of PCV2....
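
    Absolute quantification in real-time PCR assays of this kind is conventionally done through a standard curve: Ct is linear in log10 of the copy number over serial dilutions. A generic sketch with made-up Ct values (not the PriProET calibration data):

```python
# Sketch: absolute qPCR quantification via a standard curve
# (hypothetical Ct values, not the PriProET assay's calibration data).
import numpy as np

# Serial 10-fold dilutions of a standard: copy number vs. measured Ct.
copies = np.array([1e7, 1e6, 1e5, 1e4, 1e3])
ct = np.array([14.8, 18.2, 21.5, 24.9, 28.3])

# Ct is linear in log10(copies): Ct = m*log10(N) + b.
m, b = np.polyfit(np.log10(copies), ct, 1)
efficiency = 10 ** (-1 / m) - 1   # ~1.0 means 100% amplification efficiency

def copies_from_ct(ct_sample):
    return 10 ** ((ct_sample - b) / m)

print(f"efficiency = {efficiency:.2f}")
print(f"sample at Ct 22.7 -> {copies_from_ct(22.7):.2e} copies")
```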

  1. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    Energy Technology Data Exchange (ETDEWEB)

    McDonnell, J. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]; Schunck, N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]; Higdon, D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]; Sarich, J. [Argonne National Lab. (ANL), Argonne, IL (United States)]; Wild, S. M. [Argonne National Lab. (ANL), Argonne, IL (United States)]; Nazarewicz, W. [Michigan State Univ., East Lansing, MI (United States); Oak Ridge National Lab., Oak Ridge, TN (United States); Univ. of Warsaw, Warsaw (Poland)]

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
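
    The propagation step described in this record can be sketched generically: train a Gaussian process emulator on a few runs of the expensive model, then push posterior parameter draws through it. A 1-D toy stand-in (not the Skyrme functional or the actual posterior):

```python
# Sketch: propagating parameter uncertainty through a Gaussian-process
# emulator (1-D toy model, not a nuclear density functional).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

expensive_model = lambda p: np.sin(3 * p) + 0.5 * p   # stand-in simulator

# Train the emulator on a handful of design points.
P_train = np.linspace(-1, 1, 12)[:, None]
y_train = expensive_model(P_train[:, 0])
gp = GaussianProcessRegressor(kernel=RBF(0.5), alpha=1e-6).fit(P_train, y_train)

# Hypothetical posterior over the parameter (e.g. from calibration).
rng = np.random.default_rng(5)
draws = rng.normal(0.2, 0.15, 2000)[:, None]

# Emulator prediction at each draw, plus the emulator's own uncertainty.
mean, std = gp.predict(draws, return_std=True)
pred = mean + rng.normal(0.0, std)
print(f"predicted observable: {pred.mean():.3f} +/- {pred.std():.3f}")
```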

  2. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory]; Booker, Jane M [Los Alamos National Laboratory]; Hemez, Francois M [Los Alamos National Laboratory]; Salazar, Issac F [Los Alamos National Laboratory]; Ross, Timothy J [UNM]

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore, PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words, 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  3. Terahertz identification and quantification of penicillamine enantiomers

    International Nuclear Information System (INIS)

    Ji Te; Zhao Hongwei; Chen Min; Xiao Tiqiao; Han Pengyu

    2013-01-01

    Identification and characterization of L-, D- and DL-penicillamine were demonstrated by terahertz time-domain spectroscopy (THz-TDS). To understand the physical origins of the low frequency resonant modes, the density functional theory (DFT) was adopted for theoretical calculation. It was found that the collective THz frequency motions were determined by the intramolecular and intermolecular hydrogen bond interactions. Moreover, the quantification of a penicillamine enantiomer mixture was demonstrated by a THz spectra fitting method with a relative error of less than 3.5%. This technique can be a valuable tool for the discrimination and quantification of chiral drugs in the pharmaceutical industry. (authors)
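
    A spectra-fitting quantification of an enantiomer mixture can be framed as non-negative least squares against the pure L- and D-reference spectra; whether this matches the authors' exact fitting method is not stated, so the sketch below is only a generic illustration on synthetic spectra:

```python
# Sketch: enantiomer-ratio quantification by non-negative least squares
# against pure-compound reference spectra (synthetic Gaussian "peaks",
# not measured THz data).
import numpy as np
from scipy.optimize import nnls

freq = np.linspace(0.2, 2.5, 300)                    # THz
peak = lambda f0, w: np.exp(-((freq - f0) / w) ** 2)

spec_L = peak(0.9, 0.08) + 0.6 * peak(1.7, 0.10)     # pure L reference
spec_D = peak(1.1, 0.08) + 0.8 * peak(2.0, 0.12)     # pure D reference

true_mix = 0.35 * spec_L + 0.65 * spec_D
measured = true_mix + np.random.default_rng(2).normal(0, 0.01, freq.size)

A = np.column_stack([spec_L, spec_D])
coef, _ = nnls(A, measured)
print("fitted fractions:", coef / coef.sum())        # ~ [0.35, 0.65]
```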

  4. Quantification in Kabiye: a linguistic approach | Pali ...

    African Journals Online (AJOL)

    ... which is denoted by lexical quantifiers. Quantification with specific reference is provided by different types of linguistic units (nouns, numerals, adjectives, adverbs, ideophones and verbs) in arguments/noun phrases and in the predicative phrase in the sense of Chomsky. Keywords: quantification, class, number, reference, ...

  5. A model of PCF in guarded type theory

    DEFF Research Database (Denmark)

    Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars

    2015-01-01

    Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.

  6. A Model of PCF in Guarded Type Theory

    DEFF Research Database (Denmark)

    Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars

    2015-01-01

    Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.

  7. Irregular singularities in Liouville theory and Argyres-Douglas type gauge theories, I

    Energy Technology Data Exchange (ETDEWEB)

    Gaiotto, D. [Institute for Advanced Study (IAS), Princeton, NJ (United States)]; Teschner, J. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)]

    2012-03-15

    Motivated by problems arising in the study of N=2 supersymmetric gauge theories we introduce and study irregular singularities in two-dimensional conformal field theory, here Liouville theory. Irregular singularities are associated to representations of the Virasoro algebra in which a subset of the annihilation part of the algebra act diagonally. In this paper we define natural bases for the space of conformal blocks in the presence of irregular singularities, describe how to calculate their series expansions, and how such conformal blocks can be constructed by some delicate limiting procedure from ordinary conformal blocks. This leads us to a proposal for the structure functions appearing in the decomposition of physical correlation functions with irregular singularities into conformal blocks. Taken together, we get a precise prediction for the partition functions of some Argyres-Douglas type theories on S^4. (orig.)

  8. Irregular singularities in Liouville theory and Argyres-Douglas type gauge theories, I

    International Nuclear Information System (INIS)

    Gaiotto, D.; Teschner, J.

    2012-03-01

    Motivated by problems arising in the study of N=2 supersymmetric gauge theories we introduce and study irregular singularities in two-dimensional conformal field theory, here Liouville theory. Irregular singularities are associated to representations of the Virasoro algebra in which a subset of the annihilation part of the algebra act diagonally. In this paper we define natural bases for the space of conformal blocks in the presence of irregular singularities, describe how to calculate their series expansions, and how such conformal blocks can be constructed by some delicate limiting procedure from ordinary conformal blocks. This leads us to a proposal for the structure functions appearing in the decomposition of physical correlation functions with irregular singularities into conformal blocks. Taken together, we get a precise prediction for the partition functions of some Argyres-Douglas type theories on S^4. (orig.)

  9. Fivebrane instantons and higher derivative couplings in type I theory

    International Nuclear Information System (INIS)

    Hammou, Amine B.; Morales, Jose F.

    2000-01-01

    We express the infinite sum of D5-brane instanton corrections to R^2 couplings in N=4 type I string vacua, in terms of an elliptic index counting 1/2-BPS excitations in the effective Sp(N) brane theory. We compute the index explicitly in the infrared, where the effective theory is argued to flow to an orbifold CFT. The form of the instanton sum agrees completely with the predicted formula from a dual one-loop computation in type IIA theory on K3 x T^2. The proposed CFT provides a proper description of the whole spectrum of masses, charges and multiplicities for 1/2- and 1/4-BPS states, associated to bound states of D5-branes and KK momenta. These results are applied to show how fivebrane instanton sums, entering higher derivative couplings which are sensitive to 1/4-BPS contributions, also match the perturbative results in the dual type IIA theory

  10. Quantification of Cannabinoid Content in Cannabis

    Science.gov (United States)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoid content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
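
    Selecting an "optimal waveband" of the kind reported (695 nm) typically amounts to screening per-band correlations with measured THC content and regressing on the strongest band. A simplified sketch on synthetic reflectance spectra (the planted signal location is arbitrary, chosen only to mirror the reported band):

```python
# Sketch: picking the waveband most correlated with THC content, then
# regressing on it (synthetic reflectance spectra with a planted signal,
# not the study's leaf measurements).
import numpy as np

rng = np.random.default_rng(3)
bands = np.arange(400, 1000, 5)                    # nm
reflect = rng.normal(0.4, 0.05, (60, bands.size))  # 60 leaf samples
thc = (2.0 + 8.0 * reflect[:, np.argmin(np.abs(bands - 695))]
       + rng.normal(0, 0.1, 60))                   # signal planted at 695 nm

corr = np.array([np.corrcoef(reflect[:, j], thc)[0, 1]
                 for j in range(bands.size)])
best = int(np.argmax(np.abs(corr)))
slope, icept = np.polyfit(reflect[:, best], thc, 1)
print(f"optimal band ~ {bands[best]} nm, r = {corr[best]:.2f}")
```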

  11. Enhanced gauge symmetry in type II string theory

    International Nuclear Information System (INIS)

    Katz, S.; Ronen Plesser, M.

    1996-01-01

    We show how enhanced gauge symmetry in type II string theory compactified on a Calabi-Yau threefold arises from singularities in the geometry of the target space. When the target space of the type IIA string acquires a genus g curve C of A_{N-1} singularities, we find that an SU(N) gauge theory with g adjoint hypermultiplets appears at the singularity. The new massless states correspond to solitons wrapped about the collapsing cycles, and their dynamics is described by a twisted supersymmetric gauge theory on C x R^4. We reproduce this result from an analysis of the S-dual D-manifold. We check that the predictions made by this model about the nature of the Higgs branch, the monodromy of period integrals, and the asymptotics of the one-loop topological amplitude are in agreement with geometrical computations. In one of our examples we find that the singularity occurs at strong coupling in the heterotic dual proposed by Kachru and Vafa. (orig.)

  12. Types of two-dimensional N = 4 superconformal field theories

    Indian Academy of Sciences (India)

    Types of two-dimensional N = 4 superconformal field theories. Abbas Ali ... Various types of N = 4 superconformal symmetries in two dimensions are considered. It is proposed that apart ...

  13. Towards deconstruction of the type D(2,0) theory

    International Nuclear Information System (INIS)

    Bourget, Antoine; Rodriguez-Gomez, Diego; Pini, Alessandro

    2017-10-01

    We propose a four-dimensional supersymmetric theory that deconstructs, in a particular limit, the six-dimensional (2,0) theory of type D_k. This 4d theory is defined by a necklace quiver with alternating gauge nodes O(2k) and Sp(k). We test this proposal by comparing the 6d half-BPS index to the Higgs branch Hilbert series of the 4d theory. In the process, we overcome several technical difficulties, such as Hilbert series calculations for non-complete intersections, and the choice of O versus SO gauge groups. Consistently, the result matches the Coulomb branch formula for the mirror theory upon reduction to 3d.

  14. A Generalized Perturbation Theory Solver In Rattlesnake Based On PETSc With Application To TREAT Steady State Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Schunert, Sebastian; Wang, Congjian; Wang, Yaqi; Kong, Fande; Ortensi, Javier; Baker, Benjamin; Gleicher, Frederick; DeHart, Mark; Martineau, Richard

    2017-04-01

    Rattlesnake and MAMMOTH are the designated TREAT analysis tools currently being developed at the Idaho National Laboratory. Concurrent with development of the multi-physics, multi-scale capabilities, sensitivity analysis and uncertainty quantification (SA/UQ) capabilities are required for predictive modeling of the TREAT reactor. For steady-state SA/UQ, which is essential for setting initial conditions for the transients, generalized perturbation theory (GPT) will be used. This work describes the implementation of a PETSc-based solver for the generalized adjoint equations, which constitute an inhomogeneous, rank-deficient problem. The standard approach is to use an outer iteration strategy with repeated removal of the fundamental mode contamination. The described GPT algorithm directly solves the GPT equations without the need for an outer iteration procedure by using Krylov subspaces that are orthogonal to the operator's nullspace. Three test problems are solved and provide sufficient verification of Rattlesnake's GPT capability. We conclude with a preliminary example evaluating the impact of the boron distribution in the TREAT reactor using perturbation theory.
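
    The core idea, solving a rank-deficient adjoint system in a Krylov subspace kept orthogonal to the operator's nullspace, can be sketched in a few lines. The following is a toy deflated conjugate-gradient illustration in numpy, not the Rattlesnake/PETSc implementation; the operator, nullspace vector and right-hand side are synthetic assumptions.

```python
import numpy as np

def projected_cg(A, b, null_vec, tol=1e-10, max_iter=200):
    """Solve the singular system A x = b within the subspace orthogonal to
    null_vec (the 'fundamental mode'), i.e. a deflated Krylov solve."""
    n = null_vec / np.linalg.norm(null_vec)
    proj = lambda v: v - n * (n @ v)   # strip the fundamental-mode component
    b = proj(b)                        # the GPT source must be orthogonal too
    x = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    for _ in range(max_iter):
        Ap = proj(A @ p)
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = proj(r - alpha * Ap)
        if np.linalg.norm(r_new) < tol:
            break
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return x

# Toy rank-deficient symmetric operator with a known one-dimensional nullspace.
Q = np.linalg.qr(np.random.default_rng(1).normal(size=(6, 6)))[0]
A = Q @ np.diag([3.0, 2.5, 2.0, 1.5, 1.0, 0.0]) @ Q.T
null_vec = Q[:, -1]
b = A @ np.random.default_rng(2).normal(size=6)  # consistent right-hand side
x = projected_cg(A, b, null_vec)
print(np.linalg.norm(A @ x - b))                 # ~1e-15: deflated solve works
```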

  15. Use of measurement theory for operationalization and quantification of psychological constructs in systems dynamics modelling

    Science.gov (United States)

    Fitkov-Norris, Elena; Yeghiazarian, Ara

    2016-11-01

    The analytical tools available to social scientists have traditionally been adapted from tools originally designed for analysis of natural science phenomena. This article discusses the applicability of systems dynamics - a qualitative based modelling approach, as a possible analysis and simulation tool that bridges the gap between social and natural sciences. After a brief overview of the systems dynamics modelling methodology, the advantages as well as limiting factors of systems dynamics to the potential applications in the field of social sciences and human interactions are discussed. The issues arise with regards to operationalization and quantification of latent constructs at the simulation building stage of the systems dynamics methodology and measurement theory is proposed as a ready and waiting solution to the problem of dynamic model calibration, with a view of improving simulation model reliability and validity and encouraging the development of standardised, modular system dynamics models that can be used in social science research.

  16. Uncertainty Quantification of Composite Laminate Damage with the Generalized Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    J. Lucero; F. Hemez; T. Ross; K. Kline; J. Hundhausen; T. Tippetts

    2006-05-01

    This work presents a survey of five theories to assess the uncertainty of projectile impact induced damage on multi-layered carbon-epoxy composite plates. Because the types of uncertainty dealt with in this application are multiple (variability, ambiguity, and conflict) and because the data sets collected are sparse, characterizing the amount of delamination damage with probability theory alone is possible but incomplete. This motivates the exploration of methods contained within a broad Generalized Information Theory (GIT) that rely on less restrictive assumptions than probability theory. Probability, fuzzy sets, possibility, and imprecise probability (probability boxes (p-boxes) and Dempster-Shafer) are used to assess the uncertainty in composite plate damage. Furthermore, this work highlights the usefulness of each theory. The purpose of the study is not to compare directly the different GIT methods but to show that they can be deployed on a practical application and to compare the assumptions upon which these theories are based. The data sets consist of experimental measurements and finite element predictions of the amount of delamination and fiber splitting damage as multilayered composite plates are impacted by a projectile at various velocities. The physical experiments consist of using a gas gun to impact suspended plates with a projectile accelerated to prescribed velocities, then, taking ultrasound images of the resulting delamination. The nonlinear, multiple length-scale numerical simulations couple local crack propagation implemented through cohesive zone modeling to global stress-displacement finite element analysis. The assessment of damage uncertainty is performed in three steps by, first, considering the test data only; then, considering the simulation data only; finally, performing an assessment of total uncertainty where test and simulation data sets are combined. This study leads to practical recommendations for reducing the uncertainty and

  17. Recurrence quantification analysis theory and best practices

    CERN Document Server

    Webber, Charles L., Jr.; Marwan, Norbert

    2015-01-01

    The analysis of recurrences in dynamical systems by using recurrence plots and their quantification is still an emerging field. Over the past decades recurrence plots have proven to be valuable data visualization and analysis tools in the theoretical study of complex, time-varying dynamical systems as well as in various applications in biology, neuroscience, kinesiology, psychology, physiology, engineering, physics, geosciences, linguistics, finance, economics, and other disciplines. This multi-authored book intends to comprehensively introduce and showcase recent advances as well as established best practices concerning both theoretical and practical aspects of recurrence plot based analysis. Edited and authored by leading researchers in the field, the various chapters address an interdisciplinary readership, ranging from theoretical physicists to application-oriented scientists in all data-providing disciplines.

  18. Information theoretic quantification of diagnostic uncertainty.

    Science.gov (United States)

    Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

    2012-01-01

    Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes' theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
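
    The two quantitative ingredients the abstract refers to, Bayes' theorem for the post-test probability and Shannon entropy for the residual diagnostic uncertainty, fit in a few lines of code. A minimal Python sketch, with assumed (illustrative) test characteristics and pre-test probability:

```python
import math

def post_test_probability(pre, sens, spec, positive=True):
    """Bayes' theorem for a binary diagnostic test result."""
    if positive:
        return sens * pre / (sens * pre + (1 - spec) * (1 - pre))
    return (1 - sens) * pre / ((1 - sens) * pre + spec * (1 - pre))

def binary_entropy(p):
    """Diagnostic uncertainty in bits; maximal (1 bit) at p = 0.5."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

pre = 0.30               # assumed pre-test probability of disease
sens, spec = 0.90, 0.85  # assumed sensitivity and specificity
post = post_test_probability(pre, sens, spec, positive=True)
print(f"post-test probability: {post:.3f}")
print(f"uncertainty: {binary_entropy(pre):.3f} -> {binary_entropy(post):.3f} bits")
```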

  19. Precise iteration formulae of the Maslov-type index theory for symplectic paths

    International Nuclear Information System (INIS)

    Yiming Long

    1998-10-01

    In this paper, using homotopy components of symplectic matrices, and basic properties of the Maslov-type index theory, we establish precise iteration formulae of the Maslov-type index theory for any path in the symplectic group starting from the identity. (author)

  20. Development and validation of an enzyme-linked immunosorbent assay for the quantification of a specific MMP-9 mediated degradation fragment of type III collagen--A novel biomarker of atherosclerotic plaque remodeling

    DEFF Research Database (Denmark)

    Barascuk, Natasha; Vassiliadis, Efstathios; Larsen, Lise

    2011-01-01

    Degradation of collagen in the arterial wall by matrix metalloproteinases is the hallmark of atherosclerosis. We have developed an ELISA for the quantification of type III collagen degradation mediated by MMP-9 in urine.

  1. The quantification of risk and tourism

    Directory of Open Access Journals (Sweden)

    Piet Croucamp

    2014-01-01

    Tourism in South Africa comprises 9.5% of Gross Domestic Product (GDP), but remains an under-researched industry, especially regarding the quantification of the risks prevailing in the social, political and economic environment in which the industry operates. Risk prediction and extrapolation forecasting are conducted largely in the context of a qualitative methodology. This article reflects on the quantification of social constructs as variables of risk in the tourism industry with reference to South Africa. The theory and methodology of quantification are briefly reviewed and the indicators of risk are conceptualized and operationalized. The identified indicators are scaled in indices for purposes of quantification. Risk assessments and the quantification of constructs rely heavily on the experience - often personal - of the researcher, and this scholarly endeavour is therefore not inclusive of all possible identified indicators of risk. It is accepted that tourism in South Africa is an industry comprising a large diversity of sectors, each with a different set of risk indicators and risk profiles. The emphasis of this article is thus on the methodology to be applied to a risk profile. A secondary endeavour is to provide clarity about the conceptual and operational confines of risk in general, as well as how quantified risk relates to the tourism industry. The indices provided include both domestic and international risk indicators. The motivation for the article is to encourage a greater emphasis on quantitative research in our efforts to understand and manage a risk profile for the tourist industry.

  2. A Kallosh theorem for BF-type topological field theory

    International Nuclear Information System (INIS)

    Birmingham, D.; Gibbs, R.; Mokhtari, S.

    1991-01-01

    A Kallosh theorem is established for the case of BF-type theories in three dimensions, including a coupling to Chern-Simons theory. The phase contribution to the one-loop off-shell effective action is computed for a two-parameter family of local covariant gauges. It is shown that the phase is independent of these parameters, and thus equals the 'no Vilkovisky-DeWitt' gauge result. The field space metric dependence of a corresponding calculation for generalized BF theory is briefly discussed. (orig.)

  3. Multivariate Bonferroni-type inequalities theory and applications

    CERN Document Server

    Chen, John

    2014-01-01

    Multivariate Bonferroni-Type Inequalities: Theory and Applications presents a systematic account of research discoveries on multivariate Bonferroni-type inequalities published in the past decade. The emergence of new bounding approaches pushes the conventional definitions of optimal inequalities and demands new insights into linear and Fréchet optimality. The book explores these advances in bounding techniques with corresponding innovative applications. It presents the method of linear programming for multivariate bounds, multivariate hybrid bounds, sub-Markovian bounds, and bounds using Hamil

  4. [DNA quantification of blood samples pre-treated with pyramidon].

    Science.gov (United States)

    Zhu, Chuan-Hong; Zheng, Dao-Li; Ni, Rao-Zhi; Wang, Hai-Sheng; Ning, Ping; Fang, Hui; Liu, Yan

    2014-06-01

    To study DNA quantification and STR typing of samples pre-treated with pyramidon. The blood samples of ten unrelated individuals were anticoagulated in EDTA and blood stains were made on filter paper. The samples were divided into six groups according to the storage time after pre-treatment with pyramidon: 30 min, 1 h, 3 h, 6 h, 12 h and 24 h. DNA was extracted by three methods: magnetic bead-based extraction, the QIAcube DNA purification method, and the Chelex-100 method. DNA was quantified by fluorescent quantitative PCR, and STR typing was performed with PCR-STR fluorescent technology. For a given extraction method, the recovered DNA decreased gradually with time after pre-treatment with pyramidon. For a given storage time, the DNA yields obtained with the different extraction methods differed significantly. Full sixteen-locus STR profiles were detected in 90.56% of samples. Pyramidon pre-treatment can cause DNA degradation, but effective STR typing can be achieved within 24 h. Magnetic bead-based extraction was the best method for DNA extraction and STR profiling.

  5. A Kallosh theorem for BF-type topological field theory

    Energy Technology Data Exchange (ETDEWEB)

    Birmingham, D. (Theory Div., CERN, Geneva (Switzerland)); Gibbs, R.; Mokhtari, S. (Physics Dept., Louisiana Tech. Univ., Ruston, LA (United States))

    1991-12-12

    A Kallosh theorem is established for the case of BF-type theories in three dimensions, including a coupling to Chern-Simons theory. The phase contribution to the one-loop off-shell effective action is computed for a two-parameter family of local covariant gauges. It is shown that the phase is independent of these parameters, and thus equals the 'no Vilkovisky-DeWitt' gauge result. The field space metric dependence of a corresponding calculation for generalized BF theory is briefly discussed. (orig.).

  6. Quantification of intrapancreatic fat in type 2 diabetes by MRI.

    Directory of Open Access Journals (Sweden)

    Ahmad Al-Mrabeh

    Accumulation of intrapancreatic fat may be important in type 2 diabetes, but widely varying data have been reported. The standard quantification by MRI in vivo is time-consuming and dependent upon a high level of experience. We aimed to develop a new method which would minimise inter-observer variation and to compare this against previously published datasets. A technique of 'biopsying' the image to minimise inclusion of non-parenchymal tissues was developed. Additionally, thresholding was applied to exclude both pancreatic ducts and intrusions of visceral fat, with pixels of fat values above 20% being excluded. The new MR image 'biopsy' (MR-opsy) was compared to the standard method by 6 independent observers with wide experience of image analysis but no experience of pancreas imaging. The effect of the new method was examined on datasets from two studies of weight loss in type 2 diabetes. At low levels of intrapancreatic fat neither the result nor the inter-observer CV was changed by MR-opsy, thresholding or a combination of the methods. However, at higher levels the conventional method exhibited poor inter-observer agreement (coefficient of variation 26.9%), while the new combined method improved the CV to 4.3% (p<0.03). Using either MR-opsy alone or with thresholding, the new methods indicated a closer relationship between the decrease in intrapancreatic fat and the fall in blood glucose. The inter-observer variation for quantifying intrapancreatic fat was substantially improved by the new method when pancreas fat levels were moderately high. The method will improve comparability of pancreas fat measurement between research groups.
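
    In image-processing terms, the combined MR-opsy/thresholding step amounts to restricting a drawn region of interest to pixels whose fat values fall at or below the cutoff before averaging. The numpy sketch below assumes a 20% upper threshold and entirely synthetic data; the map, the region and the names are illustrative, not the study's pipeline.

```python
import numpy as np

# Synthetic fat-fraction map (%) and a rectangular stand-in for the
# hand-drawn pancreas image 'biopsy'; both are illustrative only.
rng = np.random.default_rng(3)
fat_map = rng.uniform(0.0, 40.0, (64, 64))  # % fat per pixel
roi = np.zeros((64, 64), dtype=bool)
roi[20:44, 16:48] = True                    # the 'biopsied' region

# Threshold out ducts and visceral-fat intrusions (assumed >20% cutoff).
parenchyma = roi & (fat_map <= 20.0)
print(f"mean intrapancreatic fat: {fat_map[parenchyma].mean():.1f}%")
```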

  7. The use of self-quantification systems for personal health information: big data management activities and prospects.

    Science.gov (United States)

    Almalki, Manal; Gray, Kathleen; Sanchez, Fernando Martin

    2015-01-01

    Self-quantification is seen as an emerging paradigm for health care self-management. Self-quantification systems (SQS) can be used for tracking, monitoring, and quantifying health aspects including mental, emotional, physical, and social aspects in order to gain self-knowledge. However, there has been a lack of a systematic approach for conceptualising and mapping the essential activities that are undertaken by individuals who are using SQS in order to improve health outcomes. In this paper, we propose a new model of personal health information self-quantification systems (PHI-SQS). The PHI-SQS model describes two types of activities that individuals go through during their journey of self-managed health practice, namely 'self-quantification' and 'self-activation'. In this paper, we aimed to examine thoroughly the first type of activity in PHI-SQS, 'self-quantification'. Our objectives were to review the data management processes currently supported in a representative set of self-quantification tools and ancillary applications, and to provide a systematic approach for conceptualising and mapping these processes with the individuals' activities. We reviewed and compared eleven self-quantification tools and applications (Zeo Sleep Manager, Fitbit, Actipressure, MoodPanda, iBGStar, Sensaris Senspod, 23andMe, uBiome, Digifit, BodyTrack, and Wikilife) that collect three key health data types (environmental exposure, physiological patterns, genetic traits). We investigated the interaction taking place at different data flow stages between the individual user and the self-quantification technology used. We found that these eleven self-quantification tools and applications represent two major tool types (primary and secondary self-quantification systems). In each type, the individuals experience different processes and activities which are substantially influenced by the technologies' data management capabilities. Self-quantification in personal health maintenance

  8. An EPGPT-based approach for uncertainty quantification

    International Nuclear Information System (INIS)

    Wang, C.; Abdel-Khalik, H. S.

    2012-01-01

    Generalized Perturbation Theory (GPT) has been widely used by many scientific disciplines to perform sensitivity analysis and uncertainty quantification. This manuscript employs recent developments in GPT theory, collectively referred to as Exact-to-Precision Generalized Perturbation Theory (EPGPT), to enable uncertainty quantification for computationally challenging models, e.g. nonlinear models associated with many input parameters and many output responses and with general non-Gaussian parameter distributions. The core difference between EPGPT and existing GPT is in the way the problem is formulated. GPT formulates an adjoint problem that is dependent on the response of interest. It tries to capture via the adjoint solution the relationship between the response of interest and the constraints on the state variations. EPGPT recasts the problem in terms of a smaller set of what are referred to as the 'active' responses, which are solely dependent on the physics model and the boundary and initial conditions rather than on the responses of interest. The objective of this work is to apply the EPGPT methodology to propagate cross-section variations in typical reactor design calculations. The goal is to illustrate its use and the associated impact for situations where the typical Gaussian assumption for parameter uncertainties is not valid and when nonlinear behavior must be considered. To allow this demonstration, exaggerated variations will be employed to stimulate nonlinear behavior in simple prototypical neutronics models. (authors)

  9. A theory of solar type III radio bursts

    International Nuclear Information System (INIS)

    Goldstein, M.L.; Smith, R.A.

    1979-01-01

    A theory of type III bursts is reviewed. Energetic electrons propagating through the interplanetary medium are shown to excite the one dimensional oscillating two stream instability (OTSI). The OTSI is in turn stabilized by anomalous resistivity which completes the transfer of long wavelength Langmuir waves to short wavelengths, out of resonance with the electrons. The theory explains the small energy losses suffered by the electrons in propagating to 1 AU, the predominance of second harmonic radiation, and the observed correlation between radio and electron fluxes. (Auth.)

  10. A "Toy" Model for Operational Risk Quantification using Credibility Theory

    OpenAIRE

    Hans Bühlmann; Pavel V. Shevchenko; Mario V. Wüthrich

    2009-01-01

    To meet the Basel II regulatory requirements for the Advanced Measurement Approaches in operational risk, the bank's internal model should make use of internal data, relevant external data, scenario analysis and factors reflecting the business environment and internal control systems. One of the unresolved challenges in operational risk is how to combine these data sources appropriately. In this paper we focus on quantification of the low frequency high impact losses exceeding some high thr...

  11. E7 type modular invariant Wess-Zumino theory and Gepner's string compactification

    International Nuclear Information System (INIS)

    Kato, Akishi; Kitazawa, Yoshihisa

    1989-01-01

    The report addresses the development of a general procedure to study the structure of the operator algebra in off-diagonal modular invariant theories. An effort is made to carry out this procedure in the E7 type modular invariant Wess-Zumino-Witten theory and explicitly check the closure of the operator product algebra, which is required for any consistent conformal field theory. The conformal field theory is utilized to construct perturbative vacua in string theory. Apparently quite nontrivial vacua can be constructed out of minimal models of the N = 2 superconformal theory. Here, an investigation is made of the Yukawa couplings of such a model, which uses E7 type off-diagonal modular invariance. Phenomenological properties of this model are also discussed. Although off-diagonal modular invariant theories are rather special, realistic models seem to require very special manifolds; therefore they may enhance the viability of string theory to describe the real world. A study is also made of Verlinde's fusion algebra in the E7 modular invariant theory. It is determined in the holomorphic sector only. Furthermore, the indicator is given by the modular transformation matrix. A pair of operators which operate on the characters play a crucial role in this theory. (Nogami, K.)

  12. Two Ramond-Ramond corrections to type II supergravity via field-theory amplitude

    Energy Technology Data Exchange (ETDEWEB)

    Bakhtiarizadeh, Hamid R. [Sirjan University of Technology, Department of Physics, Sirjan (Iran, Islamic Republic of)

    2017-12-15

    Motivated by the standard form of the string-theory amplitude, we calculate the field-theory amplitude to complete the higher-derivative terms in type II supergravity theories in their conventional form. We derive explicitly the O(α'^3) interactions for the RR (Ramond-Ramond) fields with graviton, B-field and dilaton in the low-energy effective action of type II superstrings. We check our results by comparison with previous work that has been done by other methods, and we find exact agreement. (orig.)

  13. On the theory of the type III burst exciter

    Science.gov (United States)

    Smith, R. A.; Goldstein, M. L.; Papadopoulos, K.

    1976-01-01

    In situ satellite observations of type III burst exciters at 1 AU show that the beam does not evolve into a plateau in velocity space, contrary to the prediction of quasilinear theory. The observations can be explained by a theory that includes mode coupling effects due to excitation of the parametric oscillating two-stream instability and its saturation by anomalous resistivity. The time evolution of the beam velocity distribution is included in the analysis.

  14. Leptogenesis in unified theories with Type II see-saw

    International Nuclear Information System (INIS)

    Antusch, Stefan; King, Steve F.

    2006-01-01

    In some classes of flavour models based on unified theories with a type I see-saw mechanism, the prediction for the mass of the lightest right-handed neutrino is in conflict with the lower bound from the requirement of successful thermal leptogenesis. We investigate how lifting the absolute neutrino mass scale by adding a type II see-saw contribution proportional to the unit matrix can solve this problem. Generically, lifting the neutrino mass scale increases the prediction for the mass of the lightest right-handed neutrino while the decay asymmetry is enhanced and washout effects are reduced, relaxing the lower bound on the mass of the lightest right-handed neutrino from thermal leptogenesis. For instance in classes of unified theories where the lightest right-handed neutrino dominates the type I see-saw contribution, we find that thermal leptogenesis becomes possible if the neutrino mass scale is larger than about 0.15 eV, making this scenario testable by neutrinoless double beta decay experiments in the near future.

  15. Effects of humic acid on DNA quantification with Quantifiler® Human DNA Quantification kit and short tandem repeat amplification efficiency.

    Science.gov (United States)

    Seo, Seung Bum; Lee, Hye Young; Zhang, Ai Hua; Kim, Hye Yeon; Shin, Dong Hoon; Lee, Soong Deok

    2012-11-01

    Correct DNA quantification is an essential part of obtaining reliable STR typing results. Forensic DNA analysts often use commercial kits for DNA quantification; among them, real-time-based DNA quantification kits are most frequently used. Incorrect DNA quantification due to the presence of PCR inhibitors may affect experiment results. In this study, we examined the degree to which DNA quantification results are altered in DNA samples containing a PCR inhibitor, using the Quantifiler® Human DNA Quantification kit. For the experiments, we prepared approximately 0.25 ng/μl DNA samples containing various concentrations of humic acid (HA). The quantification results were 0.194-0.303 ng/μl at 0-1.6 ng/μl HA (final concentration in the Quantifiler reaction) and 0.003-0.168 ng/μl at 2.4-4.0 ng/μl HA. Most DNA quantities were undetermined when the HA concentration was higher than 4.8 ng/μl. The C_T values of an internal PCR control (IPC) were 28.0-31.0, 36.5-37.1, and undetermined at 0-1.6, 2.4, and 3.2 ng/μl HA, respectively. These results indicate that underestimated DNA quantification results may be obtained for DNA samples with high IPC C_T values. Thus, researchers should carefully interpret DNA quantification results. We additionally examined the effects of HA on STR amplification by using an Identifiler® kit and a MiniFiler™ kit. Based on the results of this study, it is thought that a better understanding of the various effects of HA would help researchers recognize and manipulate samples containing HA.

  16. Cartan's equations define a topological field theory of the BF type

    International Nuclear Information System (INIS)

    Cuesta, Vladimir; Montesinos, Merced

    2007-01-01

    Cartan's first and second structure equations together with the first and second Bianchi identities can be interpreted as equations of motion for the tetrad, the connection and a set of two-form fields T^I and R^I_J. From this viewpoint, these equations define by themselves a field theory. Restricting the analysis to four-dimensional spacetimes (keeping gravity in mind), it is possible to give an action principle of the BF type from which these equations of motion are obtained. The action turns out to be equivalent to a linear combination of the Nieh-Yan, Pontrjagin, and Euler classes, and so the field theory defined by the action is topological. Once Einstein's equations are added, the resulting theory is general relativity. Therefore, the current results show that the relationship between general relativity and topological field theories of the BF type is also present in the first-order formalism for general relativity.

  17. Nonperturbative type IIB model building in the F-theory framework

    Energy Technology Data Exchange (ETDEWEB)

    Jurke, Benjamin Helmut Friedrich

    2011-02-28

    This dissertation is concerned with the topic of non-perturbative string theory, which is generally considered to be the most promising approach to a consistent description of quantum gravity. The five known 10-dimensional perturbative string theories are all interconnected by numerous dualities, such that an underlying non-perturbative 11-dimensional theory, called M-theory, is postulated. Due to several technical obstacles, little is known about the fundamental objects in this theory. There exists an alternative non-perturbative description to type IIB string theory, namely F-theory. Here the SL(2;Z) self-duality of IIB theory is geometrized in the form of an elliptic fibration over the space-time. Moreover, higher-dimensional objects like 7-branes are included via singularities into the geometric picture. This formally elegant description, however, requires significant technical effort for the construction of suitable compactification geometries, as many different aspects necessarily have to be dealt with at the same time. On the other hand, the generation of essential GUT building blocks like certain Yukawa couplings or spinor representations is easier compared to perturbative string theory. The goal of this study is therefore to formulate a unified theory within the framework of F-theory, that satisfies basic phenomenological constraints. Within this thesis, at first E3-brane instantons in type IIB string theory - 4-dimensional objects that are entirely wrapped around the invisible dimensions of space-time - are matched with M5-branes in F-theory. Such objects are of great importance in the generation of critical Yukawa couplings or the stabilization of the free parameters of a theory. Certain properties of M5-branes then allow to derive a new criterion for E3-branes to contribute to the superpotential. In the aftermath of this analysis, several compactification geometries are constructed and checked for basic properties that are relevant for semi

  18. Nonperturbative type IIB model building in the F-theory framework

    International Nuclear Information System (INIS)

    Jurke, Benjamin Helmut Friedrich

    2011-01-01

    This dissertation is concerned with the topic of non-perturbative string theory, which is generally considered to be the most promising approach to a consistent description of quantum gravity. The five known 10-dimensional perturbative string theories are all interconnected by numerous dualities, such that an underlying non-perturbative 11-dimensional theory, called M-theory, is postulated. Due to several technical obstacles, little is known about the fundamental objects in this theory. There exists an alternative non-perturbative description to type IIB string theory, namely F-theory. Here the SL(2;Z) self-duality of IIB theory is geometrized in the form of an elliptic fibration over the space-time. Moreover, higher-dimensional objects like 7-branes are included via singularities into the geometric picture. This formally elegant description, however, requires significant technical effort for the construction of suitable compactification geometries, as many different aspects necessarily have to be dealt with at the same time. On the other hand, the generation of essential GUT building blocks like certain Yukawa couplings or spinor representations is easier compared to perturbative string theory. The goal of this study is therefore to formulate a unified theory within the framework of F-theory, that satisfies basic phenomenological constraints. Within this thesis, at first E3-brane instantons in type IIB string theory - 4-dimensional objects that are entirely wrapped around the invisible dimensions of space-time - are matched with M5-branes in F-theory. Such objects are of great importance in the generation of critical Yukawa couplings or the stabilization of the free parameters of a theory. Certain properties of M5-branes then allow to derive a new criterion for E3-branes to contribute to the superpotential. In the aftermath of this analysis, several compactification geometries are constructed and checked for basic properties that are relevant for semi

  19. Development of an exchange–correlation functional with uncertainty quantification capabilities for density functional theory

    Energy Technology Data Exchange (ETDEWEB)

    Aldegunde, Manuel, E-mail: M.A.Aldegunde-Rodriguez@warwick.ac.uk; Kermode, James R., E-mail: J.R.Kermode@warwick.ac.uk; Zabaras, Nicholas

    2016-04-15

    This paper presents the development of a new exchange–correlation functional from the point of view of machine learning. Using atomization energies of solids and small molecules, we train a linear model for the exchange enhancement factor using a Bayesian approach which allows for the quantification of uncertainties in the predictions. A relevance vector machine is used to automatically select the most relevant terms of the model. We then test this model on atomization energies and also on bulk properties. The average model provides a mean absolute error of only 0.116 eV for the test points of the G2/97 set but a larger 0.314 eV for the test solids. In terms of bulk properties, the prediction for transition metals and monovalent semiconductors has a very low test error. However, as expected, predictions for types of materials not represented in the training set such as ionic solids show much larger errors.

  20. Advances in forensic DNA quantification: a review.

    Science.gov (United States)

    Lee, Steven B; McCord, Bruce; Buel, Eric

    2014-11-01

    This review focuses upon a critical step in forensic biology: detection and quantification of human DNA from biological samples. Determination of the quantity and quality of human DNA extracted from biological evidence is important for several reasons. Firstly, depending on the source and extraction method, the quality (purity and length) and quantity of the resultant DNA extract can vary greatly. This affects the downstream method, as the quantity of input DNA and its relative length can determine which genotyping procedure to use: standard short-tandem repeat (STR) typing, mini-STR typing or mitochondrial DNA sequencing. Secondly, because it is important in forensic analysis to preserve as much of the evidence as possible for retesting, it is important to determine the total DNA amount available prior to utilizing any destructive analytical method. Lastly, results from initial quantitative and qualitative evaluations permit a more informed interpretation of downstream analytical results. Newer quantitative techniques involving real-time PCR can reveal the presence of degraded DNA and PCR inhibitors, which provide potential reasons for poor genotyping results and may indicate methods to use for downstream typing success. In general, the more information available, the easier it is to interpret and process the sample, resulting in a higher likelihood of successful DNA typing. The history of the development of quantitative methods has involved two main goals: improving the precision of the analysis and increasing the information content of the result. This review covers advances in forensic DNA quantification methods and recent developments in RNA quantification. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Stereomicroscopic imaging technique for the quantification of cold flow in drug-in-adhesive type of transdermal drug delivery systems.

    Science.gov (United States)

    Krishnaiah, Yellela S R; Katragadda, Usha; Khan, Mansoor A

    2014-05-01

    Cold flow is a phenomenon occurring in drug-in-adhesive type transdermal drug delivery systems (DIA-TDDS) because of the migration of the DIA coat beyond the edge. Excessive cold flow can affect their therapeutic effectiveness, make removal of the DIA-TDDS from the pouch difficult, and potentially decrease the available dose if any drug remains adhered to the pouch. There are no compendial or noncompendial methods available for quantification of this critical quality attribute. The objective was to develop a method for quantification of cold flow using a stereomicroscopic imaging technique. Cold flow was induced by applying 1 kg force on punched-out samples of a marketed estradiol DIA-TDDS (model product) stored at 25°C, 32°C, and 40°C/60% relative humidity (RH) for 1, 2, or 3 days. At the end of the testing period, the dimensional change in the area of the DIA-TDDS samples was measured using image analysis software and expressed as percent cold flow. The percent cold flow significantly decreased (p < 0.001) with increasing size of the punched-out DIA-TDDS samples and increased (p < 0.001) with increasing cold flow induction temperature and time. This first-ever report suggests that the dimensional change in the area of punched-out samples stored under 1 kg force at 32°C/60% RH for 2 days could be used for quantification of cold flow in DIA-TDDS. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  2. Convex geometry of quantum resource quantification

    Science.gov (United States)

    Regula, Bartosz

    2018-01-01

    We introduce a framework unifying the mathematical characterisation of different measures of general quantum resources and allowing for a systematic way to define a variety of faithful quantifiers for any given convex quantum resource theory. The approach allows us to describe many commonly used measures such as matrix norm-based quantifiers, robustness measures, convex roof-based measures, and witness-based quantifiers together in a common formalism based on the convex geometry of the underlying sets of resource-free states. We establish easily verifiable criteria for a measure to possess desirable properties such as faithfulness and strong monotonicity under relevant free operations, and show that many quantifiers obtained in this framework indeed satisfy them for any considered quantum resource. We derive various bounds and relations between the measures, generalising and providing significantly simplified proofs of results found in the resource theories of quantum entanglement and coherence. We also prove that the quantification of resources in this framework simplifies for pure states, allowing us to obtain more easily computable forms of the considered measures, and show that many of them are in fact equal on pure states. Further, we investigate the dual formulation of resource quantifiers, which provide a characterisation of the sets of resource witnesses. We present an explicit application of the results to the resource theories of multi-level coherence, entanglement of Schmidt number k, multipartite entanglement, as well as magic states, providing insight into the quantification of the four resources by establishing novel quantitative relations and introducing new quantifiers, such as a measure of entanglement of Schmidt number k which generalises the convex roof-extended negativity, and a measure of k-coherence which generalises the ...

  3. Semi-automated quantification of living cells with internalized nanostructures

    KAUST Repository

    Margineanu, Michael B.; Julfakyan, Khachatur; Sommer, Christoph; Perez, Jose E.; Contreras, Maria F.; Khashab, Niveen M.; Kosel, Jürgen; Ravasi, Timothy

    2016-01-01

    novel method for the quantification of cells that internalize a specific type of nanostructures. This approach is suitable for high-throughput and real-time data analysis and has the potential to be used to study the interaction of different types

  4. Formation of social types in the theory of Orrin Klapp

    Directory of Open Access Journals (Sweden)

    Trifunović Vesna

    2007-01-01

    Orrin Klapp's theory of social types draws attention to the important functions these types serve within particular societies, and suggests that they should be taken into consideration if our goal is a more complete knowledge of those societies. For Klapp, social types are important social symbols which reflect, in an interesting way, the society they are part of, and for that reason the author dedicates his work to considering their meanings and social functions. He holds that we cannot understand a society without knowledge of the types with which its members identify and which serve them as models in their social activity. Hence, these types have cognitive value since, according to Klapp, they assist perception and "contain the truth", and knowledge of them therefore allows easier orientation within the social system. Social types also offer insight into the scheme of the social structure, which is otherwise invisible and hidden, but certainly deserves attention if we wish a clearer picture of social relations within a specific community. The aim of this work is to present this interesting and inspiring theory of Orrin Klapp, pointing out its importance but also the weaknesses which should be kept in mind during its application in further research.

  5. T-dualization of type II superstring theory in double space

    Energy Technology Data Exchange (ETDEWEB)

    Nikolic, B.; Sazdovic, B. [University of Belgrade, Institute of Physics Belgrade, Belgrade (Serbia)

    2017-03-15

    In this article we offer a new interpretation of the T-dualization procedure of type II superstring theory in the double space framework. We use the ghost-free action of type II superstring in pure spinor formulation in the approximation of constant background fields up to quadratic terms. T-dualization along any subset of the initial coordinates, x^a, is equivalent to the permutation of this subset with the subset of the corresponding T-dual coordinates, y_a, in the double space coordinate Z^M = (x^μ, y_μ). Requiring that the T-dual transformation law after the exchange x^a <-> y_a has the same form as the initial one, we obtain the T-dual NS-NS and NS-R background fields. The T-dual R-R field strength is determined up to one arbitrary constant under some assumptions. The compatibility between supersymmetry and T-duality produces a change of bar spinors and the R-R field strength. If we dualize an odd number of dimensions x^a, such a change flips type IIA/B to type IIB/A. If we T-dualize the time-like direction, one imaginary unit i maps type II superstring theories to type II* ones. (orig.)

  6. Type IIB flux vacua from G-theory I

    Energy Technology Data Exchange (ETDEWEB)

    Candelas, Philip [Mathematical Institute, University of Oxford,Andrew Wiles Building, Radcliffe Observatory Quarter,Woodstock Road, Oxford, OX2 6GG (United Kingdom); Constantin, Andrei [Department of Physics and Astronomy, Uppsala University,SE-751 20, Uppsala (Sweden); Damian, Cesar [Departamento de Fisica, DCI, Campus Leon, Universidad de Guanajuato,C.P. 37150, Leon, Guanajuato (Mexico); Larfors, Magdalena [Department of Physics and Astronomy, Uppsala University,SE-751 20, Uppsala (Sweden); Morales, Jose Francisco [INFN - Sezione di Roma “TorVergata”, Dipartimento di Fisica,Università di Roma “TorVergata”, Via della Ricerca Scientica, 00133 Roma (Italy)

    2015-02-27

    We construct non-perturbatively exact four-dimensional Minkowski vacua of type IIB string theory with non-trivial fluxes. These solutions are found by gluing together, consistently with U-duality, local solutions of type IIB supergravity on T^4×ℂ with the metric, dilaton and flux potentials varying along ℂ and the flux potentials oriented along T^4. We focus on solutions locally related via U-duality to non-compact Ricci-flat geometries. More general solutions and a complete analysis of the supersymmetry equations are presented in the companion paper http://arxiv.org/abs/1411.4786. We build a precise dictionary between fluxes in the global solutions and the geometry of an auxiliary K3 surface fibered over ℂℙ^1. In the spirit of F-theory, the flux potentials are expressed in terms of locally holomorphic functions that parametrize the complex structure moduli space of the K3 fiber in the auxiliary geometry. The brane content is inferred from the monodromy data around the degeneration points of the fiber.

  7. Type IIB flux vacua from G-theory I

    International Nuclear Information System (INIS)

    Candelas, Philip; Constantin, Andrei; Damian, Cesar; Larfors, Magdalena; Morales, Jose Francisco

    2015-01-01

    We construct non-perturbatively exact four-dimensional Minkowski vacua of type IIB string theory with non-trivial fluxes. These solutions are found by gluing together, consistently with U-duality, local solutions of type IIB supergravity on T^4×ℂ with the metric, dilaton and flux potentials varying along ℂ and the flux potentials oriented along T^4. We focus on solutions locally related via U-duality to non-compact Ricci-flat geometries. More general solutions and a complete analysis of the supersymmetry equations are presented in the companion paper http://arxiv.org/abs/1411.4786. We build a precise dictionary between fluxes in the global solutions and the geometry of an auxiliary K3 surface fibered over ℂℙ^1. In the spirit of F-theory, the flux potentials are expressed in terms of locally holomorphic functions that parametrize the complex structure moduli space of the K3 fiber in the auxiliary geometry. The brane content is inferred from the monodromy data around the degeneration points of the fiber.

  8. Superposition Quantification

    Science.gov (United States)

    Chang, Li-Na; Luo, Shun-Long; Sun, Yuan

    2017-11-01

    The principle of superposition is universal and lies at the heart of quantum theory. Although superposition has occupied a central and pivotal place ever since the inception of quantum mechanics a century ago, rigorous and systematic studies of the quantification issue have attracted significant interest only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by the Science Challenge Project under Grant No. TZ2016002, the Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing, and the Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, under Grant No. 2008DP173182.

  9. Electrical detection and quantification of single and mixed DNA nucleotides in suspension

    Science.gov (United States)

    Ahmad, Mahmoud Al; Panicker, Neena G.; Rizvi, Tahir A.; Mustafa, Farah

    2016-09-01

    High-speed sequential identification of the building blocks of DNA (deoxyribonucleotides, or nucleotides for short), without labeling or processing, in long reads of DNA is the need of the hour. This can be accomplished by exploiting their unique electrical properties. In this study, the four different types of nucleotides that constitute a DNA molecule were suspended in a buffer, followed by several types of electrical measurements. These electrical parameters were then used to quantify the suspended DNA nucleotides. Thus, we present a purely electrical counting scheme based on semiconductor theory that allows one to determine the number of nucleotides in a solution by measuring their capacitance-voltage dependency. The nucleotide count was observed to be similar to the product of the corresponding dopant concentration and the Debye volume, after de-embedding the buffer contribution. The presented approach allows for fast and label-free quantification of single and mixed nucleotides in a solution.
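
    The counting relation quoted above, nucleotide count ≈ extracted 'dopant' concentration × Debye volume, is easy to illustrate numerically. The sketch below uses textbook constants for an aqueous buffer at room temperature and a placeholder concentration; none of the numbers come from the study.

```python
import math

q = 1.602e-19           # elementary charge [C]
kT = 0.0259 * q         # thermal energy at room temperature [J]
eps = 80 * 8.854e-12    # permittivity of an aqueous buffer [F/m]
N = 1e21                # 'dopant' concentration from a C-V fit [1/m^3] (assumed)

debye_length = math.sqrt(eps * kT / (q ** 2 * N))         # [m]
debye_volume = (4.0 / 3.0) * math.pi * debye_length ** 3  # [m^3]
count = N * debye_volume   # nucleotides per Debye volume, per the abstract
print(f"Debye length: {debye_length * 1e9:.0f} nm, count: {count:.0f}")
```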

  10. Branes at Singularities in Type 0 String Theory

    OpenAIRE

    Alishahiha, M; Brandhuber, A; Oz, Y

    1999-01-01

    We consider Type 0B D3-branes placed at conical singularities and analyze in detail the conifold singularity. We study the non-supersymmetric gauge theories on their worldvolume and their conjectured dual gravity descriptions. In the ultraviolet the solutions exhibit a logarithmic running of the gauge coupling. In the infrared we find confining solutions and IR fixed points.

  11. Application of adaptive hierarchical sparse grid collocation to the uncertainty quantification of nuclear reactor simulators

    Energy Technology Data Exchange (ETDEWEB)

    Yankov, A.; Downar, T. [University of Michigan, 2355 Bonisteel Blvd, Ann Arbor, MI 48109 (United States)

    2013-07-01

    Recent efforts in the application of uncertainty quantification to nuclear systems have utilized methods based on generalized perturbation theory and stochastic sampling. While these methods have proven to be effective, they both have major drawbacks that may impede further progress. A relatively new approach based on spectral elements for uncertainty quantification is applied in this paper to several problems in reactor simulation. Spectral methods based on collocation attempt to couple the approximation-free nature of stochastic sampling methods with the determinism of generalized perturbation theory. The specific spectral method used in this paper employs both the Smolyak algorithm and adaptivity by using Newton-Cotes collocation points along with linear hat basis functions. Using this approach, a surrogate model for the outputs of a computer code is constructed hierarchically by adaptively refining the collocation grid until the interpolant converges to a user-defined threshold. The method inherently fits into the framework of parallel computing and allows for the extraction of meaningful statistics and data that are not within reach of stochastic sampling and generalized perturbation theory. This paper aims to demonstrate the advantages of spectral methods, especially when compared to current methods used in reactor physics for uncertainty quantification, and to illustrate their full potential. (authors)
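
    A one-dimensional caricature of the hierarchical ingredient, linear hat basis functions whose coefficients are hierarchical surpluses, with refinement stopped once every new surplus falls below a tolerance, is sketched below. The actual method uses Newton-Cotes points with the Smolyak construction in many dimensions; this toy version, with assumed names and test function, only conveys the surplus-driven adaptivity.

```python
import numpy as np

def hat(x, center, width):
    """Linear hat basis function: 1 at its node, 0 beyond +/- width."""
    return max(0.0, 1.0 - abs(x - center) / width)

def build_surrogate(f, tol=1e-3, max_level=12):
    """Adaptive hierarchical interpolation of f on (0, 1): each level adds
    equidistant nodes, keeping only those whose hierarchical surplus
    (model value minus current surrogate) exceeds tol."""
    nodes = []  # (center, width, surplus) triples

    def surrogate(x):
        return sum(s * hat(x, c, w) for c, w, s in nodes)

    for level in range(1, max_level + 1):
        width = 2.0 ** -level
        centers = [i * width for i in range(1, 2 ** level, 2)]  # odd multiples
        new = [(c, width, f(c) - surrogate(c)) for c in centers]
        new = [node for node in new if abs(node[2]) > tol]
        if level > 1 and not new:   # no surplus above tol: converged
            break
        nodes.extend(new)
    return surrogate

f = lambda x: x * (1.0 - x) * np.exp(2.0 * x)  # vanishes at the boundary
g = build_surrogate(f)
print(abs(g(0.37) - f(0.37)))                  # small off-grid error
```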

  12. Direct qPCR quantification using the Quantifiler® Trio DNA quantification kit.

    Science.gov (United States)

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler® Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler® Trio kit enabled the detection of less than 10 pg of DNA in unprocessed touch samples and also minimizes the stochastic effect experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler® Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler® Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit(*)) for low-level touch DNA samples indicates that direct quantification using the Quantifiler® Trio DNA quantification kit is more reliable than the Quantifiler® Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10 pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  13. Quantification of left to right shunts in adults with atrial septal defects of secundum type, using radionuclide technique

    International Nuclear Information System (INIS)

    Sire, S.; Rootwelt, K.; Mangschau, A.

    1991-01-01

    Quantification of left to right shunt was carried out in 15 adult patients with a suspected ostium secundum atrial septal defect (ASD II). Radionuclide shunt quantitation correlated well with the results of right heart catheterization. The radionuclide technique failed in two patients for technical reasons, but revealed no false negative or false positive results when technically satisfactory. The diagnosis was confirmed at operation. It is concluded that the radionuclide technique is a useful and reliable method which can also be used at follow-up after surgery in patients with atrial septal defects of secundum type. 20 refs., 3 figs., 1 tab

  14. TYPOLOGY OF POSTMODERN THEORIES IN SOCIOLOGY: CRITERIAS, GENERAL DESCRIPTION OF TYPES AND COMPOSITION

    Directory of Open Access Journals (Sweden)

    Chudova I. A.

    2014-09-01

    Although the concept of postmodernism is well known and frequently used in sociology, its place and role in sociological theory are not sufficiently clarified. The article presents a typology for identifying theoretical cases as part of postmodern sociological theory and offers a set of criteria for the identification of such theories, among them the involvement of poststructuralist ideas, criticism of modernism, evasion of identification, deconstruction, and a conception of post-modern society. In formulating the criteria, both external references and the specific content and style of self-presentation of the theories were taken into account. On the basis of these criteria, theories are differentiated by their «saturation» with postmodern features, and «concentrated» and «liquid» types of theories are identified. These types are split into subtypes depending on the composition of the satisfied criteria. Each subtype is described using the example of the theories of J. Baudrillard, J.-F. Lyotard, R. Barthes, Z. Bauman, and A. Giddens. This typology can be used to argue for the closeness of a theory to postmodernism, to disclose the accents of a number of theories, and to systematize theory in the field of sociology as a whole.

  15. A modern elaboration of the Ramified Theory of Types

    NARCIS (Netherlands)

    Laan, T.D.L.; Nederpelt, R.P.

    1996-01-01

    The paper first formalizes the ramified type theory as (informally) described in the Principia Mathematica [32]. This formalization is close to the ideas of the Principia, but also meets contemporary requirements on formality and accuracy, and therefore is a new contribution to the known literature on the

  16. Introduction to type-2 fuzzy logic control theory and applications

    CERN Document Server

    Mendel, Jerry M; Tan, Woei-Wan; Melek, William W; Ying, Hao

    2014-01-01

    Written by world-class leaders in type-2 fuzzy logic control, this book offers a self-contained reference for both researchers and students. The coverage provides both background and an extensive literature survey on fuzzy logic and related type-2 fuzzy control. It also includes research questions, experiment and simulation results, and downloadable computer programs on an associated website. This key resource will prove useful to students and engineers wanting to learn type-2 fuzzy control theory and its applications.

  17. Water type quantification in the Skagerrak, the Kattegat and off the Jutland west coast

    Directory of Open Access Journals (Sweden)

    Trond Kristiansen

    2015-04-01

An extensive data series of salinity, nutrients and coloured dissolved organic material (CDOM) was collected in the Skagerrak, the northern part of the Kattegat and off the Jutland west coast in April each year during the period 1996–2000, by the Institute of Marine Research in Norway. In this month, after the spring bloom, German Bight Water differs from its surrounding waters by a higher nitrate content and higher nitrate/phosphate and nitrate/silicate ratios. The spreading of this water type into the Skagerrak is of special interest with regard to toxic algal blooms. The quantification of the spatial distributions of the different water types required the development of a new algorithm for the area containing the Norwegian Coastal Current, while an earlier Danish algorithm was applied for the rest of the area. From the upper 50 m a total of 2227 observations of salinity and CDOM content have been used to calculate the mean concentration of water from the German Bight, the North Sea (Atlantic water, the Baltic Sea and Norwegian rivers. The Atlantic Water was the dominant water type, with a mean concentration of 79%, German Bight Water constituted 11%, Baltic Water 8%, and Norwegian River Water 2%. At the surface the mean percentages of these water types were found to be 68%, 15%, 15%, and 3%, respectively. Within the northern part of the Skagerrak, closer to the Norwegian coast, the surface waters were estimated to consist of 74% Atlantic Water, 20% Baltic Water, and 7% Norwegian River Water. The analysis indicates that the content of German Bight Water in this part is less than 5%.
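    Water-type percentages of this kind come from a linear mixing model: each conservative tracer (salinity, CDOM, and so on) is a fraction-weighted average of source-water end-member values, solved per sample under non-negativity and sum-to-one constraints. A minimal sketch with made-up end-member values, not the paper's algorithm:

```python
import numpy as np
from scipy.optimize import lsq_linear

# Rows: tracers (salinity, CDOM proxy); columns: end-member water types
# Atlantic, German Bight, Baltic, Norwegian river (illustrative values only).
E = np.array([
    [35.2, 32.0, 8.0, 0.2],    # end-member salinity
    [0.02, 0.30, 0.60, 1.20],  # end-member CDOM absorption
])
obs = np.array([30.9, 0.18])   # one observed sample

# Append the mass-conservation row: fractions must sum to 1.
A = np.vstack([E, np.ones(4)])
b = np.append(obs, 1.0)

res = lsq_linear(A, b, bounds=(0.0, 1.0))  # non-negative fractions
print(res.x.round(3))  # mixing fractions of the four water types
```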

  18. MOTIVATION INTERNALIZATION AND SIMPLEX STRUCTURE IN SELF-DETERMINATION THEORY.

    Science.gov (United States)

    Ünlü, Ali; Dettweiler, Ulrich

    2015-12-01

    Self-determination theory, as proposed by Deci and Ryan, postulated different types of motivation regulation. As to the introjected and identified regulation of extrinsic motivation, their internalizations were described as "somewhat external" and "somewhat internal" and remained undetermined in the theory. This paper introduces a constrained regression analysis that allows these vaguely expressed motivations to be estimated in an "optimal" manner, in any given empirical context. The approach was even generalized and applied for simplex structure analysis in self-determination theory. The technique was exemplified with an empirical study comparing science teaching in a classical school class versus an expeditionary outdoor program. Based on a sample of 84 German pupils (43 girls, 41 boys, 10 to 12 years old), data were collected using the German version of the Academic Self-Regulation Questionnaire. The science-teaching format was seen to not influence the pupils' internalization of identified regulation. The internalization of introjected regulation differed and shifted more toward the external pole in the outdoor teaching format. The quantification approach supported the simplex structure of self-determination theory, whereas correlations may disconfirm the simplex structure.
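    The "constrained regression" idea can be illustrated as least squares with the internalization parameters restricted to an admissible interval, so that "somewhat external" and "somewhat internal" become estimable weights. A toy sketch (variable names and data are hypothetical, not the study's model):

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(0)
external = rng.normal(size=100)   # external-regulation scores
intrinsic = rng.normal(size=100)  # intrinsic-motivation scores
# Introjected regulation simulated as 70% external / 30% intrinsic + noise.
introjected = 0.7 * external + 0.3 * intrinsic + 0.1 * rng.normal(size=100)

A = np.column_stack([external, intrinsic])
# Constrain both weights to [0, 1]: "somewhat external, somewhat internal".
res = lsq_linear(A, introjected, bounds=(0.0, 1.0))
print(res.x.round(2))  # approximately [0.70, 0.30]
```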

  19. Ginzburg-Landau-type theory of nonpolarized spin superconductivity

    Science.gov (United States)

    Lv, Peng; Bao, Zhi-qiang; Guo, Ai-Min; Xie, X. C.; Sun, Qing-Feng

    2017-01-01

Since the concept of the spin superconductor was proposed, all related studies have concentrated on the spin-polarized case. Here, we generalize the study to the spin-non-polarized case. The free energy of the nonpolarized spin superconductor is obtained, and Ginzburg-Landau-type equations are derived by using the variational method. These Ginzburg-Landau-type equations can be reduced to the spin-polarized case when the spin direction is fixed. Moreover, the expressions of the super linear and angular spin currents inside the superconductor are derived. We demonstrate that the electric field induced by the super spin current is equal to the one induced by an equivalent charge obtained from the second Ginzburg-Landau-type equation, which shows the self-consistency of our theory. By applying these Ginzburg-Landau-type equations, the effect of an electric field on the superconductor is also studied. These results will help us get a better understanding of the spin superconductor and related topics such as the Bose-Einstein condensate of magnons and spin superfluidity.
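    For orientation, the charged-superconductor analogue of the free energy discussed above is the standard Ginzburg-Landau functional; a textbook sketch of its density (this is the conventional charge case, not the paper's spin-superconductor functional):

```latex
f = f_n + \alpha\,|\psi|^2 + \frac{\beta}{2}\,|\psi|^4
    + \frac{1}{2m^\ast}\,\left|\left(-i\hbar\nabla - q^\ast\mathbf{A}\right)\psi\right|^2
    + \frac{|\mathbf{B}|^2}{2\mu_0}
```

    Varying this density with respect to the order parameter and the vector potential yields the two coupled Ginzburg-Landau equations; the paper derives the analogous pair for the spin case by the same variational route.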

  20. Uncertainty quantification metrics for whole product life cycle cost estimates in aerospace innovation

    Science.gov (United States)

    Schwabe, O.; Shehab, E.; Erkoyuncu, J.

    2015-08-01

The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates based on a literature review, an evaluation of publicly funded projects such as part of the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defense, the ESA, and various commercial companies. The metrics are categorized based on their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). Insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families with identifiable thresholds for transitioning between them. The key research gaps identified were the lacking guidance, grounded in theory, for the selection of uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory-based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap as the basis

  1. UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.

    Science.gov (United States)

    Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel

    2013-09-01

In this short communication, UV/Vis was used as an analytical tool for the quantification of lignin concentrations in aqueous media. A significant correlation was determined between absorbance and the concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280 and 287 nm. UV-Vis quantification of lignin was found suitable for different types of biomass, making this a time-saving analytical method that could be used as a Process Analytical Tool (PAT) in biorefineries utilizing steam processes or comparable approaches. Copyright © 2013 Elsevier Ltd. All rights reserved.
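    Quantification of this kind rests on the Beer-Lambert relation: over the linear range, absorbance at a fixed wavelength is proportional to lignin concentration, so a calibration line converts absorbance readings into concentrations. A minimal sketch with invented calibration points:

```python
import numpy as np

# Hypothetical calibration standards at 280 nm: concentration (g/L) vs absorbance.
conc = np.array([0.05, 0.10, 0.20, 0.40, 0.80])
absorbance = np.array([0.11, 0.21, 0.43, 0.85, 1.70])

slope, intercept = np.polyfit(conc, absorbance, 1)  # A = slope*c + intercept

def lignin_concentration(a):
    """Invert the calibration line to get concentration from absorbance."""
    return (a - intercept) / slope

print(round(lignin_concentration(0.64), 3))  # ~0.30 g/L
```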

  2. Quantifications and Modeling of Human Failure Events in a Fire PSA

    International Nuclear Information System (INIS)

    Kang, Dae Il; Kim, Kilyoo; Jang, Seung-Cheol

    2014-01-01

USNRC and EPRI developed guidance, 'Fire Human Reliability Analysis Guidelines, NUREG-1921', for estimating human error probabilities (HEPs) for HFEs under fire conditions. NUREG-1921 classifies HFEs into four types associated with the following human actions: - Type 1: New and existing Main Control Room (MCR) actions - Type 2: New and existing ex-MCR actions - Type 3: Actions associated with using alternate shutdown means (ASD) - Type 4: Actions relating to errors of commission (EOCs) or errors of omission (EOOs) as a result of incorrect indications (SPI). In this paper, approaches for the quantification and modeling of HFEs related to Type 1, 2 and 3 human actions are introduced, based on the human reliability analysis process for a fire PSA of Hanul Unit 3. A multiplier of 10 was used to re-estimate the HEPs for the preexisting internal human actions. The HEPs for all ex-MCR actions were assumed to be one. New MCR human actions were quantified using the scoping analysis method of NUREG-1921. If a quantified human action was identified to be risk-significant, detailed approaches (modeling and quantification) were used for incorporating fire situations into it. Multiple HFEs for a single human action were defined and separately quantified to incorporate the specific fire situations into them. From this study, we can confirm that the modeling as well as the quantification of human actions is very important for treating them appropriately in PSA logic structures
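    The fire-adjustment logic described here is essentially a table transformation over the internal-events HEPs. A minimal sketch of that bookkeeping (action names and base values are invented):

```python
# Scoping-style fire adjustment of internal-events HEPs, as described above:
# pre-existing MCR actions get a multiplier of 10 (capped at 1.0),
# ex-MCR actions are conservatively set to 1.0.
base_heps = {
    "mcr_start_aux_feedwater": 1.0e-3,  # hypothetical action / value
    "mcr_depressurize_rcs":    5.0e-2,
    "exmcr_align_valve_local": 2.0e-3,
}

def fire_hep(action, hep, multiplier=10.0):
    if action.startswith("exmcr_"):
        return 1.0
    return min(hep * multiplier, 1.0)

for action, hep in base_heps.items():
    print(f"{action}: {fire_hep(action, hep):.1e}")
```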

  3. Quantifications and Modeling of Human Failure Events in a Fire PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Kim, Kilyoo; Jang, Seung-Cheol [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

USNRC and EPRI developed guidance, 'Fire Human Reliability Analysis Guidelines, NUREG-1921', for estimating human error probabilities (HEPs) for HFEs under fire conditions. NUREG-1921 classifies HFEs into four types associated with the following human actions: - Type 1: New and existing Main Control Room (MCR) actions - Type 2: New and existing ex-MCR actions - Type 3: Actions associated with using alternate shutdown means (ASD) - Type 4: Actions relating to errors of commission (EOCs) or errors of omission (EOOs) as a result of incorrect indications (SPI). In this paper, approaches for the quantification and modeling of HFEs related to Type 1, 2 and 3 human actions are introduced, based on the human reliability analysis process for a fire PSA of Hanul Unit 3. A multiplier of 10 was used to re-estimate the HEPs for the preexisting internal human actions. The HEPs for all ex-MCR actions were assumed to be one. New MCR human actions were quantified using the scoping analysis method of NUREG-1921. If a quantified human action was identified to be risk-significant, detailed approaches (modeling and quantification) were used for incorporating fire situations into it. Multiple HFEs for a single human action were defined and separately quantified to incorporate the specific fire situations into them. From this study, we can confirm that the modeling as well as the quantification of human actions is very important for treating them appropriately in PSA logic structures.

  4. On running couplings in gauge theories from type-IIB supergravity

    CERN Document Server

    Kehagias, A A

    1999-01-01

    We construct an explicit solution of type-IIB supergravity describing the strong coupling regime of a non-supersymmetric gauge theory. The latter has a running coupling with an ultraviolet stable fixed point corresponding to the N=4 SU(N) super-Yang-Mills theory at large N. The running coupling has a power law behaviour, argued to be universal, that is consistent with holography. Around the critical point, our solution defines an asymptotic expansion for the gauge coupling beta-function. We also calculate the first correction to the Coulombic quark-antiquark potential.

  5. Did natural selection make the Dutch taller? A cautionary note on the importance of quantification in understanding evolution.

    Science.gov (United States)

    Tarka, Maja; Bolstad, Geir H; Wacker, Sebastian; Räsänen, Katja; Hansen, Thomas F; Pélabon, Christophe

    2015-12-01

    One of the main achievements of the modern synthesis is a rigorous mathematical theory for evolution by natural selection. Combining this theory with statistical models makes it possible to estimate the relevant parameters so as to quantify selection and evolution in nature. Although quantification is a sign of a mature science, statistical models are unfortunately often interpreted independently of the motivating mathematical theory. Without a link to theory, numerical results do not represent proper quantifications, because they lack the connections that designate their biological meaning. Here, we want to raise awareness and exemplify this problem by examining a recent study on natural selection in a contemporary human population. Stulp et al. (2015) concluded that natural selection may partly explain the increasing stature of the Dutch population. This conclusion was based on a qualitative assessment of the presence of selection on height. Here, we provide a quantitative interpretation of these results using standard evolutionary theory to show that natural selection has had a minuscule effect. © 2015 The Author(s). Evolution © 2015 The Society for the Study of Evolution.
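    The "quantitative interpretation" the authors call for follows from the breeder's equation of quantitative genetics, R = h²S: the per-generation response to selection is the product of heritability and the selection differential. A sketch with illustrative numbers (not the study's estimates):

```python
def selection_response(h2, s):
    """Breeder's equation: response per generation = heritability * differential."""
    return h2 * s

# Hypothetical values: height heritability 0.8, selection differential 0.2 cm.
h2, s = 0.8, 0.2
r = selection_response(h2, s)
print(f"response: {r:.2f} cm per generation")  # 0.16 cm -- a minuscule effect
```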

  6. Evidence-based quantification of uncertainties induced via simulation-based modeling

    International Nuclear Information System (INIS)

    Riley, Matthew E.

    2015-01-01

The quantification of uncertainties in simulation-based modeling traditionally focuses upon quantifying uncertainties in the parameters input into the model, referred to as parametric uncertainties. Often neglected in such an approach are the uncertainties induced by the modeling process itself. This deficiency is often due to a lack of information regarding the problem or the models considered, which could theoretically be reduced through the introduction of additional data. Because of the nature of this epistemic uncertainty, traditional probabilistic frameworks utilized for the quantification of uncertainties are not necessarily applicable to quantify the uncertainties induced in the modeling process itself. This work develops and utilizes a methodology – incorporating aspects of Dempster–Shafer theory and Bayesian model averaging – to quantify uncertainties of all forms for simulation-based modeling problems. The approach expands upon classical parametric uncertainty approaches, allowing for the quantification of modeling-induced uncertainties as well, ultimately providing bounds on classical probability without the loss of epistemic generality. The approach is demonstrated on two different simulation-based modeling problems: the computation of the natural frequency of a simple two-degree-of-freedom non-linear spring-mass system and the calculation of the flutter velocity coefficient for the AGARD 445.6 wing given a subset of commercially available modeling choices. Highlights: • Modeling-induced uncertainties are often mishandled or ignored in the literature. • Modeling-induced uncertainties are epistemic in nature. • Probabilistic representations of modeling-induced uncertainties are restrictive. • Evidence theory and Bayesian model averaging are integrated. • Developed approach is applicable for simulation-based modeling problems
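    Dempster-Shafer evidence theory, one ingredient of the approach above, combines independent mass assignments over a frame of discernment and renormalizes by the conflict between them. A minimal sketch of Dempster's rule on a two-hypothesis frame (the masses are invented):

```python
from itertools import product

# Frame of discernment: {"A", "B"}; focal sets are frozensets, "AB" = ignorance.
A, B, AB = frozenset("A"), frozenset("B"), frozenset("AB")

m1 = {A: 0.6, B: 0.1, AB: 0.3}  # evidence source 1 (hypothetical)
m2 = {A: 0.4, B: 0.3, AB: 0.3}  # evidence source 2 (hypothetical)

combined, conflict = {}, 0.0
for (s1, w1), (s2, w2) in product(m1.items(), m2.items()):
    inter = s1 & s2
    if inter:
        combined[inter] = combined.get(inter, 0.0) + w1 * w2
    else:
        conflict += w1 * w2
# Dempster's rule: renormalize by the non-conflicting mass.
combined = {s: w / (1.0 - conflict) for s, w in combined.items()}
print(combined, f"conflict={conflict:.2f}")
```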

  7. Quality control assessment of human immunodeficiency virus type 2 (HIV-2) viral load quantification assays: results from an international collaboration on HIV-2 infection in 2006

    NARCIS (Netherlands)

    Damond, Florence; Benard, Antoine; Ruelle, Jean; Alabi, Abraham; Kupfer, Bernd; Gomes, Perpetua; Rodes, Berta; Albert, Jan; Böni, Jürg; Garson, Jeremy; Ferns, Bridget; Matheron, Sophie; Chene, Geneviève; Brun-Vezinet, Françoise; Goubau, Patrick; Campa, Pauline; Descamps, Diane; Simon, François; Taieb, Audrey; Autran, Brigitte; Cotten, Matt; Jaye, Assan; Peterson, Kevin; Rowland-Jones, Sarah; Rockstroh, Jürgen; Schwarze-Zander, Carolynne; de Wolf, Frank; van Sighem, Ard; Reiss, Peter; van der Loeff, Maarten Schim; Schutten, Martin; Camacho, Ricardo; Mansinho, Kamal; Antunes, Francisco; Luis, Franca; Valadas, Emilia; Toro, Carlos; Soriano, Vicente; Gyllensten, Katarina; Sonnerborg, Anders; Yilmaz, Aylin; Gisslén, Magnus; Calmy, Alexandra; Rickenbach, Martin; Pillay, Deenan; Tosswill, Jennifer; Anderson, Jane; Chadwick, David

    2008-01-01

    Human immunodeficiency virus type 2 (HIV-2) RNA quantification assays used in nine laboratories of the ACHI(E)V(2E) (A Collaboration on HIV-2 Infection) study group were evaluated. In a blinded experimental design, laboratories quantified three series of aliquots of an HIV-2 subtype A strain, each

  8. Uncovering the underlying physical mechanisms of biological systems via quantification of landscape and flux

    International Nuclear Information System (INIS)

    Xu Li; Chu Xiakun; Yan Zhiqiang; Zheng Xiliang; Zhang Kun; Zhang Feng; Yan Han; Wu Wei; Wang Jin

    2016-01-01

    In this review, we explore the physical mechanisms of biological processes such as protein folding and recognition, ligand binding, and systems biology, including cell cycle, stem cell, cancer, evolution, ecology, and neural networks. Our approach is based on the landscape and flux theory for nonequilibrium dynamical systems. This theory provides a unifying principle and foundation for investigating the underlying mechanisms and physical quantification of biological systems. (topical review)

  9. Mathematical model for biomolecular quantification using surface-enhanced Raman spectroscopy based signal intensity distributions

    DEFF Research Database (Denmark)

    Palla, Mirko; Bosco, Filippo Giacomo; Yang, Jaeyoung

    2015-01-01

    This paper presents the development of a novel statistical method for quantifying trace amounts of biomolecules by surface-enhanced Raman spectroscopy (SERS) using a rigorous, single molecule (SM) theory based mathematical derivation. Our quantification framework could be generalized for planar...

  10. Towards an integrative account of social cognition: marrying theory of mind and interactionism to study the interplay of Type 1 and Type 2 processes

    Directory of Open Access Journals (Sweden)

    Vivian eBohl

    2012-10-01

Traditional theory of mind accounts of social cognition have been at the basis of most studies in the social cognitive neurosciences. However, in recent years, the need to go beyond traditional theory of mind accounts for understanding real life social interactions has become all the more pressing. At the same time it remains unclear whether alternative accounts, such as interactionism, can yield a sufficient description and explanation of social interactions. We argue that instead of considering theory of mind and interactionism as mutually exclusive opponents, they should be integrated into a more comprehensive account of social cognition. We draw on dual process models of social cognition that contrast two different types of social cognitive processing. The first type (labelled Type 1) refers to processes that are fast, efficient, stimulus-driven, and relatively inflexible. The second type (labelled Type 2) refers to processes that are relatively slow, cognitively laborious, flexible, and may involve conscious control. We argue that while interactionism captures aspects of social cognition mostly related to Type 1 processes, theory of mind is more focused on those based on Type 2 processes. We suggest that real life social interactions are rarely based on either Type 1 or Type 2 processes alone. On the contrary, we propose that in most cases both types of processes are simultaneously involved and that social behaviour may be sustained by the interplay between these two types of processes. Finally, we discuss how the new integrative framework can guide experimental research on social interaction.

  11. Integrable systems and quantum field theory. Works in progress Nr 75

    International Nuclear Information System (INIS)

    Baird, Paul; Helein, Frederic; Kouneiher, Joseph; Roubtsov, Volodya; Antunes, Paulo; Banos, Bertrand; Barbachoux, Cecile; Desideri, Laura; Kahouadji, Nabil; Gerding, Aaron; Heller, Sebastian; Schmitt, Nicholas; Harrivel, Dikanaina; Hoevenaars, Luuk K.; Iftime, Mihaela; Levy, Thierry; Lisovyy, Oleg; Masson, Thierry; Skrypnyk, Taras; Pedit, Franz; Egeileh, Michel

    2009-01-01

The contributions of this collective book address quantum field theory (integrable systems and quantum field theory, introduction to supermanifolds and supersymmetry, beyond geometric quantization, Gaussian measures and Fock spaces), differential geometry and physics (gravitation and geometry, physical events and the superspace about the hole argument, the Cartan-Kaehler theory and applications to local isometric and conformal embedding, calibrations, Calabi-Yau structures and Monge-Ampere structures, Hamiltonian multi-symplectic formalism and Monge-Ampere equations, big bracket, derivations and derivative multi-brackets), integrable systems, geometry and physics (finite-volume correlation functions of monodromy fields on the lattice with the Toeplitz representation, Frobenius manifolds and algebraic integrability, an introduction to twistors, Hamiltonian systems on the 'coupled' curves, Nambu-Poisson mechanics and Fairlie-type integrable systems, minimal surfaces with polygonal boundary and Fuchsian equations, global aspects of integrable surface geometry), and non-commutative geometry (an informal introduction to the ideas and concepts of non-commutative geometry)

  12. Negative affectivity in cardiovascular disease: Evaluating Type D personality assessment using item response theory

    NARCIS (Netherlands)

    Emons, Wilco H.M.; Meijer, R.R.; Denollet, Johan

    2007-01-01

    Objective: Individuals with increased levels of both negative affectivity (NA) and social inhibition (SI)—referred to as type-D personality—are at increased risk of adverse cardiac events. We used item response theory (IRT) to evaluate NA, SI, and type-D personality as measured by the DS14. The
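    Item response theory, as used here, models the probability of endorsing an item as a logistic function of the latent trait; the two-parameter logistic (2PL) form has a discrimination and a difficulty parameter per item. A minimal sketch (parameter values are invented, not DS14 estimates):

```python
import math

def irt_2pl(theta, a, b):
    """2PL item response function: P(endorse | latent trait theta)."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical negative-affectivity item: discrimination 1.8, difficulty 0.5.
for theta in (-1.0, 0.0, 0.5, 1.0, 2.0):
    print(f"theta={theta:+.1f}  P={irt_2pl(theta, 1.8, 0.5):.2f}")
```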

  13. Extension of anisotropic effective medium theory to account for an arbitrary number of inclusion types

    Science.gov (United States)

    Myles, Timothy D.; Peracchio, Aldo A.; Chiu, Wilson K. S.

    2015-01-01

    The purpose of this work is to extend, to multi-components, a previously reported theory for calculating the effective conductivity of a two component mixture. The previously reported theory involved preferentially oriented spheroidal inclusions contained in a continuous matrix, with inclusions oriented relative to a principle axis. This approach was based on Bruggeman's unsymmetrical theory, and is extended to account for an arbitrary number of different inclusion types. The development begins from two well-known starting points; the Maxwell approach and the Maxwell-Garnett approach for dilute mixtures. It is shown that despite these two different starting points, the final Bruggeman type equation is the same. As a means of validating the developed expression, comparisons are made to several existing effective medium theories. It is shown that these existing theories coincide with the developed equations for the appropriate parameter set. Finally, a few example mixtures are considered to demonstrate the effect of multiple inclusions on the calculated effective property. Inclusion types of different conductivities, shapes, and orientations are considered and each of the aforementioned properties is shown to have a potentially significant impact on the calculated mixture property.
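    For contrast with the unsymmetrical development described above, the widely used symmetric Bruggeman relation for spherical inclusions, sum_i f_i (sigma_i - sigma_e)/(sigma_i + 2 sigma_e) = 0, already accommodates any number of inclusion types and can be solved numerically for the effective conductivity. A sketch of that numerical solve (phase values are illustrative; this is the textbook spherical-inclusion form, not the paper's spheroidal extension):

```python
from scipy.optimize import brentq

def bruggeman_effective(sigmas, fractions):
    """Solve the symmetric Bruggeman equation for spherical inclusions."""
    def residual(sigma_e):
        return sum(f * (s - sigma_e) / (s + 2.0 * sigma_e)
                   for s, f in zip(sigmas, fractions))
    # The root is bracketed between (almost) zero and the largest phase value.
    return brentq(residual, min(sigmas) * 1e-6 + 1e-12, max(sigmas))

# Three phases: matrix plus two inclusion types (conductivities in S/m).
sigma = bruggeman_effective([1.0, 10.0, 0.01], [0.6, 0.3, 0.1])
print(f"effective conductivity: {sigma:.3f} S/m")
```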

  14. Search of unified theory of basic types of elementary particle interactions

    International Nuclear Information System (INIS)

    Anselm, A.

    1981-01-01

Four types of forces are described (strong, weak, electromagnetic and gravitational) mediating the basic interactions of quarks and leptons, and attempts to form a unified theory of all basic interactions are reported. Concepts such as theory symmetry (e.g., invariance with respect to the Lorentz transformations) and isotopic symmetry (based on the interchangeability of particles in a given isotopic multiplet) are discussed. The gauge character of electromagnetic and gravitational interactions, the violation of gauge symmetry and the mechanism of particle confinement are described. (H.S.)

  15. Quantification in dynamic and small-animal positron emission tomography

    NARCIS (Netherlands)

    Disselhorst, Johannes Antonius

    2011-01-01

    This thesis covers two aspects of positron emission tomography (PET) quantification. The first section addresses the characterization and optimization of a small-animal PET/CT scanner. The sensitivity and resolution as well as various parameters affecting image quality (reconstruction settings, type

  16. Predictive Game Theory

    Science.gov (United States)

    Wolpert, David H.

    2005-01-01

Probability theory governs the outcome of a game: there is a distribution over mixed strategies, not a single "equilibrium". To predict a single mixed strategy one must use a loss function (external to the game's players). This provides a quantification of any strategy's rationality. We prove that rationality falls as the cost of computation rises (for players who have not previously interacted). All of this extends to games with varying numbers of players.

  17. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
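    The adjoint-based workflow reduces, at the end, to the first-order "sandwich rule": the variance of a response is the sensitivity vector propagated through the nuclear-data covariance matrix. A minimal sketch (the sensitivities and covariance here are invented, not LIFE-blanket values):

```python
import numpy as np

# Relative sensitivities dR/R per dp/p for three nuclear-data parameters
# (hypothetical adjoint-derived values).
s = np.array([0.8, -0.3, 0.1])
# Relative covariance matrix of those parameters (hypothetical).
C = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    1.0e-4]])

var = s @ C @ s  # first-order sandwich rule
print(f"relative uncertainty: {np.sqrt(var):.3%}")  # small, under 2%
```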

  18. Quantification of three-dimensional cell-mediated collagen remodeling using graph theory.

    Science.gov (United States)

    Bilgin, Cemal Cagatay; Lund, Amanda W; Can, Ali; Plopper, George E; Yener, Bülent

    2010-09-30

    Cell cooperation is a critical event during tissue development. We present the first precise metrics to quantify the interaction between mesenchymal stem cells (MSCs) and extra cellular matrix (ECM). In particular, we describe cooperative collagen alignment process with respect to the spatio-temporal organization and function of mesenchymal stem cells in three dimensions. We defined two precise metrics: Collagen Alignment Index and Cell Dissatisfaction Level, for quantitatively tracking type I collagen and fibrillogenesis remodeling by mesenchymal stem cells over time. Computation of these metrics was based on graph theory and vector calculus. The cells and their three dimensional type I collagen microenvironment were modeled by three dimensional cell-graphs and collagen fiber organization was calculated from gradient vectors. With the enhancement of mesenchymal stem cell differentiation, acceleration through different phases was quantitatively demonstrated. The phases were clustered in a statistically significant manner based on collagen organization, with late phases of remodeling by untreated cells clustering strongly with early phases of remodeling by differentiating cells. The experiments were repeated three times to conclude that the metrics could successfully identify critical phases of collagen remodeling that were dependent upon cooperativity within the cell population. Definition of early metrics that are able to predict long-term functionality by linking engineered tissue structure to function is an important step toward optimizing biomaterials for the purposes of regenerative medicine.
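    A cell-graph of the kind described is simply a graph whose nodes are cell positions and whose edges link cells closer than a threshold; graph-theoretic statistics on it then track organization over time. A minimal sketch using networkx (threshold and coordinates are made up; this is not the paper's implementation):

```python
import itertools
import networkx as nx
import numpy as np

rng = np.random.default_rng(1)
cells = rng.uniform(0, 100, size=(30, 3))  # 30 cell centroids in 3D (um)

G = nx.Graph()
G.add_nodes_from(range(len(cells)))
for i, j in itertools.combinations(range(len(cells)), 2):
    if np.linalg.norm(cells[i] - cells[j]) < 30.0:  # distance threshold
        G.add_edge(i, j)

# Simple organizational metrics of the cell population.
print("mean degree:", 2 * G.number_of_edges() / G.number_of_nodes())
print("clustering :", nx.average_clustering(G))
```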

  19. Quantification of three-dimensional cell-mediated collagen remodeling using graph theory.

    Directory of Open Access Journals (Sweden)

    Cemal Cagatay Bilgin

    2010-09-01

Cell cooperation is a critical event during tissue development. We present the first precise metrics to quantify the interaction between mesenchymal stem cells (MSCs) and extra cellular matrix (ECM). In particular, we describe cooperative collagen alignment process with respect to the spatio-temporal organization and function of mesenchymal stem cells in three dimensions. We defined two precise metrics: Collagen Alignment Index and Cell Dissatisfaction Level, for quantitatively tracking type I collagen and fibrillogenesis remodeling by mesenchymal stem cells over time. Computation of these metrics was based on graph theory and vector calculus. The cells and their three dimensional type I collagen microenvironment were modeled by three dimensional cell-graphs and collagen fiber organization was calculated from gradient vectors. With the enhancement of mesenchymal stem cell differentiation, acceleration through different phases was quantitatively demonstrated. The phases were clustered in a statistically significant manner based on collagen organization, with late phases of remodeling by untreated cells clustering strongly with early phases of remodeling by differentiating cells. The experiments were repeated three times to conclude that the metrics could successfully identify critical phases of collagen remodeling that were dependent upon cooperativity within the cell population. Definition of early metrics that are able to predict long-term functionality by linking engineered tissue structure to function is an important step toward optimizing biomaterials for the purposes of regenerative medicine.

  20. Four types of coping with COPD-induced breathlessness in daily living: a grounded theory study

    DEFF Research Database (Denmark)

    Bastrup, Lene; Dahl, Ronald; Pedersen, Preben Ulrich

    2013-01-01

People with COPD predominantly cope with breathlessness during daily living. We chose a multimodal grounded theory design that holds the opportunity to combine qualitative and quantitative data to capture and explain the multidimensional coping behaviour among people with COPD. The participants' main concern in coping with breathlessness appeared to be an endless striving to economise on resources in an effort to preserve their integrity. In this integrity-preserving process, four predominant coping types emerged, labelled 'Overrater', 'Challenger', 'Underrater', and 'Leveller'. Each coping type comprised distinctive physiological, cognitive, affective and psychosocial features constituting coping-type-specific indicators. In theory, four predominant coping types with distinct physiological, cognitive, affective and psychosocial properties are observed among people with COPD. The four coping types...

  1. Deformed type 0A matrix model and super-Liouville theory for fermionic black holes

    International Nuclear Information System (INIS)

    Ahn, Changrim; Kim, Chanju; Park, Jaemo; Suyama, Takao; Yamamoto, Masayoshi

    2006-01-01

We consider a ĉ = 1 model in the fermionic black hole background. For this purpose we consider a model which contains both the N = 1 and the N = 2 super-Liouville interactions. We propose that this model is dual to a recently proposed type 0A matrix quantum mechanics model with vortex deformations. We support our conjecture by showing that non-perturbative corrections to the free energy computed by both the matrix model and the super-Liouville theories agree exactly by treating the N = 2 interaction as a small perturbation. We also show that a two-point function on the sphere calculated from the deformed type 0A matrix model is consistent with that of the N = 2 super-Liouville theory when the N = 1 interaction becomes small. This duality between the matrix model and super-Liouville theories leads to a conjecture for arbitrary n-point correlation functions of the N = 1 super-Liouville theory on the sphere

  2. A candidate liquid chromatography mass spectrometry reference method for the quantification of the cardiac marker 1-32 B-type natriuretic peptide.

    Science.gov (United States)

    Torma, Attila F; Groves, Kate; Biesenbruch, Sabine; Mussell, Chris; Reid, Alan; Ellison, Steve; Cramer, Rainer; Quaglia, Milena

    2017-08-28

B-type natriuretic peptide (BNP) is a 32 amino acid cardiac hormone routinely measured by immunoassays to diagnose heart failure. While it is reported that immunoassay results can vary by up to 45%, no attempt at standardization and/or harmonization through the development of certified reference materials (CRMs) or reference measurement procedures (RMPs) has yet been carried out. The B-type natriuretic peptide primary calibrator was quantified traceably to the International System of Units (SI) by both amino acid analysis and tryptic digestion. A method for the stabilization of BNP in plasma followed by protein precipitation, solid phase extraction (SPE) and liquid chromatography (LC) mass spectrometry (MS) was then developed and validated for the quantification of BNP at clinically relevant concentrations (15-150 fmol/g). The candidate reference method was applied to the quantification of BNP in a number of samples from the UK NEQAS Cardiac Markers Scheme to demonstrate its applicability to generate reference values and to preliminarily evaluate the commutability of a potential CRM. The results from the reference method were consistently lower than the immunoassay results, and discrepancy between the immunoassays was observed, confirming previous data. The application of the liquid chromatography-mass spectrometry (LC-MS) method to the UK NEQAS samples and the correlation of the results with the immunoassay results show the potential of the method to support external quality assessment schemes, to improve understanding of the bias of the assays and to establish RMPs for BNP measurements. Furthermore, the method has the potential to be multiplexed for monitoring circulating truncated forms of BNP.

  3. The theory of discrete barriers and its applications to linear boundary-value problems of the 'Dirichlet type'

    Energy Technology Data Exchange (ETDEWEB)

    Jamet, P [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1967-07-01

This report gives a general presentation of the theory of barriers for finite-difference operators and its applications to certain classes of linear elliptic problems of the 'Dirichlet type'. (author)

  4. Techniques for quantification of liver fat in risk stratification of diabetics

    International Nuclear Information System (INIS)

    Kuehn, J.P.; Spoerl, M.C.; Mahlke, C.; Hegenscheid, K.

    2015-01-01

Fatty liver disease plays an important role in the development of type 2 diabetes. Accurate techniques for the detection and quantification of liver fat are essential for clinical diagnostics. Chemical shift-encoded magnetic resonance imaging (MRI) is a simple approach to quantify liver fat content. Liver fat quantification using chemical shift-encoded MRI is influenced by several bias factors, such as T2* decay, T1 recovery and the multispectral complexity of fat. The confounder-corrected proton density fat fraction is a simple approach to quantify liver fat with comparable results independent of the software and hardware used. The proton density fat fraction is an accurate biomarker for the assessment of liver fat. An accurate and reproducible quantification of liver fat using chemical shift-encoded MRI requires a calculation of the proton density fat fraction. (orig.)
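    The proton density fat fraction referred to above is defined voxel-wise as the fat signal over the total (fat plus water) signal, once the listed confounders have been corrected. A minimal sketch of that final arithmetic (array values are illustrative):

```python
import numpy as np

# Confounder-corrected water and fat signal magnitudes per voxel (a.u.).
water = np.array([95.0, 80.0, 60.0])
fat   = np.array([ 5.0, 20.0, 40.0])

pdff = 100.0 * fat / (fat + water)  # proton density fat fraction in percent
print(pdff)                         # [ 5. 20. 40.]
```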

  5. Quantification of complex modular architecture in plants.

    Science.gov (United States)

    Reeb, Catherine; Kaandorp, Jaap; Jansson, Fredrik; Puillandre, Nicolas; Dubuisson, Jean-Yves; Cornette, Raphaël; Jabbour, Florian; Coudert, Yoan; Patiño, Jairo; Flot, Jean-François; Vanderpoorten, Alain

    2018-04-01

    Morphometrics, the assignment of quantities to biological shapes, is a powerful tool to address taxonomic, evolutionary, functional and developmental questions. We propose a novel method for shape quantification of complex modular architecture in thalloid plants, whose extremely reduced morphologies, combined with the lack of a formal framework for thallus description, have long rendered taxonomic and evolutionary studies extremely challenging. Using graph theory, thalli are described as hierarchical series of nodes and edges, allowing for accurate, homologous and repeatable measurements of widths, lengths and angles. The computer program MorphoSnake was developed to extract the skeleton and contours of a thallus and automatically acquire, at each level of organization, width, length, angle and sinuosity measurements. Through the quantification of leaf architecture in Hymenophyllum ferns (Polypodiopsida) and a fully worked example of integrative taxonomy in the taxonomically challenging thalloid liverwort genus Riccardia, we show that MorphoSnake is applicable to all ramified plants. This new possibility of acquiring large numbers of quantitative traits in plants with complex modular architectures opens new perspectives of applications, from the development of rapid species identification tools to evolutionary analyses of adaptive plasticity. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.

  6. Hartree-type approximation applied to a phi4 field theory

    International Nuclear Information System (INIS)

    Chang, S.-J.

    1976-01-01

Recently, there has been considerable interest in studying relativistic field theories by means of nonperturbative methods. These studies are partially motivated by the now fashionable physical picture that hadrons are created from an 'abnormal vacuum state'. This abnormal vacuum state is the ground state associated with a spontaneously broken symmetry and is usually characterized by the non-vanishing expectation value of one or more scalar fields. Presently, nearly all understanding of hadrons in the above description is based on semi-classical calculations. It is important to know how significant the effects of the quantum corrections are. Some results on the quantum fluctuations in a phi4 field theory based on a self-consistent Hartree-type approximation are described. (Auth.)

  7. Protocol for Quantification of Defects in Natural Fibres for Composites

    DEFF Research Database (Denmark)

    Mortensen, Ulrich Andreas; Madsen, Bo

    2014-01-01

Natural bast-type plant fibres are attracting increasing interest for use in structural composite applications where high-quality fibres with good mechanical properties are required. A protocol for the quantification of defects in natural fibres is presented. The protocol is based

  8. Cosmic string solution in a Born-Infeld type theory of gravity

    International Nuclear Information System (INIS)

    Rocha, W.J. da; Guimaraes, M.E.X.

    2009-01-01

Advances in the formal structure of string theory point to the emergence, and necessity, of a scalar-tensorial theory of gravity. It seems that, at least at high energy scales, Einstein's theory is not enough to explain gravitational phenomena. In other words, the existence of a scalar (gravitational) field acting as a mediator of the gravitational interaction, together with the usual purely rank-2 tensorial field, is a natural prediction of unification models such as supergravity, superstrings and M-theory. This type of modified gravitation was first introduced in a different context in the 1960s in order to incorporate Mach's principle into relativity, but it has since acquired a different sense in cosmology and gravity theories. Although such unification theories are the most acceptable, they all live in higher-dimensional spaces. The compactification from these higher dimensions to 4-dimensional physics is not unique, and there exist many effective theories of gravity arising from the unification process. Each of them must, of course, satisfy certain predictions. In this paper we deal with one of them, the so-called NDL theory. One important assumption in General Relativity is that all fields interact in the same way with gravity. This is the so-called Strong Equivalence Principle (SEP). It is well known, with good accuracy, that this is true for the matter-to-matter interaction, i.e. the Weak Equivalence Principle (WEP) is tested. But, until now, there is no direct observational confirmation of this statement for the gravity-to-gravity interaction. An extension of the field-theoretical description of General Relativity is used to propose an alternative field theory of gravity. In this theory gravitons propagate in a different spacetime. The velocity of propagation of gravitational waves in this theory does not coincide with the predictions of General Relativity. (author)

  9. Massive IIA string theory and Matrix theory compactification

    International Nuclear Information System (INIS)

    Lowe, David A.; Nastase, Horatiu; Ramgoolam, Sanjaye

    2003-01-01

    We propose a Matrix theory approach to Romans' massive Type IIA supergravity. It is obtained by applying the procedure of Matrix theory compactifications to Hull's proposal of the massive Type IIA string theory as M-theory on a twisted torus. The resulting Matrix theory is a super-Yang-Mills theory on large N three-branes with a space-dependent noncommutativity parameter, which is also independently derived by a T-duality approach. We give evidence showing that the energies of a class of physical excitations of the super-Yang-Mills theory show the correct symmetry expected from massive Type IIA string theory in a lightcone quantization

  10. Module-based Hybrid Uncertainty Quantification for Multi-physics Applications: Theory and Software

    Energy Technology Data Exchange (ETDEWEB)

    Tong, Charles [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chen, Xiao [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Iaccarino, Gianluca [Stanford Univ., CA (United States); Mittal, Akshay [Stanford Univ., CA (United States)

    2013-10-08

In this project we proposed to develop an innovative uncertainty quantification methodology that captures the best of the two competing approaches in UQ, namely, intrusive and non-intrusive approaches. The idea is to develop the mathematics and the associated computational framework and algorithms to facilitate the use of intrusive or non-intrusive UQ methods in different modules of a multi-physics multi-module simulation model in a way that physics code developers for different modules are shielded (as much as possible) from the chores of accounting for the uncertainties introduced by the other modules. As a result of our research and development, we have produced a number of publications, conference presentations, and a software product.

  11. Non-renormalisation conditions in type II string theory and maximal supergravity

    International Nuclear Information System (INIS)

    Green, Michael B.; Russo, Jorge G.; Vanhove, Pierre

    2007-01-01

This paper considers general features of the derivative expansion of Feynman diagram contributions to the four-graviton scattering amplitude in eleven-dimensional supergravity compactified on a two-torus. These are translated into statements about interactions of the form D^{2k}R^4 in type II superstring theories, assuming the standard M-theory/string theory duality relationships, which provide powerful constraints on the effective interactions. In the ten-dimensional IIA limit we find that there can be no perturbative contributions beyond k string loops (for k>0). Furthermore, the genus h = k contributions are determined exactly by the one-loop eleven-dimensional supergravity amplitude for all values of k. A plausible interpretation of these observations is that the sum of h-loop Feynman diagrams of maximally extended supergravity is less divergent than might be expected and could be ultraviolet finite in dimensions d<4+6/h - the same bound as for N = 4 Yang-Mills

  12. Non-renormalisation conditions in type II string theory and maximal supergravity

    Science.gov (United States)

    Green, Michael B.; Russo, Jorge G.; Vanhove, Pierre

    2007-02-01

This paper considers general features of the derivative expansion of Feynman diagram contributions to the four-graviton scattering amplitude in eleven-dimensional supergravity compactified on a two-torus. These are translated into statements about interactions of the form D^{2k}R^4 in type II superstring theories, assuming the standard M-theory/string theory duality relationships, which provide powerful constraints on the effective interactions. In the ten-dimensional IIA limit we find that there can be no perturbative contributions beyond k string loops (for k>0). Furthermore, the genus h = k contributions are determined exactly by the one-loop eleven-dimensional supergravity amplitude for all values of k. A plausible interpretation of these observations is that the sum of h-loop Feynman diagrams of maximally extended supergravity is less divergent than might be expected and could be ultraviolet finite in dimensions d<4+6/h - the same bound as for N = 4 Yang-Mills.

  13. Universal properties of type IIB and F-theory flux compactifications at large complex structure

    International Nuclear Information System (INIS)

    Marsh, M.C. David; Sousa, Kepa

    2016-01-01

We consider flux compactifications of type IIB string theory and F-theory in which the respective superpotentials at large complex structure are dominated by cubic or quartic terms in the complex structure moduli. In this limit, the low-energy effective theory exhibits universal properties that are insensitive to the details of the compactification manifold or the flux configuration. Focussing on the complex structure and axio-dilaton sector, we show that there are no vacua in this region and the spectrum of the Hessian matrix is highly peaked and consists only of three distinct eigenvalues (0, 2m_{3/2}^2 and 8m_{3/2}^2), independently of the number of moduli. We briefly comment on how the inclusion of Kähler moduli affects these findings. Our results generalise those of Brodie & Marsh http://dx.doi.org/10.1007/JHEP01(2016)037, in which these universal properties were found in a subspace of the large complex structure limit of type IIB compactifications.

  14. Quantification in emission tomography

    International Nuclear Information System (INIS)

    Buvat, Irene

    2011-11-01

The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) and positron emission tomography (PET) images, and to identify the conditions to be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: quantification in emission tomography - definition and challenges; quantification-biasing phenomena. 2 - Main problems impacting quantification in PET and SPECT: problems, consequences, correction methods and results (attenuation, scattering, partial volume effect, movement, non-stationary spatial resolution in SPECT, random coincidences in PET, standardisation in PET). 3 - Synthesis: accessible efficiency, know-how, precautions, beyond the activity measurement
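    A concrete example of the quantitative measurements at stake is the standardized uptake value commonly reported in PET: the decay-corrected tissue concentration normalized by injected dose per unit body weight. A minimal sketch (numbers are illustrative):

```python
def suv(tissue_kbq_per_ml, injected_dose_mbq, body_weight_kg):
    """Standardized uptake value (body-weight normalization).

    SUV = tissue concentration / (injected dose / body weight).
    Units: (kBq/mL) / (kBq/g), assuming tissue density ~1 g/mL.
    """
    dose_kbq_per_g = 1000.0 * injected_dose_mbq / (body_weight_kg * 1000.0)
    return tissue_kbq_per_ml / dose_kbq_per_g

# 5 kBq/mL in tissue, 350 MBq injected, 70 kg patient -> SUV 1.0.
print(suv(5.0, 350.0, 70.0))
```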

  15. Accident sequence quantification with KIRAP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong

    1997-01-01

The tasks of a probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data and, finally, the accident sequence quantification. In a PSA, the accident sequence quantification calculates the core damage frequency and performs importance and uncertainty analyses. Accident sequence quantification requires an understanding of the whole PSA model, because all event tree and fault tree models have to be combined, and it requires an efficient computer code because of the long computation time. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures to perform accident sequence quantification, the method to use KIRAP's cut set generator, and the method to perform accident sequence quantification with KIRAP. (author). 6 refs.

  16. Accident sequence quantification with KIRAP

    International Nuclear Information System (INIS)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong.

    1997-01-01

The tasks of a probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data and, finally, the accident sequence quantification. In a PSA, the accident sequence quantification calculates the core damage frequency and performs importance and uncertainty analyses. Accident sequence quantification requires an understanding of the whole PSA model, because all event tree and fault tree models have to be combined, and it requires an efficient computer code because of the long computation time. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures to perform accident sequence quantification, the method to use KIRAP's cut set generator, and the method to perform accident sequence quantification with KIRAP. (author). 6 refs.
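    Accident sequence quantification of the kind KIRAP performs ultimately evaluates minimal cut sets: each cut set's frequency is the product of its basic-event probabilities, and the sequence frequency is approximated by their (rare-event) sum. A minimal sketch with invented basic events (this is the generic arithmetic, not KIRAP's actual interface):

```python
from math import prod

# Basic-event probabilities (hypothetical initiating event and failures).
basic = {"IE": 1.0e-2, "PUMP_A": 3.0e-3, "PUMP_B": 3.0e-3, "VALVE": 1.0e-4}

# Minimal cut sets of one accident sequence (each a set of basic events).
cut_sets = [{"IE", "PUMP_A", "PUMP_B"}, {"IE", "VALVE"}]

# Rare-event approximation: sequence frequency ~ sum of cut-set products.
freq = sum(prod(basic[e] for e in cs) for cs in cut_sets)
print(f"sequence frequency: {freq:.2e} per year")  # 1.09e-06
```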

  17. Nilpotent symmetries and Curci-Ferrari-type restrictions in 2D non-Abelian gauge theory: Superfield approach

    Science.gov (United States)

    Srinivas, N.; Malik, R. P.

    2017-11-01

We derive the off-shell nilpotent symmetries of the two (1 + 1)-dimensional (2D) non-Abelian 1-form gauge theory by using the theoretical techniques of the geometrical superfield approach to the Becchi-Rouet-Stora-Tyutin (BRST) formalism. For this purpose, we exploit the augmented version of the superfield approach (AVSA) and derive the theoretically useful nilpotent (anti-)BRST and (anti-)co-BRST symmetries and Curci-Ferrari (CF)-type restrictions for the self-interacting 2D non-Abelian 1-form gauge theory (where there is no interaction with matter fields). The derivation of the (anti-)co-BRST symmetries and of all possible CF-type restrictions is a completely novel result within the framework of the AVSA to the BRST formalism, where the ordinary 2D non-Abelian theory is generalized onto an appropriately chosen (2, 2)-dimensional supermanifold. The latter is parametrized by the superspace coordinates Z^M = (x^μ, θ, θ̄), where x^μ (with μ = 0, 1) are the bosonic coordinates and the pair of Grassmannian variables (θ, θ̄) obey the relationships θ² = θ̄² = 0, θθ̄ + θ̄θ = 0. The topological nature of our 2D theory allows the existence of a tower of CF-type restrictions.

  18. A sufficient condition for de Sitter vacua in type IIB string theory

    Energy Technology Data Exchange (ETDEWEB)

    Rummel, Markus [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Westphal, Alexander [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2011-07-15

We derive a sufficient condition for realizing meta-stable de Sitter vacua with small positive cosmological constant within type IIB string theory flux compactifications with spontaneously broken supersymmetry. There are a number of 'lamp post' constructions of de Sitter vacua in type IIB string theory and supergravity. We show that one of them - the method of 'Kaehler uplifting' by F-terms from an interplay between non-perturbative effects and the leading α'-correction - allows for a more general parametric understanding of the existence of de Sitter vacua. The result is a condition on the values of the flux induced superpotential and the topological data of the Calabi-Yau compactification, which guarantees the existence of a meta-stable de Sitter vacuum if met. Our analysis explicitly includes the stabilization of all moduli, i.e. the Kaehler, dilaton and complex structure moduli, by the interplay of the leading perturbative and non-perturbative effects at parametrically large volume. (orig.)

  19. Gödel and Gödel-type universes in Brans–Dicke theory

    Energy Technology Data Exchange (ETDEWEB)

    Agudelo, J.A., E-mail: jaar@fisica.ufmt.br [Instituto de Física, Universidade Federal de Mato Grosso, 78060-900, Cuiabá, Mato Grosso (Brazil); Nascimento, J.R., E-mail: jroberto@fisica.ufpb.br [Departamento de Física, Universidade Federal da Paraíba, Caixa Postal 5008, 58051-970, João Pessoa, Paraíba (Brazil); Petrov, A.Yu., E-mail: petrov@fisica.ufpb.br [Departamento de Física, Universidade Federal da Paraíba, Caixa Postal 5008, 58051-970, João Pessoa, Paraíba (Brazil); Porfírio, P.J., E-mail: pporfirio@fisica.ufpb.br [Departamento de Física, Universidade Federal da Paraíba, Caixa Postal 5008, 58051-970, João Pessoa, Paraíba (Brazil); Santos, A.F., E-mail: alesandroferreira@fisica.ufmt.br [Instituto de Física, Universidade Federal de Mato Grosso, 78060-900, Cuiabá, Mato Grosso (Brazil); Department of Physics and Astronomy, University of Victoria, 3800 Finnerty Road Victoria, BC (Canada)

    2016-11-10

In this paper, conditions for the existence of Gödel and Gödel-type solutions in Brans–Dicke (BD) scalar–tensor theory and their main features are studied. The consistency of the equations of motion, causality violation and the existence of CTCs (closed time-like curves) are investigated. The role which the cosmological constant and the Mach principle play in achieving the consistency of this model is studied.

  20. Cognitive Load Theory: How Many Types of Load Does It Really Need?

    Science.gov (United States)

    Kalyuga, Slava

    2011-01-01

    Cognitive load theory has been traditionally described as involving three separate and additive types of load. Germane load is considered as a learning-relevant load complementing extraneous and intrinsic load. This article argues that, in its traditional treatment, germane load is essentially indistinguishable from intrinsic load, and therefore…

  1. Level 2 probabilistic event analyses and quantification

    International Nuclear Information System (INIS)

    Boneham, P.

    2003-01-01

In this paper an example of the quantification of a severe accident phenomenological event is given. The analysis performed to assess the probability that the debris released from the reactor vessel was in a coolable configuration in the lower drywell is presented, together with the assessment of the type of core/concrete attack that would occur. The evaluation of ex-vessel debris coolability by an event in the Simplified Boiling Water Reactor (SBWR) Containment Event Tree (CET), and a detailed Decomposition Event Tree (DET) developed to aid in the quantification of this CET event, are considered. The headings in the DET, selected to represent plant physical states (e.g., reactor vessel pressure at the time of vessel failure) and the uncertainties associated with the occurrence of critical physical phenomena (e.g., debris configuration in the lower drywell) considered important to assessing whether the debris was coolable or not coolable ex-vessel, are also discussed.
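    Quantifying a CET event through a decomposition event tree amounts to summing, over all DET paths that end in a given outcome, the product of the conditional branch probabilities along each path. A small sketch with invented branch probabilities:

```python
# Decomposition event tree as nested branches: (probability, outcome-or-subtree).
# Probabilities are invented for illustration.
det = [
    (0.7, [  # vessel at low pressure at failure
        (0.8, "coolable"),
        (0.2, "not coolable"),
    ]),
    (0.3, [  # vessel at high pressure at failure
        (0.4, "coolable"),
        (0.6, "not coolable"),
    ]),
]

def outcome_probability(tree, target):
    """Sum path products over all DET paths ending in `target`."""
    total = 0.0
    for p, node in tree:
        if isinstance(node, list):
            total += p * outcome_probability(node, target)
        elif node == target:
            total += p
    return total

print(outcome_probability(det, "coolable"))  # 0.7*0.8 + 0.3*0.4 = 0.68
```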

  2. XPS quantification of the hetero-junction interface energy

    International Nuclear Information System (INIS)

    Ma, Z.S.; Wang Yan; Huang, Y.L.; Zhou, Z.F.; Zhou, Y.C.; Zheng Weitao; Sun, Chang Q.

    2013-01-01

    Highlights: ► Quantum entrapment or polarization dictates the performance of dopants, impurities, interfaces, alloys and compounds. ► Interface bond energy, energy density, and atomic cohesive energy can be determined using XPS and our BOLS theory. ► Presents a new and reliable method for catalyst design and identification. ► Entrapment makes CuPd a p-type catalyst, and polarization makes AgPd an n-type catalyst. - Abstract: We present an approach for quantifying the heterogeneous interface bond energy using X-ray photoelectron spectroscopy (XPS). Firstly, from analyzing the XPS core-level shifts of the elemental surfaces we obtained the energy levels of an isolated atom and their bulk shifts for the constituent elements as references; then we measured the shifts of the specific energy levels upon interface alloy formation. Subtracting the referential spectrum from that collected from the alloy, we can distil the interface effect on the binding energy. Calibrated against the energy levels and their bulk shifts derived from elemental surfaces, we can derive the bond energy, energy density, atomic cohesive energy, and free energy at the interface region. This approach has enabled us to clarify the dominance of quantum entrapment at the CuPd interface and the dominance of polarization at the AgPd and BeW interfaces as the origin of the interface energy change. The developed approach not only enhances the power of XPS but also enables quantification of the interface energy at the atomic scale, which has long been a challenge.

  3. BIonic system: Extraction of Lovelock gravity from a Born-Infeld-type theory

    Science.gov (United States)

    Naimi, Yaghoob; Sepehri, Alireza; Ghaffary, Tooraj; Ghaforyan, Hossein; Ebrahimzadeh, Majid

    It was shown that both Lovelock gravity and Born-Infeld (BI) electrodynamics can be obtained from the low-energy effective limit of string theory. Motivated by this common origin of the gauge and gravity theories, we look for a close relation between them. In this research, we start from the Lagrangian of a BI-type nonlinear electrodynamics with an exponential form to extract the action of Lovelock gravity. We investigate the origin of Lovelock gravity in a system of branes which are connected with each other by different wormholes through a BIonic system. These wormholes are produced by the nonlinear electrodynamics that emerges on the interacting branes. As the branes approach each other, the wormholes dissolve into the branes and Lovelock gravity is generated. Also, the throats of some wormholes become smaller than their horizons, and these wormholes turn into black holes. Generalizing the calculations to M-theory, it is found that by compactifying Mp-branes, Lovelock gravity changes to nonlinear electrodynamics, and thus both of them have the same origin. This result is consistent with the prediction of the BIonic model in string theory.

  4. ON HAMILTONIAN FORMULATIONS AND CONSERVATION LAWS FOR PLATE THEORIES OF VEKUA-AMOSOV TYPE

    Directory of Open Access Journals (Sweden)

    Sergey I. Zhavoronok

    2017-12-01

    Some variants of the generalized Hamiltonian formulation of the plate theory of I. N. Vekua–A. A. Amosov type are presented. The infinite-dimensional formulation with one evolution variable, or an “instantaneous” formalism, as well as the de Donder–Weyl one, are considered, and their application to the numerical simulation of shell and plate dynamics is briefly discussed. The main conservation laws are formulated for the general plate theory of Nth order, and the possible motion integrals are introduced.

  5. The hexagon gauge anomaly in type I superstring theory

    International Nuclear Information System (INIS)

    Green, M.B.; Schwarz, J.H.

    1985-01-01

    Hexagon diagrams with external on-mass-shell Yang-Mills gauge particles are investigated in type I superstring theory. Both the annulus and the Moebius-strip diagrams are shown to give anomalies, implying that spurious longitudinal modes cannot be consistently decoupled. However, the anomalies cancel when the two diagrams are added together if the gauge group is chosen to be SO(32). In carrying out the analysis, two different regulators are considered, but the same conclusions emerge in both cases. We point out where various terms in the low-energy effective action originate in superstring diagrams. (orig.)

  6. Nuclear and mitochondrial DNA quantification of various forensic materials.

    Science.gov (United States)

    Andréasson, H; Nilsson, M; Budowle, B; Lundberg, H; Allen, M

    2006-12-01

    Due to the different types and quality of forensic evidence materials, their DNA content can vary substantially, and particularly low quantities can impact the results of an identification analysis. In this study, the quantity of mitochondrial and nuclear DNA was determined in a variety of materials using a previously described real-time PCR method. DNA quantification in the roots and distal sections of plucked and shed head hairs revealed large variations in DNA content, particularly between the root and the shaft of plucked hairs. Large intra- and inter-individual variations were also found among hairs. In addition, DNA content was estimated in samples collected from fingerprints and accessories. The quantification of DNA on various items also displayed large variations, with some materials containing large amounts of nuclear DNA, while no detectable nuclear DNA and only limited amounts of mitochondrial DNA were seen in others. Using this sensitive real-time PCR quantification assay, a better understanding was obtained regarding the DNA content and variation in commonly analysed forensic evidence materials, and this may guide the forensic scientist as to the best molecular biology approach for analysing various forensic evidence materials.
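
    As a rough illustration of how a real-time PCR assay turns measured signals into DNA quantities, the Python sketch below fits the usual log-linear standard curve and inverts it for an unknown sample. The dilution series, Ct values and the sample Ct are invented for illustration and are not taken from the study.

        import numpy as np

        def fit_standard_curve(log10_quantities, ct_values):
            # Fit the log-linear qPCR standard curve: Ct = slope*log10(Q) + intercept
            slope, intercept = np.polyfit(log10_quantities, ct_values, 1)
            efficiency = 10 ** (-1.0 / slope) - 1.0  # 1.0 corresponds to 100% efficiency
            return slope, intercept, efficiency

        def quantify(ct, slope, intercept):
            # Invert the standard curve to estimate the starting DNA quantity
            return 10 ** ((ct - intercept) / slope)

        # Hypothetical dilution series: 10^5 ... 10^1 copies
        log_q = np.array([5.0, 4.0, 3.0, 2.0, 1.0])
        cts = np.array([18.1, 21.5, 24.8, 28.2, 31.6])
        slope, intercept, eff = fit_standard_curve(log_q, cts)
        print("slope=%.2f, efficiency=%.0f%%" % (slope, eff * 100))
        print("sample at Ct=26.0 -> ~%.0f copies" % quantify(26.0, slope, intercept))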

  7. Kaluza–Klein-type models of de Sitter and Poincaré gauge theories of gravity

    International Nuclear Information System (INIS)

    Lu Jiaan; Huang Chaoguang

    2013-01-01

    We construct Kaluza–Klein-type models with a de Sitter or Minkowski bundle in the de Sitter or Poincaré gauge theory of gravity, respectively. A manifestly gauge-invariant formalism has been given. The gravitational dynamics is constructed by the geometry of the de Sitter or Minkowski bundle and a global section which plays an important role in the gauge-invariant formalism. Unlike the old Kaluza–Klein-type models of gauge theory of gravity, a suitable cosmological term can be obtained in the Lagrangian of our models and the models in the spin-current-free and torsion-free limit will come back to general relativity with a corresponding cosmological term. We also generalize the results to the case with a variable cosmological term. (paper)

  8. On SYM theory and all order bulk singularity structures of BPS strings in type II theory

    Science.gov (United States)

    Hatefi, Ehsan

    2018-06-01

    The complete forms of the S-matrix elements of a transverse scalar field, two world-volume gauge fields, and a potential C_{n-1} Ramond-Ramond (RR) form field are investigated. In order to find an infinite number of t-, s-, and (t + s + u)-channel bulk singularity structures of this particular mixed open-closed amplitude, we employ all the conformal field theory techniques, exploring the entire correlation functions and all-order α' contact interactions to these supersymmetric Yang-Mills (SYM) couplings. Singularity and contact-term comparisons with the other symmetric analyses are also carried out in detail. Various couplings from the pull-back of branes, Myers terms and several generalized Bianchi identities should be taken into account to be able to reconstruct all-order α' bulk singularities of type IIB (IIA) superstring theory. Finally, we make a comment on how to derive, without any ambiguity, all-order α' contact terms of this S-matrix, which carry the momentum of the RR field in the transverse directions.

  9. Existence theory for sequential fractional differential equations with anti-periodic type boundary conditions

    Directory of Open Access Journals (Sweden)

    Aqlan Mohammed H.

    2016-01-01

    We develop the existence theory for sequential fractional differential equations involving the Liouville-Caputo fractional derivative equipped with anti-periodic type (non-separated) and nonlocal integral boundary conditions. Several existence criteria depending on the nonlinearity involved in the problems are presented by means of a variety of tools of fixed point theory. The applicability of the results is shown with the aid of examples. Our results are not only new in the given configuration but also yield some new special cases for specific choices of the parameters involved in the problems.

  10. Scale relativity theory and integrative systems biology: 2. Macroscopic quantum-type mechanics.

    Science.gov (United States)

    Nottale, Laurent; Auffray, Charles

    2008-05-01

    In these two companion papers, we provide an overview and a brief history of the multiple roots, current developments and recent advances of integrative systems biology and identify multiscale integration as its grand challenge. Then we introduce the fundamental principles and the successive steps that have been followed in the construction of the scale relativity theory, which aims at describing the effects of a non-differentiable and fractal (i.e., explicitly scale dependent) geometry of space-time. The first paper of this series was devoted, in this new framework, to the construction from first principles of scale laws of increasing complexity, and to the discussion of some tentative applications of these laws to biological systems. In this second review and perspective paper, we describe the effects induced by the internal fractal structures of trajectories on motion in standard space. Their main consequence is the transformation of classical dynamics into a generalized, quantum-like self-organized dynamics. A Schrödinger-type equation is derived as an integral of the geodesic equation in a fractal space. We then indicate how gauge fields can be constructed from a geometric re-interpretation of gauge transformations as scale transformations in fractal space-time. Finally, we introduce a new tentative development of the theory, in which quantum laws would hold also in scale space, introducing complexergy as a measure of organizational complexity. Initial possible applications of this extended framework to the processes of morphogenesis and the emergence of prokaryotic and eukaryotic cellular structures are discussed. Having founded elements of the evolutionary, developmental, biochemical and cellular theories on the first principles of scale relativity theory, we introduce proposals for the construction of an integrative theory of life and for the design and implementation of novel macroscopic quantum-type experiments and devices, and discuss their potential

  11. Comparison of five DNA quantification methods

    DEFF Research Database (Denmark)

    Nielsen, Karsten; Mogensen, Helle Smidt; Hedman, Johannes

    2008-01-01

    Six commercial preparations of human genomic DNA were quantified using five quantification methods: UV spectrometry, SYBR-Green dye staining, slot blot hybridization with the probe D17Z1, Quantifiler Human DNA Quantification kit and RB1 rt-PCR. All methods measured higher DNA concentrations than...... Quantification kit in two experiments. The measured DNA concentrations with Quantifiler were 125 and 160% higher than expected based on the manufacturers' information. When the Quantifiler human DNA standard (Raji cell line) was replaced by the commercial human DNA preparation G147A (Promega) to generate the DNA...... standard curve in the Quantifiler Human DNA Quantification kit, the DNA quantification results of the human DNA preparations were 31% higher than expected based on the manufacturers' information. The results indicate a calibration problem with the Quantifiler human DNA standard for its use...

  12. Resonant modal group theory of membrane-type acoustical metamaterials for low-frequency sound attenuation

    Science.gov (United States)

    Ma, Fuyin; Wu, Jiu Hui; Huang, Meng

    2015-09-01

    In order to overcome the influence of structural resonance on continuous structures and obtain a lightweight thin-layer structure which can effectively isolate low-frequency noise, an elastic membrane structure was proposed. In the low-frequency range below 500 Hz, the sound transmission loss (STL) of this membrane-type structure is considerably higher than that of EVA (ethylene-vinyl acetate copolymer), the current sound insulation material used in vehicles, so it is possible to replace EVA with the membrane-type metamaterial structure in practical engineering. Based on the band structure, modal shapes, as well as sound transmission simulation, the sound insulation mechanism of the designed membrane-type acoustic metamaterial was analyzed from a new perspective, and this analysis was validated experimentally. It is suggested that in the frequency range above 200 Hz for this membrane-mass type structure, the sound insulation effect is principally due not to the low-level locally resonant mode of the mass block, but to the continuous vertical resonant modes of the localized membrane. Based on this physical property, a resonant modal group theory is initially proposed in this paper. In addition, the sound insulation mechanisms of the membrane-type structure and the thin-plate structure are combined through the membrane/plate resonant theory.

  13. Loss Aversion under Prospect Theory: a Parameter-Free Measurement

    NARCIS (Netherlands)

    H. Bleichrodt (Han); M. Abdellaoui (Mohammed); C. Paraschiv (Corina)

    2007-01-01

    textabstractA growing body of qualitative evidence shows that loss aversion, a phenomenon formalized in prospect theory, can explain a variety of field and experimental data. Quantifications of loss aversion are, however, hindered by the absence of a general preference-based method to elicit the

  14. Uncertainty analysis of 137Cs and 90Sr activity in borehole water from a waste disposal site

    International Nuclear Information System (INIS)

    Dafauti, Sunita; Pulhani, Vandana; Datta, D.; Hegde, A.G.

    2005-01-01

    Uncertainty quantification (UQ) is the quantitative characterization and use of uncertainty in experimental applications. There are two distinct types of uncertainty: variability, which can be quantified in principle using classical probability theory, and lack of knowledge, which requires more than classical probability theory for its quantification. Fuzzy set theory was applied to quantify the second type of uncertainty, associated with the measurement of activity due to 137Cs and 90Sr present in bore-well water samples from a waste disposal site. The upper and lower limits of the concentration were computed, and it may be concluded from the analysis that the alpha-cut technique of fuzzy set theory is a good non-precise estimator of these types of bounds. (author)
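
    The alpha-cut technique mentioned above has a simple computational core: at each membership level alpha, a fuzzy number collapses to an interval whose endpoints are the non-precise lower and upper bounds. The following Python sketch computes these bounds for a triangular fuzzy number; the activity values are hypothetical and not the measured data of the study.

        def alpha_cut_triangular(low, mode, high, alpha):
            # Interval of a triangular fuzzy number (low, mode, high) at level alpha:
            # alpha = 0 returns the full support, alpha = 1 collapses to the mode.
            lower = low + alpha * (mode - low)
            upper = high - alpha * (high - mode)
            return lower, upper

        # Hypothetical 137Cs activity (Bq/L): support [0.8, 1.6], most plausible value 1.1
        for alpha in (0.0, 0.5, 1.0):
            lo, hi = alpha_cut_triangular(0.8, 1.1, 1.6, alpha)
            print("alpha=%.1f: activity in [%.2f, %.2f] Bq/L" % (alpha, lo, hi))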

  15. A Yang-Mills Type Gauge Theory of Gravity and the Dark Matter and Dark Energy Problems

    OpenAIRE

    Yang, Yi; Yeung, Wai Bong

    2012-01-01

    A Yang-Mills type gauge theory of gravity is shown to have a richer structure than Einstein's General Theory of Relativity. This new structure can explain the form of the galactic rotation curves, the amount of intergalactic gravitational lensing, and the accelerating expansion of the Universe.

  16. Classical Bianchi Type I Cosmology in K-Essence Theory

    International Nuclear Information System (INIS)

    Pimentel, Luis O.; Socorro, J.; Espinoza-García, Abraham

    2014-01-01

    We use one of the simplest forms of the K-essence theory and apply it to the classical anisotropic Bianchi type I cosmological model, with a barotropic perfect fluid (p=γρ) modeling the usual matter content and with cosmological constant Λ. Classical exact solutions for any γ≠1 and Λ=0 are found in closed form, whereas solutions for Λ≠0 are found for particular values of the barotropic parameter. We examine the possible isotropization of the Bianchi I cosmological model using the ratio between the anisotropic parameters and the volume of the universe. We also include a qualitative analysis of the analog of the Friedmann equation.

  17. The epsilon regime of chiral perturbation theory with Wilson-type fermions

    Energy Technology Data Exchange (ETDEWEB)

    Jansen, K. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Shindler, A. [Liverpool Univ. (United Kingdom). Theoretical Physics Division

    2009-11-15

    In this proceedings contribution we report on the ongoing effort to simulate Wilson-type fermions in the so-called epsilon regime of chiral perturbation theory (cPT). We present results for the chiral condensate and the pseudoscalar decay constant obtained with Wilson twisted mass fermions employing two lattice spacings, two different physical volumes and several quark masses. With this set of simulations we make a first attempt to estimate the systematic uncertainties. (orig.)

  18. The epsilon regime of chiral perturbation theory with Wilson-type fermions

    International Nuclear Information System (INIS)

    Jansen, K.; Shindler, A.

    2009-11-01

    In this proceedings contribution we report on the ongoing effort to simulate Wilson-type fermions in the so-called epsilon regime of chiral perturbation theory (cPT). We present results for the chiral condensate and the pseudoscalar decay constant obtained with Wilson twisted mass fermions employing two lattice spacings, two different physical volumes and several quark masses. With this set of simulations we make a first attempt to estimate the systematic uncertainties. (orig.)

  19. Quaternary ammonium isobaric tag for a relative and absolute quantification of peptides.

    Science.gov (United States)

    Setner, Bartosz; Stefanowicz, Piotr; Szewczuk, Zbigniew

    2018-02-01

    Isobaric labeling quantification of peptides has become a method of choice for mass spectrometry-based proteomics studies. However, despite the wide variety of commercially available isobaric tags, none of the currently available methods offers a significant improvement in detection sensitivity during the MS experiment. Recently, many strategies have been applied to increase the ionization efficiency of peptides, involving chemical modifications that introduce a fixed quaternary ammonium charge. Here, we present a novel quaternary ammonium-based isobaric tag for relative and absolute quantification of peptides (QAS-iTRAQ 2-plex). Upon collisional activation, a new stable benzylic-type cationic reporter ion is liberated from the tag. Deuterium atoms were used to offset the differential masses of the reporter group. We tested the applicability of the QAS-iTRAQ 2-plex reagent on a series of model peptides as well as a bovine serum albumin tryptic digest. The obtained results suggest the usefulness of this isobaric ionization tag for relative and absolute quantification of peptides. Copyright © 2017 John Wiley & Sons, Ltd.
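
    The quantification step of any 2-plex isobaric scheme reduces, after fragmentation, to comparing the intensities of the two distinct reporter ions released from co-selected precursor ions. The Python sketch below shows this arithmetic only; the intensity values are invented and the function names are not from the paper.

        def relative_quantification(reporter_a, reporter_b):
            # Relative abundance of two isobarically labeled peptide versions,
            # inferred from their reporter-ion intensities in one MS/MS spectrum.
            total = reporter_a + reporter_b
            return reporter_a / total, reporter_b / total

        # Hypothetical reporter-ion intensities for one peptide pair
        frac_a, frac_b = relative_quantification(3.2e5, 1.6e5)
        print("sample A: %.1f%%, sample B: %.1f%%, ratio %.2f"
              % (100 * frac_a, 100 * frac_b, frac_a / frac_b))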

  20. Type II Superstring Field Theory: Geometric Approach and Operadic Description

    CERN Document Server

    Jurco, Branislav

    2013-01-01

    We outline the construction of type II superstring field theory leading to a geometric and algebraic BV master equation, analogous to Zwiebach's construction for the bosonic string. The construction uses the small Hilbert space. Elementary vertices of the non-polynomial action are described with the help of a properly formulated minimal area problem. They give rise to an infinite tower of superstring field products defining a $\\mathcal{N}=1$ generalization of a loop homotopy Lie algebra, the genus zero part generalizing a homotopy Lie algebra. Finally, we give an operadic interpretation of the construction.

  1. Automated Quantification of Hematopoietic Cell – Stromal Cell Interactions in Histological Images of Undecalcified Bone

    Science.gov (United States)

    Zehentmeier, Sandra; Cseresnyes, Zoltan; Escribano Navarro, Juan; Niesner, Raluca A.; Hauser, Anja E.

    2015-01-01

    Confocal microscopy is the method of choice for the analysis of localization of multiple cell types within complex tissues such as the bone marrow. However, the analysis and quantification of cellular localization is difficult, as in many cases it relies on manual counting, thus bearing the risk of introducing a rater-dependent bias and reducing interrater reliability. Moreover, it is often difficult to judge whether the co-localization between two cells results from random positioning, especially when cell types differ strongly in the frequency of their occurrence. Here, a method for unbiased quantification of cellular co-localization in the bone marrow is introduced. The protocol describes the sample preparation used to obtain histological sections of whole murine long bones including the bone marrow, as well as the staining protocol and the acquisition of high-resolution images. An analysis workflow spanning from the recognition of hematopoietic and non-hematopoietic cell types in 2-dimensional (2D) bone marrow images to the quantification of the direct contacts between those cells is presented. This also includes a neighborhood analysis, to obtain information about the cellular microenvironment surrounding a certain cell type. In order to evaluate whether co-localization of two cell types is the mere result of random cell positioning or reflects preferential associations between the cells, a simulation tool which is suitable for testing this hypothesis in the case of hematopoietic as well as stromal cells, is used. This approach is not limited to the bone marrow, and can be extended to other tissues to permit reproducible, quantitative analysis of histological data. PMID:25938636

  2. Uncertainty quantification using evidence theory in multidisciplinary design optimization

    International Nuclear Information System (INIS)

    Agarwal, Harish; Renaud, John E.; Preston, Evan L.; Padmanabhan, Dhanesh

    2004-01-01

    Advances in computational performance have led to the development of large-scale simulation tools for design. Systems generated using such simulation tools can fail in service if the uncertainty of the simulation tool's performance predictions is not accounted for. In this research an investigation of how uncertainty can be quantified in multidisciplinary systems analysis subject to epistemic uncertainty associated with the disciplinary design tools and input parameters is undertaken. Evidence theory is used to quantify uncertainty in terms of the uncertain measures of belief and plausibility. To illustrate the methodology, multidisciplinary analysis problems are introduced as an extension to the epistemic uncertainty challenge problems identified by Sandia National Laboratories. After uncertainty has been characterized mathematically the designer seeks the optimum design under uncertainty. The measures of uncertainty provided by evidence theory are discontinuous functions. Such non-smooth functions cannot be used in traditional gradient-based optimizers because the sensitivities of the uncertain measures are not properly defined. In this research surrogate models are used to represent the uncertain measures as continuous functions. A sequential approximate optimization approach is used to drive the optimization process. The methodology is illustrated in application to multidisciplinary example problems
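
    For readers unfamiliar with the belief and plausibility measures used above, the Python sketch below computes them from a basic mass assignment in the standard Dempster-Shafer way: belief sums the mass of focal sets contained in the hypothesis, while plausibility sums the mass of focal sets intersecting it. The failure-state example is hypothetical and is not one of the Sandia challenge problems.

        def belief(mass, hypothesis):
            # Bel(A): total mass of focal elements fully contained in A
            return sum(m for s, m in mass.items() if s <= hypothesis)

        def plausibility(mass, hypothesis):
            # Pl(A): total mass of focal elements that intersect A
            return sum(m for s, m in mass.items() if s & hypothesis)

        # Hypothetical evidence over system states {ok, degraded, failed}
        mass = {
            frozenset(["ok"]): 0.5,
            frozenset(["ok", "degraded"]): 0.3,
            frozenset(["ok", "degraded", "failed"]): 0.2,  # total ignorance
        }
        A = frozenset(["ok"])
        print("Bel=%.2f, Pl=%.2f" % (belief(mass, A), plausibility(mass, A)))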

  3. Lung involvement quantification in chest radiographs

    International Nuclear Information System (INIS)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A.; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M.

    2014-01-01

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease which remains a global health problem. Chest radiography is the method commonly employed to assess TB's evolution. Methods for the quantification of chest abnormalities are usually applied to computed tomography (CT) scans. This quantification is important for assessing TB evolution and treatment and for comparing different treatments. However, precise quantification is not feasible given the number of CT scans required. The purpose of this work is to develop a methodology for the quantification of lung damage caused by TB through chest radiographs. An algorithm was developed for the computational processing of exams in Matlab, which creates a 3D representation of the lungs, with the compromised dilated regions inside. The quantification of lung lesions was also carried out for the same patients through CT scans. The measurements from the two methods were compared, resulting in strong correlation. Applying Bland-Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. The results suggest the effectiveness and applicability of the developed method, providing a better risk-benefit for the patient and a better cost-benefit ratio for the institution. (author)
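
    The Bland-Altman comparison used in the study boils down to the bias (mean difference between the two methods) and the 95% limits of agreement (bias plus or minus 1.96 standard deviations of the differences). A minimal Python sketch follows, with invented lesion volumes rather than the study's data.

        import numpy as np

        def bland_altman_limits(method_a, method_b):
            # Bias and 95% limits of agreement between two measurement methods
            diff = np.asarray(method_a, dtype=float) - np.asarray(method_b, dtype=float)
            bias = diff.mean()
            sd = diff.std(ddof=1)
            return bias, bias - 1.96 * sd, bias + 1.96 * sd

        # Hypothetical lesion volumes (cm^3): radiograph-based vs. CT-based
        radiograph = [110, 95, 130, 80, 150, 60, 105, 90]
        ct = [100, 90, 120, 85, 140, 65, 100, 95]
        bias, lo, hi = bland_altman_limits(radiograph, ct)
        print("bias=%.1f, limits of agreement [%.1f, %.1f]" % (bias, lo, hi))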

  4. Fluorescent quantification of melanin.

    Science.gov (United States)

    Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur

    2016-11-01

    Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Therefore, fluorescence spectroscopy is the best method for melanin quantification as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. Impaired Theory of Mind and psychosocial functioning among pediatric patients with Type I versus Type II bipolar disorder.

    Science.gov (United States)

    Schenkel, Lindsay S; Chamberlain, Todd F; Towne, Terra L

    2014-03-30

    Deficits in Theory of Mind (ToM) have been documented among pediatric patients with Bipolar Disorder (BD). However, fewer studies have directly examined differences between type I and type II patients and whether or not ToM deficits are related to psychosocial difficulties. Therefore, the aim of this study was to compare type I versus type II pediatric bipolar patients and matched Healthy Controls (HC) on ToM and interpersonal functioning tasks. All participants completed the Revised Mind in the Eyes Task (MET), the Cognitive and Emotional Perspective Taking Task (CEPTT), and the Index of Peer Relations (IPR). Type I BD patients reported greater peer difficulties on the IPR compared to HC, and also performed more poorly on the MET and the cognitive condition of the CEPTT, but did not differ significantly on the emotional condition. There were no significant group differences between type II BD patients and HC. More impaired ToM performance was associated with poorer interpersonal functioning. Type I BD patients show deficits in the ability to understand another's mental state, irrespective of emotional valence. Deficits in understanding others' mental states could be an important treatment target for type I pediatric patients with BD. © 2013 Elsevier Ireland Ltd. All rights reserved.

  6. Uncertainty quantification and stochastic modeling with Matlab

    CERN Document Server

    Souza de Cursi, Eduardo

    2015-01-01

    Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no

  7. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc

    2016-06-20

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created a R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  8. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc; Vitriolo, Alessandro; Adamo, Antonio; Laise, Pasquale; Das, Vivek; Testa, Giuseppe

    2016-01-01

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created a R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  9. Verb aspect, alternations and quantification

    Directory of Open Access Journals (Sweden)

    Svetla Koeva

    2015-11-01

    In this paper we briefly discuss the nature of Bulgarian verb aspect and argue that the members of a verb aspect pair are different lexical units with different (although related) meanings, different argument structures (reflecting the categories, explicitness and referential status of arguments) and different sets of semantic and syntactic alternations. The verb prefixes resulting in the derivation of perfective verbs can in some cases be interpreted as lexical quantifiers as well. Thus Bulgarian verb aspect is related (in different ways) both to the potential for the generation of alternations and to prefixal lexical quantification. It is shown that the scope of lexical quantification by means of verbal prefixes is the quantified verb phrase, and that this scope remains constant in all derived alternations. The paper concerns the basic issues of these complex problems, while the detailed description of the conditions satisfying a particular alternation or a particular lexical quantification is the subject of a more detailed study.

  10. A fast and robust hepatocyte quantification algorithm including vein processing

    Directory of Open Access Journals (Sweden)

    Homeyer André

    2010-03-01

    Background: Quantification of different types of cells is often needed for the analysis of histological images. In our project, we compute the relative number of proliferating hepatocytes for the evaluation of the regeneration process after partial hepatectomy in normal rat livers. Results: Our automatic approach for hepatocyte (HC) quantification is suitable for the analysis of an entire digitized histological section given in the form of a series of images. It is the main part of an automatic hepatocyte quantification tool that allows for the computation of the ratio between the number of proliferating HC nuclei and the total number of all HC nuclei for a series of images in one processing run. The processing pipeline allows us to obtain the desired results for a wide range of images with different properties without additional parameter adjustment. Comparing the obtained segmentation results with a manually retrieved segmentation mask, which is considered to be the ground truth, we achieve results with a sensitivity above 90% and a false positive fraction below 15%. Conclusions: The proposed automatic procedure gives results with high sensitivity and a low false positive fraction, and can be applied to process entire stained sections.
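
    The sensitivity and false positive fraction reported above can be computed directly from a binary segmentation and a ground-truth mask. The Python sketch below uses one common convention (false positives as a fraction of all detections); the tiny masks are invented, and the actual tool may define the quantities differently.

        import numpy as np

        def sensitivity_fpf(segmentation, ground_truth):
            seg = np.asarray(segmentation, dtype=bool)
            gt = np.asarray(ground_truth, dtype=bool)
            tp = np.logical_and(seg, gt).sum()   # correctly detected nucleus pixels
            fp = np.logical_and(seg, ~gt).sum()  # spurious detections
            return tp / gt.sum(), fp / seg.sum()

        # Tiny hypothetical masks (1 = hepatocyte nucleus pixel)
        gt = np.array([[0, 1, 1], [0, 1, 0], [0, 0, 0]])
        seg = np.array([[0, 1, 1], [0, 0, 0], [0, 1, 0]])
        print("sensitivity=%.2f, false positive fraction=%.2f" % sensitivity_fpf(seg, gt))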

  11. Conference on Geometric Analysis &Conference on Type Theory, Homotopy Theory and Univalent Foundations : Extended Abstracts Fall 2013

    CERN Document Server

    Yang, Paul; Gambino, Nicola; Kock, Joachim

    2015-01-01

    The two parts of the present volume contain extended conference abstracts corresponding to selected talks given by participants at the "Conference on Geometric Analysis" (thirteen abstracts) and at the "Conference on Type Theory, Homotopy Theory and Univalent Foundations" (seven abstracts), both held at the Centre de Recerca Matemàtica (CRM) in Barcelona from July 1st to 5th, 2013, and from September 23rd to 27th, 2013, respectively. Most of them are brief articles, containing preliminary presentations of new results not yet published in regular research journals. The articles are the result of a direct collaboration between active researchers in the area after working in a dynamic and productive atmosphere. The first part is about Geometric Analysis and Conformal Geometry; this modern field lies at the intersection of many branches of mathematics (Riemannian, Conformal, Complex or Algebraic Geometry, Calculus of Variations, PDEs, etc.) and relates directly to the physical world, since many natural phenomena...

  12. Dualities in M-theory and Born-Infeld Theory

    International Nuclear Information System (INIS)

    Brace, Daniel M.

    2001-01-01

    We discuss two examples of duality. The first arises in the context of toroidal compactification of the discrete light cone quantization of M-theory. In the presence of nontrivial moduli coming from the M-theory three form, it has been conjectured that the system is described by supersymmetric Yang-Mills gauge theory on a noncommutative torus. We are able to provide evidence for this conjecture, by showing that the dualities of this M-theory compactification, which correspond to T-duality in Type IIA string theory, are also dualities of the noncommutative supersymmetric Yang-Mills description. One can also consider this as evidence for the accuracy of the Matrix Theory description of M-theory in this background. The second type of duality is the self-duality of theories with U(1) gauge fields. After discussing the general theory of duality invariance for theories with complex gauge fields, we are able to find a generalization of the well known U(1) Born-Infeld theory that contains any number of gauge fields and which is invariant under the maximal duality group. We then find a supersymmetric extension of our results, and also show that our results can be extended to find Born-Infeld type actions in any even dimensional spacetime

  13. Higher derivatives in Type II and M-theory on Calabi-Yau threefolds

    Science.gov (United States)

    Grimm, Thomas W.; Mayer, Kilian; Weissenbacher, Matthias

    2018-02-01

    The four- and five-dimensional effective actions of Calabi-Yau threefold compactifications are derived with a focus on terms involving up to four space-time derivatives. The starting points for these reductions are the ten- and eleven-dimensional supergravity actions supplemented with the known eight-derivative corrections that have been inferred from Type II string amplitudes. The corrected background solutions are determined, and the fluctuations of the Kähler structure of the compact space and the form-field background are discussed. It is concluded that the two-derivative effective actions for these fluctuations only take the expected supergravity form if certain additional ten- and eleven-dimensional higher-derivative terms for the form-fields are included. The main results on the four-derivative terms include a detailed treatment of higher-derivative gravity coupled to Kähler structure deformations. This is supplemented by a derivation of the vector sector in reductions to five dimensions. While the general result is only given as an expansion in the fluctuations, a complete treatment of the one-Kähler-modulus case is presented for both Type II theories and M-theory.

  14. Negative affectivity and social inhibition in cardiovascular disease: evaluating type-D personality and its assessment using item response theory.

    Science.gov (United States)

    Emons, Wilco H M; Meijer, Rob R; Denollet, Johan

    2007-07-01

    Individuals with increased levels of both negative affectivity (NA) and social inhibition (SI), referred to as type-D personality, are at increased risk of adverse cardiac events. We used item response theory (IRT) to evaluate NA, SI, and type-D personality as measured by the DS14. The objectives of this study were (a) to evaluate the relative contribution of individual items to the measurement precision at the cutoff used to distinguish type-D from non-type-D personality and (b) to investigate the comparability of the NA, SI, and type-D constructs across the general population and clinical populations. Data from representative samples including 1316 respondents from the general population, 427 respondents diagnosed with coronary heart disease, and 732 persons suffering from hypertension were analyzed using the graded response IRT model. In Study 1, the information functions obtained in the IRT analysis showed that (a) all items had their highest measurement precision around the cutoff and (b) the items are most informative at the higher end of the scale. In Study 2, the IRT analysis showed that the measurements were fairly comparable across the general population and clinical populations. The DS14 adequately measures NA and SI, with the highest reliability in the trait range around the cutoff. The DS14 is a valid instrument to assess and compare type-D personality across clinical groups.
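
    The item information functions central to Study 1 follow Samejima's graded response model: each item contributes most information where its boundary response curves are steepest. A self-contained Python sketch is given below; the discrimination and threshold values are invented and do not reproduce the DS14 item parameters.

        import numpy as np

        def grm_item_information(theta, a, b):
            # Boundary probabilities P*_k(theta), padded with P*_0 = 1 and P*_m = 0
            p_star = 1.0 / (1.0 + np.exp(-a * (theta - np.asarray(b, dtype=float))))
            p_star = np.concatenate(([1.0], p_star, [0.0]))
            dp = a * p_star * (1.0 - p_star)   # derivatives (zero at the pads)
            p_cat = p_star[:-1] - p_star[1:]   # category probabilities
            return np.sum((dp[:-1] - dp[1:]) ** 2 / p_cat)

        # Hypothetical 5-category item: where on the trait is it most precise?
        for theta in np.linspace(-3, 3, 7):
            info = grm_item_information(theta, a=1.8, b=[-1.0, -0.2, 0.6, 1.4])
            print("theta=%+.1f: information=%.2f" % (theta, info))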

  15. QUANTIFICATION AND BIOREMEDIATION OF ENVIRONMENTAL SAMPLES BY DEVELOPING A NOVEL AND EFFICIENT METHOD

    Directory of Open Access Journals (Sweden)

    Mohammad Osama

    2014-06-01

    Pleurotus ostreatus, a white rot fungus, is capable of bioremediating a wide range of organic contaminants, including polycyclic aromatic hydrocarbons (PAHs). Ergosterol is produced by living fungal biomass and is used as a measure of fungal biomass. The first part of this work deals with the extraction and quantification of PAHs from contaminated sediments by the Lipid Extraction Method (LEM). The second part consists of the development of a novel extraction method, the Ergosterol Extraction Method (EEM), together with quantification and bioremediation. The novelty of this method lies in the simultaneous extraction and quantification of two different types of compounds, a sterol (ergosterol) and PAHs, and it is more efficient than LEM. EEM has been successful in extracting ergosterol from the fungus grown on barley at concentrations of 17.5-39.94 µg g-1, and the PAHs are quantified in greater numbers and amounts as compared to LEM. In addition, cholesterol, usually found in animals, has also been detected in the fungus P. ostreatus at easily detectable levels.

  16. Quantification of local mobilities

    DEFF Research Database (Denmark)

    Zhang, Y. B.

    2018-01-01

    A new method for quantification of mobilities of local recrystallization boundary segments is presented. The quantification is based on microstructures characterized using electron microscopy and on determination of migration velocities and driving forces for local boundary segments. Pure aluminium...... is investigated and the results show that even for a single recrystallization boundary, different boundary segments migrate differently, and the differences can be understood based on variations in mobilities and local deformed microstructures. The present work has important implications for understanding...

  17. Type synthesis for 4-DOF parallel press mechanism using GF set theory

    Science.gov (United States)

    He, Jun; Gao, Feng; Meng, Xiangdun; Guo, Weizhong

    2015-07-01

    Parallel mechanisms are used in large-capacity servo presses to avoid the over-constraint of traditional redundant actuation. Current research mainly focuses on performance analysis for specific parallel press mechanisms. However, the type synthesis and evaluation of parallel press mechanisms are seldom studied, especially for four-degrees-of-freedom (DOF) press mechanisms. The type synthesis of 4-DOF parallel press mechanisms is carried out based on the generalized function (GF) set theory. Five design criteria for 4-DOF parallel press mechanisms are first proposed. The general procedure for the type synthesis of parallel press mechanisms is obtained, which includes number synthesis, symmetrical synthesis of constraint GF sets, decomposition of motion GF sets, and design of limbs. Nine combinations of constraint GF sets of 4-DOF parallel press mechanisms, ten combinations of GF sets of active limbs, and eleven combinations of GF sets of passive limbs are synthesized. Thirty-eight kinds of press mechanisms are presented, and different structures of kinematic limbs are then designed. Finally, the geometrical constraint complexity (GCC), kinematic pair complexity (KPC), and type complexity (TC) are proposed to evaluate the press types, and the optimal press type is selected. General methodologies of type synthesis and evaluation for parallel press mechanisms are suggested.

  18. Development of Quantification Method for Bioluminescence Imaging

    International Nuclear Information System (INIS)

    Kim, Hyeon Sik; Min, Jung Joon; Lee, Byeong Il; Choi, Eun Seo; Tak, Yoon O; Choi, Heung Kook; Lee, Ju Young

    2009-01-01

    Optical molecular luminescence imaging is widely used for the detection and imaging of bio-photons emitted upon luminescent luciferase activation. The photons measured with this method indicate the degree of molecular alteration or the cell numbers, with the advantage of a high signal-to-noise ratio. To extract useful information from the measured results, analysis based on a proper quantification method is necessary. In this research, we propose a quantification method presenting a linear response of the measured light signal to measurement time. We detected the luminescence signal using lab-made optical imaging equipment, the animal light imaging system (ALIS), and two different kinds of light sources. One is three bacterial light-emitting sources containing different numbers of bacteria. The other is three different non-bacterial light sources emitting very weak light. Using the concepts of the candela and the flux, we could derive a simplified linear quantification formula. After experimentally measuring the light intensity, the data were processed with the proposed quantification function. We obtained a linear response of photon counts to measurement time by applying the pre-determined quantification function. The ratio of the re-calculated photon counts to measurement time presents a constant value even when different light sources are applied. The quantification function for linear response could be applicable to a standard quantification process. The proposed method could be used for exact quantitative analysis in various light imaging equipment exhibiting a linear response of constant light-emitting sources to measurement time.
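
    The essence of the proposed check is that a constant source should yield photon counts growing linearly with measurement time, so that counts divided by time is a stable flux. A minimal Python sketch with invented acquisition data (not the ALIS measurements):

        import numpy as np

        def photon_flux(times, counts):
            # Fit counts vs. time; a near-unity R^2 indicates the linear response
            # that makes counts/time usable as a source-strength constant.
            slope, intercept = np.polyfit(times, counts, 1)
            r = np.corrcoef(times, counts)[0, 1]
            return slope, r ** 2

        times = np.array([10.0, 20.0, 40.0, 60.0, 120.0])        # seconds
        counts = np.array([1.1e4, 2.0e4, 4.1e4, 6.2e4, 1.19e5])  # photon counts
        flux, r2 = photon_flux(times, counts)
        print("flux ~ %.0f counts/s, R^2 = %.4f" % (flux, r2))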

  19. Quantification of arbuscular mycorrhizal fungal DNA in roots: how important is material preservation?

    Science.gov (United States)

    Janoušková, Martina; Püschel, David; Hujslová, Martina; Slavíková, Renata; Jansa, Jan

    2015-04-01

    Monitoring populations of arbuscular mycorrhizal fungi (AMF) in roots is a pre-requisite for improving our understanding of AMF ecology and functioning of the symbiosis in natural conditions. Among other approaches, quantification of fungal DNA in plant tissues by quantitative real-time PCR is one of the advanced techniques with a great potential to process large numbers of samples and to deliver truly quantitative information. Its application potential would greatly increase if the samples could be preserved by drying, but little is currently known about the feasibility and reliability of fungal DNA quantification from dry plant material. We addressed this question by comparing quantification results based on dry root material to those obtained from deep-frozen roots of Medicago truncatula colonized with Rhizophagus sp. The fungal DNA was well conserved in the dry root samples with overall fungal DNA levels in the extracts comparable with those determined in extracts of frozen roots. There was, however, no correlation between the quantitative data sets obtained from the two types of material, and data from dry roots were more variable. Based on these results, we recommend dry material for qualitative screenings but advocate using frozen root materials if precise quantification of fungal DNA is required.

  20. Forest Carbon Leakage Quantification Methods and Their Suitability for Assessing Leakage in REDD

    Directory of Open Access Journals (Sweden)

    Sabine Henders

    2012-01-01

    This paper assesses quantification methods for carbon leakage from forestry activities for their suitability in leakage accounting in a future Reducing Emissions from Deforestation and Forest Degradation (REDD) mechanism. To that end, we first conducted a literature review to identify specific pre-requisites for leakage assessment in REDD. We then analyzed a total of 34 quantification methods for leakage emissions from the Clean Development Mechanism (CDM), the Verified Carbon Standard (VCS), the Climate Action Reserve (CAR), the CarbonFix Standard (CFS), and from scientific literature sources. We screened these methods for the leakage aspects they address in terms of leakage type, tools used for quantification and the geographical scale covered. Results show that leakage methods can be grouped into nine main methodological approaches, six of which could fulfill the recommended REDD leakage requirements if approaches for primary and secondary leakage are combined. The majority of methods assessed address either primary or secondary leakage; the former mostly on a local or regional scale and the latter on a national scale. The VCS is found to be the only carbon accounting standard at present to fulfill all leakage quantification requisites in REDD. However, a lack of accounting methods was identified for international leakage, which was addressed by only two methods, both from scientific literature.

  1. Development of a situation-specific theory for explaining health-related quality of life among older South Korean adults with type 2 diabetes.

    Science.gov (United States)

    Chang, Sun Ju; Im, Eun-Ok

    2014-01-01

    The purpose of the study was to develop a situation-specific theory for explaining health-related quality of life (QOL) among older South Korean adults with type 2 diabetes. To develop a situation-specific theory, three sources were considered: (a) the conceptual model of health promotion and QOL for people with chronic and disabling conditions (an existing theory related to the QOL in patients with chronic diseases); (b) a literature review using multiple databases including Cumulative Index for Nursing and Allied Health Literature (CINAHL), PubMed, PsycINFO, and two Korean databases; and (c) findings from our structural equation modeling study on health-related QOL in older South Korean adults with type 2 diabetes. The proposed situation-specific theory is constructed with six major concepts including barriers, resources, perceptual factors, psychosocial factors, health-promoting behaviors, and health-related QOL. The theory also provides the interrelationships among concepts. Health care providers and nurses could incorporate the proposed situation-specific theory into development of diabetes education programs for improving health-related QOL in older South Korean adults with type 2 diabetes.

  2. Identification and Quantification of Carbonate Species Using Rock-Eval Pyrolysis

    Directory of Open Access Journals (Sweden)

    Pillot D.

    2013-03-01

    This paper presents a new, reliable and rapid method to characterise and quantify carbonates in solid samples, based on monitoring the CO2 flux emitted by the progressive thermal decomposition of carbonates during programmed heating. The different decomposition peaks allow one to determine the different types of carbonates present in the analysed sample. The quantification of each peak gives the respective proportions of these different types of carbonates in the sample. In addition to the chosen procedure presented in this paper, which uses a standard Rock-Eval 6 pyrolyser, characteristic calibration profiles are also presented for the most common carbonates in nature. This method should allow different types of applications in different disciplines, whether academic or industrial.

  3. Quantification in single photon emission computed tomography (SPECT)

    International Nuclear Information System (INIS)

    Buvat, Irene

    2005-01-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) images, and to identify the conditions to be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: quantification in emission tomography - definition and challenges; quantification-biasing phenomena; 2 - Quantification in SPECT, problems and correction methods: attenuation, scattering, non-stationary spatial resolution, partial volume effect, movement, tomographic reconstruction, calibration; 3 - Synthesis: actual quantification accuracy; 4 - Beyond the activity concentration measurement.
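
    Among the correction methods listed under point 2, attenuation correction admits a particularly compact illustration. The Python sketch below computes a first-order Chang-type multiplicative factor for a single pixel from assumed path lengths through tissue; the attenuation coefficient and depths are illustrative values, not part of the lecture.

        import numpy as np

        def chang_correction_factor(depths_cm, mu_per_cm):
            # Inverse of the mean photon survival probability over the sampled
            # projection directions; multiply the reconstructed value by this factor.
            survival = np.exp(-mu_per_cm * np.asarray(depths_cm, dtype=float))
            return 1.0 / survival.mean()

        # Hypothetical pixel in a water-equivalent slice (mu ~ 0.15 /cm at 140 keV)
        depths = [4.0, 6.5, 9.0, 6.5]   # cm to the body outline, four directions
        print("correction factor ~ %.2f" % chang_correction_factor(depths, 0.15))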

  4. Development of a VHH-Based Erythropoietin Quantification Assay

    DEFF Research Database (Denmark)

    Kol, Stefan; Beuchert Kallehauge, Thomas; Adema, Simon

    2015-01-01

    Erythropoietin (EPO) quantification during cell line selection and bioreactor cultivation has traditionally been performed with ELISA or HPLC. As these techniques suffer from several drawbacks, we developed a novel EPO quantification assay. A camelid single-domain antibody fragment directed against...... human EPO was evaluated as a capturing antibody in a label-free biolayer interferometry-based quantification assay. Human recombinant EPO can be specifically detected in Chinese hamster ovary cell supernatants in a sensitive and pH-dependent manner. This method enables rapid and robust quantification...

  5. Quantification procedures in micro X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Kanngiesser, Birgit

    2003-01-01

    For quantification in micro X-ray fluorescence analysis, standard-free quantification procedures have become especially important. An introduction to the basic concepts of these quantification procedures is given, followed by a short survey of the procedures that are currently available and of the kinds of experimental situations and analytical problems they address. The last point is extended by the description of our own development of the fundamental parameter method, which makes the inclusion of non-parallel beam geometries possible. Finally, open problems for the quantification procedures are discussed.

  6. Quantification of heterogeneity as a biomarker in tumor imaging: a systematic review.

    Directory of Open Access Journals (Sweden)

    Lejla Alic

    BACKGROUND: Many techniques have been proposed for the quantification of tumor heterogeneity as an imaging biomarker for differentiation between tumor types, tumor grading, response monitoring and outcome prediction. However, in clinical practice these methods are barely used. This study evaluates the reported performance of the described methods and identifies barriers to their implementation in clinical practice. METHODOLOGY: The Ovid, Embase, and Cochrane Central databases were searched up to 20 September 2013. Heterogeneity analysis methods were classified into four categories, i.e., non-spatial methods (NSM), spatial grey level methods (SGLM), fractal analysis (FA) methods, and filters and transforms (F&T). The performance of the different methods was compared. PRINCIPAL FINDINGS: Of the 7351 potentially relevant publications, 209 were included. Of these studies, 58% reported the use of NSM, 49% SGLM, 10% FA, and 28% F&T. Differentiation between tumor types, tumor grading and/or outcome prediction was the goal in 87% of the studies. Overall, the reported area under the curve (AUC) ranged from 0.5 to 1 (median 0.87). No relation was found between the performance and the quantification methods used, or between the performance and the imaging modality. A negative correlation was found between the tumor-feature ratio and the AUC, which is presumably caused by overfitting in small datasets. Cross-validation was reported in 63% of the classification studies. Retrospective analyses were conducted in 57% of the studies without a clear description. CONCLUSIONS: In a research setting, heterogeneity quantification methods can differentiate between tumor types, grade tumors, predict outcome and monitor treatment effects. To translate these methods to clinical practice, more prospective studies are required that use external datasets for validation: these datasets should be made available to the community to facilitate the development of new and improved
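
    Of the four method categories, the spatial grey level methods (SGLM) are the easiest to make concrete: they summarize how often grey levels co-occur at a fixed pixel offset. The Python sketch below builds a co-occurrence matrix and two common heterogeneity descriptors; the quantized region of interest is invented, and the feature choice is only an example.

        import numpy as np

        def glcm(image, dx=1, dy=0, levels=8):
            # Normalized grey level co-occurrence matrix for one pixel offset
            img = np.asarray(image)
            m = np.zeros((levels, levels))
            h, w = img.shape
            for y in range(h - dy):
                for x in range(w - dx):
                    m[img[y, x], img[y + dy, x + dx]] += 1
            return m / m.sum()

        def glcm_features(p):
            # Contrast and entropy, two descriptors often used to grade heterogeneity
            i, j = np.indices(p.shape)
            contrast = np.sum(p * (i - j) ** 2)
            nz = p[p > 0]
            entropy = -np.sum(nz * np.log2(nz))
            return contrast, entropy

        # Hypothetical 4-level quantized tumor region of interest
        roi = np.array([[0, 1, 1, 2], [1, 2, 3, 3], [0, 1, 2, 3], [0, 0, 1, 2]])
        c, e = glcm_features(glcm(roi, levels=4))
        print("contrast=%.2f, entropy=%.2f" % (c, e))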

  7. The iteration formula of the Maslov-type index theory with applications to nonlinear Hamiltonian systems

    International Nuclear Information System (INIS)

    Di Dong; Yiming Long.

    1994-10-01

    In this paper, the iteration formula of the Maslov-type index theory for linear Hamiltonian systems with continuous periodic and symmetric coefficients is established. This formula yields a new method to determine the minimality of the period for solutions of nonlinear autonomous Hamiltonian systems via their Maslov-type indices. Applications of this formula give new results on the existence of periodic solutions with prescribed minimal period for such systems. (author). 40 refs

  8. In vivo MRS metabolite quantification using genetic optimization

    Science.gov (United States)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; van Ormondt, D.; Graveron-Demilly, D.

    2011-11-01

The in vivo quantification of the metabolite concentrations revealed in magnetic resonance spectroscopy (MRS) spectra is the main subject of this work. Approaches based on artificial intelligence tools such as neural networks (NNs) have lately been presented with good results, but they show several drawbacks in quantification accuracy under difficult conditions. This paper proposes a general framework that treats the quantification procedure as an optimization problem, solved here using a genetic algorithm (GA). Two different lineshape models are examined, and two GA configurations are applied to artificial data. Moreover, the introduced quantification technique deals with overlapping metabolite peaks, a considerably difficult situation occurring under real conditions. Experiments on artificial MRS data have demonstrated the efficiency of the introduced methodology, establishing it as a generic metabolite quantification procedure.

  9. In vivo MRS metabolite quantification using genetic optimization

    International Nuclear Information System (INIS)

    Papakostas, G A; Mertzios, B G; Karras, D A; Van Ormondt, D; Graveron-Demilly, D

    2011-01-01

The in vivo quantification of the metabolite concentrations revealed in magnetic resonance spectroscopy (MRS) spectra is the main subject of this work. Approaches based on artificial intelligence tools such as neural networks (NNs) have lately been presented with good results, but they show several drawbacks in quantification accuracy under difficult conditions. This paper proposes a general framework that treats the quantification procedure as an optimization problem, solved here using a genetic algorithm (GA). Two different lineshape models are examined, and two GA configurations are applied to artificial data. Moreover, the introduced quantification technique deals with overlapping metabolite peaks, a considerably difficult situation occurring under real conditions. Experiments on artificial MRS data have demonstrated the efficiency of the introduced methodology, establishing it as a generic metabolite quantification procedure
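Since both records above describe the same method, a single illustration serves for both: a minimal sketch of casting spectral quantification as an optimization problem solved by a genetic algorithm. The Lorentzian lineshape model, the GA operators and every parameter below are illustrative assumptions, not the authors' actual configuration.

```python
# Hedged sketch: metabolite quantification as GA-based curve fitting.
# Two overlapping Lorentzian peaks stand in for a lineshape model.
import numpy as np

rng = np.random.default_rng(0)
f = np.linspace(0, 5, 512)                      # frequency axis (a.u.)

def lorentzian(f, amp, pos, width):
    return amp * width**2 / ((f - pos)**2 + width**2)

def model(params):
    # params holds (amp, pos, width) for each of the two peaks.
    return sum(lorentzian(f, *row) for row in params.reshape(2, 3))

true = np.array([1.0, 2.0, 0.15, 0.6, 2.3, 0.20])
spectrum = model(true) + 0.01 * rng.standard_normal(f.size)

def fitness(pop):
    # Negative sum of squared residuals: larger is better.
    return np.array([-np.sum((model(p) - spectrum)**2) for p in pop])

lo = np.array([0.0, 0.0, 0.01] * 2)             # lower parameter bounds
hi = np.array([2.0, 5.0, 0.50] * 2)             # upper parameter bounds
pop = rng.uniform(lo, hi, size=(200, 6))
for gen in range(300):
    fit = fitness(pop)
    # Binary tournament selection.
    idx = rng.integers(0, len(pop), size=(len(pop), 2))
    parents = pop[np.where(fit[idx[:, 0]] > fit[idx[:, 1]], idx[:, 0], idx[:, 1])]
    # Blend crossover with a reversed copy of the parent pool.
    w = rng.uniform(size=(len(pop), 1))
    children = w * parents + (1 - w) * parents[::-1]
    # Gaussian mutation, clipped to bounds; keep the elite individual.
    children += 0.02 * (hi - lo) * rng.standard_normal(children.shape)
    children = np.clip(children, lo, hi)
    children[0] = pop[np.argmax(fit)]
    pop = children

best = pop[np.argmax(fitness(pop))]
print("estimated (amp, pos, width) per peak:\n", best.reshape(2, 3))
```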

  10. Synthesis of nanodiamond derivatives carrying amino functions and quantification by a modified Kaiser test

    Directory of Open Access Journals (Sweden)

    Gerald Jarre

    2014-11-01

    Full Text Available Nanodiamonds functionalized with different organic moieties carrying terminal amino groups have been synthesized. These include conjugates generated by Diels–Alder reactions of ortho-quinodimethanes formed in situ from pyrazine and 5,6-dihydrocyclobuta[d]pyrimidine derivatives. For the quantification of primary amino groups a modified photometric assay based on the Kaiser test has been developed and validated for different types of aminated nanodiamond. The results correspond well to values obtained by thermogravimetry. The method represents an alternative wet-chemical quantification method in cases where other techniques like elemental analysis fail due to unfavourable combustion behaviour of the analyte or other impediments.

  11. Transient theory of double slope floating cum tilted - wick type solar still

    International Nuclear Information System (INIS)

    Balan, R.; Chandrasekaran, J.; Janarthanan, B.; Kumar, S.

    2011-01-01

A double slope floating cum tilted-wick solar still has been fabricated, and a transient theory of the floating cum tilted-wick type solar still has been proposed. Analytical expressions have been derived for the temperatures of the different components of the proposed system. To elucidate the analytical results, numerical calculations have been carried out using meteorological parameters for a typical summer day in Coimbatore. The analytical results are found to be in close agreement with the experimental results. (authors)

  12. Kinetic quantification of plyometric exercise intensity.

    Science.gov (United States)

    Ebben, William P; Fauth, McKenzie L; Garceau, Luke R; Petushek, Erich J

    2011-12-01

Ebben, WP, Fauth, ML, Garceau, LR, and Petushek, EJ. Kinetic quantification of plyometric exercise intensity. J Strength Cond Res 25(12): 3288-3298, 2011. Quantification of plyometric exercise intensity is necessary to understand the characteristics of these exercises and the proper progression of this mode of exercise. The purpose of this study was to assess the kinetic characteristics of a variety of plyometric exercises. This study also sought to assess gender differences in these variables. Twenty-six men and 23 women with previous experience in performing plyometric training served as subjects. The subjects performed a variety of plyometric exercises including line hops, 15.24-cm cone hops, squat jumps, tuck jumps, countermovement jumps (CMJs), loaded CMJs equal to 30% of 1 repetition maximum squat, depth jumps normalized to the subject's jump height (JH), and single leg jumps. All plyometric exercises were assessed with a force platform. Outcome variables associated with the takeoff, airborne, and landing phase of each plyometric exercise were evaluated. These variables included the peak vertical ground reaction force (GRF) during takeoff, the time to takeoff, flight time, JH, peak power, landing rate of force development, and peak vertical GRF during landing. A 2-way mixed analysis of variance with repeated measures for plyometric exercise type demonstrated main effects of exercise type for all outcome variables (p ≤ 0.05) and an interaction between gender and peak vertical GRF during takeoff (p ≤ 0.05). Bonferroni-adjusted pairwise comparisons identified a number of differences between the plyometric exercises for the outcome variables assessed (p ≤ 0.05). These findings can be used to guide the progression of plyometric training by incorporating exercises of increasing intensity over the course of a program.
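For readers unfamiliar with force-platform reductions, the sketch below shows the standard ballistic-flight estimate of jump height from flight time; the numbers are invented, and this is not the study's analysis code.

```python
# Hedged sketch: jump height from flight time under the ballistic assumption
# JH = g * t^2 / 8; the flight time is an invented placeholder.
g = 9.81                              # gravitational acceleration, m/s^2
flight_time = 0.52                    # s, as read off a force platform trace
jump_height = g * flight_time**2 / 8  # m
print(f"estimated jump height: {jump_height:.2f} m")
```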

  13. Effectiveness of training on preventative nutritional behaviors for type-2 diabetes among the female adolescents: Examination of theory of planned behavior.

    Science.gov (United States)

    Maleki, Farzaneh; Hosseini Nodeh, Zahra; Rahnavard, Zahra; Arab, Masoume

    2016-01-01

Since type-2 diabetes is the most common chronic disease among Iranian female adolescents, we applied the theory of planned behavior to examine the effect of training on the intention to adopt preventive nutritional behaviors for type-2 diabetes among female adolescents. In this experimental study, 200 girls aged 11-14 years from 8 schools in Tehran (100 each in the intervention and control groups) were recruited by a two-stage cluster sampling method. For the intervention group, an educational program was designed based on the theory of planned behavior and presented in 6 workshop sessions to prevent type-2 diabetes. The data were collected before and two months after the workshops using a valid and reliable (α=0.72 and r=0.80) author-made questionnaire based on Ajzen's TPB questionnaire manual. The data were analyzed using the t-test, chi-square test and analysis of covariance. Findings indicate that the two groups were homogeneous regarding demographic characteristics before the education, but the mean score of the theory components (attitudes, subjective norms, perceived behavioral control, and intention) was higher in the control group. All of the theory components increased significantly after the education in the intervention group (p=0.000). Training based on the theory of planned behavior enhances the intention to adhere to preventive nutritional behaviors for type-2 diabetes among the studied female adolescents.

  14. Prevalence, quantification and typing of adenoviruses detected in river and treated drinking water in South Africa.

    Science.gov (United States)

    van Heerden, J; Ehlers, M M; Heim, A; Grabow, W O K

    2005-01-01

risk of infection constituted by these viruses. The risk of infection may have implications for the management of drinking water quality. This study is unique as it is the first report on the quantification and typing of HAds in treated drinking water and river water. These baseline data are necessary for the meaningful assessment of the potential risk of infection constituted by these viruses.

  15. Quantification of phosphorus in single cells using synchrotron X-ray fluorescence

    Energy Technology Data Exchange (ETDEWEB)

    Núñez-Milland, Daliángelis R. [Department of Chemistry and Biochemistry, University of South Carolina, Columbia, SC 29208 (United States); Baines, Stephen B. [Department of Ecology and Evolution, Stony Brook University, Stony Brook, NY 11755 (United States); Vogt, Stefan [Experimental Facilities Division, Advanced Photon Source, Argonne National Laboratory, Argonne, IL (United States); Twining, Benjamin S., E-mail: btwining@bigelow.org [Department of Chemistry and Biochemistry, University of South Carolina, Columbia, SC 29208 (United States)

    2010-07-01

    Phosphorus abundance was quantified in individual phytoplankton cells by synchrotron X-ray fluorescence and compared with bulk spectrophotometric measurements to confirm accuracy of quantification. Figures of merit for P quantification on three different types of transmission electron microscopy grids are compared to assess possible interferences. Phosphorus is required for numerous cellular compounds and as a result can serve as a useful proxy for total cell biomass in studies of cell elemental composition. Single-cell analysis by synchrotron X-ray fluorescence (SXRF) enables quantitative and qualitative analyses of cell elemental composition with high elemental sensitivity. Element standards are required to convert measured X-ray fluorescence intensities into element concentrations, but few appropriate standards are available, particularly for the biologically important element P. Empirical P conversion factors derived from other elements contained in certified thin-film standards were used to quantify P in the model diatom Thalassiosira pseudonana, and the measured cell quotas were compared with those measured in bulk by spectrophotometry. The mean cellular P quotas quantified with SXRF for cells on Au, Ni and nylon grids using this approach were not significantly different from each other or from those measured spectrophotometrically. Inter-cell variability typical of cell populations was observed. Additionally, the grid substrates were compared for their suitability to P quantification based on the potential for spectral interferences with P. Nylon grids were found to have the lowest background concentrations and limits of detection for P, while background concentrations in Ni and Au grids were 1.8- and 6.3-fold higher. The advantages and disadvantages of each grid type for elemental analysis of individual phytoplankton cells are discussed.

  16. Prospects of using the second-order perturbation theory of the MP2 type in the theory of electron scattering by polyatomic molecules

    Energy Technology Data Exchange (ETDEWEB)

    Čársky, Petr [J. Heyrovský Institute of Physical Chemistry, Academy of Sciences of the Czech Republic, v.i.i., Dolejškova 3, 18223 Prague 8 (Czech Republic)

    2015-01-22

The second-order perturbation theory has so far been applied only to the hydrogen molecule; no application has been attempted for any other molecule, probably because of the technical difficulties of such calculations. The purpose of this contribution is to show that calculations of this type are now feasible for larger polyatomic molecules, even on commonly used computers.

  17. N=1 field theory duality from M theory

    International Nuclear Information System (INIS)

    Schmaltz, M.; Sundrum, R.

    1998-01-01

We investigate Seiberg's N=1 field theory duality for four-dimensional supersymmetric QCD with the M-theory 5-brane. We find that the M-theory configuration for the magnetic dual theory arises via a smooth deformation of the M-theory configuration for the electric theory. The creation of Dirichlet 4-branes as Neveu-Schwarz 5-branes are passed through each other in type IIA string theory is given an elegant derivation from M theory. copyright 1998 The American Physical Society

  18. Comparison between magnetic force microscopy and electron back-scatter diffraction for ferrite quantification in type 321 stainless steel

    Energy Technology Data Exchange (ETDEWEB)

    Warren, A.D., E-mail: Xander.Warren@bristol.ac.uk [Interface Analysis Centre, HH Wills Laboratory, University of Bristol, Bristol BS8 1FD (United Kingdom); Harniman, R.L. [School of Chemistry, University of Bristol, Bristol BS8 1 TS (United Kingdom); Collins, A.M. [School of Chemistry, University of Bristol, Bristol BS8 1 TS (United Kingdom); Bristol Centre for Functional Nanomaterials, Nanoscience and Quantum Information Centre, University of Bristol, Bristol BS8 1FD (United Kingdom); Davis, S.A. [School of Chemistry, University of Bristol, Bristol BS8 1 TS (United Kingdom); Younes, C.M. [Interface Analysis Centre, HH Wills Laboratory, University of Bristol, Bristol BS8 1FD (United Kingdom); Flewitt, P.E.J. [Interface Analysis Centre, HH Wills Laboratory, University of Bristol, Bristol BS8 1FD (United Kingdom); School of Physics, HH Wills Laboratory, University of Bristol, Bristol BS8 1FD (United Kingdom); Scott, T.B. [Interface Analysis Centre, HH Wills Laboratory, University of Bristol, Bristol BS8 1FD (United Kingdom)

    2015-01-15

    Several analytical techniques that are currently available can be used to determine the spatial distribution and amount of austenite, ferrite and precipitate phases in steels. The application of magnetic force microscopy, in particular, to study the local microstructure of stainless steels is beneficial due to the selectivity of this technique for detection of ferromagnetic phases. In the comparison of Magnetic Force Microscopy and Electron Back-Scatter Diffraction for the morphological mapping and quantification of ferrite, the degree of sub-surface measurement has been found to be critical. Through the use of surface shielding, it has been possible to show that Magnetic Force Microscopy has a measurement depth of 105–140 nm. A comparison of the two techniques together with the depth of measurement capabilities are discussed. - Highlights: • MFM used to map distribution and quantify ferrite in type 321 stainless steels. • MFM results compared with EBSD for same region, showing good spatial correlation. • MFM gives higher area fraction of ferrite than EBSD due to sub-surface measurement. • From controlled experiments MFM depth sensitivity measured from 105 to 140 nm. • A correction factor to calculate area fraction from MFM data is estimated.

  19. Critical aspects of data analysis for quantification in laser-induced breakdown spectroscopy

    Science.gov (United States)

    Motto-Ros, V.; Syvilay, D.; Bassel, L.; Negre, E.; Trichard, F.; Pelascini, F.; El Haddad, J.; Harhira, A.; Moncayo, S.; Picard, J.; Devismes, D.; Bousquet, B.

    2018-02-01

In this study, a collaborative contest focused on LIBS data processing was conducted in an original way: the participants did not analyze the same samples on their own LIBS experiments, but instead all worked from a set of LIBS spectra obtained from one single experiment. Each participant was asked to provide the predicted concentrations of several elements for two glass samples. The analytical contest revealed a wide diversity of results among participants, even when the same spectral lines were considered for the analysis. A parametric study was then conducted to investigate the influence of each step of the data processing. This study was based on several analytical figures of merit, such as the determination coefficient, uncertainty, limit of quantification and prediction ability (i.e., trueness). It was then possible to interpret the results provided by the participants, emphasizing that the type of data extraction, the baseline modeling and the calibration model play key roles in the quantification performance of the technique. This work provides a set of recommendations based on a systematic evaluation of the quantification procedure, with the aim of optimizing the methodological steps toward the standardization of LIBS.

  20. An information theory account of cognitive control.

    Science.gov (United States)

    Fan, Jin

    2014-01-01

    Our ability to efficiently process information and generate appropriate responses depends on the processes collectively called cognitive control. Despite a considerable focus in the literature on the cognitive control of information processing, neural mechanisms underlying control are still unclear, and have not been characterized by considering the quantity of information to be processed. A novel and comprehensive account of cognitive control is proposed using concepts from information theory, which is concerned with communication system analysis and the quantification of information. This account treats the brain as an information-processing entity where cognitive control and its underlying brain networks play a pivotal role in dealing with conditions of uncertainty. This hypothesis and theory article justifies the validity and properties of such an account and relates experimental findings to the frontoparietal network under the framework of information theory.

  1. An information theory account of cognitive control

    Directory of Open Access Journals (Sweden)

    Jin eFan

    2014-09-01

    Full Text Available Our ability to efficiently process information and generate appropriate responses depends on the processes collectively called cognitive control. Despite a considerable focus in the literature on the cognitive control of information processing, neural mechanisms underlying control are still unclear, and have not been characterized by considering the quantity of information to be processed. A novel and comprehensive account of cognitive control is proposed using concepts from information theory, which is concerned with communication system analysis and the quantification of information. This account treats the brain as an information-processing entity where cognitive control and its underlying brain networks play a pivotal role in dealing with conditions of uncertainty. This hypothesis and theory article justifies the validity and properties of such an account and relates experimental findings to the frontoparietal network under the framework of information theory.
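As a concrete anchor for the information-theoretic quantities this account builds on, the sketch below computes the entropy of a stimulus ensemble and the surprise of a single event; the probabilities are invented for illustration and do not come from the article.

```python
# Hedged sketch: Shannon entropy (average uncertainty) of a stimulus set
# and the surprise -log2(p) of one event, the basic quantities an
# information-theoretic account of cognitive control operates on.
import math

def entropy_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

stimulus_probs = [0.5, 0.25, 0.125, 0.125]   # illustrative ensemble
print(f"ensemble entropy: {entropy_bits(stimulus_probs):.2f} bits")
print(f"surprise of the rarest event: {-math.log2(0.125):.2f} bits")
```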

  2. A methodology for uncertainty quantification in quantitative technology valuation based on expert elicitation

    Science.gov (United States)

    Akram, Muhammad Farooq Bin

The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject matter expert knowledge, statistical methods and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and background. Because of the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation is critical. The uncertainty in defining the impact of an input on the performance parameters of a system makes it difficult to use traditional probability theory; often the available information is not enough to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions in a portfolio: when multiple technologies are applied simultaneously to a system, their cumulative impact is often nonlinear, whereas current methods assume that technologies are either incompatible or linearly independent. In cases of lack of knowledge about the problem, epistemic uncertainty is the most suitable representation, since it reduces the number of assumptions made during elicitation, where experts would otherwise be forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques. In the present research it is proposed that interval analysis and the Dempster-Shafer theory of evidence are better suited for the quantification of epistemic uncertainty in the technology valuation process. The proposed technique seeks to offset some of the problems faced by using deterministic or traditional probabilistic approaches for
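To make the proposed epistemic-uncertainty machinery concrete, here is a minimal sketch of Dempster's rule of combination for two hypothetical expert opinions over a technology-impact frame {low, med, high}; the frame and all mass assignments are invented, not taken from the thesis.

```python
# Hedged sketch: Dempster's rule of combination for two belief-mass
# functions over the frame {low, med, high}. Masses are placeholders.
from itertools import product

L, M, H = "low", "med", "high"
expert1 = {frozenset({L, M}): 0.6, frozenset({L, M, H}): 0.4}
expert2 = {frozenset({M}): 0.5, frozenset({M, H}): 0.3, frozenset({L, M, H}): 0.2}

combined, conflict = {}, 0.0
for (a, m1), (b, m2) in product(expert1.items(), expert2.items()):
    inter = a & b
    if inter:                              # agreement: mass to intersection
        combined[inter] = combined.get(inter, 0.0) + m1 * m2
    else:                                  # disagreement: mass to conflict K
        conflict += m1 * m2
combined = {k: v / (1.0 - conflict) for k, v in combined.items()}
for focal, mass in sorted(combined.items(), key=lambda kv: -kv[1]):
    print(set(focal), round(mass, 3))
```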

  3. Virus detection and quantification using electrical parameters

    Science.gov (United States)

    Ahmad, Mahmoud Al; Mustafa, Farah; Ali, Lizna M.; Rizvi, Tahir A.

    2014-10-01

Here we identify and quantitate two similar viruses, human and feline immunodeficiency viruses (HIV and FIV), suspended in a liquid medium, without labeling, using a semiconductor technique. The virus count was estimated by calculating the impurities inside a defined volume by observing the change in electrical parameters. Empirically, the virus count was similar to the absolute value of the ratio of the change in the dopant concentration of the virus suspension relative to the mock suspension to the change in the Debye volume of the virus suspension relative to the mock Debye volume. The virus type was identified by constructing a concentration-mobility relationship, which is unique for each kind of virus, allowing for fast (within minutes) and label-free virus quantification and identification. For validation, the HIV and FIV virus preparations were further quantified by a biochemical technique, and the results obtained by both approaches corroborated well. We further demonstrate that the electrical technique can be applied to accurately measure and characterize silica nanoparticles that resemble virus particles in size. Based on these results, we anticipate our present approach to be a starting point towards establishing the foundation for label-free electrical identification and quantification of an unlimited number of viruses and other nano-sized particles.
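Read literally, the stated empirical relation is count ≈ |Δn/ΔV_D|, where Δn is the dopant-concentration change of the virus suspension relative to the mock and ΔV_D the corresponding Debye-volume change. The sketch below simply restates that reading with invented numbers; both the interpretation and all values are assumptions, not the authors' data.

```python
# Hedged sketch: the abstract's empirical count relation taken literally,
# count ~ |(n_virus - n_mock) / (Vd_virus - Vd_mock)|. All numbers are
# invented placeholders with no physical authority.
n_virus, n_mock = 2.4e14, 1.0e14        # extracted dopant concentrations
vd_virus, vd_mock = 3.1e-12, 2.9e-12    # extracted Debye volumes

count = abs((n_virus - n_mock) / (vd_virus - vd_mock))
print(f"estimated particle count ~ {count:.2e}")
```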

  4. Practical Applications of Generalizability Theory for Designing, Evaluating, and Improving Psychological Assessments.

    Science.gov (United States)

    Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat

    2018-01-01

    In this article, we illustrate how generalizability theory (G-theory) can extend traditional assessment methods for designing, improving, and evaluating results from both objectively and subjectively scored measures of individual differences. Our illustrations include quantification of multiple sources of measurement error, derivation of unique indexes of consistency for norm- and criterion-referenced interpretations of scores, estimation of score consistency when changing a measurement procedure, and disattenuation of correlation coefficients for measurement error. We also expand G-theory analyses beyond the item level to include parcels and split measures and highlight linkages among G-theory, classical test theory, and structural equation modeling. Computer code and sample data are provided in online supplements to help readers apply the demonstrated techniques to their own assessments.
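A minimal sketch of the kind of G-theory computation the article demonstrates: estimating variance components for a one-facet persons-by-items design from ANOVA mean squares, then forming generalizability and dependability coefficients. The design, data and formula choices are illustrative assumptions, not the authors' code or examples.

```python
# Hedged sketch: one-facet (persons x items) G-study on synthetic scores.
import numpy as np

rng = np.random.default_rng(1)
n_p, n_i = 50, 8
person = rng.normal(0, 1.0, (n_p, 1))          # true person effects
item = rng.normal(0, 0.5, (1, n_i))            # item difficulty effects
scores = 5 + person + item + rng.normal(0, 0.8, (n_p, n_i))

grand = scores.mean()
ms_p = n_i * np.sum((scores.mean(1) - grand)**2) / (n_p - 1)
ms_i = n_p * np.sum((scores.mean(0) - grand)**2) / (n_i - 1)
resid = scores - scores.mean(1, keepdims=True) - scores.mean(0, keepdims=True) + grand
ms_pi = np.sum(resid**2) / ((n_p - 1) * (n_i - 1))

var_pi = ms_pi                                  # sigma^2(pi, e)
var_p = (ms_p - ms_pi) / n_i                    # sigma^2(p)
var_i = (ms_i - ms_pi) / n_p                    # sigma^2(i)

# Norm-referenced (G) and criterion-referenced (Phi) coefficients for a
# measure averaged over n_i items.
g_coef = var_p / (var_p + var_pi / n_i)
phi = var_p / (var_p + (var_i + var_pi) / n_i)
print(f"G = {g_coef:.3f}, Phi = {phi:.3f}")
```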

  5. On the effective theory of type II string compactifications on nilmanifolds and coset spaces

    International Nuclear Information System (INIS)

    Caviezel, Claudio

    2009-01-01

In this thesis we analyzed a large number of type IIA strict SU(3)-structure compactifications with fluxes and O6/D6-sources, as well as type IIB static SU(2)-structure compactifications with fluxes and O5/O7-sources. Restricting to structures and fluxes that are constant in the basis of left-invariant one-forms, these models are tractable enough to allow for an explicit derivation of the four-dimensional low-energy effective theory. The six-dimensional compact manifolds we studied in this thesis are, on the one hand, nilmanifolds based on nilpotent Lie algebras and, on the other hand, coset spaces based on semisimple and U(1)-groups, which admit a left-invariant strict SU(3)- or static SU(2)-structure. In particular, from the set of 34 distinct nilmanifolds we identified two nilmanifolds, the torus and the Iwasawa manifold, that allow for an AdS4, N=1 type IIA strict SU(3)-structure solution, and one nilmanifold allowing for an AdS4, N=1 type IIB static SU(2)-structure solution. From the set of all possible six-dimensional coset spaces, we identified seven coset spaces suitable for strict SU(3)-structure compactifications, four of which also allow for a static SU(2)-structure compactification. For all these models we calculated the four-dimensional low-energy effective theory using N=1 supergravity techniques. In order to write down the most general four-dimensional effective action, we also studied how to classify the different disconnected "bubbles" in moduli space. (orig.)

  6. Social cognitive theory correlates of moderate-intensity exercise among adults with type 2 diabetes.

    Science.gov (United States)

    Heiss, Valerie J; Petosa, R L

    2016-01-01

    The purpose of this study was to identify social cognitive theory (SCT) correlates of moderate- to vigorous-intensity exercise (MVPA) among adults with type 2 diabetes. Adults with type 2 diabetes (N = 181) participated in the study. Participants were recruited through ResearchMatch.org to complete an online survey. The survey used previously validated instruments to measure dimensions of self-efficacy, self-regulation, social support, outcome expectations, the physical environment, and minutes of MVPA per week. Spearman Rank Correlations were used to determine the relationship between SCT variables and MVPA. Classification and Regression Analysis using a decision tree model was used to determine the amount of variance in MVPA explained by SCT variables. Due to low levels of vigorous activity, only moderate-intensity exercise (MIE) was analyzed. SCT variables explained 42.4% of the variance in MIE. Self-monitoring, social support from family, social support from friends, and self-evaluative outcome expectations all contributed to the variability in MIE. Other contributing variables included self-reward, task self-efficacy, social outcome expectations, overcoming barriers, and self-efficacy for making time for exercise. SCT is a useful theory for identifying correlates of MIE among adults with type 2 diabetes. The SCT correlates can be used to refine diabetes education programs to target the adoption and maintenance of regular exercise.

  7. Uncertainty quantification and experimental design based on unsupervised machine learning identification of contaminant sources and groundwater types using hydrogeochemical data

    Science.gov (United States)

    Vesselinov, V. V.

    2017-12-01

    Identification of the original groundwater types present in geochemical mixtures observed in an aquifer is a challenging but very important task. Frequently, some of the groundwater types are related to different infiltration and/or contamination sources associated with various geochemical signatures and origins. The characterization of groundwater mixing processes typically requires solving complex inverse models representing groundwater flow and geochemical transport in the aquifer, where the inverse analysis accounts for available site data. Usually, the model is calibrated against the available data characterizing the spatial and temporal distribution of the observed geochemical species. Numerous geochemical constituents and processes may need to be simulated in these models which further complicates the analyses. As a result, these types of model analyses are typically extremely challenging. Here, we demonstrate a new contaminant source identification approach that performs decomposition of the observation mixtures based on Nonnegative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a custom semi-supervised clustering algorithm. Our methodology, called NMFk, is capable of identifying (a) the number of groundwater types and (b) the original geochemical concentration of the contaminant sources from measured geochemical mixtures with unknown mixing ratios without any additional site information. We also demonstrate how NMFk can be extended to perform uncertainty quantification and experimental design related to real-world site characterization. The NMFk algorithm works with geochemical data represented in the form of concentrations, ratios (of two constituents; for example, isotope ratios), and delta notations (standard normalized stable isotope ratios). The NMFk algorithm has been extensively tested on synthetic datasets; NMFk analyses have been actively performed on real-world data collected at the Los Alamos National
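The core decomposition step can be illustrated with an ordinary NMF from scikit-learn: observed geochemical mixtures X (samples by constituents) are factorized into nonnegative mixing ratios W and source signatures H. NMFk itself adds custom semi-supervised clustering over repeated runs to select the number of sources; that part, and all numbers below, are illustrative assumptions rather than a reproduction of the authors' algorithm.

```python
# Hedged sketch: blind source separation of geochemical mixtures with NMF.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(2)
n_wells, n_species, n_sources = 20, 6, 3
H_true = rng.uniform(0.1, 1.0, (n_sources, n_species))    # source signatures
W_true = rng.dirichlet(np.ones(n_sources), size=n_wells)  # mixing ratios
X = W_true @ H_true + 0.01 * rng.uniform(size=(n_wells, n_species))

model = NMF(n_components=n_sources, init="nndsvda", max_iter=2000, random_state=0)
W = model.fit_transform(X)       # estimated mixing ratios (up to scaling)
H = model.components_            # estimated source concentrations
print("reconstruction error:", round(model.reconstruction_err_, 4))
```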

  8. Quantification of the detriment and comparison of health risks. Methodological problems

    International Nuclear Information System (INIS)

    Jammet, H.

    1982-01-01

Some of the methodological problems involved in quantitative estimation of the health detriment of different energy sources and in risk comparison are described. First, the determination of the detriment is discussed from the point of view of the distortions introduced into the quantification when dealing with risks for which the amount of available information varies widely. The main criteria applied in classifying types of detriment are then recalled. Finally, the problems involved in comparisons are outlined: spatial and temporal variations in the types of detriment, operation under normal and accident conditions, and the risks to the public and to workers. (author)

  9. Quantification of thermal damage in skin tissue

    Institute of Scientific and Technical Information of China (English)

    Xu Feng; Wen Ting; Lu Tianjian; Seffen Keith

    2008-01-01

Skin thermal damage, or skin burns, is the most commonly encountered type of trauma in civilian and military communities. In addition, advances in laser, microwave and similar technologies have led to recent developments of thermal treatments for disease and damage involving skin tissue, where the objective is to induce thermal damage precisely within targeted tissue structures without affecting the surrounding healthy tissue. Further, the extended pain sensation induced by thermal damage poses a great problem for burn patients. It is therefore of great importance to quantify thermal damage in skin tissue. In this paper, the available models and experimental methods for the quantification of thermal damage in skin tissue are discussed.
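Among the models such reviews discuss, the classic quantification is the Henriques-Moritz Arrhenius damage integral; a minimal sketch follows, using commonly cited constants for skin. Treat it as an illustration of this family of models, not as the paper's specific model or parameter values.

```python
# Hedged sketch: Arrhenius damage integral Omega(t) = int A*exp(-dE/(R*T)) dt.
import numpy as np

A = 3.1e98      # frequency factor, 1/s (commonly cited value for skin)
dE = 6.28e5     # activation energy, J/mol
R = 8.314       # gas constant, J/(mol K)

def damage_integral(t, T_kelvin):
    """Trapezoidal integration of the Arrhenius damage rate over time."""
    rate = A * np.exp(-dE / (R * T_kelvin))
    return float(np.sum(0.5 * (rate[1:] + rate[:-1]) * np.diff(t)))

t = np.linspace(0.0, 10.0, 1001)            # 10-s exposure
T = np.full_like(t, 273.15 + 60.0)          # constant 60 C tissue temperature
omega = damage_integral(t, T)
print(f"Omega = {omega:.1f}")               # Omega >= 1: irreversible damage
```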

  10. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Conrad, Patrick [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Bigoni, Daniele [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Parno, Matthew [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2017-06-09

    QUEST (\\url{www.quest-scidac.org}) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT

  11. Demonstration of a viable quantitative theory for interplanetary type II radio bursts

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, J. M., E-mail: jschmidt@physics.usyd.edu.au; Cairns, Iver H. [School of Physics, Physics Road, Building A28, University of Sydney, NSW 2006 (Australia)

    2016-03-25

Between 29 November and 1 December 2013 the two widely separated spacecraft STEREO A and B observed a long lasting, intermittent, type II radio burst over the extended frequency range ≈4 MHz to 30 kHz, including an intensification when the shock wave of the associated coronal mass ejection (CME) reached STEREO A. We demonstrate for the first time our ability to quantitatively and accurately simulate the fundamental (F) and harmonic (H) emission of type II bursts from the higher corona (near 11 solar radii) to 1 AU. Our modeling requires the combination of data-driven three-dimensional magnetohydrodynamic simulations for the CME and plasma background, carried out with the BATS-R-US code, with an analytic quantitative kinetic model for both F and H radio emission, including the electron reflection at the shock, the growth of Langmuir waves and radio waves, and the radiation's propagation to an arbitrary observer. The intensities and frequencies of the observed radio emissions vary hugely, by factors ≈10^6 and ≈10^3, respectively; the theoretical predictions are impressively accurate, being typically in error by less than a factor of 10 and 20%, respectively, for both STEREO A and B. We also obtain accurate predictions for the timing and characteristics of the shock and local radio onsets at STEREO A, the lack of such onsets at STEREO B, and the z-component of the magnetic field at STEREO A ahead of the shock and in the sheath. These multiple agreements provide very strong support for the theory, the efficacy of the BATS-R-US code, and the vision of using type IIs and associated data-theory iterations to predict whether a CME will impact Earth's magnetosphere and drive space weather events.

  12. Increased accuracy of starch granule type quantification using mixture distributions.

    Science.gov (United States)

    Tanaka, Emi; Ral, Jean-Phillippe F; Li, Sean; Gaire, Raj; Cavanagh, Colin R; Cullis, Brian R; Whan, Alex

    2017-01-01

The proportion of granule types in wheat starch is an important characteristic that can affect its functionality. It is widely accepted that granules are either large, disc-shaped A-type granules or small, spherical B-type granules; additionally, there are some reports of tiny C-type granules. The differences between these granule types are due to their carbohydrate composition and crystallinity, which are highly, but not perfectly, correlated with granule size. A majority of the studies that have considered granule types analyse them based on a size threshold rather than chemical composition, which is understandable given the expense of separating starch into different types. While the use of a size threshold to classify granule type is a low-cost measure, it results in misclassification. We present an alternative, statistical method to quantify the proportion of granule types by fitting a mixture distribution, along with an R package, a web based app and a video tutorial on how to use the web app to enable its straightforward application. Our results show that the reliability of the genotypic effects increases by approximately 60% when the proportions of A-type and B-type granules are estimated by the mixture distribution rather than the standard size-threshold measure, although there was a marginal drop in reliability for C-type granules. The latter is likely due to the low observed genetic variance for C-type granules. The determination of the proportion of granule types from the size distribution is better achieved by using the mixing probabilities from the fit of the mixture distribution rather than a size threshold.
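A minimal sketch of the idea, using a two-component Gaussian mixture on log granule sizes in place of the authors' specific mixture model and R package: the fitted mixing weights, rather than a hard size threshold, give the granule-type proportions. The size distributions below are synthetic assumptions.

```python
# Hedged sketch: granule-type proportions from a fitted mixture distribution.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
# Synthetic log10(diameter, um): B-type small, A-type large, 70/30 mix.
sizes = np.concatenate([rng.normal(0.6, 0.15, 7000),   # B-type
                        rng.normal(1.3, 0.10, 3000)])  # A-type

gmm = GaussianMixture(n_components=2, random_state=0).fit(sizes.reshape(-1, 1))
order = np.argsort(gmm.means_.ravel())                 # small -> large
props = gmm.weights_[order]
print(f"B-type: {props[0]:.2%}, A-type: {props[1]:.2%}")
```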

  13. Digital PCR for direct quantification of viruses without DNA extraction.

    Science.gov (United States)

    Pavšič, Jernej; Žel, Jana; Milavec, Mojca

    2016-01-01

    DNA extraction before amplification is considered an essential step for quantification of viral DNA using real-time PCR (qPCR). However, this can directly affect the final measurements due to variable DNA yields and removal of inhibitors, which leads to increased inter-laboratory variability of qPCR measurements and reduced agreement on viral loads. Digital PCR (dPCR) might be an advantageous methodology for the measurement of virus concentrations, as it does not depend on any calibration material and it has higher tolerance to inhibitors. DNA quantification without an extraction step (i.e. direct quantification) was performed here using dPCR and two different human cytomegalovirus whole-virus materials. Two dPCR platforms were used for this direct quantification of the viral DNA, and these were compared with quantification of the extracted viral DNA in terms of yield and variability. Direct quantification of both whole-virus materials present in simple matrices like cell lysate or Tris-HCl buffer provided repeatable measurements of virus concentrations that were probably in closer agreement with the actual viral load than when estimated through quantification of the extracted DNA. Direct dPCR quantification of other viruses, reference materials and clinically relevant matrices is now needed to show the full versatility of this very promising and cost-efficient development in virus quantification.
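The arithmetic that makes dPCR calibration-free is the Poisson correction from the fraction of positive partitions to a concentration; a minimal sketch with invented counts follows (the platforms used in the study implement this internally).

```python
# Hedged sketch: Poisson estimate of target concentration from dPCR counts.
import math

positives, partitions = 4100, 20000      # hypothetical endpoint counts
v_partition_ul = 0.85e-3                 # partition volume, microliters

p = positives / partitions               # fraction of positive partitions
lam = -math.log(1.0 - p)                 # mean copies per partition (Poisson)
conc = lam / v_partition_ul              # copies per microliter of reaction
print(f"{conc:.0f} copies/uL")
```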

  14. Increased accuracy of starch granule type quantification using mixture distributions

    OpenAIRE

    Tanaka, Emi; Ral, Jean-Phillippe F.; Li, Sean; Gaire, Raj; Cavanagh, Colin R.; Cullis, Brian R.; Whan, Alex

    2017-01-01

Background The proportion of granule types in wheat starch is an important characteristic that can affect its functionality. It is widely accepted that granules are either large, disc-shaped A-type granules or small, spherical B-type granules. Additionally, there are some reports of tiny C-type granules. The differences between these granule types are due to their carbohydrate composition and crystallinity, which are highly, but not perfectly, correlated with granule size. A majority...

  15. Aspects of Moduli Stabilization in Type IIB String Theory

    Directory of Open Access Journals (Sweden)

    Shaaban Khalil

    2016-01-01

Full Text Available We review moduli stabilization in type IIB string theory compactification with fluxes. We focus on KKLT and the Large Volume Scenario (LVS). We show that the soft SUSY breaking terms predicted in the KKLT model are not phenomenologically viable. In LVS, the following result for the scalar mass, gaugino mass, and trilinear term is obtained: m0 = m1/2 = -A0 = m3/2, which may account for the Higgs mass limit if m3/2 ~ O(1.5 TeV). However, in this case the relic abundance of the lightest neutralino cannot be consistent with the measured limits. We also study the cosmological consequences of moduli stabilization in both models. In particular, the associated inflation models, such as racetrack inflation and Kähler inflation, are analyzed. Finally, the problem of moduli destabilization and the effect of string moduli backreaction on the inflation models are discussed.

  16. A Variational Statistical-Field Theory for Polar Liquid Mixtures

    Science.gov (United States)

    Zhuang, Bilin; Wang, Zhen-Gang

Using a variational field-theoretic approach, we derive a molecularly based theory for polar liquid mixtures. The resulting theory consists of simple algebraic expressions for the free energy of mixing and the dielectric constant as functions of mixture composition. Using only the dielectric constants and the molar volumes of the pure liquid constituents, the theory evaluates the mixture dielectric constants in good agreement with the experimental values for a wide range of liquid mixtures, without using adjustable parameters. In addition, the theory predicts that liquids with similar dielectric constants and molar volumes dissolve well in each other, while sufficient disparity in these parameters results in phase separation. The calculated miscibility map on the dielectric constant-molar volume axes agrees well with known experimental observations for a large number of liquid pairs. Thus the theory provides a quantification of the well-known empirical "like dissolves like" rule. BZ acknowledges the A*STAR fellowship for financial support.

  17. Use of a medication quantification scale for comparison of pain medication usage in patients with complex regional pain syndrome (CRPS).

    Science.gov (United States)

    Gallizzi, Michael A; Khazai, Ravand S; Gagnon, Christine M; Bruehl, Stephen; Harden, R Norman

    2015-03-01

To correlate the amount and types of pain medications prescribed to CRPS patients, quantified using the Medication Quantification Scale, with patients' subjective pain levels. An international, multisite, retrospective review. University medical centers in the United States, Israel, Germany, and the Netherlands. A total of 89 subjects were enrolled from four different countries: 27 from the United States, 20 from Germany, 18 from the Netherlands, and 24 from Israel. The main outcome measures used were the Medication Quantification Scale III and a numerical analog pain scale. There was no statistically significant correlation between the Medication Quantification Scale and the pain scale at any site except for a moderate positive correlation at the German sites. The Medication Quantification Scale mean differences between the United States and Germany, the Netherlands, and Israel were 9.793 (P ... CRPS patients and would be useful in further prospective studies of pain medication prescription practices in the CRPS population worldwide. Wiley Periodicals, Inc.
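For orientation, an MQS-style score combines, for each drug in the regimen, a detriment weight with a dosage level and sums the products; the sketch below illustrates that structure only, with invented weights and levels rather than the published MQS III tables.

```python
# Hedged sketch: MQS-style scoring. Each drug contributes
# (detriment weight) x (dosage level); the regimen total is the sum.
# Weights and levels are invented placeholders, not MQS III values.
regimen = [
    ("NSAID",          2.4, 2),   # (class, detriment weight, dosage level)
    ("opioid",         4.8, 3),
    ("antidepressant", 2.3, 1),
]
score = sum(weight * level for _, weight, level in regimen)
print(f"MQS-style total: {score:.1f}")
```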

  18. To the theory of the first-type phase transformations for many variables

    International Nuclear Information System (INIS)

    Fateev, M.P.

    2002-01-01

The multidimensional theory of first-order phase transitions near a one-dimensional saddle point is considered. Transformations of the variables describing the nucleation of the new phase are proposed that make it possible to separate the variables in the Fokker-Planck equation and thus to reduce the problem to a one-dimensional one. The distribution function and the nucleation rate are determined for both the stationary and nonstationary nucleation stages. As an example, the problem of volatile liquid boiling is considered for the case in which two parameters characterize the nucleation of the new phase

  19. Whole farm quantification of GHG emissions within smallholder farms in developing countries

    International Nuclear Information System (INIS)

    Seebauer, Matthias

    2014-01-01

The IPCC has compiled the best available scientific methods into published guidelines for estimating greenhouse gas (GHG) emissions and emission removals from the land-use sector. To evaluate whether existing GHG quantification tools can comprehensively quantify GHG emissions and removals under smallholder conditions, farm-scale quantification was tested with farm data from Western Kenya. After a cluster analysis to identify different farm typologies, GHG quantification was exercised using the VCS SALM methodology complemented with IPCC livestock emission factors and the Cool Farm Tool. The emission profiles of four farm clusters representing the baseline conditions in the year 2009 are compared with 2011, when farmers had adopted sustainable land management practices (SALM). The results demonstrate the variation in both the magnitude of the estimated GHG emissions per ha between different smallholder farm typologies and the emissions estimated by applying two different accounting tools. The farm-scale quantification further shows that the adoption of SALM has a significant impact on emission reductions and removals, with mitigation benefits ranging between 4 and 6.5 tCO2 ha^-1 yr^-1 and differing significantly depending on the typologies of the crop-livestock systems, their different agricultural practices, and the adoption rates of improved practices. However, the inherent uncertainty in the emission factors applied by the accounting tools has substantial implications for reported agricultural emissions. With regard to uncertainty in activity data, the assessment confirms the high variability within different farm types as well as between the different parameters surveyed to comprehensively quantify GHG emissions within smallholder farms. (paper)

  20. Krichever-Novikov type algebras theory and applications

    CERN Document Server

    Schlichenmaier, Martin

    2014-01-01

Krichever and Novikov introduced certain classes of infinite-dimensional Lie algebras to extend the Virasoro algebra and its related algebras to Riemann surfaces of higher genus. The author of this book generalized and extended them to a more general setting needed by the applications. Examples of applications are Conformal Field Theory, Wess-Zumino-Novikov-Witten models, moduli space problems, integrable systems, Lax operator algebras, and deformation theory of Lie algebras. Furthermore, they constitute an important class of infinite-dimensional Lie algebras which due to their geometric origin are

  1. Constraints on Nonlinear and Stochastic Growth Theories for Type 3 Solar Radio Bursts from the Corona to 1 AU

    Science.gov (United States)

    Cairns, Iver H.; Robinson, P. A.

    1998-01-01

    Existing, competing theories for coronal and interplanetary type III solar radio bursts appeal to one or more of modulational instability, electrostatic (ES) decay processes, or stochastic growth physics to preserve the electron beam, limit the levels of Langmuir-like waves driven by the beam, and produce wave spectra capable of coupling nonlinearly to generate the observed radio emission. Theoretical constraints exist on the wavenumbers and relative sizes of the wave bandwidth and nonlinear growth rate for which Langmuir waves are subject to modulational instability and the parametric and random phase versions of ES decay. A constraint also exists on whether stochastic growth theory (SGT) is appropriate. These constraints are evaluated here using the beam, plasma, and wave properties (1) observed in specific interplanetary type III sources, (2) predicted nominally for the corona, and (3) predicted at heliocentric distances greater than a few solar radii by power-law models based on interplanetary observations. It is found that the Langmuir waves driven directly by the beam have wavenumbers that are almost always too large for modulational instability but are appropriate to ES decay. Even for waves scattered to lower wavenumbers (by ES decay, for instance), the wave bandwidths are predicted to be too large and the nonlinear growth rates too small for modulational instability to occur for the specific interplanetary events studied or the great majority of Langmuir wave packets in type III sources at arbitrary heliocentric distances. Possible exceptions are for very rare, unusually intense, narrowband wave packets, predominantly close to the Sun, and for the front portion of very fast beams traveling through unusually dilute, cold solar wind plasmas. Similar arguments demonstrate that the ES decay should proceed almost always as a random phase process rather than a parametric process, with similar exceptions. These results imply that it is extremely rare for

  2. Methods for the physical characterization and quantification of extracellular vesicles in biological samples.

    Science.gov (United States)

    Rupert, Déborah L M; Claudio, Virginia; Lässer, Cecilia; Bally, Marta

    2017-01-01

    Our body fluids contain a multitude of cell-derived vesicles, secreted by most cell types, commonly referred to as extracellular vesicles. They have attracted considerable attention for their function as intercellular communication vehicles in a broad range of physiological processes and pathological conditions. Extracellular vesicles and especially the smallest type, exosomes, have also generated a lot of excitement in view of their potential as disease biomarkers or as carriers for drug delivery. In this context, state-of-the-art techniques capable of comprehensively characterizing vesicles in biological fluids are urgently needed. This review presents the arsenal of techniques available for quantification and characterization of physical properties of extracellular vesicles, summarizes their working principles, discusses their advantages and limitations and further illustrates their implementation in extracellular vesicle research. The small size and physicochemical heterogeneity of extracellular vesicles make their physical characterization and quantification an extremely challenging task. Currently, structure, size, buoyant density, optical properties and zeta potential have most commonly been studied. The concentration of vesicles in suspension can be expressed in terms of biomolecular or particle content depending on the method at hand. In addition, common quantification methods may either provide a direct quantitative measurement of vesicle concentration or solely allow for relative comparison between samples. The combination of complementary methods capable of detecting, characterizing and quantifying extracellular vesicles at a single particle level promises to provide new exciting insights into their modes of action and to reveal the existence of vesicle subpopulations fulfilling key biological tasks. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. New approach for the quantification of processed animal proteins in feed using light microscopy.

    Science.gov (United States)

    Veys, P; Baeten, V

    2010-07-01

    A revision of European Union's total feed ban on animal proteins in feed will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive preserving all morphological markers of bones necessary for accurate identification and precision counting; (2) the use of a counting grid eyepiece reticle; and (3) new definitions for correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved to be effortless to apply. The results obtained were very close to the expected values of contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable, but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed.

  4. Quantifiers and the Foundations of Quasi-Set Theory

    Directory of Open Access Journals (Sweden)

    Jonas R. Becker Arenhart

    2009-12-01

    Full Text Available In this paper we discuss some questions proposed by Prof. Newton da Costa on the foundations of quasi-set theory. His main doubts concern the possibility of a reasonable semantical understanding of the theory, mainly due to the fact that identity and difference do not apply to some entities of the theory’s intended domain of discourse. According to him, the quantifiers employed in the theory, when understood in the usual way, rely on the assumption that identity applies to all entities in the domain of discourse. Inspired by his provocation, we suggest that, using some ideas presented by da Costa himself in his seminars at UFSC (the Federal University of Santa Catarina and by one of us (DK in some papers, these difficulties can be overcome both on a formal level and on an informal level, showing how quantification over items for which identity does not make sense can be understood without presupposing a semantics based on a ‘classical’ set theory.

  5. Quantification of the N-terminal propeptide of human procollagen type I (PINP): comparison of ELISA and RIA with respect to different molecular forms

    DEFF Research Database (Denmark)

    Jensen, Charlotte Harken; Hansen, M; Brandt, J

    1998-01-01

This paper compares the results of procollagen type I N-terminal propeptide (PINP) quantification by radioimmunoassay (RIA) and enzyme-linked immunosorbent assay (ELISA). PINP in serum from a patient with uremic hyperparathyroidism was measured by RIA and ELISA at 20 micrograms l-1 and 116 ... -PAGE). Analysis of fractions from size-separated amniotic fluid, serum and dialysis fluid demonstrated that the RIA failed to measure the low molecular weight form of PINP. However, the anti-PINP supplied with the RIA kit and the anti-PINP applied in the ELISA reacted equally well with both molecular forms ... of PINP when analysed in a direct ELISA. It is concluded that the major difference in the ELISA and RIA results is due to assay efficacy with respect to the low molecular weight form of PINP.

  6. Subnuclear foci quantification using high-throughput 3D image cytometry

    Science.gov (United States)

    Wadduwage, Dushan N.; Parrish, Marcus; Choi, Heejin; Engelward, Bevin P.; Matsudaira, Paul; So, Peter T. C.

    2015-07-01

Ionising radiation causes various types of DNA damage, including double strand breaks (DSBs). DSBs are recognized by the DNA repair protein ATM, which phosphorylates histone H2AX to form gamma-H2AX foci at the site of the DSBs that can be visualized using immunohistochemistry. However, most such experiments are of low throughput in terms of imaging and image analysis techniques, and most studies still use manual counting or classification. Hence they are limited to counting a low number of foci per cell (5 foci per nucleus), as the quantification process is extremely labour intensive. We have therefore developed a high-throughput instrumentation and computational pipeline specialized for gamma-H2AX foci quantification. A population of cells with highly clustered foci inside nuclei was imaged in 3D with submicron resolution, using an in-house developed high-throughput image cytometer. Imaging speeds as high as 800 cells/second in 3D were achieved by using HiLo wide-field depth-resolved imaging and a remote z-scanning technique. The number of foci per cell nucleus was then quantified using an algorithm based on the 3D extended maxima transform. Our results suggest that while most other 2D imaging and manual quantification studies can count only up to about 5 foci per nucleus, our method is capable of counting more than 100. Moreover, we show that 3D analysis is significantly superior to the 2D techniques.
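A minimal sketch of foci counting with an extended-maxima-style transform, here via scikit-image's h-maxima followed by connected-component labeling on a synthetic 3D "nucleus"; the actual pipeline, its parameters and its data are not reproduced.

```python
# Hedged sketch: counting clustered 3D foci with h-maxima + labeling.
import numpy as np
from scipy import ndimage as ndi
from skimage.morphology import h_maxima
from skimage.measure import label

rng = np.random.default_rng(4)
img = np.zeros((32, 64, 64))
for _ in range(12):                               # 12 synthetic foci
    z, y, x = rng.integers(4, 28), rng.integers(8, 56), rng.integers(8, 56)
    img[z, y, x] = 1.0
img = ndi.gaussian_filter(img, sigma=1.5)         # blur points into blobs
img += 0.001 * rng.standard_normal(img.shape)     # mild camera noise

# Suppress maxima shallower than h, then label the surviving peaks.
peaks = h_maxima(img, h=0.01)
n_foci = label(peaks).max()
print("foci counted:", n_foci)                    # roughly the 12 seeded foci
```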

  7. Quantification of type I error probabilities for heterogeneity LOD scores.

    Science.gov (United States)

    Abreu, Paula C; Hodge, Susan E; Greenberg, David A

    2002-02-01

    Locus heterogeneity is a major confounding factor in linkage analysis. When no prior knowledge of linkage exists, and one aims to detect linkage and heterogeneity simultaneously, classical distribution theory of log-likelihood ratios does not hold. Despite some theoretical work on this problem, no generally accepted practical guidelines exist. Nor has anyone rigorously examined the combined effect of testing for linkage and heterogeneity while simultaneously maximizing over two genetic models (dominant, recessive). The effect of linkage phase represents another uninvestigated issue. Using computer simulation, we investigated the type I error (P value) of the "admixture" heterogeneity LOD (HLOD) score, i.e., the LOD score maximized over both the recombination fraction θ and the admixture parameter α, and compared this with the P values obtained when one maximizes only with respect to θ (i.e., the standard LOD score). We generated datasets of phase-known and -unknown nuclear families, of sizes k = 2, 4, and 6 children, under fully penetrant autosomal dominant inheritance. We analyzed these datasets (1) assuming a single genetic model and maximizing the HLOD over θ and α; and (2) maximizing the HLOD additionally over two dominance models (dominant vs. recessive), then subtracting a 0.3 correction. For both (1) and (2), P values increased with family size k; rose less for phase-unknown families than for phase-known ones, with the former approaching the latter as k increased; and did not exceed the one-sided mixture distribution ξ = (1/2)χ²₁ + (1/2)χ²₂. Thus, maximizing the HLOD over θ and α appears to add considerably less than an additional degree of freedom to the associated χ²₁ distribution. We conclude with practical guidelines for linkage investigators. Copyright 2002 Wiley-Liss, Inc.
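
    As a companion to the mixture distribution quoted above, here is a minimal sketch (assuming SciPy) that converts an observed HLOD score to its asymptotic upper-bound P value via ξ = (1/2)χ²₁ + (1/2)χ²₂; the example score is illustrative.

        # P-value upper bound for an observed HLOD under the 50:50 chi-square mixture.
        import math
        from scipy.stats import chi2

        def hlod_p_upper_bound(hlod):
            x = 2.0 * math.log(10.0) * hlod   # LOD score -> likelihood-ratio statistic
            return 0.5 * chi2.sf(x, df=1) + 0.5 * chi2.sf(x, df=2)

        print(hlod_p_upper_bound(3.0))  # bound for an HLOD score of 3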

  8. (1) H-MRS processing parameters affect metabolite quantification

    DEFF Research Database (Denmark)

    Bhogal, Alex A; Schür, Remmelt R; Houtepen, Lotte C

    2017-01-01

    Proton magnetic resonance spectroscopy (1H-MRS) can be used to quantify in vivo metabolite levels, such as lactate, γ-aminobutyric acid (GABA) and glutamate (Glu). However, there are considerable analysis choices which can alter the accuracy or precision of 1H-MRS metabolite quantification... We investigated the influence of model parameters and spectral quantification software on fitted metabolite concentration values. Sixty spectra in 30 individuals (repeated measures) were acquired using a 7-T MRI scanner. Data were processed by four independent research groups with the freedom to choose their own... ... + NAAG/Cr + PCr and Glu/Cr + PCr, respectively. Metabolite quantification using identical 1H-MRS data was influenced by processing parameters, basis sets and software choice. Locally preferred processing choices affected metabolite quantification, even when using identical software. Our results...

  9. Benchmarking common quantification strategies for large-scale phosphoproteomics

    DEFF Research Database (Denmark)

    Hogrebe, Alexander; von Stechow, Louise; Bekker-Jensen, Dorte B

    2018-01-01

    Comprehensive mass spectrometry (MS)-based proteomics is now feasible, but reproducible quantification remains challenging, especially for post-translational modifications such as phosphorylation. Here, we compare the most popular quantification techniques for global phosphoproteomics: label-free...

  10. The Power of M Theory

    OpenAIRE

    Schwarz, John H.

    1995-01-01

    A proposed duality between type IIB superstring theory on R^9 X S^1 and a conjectured 11D fundamental theory (``M theory'') on R^9 X T^2 is investigated. Simple heuristic reasoning leads to a consistent picture relating the various p-branes and their tensions in each theory. Identifying the M theory on R^{10} X S^1 with type IIA superstring theory on R^{10}, in a similar fashion, leads to various relations among the p-branes of the IIA theory.

  11. Superstring theory

    International Nuclear Information System (INIS)

    Schwarz, J.H.

    1985-01-01

    Dual string theories, initially developed as phenomenological models of hadrons, now appear more promising as candidates for a unified theory of fundamental interactions. Type I superstring theory (SST I) is a ten-dimensional theory of interacting open and closed strings, with one supersymmetry, that is free from ghosts and tachyons. It requires that an SO(n) or Sp(2n) gauge group be used. A light-cone-gauge string action with space-time supersymmetry automatically incorporates the superstring restrictions and leads to the discovery of type II superstring theory (SST II). SST II is an interacting theory of closed strings only, with two D=10 supersymmetries, that is also free from ghosts and tachyons. By taking six of the spatial dimensions to form a compact space, it becomes possible to reconcile the models with our four-dimensional perception of spacetime and to define low-energy limits in which SST I reduces to N=4, D=4 super Yang-Mills theory and SST II reduces to N=8, D=4 supergravity theory. The superstring theories can be described by a light-cone-gauge action principle based on fields that are functionals of string coordinates. With this formalism any physical quantity should be calculable. There is some evidence that, unlike any conventional field theory, the superstring theories provide perturbatively renormalizable (SST I) or finite (SST II) unifications of gravity with other interactions

  12. Optimal search behavior and classic foraging theory

    International Nuclear Information System (INIS)

    Bartumeus, F; Catalan, J

    2009-01-01

    Random walk methods and diffusion theory pervaded ecological sciences as methods to analyze and describe animal movement. Consequently, statistical physics was mostly seen as a toolbox rather than as a conceptual framework that could contribute to theory on evolutionary biology and ecology. However, the existence of mechanistic relationships and feedbacks between behavioral processes and statistical patterns of movement suggests that, beyond movement quantification, statistical physics may prove to be an adequate framework to understand animal behavior across scales from an ecological and evolutionary perspective. Recently developed random search theory has served to critically re-evaluate classic ecological questions on animal foraging. For instance, during the last few years, there has been a growing debate on whether search behavior can include traits that improve success by optimizing random (stochastic) searches. Here, we stress the need to bring together the general encounter problem within foraging theory, as a means of making progress in the biological understanding of random searching. By sketching the assumptions of optimal foraging theory (OFT) and by summarizing recent results on random search strategies, we pinpoint ways to extend classic OFT, and integrate the study of search strategies and its main results into the more general theory of optimal foraging.

  13. Colour thresholding and objective quantification in bioimaging

    Science.gov (United States)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black and white densitometry (0-256 levels of intensity) the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested, but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.
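
    A minimal sketch of the basic colour-thresholding idea in Python, assuming scikit-image is available; the hue window and saturation cutoff are illustrative assumptions, not values from the paper.

        # Fraction of an RGB image whose hue falls inside a chosen window.
        import numpy as np
        from skimage.color import rgb2hsv

        def stained_fraction(rgb, hue_lo, hue_hi, min_sat=0.2):
            hsv = rgb2hsv(rgb)                      # hue/sat/value in [0, 1]
            mask = ((hsv[..., 0] >= hue_lo) & (hsv[..., 0] <= hue_hi)
                    & (hsv[..., 1] >= min_sat))     # ignore near-grey pixels
            return float(mask.mean())

        img = np.random.rand(64, 64, 3)             # stand-in for a micrograph
        print(stained_fraction(img, 0.55, 0.75))    # e.g. a blue-ish stain window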

  14. Cues, quantification, and agreement in language comprehension.

    Science.gov (United States)

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension.

  15. Performance of the Real-Q EBV Quantification Kit for Epstein-Barr Virus DNA Quantification in Whole Blood.

    Science.gov (United States)

    Huh, Hee Jae; Park, Jong Eun; Kim, Ji Youn; Yun, Sun Ae; Lee, Myoung Keun; Lee, Nam Yong; Kim, Jong Won; Ki, Chang Seok

    2017-03-01

    There has been increasing interest in standardized and quantitative Epstein-Barr virus (EBV) DNA testing for the management of EBV disease. We evaluated the performance of the Real-Q EBV Quantification Kit (BioSewoom, Korea) in whole blood (WB). Nucleic acid extraction and real-time PCR were performed by using the MagNA Pure 96 (Roche Diagnostics, Germany) and 7500 Fast real-time PCR system (Applied Biosystems, USA), respectively. Assay sensitivity, linearity, and conversion factor were determined by using the World Health Organization international standard diluted in EBV-negative WB. We used 81 WB clinical specimens to compare performance of the Real-Q EBV Quantification Kit and artus EBV RG PCR Kit (Qiagen, Germany). The limit of detection (LOD) and limit of quantification (LOQ) for the Real-Q kit were 453 and 750 IU/mL, respectively. The conversion factor from EBV genomic copies to IU was 0.62. The linear range of the assay was from 750 to 10⁶ IU/mL. Viral load values measured with the Real-Q assay were on average 0.54 log₁₀ copies/mL higher than those measured with the artus assay. The Real-Q assay offered good analytical performance for EBV DNA quantification in WB.
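
    A minimal sketch of the two quantitative relations reported above: the copies-to-IU conversion factor (0.62) and a comparison of two assays on the log₁₀ scale. The example viral loads are illustrative.

        import math

        COPIES_TO_IU = 0.62   # conversion factor reported for the Real-Q assay

        def copies_to_iu(copies_per_ml):
            return copies_per_ml * COPIES_TO_IU

        def log10_difference(load_a, load_b):
            return math.log10(load_a) - math.log10(load_b)

        print(copies_to_iu(1.0e4))               # 10,000 copies/mL expressed in IU/mL
        print(log10_difference(3.5e4, 1.0e4))    # ~0.54 log10, the reported offset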

  16. Optimizing total reflection X-ray fluorescence for direct trace element quantification in proteins I: Influence of sample homogeneity and reflector type

    Science.gov (United States)

    Wellenreuther, G.; Fittschen, U. E. A.; Achard, M. E. S.; Faust, A.; Kreplin, X.; Meyer-Klaucke, W.

    2008-12-01

    Total reflection X-ray fluorescence (TXRF) is a very promising method for the direct, quick and reliable multi-elemental quantification of trace elements in protein samples. With the introduction of an internal standard consisting of two reference elements, scandium and gallium, a wide range of proteins can be analyzed, regardless of their salt content, buffer composition, additives and amino acid composition. This strategy also enables quantification of matrix effects. Two potential issues associated with drying have been considered in this study: (1) Formation of heterogeneous residues of varying thickness and/or density; and (2) separation of the internal standard and protein during drying (which has to be prevented to allow accurate quantification). These issues were investigated by microbeam X-ray fluorescence (μXRF) with special emphasis on (I) the influence of sample support and (II) the protein / buffer system used. In the first part, a model protein was studied on well established sample supports used in TXRF, PIXE and XRF (Mylar, siliconized quartz, Plexiglas and silicon). In the second part we imaged proteins of different molecular weight, oligomerization state, bound metals and solubility. A partial separation of protein and internal standard was only observed with untreated silicon, suggesting it may not be an adequate support material. Siliconized quartz proved to be the least prone to heterogeneous drying of the sample and yielded the most reliable results.

  17. Optimizing total reflection X-ray fluorescence for direct trace element quantification in proteins I: Influence of sample homogeneity and reflector type

    Energy Technology Data Exchange (ETDEWEB)

    Wellenreuther, G. [European Molecular Biology Laboratory, Notkestr. 85, 22603 Hamburg (Germany); Fittschen, U.E.A. [Department of Chemistry, University of Hamburg, Martin-Luther-King-Platz 6, 20146 Hamburg (Germany); Achard, M.E.S.; Faust, A.; Kreplin, X. [European Molecular Biology Laboratory, Notkestr. 85, 22603 Hamburg (Germany); Meyer-Klaucke, W. [European Molecular Biology Laboratory, Notkestr. 85, 22603 Hamburg (Germany)], E-mail: Wolfram@embl-hamburg.de

    2008-12-15

    Total reflection X-ray fluorescence (TXRF) is a very promising method for the direct, quick and reliable multi-elemental quantification of trace elements in protein samples. With the introduction of an internal standard consisting of two reference elements, scandium and gallium, a wide range of proteins can be analyzed, regardless of their salt content, buffer composition, additives and amino acid composition. This strategy also enables quantification of matrix effects. Two potential issues associated with drying have been considered in this study: (1) Formation of heterogeneous residues of varying thickness and/or density; and (2) separation of the internal standard and protein during drying (which has to be prevented to allow accurate quantification). These issues were investigated by microbeam X-ray fluorescence (μXRF) with special emphasis on (I) the influence of sample support and (II) the protein / buffer system used. In the first part, a model protein was studied on well established sample supports used in TXRF, PIXE and XRF (Mylar, siliconized quartz, Plexiglas and silicon). In the second part we imaged proteins of different molecular weight, oligomerization state, bound metals and solubility. A partial separation of protein and internal standard was only observed with untreated silicon, suggesting it may not be an adequate support material. Siliconized quartz proved to be the least prone to heterogeneous drying of the sample and yielded the most reliable results.

  18. Selective Distance-Based K+ Quantification on Paper-Based Microfluidics.

    Science.gov (United States)

    Gerold, Chase T; Bakker, Eric; Henry, Charles S

    2018-04-03

    In this study, paper-based microfluidic devices (μPADs) capable of K⁺ quantification in aqueous samples, as well as in human serum, using both colorimetric and distance-based methods are described. A lipophilic phase containing potassium ionophore I (valinomycin) was utilized to achieve highly selective quantification of K⁺ in the presence of Na⁺, Li⁺, and Mg²⁺ ions. Successful addition of a suspended lipophilic phase to a wax printed paper-based device is described and offers a solution to current approaches that rely on organic solvents, which damage wax barriers. The approach provides an avenue for future alkali/alkaline quantification utilizing μPADs. Colorimetric spot tests allowed for K⁺ quantification from 0.1-5.0 mM using only 3.00 μL of sample solution. Selective distance-based quantification required small sample volumes (6.00 μL) and gave responses sensitive enough to distinguish between 1.0 and 2.5 mM of sample K⁺. μPADs using distance-based methods were also capable of differentiating between 4.3 and 6.9 mM K⁺ in human serum samples. Distance-based methods required no digital analysis, electronic hardware, or pumps; any steps required for quantification could be carried out using the naked eye.

  19. Renormalization group theory for percolation in time-varying networks.

    Science.gov (United States)

    Karschau, Jens; Zimmerling, Marco; Friedrich, Benjamin M

    2018-05-22

    Motivated by multi-hop communication in unreliable wireless networks, we present a percolation theory for time-varying networks. We develop a renormalization group theory for a prototypical network on a regular grid, where individual links switch stochastically between active and inactive states. The question whether a given source node can communicate with a destination node along paths of active links is equivalent to a percolation problem. Our theory maps the temporal existence of multi-hop paths on an effective two-state Markov process. We show analytically how this Markov process converges towards a memoryless Bernoulli process as the hop distance between source and destination node increases. Our work extends classical percolation theory to the dynamic case and elucidates temporal correlations of message losses. Quantification of temporal correlations has implications for the design of wireless communication and control protocols, e.g. in cyber-physical systems such as self-organized swarms of drones or smart traffic networks.
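
    The mapping onto a two-state process is easy to illustrate numerically. The sketch below (plain NumPy) lets each link of a chain switch stochastically between active and inactive, takes the end-to-end path state as the AND over links, and prints its lag-1 autocorrelation, which shrinks with hop count as the path process approaches a memoryless Bernoulli process; the switching probabilities and sizes are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        def simulate_path(n_links, steps, p_up=0.3, p_down=0.1):
            links = np.ones(n_links, dtype=bool)
            path = np.empty(steps, dtype=bool)
            for t in range(steps):
                up = rng.random(n_links) < p_up      # inactive -> active
                down = rng.random(n_links) < p_down  # active -> inactive
                links = np.where(links, ~down, up)
                path[t] = links.all()                # path exists iff all links do
            return path.astype(float)

        for n in (1, 2, 4, 8):
            x = simulate_path(n, 200_000)
            print(n, "hops: lag-1 autocorrelation =",
                  round(np.corrcoef(x[:-1], x[1:])[0, 1], 3))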

  20. Synthesis and Review: Advancing agricultural greenhouse gas quantification

    International Nuclear Information System (INIS)

    Olander, Lydia P; Wollenberg, Eva; Tubiello, Francesco N; Herold, Martin

    2014-01-01

    Reducing emissions of agricultural greenhouse gases (GHGs), such as methane and nitrous oxide, and sequestering carbon in the soil or in living biomass can help reduce the impact of agriculture on climate change while improving productivity and reducing resource use. There is an increasing demand for improved, low cost quantification of GHGs in agriculture, whether for national reporting to the United Nations Framework Convention on Climate Change (UNFCCC), underpinning and stimulating improved practices, establishing crediting mechanisms, or supporting green products. This ERL focus issue highlights GHG quantification to call attention to our existing knowledge and opportunities for further progress. In this article we synthesize the findings of 21 papers on the current state of global capability for agricultural GHG quantification and visions for its improvement. We conclude that strategic investment in quantification can lead to significant global improvement in agricultural GHG estimation in the near term. (paper)

  1. Simultaneous quantification of amoxicillin and potassium clavulanate in different commercial drugs using PIXE technique

    International Nuclear Information System (INIS)

    Bejjani, A.; Roumié, M.; Akkad, S.; El-Yazbi, F.; Nsouli, B.

    2016-01-01

    We have demonstrated, in previous studies, that Particle Induced X-ray Emission (PIXE) is one of the most rapid and accurate choices for the quantification of an active ingredient in a solid drug, from the reactions induced on its specific heteroatom, using pellets made from original tablets. In this work, PIXE is used for the first time for the simultaneous quantification of two active ingredients, amoxicillin trihydrate and potassium clavulanate, in six different commercial antibiotic drugs. Since the quality-control process of a drug covers a large number of samples, the scope of this study was also to find the most rapid and low-cost sample preparation needed to analyze these drugs with good precision. The chosen drugs were analyzed in their tablets' "as received" form, in pellets made from the powder of the tablets, and in pellets made from the powder of the tablets after being heated at up to 70 °C until constant weight, to remove humidity while avoiding any molecular destruction. The validity of the quantification with respect to each sample preparation (homogeneity of the drug components and humidity) is presented and discussed.

  2. Simultaneous quantification of amoxicillin and potassium clavulanate in different commercial drugs using PIXE technique

    Energy Technology Data Exchange (ETDEWEB)

    Bejjani, A., E-mail: abejjani@cnrs.edu.lb [IBA Laboratory, Lebanese Atomic Energy Commission-CNRS, P.O. Box: 11-8281, Beirut (Lebanon); Roumié, M. [IBA Laboratory, Lebanese Atomic Energy Commission-CNRS, P.O. Box: 11-8281, Beirut (Lebanon); Akkad, S. [Facutly of Pharmacy, Department of Pharmaceutical Analytical Chemistry, Beirut Arab University, Beirut (Lebanon); El-Yazbi, F. [Faculty of Pharmacy, Alexandria University, P.O. Box: 21521, Elmesalla, Alexandria (Egypt); Nsouli, B. [IBA Laboratory, Lebanese Atomic Energy Commission-CNRS, P.O. Box: 11-8281, Beirut (Lebanon)

    2016-03-15

    We have demonstrated, in previous studies, that Particle Induced X-ray Emission (PIXE) is one of the most rapid and accurate choices for the quantification of an active ingredient in a solid drug, from the reactions induced on its specific heteroatom, using pellets made from original tablets. In this work, PIXE is used for the first time for the simultaneous quantification of two active ingredients, amoxicillin trihydrate and potassium clavulanate, in six different commercial antibiotic drugs. Since the quality-control process of a drug covers a large number of samples, the scope of this study was also to find the most rapid and low-cost sample preparation needed to analyze these drugs with good precision. The chosen drugs were analyzed in their tablets' "as received" form, in pellets made from the powder of the tablets, and in pellets made from the powder of the tablets after being heated at up to 70 °C until constant weight, to remove humidity while avoiding any molecular destruction. The validity of the quantification with respect to each sample preparation (homogeneity of the drug components and humidity) is presented and discussed.

  3. Processing and quantification of x-ray energy dispersive spectra in the Analytical Electron Microscope

    International Nuclear Information System (INIS)

    Zaluzec, N.J.

    1988-08-01

    Spectral processing in x-ray energy dispersive spectroscopy deals with the extraction of characteristic signals from experimental data. In this text, the four basic procedures of this methodology are reviewed and their limitations outlined. Quantification, on the other hand, deals with the interpretation of the information obtained from spectral processing. Here the limitations are for the most part instrumental in nature. The prospect of higher-voltage operation does not, in theory, present any new problems and may in fact prove to be more desirable, assuming that electron damage effects do not preclude analysis. 28 refs., 6 figs

  4. A performance study on three qPCR quantification kits and their compatibilities with the 6-dye DNA profiling systems.

    Science.gov (United States)

    Lin, Sze-Wah; Li, Christina; Ip, Stephen C Y

    2018-03-01

    DNA quantification plays an integral role in forensic DNA profiling. Not only does it estimate the total amount of amplifiable human autosomal and male DNA to ensure optimal amplification of target DNA for subsequent analysis, but it also assesses the extraction efficiency and purity of the DNA extract. The latest DNA quantification systems even offer an estimate of the degree of DNA degradation in a sample. Here, we report the performance of three new-generation qPCR kits, namely the Investigator® Quantiplex HYres Kit from QIAGEN, the Quantifiler® Trio DNA Quantification Kit from Applied Biosystems™, and the PowerQuant® System from Promega, and their compatibilities with three 6-dye DNA profiling systems. Our results have demonstrated that all three kits generate standard curves with satisfactory consistency and reproducibility, and are capable of screening out traces of male DNA in the presence of a 30-fold excess of female DNA. They also exhibit a higher tolerance to PCR inhibition than the Quantifiler® Human DNA Quantification Kit from Applied Biosystems™ in autosomal DNA quantification. PowerQuant®, as compared to Quantiplex HYres and Quantifiler® Trio, shows better precision for both autosomal and male DNA quantification. Quantifiler® Trio and PowerQuant®, in contrast to Quantiplex HYres, offer better correlations with lower discrepancies between autosomal and male DNA quantification, and their additional degradation-index features provide a detection platform for inhibited and/or degraded DNA templates. Regarding the compatibility between these quantification and profiling systems: (1) both Quantifiler® Trio and PowerQuant® work well with GlobalFiler and Fusion 6C, allowing a fairly accurate prediction of their DNA typing results based on the quantification values; (2) Quantiplex HYres offers a fairly reliable IPC system for detecting any potential inhibition on Investigator 24plex, whereas Quantifiler® Trio and PowerQuant® suit better for GlobalFiler and Fusion 6C.

  5. Use of wall-less 18F-doped gelatin phantoms for improved volume delineation and quantification in PET/CT

    International Nuclear Information System (INIS)

    Sydoff, Marie; Andersson, Martin; Mattsson, Sören; Leide-Svegborn, Sigrid

    2014-01-01

    Positron emission tomography (PET) with 18F-FDG is a valuable tool for staging, planning treatment, and evaluating the treatment response for many different types of tumours. The correct volume estimation is of utmost importance in these situations. To date, the most common types of phantoms used in volume quantification in PET utilize fillable, hollow spheres placed in a circular or elliptical cylinder made of polymethyl methacrylate. However, the presence of a non-radioactive sphere wall between the hotspot and the background activity in images of this type of phantom could cause inaccuracies. To investigate the influence of the non-active walls, we developed a phantom without non-active sphere walls for volume delineation and quantification in PET. Three sizes of gelatin hotspots were moulded and placed in a Jaszczak phantom together with hollow plastic spheres of the same sizes containing the same activity concentration. 18F PET measurements were made with zero background activity and with tumour-to-background ratios of 12.5, 10, 7.5, and 5. The background-corrected volume-reproducing threshold, Tvol, was calculated for both the gelatin and the plastic spheres. It was experimentally verified that the apparent background dependence of Tvol, i.e., a decreasing Tvol with increasing background fraction, was not present for wall-less spheres; the opposite results were seen in plastic, hollow spheres in commercially-available phantoms. For the types of phantoms commonly used in activity quantification, the estimation of Tvol using fillable, hollow, plastic spheres with non-active walls would lead to an overestimate of the tumour volume, especially for small volumes in a high activity background. (paper)
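
    A minimal sketch of the volume-reproducing threshold idea, assuming a known object volume and a NumPy image array; the toy sphere, voxel size, blur, and background level are illustrative assumptions.

        import numpy as np
        from scipy import ndimage

        def find_tvol(img, true_volume_ml, voxel_ml, background):
            """Fraction of (max - background) whose isocontour matches the volume."""
            peak = img.max()
            best_t, best_err = 0.5, float("inf")
            for t in np.linspace(0.05, 0.95, 181):
                level = background + t * (peak - background)
                err = abs((img > level).sum() * voxel_ml - true_volume_ml)
                if err < best_err:
                    best_t, best_err = t, err
            return best_t

        # Toy hot sphere (radius 10 voxels) in a warm background, blurred like PET.
        z, y, x = np.ogrid[-24:24, -24:24, -24:24]
        img = ndimage.gaussian_filter(
            np.where(x**2 + y**2 + z**2 <= 100, 4.0, 1.0), 3)
        true_vol = (4 / 3) * np.pi * 10**3 * 0.001   # voxel volume 0.001 mL
        print(find_tvol(img, true_vol, 0.001, background=1.0))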

  6. Use of wall-less 18F-doped gelatin phantoms for improved volume delineation and quantification in PET/CT

    Science.gov (United States)

    Sydoff, Marie; Andersson, Martin; Mattsson, Sören; Leide-Svegborn, Sigrid

    2014-03-01

    Positron emission tomography (PET) with 18F-FDG is a valuable tool for staging, planning treatment, and evaluating the treatment response for many different types of tumours. The correct volume estimation is of utmost importance in these situations. To date, the most common types of phantoms used in volume quantification in PET utilize fillable, hollow spheres placed in a circular or elliptical cylinder made of polymethyl methacrylate. However, the presence of a non-radioactive sphere wall between the hotspot and the background activity in images of this type of phantom could cause inaccuracies. To investigate the influence of the non-active walls, we developed a phantom without non-active sphere walls for volume delineation and quantification in PET. Three sizes of gelatin hotspots were moulded and placed in a Jaszczak phantom together with hollow plastic spheres of the same sizes containing the same activity concentration. 18F PET measurements were made with zero background activity and with tumour-to-background ratios of 12.5, 10, 7.5, and 5. The background-corrected volume reproducing threshold, Tvol, was calculated for both the gelatin and the plastic spheres. It was experimentally verified that the apparent background dependence of Tvol, i.e., a decreasing Tvol with increasing background fraction, was not present for wall-less spheres; the opposite results were seen in plastic, hollow spheres in commercially-available phantoms. For the types of phantoms commonly used in activity quantification, the estimation of Tvol using fillable, hollow, plastic spheres with non-active walls would lead to an overestimate of the tumour volume, especially for small volumes in a high activity background.

  7. Critical points of DNA quantification by real-time PCR--effects of DNA extraction method and sample matrix on quantification of genetically modified organisms.

    Science.gov (United States)

    Cankar, Katarina; Stebih, Dejan; Dreo, Tanja; Zel, Jana; Gruden, Kristina

    2006-08-14

    Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary criterion by which to
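
    Since PCR efficiency is singled out above as the crucial parameter, a minimal sketch of how it is typically derived from a dilution-series standard curve may help; the Cq values below are illustrative, and E = 10^(-1/slope) - 1 is the standard relation (an ideal assay gives slope ≈ -3.32, i.e. E ≈ 100%).

        import numpy as np

        log10_template = np.array([0.0, 1.0, 2.0, 3.0, 4.0])  # log10 of DNA amount
        cq = np.array([33.1, 29.8, 26.5, 23.2, 19.9])         # quantification cycles

        slope, _ = np.polyfit(log10_template, cq, 1)
        efficiency = 10.0 ** (-1.0 / slope) - 1.0
        print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")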

  8. Quantification of maltol in Korean ginseng (Panax ginseng) products by high-performance liquid chromatography-diode array detector

    Science.gov (United States)

    Jeong, Hyun Cheol; Hong, Hee-Do; Kim, Young-Chan; Rhee, Young Kyoung; Choi, Sang Yoon; Kim, Kyung-Tack; Kim, Sung Soo; Lee, Young-Chul; Cho, Chang-Won

    2015-01-01

    Background: Maltol, a type of phenolic compound, is produced by the browning reaction during the high-temperature treatment of ginseng. Thus, maltol can be used as a marker for the quality control of various ginseng products manufactured by high-temperature treatment, including red ginseng. For the quantification of maltol in Korean ginseng products, an effective high-performance liquid chromatography-diode array detector (HPLC-DAD) method was developed. Materials and Methods: The HPLC-DAD method for maltol quantification, coupled with a liquid-liquid extraction (LLE) method, was developed and validated in terms of linearity, precision, and accuracy. An HPLC separation was performed on a C18 column. Results: The LLE methods and HPLC running conditions for maltol quantification were optimized. The calibration curve of maltol exhibited good linearity (R² = 1.00). The limit of detection for maltol was 0.26 μg/mL, and the limit of quantification was 0.79 μg/mL. The relative standard deviations (RSDs) of the intra- and inter-day experiments were <1.27% and 0.61%, respectively. The results of the recovery test were 101.35–101.75%, with RSD values of 0.21–1.65%. The developed method was applied successfully to quantify the maltol in three ginseng products manufactured by different methods. Conclusion: The results of validation demonstrated that the proposed HPLC-DAD method was useful for the quantification of maltol in various ginseng products. PMID:26246746
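
    A minimal sketch of the validation arithmetic used above, via the common ICH formulas LOD = 3.3σ/S and LOQ = 10σ/S (σ = residual standard deviation, S = slope of the calibration line); the calibration data are illustrative, not the paper's.

        import numpy as np

        conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])         # ug/mL
        area = np.array([10.2, 20.5, 40.3, 101.0, 200.8, 401.5])  # peak areas

        slope, intercept = np.polyfit(conc, area, 1)
        sigma = (area - (slope * conc + intercept)).std(ddof=2)   # two fitted params

        print(f"LOD = {3.3 * sigma / slope:.3f} ug/mL")
        print(f"LOQ = {10.0 * sigma / slope:.3f} ug/mL")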

  9. Development and application of bio-sample quantification to evaluate stability and pharmacokinetics of inulin-type fructo-oligosaccharides from Morinda Officinalis.

    Science.gov (United States)

    Chi, Liandi; Chen, Lingxiao; Zhang, Jiwen; Zhao, Jing; Li, Shaoping; Zheng, Ying

    2018-07-15

    Inulin-type fructooligosaccharides (FOS) purified from Morinda Officinalis, with degrees of polymerization (DP) from 3 to 9, have been approved in China as an oral prescription drug for mild and moderate depressive episodes, while the stability and oral absorption of this FOS mixture are largely unknown. As the main active component and quality-control marker for the above FOS, DP5 was selected as the representative FOS in this study. A desalting method using ion-exchange resin was developed to treat bio-samples, followed by separation and quantification by high-performance liquid chromatography with a charged aerosol detector. Results showed that DP5 was hydrolyzed stepwise in simulated gastric fluid and by gut microbiota, while remaining stable in intestinal fluid. DP5 has poor permeability across Caco-2 monolayers, with a Papp of 5.22 × 10⁻⁷ cm/s, and very poor oral absorption, with a bioavailability of (0.50 ± 0.12)% in rats. In conclusion, FOS from Morinda Officinalis demonstrated poor chemical stability in simulated gastric fluid and human gut microbiota, and low oral absorption in rats. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Building International Business Theory: A Grounded Theory Approach

    OpenAIRE

    Gligor, David; Esmark, Carol; Golgeci, Ismail

    2016-01-01

    The field of international business (IB) is in need of more theory development (Morck & Yeung, 2007). As such, the main focus of our manuscript is to provide guidance on how to build IB-specific theory using grounded theory (GT). Moreover, we contribute to future theory development by identifying areas within IB where GT can be applied and the types of research issues that can be addressed using this methodology. Finally, we make a noteworthy contribution by discussing some of GT's caveats an...

  11. Rapid quantification and sex determination of forensic evidence materials.

    Science.gov (United States)

    Andréasson, Hanna; Allen, Marie

    2003-11-01

    DNA quantification of forensic evidence is very valuable for an optimal use of the available biological material. Moreover, sex determination is of great importance as additional information in criminal investigations as well as in identification of missing persons, no suspect cases, and ancient DNA studies. While routine forensic DNA analysis based on short tandem repeat markers includes a marker for sex determination, analysis of samples containing scarce amounts of DNA is often based on mitochondrial DNA, and sex determination is not performed. In order to allow quantification and simultaneous sex determination on minute amounts of DNA, an assay based on real-time PCR analysis of a marker within the human amelogenin gene has been developed. The sex determination is based on melting curve analysis, while an externally standardized kinetic analysis allows quantification of the nuclear DNA copy number in the sample. This real-time DNA quantification assay has proven to be highly sensitive, enabling quantification of single DNA copies. Although certain limitations were apparent, the system is a rapid, cost-effective, and flexible assay for analysis of forensic casework samples.

  12. Completeness in Hybrid Type Theory

    DEFF Research Database (Denmark)

    Areces, Carlos; Blackburn, Patrick Rowan; Huertas, Antonia

    2014-01-01

    We show that basic hybridization (adding nominals and @ operators) makes it possible to give straightforward Henkin-style completeness proofs even when the modal logic being hybridized is higher-order. The key ideas are to add nominals as expressions of type t, and to extend to arbitrary types th... ...-style intensional models; we build, as simply as we can, hybrid logic over Henkin's logic...

  13. Real-time PCR for the quantification of fungi in planta.

    Science.gov (United States)

    Klosterman, Steven J

    2012-01-01

    Methods enabling quantification of fungi in planta can be useful for a variety of applications. In combination with information on plant disease severity, indirect quantification of fungi in planta offers an additional tool in the screening of plants that are resistant to fungal diseases. In this chapter, a method is described for the quantification of DNA from a fungus in plant leaves using real-time PCR (qPCR). Although the method described entails quantification of the fungus Verticillium dahliae in lettuce leaves, the methodology described would be useful for other pathosystems as well. The method utilizes primers that are specific for amplification of a β-tubulin sequence from V. dahliae and a lettuce actin gene sequence as a reference for normalization. This approach enabled quantification of V. dahliae in the amount of 2.5 fg/ng of lettuce leaf DNA at 21 days following plant inoculation.

  14. A universal real-time PCR assay for the quantification of group-M HIV-1 proviral load.

    Science.gov (United States)

    Malnati, Mauro S; Scarlatti, Gabriella; Gatto, Francesca; Salvatori, Francesca; Cassina, Giulia; Rutigliano, Teresa; Volpi, Rosy; Lusso, Paolo

    2008-01-01

    Quantification of human immunodeficiency virus type-1 (HIV-1) proviral DNA is increasingly used to measure the HIV-1 cellular reservoirs, a helpful marker to evaluate the efficacy of antiretroviral therapeutic regimens in HIV-1-infected individuals. Furthermore, the proviral DNA load represents a specific marker for the early diagnosis of perinatal HIV-1 infection and might be predictive of HIV-1 disease progression independently of plasma HIV-1 RNA levels and CD4+ T-cell counts. The high degree of genetic variability of HIV-1 poses a serious challenge for the design of a universal quantitative assay capable of detecting all the genetic subtypes within the main (M) HIV-1 group with similar efficiency. Here, we describe a highly sensitive real-time PCR protocol that allows for the correct quantification of virtually all group-M HIV-1 strains with a higher degree of accuracy compared with other methods. The protocol involves three stages, namely DNA extraction/lysis, cellular DNA quantification and HIV-1 proviral load assessment. Owing to the robustness of the PCR design, this assay can be performed on crude cellular extracts, and therefore it may be suitable for the routine analysis of clinical samples even in developing countries. An accurate quantification of the HIV-1 proviral load can be achieved within one day of blood withdrawal.
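
    For the proviral-load normalization this kind of assay supports, a minimal sketch of the usual bookkeeping, assuming ~6.6 pg of genomic DNA per diploid human cell (a standard approximation, not a value from the paper); the inputs are illustrative.

        PG_DNA_PER_DIPLOID_CELL = 6.6   # assumed pg of genomic DNA per cell

        def copies_per_million_cells(proviral_copies, total_dna_ng):
            cells = total_dna_ng * 1000.0 / PG_DNA_PER_DIPLOID_CELL  # ng -> pg
            return proviral_copies / cells * 1.0e6

        # e.g. 250 proviral copies detected in 100 ng of cellular DNA
        print(copies_per_million_cells(250.0, 100.0))   # ~16,500 copies / 1e6 cells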

  15. Physical activity: The importance of the extended theory of planned behavior, in type 2 diabetes patients.

    Science.gov (United States)

    Ferreira, Gabriela; Pereira, M Graça

    2017-09-01

    This study focused on the contribution of the extended theory of planned behavior regarding intention to perform physical activity, adherence to physical activity, and its mediator role in the relationship between trust in the physician and adherence to physical activity, in a sample of 120 patients with type 2 diabetes. The results revealed that positive attitudes and perception of control predicted a stronger intention to do physical activity. The intention to do physical activity was the only predictor of adherence to physical activity. Planning mediated the relationship between trust in the physician and adherence. Implications for patients with type 2 diabetes are discussed.

  16. Theory of flux cutting and flux transport at the critical current of a type-II superconducting cylindrical wire

    International Nuclear Information System (INIS)

    Clem, John R.

    2011-01-01

    I introduce a critical-state theory incorporating both flux cutting and flux transport to calculate the magnetic-field and current-density distributions inside a type-II superconducting cylinder at its critical current in a longitudinal applied magnetic field. The theory is an extension of the elliptic critical-state model introduced by Romero-Salazar and Perez-Rodriguez. The vortex dynamics depend in detail on two nonlinear effective resistivities for flux cutting (ρ∥) and flux flow (ρ⊥), and their ratio r = ρ∥/ρ⊥. When r ... c(φ) that makes the vortex arc unstable.

  17. Quantification and characterization of grouped type I myofibers in human aging.

    Science.gov (United States)

    Kelly, Neil A; Hammond, Kelley G; Stec, Michael J; Bickel, C Scott; Windham, Samuel T; Tuggle, S Craig; Bamman, Marcas M

    2018-01-01

    Myofiber type grouping is a histological hallmark of age-related motor unit remodeling. Despite the accepted concept that denervation-reinnervation events lead to myofiber type grouping, the completeness of those conversions remains unknown. Type I myofiber grouping was assessed in vastus lateralis biopsies from Young (26 ± 4 years; n = 27) and Older (66 ± 4 years; n = 91) adults. Grouped and ungrouped type I myofibers were evaluated for phenotypic differences. Higher type I grouping in Older versus Young was driven by more myofibers per group (i.e., larger group size) (P ...). Grouped type I myofibers displayed larger cross-sectional area, more myonuclei, lower capillary supply, and more sarco(endo)plasmic reticulum calcium ATPase I (SERCA I) expression (P ...). Grouped type I myofibers retain type II characteristics, suggesting that conversion during denervation-reinnervation events is either progressive or incomplete. Muscle Nerve 57: E52-E59, 2018. © 2017 Wiley Periodicals, Inc.

  18. Strategy study of quantification harmonization of SUV in PET/CT images

    International Nuclear Information System (INIS)

    Fischer, Andreia Caroline Fischer da Silveira

    2014-01-01

    In clinical practice, PET/CT images are often analyzed qualitatively, by visual comparison of tumor lesion and normal tissue uptake, and semi-quantitatively, by means of a parameter called SUV (Standardized Uptake Value). To ensure that longitudinal studies acquired on different scanners are interchangeable, and that quantification information is comparable, it is necessary to establish a strategy to harmonize SUV quantification. The aim of this study is to evaluate a strategy to harmonize the quantification of PET/CT images performed with scanners of different models and manufacturers. For this purpose, a survey of the technical characteristics of the equipment and of the acquisition protocols for clinical images at different PET/CT services in the state of Rio Grande do Sul was conducted. For each scanner, the accuracy of SUV quantification and the Recovery Coefficient (RC) curves were determined, using the clinically relevant and available reconstruction parameters. From these data, harmonized performance specifications among the evaluated scanners were identified, as well as the algorithm that produces, for each one, the most accurate quantification. Finally, the most appropriate reconstruction parameters to harmonize SUV quantification in each scanner, either regionally or internationally, were identified. It was found that the RC values of the analyzed scanners were overestimated by up to 38%, particularly for objects larger than 17 mm. These results demonstrate the need for further optimization, through modification of the reconstruction parameters and even a change of the reconstruction algorithm used in each scanner. It was observed that there is a decoupling between the best image for qualitative PET/CT analysis and the best image for quantification studies. Thus, the choice of reconstruction method should be tied to the purpose of the PET/CT study in question, since the same reconstruction algorithm is not adequate, in one scanner, for qualitative
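
    Since the work centres on the SUV, a minimal sketch of the body-weight SUV computation may be useful; the 109.8 min F-18 half-life is standard, the input values are illustrative, and unit tissue density (1 g/mL) is assumed so the kg-to-g and MBq-to-kBq factors cancel.

        import math

        F18_HALF_LIFE_MIN = 109.8

        def suv_bw(conc_kbq_ml, injected_mbq, weight_kg, minutes_post_injection):
            """Body-weight SUV with decay correction of the injected activity."""
            decayed_mbq = injected_mbq * math.exp(
                -math.log(2.0) * minutes_post_injection / F18_HALF_LIFE_MIN)
            # (kBq/mL) * kg / MBq is dimensionless once the unit factors cancel
            return conc_kbq_ml * weight_kg / decayed_mbq

        print(suv_bw(conc_kbq_ml=12.0, injected_mbq=350.0,
                     weight_kg=75.0, minutes_post_injection=60.0))  # ~3.8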

  19. Application of Fuzzy Comprehensive Evaluation Method in Trust Quantification

    Directory of Open Access Journals (Sweden)

    Shunan Ma

    2011-10-01

    Full Text Available Trust can play an important role in the sharing of resources and information in open network environments. Trust quantification is thus an important issue in dynamic trust management. Considering the fuzziness and uncertainty of trust, in this paper we propose a fuzzy comprehensive evaluation method to quantify trust, along with a trust quantification algorithm. Simulation results show that the proposed trust quantification algorithm can effectively quantify trust and that the quantified value of an entity's trust is consistent with the entity's behavior.
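
    A minimal sketch of the weighted-composition step at the heart of fuzzy comprehensive evaluation, assuming NumPy; the factors, weights, and membership degrees are illustrative assumptions, not values from the paper.

        import numpy as np

        weights = np.array([0.5, 0.3, 0.2])      # e.g. honesty, competence, timeliness
        grades = ["high", "medium", "low"]
        R = np.array([[0.7, 0.2, 0.1],           # membership of each factor
                      [0.5, 0.3, 0.2],           # in each trust grade
                      [0.2, 0.5, 0.3]])

        b = weights @ R                          # fuzzy composition (weighted average)
        b /= b.sum()                             # normalize the evaluation vector
        print(dict(zip(grades, np.round(b, 3))))
        print("trust grade:", grades[int(b.argmax())])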

  20. Three-dimensional theory of quantum memories based on Λ-type atomic ensembles

    International Nuclear Information System (INIS)

    Zeuthen, Emil; Grodecka-Grad, Anna; Soerensen, Anders S.

    2011-01-01

    We develop a three-dimensional theory for quantum memories based on light storage in ensembles of Λ-type atoms, where two long-lived atomic ground states are employed. We consider light storage in an ensemble of finite spatial extent and we show that within the paraxial approximation the Fresnel number of the atomic ensemble and the optical depth are the only important physical parameters determining the quality of the quantum memory. We analyze the influence of these parameters on the storage of light followed by either forward or backward read-out from the quantum memory. We show that for small Fresnel numbers the forward memory provides higher efficiencies, whereas for large Fresnel numbers the backward memory is advantageous. The optimal light modes to store in the memory are presented together with the corresponding spin waves and outcoming light modes. We show that for high optical depths such Λ-type atomic ensembles allow for highly efficient backward and forward memories even for small Fresnel numbers F ≳ 0.1.

  1. On the S-matrix of type-0 string theory

    International Nuclear Information System (INIS)

    DeWolfe, Oliver; Roiban, Radu; Spradlin, Marcus; Volovich, Anastasia; Walcher, Johannes

    2003-01-01

    The recent discovery of non-perturbatively stable two-dimensional string backgrounds and their dual matrix models allows the study of complete scattering matrices in string theory. In this note we adapt work of Moore, Plesser, and Ramgoolam on the bosonic string to compute the exact S-matrices of 0A and 0B string theory in two dimensions. Unitarity of the 0B theory requires the inclusion of massless soliton sectors carrying RR scalar charge as asymptotic states. We propose a regularization of IR divergences and find transition probabilities that distinguish the otherwise energetically degenerate soliton sectors. Unstable D-branes can decay into distinct soliton sectors. (author)

  2. Iron overload in the liver diagnostic and quantification

    Energy Technology Data Exchange (ETDEWEB)

    Alustiza, Jose M. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)]. E-mail: jmalustiza@osatek.es; Castiella, Agustin [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Juan, Maria D. de [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Emparanza, Jose I. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Artetxe, Jose [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Uranga, Maite [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)

    2007-03-15

    Hereditary Hemochromatosis is the most frequent form of iron overload. Since 1996, genetic tests have significantly facilitated the non-invasive diagnosis of the disease. There are, however, many cases with negative genetic tests that require confirmation by hepatic iron quantification, which is traditionally performed by hepatic biopsy. Many studies have demonstrated the possibility of performing hepatic iron quantification with Magnetic Resonance; however, a consensus has not yet been reached regarding the technique or the possibility of reproducing the same method of calculation on different machines. This article reviews the state of the art of the question and delineates possible future lines to standardise this non-invasive method of hepatic iron quantification.

  3. Iron overload in the liver diagnostic and quantification

    International Nuclear Information System (INIS)

    Alustiza, Jose M.; Castiella, Agustin; Juan, Maria D. de; Emparanza, Jose I.; Artetxe, Jose; Uranga, Maite

    2007-01-01

    Hereditary Hemochromatosis is the most frequent form of iron overload. Since 1996, genetic tests have significantly facilitated the non-invasive diagnosis of the disease. There are, however, many cases with negative genetic tests that require confirmation by hepatic iron quantification, which is traditionally performed by hepatic biopsy. Many studies have demonstrated the possibility of performing hepatic iron quantification with Magnetic Resonance; however, a consensus has not yet been reached regarding the technique or the possibility of reproducing the same method of calculation on different machines. This article reviews the state of the art of the question and delineates possible future lines to standardise this non-invasive method of hepatic iron quantification

  4. Disease quantification in dermatology

    DEFF Research Database (Denmark)

    Greve, Tanja Maria; Kamp, Søren; Jemec, Gregor B E

    2013-01-01

    Accurate documentation of disease severity is a prerequisite for clinical research and the practice of evidence-based medicine. The quantification of skin diseases such as psoriasis currently relies heavily on clinical scores. Although these clinical scoring methods are well established and very ...

  5. Abelian gauge symmetries in F-theory and dual theories

    Science.gov (United States)

    Song, Peng

    In this dissertation, we focus on important physical and mathematical aspects, especially abelian gauge symmetries, of F-theory compactifications and their dual formulations within type IIB and heterotic string theory. F-theory is a non-perturbative formulation of type IIB string theory which enjoys important dualities with other string theories such as M-theory and E8 x E8 heterotic string theory. One of the main strengths of F-theory is its geometrization of many physical problems in the dual string theories. In particular, its study requires many mathematical tools, such as advanced techniques in algebraic geometry. Thus, it has also received much interest among mathematicians, and is a vibrant area of research within both the physics and the mathematics communities. Although F-theory has been a long-standing theory, abelian gauge symmetry in F-theory had rarely been studied until recently. Within the mathematics community, in 2009, Grassi and Perduca first discovered the possibility of constructing elliptically fibered varieties with non-trivial toric Mordell-Weil group. In the physics community, in 2012, Morrison and Park first made a major advancement by constructing general F-theory compactifications with U(1) abelian gauge symmetry. They found that in such cases the elliptically fibered Calabi-Yau manifold on which F-theory is compactified has its fiber being a generic elliptic curve in the blow-up of the weighted projective space P(1,1,2) at one point. Subsequently, Cvetic, Klevers and Piragua extended the work of Morrison and Park and constructed general F-theory compactifications with U(1) x U(1) abelian gauge symmetry. They found that in the U(1) x U(1) case the elliptically fibered Calabi-Yau manifold has its fiber being a generic elliptic curve in the del Pezzo surface dP2. In chapter 2 of this dissertation, I bring this a step further by

  6. The criticality problem in reflected slab type reactor in the two-group transport theory

    International Nuclear Information System (INIS)

    Garcia, R.D.M.

    1978-01-01

    The criticality problem for a reflected slab-type reactor is solved for the first time in two-group neutron transport theory by singular eigenfunction expansion; the singular integrals obtained through continuity conditions on the angular distributions at the interface are regularized by a recently proposed method. The result is a coupled system of regular integral equations for the expansion coefficients; this system is solved by an ordinary iterative method. Numerical results that can be used as a comparative standard for approximation methods are presented [pt

  7. Hackers' Motivations: Testing Schwartz's Theory of Motivational Types of Values in a Sample of Hackers

    OpenAIRE

    Renushka Madarie

    2017-01-01

    Although much has been written on the topic of hacker motivations, little empirical research has been conducted, and even less research has attempted to quantify hackers' motivations. The present study analyses relationships between the frequency of several hacking behaviours and motivations to hack in a sample of male hackers and potential hackers. Motivations frequently recurring in the literature are assessed, and Schwartz's (1992) Theory of Motivational Types of Values is applied. A preference ...

  8. Comparison of Suitability of the Most Common Ancient DNA Quantification Methods.

    Science.gov (United States)

    Brzobohatá, Kristýna; Drozdová, Eva; Smutný, Jiří; Zeman, Tomáš; Beňuš, Radoslav

    2017-04-01

    Ancient DNA (aDNA) extracted from historical bones is damaged and fragmented into short segments, present in low quantity, and usually copurified with microbial DNA. A wide range of DNA quantification methods are available. The aim of this study was to compare the five most common DNA quantification methods for aDNA. Quantification methods were tested on DNA extracted from skeletal material originating from an early medieval burial site. The tested methods included ultraviolet (UV) absorbance, real-time quantitative polymerase chain reaction (qPCR) based on SYBR® Green detection, real-time qPCR based on a forensic kit, quantification via fluorescent dyes bonded to DNA, and fragmentary analysis. Differences between groups were tested using a paired t-test. Methods that measure total DNA present in the sample (NanoDrop™ UV spectrophotometer and Qubit® fluorometer) showed the highest concentrations. Methods based on real-time qPCR underestimated the quantity of aDNA. The most accurate method of aDNA quantification was fragmentary analysis, which also allows DNA quantification of the desired length and is not affected by PCR inhibitors. Methods based on the quantification of the total amount of DNA in samples are unsuitable for ancient samples as they overestimate the amount of DNA presumably due to the presence of microbial DNA. Real-time qPCR methods give undervalued results due to DNA damage and the presence of PCR inhibitors. DNA quantification methods based on fragment analysis show not only the quantity of DNA but also fragment length.

  10. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    Science.gov (United States)

    Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina

    2006-01-01

    Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary
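Because PCR efficiency is singled out as the crucial parameter, here is a hedged sketch of how it is conventionally estimated from a dilution series (illustrative numbers; E = 10^(-1/slope) - 1, with about -3.32 cycles per decade corresponding to 100% efficiency):

```python
# Estimate qPCR amplification efficiency from a standard-curve slope.
import numpy as np

log10_copies = np.array([2, 3, 4, 5, 6])        # 10-fold dilution series
cq = np.array([31.1, 27.8, 24.4, 21.1, 17.8])   # hypothetical Cq values

slope, intercept = np.polyfit(log10_copies, cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0         # 1.0 corresponds to 100%
print(f"slope = {slope:.2f} cycles/decade, efficiency = {efficiency:.1%}")
```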

  11. A Density Functional Theory Study of Doped Tin Monoxide as a Transparent p-type Semiconductor

    KAUST Repository

    Bianchi Granato, Danilo

    2012-05-01

In the pursuit of enhancing the electronic properties of transparent p-type semiconductors, this work uses density functional theory to study the effects of doping tin monoxide with nitrogen, antimony, yttrium and lanthanum. An overview of the theoretical concepts and a detailed description of the methods employed are given, including a discussion of the correction scheme for charged defects proposed by Freysoldt and others [Freysoldt 2009]. Analysis of the formation energies of the defects indicates that nitrogen substitutes an oxygen atom and does not provide charge carriers. On the other hand, antimony, yttrium, and lanthanum substitute a tin atom and donate n-type carriers. Study of the band structure and density of states indicates that yttrium and lanthanum improve the hole mobility. The present results are in good agreement with available experimental work and help to improve the understanding of how to engineer transparent p-type materials with higher hole mobilities.
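The formation-energy analysis referred to here conventionally rests on the following expression (standard in the defect literature; the Freysoldt scheme supplies the finite-size correction term):

```latex
% Formation energy of defect X in charge state q:
\begin{equation}
  E_f[X^q] \;=\; E_{\rm tot}[X^q] \;-\; E_{\rm tot}[\text{host}]
  \;-\; \sum_i n_i \mu_i \;+\; q\,\bigl(E_F + \varepsilon_{\rm VBM}\bigr)
  \;+\; E_{\rm corr},
\end{equation}
% with n_i atoms of chemical potential \mu_i added (n_i > 0) or removed
% (n_i < 0), E_F the Fermi level referenced to the valence-band maximum,
% and E_corr the charged-defect correction (e.g. the Freysoldt scheme).
```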

  12. On the interplay between string theory and field theory

    International Nuclear Information System (INIS)

    Brunner, I.

    1998-01-01

In this thesis, we have discussed various aspects of branes in string theory and M-theory. In chapter 2 we were able to construct six-dimensional chiral interacting field theories from Hanany-Witten-like brane setups. The field theory requirement that the anomalies cancel was reproduced by RR-charge conservation in the brane setup. The data of the Hanany-Witten setup, which consists of brane positions, was mapped to instanton data. The orbifold construction can be extended to D- and E-type singularities. In chapter 3 we discussed a matrix conjecture, which claims that M-theory in the light cone gauge is described by the quantum mechanics of D0 branes. Toroidal compactifications of M-theory have a description in terms of super Yang-Mills theory on the dual torus. For more than three compactified dimensions, more degrees of freedom have to be added. In some sense, the philosophy in this chapter is orthogonal to that of the previous chapter: here, we want to get M-theory results from field theory considerations, whereas in the previous chapter we obtained field theory results by embedding the theories in string theory. Our main focus was on the compactification on T^6, which leads to complications. Here, the Matrix model is again given by an eleven-dimensional theory, not by a lower-dimensional field theory. Other problems and possible resolutions of Matrix theory are discussed at the end of chapter 3. In the last chapter we considered M- and F-theory compactifications on Calabi-Yau fourfolds. After explaining some basics of fourfolds, we showed that the web of fourfolds is connected by singular transitions. The two manifolds which are connected by the transition are different resolutions of the same singular manifold. The resolution of the singularities can lead to a certain type of divisors, which lead to non-perturbative superpotentials when branes wrap them. The vacua connected by the transitions can be physically very different. (orig.)

  14. Object-Oriented Type Systems

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff; Palsberg, Jens

Object-Oriented Type Systems. Jens Palsberg and Michael I. Schwartzbach, Aarhus University, Denmark. Type systems are required to ensure reliability and efficiency of software. For object-oriented languages, typing is an especially challenging problem because of inheritance, assignment, and late binding. Existing languages employ different type systems, and it can be difficult to compare, evaluate and improve them, since there is currently no uniform theory for such languages. This book provides such a theory. The authors review the type systems of Simula, Smalltalk, C++ and Eiffel and present a type system that generalizes and explains them. The theory is based on an idealized object-oriented language called BOPL (Basic Object Programming Language), containing common features of the above languages. A type system, type inference algorithm, and typings of inheritance and genericity...

  15. Exploring Heterogeneous Multicore Architectures for Advanced Embedded Uncertainty Quantification.

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, Eric T.; Edwards, Harold C.; Hu, Jonathan J.

    2014-09-01

We explore rearrangements of classical uncertainty quantification methods with the aim of achieving higher aggregate performance for uncertainty quantification calculations on emerging multicore and manycore architectures. We show that a rearrangement of the stochastic Galerkin method leads to improved performance and scalability on several computational architectures, whereby uncertainty information is propagated at the lowest levels of the simulation code, improving memory access patterns, exposing new dimensions of fine-grained parallelism, and reducing communication. We also develop a general framework for implementing such rearrangements for a diverse set of uncertainty quantification algorithms as well as the computational simulation codes to which they are applied.

  16. GMO quantification: valuable experience and insights for the future.

    Science.gov (United States)

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for the method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use new generation sequencing also for quantitative purposes, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, and especially for mixed samples. New approaches are needed also for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques.

  17. The Quantification Process for the PRiME-U34i

    International Nuclear Information System (INIS)

    Hwang, Mee-Jeong; Han, Sang-Hoon; Yang, Joon-Eon

    2006-01-01

In this paper, we introduce the quantification process for the PRiME-U34i, which is the merged model of ETs (Event Trees) and FTs (Fault Trees) for the level 1 internal PSA of UCN 3 and 4. PRiME-U34i has one top event; therefore, the quantification process is simplified compared to the previous one. In the past, we used a text file called a user file to control the quantification process. However, this user file is so complicated that it is difficult for a non-expert to understand. Moreover, in the past PSA, the ET and FT were separate, but in PRiME-U34i they are merged together; thus, the quantification process differs. This paper is composed of five sections. In section 2, we introduce the construction of the one-top model. Section 3 shows the quantification process used in the PRiME-U34i. Section 4 describes the post-processing. The last section presents the conclusions

  18. Prediction of autosomal STR typing success in ancient and Second World War bone samples.

    Science.gov (United States)

    Zupanič Pajnič, Irena; Zupanc, Tomaž; Balažic, Jože; Geršak, Živa Miriam; Stojković, Oliver; Skadrić, Ivan; Črešnar, Matija

    2017-03-01

Human-specific quantitative PCR (qPCR) has been developed for forensic use in the last 10 years and is the preferred DNA quantification technique since it is very accurate, sensitive, objective, time-effective and automatable. The amount of information that can be gleaned from a single quantification reaction using commercially available quantification kits has increased from the quantity of nuclear DNA to the amount of male DNA, the presence of inhibitors and, most recently, the degree of DNA degradation. In skeletal remains from disaster victims, missing persons and war conflict victims, the DNA is usually degraded. Therefore, the new commercial qPCR kits able to assess the degree of degradation are potentially able to predict the success of downstream short tandem repeat (STR) typing. The goal of this study was to verify the quantification step using the PowerQuant kit with regard to its suitability as a screening method for autosomal STR typing success on ancient and Second World War (WWII) skeletal remains. We analysed 60 skeletons excavated from five archaeological sites and four WWII mass graves in Slovenia. The bones were cleaned, surface contamination was removed, and the bones were ground to a powder. Genomic DNA was obtained from 0.5 g of bone powder after total demineralization. The DNA was purified using a Biorobot EZ1 device. Following PowerQuant quantification, DNA samples were subjected to autosomal STR amplification using the NGM kit. Up to 2.51 ng DNA/g of powder were extracted. No inhibition was detected in any of the bones analysed. 82% of the WWII bones gave full profiles while 73% of the ancient bones gave profiles not suitable for interpretation. Four bone extracts yielded no detectable amplification or zero quantification results and no profiles were obtained from any of them. Full or useful partial profiles were produced only from bone extracts where short autosomal (Auto) and long degradation (Deg) PowerQuant targets were detected. It is
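A screening rule of the kind described can be sketched as follows; reading the Auto/Deg ratio as a degradation index is common practice, but the thresholds below are illustrative assumptions, not the study's validated cut-offs.

```python
# Illustrative screening logic only: PowerQuant reports a short autosomal
# (Auto) and a long degradation (Deg) target; the Auto/Deg ratio is widely
# read as a degradation index. Thresholds below are assumptions.
def predict_str_success(auto_ng_ul: float, deg_ng_ul: float) -> str:
    if auto_ng_ul == 0.0:
        return "no detectable DNA - STR typing unlikely"
    if deg_ng_ul == 0.0:
        return "long target absent - severe degradation, partial profile at best"
    ratio = auto_ng_ul / deg_ng_ul
    if ratio > 2.0:                  # assumed cut-off for marked degradation
        return f"Auto/Deg = {ratio:.1f} - degraded, expect partial profile"
    return f"Auto/Deg = {ratio:.1f} - little degradation, full profile likely"

print(predict_str_success(0.05, 0.04))
```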

  19. Toward greener analytical techniques for the absolute quantification of peptides in pharmaceutical and biological samples.

    Science.gov (United States)

    Van Eeckhaut, Ann; Mangelings, Debby

    2015-09-10

Peptide-based biopharmaceuticals represent one of the fastest growing classes of new drug molecules. New reaction types included in synthesis strategies to reduce the rapid metabolism of peptides, along with the availability of new formulation and delivery technologies, have resulted in increased marketing of peptide drug products. In this regard, the development of analytical methods for the quantification of peptides in pharmaceutical and biological samples is of utmost importance. From sample preparation to analysis by chromatographic or electrophoretic methods, many difficulties must be tackled. Recent developments in analytical techniques place increasing emphasis on green analytical techniques. This review discusses the progress in, and challenges observed during, green analytical method development for the quantification of peptides in pharmaceutical and biological samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Game Theory and Uncertainty Quantification for Cyber Defense Applications

    Energy Technology Data Exchange (ETDEWEB)

    Chatterjee, Samrat; Halappanavar, Mahantesh; Tipireddy, Ramakrishna; Oster, Matthew R.

    2016-07-21

    Cyber-system defenders face the challenging task of protecting critical assets and information continually against multiple types of malicious attackers. Defenders typically operate within resource constraints while attackers operate at relatively low costs. As a result, design and development of resilient cyber-systems that can support mission goals under attack while accounting for the dynamics between attackers and defenders is an important research problem.

  1. Quantification of differential gene expression by multiplexed targeted resequencing of cDNA

    Science.gov (United States)

    Arts, Peer; van der Raadt, Jori; van Gestel, Sebastianus H.C.; Steehouwer, Marloes; Shendure, Jay; Hoischen, Alexander; Albers, Cornelis A.

    2017-01-01

    Whole-transcriptome or RNA sequencing (RNA-Seq) is a powerful and versatile tool for functional analysis of different types of RNA molecules, but sample reagent and sequencing cost can be prohibitive for hypothesis-driven studies where the aim is to quantify differential expression of a limited number of genes. Here we present an approach for quantification of differential mRNA expression by targeted resequencing of complementary DNA using single-molecule molecular inversion probes (cDNA-smMIPs) that enable highly multiplexed resequencing of cDNA target regions of ∼100 nucleotides and counting of individual molecules. We show that accurate estimates of differential expression can be obtained from molecule counts for hundreds of smMIPs per reaction and that smMIPs are also suitable for quantification of relative gene expression and allele-specific expression. Compared with low-coverage RNA-Seq and a hybridization-based targeted RNA-Seq method, cDNA-smMIPs are a cost-effective high-throughput tool for hypothesis-driven expression analysis in large numbers of genes (10 to 500) and samples (hundreds to thousands). PMID:28474677

  2. A multivariate shape quantification approach for sickle red blood cell in patient-specific microscopy image data

    Science.gov (United States)

    Xu, Mengjia; Yang, Jinzhu; Zhao, Hong

    2017-07-01

The morphological change of red blood cells (RBCs) plays an important role in revealing the biomechanical and biorheological characteristics of RBCs. Aiming to extract shape indices for sickle RBCs, an automated ex vivo RBC shape quantification method is proposed. First, single-RBC regions of interest (ROIs) are extracted from the raw microscopy image via an automatic hierarchical ROI extraction method. Second, an improved random walk method is used to detect the RBC outline. Finally, three types of RBC shape factors are calculated based on the elliptically fitted RBC contour. Experiments indicate that the proposed method can accurately segment RBCs from low-contrast microscopy images while suppressing artifacts. Moreover, it provides an efficient means of quantifying diverse RBC shapes in a batch manner.
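A minimal sketch of elliptical-fit shape factors follows; the PCA-based fit and the specific factor definitions are common choices assumed here, and the paper's exact set may differ.

```python
# Shape factors from an elliptical fit of an RBC contour. The ellipse is
# estimated by PCA of boundary points (a simple surrogate for the paper's
# fitting step); factor definitions are common choices, not the paper's.
import numpy as np

def shape_factors(contour: np.ndarray) -> dict:
    """contour: (N, 2) array of boundary points of a single RBC."""
    centered = contour - contour.mean(axis=0)
    evals = np.linalg.eigvalsh(np.cov(centered.T))       # ascending order
    minor, major = np.sqrt(2.0 * evals)  # semi-axes if points ~uniform in angle
    area = np.pi * major * minor
    # Ramanujan's approximation for the ellipse perimeter:
    perimeter = np.pi * (3.0 * (major + minor)
                         - np.sqrt((3.0 * major + minor) * (major + 3.0 * minor)))
    return {
        "aspect_ratio": major / minor,
        "circularity": 4.0 * np.pi * area / perimeter**2,
        "eccentricity": np.sqrt(1.0 - (minor / major) ** 2),
    }

# Example: an ellipse with semi-axes 4 and 2 (aspect ratio 2).
t = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
print(shape_factors(np.c_[4.0 * np.cos(t), 2.0 * np.sin(t)]))
```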

  3. Ex vivo activity quantification in micrometastases at the cellular scale using the α-camera technique

    DEFF Research Database (Denmark)

    Chouin, Nicolas; Lindegren, Sture; Frost, Sofia H L

    2013-01-01

Targeted α-therapy (TAT) appears to be an ideal therapeutic technique for eliminating malignant circulating, minimal residual, or micrometastatic cells. These types of malignancies are typically infraclinical, complicating the evaluation of potential treatments. This study presents a method of ex vivo activity quantification with an α-camera device, allowing measurement of the activity taken up by tumor cells in biologic structures a few tens of microns in size.

  4. Current trends in nursing theories.

    Science.gov (United States)

    Im, Eun-Ok; Chang, Sun Ju

    2012-06-01

To explore current trends in nursing theories through an integrated literature review. The literature related to nursing theories during the past 10 years was searched through multiple databases and reviewed to determine themes reflecting current trends in nursing theories. The trends can be categorized into six themes: (a) foci on specifics; (b) coexistence of various types of theories; (c) close links to research; (d) international collaborative works; (e) integration into practice; and (f) selective evolution. We need to continue our efforts to link research and practice to theories, to identify the specifics of our theories, to develop diverse types of theories, and to conduct international collaborative work. Our paper has implications for future theoretical development in diverse clinical areas of nursing research and practice. © 2012 Sigma Theta Tau International.

  5. Artifacts Quantification of Metal Implants in MRI

    Science.gov (United States)

    Vrachnis, I. N.; Vlachopoulos, G. F.; Maris, T. G.; Costaridou, L. I.

    2017-11-01

The presence of materials with different magnetic properties, such as metal implants, causes local distortion of the magnetic field, resulting in signal voids and pile-ups, i.e. susceptibility artifacts, in MRI. Quantitative and unbiased measurement of the artifact is a prerequisite for the optimization of acquisition parameters. In this study an image-gradient-based segmentation method is proposed for susceptibility artifact quantification. The method captures abrupt signal alterations by calculation of the image gradient. The artifact is then quantified in terms of its extent, by an automated cross-entropy thresholding method, as an image area percentage. The proposed method for artifact quantification was tested in phantoms containing two orthopedic implants with significantly different magnetic permeabilities. The method was compared against a method proposed in the literature, considered as a reference, demonstrating moderate to good correlation (Spearman's rho = 0.62 and 0.802 in the case of titanium and stainless-steel implants, respectively). The automated character of the proposed quantification method seems promising for MRI acquisition-parameter optimization.
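The described pipeline (gradient magnitude, automated cross-entropy threshold, area percentage) might look roughly as follows; Li's minimum cross-entropy threshold from scikit-image stands in for the paper's unspecified implementation.

```python
# Sketch of the described pipeline: image gradient -> automated
# cross-entropy threshold -> artifact extent as an area percentage.
import numpy as np
from skimage import filters

def artifact_area_percent(image: np.ndarray) -> float:
    gy, gx = np.gradient(image.astype(float))    # abrupt signal alterations
    grad_mag = np.hypot(gx, gy)
    thresh = filters.threshold_li(grad_mag)      # minimum cross-entropy
    return 100.0 * float((grad_mag > thresh).mean())
```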

  6. String theory duals of Lifshitz–Chern–Simons gauge theories

    International Nuclear Information System (INIS)

    Balasubramanian, Koushik; McGreevy, John

    2012-01-01

We propose candidate gravity duals for a class of non-Abelian z = 2 Lifshitz Chern–Simons (LCS) gauge theories studied by Mulligan, Kachru and Nayak. These are nonrelativistic gauge theories in 2+1 dimensions in which parity and time-reversal symmetries are explicitly broken by the presence of a Chern–Simons term. We show that these field theories can be realized as deformations of DLCQ N=4 super Yang–Mills theory. Using the holographic dictionary, we identify the bulk fields of type IIB supergravity that are dual to these deformations. The geometries describing the ground states of the non-Abelian LCS gauge theories realized here exhibit a mass gap. (paper)

  7. On R4 threshold corrections in type IIB string theory and (p,q)-string instantons

    International Nuclear Information System (INIS)

    Kiritsis, E.; Pioline, B.

    1997-01-01

We obtain the exact non-perturbative thresholds of R^4 terms in type IIB string theory compactified to eight and seven dimensions. These thresholds are given by the perturbative tree-level and one-loop results together with the contribution of the D-instantons and of the (p,q)-string instantons. The invariance under U-duality is made manifest by rewriting the sum as a non-holomorphic invariant modular function of the corresponding discrete U-duality group. In the eight-dimensional case, the threshold is the sum of an order-1 Eisenstein series for SL(2,Z) and an order-3/2 Eisenstein series for SL(3,Z). The seven-dimensional result is given by the order-3/2 Eisenstein series for SL(5,Z). We also conjecture formulae for the non-perturbative thresholds in lower-dimensional compactifications and discuss the relation with M-theory. (orig.)
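For concreteness, the order-s non-holomorphic Eisenstein series referred to here has the schematic SL(2,Z) form below (normalization conventions vary between references):

```latex
% Order-s non-holomorphic Eisenstein series for SL(2,Z):
\begin{equation}
  E_s(\tau,\bar\tau) \;=\; \sum_{(m,n)\neq(0,0)}
    \frac{\tau_2^{\,s}}{|m+n\tau|^{2s}},
\end{equation}
% so the eight-dimensional R^4 threshold above is an order-1 series for
% SL(2,Z) plus an order-3/2 series for SL(3,Z), each expanding into
% tree-level, one-loop and instanton contributions.
```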

  8. Rationality, Theory Acceptance and Decision Theory

    Directory of Open Access Journals (Sweden)

    J. Nicolas Kaufmann

    1998-06-01

Following Kuhn's main thesis, according to which theory revision and acceptance is always paradigm relative, I propose to outline some possible consequences of such a view. First, asking in what sense Bayesian decision theory could serve as the appropriate (normative) theory of rationality examined from the point of view of the epistemology of theory acceptance, I argue that Bayesianism leads to a narrow conception of theory acceptance. Second, regarding the different types of theory revision, i.e. expansion, contraction, replacement and residual shifts, I extract from Kuhn's view a series of indications showing that theory replacement cannot be rationalized within the framework of Bayesian decision theory, not even within a more sophisticated version of that model. Third, and finally, I point to the need for a more comprehensive model of rationality than the Bayesian expected-utility maximization model, one which could better deal with the different aspects of theory replacement. I show that Kuhn's distinction between normal and revolutionary science gives us several hints for a more adequate theory of rationality in science. I also show that Kuhn is not in a position to fully articulate his main ideas and that he will be confronted with a serious problem concerning the collective choice of a paradigm.

  9. Quantification of trace-level DNA by real-time whole genome amplification.

    Science.gov (United States)

    Kang, Min-Jung; Yu, Hannah; Kim, Sook-Kyung; Park, Sang-Ryoul; Yang, Inchul

    2011-01-01

    Quantification of trace amounts of DNA is a challenge in analytical applications where the concentration of a target DNA is very low or only limited amounts of samples are available for analysis. PCR-based methods including real-time PCR are highly sensitive and widely used for quantification of low-level DNA samples. However, ordinary PCR methods require at least one copy of a specific gene sequence for amplification and may not work for a sub-genomic amount of DNA. We suggest a real-time whole genome amplification method adopting the degenerate oligonucleotide primed PCR (DOP-PCR) for quantification of sub-genomic amounts of DNA. This approach enabled quantification of sub-picogram amounts of DNA independently of their sequences. When the method was applied to the human placental DNA of which amount was accurately determined by inductively coupled plasma-optical emission spectroscopy (ICP-OES), an accurate and stable quantification capability for DNA samples ranging from 80 fg to 8 ng was obtained. In blind tests of laboratory-prepared DNA samples, measurement accuracies of 7.4%, -2.1%, and -13.9% with analytical precisions around 15% were achieved for 400-pg, 4-pg, and 400-fg DNA samples, respectively. A similar quantification capability was also observed for other DNA species from calf, E. coli, and lambda phage. Therefore, when provided with an appropriate standard DNA, the suggested real-time DOP-PCR method can be used as a universal method for quantification of trace amounts of DNA.
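The quantification logic described, calibrating Cq against known masses and reading unknowns off the standard curve, can be sketched in a few lines (the calibration values are hypothetical):

```python
# Standard-curve quantification: calibrate Cq against known DNA masses,
# then invert the fit for an unknown sample.
import numpy as np

std_mass_pg = np.array([0.4, 4.0, 40.0, 400.0, 4000.0])
std_cq = np.array([33.9, 30.5, 27.2, 23.8, 20.5])

slope, intercept = np.polyfit(np.log10(std_mass_pg), std_cq, 1)

def quantify(cq_unknown: float) -> float:
    """Estimated DNA mass (pg) for a measured Cq."""
    return 10.0 ** ((cq_unknown - intercept) / slope)

print(f"Cq 29.0 -> {quantify(29.0):.1f} pg")
```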

  10. Integrating theory and data to create an online self-management programme for adults with type 2 diabetes: HeLP-Diabetes

    Directory of Open Access Journals (Sweden)

    Kingshuk Pal

    2015-10-01

This protocol demonstrates a multi-disciplinary approach to combining evidence from multiple sources to create 'HeLP-Diabetes': a theory- and evidence-based online self-management intervention for adults with type 2 diabetes.

  11. Collagen Quantification in Tissue Specimens.

    Science.gov (United States)

    Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I

    2017-01-01

    Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.

  12. Advanced theories of hypoid gears

    CERN Document Server

    Wang, Xudong

    2013-01-01

In order to develop more efficient types of gears, further investigation into the theories of engagement is necessary. Up until now, most research on the theories of engagement has been carried out separately by different groups, and based on individual types of profiles. This book aims at developing some universal theories, which can not only be used for all types of gears, but can also be utilized in other fields such as sculptured surfaces. The book has four characteristics: the investigations are concentrated on mismatched tooth surfaces; all the problems are dealt with from a

  13. Use of the Hage framework for theory construction: Factors affecting glucose control in the college-aged student with type 1 diabetes.

    Science.gov (United States)

    Meyer, Rebecca A; Fish, Anne F; Lou, Qinqing

    2017-10-01

    This article describes the Hage framework for theory construction and its application to the clinical problem of glycemic control in college-aged students with type 1 diabetes. College-aged students with type 1 diabetes struggle to self-manage their condition. Glycated hemoglobin (HbA1c), if controlled within acceptable limits (6-8%), is associated with the prevention or delay of serious diabetic complications such as kidney and cardiovascular disease. Diabetes educators provide knowledge and skills, but young adults must self-manage their condition on a daily basis, independent of parents. The Hage framework includes five tasks of theory construction: narrowing and naming the concepts, specifying the definitions, creating the theoretical statements, specifying the linkages, and ordering components in preparation for model building. During the process, concepts within the theory were revised as the literature was reviewed, and measures and hypotheses, foundational to research, were generated. We were successful in applying the framework and creating a model of factors affecting glycemic control, emphasizing that physical activity, thought of as a normal part of wellness, can be a two-edged sword producing positive effect but also serious negative effects in some college-aged students with type 1 diabetes. Contextual factors important to self-management in college-aged students are emphasized. The Hage framework, already used to a small extent in nursing curricula, deserves more attention and, because of its generic nature, may be used as a template for theory construction to examine a wide variety of nursing topics. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Perturbation theory

    International Nuclear Information System (INIS)

    Bartlett, R.; Kirtman, B.; Davidson, E.R.

    1978-01-01

After noting some advantages of using perturbation theory, some of the various types are related on a chart and described, including many-body nonlinear summations, quartic force-field fits for geometry, fourth-order correlation approximations, and a survey of some recent work. Alternative initial approximations in perturbation theory are also discussed. 25 references

  15. Cutset Quantification Error Evaluation for Shin-Kori 1 and 2 PSA model

    International Nuclear Information System (INIS)

    Choi, Jong Soo

    2009-01-01

Probabilistic safety assessments (PSA) for nuclear power plants (NPPs) are based on the minimal cut set (MCS) quantification method. In PSAs, the risk and importance measures are computed from a cutset equation, mainly by using approximations. The conservatism of these approximations is also a source of quantification uncertainty. In this paper, exact MCS quantification methods, based on the 'sum of disjoint products' (SDP) logic and the inclusion-exclusion formula, are applied, and the conservatism of the MCS quantification results in the Shin-Kori 1 and 2 PSA is evaluated
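To see why the approximations are conservative, compare them against the exact union probability on a toy example; the probabilities and cut sets below are invented for illustration, and basic events are assumed independent.

```python
# Toy comparison of cut-set quantification approximations against the
# exact union probability obtained by inclusion-exclusion.
from itertools import combinations
from math import prod

p = {"A": 0.01, "B": 0.02, "C": 0.05}
cutsets = [{"A", "B"}, {"B", "C"}, {"A", "C"}]

def cs_prob(events):
    return prod(p[e] for e in events)

rare_event = sum(cs_prob(c) for c in cutsets)                # first-order sum
mcub = 1.0 - prod(1.0 - cs_prob(c) for c in cutsets)         # min-cut upper bound

exact = 0.0                                                  # inclusion-exclusion
for k in range(1, len(cutsets) + 1):
    for combo in combinations(cutsets, k):
        # Shared basic events are counted once via the set union.
        exact += (-1) ** (k + 1) * cs_prob(set().union(*combo))

print(f"rare-event {rare_event:.6f} >= MCUB {mcub:.6f} >= exact {exact:.6f}")
```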

  16. How Severely Is DNA Quantification Hampered by RNA Co-extraction?

    Science.gov (United States)

    Sanchez, Ignacio; Remm, Matthieu; Frasquilho, Sonia; Betsou, Fay; Mathieson, William

    2015-10-01

    The optional RNase digest that is part of many DNA extraction protocols is often omitted, either because RNase is not provided in the kit or because users do not want to risk contaminating their laboratory. Consequently, co-eluting RNA can become a "contaminant" of unknown magnitude in a DNA extraction. We extracted DNA from liver, lung, kidney, and heart tissues and established that 28-52% of the "DNA" as assessed by spectrophotometry is actually RNA (depending on tissue type). Including an RNase digest in the extraction protocol reduced 260:280 purity ratios. Co-eluting RNA drives an overestimation of DNA yield when quantification is carried out using OD 260 nm spectrophotometry, or becomes an unquantified contaminant when spectrofluorometry is used for DNA quantification. This situation is potentially incompatible with the best practice guidelines for biobanks issued by organizations such as the International Society for Biological and Environmental Repositories, which state that biospecimens should be accurately characterized in terms of their identity, purity, concentration, and integrity. Consequently, we conclude that an RNase digest must be included in DNA extractions if pure DNA is required. We also discuss the implications of unquantified RNA contamination in DNA samples in the context of laboratory accreditation schemes.
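The scale of the reported bias is easy to reproduce with the standard A260 conversion factors (the composition below is hypothetical):

```python
# Back-of-envelope reconstruction of the bias: A260 cannot distinguish
# dsDNA from co-eluting RNA.
A260_DSDNA = 50.0   # ng/uL per absorbance unit (standard dsDNA factor)
A260_RNA = 40.0     # ng/uL per absorbance unit (standard RNA factor)

dna, rna = 60.0, 40.0                        # true amounts, ng/uL
a260 = dna / A260_DSDNA + rna / A260_RNA     # combined absorbance
apparent_dna = a260 * A260_DSDNA             # what a UV readout calls "DNA"
excess = 100.0 * (apparent_dna - dna) / apparent_dna
print(f"apparent {apparent_dna:.0f} ng/uL vs true {dna:.0f} ng/uL "
      f"-> {excess:.0f}% of the reading is not dsDNA")
```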

  17. Higher Inductive Types as Homotopy-Initial Algebras

    Science.gov (United States)

    2016-08-01

correspondence between Martin-Löf's constructive type theory and abstract homotopy theory. We have a powerful interplay between these disciplines - we can... inductive types we call W-quotients, which generalize Martin-Löf's well-founded trees to a higher-dimensional setting. We have shown that a... 27]). Among the most studied type theories is Martin-Löf's intuitionistic type theory ([20, 22]), also known as constructive or dependent type
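As background for the W-quotients mentioned, ordinary W-types (well-founded trees) can be written down directly; here is a sketch in Lean 4 syntax, with the standard encoding of the naturals as an illustrative instance (not taken from the text).

```lean
-- Martin-Löf's W-types: trees whose node shapes are drawn from α and
-- whose branching at shape a is indexed by β a.
inductive W (α : Type) (β : α → Type) : Type where
  | sup : (a : α) → (β a → W α β) → W α β

def NatSig : Bool → Type
  | false => Empty   -- "zero": no recursive arguments
  | true  => Unit    -- "succ": one recursive argument

def WNat : Type := W Bool NatSig
```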

  18. Type IIA flux compactifications. Vacua, effective theories and cosmological challenges

    International Nuclear Information System (INIS)

    Koers, Simon

    2009-01-01

In this thesis, we studied a number of type IIA SU(3)-structure compactifications with O6-planes on nilmanifolds and cosets, which are tractable enough to allow for an explicit derivation of the low-energy effective theory. In particular, we calculated the mass spectrum of the light scalar modes, using N = 1 supergravity techniques. For the torus and the Iwasawa solution, we also performed an explicit Kaluza-Klein reduction, which led to the same result. For the nilmanifold examples we found that there are always three unstabilized moduli corresponding to axions in the RR sector. On the other hand, in the coset models, except for SU(2) x SU(2), all moduli are stabilized. We discussed Kaluza-Klein decoupling for the supersymmetric AdS vacua and found that it requires going to the nearly-Calabi-Yau limit. We searched for non-trivial de Sitter minima in the original flux potential away from the AdS vacuum. Finally, in chapter 7, we focused on a family of three coset spaces and constructed non-supersymmetric vacua on them. (orig.)

  19. Verifying Process Algebra Proofs in Type Theory

    NARCIS (Netherlands)

    Sellink, M.P.A.

    In this paper we study automatic verification of proofs in process algebra. Formulas of process algebra are represented by types in typed λ-calculus. Inhabitants (terms) of these types represent proofs. The specific typed λ-calculus we use is the Calculus of Inductive Constructions as implemented

  20. Quantification of discreteness effects in cosmological N-body simulations: Initial conditions

    International Nuclear Information System (INIS)

    Joyce, M.; Marcos, B.

    2007-01-01

    The relation between the results of cosmological N-body simulations, and the continuum theoretical models they simulate, is currently not understood in a way which allows a quantification of N dependent effects. In this first of a series of papers on this issue, we consider the quantification of such effects in the initial conditions of such simulations. A general formalism developed in [A. Gabrielli, Phys. Rev. E 70, 066131 (2004).] allows us to write down an exact expression for the power spectrum of the point distributions generated by the standard algorithm for generating such initial conditions. Expanded perturbatively in the amplitude of the input (i.e. theoretical, continuum) power spectrum, we obtain at linear order the input power spectrum, plus two terms which arise from discreteness and contribute at large wave numbers. For cosmological type power spectra, one obtains as expected, the input spectrum for wave numbers k smaller than that characteristic of the discreteness. The comparison of real space correlation properties is more subtle because the discreteness corrections are not as strongly localized in real space. For cosmological type spectra the theoretical mass variance in spheres and two-point correlation function are well approximated above a finite distance. For typical initial amplitudes this distance is a few times the interparticle distance, but it diverges as this amplitude (or, equivalently, the initial redshift of the cosmological simulation) goes to zero, at fixed particle density. We discuss briefly the physical significance of these discreteness terms in the initial conditions, in particular, with respect to the definition of the continuum limit of N-body simulations
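Schematically, the result described can be summarized as follows (our paraphrase of the structure, not the paper's exact expression; see Gabrielli 2004 for the full formula):

```latex
% To linear order in the input spectrum, the generated point distribution has
\begin{equation}
  P(k) \;\simeq\; P_{\mathrm{in}}(k) \;+\; P_{\mathrm{disc}}(k),
  \qquad P_{\mathrm{disc}}(k) \to 0 \quad \text{for } k \ll k_N,
\end{equation}
% where P_disc collects the discreteness terms, which scale with the inverse
% particle density 1/n and contribute at large wave numbers (k_N denotes the
% Nyquist wave number of the particle grid).
```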

  1. The early life origin theory in the development of cardiovascular disease and type 2 diabetes.

    Science.gov (United States)

    Lindblom, Runa; Ververis, Katherine; Tortorella, Stephanie M; Karagiannis, Tom C

    2015-04-01

Life expectancy has been examined from a variety of perspectives in recent history. Epidemiology is one perspective which examines causes of morbidity and mortality at the population level. Over the past few hundred years there have been dramatic shifts in the major causes of death and expected life length. This change has been inconsistent across time and space, with vast inequalities observed between population groups. Currently in focus is the challenge of rising non-communicable diseases (NCDs), such as cardiovascular disease and type 2 diabetes mellitus. In the search for methods to combat the rising incidence of these diseases, a number of new theories on the development of morbidity have arisen. A pertinent example is the hypothesis published by David Barker in 1995, which postulates the prenatal and early developmental origin of adult-onset disease and highlights the importance of the maternal environment. This theory has been subject to criticism; however, it has gradually gained acceptance. In addition, the relatively new field of epigenetics is contributing evidence in support of the theory. This review aims to explore the implications and limitations of the developmental origin hypothesis, via an historical perspective, in order to enhance understanding of the increasing incidence of NCDs, and to facilitate an improvement in planning public health policy.

  2. Pitfalls of DNA Quantification Using DNA-Binding Fluorescent Dyes and Suggested Solutions.

    Science.gov (United States)

    Nakayama, Yuki; Yamaguchi, Hiromi; Einaga, Naoki; Esumi, Mariko

    2016-01-01

    The Qubit fluorometer is a DNA quantification device based on the fluorescence intensity of fluorescent dye binding to double-stranded DNA (dsDNA). Qubit is generally considered useful for checking DNA quality before next-generation sequencing because it measures intact dsDNA. To examine the most accurate and suitable methods for quantifying DNA for quality assessment, we compared three quantification methods: NanoDrop, which measures UV absorbance; Qubit; and quantitative PCR (qPCR), which measures the abundance of a target gene. For the comparison, we used three types of DNA: 1) DNA extracted from fresh frozen liver tissues (Frozen-DNA); 2) DNA extracted from formalin-fixed, paraffin-embedded liver tissues comparable to those used for Frozen-DNA (FFPE-DNA); and 3) DNA extracted from the remaining fractions after RNA extraction with Trizol reagent (Trizol-DNA). These DNAs were serially diluted with distilled water and measured using three quantification methods. For Frozen-DNA, the Qubit values were not proportional to the dilution ratio, in contrast with the NanoDrop and qPCR values. This non-proportional decrease in Qubit values was dependent on a lower salt concentration, and over 1 mM NaCl in the DNA solution was required for the Qubit measurement. For FFPE-DNA, the Qubit values were proportional to the dilution ratio and were lower than the NanoDrop values. However, electrophoresis revealed that qPCR reflected the degree of DNA fragmentation more accurately than Qubit. Thus, qPCR is superior to Qubit for checking the quality of FFPE-DNA. For Trizol-DNA, the Qubit values were proportional to the dilution ratio and were consistently lower than the NanoDrop values, similar to FFPE-DNA. However, the qPCR values were higher than the NanoDrop values. Electrophoresis with SYBR Green I and single-stranded DNA (ssDNA) quantification demonstrated that Trizol-DNA consisted mostly of non-fragmented ssDNA. Therefore, Qubit is not always the most accurate method for

  3. Group field theory with noncommutative metric variables.

    Science.gov (United States)

    Baratin, Aristide; Oriti, Daniele

    2010-11-26

    We introduce a dual formulation of group field theories as a type of noncommutative field theories, making their simplicial geometry manifest. For Ooguri-type models, the Feynman amplitudes are simplicial path integrals for BF theories. We give a new definition of the Barrett-Crane model for gravity by imposing the simplicity constraints directly at the level of the group field theory action.

  4. Modeling the size dependent pull-in instability of beam-type NEMS using strain gradient theory

    Directory of Open Access Journals (Sweden)

    Ali Koochi

It is well recognized that the size dependency of material characteristics, i.e. the size effect, often plays a significant role in the performance of nano-structures. Herein, strain gradient continuum theory is employed to investigate the size-dependent pull-in instability of beam-type nano-electromechanical systems (NEMS). The two most common types of NEMS, i.e. the nano-bridge and the nano-cantilever, are considered. Effects of the electrostatic field and dispersion forces, i.e. Casimir and van der Waals (vdW) attractions, have been considered in the nonlinear governing equations of the systems. Two different solution methods, including numerical and Rayleigh-Ritz, have been employed to solve the constitutive differential equations of the system. The effects of the dispersion forces, the size dependency and the coupling between them on the instability performance are discussed.
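As a point of reference for readers, the classical lumped parallel-plate pull-in model (which the strain-gradient and dispersion-force corrections modify) can be computed directly; this is not the paper's formulation, and the parameter values are assumed.

```python
# Classic lumped parallel-plate pull-in model for a generic electrostatic
# NEMS actuator (NOT the paper's strain-gradient formulation).
import numpy as np

EPS0 = 8.854e-12                     # vacuum permittivity, F/m
k, g, area = 1.0, 1.0e-6, 1.0e-10    # stiffness (N/m), gap (m), area (m^2)

v_pull_in = np.sqrt(8.0 * k * g**3 / (27.0 * EPS0 * area))
print(f"pull-in voltage ~ {v_pull_in:.1f} V")

# Equilibrium in dimensionless form: u(1-u)^2 = lam, with u = deflection/gap;
# the stable branch exists only for lam < 4/27 (i.e. deflection < gap/3).
for v in np.linspace(0.2, 0.95, 4) * v_pull_in:
    lam = EPS0 * area * v**2 / (2.0 * k * g**3)
    roots = np.roots([1.0, -2.0, 1.0, -lam])
    stable = min(r.real for r in roots if abs(r.imag) < 1e-9 and r.real >= 0.0)
    print(f"V/V_pi = {v / v_pull_in:.2f} -> deflection = {stable:.3f} x gap")
```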

  5. Quantification of camel DNA from three camel types 2017

    OpenAIRE

    Alhaddad, Hasan

    2017-01-01

This poster summarizes comparisons of the quantity of DNA extracted from blood, saliva, and hair across three camel types: Majaheem, Sufor, and Wadh. The poster was presented at the Kuwait University research-sector poster day 2017.

  6. Stereological quantification of mast cells in human synovium

    DEFF Research Database (Denmark)

    Damsgaard, T E; Sørensen, Flemming Brandt; Herlin, T

    1999-01-01

Mast cells participate in both the acute allergic reaction as well as in chronic inflammatory diseases. Earlier studies have revealed divergent results regarding the quantification of mast cells in the human synovium. The aim of the present study was therefore to quantify these cells in the human synovium, using stereological techniques. Different methods of staining and quantification have previously been used for mast cell quantification in human synovium. Stereological techniques provide precise and unbiased information on the number of cell profiles in two-dimensional tissue sections of, in this case, human synovium. In 10 patients suffering from osteoarthritis, a median of 3.6 mast cells/mm2 synovial membrane was found. The total number of cells (synoviocytes, fibroblasts, lymphocytes, leukocytes) present was 395.9 cells/mm2 (median). The mast cells constituted 0.8% of all the cell profiles...

  7. Toward an integrative account of social cognition: marrying theory of mind and interactionism to study the interplay of Type 1 and Type 2 processes.

    Science.gov (United States)

    Bohl, Vivian; van den Bos, Wouter

    2012-01-01

    Traditional theory of mind (ToM) accounts for social cognition have been at the basis of most studies in the social cognitive neurosciences. However, in recent years, the need to go beyond traditional ToM accounts for understanding real life social interactions has become all the more pressing. At the same time it remains unclear whether alternative accounts, such as interactionism, can yield a sufficient description and explanation of social interactions. We argue that instead of considering ToM and interactionism as mutually exclusive opponents, they should be integrated into a more comprehensive account of social cognition. We draw on dual process models of social cognition that contrast two different types of social cognitive processing. The first type (labeled Type 1) refers to processes that are fast, efficient, stimulus-driven, and relatively inflexible. The second type (labeled Type 2) refers to processes that are relatively slow, cognitively laborious, flexible, and may involve conscious control. We argue that while interactionism captures aspects of social cognition mostly related to Type 1 processes, ToM is more focused on those based on Type 2 processes. We suggest that real life social interactions are rarely based on either Type 1 or Type 2 processes alone. On the contrary, we propose that in most cases both types of processes are simultaneously involved and that social behavior may be sustained by the interplay between these two types of processes. Finally, we discuss how the new integrative framework can guide experimental research on social interaction.

  9. Validated RP-HPLC/DAD Method for the Quantification of Insect Repellent Ethyl 2-Aminobenzoate in Membrane-Moderated Matrix Type Monolithic Polymeric Device.

    Science.gov (United States)

    Islam, Johirul; Zaman, Kamaruz; Chakrabarti, Srijita; Sharma Bora, Nilutpal; Mandal, Santa; Pratim Pathak, Manash; Srinivas Raju, Pakalapati; Chattopadhyay, Pronobesh

    2017-07-01

A simple, accurate and sensitive reversed-phase high-performance liquid chromatographic (RP-HPLC) method has been developed for the estimation of ethyl 2-aminobenzoate (EAB) in a matrix-type monolithic polymeric device and validated as per the International Conference on Harmonization guidelines. The analysis was performed isocratically on a ZORBAX Eclipse Plus C18 analytical column (250 × 4.4 mm, 5 μm) and a diode array detector (DAD), using acetonitrile and water (75:25 v/v) as the mobile phase with the flow rate kept constant at 1.0 mL/min. Determination of EAB was not subject to interference from the excipients. Inter- and intra-day relative standard deviations were not higher than 2%. Mean recovery was between 98.7 and 101.3%. The calibration curve was linear in the concentration range of 0.5-10 µg/mL. Limits of detection and quantification were 0.19 and 0.60 µg/mL, respectively. Thus, the present report puts forward a novel method for the estimation of EAB, an emerging insect repellent, using the RP-HPLC technique. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
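The detection and quantification limits quoted are consistent with the standard ICH formulas, LOD = 3.3σ/S and LOQ = 10σ/S; the σ and S below are back-filled assumptions chosen to reproduce the reported values, not figures from the paper.

```python
# ICH-style limits from the response SD (sigma) and calibration slope (S).
sigma = 0.9     # SD of the blank/low-level response (peak-area units), assumed
slope = 15.0    # calibration slope (peak area per ug/mL), assumed

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD ~ {lod:.2f} ug/mL, LOQ ~ {loq:.2f} ug/mL")
```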

  10. PCR amplification of repetitive sequences as a possible approach in relative species quantification

    DEFF Research Database (Denmark)

    Ballin, Nicolai Zederkopff; Vogensen, Finn Kvist; Karlsson, Anders H

    2012-01-01

Both relative and absolute quantification are possible in species quantification when single-copy genomic DNA is used. However, amplification of single-copy genomic DNA does not allow a limit of detection as low as that obtained from amplification of repetitive sequences. Amplification of repetitive sequences is therefore frequently used in absolute quantification, but problems occur in relative quantification as the number of repetitive sequences is unknown. A promising approach was developed where data from amplification of repetitive sequences were used in relative quantification of species... to relatively quantify the amount of chicken DNA in a binary mixture of chicken DNA and pig DNA. However, the designed PCR primers lack the specificity required for regulatory species control.
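One common way to set up such a relative measurement is the comparative-Ct (2^-ΔΔCt) calculation against a calibrator sample; this is a generic sketch with invented Ct values, not necessarily the paper's exact scheme.

```python
# Generic comparative-Ct (2^-ddCt) relative quantification.
def relative_amount(ct_tgt_mix, ct_ref_mix, ct_tgt_cal, ct_ref_cal):
    """Fold content of the target species in a mixture vs. a calibrator."""
    dd_ct = (ct_tgt_mix - ct_ref_mix) - (ct_tgt_cal - ct_ref_cal)
    return 2.0 ** (-dd_ct)

# Chicken repetitive target vs. a reference assay; calibrator = 100% chicken.
print(f"chicken fraction ~ {relative_amount(18.2, 24.0, 16.1, 24.1):.2f}")
```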

  11. A Probabilistic Framework for Peptide and Protein Quantification from Data-Dependent and Data-Independent LC-MS Proteomics Experiments

    Science.gov (United States)

    Richardson, Keith; Denny, Richard; Hughes, Chris; Skilling, John; Sikora, Jacek; Dadlez, Michał; Manteca, Angel; Jung, Hye Ryung; Jensen, Ole Nørregaard; Redeker, Virginie; Melki, Ronald; Langridge, James I.; Vissers, Johannes P.C.

    2013-01-01

    A probability-based quantification framework is presented for the calculation of relative peptide and protein abundance in label-free and label-dependent LC-MS proteomics data. The results are accompanied by credible intervals and regulation probabilities. The algorithm takes into account data uncertainties via Poisson statistics modified by a noise contribution that is determined automatically during an initial normalization stage. Protein quantification relies on assignments of component peptides to the acquired data. These assignments are generally of variable reliability and may not be present across all of the experiments comprising an analysis. It is also possible for a peptide to be identified to more than one protein in a given mixture. For these reasons the algorithm accepts a prior probability of peptide assignment for each intensity measurement. The model is constructed in such a way that outliers of any type can be automatically reweighted. Two discrete normalization methods can be employed. The first method is based on a user-defined subset of peptides, while the second method relies on the presence of a dominant background of endogenous peptides for which the concentration is assumed to be unaffected. Normalization is performed using the same computational and statistical procedures employed by the main quantification algorithm. The performance of the algorithm will be illustrated on example data sets, and its utility demonstrated for typical proteomics applications. The quantification algorithm supports relative protein quantification based on precursor and product ion intensities acquired by means of data-dependent methods, originating from all common isotopically-labeled approaches, as well as label-free ion intensity-based data-independent methods. PMID:22871168
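To give a flavor of the probabilistic ingredients named above (Poisson counting statistics, credible intervals, regulation probabilities), here is a toy two-condition comparison; it is a stand-in illustration, not the paper's algorithm.

```python
# Toy Poisson-rate comparison with a credible interval and a
# "regulation probability" for two summed intensity counts.
import numpy as np

rng = np.random.default_rng(1)
counts_a, counts_b = 480, 620        # hypothetical summed intensities (counts)

# Gamma posteriors (Jeffreys prior) for the two Poisson rates.
ratio = rng.gamma(counts_b + 0.5, 1.0, 100_000) / \
        rng.gamma(counts_a + 0.5, 1.0, 100_000)
lo, med, hi = np.percentile(ratio, [2.5, 50.0, 97.5])
print(f"fold change {med:.2f} (95% credible interval {lo:.2f}-{hi:.2f})")
print(f"regulation probability P(ratio > 1) = {np.mean(ratio > 1):.3f}")
```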

  12. Objective quantification of the tinnitus decompensation by synchronization measures of auditory evoked single sweeps.

    Science.gov (United States)

    Strauss, Daniel J; Delb, Wolfgang; D'Amelio, Roberto; Low, Yin Fen; Falkai, Peter

    2008-02-01

Large-scale neural correlates of the tinnitus decompensation might be used for an objective evaluation of therapies and neurofeedback-based therapeutic approaches. In this study, we try to identify large-scale neural correlates of the tinnitus decompensation using wavelet phase stability criteria of single-sweep sequences of late auditory evoked potentials as a synchronization stability measure. The extracted measure provided an objective quantification of the tinnitus decompensation and allowed for a reliable discrimination between a group of compensated and decompensated tinnitus patients. We provide an interpretation for our results by a neural model of top-down projections based on the Jastreboff tinnitus model combined with adaptive resonance theory, which has not been applied to model tinnitus so far. Using this model, our stability measure of evoked potentials can be linked to the focus of attention on the tinnitus signal. It is concluded that the wavelet phase stability of late auditory evoked potential single sweeps might be used as an objective tinnitus decompensation measure and can be interpreted in the framework of the Jastreboff tinnitus model and adaptive resonance theory.
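A phase-stability measure of this general kind can be prototyped briefly; the Morlet wavelet choice and the phase-locking statistic below are our assumptions, not necessarily the authors' exact criteria.

```python
# Prototype of a wavelet phase-stability measure over single sweeps.
import numpy as np

def phase_stability(sweeps: np.ndarray, fs: float, freq: float) -> np.ndarray:
    """sweeps: (n_sweeps, n_samples). Returns phase stability per time point."""
    t = np.arange(-0.5, 0.5, 1.0 / fs)
    sigma = 5.0 / (2.0 * np.pi * freq)                  # ~5-cycle Morlet
    wavelet = np.exp(2j * np.pi * freq * t - t**2 / (2.0 * sigma**2))
    coeffs = np.array([np.convolve(s, wavelet, mode="same") for s in sweeps])
    phasors = coeffs / (np.abs(coeffs) + 1e-12)         # unit phase vectors
    return np.abs(phasors.mean(axis=0))                 # 1.0 = perfectly locked

rng = np.random.default_rng(0)
noise_sweeps = rng.standard_normal((40, 512))           # surrogate sweeps
print(phase_stability(noise_sweeps, fs=512.0, freq=6.0).max())  # low for noise
```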

  13. Quantification of cellular uptake of DNA nanostructures by qPCR

    DEFF Research Database (Denmark)

    Okholm, Anders Hauge; Nielsen, Jesper Sejrup; Vinther, Mathias

    2014-01-01

    interactions and structural and functional features of the DNA delivery device must be thoroughly investigated. Here, we present a rapid and robust method for the precise quantification of the component materials of DNA origami structures capable of entering cells in vitro. The quantification is performed...

  14. Dynamic behaviors of spin-1/2 bilayer system within Glauber-type stochastic dynamics based on the effective-field theory

    International Nuclear Information System (INIS)

    Ertaş, Mehmet; Kantar, Ersin; Keskin, Mustafa

    2014-01-01

    The dynamic phase transitions (DPTs) and dynamic phase diagrams of the kinetic spin-1/2 bilayer system in the presence of a time-dependent oscillating external magnetic field are studied by using Glauber-type stochastic dynamics based on the effective-field theory with correlations for the ferromagnetic/ferromagnetic (FM/FM), antiferromagnetic/ferromagnetic (AFM/FM) and antiferromagnetic/antiferromagnetic (AFM/AFM) interactions. The time variations of average magnetizations and the temperature dependence of the dynamic magnetizations are investigated. The dynamic phase diagrams for the amplitude of the oscillating field versus temperature were presented. The results are compared with the results of the same system within Glauber-type stochastic dynamics based on the mean-field theory. - Highlights: • The Ising bilayer system is investigated within the Glauber dynamics based on EFT. • The time variations of average order parameters to find phases are studied. • The dynamic phase diagrams are found for the different interaction parameters. • The system displays the critical points as well as a re-entrant behavior
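    The paper derives its phase diagrams from effective-field theory, but the underlying kinetic model — Glauber single-spin-flip dynamics in an oscillating field — is straightforward to simulate directly. The Monte Carlo sketch below for an FM/FM bilayer is illustrative only (couplings, lattice size and temperature are made-up values) and computes the period-averaged magnetization used as the dynamic order parameter.

    import numpy as np

    rng = np.random.default_rng(2)
    L, J_a, J_b, J_inter = 12, 1.0, 1.0, 0.5     # FM/FM bilayer couplings (assumed)
    T, h0, steps_per_period, n_periods = 2.0, 0.3, 100, 20

    spins = rng.choice([-1, 1], size=(2, L, L))  # two coupled L x L layers

    def local_field(s, layer, i, j, h):
        """Couplings acting on site (layer, i, j) plus the external field."""
        nn = (s[layer, (i + 1) % L, j] + s[layer, (i - 1) % L, j] +
              s[layer, i, (j + 1) % L] + s[layer, i, (j - 1) % L])
        J = J_a if layer == 0 else J_b
        return J * nn + J_inter * s[1 - layer, i, j] + h

    m_trace = []
    for step in range(n_periods * steps_per_period):
        h = h0 * np.cos(2 * np.pi * step / steps_per_period)
        for _ in range(2 * L * L):               # one Glauber sweep per time step
            layer, i, j = rng.integers(2), rng.integers(L), rng.integers(L)
            e = local_field(spins, layer, i, j, h)
            # Glauber (heat-bath) rule: s -> +1 with probability 1/(1+exp(-2E/T))
            spins[layer, i, j] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * e / T)) else -1
        m_trace.append(spins.mean())

    # Dynamic order parameter: magnetization averaged over the last 5 periods
    Q = float(np.mean(m_trace[-5 * steps_per_period:]))
    print(f"Q = {Q:.3f} (Q ~ 0: dynamically disordered; |Q| > 0: dynamically ordered)")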

  15. Dynamic behaviors of spin-1/2 bilayer system within Glauber-type stochastic dynamics based on the effective-field theory

    Energy Technology Data Exchange (ETDEWEB)

    Ertaş, Mehmet; Kantar, Ersin, E-mail: ersinkantar@erciyes.edu.tr; Keskin, Mustafa

    2014-05-01

    The dynamic phase transitions (DPTs) and dynamic phase diagrams of the kinetic spin-1/2 bilayer system in the presence of a time-dependent oscillating external magnetic field are studied by using Glauber-type stochastic dynamics based on the effective-field theory with correlations for the ferromagnetic/ferromagnetic (FM/FM), antiferromagnetic/ferromagnetic (AFM/FM) and antiferromagnetic/antiferromagnetic (AFM/AFM) interactions. The time variations of average magnetizations and the temperature dependence of the dynamic magnetizations are investigated. The dynamic phase diagrams for the amplitude of the oscillating field versus temperature were presented. The results are compared with the results of the same system within Glauber-type stochastic dynamics based on the mean-field theory. - Highlights: • The Ising bilayer system is investigated within the Glauber dynamics based on EFT. • The time variations of average order parameters to find phases are studied. • The dynamic phase diagrams are found for the different interaction parameters. • The system displays the critical points as well as a re-entrant behavior.

  16. Scoliosis: review of types of curves, etiological theories and conservative treatment.

    Science.gov (United States)

    Shakil, Halima; Iqbal, Zaheen A; Al-Ghadir, Ahmad H

    2014-01-01

    Scoliosis is the deviation in the normal vertical spine. Although there are numerous studies available about treatment approaches for scoliosis, the numbers of studies that talk about its etiology and pathology are limited. Aim of this study was to discuss the different types of scoliosis; its curves and etiological theories; and to note their implication on its treatment. We examined various electronic databases including Pub MED, Medline, Cinhal, Cochrane library and Google scholar using key words "scoliosis", "etiology", "pathology" and "conservative treatment". References of obtained articles were also examined for cross references. The search was limited to articles in English language. A total of 145 papers, about Prevalence, History, Symptoms, classification, Biomechanics, Pathogenesis, Kinematics and Treatment of scoliosis were identified to be relevant. To choose the appropriate treatment approach for scoliosis we need to understand its etiology and pathogenesis first. Early intervention with conservative treatment like physiotherapy and bracing can prevent surgery.

  17. Theory of magnetoelectric coupling in 2-2-type magnetostrictive/piezoelectric composite film with texture

    International Nuclear Information System (INIS)

    Liu Chaoqian; Fei Weidong; Li Weili

    2008-01-01

    It is well accepted that textures in polycrystalline films have significant effects on film properties. The magnetoelectric (ME) coupling in a 2-2-type multiferroic composite film was theoretically discussed using Landau-Ginsburg-Devonshire theory, where the influences of dispersive texture and residual stress were considered. As an example, the 2-2-type CoFe 2 O 4 /BaTiO 3 composite film was theoretically analysed, wherein the case of both the magnetostrictive phase and the piezoelectric phase with (0 0 1)-oriented texture was considered. Our results show that the ME coupling is enhanced with the texture degree of the piezoelectric phase and/or the magnitude of the residual tensile stress, but weakened with the magnitude of residual compressive stress. With increasing texture degree of the magnetostrictive phase, the ME coupling is enhanced when the texture degree is smaller than a critical value, but weakened when the texture degree is larger than the critical value

  18. Molecular quantification of genes encoding for green-fluorescent proteins

    DEFF Research Database (Denmark)

    Felske, A; Vandieken, V; Pauling, B V

    2003-01-01

    A quantitative PCR approach is presented to analyze the amount of recombinant green fluorescent protein (gfp) genes in environmental DNA samples. The quantification assay is a combination of specific PCR amplification and temperature gradient gel electrophoresis (TGGE). Gene quantification...... PCR strategy is a highly specific and sensitive way to monitor recombinant DNA in environments like the efflux of a biotechnological plant....

  19. A highly sensitive method for quantification of iohexol

    DEFF Research Database (Denmark)

    Schulz, A.; Boeringer, F.; Swifka, J.

    2014-01-01

    -chromatography-electrospray-massspectrometry (LC-ESI-MS) approach using the multiple reaction monitoring mode for iohexol quantification. In order to test whether a significantly decreased amount of iohexol is sufficient for reliable quantification, a LC-ESI-MS approach was assessed. We analyzed the kinetics of iohexol in rats after application...... of different amounts of iohexol (15 mg to 150 1.tg per rat). Blood sampling was conducted at four time points, at 15, 30, 60, and 90 min, after iohexol injection. The analyte (iohexol) and the internal standard (iotha(amic acid) were separated from serum proteins using a centrifugal filtration device...... with a cut-off of 3 kDa. The chromatographic separation was achieved on an analytical Zorbax SB C18 column. The detection and quantification were performed on a high capacity trap mass spectrometer using positive ion ESI in the multiple reaction monitoring (MRM) mode. Furthermore, using real-time polymerase...

  20. Quantification of sensory and food quality: the R-index analysis.

    Science.gov (United States)

    Lee, Hye-Seong; van Hout, Danielle

    2009-08-01

    The accurate quantification of sensory difference/similarity between foods, as well as consumer acceptance/preference and concepts, is greatly needed to optimize and maintain food quality. The R-Index is one class of measures of the degree of difference/similarity, and was originally developed for sensory difference tests for food quality control, product development, and so on. The index is based on signal detection theory and is free of the response bias that can invalidate difference testing protocols, including categorization and same-different and A-Not A tests. It is also a nonparametric analysis, making no assumptions about sensory distributions, and is simple to compute and understand. The R-Index is also flexible in its application. Methods based on R-Index analysis have been used as detection and sensory difference tests, as simple alternatives to hedonic scaling, and for the measurement of consumer concepts. This review indicates the various computational strategies for the R-Index and its practical applications to consumer and sensory measurements in food science.
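    As a concrete illustration, the R-index for rated response data reduces to a simple counting rule: the probability that a randomly chosen "signal" sample receives a more signal-sure rating than a randomly chosen "noise" sample, with ties counted as half. The sketch below follows that standard definition; the counts are hypothetical.

    import numpy as np

    def r_index(signal_counts, noise_counts):
        """R-index (%): categories are ordered from most to least
        signal-sure, e.g. ['S sure', 'S unsure', 'N unsure', 'N sure']."""
        s = np.asarray(signal_counts, dtype=float)
        n = np.asarray(noise_counts, dtype=float)
        wins = sum(s[i] * n[i + 1:].sum() for i in range(len(s)))
        ties = (s * n).sum()
        return 100.0 * (wins + 0.5 * ties) / (s.sum() * n.sum())

    # Example from a same-different style test (hypothetical counts)
    print(r_index([20, 10, 6, 4], [4, 6, 10, 20]))  # 80.0% discriminable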

  1. Matrix String Theory

    CERN Document Server

    Dijkgraaf, R; Verlinde, Herman L

    1997-01-01

    Via compactification on a circle, the matrix model of M-theory proposed by Banks et al suggests a concrete identification between the large N limit of two-dimensional N=8 supersymmetric Yang-Mills theory and type IIA string theory. In this paper we collect evidence that supports this identification. We explicitly identify the perturbative string states and their interactions, and describe the appearance of D-particle and D-membrane states.

  2. K-theory, reality, and orientifolds

    International Nuclear Information System (INIS)

    Gukov, S.

    2000-01-01

    We use equivariant K-theory to classify charges of new (possibly non-supersymmetric) states localized on various orientifolds in type II string theory. We also comment on the stringy construction of new D-branes and demonstrate the discrete electric-magnetic duality in type I brane systems with p+q=7, as proposed by Witten. (orig.)

  3. A new type of disconnectedness problem in a field-theory model of the NNπ system

    International Nuclear Information System (INIS)

    Stelbovics, A.T.; Stingl, M.

    1978-01-01

    When treated as an effective three-body problem in the framework of a simple field-theory model, the NNπ system acquires, in addition to the disconnected subsystem interactions usually considered, a new type of disconnected driving term, possible only for non-conserved particles such as the π. These terms pose a disconnectedness problem more intricate than that solved by Faddeev's equations or their known modifications for connected three-body forces. The solution of this problem in terms of a set of connected-kernel integral equations is presented. (Auth.)

  4. Quantification of regional cerebral blood flow and volume with dynamic susceptibility contrast-enhanced MR imaging.

    Science.gov (United States)

    Rempp, K A; Brix, G; Wenz, F; Becker, C R; Gückel, F; Lorenz, W J

    1994-12-01

    The purpose of this work was the quantification of regional cerebral blood flow (rCBF) and volume (rCBV) with dynamic magnetic resonance (MR) imaging. After bolus administration of a paramagnetic contrast medium, rapid T2*-weighted gradient-echo images of two sections were acquired for the simultaneous creation of concentration-time curves in the brain-feeding arteries and in brain tissue. Absolute rCBF and rCBV values were determined for gray and white brain matter in 12 subjects using principles of indicator dilution theory. The mean rCBF value in gray matter was 69.7 mL/min +/- 29.7 per 100 g tissue and in white matter, 33.6 mL/min +/- 11.5 per 100 g tissue; the average rCBV was 8.0 mL +/- 3.1 per 100 g tissue and 4.2 mL +/- 1.0 per 100 g tissue, respectively. An age-related decrease in rCBF and rCBV for gray and white matter was observed. Preliminary data demonstrate that the proposed technique allows the quantification of rCBF and rCBV. Although the results are in good agreement with data from positron emission tomography studies, further evaluation is needed to establish the validity of method.
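    The indicator-dilution arithmetic behind these estimates can be sketched compactly. The snippet below is a simplification under stated assumptions: rCBV is taken as the ratio of tissue to arterial curve areas (with assumed density and hematocrit correction constants), and rCBF follows from the central volume theorem using a crude first-moment transit time instead of the deconvolution a full analysis would use.

    import numpy as np

    def rcbv_rcbf(tissue, aif, dt, rho=1.04, k_h=0.73):
        """tissue, aif: Delta-R2*(t) curves (= -ln(S(t)/S0)/TE) for brain
        tissue and a brain-feeding artery; dt: sampling interval (s).
        rho (brain density, g/mL) and k_h (hematocrit correction) are
        literature constants assumed here."""
        tissue, aif = np.asarray(tissue), np.asarray(aif)
        area = lambda y: float(np.sum(y) * dt)        # rectangle-rule integral
        rcbv = 100.0 * (k_h / rho) * area(tissue) / area(aif)  # mL per 100 g
        t = np.arange(tissue.size) * dt
        mtt = area(t * tissue) / area(tissue)         # crude first-moment MTT (s)
        return rcbv, rcbv / mtt * 60.0                # rCBV, rCBF (mL/min/100 g)

    # Synthetic gamma-variate bolus curves (illustrative only)
    t = np.arange(0.0, 60.0, 1.0)
    aif = (t / 6.0) ** 3 * np.exp(-t / 2.0)
    tis = 0.05 * (t / 9.0) ** 3 * np.exp(-t / 3.0)
    rcbv, rcbf = rcbv_rcbf(tis, aif, dt=1.0)
    print(f"rCBV = {rcbv:.1f} mL/100 g, rCBF = {rcbf:.1f} mL/min/100 g")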

  5. Breaking E8 to SO(16) in M-theory and F-theory

    International Nuclear Information System (INIS)

    Aldabe, F.

    1998-01-01

    M-theory on an 11-dimensional manifold with a boundary must have an E8 gauge group at each boundary in order to cancel anomalies. The type IA supergravity must have an SO(16) gauge group at each boundary in order to be a consistent theory. The latter action can be obtained from the former one via dimensional reduction. Here we make use of the current algebra of the open membrane which couples to the former action to explain why the gauge group E8 breaks down to SO(16) in going from M-theory to type IA supergravity. We also use the same current algebra to explain why F-theory has an E8 x E8 gauge group in its strong coupling limit while it has an SO(16) x SO(16) gauge group in its weak coupling limit. (orig.)

  6. Comparison of DNA Quantification Methods for Next Generation Sequencing.

    Science.gov (United States)

    Robin, Jérôme D; Ludlow, Andrew T; LaRanger, Ryan; Wright, Woodring E; Shay, Jerry W

    2016-04-06

    Next Generation Sequencing (NGS) is a powerful tool that depends on loading a precise amount of DNA onto a flowcell. NGS strategies have expanded our ability to investigate genomic phenomena by referencing mutations in cancer and diseases through large-scale genotyping, developing methods to map rare chromatin interactions (4C, 5C and Hi-C) and identifying chromatin features associated with regulatory elements (ChIP-seq, Bis-Seq, ChiA-PET). While many methods are available for DNA library quantification, there is no unambiguous gold standard. Most techniques use PCR to amplify DNA libraries to obtain sufficient quantities for optical density measurement. However, increased PCR cycles can distort the library's heterogeneity and prevent the detection of rare variants. In this analysis, we compared new digital PCR technologies (droplet digital PCR, ddPCR; and ddPCR-Tail) with standard methods for the titration of NGS libraries. The ddPCR-Tail method is comparable to qPCR and fluorometry (QuBit) and allows sensitive quantification by analysis of barcode repartition after sequencing of multiplexed samples. This study provides a direct comparison between quantification methods throughout a complete sequencing experiment and provides the impetus to use ddPCR-based quantification for improvement of NGS quality.

  7. Mass spectrometric quantification of glucosylsphingosine in plasma and urine of type 1 Gaucher patients using an isotope standard.

    Science.gov (United States)

    Mirzaian, Mina; Wisse, Patrick; Ferraz, Maria J; Gold, Henrik; Donker-Koopman, Wilma E; Verhoek, Marri; Overkleeft, Herman S; Boot, Rolf G; Kramer, Gertjan; Dekker, Nick; Aerts, Johannes M F G

    2015-04-01

    Deficiency of glucocerebrosidase (GBA) leads to Gaucher disease (GD), an inherited disorder characterised by storage of glucosylceramide (GlcCer) in lysosomes of tissue macrophages. Recently, we reported marked increases of deacylated GlcCer, named glucosylsphingosine (GlcSph), in plasma of GD patients. To improve quantification, 13C5-GlcSph (labeled at positions 5-9) was synthesised for use as internal standard with quantitative LC-ESI-MS/MS. The method was validated using plasma of 55 GD patients and 20 controls. Intra-assay variation was 1.8% and inter-assay variation was 4.9% for GlcSph (m/z 462.3). Plasma GlcSph levels with the old and new methods closely correlate (r=0.968, slope=1.038). Next, we analysed GlcSph in 24-h urine samples of 30 GD patients prior to therapy. GlcSph was detected in the patient samples (median 1.20 nM, range 0.11-8.92 nM), but was below the limit of quantification in normal urine. Enzyme replacement therapy led to a decrease of urinary GlcSph in GD patients, coinciding with reductions in plasma GlcSph and markers of Gaucher cells (chitotriosidase and CCL18). In analogy to globotriaosylsphingosine in urine of Fabry disease patients, additional isoforms of GlcSph differing in structure of the sphingosine moiety were identified in GD urine samples. In conclusion, GlcSph can be sensitively detected by LC-ESI-MS/MS with an internal isotope standard. Abnormalities in urinary GlcSph are a hallmark of Gaucher disease allowing biochemical confirmation of diagnosis. Copyright © 2015. Published by Elsevier Inc.
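    The quantification step itself is classic isotope dilution: the analyte amount follows from the peak-area ratio to the spiked 13C5 internal standard. A minimal sketch, with purely illustrative numbers and an assumed response factor of 1 for the co-eluting isotopologue:

    def glcsph_concentration(area_analyte, area_istd, istd_amount_pmol,
                             sample_vol_ml, rrf=1.0):
        """Isotope-dilution quantification: amount = (A_analyte / A_istd)
        * spiked standard amount / relative response factor.
        pmol per mL is numerically equal to nmol/L (nM)."""
        amount_pmol = (area_analyte / area_istd) * istd_amount_pmol / rrf
        return amount_pmol / sample_vol_ml      # nM

    # Hypothetical sample: 0.1 mL spiked with 1 pmol of 13C5-GlcSph
    print(f"{glcsph_concentration(5.2e4, 4.8e4, 1.0, 0.1):.2f} nM")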

  8. SPECT quantification of regional radionuclide distributions

    International Nuclear Information System (INIS)

    Jaszczak, R.J.; Greer, K.L.; Coleman, R.E.

    1986-01-01

    SPECT quantification of regional radionuclide activities within the human body is affected by several physical and instrumental factors including attenuation of photons within the patient, Compton scattered events, the system's finite spatial resolution and object size, finite number of detected events, partial volume effects, the radiopharmaceutical biokinetics, and patient and/or organ motion. Furthermore, other instrumentation factors such as calibration of the center-of-rotation, sampling, and detector nonuniformities will affect the SPECT measurement process. These factors are described, together with examples of compensation methods that are currently available for improving SPECT quantification. SPECT offers the potential to improve in vivo estimates of absorbed dose, provided the acquisition, reconstruction, and compensation procedures are adequately implemented and utilized. 53 references, 2 figures

  9. Quantification and Negation in Event Semantics

    Directory of Open Access Journals (Sweden)

    Lucas Champollion

    2010-12-01

    Recently, it has been claimed that event semantics does not go well together with quantification, especially if one rejects syntactic, LF-based approaches to quantifier scope. This paper shows that such fears are unfounded, by presenting a simple, variable-free framework which combines a Neo-Davidsonian event semantics with a type-shifting based account of quantifier scope. The main innovation is that the event variable is bound inside the verbal denotation, rather than at sentence level by existential closure. Quantifiers can then be interpreted in situ. The resulting framework combines the strengths of event semantics and type-shifting accounts of quantifiers and thus does not force the semanticist to posit either a default underlying word order or a syntactic LF-style level. It is therefore well suited for applications to languages where word order is free and quantifier scope is determined by surface order. As an additional benefit, the system leads to a straightforward account of negation, which has also been claimed to be problematic for event-based frameworks.
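    The paper's central move — existentially closing the event variable inside the verbal denotation so that quantifiers and negation can apply in situ — can be illustrated with a toy extensional model. Everything below (the model, lexicon and denotations) is an invented illustration, not the paper's formal system:

    # Toy finite model: Neo-Davidsonian denotations with the event variable
    # existentially closed inside the verb.
    EVENTS = [
        {"pred": "laugh", "agent": "ann"},
        {"pred": "see", "agent": "ann", "theme": "bob"},
    ]
    PEOPLE = ["ann", "bob"]

    def laugh(x):
        # <e,t>: existential closure over events happens *here*, inside
        # the verbal denotation, not at sentence level
        return any(e["pred"] == "laugh" and e["agent"] == x for e in EVENTS)

    def nobody(pred):        # generalized quantifier, applied in situ
        return not any(pred(x) for x in PEOPLE)

    def everybody(pred):
        return all(pred(x) for x in PEOPLE)

    print(nobody(laugh))     # False: Ann laughed
    print(everybody(laugh))  # False: Bob did not laugh
    # With sentence-level existential closure instead, "nobody laughed"
    # would wrongly assert the existence of a laughing event first;
    # closing the event variable inside `laugh` avoids that.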

  10. Demographic and Motivation Differences Among Online Sex Offenders by Type of Offense: An Exploration of Routine Activities Theories.

    Science.gov (United States)

    Navarro, Jordana N; Jasinski, Jana L

    2015-01-01

    This article presents an analysis of the relationship between online sexual offenders' demographic background and characteristics indicative of motivation and offense type. Specifically, we investigate whether these characteristics can distinguish different online sexual offender groups from one another as well as inform routine activity theorists on what potentially motivates perpetrators. Using multinomial logistic regression, this study found that online sexual offenders' demographic backgrounds and characteristics indicative of motivation do vary by offense types. Two important implications of this study are that the term "online sexual offender" encompasses different types of offenders, including some who do not align with mainstream media's characterization of "predators," and that the potential offender within routine activity theory can be the focus of empirical investigation rather than taken as a given in research.

  11. Random matrix theory

    CERN Document Server

    Deift, Percy

    2009-01-01

    This book features a unified derivation of the mathematical theory of the three classical types of invariant random matrix ensembles-orthogonal, unitary, and symplectic. The authors follow the approach of Tracy and Widom, but the exposition here contains a substantial amount of additional material, in particular, facts from functional analysis and the theory of Pfaffians. The main result in the book is a proof of universality for orthogonal and symplectic ensembles corresponding to generalized Gaussian type weights following the authors' prior work. New, quantitative error estimates are derived.

  12. Theory of oscillators

    CERN Document Server

    Andronov, Aleksandr Aleksandrovich; Vitt, Aleksandr Adolfovich

    1966-01-01

    Theory of Oscillators presents the applications and exposition of the qualitative theory of differential equations. This book discusses the idea of a discontinuous transition in a dynamic process. Organized into 11 chapters, this book begins with an overview of the simplest type of oscillatory system in which the motion is described by a linear differential equation. This text then examines the character of the motion of the representative point along the hyperbola. Other chapters consider examples of two basic types of non-linear non-conservative systems, namely, dissipative systems and self-oscillating systems.

  13. Quantum mechanical analysis on faujasite-type molecular sieves by using Fermi-Dirac statistics and quantum theory of dielectricity

    International Nuclear Information System (INIS)

    Jabeen, S.; Raza, S.M.; Ahmed, M.A.; Zai, M.Y.; Akbar, S.; Jafri, Y.Z.

    2012-01-01

    We studied Faujasite-type molecular sieves by using Fermi-Dirac statistics and the quantum theory of dielectricity. We developed an empirical relationship for quantum capacitance which follows an inverse Gaussian profile in the frequency range of 66 Hz - 3 MHz. We calculated quantum capacitance, sample crystal momentum, charge quantization and quantized energy of Faujasite-type molecular sieves in the frequency range of 0.1 Hz - 10^4 MHz. Our calculations for the diameter of sodalite and super-cages of Faujasite-type molecular sieves are in agreement with experimental results reported in this manuscript. We also calculated quantum polarizability, quantized molecular field, orientational polarizability and deformation polarizability by using the experimental results of Ligia Frunza et al. The phonons are overdamped in the frequency range 0.1 Hz - 10 kHz and become a source for producing cages in the Faujasite-type molecular sieves. Ion exchange recovery processes occur due to overdamped phonon excitations in Faujasite-type molecular sieves and with increasing temperatures. (author)

  14. Conservation of ecosystems : theory and practice

    CSIR Research Space (South Africa)

    Siegfried, WR

    1982-09-01


  15. Flory-type theories of polymer chains under different external stimuli

    Science.gov (United States)

    Budkov, Yu A.; Kiselev, M. G.

    2018-01-01

    In this Review, we present a critical analysis of various applications of Flory-type theories to the theoretical description of the conformational behavior of single polymer chains in dilute polymer solutions under several external stimuli. Different theoretical models of flexible polymer chains in the supercritical fluid are discussed and analysed. Different points of view on the conformational behavior of the polymer chain near the liquid-gas transition critical point of the solvent are presented. A theoretical description of the co-solvent-induced coil-globule transitions within the implicit-solvent-explicit-co-solvent models is discussed. Several explicit-solvent-explicit-co-solvent theoretical models of the coil-to-globule-to-coil transition of the polymer chain in a mixture of good solvents (co-nonsolvency) are analysed and compared with each other. Finally, a new theoretical model of the conformational behavior of the dielectric polymer chain under an external constant electric field in dilute polymer solution, with explicit account of the many-body dipole correlations, is discussed. The polymer chain collapse induced by many-body dipole correlations of monomers in the context of statistical thermodynamics of dielectric polymers is analysed.
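    For readers new to Flory-type arguments, the prototype calculation balances elastic stretching against two-body repulsion, F(R) = R^2/(N a^2) + v N^2/R^3, whose minimum gives the good-solvent scaling R* ~ N^(3/5). A minimal numerical sketch (units and constants arbitrary):

    import numpy as np

    def flory_radius(N, a=1.0, v=1.0):
        """Minimize the classic Flory free energy
            F(R) = R^2/(N a^2) + v N^2/R^3
        (elastic stretching + two-body repulsion) on a grid; the minimum
        scales as R* ~ N^(3/5), the Flory exponent in a good solvent."""
        R = np.linspace(0.5 * a, 10 * a * N, 200_000)
        F = R**2 / (N * a**2) + v * N**2 / R**3
        return R[np.argmin(F)]

    for N in (100, 1000, 10000):
        # prefactor ~ (3/2)^(1/5) ~ 1.08, roughly constant across N
        print(N, flory_radius(N) / N**0.6)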

  16. Computable Types for Dynamic Systems

    NARCIS (Netherlands)

    P.J. Collins (Pieter); K. Ambos-Spies; B. Loewe; W. Merkle

    2009-01-01

    In this paper, we develop a theory of computable types suitable for the study of dynamic systems in discrete and continuous time. The theory uses type-two effectivity as the underlying computational model, but we quickly develop a type system which can be manipulated abstractly, but for ...

  17. Quantification of pelvic floor muscle strength in female urinary incontinence: A systematic review and comparison of contemporary methodologies.

    Science.gov (United States)

    Deegan, Emily G; Stothers, Lynn; Kavanagh, Alex; Macnab, Andrew J

    2018-01-01

    There remains no gold standard for quantification of voluntary pelvic floor muscle (PFM) strength, despite international guidelines that recommend PFM assessment in females with urinary incontinence (UI). Methods currently reported for quantification of skeletal muscle strength across disciplines are systematically reviewed and their relevance for clinical and academic use related to the pelvic floor are described. A systematic review was performed via Medline, PubMed, CINAHL, and the Cochrane database; key terms for pelvic floor anatomy and function were cross-referenced with skeletal muscle strength quantification from 1946 to 2016. Full text peer-reviewed articles in English having female subjects with incontinence were identified. Each study was analyzed for use of controls, type of methodology as direct or indirect measures, benefits, and limitations of the technique. A total of 1586 articles were identified of which 50 met the inclusion criteria. Nine methodologies of determining PFM strength were described including: digital palpation, perineometer, dynamometry, EMG, vaginal cones, ultrasonography, magnetic resonance imaging, urine stream interruption test, and the Colpexin pull test. Thirty-two percent lacked a control group. Technical refinements in both direct and indirect instrumentation for PFM strength measurement are allowing for greater sensitivity. However, the most common methods of quantification remain digital palpation and perineometry; techniques that pose limitations and yield subjective or indirect measures of muscular strength. Dynamometry has potential as an accurate and sensitive tool, but is limited by an inability to assess PFM strength during dynamic movements. © 2017 Wiley Periodicals, Inc.

  18. Superspace conformal field theory

    Energy Technology Data Exchange (ETDEWEB)

    Quella, Thomas [Koeln Univ. (Germany). Inst. fuer Theoretische Physik; Schomerus, Volker [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2013-07-15

    Conformal sigma models and WZW models on coset superspaces provide important examples of logarithmic conformal field theories. They possess many applications to problems in string and condensed matter theory. We review recent results and developments, including the general construction of WZW models on type I supergroups, the classification of conformal sigma models and their embedding into string theory.

  19. Superspace conformal field theory

    International Nuclear Information System (INIS)

    Quella, Thomas

    2013-07-01

    Conformal sigma models and WZW models on coset superspaces provide important examples of logarithmic conformal field theories. They possess many applications to problems in string and condensed matter theory. We review recent results and developments, including the general construction of WZW models on type I supergroups, the classification of conformal sigma models and their embedding into string theory.

  20. Transition operators in electromagnetic-wave diffraction theory - General theory

    Science.gov (United States)

    Hahne, G. E.

    1992-01-01

    A formal theory is developed for the scattering of time-harmonic electromagnetic waves from impenetrable immobile obstacles with given linear, homogeneous, and generally nonlocal boundary conditions of Leontovich (impedance) type for the wave on the obstacle's surface. The theory is modeled on the complete Green's function and the transition (T) operator in time-independent formal scattering theory of nonrelativistic quantum mechanics. An expression for the differential scattering cross section for plane electromagnetic waves is derived in terms of certain matrix elements of the T operator for the obstacle.

  1. Perturbation theory at large order in more than one coupling constant for a field theory with fermions

    International Nuclear Information System (INIS)

    Chowdhury, A.R.; Roy, T.

    1980-01-01

    We have considered the problem of evaluating the large order estimates of perturbation theory in a quantum field theory with more than one coupling constant. The theory considered is four dimensional and possesses instanton-type solutions. It contains a boson field coupled to a fermion through the usual g ψ̄ψφ-type interaction, along with the boson self-interaction λφ⁴. Our analysis reveals a phenomenon not observed in a theory with only one coupling constant. One gets different kinds of behavior in different regions of the (λ, g) plane. The results are quite encouraging for the application to more realistic field theories

  2. Nonrelativistic superstring theories

    International Nuclear Information System (INIS)

    Kim, Bom Soo

    2007-01-01

    We construct a supersymmetric version of the critical nonrelativistic bosonic string theory [B. S. Kim, Phys. Rev. D 76, 106007 (2007).] with its manifest global symmetry. We introduce the anticommuting bc conformal field theory (CFT) which is the super partner of the βγ CFT. The conformal weights of the b and c fields are both 1/2. The action of the fermionic sector can be transformed into that of the relativistic superstring theory. We explicitly quantize the theory with manifest SO(8) symmetry and find that the spectrum is similar to that of type IIB superstring theory. There is one notable difference: the fermions are nonchiral. We further consider noncritical generalizations of the supersymmetric theory using the superspace formulation. There is an infinite range of possible string theories similar to the supercritical string theories. We comment on the connection between the critical nonrelativistic string theory and the lightlike linear dilaton theory

  3. Installation Restoration Program, Phase II - Confirmation/Quantification Stage I, Moody Air Force Base, Georgia.

    Science.gov (United States)

    1985-12-01

    [OCR fragments of the report's documentation page and glossary; only partial information is recoverable: personal authors Steinberg, J.A. and Thiess, W.G.; a soils note stating that on the high-ground western portion of the base the surface soils are mostly in the Tifton series; and abbreviations FDER (Florida Department of Environmental Regulation), FWQS (Florida Water Quality Standards), gpd (gallons per day), gpm (gallons per minute), GC (gas chromatograph).]

  4. Preclinical evaluation and quantification of [18F]MK-9470 as a radioligand for PET imaging of the type 1 cannabinoid receptor in rat brain

    International Nuclear Information System (INIS)

    Casteels, Cindy; Koole, Michel; Laere, Koen van; Celen, Sofie; Bormans, Guy

    2012-01-01

    [18F]MK-9470 is an inverse agonist for the type 1 cannabinoid (CB1) receptor allowing its use in PET imaging. We characterized the kinetics of [18F]MK-9470 and evaluated its ability to quantify CB1 receptor availability in the rat brain. Dynamic small-animal PET scans with [18F]MK-9470 were performed in Wistar rats on a FOCUS-220 system for up to 10 h. Both plasma and perfused brain homogenates were analysed using HPLC to quantify radiometabolites. Displacement and blocking experiments were done using cold MK-9470 and another inverse agonist, SR141716A. The distribution volume (VT) of [18F]MK-9470 was used as a quantitative measure and compared to the use of brain uptake, expressed as SUV, a simplified method of quantification. The percentage of intact [18F]MK-9470 in arterial plasma samples was 80 ± 23% at 10 min, 38 ± 30% at 40 min and 13 ± 14% at 210 min. A polar radiometabolite fraction was detected in plasma and brain tissue. The brain radiometabolite concentration was uniform across the whole brain. Displacement and pretreatment studies showed that 56% of the tracer binding was specific and reversible. VT values obtained with a one-tissue compartment model plus constrained radiometabolite input had good identifiability (≤10%). Ignoring the radiometabolite contribution by using a one-tissue compartment model alone, i.e. without constrained radiometabolite input, overestimated the [18F]MK-9470 VT, but the two estimates were correlated. A correlation between [18F]MK-9470 VT and SUV in the brain was also found (R² = 0.26-0.33; p ≤ 0.03). While the presence of a brain-penetrating radiometabolite fraction complicates the quantification of [18F]MK-9470 in the rat brain, its tracer kinetics can be modelled using a one-tissue compartment model with and without constrained radiometabolite input. (orig.)

  5. Theory of mind in children with Neurofibromatosis Type 1.

    Science.gov (United States)

    Payne, Jonathan M; Porter, Melanie; Pride, Natalie A; North, Kathryn N

    2016-05-01

    Neurofibromatosis Type I (NF1) is a single gene disorder associated with cognitive and behavioral deficits. While there is clear evidence for poorer social outcomes in NF1, the factors underlying reduced social function are not well understood. This study examined theory of mind (ToM) in children with NF1 and unaffected controls. ToM was assessed in children with NF1 (n = 26) and unaffected controls (n = 36) aged 4-12 years using a nonverbal picture sequencing task. The task assessed understanding of ToM (unrealized goals, false belief, pretence, intention), while controlling for social script knowledge and physical cause-and-effect reasoning. Children with NF1 made significantly more errors than unaffected controls on most ToM stories while demonstrating no difficulty sequencing physical cause-and-effect stories. Performance on the picture sequencing task was not related to lower intellectual function, symptoms of attention deficit-hyperactivity disorder (ADHD), or parent ratings of executive function. Results suggest a generalized ToM deficit in children with NF1 that appears to be independent of general cognitive abilities and ADHD symptoms. The study refines understanding of the clinical presentation of NF1 and identifies psychological constructs that may contribute to the higher prevalence of social dysfunction in children with NF1. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  6. Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.

    Science.gov (United States)

    Hawkins, Steve F C; Guest, Paul C

    2018-01-01

    The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, the exact quantification of a NGS library is crucial in order to obtain good data on sequencing platforms developed by the current market leader Illumina. Different approaches for DNA quantification are available currently and the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow exact quantification as can be achieved using a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. This can be applied in various fields of study such as medical disorders resulting from nutritional programming disturbances.
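    The qPCR arithmetic underlying such protocols is a standard-curve interpolation: fit Cq against log10 concentration for a dilution series of a known standard, check the amplification efficiency from the slope, then read off unknown library concentrations. The sketch below uses invented example values and is not the chapter's exact protocol:

    import numpy as np

    def fit_standard_curve(log10_conc, cq):
        """Fit Cq = slope*log10(conc) + intercept for a dilution series.
        Amplification efficiency E = 10^(-1/slope) - 1 (1.0 = 100%)."""
        slope, intercept = np.polyfit(log10_conc, cq, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0
        return slope, intercept, efficiency

    def quantify(cq_sample, slope, intercept, dilution=1.0):
        """Interpolate an unknown library concentration from its Cq."""
        return dilution * 10 ** ((cq_sample - intercept) / slope)

    # Hypothetical 10-fold dilution series of a 10 nM standard
    log10_conc = np.log10([10, 1, 0.1, 0.01, 0.001])     # nM
    cq = np.array([6.1, 9.5, 12.9, 16.3, 19.7])          # ideal ~3.4 cycles/decade
    slope, intercept, eff = fit_standard_curve(log10_conc, cq)
    print(f"efficiency = {eff:.2f}, sample = {quantify(11.2, slope, intercept):.3f} nM")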

  7. Breaking Ground: A Study of Gestalt Therapy Theory and Holland's Theory of Vocational Choice.

    Science.gov (United States)

    Hartung, Paul J.

    In both Gestalt therapy and Holland's theory of vocational choice, person-environment interaction receives considerable emphasis. Gestalt therapy theory suggests that people make contact (that is, meet needs) through a characteristic style of interacting with the environment. Holland identifies six personality types in his theory and asserts that…

  8. Lamb wave-based damage quantification and probability of detection modeling for fatigue life assessment of riveted lap joint

    Science.gov (United States)

    He, Jingjing; Wang, Dengjiang; Zhang, Weifang

    2015-03-01

    This study presents an experimental and modeling study for damage detection and quantification in riveted lap joints. Embedded lead zirconate titanate piezoelectric (PZT) ceramic wafer-type sensors are employed to perform in-situ non-destructive testing during cyclic fatigue loading. A multi-feature integration method is developed to quantify the crack size using signal features of correlation coefficient, amplitude change, and phase change. In addition, a probability of detection (POD) model is constructed to quantify the reliability of the developed sizing method. Using the developed crack size quantification method and the resulting POD curve, probabilistic fatigue life prediction can be performed to provide comprehensive information for decision-making. The effectiveness of the overall methodology is demonstrated and validated using several aircraft lap joint specimens from different manufacturers and under different loading conditions.
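    A common way to build a POD model from such data is the "a-hat versus a" approach: regress the measured damage feature on crack size on log-log axes, assume Gaussian residuals, and evaluate the probability that the feature exceeds the detection threshold at each size. The sketch below is illustrative only; the synthetic feature values and threshold stand in for the paper's fused correlation/amplitude/phase index:

    import math
    import numpy as np

    def pod_curve(crack_sizes, features, decision_threshold):
        """'a-hat versus a' POD model: regress log(feature) on log(size),
        assume Gaussian residuals, and return POD(a) = P(feature > threshold)."""
        x, y = np.log(crack_sizes), np.log(features)
        b1, b0 = np.polyfit(x, y, 1)
        sigma = np.std(y - (b0 + b1 * x), ddof=2)   # residual scatter
        def pod(a):
            z = (math.log(decision_threshold) - b0 - b1 * math.log(a)) / sigma
            return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2)))
        return pod

    rng = np.random.default_rng(3)
    a = np.linspace(0.5, 5.0, 40)                   # crack size, mm
    feat = 0.2 * a ** 1.5 * rng.lognormal(0, 0.2, a.size)  # synthetic damage index
    pod = pod_curve(a, feat, decision_threshold=0.5)
    print(f"POD at 2 mm: {pod(2.0):.2f}, at 4 mm: {pod(4.0):.2f}")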

  9. Two-Phase Microfluidic Systems for High Throughput Quantification of Agglutination Assays

    KAUST Repository

    Castro, David

    2018-01-01

    assay, with a minimum detection limit of 50 ng/mL using optical image analysis. We compare optical image analysis and light scattering as quantification methods, and demonstrate the first light scattering quantification of agglutination assays in a two-phase microfluidic system.

  10. Determination of statin drugs in hospital effluent with dispersive liquid-liquid microextraction and quantification by liquid chromatography.

    Science.gov (United States)

    Martins, Ayrton F; Frank, Carla da S; Altissimo, Joseline; de Oliveira, Júlia A; da Silva, Daiane S; Reichert, Jaqueline F; Souza, Darliana M

    2017-08-24

    Statins are amongst the most prescribed agents for treating hypercholesterolaemia and preventing vascular diseases. In this study, a rapid and effective liquid chromatography method, assisted by diode array detection, was designed and validated for the simultaneous quantification of atorvastatin (ATO) and simvastatin (SIM) in hospital effluent samples. The solid phase extraction (SPE) of the analytes was optimized regarding sorbent material and pH, and the dispersive liquid-liquid microextraction (DLLME), in terms of pH, ionic strength, and type and volume of extractor/dispersor solvents. The performance of both extraction procedures was evaluated in terms of linearity, quantification limits, accuracy (recovery %), precision and matrix effects for each analyte. The methods proved to be linear in the concentration range considered; the quantification limits were 0.45 µg L -1 for ATO and 0.75 µg L -1 for SIM; the matrix effect was almost absent in both methods and the average recoveries remained between 81.5-90.0%; and the RSD values were <20%. The validated methods were applied to the quantification of the statins in real samples of hospital effluent; the concentrations ranged from 18.8 µg L -1 to 35.3 µg L -1 for ATO, and from 30.3 µg L -1 to 38.5 µg L -1 for SIM. Since the calculated risk quotient was ≤192, the occurrence of ATO and SIM in hospital effluent poses a potentially serious risk to human health and the aquatic ecosystem.

  11. Antioxidant capacity of different types of tea products | Karori ...

    African Journals Online (AJOL)

    In the present study, twelve different types of commercial tea samples were assayed to determine their phenolic composition and antioxidant activity. Reverse phase high performance liquid chromatography using a binary gradient system was used for the identification and quantification of individual catechins. Subsequently ...

  12. Theory of timber connections with slender dowel type fasteners

    DEFF Research Database (Denmark)

    Svensson, Staffan; Munch-Andersen, Jørgen

    2018-01-01

    A theory on the lateral load-carrying capacity of timber connections with slender fasteners is presented. The basis of the theory is the coupled mechanical phenomena acting in the connection, while the wood and the slender fastener deform and yield prior to failure. The objective is to derive...... a sufficient description of actions and responses which have a determining influence on the load-carrying capacity of timber connections with slender fasteners. Model assumptions are discussed and made, but simplifications are left out. Even so, simple mathematical equations describing the lateral capacity......-carrying capacity of the tested connections.

  13. From F/M-theory to K-theory and back

    International Nuclear Information System (INIS)

    Garcia-Etxebarria, Inaki; Uranga, Angel M.

    2006-01-01

    We consider discrete K-theory tadpole cancellation conditions in type IIB orientifolds with magnetised 7-branes. Cancellation of K-theory charge constrains the choices of world-volume magnetic fluxes on the latter. We describe the F-/M-theory lift of these configurations, where 7-branes are encoded in the geometry of an elliptic fibration, and their magnetic quanta correspond to supergravity 4-form field strength fluxes. In a K3 compactification example, we show that standard quantization of 4-form fluxes as integer cohomology classes in K3 automatically implies the K-theory charge cancellation constraints on the 7-brane worldvolume magnetic fluxes in string theory (as well as new previously unnoticed discrete constraints, which we also interpret). Finally, we show that flux quantization in F-/M-theory implies that 7-brane world-volume flux quantization conditions are modified in the presence of 3-form fluxes

  14. From F/M-theory to K-theory and back

    CERN Document Server

    Garcia-Etxebarria, I; Garcia-Etxebarria, Inaki; Uranga, Angel M.

    2006-01-01

    We consider discrete K-theory tadpole cancellation conditions in type IIB orientifolds with magnetised 7-branes. Cancellation of K-theory charge constrains the choices of world-volume magnetic fluxes on the latter. We describe the F-/M-theory lift of these configurations, where 7-branes are encoded in the geometry of an elliptic fibration, and their magnetic quanta correspond to supergravity 4-form field strength fluxes. In a K3 compactification example, we show that standard quantization of 4-form fluxes as integer cohomology classes in K3 automatically implies the K-theory charge cancellation constraints on the 7-brane worldvolume magnetic fluxes in string theory (as well as new previously unnoticed discrete constraints, which we also interpret). Finally, we show that flux quantization in F-/M-theory implies that 7-brane world-volume flux quantization conditions are modified in the presence of 3-form fluxes.

  15. Field theories with multiple fermionic excitations

    International Nuclear Information System (INIS)

    Crawford, J.P.

    1978-01-01

    The reason for the existence of the muon has been an enigma since its discovery. Since that time there has been a continuing proliferation of elementary particles. It is proposed that this proliferation of leptons and quarks is comprehensible if there are only four fundamental particles, the leptons νe and e⁻, and the quarks u and d. All other leptons and quarks are imagined to be excited states of these four fundamental entities. Attention is restricted to the charged leptons and the electromagnetic interactions only. A detailed study of a field theory in which there is only one fundamental charged fermionic field having two (or more) excitations is made. When the electromagnetic interactions are introduced and the theory is second quantized, under certain conditions this theory reproduces the S matrix obtained from usual QED. In this case no electromagnetic transitions are allowed. A leptonic charge operator is defined and a superselection rule for this leptonic charge is found. Unfortunately, the mass spectrum cannot be obtained. This theory has many renormalizable generalizations including non-abelian gauge theories, Yukawa-type theories, and Fermi-type theories. Under certain circumstances the Yukawa- and Fermi-type theories are finite in perturbation theory. It is concluded that there are no fundamental objections to having fermionic fields with more than one excitation.

  16. Einstein-aether theory: dynamics of relativistic particles with spin or polarization in a Gödel-type universe

    Energy Technology Data Exchange (ETDEWEB)

    Balakin, Alexander B.; Popov, Vladimir A., E-mail: alexander.balakin@kpfu.ru, E-mail: vladipopov@mail.ru [Department of General Relativity and Gravitation, Institute of Physics, Kazan Federal University, Kremlevskaya str. 18, Kazan 420008 (Russian Federation)

    2017-04-01

    In the framework of the Einstein-aether theory we consider a cosmological model, which describes the evolution of the unit dynamic vector field with activated rotational degree of freedom. We discuss exact solutions of the Einstein-aether theory, for which the space-time is of the Gödel-type, the velocity four-vector of the aether motion is characterized by a non-vanishing vorticity, thus the rotational vectorial modes can be associated with the source of the universe rotation. The main goal of our paper is to study the motion of test relativistic particles with a vectorial internal degree of freedom (spin or polarization), which is coupled to the unit dynamic vector field. The particles are considered as the test ones in the given space-time background of the Gödel-type; the spin (polarization) coupling to the unit dynamic vector field is modeled using exact solutions of three types. The first exact solution describes the aether with arbitrary Jacobson's coupling constants; the second one relates to the case, when the Jacobson's constant responsible for the vorticity is vanishing; the third exact solution is obtained using three constraints for the coupling constants. The analysis of the exact expressions, which are obtained for the particle momentum and for the spin (polarization) four-vector components, shows that the interaction of the spin (polarization) with the unit vector field induces a rotation, which is additional to the geodesic precession of the spin (polarization) associated with the universe rotation as a whole.

  17. Theory of threshold phenomena

    International Nuclear Information System (INIS)

    Hategan, Cornel

    2002-01-01

    Theory of Threshold Phenomena in Quantum Scattering is developed in terms of Reduced Scattering Matrix. Relationships of different types of threshold anomalies both to nuclear reaction mechanisms and to nuclear reaction models are established. Magnitude of threshold effect is related to spectroscopic factor of zero-energy neutron state. The Theory of Threshold Phenomena, based on Reduced Scattering Matrix, does establish relationships between different types of threshold effects and nuclear reaction mechanisms: the cusp and non-resonant potential scattering, s-wave threshold anomaly and compound nucleus resonant scattering, p-wave anomaly and quasi-resonant scattering. A threshold anomaly related to resonant or quasi resonant scattering is enhanced provided the neutron threshold state has large spectroscopic amplitude. The Theory contains, as limit cases, Cusp Theories and also results of different nuclear reactions models as Charge Exchange, Weak Coupling, Bohr and Hauser-Feshbach models. (author)

  18. Molecular quantification of environmental DNA using microfluidics and digital PCR.

    Science.gov (United States)

    Hoshino, Tatsuhiko; Inagaki, Fumio

    2012-09-01

    Real-time PCR has been widely used to evaluate gene abundance in natural microbial habitats. However, PCR-inhibitory substances often reduce the efficiency of PCR, leading to the underestimation of target gene copy numbers. Digital PCR using microfluidics is a new approach that allows absolute quantification of DNA molecules. In this study, digital PCR was applied to environmental samples, and the effect of PCR inhibitors on DNA quantification was tested. In the control experiment using λ DNA and humic acids, underestimation of λ DNA at 1/4400 of the theoretical value was observed with 6.58 ng μL(-1) humic acids. In contrast, digital PCR provided accurate quantification data with a concentration of humic acids up to 9.34 ng μL(-1). The inhibitory effect of paddy field soil extract on quantification of the archaeal 16S rRNA gene was also tested. By diluting the DNA extract, quantified copy numbers from real-time PCR and digital PCR became similar, indicating that dilution was a useful way to remedy PCR inhibition. The dilution strategy was, however, not applicable to all natural environmental samples. For example, when marine subsurface sediment samples were tested the copy number of archaeal 16S rRNA genes was 1.04×10(3) copies/g-sediment by digital PCR, whereas real-time PCR only resulted in 4.64×10(2) copies/g-sediment, which was most likely due to an inhibitory effect. The data from this study demonstrated that inhibitory substances had little effect on DNA quantification using microfluidics and digital PCR, and showed the great advantages of digital PCR in accurate quantifications of DNA extracted from various microbial habitats. Copyright © 2012 Elsevier GmbH. All rights reserved.
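    The reason digital PCR resists inhibition is visible in its quantification formula: only the end-point fraction of positive partitions matters, and Poisson statistics convert that fraction to an absolute copy number. A minimal sketch with made-up run values:

    import math

    def dpcr_concentration(n_positive, n_total, partition_vol_nl):
        """Absolute quantification by digital PCR: mean copies per
        partition lambda = -ln(1 - p) from Poisson statistics, then
        divide by partition volume (1 nL = 1e-3 uL)."""
        p = n_positive / n_total
        lam = -math.log(1.0 - p)                  # copies per partition
        return lam / (partition_vol_nl * 1e-3)    # copies per microliter

    # Hypothetical run: 300 of 765 partitions positive, 0.85 nL partitions
    print(f"{dpcr_concentration(300, 765, 0.85):.0f} copies/uL")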

  19. Conformal field theory and functions of hypergeometric type

    International Nuclear Information System (INIS)

    Isachenkov, Mikhail

    2016-03-01

    Conformal field theory provides a universal description of various phenomena in natural sciences. Its development, swift and successful, belongs to the major highlights of theoretical physics of the late XX century. In contrast, advances of the theory of hypergeometric functions always assumed a slower pace throughout the centuries of its existence. Functional identities studied by this mathematical discipline are fascinating both in their complexity and beauty. This thesis investigates the interrelation of two subjects through a direct analysis of three CFT problems: two-point functions of the 2d strange metal CFT, three-point functions of primaries of the non-rational Toda CFT and kinematical parts of Mellin amplitudes for scalar four-point functions in general dimensions. We flesh out various generalizations of hypergeometric functions as a natural mathematical language for two of these problems. Several new methods inspired by extensions of classical results on hypergeometric functions, are presented.

  20. Conformal field theory and functions of hypergeometric type

    Energy Technology Data Exchange (ETDEWEB)

    Isachenkov, Mikhail

    2016-03-15

    Conformal field theory provides a universal description of various phenomena in natural sciences. Its development, swift and successful, belongs to the major highlights of theoretical physics of the late XX century. In contrast, advances of the theory of hypergeometric functions always assumed a slower pace throughout the centuries of its existence. Functional identities studied by this mathematical discipline are fascinating both in their complexity and beauty. This thesis investigates the interrelation of two subjects through a direct analysis of three CFT problems: two-point functions of the 2d strange metal CFT, three-point functions of primaries of the non-rational Toda CFT and kinematical parts of Mellin amplitudes for scalar four-point functions in general dimensions. We flesh out various generalizations of hypergeometric functions as a natural mathematical language for two of these problems. Several new methods inspired by extensions of classical results on hypergeometric functions, are presented.

  1. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.; Curioni, A.; Fedulova, I.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, that employ matrix factorizations, incur a cubic cost

  2. AVQS: Attack Route-Based Vulnerability Quantification Scheme for Smart Grid

    Directory of Open Access Journals (Sweden)

    Jongbin Ko

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Based on the data, a smart grid system has a potential security threat in its network connectivity. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification.

  3. AVQS: attack route-based vulnerability quantification scheme for smart grid.

    Science.gov (United States)

    Ko, Jongbin; Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Based on the data, a smart grid system has a potential security threat in its network connectivity. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification.
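    The abstract does not reproduce the scoring formulas, so the sketch below is only a schematic of the idea: combine per-node vulnerability scores along an attack route with connectivity-dependent exposure weights to rank routes. The aggregation rule and all values are assumptions, not the authors' AVQS definitions:

    # Illustrative sketch only; names and the aggregation rule are assumed.
    from math import prod

    def route_vulnerability(node_scores, link_exposures):
        """node_scores: CVSS-like base scores (0-10) along an attack route.
        link_exposures: 0-1 reachability/exposure weights for each hop."""
        network_score = sum(node_scores) / len(node_scores)          # node term
        end_to_end = prod(link_exposures) ** (1 / len(link_exposures))
        return network_score * end_to_end                            # 0-10 scale

    # Smart meter -> data concentrator -> head-end route (made-up values)
    print(route_vulnerability([6.5, 7.8, 9.0], [0.9, 0.7]))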

  4. Comparison of machine learning and semi-quantification algorithms for (I123)FP-CIT classification: the beginning of the end for semi-quantification?

    Science.gov (United States)

    Taylor, Jonathan Christopher; Fenner, John Wesley

    2017-11-29

    Semi-quantification methods are well established in the clinic for assisted reporting of (I123) Ioflupane images. Arguably, these are limited diagnostic tools. Recent research has demonstrated the potential for improved classification performance offered by machine learning algorithms. A direct comparison between methods is required to establish whether a move towards widespread clinical adoption of machine learning algorithms is justified. This study compared three machine learning algorithms with a range of semi-quantification methods, using the Parkinson's Progression Markers Initiative (PPMI) research database and a locally derived clinical database for validation. Machine learning algorithms were based on support vector machine classifiers with three different sets of features: voxel intensities; principal components of image voxel intensities; and striatal binding ratios from the putamen and caudate. Semi-quantification methods were based on striatal binding ratios (SBRs) from both putamina, with and without consideration of the caudates. Normal limits for the SBRs were defined through four different methods: minimum of age-matched controls; mean minus 1/1.5/2 standard deviations from age-matched controls; linear regression of normal patient data against age (minus 1/1.5/2 standard errors); and selection of the optimum operating point on the receiver operator characteristic curve from normal and abnormal training data. Each machine learning and semi-quantification technique was evaluated with stratified, nested 10-fold cross-validation, repeated 10 times. The mean accuracy of the semi-quantitative methods for classification of local data into Parkinsonian and non-Parkinsonian groups varied from 0.78 to 0.87, contrasting with 0.89 to 0.95 for classifying PPMI data into healthy controls and Parkinson's disease groups. The machine learning algorithms gave mean accuracies between 0.88 and 0.92 and between 0.95 and 0.97 for local and PPMI data, respectively. Classification…
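
    A minimal sketch of the third feature set and the evaluation protocol the study describes (an SVM on striatal binding ratios with stratified, nested 10-fold cross-validation), written with scikit-learn; the arrays here are random placeholders for real SBR features, and the parameter grid is an assumption:

    ```python
    import numpy as np
    from sklearn.model_selection import GridSearchCV, StratifiedKFold, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(120, 4))      # placeholder SBRs: L/R putamen, L/R caudate
    y = rng.integers(0, 2, size=120)   # 0 = non-Parkinsonian, 1 = Parkinsonian

    inner = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    outer = StratifiedKFold(n_splits=10, shuffle=True, random_state=1)
    clf = GridSearchCV(
        make_pipeline(StandardScaler(), SVC(kernel="rbf")),
        param_grid={"svc__C": [0.1, 1, 10], "svc__gamma": ["scale", 0.01]},
        cv=inner,
    )
    acc = cross_val_score(clf, X, y, cv=outer)  # nested CV: tune inside, score outside
    print(acc.mean(), acc.std())
    ```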

  5. Locality and realism in contextual theories

    International Nuclear Information System (INIS)

    Hoekzema, D.

    1987-01-01

    Two types of contextual theories are distinguished and shown to be related. For theories of each type a criterion of locality is formulated which is weaker than the classical requirement of separability at spacelike intervals. The relations between the concepts of locality, realism, and ontic chance are discussed

  6. The optimization model for multi-type customers assisting wind power consumptive considering uncertainty and demand response based on robust stochastic theory

    International Nuclear Information System (INIS)

    Tan, Zhongfu; Ju, Liwei; Reed, Brent; Rao, Rao; Peng, Daoxin; Li, Huanhuan; Pan, Ge

    2015-01-01

    Highlights: • Our research focuses on demand response behaviors of multi-type customers. • A wind power simulation method is proposed based on Brownian motion theory. • Demand response revenue functions are proposed for multi-type customers. • A robust stochastic optimization model is proposed for wind power consumption. • Models are built to measure the impacts of demand response on wind power consumption. - Abstract: In order to relieve the influence of wind power uncertainty on power system operation, demand response and robust stochastic theory are introduced to build a stochastic scheduling optimization model. Firstly, this paper presents a simulation method for wind power that considers the external environment, based on Brownian motion theory. Secondly, price-based demand response and incentive-based demand response are introduced to build the demand response model. Thirdly, the paper constructs demand response revenue functions for electric vehicle customers, business customers, industry customers and residential customers. Furthermore, robust stochastic optimization theory is introduced to build a wind power consumption stochastic optimization model. Finally, a simulation analysis is performed on an IEEE 36-node, 10-unit system connected with 650 MW of wind farms. The results show that robust stochastic optimization is better able to overcome wind power uncertainty, and that demand response can improve the system's wind power consumption capability. Price-based demand response can reshape customers' load demand distribution, but its load curtailment capacity is less pronounced than that of incentive-based demand response. Since price-based demand response cannot shift customers' load demand to the same extent as incentive-based demand response, the overall optimization effect is best when incentive-based and price-based demand response are both introduced.
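
    The paper's wind simulation is Brownian-motion based; the sketch below shows the generic idea with a geometric Brownian path clipped to installed capacity. The drift, volatility, initial output and time grid are illustrative assumptions, not the paper's calibration (only the 650 MW capacity comes from the abstract):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    hours, steps = 24.0, 96                 # one day at 15-minute resolution
    dt = hours / steps
    mu, sigma = 0.0, 0.08                   # assumed drift and volatility
    p0, cap = 400.0, 650.0                  # assumed initial output; 650 MW capacity

    dW = rng.normal(0.0, np.sqrt(dt), size=steps)
    path = p0 * np.exp(np.cumsum((mu - 0.5 * sigma**2) * dt + sigma * dW))
    path = np.clip(path, 0.0, cap)          # output cannot exceed installed capacity
    ```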

  7. Hidden symmetries of the Kaluza-Klein-type theories

    International Nuclear Information System (INIS)

    Popov, A.D.

    1987-01-01

    It is shown that the introduction of dynamical torsion in Kaluza-Klein theories makes it possible to increase the number of gauge fields extracted from the Lagrangian without increasing the number of extra dimensions. An example of spontaneous compactification of the investigated model is considered.

  8. Rapid Quantification and Validation of Lipid Concentrations within Liposomes

    Directory of Open Access Journals (Sweden)

    Carla B. Roces

    2016-09-01

    Full Text Available Quantification of the lipid content in liposomal adjuvants for subunit vaccine formulation is of extreme importance, since this concentration impacts both efficacy and stability. In this paper, we outline a high performance liquid chromatography-evaporative light scattering detector (HPLC-ELSD) method that allows for the rapid and simultaneous quantification of lipid concentrations within liposomal systems prepared by three liposomal manufacturing techniques (lipid film hydration, high shear mixing, and microfluidics). The ELSD system was used to quantify four lipids: 1,2-dimyristoyl-sn-glycero-3-phosphocholine (DMPC), cholesterol, dimethyldioctadecylammonium (DDA) bromide, and ᴅ-(+)-trehalose 6,6′-dibehenate (TDB). The developed method offers rapidity, high sensitivity, direct linearity, and good consistency in the responses (R² > 0.993) for the four lipids tested. The corresponding limits of detection (LOD) and quantification (LOQ) were 0.11 and 0.36 mg/mL (DMPC), 0.02 and 0.80 mg/mL (cholesterol), 0.06 and 0.20 mg/mL (DDA), and 0.05 and 0.16 mg/mL (TDB), respectively. HPLC-ELSD was shown to be a rapid and effective method for the quantification of lipids within liposome formulations without the need for lipid extraction processes.
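
    As a reminder of how such LOD/LOQ figures are typically derived from a calibration curve (the common 3.3σ/S and 10σ/S convention), here is a sketch with made-up data points; real ELSD responses may call for a log-log fit rather than the direct-linear one used here:

    ```python
    import numpy as np

    conc = np.array([0.05, 0.10, 0.25, 0.50, 1.00, 2.00])    # mg/mL (hypothetical)
    area = np.array([118., 255., 645., 1290., 2570., 5080.]) # peak areas (hypothetical)

    slope, intercept = np.polyfit(conc, area, 1)
    resid = area - (slope * conc + intercept)
    sigma = resid.std(ddof=2)            # residual SD of the regression

    lod = 3.3 * sigma / slope            # limit of detection
    loq = 10.0 * sigma / slope           # limit of quantification
    r2 = 1.0 - (resid**2).sum() / ((area - area.mean())**2).sum()
    print(f"LOD={lod:.3f} mg/mL, LOQ={loq:.3f} mg/mL, R^2={r2:.4f}")
    ```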

  9. Oriented matroid theory as a mathematical framework for M-theory

    OpenAIRE

    Nieto, J. A.

    2006-01-01

    We claim that $M$(atroid) theory may provide a mathematical framework for an underlying description of $M$-theory. Duality is the key symmetry which motivates our proposal. The definition of an oriented matroid in terms of the Farkas property plays a central role in our formalism. We outline how this definition may be carried over to $M$-theory. As a consequence of our analysis we find a new type of action for extended systems which combines dually the $p$-brane and its dual $p^{\perp}$-brane.

  10. Superlattice band structure: New and simple energy quantification condition

    Energy Technology Data Exchange (ETDEWEB)

    Maiz, F., E-mail: fethimaiz@gmail.com [University of Cartage, Nabeul Engineering Preparatory Institute, Merazka, 8000 Nabeul (Tunisia); King Khalid University, Faculty of Science, Physics Department, P.O. Box 9004, Abha 61413 (Saudi Arabia)

    2014-10-01

    Assuming an approximate effective mass and using Bastard's boundary conditions, a simple method is used to calculate the subband structure of periodic semiconducting heterostructures. Our method consists of deriving and solving the energy quantification condition (EQC); this is a simple real equation, composed of trigonometric and hyperbolic functions, which requires no programming effort or sophisticated machinery to solve. For heterostructures with fewer than ten wells, we have derived and simplified the energy quantification conditions. The subband is built point by point; each point represents an energy level. Our simple energy quantification condition is used to calculate the subband structure of GaAs/Ga{sub 0.5}Al{sub 0.5}As heterostructures and to build the subband point by point for 4 and 20 wells. Our findings show good agreement with previously published results.
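
    The EQC itself is not reproduced in the record, but its flavor — a real equation mixing trigonometric (well) and hyperbolic (barrier) terms — matches the textbook Kronig-Penney condition, which the sketch below scans for allowed minibands. All material parameters are illustrative, and the free-electron mass is used instead of Bastard-type effective masses:

    ```python
    import numpy as np

    HB2_2M = 3.81                  # hbar^2 / (2 m_e) in eV·Å^2 (free-electron mass)
    a, b, V0 = 100.0, 30.0, 0.3    # well width, barrier width (Å), barrier height (eV)

    E = np.linspace(1e-4, V0 - 1e-4, 5000)   # energies below the barrier
    alpha = np.sqrt(E / HB2_2M)              # well wavenumber
    beta = np.sqrt((V0 - E) / HB2_2M)        # barrier decay constant

    # Kronig-Penney dispersion: cos(k(a+b)) = RHS(E); |RHS| <= 1 marks a miniband.
    rhs = (np.cos(alpha * a) * np.cosh(beta * b)
           + (beta**2 - alpha**2) / (2 * alpha * beta)
           * np.sin(alpha * a) * np.sinh(beta * b))
    allowed = np.abs(rhs) <= 1.0
    print(f"fraction of scanned energies inside minibands: {allowed.mean():.2f}")
    ```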

  11. Chern-Simons couplings for dielectric F-strings in matrix string theory

    International Nuclear Information System (INIS)

    Brecher, Dominic; Janssen, Bert; Lozano, Yolanda

    2002-01-01

    We compute the non-abelian couplings in the Chern-Simons action for a set of coinciding fundamental strings in both the type IIA and type IIB Matrix string theories. Starting from Matrix theory in a weakly curved background, we construct the linear couplings of closed string fields to type IIA Matrix strings. Further dualities give a type IIB Matrix string theory and a type IIA theory of Matrix strings with winding. (Abstract Copyright[2002], Wiley Periodicals, Inc.)

  12. Some relations between twisted K-theory and E8 gauge theory

    International Nuclear Information System (INIS)

    Mathai, Varghese; Sati, Hisham

    2004-01-01

    Recently, Diaconescu, Moore and Witten provided a nontrivial link between K-theory and M-theory, by deriving the partition function of the Ramond-Ramond fields of Type IIA string theory from an E8 gauge theory in eleven dimensions. We give some relations between twisted K-theory and M-theory by adapting the method of Diaconescu-Moore-Witten and Moore-Saulina. In particular, we construct the twisted K-theory torus which defines the partition function, and also discuss the problem from the E8 loop group picture, in which the Dixmier-Douady class is the Neveu-Schwarz field. In the process of doing this, we encounter some mathematics that is new to the physics literature. In particular, the eta differential form, which is the generalization of the eta invariant, arises naturally in this context. We conclude with several open problems in mathematics and string theory. (author)

  13. Quantification bias caused by plasmid DNA conformation in quantitative real-time PCR assay.

    Science.gov (United States)

    Lin, Chih-Hui; Chen, Yu-Chieh; Pan, Tzu-Ming

    2011-01-01

    Quantitative real-time PCR (qPCR) is the gold standard for the quantification of specific nucleic acid sequences. However, a serious concern has been revealed in a recent report: supercoiled plasmid standards cause significant overestimation in qPCR quantification. In this study, we investigated the effect of plasmid DNA conformation on the quantification of DNA and the efficiency of qPCR. Our results suggest that plasmid DNA conformation has a significant impact on the accuracy of absolute quantification by qPCR. DNA standard curves shifted significantly among plasmid standards with different DNA conformations. Moreover, the choice of DNA measurement method and plasmid DNA conformation may also contribute to the measurement error of DNA standard curves. Given the multiple effects of plasmid DNA conformation on the accuracy of qPCR, efforts should be made to assure the highest consistency of plasmid standards for qPCR. Thus, we suggest that the conformation, preparation, quantification, purification, handling, and storage of standard plasmid DNA be described and defined in the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) to assure the reproducibility and accuracy of qPCR absolute quantification.
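
    For context, absolute qPCR quantification rests on a standard curve of Cq against log₁₀ copy number, and the conformation bias the authors describe enters through the plasmid standards that define this curve. A minimal sketch with invented dilution-series numbers:

    ```python
    import numpy as np

    log10_copies = np.array([7., 6., 5., 4., 3., 2.])     # plasmid dilution series
    cq = np.array([15.1, 18.5, 21.9, 25.3, 28.8, 32.2])   # hypothetical measured Cq

    slope, intercept = np.polyfit(log10_copies, cq, 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0   # ~1.0 means 100% amplification efficiency

    def copies_from_cq(cq_unknown):
        """Invert the standard curve; inherits any bias in the plasmid standards."""
        return 10.0 ** ((cq_unknown - intercept) / slope)

    print(efficiency, copies_from_cq(24.0))
    ```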

  14. The Predictive Effects of Protection Motivation Theory on Intention and Behaviour of Physical Activity in Patients with Type 2 Diabetes

    Science.gov (United States)

    Ali Morowatisharifabad, Mohammad; Abdolkarimi, Mahdi; Asadpour, Mohammad; Fathollahi, Mahmood Sheikh; Balaee, Parisa

    2018-01-01

    INTRODUCTION: Theory-based education tailored to the target behaviour and group can be effective in promoting physical activity. AIM: The purpose of this study was to examine the predictive power of Protection Motivation Theory on the intention and behaviour of physical activity in patients with Type 2 diabetes. METHODS: This descriptive study was conducted on 250 patients in Rafsanjan, Iran. To examine the scores of the protection motivation theory structures, a researcher-made questionnaire was used; its validity and reliability were confirmed. The level of physical activity was also measured by the International Short-Form Physical Activity Inventory, whose validity and reliability were also approved. Data were analysed by statistical tests including correlation coefficient, chi-square, logistic regression and linear regression. RESULTS: The results revealed a significant correlation between all the protection motivation theory constructs and the intention to do physical activity. The theory structures were able to predict 60% of the variance of physical activity intention. Logistic regression demonstrated that increases in the scores for physical activity intention and self-efficacy raised the odds of a higher level of physical activity by factors of 3.4 and 1.5, respectively (OR = 3.39 and 1.54). CONCLUSION: Considering the ability of the protection motivation theory structures to explain physical activity behaviour, interventional designs based on the structures of this theory are suggested, especially to improve self-efficacy as the most powerful factor in predicting physical activity intention and behaviour. PMID:29731945

  15. The Predictive Effects of Protection Motivation Theory on Intention and Behaviour of Physical Activity in Patients with Type 2 Diabetes.

    Science.gov (United States)

    Ali Morowatisharifabad, Mohammad; Abdolkarimi, Mahdi; Asadpour, Mohammad; Fathollahi, Mahmood Sheikh; Balaee, Parisa

    2018-04-15

    Theory-based education tailored to the target behaviour and group can be effective in promoting physical activity. The purpose of this study was to examine the predictive power of Protection Motivation Theory on the intention and behaviour of physical activity in patients with Type 2 diabetes. This descriptive study was conducted on 250 patients in Rafsanjan, Iran. To examine the scores of the protection motivation theory structures, a researcher-made questionnaire was used; its validity and reliability were confirmed. The level of physical activity was also measured by the International Short-Form Physical Activity Inventory, whose validity and reliability were also approved. Data were analysed by statistical tests including correlation coefficient, chi-square, logistic regression and linear regression. The results revealed a significant correlation between all the protection motivation theory constructs and the intention to do physical activity. The theory structures were able to predict 60% of the variance of physical activity intention. Logistic regression demonstrated that increases in the scores for physical activity intention and self-efficacy raised the odds of a higher level of physical activity by factors of 3.4 and 1.5, respectively (OR = 3.39 and 1.54). Considering the ability of the protection motivation theory structures to explain physical activity behaviour, interventional designs based on the structures of this theory are suggested, especially to improve self-efficacy as the most powerful factor in predicting physical activity intention and behaviour.

  16. Conceptual and computational basis for the quantification of margins and uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon Craig

    2009-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainty (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. This presentation discusses and illustrates the conceptual and computational basis of QMU in analyses that use computational models to predict the behavior of complex systems. Topics considered include (1) the role of aleatory and epistemic uncertainty in QMU, (2) the representation of uncertainty with probability, (3) the probabilistic representation of uncertainty in QMU analyses involving only epistemic uncertainty, (4) the probabilistic representation of uncertainty in QMU analyses involving aleatory and epistemic uncertainty, (5) procedures for sampling-based uncertainty and sensitivity analysis, (6) the representation of uncertainty with alternatives to probability such as interval analysis, possibility theory and evidence theory, (7) the representation of uncertainty with alternatives to probability in QMU analyses involving only epistemic uncertainty, and (8) the representation of uncertainty with alternatives to probability in QMU analyses involving aleatory and epistemic uncertainty. Concepts and computational procedures are illustrated with both notional examples and examples from reactor safety and radioactive waste disposal.
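
    A toy illustration of the sampling-based propagation step (item 5 above) and the margin-to-uncertainty ratio at the heart of QMU; the two-parameter model and its uniform epistemic ranges are invented for the example:

    ```python
    import numpy as np
    from scipy.stats import qmc

    sampler = qmc.LatinHypercube(d=2, seed=0)
    u = sampler.random(n=2000)
    capacity = 90.0 + 20.0 * u[:, 0]   # assumed epistemic range [90, 110]
    load = 60.0 + 30.0 * u[:, 1]       # assumed epistemic range [60, 90]

    margin = capacity - load
    m_over_u = margin.mean() / margin.std(ddof=1)  # crude QMU confidence ratio
    print(m_over_u)
    ```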

  17. Minimal string theories and integrable hierarchies

    Science.gov (United States)

    Iyer, Ramakrishnan

    Well-defined, non-perturbative formulations of the physics of string theories in specific minimal or superminimal model backgrounds can be obtained by solving matrix models in the double scaling limit. They provide us with the first examples of completely solvable string theories. Despite being relatively simple compared to higher dimensional critical string theories, they furnish non-perturbative descriptions of interesting physical phenomena such as geometrical transitions between D-branes and fluxes, tachyon condensation and holography. The physics of these theories in the minimal model backgrounds is succinctly encoded in a non-linear differential equation known as the string equation, along with an associated hierarchy of integrable partial differential equations (PDEs). The bosonic string in (2,2m-1) conformal minimal model backgrounds and the type 0A string in (2,4m) superconformal minimal model backgrounds have the Korteweg-de Vries system, while type 0B in (2,4m) backgrounds has the Zakharov-Shabat system. The integrable PDE hierarchy governs flows between backgrounds with different m. In this thesis, we explore this interesting connection between minimal string theories and integrable hierarchies further. We uncover the remarkable role that an infinite hierarchy of non-linear differential equations plays in organizing and connecting certain minimal string theories non-perturbatively. We are able to embed the type 0A and 0B (A,A) minimal string theories into this single framework. The string theories arise as special limits of a rich system of equations underpinned by an integrable system known as the dispersive water wave hierarchy. We find that there are several other string-like limits of the system, and conjecture that some of them are type IIA and IIB (A,D) minimal string backgrounds. We explain how these and several other string-like special points arise and are connected. In some cases, the framework endows the theories with a non…

  18. On effective theories of topological strings

    International Nuclear Information System (INIS)

    Elitzur, S.; Forge, A.; Rabinovici, E.

    1992-01-01

    We study the construction of effective target-space theories of topological string theories. The example of the CP1 topological sigma model is analysed in detail. An effective target-space theory whose correlation functions are defined by the sum over connected Riemann surfaces of all genera is found to be itself topological. The values of the couplings of this effective theory are expressed in terms of those of the world-sheet theory for a general CP1-like world-sheet model. Any model of this type can be obtained as an effective theory. The definition of the effective theory's expectation values as a sum over disconnected surfaces as well is shown not to be compatible with those of a topological theory, at least as long as the connectivity of the target space is kept fixed. Dilaton-type couplings emerge in the full Lagrangian realization of the moduli space of topological theories with n observables. En route, we encounter a non-perturbative duality, an equivalence of theories with different world-sheets, and discuss the relation between the cosmological constant in these finite theories and the zero-point function. (orig.)

  19. The effects of metal ion PCR inhibitors on results obtained with the Quantifiler(®) Human DNA Quantification Kit.

    Science.gov (United States)

    Combs, Laura Gaydosh; Warren, Joseph E; Huynh, Vivian; Castaneda, Joanna; Golden, Teresa D; Roby, Rhonda K

    2015-11-01

    Forensic DNA samples may include the presence of PCR inhibitors, even after extraction and purification. Studies have demonstrated that metal ions, co-purified at specific concentrations, inhibit DNA amplifications. Metal ions are endogenous to sample types, such as bone, and can be introduced from environmental sources. In order to examine the effect of metal ions as PCR inhibitors during quantitative real-time PCR, 2800 M DNA was treated with 0.0025-18.750 mM concentrations of aluminum, calcium, copper, iron, nickel, and lead. DNA samples, both untreated and metal-treated, were quantified using the Quantifiler(®) Human DNA Quantification Kit. Quantification cycle (Cq) values for the Quantifiler(®) Human DNA and internal PCR control (IPC) assays were measured and the estimated concentrations of human DNA were obtained. Comparisons were conducted between metal-treated and control DNA samples to determine the accuracy of the quantification estimates and to test the efficacy of the IPC inhibition detection. This kit is most resistant to the presence of calcium as compared to all metals tested; the maximum concentration tested does not affect the amplification of the IPC or quantification of the sample. This kit is most sensitive to the presence of aluminum; concentrations greater than 0.0750 mM negatively affected the quantification, although the IPC assay accurately assessed the presence of PCR inhibition. The Quantifiler(®) Human DNA Quantification Kit accurately quantifies human DNA in the presence of 0.5000 mM copper, iron, nickel, and lead; however, the IPC does not indicate the presence of PCR inhibition at this concentration of these metals. Unexpectedly, estimates of DNA quantity in samples treated with 18.750 mM copper yielded values in excess of the actual concentration of DNA in the samples; fluorescence spectroscopy experiments indicated this increase was not a direct interaction between the copper metal and 6-FAM dye used to label the probe that
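
    The IPC logic the study probes can be summarized in a few lines: inhibition is flagged when the internal PCR control amplifies measurably later than in clean controls. The control Cq and shift threshold below are assumptions for illustration, not the kit's validated cutoffs:

    ```python
    IPC_CONTROL_CQ = 27.0   # mean IPC Cq of uninhibited controls (assumed)
    CQ_SHIFT_FLAG = 1.0     # Cq delay treated as evidence of inhibition (assumed)

    def ipc_flags_inhibition(ipc_cq_sample: float) -> bool:
        """True when the internal PCR control is delayed beyond the threshold."""
        return (ipc_cq_sample - IPC_CONTROL_CQ) > CQ_SHIFT_FLAG

    print(ipc_flags_inhibition(28.6))   # delayed IPC -> inhibition suspected
    ```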

  20. An Examination of the Four-Part Theory of the Chinese Self: The Differentiation and Relative Importance of the Different Types of Social-Oriented Self.

    Science.gov (United States)

    Sun, Chien-Ru

    2017-01-01

    Because culture has a deep and far-reaching influence, individuals who grew up within different cultures tend to develop different basic self-constructions. With respect to the Chinese under the influence of Chinese culture, Yang proposed the concepts of individual-oriented self and social-oriented self. He argued that, besides the individual-oriented self, the social-oriented self of the Chinese contains three types of self: the relationship-oriented self, the familistic (group)-oriented self, and the other-oriented self. The theory proposed that the Chinese self is appropriately covered only through this four-part theory of the Chinese self. However, this remains to be tested; whether these three types of sub-level "selves" can be effectively triggered, along with their relative importance. This study examines the four-part theory of the Chinese self. Through photo priming, Experiment 1 shows that the three types of social-oriented self are differentiated from each other and can be individually triggered. In Experiment 2, the importance of the three types of self was investigated, adopting the concept of limited self-regulation resources to design scenarios. The participants were asked to make counterarguments about the notion of each of the three types of self, with performance in the subsequent task serving as the main dependent variable. In Experiment 3, the relative importance of the three types of self was examined by investigating the choices made by individuals within the context of conflict under the three orientations of the social-oriented self. Overall, results of the experiments showed that the Chinese have a four-part self with the importance of the other-oriented self as the most remarkable.

  1. An Examination of the Four-Part Theory of the Chinese Self: The Differentiation and Relative Importance of the Different Types of Social-Oriented Self

    Directory of Open Access Journals (Sweden)

    Chien-Ru Sun

    2017-06-01

    Full Text Available Because culture has a deep and far-reaching influence, individuals who grew up within different cultures tend to develop different basic self-constructions. With respect to the Chinese under the influence of Chinese culture, Yang proposed the concepts of individual-oriented self and social-oriented self. He argued that, besides the individual-oriented self, the social-oriented self of the Chinese contains three types of self: the relationship-oriented self, the familistic (group)-oriented self, and the other-oriented self. The theory proposed that the Chinese self is appropriately covered only through this four-part theory of the Chinese self. However, this remains to be tested; whether these three types of sub-level “selves” can be effectively triggered, along with their relative importance. This study examines the four-part theory of the Chinese self. Through photo priming, Experiment 1 shows that the three types of social-oriented self are differentiated from each other and can be individually triggered. In Experiment 2, the importance of the three types of self was investigated, adopting the concept of limited self-regulation resources to design scenarios. The participants were asked to make counterarguments about the notion of each of the three types of self, with performance in the subsequent task serving as the main dependent variable. In Experiment 3, the relative importance of the three types of self was examined by investigating the choices made by individuals within the context of conflict under the three orientations of the social-oriented self. Overall, results of the experiments showed that the Chinese have a four-part self with the importance of the other-oriented self as the most remarkable.

  2. Theory of chromatography of partially cyclic polymers: Tadpole-type and manacle-type macromolecules.

    Science.gov (United States)

    Vakhrushev, Andrey V; Gorbunov, Alexei A

    2016-02-12

    A theory of chromatography is developed for partially cyclic polymers of tadpole- and manacle-shaped topological structures. We present exact equations for the distribution coefficient K at different adsorption interactions; simpler approximate formulae are also derived, relevant to the conditions of size-exclusion, adsorption, and critical chromatography. Theoretical chromatograms of heterogeneous partially cyclic polymers are simulated, and conditions for good separation by topology are predicted. According to the theory, the effective SEC-radius of tadpoles and manacles is mostly determined by the molar mass M and by the linear-cyclic composition. In interactive chromatography, the effect of molecular topology on the retention becomes significant. At the critical interaction point, the partial dependences K(Mlin) and K(Mring) are qualitatively different: while being almost independent of Mlin, K increases with Mring. This behavior could be realized in critical chromatography, for separation of partially cyclic polymers by the number and molar mass of cyclic elements. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Electrophoresis Gel Quantification with a Flatbed Scanner and Versatile Lighting from a Screen Scavenged from a Liquid Crystal Display (LCD) Monitor

    Science.gov (United States)

    Yeung, Brendan; Ng, Tuck Wah; Tan, Han Yen; Liew, Oi Wah

    2012-01-01

    The use of different types of stains in the quantification of proteins separated on gels using electrophoresis offers the capability of deriving good outcomes in terms of linear dynamic range, sensitivity, and compatibility with specific proteins. An inexpensive, simple, and versatile lighting system based on liquid crystal display backlighting is…

  4. Nambu–Poisson gauge theory

    Energy Technology Data Exchange (ETDEWEB)

    Jurčo, Branislav, E-mail: jurco@karlin.mff.cuni.cz [Charles University in Prague, Faculty of Mathematics and Physics, Mathematical Institute, Prague 186 75 (Czech Republic); Schupp, Peter, E-mail: p.schupp@jacobs-university.de [Jacobs University Bremen, 28759 Bremen (Germany); Vysoký, Jan, E-mail: vysokjan@fjfi.cvut.cz [Jacobs University Bremen, 28759 Bremen (Germany); Czech Technical University in Prague, Faculty of Nuclear Sciences and Physical Engineering, Prague 115 19 (Czech Republic)

    2014-06-02

    We generalize noncommutative gauge theory using Nambu–Poisson structures to obtain a new type of gauge theory with higher brackets and gauge fields. The approach is based on covariant coordinates and higher versions of the Seiberg–Witten map. We construct a covariant Nambu–Poisson gauge theory action, give its first order expansion in the Nambu–Poisson tensor and relate it to a Nambu–Poisson matrix model.

  5. Nambu–Poisson gauge theory

    International Nuclear Information System (INIS)

    Jurčo, Branislav; Schupp, Peter; Vysoký, Jan

    2014-01-01

    We generalize noncommutative gauge theory using Nambu–Poisson structures to obtain a new type of gauge theory with higher brackets and gauge fields. The approach is based on covariant coordinates and higher versions of the Seiberg–Witten map. We construct a covariant Nambu–Poisson gauge theory action, give its first order expansion in the Nambu–Poisson tensor and relate it to a Nambu–Poisson matrix model.

  6. Supersymmetric gauge theories with classical groups via M theory fivebrane

    International Nuclear Information System (INIS)

    Terashima, S.

    1998-01-01

    We study the moduli space of vacua of four-dimensional N=1 and N=2 supersymmetric gauge theories with the gauge groups Sp(2N c ), SO(2N c ) and SO(2N c +1) using the M theory fivebrane. Higgs branches of the N=2 supersymmetric gauge theories are interpreted in terms of the M theory fivebrane and the type IIA s-rule is realized in it. In particular, we construct the fivebrane configuration which corresponds to a special Higgs branch root. This root is analogous to the baryonic branch root in the SU(N c ) theory which remains as a vacuum after the adjoint mass perturbation to break N=2 to N=1. Furthermore, we obtain the monopole condensations and the meson vacuum expectation values in the confining phase of N=1 supersymmetric gauge theories using the fivebrane technique. These are in complete agreement with the field theory results for the vacua in the phase with a single confined photon. (orig.)

  7. Strong Coupling Dynamics of Four-Dimensional N=1 Gauge Theories from M Theory Fivebrane

    International Nuclear Information System (INIS)

    Hori, K.; Ooguri, H.; Oz, Y.

    1997-01-01

    It has been known that the fivebrane of type IIA theory can be used to give an exact low energy description of N=2 supersymmetric gauge theories in four dimensions. We follow the recent M theory description by Witten and show that it can be used to study theories with N=1 supersymmetry. The N=2 supersymmetry can be broken to N=1 by turning on a mass for the adjoint chiral superfield in the N=2 vector multiplet. We construct the configuration of the fivebrane for both finite and infinite values of the adjoint mass. The fivebrane describes strong coupling dynamics of N=1 theory with SU(N c ) gauge group and N f quarks. For N c > N f , we show how the brane configuration encodes the information of the Affleck-Dine-Seiberg superpotential. For N c ≤ N f , we study the deformation space of the brane configuration and compare it with the moduli space of the N=1 theory. We find agreement with field theory results, including the quantum deformation of the moduli space at N c = N f . We also prove the type II s-rule in M theory and find new non-renormalization theorems for N = 1 superpotentials

  8. Optimized methods for total nucleic acid extraction and quantification of the bat white-nose syndrome fungus, Pseudogymnoascus destructans, from swab and environmental samples.

    Science.gov (United States)

    Verant, Michelle L; Bohuski, Elizabeth A; Lorch, Jeffery M; Blehert, David S

    2016-03-01

    The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer-based qPCR test for P. destructans to refine quantification capabilities of this assay. © 2016 The Author(s).

  9. Optimized methods for total nucleic acid extraction and quantification of the bat white-nose syndrome fungus, Pseudogymnoascus destructans, from swab and environmental samples

    Science.gov (United States)

    Verant, Michelle; Bohuski, Elizabeth A.; Lorch, Jeffrey M.; Blehert, David

    2016-01-01

    The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer-based qPCR test for P. destructans to refine quantification capabilities of this assay.

  10. Supersymmetric gauge theories, quantization of Mflat, and conformal field theory

    International Nuclear Information System (INIS)

    Teschner, J.; Vartanov, G.S.

    2013-02-01

    We propose a derivation of the correspondence between certain gauge theories with N=2 supersymmetry and conformal field theory discovered by Alday, Gaiotto and Tachikawa in the spirit of Seiberg-Witten theory. Based on certain results from the literature we argue that the quantum theory of the moduli spaces of flat SL(2,R)-connections represents a nonperturbative "skeleton" of the gauge theory, protected by supersymmetry. It follows that instanton partition functions can be characterized as solutions to a Riemann-Hilbert type problem. In order to solve it, we describe the quantization of the moduli spaces of flat connections explicitly in terms of two natural sets of Darboux coordinates. The kernel describing the relation between the two pictures represents the solution to the Riemann-Hilbert problem, and is naturally identified with the Liouville conformal blocks.

  11. Implicit Theories of Persuasion.

    Science.gov (United States)

    Roskos-Ewoldsen, David R.

    1997-01-01

    Explores whether individuals have implicit theories of persuasion. Examines how persuasive strategies are cognitively represented--identifies types of tactics in attitude change and social acceptability of persuasive strategies. Finds implicit theories of persuasion reflect the audience's familiarity with the topic. Finds also that implicit…

  12. Documentary and Cognitive Theory

    DEFF Research Database (Denmark)

    Bondebjerg, Ib

    2014-01-01

    This article deals with the benefits of using cognitive theory in documentary film studies. The article outlines general aspects of cognitive theory in the humanities and social sciences; however, the main focus is on the role of narrative, visual style and emotional dimensions of different types…

  13. Preclinical evaluation and quantification of [{sup 18}F]MK-9470 as a radioligand for PET imaging of the type 1 cannabinoid receptor in rat brain

    Energy Technology Data Exchange (ETDEWEB)

    Casteels, Cindy [K.U. Leuven, University Hospital Leuven, Division of Nuclear Medicine, Leuven (Belgium); K.U. Leuven, MoSAIC, Molecular Small Animal Imaging Center, Leuven (Belgium); University Hospital Gasthuisberg, Division of Nuclear Medicine, Leuven (Belgium); Koole, Michel; Laere, Koen van [K.U. Leuven, University Hospital Leuven, Division of Nuclear Medicine, Leuven (Belgium); K.U. Leuven, MoSAIC, Molecular Small Animal Imaging Center, Leuven (Belgium); Celen, Sofie; Bormans, Guy [K.U. Leuven, MoSAIC, Molecular Small Animal Imaging Center, Leuven (Belgium); K.U. Leuven, Laboratory for Radiopharmacy, Leuven (Belgium)

    2012-09-15

    [{sup 18}F]MK-9470 is an inverse agonist for the type 1 cannabinoid (CB1) receptor allowing its use in PET imaging. We characterized the kinetics of [{sup 18}F]MK-9470 and evaluated its ability to quantify CB1 receptor availability in the rat brain. Dynamic small-animal PET scans with [{sup 18}F]MK-9470 were performed in Wistar rats on a FOCUS-220 system for up to 10 h. Both plasma and perfused brain homogenates were analysed using HPLC to quantify radiometabolites. Displacement and blocking experiments were done using cold MK-9470 and another inverse agonist, SR141716A. The distribution volume (V{sub T}) of [{sup 18}F]MK-9470 was used as a quantitative measure and compared to the use of brain uptake, expressed as SUV, a simplified method of quantification. The percentage of intact [{sup 18}F]MK-9470 in arterial plasma samples was 80 {+-} 23 % at 10 min, 38 {+-} 30 % at 40 min and 13 {+-} 14 % at 210 min. A polar radiometabolite fraction was detected in plasma and brain tissue. The brain radiometabolite concentration was uniform across the whole brain. Displacement and pretreatment studies showed that 56 % of the tracer binding was specific and reversible. V{sub T} values obtained with a one-tissue compartment model plus constrained radiometabolite input had good identifiability ({<=}10 %). Ignoring the radiometabolite contribution using a one-tissue compartment model alone, i.e. without constrained radiometabolite input, overestimated the [{sup 18}F]MK-9470 V{sub T}, but was correlated. A correlation between [{sup 18}F]MK-9470 V{sub T} and SUV in the brain was also found (R {sup 2} = 0.26-0.33; p {<=} 0.03). While the presence of a brain-penetrating radiometabolite fraction complicates the quantification of [{sup 18}F]MK-9470 in the rat brain, its tracer kinetics can be modelled using a one-tissue compartment model with and without constrained radiometabolite input. (orig.)
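
    A sketch of the basic one-tissue compartment model used here (without the constrained radiometabolite input that is the record's refinement): the tissue curve is the plasma input convolved with K₁e^(−k₂t), and the distribution volume is V_T = K₁/k₂. The time grid, input function and rate constants below are toy values; in practice K₁ and k₂ are fitted to measured time-activity curves:

    ```python
    import numpy as np

    def one_tissue_ct(t, cp, K1, k2):
        """C_T(t_i) = K1 * integral of Cp(s) exp(-k2 (t_i - s)) ds (trapezoidal)."""
        ct = np.zeros_like(t)
        for i in range(1, len(t)):
            s = t[: i + 1]
            ct[i] = K1 * np.trapz(cp[: i + 1] * np.exp(-k2 * (t[i] - s)), s)
        return ct

    t = np.linspace(0.0, 60.0, 121)     # minutes
    cp = np.exp(-t / 20.0)              # toy plasma input function
    ct = one_tissue_ct(t, cp, K1=0.1, k2=0.05)
    VT = 0.1 / 0.05                     # distribution volume K1/k2
    ```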

  14. At the end of the string: the M theory; Au bout de la corde: la theorie M

    Energy Technology Data Exchange (ETDEWEB)

    Vanhove, P

    1998-04-15

    The first chapter is a general introduction that presents the more or less historical path that led to the discovery of the perturbative theory of superstrings, to the duality conjectures and eventually to the M-theory. Non-perturbative solutions of supergravity theories and the particular roles these solutions play for superstrings are detailed in chapter 2. The relevant features of extended supersymmetries arising from the super-Poincare algebra are also presented in chapter 2. The superstring, considered as the basic perturbative object, as well as the non-perturbative Dirichlet-membrane solutions, are presented in chapter 3. Static and dynamic properties of these solutions are detailed and discussed in chapter 4. Chapter 5 is dedicated to tests of the duality conjectures through the calculation of instanton corrections for various superstring theories. The duality transformation of the heterotic/type-I couple with gauge group SO(32) is tested. Chapter 5 ends with explicit computations of non-perturbative contributions for the type-I and type-II theories, generated within the framework of a supersymmetric Yang-Mills model. The role of a new matrix formulation of superstring theory is highlighted. (A.C.)

  15. A theory of piezoelectric laminates

    International Nuclear Information System (INIS)

    Giangreco, E.

    1997-01-01

    A theory of piezoelectric laminates is rationally derived from the three-dimensional Voigt theory of piezoelectricity. The present theory is a generalization to piezoelectric laminates of the Reissner-Mindlin-type layer-wise theory of elastic laminates. Both a differential formulation and a variational formulation of the piezoelectric laminate problem are presented. The proposed theory is adopted in the analysis of simple problems, in order to verify its effectiveness. The results it provides turn out to be in good agreement with the results supplied by the Voigt theory of piezoelectricity

  16. Quantification of uncertainties in turbulence modeling: A comparison of physics-based and random matrix theoretic approaches

    International Nuclear Information System (INIS)

    Wang, Jian-Xun; Sun, Rui; Xiao, Heng

    2016-01-01

    Highlights: • Compared physics-based and random matrix methods to quantify RANS model uncertainty. • Demonstrated applications of both methods in channel flow over periodic hills. • Examined the amount of information introduced in the physics-based approach. • Discussed implications to modeling turbulence in both near-wall and separated regions. - Abstract: Numerical models based on Reynolds-Averaged Navier-Stokes (RANS) equations are widely used in engineering turbulence modeling. However, RANS predictions have large model-form uncertainties for many complex flows, e.g., those with non-parallel shear layers or strong mean flow curvature. Quantification of these large uncertainties originating from the modeled Reynolds stresses has attracted attention in the turbulence modeling community. Recently, a physics-based Bayesian framework for quantifying model-form uncertainties has been proposed with successful applications to several flows. Nonetheless, how to specify proper priors without introducing unwarranted, artificial information remains challenging to the current form of the physics-based approach. Another recently proposed method based on random matrix theory provides prior distributions with maximum entropy, which is an alternative for model-form uncertainty quantification in RANS simulations. This method is mathematically more rigorous and provides the most non-committal prior distributions without introducing artificial constraints. On the other hand, the physics-based approach has the advantage of being more flexible in incorporating available physical insights. In this work, we compare and discuss the advantages and disadvantages of the two approaches for model-form uncertainty quantification. In addition, we utilize the random matrix theoretic approach to assess and possibly improve the specification of priors used in the physics-based approach. The comparison is conducted through a test case using a canonical flow, the flow past…
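
    To give a flavor of the random-matrix side, the sketch below draws symmetric positive-definite perturbations of a baseline Reynolds-stress tensor from a Wishart-type construction whose mean is the baseline — a generic maximum-entropy-style sampler, not the paper's exact formulation; the baseline tensor and concentration parameter are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    R0 = np.diag([0.04, 0.02, 0.02])   # baseline Reynolds stress (assumed)
    dof = 20                            # concentration: larger = tighter around R0

    L = np.linalg.cholesky(R0 / dof)
    samples = []
    for _ in range(100):
        G = rng.normal(size=(dof, 3)) @ L.T  # rows ~ N(0, R0/dof)
        samples.append(G.T @ G)              # SPD by construction, E[sample] = R0
    print(np.mean(samples, axis=0))          # should approach R0
    ```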

  17. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    Science.gov (United States)

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places greater demands on quantification methods based on mass spectrometry technology. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis which makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and builds a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, the MS/MS total ion count coupled with the spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve the accuracy of quantification with a better dynamic range.
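
    The abstract's recipe — spectral counts corrected by protein length — is essentially the standard NSAF normalization, sketched below; freeQuant's shared-peptide and low-abundance refinements are not reproduced here:

    ```python
    def nsaf(spectral_counts, lengths):
        """Normalized spectral abundance factor: (SpC/L) / sum_i (SpC_i/L_i)."""
        saf = [c / l for c, l in zip(spectral_counts, lengths)]
        total = sum(saf)
        return [s / total for s in saf]

    # Three hypothetical proteins: counts 120, 45, 10; lengths 500, 300, 150 residues
    print(nsaf([120, 45, 10], [500, 300, 150]))
    ```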

  18. Building theory through design

    DEFF Research Database (Denmark)

    Markussen, Thomas

    2017-01-01

    This chapter deals with a fundamental matter of concern in research through design: how can design work lead to the building of new theory? Controversy exists about the balance between theory and design work in research through design. While some researchers see theory production as the scientific hallmark of this type of research, others argue for design work being the primary achievement, with theory serving the auxiliary function of inspiring new designs. This paper demonstrates how design work and theory can be appreciated as two equally important outcomes of research through design. To set the scene, it starts out by briefly examining ideas on this issue presented in the existing research literature. Hereafter, it introduces three basic forms in which design work can lead to theory, referred to as extending theories, scaffolding theories and blending theories. Finally, it is discussed how…

  19. Application of the homology method for quantification of low-attenuation lung region in patients with and without COPD

    Directory of Open Access Journals (Sweden)

    Nishio M

    2016-09-01

    Full Text Available Mizuho Nishio,1 Kazuaki Nakane,2 Yutaka Tanaka3 1Clinical PET Center, Institute of Biomedical Research and Innovation, Hyogo, Japan; 2Department of Molecular Pathology, Osaka University Graduate School of Medicine and Health Science, Osaka, Japan; 3Department of Radiology, Chibune General Hospital, Osaka, Japan Background: Homology is a mathematical concept that can be used to quantify degree of contact. Recently, image processing with the homology method has been proposed. In this study, we used the homology method and computed tomography images to quantify emphysema. Methods: This study included 112 patients who had undergone computed tomography and pulmonary function test. Low-attenuation lung regions were evaluated by the homology method, and homology-based emphysema quantification (b0, b1, nb0, nb1, and R) was performed. For comparison, the percentage of low-attenuation lung area (LAA%) was also obtained. Relationships between emphysema quantification and pulmonary function test results were evaluated by Pearson’s correlation coefficients. In addition to the correlation, the patients were divided into the following three groups based on guidelines of the Global initiative for chronic Obstructive Lung Disease: Group A, nonsmokers; Group B, smokers without COPD, mild COPD, and moderate COPD; Group C, severe COPD and very severe COPD. The homology-based emphysema quantification and LAA% were compared among these groups. Results: For forced expiratory volume in 1 second/forced vital capacity, the correlation coefficients were as follows: LAA%, -0.603; b0, -0.460; b1, -0.500; nb0, -0.449; nb1, -0.524; and R, -0.574. For forced expiratory volume in 1 second, the coefficients were as follows: LAA%, -0.461; b0, -0.173; b1, -0.314; nb0, -0.191; nb1, -0.329; and R, -0.409. Between Groups A and B, the difference in nb0 was significant (P-value = 0.00858), and those in the other types of quantification were not significant. Conclusion: Feasibility of the…
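
    For comparison with the homology-based numbers, LAA% itself is a one-liner over a segmented lung volume; the −950 HU threshold below is the common emphysema convention, assumed here, and the homology features (b0, b1 are Betti-number counts) require persistent-homology tooling not sketched:

    ```python
    import numpy as np

    def laa_percent(hu, lung_mask, threshold=-950.0):
        """Percentage of lung voxels below the HU threshold (low-attenuation area)."""
        lung = hu[lung_mask]
        return 100.0 * np.count_nonzero(lung < threshold) / lung.size
    ```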

  20. Quantification of fossil organic matter in contaminated sediments from an industrial watershed: Validation of the quantitative multimolecular approach by radiocarbon analysis

    International Nuclear Information System (INIS)

    Jeanneau, Laurent; Faure, Pierre

    2010-01-01

    The quantitative multimolecular approach (QMA), based on an exhaustive identification and quantification of molecules from the extractable organic matter (EOM), has recently been developed in order to investigate organic contamination in sediments by a more complete method than the restrictive quantification of target contaminants. Such an approach allows (i) the comparison between natural and anthropogenic inputs, (ii) the comparison between modern and fossil organic matter and (iii) the differentiation between several anthropogenic sources. However, QMA is based on the quantification of molecules recovered by organic solvent and then analyzed by gas chromatography-mass spectrometry, which represent a small fraction of sedimentary organic matter (SOM). In order to extend the conclusions of QMA to SOM, radiocarbon analyses have been performed on organic extracts and decarbonated sediments. This analysis allows (i) the differentiation between modern biomass (contemporary 14C) and fossil organic matter (14C-free) and (ii) the calculation of the modern carbon percentage (PMC). At the confluence of the Fensch and Moselle Rivers, a catchment highly contaminated by both industrial activities and urbanization, PMC values in decarbonated sediments are well correlated with the percentage of natural molecular markers determined by QMA. This highlights that, for this type of contamination by fossil organic matter inputs, the conclusions of QMA can be scaled up to SOM. QMA is an efficient environmental diagnostic tool that leads to a more realistic quantification of fossil organic matter in sediments.

  1. Multiparty Asynchronous Session Types

    DEFF Research Database (Denmark)

    Honda, Kohei; Yoshida, Nobuko; Carbone, Marco

    2016-01-01

    Communication is a central element in software development. As a potential typed foundation for structured communication-centered programming, session types have been studied over the past decade for a wide range of process calculi and programming languages, focusing on binary (two-party) sessions. This work extends the foregoing theories of binary session types to multiparty, asynchronous sessions, which often arise in practical communication-centered applications. Presented as a typed calculus for mobile processes, the theory introduces a new notion of types in which interactions involving multiple peers are directly abstracted as a global scenario. Global types retain the friendly type syntax of binary session types while specifying dependencies and capturing complex causal chains of multiparty asynchronous interactions. A global type plays the role of a shared agreement among communication peers…

  2. Multiparty symmetric sum types

    DEFF Research Database (Denmark)

    Nielsen, Lasse; Yoshida, Nobuko; Honda, Kohei

    2010-01-01

    This paper introduces a new theory of multiparty session types based on symmetric sum types, by which we can type non-deterministic orchestration choice behaviours. While the original branching type in session types can represent a choice made by a single participant and accepted by others, determining how the session proceeds, the symmetric sum type represents a choice made by agreement among all the participants of a session. Such behaviour can be found in many practical systems, including collaborative workflow in healthcare systems for clinical practice guidelines (CPGs). Processes with the symmetric sums can be embedded into the original branching types using conductor processes. We show that this type-driven embedding preserves typability, satisfies semantic soundness and completeness, and meets the encodability criteria adapted to the typed setting. The theory leads to an efficient…

  3. Theories of superconductivity (a few remarks)

    International Nuclear Information System (INIS)

    Ginzburg, V.L.

    1992-01-01

    The early history in the development of superconductivity. Idea of pairing, Schafroth and BCS types of theories. Some remarks on present state of the microscopical theory of high-temperature superconductors (HTSC). Mean field macroscopic theory of superconductivity and its specific features in HTSC. About generalized macroscopic theory applicable in critical region. Concluding remarks. (orig.)

  4. FRANX. Application for analysis and quantification of the APS fire

    International Nuclear Information System (INIS)

    Snchez, A.; Osorio, F.; Ontoso, N.

    2014-01-01

    The FRANX application has been developed by EPRI within the Risk and Reliability User Group in order to facilitate the process of quantifying and updating the fire APS (the tool also covers floods and earthquakes). With this application, fire scenarios are quantified for the plant, integrating the tasks performed during the fire APS. This paper describes the main features of the program that allow the quantification of a fire APS. (Author)

  5. Quantitative modeling of operational risk in finance and banking using possibility theory

    CERN Document Server

    Chaudhuri, Arindam

    2016-01-01

    This book offers a comprehensive guide to the modelling of operational risk using possibility theory. It provides a set of methods for measuring operational risks under a certain degree of vagueness and impreciseness, as encountered in real-life data. It shows how possibility theory and indeterminate uncertainty-encompassing degrees of belief can be applied in analysing the risk function, and describes the parametric g-and-h distribution associated with extreme value theory as an interesting candidate in this regard. The book offers a complete assessment of fuzzy methods for determining both value at risk (VaR) and subjective value at risk (SVaR), together with a stability estimation of VaR and SVaR. Based on the simulation studies and case studies reported on here, the possibilistic quantification of risk performs consistently better than the probabilistic model. Risk is evaluated by integrating two fuzzy techniques: the fuzzy analytic hierarchy process and the fuzzy extension of techniques for order prefere...

  6. Cadmium voltametric quantification in table chocolate produced in Chiquinquira-Boyaca, Colombia

    Directory of Open Access Journals (Sweden)

    Paola Andrea Vargas Moreno

    2017-04-01

    Full Text Available Bioaccumulation of heavy metals such as cadmium has been a major concern for scientific communities and international food organizations, given the great toxicological risk to the consumer, and in many places there is no detailed record of its actual content. Hence the need to carry out a study and registration of the concentration of this metal in products such as table chocolate, which is widely consumed at the regional and national level, and to have effective quantification tools and a reliable and affordable method to achieve the aim of this research. In this research, the cadmium content in powdered and granulated table chocolate, produced and commercialized in the municipality of Chiquinquira, Boyacá-Colombia, was determined using the differential pulse voltammetric method of anodic redissolution (DPVMAR). The parameters of this method (selectivity, linearity, sensitivity, precision and accuracy) were evaluated beforehand, with satisfactory results as follows: selective in a potential range of 0.54 to 0.64 V, sensitivity in ppb, R² > 0.95, %CV 80%. Analysis of variance showed no significant statistical differences (P < 0.05) between the results. Cadmium quantification in samples of granulated and powdered chocolate showed concentrations between 214 and 260 ppb, with the highest concentrations in powdered chocolate. The cadmium level did not exceed the tolerable weekly intake limit for this type of food.

  7. Quantification of aortic regurgitation by magnetic resonance velocity mapping

    DEFF Research Database (Denmark)

    Søndergaard, Lise; Lindvig, K; Hildebrandt, P

    1993-01-01

    The use of magnetic resonance (MR) velocity mapping in the quantification of aortic valvular blood flow was examined in 10 patients with angiographically verified aortic regurgitation. MR velocity mapping succeeded in identifying and quantifying the regurgitation in all patients, and the regurgit...
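
    Velocity mapping quantifies regurgitation by integrating the velocity map over the valve cross-section to obtain a flow curve, then integrating the negative (diastolic backflow) portion over the cardiac cycle. A minimal sketch of that second step with a synthetic flow curve, not patient data:

    import numpy as np

    # Hypothetical per-frame flow rates (ml/s) across the aortic valve over one
    # cardiac cycle, as would be obtained by integrating an MR velocity map over
    # the valve cross-section; negative values are diastolic (regurgitant) backflow.
    t = np.linspace(0.0, 0.8, 32)                     # s, one cardiac cycle
    flow = 400 * np.sin(np.pi * t / 0.3) * (t < 0.3) \
         - 80 * ((t >= 0.35) & (t < 0.75))            # ml/s, toy systole + leak

    forward_vol = np.trapz(np.clip(flow, 0, None), t)     # ml per beat
    regurg_vol = -np.trapz(np.clip(flow, None, 0), t)     # ml per beat
    regurg_fraction = regurg_vol / forward_vol
    print(f"regurgitant volume {regurg_vol:.1f} ml, fraction {regurg_fraction:.0%}")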

  8. Quantification by aberration corrected (S)TEM of boundaries formed by symmetry breaking phase transformations

    Energy Technology Data Exchange (ETDEWEB)

    Schryvers, D., E-mail: nick.schryvers@uantwerpen.be [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Salje, E.K.H. [Department of Earth Sciences, University of Cambridge, Cambridge CB2 3EQ (United Kingdom); Nishida, M. [Department of Engineering Sciences for Electronics and Materials, Faculty of Engineering Sciences, Kyushu University, Kasuga, Fukuoka 816-8580 (Japan); De Backer, A. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Idrissi, H. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Institute of Mechanics, Materials and Civil Engineering, Université Catholique de Louvain, Place Sainte Barbe, 2, B-1348, Louvain-la-Neuve (Belgium); Van Aert, S. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium)

    2017-05-15

    The present contribution gives a review of recent quantification work of atom displacements, atom site occupations and level of crystallinity in various systems and based on aberration corrected HR(S)TEM images. Depending on the case studied, picometer range precisions for individual distances can be obtained, boundary widths at the unit cell level determined or statistical evolutions of fractions of the ordered areas calculated. In all of these cases, these quantitative measures imply new routes for the applications of the respective materials. - Highlights: • Quantification of picometer displacements at ferroelastic twin boundary in CaTiO{sub 3}. • Quantification of kinks in meandering ferroelectric domain wall in LiNbO{sub 3}. • Quantification of column occupation in anti-phase boundary in Co-Pt. • Quantification of atom displacements at twin boundary in Ni-Ti B19′ martensite.

  9. Identification and quantification of selected chemicals in laser pyrolysis products of mammalian tissues

    Science.gov (United States)

    Spleiss, Martin; Weber, Lothar W.; Meier, Thomas H.; Treffler, Bernd

    1995-01-01

    Liver and muscle tissue have been irradiated with a surgical CO2 laser. The prefiltered fumes were adsorbed on different sorbents (activated charcoal type NIOSH and Carbotrap) and desorbed with different solvents (carbon disulphide and acetone). Analysis was done by gas chromatography/mass spectrometry. An updated list of identified substances is shown. Typical Maillard reaction products as found in warmed-over flavour, such as aldehydes, aromatics, heterocyclic and sulphur compounds, were detected. Quantification of some toxicologically relevant substances is presented. The amounts of these substances are given in relation to the laser parameters and the different tissues for further toxicological assessment.

  10. Hidden Borcherds symmetries in Zn orbifolds of M-theory and magnetized D-branes in type 0' orientifolds

    International Nuclear Information System (INIS)

    Bagnoud, Maxime; Carlevaro, Luca

    2006-01-01

    We study T 11-D-q x T q /Z n orbifold compactifications of eleven-dimensional supergravity and M-theory using a purely algebraic method. Given the description of maximal supergravities reduced on square tori as non-linear coset σ-models, we exploit the mapping between scalar fields of the reduced theory and directions in the tangent space over the coset to construct the orbifold action as a non-Cartan preserving finite order inner automorphism of the complexified U-duality algebra. Focusing on the exceptional series of Cremmer-Julia groups, we compute the residual U-duality symmetry after orbifold projection and determine the reality properties of their corresponding Lie algebras. We carry out this analysis as far as the hyperbolic e 10 algebra, conjectured to be a symmetry of M-theory. In this case the residual subalgebras are shown to be described by a special class of Borcherds and Kac-Moody algebras, modded out by their centres and derivations. Furthermore, we construct an alternative description of the orbifold action in terms of equivalence classes of shift vectors, and, in D 1, we show that a root of e 10 can always be chosen as the class representative. Then, in the framework of the E 10/10 /K(E 10/10 ) effective σ-model approach to M-theory near a spacelike singularity, we identify these roots with brane configurations stabilizing the corresponding orbifolds. In the particular case of Z 2 orbifolds of M-theory descending to type 0' orientifolds, we argue that these roots can be interpreted as pairs of magnetized D9- and D9'-branes, carrying the lower-dimensional brane charges required for tadpole cancellation. More generally, we provide a classification of all such roots generating Z n product orbifolds for n≤6, and hint at their possible interpretation.

  11. Supersymmetric gauge theories from string theory

    International Nuclear Information System (INIS)

    Metzger, St.

    2005-12-01

    This thesis presents various ways to construct four-dimensional quantum field theories from string theory. In a first part we study the generation of a supersymmetric Yang-Mills theory, coupled to an adjoint chiral superfield, from type IIB string theory on non-compact Calabi-Yau manifolds, with D-branes wrapping certain sub-cycles. Properties of the gauge theory are then mapped to the geometric structure of the Calabi-Yau space. Even if the Calabi-Yau geometry is too complicated to evaluate the geometric integrals explicitly, one can then always use matrix model perturbation theory to calculate the effective superpotential. The second part of this work covers the generation of four-dimensional super-symmetric gauge theories, carrying several important characteristic features of the standard model, from compactifications of eleven-dimensional supergravity on G 2 -manifolds. If the latter contain conical singularities, chiral fermions are present in the four-dimensional gauge theory, which potentially lead to anomalies. We show that, locally at each singularity, these anomalies are cancelled by the non-invariance of the classical action through a mechanism called 'anomaly inflow'. Unfortunately, no explicit metric of a compact G 2 -manifold is known. Here we construct families of metrics on compact weak G 2 -manifolds, which contain two conical singularities. Weak G 2 -manifolds have properties that are similar to the ones of proper G 2 -manifolds, and hence the explicit examples might be useful to better understand the generic situation. Finally, we reconsider the relation between eleven-dimensional supergravity and the E 8 x E 8 -heterotic string. This is done by carefully studying the anomalies that appear if the supergravity theory is formulated on a ten-manifold times the interval. Again we find that the anomalies cancel locally at the boundaries of the interval through anomaly inflow, provided one suitably modifies the classical action. (author)

  12. Effects of information type on children's interrogative suggestibility: is Theory-of-Mind involved?

    Science.gov (United States)

    Hünefeldt, Thomas; Rossi-Arnaud, Clelia; Furia, Augusta

    2009-08-01

    This research was aimed at learning more about the different psychological mechanisms underlying children's suggestibility to leading questions, on the one hand, and children's suggestibility to negative feedback, on the other, by distinguishing between interview questions concerning different types of information. Results showed that, unlike the developmental pattern of children's suggestibility to leading questions, the developmental pattern of children's suggestibility to negative feedback differed depending on whether the interview questions concerned external facts (physical states and events) or internal facts (mental states and events). This difference was not manifested in response to questions concerning central versus peripheral facts. Results are interpreted in terms of the hypothesis that children's suggestibility to negative feedback is differently affected by "Theory-of-Mind" abilities than children's suggestibility to leading questions. Further research is needed in order to test this hypothesis.

  13. Induced WZW-type term in dual field theory

    International Nuclear Information System (INIS)

    Nielsen, N.K.

    1990-01-01

    One-loop quantum equivalence is investigated by proper-time regularization for a nonlinear σ-model in two dimensions on a group manifold and its dual theory constructed by Fradkin and Tseytlin. The one-loop effective actions are found to deviate by a finite local counterterm with a structure similar to that of a Wess-Zumino-Witten term.

  14. Theory of relations

    CERN Document Server

    Fraïssé, R

    2011-01-01

    The first part of this book concerns the present state of the theory of chains (= total or linear orderings), in connection with some refinements of Ramsey's theorem, due to Galvin and Nash-Williams. This leads to the fundamental Laver's embeddability theorem for scattered chains, using Nash-Williams' better quasi-orderings, barriers and forerunning.The second part (chapters 9 to 12) extends to general relations the main notions and results from order-type theory. An important connection appears with permutation theory (Cameron, Pouzet, Livingstone and Wagner) and with logics (existence criter

  15. Clinical applications of MS-based protein quantification.

    Science.gov (United States)

    Sabbagh, Bassel; Mindt, Sonani; Neumaier, Michael; Findeisen, Peter

    2016-04-01

    Mass spectrometry-based assays are increasingly important in clinical laboratory medicine and nowadays are already commonly used in several areas of routine diagnostics. These include therapeutic drug monitoring, toxicology, endocrinology, pediatrics, and microbiology. Accordingly, some of the most common analyses are therapeutic drug monitoring of immunosuppressants, vitamin D, steroids, newborn screening, and bacterial identification. However, MS-based quantification of peptides and proteins for routine diagnostic use is rather rare up to now despite excellent analytical specificity and good sensitivity. Here, we want to give an overview over current fit-for-purpose assays for MS-based protein quantification. Advantages as well as challenges of this approach will be discussed with focus on feasibility for routine diagnostic use. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Ginzburg-Landau-Gor'kov theory of magnetic oscillations in a type-II two-dimensional superconductor

    International Nuclear Information System (INIS)

    Bruun, G.M.; Nicopoulos, V.N.; Johnson, N.F.

    1997-01-01

    We investigate de Haas-van Alphen (dHvA) oscillations in the mixed state of a type-II two-dimensional superconductor within a self-consistent Gor'kov perturbation scheme. Assuming that the order parameter forms a vortex lattice we can calculate the expansion coefficients exactly to any order. We have tested the results of the perturbation theory to fourth and eighth order against an exact numerical solution of the corresponding Bogoliubov-de Gennes equations. The perturbation theory is found to describe well the onset of superconductivity close to the transition point H c2 . Contrary to earlier calculations by other authors we do not find that the perturbative scheme predicts any maximum of the dHvA oscillations below H c2 . Instead we obtain a substantial damping of the magnetic oscillations in the mixed state as compared to the normal state. We have examined the effect of an oscillatory chemical potential due to particle conservation and the effect of a finite Zeeman splitting. Furthermore, we have investigated the recently debated issue of the possibility of a sign change of the fundamental harmonic of the magnetic oscillations. Our theory is compared with experiment and we have found good agreement. copyright 1997 The American Physical Society

  17. Information Theory - The Bridge Connecting Bounded Rational Game Theory and Statistical Physics

    Science.gov (United States)

    Wolpert, David H.

    2005-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality of all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. This paper shows that the same information theoretic mathematical structure, known as Product Distribution (PD) theory, addresses both issues. In this, PD theory not only provides a principled formulation of bounded rationality and a set of new types of mean field theory in statistical physics; it also shows that those topics are fundamentally one and the same.

  18. Solitons, gauge theories and the 'great Einstein theorem'

    International Nuclear Information System (INIS)

    Dresden, M.; Chen, S.F.

    1976-01-01

    A field theory is said to be of 'Einstein type' if it has the property that the field equations imply the equations of motion. It is known that general relativity is of Einstein type; it is demonstrated here that the Yang-Mills gauge theory is also of Einstein type. The relationship between the singularities in the solutions of the field equations and soliton type is analyzed. (Auth.)

  19. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, Michael Scott; Vigil, Dena M.; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Lefantzi, Sophia (Sandia National Laboratories, Livermore, CA); Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Eddy, John P.

    2011-12-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the DAKOTA software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of DAKOTA-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of DAKOTA's iterative analysis capabilities.
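
    The sampling-based uncertainty quantification that toolkits like DAKOTA orchestrate can be illustrated in a few lines: draw samples from the input distributions, evaluate the simulation model, and summarize the response statistics. The model and distributions below are stand-ins for illustration, not DAKOTA's API.

    import numpy as np

    # A minimal Monte Carlo uncertainty-propagation loop of the kind a toolkit
    # like DAKOTA automates: sample uncertain inputs, run the simulation model,
    # and summarize the response distribution.
    rng = np.random.default_rng(0)

    def model(x):
        # placeholder response; a real study would call an external simulation code
        return x[0] ** 2 + 3.0 * np.sin(x[1])

    n = 10_000
    samples = np.column_stack([
        rng.normal(1.0, 0.1, n),     # input 1: Normal(1.0, 0.1)
        rng.uniform(0.0, np.pi, n),  # input 2: Uniform(0, pi)
    ])
    responses = np.apply_along_axis(model, 1, samples)
    print(f"mean = {responses.mean():.3f}, std = {responses.std():.3f}, "
          f"95th pct = {np.percentile(responses, 95):.3f}")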

  20. Uncertainty Quantification in Aerodynamics Simulations, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...

  1. Supersymmetric gauge theories from string theory; Theorie de jauge supersymetrique de la theorie des cordes

    Energy Technology Data Exchange (ETDEWEB)

    Metzger, St

    2005-12-15

    This thesis presents various ways to construct four-dimensional quantum field theories from string theory. In a first part we study the generation of a supersymmetric Yang-Mills theory, coupled to an adjoint chiral superfield, from type IIB string theory on non-compact Calabi-Yau manifolds, with D-branes wrapping certain sub-cycles. Properties of the gauge theory are then mapped to the geometric structure of the Calabi-Yau space. Even if the Calabi-Yau geometry is too complicated to evaluate the geometric integrals explicitly, one can then always use matrix model perturbation theory to calculate the effective superpotential. The second part of this work covers the generation of four-dimensional super-symmetric gauge theories, carrying several important characteristic features of the standard model, from compactifications of eleven-dimensional supergravity on G{sub 2}-manifolds. If the latter contain conical singularities, chiral fermions are present in the four-dimensional gauge theory, which potentially lead to anomalies. We show that, locally at each singularity, these anomalies are cancelled by the non-invariance of the classical action through a mechanism called 'anomaly inflow'. Unfortunately, no explicit metric of a compact G{sub 2}-manifold is known. Here we construct families of metrics on compact weak G{sub 2}-manifolds, which contain two conical singularities. Weak G{sub 2}-manifolds have properties that are similar to the ones of proper G{sub 2}-manifolds, and hence the explicit examples might be useful to better understand the generic situation. Finally, we reconsider the relation between eleven-dimensional supergravity and the E{sub 8} x E{sub 8}-heterotic string. This is done by carefully studying the anomalies that appear if the supergravity theory is formulated on a ten-manifold times the interval. Again we find that the anomalies cancel locally at the boundaries of the interval through anomaly inflow, provided one suitably modifies the

  2. Quantification Model for Estimating Temperature Field Distributions of Apple Fruit

    OpenAIRE

    Zhang , Min; Yang , Le; Zhao , Huizhong; Zhang , Leijie; Zhong , Zhiyou; Liu , Yanling; Chen , Jianhua

    2009-01-01

    A quantification model of transient heat conduction was provided to simulate apple fruit temperature distribution in the cooling process. The model was based on the energy variation of apple fruit at different points. It took into account the heat exchange of a representative elemental volume, metabolic heat and external heat. The following conclusions could be obtained: first, the quantification model can satisfactorily describe the tendency of apple fruit temperature dis...
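
    The kind of transient heat-conduction model the record describes can be sketched with an explicit finite-difference scheme. The one-dimensional toy below uses assumed values for the diffusivity and the metabolic heat source, not the paper's parameters or geometry.

    import numpy as np

    # Generic explicit finite-difference sketch of transient heat conduction
    # (1-D for brevity). alpha and q_met are assumed illustrative values.
    alpha = 1.4e-7        # m^2/s, thermal diffusivity of apple flesh (assumed)
    q_met = 2e-5          # K/s equivalent internal (metabolic) heat generation (assumed)
    dx, dt = 0.002, 2.0   # grid spacing (m), time step (s); stable: alpha*dt/dx^2 < 0.5
    n_nodes, n_steps = 40, 1800

    T = np.full(n_nodes, 20.0)        # initial fruit temperature (deg C)
    T_air = 2.0                       # cooling-air temperature (deg C)
    for _ in range(n_steps):
        T[0] = T[1]                   # symmetry condition at the core
        T[-1] = T_air                 # surface held at air temperature (simplified)
        T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2*T[1:-1] + T[:-2]) + q_met * dt
    print(f"core temperature after {n_steps*dt/60:.0f} min: {T[0]:.1f} deg C")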

  3. A survey of hidden-variables theories

    CERN Document Server

    Belinfante, F J

    1973-01-01

    A Survey of Hidden-Variables Theories is a three-part book on the hidden-variable theories, referred to in this book as "theories of the first kind". Part I reviews the motives in developing different types of hidden-variables theories. The quest for determinism led to theories of the first kind; the quest for theories that look like causal theories when applied to spatially separated systems that interacted in the past led to theories of the second kind. Parts II and III further describe the theories of the first kind and second kind, respectively. This book is written to make the literat

  4. Unified string theories

    International Nuclear Information System (INIS)

    Gross, D.J.

    1985-01-01

    String theories offer a way of realizing the potential of supersymmetry, Kaluza-Klein and much more. They represent a radical departure from ordinary quantum field theory, but in the direction of increased symmetry and structure. They are based on an enormous increase in the number of degrees of freedom, since in addition to fermionic coordinates and extra dimensions, the basic entities are extended one-dimensional objects instead of points. Correspondingly the symmetry group is greatly enlarged, in a way that we are only beginning to comprehend. At the very least this extended symmetry contains the largest group of symmetries that can be contemplated within the framework of point field theories: those of ten-dimensional supergravity and super Yang-Mills theory. Types of string theories and the phenomenology to be expected from them are reviewed.

  5. Performance of Density Functional Theory Procedures for the Calculation of Proton-Exchange Barriers: Unusual Behavior of M06-Type Functionals.

    Science.gov (United States)

    Chan, Bun; Gilbert, Andrew T B; Gill, Peter M W; Radom, Leo

    2014-09-09

    We have examined the performance of a variety of density functional theory procedures for the calculation of complexation energies and proton-exchange barriers, with a focus on the Minnesota class of functionals, which are generally robust and show good accuracy. A curious observation is that M05-type and M06-type methods show an atypical decrease in calculated barriers with increasing proportion of Hartree-Fock exchange. To obtain a clearer picture of the performance of the underlying components of M05-type and M06-type functionals, we have investigated the combination of MPW-type and PBE-type exchange and B95-type and PBE-type correlation procedures. We find that, for the extensive E3 test set, the general performance of the various hybrid-DFT procedures improves in the following order: PBE1-B95 → PBE1-PBE → MPW1-PBE → PW6-B95. As M05-type and M06-type procedures are related to PBE1-B95, it would be of interest to formulate and examine the general performance of an alternative Minnesota DFT method related to PW6-B95.

  6. Exploration of Action Figure Appeals Using Evaluation Grid Method and Quantification Theory Type I

    Science.gov (United States)

    Chang, Hua-Cheng; Chen, Hung-Yuan

    2017-01-01

    Contemporary toys are characterized by accelerating social, cultural and technological change. An attractive action figure can grab consumers' attention, influence latent consumer preferences and evoke pleasure. However, the traditional design of action figures depends on the designer's opinion, subjective experience and preference. It…

  7. Initial water quantification results using neutron computed tomography

    Science.gov (United States)

    Heller, A. K.; Shi, L.; Brenizer, J. S.; Mench, M. M.

    2009-06-01

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at the Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigation of water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for purposes of testing the quantification technique. Transmission images of the test sample at different angles were easily acquired through the synthesis of a dedicated image acquisition computer driving a rotary table controller and an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.
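
    Water quantification from neutron transmission data rests on the Beer-Lambert law, I = I0·exp(-mu·t), inverted per pixel for the water thickness t. A minimal sketch with an assumed attenuation coefficient; in practice the coefficient would be calibrated against the known-volume test sample the record describes.

    import numpy as np

    # Beer-Lambert estimate of water thickness from a neutron transmission image:
    # I = I0 * exp(-mu_w * t), so t = -ln(I / I0) / mu_w for a water-only path.
    MU_WATER = 3.6   # 1/cm, assumed effective attenuation coefficient for thermal neutrons

    def water_thickness(I, I0, mu=MU_WATER):
        """Per-pixel water thickness (cm) from open-beam (I0) and sample (I) images."""
        transmission = np.clip(I / I0, 1e-6, 1.0)   # guard against noise/zeros
        return -np.log(transmission) / mu

    # Toy 2x2 example: darker pixels imply more water along the beam path.
    I0 = np.full((2, 2), 1000.0)
    I = np.array([[900.0, 700.0], [500.0, 1000.0]])
    print(water_thickness(I, I0))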

  8. Perfusion Quantification Using Gaussian Process Deconvolution

    DEFF Research Database (Denmark)

    Andersen, Irene Klærke; Have, Anna Szynkowiak; Rasmussen, Carl Edward

    2002-01-01

    The quantification of perfusion using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residual impulse response function (IRF). In this work, a method using the Gaussian process for deconvolution (GPD) is proposed. The fact that the IRF is smooth is incorporated...
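
    Deconvolving the tissue concentration curve by the arterial input function (AIF) amounts to solving a discretized convolution system for the scaled residue function. The sketch below uses plain Tikhonov regularization as a simple stand-in for the Gaussian-process prior of the paper; all curves are synthetic.

    import numpy as np

    # DSC-MRI deconvolution sketch: recover the residue function R(t) from
    # C_tissue = AIF (*) (CBF * R) by solving a regularized linear system.
    dt = 1.0                       # s, sampling interval
    t = np.arange(0, 60, dt)
    aif = (t / 5.0) * np.exp(-t / 5.0)            # synthetic arterial input
    r_true = np.exp(-t / 8.0)                     # true residue function
    tissue = dt * np.convolve(aif, 0.01 * r_true)[: len(t)]

    # Build the convolution matrix A so that tissue ≈ A @ (CBF * R).
    A = np.array([[dt * aif[i - j] if i >= j else 0.0
                   for j in range(len(t))] for i in range(len(t))])
    lam = 0.1 * np.linalg.svd(A, compute_uv=False)[0]   # regularization strength
    irf = np.linalg.solve(A.T @ A + lam ** 2 * np.eye(len(t)), A.T @ tissue)
    print(f"estimated CBF-scaled IRF peak: {irf.max():.4f} (true 0.01)")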

  9. Supersymmetric Gödel Universes in string theory

    DEFF Research Database (Denmark)

    Harmark, Troels; Takayanagi, Tadashi

    2003-01-01

    Supersymmetric backgrounds in string and M-theory of the Gödel Universe type are studied. We find several new Gödel Universes that preserve up to 20 supersymmetries. In particular, we obtain an interesting Gödel Universe in M-theory with 18 supersymmetries which does not seem to be dual to a pp-wave. We show that not only T-duality but also the type-IIA/M-theory S-duality can give supersymmetric Gödel Universes from pp-waves. We find solutions that can interpolate between Gödel Universes and pp-waves. We also compute the string spectrum on two type IIA Gödel Universes. Furthermore, we obtain...

  10. Temperature dependence of postmortem MR quantification for soft tissue discrimination

    Energy Technology Data Exchange (ETDEWEB)

    Zech, Wolf-Dieter; Schwendener, Nicole; Jackowski, Christian [University of Bern, From the Institute of Forensic Medicine, Bern (Switzerland); Persson, Anders; Warntjes, Marcel J. [University of Linkoeping, The Center for Medical Image Science and Visualization (CMIV), Linkoeping (Sweden)

    2015-08-15

    To investigate and correct the temperature dependence of postmortem MR quantification used for soft tissue characterization and differentiation in thoraco-abdominal organs. Thirty-five postmortem short axis cardiac 3-T MR examinations were quantified using a quantification sequence. Liver, spleen, left ventricular myocardium, pectoralis muscle and subcutaneous fat were analysed in cardiac short axis images to obtain mean T1, T2 and PD tissue values. The core body temperature was measured using a rectally inserted thermometer. The tissue-specific quantitative values were related to the body core temperature. Equations to correct for temperature differences were generated. In a 3D plot comprising the combined data of T1, T2 and PD, different organs/tissues could be well differentiated from each other. The quantitative values were influenced by the temperature. T1 in particular exhibited strong temperature dependence. The correction of quantitative values to a temperature of 37 °C resulted in better tissue discrimination. Postmortem MR quantification is feasible for soft tissue discrimination and characterization of thoraco-abdominal organs. This provides a base for computer-aided diagnosis and detection of tissue lesions. The temperature dependence of the T1 values challenges postmortem MR quantification. Equations to correct for the temperature dependence are provided. (orig.)
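
    The correction equations the study derives map a measured quantitative value to the 37 °C reference; in the simplest linear form this is one line of arithmetic. The coefficient below is illustrative, not a fitted value from the paper.

    # Sketch of a linear temperature correction for a quantitative MR value,
    # mapping a postmortem measurement to the 37 °C reference. The slope is an
    # assumed illustrative number, not the study's fitted coefficient.
    T1_COEFF_LIVER = 8.0   # ms per °C, hypothetical temperature slope for liver T1

    def t1_at_37(t1_measured_ms, body_temp_c, coeff=T1_COEFF_LIVER):
        """Correct a postmortem T1 value (ms) to the 37 °C reference temperature."""
        return t1_measured_ms + coeff * (37.0 - body_temp_c)

    print(t1_at_37(520.0, 18.5))  # body at 18.5 °C -> temperature-corrected T1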

  11. Temperature dependence of postmortem MR quantification for soft tissue discrimination

    International Nuclear Information System (INIS)

    Zech, Wolf-Dieter; Schwendener, Nicole; Jackowski, Christian; Persson, Anders; Warntjes, Marcel J.

    2015-01-01

    To investigate and correct the temperature dependence of postmortem MR quantification used for soft tissue characterization and differentiation in thoraco-abdominal organs. Thirty-five postmortem short axis cardiac 3-T MR examinations were quantified using a quantification sequence. Liver, spleen, left ventricular myocardium, pectoralis muscle and subcutaneous fat were analysed in cardiac short axis images to obtain mean T1, T2 and PD tissue values. The core body temperature was measured using a rectally inserted thermometer. The tissue-specific quantitative values were related to the body core temperature. Equations to correct for temperature differences were generated. In a 3D plot comprising the combined data of T1, T2 and PD, different organs/tissues could be well differentiated from each other. The quantitative values were influenced by the temperature. T1 in particular exhibited strong temperature dependence. The correction of quantitative values to a temperature of 37 °C resulted in better tissue discrimination. Postmortem MR quantification is feasible for soft tissue discrimination and characterization of thoraco-abdominal organs. This provides a base for computer-aided diagnosis and detection of tissue lesions. The temperature dependence of the T1 values challenges postmortem MR quantification. Equations to correct for the temperature dependence are provided. (orig.)

  12. Quantification is Neither Necessary Nor Sufficient for Measurement

    International Nuclear Information System (INIS)

    Mari, Luca; Maul, Andrew; Torres Irribarra, David; Wilson, Mark

    2013-01-01

    Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement

  13. Tree-level stability without spacetime fermions: novel examples in string theory

    International Nuclear Information System (INIS)

    Israel, Dan; Niarchos, Vasilis

    2007-01-01

    Is perturbative stability intimately tied with the existence of spacetime fermions in string theory in more than two dimensions? Type 0'B string theory in ten-dimensional flat space is a rare example of a non-tachyonic, non-supersymmetric string theory with a purely bosonic closed string spectrum. However, all known type 0' constructions exhibit massless NSNS tadpoles signaling the fact that we are not expanding around a true vacuum of the theory. In this note, we are searching for perturbatively stable examples of type 0' string theory without massless tadpoles in backgrounds with a spatially varying dilaton. We present two examples with this property in non-critical string theories that exhibit four- and six-dimensional Poincare invariance. We discuss the D-branes that can be embedded in this context and the type of gauge theories that can be constructed in this manner. We also comment on the embedding of these non-critical models in critical string theories and their holographic (Little String Theory) interpretation and propose a general conjecture for the role of asymptotic supersymmetry in perturbative string theory

  14. Detection and quantification of proteins and cells by use of elemental mass spectrometry: progress and challenges.

    Science.gov (United States)

    Yan, Xiaowen; Yang, Limin; Wang, Qiuquan

    2013-07-01

    Much progress has been made in identification of the proteins in proteomes, and quantification of these proteins has attracted much interest. In addition to popular tandem mass spectrometric methods based on soft ionization, inductively coupled plasma mass spectrometry (ICPMS), a typical example of mass spectrometry based on hard ionization, usually used for analysis of elements, has unique advantages in absolute quantification of proteins by determination of an element with a definite stoichiometry in a protein or attached to the protein. In this Trends article, we briefly describe state-of-the-art ICPMS-based methods for quantification of proteins, emphasizing protein-labeling and element-tagging strategies developed on the basis of chemically selective reactions and/or biospecific interactions. Recent progress from protein to cell quantification by use of ICPMS is also discussed, and the possibilities and challenges of ICPMS-based protein quantification for universal, selective, or targeted quantification of proteins and cells in a biological sample are also discussed critically. We believe ICPMS-based protein quantification will become ever more important in targeted quantitative proteomics and bioanalysis in the near future.
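
    The absolute-quantification idea the article describes is stoichiometric: once an element or tag is present in a known number n of atoms per protein molecule, the protein concentration follows by dividing the ICPMS-determined element concentration by n. A toy example with assumed numbers:

    # Absolute protein quantification from an element measured by ICPMS relies on
    # a known stoichiometry: c_protein = c_element / n, with n atoms of the
    # element per protein molecule. All values below are illustrative.
    def protein_conc_nM(element_conc_nM, atoms_per_protein):
        return element_conc_nM / atoms_per_protein

    # e.g. a protein assumed to carry 4 sulfur atoms (Cys + Met) with 12 nM
    # sulfur measured in the protein fraction:
    print(protein_conc_nM(12.0, 4))  # -> 3.0 nM protein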

  15. Standardless quantification by parameter optimization in electron probe microanalysis

    International Nuclear Information System (INIS)

    Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.

    2012-01-01

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested in a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in the 74% of the cases studied. In addition, the performance of the method proposed is compared with the first principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 for the 66% of the cases for POEMA, GENESIS and DTSA, respectively. - Highlights: ► A method for standardless quantification in EPMA is presented. ► It gives better results than the commercial software GENESIS Spectrum. ► It gives better results than the software DTSA. ► It allows the determination of the conductive coating thickness. ► It gives an estimation for the concentration uncertainties.
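
    The core of parameter-optimization quantification is an ordinary least-squares fit of an analytical spectrum model to the measured spectrum. The sketch below fits Gaussian peaks on a linear background to a synthetic spectrum; it is a toy stand-in for POEMA's physical model, and all line positions and amplitudes are invented.

    import numpy as np
    from scipy.optimize import least_squares

    # Minimal version of "quantification by parameter optimization": minimize the
    # quadratic residuals between an analytical model and the measured spectrum.
    E = np.linspace(0.0, 10.0, 500)                       # energy axis (keV)

    def spectrum(p, E):
        bg0, bg1, a1, mu1, s1, a2, mu2, s2 = p
        return (bg0 + bg1 * E
                + a1 * np.exp(-0.5 * ((E - mu1) / s1) ** 2)
                + a2 * np.exp(-0.5 * ((E - mu2) / s2) ** 2))

    true_p = [50, -2, 400, 1.74, 0.06, 150, 6.40, 0.09]   # toy Si K and Fe K lines
    rng = np.random.default_rng(1)
    data = rng.poisson(np.clip(spectrum(true_p, E), 0, None))

    fit = least_squares(lambda p: spectrum(p, E) - data,
                        x0=[40, 0, 300, 1.7, 0.1, 100, 6.3, 0.1])
    print("fitted peak heights (composition proxies):", fit.x[2], fit.x[5])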

  16. The Scientific Status of Learning Styles Theories

    Science.gov (United States)

    Willingham, Daniel T.; Hughes, Elizabeth M.; Dobolyi, David G.

    2015-01-01

    Theories of learning styles suggest that individuals think and learn best in different ways. These are not differences of ability but rather preferences for processing certain types of information or for processing information in certain types of way. If accurate, learning styles theories could have important implications for instruction because…

  17. In-vivo segmentation and quantification of coronary lesions by optical coherence tomography images for a lesion type definition and stenosis grading.

    Science.gov (United States)

    Celi, Simona; Berti, Sergio

    2014-10-01

    Optical coherence tomography (OCT) is a catheter-based medical imaging technique that produces cross-sectional images of blood vessels. This technique is particularly useful for studying coronary atherosclerosis. In this paper, we present a new framework that allows a segmentation and quantification of OCT images of coronary arteries to define the plaque type and stenosis grading. These analyses are usually carried out on-line on the OCT-workstation where measuring is mainly operator-dependent and mouse-based. The aim of this program is to simplify and improve the processing of OCT images for morphometric investigations and to present a fast procedure to obtain 3D geometrical models that can also be used for external purposes such as for finite element simulations. The main phases of our toolbox are the lumen segmentation and the identification of the main tissues in the artery wall. We validated the proposed method with identification and segmentation manually performed by expert OCT readers. The method was evaluated on ten datasets from clinical routine and the validation was performed on 210 images randomly extracted from the pullbacks. Our results show that automated segmentation of the vessel and of the tissue components are possible off-line with a precision that is comparable to manual segmentation for the tissue component and to the proprietary-OCT-console for the lumen segmentation. Several OCT sections have been processed to provide clinical outcome. Copyright © 2014 Elsevier B.V. All rights reserved.
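
    Once the lumen is segmented, stenosis grading is simple geometry: percent area stenosis compares the minimal lumen area against a reference lumen area. A sketch with invented pullback areas (the areas would come from the segmented lumen contours):

    import numpy as np

    # Stenosis grading from segmented OCT cross-sections: percent area stenosis
    # compares the minimal lumen area (MLA) with a reference lumen area.
    def area_stenosis_pct(minimal_lumen_area_mm2, reference_area_mm2):
        return 100.0 * (1.0 - minimal_lumen_area_mm2 / reference_area_mm2)

    lumen_areas = np.array([8.2, 7.9, 5.1, 2.3, 4.8, 7.5])   # mm^2 along pullback
    mla = lumen_areas.min()
    ref = lumen_areas.max()                                   # proximal reference segment
    print(f"area stenosis: {area_stenosis_pct(mla, ref):.0f}%")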

  18. Supergravity duals of matrix string theory

    International Nuclear Information System (INIS)

    Morales, Jose F.; Samtleben, Henning

    2002-01-01

    We study holographic duals of type II and heterotic matrix string theories described by warped AdS 3 supergravities. By explicitly solving the linearized equations of motion around near horizon D-string geometries, we determine the spectrum of Kaluza-Klein primaries for type I, II supergravities on warped AdS 3 xS 7 . The results match those coming from the dual two-dimensional gauge theories living on the D-string worldvolumes. We briefly discuss the connections with the N=(8,8), N=(8,0) orbifold superconformal field theories to which type IIB/heterotic matrix strings flow in the infrared. In particular, we associate the dimension (h, h-bar) = (3/2, 3/2) twisted operator which brings the matrix string theories out from the conformal point (R 8 ) N /S N with the dilaton profile in the supergravity background. The familiar dictionary between masses and 'scaling' dimensions of field and operators are modified by the presence of non-trivial warp factors and running dilatons. These modifications are worked out for the general case of domain wall/QFT correspondences between supergravities on warped AdS d+1 xS q geometries and super Yang-Mills theories with 16 supercharges. (author)

  19. Introduction to superstring theory

    International Nuclear Information System (INIS)

    Nunez, Carmen

    2009-01-01

    This is a very basic introduction to the AdS/CFT correspondence. The first lecture motivates the duality between gauge theories and gravity/string theories. The next two lectures introduce the bosonic and supersymmetric string theories. The fourth lecture is devoted to study Dp-branes and finally, in the fifth lecture I discuss the two worlds: N=4 SYM in 3+1 flat dimensions and type IIB superstrings in AdS 5 x S5. (author)

  20. Introduction to superstring theory

    Energy Technology Data Exchange (ETDEWEB)

    Nunez, Carmen [Instituto de Astronomia y Fisica del Espacio, Buenos Aires (Argentina)], e-mail: carmen@iafe.uba.ar

    2009-07-01

    This is a very basic introduction to the AdS/CFT correspondence. The first lecture motivates the duality between gauge theories and gravity/string theories. The next two lectures introduce the bosonic and supersymmetric string theories. The fourth lecture is devoted to study Dp-branes and finally, in the fifth lecture I discuss the two worlds: N=4 SYM in 3+1 flat dimensions and type IIB superstrings in AdS{sub 5} x S5. (author)

  1. Can Malin's gravitational-field equations be modified to obtain a viable theory of gravity?

    International Nuclear Information System (INIS)

    Smalley, L.L.; Prestage, J.

    1976-01-01

    Malin's gravitational theory, which was recently shown by Lindblom and Nester to be incorrect, is modified by means of a recently proposed method for obtaining viable gravitational theories. The resulting self-consistent theory, which is in effect a Rastall-type modification of the Einstein theory, exhibits nonconservation of momentum, yet agrees with all experimental limits known to date within the PPN framework

  2. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  3. Field theory and strings

    International Nuclear Information System (INIS)

    Bonara, L.; Cotta-Ramusino, P.; Rinaldi, M.

    1987-01-01

    It is well known that type I and heterotic superstring theories have a zero-mass spectrum which corresponds to the field content of N=1 supergravity theory coupled to supersymmetric Yang-Mills theory in 10-D. The authors study the field theory ''per se'', in the hope that simple consistency requirements will determine the theory completely once one knows the field content inherited from string theory. The simplest consistency requirements are N=1 supersymmetry and the absence of chiral anomalies. This is what the authors discuss here, leaving undetermined the question of the range of validity of the resulting field theory. As is known, a model of N=1 supergravity (SUGRA) coupled to supersymmetric Yang-Mills (SYM) theory was given by Chapline and Manton. The coupling of SUGRA to SYM was determined by the definition of the ''field strength'' 3-form H in this paper

  4. 1H NMR quantification in very dilute toxin solutions: application to anatoxin-a analysis.

    Science.gov (United States)

    Dagnino, Denise; Schripsema, Jan

    2005-08-01

    A complete procedure is described for the extraction, detection and quantification of anatoxin-a in biological samples. Anatoxin-a is extracted from biomass by a routine acid-base extraction. The extract is analysed by GC-MS, without the need for derivatization, with a detection limit of 0.5 ng. A method was developed for the accurate quantification of anatoxin-a in the standard solution to be used for the calibration of the GC analysis. 1H NMR allowed the accurate quantification of microgram quantities of anatoxin-a. The accurate quantification of compounds in standard solutions is rarely discussed, but for compounds like anatoxin-a (toxins with prices in the range of a million dollars a gram), of which generally only milligram quantities or less are available, this factor in the quantitative analysis is certainly not trivial. The method that was developed can easily be adapted for the accurate quantification of other toxins in very dilute solutions.
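
    Quantification by 1H NMR against an internal standard uses the proton-normalized integral ratio. A sketch of that calculation with illustrative numbers; the choice of standard, its proton count and the integrals are assumptions, not the paper's values.

    # qNMR quantification against an internal standard: the analyte concentration
    # follows from the integral ratio scaled by the number of protons each signal
    # represents: c_a = (I_a / I_s) * (n_s / n_a) * c_s.
    def qnmr_conc(I_analyte, I_std, n_analyte, n_std, conc_std_mM):
        return (I_analyte / I_std) * (n_std / n_analyte) * conc_std_mM

    # e.g. a 1H anatoxin-a signal integrated against a maleic acid standard
    # (2 equivalent protons) at an assumed 1.0 mM:
    print(qnmr_conc(I_analyte=0.42, I_std=1.00, n_analyte=1, n_std=2, conc_std_mM=1.0))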

  5. Uncertainty Quantification with Applications to Engineering Problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele

    in measurements, predictions and manufacturing, and we can say that any dynamical system used in engineering is subject to some of these uncertainties. The first part of this work presents an overview of the mathematical framework used in Uncertainty Quantification (UQ) analysis and introduces the spectral tensor ... and thus the UQ analysis of the associated systems will benefit greatly from the application of methods which require few function evaluations. We first consider the propagation of the uncertainty and the sensitivity analysis of the non-linear dynamics of railway vehicles with suspension components whose ... -scale problems, where efficient methods are necessary with today’s computational resources. The outcome of this work was also the creation of several freely available Python modules for Uncertainty Quantification, which are listed and described in the appendix.

  6. Competitive Reporter Monitored Amplification (CMA) - Quantification of Molecular Targets by Real Time Monitoring of Competitive Reporter Hybridization

    Science.gov (United States)

    Ullrich, Thomas; Ermantraut, Eugen; Schulz, Torsten; Steinmetzer, Katrin

    2012-01-01

    Background: State of the art molecular diagnostic tests are based on the sensitive detection and quantification of nucleic acids. However, currently established diagnostic tests are characterized by elaborate and expensive technical solutions hindering the development of simple, affordable and compact point-of-care molecular tests. Methodology and Principal Findings: The described competitive reporter monitored amplification allows the simultaneous amplification and quantification of multiple nucleic acid targets by polymerase chain reaction. Target quantification is accomplished by real-time detection of amplified nucleic acids utilizing a capture probe array and specific reporter probes. The reporter probes are fluorescently labeled oligonucleotides that are complementary to the respective capture probes on the array and to the respective sites of the target nucleic acids in solution. Capture probes and amplified target compete for reporter probes. Increasing amplicon concentration leads to decreased fluorescence signal at the respective capture probe position on the array which is measured after each cycle of amplification. In order to observe reporter probe hybridization in real-time without any additional washing steps, we have developed a mechanical fluorescence background displacement technique. Conclusions and Significance: The system presented in this paper enables simultaneous detection and quantification of multiple targets. Moreover, the presented fluorescence background displacement technique provides a generic solution for real time monitoring of binding events of fluorescently labelled ligands to surface immobilized probes. With the model assay for the detection of human immunodeficiency virus type 1 and 2 (HIV 1/2), we have been able to observe the amplification kinetics of five targets simultaneously and accommodate two additional hybridization controls with a simple instrument set-up. The ability to accommodate multiple controls and targets into a
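
    Because CMA reads out amplification as a falling spot signal (reporters are titrated away by accumulating amplicon), a quantification cycle can be extracted much as in qPCR, only with the threshold crossed downwards. A synthetic illustration, not the assay's actual signal model:

    import numpy as np

    # Toy CMA read-out: the free-reporter signal at a capture spot falls as
    # amplicon accumulates; the quantification cycle is read off where the
    # signal first drops below a threshold.
    cycles = np.arange(1, 41)
    bound_by_amplicon = 1.0 / (1.0 + np.exp(-(cycles - 24) / 2.0))  # synthetic growth
    spot = 1.0 - bound_by_amplicon        # array-spot fluorescence (normalized)

    threshold = 0.8
    cq = cycles[np.argmax(spot < threshold)]   # first cycle below threshold
    print(f"quantification cycle: {cq}")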

  7. Competitive reporter monitored amplification (CMA) - quantification of molecular targets by real time monitoring of competitive reporter hybridization.

    Directory of Open Access Journals (Sweden)

    Thomas Ullrich

    Full Text Available BACKGROUND: State of the art molecular diagnostic tests are based on the sensitive detection and quantification of nucleic acids. However, currently established diagnostic tests are characterized by elaborate and expensive technical solutions hindering the development of simple, affordable and compact point-of-care molecular tests. METHODOLOGY AND PRINCIPAL FINDINGS: The described competitive reporter monitored amplification allows the simultaneous amplification and quantification of multiple nucleic acid targets by polymerase chain reaction. Target quantification is accomplished by real-time detection of amplified nucleic acids utilizing a capture probe array and specific reporter probes. The reporter probes are fluorescently labeled oligonucleotides that are complementary to the respective capture probes on the array and to the respective sites of the target nucleic acids in solution. Capture probes and amplified target compete for reporter probes. Increasing amplicon concentration leads to decreased fluorescence signal at the respective capture probe position on the array which is measured after each cycle of amplification. In order to observe reporter probe hybridization in real-time without any additional washing steps, we have developed a mechanical fluorescence background displacement technique. CONCLUSIONS AND SIGNIFICANCE: The system presented in this paper enables simultaneous detection and quantification of multiple targets. Moreover, the presented fluorescence background displacement technique provides a generic solution for real time monitoring of binding events of fluorescently labelled ligands to surface immobilized probes. With the model assay for the detection of human immunodeficiency virus type 1 and 2 (HIV 1/2), we have been able to observe the amplification kinetics of five targets simultaneously and accommodate two additional hybridization controls with a simple instrument set-up. The ability to accommodate multiple controls

  8. Biosensors for the Detection and Quantification of AI-2 Class Quorum-Sensing Compounds.

    Science.gov (United States)

    Rajamani, Sathish; Sayre, Richard

    2018-01-01

    Intercellular small-molecular-weight signaling molecules modulate a variety of biological functions in bacteria. One of the more complex behaviors mediated by intercellular signaling molecules is the suite of activities regulated by quorum-sensing molecules. These molecules mediate a variety of population-dependent responses including the expression of genes that regulate bioluminescence, type III secretion, siderophore production, colony morphology, biofilm formation, and metalloprotease production. Given their central role in regulating these responses, the detection and quantification of QS molecules have important practical implications. Until recently, the detection of QS molecules from Gram-negative bacteria has relied primarily on bacterial reporter systems. These bioassays though immensely useful are subject to interference by compounds that affect bacterial growth and metabolism. In addition, the reporter response is highly dependent on culture age and cell population density. To overcome such limitations, we developed an in vitro protein-based assay system for the rapid detection and quantification of the furanosyl borate diester (BAI-2) subclass of autoinducer-2 (AI-2) QS molecules. The biosensor is based on the interaction of BAI-2 with the Vibrio harveyi QS receptor LuxP. Conformation changes associated with BAI-2 binding to the LuxP receptor change the orientation of cyan and yellow variants of GFP (CFP and YFP) fused to the N- and C-termini, respectively, of the LuxP receptor. LuxP-BAI2 binding induces changes in fluorescence resonance energy transfer (FRET) between CFP and YFP, whose magnitude of change is ligand concentration dependent. Ligand-insensitive LuxP mutant FRET protein sensors were also developed for use as control biosensors. The FRET-based BAI-2 biosensor responds selectively to both synthetic and biologically derived BAI-2 compounds. This report describes the use of the LuxP-FRET biosensor for the detection and quantification of BAI-2.
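
    Converting the FRET ratio change into a BAI-2 concentration typically goes through a one-site binding isotherm fitted from a calibration titration. A sketch with assumed Kd and maximal ratio change; both parameters are illustrative, not the biosensor's measured values.

    # One-site binding model of the kind used to map a FRET ratio change to a
    # ligand concentration: dR = dR_max * [L] / (Kd + [L]).
    KD_uM, DR_MAX = 2.0, 0.35   # assumed dissociation constant and max response

    def fret_response(ligand_uM):
        return DR_MAX * ligand_uM / (KD_uM + ligand_uM)

    def ligand_from_response(delta_ratio):
        """Invert the isotherm to estimate [BAI-2] from an observed FRET change."""
        return KD_uM * delta_ratio / (DR_MAX - delta_ratio)

    print(ligand_from_response(fret_response(1.5)))  # recovers ~1.5 uM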

  9. PaCeQuant: A Tool for High-Throughput Quantification of Pavement Cell Shape Characteristics.

    Science.gov (United States)

    Möller, Birgit; Poeschl, Yvonne; Plötner, Romina; Bürstenbinder, Katharina

    2017-11-01

    Pavement cells (PCs) are the most frequently occurring cell type in the leaf epidermis and play important roles in leaf growth and function. In many plant species, PCs form highly complex jigsaw-puzzle-shaped cells with interlocking lobes. Understanding of their development is of high interest for plant science research because of their importance for leaf growth and hence for plant fitness and crop yield. Studies of PC development, however, are limited, because robust methods are lacking that enable automatic segmentation and quantification of PC shape parameters suitable to reflect their cellular complexity. Here, we present our new ImageJ-based tool, PaCeQuant, which provides a fully automatic image analysis workflow for PC shape quantification. PaCeQuant automatically detects cell boundaries of PCs from confocal input images and enables manual correction of automatic segmentation results or direct import of manually segmented cells. PaCeQuant simultaneously extracts 27 shape features that include global, contour-based, skeleton-based, and PC-specific object descriptors. In addition, we included a method for classification and analysis of lobes at two-cell junctions and three-cell junctions, respectively. We provide an R script for graphical visualization and statistical analysis. We validated PaCeQuant by extensive comparative analysis to manual segmentation and existing quantification tools and demonstrated its usability to analyze PC shape characteristics during development and between different genotypes. PaCeQuant thus provides a platform for robust, efficient, and reproducible quantitative analysis of PC shape characteristics that can easily be applied to study PC development in large data sets. © 2017 American Society of Plant Biologists. All Rights Reserved.
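
    Many global descriptors reported by such a tool are contour geometry; circularity (4πA/P²), for instance, equals 1 for a circle and drops as pavement-cell lobes deepen. A self-contained sketch on a toy polygon, not PaCeQuant's ImageJ code:

    import numpy as np

    # Circularity of a cell from its polygonal contour, via the shoelace area
    # and the summed edge lengths.
    def polygon_area_perimeter(xy):
        x, y = xy[:, 0], xy[:, 1]
        area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
        perim = np.sum(np.hypot(np.diff(x, append=x[0]), np.diff(y, append=y[0])))
        return area, perim

    def circularity(xy):
        a, p = polygon_area_perimeter(xy)
        return 4.0 * np.pi * a / p ** 2   # 1.0 for a circle, lower for lobed cells

    square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)
    print(f"circularity of a square: {circularity(square):.3f}")   # ~0.785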

  10. Improved LC-MS/MS method for the quantification of hepcidin-25 in clinical samples.

    Science.gov (United States)

    Abbas, Ioana M; Hoffmann, Holger; Montes-Bayón, María; Weller, Michael G

    2018-06-01

    Mass spectrometry-based methods play a crucial role in the quantification of the main iron metabolism regulator hepcidin by singling out the bioactive 25-residue peptide from the other naturally occurring N-truncated isoforms (hepcidin-20, -22, -24), which seem to be inactive in iron homeostasis. However, several difficulties arise in the MS analysis of hepcidin due to the "sticky" character of the peptide and the lack of suitable standards. Here, we propose the use of amino- and fluoro-silanized autosampler vials to reduce hepcidin interaction to laboratory glassware surfaces after testing several types of vials for the preparation of stock solutions and serum samples for isotope dilution liquid chromatography-tandem mass spectrometry (ID-LC-MS/MS). Furthermore, we have investigated two sample preparation strategies and two chromatographic separation conditions with the aim of developing a LC-MS/MS method for the sensitive and reliable quantification of hepcidin-25 in serum samples. A chromatographic separation based on usual acidic mobile phases was compared with a novel approach involving the separation of hepcidin-25 with solvents at high pH containing 0.1% of ammonia. Both methods were applied to clinical samples in an intra-laboratory comparison of two LC-MS/MS methods using the same hepcidin-25 calibrators with good correlation of the results. Finally, we recommend a LC-MS/MS-based quantification method with a dynamic range of 0.5-40 μg/L for the assessment of hepcidin-25 in human serum that uses TFA-based mobile phases and silanized glass vials. Graphical abstract Structure of hepcidin-25 (Protein Data Bank, PDB ID 2KEF).

  11. Lung involvement quantification in chest radiographs; Quantificacao de comprometimento pulmonar em radiografias de torax

    Energy Technology Data Exchange (ETDEWEB)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A. [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Instituto de Biociencias. Departamento de Fisica e Biofisica; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M., E-mail: giacomini@ibb.unesp.br [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Faculdade de Medicina. Departamento de Doencas Tropicais e Diagnostico por Imagem

    2014-12-15

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease which remains a global health problem. Chest radiography is the method commonly employed to assess the evolution of TB. Methods for the quantification of chest abnormalities are usually applied to computed tomography (CT) scans. This quantification is important to assess the evolution and treatment of TB and to compare different treatments. However, precise quantification is often not feasible because of the number of CT scans required. The purpose of this work is to develop a methodology for the quantification of lung damage caused by TB through chest radiographs. An algorithm was developed in Matlab for computational processing of the exams, which creates a 3D representation of the lungs, with the compromised dilated regions inside. The quantification of lung lesions was also made for the same patients through CT scans. The measurements from the two methods were compared, resulting in a strong correlation. Applying Bland-Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. The results suggest the effectiveness and applicability of the method developed, providing a better risk-benefit ratio for the patient and cost-benefit ratio for the institution. (author)
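
    The Bland-Altman analysis used to compare the radiograph- and CT-based measurements computes the mean difference (bias) and the 95% limits of agreement, bias ± 1.96·SD of the differences. A sketch on synthetic paired data, not the study's measurements:

    import numpy as np

    # Bland-Altman agreement analysis for two methods measuring the same quantity.
    rng = np.random.default_rng(3)
    ct = rng.uniform(5, 40, 20)                 # lesion involvement (%) by CT
    xr = ct * 1.05 + rng.normal(0, 2.0, 20)     # radiograph-based estimate

    diff = xr - ct
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)               # half-width of the limits of agreement
    print(f"bias {bias:.2f}, 95% limits of agreement "
          f"[{bias - loa:.2f}, {bias + loa:.2f}]")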

  12. Cosmological constraints on Brans-Dicke theory.

    Science.gov (United States)

    Avilez, A; Skordis, C

    2014-07-04

    We report strong cosmological constraints on the Brans-Dicke (BD) theory of gravity using cosmic microwave background data from Planck. We consider two types of models. First, the initial condition of the scalar field is fixed to give the same effective gravitational strength Geff today as the one measured on Earth, GN. In this case, the BD parameter ω is constrained to ω > 692 at the 99% confidence level, an order of magnitude improvement over previous constraints. In the second type, the initial condition for the scalar is a free parameter, leading to a somewhat stronger constraint of ω > 890, while Geff is constrained to 0.981 < Geff/GN < 1.285 at the 99% confidence level. Our constraints extend beyond the BD theory and are valid for any Horndeski theory, the most general second-order scalar-tensor theory, which approximates the BD theory on cosmological scales. In this sense, our constraints place strong limits on possible modifications of gravity that might explain cosmic acceleration.

  13. Quantification of virus syndrome in chili peppers

    African Journals Online (AJOL)

    Jane

    2011-06-15

    Jun 15, 2011 ... alternative for the quantification of the disease syndromes in regards to this crop. The result of these ... comparison of treatments such as cultivars or control measures and ... vascular discoloration and stem necrosis.

  14. String theory and water waves

    International Nuclear Information System (INIS)

    Iyer, Ramakrishnan; Johnson, Clifford V; Pennington, Jeffrey S

    2011-01-01

    We uncover a remarkable role that an infinite hierarchy of nonlinear differential equations plays in organizing and connecting certain ĉ < 1 string theories non-perturbatively. We are able to embed the type 0A and 0B (A, A) minimal string theories into this single framework. The string theories arise as special limits of a rich system of equations underpinned by an integrable system known as the dispersive water wave hierarchy. We observe that there are several other string-like limits of the system, and conjecture that some of them are type IIA and IIB (A, D) minimal string backgrounds. We explain how these and several string-like special points arise and are connected. In some cases, the framework endows the theories with a non-perturbative definition for the first time. Notably, we discover that the Painlevé IV equation plays a key role in organizing the string theory physics, joining its siblings, Painlevé I and II, whose roles have previously been identified in this minimal string context.

  15. A cluster randomised pragmatic trial applying Self-determination theory to type 2 diabetes care in general practice

    DEFF Research Database (Denmark)

    Juul, Lise; Maindal, Helle T; Zoffmann, Vibeke

    2011-01-01

    BACKGROUND: Treatment recommendations for prevention of type 2 diabetes complications often require radical and life-long health behaviour changes. Observational studies based on Self-determination theory (SDT) propose substantial factors for the maintenance of behaviour changes and concomitant ... well-being, but experimental research is needed to develop and evaluate SDT-based interventions. The aims of this paper were to describe 1) the design of a trial assessing the effectiveness of a training course for practice-nurses in autonomy support on patient-perceived motivation, HbA1c, cholesterol ... will be assessed on the diabetes populations with regard to well-being (PAID, SF-12), HbA1c- and cholesterol-levels, perceived autonomy support (HCCQ), type of motivation (TSRQ), and perceived competence for diabetes care (PCD) 15-21 months after the core course; the completion of the second course afternoon. Data ...

  16. Lamb Wave Damage Quantification Using GA-Based LS-SVM

    Directory of Open Access Journals (Sweden)

    Fuqiang Sun

    2017-06-01

    Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification.

  17. Lamb Wave Damage Quantification Using GA-Based LS-SVM.

    Science.gov (United States)

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-06-12

    Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification.
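
    The core of the approach in the two records above is an LS-SVM regressor whose kernel and regularization parameters are tuned by a genetic algorithm. The sketch below illustrates the idea under stated assumptions: an RBF kernel, a simple mutation-plus-selection GA, and entirely hypothetical feature data standing in for the three damage-sensitive features (normalized amplitude, phase change, correlation coefficient).

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(X1, X2, sigma):
    """Gaussian (RBF) kernel matrix between two sets of feature vectors."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    """Train an LS-SVM regressor by solving its (n+1)x(n+1) linear system."""
    n = len(y)
    A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                  [np.ones((n, 1)), rbf(X, X, sigma) + np.eye(n) / gamma]])
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    b, alpha = sol[0], sol[1:]
    return lambda Xq: rbf(Xq, X, sigma) @ alpha + b

def fitness(p, Xtr, ytr, Xval, yval):
    """Negative validation MSE; parameters are searched in log-space."""
    gamma, sigma = np.exp(p)
    pred = lssvm_fit(Xtr, ytr, gamma, sigma)(Xval)
    return -np.mean((pred - yval) ** 2)

def ga_optimize(Xtr, ytr, Xval, yval, pop=20, gens=30):
    """Toy GA: keep the best half, mutate it to refill the population."""
    P = rng.uniform(-3.0, 6.0, size=(pop, 2))   # (log gamma, log sigma)
    for _ in range(gens):
        f = np.array([fitness(p, Xtr, ytr, Xval, yval) for p in P])
        elite = P[np.argsort(f)[-pop // 2:]]                 # selection
        P = np.vstack([elite, elite + rng.normal(0.0, 0.3, elite.shape)])
    f = np.array([fitness(p, Xtr, ytr, Xval, yval) for p in P])
    return np.exp(P[np.argmax(f)])

# Hypothetical damage-sensitive features -> crack size (mm):
X = rng.uniform(0.0, 1.0, size=(40, 3))
y = 2.0 * X[:, 0] + 1.5 * X[:, 1] - X[:, 2] + rng.normal(0.0, 0.05, 40)
gamma, sigma = ga_optimize(X[:30], y[:30], X[30:], y[30:])
crack_size = lssvm_fit(X[:30], y[:30], gamma, sigma)(X[30:])
```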

  18. Real-Time PCR Quantification of Chloroplast DNA Supports DNA Barcoding of Plant Species.

    Science.gov (United States)

    Kikkawa, Hitomi S; Tsuge, Kouichiro; Sugita, Ritsuko

    2016-03-01

    Species identification from extracted DNA is sometimes needed for botanical samples. DNA quantification is required for an accurate and effective examination. If a quantitative assay provides unreliable estimates, a higher quantity of DNA than the estimated amount may be used in additional analyses to avoid failure to analyze samples from which extracting DNA is difficult. Compared with conventional methods, real-time quantitative PCR (qPCR) requires a low amount of DNA and enables accurate quantification of dilute DNA solutions. The aim of this study was to develop a qPCR assay for quantification of chloroplast DNA from taxonomically diverse plant species. An absolute quantification method was developed using primers targeting the ribulose-1,5-bisphosphate carboxylase/oxygenase large subunit (rbcL) gene and SYBR Green I-based qPCR. The calibration curve was generated using the PCR amplicon as the template, and DNA extracts from representatives of 13 plant families common in Japan were quantified. This demonstrates that qPCR analysis is an effective method for quantification of DNA from plant samples. The results of qPCR assist in the decision-making that will determine the success or failure of DNA analysis, indicating the possibility of optimizing the procedure for downstream reactions.
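
    Absolute quantification by qPCR rests on a standard curve relating the threshold cycle (Ct) to the logarithm of the starting copy number; the curve slope also yields the amplification efficiency. A minimal sketch with hypothetical dilution data (not the study's values):

```python
import numpy as np

# Hypothetical standard curve: serial dilutions of the rbcL amplicon
log10_copies = np.array([7.0, 6.0, 5.0, 4.0, 3.0, 2.0])
ct_standards = np.array([12.1, 15.5, 18.9, 22.3, 25.8, 29.2])

slope, intercept = np.polyfit(log10_copies, ct_standards, 1)
efficiency = 10.0 ** (-1.0 / slope) - 1.0   # 1.0 would mean perfect doubling

def copies_from_ct(ct):
    """Invert the standard curve: observed Ct -> estimated copy number."""
    return 10.0 ** ((ct - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency = {efficiency:.0%}")
print(f"unknown at Ct = 20.4 -> {copies_from_ct(20.4):.2e} copies")
```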

  19. Stereo-particle image velocimetry uncertainty quantification

    International Nuclear Information System (INIS)

    Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall, the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment of the subject and potentially lays foundations applicable to volumetric PIV techniques.
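
    The combination step described above amounts to first-order uncertainty propagation: the planar uncertainties of each camera are mapped through the sensitivities of the stereo reconstruction and added in quadrature with the calibration contribution. The sketch below shows only that generic arithmetic; the sensitivity coefficients and uncertainty values are hypothetical placeholders, not the paper's model.

```python
import numpy as np

def propagate_stereo_uncertainty(jac, u_planar, u_cal):
    """First-order propagation of input uncertainties to the three
    reconstructed velocity components.

    jac      : (3, n) sensitivities of (U, V, W) to the n planar inputs
    u_planar : (n,) 1-sigma uncertainties of the planar inputs
    u_cal    : (3,) calibration uncertainty, already in velocity units
    """
    var_planar = (np.asarray(jac) ** 2) @ (np.asarray(u_planar) ** 2)
    return np.sqrt(var_planar + np.asarray(u_cal) ** 2)

# Hypothetical sensitivities for 4 inputs (u, v from each of two cameras):
jac = [[0.7, 0.0, 0.7, 0.0],    # U driven by the two u displacements
       [0.0, 0.5, 0.0, 0.5],    # V by the two v displacements
       [0.9, 0.1, -0.9, 0.1]]   # W by the u-difference between cameras
print(propagate_stereo_uncertainty(jac, [0.08, 0.08, 0.09, 0.09],
                                   [0.02, 0.02, 0.05]))
```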

  20. Legendre transformations and Clairaut-type equations

    Energy Technology Data Exchange (ETDEWEB)

    Lavrov, Peter M., E-mail: lavrov@tspu.edu.ru [Tomsk State Pedagogical University, Kievskaya St. 60, 634061 Tomsk (Russian Federation); National Research Tomsk State University, Lenin Av. 36, 634050 Tomsk (Russian Federation); Merzlikin, Boris S., E-mail: merzlikin@tspu.edu.ru [National Research Tomsk Polytechnic University, Lenin Av. 30, 634050 Tomsk (Russian Federation)

    2016-05-10

    It is noted that the Legendre transformations in the standard formulation of quantum field theory have the form of functional Clairaut-type equations. It is shown that in presence of composite fields the Clairaut-type form holds after loop corrections are taken into account. A new solution to the functional Clairaut-type equation appearing in field theories with composite fields is found.

  1. Absolute and direct microRNA quantification using DNA-gold nanoparticle probes.

    Science.gov (United States)

    Degliangeli, Federica; Kshirsagar, Prakash; Brunetti, Virgilio; Pompa, Pier Paolo; Fiammengo, Roberto

    2014-02-12

    DNA-gold nanoparticle probes are implemented in a simple strategy for direct microRNA (miRNA) quantification. Fluorescently labeled DNA-probe strands are immobilized on PEGylated gold nanoparticles (AuNPs). In the presence of target miRNA, DNA-RNA heteroduplexes are formed and become substrate for the endonuclease DSN (duplex-specific nuclease). Enzymatic hydrolysis of the DNA strands yields a fluorescence signal due to diffusion of the fluorophores away from the gold surface. We show that the molecular design of our DNA-AuNP probes, with the DNA strands immobilized on top of the PEG-based passivation layer, results in nearly unaltered enzymatic activity toward immobilized heteroduplexes compared to substrates free in solution. The assay, developed in a real-time format, allows absolute quantification of as little as 0.2 fmol of miR-203. We also show the application of the assay for direct quantification of cancer-related miR-203 and miR-21 in samples of extracted total RNA from cell cultures. The possibility of direct and absolute quantification may significantly advance the use of microRNAs as biomarkers in clinical practice.

  2. Activity quantification of phantom using dual-head SPECT with two-view planar image

    International Nuclear Information System (INIS)

    Guo Leiming; Chen Tao; Sun Xiaoguang; Huang Gang

    2005-01-01

    The absorbed radiation dose from an internally deposited radionuclide is a major factor in assessing risk and therapeutic utility in nuclear medicine diagnosis or treatment. The quantification of absolute activity in vivo is a necessary step in estimating the absorbed dose of an organ or tissue. To assess the accuracy of organ activity determination, experiments on 99mTc activity quantification were performed on a body phantom using dual-head SPECT with the two-view counting technique. Accuracy in the activity quantification is credible and is not affected by the depth of the source organ in vivo. When the diameter of the radiation source is ≤2 cm, the most accurate activity quantification can be obtained once the system calibration factor and transmission factor have been established. The use of Buijs's method is preferable, especially at very low source-to-background activity concentration ratios. (authors)
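
    The two-view (conjugate-view) technique combines opposed anterior and posterior counts through their geometric mean, corrected by the measured transmission factor and the system calibration factor. A minimal sketch of that standard formula, with hypothetical count rates and factors:

```python
import math

def conjugate_view_activity(counts_ant, counts_post, transmission, calib):
    """Geometric-mean activity estimate from opposed planar views.

    counts_ant, counts_post : background-corrected count rates (counts/s)
    transmission            : transmission factor through the body, exp(-mu*T)
    calib                   : system calibration factor (counts/s per MBq)
    """
    geometric_mean = math.sqrt(counts_ant * counts_post)
    return geometric_mean / (math.sqrt(transmission) * calib)

# Hypothetical phantom measurement:
print(conjugate_view_activity(1200.0, 950.0, 0.25, 90.0), "MBq")
```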

  3. Theory of mind in schizophrenia: error types and associations with symptoms.

    Science.gov (United States)

    Fretland, Ragnhild A; Andersson, Stein; Sundet, Kjetil; Andreassen, Ole A; Melle, Ingrid; Vaskinn, Anja

    2015-03-01

    Social cognition is an important determinant of functioning in schizophrenia. However, how social cognition relates to the clinical symptoms of schizophrenia is still unclear. The aim of this study was to explore the relationship between a social cognition domain, Theory of Mind (ToM), and the clinical symptoms of schizophrenia. Specifically, we investigated the associations between three ToM error types, 1) "overmentalizing", 2) "reduced ToM" and 3) "no ToM", and positive, negative and disorganized symptoms. Fifty-two participants with a diagnosis of schizophrenia or schizoaffective disorder were assessed with the Movie for the Assessment of Social Cognition (MASC), a video-based ToM measure. An empirically validated five-factor model of the Positive and Negative Syndrome Scale (PANSS) was used to assess clinical symptoms. There was a significant, small-to-moderate association between overmentalizing and positive symptoms (rho=.28, p=.04). Disorganized symptoms correlated at a trend level with "reduced ToM" (rho=.27, p=.05). There were no other significant correlations between ToM impairments and symptom levels. Positive/disorganized symptoms did not contribute significantly to explaining total ToM performance, whereas IQ did (B=.37, p=.01). Within the undermentalizing domain, participants made more "reduced ToM" errors than "no ToM" errors. Overmentalizing was associated with positive symptoms. The undermentalizing error types were unrelated to symptoms, but "reduced ToM" was somewhat associated with disorganization. The higher number of "reduced ToM" responses suggests that schizophrenia is characterized by accuracy problems rather than a fundamental lack of mental state concept. The findings call for the use of more sensitive measures when investigating ToM in schizophrenia to avoid the "right/wrong ToM" dichotomy. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Improved Strategies and Optimization of Calibration Models for Real-time PCR Absolute Quantification

    Science.gov (United States)

    Real-time PCR absolute quantification applications rely on the use of standard curves to make estimates of DNA target concentrations in unknown samples. Traditional absolute quantification approaches dictate that a standard curve must accompany each experimental run. However, t...

  5. Quantification of uranyl in presence of citric acid

    International Nuclear Information System (INIS)

    Garcia G, N.; Barrera D, C.E.; Ordonez R, E.

    2007-01-01

    To determine the influence that soil organic matter has on uranyl sorption on some solids, it is necessary to have a technique for the detection and quantification of uranyl that is reliable and sufficiently quick in producing results. For that reason, this work proposes to carry out the quantification of uranyl in the presence of citric acid by modifying the UV-Vis-radiation-induced fluorescence technique. Since the uranyl ion is very sensitive to the medium that contains it (speciation, pH, ionic strength, etc.), it was necessary to develop an analysis technique that enhances the fluorescence of the uranyl ion while avoiding the quenching produced by organic acids. (Author)

  6. Automated quantification of renal interstitial fibrosis for computer-aided diagnosis: A comprehensive tissue structure segmentation method.

    Science.gov (United States)

    Tey, Wei Keat; Kuang, Ye Chow; Ooi, Melanie Po-Leen; Khoo, Joon Joon

    2018-03-01

    Interstitial fibrosis in renal biopsy samples is a scarring tissue structure that may be visually quantified by pathologists as an indicator of the presence and extent of chronic kidney disease. The standard method of quantification by visual evaluation presents reproducibility issues in the diagnoses due to the uncertainties in human judgement. This study proposes an automated quantification system for measuring the amount of interstitial fibrosis in renal biopsy images as a consistent basis of comparison among pathologists. The system extracts and segments the renal tissue structures based on colour information and structural assumptions about the tissue structures. The regions in the biopsy representing interstitial fibrosis are deduced through the elimination of non-interstitial-fibrosis structures from the biopsy area and quantified as a percentage of the total area of the biopsy sample. A ground truth image dataset was manually prepared in consultation with an experienced pathologist for the validation of the segmentation algorithms. The results from experiments involving experienced pathologists demonstrated a good correlation in quantification result between the automated system and the pathologists' visual evaluation. Experiments investigating the variability in pathologists also proved the automated quantification error rate to be on par with the average intra-observer variability in pathologists' quantification.
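
    After segmentation, the quantification itself reduces to an area ratio between two masks. A minimal sketch (the mask names are hypothetical stand-ins for the colour/structure segmentation stages described above):

```python
import numpy as np

def fibrosis_percentage(fibrosis_mask, biopsy_mask):
    """Interstitial fibrosis as a percentage of the total biopsy area.

    Both inputs are boolean 2-D arrays: pixels classified as interstitial
    fibrosis (after eliminating non-fibrosis structures) and pixels
    belonging to the biopsy sample.
    """
    fibrosis_mask = np.asarray(fibrosis_mask, dtype=bool)
    biopsy_mask = np.asarray(biopsy_mask, dtype=bool)
    return 100.0 * (fibrosis_mask & biopsy_mask).sum() / biopsy_mask.sum()
```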

  7. On the Confounding Effect of Temperature on Chemical Shift-Encoded Fat Quantification

    Science.gov (United States)

    Hernando, Diego; Sharma, Samir D.; Kramer, Harald; Reeder, Scott B.

    2014-01-01

    Purpose: To characterize the confounding effect of temperature on chemical shift-encoded (CSE) fat quantification. Methods: The proton resonance frequency of water, unlike triglycerides, depends on temperature. This leads to a temperature dependence of the spectral models of fat (relative to water) that are commonly used by CSE-MRI methods. Simulation analysis was performed for 1.5 Tesla CSE fat-water signals at various temperatures and echo time combinations. Oil-water phantoms were constructed and scanned at temperatures between 0 and 40°C using spectroscopy and CSE imaging at three echo time combinations. An explanted human liver, rejected for transplantation due to steatosis, was scanned using spectroscopy and CSE imaging. Fat-water reconstructions were performed using four different techniques: magnitude and complex fitting, with standard or temperature-corrected signal modeling. Results: In all experiments, magnitude fitting with standard signal modeling resulted in large fat quantification errors. Errors were largest for echo time combinations near TEinit ≈ 1.3 ms, ΔTE ≈ 2.2 ms. Errors in fat quantification caused by temperature-related frequency shifts were smaller with complex fitting, and were avoided using a temperature-corrected signal model. Conclusion: Temperature is a confounding factor for fat quantification. If not accounted for, it can result in large errors in fat quantification in phantom and ex vivo acquisitions. PMID:24123362
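
    The size of the effect can be illustrated directly from the temperature dependence of the water resonance. The sketch below assumes a nominal water proton-resonance-frequency coefficient of about -0.01 ppm/°C (the value commonly used in MR thermometry) and a fixed nominal fat offset of -3.4 ppm; both numbers are assumptions for illustration, not taken from the paper.

```python
GAMMA_MHZ_PER_T = 42.577           # 1H gyromagnetic ratio / (2*pi)
PRF_COEFF_PPM_PER_C = -0.01        # assumed water PRF temperature coefficient

def water_fat_shift_hz(b0_tesla, temp_c, ref_temp_c=37.0,
                       fat_offset_ppm=-3.4):
    """Apparent water-fat shift: only the water peak moves with temperature;
    the (assumed) dominant fat peak offset stays fixed."""
    hz_per_ppm = GAMMA_MHZ_PER_T * b0_tesla
    shift_ppm = fat_offset_ppm - PRF_COEFF_PPM_PER_C * (temp_c - ref_temp_c)
    return shift_ppm * hz_per_ppm

# At 1.5 T, cooling a phantom from 37 C to 0 C changes the apparent shift:
print(water_fat_shift_hz(1.5, 37.0))   # approx. -217 Hz
print(water_fat_shift_hz(1.5, 0.0))    # approx. -241 Hz
```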

  8. Constructor theory of probability

    Science.gov (United States)

    2016-01-01

    Unitary quantum theory, having no Born Rule, is non-probabilistic. Hence the notorious problem of reconciling it with the unpredictability and appearance of stochasticity in quantum measurements. Generalizing and improving upon the so-called ‘decision-theoretic approach’, I shall recast that problem in the recently proposed constructor theory of information—where quantum theory is represented as one of a class of superinformation theories, which are local, non-probabilistic theories conforming to certain constructor-theoretic conditions. I prove that the unpredictability of measurement outcomes (to which constructor theory gives an exact meaning) necessarily arises in superinformation theories. Then I explain how the appearance of stochasticity in (finitely many) repeated measurements can arise under superinformation theories. And I establish sufficient conditions for a superinformation theory to inform decisions (made under it) as if it were probabilistic, via a Deutsch–Wallace-type argument—thus defining a class of decision-supporting superinformation theories. This broadens the domain of applicability of that argument to cover constructor-theory compliant theories. In addition, in this version some of the argument's assumptions, previously construed as merely decision-theoretic, follow from physical properties expressed by constructor-theoretic principles. PMID:27616914

  9. Radiation dose determines the method for quantification of DNA double strand breaks

    International Nuclear Information System (INIS)

    Bulat, Tanja; Keta, Olitija; Korićanac, Lela; Žakula, Jelena; Petrović, Ivan; Ristić-Fira, Aleksandra; Todorović, Danijela

    2016-01-01

    Ionizing radiation induces DNA double strand breaks (DSBs) that trigger phosphorylation of the histone protein H2AX (γH2AX). Immunofluorescent staining visualizes formation of γH2AX foci, allowing their quantification. This method, as opposed to Western blot assay and Flow cytometry, provides more accurate analysis, by showing exact position and intensity of fluorescent signal in each single cell. In practice there are problems in quantification of γH2AX. This paper is based on two issues: the determination of which technique should be applied concerning the radiation dose, and how to analyze fluorescent microscopy images obtained by different microscopes. HTB140 melanoma cells were exposed to γ-rays, in the dose range from 1 to 16 Gy. Radiation effects on the DNA level were analyzed at different time intervals after irradiation by Western blot analysis and immunofluorescence microscopy. Immunochemically stained cells were visualized with two types of microscopes: AxioVision (Zeiss, Germany) microscope, comprising an ApoTome software, and AxioImagerA1 microscope (Zeiss, Germany). Obtained results show that the level of γH2AX is time and dose dependent. Immunofluorescence microscopy provided better detection of DSBs for lower irradiation doses, while Western blot analysis was more reliable for higher irradiation doses. AxioVision microscope containing ApoTome software was more suitable for the detection of γH2AX foci. (author)
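
    Quantification of γH2AX foci from immunofluorescence images is commonly automated as intensity thresholding followed by connected-component counting. The following minimal sketch illustrates that generic pipeline; the threshold, the minimum focus size and the toy image are all hypothetical, and this is not the AxioVision/ApoTome workflow used in the study.

```python
import numpy as np
from scipy import ndimage

def count_foci(image, nucleus_mask, threshold, min_pixels=4):
    """Count bright foci inside one nucleus.

    image        : 2-D fluorescence intensity array
    nucleus_mask : boolean mask of the nucleus (e.g. from a DAPI channel)
    threshold    : intensity cut separating foci from background
    """
    foci = (image > threshold) & nucleus_mask
    labels, n_components = ndimage.label(foci)       # connected components
    sizes = np.bincount(labels.ravel())[1:]          # pixels per component
    return int((sizes >= min_pixels).sum())          # drop tiny specks

# Hypothetical toy image with two bright foci on a noisy background:
rng = np.random.default_rng(1)
img = rng.normal(10.0, 2.0, (64, 64))
img[20:24, 20:24] += 50.0
img[40:43, 50:53] += 50.0
print(count_foci(img, np.ones_like(img, dtype=bool), threshold=25.0))  # -> 2
```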

  10. Quantification of the degradation of steels exposed to liquid lead-bismuth eutectic

    International Nuclear Information System (INIS)

    Schroer, C.; Voss, Z.; Novotny, J.; Konys, J.

    2006-05-01

    Metallographic and gravimetric methods of measuring the degradation of steels are introduced and compared, with emphasis on the quantification of oxidation in molten lead-bismuth eutectic (LBE). In future applications of LBE or other molten lead alloys, additions of oxygen should prevent the dissolution of steel constituents in the liquid heavy metal. Therefore, also the amount of steel constituents transferred between the steel (including the oxide scale formed on the surface) and the LBE has to be assessed, in order to evaluate the efficiency of oxygen additions with respect to preventing dissolution of the steel. For testing the methods of quantification, specimens of martensitic steel T91 were exposed for 1500 h to stagnant, oxygen-saturated LBE at 550 C, whereby, applying both metallographic and gravimetric measurements, the recession of the cross-section of sound material deviated by ± 3 μm for a mean value of 11 μm. Although the transfer of steel constituents between the solid phases and the LBE is negligible under the considered exposure conditions, the investigation shows that a gravimetric analysis is most promising for quantifying such a mass transfer. For laboratory experiments on the behaviour of steels in oxygen-containing LBE, it is suggested to make provisions for both metallographic and gravimetric measurements, since both types of methods have specific benefits in the characterisation of the oxidation process. (Orig.)

  11. Radiation dose determines the method for quantification of DNA double strand breaks

    Energy Technology Data Exchange (ETDEWEB)

    Bulat, Tanja; Keta, Olitija; Korićanac, Lela; Žakula, Jelena; Petrović, Ivan; Ristić-Fira, Aleksandra [University of Belgrade, Vinča Institute of Nuclear Sciences, Belgrade (Serbia); Todorović, Danijela, E-mail: dtodorovic@medf.kg.ac.rs [University of Kragujevac, Faculty of Medical Sciences, Kragujevac (Serbia)

    2016-03-15

    Ionizing radiation induces DNA double strand breaks (DSBs) that trigger phosphorylation of the histone protein H2AX (γH2AX). Immunofluorescent staining visualizes formation of γH2AX foci, allowing their quantification. This method, as opposed to Western blot assay and Flow cytometry, provides more accurate analysis, by showing exact position and intensity of fluorescent signal in each single cell. In practice there are problems in quantification of γH2AX. This paper is based on two issues: the determination of which technique should be applied concerning the radiation dose, and how to analyze fluorescent microscopy images obtained by different microscopes. HTB140 melanoma cells were exposed to γ-rays, in the dose range from 1 to 16 Gy. Radiation effects on the DNA level were analyzed at different time intervals after irradiation by Western blot analysis and immunofluorescence microscopy. Immunochemically stained cells were visualized with two types of microscopes: AxioVision (Zeiss, Germany) microscope, comprising an ApoTome software, and AxioImagerA1 microscope (Zeiss, Germany). Obtained results show that the level of γH2AX is time and dose dependent. Immunofluorescence microscopy provided better detection of DSBs for lower irradiation doses, while Western blot analysis was more reliable for higher irradiation doses. AxioVision microscope containing ApoTome software was more suitable for the detection of γH2AX foci. (author)

  12. Developing a Model of Theory-to-Practice-to-Theory in Student Affairs: An Extended Case Analysis of Theories of Student Learning and Development

    Science.gov (United States)

    Kimball, Ezekiel W.

    2012-01-01

    Recent literature suggests a problematic connection between theory and practice in higher education scholarship generally and the study of student learning and development specifically (e.g. Bensimon, 2007; Kezar, 2000; Love, 2012). Much of this disconnect stems from a lack of differentiation between various types of theory used in student affairs…

  13. Standardless quantification by parameter optimization in electron probe microanalysis

    Energy Technology Data Exchange (ETDEWEB)

    Limandri, Silvina P. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Bonetto, Rita D. [Centro de Investigacion y Desarrollo en Ciencias Aplicadas Dr. Jorge Ronco (CINDECA), CONICET, 47 Street 257, (1900) La Plata (Argentina); Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 1 and 47 Streets (1900) La Plata (Argentina); Josa, Victor Galvan; Carreras, Alejo C. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Trincavelli, Jorge C., E-mail: trincavelli@famaf.unc.edu.ar [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina)

    2012-11-15

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 in 66% of the cases for POEMA, GENESIS and DTSA, respectively. Highlights: • A method for standardless quantification in EPMA is presented. • It gives better results than the commercial software GENESIS Spectrum. • It gives better results than the software DTSA. • It allows the determination of the conductive coating thickness. • It gives an estimation of the concentration uncertainties.
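
    The numerical core of such an approach is a least-squares fit of an analytical spectrum model to the measured counts, with parameter uncertainties read off the fit. The sketch below uses a deliberately toy model (linear background plus one Gaussian line) rather than POEMA's physical model; all numbers are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

def model(E, p):
    """Toy spectrum: linear background plus one Gaussian characteristic line."""
    b0, b1, area, center, width = p
    return b0 + b1 * E + area * np.exp(-0.5 * ((E - center) / width) ** 2)

def fit_spectrum(E, counts, p0):
    res = least_squares(lambda p: model(E, p) - counts, p0)
    # Rough 1-sigma parameter uncertainties from the Jacobian at the optimum:
    cov = np.linalg.inv(res.jac.T @ res.jac)
    cov *= (res.fun ** 2).sum() / (len(counts) - len(p0))
    return res.x, np.sqrt(np.diag(cov))

# Hypothetical spectrum with Poisson counting noise:
E = np.linspace(1.0, 10.0, 200)
rng = np.random.default_rng(2)
counts = rng.poisson(model(E, [50.0, -2.0, 400.0, 5.9, 0.12])).astype(float)
params, sigmas = fit_spectrum(E, counts, p0=[40.0, -1.0, 300.0, 5.8, 0.2])
```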

  14. Quantification of taurine in energy drinks using ¹H NMR.

    Science.gov (United States)

    Hohmann, Monika; Felbinger, Christine; Christoph, Norbert; Wachter, Helmut; Wiest, Johannes; Holzgrabe, Ulrike

    2014-05-01

    The consumption of so-called energy drinks is increasing, especially among adolescents. These beverages commonly contain considerable amounts of the amino sulfonic acid taurine, which is related to a multitude of physiological effects. The customary method to control the legal limit of taurine in energy drinks is LC-UV/vis with postcolumn derivatization using ninhydrin. In this paper we describe the quantification of taurine in energy drinks by ¹H NMR as an alternative to existing methods of quantification. Variation of pH values revealed the separation of a distinct taurine signal in ¹H NMR spectra, which was applied for integration and quantification. Quantification was performed using external calibration (R² > 0.9999; linearity verified by Mandel's fitting test with a 95% confidence level) and PULCON. Taurine concentrations in 20 different energy drinks were analyzed by both ¹H NMR and LC-UV/vis. The deviation between ¹H NMR and LC-UV/vis results was always below the expanded measurement uncertainty of 12.2% for the LC-UV/vis method (95% confidence level) and at worst 10.4%. Due to the high accordance with LC-UV/vis data and adequate recovery rates (ranging between 97.1% and 108.2%), ¹H NMR measurement presents a suitable method to quantify taurine in energy drinks. Copyright © 2013 Elsevier B.V. All rights reserved.
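
    External calibration of this kind is a linear regression of signal integral against standard concentration, inverted for the unknown sample. A minimal sketch with hypothetical integrals (PULCON, the second approach mentioned, instead rescales against a single reference spectrum):

```python
import numpy as np

# Hypothetical external calibration: taurine standards (mmol/L) versus the
# integral of the separated taurine signal (arbitrary units)
conc_std = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
integral_std = np.array([0.98, 2.05, 3.96, 8.10, 15.90])

slope, intercept = np.polyfit(conc_std, integral_std, 1)

def taurine_concentration(integral):
    """Invert the calibration line: signal integral -> concentration."""
    return (integral - intercept) / slope

print(f"{taurine_concentration(6.3):.2f} mmol/L in the sample")
```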

  15. Initial water quantification results using neutron computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Heller, A.K. [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States)], E-mail: axh174@psu.edu; Shi, L.; Brenizer, J.S.; Mench, M.M. [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States)

    2009-06-21

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigation of water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for purposes of testing the quantification technique. Transmission images of the test sample at different angles were easily acquired through the synthesis of a dedicated image acquisition computer driving a rotary table controller and an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.

  16. Final Technical Report: Mathematical Foundations for Uncertainty Quantification in Materials Design

    Energy Technology Data Exchange (ETDEWEB)

    Plechac, Petr [Univ. of Delaware, Newark, DE (United States); Vlachos, Dionisios G. [Univ. of Delaware, Newark, DE (United States)

    2018-01-23

    We developed path-wise information theory-based and goal-oriented sensitivity analysis and parameter identification methods for complex high-dimensional dynamics and in particular of non-equilibrium extended molecular systems. The combination of these novel methodologies provided the first methods in the literature which are capable to handle UQ questions for stochastic complex systems with some or all of the following features: (a) multi-scale stochastic models such as (bio)chemical reaction networks, with a very large number of parameters, (b) spatially distributed systems such as Kinetic Monte Carlo or Langevin Dynamics, (c) non-equilibrium processes typically associated with coupled physico-chemical mechanisms, driven boundary conditions, hybrid micro-macro systems, etc. A particular computational challenge arises in simulations of multi-scale reaction networks and molecular systems. Mathematical techniques were applied to in silico prediction of novel materials with emphasis on the effect of microstructure on model uncertainty quantification (UQ). We outline acceleration methods to make calculations of real chemistry feasible followed by two complementary tasks on structure optimization and microstructure-induced UQ.

  17. An external standard method for quantification of human cytomegalovirus by PCR

    International Nuclear Information System (INIS)

    Rongsen, Shen; Liren, Ma; Fengqi, Zhou; Qingliang, Luo

    1997-01-01

    An external standard method for PCR quantification of HCMV is reported. [α-³²P]dATP was used as a tracer. The ³²P-labelled specific amplification product was separated by agarose gel electrophoresis. A gel piece containing the specific product band was excised and counted in a plastic scintillation counter. The distribution of [α-³²P]dATP in the electrophoretic gel plate and the effectiveness of the separation between the ³²P-labelled specific product and free [α-³²P]dATP were examined. A standard curve for quantification of HCMV by PCR was established and detection results for quality-control templates are presented. The external standard method and the electrophoretic separation efficiency were appraised. The results showed that the method can be used for relative quantification of HCMV. (author)

  18. Symmetries of string, M- and F-theories

    NARCIS (Netherlands)

    Bergshoeff, Eric; Proeyen, Antoine Van

    2001-01-01

    The d = 10 type II string theories, d = 11 M-theory and d = 12 F-theory have the same symmetry group. It can be viewed either as a subgroup of a conformal group OSp(1|64) or as a contraction of OSp(1|32). The theories are related by different identifications of their symmetry operators as generators

  19. Generation of structural MR images from amyloid PET: Application to MR-less quantification.

    Science.gov (United States)

    Choi, Hongyoon; Lee, Dong Soo

    2017-12-07

    Structural magnetic resonance (MR) images concomitantly acquired with PET images can provide crucial anatomical information for precise quantitative analysis. However, in the clinical setting, not all the subjects have corresponding MR. Here, we developed a model to generate structural MR images from amyloid PET using deep generative networks. We applied our model to quantification of cortical amyloid load without structural MR. Methods: We used florbetapir PET and structural MR data of Alzheimer's Disease Neuroimaging Initiative database. The generative network was trained to generate realistic structural MR images from florbetapir PET images. After the training, the model was applied to the quantification of cortical amyloid load. PET images were spatially normalized to the template space using the generated MR and then standardized uptake value ratio (SUVR) of the target regions was measured by predefined regions-of-interests. A real MR-based quantification was used as the gold standard to measure the accuracy of our approach. Other MR-less methods, a normal PET template-based, multi-atlas PET template-based and PET segmentation-based normalization/quantification methods, were also tested. We compared performance of quantification methods using generated MR with that of MR-based and MR-less quantification methods. Results: Generated MR images from florbetapir PET showed visually similar signal patterns to the real MR. The structural similarity index between real and generated MR was 0.91 ± 0.04. Mean absolute error of SUVR of cortical composite regions estimated by the generated MR-based method was 0.04±0.03, which was significantly smaller than other MR-less methods (0.29±0.12 for the normal PET-template, 0.12±0.07 for multiatlas PET-template and 0.08±0.06 for PET segmentation-based methods). Bland-Altman plots revealed that the generated MR-based SUVR quantification was the closest to the SUVR values estimated by the real MR-based method. Conclusion
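
    Once the PET volume is in template space, the regional quantification is a standardized uptake value ratio over predefined masks. A minimal sketch (the masks are hypothetical placeholders for the study's predefined regions of interest):

```python
import numpy as np

def suvr(pet, target_mask, reference_mask):
    """Standardized uptake value ratio of a target region to a reference
    region (e.g. a cortical composite versus cerebellum) for a spatially
    normalized PET volume."""
    pet = np.asarray(pet, dtype=float)
    return pet[target_mask].mean() / pet[reference_mask].mean()
```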

  20. On Kaluza-Klein theory

    International Nuclear Information System (INIS)

    Salam, A.; Strathdee, J.

    1981-10-01

    Assuming the compactification of 4+K-dimensional spacetime implied in Kaluza-Klein type theories, we consider the case in which the internal manifold is a quotient space, G/H. We develop normal mode expansions on the internal manifold and show that the conventional gravitational plus Yang-Mills theory (realizing local G symmetry) is obtained in the leading approximation. The higher terms in the expansions give rise to field theories of massive particles. In particular, for the original Kaluza-Klein 4+1-dimensional theory, the higher excitations describe massive, charged, purely spin-2 particles. These belong to infinite-dimensional representations of O(1,2). (author)

  1. The utility of quantum field theory

    International Nuclear Information System (INIS)

    Dine, Michael

    2001-01-01

    This talk surveys a broad range of applications of quantum field theory, as well as some recent developments. The stress is on the notion of effective field theories. Topics include implications of neutrino mass and a possible small value of sin(2β), supersymmetric extensions of the standard model, the use of field theory to understand fundamental issues in string theory (the problem of multiple ground states and the question: does string theory predict low energy supersymmetry), and the use of string theory to solve problems in field theory. Also considered are a new type of field theory, and indications from black hole physics and the cosmological constant problem that effective field theories may not completely describe theories of gravity. (author)

  2. On the equivalence of vacuum equations of gauge quadratic theory of gravity and general relativity theory

    International Nuclear Information System (INIS)

    Zhitnikov, V.V.; Ponomarev, V.N.

    1986-01-01

    An attempt is made to compare the solutions of the field equations corresponding to quadratic equations for the fields (g_μν, Γ^α_μν) in gauge gravitation theory (GGT) with the solutions of general relativity theory. Without restriction to a concrete type of metric, only solutions for which the torsion vanishes are considered. The equivalence of the vacuum equations of the gauge quadratic theory of gravity and general relativity theory is proved using the Newman-Penrose formalism.

  3. Perturbation Theory of Embedded Eigenvalues

    DEFF Research Database (Denmark)

    Engelmann, Matthias

    We study problems connected to perturbation theory of embedded eigenvalues in two different setups. The first part deals with second order perturbation theory of mass shells in massive translation invariant Nelson type models. To this end an expansion of the eigenvalues w.r.t. the fiber parameter up ... The second project gives a general and systematic approach to analytic perturbation theory of embedded eigenvalues. The spectral deformation technique originally developed in the theory of dilation analytic potentials in the context of Schrödinger operators is systematized by the use of Mourre theory. The group of dilations is thereby replaced by the unitary group generated by the conjugate operator. This then allows to treat the perturbation problem with the usual Kato theory.

  4. Fractional Quantum Field Theory: From Lattice to Continuum

    Directory of Open Access Journals (Sweden)

    Vasily E. Tarasov

    2014-01-01

    An approach to formulate fractional field theories on an unbounded lattice space-time is suggested. A fractional-order analog of the lattice quantum field theories is considered. Lattice analogs of the fractional-order 4-dimensional differential operators are proposed. We prove that the continuum limit of the suggested lattice field theory gives a fractional field theory for the continuum 4-dimensional space-time. The fractional field equations, which are derived from equations for lattice space-time with long-range properties of power-law type, contain Riesz-type derivatives of noninteger order with respect to the space-time coordinates.

  5. A non-supersymmetric open-string theory and S-duality

    International Nuclear Information System (INIS)

    Bergman, O.; Gaberdiel, M.R.

    1997-01-01

    A non-supersymmetric ten-dimensional open-string theory is constructed as an orbifold of type I string theory, and as an orientifold of the bosonic type B theory. It is purely bosonic, and cancellation of massless tadpoles requires the gauge group to be SO(32) x SO(32). The spectrum of the theory contains a closed-string tachyon, and open-string tachyons in the (32,32) multiplet. The D-branes of this theory are analyzed, and it is found that the massless excitations of one of the 1-branes coincide with the world-sheet degrees of freedom of the D=26 bosonic string theory compactified on the SO(32) lattice. This suggests that the two theories are related by S-duality. (orig.)

  6. The atomic theory of Walter Charleton (1620 - 1707)

    Directory of Open Access Journals (Sweden)

    Paulo Alves Porto

    1997-06-01

    Several authors in the 17th century used the atomic hypothesis to explain observable phenomena. This paper analyzes some ideas about chemical transformation proposed by the English physician Walter Charleton. In Physiologia Epicuro-Gassendo-Charltoniana (London, 1654), Charleton examined philosophical aspects of the atomic theory, and suggested that the best explanation for all natural phenomena would be in terms of atoms and their motions alone. Sometimes, however, he had to attribute to the atoms some kind of "internal virtue" to explain more complex properties of matter. His idea of "element", and the little use of experimentation and quantification, also limited the range of Charleton's theory.

  7. Potential overestimation of HPV vaccine impact due to unmasking of non-vaccine types: quantification using a multi-type mathematical model.

    Science.gov (United States)

    Choi, Yoon Hong; Chapman, Ruth; Gay, Nigel; Jit, Mark

    2012-05-14

    Estimates of human papillomavirus (HPV) vaccine impact in clinical trials and modelling studies rely on DNA tests of cytology or biopsy specimens to determine the HPV type responsible for a cervical lesion. DNA of several oncogenic HPV types may be detectable in a specimen. However, only one type may be responsible for a particular cervical lesion. Misattribution of the causal HPV type for a particular abnormality may give rise to an apparent increase in disease due to non-vaccine HPV types following vaccination ("unmasking"). To investigate the existence and magnitude of unmasking, we analysed data from residual cytology and biopsy specimens in English women aged 20-64 years old using a stochastic type-specific individual-based model of HPV infection, progression and disease. The model parameters were calibrated to data on the prevalence of HPV DNA and cytological lesion of different grades, and used to assign causal HPV types to cervical lesions. The difference between the prevalence of all disease due to non-vaccine HPV types, and disease due to non-vaccine HPV types in the absence of vaccine HPV types, was then estimated. There could be an apparent maximum increase of 3-10% in long-term cervical cancer incidence due to non-vaccine HPV types following vaccination. Unmasking may be an important phenomenon in HPV post-vaccination epidemiology, in the same way that has been observed following pneumococcal conjugate vaccination. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. AdS3 xw (S3 x S3 x S1) solutions of type IIB string theory

    International Nuclear Information System (INIS)

    Donos, Aristomenis; Gauntlett, Jerome P.; Imperial College, London; Sparks, James

    2008-10-01

    We analyse a recently constructed class of local solutions of type IIB supergravity that consist of a warped product of AdS3 with a seven-dimensional internal space. In one duality frame the only other nonvanishing fields are the NS three-form and the dilaton. We analyse in detail how these local solutions can be extended to globally well-defined solutions of type IIB string theory, with the internal space having topology S3 x S3 x S1 and with properly quantised three-form flux. We show that many of the dual (0,2) SCFTs are exactly marginal deformations of the (0,2) SCFTs whose holographic duals are warped products of AdS3 with seven-dimensional manifolds of topology S3 x S2 x T2. (orig.)

  9. MR Spectroscopy: Real-Time Quantification of in-vivo MR Spectroscopic data

    OpenAIRE

    Massé, Kunal

    2009-01-01

    In the last two decades, magnetic resonance spectroscopy (MRS) has had an increasing success in biomedical research. This technique has the faculty of discerning several metabolites in human tissue non-invasively and thus offers a multitude of medical applications. In clinical routine, quantification plays a key role in the evaluation of the different chemical elements. The quantification of metabolites characterizing specific pathologies helps physicians establish the patient's diagnosis. E...

  10. Electromagnetic scattering theory

    Science.gov (United States)

    Bird, J. F.; Farrell, R. A.

    1986-01-01

    Electromagnetic scattering theory is discussed with emphasis on the general stochastic variational principle (SVP) and its applications. The stochastic version of the Schwinger-type variational principle is presented, and explicit expressions for its integrals are considered. Results are summarized for scalar wave scattering from a classic rough-surface model and for vector wave scattering from a random dielectric-body model. Also considered are the selection of trial functions and the variational improvement of the Kirchhoff short-wave approximation appropriate to large size-parameters. Other applications of vector field theory discussed include a general vision theory and the analysis of hydromagnetism induced by ocean motion across the geomagnetic field. Levitational force-torque in the magnetic suspension of the disturbance compensation system (DISCOS), now deployed in NOVA satellites, is also analyzed using the developed theory.

  11. Relativity, symmetry and the structure of quantum theory

    CERN Document Server

    Klink, William H; Schweiger, Wolfgang

    Quantum theory is one of the most successful of all physical theories. Our everyday world is dominated by devices that function because of knowledge of the quantum world. Yet many, physicists and non-physicists alike, find the theory which explains the behavior of the quantum world baffling and strange. This book is the first in a series of three that argues that relativity and symmetry determine the structure of quantum theory. That is to say, the structure of quantum theory is what it is because of relativity and symmetry. There are different types of relativity, each leading to a particular type of quantum theory. This book deals specifically with what we call Newton relativity, the form of relativity built into Newtonian mechanics, and the quantum theory to which it gives rise, which we call Galilean (often misleadingly called non-relativistic) quantum theory. Key Features: • Meaning and significance of the term of relativity; discussion of the principle of relativity. • Relation of symmetry to relati...

  12. Superspace gauge fixing of topological Yang-Mills theories

    Energy Technology Data Exchange (ETDEWEB)

    Constantinidis, Clisthenis P; Piguet, Olivier [Universidade Federal do Espirito Santo (UFES) (Brazil); Spalenza, Wesley [Centro Brasileiro de Pesquisas Fisicas (CBPF), Rio de Janeiro (Brazil)

    2004-03-01

    We revisit the construction of topological Yang-Mills theories of the Witten type with arbitrary space-time dimension and number of "shift supersymmetry" generators, using a superspace formalism. The super-BF structure of these theories is exploited in order to determine their actions uniquely, up to the ambiguities due to the fixing of the Yang-Mills and BF gauge invariance. UV finiteness to all orders of perturbation theory is proved in a gauge of the Landau type. (orig.)

  13. Superspace gauge fixing of topological Yang-Mills theories

    International Nuclear Information System (INIS)

    Constantinidis, Clisthenis P.; Piguet, Olivier; Spalenza, Wesley

    2004-01-01

    We revisit the construction of topological Yang-Mills theories of the Witten type with arbitrary space-time dimension and number of "shift supersymmetry" generators, using a superspace formalism. The super-BF structure of these theories is exploited in order to determine their actions uniquely, up to the ambiguities due to the fixing of the Yang-Mills and BF gauge invariance. UV finiteness to all orders of perturbation theory is proved in a gauge of the Landau type. (orig.)

  14. Continuum theory of the mixed-state and surface Joule effects in type-II superconductors

    International Nuclear Information System (INIS)

    Hocquet, T.; Mathieu, P.; Simon, Y.

    1992-01-01

    A phenomenological theory of vortex motion, where the mixed state is regarded as a continuum, has been proposed by two of the authors in a short previous letter. Its outlines are recalled in this paper with further comments and arguments; in particular the basic equations and their implications are discussed at some length. This theory leads to a model of pinning, from which we argue that critical currents Ic, in soft type-II samples of standard bulk homogeneity, should be governed essentially by surface defects. Ic is interpreted as a physically well-defined part of the total transport current I, which is flowing over a small depth close to the surface. Thus, on the scale of an ordinary sample, this part of the transport current is superficial, the remaining part I-Ic being uniformly distributed over the cross section. Coherently, an analysis of the dissipation in such samples predicts that the part VIc of the total Joule effect VI must arise as surface heat sources, while the Joule effect V(I-Ic), usually associated with the steady viscous flow of vortices, is uniformly distributed in the bulk. As a proof, we present a method, using second-sound acoustics, to detect and separate surface and volume heat sources. Experimental results give clear evidence of a surface Joule effect, and support the validity of our model of surface pinning in soft materials.

  15. Quantification of within-sample genetic heterogeneity from SNP-array data

    DEFF Research Database (Denmark)

    Martinez, Pierre; Kimberley, Christopher; Birkbak, Nicolai Juul

    2017-01-01

    Intra-tumour genetic heterogeneity (ITH) fosters drug resistance and is a critical hurdle to clinical treatment. ITH can be well-measured using multi-region sampling but this is costly and challenging to implement. There is therefore a need for tools to estimate ITH in individual samples, using...... standard genomic data such as SNP-arrays, that could be implemented routinely. We designed two novel scores S and R, respectively based on the Shannon diversity index and Ripley's L statistic of spatial homogeneity, to quantify ITH in single SNP-array samples. We created in-silico and in-vitro mixtures...... sequencing data but heterogeneity in the fraction of tumour cells present across samples hampered accurate quantification. The prognostic potential of both scores was moderate but significantly predictive of survival in several tumour types (corrected p = 0.03). Our work thus shows how individual SNP...
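
    The S score mentioned above builds on the Shannon diversity index. The sketch below shows the basic idea on SNP-array B-allele frequencies binned into allelic states; the binning scheme and the toy data are hypothetical simplifications of the published score.

```python
import numpy as np

def shannon_diversity_score(baf, n_bins=20):
    """Shannon entropy of binned B-allele frequencies: higher values
    indicate a more heterogeneous mixture of allelic states."""
    counts, _ = np.histogram(np.asarray(baf, dtype=float),
                             bins=n_bins, range=(0.0, 1.0))
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())

# A clonal (homogeneous) sample versus a two-population mixture:
rng = np.random.default_rng(3)
clonal = rng.normal(0.5, 0.02, 5000).clip(0.0, 1.0)
mixed = np.concatenate([rng.normal(0.3, 0.02, 2500),
                        rng.normal(0.6, 0.02, 2500)]).clip(0.0, 1.0)
print(shannon_diversity_score(clonal) < shannon_diversity_score(mixed))  # True
```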

  16. Introduction to coding and information theory

    CERN Document Server

    Roman, Steven

    1997-01-01

    This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and such specific codes as Hamming codes, the simplex codes, and many others.

  17. Uncertainty Quantification for Large-Scale Ice Sheet Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [Univ. of Texas, Austin, TX (United States)

    2016-02-05

    This report summarizes our work to develop advanced forward and inverse solvers and uncertainty quantification capabilities for a nonlinear 3D full Stokes continental-scale ice sheet flow model. The components include: (1) forward solver: a new state-of-the-art parallel adaptive scalable high-order-accurate mass-conservative Newton-based 3D nonlinear full Stokes ice sheet flow simulator; (2) inverse solver: a new adjoint-based inexact Newton method for solution of deterministic inverse problems governed by the above 3D nonlinear full Stokes ice flow model; and (3) uncertainty quantification: a novel Hessian-based Bayesian method for quantifying uncertainties in the inverse ice sheet flow solution and propagating them forward into predictions of quantities of interest such as ice mass flux to the ocean.

  18. Swift Quantification of Fenofibrate and Tiemonium methylsulfate Active Ingredients in Solid Drugs Using Particle Induced X-Ray Emission

    International Nuclear Information System (INIS)

    Bejjani, A.; Nsouli, B.; Zahraman, K.; Assi, S.; Younes, Gh.; Yazbi, F.

    2011-01-01

    The quantification of active ingredients (AI) in drugs is a crucial step in the drug quality-control process. It is usually performed with wet chemical techniques such as LC-MS, UV spectrophotometry, and other appropriate organic analytical methods. However, if the active ingredient contains specific heteroatoms (F, S, Cl), elemental IBA techniques such as PIXE and PIGE, using a small 1-2 MV tandem accelerator, can be explored for molecular quantification. IBA techniques permit analysis of the sample in solid form, without laborious sample preparation. In this work, we demonstrate the ability of the thick-target PIXE technique to quantify rapidly and accurately both low and high concentrations of active ingredients in different commercial drugs. Fenofibrate, a chlorinated active ingredient present in high amounts in two different commercial drugs, was quantified by the relative approach against an external standard. Tiemonium methylsulfate, which is present in relatively low amounts in commercial drugs, was quantified absolutely using the GUPIX simulation code. The experimental aspects related to the validity of the quantification (use of external standards, absolute quantification, matrix effects, ...) are presented and discussed. (author)
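
    As a hedged sketch of the relative approach mentioned above: assuming identical beam and detection geometry for sample and standard, and a comparable matrix, the analyte concentration scales linearly with the ratio of characteristic X-ray yields. All names and numbers below are illustrative, not values from the paper.

```python
def relative_concentration(y_sample, y_standard, c_standard):
    """Relative PIXE quantification against an external standard.

    y_sample, y_standard: characteristic X-ray yields (e.g. Cl K-alpha
    counts per unit charge) for the drug and the standard; c_standard:
    known analyte concentration of the standard. Matrix effects are
    assumed negligible here, an assumption the paper discusses.
    """
    return c_standard * (y_sample / y_standard)

# Hypothetical yields and a standard of known 12.0 % chlorine content.
print(relative_concentration(8400.0, 9600.0, 12.0))  # -> 10.5 %
```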

  19. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.

    2011-09-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework calls for a manufactured reality from which experimental data are created (possibly with experimental error), an imperfect model (with uncertain inputs) from which simulation results are created (possibly with numerical error), the application of a system for quantifying uncertainties in model predictions, and an assessment of how accurately those uncertainties are quantified. The application presented in this paper manufactures a particle-transport universe, models it using diffusion theory with uncertain material parameters, and applies both Gaussian process and Bayesian MARS algorithms to make quantitative predictions about new experiments within the manufactured reality. The results of this preliminary study indicate that, even in a simple problem, the improper application of a specific UQ method or unrealized effects of a modeling assumption may produce inaccurate predictions. We conclude that the validation framework presented in this paper is a powerful and flexible tool for the investigation and understanding of UQ methodologies. © 2011 Elsevier Ltd. All rights reserved.
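
    To make the four-step framework concrete, the following is a deliberately toy sketch, our own example and far simpler than the paper's particle-transport universe and its Gaussian-process / Bayesian MARS machinery.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1. Manufactured "reality": a known truth from which noisy
#    experimental data are drawn.
truth = lambda x: np.exp(-0.8 * x)
x_exp = np.linspace(0.0, 3.0, 8)
data = truth(x_exp) + rng.normal(0.0, 0.01, x_exp.size)

# 2. Imperfect model with one uncertain input k (deliberately the
#    wrong functional form, mimicking model discrepancy).
model = lambda x, k: 1.0 / (1.0 + k * x)

# 3. Quantify uncertainty in k: naive rejection sampling from a
#    uniform prior, a crude stand-in for the paper's UQ methods.
prior = rng.uniform(0.2, 2.0, 20000)
misfit = np.array([np.sum((model(x_exp, k) - data) ** 2) for k in prior])
posterior = prior[misfit < np.quantile(misfit, 0.01)]

# 4. Assess: does the 95% predictive band cover the truth at a new
#    "experiment"? With a structurally wrong model it may not.
x_new = 2.5
lo, hi = np.percentile(model(x_new, posterior), [2.5, 97.5])
print(f"truth={truth(x_new):.3f}, 95% band=({lo:.3f}, {hi:.3f})")
```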

  20. Noninvasive Quantification of Pancreatic Fat in Humans

    OpenAIRE

    Lingvay, Ildiko; Esser, Victoria; Legendre, Jaime L.; Price, Angela L.; Wertz, Kristen M.; Adams-Huet, Beverley; Zhang, Song; Unger, Roger H.; Szczepaniak, Lidia S.

    2009-01-01

    Objective: To validate magnetic resonance spectroscopy (MRS) as a tool for non-invasive quantification of pancreatic triglyceride (TG) content and to measure the pancreatic TG content in a diverse human population with a wide range of body mass index (BMI) and glucose control.