WorldWideScience

Sample records for quantification theory type

  1. Quantum Theory without Quantification

    OpenAIRE

    Piron, Constantin

    2002-01-01

    After having explained Samuel Clarke's conception of the new philosophy of physical reality, we will treat the electron field in this context as a field modifying the void. From this we will be able to derive the so-called quantum rules just from Noether's theorem on conserved currents. Thus quantum theory appears as a kind of nonlocal field theory, in fact a new theory.

  2. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  3. Recurrence quantification analysis theory and best practices

    CERN Document Server

    Webber, Charles L., Jr.; Marwan, Norbert

    2015-01-01

    The analysis of recurrences in dynamical systems by using recurrence plots and their quantification is still an emerging field. Over the past decades recurrence plots have proven to be valuable data visualization and analysis tools in the theoretical study of complex, time-varying dynamical systems as well as in various applications in biology, neuroscience, kinesiology, psychology, physiology, engineering, physics, geosciences, linguistics, finance, economics, and other disciplines. This multi-authored book intends to comprehensively introduce and showcase recent advances as well as established best practices concerning both theoretical and practical aspects of recurrence plot based analysis. Edited and authored by leading researchers in the field, the various chapters address an interdisciplinary readership, ranging from theoretical physicists to application-oriented scientists in all data-providing disciplines.
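As a minimal illustration of the machinery this book covers (this sketch is not taken from the book itself), the following Python snippet builds a recurrence matrix for a scalar time series and computes the recurrence rate, one of the standard quantification measures derived from recurrence plots. The threshold `eps` and the toy sine series are arbitrary choices for illustration.

```python
import math

# A point pair (i, j) "recurs" when |x_i - x_j| falls below a threshold eps.
def recurrence_matrix(x, eps):
    n = len(x)
    return [[1 if abs(x[i] - x[j]) < eps else 0 for j in range(n)]
            for i in range(n)]

# Recurrence rate: fraction of recurrent points in the n x n matrix.
def recurrence_rate(R):
    n = len(R)
    return sum(sum(row) for row in R) / (n * n)

# Toy series: a noise-free periodic signal recurs strongly.
x = [math.sin(0.4 * t) for t in range(100)]
R = recurrence_matrix(x, eps=0.1)
print(f"recurrence rate: {recurrence_rate(R):.3f}")
```

The diagonal is always recurrent (each point trivially recurs with itself), so the recurrence rate of any series is strictly positive; structure beyond the diagonal is what the quantification measures pick up.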

  4. On Irrelevance and Algorithmic Equality in Predicative Type Theory

    CERN Document Server

    Abel, Andreas

    2012-01-01

    Dependently typed programs contain an excessive amount of static terms which are necessary to please the type checker but irrelevant for computation. To separate static and dynamic code, several static analyses and type systems have been put forward. We consider Pfenning's type theory with irrelevant quantification which is compatible with a type-based notion of equality that respects eta-laws. We extend Pfenning's theory to universes and large eliminations and develop its meta-theory. Subject reduction, normalization and consistency are obtained by a Kripke model over the typed equality judgement. Finally, a type-directed equality algorithm is described whose completeness is proven by a second Kripke model.

  5. An overview of type theories

    OpenAIRE

    Guallart, Nino

    2014-01-01

    Pure type systems arise as a generalisation of simply typed lambda calculus. The contemporary development of mathematics has renewed the interest in type theories, as they are not just the object of mere historical research, but have an active role in the development of computational science and core mathematics. It is worth exploring some of them in depth, particularly predicative Martin-Löf's intuitionistic type theory and impredicative Coquand's calculus of construction...

  6. A "Toy" Model for Operational Risk Quantification using Credibility Theory

    OpenAIRE

    Bühlmann, Hans; Shevchenko, Pavel V.; Wüthrich, Mario V.

    2009-01-01

    To meet the Basel II regulatory requirements for the Advanced Measurement Approaches in operational risk, the bank's internal model should make use of the internal data, relevant external data, scenario analysis and factors reflecting the business environment and internal control systems. One of the unresolved challenges in operational risk is combining these data sources appropriately. In this paper we focus on quantification of the low frequency high impact losses excee...

  7. Causality in Time Series: Its Detection and Quantification by Means of Information Theory.

    Czech Academy of Sciences Publication Activity Database

    Hlaváčková-Schindler, Kateřina

    New York : Springer, 2008 - (Emmert-Streib, F.; Dehmer, M.), s. 183-207 ISBN 978-0-387-84815-0. - (Computer Science) R&D Projects: GA MŠk 2C06001 Institutional research plan: CEZ:AV0Z10750506 Keywords : causality * time series * information theory Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2009/AS/schindler-causality in time series its detection and quantification by means of information theory.pdf

  8. Standard Error Computations for Uncertainty Quantification in Inverse Problems: Asymptotic Theory vs. Bootstrapping.

    Science.gov (United States)

    Banks, H T; Holm, Kathleen; Robbins, Danielle

    2010-11-01

    We computationally investigate two approaches for uncertainty quantification in inverse problems for nonlinear parameter dependent dynamical systems. We compare the bootstrapping and asymptotic theory approaches for problems involving data with several noise forms and levels. We consider both constant variance absolute error data and relative error which produces non-constant variance data in our parameter estimation formulations. We compare and contrast parameter estimates, standard errors, confidence intervals, and computational times for both bootstrapping and asymptotic theory methods. PMID:20835347
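To illustrate the two approaches this paper compares (on a far simpler estimator than the paper's nonlinear dynamical-system parameters), the sketch below computes an asymptotic-theory standard error and a bootstrap standard error for a sample mean. The data and seed are invented for the example.

```python
import random
import statistics

random.seed(0)
data = [random.gauss(5.0, 2.0) for _ in range(200)]

# Asymptotic-theory SE of the sample mean: s / sqrt(n)
se_asymptotic = statistics.stdev(data) / len(data) ** 0.5

# Bootstrap SE: spread of the estimator over resampled data sets
boot_means = []
for _ in range(1000):
    resample = random.choices(data, k=len(data))  # sample with replacement
    boot_means.append(statistics.fmean(resample))
se_bootstrap = statistics.stdev(boot_means)

print(f"asymptotic SE: {se_asymptotic:.4f}")
print(f"bootstrap  SE: {se_bootstrap:.4f}")
```

For well-behaved estimators like this one the two numbers agree closely; the paper's point is to compare them (in estimates, confidence intervals, and computational cost) in settings where the asymptotic assumptions are strained by noise form and level.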

  9. Transcriptional regulatory network refinement and quantification through kinetic modeling, gene expression microarray data and information theory

    OpenAIRE

    Tuncay, Kagan; Sayyed-Ahmad, Abdallah; Ortoleva, Peter J

    2007-01-01

    Abstract Background Gene expression microarray and other multiplex data hold promise for addressing the challenges of cellular complexity, refined diagnoses and the discovery of well-targeted treatments. A new approach to the construction and quantification of transcriptional regulatory networks (TRNs) is presented that integrates gene expression microarray data and cell modeling through information theory. Given a partial TRN and time series data, a probability density is constructed that is...

  10. Standard Error Computations for Uncertainty Quantification in Inverse Problems: Asymptotic Theory vs. Bootstrapping

    OpenAIRE

    Banks, H. T.; Holm, Kathleen; Robbins, Danielle

    2010-01-01

    We computationally investigate two approaches for uncertainty quantification in inverse problems for nonlinear parameter dependent dynamical systems. We compare the bootstrapping and asymptotic theory approaches for problems involving data with several noise forms and levels. We consider both constant variance absolute error data and relative error which produces non-constant variance data in our parameter estimation formulations. We compare and contrast parameter estimates, standard errors, ...

  11. Uncertainty Quantification and Propagation in Nuclear Density Functional Theory

    CERN Document Server

    Schunck, N; Higdon, D; Sarich, J; Wild, S M

    2015-01-01

    Nuclear density functional theory (DFT) is one of the main theoretical tools used to study the properties of heavy and superheavy elements, or to describe the structure of nuclei far from stability. While on-going efforts seek to better root nuclear DFT in the theory of nuclear forces [see Duguet et al., this issue], energy functionals remain semi-phenomenological constructions that depend on a set of parameters adjusted to experimental data in finite nuclei. In this paper, we review recent efforts to quantify the related uncertainties, and propagate them to model predictions. In particular, we cover the topics of parameter estimation for inverse problems, statistical analysis of model uncertainties and Bayesian inference methods. Illustrative examples are taken from the literature.

  12. Transcriptional regulatory network refinement and quantification through kinetic modeling, gene expression microarray data and information theory

    Directory of Open Access Journals (Sweden)

    Tuncay Kagan

    2007-01-01

    Full Text Available Abstract Background Gene expression microarray and other multiplex data hold promise for addressing the challenges of cellular complexity, refined diagnoses and the discovery of well-targeted treatments. A new approach to the construction and quantification of transcriptional regulatory networks (TRNs) is presented that integrates gene expression microarray data and cell modeling through information theory. Given a partial TRN and time series data, a probability density is constructed that is a functional of the time course of transcription factor (TF) thermodynamic activities at the site of gene control, and is a function of mRNA degradation and transcription rate coefficients, and equilibrium constants for TF/gene binding. Results Our approach yields more physicochemical information that complements the results of network structure delineation methods, and thereby can serve as an element of a comprehensive TRN discovery/quantification system. The most probable TF time courses and values of the aforementioned parameters are obtained by maximizing the probability obtained through entropy maximization. Observed time delays between mRNA expression and activity are accounted for implicitly since the time course of the activity of a TF is coupled by probability functional maximization, and is not assumed to be proportional to the expression level of the mRNA type that translates into the TF. This allows one to investigate post-translational and TF activation mechanisms of gene regulation. Accuracy and robustness of the method are evaluated. A kinetic formulation is used to facilitate the analysis of phenomena with a strongly dynamical character, while a physically-motivated regularization of the TF time course is found to overcome difficulties due to omnipresent noise and data sparsity that plague other methods of gene expression data analysis. An application to Escherichia coli is presented.
Conclusion Multiplex time series data can be used for the construction of the network of cellular processes and the calibration of the associated physicochemical parameters. We have demonstrated these concepts in the context of gene regulation understood through the analysis of gene expression microarray time series data. Casting the approach in a probabilistic framework has allowed us to address the uncertainties in gene expression microarray data. Our approach was found to be robust to error in the gene expression microarray data and mistakes in a proposed TRN.

  13. Some Properties of Type I' String Theory

    OpenAIRE

    Schwarz, John H.

    1999-01-01

    The T-dual formulation of Type I superstring theory, sometimes called Type I' theory, has a number of interesting features. Here we review some of them including the role of D0-branes and D8-branes in controlling possible gauge symmetry enhancement.

  14. Completeness in Hybrid Type Theory

    DEFF Research Database (Denmark)

    Areces, Carlos; Blackburn, Patrick Rowan; Huertas, Antonia; Manzano, Maria

    2014-01-01

    We show that basic hybridization (adding nominals and @ operators) makes it possible to give straightforward Henkin-style completeness proofs even when the modal logic being hybridized is higher-order. The key ideas are to add nominals as expressions of type t, and to extend to arbitrary types the way we interpret @i in propositional and first-order hybrid logic. This means: interpret @i αa, where αa is an expression of any type a, as an expression of type a that rigidly returns the value that ...

  15. Completeness in Hybrid Type Theory

    DEFF Research Database (Denmark)

    Areces, Carlos; Blackburn, Patrick Rowan

    2014-01-01

    We show that basic hybridization (adding nominals and @ operators) makes it possible to give straightforward Henkin-style completeness proofs even when the modal logic being hybridized is higher-order. The key ideas are to add nominals as expressions of type t, and to extend to arbitrary types the way we interpret @i in propositional and first-order hybrid logic. This means: interpret @i αa, where αa is an expression of any type a, as an expression of type a that rigidly returns the value that αa receives at the i-world. The axiomatization and completeness proofs are generalizations of those found in propositional and first-order hybrid logic, and (as is usual in hybrid logic) we automatically obtain a wide range of completeness results for stronger logics and languages. Our approach is deliberately low-tech. We don't, for example, make use of Montague's intensional type s, or Fitting-style intensional models; we build, as simply as we can, hybrid logic over Henkin's logic.

  16. NS Branes in Type I Theory

    CERN Document Server

    Blum, J D

    1998-01-01

    We consider novel nonperturbative effects of type I theories compactified on singular ALE spaces obtained by adding NS branes. Such effects include a description of small $E_8$ instantons at singularities.

  17. Applicability of Information Theory to the Quantification of Responses to Anthropogenic Noise by Southeast Alaskan Humpback Whales

    Directory of Open Access Journals (Sweden)

    J. Ellen Blue

    2008-05-01

    Full Text Available We assess the effectiveness of applying information theory to the characterization and quantification of the effects of anthropogenic vessel noise on humpback whale (Megaptera novaeangliae) vocal behavior in and around Glacier Bay, Alaska. Vessel noise has the potential to interfere with the complex vocal behavior of these humpback whales, which could have direct consequences for their feeding behavior and thus ultimately for their health and reproduction. Humpback whale feeding calls recorded during conditions of high vessel-generated noise and lower levels of background noise are compared for differences in acoustic structure, use, and organization using information-theoretic measures. We apply information theory in a self-referential manner (i.e., orders of entropy) to quantify the changes in signaling behavior. We then compare this with the reduction in channel capacity due to noise in Glacier Bay itself, treating it as a (Gaussian) noisy channel. We find that high vessel noise is associated with an increase in the rate and repetitiveness of sequential use of feeding call types in our averaged sample of humpback whale vocalizations, indicating that vessel noise may be modifying the patterns of use of feeding calls by the endangered humpback whales in Southeast Alaska. The information-theoretic approach suggested herein can provide a reliable quantitative measure of such relationships and may also be adapted for wider application to many species where environmental noise is thought to be a problem.
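The "orders of entropy" idea can be illustrated with a small Python sketch (the call sequences below are invented toy data, not whale recordings): first-order entropy measures the diversity of call types used, while higher-order (here, bigram) entropy captures how repetitive or structured their sequential use is.

```python
import math
from collections import Counter

def entropy(symbols):
    """First-order Shannon entropy H = -sum p*log2(p) over symbol frequencies."""
    counts = Counter(symbols)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def bigram_entropy(symbols):
    """Entropy over adjacent pairs; repetitive sequencing lowers this value."""
    pairs = list(zip(symbols, symbols[1:]))
    return entropy(pairs)

varied     = list("ABCDABDCACBDBACD")   # diverse use of four call types
repetitive = list("ABABABABABABABAB")   # repetitive sequential use

print(entropy(varied), entropy(repetitive))
print(bigram_entropy(varied), bigram_entropy(repetitive))
```

A shift toward the repetitive pattern under vessel noise would show up as a drop in these entropy measures, which is the kind of quantitative signature the paper's information-theoretic comparison is built on.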

  18. Uncertainty Quantification for Nuclear Density Functional Theory and Information Content of New Measurements

    CERN Document Server

    McDonnell, J D; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-01-01

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models; to estimate model errors and thereby improve predictive capability; to extrapolate beyond the regions reached by experiment; and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, w...

  19. Predictions for orientifold field theories from type 0' string theory

    CERN Document Server

    Armoni, A

    2005-01-01

    Two predictions about finite-N non-supersymmetric "orientifold field theories" are made by using the dual type 0' string theory on C^3 / Z_2 x Z_2 orbifold singularity. First, the mass ratio between the lowest pseudoscalar and scalar color-singlets is estimated to be equal to the ratio between the axial anomaly and the scale anomaly at strong coupling, M_- / M_+ ~ C_- / C_+. Second, the ratio between the domain wall tension and the value of the quark condensate is computed.

  20. Uncertainty Quantification for Nuclear Density Functional Theory and Information Content of New Measurements

    Science.gov (United States)

    McDonnell, J. D.; Schunck, N.; Higdon, D.; Sarich, J.; Wild, S. M.; Nazarewicz, W.

    2015-03-01

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  1. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    Science.gov (United States)

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method. PMID:25860736

  2. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    Energy Technology Data Exchange (ETDEWEB)

    McDonnell, J. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schunck, N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Higdon, D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sarich, J. [Argonne National Lab. (ANL), Argonne, IL (United States); Wild, S. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Nazarewicz, W. [Michigan State Univ., East Lansing, MI (United States); Oak Ridge National Lab., Oak Ridge, TN (United States); Univ. of Warsaw, Warsaw (Poland)

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  3. Multi-level Contextual Type Theory

    OpenAIRE

    Mathieu Boespflug; Brigitte Pientka

    2011-01-01

    Contextual type theory distinguishes between bound variables and meta-variables to write potentially incomplete terms in the presence of binders. It has found good use as a framework for concise explanations of higher-order unification, for characterizing holes in proofs, and for developing a foundation for programming with higher-order abstract syntax, as embodied by the programming and reasoning environment Beluga. However, to reason about these applications, we need to introduce...

  4. Applications of Jungian Type Theory to Counselor Education.

    Science.gov (United States)

    Dilley, Josiah S.

    1987-01-01

    Describes Carl Jung's theory of psychological type and the Myers-Briggs Type Indicator (MBTI), an instrument to assess Jungian type. Cites sources of information on the research and application of the theory and the MBTI. Explores how knowledge of type theory can be useful to counselor educators. (Author)

  5. Multi-level Contextual Type Theory

    Directory of Open Access Journals (Sweden)

    Mathieu Boespflug

    2011-10-01

    Full Text Available Contextual type theory distinguishes between bound variables and meta-variables to write potentially incomplete terms in the presence of binders. It has found good use as a framework for concise explanations of higher-order unification, for characterizing holes in proofs, and for developing a foundation for programming with higher-order abstract syntax, as embodied by the programming and reasoning environment Beluga. However, to reason about these applications, we need to introduce meta^2-variables to characterize the dependency on meta-variables and bound variables. In other words, we must go beyond a two-level system granting only bound variables and meta-variables. In this paper we generalize contextual type theory to n levels for arbitrary n, so as to obtain a formal system offering bound variables, meta-variables and so on all the way to meta^n-variables. We obtain a uniform account by collapsing all these different kinds of variables into a single notion of variable indexed by some level k. We give a decidable bi-directional type system which characterizes beta-eta-normal forms together with a generalized substitution operation.

  6. Multi-level Contextual Type Theory

    CERN Document Server

    Boespflug, Mathieu; Pientka, Brigitte (DOI: 10.4204/EPTCS.71.3)

    2011-01-01

    Contextual type theory distinguishes between bound variables and meta-variables to write potentially incomplete terms in the presence of binders. It has found good use as a framework for concise explanations of higher-order unification, for characterizing holes in proofs, and for developing a foundation for programming with higher-order abstract syntax, as embodied by the programming and reasoning environment Beluga. However, to reason about these applications, we need to introduce meta^2-variables to characterize the dependency on meta-variables and bound variables. In other words, we must go beyond a two-level system granting only bound variables and meta-variables. In this paper we generalize contextual type theory to n levels for arbitrary n, so as to obtain a formal system offering bound variables, meta-variables and so on all the way to meta^n-variables. We obtain a uniform account by collapsing all these different kinds of variables into a single notion of variable indexed by some level k. We give a decidable ...

  7. Determination and quantification of collagen types by LC-MS/MS and CE-MS/MS.

    Czech Academy of Sciences Publication Activity Database

    Mikšík, Ivan; Pataridis, Statis; Eckhardt, Adam; Lacinová, Kateřina; Sedláková, Pavla

    Freiberg : Forschungsinstitut für Leder und Kunststoffbahnen (FILK)gGmbH, 2012, s. 131-141. ISBN 978-3-00-039421-8. [Freiberg Collagen Symposium /5./. Freiberg (DE), 04.09.2012-05.09.2012] R&D Projects: GA ČR(CZ) GA203/08/1428; GA ČR(CZ) GAP206/12/0453 Institutional research plan: CEZ:AV0Z50110509 Institutional support: RVO:67985823 Keywords : collagen * protein quantification Subject RIV: CB - Analytical Chemistry, Separation

  8. Uncertainty Quantification of Composite Laminate Damage with the Generalized Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    J. Lucero; F. Hemez; T. Ross; K. Kline; J. Hundhausen; T. Tippetts

    2006-05-01

    This work presents a survey of five theories to assess the uncertainty of projectile impact induced damage on multi-layered carbon-epoxy composite plates. Because the types of uncertainty dealt with in this application are multiple (variability, ambiguity, and conflict) and because the data sets collected are sparse, characterizing the amount of delamination damage with probability theory alone is possible but incomplete. This motivates the exploration of methods contained within a broad Generalized Information Theory (GIT) that rely on less restrictive assumptions than probability theory. Probability, fuzzy sets, possibility, and imprecise probability (probability boxes (p-boxes) and Dempster-Shafer) are used to assess the uncertainty in composite plate damage. Furthermore, this work highlights the usefulness of each theory. The purpose of the study is not to compare directly the different GIT methods but to show that they can be deployed on a practical application and to compare the assumptions upon which these theories are based. The data sets consist of experimental measurements and finite element predictions of the amount of delamination and fiber splitting damage as multilayered composite plates are impacted by a projectile at various velocities. The physical experiments consist of using a gas gun to impact suspended plates with a projectile accelerated to prescribed velocities, then, taking ultrasound images of the resulting delamination. The nonlinear, multiple length-scale numerical simulations couple local crack propagation implemented through cohesive zone modeling to global stress-displacement finite element analysis. The assessment of damage uncertainty is performed in three steps by, first, considering the test data only; then, considering the simulation data only; finally, performing an assessment of total uncertainty where test and simulation data sets are combined. 
This study leads to practical recommendations for reducing the uncertainty and improving the prediction accuracy of the damage modeling and finite element simulation.

  9. Evidence theory and differential evolution based uncertainty quantification for buckling load of semi-rigid jointed frames

    Indian Academy of Sciences (India)

    Hesheng Tang; Yu Su; Jiao Wang

    2015-08-01

    The paper describes a procedure for the uncertainty quantification (UQ) using evidence theory in buckling analysis of semi-rigid jointed frame structures under mixed epistemic–aleatory uncertainty. The design uncertainties (geometrical, material, strength, and manufacturing) are often prevalent in engineering applications. Due to lack of knowledge or incomplete, inaccurate, unclear information in the modeling, simulation, measurement, and design, there are limitations in using only one framework (probability theory) to quantify uncertainty in a system because of the impreciseness of data or knowledge. Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. Unfortunately, propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than propagation of a probabilistic representation for uncertainty. In order to alleviate the computational difficulties in the evidence theory based UQ analysis, a differential evolution-based computational strategy for propagation of epistemic uncertainty in a system with evidence theory is presented here. A UQ analysis for the buckling load of steel-plane frames with semi-rigid connections is given herein to demonstrate accuracy and efficiency of the proposed method.
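The core evidence-theory representation the paper relies on can be sketched in a few lines of Python. The example below combines two bodies of evidence about a structural state by Dempster's rule of combination; the frame of discernment and the mass values are invented for illustration and are not taken from the paper.

```python
from itertools import product

def combine(m1, m2):
    """Combine two mass functions (dicts: frozenset -> mass) by Dempster's rule."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to contradictory evidence
    k = 1.0 - conflict           # normalize by the non-conflicting mass
    return {s: w / k for s, w in combined.items()}

SAFE, FAIL = frozenset({"safe"}), frozenset({"fail"})
EITHER = SAFE | FAIL             # ignorance: mass on the whole frame
m1 = {SAFE: 0.6, EITHER: 0.4}               # source 1: mostly "safe"
m2 = {SAFE: 0.3, FAIL: 0.5, EITHER: 0.2}    # source 2: leans "fail"
m = combine(m1, m2)
print({tuple(sorted(s)): round(w, 3) for s, w in m.items()})
```

Unlike a probability distribution, mass can sit on the set {safe, fail} itself, which is how evidence theory represents the epistemic ignorance the paper describes; the differential-evolution strategy then addresses the cost of propagating such structures through a structural model.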

  10. Three-dimensional topological quantum field theory of Witten type

    OpenAIRE

    Bakalarska, Malgorzata; Broda, Boguslaw

    1999-01-01

    Description of two three-dimensional topological quantum field theories of Witten type as twisted supersymmetric theories is presented. Low-energy effective action and a corresponding topological invariant of three-dimensional manifolds are considered.

  11. Three-dimensional topological quantum field theory of Witten type

    CERN Document Server

    Bakalarska, M; Bakalarska, Malgorzata; Broda, Boguslaw

    1998-01-01

    Description of two three-dimensional topological quantum field theories of Witten type as twisted supersymmetric theories is presented. Low-energy effective action and a corresponding topological invariant of three-dimensional manifolds are considered.

  12. Type Arithmetics: Computation based on the theory of types

    OpenAIRE

    Kiselyov, Oleg

    2001-01-01

    The present paper shows meta-programming turning into programming, with a type system rich enough to express arbitrary arithmetic computations. We demonstrate a type system that implements Peano arithmetic, slightly generalized to negative numbers. Certain types in this system denote numerals. Arithmetic operations on such type-numerals - addition, subtraction, and even division - are expressed as type reduction rules executed by a compiler. A remarkable trait is that division by zero becom...
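The paper performs this arithmetic at the level of types, with the compiler's type reducer as the evaluator. As a value-level analogy only (the tuple encoding here is illustrative, not the paper's construction), the Peano numerals and addition by structural recursion look like this:

```python
# Value-level sketch of Peano numerals; the paper encodes these as *types*,
# with arithmetic performed by the compiler during type reduction.
Zero = ()                       # illustrative encoding: () stands for zero
def succ(n):                    # successor wraps its predecessor
    return (n,)

def add(m, n):
    # add(0, n) = n ; add(succ(m), n) = succ(add(m, n))
    return n if m == Zero else succ(add(m[0], n))

def to_int(n):
    return 0 if n == Zero else 1 + to_int(n[0])

two = succ(succ(Zero))
three = succ(succ(succ(Zero)))
print(to_int(add(two, three)))  # → 5
```

In the type-level version, each reduction step of `add` is a type rewrite carried out at compile time, so the "program" runs before any code is executed.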

  13. Numerical Domain Wall Type Solutions in phi**4 Theory

    OpenAIRE

    Karkowski, J.; Swierczynski, Z.

    1996-01-01

    The well-known domain wall type solutions are nowadays of great physical interest in classical field theory. These solutions can mostly be found only approximately. Recently the Hilbert-Chapman-Enskog method was successfully applied to obtain solutions of this type in phi**4 theory. The goal of the present paper is to verify these perturbative results by numerical computations.
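For reference, under the standard quartic potential convention (the paper's normalization may differ), the static domain wall of phi**4 theory is known exactly, and perturbative or numerical results can be checked against it:

```latex
% For V(\phi) = \frac{\lambda}{4}\,(\phi^2 - \eta^2)^2, the static kink is
\phi(x) = \eta \tanh\!\left(\eta\sqrt{\tfrac{\lambda}{2}}\,(x - x_0)\right),
% which satisfies \phi'' = \lambda\,\phi\,(\phi^2 - \eta^2)
% and interpolates between the two vacua \phi = \pm\eta.
```

Substituting the tanh profile into the field equation reproduces it term by term, since \(\phi'' = -2\eta k^2 \operatorname{sech}^2(kx)\tanh(kx)\) with \(k^2 = \lambda\eta^2/2\).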

  14. Toward a Theory of Psychological Type Congruence for Advertisers.

    Science.gov (United States)

    McBride, Michael H.; And Others

    Focusing on the impact of advertisers' persuasive selling messages on consumers, this paper discusses topics relating to the theory of psychological type congruence. Based on an examination of persuasion theory and relevant psychological concepts, including recent cognitive stability and personality and needs theory and the older concept of…

  15. Closed tachyon solitons in type II string theory

    CERN Document Server

    García-Etxebarria, Iñaki; Uranga, Angel M

    2015-01-01

    Type II theories can be described as the endpoint of closed string tachyon condensation in certain orbifolds of supercritical type 0 theories. In this paper, we study solitons of this closed string tachyon and analyze the nature of the resulting defects in critical type II theories. The solitons are classified by the real K-theory groups KO of bundles associated to pairs of supercritical dimensions. For real codimension 4 and 8, corresponding to $KO({\bf S}^4)={\bf Z}$ and $KO({\bf S}^8)={\bf Z}$, the defects correspond to a gravitational instanton and a fundamental string, respectively. We apply these ideas to reinterpret the worldsheet GLSM, regarded as a supercritical theory on the ambient toric space with closed tachyon condensation onto the CY hypersurface, and use it to describe charged solitons under discrete isometries. We also suggest the possible applications of supercritical strings to the physical interpretation of the matrix factorization description of F-theory on singular spaces.

  16. Effects of drying and comminution type on the quantification of Polycyclic Aromatic Hydrocarbons (PAH) in a homogenised gasworks soil and the implications for human health risk assessment

    OpenAIRE

    Beriro, Darren J.; Vane, Christopher H.; Cave, Mark R.; Nathanail, C. Paul

    2014-01-01

    This research investigates the effect of nine physical treatment types comprising a serial combination of three drying (air, freeze and oven) and two comminution (milling and sieving) methods on the quantification of PAH in a soil sample from a former gasworks. Results show that treatment type has a significant effect on PAH concentration (p ≤ 0.05). Naphthalene, 1-methylnaphthalene and 2-methylnaphthalene concentrations were significantly higher for air drying and freeze drying treatments th...

  17. Quantification and monosaccharide composition of hemicelluloses from different plant functional types.

    Science.gov (United States)

    Schädel, Christina; Blöchl, Andreas; Richter, Andreas; Hoch, Günter

    2010-01-01

    Hemicelluloses are the second most abundant polysaccharide in nature after cellulose. So far, the chemical heterogeneity of cell-wall hemicelluloses and the relatively large sample-volume required in existing methods represent major obstacles for large-scale, cross-species analyses of this important plant compound. Here, we apply a new micro-extraction method to analyse hemicelluloses and the ratio of 'cellulose and lignin' to hemicelluloses in different tissues of 28 plant species comprising four plant functional types (broad-leaved trees, conifers, grasses and herbs). For this study, the fiber analysis after Van Soest was modified to enable the simultaneous quantitative and qualitative measurements of hemicelluloses in small sample volumes. Total hemicellulose concentrations differed markedly among functional types and tissues, with the highest concentration in sapwood of broad-leaved trees (31% d.m. in Fraxinus excelsior) and the lowest concentrations between 10 and 15% d.m. in leaves and bark of woody species as well as in roots of herbs. As for total hemicellulose concentrations, plant functional types and tissues exhibited characteristic ratios between the sum of cellulose plus lignin and hemicelluloses, with very high ratios (>4) in bark of trees and low ratios (<2) in all investigated leaves. Additional HPLC analyses of hydrolysed hemicelluloses showed xylose to be the dominant hemicellulose monosaccharide in tissues of broad-leaved trees, grasses and herbs while coniferous species showed higher amounts of arabinose, galactose and mannose. Overall, the micro-extraction method permitted the simultaneous determination of hemicelluloses of various tissues and plant functional types which exhibited characteristic hemicellulose concentrations and monosaccharide patterns. PMID:19926487

  18. Peripheral type benzodiazepine binding sites as a tool for the detection and quantification of CNS injury.

    Science.gov (United States)

    Benavides, J; Dubois, A; Scatton, B

    2001-05-01

    The concentration of peripheral type benzodiazepine binding sites (PTBS) in the brain parenchyma is greatly increased following brain lesions, reflecting the glial reaction and/or presence of hematogenous cells. Thus, PTBS density is a sensitive and reliable marker of brain injury in a large number of experimental models (ischemia, trauma, excitotoxic lesions, brain tumors) and equivalent human neuropathological conditions. PTBS density can be measured using specific radioligands and a conventional binding technique, or by quantitative autoradiography in tissue sections. PMID:18428526

  19. Intensional type theory with guarded recursive types qua fixed points on universes

    DEFF Research Database (Denmark)

    Møgelberg, Rasmus Ejlers; Birkedal, Lars

    2013-01-01

    Guarded recursive functions and types are useful for giving semantics to advanced programming languages and for higher-order programming with infinite data types, such as streams, e.g., for modeling reactive systems. We propose an extension of intensional type theory with rules for forming fixed points of guarded recursive functions. Guarded recursive types can be formed simply by taking fixed points of guarded recursive functions on the universe of types. Moreover, we present a general model construction for constructing models of the intensional type theory with guarded recursive functions and types. When applied to the groupoid model of intensional type theory with the universe of small discrete groupoids, the construction gives a model of guarded recursion for which there is a one-to-one correspondence between fixed points of functions on the universe of types and fixed points of (suitable) operators on types. In particular, we find that the functor category from the preordered set of natural numbers to the category of groupoids is a model of intensional type theory with guarded recursive types.
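As a loose operational analogy only (Python has no "later" modality; the thunk below merely mimics the guard), a guarded recursive stream definition is productive because each recursive call sits under a delayed tail constructor:

```python
# Streams as a head plus a *delayed* tail: the lambda plays the role of the
# guard, so the recursive definition below terminates and is productive.
def cons(head, delayed_tail):
    return (head, delayed_tail)

def nats_from(n):
    return cons(n, lambda: nats_from(n + 1))  # recursion under the guard

def take(k, stream):
    """Force the first k elements of a guarded stream."""
    out = []
    while k > 0:
        head, tail = stream
        out.append(head)
        stream = tail()  # force exactly one step of the delayed tail
        k -= 1
    return out

print(take(5, nats_from(0)))  # → [0, 1, 2, 3, 4]
```

In the type theory of the paper this guardedness is tracked statically, which is what allows fixed points of guarded functions (including functions on the universe of types) to be formed safely.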

  20. Intensional Type Theory with Guarded Recursive Types qua Fixed Points on Universes

    DEFF Research Database (Denmark)

    Birkedal, Lars; Mogelberg, R.E.

    2013-01-01

    Guarded recursive functions and types are useful for giving semantics to advanced programming languages and for higher-order programming with infinite data types, such as streams, e.g., for modeling reactive systems. We propose an extension of intensional type theory with rules for forming fixed points of guarded recursive functions. Guarded recursive types can be formed simply by taking fixed points of guarded recursive functions on the universe of types. Moreover, we present a general model construction for constructing models of the intensional type theory with guarded recursive functions and types. When applied to the groupoid model of intensional type theory with the universe of small discrete groupoids, the construction gives a model of guarded recursion for which there is a one-to-one correspondence between fixed points of functions on the universe of types and fixed points of (suitable) operators on types. In particular, we find that the functor category Grpd^{ω^op} from the preordered set of natural numbers to the category of groupoids is a model of intensional type theory with guarded recursive types.

  1. Comparison between magnetic force microscopy and electron back-scatter diffraction for ferrite quantification in type 321 stainless steel

    International Nuclear Information System (INIS)

    Several analytical techniques that are currently available can be used to determine the spatial distribution and amount of austenite, ferrite and precipitate phases in steels. The application of magnetic force microscopy, in particular, to study the local microstructure of stainless steels is beneficial due to the selectivity of this technique for detection of ferromagnetic phases. In the comparison of Magnetic Force Microscopy and Electron Back-Scatter Diffraction for the morphological mapping and quantification of ferrite, the degree of sub-surface measurement has been found to be critical. Through the use of surface shielding, it has been possible to show that Magnetic Force Microscopy has a measurement depth of 105–140 nm. A comparison of the two techniques together with the depth of measurement capabilities are discussed. - Highlights: • MFM used to map distribution and quantify ferrite in type 321 stainless steels. • MFM results compared with EBSD for same region, showing good spatial correlation. • MFM gives higher area fraction of ferrite than EBSD due to sub-surface measurement. • From controlled experiments MFM depth sensitivity measured from 105 to 140 nm. • A correction factor to calculate area fraction from MFM data is estimated

  2. Comparison between magnetic force microscopy and electron back-scatter diffraction for ferrite quantification in type 321 stainless steel

    Energy Technology Data Exchange (ETDEWEB)

    Warren, A.D., E-mail: Xander.Warren@bristol.ac.uk [Interface Analysis Centre, HH Wills Laboratory, University of Bristol, Bristol BS8 1FD (United Kingdom); Harniman, R.L. [School of Chemistry, University of Bristol, Bristol BS8 1 TS (United Kingdom); Collins, A.M. [School of Chemistry, University of Bristol, Bristol BS8 1 TS (United Kingdom); Bristol Centre for Functional Nanomaterials, Nanoscience and Quantum Information Centre, University of Bristol, Bristol BS8 1FD (United Kingdom); Davis, S.A. [School of Chemistry, University of Bristol, Bristol BS8 1 TS (United Kingdom); Younes, C.M. [Interface Analysis Centre, HH Wills Laboratory, University of Bristol, Bristol BS8 1FD (United Kingdom); Flewitt, P.E.J. [Interface Analysis Centre, HH Wills Laboratory, University of Bristol, Bristol BS8 1FD (United Kingdom); School of Physics, HH Wills Laboratory, University of Bristol, Bristol BS8 1FD (United Kingdom); Scott, T.B. [Interface Analysis Centre, HH Wills Laboratory, University of Bristol, Bristol BS8 1FD (United Kingdom)

    2015-01-15

    Several analytical techniques that are currently available can be used to determine the spatial distribution and amount of austenite, ferrite and precipitate phases in steels. The application of magnetic force microscopy, in particular, to study the local microstructure of stainless steels is beneficial due to the selectivity of this technique for detection of ferromagnetic phases. In the comparison of Magnetic Force Microscopy and Electron Back-Scatter Diffraction for the morphological mapping and quantification of ferrite, the degree of sub-surface measurement has been found to be critical. Through the use of surface shielding, it has been possible to show that Magnetic Force Microscopy has a measurement depth of 105–140 nm. A comparison of the two techniques together with the depth of measurement capabilities are discussed. - Highlights: • MFM used to map distribution and quantify ferrite in type 321 stainless steels. • MFM results compared with EBSD for same region, showing good spatial correlation. • MFM gives higher area fraction of ferrite than EBSD due to sub-surface measurement. • From controlled experiments MFM depth sensitivity measured from 105 to 140 nm. • A correction factor to calculate area fraction from MFM data is estimated.

  3. Introduction to type-2 fuzzy logic control theory and applications

    CERN Document Server

    Mendel, Jerry M; Tan, Woei-Wan; Melek, William W; Ying, Hao

    2014-01-01

    Written by world-class leaders in type-2 fuzzy logic control, this book offers a self-contained reference for both researchers and students. The coverage provides both background and an extensive literature survey on fuzzy logic and related type-2 fuzzy control. It also includes research questions, experiment and simulation results, and downloadable computer programs on an associated website. This key resource will prove useful to students and engineers wanting to learn type-2 fuzzy control theory and its applications.

  4. Quantification of stress history in type 304L stainless steel using positron annihilation spectroscopy

    International Nuclear Information System (INIS)

    Five Type 304L stainless steel specimens were subjected to incrementally increasing values of plastic strain. At each value of strain, the associated static stress was recorded and the specimen was subjected to positron annihilation spectroscopy (PAS) using the Doppler Broadening method. A calibration curve for the 'S' parameter as a function of stress was developed based on the five specimens. Seven different specimens (blind specimens labeled B1-B7) of 304L stainless steel were subjected to values of stress inducing plastic deformation. The values of stress ranged from 310 to 517 MPa. The seven specimens were subjected to PAS post-loading using the Doppler Broadening method, and the results were compared against the developed curve from the previous five specimens. It was found that a strong correlation exists between the 'S' parameter, stress, and strain up to a strain value of 15%, corresponding to a stress value of 500 MPa, beyond which saturation of the 'S' parameter occurs. Research Highlights: • Specimens were initially in an annealed/recrystallized condition. • Calibration results indicate positron annihilation measurements yield correlation. • Deformation produced by cold work was likely larger than the maximum strain.
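The calibrate-then-invert step can be sketched as a least-squares line fitted on the linear (pre-saturation) region of the S-stress relation; the (S, stress) pairs below are hypothetical and stand in for the study's calibration data:

```python
# Hypothetical (S, stress in MPa) calibration pairs in the linear region
# below saturation; the study's actual values are not reproduced here.
calib = [(0.46, 310.0), (0.48, 370.0), (0.50, 430.0), (0.52, 490.0)]

def fit_line(pts):
    """Ordinary least squares for stress ≈ a + b·S."""
    n = len(pts)
    sx = sum(s for s, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(s * s for s, _ in pts)
    sxy = sum(s * y for s, y in pts)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

a, b = fit_line(calib)

def estimate_stress(S):
    # Valid only below the ~500 MPa level where S saturates.
    return a + b * S

print(estimate_stress(0.49))  # stress estimate for a blind specimen
```

Inverting the fitted curve for a blind specimen's measured S value is how stress history would be recovered, with the caveat (noted in the abstract) that the method is blind above the saturation point.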

  5. Closed tachyon solitons in type II string theory

    International Nuclear Information System (INIS)

    Type II theories can be described as the endpoint of closed string tachyon condensation in certain orbifolds of supercritical type 0 theories. In this paper, we study solitons of this closed string tachyon and analyze the nature of the resulting defects in critical type II theories. The solitons are classified by the real K-theory groups KO of bundles associated to pairs of supercritical dimensions. For real codimension 4 and 8, corresponding to KO(S4) = Z and KO(S8) = Z, the defects correspond to a gravitational instanton and a fundamental string, respectively. We apply these ideas to reinterpret the worldsheet GLSM, regarded as a supercritical theory on the ambient toric space with closed tachyon condensation onto the CY hypersurface, and use it to describe charged solitons under discrete isometries. We also suggest the possible applications of supercritical strings to the physical interpretation of the matrix factorization description of F-theory on singular spaces. (copyright 2015 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  6. Gauge theory on a space with linear Lie type fuzziness

    CERN Document Server

    Khorrami, M; Shariati, A

    2013-01-01

    The U(1) gauge theory on a space with Lie type noncommutativity is constructed. The construction is based on the group of translation in Fourier space, which in contrast to space itself is commutative. In analogy with lattice gauge theory, the object playing the role of flux of field strength per plaquette, as well as the action, are constructed. It is observed that the theory, in comparison with ordinary U(1) gauge theory, has an extra gauge field component. This phenomena is reminiscent of similar ones in formulation of SU(N) gauge theory in space with canonical noncommutativity, and also appearance of gauge field component in discrete direction of Connes' construction of the Standard Model.

  7. Uncertainty Propagation and Quantification using Constrained Coupled Adaptive Forward-Inverse Schemes: Theory and Applications

    Science.gov (United States)

    Ryerson, F. J.; Ezzedine, S. M.; Antoun, T.

    2013-12-01

    The successful implementation and execution of numerous subsurface energy technologies, such as shale gas extraction, geothermal energy, and underground coal gasification, relies on detailed characterization of the geology and the subsurface properties. For example, spatial variability of subsurface permeability controls multi-phase flow, and hence impacts the prediction of reservoir performance. Subsurface properties can vary significantly over several length scales, making detailed subsurface characterization infeasible if not impossible. Therefore, in common practice, only sparse measurements of data are available to image or characterize the entire reservoir. For example, pressure, P, permeability, k, and production rate, Q, measurements are only available at the monitoring and operational wells. Elsewhere, the spatial distribution of k is determined by various deterministic or stochastic interpolation techniques, and P and Q are calculated from the governing forward mass balance equation assuming k is given at all locations. Several uncertainty drivers, such as PSUADE, are then used to propagate and quantify the uncertainty (UQ) of quantities (variables) of interest using forward solvers. Unfortunately, forward-solver techniques and other interpolation schemes are rarely constrained by the inverse problem itself: given P and Q at observation points, determine the spatially variable map of k. The approach presented here, motivated by fluid imaging for subsurface characterization and monitoring, was developed by progressively solving increasingly complex realistic problems. The essence of this novel approach is that the forward and inverse partial differential equations are the interpolators themselves for P, k and Q, rather than extraneous and sometimes ad hoc schemes. Three cases with different sparsity of data are investigated. In the simplest case, a sufficient number of passive pressure data (pre-production pressure gradients) are given. 
Here, only the inverse hyperbolic equation for the distribution of k is solved, provided that Cauchy data are appropriately assigned. In the next stage, only a limited number of passive measurements are provided. In this case, the forward and inverse PDEs are solved simultaneously. This is accomplished by adding regularization terms and filtering the pressure gradients in the inverse problem. The forward and the inverse problems are either simultaneously or sequentially coupled and solved using implicit schemes, adaptive mesh refinement, and Galerkin finite elements. The final case arises when P, k, and Q data only exist at producing wells. This exceedingly ill-posed problem calls for additional constraints on the forward-inverse coupling to ensure that the production rates are satisfied at the desired locations. Results from all three cases are presented, demonstrating the stability and accuracy of the proposed approach and, more importantly, providing some insights into the consequences of data undersampling, uncertainty propagation and quantification. We illustrate the advantages of this novel approach over the common UQ forward drivers on several subsurface energy problems in porous, fractured, and/or faulted reservoirs. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
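The idea of letting the governing equation itself act as the interpolator can be illustrated in one dimension with steady Darcy flow, q = -k dP/dx: given sparse pressure data and a known flux, inverting the law recovers k interval by interval rather than interpolating k by an extraneous scheme (the numbers below are hypothetical, in arbitrary consistent units):

```python
# 1D steady Darcy flow: q = -k * dP/dx.  With sparse pressure measurements
# and a prescribed flux q, the governing equation itself yields k on each
# interval between measurement points -- a toy version of the
# forward-inverse coupling described above.
q = 2.0  # prescribed (constant) flux, hypothetical units

pressure = {0.0: 10.0, 1.0: 6.0, 2.0: 4.0}  # sparse P measurements at x

def inverse_k(p, x0, x1, flux):
    """Invert Darcy's law on one interval from its pressure drop."""
    dpdx = (p[x1] - p[x0]) / (x1 - x0)
    return -flux / dpdx

print(inverse_k(pressure, 0.0, 1.0, q))  # k on [0, 1]
print(inverse_k(pressure, 1.0, 2.0, q))  # k on [1, 2]
```

The full approach generalizes this to coupled forward and inverse PDEs in several dimensions, with regularization and filtering handling the ill-posedness when data are too sparse for direct inversion.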

  8. Calabi-Yau compactifications of type IIB superstring theory

    International Nuclear Information System (INIS)

    Starting from a non-self-dual action for ten-dimensional type IIB supergravity, this theory is compactified on a Calabi-Yau 3-fold and 4-fold. The compactifications are thereby performed in the limit in which the volumes of the manifolds are large compared to the string scale.

  9. A model of PCF in guarded type theory

    DEFF Research Database (Denmark)

    Paviotti, Marco; Møgelberg, Rasmus Ejlers

    2015-01-01

    Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardó's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.

  10. On the homotopy theory of n-types

    CERN Document Server

    Biedermann, G

    2006-01-01

    An n-truncated model structure on simplicial (pre-)sheaves is described having as weak equivalences maps that induce isomorphisms on certain homotopy sheaves only up to degree n. Starting from one of Jardine's intermediate model structures we construct such an n-type model structure via Bousfield-Friedlander localization and exhibit useful generating sets of trivial cofibrations. Injectively fibrant objects in these categories are called n-hyperstacks. The whole setup can consequently be viewed as a description of the homotopy theory of higher hyperstacks. More importantly, we construct analogous n-truncations on simplicial groupoids and prove a Quillen equivalence between these settings. We achieve a classification of n-types of simplicial presheaves in terms of (n-1)-types of presheaves of simplicial groupoids. Our classification holds for general n. Therefore this can also be viewed as the homotopy theory of (pre-)sheaves of (weak) higher groupoids.

  11. Electric Black Holes in Type 0 String Theory

    CERN Document Server

    Sachs, I

    1999-01-01

    We discuss AdS_{2+1} (BTZ) black holes arising in type 0 string theory corresponding to D1-D5 and F1-NS5 bound states. In particular we describe a new non-dilatonic solution in the RR sector with only Dp_{+}, that is ``electric'' branes. This system is distinguished by the absence of fermions in the world volume theory which is an interacting CFT. It can therefore not be obtained as a projection of a type II BPS-solution. As for previous type 0 backgrounds linear stability is guaranteed only if the curvature is of the order of the string scale where alpha' corrections cannot be excluded. Some problems concerning the counting of states are discussed.

  12. Formation of social types in the theory of Orrin Klapp

    Directory of Open Access Journals (Sweden)

    Trifunović Vesna

    2007-01-01

    Orrin Klapp's theory of social types draws attention to the important functions these types serve within certain societies, and suggests that it is preferable to take them into consideration if our goal is a more complete knowledge of those societies. For Klapp, social types are important social symbols which reflect, in an interesting way, the society they are part of, and for that reason he dedicates his work to considering their meanings and social functions. He holds that we cannot understand a society without knowledge of the types with which its members identify and which serve them as models in their social activity. Hence, these types have cognitive value: according to Klapp, they assist perception and "contain the truth", and therefore knowledge of them allows easier orientation within the social system. Social types also offer insight into the scheme of the social structure, which is otherwise invisible and hidden, but certainly deserves attention if we wish a clearer picture of social relations within a specific community. The aim of this work is to present this very interesting and inspiring theory of Orrin Klapp, pointing out its importance but also the weaknesses which should be kept in mind during its application in further research.

  13. A Modular Type-checking algorithm for Type Theory with Singleton Types and Proof Irrelevance

    CERN Document Server

    Abel, Andreas; Pagano, Miguel

    2011-01-01

    We define a logical framework with singleton types and one universe of small types. We give the semantics using a PER model; it is used for constructing a normalisation-by-evaluation algorithm. We prove completeness and soundness of the algorithm, and obtain as a corollary the injectivity of type constructors. We then give the definition of a correct and complete type-checking algorithm for terms in normal form. We extend the results to proof-irrelevant propositions.

  14. On the theories of type 1 polar stratospheric cloud formation

    Energy Technology Data Exchange (ETDEWEB)

    MacKenzie, A.R. [Univ. of Cambridge (United Kingdom)]; Kulmala, M.; Laaksonen, A. [Univ. of Helsinki (Finland)] [and others]

    1995-06-20

    The authors apply classical theories of nucleation and freezing, in addition to condensation theories, to the problem of modeling the formation of polar stratospheric cloud particles. An array of different paths can be followed to arrive at solutions and solids composed of nitric acid, sulfuric acid, and water vapor. The authors attempt to do an energetics study to determine the most likely path leading to cloud particle formation, and then to study the implications for nitric acid condensation within resultant clouds. Their conclusion is that the most likely growth mechanism is condensation of ternary type solutions, followed by rapid freezing to water ices, nitric acid trihydrates, and sulfuric acid trihydrates.

  15. A ground many-valued type theory and its extensions.

    Czech Academy of Sciences Publication Activity Database

    Běhounek, Libor

    Linz : Johannes Kepler Universität, 2014 - (Flaminio, T.; Godo, L.; Gottwald, S.; Klement, E.). s. 15-18 [Linz Seminar on Fuzzy Set Theory /35./. 18.02.2014-22.02.2014, Linz] R&D Projects: GA MŠk ED1.1.00/02.0070; GA MŠk EE2.3.30.0010 Institutional support: RVO:67985807 Keywords: type theory * many-valued logics * higher-order logic * teorie typů * vícehodnotové logiky * logika vyššího řádu Subject RIV: BA - General Mathematics

  16. Type 1 2HDM as Effective Theory of Supersymmetry

    International Nuclear Information System (INIS)

    It is generally believed that the low energy effective theory of the minimal supersymmetric standard model is the type 2 two Higgs doublet model. We show that the type 1 two Higgs doublet model can also serve as the effective theory of supersymmetry in a specific case with high scale supersymmetry breaking and gauge mediation. If the other electroweak doublet obtains a vacuum expectation value after electroweak symmetry breaking, the Higgs spectrum is quite different. A remarkable feature is that the physical Higgs boson mass can be 125 GeV, unlike in ordinary models with high scale supersymmetry in which the Higgs mass is generally around 140 GeV.

  17. Deliverables: a categorical approach to program development in type theory

    OpenAIRE

    McKinna, James H

    1992-01-01

    This thesis considers the problem of program correctness within a rich theory of dependent types, the Extended Calculus of Constructions (ECC). This system contains a powerful programming language of higher-order primitive recursion and higher-order intuitionistic logic. It is supported by Pollack's versatile LEGO implementation, which I use extensively to develop the mathematical constructions studied here. I systematically investigate Burstall's notion of deliverable, that is, a program...

  18. Dilaton-driven brane inflation in type IIB string theory

    OpenAIRE

    KIM, JIN YOUNG

    2000-01-01

    We consider the cosmological evolution of the three-brane in the background of type IIB string theory. For two different backgrounds which give a nontrivial dilaton profile we have derived Friedman-like equations. These give a cosmological evolution similar to the one driven by matter density on the universe brane. The effective density blows up as we move towards the singularity, exhibiting the initial singularity problem. The analysis shows that when there is axion field...

  19. Pure subtype systems: a type theory for extensible software

    OpenAIRE

    Hutchins, DeLesley

    2009-01-01

    This thesis presents a novel approach to type theory called “pure subtype systems”, and a core calculus called DEEP which is based on that approach. DEEP is capable of modeling a number of interesting language techniques that have been proposed in the literature, including mixin modules, virtual classes, feature-oriented programming, and partial evaluation. The design of DEEP was motivated by two well-known problems: “the expression problem”, and “the tag elimination problem...

  20. Quantification and Domain Restriction in Basque

    OpenAIRE

    Etxeberria, Urtzi

    2005-01-01

    The main goal of this dissertation is to contribute to the understanding of the internal structure of Basque quantification in particular and natural language quantification in general within the framework of Generalized Quantifier Theory. The proposal put forward in it demonstrates that the standard analysis of Generalized Quantifiers is correct and that it can account for quantification in natural languages, pace alternative analyses that argue for a revision. Assuming that quantification i...
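In Generalized Quantifier Theory, a determiner denotes a relation between the restrictor set A and the scope set B. A minimal extensional sketch (the sets and names below are illustrative only):

```python
# A determiner as a relation between restrictor A and scope B.
def every(A, B):
    return A <= B                 # "every A is B": A is a subset of B

def some(A, B):
    return bool(A & B)            # "some A is B": A and B overlap

def most(A, B):
    return len(A & B) > len(A - B)  # more As in B than not

students = {"ane", "jon", "miren"}
slept = {"ane", "jon"}

print(every(students, slept), some(students, slept), most(students, slept))
# → False True True
```

Domain restriction, the dissertation's other theme, amounts to intersecting the restrictor A with a contextually supplied set before the quantifier relation is evaluated.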

  1. Enhanced gauge symmetry in type II string theory

    International Nuclear Information System (INIS)

    We show how enhanced gauge symmetry in type II string theory compactified on a Calabi-Yau threefold arises from singularities in the geometry of the target space. When the target space of the type IIA string acquires a genus g curve C of A_{N-1} singularities, we find that an SU(N) gauge theory with g adjoint hypermultiplets appears at the singularity. The new massless states correspond to solitons wrapped about the collapsing cycles, and their dynamics is described by a twisted supersymmetric gauge theory on C × R^4. We reproduce this result from an analysis of the S-dual D-manifold. We check that the predictions made by this model about the nature of the Higgs branch, the monodromy of period integrals, and the asymptotics of the one-loop topological amplitude are in agreement with geometrical computations. In one of our examples we find that the singularity occurs at strong coupling in the heterotic dual proposed by Kachru and Vafa. (orig.)

  2. D-brane Instantons in Type II String Theory

    CERN Document Server

    Blumenhagen, Ralph; Kachru, Shamit; Weigand, Timo

    2009-01-01

    We review recent progress in determining the effects of D-brane instantons in N=1 supersymmetric compactifications of Type II string theory to four dimensions. We describe the abstract D-brane instanton calculus for holomorphic couplings such as the superpotential, the gauge kinetic function and higher fermionic F-terms. This includes a discussion of multi-instanton effects and the implications of background fluxes for the instanton sector. Our presentation also highlights, but is not restricted to the computation of D-brane instanton effects in quiver gauge theories on D-branes at singularities. We then summarize the concrete consequences of stringy D-brane instantons for the construction of semi-realistic models of particle physics or SUSY-breaking in compact and non-compact geometries.

  3. Quantum Bianchi Type IX Cosmology in K-Essence Theory

    Science.gov (United States)

    Espinoza-García, Abraham; Socorro, J.; Pimentel, Luis O.

    2014-09-01

    We use one of the simplest forms of the K-essence theory and apply it to the anisotropic Bianchi type IX cosmological model, with a barotropic perfect fluid modeling the usual matter content. We show that the most important contribution of the scalar field occurs during a stiff matter phase. Also, we present a canonical quantization procedure of the theory which can be simplified by reinterpreting the scalar field as an exotic part of the total matter content. The solutions to the Wheeler-DeWitt equation were found using the Bohmian formulation Bohm (Phys. Rev. 85(2):166, 1952) of quantum mechanics, employing the amplitude-real-phase approach Moncrief and Ryan (Phys. Rev. D 44:2375, 1991), where the ansatz for the wave function is of the form Ψ = W e^{-S}, where S is the superpotential function, which plays an important role in solving the Hamilton-Jacobi equation.
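    Schematically, an amplitude-real-phase ansatz of this kind splits the Wheeler-DeWitt equation into a Hamilton-Jacobi equation for the superpotential S and a transport equation for the amplitude W. The following minisuperspace form is purely illustrative (signs and factor ordering are schematic, not taken from the paper):

```latex
% Illustrative minisuperspace Wheeler-DeWitt equation with the ansatz
(-\Box + U)\,\Psi = 0, \qquad \Psi = W\, e^{-S}
% Substituting and separating terms yields
(\nabla S)^2 - U = 0
\quad \text{(Hamilton--Jacobi equation for } S\text{)}
% and
\Box W - 2\,\nabla S \cdot \nabla W - W\,\Box S = 0
\quad \text{(transport equation for } W\text{)}
```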

  4. D-brane Instantons in Type II String Theory

    Energy Technology Data Exchange (ETDEWEB)

    Blumenhagen, Ralph; /Munich, Max Planck Inst.; Cvetic, Mirjam; /Pennsylvania U.; Kachru, Shamit; /Stanford U., Phys. Dept. /SLAC; Weigand, Timo; /SLAC

    2009-06-19

    We review recent progress in determining the effects of D-brane instantons in N=1 supersymmetric compactifications of Type II string theory to four dimensions. We describe the abstract D-brane instanton calculus for holomorphic couplings such as the superpotential, the gauge kinetic function and higher fermionic F-terms. This includes a discussion of multi-instanton effects and the implications of background fluxes for the instanton sector. Our presentation also highlights, but is not restricted to the computation of D-brane instanton effects in quiver gauge theories on D-branes at singularities. We then summarize the concrete consequences of stringy D-brane instantons for the construction of semi-realistic models of particle physics or SUSY-breaking in compact and non-compact geometries.

  5. Irregular singularities in Liouville theory and Argyres-Douglas type gauge theories, I

    Energy Technology Data Exchange (ETDEWEB)

    Gaiotto, D. [Institute for Advanced Study (IAS), Princeton, NJ (United States); Teschner, J. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2012-03-15

    Motivated by problems arising in the study of N=2 supersymmetric gauge theories we introduce and study irregular singularities in two-dimensional conformal field theory, here Liouville theory. Irregular singularities are associated to representations of the Virasoro algebra in which a subset of the annihilation part of the algebra acts diagonally. In this paper we define natural bases for the space of conformal blocks in the presence of irregular singularities, describe how to calculate their series expansions, and how such conformal blocks can be constructed by some delicate limiting procedure from ordinary conformal blocks. This leads us to a proposal for the structure functions appearing in the decomposition of physical correlation functions with irregular singularities into conformal blocks. Taken together, we get a precise prediction for the partition functions of some Argyres-Douglas type theories on S^4. (orig.)

  6. Church-style type theories over finitary weakly implicative logics.

    Czech Academy of Sciences Publication Activity Database

    Běhounek, Libor

    Vienna : Vienna University of Technology, 2014 - (Baaz, M.; Ciabattoni, A.; Hetzl, S.). s. 131-133 [LATD 2014. Logic, Algebra and Truth Degrees. 16.07.2014-19.07.2014, Vienna] R&D Projects: GA MŠk ED1.1.00/02.0070; GA MŠk EE2.3.30.0010 Institutional support: RVO:67985807 Keywords : type theory * higher-order logic * weakly implicative logics * teorie typů * logika vyššího řádu * slabě implikační logiky Subject RIV: BA - General Mathematics

  7. Type IIB flux vacua from G-theory I

    CERN Document Server

    Candelas, Philip; Damian, Cesar; Larfors, Magdalena; Morales, Jose Francisco

    2014-01-01

    We construct non-perturbatively exact four-dimensional Minkowski vacua of type IIB string theory with non-trivial fluxes. These solutions are found by gluing together, consistently with U-duality, local solutions of type IIB supergravity on $T^4 \times \mathbb{C}$ with the metric, dilaton and flux potentials varying along $\mathbb{C}$ and the flux potentials oriented along $T^4$. We focus on solutions locally related via U-duality to non-compact Ricci-flat geometries. More general solutions and a complete analysis of the supersymmetry equations are presented in the companion paper [1]. We build a precise dictionary between fluxes in the global solutions and the geometry of an auxiliary $K3$ surface fibered over $\mathbb{CP}^1$. In the spirit of F-theory, the flux potentials are expressed in terms of locally holomorphic functions that parametrize the complex structure moduli space of the $K3$ fiber in the auxiliary geometry. The brane content is inferred from the monodromy data around the degeneration points o...

  8. Development of Primer-Probe Energy Transfer real-time PCR for the detection and quantification of porcine circovirus type 2

    DEFF Research Database (Denmark)

    Balint, Adam; Tenk, Miklós

    2009-01-01

    A real-time PCR assay, based on Primer-Probe Energy Transfer (PriProET), was developed to improve the detection and quantification of porcine circovirus type 2 (PCV2). PCV2 is recognised as the essential infectious agent in post-weaning multisystemic wasting syndrome (PMWS) and has been associated with other disease syndromes such as porcine dermatitis and nephropathy syndrome (PDNS) and porcine respiratory disease complex (PRDC). Since circoviruses commonly occur in the pig populations and there is a correlation between the severity of the disease and the viral load in the organs and blood, it is important not only to detect PCV2 but also to determine the quantitative aspects of viral load. The PriProET real-time PCR assay described in this study was tested on various virus strains and clinical forms of PMWS in order to investigate any correlation between the clinical signs and viral loads in different organs. The data obtained in this study correlate with those described earlier; namely, the viral load in 1 ml plasma and in 500 ng tissue DNA exceeds 10^7 copies in the case of PMWS. The results indicate that the new assay provides a specific, sensitive and robust tool for the improved detection and quantification of PCV2.
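    Absolute quantification in real-time PCR assays of this kind typically rests on a standard curve relating the threshold cycle (Ct) to the logarithm of the copy number. A minimal sketch of that back-calculation follows; the slope, intercept and Ct values are hypothetical illustrations, not parameters of the PriProET assay:

```python
def copies_from_ct(ct, slope, intercept):
    """Back-calculate template copy number from a threshold cycle using a
    linear standard curve: Ct = slope * log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical calibration: slope -3.32 (~100% PCR efficiency) and
# intercept 40 (the Ct at which a single copy would cross threshold).
print(copies_from_ct(16.76, -3.32, 40.0))  # ≈ 1e7 copies
```

A sample crossing threshold around cycle 17 would thus correspond to the ~10^7 copy loads reported for PMWS cases.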

  9. Comparison between magnetic force microscopy and electron back-scatter diffraction for ferrite quantification in type 321 stainless steel.

    Science.gov (United States)

    Warren, A D; Harniman, R L; Collins, A M; Davis, S A; Younes, C M; Flewitt, P E J; Scott, T B

    2015-01-01

    Several analytical techniques that are currently available can be used to determine the spatial distribution and amount of austenite, ferrite and precipitate phases in steels. The application of magnetic force microscopy, in particular, to study the local microstructure of stainless steels is beneficial due to the selectivity of this technique for detection of ferromagnetic phases. In the comparison of magnetic force microscopy and electron back-scatter diffraction for the morphological mapping and quantification of ferrite, the degree of sub-surface measurement has been found to be critical. Through the use of surface shielding, it has been possible to show that magnetic force microscopy has a measurement depth of 105-140 nm. A comparison of the two techniques, together with their depth of measurement capabilities, is discussed. PMID:25195013
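    With either mapping technique, phase quantification ultimately reduces to the area fraction of map pixels attributed to ferrite. A toy sketch of such a threshold-based fraction (the map values and threshold are invented for illustration, not measured data):

```python
def phase_fraction(signal_map, threshold):
    """Fraction of map pixels whose response exceeds a threshold;
    a crude proxy for the ferrite area fraction in a phase map."""
    flat = [v for row in signal_map for v in row]
    return sum(v > threshold for v in flat) / len(flat)

# Invented 3x3 magnetic-response map (arbitrary units)
mfm_map = [[0.1, 0.9, 0.2],
           [0.8, 0.7, 0.1],
           [0.2, 0.1, 0.9]]
print(round(phase_fraction(mfm_map, 0.5), 3))  # → 0.444
```

In practice the two techniques disagree precisely because the MFM "pixels" integrate signal from up to ~140 nm below the surface, whereas EBSD is more surface-sensitive.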

  10. Compactifications of type IIB string theory and F-theory models using toric geometry

    International Nuclear Information System (INIS)

    In this work we focus on the toric construction of type IIB and F-theory models. After introducing the main concepts of type IIB orientifold and F-theory compactifications as well as their connection via the Sen limit, we provide the toric tools to explicitly construct and describe the manifolds involved in our setups. On the type IIB side, we study the 'Large Volume Scenario' on four-modulus, 'Swiss cheese' Calabi-Yau manifolds obtained from four-dimensional simplicial lattice polytopes. We thoroughly analyze the possibility of generating neutral, non-perturbative superpotentials from Euclidean D3-branes in the presence of chirally intersecting D7-branes. We find that taking proper account of the Freed-Witten anomaly on non-spin cycles and the Kaehler cone conditions imposes severe constraints on the models. Nevertheless, we are able to create setups where the constraints are solved, and up to three moduli are stabilized. In the case of F-theory compactifications, we make use of toric geometry to construct a class of grand unified theory (GUT) models in F-theory. The base manifolds are hypersurfaces of the four-dimensional projective space with toric point and curve blowups. The associated Calabi-Yau fourfolds are complete intersections of two hypersurfaces in the P[231] fibered toric sixfolds. We construct SO(10) GUT models on suitable divisors of the base manifolds using the spectral cover construction. By means of abelian fluxes we break the SO(10) gauge group to SU(5)xU(1) which is interpreted as a flipped SU(5) model. With the GUT Higgses in this model it is possible to further break the gauge symmetry to the Standard Model. We present several phenomenologically attractive examples in detail. (author)

  11. Theory and Phenomenology of Type I strings and M-theory

    CERN Document Server

    Dudas, E A

    2000-01-01

    The physical motivations and the basic construction rules for Type I strings and M-theory compactifications are reviewed in light of the recent developments. The first part contains the basic theoretical ingredients needed for building four-dimensional supersymmetric models, models with broken supersymmetry and for computing low-energy actions and quantum corrections to them. The second part contains some phenomenological applications to brane world scenarios with low values of the string scale and large extra dimensions.

  12. TOPICAL REVIEW: Theory and phenomenology of type I strings and M-theory

    Science.gov (United States)

    Dudas, Emilian

    2000-11-01

    The physical motivations and the basic construction rules for type I strings and M-theory compactifications are reviewed in the light of recent developments. The first part of this review contains the basic theoretical ingredients needed for building four-dimensional supersymmetric models, models with broken supersymmetry, and for computing low-energy actions and quantum corrections to these. The second part contains some phenomenological applications to brane world scenarios with low values of the string scale and large extra dimensions.

  13. A new type of phase transition in gravitational theories

    CERN Document Server

    Camanho, Xian O; Gomberoff, Andres; Giribet, Gaston

    2012-01-01

    We set forth a new type of phase transition that might take place in gravitational theories whenever higher-curvature corrections are considered. It can be regarded as a sophisticated version of the Hawking-Page transition, mediated by the nucleation of a bubble in anti-de Sitter space. The bubble hosts a black hole in its interior, and separates two spacetime regions with different effective cosmological constants. We compute the free energy of this configuration and compare it with that of thermal anti-de Sitter space. The result suggests that a phase transition actually occurs above certain critical temperature, ultimately changing the value of the cosmological constant. We discuss the consistency of the thermodynamic picture and its possible relevance in the context of AdS/CFT.

  14. Aspects of moduli stabilization in type IIB string theory

    CERN Document Server

    Khalil, Shaaban; Nassar, Ali

    2015-01-01

    We review moduli stabilization in type IIB string theory compactification with fluxes. We focus on the KKLT and Large Volume Scenario (LVS). We show that the predicted soft SUSY breaking terms in the KKLT model are not phenomenologically viable. In LVS, the following result for the scalar mass, gaugino mass, and trilinear term is obtained: $m_0 = m_{1/2} = -A_0 = m_{3/2}$, which may account for the Higgs mass limit if $m_{3/2} \sim {\cal O}(1.5)$ TeV. However, in this case the relic abundance of the lightest neutralino cannot be consistent with the measured limits. We also study the cosmological consequences of moduli stabilization in both models. In particular, the associated inflation models such as racetrack inflation and K\"ahler inflation are analyzed. Finally the problem of moduli destabilization and the effect of string moduli backreaction on the inflation models are discussed.

  15. On axionic dark matter in Type IIA string theory

    CERN Document Server

    Honecker, Gabriele

    2014-01-01

    We investigate viable scenarios with various axions in the context of supersymmetric field theory and in globally consistent D-brane models. The Peccei-Quinn symmetry is associated with an anomalous U(1) symmetry, which acquires mass at the string scale but remains as a perturbative global symmetry at low energies. The origin of the scalar Higgs-axion potential from F-, D- and soft breaking terms is derived, and two Standard Model examples of global intersecting D6-brane models in Type II orientifolds are presented, which differ in the realisation of the Higgs sector and in the hidden sector, the latter of which is of particular importance for the soft supersymmetry breaking terms.

  16. Geometry of model building in type IIB superstring theory and F-theory compactifications

    International Nuclear Information System (INIS)

    The present thesis is devoted to the study and geometrical description of type IIB superstring theory and F-theory model building. After a concise exposition of the basic concepts of type IIB flux compactifications, we explain their relation to F-theory. Moreover, we give a brief introduction to toric geometry focusing on the construction and the analysis of compact Calabi-Yau (CY) manifolds, which play a prominent role in the compactification of extra spatial dimensions. We study the 'Large Volume Scenario' on explicit new compact four-modulus CY manifolds. We thoroughly analyze the possibility of generating neutral non-perturbative superpotentials from Euclidean D3-branes in the presence of chirally intersecting D7-branes. We find that taking proper account of the Freed-Witten anomaly on non-spin cycles and of the Kaehler cone conditions imposes severe constraints on the models. Furthermore, we systematically construct a large number of compact CY fourfolds that are suitable for F-theory model building. These elliptically fibered CYs are complete intersections of two hypersurfaces in a six-dimensional ambient space. We first construct three-dimensional base manifolds that are hypersurfaces in a toric ambient space. We find that elementary conditions, which are motivated by F-theory GUTs (Grand Unified Theory), lead to strong constraints on the geometry, which significantly reduce the number of suitable models. We work out several examples in more detail. At the end, we focus on the complex moduli space of CY threefolds. It is a known result that infinite sequences of type IIB flux vacua with imaginary self-dual flux can only occur in so-called D-limits, corresponding to singular points in complex structure moduli space. We refine this no-go theorem by demonstrating that there are no infinite sequences accumulating to the large complex structure point of a certain class of one-parameter CY manifolds. 
We perform a similar analysis for conifold points and for the decoupling limit, obtaining identical results. Furthermore, we establish the absence of infinite sequences in a D-limit corresponding to the large complex structure limit of a two-parameter CY. We corroborate our results with a numerical study of the sequences. (author)

  17. Preclinical evaluation and quantification of [18F]MK-9470 as a radioligand for PET imaging of the type 1 cannabinoid receptor in rat brain

    International Nuclear Information System (INIS)

    [18F]MK-9470 is an inverse agonist for the type 1 cannabinoid (CB1) receptor allowing its use in PET imaging. We characterized the kinetics of [18F]MK-9470 and evaluated its ability to quantify CB1 receptor availability in the rat brain. Dynamic small-animal PET scans with [18F]MK-9470 were performed in Wistar rats on a FOCUS-220 system for up to 10 h. Both plasma and perfused brain homogenates were analysed using HPLC to quantify radiometabolites. Displacement and blocking experiments were done using cold MK-9470 and another inverse agonist, SR141716A. The distribution volume (VT) of [18F]MK-9470 was used as a quantitative measure and compared to the use of brain uptake, expressed as SUV, a simplified method of quantification. The percentage of intact [18F]MK-9470 in arterial plasma samples was 80 ± 23 % at 10 min, 38 ± 30 % at 40 min and 13 ± 14 % at 210 min. A polar radiometabolite fraction was detected in plasma and brain tissue. The brain radiometabolite concentration was uniform across the whole brain. Displacement and pretreatment studies showed that 56 % of the tracer binding was specific and reversible. VT values obtained with a one-tissue compartment model plus constrained radiometabolite input had good identifiability (≤10 %). Ignoring the radiometabolite contribution using a one-tissue compartment model alone, i.e. without constrained radiometabolite input, overestimated the [18F]MK-9470 VT, but was correlated. A correlation between [18F]MK-9470 VT and SUV in the brain was also found (R^2 = 0.26-0.33; p ≤ 0.03). While the presence of a brain-penetrating radiometabolite fraction complicates the quantification of [18F]MK-9470 in the rat brain, its tracer kinetics can be modelled using a one-tissue compartment model with and without constrained radiometabolite input. (orig.)
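    The one-tissue compartment model referred to above describes the tissue concentration by dCt/dt = K1·Cp(t) − k2·Ct(t), with distribution volume VT = K1/k2. A minimal numerical sketch of this model (the rate constants and the constant plasma input are hypothetical, not the study's fitted values):

```python
def simulate_1tcm(cp, k1, k2, dt):
    """Euler integration of the one-tissue compartment model
    dCt/dt = K1*Cp(t) - k2*Ct(t), starting from Ct(0) = 0."""
    ct, curve = 0.0, []
    for c in cp:
        ct += dt * (k1 * c - k2 * ct)
        curve.append(ct)
    return curve

k1, k2 = 0.1, 0.05           # hypothetical rate constants (1/min)
vt = k1 / k2                 # distribution volume VT = K1/k2 = 2.0
cp = [1.0] * 20000           # constant plasma input; dt = 0.01 min (200 min)
ct = simulate_1tcm(cp, k1, k2, dt=0.01)
print(vt, round(ct[-1], 3))  # at equilibrium Ct/Cp approaches VT
```

With a constant input the tissue curve relaxes to Ct = VT·Cp, which is why VT serves as the equilibrium measure of receptor availability.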

  18. Quantification of the types of water in Eudragit RLPO polymer and the kinetics of water loss using FTIR

    DEFF Research Database (Denmark)

    Pirayavaraporn, Chompak; Rades, Thomas; Gordon, Keith C; Tucker, Ian G

    2013-01-01

    Coalescence of polymer particles in polymer matrix tablets influences drug release. The literature has emphasized that coalescence occurs above the glass transition temperature (Tg) of the polymer and that water may plasticize (lower Tg) the polymer. However, we have shown previously that nonplasticizing water also influences coalescence of Eudragit RLPO; so there is a need to quantify the different types of water in Eudragit RLPO. The purpose of this study was to distinguish the types of water ...

  19. Type IIA flux compactifications. Vacua, effective theories and cosmological challenges

    Energy Technology Data Exchange (ETDEWEB)

    Koers, Simon

    2009-07-30

    In this thesis, we studied a number of type IIA SU(3)-structure compactifications with O6-planes on nilmanifolds and cosets, which are tractable enough to allow for an explicit derivation of the low energy effective theory. In particular we calculated the mass spectrum of the light scalar modes, using N = 1 supergravity techniques. For the torus and the Iwasawa solution, we have also performed an explicit Kaluza-Klein reduction, which led to the same result. For the nilmanifold examples we have found that there are always three unstabilized moduli corresponding to axions in the RR sector. On the other hand, in the coset models, except for SU(2) x SU(2), all moduli are stabilized. We discussed the Kaluza-Klein decoupling for the supersymmetric AdS vacua and found that it requires going to the Nearly-Calabi-Yau limit. We searched for non-trivial de Sitter minima in the original flux potential away from the AdS vacuum. Finally, in chapter 7, we focused on a family of three coset spaces and constructed non-supersymmetric vacua on them. (orig.)

  20. Type IIA flux compactifications. Vacua, effective theories and cosmological challenges

    International Nuclear Information System (INIS)

    In this thesis, we studied a number of type IIA SU(3)-structure compactifications with O6-planes on nilmanifolds and cosets, which are tractable enough to allow for an explicit derivation of the low energy effective theory. In particular we calculated the mass spectrum of the light scalar modes, using N = 1 supergravity techniques. For the torus and the Iwasawa solution, we have also performed an explicit Kaluza-Klein reduction, which led to the same result. For the nilmanifold examples we have found that there are always three unstabilized moduli corresponding to axions in the RR sector. On the other hand, in the coset models, except for SU(2) x SU(2), all moduli are stabilized. We discussed the Kaluza-Klein decoupling for the supersymmetric AdS vacua and found that it requires going to the Nearly-Calabi-Yau limit. We searched for non-trivial de Sitter minima in the original flux potential away from the AdS vacuum. Finally, in chapter 7, we focused on a family of three coset spaces and constructed non-supersymmetric vacua on them. (orig.)

  1. Experimental quantification of dynamic forces and shaft motion in two different types of backup bearings under several contact conditions

    DEFF Research Database (Denmark)

    Lahriri, Said; Santos, Ilmar

    2013-01-01

    This paper treats the experimental study on a shaft impacting its stator for different cases. The paper focuses mainly on the measured contact forces and the shaft motion in two different types of backup bearings. As such, the measured contact forces are thoroughly studied. These measured contact forces enable the hysteresis loops to be computed and analyzed. Consequently, the contact forces are plotted against the local deformation in order to assess the contact force loss during the impacts. T...

  2. Quantification of the types of water in Eudragit RLPO polymer and the kinetics of water loss using FTIR

    DEFF Research Database (Denmark)

    Pirayavaraporn, Chompak; Rades, Thomas

    2013-01-01

    Coalescence of polymer particles in polymer matrix tablets influences drug release. The literature has emphasized that coalescence occurs above the glass transition temperature (Tg) of the polymer and that water may plasticize (lower Tg) the polymer. However, we have shown previously that nonplasticizing water also influences coalescence of Eudragit RLPO; so there is a need to quantify the different types of water in Eudragit RLPO. The purpose of this study was to distinguish the types of water present in Eudragit RLPO polymer and to investigate the water loss kinetics for these different types of water. Eudragit RLPO was stored in tightly closed chambers at various relative humidities (0, 33, 56, 75, and 94%) until equilibrium was reached. Fourier transform infrared spectroscopy (FTIR)-DRIFTS was used to investigate molecular interactions between water and polymer, and water loss over time. Using a curve fitting procedure, the water region (3100-3,700 cm(-1)) of the spectra was analyzed, and used to identify water present in differing environments in the polymer and to determine the water loss kinetics upon purging the sample with dry compressed air. It was found that four environments can be differentiated (dipole interaction of water with quaternary ammonium groups, water cluster, and water indirectly and directly binding to the carbonyl groups of the polymer) but it was not possible to distinguish whether the different types of water were lost at different rates. It is suggested that water is trapped in the polymer in different forms and this should be considered when investigating coalescence of polymer matrices.
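    Water-loss kinetics of the kind analyzed here are often summarized by fitting band areas to a first-order decay, A(t) = A0·e^{-kt}, via log-linear least squares. A sketch on synthetic data follows; the first-order rate model and all numbers are illustrative assumptions, not the paper's fitted results:

```python
import math

def fit_first_order(times, areas):
    """Least-squares fit of ln(A) = ln(A0) - k*t (first-order loss).
    Returns (A0, k). Illustrative; other rate models are possible."""
    n = len(times)
    ys = [math.log(a) for a in areas]
    tbar = sum(times) / n
    ybar = sum(ys) / n
    slope = sum((t - tbar) * (y - ybar) for t, y in zip(times, ys)) \
            / sum((t - tbar) ** 2 for t in times)
    return math.exp(ybar - slope * tbar), -slope

# Synthetic band areas decaying with k = 0.2 min^-1 from A0 = 5.0
times = [0, 1, 2, 3, 4, 5]
areas = [5.0 * math.exp(-0.2 * t) for t in times]
a0, k = fit_first_order(times, areas)
print(round(a0, 3), round(k, 3))  # → 5.0 0.2
```

Fitting each deconvolved sub-band separately in this way is one route to testing whether the different water environments are lost at different rates.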

  3. Mapping and Quantification of Land Area and Cover Types with LandsatTM in Carey Island, Selangor, Malaysia

    OpenAIRE

    J. Hj. Kamaruzaman; I. Mohd Hasmadi

    2009-01-01

    Information about current land cover type is essential at a certain level to ensure the optimum use of land resources. Several approaches can be used to estimate land cover area; remote sensing combined with Geographic Information Systems (GIS) is among these methods. This study was therefore undertaken to evaluate how reliable these technologies are in preparing information about land cover in Carey Island, Selangor, Peninsular Malaysia. Erdas Imagine 9.1 was used in digital image processing. A p...

  4. Identification of enzymes and quantification of metabolic fluxes in the wild type and in a recombinant Aspergillus oryzae strain

    DEFF Research Database (Denmark)

    Pedersen, Henrik; Carlsen, Morten

    1999-01-01

    Two alpha-amylase-producing strains of Aspergillus oryzae, a wild-type strain and a recombinant containing additional copies of the alpha-amylase gene, were characterized with respect to enzyme activities, localization of enzymes to the mitochondria or cytosol, macromolecular composition, and metabolic fluxes through the central metabolism during glucose-limited chemostat cultivations. Citrate synthase and isocitrate dehydrogenase (NAD) activities were found only in the mitochondria; glucose-6-phosphate dehydrogenase and glutamate dehydrogenase (NADP) activities were found only in the cytosol; and isocitrate dehydrogenase (NADP), glutamate oxaloacetate transaminase, malate dehydrogenase, and glutamate dehydrogenase (NAD) activities were found in both the mitochondria and the cytosol. The measured biomass components and ash could account for 95% (wt/wt) of the biomass. The protein and RNA contents increased linearly with increasing specific growth rate, but the carbohydrate and chitin contents decreased. A metabolic model consisting of 69 fluxes and 59 intracellular metabolites was used to calculate the metabolic fluxes through the central metabolism at several specific growth rates, with ammonia or nitrate as the nitrogen source. The flux through the pentose phosphate pathway increased with increasing specific growth rate. The fluxes through the pentose phosphate pathway were 15 to 26% higher for the recombinant strain than for the wild-type strain.

  5. Quantification of zinc atoms in a surface alloy on copper in an industrial-type methanol synthesis catalyst

    DEFF Research Database (Denmark)

    Kuld, Sebastian; Moses, Poul Georg

    2014-01-01

    Methanol has recently attracted renewed interest because of its potential importance as a solar fuel. Methanol is also an important bulk chemical that is most efficiently formed over the industrial Cu/ZnO/Al2O3 catalyst. The identity of the active site and, in particular, the role of ZnO as a promoter for this type of catalyst is still under intense debate. Structural changes that are strongly dependent on the pretreatment method have now been observed for an industrial-type methanol synthesis catalyst. A combination of chemisorption, reaction, and spectroscopic techniques provides a consistent picture of surface alloying between copper and zinc. This analysis enables a reinterpretation of the methods that have been used for the determination of the Cu surface area and provides an opportunity to independently quantify the specific Cu and Zn areas. This method may also be applied to other systems where metal-support interactions are important, and this work generally addresses the role of the carrier and the nature of the interactions between carrier and metal in heterogeneous catalysts.

  6. On Behavioral Types for OSGi: From Theory to Implementation

    OpenAIRE

    Blech, Jan Olaf; Rueß, Harald; Schätz, Bernhard

    2013-01-01

    This report presents our work on behavioral types for OSGi component systems. It extends previously published work and presents features and details that have not yet been published. In particular, we cover a discussion on behavioral types in general, and Eclipse based implementation work on behavioral types. The implementation work covers: editors, means for comparing types at development and runtime, a tool connection to resolve incompatibilities, and an AspectJ based inf...

  7. Non-perturbative states in type II superstring theory from classical spinning membranes

    CERN Document Server

    Brugues, J; Russo, J G; Brugues, Jan; Rojo, Joan; Russo, Jorge G.

    2004-01-01

    We find a new family of exact solutions in membrane theory, representing toroidal membranes spinning in several planes. Their energy squared is proportional to the sum of the different angular momenta, generalizing Regge-type string solutions to membrane theory. By compactifying the eleven-dimensional theory on a circle and on a torus, we identify a family of new non-perturbative states of type IIA and type IIB superstring theory (which contains the perturbative spinning string solutions of type II string theory as a particular case). The solution represents a spinning bound state of D-branes and fundamental strings. Then we find similar solutions for membranes on $AdS_7\times S^4$ and $AdS_4\times S^7$. We also consider the analogous solutions in SU(N) matrix theory, and compute the energy. They can be interpreted as rotating open strings with D0-branes attached to their endpoints.

  8. Experimental quantification of dynamic forces and shaft motion in two different types of backup bearings under several contact conditions

    DEFF Research Database (Denmark)

    Lahriri, Said; Santos, Ilmar

    2013-01-01

    This paper treats the experimental study on a shaft impacting its stator for different cases. The paper focuses mainly on the measured contact forces and the shaft motion in two different types of backup bearings. As such, the measured contact forces are thoroughly studied. These measured contact forces enable the hysteresis loops to be computed and analyzed. Consequently, the contact forces are plotted against the local deformation in order to assess the contact force loss during the impacts. The shaft motion during contact with the backup bearing is verified with a two-sided spectrum analyses. The analyses show that by use of a conventional annular guide, the shaft undergoes a direct transition from normal operation to a full annular backward whirling state for the case of external excitation. However, in a self-excited vibration case, where the speed is gradually increased and decreased through the first critical speed, the investigation revealed that different paths initiated the onset of backward whip and whirling motion. In order to improve the whirling and the full annular contact behavior, an unconventional pinned backup bearing is realized. The idea is to utilize pin connections that center the rotor during impacts and prevent the shaft from entering a full annular contact state. The experimental results show that the shaft escapes the pins and returns to a normal operational condition during an impact event.
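    The hysteresis loops mentioned above quantify the energy lost per impact as the area enclosed by the force-deflection curve. A sketch of that computation using the shoelace formula, on an invented rectangular loop (not measured data):

```python
def loop_area(xs, ys):
    """Area enclosed by a closed polygonal loop (shoelace formula); for a
    force-deflection loop its magnitude is the energy lost per cycle."""
    n = len(xs)
    s = 0.0
    for i in range(n):
        j = (i + 1) % n
        s += xs[i] * ys[j] - xs[j] * ys[i]
    return abs(s) / 2.0

# Hypothetical rectangular loading/unloading loop: 1 mm wide, 100 N tall
deflection = [0.0, 1.0, 1.0, 0.0]      # mm
force      = [0.0, 0.0, 100.0, 100.0]  # N
print(loop_area(deflection, force))    # → 100.0 (N·mm lost per cycle)
```

Applying this to the sampled loading and unloading branches of a measured loop gives the contact force loss assessed in the paper.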

  9. Heartbeat-related displacement of the thoracic aorta in patients with chronic aortic dissection type B: Quantification by dynamic CTA

    Energy Technology Data Exchange (ETDEWEB)

    Weber, Tim F. [University of Heidelberg, Department of Diagnostic and Interventional Radiology, Im Neuenheimer Feld 110, 69120 Heidelberg (Germany)], E-mail: tim.weber@med.uni-heidelberg.de; Ganten, Maria-Katharina [German Cancer Research Center, Department of Radiology, Im Neuenheimer Feld 280, 69120 Heidelberg (Germany)], E-mail: m.ganten@dkfz.de; Boeckler, Dittmar [University of Heidelberg, Department of Vascular and Endovascular Surgery, Im Neuenheimer Feld 110, 69120 Heidelberg (Germany)], E-mail: dittmar.boeckler@med.uni-heidelberg.de; Geisbuesch, Philipp [University of Heidelberg, Department of Vascular and Endovascular Surgery, Im Neuenheimer Feld 110, 69120 Heidelberg (Germany)], E-mail: philipp.geisbuesch@med.uni-heidelberg.de; Kauczor, Hans-Ulrich [University of Heidelberg, Department of Diagnostic and Interventional Radiology, Im Neuenheimer Feld 110, 69120 Heidelberg (Germany)], E-mail: hu.kauczor@med.uni-heidelberg.de; Tengg-Kobligk, Hendrik von [German Cancer Research Center, Department of Radiology, Im Neuenheimer Feld 280, 69120 Heidelberg (Germany)], E-mail: h.vontengg@dkfz.de

    2009-12-15

    Purpose: The purpose of this study was to characterize the heartbeat-related displacement of the thoracic aorta in patients with chronic aortic dissection type B (CADB). Materials and methods: Electrocardiogram-gated computed tomography angiography was performed during inspiratory breath-hold in 11 patients with CADB: Collimation 16 mm x 1 mm, pitch 0.2, slice thickness 1 mm, reconstruction increment 0.8 mm. Multiplanar reformations were taken for 20 equidistant time instances through both ascending (AAo) and descending aorta (true lumen, DAoT; false lumen, DAoF) and the vertex of the aortic arch (VA). In-plane vessel displacement was determined by region of interest analysis. Results: Mean displacement was 5.2 ± 1.7 mm (AAo), 1.6 ± 1.0 mm (VA), 0.9 ± 0.4 mm (DAoT), and 1.1 ± 0.4 mm (DAoF). This indicated a significant reduction of displacement from AAo to VA and DAoT (p < 0.05). The direction of displacement was anterior for AAo and cranial for VA. Conclusion: In CADB, the thoracic aorta undergoes a heartbeat-related displacement that exhibits an unbalanced distribution of magnitude and direction along the thoracic vessel course. Since consecutive traction forces on the aortic wall have to be assumed, these observations may have implications for the pathogenesis of and treatment strategies for CADB.

  10. A Matrix Model for Heterotic $Spin(32)/Z_2$ and Type I String Theory

    OpenAIRE

    Krogh, Morten

    1998-01-01

    We consider Heterotic string theories in the DLCQ. We derive that the matrix model of the Spin(32)/$Z_2$ Heterotic theory is the theory living on N D-strings in type I wound on a circle with no Spin(32)/$Z_2$ Wilson line on the circle. This is an O(N) gauge theory. We rederive the matrix model for the $E_8\\times E_8$ Heterotic string theory, explicitly taking care of the Wilson line around the lightlike circle. The result is the same theory as for Spin(32)/$Z_2$ except that now there ...

  11. Quantification of the physiochemical constraints on the export of spider silk proteins by Salmonella type III secretion

    Directory of Open Access Journals (Sweden)

    Voigt Christopher A

    2010-10-01

    Background: The type III secretion system (T3SS) is a molecular machine in gram-negative bacteria that exports proteins through both membranes to the extracellular environment. It has been previously demonstrated that the T3SS encoded in Salmonella Pathogenicity Island 1 (SPI-1) can be harnessed to export recombinant proteins. Here, we demonstrate the secretion of a variety of unfolded spider silk proteins and use these data to quantify the constraints of this system with respect to the export of recombinant protein. Results: To test how the timing and level of protein expression affect secretion, we designed a hybrid promoter that combines an IPTG-inducible system with a natural genetic circuit that controls effector expression in Salmonella (psicA). LacO operators are placed in various locations in the psicA promoter; the optimal induction occurs when a single operator is placed at +5nt (234-fold), and a lower basal level of expression is achieved when a second operator is placed at -63nt to take advantage of DNA looping. Using this tool, we find that the secretion efficiency (protein secreted divided by total expressed) is constant as a function of total expression. We also demonstrate that the secretion flux peaks at 8 hours. We then use whole-gene DNA synthesis to construct codon-optimized spider silk genes for full-length (3129 amino acids) Latrodectus hesperus dragline silk, Bombyx mori cocoon silk, and Nephila clavipes flagelliform silk, and PCR is used to create eight truncations of these genes. These proteins are all unfolded polypeptides and they encompass a variety of lengths, charges, and amino acid compositions. We find that proteins shorter than 550 amino acids secrete reliably, and the probability declines significantly beyond ~700 amino acids. There is also a charge optimum at -2.4, and secretion efficiency declines for very positively or negatively charged proteins. There is no significant correlation with hydrophobicity.
Conclusions We show that the natural system encoded in SPI-1 only produces high titers of secreted protein for 4-8 hours when the natural psicA promoter is used to drive expression. Secretion efficiency can be high, but declines for charged or large sequences. A quantitative characterization of these constraints will facilitate the effective use and engineering of this system.
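The reported length and charge constraints lend themselves to a quick back-of-the-envelope screen. The sketch below is illustrative only, not taken from the paper: it approximates net charge by simple residue counting (K/R as +1, D/E as -1; histidine and the termini are ignored) and applies the reported ~550-amino-acid reliability threshold. The silk-like repeat sequence is hypothetical.

```python
# Illustrative sketch (not the paper's method): approximate net charge of a
# polypeptide at neutral pH by counting charged residues, and apply the
# empirical length threshold reported in the study (~550 aa secrete reliably,
# declining beyond ~700 aa) as a rough secretability screen.

def net_charge(seq: str) -> int:
    """Approximate net charge at neutral pH (simplified residue counting)."""
    seq = seq.upper()
    positive = sum(seq.count(aa) for aa in "KR")  # lysine, arginine: +1 each
    negative = sum(seq.count(aa) for aa in "DE")  # aspartate, glutamate: -1 each
    return positive - negative

def likely_to_secrete(seq: str) -> bool:
    """Rough screen based on the length constraint reported in the abstract."""
    return len(seq) <= 550

silk_fragment = "GPGGAGQQGPGSQGPGSGGQQGPGGA"  # hypothetical glycine-rich repeat
print(net_charge(silk_fragment))        # 0 (no charged residues in this repeat)
print(likely_to_secrete(silk_fragment))  # True (26 residues)
```

A real screen would also weigh the charge optimum near -2.4 noted above; this sketch only illustrates the counting.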

  12. Krichever-Novikov type algebras theory and applications

    CERN Document Server

    Schlichenmaier, Martin

    2014-01-01

    Krichever and Novikov introduced certain classes of infinite dimensional Lie algebras to extend the Virasoro algebra and its related algebras to Riemann surfaces of higher genus. The author of this book generalized and extended them to a more general setting needed by the applications. Examples of applications are Conformal Field Theory, Wess-Zumino-Novikov-Witten models, moduli space problems, integrable systems, Lax operator algebras, and deformation theory of Lie algebras. Furthermore they constitute an important class of infinite dimensional Lie algebras which, due to their geometric origin, are ...

  13. The ultimate speed implied by theories of Weber's type

    International Nuclear Information System (INIS)

    Since in the last few years there has been renewed interest in the laws of Ampere for the force between current elements and of Weber for the force between charges, we analyze the limiting velocity which appears in Weber's law. Then we make the same analysis for Phipps' potential and for generalizations of it. Comparing the results with the relativistic calculation, we find that these theories can yield c for the ultimate speed of the charges or for the ultimate relative speed between the charges, but not for both simultaneously, as is the case in the special theory of relativity. 59 refs., 2 figs

  14. Relative quantification and detection of different types of infectious bursal disease virus in bursa of Fabricius and cloacal swabs using real time RT-PCR SYBR green technology

    DEFF Research Database (Denmark)

    Li, Yiping; Handberg, K.J.

    2007-01-01

    In the present study, different types of infectious bursal disease virus (IBDV), the virulent strain DK01, the classic strain F52/70 and the vaccine strain D78, were quantified and detected in infected bursa of Fabricius (BF) and cloacal swabs using quantitative real time RT-PCR with SYBR green dye. For selection of a suitable internal control gene, real time PCR parameters were evaluated for three candidate genes: glyceraldehyde-3-phosphate dehydrogenase (GAPDH), 28S rRNA and beta-actin. Based on this, beta-actin was selected as an internal control for quantification of IBDVs in BF. All BF samples with D78, DK01 or F52/70 inoculation were detected as virus positive at day 1 post inoculation (p.i.). The D78 viral load peaked at day 4 and day 8 p.i., while the DK01 and F52/70 viral loads were relatively high at day 2 p.i. In cloacal swabs, viruses were detectable at day 2 p.i. for DK01 and F52/70, and at day 8 p.i. for D78. Importantly, the primer sets were specific: the D78 primer set gave no amplification of F52/70 and DK01, and the DK01 primer set gave no amplification of D78; thus DK01 and D78 could be quantified simultaneously in dually infected chickens by use of these two sets of primers. The method described here is robust and may serve as a useful high-capacity tool for diagnostics as well as for viral pathogenesis studies.
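The abstract normalizes viral load to an internal control gene (beta-actin) but does not spell out the arithmetic. A common approach for this kind of relative quantification is the 2^-ΔΔCt method; the sketch below is a generic illustration with made-up Ct values, not the paper's actual computation.

```python
# Hedged sketch of the standard 2^-ddCt relative-quantification formula.
# Ct values below are invented for illustration; the paper's own numbers
# are not reproduced here.

def delta_delta_ct(ct_target_sample, ct_ref_sample,
                   ct_target_calibrator, ct_ref_calibrator):
    """Fold change of target vs. calibrator, normalized to a reference gene.

    dCt(sample)     = Ct(target, sample)     - Ct(reference, sample)
    dCt(calibrator) = Ct(target, calibrator) - Ct(reference, calibrator)
    fold change     = 2^-(dCt(sample) - dCt(calibrator))
    """
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_calibrator = ct_target_calibrator - ct_ref_calibrator
    return 2.0 ** (-(d_ct_sample - d_ct_calibrator))

# Example: viral Ct 22 vs. beta-actin Ct 18 at day 4; calibrator (day 1): 28 vs. 18
fold = delta_delta_ct(22.0, 18.0, 28.0, 18.0)
print(fold)  # 64.0, i.e. a 2^6 = 64-fold increase in normalized viral load
```

The method assumes roughly equal amplification efficiencies for target and reference; efficiency-corrected variants exist when that assumption fails.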

  15. A matrix model for heterotic Spin(32)/Z2 and type I string theory

    International Nuclear Information System (INIS)

    We consider heterotic string theories in the DLCQ. We derive that the matrix model of the Spin(32)/Z2 heterotic theory is the theory living on N D-strings in type I wound on a circle with no Spin(32)/Z2 Wilson line on the circle. This is an O(N) gauge theory. We rederive the matrix model for the E8xE8 heterotic string theory, explicitly taking care of the Wilson line around the lightlike circle. The result is the same theory as for Spin(32)/Z2 except that now there is a Wilson line on the circle. We also see that the integer N labeling the sector of the O(N) matrix model is not just the momentum around the lightlike circle, but a shifted momentum depending on the Wilson line. We discuss aspects of level matching, GSO projections, and why, from the point of view of matrix theory, the E8xE8 theory, and not the Spin(32)/Z2, develops an 11th dimension for strong coupling. Furthermore a matrix theory for type I is derived. This is again the O(N) theory living on the D-strings of type I. For small type I coupling the system is 0+1-dimensional quantum mechanics

  16. Initial layer theory and model equations of Volterra type

    International Nuclear Information System (INIS)

    It is demonstrated here that there exist initial layers to singularly perturbed Volterra equations whose thicknesses are not of order of magnitude O(ε) as ε → 0. It is also shown that the initial layer theory is extremely useful because it allows one to construct an approximate solution to an equation which is almost identical to the exact solution. (author)

  17. Small instantons, del Pezzo surfaces and type I' theory

    International Nuclear Information System (INIS)

    Small instantons of exceptional groups arise geometrically by a collapsing del Pezzo surface in a CY. We use this to explain the physics of a 4-brane probe in type I' compactification to nine dimensions. (orig.)

  18. Algebraic theory of type-and-effect systems

    OpenAIRE

    Kammar, Ohad

    2014-01-01

    We present a general semantic account of Gifford-style type-and-effect systems. These type systems provide lightweight static analyses annotating program phrases with the sets of possible computational effects they may cause, such as memory access and modification, exception raising, and non-deterministic choice. The analyses are used, for example, to justify the program transformations typically used in optimising compilers, such as code reordering and inlining. Despite their ...

  19. On axionic dark matter in Type IIA string theory

    OpenAIRE

    Honecker, Gabriele; Staessens, Wieland

    2013-01-01

    We investigate viable scenarios with various axions in the context of supersymmetric field theory and in globally consistent D-brane models. The Peccei-Quinn symmetry is associated with an anomalous U(1) symmetry, which acquires mass at the string scale but remains as a perturbative global symmetry at low energies. The origin of the scalar Higgs-axion potential from F-, D- and soft breaking terms is derived, and two Standard Model examples of global intersecting D6-brane mod...

  20. A Matrix Model for Heterotic $Spin(32)/Z_2$ and Type I String Theory

    CERN Document Server

    Krogh, M

    1999-01-01

    We consider Heterotic string theories in the DLCQ. We derive that the matrix model of the Spin(32)/$Z_2$ Heterotic theory is the theory living on N D-strings in type I wound on a circle with no Spin(32)/$Z_2$ Wilson line on the circle. This is an O(N) gauge theory. We rederive the matrix model for the $E_8\\times E_8$ Heterotic string theory, explicitly taking care of the Wilson line around the lightlike circle. The result is the same theory as for Spin(32)/$Z_2$ except that now there is a Wilson line on the circle. We also see that the integer N labeling the sector of the O(N) matrix model is not just the momentum around the lightlike circle, but a shifted momentum depending on the Wilson line. We discuss aspects of level matching, GSO projections, and why, from the point of view of matrix theory, the $E_8\\times E_8$ theory, and not the Spin(32)/$Z_2$, develops an 11th dimension for strong coupling. Furthermore a matrix theory for type I is derived. This is again the O(N) theory living on the D-strings of type I. For small type I coupling the system is 0+1-dimensional quantum mechanics.

  1. Surveying problem solution with theory and objective type questions

    CERN Document Server

    Chandra, AM

    2005-01-01

    The book provides a lucid and step-by-step treatment of the various principles and methods for solving problems in land surveying. Each chapter starts with basic concepts and definitions, then the solution of typical field problems, and ends with objective type questions. The book explains errors in survey measurements and their propagation. Survey measurements are detailed next. These include horizontal and vertical distance, slope, elevation, angle, and direction. Measurements using stadia tacheometry and EDM are then highlighted, followed by various types of levelling problems. Traversing is then explained, followed by a detailed discussion on adjustment of survey observations and then triangulation and trilateration.
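As a small worked example of the error-propagation topic the book covers (not taken from the book itself), the sketch below propagates independent errors in a slope distance S and a vertical angle a into the derived horizontal distance H = S·cos(a), using the standard first-order propagation formula var(H) = (∂H/∂S)²·var(S) + (∂H/∂a)²·var(a).

```python
# Illustrative error-propagation sketch for a derived surveying quantity.
# H = S*cos(a): horizontal distance from slope distance S and vertical angle a.
import math

def horizontal_distance_error(S, sigma_S, a_rad, sigma_a_rad):
    """Standard deviation of H = S*cos(a) from independent errors in S and a."""
    dH_dS = math.cos(a_rad)        # partial derivative w.r.t. slope distance
    dH_da = -S * math.sin(a_rad)   # partial derivative w.r.t. vertical angle
    return math.sqrt((dH_dS * sigma_S) ** 2 + (dH_da * sigma_a_rad) ** 2)

# 100 m slope distance measured to +/- 5 mm, 10 deg vertical angle to +/- 10"
sigma_H = horizontal_distance_error(100.0, 0.005,
                                    math.radians(10.0),
                                    math.radians(10.0 / 3600.0))
print(round(sigma_H * 1000, 2), "mm")  # prints: 5.0 mm
```

At this geometry the distance error dominates; steeper sightlines shift the balance toward the angular term, which is exactly the kind of trade-off propagation analysis makes visible.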

  2. Photometric colors of late-type giants: theory versus observations

    OpenAIRE

    Kucinskas, A; Hauschildt, P. H.; Ludwig, H. -G; Brott, I; Vansevicius, V.; Lindegren, L.; Tanabe, T.; Allard, F.

    2005-01-01

    To assess the current status in the theoretical modeling of the spectral properties of late-type giants, we provide a comparison of synthetic photometric colors of late-type giants (calculated with PHOENIX, MARCS and ATLAS model atmospheres) with observations, at [M/H]=0.0 and -2.0. Overall, there is a good agreement between observed and synthetic colors, and synthetic colors and published Teff-color relations, both at [M/H]=0.0 and -2.0. Deviations from the observed trends ...

  3. The Classification of Gun’s Type Using Image Recognition Theory

    OpenAIRE

    M. L. Kulthon Kasemsan

    2014-01-01

    The research aims to develop the Gun's Type and Models Classification (GTMC) system using image recognition theory. It is expected that this study can serve as a guide for law enforcement agencies or at least as a catalyst for similar research. Master image storage and image recognition are the two main processes. The procedure involves the original images, scaling, grayscale conversion, the Canny edge detector, the SUSAN corner detector, block-matching templates, and finally gun type's recogniti...

  4. On the Nonlinear Theory of Viscoelasticity of Differential Type

    OpenAIRE

    Pucci, Edvige; Saccomandi, Giuseppe

    2011-01-01

    We consider nonlinear viscoelastic materials of differential type and for some special models we derive exact solutions of initial boundary value problems. These exact solutions are used to investigate the reasons of non-existence of global solutions for such equations.

  5. Godel Type Metrics in Einstein-Aether Theory II: Nonflat Background in Arbitrary Dimensions

    CERN Document Server

    Gurses, Metin

    2015-01-01

    It was previously proved that the Gödel-type metrics with flat three-dimensional background metric solve exactly the field equations of the Einstein-Aether theory in four dimensions. We generalize this result by showing that the stationary Gödel-type metrics with nonflat background in $D$ dimensions solve exactly the field equations of the Einstein-Aether theory. The reduced field equations are the $(D-1)$-dimensional Euclidean Ricci-flat and the $(D-1)$-dimensional source-free Maxwell equations, and the parameters of the theory are left free except $c_{1}-c_{3}=1$. We give a method to produce exact solutions of the Einstein-Aether theory from the Gödel-type metrics in $D$ dimensions. By using this method, we present explicit exact solutions to the theory by considering the particular cases: ($D-1$)-dimensional Euclidean flat, conformally flat, and Tangherlini backgrounds.

  6. Eady Solitary Waves: A Theory of Type B Cyclogenesis.

    Science.gov (United States)

    Mitsudera, Humio

    1994-11-01

    Localized baroclinic instability in a weakly nonlinear, long-wave limit using an Eady model is studied. The resulting evolution equations have a form of the KdV type, including extra terms representing linear coupling. Baroclinic instability is triggered locally by the collision between two neutral solitary waves (one trapped at the upper boundary and the other at the lower boundary) if their incident amplitudes are sufficiently large. This characteristic is explained from the viewpoint of resonance when the relative phase speed, which depends on the amplitudes, is less than a critical value. The upper and lower disturbances grow in a coupled manner (resembling a normal-mode structure) initially, but they reverse direction slowly as the amplitudes increase, and eventually separate from each other. The motivation of this study is to investigate a type of extratropical cyclogenesis that involves a preexisting upper trough (termed Type B development) from the viewpoint of resonant solitary waves. Two cases are of particular interest. First, the author examines a case where an upper disturbance preexists over an undisturbed low-level waveguide. The solitary waves exhibit behavior similar to that conceived by Hoskins et al. for Type B development; the lower disturbance is forced one-sidedly by a preexisting upper disturbance initially, but in turn forces the latter once the former attains a sufficient amplitude, thus resulting in mutual reinforcement. Second, if a weak perturbation exists at the surface ahead of the preexisting strong upper disturbance, baroclinic instability is triggered when the two waves interact. Even though the amplitude of the lower disturbance is initially much weaker, it is intensified quickly and catches up with the amplitude of the upper disturbance, so that the coupled vertical structure eventually resembles that of an unstable normal mode. These results describe the observed behavior in Type B atmospheric cyclogenesis quite well.

  7. Cosmic web-type classification using decision theory

    CERN Document Server

    Leclercq, Florent; Wandelt, Benjamin

    2015-01-01

    We propose a decision criterion for segmenting the cosmic web into different structure types (voids, sheets, filaments and clusters) on the basis of their respective probabilities and the strength of data constraints. Our approach is inspired by an analysis of games of chance where the gambler only plays if a positive expected net gain can be achieved based on some degree of privileged information. The result is a general solution for classification problems in the face of uncertainty, including the option of not committing to a class for a candidate object. As an illustration, we produce high-resolution maps of web-type constituents in the nearby Universe as probed by the Sloan Digital Sky Survey main galaxy sample. Other possible applications include the selection and labeling of objects in catalogs derived from astronomical survey data.

  8. Cosmic web-type classification using decision theory

    Science.gov (United States)

    Leclercq, F.; Jasche, J.; Wandelt, B.

    2015-04-01

    Aims: We propose a decision criterion for segmenting the cosmic web into different structure types (voids, sheets, filaments, and clusters) on the basis of their respective probabilities and the strength of data constraints. Methods: Our approach is inspired by an analysis of games of chance where the gambler only plays if a positive expected net gain can be achieved based on some degree of privileged information. Results: The result is a general solution for classification problems in the face of uncertainty, including the option of not committing to a class for a candidate object. As an illustration, we produce high-resolution maps of web-type constituents in the nearby Universe as probed by the Sloan Digital Sky Survey main galaxy sample. Other possible applications include the selection and labelling of objects in catalogues derived from astronomical survey data.
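A minimal sketch of this style of decision rule follows; it is a generic expected-utility formulation, not the paper's exact criterion, and the gain/loss values are hypothetical. The rule commits to the most probable of the four web types only when the expected net gain of "playing" is positive, and otherwise declines to classify.

```python
# Hedged sketch of a gambler-style classification rule: bet on the most
# probable class only if the expected net gain is positive. Gain and loss
# are hypothetical payoffs, not values from the paper.

def classify(probabilities, gain=1.0, loss=1.0):
    """Return the index of the winning class, or None if not committing.

    probabilities: posterior probabilities, e.g. (void, sheet, filament, cluster).
    Expected net gain of betting on class k: p_k * gain - (1 - p_k) * loss.
    """
    best = max(range(len(probabilities)), key=lambda k: probabilities[k])
    p = probabilities[best]
    expected_gain = p * gain - (1.0 - p) * loss
    return best if expected_gain > 0.0 else None

print(classify([0.70, 0.10, 0.15, 0.05]))  # 0 -> commit: "void"
print(classify([0.30, 0.25, 0.25, 0.20]))  # None -> data constraints too weak
```

Raising the loss relative to the gain makes the rule more conservative, enlarging the "undecided" region; this mirrors the role of privileged information in the gambling analogy above.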

  9. Counting BPS Blackholes in Toroidal Type II String Theory

    OpenAIRE

    Maldacena, Juan; Moore, Gregory; Strominger, Andrew

    1999-01-01

    We derive a $U$-duality invariant formula for the degeneracies of BPS multiplets in a D1-D5 system for toroidal compactification of the type II string. The elliptic genus for this system vanishes, but it is found that BPS states can nevertheless be counted using a certain topological partition function involving two insertions of the fermion number operator. This is possible due to four extra toroidal U(1) symmetries arising from a Wigner contraction of a large $\\mathcal{N}=4$ algeb...

  10. Types of two-dimensional $N = 4$ superconformal field theories

    Indian Academy of Sciences (India)

    Abbas Ali

    2003-12-01

    Various types of $N = 4$ superconformal symmetries in two dimensions are considered. It is proposed that apart from the well-known cases of $SU(2)$ and $SU(2)\\times SU(2)\\times U(1)$, their Kac–Moody symmetry can also be $SU(2)\\times (U(1))^{4}$. Operator product expansions for the last case are derived. A complete free field realization for the same is obtained.

  11. On the homotopy theory of n-types

    OpenAIRE

    Biedermann, Georg

    2006-01-01

    An n-truncated model structure on simplicial (pre-)sheaves is described having as weak equivalences maps that induce isomorphisms on certain homotopy sheaves only up to degree n. Starting from one of Jardine's intermediate model structures we construct such an n-type model structure via Bousfield-Friedlander localization and exhibit useful generating sets of trivial cofibrations. Injectively fibrant objects in these categories are called n-hyperstacks. The whole setup can co...

  12. Lifshitz-type SU(N) lattice gauge theory in five dimensions

    CERN Document Server

    Kanazawa, Takuya

    2015-01-01

    We present a lattice formulation of non-Abelian Lifshitz-type gauge theories. Due to anisotropic scaling of space and time, the theory is asymptotically free even in five dimensions. We show results of Monte Carlo simulations that suggest a smooth approach to the continuum limit.

  13. Sex and Theories of Deviance: Toward a Functional Theory of Deviant Type-Scripts

    Science.gov (United States)

    Harris, Anthony R.

    1977-01-01

    Asserts that the continuing failure to consider women has critically weakened contemporary criminal deviance theory, examines the major paradigms in criminal deviance, argues that the inclusion of sex as a variable has more or less disastrous consequences for those paradigms, and argues that the primary purpose of labeling theory is to detect…

  14. Ginzburg-Landau-type theory of spin superconductivity.

    Science.gov (United States)

    Bao, Zhi-qiang; Xie, X C; Sun, Qing-feng

    2013-01-01

    Spin superconductivity is a recently proposed analogue of conventional charge superconductivity, in which spin currents flow without dissipation but charge currents do not. Here we derive a universal framework for describing the properties of a spin superconductor along similar lines to the Ginzburg-Landau equations that describe conventional superconductors, and show that the second of these Ginzburg-Landau-type equations is equivalent to a generalized London equation. Just as the GL equations enabled researchers to explore the behaviour of charge superconductors, our Ginzburg-Landau-type equations enable us to make a number of non-trivial predictions about the potential behaviour of a putative spin superconductor. They enable us to calculate the super spin current in a spin superconductor under a uniform electric field, or that induced by a thin conducting wire. Moreover, they allow us to predict the emergence of new phenomena, including the spin-current Josephson effect, in which a time-independent magnetic field induces a time-dependent spin current. PMID:24335888
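For orientation, the conventional charge-superconductor Ginzburg-Landau free energy that the spin analogue parallels is, in its standard textbook form (not taken from this paper),

```latex
F = F_n + \alpha |\psi|^2 + \frac{\beta}{2}|\psi|^4
    + \frac{1}{2m^*}\left|\left(-i\hbar\nabla - \frac{e^*}{c}\mathbf{A}\right)\psi\right|^2
    + \frac{|\mathbf{B}|^2}{8\pi} .
```

Minimizing F with respect to the order parameter ψ and the vector potential A yields the two Ginzburg-Landau equations; in the spin case, the abstract states that the second of the analogous equations reduces to a generalized London equation.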

  15. A note on half-supersymmetric bound states in M-theory and type IIA

    OpenAIRE

    Larsson, Henric

    2001-01-01

    By using O(7,7) transformations to deform D6-branes, we obtain half-supersymmetric bound state solutions of type IIA supergravity, containing D6, D4, D2, D0, F1-branes and waves. We lift the solutions to M-theory, which gives half-supersymmetric M-theory bound states, e.g. KK6-M5-M5-M5-M2-M2-M2-MW. We also take near horizon limits for the type IIA solutions, which gives supergravity duals of 7-dimensional non-commutative open string theory (with space-time and space-space no...

  16. A note on half-supersymmetric bound states in M-theory and type IIA

    CERN Document Server

    Larsson, H

    2002-01-01

    By using O(7,7) transformations to deform D6-branes, we obtain half-supersymmetric bound state solutions of type IIA supergravity, containing D6, D4, D2, D0, F1-branes and waves. We lift the solutions to M-theory, which gives half-supersymmetric M-theory bound states, e.g. KK6-M5-M5-M5-M2-M2-M2-MW. We also take near horizon limits for the type IIA solutions, which gives supergravity duals of 7-dimensional non-commutative open string theory (with space-time and space-space non-commutativity), non-commutative Yang-Mills theory (with space-space and light-like non-commutativity) and an open D4-brane theory.

  17. Digital games for type 1 and type 2 diabetes: underpinning theory with three illustrative examples.

    Science.gov (United States)

    Kamel Boulos, Maged N; Gammon, Shauna; Dixon, Mavis C; MacRury, Sandra M; Fergusson, Michael J; Miranda Rodrigues, Francisco; Mourinho Baptista, Telmo; Yang, Stephen P

    2015-01-01

    Digital games are an important class of eHealth interventions in diabetes, made possible by the Internet and a good range of affordable mobile devices (eg, mobile phones and tablets) available to consumers these days. Gamifying disease management can help children, adolescents, and adults with diabetes to better cope with their lifelong condition. Gamification and social in-game components are used to motivate players/patients and positively change their behavior and lifestyle. In this paper, we start by presenting the main challenges facing people with diabetes-children/adolescents and adults-from a clinical perspective, followed by three short illustrative examples of mobile and desktop game apps and platforms designed by Ayogo Health, Inc. (Vancouver, BC, Canada) for type 1 diabetes (one example) and type 2 diabetes (two examples). The games target different age groups with different needs-children with type 1 diabetes versus adults with type 2 diabetes. The paper is not meant to be an exhaustive review of all digital game offerings available for people with type 1 and type 2 diabetes, but rather to serve as a taster of a few of the game genres on offer today for both types of diabetes, with a brief discussion of (1) some of the underpinning psychological mechanisms of gamified digital interventions and platforms as self-management adherence tools, and more, in diabetes, and (2) some of the hypothesized potential benefits that might be gained from their routine use by people with diabetes. More research evidence from full-scale evaluation studies is needed and expected in the near future that will quantify, qualify, and establish the evidence base concerning this gamification potential, such as what works in each age group/patient type, what does not, and under which settings and criteria. PMID:25791276

  18. Abelian gauge symmetries and fluxed instantons in compactifications of type IIB and F-theory

    Energy Technology Data Exchange (ETDEWEB)

    Buenaventura Kerstan, Max Bromo

    2013-11-13

    We discuss the role of Abelian gauge symmetries in type IIB orientifold compactifications and their F-theory uplift. Particular emphasis is placed on U(1)s which become massive through the geometric Stückelberg mechanism in type IIB. We present a proposal on how to take such geometrically massive U(1)s and the associated fluxes into account in the Kaluza-Klein reduction of F-theory with the help of non-harmonic forms. Evidence for this proposal is obtained by working out the F-theory effective action including such non-harmonic forms and matching the results with the known type IIB expressions. We furthermore discuss how world-volume fluxes on D3-brane instantons affect the instanton charge with respect to U(1) gauge symmetries and the chiral zero mode spectrum. The classical partition function of M5-instantons in F-theory is discussed and compared with the type IIB results for D3-brane instantons. The type IIB match allows us to determine the correct M5 partition function. Selection rules for the absence of chiral charged zero modes on M5-instantons in backgrounds with G_4 flux are discussed and compared with the type IIB results. The dimensional reduction of the democratic formulation of M-theory is presented in the appendix.

  19. Cartan's equations define a topological field theory of the BF type

    International Nuclear Information System (INIS)

    Cartan's first and second structure equations together with the first and second Bianchi identities can be interpreted as equations of motion for the tetrad, the connection and a set of two-form fields T^I and R^I_J. From this viewpoint, these equations define by themselves a field theory. Restricting the analysis to four-dimensional spacetimes (keeping gravity in mind), it is possible to give an action principle of the BF type from which these equations of motion are obtained. The action turns out to be equivalent to a linear combination of the Nieh-Yan, Pontrjagin, and Euler classes, and so the field theory defined by the action is topological. Once Einstein's equations are added, the resulting theory is general relativity. Therefore, the current results show that the relationship between general relativity and topological field theories of the BF type is also present in the first-order formalism for general relativity.

  20. Bianchi Type-I, V and VIo models in modified generalized scalar–tensor theory

    Indian Academy of Sciences (India)

    T Singh; R Chaubey

    2007-08-01

    In modified generalized scalar–tensor (GST) theory, the cosmological term is a function of the scalar field and its derivatives $\\dot{\\phi}^{2}$. We obtain exact solutions of the field equations in Bianchi Type-I, V and VIo space–times. The evolution of the scale factor, the scalar field and the cosmological term has been discussed. The Bianchi Type-I model has been discussed in detail. Further, Bianchi Type-V and VIo models can be studied on the lines similar to Bianchi Type-I model.

  1. A construction principle for ADM-type theories in maximal slicing gauge

    OpenAIRE

    Gomes, Henrique

    2013-01-01

    The differing concepts of time in general relativity and quantum mechanics are widely blamed as the main culprits in our persistent failure to find a complete theory of quantum gravity. Here we address this issue by constructing ADM-type theories \\emph{in a particular time gauge} directly from first principles. The principles are expressed as conditions on phase space constraints: we search for two sets of spatially covariant constraints, which generate symmetries (are f...

  2. Comparison of constructive multi-typed theory with subsystems of second order arithmetic

    OpenAIRE

    Kachapova, Farida

    2015-01-01

    This paper describes an axiomatic theory BT for constructive mathematics. BT has a predicative comprehension axiom for a countable number of set types and the usual combinatorial operations. BT has intuitionistic logic, is consistent with classical logic, and has such constructive features as consistency with the formal Church thesis and the existence and disjunction properties. BT is mutually interpretable with a so-called theory of arithmetical truth PATr and with a second-order arith...

  3. Frobenius type and CV-structures for Donaldson-Thomas theory and a convergence property

    CERN Document Server

    Barbieri, Anna

    2015-01-01

    We rephrase some well-known results in Donaldson-Thomas theory in terms of (formal families of) Frobenius type and CV-structures on a vector bundle in the sense of Hertling. We study these structures in an abstract setting, and prove a convergence result which is relevant to the case of triangulated categories. An application to physical field theory is also briefly discussed.

  4. Dynamical study of the empty Bianchi type I model in generalised scalar-tensor theory

    CERN Document Server

    Fay, S

    2000-01-01

    A dynamical study of the generalised scalar-tensor theory in the empty Bianchi type I model is made. We use a method from which we derive the signs of the first and second derivatives of the metric functions, and we examine three different theories that can all tend towards relativistic behaviour at late times. We determine conditions under which the dynamics are expanding and decelerating at late times.

  5. Abelian gauge symmetries and fluxed instantons in compactifications of type IIB and F-theory

    CERN Document Server

    Kerstan, Max

    2014-01-01

    We discuss the role of Abelian gauge symmetries in type IIB orientifold compactifications and their F-theory uplift. Particular emphasis is placed on U(1)s which become massive through the geometric Stückelberg mechanism in type IIB. We present a proposal on how to take such geometrically massive U(1)s and the associated fluxes into account in the Kaluza-Klein reduction of F-theory with the help of non-harmonic forms. Evidence for this proposal is obtained by working out the F-theory effective action including such non-harmonic forms and matching the results with the known type IIB expressions. We furthermore discuss how world-volume fluxes on D3-brane instantons affect the instanton charge with respect to U(1) gauge symmetries and the chiral zero mode spectrum. The classical partition function of M5-instantons in F-theory is discussed and compared with the type IIB results for D3-brane instantons. The type IIB match allows us to determine the correct M5 partition function. Selection rules for the absence o...

  6. Nonperturbative type IIB model building in the F-theory framework

    Energy Technology Data Exchange (ETDEWEB)

    Jurke, Benjamin Helmut Friedrich

    2011-02-28

    This dissertation is concerned with the topic of non-perturbative string theory, which is generally considered to be the most promising approach to a consistent description of quantum gravity. The five known 10-dimensional perturbative string theories are all interconnected by numerous dualities, such that an underlying non-perturbative 11-dimensional theory, called M-theory, is postulated. Due to several technical obstacles, little is known about the fundamental objects in this theory. There exists an alternative non-perturbative description to type IIB string theory, namely F-theory. Here the SL(2;Z) self-duality of IIB theory is geometrized in the form of an elliptic fibration over the space-time. Moreover, higher-dimensional objects like 7-branes are included via singularities into the geometric picture. This formally elegant description, however, requires significant technical effort for the construction of suitable compactification geometries, as many different aspects necessarily have to be dealt with at the same time. On the other hand, the generation of essential GUT building blocks like certain Yukawa couplings or spinor representations is easier compared to perturbative string theory. The goal of this study is therefore to formulate a unified theory within the framework of F-theory that satisfies basic phenomenological constraints. Within this thesis, at first E3-brane instantons in type IIB string theory - 4-dimensional objects that are entirely wrapped around the invisible dimensions of space-time - are matched with M5-branes in F-theory. Such objects are of great importance in the generation of critical Yukawa couplings or the stabilization of the free parameters of a theory. Certain properties of M5-branes then allow one to derive a new criterion for E3-branes to contribute to the superpotential.
In the aftermath of this analysis, several compactification geometries are constructed and checked for basic properties that are relevant for semi-realistic unified model building. An important aspect is the proper handling of the gauge flux on the 7-branes. Via the spectral cover description - which at first requires further refinements - chiral matter can be generated and the unified gauge group can be broken to the Standard Model. Ultimately, in this thesis an explicit unified model based on the gauge group SU(5) is constructed within the F-theory framework, such that an acceptable phenomenology and the observed three chiral matter generations are obtained. (orig.)

  7. Nonperturbative type IIB model building in the F-theory framework

    International Nuclear Information System (INIS)

    This dissertation is concerned with the topic of non-perturbative string theory, which is generally considered to be the most promising approach to a consistent description of quantum gravity. The five known 10-dimensional perturbative string theories are all interconnected by numerous dualities, such that an underlying non-perturbative 11-dimensional theory, called M-theory, is postulated. Due to several technical obstacles, little is known about the fundamental objects in this theory. There exists an alternative non-perturbative description to type IIB string theory, namely F-theory. Here the SL(2;Z) self-duality of IIB theory is geometrized in the form of an elliptic fibration over the space-time. Moreover, higher-dimensional objects like 7-branes are included via singularities into the geometric picture. This formally elegant description, however, requires significant technical effort for the construction of suitable compactification geometries, as many different aspects necessarily have to be dealt with at the same time. On the other hand, the generation of essential GUT building blocks like certain Yukawa couplings or spinor representations is easier compared to perturbative string theory. The goal of this study is therefore to formulate a unified theory within the framework of F-theory that satisfies basic phenomenological constraints. Within this thesis, at first E3-brane instantons in type IIB string theory - 4-dimensional objects that are entirely wrapped around the invisible dimensions of space-time - are matched with M5-branes in F-theory. Such objects are of great importance in the generation of critical Yukawa couplings or the stabilization of the free parameters of a theory. Certain properties of M5-branes then allow one to derive a new criterion for E3-branes to contribute to the superpotential.
In the aftermath of this analysis, several compactification geometries are constructed and checked for basic properties that are relevant for semi-realistic unified model building. An important aspect is the proper handling of the gauge flux on the 7-branes. Via the spectral cover description - which at first requires further refinements - chiral matter can be generated and the unified gauge group can be broken to the Standard Model. Ultimately, in this thesis an explicit unified model based on the gauge group SU(5) is constructed within the F-theory framework, such that an acceptable phenomenology and the observed three chiral matter generations are obtained. (orig.)

  8. What does it take for a specific prospect theory type household to engage in risky investment?

    OpenAIRE

    Hlouskova, Jaroslava; Tsigaris, Panagiotis

    2012-01-01

    This research note examines the conditions which will induce a prospect theory type investor, whose reference level is set by 'playing it safe', to invest in a risky asset. The conditions indicate that this type of investor requires a large equity premium to invest in risky assets. However, once she does invest because of a large risk premium, she becomes aggressive and buys/sells till an externally imposed upper/lower bound is reached.

  9. Cosmic string solution in a Born-Infeld type theory of gravity

    International Nuclear Information System (INIS)

    Advances in the formal structure of string theory point to the emergence, and necessity, of a scalar-tensorial theory of gravity. It seems that, at least at high energy scales, Einstein's theory is not enough to explain gravitational phenomena. In other words, the existence of a scalar (gravitational) field acting as a mediator of the gravitational interaction, together with the usual rank-2 tensor field, is a natural prediction of unification models such as supergravity, superstrings and M-theory. This type of modified gravitation was first introduced in a different context in the 1960s, in order to incorporate Mach's principle into relativity, but it has since acquired a different sense in cosmology and gravity theories. Although such unification theories are the most widely accepted, they all live in higher-dimensional spaces. The compactification from these higher dimensions to 4-dimensional physics is not unique, and there exist many effective theories of gravity arising from the unification process, each of which must, of course, satisfy certain predictions. In this paper we deal with one of them, the so-called NDL theory. An important assumption in General Relativity is that all fields interact with gravity in the same way; this is the so-called Strong Equivalence Principle (SEP). It is well established, with good accuracy, that this holds for matter-matter interactions, i.e. the Weak Equivalence Principle (WEP) has been tested. But, until now, there is no direct observational confirmation of this statement for the gravity-gravity interaction. An extension of the field-theoretical description of General Relativity is used to propose an alternative field theory of gravity, in which gravitons propagate in a different spacetime and the velocity of propagation of gravitational waves does not coincide with the predictions of General Relativity. (author)

  10. Preschoolers' Generation of Different Types of Counterfactual Statements and Theory of Mind Understanding

    Science.gov (United States)

    Guajardo, Nicole R.; Turley-Ames, Kandi Jo

    2004-01-01

    Two studies examined associations between theory of mind performance and counterfactual thinking using both antecedent and consequent counterfactual tasks. Moreover, the studies examined children's abilities to generate different types of counterfactual statements in terms of direction and structure. Participants were 3-, 4-, and 5-year-old…

  11. Universal Properties of Type IIB and F-theory Flux Compactifications at Large Complex Structure

    CERN Document Server

    Marsh, M C David

    2015-01-01

    We consider flux compactifications of type IIB string theory and F-theory in which the respective superpotentials at large complex structure are dominated by cubic or quartic terms in the complex structure moduli. In this limit, the low-energy effective theory for the complex structure and axio-dilaton sector exhibits universal properties that are insensitive to the details of the compactification manifold or the flux configuration. We show that there are no vacua in this region and that the spectrum of the Hessian matrix is highly peaked and consists only of three distinct eigenvalues ($0$, $2m_{3/2}^2$ and $8m_{3/2}^2$), independently of the number of moduli. We briefly comment on how the inclusion of Kähler moduli affects these findings. Our results generalise those of Brodie & Marsh [1], in which these universal properties were found in a subspace of the large complex structure limit of type IIB compactifications.

  12. The effective theory of type IIA AdS4 compactifications on nilmanifolds and cosets

    CERN Document Server

    Caviezel, Claudio; Kors, Simon; Lüst, Dieter; Tsimpis, Dimitrios; Zagermann, Marco

    2008-01-01

    We consider string theory compactifications of the form AdS4 x M6 with orientifold six-planes, where M6 is a six-dimensional compact space that is either a nilmanifold or a coset. For all known solutions of this type we obtain the four-dimensional N=1 low energy effective theory by computing the superpotential, the Kaehler potential and the mass spectrum for the light moduli. For the nilmanifold examples we perform a cross-check on the result for the mass spectrum by calculating it alternatively from a direct Kaluza-Klein reduction and find perfect agreement. We show that in all but one of the coset models all moduli are stabilized at the classical level. As an application we show that all but one of the coset models can potentially be used to bypass a recent no-go theorem against inflation in type IIA theory.

  13. E$_{6(6)}$ Exceptional Field Theory: Review and Embedding of Type IIB

    CERN Document Server

    Baguet, Arnaud; Samtleben, Henning

    2015-01-01

    We review E$_{6(6)}$ exceptional field theory with a particular emphasis on the embedding of type IIB supergravity, which is obtained by picking the GL$(5)\times {\rm SL}(2)$ invariant solution of the section constraint. We work out the precise decomposition of the E$_{6(6)}$ covariant fields on the one hand and the Kaluza-Klein-like decomposition of type IIB supergravity on the other. Matching the symmetries, this allows us to establish the precise dictionary between both sets of fields. Finally, we establish on-shell equivalence. In particular, we show how the self-duality constraint for the four-form potential in type IIB is reconstructed from the duality relations in the off-shell formulation of the E$_{6(6)}$ exceptional field theory.

  14. Conference on Geometric Analysis &Conference on Type Theory, Homotopy Theory and Univalent Foundations : Extended Abstracts Fall 2013

    CERN Document Server

    Yang, Paul; Gambino, Nicola; Kock, Joachim

    2015-01-01

    The two parts of the present volume contain extended conference abstracts corresponding to selected talks given by participants at the "Conference on Geometric Analysis" (thirteen abstracts) and at the "Conference on Type Theory, Homotopy Theory and Univalent Foundations" (seven abstracts), both held at the Centre de Recerca Matemàtica (CRM) in Barcelona, from July 1st to 5th, 2013, and from September 23rd to 27th, 2013, respectively. Most of them are brief articles containing preliminary presentations of new results not yet published in regular research journals. The articles are the result of direct collaboration between active researchers in the area, working in a dynamic and productive atmosphere. The first part is about Geometric Analysis and Conformal Geometry; this modern field lies at the intersection of many branches of mathematics (Riemannian, Conformal, Complex or Algebraic Geometry, Calculus of Variations, PDE's, etc) and relates directly to the physical world, since many natural phenomena...

  15. The Classification of Gun’s Type Using Image Recognition Theory

    Directory of Open Access Journals (Sweden)

    M. L. Kulthon Kasemsan

    2014-01-01

    The research aims to develop the Gun's Type and Models Classification (GTMC) system using image recognition theory. It is expected that this study can serve as a guide for law enforcement agencies, or at least as a catalyst for similar research. Master-image storage and image recognition are the two main processes. The procedure involves the original images, scaling, grayscale conversion, the Canny edge detector, the SUSAN corner detector, block matching against templates, and finally recognition of the gun type. Of the 505 images, 80 were control (master) images and 425 were experimental images of the eight gun types. The findings indicated that the GTMC classified images of semi-automatic guns with the highest accuracy, 99.06 percent, while the average accuracy of gun image classification was 81.25 percent.
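
    The pipeline above (scaling, grayscale conversion, edge detection, corner detection, block matching against master templates) can be sketched in miniature. This is not the GTMC system itself: the gradient-magnitude edge step below is a simplified stand-in for the Canny and SUSAN detectors, and the images are tiny synthetic arrays.

```python
import numpy as np

def to_gray(rgb):
    """Grayscale conversion with ITU-R BT.601 luminance weights."""
    return rgb @ np.array([0.299, 0.587, 0.114])

def edge_map(img, thresh=0.2):
    """Simplified edge detector: gradient magnitude plus a threshold
    (a stand-in for the Canny/SUSAN steps of the real pipeline)."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    return (mag > thresh * mag.max()).astype(float)

def match_score(edges, template):
    """Block matching: slide the master template over the edge map and
    return the best negated sum of squared differences; 0 = perfect fit."""
    H, W = edges.shape
    h, w = template.shape
    best = -np.inf
    for i in range(H - h + 1):
        for j in range(W - w + 1):
            ssd = np.sum((edges[i:i + h, j:j + w] - template) ** 2)
            best = max(best, -ssd)
    return best

# Tiny synthetic example: a "master" edge template planted in an image.
gray = to_gray(np.ones((4, 4, 3)))                 # all-white patch -> 1.0
edges_demo = edge_map(np.pad(np.ones((4, 4)), 4))  # edges of a bright square
template = np.zeros((8, 8)); template[0, :] = template[-1, :] = 1.0
image = np.zeros((32, 32)); image[10, 5:13] = 1.0; image[17, 5:13] = 1.0
score_hit = match_score(image, template)           # template occurs: SSD 0
score_miss = match_score(image, np.ones((8, 8)))   # solid block never occurs
```

    A real system would classify an image by computing such scores against the master templates of each gun type and picking the best match.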

  16. Semi-quantification of endolymphatic size on MR imaging after intravenous injection of single-dose gadodiamide. Comparison between two types of processing strategies

    International Nuclear Information System (INIS)

    Many inner ear disorders, including Meniere's disease, are believed to be based on endolymphatic hydrops. We evaluated a newly proposed method for semi-quantification of endolymphatic size in patients with suspected endolymphatic hydrops that uses 2 kinds of processed magnetic resonance (MR) images. Twenty-four consecutive patients underwent heavily T2-weighted (hT2W) MR cisternography (MRC), hT2W 3-dimensional (3D) fluid-attenuated inversion recovery (FLAIR) with inversion time of 2250 ms (positive perilymph image, PPI), and hT2W-3D-IR with inversion time of 2050 ms (positive endolymph image, PEI) 4 hours after intravenous administration of single-dose gadolinium-based contrast material (IV-SD-GBCM). Two images were generated using 2 new methods to process PPI, PEI, and MRC. Three radiologists contoured the cochlea and vestibule on MRC, copied regions of interest (ROIs) onto the 2 kinds of generated images, and semi-quantitatively measured the size of the endolymph for the cochlea and vestibule by setting a threshold pixel value. For each observer, there was a strong linear correlation between the endolymphatic sizes measured on the 2 kinds of generated images, for both the cochlea and the vestibule. The Pearson correlation coefficients (r) were 0.783, 0.734, and 0.800 in the cochlea and 0.924, 0.930, and 0.933 in the vestibule (P<0.001 for all). In both the cochlea and vestibule, repeated-measures analysis of variance showed no statistically significant difference between observers. Use of the 2 kinds of images generated from MR images obtained 4 hours after IV-SD-GBCM might enable semi-quantification of endolymphatic size with little observer dependency. (author)
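
    The agreement analysis reported above reduces to Pearson correlation between paired measurements. A minimal sketch on synthetic data (the sizes, units, and noise levels below are illustrative, not the study's):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical endolymph-size measurements (arbitrary %) for 24 patients,
# read independently by two observers from the same generated images.
true_size = rng.uniform(10, 60, size=24)
observer1 = true_size + rng.normal(0, 3, size=24)  # small reading noise
observer2 = true_size + rng.normal(0, 3, size=24)

# Pearson correlation coefficient r between the two observers' readings.
r = float(np.corrcoef(observer1, observer2)[0, 1])
```

    When the reading noise is small relative to the spread of true sizes, r approaches 1, which is the pattern the study reports for the vestibule.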

  17. The structure of the R8 term in type IIB string theory

    International Nuclear Information System (INIS)

    Based on the structure of the on-shell linearized superspace of type IIB supergravity, we argue that there is a non-BPS 16 derivative interaction in the effective action of type IIB string theory of the form (t8t8R4)2, which we call the R8 interaction. It lies in the same supermultiplet as the G8R4 interaction. Using the Kawai–Lewellen–Tye relation, we analyze the structure of the tree level eight-graviton scattering amplitude in the type IIB theory, which leads to the R8 interaction at the linearized level. This involves an analysis of color-ordered multi-gluon disc amplitudes in the type I theory, which shows an intricate pole structure and transcendentality consistent with various other interactions. Considerations of S-duality show that the R8 interaction receives non-analytic contributions in the string coupling at one and two loops. Apart from receiving perturbative contributions, we show that the R8 interaction receives a non-vanishing contribution in the one D-instanton-anti-instanton background at leading order in the weak coupling expansion. (paper)

  18. A fast-slow dynamical systems theory for the Kuramoto type phase model

    Science.gov (United States)

    Ha, Seung-Yeal; Slemrod, Marshall

    We present a fast-slow dynamical systems theory for the Kuramoto type phase model. When the order parameters are frozen, the fast system consists of independent oscillator equations, whereas the slow system describes the evolution of the order parameters. We average the slow system over the fast manifold to derive a weak form of an amplitude-angle coupled system for the evolution of Kuramoto's order parameters. This yields constant values for the slowly evolving order parameters, which gives a rigorous proof of Kuramoto's original assumption in his self-consistent mean-field theory.
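
    The Kuramoto phase model referred to above is standard: dθ_i/dt = ω_i + K r sin(ψ − θ_i), where r e^{iψ} = (1/N) Σ_j e^{iθ_j} is the complex order parameter. As a sketch (not the paper's analysis), a forward-Euler integration of this mean-field form, with illustrative parameter values:

```python
import numpy as np

def order_parameter(theta):
    """Kuramoto order parameter r (0 = incoherent, 1 = fully synchronized)."""
    return abs(np.exp(1j * theta).mean())

def simulate(N=200, K=2.0, dt=0.01, steps=3000, seed=0):
    """Forward-Euler integration of the mean-field Kuramoto model:
        dtheta_i/dt = omega_i + K r sin(psi - theta_i)."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(0.0, 0.1, N)       # natural frequencies
    theta = rng.uniform(0, 2 * np.pi, N)  # random initial phases
    r0 = order_parameter(theta)
    for _ in range(steps):
        z = np.exp(1j * theta).mean()     # z = r e^{i psi}
        r, psi = abs(z), np.angle(z)
        theta += dt * (omega + K * r * np.sin(psi - theta))
    return r0, order_parameter(theta)

r_initial, r_final = simulate()
```

    For coupling K well above the synchronization threshold, r grows from its incoherent value of order 1/√N and settles near a constant, consistent with the slow dynamics of the order parameter described in the abstract.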

  19. Generalised scalar-tensor theory in the Bianchi type I model

    CERN Document Server

    Fay, S

    2000-01-01

    We use a conformal transformation to find solutions to the generalised scalar-tensor theory, with a coupling constant dependent on a scalar field, in an empty Bianchi type I model. We describe the dynamical behaviour of the metric functions for three different couplings: two exact solutions to the field equations and a qualitative one are found. They exhibit non-singular behaviours and kinetic inflation. Two of them admit both General Relativity and string theory in the low-energy limit as asymptotic cases.

  20. Quantification of Endogenous Retinoids

    OpenAIRE

    Kane, Maureen A.; Napoli, Joseph L

    2010-01-01

    Numerous physiological processes require retinoids, including development, nervous system function, immune responsiveness, proliferation, differentiation, and all aspects of reproduction. Reliable retinoid quantification requires suitable handling and, in some cases, resolution of geometric isomers that have different biological activities. Here we describe procedures for reliable and accurate quantification of retinoids, including detailed descriptions for handling retinoids, preparing stand...

  1. Specimens: "most of" generic NPs in a contextually flexible type theory

    CERN Document Server

    Retoré, Christian

    2011-01-01

    This paper proposes to compute the meanings associated with sentences containing generic NPs corresponding to the 'most of' generalized quantifier. We call these generics specimens; they resemble stereotypes or prototypes in lexical semantics. The meanings are viewed as logical formulae that can thereafter be interpreted in your favorite models. We depart from the dominant Fregean single untyped universe and opt for type theory, with hints from Hilbert's epsilon calculus and from medieval philosophy. Our type-theoretic analysis bears some resemblance to ongoing work in lexical semantics. Our model also applies to classical examples involving a class (or a generic element of this class) which is provided by the context. An outcome of this study is that, in the minimalism-contextualism debate, if one adopts a type-theoretical view, terms encode the purely semantic meaning component while their typing is pragmatically determined.

  2. Resonant modal group theory of membrane-type acoustical metamaterials for low-frequency sound attenuation

    Science.gov (United States)

    Ma, Fuyin; Wu, Jiu Hui; Huang, Meng

    2015-09-01

    In order to overcome the influence of structural resonance on continuous structures and obtain a lightweight thin-layer structure that can effectively isolate low-frequency noise, an elastic membrane structure was proposed. In the low-frequency range below 500 Hz, the sound transmission loss (STL) of this membrane-type structure is much higher than that of EVA (ethylene-vinyl acetate copolymer), the current sound insulation material used in vehicles, so it is possible to replace EVA with the membrane-type metamaterial structure in engineering practice. Based on the band structure, modal shapes, and sound transmission simulations, the sound insulation mechanism of the designed membrane-type acoustic metamaterial was analyzed from a new perspective and validated experimentally. It is suggested that, in the frequency range above 200 Hz for this membrane-mass type structure, the sound insulation effect is principally due not to the low-level locally resonant mode of the mass block, but to the continuous vertical resonant modes of the localized membrane. Based on this physical property, a resonant modal group theory is proposed in this paper. In addition, the sound insulation mechanisms of the membrane-type structure and of the thin-plate structure are combined through the membrane/plate resonant theory.

  3. Theory and Observations of Type I X-Ray Bursts from Neutron Stars

    CERN Document Server

    Bildsten, L

    2000-01-01

    I review our understanding of the thermonuclear instabilities on accreting neutron stars that produce Type I X-Ray bursts. I emphasize those observational and theoretical aspects that should interest the broad audience of this meeting. The easily accessible timescales of the bursts (durations of tens of seconds and recurrence times of hours to days) allow for a very stringent comparison to theory. The largest discrepancy (which was found with EXOSAT observations) is the accretion rate dependence of the Type I burst properties. Bursts become less frequent and energetic as the global accretion rate increases, just the opposite of what the spherical theory predicts. I present a resolution of this issue by taking seriously the observed dependence of the burning area on the global accretion rate, which implies that as the accretion rate increases, the accretion rate per unit area decreases. This resurrects the unsolved problem of knowing where the freshly accreted material accumulates on the star, equally relevant...

  4. WKB - type approximations in the theory of vacuum particle creation in strong fields

    CERN Document Server

    Smolyansky, S A; Panferov, A D; Prozorkevich, A V; Blaschke, D; Juchnowski, L

    2014-01-01

    Within the theory of vacuum creation of an $e^{+}e^{-}$ plasma in the strong electric fields acting in the focal spot of counter-propagating laser beams, we compare predictions based on different WKB-type approximations with results obtained in the framework of a strict kinetic approach. Such a comparison demonstrates a considerable divergence of the results. We analyse possible reasons for this observation and conclude that WKB-type approximations have an insufficient foundation for QED in strong non-stationary fields. The results obtained in this work on the basis of the kinetic approach are the most optimistic for the observation of an $e^{+}e^{-}$ plasma in the range of optical and x-ray laser facilities. We also discuss the influence of unphysical features of non-adiabatic field models on the reliability of predictions of the kinetic theory.

  5. Numerical knot invariants of finite type from Chern-Simons perturbation theory

    Energy Technology Data Exchange (ETDEWEB)

    Alvarez, M. (Universidad de Santiago de Compostela (Spain). Dept. de Fisica de Particulas); Labastida, J.M.F. (Universidad de Santiago de Compostela (Spain). Dept. de Fisica de Particulas)

    1995-01-16

    Chern-Simons gauge theory for compact semisimple groups is analyzed from a perturbation theory point of view. The general form of the perturbative series expansion of a Wilson line is presented in terms of the Casimir operators of the gauge group. From this expansion new numerical knot invariants are obtained. These knot invariants turn out to be of finite type (Vassiliev invariants) and to possess an integral representation. Using known results about Jones, HOMFLY, Kauffman and Akutsu-Wadati polynomial invariants these new knot invariants are computed up to type six for all prime knots up to six crossings. Our results suggest that these knot invariants can be normalized in such a way that they are integer-valued. (orig.)

  6. The epsilon regime of chiral perturbation theory with Wilson-type fermions

    International Nuclear Information System (INIS)

    In this proceedings contribution we report on the ongoing effort to simulate Wilson-type fermions in the so-called epsilon regime of chiral perturbation theory (χPT). We present results for the chiral condensate and the pseudoscalar decay constant obtained with Wilson twisted mass fermions, employing two lattice spacings, two different physical volumes and several quark masses. With this set of simulations we make a first attempt to estimate the systematic uncertainties. (orig.)

  7. Psychosocial Correlates of Dietary Behaviour in Type 2 Diabetic Women, Using a Behaviour Change Theory

    OpenAIRE

    Didarloo, A.; Shojaeizadeh, D; asl, R. Gharaaghaji; S NIKNAMI; A. Khorami

    2014-01-01

    The study evaluated the efficacy of the Theory of Reasoned Action (TRA), along with self-efficacy to predict dietary behaviour in a group of Iranian women with type 2 diabetes. A sample of 352 diabetic women referred to Khoy Diabetes Clinic, Iran, were selected and given a self-administered survey to assess eating behaviour, using the extended TRA constructs. Bivariate correlations and Enter regression analyses of the extended TRA model were performed with SPSS software. Overall, the proposed...

  8. Type I Superconductivity upon Monopole Condensation in Seiberg-Witten Theory

    OpenAIRE

    Vainshtein, A.; Yung, A.

    2000-01-01

    We study the confinement scenario in N=2 supersymmetric SU(2) gauge theory near the monopole point upon breaking of N=2 supersymmetry by the adjoint matter mass term. We confirm claims made previously that the Abrikosov-Nielsen-Olesen string near the monopole point fails to be a BPS state once next-to-leading corrections in the adjoint mass parameter are taken into account. Our results show that type I superconductivity arises upon monopole condensation. This conclusion allows ...

  9. The criticality problem in reflected slab type reactor in the two-group transport theory

    International Nuclear Information System (INIS)

    The criticality problem in a reflected slab-type reactor is solved for the first time in two-group neutron transport theory by singular eigenfunction expansion. The singular integrals obtained through the continuity conditions of the angular distributions at the interface are regularized by a recently proposed method. The result is a coupled system of regular integral equations for the expansion coefficients; this system is solved by an ordinary iterative method. Numerical results that can be used as a comparative standard for approximation methods are presented.

  10. Type synthesis for 4-DOF parallel press mechanism using GF set theory

    Science.gov (United States)

    He, Jun; Gao, Feng; Meng, Xiangdun; Guo, Weizhong

    2015-07-01

    Parallel mechanisms are used in large-capacity servo presses to avoid the over-constraint of traditional redundant actuation. Current research mainly focuses on performance analysis of specific parallel press mechanisms; the type synthesis and evaluation of parallel press mechanisms is seldom studied, especially for four-degrees-of-freedom (DOF) press mechanisms. Here, the type synthesis of 4-DOF parallel press mechanisms is carried out based on generalized function (GF) set theory. Five design criteria for 4-DOF parallel press mechanisms are first proposed. The general procedure of type synthesis of parallel press mechanisms is obtained, which includes number synthesis, symmetrical synthesis of constraint GF sets, decomposition of motion GF sets, and design of limbs. Nine combinations of constraint GF sets of 4-DOF parallel press mechanisms, ten combinations of GF sets of active limbs, and eleven combinations of GF sets of passive limbs are synthesized. Thirty-eight kinds of press mechanisms are presented, and different structures of kinematic limbs are then designed. Finally, the geometrical constraint complexity (GCC), kinematic pair complexity (KPC), and type complexity (TC) are proposed to evaluate the press types, and the optimal press type is obtained. General methodologies of type synthesis and evaluation for parallel press mechanisms are suggested.

  11. Coordinated encoding between cell types in the retina: insights from the theory of phase transitions

    Science.gov (United States)

    Sharpee, Tatyana

    2015-03-01

    In this talk I will describe how the emergence of some types of neurons in the brain can be quantitatively described by the theory of transitions between different phases of matter. The two key parameters that control the separation of neurons into subclasses are the mean and standard deviation of noise levels among neurons in the population. The mean noise level plays the role of temperature in the classic theory of phase transitions, whereas the standard deviation is equivalent to pressure, in the case of liquid-gas transitions, or to magnetic field for magnetic transitions. Our results account for the properties of two recently discovered types of salamander OFF retinal ganglion cells, as well as for the absence of multiple types of ON cells. We further show that, across visual stimulus contrasts, retinal circuits continued to operate near a critical point whose quantitative characteristics matched those expected near a liquid-gas critical point, as described by the nearest-neighbor Ising model in three dimensions. Because the retina needs to operate under changing stimulus conditions, the observed parameters of cell types corresponded to metastable states in the region between the spinodal line and the line describing maximally informative solutions. Such properties of neural circuits can maximize information transmission in a given environment while retaining the ability to quickly adapt to a new environment. Supported by NSF CAREER award 1254123 and NIH R01EY019493.

  12. A Density Functional Theory Study of Doped Tin Monoxide as a Transparent p-type Semiconductor

    KAUST Repository

    Bianchi Granato, Danilo

    2012-05-01

    In the pursuit of enhancing the electronic properties of transparent p-type semiconductors, this work uses density functional theory to study the effects of doping tin monoxide with nitrogen, antimony, yttrium and lanthanum. An overview of the theoretical concepts and a detailed description of the methods employed are given, including a discussion of the correction scheme for charged defects proposed by Freysoldt and others [Freysoldt 2009]. Analysis of the formation energies of the defects shows that nitrogen substitutes an oxygen atom and does not provide charge carriers. On the other hand, antimony, yttrium, and lanthanum substitute a tin atom and donate n-type carriers. Study of the band structure and density of states indicates that yttrium and lanthanum improve the hole mobility. The present results are in good agreement with available experimental work and help to improve the understanding of how to engineer transparent p-type materials with higher hole mobilities.

  13. Device independent entanglement quantification

    Energy Technology Data Exchange (ETDEWEB)

    Moroder, Tobias; Hofmann, Martin; Guehne, Otfried [Theoretische Quantenoptik, Department Physik, Universitaet Siegen (Germany); Bancal, Jean-Daniel; Liang, Yeong-Cherng [Group of Applied Physics, University of Geneva (Switzerland)

    2013-07-01

    Most experiments require a rather precise characterization of the employed equipment or of the underlying model generating the data. However, and perhaps surprisingly at first, many tasks in quantum information processing can be made completely independent of this necessity. This is the beauty of device independence, and a variety of such tasks have been investigated more thoroughly recently, including, for instance, different entanglement verification schemes and witnesses of the underlying quantum dimension. We present a method for device-independent entanglement quantification in the bi- and multipartite case, which directly provides non-trivial information about the underlying quantum dimension or the type of entanglement. This becomes possible through a novel technique to characterize correlations device-independently when the quantum state has, for instance, a positive partial transpose or a biseparable structure. With this technique we additionally derive bounds on the maximal violation of a Bell inequality when the underlying state is PPT (and thus bound) entangled, which provides new insights into the bipartite Peres conjecture.

  14. Quantification of AS and AR

    Directory of Open Access Journals (Sweden)

    Mehta Yatin

    2009-01-01

    Full Text Available Trans-esophageal echocardiography (TEE) is routinely used in valvular surgery in most institutions. The popularity of TEE stems from the fact that it can supplement or confirm information gained from other methods of evaluation or make completely independent diagnoses. Quantitative and qualitative assessment permits informed decisions regarding surgical intervention, type of intervention, correction of inadequate surgical repair and re-operation for complications. This review summarizes the various methods for quantification of aortic regurgitation (AR) and stenosis (AS) on TEE. The application of Doppler echo (pulsed wave, continuous wave and color) with two-dimensional echo allows the complete evaluation of AV lesions.

  15. Canonical BF-type topological field theory and fractional statistics of strings

    International Nuclear Information System (INIS)

    We consider BF-type topological field theory coupled to non-dynamical particle and string sources on spacetime manifolds of the form R^1 x M^3, where M^3 is a 3-manifold without boundary. Canonical quantization of the theory is carried out in the Hamiltonian formalism and explicit solutions of the Schroedinger equation are obtained. We show that the Hilbert space is finite-dimensional and the physical states carry a one-dimensional projective representation of the local gauge symmetries. When M^3 is homologically non-trivial the wavefunctions in addition carry a multi-dimensional projective representation, in terms of the linking matrix of the homology cycles of M^3, of the discrete group of large gauge transformations. The wavefunctions also carry a one-dimensional representation of the non-trivial linking of the particle trajectories and string surfaces in M^3. This topological field theory therefore provides a phenomenological generalization of anyons to (3+1) dimensions where the holonomies representing fractional statistics arise from the adiabatic transport of particles around strings. We also discuss a duality between large gauge transformations and these linking operations around the homology cycles of M^3, and show that this canonical quantum field theory provides novel quantum representations of the cohomology of M^3 and its associated motion group. ((orig.))

  16. Social cognitive theory correlates of moderate-intensity exercise among adults with type 2 diabetes.

    Science.gov (United States)

    Heiss, Valerie J; Petosa, R L

    2016-01-01

    The purpose of this study was to identify social cognitive theory (SCT) correlates of moderate- to vigorous-intensity exercise (MVPA) among adults with type 2 diabetes. Adults with type 2 diabetes (N = 181) participated in the study. Participants were recruited through ResearchMatch.org to complete an online survey. The survey used previously validated instruments to measure dimensions of self-efficacy, self-regulation, social support, outcome expectations, the physical environment, and minutes of MVPA per week. Spearman rank correlations were used to determine the relationship between SCT variables and MVPA. Classification and regression analysis using a decision tree model was used to determine the amount of variance in MVPA explained by SCT variables. Due to low levels of vigorous activity, only moderate-intensity exercise (MIE) was analyzed. SCT variables explained 42.4% of the variance in MIE. Self-monitoring, social support from family, social support from friends, and self-evaluative outcome expectations all contributed to the variability in MIE. Other contributing variables included self-reward, task self-efficacy, social outcome expectations, overcoming barriers, and self-efficacy for making time for exercise. SCT is a useful theory for identifying correlates of MIE among adults with type 2 diabetes. The SCT correlates can be used to refine diabetes education programs to target the adoption and maintenance of regular exercise. PMID:25753761
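
The correlational step of an analysis like the one above can be sketched as a Spearman rank correlation between an SCT score and weekly exercise minutes. This is a minimal, self-contained implementation; the variable names and data values are invented for illustration and are not the study's instruments or dataset.

```python
# Hypothetical sketch: Spearman rank correlation between a social cognitive
# theory (SCT) self-monitoring score and weekly minutes of moderate-intensity
# exercise (MIE). All data values below are invented.

def _ranks(values):
    """Assign average 1-based ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of tied positions, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rho = Pearson correlation of the rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

self_monitoring = [1, 2, 2, 3, 4, 5, 5, 6]        # hypothetical SCT scores
mie_minutes     = [0, 30, 20, 60, 90, 150, 120, 180]
print(round(spearman(self_monitoring, mie_minutes), 3))
```

In practice a library routine such as `scipy.stats.spearmanr` would be used; the point of the sketch is only to make the rank-then-correlate logic explicit.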

  17. S-matrix elements and covariant tachyon action in type 0 theory

    OpenAIRE

    Garousi, Mohammad R.

    2003-01-01

    We evaluate the sphere-level S-matrix element of two tachyons and two massless NS states, the S-matrix element of four tachyons, and the S-matrix element of two tachyons and two Ramond-Ramond vertex operators, in type 0 theory. We then find an expansion of these amplitudes whose leading-order terms correspond to a covariant tachyon action. To the order considered, there are no $T^4$, $T^2(\partial T)^2$, $T^2H^2$, or $T^2R$ tachyon couplings, whereas, the tachyon coupling...

  18. Bianchi type-I massive string magnetized barotropic perfect fluid cosmological model in bimetric theory

    Indian Academy of Sciences (India)

    S D Katore; R S Rane; K S Wankhade

    2011-04-01

    Bianchi type-I massive string cosmological model for perfect fluid distribution in the presence of a magnetic field is investigated in Rosen's [Gen. Relativ. Gravit. 4, 435 (1973)] bimetric theory of gravitation. To obtain the deterministic model in terms of cosmic time, we have used the condition $A = (B C)^n$, where n is a constant, between the metric potentials. The magnetic field is due to the electric current produced along the -axis with infinite electrical conductivity. Some physical and geometrical properties of the exhibited model are discussed and studied.

  19. Type I superconductivity upon monopole condensation in Seiberg-Witten theory

    Energy Technology Data Exchange (ETDEWEB)

    Vainshtein, A. E-mail: vainshtein@mnhep1.hep.umn.edu; Yung, A

    2001-10-29

    We study the confinement scenario in N=2 supersymmetric SU(2) gauge theory near the monopole point upon breaking of N=2 supersymmetry by the adjoint matter mass term. We confirm claims made previously that the Abrikosov-Nielsen-Olesen string near the monopole point fails to be a BPS state once next-to-leading corrections in the adjoint mass parameter are taken into account. Our results show that type I superconductivity arises upon monopole condensation. This conclusion allows us to make qualitative predictions on the structure of the hadron mass spectrum near the monopole point.

  20. T-dualization of type IIB superstring theory in double space

    CERN Document Server

    Nikolić, Bojan

    2015-01-01

    In this article we offer a new interpretation of the T-dualization procedure of type IIB superstring theory in the double space framework. We use the ghost-free action of the type IIB superstring in the pure spinor formulation, in the approximation of constant background fields, up to quadratic terms. T-dualization along any subset of the initial coordinates, $x^a$, is equivalent to the permutation of this subset with the subset of the corresponding T-dual coordinates, $y_a$, in the double space coordinate $Z^M=(x^\mu,y_\mu)$. Demanding that the T-dual transformation law after the exchange $x^a\leftrightarrow y_a$ has the same form as the initial one, we obtain the T-dual NS-NS and NS-R background fields. The T-dual R-R field strength is determined up to one arbitrary constant under some assumptions.

  1. On the effective theory of type II string compactifications on nilmanifolds and coset spaces

    International Nuclear Information System (INIS)

    In this thesis we analyzed a large number of type IIA strict SU(3)-structure compactifications with fluxes and O6/D6-sources, as well as type IIB static SU(2)-structure compactifications with fluxes and O5/O7-sources. Restricting to structures and fluxes that are constant in the basis of left-invariant one-forms, these models are tractable enough to allow for an explicit derivation of the four-dimensional low-energy effective theory. The six-dimensional compact manifolds we studied in this thesis are nilmanifolds based on nilpotent Lie-algebras, and, on the other hand, coset spaces based on semisimple and U(1)-groups, which admit a left-invariant strict SU(3)- or static SU(2)-structure. In particular, from the set of 34 distinct nilmanifolds we identified two nilmanifolds, the torus and the Iwasawa manifold, that allow for an AdS4, N = 1 type IIA strict SU(3)-structure solution and one nilmanifold allowing for an AdS4, N = 1 type IIB static SU(2)-structure solution. From the set of all the possible six-dimensional coset spaces, we identified seven coset spaces suitable for strict SU(3)-structure compactifications, four of which also allow for a static SU(2)-structure compactification. For all these models, we calculated the four-dimensional low-energy effective theory using N = 1 supergravity techniques. In order to write down the most general four-dimensional effective action, we also studied how to classify the different disconnected ''bubbles'' in moduli space. (orig.)

  2. On the effective theory of type II string compactifications on nilmanifolds and coset spaces

    Energy Technology Data Exchange (ETDEWEB)

    Caviezel, Claudio

    2009-07-30

    In this thesis we analyzed a large number of type IIA strict SU(3)-structure compactifications with fluxes and O6/D6-sources, as well as type IIB static SU(2)-structure compactifications with fluxes and O5/O7-sources. Restricting to structures and fluxes that are constant in the basis of left-invariant one-forms, these models are tractable enough to allow for an explicit derivation of the four-dimensional low-energy effective theory. The six-dimensional compact manifolds we studied in this thesis are nilmanifolds based on nilpotent Lie-algebras, and, on the other hand, coset spaces based on semisimple and U(1)-groups, which admit a left-invariant strict SU(3)- or static SU(2)-structure. In particular, from the set of 34 distinct nilmanifolds we identified two nilmanifolds, the torus and the Iwasawa manifold, that allow for an AdS{sub 4}, N = 1 type IIA strict SU(3)-structure solution and one nilmanifold allowing for an AdS{sub 4}, N = 1 type IIB static SU(2)-structure solution. From the set of all the possible six-dimensional coset spaces, we identified seven coset spaces suitable for strict SU(3)-structure compactifications, four of which also allow for a static SU(2)-structure compactification. For all these models, we calculated the four-dimensional low-energy effective theory using N = 1 supergravity techniques. In order to write down the most general four-dimensional effective action, we also studied how to classify the different disconnected ''bubbles'' in moduli space. (orig.)

  3. Mild to severe social fears: ranking types of feared social situations using item response theory.

    Science.gov (United States)

    Crome, Erica; Baillie, Andrew

    2014-06-01

    Social anxiety disorder is one of the most common mental disorders, and is associated with long-term impairment, distress and vulnerability to secondary disorders. Certain types of social fears are more common than others, with public speaking fears typically the most prevalent in epidemiological surveys. The distinction between performance- and interaction-based fears has been the focus of long-standing debate in the literature, with evidence that performance-based fears may reflect milder presentations of social anxiety. This study explicitly tests whether different types of social fears differ in underlying social anxiety severity using item response theory techniques. Different types of social fears were assessed using items from three different structured diagnostic interviews in four different epidemiological surveys in the United States (n=2261, n=5411) and Australia (n=1845, n=1497), and ranked using 2-parameter logistic item response theory models. Overall, the patterns of underlying severity indicated by different fears were consistent across the four samples, with items functioning across a range of social anxiety. Public performance fears and speaking at meetings/classes indicated the lowest levels of social anxiety, with increasing severity indicated by situations such as being assertive or attending parties. Fears of using public bathrooms or eating, drinking or writing in public reflected the highest levels of social anxiety. Understanding differences in the underlying severity of different types of social fears has important implications for the underlying structure of social anxiety, and may also enhance the delivery of social anxiety treatment at a population level. PMID:24873885
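
The 2-parameter logistic (2PL) model behind this ranking can be illustrated in a few lines: each item has a discrimination `a` and a difficulty (severity) `b`, and items are ordered from mild to severe by `b`. The item parameters below are invented for illustration, not estimates from the surveys described above.

```python
import math

def p_endorse(theta, a, b):
    """2PL item response function: P(endorse) = 1 / (1 + exp(-a*(theta - b)))."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# (discrimination a, difficulty/severity b) -- hypothetical values chosen to
# mirror the mild-to-severe ordering reported in the abstract.
items = {
    "public speaking":          (1.4, -1.0),  # endorsed even at low severity
    "attending parties":        (1.2,  0.3),
    "eating/writing in public": (1.1,  1.5),  # endorsed only at high severity
}

# Sorting by b ranks the feared situations from mild to severe.
for name, (a, b) in sorted(items.items(), key=lambda kv: kv[1][1]):
    print(f"{name}: b={b:+.1f}, P(endorse | theta=0)={p_endorse(0.0, a, b):.2f}")
```

A person at average severity (theta = 0) is much more likely to endorse a low-`b` item than a high-`b` one, which is exactly the property the study uses to rank social fears.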

  4. Bianchi Type-II, VIII & IX Perfect Fluid Cosmological Models in Brans Dicke Theory of Gravitation

    OpenAIRE

    Velagapudi Uma Maheswara Rao; Mandangi Vijaya Santhi

    2011-01-01

    Field equations in the presence of perfect fluid distribution are obtained in the scalar-tensor theory of gravitation proposed by Brans and Dicke [1] with the aid of Bianchi type-II, VIII and IX metrics. An exact perfect fluid Bianchi type-IX cosmological model is presented, since the other models do not exist in Brans-Dicke scalar-tensor theory of gravitation. Some physical properties of the model are also discussed.

  5. The early life origin theory in the development of cardiovascular disease and type 2 diabetes.

    Science.gov (United States)

    Lindblom, Runa; Ververis, Katherine; Tortorella, Stephanie M; Karagiannis, Tom C

    2015-04-01

    Life expectancy has been examined from a variety of perspectives in recent history. Epidemiology is one perspective which examines causes of morbidity and mortality at the population level. Over the past few hundred years there have been dramatic shifts in the major causes of death and expected life length. This change has been inconsistent across time and space, with vast inequalities observed between population groups. Currently in focus is the challenge of rising non-communicable diseases (NCDs), such as cardiovascular disease and type 2 diabetes mellitus. In the search for methods to combat the rising incidence of these diseases, a number of new theories on the development of morbidity have arisen. A pertinent example is the hypothesis published by David Barker in 1995, which postulates the prenatal and early developmental origin of adult-onset disease and highlights the importance of the maternal environment. This theory has been subject to criticism; however, it has gradually gained acceptance. In addition, the relatively new field of epigenetics is contributing evidence in support of the theory. This review aims to explore the implications and limitations of the developmental origin hypothesis, via an historical perspective, in order to enhance understanding of the increasing incidence of NCDs and facilitate an improvement in planning public health policy. PMID:25270249

  6. Bianchi type-III models with anisotropic dark energy in Brans-Dicke-Rastall theory

    Science.gov (United States)

    Salako, Ines G.; Jawad, Abdul

    2015-10-01

    In this paper, we consider the Bianchi type-III metric (which is spatially homogeneous and anisotropic) in the framework of the newly proposed Brans-Dicke-Rastall theory of gravitation by Caramês et al. (Eur. Phys. J. C 74:3145, 2014). In this scenario, we obtain the generalized form of the anisotropy parameter of the expansion, the dynamically anisotropic equation of state parameter, and a dynamical energy density in the presence of a single diagonal imperfect fluid. By assuming the anisotropy of the fluid, and exponential and power-law volumetric expansions, we find the exact solutions of the Brans-Dicke-Rastall field equations. We examine the isotropy of the fluid, of space, and of the expansion of the universe. It is observed that the universe can approach isotropy monotonically even in the presence of an anisotropic fluid. We also note that the strong anisotropy observed in general relativity is diminished considerably in the Rastall theory and the Brans-Dicke-Rastall theory because of the influence of the theories' parameters.

  7. Maier-Saupe-type theory of ferroelectric nanoparticles in nematic liquid crystals

    Science.gov (United States)

    Lopatina, Lena M.; Selinger, Jonathan V.

    2011-10-01

    Several experiments have reported that ferroelectric nanoparticles have drastic effects on nematic liquid crystals—increasing the isotropic-nematic transition temperature by about 5 K, and greatly increasing the sensitivity to applied electric fields. In a recent paper [Lopatina and Selinger, Phys. Rev. Lett. 102, 197802 (2009)], we modeled these effects through a Landau theory, based on coupled orientational order parameters for the liquid crystal and the nanoparticles. This model has one important limitation: like all Landau theories, it involves an expansion of the free energy in powers of the order parameters, and hence it overestimates the order parameters that occur in the low-temperature phase. For that reason, we now develop a new Maier-Saupe-type model, which explicitly shows the low-temperature saturation of the order parameters. This model reduces to the Landau theory in the limit of high temperature or weak coupling, but shows different behavior in the opposite limit. We compare these calculations with experimental results on ferroelectric nanoparticles in liquid crystals.

  8. Quantification of the physiochemical constraints on the export of spider silk proteins by Salmonella type III secretion

    OpenAIRE

    Voigt Christopher A; Widmaier Daniel M

    2010-01-01

    Abstract Background The type III secretion system (T3SS) is a molecular machine in Gram-negative bacteria that exports proteins through both membranes to the extracellular environment. It has previously been demonstrated that the T3SS encoded in Salmonella Pathogenicity Island 1 (SPI-1) can be harnessed to export recombinant proteins. Here, we demonstrate the secretion of a variety of unfolded spider silk proteins and use these data to quantify the constraints of this system with respect to t...

  9. The use of quantitative PCR for identification and quantification of Brachyspira pilosicoli, Lawsonia intracellularis and Escherichia coli fimbrial types F4 and F18 in pig feces

    OpenAIRE

    Ståhl, Marie; Kokotovic, Branko; Hjulsager, Charlotte Kristiane; Breum, Solvej Østergaard; Angen, Øystein

    2011-01-01

    Four quantitative PCR (qPCR) assays were evaluated for quantitative detection of Brachyspira pilosicoli, Lawsonia intracellularis, and E. coli fimbrial types F4 and F18 in pig feces. Standard curves were based on feces spiked with the respective reference strains. The detection limits from the spiking experiments were 10^2 bacteria/g feces for Bpilo-qPCR and Laws-qPCR, and 10^3 CFU/g feces for F4-qPCR and F18-qPCR. The PCR efficiency for all four qPCR assays was between 0.91 and 1.01 ...

  10. The use of quantitative PCR for identification and quantification of Brachyspira pilosicoli, Lawsonia intracellularis and Escherichia coli fimbrial types F4 and F18 in pig feces

    DEFF Research Database (Denmark)

    Ståhl, Marie; Kokotovic, Branko; Hjulsager, Charlotte Kristiane; Breum, Solvej Østergaard; Angen, Øystein

    2011-01-01

    Four quantitative PCR (qPCR) assays were evaluated for quantitative detection of Brachyspira pilosicoli, Lawsonia intracellularis, and E. coli fimbrial types F4 and F18 in pig feces. Standard curves were based on feces spiked with the respective reference strains. The detection limits from the spiking experiments were 10^2 bacteria/g feces for Bpilo-qPCR and Laws-qPCR, and 10^3 CFU/g feces for F4-qPCR and F18-qPCR. The PCR efficiency for all four qPCR assays was between 0.91 and 1.01, with R^2 above 0.99...
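
The two figures of merit reported here, amplification efficiency and the standard-curve fit, come from the fitted line Ct = slope·log10(copies) + intercept. The sketch below shows the standard relationships; the slope and intercept values are hypothetical, not the ones fitted in this study.

```python
# Sketch of absolute qPCR quantification from a standard curve. The
# slope/intercept below are hypothetical placeholder values.

slope = -3.32      # Ct per decade of template; -3.32 corresponds to E ~ 1.0
intercept = 38.0   # Ct extrapolated to 1 copy/g

def efficiency(slope):
    """Amplification efficiency E = 10^(-1/slope) - 1 (E = 1.0 is perfect doubling)."""
    return 10 ** (-1.0 / slope) - 1.0

def copies_per_gram(ct):
    """Invert the standard curve Ct = slope*log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

print(f"E = {efficiency(slope):.2f}")
print(f"Ct 25 -> {copies_per_gram(25.0):.3g} copies/g")
```

With a slope near -3.32 the computed efficiency lands inside the 0.91-1.01 range the abstract reports for all four assays, which is why that slope is the conventional benchmark for a well-behaved qPCR assay.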

  11. Relative quantification and detection of different types of infectious bursal disease virus in bursa of Fabricius and cloacal swabs using real time RT-PCR SYBR green technology

    DEFF Research Database (Denmark)

    Li, Yiping; Handberg, K.J.; Kabell, Susanne; Kusk, M.; Zhang, M.F.; Jorgensen, P.H.

    2007-01-01

    In the present study, different types of infectious bursal disease virus (IBDV), the virulent strain DK01, the classic strain F52/70 and the vaccine strain D78, were quantified and detected in infected bursa of Fabricius (BF) and cloacal swabs using quantitative real-time RT-PCR with SYBR green dye. For selection of a suitable internal control gene, real-time PCR parameters were evaluated for three candidate genes, glyceraldehyde-3-phosphate dehydrogenase (GAPDH), 28S rRNA and beta-actin, relative to IBDVs. Based on this ...

  12. The use of quantitative PCR for identification and quantification of Brachyspira pilosicoli, Lawsonia intracellularis and Escherichia coli fimbrial types F4 and F18 in pig feces.

    OpenAIRE

    Ståhl, M.; Kokotovic, B.; Hjulsager, C. K.; Breum, S.Ø.; Angen, Ø.

    2011-01-01

    Four quantitative PCR (qPCR) assays were evaluated for quantitative detection of Brachyspira pilosicoli, Lawsonia intracellularis and Escherichia coli fimbrial types F4 and F18 in pig feces. Standard curves were based on feces spiked with the respective reference strains. The detection limits from the spiking experiments were 10^2 bacteria/g feces for Bpilo-qPCR and Laws-qPCR, and 10^3 CFU/g feces for F4-qPCR and F18-qPCR. The PCR efficiency for all four qPCR assays was between 0.91 and 1.01, with R^2 above 0.993. Standard curves, slopes and elevation, varied between ...

  13. Understanding physical activity intentions among French Canadians with type 2 diabetes: an extension of Ajzen's theory of planned behaviour

    OpenAIRE

    Godin Gaston; Boudreau François

    2009-01-01

    Abstract Background Regular physical activity is considered a cornerstone for managing type 2 diabetes. However, in Canada, most individuals with type 2 diabetes do not meet national physical activity recommendations. When designing a theory-based intervention, one should first determine the key determinants of physical activity for this population. Unfortunately, there is a lack of information on this aspect among adults with type 2 diabetes. The purpose of this cross-sectional study is to f...

  14. Quantification of Cannabinoid Content in Cannabis

    Science.gov (United States)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoids content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
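
The band-selection step described above (correlate reflectance at each candidate waveband with measured THC content, then keep the strongest band) can be sketched as follows. The reflectance and THC values are synthetic illustrations, not the study's measurements; only the 695 nm band label is taken from the abstract.

```python
# Minimal sketch of correlation-based waveband selection for THC
# quantification. All sample data below are synthetic.

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

thc = [2.1, 5.3, 0.4, 8.7, 3.9]  # % THC per leaf sample (synthetic)

# Reflectance per sample at three candidate wavebands (nm); the 695 nm
# series is constructed to track THC most closely.
reflectance = {
    550: [0.31, 0.29, 0.33, 0.27, 0.30],
    695: [0.42, 0.25, 0.55, 0.11, 0.33],
    800: [0.61, 0.60, 0.63, 0.58, 0.62],
}

# Pick the band whose reflectance correlates most strongly (in magnitude)
# with THC content; a regression on that band would then predict THC.
best = max(reflectance, key=lambda band: abs(pearson(reflectance[band], thc)))
print(best)
```

The study's full pipeline adds stepwise multivariate regression over the selected bands; the sketch only shows the single-band correlation screen.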

  15. The Bianchi type-V Dark Energy Cosmology in Self Interacting Brans Dicke Theory of Gravity

    CERN Document Server

    Singh, J K

    2016-01-01

    This paper deals with a spatially homogeneous and totally anisotropic Bianchi type-V cosmological model within the framework of the self-interacting Brans-Dicke theory of gravity in the background of anisotropic dark energy (DE) with a variable equation of state (EoS) parameter and a constant deceleration parameter. The constant deceleration parameter leads to two models of the universe, i.e. a power-law model and an exponential model. The EoS parameter ω and its existing range for the models are in good agreement with the most recent observational data. We notice that ω given by (37), i.e. ω(t) = log(k1t), is more suitable for explaining the evolution of the universe. The physical behavior of the solutions has also been discussed using some physical quantities. Finally, we observe that despite having several prominent features, both of the DE models discussed fail in the details.

  16. Adaptation of learning resources based on the MBTI theory of psychological types

    Directory of Open Access Journals (Sweden)

    Amel Behaz

    2012-01-01

    Full Text Available Today, the resources available on the web are increasing significantly. The motivation for the dissemination of knowledge and its acquisition by learners is central to learning. However, learners differ in the ways of learning that suit them best. The objective of the work presented in this paper is to study how models from cognitive theories and ontologies can be integrated for the adaptation of educational resources. The goal is to give the system the capability to reason on the descriptions obtained in order to automatically adapt resources to a learner according to his or her preferences. We rely on the MBTI (Myers-Briggs Type Indicator) model for the consideration of learners' learning styles as a criterion for adaptation.

  17. Spectroscopic and Density Functional Theory Studies of a New Rosane Type Diterpenoid from Stachys parviflora

    Directory of Open Access Journals (Sweden)

    Umar Farooq

    2015-04-01

    Full Text Available A rosane-type diterpenoid has been isolated from the ethyl acetate soluble fraction of Stachys parviflora. The structure elucidation was based primarily on 1D- and 2D-NMR techniques including correlation spectroscopy (COSY), heteronuclear multiple quantum coherence (HMQC), heteronuclear multiple bond correlation (HMBC), and nuclear Overhauser effect spectroscopy (NOESY). Density functional theory calculations have been performed to gain insight into the geometric, electronic and spectroscopic properties of the isolated diterpenoid. The geometries, vibrational spectrum and electronic properties were modeled at B3LYP/6-31G(d), and the theoretical data correlated nicely with the experimental values. Simulated chemical shifts at B3LYP/6-311+G(2d,p) showed much better correlation with the experimental chemical shifts, compared to B3LYP/6-31G(d) and WP04/6-31G(d).

  18. Topological and geometrical quantum computation in cohesive Khovanov homotopy type theory

    Science.gov (United States)

    Ospina, Juan

    2015-05-01

    The recently proposed Cohesive Homotopy Type Theory is exploited as a formal foundation for central concepts in Topological and Geometrical Quantum Computation. Specifically, Cohesive Homotopy Type Theory provides a formal, logical approach to concepts like smoothness, cohomology and Khovanov homology, and this approach permits clarifying quantum algorithms in the context of Topological and Geometrical Quantum Computation. In particular we consider the so-called "open-closed stringy topological quantum computer", a theoretical topological quantum computer that employs a system of open-closed strings whose worldsheets are open-closed cobordisms. The open-closed stringy topological computer is able to compute the Khovanov homology for tangles, and hence it is a universal quantum computer, given that any quantum computation reduces to an instance of computing the Khovanov homology for tangles. The universal algebra in this case is the Frobenius algebra, and the possible open-closed stringy topological quantum computers form a symmetric monoidal category which is equivalent to the category of knowledgeable Frobenius algebras. The mathematical design of an open-closed stringy topological quantum computer therefore involves computations and theorem proving for generalized Frobenius algebras. Such computations and theorem proving can be performed automatically using automated theorem provers (ATPs) with the TPTP language and the SMT solver Z3 with the SMT-LIB language. Some examples of the application of ATPs and SMT solvers in the mathematical setup of an open-closed stringy topological quantum computer will be provided.

  19. Identification and quantification of the caproic acid-producing bacterium Clostridium kluyveri in the fermentation of pit mud used for Chinese strong-aroma type liquor production.

    Science.gov (United States)

    Hu, Xiao-Long; Du, Hai; Xu, Yan

    2015-12-01

    Chinese strong-aroma type liquor (CSAL) is a popular distilled alcoholic beverage in China. It is produced by a complex fermentation process that is conducted in pits in the ground. Ethyl caproate is a key flavor compound in CSAL and is thought to originate from caproic acid produced by Clostridia inhabiting the fermentation pit mud. However, the particular species of Clostridium associated with this production are poorly understood and problematic to quantify by culturing. In this study, a total of 28 closest relatives, including 15 Clostridia and 8 Bacilli species, were detected in pit muds from three CSAL distilleries by culture-dependent and culture-independent methods. Among them, Clostridium kluyveri was identified as the main producer of caproic acid. One representative strain, C. kluyveri N6, could produce caproic, butyric and octanoic acids and their corresponding ethyl esters, contributing significantly to CSAL flavor. A real-time quantitative PCR assay developed for C. kluyveri in pit muds showed that the concentration of 1.79 × 10^7 16S rRNA gene copies/g pit mud in the LZ-old pit was approximately six times higher than that in the HLM and YH pits, and sixty times higher than that in the LZ-new pit. This method can be used to improve the management of pit mud microbiology and its impact on CSAL quality. PMID:26267890
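    As a sketch of the kind of absolute quantification such a qPCR assay performs, a sample's quantification cycle (Cq) can be converted to gene copies per gram through a standard curve; the slope, intercept and Cq values below are hypothetical placeholders, not values from the study.

```python
def copies_per_gram(cq, slope=-3.32, intercept=38.0):
    """Convert a qPCR quantification cycle (Cq) to 16S rRNA gene copies/g
    via a standard curve Cq = slope * log10(copies) + intercept.
    Slope and intercept here are illustrative, not from the paper."""
    log_copies = (cq - intercept) / slope
    return 10 ** log_copies

# A sample whose Cq is ~2.58 cycles lower than another holds roughly
# six times more template at this slope (cf. the LZ-old vs HLM/YH pits):
ratio = copies_per_gram(22.0) / copies_per_gram(24.58)  # ~6
```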

  20. TH-C-19A-09: Quantification of Transmission and Backscatter Factors as a function of Distance to Inhomogeneity Interface for Three Types of Surgical Implant Plates

    International Nuclear Information System (INIS)

    Purpose: Carbon fiber materials have been increasingly used clinically, mainly in orthopedics, as an alternative to metallic implants because of their minimal artifacts on CT and MRI images. This study characterizes the transmission and backscatter properties of carbon fiber plates (CarboFix Orthopedics, Herzeliya, Israel) with measurements for radiation therapy applications, and compares them to traditional stainless steel (SS) and titanium (Ti) implant materials. Methods: For the transmission measurements, a 1-mm-thick test plate was placed upstream from a plane-parallel Markus chamber, separated by various thicknesses of polystyrene plates in 0.5 cm increments between 0 and 5 cm. With this setup, we quantified the radiation transmission as a function of distance to the inhomogeneity interface. The LINAC source-to-detector distance was maintained at 100 cm and 200 MU were delivered for each measurement. Two 3-cm solid water phantoms were placed at the top and bottom to provide buildup. All the measurements were performed for 6 MV and 18 MV photons. The backscatter measurements had the identical setup, except that the test plate was downstream of the chamber. Results: The carbon fiber plates did not introduce any measurable inhomogeneity effect on the transmission and backscatter factors because of their low atomic number. In contrast, traditional metal implant materials caused up to a 15% dose difference upstream and 25% backscatter downstream. Such differences decrease as the distance to the inhomogeneity interface increases and become unmeasurable at distances of 3 cm and 1 cm for upstream and downstream, respectively. Conclusion: A new type of carbon fiber implant plate was evaluated and found to have a minimal inhomogeneity effect in MV radiation beams. Patients with a carbon-based implant rather than a metal one would benefit during radiation therapy from the minimal backscatter and imaging artifacts.

  1. Analytical validation of an immunoassay for the quantification of N-terminal pro-B-type natriuretic peptide in feline blood.

    Science.gov (United States)

    Mainville, Celine A; Clark, Genevieve H; Esty, Katherine J; Foster, William M; Hanscom, Jancy L; Hebert, Kelly J; Lyons, Helen R

    2015-07-01

    The measurement of N-terminal pro-B-type natriuretic peptide (NT-proBNP), a biomarker for heart stress detectable in blood, has been shown to have clinical utility in cats with heart disease. A second-generation feline enzyme-linked immunosorbent assay (Cardiopet® proBNP, IDEXX Laboratories Inc., Westbrook, Maine) was developed to measure NT-proBNP in routine feline plasma or serum samples with improved analyte stability. Results of the analytical validation for the second-generation assay are presented. Analytic sensitivity was 10 pmol/l. Accuracy of 103.5% was determined via serial dilutions of 6 plasma samples. Coefficients of variation for intra-assay, interassay, and total precision were in the ranges of 1.6-6.3%, 4.3-8.8%, and 10.1-15.1%, respectively. Repeatability across 2 lots for both serum and plasma had an average coefficient of determination (r²) of 0.99 and a slope of 1.11. Stability of the analyte was found to be high. In serum samples held at 4°C for 24-72 hr, the mean percent recovery from time zero was ≥99%. In serum samples held at 25°C for 24 hr, the mean percent recovery from time zero was 91.9%, and for 48 hr, 85.6%. A method comparison of the first- and second-generation assays with a clinically characterized population of cats revealed no difference in the tests' ability to differentiate levels of NT-proBNP between normal cats and cats with occult cardiomyopathy (P < 0.001). Results from our study validate that the second-generation feline Cardiopet proBNP assay can measure NT-proBNP in routine feline plasma and serum samples with accuracy and precision. PMID:26077545
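    The precision and stability figures above come from standard calculations; a minimal sketch, using hypothetical replicate readings rather than the study's raw data:

```python
import statistics

def percent_cv(values):
    """Intra-assay coefficient of variation (%) = 100 * SD / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def percent_recovery(measured, time_zero):
    """Analyte stability: concentration as a percentage of the time-zero value."""
    return 100.0 * measured / time_zero

# Hypothetical NT-proBNP replicates (pmol/l) for one plasma sample:
replicates = [103.0, 98.0, 101.0, 99.0, 100.0]
cv = percent_cv(replicates)              # falls in the reported 1.6-6.3% range
recovery = percent_recovery(91.9, 100.0) # 91.9%, as for serum at 25°C / 24 hr
```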

  2. Investigation of the association of growth rate in grower-finishing pigs with the quantification of Lawsonia intracellularis and porcine circovirus type 2.

    Science.gov (United States)

    Johansen, Markku; Nielsen, Maibritt; Dahl, Jan; Svensmark, Birgitta; Bækbo, Poul; Kristensen, Charlotte Sonne; Hjulsager, Charlotte Kristiane; Jensen, Tim K; Ståhl, Marie; Larsen, Lars E; Angen, Oystein

    2013-01-01

    As part of a prospective cohort study in four herds, a nested case-control study was carried out. Five slow-growing pigs (cases) and five fast-growing pigs (controls) out of 60 pigs were selected for euthanasia and laboratory examination at the end of the study in each herd. A total of 238 pigs, all approximately 12 weeks old, were included in the study during the first week in the grower-finisher barn. In each herd, approximately 60 pigs from four pens were individually ear-tagged. The pigs were weighed at the beginning of the study and at the end of the 6-8 week observation period. Clinical data, blood and faecal samples were serially collected from the 60 selected pigs every second week of the observation period. In the euthanized pigs, serum was examined for antibodies against Lawsonia intracellularis (LI) and porcine circovirus type 2 (PCV2), and in addition the PCV2 viral DNA content was quantified. In faeces, the quantity of LI cells/g faeces and the number of PCV2 copies/g faeces were measured by qPCR. The objective of the study was to examine whether growth rate in grower-finishing pigs is associated with the detection of LI and PCV2 infection or with clinical data. This study has shown that diarrhoea is a significant risk factor for a low growth rate and that a one log10 unit increase in LI load increases the odds ratio for a pig to have a low growth rate by 2.0 times. Gross lesions in the small intestine and an LI load > log10 6/g were significant risk factors for low growth. No association between PCV2 virus and low growth was found. PMID:22854321
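    The reported effect size (odds of low growth doubling per log10 unit of LI load) is the usual exponentiated logistic-regression coefficient; a small illustrative sketch, not the authors' fitted model:

```python
import math

def odds_ratio(beta, delta=1.0):
    """Odds ratio for a `delta` log10-unit increase in pathogen load,
    given a logistic-regression coefficient `beta` per log10 unit."""
    return math.exp(beta * delta)

beta_li = math.log(2.0)                 # coefficient implying OR = 2.0 per log10 unit
or_one_log = odds_ratio(beta_li)        # 2.0
or_two_logs = odds_ratio(beta_li, 2.0)  # 4.0 -- odds ratios multiply across units
```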

  3. Investigation of the association of growth rate in grower-finishing pigs with the quantification of Lawsonia intracellularis and porcine circovirus type 2

    DEFF Research Database (Denmark)

    Johansen, Markku; Nielsen, MaiBritt

    2013-01-01

    As part of a prospective cohort study in four herds, a nested case-control study was carried out. Five slow-growing pigs (cases) and five fast-growing pigs (controls) out of 60 pigs were selected for euthanasia and laboratory examination at the end of the study in each herd. A total of 238 pigs, all approximately 12 weeks old, were included in the study during the first week in the grower–finisher barn. In each herd, approximately 60 pigs from four pens were individually ear-tagged. The pigs were weighed at the beginning of the study and at the end of the 6–8 week observation period. Clinical data, blood and faecal samples were serially collected from the 60 selected pigs every second week of the observation period. In the euthanized pigs, serum was examined for antibodies against Lawsonia intracellularis (LI) and porcine circovirus type 2 (PCV2), and in addition the PCV2 viral DNA content was quantified. In faeces, the quantity of LI cells/g faeces and the number of PCV2 copies/g faeces were measured by qPCR. The objective of the study was to examine whether growth rate in grower-finishing pigs is associated with the detection of LI and PCV2 infection or with clinical data. This study has shown that diarrhoea is a significant risk factor for a low growth rate and that a one log10 unit increase in LI load increases the odds ratio for a pig to have a low growth rate by 2.0 times. Gross lesions in the small intestine and an LI load > log10 6/g were significant risk factors for low growth. No association between PCV2 virus and low growth was found.

  4. LRS Bianchi type -V cosmology with heat flow in scalar: tensor theory

    Scientific Electronic Library Online (English)

    Singh, C.P.

    2009-12-01

    Full Text Available In this paper we present a spatially homogeneous, locally rotationally symmetric (LRS) Bianchi type-V perfect fluid model with heat conduction in the scalar-tensor theory proposed by Saez and Ballester. The field equations are solved with and without heat conduction by using a law of variation for the mean Hubble parameter, which is related to the average scale factor of the metric and yields a constant value for the deceleration parameter. The law of variation for the mean Hubble parameter generates two types of cosmologies: one of power-law form and the other of exponential form. Using these two forms, singular and non-singular solutions are obtained with and without heat conduction. We observe that a constant value of the deceleration parameter provides a reasonable description of the different phases of the universe. We arrive at the conclusion that the universe decelerates for positive values of the deceleration parameter, whereas it accelerates for negative ones. The physical constraints on the solutions of the field equations, and in particular the thermodynamical laws and energy conditions that govern such solutions, are discussed in some detail. The behavior of observationally important parameters like the expansion scalar, anisotropy parameter and shear scalar is considered in detail.

  5. The use of quantitative PCR for identification and quantification of Brachyspira pilosicoli, Lawsonia intracellularis and Escherichia coli fimbrial types F4 and F18 in pig feces

    DEFF Research Database (Denmark)

    Ståhl, Marie; Kokotovic, Branko

    2011-01-01

    Four quantitative PCR (qPCR) assays were evaluated for quantitative detection of Brachyspira pilosicoli, Lawsonia intracellularis, and E. coli fimbrial types F4 and F18 in pig feces. Standard curves were based on feces spiked with the respective reference strains. The detection limits from the spiking experiments were 10^2 bacteria/g feces for Bpilo-qPCR and Laws-qPCR, and 10^3 CFU/g feces for F4-qPCR and F18-qPCR. The PCR efficiency for all four qPCR assays was between 0.91 and 1.01, with R² above 0.993. Standard-curve slopes and elevations varied between assays and between measurements from pure DNA of the reference strains and feces spiked with the respective strains. The linear ranges found for spiked fecal samples differed both from the linear ranges for pure cultures of the reference strains and between the qPCR tests. In spiked feces the linear ranges were five log units for F4-qPCR and Laws-qPCR, six log units for F18-qPCR and three log units for Bpilo-qPCR. When measured on pure DNA from the reference strains used in the spiking experiments, the respective ranges were seven log units for Bpilo-qPCR, Laws-qPCR and F18-qPCR, and six log units for F4-qPCR. This shows the importance of using specific standard curves, where each pathogen is analysed in the same matrix as the sample DNA. The qPCRs were compared to traditional bacteriological diagnostic methods and found to be more sensitive than cultivation for E. coli and B. pilosicoli. The qPCR assay for Lawsonia was also more sensitive than the previously used method, due to improvements in DNA extraction. In addition, as samples were not analysed for all four pathogens by traditional diagnostic methods, many samples were found positive for agents that were not expected on the basis of age and case history. The use of quantitative PCR tests for diagnosis of enteric diseases provides new possibilities for veterinary diagnostics. The parallel simultaneous analysis for several bacteria in multi-qPCR and the determination of the quantities of the infectious agents increase the information obtained from the samples and the chance of obtaining a relevant diagnosis.
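    The reported PCR efficiencies follow from the standard-curve slopes via the usual relation E = 10^(-1/slope) - 1; a short sketch with illustrative slopes (the record reports efficiencies, not slopes):

```python
def pcr_efficiency(slope):
    """Amplification efficiency from the slope of a standard curve
    (Cq versus log10 template): E = 10**(-1/slope) - 1.
    A slope of about -3.32 corresponds to E = 1.0 (perfect doubling)."""
    return 10 ** (-1.0 / slope) - 1.0

# Slopes chosen to bracket the reported efficiency range of 0.91-1.01:
e_low = pcr_efficiency(-3.56)   # ~0.91
e_high = pcr_efficiency(-3.30)  # ~1.01
```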

  6. Experimento para quantificar a eficiência de aspersão de líquidos: aplicação em distribuidores espinha de peixe / Liquid aspersion efficiency quantification experiment: application in ladder-type distributors

    Scientific Electronic Library Online (English)

    Marlene Silva de, Moraes; José Renato Baptista de, Lima; Deovaldo de, Moraes Júnior; Luis Renato Bastos, Lia; Sandro Megale, Pizzo.

    2008-03-01

    Full Text Available This paper describes a device developed on the pilot scale and a simple method to compare liquid distributor efficiencies. The technique consists basically of analyzing the mass of the liquid collected in 21 vertical pipes measuring 52 mm in internal diameter and 800 mm in length, placed in a quadratic arrangement below the distributor. A 50 mm thick acrylic blanket that does not disperse liquids was placed between the distributor and the pipe bank to avoid splashes. As an example of the application, assays were carried out with nine ladder-type distributors, each equipped with 4 parallel pipes, for a column measuring 400 mm in diameter. The number (n) of orifices (95, 127, and 159 orifices/m²), the orifice diameter (d) (2, 3, and 4 mm) and the flow rate (q) (1.2, 1.4, and 1.6 m³/h) were varied. The best spread efficiency, i.e. the lowest standard deviation, was achieved with n of 159 orifices/m², d of 2 mm and q of 1.4 m³/h, which indicates the limitations of practical design rules. The pressure (p) at the distributor's inlet for this condition was only 51000 Pa (0.51 kgf/cm²), while the average velocity (v) in each orifice was 6.3 m/s.
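    The spread-efficiency comparison reduces to computing the standard deviation of the 21 collected masses for each distributor configuration; a minimal sketch with made-up collection data, not measurements from the paper:

```python
import statistics

def spread_quality(masses):
    """Distributor uniformity metric: mean and standard deviation of the
    liquid masses collected in the 21 tubes; a lower SD means better spread."""
    return statistics.mean(masses), statistics.stdev(masses)

# Two hypothetical 21-tube collections (grams); the second spreads more evenly.
uneven = [180, 220, 150, 250, 190, 210, 160, 240, 200, 170,
          230, 185, 215, 155, 245, 195, 205, 165, 235, 175, 225]
even   = [198, 202, 196, 204, 199, 201, 197, 203, 200, 198,
          202, 199, 201, 197, 203, 200, 196, 204, 198, 202, 200]
mean_a, sd_a = spread_quality(uneven)
mean_b, sd_b = spread_quality(even)     # same throughput, smaller SD
```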

  7. Experimento para quantificar a eficiência de aspersão de líquidos: aplicação em distribuidores espinha de peixe Liquid aspersion efficiency quantification experiment: application in ladder-type distributors

    Directory of Open Access Journals (Sweden)

    Marlene Silva de Moraes

    2008-03-01

    Full Text Available This paper describes a device developed on the pilot scale and a simple method to compare liquid distributor efficiencies. The technique consists basically of analyzing the mass of the liquid collected in 21 vertical pipes measuring 52 mm in internal diameter and 800 mm in length, placed in a quadratic arrangement below the distributor. A 50 mm thick acrylic blanket that does not disperse liquids was placed between the distributor and the pipe bank to avoid splashes. As an example of the application, assays were carried out with nine ladder-type distributors, each equipped with 4 parallel pipes, for a column measuring 400 mm in diameter. The number (n) of orifices (95, 127, and 159 orifices/m²), the orifice diameter (d) (2, 3, and 4 mm) and the flow rate (q) (1.2, 1.4, and 1.6 m³/h) were varied. The best spread efficiency, i.e. the lowest standard deviation, was achieved with n of 159 orifices/m², d of 2 mm and q of 1.4 m³/h, which indicates the limitations of practical design rules. The pressure (p) at the distributor's inlet for this condition was only 51000 Pa (0.51 kgf/cm²), while the average velocity (v) in each orifice was 6.3 m/s.

  8. Prospects of using the second-order perturbation theory of the MP2 type in the theory of electron scattering by polyatomic molecules

    International Nuclear Information System (INIS)

    So far the second-order perturbation theory has been applied only to the hydrogen molecule. No application has been attempted for any other molecule, probably because of the technical difficulties of such calculations. The purpose of this contribution is to show that calculations of this type are now feasible for larger polyatomic molecules, even on commonly used computers.

  9. Non-perturbative black holes in Type-IIA String Theory versus the No-Hair conjecture

    International Nuclear Information System (INIS)

    We obtain the first black hole solution of Type-IIA String Theory compactified on an arbitrary self-mirror Calabi–Yau manifold in the presence of non-perturbative quantum corrections. Remarkably enough, the solution involves multivalued functions, which could lead to a violation of the No-Hair conjecture. We discuss how String Theory forbids such a scenario. However, the possibility still remains open in the context of four-dimensional ungauged Supergravity. (paper)

  10. Campbelling-type theory of fission chamber signals generated by neutron chains in a multiplying medium

    Science.gov (United States)

    Pál, L.; Pázsit, I.

    2015-09-01

    The signals of fission chambers are usually evaluated with the help of so-called Campbelling techniques. These are based on the Campbell theorem, which states that if the primary incoming events generating the detector pulses are independent, then relationships exist between the moments of various orders of the signal in current mode. This makes it possible to determine the mean intensity of the detection events, which is proportional to the static flux, from the higher moments of the detector current, which has certain advantages. However, the main application area of fission chambers is measurements in power reactors where, as is well known, the individual detection events are not independent, due to the branching character of the neutron chains (neutron multiplication). It is therefore of interest to extend the Campbelling-type theory to the case of correlated neutron events. Such a theory could address two questions: first, to investigate the bias when the traditional Campbell techniques are used for correlated incoming events; and second, to see whether the correlation properties of the detection events, which carry information on the multiplying medium, could be extracted from the measurements. This paper is devoted to the investigation of these questions. The results show that there is a potential possibility to extract the same information from fission chamber signals in current mode as with the Rossi- or Feynman-alpha methods, or from coincidence and multiplicity measurements, which so far have required detectors working in pulse mode. It is also shown that applying the standard Campbelling techniques to neutron detection in multiplying systems does not lead to an error in estimating the stationary flux, as long as the detector is calibrated in situ.
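    For independent (Poisson) events, the Campbell relations referred to above take a simple form; the sketch below uses them to recover the event intensity from the signal variance, with made-up pulse-shape integrals (the paper's point is precisely that branching correlations modify these relations):

```python
def campbell_moments(rate, pulse_area, pulse_sq_area):
    """Campbell's theorem for independent detection events:
    mean current = rate * integral(f),  variance = rate * integral(f**2),
    where f is the detector pulse shape."""
    return rate * pulse_area, rate * pulse_sq_area

def rate_from_variance(variance, pulse_sq_area):
    """The usual 'Campbelling' estimate: invert the second-moment relation."""
    return variance / pulse_sq_area

# Illustrative numbers: 1e6 detections/s, pulse integrals in arbitrary units.
mean_i, var_i = campbell_moments(rate=1e6, pulse_area=2e-12, pulse_sq_area=5e-18)
estimated_rate = rate_from_variance(var_i, 5e-18)  # recovers 1e6
```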

  11. Itinerant type many-body theories for photo-induced structural phase transitions

    Science.gov (United States)

    Nasu, Keiichiro

    2004-09-01

    Itinerant-type quantum many-body theories for photo-induced structural phase transitions (PSPTs) are reviewed in close connection with various recent experimental results related to this new optical phenomenon. There are two key concepts: the hidden multi-stability of the ground state, and the proliferation of optically excited states. Taking the ionic (I) → neutral (N) phase transition in the organic charge transfer (CT) crystal TTF-CA as a typical example of this type of transition, we first theoretically show an adiabatic path which starts from CT excitons in the I-phase but finally reaches an N-domain of macroscopic size. In connection with this I-N transition, the concept of initial-condition sensitivity is also developed so as to clarify the experimentally observed nonlinear characteristics of this material. Next, using a more simplified model for the many-exciton system, we theoretically study the early-time quantum dynamics of the exciton proliferation, which finally results in the formation of a domain with a large number of excitons. For this purpose, we derive a stepwise iterative equation to describe the exciton proliferation and clarify the origin of the initial-condition sensitivity. Possible differences between a photo-induced nonequilibrium phase and an equilibrium phase at high temperatures are also clarified from general and conceptual points of view, in connection with recent experiments on the photo-induced phase transition in an organo-metallic complex crystal. It will be shown that the photo-induced phase can make a new interaction appear as a broken symmetry only in this phase, even when this interaction is almost completely hidden in all the equilibrium phases, such as the ground state and other high-temperature phases. The relation between the photo-induced nonequilibrium phase and the hysteresis-induced nonequilibrium one is also qualitatively discussed.
    We are also concerned with a macroscopic parity violation and a ferro- (or super-para-) electricity induced by a photogenerated electron in the perovskite-type quantum dielectric SrTiO3. The photogenerated electron in the 3d band of Ti is assumed to couple weakly, but quadratically, with soft anharmonic T1u phonons, and strongly but linearly to the breathing (A1g) type high-energy phonons. These two types of electron-phonon coupling result in two types of polarons: a super-para-electric (SPE) large polaron with a quasi-global parity violation, and an off-centre type self-trapped polaron with only a local parity violation. The SPE large polaron, being equivalent to a charged and conductive ferroelectric domain, greatly enhances both the quasi-static electric susceptibility and the electronic conductivity. We also briefly review recent successes in observing PSPTs more directly by using x-ray measurements.

  12. Quantification and Negation in Event Semantics

    OpenAIRE

    Lucas Champollion

    2010-01-01

    Recently, it has been claimed that event semantics does not go well together with quantification, especially if one rejects syntactic, LF-based approaches to quantifier scope. This paper shows that such fears are unfounded, by presenting a simple, variable-free framework which combines a Neo-Davidsonian event semantics with a type-shifting based account of quantifier scope. The main innovation is that the event variable is bound inside the verbal denotation, rather than at sentence level by e...

  13. Quantification of quadriceps and hamstring antagonist activity

    OpenAIRE

    Kellis, E.

    2010-01-01

    The coactivation of hamstrings and quadriceps, and its relation to knee joint stability and cruciate ligament loading, have been extensively examined over the last decades. The purpose of this review is to present findings on the quantification of antagonist activation around the knee. Coactivation of the quadriceps and hamstrings during many activities has been examined using electromyography (EMG). However, there are several factors that affect antagonist EMG activity, such as the type of m...

  14. Disease quantification in dermatology

    DEFF Research Database (Denmark)

    Greve, Tanja Maria; Kamp, Søren; Jemec, Gregor B E

    2013-01-01

    Accurate documentation of disease severity is a prerequisite for clinical research and the practice of evidence-based medicine. The quantification of skin diseases such as psoriasis currently relies heavily on clinical scores. Although these clinical scoring methods are well established and very useful in quantifying disease severity, they require an extensive clinical experience and carry a risk of subjectivity. We explore the opportunity to use in vivo near-infrared (NIR) spectra as an objecti...

  15. Search of unified theory of basic types of elementary particle interactions. Part 2

    International Nuclear Information System (INIS)

    An attempt is made at developing a renormalizable theory unifying the theory of electromagnetic interactions and the theory of weak interactions. The theory is based on the principle of gauge symmetry and the idea of its spontaneous breaking. Feynman and Gell-Mann suggested a universal four-fermion theory, completed with an intermediate boson theory. In order for the theory to be renormalizable, it should be based on a suitable group of local gauge symmetries. All weak interactions comprised in the universal four-fermion theory are correctly described by a gauge theory based on weak isotopic symmetry. When the phase symmetry is also introduced, the photon can be included in the proposed gauge theory. In addition to vector bosons and fermions, the Higgs boson with zero spin is present in the theory. The theory unifying weak and electromagnetic interactions was proposed by Weinberg and Salam and is confirmed by experiments and new discoveries. Another attempt concerns the establishment of a theory unifying these interactions with the strong interactions, the so-called grand unification. The task here consists in finding a symmetry group which would include, as special cases of its transformations, the symmetries corresponding to the strong, weak and electromagnetic interactions. The group SU(5) seems to be a suitable group for this unification. (M.D.)

  16. From Peierls brackets to a generalized Moyal bracket for type-I gauge theories

    CERN Document Server

    Esposito, G; Esposito, Giampiero; Stornaiolo, Cosimo

    2006-01-01

    In the space-of-histories approach to gauge fields and their quantization, the Maxwell, Yang–Mills and gravitational fields are well known to share the property of being type-I theories, i.e. Lie brackets of the vector fields which leave the action functional invariant are linear combinations of such vector fields, with coefficients of linear combination given by structure constants. The corresponding gauge-field operator in the functional integral for the in-out amplitude is an invertible second-order differential operator. For such an operator, we consider advanced and retarded Green functions giving rise to a Peierls bracket among group-invariant functionals. Our Peierls bracket is a Poisson bracket on the space of all group-invariant functionals in two cases only: either the gauge-fixing is arbitrary but the gauge fields lie on the dynamical sub-space; or the gauge-fixing is a linear functional of gauge fields, which are generic points of the space of histories. In both cases, the resulting Peierls bracke...

  17. Psychosocial correlates of dietary behaviour in type 2 diabetic women, using a behaviour change theory.

    Science.gov (United States)

    Didarloo, A; Shojaeizadeh, D; Gharaaghaji Asl, R; Niknami, S; Khorami, A

    2014-06-01

    The study evaluated the efficacy of the Theory of Reasoned Action (TRA), along with self-efficacy to predict dietary behaviour in a group of Iranian women with type 2 diabetes. A sample of 352 diabetic women referred to Khoy Diabetes Clinic, Iran, were selected and given a self-administered survey to assess eating behaviour, using the extended TRA constructs. Bivariate correlations and Enter regression analyses of the extended TRA model were performed with SPSS software. Overall, the proposed model explained 31.6% of variance of behavioural intention and 21.5% of variance of dietary behaviour. Among the model constructs, self-efficacy was the strongest predictor of intentions and dietary practice. In addition to the model variables, visit intervals of patients and source of obtaining information about diabetes from sociodemographic factors were also associated with dietary behaviours of the diabetics. This research has highlighted the relative importance of the extended TRA constructs upon behavioural intention and subsequent behaviour. Therefore, use of the present research model in designing educational interventions to increase adherence to dietary behaviours among diabetic patients was recommended and emphasized. PMID:25076670

  18. Algebraic Signal Processing Theory: Cooley-Tukey Type Algorithms for Polynomial Transforms Based on Induction

    CERN Document Server

    Sandryhaila, Aliaksei; Pueschel, Markus

    2010-01-01

    A polynomial transform is the multiplication of an input vector $x\in\mathbb{C}^n$ by a matrix $\mathcal{P}_{b,\alpha}\in\mathbb{C}^{n\times n}$, whose $(k,\ell)$-th element is defined as $p_\ell(\alpha_k)$ for polynomials $p_\ell(x)\in\mathbb{C}[x]$ from a list $b=\{p_0(x),\dots,p_{n-1}(x)\}$ and sample points $\alpha_k\in\mathbb{C}$ from a list $\alpha=\{\alpha_0,\dots,\alpha_{n-1}\}$. Such transforms find applications in the areas of signal processing, data compression, and function interpolation. Important examples include the discrete Fourier and cosine transforms. In this paper we introduce a novel technique to derive fast algorithms for polynomial transforms. The technique uses the relationship between polynomial transforms and the representation theory of polynomial algebras. Specifically, we derive algorithms by decomposing the regular modules of these algebras as a stepwise induction. As an application, we derive novel $O(n\log n)$ general-radix algorithms for the discrete Fourier transform and the discrete cosine transform of type 4.
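    To make the definition concrete, here is a small sketch (function names are mine) that builds the polynomial-transform matrix entrywise and recovers the DFT as the special case $p_\ell(x) = x^\ell$ sampled at the $n$-th roots of unity:

```python
import cmath

def poly_eval(coeffs, x):
    """Evaluate a polynomial given by its coefficients (lowest degree first)
    at x, using Horner's rule."""
    acc = 0
    for c in reversed(coeffs):
        acc = acc * x + c
    return acc

def polynomial_transform(polys, points):
    """Matrix P with (k, l) entry p_l(alpha_k); applying the transform to a
    vector is then just the matrix-vector product."""
    return [[poly_eval(p, a) for p in polys] for a in points]

# DFT as a polynomial transform: b = {1, x, x^2, x^3}, alpha_k = exp(-2*pi*i*k/4)
n = 4
monomials = [[0] * l + [1] for l in range(n)]
roots = [cmath.exp(-2j * cmath.pi * k / n) for k in range(n)]
dft4 = polynomial_transform(monomials, roots)  # the 4-point DFT matrix
```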

  19. Kendall's Shape Statistics as a Classical Realization of Barbour-type Timeless Records Theory approach to Quantum Gravity

    CERN Document Server

    Anderson, Edward

    2013-01-01

    I have already shown that Kendall's work on shape geometry provides the geometrical description of the reduced configuration spaces (alias shape spaces) of Barbour's relational mechanics. I now describe the extent to which Kendall's subsequent statistical application to problems such as the `standing stones problem' realizes further ideas along the lines of Barbour-type timeless records theories, albeit just at the classical level.

  20. De-Sitter Type of Cosmological Model in n-Dimensional Space-Time-Mass (STM) Theory of Gravitation

    OpenAIRE

    Khadekar, G S; Patki, Vrishali

    2002-01-01

    Exact solutions are obtained for a homogeneous, spatially isotropic cosmological model in matter-free space, with or without a cosmological constant, for an n-dimensional Kaluza-Klein type of metric in the rest-mass-varying theory of gravity proposed by Wesson [1983]. The behavior of the model is discussed.

  1. Non-Abelian dual superconductivity in SU(3) Yang-Mills theory: dual Meissner effect and type of the vacuum

    CERN Document Server

    Shibata, Akihiro; Kato, Seikou; Shinohara, Toru

    2013-01-01

    We have proposed the non-Abelian dual superconductivity picture for quark confinement in the SU(3) Yang-Mills (YM) theory, and have given numerical evidence for the restricted-field dominance and the non-Abelian magnetic monopole dominance in the string tension by applying a new formulation of the YM theory on a lattice. To establish the non-Abelian dual superconductivity picture for quark confinement, we have observed the non-Abelian dual Meissner effect in the SU(3) Yang-Mills theory by measuring the chromoelectric flux created by the quark-antiquark source, and the non-Abelian magnetic monopole currents induced around the flux. We conclude that the dual superconductivity of the SU(3) Yang-Mills theory is strictly of type I and that this type of dual superconductivity is reproduced by the restricted field and the non-Abelian magnetic monopole part, in sharp contrast to the SU(2) case, which lies on the border between type I and type II.

  2. Advances in type-2 fuzzy sets and systems theory and applications

    CERN Document Server

    Mendel, Jerry; Tahayori, Hooman

    2013-01-01

    This book explores recent developments in the theoretical foundations and novel applications of general and interval type-2 fuzzy sets and systems, including: algebraic properties of type-2 fuzzy sets, geometric-based definition of type-2 fuzzy set operators, generalizations of the continuous KM algorithm, adaptiveness and novelty of interval type-2 fuzzy logic controllers, relations between conceptual spaces and type-2 fuzzy sets, type-2 fuzzy logic systems versus perceptual computers; modeling human perception of real world concepts with type-2 fuzzy sets, different methods for generating membership functions of interval and general type-2 fuzzy sets, and applications of interval type-2 fuzzy sets to control, machine tooling, image processing and diet.  The applications demonstrate the appropriateness of using type-2 fuzzy sets and systems in real world problems that are characterized by different degrees of uncertainty.

  3. Bianchi Type-II String Cosmological Model with Magnetic Field in Scalar-tensor Theory of Gravitation

    Science.gov (United States)

    Sharma, N. K.; Singh, J. K.

    2015-03-01

    The spatially homogeneous and totally anisotropic Bianchi type-II cosmological solutions of massive strings have been investigated in the presence of the magnetic field in the framework of the scalar-tensor theory of gravitation formulated by Saez and Ballester (Phys. Lett. A 113:467, 1986). With the help of the special law of variation for Hubble's parameter proposed by Berman (Nuovo Cimento B 74:182, 1983), a string cosmological model is obtained in this theory. Some physical and kinematical properties of the model are also discussed.

  4. Bianchi type-II String Cosmological Model with Magnetic Field in Scale-Covariant Theory of Gravitation

    Science.gov (United States)

    Sharma, N. K.; Singh, J. K.

    2014-12-01

    The spatially homogeneous and totally anisotropic Bianchi type-II cosmological solutions of massive strings have been investigated in the presence of the magnetic field in the framework of the scale-covariant theory of gravitation formulated by Canuto et al. (Phys. Rev. Lett. 39, 429, 1977). With the help of the special law of variation for Hubble's parameter proposed by Berman (Nuovo Cimento 74, 182, 1983), a string cosmological model is obtained in this theory. We use the power law relation between the scalar field φ and the scale factor R to find the solutions. Some physical and kinematical properties of the model are also discussed.

  5. Auslander-Reiten quiver and representation theories related to KLR-type Schur-Weyl duality

    OpenAIRE

    Oh, Se-Jin

    2015-01-01

    We introduce new notions on sequences of positive roots by using Auslander-Reiten quivers. We then prove that the new notions provide interesting information on the representation theories of KLR algebras, quantum groups and quantum affine algebras, including a generalized Dorey's rule, the theory of bases for quantum groups, and denominator formulas between fundamental representations.

  6. Quantification in echocardiography.

    Science.gov (United States)

    Korsten, Hendrikus H M; Mischi, Massimo; Grouls, Rene J E; Jansen, Annemiek; van Dantzig, Jan-Melle; Peels, Kathinka

    2006-03-01

    Until recently, more than 2200 Swan-Ganz catheters were used annually in the operating rooms (OR) and intensive care unit (ICU) of the Catharina Hospital in Eindhoven, The Netherlands. After cardiologists who were specialists in echocardiography (ECHO) trained anesthesiologists in ECHO, the need for these catheters in cardiac and noncardiac surgery was reduced. Initially intended as a local teaching project, an ECHO teaching compact disk (CD) was produced during the training and later distributed worldwide, thanks to a positive review in a major anesthesiology publication. By reducing the number of Swan-Ganz catheters, the hospital could finance and acquire two echocardiography machines for the OR and ICU. The availability of these machines resulted in a further reduction of the number of Swan-Ganz catheters. However, the need for quantification (eg, measurements of cardiac output) remained. During the creation of the ECHO teaching CD, the idea was born to apply indicator-dilution principles to injected echo contrast. This study was performed in cooperation with the Signal Processing Department of the Eindhoven University of Technology. Advanced signal processing and modelling were used to develop algorithms enabling quantification of intrapulmonary blood volume, ejection fraction, and flow from the transesophageal echocardiography approach. These quantitative measurements, which can be performed on an outpatient basis, may become a real asset in cardiology, anesthesiology, and intensive care. PMID:16703235
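The indicator-dilution principle mentioned above is classically summarized by the Stewart-Hamilton relation: flow equals the injected dose divided by the area under the first-pass concentration-time curve. A hedged sketch with a synthetic gamma-variate curve (illustrative numbers only, not the study's algorithm):

```python
import numpy as np

def stewart_hamilton_flow(dose_mg, t_s, conc_mg_per_l):
    # Area under the first-pass curve via the trapezoid rule: (mg/L) * s
    area = float(np.sum((conc_mg_per_l[1:] + conc_mg_per_l[:-1]) * np.diff(t_s)) / 2.0)
    # dose / area has units L/s; convert to L/min
    return dose_mg / area * 60.0

# Synthetic gamma-variate first-pass curve (illustrative numbers only)
t = np.linspace(0.0, 40.0, 401)            # time in seconds
c = 15.0 * (t / 4.0) ** 2 * np.exp(-t / 4.0)   # concentration in mg/L
flow = stewart_hamilton_flow(10.0, t, c)   # estimated flow in L/min
```

In practice the curve must be corrected for recirculation before integrating; that step is omitted here.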

  7. Accident sequence quantification with KIRAP

    International Nuclear Information System (INIS)

    The tasks of probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data and, finally, accident sequence quantification. In a PSA, accident sequence quantification calculates the core damage frequency and supports importance analysis and uncertainty analysis. Accident sequence quantification requires an understanding of the whole PSA model, because it combines all event tree and fault tree models, and it requires an efficient computer code, because the computation time is long. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures for performing accident sequence quantification, the use of KIRAP's cut set generator, and how to perform accident sequence quantification with KIRAP. (author). 6 refs
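Once minimal cut sets have been generated, quantification commonly uses the rare-event approximation or the minimal-cut-set upper bound. A minimal sketch with hypothetical event names and probabilities (not KIRAP's actual interface):

```python
from math import prod

def rare_event_frequency(cut_sets, p):
    # First-order (rare-event) approximation: sum of cut-set probability products
    return sum(prod(p[e] for e in cs) for cs in cut_sets)

def mcub_frequency(cut_sets, p):
    # Minimal-cut-set upper bound: 1 - prod_i (1 - Q_i) over cut sets i
    q = 1.0
    for cs in cut_sets:
        q *= 1.0 - prod(p[e] for e in cs)
    return 1.0 - q

# Hypothetical accident sequence: initiating event plus two failure paths
p = {"IE": 1e-2, "PUMP_A": 1e-3, "PUMP_B": 1e-3, "VALVE": 5e-4}
cut_sets = [("IE", "PUMP_A", "PUMP_B"), ("IE", "VALVE")]
freq = rare_event_frequency(cut_sets, p)  # illustrative, per demand
```

The rare-event sum slightly overestimates the exact union probability, which is why the upper-bound formula is often reported alongside it.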

  8. Quantification and Negation in Event Semantics

    Directory of Open Access Journals (Sweden)

    Lucas Champollion

    2010-12-01

    Full Text Available Recently, it has been claimed that event semantics does not go well together with quantification, especially if one rejects syntactic, LF-based approaches to quantifier scope. This paper shows that such fears are unfounded, by presenting a simple, variable-free framework which combines a Neo-Davidsonian event semantics with a type-shifting based account of quantifier scope. The main innovation is that the event variable is bound inside the verbal denotation, rather than at sentence level by existential closure. Quantifiers can then be interpreted in situ. The resulting framework combines the strengths of event semantics and type-shifting accounts of quantifiers and thus does not force the semanticist to posit either a default underlying word order or a syntactic LF-style level. It is therefore well suited for applications to languages where word order is free and quantifier scope is determined by surface order. As an additional benefit, the system leads to a straightforward account of negation, which has also been claimed to be problematic for event-based frameworks.

  9. Transparent quantification into hyperintensional contexts.

    Czech Academy of Sciences Publication Activity Database

    Duží, M.; Jespersen, Bjorn

    London : College Publications, 2011 - (Peliš, M.; Punčochář, V.), s. 81-97 ISBN 978-1-84890-038-7. [LOGICA 2010. Hejnice (CZ), 21.06.2010-25.06.2010] Institutional research plan: CEZ:AV0Z90090514 Keywords : hyperintensions * type theory * Transparent intensional logic * propositional attitudes Subject RIV: AA - Philosophy ; Religion

  10. On a $p$-Laplacian type of evolution system and applications to the Bean model in the type-II superconductivity theory

    CERN Document Server

    Yin, H M

    1998-01-01

    We study the Cauchy problem for a $p$-Laplacian type of evolution system $\mathbf{H}_t + \nabla\times\left[\,|\nabla\times\mathbf{H}|^{p-2}\,\nabla\times\mathbf{H}\,\right] = \mathbf{F}$. This system governs the evolution of a magnetic field $\mathbf{H}$, where the displacement current is neglected and the electrical resistivity is assumed to be some power of the current density. The existence, uniqueness and regularity of solutions to the system are established. Furthermore, it is shown that the limit solution as the power $p \rightarrow \infty$ solves the problem of Bean's model in the type-II superconductivity theory. The result provides information about how the superconducting material becomes a normal conductor under an external force, and vice versa. It also provides an effective method for finding numerical solutions to Bean's model.

  11. Bianchi type-III minimally interacting holographic dark energy model with linearly varying deceleration parameter in Brans-Dicke theory

    Science.gov (United States)

    Kiran, M.; Reddy, D. R. K.; Rao, V. U. M.

    2015-12-01

    In this paper, we study two minimally interacting fields, matter and holographic dark energy components, in Bianchi type-III space-time in the framework of the Brans-Dicke (Phys. Rev. 125:961, 1961) scalar-tensor theory of gravitation. We present a Bianchi type-III holographic dark energy model with the help of the linearly varying deceleration parameter proposed by Akarsu and Dereli (Int. J. Theor. Phys. 51:612, 2012). Some physical and kinematical properties of the model are also discussed.

  12. Cosmological solution of Bianchi type I in a new theory of gravitation

    International Nuclear Information System (INIS)

    We present a homogeneous, plane-symmetric, matter-free solution to a new theory of gravitation. In the limit of large t, the solution goes over into the plane-symmetric Kasner metric of general relativity

  13. Two types of conservation laws. Connection of physical fields with material systems. Peculiarities of field theories

    OpenAIRE

    Petrova, L. I.

    2008-01-01

    Historically it has happened that in the branches of physics connected with field theory and in the physics of material systems (continuous media) the concept of "conservation laws" has different meanings. In field theory "conservation laws" are those that claim the existence of conservative physical quantities or objects. These are conservation laws for physical fields. In contrast to that, in the physics (and mechanics) of material systems the concept of "conservation laws" relates to co...

  14. Spectral analysis of polynomial potentials and its relation with ABJ/M-type theories

    International Nuclear Information System (INIS)

    We obtain a general class of polynomial potentials for which the Schroedinger operator has a discrete spectrum. This class includes all the scalar potentials in membrane, 5-brane, p-branes, multiple M2 branes, BLG and ABJM theories. We provide a proof of the discreteness of the spectrum of the associated Schroedinger operators. This is the first step in order to analyze BLG and ABJM supersymmetric theories from a non-perturbative point of view.

  15. XPS quantification of the hetero-junction interface energy

    International Nuclear Information System (INIS)

    Highlights: • Quantum entrapment or polarization dictates the performance of dopants, impurities, interfaces, alloys and compounds. • Interface bond energy, energy density, and atomic cohesive energy can be determined using XPS and our BOLS theory. • Presents a new and reliable method for catalyst design and identification. • Entrapment makes CuPd a p-type catalyst and polarization makes AgPd an n-type catalyst. - Abstract: We present an approach for quantifying the heterogeneous interface bond energy using X-ray photoelectron spectroscopy (XPS). Firstly, by analyzing the XPS core-level shifts of the elemental surfaces, we obtained the energy levels of an isolated atom and their bulk shifts for the constituent elements as references; we then measured the shifts of the specific energy levels upon interface alloy formation. By subtracting the referential spectrum from that collected from the alloy, we can distil the interface effect on the binding energy. Calibrated against the energy levels and their bulk shifts derived from the elemental surfaces, we can derive the bond energy, energy density, atomic cohesive energy, and free energy in the interface region. This approach has enabled us to clarify the dominance of quantum entrapment at the CuPd interface and the dominance of polarization at the AgPd and BeW interfaces as the origin of the interface energy change. The developed approach not only enhances the power of XPS but also enables quantification of the interface energy at the atomic scale, which has long been a challenge.

  16. Demographic and Motivation Differences Among Online Sex Offenders by Type of Offense: An Exploration of Routine Activities Theories.

    Science.gov (United States)

    Navarro, Jordana N; Jasinski, Jana L

    2015-10-01

    This article presents an analysis of the relationship between online sexual offenders' demographic background and characteristics indicative of motivation and offense type. Specifically, we investigate whether these characteristics can distinguish different online sexual offender groups from one another as well as inform routine activity theorists on what potentially motivates perpetrators. Using multinomial logistic regression, this study found that online sexual offenders' demographic backgrounds and characteristics indicative of motivation do vary by offense types. Two important implications of this study are that the term "online sexual offender" encompasses different types of offenders, including some who do not align with mainstream media's characterization of "predators," and that the potential offender within routine activity theory can be the focus of empirical investigation rather than taken as a given in research. PMID:26480242

  17. Communication: Cosolvency and cononsolvency explained in terms of a Flory-Huggins type theory

    Science.gov (United States)

    Dudowicz, Jacek; Freed, Karl F.; Douglas, Jack F.

    2015-10-01

    Standard Flory-Huggins (FH) theory is utilized to describe the enigmatic cosolvency and cononsolvency phenomena for systems of polymers dissolved in mixed solvents. In particular, phase boundaries (specifically upper critical solution temperature spinodals) are calculated for solutions of homopolymers B in pure solvents and in binary mixtures of small molecule liquids A and C. The miscibility (or immiscibility) patterns for the ternary systems are classified in terms of the FH binary interaction parameters {χ_αβ} and the ratio r = φ_A/φ_C of the concentrations φ_A and φ_C of the two solvents. The trends in miscibility are compared to those observed for blends of random copolymers (A_x C_{1-x}) with homopolymers (B) and to those deduced for A/B/C solutions of polymers B in liquid mixtures of small molecules A and C that associate into polymeric clusters {A_p C_q}_i (i = 1, 2, …, ∞). Although the classic FH theory is able to explain cosolvency and cononsolvency phenomena, the theory does not include a consideration of the mutual association of the solvent molecules and the competitive association between the solvent molecules and the polymer. These interactions can be incorporated in refinements of the FH theory, and the present paper provides a foundation for such extensions for modeling the rich thermodynamics of polymers in mixed solvents.
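For orientation, the simplest binary (polymer-solvent) version of the FH spinodal follows from setting the second derivative of the mixing free energy to zero; the paper's ternary mixed-solvent case generalizes this. A minimal sketch under that binary simplification:

```python
import numpy as np

def fh_spinodal_chi(phi, N):
    # Binary FH free energy per site:
    #   f = (phi/N) ln phi + (1 - phi) ln(1 - phi) + chi * phi * (1 - phi)
    # Spinodal: d2f/dphi2 = 0  =>  chi_s = (1/(N phi) + 1/(1 - phi)) / 2
    return 0.5 * (1.0 / (N * phi) + 1.0 / (1.0 - phi))

N = 100                                # degree of polymerization (illustrative)
phi = np.linspace(0.005, 0.995, 199)   # polymer volume fraction grid
chi_s = fh_spinodal_chi(phi, N)

# Critical point of the binary model (minimum of the spinodal curve)
phi_c = 1.0 / (1.0 + np.sqrt(N))
chi_c = 0.5 * (1.0 + 1.0 / np.sqrt(N)) ** 2
```

Mapping temperature onto χ (e.g. χ ∝ 1/T) turns this curve into the UCST spinodal discussed in the abstract.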

  18. Training load quantification in triathlon

    OpenAIRE

    ROBERTO CEJUELA ANTA; JONATHAN ESTEVE-LANAO

    2011-01-01

    There are different indices of training stress, of varying complexity, for quantifying training load. Examples include the training impulse (TRIMP), the session RPE, Lucia's TRIMP and the Summated Zone Score. But triathlon, a combined sport in which there are interactions between the different segments, complicates the quantification of training. The aim of this paper is to review current methods of quantification and to propose a scale to quantify the training load in triathl...
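For reference, the TRIMP mentioned above is usually computed with Banister's formula: duration times fractional heart-rate reserve times an exponential intensity weighting. A minimal sketch using the standard published coefficients, applied to illustrative numbers (this is not the paper's proposed triathlon scale):

```python
import math

def banister_trimp(duration_min, hr_ex, hr_rest, hr_max, male=True):
    # Fractional heart-rate reserve during the session
    x = (hr_ex - hr_rest) / (hr_max - hr_rest)
    # Published sex-specific weighting coefficients
    k, b = (0.64, 1.92) if male else (0.86, 1.67)
    return duration_min * x * k * math.exp(b * x)

# One hour at a mean of 150 bpm, resting HR 50, max HR 190 (illustrative)
trimp = banister_trimp(60.0, 150.0, 50.0, 190.0)
```

The exponential weighting makes high-intensity minutes count disproportionately more than low-intensity ones, which is the formula's main appeal over plain duration-times-intensity scores.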

  19. Effective field theory of modified gravity on the spherically symmetric background: Leading order dynamics and the odd-type perturbations

    Science.gov (United States)

    Kase, Ryotaro; Gergely, László Á.; Tsujikawa, Shinji

    2014-12-01

    We consider perturbations of a static and spherically symmetric background endowed with a metric tensor and a scalar field in the framework of the effective field theory of modified gravity. We employ the previously developed 2+1+1 canonical formalism of a double Arnowitt-Deser-Misner (ADM) decomposition of space-time, which singles out both time and radial directions. Our building block is a general gravitational action that depends on scalar quantities constructed from the 2+1+1 canonical variables and the lapse. Variation of the action up to first order in perturbations gives rise to three independent background equations of motion, as expected from spherical symmetry. The dynamical equations of linear perturbations follow from the second-order Lagrangian after a suitable gauge fixing. We derive conditions for the avoidance of ghosts and Laplacian instabilities for the odd-type perturbations. We show that our results not only incorporate those derived in the most general scalar-tensor theories with second-order equations of motion (the Horndeski theories) but they can be applied to more generic theories beyond Horndeski.

  20. Quantum mechanical analysis on faujasite-type molecular sieves by using fermi dirac statistics and quantum theory of dielectricity

    International Nuclear Information System (INIS)

    We studied Faujasite-type molecular sieves by using Fermi-Dirac statistics and the quantum theory of dielectricity. We developed an empirical relationship for quantum capacitance which follows an inverse Gaussian profile in the frequency range of 66 Hz - 3 MHz. We calculated the quantum capacitance, sample crystal momentum, charge quantization and quantized energy of Faujasite-type molecular sieves in the frequency range of 0.1 Hz - 10⁴ MHz. Our calculations for the diameters of the sodalite and super-cages of Faujasite-type molecular sieves are in agreement with the experimental results reported in this manuscript. We also calculated the quantum polarizability, quantized molecular field, orientational polarizability and deformation polarizability by using the experimental results of Ligia Frunza et al. The phonons are overdamped in the frequency range 0.1 Hz - 10 kHz and become a source for producing cages in the Faujasite-type molecular sieves. Ion exchange recovery processes occur due to overdamped phonon excitations in Faujasite-type molecular sieves and with increasing temperatures. (author)

  1. Electrostatic field in superconductors IV: theory of Ginzburg-Landau type.

    Czech Academy of Sciences Publication Activity Database

    Lipavský, P.; Koláček, Jan

    Singapore : World Scientific Publ. Co, 2010 - (Kusmartsev, F.), s. 581-587 ISBN 978-981-4289-14-6. - (24). [International Workshop on Condensed Matter Theories /32./. Loughborough (GB), 12.08.2008-19.08.2008] Institutional research plan: CEZ:AV0Z10100521 Keywords : superconductor * electric field Subject RIV: BM - Solid Matter Physics ; Magnetism http://eproceedings.worldscinet.com/9789814289153/9789814289153.shtml

  2. The double Mellin-Barnes type integrals and their applications to convolution theory

    CERN Document Server

    Hai, Nguyen Thanh

    1992-01-01

    This book presents new results in the theory of the double Mellin-Barnes integrals, popularly known as the general H-function of two variables. A general integral convolution is constructed by the authors; it contains the Laplace convolution as a particular case and possesses a factorization property for the one-dimensional H-transform. Many examples of convolutions for classical integral transforms are obtained, and they can be applied to the evaluation of series and integrals.

  3. Adaptation of learning resources based on the MBTI theory of psychological types

    OpenAIRE

    Amel Behaz; Mahieddine Djoudi

    2012-01-01

    Today, the resources available on the web are increasing significantly. The motivation for the dissemination of knowledge and its acquisition by learners is central to learning. However, learners show differences in the ways of learning that suit them best. The objective of the work presented in this paper is to study how models from cognitive theories and ontologies can be integrated for the adaptation of educational resources. The goal is to provide the system capabilities to c...

  4. Classical Morse theory revisited I -- Backward $\\lambda$-Lemma and homotopy type

    OpenAIRE

    Weber, Joa

    2014-01-01

    We introduce a tool, dynamical thickening, which overcomes the infamous discontinuity of the gradient flow endpoint map near non-degenerate critical points. More precisely, we interpret the stable foliations of certain Conley pairs $(N,L)$, established in [4], as a \\emph{dynamical thickening of the stable manifold}. As a first application and to illustrate efficiency of the concept we reprove a fundamental theorem of classical Morse theory, Milnor's homotopical cell attachme...

  5. Maier-Saupe-type theory of ferroelectric nanoparticles in nematic liquid crystals

    OpenAIRE

    Lopatina, Lena M.; Selinger, Jonathan V.

    2011-01-01

    Several experiments have reported that ferroelectric nanoparticles have drastic effects on nematic liquid crystals--increasing the isotropic-nematic transition temperature by about 5 K, and greatly increasing the sensitivity to applied electric fields. In a recent paper [L. M. Lopatina and J. V. Selinger, Phys. Rev. Lett. 102, 197802 (2009)], we modeled these effects through a Landau theory, based on coupled orientational order parameters for the liquid crystal and the nanop...

  6. Elliptic sn-type solution fluctuations in λφ⁴ theory in a finite domain

    International Nuclear Information System (INIS)

    Some years ago Dashen, Hasslacher, and Neveu (DHN) studied the quantum fluctuations of a kink and the quantum corrections to its energy in an infinite domain in (1+1)-dimensional space-time. In this work we calculate the quantum fluctuations for classical solutions more general than the kink, which are named elliptic sn-type solutions. We obtain a second-order differential equation with five regular singular points, usually known as a Fuchsian equation. The solutions of this type of equation, known as generalized Lamé functions, are usually given by Riemann P-functions. These solutions present some difficulties in determining the radiative corrections for the elliptic sn-type solutions. (author)

  7. Session Types = Intersection Types + Union Types

    CERN Document Server

    Padovani, Luca

    2011-01-01

    We propose a semantically grounded theory of session types which relies on intersection and union types. We argue that intersection and union types are natural candidates for modeling branching points in session types and we show that the resulting theory overcomes some important defects of related behavioral theories. In particular, intersections and unions provide a native solution to the problem of computing joins and meets of session types. Also, the subtyping relation turns out to be a pre-congruence, while this is not always the case in related behavioral theories.

  8. Is the term "type-1.5 superconductivity" warranted by Ginzburg-Landau theory?

    Energy Technology Data Exchange (ETDEWEB)

    Kogan, V.G.; Schmalian, J.

    2011-01-03

    It is shown that within the Ginzburg-Landau (GL) approximation the order parameters Δ₁(r,T) and Δ₂(r,T) in two-band superconductors vary on the same length scale, the difference in zero-T coherence lengths ξ_{0ν} ≈ ℏv_F/Δ_ν(0), ν = 1, 2, notwithstanding. This amounts to a single physical GL parameter κ and the classic GL dichotomy: κ < 1/√2 for type I and κ > 1/√2 for type II.
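The classic dichotomy invoked here is easy to state computationally: form κ = λ/ξ from the penetration depth and coherence length and compare it with 1/√2. A trivial sketch with illustrative numbers:

```python
import math

def gl_type(penetration_depth, coherence_length):
    # kappa = lambda / xi; the GL dichotomy splits at 1/sqrt(2)
    kappa = penetration_depth / coherence_length
    return kappa, ("type I" if kappa < 1.0 / math.sqrt(2.0) else "type II")

# Lengths in the same units; the values below are illustrative, not material data
kappa, kind = gl_type(50.0, 90.0)
```

The point of the abstract is that even in a two-band superconductor a single such κ suffices within the GL approximation, despite the two bands having different zero-temperature coherence lengths.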

  9. Minimally interacting holographic dark energy model in Bianchi type-III universe in Brans-Dicke theory

    Science.gov (United States)

    Umadevi, S.; Ramesh, G.

    2015-10-01

    A spatially homogeneous and anisotropic Bianchi type-III universe filled with two minimally interacting fields is investigated: matter and holographic dark energy components in the framework of the Brans-Dicke (Phys. Rev. 124:925, 1961) theory of gravitation. To obtain determinate solutions of the field equations we have used (i) the scalar expansion being proportional to the shear scalar and (ii) the special law of variation for Hubble's parameter proposed by Berman (Nuovo Cimento B 74:182, 1983). Some physical and kinematical properties of the model are also discussed.

  10. Unified theory of mixed state Hall effect in type-II superconductors: Scaling behavior and sign reversal

    International Nuclear Information System (INIS)

    Based upon the normal core model of Bardeen and Stephen and by taking into account both the backflow effect and thermal fluctuations, we have developed a unified theory for the flux motion, particularly for the mixed state Hall effect in type-II superconductors. Both the puzzling scaling behavior and the anomalous sign reversal of the Hall effect have been demonstrated rigorously and naturally. We show that our results successfully explain all essential features of experiments on the mixed state Hall resistivity observed in high-Tc superconductors

  11. New Weyl-type vacuum space-time of Einstein's gravitational theory

    International Nuclear Information System (INIS)

    New Weyl-type space-times, solutions of the Einstein equations, are found which satisfy the causality conditions, such as the Lichtenstein theorem, and the asymptotic boundary conditions. The new solutions contain as special cases the Schwarzschild and Voorhees solutions and can be considered, like the Voorhees solutions, as the exterior space-time of a static axially symmetric body

  12. Fixed point theory for compact absorbing contractions in extension type spaces

    OpenAIRE

    Donal O'Regan

    2010-01-01

    Several new fixed point results for self-maps in extension type spaces are presented in this paper. In particular we discuss compact absorbing contractions.

  13. Investigating Strength and Frequency Effects in Recognition Memory Using Type-2 Signal Detection Theory

    Science.gov (United States)

    Higham, Philip A.; Perfect, Timothy J.; Bruno, Davide

    2009-01-01

    Criterion- versus distribution-shift accounts of frequency and strength effects in recognition memory were investigated with Type-2 signal detection receiver operating characteristic (ROC) analysis, which provides a measure of metacognitive monitoring. Experiment 1 demonstrated a frequency-based mirror effect, with a higher hit rate and lower…
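    The Type-2 ROC construction used in this study can be sketched numerically. The snippet below is an illustrative toy example (data and function names are ours, not the authors' materials): confidence ratings on correct and incorrect recognition decisions are swept through cumulative criteria to trace hit and false-alarm rate pairs, and the area under the resulting curve indexes metacognitive monitoring.

```python
import numpy as np

def type2_auc(conf_correct, conf_error, n_levels=6):
    """Type-2 ROC: P(conf >= c | correct) vs P(conf >= c | error),
    swept over confidence criteria c; returns the area under the curve."""
    conf_correct = np.asarray(conf_correct)
    conf_error = np.asarray(conf_error)
    hits, fas = [0.0], [0.0]
    for c in range(n_levels, 0, -1):          # strictest criterion first
        hits.append(np.mean(conf_correct >= c))
        fas.append(np.mean(conf_error >= c))
    # trapezoidal area under the (false alarm, hit) curve
    return sum((fas[i + 1] - fas[i]) * (hits[i + 1] + hits[i]) / 2
               for i in range(len(fas) - 1))

# toy data: correct decisions tend to carry higher confidence (1..6 scale)
correct = [6, 6, 5, 5, 4, 4, 3, 2]
errors = [1, 2, 2, 3, 3, 4, 1, 2]
auc = type2_auc(correct, errors)
```

    An area of 0.5 corresponds to no metacognitive sensitivity; values above 0.5 indicate that confidence discriminates correct from incorrect responses.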

  14. Modeling the size dependent pull-in instability of beam-type NEMS using strain gradient theory

    Scientific Electronic Library Online (English)

    Ali, Koochi; Hamid M., Sedighi; Mohamadreza, Abadyan.

    Full Text Available It is well recognized that size dependency of material characteristics, i.e. the size effect, often plays a significant role in the performance of nano-structures. Herein, strain gradient continuum theory is employed to investigate the size dependent pull-in instability of beam-type nano-electromechanical systems (NEMS). The two most common types of NEMS, i.e. the nano-bridge and the nano-cantilever, are considered. Effects of the electrostatic field and dispersion forces, i.e. Casimir and van der Waals (vdW) attractions, have been considered in the nonlinear governing equations of the systems. Two different solution methods, numerical and Rayleigh-Ritz, have been employed to solve the constitutive differential equations of the system. The effect of dispersion forces, the size dependency and the importance of the coupling between them on the instability performance are discussed.

  15. Uncertainty Quantification in Climate Modeling

    Science.gov (United States)

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. 
While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis requires a large number of training runs, as well as an output parameterization with respect to a fast-growing spectral basis set. To alleviate this issue, we adopt the Bayesian view of compressive sensing, well-known in the image recognition community. The technique efficiently finds a sparse representation of the model output with respect to a large number of input variables, effectively obtaining a reduced order surrogate model for the input-output relationship. The methodology is preceded by a sampling strategy that takes into account input parameter constraints by an initial mapping of the constrained domain to a hypercube via the Rosenblatt transformation, which preserves probabilities. Furthermore, a sparse quadrature sampling, specifically tailored for the reduced basis, is employed in the unconstrained domain to obtain accurate representations. The work is supported by the U.S. Department of Energy's CSSEF (Climate Science for a Sustainable Energy Future) program. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
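    The surrogate-modeling idea above, representing an expensive model's input-output map by a polynomial chaos expansion fitted from a handful of runs, can be sketched in a few lines. This is a hedged toy example (a 1D Legendre expansion fitted by ordinary least squares, not the Bayesian compressive-sensing machinery of the actual work):

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)

def expensive_model(x):
    # stand-in for a costly simulation with a uniform input on [-1, 1]
    return np.exp(0.5 * x)

# a small number of "training runs", mimicking the sparse-sampling regime
x_train = rng.uniform(-1.0, 1.0, 12)
y_train = expensive_model(x_train)

# Legendre polynomials (the polynomial chaos basis for uniform inputs)
order = 5
V = legendre.legvander(x_train, order)           # design matrix
coeffs, *_ = np.linalg.lstsq(V, y_train, rcond=None)

# the surrogate is now a cheap polynomial; check it against the model
x_test = np.linspace(-1.0, 1.0, 101)
max_err = np.max(np.abs(legendre.legval(x_test, coeffs) - expensive_model(x_test)))
```

    Propagating input uncertainty then reduces to sampling the cheap polynomial; with many inputs, sparsity-promoting (compressive-sensing) regression replaces the plain least squares used here.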

  16. Nonlocal theory of drift type waves in a collisionless dusty plasma

    International Nuclear Information System (INIS)

    A nonlocal theory is formulated to study drift waves in a collisionless multicomponent (dusty) plasma in a sheared slab geometry. The dynamics of dust particles and ions are treated by fluid models, whereas the electrons are assumed to follow the Boltzmann distribution. It is found that the usual stability of drift waves in a sheared slab geometry is destroyed by the presence of dust particles. A drift wave is excited which propagates with a new characteristic frequency modified by the dust particles. This result is similar to our earlier work for the collisional dusty plasma [Chakraborty et al., Phys. Plasmas 8, 1514 (2001)].

  17. Weyl Group Multiple Dirichlet Series Type A Combinatorial Theory (AM-175)

    CERN Document Server

    Brubaker, Ben; Friedberg, Solomon

    2011-01-01

    Weyl group multiple Dirichlet series are generalizations of the Riemann zeta function. Like the Riemann zeta function, they are Dirichlet series with analytic continuation and functional equations, having applications to analytic number theory. By contrast, these Weyl group multiple Dirichlet series may be functions of several complex variables and their groups of functional equations may be arbitrary finite Weyl groups. Furthermore, their coefficients are multiplicative up to roots of unity, generalizing the notion of Euler products. This book proves foundational results about these series an

  18. Introduction to string theory

    International Nuclear Information System (INIS)

    Open and closed bosonic string theories are discussed in a classical framework, highlighting the physical interpretation of conformal symmetry and the Virasoro (1970) algebra. The quantization of bosonic strings is carried out within the old covariant operator formalism. This method is much less elegant and powerful than BRST quantization, but it quickly reveals the physical content of the quantum theory. Generalization to theories with fermionic degrees of freedom is introduced: the Neveu-Schwarz (1971) and Ramond (1971) models, their reduced (two-dimensional) supersymmetry, and the Gliozzi, Scherk and Olive (1977) projection which leads to a theory with supersymmetry in the usual meaning of the term.

  19. Nonlinear Spinor Fields in LRS Bianchi type-I spacetime: Theory and observation

    CERN Document Server

    Saha, Bijan

    2015-01-01

    Within the scope of an LRS Bianchi type-I cosmological model we study the role of the nonlinear spinor field in the evolution of the Universe. In doing so we consider a polynomial type of nonlinearity that describes different stages of the evolution. Finally, we use observational data to fix the problem parameters that best match the real picture of the evolution. In the case of a soft beginning of expansion (the initial expansion speed at the singularity equal to zero) the age of the Universe was found to be 15 billion years, whereas in the case of a hard beginning (nontrivial initial speed) it was found to be 13.7 billion years.

  20. A constrained theory of non-BCS type superconductivity in gapped Graphene

    CERN Document Server

    Vyas, Vivek M

    2011-01-01

    We show that gapped Graphene, with a local constraint that the currents arising from the two valley fermions are exactly equal, exhibits a non-BCS type of superconductivity. Unlike conventional mechanisms, this superconductivity does not require any pairing. We estimate the critical temperature for the superconducting-to-normal transition via the Berezinskii-Kosterlitz-Thouless mechanism, and find that it is proportional to the gap.

  1. Theory of the normal modes of vibrations in the lanthanide type crystals

    International Nuclear Information System (INIS)

    For the lanthanide type crystals, a vast and rich, though incomplete, amount of experimental data has been accumulated from linear and nonlinear optics during the last decades. The main goal of the current research work is to report a new methodology and strategy to put forward a more representative approach to account for the normal modes of vibration of a complex N-body system. For illustrative purposes, the chloride lanthanide type crystals Cs2NaLnCl6 have been chosen, and we develop new convergence tests as well as a criterion to deal with the details of the F-matrix (potential energy matrix). A novel and useful concept of natural potential energy distributions (NPED) is introduced and examined throughout the course of this work. The diagonal and non-diagonal contributions to these NPED values are evaluated explicitly for a series of these crystals. Our model is based upon a total of seventy-two internal coordinates and ninety-eight internal Hooke-type force constants. An optimization procedure is applied to the series of chloride lanthanide crystals, and it is shown that the strategy and model adopted are sound from both a chemical and a physical viewpoint. We can argue that the current model is able to accommodate a number of interactions and to provide us with very useful physical insight. The limitations and advantages of the current model and the most likely sources of improvement are discussed in detail.

  2. Recent progress on Kubas-type hydrogen-storage nanomaterials: from theories to experiments

    Science.gov (United States)

    Chung, ChiHye; Ihm, Jisoon; Lee, Hoonkyung

    2015-06-01

    Transition-metal (TM) atoms are known to form TM-H2 complexes, which are collectively called Kubas dihydrogen complexes. The TM-H2 complexes are formed through the hybridization of the TM d orbitals with the H2 σ and σ* orbitals. The adsorption energy of H2 molecules in the TM-H2 complexes is usually within the range of energy required for reversible H2 storage at room temperature and ambient pressure (-0.4 to -0.2 eV/H2). Thus, TM-H2 complexes have been investigated as potential Kubas-type hydrogen-storage materials. Recently, TM-decorated nanomaterials have attracted much attention because of their promising high capacity and reversibility as Kubas-type hydrogen-storage materials. The hydrogen storage capacity of TM-decorated nanomaterials is expected to be as large as ~9 wt%, which is suitable for certain vehicular applications. However, in the TM-decorated nanostructures, the TM atoms prefer to form clusters because of the large cohesive energy (approximately 4 eV), which leads to a significant reduction in the hydrogen-storage capacity. On the other hand, Ca atoms can form complexes with H2 molecules via Kubas-like interactions. Ca atoms attached to nanomaterials have been reported to be able to adsorb as many H2 molecules as TM atoms. Ca atoms tend to cluster less because of the small cohesive energy of bulk Ca (1.83 eV), which is much smaller than those of bulk TMs. These observations suggest that Kubas interactions can occur in d-orbital-free elements, thereby making Ca a more suitable element for attracting H2 in hydrogen-storage materials. Recently, Kubas-type TM-based hydrogen-storage materials were experimentally synthesized, and the Kubas-type interactions were measured to be stronger than van der Waals interactions. In this review, the recent progress of Kubas-type hydrogen-storage materials will be discussed from both theoretical and experimental viewpoints.

  3. Protocol for Quantification of Defects in Natural Fibres for Composites

    DEFF Research Database (Denmark)

    Mortensen, Ulrich Andreas; Madsen, Bo

    2014-01-01

    Natural bast-type plant fibres are attracting increasing interest for being used for structural composite applications where high quality fibres with good mechanical properties are required. A protocol for the quantification of defects in natural fibres is presented. The protocol is based on the experimental method of optical microscopy and the image analysis algorithms of the seeded region growing method and Otsu’s method. The use of the protocol is demonstrated by examining two types of differ...
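    Otsu's method, one of the two image-analysis steps named in the protocol, can be sketched compactly. The snippet below is an illustrative stand-alone implementation on synthetic data (function and parameter values are ours, not from the protocol): it picks the gray-level threshold maximizing between-class variance, which would separate, for example, bright fibre material from darker defect regions in a micrograph.

```python
import numpy as np

def otsu_threshold(gray, bins=256):
    """Gray values in [0, 1]; returns threshold maximizing between-class variance."""
    hist, edges = np.histogram(gray, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                    # probability of class "below threshold"
    mu = np.cumsum(p * centers)          # cumulative mean gray level
    mu_total = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b2 = (mu_total * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    return centers[np.nanargmax(sigma_b2)]  # NaNs occur where a class is empty

# synthetic bimodal "micrograph": bright fibre pixels, dark defect pixels
rng = np.random.default_rng(1)
fibre = rng.normal(0.8, 0.05, 5000)
defect = rng.normal(0.2, 0.05, 1000)
t = otsu_threshold(np.clip(np.concatenate([fibre, defect]), 0.0, 1.0))
```

    After thresholding, the defect content can be quantified as the fraction of pixels below the threshold, in the spirit of the protocol's defect quantification step.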

  4. The quality of Mueller type functionals in reduced density matrix functional theory

    International Nuclear Information System (INIS)

    Reduced density matrix functional theory, which uses the one-body density matrix as its fundamental variable, provides a powerful tool for the description of many-electron systems. While the kinetic energy is known exactly as a functional of the one-body density matrix, the correlation energy needs to be approximated. Most approximations that are currently employed are modifications of the Mueller functional. The adiabatic extension of these functionals into the time-dependent domain proves problematic because it leads to time-independent occupation numbers. We assess the general quality of these approximations for an exactly solvable two-electron system as well as for calculations of the fundamental gap. In addition, we address the impact of these functionals on excited state properties in optics.

  5. A New Survey of types of Uncertainties in Nonlinear System with Fuzzy Theory

    Directory of Open Access Journals (Sweden)

    Fereshteh Mohammadi

    2013-03-01

    Full Text Available This paper is an attempt to introduce a new framework to handle both uncertainty and time in the spatial domain. The application of the fuzzy temporal constraint network (FTCN) method is proposed for the representation of, and reasoning about, uncertain temporal data. A brief introduction to fuzzy set theory is followed by a description of the FTCN method with its main algorithms. The paper then discusses the issues of incorporating the fuzzy approach into a current spatio-temporal processing framework. The general temporal data model is extended to accommodate uncertainties in temporal data and relationships among events. A theoretical FTCN process of fuzzy transition for imprecise information is introduced with an example. A summary of the paper is given together with an outline of some contributions of the paper and future research directions.

  6. Extension Theory and Krein-type Resolvent Formulas for Nonsmooth Boundary Value Problems

    DEFF Research Database (Denmark)

    Abels, Helmut; Grubb, Gerd

    2014-01-01

    The theory of selfadjoint extensions of symmetric operators, and more generally the theory of extensions of dual pairs, was implemented some years ago for boundary value problems for elliptic operators on smooth bounded domains. Recently, the questions have been taken up again for nonsmooth domains. In the present work we show that pseudodifferential methods can be used to obtain a full characterization, including Krein resolvent formulas, of the realizations of nonselfadjoint second-order operators on C^{3/2+ε} domains; more precisely, we treat domains with B^{3/2}_{p,2}-smoothness and operators with H^1_q-coefficients, for suitable p > 2(n-1) and q > n. The advantage of the pseudodifferential boundary operator calculus is that the operators are represented by a principal part and a lower-order remainder, leading to regularity results; in particular we analyze resolvents, Poisson solution operators and Dirichlet-to-Neumann operators in this way, also in Sobolev spaces of negative order.

  7. Theory of the beta-type Organic Superconductivity under Uniaxial Compression

    OpenAIRE

    Suzuki, Takeo; Onari, Seiichiro; Ito, Hiroshi; Tanaka, Yukio

    2010-01-01

    We study theoretically the shift of the superconducting transition temperature (Tc) under uniaxial compression in beta-type organic superconductors, beta-(BEDT-TTF)2I3 and beta-(BDA-TTP)2X[X=SbF6,AsF6], in order to clarify the electron correlation, the spin frustration and the effect of dimerization. The transfer integrals are calculated by the extended Huckel method assuming the uniaxial strain and the superconducting state mediated by the spin fluctuation is solved using E...

  8. RCF4: Inconsistent Quantification

    CERN Document Server

    Pfender, Michael

    2009-01-01

    We exhibit canonical Choice maps within categorical theories of Primitive Recursion, of partially defined PR maps, as well as for classical, quantifier defined PR theories, and show incompatibility of these choice sections in the latter theories, with (iterative) finite-descent property of omega^omega, namely within a ``minimal'' such quantor defined Arithmetic, Q. This is to give inconsistency of ZF, and even of first order set theory 1ZF strengthened by well-order property of omega^omega. The argument is iterative evaluation of PR map codes, which gets epimorphic defined-arguments enumeration by above finite-descent property. This enumeration is turned into a retraction by AC, with PR section in Q^+ = Q+wo(omega^omega), and so makes the evaluation a PR map. But the latter is excluded by Ackermann's result that such (diagonalised) evaluation grows faster than any PR map within any consistent frame. Keywords: quantifiers, first order arithmetics, recursion, AC, sections and retractions, primitive recursive ev...

  9. Six-Dimensional Superconformal Theories and their Compactifications from Type IIA Supergravity

    Science.gov (United States)

    Apruzzi, Fabio; Fazzi, Marco; Passias, Achilleas; Rota, Andrea; Tomasiello, Alessandro

    2015-08-01

    We describe three analytic classes of infinitely many AdS_d supersymmetric solutions of massive IIA supergravity, for d = 7, 5, 4. The three classes are related by simple universal maps. For example, the AdS7 × M3 solutions (where M3 is topologically S3) are mapped to AdS5 × Σ2 × M3', where Σ2 is a Riemann surface of genus g ≥ 2 and the metric on M3' is obtained by distorting M3 in a certain way. The solutions can have localized D6 or O6 sources, as well as an arbitrary number of D8-branes. The AdS7 case (previously known only numerically) is conjecturally dual to an NS5-D6-D8 system. The field theories in three and four dimensions are not known, but their number of degrees of freedom can be computed in the supergravity approximation. The AdS4 solutions have numerical "attractor" generalizations that might be useful for flux compactification purposes.

  10. Secret symmetries of type IIB superstring theory on AdS3 × S3 × M4

    International Nuclear Information System (INIS)

    We establish features of so-called Yangian secret symmetries for AdS3 type IIB superstring backgrounds, thus verifying the persistence of such symmetries to this new instance of the AdS/CFT correspondence. Specifically, we find two a priori different classes of secret symmetry generators. One class of generators, anticipated from the previous literature, is more naturally embedded in the algebra governing the integrable scattering problem. The other class of generators is more elusive and somewhat closer in its form to its higher-dimensional AdS5 counterpart. All of these symmetries respect left-right crossing. In addition, by considering the interplay between left and right representations, we gain a new perspective on the AdS5 case. We also study the RTT-realisation of the Yangian in AdS3 backgrounds, thus establishing a new incarnation of the Beisert–de Leeuw construction. (paper)

  11. Flexural wave band gaps in metamaterial beams with membrane-type resonators: theory and experiment

    Science.gov (United States)

    Zhang, Hao; Xiao, Yong; Wen, Jihong; Yu, Dianlong; Wen, Xisen

    2015-11-01

    This paper deals with flexural wave band gaps in metamaterial beams with membrane-type resonators. The proposed membrane-type resonator consists of a tensioned elastic membrane and a mass block attached to the center of the membrane. Numerical models based on the finite element method are presented to predict the dispersion relation, band gaps and eigenmodes. It is shown that the metamaterial beams exhibit unique wave physics. A broad Bragg band gap (BBG) and two low-frequency locally resonant band gaps (LRBGs) can be observed, due to the structural periodicity and the locally resonant behavior respectively. The first LRBG can be ascribed to the combined resonance of the membranes and the masses, while the second LRBG is caused by the resonance of the membranes alone. The study of the effective property shows that negative mass density occurs in the LRBGs. The effects of membrane tension and mass magnitude (the weight of the mass block) on the LRBGs are further analyzed. It is shown that both LRBGs move to higher frequency as the membrane tension increases. However, as the mass magnitude increases, the first LRBG moves to lower frequency and the second LRBG remains almost unchanged. It is further demonstrated that, when a larger unit cell with multiple kinds of masses (a larger unit cell incorporating multiple basic unit cells, but with a different weight of mass block within each basic unit cell) is used, the first LRBG can be broadened, which can be employed to achieve broadband vibration attenuation. Moreover, experimental measurements of vibration transmittance are conducted to validate the theoretical predictions. Good agreement between the experimental results and the theoretical predictions is observed.
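    The locally resonant band gap physics described above can be illustrated with the standard 1D mass-in-mass lattice, a much simpler analogue of the membrane-resonator beam (parameters below are arbitrary, not from the paper): each outer mass carries an internal resonator, and sweeping the Bloch wavenumber yields an acoustic and an optical branch separated by a resonance-induced gap.

```python
import numpy as np

m1, m2 = 1.0, 0.5          # outer (lattice) mass and resonator mass
k1, k2 = 1.0, 0.2          # lattice spring and resonator spring

# symmetrize the 2x2 eigenproblem K(q) phi = w^2 M phi (M is diagonal)
Minv_sqrt = np.diag([1.0 / np.sqrt(m1), 1.0 / np.sqrt(m2)])

qs = np.linspace(1e-3, np.pi, 200)       # Bloch wavenumbers, first zone
branches = []
for q in qs:
    K = np.array([[2.0 * k1 * (1.0 - np.cos(q)) + k2, -k2],
                  [-k2, k2]])
    w2 = np.linalg.eigvalsh(Minv_sqrt @ K @ Minv_sqrt)
    branches.append(np.sqrt(np.clip(w2, 0.0, None)))
branches = np.array(branches)

gap_bottom = branches[:, 0].max()        # top of the acoustic branch
gap_top = branches[:, 1].min()           # bottom of the optical branch
```

    The resonator frequency sqrt(k2/m2) falls inside the gap; increasing k2 (loosely analogous to membrane tension) shifts the gap upward, while increasing m2 shifts it downward, mirroring the tension and mass trends reported above.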

  12. Rotating strings and D2-branes in type IIA reduction of M-theory on G2 manifold and their semiclassical limits

    International Nuclear Information System (INIS)

    We consider rotating strings and D2-branes on type IIA background, which arises as dimensional reduction of M-theory on manifold of G2 holonomy, dual to N = 1 gauge theory in four dimensions. We obtain exact solutions and explicit expressions for the conserved charges. By taking the semiclassical limit, we show that the rotating strings can reproduce only one type of semiclassical behavior, exhibited by rotating M2-branes on G2 manifolds. Our further investigation leads to the conclusion that the rotating D2-branes reproduce two types of the semiclassical energy-charge relations known for membranes in eleven dimensions

  13. Development of flow network analysis code for block type VHTR core by linear theory method

    International Nuclear Information System (INIS)

    VHTR (Very High Temperature Reactor) is a high-efficiency nuclear reactor capable of generating hydrogen using its high-temperature coolant. A PMR (Prismatic Modular Reactor) type reactor consists of hexagonal prismatic fuel blocks and reflector blocks. The flow paths in the prismatic VHTR core consist of coolant holes, bypass gaps and cross gaps. Complicated flow paths are formed in the core since the coolant holes and bypass gaps are connected by the cross gaps. Distributed coolant is mixed in the core through the cross gaps, so the flow characteristics cannot be modeled as a simple parallel pipe system. It requires a lot of effort and takes a very long time to analyze the core flow with CFD analysis. Hence, it is important to develop a code for VHTR core flow which can predict the core flow distribution quickly and accurately. In this study, a steady state flow network analysis code is developed using a flow network algorithm. The developed flow network analysis code was named FLASH, and it was validated with experimental data and CFD simulation results. (authors)
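    The linear theory method at the heart of such a flow network solver can be sketched for the simplest case, parallel channels sharing a fixed total flow (a toy stand-in, not the FLASH code): each quadratic pressure-drop law dP = K*Q^2 is re-linearized around the previous iterate and the resulting linear system is solved repeatedly.

```python
import numpy as np

def linear_theory_split(K, q_total, iters=40):
    """Split q_total among parallel channels with dP_i = K_i * Q_i**2.

    Linear theory method: replace the quadratic loss by (K_i * |Q_prev|) * Q_i,
    solve the linear system, and average with the previous iterate to damp
    the method's well-known oscillation between successive solutions.
    """
    K = np.asarray(K, dtype=float)
    n = len(K)
    q = np.full(n, q_total / n)              # uniform initial guess
    for _ in range(iters):
        r = K * np.abs(q)                    # linearized resistances
        A = np.zeros((n, n))
        b = np.zeros(n)
        for i in range(n - 1):               # equal pressure drop across channels
            A[i, i], A[i, i + 1] = r[i], -r[i + 1]
        A[-1, :] = 1.0                       # mass conservation row
        b[-1] = q_total
        q = 0.5 * (q + np.linalg.solve(A, b))
    return q

# two channels with K = [1, 4]: equal dP forces Q1 = 2*Q2, so Q = [2, 1]
q = linear_theory_split([1.0, 4.0], 3.0)
```

    A production core-flow code adds junction continuity equations and hole/gap-specific loss coefficients, but the iterate-linearize-solve loop is the same.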

  14. Type Ia Supernovae and their Environment: Theory and Applications to SN 2014J

    CERN Document Server

    Dragulin, Paul

    2015-01-01

    We present theoretical semi-analytic models for the interaction of stellar winds with the interstellar medium (ISM) or prior mass loss, implemented in our code SPICE (Supernovae Progenitor Interaction Calculator for parameterized Environments, available on request), assuming spherical symmetry and power-law ambient density profiles and using the Pi-theorem. This allows us to test a wide variety of configurations, their functional dependencies, and to find classes of solutions for given observations. Here, we study Type Ia (SN Ia) surroundings of single and double degenerate systems, and their observational signatures. Winds may originate from the progenitor prior to the white dwarf (WD) stage, the WD, a donor star, or an accretion disk (AD). For M_Ch explosions, the AD wind dominates and produces a low-density void several light years across surrounded by a dense shell. The bubble explains the lack of observed interaction in late time SN light curves for, at least, several years. The shell produces narrow ISM l...

  15. Representation of the Kaellen-Wilhelmsson type for vacuum expectations of field operator products of the scalar field theory

    International Nuclear Information System (INIS)

    For vacuum expectations of products of field operators of the free scalar field theory, called Wightman functionals W_n, an integral representation of the Kallen-Wilhelmsson type is obtained, where W_{2n}(ξ) is expressed through singular functions Δ^+_{n+1}(ξ; a_{sl}). By the simplicity of the investigated model, the weight function of the representation, G(a_{sl}), is identically unity. An especially simple representation is obtained for W_4 and W_6 because in this case Δ^+_3 and Δ^+_4 are expressed directly through Hankel functions of the third kind. The integral representation of W_{2n}, as compared with the expression obtained from Wick's theorem, has a more symmetric Lorentz-invariant form; it is also convenient for examining local commutativity, an axiom which is not taken into account when deriving the Kallen-Wilhelmsson representation.

  16. Band-gap corrected density functional theory calculations for InAs/GaSb type II superlattices

    Science.gov (United States)

    Wang, Jianwei; Zhang, Yong

    2014-12-01

    We performed pseudopotential based density functional theory (DFT) calculations for GaSb/InAs type II superlattices (T2SLs), with bandgap errors from the local density approximation mitigated by applying an empirical method to correct the bulk bandgaps. Specifically, this work (1) compared the calculated bandgaps with experimental data and non-self-consistent atomistic methods; (2) calculated the T2SL band structures with varying structural parameters; (3) investigated the interfacial effects associated with the no-common-atom heterostructure; and (4) studied the strain effect due to lattice mismatch between the two components. This work demonstrates the feasibility of applying the DFT method to more exotic heterostructures and defect problems related to this material system.

  17. The classical Yang–Baxter equation and the associated Yangian symmetry of gauged WZW-type theories

    International Nuclear Information System (INIS)

    We construct the Lax-pair, the classical monodromy matrix and the corresponding solution of the Yang–Baxter equation, for a two-parameter deformation of the Principal chiral model for a simple group. This deformation includes as a one-parameter subset, a class of integrable gauged WZW-type theories interpolating between the WZW model and the non-Abelian T-dual of the principal chiral model. We derive in full detail the Yangian algebra using two independent methods: by computing the algebra of the non-local charges and alternatively through an expansion of the Maillet brackets for the monodromy matrix. As a byproduct, we also provide a detailed general proof of the Serre relations for the Yangian symmetry

  18. Magnetism in olivine-type LiCo1-xFexPO4 cathode materials: bridging theory and experiment.

    Science.gov (United States)

    Singh, Vijay; Gershinsky, Yelena; Kosa, Monica; Dixit, Mudit; Zitoun, David; Major, Dan Thomas

    2015-11-18

    In the current paper, we present a non-aqueous sol-gel synthesis of olivine type LiCo1-xFexPO4 compounds (x = 0.00, 0.25, 0.50, 0.75, 1.00). The magnetic properties of the olivines are measured experimentally and calculated using first-principles theory. Specifically, the electronic and magnetic properties are studied in detail with standard density functional theory (DFT), as well as by including spin-orbit coupling (SOC), which couples the spin to the crystal structure. We find that the Co(2+) ions exhibit a strong orbital moment in the pure LiCoPO4 system, which is partially quenched upon substitution of Co(2+) by Fe(2+). Interestingly, we also observe a non-negligible orbital moment on the Fe(2+) ion. We underscore that the inclusion of SOC in the calculations is essential to obtain qualitative agreement with the observed effective magnetic moments. Additionally, Wannier functions were used to understand the experimentally observed rising trend in the Néel temperature, which is directly related to the magnetic exchange interaction paths in the materials. We suggest that out-of-layer M-O-P-O-M magnetic interactions (J⊥) are present in the studied materials. The current findings shed light on important differences observed in the electrochemistry of the cathode material LiCoPO4 compared to the already mature olivine material LiFePO4. PMID:26548581

  19. Effective field theory of modified gravity on the spherically symmetric background: leading order dynamics and the odd-type perturbations

    CERN Document Server

    Kase, Ryotaro; Tsujikawa, Shinji

    2014-01-01

    We consider perturbations of a static and spherically symmetric background endowed with a metric tensor and a scalar field in the framework of the effective field theory of modified gravity. We employ the previously developed 2+1+1 canonical formalism of a double Arnowitt-Deser-Misner (ADM) decomposition of space-time, which singles out both time and radial directions. Our building block is a general gravitational action that depends on scalar quantities constructed from the 2+1+1 canonical variables and the lapse. Variation of the action up to first-order in perturbations gives rise to three independent background equations of motion, as expected from spherical symmetry. The dynamical equations of linear perturbations follow from the second-order Lagrangian after a suitable gauge fixing. We derive conditions for the avoidance of ghosts and Laplacian instabilities for the odd-type perturbations. We show that our results not only incorporate those derived in the most general scalar-tensor theories with second...

  20. Self-organized criticality as Witten-type topological field theory with spontaneously broken Becchi-Rouet-Stora-Tyutin symmetry

    International Nuclear Information System (INIS)

    Here, a scenario is proposed, according to which a generic self-organized critical (SOC) system can be looked upon as a Witten-type topological field theory (W-TFT) with spontaneously broken Becchi-Rouet-Stora-Tyutin (BRST) symmetry. One of the conditions for the SOC is the slow driving noise, which unambiguously suggests the Stratonovich interpretation of the corresponding stochastic differential equation (SDE). This, in turn, necessitates the use of the Parisi-Sourlas-Wu stochastic quantization procedure, which straightforwardly leads to a model with BRST-exact action, i.e., to a W-TFT. In the parameter space of the SDE, there must exist full-dimensional regions where the BRST symmetry is spontaneously broken by instantons, which in the context of SOC are essentially avalanches. In these regions, the avalanche-type SOC dynamics is liberated from an otherwise rightful, dynamics-less W-TFT, and a Goldstone mode of Faddeev-Popov ghosts exists. Goldstinos represent moduli of instantons (avalanches) and, being gapless, are responsible for the critical avalanche distribution in the low-energy, long-wavelength limit. The above arguments are robust against moderate variations of the SDE's parameters and the criticality is 'self-tuned'. The proposition of this paper suggests that the machinery of W-TFTs may find its applications in many different areas of modern science studying various physical realizations of SOC. It also suggests that there may in principle exist a connection between some SOC's and the concept of topological quantum computing.

  1. The synchrotron-maser theory of type II solar radio emission processes - The physical model and generation mechanism

    Science.gov (United States)

    Wu, C. S.; Steinolfson, R. S.; Zhou, G. C.

    1986-01-01

    A theory is proposed to explain the generation mechanism of type II solar radio bursts. It is suggested that the shock wave formed at the leading edge of a coronal transient can accelerate electrons. Because of the nature of the acceleration process, the energized electrons can possess a 'hollow-beam' type distribution function. When the electron beam propagates along the ambient magnetic field to lower altitudes and attains larger pitch angles, a synchrotron-maser instability can set in. This instability leads to the amplification of unpolarized or weakly polarized radiation. The present discussion incorporates a model which describes the ambient magnetic field and background plasma by means of MHD simulation. The potential emission regions may be located approximately, according to the time-dependent MHD simulation. Since the average local plasma frequency in the source region can be evaluated from the MHD model, the frequency drift associated with the radiation may be estimated. The result seems to be in good agreement with that derived from observations.

  2. Theory, design, and simulation of LINA: A path forward for QCA-type nanoelectronics

    Science.gov (United States)

    Hook, Loyd Reed, IV

    The past 50 years have seen exponential advances in digital integrated circuit technologies, which have facilitated an explosion of uses and functionality. Although this rate (generally referred to as "Moore's Law") cannot be sustained indefinitely, significant advances will remain possible even after current technologies reach fundamental limits. However, if these further advances are to be realized, nanoelectronics designs must be developed that provide significant improvements over the currently utilized complementary metal-oxide-semiconductor (CMOS) transistor-based integrated circuits. One promising nanoelectronics paradigm to fulfill this function is Quantum-dot Cellular Automata (QCA). QCA offers the possibility of THz switching and molecular scaling, and is particularly well suited to advanced logical constructs such as reversible logic and systolic arrays. These attributes make QCA an exciting prospect; however, no fabrication technology currently exists that allows reliable electronic QCA circuits operating at room temperature, and a plausible path to fabrication of QCA circuitry at the very large scale integration (VLSI) level is also lacking. This has cast doubt on the viability of the paradigm and raised questions about its future as a suitable nanoelectronic replacement for CMOS. To resolve these issues, research was conducted into a new design that retains key attributes of QCA while also providing a means for near-term fabrication of reliable room-temperature circuits and a path forward to VLSI circuits. The result of this research, presented in this dissertation, is the Lattice-based Integrated-signal Nanocellular Automata (LINA) nanoelectronics paradigm. LINA designs are based on QCA and provide the same basic functionality as traditional QCA.
LINA also retains the key attributes of THz switching, scalability to the molecular level, and the ability to use the advanced logical constructs that are crucial to the QCA proposals. However, LINA designs also provide significant improvements over traditional QCA. For example, the continuous correction of faults, due to LINA's integrated-signal approach, improves reliability enough to enable room-temperature operation with cells potentially up to 20 nm, together with tolerance to layout, patterning, stray-charge, and stuck-at faults. In terms of fabrication, LINA's lattice-based structure allows precise relative placement through the self-assembly techniques seen in current nanoparticle research. LINA also allows wire and logic structures large enough to be patterned with widely available photolithographic technologies. These aspects of the LINA designs, along with power, timing, and clocking results, have been verified through the use of new and/or modified simulation tools developed specifically for this purpose. To summarize, the LINA designs and results presented in this dissertation provide a path to the realization of QCA-type VLSI nanoelectronic circuitry. Furthermore, they offer a renewed viability of the paradigm to replace CMOS and advance computing technologies beyond the next decade.

  3. Quantification of natural phenomena

    International Nuclear Information System (INIS)

    Science is like a great spider's web in which unexpected connections appear, and it is therefore often difficult to foresee the consequences of new theories for existing ones. Physics is a clear example of this. Newton's laws of mechanics accurately describe the physical phenomena observable with our senses or with relatively unsophisticated instruments. After their formulation at the beginning of the 18th century, these laws were recognized in the scientific world as a mathematical model of nature. Together with the laws of electrodynamics, developed in the 19th century, and those of thermodynamics, they constitute what we call classical physics. By the end of the 19th century, classical physics had reached such a state of maturity that some scientists believed physics was nearing its end, having obtained a complete description of physical phenomena. The spider's web of knowledge was supposed finished, or at least very near completion. It was even claimed, arrogantly, that if the initial conditions of the universe were known, its state at any future moment could be determined. Two phenomena related to light would firmly prove how mistaken they were, creating unexpected connections in the great spider's web of knowledge and knocking down part of it. The thermal radiation of bodies, and the fact that light propagates at constant speed in vacuum, without any absolute reference frame against which this speed is measured, were the decisive factors in the construction of a new physics. The development of sophisticated measuring equipment gave access to more precise information and opened the microscopic world to observation and to the confirmation of existing theories.

  4. Automated quantification and analysis of mandibular asymmetry

    DEFF Research Database (Denmark)

    Darvann, T. A.; Hermann, N. V.

    2010-01-01

    We present an automated method of spatially detailed 3D asymmetry quantification in mandibles extracted from CT and apply it to a population of infants with unilateral coronal synostosis (UCS). An atlas-based method employing non-rigid registration of surfaces is used for determining deformation fields, thereby establishing detailed anatomical point correspondence between subjects as well as between points on the left and right side of the mid-sagittal plane (MSP). Asymmetry is defined in terms of the vector between a point and the corresponding anatomical point on the opposite side of the MSP after mirroring the mandible across the MSP. A principal components analysis of asymmetry characterizes the major types of asymmetry in the population, and successfully separates the asymmetric UCS mandibles from a number of less asymmetric mandibles from a control population.
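The pipeline described in this record, per-point asymmetry vectors followed by a principal components analysis of the population, can be sketched roughly as follows. This is an illustrative sketch only: the data are synthetic, and the non-rigid registration and mirroring steps that would produce real anatomical correspondences are assumed rather than shown.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_points = 20, 500

# Simulated per-vertex asymmetry vectors (subject, point, xyz); in the real
# method each vector is the difference between a surface point and its
# mirrored anatomical correspondent across the mid-sagittal plane.
asym = rng.normal(scale=0.2, size=(n_subjects, n_points, 3))
X = asym.reshape(n_subjects, -1)        # one flattened row per subject

# PCA via SVD of the mean-centred data matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S**2 / np.sum(S**2)         # variance fraction per component
scores = Xc @ Vt.T                      # subject scores on each asymmetry mode

print(explained[:3])
```

Subjects with unusually large scores on the leading modes would be the strongly asymmetric ones; in the study this separates the UCS mandibles from controls.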

  5. A recipe for EFT uncertainty quantification in nuclear physics

    International Nuclear Information System (INIS)

    The application of effective field theory (EFT) methods to nuclear systems provides the opportunity to rigorously estimate the uncertainties originating in the nuclear Hamiltonian. Yet this is just one source of uncertainty in the observables predicted by calculations based on nuclear EFTs. We discuss the goals of uncertainty quantification in such calculations and outline a recipe to obtain statistically meaningful error bars for their predictions. We argue that the different sources of theory error can be accounted for within a Bayesian framework, as we illustrate using a toy model. (paper)
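The Bayesian idea behind such truncation-error estimates can be illustrated with a minimal sketch (this is not the authors' code, and the estimator below is a deliberately simple stand-in): treat the known coefficients c_0..c_k of an EFT expansion in a small parameter Q as draws of natural size cbar, estimate cbar from them, and quote the size of the first omitted term, cbar * Q^(k+1), as the truncation error.

```python
import numpy as np

def truncation_error(coeffs, Q):
    """1-sigma-style estimate of the first omitted term of an expansion
    y ~ sum_n c_n Q^n, given the observed coefficients c_0..c_k."""
    coeffs = np.asarray(coeffs, dtype=float)
    cbar = np.sqrt(np.mean(coeffs**2))   # simple rms estimator of the scale
    return cbar * Q ** len(coeffs)       # size of the c_{k+1} Q^{k+1} term

# Toy expansion with coefficients of natural size and Q = 0.3
err = truncation_error([1.0, -0.8, 1.2], Q=0.3)
print(f"estimated truncation error: {err:.4f}")
```

In the full Bayesian treatment the prior on cbar is marginalized over, yielding a posterior for the omitted terms rather than a single number; the rms estimator here is only the crudest version of that logic.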

  6. A recipe for EFT uncertainty quantification in nuclear physics

    Science.gov (United States)

    Furnstahl, R. J.; Phillips, D. R.; Wesolowski, S.

    2015-03-01

    The application of effective field theory (EFT) methods to nuclear systems provides the opportunity to rigorously estimate the uncertainties originating in the nuclear Hamiltonian. Yet this is just one source of uncertainty in the observables predicted by calculations based on nuclear EFTs. We discuss the goals of uncertainty quantification in such calculations and outline a recipe to obtain statistically meaningful error bars for their predictions. We argue that the different sources of theory error can be accounted for within a Bayesian framework, as we illustrate using a toy model.

  7. From Quantification to Visualization: A Taxonomy of Uncertainty Visualization Approaches

    OpenAIRE

    Potter, Kristin; Rosen, Paul; Johnson, Chris R.

    2012-01-01

    Quantifying uncertainty is an increasingly important topic across many domains. The uncertainties present in data come with many diverse representations having originated from a wide variety of disciplines. Communicating these uncertainties is a task often left to visualization without clear connection between the quantification and visualization. In this paper, we first identify frequently occurring types of uncertainty. Second, we connect those uncertainty representations to ones commonly u...

  8. Object Oriented Design Security Quantification

    OpenAIRE

    Suhel Ahmad Khan

    2011-01-01

    Quantification of security at an early phase produces a significant improvement in understanding the management of security artifacts for best possible results. The proposed study discusses a systematic approach to quantify security based on complexity factors which have an impact on security attributes. This paper provides a road-map for researchers and software practitioners to assess and, preferably, quantify software security in the design phase. A security assessment through complexity framework (SVD...

  9. Aerobic physical activity and resistance training: an application of the theory of planned behavior among adults with type 2 diabetes in a random, national sample of Canadians

    OpenAIRE

    Karunamuni Nandini; Trinh Linda; Courneya Kerry S; Plotnikoff Ronald C; Sigal Ronald J

    2008-01-01

    Abstract Background Aerobic physical activity (PA) and resistance training are paramount in the treatment and management of type 2 diabetes (T2D), but few studies have examined the determinants of both types of exercise in the same sample. Objective The primary purpose was to investigate the utility of the Theory of Planned Behavior (TPB) in explaining aerobic PA and resistance training in a population sample of T2D adults. Methods A total of 244 individuals were recruited through a random na...

  10. Towards an integrative account of social cognition: marrying theory of mind and interactionism to study the interplay of Type 1 and Type 2 processes

    OpenAIRE

    VivianBohl; Woutervan den Bos

    2012-01-01

    Traditional theory of mind accounts of social cognition have been at the basis of most studies in the social cognitive neurosciences. However, in recent years, the need to go beyond traditional theory of mind accounts for understanding real life social interactions has become all the more pressing. At the same time it remains unclear whether alternative accounts, such as interactionism, can yield a sufficient description and explanation of social interactions. We argue that instead of conside...

  11. Low energy expansion of the four-particle genus-one amplitude in type II superstring theory

    CERN Document Server

    Green, Michael B; Vanhove, Pierre

    2008-01-01

    A diagrammatic expansion of coefficients in the low-momentum expansion of the genus-one four-particle amplitude in type II superstring theory is developed. This is applied to determine coefficients up to order s^6R^4 (where s is a Mandelstam invariant and R^4 the linearized super-curvature), and partial results are obtained beyond that order. This involves integrating powers of the scalar propagator on a toroidal world-sheet, as well as integrating over the modulus of the torus. At any given order in s the coefficients of these terms are given by rational numbers multiplying multiple zeta values (or Euler--Zagier sums) that, up to the order studied here, reduce to products of Riemann zeta values. We are careful to disentangle the analytic pieces from logarithmic threshold terms, which involves a discussion of the conditions imposed by unitarity. We further consider the compactification of the amplitude on a circle of radius r, which results in a plethora of terms that are power-behaved in r. These coefficient...

  12. A critical examination of the predictive capabilities of a new type of general laminated plate theory in the inelastic response regime

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Todd O [Los Alamos National Laboratory

    2008-01-01

    Recently, a new type of general, multiscale plate theory was developed for application to the analysis of the history-dependent response of laminated plates (Williams). In particular, the history-dependent behavior in a plate was considered to arise from both delamination effects as well as history-dependent material point responses (such as from viscoelasticity, viscoplasticity, damage, etc.). The multiscale nature of the theoretical framework is due to the use of a superposition of both general global and local displacement effects. Using this global-local displacement field, the governing equations of the theory are obtained by satisfying the governing equations of nonlinear continuum mechanics referenced to the initial configuration. In order to accomplish the goal of conducting accurate analyses in the history-dependent response regimes, the formulation of the theory has been carried out in a sufficiently general fashion that any cohesive zone model (CZM) and any history-dependent constitutive model for a material point can be incorporated into the analysis without reformulation. Recently, the older multiscale theory of Williams was implemented into the finite element (FE) framework by Mourad et al., and the resulting capabilities were used to show that, in a qualitative sense, it is important that the local fields be accurately obtained in order to correctly predict even the overall response characteristics of a laminated plate in the inelastic regime. The goal of this work is to critically examine the predictive capabilities of this theory, as well as the older multiscale theory of Williams and other types of laminated plate theories, against recently developed exact solutions for the response of inelastic plates in cylindrical bending (Williams). These exact solutions are valid for both nonlinear CZMs as well as inelastic material responses obtained from different constitutive theories.
In particular, the accuracy with which the different plate theories predict the local and global responses is considered.

  13. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRA lacks formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step in the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  14. Quantification of Information in a One-Way Plant-to-Animal Communication System

    Directory of Open Access Journals (Sweden)

    Laurance R. Doyle

    2009-08-01

    In order to demonstrate possible broader applications of information theory to the quantification of non-human communication systems, we apply calculations of information entropy to a simple chemical communication from the cotton plant (Gossypium hirsutum) to the wasp (Cardiochiles nigriceps) studied by DeMoraes et al. The purpose of this chemical communication from cotton plants to wasps is presumed to be to allow the predatory wasp to more easily locate its preferred prey, one of two types of parasitic herbivores feeding on the cotton plants. By specifying which plant-eating herbivore is feeding on them, the cotton plants preferentially attract the wasps to those individual plants. We interpret the emission of nine chemicals by the plants, in amounts that differ with herbivore type, as a nine-signal one-way communication system across kingdoms (from the kingdom Plantae to the kingdom Animalia), with the signal differences to be detected by the wasps. We use fractional differences in the chemical abundances emitted in response to the two herbivore types to calculate the Shannon information entropic measures (marginal, joint, and mutual entropies, as well as the ambiguity, etc.) of the transmitted message. We then compare these results with the subsequent behavior of the wasps (calculating the equivocation in the message reception) for possible insights into the history and actual working of this one-way communication system.
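The entropy bookkeeping in this kind of analysis can be sketched as follows: treat the fractional abundances of the emitted chemicals under each herbivore type as a joint distribution p(herbivore, signal) and compute marginal, joint, and mutual Shannon entropies in bits. The numbers below are illustrative, not the measured abundances from the study, and only three of the nine signals are shown.

```python
import math

def H(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# p[h][s]: joint probability of herbivore type h co-occurring with signal s
p = [[0.30, 0.10, 0.10],    # herbivore type 1
     [0.05, 0.25, 0.20]]    # herbivore type 2

p_h = [sum(row) for row in p]           # marginal over herbivore types
p_s = [sum(col) for col in zip(*p)]     # marginal over signals
H_joint = H([q for row in p for q in row])
mutual = H(p_h) + H(p_s) - H_joint      # I(herbivore; signal)

print(f"I(herbivore; signal) = {mutual:.3f} bits")
```

A mutual information near H(p_h) would mean the chemical signature almost fully identifies the herbivore type; the equivocation H(p_h) minus the mutual information is what remains ambiguous to the receiver.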

  15. Quantification of radiation transformation frequencies

    International Nuclear Information System (INIS)

    The occurrence of late lethal mutations in many of the progeny of cells which survive ionizing radiation seriously affects the quantification of transformation frequencies following irradiation. Lethal mutations are particularly relevant where focal assays are used or where transformations are scored following serial passaging of survivors. This paper examines the influence of lethal mutations on the radiation transformation dose response curve for two typical assays, viz. a C3H 10T1/2 focal assay and a primary thyroid serial subculture assay. (author)

  16. Surface complexes of acetate on edge surfaces of 2:1 type phyllosilicate: Insights from density functional theory calculation

    Science.gov (United States)

    Liu, Xiandong; Lu, Xiancai; Wang, Rucheng; Zhou, Huiqun; Xu, Shijin

    2008-12-01

    To explore the complexation mechanisms of carboxylate on phyllosilicate edge surfaces, we simulate acetate complexes on the (0 1 0)-type edge of pyrophyllite using the density functional theory method. We take into account the intrinsic long-range order and all the possible complex sets under common environments. This study discloses that H-bonding interactions occur widely and play important roles in both inner-sphere and outer-sphere fashions. In inner-sphere complexes, one acetate C-O bond elongates to form a covalent bond with a surface Al atom; the other C-O either forms a covalent bond with Al or interacts with surface hydroxyls via H-bonds. In outer-sphere complexes, the acetate can capture a proton from the surface groups to form an acid molecule. For the groups of both substrate and ligand, the variations in geometrical parameters caused by H-bonding interactions depend on the role the group plays (i.e., proton donor or acceptor). By comparing the edge structures before and after interaction, we find that carboxylate binding can modify the surface structures. In the inner-sphere complexes, the exposed Al atom can be stabilized by a single acetate ion through either monodentate or bidentate schemes, whereas Al atoms complexing both an acetate and a hydroxyl may deviate significantly outwards from their bulk equilibrium positions. In the outer-sphere complexes, some H-bonds are strong enough to polarize the metal-oxygen bonds and therefore distort the local coordination structure of the metal in the substrate, which may make the metal susceptible to release.

  17. Evaluation of Different RNA Extraction Methods and Storage Conditions of Dried Plasma or Blood Spots for Human Immunodeficiency Virus Type 1 RNA Quantification and PCR Amplification for Drug Resistance Testing?

    OpenAIRE

    Monleau, Marjorie; Montavon, Céline; Laurent, Christian; Segondy, Michel; Montes, Brigitte; DELAPORTE, ERIC; Boillot, François; Peeters, Martine

    2009-01-01

    The development and validation of dried sample spots as a method of specimen collection are urgently needed in developing countries for monitoring of human immunodeficiency virus (HIV) infection. Our aim was to test some crucial steps in the use of dried spots, i.e., viral recovery and storage over time. Moreover, we investigated whether dried plasma and blood spots (DPS and DBS, respectively) give comparable viral load (VL) results. Four manual RNA extraction methods from commercial HIV type...

  18. Medición volumétrica de grasa visceral abdominal con resonancia magnética y su relación con antropometría, en una población diabética / Quantification of visceral adipose tissue using magnetic resonance imaging compared with anthropometry, in type 2 diabetic patients

    Scientific Electronic Library Online (English)

    Cristóbal, Serrano García; Francisco, Barrera; Pilar, Labbé; Jessica, Liberona; Marco, Arrese; Pablo, Irarrázabal; Cristián, Tejos; Sergio, Uribe.

    2012-12-01

    Background: Visceral fat accumulation is associated with the development of metabolic diseases. Anthropometry is one of the methods used to quantify it. Aim: To evaluate the relationship between visceral adipose tissue volume (VAT), measured with magnetic resonance imaging (MRI), and anthropometric indexes, such as body mass index (BMI) and waist circumference (WC), in type 2 diabetic patients (DM2). Patients and Methods: Twenty-four type 2 diabetic patients aged 55 to 78 years (15 females) and weighing 61.5 to 97 kg were included. The patients underwent MRI examination on a Philips Intera® 1.5 T MR scanner. The MRI protocol included a spectral excitation sequence centered at the fat peak. The field of view extended from L4-L5 to the diaphragmatic border. VAT was measured using the software Image J®. Weight, height, BMI, WC, and body fat percentage (BF%), derived from the measurement of four skinfolds with the equation of Durnin and Womersley, were also recorded. The association between the MRI VAT measurement and anthropometry was evaluated using Pearson's correlation coefficient. Results: Mean VAT was 2478 ± 758 ml, mean BMI was 29.5 ± 4.7 kg/m², and mean WC was 100 ± 9.7 cm. There was a poor correlation between VAT and both BMI (r = 0.18) and WC (r = 0.56). Conclusions: BMI and WC are inaccurate predictors of VAT volume in type 2 diabetic patients.
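The correlation analysis used in this record is the standard Pearson coefficient; a small self-contained sketch is below. The data are made up for illustration and do not reproduce the study's r values.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

wc  = [88, 95, 100, 104, 110, 118]          # waist circumference, cm (synthetic)
vat = [1600, 2100, 2300, 2600, 2900, 3400]  # visceral fat volume, ml (synthetic)

print(f"r = {pearson_r(wc, vat):.2f}")
```

An r of 0.56, as reported for WC versus VAT, means WC explains only about 31% of the variance in VAT (r squared), which is why the authors call anthropometry an inaccurate predictor.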

  19. Detection and Quantification of Neurotransmitters in Dialysates

    OpenAIRE

    Zapata, Agustin; Chefer, Vladimir I.; Shippenberg, Toni S.; Denoroy, Luc

    2009-01-01

    Sensitive analytical methods are needed for the separation and quantification of neurotransmitters obtained in microdialysate studies. This unit describes methods that permit quantification of nanomolar concentrations of monoamines and their metabolites (high-pressure liquid chromatography electrochemical detection), acetylcholine (HPLC-coupled to an enzyme reactor), and amino acids (HPLC-fluorescence detection; capillary electrophoresis with laser-induced fluorescence detection).

  20. Quantification of wastewater sludge dewatering.

    Science.gov (United States)

    Skinner, Samuel J; Studer, Lindsay J; Dixon, David R; Hillis, Peter; Rees, Catherine A; Wall, Rachael C; Cavalida, Raul G; Usher, Shane P; Stickland, Anthony D; Scales, Peter J

    2015-10-01

    Quantification and comparison of the dewatering characteristics of fifteen sewage sludges from a range of digestion scenarios are described. The method proposed uses laboratory dewatering measurements and integrity analysis of the extracted material properties. These properties were used as inputs into a model of filtration, the output of which provides the dewatering comparison. This method is shown to be necessary for quantification and comparison of dewaterability as the permeability and compressibility of the sludges varies by up to ten orders of magnitude in the range of solids concentration of interest to industry. This causes a high sensitivity of the dewaterability comparison to the starting concentration of laboratory tests, thus simple dewaterability comparison based on parameters such as the specific resistance to filtration is difficult. The new approach is demonstrated to be robust relative to traditional methods such as specific resistance to filtration analysis and has an in-built integrity check. Comparison of the quantified dewaterability of the fifteen sludges to the relative volatile solids content showed a very strong correlation in the volatile solids range from 40 to 80%. The data indicate that the volatile solids parameter is a strong indicator of the dewatering behaviour of sewage sludges. PMID:26003332

  1. Quantification of uncertainties of water vapour column retrievals using future instruments

    OpenAIRE

    Diedrich, H.; Preusker, R.; Lindstrot, R.; J Fischer

    2012-01-01

    This study presents a quantification of uncertainties of water vapour retrievals based on near infrared measurements of upcoming instruments. The concepts of three scheduled spectrometers were taken into account: OLCI (Ocean and Land Color Instrument) on Sentinel-3, METimage on MetOp (Meteorological Operational Satellite) and FCI (Flexible Combined Imager) on MTG (Meteosat Third Generation). Optimal estimation theory was used to estimate th...

  2. Quantification of Flow Structures in Syntectonic Magmatic Rocks

    Science.gov (United States)

    Kruhl, J. H.; Gerik, A.

    2007-12-01

    Fabrics of syntectonic magmatic rocks provide important information on melt emplacement and crystallization conditions and, consequently, on the state and development of certain parts of the continental crust. Therefore, detailed studies of magmatic fabrics and, specifically, their quantification are a necessary prerequisite for any more detailed study. Fabric anisotropy and heterogeneity are fundamental properties of magmatic rocks. Their quantification can be performed with recently developed modified methods of fractal geometry. (i) A modified Cantor-dust method leads to a direction-related fractal dimension and, consequently, to quantification of fabric anisotropy. (ii) A modified perimeter method allows determination of fractal dimensions of complex curves in relation to their average orientations. (iii) A combination of the box-counting method with kriging results in a contour map of the box-counting dimension, revealing the local fabric heterogeneity. (iv) A combination of method iii and a modified Cantor-dust method leads to mapping of fabric anisotropy (Kruhl et al. 2004, Peternell et al. subm.). Automation of these methods allows fast recording, generation of large data sets, and the application of quantification methods to large areas (Gerik & Kruhl subm.). It leads to a precision of fabric analysis not obtainable by manual execution of the methods. Specifically, the direction-related Cantor-dust method has proven useful for analyzing magmatic flow structures and quantifying the intensity of flow. Application of this method to different types of syntectonic magmatic rocks will be presented and discussed. References: Gerik, A. & Kruhl, J.H.: Towards automated pattern quantification: time-efficient assessment of anisotropy of 2D pattern with AMOCADO. Computers & Geosciences (subm.). Kruhl, J.H., Andries, F., Peternell, M. & Volland, S. 2004: Fractal geometry analyses of rock fabric anisotropies and inhomogeneities. In: Kolymbas, D.
(ed.), Fractals in Geotechnical Engineering, Advances in Geotechnical Engineering and Tunnelling, 9, Logos, Berlin, 115-135. Peternell, M., Bitencourt, M.F. & Kruhl, J.H.: New methods for large-scale rock fabric quantification - the Piquiri Syenite Massif, Southern Brazil. Journal of Structural Geology (subm.)
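The box-counting dimension underlying method (iii) above can be sketched in a few lines (this is a generic textbook version, not the AMOCADO implementation): count the boxes N(s) of size s that contain part of a binary 2D pattern, and fit log N(s) against log(1/s); the slope estimates the fractal dimension D.

```python
import numpy as np

def box_counting_dimension(pattern, sizes=(1, 2, 4, 8, 16)):
    """Estimate the box-counting dimension of a square boolean 2D array."""
    counts = []
    n = pattern.shape[0]
    for s in sizes:
        occupied = 0
        for i in range(0, n, s):
            for j in range(0, n, s):
                if pattern[i:i + s, j:j + s].any():
                    occupied += 1
        counts.append(occupied)
    # slope of log N(s) versus log(1/s) is the dimension estimate
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Sanity check: a filled 64x64 square should give D close to 2.
filled = np.ones((64, 64), dtype=bool)
print(f"D = {box_counting_dimension(filled):.2f}")
```

The direction-related Cantor-dust variant applies the same counting idea along one-dimensional scan lines of a chosen orientation, so the dimension becomes a function of direction and the anisotropy can be read off its variation.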

  3. A pH and solvent optimized reverse-phase ion-pairing-LC–MS/MS method that leverages multiple scan-types for targeted absolute quantification of intracellular metabolites

    DEFF Research Database (Denmark)

    McCloskey, Douglas; Gangoiti, Jon A.

    2015-01-01

    Comprehensive knowledge of intracellular biochemistry is needed to accurately understand, model, and manipulate metabolism for industrial and therapeutic applications. Quantitative metabolomics has been driven by advances in analytical instrumentation and can add valuable knowledge to the understanding of intracellular metabolism. Liquid chromatography coupled to mass spectrometry (LC–MS and LC–MS/MS) has become a reliable means with which to quantify a multitude of intracellular metabolites in parallel with great specificity and accuracy. This work details a method that builds and extends upon existing reverse-phase ion-pairing liquid chromatography methods for separation and detection of polar and anionic compounds that comprise key nodes of intracellular metabolism by optimizing pH and solvent composition. In addition, the presented method utilizes multiple scan types provided by hybrid instrumentation to improve confidence in compound identification. The developed method was validated for a broad coverage of polar and anionic metabolites of intracellular metabolism

  4. A Leonard-Sanders-Budiansky-Koiter-Type Nonlinear Shell Theory with a Hierarchy of Transverse-Shearing Deformations

    Science.gov (United States)

    Nemeth, Michael P.

    2013-01-01

    A detailed exposition on a refined nonlinear shell theory suitable for nonlinear buckling analyses of laminated-composite shell structures is presented. This shell theory includes the classical nonlinear shell theory attributed to Leonard, Sanders, Koiter, and Budiansky as an explicit proper subset. This approach is used in order to leverage the existing experience base and to make the theory attractive to industry. In addition, the formalism of general tensors is avoided in order to expose the details needed to fully understand and use the theory. The shell theory is based on "small" strains and "moderate" rotations, and no shell-thinness approximations are used. As a result, the strain-displacement relations are exact within the presumptions of "small" strains and "moderate" rotations. The effects of transverse-shearing deformations are included in the theory by using analyst-defined functions to describe the through-the-thickness distributions of transverse-shearing strains. Constitutive equations for laminated-composite shells are derived without using any shell-thinness approximations, and simplified forms and special cases are presented.

  5. Application of the perturbation theory (differential formalism) for sensitivity analysis in steam generators of PWR-type nuclear power plants

    International Nuclear Information System (INIS)

    A homogeneous model which simulates the stationary behaviour of steam generators of PWR-type reactors and uses the differential formalism of perturbation theory for the sensitivity analysis of linear and non-linear responses is presented. The PERGEVAP computer code, which calculates the temperature distribution in the steam generator and the associated importance function, was developed. The code also evaluates the effects of thermohydraulic parameter variations on selected functionals. The results obtained are compared with those of the GEVAP computer code. (M.C.K.)

  6. Using psychological theory to understand the clinical management of type 2 diabetes in Primary Care: a comparison across two European countries

    OpenAIRE

    Johnston Marie; Dijkstra Rob; Bosch Marije; Francis Jill J; Eccles Martin P; Hrisos Susan; Grol Richard; Kaner Eileen FS; Steen Ian N

    2009-01-01

    Background: Long-term management of patients with Type 2 diabetes is well established within Primary Care. However, despite extensive efforts to implement high quality care, both service provision and patient health outcomes remain sub-optimal. Several recent studies suggest that psychological theories about individuals' behaviour can provide a valuable framework for understanding generalisable factors underlying health professionals' clinical behaviour. In the context of the team mana...

  7. Development of Quantification Method for Bioluminescence Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeon Sik; Min, Jung Joon; Lee, Byeong Il [Chonnam National University Hospital, Hwasun (Korea, Republic of); Choi, Eun Seo [Chosun University, Gwangju (Korea, Republic of); Tak, Yoon O; Choi, Heung Kook; Lee, Ju Young [Inje University, Kimhae (Korea, Republic of)

    2009-10-15

    Optical molecular luminescence imaging is widely used for the detection and imaging of bio-photons emitted upon luciferase activation. The photons measured with this method indicate the degree of molecular alteration or the cell number, with the advantage of a high signal-to-noise ratio. To extract useful information from the measured results, analysis based on a proper quantification method is necessary. In this research, we propose a quantification method in which the measured light signal responds linearly to the measurement time. We detected the luminescence signal using lab-made optical imaging equipment, an animal light imaging system (ALIS), and two different kinds of light sources. One was a set of three bacterial light-emitting sources containing different numbers of bacteria; the other was a set of three different non-bacterial sources emitting very weak light. Using the concepts of the candela and the flux, we derived a simplified linear quantification formula. After measuring the light intensity experimentally, the data were processed with the proposed quantification function. By applying the pre-determined quantification function, we obtained a linear response of the photon counts to the measurement time. The ratio of the re-calculated photon counts to the measurement time remains constant even when different light sources are applied. The quantification function for linear response could be incorporated into a standard quantification process. The proposed method could be used for exact quantitative analysis in various light imaging instruments, as it yields a linear response of constant light-emitting sources to the measurement time.
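The linear-response idea in this abstract can be sketched numerically: for a steady source, cumulative photon counts grow linearly with exposure time, so a single flux (counts per unit time) characterises the source. A minimal sketch with invented numbers (nothing below is from the ALIS setup):

```python
import numpy as np

# Hypothetical measurements: cumulative photon counts at several exposure
# times for one constant light source (all values are illustrative only).
times = np.array([10.0, 20.0, 40.0, 80.0, 160.0])             # seconds
counts = np.array([1020.0, 1980.0, 4010.0, 8050.0, 15900.0])  # photon counts

# Least-squares slope through the origin: counts ~ flux * time, so the
# flux (counts per second) should be constant for a steady emitter.
flux = np.sum(times * counts) / np.sum(times * times)

# The per-measurement ratios counts/time should scatter tightly around it.
ratios = counts / times
print(f"fitted flux: {flux:.1f} counts/s, ratios: {np.round(ratios, 1)}")
```

For an unstable source the ratio counts/time would drift with time instead of staying constant, which is exactly what this constancy check detects.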

  8. Development of Quantification Method for Bioluminescence Imaging

    International Nuclear Information System (INIS)

    Optical molecular luminescence imaging is widely used for the detection and imaging of bio-photons emitted upon luciferase activation. The photons measured with this method indicate the degree of molecular alteration or the cell number, with the advantage of a high signal-to-noise ratio. To extract useful information from the measured results, analysis based on a proper quantification method is necessary. In this research, we propose a quantification method in which the measured light signal responds linearly to the measurement time. We detected the luminescence signal using lab-made optical imaging equipment, an animal light imaging system (ALIS), and two different kinds of light sources. One was a set of three bacterial light-emitting sources containing different numbers of bacteria; the other was a set of three different non-bacterial sources emitting very weak light. Using the concepts of the candela and the flux, we derived a simplified linear quantification formula. After measuring the light intensity experimentally, the data were processed with the proposed quantification function. By applying the pre-determined quantification function, we obtained a linear response of the photon counts to the measurement time. The ratio of the re-calculated photon counts to the measurement time remains constant even when different light sources are applied. The quantification function for linear response could be incorporated into a standard quantification process. The proposed method could be used for exact quantitative analysis in various light imaging instruments, as it yields a linear response of constant light-emitting sources to the measurement time.

  9. On a singular Fredholm-type integral equation arising in N=2 super-Yang–Mills theories

    International Nuclear Information System (INIS)

    In this work we study the Nekrasov–Shatashvili limit of the Nekrasov instanton partition function of Yang–Mills field theories with N=2 supersymmetry and gauge group SU(Nc). The theories are coupled with Nf flavors of fundamental matter. The equation that determines the density of eigenvalues at the leading order in the saddle-point approximation is exactly solved when Nf=2Nc. The dominating contribution to the instanton free energy is computed. The requirement that this energy is finite imposes quantization conditions on the parameters of the theory that are in agreement with analogous conditions that have been derived in previous works. The instanton energy and thus the instanton contribution to the prepotential of the gauge theory is computed in closed form.

  10. Accessible quantification of multiparticle entanglement

    CERN Document Server

    Cianciaruso, Marco; Adesso, Gerardo

    2015-01-01

    Entanglement is a key ingredient for quantum technologies and a fundamental signature of quantumness in a broad range of phenomena encompassing many-body physics, thermodynamics, cosmology, and life sciences. For arbitrary multiparticle systems, the quantification of entanglement typically involves hard optimisation problems, and requires demanding tomographical techniques. In this paper we show that such difficulties can be overcome by developing an experimentally friendly method to evaluate measures of multiparticle entanglement via a geometric approach. The method provides exact analytical results for a relevant class of mixed states of $N$ qubits, and computable lower bounds to entanglement for any general state. For practical purposes, the entanglement determination requires local measurements in just three settings for any $N$. We demonstrate the power of our approach to quantify multiparticle entanglement in $N$-qubit bound entangled states and other states recently engineered in laboratory using quant...

  11. The relationship of theory of mind and executive functions to symptom type and severity in children with autism

    OpenAIRE

    Joseph, Robert M; TAGER–FLUSBERG, HELEN

    2004-01-01

    Although neurocognitive impairments in theory of mind and in executive functions have both been hypothesized to play a causal role in autism, there has been little research investigating the explanatory power of these impairments with regard to autistic symptomatology. The present study examined the degree to which individual differences in theory of mind and executive functions could explain variations in the severity of autism symptoms. Participants included 31 verbal, school-aged children ...

  12. Prospects of using the second-order perturbation theory of the MP2 type in the theory of electron scattering by polyatomic molecules.

    Czech Academy of Sciences Publication Activity Database

    Čársky, Petr

    2015-01-01

    Roč. 191, č. 2015 (2015), s. 191-192. ISSN 1551-7616 R&D Projects: GA MŠk OC09079; GA MŠk(CZ) OC10046; GA ČR GA202/08/0631 Grant ostatní: COST(XE) CM0805; COST(XE) CM0601 Institutional support: RVO:61388955 Keywords: electron scattering * calculation of cross sections * second-order perturbation theory Subject RIV: CF - Physical; Theoretical Chemistry

  13. One-step RT-droplet digital PCR: a breakthrough in the quantification of waterborne RNA viruses

    OpenAIRE

    Rački, Nejc; Morisset, Dany; Gutierrez-Aguirre, Ion; Ravnikar, Maja

    2013-01-01

    Water contamination by viruses has an increasing worldwide impact on human health and has led to requirements for accurate and quantitative molecular tools. Here, we report the first one-step reverse-transcription droplet digital PCR-based absolute quantification of an RNA virus (rotavirus) in different types of surface water samples. This quantification method proved to be more precise and more tolerant to inhibitory substances than the benchmark reverse-transcription real-time PCR (RT-qP...
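Absolute quantification in droplet digital PCR rests on Poisson statistics over the droplet ensemble: the fraction of positive droplets gives the mean copy number per droplet without any standard curve. A minimal sketch, assuming a hypothetical droplet volume of 0.85 nL (instrument dependent, not stated in the abstract):

```python
import math

def ddpcr_concentration(positive, total, droplet_volume_ul=0.00085):
    """Absolute target concentration (copies/µL) from a ddPCR readout.

    Poisson correction: with a fraction p of positive droplets, the mean
    number of copies per droplet is lam = -ln(1 - p), so the concentration
    is lam / droplet volume.  The 0.85 nL droplet volume is an assumption
    (instrument dependent), not a value from the paper.
    """
    p = positive / total
    lam = -math.log(1.0 - p)
    return lam / droplet_volume_ul

# Illustrative readout: 4000 of 15000 droplets positive.
conc = ddpcr_concentration(4000, 15000)
print(f"{conc:.0f} copies/µL")
```

Because the estimate depends only on counting droplets, it is less sensitive to partial PCR inhibition than a Cq-based standard curve, which is consistent with the tolerance to inhibitors reported above.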

  14. Quantification of nanowire uptake by live cells

    KAUST Repository

    Margineanu, Michael B.

    2015-05-01

    Nanostructures fabricated by different methods have become increasingly important for various applications at the cellular level. In order to understand how these nanostructures “behave” and to study their internalization kinetics, several attempts have been made at tagging and investigating their interaction with living cells. In this study, magnetic iron nanowires with an iron oxide layer are coated with (3-Aminopropyl)triethoxysilane (APTES) and subsequently labeled with the fluorogenic pH-dependent dye pHrodo™ Red, covalently bound to the aminosilane surface. Time-lapse live imaging of human colon carcinoma HCT 116 cells interacting with the labeled iron nanowires is performed for 24 hours. As the pHrodo™ Red conjugated nanowires are non-fluorescent outside the cells but fluoresce brightly inside, internalized nanowires are distinguished from non-internalized ones and their behavior inside the cells can be tracked over that period. A machine learning-based computational framework dedicated to automatic analysis of live cell imaging data, Cell Cognition, is adapted and used to classify cells with internalized and non-internalized nanowires and subsequently determine the uptake percentage at different time points. An uptake of 85% by HCT 116 cells is observed after 24 hours of incubation at an NW-to-cell ratio of 200. While the approach of using pHrodo™ Red for internalization studies is not novel in the literature, this study reports for the first time the use of a machine learning-based, time-resolved automatic analysis pipeline for quantification of nanowire uptake by cells. This pipeline has also been used for comparison studies with nickel nanowires coated with APTES and labeled with pHrodo™ Red, and with another cell line derived from cervix carcinoma, HeLa. It thus has the potential to be used for studying the interaction of different types of nanostructures with potentially any live cell type.

  15. Uncertainty Quantification in Aerodynamics Simulations Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...

  16. Identification and Quantification of Carbonate Species Using Rock-Eval Pyrolysis.

    OpenAIRE

    Pillot D.; Deville E.; Prinzhofer A.

    2014-01-01

    This paper presents a new reliable and rapid method to characterise and quantify carbonates in solid samples based on monitoring the CO2 flux emitted by progressive thermal decomposition of carbonates during a programmed heating. The different peaks of destabilisation allow determining the different types of carbonates present in the analysed sample. The quantification of each peak gives the respective proportions of these diffe...

  17. Ex vivo activity quantification in micrometastases at the cellular scale using the α-camera technique

    DEFF Research Database (Denmark)

    Chouin, Nicolas; Lindegren, Sture

    2013-01-01

    Targeted α-therapy (TAT) appears to be an ideal therapeutic technique for eliminating malignant circulating, minimal residual, or micrometastatic cells. These types of malignancies are typically infraclinical, complicating the evaluation of potential treatments. This study presents a method of ex vivo activity quantification with an α-camera device, allowing measurement of the activity taken up by tumor cells in biologic structures of a few tens of microns.

  18. Bianchi type-V cosmological models with perfect fluid and heat flow in Saez–Ballester theory

    Indian Academy of Sciences (India)

    Shri Ram; M Zeyauddin; C P Singh

    2009-02-01

    In this paper we discuss the variation law for Hubble's parameter with the average scale factor in a spatially homogeneous and anisotropic Bianchi type-V space-time model, which yields a constant value of the deceleration parameter. We derive two laws of variation of the average scale factor with cosmic time, one of power-law type and the other of exponential form. Exact solutions of the Einstein field equations with perfect fluid and heat conduction are obtained for Bianchi type-V space-time in these two types of cosmologies. In the power-law cosmology, the solutions correspond to a cosmological model which starts expanding from the singular state with positive deceleration parameter. In the case of exponential cosmology, we present an accelerating non-singular model of the Universe. We find that the constant value of the deceleration parameter is reasonable for the present-day Universe and gives an appropriate description of the evolution of the Universe. We have also discussed the different types of physical and kinematical behaviour of the models in these two types of cosmologies.
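The two variation laws summarised above follow a standard pattern; as a hedged reconstruction in conventional notation (k > 0 and n are the constants of the assumed law H ∝ a⁻ⁿ, symbols not fixed by the abstract):

```latex
% Power-law cosmology (n > 0): integrating \dot{a}/a = k a^{-n} gives
a(t) = (n k t)^{1/n}, \qquad
q \equiv -\frac{a\ddot{a}}{\dot{a}^{2}} = n - 1 \quad (\text{constant}),
% i.e. a singular model expanding with q > 0 for n > 1, while
% the exponential cosmology (n = 0) gives
a(t) = a_0\, e^{k t}, \qquad q = -1 \quad (\text{non-singular, accelerating}).
```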

  19. Uncertainty Quantification in Solidification Modelling

    Science.gov (United States)

    Fezi, K.; Krane, M. J. M.

    2015-06-01

    Numerical models have been used to simulate solidification processes, to gain insight into physical phenomena that cannot be observed experimentally. Often validation of such models has been done through comparison to a few or even single experiments, in which agreement depends on both model and experimental uncertainty. As a first step to quantifying the uncertainty in the models, sensitivity and uncertainty analyses were performed on a simple steady-state 1D solidification model of continuous casting of weld filler rod. This model, which includes conduction, advection, and release of latent heat, was developed for use in uncertainty quantification of the calculated liquidus and solidus positions and the solidification time. Using this model, a Smolyak sparse grid algorithm constructed a response surface that fit model outputs based on the range of uncertainty in the inputs to the model. The response surface was then used to determine the probability density functions (PDFs) of the model outputs and the sensitivities of the inputs. This process was done for a linear fraction solid and temperature relationship, for which there is an analytical solution, and for a Scheil relationship. Similar analysis was also performed on a transient 2D model of solidification in a rectangular domain.
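The surrogate-based workflow described here (sample the uncertain inputs, fit a response surface to the model outputs, then propagate cheap samples through the surrogate) can be sketched in a few lines. The sketch below swaps the Smolyak sparse grid for plain random sampling with a least-squares quadratic surrogate, and uses an invented two-input toy function rather than the solidification model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "model": a toy response to two uncertain inputs (say latent
# heat L and conductivity k, both scaled to [-1, 1]).  Purely illustrative.
def model(L, k):
    return 1.0 + 0.3 * L - 0.2 * k + 0.05 * L * k

# 1) Sample the uncertain inputs over their ranges and run the model.
L_s = rng.uniform(-1, 1, 50)
k_s = rng.uniform(-1, 1, 50)
y_s = model(L_s, k_s)

# 2) Fit a quadratic polynomial response surface by least squares.
def basis(L, k):
    return np.column_stack([np.ones_like(L), L, k, L * k, L**2, k**2])

coef, *_ = np.linalg.lstsq(basis(L_s, k_s), y_s, rcond=None)

# 3) Push many cheap samples through the surrogate to estimate the output
#    distribution; the linear coefficients double as crude sensitivities.
L_mc = rng.uniform(-1, 1, 100_000)
k_mc = rng.uniform(-1, 1, 100_000)
y_mc = basis(L_mc, k_mc) @ coef
print(f"output mean {y_mc.mean():.3f}, std {y_mc.std():.3f}")
```

Step 3 is where the surrogate pays off: evaluating the polynomial 100 000 times costs almost nothing, whereas 100 000 runs of a real solidification code would be prohibitive.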

  20. A PCR-based tool for cultivation-independent detection and quantification of Metarhizium clade 1.

    Science.gov (United States)

    Schneider, S; Rehner, S A; Widmer, F; Enkerli, J

    2011-10-01

    The entomopathogenic fungus Metarhizium anisopliae and its sister species are some of the most widely used biological control agents for insects. The availability of specific monitoring and quantification tools is essential for investigating the environmental factors that influence their distribution. Naturally occurring as well as released Metarhizium strains in the environment are traditionally monitored with cultivation-dependent techniques. However, specific detection and quantification may be limited by the lack of a defined and reliable detection range of such methods. Cultivation-independent PCR-based detection and quantification tools offer high-throughput analyses of target taxa in various environments. In this study a cultivation-independent PCR-based method was developed which allows specific detection and quantification of the defined Metarhizium clade 1, which is formed by the species Metarhizium majus, Metarhizium guizhouense, Metarhizium pingshaense, Metarhizium anisopliae, Metarhizium robertsii and Metarhizium brunneum, formerly included in the M. anisopliae cryptic species complex. This method is based on the use of clade-specific primers, i.e. Ma 1763 and Ma 2097, that are positioned within the internal transcribed spacer regions 1 and 2 of the nuclear ribosomal RNA gene cluster, respectively. BLAST similarity searches and empirical specificity tests performed on target and non-target species, as well as on bulk soil DNA samples, demonstrated the specificity of this diagnostic tool for the targeted Metarhizium clade 1. Testing of the primer pair in qPCR assays validated the diagnostic method for specific quantification of Metarhizium clade 1 in complex bulk soil DNA samples, which correlated significantly with cultivation-dependent quantification. The new tool will allow highly specific and rapid detection and quantification of the targeted Metarhizium clade 1 in the environment. Habitats with high Metarhizium clade 1 densities can then be analyzed for habitat preferences in greater detail using cultivation-dependent techniques and genetic typing of isolates. PMID:21821039
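A qPCR assay of the kind validated here quantifies a target from a standard curve of quantification cycle (Cq) against log10 copy number. A minimal sketch with invented calibration values (the real curve for the Ma 1763/Ma 2097 assay is not given in the abstract):

```python
import numpy as np

# Hypothetical standard curve: Cq measured for serial dilutions of known
# template copy numbers (all values invented for illustration).
log10_copies = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
cq = np.array([31.1, 27.8, 24.4, 21.1, 17.7])

# Linear fit: Cq = slope * log10(copies) + intercept.
slope, intercept = np.polyfit(log10_copies, cq, 1)

# Amplification efficiency: E = 10^(-1/slope) - 1 (1.0 means 100 %).
efficiency = 10 ** (-1.0 / slope) - 1.0

def copies_from_cq(c):
    """Absolute copy number of an unknown sample from its measured Cq."""
    return 10 ** ((c - intercept) / slope)

print(f"slope {slope:.2f}, efficiency {efficiency:.2f}, "
      f"Cq 25 -> {copies_from_cq(25.0):.0f} copies")
```

A slope near -3.32 (efficiency near 100 %) is the usual acceptance criterion for such assays; strong deviations indicate inhibition, which is one reason the correlation with cultivation-dependent counts matters.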

  1. Damage quantification of shear buildings using deflections obtained by modal flexibility

    International Nuclear Information System (INIS)

    This paper presents a damage quantification method for shear buildings using the damage-induced inter-storey deflections (DI-IDs) estimated by modal flexibilities from ambient vibration measurements. This study intends to provide a basis for the damage quantification problem of more complex building structures by investigating a rather idealized type of structure, the shear building. Damage in a structure, represented by a loss of stiffness, generally induces additional deflection, which may contain essential information about the damage. From an analytical investigation, the general equation of damage quantification by the damage-induced deflection is proposed, and its special case for shear buildings, based on the damage-induced inter-storey deflection, is also proposed. The proposed damage quantification method is advantageous compared to conventional FE-updating approaches since the number of variables in the optimization problem depends only on the complexity of the damage parametrization, not on the complexity of the structure. For this reason, the damage quantification for shear buildings is simplified to a form that does not require any FE updating. Numerical and experimental studies on a five-storey shear building were carried out for two damage scenarios with 10% column EI reductions. From the numerical study, it was found that the lower four natural frequencies and mode shapes were enough to keep errors in the deflection estimation and the damage quantification below 1%. From the experimental study, deflections estimated by the modal flexibilities were found to agree well with the deflections obtained from static push-over tests. Damage quantifications by the proposed method were also found to agree well with the true amounts of damage obtained from static push-over tests.
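The core computation, estimating deflections from a modal flexibility matrix assembled from a few mass-normalised modes, can be sketched on a toy five-storey shear building. All stiffness, mass, and load values below are invented for illustration, not taken from the paper:

```python
import numpy as np

m = 1.0e4                                   # identical floor masses (kg)

def shear_stiffness(k):
    """Tridiagonal stiffness matrix of a shear building with storey stiffnesses k."""
    n = len(k)
    K = np.zeros((n, n))
    for i in range(n):
        K[i, i] += k[i]
        if i + 1 < n:
            K[i, i] += k[i + 1]
            K[i, i + 1] = K[i + 1, i] = -k[i + 1]
    return K

def modal_flexibility(K, n_modes=4):
    """F = sum_j phi_j phi_j^T / omega_j^2 over the lowest modes, with
    mass-normalised mode shapes (the quantities ambient vibration tests yield)."""
    w2, phi = np.linalg.eigh(K / m)         # valid because the mass matrix is m * I
    phi = phi / np.sqrt(m)                  # mass-normalise: phi^T M phi = I
    F = np.zeros_like(K)
    for j in range(n_modes):                # eigh returns ascending eigenvalues
        F += np.outer(phi[:, j], phi[:, j]) / w2[j]
    return F

k_intact = np.full(5, 1.0e7)                # storey stiffnesses (N/m)
k_damaged = k_intact.copy()
k_damaged[2] *= 0.9                         # 10 % stiffness loss in storey 3

load = np.full(5, 1.0e3)                    # uniform lateral load (N)
u_intact = modal_flexibility(shear_stiffness(k_intact)) @ load
u_damaged = modal_flexibility(shear_stiffness(k_damaged)) @ load

# Damage-induced inter-storey deflections: storey-wise differences of the
# change in deflection; damage shows up as a jump at the damaged storey.
di_id = np.diff(np.concatenate([[0.0], u_damaged - u_intact]))
print("largest DI-ID at storey", int(np.argmax(np.abs(di_id))) + 1)
```

Because the DI-ID is concentrated at the softened storey, locating and sizing the damage needs no finite-element updating, mirroring the claim in the abstract.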

  2. Synthesis of nanodiamond derivatives carrying amino functions and quantification by a modified Kaiser test

    Directory of Open Access Journals (Sweden)

    Gerald Jarre

    2014-11-01

    Nanodiamonds functionalized with different organic moieties carrying terminal amino groups have been synthesized. These include conjugates generated by Diels–Alder reactions of ortho-quinodimethanes formed in situ from pyrazine and 5,6-dihydrocyclobuta[d]pyrimidine derivatives. For the quantification of primary amino groups a modified photometric assay based on the Kaiser test has been developed and validated for different types of aminated nanodiamond. The results correspond well to values obtained by thermogravimetry. The method represents an alternative wet-chemical quantification method in cases where other techniques like elemental analysis fail due to unfavourable combustion behaviour of the analyte or other impediments.
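A photometric Kaiser-type assay converts the absorbance of the chromophore into an amino-group loading via the Beer-Lambert law. A minimal sketch, assuming a molar absorptivity of 15000 L mol⁻¹ cm⁻¹ at ~570 nm (a commonly used value for Ruhemann's purple, not taken from the paper):

```python
def amino_loading_umol_per_g(absorbance, sample_mg, volume_ml, dilution=1.0,
                             epsilon=15000.0, path_cm=1.0):
    """Amino-group loading (µmol/g) of an aminated sample from the measured
    absorbance of the chromophore formed in a Kaiser-type test.

    Beer-Lambert: c = A / (epsilon * path).  epsilon = 15000 L/(mol*cm) is
    an assumed literature-style value, not a number from the paper.
    """
    conc_mol_l = absorbance * dilution / (epsilon * path_cm)   # chromophore conc.
    amount_umol = conc_mol_l * (volume_ml / 1000.0) * 1e6      # moles -> µmol
    return amount_umol / (sample_mg / 1000.0)                  # per gram of sample

# Illustrative: A = 0.45 measured for 2.0 mg of nanodiamond in 3.0 mL assay volume.
print(f"{amino_loading_umol_per_g(0.45, 2.0, 3.0):.1f} µmol/g")
```

Loadings computed this way can then be cross-checked against the mass loss seen in thermogravimetry, as the abstract describes.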

  3. Synthesis of nanodiamond derivatives carrying amino functions and quantification by a modified Kaiser test.

    Science.gov (United States)

    Jarre, Gerald; Heyer, Steffen; Memmel, Elisabeth; Meinhardt, Thomas; Krueger, Anke

    2014-01-01

    Nanodiamonds functionalized with different organic moieties carrying terminal amino groups have been synthesized. These include conjugates generated by Diels-Alder reactions of ortho-quinodimethanes formed in situ from pyrazine and 5,6-dihydrocyclobuta[d]pyrimidine derivatives. For the quantification of primary amino groups a modified photometric assay based on the Kaiser test has been developed and validated for different types of aminated nanodiamond. The results correspond well to values obtained by thermogravimetry. The method represents an alternative wet-chemical quantification method in cases where other techniques like elemental analysis fail due to unfavourable combustion behaviour of the analyte or other impediments. PMID:25550737

  4. Application of perturbation theory to sensitivity calculations of PWR type reactor cores using the two-channel model

    International Nuclear Information System (INIS)

    Sensitivity calculations are very important in the design and safety analysis of nuclear reactor cores. Large codes incorporating a great number of physical considerations have been used to perform sensitivity studies. However, these codes require long computation times and therefore involve high costs. Perturbation theory constitutes an efficient and economical method for performing sensitivity analysis. The present work is an application of perturbation theory (matricial formalism) to a simplified model of DNB (Departure from Nucleate Boiling) analysis to perform sensitivity calculations in PWR cores. Expressions to calculate the sensitivity coefficients of enthalpy and coolant velocity with respect to coolant density and hot-channel area were developed from the proposed model. The CASNUR.FOR code to evaluate these sensitivity coefficients was written in Fortran. The comparison between results obtained from the matricial formalism of perturbation theory and those obtained directly from the proposed model makes evident the efficiency and potential of this perturbation method for sensitivity calculations in nuclear reactor cores (author). 23 refs, 4 figs, 7 tabs

  5. Calculation of Fayet–Iliopoulos D-term in type I string theory revisited: T^6/Z_3 orbifold case

    Energy Technology Data Exchange (ETDEWEB)

    Itoyama, H., E-mail: itoyama@sci.osaka-cu.ac.jp [Department of Mathematics and Physics, Graduate School of Science, Osaka City University (Japan); Osaka City University Advanced Mathematical Institute (OCAMI), 3-3-138, Sugimoto, Sumiyoshi-ku, Osaka 558-8585 (Japan); Yano, Kohei, E-mail: kyano@sci.osaka-cu.ac.jp [Department of Mathematics and Physics, Graduate School of Science, Osaka City University (Japan)

    2013-12-18

    The string one-loop computation of the Fayet–Iliopoulos D-term in type I string theory in the case of T^6/Z_3 orbifold compactification associated with annulus (planar) and Möbius strip string worldsheet diagrams is reexamined. The mass extracted from the sum of these amplitudes through a limiting procedure is found to be non-vanishing, which is contrary to the earlier computation. The sum can be made finite by a rescaling of the modular parameter in the closed string channel.

  6. Study on exploration theory and SAR technology for interlayer oxidation zone sandstone type uranium deposit and its application in Eastern Jungar Basin

    International Nuclear Information System (INIS)

    Starting from an analysis of the metallogenetic epochs and spatial distribution of typical interlayer oxidation zone sandstone-type uranium deposits in China and abroad, and of their relation to basin evolution, the authors propose that the last unconformity mainly controls the metallogenetic epoch and that the strength of tectonic activity after the last unconformity determines the deposit location. An exploration strategy proceeding from the newest events to the oldest is put forward. The means and methods of using SAR technology to identify key ore-controlling factors are discussed, and an application study in the Eastern Jungar Basin is performed.

  7. Quantification of nerolidol in mouse plasma using gas chromatography-mass spectrometry.

    Science.gov (United States)

    Saito, Alexandre Yukio; Sussmann, Rodrigo Antonio Ceschini; Kimura, Emilia Akemi; Cassera, Maria Belen; Katzin, Alejandro Miguel

    2015-07-10

    Nerolidol is a naturally occurring sesquiterpene found in the essential oils of many types of flowers and plants. It is frequently used in cosmetics, as a food flavoring agent, and in cleaning products. In addition, nerolidol is used as a skin penetration enhancer for transdermal delivery of therapeutic drugs. However, nerolidol is hemolytic at low concentrations. A simple and fast GC-MS method was developed for preliminary quantification and assessment of biological interferences of nerolidol in mouse plasma after oral dosing. Calibration curves were linear in the concentration range of 0.010-5 µg/mL nerolidol in mouse plasma with correlation coefficients (r) greater than 0.99. Limits of detection and quantification were 0.0017 and 0.0035 µg/mL, respectively. The optimized method was successfully applied to the quantification of nerolidol in mouse plasma. PMID:25880240
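The reported figures (linearity range, r, LOD, LOQ) follow from a standard calibration-curve computation. A sketch with invented calibration data, using the ICH-style definitions LOD = 3.3σ/slope and LOQ = 10σ/slope:

```python
import numpy as np

# Hypothetical calibration data: spiked concentration (µg/mL) vs. instrument
# peak-area ratio; all numbers invented for illustration.
conc = np.array([0.010, 0.050, 0.10, 0.50, 1.0, 2.5, 5.0])
signal = np.array([0.021, 0.101, 0.199, 1.02, 2.01, 4.97, 10.05])

slope, intercept = np.polyfit(conc, signal, 1)
pred = slope * conc + intercept
r = np.corrcoef(conc, signal)[0, 1]

# The residual standard deviation of the fit stands in for the noise sigma.
sigma = np.sqrt(np.sum((signal - pred) ** 2) / (len(conc) - 2))
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"r = {r:.4f}, LOD = {lod:.4f} µg/mL, LOQ = {loq:.4f} µg/mL")
```

In practice sigma is often taken from blank replicates rather than fit residuals; either way LOQ sits about three times above LOD, as in the abstract's 0.0017 vs 0.0035 µg/mL.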

  8. Quantification of brain lipids by FTIR spectroscopy and partial least squares regression

    Science.gov (United States)

    Dreissig, Isabell; Machill, Susanne; Salzer, Reiner; Krafft, Christoph

    2009-01-01

    Brain tissue is characterized by a high lipid content. This content decreases, and the lipid composition changes, during the transformation from normal brain tissue to tumor. Therefore, the analysis of brain lipids might complement existing diagnostic tools in determining the tumor type and grade. The objective of this work is to extract lipids from the gray and white matter of porcine brain tissue, record infrared (IR) spectra of these extracts, and develop a quantification model for the main lipids based on partial least squares (PLS) regression. IR spectra of the pure lipids cholesterol, cholesterol ester, phosphatidic acid, phosphatidylcholine, phosphatidylethanolamine, phosphatidylserine, phosphatidylinositol, sphingomyelin, galactocerebroside and sulfatide were used as references. Two lipid mixtures were prepared for training and validation of the quantification model. The composition of lipid extracts predicted by PLS regression of the IR spectra was compared with lipid quantification by thin-layer chromatography.
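The quantification model treats a measured spectrum as a linear combination of reference lipid spectra. The paper fits this with PLS regression; the sketch below uses ordinary least-squares unmixing of synthetic Gaussian "spectra" as a simplified stand-in for the same linear-mixture idea (everything here is invented, nothing is real IR data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "IR spectra": each pure reference is a Gaussian band on a 200-point axis.
x = np.linspace(0.0, 1.0, 200)

def band(center, width=0.05):
    return np.exp(-0.5 * ((x - center) / width) ** 2)

references = np.column_stack([band(0.2), band(0.45), band(0.7)])  # 3 "lipids"
true_fractions = np.array([0.5, 0.3, 0.2])

# A measured mixture spectrum = linear combination of references + noise.
mixture = references @ true_fractions + rng.normal(0.0, 0.005, x.size)

# Least-squares unmixing recovers the composition from the same
# linear-mixture assumption that underlies the PLS model.
est, *_ = np.linalg.lstsq(references, mixture, rcond=None)
print("estimated fractions:", np.round(est, 3))
```

PLS earns its keep over plain least squares when the references overlap heavily or are themselves noisy; with well-separated bands, as here, the two approaches coincide.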

  9. Techniques for quantification of liver fat in risk stratification of diabetics

    International Nuclear Information System (INIS)

    Fatty liver disease plays an important role in the development of type 2 diabetes. Accurate techniques for the detection and quantification of liver fat are essential for clinical diagnostics. Chemical shift-encoded magnetic resonance imaging (MRI) is a simple approach to quantify liver fat content. Liver fat quantification using chemical shift-encoded MRI is influenced by several bias factors, such as T2* decay, T1 recovery and the multispectral complexity of fat. The confounder-corrected proton density fat fraction is a simple approach to quantify liver fat with comparable results independent of the software and hardware used. The proton density fat fraction is an accurate biomarker for the assessment of liver fat. An accurate and reproducible quantification of liver fat using chemical shift-encoded MRI requires a calculation of the proton density fat fraction. (orig.)
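The proton density fat fraction is, at its simplest, the fat signal divided by the total (fat plus water) signal. A toy two-point Dixon sketch, deliberately omitting the T1, T2* and multispectral-fat corrections the text says a confounder-corrected PDFF requires:

```python
import numpy as np

def pdff_two_point(in_phase, out_phase):
    """Proton density fat fraction from two-point Dixon magnitude images.

    Simplified sketch: water = (IP + OP)/2, fat = (IP - OP)/2, and
    PDFF = fat / (fat + water).  A clinical confounder-corrected PDFF
    additionally corrects T1, T2* and the multiple spectral fat peaks,
    all of which this toy calculation omits.
    """
    water = (in_phase + out_phase) / 2.0
    fat = (in_phase - out_phase) / 2.0
    return fat / (fat + water)

# Illustrative voxel values (arbitrary units): IP = W + F, OP = W - F.
ip = np.array([100.0, 100.0, 100.0])
op = np.array([100.0, 60.0, 20.0])     # corresponds to 0 %, 20 %, 40 % fat
print(np.round(pdff_two_point(ip, op), 2))
```

Because the uncorrected signals are weighted by relaxation effects, this naive ratio is only a "fat signal fraction"; the confounder corrections are exactly what promote it to a proton *density* fat fraction.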

  10. Quantification of Mycobacterium avium subsp. paratuberculosis Strains Representing Distinct Genotypes and Isolated from Domestic and Wildlife Animal Species by Use of an Automatic Liquid Culture System

    OpenAIRE

    Abendaño, Naiara; Sevilla, Iker; Prieto, José M.; Garrido, Joseba M.; Juste, Ramon A.; Alonso-Hearn, Marta

    2012-01-01

    Quantification of 11 clinical strains of Mycobacterium avium subsp. paratuberculosis isolated from domestic (cattle, sheep, and goat) and wildlife (fallow deer, deer, wild boar, and bison) animal species in an automatic liquid culture system (Bactec MGIT 960) was accomplished. The strains were previously isolated and typed using IS1311 PCR followed by restriction endonuclease analysis (PCR-REA) into type C, S, or B. A strain-specific quantification curve was generated for each M. avium subsp....

  11. The Types of Axisymmetric Exact Solutions Closely Related to n-SOLITONS for Yang-Mills Theory

    Science.gov (United States)

    Zhong, Zai Zhe

    In this letter, we point out that if a symmetric 2×2 real matrix M(ρ,z) obeys the Belinsky-Zakharov equation and |det(M)|=1, then an axisymmetric Bogomol'nyi field exact solution for the Yang-Mills-Higgs theory can be given. By using the inverse scattering technique, some special Bogomol'nyi field exact solutions, which are closely related to true solitons, are generated. In particular, the Schwarzschild-like solution is a two-soliton-like solution.

  12. Effectiveness of a brief theory-based health promotion intervention among adults at high risk of type 2 diabetes

    DEFF Research Database (Denmark)

    Juul, Lise; Andersen, Vibeke Just; Arnoldsen, Jette; Maindal, Helle Terkildsen

    2015-01-01

    AIM: To examine the effect of a brief theory-based health promotion intervention delivered in the community on health behaviour and diabetes-related risk factors among Danish adults at high risk of diabetes. METHODS: A randomised trial was conducted among 127 individuals aged 28 to 70 with fasting plasma glucose 6.1-6.9 mmol/l and/or HbA1c 6.0-<6.5% (42-<48 mmol/mol) recruited from general practice in Holstebro, Denmark. Participants were randomised to a control group or to receive the intervent...

  13. Type-1.5 superconductivity in multiband systems: Magnetic response, broken symmetries and microscopic theory - A brief overview

    Energy Technology Data Exchange (ETDEWEB)

    Babaev, E., E-mail: babaev1@physics.umass.edu [Department of Theoretical Physics, The Royal Institute of Technology, Stockholm, SE 10691 (Sweden); Department of Physics, University of Massachusetts Amherst, MA 01003 (United States); Carlstroem, J. [Department of Theoretical Physics, The Royal Institute of Technology, Stockholm, SE 10691 (Sweden); Department of Physics, University of Massachusetts Amherst, MA 01003 (United States); Garaud, J. [Department of Physics, University of Massachusetts Amherst, MA 01003 (United States); Silaev, M. [Department of Theoretical Physics, The Royal Institute of Technology, Stockholm, SE 10691 (Sweden); Institute for Physics of Microstructures RAS, 603950 Nizhny Novgorod (Russian Federation); Speight, J.M. [School of Mathematics, University of Leeds, Leeds LS2 9JT (United Kingdom)

    2012-09-15

    A conventional superconductor is described by a single complex order parameter field which has two fundamental length scales, the magnetic field penetration depth λ and the coherence length ξ. Their ratio κ determines the response of a superconductor to an external field, sorting superconductors into two categories: type-I when κ < 1/√2 and type-II when κ > 1/√2. We overview here multicomponent systems which can possess three or more fundamental length scales and allow a separate 'type-1.5' superconducting state when, e.g. in the two-component case, ξ1 < √2 λ < ξ2. In that state, as a consequence of the extra fundamental length scale, vortices attract one another at long range but repel at shorter ranges. As a consequence the system should form an additional Semi-Meissner state, whose properties we discuss below. In that state vortices form clusters in low magnetic fields. Inside a cluster one of the components is depleted and the superconductor-to-normal interface has negative energy. In contrast, the current in the second component is mostly concentrated on the cluster's boundary, making the energy of this interface positive. Here we briefly overview recent developments in Ginzburg-Landau and microscopic descriptions of this state.
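The length-scale criteria quoted in the abstract lend themselves to a tiny sketch. Only the inequalities (κ = λ/ξ against 1/√2, and ξ1 < √2 λ < ξ2 for the two-component window) come from the text; the function names and the handling of boundary cases are illustrative assumptions:

```python
import math

SQRT2 = math.sqrt(2)

def classify_single(lam, xi):
    # Conventional single-component criterion: kappa = lambda / xi against 1/sqrt(2).
    kappa = lam / xi
    return "type-I" if kappa < 1 / SQRT2 else "type-II"

def classify_two_component(lam, xi1, xi2):
    # Two-component window from the abstract: xi_1 < sqrt(2)*lambda < xi_2.
    xi_small, xi_large = sorted((xi1, xi2))
    if xi_large < SQRT2 * lam:
        return "type-II"   # both coherence lengths short relative to lambda
    if xi_small > SQRT2 * lam:
        return "type-I"    # both coherence lengths long relative to lambda
    return "type-1.5"      # long-range attraction, short-range repulsion

print(classify_single(1.0, 2.0))              # type-I (kappa = 0.5)
print(classify_two_component(1.0, 0.5, 3.0))  # type-1.5
```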

  14. Dynamic behaviors of spin-1/2 bilayer system within Glauber-type stochastic dynamics based on the effective-field theory

    Energy Technology Data Exchange (ETDEWEB)

    Erta?, Mehmet; Kantar, Ersin, E-mail: ersinkantar@erciyes.edu.tr; Keskin, Mustafa

    2014-05-01

    The dynamic phase transitions (DPTs) and dynamic phase diagrams of the kinetic spin-1/2 bilayer system in the presence of a time-dependent oscillating external magnetic field are studied by using Glauber-type stochastic dynamics based on the effective-field theory with correlations for the ferromagnetic/ferromagnetic (FM/FM), antiferromagnetic/ferromagnetic (AFM/FM) and antiferromagnetic/antiferromagnetic (AFM/AFM) interactions. The time variations of average magnetizations and the temperature dependence of the dynamic magnetizations are investigated. The dynamic phase diagrams for the amplitude of the oscillating field versus temperature are presented. The results are compared with the results of the same system within Glauber-type stochastic dynamics based on the mean-field theory. - Highlights: • The Ising bilayer system is investigated within the Glauber dynamics based on EFT. • The time variations of average order parameters to find phases are studied. • The dynamic phase diagrams are found for the different interaction parameters. • The system displays the critical points as well as a re-entrant behavior.

  15. Dynamic behaviors of spin-1/2 bilayer system within Glauber-type stochastic dynamics based on the effective-field theory

    International Nuclear Information System (INIS)

    The dynamic phase transitions (DPTs) and dynamic phase diagrams of the kinetic spin-1/2 bilayer system in the presence of a time-dependent oscillating external magnetic field are studied by using Glauber-type stochastic dynamics based on the effective-field theory with correlations for the ferromagnetic/ferromagnetic (FM/FM), antiferromagnetic/ferromagnetic (AFM/FM) and antiferromagnetic/antiferromagnetic (AFM/AFM) interactions. The time variations of average magnetizations and the temperature dependence of the dynamic magnetizations are investigated. The dynamic phase diagrams for the amplitude of the oscillating field versus temperature are presented. The results are compared with the results of the same system within Glauber-type stochastic dynamics based on the mean-field theory. - Highlights: • The Ising bilayer system is investigated within the Glauber dynamics based on EFT. • The time variations of average order parameters to find phases are studied. • The dynamic phase diagrams are found for the different interaction parameters. • The system displays the critical points as well as a re-entrant behavior

  16. WaveletQuant, an improved quantification software based on wavelet signal threshold de-noising for labeled quantitative proteomic analysis

    Directory of Open Access Journals (Sweden)

    Li Song

    2010-04-01

    Background: Quantitative proteomics technologies have been developed to comprehensively identify and quantify proteins in two or more complex samples. Quantitative proteomics based on differential stable isotope labeling is one such quantification technology. Mass spectrometric data generated for peptide quantification are often noisy, and peak detection and definition require various smoothing filters to remove noise in order to achieve accurate peptide quantification. Many traditional smoothing filters, such as the moving average filter, Savitzky-Golay filter and Gaussian filter, have been used to reduce noise in MS peaks. However, limitations of these filtering approaches often result in inaccurate peptide quantification. Here we present the WaveletQuant program, based on wavelet theory, for better or alternative MS-based proteomic quantification. Results: We developed a novel discrete wavelet transform (DWT) and a 'Spatial Adaptive Algorithm' to remove noise and to identify true peaks. We programmed and compiled WaveletQuant using Visual C++ 2005 Express Edition. We then incorporated the WaveletQuant program into the Trans-Proteomic Pipeline (TPP), a commonly used open source proteomics analysis pipeline. Conclusions: We showed that WaveletQuant was able to quantify more proteins and to quantify them more accurately than ASAPRatio, a program that performs quantification in the TPP pipeline, first using known mixed ratios of yeast extracts and then using a data set from ovarian cancer cell lysates. The program and its documentation can be downloaded from our website at http://systemsbiozju.org/data/WaveletQuant.
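As a rough illustration of the wavelet-threshold idea (not the WaveletQuant implementation, which uses its own DWT and a 'Spatial Adaptive Algorithm'), a one-level Haar transform with soft thresholding of the detail coefficients can be sketched in pure Python; the synthetic peak and the threshold value are invented for the example:

```python
import math
import random

def haar_dwt(x):
    # One-level Haar transform: pairwise averages (approximation) and differences (detail).
    a = [(x[i] + x[i + 1]) / math.sqrt(2) for i in range(0, len(x) - 1, 2)]
    d = [(x[i] - x[i + 1]) / math.sqrt(2) for i in range(0, len(x) - 1, 2)]
    return a, d

def haar_idwt(a, d):
    # Exact inverse of haar_dwt for even-length signals.
    x = []
    for ai, di in zip(a, d):
        x += [(ai + di) / math.sqrt(2), (ai - di) / math.sqrt(2)]
    return x

def soft_threshold(coeffs, t):
    # Shrink coefficients toward zero; small (noise-like) details vanish entirely.
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]

def denoise(signal, threshold):
    a, d = haar_dwt(signal)
    return haar_idwt(a, soft_threshold(d, threshold))

# Noisy synthetic "MS peak": a Gaussian bump plus additive noise (even length).
random.seed(0)
peak = [math.exp(-((i - 16) / 4.0) ** 2) + random.gauss(0, 0.05) for i in range(32)]
smoothed = denoise(peak, 0.1)
print(len(smoothed), round(max(smoothed), 2))
```

With a zero threshold the transform round-trips the signal exactly, which is a convenient sanity check before tuning the threshold on real data.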

  17. Black Holes in type IIA String on Calabi-Yau Threefolds with Affine ADE Geometries and q-Deformed 2d Quiver Gauge Theories

    CERN Document Server

    Laamara, R A; Drissi, L B; Saidi, E H

    2006-01-01

    Motivated by studies on 4d black holes and q-deformed 2d Yang-Mills theory, and borrowing ideas from the compact geometry of the blowing up of affine ADE singularities, we build a class of local Calabi-Yau threefolds (CY^3) extending the local 2-torus model \mathcal{O}(m)\oplus\mathcal{O}(-m)\to T^2 considered in hep-th/0406058 to test the OSV conjecture. We first study toric realizations of T^2 and then build a toric representation of X_3 using intersections of local Calabi-Yau threefolds \mathcal{O}(m)\oplus\mathcal{O}(-m-2)\to\mathbb{P}^1. We develop the 2d \mathcal{N}=2 linear \sigma-model for this class of toric CY^3s. Then we use these local backgrounds to study the partition function of 4d black holes in type IIA string theory and the underlying q-deformed 2d quiver gauge theories. We also make comments on 4d black holes obtained from D-branes wrapping cycles in \mathcal{O}(m)\oplus\mathcal{O}(-m-2)\to\mathcal{B}_k with m=(m_1,...,m_k) a k-dim integer vec...

  18. Quantifications and Modeling of Human Failure Events in a Fire PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Kim, Kilyoo; Jang, Seung-Cheol [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    USNRC and EPRI developed guidance, 'Fire Human Reliability Analysis Guidelines, NUREG-1921', for estimating human error probabilities (HEPs) for HFEs under fire conditions. NUREG-1921 classifies HFEs into four types associated with the following human actions: - Type 1: New and existing Main Control Room (MCR) actions - Type 2: New and existing ex-MCR actions - Type 3: Actions associated with using alternate shutdown means (ASD) - Type 4: Actions relating to errors of commission (EOCs) or errors of omission (EOOs) as a result of incorrect indications (SPI). In this paper, approaches for the quantification and modeling of HFEs related to Type 1, 2 and 3 human actions are introduced. This paper introduces the human reliability analysis process for a fire PSA of Hanul Unit 3. A multiplier of 10 was used to re-estimate the HEPs for the pre-existing internal human actions. The HEPs for all ex-MCR actions were assumed to be one. New MCR human actions were quantified using the scoping analysis method of NUREG-1921. If a quantified human action was identified as risk-significant, detailed approaches (modeling and quantification) were used to incorporate fire situations into it. Multiple HFEs for a single human action were defined and separately quantified to incorporate the specific fire situations into them. From this study, we can confirm that the modeling as well as the quantification of human actions is very important to appropriately treat them in PSA logic structures.
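The screening adjustments quoted in the abstract (a ×10 multiplier on pre-existing internal-action HEPs, HEP = 1 for all ex-MCR actions) can be sketched as follows; the function name and exact decision structure are assumptions for illustration, not the NUREG-1921 procedure itself:

```python
def fire_adjusted_hep(base_hep, action_type):
    """Screening-level HEP adjustment sketched from the abstract (assumed form)."""
    if action_type == "ex-MCR":
        return 1.0                    # all ex-MCR actions screened at HEP = 1
    return min(10.0 * base_hep, 1.0)  # x10 multiplier, capped at probability 1

print(fire_adjusted_hep(0.02, "MCR"))      # 10x the base HEP
print(fire_adjusted_hep(0.5, "MCR"))       # capped at 1.0
print(fire_adjusted_hep(0.001, "ex-MCR"))  # screened out at 1.0
```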

  19. Accurate material quantification in dual energy CT

    Science.gov (United States)

    Shechter, Gilad; Thran, Axel; Katchalski, Tsvi

    2012-03-01

    Clinical CT applications such as oncology follow-up using iodine maps require accurate contrast agent (CA) quantification within the patient. Unfortunately, due to beam hardening, the quantification of CA materials like iodine in dual energy systems can vary for different patient sizes and surrounding composition. In this paper we present a novel method that handles this problem by properly taking into account the energy-dependent attenuation profile of the CA. Our method is applicable to different dual energy scanners, e.g. fast kVp switching or dual layer detector array, and is fully compatible with image domain material analysis. In this paper we explain the concept of the so-called landmarks used by our method, and give the mathematical formulation of how to calculate them. We demonstrate, by scans of various phantom shapes and by simulations, the robustness and the accuracy of the iodine concentration quantification obtained by our method.

  20. Analysis of New Type Air-conditioning for Loom Based on CFD Simulation and Theory of Statistics

    Directory of Open Access Journals (Sweden)

    Ruiliang Yang

    2011-01-01

    Based on the theory of statistics, the main factors affecting the performance of a loom workshop's large- and small-zone ventilation are analysed with CFD simulation in this paper. Firstly, a four-factor, three-level orthogonal experimental table is applied to the CFD simulation, and the ranking of the four factors from major to minor is obtained, which can provide a theoretical basis for design and operation. Then the single-factor experiment method is applied to the CFD simulation, so the effect of varying one factor can be obtained while the other factors are held at their best levels. Based on the recommended parameters above, CFD software is applied to simulate the relative humidity and PMV at the loom. Lastly, a comparison of the simulation results with experiment is used to verify the feasibility of the simulation results.

  1. Preference for a vanishingly small cosmological constant in supersymmetric vacua in a Type IIB string theory model

    International Nuclear Information System (INIS)

    We study the probability distribution P(Λ) of the cosmological constant Λ in a specific set of KKLT-type models of supersymmetric IIB vacua. We show that, as we sweep through the quantized flux values in this flux compactification, P(Λ) diverges at Λ = 0⁻ and the median magnitude of Λ drops exponentially as the number of complex structure moduli h^{2,1} increases. Also, owing to the hierarchical and approximate no-scale structure, the probability of having a positive Hessian (mass-squared matrix) approaches unity as h^{2,1} increases

  2. Generation of non-classical correlated photon pairs via a ladder-type atomic configuration: theory and experiment

    CERN Document Server

    Ding, Dong-Sheng; Shi, Bao-Sen; Zou, Xu-Bo; Guo, Guang-Can

    2012-01-01

    We experimentally generate a non-classical correlated two-color photon pair at 780 and 1529.4 nm in a ladder-type configuration using a hot 85Rb atomic vapor, with a production rate of ~10^7/s. The non-classical correlation between these two photons is demonstrated by a strong violation of the Cauchy-Schwarz inequality by the factor R = 48 ± 12. Besides, we experimentally investigate the relations between the correlation and some important experimental parameters, such as the single-photon detuning and the pump powers. We also make a detailed theoretical analysis, and the theoretical predictions are in reasonable agreement with our experimental results.

  3. Trace elements quantification in Portuguese red wines

    OpenAIRE

    Santos, Susana Isabel Barros dos

    2011-01-01

    The aim of this thesis is to characterize Portuguese red wines in terms of trace element composition. The wines were chosen so that the whole country was represented and studied. For trace element quantification (As, Hg, Cd, Ni and Pb), various sample treatments were tested, including, for all trace elements, acid digestion and the presence or absence of a spike. The need for H2O2 addition in order to oxidize organic compounds was analyzed for Hg, Cd, Ni and Pb. Quantification of all trace el...

  4. Uncertainty quantification metrics for whole product life cycle cost estimates in aerospace innovation

    Science.gov (United States)

    Schwabe, O.; Shehab, E.; Erkoyuncu, J.

    2015-08-01

    The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates based on a literature review, an evaluation of publicly funded projects such as part of the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defense, the ESA, and various commercial companies. The metrics are categorized based on their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). Insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families with identifiable thresholds for transitioning between them. The key research gaps identified were the lacking guidance grounded in theory for the selection of uncertainty quantification metrics and lacking practical alternatives to metrics based on the Central Limit Theorem.
An innovative uncertainty quantification framework, consisting of a set-theory-based typology, a data library, a classification system, and a corresponding input-output model, is put forward to address this research gap as the basis for future work in this field.

  5. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    Energy Technology Data Exchange (ETDEWEB)

    Paulson, Patrick R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Purohit, Sumit [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rodriguez, Luke R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.

  6. Effective field theory and Ab-initio calculation of p-type (Ga, Fe)N within LDA and SIC approximation

    Energy Technology Data Exchange (ETDEWEB)

    Salmani, E. [LMPHE, associe au CNRST (URAC 12), Faculte des Sciences, Universite Mohammed V-Agdal, Rabat (Morocco); Mounkachi, O. [Institute of Nanomaterials and Nanotechnology, MAScIR, Rabat (Morocco); Ez-Zahraouy, H., E-mail: ezahamid@fsr.ac.ma [LMPHE, associe au CNRST (URAC 12), Faculte des Sciences, Universite Mohammed V-Agdal, Rabat (Morocco); El Kenz, A. [LMPHE, associe au CNRST (URAC 12), Faculte des Sciences, Universite Mohammed V-Agdal, Rabat (Morocco); Hamedoun, M. [Institute of Nanomaterials and Nanotechnology, MAScIR, Rabat (Morocco); Benyoussef, A. [LMPHE, associe au CNRST (URAC 12), Faculte des Sciences, Universite Mohammed V-Agdal, Rabat (Morocco); Institute of Nanomaterials and Nanotechnology, MAScIR, Rabat (Morocco); Hassan II Academy of Science and Technology, Rabat (Morocco)

    2013-03-15

    Based on first-principles spin-density functional calculations, using the Korringa-Kohn-Rostoker method combined with the coherent potential approximation, we investigated the half-metallic ferromagnetic behavior of (Ga, Fe)N co-doped with carbon within the self-interaction-corrected local density approximation. The mechanism of hybridization and interaction between magnetic ions in p-type (Ga, Fe)N is investigated. The stability energy of ferromagnetic and disordered local moment states was calculated for different carbon concentrations. The local density and the self-interaction-corrected approximations have been used to explain the strong ferromagnetic interaction observed and the mechanism that stabilizes this state. The transition temperature to the ferromagnetic state has been calculated within the effective field theory, with a Honmura-Kaneyoshi differential operator technique. - Highlights: • The paper focuses on the magnetic properties and electronic structure of p-type (Ga, Fe)N within the LDA and SIC approximations. • These methods allow us to explain the strong ferromagnetic interaction observed, the mechanism for its stability, and the mechanism of hybridization and interaction between magnetic ions in p-type (Ga, Fe)N. • The results obtained are interesting and can serve as a reference in the field of dilute magnetic semiconductors.

  7. Quantification of Information in a One-Way Plant-to-Animal Communication System

    OpenAIRE

    Doyle, Laurance R.

    2009-01-01

    In order to demonstrate possible broader applications of information theory to the quantification of non-human communication systems, we apply calculations of information entropy to a simple chemical communication from the cotton plant (Gossypium hirsutum) to the wasp (Cardiochiles nigriceps) studied by DeMoraes et al. The purpose of this chemical communication from cotton plants to wasps is presumed to be to allow the predatory wasp to more easily obtain the location of its preferred prey—on...
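The underlying calculation is plain Shannon entropy over observed signal frequencies. A minimal sketch, with a hypothetical symbol stream standing in for the chemical signals:

```python
import math
from collections import Counter

def shannon_entropy(sequence):
    # H = -sum(p_i * log2(p_i)) over the observed symbol frequencies.
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical streams of discrete "signal" symbols:
print(shannon_entropy("AABB"))  # two equiprobable symbols -> 1.0 bit/symbol
print(shannon_entropy("ABCD"))  # four equiprobable symbols -> 2.0 bits/symbol
```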

  8. Initial growth mechanism of atomic layer deposited titanium dioxide using cyclopentadienyl-type precursor: A density functional theory study

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Guangfen [College of Science, Beijing Institute of Technology, Beijing 100081 (China); College of Science, Hebei University of Science and Technology, Shijiazhuang 050018 (China); Ren, Jie, E-mail: renjie@fudan.edu.cn [College of Science, Hebei University of Science and Technology, Shijiazhuang 050018 (China); Zhang, Shaowen [College of Science, Beijing Institute of Technology, Beijing 100081 (China)

    2012-12-01

    The initial reaction mechanism of atomic layer deposited TiO2 thin film on the silicon surface using Cp*Ti(OCH3)3 as the metal precursor has been investigated by using density functional theory. We find that a Cp*Ti(OCH3)3 adsorbed state can be formed via the hydrogen bonding interaction between CH3O ligands and the Si-OH sites, which is in good agreement with the quadrupole mass spectrometry (QMS) experimental observations. Moreover, the desorption of adsorbed Cp*Ti(OCH3)3 is favored in the thermodynamic equilibrium state. The elimination reaction of CH3OH can occur more readily than that of Cp*H during the Cp*Ti(OCH3)3 pulse. This conclusion is also confirmed by the QMS experimental results. - Highlights: • The initial reaction mechanism of atomic layer deposition of TiO2 has been studied. • The Cp*Ti(OCH3)3 adsorbed state on the silicon surface is formed by hydrogen bonds. • The elimination of CH3OH occurs more readily than that of Cp*H in Cp*Ti(OCH3)3. • The Cp*Ti(OCH3)3 adsorbs on the silicon surface via the CH3O ligand.

  9. Entanglement quantification by local unitaries

    OpenAIRE

    Monras, A.; Adesso, G.; Giampaolo, S. M.; Gualdi, G; Davies, G. B.; Illuminati, F.

    2011-01-01

    Invariance under local unitary operations is a fundamental property that must be obeyed by every proper measure of quantum entanglement. However, this is not the only aspect of entanglement theory where local unitaries play a relevant role. In the present work we show that the application of suitable local unitary operations defines a family of bipartite entanglement monotones, collectively referred to as "mirror entanglement". They are constructed by first considering the (...

  10. Stellar convection theory. III - Dynamical coupling of the two convection zones in A-type stars by penetrative motions

    Science.gov (United States)

    Latour, J.; Toomre, J.; Zahn, J.-P.

    1981-01-01

    The thermal convection occurring over many density scale heights in an A-type star outer envelope, encompassing both the hydrogen and helium convectively unstable zones, is examined by means of anelastic modal equations. The single-mode anelastic equations for such compressible convection display strong overshooting of the motions into adjacent radiative zones, which would preclude diffusive separation of elements in the supposedly quiescent region between the two unstable zones. In addition, the anelastic solutions reveal that the two zones of convective instability are dynamically coupled by the overshooting motions. The two solutions that the nonlinear single-mode equations admit for the same horizontal wavelength are distinguished by the sense of the vertical velocity at the center of the three-dimensional cell. It is suggested that strong horizontal shear flows should be present just below the surface of the star, and that the large-scale motions extending into the stable atmosphere would appear mainly as horizontal flows.

  11. Critical review of current and emerging quantification methods for the development of influenza vaccine candidates.

    Science.gov (United States)

    Manceur, Aziza P; Kamen, Amine A

    2015-11-01

    Significant improvements in production and purification have been achieved since the first approved influenza vaccines were administered 75 years ago. Global surveillance and fast response have limited the impact of the last pandemic in 2009. In the case of another pandemic, vaccines can be generated within three weeks with certain platforms. However, our Achilles heel is at the quantification level. Production of reagents for the quantification of new vaccines using the SRID, the main method formally approved by regulatory bodies, requires two to three months. The impact of such delays can be tragic for vulnerable populations. Therefore, efforts have been directed toward developing alternative quantification methods, which are sensitive, accurate, easy to implement and independent of the availability of specific reagents. The use of newly-developed antibodies against a conserved region of hemagglutinin (HA), a surface protein of influenza, holds great promise, as they are able to recognize multiple subtypes of influenza; these new antibodies could be used in immunoassays such as ELISA and slot-blot analysis. HA concentration can also be determined using reversed-phase high performance liquid chromatography (RP-HPLC), which obviates the need for antibodies but still requires a reference standard. The number of viral particles can be evaluated using ion-exchange HPLC and techniques based on flow cytometry principles, but non-viral vesicles have to be taken into account with cellular production platforms. As new production systems are optimized, new quantification methods that are adapted to the type of vaccine produced are required. The nature of these new-generation vaccines might dictate which quantification method to use. In all cases, an alternative method will have to be validated against the current SRID assay.
A consensus among the scientific community would have to be reached so that the adoption of new quantification methods would be harmonized between international laboratories. PMID:26271833

  12. Advancing agricultural greenhouse gas quantification*

    Science.gov (United States)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

    1. Introduction Better information on greenhouse gas (GHG) emissions and mitigation potential in the agricultural sector is necessary to manage these emissions and identify responses that are consistent with the food security and economic development priorities of countries. Critical activity data (what crops or livestock are managed in what way) are poor or lacking for many agricultural systems, especially in developing countries. In addition, the currently available methods for quantifying emissions and mitigation are often too expensive or complex or not sufficiently user friendly for widespread use. The purpose of this focus issue is to capture the state of the art in quantifying greenhouse gases from agricultural systems, with the goal of better understanding our current capabilities and near-term potential for improvement, with particular attention to quantification issues relevant to smallholders in developing countries. This work is timely in light of international discussions and negotiations around how agriculture should be included in efforts to reduce and adapt to climate change impacts, and considering that significant climate financing to developing countries in post-2012 agreements may be linked to their increased ability to identify and report GHG emissions (Murphy et al 2010, CCAFS 2011, FAO 2011). 2. Agriculture and climate change mitigation The main agricultural GHGs—methane and nitrous oxide—account for 10%-12% of anthropogenic emissions globally (Smith et al 2008), or around 50% and 60% of total anthropogenic methane and nitrous oxide emissions, respectively, in 2005. Net carbon dioxide fluxes between agricultural land and the atmosphere linked to food production are relatively small, although significant carbon emissions are associated with degradation of organic soils for plantations in tropical regions (Smith et al 2007, FAO 2012). 
Population growth and shifts in dietary patterns toward more meat and dairy consumption will lead to increased emissions unless we improve production efficiencies and management. Developing countries currently account for about three-quarters of direct emissions and are expected to be the most rapidly growing emission sources in the future (FAO 2011). Reducing agricultural emissions and increasing carbon sequestration in the soil and biomass has the potential to reduce agriculture's contribution to climate change by 5.5-6.0 gigatons (Gt) of carbon dioxide equivalent (CO2eq)/year. Economic potentials, which take into account costs of implementation, range from 1.5 to 4.3 Gt CO2eq/year, depending on marginal abatement costs assumed and financial resources committed, with most of this potential in developing countries (Smith et al 2007). The opportunity for mitigation in agriculture is thus significant, and, if realized, would contribute to making this sector carbon neutral. Yet it is only through a robust and shared understanding of how much carbon can be stored or how much CO2 is reduced from mitigation practices that informed decisions can be made about how to identify, implement, and balance a suite of mitigation practices as diverse as enhancing soil organic matter, increasing the digestibility of feed for cattle, and increasing the efficiency of nitrogen fertilizer applications. Only by selecting a portfolio of options adapted to regional characteristics and goals can mitigation needs be best matched to also serve rural development goals, including food security and increased resilience to climate change. Expansion of agricultural land also remains a major contributor of greenhouse gases, with deforestation, largely linked to clearing of land for cultivation or pasture, generating 80% of emissions from developing countries (Hosonuma et al 2012).
There are clear opportunities for these countries to address mitigation strategies from the forest and agriculture sector, recognizing that agriculture plays a large role in economic and development potential. In this context, multiple development goals can be reinforced by specific climate funding granted on the basis of

  13. Application of "Uncertainty Quantification" to railway dynamics problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele; Engsig-Karup, Allan Peter

    2013-01-01

    The paper describes the results of the application of "Uncertainty Quantification" methods in railway vehicle dynamics. The system parameters are given by probability distributions. The results of the application of the Monte-Carlo and generalized Polynomial Chaos methods to a simple bogie model will be discussed.
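A minimal Monte Carlo sketch of the approach: parameters drawn from assumed probability distributions are pushed through a response function, and output statistics summarize the uncertainty. The surrogate model, parameter names, and distributions below are invented placeholders for a real bogie dynamics model; the generalized Polynomial Chaos variant is not shown:

```python
import math
import random
import statistics

def critical_speed(stiffness, damping):
    # Invented surrogate for a bogie dynamics model; a real study would call
    # a multibody simulation here.
    return 50.0 + 0.4 * math.sqrt(stiffness) - 2.0 * damping

random.seed(42)
samples = []
for _ in range(20000):
    k = random.gauss(4.0e3, 2.0e2)  # suspension stiffness: assumed Gaussian
    c = random.gauss(5.0, 0.5)      # damping: assumed Gaussian
    samples.append(critical_speed(k, c))

mean = statistics.fmean(samples)
sd = statistics.stdev(samples)
print(f"critical speed: {mean:.1f} +/- {sd:.1f}")
```

Monte Carlo converges slowly (error ~ 1/√N) but makes no smoothness assumptions; generalized Polynomial Chaos trades that generality for far fewer model evaluations when the response is smooth in the random parameters.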

  14. Recurrence quantification analysis in Liu's attractor

    International Nuclear Information System (INIS)

    Recurrence Quantification Analysis is used to detect transitions from chaos to periodic states or from chaos to chaos in a new dynamical system proposed by Liu et al. This system contains a control parameter in the second equation and was originally introduced to investigate the forming mechanism of the compound structure of the chaotic attractor, which exists when the control parameter is zero
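A bare-bones recurrence quantification sketch: build the recurrence matrix for a scalar series and compute the recurrence rate, the simplest RQA measure. The threshold, series, and names are illustrative; transitions such as chaos-to-periodic show up as changes in such measures as the control parameter varies:

```python
import math

def recurrence_matrix(series, eps):
    # R[i][j] = 1 when states i and j lie within eps of each other.
    n = len(series)
    return [[1 if abs(series[i] - series[j]) < eps else 0 for j in range(n)]
            for i in range(n)]

def recurrence_rate(R):
    # Fraction of recurrent points in the plot: the simplest RQA quantifier.
    n = len(R)
    return sum(map(sum, R)) / (n * n)

# A periodic signal recurs frequently; a chaotic one much less so at the same eps.
periodic = [math.sin(0.5 * i) for i in range(100)]
print(round(recurrence_rate(recurrence_matrix(periodic, 0.1)), 3))
```

Full RQA adds measures built from diagonal and vertical line structures (determinism, laminarity), but all of them start from this thresholded distance matrix.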

  15. Noninvasive Quantification of Pancreatic Fat in Humans

    OpenAIRE

    Lingvay, Ildiko; Esser, Victoria; Legendre, Jaime L.; Price, Angela L; Wertz, Kristen M.; Adams-Huet, Beverley; Zhang, Song; Unger, Roger H.; Szczepaniak, Lidia S.

    2009-01-01

    Objective: To validate magnetic resonance spectroscopy (MRS) as a tool for non-invasive quantification of pancreatic triglyceride (TG) content and to measure the pancreatic TG content in a diverse human population with a wide range of body mass index (BMI) and glucose control.

  16. Cues, quantification, and agreement in language comprehension.

    Science.gov (United States)

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension. PMID:25987192

  17. Stellar convection theory. III. Dynamical coupling of the two convection zones in A -type stars by penetrative motions

    International Nuclear Information System (INIS)

    Anelastic modal equations are used to examine thermal convection occurring over many density scale heights in the entire outer envelope of an A-type star, encompassing both the hydrogen and helium convectively unstable zones. Single-mode anelastic solutions for such compressible convection display strong overshooting of the motions into adjacent radiative zones. Such mixing would preclude diffusive separation of elements in the supposedly quiescent region between the two unstable zones. Indeed, the anelastic solutions reveal that the two zones of convective instability are dynamically coupled by the overshooting motions. The nonlinear single-mode equations admit two solutions for the same horizontal wavelength, and these are distinguished by the sense of the vertical velocity at the center of the three-dimensional cell. The upward directed flows experience large pressure effects when they penetrate into regions where the vertical scale height has become small compared to their horizontal scale. The fluctuating pressure can modify the density fluctuations so that the sense of the buoyancy force is changed, with buoyancy braking actually achieved near the top of the convection zone, even though the mean stratification is still superadiabatic. The pressure and buoyancy work there serves to decelerate the vertical motions and deflect them laterally, leading to strong horizontal shearing motions. Thus the shallow but highly unstable hydrogen ionization zone may serve to prevent convection with a horizontal scale comparable to supergranulation from getting through into the atmosphere with any significant portion of its original momentum. This suggests that strong horizontal shear flows should be present just below the surface of the star, and similarly that the large-scale motions extending into the stable atmosphere would appear mainly as horizontal flows.

  18. Toward a theory of distinct types of "impulsive" behaviors: A meta-analysis of self-report and behavioral measures.

    Science.gov (United States)

    Sharma, Leigh; Markon, Kristian E; Clark, Lee Anna

    2014-03-01

    Impulsivity is considered a personality trait affecting behavior in many life domains, from recreational activities to important decision making. When extreme, it is associated with mental health problems, such as substance use disorders, as well as with interpersonal and social difficulties, including juvenile delinquency and criminality. Yet, trait impulsivity may not be a unitary construct. We review commonly used self-report measures of personality trait impulsivity and related constructs (e.g., sensation seeking), plus the opposite pole, control or constraint. A meta-analytic principal-components factor analysis demonstrated that these scales comprise 3 distinct factors, each of which aligns with a broad, higher-order personality factor: Neuroticism/Negative Emotionality, Disinhibition versus Constraint/Conscientiousness, and Extraversion/Positive Emotionality/Sensation Seeking. Moreover, Disinhibition versus Constraint/Conscientiousness comprises 2 correlated but distinct subfactors: Disinhibition versus Constraint and Conscientiousness/Will versus Resourcelessness. We also review laboratory tasks that purport to measure a construct similar to trait impulsivity. A meta-analytic principal-components factor analysis demonstrated that these tasks constitute 4 factors (Inattention, Inhibition, Impulsive Decision-Making, and Shifting). Although relations between these 2 measurement models are consistently low to very low, relations between both trait scales and laboratory behavioral tasks and daily-life impulsive behaviors are moderate. That is, both independently predict problematic daily-life impulsive behaviors, such as substance use, gambling, and delinquency; their joint use has incremental predictive power over the use of either type of measure alone and furthers our understanding of these important, problematic behaviors. Future use of confirmatory methods should help to ascertain with greater precision the number of and relations between impulsivity-related components. PMID:24099400

  19. String Theory

    Science.gov (United States)

    Polchinski, Joseph

    1998-10-01

    Volume 2: Superstring Theory and Beyond, begins with an introduction to supersymmetric string theories and goes on to a broad presentation of the important advances of recent years. The book first introduces the type I, type II, and heterotic superstring theories and their interactions. It then goes on to present important recent discoveries about strongly coupled strings, beginning with a detailed treatment of D-branes and their dynamics, and covering string duality, M-theory, black hole entropy, and many classic results in conformal field theory. The final four chapters are concerned with four-dimensional string theories, and have two goals: to show how some of the simplest string models connect with previous ideas for unifying the Standard Model; and to collect many important and beautiful general results on world-sheet and spacetime symmetries.

  20. Quantification of AS and AR

    OpenAIRE

    Mehta Yatin; Singh Rajni

    2009-01-01

    Trans-esophageal echocardiography (TEE) is routinely used in valvular surgery in most institutions. The popularity of TEE stems from the fact that it can supplement or confirm information gained from other methods of evaluation or make completely independent diagnoses. Quantitative and qualitative assessment permits informed decisions regarding surgical intervention, type of intervention, correction of inadequate surgical repair and re-operation for complications. This review summarizes the v...

  1. Perturbation theory

    International Nuclear Information System (INIS)

    After noting some advantages of using perturbation theory, some of the various types are related on a chart and described, including many-body nonlinear summations, quartic force-field fit for geometry, fourth-order correlation approximations, and a survey of some recent work. Alternative initial approximations in perturbation theory are also discussed. 25 references

  2. Aerobic physical activity and resistance training: an application of the theory of planned behavior among adults with type 2 diabetes in a random, national sample of Canadians

    Directory of Open Access Journals (Sweden)

    Karunamuni Nandini

    2008-12-01

    Full Text Available Abstract Background Aerobic physical activity (PA) and resistance training are paramount in the treatment and management of type 2 diabetes (T2D), but few studies have examined the determinants of both types of exercise in the same sample. Objective The primary purpose was to investigate the utility of the Theory of Planned Behavior (TPB) in explaining aerobic PA and resistance training in a population sample of T2D adults. Methods A total of 244 individuals were recruited through a random national sample which was created by generating a random list of household phone numbers. The list was proportionate to the actual number of household telephone numbers for each Canadian province (with the exception of Quebec). These individuals completed self-report TPB constructs of attitude, subjective norm, perceived behavioral control and intention, and a 3-month follow-up that assessed aerobic PA and resistance training. Results TPB explained 10% and 8% of the variance, respectively, for aerobic PA and resistance training; and accounted for 39% and 45% of the variance, respectively, for aerobic PA and resistance training intentions. Conclusion These results may guide the development of appropriate PA interventions for aerobic PA and resistance training based on the TPB.
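    Variance-explained figures of this kind come from regressing intention on the TPB constructs. A sketch with synthetic standardized data (the coefficients and data values are invented for illustration; only the sample size of 244 comes from the abstract):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 244  # sample size matching the study

# Hypothetical standardized TPB constructs (synthetic, illustrative only)
attitude = rng.standard_normal(n)
subjective_norm = rng.standard_normal(n)
pbc = rng.standard_normal(n)  # perceived behavioural control
intention = (0.4 * attitude + 0.2 * subjective_norm + 0.3 * pbc
             + rng.standard_normal(n))  # noise term limits explained variance

# Ordinary least squares with an intercept column
X = np.column_stack([np.ones(n), attitude, subjective_norm, pbc])
beta, *_ = np.linalg.lstsq(X, intention, rcond=None)
pred = X @ beta
r2 = 1 - ((intention - pred) ** 2).sum() / ((intention - intention.mean()) ** 2).sum()
print(beta[1:].round(2), round(r2, 2))
```

    The R² from such a fit is the "percent of variance explained" reported for the intention models.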

  3. Entanglement quantification by local unitaries

    CERN Document Server

    Monras, A; Giampaolo, S M; Gualdi, G; Davies, G B; Illuminati, F

    2011-01-01

    Invariance under local unitary operations is a fundamental property that must be obeyed by every proper measure of quantum entanglement. However, this is not the only aspect of entanglement theory where local unitaries play a relevant role. In the present work we show that the application of suitable local unitary operations defines a family of bipartite entanglement monotones, collectively referred to as "shield entanglement". They are constructed by first considering the (squared) Hilbert–Schmidt distance of the state from the set of states obtained by applying to it a given local unitary. To the action of each different local unitary there corresponds a different distance. We then minimize these distances over the sets of local unitaries with different spectra, obtaining an entire family of different entanglement monotones. We show that these shield entanglement monotones are organized in a hierarchical structure, and we establish the conditions that need to be imposed on the spectrum of a local unitary f...

  4. Molecular identification and quantification of human rhinoviruses in respiratory samples.

    Science.gov (United States)

    Lee, Wai-Ming; Grindle, Kris; Vrtis, Rose; Pappas, Tressa; Vang, Fue; Lee, Iris; Gern, James E

    2015-01-01

    PCR-based molecular assays have become standard diagnostic procedures for the identification and quantification of human rhinoviruses (HRVs) and other respiratory pathogens in most, if not all, clinical microbiology laboratories. Molecular assays are significantly more sensitive than traditional culture-based and serological methods. This advantage has led to the recognition that HRV infections are common causes for not only upper airway symptoms but also more severe lower respiratory illnesses. In addition, molecular assays improve turnaround time, can be performed by technicians with ordinary skills, and can easily be automated. This chapter describes two highly sensitive and specific PCR-based methods for identifying and quantifying HRVs. The first is a two-step PCR method for the detection and typing of HRV. The second is a pan-HRV real-time quantitative (q) PCR method for measuring viral loads in respiratory samples. PMID:25261304

  5. Defect and damage evolution quantification in dynamically-deformed metals using orientation-imaging microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Gray, George T., III [Los Alamos National Laboratory; Livescu, Veronica [Los Alamos National Laboratory; Cerreta, Ellen K [Los Alamos National Laboratory

    2010-03-18

    Orientation-imaging microscopy offers unique capabilities to quantify the defects and damage evolution occurring in metals following dynamic and shock loading. Examples of the quantification of the types of deformation twins activated, volume fraction of twinning, and damage evolution as a function of shock loading in Ta are presented. Electron back-scatter diffraction (EBSD) examination of the damage evolution in sweeping-detonation-wave shock loading to study spallation in Cu is also presented.

  6. Cross recurrence quantification analysis of indefinite anaphora in Swedish dialog : an eye-tracking pilot experiment

    OpenAIRE

    Diderichsen, Philip

    2006-01-01

    A new method is used in an eye-tracking pilot experiment which shows that it is possible to detect differences in common ground associated with the use of minimally different types of indefinite anaphora. Following Richardson and Dale (2005), cross recurrence quantification analysis (CRQA) was used to show that the tandem eye movements of two Swedish-speaking interlocutors are slightly more coupled when they are using fully anaphoric indefinite expressions ...

  7. Synthesis of nanodiamond derivatives carrying amino functions and quantification by a modified Kaiser test

    OpenAIRE

    Jarre, Gerald; Heyer, Steffen; Memmel, Elisabeth; Meinhardt, Thomas; Krueger, Anke

    2014-01-01

    Nanodiamonds functionalized with different organic moieties carrying terminal amino groups have been synthesized. These include conjugates generated by Diels–Alder reactions of ortho-quinodimethanes formed in situ from pyrazine and 5,6-dihydrocyclobuta[d]pyrimidine derivatives. For the quantification of primary amino groups a modified photometric assay based on the Kaiser test has been developed and validated for different types of aminated nanodiamond. The results correspond well to values o...

  8. Quantification of atherosclerosis with MRI

    International Nuclear Information System (INIS)

    Cardiovascular disease due to atherosclerosis is a major cause of death in the United States. A major limitation in the current treatment of atherosclerosis is the lack of a quantitative means to non-invasively evaluate the extent of the disease. Recent studies suggest that Magnetic Resonance Imaging (MRI) has the potential for the detection of atherosclerotic plaque. It has been demonstrated that multi-dimensional pattern recognition can be applied to multi-pulse sequence MR images to identify different tissue types. The authors reported the identification of tissues involved in the atherosclerotic disease process, such as normal endothelium, smooth muscle, thrombus, fat or lipid, connective tissue and calcified plaque. The work reported in this abstract presents preliminary results of applying quantitative 3-D reconstruction to the problem of identifying and quantifying atherosclerotic plaque in vitro

  9. SPECT quantification of regional radionuclide distributions

    International Nuclear Information System (INIS)

    SPECT quantification of regional radionuclide activities within the human body is affected by several physical and instrumental factors including attenuation of photons within the patient, Compton scattered events, the system's finite spatial resolution and object size, finite number of detected events, partial volume effects, the radiopharmaceutical biokinetics, and patient and/or organ motion. Furthermore, other instrumentation factors such as calibration of the center-of-rotation, sampling, and detector nonuniformities will affect the SPECT measurement process. These factors are described, together with examples of compensation methods that are currently available for improving SPECT quantification. SPECT offers the potential to improve in vivo estimates of absorbed dose, provided the acquisition, reconstruction, and compensation procedures are adequately implemented and utilized. 53 references, 2 figures

  10. Sulphite quantification on damaged stones and mortars

    Science.gov (United States)

    Gobbi, G.; Zappia, G.; Sabbioni, C.

    An analytical procedure was developed for the simultaneous identification and quantification of the sulphite and main anions found in degradation patinas on historic buildings and monuments, as well as on stones and mortars exposed in simulation chamber and field tests. The quantification of anions was performed by means of ion chromatography (IC), after the stabilisation of sulphite with a D(-) fructose solution. The utilisation of two different chromatographic columns, connected in series, allowed the simultaneous determination of fluoride, acetate, formate, chloride, nitrite, bromide, iodide, oxyhalides, nitrate, phosphate, sulphite, sulphate and oxalate, in a time of approximately 25 min, without interference and with high reproducibility. Finally, the results show how in the majority of cases the formation of sulphite is an intermediate stage in the sulphation process affecting building materials exposed to the environment and needs to be measured together with sulphate, in order to obtain a correct interpretation of degradation mechanisms on such materials.

  11. Uncertainty Quantification with Applications to Engineering Problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele

    2015-01-01

    The systematic quantification of the uncertainties affecting dynamical systems and the characterization of the uncertainty of their outcomes is critical for engineering design and analysis, where risks must be reduced as much as possible. Uncertainties stem naturally from our limitations in measurements, predictions and manufacturing, and we can say that any dynamical system used in engineering is subject to some of these uncertainties. The first part of this work presents an overview of the mat...

  12. Quantification of water uptake by soot particles

    International Nuclear Information System (INIS)

    Quantification of atmospheric processes including the water uptake by soot particles of various origin, emitted from different sources, requires identification of hydrophobic and hydrophilic soot. Water uptake measurements are performed on well-characterized laboratory soots available for atmospheric studies. Comparative analysis of water adsorption isotherms on soots of various compositions allows us to suggest a concept of quantification. Systematic analysis demonstrates two mechanisms of water/soot interaction, namely, bulk dissolution into soot-water-soluble coverage (absorption mechanism) and water molecule adsorption on surface active sites (adsorption mechanism). The formation of water film extended over the surface is suggested as a quantification measure which separates hygroscopic from non-hygroscopic soot. Water uptake on hygroscopic soot takes place by the absorption mechanism: it significantly exceeds the formation of many surface layers. If soot particles are made mostly from elemental carbon and/or are covered by a water-insoluble organic layer, they are classified as non-hygroscopic. Low water adsorption on some active sites following cluster formation is a typical mechanism of water interaction with hydrophobic soot. If a water film extended over the surface is formed due to the cluster confluence it is suggested that soot is hydrophilic. A few classical models are applied for parameterization of water interactions on hydrophilic and hydrophobic soots

  13. Extending Existential Quantification in Conjunctions of BDDs

    Directory of Open Access Journals (Sweden)

    Sean A. Weaver

    2006-06-01

    Full Text Available We introduce new approaches intended to speed up determining the satisfiability of a given Boolean formula φ expressed as a conjunction of Boolean functions. A common practice in such cases, when using constraint-oriented methods, is to represent the functions as BDDs, then repeatedly cluster BDDs containing one or more variables, and finally existentially quantify those variables away from the cluster. Clustering is essential because, in general, existential quantification cannot be applied unless the variables occur in only a single BDD. But, clustering incurs significant overhead and may result in BDDs that are too big to allow the process to complete in a reasonable amount of time. There are two significant contributions in this paper. First, we identify elementary conditions under which the existential quantification of a subset of variables V may be distributed over all BDDs without clustering. We show that when these conditions are satisfied, safe assignments to the variables of V are automatically generated. This is significant because these assignments can be applied, as though they were inferences, to simplify φ. Second, some efficient operations based on these conditions are introduced and can be integrated into existing frameworks of both search-oriented and constraint-oriented methods of satisfiability. All of these operations are relaxations in the use of existential quantification and therefore may fail to find one or more existing safe assignments. Finally, we compare and contrast the relationship of these operations to autarkies and present some preliminary results.
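    The simplest such distribution condition can be seen with plain truth-table functions: when a variable occurs in the support of only one conjunct, quantifying it away distributes over the conjunction, so no clustering is needed. A sketch (functions as Python predicates, not actual BDD nodes):

```python
from itertools import product

# A "function" is (support_vars, predicate over an assignment dict).
def exists(var, fn):
    """Existential quantification: (∃var. f)(a) = f(a[var=0]) or f(a[var=1])."""
    vars_, f = fn
    new_vars = tuple(v for v in vars_ if v != var)
    return (new_vars, lambda a: f({**a, var: 0}) or f({**a, var: 1}))

def conj(fn1, fn2):
    """Conjunction of two functions; support is the union of supports."""
    vars1, f1 = fn1
    vars2, f2 = fn2
    return (tuple(dict.fromkeys(vars1 + vars2)), lambda a: f1(a) and f2(a))

def truth_table(fn):
    vars_, f = fn
    return {bits: bool(f(dict(zip(vars_, bits))))
            for bits in product((0, 1), repeat=len(vars_))}

# f depends on x and y; g depends on y and z; x occurs only in f.
f = (("x", "y"), lambda a: a["x"] ^ a["y"])
g = (("y", "z"), lambda a: a["y"] and a["z"])

lhs = exists("x", conj(f, g))   # quantify after clustering f and g
rhs = conj(exists("x", f), g)   # distribute: no clustering needed
print(truth_table(lhs) == truth_table(rhs))
```

    Since x appears in no other conjunct, ∃x.(f ∧ g) = (∃x.f) ∧ g; BDD packages exploit exactly this to avoid building the large clustered BDD.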

  14. Assessment of Factors Affecting Self-Care Behavior Among Women With Type 2 Diabetes in Khoy City Diabetes Clinic Using the Extended Theory of Reasoned Action

    Directory of Open Access Journals (Sweden)

    Ebrahim Hajizadeh

    2011-11-01

    Full Text Available Background and Aim: Many studies show that the only way to control diabetes and prevent its debilitating complications is continuous self-care. This study aimed to determine factors affecting self-care behavior of diabetic women in Khoy City, Iran based on the extended theory of reasoned action (ETRA). Materials and Methods: A sample of 352 women with type 2 diabetes referring to a Diabetes Clinic in Khoy City in West Azarbaijan Province, Iran participated in the study. Appropriate instruments were designed to measure the relevant variables (diabetes knowledge, personal beliefs, subjective norm, self-efficacy and behavioral intention, and self-care behavior) based on ETRA. Reliability and validity of the instruments were determined prior to the study. Statistical analysis of the data was done using the SPSS version 16 software. Results: Based on the data obtained, the proposed model could predict and explain 41% and 26.2% of the variance of behavioral intention and self-care, respectively, in women with type-2 diabetes. The data also indicated that among the constructs of the model, perceived self-efficacy was the strongest predictor of intention for self-care behavior. This construct affected self-care behavior both directly and indirectly. The next strongest predictors were attitudes, social pressures, social norms, and intervals between visits by the treating team. Conclusion: The proposed model can predict self-care behavior very well. Thus, it may form the basis for educational interventions aiming at promoting self-care and, ultimately, controlling diabetes.

  15. Characterization and LC-MS/MS based quantification of hydroxylated fullerenes

    Science.gov (United States)

    Chao, Tzu-Chiao; Song, Guixue; Hansmeier, Nicole; Westerhoff, Paul; Herckes, Pierre; Halden, Rolf U.

    2011-01-01

    Highly water-soluble hydroxylated fullerene derivatives are being investigated for a wide range of commercial products as well as for potential cytotoxicity. However, no analytical methods are currently available for their quantification at sub-ppm concentrations in environmental matrices. Here, we report on the development and comparison of liquid chromatography-ultraviolet/visible spectroscopy (LC-UV/vis) and mass spectrometry (LC-MS) based detection and quantification methods for a commercial fullerol. We achieved good separation efficiency using an amide-type hydrophilic interaction liquid chromatography (HILIC) column (plate number >2000) under isocratic conditions with 90% acetonitrile as the mobile phase. The method detection limits (MDLs) ranged from 42.8 ng/mL (UV detection) to 0.19 pg/mL (using MS with multiple reaction monitoring, MRM). Other MS measurement modes achieved MDLs of 125 pg/mL (single quad scan, Q1) and 1.5 pg/mL (multiple ion monitoring, MI). Each detection method exhibited a good linear response over several orders of magnitude. Moreover, we tested the robustness of these methods in the presence of Suwannee River fulvic acids (SRFA) as an example of organic matter commonly found in environmental water samples. While SRFA significantly interfered with UV- and Q1-based quantifications, the interference was relatively low using MI or MRM (relative error in presence of SRFA: 8.6% and 2.5%, respectively). This first report of a robust MS-based quantification method for modified fullerenes dissolved in water suggests the feasibility of implementing MS techniques more broadly for identification and quantification of fullerols and other water-soluble fullerene derivatives in environmental samples. PMID:21294534
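    For context, method detection limits of this kind are commonly estimated from a calibration line; one standard (ICH-style) estimate is MDL ≈ 3.3·σ/slope, where σ is the residual standard deviation of the fit. A sketch on hypothetical calibration data (the abstract does not state which MDL formula or calibration levels were actually used):

```python
import numpy as np

# Hypothetical calibration: concentration (ng/mL) vs. detector signal
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
rng = np.random.default_rng(7)
signal = 1000.0 * conc + 50.0 + rng.normal(0, 20.0, conc.size)  # synthetic noise

slope, intercept = np.polyfit(conc, signal, 1)
resid = signal - (slope * conc + intercept)
sigma = resid.std(ddof=2)          # 2 fitted parameters consume 2 dof
mdl = 3.3 * sigma / slope          # ICH-style detection limit estimate
print(round(mdl, 3))
```

    The sensitivity gap between UV and MRM detection in the abstract reflects the much smaller residual noise σ of the MS calibration at equal slope.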

  16. Protocol for Quantification of Defects in Natural Fibres for Composites

    DEFF Research Database (Denmark)

    Mortensen, Ulrich Andreas; Madsen, Bo

    2014-01-01

    Natural bast-type plant fibres are attracting increasing interest for being used for structural composite applications where high quality fibres with good mechanical properties are required. A protocol for the quantification of defects in natural fibres is presented. The protocol is based on the experimental method of optical microscopy and the image analysis algorithms of the seeded region growing method and Otsu’s method. The use of the protocol is demonstrated by examining two types of differently processed flax fibres to give mean defect contents of 6.9 and 3.9%, a difference which is tested to be statistically significant. The protocol is evaluated with respect to the selection of image analysis algorithms, and Otsu’s method is found to be a more appropriate method than the alternative coefficient of variation method. The traditional way of defining defect size by area is compared to the definition of defect size by width, and it is shown that both definitions can be used to give unbiased findings for the comparison between fibre types. Finally, considerations are given with respect to true measures of defect content, number of determinations, and number of significant figures used for the descriptive statistics.
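    The thresholding step of such a protocol can be sketched with Otsu's method, which picks the grey level maximizing the between-class variance of the image histogram. The image below is synthetic (bright fibre, dark defects, roughly 5% defect area), not the flax data from the study:

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: pick the grey level maximizing between-class variance."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                  # class-0 probability up to each level
    mu = np.cumsum(p * np.arange(256))    # cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    sigma_b = np.nan_to_num(sigma_b)      # empty classes contribute nothing
    return int(np.argmax(sigma_b))

rng = np.random.default_rng(1)
fibre = rng.normal(200, 10, size=(100, 100))    # bright fibre background
defects = rng.normal(60, 10, size=(100, 100))   # dark defect intensities
mask = rng.random((100, 100)) < 0.05            # ~5% of pixels are defects
img = np.clip(np.where(mask, defects, fibre), 0, 255).astype(np.uint8)

t = otsu_threshold(img)
defect_fraction = 100.0 * (img <= t).mean()     # defect content in % area
print(t, round(defect_fraction, 1))
```

    On such a bimodal image the threshold falls between the two intensity modes, and the area fraction of below-threshold pixels recovers the simulated defect content, mirroring the area-based defect definition discussed in the protocol.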

  17. Development of a VHH-Based Erythropoietin Quantification Assay

    DEFF Research Database (Denmark)

    Kol, Stefan; Beuchert Kallehauge, Thomas; Adema, Simon; Hermans, Pim

    2015-01-01

    Erythropoietin (EPO) quantification during cell line selection and bioreactor cultivation has traditionally been performed with ELISA or HPLC. As these techniques suffer from several drawbacks, we developed a novel EPO quantification assay. A camelid single-domain antibody fragment directed against human EPO was evaluated as a capturing antibody in a label-free biolayer interferometry-based quantification assay. Human recombinant EPO can be specifically detected in Chinese hamster ovary cell sup...

  18. ["A Little Bit of Switzerland, a Little Bit of Kosovo". Swiss Immigrants from Former Yugoslavia with Type 2 Diabetes. A Qualitative Study in Analogy to Grounded Theory].

    Science.gov (United States)

    Wenger, A; Mischke, C

    2015-10-01

    Type 2 diabetes is on the increase among Swiss immigrants. The cultural background of patients presents new linguistic and sociocultural barriers and gains in importance for health care. In order to develop patient-centred care, it is necessary to focus on different sociocultural aspects in the everyday life and experiences of immigrants from the former republics of Yugoslavia with diabetes, who have rarely been studied in Switzerland. Based on these insights, needs for counselling can be identified and nursing interventions designed accordingly. Using the Grounded Theory approach, 5 interviews were analysed according to the Corbin and Strauss coding paradigm. The central phenomenon found is the experience of living in 2 different cultures. The complexity arises from the tension of living in 2 cultural backgrounds at the same time. It turns out that in the country of origin the immigrants adjust their disease management. The changing daily rhythm and the more traditional role model affect aspects of their disease management such as diet and/or drug therapy. The different strategies impact the persons' roles, emotions, their everyday lives and their families. The study provides an insight into the perspective of Swiss immigrants from the former republics of Yugoslavia suffering from diabetes. Many questions are still unanswered and further research will be required. PMID:26270044

  19. Secular Dynamics of S-type Planetary Orbits in Binary Star Systems: Applicability Domains of First- and Second-Order Theories

    CERN Document Server

    Andrade-Ines, Eduardo; Michtchenko, Tatiana; Robutel, Philippe

    2015-01-01

    We analyse the secular dynamics of planets on S-type coplanar orbits in tight binary systems, based on first- and second-order analytical models, and compare their predictions with full N-body simulations. The perturbation parameter adopted for the development of these models depends on the masses of the stars and on the semimajor axis ratio between the planet and the binary. We show that each model has both advantages and limitations. While the first-order analytical model is algebraically simple and easy to implement, it is only applicable in regions of the parameter space where the perturbations are sufficiently small. The second-order model, although more complex, has a larger range of validity and must be taken into account for dynamical studies of some real exoplanetary systems such as γ-Cephei and HD 41004A. However, in some extreme cases, neither of these analytical models yields quantitatively correct results, requiring either higher-order theories or direct numerical simulations. Finally, we ...

  20. Quantification of Uncertainties in Integrated Spacecraft System Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective for the Phase II effort will be to develop a comprehensive, efficient, and flexible uncertainty quantification (UQ) framework implemented within a...

  1. Quantification of Uncertainties in Integrated Spacecraft System Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort is to investigate a novel uncertainty quantification (UQ) approach based on non-intrusive polynomial chaos (NIPC) for computationally efficient...

  2. Efficient Quantification of Uncertainties in Complex Computer Code Results Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal addresses methods for efficient quantification of margins and uncertainties (QMU) for models that couple multiple, large-scale commercial or...

  3. Aerodynamic Modeling with Heterogeneous Data Assimilation and Uncertainty Quantification Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. proposes to develop an aerodynamic modeling tool that assimilates data from different sources and facilitates uncertainty quantification. The...

  4. Efficient Quantification of Uncertainties in Complex Computer Code Results Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Propagation of parameter uncertainties through large computer models can be very resource intensive. Frameworks and tools for uncertainty quantification are...

  5. Development of a VHH-Based Erythropoietin Quantification Assay

    DEFF Research Database (Denmark)

    Kol, Stefan; Beuchert Kallehauge, Thomas

    2015-01-01

    Erythropoietin (EPO) quantification during cell line selection and bioreactor cultivation has traditionally been performed with ELISA or HPLC. As these techniques suffer from several drawbacks, we developed a novel EPO quantification assay. A camelid single-domain antibody fragment directed against human EPO was evaluated as a capturing antibody in a label-free biolayer interferometry-based quantification assay. Human recombinant EPO can be specifically detected in Chinese hamster ovary cell supernatants in a sensitive and pH-dependent manner. This method enables rapid and robust quantification of EPO in a high-throughput setting.

  6. Multiparty Symmetric Sum Types

    DEFF Research Database (Denmark)

    Nielsen, Lasse; Yoshida, Nobuko; Honda, Kohei

    2010-01-01

    This paper introduces a new theory of multiparty session types based on symmetric sum types, by which we can type non-deterministic orchestration choice behaviours. While the original branching type in session types can represent a choice made by a single participant and accepted by others determining how the session proceeds, the symmetric sum type represents a choice made by agreement among all the participants of a session. Such behaviour can be found in many practical systems, including coll...

  7. Using psychological theory to understand the clinical management of type 2 diabetes in Primary Care: a comparison across two European countries

    Directory of Open Access Journals (Sweden)

    Johnston Marie

    2009-08-01

Abstract Background Long-term management of patients with Type 2 diabetes is well established within Primary Care. However, despite extensive efforts to implement high-quality care, both service provision and patient health outcomes remain sub-optimal. Several recent studies suggest that psychological theories about individuals' behaviour can provide a valuable framework for understanding generalisable factors underlying health professionals' clinical behaviour. In the context of the team management of chronic disease such as diabetes, however, the application of such models is less well established. The aim of this study was to identify motivational factors underlying health professional teams' clinical management of diabetes using a psychological model of human behaviour. Methods A predictive questionnaire based on the Theory of Planned Behaviour (TPB) investigated health professionals' (HPs') cognitions (e.g., beliefs, attitudes and intentions) about the provision of two aspects of care for patients with diabetes: prescribing statins and inspecting feet. General practitioners and practice nurses in England and the Netherlands completed parallel questionnaires, cross-validated for equivalence in English and Dutch. Behavioural data were practice-level patient-reported rates of foot examination and use of statin medication. Relationships between the cognitive antecedents of behaviour proposed by the TPB and healthcare teams' clinical behaviour were explored using multiple regression. Results In both countries, attitude and subjective norm were important predictors of health professionals' intention to inspect feet (Attitude: beta = .40; Subjective Norm: beta = .28; Adjusted R² = .34) and to prescribe statins (Adjusted R² = .40). Conclusion Using the TPB, we identified modifiable factors underlying health professionals' intentions to perform two clinical behaviours, providing a rationale for the development of targeted interventions.
However, we did not observe a relationship between health professionals' intentions and our proxy measure of team behaviour. Significant methodological issues were highlighted concerning the use of models of individual behaviour to explain behaviours performed by teams. In order to investigate clinical behaviours performed by teams it may be necessary to develop measures that reflect the collective cognitions of the members of the team to facilitate the application of these theoretical models to team behaviours.
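The regression step described above, predicting intention from TPB constructs and comparing standardized beta weights, can be sketched as follows. The variable names mirror the abstract, but the data and coefficients are synthetic, purely for illustration:

```python
import numpy as np

def standardized_betas(X, y):
    """Standardized regression weights (betas): z-score the predictors and
    the outcome, then solve ordinary least squares."""
    Xz = (X - X.mean(axis=0)) / X.std(axis=0)
    yz = (y - y.mean()) / y.std()
    beta, *_ = np.linalg.lstsq(Xz, yz, rcond=None)
    return beta

# Synthetic data: intention driven by attitude and subjective norm.
rng = np.random.default_rng(0)
attitude = rng.normal(size=200)
subjective_norm = rng.normal(size=200)
intention = 0.5 * attitude + 0.3 * subjective_norm + rng.normal(scale=0.5, size=200)

betas = standardized_betas(np.column_stack([attitude, subjective_norm]), intention)
```

Because both sides are standardized, the betas are directly comparable in magnitude, which is what licenses statements such as "attitude was a stronger predictor than subjective norm".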

  8. DNA quantification by real time PCR and short tandem repeat (STR) amplification results

    Directory of Open Access Journals (Sweden)

    Zoppis S

    2012-11-01

Determining the DNA amount in a forensic sample is fundamental for PCR-based analyses: on the one hand, an excessive amount of template may cause the appearance of additional or out-of-scale peaks; on the other, a low quantity can give rise to stochastic phenomena affecting the PCR reaction and the subsequent interpretation of typing results. In the common practice of forensic genetics laboratories, the quantification results provided by Real Time PCR (qPCR) act as the "boundary line" determining whether a given DNA sample is subjected to the subsequent analytical steps, on the basis of an optimal amount of DNA in the range indicated by the manufacturer of the specific commercial kit. However, some studies have shown the possibility of obtaining STR typing results even with an extremely low DNA concentration or, paradoxically, one equal to zero (1). Regardless of the amount of DNA used for the quantification of the testing sample, specific software is able to use the standard curve to calculate concentration values far below the manufacturer's reported optimal detection limit (0.023 ng/µL). Consequently, laboratories face the critical decision of either interrupting the analyses, giving up the possibility of obtaining a genetic profile (although partial), or attempting amplification of the extract with awareness of the interpretation issues that this implies. The authors present the quantification results obtained by qPCR performed on numerous samples collected from items of forensic interest, subjected to DNA extraction using magnetic beads. Following the quantification step, the extracts were subjected to DNA amplification and STR typing using last-generation commercial kits.
Samples that showed quantification values below the limit of detection for the method were included in the analysis in order to check for a correlation between the DNA quantification results by qPCR and the possibility of obtaining a genetic profile useful for identification purposes. Our study, performed on 558 samples from forensic casework items, showed such a correlation. In spite of the increasing sensitivity of last-generation commercial kits for STR analysis, as demonstrated by the ability to detect allelic peaks from extremely low DNA quantities (with concentrations far below the limit of detection of the specific quantification kit, even corresponding to 0 or "Undetermined"), the results obtained show a correlation between qPCR quantification values and STR typing results. Thus qPCR confirms itself today as a useful and valid instrument for both qualitative and quantitative evaluation of genetic samples for human identification purposes.
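The standard-curve arithmetic behind qPCR quantification, and the extrapolation that produces concentration values below the kit's validated limit, can be sketched as follows. The dilution series and Ct values are idealised illustrations, not the study's data:

```python
import math

def fit_standard_curve(concs, cts):
    """Least-squares fit of Ct = slope * log10(conc) + intercept to a
    dilution series of known standards."""
    xs = [math.log10(c) for c in concs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, cts))
    slope = sxy / sxx
    return slope, my - slope * mx

def quantify(ct, slope, intercept):
    """Invert the curve: DNA concentration implied by an observed Ct."""
    return 10 ** ((ct - intercept) / slope)

# Idealised dilution series (ng/uL): a slope of -3.32 cycles per decade
# corresponds to 100% amplification efficiency.
concs = [10.0, 1.0, 0.1, 0.01]
cts = [20.0, 23.32, 26.64, 29.96]
slope, intercept = fit_standard_curve(concs, cts)
```

Note that `quantify` happily extrapolates the same line below the lowest standard, which is exactly why software can report concentrations far under the manufacturer's detection limit.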

  9. Toward greener analytical techniques for the absolute quantification of peptides in pharmaceutical and biological samples.

    Science.gov (United States)

    Van Eeckhaut, Ann; Mangelings, Debby

    2015-09-10

Peptide-based biopharmaceuticals represent one of the fastest growing classes of new drug molecules. New reaction types included in synthesis strategies to reduce the rapid metabolism of peptides, along with the availability of new formulation and delivery technologies, have resulted in increased marketing of peptide drug products. In this regard, the development of analytical methods for the quantification of peptides in pharmaceutical and biological samples is of utmost importance. From the sample preparation step to analysis by chromatographic or electrophoretic methods, many difficulties must be tackled. Recent developments in analytical techniques place ever more emphasis on green analytical techniques. This review discusses the progress in, and challenges observed during, green analytical method development for the quantification of peptides in pharmaceutical and biological samples. PMID:25864956

  10. Tutorial examples for uncertainty quantification methods.

    Energy Technology Data Exchange (ETDEWEB)

    De Bord, Sarah [Univ. of California, Davis, CA (United States)

    2015-08-01

    This report details the work accomplished during my 2015 SULI summer internship at Sandia National Laboratories in Livermore, CA. During this internship, I worked on multiple tasks with the common goal of making uncertainty quantification (UQ) methods more accessible to the general scientific community. As part of my work, I created a comprehensive numerical integration example to incorporate into the user manual of a UQ software package. Further, I developed examples involving heat transfer through a window to incorporate into tutorial lectures that serve as an introduction to UQ methods.
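A heat-transfer-through-a-window example of the kind mentioned above lends itself to a minimal Monte Carlo sketch. The conduction model Q = U·A·ΔT is standard, but the U-value distribution, area and temperature difference below are invented for illustration, not the tutorial's actual numbers:

```python
import random
import statistics

def heat_loss(u_value, area_m2, delta_t):
    """Steady-state conductive heat loss through a window: Q = U * A * dT (watts)."""
    return u_value * area_m2 * delta_t

# Monte Carlo propagation: sample the uncertain U-value (W/m^2K), push each
# sample through the model, and summarise the output distribution.
random.seed(1)
samples = [
    heat_loss(random.gauss(2.8, 0.3), area_m2=1.5, delta_t=20.0)
    for _ in range(20000)
]
mean_q = statistics.fmean(samples)
std_q = statistics.stdev(samples)
```

Because the model is linear in U, the output mean and standard deviation are simply the input ones scaled by A·ΔT, which makes this a convenient self-checking first example.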

  11. Quantification of human performance circadian rhythms.

    Science.gov (United States)

    Freivalds, A; Chaffin, D B; Langolf, G D

    1983-09-01

    The quantification of worker performance changes during a shift is critical to establishing worker productivity. This investigation examined the existence of circadian rhythms in response variables that relate most meaningfully to the physiological and neurological state of the body for three subjects maintaining a resting posture for 25 hours on five separate occasions. Significant circadian variation ranging from 3% to 11% of the mean value was detected for elbow flexion strength, physiological tremor, simple reaction time, information processing rate and critical eye-hand tracking capacity. PMID:6637808

  12. A Tableaux Calculus for Ambiguous Quantification

    CERN Document Server

    Monz, C; Monz, Christof; Rijke, Maarten de

    2000-01-01

Coping with ambiguity has recently received a lot of attention in natural language processing. Most work focuses on the semantic representation of ambiguous expressions. In this paper we complement this work in two ways. First, we provide an entailment relation for a language with ambiguous expressions. Second, we give a sound and complete tableaux calculus for reasoning with statements involving ambiguous quantification. The calculus interleaves partial disambiguation steps with steps in a traditional deductive process, so as to minimize and postpone branching in the proof process, thereby increasing its efficiency.

  13. Thermal behavior of dynamic magnetizations, hysteresis loop areas and correlations of a cylindrical Ising nanotube in an oscillating magnetic field within the effective-field theory and the Glauber-type stochastic dynamics approach

    International Nuclear Information System (INIS)

The dynamical aspects of a cylindrical Ising nanotube in the presence of a time-varying magnetic field are investigated within the effective-field theory with correlations and Glauber-type stochastic approach. Temperature dependence of the dynamic magnetizations, dynamic total magnetization, hysteresis loop areas and correlations are investigated in order to characterize the nature of dynamic transitions as well as to obtain the dynamic phase transition temperatures and compensation behaviors. Some characteristic phenomena are found depending on the ratio of the physical parameters in the surface shell and core, i.e., five different types of compensation behaviors in the Néel classification nomenclature exist in the system. -- Highlights: • Kinetic cylindrical Ising nanotube is investigated using the effective-field theory. • The dynamic magnetizations, hysteresis loop areas and correlations are calculated. • The effects of the exchange interactions have been studied in detail. • Five different types of compensation behaviors have been found. • Some characteristic phenomena are found depending on the ratio of physical parameters.

  14. Thermal behavior of dynamic magnetizations, hysteresis loop areas and correlations of a cylindrical Ising nanotube in an oscillating magnetic field within the effective-field theory and the Glauber-type stochastic dynamics approach

    Energy Technology Data Exchange (ETDEWEB)

    Deviren, Bayram, E-mail: bayram.deviren@nevsehir.edu.tr [Department of Physics, Nevsehir University, 50300 Nevsehir (Turkey); Keskin, Mustafa [Department of Physics, Erciyes University, 38039 Kayseri (Turkey)

    2012-02-20


  15. Alberta Diabetes and Physical Activity Trial (ADAPT): A randomized theory-based efficacy trial for adults with type 2 diabetes - rationale, design, recruitment, evaluation, and dissemination

    Directory of Open Access Journals (Sweden)

    Birkett Nicholas

    2010-01-01

Abstract Background The primary aim of this study was to compare the efficacy of three physical activity (PA) behavioural intervention strategies in a sample of adults with type 2 diabetes. Method/Design Participants (N = 287) were randomly assigned to one of three groups consisting of the following intervention strategies: (1) standard printed PA educational materials provided by the Canadian Diabetes Association (i.e., Group 1/control group); (2) the standard printed PA educational materials as in Group 1, plus pedometers, a log book and printed PA information matched to individuals' PA stage of readiness, provided every 3 months (i.e., Group 2); and (3) a PA telephone counseling protocol matched to PA stage of readiness and tailored to personal characteristics, in addition to the materials provided in Groups 1 and 2 (i.e., Group 3). PA behaviour measured by the Godin Leisure Time Exercise Questionnaire and related social-cognitive measures were assessed at baseline, 3, 6, 9, 12 and 18 months (i.e., 6-month follow-up). Clinical (biomarker) and health-related quality of life assessments were conducted at baseline, 12 months, and 18 months. Linear Mixed Model (LMM) analyses will be used to examine time-dependent changes from baseline across study time points for Groups 2 and 3 relative to Group 1. Discussion ADAPT will determine whether tailored but low-cost interventions can lead to sustainable increases in PA behaviours. The results may have implications for practitioners in designing and implementing theory-based physical activity promotion programs for this population. Clinical Trials Registration ClinicalTrials.gov identifier: NCT00221234

  16. e/a classification of Hume–Rothery Rhombic Triacontahedron-type approximants based on all-electron density functional theory calculations

    Energy Technology Data Exchange (ETDEWEB)

    Mizutani, U; Inukai, M; Sato, H; Zijlstra, E S; Lin, Q

    2014-05-16

There are three key electronic parameters in elucidating the physics behind the Hume–Rothery electron concentration rule: the square of the Fermi diameter (2kF)², the square of the critical reciprocal lattice vector, and the electron concentration parameter, or the number of itinerant electrons per atom e/a. We have reliably determined these three parameters for 10 Rhombic Triacontahedron-type 2/1–2/1–2/1 (N = 680) and 1/1–1/1–1/1 (N = 160–162) approximants by making full use of full-potential linearized augmented plane wave-Fourier band calculations based on all-electron density-functional theory. We revealed that the 2/1–2/1–2/1 approximants Al13Mg27Zn45 and Na27Au27Ga31 belong to two different sub-groups, classified in terms of the square of the critical reciprocal lattice vector equal to 126 and 109, and could explain why they take different e/a values of 2.13 and 1.76, respectively. Among the eight 1/1–1/1–1/1 approximants Al3Mg4Zn3, Al9Mg8Ag3, Al21Li13Cu6, Ga21Li13Cu6, Na26Au24Ga30, Na26Au37Ge18, Na26Au37Sn18 and Na26Cd40Pb6, the first two, the second two and the last four compounds were classified into three sub-groups with this quantity equal to 50, 46 and 42, and were claimed to obey the e/a = 2.30, 2.10–2.15 and 1.70–1.80 rules, respectively.

  17. An investigation of psychological distress among patients with Type 2 diabetes considered in the light of the scope of Conservation of Resources theory

    Directory of Open Access Journals (Sweden)

    Ezgi GÖÇEK YORULMAZ

    2014-12-01

Objectives: Diabetes is an important disease with an ever-increasing prevalence. Diabetes patients experience psychological as well as physical problems, which may negatively influence the treatment process. If the psychological distress level of patients and the related factors are determined, physical treatment can be carried out more effectively. For this reason, the aim of this study was to assess anxiety, depression and psychological distress (the total score of anxiety + depression) and the related factors (i.e., ways of coping, perceived social support, self-efficacy in relation to diabetes, expressed emotion, and loss of resources) among patients with Type 2 diabetes within the scope of Conservation of Resources theory. Patients and Methods: A sociodemographic and illness information form and six different scales on psychological distress and related factors were administered to 116 diabetes patients. To investigate the relationships between the variables, independent-samples t tests, regression and correlation analyses were performed. Results: Nearly half of the diabetes patients experienced high levels of anxiety and depression. More importantly, helplessness coping and resource loss were positively related to all psychological problems, whereas optimistic coping was negatively associated with them. In addition, increased emotional over-involvement, a domain of expressed emotion, was associated only with a decrease in general psychological distress. Conclusion: Resource loss, coping strategies and expressed emotion were found to be related to psychological distress. It is suggested that psychological intervention programs should focus on the issues identified in this study.

  18. An improved competitive inhibition enzymatic immunoassay method for tetrodotoxin quantification

    Directory of Open Access Journals (Sweden)

    Stokes Amber N

    2012-03-01

Abstract Quantifying tetrodotoxin (TTX) has been a challenge in both ecological and medical research due to the cost, time and training required by most quantification techniques. Here we present a modified Competitive Inhibition Enzymatic Immunoassay for the quantification of TTX, intended to aid researchers in optimizing this technique for widespread use with a high degree of accuracy and repeatability.

  19. Quantification of prebiotics in commercial infant formulas.

    Science.gov (United States)

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays, a great number of infant formulas enriched with prebiotics are available on the market; however, there are scarce data about their composition. In this study, two chromatographic methods (GC-FID and HPLC-RID) were used in combination to quantify the carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the content of FOS, GOS and GOS/FOS was in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with a degree of polymerization (DP) of up to 19. The methodology proposed here may be used for routine quality control of infant formulas and other food ingredients containing prebiotics. PMID:26471520
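Chromatographic quantification of this kind is commonly calibrated with an internal standard and a relative response factor; the abstract does not state the paper's calibration details, so the sketch below is a generic illustration with invented peak areas and concentrations:

```python
def response_factor(area_analyte, conc_analyte, area_istd, conc_istd):
    """Relative response factor (RRF) from a calibration standard."""
    return (area_analyte / conc_analyte) * (conc_istd / area_istd)

def quantify_with_istd(area_analyte, area_istd, conc_istd, rrf):
    """Analyte concentration in an unknown, ratioed to the internal standard
    so that injection-volume and detector drift cancel out."""
    return (area_analyte / area_istd) * conc_istd / rrf

# Invented peak areas (arbitrary units) and concentrations (g/100 g).
rrf = response_factor(area_analyte=12000, conc_analyte=2.0,
                      area_istd=15000, conc_istd=2.5)
conc_unknown = quantify_with_istd(area_analyte=9000, area_istd=14000,
                                  conc_istd=2.5, rrf=rrf)
```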

  20. Quantification noise in single cell experiments

    Science.gov (United States)

    Reiter, M.; Kirchner, B.; Müller, H.; Holzhauer, C.; Mann, W.; Pfaffl, M. W.

    2011-01-01

In quantitative single-cell studies, the critical issues are the low amount of nucleic acids present and the resulting experimental variation. In addition, biological data obtained from heterogeneous tissue do not reflect the expression behaviour of every single cell. These variations can derive from natural biological variance or can be introduced externally; both have negative effects on the quantification result. The aim of this study is to make quantitative single-cell studies more transparent and reliable in order to fulfil the MIQE guidelines at the single-cell level. The technical variability introduced by reverse transcription (RT), pre-amplification, evaporation, the biological material and qPCR itself was evaluated using RNA or DNA standards. Secondly, the biological expression variance of GAPDH, TNF-α, IL-1β and TLR4 was measured in an mRNA profiling experiment in single lymphocytes. The quantification setup used was sensitive enough to detect single standard copies and transcripts from one solitary cell. Most variability was introduced by RT, followed by evaporation and pre-amplification. The qPCR analysis and the biological matrix introduced only minor variability. Both studies impressively demonstrate the heterogeneity of expression patterns in individual cells and clearly show today's limitations of quantitative single-cell expression analysis. PMID:21745823
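Per-step technical noise of the kind ranked above is often summarised as a coefficient of variation of back-calculated copy numbers across replicates. A sketch with invented Cq values (the qualitative pattern, RT noisier than qPCR, mimics the finding; the numbers do not come from the study):

```python
import statistics

def relative_copies(cq, efficiency=1.0):
    """Relative copy number implied by a Cq value, assuming the given
    amplification efficiency (1.0 = perfect doubling per cycle)."""
    return (1.0 + efficiency) ** (-cq)

def step_cv(cq_replicates):
    """Coefficient of variation of relative copy numbers across replicates:
    one way to express the technical noise a protocol step introduces."""
    copies = [relative_copies(c) for c in cq_replicates]
    return statistics.stdev(copies) / statistics.fmean(copies)

# Invented replicate Cq values for the same standard processed twice.
cv_rt = step_cv([24.1, 25.0, 23.6, 24.8])      # wide spread: noisy step
cv_qpcr = step_cv([24.4, 24.5, 24.35, 24.5])   # tight spread: quiet step
```

Converting Cq to the copy scale before computing the CV matters: a one-cycle spread corresponds to a two-fold spread in template, so noise that looks small in cycles is large in copies.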

  1. Benchmark problems for subsurface flow uncertainty quantification

    Science.gov (United States)

    Chang, Haibin; Liao, Qinzhuo; Zhang, Dongxiao

    2015-12-01

    In this work, we design a series of benchmark problems for subsurface flow uncertainty quantification. Three basic subsurface flow problems with increasing complexity are selected, which are steady state groundwater flow, groundwater contamination, and multi-phase flow. For the steady state groundwater flow, hydraulic conductivity is assumed to be uncertain, and the uncertain model parameter is assumed to be Gaussian random constant, Gaussian random field, and facies field, respectively. For the other two flow problems, the uncertain model parameter is assumed to be Gaussian random field and facies field, respectively. The statistical property of the uncertain model parameter is specified for each problem. The Monte Carlo (MC) method is used to obtain the benchmark results. The results include the first two statistical moments and the probability density function of the quantities of interest. To verify the MC results, we test the convergence of the results and the reliability of the sampling algorithm. For any existing and newly developed uncertainty quantification methods, which are not (fully) verified, the designed benchmark problems in this work can facilitate the verification process of those methods. For illustration, in this work, we provide a verification of the probabilistic collocation method using the benchmark results.
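The core of the Monte Carlo benchmark workflow, sampling an uncertain parameter, pushing it through a model, and checking that the moment estimates stabilise with sample size, can be sketched as follows. The scalar model is a toy stand-in for a flow quantity of interest, not one of the paper's benchmark problems:

```python
import math
import random
import statistics

def mc_moments(model, sampler, n):
    """First two moments of model(theta), theta ~ sampler, by Monte Carlo."""
    vals = [model(sampler()) for _ in range(n)]
    return statistics.fmean(vals), statistics.pvariance(vals)

# Bounded toy quantity of interest driven by a Gaussian log-conductivity;
# by symmetry its true mean is exactly 0.5, so convergence is easy to check.
model = lambda k: 1.0 / (1.0 + math.exp(k))
sampler = lambda: random.gauss(0.0, 1.0)

random.seed(42)
mean_small, _ = mc_moments(model, sampler, 500)
mean_large, var_large = mc_moments(model, sampler, 50000)
```

Comparing estimates at increasing sample sizes is the same convergence test the authors apply before accepting MC results as a reference for other UQ methods.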

  2. A shear deformable theory of laminated composite shallow shell-type panels and their response analysis. I - Free vibration and buckling

    Science.gov (United States)

    Librescu, L.; Khdeir, A. A.; Frederick, D.

    1989-01-01

    This paper deals with the substantiation of a shear deformable theory of cross-ply laminated composite shallow shells. While the developed theory preserves all the advantages of the first order transverse shear deformation theory it succeeds in eliminating some of its basic shortcomings. The theory is further employed in the analysis of the eigenvibration and static buckling problems of doubly curved shallow panels. In this context, the state space concept is used in conjunction with the Levy method, allowing one to analyze these problems in a unified manner, for a variety of boundary conditions. Numerical results are presented and some pertinent conclusions are formulated.

  3. In vivo MRS metabolite quantification using genetic optimization

    Science.gov (United States)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; van Ormondt, D.; Graveron-Demilly, D.

    2011-11-01

The in vivo quantification of metabolite concentrations, revealed in magnetic resonance spectroscopy (MRS) spectra, is the main subject under investigation in this work. Significant contributions based on artificial intelligence tools such as neural networks (NNs) have been presented lately with good results, but they show several drawbacks regarding quantification accuracy under difficult conditions. A general framework that casts the quantification procedure as an optimization problem, solved using a genetic algorithm (GA), is proposed in this paper. Two different lineshape models are examined, while two GA configurations are applied to artificial data. Moreover, the introduced quantification technique deals with overlapping metabolite peaks, a considerably difficult situation occurring under real conditions. Appropriate experiments have proved the efficiency of the introduced methodology on artificial MRS data, establishing it as a generic metabolite quantification procedure.
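The idea of fitting a lineshape model by evolutionary search can be illustrated with a toy GA: evolve (amplitude, position) pairs of a single Lorentzian against a synthetic spectrum. The lineshape, parameters and GA configuration below are invented for illustration and are not the authors' setup:

```python
import numpy as np

def lorentzian(f, amp, pos, width=2.0):
    """Simple Lorentzian lineshape (width held fixed for this sketch)."""
    return amp * width**2 / ((f - pos) ** 2 + width**2)

def ga_fit(freqs, spectrum, pop=40, gens=120, seed=3):
    """Toy GA: evolve (amp, pos) pairs to minimise the squared residual
    against the observed spectrum, using elitist selection plus Gaussian
    mutation only."""
    rng = np.random.default_rng(seed)
    cand = rng.uniform([0.0, freqs.min()], [10.0, freqs.max()], size=(pop, 2))
    for _ in range(gens):
        resid = spectrum[None, :] - lorentzian(freqs[None, :], cand[:, :1], cand[:, 1:2])
        fitness = (resid ** 2).sum(axis=1)
        elite = cand[np.argsort(fitness)[: pop // 4]]                 # selection
        children = elite[rng.integers(0, len(elite), pop - len(elite))]
        children = children + rng.normal(scale=0.1, size=children.shape)  # mutation
        cand = np.vstack([elite, children])
    resid = spectrum[None, :] - lorentzian(freqs[None, :], cand[:, :1], cand[:, 1:2])
    return cand[np.argmin((resid ** 2).sum(axis=1))]

freqs = np.linspace(-20, 20, 200)
spectrum = lorentzian(freqs, 5.0, 4.0)   # noise-free synthetic "peak"
amp_hat, pos_hat = ga_fit(freqs, spectrum)
```

A real MRS quantifier evolves many more parameters per metabolite (width, phase, baseline) and handles overlapping peaks, but the selection-mutation loop is the same.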

  4. In vivo MRS metabolite quantification using genetic optimization

    International Nuclear Information System (INIS)


  5. Mesh refinement for uncertainty quantification through model reduction

    Energy Technology Data Exchange (ETDEWEB)

    Li, Jing, E-mail: lixxx873@umn.edu; Stinis, Panos, E-mail: stinis@umn.edu

    2015-01-01

We present a novel way of deciding when and where to refine a mesh in probability space in order to facilitate uncertainty quantification in the presence of discontinuities in random space. A discontinuity in random space makes the application of generalized polynomial chaos expansion techniques prohibitively expensive. The reason is that for discontinuous problems, the expansion converges very slowly. An alternative to using higher terms in the expansion is to divide the random space in smaller elements where a lower degree polynomial is adequate to describe the randomness. In general, the partition of the random space is a dynamic process since some areas of the random space, particularly around the discontinuity, need more refinement than others as time evolves. In the current work we propose a way to decide when and where to refine the random space mesh based on the use of a reduced model. The idea is that a good reduced model can monitor accurately, within a random space element, the cascade of activity to higher degree terms in the chaos expansion. In turn, this facilitates the efficient allocation of computational resources to the areas of random space where they are more needed. For the Kraichnan–Orszag system, the prototypical system to study discontinuities in random space, we present theoretical results which show why the proposed method is sound and numerical results which corroborate the theory.
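The slow convergence that motivates the refinement strategy is easy to demonstrate in one dimension: a truncated Legendre expansion of a smooth function converges spectrally, while the same truncation of a step function barely converges at all. A small sketch (generic illustration, not the paper's Kraichnan-Orszag computation):

```python
import numpy as np
from numpy.polynomial import legendre

def l2_error_of_expansion(f, degree, nquad=400):
    """L2 error on [-1, 1] of the truncated Legendre expansion of f, with
    coefficients computed by Gauss-Legendre quadrature."""
    x, w = legendre.leggauss(nquad)
    fx = f(x)
    coeffs = [(2 * k + 1) / 2.0 * np.sum(w * fx * legendre.legval(x, [0.0] * k + [1.0]))
              for k in range(degree + 1)]
    approx = legendre.legval(x, coeffs)
    return float(np.sqrt(np.sum(w * (fx - approx) ** 2)))

smooth = lambda x: np.exp(x)                    # analytic in the random variable
step = lambda x: np.where(x < 0.3, 0.0, 1.0)    # discontinuity in random space

err_smooth = l2_error_of_expansion(smooth, 10)
err_step = l2_error_of_expansion(step, 10)
```

Splitting [-1, 1] at the jump and using a low-degree expansion on each element removes the problem entirely, which is exactly the multi-element refinement the paper automates with a reduced model.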

  6. Quantification of motility of carabid beetles in farmland.

    Science.gov (United States)

    Allema, A B; van der Werf, W; Groot, J C J; Hemerik, L; Gort, G; Rossing, W A H; van Lenteren, J C

    2015-04-01

Quantification of the movement of insects at field and landscape levels helps us to understand their ecology and ecological functions. We conducted a meta-analysis on movement of carabid beetles (Coleoptera: Carabidae) to identify key factors affecting movement and population redistribution. We characterize the rate of redistribution using motility μ (L² T⁻¹), a measure of the diffusion of a population in space and time that is consistent with ecological diffusion theory and can be used for upscaling short-term data to longer time frames. Formulas are provided to calculate motility from literature data on movement distances. A field experiment was conducted to measure the redistribution of the mass-released carabid Pterostichus melanarius in a crop field, and motility was derived by fitting a Fokker-Planck diffusion model using inverse modelling. Bias in estimates of motility from literature data is elucidated using the data from the field experiment as a case study. The meta-analysis showed that motility is 5.6 times as high in farmland as in woody habitat. Species associated with forested habitats had greater motility than species associated with open field habitats, both in arable land and woody habitat. The meta-analysis did not identify consistent differences in motility at the species level, or between clusters of larger and smaller beetles. The results presented here provide a basis for calculating time-varying distribution patterns of carabids in farmland and woody habitat. The formulas for calculating motility can be used for other taxa. PMID:25673121
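A minimal version of the motility arithmetic, assuming the standard two-dimensional diffusion relation E[d²] = 4·μ·t (the abstract does not spell out the paper's exact formulas, and the displacement data below are invented):

```python
import math

def motility_from_displacements(displacements_m, time_d):
    """Motility mu (m^2 / day) from observed net displacements after
    time_d days, under 2-D diffusion: E[d^2] = 4 * mu * t."""
    msd = sum(d * d for d in displacements_m) / len(displacements_m)
    return msd / (4.0 * time_d)

def rms_displacement(mu, time_d):
    """Upscaling: RMS displacement expected after time_d days under the
    same diffusion model."""
    return math.sqrt(4.0 * mu * time_d)

# Invented mark-recapture data: net distances (metres) after 2 days.
mu = motility_from_displacements([3.0, 5.0, 2.0, 6.5], time_d=2.0)
d30 = rms_displacement(mu, 30.0)   # short-term data upscaled to a month
```

The second function is the upscaling step the abstract refers to: once μ is known, expected spread over any longer time frame follows from the same model.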

  7. Mathematical Models in Schema Theory

    OpenAIRE

    Burgin, Mark

    2005-01-01

In this paper, a mathematical schema theory is developed. This theory has three roots: brain theory schemas, grid automata, and block-schemas. In Section 2 of this paper, elements of the theory of grid automata necessary for the mathematical schema theory are presented. In Section 3, elements of brain theory necessary for the mathematical schema theory are presented. In Section 4, other types of schemas are considered. In Section 5, the mathematical schema theory is developed...

  8. MOTIVATION INTERNALIZATION AND SIMPLEX STRUCTURE IN SELF-DETERMINATION THEORY.

    Science.gov (United States)

    Ünlü, Ali; Dettweiler, Ulrich

    2015-12-01

Self-determination theory, as proposed by Deci and Ryan, postulates different types of motivation regulation. As to the introjected and identified regulation of extrinsic motivation, their internalizations were described as "somewhat external" and "somewhat internal" and remained undetermined in the theory. This paper introduces a constrained regression analysis that allows these vaguely expressed motivations to be estimated in an "optimal" manner in any given empirical context. The approach is then generalized and applied to simplex structure analysis in self-determination theory. The technique is exemplified with an empirical study comparing science teaching in a classical school class versus an expeditionary outdoor program. Based on a sample of 84 German pupils (43 girls, 41 boys, 10 to 12 years old), data were collected using the German version of the Academic Self-Regulation Questionnaire. The science-teaching format was seen not to influence the pupils' internalization of identified regulation. The internalization of introjected regulation differed, shifting more toward the external pole in the outdoor teaching format. The quantification approach supported the simplex structure of self-determination theory, whereas correlations may disconfirm the simplex structure. PMID:26595290
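One way to picture the constrained estimation is as a bounded least-squares weight placing introjected regulation on an external-internal continuum. This is a toy stand-in for the paper's constrained regression, with invented scale scores:

```python
def internalization_weight(introjected, external, intrinsic):
    """Least-squares estimate of w in the model
        introjected ~ w * external + (1 - w) * intrinsic,
    clipped to [0, 1]. w near 1 reads as 'mostly external',
    w near 0 as 'mostly internal'."""
    num = sum((i - n) * (e - n) for i, e, n in zip(introjected, external, intrinsic))
    den = sum((e - n) ** 2 for e, n in zip(external, intrinsic))
    return min(1.0, max(0.0, num / den))

# Invented Likert-style scale means for five pupils, illustration only.
external = [4.0, 3.5, 4.5, 3.0, 4.0]
intrinsic = [2.0, 2.5, 1.5, 3.0, 2.0]
introjected = [3.4, 3.2, 3.6, 3.0, 3.4]
w = internalization_weight(introjected, external, intrinsic)
```

Here the estimate lands at w = 0.7, i.e. "somewhat external", which is the kind of data-driven placement the paper argues for instead of a fixed theoretical value.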

  9. Detection and quantification of proteins in clinical samples using high resolution mass spectrometry.

    Science.gov (United States)

    Gallien, Sebastien; Domon, Bruno

    2015-06-15

Quantitative proteomics has benefited from the recent development of mass spectrometers capable of high-resolution and accurate-mass (HR/AM) measurements. While targeted experiments are routinely performed on triple quadrupole instruments in selected reaction monitoring (SRM; often referred to as multiple reaction monitoring, MRM) mode, quadrupole-orbitrap mass spectrometers allow quantification in MS/MS mode, also known as parallel reaction monitoring (PRM). This technique is characterized by higher selectivity and better confidence in the assignment of the precursor and fragment ions, and thus translates into improved analytical performance. More fundamentally, PRM changes the overall paradigm of targeted experiments by decoupling acquisition from data processing: the experiments rely on two distinct steps, a simplified acquisition method in conjunction with flexible, iterative, post-acquisition data processing. This account describes in detail the different steps of a PRM experiment, which include the design of the acquisition method, the confirmation of the identity of the analytes founded upon a full MS/MS fragmentation pattern, and the quantification based on the extraction of specific fragment ions (selected post-acquisition) using a tight mass tolerance. The different types of PRM experiments, defined as large-scale screening or precise targeted quantification using calibrated internal standards, together with considerations on the selection of experimental parameters, are discussed. PMID:25843604
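The post-acquisition extraction step, summing the signal of a chosen fragment ion within a tight ppm window of a full MS/MS spectrum, can be sketched as follows. The spectrum and m/z values are hypothetical:

```python
def within_ppm(mz, target, tol_ppm):
    """True if mz falls inside a +/- tol_ppm window around target."""
    return abs(mz - target) <= target * tol_ppm * 1e-6

def extract_fragment_signal(spectrum, target_mz, tol_ppm=5.0):
    """Post-acquisition extraction of one fragment ion from a centroided
    MS/MS spectrum: sum the intensities of peaks within the tolerance."""
    return sum(inten for mz, inten in spectrum if within_ppm(mz, target_mz, tol_ppm))

# Hypothetical centroided MS/MS spectrum: (m/z, intensity) pairs.
spectrum = [(175.1190, 1200.0), (401.2501, 150.0), (401.2520, 900.0),
            (612.3402, 300.0)]
signal = extract_fragment_signal(spectrum, 401.2515, tol_ppm=5.0)
```

Because the full spectrum is stored, the fragment list and tolerance can be revisited iteratively after acquisition, which is the decoupling the abstract emphasises.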

  10. Subnuclear foci quantification using high-throughput 3D image cytometry

    Science.gov (United States)

    Wadduwage, Dushan N.; Parrish, Marcus; Choi, Heejin; Engelward, Bevin P.; Matsudaira, Paul; So, Peter T. C.

    2015-07-01

    Ionising radiation causes various types of DNA damage, including double strand breaks (DSBs). DSBs are often recognized by the DNA repair protein ATM, which forms gamma-H2AX foci at the DSB sites that can be visualized using immunohistochemistry. However, most such experiments are low throughput in both imaging and image analysis, and most studies still use manual counting or classification. They are therefore limited to counting a small number of foci per cell (about 5 per nucleus), as the quantification process is extremely labour intensive. We have therefore developed a high-throughput instrumentation and computational pipeline specialized for gamma-H2AX foci quantification. A population of cells with highly clustered foci inside nuclei was imaged, in 3D with submicron resolution, using an in-house developed high-throughput image cytometer. Imaging speeds as high as 800 cells/second in 3D were achieved by using HiLo wide-field depth-resolved imaging and a remote z-scanning technique. The number of foci per cell nucleus was then quantified using a 3D extended-maxima-transform-based algorithm. Our results suggest that, while most other 2D imaging and manual quantification studies can count only up to about 5 foci per nucleus, our method is capable of counting more than 100. Moreover, we show that 3D analysis is significantly superior to 2D techniques.
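As a simplified stand-in for the extended-maxima step (the paper's exact algorithm is not reproduced in this record), foci counting on a 3-D stack can be sketched as thresholding followed by 26-connected component labelling, assuming `scipy` is available:

```python
import numpy as np
from scipy import ndimage

def count_foci_3d(volume, threshold):
    """Count bright foci in a 3-D image stack.

    A simplified stand-in for an extended-maxima transform: voxels
    above `threshold` are grouped into 26-connected components and
    each component is counted as one focus.
    """
    mask = volume > threshold
    structure = np.ones((3, 3, 3), dtype=bool)  # 26-connectivity
    _, n_foci = ndimage.label(mask, structure=structure)
    return n_foci

# Two synthetic foci in a 10x10x10 "nucleus" volume
vol = np.zeros((10, 10, 10))
vol[2, 2, 2] = vol[2, 2, 3] = 5.0   # focus 1 (two touching voxels)
vol[7, 7, 7] = 4.0                  # focus 2
n = count_foci_3d(vol, threshold=1.0)
```

A true extended-maxima transform additionally suppresses maxima shallower than a height h before labelling, which separates touching foci better than plain thresholding.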

  11. Ecosystem Service Potentials, Flows and Demands – Concepts for Spatial Localisation, Indication and Quantification

    Directory of Open Access Journals (Sweden)

    Benjamin Burkhard

    2014-06-01

    Full Text Available The high variety of ecosystem service categorisation systems, assessment frameworks, indicators, quantification methods and spatial localisation approaches allows scientists and decision makers to harness experience, data, methods and tools. On the other hand, this variety of concepts and the disagreements among scientists hamper the integration of ecosystem services into contemporary environmental management and decision making. In this article, the current state of the art of ecosystem service science regarding spatial localisation, indication and quantification of multiple ecosystem service supply and demand is reviewed and discussed. Concepts and tables for regulating, provisioning and cultural ecosystem service definitions are provided, distinguishing between ecosystem service potential supply (stocks), flows (real supply) and demands, as well as related indicators for quantification. Furthermore, spatial concepts of service-providing units, benefitting areas, spatial relations, rivalry, and spatial and temporal scales are elaborated. Finally, matrices linking CORINE land cover types to ecosystem service potentials, flows, demands and budget estimates are provided. The matrices show that the ecosystem service potentials of landscapes differ from the flows, especially for provisioning ecosystem services.

  12. QUANTIFICATION AND BIOREMEDIATION OF ENVIRONMENTAL SAMPLES BY DEVELOPING A NOVEL AND EFFICIENT METHOD

    Directory of Open Access Journals (Sweden)

    Mohammad Osama

    2014-06-01

    Full Text Available Pleurotus ostreatus, a white rot fungus, is capable of bioremediating a wide range of organic contaminants, including Polycyclic Aromatic Hydrocarbons (PAHs). Ergosterol is produced by living fungal biomass and used as a measure of fungal biomass. The first part of this work deals with the extraction and quantification of PAHs from contaminated sediments by the Lipid Extraction Method (LEM). The second part consists of the development of a novel extraction method, the Ergosterol Extraction Method (EEM), followed by quantification and bioremediation. The novelty of this method is the simultaneous extraction and quantification of two different types of compounds, a sterol (ergosterol) and PAHs, and it is more efficient than LEM. EEM successfully extracted ergosterol from the fungus grown on barley at concentrations of 17.5-39.94 µg g-1, and it quantified considerably more PAHs, both in number and in amount, than LEM. In addition, cholesterol, usually found in animals, was also detected in the fungus P. ostreatus at easily detectable levels.

  13. The open-endedness of the set concept and the semantics of set theory

    OpenAIRE

    Paseau, A

    2003-01-01

    Some philosophers have argued that the open-endedness of the set concept has revisionary consequences for the semantics and logic of set theory. I consider (several variants of) an argument for this claim, premissed on the view that quantification in mathematics cannot outrun our conceptual abilities. The argument urges a non-standard semantics for set theory that allegedly sanctions a non-classical logic. I show that the views about quantification the argument relies on turn out to sanction ...

  14. On Uncertainty Quantification in Particle Accelerators Modelling

    CERN Document Server

    Adelmann, Andreas

    2015-01-01

    Using a cyclotron-based model problem, we demonstrate for the first time the applicability and usefulness of an uncertainty quantification (UQ) approach in order to construct surrogate models for quantities such as emittance, energy spread and the halo parameter, and to perform a global sensitivity analysis together with error propagation and $L_{2}$ error analysis. The model problem is selected so that it represents a template for general high-intensity particle accelerator modelling tasks. The presented physics problem is to be seen as hypothetical, the aim being to demonstrate the usefulness and applicability of the presented UQ approach, not to solve a particular problem. The proposed UQ approach is based on sparse polynomial chaos expansions and relies on a small number of high-fidelity particle accelerator simulations. Within this UQ framework, the identification of the most important uncertainty sources is achieved by performing a global sensitivity analysis via computing the so-called Sobol' ...
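For an orthonormal polynomial chaos expansion, first-order Sobol' indices follow directly from the coefficients: the total variance is the sum of squared non-constant coefficients, and the index for input i collects the terms involving that input alone. A minimal sketch under that assumption (the multi-index dictionary is hypothetical):

```python
def sobol_first_order(coeffs):
    """First-order Sobol' indices from an orthonormal PC expansion.

    coeffs: dict mapping a multi-index tuple (one polynomial degree
    per input variable) to its PC coefficient. Total variance is the
    sum of squared non-constant coefficients; S_i collects the terms
    that involve variable i alone.
    """
    dim = len(next(iter(coeffs)))
    var = sum(c * c for idx, c in coeffs.items() if any(idx))
    s = []
    for i in range(dim):
        vi = sum(c * c for idx, c in coeffs.items()
                 if idx[i] > 0
                 and all(d == 0 for j, d in enumerate(idx) if j != i))
        s.append(vi / var)
    return s

# Additive model f = 3*phi1(x1) + 4*phi1(x2): variances 9 and 16
pce = {(0, 0): 1.0, (1, 0): 3.0, (0, 1): 4.0}
s1, s2 = sobol_first_order(pce)
```

For sparse expansions this computation is essentially free once the coefficients are known, which is one reason PC-based surrogates pair naturally with global sensitivity analysis.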

  15. Quantification Methods of Management Skills in Shipping

    Directory of Open Access Journals (Sweden)

    Riana Iren RADU

    2012-04-01

    Full Text Available Romania can not overcome the financial crisis without business growth, without finding opportunities for economic development and without attracting investment into the country. Successful managers find ways to overcome situations of uncertainty. The purpose of this paper is to determine the managerial skills developed by the Romanian fluvial shipping company NAVROM (hereinafter CNFR NAVROM SA, compared with ten other major competitors in the same domain, using financial information of these companies during the years 2005-2010. For carrying out the work it will be used quantification methods of managerial skills to CNFR NAVROM SA Galati, Romania, as example mentioning the analysis of financial performance management based on profitability ratios, net profit margin, suppliers management, turnover.

  16. Band Calculations for Ce Compounds with AuCu$_{3}$-type Crystal Structure on the basis of Dynamical Mean Field Theory I. CePd$_{3}$ and CeRh$_{3}$

    OpenAIRE

    Sakai, Osamu

    2010-01-01

    Band calculations for Ce compounds with the AuCu$_{3}$-type crystal structure were carried out on the basis of dynamical mean field theory (DMFT). The auxiliary impurity problem was solved by a method named NCA$f^{2}$vc (noncrossing approximation including the $f^{2}$ state as a vertex correction). The calculations take into account the crystal-field splitting, the spin-orbit interaction, and the correct exchange process of the $f^{1} \\rightarrow f^{0},f^{2}$ virtual excitat...

  17. Quantification of bronchial dimensions at MDCT using dedicated software

    International Nuclear Information System (INIS)

    This study aimed to assess the feasibility of quantification of bronchial dimensions at MDCT using dedicated software (BronCare). We evaluated the reliability of the software to segment the airways and defined criteria ensuring accurate measurements. BronCare was applied on two successive examinations in 10 mild asthmatic patients. Acquisitions were performed at pneumotachographically controlled lung volume (65% TLC), with reconstructions focused on the right lung base. Five validation criteria were imposed: (1) bronchus type: segmental and subsegmental; (2) lumen area (LA)>4 mm2; (3) bronchus length (Lg) > 7 mm; (4) confidence index - giving the percentage of the bronchus not abutted by a vessel - (CI) >55% for validation of wall area (WA) and (5) a minimum of 10 contiguous cross-sectional images fulfilling the criteria. A complete segmentation procedure on both acquisitions made possible an evaluation of LA and WA in 174/223 (78%) and 171/174 (98%) of bronchi, respectively. The validation criteria were met for 56/69 (81%) and for 16/69 (23%) of segmental bronchi and for 73/102 (72%) and 58/102 (57%) of subsegmental bronchi, for LA and WA, respectively. In conclusion, BronCare is reliable to segment the airways in clinical practice. The proposed criteria seem appropriate to select bronchi candidates for measurement. (orig.)

  18. Quantification of bronchial dimensions at MDCT using dedicated software

    Energy Technology Data Exchange (ETDEWEB)

    Brillet, P.Y. [Universite Leonard de Vinci-Paris XIII, Service de Radiologie, Assistance Publique-Hopitaux de Paris, Hopital Avicenne, Bobigny (France); Fetita, C.I.; Saragaglia, A.; Perchet, D.; Preteux, F. [Institut National des Telecommunications, ARTEMIS Department, Evry (France); Beigelman-Aubry, C.; Grenier, P.A. [Universite Pierre et Marie Curie-Paris VI, Service de Radiologie, Assistance Publique-Hopitaux de Paris, Hopital Pitie-Salpetriere, Paris (France)

    2007-06-15

    This study aimed to assess the feasibility of quantification of bronchial dimensions at MDCT using dedicated software (BronCare). We evaluated the reliability of the software to segment the airways and defined criteria ensuring accurate measurements. BronCare was applied on two successive examinations in 10 mild asthmatic patients. Acquisitions were performed at pneumotachographically controlled lung volume (65% TLC), with reconstructions focused on the right lung base. Five validation criteria were imposed: (1) bronchus type: segmental and subsegmental; (2) lumen area (LA)>4 mm{sup 2}; (3) bronchus length (Lg) > 7 mm; (4) confidence index - giving the percentage of the bronchus not abutted by a vessel - (CI) >55% for validation of wall area (WA) and (5) a minimum of 10 contiguous cross-sectional images fulfilling the criteria. A complete segmentation procedure on both acquisitions made possible an evaluation of LA and WA in 174/223 (78%) and 171/174 (98%) of bronchi, respectively. The validation criteria were met for 56/69 (81%) and for 16/69 (23%) of segmental bronchi and for 73/102 (72%) and 58/102 (57%) of subsegmental bronchi, for LA and WA, respectively. In conclusion, BronCare is reliable to segment the airways in clinical practice. The proposed criteria seem appropriate to select bronchi candidates for measurement. (orig.)

  19. Quantification of benzodiazepines in whole blood and serum.

    Science.gov (United States)

    Dussy, Franz E; Hamberg, Cornelia; Briellmann, Thomas A

    2006-11-01

    A high-performance liquid chromatography method for the determination of benzodiazepines and their metabolites in whole blood and serum using mass spectrometry (MS) and photodiode array (PDA) detection is presented. The two detection types complement each other and provide extensive case-relevant data. The limits of quantification (LOQ) with MS detection lie between 2 and 3 microg/l for the following benzodiazepines and metabolites: 7-amino-flunitrazepam, alprazolam, desalkyl-flurazepam, desmethyl-flunitrazepam, diazepam, flunitrazepam, flurazepam, alpha-hydroxy-midazolam, lorazepam, midazolam, nitrazepam, nordazepam and oxazepam; the LOQ is 5 microg/l for lormetazepam and 6 microg/l for bromazepam. The LOQ of clobazam determined with the PDA detector is 10 microg/l. A convenient approach for determining the measurement uncertainty of the presented method--applicable also to other methods in an accreditation process--is presented. At low concentrations (180 microg/l), it was estimated to be about 15%. Data from 128 cases acquired over 1 year are summarised. PMID:16220317
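The record does not give the LOQ formula used; one common convention (e.g. ICH-style, LOQ = 10*sigma/slope estimated from a calibration line) can be sketched as follows, with illustrative numbers:

```python
import numpy as np

def loq_from_calibration(conc, response):
    """Estimate a limit of quantification as 10*sigma/slope.

    A common convention (e.g. ICH Q2): sigma is the standard deviation
    of the calibration residuals and the slope comes from the fitted
    line. This illustrates the convention, not the cited paper's method.
    """
    conc = np.asarray(conc, float)
    resp = np.asarray(response, float)
    slope, intercept = np.polyfit(conc, resp, 1)
    residuals = resp - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)  # two fitted parameters
    return 10.0 * sigma / slope

conc = [0, 5, 10, 20, 40]           # microg/l (illustrative)
resp = [0.1, 5.2, 9.9, 20.3, 39.8]  # detector response (illustrative)
loq = loq_from_calibration(conc, resp)
```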

  20. Quantum probability theory

    OpenAIRE

    Rédei, Miklós; Summers, Stephen Jeffrey

    2007-01-01

    The mathematics of classical probability theory was subsumed into classical measure theory by Kolmogorov in 1933. Quantum theory as nonclassical probability theory was incorporated into the beginnings of noncommutative measure theory by von Neumann in the early thirties, as well. To precisely this end, von Neumann initiated the study of what are now called von Neumann algebras and, with Murray, made a first classification of such algebras into three types. The nonrelativisti...

  1. Galois theory in bicategories

    CERN Document Server

    Gomez-Torrecillas, Jose

    2007-01-01

    We develop a Galois (descent) theory for comonads within the framework of bicategories. We give generalizations of Beck's theorem and the Joyal-Tierney theorem. Many examples are provided, including classical descent theory, Hopf-Galois theory over Hopf algebras and Hopf algebroids, Galois theory for corings and group-corings, and Morita-Takeuchi theory for corings. As an application we construct a new type of comatrix corings based on (dual) quasi bialgebras.

  2. Quantification of containment response in a probabilistic risk assessment

    International Nuclear Information System (INIS)

    This lecture consists of four parts. They embody the principal aspects of the steps and analyses required to determine: the Plant Damage States; the C-Matrix; and the Release Categories. The four parts addressed are: I. A methodology for the probabilistic quantification of containment response. II. The analysis of containment failure pressure and quantification of uncertainty. III. A probabilistic analysis of containment failure due to hydrogen burning. IV. Determination of source terms and uncertainties. This lecture emphasizes the quantification of uncertainties in containment response as a key ingredient of a PRA (probabilistic risk assessment)

  3. Evidence-based quantification of uncertainties induced via simulation-based modeling

    International Nuclear Information System (INIS)

    The quantification of uncertainties in simulation-based modeling traditionally focuses upon quantifying uncertainties in the parameters input into the model, referred to as parametric uncertainties. Often neglected in such an approach are the uncertainties induced by the modeling process itself. This deficiency is often due to a lack of information regarding the problem or the models considered, which could theoretically be reduced through the introduction of additional data. Because of the nature of this epistemic uncertainty, traditional probabilistic frameworks utilized for the quantification of uncertainties are not necessarily applicable to quantify the uncertainties induced in the modeling process itself. This work develops and utilizes a methodology – incorporating aspects of Dempster–Shafer Theory and Bayesian model averaging – to quantify uncertainties of all forms for simulation-based modeling problems. The approach expands upon classical parametric uncertainty approaches, allowing for the quantification of modeling-induced uncertainties as well, ultimately providing bounds on classical probability without the loss of epistemic generality. The approach is demonstrated on two different simulation-based modeling problems: the computation of the natural frequency of a simple two degree of freedom non-linear spring mass system and the calculation of the flutter velocity coefficient for the AGARD 445.6 wing given a subset of commercially available modeling choices. - Highlights: • Modeling-induced uncertainties are often mishandled or ignored in the literature. • Modeling-induced uncertainties are epistemic in nature. • Probabilistic representations of modeling-induced uncertainties are restrictive. • Evidence theory and Bayesian model averaging are integrated. • Developed approach is applicable for simulation-based modeling problems
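The paper integrates Dempster-Shafer theory with Bayesian model averaging; the Dempster-Shafer ingredient, the rule of combination for two bodies of evidence, can be sketched as follows (focal sets and masses purely illustrative):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions by Dempster's rule.

    m1, m2: dicts mapping frozenset focal elements to masses.
    Intersecting focal elements multiply; empty intersections form
    the conflict K, and the rest is renormalised by 1 - K.
    """
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}

# Two evidence sources over hypotheses {a} and {b}
m1 = {frozenset('a'): 0.6, frozenset('ab'): 0.4}
m2 = {frozenset('a'): 0.5, frozenset('b'): 0.5}
m12 = dempster_combine(m1, m2)
```

Unlike a Bayesian prior, mass assigned to the set {a, b} expresses ignorance between the hypotheses rather than a 50/50 split, which is what makes the framework attractive for epistemic (reducible) uncertainty.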

  4. A logical model for quantification of occupational risk

    International Nuclear Information System (INIS)

    Functional block diagrams (FBDs) and their equivalent event trees are introduced as logical models in the quantification of occupational risks. Although a FBD is similar to an influence diagram or a belief network, it provides a framework for introducing the logic of the model in compact form through the partition of the paths of the equivalent event tree. This is achieved by considering an overall event whose outcomes are the outmost consequences defining the risk under analysis. This event is decomposed into simpler events whose outcome space is partitioned into subsets corresponding to the outcomes of the initial joint event. The simpler events can be further decomposed into simpler events, creating a hierarchy in which the events at a given level (parents) are decomposed into a number of simpler events (children) at the next level of the hierarchy. The partitioning of the outcome space is transferred from level to level through logical relationships corresponding to the logic of the model. Occupational risk is modeled through a general FBD where the undesirable health consequence is decomposed into 'dose' and 'dose/response'; 'dose' is decomposed into 'center event' and 'mitigation'; 'center event' is decomposed into 'initiating event' and 'prevention'. This generic FBD can be transformed into activity-specific FBDs which, together with their equivalent event trees, are used to delineate the various accident sequences that might lead to injury or death. The methodology and the associated algorithms have been computerized in a program with a graphical user interface (GUI) which allows the user to input the functional relationships between parent and child events and the corresponding probabilities for events at the lowest level, and to obtain the corresponding quantified simplified event tree. The methodology is demonstrated with an application to the risk of falling from a mobile ladder. This type of accident has been analyzed as part of the Workgroup Occupational Risk Model (WORM) project in the Netherlands, which aims at the development and quantification of models for a full range of potential risks from accidents in the workplace
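Quantifying the equivalent event tree amounts to multiplying branch probabilities along each accident sequence and summing by consequence. A generic sketch with hypothetical ladder-fall numbers (not WORM project data):

```python
def quantify_event_tree(branches, sequences):
    """Quantify an event tree.

    branches: dict mapping event -> {outcome: probability}.
    sequences: list of (path, consequence), where path maps each
    event appearing on that accident sequence to the outcome taken.
    Returns a dict consequence -> total probability.
    """
    totals = {}
    for path, consequence in sequences:
        p = 1.0
        for event, outcome in path.items():
            p *= branches[event][outcome]
        totals[consequence] = totals.get(consequence, 0.0) + p
    return totals

# Hypothetical tree: initiating event, prevention, mitigation
branches = {
    "fall_initiated": {"yes": 0.01, "no": 0.99},
    "prevention_fails": {"yes": 0.2, "no": 0.8},
    "mitigation_fails": {"yes": 0.1, "no": 0.9},
}
sequences = [
    ({"fall_initiated": "no"}, "no_injury"),
    ({"fall_initiated": "yes", "prevention_fails": "no"}, "no_injury"),
    ({"fall_initiated": "yes", "prevention_fails": "yes",
      "mitigation_fails": "no"}, "injury"),
    ({"fall_initiated": "yes", "prevention_fails": "yes",
      "mitigation_fails": "yes"}, "death"),
]
risk = quantify_event_tree(branches, sequences)
```

Because the sequences partition the outcome space, the consequence probabilities must sum to one, which is a useful sanity check on any hand-built tree.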

  5. Matrix Theory on Non-Orientable Surfaces

    OpenAIRE

    Zwart, Gysbert

    1997-01-01

    We construct the Matrix theory descriptions of M-theory on the Mobius strip and the Klein bottle. In a limit, these provide the matrix string theories for the CHL string and an orbifold of type IIA string theory.

  6. Assessment of molecular recognition element for the quantification of human epidermal growth factor using surface plasmon resonance

    Scientific Electronic Library Online (English)

    Ira Amira, Rosti; Ramakrishnan Nagasundara, Ramanan; Tau Chuan, Ling; Arbakariya B, Ariff.

    2013-11-15

    Full Text Available Background: A method for the selection of a suitable molecular recognition element (MRE) for the quantification of human epidermal growth factor (hEGF) using surface plasmon resonance (SPR) is presented. Two types of hEGF antibody, monoclonal and polyclonal, were immobilized on the surface of a chip and [...] validated for their characteristics and performance in the quantification of hEGF. Validation of this analytical procedure demonstrated the stability and suitability of the antibodies for the quantification of the target protein. Results: Specificity, accuracy and precision for all samples were within acceptable limits for both antibodies. The affinity and kinetic constants of antibody-hEGF binding were evaluated using a 1:1 Langmuir interaction model. The model fitted all binding responses well simultaneously. The polyclonal antibody (pAb) has a higher affinity (KD = 7.39e-10 M) than the monoclonal antibody (mAb; KD = 9.54e-9 M). Further evaluation of the kinetic constants demonstrated that the pAb has a faster reaction rate during sample injection, a slower dissociation rate during buffer injection and a higher saturation level than the mAb. In addition, the pAb has a longer shelf life and sustains a greater number of assay cycles. Conclusions: Thus, the pAb was more suitable as a stable MRE for further quantification work, considering the kinetics, binding rate and shelf-life assessments.
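The 1:1 Langmuir interaction model mentioned above has a closed-form association-phase response; a sketch with rate constants chosen so that KD is of the order reported for the pAb (all other values illustrative):

```python
import math

def langmuir_association(t, C, ka, kd, Rmax):
    """SPR response during analyte injection for a 1:1 Langmuir model.

    R(t) = Req * (1 - exp(-(ka*C + kd)*t)), with equilibrium response
    Req = Rmax * C / (C + KD) and dissociation constant KD = kd / ka.
    """
    KD = kd / ka
    Req = Rmax * C / (C + KD)
    return Req * (1.0 - math.exp(-(ka * C + kd) * t))

# Illustrative constants: ka in 1/(M*s), kd in 1/s, Rmax in RU, C in M
ka, kd, Rmax, C = 1.0e6, 7.4e-4, 100.0, 1.0e-8
KD = kd / ka
R_long = langmuir_association(300.0, C, ka, kd, Rmax)  # near saturation
```

Fitting ka and kd globally to association and dissociation phases at several concentrations is how the reported KD values are typically obtained.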

  7. Identification and quantification of selected chemicals in laser pyrolysis products of mammalian tissues

    Science.gov (United States)

    Spleiss, Martin; Weber, Lothar W.; Meier, Thomas H.; Treffler, Bernd

    1995-01-01

    Liver and muscle tissue were irradiated with a surgical CO2 laser. The prefiltered fumes were adsorbed on different sorbents (activated charcoal type NIOSH and Carbotrap) and desorbed with different solvents (carbon disulphide and acetone). Analysis was done by gas chromatography/mass spectrometry. An updated list of identified substances is shown. Typical Maillard reaction products known from warmed-over flavour, such as aldehydes, aromatics, and heterocyclic and sulphur compounds, were detected. Quantification of some toxicologically relevant substances is presented. The amounts of these substances are given in relation to the laser parameters and the different tissues for further toxicological assessment.

  8. Detection and quantification of hogwash oil in soybean oils using low-cost spectroscopy and chemometrics

    Science.gov (United States)

    Mignani, A. G.; Ciaccheri, L.; Mencaglia, A. A.; Cichelli, A.; Xing, J.; Yang, X.; Sun, W.; Yuan, L.

    2013-05-01

    This paper presents the detection and quantification of hogwash oil in soybean oils by means of absorption spectroscopy. Three types of soybean oil were adulterated with different concentrations of hogwash oil. The spectra were measured in the visible band using a white LED and a low-cost spectrometer. The measured spectra were processed by means of multivariate analysis to detect the adulteration and, for each soybean oil, to quantify the adulterant concentration. The visible spectra were then sliced into two bands to model a simple setup made of only two LEDs. The successful results indicate the potential for implementing a smartphone-compatible device for self-assessment of soybean oil quality.
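The two-LED reduction amounts to regressing adulterant concentration on absorbance in two bands. A minimal least-squares sketch on synthetic calibration data (all numbers illustrative, not from the study):

```python
import numpy as np

def fit_two_band_model(A1, A2, conc):
    """Least-squares model conc ~ b0 + b1*A1 + b2*A2.

    A1, A2: absorbances in the two LED bands; conc: known adulterant
    fractions of the calibration set. Returns the coefficient vector.
    """
    X = np.column_stack([np.ones_like(A1), A1, A2])
    beta, *_ = np.linalg.lstsq(X, conc, rcond=None)
    return beta

def predict(beta, A1, A2):
    return beta[0] + beta[1] * A1 + beta[2] * A2

# Synthetic calibration: absorbance varies linearly with adulteration
conc = np.array([0.0, 0.1, 0.2, 0.3, 0.4])
A1 = 0.50 + 0.8 * conc
A2 = 0.30 - 0.2 * conc
beta = fit_two_band_model(A1, A2, conc)
# Predict an unseen sample at 25% adulteration
pred = predict(beta, 0.50 + 0.8 * 0.25, 0.30 - 0.2 * 0.25)
```

The paper's full-spectrum multivariate analysis would use many wavelengths (e.g. PLS regression); the two-band version trades accuracy for hardware simplicity.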

  9. The impact of respiratory motion on tumor quantification and delineation in static PET/CT imaging

    OpenAIRE

    Liu, Chi; Pierce, Larry A; Alessio, Adam M; Kinahan, Paul E

    2009-01-01

    Our aim is to investigate the impact of respiratory motion on tumor quantification and delineation in static PET/CT imaging using a population of patient respiratory traces. A total of 1295 respiratory traces acquired during whole body PET/CT imaging were classified into three types according to the qualitative shape of their signal histograms. Each trace was scaled to three diaphragm motion amplitudes (6 mm, 11 mm and 16 mm) to drive a whole body PET/CT computer simulation that was validated...

  10. Exaggerated psychophysiological reactivity: issues in quantification and reliability.

    Science.gov (United States)

    Seraganian, P; Hanley, J A; Hollander, B J; Roskies, E; Smilga, C; Martin, N D; Collu, R; Oseasohn, R

    1985-01-01

    Marked physiological reactivity to challenging mental tasks has been associated with elevated risk for, as well as the presence of, coronary heart disease. However, little systematic enquiry into the reliability and quantification of such exaggerated reactivity has emerged. Subjects were 32 male managerial employees, ranging in age from 22 to 56 yr, who satisfied the following criteria: no history or current signs of heart disease, presence of the Type A behavior pattern as revealed by the Structured Interview, and an increase during initial psychosocial stress testing of at least 25% over baseline in at least three out of five psychophysiological indices. Heart rate, systolic blood pressure, diastolic blood pressure, plasma epinephrine and plasma norepinephrine levels were monitored while challenging mental tasks were performed in three sessions (screening, pretraining and posttraining) spaced several weeks apart. Psychophysiological reactivity during the tasks emerged as a consistent trait. For all five measures, change scores from baseline during the screening session were significantly correlated with change scores during the pretraining session. Moreover, the magnitude of the change scores was similar in the screening and pretraining sessions. Analysis of cross-correlations within and between indices provided little support for the use of data transformations such as residual scores or analysis of covariance. Finally, on four out of five measures, the challenging tasks were found to be comparable in the degree of reactivity elicited. These findings suggest that, for selected Type A men, exaggerated psychophysiological reactivity occurs reliably when monitored with multiple indices, appears insensitive to the mere passage of time, and can be uniformly elicited by a variety of tasks. PMID:4057127

  11. Uncertainty Quantification for Production Navier-Stokes Solvers Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The uncertainty quantification methods developed under this program are designed for use with current state-of-the-art flow solvers developed by and in use at NASA....

  12. Higher Order Quasi Monte-Carlo Integration in Uncertainty Quantification

    OpenAIRE

    Dick, Josef; Gia, Quoc Thong Le; Schwab, Christoph

    2014-01-01

    We review recent results on dimension-robust higher order convergence rates of Quasi-Monte Carlo Petrov-Galerkin approximations for response functionals of infinite-dimensional, parametric operator equations which arise in computational uncertainty quantification.

  13. Experimental quantification of the tactile spatial responsivity of human cornea.

    Science.gov (United States)

    Beiderman, Yevgeny; Belkin, Michael; Rotenstreich, Ygal; Zalevsky, Zeev

    2015-01-01

    We present the first experimental quantification of the tactile spatial responsivity of the cornea and we teach a subject to recognize spatial tactile shapes that are stimulated on their cornea. PMID:26158088

  14. The parallel reaction monitoring method contributes to a highly sensitive polyubiquitin chain quantification

    Energy Technology Data Exchange (ETDEWEB)

    Tsuchiya, Hikaru; Tanaka, Keiji, E-mail: tanaka-kj@igakuken.or.jp; Saeki, Yasushi, E-mail: saeki-ys@igakuken.or.jp

    2013-06-28

    Highlights: •The parallel reaction monitoring method was applied to ubiquitin quantification. •The ubiquitin PRM method is highly sensitive even in biological samples. •Using the method, we revealed that Ufd4 assembles the K29-linked ubiquitin chain. -- Abstract: Ubiquitylation is an essential posttranslational protein modification that is implicated in a diverse array of cellular functions. Although cells contain eight structurally distinct types of polyubiquitin chains, the detailed function of several chain types, including K29-linked chains, has remained largely unclear. Current mass spectrometry (MS)-based quantification methods are highly inefficient for low-abundance atypical chains, such as K29- and M1-linked chains, in complex mixtures that typically contain highly abundant proteins. In this study, we applied parallel reaction monitoring (PRM), a quantitative, high-resolution MS method, to quantify ubiquitin chains. The ubiquitin PRM method allows us to quantify 100 attomole amounts of all possible ubiquitin chains in cell extracts. Furthermore, we quantified ubiquitylation levels of ubiquitin-proline-β-galactosidase (Ub-P-βgal), a historically known model substrate of the ubiquitin fusion degradation (UFD) pathway. In wild-type cells, Ub-P-βgal is modified with ubiquitin chains consisting of 21% K29- and 78% K48-linked chains. In contrast, K29-linked chains are not detected in UFD4 knockout cells, suggesting that Ufd4 assembles the K29-linked ubiquitin chain(s) on Ub-P-βgal in vivo. Thus, ubiquitin PRM is a novel, useful, quantitative method for analyzing the highly complicated ubiquitin system.

  15. Quantification is Incapable of Directly Enhancing Life Quality through Healthcare

    OpenAIRE

    Peter A. Moskovitz

    2013-01-01

    Quantification, the measurement and representational modeling of objects, events and relationships, cannot enhance life quality, not directly. Illustrative is Sydenham’s model of disease (Sydenham, 1848-1850) and its spawn: the checklist quantification that is contained in the DSM (Diagnostic and Statistical Manual of Mental Disorders, now in its fifth edition) and ICD (International Classification of Diseases, now in its ninth edition). The use of these diagnostic catalogs is incapable of di...

  16. A Micropillar Compression Methodology for Ductile Damage Quantification :

    OpenAIRE

    Tasan, CC (Cem); Hoefnagels, JPM (Johan); Geers, MGD (Marc)

    2012-01-01

    Microstructural damage evolution is reported to influence significantly the failures of new high-strength alloys. Its accurate quantification is, therefore, critical for (1) microstructure optimization and (2) continuum damage models to predict failures of these materials. As existing methodologies do not fulfill the requirements, there is an active search for an accurate damage quantification methodology. In this article, a new, micropillar, compression-based methodology is presented, whereb...

  17. Quantification Model for Estimating Temperature Field Distributions of Apple Fruit

    OpenAIRE

    ZHANG Min; Yang, Le; Zhao, Huizhong; Zhang, Leijie; Zhong, Zhiyou; Liu, Yanling; Chen, Jianhua

    2010-01-01

    A quantification model of transient heat conduction was provided to simulate the apple fruit temperature distribution during the cooling process. The model was based on the energy variation at different points of the apple fruit. It took into account the heat exchange of a representative elemental volume, metabolic heat and external heat. The following conclusions could be obtained: first, the quantification model can satisfactorily describe the tendency of the apple fruit temperature distribution in the cooling...
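The record does not reproduce the model equations; a lumped-capacitance caricature of the energy balance (convective loss to the cooling air plus a metabolic heat source; all parameter values illustrative, not from the paper) can be sketched as:

```python
def cool_apple(T0, T_air, minutes, h=15.0, radius=0.035,
               rho=840.0, cp=3600.0, q_met=20.0):
    """Lumped-capacitance energy balance for a cooling apple.

    dT/dt = -(h*A)/(rho*V*cp) * (T - T_air) + q_met/(rho*cp)
    Convective loss plus a small metabolic (respiration) heat source,
    integrated with explicit Euler steps of one second. h in W/(m^2 K),
    rho in kg/m^3, cp in J/(kg K), q_met in W/m^3 -- all illustrative.
    """
    A_over_V = 3.0 / radius  # sphere: surface/volume = 3/r
    T = T0
    for _ in range(int(minutes * 60)):
        dTdt = (-(h * A_over_V) / (rho * cp) * (T - T_air)
                + q_met / (rho * cp))
        T += dTdt * 1.0  # 1 s explicit Euler step
    return T

T_end = cool_apple(T0=25.0, T_air=2.0, minutes=120)
```

The paper's model resolves the temperature field inside the fruit (a distributed conduction problem); the lumped version above only captures the overall cooling tendency toward the air temperature.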

  18. Diversity, Distribution and Quantification of Antibiotic Resistance Genes in Goat and Lamb Slaughterhouse Surfaces and Meat Products

    OpenAIRE

    Lavilla Lerma, Leyre; Benomar, Nabil; Knapp, Charles W; Correa Galeote, David; Gálvez, Antonio; Abriouel, Hikmate

    2014-01-01

    The distribution and quantification of tetracycline, sulfonamide and beta-lactam resistance genes were assessed in slaughterhouse zones throughout the meat production chain and in the meat products; this is the first study to quantitatively monitor antibiotic resistance genes (ARG) in a goat and lamb slaughterhouse using a culture-independent approach, since most previous studies focused on individual bacterial species and their specific resistance types. Quantitative PCR (qPCR) revealed a high ...

  19. Uncertainty Quantification with Applications to Engineering Problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele

    2015-01-01

    The systematic quantification of the uncertainties affecting dynamical systems and the characterization of the uncertainty of their outcomes is critical for engineering design and analysis, where risks must be reduced as much as possible. Uncertainties stem naturally from our limitations in measurements, predictions and manufacturing, and we can say that any dynamical system used in engineering is subject to some of these uncertainties. The first part of this work presents an overview of the mathematical framework used in Uncertainty Quantification (UQ) analysis and introduces the spectral tensor-train (STT) decomposition, a novel high-order method for the effective propagation of uncertainties which aims at providing an exponential convergence rate while tackling the curse of dimensionality. The curse of dimensionality is a problem that afflicts many methods based on meta-models, for which the computational cost increases exponentially with the number of inputs of the approximated function – which we will call dimension in the following. The STT-decomposition is based on the Polynomial Chaos (PC) approximation and the low-rank decomposition of the function describing the Quantity of Interest of the considered problem. The low-rank decomposition is obtained through the discrete tensor-train decomposition, which is constructed using an optimization algorithm for the selection of the relevant points on which the function needs to be evaluated. The selection of these points is informed by the approximated function and thus it is able to adapt to its features. The number of function evaluations needed for the construction grows only linearly with the dimension and quadratically with the rank. In this work we will present and use the functional counterpart of this low-rank decomposition and, after proving some auxiliary properties, we will apply PC on it, obtaining the STT-decomposition. 
This will allow the decoupling of each dimension, leading to a much cheaper construction of the PC surrogate. In the associated paper, the capabilities of the STT-decomposition are checked on commonly used test functions and on an elliptic problem with random inputs. This work also presents three active research directions aimed at improving the efficiency of the STT-decomposition: we propose new strategies for solving the ordering problem suffered by the tensor-train decomposition, for computing better estimates with respect to the norms usually employed in UQ, and for the anisotropic adaptivity of the method. The second part of this work presents engineering applications of the UQ framework. Both applications are characterized by functions whose evaluation is computationally expensive, and thus the UQ analysis of the associated systems benefits greatly from methods that require few function evaluations. We first consider the propagation of uncertainty and the sensitivity analysis of the non-linear dynamics of railway vehicles with suspension components whose characteristics are uncertain. These analyses are carried out mostly with PC methods, resorting to random sampling methods for comparison and when strictly necessary. The second application of the UQ framework is the propagation of the uncertainties entering a fully non-linear and dispersive model of water waves. This computationally challenging task is tackled by adopting state-of-the-art software for its numerical solution together with efficient PC methods. The aim of this study is the construction of stochastic benchmarks on which UQ methodologies can be tested before being applied to full-scale problems, where efficient methods are a necessity given today's computational resources. An additional outcome of this work is a set of freely available Python modules for Uncertainty Quantification, which are listed and described in the appendix.
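The STT-decomposition builds on a Polynomial Chaos (PC) basis combined with a low-rank tensor-train format. As a much simpler illustration of the PC ingredient alone, the following one-dimensional sketch (function names and the test function are ours, not from the thesis) projects a quantity of interest of a uniform random input onto Legendre polynomials and reads the mean and variance directly off the coefficients:

```python
import numpy as np
from numpy.polynomial.legendre import leggauss, legval

def pce_legendre(f, degree=4, nquad=8):
    """Non-intrusive polynomial chaos expansion of f(X) for X ~ Uniform(-1, 1),
    projecting onto Legendre polynomials with Gauss-Legendre quadrature."""
    x, w = leggauss(nquad)                       # nodes and weights on [-1, 1]
    coeffs = np.zeros(degree + 1)
    for k in range(degree + 1):
        pk = legval(x, [0] * k + [1])            # P_k at the quadrature nodes
        # c_k = <f, P_k> / <P_k, P_k> under the uniform density 1/2 on [-1, 1]
        coeffs[k] = (2 * k + 1) / 2.0 * np.sum(w * f(x) * pk)
    mean = coeffs[0]                             # E[f] is the constant term
    # Var[f] = sum_{k>0} c_k^2 * E[P_k^2], with E[P_k^2] = 1 / (2k + 1)
    var = sum(coeffs[k] ** 2 / (2 * k + 1) for k in range(1, degree + 1))
    return coeffs, mean, var

# f(X) = X^2 has E[f] = 1/3 and Var[f] = 4/45 for X ~ Uniform(-1, 1)
coeffs, mean, var = pce_legendre(lambda x: x ** 2)
```

The tensor-train structure enters only in the multivariate case: the coefficient tensor is then stored in low-rank format so that, as the abstract explains, the number of function evaluations grows linearly with the dimension.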

  20. Volumetric motion quantification by 3D tissue phase mapped CMR

    Directory of Open Access Journals (Sweden)

    Lutz Anja

    2012-10-01

Full Text Available Abstract Background The objective of this study was the quantification of myocardial motion from 3D tissue phase mapped (TPM) CMR. Recent work on myocardial motion quantification by TPM has been focussed on multi-slice 2D acquisitions, thus excluding motion information from large regions of the left ventricle. Volumetric motion assessment appears an important next step towards the understanding of volumetric myocardial motion and hence may further improve diagnosis and treatment in patients with myocardial motion abnormalities. Methods Volumetric motion quantification of the complete left ventricle was performed in 12 healthy volunteers and two patients applying a black-blood 3D TPM sequence. The resulting motion field was analysed for motion pattern differences between apical and basal locations as well as for asynchronous motion patterns between different myocardial segments in one or more slices. Motion quantification included velocity, torsion, rotation angle and strain-derived parameters. Results All investigated motion quantification parameters could be calculated from the 3D-TPM data. Parameters quantifying hypokinetic or asynchronous motion demonstrated differences between motion-impaired and healthy myocardium. Conclusions 3D-TPM enables the gapless volumetric quantification of motion abnormalities of the left ventricle, which can be applied in future applications as additional information to provide a more detailed analysis of left ventricular function.

  1. Dempster-Shafer theory and connections to information theory

    Science.gov (United States)

    Peri, Joseph S. J.

    2013-05-01

The Dempster-Shafer theory is founded on probability theory. The entire machinery of probability theory, and that of measure theory, is at one's disposal for the understanding and the extension of the Dempster-Shafer theory. It is well known that information theory is also founded on probability theory. Claude Shannon developed the basic concepts of the theory in the 1940s and demonstrated their utility in communications and coding. Shannonian information theory is not, however, the only type of information theory. In the 1960s and 1970s, further developments in this field were made by French and Italian mathematicians. They developed information theory axiomatically, and discovered not only the Wiener-Shannon composition law, but also the hyperbolic law and the Inf-law. The objective of this paper is to demonstrate the mathematical connections between the Dempster-Shafer theory and the various types of information theory. A simple engineering example will be used to demonstrate the utility of the concepts.
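The connection rests on the standard machinery of belief functions. As a concrete reference point, here is a minimal sketch of Dempster's rule of combination (the two mass assignments are our own illustrative values, not from the paper):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic mass assignments (dicts from frozenset focal
    elements to masses summing to 1) with Dempster's rule, renormalizing
    away the conflicting mass."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb                  # mass landing on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: the sources are incompatible")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

# two sources of evidence over the frame {"a", "b"}
m1 = {frozenset({"a"}): 0.6, frozenset({"a", "b"}): 0.4}
m2 = {frozenset({"a"}): 0.5, frozenset({"b"}): 0.3, frozenset({"a", "b"}): 0.2}
combined = dempster_combine(m1, m2)
```

The combined masses again sum to one; the normalization by 1 minus the conflict is what distinguishes Dempster's rule from a plain product of measures.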

  2. Electrophoresis Gel Quantification with a Flatbed Scanner and Versatile Lighting from a Screen Scavenged from a Liquid Crystal Display (LCD) Monitor

    Science.gov (United States)

    Yeung, Brendan; Ng, Tuck Wah; Tan, Han Yen; Liew, Oi Wah

    2012-01-01

    The use of different types of stains in the quantification of proteins separated on gels using electrophoresis offers the capability of deriving good outcomes in terms of linear dynamic range, sensitivity, and compatibility with specific proteins. An inexpensive, simple, and versatile lighting system based on liquid crystal display backlighting is…

  3. Identification and Quantification of Carbonate Species Using Rock-Eval Pyrolysis

    Directory of Open Access Journals (Sweden)

    Pillot D.

    2013-03-01

Full Text Available This paper presents a new reliable and rapid method to characterise and quantify carbonates in solid samples, based on monitoring the CO2 flux emitted by the progressive thermal decomposition of carbonates during programmed heating. The different destabilisation peaks allow the different types of carbonates present in the analysed sample to be determined. The quantification of each peak gives the respective proportions of these different types of carbonates in the sample. In addition to the chosen procedure presented in this paper, using a standard Rock-Eval 6 pyrolyser, characteristic calibration profiles are also presented for the most common carbonates in nature. This method should allow different types of application in different disciplines, either academic or industrial.

  4. Superspace conformal field theory

    Energy Technology Data Exchange (ETDEWEB)

    Quella, Thomas [Koeln Univ. (Germany). Inst. fuer Theoretische Physik; Schomerus, Volker [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2013-07-15

    Conformal sigma models and WZW models on coset superspaces provide important examples of logarithmic conformal field theories. They possess many applications to problems in string and condensed matter theory. We review recent results and developments, including the general construction of WZW models on type I supergroups, the classification of conformal sigma models and their embedding into string theory.

  5. Superspace conformal field theory

    International Nuclear Information System (INIS)

    Conformal sigma models and WZW models on coset superspaces provide important examples of logarithmic conformal field theories. They possess many applications to problems in string and condensed matter theory. We review recent results and developments, including the general construction of WZW models on type I supergroups, the classification of conformal sigma models and their embedding into string theory.

  6. D. M. Armstrong on the Identity Theory of Mind

    OpenAIRE

    Shanjendu Nath

    2013-01-01

    The Identity theory of mind occupies an important place in the history of philosophy. This theory is one of the important representations of the materialistic philosophy. This theory is known as "Materialist Monist Theory of Mind". Sometimes it is called "Type Physicalism", "Type Identity" or "Type-Type Theory" or "Mind-Brain Identity Theory". This theory appears in the philosophical domain as a reaction to the failure of Behaviourism. A number of philosophers developed this theory and among...

  7. Type I background fields in terms of type IIB ones

    OpenAIRE

    B. Nikolic; Sazdovic, B.

    2008-01-01

We choose boundary conditions for open IIB superstring theory which preserve N=1 SUSY. The explicit solution of the boundary conditions yields an effective theory which is symmetric under the world-sheet parity transformation $\Omega:\sigma\to-\sigma$. We recognize the effective theory as closed type I superstring theory. Its background fields, besides the known $\Omega$-even fields of the initial IIB theory, contain improvements quadratic in the $\Omega$-odd ones.

  8. Mathematical Models in Schema Theory

    CERN Document Server

    Burgin, M

    2005-01-01

In this paper, a mathematical schema theory is developed. This theory has three roots: brain theory schemas, grid automata, and block-schemas. In Section 2 of this paper, elements of the theory of grid automata necessary for the mathematical schema theory are presented. In Section 3, elements of brain theory necessary for the mathematical schema theory are presented. In Section 4, other types of schemas are considered. In Section 5, the mathematical schema theory is developed. The achieved level of schema representation allows one to model by mathematical tools virtually any type of schema considered before, including schemas in neurophysiology, psychology, computer science, Internet technology, databases, logic, and mathematics.

  9. Uncertainty Quantification for Cargo Hold Fires

    CERN Document Server

    DeGennaro, Anthony M; Martinelli, Luigi; Rowley, Clarence W

    2015-01-01

The purpose of this study is twofold: first, to introduce the application of high-order discontinuous Galerkin methods to buoyancy-driven cargo hold fire simulations; and second, to explore statistical variation in the fluid dynamics of a cargo hold fire given parameterized uncertainty in the fire source location and temperature. Cargo hold fires represent a class of problems that require highly accurate computational methods to simulate faithfully. Hence, we use an in-house discontinuous Galerkin code to treat these flows. Cargo hold fires also exhibit a large amount of uncertainty with respect to the boundary conditions. Thus, the second aim of this paper is to quantify the resulting uncertainty in the flow, using tools from the uncertainty quantification community to ensure that our efforts require a minimal number of simulations. We expect that the results of this study will provide statistical insight into the effects of fire location and temperature on cargo fires, and also assist in the optimization of f...

  10. Quantification of biological aging in young adults

    Science.gov (United States)

    Belsky, Daniel W.; Caspi, Avshalom; Houts, Renate; Cohen, Harvey J.; Corcoran, David L.; Danese, Andrea; Harrington, HonaLee; Israel, Salomon; Levine, Morgan E.; Schaefer, Jonathan D.; Sugden, Karen; Williams, Ben; Yashin, Anatoli I.; Poulton, Richie; Moffitt, Terrie E.

    2015-01-01

    Antiaging therapies show promise in model organism research. Translation to humans is needed to address the challenges of an aging global population. Interventions to slow human aging will need to be applied to still-young individuals. However, most human aging research examines older adults, many with chronic disease. As a result, little is known about aging in young humans. We studied aging in 954 young humans, the Dunedin Study birth cohort, tracking multiple biomarkers across three time points spanning their third and fourth decades of life. We developed and validated two methods by which aging can be measured in young adults, one cross-sectional and one longitudinal. Our longitudinal measure allows quantification of the pace of coordinated physiological deterioration across multiple organ systems (e.g., pulmonary, periodontal, cardiovascular, renal, hepatic, and immune function). We applied these methods to assess biological aging in young humans who had not yet developed age-related diseases. Young individuals of the same chronological age varied in their “biological aging” (declining integrity of multiple organ systems). Already, before midlife, individuals who were aging more rapidly were less physically able, showed cognitive decline and brain aging, self-reported worse health, and looked older. Measured biological aging in young adults can be used to identify causes of aging and evaluate rejuvenation therapies. PMID:26150497

  11. Quantification of risk in medical procedures

    International Nuclear Information System (INIS)

    In most medical procedures, the benefit appears likely to exceed the risk by so much that detailed quantification of the risk is unnecessary. In some instances, however, it is important to attempt to estimate and compare the likely amount of the risk and of the benefit, to determine whether, or when, the use of the procedure is justified. This need arises in the use of radiological screening programmes for the early diagnosis of certain forms of cancer, to assess the age above which such a programme would save more lives by making early diagnoses than it would lose by inducing cancer in the tissues irradiated. It is also informative to estimate the levels of risk that might be involved in research studies involving radiation exposure. Conventional diagnostic and therapeutic procedures entail risks of fatality ranging over at least four orders of magnitude, and clearly relate to the urgency of the situations in which the procedures need to be used. Estimates of such risks are reviewed. (author)

  12. Quantification of moving target cyber defenses

    Science.gov (United States)

    Farris, Katheryn A.; Cybenko, George

    2015-05-01

Current network and information systems are static, making it simple for attackers to maintain an advantage. Adaptive defenses, such as Moving Target Defenses (MTD), have been developed as potential "game-changers" in an effort to increase the attacker's workload. With many new methods being developed, it is difficult to accurately quantify and compare their overall costs and effectiveness. This paper compares the tradeoffs between current approaches to the quantification of MTDs. We present results from an expert opinion survey on quantifying the overall effectiveness, upfront and operating costs of a select set of MTD techniques. We find that gathering informed scientific opinions can be advantageous for evaluating such new technologies, as it offers a more comprehensive assessment. We end by presenting a coarse ordering of a set of MTD techniques from most to least dominant. We found that seven out of 23 methods rank as the more dominant techniques, five of which are address space layout randomization or instruction set randomization techniques. The remaining two techniques are applicable to software and computer platforms. Among the techniques that performed the worst are those primarily aimed at network randomization.

  13. Perfusion quantification using Gaussian process deconvolution.

    DEFF Research Database (Denmark)

    Andersen, I K; Szymkowiak, A

    2002-01-01

    The quantification of perfusion using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residual impulse response function (IRF). In this work, a method using the Gaussian process for deconvolution (GPD) is proposed. The fact that the IRF is smooth is incorporated as a constraint in the method. The GPD method, which automatically estimates the noise level in each voxel, has the advantage that model parameters are optimized automatically. The GPD is compared to singular value decomposition (SVD) using a common threshold for the singular values, and to SVD using a threshold optimized according to the noise level in each voxel. The comparison is carried out using artificial data as well as data from healthy volunteers. It is shown that GPD is comparable to SVD with a variable optimized threshold when determining the maximum of the IRF, which is directly related to the perfusion. GPD provides a better estimate of the entire IRF. As the signal-to-noise ratio (SNR) increases or the time resolution of the measurements increases, GPD is shown to be superior to SVD. This is also found for large distribution volumes.
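The GPD method itself involves Gaussian-process machinery beyond the scope of a snippet, but the SVD baseline it is compared against can be sketched compactly (variable names and the synthetic curves below are illustrative assumptions, not the paper's data):

```python
import numpy as np

def svd_deconvolve(aif, conc, dt, rel_threshold=0.2):
    """Estimate the residue impulse response R(t) from the tissue curve
    C(t) = dt * (AIF * R)(t) by truncated-SVD inversion of the lower-
    triangular convolution matrix; singular values below rel_threshold
    times the largest one are discarded (the standard DSC-MRI approach)."""
    n = len(aif)
    A = dt * np.array([[aif[i - j] if i >= j else 0.0
                        for j in range(n)] for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > rel_threshold * s[0], 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ conc))

# noiseless sanity check: recover an exponential residue function exactly
n, dt = 64, 0.5
t = np.arange(n) * dt
aif = np.exp(-t)                                 # toy arterial input function
r_true = np.exp(-0.3 * t)                        # residue function, R(0) = 1
A = dt * np.array([[aif[i - j] if i >= j else 0.0
                    for j in range(n)] for i in range(n)])
r_est = svd_deconvolve(aif, A @ r_true, dt, rel_threshold=1e-12)
```

With noisy data the threshold trades bias against noise amplification, which is precisely the tuning that the abstract's voxel-wise optimized threshold, and the GPD method itself, aim to automate.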

  14. Tissue quantification for development of pediatric phantom

    International Nuclear Information System (INIS)

The optimization of the risk-benefit ratio is a major concern in pediatric radiology, due to the greater vulnerability of children to the late somatic effects and genetic effects of radiation exposure compared to adults. In Brazil, it is estimated that head trauma accounts for 18% of deaths in the age group between 1-5 years, and the radiograph is the primary diagnostic test for the detection of skull fracture. Knowing that image quality is essential to ensure the identification of anatomical structures and to minimize errors in diagnostic interpretation, this work proposed the development and construction of homogeneous skull phantoms for the 1-5 year age group. The construction of the homogeneous phantoms was performed using the classification and quantification of the tissues present in the skull of pediatric patients. In this procedure, computational algorithms written in Matlab were used to quantify the distinct biological tissues present in the anatomical regions studied, using retrospective CT scan images. Preliminary data obtained from measurements show that, between the ages of 1-5 years, a pediatric skull region with an average anteroposterior diameter of 145.73 ± 2.97 mm can be represented by 92.34 ± 5.22 mm of lucite and 1.75 ± 0.21 mm of aluminum plates in a PEP (patient equivalent phantom) arrangement. After their construction, the phantoms will be used for image and dose optimization in pediatric computed radiography examination protocols.

  15. Cross recurrence quantification for cover song identification

    International Nuclear Information System (INIS)

    There is growing evidence that nonlinear time series analysis techniques can be used to successfully characterize, classify, or process signals derived from real-world dynamics even though these are not necessarily deterministic and stationary. In the present study, we proceed in this direction by addressing an important problem our modern society is facing, the automatic classification of digital information. In particular, we address the automatic identification of cover songs, i.e. alternative renditions of a previously recorded musical piece. For this purpose, we here propose a recurrence quantification analysis measure that allows the tracking of potentially curved and disrupted traces in cross recurrence plots (CRPs). We apply this measure to CRPs constructed from the state space representation of musical descriptor time series extracted from the raw audio signal. We show that our method identifies cover songs with a higher accuracy as compared to previously published techniques. Beyond the particular application proposed here, we discuss how our approach can be useful for the characterization of a variety of signals from different scientific disciplines. We study coupled Roessler dynamics with stochastically modulated mean frequencies as one concrete example to illustrate this point.
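As a reference for the underlying construction, here is a minimal sketch of a cross recurrence plot built from two scalar series via time-delay embedding (parameters are illustrative; the paper's measure for tracking curved and disrupted traces operates on top of such a matrix):

```python
import numpy as np

def embed(series, dim, tau):
    """Time-delay embedding of a scalar series into dim-dimensional states."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

def cross_recurrence_plot(x, y, dim=2, tau=1, eps=0.1):
    """CR[i, j] = 1 where embedded state i of x lies within eps of state j of y."""
    X = embed(np.asarray(x, float), dim, tau)
    Y = embed(np.asarray(y, float), dim, tau)
    dists = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
    return (dists < eps).astype(int)

# a series compared against itself yields a fully recurrent main diagonal
t = np.linspace(0, 8 * np.pi, 200)
CR = cross_recurrence_plot(np.sin(t), np.sin(t), dim=2, tau=3, eps=0.05)
```

For cover song identification, the two embedded series would be musical descriptor time series from the two recordings, and tempo differences between renditions are what bend the diagonal traces away from 45 degrees.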

  16. The effects of metal ion PCR inhibitors on results obtained with the Quantifiler(®) Human DNA Quantification Kit.

    Science.gov (United States)

    Combs, Laura Gaydosh; Warren, Joseph E; Huynh, Vivian; Castaneda, Joanna; Golden, Teresa D; Roby, Rhonda K

    2015-11-01

    Forensic DNA samples may include the presence of PCR inhibitors, even after extraction and purification. Studies have demonstrated that metal ions, co-purified at specific concentrations, inhibit DNA amplifications. Metal ions are endogenous to sample types, such as bone, and can be introduced from environmental sources. In order to examine the effect of metal ions as PCR inhibitors during quantitative real-time PCR, 2800M DNA was treated with 0.0025-18.750mM concentrations of aluminum, calcium, copper, iron, nickel, and lead. DNA samples, both untreated and metal-treated, were quantified using the Quantifiler(®) Human DNA Quantification Kit. Quantification cycle (Cq) values for the Quantifiler(®) Human DNA and internal PCR control (IPC) assays were measured and the estimated concentrations of human DNA were obtained. Comparisons were conducted between metal-treated and control DNA samples to determine the accuracy of the quantification estimates and to test the efficacy of the IPC inhibition detection. This kit is most resistant to the presence of calcium as compared to all metals tested; the maximum concentration tested does not affect the amplification of the IPC or quantification of the sample. This kit is most sensitive to the presence of aluminum; concentrations greater than 0.0750mM negatively affected the quantification, although the IPC assay accurately assessed the presence of PCR inhibition. The Quantifiler(®) Human DNA Quantification Kit accurately quantifies human DNA in the presence of 0.5000mM copper, iron, nickel, and lead; however, the IPC does not indicate the presence of PCR inhibition at this concentration of these metals. 
Unexpectedly, estimates of DNA quantity in samples treated with 18.750mM copper yielded values in excess of the actual concentration of DNA in the samples; fluorescence spectroscopy experiments indicated this increase was not a direct interaction between the copper metal and 6-FAM dye used to label the probe that targets human DNA in the Quantifiler(®) kit. Evidence of inhibition was observed for the human-specific assay at a lower metal concentration than detected by the IPC, for all metals examined except calcium. These results strongly suggest that determination of a "true negative" sample should not be based solely on the failure of the IPC to indicate the presence of a PCR inhibitor and indicate that amplification of all samples should be attempted, regardless of the quantification results. PMID:26240969
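Quantification in real-time PCR kits of this kind rests on a log-linear standard curve relating the quantification cycle (Cq) to the input DNA quantity; a generic sketch of that conversion follows (the slope and intercept are illustrative placeholders, not Quantifiler calibration values):

```python
def dna_quantity(cq, slope=-3.32, intercept=38.0):
    """Invert the standard curve Cq = slope * log10(quantity) + intercept.
    A slope of about -3.32 corresponds to 100% amplification efficiency;
    PCR inhibition shifts Cq upward and thus biases the estimate low."""
    return 10 ** ((cq - intercept) / slope)

# one cycle of inhibition-induced delay roughly halves the estimated quantity
q_clean = dna_quantity(28.0)
q_inhibited = dna_quantity(29.0)
```

This is why metal-ion inhibition that delays amplification without tripping the IPC, as described above, silently produces underestimates rather than flagged failures.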

  17. Uncertainty quantification of bacterial aerosol neutralization in shock heated gases

    Science.gov (United States)

    Schulz, J. C.; Gottiparthi, K. C.; Menon, S.

    2015-01-01

A potential method for the neutralization of bacterial endospores is the use of explosive charges, since the high thermal and mechanical stresses in the post-detonation flow are thought to be sufficient to reduce endospore survivability to levels that pose no significant health threat. While several experiments have attempted to quantify endospore survivability by emulating such environments in shock tube configurations, numerical simulations are necessary to provide information in scenarios where experimental data are difficult to obtain. Since such numerical predictions require complex, multi-physics models, significant uncertainties could be present. This work investigates the uncertainty in determining endospore survivability from a reduced order model based on a critical endospore temperature. Understanding the uncertainty in such a model is necessary for quantifying the variability in predictions from large-scale, realistic simulations of bacterial endospore neutralization by explosive charges. This work extends the analysis of previous large-scale simulations of endospore neutralization [Gottiparthi et al., Shock Waves, 2014, doi:10.1007/s00193-014-0504-9] by focusing on the uncertainty quantification of predicting endospore neutralization. For a given initial mass distribution of the bacterial endospore aerosol, predictions of the intact endospore percentage using nominal values of the input parameters match the experimental data well. The uncertainty in these predictions is then investigated using the Dempster-Shafer theory of evidence and polynomial chaos expansion. The studies show that endospore survivability is governed largely by the endospores' mass distribution and their exposure or residence time at the elevated temperatures and pressures. Deviations from the nominal predictions can be as much as 20-30 % in the intermediate temperature ranges.
At high temperatures, i.e., strong shocks, which are of the most interest, the residence time is observed to be a dominant parameter, and this coupled with the analysis resulting from the Dempster-Shafer theory of evidence seems to indicate that achieving confident predictions of less than 1 % endospore viability can only occur by extending the residence time of the fluid-particle interaction.

  18. Half-Metallic p -Type LaAlO3/EuTiO3 Heterointerface from Density-Functional Theory

    Science.gov (United States)

    Lu, Hai-Shuang; Cai, Tian-Yi; Ju, Sheng; Gong, Chang-De

    2015-03-01

    The two-dimensional electron gas (2DEG) observed at the LaAlO3/SrTiO3 heterointerface has attracted intense research interest in recent years. The high mobility, electric tunability, and giant persistent photoconductivity suggest its potential for electronic and photonic applications. The lack of a p -type counterpart as well as a highly spin-polarized carrier in the LaAlO3/SrTiO3 system, however, restricts its widespread application, since both multiple carriers and high spin polarization are very desirable for electronic devices. Here, we report a system of LaAlO3/EuTiO3 digital heterostructures that may overcome these limitations. Results from first-principles calculations reveal that the 2DEG in the n -type LaAlO3/EuTiO3 is a normal ferromagnet. The p -type two-dimensional hole gas, on the other hand, is a 100% spin-polarized half-metal. For digital heterostructures with alternating n -type and p -type interfaces, a magnetic-field-driven insulator-to-metal transition, together with spatially separated electrons and holes, can be realized by tuning the intrinsic polar field. At low temperatures, the spin-polarized electron-hole pairs may result in spin-triplet exciton condensation, which provides an experimentally accessible system for achieving the theoretically proposed dissipationless spin transport. Our findings open a path for exploring spintronics at the heterointerface of transition-metal oxides.

  19. Methodological considerations in quantification of oncological FDG PET studies

    International Nuclear Information System (INIS)

This review aims to provide insight into the factors that influence quantification of glucose metabolism by FDG PET images in oncology as well as their influence on repeated measures studies (i.e. treatment response assessment), offering improved understanding both for clinical practice and research. Structured PubMed searches have been performed for the many factors affecting quantification of glucose metabolism by FDG PET. Review articles and reference lists have been used to supplement the search findings. Biological factors such as fasting blood glucose level, FDG uptake period, FDG distribution and clearance, patient motion (breathing) and patient discomfort (stress) all influence quantification. Acquisition parameters should be adjusted to maximize the signal to noise ratio without exposing the patient to a higher than strictly necessary radiation dose. This is especially challenging in pharmacokinetic analysis, where the temporal resolution is of significant importance. The literature is reviewed on the influence of attenuation correction on parameters for glucose metabolism, and on the effect of motion, metal artefacts and contrast agents on quantification of CT attenuation-corrected images. Reconstruction settings (analytical versus iterative reconstruction, post-reconstruction filtering and image matrix size) all potentially influence quantification due to artefacts, noise levels and lesion size dependency. Many region of interest definitions are available, but increased complexity does not necessarily result in improved performance. Different methods for the quantification of the tissue of interest can introduce systematic and random inaccuracy. This review provides an up-to-date overview of the many factors that influence quantification of glucose metabolism by FDG PET. (orig.)
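Among the quantitative measures these factors perturb, the simplest is the standardized uptake value; a minimal sketch follows (body-weight normalization only; decay correction and the lean-body-mass and body-surface-area variants are omitted, and the numbers are illustrative):

```python
def suv_body_weight(activity_kbq_per_ml, injected_dose_mbq, weight_kg):
    """SUV = tissue activity concentration / (injected dose / body weight),
    assuming 1 g of tissue occupies about 1 ml so the result is unitless."""
    dose_kbq = injected_dose_mbq * 1000.0        # MBq -> kBq
    weight_g = weight_kg * 1000.0                # kg -> g (~ ml of tissue)
    return activity_kbq_per_ml / (dose_kbq / weight_g)

# e.g. 5 kBq/ml in a lesion after injecting 370 MBq into a 74 kg patient
suv = suv_body_weight(5.0, 370.0, 74.0)
```

Every input to this ratio is touched by the factors the review catalogues: the measured concentration by reconstruction and ROI choice, and the effective dose by uptake period and clearance.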

  20. Mössbauer spectroscopy, nuclear inelastic scattering and density functional theory studies on oxobridged iron complexes and their reaction under Gif-type conditions

    OpenAIRE

    Subramanian, Rajagopalan

    2010-01-01

    This dissertation comprises studies of the Gif reaction of trinuclear oxobridged iron complexes aimed at predicting the intermediates formed during the Gif reaction. The experimental techniques used in these studies were 57Fe transmission Mössbauer spectroscopy and synchrotron-based nuclear inelastic scattering (NIS). Quantum mechanical calculations based on density functional theory were also used to interpret the experimental results. Because NIS has rarely been applied to study catalytic r...

  1. An architectural model for software reliability quantification: sources of data

    International Nuclear Information System (INIS)

Software reliability assessment models in use today treat software as a monolithic block. An aversion towards 'atomic' models seems to exist. These models appear to add complexity to the modeling and to the data collection, and seem intrinsically difficult to generalize. In 1997, we introduced an architecturally based software reliability model called FASRE. The model is based on an architecture derived from the requirements which captures both functional and nonfunctional requirements, and on a generic classification of functions, attributes and failure modes. The model focuses on evaluation of failure mode probabilities and uses a Bayesian quantification framework. Failure mode probabilities of functions and attributes are propagated to the system level using fault trees. It can incorporate any type of prior information, such as results of developers' testing or historical information on a specific functionality and its attributes, and is ideally suited for reusable software. By building an architecture and deriving its potential failure modes, the model forces early appraisal and understanding of the weaknesses of the software, allows reliability analysis of the structure of the system, and provides assessments at a functional level as well as at a system level. In order to quantify the probability of failure (or the probability of success) of a specific element of our architecture, data are needed. The term element of the architecture is used here in its broadest sense to mean a single failure mode or a higher level of abstraction such as a function. The paper surveys the potential sources of software reliability data available during software development. Next, the mechanisms for incorporating these sources of relevant data into the FASRE model are identified.
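Fault-tree propagation of failure-mode probabilities to the system level reduces, for independent events, to simple gate formulas; the following generic sketch shows the two basic gates (it is not the FASRE implementation, and the probabilities are illustrative):

```python
def and_gate(probs):
    """AND gate: the output fails only if every independent input fails."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """OR gate: the output fails if at least one independent input fails."""
    p_all_survive = 1.0
    for q in probs:
        p_all_survive *= 1.0 - q
    return 1.0 - p_all_survive

# system fails if both redundant functions A and B fail, or attribute C fails
p_system = or_gate([and_gate([0.01, 0.02]), 0.001])
```

Nesting these gates mirrors the architecture-derived tree: leaf probabilities come from the Bayesian quantification of individual failure modes, and the root gives the system-level estimate.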

  2. Quantification of the adrenal cortex hormones with radioimmunoassay

    International Nuclear Information System (INIS)

The pathologies of the adrenal cortex (adrenal insufficiency and Cushing syndrome) have their origin in the deficit or hypersecretion of some of the hormones secreted by the adrenal cortex, which is divided into three anatomically defined zones: the external zone, also called the zona glomerulosa, which is the main production site of aldosterone and mineralocorticoids; the internal zone, or zona reticularis, which produces androgens; and the middle zone, or zona fasciculata, which is responsible for producing glucocorticoids. In this work, a quantitative analysis of those hormones and their pathological triggers was made; the quantification was performed in the laboratory by means of highly sensitive and specific techniques, in this case radioimmunoassay, in which the radioisotope I-125 is used. This technique is based on a biochemical bond-type reaction, because it requires a substance called the linker, which binds to another called the ligand. This reaction is also known as the antigen-antibody (Ag-Ab) reaction, where the results depend on the quantity of antigen in the sample and on its affinity for the antibody. In this work, a study of 56 patients (of which 13 were men and 43 women) was made. The cortisol, ACTH, androsterone and DHEA values were very elevated in the majority of the cases corresponding to women, with cortisol predominating; while in men, a notable elevation of 17 α-OH-PRG and of DHEA-SO4 was observed. Based on that, we can conclude that 51 of them did not have major complications, because they went to the laboratory only once, while the remaining 5 had medical monitoring and visited the laboratory on more than one occasion, suggesting difficulty in their improvement. According to the results, an approximate ratio of 8:2 women to men becomes clear for the hormonal pathologies of the adrenal cortex. (Author)

  3. Quantification of the adrenal cortex hormones with radioimmunoassay

    Energy Technology Data Exchange (ETDEWEB)

    Badillo A, V.; Carrera D, A. A.; Ibarra M, C. M., E-mail: vbadillocren@hotmail.co [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Calle Cipres No. 10, Fracc. La Penuela, 98068 Zacatecas (Mexico)

    2010-10-15

    The pathologies of the adrenal cortex -adrenal insufficiency and Cushing syndrome- originate in the deficit or hypersecretion of some of the hormones secreted by the adrenal cortex, which is divided into three anatomically defined zones: the external zone, also called the zona glomerulosa, which is the main production site of aldosterone and the mineralocorticoids; the internal zone, or zona reticularis, which produces androgens; and the middle zone, or zona fasciculata, which is responsible for producing glucocorticoids. In this work, a quantitative analysis of those hormones and their pathologic triggers was made; the quantification was performed in the laboratory by means of highly sensitive and specific techniques, in this case radioimmunoassay, in which the radioisotope I-125 is used. This technique is based on a biochemical bond-type reaction, because it requires a substance called the linker, which bonds to another called the ligand. This reaction is also known as the antigen-antibody (Ag-Ab) reaction, where the result depends on the quantity of antigen in the sample and on its affinity for the antibody. In this work, a study of 56 patients (13 men and 43 women) was made. The cortisol, ACTH, androsterone and DHEA values were very elevated in the majority of the cases corresponding to women, with cortisol predominating, while in men a notable elevation of 17 {alpha}-OH-PRG and DHEA-SO{sub 4} was observed. Based on that, we can conclude that 51 of them did not have major complications, because they went to the laboratory only once, while the remaining 5 had medical monitoring and visited the laboratory on more than one occasion, suggesting difficulty in their improvement. According to the results, an approximate ratio of 8:2 women to men becomes clear for the hormonal pathologies of the adrenal cortex. (Author)
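
    As background to the quantification step, an RIA readout is typically converted to a hormone concentration by fitting a standard curve of bound counts against known doses and inverting it for the unknown sample. The sketch below is a generic illustration, not the procedure reported here: the four-parameter logistic model, the cortisol doses, and the count rates are all assumptions made for the example.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, d, c, b):
    """Four-parameter logistic: a = counts at zero dose, d = counts at
    infinite dose, c = dose at half response (ED50), b = slope."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Hypothetical standard curve: bound counts per minute vs cortisol dose (ng/mL)
dose = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
cpm = np.array([9200.0, 8500.0, 7100.0, 4900.0, 3200.0, 1900.0, 900.0])
popt, _ = curve_fit(four_pl, dose, cpm, p0=[10000.0, 500.0, 5.0, 1.0])

def dose_from_cpm(y, a, d, c, b):
    """Invert the fitted 4PL curve to read an unknown sample's dose."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

print(round(dose_from_cpm(4000.0, *popt), 2))  # dose of an unknown sample
```

    Because bound counts fall as antigen dose rises in a competitive assay, the fitted curve is decreasing, and the inversion is only valid for counts strictly between the fitted plateaus d and a.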

  4. Band Calculations for Ce Compounds with AuCu$_{3}$-type Crystal Structure on the basis of Dynamical Mean Field Theory II. - CeIn$_{3}$ and CeSn$_{3}$

    OpenAIRE

    Sakai, Osamu; Harima, Hisatomo

    2012-01-01

    Band calculations for Ce compounds with the AuCu$_{3}$-type crystal structure were carried out on the basis of dynamical mean field theory (DMFT). The results of applying the calculation to CeIn$_{3}$ and CeSn$_{3}$ are presented as the second in a series of papers. The Kondo temperature and crystal-field splitting are obtained, respectively, as 190 and 390 K (CeSn$_{3}$), 8 and 160 K (CeIn$_{3}$ under ambient pressure), and 30 and 240 K (CeIn$_{3}$ at a pressure of 2.75 GPa...

  5. GPU-accelerated voxelwise hepatic perfusion quantification

    International Nuclear Information System (INIS)

    Voxelwise quantification of hepatic perfusion parameters from dynamic contrast enhanced (DCE) imaging greatly contributes to the assessment of liver function in response to radiation therapy. However, the efficiency of estimating hepatic perfusion parameters voxel-by-voxel in the whole liver using a dual-input single-compartment model requires substantial improvement for routine clinical applications. In this paper, we utilize the parallel computation power of a graphics processing unit (GPU) to accelerate the computation while maintaining the same accuracy as the conventional method. Using the compute unified device architecture (CUDA), the hepatic perfusion computations over multiple voxels are run across the GPU blocks concurrently but independently. At each voxel, nonlinear least-squares fitting of the time series of the liver DCE data to the compartmental model is distributed to multiple threads in a block, and the computations for different time points are performed simultaneously and synchronously. An efficient fast Fourier transform in a block is also developed for the convolution computation in the model. The GPU computations of the voxel-by-voxel hepatic perfusion images are compared with those by the CPU using simulated DCE data and experimental DCE MR images from patients. The computation speed is improved by 30 times using an NVIDIA Tesla C2050 GPU compared to a 2.67 GHz Intel Xeon CPU. To obtain liver perfusion maps with 626 400 voxels in a patient's liver, it takes 0.9 min with the GPU-accelerated voxelwise computation, compared to 110 min with the CPU, while the perfusion parameters from the two methods differ by less than 10^-6. The method will be useful for generating liver perfusion images in clinical settings. (paper)

  6. GPU-accelerated voxelwise hepatic perfusion quantification

    Science.gov (United States)

    Wang, H.; Cao, Y.

    2012-09-01

    Voxelwise quantification of hepatic perfusion parameters from dynamic contrast enhanced (DCE) imaging greatly contributes to the assessment of liver function in response to radiation therapy. However, the efficiency of estimating hepatic perfusion parameters voxel-by-voxel in the whole liver using a dual-input single-compartment model requires substantial improvement for routine clinical applications. In this paper, we utilize the parallel computation power of a graphics processing unit (GPU) to accelerate the computation while maintaining the same accuracy as the conventional method. Using the compute unified device architecture (CUDA), the hepatic perfusion computations over multiple voxels are run across the GPU blocks concurrently but independently. At each voxel, nonlinear least-squares fitting of the time series of the liver DCE data to the compartmental model is distributed to multiple threads in a block, and the computations for different time points are performed simultaneously and synchronously. An efficient fast Fourier transform in a block is also developed for the convolution computation in the model. The GPU computations of the voxel-by-voxel hepatic perfusion images are compared with those by the CPU using simulated DCE data and experimental DCE MR images from patients. The computation speed is improved by 30 times using an NVIDIA Tesla C2050 GPU compared to a 2.67 GHz Intel Xeon CPU. To obtain liver perfusion maps with 626 400 voxels in a patient's liver, it takes 0.9 min with the GPU-accelerated voxelwise computation, compared to 110 min with the CPU, while the perfusion parameters from the two methods differ by less than 10^-6. The method will be useful for generating liver perfusion images in clinical settings.
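
    For orientation, a dual-input single-compartment model of this kind expresses the liver concentration curve as the convolution of a weighted sum of arterial and portal-venous input functions with a single-exponential residue function, and the per-voxel work is a nonlinear least-squares fit of the rate constants. The sketch below shows that fit for one voxel; the input functions, sampling, and rate constants are invented for the illustration (the paper's contribution is parallelizing this fit across voxels on the GPU).

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0.0, 120.0, 121)        # s; assumed sampling grid
dt = t[1] - t[0]
ca = np.exp(-((t - 15.0) / 5.0) ** 2)   # hypothetical arterial input
cp = np.exp(-((t - 25.0) / 8.0) ** 2)   # hypothetical portal-venous input

def liver_curve(t, k1a, k1p, k2):
    """Dual-input single-compartment model:
    C(t) = [k1a*Ca(t) + k1p*Cp(t)] convolved with exp(-k2*t)."""
    influx = k1a * ca + k1p * cp
    residue = np.exp(-k2 * t)
    return np.convolve(influx, residue)[: len(t)] * dt

true_params = (0.3, 0.7, 0.05)           # assumed "ground truth" rates
voxel = liver_curve(t, *true_params)     # noise-free voxel curve for the demo
fit, _ = curve_fit(liver_curve, t, voxel, p0=(0.1, 0.1, 0.01))
print(np.round(fit, 3))                  # recovers the assumed rates
```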

  7. Quantification of asphaltene precipitation by scaling equation

    Science.gov (United States)

    Janier, Josefina Barnachea; Jalil, Mohamad Afzal B. Abd.; Samin, Mohamad Izhar B. Mohd; Karim, Samsul Ariffin B. A.

    2015-02-01

    Asphaltene precipitation from crude oil is one of the issues for the oil industry. The deposition of asphaltene occurs during production, transportation and separation processes. The injection of carbon dioxide (CO2) during enhanced oil recovery (EOR) is believed to contribute much to the precipitation of asphaltene. Precipitation can be affected by changes in temperature and pressure in the crude oil; however, reduction in pressure contributes more to the instability of asphaltene than temperature does. This paper discusses the quantification of precipitated asphaltene in crude oil at different high pressures and at constant temperature. The derived scaling equation was based on reservoir conditions, with variation in the amount of carbon dioxide (CO2) mixed with Dulang, the light crude oil sample used in the experiment, to study the stability of asphaltene. A FluidEval PVT cell with a Solid Detection System (SDS) was the instrument used to gain experimental knowledge of the behavior of the fluid at reservoir conditions. Two conditions were followed in the conduct of the experiment: firstly, 45 cc of light crude oil was mixed with 18 cc (40%) of CO2; secondly, the same amount of crude oil was mixed with 27 cc (60%) of CO2. Results showed that the 45 cc crude oil sample combined with 18 cc (40%) of CO2 gas gave a saturation pressure of 1498.37 psi and an asphaltene onset point of 1620 psi. For the same amount of crude oil combined with 27 cc (60%) of CO2, the saturation pressure was 2046.502 psi and the asphaltene onset point was 2230 psi. The derivation of the scaling equation considered reservoir temperature, pressure, bubble point pressure, mole percent of the precipitant (the injected CO2 gas), and the gas molecular weight. The scaling resulted in a third-order polynomial that can be used to quantify the amount of asphaltene in crude oil.
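
    The abstract reports that the scaling relation reduces to a third-order polynomial but does not give its coefficients; as a generic illustration of the final quantification step, a cubic can be fitted to pressure vs precipitated-asphaltene data (the data points below are entirely hypothetical):

```python
import numpy as np

# Hypothetical data: pressure (psi) vs asphaltene precipitated (wt%)
pressure = np.array([1600.0, 1800.0, 2000.0, 2200.0])
wt_pct = np.array([0.10, 0.45, 0.80, 0.95])

coeffs = np.polyfit(pressure, wt_pct, 3)  # third-order polynomial fit
cubic = np.poly1d(coeffs)
print(round(float(cubic(1900.0)), 3))     # interpolated precipitation at 1900 psi
```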

  8. AdS{sub 3} x{sub w} (S{sup 3} x S{sup 3} x S{sup 1}) solutions of type IIB string theory

    Energy Technology Data Exchange (ETDEWEB)

    Donos, Aristomenis [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Gauntlett, Jerome P. [Imperial College, London (United Kingdom). Blackett Lab.]|[Imperial College, London (United Kingdom). The Institute for Mathematical Sciences; Sparks, James [Oxford Univ. (United Kingdom). Mathematical Institute

    2008-10-15

    We analyse a recently constructed class of local solutions of type IIB supergravity that consist of a warped product of AdS{sub 3} with a seven-dimensional internal space. In one duality frame the only other nonvanishing fields are the NS three-form and the dilaton. We analyse in detail how these local solutions can be extended to globally well-defined solutions of type IIB string theory, with the internal space having topology S{sup 3} x S{sup 3} x S{sup 1} and with properly quantised three-form flux. We show that many of the dual (0,2) SCFTs are exactly marginal deformations of the (0,2) SCFTs whose holographic duals are warped products of AdS{sub 3} with seven-dimensional manifolds of topology S{sup 3} x S{sup 2} x T{sup 2}. (orig.)

  9. GMO quantification: valuable experience and insights for the future.

    Science.gov (United States)

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use next-generation sequencing for quantitative purposes as well, although accurate quantification of GMO content using this technology is still a challenge for the future, especially for mixed samples. New approaches are also needed for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques. PMID:25182968
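
    For concreteness, the standard-curve arithmetic behind qPCR quantification (amplification efficiency being one of the uncertainty sources listed above) looks like the following. The slope, intercept, and Ct values are invented for the example, and GM content is expressed here simply as the event/taxon copy-number ratio:

```python
def copies_from_ct(ct, slope, intercept):
    """Back-calculate initial copy number from a standard curve:
    Ct = slope * log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

def efficiency(slope):
    """Amplification efficiency implied by the standard-curve slope
    (a slope of about -3.32 corresponds to ~100% efficiency)."""
    return 10 ** (-1.0 / slope) - 1.0

slope, intercept = -3.32, 38.0    # hypothetical standard-curve parameters
ct_event, ct_taxon = 30.1, 24.6   # hypothetical Ct values for one sample

gmo_pct = 100 * copies_from_ct(ct_event, slope, intercept) \
              / copies_from_ct(ct_taxon, slope, intercept)
print(round(efficiency(slope), 3), round(gmo_pct, 2))  # → 1.001 2.2
```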

  10. Automated lobar quantification of emphysema in patients with severe COPD

    International Nuclear Information System (INIS)

    Automated lobar quantification of emphysema has not yet been evaluated. Unenhanced 64-slice MDCT was performed in 47 patients evaluated before bronchoscopic lung-volume reduction. CT images reconstructed with a standard (B20) and high-frequency (B50) kernel were analyzed using a dedicated prototype software (MevisPULMO) allowing lobar quantification of emphysema extent. Lobar quantification was obtained following (a) a fully automatic delineation of the lobar limits by the software and (b) a semiautomatic delineation with manual correction of the lobar limits when necessary, and was compared with the visual scoring of emphysema severity per lobe. No statistically significant difference existed between automated and semiautomated lobar quantification (p>0.05 in the five lobes), with differences ranging from 0.4 to 3.9%. The agreement between the two methods (intraclass correlation coefficient, ICC) was excellent for the left upper lobe (ICC=0.94), left lower lobe (ICC=0.98), and right lower lobe (ICC=0.80). The agreement was good for the right upper lobe (ICC=0.68) and moderate for the middle lobe (ICC=0.53). Bland-Altman plots confirmed these results. A good agreement was observed between the software and visually assessed lobar predominance of emphysema (kappa 0.78; 95% CI 0.64-0.92). Automated and semiautomated lobar quantifications of emphysema are concordant and show good agreement with visual scoring. (orig.)
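
    The abstract does not state the density criterion used by the prototype software; a common convention in CT emphysema quantification is the percentage of lobe voxels below -950 HU, which can be sketched as follows (toy volume and lobe mask invented for the example):

```python
import numpy as np

def emphysema_index(hu_volume, lobe_mask, threshold=-950.0):
    """Percentage of lobe voxels below the HU threshold, a common
    densitometric emphysema score (threshold is an assumption here)."""
    lobe = hu_volume[lobe_mask]
    return 100.0 * np.count_nonzero(lobe < threshold) / lobe.size

# Toy volume: a "lobe" of 500 voxels, 100 of them at emphysematous density
vol = np.full((10, 10, 10), -800.0)
mask = np.zeros_like(vol, dtype=bool)
mask[:5] = True        # front half of the volume is the lobe
vol[:1] = -970.0       # 100 emphysematous voxels inside the lobe
print(emphysema_index(vol, mask))  # → 20.0
```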

  11. A phenomenological theory of spatially structured local synaptic connectivity.

    Directory of Open Access Journals (Sweden)

    2005-06-01

    Full Text Available The structure of local synaptic circuits is the key to understanding cortical function and how neuronal functional modules such as cortical columns are formed. The central problem in deciphering cortical microcircuits is the quantification of synaptic connectivity between neuron pairs. I present a theoretical model that accounts for the axon and dendrite morphologies of pre- and postsynaptic cells and provides the average number of synaptic contacts formed between them as a function of their relative locations in three-dimensional space. An important aspect of the current approach is the representation of a complex axonal/dendritic arbor structure as a superposition of basic structures: synaptic clouds. Each cloud has three structural parameters that can be directly estimated from two-dimensional drawings of the underlying arbor. Using empirical data available in the literature, I applied this theory to three morphologically different types of cell pairs. I found that, within a wide range of cell separations, the theory is in very good agreement with empirical data on (i) axonal-dendritic contacts of pyramidal cells and (ii) somatic synapses formed by the axons of inhibitory interneurons. Since planar arborization drawings are available in the literature for many types of neurons, this theory can provide a practical means for quantitatively deriving local synaptic circuits based on the actual observed densities of specific types of neurons and their morphologies. It can also have significant implications for computational models of cortical networks by making it possible to wire up simulated neural networks in a realistic fashion.
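
    As a toy version of the cloud idea (not the paper's actual parameterization), the expected number of contacts between an axonal and a dendritic cloud can be modeled as the overlap integral of two isotropic 3D Gaussian branch-density distributions, which has a closed form that decays with the cells' separation:

```python
import numpy as np

def expected_contacts(n_axon, n_dend, sigma_a, sigma_d, d):
    """Expected contacts between two isotropic Gaussian 'clouds' of axonal
    and dendritic branch density separated by distance d in 3D, using the
    closed-form overlap integral of two Gaussians (simplified sketch)."""
    s2 = sigma_a**2 + sigma_d**2
    return n_axon * n_dend * (2.0 * np.pi * s2) ** -1.5 * np.exp(-d**2 / (2.0 * s2))

# Contacts fall off with somatic separation (all parameters hypothetical, microns)
for d in (0.0, 50.0, 100.0):
    print(d, expected_contacts(1000, 1000, 60.0, 40.0, d))
```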

  12. Tractability of Theory Patching

    CERN Document Server

    Argamon-Engelson, S

    1998-01-01

    In this paper we consider the problem of `theory patching', in which we are given a domain theory, some of whose components are indicated to be possibly flawed, and a set of labeled training examples for the domain concept. The theory patching problem is to revise only the indicated components of the theory, such that the resulting theory correctly classifies all the training examples. Theory patching is thus a type of theory revision in which revisions are made to individual components of the theory. Our concern in this paper is to determine for which classes of logical domain theories the theory patching problem is tractable. We consider both propositional and first-order domain theories, and show that the theory patching problem is equivalent to that of determining what information contained in a theory is `stable' regardless of what revisions might be performed to the theory. We show that determining stability is tractable if the input theory satisfies two conditions: that revisions to each theory compone...

  13. Gamma camera based Positron Emission Tomography: a study of the viability on quantification

    International Nuclear Information System (INIS)

    Positron Emission Tomography (PET) is a Nuclear Medicine imaging modality for diagnostic purposes. Pharmaceuticals labeled with positron emitters are used, and images which represent the in vivo biochemical processes within tissues can be obtained. The positron/electron annihilation photons are detected in coincidence and this information is used for object reconstruction. Presently, there are two types of systems available for this imaging modality: dedicated systems and those based on gamma camera technology. In this work, we utilized PET/SPECT systems, which also allow for the traditional Nuclear Medicine studies based on single photon emitters. There are inherent difficulties which affect quantification of activity and other indices. They are related to the Poisson nature of radioactivity, to radiation interactions with the patient body and detector, to noise due to the statistical nature of these interactions and of all the detection processes, and to the patient acquisition protocols. Corrections are described in the literature and not all of them are implemented by the manufacturers: scatter, attenuation, randoms, decay, dead time, spatial resolution, and others related to the properties of each piece of equipment. The goal of this work was to assess the methods adopted by two manufacturers, as well as the influence of some technical characteristics of PET/SPECT systems on the estimation of the SUV. Data from a set of phantoms were collected in 3D mode by one camera and in 2D by the other. We concluded that quantification is viable in PET/SPECT systems, including the estimation of SUVs. This is only possible if, apart from the above-mentioned corrections, the camera is well tuned and coefficients for sensitivity normalization and partial volume corrections are applied. We also verified that the shapes of the sources used for obtaining these factors play a role in the final results and should be dealt with carefully in clinical quantification. 
    Finally, the choice of the region of interest is critical, and it should be the same as that used to calculate the correction factors. (author)
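
    The SUV estimated above is a simple ratio of tissue activity concentration to injected dose per unit body weight; its arithmetic, with illustrative numbers, is:

```python
def suv(voxel_kbq_per_ml, injected_dose_mbq, body_weight_kg):
    """Standardized uptake value: tissue activity concentration divided by
    injected dose per gram of body weight (decay-corrected values and a
    tissue density of ~1 g/mL are assumed)."""
    conc_kbq_per_g = voxel_kbq_per_ml                  # density ~1 g/mL
    dose_kbq = injected_dose_mbq * 1000.0
    return conc_kbq_per_g / (dose_kbq / (body_weight_kg * 1000.0))

# 5 kBq/mL in a voxel, 370 MBq injected, 70 kg patient (illustrative numbers)
print(round(suv(5.0, 370.0, 70.0), 3))  # → 0.946
```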

  14. Alberta Diabetes and Physical Activity Trial (ADAPT): A randomized theory-based efficacy trial for adults with type 2 diabetes - rationale, design, recruitment, evaluation, and dissemination

    OpenAIRE

    Birkett Nicholas; Johnson Jeffrey A; Sigal Ronald J; Courneya Kerry S; Plotnikoff Ronald C; Lau David; Raine Kim; Johnson Steven T; Karunamuni Nandini

    2010-01-01

    Abstract Background The primary aim of this study was to compare the efficacy of three physical activity (PA) behavioural intervention strategies in a sample of adults with type 2 diabetes. Method/Design Participants (N = 287) were randomly assigned to one of three groups consisting of the following intervention strategies: (1) standard printed PA educational materials provided by the Canadian Diabetes Association [i.e., Group 1/control group)]; (2) standard printed PA educational materials a...

  15. Identification of Important Chemical Features of 11β-Hydroxysteroid Dehydrogenase Type1 Inhibitors: Application of Ligand Based Virtual Screening and Density Functional Theory

    OpenAIRE

    Keun Woo Lee; Sundaraganesan Namadevan; Young-Sik Sohn; Chandrasekaran Meganathan; Sugunadevi Sakkiah

    2012-01-01

    11β-Hydroxysteroid dehydrogenase type1 (11βHSD1) regulates the conversion of inactive cortisone to active cortisol. Increased cortisol results in diabetes; hence, quelling the activity of 11βHSD1 has been thought of as an effective approach for the treatment of diabetes. Quantitative hypotheses were developed and validated to identify the critical chemical features with reliable geometric constraints that contribute to the inhibition of 11βHSD1 function. The best hypothesis, Hypo1, which con...

  16. Two families with quadrupedalism, mental retardation, no speech, and infantile hypotonia (Uner Tan Syndrome Type-II); a novel theory for the evolutionary emergence of human bipedalism

    OpenAIRE

    Tan, Uner

    2014-01-01

    Two consanguineous families with Uner Tan Syndrome (UTS) were analyzed in relation to self-organizing processes in complex systems, and the evolutionary emergence of human bipedalism. The cases had the key symptoms of previously reported cases of UTS, such as quadrupedalism, mental retardation, and dysarthric or no speech, but the new cases also exhibited infantile hypotonia and are designated UTS Type-II. There were 10 siblings in Branch I and 12 siblings in Branch II. Of these, there were s...

  17. Two families with quadrupedalism, mental retardation, no speech, and infantile hypotonia (Uner Tan Syndrome Type-II); a novel theory for the evolutionary emergence of human bipedalism

    OpenAIRE

    Tan, Uner

    2014-01-01

    Two consanguineous families with Uner Tan Syndrome (UTS) were analyzed in relation to self-organizing processes in complex systems, and the evolutionary emergence of human bipedalism. The cases had the key symptoms of previously reported cases of UTS, such as quadrupedalism, mental retardation, and dysarthric or no speech, but the new cases also exhibited infantile hypotonia and are designated UTS Type-II. There were 10 siblings in Branch I and 12 siblings in Branch II. Of these, there were ...

  18. Superlattice band structure: New and simple energy quantification condition

    International Nuclear Information System (INIS)

    Assuming an approximate effective mass and using Bastard's boundary conditions, a simple method is used to calculate the subband structure of periodic semiconducting heterostructures. Our method consists in deriving and solving the energy quantification condition (EQC); this is a simple real equation, composed of trigonometric and hyperbolic functions, which does not need any programming effort or sophisticated machine to solve. For heterostructures with fewer than ten wells, we have derived and simplified the energy quantification conditions. The subband is built point by point; each point represents an energy level. Our simple energy quantification condition is used to calculate the subband structure of GaAs/Ga0.5Al0.5As heterostructures and to build their subbands point by point for 4 and 20 wells. Our findings show good agreement with previously published results
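
    The record does not reproduce the EQC itself; for a flavor of this kind of calculation, the classic Kronig-Penney dispersion relation for an infinite superlattice with a single parabolic effective mass (a simplification that ignores the well/barrier mass mismatch Bastard's boundary conditions account for) locates the minibands as the energies where the right-hand side has magnitude at most one. All material parameters below are assumed for the illustration.

```python
import numpy as np

hbar = 1.054571817e-34  # J*s
m0 = 9.1093837015e-31   # kg
eV = 1.602176634e-19    # J

m_eff = 0.067 * m0      # GaAs-like effective mass, assumed equal in both layers
V0 = 0.4 * eV           # hypothetical barrier height
a, b = 8e-9, 2e-9       # well and barrier widths (hypothetical)

def kp_rhs(E):
    """Kronig-Penney dispersion RHS for E < V0; |RHS| <= 1 marks an
    allowed miniband (RHS equals cos of the superlattice wavevector)."""
    k = np.sqrt(2.0 * m_eff * E) / hbar
    kap = np.sqrt(2.0 * m_eff * (V0 - E)) / hbar
    return (np.cos(k * a) * np.cosh(kap * b)
            + (kap**2 - k**2) / (2.0 * k * kap) * np.sin(k * a) * np.sinh(kap * b))

E = np.linspace(1e-4, 0.399, 4000) * eV
allowed = np.abs(kp_rhs(E)) <= 1.0
# Count minibands below the barrier as runs of consecutive allowed energies
bands = np.count_nonzero(np.diff(allowed.astype(int)) == 1) + int(allowed[0])
print("minibands below V0:", bands)
```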

  19. Iron overload in the liver diagnostic and quantification

    Energy Technology Data Exchange (ETDEWEB)

    Alustiza, Jose M. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)]. E-mail: jmalustiza@osatek.es; Castiella, Agustin [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Juan, Maria D. de [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Emparanza, Jose I. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Artetxe, Jose [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Uranga, Maite [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)

    2007-03-15

    Hereditary Hemochromatosis is the most frequent modality of iron overload. Since 1996, genetic tests have significantly facilitated the non-invasive diagnosis of the disease. There are, however, many cases with negative genetic tests that require confirmation by hepatic iron quantification, which is traditionally performed by hepatic biopsy. Many studies have demonstrated the possibility of performing hepatic iron quantification with Magnetic Resonance. However, a consensus has not yet been reached regarding the technique or the reproducibility of the same calculation method on different machines. This article reviews the state of the art of the question and delineates possible future lines to standardise this non-invasive method of hepatic iron quantification.

  20. Basic concepts in quantum information theory

    International Nuclear Information System (INIS)

    Full text: Quantum information theory provides a framework for the description of quantum systems and their applications in the context of quantum computation and quantum communication. Although several of the basic concepts on which such theory is built are reminiscent of those of (classical) Information Theory, the new rules provided by quantum mechanics introduce properties which have no classical counterpart and that are responsible for most of the applications. In particular, entangled states appear as one of the basic resources in this context. In this lecture I will introduce the basic concepts and applications in Quantum Information, particularly stressing the definition of entanglement, its quantification, and its applications. (author)
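
    A standard quantification of entanglement for bipartite pure states, as alluded to above, is the von Neumann entropy of the reduced density matrix, which can be computed from the Schmidt (singular) values of the state's coefficient matrix:

```python
import numpy as np

def entanglement_entropy(psi, dims):
    """Von Neumann entropy (in bits) of the reduced state of subsystem A
    for a bipartite pure state psi with subsystem dimensions dims=(dA, dB).
    The Schmidt coefficients are the singular values of psi reshaped."""
    dA, dB = dims
    s = np.linalg.svd(psi.reshape(dA, dB), compute_uv=False)
    p = s**2
    p = p[p > 1e-12]                 # drop zero Schmidt weights
    return float(-(p * np.log2(p)).sum() + 0.0)

bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2.0)  # (|00> + |11>)/sqrt(2)
product = np.array([1.0, 0.0, 0.0, 0.0])              # |00>
print(entanglement_entropy(bell, (2, 2)))     # ≈ 1 bit: maximally entangled
print(entanglement_entropy(product, (2, 2)))  # 0: product state
```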

  1. Absolute Quantification of Somatic DNA Alterations in Human Cancer - Scott Carter, TCGA Scientific Symposium 2011

    Science.gov (United States)

    Video: Absolute Quantification of Somatic DNA Alterations in Human Cancer - Scott Carter, TCGA Scientific Symposium 2011

  2. Quantification of Structure from Medical Images

    DEFF Research Database (Denmark)

    Qazi, Arish Asif

    2008-01-01

    In this thesis, we present automated methods that quantify information from medical images; information that is intended to assist and enable clinicians to gain a better understanding of the underlying pathology. The first part of the thesis presents methods that analyse the articular cartilage, segmented from MR images of the knee. The cartilage tissue is considered to be a key determinant in the onset of Osteoarthritis (OA), a degenerative joint disease with no known cure. The primary obstacle has been the dependence on radiography as the 'gold standard' for detecting the manifestation of cartilage changes. This is an indirect assessment, since the cartilage is not visible on x-rays. We propose Cartilage Homogeneity, quantified from MR images, as a marker for detection of the early biochemical alterations in the articular cartilage. We show that homogeneity provides accuracy, sensitivity, and information beyond that of traditional morphometric measures. The thesis also proposes a fully automatic and generic statistical framework for identifying biologically interpretable regions of difference (ROD) between two groups of biological objects, attributed by anatomical differences or changes relating to pathology, without a priori knowledge about the location, extent, or topology of the ROD. Based on quantifications from both morphometric and textural imaging markers, our method has identified the most pathological regions in the articular cartilage. The remaining part of the thesis presents methods based on diffusion tensor imaging, a technique widely used for analysis of the white matter of the central nervous system in the living human brain. An inherent drawback of the traditional diffusion tensor model is its limited ability to provide detailed information about multi-directional fiber architecture within a voxel. This leads to erroneous fiber tractography results in locations where fiber bundles cross each other. 
    We present a novel tractography technique which successfully traces through regions of crossing fibers. Detection of crossing white matter pathways can improve neurosurgical visualization of functionally relevant white matter areas. We also present preliminary results of analysing the meshwork of collagen fibers in the articular cartilage by high-resolution diffusion tensor imaging.

  3. Light element quantification by lithium elastic scattering

    Energy Technology Data Exchange (ETDEWEB)

    Portillo, F.E. [Departamento de Física, Universidad Simón Bolívar, Caracas (Venezuela, Bolivarian Republic of); Liendo, J.A., E-mail: jliendo@usb.ve [Departamento de Física, Universidad Simón Bolívar, Caracas (Venezuela, Bolivarian Republic of); González, A.C. [Centro de Física, Instituto Venezolano de Investigaciones Científicas, Caracas (Venezuela, Bolivarian Republic of); Caussyn, D.D.; Fletcher, N.R.; Momotyuk, O.A.; Roeder, B.T.; Wiedenhoever, I.; Kemper, K.W.; Barber, P. [Physics Department, The Florida State University, Tallahassee, FL (United States); Sajo-Bohus, L. [Departamento de Física, Universidad Simón Bolívar, Caracas (Venezuela, Bolivarian Republic of)

    2013-06-15

    Accurate differential cross sections have been measured at specific beam energies and angles to be used in a method proposed previously for the simultaneous quantification of light elements (Z<11) present in evaporated liquid biological samples. Targets containing {sup 1}H, {sup 7}Li, {sup 12}C, {sup 16}O, {sup 19}F, {sup 28}Si and {sup 197}Au have been bombarded with 13 MeV {sup 6}Li{sup 3+} and 20 MeV {sup 16}O{sup 5+} beams. The {sup 16}O + {sup 1}H, {sup 16}O + {sup 12}C, {sup 16}O + {sup 16}O, {sup 16}O + {sup 19}F, {sup 16}O + {sup 28}Si and {sup 16}O + {sup 197}Au cross sections, shown to be consistent with the Rutherford formula predictions at 15° and 20°, have been used to determine cross sections for the {sup 6}Li + {sup 1}H, {sup 6}Li + {sup 12}C, {sup 6}Li + {sup 16}O, {sup 6}Li + {sup 19}F, {sup 6}Li + {sup 28}Si and {sup 6}Li + {sup 197}Au scatterings respectively at 17.5°, 24°, 25°, 26°, 28° and 30°. Although {sup 6}Li + {sup 7}Li cross sections have not been obtained from {sup 16}O + {sup 7}Li cross sections, they have been determined from measured {sup 6}Li + {sup 19}F cross sections and, in addition, used to obtain {sup 16}O + {sup 7}Li cross sections at 15° and 20°. The reliability of the new cross sections determined in this investigation for the {sup 6}Li + {sup 1}H, {sup 6}Li + {sup 7}Li and {sup 6}Li + {sup 19}F scatterings is based on the Rutherford behavior of the measured {sup 6}Li + {sup 197}Au scattering data as expected and the consistency observed between the {sup 6}Li + {sup 12}C, {sup 6}Li + {sup 16}O and {sup 6}Li + {sup 28}Si cross sections obtained in this work and previously reported values. This research has important implications in applied physics.
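
    The Rutherford reference behavior used above is straightforward to evaluate. As an illustrative sketch (the lab-to-CM angle conversion is omitted, e² = 1.44 MeV·fm is used, and the specific beam/target choice below is just an example), the point-charge differential cross section is:

```python
import math

def rutherford(z1, z2, e_lab_mev, m1, m2, theta_cm_deg):
    """Rutherford differential cross section (mb/sr) in the CM frame.
    E_cm ~ E_lab * m2/(m1+m2); uses e^2 = 1.44 MeV*fm; 1 fm^2/sr = 10 mb/sr."""
    e_cm = e_lab_mev * m2 / (m1 + m2)
    half_distance = z1 * z2 * 1.44 / (4.0 * e_cm)  # fm
    dsdo_fm2 = half_distance**2 / math.sin(math.radians(theta_cm_deg) / 2.0) ** 4
    return dsdo_fm2 * 10.0

# Example: 13 MeV 6Li on 197Au at 30 degrees, where scattering is Rutherford
print(rutherford(3, 79, 13.0, 6, 197, 30.0))
```

    The characteristic 1/sin⁴(θ/2) angular dependence is what makes the heavy-target (gold) data a convenient normalization check.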

  4. Multiparty Symmetric Sum Types

    Directory of Open Access Journals (Sweden)

    Lasse Nielsen

    2010-11-01

    Full Text Available This paper introduces a new theory of multiparty session types based on symmetric sum types, by which we can type non-deterministic orchestration choice behaviours. While the original branching type in session types can represent a choice made by a single participant and accepted by others determining how the session proceeds, the symmetric sum type represents a choice made by agreement among all the participants of a session. Such behaviour can be found in many practical systems, including collaborative workflow in healthcare systems for clinical practice guidelines (CPGs. Processes using the symmetric sums can be embedded into the original branching types using conductor processes. We show that this type-driven embedding preserves typability, satisfies semantic soundness and completeness, and meets the encodability criteria adapted to the typed setting. The theory leads to an efficient implementation of a prototypical tool for CPGs which automatically translates the original CPG specifications from a representation called the Process Matrix to symmetric sum types, type checks programs and executes them.

  5. Multiparty Symmetric Sum Types

    DEFF Research Database (Denmark)

    Nielsen, Lasse; Yoshida, Nobuko

    2010-01-01

    This paper introduces a new theory of multiparty session types based on symmetric sum types, by which we can type non-deterministic orchestration choice behaviours. While the original branching type in session types can represent a choice made by a single participant and accepted by others determining how the session proceeds, the symmetric sum type represents a choice made by agreement among all the participants of a session. Such behaviour can be found in many practical systems, including collaborative workflow in healthcare systems for clinical practice guidelines (CPGs). Processes with the symmetric sums can be embedded into the original branching types using conductor processes. We show that this type-driven embedding preserves typability, satisfies semantic soundness and completeness, and meets the encodability criteria adapted to the typed setting. The theory leads to an efficient implementation of a prototypical tool for CPGs which automatically translates the original CPG specifications from a representation called the Process Matrix to symmetric sum types, type checks programs and executes them.

  6. Clinical PET Myocardial Perfusion Imaging and Flow Quantification.

    Science.gov (United States)

    Juneau, Daniel; Erthal, Fernanda; Ohira, Hiroshi; Mc Ardle, Brian; Hessian, Renée; deKemp, Robert A; Beanlands, Rob S B

    2016-02-01

    Cardiac PET imaging is a powerful tool for the assessment of coronary artery disease. Many tracers with different advantages and disadvantages are available. It has several advantages over single photon emission computed tomography, including superior accuracy and lower radiation exposure. It provides powerful prognostic information, which can help to stratify patients and guide clinicians. The addition of flow quantification enables better detection of multivessel disease while providing incremental prognostic information. Flow quantification provides important physiologic information, which may be useful to individualize patient therapy. This approach is being applied in some centers, but requires standardization before it is more widely applied. PMID:26590781

  7. Large-Scale Inverse Problems and Quantification of Uncertainty

    CERN Document Server

    Biegler, Lorenz; Ghattas, Omar

    2010-01-01

    Large-scale inverse problems and the associated uncertainty quantification have become an important area of research, central to a wide range of science and engineering applications. Written by leading experts in the field, Large-scale Inverse Problems and Quantification of Uncertainty focuses on the computational methods used to analyze and simulate inverse problems. The text provides PhD students, researchers, advanced undergraduate students, and engineering practitioners with the perspectives of researchers in areas of inverse problems and data assimilation, ranging from statistics and large-sca

  8. Quantification is Neither Necessary Nor Sufficient for Measurement

    International Nuclear Information System (INIS)

    Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement

  9. From M-theory to F-theory, with Branes

    OpenAIRE

    Johnson, Clifford V

    1997-01-01

    A duality relationship between certain brane configurations in type IIA and type IIB string theory is explored by exploiting the geometrical origins of each theory in M-theory. The configurations are dual ways of realising the non-perturbative dynamics of a four dimensional N=2 supersymmetric SU(2) gauge theory with four or fewer flavours of fermions in the fundamental, and the spectral curve which organizes these dynamics plays a prominent role in each case. This is an illus...

  10. Analyzing Social Interactions : The Promises and Challenges of Using Cross Recurrence Quantification Analysis

    DEFF Research Database (Denmark)

    Fusaroli, Riccardo; Konvalinka, Ivana

    2014-01-01

    The scientific investigation of social interactions presents substantial challenges: interacting agents engage each other at many different levels and timescales (motor and physiological coordination, joint attention, linguistic exchanges, etc.), often making their behaviors interdependent in non-linear ways. In this paper we review the current use of Cross Recurrence Quantification Analysis (CRQA) in the analysis of social interactions, and assess its potential and challenges. We argue that the method can sensitively grasp the dynamics of human interactions, and that it has started producing valuable knowledge about them. However, much work is still necessary: more systematic analyses and interpretation of the recurrence indexes and more consistent reporting of the results, more emphasis on theory-driven studies, exploring interactions involving more than 2 agents and multiple aspects of coordination, and assessing and quantifying complementary coordinative mechanisms. These challenges are discussed and operationalized in recommendations to further develop the field.
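    A minimal cross-recurrence sketch in Python/NumPy illustrates the core of the method: threshold the pairwise distances between two 1-D signals and summarize the resulting matrix (`eps` and the test signals below are illustrative choices, not parameters from the paper):

```python
import numpy as np

def cross_recurrence(x, y, eps):
    """Cross-recurrence matrix: CR[i, j] = 1 if |x_i - y_j| <= eps (1-D signals)."""
    d = np.abs(x[:, None] - y[None, :])     # all pairwise distances via broadcasting
    return (d <= eps).astype(int)

def recurrence_rate(cr):
    """Fraction of recurrent points: the simplest CRQA index."""
    return cr.mean()

rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 200)
x = np.sin(t)
y = np.sin(t + 0.3) + 0.05 * rng.standard_normal(200)   # a lagged, noisy "follower"
cr = cross_recurrence(x, y, eps=0.2)
print(recurrence_rate(cr))
```

Diagonal structures in `cr` would indicate episodes where the second agent retraces the first agent's trajectory at some lag, which is what the coordination indexes reviewed in the paper quantify.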

  11. Analyzing Social Interactions: Promises and Challenges of Cross Recurrence Quantification Analysis

    DEFF Research Database (Denmark)

    Fusaroli, Riccardo; Konvalinka, Ivana

    2014-01-01

    The scientific investigation of social interactions presents substantial challenges: interacting agents engage each other at many different levels and timescales (motor and physiological coordination, joint attention, linguistic exchanges, etc.), often making their behaviors interdependent in non-linear ways. In this paper we review the current use of Cross Recurrence Quantification Analysis (CRQA) in the analysis of social interactions, and assess its potential and challenges. We argue that the method can sensitively grasp the dynamics of human interactions, and that it has started producing valuable knowledge about them. However, much work is still necessary: more systematic analyses and interpretation of the recurrence indexes and more consistent reporting of the results, more emphasis on theory-driven studies, exploring interactions involving more than 2 agents and multiple aspects of coordination, and assessing and quantifying complementary coordinative mechanisms. These challenges are discussed and operationalized in recommendations to further develop the field.

  12. Recurrence plots and its quantification analysis applied to the monitoring and surveillance in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Naranjo, Alberto R.; Otero, Maria Elena M.; Poveda, Aylin G. [Higher Institute of Technologies and Applied Sciences, Havana City (Cuba)]. E-mails: rolo@instec.cu; mmontesi@instec.cu; Guerra, Alexeis C. [University of Informatic Sciences, Havana City (Cuba)]. E-mail: alexeis@uci.cu

    2007-07-01

    The application of non-linear dynamic methods in many scientific fields has demonstrated their great potential for the early detection of significant dynamic singularities. The introduction of these methods for the surveillance of anomalies and failures in nuclear reactors and their fundamental equipment has been demonstrated in recent years. Specifically, the Recurrence Plot and its Quantification Analysis are currently used in many scientific fields. The paper focuses on Recurrence Plots and their Quantification Analysis applied to signal samples obtained from different types of reactors: the research reactor TRIGA MARK-III, BWR/5 and PHWR. Different behaviors are compared in order to look for a pattern for the characterization of power instability events in the nuclear reactor. These outputs are of great importance for application in surveillance and monitoring systems in Nuclear Power Plants. For introduction into a real-time monitoring system, the authors propose some useful approaches. The results indicate the potential of the method for implementation in a surveillance and monitoring system in Nuclear Power Plants. All the calculations were performed with two computational tools developed by Marwan: Cross Recurrence Plot Toolbox for Matlab (Version 5.7, Release 22) and Visual Recurrence Analysis (Version 4.8). (author)
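    A minimal recurrence-quantification sketch in Python/NumPy, assuming a 1-D signal and a simple absolute-value norm; the determinism index DET shown here is a standard RQA measure, and `eps`/`lmin` are illustrative choices:

```python
import numpy as np

def recurrence_matrix(x, eps):
    """RP[i, j] = 1 if |x_i - x_j| <= eps (1-D signal, absolute-value norm)."""
    return (np.abs(x[:, None] - x[None, :]) <= eps).astype(int)

def determinism(rp, lmin=2):
    """DET: fraction of recurrent points lying on diagonal lines of length >= lmin.
    (Common practice also excludes the main diagonal; omitted here for brevity.)"""
    n = rp.shape[0]
    on_lines = 0
    for k in range(-(n - 1), n):                          # walk every diagonal
        run = 0
        for v in list(np.diagonal(rp, offset=k)) + [0]:   # sentinel flushes last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
    total = rp.sum()
    return on_lines / total if total else 0.0

t = np.linspace(0, 8 * np.pi, 150)
rp = recurrence_matrix(np.sin(t), eps=0.1)
print(determinism(rp))    # high for a periodic signal
```

In the surveillance setting described above, a drop in indexes such as DET computed over sliding windows of a reactor signal is the kind of dynamic singularity the method is meant to flag.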

  13. A novel approach for the automated segmentation and volume quantification of cardiac fats on computed tomography.

    Science.gov (United States)

    Rodrigues, É O; Morais, F F C; Morais, N A O S; Conci, L S; Neto, L V; Conci, A

    2016-01-01

    The deposits of fat on the surroundings of the heart are correlated to several health risk factors such as atherosclerosis, carotid stiffness, coronary artery calcification, atrial fibrillation and many others. These deposits vary independently of obesity, which motivates their direct segmentation for further quantification. However, manual segmentation of these fats has not been widely deployed in clinical practice due to the required human workload and consequential high cost of physicians and technicians. In this work, we propose a unified method for an autonomous segmentation and quantification of two types of cardiac fats. The segmented fats are termed epicardial and mediastinal, and stand apart from each other by the pericardium. Much effort was devoted to achieve minimal user intervention. The proposed methodology mainly comprises registration and classification algorithms to perform the desired segmentation. We compare the performance of several classification algorithms on this task, including neural networks, probabilistic models and decision tree algorithms. Experimental results of the proposed methodology have shown that the mean accuracy regarding both epicardial and mediastinal fats is 98.5% (99.5% if the features are normalized), with a mean true positive rate of 98.0%. On average, the Dice similarity index was 97.6%. PMID:26474835
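    The Dice similarity index reported above has a one-line definition, 2|A∩B| / (|A| + |B|); a minimal NumPy sketch for binary segmentation masks (the toy mask is illustrative):

```python
import numpy as np

def dice(seg, ref):
    """Dice similarity index 2|A n B| / (|A| + |B|) for binary masks."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    denom = seg.sum() + ref.sum()
    return 2.0 * np.logical_and(seg, ref).sum() / denom if denom else 1.0

mask = np.zeros((8, 8), dtype=int)
mask[2:5, 2:6] = 1                 # a toy segmented region
print(dice(mask, mask))            # -> 1.0 (identical masks)
```

Unlike plain pixel accuracy, Dice is insensitive to the large true-negative background, which is why it is the standard agreement measure for segmentations like these.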

  14. Exact quantification of time signals in Padé-based magnetic resonance spectroscopy

    International Nuclear Information System (INIS)

    This study reports on the fast Padé transform (FPT) for parametric signal processing of realistically synthesized free induction decay curves whose main spectral features are similar to those encoded clinically from a healthy human brain by means of magnetic resonance spectroscopy (MRS). Here, for the purpose of diagnostics, it is of paramount importance to be able to perform accurate and robust quantification of the investigated time signals. This amounts to solving the challenging harmonic inversion problem as a spectral decomposition of the given time signal by means of reconstruction of the unknown total number of resonances, their complex frequencies and amplitudes yielding the peak positions, widths, heights and phases. On theoretical grounds, the FPT solves exactly this mathematically ill-conditioned inverse problem for any noiseless synthesized time signal comprised of an arbitrarily large (finite or infinite) number of damped complex exponentials with stationary and non-stationary polynomial-type amplitudes leading to Lorentzian (non-degenerate) and non-Lorentzian (degenerate) spectra. Convergent validation for this fact is given via the proof-of-principle which is thoroughly demonstrated by the exact numerical solution of a typical quantification problem from MRS. The presently designed study is a paradigm shift for signal processing in MRS with particular relevance to clinical oncology, due to the unprecedented capability of the fast Padé transform to unequivocally resolve and quantify isolated, tightly overlapped and nearly coincident resonances
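    The signal model underlying this quantification problem, a sum of damped complex exponentials, can be synthesized in a few lines of NumPy; the frequencies, damping times and amplitudes below are illustrative values, not parameters from the study (the FPT's task is to recover them from the samples c_n alone):

```python
import numpy as np

# Lorentzian model of a free induction decay:
#   c_n = sum_k d_k * exp(2j*pi*nu_k * n*tau) * exp(-n*tau / T2_k)
tau = 0.001                      # sampling interval in s (assumed)
n = np.arange(256)
peaks = [                        # (nu_k in Hz, T2_k in s, amplitude d_k) -- illustrative
    (100.0, 0.05, 1.0),
    (120.0, 0.08, 0.5),
]
c = sum(d * np.exp(2j * np.pi * nu * n * tau - n * tau / t2) for nu, t2, d in peaks)
print(c[0])                      # -> (1.5+0j): the sum of the amplitudes at n = 0
```

The harmonic inversion problem is exactly the reverse direction: given only `c`, reconstruct the number of peaks and each (nu_k, T2_k, d_k), which is what makes the closely spaced 100 Hz and 120 Hz resonances here a miniature version of the overlapped-resonance challenge the abstract describes.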

  15. Improved understanding of factors contributing to quantification of anhydrate/hydrate powder mixtures

    DEFF Research Database (Denmark)

    Rantanen, Jukka; Wikström, Håkan

    2005-01-01

    Different spectroscopic approaches have proved to be excellent analytical tools for monitoring process-induced transformations of active pharmaceutical ingredients during pharmaceutical unit operations. In order to use these tools effectively, it is necessary to build calibration models that describe the relationship between the amount of each solid-state form of interest and the spectroscopic signal. In this study, near-infrared (NIR) and Raman spectroscopic methods have been evaluated for the quantification of hydrate and anhydrate forms in pharmaceutical powders. Process type spectrometers were used to collect the data and the role of the sampling procedure was examined. Multivariate regression models were compared with traditional univariate calibrations and special emphasis was placed on data treatment prior to multivariate modeling by partial least squares (PLS). It was found that the measured sample volume greatly affected the performance of the model whereby the calibrations were significantly improved by utilizing a larger sampling area. In addition, multivariate regression did not always improve the predictability of the data compared to univariate analysis. The data treatment prior to multivariate modeling had a significant influence on the quality of predictions with standard normal variate transformation generally proving to be the best preprocessing method. When the appropriate sampling techniques and data analysis methods were utilized, both NIR and Raman spectroscopy were found to be suitable methods for the quantification of anhydrate/hydrate in powder systems, and thus the method of choice will depend on the conditions in the process under investigation.
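    The standard normal variate transformation singled out above as generally the best preprocessing step is simple to state: each spectrum is centred and scaled by its own mean and standard deviation before PLS modelling. A minimal NumPy sketch (rows are spectra; the toy data are illustrative):

```python
import numpy as np

def snv(spectra):
    """Standard normal variate: centre and scale each spectrum (row) by its
    own mean and standard deviation before multivariate modelling."""
    mu = spectra.mean(axis=1, keepdims=True)
    sd = spectra.std(axis=1, keepdims=True)
    return (spectra - mu) / sd

raw = np.array([[1.0, 2.0, 3.0],
                [4.0, 8.0, 12.0]])   # two toy "spectra"
print(snv(raw))                      # each row now has mean 0 and std 1
```

Because the scaling is per-spectrum, SNV removes multiplicative scatter and baseline offsets that vary from sample to sample, which is the kind of sampling-volume effect the study found so influential.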

  16. Reliability quantification and visualization for electric microgrids

    Science.gov (United States)

    Panwar, Mayank

    The electric grid in the United States is undergoing modernization from the state of an aging infrastructure of the past to a more robust and reliable power system of the future. The primary efforts in this direction have come from the federal government through the American Recovery and Reinvestment Act of 2009 (Recovery Act). This has provided the U.S. Department of Energy (DOE) with $4.5 billion to develop and implement programs through DOE's Office of Electricity Delivery and Energy Reliability (OE) over a period of 5 years (2008-2012). This was initially a part of Title XIII of the Energy Independence and Security Act of 2007 (EISA), which was later modified by the Recovery Act. As a part of DOE's Smart Grid Programs, Smart Grid Investment Grants (SGIG) and Smart Grid Demonstration Projects (SGDP) were developed as two of the largest programs, with federal grants of $3.4 billion and $600 million respectively. The Renewable and Distributed Systems Integration (RDSI) demonstration projects were launched in 2008 with the aim of reducing peak electricity demand by 15 percent at distribution feeders. Nine such projects, located around the nation, were competitively selected. The City of Fort Collins, in co-operative partnership with other federal and commercial entities, was identified to research, develop and demonstrate a 3.5MW integrated mix of heterogeneous distributed energy resources (DER) to reduce peak load on two feeders by 20-30 percent. This project was called FortZED RDSI and provided an opportunity to demonstrate integrated operation of a group of assets, including demand response (DR), as a single controllable entity, which is often called a microgrid. 
As per IEEE Standard 1547.4-2011 (IEEE Guide for Design, Operation, and Integration of Distributed Resource Island Systems with Electric Power Systems), a microgrid can be defined as an electric power system which has the following characteristics: (1) DR and load are present, (2) it has the ability to disconnect from and parallel with the area Electric Power System (EPS), (3) it includes the local EPS and may include portions of the area EPS, and (4) it is intentionally planned. A more reliable electric power grid requires microgrids to operate in tandem with the EPS. The reliability can be quantified through various metrics for performance measurement; in North America this is done through North American Electric Reliability Corporation (NERC) metrics. The microgrid differs significantly from the traditional EPS, especially at the asset level, due to heterogeneity in assets. Thus, the performance cannot be quantified by the same metrics as used for the EPS. Some of the NERC metrics are calculated and interpreted in this work to quantify performance for a single asset and a group of assets in a microgrid. Two more metrics are introduced for system-level performance quantification. The next step is a better representation of the large amount of data generated by the microgrid. Visualization is one such form of representation which is explored in detail, and a graphical user interface (GUI) is developed as a deliverable tool to the operator for informative decision making and planning. Electronic appendices-I and II contain data and MATLAB© program codes for analysis and visualization for this work.
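    The abstract refers to NERC performance metrics without listing them; as a hedged illustration of the kind of reliability quantification involved, the closely related IEEE 1366 distribution-reliability indices SAIFI and SAIDI can be computed as follows (the event data and customer count are invented for the example):

```python
def saifi(interruptions, customers_served):
    """System Average Interruption Frequency Index (IEEE 1366):
    total customer interruptions / total customers served."""
    return sum(n for n, _ in interruptions) / customers_served

def saidi(interruptions, customers_served):
    """System Average Interruption Duration Index (IEEE 1366):
    total customer-minutes interrupted / total customers served."""
    return sum(n * minutes for n, minutes in interruptions) / customers_served

# each event: (customers affected, outage minutes); 10,000 customers served (assumed)
events = [(1200, 90), (300, 45), (2500, 10)]
print(saifi(events, 10_000))   # -> 0.4 interruptions per customer
print(saidi(events, 10_000))   # -> 14.65 minutes per customer
```

As the thesis argues, metrics of this customer-averaged form assume a homogeneous utility feeder, which is why they need reinterpretation, or replacement, for a microgrid's heterogeneous assets.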

  17. Waltz's Theory of Theory

    DEFF Research Database (Denmark)

    Wæver, Ole

    2009-01-01

    Kenneth N. Waltz's 1979 book, Theory of International Politics, is the most influential in the history of the discipline. It worked its effects to a large extent through raising the bar for what counted as theoretical work, in effect reshaping not only realism but rivals like liberalism and reflectivism. Yet, ironically, there has been little attention to Waltz's very explicit and original arguments about the nature of theory. This article explores and explicates Waltz's theory of theory. Central attention is paid to his definition of theory as ‘a picture, mentally formed' and to the radical anti-empiricism and anti-positivism of his position. Followers and critics alike have treated Waltzian neorealism as if it was at bottom a formal proposition about cause-effect relations. The extreme case of Waltz being so victorious in the discipline, and yet being consistently misinterpreted on the question of theory, shows the power of a dominant philosophy of science in US IR, and thus the challenge facing any ambitious theorising. The article suggests a possible movement of fronts away from the ‘fourth debate' between rationalism and reflectivism towards one of theory against empiricism. To help this new agenda, the article introduces a key literature from philosophy of science about the structure of theory, and particularly about the way even natural science uses theory very differently from what IR's mainstream thinks - and much more like the way Waltz wants his theory used.

  18. Information Theoretic Resources in Quantum Theory

    OpenAIRE

    Meznaric, Sebastian

    2013-01-01

    Resource identification and quantification is an essential element of both classical and quantum information theory. Entanglement is one of these resources, arising when quantum communication and nonlocal operations are expensive to perform. In the first part of this thesis we quantify the effective entanglement when operations are additionally restricted. For an important class of errors we find a linear relationship between the usual and effective higher dimensional genera...

  19. Damage Localization and Quantification of Earthquake Excited RC-Frames

    DEFF Research Database (Denmark)

    Skjærbæk, P.S.; Nielsen, Søren R.K.; Kirkegaard, Poul Henning; Cakmak, A.S.

    1997-01-01

    In the paper a recently proposed method for damage localization and quantification of RC-structures from response measurements is tested on experimental data. The method investigated requires at least one response measurement along the structure and the ground surface acceleration. Further, the two lowest time-varying eigenfrequencies of the structure must be identified.

  20. Temperature dependence of postmortem MR quantification for soft tissue discrimination

    Energy Technology Data Exchange (ETDEWEB)

    Zech, Wolf-Dieter; Schwendener, Nicole; Jackowski, Christian [University of Bern, From the Institute of Forensic Medicine, Bern (Switzerland); Persson, Anders; Warntjes, Marcel J. [University of Linkoeping, The Center for Medical Image Science and Visualization (CMIV), Linkoeping (Sweden)

    2015-08-15

    To investigate and correct the temperature dependence of postmortem MR quantification used for soft tissue characterization and differentiation in thoraco-abdominal organs. Thirty-five postmortem short axis cardiac 3-T MR examinations were quantified using a quantification sequence. Liver, spleen, left ventricular myocardium, pectoralis muscle and subcutaneous fat were analysed in cardiac short axis images to obtain mean T1, T2 and PD tissue values. The core body temperature was measured using a rectally inserted thermometer. The tissue-specific quantitative values were related to the body core temperature. Equations to correct for temperature differences were generated. In a 3D plot comprising the combined data of T1, T2 and PD, different organs/tissues could be well differentiated from each other. The quantitative values were influenced by the temperature. T1 in particular exhibited strong temperature dependence. The correction of quantitative values to a temperature of 37 °C resulted in better tissue discrimination. Postmortem MR quantification is feasible for soft tissue discrimination and characterization of thoraco-abdominal organs. This provides a basis for computer-aided diagnosis and detection of tissue lesions. The temperature dependence of the T1 values challenges postmortem MR quantification. Equations to correct for the temperature dependence are provided. (orig.)

  1. Stereological quantification of mast cells in human synovium

    DEFF Research Database (Denmark)

    Damsgaard, T E; Sørensen, Flemming Brandt

    1999-01-01

    Mast cells participate in both the acute allergic reaction as well as in chronic inflammatory diseases. Earlier studies have revealed divergent results regarding the quantification of mast cells in the human synovium. The aim of the present study was therefore to quantify these cells in the human synovium, using stereological techniques. Different methods of staining and quantification have previously been used for mast cell quantification in human synovium. Stereological techniques provide precise and unbiased information on the number of cell profiles in two-dimensional tissue sections of, in this case, human synovium. In 10 patients suffering from osteoarthritis a median of 3.6 mast cells/mm2 synovial membrane was found. The total number of cells (synoviocytes, fibroblasts, lymphocytes, leukocytes) present was 395.9 cells/mm2 (median). The mast cells constituted 0.8% of all the cell profiles present in the synovium. A significantly positive correlation was demonstrated between the number of mast cells and the total number of cells. Thus, the present study reports stereological quantification of the mast cells and the total number of cells in synovium from patients with osteoarthritis. A possible link between the mast cell and osteoarthritis is discussed upon obtaining a precise estimate of cell profiles in human synovium.
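    The 2-D quantity reported above, cell profiles per mm² of section, reduces to a count over a known sampled area; a minimal sketch with invented counts chosen to echo the reported median of 3.6 mast cells/mm²:

```python
def profile_density(n_profiles, n_fields, field_area_mm2):
    """2-D profile density: counted cell profiles per mm^2 of sampled section,
    assuming systematically (unbiasedly) sampled fields of known area."""
    return n_profiles / (n_fields * field_area_mm2)

# hypothetical counts: 36 mast-cell profiles in 10 fields of 1 mm^2 each
print(profile_density(36, 10, 1.0))   # -> 3.6 profiles/mm^2
```

The stereological point is in the sampling, not the arithmetic: unbiased counting frames and systematic field selection are what make such a density estimate valid for the whole synovium.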

  2. Literacy and Language Education: The Quantification of Learning

    Science.gov (United States)

    Gibb, Tara

    2015-01-01

    This chapter describes international policy contexts of adult literacy and language assessment and the shift toward standardization through measurement tools. It considers the implications the quantification of learning outcomes has for pedagogy and practice and for the social inclusion of transnational migrants.

  3. Standardless quantification by parameter optimization in electron probe microanalysis

    International Nuclear Information System (INIS)

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested in a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the method proposed is compared with the first principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 for 66% of the cases for POEMA, GENESIS and DTSA, respectively. - Highlights: • A method for standardless quantification in EPMA is presented. • It gives better results than the commercial software GENESIS Spectrum. • It gives better results than the software DTSA. • It allows the determination of the conductive coating thickness. • It gives an estimation for the concentration uncertainties.
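    The core idea, minimize the quadratic difference between a measured spectrum and an analytical model over the model's parameters, can be sketched with a toy one-peak model; this is not POEMA's actual spectral model, and `scipy.optimize.least_squares` stands in for its optimizer:

```python
import numpy as np
from scipy.optimize import least_squares

def model(p, x):
    """Toy analytical spectrum: one Gaussian peak on a linear background
    (POEMA's real model involves many more physical parameters)."""
    amp, mu, sigma, b0, b1 = p
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + b0 + b1 * x

x = np.linspace(0.0, 10.0, 200)
true_p = np.array([50.0, 4.0, 0.3, 2.0, 0.1])
rng = np.random.default_rng(1)
y = model(true_p, x) + rng.normal(0.0, 0.5, x.size)    # synthetic "experimental" spectrum

# minimize the quadratic differences between data and model over the parameters
fit = least_squares(lambda p: model(p, x) - y, x0=[40.0, 3.5, 0.5, 0.0, 0.0])
print(fit.x)   # optimized parameters, close to true_p
```

In the real method the optimized parameters include the elemental concentrations themselves, so the fit directly yields the quantification, and the residual statistics yield the uncertainty estimates the abstract mentions.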

  4. Leishmania parasite detection and quantification using PCR-ELISA.

    Czech Academy of Sciences Publication Activity Database

    Kobets, Tetyana; Badalová, Jana; Grekov, Igor; Havelková, Helena; Lipoldová, Marie

    2010-01-01

    Roč. 5, č. 6 (2010), s. 1074-1080. ISSN 1754-2189 R&D Projects: GA ČR GA310/08/1697; GA MŠk(CZ) LC06009 Institutional research plan: CEZ:AV0Z50520514 Keywords: polymerase chain reaction * Leishmania major infection * parasite quantification Subject RIV: EB - Genetics; Molecular Biology Impact factor: 8.362, year: 2010

  5. Temperature dependence of postmortem MR quantification for soft tissue discrimination

    International Nuclear Information System (INIS)

    To investigate and correct the temperature dependence of postmortem MR quantification used for soft tissue characterization and differentiation in thoraco-abdominal organs. Thirty-five postmortem short axis cardiac 3-T MR examinations were quantified using a quantification sequence. Liver, spleen, left ventricular myocardium, pectoralis muscle and subcutaneous fat were analysed in cardiac short axis images to obtain mean T1, T2 and PD tissue values. The core body temperature was measured using a rectally inserted thermometer. The tissue-specific quantitative values were related to the body core temperature. Equations to correct for temperature differences were generated. In a 3D plot comprising the combined data of T1, T2 and PD, different organs/tissues could be well differentiated from each other. The quantitative values were influenced by the temperature. T1 in particular exhibited strong temperature dependence. The correction of quantitative values to a temperature of 37 °C resulted in better tissue discrimination. Postmortem MR quantification is feasible for soft tissue discrimination and characterization of thoraco-abdominal organs. This provides a basis for computer-aided diagnosis and detection of tissue lesions. The temperature dependence of the T1 values challenges postmortem MR quantification. Equations to correct for the temperature dependence are provided. (orig.)

  6. Quantification of Wheat Grain Arabinoxylans Using a Phloroglucinol Colorimetric Assay

    Science.gov (United States)

    Arabinoxylans (AX) play a critical role in end-use quality and nutrition of wheat (Triticum aestivum L.). An efficient, accurate method of AX quantification is desirable as AX plays an important role in processing, end use quality and human health. The objective of this work was to evaluate a stand...

  7. Detection and quantification of diabetic retinopathy

    Science.gov (United States)

    Ballerini, Lucia

    1999-10-01

    In this work a computational approach for detecting and quantifying diabetic retinopathy is proposed. Particular attention has been paid to the study of Foveal Avascular Zone (FAZ). In fact, retinal capillary occlusion produces a FAZ enlargement. Moreover, the FAZ is characterized by qualitative changes showing an irregular contour with notchings and indentations. Our study is mainly focused on the analysis of the FAZ and on the extraction of a proper set of features to quantify FAZ alterations in diabetic patients. We propose an automatic segmentation procedure to correctly identify the FAZ boundary. The method was derived from the theory of active contours, also known as snakes, along with genetic optimization. Then we tried to extract features which can capture not only the size of the object, but also its shape and spatial orientation. The theory of moments provides an interesting and useful way for representing the shape of objects. We used a set of region and boundary moments to obtain a FAZ description which is complete enough for diagnostic purposes and in order to assess the effectiveness of moment descriptors we performed several classification experiments to discriminate diabetic from non-diabetic subjects. We used a neural network-based classifier, optimized for the problem, which is able to perform feature selection at the same time as learning, in order to extract a subset of features. The theory of moments provided us with an interesting and useful tool for representing the shape characteristics. In this way we were able to transform the qualitative description of the FAZ used by ophthalmologists into quantitative measurements.
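    The region moments used above for shape description are straightforward to compute; a minimal NumPy sketch of raw moments and the centroid they yield (the toy image standing in for a segmented FAZ region is illustrative):

```python
import numpy as np

def raw_moment(img, p, q):
    """Raw image moment m_pq = sum over pixels of x^p * y^q * I(y, x)."""
    y, x = np.mgrid[: img.shape[0], : img.shape[1]]
    return (img * x ** p * y ** q).sum()

def centroid(img):
    """Region centroid (x_bar, y_bar) = (m10/m00, m01/m00)."""
    m00 = raw_moment(img, 0, 0)
    return float(raw_moment(img, 1, 0) / m00), float(raw_moment(img, 0, 1) / m00)

img = np.zeros((9, 9))
img[3:6, 2:7] = 1.0              # a toy rectangular "region"
print(centroid(img))             # -> (4.0, 4.0)
```

Higher-order central moments, computed about this centroid, then capture the spread, elongation and orientation of the region, which is how the qualitative irregularity of the FAZ contour is turned into the quantitative descriptors fed to the classifier.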

  8. Practical communication theory

    CERN Document Server

    Adamy, Dave

    2014-01-01

    Practical Communication Theory, 2nd edition enables practicing engineers and technicians to quickly and easily generate the answers to real-world problems encountered in specifying, testing, and fielding any type of systems that involve radio propagation.

  9. Theory of a new type of heavy-electron superconductivity in PrOs{sub 4}Sb{sub 12}: quadrupolar-fluctuation mediated odd-parity pairings

    Energy Technology Data Exchange (ETDEWEB)

    Miyake, K [Division of Materials Physics, Department of Physical Science, Graduate School of Engineering Science, Osaka University, Toyonaka, Osaka 560-8531 (Japan); Kohno, H [Division of Materials Physics, Department of Physical Science, Graduate School of Engineering Science, Osaka University, Toyonaka, Osaka 560-8531 (Japan); Harima, H [The Institute of Scientific and Industrial Research, Osaka University, Ibaraki, Osaka 567-0047 (Japan)

    2003-05-21

    It is shown that the unconventional nature of the superconducting state of PrOs{sub 4}Sb{sub 12}, a Pr based heavy electron compound with the filled-skutterudite structure, can be explained in a unified way by taking into account the structure of the crystalline-electric-field (CEF) level, the shape of the Fermi surface determined by the band structure calculation and a picture of the quasiparticles in the f {sup 2} configuration with a magnetically singlet CEF ground state. Possible types of pairing are narrowed down by consulting recent experimental results. In particular, the chiral 'p'-wave states such as p{sub x} + ip{sub y} are favoured under the magnetic field due to the orbital Zeeman effect, while the 'p'-wave states with twofold symmetry such as p{sub x} can be stabilized by a feedback effect without the magnetic field. It is also discussed that the double superconducting transition without the magnetic field is possible due to the spin-orbit coupling of the 'triplet' Cooper pairs in the chiral state. (letter to the editor)

  10. Quantum Information Theory

    OpenAIRE

    Nielsen, M. A.

    2000-01-01

    Quantum information theory is the study of the achievable limits of information processing within quantum mechanics. Many different types of information can be accommodated within quantum mechanics, including classical information, coherent quantum information, and entanglement. Exploring the rich variety of capabilities allowed by these types of information is the subject of quantum information theory, and of this Dissertation. In particular, I demonstrate several novel lim...

  11. Elementary game theory

    OpenAIRE

    Soni, Himanshu; Sharma, Damini

    2015-01-01

    The theory of games (or game theory) is a mathematical theory that deals with the general features of competitive situations. It involves strategic thinking, and studies the way people interact while making economic policies, contesting elections and other such decisions. There are various types of game models, which are based on factors, like the number of players participating, the sum of gains or losses and the number of strategies available. According to strategic reasoning, we can say th...

  12. Introduction to superstring theory

    International Nuclear Information System (INIS)

    This is a very basic introduction to the AdS/CFT correspondence. The first lecture motivates the duality between gauge theories and gravity/string theories. The next two lectures introduce the bosonic and supersymmetric string theories. The fourth lecture is devoted to the study of Dp-branes and, finally, in the fifth lecture I discuss the two worlds: N=4 SYM in 3+1 flat dimensions and type IIB superstrings in AdS5 x S5. (author)

  13. Matrix string theory

    Energy Technology Data Exchange (ETDEWEB)

    Dijkgraaf, R. [Amsterdam Univ. (Netherlands). Dept. of Mathematics; Verlinde, E. [TH-Division, CERN, CH-1211 Geneva 23 (Switzerland)]|[Institute for Theoretical Physics, Universtity of Utrecht, 3508 TA Utrecht (Netherlands); Verlinde, H. [Institute for Theoretical Physics, University of Amsterdam, 1018 XE Amsterdam (Netherlands)

    1997-09-01

    Via compactification on a circle, the matrix model of M-theory proposed by Banks et al. suggests a concrete identification between the large N limit of two-dimensional N=8 supersymmetric Yang-Mills theory and type IIA string theory. In this paper we collect evidence that supports this identification. We explicitly identify the perturbative string states and their interactions, and describe the appearance of D-particle and D-membrane states.

  14. Matrix String Theory

    OpenAIRE

    Dijkgraaf, R.; Verlinde, E.; Verlinde, H.

    1997-01-01

    Via compactification on a circle, the matrix model of M-theory proposed by Banks et al suggests a concrete identification between the large N limit of two-dimensional N=8 supersymmetric Yang-Mills theory and type IIA string theory. In this paper we collect evidence that supports this identification. We explicitly identify the perturbative string states and their interactions, and describe the appearance of D-particle and D-membrane states.

  15. Documentary and Cognitive Theory

    DEFF Research Database (Denmark)

    Bondebjerg, Ib

    2014-01-01

    This article deals with the benefits of using cognitive theory in documentary film studies. The article outlines general aspects of cognitive theory in the humanities and social sciences; however, the main focus is on the role of narrative, visual style and emotional dimensions in different types of documentaries. Drawing on cognitive theories of film and media and on memory studies, the article analyses how a cognitive approach to documentaries can increase our understanding of how documentaries...

  16. Quantification of Waste in Conventional Construction

    OpenAIRE

    Siti Akhtar Mahayuddin; Wan Akmal Zahri Wan Zaharuddin

    2013-01-01

    Construction waste is generated throughout the construction process, during site clearance, material use, material damage, material non-use, excess procurement and human error. The exact quantity and composition of the construction waste generated over the course of a project are difficult to identify, as they keep changing due to the dynamic nature of construction activities. Different stages of construction generate different types and compositions of waste. Therefore the trend...

  17. Gauge theory of high spins

    International Nuclear Information System (INIS)

    The report aims to familiarize the reader with the ideas and results of high-spin gauge theory. It concerns the construction of a field-theory model characterized by a maximally high gauge symmetry. It is expected that a theory of this type will make it possible to take a fresh look at superstring theory, which is presently considered the leading candidate for the role of the theory of the fundamental interactions

  18. Uncertainty Quantification in Fatigue Crack Growth Prognosis

    Directory of Open Access Journals (Sweden)

    Shankar Sankararaman

    2011-01-01

    Full Text Available This paper presents a methodology to quantify the uncertainty in fatigue crack growth prognosis, applied to structures with complicated geometry and subjected to variable amplitude multi-axial loading. Finite element analysis is used to address the complicated geometry and calculate the stress intensity factors. Multi-modal stress intensity factors due to multi-axial loading are combined to calculate an equivalent stress intensity factor using a characteristic plane approach. Crack growth under variable amplitude loading is modeled using a modified Paris law that includes retardation effects. During cycle-by-cycle integration of the crack growth law, a Gaussian process surrogate model is used to replace the expensive finite element analysis. The effect of different types of uncertainty – physical variability, data uncertainty and modeling errors – on crack growth prediction is investigated. The various sources of uncertainty include, but are not limited to, variability in loading conditions, material parameters, experimental data, model uncertainty, etc. Three different types of modeling errors – crack growth model error, discretization error and surrogate model error – are included in the analysis. The different types of uncertainty are incorporated into the crack growth prediction methodology to predict the probability distribution of crack size as a function of the number of load cycles. The proposed method is illustrated using an application problem: surface cracking in a cylindrical structure.
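    The cycle-by-cycle integration scheme described in this abstract can be sketched in a few lines of Python. This is a minimal illustration, not the authors' method: it uses the basic Paris law with no retardation effects and no characteristic-plane combination, an assumed edge-crack geometry factor Y = 1.12, and hypothetical parameter distributions, and it propagates physical variability by plain Monte Carlo sampling rather than a Gaussian-process surrogate.

    ```python
    import math
    import random
    import statistics

    def delta_K(a, delta_sigma, Y=1.12):
        """Stress intensity factor range [MPa*sqrt(m)] for an edge crack of
        length a [m] under stress range delta_sigma [MPa] (illustrative Y)."""
        return Y * delta_sigma * math.sqrt(math.pi * a)

    def grow_crack(a0, cycles, delta_sigma, C, m):
        """Cycle-by-cycle integration of the Paris law da/dN = C * (dK)^m.
        C is in m/cycle per (MPa*sqrt(m))^m; no retardation modeled."""
        a = a0
        for _ in range(cycles):
            a += C * delta_K(a, delta_sigma) ** m
        return a

    def monte_carlo_crack_size(n_samples=200, cycles=20000, seed=0):
        """Propagate assumed parameter variability to a distribution of
        final crack size (hypothetical distributions, for illustration)."""
        rng = random.Random(seed)
        sizes = []
        for _ in range(n_samples):
            a0 = rng.gauss(1.0e-3, 1.0e-4)                 # initial crack size [m]
            C = rng.lognormvariate(math.log(1.0e-11), 0.1)  # Paris coefficient
            ds = rng.gauss(100.0, 5.0)                      # stress range [MPa]
            sizes.append(grow_crack(a0, cycles, ds, C, m=3.0))
        return sizes

    sizes = monte_carlo_crack_size()
    print(statistics.mean(sizes), statistics.stdev(sizes))
    ```

    The resulting sample of final crack sizes approximates the probability distribution of crack size after the given number of load cycles; in the paper's methodology the inner crack-growth evaluation would instead call the surrogate model for the stress intensity factors.
    
    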

  19. Two families with quadrupedalism, mental retardation, no speech, and infantile hypotonia (Uner Tan Syndrome Type-II); a novel theory for the evolutionary emergence of human bipedalism

    Directory of Open Access Journals (Sweden)

    Uner Tan

    2014-04-01

    Full Text Available Two consanguineous families with Uner Tan Syndrome (UTS) were analyzed in relation to self-organizing processes in complex systems and the evolutionary emergence of human bipedalism. The cases had the key symptoms of previously reported cases of UTS, such as quadrupedalism, mental retardation, and dysarthric or no speech, but the new cases also exhibited infantile hypotonia and are designated UTS Type-II. There were 10 siblings in Branch I and 12 siblings in Branch II. Of these, seven cases exhibited habitual quadrupedal locomotion (QL): four deceased and three living. The infantile hypotonia in the surviving cases gradually disappeared over a period of years, so that they could sit by about 10 years and crawl on hands and knees by about 12 years. They began walking on all fours around 14 years, habitually using QL. Neurological examinations showed normal tonus in their arms and legs, no Babinski sign, brisk tendon reflexes especially in the legs, and mild tremor. The patients could not walk in a straight line, but (except in one case) could stand up and maintain an upright posture with truncal ataxia. Cerebello-vermial hypoplasia and mild gyral simplification were noted in their MRIs. The results of the genetic analysis were inconclusive: no genetic code could be identified as the triggering factor for the syndrome in these families. Instead, the extremely low socio-economic status of the patients was thought to play a role in the emergence of UTS, possibly by epigenetically changing brain structure and function, with a consequent selection of ancestral neural networks for QL during locomotor development. It was suggested that UTS may be regarded as one of the unpredictable outcomes of self-organization within a complex system. It was also noted that the prominent feature of this syndrome, the diagonal-sequence habitual QL, generated an interference between ipsilateral hands and feet, as in non-human primates. It was suggested that this may have been

  20. Band Calculations for Ce Compounds with AuCu3-type Crystal Structure on the basis of Dynamical Mean Field Theory: I. CePd3 and CeRh3

    Science.gov (United States)

    Sakai, Osamu

    2010-11-01

    Band calculations for Ce compounds with the AuCu3-type crystal structure were carried out on the basis of dynamical mean field theory (DMFT). The auxiliary impurity problem was solved by a method named NCA f2vc (noncrossing approximation including the f2 state as a vertex correction). The calculations take into account the crystal-field splitting, the spin-orbit interaction, and the correct exchange process of the f1 → f0, f2 virtual excitations. These are necessary features in the quantitative band theory for Ce compounds and in the calculation of their excitation spectra. The results of applying the calculation to CePd3 and CeRh3 are presented as the first in a series of papers. The experimental results of the photoemission spectrum (PES), the inverse PES, the angle-resolved PES, and the magnetic excitation spectra were reasonably reproduced by the first-principles DMFT band calculation. At low temperatures, the Fermi surface (FS) structure of CePd3 is similar to that of the band obtained by the local density approximation. It gradually changes into a form that is similar to the FS of LaPd3 as the temperature increases, since the 4f band shifts to the high-energy side and the lifetime broadening becomes large.