WorldWideScience

Sample records for quantification theory type

  1. A Study of Tongue and Pulse Diagnosis in Traditional Korean Medicine for Stroke Patients Based on Quantification Theory Type II

    Mi Mi Ko

    2013-01-01

    In traditional Korean medicine (TKM), pattern identification (PI) diagnosis is important for treating diseases. The aim of this study was to comprehensively investigate the relationship between the PI type and tongue diagnosis or pulse diagnosis variables. The study included 1,879 stroke patients who were admitted to 12 oriental medical university hospitals from June 2006 through March 2009. The status of the pulse and tongue was examined in each patient. Additionally, to investigate relatively important indicators related to specialist PI, quantification theory type II analysis was performed on the PI type. In the first axis quantification of the external criteria, the Qi-deficiency and the Yin-deficiency patterns were located in the negative direction, while the dampness-phlegm (DP) and fire-heat patterns were located in the positive direction. The explanatory variable with the greatest impact on the assessment was a fine pulse. In the second axis quantification, the external criteria were divided into either the DP or non-DP patterns. The slippery pulse exhibited the greatest effect on the division. This study attempted to build a model using a statistical method to objectively quantify PI and the various indicators that constitute the unique diagnosis system of TKM. These results should assist the development of future diagnostic standards in stroke PI.
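Quantification theory type II assigns numerical scores to the categories of qualitative items so as to best separate an external criterion; a minimal sketch is to treat it as discriminant analysis over dummy-coded items, fitted here by least squares. The patient records, item levels, and 0/1 dampness-phlegm label below are hypothetical illustrations, not data from the study.

```python
# Sketch of quantification theory type II as least-squares discrimination
# over dummy-coded categorical items. All data below are hypothetical.

def dummy_code(records, items):
    """Reference-code each item: one column per non-baseline level."""
    levels = {it: sorted({r[it] for r in records}) for it in items}
    rows = []
    for r in records:
        row = [1.0]  # intercept
        for it in items:
            row += [1.0 if r[it] == lv else 0.0 for lv in levels[it][1:]]
        rows.append(row)
    return rows

def lstsq(X, y):
    """Solve the normal equations X'X b = X'y by Gaussian elimination."""
    n = len(X[0])
    A = [[sum(X[k][i] * X[k][j] for k in range(len(X))) for j in range(n)]
         for i in range(n)]
    b = [sum(X[k][i] * y[k] for k in range(len(X))) for i in range(n)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, n):
            f = A[r][i] / A[i][i]
            for c in range(i, n):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - sum(A[i][c] * x[c] for c in range(i + 1, n))) / A[i][i]
    return x

# Hypothetical patients: pulse/tongue findings and a 0/1 pattern label
# (1 = dampness-phlegm, 0 = other pattern).
data = [
    {"pulse": "slippery", "tongue": "greasy", "dp": 1},
    {"pulse": "slippery", "tongue": "greasy", "dp": 1},
    {"pulse": "fine",     "tongue": "pale",   "dp": 0},
    {"pulse": "fine",     "tongue": "greasy", "dp": 0},
    {"pulse": "slippery", "tongue": "pale",   "dp": 1},
    {"pulse": "fine",     "tongue": "pale",   "dp": 0},
]
X = dummy_code(data, ["pulse", "tongue"])
y = [r["dp"] for r in data]
coef = lstsq(X, y)  # category scores on the discriminant axis
```

In this toy data the slippery pulse perfectly separates the DP pattern, so its category score dominates, echoing the role the abstract reports for the slippery pulse on the second axis.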

  2. Uncertainty quantification: theory, implementation, and applications

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...
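One of the basic techniques the book covers, forward propagation of input uncertainties through a simulation model, can be sketched by Monte Carlo sampling. The model and parameter distributions below are toy assumptions, not taken from the book.

```python
# Sketch of forward uncertainty propagation by Monte Carlo sampling.
# The model and the input distributions are illustrative toys.
import random
import statistics

random.seed(0)

def model(k, c):
    # toy response: static displacement of a spring under a fixed load
    return 10.0 / k + c

# Sample the uncertain inputs, push each draw through the model,
# and summarize the induced distribution of the response.
draws = [model(random.gauss(5.0, 0.5), random.gauss(1.0, 0.1))
         for _ in range(20000)]
mean = statistics.mean(draws)
sd = statistics.stdev(draws)
```

The response spread (`sd`) quantifies how the input uncertainties combine; more efficient alternatives treated in the book, such as surrogate models, aim to reproduce this distribution at far lower cost.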

  3. Guarded Cubical Type Theory

    Birkedal, Lars; Bizjak, Aleš; Clouston, Ranald;

    2016-01-01

    This paper improves the treatment of equality in guarded dependent type theory (GDTT), by combining it with cubical type theory (CTT). GDTT is an extensional type theory with guarded recursive types, which are useful for building models of program logics, and for programming and reasoning with co...

  4. Guarded Cubical Type Theory

    Birkedal, Lars; Bizjak, Aleš; Clouston, Ranald;

    2016-01-01

    This paper improves the treatment of equality in guarded dependent type theory (GDTT), by combining it with cubical type theory (CTT). GDTT is an extensional type theory with guarded recursive types, which are useful for building models of program logics, and for programming and reasoning with coinductive types. We wish to implement GDTT with decidable type-checking, while still supporting non-trivial equality proofs that reason about the extensions of guarded recursive constructions. CTT is a variation of Martin-Löf type theory in which the identity type is replaced by abstract paths between terms. CTT provides a computational interpretation of functional extensionality, is conjectured to have decidable type checking, and has an implemented type-checker. Our new type theory, called guarded cubical type theory, provides a computational interpretation of extensionality for guarded recursive…

  5. Guarded Cubical Type Theory

    Birkedal, Lars; Bizjak, Aleš; Clouston, Ranald;

    2016-01-01

    This paper improves the treatment of equality in guarded dependent type theory (GDTT), by combining it with cubical type theory (CTT). GDTT is an extensional type theory with guarded recursive types, which are useful for building models of program logics, and for programming and reasoning with coinductive types. We wish to implement GDTT with decidable type checking, while still supporting non-trivial equality proofs that reason about the extensions of guarded recursive constructions. CTT is a variation of Martin-Löf type theory in which the identity type is replaced by abstract paths between terms. CTT provides a computational interpretation of functional extensionality, enjoys canonicity for the natural numbers type, and is conjectured to support decidable type-checking. Our new type theory, guarded cubical type theory (GCTT), provides a computational interpretation of extensionality…

  6. Recurrence quantification analysis: theory and best practices

    Webber, Charles L., Jr.; Marwan, Norbert

    2015-01-01

    The analysis of recurrences in dynamical systems by using recurrence plots and their quantification is still an emerging field. Over the past decades recurrence plots have proven to be valuable data visualization and analysis tools in the theoretical study of complex, time-varying dynamical systems as well as in various applications in biology, neuroscience, kinesiology, psychology, physiology, engineering, physics, geosciences, linguistics, finance, economics, and other disciplines. This multi-authored book intends to comprehensively introduce and showcase recent advances as well as established best practices concerning both theoretical and practical aspects of recurrence plot based analysis. Edited and authored by leading researchers in the field, the various chapters address an interdisciplinary readership, ranging from theoretical physicists to application-oriented scientists in all data-providing disciplines.
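The core object of the book, the recurrence plot, and its simplest quantifier, the recurrence rate, can be sketched in a few lines. The signal and the recurrence threshold below are illustrative choices; a real analysis would use delay embedding, which is omitted here for brevity.

```python
# Sketch of a recurrence plot and the recurrence rate for a toy signal.
import math

# A short periodic signal (no delay embedding, for brevity).
x = [math.sin(0.4 * i) for i in range(100)]
eps = 0.1  # recurrence threshold (illustrative)

# R[i][j] = 1 when states i and j are closer than eps.
R = [[1 if abs(x[i] - x[j]) < eps else 0 for j in range(len(x))]
     for i in range(len(x))]

# Recurrence rate: fraction of recurrent points in the plot.
rr = sum(map(sum, R)) / len(x) ** 2
```

For a periodic signal the plot shows diagonal line structures; quantifiers beyond the recurrence rate (determinism, laminarity, entropy of diagonal line lengths) are built from exactly this matrix.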

  7. An approximation approach for uncertainty quantification using evidence theory

    Bae, Ha-Rok; Grandhi, Ramana V.; Canfield, Robert A

    2004-12-01

    Over the last two decades, uncertainty quantification (UQ) in engineering systems has been performed within the popular framework of probability theory. However, many scientific and engineering communities realize that there are limitations in using only one framework for quantifying the uncertainty experienced in engineering applications. Recently, evidence theory, also called Dempster-Shafer theory, was proposed to handle limited and imprecise data situations as an alternative to classical probability theory. Adaptation of this theory to large-scale engineering structures is a challenge due to the implicit nature of simulations and excessive computational costs. In this work, an approximation approach is developed to improve the practical utility of evidence theory in UQ analysis. The techniques are demonstrated on composite material structures and an airframe wing aeroelastic design problem.
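The evidence-theory quantities the paper builds on, belief and plausibility derived from a basic belief assignment (BBA), can be sketched directly. The frame of discernment and the masses below are illustrative, not values from the paper.

```python
# Sketch of Dempster-Shafer evidence theory: belief and plausibility from
# a basic belief assignment (BBA). The masses below are illustrative.

# BBA over the frame {'safe', 'fail'}; frozenset keys are focal elements.
m = {
    frozenset({'safe'}): 0.5,
    frozenset({'fail'}): 0.2,
    frozenset({'safe', 'fail'}): 0.3,  # mass left on total ignorance
}

def belief(A):
    """Bel(A): total mass of focal elements contained in A."""
    return sum(v for B, v in m.items() if B <= A)

def plausibility(A):
    """Pl(A): total mass of focal elements intersecting A."""
    return sum(v for B, v in m.items() if B & A)

# The interval [Bel, Pl] brackets the imprecisely known probability.
bel_fail = belief(frozenset({'fail'}))
pl_fail = plausibility(frozenset({'fail'}))
```

Unlike a single probability, the gap between `bel_fail` and `pl_fail` expresses how much of the uncertainty is due to ignorance rather than randomness, which is why the framework suits limited and imprecise data.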

  8. Quantification of image quality using information theory.

    Niimi, Takanaga; Maeda, Hisatoshi; Ikeda, Mitsuru; Imai, Kuniharu

    2011-12-01

    The aim of the present study was to examine the usefulness of information theory in the visual assessment of image quality. We applied a first-order approximation of Shannon's information theory to compute information losses (IL). Images of a contrast-detail mammography (CDMAM) phantom were acquired with computed radiography at various radiation doses. Information content was defined as the entropy Σp(i)log(1/p(i)), in which the detection probabilities p(i) were calculated from the distribution of detection rates of the CDMAM. IL was defined as the difference between information content and information obtained. IL decreased with increases in the disk diameters. Total information losses (TIL) were closely correlated with the image quality figures (r = 0.985). TIL was dependent on the distribution of image reading ability of each examinee, even when the average reading ratio was the same in the group. TIL was shown to be sensitive to the observers' distribution of image readings and is expected to improve the evaluation of image quality.
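The entropy definition used here, Σp(i)log(1/p(i)), with information loss as the gap to the attainable maximum, can be sketched as follows. The detection probabilities are illustrative numbers, not CDMAM data.

```python
# Sketch of the first-order information content and information loss.
# The detection probabilities are illustrative, not phantom measurements.
import math

def entropy(ps):
    """H = sum p(i) * log2(1/p(i)), in bits; zero-probability terms drop out."""
    return sum(p * math.log2(1.0 / p) for p in ps if p > 0)

# Detection probabilities for four hypothetical disk sizes (sum to 1).
p_detect = [0.125, 0.125, 0.25, 0.5]
H = entropy(p_detect)             # information content, in bits
H_max = math.log2(len(p_detect))  # upper bound for 4 equiprobable outcomes
info_loss = H_max - H
```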

  9. Guarded dependent type theory with coinductive types

    Bizjak, Aleš; Grathwohl, Hans Bugge; Clouston, Ranald;

    2015-01-01

    We present guarded dependent type theory, gDTT, an extensional dependent type theory with a 'later' modality and clock quantifiers for programming and proving with guarded recursive and coinductive types. The later modality is used to ensure the productivity of recursive definitions in a modular…

  10. Linear contextual modal type theory

    Schack-Nielsen, Anders; Schürmann, Carsten

    When one implements a logical framework based on linear type theory, for example the Celf system [?], one is immediately confronted with questions about its equational theory and how to deal with logic variables. In this paper, we propose linear contextual modal type theory that gives a mathematical account of the nature of logic variables. Our type theory is conservative over the intuitionistic contextual modal type theory proposed by Nanevski, Pfenning, and Pientka. Our main contributions include a mechanically checked proof of soundness and a working implementation.

  11. Guarded dependent type theory with coinductive types

    Bizjak, Aleš; Grathwohl, Hans Bugge; Clouston, Ranald

    2016-01-01

    …in a modular, type-based way. Clock quantifiers are used for controlled elimination of the later modality and for encoding coinductive types using guarded recursive types. Key to the development of gDTT are novel type and term formers involving what we call 'delayed substitutions'. These generalise the applicative functor rules for the later modality considered in earlier work, and are crucial for programming and proving with dependent types. We show soundness of the type theory with respect to a denotational model.

  12. A Minimal Propositional Type Theory

    Kaminski, Mark

    2010-01-01

    Propositional type theory, first studied by Henkin, is the restriction of simple type theory to a single base type that is interpreted as the set of the two truth values. We show that two constants (falsity and implication) suffice for denotational and deductive completeness. Denotational completeness means that every value of the full set-theoretic type hierarchy can be described by a closed term. Deductive completeness is shown for a sequent-based proof system that extends a propositional natural deduction system with lambda conversion and Boolean replacement.
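The two-valued interpretation behind the completeness claim can be checked mechanically: with falsity and implication as the only constants, negation, conjunction, and disjunction are all definable. The encodings below are the standard ones; this is a truth-table check of the semantic claim, not a formalization of the proof system.

```python
# Truth-table check that falsity and implication suffice to define the
# other Boolean connectives, using the standard encodings.

def imp(a, b):
    return (not a) or b

bot = False

def neg(a):            # not a  :=  a -> bot
    return imp(a, bot)

def disj(a, b):        # a or b  :=  (a -> bot) -> b
    return imp(neg(a), b)

def conj(a, b):        # a and b  :=  ((a -> (b -> bot)) -> bot)
    return neg(imp(a, imp(b, bot)))

# Exhaustively verify the encodings against the usual truth tables.
ok = all(neg(a) == (not a) and
         disj(a, b) == (a or b) and
         conj(a, b) == (a and b)
         for a in (False, True) for b in (False, True))
```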

  13. Epistemic uncertainty quantification in flutter analysis using evidence theory

    Tang Jian; Wu Zhigang; Yang Chao

    2015-01-01

    Aimed at evaluating the structural stability and flutter risk of the system, this paper manages to quantify epistemic uncertainty in flutter analysis using evidence theory, including both parametric uncertainty and method selection uncertainty, on the basis of information from limited experimental data of uncertain parameters. Two uncertain variables of the actuator coupling system with unknown probability distributions, that is, bending and torsional stiffness, which are both described with multiple intervals and the basic belief assignment (BBA) extracted from the modal test of actuator coupling systems, are taken into account. Considering the difference in dealing with experimental data by different persons and the reliability of various information sources, a new combination rule of evidence, the generalized lower triangular matrices method, is formed to acquire the combined BBA. Finally, the parametric uncertainty and the epistemic uncertainty of flutter analysis method selection are considered in the same system to realize quantification. A typical missile rudder is selected to examine the present method, and the dangerous range of velocity as well as the relevant belief and plausibility functions is obtained. The results suggest that the present method is effective in obtaining the lower and upper bounds of flutter probability and assessing flutter risk of structures with limited experimental data of uncertain parameters and the belief of different methods.

  14. Quantification of intrapancreatic fat in type 2 diabetes by MRI

    Hollingsworth, Kieren G.; Steven, Sarah; Tiniakos, Dina; Taylor, Roy

    2017-01-01

    Objectives: Accumulation of intrapancreatic fat may be important in type 2 diabetes, but widely varying data have been reported. The standard quantification by MRI in vivo is time consuming and dependent upon a high level of experience. We aimed to develop a new method which would minimise inter-observer variation and to compare this against previously published datasets. Methods: A technique of 'biopsying' the image to minimise inclusion of non-parenchymal tissues was developed. Additionally, thresholding was applied to exclude both pancreatic ducts and intrusions of visceral fat, with pixels with fat values above 20% being excluded. The new MR image 'biopsy' (MR-opsy) was compared to the standard method by 6 independent observers with wide experience of image analysis but no experience of pancreas imaging. The effect of the new method was examined on datasets from two studies of weight loss in type 2 diabetes. Results: At low levels of intrapancreatic fat neither the result nor the inter-observer CV was changed by MR-opsy, thresholding or a combination of the methods. However, at higher levels the conventional method exhibited poor inter-observer agreement (coefficient of variation 26.9%) and the new combined method improved the CV to 4.3% (p<0.03). Using either MR-opsy alone or with thresholding, the new methods indicated a closer relationship between decrease in intrapancreatic fat and fall in blood glucose. Conclusion: The inter-observer variation for quantifying intrapancreatic fat was substantially improved by the new method when pancreas fat levels were moderately high. The method will improve comparability of pancreas fat measurement between research groups. PMID:28369092
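The thresholding step, excluding pixels whose fat fraction falls outside a plausible parenchyma range before averaging, can be sketched as a simple mask. The pixel values and the acceptance window below are illustrative assumptions, not the study's actual thresholds or data.

```python
# Sketch of threshold-based pixel exclusion before averaging fat fraction.
# Values and the acceptance window are hypothetical.

# Hypothetical 'biopsy' region of fat-fraction values (percent); the large
# values mimic visceral-fat intrusions, the near-zero value mimics a duct.
pixels = [3.1, 4.2, 55.0, 2.8, 41.7, 3.5, 0.2, 4.9]

LOW, HIGH = 1.0, 20.0  # hypothetical acceptance window

kept = [v for v in pixels if LOW <= v <= HIGH]
mean_fat = sum(kept) / len(kept)  # mean over parenchyma-like pixels only
```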

  15. Homotopy Type Theory: Univalent Foundations of Mathematics

    Program, The Univalent Foundations

    2013-01-01

    Homotopy type theory is a new branch of mathematics, based on a recently discovered connection between homotopy theory and type theory, which brings new ideas into the very foundation of mathematics. On the one hand, Voevodsky's subtle and beautiful "univalence axiom" implies that isomorphic structures can be identified. On the other hand, "higher inductive types" provide direct, logical descriptions of some of the basic spaces and constructions of homotopy theory. Both are impossible to capt...

  16. Types, structures and theories in NKI

    Xiaoru ZHANG; Zaiyue ZHANG; Yuefei SUI

    2008-01-01

    The National Knowledge Infrastructure (NKI) is a multi-domain knowledge base. The classical type theory is no longer appropriate to describe every kind of object in multiple domains, such as artifacts, natural or micro objects. Three different kinds of type theories are defined: the classical, atomic and pseudo type theories. In the classical type theory, two new type constructors are defined, setm and ∨, to describe the types of sets of all the elements of the types and unions of two sets of different types, respectively. The structures and categories in the type theory are defined, and the substructures and homomorphic structures are used to describe the part-of relations that give the algebraic specifications for the natural objects and the part-of relations between the natural objects, micro objects and artifacts.

  17. Type Theory, Computation and Interactive Theorem Proving

    2015-09-01

    AFRL-AFOSR-VA-TR-2016-0071: Type Theory, Computation and Interactive Theorem Proving. Jeremy Avigad, Carnegie Mellon University. Final Report, 09/01/2015. Track 2: Interactive theorem proving and automated reasoning; homotopy type theory: Avigad participated in the Univalent Foundations Program at IAS, and Harper and student Kuen-Bang Hou developed a machine-checked proof of the equivalence of group actions and covering spaces in type theory.

  18. Uncertainty Quantification and Propagation in Nuclear Density Functional Theory

    Schunck, N; McDonnell, J D; Higdon, D; Sarich, J; Wild, S M

    2015-03-17

    Nuclear density functional theory (DFT) is one of the main theoretical tools used to study the properties of heavy and superheavy elements, or to describe the structure of nuclei far from stability. While ongoing efforts seek to better root nuclear DFT in the theory of nuclear forces, energy functionals remain semi-phenomenological constructions that depend on a set of parameters adjusted to experimental data in finite nuclei. In this paper, we review recent efforts to quantify the related uncertainties, and propagate them to model predictions. In particular, we cover the topics of parameter estimation for inverse problems, statistical analysis of model uncertainties and Bayesian inference methods. Illustrative examples are taken from the literature.
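The fit-then-propagate workflow the review describes can be sketched on a toy linear model instead of an energy functional: estimate parameters from noisy data by least squares, form their covariance, and propagate it to an extrapolated prediction. All numbers below are illustrative.

```python
# Sketch of parameter estimation and uncertainty propagation for y = a + b*x.
# Data and noise level are illustrative toys, not nuclear observables.
import math

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.1, 2.9, 5.2, 6.8]
sigma = 0.2  # assumed measurement noise

n = len(xs)
Sx, Sxx = sum(xs), sum(x * x for x in xs)
Sy, Sxy = sum(ys), sum(x * y for x, y in zip(xs, ys))
det = n * Sxx - Sx * Sx

# Least-squares estimates.
b = (n * Sxy - Sx * Sy) / det
a = (Sy - b * Sx) / n

# Parameter covariance sigma^2 * (X'X)^{-1}, written out for 2 parameters.
Caa = sigma**2 * Sxx / det
Cbb = sigma**2 * n / det
Cab = -sigma**2 * Sx / det

# Propagate to an extrapolated prediction at x* = 5 (outside the data),
# the analogue of predicting far from stability.
xstar = 5.0
pred = a + b * xstar
pred_var = Caa + 2 * xstar * Cab + xstar**2 * Cbb
pred_sd = math.sqrt(pred_var)
```

The prediction variance grows with the extrapolation distance, which is the basic mechanism behind the large theoretical error bars reported for nuclei far from stability.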

  19. Completeness in Hybrid Type Theory

    Areces, Carlos; Blackburn, Patrick Rowan; Huertas, Antonia;

    2014-01-01

    We show that basic hybridization (adding nominals and @ operators) makes it possible to give straightforward Henkin-style completeness proofs even when the modal logic being hybridized is higher-order. The key ideas are to add nominals as expressions of type t, and to extend to arbitrary types the way we interpret @i in propositional and first-order hybrid logic. This means: interpret @iαa, where αa is an expression of any type a, as an expression of type a that rigidly returns the value that αa receives at the i-world. The axiomatization and completeness proofs are generalizations of those…

  20. Quantification of Uncertainties in Nuclear Density Functional theory

    Schunck, N; Higdon, D; Sarich, J; Wild, S

    2014-01-01

    Reliable predictions of nuclear properties are needed as much to answer fundamental science questions as in applications such as reactor physics or data evaluation. Nuclear density functional theory is currently the only microscopic, global approach to nuclear structure that is applicable throughout the nuclear chart. In the past few years, a lot of effort has been devoted to setting up a general methodology to assess theoretical uncertainties in nuclear DFT calculations. In this paper, we summarize some of the recent progress in this direction. Most of the new material discussed here will be published in separate articles.

  1. Game Theory and Uncertainty Quantification for Cyber Defense Applications

    Chatterjee, Samrat; Halappanavar, Mahantesh; Tipireddy, Ramakrishna; Oster, Matthew R.

    2016-07-21

    Cyber-system defenders face the challenging task of protecting critical assets and information continually against multiple types of malicious attackers. Defenders typically operate within resource constraints while attackers operate at relatively low costs. As a result, design and development of resilient cyber-systems that can support mission goals under attack while accounting for the dynamics between attackers and defenders is an important research problem.

  2. Application of the second theory of quantification in identifying gushing water sources of coal mines

    Zhang, X.; Zhang, Z.; Peng, S. [CUMT, Beijing (China). School of Resources and Safety Engineering

    2003-05-01

    A mathematical model for identifying the source of water burst was established using the second theory of quantification, based on 35 water samples from the main aquifers in the Jiaozuo mine area. The newly burst water samples were tested using identification software developed with VC++ 6.0 and the 3D discrimination method. The results indicate good identification performance, providing a convenient tool for discriminating new sources of water burst. 6 refs., 2 tabs.

  3. Application of quantification theory in risk assessment of mine flooding

    WANG Lian-guo; MIAO Xie-xing; DONG Xu; WU Yu

    2008-01-01

    Hundreds of mine flooding accidents have occurred in China since the 1950s. These flooding accidents result in submerged working faces, even entire coal mines, leading to tremendous economic losses. It is reported that among 601 state-owned mines in China, 285 mines are exposed to water-inrush risks. The water pressure becomes larger and larger with increasing mining depth, leading to an increase in water-inrush hazards. Only when the risk of mine flooding is predicted in a reasonable manner can we take timely and effective measures to prevent mine flooding from taking place. In our investigation, quantification (II) theory is used to study the risk prediction problem of mine flooding. By investigating the main factors which affect mine flooding, eight risk assessment items have been identified. The extent of risk is classified into 4 grades. Given the data from different periods in the Feicheng mining area, a prediction model for the risk of mine flooding is established. The test analysis indicates a model correlation coefficient of 0.97 and a discrimination accuracy as high as 97.37%, which implies that the effect of the model is quite satisfactory. With the help of computers, this method can be widely applied.

  4. Hoare type theory, polymorphism and separation

    Nanevski, Alexandar; Morrisett, J. Gregory; Birkedal, Lars

    2008-01-01

    We consider the problem of reconciling a dependently typed functional language with imperative features such as mutable higher-order state, pointer aliasing, and nontermination. We propose Hoare type theory (HTT), which incorporates Hoare-style specifications into types, making it possible to sta...

  5. Transcriptional regulatory network refinement and quantification through kinetic modeling, gene expression microarray data and information theory

    Tuncay Kagan

    2007-01-01

    Background: Gene expression microarray and other multiplex data hold promise for addressing the challenges of cellular complexity, refined diagnoses and the discovery of well-targeted treatments. A new approach to the construction and quantification of transcriptional regulatory networks (TRNs) is presented that integrates gene expression microarray data and cell modeling through information theory. Given a partial TRN and time series data, a probability density is constructed that is a functional of the time course of transcription factor (TF) thermodynamic activities at the site of gene control, and is a function of mRNA degradation and transcription rate coefficients, and equilibrium constants for TF/gene binding. Results: Our approach yields more physicochemical information that complements the results of network structure delineation methods, and thereby can serve as an element of a comprehensive TRN discovery/quantification system. The most probable TF time courses and values of the aforementioned parameters are obtained by maximizing the probability obtained through entropy maximization. Observed time delays between mRNA expression and activity are accounted for implicitly, since the time course of the activity of a TF is coupled by probability functional maximization, and is not assumed to be proportional to the expression level of the mRNA type that translates into the TF. This allows one to investigate post-translational and TF activation mechanisms of gene regulation. Accuracy and robustness of the method are evaluated. A kinetic formulation is used to facilitate the analysis of phenomena with a strongly dynamical character, while a physically motivated regularization of the TF time course is found to overcome difficulties due to omnipresent noise and data sparsity that plague other methods of gene expression data analysis. An application to Escherichia coli is presented. Conclusion: Multiplex time series data can be used for the…

  6. Fixed point theory in metric type spaces

    Agarwal, Ravi P; O’Regan, Donal; Roldán-López-de-Hierro, Antonio Francisco

    2015-01-01

    Written by a team of leading experts in the field, this volume presents a self-contained account of the theory, techniques and results in metric type spaces (in particular in G-metric spaces); that is, the text approaches this important area of fixed point analysis beginning from the basic ideas of metric space topology. The text is structured so that it leads the reader from preliminaries and historical notes on metric spaces (in particular G-metric spaces) and on mappings, to Banach type contraction theorems in metric type spaces, fixed point theory in partially ordered G-metric spaces, fixed point theory for expansive mappings in metric type spaces, generalizations, present results and techniques in a very general abstract setting and framework. Fixed point theory is one of the major research areas in nonlinear analysis. This is partly due to the fact that in many real world problems fixed point theory is the basic mathematical tool used to establish the existence of solutions to problems which arise natur...
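The starting point the book generalizes, the Banach contraction principle, can be illustrated directly: iterating a contraction mapping converges to its unique fixed point. The map T(x) = cos(x) on [0, 1] is a standard toy example, not one from the book.

```python
# Sketch of the Banach contraction principle: fixed-point iteration of a
# contraction mapping. T(x) = cos(x) is a contraction on [0, 1].
import math

def iterate_to_fixed_point(T, x0, tol=1e-12, max_iter=10_000):
    """Iterate x -> T(x) until successive values agree within tol."""
    x = x0
    for _ in range(max_iter):
        nxt = T(x)
        if abs(nxt - x) < tol:
            return nxt
        x = nxt
    raise RuntimeError("no convergence")

fp = iterate_to_fixed_point(math.cos, 0.5)
# fp satisfies cos(fp) = fp (the Dottie number, about 0.739085)
```

The existence-of-solutions applications the blurb mentions follow this pattern: recast the problem as a fixed-point equation for a contraction (in a metric, G-metric, or metric-type space), then invoke the appropriate contraction theorem.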

  7. Uncertainty Quantification for Nuclear Density Functional Theory and Information Content of New Measurements

    McDonnell, J D; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-01-01

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models; to estimate model errors and thereby improve predictive capability; to extrapolate beyond the regions reached by experiment; and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, w...

  8. Use of measurement theory for operationalization and quantification of psychological constructs in systems dynamics modelling

    Fitkov-Norris, Elena; Yeghiazarian, Ara

    2016-11-01

    The analytical tools available to social scientists have traditionally been adapted from tools originally designed for the analysis of natural science phenomena. This article discusses the applicability of systems dynamics, a qualitative modelling approach, as a possible analysis and simulation tool that bridges the gap between the social and natural sciences. After a brief overview of the systems dynamics modelling methodology, the advantages as well as the limiting factors of systems dynamics for potential applications in the field of social sciences and human interactions are discussed. Issues arise with regard to the operationalization and quantification of latent constructs at the simulation-building stage of the systems dynamics methodology, and measurement theory is proposed as a ready and waiting solution to the problem of dynamic model calibration, with a view to improving simulation model reliability and validity and encouraging the development of standardised, modular system dynamics models that can be used in social science research.

  9. Applicability of Information Theory to the Quantification of Responses to Anthropogenic Noise by Southeast Alaskan Humpback Whales

    J. Ellen Blue

    2008-05-01

    We assess the effectiveness of applying information theory to the characterization and quantification of the effects of anthropogenic vessel noise on humpback whale (Megaptera novaeangliae) vocal behavior in and around Glacier Bay, Alaska. Vessel noise has the potential to interfere with the complex vocal behavior of these humpback whales, which could have direct consequences for their feeding behavior and thus ultimately for their health and reproduction. Humpback whale feeding calls recorded during conditions of high vessel-generated noise and lower levels of background noise are compared for differences in acoustic structure, use, and organization using information-theoretic measures. We apply information theory in a self-referential manner (i.e., orders of entropy) to quantify the changes in signaling behavior. We then compare this with the reduction in channel capacity due to noise in Glacier Bay itself, treating it as a (Gaussian) noisy channel. We find that high vessel noise is associated with an increase in the rate and repetitiveness of sequential use of feeding call types in our averaged sample of humpback whale vocalizations, indicating that vessel noise may be modifying the patterns of use of feeding calls by the endangered humpback whales in Southeast Alaska. The information-theoretic approach suggested herein can make a reliable quantitative measure of such relationships and may also be adapted for wider application to many species where environmental noise is thought to be a problem.

  10. Applicability of Information Theory to the Quantification of Responses to Anthropogenic Noise by Southeast Alaskan Humpback Whales

    Doyle, Laurance R.; McCowan, Brenda; Hanser, Sean F.; Chyba, Christopher; Bucci, Taylor; Blue, J. E.

    2008-06-01

    We assess the effectiveness of applying information theory to the characterization and quantification of the effects of anthropogenic vessel noise on humpback whale (Megaptera novaeangliae) vocal behavior in and around Glacier Bay, Alaska. Vessel noise has the potential to interfere with the complex vocal behavior of these humpback whales, which could have direct consequences for their feeding behavior and thus ultimately for their health and reproduction. Humpback whale feeding calls recorded during conditions of high vessel-generated noise and lower levels of background noise are compared for differences in acoustic structure, use, and organization using information-theoretic measures. We apply information theory in a self-referential manner (i.e., orders of entropy) to quantify the changes in signaling behavior. We then compare this with the reduction in channel capacity due to noise in Glacier Bay itself, treating it as a (Gaussian) noisy channel. We find that high vessel noise is associated with an increase in the rate and repetitiveness of sequential use of feeding call types in our averaged sample of humpback whale vocalizations, indicating that vessel noise may be modifying the patterns of use of feeding calls by the endangered humpback whales in Southeast Alaska. The information-theoretic approach suggested herein can make a reliable quantitative measure of such relationships and may also be adapted for wider application to many species where environmental noise is thought to be a problem.
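The "orders of entropy" idea, comparing the entropy of individual call types with the conditional entropy over adjacent call pairs, can be sketched on toy sequences: greater repetitiveness lowers the first-order entropy. The call sequences below are hypothetical, not recorded whale data.

```python
# Sketch of zero-order vs. first-order entropy for symbol sequences.
# The call-type sequences are hypothetical illustrations.
import math
from collections import Counter

def entropy(items):
    """Shannon entropy (bits) of the empirical distribution of items."""
    counts = Counter(items)
    n = len(items)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def h_cond(seq):
    """First-order conditional entropy H(next | current) via adjacent pairs."""
    return entropy(list(zip(seq, seq[1:]))) - entropy(seq[:-1])

# Varied ordering vs. the more repetitive ordering associated in the
# study with high vessel noise.
varied = list("ABCABDACBDABCD")
repetitive = list("ABABABABABABAB")

h_varied = h_cond(varied)          # > 0: next call not determined by current
h_repetitive = h_cond(repetitive)  # near 0: highly predictable sequencing
```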

  11. Applications of Jungian Type Theory to Counselor Education.

    Dilley, Josiah S.

    1987-01-01

    Describes Carl Jung's theory of psychological type and the Myers-Briggs Type Indicator (MBTI), an instrument to assess Jungian type. Cites sources of information on the research and application of the theory and the MBTI. Explores how knowledge of type theory can be useful to counselor educators. (Author)

  12. A Realizability Model for Impredicative Hoare Type Theory

    Petersen, Rasmus Lerchedal; Birkedal, Lars; Nanevski, Alexandar

    2008-01-01

We present a denotational model of impredicative Hoare Type Theory, a very expressive dependent type theory in which one can specify and reason about mutable abstract data types. The model ensures soundness of the extension of Hoare Type Theory with impredicative polymorphism; makes the connections to separation logic clear; and provides a basis for investigation of further sound extensions of the theory, in particular equations between computations and types.

  13. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    McDonnell, J. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schunck, N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Higdon, D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sarich, J. [Argonne National Lab. (ANL), Argonne, IL (United States); Wild, S. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Nazarewicz, W. [Michigan State Univ., East Lansing, MI (United States); Oak Ridge National Lab., Oak Ridge, TN (United States); Univ. of Warsaw, Warsaw (Poland)

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
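The propagation step described here can be illustrated on a toy calibration problem: fit a model to data, estimate the Gaussian posterior covariance of its parameters, and push that covariance through to an extrapolated prediction. This is a minimal sketch with a linear model and invented numbers, not the paper's Skyrme functional or its emulator.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
sigma = 0.1                                     # known observational error
y = 2.0 * x + 1.0 + rng.normal(0.0, sigma, x.size)

X = np.column_stack([x, np.ones_like(x)])       # design matrix for y = a*x + b
theta = np.linalg.solve(X.T @ X, X.T @ y)       # least-squares estimate (a, b)
cov = sigma**2 * np.linalg.inv(X.T @ X)         # Gaussian posterior covariance

x_new = np.array([1.5, 1.0])                    # extrapolation point [x, 1]
pred = x_new @ theta
pred_sd = np.sqrt(x_new @ cov @ x_new)          # propagated statistical error
print(pred, "+/-", pred_sd)
```

The propagated error bar grows as `x_new` moves away from the calibration data, which is the same qualitative behavior the abstract reports for mass and dripline extrapolations.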

  14. Development of an exchange-correlation functional with uncertainty quantification capabilities for density functional theory

    Aldegunde, Manuel; Kermode, James R.; Zabaras, Nicholas

    2016-04-01

    This paper presents the development of a new exchange-correlation functional from the point of view of machine learning. Using atomization energies of solids and small molecules, we train a linear model for the exchange enhancement factor using a Bayesian approach which allows for the quantification of uncertainties in the predictions. A relevance vector machine is used to automatically select the most relevant terms of the model. We then test this model on atomization energies and also on bulk properties. The average model provides a mean absolute error of only 0.116 eV for the test points of the G2/97 set but a larger 0.314 eV for the test solids. In terms of bulk properties, the prediction for transition metals and monovalent semiconductors has a very low test error. However, as expected, predictions for types of materials not represented in the training set such as ionic solids show much larger errors.

  15. Applicability of Information Theory to the Quantification of Responses to Anthropogenic Noise by Southeast Alaskan Humpback Whales

    J. Ellen Blue; Taylor Bucci; Christopher Chyba; Sean F. Hanser; Brenda McCowan; Doyle, Laurance R.

    2008-01-01

We assess the effectiveness of applying information theory to the characterization and quantification of the effects of anthropogenic vessel noise on humpback whale (Megaptera novaeangliae) vocal behavior in and around Glacier Bay, Alaska. Vessel noise has the potential to interfere with the complex vocal behavior of these humpback whales, which could have direct consequences on their feeding behavior and thus ultimately on their health and reproduction. Humpback whale feeding calls recorded d...

  16. Uncertainty Quantification of Composite Laminate Damage with the Generalized Information Theory

    J. Lucero; F. Hemez; T. Ross; K. Kline; J. Hundhausen; T. Tippetts

    2006-05-01

    This work presents a survey of five theories to assess the uncertainty of projectile impact induced damage on multi-layered carbon-epoxy composite plates. Because the types of uncertainty dealt with in this application are multiple (variability, ambiguity, and conflict) and because the data sets collected are sparse, characterizing the amount of delamination damage with probability theory alone is possible but incomplete. This motivates the exploration of methods contained within a broad Generalized Information Theory (GIT) that rely on less restrictive assumptions than probability theory. Probability, fuzzy sets, possibility, and imprecise probability (probability boxes (p-boxes) and Dempster-Shafer) are used to assess the uncertainty in composite plate damage. Furthermore, this work highlights the usefulness of each theory. The purpose of the study is not to compare directly the different GIT methods but to show that they can be deployed on a practical application and to compare the assumptions upon which these theories are based. The data sets consist of experimental measurements and finite element predictions of the amount of delamination and fiber splitting damage as multilayered composite plates are impacted by a projectile at various velocities. The physical experiments consist of using a gas gun to impact suspended plates with a projectile accelerated to prescribed velocities, then, taking ultrasound images of the resulting delamination. The nonlinear, multiple length-scale numerical simulations couple local crack propagation implemented through cohesive zone modeling to global stress-displacement finite element analysis. The assessment of damage uncertainty is performed in three steps by, first, considering the test data only; then, considering the simulation data only; finally, performing an assessment of total uncertainty where test and simulation data sets are combined. This study leads to practical recommendations for reducing the uncertainty and
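One of the GIT structures named above, the probability box (p-box), can be sketched directly from interval-valued data: when each measurement is only known to an interval, the empirical CDF is bounded rather than known. The interval data below are hypothetical, not the paper's delamination measurements.

```python
# Interval observations [lo, hi]: each test only bounds the damage value.
intervals = [(1.0, 2.0), (1.5, 3.0), (2.5, 4.0), (0.5, 1.5)]

def pbox_cdf_bounds(x, intervals):
    """Bounds on P(damage <= x) consistent with every interval observation."""
    n = len(intervals)
    lower = sum(hi <= x for lo, hi in intervals) / n  # mass placed as late as possible
    upper = sum(lo <= x for lo, hi in intervals) / n  # mass placed as early as possible
    return lower, upper

lo, hi = pbox_cdf_bounds(2.0, intervals)
print(lo, hi)  # the gap between bounds reflects epistemic uncertainty
```

With precise data every interval collapses to a point, the two bounds coincide, and the p-box degenerates to an ordinary empirical CDF, which is the sense in which probability theory is the special case.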

  17. Simple Type Theory as Framework for Combining Logics

    Benzmueller, Christoph

    2010-01-01

Simple type theory is suited as a framework for combining classical and non-classical logics. This claim is based on the observation that various prominent logics, including (quantified) multimodal logics and intuitionistic logics, can be elegantly embedded in simple type theory. Furthermore, simple type theory is sufficiently expressive to model combinations of embedded logics and it has a well-understood semantics. Off-the-shelf reasoning systems for simple type theory exist that can be uniformly employed for reasoning within and about combinations of logics.

  18. Verifying Process Algebra Proofs in Type Theory

    Sellink, M.P.A.

    2008-01-01

    In this paper we study automatic verification of proofs in process algebra. Formulas of process algebra are represented by types in typed λ-calculus. Inhabitants (terms) of these types represent proofs. The specific typed λ-calculus we use is the Calculus of Inductive Constructions as implemented in

  19. Toward a Theory of Psychological Type Congruence for Advertisers.

    McBride, Michael H.; And Others

    Focusing on the impact of advertisers' persuasive selling messages on consumers, this paper discusses topics relating to the theory of psychological type congruence. Based on an examination of persuasion theory and relevant psychological concepts, including recent cognitive stability and personality and needs theory and the older concept of…

  20. Water type quantification in the Skagerrak, the Kattegat and off the Jutland west coast

    Trond Kristiansen

    2015-04-01

An extensive data series of salinity, nutrients and coloured dissolved organic material (CDOM) was collected in the Skagerrak, the northern part of the Kattegat and off the Jutland west coast in April each year during the period 1996–2000 by the Institute of Marine Research in Norway. In this month, after the spring bloom, German Bight Water differs from its surrounding waters by a higher nitrate content and higher nitrate/phosphate and nitrate/silicate ratios. The spreading of this water type into the Skagerrak is of special interest with regard to toxic algal blooms. The quantification of the spatial distributions of the different water types required the development of a new algorithm for the area containing the Norwegian Coastal Current, while an earlier Danish algorithm was applied for the rest of the area. From the upper 50 m, a total of 2227 observations of salinity and CDOM content have been used to calculate the mean concentration of water from the German Bight, the North Sea (Atlantic water), the Baltic Sea and Norwegian rivers. The Atlantic Water was the dominant water type, with a mean concentration of 79%; German Bight Water constituted 11%, Baltic Water 8%, and Norwegian River Water 2%. At the surface the mean percentages of these water types were found to be 68%, 15%, 15%, and 3%, respectively. Within the northern part of the Skagerrak, closer to the Norwegian coast, the surface waters were estimated to consist of 74% Atlantic Water, 20% Baltic Water, and 7% Norwegian River Water. The analysis indicates that the content of German Bight Water in this part is less than 5%.
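The core of such water-type quantification is a linear mixing model: each sample's tracers (here salinity and CDOM) are expressed as a fraction-weighted sum of end-member water types, with the fractions constrained to sum to one, and solved by least squares. The end-member values below are invented for illustration, not the paper's calibration.

```python
import numpy as np

# rows: [salinity, CDOM]; columns: Atlantic, German Bight, Baltic, river water.
E = np.array([
    [35.2, 33.0, 8.0, 0.0],   # salinity of each end member (invented)
    [0.3,  1.2,  2.5, 4.0],   # CDOM of each end member (invented)
])
# Append a row of ones to impose the sum-to-one constraint on the fractions.
A = np.vstack([E, np.ones(4)])

sample = np.array([28.0, 0.9, 1.0])   # observed [salinity, CDOM, 1]
frac, *_ = np.linalg.lstsq(A, sample, rcond=None)
print(frac, frac.sum())               # water-type fractions, summing to ~1
```

A real analysis would also impose non-negativity on the fractions and would use more tracers (nutrients) than unknowns so that the system is overdetermined rather than underdetermined.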

  1. Evidence theory and differential evolution based uncertainty quantification for buckling load of semi-rigid jointed frames

    Hesheng Tang; Yu Su; Jiao Wang

    2015-08-01

The paper describes a procedure for uncertainty quantification (UQ) using evidence theory in the buckling analysis of semi-rigid jointed frame structures under mixed epistemic–aleatory uncertainty. Design uncertainties (geometrical, material, strength, and manufacturing) are prevalent in engineering applications. Because of incomplete, inaccurate, or unclear information in modeling, simulation, measurement, and design, a single framework (probability theory) is often insufficient to quantify the uncertainty in a system. Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty, which derives from a lack of knowledge about the appropriate values to use for various inputs to the model. Unfortunately, propagating an evidence-theory representation of uncertainty through a model is more computationally demanding than propagating a probabilistic representation. To alleviate the computational difficulties of evidence-theory-based UQ analysis, a differential-evolution-based computational strategy for the propagation of epistemic uncertainty in a system with evidence theory is presented here. A UQ analysis of the buckling load of steel plane frames with semi-rigid connections demonstrates the accuracy and efficiency of the proposed method.
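The evidence-theory representation referred to here assigns basic probability masses to focal elements (here, intervals of buckling load) and summarizes any query interval by a belief/plausibility pair. A minimal sketch, with invented masses and intervals rather than the paper's structural data:

```python
# Body of evidence: basic probability mass assigned to focal intervals of the
# uncertain buckling load (values invented for illustration).
focal = {(100.0, 120.0): 0.5, (110.0, 140.0): 0.3, (90.0, 150.0): 0.2}

def bel_pl(lo, hi):
    """Belief (mass of focal elements inside [lo, hi]) and plausibility
    (mass of focal elements intersecting [lo, hi])."""
    bel = sum(m for (a, b), m in focal.items() if lo <= a and b <= hi)
    pl = sum(m for (a, b), m in focal.items() if a <= hi and b >= lo)
    return bel, pl

print(bel_pl(100.0, 140.0))  # belief <= plausibility always holds
```

The expensive part in practice is that each focal element requires a minimum and maximum of the structural model over that interval, which is the inner optimization the paper attacks with differential evolution.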

  2. A Uniform Approach to Type Theory

    1989-01-01


  3. Theory confronts experiment in the Casimir force measurements: quantification of errors and precision

    Chen, F; Mohideen, U; Mostepanenko, V M

    2004-01-01

    We compare theory and experiment in the Casimir force measurement between gold surfaces performed with the atomic force microscope. Both random and systematic experimental errors are found leading to a total absolute error equal to 8.5 pN at 95% confidence. In terms of the relative errors, experimental precision of 1.75% is obtained at the shortest separation of 62 nm at 95% confidence level (at 60% confidence the experimental precision of 1% is confirmed at the shortest separation). An independent determination of the accuracy of the theoretical calculations of the Casimir force and its application to the experimental configuration is carefully made. Special attention is paid to the sample-dependent variations of the optical tabulated data due to the presence of grains, contribution of surface plasmons, and errors introduced by the use of the proximity force theorem. Nonmultiplicative and diffraction-type contributions to the surface roughness corrections are examined. The electric forces due to patch potent...

  4. Decorated linear order types and the theory of concatenation

    Cacic, V.; Pudlák, P.; Restall, G.; Urquhart, A.; Visser, A.

    2008-01-01

We study the interpretation of Grzegorczyk’s Theory of Concatenation TC in structures of decorated linear order types satisfying Grzegorczyk’s axioms. We show that TC is incomplete for this interpretation. What is more, the first order theory validated by this interpretation interprets arithmetical truth.

  5. Decorated linear order types and the theory of concatenation

    Cacic, V.; Pudlák, P. (Pavel); Restall, G.; Urquhart, A.; De Visser, A.

    2008-01-01

    We study the interpretation of Grzegorczyk’s Theory of Concatenation TC in structures of decorated linear order types satisfying Grzegorczyk’s axioms. We show that TC is incomplete for this interpretation. What is more, the first order theory validated by this interpretation interprets arithmetical truth. We also show that every extension of TC has a model that is not isomorphic to a structure of decorated order types. We provide a positive result, to wit a construction that builds structures...

  6. Unified theory of type I and type II irregularities in the equatorial electrojet

    Sudan, R. N.

    1983-01-01

A nonlinear unified theory of type I and type II irregularities is presented that explains their principal observed characteristics. The power spectrum is predicted by using a Kolmogorov-type conservation law for the power flow in cascading eddies.

  7. Calabi-Yau compactification of type II string theories

    Banerjee, Sibasish

    2016-01-01

Superstring theories are the most promising theories for a unified description of all fundamental interactions including gravity. However, these theories are formulated consistently only in 10 spacetime dimensions. Therefore, to connect to the observable world, it is necessary to compactify 6 of those 10 dimensions in a suitable fashion. In this thesis, we mainly consider compactifications of type II string theories on Calabi-Yau threefolds. As a consequence, the resulting four dimensional theories preserve $\mathcal{N}=2$ supersymmetry. In these cases the metrics on the moduli spaces of the matter multiplets, vector and hypermultiplets, completely determine the low energy theories. Whereas the former are very well understood by now, the complete description of hypermultiplets is more complicated. In fact, hypermultiplets receive both perturbative and non-perturbative corrections. The thesis mainly pertains to the understanding of the non-perturbative corrections. Our findings for the hypermultiplets rely on...

  8. Intensional Type Theory with Guarded Recursive Types qua Fixed Points on Universes

    Birkedal, Lars; Mogelberg, R.E.

    2013-01-01

Guarded recursive functions and types are useful for giving semantics to advanced programming languages and for higher-order programming with infinite data types, such as streams, e.g., for modeling reactive systems. We propose an extension of intensional type theory with rules for forming fixed points of guarded recursive functions. Guarded recursive types can be formed simply by taking fixed points of guarded recursive functions on the universe of types. Moreover, we present a general model construction for constructing models of the intensional type theory with guarded recursive functions and types. When applied to the groupoid model of intensional type theory with the universe of small discrete groupoids, the construction gives a model of guarded recursion for which there is a one-to-one correspondence between fixed points of functions on the universe of types and fixed points of (suitable...

  9. Inductive Data Types Based on Fibrations Theory in Programming

    Decheng Miao

    2016-03-01

Traditional methods based on algebra and category theory have some deficiencies in analyzing the semantic properties and describing the induction rules of inductive data types; we present a method based on fibrations theory to address these questions. We systematically analyze some basic logical structures of inductive data types relative to a fibration, such as the re-indexing functor, the truth functor and the comprehension functor, and construct semantic models of non-indexed fibrations, single-sorted indexed fibrations and many-sorted indexed fibrations, respectively. On this basis, we thoroughly discuss the semantic properties of fibred, single-sorted indexed and many-sorted indexed inductive data types, and abstractly describe their induction rules with universality. Furthermore, we briefly illustrate, via examples, applications of the three kinds of inductive data types for analyzing semantic properties and describing induction rules based on fibrations theory. Compared with traditional methods, our work has three advantages. Firstly, the brevity and flexible extensibility of fibrations theory allow the semantic properties of inductive data types to be analyzed accurately, with their semantics computed automatically. Secondly, the abstractness of fibrations theory does not rely on particular computing environments to depict induction rules of inductive data types with universality. Thirdly, its rigour and consistency provide a sound basis for the testing and maintenance of software.

  10. A Model of PCF in Guarded Type Theory

    Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars

    2015-01-01

Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.

  11. Denotational semantics of recursive types in synthetic guarded domain theory

    Møgelberg, Rasmus Ejlers; Paviotti, Marco

    2016-01-01

Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for reasoning operationally about programming languages with advanced features including general references, recursive types, countable non-determinism and concurrency. Guarded recursion also offers a way of adding recursion to type theory while maintaining logical consistency. In previous work we initiated a programme of denotational semantics in type theory using guarded recursion, by constructing a computationally adequate model of the language PCF (simply... ...as was the case in previous work, but we show how to recover extensionality using a logical relation. All constructions and reasoning in this paper, including proofs of theorems such as soundness and adequacy, are by (informal) reasoning in type theory, often using guarded recursion.

  12. Formation of social types in the theory of Orrin Klapp

    Trifunović Vesna

    2007-01-01

Orrin Klapp's theory of social types draws attention to the important functions that these types have within certain societies, and suggests that it is preferable to take them into consideration if our goal is a more complete knowledge of those societies. For Klapp, social types are important social symbols which, in an interesting way, reflect the society they are part of, and for that reason the author dedicates his work to considering their meanings and social functions. He thinks that we cannot under...

  13. Module-based Hybrid Uncertainty Quantification for Multi-physics Applications: Theory and Software

    Tong, Charles [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chen, Xiao [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Iaccarino, Gianluca [Stanford Univ., CA (United States); Mittal, Akshay [Stanford Univ., CA (United States)

    2013-10-08

In this project we proposed to develop an innovative uncertainty quantification methodology that captures the best of the two competing approaches in UQ, namely, intrusive and non-intrusive approaches. The idea is to develop the mathematics and the associated computational framework and algorithms to facilitate the use of intrusive or non-intrusive UQ methods in different modules of a multi-physics multi-module simulation model, in such a way that physics code developers for different modules are shielded (as much as possible) from the chores of accounting for the uncertainties introduced by the other modules. As a result of our research and development, we have produced a number of publications, conference presentations, and a software product.

  14. Exotic dual of type II double field theory

    Eric A. Bergshoeff

    2017-04-01

We perform an exotic dualization of the Ramond–Ramond fields in type II double field theory, in which they are encoded in a Majorana–Weyl spinor of O(D,D). Starting from a first-order master action, the dual theory in terms of a tensor–spinor of O(D,D) is determined. This tensor–spinor is subject to an exotic version of the (self-)duality constraint needed for a democratic formulation. We show that in components, reducing O(D,D) to GL(D), one obtains the expected exotically dual theory in terms of mixed Young tableaux fields. To this end, we generalize exotic dualizations to self-dual fields, such as the 4-form in type IIB string theory.

  15. Exotic dual of type II double field theory

    Bergshoeff, Eric A.; Hohm, Olaf; Riccioni, Fabio

    2017-04-01

    We perform an exotic dualization of the Ramond-Ramond fields in type II double field theory, in which they are encoded in a Majorana-Weyl spinor of O (D , D). Starting from a first-order master action, the dual theory in terms of a tensor-spinor of O (D , D) is determined. This tensor-spinor is subject to an exotic version of the (self-)duality constraint needed for a democratic formulation. We show that in components, reducing O (D , D) to GL (D), one obtains the expected exotically dual theory in terms of mixed Young tableaux fields. To this end, we generalize exotic dualizations to self-dual fields, such as the 4-form in type IIB string theory.

  16. On the theory of the type III burst exciter

    Smith, R. A.; Goldstein, M. L.; Papadopoulos, K.

    1976-01-01

    In situ satellite observations of type III burst exciters at 1 AU show that the beam does not evolve into a plateau in velocity space, contrary to the prediction of quasilinear theory. The observations can be explained by a theory that includes mode coupling effects due to excitation of the parametric oscillating two-stream instability and its saturation by anomalous resistivity. The time evolution of the beam velocity distribution is included in the analysis.

  17. Closed tachyon solitons in type II string theory

    Garcia-Etxebarria, Inaki [Max Planck Institute for Physics, Munich (Germany); Montero, Miguel [Instituto de Fisica Teorica IFT-UAM/CSIC, C/Nicolas Cabrera 13-15, Universidad Autonoma de Madrid (Spain); Departamento de Fisica Teorica, Universidad Autonoma de Madrid (Spain); Uranga, Angel M. [Instituto de Fisica Teorica IFT-UAM/CSIC, C/Nicolas Cabrera 13-15, Universidad Autonoma de Madrid (Spain)

    2015-09-15

Type II theories can be described as the endpoint of closed string tachyon condensation in certain orbifolds of supercritical type 0 theories. In this paper, we study solitons of this closed string tachyon and analyze the nature of the resulting defects in critical type II theories. The solitons are classified by the real K-theory groups KO of bundles associated to pairs of supercritical dimensions. For real codimension 4 and 8, corresponding to KO(S^4) = Z and KO(S^8) = Z, the defects correspond to a gravitational instanton and a fundamental string, respectively. We apply these ideas to reinterpret the worldsheet GLSM, regarded as a supercritical theory on the ambient toric space with closed tachyon condensation onto the CY hypersurface, and use it to describe charged solitons under discrete isometries. We also suggest possible applications of supercritical strings to the physical interpretation of the matrix factorization description of F-theory on singular spaces. (copyright 2015 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  18. Rational sphere valued supercocycles in M-theory and type IIA string theory

    Fiorenza, Domenico; Sati, Hisham; Schreiber, Urs

    2017-04-01

    We show that supercocycles on super L∞-algebras capture, at the rational level, the twisted cohomological charge structure of the fields of M-theory and of type IIA string theory. We show that rational 4-sphere-valued supercocycles for M-branes in M-theory descend to supercocycles in type IIA string theory with coefficients in the free loop space of the 4-sphere, to yield the Ramond-Ramond fields in the rational image of twisted K-theory, with the twist given by the B-field. In particular, we derive the M2/M5 ↔ F1/Dp/NS5 correspondence via dimensional reduction of sphere-valued super-L∞-cocycles.

  19. Models of Particle Physics from Type IIB String Theory and F-theory: A Review

    Maharana, Anshuman

    2012-01-01

    We review particle physics model building in type IIB string theory and F-theory. This is a region in the landscape where in principle many of the key ingredients required for a realistic model of particle physics can be combined successfully. We begin by reviewing moduli stabilisation within this framework and its implications for supersymmetry breaking. We then review model building tools and developments in the weakly coupled type IIB limit, for both local D3-branes at singularities and global models of intersecting D7-branes. Much of recent model building work has been in the strongly coupled regime of F-theory due to the presence of exceptional symmetries which allow for the construction of phenomenologically appealing Grand Unified Theories. We review both local and global F-theory model building starting from the fundamental concepts and tools regarding how the gauge group, matter sector and operators arise, and ranging to detailed phenomenological properties explored in the literature.

  20. Type I/heterotic duality and M-theory amplitudes

    Green, Michael B.; Rudra, Arnab

    2016-12-01

This paper investigates relationships between low-energy four-particle scattering amplitudes with external gauge particles and gravitons in the E8 × E8 and SO(32) heterotic string theories and the type I and type IA superstring theories by considering a variety of tree level and one-loop Feynman diagrams describing such amplitudes in eleven-dimensional supergravity in a Horava-Witten background compactified on a circle. This accounts for a number of perturbative and non-perturbative aspects of low-order higher-derivative terms in the low-energy expansion of string theory amplitudes, which are expected to be protected by half maximal supersymmetry from receiving corrections beyond one or two loops. It also suggests the manner in which type I/heterotic duality may be realised for certain higher-derivative interactions that are not so obviously protected. For example, our considerations suggest that R^4 interactions (where R is the Riemann curvature) might receive no perturbative corrections beyond one loop by virtue of a conspiracy involving contributions from (non-BPS) Z_2 D-instantons in the type I and heterotic SO(32) theories.

  1. Inverse regression-based uncertainty quantification algorithms for high-dimensional models: Theory and practice

    Li, Weixuan; Lin, Guang; Li, Bing

    2016-09-01

A well-known challenge in uncertainty quantification (UQ) is the "curse of dimensionality". However, many high-dimensional UQ problems are essentially low-dimensional, because the randomness of the quantity of interest (QoI) is caused only by uncertain parameters varying within a low-dimensional subspace, known as the sufficient dimension reduction (SDR) subspace. Motivated by this observation, we propose and demonstrate in this paper an inverse regression-based UQ approach (IRUQ) for high-dimensional problems. Specifically, we use an inverse regression procedure to estimate the SDR subspace and then convert the original problem to a low-dimensional one, which can be efficiently solved by building a response surface model such as a polynomial chaos expansion. The novelty and advantages of the proposed approach are seen in its computational efficiency and practicality. Compared with Monte Carlo, the traditionally preferred approach for high-dimensional UQ, IRUQ with a comparable cost generally gives much more accurate solutions even for high-dimensional problems, and even when the dimension reduction is not exactly sufficient. Theoretically, IRUQ is proved to converge twice as fast as the approach it uses to seek the SDR subspace. For example, while a sliced inverse regression method converges to the SDR subspace at the rate of $O(n^{-1/2})$, the corresponding IRUQ converges at $O(n^{-1})$. IRUQ also provides several desired conveniences in practice. It is non-intrusive, requiring only a simulator to generate realizations of the QoI, and there is no need to compute the high-dimensional gradient of the QoI. Finally, error bars can be derived for the estimation results reported by IRUQ.
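The sliced inverse regression (SIR) step mentioned above can be sketched in a few lines: slice the samples by the output value, average the inputs within each slice, and take the leading eigenvector of the covariance of those slice means as the estimated SDR direction. The synthetic one-direction test problem below is invented for illustration, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 2000, 5
X = rng.normal(size=(n, p))                  # standardized inputs
beta = np.zeros(p); beta[0] = 1.0            # true SDR direction: first axis
y = np.sin(X @ beta) + 0.01 * rng.normal(size=n)

# SIR: slice on y, average X within each slice, and take the top eigenvector
# of the covariance of the slice means.
order = np.argsort(y)
slices = np.array_split(order, 10)
means = np.array([X[s].mean(axis=0) for s in slices])
_, vecs = np.linalg.eigh(np.cov(means.T))
direction = vecs[:, -1]                      # eigenvector of largest eigenvalue
print(direction)                             # close to +/- e_1
```

Once `direction` is in hand, the QoI can be modeled as a function of the one-dimensional projection `X @ direction`, which is the low-dimensional surrogate problem IRUQ then solves.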

  2. A model of PCF in guarded type theory

    Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars

    2015-01-01

Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.

  3. Constructive Type Theory and the Dialogical Approach to Meaning

    Shahid Rahman

    2013-12-01

    Full Text Available In its origins Dialogical logic constituted one part of a new movement called the Erlangen School or Erlangen Constructivism. Its goal was to provide a new start to a general theory of language and of science. According to the Erlangen-School, language is not just a fact that we discover, but a human cultural accomplishment whose construction reason can and should control. The resulting project of intentionally constructing a scientific language was called the Orthosprache-project. Unfortunately, the Orthosprache-project was not further developed and seemed to fade away. It is possible that one of the reasons for this fading away is that the link between dialogical logic and Orthosprache was not sufficiently developed - in particular, the new theory of meaning to be found in dialogical logic seemed to be cut off from both the project of establishing the basis for scientific language and also from a general theory of meaning. We would like to contribute to clarifying one possible way in which a general dialogical theory of meaning could be linked to dialogical logic. The idea behind the proposal is to make use of constructive type theory in which logical inferences are preceded by the description of a fully interpreted language. The latter, we think, provides the means for a new start not only for the project of Orthosprache, but also for a general dialogical theory of meaning.

  4. D-branes in Type IIA and Type IIB theories from tachyon condensation

    Kluson, J

    2000-01-01

    In this paper we construct all D-branes in Type IIA and Type IIB theories via tachyon condensation. We also propose a form of the Wess-Zumino term for the non-BPS D-brane and show that tachyon condensation in this term leads to the standard Wess-Zumino term for the BPS D-brane.

  5. Formation of social types in the theory of Orrin Klapp

    Trifunović Vesna

    2007-01-01

    Full Text Available Orrin Klapp's theory of social types draws attention to the important functions that these types perform within particular societies, and argues that they should be taken into consideration if our goal is a more complete knowledge of a society. For Klapp, social types are important social symbols that reflect, in an interesting way, the society they are part of, and for that reason he dedicates his work to considering their meanings and social functions. He holds that we cannot understand a society without knowledge of the types with which its members identify and which serve them as models in their social activity. These types therefore have cognitive value: according to Klapp, they aid perception and "contain the truth", so knowledge of them allows easier orientation within the social system. Social types also offer insight into the scheme of the social structure, which is otherwise invisible and hidden, but certainly deserves attention if we want a clearer picture of social relations within a specific community. The aim of this work is to present this interesting and inspiring theory of Orrin Klapp, pointing out its importance but also the weaknesses that should be kept in mind during its application in further research.

  6. Fokker's type action at a distance theory of gravitation

    Turygin, A. Iu.

    1986-04-01

    An attempt is made to develop a theory of direct gravitational interaction of an arbitrary order, i.e., a many-particle interaction. The situation considered is that of a moving particle in a fixed Riemannian space-time which interacts with a system characterized by a definite energy momentum tensor. A linear formulation of the Fokker type action is used to define particle equations of motion in a geodesic form in a metric of arbitrary order. A proof is developed to show that the resulting metric satisfies the Einstein equations and is commensurate with the Lorentz gauge in Wheeler-Feynman electrodynamics. When applied to many-particle interactions, the absorber theory of gravitational radiation that has been defined is effective only for a linear approximation solution to the Einstein equation, and will require further work to serve as a general absorber theory.

  7. Uncertainty Propagation and Quantification using Constrained Coupled Adaptive Forward-Inverse Schemes: Theory and Applications

    Ryerson, F. J.; Ezzedine, S. M.; Antoun, T.

    2013-12-01

    equation for the distribution of k is solved, provided that Cauchy data are appropriately assigned. In the next stage, only a limited number of passive measurements are provided. In this case, the forward and inverse PDEs are solved simultaneously. This is accomplished by adding regularization terms and filtering the pressure gradients in the inverse problem. Both the forward and the inverse problem are either simultaneously or sequentially coupled and solved using implicit schemes, adaptive mesh refinement, and Galerkin finite elements. The final case arises when P, k, and Q data exist only at producing wells. This exceedingly ill-posed problem calls for additional constraints on the forward-inverse coupling to ensure that the production rates are satisfied at the desired locations. Results from all three cases are presented, demonstrating the stability and accuracy of the proposed approach and, more importantly, providing some insights into the consequences of data undersampling, uncertainty propagation and quantification. We illustrate the advantages of this novel approach over the common UQ forward drivers on several subsurface energy problems in porous, fractured, and/or faulted reservoirs. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
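    The role of gradient filtering in the inverse step can be illustrated on a toy one-dimensional Darcy problem, q = -k dP/dx with a known flux q. This is a hedged sketch of the general idea only, not the paper's coupled adaptive finite-element scheme; the moving-average filter stands in for the abstract's pressure-gradient filtering.

```python
import numpy as np

def estimate_permeability(x, P, q, window=5):
    """Invert Darcy's law q = -k dP/dx for k(x) from pressure data,
    smoothing the pressure gradient with a moving average before
    inverting (a stand-in for the gradient filtering mentioned in
    the abstract). Illustrative sketch only."""
    dPdx = np.gradient(P, x)
    kernel = np.ones(window) / window
    dPdx_smooth = np.convolve(dPdx, kernel, mode="same")
    return -q / dPdx_smooth

# Synthetic check: linear pressure drop dP/dx = -0.25 with q = 1
# implies k = -q / (dP/dx) = 4 in the interior.
x = np.linspace(0.0, 1.0, 101)
P = 1.0 - 0.25 * x
k = estimate_permeability(x, P, q=1.0)
print(round(k[50], 6))  # interior estimate, approximately 4.0
```

    With noisy pressure data the unfiltered gradient oscillates and the pointwise division amplifies the noise, which is why regularization and filtering are needed in the realistic, ill-posed settings the abstract describes.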

  8. Nucleation of vacuum bubbles in Brans-Dicke type theory

    Kim, Hongsu; Lee, Wonwoo; Lee, Young Jae; Yeom, Dong-han

    2010-01-01

    In this paper, we study the nucleation of vacuum bubbles in the Brans-Dicke type theory of gravity. In the Euclidean signature, we calculate field configurations of vacuum bubbles as solutions of the Einstein and field equations, as well as their probabilities, by integrating the Euclidean action. We illustrate three possible ways to obtain vacuum bubbles: true vacuum bubbles for $\omega$ > -3/2, false vacuum bubbles for $\omega$ < -3/2, and false vacuum bubbles for $\omega$ > -3/2 when the vacuum energy of the false vacuum in the potential of the Einstein frame is less than that of the true vacuum. After the bubble is nucleated at the t = 0 surface, we can smoothly connect and match the field configurations to some solutions of the Lorentzian signature and consistently continue their subsequent evolutions. Therefore, we conclude that, in general scalar-tensor theories or Brans-Dicke type theories, which include some models of string theory, vacuum bubbles are allowed not only in the form of true vacuum bubbles but also false vacuum bubbles, as long as a special cond...

  9. Multivariate Bonferroni-type inequalities theory and applications

    Chen, John

    2014-01-01

    Multivariate Bonferroni-Type Inequalities: Theory and Applications presents a systematic account of research discoveries on multivariate Bonferroni-type inequalities published in the past decade. The emergence of new bounding approaches pushes the conventional definitions of optimal inequalities and demands new insights into linear and Fréchet optimality. The book explores these advances in bounding techniques with corresponding innovative applications. It presents the method of linear programming for multivariate bounds, multivariate hybrid bounds, sub-Markovian bounds, and bounds using Hamil

  10. Type 1 2HDM as Effective Theory of Supersymmetry

    邵华

    2012-01-01

    It is generally believed that the low energy effective theory of the minimal supersymmetric standard model is the type 2 two Higgs doublet model. We show that the type 1 two Higgs doublet model can also arise as the effective theory of supersymmetry in a specific case with high-scale supersymmetry breaking and gauge mediation. If the other electroweak doublet obtains a vacuum expectation value after electroweak symmetry breaking, the Higgs spectrum is quite different. A remarkable feature is that the physical Higgs boson mass can be 125 GeV, unlike in ordinary models with high-scale supersymmetry in which the Higgs mass is generally around 140 GeV.

  11. Development and assessment of a multiplex real-time PCR assay for quantification of human immunodeficiency virus type 1 DNA.

    Beloukas, A; Paraskevis, D; Haida, C; Sypsa, V; Hatzakis, A

    2009-07-01

    Previous studies showed that high levels of human immunodeficiency virus type 1 (HIV-1) DNA are associated with a faster progression to AIDS, an increased risk of death, and a higher risk of HIV RNA rebound in patients on highly active antiretroviral therapy. Our objective was to develop and assess a highly sensitive real-time multiplex PCR assay for the quantification of HIV-1 DNA (RTMP-HIV) based on molecular beacons. HIV-1 DNA quantification was carried out by RTMP in a LightCycler 2.0 apparatus. HIV-1 DNA was quantified in parallel with CCR5 as a reference gene, and reported values are numbers of HIV-1 DNA copies/10^6 peripheral blood mononuclear cells (PBMCs). The clinical sensitivity of the assay was assessed for 115 newly diagnosed HIV-1-infected individuals. The analytical sensitivity was estimated to be 12.5 copies of HIV-1 DNA per 10^6 PBMCs, while the clinical sensitivity was 100%, with levels ranging from 1.23 to 4.25 log10 HIV-1 DNA copies/10^6 PBMCs. In conclusion, we developed and assessed a new RTMP-HIV assay based on molecular beacons, using a LightCycler 2.0 instrument. This multiplex assay has comparable sensitivity, reproducibility, and accuracy to single real-time PCR assays.
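    The reported unit, HIV-1 DNA copies per 10^6 PBMCs with CCR5 as the cellular reference gene, implies a simple normalization. The sketch below assumes the common convention of two CCR5 copies per diploid cell; the abstract does not state the assay's exact calibration, so this is an illustration rather than the authors' formula.

```python
def hiv_dna_per_million_pbmc(hiv_copies, ccr5_copies):
    """Normalize an HIV-1 DNA measurement to copies per 10^6 PBMCs,
    assuming two CCR5 copies per diploid cell (a common convention;
    the assay's exact calibration may differ)."""
    cell_equivalents = ccr5_copies / 2.0  # cells represented in the reaction
    return hiv_copies * 1e6 / cell_equivalents

# e.g. 30 HIV-1 DNA copies detected alongside 400,000 CCR5 copies
print(hiv_dna_per_million_pbmc(30, 400_000))  # → 150.0
```

    A value of 150 copies/10^6 PBMCs corresponds to about 2.2 log10, which sits comfortably inside the 1.23 to 4.25 log10 clinical range quoted in the abstract.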

  12. Type I/heterotic duality and M-theory amplitudes

    Green, Michael B

    2016-01-01

    This paper investigates relationships between low-energy four-particle scattering amplitudes with external gauge particles and gravitons in the E_8 X E_8 and SO(32) heterotic string theories and the type I and type IA superstring theories by considering a variety of tree level and one-loop Feynman diagrams describing such amplitudes in eleven-dimensional supergravity in a Horava--Witten background compactified on a circle. This accounts for a number of perturbative and non-perturbative aspects of low order higher derivative terms in the low-energy expansion of string theory amplitudes, which are expected to be protected by half maximal supersymmetry from receiving corrections beyond one or two loops. It also suggests the manner in which type I/heterotic duality may be realised for certain higher derivative interactions that are not so obviously protected. For example, our considerations suggest that R**4 interactions (where R is the Riemann curvature) might receive no perturbative corrections beyond one loop ...

  13. Irregular singularities in Liouville theory and Argyres-Douglas type gauge theories, I

    Gaiotto, D. [Institute for Advanced Study (IAS), Princeton, NJ (United States); Teschner, J. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2012-03-15

    Motivated by problems arising in the study of N=2 supersymmetric gauge theories we introduce and study irregular singularities in two-dimensional conformal field theory, here Liouville theory. Irregular singularities are associated to representations of the Virasoro algebra in which a subset of the annihilation part of the algebra act diagonally. In this paper we define natural bases for the space of conformal blocks in the presence of irregular singularities, describe how to calculate their series expansions, and how such conformal blocks can be constructed by some delicate limiting procedure from ordinary conformal blocks. This leads us to a proposal for the structure functions appearing in the decomposition of physical correlation functions with irregular singularities into conformal blocks. Taken together, we get a precise prediction for the partition functions of some Argyres-Douglas type theories on S^4. (orig.)

  14. The dopant type and amount governs the electrochemical performance of graphene platforms for the antioxidant activity quantification

    Hui, Kai Hwee; Ambrosi, Adriano; Sofer, Zdeněk; Pumera, Martin; Bonanni, Alessandra

    2015-05-01

    Graphene doped with heteroatoms can show new or improved properties as compared to the original undoped material. It has been reported that the type of heteroatoms and the doping conditions can have a strong influence on the electronic and electrochemical properties of the resulting material. Here, we wish to compare the electrochemical behavior of two n-type and two p-type doped graphenes, namely boron-doped graphenes and nitrogen-doped graphenes containing different amounts of heteroatoms. We show that the boron-doped graphene containing a higher amount of dopants provides the best electroanalytical performance in terms of calibration sensitivity, selectivity and linearity of response for the detection of gallic acid normally used as the standard probe for the quantification of antioxidant activity of food and beverages. Our findings demonstrate that the type and amount of heteroatoms used for the doping have a profound influence on the electrochemical detection of gallic acid rather than the structural properties of the materials such as amounts of defects, oxygen functionalities and surface area. This finding has a profound influence on the application of doped graphenes in the field of analytical chemistry.

  15. Uncertainty quantification for proton-proton fusion in chiral effective field theory

    Acharya, B.; Carlsson, B. D.; Ekström, A.; Forssén, C.; Platter, L.

    2016-09-01

    We compute the S-factor of the proton-proton (pp) fusion reaction using chiral effective field theory (χEFT) up to next-to-next-to-leading order (NNLO) and perform a rigorous uncertainty analysis of the results. We quantify the uncertainties due to (i) the computational method used to compute the pp cross section in momentum space, (ii) the statistical uncertainties in the low-energy coupling constants of χEFT, (iii) the systematic uncertainty due to the χEFT cutoff, and (iv) systematic variations in the database used to calibrate the nucleon-nucleon interaction. We also examine the robustness of the polynomial extrapolation procedure, which is commonly used to extract the threshold S-factor and its energy-derivatives. By performing a statistical analysis of the polynomial fit of the energy-dependent S-factor at several different energy intervals, we eliminate a systematic uncertainty that can arise from the choice of the fit interval in our calculations. In addition, we explore the statistical correlations between the S-factor and few-nucleon observables such as the binding energies and point-proton radii of 2,3H and 3He as well as the D-state probability and quadrupole moment of 2H, and the β-decay of 3H. We find that, with the state-of-the-art optimization of the nuclear Hamiltonian, the statistical uncertainty in the threshold S-factor cannot be reduced beyond 0.7%.
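    The polynomial-extrapolation step that the abstract scrutinizes, fitting the energy-dependent S-factor on a finite interval and reading off the threshold value S(0), can be illustrated on synthetic data. The numbers below are placeholders, not χEFT results; the point is only that when the fitted degree matches the underlying curve, S(0) is stable under changes of the fit interval, which is the systematic the authors eliminate.

```python
import numpy as np

# Synthetic energy-dependent S-factor (placeholder coefficients, not chiEFT):
# S(E) = S0 + S1*E + 0.5*S2*E^2 in consistent (arbitrary) units.
S0_TRUE, S1, S2 = 4.0e-23, 1.1e-22, 3.0e-22

def threshold_sfactor(E_max, n_pts=50, deg=2):
    """Fit a degree-`deg` polynomial to S(E) on (0, E_max] and return
    the extrapolated threshold value S(0), i.e. the constant term."""
    E = np.linspace(1e-3, E_max, n_pts)
    S = S0_TRUE + S1 * E + 0.5 * S2 * E ** 2
    coeffs = np.polynomial.polynomial.polyfit(E, S, deg)  # low-to-high order
    return coeffs[0]

# S(0) should be stable as the fit interval changes:
for E_max in (0.05, 0.1, 0.2):
    print(threshold_sfactor(E_max))  # each close to S0_TRUE = 4.0e-23
```

    When the true S(E) contains terms beyond the fitted degree, the extracted S(0) drifts with E_max; scanning several intervals, as the abstract describes, exposes and removes that drift.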

  16. Uncertainty quantification for proton-proton fusion in chiral effective field theory

    Acharya, B; Ekström, A; Forssén, C; Platter, L

    2016-01-01

    We compute the $S$-factor of the proton-proton ($pp$) fusion reaction using chiral effective field theory ($\\chi$EFT) up to next-to-next-to-leading order (NNLO) and perform a rigorous uncertainty analysis of the results. We quantify the uncertainties due to (i) the computational method used to compute the $pp$ cross section in momentum space, (ii) the statistical uncertainties in the low-energy coupling constants of $\\chi$EFT, (iii) the systematic uncertainty due to the $\\chi$EFT cutoff, and (iv) systematic variations in the database used to calibrate the nucleon-nucleon interaction. We also examine the robustness of the polynomial extrapolation procedure, which is commonly used to extract the threshold $S$-factor and its energy-derivatives. By performing a statistical analysis of the polynomial fit of the energy-dependent $S$-factor at several different energy intervals, we eliminate a systematic uncertainty that can arise from the choice of the fit interval in our calculations. In addition, we explore the s...

  17. Uncertainty quantification for proton–proton fusion in chiral effective field theory

    B. Acharya

    2016-09-01

    Full Text Available We compute the S-factor of the proton–proton (pp) fusion reaction using chiral effective field theory (χEFT) up to next-to-next-to-leading order (NNLO) and perform a rigorous uncertainty analysis of the results. We quantify the uncertainties due to (i) the computational method used to compute the pp cross section in momentum space, (ii) the statistical uncertainties in the low-energy coupling constants of χEFT, (iii) the systematic uncertainty due to the χEFT cutoff, and (iv) systematic variations in the database used to calibrate the nucleon–nucleon interaction. We also examine the robustness of the polynomial extrapolation procedure, which is commonly used to extract the threshold S-factor and its energy-derivatives. By performing a statistical analysis of the polynomial fit of the energy-dependent S-factor at several different energy intervals, we eliminate a systematic uncertainty that can arise from the choice of the fit interval in our calculations. In addition, we explore the statistical correlations between the S-factor and few-nucleon observables such as the binding energies and point-proton radii of 2,3H and 3He as well as the D-state probability and quadrupole moment of 2H, and the β-decay of 3H. We find that, with the state-of-the-art optimization of the nuclear Hamiltonian, the statistical uncertainty in the threshold S-factor cannot be reduced beyond 0.7%.

  18. Predictions for orientifold field theories from type 0' string theory

    Armoni, Adi [Department of Physics, University of Wales Swansea, Singleton Park, Swansea SA2 8PP (United Kingdom); Imeroni, Emiliano [Institute for Theoretical Physics and Spinoza Institute, Utrecht University, Postbus 80.195, 3508 TD Utrecht (Netherlands)]. E-mail: e.imeroni@phys.uu.nl

    2005-12-29

    Two predictions about finite-N non-supersymmetric 'orientifold field theories' are made by using the dual type 0' string theory on the C^3/(Z_2 x Z_2) orbifold singularity. First, the mass ratio between the lowest pseudoscalar and scalar color-singlets is estimated to be equal to the ratio between the axial anomaly and the scale anomaly at strong coupling, M_-/M_+ ≈ C_-/C_+. Second, the ratio between the domain wall tension and the value of the quark condensate is computed.

  19. D-brane Instantons in Type II String Theory

    Blumenhagen, Ralph; /Munich, Max Planck Inst.; Cvetic, Mirjam; /Pennsylvania U.; Kachru, Shamit; /Stanford U., Phys. Dept. /SLAC; Weigand, Timo; /SLAC

    2009-06-19

    We review recent progress in determining the effects of D-brane instantons in N=1 supersymmetric compactifications of Type II string theory to four dimensions. We describe the abstract D-brane instanton calculus for holomorphic couplings such as the superpotential, the gauge kinetic function and higher fermionic F-terms. This includes a discussion of multi-instanton effects and the implications of background fluxes for the instanton sector. Our presentation also highlights, but is not restricted to the computation of D-brane instanton effects in quiver gauge theories on D-branes at singularities. We then summarize the concrete consequences of stringy D-brane instantons for the construction of semi-realistic models of particle physics or SUSY-breaking in compact and non-compact geometries.

  20. D-brane Instantons in Type II String Theory

    Blumenhagen, Ralph; Kachru, Shamit; Weigand, Timo

    2009-01-01

    We review recent progress in determining the effects of D-brane instantons in N=1 supersymmetric compactifications of Type II string theory to four dimensions. We describe the abstract D-brane instanton calculus for holomorphic couplings such as the superpotential, the gauge kinetic function and higher fermionic F-terms. This includes a discussion of multi-instanton effects and the implications of background fluxes for the instanton sector. Our presentation also highlights, but is not restricted to the computation of D-brane instanton effects in quiver gauge theories on D-branes at singularities. We then summarize the concrete consequences of stringy D-brane instantons for the construction of semi-realistic models of particle physics or SUSY-breaking in compact and non-compact geometries.

  1. Exploration of Action Figure Appeals Using Evaluation Grid Method and Quantification Theory Type I

    Chang, Hua-Cheng; Chen, Hung-Yuan

    2017-01-01

    Contemporary toy is characterized by accelerating social, cultural and technological change. An attractive action figure can grab consumers' attention, influence the latent consuming preference and evoke their pleasure. However, traditional design of action figure is always dependent on designer's opinion, subjective experience and preference. It…

  2. Threshold anomalies in Horava-Lifshitz-type theories

    Amelino-Camelia, Giovanni, E-mail: amelino@roma1.infn.i [Dipartimento di Fisica, Universita di Roma ' La Sapienza' and Sez. Roma 1 INFN, P.le A. Moro 2, 00185 Roma (Italy); Gualtieri, Leonardo; Mercati, Flavio [Dipartimento di Fisica, Universita di Roma ' La Sapienza' and Sez. Roma 1 INFN, P.le A. Moro 2, 00185 Roma (Italy)

    2010-03-29

    Recently the study of threshold kinematic requirements for particle-production processes has played a very significant role in the phenomenology of theories with departures from Poincare symmetry. We here specialize these threshold studies to the case of a class of violations of Poincare symmetry which has been much discussed in the literature on Horava-Lifshitz scenarios. These involve modifications of the energy-momentum ('dispersion') relation that may be different for different types of particles, but always involve even powers of energy-momentum in the correction terms. We establish the requirements for compatibility with the observed cosmic-ray spectrum, which is sensitive to the photopion-production threshold. We find that the implications for the electron-positron pair-production threshold are rather intriguing, in light of some recent studies of TeV emissions by Blazars. Our findings should also provide additional motivation for examining the fate of the law of energy-momentum conservation in Horava-Lifshitz-type theories.
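    For orientation, the unmodified (Lorentz-invariant) photopion-production threshold that the cosmic-ray constraint is sensitive to follows from elementary kinematics: for a head-on collision of a proton with a photon of energy ε, E_th = m_π(2m_p + m_π)/(4ε). The CMB photon energy used below is a typical value, not a parameter taken from the paper; the even-power dispersion corrections discussed in the abstract shift this threshold, which is what the observed cosmic-ray spectrum constrains.

```python
# Standard photopion-production threshold, p + gamma -> p + pi0,
# for a head-on collision: E_th = m_pi (2 m_p + m_pi) / (4 eps).
m_p = 938.272e6    # proton mass, eV
m_pi = 134.977e6   # neutral pion mass, eV
eps = 6.4e-4       # typical CMB photon energy, eV (~2.7 kT at 2.725 K)

E_th = m_pi * (2 * m_p + m_pi) / (4 * eps)
print(f"{E_th:.2e} eV")  # ~1e20 eV, the GZK scale
```

    A dispersion correction of the Horava-Lifshitz type adds even powers of momentum to the left- and right-hand sides of the threshold condition, raising or lowering E_th relative to this benchmark.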

  3. In situ fluid typing and quantification with 1D and 2D NMR logging.

    Sun, Boqin

    2007-05-01

    In situ nuclear magnetic resonance (NMR) fluid typing has recently gained momentum due to data acquisition and inversion algorithm enhancement of NMR logging tools. T2 distributions derived from NMR logging contain information on bulk fluids and pore size distributions. However, the accuracy of fluid typing is greatly overshadowed by the overlap between T2 peaks arising from different fluids with similar apparent T2 relaxation times. Nevertheless, the shapes of T2 distributions from different fluid components are often different and can be predetermined. Inversion with predetermined T2 distributions allows us to perform fluid component decomposition to yield individual fluid volume ratios. Another effective method for in situ fluid typing is two-dimensional (2D) NMR logging, which results in proton population distribution as a function of T2 relaxation time and fluid diffusion coefficient (or T1 relaxation time). Since diffusion coefficients (or T1 relaxation times) for different fluid components can be very different, it is relatively easy to separate oil (especially heavy oil) from water signal in a 2D NMR map and to perform accurate fluid typing. Combining NMR logging with resistivity and/or neutron/density logs provides a third method for in situ fluid typing. We shall describe these techniques with field examples.
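    The fluid-component decomposition described here, inverting a measured T2 distribution against predetermined per-fluid T2 shapes, is naturally a non-negative least-squares problem. The basis shapes below (Gaussians in log T2 for "water" and "oil") are hypothetical illustrations, not shapes from the article.

```python
import numpy as np
from scipy.optimize import nnls

def fluid_volumes(t2_dist, basis):
    """Decompose a measured T2 distribution into non-negative volume
    fractions of predetermined fluid-component shapes (columns of
    `basis`) via non-negative least squares. Illustrative sketch."""
    weights, _ = nnls(basis, t2_dist)
    return weights / weights.sum()

# Hypothetical example: log-spaced T2 grid, Gaussian (in log T2) shapes
# for "water" and "oil", mixed 70/30 and then unmixed.
logT2 = np.linspace(-3, 1, 200)
water = np.exp(-0.5 * ((logT2 - 0.0) / 0.3) ** 2)
oil = np.exp(-0.5 * ((logT2 + 1.5) / 0.3) ** 2)
basis = np.column_stack([water, oil])
measured = 0.7 * water + 0.3 * oil
print(fluid_volumes(measured, basis))  # approximately [0.7, 0.3]
```

    In practice the peaks overlap far more than in this clean example, which is exactly why predetermined shapes (or a second dimension such as diffusion) are needed to make the decomposition well posed.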

  4. Microstructure, quantification and control of dislocations in bast-type plant fibres

    Madsen, Bo; Lester, Catherine L.; Mortensen, Ulrich Andreas;

    2016-01-01

    Bast-type plant fibres are increasingly being used for structural composite applications where high quality fibres with good mechanical properties are required. A central aspect for this application is the existence of dislocations in the cell wall of plant fibres, i.e. regions of misaligned...

  5. Localized-statistical quantification of human serum proteome associated with type 2 diabetes.

    Rong-Xia Li

    Full Text Available BACKGROUND: Recent advances in proteomics have shed light on the discovery of serum proteins or peptides as biomarkers for tracking the progression of diabetes as well as understanding molecular mechanisms of the disease. RESULTS: In this work, human serum of non-diabetic and diabetic cohorts was analyzed by a proteomic approach. To analyze a total of 1377 high-confidence serum proteins, we developed a computing strategy called localized statistics of protein abundance distribution (LSPAD) to calculate a significant bias of a particular protein's abundance between these two cohorts. As a result, 68 proteins were found significantly over-represented in the diabetic serum (p < 0.01). In addition, a pathway-associated analysis was developed to obtain the overall pathway bias associated with type 2 diabetes, from which the significant over-representation of the complement system associated with type 2 diabetes was uncovered. Moreover, an upstream activator of the complement pathway, ficolin-3, was observed to be over-represented in the serum of type 2 diabetic patients, which was further validated with statistical significance (p = 0.012) with more clinical samples. CONCLUSIONS: The developed LSPAD approach is well suited to analyzing proteomic data derived from biologically complex systems such as the plasma proteome. With LSPAD, we disclosed the comprehensive distribution of the proteins associated with diabetes at different abundance levels and the involvement of ficolin-related complement activation in diabetes.

  6. Aspects of Moduli Stabilization in Type IIB String Theory

    Shaaban Khalil

    2016-01-01

    Full Text Available We review moduli stabilization in type IIB string theory compactification with fluxes. We focus on KKLT and the Large Volume Scenario (LVS). We show that the predicted soft SUSY breaking terms in the KKLT model are not phenomenologically viable. In LVS, the following result for the scalar mass, gaugino mass, and trilinear term is obtained: m0 = m1/2 = -A0 = m3/2, which may account for the Higgs mass limit if m3/2 ~ O(1.5 TeV). However, in this case, the relic abundance of the lightest neutralino cannot be consistent with the measured limits. We also study the cosmological consequences of moduli stabilization in both models. In particular, the associated inflation models such as racetrack inflation and Kähler inflation are analyzed. Finally, the problem of moduli destabilization and the effect of string moduli backreaction on the inflation models are discussed.

  7. Classical Bianchi type I cosmology in K-essence theory

    Socorro, J; Espinoza-García, Abraham

    2014-01-01

    We use one of the simplest forms of K-essence theory and apply it to the classical anisotropic Bianchi type I cosmological model, with a barotropic perfect fluid modeling the usual matter content and with a cosmological constant. The classical solutions for any but the stiff fluid, and without a cosmological constant, are found in closed form using a time transformation. We also present the solution with a cosmological constant and some particular values of the barotropic parameter. We discuss the possible isotropization of the cosmological model, using the ratio between the anisotropic parameters and the volume of the universe, and show that these tend to a constant or to zero in different cases. We also include a qualitative analysis of the analog of the Friedmann equation.

  8. Aspects of moduli stabilization in type IIB string theory

    Khalil, Shaaban; Nassar, Ali

    2015-01-01

    We review moduli stabilization in type IIB string theory compactification with fluxes. We focus on the KKLT and Large Volume Scenario (LVS). We show that the predicted soft SUSY breaking terms in KKLT model are not phenomenological viable. In LVS, the following result for scalar mass, gaugino mass, and trilinear term is obtained: $m_0 =m_{1/2}= - A_0=m_{3/2}$, which may account for Higgs mass limit if $m_{3/2} \\sim {\\cal O}(1.5)$ TeV. However, in this case the relic abundance of the lightest neutralino can not be consistent with the measured limits. We also study the cosmological consequences of moduli stabilization in both models. In particular, the associated inflation models such as racetrack inflation and K\\"ahler inflation are analyzed. Finally the problem of moduli destabilization and the effect of string moduli backreaction on the inflation models are discussed.

  9. What is the Nature of a Post-Materialist Paradigm? Three Types of Theories.

    Schwartz, Gary E

    2016-01-01

    What does it mean to have a post-materialist theory? I propose that there are three classes or categories of theories. (1) Type I post-materialist theories: neo-physical theories that are derived from materialist theories, where the materialist theories are still seen as primary and are viewed as being fundamentally necessary to create "non-material" (yet physical) phenomena such as consciousness. (2) Type II post-materialist theories: post-materialist theories of consciousness existing alongside materialist theories, where each class of theories is seen as primary and is viewed as not being derivable from (i.e., not reducible to) the other. And (3) Type III post-materialist theories: where materialist theories are derived from, and are a subset of, more inclusive post-materialist theories of consciousness; here post-materialist theories are seen as primary and are viewed as the ultimate origin of material systems. Type I theories are the least controversial, Type III the most controversial. The three types of theories are considered in the context of the history of the emergence of post-materialist science.

  10. [Quantification of gait using insole type foot pressure monitor : clinical application for chronic hemiplegia].

    Naito, Yutaro; Kimura, Yoshiko; Hashimoto, Takashi; Mori, Masao; Takemoto, Yoshimi

    2014-03-01

    Home-based stroke hemiplegia patients tend to fall easily. Poor toe clearance is reported to be one of the causes of falling, although there are many other related factors. We developed a low-priced insole-type portable foot pressure measurement device, and measured the foot pressure distribution and the foot pressure-time curve of 20 chronic hemiplegia patients and compared them with 36 healthy controls. We also analyzed the outdoor gait of a chronic hemiplegia patient on flat ground, on rough terrain, walking up stairs and on a downward slope. The result was that the load rate of the unaffected heel was significantly increased in hemiplegic gait, and there was a significant negative correlation between the affected-side stance phase rate and gait time for a 10 m distance (r = -0.73, P ...). Chronic hemiplegia patients tend to be highly dependent on their unaffected side during indoor and outdoor gait.

  11. Identification and Quantification of Loline-Type Alkaloids in Endophyte-Infected Grasses by LC-MS/MS.

    Adhikari, Khem B; Boelt, Birte; Fomsgaard, Inge S

    2016-08-10

    Lolines, fungal metabolites of the grass-endophyte association, were identified and quantified using newly developed LC-MS/MS methods in endophyte-infected grasses belonging to the Lolium and Festuca genera after extraction with three different solvents using two extraction methods. The shaking extraction method with isopropanol/water was superior to the other methods due to its high sensitivity, high accuracy (recovery within or close to the range of 80-120%), and high precision (coefficient of variation of <10%). Seven loline alkaloids were identified and quantified using our newly established LC-MS/MS methods, and N-formylloline was the most abundant (5 mg/g dry matter), followed by N-acetylloline. These LC-MS/MS methods used the shortest sample handling time and the fewest sample preparation steps and proved to be good alternatives to existing GC and GC-MS analytical methods without compromising analytical efficiency. In conclusion, we developed for the first time a highly sensitive quantitative LC-MS/MS analytical method for the accurate and reproducible quantification and a LightSight-assisted LC-QTRAP/MS qualitative method for the tentative identification of loline-type alkaloids in endophyte-infected grasses.
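    The validation figures quoted in this record (spike recovery within or close to 80-120% and a coefficient of variation below 10%) follow from standard formulas; the sketch below uses invented replicate measurements, not the paper's data.

    ```python
    # Hypothetical replicate results for a sample spiked with 1.00 mg/g loline:
    measured = [0.92, 0.88, 0.95, 0.90, 0.91]   # mg/g found (invented)
    spiked = 1.00                                # mg/g added (invented)

    mean = sum(measured) / len(measured)
    recovery_pct = 100 * mean / spiked           # recovery of the spike

    # Sample standard deviation (n - 1) and coefficient of variation:
    var = sum((x - mean) ** 2 for x in measured) / (len(measured) - 1)
    cv_pct = 100 * var ** 0.5 / mean

    print(f"recovery = {recovery_pct:.1f}%, CV = {cv_pct:.1f}%")
    ```

    A method passing both checks (recovery near 100%, CV under 10%) is what the abstract means by "high accuracy" and "high precision".
    
    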

  12. Type IIB flux vacua from G-theory II

    Candelas, Philip; Damian, Cesar; Larfors, Magdalena; Morales, Jose Francisco

    2014-01-01

    We find analytic solutions of type IIB supergravity on geometries that locally take the form $\text{Mink}\times M_4\times \mathbb{C}$ with $M_4$ a generalised complex manifold. The solutions involve the metric, the dilaton, and NSNS and RR flux potentials (oriented along $M_4$) parametrised by functions varying only over $\mathbb{C}$. Under this assumption, the supersymmetry equations are solved using the formalism of pure spinors in terms of a finite number of holomorphic functions. Alternatively, the solutions can be viewed as vacua of maximally supersymmetric supergravity in six dimensions with a set of scalar fields varying holomorphically over $\mathbb{C}$. For a class of solutions characterised by up to five holomorphic functions, we outline how the local solutions can be completed to four-dimensional flux vacua of type IIB theory. A detailed study of this global completion for solutions with two holomorphic functions has been carried out in the companion paper [1]. The fluxes of the global solutions ar...

  13. Topologically Massive Gauge Theory: Wu-Yang Type Solutions

    Saygili, K

    2006-01-01

    We discuss Euclidean topologically massive Wu-Yang type solutions of the Maxwell-Chern-Simons and the Yang-Mills-Chern-Simons theories. The most distinctive feature of these solutions is the existence of a natural length scale determined by the topological mass, which is proportional to the square of the gauge coupling constant. We find the non-abelian solution by an SU(2) gauge transformation of the abelian magnetic-monopole-type solution. In topologically massive electrodynamics the field strength locally determines the gauge potential modulo a closed term via the self-duality equation. We present the Hopf map including the topological mass. The Wu-Yang construction is based on patching up the local potentials by means of a gauge transformation which can be expressed in terms of the magnetic or the electric charges. We also discuss solutions with different first Chern numbers. There exists a fundamental scale of length over which the gauge function is single-valued and periodic...

  14. Quantification of Atlantic salmon type-I interferon using an Mx1 promoter reporter gene assay.

    Johansen, Audny; Collet, Bertrand; Sandaker, Elin; Secombes, Christopher J; Jørgensen, Jorunn B

    2004-02-01

    We here describe an assay for the detection of interferon-like activity in Atlantic salmon based on transient transfection of chinook salmon embryo cells (CHSE-214) with a rainbow trout Mx1 promoter linked to a luciferase reporter. A beta-galactosidase gene under the control of a constitutively expressed beta-actin promoter was used as a transfection standard, and luciferase and beta-galactosidase expression were measured with a commercially available kit. Interferon-containing supernatants from poly I:C- or CpG-stimulated leucocytes added to transfected CHSE cells induced high luciferase expression (>60-fold induction compared with supernatants from non-stimulated cells). There was no response to supernatants from LPS- or ConA/PMA-stimulated leucocytes, demonstrating specificity for type I interferon-like activity. Duplicate samples analysed using a cell protection assay for detection of antiviral activity correlated well with levels obtained by the Mx1 promoter reporter gene assay (R2 = 0.97), confirming the reporter assay as a reliable substitute for the standard antiviral assay. The Mx reporter gene assay also offers advantages in sensitivity, dynamic range and reliability over the conventional cell protection assay.

  15. Negative affectivity and social inhibition in cardiovascular disease: evaluating type-D personality and its assessment using item response theory

    Emons, Wilco H.M.; Meijer, Rob R.; Denollet, Johan

    2007-01-01

    Objective: Individuals with increased levels of both negative affectivity (NA) and social inhibition (SI)—referred to as type-D personality—are at increased risk of adverse cardiac events. We used item response theory (IRT) to evaluate NA, SI, and type-D personality as measured by the DS14. The obje...

  16. Threshold anomalies in Horava-Lifshitz-type theories

    Amelino-Camelia, Giovanni; Mercati, Flavio

    2009-01-01

    Recently the study of threshold kinematic requirements for particle-production processes has played a very significant role in the phenomenology of theories with departures from Poincare' symmetry. We here specialize these threshold studies to the case of a class of violations of Poincare' symmetry which has been much discussed in the literature on Horava-Lifshitz scenarios. These involve modifications of the energy-momentum ("dispersion") relation that may be different for different types of particles, but always involve even powers of energy-momentum in the correction terms. We establish the requirements for compatibility with the observed cosmic-ray spectrum, which is sensitive to the photopion-production threshold. We find that the implications for the electron-positron pair-production threshold are rather intriguing, in light of some recent studies of TeV emissions by Blazars. Our findings should also provide motivation for examining the fate of the law of energy-momentum conservation in Horava-Lifshitz-...

  17. Type IIA flux compactifications. Vacua, effective theories and cosmological challenges

    Koers, Simon

    2009-07-30

    In this thesis we studied a number of type IIA SU(3)-structure compactifications with O6-planes on nilmanifolds and cosets, which are tractable enough to allow an explicit derivation of the low-energy effective theory. In particular, we calculated the mass spectrum of the light scalar modes using N = 1 supergravity techniques. For the torus and the Iwasawa solution we also performed an explicit Kaluza-Klein reduction, which led to the same result. For the nilmanifold examples we found that there are always three unstabilized moduli corresponding to axions in the RR sector. In the coset models, by contrast, all moduli are stabilized except for SU(2) x SU(2). We discussed Kaluza-Klein decoupling for the supersymmetric AdS vacua and found that it requires going to the nearly-Calabi-Yau limit. We searched for non-trivial de Sitter minima in the original flux potential away from the AdS vacuum. Finally, in chapter 7, we focused on a family of three coset spaces and constructed non-supersymmetric vacua on them. (orig.)

  18. A new approach to comprehensive quantification of linear landscape elements using biotope types on a regional scale

    Hirt, Ulrike; Mewes, Melanie; Meyer, Burghard C.

    The structure of a landscape is highly relevant for research and planning (such as fulfilling the requirements of the Water Framework Directive (WFD) and implementing comprehensive catchment planning). There is a high potential for restoration of linear landscape elements in most European landscapes. In implementing the WFD in Germany, the restoration of linear landscape elements could be a valuable measure, for example to reduce nutrient input into rivers. Despite the importance of landscape structures for water and nutrient fluxes, biodiversity and the appearance of a landscape, specific studies of linear elements are rare for larger catchment areas. Existing studies are limited because they either use remote sensing data, which do not adequately differentiate all types of linear landscape elements, or they focus only on a specific type of linear element. To address these limitations, we developed a framework allowing comprehensive quantification of linear landscape elements for catchment areas, using publicly available biotope type data. We analysed the dependence of landscape structures on natural regions and regional soil characteristics. Three data sets (differing in biotopes, soil parameters and natural regions) were generated for the catchment area of the middle Mulde River (2700 km²) in Germany, using overlay processes in geographic information systems (GIS), followed by statistical evaluation. The linear landscape components of the total catchment area are divided into roads (55%), flowing water (21%), tree rows (14%), avenues (5%), and hedges (2%). The occurrence of these landscape components varies regionally among natural units and different soil regions. For example, mixed deciduous stands (3.5 m/ha) are far more frequent in foothills (6 m/ha) than in hill country (0.9 m/ha). In contrast, fruit trees are more frequent in hill country (5.2 m/ha) than in the cooler foothills (0.5 m/ha). Some 70% of avenues and 40% of tree rows...

  19. Stable D8-branes and tachyon condensation in type 0 open string theory

    Eyras, E

    1999-01-01

    We consider non-BPS D8- (and D7-) branes in type 0 open string theory and describe under which circumstances these branes are stable. We find stable non-BPS D7- and D8-branes in type 0 with and without D9-branes in the background. By extending the descent relations between D-branes to type 0 theories, the non...

  20. Theory of timber connections with slender dowel type fasteners

    Svensson, Staffan; Munch-Andersen, Jørgen

    2016-01-01

    A theory of the lateral load-carrying capacity of timber connections with slender fasteners is presented. The basis of the theory is the coupled mechanical phenomena acting in the connection while the wood and the slender fastener deform and yield prior to failure. The objective is to derive a sufficient description of the actions and responses which have a determining influence on the load-carrying capacity of timber connections with slender fasteners. Model assumptions are discussed and made, but simplifications are left out. Even so, simple mathematical equations describing the lateral capacity are derived from mechanical equilibrium of the deformed fastener. The proposed theory is verified against tests designed to vary the influence of each isolated mechanical phenomenon as much as possible. The theory shows very high accuracy and precision when predicting the load-carrying capacity.

  1. A remark on collective quantification

    Kontinen, J.; Szymanik, J.

    2008-01-01

    We consider collective quantification in natural language. For many years the common strategy in formalizing collective quantification has been to define the meanings of collective determiners, quantifying over collections, using certain type-shifting operations. These type-shifting operations, i.e. ...

  2. Krichever-Novikov type algebras theory and applications

    Schlichenmaier, Martin

    2014-01-01

    Krichever and Novikov introduced certain classes of infinite-dimensional Lie algebras to extend the Virasoro algebra and its related algebras to Riemann surfaces of higher genus. The author of this book generalized and extended them to a more general setting needed by the applications. Examples of applications are conformal field theory, Wess-Zumino-Novikov-Witten models, moduli space problems, integrable systems, Lax operator algebras, and deformation theory of Lie algebras. Furthermore, they constitute an important class of infinite-dimensional Lie algebras which, due to their geometric origin, are...

  3. MASLOV-TYPE INDEX THEORY FOR SYMPLECTIC PATHS AND SPECTRAL FLOW (II)

    2000-01-01

    Based on the spectral flow and the stratification structures of the symplectic group Sp(2n, C),the Maslov-type index theory and its generalization, the w-index theory parameterized by all w on the unit circle, for arbitrary paths in Sp(2n, C) are established. Then the Bott-type iteration formula of the Maslov-type indices for iterated paths in Sp(2n, C) is proved, and the mean index for any path in Sp(2n, C) is defined. Also, the relation among various Maslov-type index theories is studied.

  4. Development of Primer-Probe Energy Transfer real-time PCR for the detection and quantification of porcine circovirus type 2

    Balint, Adam; Tenk, Miklós; Deim, Zoltán

    2009-01-01

    A real-time PCR assay, based on Primer-Probe Energy Transfer (PriProET), was developed to improve the detection and quantification of porcine circovirus type 2 (PCV2). PCV2 is recognised as the essential infectious agent in post-weaning multisystemic wasting syndrome (PMWS) and has been associated ... in different organs. The data obtained in this study correlate with those described earlier; namely, the viral load in 1 ml plasma and in 500 ng tissue DNA exceeds 10^7 copies in cases of PMWS. The results indicate that the new assay provides a specific, sensitive and robust tool for the improved detection...
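    Absolute viral loads such as the 10^7 copies quoted here are typically read off a real-time PCR standard curve. The sketch below shows the standard Ct-to-copy-number arithmetic with an invented dilution series; it is not the PriProET assay's actual calibration data.

    ```python
    def fit_standard_curve(log10_copies, ct_values):
        """Least-squares fit of Ct = slope * log10(copies) + intercept."""
        n = len(log10_copies)
        mx = sum(log10_copies) / n
        my = sum(ct_values) / n
        sxx = sum((x - mx) ** 2 for x in log10_copies)
        sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, ct_values))
        slope = sxy / sxx
        return slope, my - slope * mx

    def copies_from_ct(ct, slope, intercept):
        """Invert the standard curve to get an absolute copy number."""
        return 10 ** ((ct - intercept) / slope)

    # Hypothetical 10-fold dilution series of a plasmid standard:
    logs = [7.0, 6.0, 5.0, 4.0, 3.0]          # log10(copies per reaction)
    cts = [14.1, 17.5, 20.8, 24.2, 27.5]      # observed threshold cycles
    slope, intercept = fit_standard_curve(logs, cts)
    efficiency = 10 ** (-1.0 / slope) - 1.0   # ~1.0 (100%) for slope near -3.32
    viral_load = copies_from_ct(19.0, slope, intercept)
    print(f"slope={slope:.2f} efficiency={efficiency:.0%} copies={viral_load:.3g}")
    ```

    A slope near -3.32 corresponds to 100% amplification efficiency; any unknown sample's Ct is then converted to copies per reaction and scaled to copies per ml or per 500 ng DNA.
    
    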

  5. Quantification of the N-terminal propeptide of human procollagen type I (PINP): comparison of ELISA and RIA with respect to different molecular forms

    Jensen, Charlotte Harken; Hansen, M; Brandt, J

    1998-01-01

    This paper compares the results of procollagen type I N-terminal propeptide (PINP) quantification by radioimmunoassay (RIA) and enzyme-linked immunosorbent assay (ELISA). PINP in serum from a patient with uremic hyperparathyroidism was measured by RIA and ELISA at 20 micrograms/l and 116 micrograms/l, and the corresponding concentrations in dialysis fluid were 94.5 micrograms/l and 140 micrograms/l, respectively. PINP antigen appears in two distinct peaks following size chromatography, and the two peak fractions display immunological identity and identical M(r)'s (27 kDa; SDS...

  6. Rapid Quantification of Melamine in Different Brands/Types of Milk Powders Using Standard Addition Net Analyte Signal and Near-Infrared Spectroscopy

    Bang-Cheng Tang

    2016-01-01

    Multivariate calibration (MVC) and near-infrared (NIR) spectroscopy have demonstrated potential for rapid analysis of melamine in various dairy products. However, the practical application of ordinary MVC can be largely restricted because the prediction of a new sample from an uncalibrated batch would be subject to a significant bias due to matrix effects. In this study, the feasibility of using NIR spectroscopy and the standard addition (SA) net analyte signal (NAS) method (SANAS) for rapid quantification of melamine in different brands/types of milk powders was investigated. In SANAS, the NAS vector of melamine in an unknown sample as well as in a series of samples spiked with melamine standards is calculated, and the Euclidean norms of the standard series are used to build a straightforward univariate regression model. The analysis of 10 different brands/types of milk powders with melamine levels of 0~0.12% (w/w) indicates that SANAS obtains accurate results, with root mean squared error of prediction (RMSEP) values ranging from 0.0012 to 0.0029. An additional advantage of NAS is the ability to visualize and control possible unwanted variations during standard addition. The proposed method provides a practically useful tool for rapid and nondestructive quantification of melamine in different brands/types of milk powders.
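    The final calibration step of the SANAS approach described above reduces to a univariate standard-addition regression: fit the NAS norms against the added concentrations and read the unknown concentration off the x-intercept. The numbers below are invented for illustration, not the study's data.

    ```python
    import numpy as np

    # Hypothetical standard-addition series for one milk powder sample:
    added = np.array([0.00, 0.02, 0.04, 0.06, 0.08])   # % (w/w) melamine added
    norms = np.array([0.51, 0.72, 0.93, 1.14, 1.35])   # Euclidean norms of NAS

    # Univariate regression norm = slope * added + intercept;
    # the unknown concentration is the magnitude of the x-intercept.
    slope, intercept = np.polyfit(added, norms, 1)
    c_unknown = intercept / slope
    print(round(c_unknown, 3))   # → 0.049
    ```

    Because the calibration line is built within the sample itself, matrix effects shift all points equally and cancel out of the extrapolated intercept, which is the rationale for standard addition over ordinary external calibration.
    
    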

  7. Conformal field theory and functions of hypergeometric type

    Isachenkov, Mikhail

    2016-03-15

    Conformal field theory provides a universal description of various phenomena in natural sciences. Its development, swift and successful, belongs to the major highlights of theoretical physics of the late 20th century. In contrast, advances in the theory of hypergeometric functions have always proceeded at a slower pace throughout the centuries of its existence. The functional identities studied by this mathematical discipline are fascinating both in their complexity and beauty. This thesis investigates the interrelation of the two subjects through a direct analysis of three CFT problems: two-point functions of the 2d strange metal CFT, three-point functions of primaries of the non-rational Toda CFT, and kinematical parts of Mellin amplitudes for scalar four-point functions in general dimensions. We flesh out various generalizations of hypergeometric functions as a natural mathematical language for two of these problems. Several new methods, inspired by extensions of classical results on hypergeometric functions, are presented.

  8. Four types of coping with COPD-induced breathlessness in daily living: a grounded theory study

    Bastrup, Lene; Dahl, Ronald; Pedersen, Preben Ulrich;

    2013-01-01

    People with COPD predominantly cope with breathlessness during daily living. We chose a multimodal grounded theory design that offers the opportunity to combine qualitative and quantitative data to capture and explain the multidimensional coping behaviour among people with COPD. The participants' main concern ... comprised distinctive physiological, cognitive, affective and psychosocial features constituting coping-type-specific indicators. In theory, four predominant coping types with distinct physiological, cognitive, affective and psychosocial properties are observed among people with COPD. The four coping types...

  9. Applications of Reflection Amplitudes in Toda-type Theories

    Ahn, C; Rim, C; Ahn, Changrim; Kim, Chanju; Rim, Chaiho

    2001-01-01

    Reflection amplitudes are defined as two-point functions of a certain class of conformal field theories in which primary fields are given by vertex operators with real couplings. Among these, we consider (super-)Liouville theory and simply- and non-simply-laced Toda theories. In this paper we show how to compute the scaling functions of the effective central charge for the models perturbed by primary fields that maintain integrability. This new derivation of the scaling functions is compared with results from the conventional TBA approach, confirming our approach along with other non-perturbative results such as exact expressions for the on-shell masses in terms of the parameters in the action and exact free energies. Another important application of the reflection amplitudes is the computation of one-point functions for the integrable models. Introducing functional relations between the one-point functions in terms of the reflection amplitudes, we obtain explicit expressions for simply-laced and non-simply-laced af...

  10. Reconciling Experiment and Theory in the Use of Aryl-Extended Calix[4]pyrrole Receptors for the Experimental Quantification of Chloride–π Interactions in Solution

    Antonio Bauzá

    2015-04-01

    In this manuscript we consider, from a theoretical point of view, the recently reported experimental quantification of anion–π interactions (the attractive force between electron-deficient aromatic rings and anions) in solution using aryl-extended calix[4]pyrrole receptors as model systems. Experimentally, two series of calix[4]pyrrole receptors functionalized, respectively, with two and four aryl rings at the meso positions were used to assess the strength of chloride–π interactions in acetonitrile solution. As a result of these studies, the contribution of each individual chloride–π interaction was quantified to be very small (<1 kcal/mol). This result is in contrast with the values derived from most theoretical calculations. Herein we report a theoretical study using high-level density functional theory (DFT) calculations that provides a plausible explanation for the observed disagreement between theory and experiment. The study reveals the existence of molecular interactions between solvent molecules and the aromatic walls of the receptors that strongly modulate the chloride–π interaction. In addition, the theoretical results suggest that the chloride-calix[4]pyrrole complex used as a reference to experimentally dissect the contribution of the chloride–π interactions to the total binding energy, for both the two- and four-wall aryl-extended calix[4]pyrrole model systems, is probably not ideal.

  11. Gluing together Proof Environments: Canonical extensions of LF Type Theories featuring Locks

    Furio Honsell

    2015-07-01

    We present two extensions of the LF Constructive Type Theory featuring monadic locks. A lock is a monadic type construct that captures the effect of an external call to an oracle. Such calls are the basic tool for gluing together diverse Type Theories and proof development environments. The oracle can be invoked either to check that a constraint holds or to provide a suitable witness. The systems are presented in the canonical style developed by the CMU School. The first system, CLLFP, is the canonical version of the system LLFP, presented earlier by the authors. The second system, CLLFP?, features the possibility of invoking the oracle to obtain a witness satisfying a given constraint. We discuss encodings of Fitch-Prawitz Set theory, call-by-value lambda-calculi, and systems of Light Linear Logic. Finally, we show how to use Fitch-Prawitz Set Theory to define a type system that types precisely the strongly normalizing terms.

  12. A New Look at Generalized Rewriting in Type Theory

    Matthieu Sozeau

    2009-01-01

    Rewriting is an essential tool for computer-based reasoning, both automated and assisted. This is because rewriting is a general notion that permits modeling a wide range of problems and provides a means to effectively solve them. In a proof assistant, rewriting can be used to replace terms in arbitrary contexts, generalizing the usual equational reasoning to reasoning modulo arbitrary relations. This can be done provided the necessary proofs that functions appearing in goals are congruent with respect to specific relations. We present a new implementation of generalized rewriting in the Coq proof assistant, making essential use of the expressive power of dependent types and the recently implemented type class mechanism. The new rewrite tactic improves on and generalizes previous versions by natively supporting higher-order functions, polymorphism and subrelations. The type class system inspired by Haskell provides a perfect interface between the user and the tactic, making it easily extensible.

  13. Eady Solitary Waves: A Theory of Type B Cyclogenesis.

    Mitsudera, Humio

    1994-11-01

    Localized baroclinic instability in a weakly nonlinear, long-wave limit using an Eady model is studied. The resulting evolution equations have a KdV-type form, including extra terms representing linear coupling. Baroclinic instability is triggered locally by the collision between two neutral solitary waves (one trapped at the upper boundary and the other at the lower boundary) if their incident amplitudes are sufficiently large. This characteristic is explained from the viewpoint of resonance when the relative phase speed, which depends on the amplitudes, is less than a critical value. The upper and lower disturbances grow in a coupled manner (resembling a normal-mode structure) initially, but they reverse direction slowly as the amplitudes increase and eventually separate from each other. The motivation of this study is to investigate a type of extratropical cyclogenesis that involves a preexisting upper trough (termed Type B development) from the viewpoint of resonant solitary waves. Two cases are of particular interest. First, the author examines a case where an upper disturbance preexists over an undisturbed low-level waveguide. The solitary waves exhibit behavior similar to that conceived by Hoskins et al. for Type B development; the lower disturbance is forced one-sidedly by a preexisting upper disturbance initially, but in turn forces the latter once the former attains sufficient amplitude, thus resulting in mutual reinforcement. Second, if a weak perturbation exists at the surface ahead of a preexisting strong upper disturbance, baroclinic instability is triggered when the two waves interact. Even though the amplitude of the lower disturbance is initially much weaker, it intensifies quickly and catches up with the amplitude of the upper disturbance, so that the coupled vertical structure eventually resembles that of an unstable normal mode. These results describe the observed behavior in Type B atmospheric cyclogenesis quite well.
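    The "KdV type, including extra terms representing linear coupling" evolution equations mentioned above can be sketched generically as follows; the notation is ours and is only a schematic of the structure described, not the paper's exact system. Here $A$ and $B$ are the wave amplitudes trapped at the upper and lower boundaries:

    ```latex
    % Generic coupled KdV-type system with linear cross-coupling terms
    % on the right-hand sides (illustrative notation, not the paper's):
    \begin{aligned}
    A_t + c_1 A_x + \alpha_1 A A_x + \beta_1 A_{xxx} &= \gamma_1 B_x,\\
    B_t + c_2 B_x + \alpha_2 B B_x + \beta_2 B_{xxx} &= \gamma_2 A_x.
    \end{aligned}
    ```

    Without the right-hand sides each equation supports an ordinary KdV solitary wave; the linear coupling terms are what allow the upper and lower waves to exchange energy and lock into the growing, normal-mode-like pair described in the abstract.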

  14. Godel Type Metrics in Einstein-Aether Theory II: Nonflat Background in Arbitrary Dimensions

    Gurses, Metin

    2015-01-01

    It was previously proved that the Gödel-type metrics with flat three-dimensional background metric solve exactly the field equations of the Einstein-Aether theory in four dimensions. We generalize this result by showing that the stationary Gödel-type metrics with nonflat background in $D$ dimensions solve exactly the field equations of the Einstein-Aether theory. The reduced field equations are the $(D-1)$-dimensional Euclidean Ricci-flat equations and the $(D-1)$-dimensional source-free Maxwell equations, and the parameters of the theory are left free except for $c_{1}-c_{3}=1$. We give a method to produce exact solutions of the Einstein-Aether theory from the Gödel-type metrics in $D$ dimensions. Using this method, we present explicit exact solutions of the theory for the particular cases of $(D-1)$-dimensional Euclidean flat, conformally flat, and Tangherlini backgrounds.

  15. Brans-Dicke-type theories and avoidance of the cosmological singularity

    Quirós, I; Cardenas, R; Quiros, Israel; Bonal, Rolando; Cardenas, Rolando

    2000-01-01

    A point of view is presented, based on a postulate of the physical equivalence of conformal representations of a given physical situation in Brans-Dicke-type theories of gravitation, that automatically resolves the debate about the physical equivalence of the Jordan-frame and Einstein-frame formulations of scalar-tensor theory. The cosmological consequences of this viewpoint for general relativity are studied, and its implications for the low-energy limit of string theory are outlined.

  16. Cosmic web-type classification using decision theory

    Leclercq, Florent; Wandelt, Benjamin

    2015-01-01

    We propose a decision criterion for segmenting the cosmic web into different structure types (voids, sheets, filaments and clusters) on the basis of their respective probabilities and the strength of data constraints. Our approach is inspired by an analysis of games of chance where the gambler only plays if a positive expected net gain can be achieved based on some degree of privileged information. The result is a general solution for classification problems in the face of uncertainty, including the option of not committing to a class for a candidate object. As an illustration, we produce high-resolution maps of web-type constituents in the nearby Universe as probed by the Sloan Digital Sky Survey main galaxy sample. Other possible applications include the selection and labeling of objects in catalogs derived from astronomical survey data.

  17. Development of solution phase hybridisation PCR-ELISA for the detection and quantification of Enterococcus faecalis and Pediococcus pentosaceus in Nurmi-type cultures.

    Waters, Sinéad M; Doyle, Sean; Murphy, Richard A; Power, Ronan F G

    2005-12-01

    Nurmi-type cultures (NTCs), derived from the fermentation of caecal contents of specifically pathogen-free (SPF) birds, have been used successfully to control salmonella colonisation in chicks. These cultures are undefined in nature and, consequently, it is difficult to obtain approval from regulatory agencies for their use as direct-fed microbials (DFMs) for poultry. Progress towards the generation of effective defined probiotics requires further knowledge of the composition of these cultures, so species-specific, culture-independent quantification methodologies need to be developed to elucidate the concentration of specific bacterial constituents of NTCs. Quantification of specific bacterial species in such ill-defined complex cultures using conventional culturing methods is inaccurate due to low sensitivity and reproducibility, in addition to slow turnaround times; furthermore, these methods lack selectivity due to the nature of the accompanying microflora. This study describes the development of a rapid, sensitive, reliable, reproducible and species-specific culture-independent solution-phase hybridisation PCR-ELISA procedure for the detection and quantification of Enterococcus faecalis and Pediococcus pentosaceus in NTCs. In this technique, biotin-labelled primers were designed to amplify a species-specific fragment of a marker gene of known copy number in both species. The resulting amplicons were hybridised with a dinitrophenol (DNP)-labelled oligonucleotide probe in solution and were subsequently captured on a streptavidin-coated microtitre plate. The degree of binding was determined by the addition of an IgG (anti-DNP)-horseradish peroxidase conjugate, which was subsequently visualised using a chromogenic substrate, tetramethylbenzidine. This novel quantitative method was capable of detecting E. faecalis and P. pentosaceus at levels as low as 5 CFU per PCR reaction.

  18. Gödel-type Spacetimes in Induced Matter Gravity Theory

    Carrion, H L; Teixeira, A F F

    1999-01-01

    Five-dimensional (5D) generalized Gödel-type manifolds are examined in the light of the equivalence problem techniques, as formulated by Cartan. The necessary and sufficient conditions for local homogeneity of these 5D manifolds are derived. The local equivalence of these homogeneous Riemannian manifolds is studied. It is found that they are characterized by three essential parameters $k$, $m^2$ and $\omega$: identical triads $(k, m^2, \omega)$ correspond to locally equivalent 5D manifolds. An irreducible set of isometrically nonequivalent 5D locally homogeneous Riemannian generalized Gödel-type metrics is exhibited. A classification of these manifolds based on the essential parameters is presented, and the Killing vector fields as well as the corresponding Lie algebra of each class are determined. It is shown that the generalized Gödel-type 5D manifolds admit a maximal group of isometries $G_r$ with $r=7$, $r=9$ or $r=15$, depending on the essential parameters $k$, $m^2$ and $\omega$. The breakdown of causa...

  19. A matrix model for heterotic Spin(32)/Z_2 and type I string theory

    Krogh, M

    1999-01-01

    We consider heterotic string theories in the DLCQ. We derive that the matrix model of the Spin(32)/Z_2 heterotic theory is the theory living on N D-strings in type I wound on a circle with no Spin(32)/Z_2 Wilson line on the circle. This is an O(N) gauge theory. We rederive the matrix model for the E_8 x E_8 heterotic string theory, explicitly taking care of the Wilson line around the lightlike circle. The result is the same theory as for Spin(32)/Z_2 except that now there is a Wilson line on the circle. We also see that the integer N labeling the sector of the O(N) matrix model is not just the momentum around the lightlike circle, but a shifted momentum depending on the Wilson line. We discuss the aspects of level matching and GSO projections, and why, from the point of view of matrix theory, the E_8 x E_8 theory, and not the Spin(32)/Z_2 theory, develops an 11th dimension at strong coupling. Furthermore, a matrix theory for type I is derived. This is again the O(N) theory living on the D-st...

  20. Development of a sandwich ELISA-type system for the detection and quantification of hazelnut in model chocolates.

    Costa, Joana; Ansari, Parisa; Mafra, Isabel; Oliveira, M Beatriz P P; Baumgartner, Sabine

    2015-04-15

    Hazelnut is one of the most appreciated nuts and is found in a wide range of processed foods. The presence of even trace amounts of hazelnut in foods can represent a potential risk for eliciting allergic reactions in sensitised individuals. Correct labelling of processed foods is mandatory to avoid adverse reactions, so adequate methodology for evaluating the presence of offending foods is of great importance. The aim of this study was therefore to develop a highly specific and sensitive sandwich enzyme-linked immunosorbent assay (ELISA) for the detection and quantification of hazelnut in complex food matrices. Using in-house produced antibodies, an ELISA system was developed capable of detecting hazelnut down to 1 mg kg(-1) and quantifying this nut down to 50 mg kg(-1) in chocolates spiked with known amounts of hazelnut. These results highlight and reinforce the value of ELISA as a rapid and reliable tool for the detection of allergens in foods.
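    Quantification in a sandwich ELISA of this kind is typically read off a standard curve fitted to spiked calibrants; the abstract does not state the curve model, but a four-parameter logistic (4PL) is a common choice for ELISA. A minimal sketch with illustrative parameters, not the study's fitted values:

```python
def elisa_4pl(conc, a=2.0, d=0.05, c=100.0, b=1.2):
    """Hypothetical four-parameter logistic (4PL) standard curve for a
    sandwich ELISA: the signal rises from the lower asymptote d to the
    upper asymptote a around the inflection point c (mg/kg).  All
    parameter values are illustrative; valid for conc > 0."""
    return d + (a - d) / (1.0 + (c / conc) ** b)

def invert_4pl(signal, a=2.0, d=0.05, c=100.0, b=1.2):
    """Back-calculate the hazelnut concentration of an unknown sample
    from its measured signal by inverting the standard curve."""
    return c / ((a - d) / (signal - d) - 1.0) ** (1.0 / b)

# Round trip: a 50 mg/kg spike should be recovered from its signal.
recovered = invert_4pl(elisa_4pl(50.0))
```

    Back-calculating an unknown then amounts to inverting the fitted curve, as `invert_4pl` does.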

  1. Comparison between magnetic force microscopy and electron back-scatter diffraction for ferrite quantification in type 321 stainless steel.

    Warren, A D; Harniman, R L; Collins, A M; Davis, S A; Younes, C M; Flewitt, P E J; Scott, T B

    2015-01-01

    Several currently available analytical techniques can be used to determine the spatial distribution and amount of austenite, ferrite and precipitate phases in steels. The application of magnetic force microscopy, in particular, to study the local microstructure of stainless steels is beneficial due to the selectivity of this technique for ferromagnetic phases. In comparing magnetic force microscopy and electron back-scatter diffraction for the morphological mapping and quantification of ferrite, the degree of sub-surface measurement has been found to be critical. Through the use of surface shielding, it has been possible to show that magnetic force microscopy has a measurement depth of 105-140 nm. A comparison of the two techniques, together with their depth of measurement capabilities, is discussed.

  2. $\mathcal{N}=2$ supersymmetric field theories on 3-manifolds with A-type boundaries

    Aprile, Francesco

    2016-01-01

    General half-BPS A-type boundary conditions are formulated for N=2 supersymmetric field theories on compact 3-manifolds with boundary. We observe that under suitable conditions manifolds of the real A-type admitting two complex supersymmetries (related by charge conjugation) possess, besides a contact structure, a natural integrable toric foliation. A boundary, or a general co-dimension-1 defect, can be inserted along any leaf of this preferred foliation to produce manifolds with boundary that have the topology of a solid torus. We show that supersymmetric field theories on such manifolds can be endowed with half-BPS A-type boundary conditions. We specify the natural curved space generalization of the A-type projection of bulk supersymmetries and analyze the resulting A-type boundary conditions in generic 3d non-linear sigma models and YM/CS-matter theories.

  3. On the usage of classical nucleation theory in quantification of the impact of bacterial INP on weather and climate

    Sahyoun, Maher; Wex, Heike; Gosewinkel, Ulrich; Šantl-Temkiv, Tina; Nielsen, Niels W.; Finster, Kai; Sørensen, Jens H.; Stratmann, Frank; Korsholm, Ulrik S.

    2016-08-01

    Bacterial ice-nucleating particles (INP) are present in the atmosphere and efficient in heterogeneous ice nucleation at temperatures up to -2 °C in mixed-phase clouds. However, due to their low emission rates, their climatic impact was considered insignificant in previous modeling studies. In view of uncertainties about the actual atmospheric emission rates and concentrations of bacterial INP, it is important to re-investigate the threshold fraction of cloud droplets containing bacterial INP for a pronounced effect on ice nucleation, by using a suitable parameterization that properly describes the ice-nucleation process by bacterial INP. Therefore, we compared two heterogeneous ice-nucleation rate parameterizations, denoted CH08 and HOO10 herein, both of which are based on classical nucleation theory and measurements and use similar equations but different parameters, with an empirical parameterization, denoted HAR13 herein, which implicitly considers the number of bacterial INP. All parameterizations were used to calculate the ice-nucleation probability offline. HAR13 and HOO10 were implemented and tested in a one-dimensional version of a weather-forecast model in two meteorological cases. Ice-nucleation probabilities based on HAR13 and CH08 were similar, in spite of their different derivation, and were higher than those based on HOO10. This study shows the importance of the choice of parameterization and of the input variable, the number of bacterial INP, for accurately assessing their role in meteorological and climatic processes.
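    Heterogeneous nucleation-rate parameterizations of the kind compared here are typically converted into a per-droplet freezing probability via the generic form P = 1 − exp(−j·A·Δt). The sketch below shows only this generic step; the temperature dependence of j, where CH08, HOO10 and HAR13 actually differ, is not reproduced, and all numbers are illustrative:

```python
import math

def freezing_probability(j_het, area, dt):
    """Generic freezing probability for a droplet whose INP has surface
    area `area` (m^2), exposed to a heterogeneous nucleation rate
    coefficient j_het (m^-2 s^-1) for dt seconds: P = 1 - exp(-j*A*dt).
    The temperature dependence of j_het is deliberately left out; all
    inputs are illustrative."""
    return 1.0 - math.exp(-j_het * area * dt)
```

    For small arguments the probability reduces to j·A·Δt, which is why low bacterial INP concentrations translate into small freezing probabilities unless j_het is large.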

  4. Cherkis bow varieties and Coulomb branches of quiver gauge theories of affine type $A$

    Nakajima, Hiraku

    2016-01-01

    We show that Coulomb branches of quiver gauge theories of affine type $A$ are Cherkis bow varieties, which have been introduced as ADHM type description of moduli space of instantons on the Taub-NUT space equivariant under a cyclic group action.

  5. Theory of chromatography of partially cyclic polymers: Tadpole-type and manacle-type macromolecules.

    Vakhrushev, Andrey V; Gorbunov, Alexei A

    2016-02-12

    A theory of chromatography is developed for partially cyclic polymers of tadpole- and manacle-shaped topological structures. We present exact equations for the distribution coefficient K at different adsorption interactions; simpler approximate formulae are also derived, relevant to the conditions of size-exclusion, adsorption, and critical chromatography. Theoretical chromatograms of heterogeneous partially cyclic polymers are simulated, and conditions for good separation by topology are predicted. According to the theory, an effective SEC-radius of tadpoles and manacles is mostly determined by the molar mass M, and by the linear-cyclic composition. In the interactive chromatography, the effect of molecular topology on the retention becomes significant. At the critical interaction point, partial dependences K(Mlin) and K(Mring) are qualitatively different: while being almost independent of Mlin, K increases with Mring. This behavior could be realized in critical chromatography, for the separation of partially cyclic polymers by the number and molar mass of cyclic elements.

  6. Digital games for type 1 and type 2 diabetes: underpinning theory with three illustrative examples.

    Kamel Boulos, Maged N; Gammon, Shauna; Dixon, Mavis C; MacRury, Sandra M; Fergusson, Michael J; Miranda Rodrigues, Francisco; Mourinho Baptista, Telmo; Yang, Stephen P

    2015-03-18

    Digital games are an important class of eHealth interventions in diabetes, made possible by the Internet and a good range of affordable mobile devices (eg, mobile phones and tablets) available to consumers these days. Gamifying disease management can help children, adolescents, and adults with diabetes to better cope with their lifelong condition. Gamification and social in-game components are used to motivate players/patients and positively change their behavior and lifestyle. In this paper, we start by presenting the main challenges facing people with diabetes-children/adolescents and adults-from a clinical perspective, followed by three short illustrative examples of mobile and desktop game apps and platforms designed by Ayogo Health, Inc. (Vancouver, BC, Canada) for type 1 diabetes (one example) and type 2 diabetes (two examples). The games target different age groups with different needs-children with type 1 diabetes versus adults with type 2 diabetes. The paper is not meant to be an exhaustive review of all digital game offerings available for people with type 1 and type 2 diabetes, but rather to serve as a taster of a few of the game genres on offer today for both types of diabetes, with a brief discussion of (1) some of the underpinning psychological mechanisms of gamified digital interventions and platforms as self-management adherence tools, and more, in diabetes, and (2) some of the hypothesized potential benefits that might be gained from their routine use by people with diabetes. More research evidence from full-scale evaluation studies is needed and expected in the near future that will quantify, qualify, and establish the evidence base concerning this gamification potential, such as what works in each age group/patient type, what does not, and under which settings and criteria.

  7. Frobenius type and CV-structures for Donaldson-Thomas theory and a convergence property

    Barbieri, Anna

    2015-01-01

    We rephrase some well-known results in Donaldson-Thomas theory in terms of (formal families of) Frobenius type and CV-structures on a vector bundle in the sense of Hertling. We study these structures in an abstract setting, and prove a convergence result which is relevant to the case of triangulated categories. An application to physical field theory is also briefly discussed.

  8. A Spin(7) Conifold Transition in Type IIB as an F-theory Flop

    Tan, M C; Teo, Edward

    2004-01-01

    We consider type IIB string theory compactified on an 8-dimensional Spin(7) manifold with N D5-branes, undergoing a conifold transition to a geometry with no branes and N units of dual 3-form RR flux through appropriate 3-cycles. We show that at constant IIB coupling, in the limit of large N and finite 't Hooft coupling, this conifold transition can be lifted to a purely geometric S^5 flop in the 11-dimensional space of an equivalent F-theory without branes and fluxes. The S^5 flop in this equivalent F-theory is a higher-dimensional analogue of the S^3 flop in M-theory on a G_2 manifold discovered by Atiyah et al., albeit resulting in a theory with N=1 in d=2 instead of d=4.

  9. Nonperturbative type IIB model building in the F-theory framework

    Jurke, Benjamin Helmut Friedrich

    2011-02-28

    This dissertation is concerned with the topic of non-perturbative string theory, which is generally considered to be the most promising approach to a consistent description of quantum gravity. The five known 10-dimensional perturbative string theories are all interconnected by numerous dualities, such that an underlying non-perturbative 11-dimensional theory, called M-theory, is postulated. Due to several technical obstacles, little is known about the fundamental objects in this theory. There exists an alternative non-perturbative description of type IIB string theory, namely F-theory. Here the SL(2;Z) self-duality of IIB theory is geometrized in the form of an elliptic fibration over the space-time. Moreover, higher-dimensional objects like 7-branes are included in the geometric picture via singularities. This formally elegant description, however, requires significant technical effort for the construction of suitable compactification geometries, as many different aspects necessarily have to be dealt with at the same time. On the other hand, the generation of essential GUT building blocks like certain Yukawa couplings or spinor representations is easier than in perturbative string theory. The goal of this study is therefore to formulate a unified theory within the framework of F-theory that satisfies basic phenomenological constraints. Within this thesis, at first E3-brane instantons in type IIB string theory - 4-dimensional objects that are entirely wrapped around the invisible dimensions of space-time - are matched with M5-branes in F-theory. Such objects are of great importance in the generation of critical Yukawa couplings or the stabilization of the free parameters of a theory. Certain properties of M5-branes then allow one to derive a new criterion for E3-branes to contribute to the superpotential. Following this analysis, several compactification geometries are constructed and checked for basic properties that are relevant for semi

  10. Decision-making styles: managerial application of the MBTI and type theory.

    Freund, C M

    1988-12-01

    Applying type theory is a relatively inexpensive way for managers to increase effectiveness by emphasizing the qualitative issues in organizations. The author describes managerial and organizational uses of C.G. Jung's theory of psychological type, as operationalized in the Myers-Briggs Type Indicator (MBTI). The MBTI is useful not only in identifying individual preferences, but also in developing effective managerial and working teams. Knowledge of one's own type and the type of others can help managers motivate others, maximize human resources, persuade others, and gain cooperation. An article in the January 1989 issue of JONA discusses the author's use of the MBTI to assess decision-making styles and the compatibility of hospital chief nursing officers and executive officers.

  11. Bianchi Type VI1 Viscous Fluid Cosmological Model in Wesson's Theory of Gravitation

    Khadekar, G. S.; Avachar, G. R.

    2007-03-01

    Field equations of a scale invariant theory of gravitation proposed by Wesson [1, 2] are obtained in the presence of viscous fluid with the aid of Bianchi type VIh space-time with the time dependent gauge function (Dirac gauge). It is found that Bianchi type VIh (h = 1) space-time with viscous fluid is feasible in this theory, whereas Bianchi type VIh (h = -1, 0) space-times are not feasible in this theory, even in the presence of viscosity. For the feasible case, by assuming a relation connecting viscosity and metric coefficient, we have obtained a nonsingular-radiating model. We have discussed some physical and kinematical properties of the models.

  12. Conference on Geometric Analysis &Conference on Type Theory, Homotopy Theory and Univalent Foundations : Extended Abstracts Fall 2013

    Yang, Paul; Gambino, Nicola; Kock, Joachim

    2015-01-01

    The two parts of the present volume contain extended conference abstracts corresponding to selected talks given by participants at the "Conference on Geometric Analysis" (thirteen abstracts) and at the "Conference on Type Theory, Homotopy Theory and Univalent Foundations" (seven abstracts), both held at the Centre de Recerca Matemàtica (CRM) in Barcelona, from July 1st to 5th, 2013, and from September 23rd to 27th, 2013, respectively. Most of them are brief articles, containing preliminary presentations of new results not yet published in regular research journals. The articles are the result of a direct collaboration between active researchers in the area after working in a dynamic and productive atmosphere. The first part is about Geometric Analysis and Conformal Geometry; this modern field lies at the intersection of many branches of mathematics (Riemannian, Conformal, Complex or Algebraic Geometry, Calculus of Variations, PDEs, etc.) and relates directly to the physical world, since many natural phenomena...

  13. Bianchi Type-I, V and VIo models in modified generalized scalar–tensor theory

    T Singh; R Chaubey

    2007-08-01

    In modified generalized scalar–tensor (GST) theory, the cosmological term is a function of the scalar field $\phi$ and its derivative $\dot{\phi}^{2}$. We obtain exact solutions of the field equations in Bianchi Type-I, V and VIo space–times. The evolution of the scale factor, the scalar field and the cosmological term has been discussed. The Bianchi Type-I model has been discussed in detail. Further, Bianchi Type-V and VIo models can be studied on lines similar to the Bianchi Type-I model.

  14. Quantification and implications of two types of soluble organic matter from brackish to saline lake source rocks

    SONG Yitao; LIAO Yongsheng; ZHANG Shouchun

    2005-01-01

    Two types of soluble organic matter, free and adsorbed, were obtained and quantified from brackish to saline lake source rocks. The adsorbed type was extracted with chloroform, solvent mixtures of methanol:acetone:chloroform (MAC) and CS2:N-methyl-2-pyrrolidinone (CS2/NMP). The total amounts of the two types of soluble organic matter from some immature source rocks are >830 mg/g TOC, more than 63% of the total organic matter in these samples. This result indicates that the majority of the organic matter in the immature source rocks in the brackish to saline lake basin is soluble, which is significant for the study of petroleum formation and helpful for petroleum exploration in the brackish to saline lake basin.

  15. Quantification theory Ⅲ and its application in the evaluation of coal and gas outburst

    石庆礼; 杨胜强

    2013-01-01

    The principle and method of applying quantification theory Ⅲ to establish a hazard evaluation of coal and gas outburst were studied. Based on gas-geology theory, a danger evaluation index system for coal and gas outburst was established, comprising eleven geological factors such as coal seam gas content, coal seam gas pressure, coal seam bifurcation or merging, the complexity of the geological structure and the coal damage type. Taking quantification theory Ⅲ as a tool, the sensitive geological factors and the factor axis F1 for the outburst danger of the No. 3 coal seam in the Mayixi No. 1 mine were obtained by practical example analysis. Factors with a score greater than 0.1 on this axis were taken as sensitive geological factors for strong outburst; those with a score less than -0.1 as sensitive factors for weak outburst; and those with scores between -0.1 and 0.1 as sensitive factors for medium outburst. On this basis, the mining area was divided into three regions: strong, medium and weak outburst. The method realizes quantitative analysis of coal and gas outburst at the geological exploration stage and refines the regional division of coal and gas outburst danger.
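    The axis scores come from the quantification theory Ⅲ eigen-analysis itself, which is not reproduced here; the sketch below illustrates only the zoning rule stated in the abstract, applied to hypothetical factor scores (not the paper's values):

```python
def classify_factor(score, strong=0.1, weak=-0.1):
    """Zone a geological factor by its score on axis F1, using the
    thresholds stated in the abstract: > 0.1 strong outburst
    sensitivity, < -0.1 weak, otherwise medium."""
    if score > strong:
        return "strong"
    if score < weak:
        return "weak"
    return "medium"

# Hypothetical axis-F1 scores, for illustration only.
scores = {"coal seam gas content": 0.34,
          "coal seam gas pressure": 0.18,
          "seam bifurcation or merging": -0.22,
          "coal damage type": 0.05}
zones = {factor: classify_factor(s) for factor, s in scores.items()}
```

    Applying the same rule to every mapped location is what yields the three-region division of the mining area.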

  16. Identification of enzymes and quantification of metabolic fluxes in the wild type and in a recombinant Aspergillus oryzae strain

    Pedersen, Henrik; Carlsen, Morten; Nielsen, Jens Bredal

    1999-01-01

    Two alpha-amylase-producing strains of Aspergillus oryzae, a wild-type strain and a recombinant containing additional copies of the alpha-amylase gene, were characterized with respect to enzyme activities, localization of enzymes to the mitochondria or cytosol, macromolecular composition...... or nitrate as the nitrogen source. The flux through the pentose phosphate pathway increased with increasing specific growth rate. The fluxes through the pentose phosphate pathway were 15 to 26% higher for the recombinant strain than for the wild-type strain....

  17. Quantification of zinc atoms in a surface alloy on copper in an industrial-type methanol synthesis catalyst

    Kuld, Sebastian; Moses, Poul Georg; Sehested, Jens;

    2014-01-01

    Methanol has recently attracted renewed interest because of its potential importance as a solar fuel.1 Methanol is also an important bulk chemical that is most efficiently formed over the industrial Cu/ZnO/Al2O3 catalyst. The identity of the active site and, in particular, the role of ZnO...... as a promoter for this type of catalyst is still under intense debate.2 Structural changes that are strongly dependent on the pretreatment method have now been observed for an industrial-type methanol synthesis catalyst. A combination of chemisorption, reaction, and spectroscopic techniques provides...

  19. Type-IV pili spectroscopic markers: applications in the quantification of piliation levels in Moraxella bovis cells by a FT-IR ANN-based model.

    Bosch, Alejandra; Prieto, Claudia; Serra, Diego Omar; Martina, Pablo; Stämmbler, Maren; Naumann, Dieter; Schmitt, Jürgen; Yantorno, Osvaldo

    2010-08-01

    Type-IV pili are cell surface organelles found in a wide variety of Gram-negative bacteria. They have traditionally been detected by electron microscopy and ELISA techniques. However, these methodologies are not appropriate for the rapid discrimination and quantification of piliated and nonpiliated cells in industrial or field conditions. Here, the analysis of FT-IR spectra of piliated, nonpiliated and sheared Moraxella bovis cells, together with spectra of purified pili suspensions, allowed the identification of 3 IR regions associated with spectroscopic markers of Type-IV pili: 1750-1600, 1450-1350 and 1280-950 cm(-1). Such IR-specific markers were found for piliated cells grown in different culture systems (liquid or solid media), independently of the strain or pili serotype. They were also sensitive to pili expression levels. Therefore, on the basis of these specific spectral features, an FT-IR ANN-based model was developed to classify piliation levels into 5 distinct groups. An overall classification rate of almost 90% demonstrates the strong potential of the ANN system developed to monitor M. bovis cultures in vaccine production.

  20. Social types hero, fool and villain in the theory of Orin Klapp

    Vesna Trifunović

    2016-01-01

    The essay deals with the theory of the respected American sociologist Orin Klapp pertaining to social types, with an emphasis on the Hero, the Villain and the Fool. It reflects on the system through which Klapp classifies these three types and develops their social functions, on the technique he uses in analyzing the American national character, and on the value of social types as a methodological means of comparing different societies.

  2. T-Duality in Type II String Theory via Noncommutative Geometry and Beyond

    Mathai, V.

    This brief survey of how noncommutative and nonassociative geometry appear naturally in the study of T-duality in type II string theory is essentially a transcript of my talks given at the 21st Nishinomiya-Yukawa Memorial Symposium on Theoretical Physics: Noncommutative Geometry and Quantum Spacetime in Physics, Japan, 11--15 November 2006.

  3. Preschoolers' Generation of Different Types of Counterfactual Statements and Theory of Mind Understanding

    Guajardo, Nicole R.; Turley-Ames, Kandi Jo

    2004-01-01

    Two studies examined associations between theory of mind performance and counterfactual thinking using both antecedent and consequent counterfactual tasks. Moreover, the studies examined children's abilities to generate different types of counterfactual statements in terms of direction and structure. Participants were 3-, 4-, and 5-year-old…

  4. Quantification of age-related changes in the structure model type and trabecular thickness of human tibial cancellous bone

    Ding, Ming; Hvid, I

    2000-01-01

    Structure model type and trabecular thickness are important characteristics in describing cancellous bone architecture. It has been qualitatively observed that a radical change of trabeculae from plate-like to rod-like occurs in aging, bone remodeling, and osteoporosis. Thickness of trabeculae has...... traditionally been measured using model-based histomorphometric methods on two-dimensional (2-D) sections. However, no quantitative study has been published based on three-dimensional (3-D) methods on the age-related changes in structure model type and trabecular thickness for human peripheral (tibial......) cancellous bone. In this study, 160 human proximal tibial cancellous bone specimens from 40 normal donors, aged 16 to 85 years, were collected. These specimens were micro-computed tomography (micro-CT) scanned, then the micro-CT images were segmented using optimal thresholds. From accurate 3-D data sets...

  5. Quantification of the effects of audible rattle and source type on the human response to environmental vibration.

    Woodcock, J; Sica, G; Peris, E; Sharp, C; Moorhouse, A T; Waddington, D C

    2016-03-01

    The present research quantifies the influence of source type and the presence of audible vibration-induced rattle on annoyance caused by vibration in residential environments. The sources of vibration considered are railway and the construction of a light rail system. Data were measured in the United Kingdom using a socio-vibration survey (N = 1281). These data are analyzed using ordinal logit models to produce exposure-response relationships describing community annoyance as a function of vibration exposure. The influence of source type and the presence of audible vibration-induced rattle on annoyance are investigated using dummy variable analysis, and quantified using odds-ratios and community tolerance levels. It is concluded that the sample population is more likely to express higher levels of annoyance if the vibration source is construction compared to railway, and if vibration-induced rattle is audible.
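    Exposure-response relationships of this kind are commonly expressed as a cumulative-logit (proportional odds) model, in which a dummy variable shifts the whole response distribution and its exponentiated coefficient is the quoted odds ratio. A minimal sketch with illustrative coefficients, not the fitted values from the survey:

```python
import math

def logistic(z):
    return 1.0 / (1.0 + math.exp(-z))

def annoyance_probs(exposure, construction=False,
                    thresholds=(-1.0, 0.5, 2.0), beta=0.8, gamma=0.6):
    """Cumulative-logit (proportional odds) exposure-response model:
    P(annoyance <= k) = logistic(theta_k - beta*exposure - gamma*dummy),
    where the dummy flags the construction source.  Returns the
    probabilities of 4 ordered annoyance categories.  All coefficients
    are illustrative; the odds ratio for the construction dummy would
    be exp(gamma)."""
    eta = beta * exposure + (gamma if construction else 0.0)
    cum = [logistic(t - eta) for t in thresholds] + [1.0]
    return [cum[0]] + [cum[k] - cum[k - 1] for k in range(1, len(cum))]
```

    With a positive dummy coefficient, the construction source moves probability mass toward the higher annoyance categories, matching the finding that construction vibration is more annoying than railway vibration at the same exposure.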

  6. Enantiomeric separation and quantification of ephedrine-type alkaloids in herbal materials by comprehensive two-dimensional gas chromatography.

    Wang, Min; Marriott, Philip J; Chan, Wing-Hong; Lee, Albert W M; Huie, Carmen W

    2006-04-21

    The separation of ephedrine-type alkaloids and their enantiomers in raw herbs and commercial herbal products was investigated by carrying out enantioselective separation in the first-dimension column (containing beta-cyclodextrin as the chiral selector) of a comprehensive two-dimensional gas chromatography (GC×GC) system, whereas a polar polyethylene glycol capillary column was used for separation in the second dimension. Naturally occurring ephedrine-type alkaloids and their synthetic analogues (enantiomeric counterparts) were adequately resolved from each other, as well as from potential interfering species in the sample matrix, using GC×GC, whereas single-column GC analysis was unable to separate all the alkaloids of interest. Detection limits in the order of 0.1-1.3 µg/mL and linearity of calibration with R² ≥ 0.999 over approximately the range of 0.5-100 µg/mL were obtained for the quantitative determination of various ephedrine-type alkaloids. The commercial herbal products tested contained mostly (-)-ephedrine, (+)-pseudoephedrine, (-)-N-methylephedrine and (-)-norephedrine, with concentrations in the range of 40-2100, 0-1300, 15-300 and 0-30 µg/g of product, respectively, and repeatability of analysis was generally in the range of 1-5%. The present GC×GC method is effective and useful for determining the dosage levels of the principal ephedrine-type alkaloids in commercial health supplements and complex raw herb formulations, as well as for differentiating ephedrine-containing products derived from natural plant or synthetic sources, e.g., simply by visualizing the presence or absence of the enantiomeric pairs of (±)-ephedrine and (±)-N-methylephedrine in the GC×GC chromatograms.
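    Behind a figure like "R² ≥ 0.999 over 0.5-100 µg/mL" is an ordinary least-squares calibration line through the standard points. A minimal stdlib-only sketch with invented data (the peak areas are made up, not the paper's):

```python
from statistics import mean

def linear_fit(xs, ys):
    """Ordinary least-squares calibration line y = a + b*x together
    with the coefficient of determination R^2, the figure of merit
    quoted for chromatographic calibration curves."""
    xbar, ybar = mean(xs), mean(ys)
    b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    a = ybar - b * xbar
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - ybar) ** 2 for y in ys)
    return a, b, 1.0 - ss_res / ss_tot

# Invented standards spanning 0.5-100 ug/mL.
conc = [0.5, 1.0, 5.0, 10.0, 50.0, 100.0]
area = [2.1, 4.0, 20.2, 40.1, 199.8, 400.3]
intercept, slope, r2 = linear_fit(conc, area)
```

    An R² meeting the 0.999 criterion over the working range is what licenses back-calculating alkaloid concentrations from peak areas via the fitted slope and intercept.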

  7. Bipolarity in Jungian type theory and the Myers-Briggs Type Indicator.

    Girelli, S A; Stake, J E

    1993-04-01

    The standard form of the Myers-Briggs Type Indicator (MBTI; Myers & McCaulley, 1985) was constructed to measure introversion/extroversion, sensing/intuiting, and thinking/feeling as single, bipolar dimensions. We tested this assumption of bipolarity with a Likert form of the MBTI that allowed for the independent assessment of each attitude and function. A total of 106 female and 59 male undergraduate and graduate students completed the standard and Likert MBTI forms approximately 3 weeks apart. Evidence for the bipolarity of the introversion/extroversion dimension was weak, and findings did not support the bipolarity of the sensing/intuiting or thinking/feeling dimensions. Results provide evidence that high negative correlations within MBTI dimensions are an artifact of its forced-choice format. Implications of the findings for typology measurement are discussed.

  8. Precision automation of cell type classification and sub-cellular fluorescence quantification from laser scanning confocal images

    Hardy Craig Hall

    2016-02-01

    While novel whole-plant phenotyping technologies have been successfully implemented into functional genomics and breeding programs, the potential of automated phenotyping with cellular resolution is largely unexploited. Laser scanning confocal microscopy has the potential to close this gap by providing spatially highly resolved images containing anatomic as well as chemical information on a subcellular basis. However, in the absence of automated methods, the assessment of the spatial patterns and abundance of fluorescent markers with subcellular resolution is still largely qualitative and time-consuming. Recent advances in image acquisition and analysis, coupled with improvements in microprocessor performance, have brought such automated methods within reach, so that information from thousands of cells per image for hundreds of images may be derived in an experimentally convenient time-frame. Here, we present a MATLAB-based analytical pipeline to (1) segment radial plant organs into individual cells, (2) classify cells into cell type categories based upon random forest classification, (3) divide each cell into sub-regions, and (4) quantify fluorescence intensity to a subcellular degree of precision for a separate fluorescence channel. In this research advance, we demonstrate the precision of this analytical process for the relatively complex tissues of Arabidopsis hypocotyls at various stages of development. High speed and robustness make our approach suitable for phenotyping of large collections of stem-like material and other tissue types.
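    The first pipeline stage, splitting an organ cross-section into individual cells, reduces at its core to connected-component labelling of a binary mask. The paper's MATLAB pipeline is far more sophisticated; the toy Python sketch below only illustrates the labelling idea on a hand-made mask:

```python
from collections import deque

def label_cells(mask):
    """Toy stand-in for the segmentation stage: split a binary image
    (1 = cell interior, 0 = wall/background) into individual cells by
    4-connected component labelling via breadth-first flood fill.
    Returns the label image and the number of cells found."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    count = 0
    for i in range(h):
        for j in range(w):
            if mask[i][j] and not labels[i][j]:
                count += 1
                queue = deque([(i, j)])
                labels[i][j] = count
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = count
                            queue.append((ny, nx))
    return labels, count

# Two separate "cells" in a tiny hand-made mask.
mask = [[1, 1, 0, 0],
        [1, 0, 0, 1],
        [0, 0, 1, 1]]
labels, n_cells = label_cells(mask)
```

    Each labelled region can then be handed to the later stages (cell-type classification, sub-region division, per-region fluorescence quantification).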

  9. Generalized N=1 and N=2 structures in M-theory and type II orientifolds

    Graña, Mariana

    2012-01-01

    We consider M-theory and type IIA reductions to four dimensions with N=2 and N=1 supersymmetry and discuss their interconnection. Our work is based on the framework of Exceptional Generalized Geometry (EGG), which extends the tangent bundle to include all symmetries in M-theory and type II string theory, covariantizing the local U-duality group E7. We describe general N=1 and N=2 reductions in terms of SU(7) and SU(6) structures on this bundle and thereby derive the effective four-dimensional N=1 and N=2 couplings; in particular we compute the Kähler and hyper-Kähler potentials as well as the triplet of Killing prepotentials (or the superpotential in the N=1 case). These structures and couplings can be described in terms of forms on an eight-dimensional tangent space where SL(8) contained in E7 acts, which might indicate a description in terms of an eight-dimensional internal space, similar to F-theory. We finally discuss an orbifold action in M-theory and its reduction to O6 orientifolds, and show how the pr...

  10. In-vivo segmentation and quantification of coronary lesions by optical coherence tomography images for a lesion type definition and stenosis grading.

    Celi, Simona; Berti, Sergio

    2014-10-01

    Optical coherence tomography (OCT) is a catheter-based medical imaging technique that produces cross-sectional images of blood vessels. This technique is particularly useful for studying coronary atherosclerosis. In this paper, we present a new framework that allows segmentation and quantification of OCT images of coronary arteries to define plaque type and stenosis grading. These analyses are usually carried out on-line on the OCT workstation, where measurement is mainly operator-dependent and mouse-based. The aim of this program is to simplify and improve the processing of OCT images for morphometric investigations and to present a fast procedure to obtain 3D geometrical models that can also be used for external purposes, such as finite element simulations. The main phases of our toolbox are lumen segmentation and the identification of the main tissues in the artery wall. We validated the proposed method against identification and segmentation manually performed by expert OCT readers. The method was evaluated on ten datasets from clinical routine, and the validation was performed on 210 images randomly extracted from the pullbacks. Our results show that automated segmentation of the vessel and of the tissue components is possible off-line with a precision that is comparable to manual segmentation for the tissue components and to the proprietary OCT console for the lumen segmentation. Several OCT sections have been processed to provide clinical outcomes.

  11. Existence theory for sequential fractional differential equations with anti-periodic type boundary conditions

    Aqlan Mohammed H.

    2016-01-01

    We develop the existence theory for sequential fractional differential equations involving the Liouville-Caputo fractional derivative equipped with anti-periodic type (non-separated) and nonlocal integral boundary conditions. Several existence criteria depending on the nonlinearity involved in the problems are presented by means of a variety of tools of fixed point theory. The applicability of the results is shown with the aid of examples. Our results are not only new in the given configuration but also yield some new special cases for specific choices of parameters involved in the problems.

  12. Generating Erler-Schnabl-type Solution for Tachyon Vacuum in Cubic Superstring Field Theory

    Arroyo, E Aldo

    2010-01-01

    We analyze a new class of identity-based solutions in open bosonic string field theory and cubic superstring field theory. Even though these solutions seem to be trivial, it turns out that after performing a suitable gauge transformation, we are left with the known Erler-Schnabl-type solutions which correctly reproduce the value of the D-brane tension. This important result shows explicitly how a seemingly trivial solution can generate a non-trivial configuration which precisely represents the tachyon vacuum.

  13. Generating Erler-Schnabl-type solution for the tachyon vacuum in cubic superstring field theory

    Aldo Arroyo, E.

    2010-11-01

    We study a new set of identity-based solutions to analyze the problem of tachyon condensation in open bosonic string field theory and cubic superstring field theory. Even though these identity-based solutions seem to be trivial, it turns out that after performing a suitable gauge transformation, we are left with the known Erler-Schnabl-type solutions which correctly reproduce the value of the D-brane tension. This result shows explicitly that a seemingly trivial solution can generate a non-trivial configuration which precisely represents the tachyon vacuum.

  14. Preclinical evaluation and quantification of [¹⁸F]MK-9470 as a radioligand for PET imaging of the type 1 cannabinoid receptor in rat brain

    Casteels, Cindy [K.U. Leuven, University Hospital Leuven, Division of Nuclear Medicine, Leuven (Belgium); K.U. Leuven, MoSAIC, Molecular Small Animal Imaging Center, Leuven (Belgium); University Hospital Gasthuisberg, Division of Nuclear Medicine, Leuven (Belgium); Koole, Michel; Laere, Koen van [K.U. Leuven, University Hospital Leuven, Division of Nuclear Medicine, Leuven (Belgium); K.U. Leuven, MoSAIC, Molecular Small Animal Imaging Center, Leuven (Belgium); Celen, Sofie; Bormans, Guy [K.U. Leuven, MoSAIC, Molecular Small Animal Imaging Center, Leuven (Belgium); K.U. Leuven, Laboratory for Radiopharmacy, Leuven (Belgium)

    2012-09-15

    [¹⁸F]MK-9470 is an inverse agonist for the type 1 cannabinoid (CB1) receptor allowing its use in PET imaging. We characterized the kinetics of [¹⁸F]MK-9470 and evaluated its ability to quantify CB1 receptor availability in the rat brain. Dynamic small-animal PET scans with [¹⁸F]MK-9470 were performed in Wistar rats on a FOCUS-220 system for up to 10 h. Both plasma and perfused brain homogenates were analysed using HPLC to quantify radiometabolites. Displacement and blocking experiments were done using cold MK-9470 and another inverse agonist, SR141716A. The distribution volume (V_T) of [¹⁸F]MK-9470 was used as a quantitative measure and compared to the use of brain uptake, expressed as SUV, a simplified method of quantification. The percentage of intact [¹⁸F]MK-9470 in arterial plasma samples was 80 ± 23 % at 10 min, 38 ± 30 % at 40 min and 13 ± 14 % at 210 min. A polar radiometabolite fraction was detected in plasma and brain tissue. The brain radiometabolite concentration was uniform across the whole brain. Displacement and pretreatment studies showed that 56 % of the tracer binding was specific and reversible. V_T values obtained with a one-tissue compartment model plus constrained radiometabolite input had good identifiability (≤10 %). Ignoring the radiometabolite contribution using a one-tissue compartment model alone, i.e. without constrained radiometabolite input, overestimated the [¹⁸F]MK-9470 V_T, but was correlated. A correlation between [¹⁸F]MK-9470 V_T and SUV in the brain was also found (R² = 0.26-0.33; p ≤ 0.03). While the presence of a brain-penetrating radiometabolite fraction complicates the quantification of [¹⁸F]MK-9470 in the rat brain, its tracer kinetics can be modelled using a one-tissue compartment model with and without constrained radiometabolite input. (orig.)
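
    The one-tissue compartment model behind these V_T estimates is dC_t(t)/dt = K1·C_p(t) − k2·C_t(t), with distribution volume V_T = K1/k2. A minimal numerical sketch (parameter values are illustrative, not those of the study, and a constant plasma input stands in for the real metabolite-corrected input function):

```python
def simulate_1tcm(K1, k2, Cp, t_end=600.0, dt=0.01):
    """Euler integration of dCt/dt = K1*Cp - k2*Ct for a constant plasma input Cp."""
    n = int(t_end / dt)
    Ct = 0.0
    for _ in range(n):
        Ct += dt * (K1 * Cp - k2 * Ct)
    return Ct

K1, k2, Cp = 0.1, 0.05, 1.0   # illustrative values (mL/cm^3/min, 1/min, kBq/mL)
VT = K1 / k2                  # distribution volume: 2.0 here
Ct_eq = simulate_1tcm(K1, k2, Cp)
print(VT, round(Ct_eq, 3))    # → 2.0 2.0: tissue level approaches VT * Cp
```

    In practice K1 and k2 are fitted to the measured time-activity curve (here with a second, constrained input for the radiometabolite) rather than assumed.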

  15. NEW PRINCIPLES OF POWER AND ENERGY RATE OF INCREMENTAL RATE TYPE FOR GENERALIZED CONTINUUM FIELD THEORIES

    戴天民

    2001-01-01

    The aim of this paper is to establish new principles of power and energy rate of incremental type in generalized continuum mechanics. By combining new principles of virtual velocity and virtual angular velocity, as well as of virtual stress and virtual couple stress, with cross terms of incremental rate type, a new principle of power and energy rate of incremental rate type with cross terms for micropolar continuum field theories is presented. From it, all corresponding equations of motion and boundary conditions, as well as power and energy rate equations of incremental rate type, for micropolar and nonlocal micropolar continua are derived with the help of generalized Piola's theorems, without any additional requirement. Complete results for micromorphic continua could be derived similarly. The results derived in the present paper are believed to be new; they could be used to establish corresponding finite element methods of incremental rate type for generalized continuum mechanics.

  16. Heartbeat-related displacement of the thoracic aorta in patients with chronic aortic dissection type B: Quantification by dynamic CTA

    Weber, Tim F. [University of Heidelberg, Department of Diagnostic and Interventional Radiology, Im Neuenheimer Feld 110, 69120 Heidelberg (Germany)], E-mail: tim.weber@med.uni-heidelberg.de; Ganten, Maria-Katharina [German Cancer Research Center, Department of Radiology, Im Neuenheimer Feld 280, 69120 Heidelberg (Germany)], E-mail: m.ganten@dkfz.de; Boeckler, Dittmar [University of Heidelberg, Department of Vascular and Endovascular Surgery, Im Neuenheimer Feld 110, 69120 Heidelberg (Germany)], E-mail: dittmar.boeckler@med.uni-heidelberg.de; Geisbuesch, Philipp [University of Heidelberg, Department of Vascular and Endovascular Surgery, Im Neuenheimer Feld 110, 69120 Heidelberg (Germany)], E-mail: philipp.geisbuesch@med.uni-heidelberg.de; Kauczor, Hans-Ulrich [University of Heidelberg, Department of Diagnostic and Interventional Radiology, Im Neuenheimer Feld 110, 69120 Heidelberg (Germany)], E-mail: hu.kauczor@med.uni-heidelberg.de; Tengg-Kobligk, Hendrik von [German Cancer Research Center, Department of Radiology, Im Neuenheimer Feld 280, 69120 Heidelberg (Germany)], E-mail: h.vontengg@dkfz.de

    2009-12-15

    Purpose: The purpose of this study was to characterize the heartbeat-related displacement of the thoracic aorta in patients with chronic aortic dissection type B (CADB). Materials and methods: Electrocardiogram-gated computed tomography angiography was performed during inspiratory breath-hold in 11 patients with CADB: collimation 16 × 1 mm, pitch 0.2, slice thickness 1 mm, reconstruction increment 0.8 mm. Multiplanar reformations were taken for 20 equidistant time instances through both the ascending (AAo) and descending aorta (true lumen, DAoT; false lumen, DAoF) and the vertex of the aortic arch (VA). In-plane vessel displacement was determined by region-of-interest analysis. Results: Mean displacement was 5.2 ± 1.7 mm (AAo), 1.6 ± 1.0 mm (VA), 0.9 ± 0.4 mm (DAoT), and 1.1 ± 0.4 mm (DAoF). This indicated a significant reduction of displacement from AAo to VA and DAoT (p < 0.05). The direction of displacement was anterior for AAo and cranial for VA. Conclusion: In CADB, the thoracic aorta undergoes a heartbeat-related displacement that exhibits an unbalanced distribution of magnitude and direction along the thoracic vessel course. Since consecutive traction forces on the aortic wall have to be assumed, these observations may have implications for the pathogenesis of and treatment strategies for CADB.
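
    Region-of-interest displacement measurements of this kind reduce to tracking the vessel-centre coordinates across the cardiac phases and reporting the peak in-plane excursion from the reference phase. A sketch with invented coordinates (not the study's data):

```python
import numpy as np

def peak_displacement(centroids):
    """Max in-plane distance (mm) of ROI centroids from the first cardiac phase."""
    c = np.asarray(centroids, dtype=float)
    return float(np.max(np.linalg.norm(c - c[0], axis=1)))

# Hypothetical ascending-aorta centre positions (x, y) over a few phases, in mm.
aao = [(0.0, 0.0), (1.5, 2.0), (3.0, 4.0), (2.0, 2.5)]
print(peak_displacement(aao))  # → 5.0 (the 3-4-5 excursion at the third phase)
```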

  17. Real-time PCR method for the detection and quantification of Acanthamoeba species in various types of water samples.

    Kao, Po-Min; Tung, Min-Che; Hsu, Bing-Mu; Tsai, Hsien-Lung; She, Cheng-Yu; Shen, Shu-Min; Huang, Wen-Chien

    2013-03-01

    In this study, a quantitative real-time PCR assay was developed to detect and quantify Acanthamoeba spp. in various environmental water samples. The water samples were taken from a watershed, a water treatment plant, and three thermal spring recreation areas. The overall detection rate was 14.2 % (25/176) for Acanthamoeba spp. The percentages of samples containing Acanthamoeba spp. from river water, raw drinking water, and thermal spring water were 13 % (13/100), 25 % (7/28), and 10.4 % (5/48), respectively. Acanthamoeba spp. concentrations were determined by SYBR Green quantitative real-time PCR. A plasmid-based standard curve was constructed to determine the Acanthamoeba concentration, using dilution factors to achieve 1.36 × 10⁹ gene copies per PCR for the 18S rRNA gene in Acanthamoeba spp. The resulting concentrations varied by the type of water, in the range of 46-2.6 × 10² cells/l in positive raw drinking water, 2.7 × 10²-1.5 × 10⁴ cells/l in river water, and 54-1.7 × 10³ cells/l in thermal spring water. The presence of Acanthamoeba spp. in the raw drinking water samples was also found to have a significant difference with the heterotrophic plate count. The presence of Acanthamoeba spp. in various aquatic environments may be a potential health hazard and must be further evaluated.
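
    Quantification against a plasmid-based standard curve amounts to a linear fit of Ct versus log10(gene copies), inverted for unknown samples. A minimal sketch with made-up dilution data (a slope of about -3.3 corresponds to roughly 100 % amplification efficiency):

```python
import numpy as np

# Hypothetical 10-fold dilution series: log10(gene copies) vs. measured Ct.
log_copies = np.array([3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0])
ct         = np.array([33.4, 30.1, 26.8, 23.5, 20.2, 16.9, 13.6])

slope, intercept = np.polyfit(log_copies, ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0  # ~1.0 means ~100 % per-cycle doubling

def copies_from_ct(ct_unknown):
    """Invert the standard curve: Ct -> gene copies per reaction."""
    return 10 ** ((ct_unknown - intercept) / slope)

print(round(slope, 2), round(copies_from_ct(25.15), 0))  # → -3.3 316228.0
```

    Copies per reaction would then be converted to cells/l using the sample volume, concentration factor and gene copy number per cell.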

  18. A COSSERAT-TYPE PLATE THEORY AND ITS APPLICATION TO CARBON NANOTUBE MICROSTRUCTURE

    Abdellatif Selmi

    2014-01-01

    The predictive capabilities of plate and shell theories greatly depend on their underlying kinematic assumptions. In this study, we develop a Cosserat-type elastic plate theory which accounts for rotations around the normal to the mid-surface plane (so-called drilling rotations). Internal loads, equilibrium equations, boundary conditions and constitutive equations are derived. The case of a single-walled carbon nanotube (SWNT) modelled as a Cosserat medium is taken here as a reference example. Material parameters are identified and the proposed theory is used to solve analytically the problem of a polymer-SWNT composite tube under torsion. Predictions such as an absolute size effect are compared to those of the classical Cauchy-de Saint-Venant results.

  19. Differential models for B-type open-closed topological Landau-Ginzburg theories

    Babalic, Mirela; Lazaroiu, Calin Iuliu; Tavakol, Mehdi

    2016-01-01

    We propose a family of differential models for B-type open-closed topological Landau-Ginzburg theories defined by a pair $(X,W)$, where $X$ is any non-compact Calabi-Yau manifold and $W$ is any holomorphic complex-valued function defined on $X$ whose critical set is compact. The models are constructed at cochain level using smooth data, including the twisted Dolbeault algebra of polyvector valued forms and a twisted Dolbeault category of holomorphic factorizations of $W$. We give explicit proposals for cochain level versions of the bulk and boundary traces and for the bulk-boundary and boundary-bulk maps of the Landau-Ginzburg theory. We prove that most of the axioms of an open-closed topological field theory are satisfied on cohomology and conjecture that the remaining axioms are also satisfied.

  20. LRS Bianchi type-II string cosmological models in a modified theory of gravitation

    Kanakavalli, T.; Ananda Rao, G.; Reddy, D. R. K.

    2017-03-01

    This paper is devoted to the investigation of spatially homogeneous anisotropic LRS Bianchi type-II cosmological models with a string source in a modified theory of gravitation formulated by Harko et al. (Phys. Rev. D 84:024020, 2011), universally known as f(R, T) gravity. Here R is the Ricci scalar and T is the trace of the energy-momentum tensor. By solving the field equations, we present massive string and Takabayasi (p-string) models in this theory. It is interesting to note, however, that the geometric string does not exist in this space-time in this theory. Physical and geometrical properties of the strings obtained are also discussed.

  1. Quantification of the physiochemical constraints on the export of spider silk proteins by Salmonella type III secretion

    Voigt Christopher A

    2010-10-01

    Background: The type III secretion system (T3SS) is a molecular machine in gram-negative bacteria that exports proteins through both membranes to the extracellular environment. It has been previously demonstrated that the T3SS encoded in Salmonella Pathogenicity Island 1 (SPI-1) can be harnessed to export recombinant proteins. Here, we demonstrate the secretion of a variety of unfolded spider silk proteins and use these data to quantify the constraints of this system with respect to the export of recombinant protein. Results: To test how the timing and level of protein expression affect secretion, we designed a hybrid promoter that combines an IPTG-inducible system with a natural genetic circuit that controls effector expression in Salmonella (psicA). LacO operators are placed in various locations in the psicA promoter; optimal induction occurs when a single operator is placed at +5 nt (234-fold), and a lower basal level of expression is achieved when a second operator is placed at -63 nt to take advantage of DNA looping. Using this tool, we find that the secretion efficiency (protein secreted divided by total expressed) is constant as a function of total expressed. We also demonstrate that the secretion flux peaks at 8 hours. We then use whole-gene DNA synthesis to construct codon-optimized spider silk genes for full-length (3,129 amino acids) Latrodectus hesperus dragline silk, Bombyx mori cocoon silk, and Nephila clavipes flagelliform silk, and PCR is used to create eight truncations of these genes. These proteins are all unfolded polypeptides, and they encompass a variety of lengths, charges, and amino acid compositions. We find that proteins shorter than 550 amino acids reliably secrete, and the probability declines significantly after ~700 amino acids. There is also a charge optimum at -2.4, and secretion efficiency declines for very positively or negatively charged proteins. There is no significant correlation with hydrophobicity.
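
    The headline quantity here, secretion efficiency, is simply protein secreted divided by total expressed; the abstract's claim is that this ratio is flat across expression levels. A toy check with invented numbers:

```python
# Hypothetical expression/secretion measurements (arbitrary units);
# the claim is that efficiency = secreted / expressed stays constant.
total_expressed = [10.0, 50.0, 100.0, 500.0]
secreted        = [1.2,   6.0,  12.0,  60.0]

efficiencies = [s / e for s, e in zip(secreted, total_expressed)]
print([round(x, 2) for x in efficiencies])  # → [0.12, 0.12, 0.12, 0.12]
```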

  2. Quasi-independence, homology and the unity of type: a topological theory of characters.

    Wagner, Günter P; Stadler, Peter F

    2003-02-21

    In this paper Lewontin's notion of "quasi-independence" of characters is formalized as the assumption that a region of the phenotype space can be represented by a product space of orthogonal factors. In this picture each character corresponds to a factor of a region of the phenotype space. We consider any region of the phenotype space that has a given factorization as a "type", i.e. as a set of phenotypes that share the same set of phenotypic characters. Using the notion of local factorizations we develop a theory of character identity based on the continuation of common factors among different regions of the phenotype space. We also consider the topological constraints on evolutionary transitions among regions with different regional factorizations, i.e. on the evolution of new types or body plans. It is shown that direct transition between different "types" is only possible if the transitional forms have all the characters that the ancestral and the derived types have and are thus compatible with the factorization of both types. Transitional forms thus have to go over a "complexity hump" where they have more quasi-independent characters than either the ancestral or the derived type. The only logical, but biologically unlikely, alternative is a "hopeful monster" that transforms in a single step from the ancestral type to the derived type. Topological considerations also suggest a new factor that may contribute to the evolutionary stability of "types". It is shown that if the type is decomposable into factors which are vertex irregular (i.e. have states that are more or less preferred in a random walk), the region of phenotypes representing the type contains islands of strongly preferred states. In other words, types have a statistical tendency to retain evolutionary trajectories within their interior, which adds to the evolutionary persistence of types.

  3. Deformed Type 0A Matrix Model and Super-Liouville Theory for Fermionic Black Holes

    Ahn, C; Park, J; Suyama, T; Yamamoto, M; Ahn, Changrim; Kim, Chanju; Park, Jaemo; Suyama, Takao; Yamamoto, Masayoshi

    2006-01-01

    We consider a $\hat{c}=1$ model in the fermionic black hole background. For this purpose we consider a model which contains both the N=1 and the N=2 super-Liouville interactions. We propose that this model is dual to a recently proposed type 0A matrix quantum mechanics model with vortex deformations. We support our conjecture by showing that the non-perturbative corrections to the free energy computed by both the matrix model and the super-Liouville theories agree exactly when the N=2 interaction is treated as a small perturbation. We also show that a two-point function on the sphere calculated from the deformed type 0A matrix model is consistent with that of the N=2 super-Liouville theory when the N=1 interaction becomes small. This duality between the matrix model and super-Liouville theories leads to a conjecture for arbitrary $n$-point correlation functions of the N=1 super-Liouville theory on the sphere.

  4. Generalization of Wertheim's theory for the assembly of various types of rings.

    Tavares, J M; Almarza, N G; Telo da Gama, M M

    2015-08-07

    We generalize Wertheim's first-order perturbation theory to account for the effect on the thermodynamics of the self-assembly of rings characterized by two energy scales. The theory is applied to a lattice model of patchy particles and tested against Monte Carlo simulations on an fcc lattice. These particles have 2 patches of type A and 10 patches of type B, which may form bonds AA or AB that decrease the energy by εAA and by εAB ≡ rεAA, respectively. The angle θ between the 2 A-patches on each particle is fixed at 60°, 90° or 120°. For values of r below 1/2 and above a threshold r_th(θ), the models exhibit a phase diagram with two critical points. Both theory and simulation predict that r_th increases when θ decreases. We show that the mechanism that prevents phase separation in models with decreasing values of θ is related to the formation of loops containing AB bonds. Moreover, we show that by including the free energy of B-rings (loops containing one AB bond), the theory describes the trends observed in the simulation results, but that for the lowest values of θ the theoretical description deteriorates due to the increasing number of loops containing more than one AB bond.
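
    At the heart of Wertheim's first-order perturbation theory is a mass-action law for the fraction X of unbonded sites of a given type; for a single site type it reads X = 1/(1 + ρΔX), where ρ is the density and Δ a bond volume/strength integral. A sketch of the closed-form solution (the values of ρΔ below are illustrative, not taken from this model):

```python
import math

def unbonded_fraction(rho_delta):
    """Solve the mass-action law X = 1/(1 + rho*Delta*X),
    i.e. rho*Delta*X**2 + X - 1 = 0, for the physical root X in (0, 1]."""
    if rho_delta == 0.0:
        return 1.0  # no bonding: every site is free
    return (-1.0 + math.sqrt(1.0 + 4.0 * rho_delta)) / (2.0 * rho_delta)

for rd in (0.0, 1.0, 100.0):  # rho*Delta grows with density and bond strength
    X = unbonded_fraction(rd)
    assert abs(X * (1.0 + rd * X) - 1.0) < 1e-12  # X satisfies the mass-action law
    print(rd, round(X, 4))
```

    The generalization in the paper adds ring (loop) contributions to the free energy on top of this tree-like bonding term.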

  5. Automating Access Control Logics in Simple Type Theory with LEO-II (Techreport)

    Benzmueller, Christoph

    2009-01-01

    Garg and Abadi recently proved that prominent access control logics can be translated in a sound and complete way into modal logic S4. We have previously outlined how normal multimodal logics, including monomodal logics K and S4, can be embedded in simple type theory (which is also known as higher-order logic), and we have demonstrated that the higher-order theorem prover LEO-II can automate reasoning in and about them. In this paper we combine these results and describe a sound and complete embedding of different access control logics in simple type theory. Employing this framework, we show that the off-the-shelf theorem prover LEO-II can be applied to automate reasoning in prominent access control logics.

  6. The epsilon regime of chiral perturbation theory with Wilson-type fermions

    Jansen, K. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Shindler, A. [Liverpool Univ. (United Kingdom). Theoretical Physics Division

    2009-11-15

    In this proceedings contribution we report on the ongoing effort to simulate Wilson-type fermions in the so-called epsilon regime of chiral perturbation theory (χPT). We present results for the chiral condensate and the pseudoscalar decay constant obtained with Wilson twisted mass fermions employing two lattice spacings, two different physical volumes and several quark masses. With this set of simulations we make a first attempt to estimate the systematic uncertainties. (orig.)

  7. Flux vacua in Dirac-Born-Infeld type Einstein-Maxwell theory

    Maki, Takuya; Kobayashi, Koichiro; Shiraishi, Kiyoshi

    2011-01-01

    We study compactification of extra dimensions in a theory of Dirac-Born-Infeld (DBI) type gravity. We investigate the solution for Minkowski spacetime with an $S^{2}$ extra space. The solution is derived by the effective potential method in the presence of magnetic flux on the extra sphere. We find that, in a certain model, the radius of the extra space has a minimum value independent of the higher-dimensional Newton constant in the weak-field limit.

  8. Bianchi type VI1 cosmological model with wet dark fluid in scale invariant theory of gravitation

    Mishra, B

    2014-01-01

    In this paper, we have investigated Bianchi type VIh, II and III cosmological models with wet dark fluid in the scale-invariant theory of gravity, where the matter field is in the form of a perfect fluid and with a time-dependent gauge function (Dirac gauge). A non-singular model for the universe filled with disordered radiation is constructed, and some physical behaviors of the model are studied for the feasible VIh (h = 1) space-time.

  9. Gödel and Gödel-type universes in Brans-Dicke theory

    Agudelo, J A; Petrov, A Yu; Porfírio, P J; Santos, A F

    2016-01-01

    In this paper, conditions for the existence of Gödel and Gödel-type solutions in Brans-Dicke (BD) scalar-tensor theory, and their main features, are studied. Special attention is paid to the consistency of the equations of motion, causality, the existence of CTCs (closed time-like curves), and the role which the cosmological constant and the Mach principle play in achieving the consistency of this model.

  10. Spatially Homogeneous Bianchi Type V Cosmological Model in the Scale-Covariant Theory of Gravitation

    Shri Ram; M.K.Verma; Mohd.Zeyauddin

    2009-01-01

    We discuss spatially homogeneous and anisotropic Bianchi type-V spacetime filled with a perfect fluid in the framework of the scale-covariant theory of gravitation proposed by Canuto et al. By applying the law of variation for Hubble's parameter, exact solutions of the field equations are obtained, which correspond to a model of the universe having a big-bang type singularity at the initial time t = 0. The cosmological model, evolving from the initial singularity, expands with power-law expansion and gives essentially an empty space for large time. The physical and dynamical properties of the model are also discussed.

  11. Specimens: "most of" generic NPs in a contextually flexible type theory

    Retoré, Christian

    2011-01-01

    This paper proposes to compute the meanings associated with sentences containing generic NPs corresponding to the "most of" generalized quantifier. We call these generics specimens; they resemble stereotypes or prototypes in lexical semantics. The meanings are viewed as logical formulae that can thereafter be interpreted in your favorite models. We depart from the dominant Fregean single untyped universe and opt for type theory, with hints from Hilbert's epsilon calculus and from medieval philosophy. Our type-theoretic analysis bears some resemblance to ongoing work in lexical semantics. Our model also applies to classical examples involving a class (or a generic element of this class) which is provided by the context. An outcome of this study is that, in the minimalism-contextualism debate, if one adopts a type-theoretical view, terms encode the purely semantic meaning component while their typing is pragmatically determined.

  12. Type Synthesis for 4-DOF Parallel Press Mechanism Using GF Set Theory

    HE Jun; GAO Feng; MENG Xiangdun; GUO Weizhong

    2015-01-01

    Parallel mechanisms are used in large-capacity servo presses to avoid the over-constraint of traditional redundant actuation. Current research mainly focuses on performance analysis of specific parallel press mechanisms; the type synthesis and evaluation of parallel press mechanisms, especially those with four degrees of freedom (DOF), is seldom studied. Here, the type synthesis of 4-DOF parallel press mechanisms is carried out based on generalized function (GF) set theory. Five design criteria for 4-DOF parallel press mechanisms are first proposed, and a general procedure for the type synthesis of parallel press mechanisms is obtained, comprising number synthesis, symmetrical synthesis of constraint GF sets, decomposition of motion GF sets, and design of limbs. Nine combinations of constraint GF sets of 4-DOF parallel press mechanisms, ten combinations of GF sets of active limbs, and eleven combinations of GF sets of passive limbs are synthesized. Thirty-eight kinds of press mechanisms are presented, and different structures of kinematic limbs are then designed. Finally, the geometrical constraint complexity (GCC), kinematic pair complexity (KPC), and type complexity (TC) are proposed to evaluate the press types, and the optimal press type is achieved. General methodologies of type synthesis and evaluation for parallel press mechanisms are suggested.

  13. Mild to severe social fears: ranking types of feared social situations using item response theory.

    Crome, Erica; Baillie, Andrew

    2014-06-01

    Social anxiety disorder is one of the most common mental disorders and is associated with long-term impairment, distress and vulnerability to secondary disorders. Certain types of social fears are more common than others, with public speaking fears typically the most prevalent in epidemiological surveys. The distinction between performance- and interaction-based fears has been the focus of long-standing debate in the literature, with evidence that performance-based fears may reflect milder presentations of social anxiety. This study aims to explicitly test whether different types of social fears differ in underlying social anxiety severity using item response theory techniques. Different types of social fears were assessed using items from three different structured diagnostic interviews in four different epidemiological surveys in the United States (n=2261, n=5411) and Australia (n=1845, n=1497), and ranked using 2-parameter logistic item response theory models. Overall, patterns of underlying severity indicated by different fears were consistent across the four samples, with items functioning across a range of social anxiety. Public performance fears and speaking at meetings/classes indicated the lowest levels of social anxiety, with increasing severity indicated by situations such as being assertive or attending parties. Fears of using public bathrooms or eating, drinking or writing in public reflected the highest levels of social anxiety. Understanding differences in the underlying severity of different types of social fears has important implications for the underlying structure of social anxiety and may also enhance the delivery of social anxiety treatment at a population level.
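
    The 2-parameter logistic IRT model used for this ranking gives the probability that a respondent at latent severity θ endorses item i as P_i(θ) = 1/(1 + exp(−a_i(θ − b_i))); items are then ranked by their difficulty b_i. A sketch with invented item parameters (not those estimated in the study):

```python
import math

def p_endorse(theta, a, b):
    """2PL item response function: probability of endorsing an item
    at latent trait level theta, with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Invented (discrimination a, difficulty b) values for three social-fear items.
items = {
    "public speaking":   (1.5, -1.0),  # endorsed even at mild social anxiety
    "attending parties": (1.2,  0.5),
    "eating in public":  (1.3,  1.8),  # endorsed only at severe social anxiety
}

theta = 0.0  # an average level of latent social anxiety
for name, (a, b) in sorted(items.items(), key=lambda kv: kv[1][1]):
    print(f"{name}: P = {p_endorse(theta, a, b):.2f}")
```

    Lower-difficulty items (public speaking here) are endorsed at mild severity, reproducing the paper's ordering from common performance fears to rarer, more severe fears.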

  14. A Density Functional Theory Study of Doped Tin Monoxide as a Transparent p-type Semiconductor

    Bianchi Granato, Danilo

    2012-05-01

    In the pursuit of enhancing the electronic properties of transparent p-type semiconductors, this work uses density functional theory to study the effects of doping tin monoxide with nitrogen, antimony, yttrium and lanthanum. An overview of the theoretical concepts and a detailed description of the methods employed are given, including a discussion of the correction scheme for charged defects proposed by Freysoldt and others [Freysoldt 2009]. Analysis of the formation energies of the defects points out that nitrogen substitutes an oxygen atom and does not provide charge carriers. On the other hand, antimony, yttrium, and lanthanum substitute a tin atom and donate n-type carriers. Study of the band structure and density of states indicates that yttrium and lanthanum improve the hole mobility. The present results are in good agreement with available experimental works and help to improve the understanding of how to engineer transparent p-type materials with higher hole mobilities.

  15. Axion-dilaton-modulus gravity theory of Brans-Dicke-type and conformal symmetry

    Quirós, I

    2000-01-01

    Conformal symmetry is investigated within the context of axion-dilaton-modulus theory of gravity of Brans-Dicke-type. A distinction is made between general conformal symmetry and invariance under transformations of the physical units. The conformal degree of symmetry of the theory is studied when quantum fermion (lepton) modes with electromagnetic interaction are considered. Based on the requirement of invariance of the physical laws under general transformations of the units of measure, arguments are given that point at a matter action with non-minimal coupling of the dilaton to the matter fields as the most viable description of the world within the context of the model studied. The geometrical implications of the results obtained are discussed.

  16. Social cognitive theory correlates of moderate-intensity exercise among adults with type 2 diabetes.

    Heiss, Valerie J; Petosa, R L

    2016-01-01

    The purpose of this study was to identify social cognitive theory (SCT) correlates of moderate- to vigorous-intensity exercise (MVPA) among adults with type 2 diabetes. Adults with type 2 diabetes (N = 181) participated in the study. Participants were recruited through ResearchMatch.org to complete an online survey. The survey used previously validated instruments to measure dimensions of self-efficacy, self-regulation, social support, outcome expectations, the physical environment, and minutes of MVPA per week. Spearman Rank Correlations were used to determine the relationship between SCT variables and MVPA. Classification and Regression Analysis using a decision tree model was used to determine the amount of variance in MVPA explained by SCT variables. Due to low levels of vigorous activity, only moderate-intensity exercise (MIE) was analyzed. SCT variables explained 42.4% of the variance in MIE. Self-monitoring, social support from family, social support from friends, and self-evaluative outcome expectations all contributed to the variability in MIE. Other contributing variables included self-reward, task self-efficacy, social outcome expectations, overcoming barriers, and self-efficacy for making time for exercise. SCT is a useful theory for identifying correlates of MIE among adults with type 2 diabetes. The SCT correlates can be used to refine diabetes education programs to target the adoption and maintenance of regular exercise.
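
    The statistic named above, the Spearman rank correlation, can be sketched in a few lines of standard-library Python. This is a generic illustration, not the authors' analysis; the variable names and scores below are invented.

```python
from statistics import mean

def rankdata(values):
    """Assign 1-based ranks; tied values share the average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1            # mean of tied positions, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rho = Pearson correlation computed on the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    mx, my = mean(rx), mean(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical data: self-monitoring score vs. weekly minutes of
# moderate-intensity exercise for seven participants
self_monitoring = [2, 5, 3, 4, 1, 5, 2]
mie_minutes     = [60, 150, 90, 120, 30, 180, 45]
rho = spearman(self_monitoring, mie_minutes)
```

    A rho near +1 would indicate that participants who self-monitor more also exercise more, which is the kind of association the study reports.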

  17. Axion decay constants at special points in type II string theory

    Honda, Masaki; Oikawa, Akane; Otsuka, Hajime

    2017-01-01

    We propose the mechanism to disentangle the decay constant of closed string axion from the string scale in the framework of type II string theory on Calabi-Yau manifold. We find that the quantum and geometrical corrections in the prepotential that arise at some special points in the moduli space widen the window of axion decay constant. In particular, around the small complex structure points, the axion decay constant becomes significantly lower than the string scale. We also discuss the moduli stabilization leading to the phenomenologically attractive low-scale axion decay constant.

  18. Reissner-Nordström-de-Sitter-type Solution by a Gauge Theory of Gravity

    V. Enache; Camelia Popa; V. Păun; M. Agop

    2008-01-01

    We use the theory based on a gravitational gauge group (Wu's model) to obtain a spherically symmetric solution of the field equations for the gravitational potential on a Minkowski spacetime. The gauge group, the gauge covariant derivative, the strength tensor of the gauge field, the gauge-invariant Lagrangian with the cosmological constant, and the field equations of the gauge potentials, with a gravitational energy-momentum tensor as well as with a tensor of the field of a point-like source, are determined. Finally, a Reissner-Nordström-de Sitter-type metric on the gauge group space is obtained.

  19. General N=1 supersymmetric flux vacua of massive type IIA string theory.

    Behrndt, Klaus; Cvetic, Mirjam

    2005-07-08

    We derive conditions for the existence of four-dimensional N=1 supersymmetric flux vacua of massive type IIA string theory with general supergravity fluxes turned on. For an SU(3) singlet Killing spinor, we show that such flux vacua exist when the internal geometry is nearly Kähler. The geometry is not warped, all the allowed fluxes are proportional to the mass parameter, and the dilaton is fixed by a ratio of (quantized) fluxes. The four-dimensional cosmological constant, while negative, becomes small in the vacuum with the weak string coupling.

  20. Axion decay constants at special points in type II string theory

    Honda, Masaki; Otsuka, Hajime

    2016-01-01

    We propose the mechanism to disentangle the decay constant of closed string axion from the string scale in the framework of type II string theory on Calabi-Yau manifold. We find that the quantum and geometrical corrections in the prepotential that arise at some special points in the moduli space widen the window of axion decay constant. In particular, around the small complex structure points, the axion decay constant becomes significantly lower than the string scale. We also discuss the moduli stabilization leading to the phenomenologically attractive low-scale axion decay constant.

  1. Discovering cell types in flow cytometry data with random matrix theory

    Shen, Yang; Nussenblatt, Robert; Losert, Wolfgang

    Flow cytometry is a widely used experimental technique in immunology research. During the experiments, peripheral blood mononuclear cells (PBMC) from a single patient, labeled with multiple fluorescent stains that bind to different proteins, are illuminated by a laser. The intensity of each stain on a single cell is recorded and reflects the amount of protein expressed by that cell. The data analysis focuses on identifying specific cell types related to a disease. Different cell types can be identified by the type and amount of protein they express. To date, this has most often been done manually by labelling a protein as expressed or not while ignoring the amount of expression. Using a cross correlation matrix of stain intensities, which contains both information on the proteins expressed and their amount, has been largely ignored by researchers as it suffers from measurement noise. Here we present an algorithm to identify cell types in flow cytometry data which uses random matrix theory (RMT) to reduce noise in a cross correlation matrix. We demonstrate our method using a published flow cytometry data set. Compared with previous analysis techniques, we were able to rediscover relevant cell types in an automatic way.
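
    The noise-reduction step described above can be sketched with a Marchenko-Pastur eigenvalue filter, the standard RMT tool for cleaning a correlation matrix. This is a generic illustration under common conventions, not the authors' code; the data are synthetic, and flattening the noise bulk to its average is one of several accepted choices.

```python
import numpy as np

def rmt_filter_correlation(X):
    """Denoise the cross correlation matrix of stain intensities via RMT.

    X: (T, N) array -- T cells (observations), N stains (variables).
    Eigenvalues below the Marchenko-Pastur upper edge lambda_+ = (1+sqrt(N/T))^2
    are treated as noise; their contribution is flattened to the noise average
    and the matrix is re-normalized to unit diagonal.
    """
    T, N = X.shape
    C = np.corrcoef(X, rowvar=False)          # (N, N) correlation of stains
    lam, V = np.linalg.eigh(C)
    lam_plus = (1 + np.sqrt(N / T)) ** 2      # MP upper edge for pure noise
    noise = lam < lam_plus
    lam_clean = lam.copy()
    if noise.any():
        lam_clean[noise] = lam[noise].mean()  # flatten the noise bulk
    C_clean = (V * lam_clean) @ V.T
    d = np.sqrt(np.diag(C_clean))
    return C_clean / np.outer(d, d)           # restore unit diagonal

# Synthetic example: 500 cells, 8 stains, one genuine correlation (stains 0, 1)
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))
X[:, 1] += 0.9 * X[:, 0]
C_clean = rmt_filter_correlation(X)
```

    The genuine correlation survives the filter (its eigenvalue exceeds the MP edge), while spurious correlations among the other stains are suppressed; the cleaned matrix can then be fed to a clustering step to group co-expressed stains.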

  2. Energy of the Universe in Bianchi-type I Models in Møller's Tetrad Theory of Gravity

    Aydogdu, O; Aydogdu, Oktay; Salti, Mustafa

    2005-01-01

    In this paper, using the energy definition in Møller's tetrad theory of gravity, we calculate the total energy of the universe in Bianchi-type I cosmological models, including both the matter and gravitational fields. The total energy is found to be zero, in agreement with previous works of Banerjee and Sen, who investigated this problem using the general relativity version of the Einstein energy-momentum complex, and of Xulu, who investigated the same problem using the general relativity versions of the Landau-Lifshitz, Papapetrou and Weinberg energy-momentum complexes. The result that the total energy of Bianchi-type I universes is zero supports the viewpoint of Tryon.

  3. Clumpy Langmuir waves in type III radio sources - Comparison of stochastic-growth theory with observations

    Robinson, P. A.; Cairns, I. H.; Gurnett, D. A.

    1993-01-01

    Detailed comparisons are made between the Langmuir-wave properties predicted by the recently developed stochastic-growth theory of type III sources and those observed by the plasma wave experiment on ISEE 3, after correcting for the main instrumental and selection effects. Analysis of the observed field-strength distribution confirms the theoretically predicted form and implies that wave growth fluctuates both spatially and temporally in sign and magnitude, leading to an extremely clumpy distribution of fields. A cutoff in the field-strength distribution is seen at a few mV/m, corresponding to saturation via nonlinear effects. Analysis of the size distribution of Langmuir clumps yields results in accord with those obtained in earlier work and with the size distribution of ambient density fluctuations in the solar wind. This confirms that the inhomogeneities in the Langmuir growth rate are determined by the density fluctuations and that these fluctuations persist during type III events.

  4. T-dualization of type IIB superstring theory in double space

    Nikolić, Bojan

    2015-01-01

    In this article we offer a new interpretation of the T-dualization procedure of type IIB superstring theory in the double space framework. We use the ghost-free action of the type IIB superstring in pure spinor formulation in the approximation of constant background fields up to quadratic terms. T-dualization along any subset of the initial coordinates, $x^a$, is equivalent to the permutation of this subset with the subset of the corresponding T-dual coordinates, $y_a$, in the double space coordinate $Z^M=(x^\\mu,y_\\mu)$. Demanding that the T-dual transformation law after the exchange $x^a\\leftrightarrow y_a$ has the same form as the initial one, we obtain the T-dual NS-NS and NS-R background fields. The T-dual R-R field strength is determined up to one arbitrary constant under some assumptions.

  5. The early life origin theory in the development of cardiovascular disease and type 2 diabetes.

    Lindblom, Runa; Ververis, Katherine; Tortorella, Stephanie M; Karagiannis, Tom C

    2015-04-01

    Life expectancy has been examined from a variety of perspectives in recent history. Epidemiology is one perspective which examines causes of morbidity and mortality at the population level. Over the past few hundred years there have been dramatic shifts in the major causes of death and expected life length. These shifts have been inconsistent across time and space, with vast inequalities observed between population groups. Currently in focus is the challenge of rising non-communicable diseases (NCD), such as cardiovascular disease and type 2 diabetes mellitus. In the search for methods to combat the rising incidence of these diseases, a number of new theories on the development of morbidity have arisen. A pertinent example is the hypothesis published by David Barker in 1995, which postulates the prenatal and early developmental origin of adult-onset disease and highlights the importance of the maternal environment. This theory has been subject to criticism; however, it has gradually gained acceptance. In addition, the relatively new field of epigenetics is contributing evidence in support of the theory. This review aims to explore the implications and limitations of the developmental origin hypothesis, via an historical perspective, in order to enhance understanding of the increasing incidence of NCDs and facilitate an improvement in planning public health policy.

  6. On the effective theory of type II string compactifications on nilmanifolds and coset spaces

    Caviezel, Claudio

    2009-07-30

    In this thesis we analyzed a large number of type IIA strict SU(3)-structure compactifications with fluxes and O6/D6-sources, as well as type IIB static SU(2)-structure compactifications with fluxes and O5/O7-sources. Restricting to structures and fluxes that are constant in the basis of left-invariant one-forms, these models are tractable enough to allow for an explicit derivation of the four-dimensional low-energy effective theory. The six-dimensional compact manifolds we studied in this thesis are nilmanifolds based on nilpotent Lie-algebras, and, on the other hand, coset spaces based on semisimple and U(1)-groups, which admit a left-invariant strict SU(3)- or static SU(2)-structure. In particular, from the set of 34 distinct nilmanifolds we identified two nilmanifolds, the torus and the Iwasawa manifold, that allow for an AdS_4, N = 1 type IIA strict SU(3)-structure solution and one nilmanifold allowing for an AdS_4, N = 1 type IIB static SU(2)-structure solution. From the set of all the possible six-dimensional coset spaces, we identified seven coset spaces suitable for strict SU(3)-structure compactifications, four of which also allow for a static SU(2)-structure compactification. For all these models, we calculated the four-dimensional low-energy effective theory using N = 1 supergravity techniques. In order to write down the most general four-dimensional effective action, we also studied how to classify the different disconnected "bubbles" in moduli space. (orig.)

  7. Scale relativity theory and integrative systems biology: 2. Macroscopic quantum-type mechanics.

    Nottale, Laurent; Auffray, Charles

    2008-05-01

    In these two companion papers, we provide an overview and a brief history of the multiple roots, current developments and recent advances of integrative systems biology and identify multiscale integration as its grand challenge. Then we introduce the fundamental principles and the successive steps that have been followed in the construction of the scale relativity theory, which aims at describing the effects of a non-differentiable and fractal (i.e., explicitly scale dependent) geometry of space-time. The first paper of this series was devoted, in this new framework, to the construction from first principles of scale laws of increasing complexity, and to the discussion of some tentative applications of these laws to biological systems. In this second review and perspective paper, we describe the effects induced by the internal fractal structures of trajectories on motion in standard space. Their main consequence is the transformation of classical dynamics into a generalized, quantum-like self-organized dynamics. A Schrödinger-type equation is derived as an integral of the geodesic equation in a fractal space. We then indicate how gauge fields can be constructed from a geometric re-interpretation of gauge transformations as scale transformations in fractal space-time. Finally, we introduce a new tentative development of the theory, in which quantum laws would hold also in scale space, introducing complexergy as a measure of organizational complexity. Initial possible applications of this extended framework to the processes of morphogenesis and the emergence of prokaryotic and eukaryotic cellular structures are discussed. Having founded elements of the evolutionary, developmental, biochemical and cellular theories on the first principles of scale relativity theory, we introduce proposals for the construction of an integrative theory of life and for the design and implementation of novel macroscopic quantum-type experiments and devices, and discuss their potential

  8. Type IIB String Backgrounds on Parallelizable PP-Waves and Conformal Liouville Theory

    Hssaini, M

    2003-01-01

    The scope of this work concerns the adaptation of the parallelizability pp-wave (Ppp-wave) process to D=10 type IIB string backgrounds in the presence of the non-trivial anti-self dual R-R 5-form $\\mathcal{F}$. This is important in the sense that it gives rise to some unsuspected properties. In fact, exact solutions of type IIB string backgrounds on Ppp-waves are discussed. For the $u$-dependence of the dilaton field $\\Phi $, we establish explicitly a correspondence between type IIB supergravity equations of motion and 2d-conformal Liouville field theory. We show also that the corresponding conserved conformal current $T(\\Phi)$ coincides exactly with the trace of the symmetric matrix $\\mu_{ij}$ appearing in the quadratic front factor $F=\\mu _{ij}x^{i}x^{j}$ of the Ppp-wave. Furthermore, we consider the transverse space dependence of the dilaton $\\Phi $ and show that the supergravity equations are easily solved for the linear realization of the dilaton field. Other remarkable properties related to this case a...

  9. Statistical analysis of 4 types of neck whiplash injuries based on classical meridian theory.

    Chen, Yemeng; Zhao, Yan; Xue, Xiaolin; Li, Hui; Wu, Xiuyan; Zhang, Qunce; Zheng, Xin; Wang, Tianfang

    2015-01-01

    As one component of the Chinese medicine meridian system, the meridian sinew (Jingjin, tendino-musculo) is specially described as being for acupuncture treatment of the musculoskeletal system because of its dynamic attributes and tender point correlations. In recent decades, the therapeutic importance of the sinew meridian has been revalued in clinical application. Based on this theory, the authors have established therapeutic strategies of acupuncture treatment in Whiplash-Associated Disorders (WAD) by categorizing four types of neck symptom presentations. The advantage of this new system is to make it much easier for the clinician to find effective acupuncture points. This study attempts to prove the significance of the proposed therapeutic strategies by analyzing data collected from a clinical survey of various WAD using non-supervised statistical methods, such as correlation analysis, factor analysis, and cluster analysis. The clinical survey data have successfully verified discrete characteristics of the four neck syndromes, based upon the range of motion (ROM) and tender point location findings. A summary of the relationships among the symptoms of the four neck syndromes has shown statistically significant correlation coefficients; syndrome factors are more related to the Liver, as originally described in classical theory. The hypothesis of meridian sinew syndromes in WAD is clearly supported by the statistical analysis of the clinical trials. This new discovery should be beneficial in improving therapeutic outcomes.

  10. Bianchi type I anisotropic universe and stability interacting ghost dark energy in Brans-Dicke theories

    Hossienkhani, Hossien

    2016-01-01

    A spatially homogeneous and anisotropic Bianchi type I universe has been studied with the ghost dark energy (GDE) in the framework of Brans-Dicke theory. For this purpose, we use the squared sound speed $v_s^2$, whose sign determines the stability of the model. At first, we obtain the equation of state parameter, $\\omega_\\Lambda$, the deceleration parameter $q$, and the evolution equation of the ghost dark energy. Then, we extend our study to the case of ghost dark energy in a non-isotropic Brans-Dicke framework and find that the transition of $\\omega_\\Lambda$ to the phantom regime can be more easily accounted for than when it is restored into the Einstein field equations. Our numerical results show the effects of the interaction and the anisotropy on the evolutionary behaviour of the ghost dark energy models. In conclusion, we find evidence that the ghost dark energy in BD theory can lead to a stable universe favored by observations at the present time.

  11. From Type II string theory towards BSM/dark sector physics

    Honecker, Gabriele

    2016-01-01

    Four-dimensional compactifications of string theory provide a controlled set of possible gauge representations accounting for BSM particles and dark sector components. In this review, constraints from perturbative Type II string compactifications in the geometric regime are discussed in detail and then compared to results from heterotic string compactifications and non-perturbative/non-geometric corners. As a prominent example, an open string realization of the QCD axion is presented. The status of deriving the associated low-energy effective action in four dimensions is discussed and open avenues of major phenomenological importance are highlighted. As examples, a mechanism of closed string moduli stabilization by D-brane backreaction, as well as one-loop threshold corrections to the gauge couplings and balancing a low string scale $M_{\\text{string}}$ with anisotropic compact dimensions, are discussed together with implications on potential future new physics observations. For illustrative purposes, an explici...

  12. Type I and $new$ seesaw in left-right symmetric theories

    Chakrabortty, Joydeep

    2010-01-01

    We extend the Type I seesaw and suggest a $new$ seesaw mechanism to generate neutrino masses within the left-right symmetric theories where parity is spontaneously broken. We construct a next to minimal left-right symmetric model where neutrino masses are determined irrespective of the B-L breaking scale and call it the $new$ seesaw mechanism. In this scenario B-L scale can be very low. This makes B-L gauge boson and the quasi-Dirac $heavy$ leptons very light. These TeV scale particles could have large impact on lepton flavor and CP violating processes. We also shed light on the phenomenological aspects of the model within the reach of the LHC.

  13. Observations and theory of mass loss in late-type stars

    Hartmann, L.

    1981-01-01

    The presented review is mainly concerned with the ubiquitous mass loss which occurs during most of a star's existence as a cool giant or supergiant. Observations of mass loss are considered, taking into account wind components and kinematics, and the temperature structure of cool winds. Theories of mass loss are examined, giving attention to radiation pressure on dust, radiation pressure in Lyman alpha, and magnetic wave-driven winds. It is pointed out that the study of mass loss from late-type stars appears to be entering a promising new phase. In this phase, the behavior of cool giants and supergiants is considered from a solar perspective, a perspective which contains important implications concerning the nature of solar activity.

  14. Adaptation of learning resources based on the MBTI theory of psychological types

    Amel Behaz

    2012-01-01

    Today, the resources available on the web are increasing significantly. The dissemination of knowledge and its acquisition by learners is central to learning. However, learners show differences between the ways of learning that suit them best. The objective of the work presented in this paper is to study how models from cognitive theories can be integrated with ontologies for the adaptation of educational resources. The goal is to give the system the capability to reason on the descriptions obtained in order to automatically adapt the resources to a learner according to his preferences. We rely on the MBTI (Myers-Briggs Type Indicator) model for the consideration of learning styles of learners as a criterion for adaptation.

  15. Natural inflation with and without modulations in type IIB string theory

    Abe, Hiroyuki; Otsuka, Hajime

    2014-01-01

    We propose a mechanism for natural inflation, with and without modulations, in the framework of type IIB string theory on a toroidal orientifold or orbifold. We explicitly construct the stabilization potential of the complex structure, dilaton and Kähler moduli, where one of the imaginary components of the complex structure moduli becomes light and is identified as the inflaton. The inflaton potential is generated by the gaugino-condensation term, which receives one-loop threshold corrections determined by the field values of the complex structure moduli, and the axion decay constant of the inflaton is enhanced by the inverse of the one-loop factor. We also find that the threshold corrections can induce modulations of the original scalar potential for natural inflation. Depending on these modulations, we can predict several sizes of the tensor-to-scalar ratio, as well as the other cosmological observables reported by the WMAP, Planck and/or BICEP2 collaborations.

  16. Cultivating New-type Farmers Based on the Theory of Human Resources Development

    2010-01-01

    Under the direction of the theory of human resources development, this thesis analyzes the impact of rural human resources development on cultivating new-type farmers. Firstly, it increases the input into rural basic education; secondly, it reinforces vocational education and technology training; thirdly, it promotes rural medical and public health services; fourthly, it quickens rural labor transfer. The status quo of China's rural human resources is analyzed as follows: in terms of quantity, a large and quickly developing base of rural human resources, a high labor participation rate, and a young age composition; in terms of quality, the ubiquitously low quality of rural human resources, the low technological level of rural human resources, and the overall low physical quality of farmers; in terms of structure, an irrational industrial structure distribution and an imbalanced regional structure distribution. The thesis also discusses the implications of the theory of human resources development for cultivating new-type farmers. First, in terms of controlling the quantity of rural human resources, it is to keep the family planning policy stable and expedite the transfer of surplus rural labor; second, in terms of promoting the quality of rural human resources, it is to bolster the development of reserve rural labor force resources, to construct an adult educational training system with Chinese characteristics, and to build a rural primary health care system; third, in terms of adjusting the structure of rural human resources, it is to perfect the rural human resources market and adjust the rural economic structure and talent structure.

  17. Topological and geometrical quantum computation in cohesive Khovanov homotopy type theory

    Ospina, Juan

    2015-05-01

    The recently proposed Cohesive Homotopy Type Theory is exploited as a formal foundation for central concepts in Topological and Geometrical Quantum Computation. Specifically, Cohesive Homotopy Type Theory provides a formal, logical approach to concepts like smoothness, cohomology and Khovanov homology, and such an approach permits clarification of the quantum algorithms in the context of Topological and Geometrical Quantum Computation. In particular we consider the so-called "open-closed stringy topological quantum computer", which is a theoretical topological quantum computer that employs a system of open-closed strings whose worldsheets are open-closed cobordisms. The open-closed stringy topological computer is able to compute the Khovanov homology for tangles, and hence it is a universal quantum computer, given that any quantum computation can be reduced to an instance of computation of the Khovanov homology for tangles. The universal algebra in this case is the Frobenius algebra, and the possible open-closed stringy topological quantum computers form a symmetric monoidal category which is equivalent to the category of knowledgeable Frobenius algebras. The mathematical design of an open-closed stringy topological quantum computer thus involves computations and theorem proving for generalized Frobenius algebras. Such computations and theorem proving can be performed automatically using Automated Theorem Provers with the TPTP language and the SMT-solver Z3 with the SMT-LIB language. Some examples of the application of ATPs and SMT-solvers in the mathematical setup of an open-closed stringy topological quantum computer are provided.

  18. Accurate segmentation of leukocyte in blood cell images using Atanassov's intuitionistic fuzzy and interval Type II fuzzy set theory.

    Chaira, Tamalika

    2014-06-01

    In this paper automatic leukocyte segmentation in pathological blood cell images is proposed using intuitionistic fuzzy and interval Type II fuzzy set theory. This is done to count different types of leukocytes for disease detection, and the segmentation should be accurate so that the shape of the leukocytes is preserved. Intuitionistic fuzzy sets and interval Type II fuzzy sets, which consider either more uncertainties or a different type of uncertainty compared to ordinary fuzzy set theory, are therefore used in this work. As the images are considered fuzzy due to imprecise gray levels, advanced fuzzy set theories may be expected to give better results. A modified Cauchy distribution is used to find the membership function. In the intuitionistic fuzzy method, non-membership values are obtained using Yager's intuitionistic fuzzy generator, and the optimal threshold is obtained by minimizing intuitionistic fuzzy divergence. In the interval Type II fuzzy set, a new membership function is generated that takes into account the two levels of a Type II fuzzy set using the probabilistic T co-norm, and the optimal threshold is selected by minimizing a proposed Type II fuzzy divergence. Though fuzzy techniques have been applied earlier, those methods failed to threshold multiple leukocytes in images. Experimental results show that both the interval Type II fuzzy and intuitionistic fuzzy methods perform better than the existing non-fuzzy/fuzzy methods, with the interval Type II fuzzy thresholding method performing slightly better than the intuitionistic fuzzy method. Segmented leukocytes in the proposed interval Type II fuzzy method are observed to be distinct and clear.
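
    As a rough illustration of fuzzy threshold selection, the sketch below picks the histogram threshold that minimizes a fuzzy entropy with a Cauchy-like membership function. This is a deliberately simplified stand-in for the paper's intuitionistic and interval Type II divergence constructions, which are more involved; the function, spread constant, and toy histogram are all invented for illustration.

```python
import math

def fuzzy_threshold(hist, c=None):
    """Pick the gray-level threshold minimizing fuzzy entropy.

    hist: list where hist[g] is the count of pixels at gray level g.
    Each level's membership in its class is a Cauchy-like function of its
    distance to the class mean; entropy is lowest when memberships are crisp,
    i.e., when both classes are tight around their means.
    """
    L = len(hist)
    total = sum(hist)
    c = c if c is not None else L - 1          # spread constant (assumed)
    best_t, best_h = None, float("inf")
    for t in range(1, L - 1):
        n0 = sum(hist[:t]) or 1
        n1 = sum(hist[t:]) or 1
        m0 = sum(g * hist[g] for g in range(t)) / n0
        m1 = sum(g * hist[g] for g in range(t, L)) / n1
        h = 0.0
        for g in range(L):
            if hist[g] == 0:
                continue
            m = m0 if g < t else m1
            mu = 1.0 / (1.0 + abs(g - m) / c)  # Cauchy-like membership
            mu = min(max(mu, 1e-9), 1 - 1e-9)
            h -= hist[g] * (mu * math.log(mu) + (1 - mu) * math.log(1 - mu))
        if h / total < best_h:
            best_h = h / total
            best_t = t
    return best_t

# Bimodal toy histogram: dark nuclei around level 2, background near level 12
hist = [0, 4, 9, 4, 0, 0, 0, 0, 0, 0, 3, 8, 12, 6, 2, 0]
t = fuzzy_threshold(hist)
```

    The chosen threshold falls in the valley between the two modes; the intuitionistic and interval Type II variants replace the entropy criterion with the corresponding fuzzy divergences.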

  19. A Review of Different Types of Subsidies and How They Work in Theory

    Kampungu K. Gerson; Han Feng

    2013-01-01

    This paper gives a brief review of types of subsidies and how they work in theory. The paper identifies three types of subsidies: subsidies that increase revenue, subsidies that lower the cost of production, and subsidies that are not linked to production or input. Using graphic examples to describe the partial effects of subsidies on supply and demand, the following findings were obtained: one, for producers to sell more, they will need to produce more, and in order to produce more, a higher input level is required, which depends on the marginal productivity of the inputs; two, the larger the elasticity of supply and demand for the input (the more responsive supply and demand are to changes in the price of the input), the larger the quantity of input used for a given level of support, thereby increasing the associated environmental damage from the use of that particular input; three, for a given demand curve, a shallow supply curve (reflecting a large price elasticity of supply) will yield larger volume effects in response to a certain change in price compared to a steep supply curve, and vice versa. Finally, the study gives input subsidies as an example of subsidies that lower the cost of production, and direct income support or unconditional lump-sum support to an industry as an example of subsidies that are not linked to production or input.
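
    The partial-equilibrium effect described in the third finding can be made concrete with a linear supply and demand sketch: the same per-unit subsidy produces a larger quantity response when supply is more price-elastic. The functional forms and numbers below are illustrative only, not taken from the paper.

```python
def equilibrium(a, b, c, d, subsidy=0.0):
    """Linear market: demand Qd = a - b*p, supply Qs = c + d*(p + s).

    A per-unit production subsidy s raises the price producers effectively
    receive to p + s, shifting supply outward. Solving a - b*p = c + d*(p + s)
    gives the consumer price p and the traded quantity q.
    """
    p = (a - c - d * subsidy) / (b + d)
    return p, a - b * p

a, b, c, s = 120.0, 2.0, 0.0, 6.0

# Inelastic supply (small d) vs. elastic supply (large d), same subsidy
_, q0_inel = equilibrium(a, b, c, d=1.0)
_, q1_inel = equilibrium(a, b, c, d=1.0, subsidy=s)
_, q0_elas = equilibrium(a, b, c, d=4.0)
_, q1_elas = equilibrium(a, b, c, d=4.0, subsidy=s)

dq_inelastic = q1_inel - q0_inel   # quantity gain = b*d*s/(b+d)
dq_elastic = q1_elas - q0_elas     # larger d -> larger quantity gain
```

    Here the quantity gain is b·d·s/(b+d), which grows with the supply slope d, matching the claim that a shallower (more elastic) supply curve yields larger volume effects for the same support.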

  20. Brane/antibrane Configurations in Type IIA and M-Theory

    Marsano, Joseph

    We investigate the relation between large N duality applied to systems of D5- and $\\overline{\\rm D5}$-branes wrapping vanishing cycles of local CY in type IIB and M-theory lifts of the NS5/D4/$\\overline{\\rm D4}$ systems in type IIA to which they are related by T-duality. Through a simple example based on a local CY constructed using an A2 singularity, we review this well-known correspondence in the supersymmetric setting and describe the manner in which it generalizes when antibranes are added. Agreement between the IIB and IIA pictures, which supports the assertion that $\\mathcal{N}=2$ supersymmetry is spontaneously broken in these systems at string tree level, is demonstrated when $g_s \\ll 1$. Novel nonholomorphic features can arise away from this regime and their physical origin is discussed. This note is based on talks given at KITP, Harvard University, TIFR, the University of Tokyo at Hongo, the 2007 Les Houches Summer School, and the 2007 Simons Workshop, is based on work done in collaboration with K. Papadodimas and M. Shigemori, and contains some previously unpublished results.

  1. LRS Bianchi type -V cosmology with heat flow in scalar-tensor theory

    Singh, C.P. [Delhi College of Engineering, Delhi (India). Dept. of Applied Mathematics], e-mail: cpsphd@rediffmail.com

    2009-12-15

    In this paper we present a spatially homogeneous, locally rotationally symmetric (LRS) Bianchi type-V perfect fluid model with heat conduction in the scalar-tensor theory proposed by Saez and Ballester. The field equations are solved with and without heat conduction by using a law of variation for the mean Hubble parameter, which is related to the average scale factor of the metric and yields a constant value for the deceleration parameter. The law of variation for the mean Hubble parameter generates two types of cosmologies, one of power-law form and the other of exponential form. Using these two forms, singular and non-singular solutions are obtained with and without heat conduction. We observe that a constant value of the deceleration parameter gives a reasonable description of the different phases of the universe: the universe decelerates for positive values of the deceleration parameter, whereas it accelerates for negative ones. The physical constraints on the solutions of the field equations, and in particular the thermodynamical laws and energy conditions that govern such solutions, are discussed in some detail. The behavior of observationally important parameters like the expansion scalar, anisotropy parameter and shear scalar is considered in detail. (author)
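
    The law of variation that yields a constant deceleration parameter is conventionally (following Berman) taken as $H \propto a^{-n}$ with constant $n$; under that assumption, the two solution branches mentioned above follow directly:

```latex
H = D\,a^{-n} \;\Rightarrow\; \dot a = D\,a^{1-n} \;\Rightarrow\;
\begin{cases}
a(t) = \bigl(nDt + c_1\bigr)^{1/n}, & n \neq 0 \quad \text{(power-law form)},\\[4pt]
a(t) = a_0\, e^{Dt}, & n = 0 \quad \text{(exponential form)},
\end{cases}
```

    with constant deceleration parameter $q = -a\ddot a/\dot a^2 = n - 1$: the model decelerates for $n > 1$ ($q > 0$) and accelerates for $n < 1$ ($q < 0$), consistent with the discussion in the abstract. Here $D$, $c_1$ and $a_0$ are integration constants introduced for this sketch.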

  2. Non-perturbative black holes in Type-IIA String Theory vs. the No-Hair conjecture

    Bueno, Pablo

    2013-01-01

    We obtain the first black hole solution of Type IIA String Theory compactified on an arbitrary self-mirror Calabi-Yau manifold in the presence of non-perturbative quantum corrections. Remarkably enough, the solution involves multivalued functions, which could lead to a violation of the No-Hair conjecture. We discuss how String Theory forbids such a scenario. However, the possibility still remains open in the context of four-dimensional ungauged Supergravity.

  3. Screening for and validated quantification of phenethylamine-type designer drugs and mescaline in human blood plasma by gas chromatography/mass spectrometry.

    Habrdova, Vilma; Peters, Frank T; Theobald, Denis S; Maurer, Hans H

    2005-06-01

    In recent years, several newer designer drugs of the so-called 2C series such as 2C-D, 2C-E, 2C-P, 2C-B, 2C-I, 2C-T-2, and 2C-T-7 have entered the illicit drug market as recreational drugs. Some fatal intoxications involving 2C-T-7 have been reported. Only scarce data have been published about analyses of these substances in human blood and/or plasma. This paper describes a method for screening and simultaneous quantification of the above-mentioned compounds and their analog mescaline in human blood plasma. The analytes were analyzed by gas chromatography/mass spectrometry in the selected-ion monitoring mode, after mixed-mode solid-phase extraction (HCX) and derivatization with heptafluorobutyric anhydride. The method was fully validated according to international guidelines. Validation data for 2C-T-2 and 2C-T-7 were unacceptable. For all other analytes, the method was linear from 5 to 500 microg/L and the data for accuracy (bias) and precision (coefficient of variation) were within the acceptance limits of +/-15% and <15%, respectively (within +/-20% and <20% near the limit of quantification of 5 microg/L).
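    The acceptance criteria quoted above translate directly into a simple check. The sketch below applies the ±15% bias / <15% CV limits, relaxed to 20% near the LOQ of 5 µg/L; the replicate values are illustrative, not data from the study.

```python
# Sketch of the validation acceptance check: bias (accuracy) within +/-15%
# and coefficient of variation (precision) below 15%, relaxed to 20% at
# concentrations near the limit of quantification (LOQ, 5 microg/L).
import statistics

def passes_validation(measured, nominal, loq=5.0):
    """True if replicate measurements meet the bias/CV acceptance criteria."""
    limit = 20.0 if nominal <= loq else 15.0
    mean = statistics.mean(measured)
    bias_pct = (mean - nominal) / nominal * 100.0
    cv_pct = statistics.stdev(measured) / mean * 100.0
    return abs(bias_pct) <= limit and cv_pct < limit

# Illustrative replicates at a mid-range concentration of 100 microg/L:
print(passes_validation([95.0, 102.0, 98.0, 104.0], 100.0))  # True
# A grossly biased series fails:
print(passes_validation([60.0, 62.0, 61.0, 63.0], 100.0))    # False
```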

  4. Campbelling-type theory of fission chamber signals generated by neutron chains in a multiplying medium

    Pál, L. [Centre for Energy Research, Hungarian Academy of Sciences, H-1525 Budapest 114, POB 49 (Hungary); Pázsit, I., E-mail: imre@chalmers.se [Chalmers University of Technology, Department of Applied Physics, Division of Nuclear Engineering, SE-412 96 Göteborg (Sweden)

    2015-09-11

    The signals of fission chambers are usually evaluated with the help of so-called Campbelling techniques. These are based on the Campbell theorem, which states that if the primary incoming events generating the detector pulses are independent, then relationships exist between the moments of various orders of the signal in current mode. This makes it possible to determine the mean intensity of the detection events, which is proportional to the static flux, from the higher moments of the detector current, which has certain advantages. However, the main application area of fission chambers is measurements in power reactors where, as is well known, the individual detection events are not independent, due to the branching character of the neutron chains (neutron multiplication). It is therefore of interest to extend the Campbelling-type theory to the case of correlated neutron events. Such a theory could address two questions: first, the bias incurred when the traditional Campbell techniques are applied to correlated incoming events; and second, whether the correlation properties of the detection events, which carry information on the multiplying medium, can be extracted from the measurements. This paper is devoted to the investigation of these questions. The results show that there is a potential possibility to extract the same information from fission chamber signals in current mode as with the Rossi- or Feynman-alpha methods, or from coincidence and multiplicity measurements, which so far have required detectors working in pulse mode. It is also shown that applying the standard Campbelling techniques to neutron detection in multiplying systems does not lead to an error in estimating the stationary flux, as long as the detector is calibrated in in situ measurements.
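    For the uncorrelated case that the traditional technique assumes, the Campbell theorem is easy to illustrate numerically: for a Poisson pulse train of intensity s with pulse shape f(t), the current satisfies mean(I) = s·∫f dt and var(I) = s·∫f² dt, so s can be recovered from either moment. A minimal simulation (all parameters illustrative):

```python
# Illustration of the Campbell theorem for independent detection events:
# superpose exponentially shaped pulses at Poisson arrival times and
# recover the event intensity s from the mean and from the variance.
import numpy as np

rng = np.random.default_rng(0)
s = 2000.0                       # detection events per second
T, dt = 200.0, 1e-3              # observation window and time step
t = np.arange(0.0, T, dt)

tau = 5e-3                       # pulse decay time
pulse_len = int(10 * tau / dt)
f = np.exp(-np.arange(pulse_len) * dt / tau)   # pulse shape f(t)

n_events = rng.poisson(s * T)
signal = np.zeros(t.size + pulse_len)          # buffer absorbs pulse tails
for start in rng.integers(0, t.size, n_events):
    signal[start:start + pulse_len] += f
signal = signal[:t.size]

int_f = f.sum() * dt             # discrete ∫ f dt
int_f2 = (f ** 2).sum() * dt     # discrete ∫ f^2 dt
s_from_mean = signal.mean() / int_f
s_from_var = signal.var() / int_f2
print(s_from_mean, s_from_var)   # both close to s = 2000
```

For correlated events (neutron multiplication), exactly this agreement between the two estimates breaks down, which is the bias the paper quantifies.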

  5. Intellectual style theories: different types of categorizations and their relevance for practitioners.

    Nielsen, Tine

    2014-01-01

    In the 20th century, a large number of psychological theories of intellectual styles were developed; different reviews mention up to 71 theories of style. In the last 25 years, several suggestions have been offered as to how theories of styles may be divided into categories and fields of focus. Theorists and researchers disagree about the criteria on which categorizations should be based, and about which theories fulfill these criteria. Such disagreements are fruitful at a theoretical level, but they also have negative consequences for the intended fields of application of the style theories and the associated instruments for measuring styles, because practitioners seeking the theory and instrument best suited to their intended application simply cannot find their way through the jungle of disagreements. The present study seeks to reduce this confusion for practitioners by developing a taxonomy of categorizations of style theories in which all style theories can be placed.

  6. (2,2) and (0,4) Supersymmetric Boundary Conditions in 3d N = 4 Theories and Type IIB Branes

    Chung, Hee-Joong

    2016-01-01

    The half-BPS boundary conditions preserving N = (2,2) and N = (0,4) supersymmetry in 3d N = 4 supersymmetric gauge theories are examined. The BPS equations admit decomposition of the bulk supermultiplets into specific boundary supermultiplets of the preserved supersymmetry. Bogomolny-like equations and Nahm-like equations arise in the vector multiplet BPS boundary conditions, and Robin-type boundary conditions appear for the hypermultiplet coupled to the vector multiplet. The half-BPS boundary conditions are realized in brane configurations of Type IIB string theory.

  7. MAMA Software Features: Visual Examples of Quantification

    Ruggiero, Christy E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Porter, Reid B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-05-20

    This document shows examples of the results from quantifying objects of certain sizes and types in the software. It is intended to give users a better feel for some of the quantification calculations, and, more importantly, to help users understand the challenges with using a small set of ‘shape’ quantification calculations for objects that can vary widely in shapes and features. We will add more examples to this in the coming year.

  8. Itinerant type many-body theories for photo-induced structural phase transitions

    Nasu, Keiichiro [Solid State Theory Division, Institute of Materials Structure Science, KEK, Graduate University for Advanced Study, 1-1, Oho, Tsukuba, Ibaraki, 305-0801 (Japan)]

    2004-09-01

    Itinerant type quantum many-body theories for photo-induced structural phase transitions (PSPTs) are reviewed in close connection with various recent experimental results related to this new optical phenomenon. There are two key concepts: the hidden multi-stability of the ground state, and the proliferation of optically excited states. Taking the ionic (I) → neutral (N) phase transition in the organic charge transfer (CT) crystal TTF-CA as a typical example of this type of transition, we first theoretically show an adiabatic path which starts from CT excitons in the I-phase but finally reaches an N-domain of macroscopic size. In connection with this I-N transition, the concept of initial condition sensitivity is also developed so as to clarify the experimentally observed nonlinear characteristics of this material. Next, using a more simplified model for the many-exciton system, we theoretically study the early-time quantum dynamics of the exciton proliferation, which finally results in the formation of a domain with a large number of excitons. For this purpose, we derive a stepwise iterative equation to describe the exciton proliferation, and clarify the origin of the initial condition sensitivity. Possible differences between a photo-induced nonequilibrium phase and an equilibrium phase at high temperatures are also clarified from general and conceptual points of view, in connection with recent experiments on the photo-induced phase transition in an organo-metallic complex crystal. It will be shown that the photo-induced phase can make a new interaction appear as a broken symmetry only in this phase, even when this interaction is almost completely hidden in all the equilibrium phases, such as the ground state and other high-temperature phases. The relation between the photo-induced nonequilibrium phase and the hysteresis-induced nonequilibrium one is also qualitatively discussed. We will be concerned with a macroscopic parity violation

  10. From Type II string theory toward BSM/dark sector physics

    Honecker, Gabriele

    2016-11-01

    Four-dimensional compactifications of string theory provide a controlled set of possible gauge representations accounting for BSM particles and dark sector components. In this review, constraints from perturbative Type II string compactifications in the geometric regime are discussed in detail and then compared to results from heterotic string compactifications and nonperturbative/nongeometric corners. As a prominent example, an open string realization of the QCD axion is presented. The status of deriving the associated low-energy effective action in four dimensions is discussed, and open avenues of major phenomenological importance are highlighted. As examples, a mechanism of closed string moduli stabilization by D-brane backreaction as well as one-loop threshold corrections to the gauge couplings and balancing a low string scale M_string with anisotropic compact dimensions are discussed, together with implications for potential future new-physics observations. For illustrative purposes, an explicit example of a globally consistent D6-brane model with MSSM-like spectrum on T6/(ℤ2 × ℤ6 × Ωℛ) is presented.

  11. Algebraic Signal Processing Theory: Cooley-Tukey Type Algorithms for Polynomial Transforms Based on Induction

    Sandryhaila, Aliaksei; Pueschel, Markus

    2010-01-01

    A polynomial transform is the multiplication of an input vector $x \in \mathbb{C}^n$ by a matrix $\mathcal{P}_{b,\alpha} \in \mathbb{C}^{n\times n}$, whose $(k,\ell)$-th element is defined as $p_\ell(\alpha_k)$ for polynomials $p_\ell(x) \in \mathbb{C}[x]$ from a list $b = \{p_0(x),\dots,p_{n-1}(x)\}$ and sample points $\alpha_k \in \mathbb{C}$ from a list $\alpha = \{\alpha_0,\dots,\alpha_{n-1}\}$. Such transforms find applications in the areas of signal processing, data compression, and function interpolation. Important examples include the discrete Fourier and cosine transforms. In this paper we introduce a novel technique to derive fast algorithms for polynomial transforms. The technique uses the relationship between polynomial transforms and the representation theory of polynomial algebras. Specifically, we derive algorithms by decomposing the regular modules of these algebras as a stepwise induction. As an application, we derive novel $O(n\log n)$ general-radix algorithms for the discrete Fourier transform and the discrete cosine transform of type 4.
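    As a concrete instance of the definition: choosing p_ℓ(x) = x^ℓ and α_k = e^(−2πik/n) makes the matrix [p_ℓ(α_k)] exactly the DFT matrix. A quick numerical sketch (illustrating the definition only, not the paper's fast algorithm):

```python
# Build the polynomial transform matrix P[k, l] = p_l(alpha_k) and check
# that, for monomials evaluated at roots of unity, it reproduces the DFT.
import numpy as np

def polynomial_transform(polys, points):
    """Matrix whose (k, l) entry is p_l(alpha_k)."""
    return np.array([[p(a) for p in polys] for a in points])

n = 8
monomials = [np.polynomial.Polynomial([0] * l + [1]) for l in range(n)]  # x**l
alphas = np.exp(-2j * np.pi * np.arange(n) / n)       # sample points

P = polynomial_transform(monomials, alphas)
x = np.arange(n, dtype=float)
assert np.allclose(P @ x, np.fft.fft(x))              # P is the DFT matrix
```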

  12. Psychosocial correlates of dietary behaviour in type 2 diabetic women, using a behaviour change theory.

    Didarloo, A; Shojaeizadeh, D; Gharaaghaji Asl, R; Niknami, S; Khorami, A

    2014-06-01

    The study evaluated the efficacy of the Theory of Reasoned Action (TRA), extended with self-efficacy, to predict dietary behaviour in a group of Iranian women with type 2 diabetes. A sample of 352 diabetic women referred to Khoy Diabetes Clinic, Iran, were selected and given a self-administered survey to assess eating behaviour, using the extended TRA constructs. Bivariate correlations and Enter-method regression analyses of the extended TRA model were performed with SPSS software. Overall, the proposed model explained 31.6% of the variance in behavioural intention and 21.5% of the variance in dietary behaviour. Among the model constructs, self-efficacy was the strongest predictor of intentions and dietary practice. In addition to the model variables, among sociodemographic factors, patients' visit intervals and their source of information about diabetes were also associated with the dietary behaviours of the diabetics. This research highlights the relative importance of the extended TRA constructs for behavioural intention and subsequent behaviour. Use of the present research model in designing educational interventions to increase adherence to dietary behaviours among diabetic patients is therefore recommended.
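    The hierarchical ("Enter") regression logic reported above can be sketched in a few lines: fit intention on the base TRA predictors, then refit with self-efficacy added and read off the gain in explained variance. Variable names mirror the constructs, but the data are simulated, not the study's dataset.

```python
# Incremental explained variance (Delta R^2) when adding self-efficacy to
# base TRA predictors, on synthetic data with made-up effect sizes.
import numpy as np

rng = np.random.default_rng(1)
n = 352
attitude = rng.normal(size=n)
subjective_norm = rng.normal(size=n)
self_efficacy = rng.normal(size=n)
intention = (0.3 * attitude + 0.2 * subjective_norm
             + 0.5 * self_efficacy + rng.normal(scale=0.8, size=n))

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

r2_base = r_squared(np.column_stack([attitude, subjective_norm]), intention)
r2_full = r_squared(
    np.column_stack([attitude, subjective_norm, self_efficacy]), intention)
print(f"base R^2 = {r2_base:.3f}, with self-efficacy = {r2_full:.3f}")
```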

  13. An Sl(3,R) multiplet of 8-dimensional type II supergravity theories and the gauged supergravity inside

    Alonso-Alberca, N; Ortín, Tomas

    2001-01-01

    The so-called ``massive 11-dimensional supergravity'' theory gives, for one Killing vector, Romans' massive supergravity in 10 dimensions and, for two Killing vectors, an Sl(2,Z) multiplet of massive 9-dimensional supergravity theories that can be obtained by standard generalized dimensional reduction of type IIB supergravity and has been shown to contain a gauged supergravity. We consider a straightforward generalization of this theory to three Killing vectors and a 3×3 symmetric mass matrix, and show that it gives an Sl(3,Z) multiplet of 8-dimensional supergravity theories containing an SO(3) gauged supergravity which is, in some sense, dual to the one found by Salam and Sezgin by standard generalized dimensional reduction.

  14. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis version 6.0 theory manual

    Adams, Brian M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ebeida, Mohamed Salah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eldred, Michael S [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jakeman, John Davis [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stephens, John Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vigil, Dena M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wildey, Timothy Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bohnhoff, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hu, Kenneth T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dalbey, Keith R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bauman, Lara E [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hough, Patricia Diane [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-05-01

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.

  15. Advances in type-2 fuzzy sets and systems theory and applications

    Mendel, Jerry; Tahayori, Hooman

    2013-01-01

    This book explores recent developments in the theoretical foundations and novel applications of general and interval type-2 fuzzy sets and systems, including: algebraic properties of type-2 fuzzy sets, geometric-based definition of type-2 fuzzy set operators, generalizations of the continuous KM algorithm, adaptiveness and novelty of interval type-2 fuzzy logic controllers, relations between conceptual spaces and type-2 fuzzy sets, type-2 fuzzy logic systems versus perceptual computers; modeling human perception of real world concepts with type-2 fuzzy sets, different methods for generating membership functions of interval and general type-2 fuzzy sets, and applications of interval type-2 fuzzy sets to control, machine tooling, image processing and diet.  The applications demonstrate the appropriateness of using type-2 fuzzy sets and systems in real world problems that are characterized by different degrees of uncertainty.

  16. Construction of a Kaluza-Klein type Theory from One Dimension

    Jackson, David J

    2016-01-01

    We describe how a physical theory incorporating the properties of fields deriving from extra-dimensional structures over a four-dimensional spacetime manifold can in principle be obtained through the analysis of a simple initial structure consisting of the one dimension of time alone, as represented by the real line. The simplicity of this starting point leads to symmetries of multi-dimensional forms of time, from which a geometrical structure can be derived which is similar to the framework employed in non-Abelian Kaluza-Klein theories. This leads to a relationship between the external and internal curvature on the spacetime manifold unified through the underlying constraint of the one dimension of time for the theory presented here. We also describe how the symmetry breaking structure is compatible with the Coleman-Mandula theorem for the subsequent quantisation of the theory.

  17. Spectral analysis of polynomial potentials and its relation with ABJ/M-type theories

    Garcia del Moral, M.P., E-mail: garciamormaria@uniovi.e [Departamento de Fisica, Universidad de Oviedo, Calvo Sotelo 18, 33007 Oviedo (Spain); Martin, I., E-mail: isbeliam@usb.v [Departamento de Fisica, Universidad Simon Bolivar, Apartado 89000, Caracas 1080-A (Venezuela, Bolivarian Republic of); Navarro, L., E-mail: lnavarro@ma.usb.v [Departamento de Matematicas, Universidad Simon Bolivar, Apartado 89000, Caracas 1080-A (Venezuela, Bolivarian Republic of); Perez, A.J., E-mail: ajperez@ma.usb.v [Departamento de Matematicas, Universidad Simon Bolivar, Apartado 89000, Caracas 1080-A (Venezuela, Bolivarian Republic of); Restuccia, A., E-mail: arestu@usb.v [Departamento de Fisica, Universidad Simon Bolivar, Apartado 89000, Caracas 1080-A (Venezuela, Bolivarian Republic of)

    2010-11-01

    We obtain a general class of polynomial potentials for which the Schroedinger operator has a discrete spectrum. This class includes all the scalar potentials arising in membrane, 5-brane, p-brane, multiple M2-brane, BLG and ABJM theories. We provide a proof of the discreteness of the spectrum of the associated Schroedinger operators. This is the first step toward analyzing BLG and ABJM supersymmetric theories from a non-perturbative point of view.
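    The discreteness claim is easy to see numerically in the simplest one-dimensional case: a Schrödinger operator −d²/dx² + V(x) with a confining polynomial potential (here V = x⁴, a toy far simpler than the multi-variable potentials treated in the paper) has well-separated low-lying eigenvalues under finite-difference diagonalization.

```python
# Finite-difference spectrum of H = -d^2/dx^2 + x^4 with Dirichlet
# boundary conditions on [-L, L]. The low-lying eigenvalues are discrete
# and well separated (quartic-oscillator levels: E0 ~ 1.06, E1 ~ 3.80, ...).
import numpy as np

L, m = 8.0, 1000                        # box half-width and grid size
x = np.linspace(-L, L, m)
h = x[1] - x[0]

main = 2.0 / h**2 + x**4                # diagonal of the FD Hamiltonian
off = -1.0 / h**2 * np.ones(m - 1)      # off-diagonals of -d^2/dx^2
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

eigs = np.linalg.eigvalsh(H)[:4]
print(eigs)                             # discrete, separated levels
```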

  18. Understanding physical activity intentions among French Canadians with type 2 diabetes: an extension of Ajzen's theory of planned behaviour

    Godin Gaston

    2009-06-01

    Abstract. Background: Regular physical activity is considered a cornerstone for managing type 2 diabetes. However, in Canada, most individuals with type 2 diabetes do not meet national physical activity recommendations. When designing a theory-based intervention, one should first determine the key determinants of physical activity for this population. Unfortunately, there is a lack of information on this aspect among adults with type 2 diabetes. The purpose of this cross-sectional study is to fill this gap using an extended version of Ajzen's Theory of Planned Behavior (TPB) as reference. Methods: A total of 501 individuals with type 2 diabetes residing in the Province of Quebec (Canada) completed the study. Questionnaires were sent and returned by mail. Results: Multiple hierarchical regression analyses indicated that TPB variables explained 60% of the variance in intention; the addition of other psychosocial variables to the model added 7% of explained variance. The final model included perceived behavioral control (β = .38). Conclusion: The findings suggest that interventions aimed at individuals with type 2 diabetes should ensure that people have the necessary resources to overcome potential obstacles to behavioral performance. Interventions should also favor the development of feelings of personal responsibility to exercise and promote the advantages of exercising for individuals with type 2 diabetes.

  19. Algebraic Geometry Approach in Gravity Theory and New Relations between the Parameters in Type I Low-Energy String Theory Action in Theories with Extra Dimensions

    Dimitrov, Bogdan G

    2009-01-01

    On the basis of the distinction between covariant and contravariant metric tensor components, a new (multivariable) cubic algebraic equation for reparametrization invariance of the gravitational Lagrangian has been derived and parametrized with complicated non-elliptic functions, depending on the (elliptic) Weierstrass function and its derivative. This differs from standard algebraic geometry, where only two-dimensional cubic equations, and not multivariable ones, are parametrized with elliptic functions. Physical applications of the approach have been considered in reference to theories with extra dimensions. The so-called "length function" l(x) has been introduced and found as a solution of quasilinear partial differential equations for two different cases, "compactification + rescaling" and "rescaling + compactification". New physically important relations (inequalities) between the parameters in the action are established, which cannot be derived in the case l = 1 of the standard gravitati...

  20. Inflation and Singularity of a Bianchi Type-VII0 Universe with a Dirac Field in the Einstein-Cartan Theory

    HUANG Zeng-Guang; FANG Wei; LU Hui-Qing

    2011-01-01

    We discuss Bianchi type-VII0 cosmology with a Dirac field in the Einstein-Cartan (E-C) theory and obtain the equations of the Dirac and gravitational fields in the E-C theory. A Bianchi type-VII0 inflationary solution is found. When (3/16)S² − σ² > 0, the Universe may avoid the singularity.

  1. Extension Theory and Krein-type Resolvent Formulas for Nonsmooth Boundary Value Problems

    Abels, Helmut; Grubb, Gerd; Wood, Ian Geoffrey

    2014-01-01

    The theory of selfadjoint extensions of symmetric operators, and more generally the theory of extensions of dual pairs, was implemented some years ago for boundary value problems for elliptic operators on smooth bounded domains. Recently, these questions have been taken up again for nonsmooth domains. In the present work we show that pseudodifferential methods can be used to obtain a full characterization, including Kreĭn resolvent formulas, of the realizations of nonselfadjoint second-order operators on such domains.

  2. Communication: Cosolvency and cononsolvency explained in terms of a Flory-Huggins type theory.

    Dudowicz, Jacek, E-mail: dudowicz@jfi.uchicago.edu; Freed, Karl F. [The James Franck Institute and the Department of Chemistry, The University of Chicago, Chicago, Illinois 60637 (United States)]; Douglas, Jack F. [Materials Science and Engineering Division, National Institute of Standards and Technology, Gaithersburg, Maryland 20899 (United States)]

    2015-10-07

    Standard Flory-Huggins (FH) theory is utilized to describe the enigmatic cosolvency and cononsolvency phenomena for systems of polymers dissolved in mixed solvents. In particular, phase boundaries (specifically upper critical solution temperature spinodals) are calculated for solutions of homopolymers B in pure solvents and in binary mixtures of small molecule liquids A and C. The miscibility (or immiscibility) patterns for the ternary systems are classified in terms of the FH binary interaction parameters {χ_αβ} and the ratio r = φ_A/φ_C of the concentrations φ_A and φ_C of the two solvents. The trends in miscibility are compared to those observed for blends of random copolymers (A_x C_{1−x}) with homopolymers (B) and to those deduced for A/B/C solutions of polymers B in liquid mixtures of small molecules A and C that associate into polymeric clusters (A_p C_q)_i, (i = 1, 2, …, ∞). Although the classic FH theory is able to explain cosolvency and cononsolvency phenomena, the theory does not include a consideration of the mutual association of the solvent molecules and the competitive association between the solvent molecules and the polymer. These interactions can be incorporated in refinements of the FH theory, and the present paper provides a foundation for such extensions for modeling the rich thermodynamics of polymers in mixed solvents.
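    For orientation, the binary (single polymer/single solvent) reduction of the FH machinery used above has a closed-form spinodal: setting d²(ΔG_mix/RT)/dφ² = 0 gives χ_s(φ) = ½(1/(Nφ) + 1/(1−φ)), with critical point φ_c = 1/(1+√N) and χ_c = ½(1+1/√N)². The ternary A/B/C case in the paper generalizes this with the full {χ_αβ} matrix. A short numerical check of the binary formulas:

```python
# Binary Flory-Huggins spinodal for a polymer of length N in solvent:
# chi_s(phi) = (1/(N*phi) + 1/(1-phi)) / 2; its minimum over phi is the
# critical point (phi_c, chi_c). Illustrative reduction of the theory only.
import numpy as np

def chi_spinodal(phi, N):
    """Spinodal interaction parameter at polymer volume fraction phi."""
    return 0.5 * (1.0 / (N * phi) + 1.0 / (1.0 - phi))

N = 100
phi = np.linspace(0.01, 0.6, 500)
chi_s = chi_spinodal(phi, N)

phi_c = 1.0 / (1.0 + np.sqrt(N))                 # critical composition
chi_c = 0.5 * (1.0 + 1.0 / np.sqrt(N)) ** 2      # critical chi

# The minimum of the spinodal curve coincides with the critical point:
assert abs(chi_s.min() - chi_c) < 1e-3
print(phi_c, chi_c)
```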

  4. Quantification of beta-cell function during IVGTT in Type II and non-diabetic subjects: assessment of insulin secretion by mathematical methods

    Kjems, L L; Vølund, A; Madsbad, Sten

    2001-01-01

    AIMS/HYPOTHESIS: We compared four methods to assess their accuracy in measuring insulin secretion during an intravenous glucose tolerance test in patients with Type II (non-insulin-dependent) diabetes mellitus with varying beta-cell function and in matched control subjects. METHODS: Eight control subjects and eight Type II diabetic patients underwent an intravenous glucose tolerance test with tolbutamide and an intravenous bolus injection of C-peptide to assess C-peptide kinetics. Insulin secretion rates were determined by the Eaton deconvolution (reference method), the Insulin SECretion method...
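    The deconvolution idea behind the reference method can be sketched as follows: the measured C-peptide concentration is the convolution of the secretion rate with the C-peptide impulse response, so secretion is recovered by inverting that convolution, stabilized by a small ridge penalty. The two-exponential kinetics and all constants below are made up for the demo, not taken from the study.

```python
# Toy deconvolution of a secretion rate from a noiseless concentration
# trace, using a lower-triangular convolution matrix and ridge inversion.
import numpy as np

dt = 1.0                                   # minutes
t = np.arange(0.0, 120.0, dt)
impulse = 0.7 * np.exp(-t / 5.0) + 0.3 * np.exp(-t / 35.0)  # made-up kinetics

true_secretion = 1.0 + 4.0 * np.exp(-((t - 15.0) / 8.0) ** 2)  # basal + burst
concentration = np.convolve(true_secretion, impulse)[: t.size] * dt

# Convolution as a lower-triangular Toeplitz matrix C, then ridge solve.
C = np.array([[impulse[i - j] if i >= j else 0.0
               for j in range(t.size)] for i in range(t.size)]) * dt
lam = 1e-4
est = np.linalg.solve(C.T @ C + lam * np.eye(t.size), C.T @ concentration)
print(np.corrcoef(est, true_secretion)[0, 1])   # close to 1
```

With noisy data the ridge parameter (or an equivalent smoothness penalty) trades reconstruction fidelity against amplification of measurement noise, which is the practical difficulty the compared methods handle in different ways.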

  5. Statistical mechanics of cracks. Dualities in supersymmetric field theories. Orientifolds of Type IIB strings with discrete B flux

    Buchel, Alexander Sergeevich

    In the first part of this thesis we study a class of models for brittle fracture: elastic theory models which allow for cracks but not for plastic flow. We show that these models exhibit, at all finite temperatures, a transition to fracture under applied load, and we study this transition at low temperature for small tension. We discuss the appropriate thermodynamic limit of these theories: a large class of boundary conditions is identified for which the energy release for a crack becomes independent of the macroscopic shape of the material. We prove that the energy release in an isotropically stretched material due to the creation of an arbitrary curvy cut is the same, to cubic order, as the energy release for the straight cut with the same end points. We find the normal modes and the energy spectrum for crack shape fluctuations and for crack surface phonons under a uniform isotropic tension. For small uniform isotropic tension in two dimensions we calculate the essential singularity associated with fracturing the material in a saddle-point approximation including quadratic fluctuations. We calculate the asymptotic ratio of the high-order elastic coefficients of the inverse bulk modulus and argue that the result is unchanged by nonlinearities. In the second part of this thesis we study dualities in supersymmetric field theories. We derive S-dualities in scale-invariant N = 2 supersymmetric gauge theories by embedding those theories in asymptotically free theories with higher-rank gauge groups. We then proceed to study ``ultrastrong'' coupling points in scale-invariant N = 2 gauge theories; using low-energy field theory arguments, we relate these theories to other known N = 2 CFTs. Finally, we argue that the topology of the quantum coupling space and the low-energy effective action on the Coulomb branch of scale-invariant N = 2 SU(n) gauge theories pick out a preferred nonperturbative definition of the gauge coupling, up to non-singular holomorphic reparameterization.

  6. Demographic and Motivation Differences Among Online Sex Offenders by Type of Offense: An Exploration of Routine Activities Theories.

    Navarro, Jordana N; Jasinski, Jana L

    2015-01-01

    This article presents an analysis of the relationship between online sexual offenders' demographic background and characteristics indicative of motivation and offense type. Specifically, we investigate whether these characteristics can distinguish different online sexual offender groups from one another as well as inform routine activity theorists on what potentially motivates perpetrators. Using multinomial logistic regression, this study found that online sexual offenders' demographic backgrounds and characteristics indicative of motivation do vary by offense types. Two important implications of this study are that the term "online sexual offender" encompasses different types of offenders, including some who do not align with mainstream media's characterization of "predators," and that the potential offender within routine activity theory can be the focus of empirical investigation rather than taken as a given in research.

  7. The double Mellin-Barnes type integrals and their applications to convolution theory

    Hai, Nguyen Thanh

    1992-01-01

    This book presents new results in the theory of the double Mellin-Barnes integrals, popularly known as the general H-function of two variables. A general integral convolution is constructed by the authors; it contains the Laplace convolution as a particular case and possesses a factorization property for the one-dimensional H-transform. Many examples of convolutions for classical integral transforms are obtained, and they can be applied to the evaluation of series and integrals.

  8. Quantification of beta-cell function during IVGTT in Type II and non-diabetic subjects: assessment of insulin secretion by mathematical methods

    Kjems, L L; Vølund, A; Madsbad, Sten

    2001-01-01

    AIMS/HYPOTHESIS: We compared four methods to assess their accuracy in measuring insulin secretion during an intravenous glucose tolerance test in patients with Type II (non-insulin-dependent) diabetes mellitus and with varying beta-cell function and matched control subjects. METHODS: Eight control subjects and eight Type II diabetic patients underwent an intravenous glucose tolerance test with tolbutamide and an intravenous bolus injection of C-peptide to assess C-peptide kinetics. Insulin secretion rates were determined by the Eaton deconvolution (reference method), the Insulin SECretion method (ISEC) based on population kinetic parameters, as well as one-compartment and two-compartment versions of the combined model of insulin and C-peptide kinetics. To allow a comparison of the accuracy of the four methods, fasting rates and amounts of insulin secreted during the first phase (0-10 min...

  9. Development of solution phase hybridisation PCR-ELISA for the detection and quantification of Enterococcus faecalis and Pediococcus pentosaceus in Nurmi-type cultures

    Sinead M Waters; Doyle, Sean; Murphy, Richard A.; Power, Ronan F.G.

    2005-01-01

    Nurmi-type cultures (NTCs), derived from the fermentation of caecal contents of specifically pathogen-free (SPF) birds, have been used successfully to control salmonella colonisation in chicks. These cultures are undefined in nature and, consequently, it is difficult to obtain approval from regulatory agencies for their use as direct fed microbials (DFMs) for poultry. Progress towards the generation of effective defined probiotics requires further knowledge of the composition of thes...

  10. Quantification of interferon signaling in avian cells

    Kint, Joeri; Forlenza, Maria

    2015-01-01

    Activation of the type I interferon (IFN) response is an essential defense mechanism against invading pathogens such as viruses. This chapter describes two protocols to quantify activation of the chicken IFN response through analysis of gene expression by real-time quantitative PCR and by quantif

  11. How Many Types of Thermodynamical Equilibrium are There: Relation to Information Theory and Holism

    Koleva, M K

    2006-01-01

    A major revision of thermodynamics is made in order to provide a rigorous foundation for functional diversity of a holistic type. It turns out that the new approach ensures reproducibility of the information as well.

  12. RENEWAL OF BASIC LAWS AND PRINCIPLES FOR POLAR CONTINUUM THEORIES (Ⅶ): INCREMENTAL RATE TYPE

    戴安民

    2003-01-01

    The purpose is to establish rather complete equations of motion, boundary conditions and an equation of energy rate of incremental rate type for micropolar continua. To this end, rather complete definitions for the rates of the deformation gradient and its inverse are made. The new relations between various stress and couple stress rate tensors are derived. Finally, the coupled equations of motion, boundary conditions and equation of energy rate of incremental rate type for continuum mechanics are obtained as a special case.

  13. Cyclic uniaxial and biaxial hardening of type 304 stainless steel modeled by the viscoplasticity theory based on overstress

    Yao, David; Krempl, Erhard

    1988-01-01

    The isotropic theory of viscoplasticity based on overstress does not use a yield surface or a loading and unloading criterion. The inelastic strain rate depends on the overstress, the difference between the stress and the equilibrium stress, and is assumed to be rate dependent. Special attention is paid to the modeling of elastic regions. For the modeling of cyclic hardening, such as that observed in annealed Type 304 stainless steel, an additional growth law for a scalar quantity which represents the rate-independent asymptotic value of the equilibrium stress is added. It is made to increase with inelastic deformation using a new scalar measure which differentiates between nonproportional and proportional loading. The theory is applied to correlate uniaxial data under two-step amplitude loading, including the effect of further hardening at the high amplitude, and under proportional and nonproportional cyclic loadings. Results are compared with corresponding experiments.

  14. Can galaxy clusters, type Ia supernovae and the cosmic microwave background rule out a class of modified gravity theories?

    Holanda, R F L

    2016-01-01

    In this paper we study cosmological signatures of modified gravity theories that can be written as a coupling between an extra scalar field and the electromagnetic part of the usual Lagrangian for the matter fields. In these frameworks the entire electromagnetic sector of the theory is affected, and variations of fundamental constants, of the cosmic distance duality relation and of the evolution law of the cosmic microwave background radiation (CMB) are expected and are related to each other. In order to search for these variations we perform joint analyses with angular diameter distances of galaxy clusters, luminosity distances of type Ia supernovae and $T_{CMB}(z)$ measurements. We obtain tight constraints with no indication of violation of the standard framework.

  15. Simultaneous trace identification and quantification of common types of microplastics in environmental samples by pyrolysis-gas chromatography-mass spectrometry.

    Fischer, Marten; Scholz-Böttcher, Barbara M

    2017-04-09

    The content of microplastics (MP) in the environment is constantly growing. Since the environmental relevance, particularly bioavailability, rises with decreasing particle size, knowledge of the MP proportion in habitats and organisms is of growing importance. The reliable recognition of MP particles is limited and subject to substantial uncertainties. Therefore, spectroscopic methods are necessary to confirm the plastic nature of isolated particles, determine the polymer type and obtain particle-count-related quantitative data. In this study Curie-point pyrolysis-gas chromatography-mass spectrometry combined with thermochemolysis is shown to be an excellent analytical tool to simultaneously identify and optionally quantify MP in environmental samples on a polymer-specific, mass-related trace level. The method is independent of any optical preselection or particle appearance. For this purpose polymer-characteristic pyrolysis products and their indicative fragment ions were used to analyze eight common types of plastics. Further aspects of calibration, recoveries, and potential matrix effects are discussed. The method is exemplarily applied to selected fish samples after an enzymatic-chemical pretreatment. This new approach with mass-related results is complementary to established FT-IR and Raman methods, which provide particle counts of individual polymer particles.

  16. Weyl Group Multiple Dirichlet Series: Type A Combinatorial Theory (AM-175)

    Brubaker, Ben; Friedberg, Solomon

    2011-01-01

    Weyl group multiple Dirichlet series are generalizations of the Riemann zeta function. Like the Riemann zeta function, they are Dirichlet series with analytic continuation and functional equations, having applications to analytic number theory. By contrast, these Weyl group multiple Dirichlet series may be functions of several complex variables and their groups of functional equations may be arbitrary finite Weyl groups. Furthermore, their coefficients are multiplicative up to roots of unity, generalizing the notion of Euler products. This book proves foundational results about these series an

  17. Communication: Two types of flat-planes conditions in density functional theory.

    Yang, Xiaotian Derrick; Patel, Anand H G; Miranda-Quintana, Ramón Alain; Heidar-Zadeh, Farnaz; González-Espinoza, Cristina E; Ayers, Paul W

    2016-07-21

    Using results from atomic spectroscopy, we show that there are two types of flat-planes conditions. The first type of flat-planes condition occurs when the energy as a function of the number of electrons of each spin, Nα and Nβ, has a derivative discontinuity on a line segment where the number of electrons, Nα + Nβ, is an integer. The second type of flat-planes condition occurs when the energy has a derivative discontinuity on a line segment where the spin polarization, Nα - Nβ, is an integer, but does not have a discontinuity associated with an integer number of electrons. Type 2 flat planes are rare-we observed just 15 type 2 flat-planes conditions out of the 4884 cases we tested-but their mere existence has implications for the design of exchange-correlation energy density functionals. To facilitate the development of functionals that have the correct behavior with respect to both fractional number of electrons and fractional spin polarization, we present a dataset for the chromium atom and its ions that can be used to test new functionals.
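    The derivative discontinuities described above rest on the well-known result that the exact-functional energy E(N) is piecewise linear in the electron number between integers, with a slope jump at each integer N. A minimal sketch of that interpolation (the integer-N energies below are illustrative values, not data from the paper):

    ```python
    import math

    def energy_fractional(n_electrons, integer_energies):
        """Piecewise-linear E(N) between integer electron numbers.

        Each segment's slope is set by the neighboring integer-N energies,
        so dE/dN jumps at integer N -- the derivative discontinuity
        underlying the flat-planes conditions.
        """
        lo = math.floor(n_electrons)
        frac = n_electrons - lo
        if frac == 0.0:
            return integer_energies[lo]
        return (1 - frac) * integer_energies[lo] + frac * integer_energies[lo + 1]

    # Illustrative (made-up) energies for N = 5, 6, 7 electrons, in hartree:
    E = {5: -14.0, 6: -14.6, 7: -14.9}
    e_frac = energy_fractional(6.25, E)  # linear between E[6] and E[7]
    ```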

  19. Introduction to uncertainty quantification

    Sullivan, T J

    2015-01-01

    Uncertainty quantification is a topic of increasing practical importance at the intersection of applied mathematics, statistics, computation, and numerous application areas in science and engineering. This text provides a framework in which the main objectives of the field of uncertainty quantification are defined, and an overview of the range of mathematical methods by which they can be achieved. Complete with exercises throughout, the book will equip readers with both theoretical understanding and practical experience of the key mathematical and algorithmic tools underlying the treatment of uncertainty in modern applied mathematics. Students and readers alike are encouraged to apply the mathematical methods discussed in this book to their own favourite problems to understand their strengths and weaknesses, also making the text suitable as a self-study. This text is designed as an introduction to uncertainty quantification for senior undergraduate and graduate students with a mathematical or statistical back...
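    A core task the text covers is forward propagation of input uncertainties through a simulation model. As a minimal sketch, assuming a toy model and Gaussian input distributions chosen purely for illustration, Monte Carlo sampling estimates the induced output uncertainty:

    ```python
    import random
    import statistics

    def model(a, b):
        # Hypothetical response surface standing in for a simulation model.
        return a ** 2 + 0.5 * b

    def propagate(n_samples=10_000, seed=0):
        """Sample uncertain inputs and summarize the induced output uncertainty."""
        rng = random.Random(seed)
        outputs = []
        for _ in range(n_samples):
            a = rng.gauss(1.0, 0.1)   # input a ~ N(1.0, 0.1^2), assumed
            b = rng.gauss(2.0, 0.2)   # input b ~ N(2.0, 0.2^2), assumed
            outputs.append(model(a, b))
        return statistics.mean(outputs), statistics.stdev(outputs)

    out_mean, out_std = propagate()
    ```

    More sophisticated methods in the book (surrogate models, sensitivity analysis) aim to reduce the number of model evaluations such brute-force sampling requires.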

  20. Relative quantification and detection of different types of infectious bursal disease virus in bursa of Fabricius and cloacal swabs using real time RT-PCR SYBR green technology

    Li, Yiping; Handberg, K.J.; Kabell, Susanne;

    2007-01-01

    In the present study, different types of infectious bursal disease virus (IBDV), virulent strain DK01, classic strain F52/70 and vaccine strain D78, were quantified and detected in infected bursa of Fabricius (BF) and cloacal swabs using quantitative real-time RT-PCR with SYBR green dye. For selection... ... or F52/70 inoculation were detected as virus positive at day 1 post inoculation (p.i.). The D78 viral load peaked at day 4 and day 8 p.i., while the DK01 and F52/70 viral loads showed relatively high levels at day 2 p.i. In cloacal swabs, viruses were detectable at day 2 p.i. for DK01 and F52/70, and at day 8...

  1. Five Dimensional Bianchi Type-V Space-Time in f(R,T) Theory of Gravity

    L.S. Ladke,

    2016-02-01

    We study the spatially homogeneous anisotropic Bianchi type-V universe in f(R,T) theory of gravity, where R is the Ricci scalar and T is the trace of the energy-momentum tensor. We assume the variation law of the mean Hubble parameter and a constant deceleration parameter to find two different five-dimensional exact solutions of the modified field equations. The first solution yields a singular model for n ≠ 0, while the second gives a nonsingular model for n = 0. The physical quantities are discussed for both models in the future evolution of the universe.

  2. Bianchi type-I massive string magnetized barotropic perfect fluid cosmological model in bimetric theory

    S D Katore; R S Rane; K S Wankhade

    2011-04-01

    Bianchi type-I massive string cosmological model for perfect fluid distribution in the presence of magnetic field is investigated in Rosen’s [Gen. Relativ. Gravit. 4, 435 (1973)] bimetric theory of gravitation. To obtain the deterministic model in terms of cosmic time, we have used the condition $A = (B C)^n$, where n is a constant, between the metric potentials. The magnetic field is due to the electric current produced along the x-axis with infinite electrical conductivity. Some physical and geometrical properties of the exhibited model are discussed and studied.

  3. Types of two-dimensional N = 4 superconformal field theories

    Abbas Ali

    2003-12-01

    Various types of N = 4 superconformal symmetries in two dimensions are considered. It is proposed that apart from the well-known cases of SU(2) and SU(2) × SU(2) × U(1), their Kac–Moody symmetry can also be SU(2) × (U(1))^4. Operator product expansions for the last case are derived. A complete free field realization for the same is obtained.

  4. Theory of light-matter interactions in cascade and diamond type atomic ensembles

    Jen, Hsiang-Hua

    2011-01-01

    In this thesis, we investigate the quantum mechanical interaction of light with matter in the form of a gas of ultracold atoms: the atomic ensemble. We present a theoretical analysis of two problems, which involve the interaction of quantized electromagnetic fields (called signal and idler) with the atomic ensemble: (i) cascade two-photon emission in an atomic ladder configuration, and (ii) photon frequency conversion in an atomic diamond configuration. The motivation for these studies comes from potential applications in long-distance quantum communication, where it is desirable to generate quantum correlations between telecommunication-wavelength light fields and ground-level atomic coherences. We develop a theory of signal-idler pair correlations. The analysis is complicated by the possible generation of multiple excitations in the atomic ensemble. An analytical treatment is given in the limit of a single excitation, assuming adiabatic laser excitations. The analysis predicts superradiant timescales ...

  5. The use of quantitative PCR for identification and quantification of Brachyspira pilosicoli, Lawsonia intracellularis and Escherichia coli fimbrial types F4 and F18 in pig feces

    Ståhl, Marie; Kokotovic, Branko; Hjulsager, Charlotte Kristiane

    2011-01-01

    Four quantitative PCR (qPCR) assays were evaluated for quantitative detection of Brachyspira pilosicoli, Lawsonia intracellularis, and E. coli fimbrial types F4 and F18 in pig feces. Standard curves were based on feces spiked with the respective reference strains. The detection limits from the spiking experiments were 10^2 bacteria/g feces for Bpilo-qPCR and Laws-qPCR, and 10^3 CFU/g feces for F4-qPCR and F18-qPCR. The PCR efficiency for all four qPCR assays was between 0.91 and 1.01, with R^2 above 0.993. Standard curves, slopes and elevation, varied between assays and between measurements from pure... ... of using specific standard curves, where each pathogen is analysed in the same matrix as sample DNA. The qPCRs were compared to traditional bacteriological diagnostic methods and found to be more sensitive than cultivation for E. coli and B. pilosicoli. The qPCR assay for Lawsonia was also more sensitive...
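    The standard-curve arithmetic behind efficiency figures like those above is straightforward; a sketch, where the slope and intercept values are illustrative rather than taken from the study:

    ```python
    def pcr_efficiency(slope):
        """Amplification efficiency from the slope of a Ct vs log10(copies)
        standard curve; perfect doubling per cycle gives slope ≈ -3.32
        and efficiency ≈ 1.0 (i.e., 100%)."""
        return 10 ** (-1.0 / slope) - 1.0

    def copies_from_ct(ct, slope, intercept):
        """Invert the standard curve Ct = slope * log10(copies) + intercept."""
        return 10 ** ((ct - intercept) / slope)

    # Illustrative curve: Ct = -3.32 * log10(copies) + 38.0
    eff = pcr_efficiency(-3.32)                  # close to 1.0
    copies = copies_from_ct(28.04, -3.32, 38.0)  # roughly 10^3 copies
    ```

    Matrix effects are why the study builds each standard curve in spiked feces: the slope and intercept fitted in a clean buffer would not transfer to the sample matrix.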

  6. Theory of highly efficient multiexciton generation in type-II nanorods

    Eshet, Hagai; Baer, Roi; Neuhauser, Daniel; Rabani, Eran

    2016-10-01

    Multiexciton generation, by which more than a single electron-hole pair is generated on optical excitation, is a promising paradigm for pushing the efficiency of solar cells beyond the Shockley-Queisser limit of 31%. Utilizing this paradigm, however, requires the onset energy of multiexciton generation to be close to twice the band gap energy and the efficiency to increase rapidly above this onset. This challenge remains unattainable even using confined nanocrystals, nanorods or nanowires. Here, we show how both goals can be achieved in a nanorod heterostructure with type-II band offsets. Using pseudopotential atomistic calculation on a model type-II semiconductor heterostructure we predict the optimal conditions for controlling multiexciton generation efficiencies at twice the band gap energy. For a finite band offset, this requires a sharp interface along with a reduction of the exciton cooling and may enable a route for breaking the Shockley-Queisser limit.

  7. Noether-type theory for discrete mechanico-electrical dynamical systems with nonregular lattices

    2010-01-01

    We investigate Noether symmetries and conservation laws of discrete mechanico-electrical systems with nonregular lattices. The operators of discrete transformation and of discrete differentiation to the right and left are introduced for the systems. Based on the invariance of the discrete Hamilton action on nonregular lattices of the systems with dissipation forces under infinitesimal transformations with respect to the time, generalized coordinates and generalized charge quantities, we work out the discrete analog of the generalized variational formula. From this formula we derive the discrete analog of the generalized Noether-type identity, and then we present the generalized quasi-extremal equations and the properties of these equations for the systems. We also obtain the discrete analog of the Noether-type conservation laws and the discrete analog of the generalized Noether theorems for the systems. Finally, we use an example to illustrate these results.

  8. TH-C-19A-09: Quantification of Transmission and Backscatter Factors as a function of Distance to Inhomogeneity Interface for Three Types of Surgical Implant Plates

    Wilson, D; Mills, M; Wang, B [University of Louisville, Louisville, KY (United States)

    2014-06-15

    Purpose: Carbon fiber materials have been increasingly used clinically, mainly in orthopedics, as an alternative to metallic implants because of their minimal artifacts on CT and MRI images. This study characterizes the transmission and backscatter properties of carbon fiber plates (CarboFix Orthopedics, Herzeliya, Israel) with measurements for radiation therapy applications, and compares them to traditional stainless steel (SS) and titanium (Ti) metal materials. Methods: For the transmission measurements, a 1-mm-thick test plate was placed upstream from a plane-parallel Markus chamber, separated by various thicknesses of polystyrene plates in 0.5 cm increments between 0 and 5 cm. With this setup, we quantified the radiation transmission as a function of distance to the inhomogeneity interface. The LINAC source-to-detector distance was maintained at 100 cm and 200 MU was delivered for each measurement. Two 3-cm solid water phantoms were placed at the top and bottom to provide buildup. All the measurements were performed for 6 MV and 18 MV photons. The backscatter measurements had an identical setup, except that the test plate was downstream of the chamber from radiation. Results: The carbon fiber plates did not introduce any measurable inhomogeneity effect on the transmission and backscatter factors because of their low atomic number. In contrast, traditional metal implant materials caused up to a 15% dose difference upstream and 25% backscatter downstream from radiation. These differences decrease as the distance to the inhomogeneity interface increases and become unmeasurable at distances of 3 cm and 1 cm for upstream and downstream, respectively. Conclusion: A new type of carbon fiber implant plate was evaluated and found to have a minimal inhomogeneity effect in MV radiation beams. Patients would benefit from a carbon-based implant over metal for radiation therapy due to the minimal backscatter and imaging artifacts.

  9. An analysis of the openEHR archetype semantics based on a typed lambda theory.

    Tatsukawa, Akimichi; Shinohara, Emiko Y; Kawazoe, Yoshimasa; Imai, Takeshi; Ohe, Kazuhiko

    2013-01-01

    openEHR has adopted a dual-model architecture consisting of the Reference Model and Archetypes. The specification, however, lacks formal definitions of archetype semantics, so its behaviors have remained ambiguous. The objective of this poster is to analyze the semantics of the openEHR archetypes: their variance and mutability. We use a typed lambda calculus as an analysis tool. As a result, we have reached the conclusion that archetypes should be 1) covariant and 2) immutable schemas.

  10. Theory of phase segregation in DNA assemblies containing two different base-pair sequence types

    (O') Lee, Dominic J.; Wynveen, Aaron; Kornyshev, Alexei A.

    2017-01-01

    Spontaneous pairing of homologous DNA sequences—a challenging subject in molecular biophysics, often referred to as ‘homology recognition’—has been observed in vitro for several DNA systems. One of these experiments involved liquid crystalline quasi-columnar phases formed by a mixture of two kinds of double stranded DNA oligomer. Both oligomer types were of the same length and identical stoichiometric base-pair composition, but the base-pairs followed a different order. Phase segregation of the two DNA types was observed in the experiments, with the formation of boundaries between domains rich in molecules of one type (order) of base pair sequence. We formulate here a modified ‘X–Y model’ for phase segregation in such assemblies, obtain approximate solutions of the model, compare analytical results to Monte Carlo simulations, and rationalise past experimental observations. This study, furthermore, reveals the factors that affect the degree of segregation. Such information could be used in planning new versions of similar segregation experiments, needed for deepening our understanding of forces that might be involved, e.g., in gene–gene recognition.

  11. Theory of the normal modes of vibrations in the lanthanide type crystals

    Acevedo, Roberto [Instituto de Ciencias Basicas. Facultad de Ingenieria, Universidad Diego Portales, Avenida Ejercito 441, Santiago (Chile); Soto-Bubert, Andres, E-mail: roberto.acevedo@umayor.cl

    2008-11-01

    For the lanthanide-type crystals, a vast and rich, though incomplete, amount of experimental data has been accumulated from linear and nonlinear optics during the last decades. The main goal of the current research work is to report a new methodology and strategy to put forward a more representative approach to account for the normal modes of vibration of a complex N-body system. For illustrative purposes, the chloride lanthanide-type crystals Cs2NaLnCl6 have been chosen, and we develop new convergence tests as well as a criterion to deal with the details of the F-matrix (potential energy matrix). A novel and useful concept of natural potential energy distributions (NPED) is introduced and examined throughout the course of this work. The diagonal and non-diagonal contributions to these NPED values are evaluated explicitly for a series of these crystals. Our model is based upon a total of seventy-two internal coordinates and ninety-eight internal Hooke-type force constants. An optimization procedure is applied to the series of chloride lanthanide crystals, and it is shown that the strategy and model adopted are sound from both chemical and physical viewpoints. We can argue that the current model is able to accommodate a number of interactions and to provide us with very useful physical insight. The limitations and advantages of the current model and the most likely sources for improvement are discussed in detail.

  12. Integrating theory and data to create an online self-management programme for adults with type 2 diabetes: HeLP-Diabetes

    Kingshuk Pal

    2015-10-01

    This protocol demonstrates a multi-disciplinary approach to combining evidence from multiple sources to create 'HeLP-Diabetes': a theory- and evidence-based online self-management intervention for adults with type 2 diabetes.

  13. Quantitative RT-PCR assays for the determination of urokinase-type plasminogen activator and plasminogen activator inhibitor type 1 mRNA in primary tumor tissue of breast cancer patients: comparison to antigen quantification by ELISA.

    Biermann, J.C.; Holzscheiter, L.; Kotzsch, M.; Luther, T.; Kiechle-Bahat, M.; Sweep, F.C.; Span, P.N.; Schmitt, M.; Magdolen, V.

    2008-01-01

    Urokinase-type plasminogen activator (uPA) and its inhibitor plasminogen activator inhibitor type 1 (PAI-1) play a key role in tumor-associated processes such as the degradation of extracellular matrix proteins, tissue remodeling, cell adhesion, migration, and invasion. High antigen levels of uPA an

  14. Quantification of Human T-lymphotropic virus type I (HTLV-I) provirus load in a rural West African population: no enhancement of human immunodeficiency virus type 2 pathogenesis, but HTLV-I provirus load relates to mortality

    Ariyoshi, K; Berry, N; Cham, F;

    2003-01-01

    Human T-lymphotropic virus type I (HTLV-I) provirus load was examined in a cohort of a population in Guinea-Bissau among whom human immunodeficiency virus (HIV) type 2 is endemic. Geometric mean of HIV-2 RNA load among HTLV-I-coinfected subjects was significantly lower than that in subjects infec...

  15. Contiguous triple spinal dysraphism associated with Chiari malformation Type II and hydrocephalus: an embryological conundrum between the unified theory of Pang and the unified theory of McLone.

    Dhandapani, Sivashanmugam; Srinivasan, Anirudh

    2016-01-01

    Triple spinal dysraphism is extremely rare. There are published reports of multiple discrete neural tube defects with intervening normal segments that are explained by the multisite closure theory of primary neurulation, having an association with Chiari malformation Type II consistent with the unified theory of McLone. The authors report on a 1-year-old child with contiguous myelomeningocele and lipomyelomeningocele centered on Type I split cord malformation with Chiari malformation Type II and hydrocephalus. This composite anomaly is probably due to select abnormalities of the neurenteric canal during gastrulation, with a contiguous cascading impact on both dysjunction of the neural tube and closure of the neuropore, resulting in a small posterior fossa, probably bringing the unified theory of McLone closer to the unified theory of Pang.

  16. Blaschke-type conditions in unbounded domains, generalized convexity and applications in perturbation theory

    Favorov, S

    2012-01-01

    We introduce a new geometric characteristic of compact sets in the plane called $r$-convexity, which fits nicely into the concept of generalized convexity and essentially extends conventional convexity. For a class of subharmonic functions on unbounded domains with $r$-convex compact complement, with growth governed by the distance to the boundary, we obtain a Blaschke-type condition for their Riesz measures. The result is applied to the study of the convergence of the discrete spectrum under Schatten-von Neumann perturbations.

  17. Basic Theory for Differential Equations with Unified Riemann-Liouville and Hadamard Type Fractional Derivatives

    Basak Karpuz

    2017-03-01

    In this paper, we extend the definition of the fractional integral and derivative introduced in [Appl. Math. Comput. 218 (2011)] by Katugampola, which exhibits nice properties only for numbers whose real parts lie in [0,1]. We prove some interesting properties of the fractional integrals and derivatives. Based on these properties, the following concepts for the new type of fractional differential equations are explored: existence and uniqueness of solutions; solutions of autonomous fractional differential equations; dependence on the initial conditions; Green's function; variation of parameters formula.

  18. Mushroom-type structures with the wires connected through diodes: Theory and applications

    Forouzmand, Ali; Kaipa, Chandra S. R.; Yakovlev, Alexander B.

    2016-07-01

    In this paper, we establish a general formalism to quantify the interaction of electromagnetic waves with mushroom-type structures (high-impedance surface and bi-layer) with diodes inserted along the direction of the wires. The analysis is carried out using the nonlocal homogenization model for the mushroom structure with generalized additional boundary conditions at the connections of the wires to the diodes. We calculate numerically the magnitude and phase of the reflected/transmitted fields in the presence of ideal and realistic PIN diodes. It is observed that the reflection/transmission characteristics of the mushroom-type structures can be controlled by tuning the working states of the integrated PIN diodes. We realize a structure with a multi-diode switch to minimize the undesired transmission for a particular incident angle. In addition, a dual-band subwavelength imaging lens is designed based on the resonant amplification of evanescent waves, wherein the operating frequency can be tuned by changing the states of the PIN diodes. The analytical results are verified with the full-wave electromagnetic solver CST Microwave Studio, showing good agreement.

  19. Quantification of Cannabinoid Content in Cannabis

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoid content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
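
The band-selection step described in this abstract (correlating reflectance at each waveband with measured THC content and keeping the best-correlated band) can be sketched as follows. This is an illustrative reconstruction on synthetic data, not the authors' code; the wavelength grid, sample values, and the planted absorption feature at 695 nm are all invented for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: 40 leaf samples, reflectance on a 400-1000 nm grid.
wavelengths = np.arange(400, 1001, 5)            # nm
thc = rng.uniform(0.2, 20.0, size=40)            # % THC (invented values)

# Build reflectance with an absorption feature near 695 nm that deepens with THC.
base = rng.normal(0.4, 0.02, size=(40, wavelengths.size))
feature = np.exp(-((wavelengths - 695.0) / 15.0) ** 2)
reflectance = base - 0.01 * thc[:, None] * feature[None, :]

# Correlation analysis: Pearson r between THC and reflectance at each band.
r = np.array([np.corrcoef(thc, reflectance[:, j])[0, 1]
              for j in range(wavelengths.size)])

best = wavelengths[np.argmax(np.abs(r))]
print(f"optimal band: {best} nm (|r| = {np.abs(r).max():.2f})")
```

A stepwise multivariate regression, as used in the study, would then add further bands only if they significantly reduce the residual error.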

  20. Type Ia Supernovae and their Environment: Theory and Applications to SN 2014J

    Dragulin, Paul

    2015-01-01

    We present theoretical semi-analytic models for the interaction of stellar winds with the interstellar medium (ISM) or prior mass loss implemented in our code SPICE (Supernovae Progenitor Interaction Calculator for parameterized Environments, available on request), assuming spherical symmetry and power-law ambient density profiles and using the Pi-theorem. This allows us to test a wide variety of configurations, their functional dependencies, and to find classes of solutions for given observations. Here, we study Type Ia (SN Ia) surroundings of single and double degenerate systems, and their observational signatures. Winds may originate from the progenitor prior to the white dwarf (WD) stage, the WD, a donor star, or an accretion disk (AD). For M_Ch explosions, the AD wind dominates and produces a low-density void several light years across surrounded by a dense shell. The bubble explains the lack of observed interaction in late time SN light curves for, at least, several years. The shell produces narrow ISM l...

  1. Refreeze experiments with water droplets containing different types of ice nuclei interpreted by classical nucleation theory

    Kaufmann, Lukas; Marcolli, Claudia; Luo, Beiping; Peter, Thomas

    2017-03-01

    Homogeneous nucleation of ice in supercooled water droplets is a stochastic process. In its classical description, the growth of the ice phase requires the emergence of a critical embryo from random fluctuations of water molecules between the water bulk and ice-like clusters, which is associated with overcoming an energy barrier. For heterogeneous ice nucleation on ice-nucleating surfaces both stochastic and deterministic descriptions are in use. Deterministic (singular) descriptions are often favored because the temperature dependence of ice nucleation on a substrate usually dominates the stochastic time dependence, and the ease of representation facilitates the incorporation in climate models. Conversely, classical nucleation theory (CNT) describes heterogeneous ice nucleation as a stochastic process with a reduced energy barrier for the formation of a critical embryo in the presence of an ice-nucleating surface. The energy reduction is conveniently parameterized in terms of a contact angle α between the ice phase immersed in liquid water and the heterogeneous surface. This study investigates various ice-nucleating agents in immersion mode by subjecting them to repeated freezing cycles to elucidate and discriminate the time and temperature dependences of heterogeneous ice nucleation. Freezing rates determined from such refreeze experiments are presented for Hoggar Mountain dust, birch pollen washing water, Arizona test dust (ATD), and also nonadecanol coatings. For the analysis of the experimental data with CNT, we assumed the same active site to be always responsible for freezing. 
Three different CNT-based parameterizations were used to describe rate coefficients for heterogeneous ice nucleation as a function of temperature, all leading to very similar results: for Hoggar Mountain dust, ATD, and larger nonadecanol-coated water droplets, the experimentally determined increase in freezing rate with decreasing temperature is too shallow to be described properly by
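
In the CNT framework referred to above, the contact angle α enters through the classical compatibility factor that scales the homogeneous nucleation barrier, ΔG*_het = f(α) ΔG*_hom with f(α) = (2 + cos α)(1 - cos α)²/4 for a flat substrate. The sketch below illustrates this standard textbook relation; it is generic CNT bookkeeping, not any of the specific parameterizations fitted in the paper:

```python
import math

def cnt_barrier_factor(alpha_deg: float) -> float:
    """Classical compatibility factor f(a) = (2 + cos a)(1 - cos a)^2 / 4
    that scales the homogeneous nucleation barrier for a flat substrate."""
    a = math.radians(alpha_deg)
    return (2.0 + math.cos(a)) * (1.0 - math.cos(a)) ** 2 / 4.0

# Limiting cases: a perfectly ice-nucleating surface (alpha -> 0) removes the
# barrier entirely, while alpha = 180 deg recovers homogeneous nucleation.
print(cnt_barrier_factor(0.0))              # 0.0
print(round(cnt_barrier_factor(90.0), 6))   # 0.5
print(round(cnt_barrier_factor(180.0), 6))  # 1.0
```

Any α between these limits lowers the barrier and therefore raises the heterogeneous nucleation rate coefficient at a given temperature.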

  2. Bianchi Type-I Massive String Magnetized Barotropic Perfect Fluid Cosmological Model in the Bimetric Theory of Gravitation

    N. P. Gaikwad; M. S. Borkar; S. S. Charjan

    2011-01-01

    We investigate the Bianchi type-I massive string magnetized barotropic perfect fluid cosmological model in Rosen's bimetric theory of gravitation, with and without a magnetic field, by applying the techniques used by Letelier (1979, 1980) and Stachel (1983). To obtain a deterministic model of the universe, it is assumed that the universe is filled with a barotropic perfect fluid distribution. The physical and geometrical significance of the model are discussed. By comparing our model with the model of Bali et al. (2007), it is realized that there are no big-bang or big-crunch singularities in our model and that T=0 is not the time of the big bang, whereas the model of Bali et al. starts with a big bang at T=0. Further, our model is in agreement with Bali et al. (2007) as time increases, in the presence as well as the absence of a magnetic field.

  3. Monte Carlo simulation of transfer reactions using extended R-matrix theory picturing surrogate-type WFCF features

    Bouland, Olivier H.

    2016-03-01

    This article supplies an overview of issues related to the interpretation of surrogate measurement results for neutron-incident cross section predictions; these difficulties are somewhat masked by the historical conversion route based on the Weisskopf-Ewing approximation. Our proposal is to handle them with a more rigorous approach relying on Monte Carlo simulation of transfer reactions with extended R-matrix theory. The multiple deficiencies of the historical surrogate treatment are recalled, but only one is examined in some detail here: the calculation of in- and outgoing-channel Width Fluctuation Correction Factors (WFCFs), whose behavior partly reflects the failure of Niels Bohr's compound nucleus picture. Relevant WFCF calculations for neutron-induced surrogate- and cross-section-type reactions over the fluctuating energy range [0 - 2.1 MeV] are presented and discussed for the 240Pu* and 241Pu* compound nucleus isotopes.

  4. On B-type open-closed Landau-Ginzburg theories defined on Calabi-Yau Stein manifolds

    Babalic, Mirela; Lazaroiu, Calin Iuliu; Tavakol, Mehdi

    2016-01-01

    We consider the bulk algebra and topological D-brane category arising from the differential model of the open-closed B-type topological Landau-Ginzburg theory defined by a pair $(X,W)$, where $X$ is a non-compact Calabi-Yau manifold and $W$ has compact critical set. When $X$ is a Stein manifold (but not restricted to be a domain of holomorphy), we extract equivalent descriptions of the bulk algebra and of the category of topological D-branes which are constructed using only the analytic space associated to $X$. In particular, we show that the D-brane category is described by projective matrix factorizations defined over the ring of holomorphic functions of $X$. We also discuss simplifications of the analytic models which arise when $X$ is holomorphically parallelizable and illustrate these analytic models in a few classes of examples.

  5. Factors Influencing Physical Activity Behavior among Iranian Women with Type 2 Diabetes Using the Extended Theory of Reasoned Action

    Alireza Didarloo

    2011-10-01

    Background: Findings of most studies indicate that the only way to control diabetes and prevent its debilitating effects is the continuous performance of self-care behaviors. Physical activity is a non-pharmacological method of diabetes treatment and, because of its positive effects on diabetic patients, it is being increasingly considered by researchers and practitioners. This study aimed at determining factors influencing physical activity among diabetic women in Iran, using the extended theory of reasoned action. Methods: A sample of 352 women with type 2 diabetes, referring to a Diabetes Clinic in Khoy, Iran, participated in the study. Appropriate instruments were designed to measure the desired variables (knowledge of diabetes, personal beliefs, subjective norms, perceived self-efficacy, behavioral intention, and physical activity behavior). The reliability and validity of the instruments were examined and approved. Statistical analyses were conducted with inferential techniques (independent t-tests, correlations, and regressions) using the SPSS package. Results: The findings of this investigation indicated that, among the constructs of the model, self-efficacy was the strongest predictor of intentions among women with type 2 diabetes and both directly and indirectly affected physical activity. In addition to self-efficacy, the patients' physical activity was also influenced by the other variables of the model and by sociodemographic factors. Conclusion: Our findings suggest that the high ability of the theory of reasoned action, extended by self-efficacy, to forecast and explain physical activity can be a base for educational intervention. Educational interventions based on the proposed model are necessary for improving diabetic patients' physical activity behavior and controlling the disease.

  6. Positron emission tomographic imaging of the cannabinoid type 1 receptor system with [¹¹C]OMAR ([¹¹C]JHU75528): improvements in image quantification using wild-type and knockout mice.

    Herance, Raúl; Rojas, Santiago; Abad, Sergio; Jiménez, Xavier; Gispert, Juan Domingo; Millán, Olga; Martín-García, Elena; Burokas, Aurelijus; Serra, Miquel Àngel; Maldonado, Rafael; Pareto, Deborah

    2011-12-01

    In this study, we assessed the feasibility of using positron emission tomography (PET) and the tracer [¹¹C]OMAR ([¹¹C]JHU75528), an analogue of rimonabant, to study the brain cannabinoid type 1 (CB1) receptor system. Wild-type (WT) and CB1 knockout (KO) animals were imaged at baseline and after pretreatment with blocking doses of rimonabant. Brain uptake in WT animals was higher (50%) than in KO animals in baseline conditions. After pretreatment with rimonabant, WT uptake lowered to the level of KO animals. The results of this study support the feasibility of using PET with the radiotracer [¹¹C]JHU75528 to image the brain CB1 receptor system in mice. In addition, this methodology can be used to assess the effect of new drugs in preclinical studies using genetically manipulated animals.

  8. Motivational Profiles for Physical Activity Practice in Adults with Type 2 Diabetes: A Self-Determination Theory Perspective.

    Gourlan, Mathieu; Trouilloud, David; Boiché, Julie

    2016-01-01

    Drawing on Self-Determination Theory, this study explored the motivational profiles toward Physical Activity (PA) among adults with type 2 diabetes and the relationships between motivational profile, perceived competence and PA. Participants were 350 men and women (Mean age 62.77 years) who were interviewed on their motivations toward PA, perceived level of competence to practice, and PA practice. Cluster analyses reveal the existence of three distinct profiles: "High Combined" (ie, high scores on motivations ranging from intrinsic to external regulation, moderate level on amotivation), "Self-Determined" (ie, high scores on intrinsic, integrated, and identified regulations; low scores on other regulations), and "Moderate" (ie, moderate scores on all regulations). Participants with "High Combined" and "Self-Determined" profiles reported higher perceived competence and longer leisure-time PA practice in comparison to those with a "Moderate" profile. This study highlights the necessity of adopting a person-centered approach to better understand motivation toward PA among type 2 diabetics.

  9. Experimental Support for the Ecoimmunity Theory: Distinct Phenotypes of Nonlymphocytic Cells in SCID and Wild-Type Mice.

    Ochayon, David E; Baranovski, Boris M; Malkin, Peter; Schuster, Ronen; Kalay, Noa; Ben-Hamo, Rotem; Sloma, Ido; Levinson, Justin; Brazg, Jared; Efroni, Sol; Lewis, Eli C; Nevo, Uri

    2016-01-01

    Immune tolerance toward "self" is critical in multiple immune disorders. While there are several mechanisms to describe the involvement of immune cells in the process, the role of peripheral tissue cells in that context is not yet clear. The theory of ecoimmunity postulates that interactions between immune and tissue cells represent a predator-prey relationship. A lifelong interaction, shaped mainly during early ontogeny, leads to selection of nonimmune cell phenotypes. Normally, therefore, nonimmune cells that evolve alongside an intact immune system would be phenotypically capable of evading immune responses, and cells whose phenotype falls short of satisfying this steady state would expire under hostile immune responses. This view was supported until recently by experimental evidence showing an inferior endurance of severe combined immunodeficiency (SCID)-derived pancreatic islets when engrafted into syngeneic immune-intact wild-type (WT) mice, relative to islets from WT. Here we extend the experimental exploration of ecoimmunity by searching for the presence of the phenotypic changes suggested by the theory. Immune-related phenotypes of islets, spleen, and bone marrow immune cells were determined, as well as SCID and WT nonlymphocytic cells. Islet submass grafting was performed to depict syngeneic graft functionality. Islet cultures were examined under both resting and inflamed conditions for expression of CD40 and major histocompatibility complex (MHC) class I/II and release of interleukin-1α (IL-1α), IL-1β, IL-6, tumor necrosis factor-α (TNF-α), IL-10, and insulin. Results depict multiple pathways that appear to be related to the sculpting of nonimmune cells by immune cells; 59 SCID islet genes displayed relative expression changes compared with WT islets. SCID cells expressed lower tolerability to inflammation and higher levels of immune-related molecules, including MHC class I. Accordingly, islets exhibited a marked increase in insulin release upon

  10. Adjoint-Based Uncertainty Quantification with MCNP

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
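
The sensitivity-to-uncertainty step in adjoint-based approaches of this kind is the "sandwich rule": given a vector of relative sensitivities S of a figure of merit to the nuclear data and a data covariance matrix C, the relative variance of the response is SᵀCS. A minimal sketch with invented numbers follows; the sensitivities and covariances below are illustrative only, not the LIFE-blanket values from the study:

```python
import numpy as np

# "Sandwich rule" for nuclear-data uncertainty propagation: the relative
# variance of a response R is S^T C S, where S holds relative sensitivities
# (dR/R)/(dx/x) and C is the relative covariance matrix of the data.
S = np.array([0.9, -0.3, 0.1])            # sensitivities (invented)
C = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    1.0e-4]])  # relative covariance (invented)

rel_var = S @ C @ S
print(f"relative uncertainty: {np.sqrt(rel_var) * 100:.2f}%")
```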

  11. The theory of discrete barriers and its applications to linear boundary-value problems of the 'Dirichlet type'; Théorie des barrières discrètes et applications à des problèmes linéaires elliptiques du 'type de Dirichlet'

    Jamet, P. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1967-07-01

    This report gives a general presentation of barrier theory for finite difference operators, with its applications to some boundary value problems. (author) [French] Ce rapport est un exposé synthétique de la théorie des barrières pour les opérateurs aux différences finies et de ses applications à certaines classes de problèmes linéaires elliptiques du 'type de Dirichlet'. (auteur)

  12. Modified social learning theory re-examined: correlates of self-management behaviors of persons with Type 2 diabetes.

    Nugent, Linda E; Wallston, Kenneth A

    2016-12-01

    Modified social learning theory (MSLT) applied to health predicts that health behavior is a multiplicative function of health value and perceptions of control over health. The self-management behaviors of persons with Type 2 diabetes mellitus, internal diabetes locus of control (IDLC), diabetes self-efficacy (DSE), and health value (HV) were assessed with an index of diabetes self-care activities in 107 patients receiving insulin. Multiple regression analysis showed DSE as the only MSLT construct that correlated with the index of diabetes self-care behaviors (β = .21, p < .05). While the predicted three-way interaction of IDLC × DSE × HV was significant (ΔR² = 4.5%, p < .05) in the final step of the hierarchical model, the pattern of the findings only partially supported MSLT. Instead of finding that patients who were simultaneously high on all three predictors scored highest on the behavioral index, we found that patients who were low on all three constructs reported the least amount of diabetes self-care behavior. Implications for further modification of MSLT and its applications to clinical practice are discussed.

  14. Selection of Treatment Strategies among Patients with Type 2 Diabetes Mellitus in Malaysia: A Grounded Theory Approach.

    Lee Lan Low

    Diabetes Mellitus is a multifaceted chronic illness, and its life-long treatment process requires patients to continuously engage with the healthcare system. Understanding how patients manoeuvre through the healthcare system for treatment is crucial in assisting them to optimise their disease management. This study aims to explore the issues determining patients' treatment strategies and the process by which patients manoeuvre through the current healthcare system in selecting their choice of treatment for T2DM. The Grounded Theory methodology was used. Twelve patients with Type 2 Diabetes Mellitus, nine family members and five healthcare providers from primary care clinics were interviewed using a semi-structured interview guide. Three focus group discussions were conducted among thirteen healthcare providers from public primary care clinics. Both purposive and theoretical sampling were used for data collection. The interviews were audio-taped and transcribed verbatim, followed by line-by-line coding and constant comparison to identify the categories and the core category. The concept of "experimentation" was observed in patients' help-seeking behaviour. The "experimentation" process required triggers, followed by information seeking related to treatment characteristics from trusted family members, friends and healthcare providers to enable decisions to be made on the choice of treatment modalities. The whole process was dynamic and iterative through interaction with the healthcare system. The decision-making process in choosing the types of treatment was complex, with an element of trial-and-error. The anchor of this process was the desire to fulfil the patient's expected outcome. Patients with Type 2 Diabetes Mellitus continuously used "experimentation" in their treatment strategies and help-seeking process.
The "experimentation" process was experiential, with continuous evaluation, information seeking and decision-making tinged with the element

  15. Towards an integrative account of social cognition: marrying theory of mind and interactionism to study the interplay of Type 1 and Type 2 processes

    Vivian Bohl; Wouter van den Bos

    2012-01-01

    Traditional theory of mind accounts of social cognition have been at the basis of most studies in the social cognitive neurosciences. However, in recent years, the need to go beyond traditional theory of mind accounts for understanding real life social interactions has become all the more pressing. At the same time it remains unclear whether alternative accounts, such as interactionism, can yield a sufficient description and explanation of social interactions. We argue that instead of conside...

  16. Using psychological theory to understand the clinical management of type 2 diabetes in Primary Care: a comparison across two European countries.

    Hrisos, S.; Eccles, M.P.; Francis, J.J.; Bosch, M.C.; Dijkstra, R.F.; Johnston, M.; Grol, R.P.T.M.; Kaner, E.F.; Steen, I.N.

    2009-01-01

    BACKGROUND: Long term management of patients with Type 2 diabetes is well established within Primary Care. However, despite extensive efforts to implement high quality care both service provision and patient health outcomes remain sub-optimal. Several recent studies suggest that psychological theori

  17. Disease quantification in dermatology

    Greve, Tanja Maria; Kamp, Søren; Jemec, Gregor B E

    2013-01-01

    Although clinical scores are useful in quantifying disease severity, they require extensive clinical experience and carry a risk of subjectivity. We explore the opportunity to use in vivo near-infrared (NIR) spectra as an objective and noninvasive method for local disease severity assessment in 31 psoriasis patients in whom selected plaques were scored clinically. A partial least squares (PLS) regression model was used to analyze and predict the severity scores from the NIR spectra of psoriatic and uninvolved skin. The correlation between predicted and clinically assigned scores was R=0.94 (RMSE=0.96), suggesting that in vivo NIR provides accurate clinical quantification of psoriatic plaques. Hence, NIR may be a practical solution to clinical severity assessment of psoriasis, providing a continuous, linear, numerical value of severity.

  18. Session Types = Intersection Types + Union Types

    Padovani, Luca

    2011-01-01

    We propose a semantically grounded theory of session types which relies on intersection and union types. We argue that intersection and union types are natural candidates for modeling branching points in session types and we show that the resulting theory overcomes some important defects of related behavioral theories. In particular, intersections and unions provide a native solution to the problem of computing joins and meets of session types. Also, the subtyping relation turns out to be a pre-congruence, while this is not always the case in related behavioral theories.

  19. Non-existence of non-topological solitons in some types of gauge field theories in Minkowski space

    Smolyakov, Mikhail N

    2010-01-01

    In this paper, the conditions under which non-topological solitons are absent in Yang-Mills theory coupled to a non-linear scalar field in Minkowski space are obtained. It is also shown that non-topological solitons are absent in a theory describing a massive complex vector field coupled to an electromagnetic field in Minkowski space.

  20. Students' Personality Types, Intended Majors, and College Expectations: Further Evidence Concerning Psychological and Sociological Interpretations of Holland's Theory

    Pike, Gary R.

    2006-01-01

    Because it focuses on the interactions between students and their environments, Holland's theory of vocational choice provides a powerful framework for studying college experiences. The present study assessed the relative merits of psychological and sociological interpretations of Holland's theory by examining the relationships among students' …

  1. Automated Template Quantification for DNA Sequencing Facilities

    Ivanetich, Kathryn M.; Yan, Wilson; Wunderlich, Kathleen M.; Weston, Jennifer; Walkup, Ward G.; Simeon, Christian

    2005-01-01

    The quantification of plasmid DNA by the PicoGreen dye binding assay has been automated, and the effect of quantification of user-submitted templates on DNA sequence quality in a core laboratory has been assessed. The protocol pipets, mixes and reads standards, blanks and up to 88 unknowns, generates a standard curve, and calculates template concentrations. For pUC19 replicates at five concentrations, coefficients of variation were 0.1, and percent errors were from 1% to 7% (n = 198). Standard curves with pUC19 DNA were nonlinear over the 1 to 1733 ng/μL concentration range required to assay the majority (98.7%) of user-submitted templates. Over 35,000 templates have been quantified using the protocol. For 1350 user-submitted plasmids, 87% deviated by ≥ 20% from the requested concentration (500 ng/μL). Based on data from 418 sequencing reactions, quantification of user-submitted templates was shown to significantly improve DNA sequence quality. The protocol is applicable to all types of double-stranded DNA, is unaffected by primer (1 pmol/μL), and is user modifiable. The protocol takes 30 min, saves 1 h of technical time, and costs approximately $0.20 per unknown. PMID:16461949
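
The standard-curve arithmetic in a protocol like this can be sketched as follows. The fluorescence numbers are invented, and the quadratic fit is only one plausible way to accommodate the nonlinearity the authors report over 1-1733 ng/μL; the abstract does not specify the actual curve-fitting details:

```python
import numpy as np

# Standards: known concentrations and measured PicoGreen fluorescence
# (invented values; the true response is nonlinear over 1-1733 ng/uL,
# so a quadratic is fit rather than a straight line).
conc_std = np.array([1, 50, 200, 500, 1000, 1733], dtype=float)     # ng/uL
fluor_std = np.array([12, 580, 2210, 5100, 9000, 13500], dtype=float)

coeffs = np.polyfit(conc_std, fluor_std, deg=2)   # fluorescence = f(conc)

def concentration(fluor_unknown: float) -> float:
    """Invert the standard curve on a dense grid (monotonic over the range)."""
    grid = np.linspace(conc_std.min(), conc_std.max(), 10000)
    return float(np.interp(fluor_unknown, np.polyval(coeffs, grid), grid))

print(f"{concentration(5100.0):.0f} ng/uL")
```

Inverting via a dense grid avoids solving the quadratic explicitly and generalizes to other monotonic curve shapes.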

  2. Advancing agricultural greenhouse gas quantification*

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

    Agricultural Research Service 2011), which aim to improve consistency of field measurement and data collection for soil carbon sequestration and soil nitrous oxide fluxes. Often these national-level activity data and emissions factors are the basis for regional and smaller-scale applications. Such data are used for model-based estimates of changes in GHGs at a project or regional level (Olander et al 2011). To complement national data for regional-, landscape-, or field-level applications, new data are often collected through farmer knowledge or records and field sampling. Ideally such data could be collected in a standardized manner, perhaps through some type of crowd-sourcing model, to improve regional- and national-level data, as well as to improve consistency of locally collected data. Data can also be collected by companies working with agricultural suppliers and in-country networks, within efforts aimed at understanding firm and product (supply-chain) sustainability and risks (FAO 2009). Such data may feed into various certification processes or reporting requirements from buyers. Unfortunately, these data are likely proprietary. A new process is needed to aggregate and share private data in a way that would not raise competitive concerns, so that such data could complement or supplement national data and add value. A number of papers in this focus issue discuss issues surrounding quantification methods and systems at large scales, global and national levels, while others explore landscape- and field-scale approaches. A few explore the intersection of top-down and bottom-up data measurement and modeling approaches. 5. The agricultural greenhouse gas quantification project and ERL focus issue: Important land management decisions are often made with poor or few data, especially in developing countries. Current systems for quantifying GHG emissions are inadequate in most low-income countries, due to a lack of funding, human resources, and infrastructure. 
Most non-Annex 1 countries

  3. Verb aspect, alternations and quantification

    Svetla Koeva

    2015-11-01

    In this paper we briefly discuss the nature of Bulgarian verb aspect and argue that verb aspect pairs are different lexical units with different (although related) meanings, different argument structures (reflecting the categories, explicitness and referential status of arguments) and different sets of semantic and syntactic alternations. The verb prefixes resulting in the derivation of perfective verbs can in some cases be interpreted as lexical quantifiers as well. Thus Bulgarian verb aspect is related (in different ways) both to the potential for the generation of alternations and to prefixal lexical quantification. It is shown that the scope of lexical quantification by means of verbal prefixes is the quantified verb phrase, and that this scope remains constant in all derived alternations. The paper concerns the basic issues of these complex problems, while the detailed description of the conditions satisfying a particular alternation or a particular lexical quantification is the subject of a more detailed study.

  4. Optimized sequential extraction for carbonates : Quantification and δ13C analysis of calcite, dolomite and siderite

    Morera-Chavarría, A.; Griffioen, J.; Behrends, T.

    2016-01-01

    Siderite is present in diverse types of rocks and sediments, but its quantification is cumbersome when it is present at relatively low contents. A new analytical method for the sequential separation of different carbonate phases is presented. The separation, quantification and characterization of the carb

  5. Uncertainty Quantification in Aeroelasticity

    Beran, Philip; Stanford, Bret; Schrock, Christopher

    2017-01-01

    Physical interactions between a fluid and structure, potentially manifested as self-sustained or divergent oscillations, can be sensitive to many parameters whose values are uncertain. Of interest here are aircraft aeroelastic interactions, which must be accounted for in aircraft certification and design. Deterministic prediction of these aeroelastic behaviors can be difficult owing to physical and computational complexity. New challenges are introduced when physical parameters and elements of the modeling process are uncertain. By viewing aeroelasticity through a nondeterministic prism, where key quantities are assumed stochastic, one may gain insights into how to reduce system uncertainty, increase system robustness, and maintain aeroelastic safety. This article reviews uncertainty quantification in aeroelasticity using traditional analytical techniques not reliant on computational fluid dynamics; compares and contrasts this work with emerging methods based on computational fluid dynamics, which target richer physics; and reviews the state of the art in aeroelastic optimization under uncertainty. Barriers to continued progress, for example, the so-called curse of dimensionality, are discussed.

  6. Quantification of thermal damage in skin tissue

    Xu Feng; Wen Ting; Lu Tianjian; Seffen Keith

    2008-01-01

    Skin thermal damage, or skin burns, is the most commonly encountered type of trauma in civilian and military communities. In addition, advances in laser, microwave and similar technologies have led to recent developments of thermal treatments for disease and damage involving skin tissue, where the objective is to induce thermal damage precisely within targeted tissue structures without affecting the surrounding healthy tissue. Further, extended pain sensation induced by thermal damage has also posed great problems for burn patients. Thus, it is of great importance to quantify thermal damage in skin tissue. In this paper, the available models and experimental methods for quantification of thermal damage in skin tissue are discussed.

  7. Uncertainty quantification and stochastic modeling with Matlab

    Souza de Cursi, Eduardo

    2015-01-01

    Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no

  8. General Theories of Regulation

    Hertog, J.A. den

    1999-01-01

    This chapter makes a distinction between three types of theories of regulation: public interest theories, the Chicago theory of regulation and the public choice theories. The Chicago theory is mainly directed at the explanation of economic regulation; public interest theories and public choice theor

  9. Aharonov-Bohm Effect for Bound States on the Confinement of a Relativistic Scalar Particle to a Coulomb-Type Potential in Kaluza-Klein Theory

    E. V. B. Leite

    2015-01-01

    Full Text Available Based on the Kaluza-Klein theory, we study the Aharonov-Bohm effect for bound states for a relativistic scalar particle subject to a Coulomb-type potential. We introduce this scalar potential as a modification of the mass term of the Klein-Gordon equation, and a magnetic flux through the line element of the Minkowski spacetime in five dimensions. Then, we obtain the relativistic bound states solutions and calculate the persistent currents.

  10. The field and Killing spinor equations of M-theory and type IIA/IIB supergravity in coordinate-free notation

    Hamilton, M J D

    2016-01-01

    We review the actions of the supergravity theory in eleven dimensions as well as the type IIA and IIB supergravities in ten dimensions and derive the bosonic equations of motion in a coordinate-free notation. We also consider the existence of supersymmetries and the associated generalized Killing spinor equations. The aim of this note is to serve as a formulary and make the equations of supergravity more easily accessible to mathematicians.

  11. Curing theory of A_f-A_g type free radical polymerization (Ⅱ)——Characterization of network structural parameters

    王海军; 吕中元; 黄旭日; 李泽生; 唐敖庆

    1999-01-01

    By means of the polymer statistical theory, the A_f-A_g type nonlinear free radical polymerization is investigated to give the number of effective elastic chains, the number of effective elastic mers and the average length for the elastic chains. The corresponding quantities for the dangling chains, the number of effective cross-linkage and the modulus are also obtained. Furthermore, the number- and weight-fractions of elastic chains are deduced.

  12. Curing theory of A_f-A_g type free radical polymerization (Ⅲ)——The evaluation of network defects

    王海军; 巴信武; 赵敏; 李泽生

    2000-01-01

    Evaluation of defects in a polymer network is important for characterizing polymer materials, since networks always contain defects that affect their physical and chemical properties. Taking A_f-A_g type nonlinear free radical polymerization as an example, one type of defect, dangling loops in the gel network, is investigated by means of the statistical theory of polymeric reactions. The number of dangling loops and the probability of their formation are obtained by analyzing the polymer network structure in detail.

  13. Uncertainty Quantification in Climate Modeling

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis
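The Polynomial Chaos idea described above can be illustrated with a minimal, self-contained sketch: fit a low-degree probabilists'-Hermite chaos expansion to a handful of runs of a hypothetical "expensive" model of one standard-normal parameter, then read the output mean and variance directly off the coefficients. The model function and all numbers here are illustrative stand-ins, not the CLM.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermevander

# Hypothetical stand-in for an expensive model of one parameter xi ~ N(0, 1).
def model(xi):
    return np.exp(0.3 * xi)

# Fit a degree-3 probabilists'-Hermite chaos expansion from 8 sparse "runs".
xi_train = np.linspace(-2.5, 2.5, 8)
V = hermevander(xi_train, 3)                    # He_0..He_3 at training points
coef, *_ = np.linalg.lstsq(V, model(xi_train), rcond=None)

# Orthogonality under N(0,1): mean = c_0, variance = sum_{k>=1} k! * c_k^2.
mean_pce = coef[0]
var_pce = sum(math.factorial(k) * coef[k] ** 2 for k in range(1, 4))

# Cross-check against brute-force Monte Carlo on the model itself.
rng = np.random.default_rng(0)
samples = model(rng.standard_normal(100_000))
print(round(mean_pce, 3), round(samples.mean(), 3))
```

Once the coefficients are known, any further uncertainty propagation is a cheap algebraic operation on them rather than another batch of model runs, which is the point of the surrogate approach.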

  14. Quantification and Negation in Event Semantics

    Lucas Champollion

    2010-12-01

    Full Text Available Recently, it has been claimed that event semantics does not go well together with quantification, especially if one rejects syntactic, LF-based approaches to quantifier scope. This paper shows that such fears are unfounded, by presenting a simple, variable-free framework which combines a Neo-Davidsonian event semantics with a type-shifting based account of quantifier scope. The main innovation is that the event variable is bound inside the verbal denotation, rather than at sentence level by existential closure. Quantifiers can then be interpreted in situ. The resulting framework combines the strengths of event semantics and type-shifting accounts of quantifiers and thus does not force the semanticist to posit either a default underlying word order or a syntactic LF-style level. It is therefore well suited for applications to languages where word order is free and quantifier scope is determined by surface order. As an additional benefit, the system leads to a straightforward account of negation, which has also been claimed to be problematic for event-based frameworks.

  15. Quantification in dynamic and small-animal positron emission tomography

    Disselhorst, Johannes Antonius

    2011-01-01

    This thesis covers two aspects of positron emission tomography (PET) quantification. The first section addresses the characterization and optimization of a small-animal PET/CT scanner. The sensitivity and resolution as well as various parameters affecting image quality (reconstruction settings, type

  16. A Point-Wise Quantification of Asymmetry Using Deformation Fields

    Ólafsdóttir, Hildur; Lanche, Stephanie; Darvann, Tron Andre;

    2007-01-01

    sutures, which gives rise to a highly asymmetric growth. Quantification and localisation of this asymmetry is of high value with respect to surgery planning and treatment evaluation. Using the proposed method, asymmetry was calculated in each point of the surface of Crouzon mice and wild-type mice...

  17. Uncertainty Quantification in Hybrid Dynamical Systems

    Sahai, Tuhin

    2011-01-01

    Uncertainty quantification (UQ) techniques are frequently used to ascertain output variability in systems with parametric uncertainty. Traditional algorithms for UQ are either system-agnostic and slow (such as Monte Carlo) or fast with stringent assumptions on smoothness (such as polynomial chaos and Quasi-Monte Carlo). In this work, we develop a fast UQ approach for hybrid dynamical systems by extending the polynomial chaos methodology to these systems. To capture discontinuities, we use a wavelet-based Wiener-Haar expansion. We develop a boundary layer approach to propagate uncertainty through separable reset conditions. We also introduce a transport theory based approach for propagating uncertainty through hybrid dynamical systems. Here the expansion yields a set of hyperbolic equations that are solved by integrating along characteristics. The solution of the partial differential equation along the characteristics allows one to quantify uncertainty in hybrid or switching dynamical systems. The above method...

  18. Uncertainty quantification in hybrid dynamical systems

    Sahai, Tuhin; Pasini, José Miguel

    2013-03-01

    Uncertainty quantification (UQ) techniques are frequently used to ascertain output variability in systems with parametric uncertainty. Traditional algorithms for UQ are either system-agnostic and slow (such as Monte Carlo) or fast with stringent assumptions on smoothness (such as polynomial chaos and Quasi-Monte Carlo). In this work, we develop a fast UQ approach for hybrid dynamical systems by extending the polynomial chaos methodology to these systems. To capture discontinuities, we use a wavelet-based Wiener-Haar expansion. We develop a boundary layer approach to propagate uncertainty through separable reset conditions. We also introduce a transport theory based approach for propagating uncertainty through hybrid dynamical systems. Here the expansion yields a set of hyperbolic equations that are solved by integrating along characteristics. The solution of the partial differential equation along the characteristics allows one to quantify uncertainty in hybrid or switching dynamical systems. The above methods are demonstrated on example problems.
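The motivation for the Wiener-Haar basis over a global polynomial basis can be seen in a toy sketch: for an output with a jump, as a hybrid reset produces, a piecewise-constant (Haar scaling) approximation resolves the discontinuity far better than a smooth global polynomial of comparable size. The jump location and the functions below are hypothetical.

```python
import numpy as np

# Output of a hypothetical "hybrid" system: a reset introduces a unit jump.
f = lambda xi: np.where(xi < 0.3, np.sin(xi), np.sin(xi) + 1.0)

xi = np.linspace(0.0, 1.0, 1024, endpoint=False)
y = f(xi)

# Level-5 Haar (piecewise-constant) approximation: average over 32 dyadic bins.
level = 5
idx = (xi * 2**level).astype(int)
haar = np.array([y[idx == k].mean() for k in range(2**level)])[idx]

# A smooth global polynomial (chaos-like) approximation for comparison.
poly = np.polyval(np.polyfit(xi, y, 4), xi)

err_haar = np.sqrt(np.mean((y - haar) ** 2))   # localized basis handles the jump
err_poly = np.sqrt(np.mean((y - poly) ** 2))   # smooth basis smears it out
print(round(err_haar, 3), round(err_poly, 3))
```

The smooth basis spreads the error of the jump over the whole domain, while the localized basis confines it to the single cell containing the discontinuity.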

  19. Quantification of adipose tissue insulin sensitivity.

    Søndergaard, Esben; Jensen, Michael D

    2016-06-01

    In metabolically healthy humans, adipose tissue is exquisitely sensitive to insulin. Similar to muscle and liver, adipose tissue lipolysis is insulin resistant in adults with central obesity and type 2 diabetes. Perhaps uniquely, however, insulin resistance in adipose tissue may directly contribute to the development of insulin resistance in muscle and liver because of the increased delivery of free fatty acids to those tissues. It has been hypothesized that adipose tissue insulin resistance may precede other metabolic defects in obesity and type 2 diabetes. Therefore, precise and reproducible quantification of adipose tissue insulin sensitivity, in vivo, in humans, is an important measure. Unfortunately, no consensus exists on how to determine adipose tissue insulin sensitivity. We review the methods available to quantify adipose tissue insulin sensitivity and discuss their strengths and weaknesses.

  20. Micro-RNA quantification using DNA polymerase and pyrophosphate quantification.

    Yu, Hsiang-Ping; Hsiao, Yi-Ling; Pan, Hung-Yin; Huang, Chih-Hung; Hou, Shao-Yi

    2011-12-15

    A rapid quantification method for micro-RNA based on DNA polymerase activity and pyrophosphate quantification has been developed. The tested micro-RNA serves as the primer, unlike the DNA primer in all DNA sequencing methods, and the DNA probe serves as the template for DNA replication. After the DNA synthesis, pyrophosphate detection and quantification indicate the existence and quantity of the tested miRNA. Five femtomoles of the synthetic RNA could be detected. In 20-100 μg RNA samples purified from SiHa cells, the levels measured with the proposed assay were 0.34 fmol/μg RNA for hsa-miR-16 and 0.71 fmol/μg RNA for hsa-miR-21. This simple and inexpensive assay takes less than 5 min after total RNA purification and preparation. The quantification is not affected by the pre-miRNA, which cannot serve as the primer for DNA synthesis in this assay. The assay is general for the detection of a target RNA or DNA with a known matched DNA template probe, and could therefore be widely used for the detection of small RNA, messenger RNA, RNA viruses, and DNA.
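The per-microgram figures above follow from a simple normalization: total miRNA detected (in fmol, from the pyrophosphate calibration) divided by the mass of total RNA assayed. A hypothetical worked example, with made-up input numbers chosen to reproduce the hsa-miR-16 level:

```python
# Per-microgram normalization: total miRNA detected (fmol, from the
# pyrophosphate calibration) divided by the total-RNA input (micrograms).
# The 6.8 fmol / 20 ug figures below are hypothetical.
def mirna_per_ug(detected_fmol, total_rna_ug):
    return detected_fmol / total_rna_ug

print(mirna_per_ug(6.8, 20.0))  # about 0.34 fmol/ug RNA
```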

  1. MAMA Software Features: Quantification Verification Documentation-1

    Ruggiero, Christy E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Porter, Reid B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.

  2. Bianchi type-V cosmological models with perfect fluid and heat flow in Saez–Ballester theory

    Shri Ram; M Zeyauddin; C P Singh

    2009-02-01

    In this paper we discuss the variation law for Hubble's parameter with the average scale factor in a spatially homogeneous and anisotropic Bianchi type-V space-time model, which yields a constant value of the deceleration parameter. We derive two laws of variation of the average scale factor with cosmic time, one of power-law type and the other of exponential form. Exact solutions of the Einstein field equations with perfect fluid and heat conduction are obtained for Bianchi type-V space-time in these two cosmologies. In the power-law cosmology, the solutions correspond to a cosmological model that starts expanding from the singular state with a positive deceleration parameter. In the exponential cosmology, we present an accelerating non-singular model of the Universe. We find that a constant value of the deceleration parameter is reasonable for the present-day Universe and gives an appropriate description of its evolution. We also discuss the physical and kinematical behaviour of the models in these two cosmologies.
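The two behaviours can be checked numerically: for a power-law scale factor a(t) = t^n the deceleration parameter q = -a·ä/ȧ² equals (1-n)/n (positive, i.e. decelerating, for n < 1), while an exponential a(t) = e^{Ht} gives q = -1 (accelerating). A small finite-difference sketch with illustrative constants, not values taken from the paper:

```python
import numpy as np

# Deceleration parameter q = -a * a'' / a'^2 via central finite differences.
def deceleration(a, t, dt=1e-4):
    a0, ap, am = a(t), a(t + dt), a(t - dt)
    da = (ap - am) / (2 * dt)             # first derivative
    d2a = (ap - 2 * a0 + am) / dt**2      # second derivative
    return -a0 * d2a / da**2

n, H = 0.5, 0.7  # illustrative constants
q_power = deceleration(lambda t: t**n, 2.0)          # power law: q = (1-n)/n
q_exp = deceleration(lambda t: np.exp(H * t), 2.0)   # exponential: q = -1

print(round(q_power, 4), round(q_exp, 4))  # -> 1.0 -1.0
```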

  3. A recipe for EFT uncertainty quantification in nuclear physics

    Furnstahl, R. J.; Phillips, D. R.; Wesolowski, S.

    2015-03-01

    The application of effective field theory (EFT) methods to nuclear systems provides the opportunity to rigorously estimate the uncertainties originating in the nuclear Hamiltonian. Yet this is just one source of uncertainty in the observables predicted by calculations based on nuclear EFTs. We discuss the goals of uncertainty quantification in such calculations and outline a recipe to obtain statistically meaningful error bars for their predictions. We argue that the different sources of theory error can be accounted for within a Bayesian framework, as we illustrate using a toy model.
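A toy version of the Bayesian recipe described above, on a hypothetical one-parameter linear model with known noise and a conjugate Gaussian prior (all numbers illustrative, not the paper's toy model):

```python
import numpy as np

# Toy Bayesian fit of y = a * x with known noise sigma and a conjugate
# Gaussian prior on a; the posterior is Gaussian in closed form.
rng = np.random.default_rng(1)
x = np.linspace(0.1, 1.0, 10)
sigma = 0.05                                   # assumed-known noise level
y = 2.0 * x + rng.normal(0.0, sigma, x.size)   # synthetic data, true a = 2

prior_var = 10.0**2                            # broad prior a ~ N(0, 10^2)
post_var = 1.0 / (1.0 / prior_var + np.sum(x**2) / sigma**2)
post_mean = post_var * np.sum(x * y) / sigma**2

half = 1.96 * np.sqrt(post_var)                # 95% credible half-width
print(round(post_mean, 2), round(2 * half, 3))
```

In the EFT setting the same machinery applies, with the truncation error of the expansion entering as an additional, modeled contribution to the likelihood rather than being ignored.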

  4. Effects of gender, age, and diabetes duration on dietary self-care in adolescents with type 1 diabetes: a Self-Determination Theory perspective.

    Austin, Stéphanie; Senécal, Caroline; Guay, Frédéric; Nouwen, Arie

    2011-09-01

    This study tests a model derived from Self-Determination Theory (SDT) (Deci and Ryan, 2000) to explain the mechanisms by which non-modifiable factors influence dietary self-care in adolescents with type 1 diabetes (n = 289). SEM analyses adjusted for HbA1c levels revealed that longer diabetes duration and female gender were indicative of poorer dietary self-care. This effect was mediated by contextual and motivational factors as posited by SDT. Poorer autonomy support from practitioners was predominant in girls with longer diabetes duration. Perceived autonomous motivation and self-efficacy were indicative of greater autonomy support, and led to better dietary self-care.

  5. The Most General BPS Black Hole from Type II String Theory on a Six-Torus the Macroscopic-Microscopic Correspondence

    Bertolini, M.; Trigiante, M.

    2002-12-01

    (static, spherically symmetric) black holes within type II (M) theory compactified down to four dimensions on tori, and whose zero modes are described by N = 8 four dimensional supergravity. The U-duality group of this theory is U = E7(7) [6]. This class of black holes was shown to preserve 1/8 of the original supersymmetries and to be characterized by five U-duality invariants (one of which being the entropy) ...

  6. Effectiveness of a brief theory-based health promotion intervention among adults at high risk of type 2 diabetes

    Juul, Lise; Andersen, Vibeke Just; Arnoldsen, Jette;

    2016-01-01

    This study evaluates the effect of a brief group-based municipal course for people at high risk of developing type 2 diabetes. Participants were recruited via general practice and randomized to attend the course (intervention) or after 12 months (control). The effect was measured...

  7. A shear deformable theory of laminated composite shallow shell-type panels and their response analysis. II - Static response

    Khdeir, A. A.; Librescu, L.; Frederick, D.

    1989-01-01

    In the second part of this paper, by using the static counterparts of the governing equations derived in Librescu (1989), the static response of shallow composite shell-type panels subjected to a sinusoidal transverse load is investigated. The numerical applications, encompassing a large number of boundary conditions and various lamination schemes, allow one to obtain some conclusions which are formulated in the paper.

  8. Finding Balance : self-regulation in overweight patients with type 2 diabetes: from theory to a pilot intervention study

    Huisman, Sasja Deborah

    2008-01-01

    The central focus of this thesis was to examine the role of self-regulation principles in predicting and changing self-care behaviors of diabetes type 2 patients. Overall, the results in this thesis indicate that self-regulation cognitions and skills might be important intervention targets of future

  9. Absolute quantification of myocardial blood flow.

    Yoshinaga, Keiichiro; Manabe, Osamu; Tamaki, Nagara

    2016-07-21

    With the increasing availability of positron emission tomography (PET) myocardial perfusion imaging, the absolute quantification of myocardial blood flow (MBF) has become popular in clinical settings. Quantitative MBF provides an important additional diagnostic or prognostic information over conventional visual assessment. The success of MBF quantification using PET/computed tomography (CT) has increased the demand for this quantitative diagnostic approach to be more accessible. In this regard, MBF quantification approaches have been developed using several other diagnostic imaging modalities including single-photon emission computed tomography, CT, and cardiac magnetic resonance. This review will address the clinical aspects of PET MBF quantification and the new approaches to MBF quantification.

  10. An analysis of the EH-type linear system based on two-dimensional heterotic string theory

    魏益焕

    2011-01-01

    An analysis is made of the EH-type linear system based on two-dimensional heterotic string theory, as given by equations (2.25)-(2.31b) in reference [4]. The result shows that this system is equivalent to the EH-type linear system given by equations (2.17)-(2.24b) in reference [4].

  11. Self-consistent Bogoliubov-de Gennes theory of the vortex lattice state in a two-dimensional strongly type-II superconductor at high magnetic fields

    Zhuravlev, Vladimir; Duan, Wenye; Maniv, Tsofar

    2017-01-01

    A self-consistent Bogoliubov-de Gennes theory of the vortex lattice state in a 2D strong type-II superconductor at high magnetic fields reveals a novel quantum mixed state around the semiclassical Hc2, characterized by a well-defined Landau-Bloch band structure in the quasiparticle spectrum and a suppressed order-parameter amplitude, which crosses over sharply into the well-known semiclassical (Helfand-Werthamer) results upon decreasing magnetic field. Application to the 2D superconducting state observed recently on the surface of the topological insulator Sb2Te3 accounts well for the experimental data, revealing a strong type-II superconductor with unusually low carrier density and very small cyclotron mass, which can be realized only in the strong-coupling superconductor limit.

  12. A balance theory of peripheral corticotropin-releasing factor receptor type 1 and type 2 signaling to induce colonic contractions and visceral hyperalgesia in rats.

    Nozu, Tsukasa; Takakusaki, Kaoru; Okumura, Toshikatsu

    2014-12-01

    Several recent studies suggest that peripheral corticotropin-releasing factor (CRF) receptor type 1 (CRF1) and CRF2 have a counter regulatory action on gastrointestinal functions. We hypothesized that the activity balance of each CRF subtype signaling may determine the changes in colonic motility and visceral sensation. Colonic contractions were assessed by the perfused manometry, and contractions of colonic muscle strips were measured in vitro in rats. Visceromotor response was determined by measuring contractions of abdominal muscle in response to colorectal distensions (CRDs) (60 mm Hg for 10 min twice with a 30-min rest). All drugs were administered through ip route in in vivo studies. CRF increased colonic contractions. Pretreatment with astressin, a nonselective CRF antagonist, blocked the CRF-induced response, but astressin2-B, a selective CRF2 antagonist, enhanced the response by CRF. Cortagine, a selective CRF1 agonist, increased colonic contractions. In in vitro study, CRF increased contractions of muscle strips. Urocortin 2, a selective CRF2 agonist, itself did not alter the contractions but blocked this increased response by CRF. Visceromotor response to the second CRD was significantly higher than that of the first. Astressin blocked this CRD-induced sensitization, but astressin2-B or CRF did not affect it. Meanwhile, astressin2-B together with CRF significantly enhanced the sensitization. Urocortin 2 blocked, but cortagine significantly enhanced, the sensitization. These results indicated that peripheral CRF1 signaling enhanced colonic contractility and induced visceral sensitization, and these responses were modulated by peripheral CRF2 signaling. The activity balance of each subtype signaling may determine the colonic functions in response to stress.

  13. Semi-Group Theory for the Stokes Operator with Navier-Type Boundary Conditions on Lp-Spaces

    Al Baba, Hind; Amrouche, Chérif; Escobedo, Miguel

    2017-02-01

    In this article we consider the Stokes problem with Navier-type boundary conditions on a domain Ω, not necessarily simply connected. Since, under these conditions, the Stokes problem has a nontrivial kernel, we also study the solutions lying in the orthogonal complement of that kernel. We prove the analyticity of several semigroups generated by the Stokes operator considered in different functional spaces. We obtain strong, weak and very weak solutions for the time-dependent Stokes problem with the Navier-type boundary condition under different hypotheses on the initial data u0 and external force f. Then, we study the fractional and pure imaginary powers of several operators related with our Stokes operators. Using the fractional powers, we prove maximal regularity results for the homogeneous Stokes problem. On the other hand, using the boundedness of the pure imaginary powers, we deduce maximal Lp-Lq regularity for the inhomogeneous Stokes problem.

  14. Uncertainty Quantification for Optical Model Parameters

    Lovell, A E; Sarich, J; Wild, S M

    2016-01-01

    Although uncertainty quantification has been making its way into nuclear theory, these methods have yet to be explored in the context of reaction theory. For example, it is well known that different parameterizations of the optical potential can result in different cross sections, but these differences have not been systematically studied and quantified. The purpose of this work is to investigate the uncertainties in nuclear reactions that result from fitting a given model to elastic-scattering data, as well as to study how these uncertainties propagate to the inelastic and transfer channels. We use statistical methods to determine a best fit and create corresponding 95% confidence bands. A simple model of the process is fit to elastic-scattering data and used to predict either inelastic or transfer cross sections. In this initial work, we assume that our model is correct, and the only uncertainties come from the variation of the fit parameters. We study a number of reactions involving neutron and deuteron p...
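The fit-and-propagate idea can be sketched in miniature (this is not the authors' code): fit a hypothetical one-parameter "elastic" model to noisy data, then obtain a 95% band for a derived "transfer" prediction by bootstrap resampling of the fit residuals. Model shape, noise level, and the 0.3 scaling are all assumptions for illustration.

```python
import numpy as np

# Hypothetical "elastic" data: amplitude A = 5 times a known shape, plus noise.
rng = np.random.default_rng(42)
theta = np.linspace(0.2, 2.0, 15)
elastic = 5.0 * np.exp(-theta) + rng.normal(0.0, 0.1, theta.size)

def fit_amplitude(data):
    basis = np.exp(-theta)
    return np.sum(basis * data) / np.sum(basis**2)  # least squares for A

A_hat = fit_amplitude(elastic)
resid = elastic - A_hat * np.exp(-theta)
resid = resid - resid.mean()                        # center before resampling

# Bootstrap the fit, then propagate to a hypothetical "transfer" prediction
# 0.3 * A to obtain its 95% band.
A_boot = np.array([
    fit_amplitude(A_hat * np.exp(-theta)
                  + rng.choice(resid, resid.size, replace=True))
    for _ in range(2000)
])
lo, hi = np.percentile(0.3 * A_boot, [2.5, 97.5])
print(round(lo, 3), round(hi, 3))
```

The key point mirrored from the abstract: the band on the transfer prediction is driven entirely by the spread of the fitted parameter, since the model itself is assumed correct.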

  15. Information Theory Filters for Wavelet Packet Coefficient Selection with Application to Corrosion Type Identification from Acoustic Emission Signals

    Van Dijck, Gert; Van Hulle, Marc M.

    2011-01-01

    The damage caused by corrosion in chemical process installations can lead to unexpected plant shutdowns and the leakage of potentially toxic chemicals into the environment. When subjected to corrosion, structural changes in the material occur, leading to energy releases as acoustic waves. This acoustic activity can in turn be used for corrosion monitoring, and even for predicting the type of corrosion. Here we apply wavelet packet decomposition to extract features from acoustic emission signals. We then use the extracted wavelet packet coefficients for distinguishing between the most important types of corrosion processes in the chemical process industry: uniform corrosion, pitting and stress corrosion cracking. The local discriminant basis selection algorithm can be considered as a standard for the selection of the most discriminative wavelet coefficients. However, it does not take the statistical dependencies between wavelet coefficients into account. We show that, when these dependencies are ignored, a lower accuracy is obtained in predicting the corrosion type. We compare several mutual information filters to take these dependencies into account in order to arrive at a more accurate prediction. PMID:22163921
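The mutual-information filter at the heart of the comparison above can be sketched on toy data: estimate MI between a discretized feature and the class label from their joint histogram, so that an informative feature scores high and an irrelevant one scores zero. Feature values and labels below are illustrative, not acoustic-emission data.

```python
import numpy as np

# MI (in bits) between a discretized feature and a class label, estimated
# from the joint histogram.
def mutual_information(feature_bins, labels):
    joint = np.zeros((feature_bins.max() + 1, labels.max() + 1))
    for f, c in zip(feature_bins, labels):
        joint[f, c] += 1
    p = joint / joint.sum()
    pf = p.sum(axis=1, keepdims=True)        # feature marginal
    pc = p.sum(axis=0, keepdims=True)        # class marginal
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (pf @ pc)[nz])))

labels      = np.array([0, 0, 0, 0, 1, 1, 1, 1])
informative = np.array([0, 0, 0, 0, 1, 1, 1, 1])  # predicts the class exactly
irrelevant  = np.array([0, 1, 0, 1, 0, 1, 0, 1])  # independent of the class
print(mutual_information(informative, labels),    # -> 1.0
      mutual_information(irrelevant, labels))     # -> 0.0
```

Filters that also account for dependencies between already-selected coefficients go one step further than this marginal score, which is the point the abstract makes about the local discriminant basis algorithm.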

  16. Predictive Game Theory

    Wolpert, David H.

    2005-01-01

    Probability theory governs the outcome of a game: there is a distribution over mixed strategies, not a single "equilibrium". To predict a single mixed strategy one must use a loss function (external to the game's players). This provides a quantification of any strategy's rationality. We prove that rationality falls as the cost of computation rises (for players who have not previously interacted). All of this extends to games with varying numbers of players.

  17. Obtaining Good Performance With Triple-ζ-Type Basis Sets in Double-Hybrid Density Functional Theory Procedures.

    Chan, Bun; Radom, Leo

    2011-09-13

    A variety of combinations of B-LYP-based double-hybrid density functional theory (DHDFT) procedures and basis sets have been examined. A general observation is that the optimal combination of exchange contributions is in the proximity of 30% Becke 1988 (B88) exchange and 70% Hartree-Fock (HF) exchange, while for the correlation contributions, the use of independently optimized spin-component-scaled Møller-Plesset second-order perturbation theory (SCS-MP2) parameters (MP2OS and MP2SS) is beneficial. The triple-ζ Dunning aug'-cc-pVTZ+d and Pople 6-311+G(3df,2p)+d basis sets are found to be cost-effective for DHDFT methods. As a result, we have formulated the DuT-D3 DHDFT procedure, which employs the aug'-cc-pVTZ+d basis set and includes 30% B88 and 70% HF exchange energies, 59% LYP, 47% MP2OS, and 36% MP2SS correlation energies, and a D3 dispersion correction with the parameters s6 = 0.5, sr,6 = 1.569, and s8 = 0.35. Likewise, the PoT-D3 DHDFT procedure was formulated with the 6-311+G(3df,2p)+d basis set and has 32% B88 and 68% HF exchange energies, 63% LYP, 46% MP2OS, and 27% MP2SS correlation energies, and the D3 parameters s6 = 0.5, sr,6 = 1.569, and s8 = 0.30. Testing using the large E3 set of 740 energies demonstrates the robustness of these methods. Further comparisons show that the performance of these methods, particularly DuT-D3, compares favorably with the previously reported DSD-B-LYP and DSD-B-LYP-D3 methods used in conjunction with quadruple-ζ aug'-pc3+d and aug'-def2-QZVP basis sets but at lower computational expense. The previously reported ωB97X-(LP)/6-311++G(3df,3pd) procedure also performs very well. Our findings highlight the cost-effectiveness of appropriate- and moderate-sized triple-ζ basis sets in the application of DHDFT procedures.
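The DuT-D3 energy expression quoted above is a fixed linear combination of exchange, correlation, and dispersion components. A sketch of the bookkeeping, with hypothetical placeholder component energies in hartree (these are not outputs of a real calculation):

```python
# DuT-D3 total energy as the weighted sum quoted above. Component energies
# (hartree) are hypothetical placeholders.
def dut_d3_energy(E_B88x, E_HFx, E_LYPc, E_MP2OS, E_MP2SS, E_D3disp):
    return (0.30 * E_B88x + 0.70 * E_HFx        # 30% B88 + 70% HF exchange
            + 0.59 * E_LYPc                      # 59% LYP correlation
            + 0.47 * E_MP2OS + 0.36 * E_MP2SS    # scaled SCS-MP2 correlation
            + E_D3disp)                          # D3 dispersion correction

print(dut_d3_energy(-1.0, -1.1, -0.3, -0.2, -0.1, -0.01))
```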

  18. Matrix Theory

    1988-06-30

    MATRICES. The monograph Nonnegative Matrices [6] is an advanced book on all aspects of the theory of nonnegative matrices ... and on inverse eigenvalue problems for nonnegative matrices. The work explores some of the most recent developments in the theory of nonnegative matrices. Given integers t_1 < t_2 < ... < t_k, define the associated polynomial of type <z>: x^t - x^{t-t_2} - x^{t-t_3} - ... - x^{t-t_{k-1}}, where t = t_k.

  19. Generation of non-classical correlated photon pairs via a ladder-type atomic configuration: theory and experiment.

    Ding, Dong-Sheng; Zhou, Zhi-Yuan; Shi, Bao-Sen; Zou, Xu-Bo; Guo, Guang-Can

    2012-05-07

    We experimentally generate a non-classical correlated two-color photon pair at 780 and 1529.4 nm in a ladder-type configuration using a hot 85Rb atomic vapor, with a production rate of ~10^7/s. The non-classical correlation between these two photons is demonstrated by a strong violation of the Cauchy-Schwarz inequality by a factor of R = 48 ± 12. In addition, we experimentally investigate how the correlation depends on important experimental parameters such as the single-photon detuning and the pump powers. We also present a detailed theoretical analysis, whose predictions are in reasonable agreement with our experimental results.

  20. Statistical theory for hydrogen bonding fluid system of AaDd type (II): Properties of hydrogen bonding networks

    WANG HaiJun; HONG XiaoZhong; GU Fang; BA XinWu

    2007-01-01

    Making use of the invariant property of the equilibrium size distribution of the hydrogen-bonded clusters formed in a hydrogen bonding system of AaDd type, analytical expressions for the free energy in the pregel and postgel regimes are obtained. The gel free energy and the scaling behavior of the number of hydrogen bonds in the gel phase near the critical point are then investigated to give the corresponding scaling exponents and scaling law. Meanwhile, some properties of intermolecular and intramolecular hydrogen bonds in the system, and in the sol and gel phases, are discussed. As a result, an explicit relationship between the number of intramolecular hydrogen bonds and the hydrogen bonding degree is obtained.

  1. Rogers-Schur-Ramanujan Type Identities for the $M(p,p')$ minimal models of Conformal Field Theory

    Berkovich, A; Schilling, A

    1996-01-01

    We present and prove Rogers--Schur--Ramanujan (Bose/Fermi) type identities for the Virasoro characters of the minimal model $M(p,p').$ The proof uses the continued fraction decomposition of $p'/p$ introduced by Takahashi and Suzuki for the study of the Bethe's Ansatz equations of the XXZ model and gives a general method to construct polynomial generalizations of the fermionic form of the characters which satisfy the same recursion relations as the bosonic polynomials of Forrester and Baxter. We use this method to get fermionic representations of the characters $\\chi_{r,s}^{(p,p')}$ for many classes of $r$ and $s.$

  2. THE RELATIONSHIP BETWEEN HERZBERG’S MOTIVATION THEORY AND THE LEADERSHIP TYPES: AN APPLICATION IN BANKING SECTOR

    KILIÇ, Recep; Çoban, Mehmet

    2015-01-01

    This paper aims to explore the effects of leadership types on personal motivation in the banking sector. The methodology is survey-based. The survey group consists of bank employees in Bandırma, selected randomly. In this study, Herzberg's two factors, Motivation and Hygiene, were applied separately. First, the effects of leadership styles on motivation factors were tested; then the effects of leadership styles on hygiene factors were tested. Correlation analyz...

  3. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  4. Precise Quantification of Nanoparticle Internalization

    Gottstein, Claudia; Wu, Guohui; Wong, Benjamin J.; Zasadzinski, Joseph Anthony

    2013-01-01

    Nanoparticles have opened exciting new avenues for both diagnostic and therapeutic applications in human disease, and targeted nanoparticles are increasingly used as specific drug delivery vehicles. Precise quantification of nanoparticle internalization is important for measuring the impact of physical and chemical properties on the uptake of nanoparticles into target cells or into cells responsible for rapid clearance. Internalization of nanoparticles has been measured...

  5. Dunkl’s Theory and Best Approximation by Entire Functions of Exponential Type in L2-metric with Power Weight

    Yong Ping LIU; Chun Yuan SONG

    2014-01-01

    In this paper, we study the sharp Jackson inequality for the best approximation of $f \in L_{2,\kappa}(\mathbb{R}^d)$ by a subspace $E^2_\kappa(\sigma)$ ($SE^2_\kappa(\sigma)$), which is a subspace of entire functions of exponential type (spherical exponential type) at most $\sigma$. Here $L_{2,\kappa}(\mathbb{R}^d)$ denotes the space of all d-variate functions $f$ endowed with the L2-norm with the weight $v_\kappa(x)=\prod_{\xi\in R_+}|\langle \xi, x\rangle|^{2\kappa(\xi)}$, which is defined by a positive subsystem $R_+$ of a finite root system $R \subset \mathbb{R}^d$ and a function $\kappa(\xi): R \to \mathbb{R}_+$ invariant under the reflection group $G(R)$ generated by $R$. In the case $G(R)=\mathbb{Z}_2^d$, we get some exact results. Moreover, the deviation of best approximation by the subspace $E^2_\kappa(\sigma)$ ($SE^2_\kappa(\sigma)$) of some classes of smooth functions in the space $L_{2,\kappa}(\mathbb{R}^d)$ is obtained.

  6. The synthesis and characterization of Ag-N dual-doped p-type ZnO: experiment and theory.

    Duan, Li; Wang, Pei; Yu, Xiaochen; Han, Xiao; Chen, Yongnan; Zhao, Peng; Li, Donglin; Yao, Ran

    2014-03-07

    Ag-N dual-doped ZnO films have been fabricated by a chemical bath deposition method. The p-type conductivity of the dual-doped ZnO:(Ag, N) is stable over a long period of time, and the hole concentration in the ZnO:(Ag, N) is much higher than that in mono-doped ZnO:Ag or ZnO:N. We found that this is because AgZn-NO complex acceptors can be formed in ZnO:(Ag, N). First-principles calculations show that the complex acceptors generate a fully occupied band above the valence band maximum, so the acceptor levels become shallower and the hole concentration is increased. Furthermore, the binding energy of the Ag-N complex in ZnO is negative, so ZnO:(Ag, N) can be stable. These results indicate that Ag-N dual-doping may be a potential route to achieving high-quality p-type ZnO for use in a variety of devices.

  7. Review of Hydroelasticity Theories

    Chen, Xu-jun; Wu, You-sheng; Cui, Wei-cheng

    2006-01-01

    Existing hydroelastic theories are reviewed and classified into different types: two-dimensional linear theory, two-dimensional nonlinear theory, three-dimensional linear theory, and three-dimensional nonlinear theory. Applications to the analysis of very large floating structures (VLFS) are reviewed and discussed in detail. Special emphasis is placed on papers from China and Japan (in their native languages), as these papers are not generally known in the rest of the world.

  8. Statistical theory for hydrogen bonding fluid system of AaDd type (III): Equation of state and fluctuations

    WANG HaiJun; GU Fang; HONG XiaoZhong; BA XinWu

    2007-01-01

    The equation of state of a hydrogen bonding fluid system of AaDd type is studied using the principles of statistical mechanics. The influence of hydrogen bonds on the equation of state of the system is obtained from the change in volume due to hydrogen bonds. Moreover, the number density fluctuations of both molecules and hydrogen bonds, as well as their spatial correlation properties, are investigated. Furthermore, an equation relating the number density correlation function of "molecules-hydrogen bonds" to those of molecules and of hydrogen bonds is derived. As an application, taking the van der Waals hydrogen bonding fluid as an example, we consider the effect of hydrogen bonds on its relevant statistical properties.

  9. Statistical theory for hydrogen bonding fluid system of AaDd type (I): The geometrical phase transition

    WANG Haijun; HONG Xiaozhong; GU Fang; BA Xinwu

    2006-01-01

    The influence of hydrogen bonds on the physical and chemical properties of a hydrogen bonding fluid system of AaDd type is investigated from two viewpoints using the principles of statistical mechanics. Specifically, we propose two new ways to obtain the equilibrium size distribution of the hydrogen-bonded clusters, and derive an analytical expression relating the hydrogen bonding free energy to the hydrogen bonding degree. For nonlinear hydrogen bonding systems, it is shown that a sol-gel phase transition can take place under proper conditions, which is further proven to be a kind of geometrical phase transition rather than a thermodynamic one. Moreover, several problems associated with the geometrical phase transition and the liquid-solid phase transition in nonlinear hydrogen bonding systems are discussed.

  10. Effective field theory and Ab-initio calculation of p-type (Ga, Fe)N within LDA and SIC approximation

    Salmani, E. [LMPHE, associe au CNRST (URAC 12), Faculte des Sciences, Universite Mohammed V-Agdal, Rabat (Morocco); Mounkachi, O. [Institute of Nanomaterials and Nanotechnology, MAScIR, Rabat (Morocco); Ez-Zahraouy, H., E-mail: ezahamid@fsr.ac.ma [LMPHE, associe au CNRST (URAC 12), Faculte des Sciences, Universite Mohammed V-Agdal, Rabat (Morocco); El Kenz, A. [LMPHE, associe au CNRST (URAC 12), Faculte des Sciences, Universite Mohammed V-Agdal, Rabat (Morocco); Hamedoun, M. [Institute of Nanomaterials and Nanotechnology, MAScIR, Rabat (Morocco); Benyoussef, A. [LMPHE, associe au CNRST (URAC 12), Faculte des Sciences, Universite Mohammed V-Agdal, Rabat (Morocco); Institute of Nanomaterials and Nanotechnology, MAScIR, Rabat (Morocco); Hassan II Academy of Science and Technology, Rabat (Morocco)

    2013-03-15

    Based on first-principles spin-density functional calculations, using the Korringa-Kohn-Rostoker method combined with the coherent potential approximation, we investigated the half-metallic ferromagnetic behavior of (Ga, Fe)N co-doped with carbon within the self-interaction-corrected local density approximation. The mechanism of hybridization and interaction between magnetic ions in p-type (Ga, Fe)N is investigated. The stability energy of the ferromagnetic and disordered local moment states was calculated for different carbon concentrations. The local density and self-interaction-corrected approximations have been used to explain the strong ferromagnetic interaction observed and the mechanism that stabilizes this state. The transition temperature to the ferromagnetic state has been calculated within the effective field theory, with a Honmura-Kaneyoshi differential operator technique. - Highlights: • The paper focuses on the magnetic properties and electronic structure of p-type (Ga, Fe)N within the LDA and SIC approximations. • These methods allow us to explain the strong ferromagnetic interaction observed, the mechanism for its stability, and the mechanism of hybridization and interaction between magnetic ions in p-type (Ga, Fe)N. • The results obtained are interesting and can serve as a reference in the field of dilute magnetic semiconductors.

  11. Quantum theory of open systems based on stochastic differential equations of generalized Langevin (non-Wiener) type

    Basharov, A. M., E-mail: basharov@gmail.com [National Research Centre ' Kurchatov Institute,' (Russian Federation)

    2012-09-15

    It is shown that the effective Hamiltonian representation, as formulated in the author's papers, serves as a basis for distinguishing, in a broadband environment of an open quantum system, independent noise sources that determine, in terms of the stationary quantum Wiener and Poisson processes in the Markov approximation, the effective Hamiltonian and the equation for the evolution operator of the open system and its environment. General stochastic differential equations of generalized Langevin (non-Wiener) type for the evolution operator and the kinetic equation for the density matrix of an open system are obtained, which allow one to analyze the dynamics of a wide class of localized open systems in the Markov approximation. The main distinctive features of the dynamics of open quantum systems described in this way are the stabilization of excited states with respect to collective processes and an additional frequency shift of the spectrum of the open system. As an illustration of the general approach developed, the photon dynamics in a single-mode cavity without losses on the mirrors is considered, which contains identical intracavity atoms coupled to the external vacuum electromagnetic field. For some atomic densities, the photons of the cavity mode are 'locked' inside the cavity, thus exhibiting a new phenomenon of radiation trapping and non-Wiener dynamics.

  13. Quantification of three aconitine-type diester alkaloids in Xiaozhong tablets by UPLC-MS

    崔萍; 杨莉; 熊爱珍; 王峥涛; 胡春湘; 詹常森

    2012-01-01

    Objective: To establish a UPLC-MS method for simultaneous quantification of mesaconitine, aconitine and hypaconitine in Xiaozhong tablets. Methods: The separation was performed on an Agilent ZORBAX SB-C18 (100 mm × 2.1 mm, 1.8 μm) column with acetonitrile-0.1% formic acid (35:65) as the mobile phase. The column temperature was set at 35 °C with a flow rate of 0.3 mL·min⁻¹. AP-ESI in positive ion mode with selected ion monitoring (SIM) was used; ions at m/z 632, 646 and 616 were selected to quantify mesaconitine, aconitine and hypaconitine, respectively. Results: The linear ranges were 0.2250-510.0 ng·mL⁻¹ (r = 0.9996) for mesaconitine, 0.2500-500.0 ng·mL⁻¹ (r = 0.9996) for aconitine and 0.2540-508.0 ng·mL⁻¹ (r = 0.9996) for hypaconitine. The average recoveries (n = 9) were 95.1% for mesaconitine, 93.6% for aconitine and 93.0% for hypaconitine. Conclusion: The established method is convenient, rapid and accurate, and can be used to determine the three poisonous alkaloids in Xiaozhong tablets. The study may provide a scientific basis for the quality control of this preparation.
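    Two routine computations sit behind a validation like the one above: an ordinary least-squares calibration line (instrument response vs. concentration, summarized by the correlation coefficient r) and a spike-recovery percentage. The sketch below illustrates both with synthetic numbers, not the paper's data.

```python
# Sketch: calibration-curve fit and spike-recovery computation of the kind
# reported in LC-MS validation studies. All numeric data here are synthetic.

def linear_fit(x, y):
    """Ordinary least-squares slope, intercept, and correlation coefficient r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / (sxx * syy) ** 0.5
    return slope, intercept, r

def recovery_percent(measured, endogenous, spiked):
    """Spike recovery: (found - endogenous) / amount added * 100."""
    return (measured - endogenous) / spiked * 100.0

# Synthetic calibration points: concentration (ng/mL) vs. peak area
conc = [0.25, 1.0, 10.0, 100.0, 500.0]
area = [51.0, 198.0, 2010.0, 19850.0, 100100.0]
slope, intercept, r = linear_fit(conc, area)
print(round(r, 4))  # near 1 for an acceptably linear method
print(round(recovery_percent(measured=190.2, endogenous=100.0, spiked=95.0), 1))
```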

  14. Properties of rectified averaging of an evoked-type signal: theory and application to the vestibular-evoked myogenic potential.

    Colebatch, J G

    2009-11-01

    The properties of rectified averages were investigated using the VEMP (vestibular-evoked myogenic potential) as an example of an evoked-type response. Recordings were made of surface EMG from the sternocleidomastoid (SCM) muscles of six volunteers, unstimulated, at different levels of tonic activation and then in response to clicks of different intensities. The stochastic properties of the surface EMG recorded were shown to be well modelled using a zero mean normal distribution with a standard deviation equivalent to the mean RMS (root mean squared) value (mean residual error variance 0.87%). Assuming a normal distribution, equations were derived for the expected value of both the rectified and RMS average with the addition of constant waveforms of different sizes. A simulation using recorded EMG and added sine waves of different amplitudes demonstrated that the equations predicted the rectified averages accurately. It also confirmed the importance of the relative amplitude of the added signal in determining whether it was detected using rectified averages. The same equations were then applied to actual data consisting of VEMPs of different relative amplitudes recorded from the volunteers. Whilst the signal-to-noise ratio (measured by corrected amplitude) was a major determinant of the nature of the rectified average, consistent deviations were detected between the predicted and actual rectified averages. Deviations from predicted values indicated that the VEMP did not behave simply like a constant signal added to tonic background EMG. A more complicated model, which included temporal jitter as well as inhibition of background EMG during the VEMP, was required to fit the physiological recordings. Rectified averages are sensitive to physiological properties, which are not apparent when using unrectified averages alone. Awareness of the properties of rectified averages should improve their interpretation.
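    The "constant signal plus zero-mean normal background" model described above has a closed-form expected rectified value: the mean of a folded normal distribution. The sketch below is a generic illustration of that calculation with a Monte Carlo check, not the paper's exact equations or data.

```python
import math
import random

# Sketch: expected value of full-wave-rectified EMG modelled as a constant
# signal s added to zero-mean Gaussian noise of SD sigma. E|X| for
# X ~ N(s, sigma^2) is the folded-normal mean. Generic illustration only.

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def expected_rectified(s, sigma):
    """Folded-normal mean: E|X|, X ~ N(s, sigma^2)."""
    return (sigma * math.sqrt(2.0 / math.pi) * math.exp(-s * s / (2 * sigma * sigma))
            + s * (1.0 - 2.0 * phi(-s / sigma)))

# Monte Carlo check of the closed form:
random.seed(1)
s, sigma = 0.5, 1.0
mc = sum(abs(random.gauss(s, sigma)) for _ in range(200_000)) / 200_000
print(round(expected_rectified(s, sigma), 3), round(mc, 3))
```

    With s = 0 this reduces to the familiar rectified-noise mean sigma·sqrt(2/pi), which is why the relative amplitude of the added signal governs its detectability in rectified averages.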

  15. Toward a theory of distinct types of "impulsive" behaviors: A meta-analysis of self-report and behavioral measures.

    Sharma, Leigh; Markon, Kristian E; Clark, Lee Anna

    2014-03-01

    Impulsivity is considered a personality trait affecting behavior in many life domains, from recreational activities to important decision making. When extreme, it is associated with mental health problems, such as substance use disorders, as well as with interpersonal and social difficulties, including juvenile delinquency and criminality. Yet, trait impulsivity may not be a unitary construct. We review commonly used self-report measures of personality trait impulsivity and related constructs (e.g., sensation seeking), plus the opposite pole, control or constraint. A meta-analytic principal-components factor analysis demonstrated that these scales comprise 3 distinct factors, each of which aligns with a broad, higher order personality factor-Neuroticism/Negative Emotionality, Disinhibition versus Constraint/Conscientiousness, and Extraversion/Positive Emotionality/Sensation Seeking. Moreover, Disinhibition versus Constraint/Conscientiousness comprise 2 correlated but distinct subfactors: Disinhibition versus Constraint and Conscientiousness/Will versus Resourcelessness. We also review laboratory tasks that purport to measure a construct similar to trait impulsivity. A meta-analytic principal-components factor analysis demonstrated that these tasks constitute 4 factors (Inattention, Inhibition, Impulsive Decision-Making, and Shifting). Although relations between these 2 measurement models are consistently low to very low, relations between both trait scales and laboratory behavioral tasks and daily-life impulsive behaviors are moderate. That is, both independently predict problematic daily-life impulsive behaviors, such as substance use, gambling, and delinquency; their joint use has incremental predictive power over the use of either type of measure alone and furthers our understanding of these important, problematic behaviors. Future use of confirmatory methods should help to ascertain with greater precision the number of and relations between impulsivity

  16. Magnetism in olivine-type LiCo(1-x)Fe(x)PO4 cathode materials: bridging theory and experiment.

    Singh, Vijay; Gershinsky, Yelena; Kosa, Monica; Dixit, Mudit; Zitoun, David; Major, Dan Thomas

    2015-12-14

    In the current paper, we present a non-aqueous sol-gel synthesis of olivine type LiCo1-xFexPO4 compounds (x = 0.00, 0.25, 0.50, 0.75, 1.00). The magnetic properties of the olivines are measured experimentally and calculated using first-principles theory. Specifically, the electronic and magnetic properties are studied in detail with standard density functional theory (DFT), as well as by including spin-orbit coupling (SOC), which couples the spin to the crystal structure. We find that the Co(2+) ions exhibit strong orbital moment in the pure LiCoPO4 system, which is partially quenched upon substitution of Co(2+) by Fe(2+). Interestingly, we also observe a non-negligible orbital moment on the Fe(2+) ion. We underscore that the inclusion of SOC in the calculations is essential to obtain qualitative agreement with the observed effective magnetic moments. Additionally, Wannier functions were used to understand the experimentally observed rising trend in the Néel temperature, which is directly related to the magnetic exchange interaction paths in the materials. We suggest that out of layer M-O-P-O-M magnetic interactions (J⊥) are present in the studied materials. The current findings shed light on important differences observed in the electrochemistry of the cathode material LiCoPO4 compared to the already mature olivine material LiFePO4.

  17. The role of the $l_1$-norm in quantum information theory and two types of the Yang-Baxter equation

    Niu Kai; Ge Mollin [Theoretical Physics Section, Chern Institute of Mathematics, Nankai University, Tianjin 300071 (China); Xue Kang [Department of Physics, Northeast Normal University, Changchun, Ji Lin 120024 (China); Zhao Qing, E-mail: nkniukai@gmail.com, E-mail: geml@nankai.edu.cn [Department of Physics, College of Science, Beijing Institute of Technology, Beijing 100081 (China)

    2011-07-01

    The role of the $l_1$-norm in the Yang-Baxter system has been studied through Wigner's D-functions, where the $l_1$-norm means $\sum_i |C_i|$ for $|\Psi\rangle = \sum_i C_i |\psi_i\rangle$, with $|\psi_i\rangle$ being the orthonormal basis. It is shown that the existing two types of braiding matrices, which can be viewed as particular solutions of the Yang-Baxter equation (YBE) with different spectral parameters, can be unified in the 2D YBE. We prove that the maximum of the $l_1$-norm is connected with the maximally entangled states and topological quantum field theory with two-component anyons, while the minimum leads to the deformed permutation related to the familiar integrable models.

  19. Quantification of Information in a One-Way Plant-to-Animal Communication System

    Laurance R. Doyle

    2009-08-01

    In order to demonstrate possible broader applications of information theory to the quantification of non-human communication systems, we apply calculations of information entropy to a simple chemical communication from the cotton plant (Gossypium hirsutum) to the wasp (Cardiochiles nigriceps) studied by DeMoraes et al. The purpose of this chemical communication from cotton plants to wasps is presumed to be to allow the predatory wasp to more easily locate its preferred prey, one of two types of parasitic herbivores feeding on the cotton plants. By signalling which plant-eating herbivore is feeding on them, the cotton plants preferentially attract the wasps to those individual plants. We interpret the emission of nine chemicals by the plants as individual signal differences (depending on the herbivore type) to be detected by the wasps, constituting a nine-signal one-way communication system across kingdoms (from the kingdom Plantae to the kingdom Animalia). We use fractional differences in the chemical abundances (emitted as a result of the two herbivore types) to calculate the Shannon information entropic measures (marginal, joint, and mutual entropies, as well as the ambiguity, etc.) of the transmitted message. We then compare these results with the subsequent behavior of the wasps (calculating the equivocation in the message reception) for possible insights into the history and actual working of this one-way communication system.
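    The entropy bookkeeping used in such an analysis is standard Shannon machinery. The sketch below computes marginal, joint, and mutual entropies and the equivocation for a discrete signal/receiver pair; the joint distribution (2 herbivore states × 2 wasp responses) is invented purely for illustration.

```python
import math

# Sketch: Shannon entropy accounting for a discrete one-way channel, of the
# kind applied to the plant-to-wasp chemical signals. The joint probabilities
# below are hypothetical, not the study's measurements.

def H(probs):
    """Shannon entropy in bits, skipping zero-probability cells."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution P(herbivore state, wasp response):
joint = {("herbivore_A", "approach"): 0.40, ("herbivore_A", "ignore"): 0.10,
         ("herbivore_B", "approach"): 0.15, ("herbivore_B", "ignore"): 0.35}

px, py = {}, {}  # marginals over herbivore state and wasp response
for (x, y), p in joint.items():
    px[x] = px.get(x, 0) + p
    py[y] = py.get(y, 0) + p

Hx, Hy = H(px.values()), H(py.values())
Hxy = H(joint.values())
mutual = Hx + Hy - Hxy       # mutual information I(X; Y)
equivocation = Hx - mutual   # H(X | Y): sender uncertainty left after reception
print(round(mutual, 3), round(equivocation, 3))
```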

  20. Two symmetric n-type interfaces SrTiO3/LaAlO3 in perovskite: Electronic properties from density functional theory

    Reshak, A. H.; Abu-Jafar, M. S.; Al-Douri, Y.

    2016-06-01

    The first principles study of the (001) two symmetric n-type interfaces between two insulating perovskites, the nonpolar SrTiO3 (STO) and the polar LaAlO3 (LAO), was performed. We have analyzed the formation of metallic interface states between the STO and LAO heterointerfaces by using the all-electron full-potential linearized augmented plane-wave approach based on density functional theory, within the local density approximation, the Perdew-Burke-Ernzerhof generalized gradient approximation (PBE-GGA), and the Engel-Vosko GGA (EVGGA) formalism. It has been found that some bands cross the Fermi energy level (EF), giving a metallic character to the two symmetric n-type 6.5STO/1.5LAO interfaces, with a density of states at EF, N(EF), of about 3.56 states/eV/unit cell and a bare electronic specific heat coefficient (γ) of about 0.62 mJ/(mol cell K2). The electronic band structure and the partial density of states in the vicinity of EF originate mainly from Ti1,2,3,4-3dxy orbitals. These bands are responsible for the metallic behavior and form the Fermi surface of the two symmetric n-type 6.5STO/1.5LAO interfaces. To obtain a clear map of the valence band electronic charge density distribution of the two symmetric n-type 6.5STO/1.5LAO interfaces, we have investigated the nature of the bonds and the interactions between the atoms. The analysis reveals that charge is attracted towards the O atoms: the O atoms are surrounded by uniform blue spheres, indicating maximum charge accumulation.

  1. Aerobic physical activity and resistance training: an application of the theory of planned behavior among adults with type 2 diabetes in a random, national sample of Canadians

    Karunamuni Nandini

    2008-12-01

    Abstract. Background: Aerobic physical activity (PA) and resistance training are paramount in the treatment and management of type 2 diabetes (T2D), but few studies have examined the determinants of both types of exercise in the same sample. Objective: The primary purpose was to investigate the utility of the Theory of Planned Behavior (TPB) in explaining aerobic PA and resistance training in a population sample of adults with T2D. Methods: A total of 244 individuals were recruited through a random national sample, created by generating a random list of household phone numbers proportionate to the actual number of household telephone numbers for each Canadian province (with the exception of Quebec). These individuals completed self-report TPB constructs of attitude, subjective norm, perceived behavioral control and intention, and a 3-month follow-up that assessed aerobic PA and resistance training. Results: The TPB explained 10% and 8% of the variance in aerobic PA and resistance training, respectively, and accounted for 39% and 45% of the variance in aerobic PA and resistance training intentions, respectively. Conclusion: These results may guide the development of appropriate PA interventions for aerobic PA and resistance training based on the TPB.
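    "Variance explained" figures like the 10%/8% and 39%/45% above are squared multiple correlations (R²) from regression models. The sketch below shows the underlying computation for a single predictor with synthetic data; it is not the study's data or model.

```python
# Sketch: R^2 ("variance explained") as reported in regression-based TPB
# studies. Observed and model-predicted scores below are synthetic.

def r_squared(y, y_hat):
    """R^2 = 1 - SS_residual / SS_total."""
    mean_y = sum(y) / len(y)
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, y_hat))
    return 1.0 - ss_res / ss_tot

# Hypothetical intention scores vs. model predictions from TPB constructs:
observed = [2.0, 3.0, 4.0, 5.0, 6.0, 7.0]
predicted = [2.5, 2.8, 4.2, 4.8, 6.3, 6.4]
print(round(r_squared(observed, predicted), 3))
```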

  2. Quantification of the contribution of GLP-1 to mediating insulinotropic effects of DPP-4 inhibition with vildagliptin in healthy subjects and type 2-diabetic patients using exendin [9-39] as a GLP-1 receptor antagonist

    Nauck, Michael A; Kind, J; Köthe, Lars D;

    2016-01-01

    under the curve (AUCs) of integrated insulin secretion rates (total AUC(ISR)) and glucose (total AUC(glucose)) over 4 h after the meal. Vildagliptin treatment more than doubled the responses of intact GLP-1 and glucose-dependent insulinotropic polypeptide and lowered glucose responses without changing the AUC(ISR)/AUC(glucose) ratio in healthy subjects. Vildagliptin significantly increased this ratio by 10.5% in patients with type 2 diabetes, and exendin [9-39] reduced it (both P ...). The ... AUC(ISR)/AUC(glucose) ratio achieved with exendin [9-39] was significantly smaller after vildagliptin treatment than...

  3. Emerging themes on information theory and Bayesian approach

    Lei XU; Yanda LI

    2010-01-01

    Though efforts on the quantification of information started several decades earlier, the foundations of information theoretic studies were laid during the middle and late 1940s, from two perspectives that were both based on probability theory.

  4. Predicting Noninsulin Antidiabetic Drug Adherence Using a Theoretical Framework Based on the Theory of Planned Behavior in Adults With Type 2 Diabetes: A Prospective Study.

    Zomahoun, Hervé Tchala Vignon; Moisan, Jocelyne; Lauzier, Sophie; Guillaumie, Laurence; Grégoire, Jean-Pierre; Guénette, Line

    2016-04-01

    Understanding the process behind noninsulin antidiabetic drug (NIAD) nonadherence is necessary for designing effective interventions to resolve this problem. This study aimed to explore the ability of the theory of planned behavior (TPB), which is known as a good predictor of behaviors, to predict the future NIAD adherence in adults with type 2 diabetes. We conducted a prospective study of adults with type 2 diabetes. They completed a questionnaire on TPB variables and external variables. Linear regression was used to explore the TPB's ability to predict future NIAD adherence, which was prospectively measured as the proportion of days covered by at least 1 NIAD using pharmacy claims data. The interaction between past NIAD adherence and intention was tested. The sample included 340 people. There was an interaction between past NIAD adherence and intention to adhere to the NIAD (P = 0.032). Intention did not predict future NIAD adherence in the past adherers and nonadherers groups, but its association measure was high among past nonadherers (β = 5.686, 95% confidence interval [CI] -10.174, 21.546). In contrast, intention was mainly predicted by perceived behavioral control both in the past adherers (β = 0.900, 95% CI 0.796, 1.004) and nonadherers groups (β = 0.760, 95% CI 0.555, 0.966). The present study suggests that TPB is a good tool to predict intention to adhere and future NIAD adherence. However, there was a gap between intention to adhere and actual adherence to the NIAD, which is partly explained by the past adherence level in adults with type 2 diabetes.

  5. A pH and solvent optimized reverse-phase ion-paring-LC–MS/MS method that leverages multiple scan-types for targeted absolute quantification of intracellular metabolites

    McCloskey, Douglas; Gangoiti, Jon A.; Palsson, Bernhard O.;

    2015-01-01

    to the understanding of intracellular metabolism. Liquid chromatography coupled to mass spectrometry (LC–MS and LC–MS/MS) has become a reliable means with which to quantify a multitude of intracellular metabolites in parallel with great specificity and accuracy. This work details a method that builds and extends upon...... existing reverse phase ion-paring liquid chromatography methods for separation and detection of polar and anionic compounds that comprise key nodes of intracellular metabolism by optimizing pH and solvent composition. In addition, the presented method utilizes multiple scan types provided by hybrid...... instrumentation to improve confidence in compound identification. The developed method was validated for a broad coverage of polar and anionic metabolites of intracellular metabolism...

  6. Comparison of five DNA quantification methods

    Nielsen, Karsten; Mogensen, Helle Smidt; Hedman, Johannes;

    2008-01-01

    Six commercial preparations of human genomic DNA were quantified using five quantification methods: UV spectrometry, SYBR-Green dye staining, slot blot hybridization with the probe D17Z1, Quantifiler Human DNA Quantification kit and RB1 rt-PCR. All methods measured higher DNA concentrations than ...

  7. Detection and Quantification of Neurotransmitters in Dialysates

    Zapata, Agustin; Chefer, Vladimir I.; Shippenberg, Toni S.; Denoroy, Luc

    2009-01-01

    Sensitive analytical methods are needed for the separation and quantification of neurotransmitters obtained in microdialysate studies. This unit describes methods that permit quantification of nanomolar concentrations of monoamines and their metabolites (high-pressure liquid chromatography electrochemical detection), acetylcholine (HPLC-coupled to an enzyme reactor), and amino acids (HPLC-fluorescence detection; capillary electrophoresis with laser-induced fluorescence detection).

  8. Protocol for Quantification of Defects in Natural Fibres for Composites

    Mortensen, Ulrich Andreas; Madsen, Bo

    2014-01-01

    Natural bast-type plant fibres are attracting increasing interest for being used for structural composite applications where high quality fibres with good mechanical properties are required. A protocol for the quantification of defects in natural fibres is presented. The protocol is based...... of defect size by width, and it is shown that both definitions can be used to give unbiased findings for the comparison between fibre types. Finally, considerations are given with respect to true measures of defect content, number of determinations, and number of significant figures used for the descriptive...

  9. Gauge theory and little gauge theory

    Koizumi, Kozo

    2016-01-01

    The gauge theory is the most important type of field theory, in which the interactions of the elementary particles are described by the exchange of the gauge bosons. In this article, the gauge theory is reexamined as geometry of the vector space, and a new concept of "little gauge theory" is introduced. A key peculiarity of the little gauge theory is that the theory is able to give a restriction on the form of the connection field. Based on the little gauge theory, Cartan geometry, a charged boson and the Dirac fermion field theory are investigated. In particular, the Dirac fermion field theory leads to an extension of Sogami's covariant derivative. Higgs bosons are interpreted as being included in the new fields introduced in this article.

  10. Rapid and specific high-performance liquid chromatography for the in vitro quantification of D-Lys6-GnRH in a microemulsion-type formulation in the presence of peptide oxidation products.

    Kafka, Alexandra P; Rades, Thomas; McDowell, Arlene

    2010-02-01

    A high-performance liquid chromatography (HPLC) method for the assay of D-Lys6-GnRH contained in a microemulsion-type formulation is described. The peptide is extracted from the microemulsion matrix and quantified using a two-step gradient method. Separation from microemulsion compounds and potential peptide oxidation products was achieved on a Jupiter C18 column at 40 °C, using a gradient of 10-35% CH3CN for peptide elution. The correlation of peak intensity measured at 220 nm and peptide concentration was linear over the range 2.5-60 µg/mL, with a correlation coefficient of 0.9997 and a y-intercept not significantly different from zero (p > 0.05). Intraday and interday variability of the assay was less than 5% for multiple injections of samples containing 7.5, 30 and 60 µg/mL. The lower limit of quantitation was calculated to be 0.38 µg/mL, and the lower limit of detection was 0.13 µg/mL. The assay was applied to samples that were stressed under physiological conditions (37 °C, pH 7.4) over 4 days. Three degradation peaks were well resolved from the parent peptide, demonstrating the selectivity of the assay. Off-line MALDI-TOF mass spectrometry was applied to identify these degradation species as oxidation products of the peptide.
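    Limits of quantitation and detection such as those reported above are typically derived from the calibration curve; a common ICH-style estimate is LOD = 3.3σ/S and LOQ = 10σ/S, where S is the calibration slope and σ the residual standard deviation of the fit. The calibration data below are invented for illustration; only the 2.5-60 µg/mL linear range is taken from the abstract.

```python
import numpy as np

# Hypothetical calibration points over the reported 2.5-60 ug/mL linear range.
conc = np.array([2.5, 7.5, 15.0, 30.0, 45.0, 60.0])               # ug/mL
peak_area = np.array([51.0, 152.0, 301.0, 603.0, 898.0, 1203.0])  # arbitrary units

# Least-squares calibration line: area = slope * conc + intercept.
slope, intercept = np.polyfit(conc, peak_area, 1)

# Residual standard deviation of the fit, used as the noise estimate sigma.
residuals = peak_area - (slope * conc + intercept)
sigma = np.sqrt(np.sum(residuals**2) / (len(conc) - 2))

# ICH-style detection and quantitation limits.
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"slope={slope:.2f}, LOD={lod:.2f} ug/mL, LOQ={loq:.2f} ug/mL")
```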

  11. Accessible quantification of multiparticle entanglement

    Cianciaruso, Marco; Adesso, Gerardo

    2015-01-01

    Entanglement is a key ingredient for quantum technologies and a fundamental signature of quantumness in a broad range of phenomena encompassing many-body physics, thermodynamics, cosmology, and life sciences. For arbitrary multiparticle systems, the quantification of entanglement typically involves hard optimisation problems, and requires demanding tomographical techniques. In this paper we show that such difficulties can be overcome by developing an experimentally friendly method to evaluate measures of multiparticle entanglement via a geometric approach. The method provides exact analytical results for a relevant class of mixed states of $N$ qubits, and computable lower bounds to entanglement for any general state. For practical purposes, the entanglement determination requires local measurements in just three settings for any $N$. We demonstrate the power of our approach to quantify multiparticle entanglement in $N$-qubit bound entangled states and other states recently engineered in laboratory using quant...

  12. On the Types, Market, and Concept of Health Tourism

    李东

    2016-01-01

    This paper analyzes the historical development of health tourism, arguing that although domestic theoretical research on health tourism began relatively late, its practical exploration has a long history and deep traditional roots. Developing health tourism is both a historical inheritance and an ongoing exploration and innovation in people's pursuit of health, wellness, and longevity. From the perspective of tourist demand, health tourism can be divided into two types: active pursuit and passive pursuit. Analyzing the health tourism market, the paper regards health tourism as a "positive way of life" with broad market prospects. Finally, it argues that health tourism is a comprehensive concept, a scientific activity and a cultural experience that takes diverse forms, providing favorable conditions for the development of health tourism products.

  13. Toward a chemical mechanism of proton pumping by the B-type cytochrome c oxidases: application of density functional theory to cytochrome ba3 of Thermus thermophilus.

    Fee, James A; Case, David A; Noodleman, Louis

    2008-11-12

    A mechanism for proton pumping by the B-type cytochrome c oxidases is presented in which one proton is pumped in conjunction with the weakly exergonic, two-electron reduction of Fe-bound O2 to the Fe-Cu bridging peroxodianion, and three protons are pumped in conjunction with the highly exergonic, two-electron reduction of Fe(III)-(-)O-O(-)-Cu(II) to form water and the active oxidized enzyme, Fe(III)-(-)OH,Cu(II). The scheme is based on the active-site structure of cytochrome ba3 from Thermus thermophilus, which is considered to be both necessary and sufficient for coupled O2 reduction and proton pumping when appropriate gates are in place (not included in the model). Fourteen detailed structures obtained from density functional theory (DFT) geometry optimization are presented that are reasonably thought to occur during the four-electron reduction of O2. Each proton-pumping step takes place when a proton resides on the imidazole ring of I-His376 and the large active-site cluster has a net charge of +1 due to an uncompensated, positive charge formally associated with CuB. Four types of DFT were applied to determine the energy of each intermediate, and standard thermochemical approaches were used to obtain the reaction free energies for each step in the catalytic cycle. This application of DFT generally conforms with previously suggested criteria for a valid model (Siegbahn, P. E. M.; Blomberg, M. A. R. Chem. Rev. 2000, 100, 421-437) and shows how the chemistry of O2 reduction in the heme a3-CuB dinuclear center can be harnessed to generate an electrochemical proton gradient across the lipid bilayer.

  14. Multicenter comparison of Roche COBAS AMPLICOR MONITOR version 1.5, Organon Teknika NucliSens QT with Extractor, and Bayer Quantiplex version 3.0 for quantification of human immunodeficiency virus type 1 RNA in plasma.

    Murphy, D G; Côté, L; Fauvel, M; René, P; Vincelette, J

    2000-11-01

    The performance and characteristics of Roche COBAS AMPLICOR HIV-1 MONITOR version 1.5 (CA MONITOR 1.5) UltraSensitive (usCA MONITOR 1.5) and Standard (stCA MONITOR 1.5) procedures, Organon Teknika NucliSens HIV-1 RNA QT with Extractor (NucliSens), and Bayer Quantiplex HIV RNA version 3.0 (bDNA 3.0) were compared in a multicenter trial. Samples used in this study included 460 plasma specimens from human immunodeficiency virus (HIV) type 1 (HIV-1)-infected persons, 100 plasma specimens from HIV antibody (anti-HIV)-negative persons, and culture supernatants of HIV-1 subtype A to E isolates diluted in anti-HIV-negative plasma. Overall, bDNA 3.0 showed the least variation in RNA measures upon repeat testing. For the Roche assays, usCA MONITOR 1.5 displayed less variation in RNA measures than stCA MONITOR 1.5. NucliSens, at an input volume of 2 ml, showed the best sensitivity. Deming regression analysis indicated that the results of all three assays were significantly correlated (P < 0.0001). However, the mean difference in values between CA MONITOR 1.5 and bDNA 3.0 (0.274 log10 RNA copies/ml; 95% confidence interval, 0.192 to 0.356) was significantly different from 0, indicating that CA MONITOR 1.5 values were regularly higher than bDNA 3.0 values. Upon testing of 100 anti-HIV-negative plasma specimens, usCA MONITOR 1.5 and NucliSens displayed 100% specificity, while bDNA 3.0 showed 98% specificity. NucliSens quantified 2 of 10 non-subtype B viral isolates at 1 log10 lower than both CA MONITOR 1.5 and bDNA 3.0. For NucliSens, testing of specimens with greater than 1,000 RNA copies/ml at input volumes of 0.1, 0.2, and 2.0 ml did not affect the quality of results. Additional factors differing between assays included specimen throughput and volume requirements, limit of detection, ease of execution, instrument work space, and costs of disposal. These characteristics, along with assay performance, should be considered when one is selecting a viral load assay.

  15. ["A Little Bit of Switzerland, a Little Bit of Kosovo". Swiss Immigrants from Former Yugoslavia with Type 2 Diabetes. A Qualitative Study' in Analogy to Grounded Theory].

    Wenger, A; Mischke, C

    2015-10-01

    Type 2 diabetes is on the increase among Swiss immigrants. The cultural background of patients presents new linguistic and sociocultural barriers and is gaining importance for health care. In order to develop patient-centred care, it is necessary to focus on the sociocultural aspects of the everyday lives and experiences of immigrants with diabetes from the former republics of Yugoslavia, who have rarely been studied in Switzerland. Based on these insights, counselling needs can be identified and nursing interventions designed accordingly. Using the Grounded Theory approach, 5 interviews were analysed according to the Corbin and Strauss coding paradigm. The central phenomenon found is the experience of living in 2 different cultures; the complexity arises from the tension of living in 2 cultural backgrounds at the same time. It turns out that the immigrants adjust their disease management while in their country of origin. The changed daily rhythm and the more traditional role model affect aspects of their disease management such as diet and/or drug therapy. The different strategies impact the persons' roles, emotions, everyday lives and families. The study provides insight into the perspective of Swiss immigrants from the former republics of Yugoslavia who suffer from diabetes. Many questions remain unanswered, and further research will be required.

  16. Protein inference: A protein quantification perspective.

    He, Zengyou; Huang, Ting; Liu, Xiaoqing; Zhu, Peijun; Teng, Ben; Deng, Shengchun

    2016-08-01

    In mass spectrometry-based shotgun proteomics, protein quantification and protein identification are two major computational problems. To quantify protein abundance, a list of proteins must first be inferred from the raw data. Then the relative or absolute protein abundance is estimated with quantification methods, such as spectral counting. Until now, most researchers have dealt with these two processes separately. In fact, the protein inference problem can be regarded as a special protein quantification problem, in the sense that truly present proteins are those whose abundance values are not zero. Some recently published papers have conceptually discussed this possibility. However, there is still a lack of rigorous experimental studies to test this hypothesis. In this paper, we investigate the feasibility of using protein quantification methods to solve the protein inference problem. Protein inference methods aim to determine whether each candidate protein is present in the sample or not. Protein quantification methods estimate the abundance value of each inferred protein. Naturally, the abundance value of an absent protein should be zero. Thus, we argue that the protein inference problem can be viewed as a special protein quantification problem in which a protein is considered present if its abundance is not zero. Based on this idea, our paper uses three simple protein quantification methods to solve the protein inference problem effectively. The experimental results on six data sets show that these three methods are competitive with previous protein inference algorithms. This demonstrates that it is plausible to model the protein inference problem as a special protein quantification task, which opens the door to devising more effective protein inference algorithms from a quantification perspective. The source codes of our methods are available at: http://code.google.com/p/protein-inference/.
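    The core idea, that inference is quantification with a presence threshold at zero abundance, can be sketched in a few lines. The function and the data below are illustrative, not the paper's implementation.

```python
# Treat protein inference as a special case of quantification: a protein is
# called "present" when its estimated abundance (e.g. a spectral count)
# exceeds the threshold, which defaults to zero.
def infer_proteins(abundance: dict[str, float], threshold: float = 0.0) -> set[str]:
    """Return the set of proteins whose estimated abundance exceeds threshold."""
    return {protein for protein, value in abundance.items() if value > threshold}

# Hypothetical abundance estimates for candidate proteins.
estimates = {"P1": 12.0, "P2": 0.0, "P3": 0.4, "P4": 7.5}

print(sorted(infer_proteins(estimates)))                 # ['P1', 'P3', 'P4']
print(sorted(infer_proteins(estimates, threshold=1.0)))  # ['P1', 'P4']
```

Raising the threshold above zero trades sensitivity for fewer false identifications, which is exactly the degree of freedom a quantification-based inference method exposes.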

  17. Evaluation of vehicle damage involved in road crashes based on quantificated model

    FAN Yan-hui; XU Hong-guo; JIANG Hua-ping

    2008-01-01

    Based on economic theory, the social value loss caused by vehicles involved in crashes, as well as the various factors influencing it, were analyzed, and the corresponding micro-econometric model was derived theoretically. Moreover, the practicability of the model and the accuracy and rationality of the quantification were analyzed. Based on probability theory and mathematical statistics, a macro approach to evaluating vehicle damage in crashes was presented, and the corresponding macro-econometric model was constructed. In addition, the macro-econometric model was used to assess economic loss from statistical data on vehicles damaged in crashes, which showed that the model can meet the demands of quantitative analysis of vehicle damage and can be applied to the evaluation of economic loss caused by crashes. The results of this paper are of practical significance for the scientific, comprehensive and rational evaluation of the socio-economic loss caused by road crashes.

  18. A fast and robust hepatocyte quantification algorithm including vein processing

    Homeyer André

    2010-03-01

    Background: Quantification of different types of cells is often needed for the analysis of histological images. In our project, we compute the relative number of proliferating hepatocytes for the evaluation of the regeneration process after partial hepatectomy in normal rat livers. Results: Our automatic approach to hepatocyte (HC) quantification is suitable for the analysis of an entire digitized histological section given as a series of images. It is the main part of an automatic hepatocyte quantification tool that computes the ratio between the number of proliferating HC nuclei and the total number of HC nuclei for a series of images in one processing run. The processing pipeline yields valuable results for a wide range of images with different properties without additional parameter adjustment. Comparing the obtained segmentation results with a manually retrieved segmentation mask considered to be the ground truth, we achieve results with sensitivity above 90% and a false positive fraction below 15%. Conclusions: The proposed automatic procedure gives results with high sensitivity and a low false positive fraction and can be applied to process entire stained sections.
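    The evaluation metrics quoted in the abstract can be computed directly from binary segmentation masks. A minimal sketch, assuming the false positive fraction is defined as FP/(TP+FP), i.e. the share of detected pixels that are wrong (the abstract does not spell out its definition):

```python
import numpy as np

def sensitivity_and_fpf(pred: np.ndarray, truth: np.ndarray) -> tuple[float, float]:
    """Compare a predicted binary segmentation mask against a ground-truth mask.

    Sensitivity = TP / (TP + FN); the false positive fraction is taken here
    as FP / (TP + FP), an assumed definition for illustration.
    """
    tp = np.sum(pred & truth)    # correctly detected nucleus pixels
    fp = np.sum(pred & ~truth)   # detected pixels that are not nuclei
    fn = np.sum(~pred & truth)   # nucleus pixels that were missed
    return tp / (tp + fn), fp / (tp + fp)

# Tiny illustrative masks (True = hepatocyte nucleus pixel).
truth = np.array([[1, 1, 0, 0], [1, 1, 0, 0]], dtype=bool)
pred  = np.array([[1, 1, 0, 1], [1, 0, 0, 0]], dtype=bool)

sens, fpf = sensitivity_and_fpf(pred, truth)
print(f"sensitivity={sens:.2f}, false positive fraction={fpf:.2f}")
# sensitivity=0.75, false positive fraction=0.25
```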

  19. Legionella spp. isolation and quantification from greywater.

    Rodríguez-Martínez, Sara; Blanky, Marina; Friedler, Eran; Halpern, Malka

    2015-01-01

    Legionella, an opportunistic human pathogen whose natural environment is water, is transmitted to humans through inhalation of contaminated aerosols. Legionella has been isolated from a high diversity of water types. Due to its importance as a pathogen, two ISO protocols have been developed for its monitoring. However, these two protocols are not suitable for analyzing Legionella in greywater (GW). GW is domestic wastewater excluding the inputs from toilets and kitchen. It can serve as an alternative water source, mainly for toilet flushing and garden irrigation, both of which produce aerosols that can pose a risk of Legionella infection. Hence, before reuse, GW has to be treated and its quality needs to be monitored. The difficulty of isolating Legionella from GW lies in the very high load of contaminant bacteria. Here we describe a modification of ISO protocol 11731:1998 that enables the isolation and quantification of Legionella from GW samples. The following modifications were made: • to enable isolation of Legionella from greywater, a pre-filtration step that removes coarse matter is recommended; • Legionella can be isolated after a combined acid-thermic treatment that eliminates the high load of contaminant bacteria in the sample.

  20. Superspace conformal field theory

    Quella, Thomas [Koeln Univ. (Germany). Inst. fuer Theoretische Physik; Schomerus, Volker [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2013-07-15

    Conformal sigma models and WZW models on coset superspaces provide important examples of logarithmic conformal field theories. They possess many applications to problems in string and condensed matter theory. We review recent results and developments, including the general construction of WZW models on type I supergroups, the classification of conformal sigma models and their embedding into string theory.

  1. Using psychological theory to understand the clinical management of type 2 diabetes in Primary Care: a comparison across two European countries

    Johnston Marie

    2009-08-01

    Background: Long-term management of patients with Type 2 diabetes is well established within Primary Care. However, despite extensive efforts to implement high-quality care, both service provision and patient health outcomes remain sub-optimal. Several recent studies suggest that psychological theories about individuals' behaviour can provide a valuable framework for understanding generalisable factors underlying health professionals' clinical behaviour. In the context of the team management of chronic diseases such as diabetes, however, the application of such models is less well established. The aim of this study was to identify motivational factors underlying health professional teams' clinical management of diabetes using a psychological model of human behaviour. Methods: A predictive questionnaire based on the Theory of Planned Behaviour (TPB) investigated health professionals' (HPs') cognitions (e.g., beliefs, attitudes and intentions) about the provision of two aspects of care for patients with diabetes: prescribing statins and inspecting feet. General practitioners and practice nurses in England and the Netherlands completed parallel questionnaires, cross-validated for equivalence in English and Dutch. Behavioural data were practice-level patient-reported rates of foot examination and use of statin medication. Relationships between the cognitive antecedents of behaviour proposed by the TPB and healthcare teams' clinical behaviour were explored using multiple regression. Results: In both countries, attitude and subjective norm were important predictors of health professionals' intention to inspect feet (Attitude: beta = .40; Subjective Norm: beta = .28; Adjusted R2 = .34, p …; … = .40, p …). Conclusion: Using the TPB, we identified modifiable factors underlying health professionals' intentions to perform two clinical behaviours, providing a rationale for the development of targeted interventions. However, we did not observe a relationship …

  2. Epidermal Nerve Fiber Quantification in the Assessment of Diabetic Neuropathy

    Beiswenger, Kristina K.; Calcutt, Nigel A.; Mizisin, Andrew P.

    2008-01-01

    Summary: Assessment of cutaneous innervation in skin biopsies is emerging as a valuable means of both diagnosing and staging diabetic neuropathy. Immunolabeling, using antibodies to neuronal proteins such as protein gene product 9.5, allows for the visualization and quantification of intraepidermal nerve fibers. Multiple studies have shown reductions in intraepidermal nerve fiber density in skin biopsies from patients with both type 1 and type 2 diabetes. More recent studies have focused on correlating these changes with other measures of diabetic neuropathy. A loss of epidermal innervation similar to that observed in diabetic patients has been observed in rodent models of both type 1 and type 2 diabetes, and several therapeutics have been reported to prevent reductions in intraepidermal nerve fiber density in these models. This review discusses the current literature describing diabetes-induced changes in cutaneous innervation in both human and animal models of diabetic neuropathy. PMID:18384843

  3. A shear deformable theory of laminated composite shallow shell-type panels and their response analysis. I - Free vibration and buckling

    Librescu, L.; Khdeir, A. A.; Frederick, D.

    1989-01-01

    This paper deals with the substantiation of a shear deformable theory of cross-ply laminated composite shallow shells. While the developed theory preserves all the advantages of the first-order transverse shear deformation theory, it succeeds in eliminating some of its basic shortcomings. The theory is further employed in the analysis of the eigenvibration and static buckling problems of doubly curved shallow panels. In this context, the state space concept is used in conjunction with the Levy method, allowing one to analyze these problems in a unified manner for a variety of boundary conditions. Numerical results are presented and some pertinent conclusions are formulated.

  4. Uncertainty Quantification in Aerodynamics Simulations Project

    National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...

  5. Quantification of nanowire uptake by live cells

    Margineanu, Michael B.

    2015-05-01

    Nanostructures fabricated by different methods have become increasingly important for various applications at the cellular level. In order to understand how these nanostructures “behave” and for studying their internalization kinetics, several attempts have been made at tagging and investigating their interaction with living cells. In this study, magnetic iron nanowires with an iron oxide layer are coated with (3-Aminopropyl)triethoxysilane (APTES), and subsequently labeled with a fluorogenic pH-dependent dye pHrodo™ Red, covalently bound to the aminosilane surface. Time-lapse live imaging of human colon carcinoma HCT 116 cells interacting with the labeled iron nanowires is performed for 24 hours. As the pHrodo™ Red conjugated nanowires are non-fluorescent outside the cells but fluoresce brightly inside, internalized nanowires are distinguished from non-internalized ones and their behavior inside the cells can be tracked for the respective time length. A machine learning-based computational framework dedicated to automatic analysis of live cell imaging data, Cell Cognition, is adapted and used to classify cells with internalized and non-internalized nanowires and subsequently determine the uptake percentage by cells at different time points. An uptake of 85 % by HCT 116 cells is observed after 24 hours incubation at NW-to-cell ratios of 200. While the approach of using pHrodo™ Red for internalization studies is not novel in the literature, this study reports for the first time the utilization of a machine-learning based time-resolved automatic analysis pipeline for quantification of nanowire uptake by cells. This pipeline has also been used for comparison studies with nickel nanowires coated with APTES and labeled with pHrodo™ Red, and another cell line derived from the cervix carcinoma, HeLa. It has thus the potential to be used for studying the interaction of different types of nanostructures with potentially any live cell types.

  6. Design Theory in Information Systems

    Shirley Gregor

    2002-11-01

    The aim of this paper is to explore an important category of information systems knowledge that is termed "design theory". This knowledge is distinguished as the fifth of five types of theory: (i) theory for analysing and describing, (ii) theory for understanding, (iii) theory for predicting, (iv) theory for explaining and predicting, and (v) theory for design and action. Examples of design theory in information systems are provided, with associated research methods. The limited understanding and recognition of this type of theory in information systems indicates that further debate concerning its nature and role in our discipline is needed.

  7. Alberta Diabetes and Physical Activity Trial (ADAPT: A randomized theory-based efficacy trial for adults with type 2 diabetes - rationale, design, recruitment, evaluation, and dissemination

    Birkett Nicholas

    2010-01-01

    Background: The primary aim of this study was to compare the efficacy of three physical activity (PA) behavioural intervention strategies in a sample of adults with type 2 diabetes. Method/Design: Participants (N = 287) were randomly assigned to one of three groups consisting of the following intervention strategies: (1) standard printed PA educational materials provided by the Canadian Diabetes Association (i.e., Group 1/control group); (2) standard printed PA educational materials as in Group 1, pedometers, a log book and printed PA information matched to individuals' PA stage of readiness provided every 3 months (i.e., Group 2); and (3) a PA telephone counseling protocol matched to PA stage of readiness and tailored to personal characteristics, in addition to the materials provided in Groups 1 and 2 (i.e., Group 3). PA behaviour measured by the Godin Leisure Time Exercise Questionnaire and related social-cognitive measures were assessed at baseline, 3, 6, 9, 12 and 18 months (i.e., 6-month follow-up). Clinical (biomarker) and health-related quality of life assessments were conducted at baseline, 12 months, and 18 months. Linear Mixed Model (LMM) analyses will be used to examine time-dependent changes from baseline across study time points for Groups 2 and 3 relative to Group 1. Discussion: ADAPT will determine whether tailored but low-cost interventions can lead to sustainable increases in PA behaviours. The results may have implications for practitioners in designing and implementing theory-based physical activity promotion programs for this population. Clinical Trials Registration: ClinicalTrials.gov identifier: NCT00221234

  8. e/a classification of Hume–Rothery Rhombic Triacontahedron-type approximants based on all-electron density functional theory calculations

    Mizutani, U; Inukai, M; Sato, H; Zijlstra, E S; Lin, Q

    2014-05-16

    There are three key electronic parameters in elucidating the physics behind the Hume–Rothery electron concentration rule: the square of the Fermi diameter (2kF)2, the square of the critical reciprocal lattice vector |G|2, and the electron concentration parameter, or the number of itinerant electrons per atom, e/a. We have reliably determined these three parameters for 10 Rhombic Triacontahedron-type 2/1–2/1–2/1 (N = 680) and 1/1–1/1–1/1 (N = 160–162) approximants by making full use of full-potential linearized augmented plane wave-Fourier band calculations based on all-electron density functional theory. We revealed that the 2/1–2/1–2/1 approximants Al13Mg27Zn45 and Na27Au27Ga31 belong to two different sub-groups classified in terms of |G|2 equal to 126 and 109, which explains why they take different e/a values of 2.13 and 1.76, respectively. Among the eight 1/1–1/1–1/1 approximants Al3Mg4Zn3, Al9Mg8Ag3, Al21Li13Cu6, Ga21Li13Cu6, Na26Au24Ga30, Na26Au37Ge18, Na26Au37Sn18 and Na26Cd40Pb6, the first two, the second two and the last four compounds were classified into three sub-groups with |G|2 = 50, 46 and 42, and were claimed to obey the e/a = 2.30, 2.10–2.15 and 1.70–1.80 rules, respectively.
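    The link between the reported |G|^2, N and e/a values can be checked with a simple free-electron estimate: for a cubic cell of N atoms, n = N(e/a)/a^3 and kF = (3 pi^2 n)^(1/3), so equating (2kF)^2 with |G|^2 (both in units of (2 pi/a)^2) gives e/a = pi |G|^3 / (3N). The sketch below applies this textbook relation, which is only an approximation, not the all-electron DFT determination used in the paper.

```python
import math

def e_per_a(g2: float, n_atoms: int) -> float:
    """Free-electron estimate of e/a from the matching condition
    (2kF)^2 = |G|^2, with |G|^2 in units of (2*pi/a)^2 for a cubic
    cell of n_atoms atoms: e/a = pi * |G|^3 / (3 * n_atoms)."""
    return math.pi * g2 ** 1.5 / (3 * n_atoms)

# |G|^2 values from the abstract for the 2/1-2/1-2/1 approximants (N = 680).
print(round(e_per_a(126, 680), 2))  # ~2.18, close to the reported e/a = 2.13
print(round(e_per_a(109, 680), 2))  # ~1.75, close to the reported e/a = 1.76
```

That the crude free-electron relation lands within a few percent of the reported values illustrates why these two parameters, rather than either alone, organize the sub-group classification.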

  9. e/a classification of Hume-Rothery Rhombic Triacontahedron-type approximants based on all-electron density functional theory calculations

    Mizutani, U.; Inukai, M.; Sato, H.; Zijlstra, E. S.; Lin, Q.

    2014-08-01

    There are three key electronic parameters in elucidating the physics behind the Hume-Rothery electron concentration rule: the square of the Fermi diameter (2kF)2, the square of the critical reciprocal lattice vector |G|2, and the electron concentration parameter, or the number of itinerant electrons per atom, e/a. We have reliably determined these three parameters for 10 Rhombic Triacontahedron-type 2/1-2/1-2/1 (N = 680) and 1/1-1/1-1/1 (N = 160-162) approximants by making full use of full-potential linearized augmented plane wave-Fourier band calculations based on all-electron density functional theory. We revealed that the 2/1-2/1-2/1 approximants Al13Mg27Zn45 and Na27Au27Ga31 belong to two different sub-groups classified in terms of |G|2 equal to 126 and 109, which explains why they take different e/a values of 2.13 and 1.76, respectively. Among the eight 1/1-1/1-1/1 approximants Al3Mg4Zn3, Al9Mg8Ag3, Al21Li13Cu6, Ga21Li13Cu6, Na26Au24Ga30, Na26Au37Ge18, Na26Au37Sn18 and Na26Cd40Pb6, the first two, the second two and the last four compounds were classified into three sub-groups with |G|2 = 50, 46 and 42, and were claimed to obey the e/a = 2.30, 2.10-2.15 and 1.70-1.80 rules, respectively.

  10. Risk Quantification and Evaluation Modelling

    Manmohan Singh

    2014-07-01

    Full Text Available In this paper, the authors discuss risk quantification methods, the evaluation of risks, and the decision parameters used to rank critical items for prioritization in condition-monitoring-based risk and reliability centered maintenance (CBRRCM). As time passes, any equipment or product degrades to lower effectiveness and its rate of failure or malfunctioning increases, thereby lowering reliability. Thus, with the passage of time, or over a number of active tests or periods of work, the reliability of the product or system may fall to a threshold value below which it should not be allowed to dip. Hence, it is necessary to fix a normal basis for determining the appropriate points in the product life cycle where predictive preventive maintenance may be applied, so that reliability (the probability of successful functioning) can be enhanced, preferably to its original value, by reducing the failure rate and increasing the mean time between failures. This is especially important for defence applications, where reliability is a prime concern. An attempt is made to develop a mathematical model for risk assessment and for ranking risks. Based on the likeliness coefficient β1 and the risk coefficient β2, a ranking of the sub-systems can be modelled and used for CBRRCM. Defence Science Journal, Vol. 64, No. 4, July 2014, pp. 378-384, DOI: http://dx.doi.org/10.14429/dsj.64.6366
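The ranking idea above can be sketched as follows. The combination rule used here (the product of the two coefficients, analogous to a risk priority number) and the sub-system names are assumptions for illustration, since the abstract does not give the model's exact form.

```python
# Hypothetical sketch of ranking sub-systems for CBRRCM by a likeliness
# coefficient (beta1) and a risk coefficient (beta2). Combining the two
# as a product is an assumption, not the paper's actual model.

def rank_subsystems(subsystems):
    """subsystems: {name: (beta1, beta2)} -> names sorted by descending priority."""
    priority = {name: b1 * b2 for name, (b1, b2) in subsystems.items()}
    return sorted(priority, key=priority.get, reverse=True)

demo = {"pump": (0.8, 0.9), "valve": (0.6, 0.4), "sensor": (0.9, 0.5)}
print(rank_subsystems(demo))  # ['pump', 'sensor', 'valve']
```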

  11. Precise quantification of nanoparticle internalization.

    Gottstein, Claudia; Wu, Guohui; Wong, Benjamin J; Zasadzinski, Joseph Anthony

    2013-06-25

    Nanoparticles have opened new exciting avenues for both diagnostic and therapeutic applications in human disease, and targeted nanoparticles are increasingly used as specific drug delivery vehicles. The precise quantification of nanoparticle internalization is of importance to measure the impact of physical and chemical properties on the uptake of nanoparticles into target cells or into cells responsible for rapid clearance. Internalization of nanoparticles has been measured by various techniques, but comparability of data between different laboratories is impeded by lack of a generally accepted standardized assay. Furthermore, the distinction between associated and internalized particles has been a challenge for many years, although this distinction is critical for most research questions. Previously used methods to verify intracellular location are typically not quantitative and do not lend themselves to high-throughput analysis. Here, we developed a mathematical model which integrates the data from high-throughput flow cytometry measurements with data from quantitative confocal microscopy. The generic method described here will be a useful tool in biomedical nanotechnology studies. The method was then applied to measure the impact of surface coatings of vesosomes on their internalization by cells of the reticuloendothelial system (RES). RES cells are responsible for rapid clearance of nanoparticles, and the resulting fast blood clearance is one of the major challenges in biomedical applications of nanoparticles. Coating of vesosomes with long chain polyethylene glycol showed a trend for lower internalization by RES cells.
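The core of such an integrated quantification can be illustrated with a minimal sketch: the flow cytometry measurement gives the total cell-associated signal, and the confocal measurement gives the fraction of that signal which is intracellular. The split shown here is a simplification of the paper's model, and all numbers are hypothetical.

```python
# Minimal sketch of integrating flow cytometry (total association) with
# confocal microscopy (internal fraction) to quantify internalization.
# The paper's actual mathematical model is more elaborate.

def internalized_signal(total_associated, internal_fraction):
    """Split total association into internalized vs surface-bound signal."""
    if not 0.0 <= internal_fraction <= 1.0:
        raise ValueError("internal_fraction must be in [0, 1]")
    internalized = total_associated * internal_fraction
    surface_bound = total_associated - internalized
    return internalized, surface_bound

# e.g. 10,000 fluorescence units associated, confocal says 65% is internal
print(internalized_signal(10_000, 0.65))  # (6500.0, 3500.0)
```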

  12. Physiologic upper limits of pore size of different blood capillary types and another perspective on the dual pore theory of microvascular permeability

    Sarin Hemant

    2010-08-01

    Full Text Available Background: Much of our current understanding of microvascular permeability is based on the findings of classic experimental studies of blood capillary permeability to various-sized lipid-insoluble endogenous and non-endogenous macromolecules. According to the classic small pore theory of microvascular permeability, which was formulated on the basis of the findings of studies on the transcapillary flow rates of various-sized systemically or regionally perfused endogenous macromolecules, transcapillary exchange across the capillary wall takes place through a single population of small pores that are approximately 6 nm in diameter; whereas, according to the dual pore theory of microvascular permeability, which was formulated on the basis of the findings of studies on the accumulation of various-sized systemically or regionally perfused non-endogenous macromolecules in the locoregional tissue lymphatic drainages, transcapillary exchange across the capillary wall also takes place through a separate population of large pores, or capillary leaks, that are between 24 and 60 nm in diameter. The classification of blood capillary types on the basis of differences in the physiologic upper limits of pore size to transvascular flow highlights the differences in the transcapillary exchange routes for the transvascular transport of endogenous and non-endogenous macromolecules across the capillary walls of different blood capillary types. Methods: The findings and published data of studies on capillary wall ultrastructure and capillary microvascular permeability to lipid-insoluble endogenous and non-endogenous molecules from the 1950s to date were reviewed. In this study, the blood capillary types in different tissues and organs were classified on the basis of the physiologic upper limits of pore size to the transvascular flow of lipid-insoluble molecules. Blood capillaries were classified as non-sinusoidal or sinusoidal on the basis of capillary wall

  13. A Linguistic and Stylistic Analysis of the Chinese Translation of Obama's Speech at the First Meeting of the Strategic and Economic Dialogue between the United States and China -- Through the Lens of Reiss's Text Type Theory

    付端凌

    2014-01-01

    According to Reiss's Text Type theory, a key part of the functionalist approach in translation studies, the source text can be assigned to a text type and to a genre. In making this assignment, the translator can decide on the hierarchy of postulates which has to be observed during target-text production (Mona, 2005). This essay intends to conduct a linguistic and stylistic analysis of the Chinese translation of Obama's speech to explore the general approach of the translator (if there is one), by comparing the respective results of the two analyses from the perspective of Katharina Reiss's Text Type theory. In doing so, critical judgments will accordingly be made as to whether such an approach is justifiable or not.

  14. Division of afforestation site type in the watershed of Wangjiagou,West Shanxi through GIS

    FAN Liangxin; LIU Yuecui

    2006-01-01

    Using quantification theory I, an analysis was made of the relation between soil water and qualitative factors, such as slope degree, slope aspect, slope position, and soil, in the Wangjiagou watershed area. The study aims to quantify the factors influencing soil water, the descending order of influence being slope aspect, soil, slope degree, and slope position, thereby scientifically facilitating the division of afforestation site types and afforestation site products digitally on a geographical information system (GIS).
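Quantification theory I is, in essence, multiple regression of a quantitative response on dummy-coded qualitative factors. A minimal sketch, with made-up factor levels and soil-water values rather than the study's data, might look like:

```python
import numpy as np

# Sketch of quantification theory I: regress a quantitative response
# (soil water) on qualitative factors (slope aspect, soil type) by
# dummy-coding each category and solving ordinary least squares.
# Factor levels and observations below are illustrative assumptions.

def dummy_code(levels, categories):
    return [[1.0 if c == lev else 0.0 for lev in levels] for c in categories]

aspects, soils = ["N", "S"], ["loess", "cinnamon"]
aspect_obs = ["N", "N", "S", "S", "N", "S"]
soil_obs = ["loess", "cinnamon", "loess", "cinnamon", "loess", "cinnamon"]
soil_water = np.array([22.0, 18.0, 15.0, 12.0, 21.0, 13.0])  # e.g. percent

X = np.hstack([np.ones((6, 1)),
               np.array(dummy_code(aspects[1:], aspect_obs)),  # drop one level
               np.array(dummy_code(soils[1:], soil_obs))])     # to avoid collinearity
coef, *_ = np.linalg.lstsq(X, soil_water, rcond=None)
print(coef)  # [intercept, effect of S aspect, effect of cinnamon soil], approx [21.33, -6.0, -3.0]
```

The fitted category effects (here, south-facing slopes and cinnamon soil both lowering soil water) are exactly the "category scores" that quantification theory I assigns to qualitative levels.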

  15. Hydration free energy of hard-sphere solute over a wide range of size studied by various types of solution theories

    N.Matubayasi

    2007-12-01

    Full Text Available The hydration free energy of hard-sphere solute is evaluated over a wide range of size using the method of energy representation, information-theoretic approach, reference interaction site model, and scaled-particle theory. The former three are distribution function theories and the hydration free energy is formulated to reflect the solution structure through distribution functions. The presence of the volume-dependent term is pointed out for the distribution function theories, and the asymptotic behavior in the limit of large solute size is identified. It is indicated that the volume-dependent term is a key to the improvement of distribution function theories toward the application to large molecules.

  16. Advanced Quantification of Plutonium Ionization Potential to Support Nuclear Forensic Evaluations by Resonance Ionization Mass Spectrometry

    2015-06-01

    Master's Thesis, Naval Postgraduate School, Monterey, California, June 2015. Approved for public release; distribution is unlimited. Only a fragment of the abstract is preserved in this record: "...decay and induced emission and absorption are almost identical. Their probability distributions are closely related. The electromagnetic radiation ..."

  17. High photocatalytic performance of a type-II α-MoO3@MoS2 heterojunction: from theory to experiment.

    Li, Honglin; Yu, Ke; Tang, Zheng; Fu, Hao; Zhu, Ziqiang

    2016-05-18

    For the first time, a systematic study using density functional theory (DFT) has been employed to survey the synergistic effect of α-MoO3@MoS2, with the aim of gaining insight into the role of this heterogeneous structure in a relevant photocatalytic reaction. The geometry, electronic structures and band edge positions of the α-MoO3@MoS2 composite were computed to explore the characteristics of the heterojunction. This revealed that the established heterogeneous structure could facilitate the separation of the photoinduced carriers into two parts around the interface. The photoinduced electron carriers were injected into the conduction band minimum (CBM) of α-MoO3 from the CBM of MoS2, while the hole carriers were transferred from the valence band maximum (VBM) of α-MoO3 to the VBM of MoS2. This separation process could markedly restrain photogenerated electron-hole pair recombination and was further verified by photocurrent and photoluminescence (PL) surveys. Based on the results obtained from computation, we then synthesized the α-MoO3@MoS2 hybrid rod@sphere structure via a facile two-step hydrothermal method. A reasonable formation mechanism of this rod@sphere structured composite was proposed. The enhanced photocatalytic performance originated from the synergistic effect between α-MoO3 and MoS2. On the one hand, the unique structure of the composite featured numerous MoS2 spheres closely attached to α-MoO3 rods. On the other hand, the staggered type-II band alignment also contributed to the effective separation of photoinduced carriers, and thus the corresponding photocatalytic activity was far superior to that of the pristine α-MoO3/MoS2 structures. In brief, these analyses fully explain the inner mechanism of the improved photocatalytic activity of the composite structure and provide a reference for future research on composite structures.

  18. Two families with quadrupedalism, mental retardation, no speech, and infantile hypotonia (Uner Tan Syndrome Type-II); a novel theory for the evolutionary emergence of human bipedalism

    Tan, Uner

    2014-01-01

    Two consanguineous families with Uner Tan Syndrome (UTS) were analyzed in relation to self-organizing processes in complex systems, and the evolutionary emergence of human bipedalism. The cases had the key symptoms of previously reported cases of UTS, such as quadrupedalism, mental retardation, and dysarthric or no speech, but the new cases also exhibited infantile hypotonia and are designated UTS Type-II. There were 10 siblings in Branch I and 12 siblings in Branch II. Of these, there were seven cases exhibiting habitual quadrupedal locomotion (QL): four deceased and three living. The infantile hypotonia in the surviving cases gradually disappeared over a period of years, so that they could sit by about 10 years, crawl on hands and knees by about 12 years. They began walking on all fours around 14 years, habitually using QL. Neurological examinations showed normal tonus in their arms and legs, no Babinski sign, brisk tendon reflexes especially in the legs, and mild tremor. The patients could not walk in a straight line, but (except in one case) could stand up and maintain upright posture with truncal ataxia. Cerebello-vermial hypoplasia and mild gyral simplification were noted in their MRIs. The results of the genetic analysis were inconclusive: no genetic code could be identified as the triggering factor for the syndrome in these families. Instead, the extremely low socio-economic status of the patients was thought to play a role in the emergence of UTS, possibly by epigenetically changing the brain structure and function, with a consequent selection of ancestral neural networks for QL during locomotor development. It was suggested that UTS may be regarded as one of the unpredictable outcomes of self-organization within a complex system. It was also noted that the prominent feature of this syndrome, the diagonal-sequence habitual QL, generated an interference between ipsilateral hands and feet, as in non-human primates. It was suggested that this may have been

  19. Waltz's Theory of Theory

    Wæver, Ole

    2009-01-01

    Kenneth N. Waltz's 1979 book, Theory of International Politics, is the most influential in the history of the discipline. It worked its effects to a large extent through raising the bar for what counted as theoretical work, in effect reshaping not only realism but rivals like liberalism and reflectivism. Yet, ironically, there has been little attention to Waltz's very explicit and original arguments about the nature of theory. This article explores and explicates Waltz's theory of theory. Central attention is paid to his definition of theory as 'a picture, mentally formed' and to the radical anti-empiricism and anti-positivism of his position. Followers and critics alike have treated Waltzian neorealism as if it was at bottom a formal proposition about cause-effect relations. The extreme case of Waltz being so victorious in the discipline, and yet being consistently mis-interpreted on the question of theory...

  20. MOTIVATION INTERNALIZATION AND SIMPLEX STRUCTURE IN SELF-DETERMINATION THEORY.

    Ünlü, Ali; Dettweiler, Ulrich

    2015-12-01

    Self-determination theory, as proposed by Deci and Ryan, postulated different types of motivation regulation. As to the introjected and identified regulation of extrinsic motivation, their internalizations were described as "somewhat external" and "somewhat internal" and remained undetermined in the theory. This paper introduces a constrained regression analysis that allows these vaguely expressed motivations to be estimated in an "optimal" manner, in any given empirical context. The approach was even generalized and applied for simplex structure analysis in self-determination theory. The technique was exemplified with an empirical study comparing science teaching in a classical school class versus an expeditionary outdoor program. Based on a sample of 84 German pupils (43 girls, 41 boys, 10 to 12 years old), data were collected using the German version of the Academic Self-Regulation Questionnaire. The science-teaching format was seen to not influence the pupils' internalization of identified regulation. The internalization of introjected regulation differed and shifted more toward the external pole in the outdoor teaching format. The quantification approach supported the simplex structure of self-determination theory, whereas correlations may disconfirm the simplex structure.
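The constrained estimation idea can be sketched as a least-squares placement of introjected-regulation scores between the two poles of the motivation continuum, with the mixing weight clipped to [0, 1]. All scores below are invented for illustration; the paper's actual procedure is more general.

```python
import numpy as np

# Sketch of estimating a "somewhat external / somewhat internal" regulation
# as a constrained mixture of the external and intrinsic poles: the weight
# w in [0, 1] minimizing ||y - (w*internal + (1-w)*external)||^2, with the
# constraint enforced by clipping the unconstrained least-squares solution.

def internalization_weight(y, external, internal):
    d = internal - external
    w = float(np.dot(y - external, d) / np.dot(d, d))
    return min(max(w, 0.0), 1.0)

external = np.array([4.0, 3.5, 4.2, 3.8])     # external-regulation scores
internal = np.array([2.0, 2.5, 1.8, 2.2])     # intrinsic-motivation scores
introjected = np.array([3.2, 3.1, 3.0, 3.1])  # scores to be placed in between

print(internalization_weight(introjected, external, internal))  # ~0.45, i.e. "somewhat external"
```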

  1. Separation and quantification of microalgal carbohydrates.

    Templeton, David W; Quinn, Matthew; Van Wychen, Stefanie; Hyman, Deborah; Laurens, Lieve M L

    2012-12-28

    Structural carbohydrates can constitute a large fraction of the dry weight of algal biomass and thus accurate identification and quantification is important for summative mass closure. Two limitations to the accurate characterization of microalgal carbohydrates are the lack of a robust analytical procedure to hydrolyze polymeric carbohydrates to their respective monomers and the subsequent identification and quantification of those monosaccharides. We address the second limitation, chromatographic separation of monosaccharides, here by identifying optimum conditions for the resolution of a synthetic mixture of 13 microalgae-specific monosaccharides, comprised of 8 neutral, 2 amino sugars, 2 uronic acids and 1 alditol (myo-inositol as an internal standard). The synthetic 13-carbohydrate mix showed incomplete resolution across 11 traditional high performance liquid chromatography (HPLC) methods, but showed improved resolution and accurate quantification using anion exchange chromatography (HPAEC) as well as alditol acetate derivatization followed by gas chromatography (for the neutral- and amino-sugars only). We demonstrate the application of monosaccharide quantification using optimized chromatography conditions after sulfuric acid analytical hydrolysis for three model algae strains and compare the quantification and complexity of monosaccharides in analytical hydrolysates relative to a typical terrestrial feedstock, sugarcane bagasse.

  2. Automatic Quantification of the Number of Intracellular Compartments in Arabidopsis thaliana Root Cells

    Bayle, Vincent; Platre, Matthieu Pierre; Jaillais, Yvon

    2017-01-01

    In the era of quantitative biology, it is increasingly required to quantify confocal microscopy images. If possible, quantification should be performed in an automatic way, in order to avoid bias from the experimenter, to allow the quantification of a large number of samples, and to increase reproducibility between laboratories. In this protocol, we describe procedures for automatic counting of the number of intracellular compartments in Arabidopsis root cells, which can be used for example to study endocytosis or secretory trafficking pathways and to compare membrane organization between different genotypes or treatments. While developed for Arabidopsis roots, this method can be used on other tissues, cell types and plant species. PMID:28255574
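The counting step of such a protocol reduces to thresholding followed by connected-component labeling. A toy sketch of that core idea follows; real pipelines would typically use a library labeler (e.g. scipy.ndimage.label) plus size and intensity filters, and the grid below is invented.

```python
# Minimal sketch of automatic spot counting of the kind used to quantify
# intracellular compartments in confocal images: threshold, then count
# 4-connected components of above-threshold pixels via flood fill.

def count_spots(image, threshold):
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] > threshold and not seen[r][c]:
                count += 1                 # new compartment found
                stack = [(r, c)]           # flood-fill its pixels
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and image[y][x] > threshold and not seen[y][x]:
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return count

img = [[0, 9, 9, 0, 0],
       [0, 9, 0, 0, 8],
       [0, 0, 0, 8, 8]]
print(count_spots(img, 5))  # two bright compartments -> 2
```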

  3. Xinhe Mine water inrush risk assessment based on quantification theoretical models

    LI Hui; JING Guo-xun; CAI Zheng-long; OU Jian-chun

    2010-01-01

    Taking as independent variables the geological factors that influence water inrush in the Xinhe mine (structure, mine pressure, structural fissures, faults and fault displacement, the distance between the fault and the water inrush point, block thickness, and water pressure), and based on data from water inrush points and non-inrush points, the method of quantification theory (I, II) was applied to quantitatively treat the qualitative variables and to evaluate the risk of water inrush at Xinhe.
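Quantification theory II scores dummy-coded qualitative factors so as to discriminate between classes (here, inrush versus non-inrush). For two classes, a discriminant direction can be obtained by least squares on +/-1 class labels. The sketch below uses invented factor levels and observations, not the Xinhe records.

```python
import numpy as np

# Sketch of the quantification theory II idea: assign numerical scores to
# categories of qualitative factors so that the two classes separate.
# Two-class discrimination via least squares on +/-1 labels; all data
# are illustrative assumptions.

fault = ["near", "near", "far", "far", "near", "far"]     # distance to fault
pressure = ["high", "low", "high", "low", "high", "low"]  # water pressure
inrush = np.array([1, 1, -1, -1, 1, -1])                  # +1 = inrush occurred

X = np.column_stack([np.ones(6),
                     [1.0 if f == "near" else 0.0 for f in fault],
                     [1.0 if p == "high" else 0.0 for p in pressure]])
w, *_ = np.linalg.lstsq(X, inrush, rcond=None)

def predict(is_near, is_high):
    score = w[0] + w[1] * is_near + w[2] * is_high
    return 1 if score > 0 else -1

print(predict(1, 1), predict(0, 0))  # near fault + high pressure vs neither -> 1 -1
```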

  4. Splitting the Reference Time Temporal Anaphora and Quantification in DRT

    Nelken, R; Nelken, Rani; Francez, Nissim

    1995-01-01

    This paper presents an analysis of temporal anaphora in sentences which contain quantification over events, within the framework of Discourse Representation Theory. The analysis in (Partee 1984) of quantified sentences, introduced by a temporal connective, gives the wrong truth-conditions when the temporal connective in the subordinate clause is "before" or "after". This problem has been previously analyzed in (de Swart 1991) as an instance of the proportion problem, and given a solution from a Generalized Quantifier approach. By using a careful distinction between the different notions of reference time, based on (Kamp and Reyle 1993), we propose a solution to this problem, within the framework of DRT. We show some applications of this solution to additional temporal anaphora phenomena in quantified sentences.

  5. WaveletQuant, an improved quantification software based on wavelet signal threshold de-noising for labeled quantitative proteomic analysis

    Li Song

    2010-04-01

    Full Text Available Background: Quantitative proteomics technologies have been developed to comprehensively identify and quantify proteins in two or more complex samples. Quantitative proteomics based on differential stable isotope labeling is one of the proteomics quantification technologies. Mass spectrometric data generated for peptide quantification are often noisy, and peak detection and definition require various smoothing filters to remove noise in order to achieve accurate peptide quantification. Many traditional smoothing filters, such as the moving average filter, Savitzky-Golay filter and Gaussian filter, have been used to reduce noise in MS peaks. However, limitations of these filtering approaches often result in inaccurate peptide quantification. Here we present the WaveletQuant program, based on wavelet theory, for better or alternative MS-based proteomic quantification. Results: We developed a novel discrete wavelet transform (DWT) and a 'Spatial Adaptive Algorithm' to remove noise and to identify true peaks. We programmed and compiled WaveletQuant using Visual C++ 2005 Express Edition. We then incorporated the WaveletQuant program in the Trans-Proteomic Pipeline (TPP), a commonly used open source proteomics analysis pipeline. Conclusions: We showed that WaveletQuant was able to quantify more proteins and to quantify them more accurately than ASAPRatio, a program that performs quantification in the TPP pipeline, first using known mixed ratios of yeast extracts and then using a data set from ovarian cancer cell lysates. The program and its documentation can be downloaded from our website at http://systemsbiozju.org/data/WaveletQuant.
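The core wavelet-thresholding step can be illustrated with a single-level Haar transform and soft thresholding of the detail coefficients. WaveletQuant's actual DWT and 'Spatial Adaptive Algorithm' are multi-level and adaptive, so this is only the underlying idea, applied to an invented noisy peak.

```python
import numpy as np

# Toy sketch of wavelet-threshold de-noising: one level of a Haar discrete
# wavelet transform, soft-threshold the detail (high-frequency) coefficients,
# then invert the transform. Input length must be even for this sketch.

def haar_denoise(signal, threshold):
    x = np.asarray(signal, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # smooth part of the signal
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # noise lives mostly here
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    out = np.empty_like(x)
    out[0::2] = (approx + detail) / np.sqrt(2)  # inverse Haar step
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

noisy_peak = [0.1, -0.1, 2.0, 2.2, 5.0, 4.8, 2.1, 1.9]
print(haar_denoise(noisy_peak, 0.5))  # small pairwise jitter removed
```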

  6. Antioxidant Activity and Validation of Quantification Method for Lycopene Extracted from Tomato.

    Cefali, Letícia Caramori; Cazedey, Edith Cristina Laignier; Souza-Moreira, Tatiana Maria; Correa, Marcos Antônio; Salgado, Hérida Regina Nunes; Isaac, Vera Lucia Borges

    2015-01-01

    Lycopene is a carotenoid found in tomatoes with potent antioxidant activity. The aim of the study was to obtain an extract containing lycopene from four types of tomatoes, validate a quantification method for the extracts by HPLC, and assess their antioxidant activity. Results revealed that the tomatoes analyzed contained lycopene and showed antioxidant activity. Salad tomato presented the highest concentration of this carotenoid and the highest antioxidant activity. The quantification method exhibited linearity with a correlation coefficient of 0.9992. Tests for the assessment of precision, accuracy, and robustness achieved coefficients of variation of less than 5%. The LOD and LOQ were 0.0012 and 0.0039 μg/mL, respectively. Salad tomato can be used as a source of lycopene for the development of topical formulations, and based on the tests performed, the chosen method for the identification and quantification of lycopene was considered to be linear, precise, exact, selective, and robust.
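A standard way to derive LOD and LOQ figures like those reported above is the ICH Q2 approach, from the residual standard deviation of the calibration curve and its slope; whether this exact procedure was used in the study is an assumption, and the sigma and slope values below are purely illustrative.

```python
# ICH-style detection and quantification limits from calibration data:
#   LOD = 3.3 * sigma / S      LOQ = 10 * sigma / S
# where sigma is the residual standard deviation of the calibration curve
# and S its slope. Numbers below are illustrative assumptions.

def detection_limits(sigma, slope):
    return 3.3 * sigma / slope, 10.0 * sigma / slope

lod, loq = detection_limits(sigma=0.012, slope=33.0)
print(round(lod, 4), round(loq, 4))  # 0.0012 0.0036 (ug/mL)
```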

  7. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    Germain, Pierre-Luc

    2016-06-20

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created a R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  8. Quantification model for energy consumption in edification

    Mercader, Mª P.

    2012-12-01

    Full Text Available The research conducted in this paper focuses on the generation of a model for the quantification of energy consumption in building. This is done through one of the most relevant environmental impact indicators associated with weight per m2 of construction: the energy consumption resulting from the manufacturing process of the materials used in building construction. The practical application of the proposed model to different building typologies in Seville will provide information regarding the building materials, subsystems and construction elements with the greatest impact, allowing us to observe the influence the built surface has on the environment. The results obtained aim to serve as a reference for the scientific community, providing quantitative data comparable with other building types and geographical areas. Furthermore, they may also allow the analysis and characterization of feasible solutions to reduce the environmental impact generated by the different materials, subsystems and construction elements commonly used in the building types defined in this study.


  9. Uncertainty quantification metrics for whole product life cycle cost estimates in aerospace innovation

    Schwabe, O.; Shehab, E.; Erkoyuncu, J.

    2015-08-01

    The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates, based on a literature review, an evaluation of publicly funded projects such as those within the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defense, the ESA, and various commercial companies. The metrics are categorized by their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). The insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and on the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and the number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families, with identifiable thresholds for transitioning between them. The key research gaps identified were the lack of theoretically grounded guidance for the selection of uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem.
An innovative uncertainty quantification framework, consisting of a set-theory-based typology, a data library, a classification system, and a corresponding input-output model, is put forward to address this research gap as the basis

  10. F-theory Superspace

    Linch, William D

    2015-01-01

    We consider, at the linearized level, the superspace formulation of lower-dimensional F-theory. In particular, we describe the embedding of 3D Type II supergravity of the superstring, or 4D, N=1 supergravity of M-theory, into the corresponding F-theory in full detail, giving the linearized action and gauge transformations in terms of the prepotential. This manifestly supersymmetric formulation reveals some features not evident from a component treatment, such as Weyl and local S-supersymmetry invariances. The linearized multiplet appears as a super 3-form (just as that for the manifestly T-dual theory is a super 2-form), reflecting the embedding of M-theory (as the T-dual theory embeds Type II supergravity). We also give the embedding of matter multiplets into this superspace, and derive the F-constraint from the gauge invariance of the gauge invariance.

  11. Entanglement quantification by local unitaries

    Monras, A; Giampaolo, S M; Gualdi, G; Davies, G B; Illuminati, F

    2011-01-01

    Invariance under local unitary operations is a fundamental property that must be obeyed by every proper measure of quantum entanglement. However, this is not the only aspect of entanglement theory where local unitaries play a relevant role. In the present work we show that the application of suitable local unitary operations defines a family of bipartite entanglement monotones, collectively referred to as "shield entanglement". They are constructed by first considering the (squared) Hilbert-Schmidt distance of the state from the set of states obtained by applying to it a given local unitary. To the action of each different local unitary there corresponds a different distance. We then minimize these distances over the sets of local unitaries with different spectra, obtaining an entire family of different entanglement monotones. We show that these shield entanglement monotones are organized in a hierarchical structure, and we establish the conditions that need to be imposed on the spectrum of a local unitary f...

  12. Quantifications and Modeling of Human Failure Events in a Fire PSA

    Kang, Dae Il; Kim, Kilyoo; Jang, Seung-Cheol [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    USNRC and EPRI developed guidance, 'Fire Human Reliability Analysis Guidelines, NUREG-1921', for estimating human error probabilities (HEPs) for HFEs under fire conditions. NUREG-1921 classifies HFEs into four types associated with the following human actions: - Type 1: New and existing Main Control Room (MCR) actions - Type 2: New and existing ex-MCR actions - Type 3: Actions associated with using alternate shutdown means (ASD) - Type 4: Actions relating to errors of commission (EOCs) or errors of omission (EOOs) as a result of incorrect indications (SPI) In this paper, approaches for the quantification and modeling of HFEs related to Type 1, 2 and 3 human actions are introduced, based on the human reliability analysis process for a fire PSA of Hanul Unit 3. A multiplier of 10 was used to re-estimate the HEPs for the pre-existing internal human actions. The HEPs for all ex-MCR actions were assumed to be one. New MCR human actions were quantified using the scoping analysis method of NUREG-1921. If a quantified human action was identified as risk-significant, detailed approaches (modeling and quantification) were used to incorporate fire situations into it. Multiple HFEs for a single human action were defined and separately quantified to incorporate the specific fire situations. From this study, we can confirm that the modeling as well as the quantification of human actions is very important to treat them appropriately in PSA logic structures.
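The screening adjustments described above can be sketched as follows. The action names and base HEPs are illustrative assumptions, not values from the Hanul Unit 3 study.

```python
# Sketch of the screening-style adjustment described in the abstract:
# pre-fire HEPs are multiplied by 10 for fire conditions and capped at 1.0,
# while ex-MCR actions are conservatively set to a failure probability of 1.
# Action names and base HEPs below are illustrative assumptions.

def fire_adjusted_hep(base_hep, ex_mcr=False, multiplier=10.0):
    if ex_mcr:
        return 1.0                           # ex-MCR actions assumed failed
    return min(base_hep * multiplier, 1.0)   # probability cannot exceed 1

actions = {"start aux feedwater (MCR)": 5e-3,
           "align alternate shutdown panel": 2e-2}
print(fire_adjusted_hep(actions["start aux feedwater (MCR)"]))  # 10x the base HEP
print(fire_adjusted_hep(actions["align alternate shutdown panel"], ex_mcr=True))  # 1.0
```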

  13. Matrix string theory

    Dijkgraaf, Robbert; Verlinde, Erik; Verlinde, Herman

    1997-02-01

    Via compactification on a circle, the matrix model of M-theory proposed by Banks et al. suggests a concrete identification between the large N limit of two-dimensional N = 8 supersymmetric Yang-Mills theory and type IIA string theory. In this paper we collect evidence that supports this identification. We explicitly identify the perturbative string states and their interactions, and describe the appearance of D-particle and D-membrane states.

  14. Matrix string theory

    Dijkgraaf, R. [Amsterdam Univ. (Netherlands). Dept. of Mathematics]; Verlinde, E. [TH-Division, CERN, CH-1211 Geneva 23 (Switzerland); Institute for Theoretical Physics, University of Utrecht, 3508 TA Utrecht (Netherlands)]; Verlinde, H. [Institute for Theoretical Physics, University of Amsterdam, 1018 XE Amsterdam (Netherlands)]

    1997-09-01

    Via compactification on a circle, the matrix model of M-theory proposed by Banks et al. suggests a concrete identification between the large N limit of two-dimensional N=8 supersymmetric Yang-Mills theory and type IIA string theory. In this paper we collect evidence that supports this identification. We explicitly identify the perturbative string states and their interactions, and describe the appearance of D-particle and D-membrane states. (orig.).

  15. Matrix String Theory

    Dijkgraaf, R; Verlinde, Herman L

    1997-01-01

    Via compactification on a circle, the matrix model of M-theory proposed by Banks et al. suggests a concrete identification between the large N limit of two-dimensional N=8 supersymmetric Yang-Mills theory and type IIA string theory. In this paper we collect evidence that supports this identification. We explicitly identify the perturbative string states and their interactions, and describe the appearance of D-particle and D-membrane states.

  16. Introduction to superstring theory

    Nunez, Carmen [Instituto de Astronomia y Fisica del Espacio, Buenos Aires (Argentina)], e-mail: carmen@iafe.uba.ar

    2009-07-01

    This is a very basic introduction to the AdS/CFT correspondence. The first lecture motivates the duality between gauge theories and gravity/string theories. The next two lectures introduce the bosonic and supersymmetric string theories. The fourth lecture is devoted to the study of Dp-branes and finally, in the fifth lecture, I discuss the two worlds: N=4 SYM in 3+1 flat dimensions and type IIB superstrings in AdS{sub 5} x S{sup 5}. (author)

  17. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    Paulson, Patrick R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Purohit, Sumit [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rodriguez, Luke R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.

  18. Quantification of coating aging using impedance measurements

    Westing, E.P.M. van; Weijde, D.H. van der; Vreijling, M.P.W.; Ferrari, G.M.; Wit, J.H.W. de

    1998-01-01

    This chapter shows the application results of a novel approach to quantify the ageing of organic coatings using impedance measurements. The ageing quantification is based on the typical impedance behaviour of barrier coatings in immersion. This immersion behaviour is used to determine the limiting c

  19. Perfusion Quantification Using Gaussian Process Deconvolution

    Andersen, Irene Klærke; Have, Anna Szynkowiak; Rasmussen, Carl Edward;

    2002-01-01

    The quantification of perfusion using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residual impulse response function (IRF). In this work, a method using the Gaussian process for deconvolution (GPD) is proposed. The fact that the IRF is smooth is incorporated...
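
The GPD scheme summarized above can be sketched as Gaussian-process posterior inference under a discrete convolution model c = A r + noise, where A is built from the arterial input function (AIF) and r is the residual impulse response function (IRF). The gamma-variate AIF, squared-exponential kernel, and noise level below are illustrative assumptions, not the paper's actual settings:

```python
import numpy as np

# Sketch of GP deconvolution for DSC-MRI perfusion: the tissue curve is
# modeled as c = A r + noise, with A a causal convolution matrix built
# from the AIF, and a smoothness (GP) prior placed on the IRF r.
n, dt = 40, 1.0
t = np.arange(n) * dt

aif = t**3 * np.exp(-t / 1.5)          # illustrative gamma-variate AIF
aif /= aif.max()
A = dt * np.array([[aif[i - j] if i >= j else 0.0
                    for j in range(n)] for i in range(n)])

r_true = np.exp(-t / 8.0)              # smooth "ground-truth" IRF
sigma = 1e-3
rng = np.random.default_rng(0)
c = A @ r_true + sigma * rng.standard_normal(n)

# Squared-exponential prior on r encodes the smoothness assumption of GPD.
ell = 5.0
K = np.exp(-0.5 * (t[:, None] - t[None, :])**2 / ell**2)

# GP posterior mean of r given the noisy concentration curve c.
r_hat = K @ A.T @ np.linalg.solve(A @ K @ A.T + sigma**2 * np.eye(n), c)
```

The smoothness prior regularizes an otherwise ill-posed deconvolution; in practice the hyperparameters (length scale `ell`, noise level `sigma`) would be estimated from the data rather than fixed as here.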

  20. Quantification of topological concepts using ideals

    Robert Lowen

    2001-01-01

    Full Text Available We introduce certain ideals of real-valued functions as a natural generalization of filters. We show that these ideals establish a canonical framework for the quantification of topological concepts, such as closedness, adherence, and compactness, in the setting of approach spaces.

  1. Z Theory

    Nekrasov, Nikita

    2004-01-01

    We present the evidence for the existence of the topological string analogue of M-theory, which we call Z-theory. The corners of Z-theory moduli space correspond to the Donaldson-Thomas theory, Kodaira-Spencer theory, Gromov-Witten theory, and Donaldson-Witten theory. We discuss the relations of Z-theory with Hitchin's gravities in six and seven dimensions, and make our own proposal, involving a spinor generalization of the Chern-Simons theory of three-forms. Based on the talk at Strings'04 in Paris.

  2. Constructor theory of probability.

    Marletto, Chiara

    2016-08-01

    Unitary quantum theory, having no Born Rule, is non-probabilistic. Hence the notorious problem of reconciling it with the unpredictability and appearance of stochasticity in quantum measurements. Generalizing and improving upon the so-called 'decision-theoretic approach', I shall recast that problem in the recently proposed constructor theory of information-where quantum theory is represented as one of a class of superinformation theories, which are local, non-probabilistic theories conforming to certain constructor-theoretic conditions. I prove that the unpredictability of measurement outcomes (to which constructor theory gives an exact meaning) necessarily arises in superinformation theories. Then I explain how the appearance of stochasticity in (finitely many) repeated measurements can arise under superinformation theories. And I establish sufficient conditions for a superinformation theory to inform decisions (made under it) as if it were probabilistic, via a Deutsch-Wallace-type argument-thus defining a class of decision-supporting superinformation theories. This broadens the domain of applicability of that argument to cover constructor-theory compliant theories. In addition, in this version some of the argument's assumptions, previously construed as merely decision-theoretic, follow from physical properties expressed by constructor-theoretic principles.

  3. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    Haihua Zhao; Vincent A. Mousseau

    2011-09-01

    Verification and validation (V&V) are playing more important roles in quantifying uncertainties and realizing high-fidelity simulations in engineering system analyses, such as transients occurring in a complex nuclear reactor system. Traditional V&V in reactor system analysis focused more on the validation part or did not differentiate verification and validation. The traditional approach to uncertainty quantification is based on a 'black box' approach: the simulation tool is treated as an unknown signal generator, a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. The 'black box' method mixes numerical errors with all other uncertainties. It is also not efficient for performing sensitivity analysis. Contrary to the 'black box' method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In these types of approaches equations for the propagation of uncertainty are constructed and the sensitivities are directly solved for as variables in the simulation. This paper presents forward sensitivity analysis as a method to help uncertainty quantification. By including the time step and potentially the spatial step as special sensitivity parameters, the forward sensitivity method is extended as one method to quantify numerical errors. Note that by integrating local truncation errors over the whole system through the forward sensitivity analysis process, the generated time step and spatial step sensitivity information reflect global numerical errors. The discretization errors can be systematically compared against uncertainties due to other physical parameters. This extension makes the forward sensitivity method a much more powerful tool to help uncertainty quantification. By knowing the relative sensitivity of time and space steps with other interested physical
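
The forward-sensitivity idea (augmenting the governing equations with equations for the derivative of the solution with respect to a parameter, solved alongside the state) can be illustrated on a toy decay ODE. The model and parameter values are hypothetical, not drawn from the reactor analysis in the record:

```python
import numpy as np

# Forward sensitivity analysis on dy/dt = -p*y, y(0) = y0.
# The sensitivity s = dy/dp obeys the augmented ODE
#   ds/dt = (df/dy)*s + df/dp = -p*s - y,   s(0) = 0,
# and is integrated together with the state (here with classical RK4).
p, y0 = 0.8, 1.0

def rhs(state):
    y, s = state
    return np.array([-p * y, -p * s - y])

def rk4_step(state, h):
    k1 = rhs(state)
    k2 = rhs(state + 0.5 * h * k1)
    k3 = rhs(state + 0.5 * h * k2)
    k4 = rhs(state + h * k3)
    return state + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

h, T = 0.01, 2.0
state = np.array([y0, 0.0])
for _ in range(int(T / h)):
    state = rk4_step(state, h)

y_num, s_num = state
# Analytic check: y = y0*exp(-p*t), so dy/dp = -t*y0*exp(-p*t).
s_exact = -T * y0 * np.exp(-p * T)
```

The same construction generalizes to treating the time step itself as a sensitivity parameter, which is the extension this record proposes for separating discretization error from physical-parameter uncertainty.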

  4. Quantum Information Theory

    Nielsen, M. A.

    2000-01-01

    Quantum information theory is the study of the achievable limits of information processing within quantum mechanics. Many different types of information can be accommodated within quantum mechanics, including classical information, coherent quantum information, and entanglement. Exploring the rich variety of capabilities allowed by these types of information is the subject of quantum information theory, and of this Dissertation. In particular, I demonstrate several novel limits to the informa...

  5. Gauge Theories of Gravitation

    Blagojević, Milutin

    2012-01-01

    During the last five decades, gravity, as one of the fundamental forces of nature, has been formulated as a gauge field theory of the Weyl-Cartan-Yang-Mills type. The resulting theory, the Poincaré gauge theory of gravity, encompasses Einstein's gravitational theory as well as the teleparallel theory of gravity as subcases. In general, the spacetime structure is enriched by Cartan's torsion and the new theory can accommodate fermionic matter and its spin in a perfectly natural way. The present reprint volume contains articles from the most prominent proponents of the theory and is supplemented by detailed commentaries of the editors. This guided tour starts from special relativity and leads, in its first part, to general relativity and its gauge type extensions a la Weyl and Cartan. Subsequent stopping points are the theories of Yang-Mills and Utiyama and, as a particular vantage point, the theory of Sciama and Kibble. Later, the Poincaré gauge theory and its generalizations are explored and specific topi...

  6. Massive IIA string theory and Matrix theory compactification

    Lowe, David A. E-mail: lowe@het.brown.edu; Nastase, Horatiu; Ramgoolam, Sanjaye

    2003-09-08

    We propose a Matrix theory approach to Romans' massive Type IIA supergravity. It is obtained by applying the procedure of Matrix theory compactifications to Hull's proposal of the massive Type IIA string theory as M-theory on a twisted torus. The resulting Matrix theory is a super-Yang-Mills theory on large N three-branes with a space-dependent noncommutativity parameter, which is also independently derived by a T-duality approach. We give evidence showing that the energies of a class of physical excitations of the super-Yang-Mills theory show the correct symmetry expected from massive Type IIA string theory in a lightcone quantization.

  7. Half-Metallic p-Type LaAlO3/EuTiO3 Heterointerface from Density-Functional Theory

    Lu, Hai-Shuang; Cai, Tian-Yi; Ju, Sheng; Gong, Chang-De

    2015-03-01

    The two-dimensional electron gas (2DEG) observed at the LaAlO3/SrTiO3 heterointerface has attracted intense research interest in recent years. The high mobility, electric tunability, and giant persistent photoconductivity suggest its potential for electronic and photonic applications. The lack of a p-type counterpart as well as a highly spin-polarized carrier in the LaAlO3/SrTiO3 system, however, restricts its widespread application, since both multiple carriers and high spin polarization are very desirable for electronic devices. Here, we report a system of LaAlO3/EuTiO3 digital heterostructures that may overcome these limitations. Results from first-principles calculations reveal that the 2DEG in the n-type LaAlO3/EuTiO3 is a normal ferromagnet. The p-type two-dimensional hole gas, on the other hand, is a 100% spin-polarized half-metal. For digital heterostructures with alternating n-type and p-type interfaces, a magnetic-field-driven insulator-to-metal transition, together with spatially separated electrons and holes, can be realized by tuning the intrinsic polar field. At low temperatures, the spin-polarized electron-hole pairs may result in spin-triplet exciton condensation, which provides an experimentally accessible system for achieving the theoretically proposed dissipationless spin transport. Our findings open a path for exploring spintronics at the heterointerface of transition-metal oxides.

  8. Detection and quantification of microRNA in cerebral microdialysate

    Bache, Søren; Rasmussen, Rune; Rossing, Maria

    2015-01-01

    BACKGROUND: Secondary brain injury accounts for a major part of the morbidity and mortality in patients with spontaneous aneurysmal subarachnoid hemorrhage (SAH), but the pathogenesis and pathophysiology remain controversial. MicroRNAs (miRNAs) are important posttranscriptional regulators of complementary mRNA targets and have been implicated in the pathophysiology of other types of acute brain injury. Cerebral microdialysis is a promising tool to investigate these mechanisms. We hypothesized that miRNAs would be present in human cerebral microdialysate. METHODS: RNA was extracted and miRNA profiles were established using high-throughput real-time quantification PCR on the following material: 1) Microdialysate sampled in vitro from A) a solution of total RNA extracted from human brain, B) cerebrospinal fluid (CSF) from a neurologically healthy patient, and C) a patient with SAH; and 2...

  9. The effect of surfaces type on vibration behavior of piezoelectric micro-cantilever close to sample surface in a humid environment based on MCS theory

    Korayem, M. H.; Korayem, A. H.

    2016-08-01

    Atomic force microscopy (AFM) is known as an innovative tool in the fields of surface topography, determination of different mechanical properties, and manipulation of particles at the micro- and nanoscales. This paper is concerned with advanced modeling and dynamic simulation of the AFM micro-cantilever (MC) in the amplitude mode in the air environment. To increase the accuracy of the governing equations, modified couple stress theory, appropriate at the micro- and nanoscales, has been utilized based on Timoshenko beam theory in the air environment near the sample surface. Also, to discretize the equations, the differential quadrature method is recommended. The modeling considers geometric discontinuities due to the presence of a piezoelectric layer enclosed between two electrode layers and the change in MC cross section where the layer is connected to the MC. In addition to the effect of MC modeling on the accuracy of the simulated vibration amplitude during surface topography, understanding and modeling the environmental forces in the air environment, including the van der Waals, capillary, and contact forces, are important. This paper provides more accurate modeling of these environmental forces and investigates the vibration behavior of the piezoelectric MC in a humid environment. Moreover, it examines the maximum and minimum MC amplitude in the air environment close to surfaces with different kinds of topography. The results illustrate that the kind of surface affects the maximum and minimum amplitude through the decrease or increase in the equilibrium MC distance.

  10. Identification and Quantification of Protein Glycosylation

    Ziv Roth

    2012-01-01

    Full Text Available Glycosylation is one of the most abundant posttranslational modifications of proteins, and accumulating evidence indicates that the vast majority of proteins in eukaryotes are glycosylated. Glycosylation plays a role in protein folding, interaction, stability, and mobility, as well as in signal transduction. Thus, by regulating protein activity, glycosylation is involved in the normal functioning of the cell and in the development of diseases. Indeed, in the past few decades there has been a growing realization of the importance of protein glycosylation, as aberrant glycosylation has been implicated in metabolic, neurodegenerative, and neoplastic diseases. Thus, the identification and quantification of protein-borne oligosaccharides have become increasingly important both in the basic sciences of biochemistry and glycobiology and in the applicative sciences, particularly biomedicine and biotechnology. Here, we review the state-of-the-art methodologies for the identification and quantification of oligosaccharides, specifically N- and O-glycosylated proteins.

  11. Whitepaper on Uncertainty Quantification for MPACT

    Williams, Mark L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-17

    The MPACT code provides the ability to perform high-fidelity deterministic calculations to obtain a wide variety of detailed results for very complex reactor core models. However MPACT currently does not have the capability to propagate the effects of input data uncertainties to provide uncertainties in the calculated results. This white paper discusses a potential method for MPACT uncertainty quantification (UQ) based on stochastic sampling.
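
Stochastic-sampling UQ of the kind proposed here follows a generic recipe: draw input samples from assumed distributions, run the model once per sample, and summarize the spread of the outputs. A minimal sketch with a stand-in analytic model (a real study would execute the simulation code, e.g. MPACT, in place of the placeholder function; distributions and values are illustrative):

```python
import numpy as np

# Generic stochastic-sampling UQ loop. The "model" below is a cheap
# stand-in for an expensive code run; inputs are sampled from assumed
# distributions and the output distribution is characterized empirically.
rng = np.random.default_rng(42)

def model(x1, x2):
    # Placeholder response surface; purely illustrative.
    return 2.0 * x1 + 0.5 * x2**2

n = 10_000
x1 = rng.normal(1.0, 0.05, n)    # e.g., a cross section with 5% uncertainty
x2 = rng.uniform(0.9, 1.1, n)    # e.g., a geometry tolerance
y = model(x1, x2)

mean, std = y.mean(), y.std(ddof=1)
lo95, hi95 = np.percentile(y, [2.5, 97.5])
```

The appeal of this approach is that the code is treated as a black box: no adjoint or sensitivity equations are needed, at the cost of many forward runs whose statistical error shrinks only as 1/sqrt(n).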

  12. Standardized Relative Quantification of Immunofluorescence Tissue Staining

    sprotocols

    2015-01-01

    Authors: Oriol Arqués, Irene Chicote, Stephan Tenbaum, Isabel Puig & Héctor G. Palmer ### Abstract The detection of correlations between the expression levels or sub-cellular localization of different proteins with specific characteristics of human tumors, such as grade of malignancy, may give important hints of functional associations. Here we describe the method we use for relative quantification of immunofluorescence staining of tumor tissue sections, which allows us to co...

  13. Automated quantification of synapses by fluorescence microscopy.

    Schätzle, Philipp; Wuttke, René; Ziegler, Urs; Sonderegger, Peter

    2012-02-15

    The quantification of synapses in neuronal cultures is essential in studies of the molecular mechanisms underlying synaptogenesis and synaptic plasticity. Conventional counting of synapses based on morphological or immunocytochemical criteria is extremely work-intensive. We developed a fully automated method which quantifies synaptic elements and complete synapses based on immunocytochemistry. Pre- and postsynaptic elements are detected by their corresponding fluorescence signals and their proximity to dendrites. Synapses are defined as the combination of a pre- and postsynaptic element within a given distance. The analysis is performed in three dimensions and all parameters required for quantification can be easily adjusted by a graphical user interface. The integrated batch processing enables the analysis of large datasets without any further user interaction and is therefore efficient and timesaving. The potential of this method was demonstrated by an extensive quantification of synapses in neuronal cultures from DIV 7 to DIV 21. The method can be applied to all datasets containing a pre- and postsynaptic labeling plus a dendritic or cell surface marker.
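
The core of the proximity rule described above (a synapse is counted when a pre- and a postsynaptic element lie within a given 3D distance) reduces to a nearest-neighbour matching test. The coordinates and distance threshold below are invented for illustration, not taken from the paper:

```python
import numpy as np

def count_synapses(pre, post, max_dist):
    """Pair each presynaptic punctum with its nearest unused postsynaptic
    punctum; count the pair as a synapse if it lies within max_dist (3D)."""
    pre = np.asarray(pre, dtype=float).reshape(-1, 3)
    post = np.asarray(post, dtype=float).reshape(-1, 3)
    if post.shape[0] == 0:
        return 0
    # Full pre-to-post distance matrix in three dimensions.
    dists = np.linalg.norm(pre[:, None, :] - post[None, :, :], axis=2)
    used = np.zeros(post.shape[0], dtype=bool)
    count = 0
    for i in range(pre.shape[0]):
        row = np.where(used, np.inf, dists[i])   # mask already-paired puncta
        j = int(np.argmin(row))
        if row[j] <= max_dist:
            used[j] = True
            count += 1
    return count

# Hypothetical punctum centroids (same length unit as max_dist).
n_syn = count_synapses([(0, 0, 0), (10, 0, 0), (20, 0, 0)],
                       [(0.3, 0, 0), (10.5, 0, 0), (40, 0, 0)],
                       max_dist=1.0)  # -> 2
```

In the published pipeline the puncta themselves come from fluorescence-channel segmentation and are additionally filtered by proximity to dendrites; this sketch only covers the final pairing-and-counting step.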

  14. Uncertainty Quantification with Applications to Engineering Problems

    Bigoni, Daniele

    The systematic quantification of the uncertainties affecting dynamical systems and the characterization of the uncertainty of their outcomes is critical for engineering design and analysis, where risks must be reduced as much as possible. Uncertainties stem naturally from our limitations in measurements, predictions and manufacturing, and we can say that any dynamical system used in engineering is subject to some of these uncertainties. The first part of this work presents an overview of the mathematical framework used in Uncertainty Quantification (UQ) analysis and introduces the spectral tensor... some auxiliary properties, we will apply PC on it, obtaining the STT-decomposition. This will allow the decoupling of each dimension, leading to a much cheaper construction of the PC surrogate. In the associated paper, the capabilities of the STT-decomposition are checked on commonly used test...

  15. Towards a Mathematical Theory of Knowledge

    Ru-Qian Lu

    2005-01-01

    A typed category theory is proposed for the abstract description of knowledge and knowledge processing. It differs from the traditional category theory in two directions: all morphisms have types and the composition of morphisms is not necessarily a morphism. Two aspects of application of typed category theory are discussed: cones and limits of knowledge complexity classes and knowledge completion with pseudo-functors.

  16. Modeling the effect of intermolecular force on the size-dependent pull-in behavior of beam-type NEMS using modified couple stress theory

    Beni, Yaghoub Tadi; Karimipour, Iman [Shahrekord University, Shahrekord (Iran, Islamic Republic of); Abadyan, Mohamadreza [Islamic Azad University, Shahrekord (Iran, Islamic Republic of)

    2014-09-15

    Experimental observations reveal that the physical response of nanostructures is size-dependent. Herein, modified couple stress theory has been used to study the effect of the intermolecular van der Waals force on the size-dependent pull-in of nano-bridges and nano-cantilevers. Three approaches, including using the differential transformation method, applying a numerical method, and developing a simple lumped-parameter model, have been employed to solve the governing equation of the systems. The pull-in parameters, i.e., the critical tip deflection and instability voltage of the nanostructures, have been determined. The effects of the van der Waals attraction and the size dependency, and the importance of the coupling between them for the pull-in performance, have been discussed.

  17. AdS{sub 3} x{sub w} (S{sup 3} x S{sup 3} x S{sup 1}) solutions of type IIB string theory

    Donos, Aristomenis [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)]; Gauntlett, Jerome P. [Imperial College, London (United Kingdom). Blackett Lab.; Imperial College, London (United Kingdom). The Institute for Mathematical Sciences]; Sparks, James [Oxford Univ. (United Kingdom). Mathematical Institute]

    2008-10-15

    We analyse a recently constructed class of local solutions of type IIB supergravity that consist of a warped product of AdS{sub 3} with a seven-dimensional internal space. In one duality frame the only other nonvanishing fields are the NS three-form and the dilaton. We analyse in detail how these local solutions can be extended to globally well-defined solutions of type IIB string theory, with the internal space having topology S{sup 3} x S{sup 3} x S{sup 1} and with properly quantised three-form flux. We show that many of the dual (0,2) SCFTs are exactly marginal deformations of the (0,2) SCFTs whose holographic duals are warped products of AdS{sub 3} with seven-dimensional manifolds of topology S{sup 3} x S{sup 2} x T{sup 2}. (orig.)

  18. Program Theory Evaluation: Logic Analysis

    Brousselle, Astrid; Champagne, Francois

    2011-01-01

    Program theory evaluation, which has grown in use over the past 10 years, assesses whether a program is designed in such a way that it can achieve its intended outcomes. This article describes a particular type of program theory evaluation--logic analysis--that allows us to test the plausibility of a program's theory using scientific knowledge.

  19. Predicting the consumption of foods low in saturated fats among people diagnosed with Type 2 diabetes and cardiovascular disease. The role of planning in the theory of planned behaviour.

    White, Katherine M; Terry, Deborah J; Troup, Carolyn; Rempel, Lynn A; Norman, Paul

    2010-10-01

    The present study tested the utility of an extended version of the theory of planned behaviour that included a measure of planning, in the prediction of eating foods low in saturated fats among adults diagnosed with Type 2 diabetes and/or cardiovascular disease. Participants (N=184) completed questionnaires assessing standard theory of planned behaviour measures (attitude, subjective norm, and perceived behavioural control) and the additional volitional variable of planning in relation to eating foods low in saturated fats. Self-reported consumption of foods low in saturated fats was assessed 1 month later. In partial support of the theory of planned behaviour, results indicated that attitude and subjective norm predicted intentions to eat foods low in saturated fats and intentions and perceived behavioural control predicted the consumption of foods low in saturated fats. As an additional variable, planning predicted the consumption of foods low in saturated fats directly and also mediated the intention-behaviour and perceived behavioural control-behaviour relationships, suggesting an important role for planning as a post-intentional construct determining healthy eating choices. Suggestions are offered for interventions designed to improve adherence to healthy eating recommendations for people diagnosed with these chronic conditions with a specific emphasis on the steps and activities that are required to promote a healthier lifestyle.

  20. Forest Carbon Leakage Quantification Methods and Their Suitability for Assessing Leakage in REDD

    Sabine Henders

    2012-01-01

    Full Text Available This paper assesses quantification methods for carbon leakage from forestry activities for their suitability for leakage accounting in a future Reducing Emissions from Deforestation and Forest Degradation (REDD) mechanism. To that end, we first conducted a literature review to identify specific pre-requisites for leakage assessment in REDD. We then analyzed a total of 34 quantification methods for leakage emissions from the Clean Development Mechanism (CDM), the Verified Carbon Standard (VCS), the Climate Action Reserve (CAR), the CarbonFix Standard (CFS), and from scientific literature sources. We screened these methods for the leakage aspects they address in terms of leakage type, tools used for quantification and the geographical scale covered. Results show that leakage methods can be grouped into nine main methodological approaches, six of which could fulfill the recommended REDD leakage requirements if approaches for primary and secondary leakage are combined. The majority of methods assessed address either primary or secondary leakage; the former mostly on a local or regional scale and the latter on a national scale. The VCS is found to be the only carbon accounting standard at present to fulfill all leakage quantification requisites in REDD. However, a lack of accounting methods was identified for international leakage, which was addressed by only two methods, both from scientific literature.

  1. A simple dot-blot-Sirius red-based assay for collagen quantification.

    Rodríguez-Rodríguez, Pilar; Arribas, Silvia M; de Pablo, Angel Luis López; González, M Carmen; Abderrahim, Fatima; Condezo-Hoyos, Luis

    2013-08-01

    The assessment of collagen content in tissues is important in biomedical research, since this protein is altered in numerous diseases. Hydroxyproline- and Sirius-red-based assays are the most common methods for collagen quantification. However, these procedures have some pitfalls, such as the requirement of oxygen-free medium or expensive equipment and large sample size, or being unsuitable for hydrolyzed collagen, respectively. Our objective was to develop a specific, versatile, and user-friendly quantitative method applicable to small tissue samples and extracts obtained from elastin purification, therefore suitable for simultaneous quantification of elastin. This method is based on the binding of Sirius red to collagen present in a sample immobilized on a PVDF membrane, as in the dot-blot technique, and quantified by a scanner and image analysis software. Sample loading, Sirius red concentration, temperature and incubation time, type of standard substance, albumin interference, and quantification time are optimized. The method enabled the quantification of (1) intact collagen in several rat tissue homogenates, including small resistance-sized arteries, (2) partially hydrolyzed collagen obtained from NaOH extracts, compatible with elastin purification, and (3) differences in collagen content between hypertensive and normotensive rats. We conclude that the developed technique can be widely used since it is versatile (quantifies intact and hydrolyzed collagen), requires small sample volumes, is user-friendly (low-cost, easy to use, minimum toxic materials, and reduced time of test), and is specific (minimal interference with serum albumin).
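
The quantification step of such a dot-blot assay is a standard-curve interpolation: fit the scanned stain signal against known collagen amounts, then invert the fit for unknown samples. A minimal sketch with invented numbers (the actual standards and signal scale come from the assay itself):

```python
import numpy as np

# Standard-curve quantification for dot-blot densitometry: fit a linear
# calibration of Sirius red signal vs. known collagen amounts, then invert
# it to estimate unknown samples. All numbers here are illustrative.
std_amounts = np.array([0.0, 0.5, 1.0, 2.0, 4.0])      # micrograms collagen
std_signal = np.array([0.02, 0.26, 0.51, 1.01, 1.98])  # scanner densitometry

slope, intercept = np.polyfit(std_amounts, std_signal, 1)

def collagen_amount(signal):
    """Invert the calibration: estimated micrograms for a measured signal."""
    return (np.asarray(signal, dtype=float) - intercept) / slope

unknown = collagen_amount([0.75, 1.50])  # estimates for two unknown dots
```

Running standards on every membrane, as the dot-blot format allows, keeps the calibration specific to that membrane's staining and scanning conditions.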

  2. New approach for the quantification of processed animal proteins in feed using light microscopy.

    Veys, P; Baeten, V

    2010-07-01

    A revision of European Union's total feed ban on animal proteins in feed will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive preserving all morphological markers of bones necessary for accurate identification and precision counting; (2) the use of a counting grid eyepiece reticle; and (3) new definitions for correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved to be effortless to apply. The results obtained were very close to the expected values of contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable, but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed.

  3. Comparative Analysis for the Urban Metabolic Differences of Two Types of Cities in the Resource-Dependent Region Based on Emergy Theory

    Chang Liu

    2016-07-01

    Full Text Available Urban metabolism analysis has become a useful and effective tool to explore urban socio-economic processes. In this research, in order to explore the similarities and differences in the metabolic characteristics and variation rules of different types of resource-dependent cities, we selected two cities, Taiyuan and Jincheng (the capital and a traditional resource-dependent city of Shanxi province, respectively), as research subjects. We established an urban metabolic evaluation framework employing a set of eight emergy-based indicators computed from socio-economic data for 2007 to 2014, compared the similarities and discrepancies from the perspectives of metabolic structure, intensity, pressure, and efficiency, put forward some suggestions for pursuing sustainable development in both cities, and pointed out that more types of resource-dependent cities should be incorporated in future research work.

  4. F-theory with Worldvolume Sectioning

    Linch, William D

    2015-01-01

    We describe the worldvolume for the bosonic sector of the lower-dimensional F-theory that embeds 5D, N=1 M-theory and the 4D type II superstring. This theory is a complexification of the fundamental 5-brane theory that embeds the 4D, N=1 M-theory of the 3D type II string in a sense that we make explicit at the level of the Lagrangian and Hamiltonian formulations. We find three types of section condition: in spacetime, on the worldvolume, and one tying them together. The 5-brane theory is recovered from the new theory by a double dimensional reduction.

  5. Comparing determination methods of detection and quantification limits for aflatoxin analysis in hazelnut

    2016-01-01

    Hazelnut is a type of plant that grows in wet and humid climatic conditions. Adverse climatic conditions result in the formation of aflatoxin in hazelnuts during the harvesting, drying, and storing processes. Aflatoxin is considered an important food contaminant, which makes aflatoxin analysis important in the international produce trade. For this reason, validation is important for the analysis of aflatoxin in hazelnuts. The limit of detection (LOD) and limit of quantification (LOQ) are two ...
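One common way to determine LOD and LOQ, among the methods such a validation study compares, uses the standard deviation of blank (or low-level) responses and the calibration slope, as in the ICH formulas LOD = 3.3σ/S and LOQ = 10σ/S. A minimal sketch with made-up data:

```python
import statistics

def lod_loq_from_calibration(blank_responses, slope):
    """ICH-style detection and quantification limits:
    LOD = 3.3 * sigma / S, LOQ = 10 * sigma / S,
    where sigma is the standard deviation of blank responses and
    S is the calibration-curve slope."""
    sigma = statistics.stdev(blank_responses)
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Illustrative blank readings and slope (not data from the paper):
lod, loq = lod_loq_from_calibration([1.0, 1.2, 0.8, 1.1, 0.9], slope=2.0)
```

Other determination approaches (signal-to-noise ratio, visual evaluation) would replace the sigma estimate but keep the same structure.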

  6. A Study of Translating Jing Ye Si into English with Reiss's Text Type Theory

    陈龙; 黄万武

    2014-01-01

    Text type theory, put forward by the German functionalist scholar Reiss in the 1970s, offers different translation strategies and evaluation methods for different types of text. Tang poetry is a treasure of Chinese literary history, and a growing number of scholars are concerned with its translation. This study examines four English versions of Jing Ye Si, a classic lyric Tang poem, from the perspective of Reiss's text type theory. It finds that the theory is easy to apply and is of great significance for translating Tang poetry and disseminating Chinese culture.

  7. Research on English Lexical Acquisition Based on Prototype Theory

    王丽

    2015-01-01

    With the development of cognitive linguistics, the cognitive laws of second language acquisition have aroused general concern. Prototype theory explores, from a cognitive angle, the internal links within language and the relation between language and the objective world, opening a new perspective for English lexical acquisition: strengthening the learning of basic-level category words, centering the acquisition of polysemous words on their prototype meanings, and cultivating awareness of metaphorical and metonymic thinking.

  8. Neural networks and graph theory

    许进; 保铮

    2002-01-01

    The relationships between artificial neural networks and graph theory are considered in detail. The applications of artificial neural networks to many difficult problems of graph theory, especially NP-complete problems, and the applications of graph theory to artificial neural networks are discussed. For example, graph theory is used to study the pattern classification problem on discrete-type feedforward neural networks and the stability analysis of feedback artificial neural networks.

  9. Efficient Quantification of Uncertainties in Complex Computer Code Results Project

    National Aeronautics and Space Administration — Propagation of parameter uncertainties through large computer models can be very resource intensive. Frameworks and tools for uncertainty quantification are...

  10. Development of a VHH-Based Erythropoietin Quantification Assay

    Kol, Stefan; Beuchert Kallehauge, Thomas; Adema, Simon;

    2015-01-01

    Erythropoietin (EPO) quantification during cell line selection and bioreactor cultivation has traditionally been performed with ELISA or HPLC. As these techniques suffer from several drawbacks, we developed a novel EPO quantification assay. A camelid single-domain antibody fragment directed against human EPO was evaluated as a capturing antibody in a label-free biolayer interferometry-based quantification assay. Human recombinant EPO can be specifically detected in Chinese hamster ovary cell supernatants in a sensitive and pH-dependent manner. This method enables rapid and robust quantification of EPO in a high-throughput setting.

  11. Efficient Quantification of Uncertainties in Complex Computer Code Results Project

    National Aeronautics and Space Administration — This proposal addresses methods for efficient quantification of margins and uncertainties (QMU) for models that couple multiple, large-scale commercial or...

  12. Aerodynamic Modeling with Heterogeneous Data Assimilation and Uncertainty Quantification Project

    National Aeronautics and Space Administration — Clear Science Corp. proposes to develop an aerodynamic modeling tool that assimilates data from different sources and facilitates uncertainty quantification. The...

  13. Methodological strategies for transgene copy number quantification in goats (Capra hircus) using real-time PCR.

    Batista, Ribrio I T P; Luciano, Maria C S; Teixeira, Dárcio I A; Freitas, Vicente J F; Melo, Luciana M; Andreeva, Lyudmila E; Serova, Irina A; Serov, Oleg L

    2014-01-01

    Taking into account the importance of goats as transgenic models, as well as the rarity of copy number (CN) studies in farm animals, the present work aimed to evaluate methodological strategies for accurate and precise transgene CN quantification in goats using quantitative polymerase chain reaction (qPCR). Mouse and goat lines transgenic for human granulocyte-colony stimulating factor were used. After selecting the best genomic DNA extraction method to be applied in mouse and goat samples, intra-assay variations, accuracy and precision of CN quantifications were assessed. The optimized conditions were submitted to mathematical strategies and used to quantify CN in goat lines. The findings were as follows: validation of qPCR conditions is required, and amplification efficiency is the most important. Absolute and relative quantifications are able to produce similar results. For normalized absolute quantification, the same plasmid fragment used to generate goat lines must be mixed with wild-type goat genomic DNA, allowing the choice of an endogenous reference gene for data normalization. For relative quantifications, a resin-based genomic DNA extraction method is strongly recommended when using mouse tail tips as calibrators to avoid tissue-specific inhibitors. Efficient qPCR amplifications (≥95%) allow reliable CN measurements with SYBR technology. TaqMan must be used with caution in goats if the nucleotide sequence of the endogenous reference gene is not yet well understood. Adhering to these general guidelines can result in more exact CN determination in goats. Even when working under nonoptimal circumstances, if assays are performed that respect the minimum qPCR requirements, good estimations of transgene CN can be achieved.
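The relative quantification the authors validate is commonly computed with the comparative Ct (ΔΔCt) method. A minimal sketch under the assumption of near-100% amplification efficiency (base 2), consistent with the paper's requirement of ≥95% efficient reactions; the function name and the example Ct values are illustrative, and the calibrator would be a sample of known copy number such as a single-copy mouse line:

```python
def relative_copy_number(ct_target, ct_reference,
                         ct_target_cal, ct_reference_cal,
                         calibrator_copies=1.0, efficiency=2.0):
    """Relative transgene copy number via the ddCt method.

    ct_target / ct_reference: Ct values for transgene and endogenous
    reference gene in the sample; *_cal: same for the calibrator of
    known copy number. efficiency=2.0 assumes ~100% amplification.
    """
    dd_ct = (ct_target - ct_reference) - (ct_target_cal - ct_reference_cal)
    return calibrator_copies * efficiency ** (-dd_ct)
```

A sample whose ΔCt is two cycles smaller than a single-copy calibrator's would be estimated at four copies.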

  14. Stereo-particle image velocimetry uncertainty quantification

    Bhattacharya, Sayantan; Charonko, John J.; Vlachos, Pavlos P.

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall, the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment of the subject and potentially lays foundations applicable to volumetric
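The combination step described above, merging calibration and planar uncertainties into the velocity components, is at heart a first-order (root-sum-square) propagation. A generic sketch; the actual sensitivity coefficients in the paper come from the stereo reconstruction geometry and are not reproduced here:

```python
import math

def combine_uncertainties(sensitivities, uncertainties):
    """First-order uncertainty propagation for y = f(x1, ..., xn):
    u_y = sqrt(sum_i (dy/dx_i * u_xi)^2),
    assuming independent error sources. 'sensitivities' are the partial
    derivatives dy/dx_i, 'uncertainties' the standard uncertainties u_xi."""
    return math.sqrt(sum((s * u) ** 2
                         for s, u in zip(sensitivities, uncertainties)))
```

With a velocity component twice as sensitive to a calibration coefficient as to a planar displacement, the larger sensitivity dominates the combined uncertainty even for a smaller elemental uncertainty.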

  15. A new approach for the quantification of synchrony of multivariate non-stationary psychophysiological variables during emotion eliciting stimuli

    Augustin eKelava

    2015-01-01

    Full Text Available Emotion eliciting situations are accompanied by reactions on multiple response variables at subjective, physiological, and behavioral levels. The quantification of the overall simultaneous synchrony of psychophysiological reactions plays a major role in emotion theories and has received increasing attention in recent research. From a psychometric perspective, the reactions represent multivariate non-stationary intra-individual time series. In this paper, we present a new time-frequency based latent variable approach for the quantification of the synchrony of the responses. The approach is applied to empirical data collected during an emotion eliciting situation. The results are compared with a complementary inter-individual approach of Hsieh et al. (2011). Finally, the proposed approach is discussed in the context of emotion theories, and possible future applications and limitations are provided.

  16. Statistical theory for hydrogen bonding fluid system of A_aD_d type(II):Properties of hydrogen bonding networks

    2007-01-01

    Making use of the invariant property of the equilibrium size distribution of the hydrogen bonding clusters formed in a hydrogen bonding system of AaDd type, the analytical expressions of the free energy in the pregel and postgel regimes are obtained. Then the gel free energy and the scaling behavior of the number of hydrogen bonds in the gel phase near the critical point are investigated to give the corresponding scaling exponents and scaling law. Meanwhile, some properties of intermolecular and intramolecular hydrogen bonds in the system, sol, and gel phases are discussed. As a result, an explicit relationship between the number of intramolecular hydrogen bonds and the hydrogen bonding degree is obtained.

  17. Statistical theory for hydrogen bonding fluid system of A_aD_d type(III): Equation of state and fluctuations

    2007-01-01

    The equation of state of the hydrogen bonding fluid system of AaDd type is studied by the principles of statistical mechanics. The influences of hydrogen bonds on the equation of state of the system are obtained based on the change in volume due to hydrogen bonds. Moreover, the number density fluctuations of both molecules and hydrogen bonds, as well as their spatial correlation properties, are investigated. Furthermore, an equation describing the relation between the number density correlation function of "molecules-hydrogen bonds" and those of molecules and hydrogen bonds is derived. As an application, taking the van der Waals hydrogen bonding fluid as an example, we consider the effect of hydrogen bonds on its relevant statistical properties.

  18. Birth of String Theory

    Itoyama, H

    2016-01-01

    This is a brief summary of an introductory lecture for students and scholars in general given by the author at Nambu Memorial Symposium which was held at Osaka City University on September 29, 2015. We review the invention of string theory by Professor Yoichiro Nambu following the discovery of the Veneziano amplitude. We also discuss Professor Nambu's proposal on string theory in the Schild gauge in 1976 which is related to the matrix model of Yang-Mills type.

  19. Instantons in string theory

    Ahlén, Olof, E-mail: olof.ahlen@aei.mpg.de [Max-Planck-Institut für Gravitationsphysik (Albert-Einstein-Institut), Am Mühlenberg 1, DE-14476 Potsdam (Germany)

    2015-12-17

    These proceedings from the second Caesar Lattes meeting in Rio de Janeiro 2015 are a brief introduction to how automorphic forms appear in the low energy effective action of maximally supersymmetric string theory. The explicit example of the R{sup 4}-interaction of type IIB string theory in ten dimensions is discussed. Its Fourier expansion is interpreted in terms of perturbative and non-perturbative contributions to the four graviton amplitude.

  20. QUANTIFICATION OF TISSUE PROPERTIES IN SMALL VOLUMES

    J. MOURANT; ET AL

    2000-12-01

    The quantification of tissue properties by optical measurements will facilitate the development of noninvasive methods of cancer diagnosis and detection. Optical measurements are sensitive to tissue structure which is known to change during tumorigenesis. The goals of the work presented in this paper were to verify that the primary scatterers of light in cells are structures much smaller than the nucleus and then to develop an optical technique that can quantify parameters of structures the same size as the scattering features in cells. Polarized, elastic back-scattering was found to be able to quantify changes in scattering properties for turbid media consisting of scatterers of the size found in tissue.

  1. Tutorial examples for uncertainty quantification methods.

    De Bord, Sarah [Univ. of California, Davis, CA (United States)

    2015-08-01

    This report details the work accomplished during my 2015 SULI summer internship at Sandia National Laboratories in Livermore, CA. During this internship, I worked on multiple tasks with the common goal of making uncertainty quantification (UQ) methods more accessible to the general scientific community. As part of my work, I created a comprehensive numerical integration example to incorporate into the user manual of a UQ software package. Further, I developed examples involving heat transfer through a window to incorporate into tutorial lectures that serve as an introduction to UQ methods.
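A heat-transfer-through-a-window example like the one mentioned can be sketched as plain Monte Carlo uncertainty propagation. The conduction model and all parameter values below are illustrative assumptions, not taken from the Sandia tutorial:

```python
import random
import statistics

def heat_flux(k, area, dT, thickness):
    """1-D steady conduction through a window pane: q = k * A * dT / L (W)."""
    return k * area * dT / thickness

def monte_carlo_uq(n=20_000, seed=0):
    """Propagate an assumed +/-10% (1-sigma) normal uncertainty in the
    thermal conductivity k through the model; all numbers are made up.
    Returns the sample mean and standard deviation of the heat flux."""
    rng = random.Random(seed)
    samples = [heat_flux(rng.gauss(0.8, 0.08), area=1.2, dT=15.0,
                         thickness=0.006)
               for _ in range(n)]
    return statistics.mean(samples), statistics.stdev(samples)
```

Because the model is linear in k, the output mean sits near the nominal 2400 W and the output spread mirrors the 10% input spread, which makes this a convenient sanity check for a UQ tool.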

  2. Packaging Theory.

    Williams, Jeffrey

    1994-01-01

    Considers the recent flood of anthologies of literary criticism and theory as exemplifications of the confluence of pedagogical concerns, economics of publishing, and other historical factors. Looks specifically at how these anthologies present theory. Cites problems with their formatting theory and proposes alternative ways of organizing theory…

  3. Agency Theory

    Linder, Stefan; Foss, Nicolai Juul

    Agency theory studies the problems and solutions linked to delegation of tasks from principals to agents in the context of conflicting interests between the parties. Beginning from clear assumptions about rationality, contracting and informational conditions, the theory addresses problems of ex...... agency theory to enjoy considerable scientific impact on social science; however, it has also attracted considerable criticism....

  4. Agency Theory

    Linder, Stefan; Foss, Nicolai Juul

    2015-01-01

    Agency theory studies the problems and solutions linked to delegation of tasks from principals to agents in the context of conflicting interests between the parties. Beginning from clear assumptions about rationality, contracting, and informational conditions, the theory addresses problems of ex...... agency theory to enjoy considerable scientific impact on social science; however, it has also attracted considerable criticism....

  5. A survey of hidden-variables theories

    Belinfante, F J

    1973-01-01

    A Survey of Hidden-Variables Theories is a three-part book on the hidden-variable theories, referred to in this book as "theories of the first kind". Part I reviews the motives for developing different types of hidden-variables theories. The quest for determinism led to theories of the first kind; the quest for theories that look like causal theories when applied to spatially separated systems that interacted in the past led to theories of the second kind. Parts II and III further describe the theories of the first kind and second kind, respectively. This book is written to make the literat

  6. Density functional theory study of lithium diffusion at the interface between olivine-type LiFePO4 and LiMnPO4

    Shi, Jianjian; Wang, Zhiguo; Qing Fu, Yong

    2016-12-01

    Coating LiMnPO4 with a thin layer of LiFePO4 gives better electrochemical performance than pure LiFePO4 or LiMnPO4, so it is critical to understand Li diffusion at their interfaces to improve the performance of electrode materials. Li diffusion at the (100)LiFePO4//(100)LiMnPO4, (010)LiFePO4//(010)LiMnPO4, and (001)LiFePO4//(001)LiMnPO4 interfaces between LiFePO4 and LiMnPO4 was investigated using density functional theory. The calculated diffusion energy barriers are 0.55 eV for Li diffusion along the (001) interface, and 0.44 and 0.49 eV for Li diffusion inside the LiMnPO4 and along the (100) interface, respectively. When Li diffuses from LiFePO4 to LiMnPO4 by passing through the (010) interface, the diffusion barriers are 0.45 and 0.60 eV on the two sides. The barriers for Li diffusion in LiMnPO4 near the interfaces are lower than those in pure LiMnPO4. The calculated diffusion coefficient of Li along the (100) interface is in the range of 3.65 × 10^-11 to 5.28 × 10^-12 cm2 s^-1, which is larger than that in pure LiMnPO4 (7.5 × 10^-14 cm2 s^-1). Therefore, the charging/discharging rate performance of LiMnPO4 can be improved by surface coating with LiFePO4.
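The spread between the computed barriers translates into diffusion-coefficient ratios through an Arrhenius factor, D ∝ exp(-Ea/kT). The equal-prefactor assumption below is mine, not the paper's; the sketch only shows the size of the effect:

```python
import math

K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K

def arrhenius_ratio(barrier_a_eV, barrier_b_eV, temperature_K=300.0):
    """Ratio of hop rates (roughly, diffusion coefficients) for two
    migration barriers, assuming identical Arrhenius prefactors:
    D_a / D_b = exp(-(Ea - Eb) / (k_B * T))."""
    return math.exp(-(barrier_a_eV - barrier_b_eV) / (K_B_EV * temperature_K))
```

At 300 K, the 0.11 eV difference between the 0.44 eV and 0.55 eV barriers corresponds to a rate ratio of roughly 70, consistent with the order-of-magnitude spread in the quoted diffusion coefficients.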

  7. Theory of relations

    Fraïssé, R

    2011-01-01

    The first part of this book concerns the present state of the theory of chains (= total or linear orderings), in connection with some refinements of Ramsey's theorem, due to Galvin and Nash-Williams. This leads to the fundamental Laver's embeddability theorem for scattered chains, using Nash-Williams' better quasi-orderings, barriers and forerunning.The second part (chapters 9 to 12) extends to general relations the main notions and results from order-type theory. An important connection appears with permutation theory (Cameron, Pouzet, Livingstone and Wagner) and with logics (existence criter

  8. Dualities in M-theory and Born-Infeld Theory

    Brace, Daniel, M

    2001-08-01

    We discuss two examples of duality. The first arises in the context of toroidal compactification of the discrete light cone quantization of M-theory. In the presence of nontrivial moduli coming from the M-theory three form, it has been conjectured that the system is described by supersymmetric Yang-Mills gauge theory on a noncommutative torus. We are able to provide evidence for this conjecture, by showing that the dualities of this M-theory compactification, which correspond to T-duality in Type IIA string theory, are also dualities of the noncommutative supersymmetric Yang-Mills description. One can also consider this as evidence for the accuracy of the Matrix Theory description of M-theory in this background. The second type of duality is the self-duality of theories with U(1) gauge fields. After discussing the general theory of duality invariance for theories with complex gauge fields, we are able to find a generalization of the well known U(1) Born-Infeld theory that contains any number of gauge fields and which is invariant under the maximal duality group. We then find a supersymmetric extension of our results, and also show that our results can be extended to find Born-Infeld type actions in any even dimensional spacetime.

  9. Dualities in M-theory and Born-Infeld Theory

    Brace, Daniel M. [Univ. of California, Berkeley, CA (United States)

    2001-01-01

    We discuss two examples of duality. The first arises in the context of toroidal compactification of the discrete light cone quantization of M-theory. In the presence of nontrivial moduli coming from the M-theory three form, it has been conjectured that the system is described by supersymmetric Yang-Mills gauge theory on a noncommutative torus. We are able to provide evidence for this conjecture, by showing that the dualities of this M-theory compactification, which correspond to T-duality in Type IIA string theory, are also dualities of the noncommutative supersymmetric Yang-Mills description. One can also consider this as evidence for the accuracy of the Matrix Theory description of M-theory in this background. The second type of duality is the self-duality of theories with U(1) gauge fields. After discussing the general theory of duality invariance for theories with complex gauge fields, we are able to find a generalization of the well known U(1) Born-Infeld theory that contains any number of gauge fields and which is invariant under the maximal duality group. We then find a supersymmetric extension of our results, and also show that our results can be extended to find Born-Infeld type actions in any even dimensional spacetime.

  10. Application of PDCA Theory to Community Intervention on Patients with Type 2 Diabetes

    刘学梅; 温士玲; 李红

    2013-01-01

    Objective: To explore the application of plan-do-check-act (PDCA) theory to community intervention for patients with type 2 diabetes and investigate the clinical effect. Methods: 50 discharged patients with type 2 diabetes were given community intervention for 6 months in the light of PDCA theory; the lifestyle and medical compliance behavior of the patients and the changes in metabolic indexes were observed and compared before and after the intervention. Results: There was a statistically significant difference between the metabolic indexes after the intervention and the baseline values on admission (P<0.01); lifestyle and medical compliance behavior were significantly improved after the intervention (P<0.01). Conclusion: Application of PDCA theory in community intervention for discharged patients with type 2 diabetes can consolidate the effect of health education during hospitalization and improve lifestyle and treatment compliance, thereby achieving blood glucose control; it is worth popularizing.

  11. Uncertainty Quantification for Airfoil Icing using Polynomial Chaos Expansions

    DeGennaro, Anthony M; Martinelli, Luigi

    2014-01-01

    The formation and accretion of ice on the leading edge of a wing can be detrimental to airplane performance. Complicating this reality is the fact that even a small amount of uncertainty in the shape of the accreted ice may result in a large amount of uncertainty in aerodynamic performance metrics (e.g., stall angle of attack). The main focus of this work concerns using the techniques of Polynomial Chaos Expansions (PCE) to quantify icing uncertainty much more quickly than traditional methods (e.g., Monte Carlo). First, we present a brief survey of the literature concerning the physics of wing icing, with the intention of giving a certain amount of intuition for the physical process. Next, we give a brief overview of the background theory of PCE. Finally, we compare the results of Monte Carlo simulations to PCE-based uncertainty quantification for several different airfoil icing scenarios. The results are in good agreement and confirm that PCE methods are much more efficient for the canonical airfoil icing un...
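The contrast between PCE and Monte Carlo can be illustrated in the simplest setting: a scalar model of one standard-normal input, projected onto probabilists' Hermite polynomials with Gauss-Hermite quadrature, with mean and variance read off the coefficients. This is a toy sketch, not the airfoil-icing setup of the paper:

```python
import math

# 5-point Gauss-Hermite rule (physicists' weight exp(-t^2)); exact for
# polynomial integrands up to degree 9.
GH_NODES = [-2.0201828704560856, -0.9585724646138185, 0.0,
            0.9585724646138185, 2.0201828704560856]
GH_WEIGHTS = [0.019953242059045913, 0.3936193231522412, 0.9453087204829419,
              0.3936193231522412, 0.019953242059045913]

def hermite_e(k, x):
    """Probabilists' Hermite polynomial He_k(x) by recurrence."""
    h0, h1 = 1.0, x
    if k == 0:
        return h0
    for n in range(1, k):
        h0, h1 = h1, x * h1 - n * h0
    return h1

def pce_coefficients(f, order):
    """Project f(X), X ~ N(0,1), onto He_0..He_order:
    c_k = E[f(X) He_k(X)] / k!, via Gauss-Hermite quadrature with the
    substitution x = sqrt(2) * t."""
    coeffs = []
    for k in range(order + 1):
        integral = sum(w * f(math.sqrt(2.0) * t) * hermite_e(k, math.sqrt(2.0) * t)
                       for t, w in zip(GH_NODES, GH_WEIGHTS)) / math.sqrt(math.pi)
        coeffs.append(integral / math.factorial(k))
    return coeffs

def pce_mean_variance(coeffs):
    """Mean is c_0; variance is sum_{k>=1} k! * c_k^2 (He_k orthogonality)."""
    mean = coeffs[0]
    var = sum(math.factorial(k) * c * c for k, c in enumerate(coeffs[1:], 1))
    return mean, var
```

For f(x) = x² the expansion terminates at order 2 (c0 = 1, c2 = 1), giving mean 1 and variance 2 exactly from five model evaluations, whereas Monte Carlo would need many samples for comparable accuracy; this deterministic efficiency is the point the paper exploits for icing uncertainty.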

  12. The Method of Manufactured Universes for validating uncertainty quantification methods

    Stripling, H.F.

    2011-09-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework calls for a manufactured reality from which experimental data are created (possibly with experimental error), an imperfect model (with uncertain inputs) from which simulation results are created (possibly with numerical error), the application of a system for quantifying uncertainties in model predictions, and an assessment of how accurately those uncertainties are quantified. The application presented in this paper manufactures a particle-transport universe, models it using diffusion theory with uncertain material parameters, and applies both Gaussian process and Bayesian MARS algorithms to make quantitative predictions about new experiments within the manufactured reality. The results of this preliminary study indicate that, even in a simple problem, the improper application of a specific UQ method or unrealized effects of a modeling assumption may produce inaccurate predictions. We conclude that the validation framework presented in this paper is a powerful and flexible tool for the investigation and understanding of UQ methodologies. © 2011 Elsevier Ltd. All rights reserved.

  13. Theory and Troubleshooting of e-Type Electron Beam Evaporation Source

    王秀海

    2012-01-01

    By virtue of its outstanding advantages, electron beam evaporation (EBE) is increasingly widely applied and has become one of the main technologies in industrial vacuum coating processes. The definition of EBE is explained, the main features and advantages of the EBE source are described, and the structure and working principle of the e-type EBE source are explained in detail. In particular, how the relative installation position of the filament and the beam-forming electrode affects the beam current and spot size is analyzed. Based on working experience, the daily maintenance of the e-type EBE source is summarized, the procedure and precautions for replacing the filament are described, and the influence of the cleanliness of the chamber and source on process quality is analyzed. Finally, common fault phenomena and troubleshooting methods are introduced.

  14. Management Strategy of Knowledge-type Talents in Universities Based on Relations Theory

    康志荣

    2011-01-01

    Knowledge-type talents in universities are the main body of school development; they play a dominant role in the development of the educational cause and are the human-resource factor of a university's core competitiveness. Based on an analysis of the concept and function of the interpersonal-relations theory of behavioral science, and combined with the characteristics of knowledge-type talents in universities, the article proposes scientific and effective management measures to realize flexible management of knowledge-type talents.

  15. Atomic theories

    Loring, FH

    2014-01-01

    Summarising the most novel facts and theories which were coming into prominence at the time, particularly those which had not yet been incorporated into standard textbooks, this important work was first published in 1921. The subjects treated cover a wide range of research that was being conducted into the atom, and include Quantum Theory, the Bohr Theory, the Sommerfield extension of Bohr's work, the Octet Theory and Isotopes, as well as Ionisation Potentials and Solar Phenomena. Because much of the material of Atomic Theories lies on the boundary between experimentally verified fact and spec

  16. Grounded theory.

    Harris, Tina

    2015-04-29

    Grounded theory is a popular research approach in health care and the social sciences. This article provides a description of grounded theory methodology and its key components, using examples from published studies to demonstrate practical application. It aims to demystify grounded theory for novice nurse researchers, by explaining what it is, when to use it, why they would want to use it and how to use it. It should enable nurse researchers to decide if grounded theory is an appropriate approach for their research, and to determine the quality of any grounded theory research they read.

  17. Ring theory

    Rowen, Louis H

    1991-01-01

    This is an abridged edition of the author's previous two-volume work, Ring Theory, which concentrates on essential material for a general ring theory course while omitting much of the material intended for ring theory specialists. It has been praised by reviewers: "As a textbook for graduate students, Ring Theory joins the best....The experts will find several attractive and pleasant features in Ring Theory. The most noteworthy is the inclusion, usually in supplements and appendices, of many useful constructions which are hard to locate outside of the original sources....The audience of non

  18. Theory of filtered type-II parametric down-conversion in the continuous-variable domain: Quantifying the impacts of filtering

    Christ, Andreas; Lupo, Cosmo; Reichelt, Matthias; Meier, Torsten; Silberhorn, Christine

    2014-08-01

    Parametric down-conversion (PDC) forms one of the basic building blocks of quantum optical experiments. However, the intrinsic multimode spectral-temporal structure of pulsed PDC often poses a severe hindrance to the direct heralding of pure single-photon states or, for example, to continuous-variable entanglement distillation experiments. To get rid of multimode effects, narrowband frequency filtering is frequently applied to achieve single-mode behavior. A rigorous theoretical description that accurately captures the effects of filtering on PDC, however, is still missing. To date, the theoretical models of filtered PDC are rooted in the discrete-variable domain and only account for filtering in the low-gain regime, where only a few photon pairs are emitted at any single point in time. In this paper we extend these theoretical descriptions and put forward a simple model which accurately describes the effects of filtering on PDC in the continuous-variable domain. This straightforward theoretical framework enables us to accurately quantify the tradeoff between suppression of higher-order modes, reduced purity, and lowered Einstein-Podolsky-Rosen entanglement when narrowband filters are applied to multimode type-II PDC.

  19. Quantification of prebiotics in commercial infant formulas.

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

    Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays, a great number of infant formulas enriched with prebiotics are available on the market; however, there are scarce data about their composition. In this study, two chromatographic methods (GC-FID and HPLC-RID) were combined to quantify the carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the contents of FOS, GOS, and GOS/FOS were in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with a degree of polymerization (DP) of up to 19. The methodology proposed here may be used for routine quality control of infant formulas and other food ingredients containing prebiotics.

  20. CT quantification of central airway in tracheobronchomalacia

    Im, Won Hyeong; Jin, Gong Yong; Han, Young Min; Kim, Eun Young [Dept. of Radiology, Chonbuk National University Hospital, Jeonju (Korea, Republic of)

    2016-05-15

    To determine which factors help to diagnose tracheobronchomalacia (TBM) using CT quantification of the central airway. From April 2013 to July 2014, 19 patients (68.0 ± 15.0 years; 6 male, 13 female) were diagnosed with TBM on CT. As case matching, 38 normal subjects (65.5 ± 21.5 years; 6 male, 13 female) were selected. All 57 subjects underwent CT at end-inspiration and end-expiration. Airway parameters of the trachea and both main bronchi were assessed using software (VIDA diagnostic). Airway parameters of TBM patients and normal subjects were compared using the Student t-test. In expiration, both wall perimeter and wall thickness in TBM patients were significantly smaller than in normal subjects (wall perimeter: trachea, 43.97 mm vs. 49.04 mm, p = 0.020; right main bronchus, 33.52 mm vs. 42.69 mm, p < 0.001; left main bronchus, 26.76 mm vs. 31.88 mm, p = 0.012; wall thickness: trachea, 1.89 mm vs. 2.22 mm, p = 0.017; right main bronchus, 1.64 mm vs. 1.83 mm, p = 0.021; left main bronchus, 1.61 mm vs. 1.75 mm, p = 0.016). Wall thinning and decreased perimeter of the central airway on expiratory CT quantification could be new diagnostic indicators of TBM.

  1. Theory of differential equations

    Gel'fand, I M

    1967-01-01

    Generalized Functions, Volume 3: Theory of Differential Equations focuses on the application of generalized functions to problems of the theory of partial differential equations.This book discusses the problems of determining uniqueness and correctness classes for solutions of the Cauchy problem for systems with constant coefficients and eigenfunction expansions for self-adjoint differential operators. The topics covered include the bounded operators in spaces of type W, Cauchy problem in a topological vector space, and theorem of the Phragmén-Lindelöf type. The correctness classes for the Cau

  2. Two families with quadrupedalism, mental retardation, no speech, and infantile hypotonia (Uner Tan Syndrome Type-II; a novel theory for the evolutionary emergence of human bipedalism

    Uner eTan

    2014-04-01

    Two consanguineous families with Uner Tan Syndrome (UTS) were analyzed in relation to self-organizing processes in complex systems and the evolutionary emergence of human bipedalism. The cases had the key symptoms of previously reported cases of UTS, such as quadrupedalism, mental retardation, and dysarthric or no speech, but the new cases also exhibited infantile hypotonia and are designated UTS Type-II. There were 10 siblings in Branch I and 12 siblings in Branch II. Of these, seven cases exhibited habitual quadrupedal locomotion (QL): four deceased and three living. The infantile hypotonia in the surviving cases gradually disappeared over a period of years, so that they could sit by about 10 years and crawl on hands and knees by about 12 years. They began walking on all fours around 14 years, habitually using QL. Neurological examinations showed normal tonus in their arms and legs, no Babinski sign, brisk tendon reflexes especially in the legs, and mild tremor. The patients could not walk in a straight line, but (except in one case) could stand up and maintain upright posture with truncal ataxia. Cerebello-vermian hypoplasia and mild gyral simplification were noted in their MRIs. The results of the genetic analysis were inconclusive: no genetic code could be identified as the triggering factor for the syndrome in these families. Instead, the extremely low socio-economic status of the patients was thought to play a role in the emergence of UTS, possibly by epigenetically changing the brain structure and function, with a consequent selection of ancestral neural networks for QL during locomotor development. It was suggested that UTS may be regarded as one of the unpredictable outcomes of self-organization within a complex system. It was also noted that the prominent feature of this syndrome, the diagonal-sequence habitual QL, generated an interference between ipsilateral hands and feet, as in non-human primates.
It was suggested that this

  3. Fluorometric quantification of polyphosphate in environmental plankton samples: extraction protocols, matrix effects, and nucleic acid interference.

    Martin, Patrick; Van Mooy, Benjamin A S

    2013-01-01

    Polyphosphate (polyP) is a ubiquitous biochemical with many cellular functions and comprises an important environmental phosphorus pool. However, methodological challenges have hampered routine quantification of polyP in environmental samples. We tested 15 protocols to extract inorganic polyphosphate from natural marine samples and cultured cyanobacteria for fluorometric quantification with 4',6-diamidino-2-phenylindole (DAPI) without prior purification. A combination of brief boiling and digestion with proteinase K was superior to all other protocols, including other enzymatic digestions and neutral or alkaline leaches. However, three successive extractions were required to extract all polyP. Standard addition revealed matrix effects that differed between sample types, causing polyP to be over- or underestimated by up to 50% in the samples tested here. Although previous studies judged that the presence of DNA would not complicate fluorometric quantification of polyP with DAPI, we show that RNA can cause significant interference at the wavelengths used to measure polyP. Importantly, treating samples with DNase and RNase before proteinase K digestion reduced fluorescence by up to 57%. We measured particulate polyP along a North Pacific coastal-to-open ocean transect and show that particulate polyP concentrations increased toward the open ocean. While our final method is optimized for marine particulate matter, different environmental sample types may need to be assessed for matrix effects, extraction efficiency, and nucleic acid interference.
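    The standard-addition check described above can be sketched numerically: spike known amounts of analyte into subsamples, fit signal versus added amount, and read the endogenous concentration off the magnitude of the x-axis intercept. All numbers below are illustrative, not data from the study.

```python
# Standard-addition quantification sketch: fit signal vs. spiked amount
# with ordinary least squares; |x-intercept| estimates the analyte
# concentration already present in the sample. Values are hypothetical.

def fit_line(xs, ys):
    """Closed-form least-squares slope and intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Spiked polyP amounts (e.g. nmol) and measured DAPI fluorescence (a.u.)
added = [0.0, 1.0, 2.0, 4.0]
signal = [5.0, 10.0, 15.0, 25.0]   # linear response, endogenous analyte present

slope, intercept = fit_line(added, signal)
endogenous = intercept / slope      # |x-intercept| = original amount
print(round(endogenous, 3))        # → 1.0
```

A matrix effect of the kind reported above would show up here as a slope that differs between sample types, which is exactly why each type needs its own standard-addition series.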

  4. Quantification of protein posttranslational modifications using stable isotope and mass spectrometry. II. Performance.

    Luo, Quanzhou; Wypych, Jette; Jiang, Xinzhao Grace; Zhang, Xin; Luo, Shun; Jerums, Matthew; Lewis, Jeffrey; Keener III, Ronald; Huang, Gang; Apostol, Izydor

    2012-02-15

    In this report, we examine the performance of a mass spectrometry (MS)-based method for quantification of protein posttranslational modifications (PTMs) using stable isotope labeled internal standards (SILIS). Uniform labeling of proteins and the highly similar behavior of the labeled vs. nonlabeled analyte pairs during chromatographic separation and electrospray ionization (ESI) provide the means to directly quantify a wide range of PTMs. In the companion report (Jiang et al., Anal. Biochem. 421 (2012) 506-516), we provided the principles and example applications of the method. Here we show satisfactory accuracy and precision for quantifying protein modifications by the SILIS method when the analyses were performed on different types of mass spectrometers, such as ion-trap, time-of-flight (TOF), and quadrupole instruments. Additionally, the SILIS method demonstrated an extended linear range, with accurate quantification across at least a 4-log concentration range on three different types of mass spectrometers. We also demonstrate that lengthy chromatographic separation is no longer required to obtain quality results, offering an opportunity to significantly shorten the method run time. The results indicate the potential of this methodology for rapid and large-scale assessment of multiple quality attributes of a therapeutic protein in a single analysis.

  5. Multiparty Symmetric Sum Types

    Nielsen, Lasse; Yoshida, Nobuko; Honda, Kohei

    2010-01-01

    This paper introduces a new theory of multiparty session types based on symmetric sum types, by which we can type non-deterministic orchestration choice behaviours. While the original branching type in session types can represent a choice made by a single participant and accepted by others, determining how the session proceeds, the symmetric sum type represents a choice made by agreement among all the participants of a session. Such behaviour can be found in many practical systems, including collaborative workflow in healthcare systems for clinical practice guidelines (CPGs). Processes with the symmetric sums can be embedded into the original branching types using conductor processes. We show that this type-driven embedding preserves typability, satisfies semantic soundness and completeness, and meets the encodability criteria adapted to the typed setting. The theory leads to an efficient…

  6. Torsion as a source of expansion in a Bianchi type-I universe in the self-consistent Einstein-Cartan theory of a perfect fluid with spin density

    Bradas, James C.; Fennelly, Alphonsus J.; Smalley, Larry L.

    1987-01-01

    It is shown that a generalized (or 'power law') inflationary phase arises naturally and inevitably in a simple (Bianchi type-I) anisotropic cosmological model in the self-consistent Einstein-Cartan gravitation theory with the improved stress-energy-momentum tensor with the spin density of Ray and Smalley (1982, 1983). This is made explicit by an analytical solution of the field equations of motion of the fluid variables. The inflation is caused by the angular kinetic energy density due to spin. The model further elucidates the relationship between fluid vorticity, the angular velocity of the inertially dragged tetrads, and the precession of the principal axes of the shear ellipsoid. Shear is not effective in damping the inflation.

  7. On the Types, Intensions and Present Significance of the Theory Propositions in "Zhiyin"

    唐萌; 吴建民

    2011-01-01

    "Proposition" is an important constitutive form of ancient Chinese literary theory. "Zhiyin", the monograph on literary appreciation and criticism in Liu Xie's The Literary Mind and the Carving of Dragons, puts forward a series of theoretical propositions on literary appreciation and criticism. In terms of form, these propositions fall into four main types; in terms of content, each carries a distinct intension, and together they constitute the basic substance of Liu Xie's theory of literary appreciation and criticism; in terms of value, these propositions still hold practical applicability and important theoretical significance today.

  8. Molecular quantification of genes encoding for green-fluorescent proteins

    Felske, A; Vandieken, V; Pauling, B V

    2003-01-01

    A quantitative PCR approach is presented to analyze the amount of recombinant green fluorescent protein (gfp) genes in environmental DNA samples. The quantification assay is a combination of specific PCR amplification and temperature gradient gel electrophoresis (TGGE). Gene quantification is pro...

  9. An improved competitive inhibition enzymatic immunoassay method for tetrodotoxin quantification

    Stokes Amber N; Williams Becky L; French Susannah S

    2012-01-01

    Quantifying tetrodotoxin (TTX) has been a challenge in both ecological and medical research due to the cost, time and training required by most quantification techniques. Here we present a modified competitive inhibition enzymatic immunoassay for the quantification of TTX, intended to aid researchers in optimizing this technique for widespread use with a high degree of accuracy and repeatability.

  12. Toward greener analytical techniques for the absolute quantification of peptides in pharmaceutical and biological samples.

    Van Eeckhaut, Ann; Mangelings, Debby

    2015-09-10

    Peptide-based biopharmaceuticals represent one of the fastest growing classes of new drug molecules. New reaction types included in synthesis strategies to reduce the rapid metabolism of peptides, along with the availability of new formulation and delivery technologies, have resulted in increased marketing of peptide drug products. In this regard, the development of analytical methods for the quantification of peptides in pharmaceutical and biological samples is of utmost importance. From the sample preparation step to analysis by chromatographic or electrophoretic methods, many difficulties must be tackled. Recent developments in analytical techniques place ever more emphasis on green analytical techniques. This review discusses the progress in, and challenges observed during, green analytical method development for the quantification of peptides in pharmaceutical and biological samples.

  13. Basis Set Requirements for Sulfur Compounds in Density Functional Theory:  a Comparison between Correlation-Consistent, Polarized-Consistent, and Pople-Type Basis Sets.

    Denis, Pablo A

    2005-09-01

    We have investigated the SX (X = first- or second-row atom), SO2, and SO3 molecules employing the correlation-consistent (cc), the recently developed polarization-consistent (pc), and three Pople-type basis sets, in conjunction with the B3LYP functional. The results confirmed that the aug-pc basis sets represent a great contribution in terms of cost-benefit. In the case of the B3LYP functional, when employing the aug-pc-3 and aug-pc-4 basis sets, it is possible to obtain results of aug-cc-pV(5+d)Z and aug-cc-pV(6+d)Z quality, respectively, at a much lower cost. The estimations obtained employing smaller members of the family are of nearly double-ζ quality and do not provide reliable results. There is no basis set of quadruple-ζ quality among the polarization-consistent basis sets, although in terms of composition the aug-pc-3 basis set is a QZ basis set. A precise estimation of the Kohn-Sham complete basis set (CBS) limit with the aug-pc-X basis sets is difficult for the B3LYP functional because the ∞(aug-pc-4, aug-pc-3, aug-pc-2) extrapolation gives the same results as the aug-pc-4 basis set alone. This contrasts with the results observed for ab initio methodologies, for which the largest basis sets provided the best estimation of the CBS limit. In our opinion, the results closest to the B3LYP/CBS limit are expected to be those obtained with a two-point extrapolation employing the aug-cc-pV(X+d)Z (X = 5, 6) basis sets. The results obtained with this extrapolation are very close to those predicted by the ∞(aug-pc-3, aug-pc-2, aug-pc-1) extrapolation, which provides a cheaper but less accurate alternative for estimating the CBS limit. Minor problems were found with the aug-pc-X basis sets and the B3LYP functional for molecules in which sulfur is bound to a very electronegative element, such as SO, SF, SO2, and SO3. For these molecules, the cc basis sets proved more useful. The importance of tight d functions was observed
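    The arithmetic of a two-point CBS extrapolation can be sketched as follows. The inverse-cube form E(X) = E_CBS + A·X⁻³ used here is the common Helgaker-type formula for correlation energies; it is shown only to illustrate how two energies at consecutive cardinal numbers determine a limit, and may differ from the scheme used in the study above. The energies are hypothetical.

```python
# Two-point complete-basis-set (CBS) extrapolation sketch, assuming the
# inverse-cube model E(X) = E_CBS + A * X**-3 (an assumption; the paper's
# exact scheme is not specified here). Energies below are illustrative.

def cbs_two_point(e_x, x, e_y, y):
    """Eliminate A from E(X) = E_CBS + A*X**-3 given two cardinal numbers x < y."""
    return (y ** 3 * e_y - x ** 3 * e_x) / (y ** 3 - x ** 3)

# Hypothetical total energies (hartree) at aug-cc-pV(5+d)Z and aug-cc-pV(6+d)Z
e5, e6 = -398.1050, -398.1062
e_cbs = cbs_two_point(e5, 5, e6, 6)
print(e_cbs < e6)   # extrapolated limit lies below both finite-basis energies
```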

  14. The influence of sampling design on tree-ring-based quantification of forest growth.

    Nehrbass-Ahles, Christoph; Babst, Flurin; Klesse, Stefan; Nötzli, Magdalena; Bouriaud, Olivier; Neukom, Raphael; Dobbertin, Matthias; Frank, David

    2014-09-01

    Tree-rings offer one of the few possibilities to empirically quantify and reconstruct forest growth dynamics over years to millennia. Contemporaneously with the growing scientific community employing tree-ring parameters, recent research has suggested that commonly applied sampling designs (i.e. how and which trees are selected for dendrochronological sampling) may introduce considerable biases into quantifications of forest responses to environmental change. To date, a systematic assessment of the consequences of sampling design on dendroecological and dendroclimatological conclusions has not been performed. Here, we investigate potential biases by sampling a large population of trees and replicating diverse sampling designs. This is achieved by retroactively subsetting the population and specifically testing for biases emerging in climate reconstruction, growth response to climate variability, long-term growth trends, and quantification of forest productivity. We find that commonly applied sampling designs can impart systematic biases of varying magnitude to any type of tree-ring-based investigation, independent of the total number of samples considered. Quantifications of forest growth and productivity are particularly susceptible to biases, whereas growth responses to short-term climate variability are less affected by the choice of sampling design. The world's most frequently applied sampling design, focusing on dominant trees only, can bias absolute growth rates by up to 459% and trends in excess of 200%. Our findings challenge paradigms where a subset of samples is typically considered representative of the entire population. The only two sampling strategies meeting the requirements for all types of investigations are (i) sampling of all individuals within a fixed area; and (ii) fully randomized selection of trees.
This result advocates the consistent implementation of a widely applicable sampling design to simultaneously reduce uncertainties in
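    The dominant-tree bias reported above can be illustrated with a toy Monte Carlo experiment (all distributions and coefficients below are hypothetical): when growth correlates with size, sampling only the largest trees overestimates the stand's mean growth, while a fully randomized sample does not.

```python
import random
import statistics

# Toy illustration of sampling-design bias; numbers are hypothetical,
# not the study's data. Growth is made to correlate with tree size, so a
# dominant-trees-only design oversamples fast growers.

random.seed(42)

N = 1000
sizes = [random.lognormvariate(3.0, 0.5) for _ in range(N)]          # stem sizes
growth = [0.05 * s + random.gauss(0, 0.2) for s in sizes]            # growth rates

population_mean = statistics.mean(growth)

# Design 1: dominant trees only (largest 10% of the population)
dominant_idx = sorted(range(N), key=lambda i: sizes[i], reverse=True)[:100]
dominant_mean = statistics.mean(growth[i] for i in dominant_idx)

# Design 2: fully randomized selection of 100 trees
random_idx = random.sample(range(N), 100)
random_mean = statistics.mean(growth[i] for i in random_idx)

print(dominant_mean > population_mean)   # dominant-only design overestimates
```

The randomized design's error shrinks with sample size, whereas the dominant-only design's bias does not, which mirrors the paper's point that bias is independent of the total number of samples.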

  15. The type III manufactory

    Palcoux, Sébastien

    2011-01-01

    Using unusual objects in the theory of von Neumann algebras, such as the Chinese game Go or Conway's Game of Life (generalized to finitely presented groups), we are able to build, by hand, many type III factors.

  16. Cliophysics: socio-political reliability theory, polity duration and African political (in)stabilities.

    Cherif, Alhaji; Barley, Kamal

    2010-12-29

    Quantification of historical sociological processes has recently gained attention among theoreticians in the effort to provide a solid theoretical understanding of the behaviors and regularities present in socio-political dynamics. Here we present a reliability theory of polity processes with emphasis on the individual political dynamics of African countries. We found that the structural properties of polity failure rates successfully capture the risk of political vulnerability and instability, in which countries with monotonically increasing, unimodal, U-shaped, and monotonically decreasing polity failure rates, respectively, have high levels of state fragility indices. The quasi-U-shaped relationship between average polity duration and regime type corroborates historical precedents and explains the stability of autocracies and democracies.
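    The failure-rate shapes mentioned above can be illustrated with the Weibull hazard h(t) = (k/λ)(t/λ)^(k−1), whose shape parameter k alone switches between monotonically increasing (k > 1), constant (k = 1), and monotonically decreasing (k < 1) rates; a U-shaped "bathtub" rate needs, e.g., a mixture of components. The parameter values below are illustrative, not fits to polity data.

```python
# Illustrative hazard-rate shapes (not the paper's fitted model): the
# Weibull hazard h(t) = (k/lam) * (t/lam)**(k-1) is increasing for k > 1,
# constant for k = 1, and decreasing for k < 1. A U-shaped rate would
# require e.g. mixing a k < 1 and a k > 1 component.

def weibull_hazard(t, lam, k):
    return (k / lam) * (t / lam) ** (k - 1)

ts = [1.0, 5.0, 10.0, 20.0]   # polity ages (years, hypothetical)

increasing = [weibull_hazard(t, 10.0, 2.0) for t in ts]   # k > 1
constant   = [weibull_hazard(t, 10.0, 1.0) for t in ts]   # k = 1
decreasing = [weibull_hazard(t, 10.0, 0.5) for t in ts]   # k < 1

print(all(a < b for a, b in zip(increasing, increasing[1:])))  # → True
print(all(a > b for a, b in zip(decreasing, decreasing[1:])))  # → True
```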

  17. Survey and Evaluate Uncertainty Quantification Methodologies

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). 
The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon

  18. Perturbation Theory of Embedded Eigenvalues

    Engelmann, Matthias

    We study problems connected to perturbation theory of embedded eigenvalues in two different setups. The first part deals with second order perturbation theory of mass shells in massive translation invariant Nelson type models. To this end an expansion of the eigenvalues w.r.t. the fiber parameter up… The second project gives a general and systematic approach to analytic perturbation theory of embedded eigenvalues. The spectral deformation technique originally developed in the theory of dilation analytic potentials in the context of Schrödinger operators is systematized by the use of Mourre theory. The group…

  19. Model theory

    Chang, CC

    2012-01-01

    Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Sko

  20. Viability Theory

    Aubin, Jean-Pierre; Saint-Pierre, Patrick

    2011-01-01

    Viability theory designs and develops mathematical and algorithmic methods for investigating the adaptation to viability constraints of evolutions governed by complex systems under uncertainty that are found in many domains involving living beings, from biological evolution to economics, from environmental sciences to financial markets, from control theory and robotics to cognitive sciences. It involves interdisciplinary investigations spanning fields that have traditionally developed in isolation. The purpose of this book is to present an initiation to applications of viability theory, explai

  1. Field theory

    Roman, Steven

    2006-01-01

    Intended for graduate courses or for independent study, this book presents the basic theory of fields. The first part begins with a discussion of polynomials over a ring, the division algorithm, irreducibility, field extensions, and embeddings. The second part is devoted to Galois theory. The third part of the book treats the theory of binomials. The book concludes with a chapter on families of binomials - the Kummer theory. This new edition has been completely rewritten in order to improve the pedagogy and to make the text more accessible to graduate students.  The exercises have also been im

  2. Galois Theory

    Cox, David A

    2012-01-01

    Praise for the First Edition ". . . will certainly fascinate anyone interested in abstract algebra: a remarkable book!"—Monatshefte für Mathematik. Galois theory is one of the most established topics in mathematics, with historical roots that led to the development of many central concepts in modern algebra, including groups and fields. Covering classic applications of the theory, such as solvability by radicals, geometric constructions, and finite fields, Galois Theory, Second Edition delves into novel topics like Abel's theory of Abelian equations, casus irreducibilis, and the Galo

  3. Game theory.

    Dufwenberg, Martin

    2011-03-01

    Game theory is a toolkit for examining situations where decision makers influence each other. I discuss the nature of game-theoretic analysis, the history of game theory, why game theory is useful for understanding human psychology, and why game theory has played a key role in the recent explosion of interest in the field of behavioral economics. WIREs Cogn Sci 2011 2 167-173 DOI: 10.1002/wcs.119 For further resources related to this article, please visit the WIREs website.

  4. Elastoplasticity theory

    Hashiguchi, Koichi

    2009-01-01

    This book details the mathematics and continuum mechanics necessary as a foundation of elastoplasticity theory. It explains the physical background with illustrations and provides detailed descriptions of the derivation processes.

  5. Advanced theories of hypoid gears

    Wang, Xudong

    2013-01-01

    In order to develop more efficient types of gears, further investigation into the theories of engagement is necessary. Up until now, most research on the theories of engagement has been carried out separately by different groups and based on individual types of profiles. This book aims at developing universal theories, which can not only be used for all types of gears but can also be utilized in other fields such as sculptured surfaces. The book has four characteristics: the investigations are concentrated on mismatched tooth surfaces; all the problems are dealt with from a

  6. Theory of "Yang Jing Rou Jin" Manual Therapy for Vertebral Artery Type Cervical Spondylosis

    董万涛; 宋敏; 邓强

    2011-01-01

    The tendon channels are closely related to the occurrence and development of vertebral artery type cervical spondylosis. Starting from the distribution of the tendon channels in the neck, this paper explores the etiology and pathogenesis of vertebral artery type cervical spondylosis through tendon-channel theory and proposes the "Yang Jing Rou Jin" (nourishing essence and softening the tendons) manual approach. Supported by modern medical and biomechanical research, its mechanisms of unblocking the channels and relaxing the tendons, correcting the bones and softening the tendons, and nourishing essence to strengthen the tendons are clarified, providing new ideas for manual therapy of vertebral artery type cervical spondylosis.

  7. Recurrence quantification analysis of global stock markets

    Bastos, João A.; Caiado, Jorge

    2011-04-01

    This study investigates the presence of deterministic dependencies in international stock markets using recurrence plots and recurrence quantification analysis (RQA). The results are based on a large set of free float-adjusted market capitalization stock indices, covering a period of 15 years. The statistical tests suggest that the dynamics of stock prices in emerging markets is characterized by higher values of RQA measures when compared to their developed counterparts. The behavior of stock markets during critical financial events, such as the burst of the technology bubble, the Asian currency crisis, and the recent subprime mortgage crisis, is analyzed by performing RQA in sliding windows. It is shown that during these events stock markets exhibit a distinctive behavior that is characterized by temporary decreases in the fraction of recurrence points contained in diagonal and vertical structures.
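    A minimal sketch of the computation underlying recurrence quantification (toy series, no time-delay embedding, and the determinism count here includes the trivial main diagonal that full RQA excludes):

```python
# Minimal recurrence-plot / RQA sketch for a scalar series: far simpler
# than the embedded, windowed analysis of stock indices described above.
# The series and threshold are hypothetical toy values.

def recurrence_matrix(series, eps):
    """Binary matrix: 1 where two samples lie within eps of each other."""
    n = len(series)
    return [[1 if abs(series[i] - series[j]) < eps else 0
             for j in range(n)] for i in range(n)]

def recurrence_rate(rm):
    """Fraction of recurrence points in the whole plot."""
    n = len(rm)
    return sum(map(sum, rm)) / (n * n)

def determinism(rm, lmin=2):
    """Fraction of recurrence points on diagonal lines of length >= lmin
    (crude: includes the main diagonal, which full RQA excludes)."""
    n = len(rm)
    total = sum(map(sum, rm))
    in_lines = 0
    for k in range(-(n - 1), n):                 # each diagonal offset
        diag = [rm[i][i + k] for i in range(n) if 0 <= i + k < n]
        run = 0
        for v in diag + [0]:                     # sentinel flushes last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    in_lines += run
                run = 0
    return in_lines / total if total else 0.0

series = [0.0, 1.0, 0.1, 1.1, 0.2, 1.2]   # toy oscillation
rm = recurrence_matrix(series, eps=0.15)
print(recurrence_rate(rm))
```

Sliding-window RQA, as used in the study, simply repeats this computation on successive subsequences of the series.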

  8. Uncertainty quantification in DIC with Kriging regression

    Wang, Dezhi; DiazDelaO, F. A.; Wang, Weizhuo; Lin, Xiaoshan; Patterson, Eann A.; Mottershead, John E.

    2016-03-01

    A Kriging regression model is developed as a post-processing technique for the treatment of measurement uncertainty in classical subset-based Digital Image Correlation (DIC). Regression is achieved by regularising the sample-point correlation matrix using a local, subset-based, assessment of the measurement error with assumed statistical normality and based on the Sum of Squared Differences (SSD) criterion. This leads to a Kriging-regression model in the form of a Gaussian process representing uncertainty on the Kriging estimate of the measured displacement field. The method is demonstrated using numerical and experimental examples. Kriging estimates of displacement fields are shown to be in excellent agreement with 'true' values for the numerical cases and in the experimental example uncertainty quantification is carried out using the Gaussian random process that forms part of the Kriging model. The root mean square error (RMSE) on the estimated displacements is produced and standard deviations on local strain estimates are determined.
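    The core idea above, a correlation matrix regularised on its diagonal by an assumed measurement-error variance, yielding both an estimate and an uncertainty at unmeasured points, can be sketched for a toy 1D case. The kernel, noise level, and data below are illustrative, not DIC measurements.

```python
import math

# Toy 1D Kriging (Gaussian-process) regression with two observations:
# a squared-exponential correlation matrix, regularised on the diagonal
# by an assumed measurement-error variance, gives a posterior mean and
# variance at any point. All values are hypothetical, not DIC data.

def sq_exp(a, b, length=1.0):
    return math.exp(-((a - b) ** 2) / (2 * length ** 2))

x_obs = [0.0, 2.0]          # measurement locations
y_obs = [1.0, 3.0]          # measured displacements (toy values)
noise = 0.01                # assumed measurement-error variance

# K = correlation matrix + noise regularisation (2x2, inverted in closed form)
k11 = sq_exp(x_obs[0], x_obs[0]) + noise
k22 = sq_exp(x_obs[1], x_obs[1]) + noise
k12 = sq_exp(x_obs[0], x_obs[1])
det = k11 * k22 - k12 * k12

def predict(x):
    """Posterior mean and variance at x (zero prior mean, unit prior variance)."""
    k1 = sq_exp(x, x_obs[0])
    k2 = sq_exp(x, x_obs[1])
    w1 = (k22 * k1 - k12 * k2) / det        # w = K^{-1} k* via the 2x2 inverse
    w2 = (-k12 * k1 + k11 * k2) / det
    mean = w1 * y_obs[0] + w2 * y_obs[1]
    var = 1.0 + noise - (k1 * w1 + k2 * w2)
    return mean, var

m_mid, v_mid = predict(1.0)    # between the observations
m_far, v_far = predict(6.0)    # far from any observation
print(v_far > v_mid)           # uncertainty grows away from the data
```

The posterior variance here plays the role of the per-point uncertainty quantification that the Gaussian-process part of the Kriging model provides in the paper.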

  9. Quantification Methods of Management Skills in Shipping

    Riana Iren RADU

    2012-04-01

    Romania cannot overcome the financial crisis without business growth, without finding opportunities for economic development, and without attracting investment into the country. Successful managers find ways to overcome situations of uncertainty. The purpose of this paper is to determine the managerial skills developed by the Romanian fluvial shipping company NAVROM (hereinafter CNFR NAVROM SA), compared with ten other major competitors in the same domain, using financial information of these companies for the years 2005-2010. To carry out this work, quantification methods of managerial skills are applied to CNFR NAVROM SA, Galati, Romania, for example the analysis of financial performance management based on profitability ratios, net profit margin, supplier management, and turnover.

  10. Theoretical model and quantification of reflectance photometer

    Lihua Huang; Youbao Zhang; Chengke Xie; Jianfeng Qu; Huijie Huang; Xiangzhao Wang

    2009-01-01

    The surface morphology of a lateral flow (LF) strip is examined by scanning electron microscopy (SEM), and the diffuse reflection of the porous strip with or without nanogold particles is investigated. Based on the scattering and absorption of nanogold particles, a reflectance photometer is developed for quantification of LF strips with nanogold particles as the reporter. The integrated reflection optical density indicates the signals of the test line and control line. As an example, serial dilutions of microalbuminuria (MAU) solution are used to calibrate the performance of the reflectance photometer. The dose-response curve is fitted with a four-parameter logistic model for the determination of an unknown MAU concentration. The response curve spans a dynamic range of 5 to 200 μg/ml. The developed reflectance photometer enables simple, quantitative detection of an analyte on a nanogold-labeled LF strip.
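    The four-parameter logistic (4PL) model mentioned above, and its inverse, which reads an unknown concentration off the calibration curve, can be sketched as follows. The parameter values are illustrative placeholders, not the instrument's actual calibration.

```python
# Four-parameter logistic (4PL) dose-response model of the kind used to
# calibrate the reflectance photometer; parameters below are hypothetical.

def four_pl(x, a, b, c, d):
    """Response at dose x: a = response at zero dose, d = response at
    infinite dose, c = inflection point (EC50), b = slope factor."""
    return d + (a - d) / (1 + (x / c) ** b)

def inverse_four_pl(y, a, b, c, d):
    """Dose producing response y: reads an unknown concentration off
    the fitted calibration curve."""
    return c * ((a - d) / (y - d) - 1) ** (1 / b)

# Hypothetical calibration spanning roughly the 5-200 ug/ml dynamic range
a, b, c, d = 0.05, 1.2, 40.0, 1.5

y = four_pl(25.0, a, b, c, d)                 # simulated measured response
recovered = inverse_four_pl(y, a, b, c, d)    # round-trip back to the dose
print(round(recovered, 6))   # → 25.0
```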

  11. Quantification of Condylar Resorption in TMJ Osteoarthritis

    Cevidanes, LHS; Hajati, A-K; Paniagua, B; Lim, PF; Walker, DG; Palconet, G; Nackley, AG; Styner, M; Ludlow, JB; Zhu, H; Phillips, C

    2010-01-01

    OBJECTIVE This study was performed to determine the condylar morphological variation of osteoarthritic (OA) and asymptomatic temporomandibular joints (TMJ) and to determine its correlation with pain intensity and duration. STUDY DESIGN Three-dimensional surface models of mandibular condyles were constructed from cone-beam CT images of 29 female patients with TMJ OA (Research Diagnostic Criteria for Temporomandibular Disorders, Group III) and 36 female asymptomatic subjects. Shape correspondence was used to localize and quantify the condylar morphology. Statistical analysis was performed using MANCOVA with the Hotelling T2 metric based on covariance matrices, and Pearson correlation. RESULTS OA condylar morphology was statistically significantly different from that of asymptomatic condyles (p<0.05). The 3D morphological variation of the OA condyles was significantly correlated with pain intensity and duration. CONCLUSION 3D quantification of condylar morphology revealed profound differences between OA and asymptomatic condyles, and the extent of the resorptive changes paralleled pain severity and duration. PMID:20382043

  12. Did natural selection make the Dutch taller? A cautionary note on the importance of quantification in understanding evolution.

    Tarka, Maja; Bolstad, Geir H; Wacker, Sebastian; Räsänen, Katja; Hansen, Thomas F; Pélabon, Christophe

    2015-12-01

    One of the main achievements of the modern synthesis is a rigorous mathematical theory for evolution by natural selection. Combining this theory with statistical models makes it possible to estimate the relevant parameters so as to quantify selection and evolution in nature. Although quantification is a sign of a mature science, statistical models are unfortunately often interpreted independently of the motivating mathematical theory. Without a link to theory, numerical results do not represent proper quantifications, because they lack the connections that designate their biological meaning. Here, we want to raise awareness and exemplify this problem by examining a recent study on natural selection in a contemporary human population. Stulp et al. (2015) concluded that natural selection may partly explain the increasing stature of the Dutch population. This conclusion was based on a qualitative assessment of the presence of selection on height. Here, we provide a quantitative interpretation of these results using standard evolutionary theory to show that natural selection has had a minuscule effect.
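
The "standard evolutionary theory" invoked here is commonly summarized by the breeder's equation, R = h²S: the per-generation response R equals narrow-sense heritability times the selection differential. A minimal sketch with illustrative values (not the estimates from Stulp et al. or this paper):

```python
def expected_response(h2: float, S: float) -> float:
    """Breeder's equation: per-generation response to selection R = h^2 * S,
    where h2 is narrow-sense heritability and S the selection differential
    (difference between the mean of selected parents and the population mean)."""
    return h2 * S

# Illustrative values: heritability of height around 0.8 and a
# selection differential of 0.5 cm per generation.
R = expected_response(h2=0.8, S=0.5)
print(f"expected change per generation: {R:.2f} cm")  # 0.40 cm
```

Linking an estimated selection differential to the predicted response in this way is what turns a qualitative claim of "selection on height" into a proper quantification of its evolutionary effect.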

  13. Quantification of total pigments in citrus essential oils by thermal wave resonant cavity photopyroelectric spectroscopy.

    López-Muñoz, Gerardo A; Antonio-Pérez, Aurora; Díaz-Reyes, J

    2015-05-01

    A general theory of thermal wave resonant cavity photopyroelectric spectroscopy (TWRC-PPE) was recently proposed by Balderas-López (2012) for the thermo-optical characterisation of substances in a condensed phase. This theory is used to quantify the total carotenoids and chlorophylls in several folded and unfolded citrus essential oils, to demonstrate the viability of this technique as an alternative analytical method for the quantification of total pigments in citrus oils. An analysis of variance (ANOVA) reveals significant differences (p < 0.05). TWRC-PPE spectroscopy can be used to quantify concentrations of total carotenoids and chlorophylls in citrus oils up to five times higher than UV-Vis spectroscopy can, without sample preparation or dilution. The optical limits of this technique and possible interference are also described.

  14. Potential Theory

    Lukeš, Jaroslav; Netuka, Ivan; Veselý, Jiří

    1988-01-01

    Within the tradition of meetings devoted to potential theory, a conference on potential theory took place in Prague on 19-24 July 1987. The Conference was organized by the Faculty of Mathematics and Physics, Charles University, with the collaboration of the Institute of Mathematics, Czechoslovak Academy of Sciences, the Department of Mathematics, Czech University of Technology, the Union of Czechoslovak Mathematicians and Physicists, the Czechoslovak Scientific and Technical Society, and supported by IMU. During the Conference, 69 scientific communications from different branches of potential theory were presented; the majority of them are included in the present volume. (Papers based on survey lectures delivered at the Conference, its program as well as a collection of problems from potential theory will appear in a special volume of the Lecture Notes Series published by Springer-Verlag). Topics of these communications truly reflect the vast scope of contemporary potential theory. Some contributions deal...

  15. Uncertainty theory

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  16. Concept theory

    Hjørland, Birger

    2009-01-01

    Concept theory is an extremely broad, interdisciplinary and complex field of research related to many deep fields with very long historical traditions without much consensus. However, information science and knowledge organization cannot avoid relating to theories of concepts. Knowledge organizing systems (e.g. classification systems, thesauri and ontologies) should be understood as systems basically organizing concepts and their semantic relations. The same is the case with information retrieval systems. Different theories of concepts have different implications for how to construe, evaluate and use such systems. Based on "a post-Kuhnian view" of paradigms, this paper puts forward arguments that the best understanding and classification of theories of concepts is to view and classify them in accordance with epistemological theories (empiricism, rationalism, historicism and pragmatism...

  17. Conspiracy Theory

    Bjerg, Ole; Presskorn-Thygesen, Thomas

    2017-01-01

    The paper is a contribution to current debates about conspiracy theories within philosophy and cultural studies. Wittgenstein's understanding of language is invoked to analyse the epistemological effects of designating particular questions and explanations as a 'conspiracy theory'. It is demonstrated how such a designation relegates these questions and explanations beyond the realm of meaningful discourse. In addition, Agamben's concept of sovereignty is applied to explore the political effects of using the concept of conspiracy theory. The exceptional epistemological status assigned to alleged conspiracy theories within our prevalent paradigms of knowledge and truth is compared to the exceptional legal status assigned to individuals accused of terrorism under the War on Terror. The paper concludes by discussing the relation between conspiracy theory and 'the paranoid style...

  18. Kinematic analysis of a 3-RPS type parallel robot based on screw theory

    朱大昌; 张国新

    2011-01-01

    The 3-RPS type parallel mechanism has three structurally symmetrical limbs, each connected to the base through a revolute pair and to the moving platform through a spherical pair, with a prismatic (moving) pair linking the revolute and spherical pairs. Screw theory and spatial mechanism structure theory are adopted to derive the motion characteristics of the robot from its constraints. Applying the vector analysis method, the forward and inverse kinematic solutions of the mechanism are derived, yielding its kinematic equations. Furthermore, the kinematic singularities of the robot are analyzed and summarized based on the non-homogeneous Jacobian matrix of this symmetric, lower-mobility (limited-DOF) parallel mechanism.
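
The building block of such a screw-theoretic analysis is the twist of each joint, a 6-component screw. For a revolute joint it is determined by the axis direction and any point on the axis. A minimal sketch (the joint placement is hypothetical, not the paper's actual limb geometry):

```python
def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def revolute_screw(omega, q):
    """Unit screw (twist) of a revolute joint with unit axis direction
    omega passing through point q; returns (omega, v) with v = q x omega,
    so any point p on the axis has zero velocity: v + omega x p = 0."""
    return omega, cross(q, omega)

# Hypothetical base revolute joint of one RPS limb:
# axis along x, passing through the point (0, 0.2, 0).
omega, v = revolute_screw((1.0, 0.0, 0.0), (0.0, 0.2, 0.0))
print(omega, v)  # (1.0, 0.0, 0.0) (0.0, 0.0, -0.2)
```

Stacking the twists of each limb's joints column-wise gives the limb Jacobian, and the reciprocal screws of each limb supply the constraint rows used in the singularity analysis.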

  19. Detection and quantification of proteins in clinical samples using high resolution mass spectrometry.

    Gallien, Sebastien; Domon, Bruno

    2015-06-15

    Quantitative proteomics has benefited from the recent development of mass spectrometers capable of high-resolution/accurate-mass (HR/AM) measurements. While targeted experiments are routinely performed on triple quadrupole instruments in selected reaction monitoring (SRM; often referred to as multiple reaction monitoring, MRM) mode, quadrupole-orbitrap mass spectrometers allow quantification in MS/MS mode, also known as parallel reaction monitoring (PRM). This technique is characterized by higher selectivity and better confidence in the assignment of precursor and fragment ions, and thus translates into improved analytical performance. More fundamentally, PRM changes the overall paradigm of targeted experiments by decoupling acquisition from data processing: the experiment relies on two distinct steps, a simplified acquisition method in conjunction with flexible, iterative, post-acquisition data processing. This account describes in detail the different steps of a PRM experiment, which include the design of the acquisition method, the confirmation of analyte identity based on the full MS/MS fragmentation pattern, and quantification based on the extraction of specific fragment ions (selected post-acquisition) using a tight mass tolerance. The different types of PRM experiments, defined as large-scale screening or precise targeted quantification using calibrated internal standards, are discussed together with considerations on the selection of experimental parameters.
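
The post-acquisition extraction of fragment ions within a tight mass tolerance can be sketched as a simple ppm-window filter over a centroided spectrum (the peak list and target m/z below are hypothetical):

```python
def extract_fragment_intensity(peaks, target_mz, tol_ppm=10.0):
    """Sum the intensities of MS/MS peaks whose m/z lies within a tight
    tolerance (in parts per million) of a targeted fragment ion.
    `peaks` is a list of (mz, intensity) pairs from a centroided spectrum."""
    tol = target_mz * tol_ppm * 1e-6  # convert ppm to an absolute Da window
    return sum(i for mz, i in peaks if abs(mz - target_mz) <= tol)

# Hypothetical centroided MS/MS spectrum:
spectrum = [(356.1801, 1.2e5), (356.1840, 3.0e3), (480.2550, 8.9e4)]
# A 10 ppm window around 356.1805 is about +/-0.0036 Da:
print(extract_fragment_intensity(spectrum, 356.1805, tol_ppm=10.0))  # 123000.0
# Tightening to 2 ppm keeps only the closest peak:
print(extract_fragment_intensity(spectrum, 356.1805, tol_ppm=2.0))   # 120000.0
```

Tightening the tolerance is exactly what buys PRM its selectivity: near-isobaric interferences that would be summed into an SRM transition fall outside the ppm window and are excluded post-acquisition.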

  20. Signal processing of heart signals for the quantification of non-deterministic events

    Baddour Natalie

    2011-01-01

    Full Text Available Background Heart signals represent an important way to evaluate cardiovascular function, and often what is desired is to quantify the level of some signal of interest against the louder backdrop of the beating of the heart itself. An example of this type of application is the quantification of cavitation in mechanical heart valve patients. Methods An algorithm is presented for the quantification of high-frequency, non-deterministic events such as cavitation from recorded signals. A closed-form mathematical analysis of the algorithm investigates its capabilities. The algorithm is implemented on real heart signals to investigate usability and implementation issues. Improvements to the base algorithm are suggested, including aligning heart sounds and applying the Short-Time Fourier Transform to study the time evolution of the energy in the signal. Results The improvements yield better heart-beat alignment and better detection and measurement of the random events in the heart signals, so that they may provide a method to quantify non-deterministic events in heart signals. The Short-Time Fourier Transform allows examination of the random events in both time and frequency, enabling further investigation and interpretation of the signal. Conclusions The presented algorithm does allow for the quantification of non-deterministic events, but proper care in signal acquisition and processing must be taken to obtain meaningful results.
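
The Short-Time Fourier Transform used to track the time evolution of signal energy can be sketched with a naive windowed DFT (the test signal is synthetic; a real implementation would use an FFT library for speed):

```python
import cmath
import math

def stft(signal, frame_len, hop):
    """Short-Time Fourier Transform via a naive DFT on Hann-windowed,
    overlapping frames; returns a list of per-frame magnitude spectra
    (bins 0 .. frame_len//2)."""
    hann = [0.5 - 0.5 * math.cos(2 * math.pi * n / (frame_len - 1))
            for n in range(frame_len)]
    frames = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = [signal[start + n] * hann[n] for n in range(frame_len)]
        spectrum = [abs(sum(frame[n] * cmath.exp(-2j * math.pi * k * n / frame_len)
                            for n in range(frame_len)))
                    for k in range(frame_len // 2 + 1)]
        frames.append(spectrum)
    return frames

# Synthetic test signal: 64 samples of a tone landing on bin 8 of a 32-point frame.
sig = [math.sin(2 * math.pi * 8 * t / 32) for t in range(64)]
mags = stft(sig, frame_len=32, hop=16)
peak_bin = max(range(len(mags[0])), key=lambda k: mags[0][k])
print(len(mags), peak_bin)  # 3 frames; energy concentrated at bin 8
```

Scanning the per-frame spectra over time is what lets brief, high-frequency, non-deterministic events such as cavitation be localized both in time and in frequency against the periodic heart sounds.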