WorldWideScience

Sample records for normal form theory

  1. Normal form theory and spectral sequences

    OpenAIRE

    Sanders, Jan A.

    2003-01-01

    The concept of unique normal form is formulated in terms of a spectral sequence. As an illustration of this technique some results of Baider and Churchill concerning the normal form of the anharmonic oscillator are reproduced. The aim of this paper is to show that spectral sequences give us a natural framework in which to formulate normal form theory. © 2003 Elsevier Science (USA). All rights reserved.

  2. A normal form approach to the theory of nonlinear betatronic motion

    International Nuclear Information System (INIS)

    Bazzani, A.; Todesco, E.; Turchetti, G.; Servizi, G.

    1994-01-01

    The betatronic motion of a particle in a circular accelerator is analysed using the transfer map description of the magnetic lattice. In the linear case the transfer matrix approach is shown to be equivalent to the Courant-Snyder theory: In the normal coordinates' representation the transfer matrix is a pure rotation. When the nonlinear effects due to the multipolar components of the magnetic field are taken into account, a similar procedure is used: a nonlinear change of coordinates provides a normal form representation of the map, which exhibits explicit symmetry properties depending on the absence or presence of resonance relations among the linear tunes. The use of normal forms is illustrated in the simplest but significant model of a cell with a sextupolar nonlinearity which is described by the quadratic Henon map. After recalling the basic theoretical results in Hamiltonian dynamics, we show how the normal forms describe the different topological structures of phase space such as KAM tori, chains of islands and chaotic regions; a critical comparison with the usual perturbation theory for Hamilton equations is given. The normal form theory is applied to compute the tune shift and deformation of the orbits for the lattices of the SPS and LHC accelerators, and scaling laws are obtained. Finally, the correction procedure of the multipolar errors of the LHC, based on the analytic minimization of the tune shift computed via the normal forms, is described and the results for a model of the LHC are presented. This application, relevant for the lattice design, focuses on the advantages of normal forms with respect to tracking when parametric dependences have to be explored. (orig.)
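
    As a concrete illustration of the model mentioned above (my own minimal sketch, not code from the record; sign conventions for the kick vary in the literature), the quadratic Hénon map is a linear phase-space rotation through the betatron phase advance composed with a thin-sextupole kick:

```python
import math

def henon_map(x, p, omega):
    """One turn of the quadratic Henon map: thin-sextupole kick, then a rotation.

    The kick p -> p + x**2 is symplectic (it is generated by x**3/3), so the
    composed map is area-preserving. omega is the linear phase advance per turn.
    """
    c, s = math.cos(omega), math.sin(omega)
    px = p + x * x                      # thin sextupole kick
    return c * x + s * px, -s * x + c * px

# Track a small-amplitude particle for a few turns at an illustrative tune of 0.205.
x, p = 0.1, 0.0
for turn in range(5):
    x, p = henon_map(x, p, 2 * math.pi * 0.205)
    print(turn, round(x, 5), round(p, 5))
```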

  3. Analysis of a renormalization group method and normal form theory for perturbed ordinary differential equations

    Science.gov (United States)

    DeVille, R. E. Lee; Harkin, Anthony; Holzer, Matt; Josić, Krešimir; Kaper, Tasso J.

    2008-06-01

    For singular perturbation problems, the renormalization group (RG) method of Chen, Goldenfeld, and Oono [Phys. Rev. E. 49 (1994) 4502-4511] has been shown to be an effective general approach for deriving reduced or amplitude equations that govern the long time dynamics of the system. It has been applied to a variety of problems traditionally analyzed using disparate methods, including the method of multiple scales, boundary layer theory, the WKBJ method, the Poincaré-Lindstedt method, the method of averaging, and others. In this article, we show how the RG method may be used to generate normal forms for large classes of ordinary differential equations. First, we apply the RG method to systems with autonomous perturbations, and we show that the reduced or amplitude equations generated by the RG method are equivalent to the classical Poincaré-Birkhoff normal forms for these systems up to and including terms of O(ɛ²), where ɛ is the perturbation parameter. This analysis establishes our approach and generalizes to higher order. Second, we apply the RG method to systems with nonautonomous perturbations, and we show that the reduced or amplitude equations so generated constitute time-asymptotic normal forms, which are based on KBM averages. Moreover, for both classes of problems, we show that the main coordinate changes are equivalent, up to translations between the spaces in which they are defined. In this manner, our results show that the RG method offers a new approach for deriving normal forms for nonautonomous systems, and it offers advantages since one can typically more readily identify resonant terms from naive perturbation expansions than from the nonautonomous vector fields themselves. Finally, we establish how well the solution to the RG equations approximates the solution of the original equations on time scales of O(1/ɛ).
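
    As a hedged illustration of the kind of amplitude equation referred to above (a textbook example, not a result quoted from the paper), consider the weakly nonlinear oscillator treated to first order in ɛ; removing the secular (resonant) term from the naive expansion gives the normal-form amplitude equation and the familiar frequency shift:

$$
\ddot{x} + x + \epsilon x^{3} = 0, \qquad x \approx A(T)\,e^{it} + \mathrm{c.c.}, \quad T = \epsilon t,
$$
$$
\frac{dA}{dT} = \frac{3i}{2}\,|A|^{2}A \quad\Longrightarrow\quad \omega \approx 1 + \frac{3}{8}\,\epsilon\,a^{2}, \qquad a = 2|A| ,
$$

    which agrees with the classical Poincaré-Lindstedt result; the RG procedure reproduces the same reduced equation through O(ɛ).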

  4. Theory and praxis of map analysis in CHEF part 1: Linear normal form

    Energy Technology Data Exchange (ETDEWEB)

    Michelotti, Leo; /Fermilab

    2008-10-01

    This memo begins a series which, put together, could comprise the 'CHEF Documentation Project' if there were such a thing. The first--and perhaps only--three will telegraphically describe theory, algorithms, implementation and usage of the normal form map analysis procedures encoded in CHEF's collection of libraries. [1] This one will begin the sequence by explaining the linear manipulations that connect the Jacobian matrix of a symplectic mapping to its normal form. It is a 'Reader's Digest' version of material I wrote in Intermediate Classical Dynamics (ICD) [2] and randomly scattered across technical memos, seminar viewgraphs, and lecture notes for the past quarter century. Much of its content is old, well known, and in some places borders on the trivial. Nevertheless, completeness requires their inclusion. The primary objective is the 'fundamental theorem' on normalization written on page 8. I plan to describe the nonlinear procedures in a subsequent memo and devote a third to laying out algorithms and lines of code, connecting them with equations written in the first two. Originally this was to be done in one short paper, but I jettisoned that approach after its first section exceeded a dozen pages. The organization of this document is as follows. A brief description of notation is followed by a section containing a general treatment of the linear problem. After the 'fundamental theorem' is proved, two further subsections discuss the generation of equilibrium distributions and the issue of 'phase'. The final major section reviews parameterizations--that is, lattice functions--in two and four dimensions with a passing glance at the six-dimensional version. Appearances to the contrary, for the most part I have tried to restrict consideration to matters needed to understand the code in CHEF's libraries.

  5. Theory and praxis of map analysis in CHEF part 2: Nonlinear normal form

    International Nuclear Information System (INIS)

    Michelotti, Leo

    2009-01-01

    This is the second of three memos describing how normal form map analysis is implemented in CHEF. The first (1) explained the manipulations required to assure that initial, linear transformations preserved Poincare invariants, thereby confirming correct normalization of action-angle coordinates. In this one, the transformation will be extended to nonlinear terms. The third, describing how the algorithms were implemented within the software of CHEF's libraries, most likely will never be written. The first section, Section 2, quickly lays out preliminary concepts and relationships. In Section 3, we shall review the perturbation theory - an iterative sequence of transformations that converts a nonlinear mapping into its normal form - and examine the equation which moves calculations from one step to the next. Following that is a section titled 'Interpretation', which identifies connections between the normalized mappings and idealized, integrable, fictitious Hamiltonian models. A final section contains closing comments, some of which may - but probably will not - preview work to be done later. My reasons for writing this memo and its predecessor have already been expressed. (1) To them can be added this: 'black box code' encourages users to proceed with little or no understanding of what it does or how it operates. So far, CHEF has avoided this trap admirably by failing to attract potential users. However, we reached a watershed last year: even I now have difficulty following the software through its maze of operations. Extensions to CHEF's physics functionalities, software upgrades, and even simple maintenance are becoming more difficult than they should. I hope these memos will mark parts of the maze for easier navigation in the future. Despite appearances to the contrary, I tried to include no (or very little) more than the minimum needed to understand what CHEF's nonlinear analysis modules do.1 As with the first memo, material has been lifted - and modified - from

  6. Theory and praxis of map analysis in CHEF part 2: Nonlinear normal form

    Energy Technology Data Exchange (ETDEWEB)

    Michelotti, Leo; /FERMILAB

    2009-04-01

    This is the second of three memos describing how normal form map analysis is implemented in CHEF. The first [1] explained the manipulations required to assure that initial, linear transformations preserved Poincare invariants, thereby confirming correct normalization of action-angle coordinates. In this one, the transformation will be extended to nonlinear terms. The third, describing how the algorithms were implemented within the software of CHEF's libraries, most likely will never be written. The first section, Section 2, quickly lays out preliminary concepts and relationships. In Section 3, we shall review the perturbation theory - an iterative sequence of transformations that converts a nonlinear mapping into its normal form - and examine the equation which moves calculations from one step to the next. Following that is a section titled 'Interpretation', which identifies connections between the normalized mappings and idealized, integrable, fictitious Hamiltonian models. A final section contains closing comments, some of which may - but probably will not - preview work to be done later. My reasons for writing this memo and its predecessor have already been expressed. [1] To them can be added this: 'black box code' encourages users to proceed with little or no understanding of what it does or how it operates. So far, CHEF has avoided this trap admirably by failing to attract potential users. However, we reached a watershed last year: even I now have difficulty following the software through its maze of operations. Extensions to CHEF's physics functionalities, software upgrades, and even simple maintenance are becoming more difficult than they should. I hope these memos will mark parts of the maze for easier navigation in the future. Despite appearances to the contrary, I tried to include no (or very little) more than the minimum needed to understand what CHEF's nonlinear analysis modules do.1 As with the first memo, material

  7. Theory of normal metals

    International Nuclear Information System (INIS)

    Mahan, G.D.

    1992-01-01

    The organizers requested that I give eight lectures on the theory of normal metals, "with an eye on superconductivity." My job was to cover the general properties of metals. The topics were selected according to what the students would need to know for the following lectures on superconductivity. My role was to prepare the groundwork for the later lectures. The problem is that there is not yet a widely accepted theory for the mechanism which pairs the electrons. Many mechanisms have been proposed, with those of phonons and spin fluctuations having the most followers. So I tried to discuss both topics. I also introduced the tight-binding model for metals, which forms the basis for most of the work on the cuprate superconductors.

  8. Solitary-wave families of the Ostrovsky equation: An approach via reversible systems theory and normal forms

    International Nuclear Information System (INIS)

    Roy Choudhury, S.

    2007-01-01

    The Ostrovsky equation is an important canonical model for the unidirectional propagation of weakly nonlinear long surface and internal waves in a rotating, inviscid and incompressible fluid. Limited functional analytic results exist for the occurrence of one family of solitary-wave solutions of this equation, as well as their approach to the well-known solitons of the famous Korteweg-de Vries equation in the limit as the rotation becomes vanishingly small. Since solitary-wave solutions often play a central role in the long-time evolution of an initial disturbance, we consider such solutions here (via the normal form approach) within the framework of reversible systems theory. Besides confirming the existence of the known family of solitary waves and its reduction to the KdV limit, we find a second family of multihumped (or N-pulse) solutions, as well as a continuum of delocalized solitary waves (or homoclinics to small-amplitude periodic orbits). On isolated curves in the relevant parameter region, the delocalized waves reduce to genuine embedded solitons. The second and third families of solutions occur in regions of parameter space distinct from the known solitary-wave solutions and are thus entirely new. Directions for future work are also mentioned

  9. Dynamic pathways to mediate reactions buried in thermal fluctuations. I. Time-dependent normal form theory for multidimensional Langevin equation.

    Science.gov (United States)

    Kawai, Shinnosuke; Komatsuzaki, Tamiki

    2009-12-14

    We present a novel theory which enables us to explore the mechanism of reaction selectivity and robust functions in complex systems persisting under thermal fluctuation. The theory constructs a nonlinear coordinate transformation so that the equation of motion for the new reaction coordinate is independent of the other nonreactive coordinates in the presence of thermal fluctuation. In this article we suppose that reacting systems subject to thermal noise are described by a multidimensional Langevin equation without a priori assumption for the form of potential. The reaction coordinate is composed not only of all the coordinates and velocities associated with the system (solute) but also of the random force exerted by the environment (solvent) with friction constants. The sign of the reaction coordinate at any instantaneous moment in the region of a saddle determines the fate of the reaction, i.e., whether the reaction will proceed through to the products or go back to the reactants. By assuming the statistical properties of the random force, one can know a priori a well-defined boundary of the reaction which separates the full position-velocity space in the saddle region into mainly reactive and mainly nonreactive regions even under thermal fluctuation. The analytical expression of the reaction coordinate provides the firm foundation on the mechanism of how and why reaction proceeds in thermal fluctuating environments.
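
    For reference (a standard form in mass-weighted coordinates, not an equation quoted from the paper), a multidimensional Langevin equation of the type referred to above can be written with friction matrix γ and a Gaussian random force ξ(t) obeying a fluctuation-dissipation relation:

$$
\ddot{q}_i = -\frac{\partial V}{\partial q_i} - \sum_j \gamma_{ij}\,\dot{q}_j + \xi_i(t),
\qquad \langle \xi_i(t)\,\xi_j(t')\rangle = 2\,\gamma_{ij}\,k_{B}T\,\delta(t-t') ;
$$

    the time-dependent normal form construction then builds a reaction coordinate from the coordinates, velocities, and ξ(t) itself, so that its sign near the saddle decides whether a trajectory is reactive.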

  10. Normal forms in Poisson geometry

    NARCIS (Netherlands)

    Marcut, I.T.

    2013-01-01

    The structure of Poisson manifolds is highly nontrivial even locally. The first important result in this direction is Conn's linearization theorem around fixed points. One of the main results of this thesis (Theorem 2) is a normal form theorem in Poisson geometry, which is the Poisson-geometric

  11. TRASYS form factor matrix normalization

    Science.gov (United States)

    Tsuyuki, Glenn T.

    1992-01-01

    A method has been developed for adjusting a TRASYS enclosure form factor matrix to unity. This approach is not limited to closed geometries, and in fact, it is primarily intended for use with open geometries. The purpose of this approach is to prevent overly optimistic form factors to space. In this method, nodal form factor sums are calculated within 0.05 of unity using TRASYS, although deviations as large as 0.10 may be acceptable, and then a process is employed to distribute the difference amongst the nodes. A specific example has been analyzed with this method, and a comparison was performed with a standard approach for calculating radiation conductors. In this comparison, hot and cold case temperatures were determined. Exterior nodes exhibited temperature differences as large as 7 °C and 3 °C for the hot and cold cases, respectively, when compared with the standard approach, while interior nodes demonstrated temperature differences from 0 °C to 5 °C. These results indicate that temperature predictions can be artificially biased if the form factor computation error is lumped into the individual form factors to space.
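
    As a simplified, hypothetical sketch of this kind of adjustment (not the actual TRASYS procedure), one can rescale each node's computed form factors so that its row sums to unity, distributing the residual proportionally rather than lumping it into the form factor to space:

```python
import numpy as np

def normalize_form_factors(F, tol=0.05):
    """Proportionally rescale each row of a form factor matrix to sum to 1.

    F[i, j] is the computed form factor from node i to node j (the last column
    can represent space for an open geometry). Rows whose sums deviate from
    unity by more than `tol` are flagged for inspection. This is a simplified
    illustration, not the algorithm described in the record above.
    """
    F = np.asarray(F, dtype=float)
    sums = F.sum(axis=1)
    flagged = np.where(np.abs(sums - 1.0) > tol)[0]
    F_adj = F / sums[:, None]          # distribute the residual proportionally
    return F_adj, flagged

F = np.array([[0.00, 0.42, 0.55],      # toy example: 2 surfaces plus a space column
              [0.40, 0.00, 0.57]])
F_adj, flagged = normalize_form_factors(F)
print(F_adj.sum(axis=1), flagged)      # rows now sum to 1; out-of-tolerance rows listed
```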

  12. a Recursive Approach to Compute Normal Forms

    Science.gov (United States)

    HSU, L.; MIN, L. J.; FAVRETTO, L.

    2001-06-01

    Normal forms are instrumental in the analysis of dynamical systems described by ordinary differential equations, particularly when singularities close to a bifurcation are to be characterized. However, the computation of a normal form up to an arbitrary order is numerically hard. This paper focuses on the computer programming of some recursive formulas developed earlier to compute higher order normal forms. A computer program to reduce the system to its normal form on a center manifold is developed using the Maple symbolic language. However, it should be stressed that the program relies essentially on recursive numerical computations, while symbolic calculations are used only for minor tasks. Some strategies are proposed to save computation time. Examples are presented to illustrate the application of the program to obtain high order normalization or to handle systems with large dimension.

  13. Nonlinear dynamics exploration through normal forms

    CERN Document Server

    Kahn, Peter B

    2014-01-01

    Geared toward advanced undergraduates and graduate students, this exposition covers the method of normal forms and its application to ordinary differential equations through perturbation analysis. In addition to its emphasis on the freedom inherent in the normal form expansion, the text features numerous examples of equations, the kind of which are encountered in many areas of science and engineering. The treatment begins with an introduction to the basic concepts underlying the normal forms. Coverage then shifts to an investigation of systems with one degree of freedom that model oscillations

  14. An Algorithm for Higher Order Hopf Normal Forms

    Directory of Open Access Journals (Sweden)

    A.Y.T. Leung

    1995-01-01

    Full Text Available Normal form theory is important for studying the qualitative behavior of nonlinear oscillators. In some cases, higher order normal forms are required to understand the dynamic behavior near an equilibrium or a periodic orbit. However, the computation of high-order normal forms is usually quite complicated. This article provides an explicit formula for the normalization of nonlinear differential equations. The higher order normal form is given explicitly. Illustrative examples include a cubic system, a quadratic system and a Duffing–Van der Pol system. We use exact arithmetic and find that the undamped Duffing equation can be represented by an exact polynomial differential amplitude equation in a finite number of terms.
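
    For orientation (the standard lowest-order form, not a formula reproduced from the article), the Hopf normal form in polar coordinates reads

$$
\dot{r} = \mu r + a\,r^{3} + O(r^{5}), \qquad \dot{\theta} = \omega + b\,r^{2} + O(r^{4}) ,
$$

    and the higher-order normal forms discussed above extend this expansion to O(r^5), O(r^7) and beyond, with coefficients computed from the original vector field.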

  15. Normal forms of Hopf-zero singularity

    International Nuclear Information System (INIS)

    Gazor, Majid; Mokhtari, Fahimeh

    2015-01-01

    The Lie algebra generated by Hopf-zero classical normal forms is decomposed into two versal Lie subalgebras. Some dynamical properties for each subalgebra are described; one is the set of all volume-preserving conservative systems while the other is the maximal Lie algebra of nonconservative systems. This introduces a unique conservative–nonconservative decomposition for the normal form systems. There exists a Lie-subalgebra that is Lie-isomorphic to a large family of vector fields with Bogdanov–Takens singularity. This gives rise to a conclusion that the local dynamics of formal Hopf-zero singularities is well-understood by the study of Bogdanov–Takens singularities. Despite this, the normal form computations of Bogdanov–Takens and Hopf-zero singularities are independent. Thus, by assuming a quadratic nonzero condition, complete results on the simplest Hopf-zero normal forms are obtained in terms of the conservative–nonconservative decomposition. Some practical formulas are derived and the results implemented using Maple. The method has been applied on the Rössler and Kuramoto–Sivashinsky equations to demonstrate the applicability of our results. (paper)

  16. Normal forms of Hopf-zero singularity

    Science.gov (United States)

    Gazor, Majid; Mokhtari, Fahimeh

    2015-01-01

    The Lie algebra generated by Hopf-zero classical normal forms is decomposed into two versal Lie subalgebras. Some dynamical properties for each subalgebra are described; one is the set of all volume-preserving conservative systems while the other is the maximal Lie algebra of nonconservative systems. This introduces a unique conservative-nonconservative decomposition for the normal form systems. There exists a Lie-subalgebra that is Lie-isomorphic to a large family of vector fields with Bogdanov-Takens singularity. This gives rise to a conclusion that the local dynamics of formal Hopf-zero singularities is well-understood by the study of Bogdanov-Takens singularities. Despite this, the normal form computations of Bogdanov-Takens and Hopf-zero singularities are independent. Thus, by assuming a quadratic nonzero condition, complete results on the simplest Hopf-zero normal forms are obtained in terms of the conservative-nonconservative decomposition. Some practical formulas are derived and the results implemented using Maple. The method has been applied on the Rössler and Kuramoto-Sivashinsky equations to demonstrate the applicability of our results.

  17. Normal equivariant forms of vector fields

    International Nuclear Information System (INIS)

    Sanchez Bringas, F.

    1992-07-01

    We prove a Siegel-type linearization theorem and a Poincaré-Dulac-type normal form theorem for germs of holomorphic vector fields at the origin of C² that are Γ-equivariant, where Γ is a finite subgroup of GL(2,C). (author). 5 refs
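
    For orientation (a standard statement, not text from the record above), the Poincaré-Dulac normal form keeps only resonant monomials, where a resonance among the eigenvalues of the linear part is

$$
\dot{x} = Ax + f(x), \quad f = O(|x|^{2}), \quad A = \mathrm{diag}(\lambda_1,\dots,\lambda_n), \qquad
\lambda_j = \langle m,\lambda\rangle = \sum_i m_i\lambda_i, \quad m_i \ge 0,\ \textstyle\sum_i m_i \ge 2 ,
$$
$$
\text{normal form: } \quad \dot{y}_j = \lambda_j y_j + \sum_{\langle m,\lambda\rangle = \lambda_j} c_{j,m}\, y^{m} ;
$$

    in the equivariant setting the normalizing changes of coordinates are additionally required to commute with the group action.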

  18. Normal Forms for Fuzzy Logics: A Proof-Theoretic Approach

    Czech Academy of Sciences Publication Activity Database

    Cintula, Petr; Metcalfe, G.

    2007-01-01

    Vol. 46, No. 5-6 (2007), pp. 347-363 ISSN 1432-0665 R&D Projects: GA MŠk(CZ) 1M0545 Institutional research plan: CEZ:AV0Z10300504 Keywords: fuzzy logic * normal form * proof theory * hypersequents Subject RIV: BA - General Mathematics Impact factor: 0.620, year: 2007

  19. Normal form for mirror machine Hamiltonians

    International Nuclear Information System (INIS)

    Dragt, A.J.; Finn, J.M.

    1979-01-01

    A systematic algorithm is developed for performing canonical transformations on Hamiltonians which govern particle motion in magnetic mirror machines. These transformations are performed in such a way that the new Hamiltonian has a particularly simple normal form. From this form it is possible to compute analytic expressions for gyro and bounce frequencies. In addition, it is possible to obtain arbitrarily high order terms in the adiabatic magnetic moment expansion. The algorithm makes use of Lie series, is an extension of Birkhoff's normal form method, and has been explicitly implemented by a digital computer programmed to perform the required algebraic manipulations. Application is made to particle motion in a magnetic dipole field and to a simple mirror system. Bounce frequencies and locations of periodic orbits are obtained and compared with numerical computations. Both mirror systems are shown to be insoluble, i.e., trajectories are not confined to analytic hypersurfaces, there is no analytic third integral of motion, and the adiabatic magnetic moment expansion is divergent. It is expected also that the normal form procedure will prove useful in the study of island structure and separatrices associated with periodic orbits, and should facilitate studies of breakdown of adiabaticity and the onset of ''stochastic'' behavior

  20. The Form of HWID Theory

    DEFF Research Database (Denmark)

    Clemmensen, Torkil

    2015-01-01

    The aim of activities within the Human-Computer Interaction (HCI) area named Human Work Interaction Design (HWID) is to establish relationships between empirical work-domain studies and recent developments in interaction design. Recent areas of research within HWID include design sketches for work......, usability in context, work analysis for HCI, and integration of work analysis and interaction design methods for pervasive and smart workplaces. Across these areas, the question emerges what form of theory may HWID research produce? The aim with this paper is to investigate the requirements of different...... research purposes to a common framework. We take the position that we should approach HWID with a lightweight, medium-level framework that is useful to guide the application of other theories to study the relation between work analysis and interaction design. We analyse the requirements to theory found...

  1. Fast Bitwise Implementation of the Algebraic Normal Form Transform

    OpenAIRE

    Bakoev, Valentin

    2017-01-01

    The representation of Boolean functions by their algebraic normal forms (ANFs) is very important for cryptography, coding theory and other scientific areas. The ANFs are used in computing the algebraic degree of S-boxes, some other cryptographic criteria and parameters of error-correcting codes. Their applications require these criteria and parameters to be computed by fast algorithms. Hence the corresponding ANFs should also be obtained by fast algorithms. Here we continue o...
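
    As an illustrative aside (my own sketch; the record's own algorithm is truncated above), the classical fast transform between a Boolean function's truth table and its ANF coefficients is the in-place binary Möbius transform, which runs in O(n·2^n) operations:

```python
def anf_transform(truth_table):
    """In-place binary Moebius transform: truth table -> ANF coefficients.

    truth_table: list of 0/1 values of length 2**n, entry i is f(i) with the
    bits of i giving the variable assignment. Applying the same butterfly
    twice recovers the truth table, since the transform is its own inverse
    over GF(2).
    """
    f = list(truth_table)
    n = len(f).bit_length() - 1
    assert len(f) == 1 << n, "length must be a power of two"
    for i in range(n):                 # one pass per variable
        for x in range(1 << n):
            if x & (1 << i):           # XOR the partner with bit i cleared
                f[x] ^= f[x ^ (1 << i)]
    return f

# Example: f(x1, x0) = x0 AND x1 has truth table [0, 0, 0, 1];
# the transform returns [0, 0, 0, 1], i.e. the single monomial x0*x1.
print(anf_transform([0, 0, 0, 1]))
```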

  2. Differential forms theory and practice

    CERN Document Server

    Weintraub, Steven H

    2014-01-01

    Differential forms are utilized as a mathematical technique to help students, researchers, and engineers analyze and interpret problems where abstract spaces and structures are concerned, and when questions of shape, size, and relative positions are involved. Differential Forms has gained high recognition in the mathematical and scientific community as a powerful computational tool in solving research problems and simplifying very abstract problems through mathematical analysis on a computer. Differential Forms, 2nd Edition, is a solid resource for students and professionals needing a solid g

  3. AFP Algorithm and a Canonical Normal Form for Horn Formulas

    OpenAIRE

    Majdoddin, Ruhollah

    2014-01-01

    The AFP algorithm is a learning algorithm for Horn formulas. We show that performing more than one refinement after each negative counterexample does not improve the complexity of the AFP algorithm. Moreover, a canonical normal form for Horn formulas is presented, and it is proved that the output formula of the AFP algorithm is in this normal form.

  4. Self-consistent normal ordering of gauge field theories

    International Nuclear Information System (INIS)

    Ruehl, W.

    1987-01-01

    Mean-field theories with a real action of unconstrained fields can be self-consistently normal ordered. This leads to a considerable improvement over standard mean-field theory. This concept is applied to lattice gauge theories. First an appropriate real action mean-field theory is constructed. The equations determining the Gaussian kernel necessary for self-consistent normal ordering of this mean-field theory are derived. (author). 4 refs

  5. Abelian 2-form gauge theory: special features

    International Nuclear Information System (INIS)

    Malik, R P

    2003-01-01

    It is shown that the four (3 + 1)-dimensional (4D) free Abelian 2-form gauge theory provides an example of (i) a class of field theoretical models for the Hodge theory, and (ii) a possible candidate for the quasi-topological field theory (q-TFT). Despite many striking similarities with some of the key topological features of the two (1 + 1)-dimensional (2D) free Abelian (and self-interacting non-Abelian) gauge theories, it turns out that the 4D free Abelian 2-form gauge theory is not an exact TFT. To corroborate this conclusion, some of the key issues are discussed. In particular, it is shown that the (anti-)BRST and (anti-)co-BRST invariant quantities of the 4D 2-form Abelian gauge theory obey recursion relations that are reminiscent of the exact TFTs but the Lagrangian density of this theory is not found to be able to be expressed as the sum of (anti-)BRST and (anti-)co-BRST exact quantities as is the case with the topological 2D free Abelian (and self-interacting non-Abelian) gauge theories

  6. Disjoint sum forms in reliability theory

    Directory of Open Access Journals (Sweden)

    B. Anrig

    2014-01-01

    Full Text Available The structure function f of a binary monotone system is assumed to be known and given in a disjunctive normal form, i.e. as the logical union of products of the indicator variables of the states of its subsystems. Based on this representation of f, an improved Abraham algorithm is proposed for generating the disjoint sum form of f. This form is the base for subsequent numerical reliability calculations. The approach is generalized to multivalued systems. Examples are discussed.
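
    As a hedged toy illustration of why the disjoint sum form matters for the numerical step (an invented example, not one from the article): for the structure function f = x1x2 ∨ x1x3, one disjoint sum is x1x2 ∨ x1(1-x2)x3, whose terms are mutually exclusive so their probabilities simply add. A minimal Python check against brute-force enumeration:

```python
from itertools import product

p = {1: 0.9, 2: 0.8, 3: 0.7}   # hypothetical component reliabilities

def f(x1, x2, x3):
    # structure function: the system works if component 1 works and (2 or 3) works
    return x1 and (x2 or x3)

# Brute-force reliability: sum the probabilities of all working states.
brute = sum(
    (p[1] if x1 else 1 - p[1]) * (p[2] if x2 else 1 - p[2]) * (p[3] if x3 else 1 - p[3])
    for x1, x2, x3 in product((0, 1), repeat=3)
    if f(x1, x2, x3)
)

# Disjoint sum form x1*x2 + x1*(1-x2)*x3: the terms are mutually exclusive,
# so their probabilities add directly with no inclusion-exclusion corrections.
disjoint = p[1] * p[2] + p[1] * (1 - p[2]) * p[3]

print(brute, disjoint)   # both give the same reliability, 0.846 up to rounding
```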

  7. Normal form and synchronization of strict-feedback chaotic systems

    International Nuclear Information System (INIS)

    Wang, Feng; Chen, Shihua; Yu Minghai; Wang Changping

    2004-01-01

    This study concerns the normal form and synchronization of strict-feedback chaotic systems. We prove that any strict-feedback chaotic system can be rendered into a normal form by an invertible transform, and then a design procedure to synchronize the normal form of a non-autonomous strict-feedback chaotic system is presented. This approach needs only a scalar driving signal to realize synchronization, no matter how many dimensions the chaotic system contains. Furthermore, the Rössler chaotic system is taken as a concrete example to illustrate the design procedure without transforming a strict-feedback chaotic system into its normal form. Numerical simulations are also provided to show the effectiveness and feasibility of the developed methods.

  8. Normal form of linear systems depending on parameters

    International Nuclear Information System (INIS)

    Nguyen Huynh Phan.

    1995-12-01

    In this paper we resolve completely the problem to find normal forms of linear systems depending on parameters for the feedback action that we have studied for the special case of controllable linear systems. (author). 24 refs

  9. Volume-preserving normal forms of Hopf-zero singularity

    International Nuclear Information System (INIS)

    Gazor, Majid; Mokhtari, Fahimeh

    2013-01-01

    A practical method is described for computing the unique generator of the algebra of first integrals associated with a large class of Hopf-zero singularity. The set of all volume-preserving classical normal forms of this singularity is introduced via a Lie algebra description. This is a maximal vector space of classical normal forms with first integral; this is whence our approach works. Systems with a nonzero condition on their quadratic parts are considered. The algebra of all first integrals for any such system has a unique (modulo scalar multiplication) generator. The infinite level volume-preserving parametric normal forms of any nondegenerate perturbation within the Lie algebra of any such system is computed, where it can have rich dynamics. The associated unique generator of the algebra of first integrals are derived. The symmetry group of the infinite level normal forms are also discussed. Some necessary formulas are derived and applied to appropriately modified Rössler and generalized Kuramoto–Sivashinsky equations to demonstrate the applicability of our theoretical results. An approach (introduced by Iooss and Lombardi) is applied to find an optimal truncation for the first level normal forms of these examples with exponentially small remainders. The numerically suggested radius of convergence (for the first integral) associated with a hypernormalization step is discussed for the truncated first level normal forms of the examples. This is achieved by an efficient implementation of the results using Maple. (paper)

  10. Volume-preserving normal forms of Hopf-zero singularity

    Science.gov (United States)

    Gazor, Majid; Mokhtari, Fahimeh

    2013-10-01

    A practical method is described for computing the unique generator of the algebra of first integrals associated with a large class of Hopf-zero singularity. The set of all volume-preserving classical normal forms of this singularity is introduced via a Lie algebra description. This is a maximal vector space of classical normal forms with first integral; this is whence our approach works. Systems with a nonzero condition on their quadratic parts are considered. The algebra of all first integrals for any such system has a unique (modulo scalar multiplication) generator. The infinite level volume-preserving parametric normal forms of any nondegenerate perturbation within the Lie algebra of any such system is computed, where it can have rich dynamics. The associated unique generator of the algebra of first integrals are derived. The symmetry group of the infinite level normal forms are also discussed. Some necessary formulas are derived and applied to appropriately modified Rössler and generalized Kuramoto-Sivashinsky equations to demonstrate the applicability of our theoretical results. An approach (introduced by Iooss and Lombardi) is applied to find an optimal truncation for the first level normal forms of these examples with exponentially small remainders. The numerically suggested radius of convergence (for the first integral) associated with a hypernormalization step is discussed for the truncated first level normal forms of the examples. This is achieved by an efficient implementation of the results using Maple.

  11. Paramagnetic form factors from itinerant electron theory

    International Nuclear Information System (INIS)

    Cooke, J.F.; Liu, S.H.; Liu, A.J.

    1985-01-01

    Elastic neutron scattering experiments performed over the past two decades have provided accurate information about the magnetic form factors of paramagnetic transition metals. These measurements have traditionally been analyzed in terms of an atomic-like theory. There are, however, some cases where this procedure does not work, and there remains the overall conceptual problem of using an atomistic theory for systems where the unpaired-spin electrons are itinerant. We have recently developed computer codes for efficiently evaluating the induced magnetic form factors of fcc and bcc itinerant electron paramagnets. Results for the orbital and spin contributions have been obtained for Cr, Nb, V, Mo, Pd, and Rh based on local density bands. By using calculated spin enhancement parameters, we find reasonable agreement between theory and neutron form factor data. In addition, these zero parameter calculations yield predictions for the bulk susceptibility on an absolute scale which are in reasonable agreement with experiment in all treated cases except palladium

  12. Rational homotopy theory and differential forms

    CERN Document Server

    Griffiths, Phillip

    2013-01-01

    This completely revised and corrected version of the well-known Florence notes circulated by the authors together with E. Friedlander examines basic topology, emphasizing homotopy theory. Included is a discussion of Postnikov towers and rational homotopy theory. This is then followed by an in-depth look at differential forms and de Rham's theorem on simplicial complexes. In addition, Sullivan's results on computing the rational homotopy type from forms are presented. New to the Second Edition: *Fully-revised appendices including an expanded discussion of the Hirsch lemma *Presentation of a natu

  13. Chern-Simons forms in gravitation theories

    International Nuclear Information System (INIS)

    Zanelli, Jorge

    2012-01-01

    The Chern-Simons (CS) form evolved from an obstruction in mathematics into an important object in theoretical physics. In fact, the presence of CS terms in physics is more common than one may think: they seem to play an important role in high Tc superconductivity and in recently discovered topological insulators. In classical physics, the minimal coupling in electromagnetism and to the action for a mechanical system in Hamiltonian form are examples of CS functionals. CS forms are also the natural generalization of the minimal coupling between the electromagnetic field and a point charge when the source is not point like but an extended fundamental object, a membrane. They are found in relation with anomalies in quantum field theories, and as Lagrangians for gauge fields, including gravity and supergravity. A cursory review of the role of CS forms in gravitation theories is presented at an introductory level. (topical review)

  14. Chern-Simons forms in gravitation theories

    Science.gov (United States)

    Zanelli, Jorge

    2012-07-01

    The Chern-Simons (CS) form evolved from an obstruction in mathematics into an important object in theoretical physics. In fact, the presence of CS terms in physics is more common than one may think: they seem to play an important role in high Tc superconductivity and in recently discovered topological insulators. In classical physics, the minimal coupling in electromagnetism and to the action for a mechanical system in Hamiltonian form are examples of CS functionals. CS forms are also the natural generalization of the minimal coupling between the electromagnetic field and a point charge when the source is not point like but an extended fundamental object, a membrane. They are found in relation with anomalies in quantum field theories, and as Lagrangians for gauge fields, including gravity and supergravity. A cursory review of the role of CS forms in gravitation theories is presented at an introductory level.
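
    As a reminder of the object in question (the standard definition, not text from the review), the simplest non-Abelian CS form is the three-form

$$
\omega_{3}(A) = \mathrm{Tr}\!\left( A \wedge dA + \tfrac{2}{3}\, A \wedge A \wedge A \right),
\qquad d\,\omega_{3}(A) = \mathrm{Tr}\,( F \wedge F ) ,
$$

    whose exterior derivative is the topological density Tr(F ∧ F); under a large gauge transformation its integral shifts only by a winding-number term, which is what makes CS terms natural Lagrangians for gauge fields and gravity in odd dimensions.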

  15. Structural information theory and visual form

    NARCIS (Netherlands)

    Leeuwenberg, E.L.J.; Kaernbach, C.; Schroeger, E.; Mueller, H.

    2003-01-01

    The paper attends to basic characteristics of visual form as approached by Structural information theory, or SIT, (Leeuwenberg, Van der Helm and Van Lier). The introduction provides a global survey of this approach. The main part of the paper focuses on three characteristics of SIT. Each one is made

  16. Statistical Theory of Normal Grain Growth Revisited

    International Nuclear Information System (INIS)

    Gadomski, A.; Luczka, J.

    2002-01-01

    In this paper, we discuss three physically relevant problems concerning the normal grain growth process. These are: Infinite vs finite size of the system under study (a step towards more realistic modeling); conditions of fine-grained structure formation, with possible applications to thin films and biomembranes, and interesting relations to superplasticity of materials; approach to log-normality, an ubiquitous natural phenomenon, frequently reported in literature. It turns out that all three important points mentioned are possible to be included in a Mulheran-Harding type behavior of evolving grains-containing systems that we have studied previously. (author)

  17. Indefinite harmonic forms and gauge theory

    International Nuclear Information System (INIS)

    Nakashima, M.

    1988-01-01

    Indecomposable representations have been extensively used in the construction of conformal and de Sitter gauge theories. It is thus noteworthy that certain unitary highest weight representations have been given a geometric realization as the unitary quotient of an indecomposable representation using indefinite harmonic forms [RSW]. We apply this construction to SU(2,2) and the de Sitter group. The relation is established between these representations and the massless, positive energy representations of SU(2,2) obtained in the physics literature. We investigate the extent to which this construction allows twistors to be viewed as a gauge theory of SU(2,2). For the de Sitter group, on which the gauge theory of singletons is based, we find that this construction is not directly applicable. (orig.)

  18. Utilizing Nested Normal Form to Design Redundancy Free JSON Schemas

    Directory of Open Access Journals (Sweden)

    Wai Yin Mok

    2016-12-01

    Full Text Available JSON (JavaScript Object Notation) is a lightweight data-interchange format for the Internet. JSON is built on two structures: (1) a collection of name/value pairs and (2) an ordered list of values (http://www.json.org/). Because of this simple approach, JSON is easy to use and it has the potential to be the data interchange format of choice for the Internet. Similar to XML, JSON schemas allow nested structures to model hierarchical data. As data interchange over the Internet increases exponentially due to cloud computing or otherwise, redundancy-free JSON data are an attractive form of communication because they improve the quality of data communication through eliminating update anomalies. Nested Normal Form, a normal form for hierarchical data, is a precise characterization of redundancy. A nested table, or a hierarchical schema, is in Nested Normal Form if and only if it is free of redundancy caused by multivalued and functional dependencies. Using Nested Normal Form as a guide, this paper introduces a JSON schema design methodology that begins with UML use case diagrams, communication diagrams and class diagrams that model a system under study. Based on the use cases’ execution frequencies and the data passed between involved parties in the communication diagrams, the proposed methodology selects classes from the class diagrams to be the roots of JSON scheme trees and repeatedly adds classes from the class diagram to the scheme trees as long as the schemas satisfy Nested Normal Form. This process continues until all of the classes in the class diagram have been added to some JSON scheme trees.
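
    To make the idea concrete (an invented toy schema, not one from the paper): storing repeated parent attributes with every child record introduces redundancy, whereas a nested document keeps each fact once. A small sketch using Python's json module:

```python
import json

# Flat, redundant representation: the customer name is repeated with every order,
# so updating the name could leave inconsistent copies (an update anomaly).
flat = [
    {"customer_id": 1, "customer_name": "Ada", "order_id": 101, "total": 25.0},
    {"customer_id": 1, "customer_name": "Ada", "order_id": 102, "total": 40.0},
]

# Nested, redundancy-free representation in the spirit of Nested Normal Form:
# each customer fact is stored once, with its orders nested underneath.
nested = {
    "customer_id": 1,
    "customer_name": "Ada",
    "orders": [
        {"order_id": 101, "total": 25.0},
        {"order_id": 102, "total": 40.0},
    ],
}

print(json.dumps(nested, indent=2))
```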

  19. A New One-Pass Transformation into Monadic Normal Form

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2003-01-01

    We present a translation from the call-by-value λ-calculus to monadic normal forms that includes short-cut boolean evaluation. The translation is higher-order, operates in one pass, duplicates no code, generates no chains of thunks, and is properly tail recursive. It makes a crucial use of symbolic...

  20. Closed form bound-state perturbation theory

    Directory of Open Access Journals (Sweden)

    Ollie J. Rose

    1980-01-01

    Full Text Available The perturbed Schrödinger eigenvalue problem for bound states is cast into integral form using Green's Functions. A systematic algorithm is developed and applied to the resulting equation giving rise to approximate solutions expressed as functions of the given perturbation parameter. As a by-product, convergence radii for the traditional Rayleigh-Schrödinger and Brillouin-Wigner perturbation theories emerge in a natural way.

  1. Baryon form factors in chiral perturbation theory

    Energy Technology Data Exchange (ETDEWEB)

    Kubis, B.; Meissner, U.G. [Forschungszentrum Juelich GmbH (Germany). Inst. fuer Kernphysik

    2001-01-01

    We analyze the electromagnetic form factors of the ground state baryon octet to fourth order in relativistic baryon chiral perturbation theory. Predictions for the Σ⁻ charge radius and the Λ-Σ⁰ transition moment are found to be in excellent agreement with the available experimental information. Furthermore, the convergence behavior of the hyperon charge radii is shown to be more than satisfactory. (orig.)

  2. Automorphic Forms and Mock Modular Forms in String Theory

    Science.gov (United States)

    Nazaroglu, Caner

    We study a variety of modular invariant objects in relation to string theory. First, we focus on Jacobi forms over generic rank lattices and Siegel forms that appear in N = 2, D = 4 compactifications of the heterotic string with Wilson lines. Constraints from the low-energy spectrum and modularity are employed to deduce the relevant supersymmetric partition functions entirely. This procedure is applied to models that lead to Jacobi forms of index 3, 4, 5 as well as Jacobi forms over the root lattices A2 and A3. These computations are then checked against an explicit orbifold model which can be Higgsed to the models in question. Models with a single Wilson line are then studied in detail, with their relation to the paramodular group Gamma_m as T-duality group made explicit. These results on the heterotic string side are then turned into predictions for geometric invariants using Type II-heterotic duality. Secondly, we study theta functions for indefinite lattices of generic signature. Building on results in the literature for signature (n-1,1) and (n-2,2) lattices, we work out the properties of generalized error functions which we call r-tuple error functions. We then use these functions to build such indefinite theta functions and describe their modular completions.

  3. Microscopic theory of normal liquid 3He

    International Nuclear Information System (INIS)

    Nafari, N.; Doroudi, A.

    1994-03-01

    We have used the self-consistent scheme proposed by Singwi, Tosi, Land and Sjoelander (STLS) to study the properties of normal liquid 3He. By employing the Aziz potential (HFD-B) and some other realistic pairwise interactions, we have calculated the static structure factor, the pair-correlation function, the zero sound frequencies as a function of wave-vector, and the Landau parameter F_0^s for different densities. Our results show considerable improvement over the Ng-Singwi model potential of a hard core plus an attractive tail. Agreement between our results and the experimental data for the static structure factor and the zero sound frequencies is fairly good. (author). 30 refs, 6 figs, 2 tabs

  4. Teaching queer theory at a Normal School.

    Science.gov (United States)

    Bacon, Jen

    2006-01-01

    This article presents a case study of the ongoing struggle to queer West Chester University at the level of the institution, the curriculum, and the classroom. Part of that struggle includes an effort to establish a policy for free speech that accommodates the values of the institution toward diversity. Another part involves attempts to introduce LGBT Studies into the curriculum, and the resulting debates over whether the curriculum should be "gayer" or "queerer." I discuss the personal struggle to destabilize ready-made categories and encourage non-binary thinking, while honoring the identities we live, and perform, in the classroom. In the last four years, WCU has hired half a dozen out gay or lesbian faculty members, some of whom identify as "queer." In many ways, those faculty members have entered a climate open to new ideas for adding LGBT content to the curriculum and to queering the structure and curriculum of the university. But as faculty, staff, and students engage this cause - along with the broader cause of social justice at the University - we have found that our enemies are often closer than we might have guessed. Detailing the tensions that have characterized the landscape at WCU during my three and a half years there, this essay elaborates on the epistemological and pedagogical issues that arise when queer theory meets LGBT Studies in the process of institutional, curricular, and pedagogical reform. I argue that questions about content and method, inclusion and exclusion, and identity and performance can be answered only with a concerted effort and continued attention to the cultural tendency to re-assert binaries while simultaneously learning from them. What is true of West Chester, I argue, is true of the larger social system where the contested terrain of the queer has implications for the choices we make as both stakeholders and deviants in the systems we chronicle and critique.

  5. Automatic identification and normalization of dosage forms in drug monographs

    Science.gov (United States)

    2012-01-01

    Background: Each day, millions of health consumers seek drug-related information on the Web. Despite some efforts in linking related resources, drug information is largely scattered in a wide variety of websites of different quality and credibility. Methods: As a step toward providing users with integrated access to multiple trustworthy drug resources, we aim to develop a method capable of identifying a drug's dosage form information in addition to drug name recognition. We developed rules and patterns for identifying dosage forms from different sections of full-text drug monographs, and subsequently normalized them to standardized RxNorm dosage forms. Results: Our method represents a significant improvement compared with a baseline lookup approach, achieving overall macro-averaged Precision of 80%, Recall of 98%, and F-Measure of 85%. Conclusions: We successfully developed an automatic approach for drug dosage form identification, which is critical for building links between different drug-related resources. PMID:22336431
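
    As a rough, hypothetical sketch of this kind of rule-and-pattern normalization (the patterns and target strings below are invented for illustration; they are not the paper's rules or actual RxNorm terms):

```python
import re

# Hypothetical pattern -> normalized dosage form mapping (illustrative only).
DOSAGE_FORM_RULES = [
    (re.compile(r"\b(tabs?|tablets?)\b", re.I), "Oral Tablet"),
    (re.compile(r"\b(caps?|capsules?)\b", re.I), "Oral Capsule"),
    (re.compile(r"\b(oral\s+)?(solution|syrup)\b", re.I), "Oral Solution"),
    (re.compile(r"\b(inject(able|ion)?|i\.?v\.?)\b", re.I), "Injectable Solution"),
]

def normalize_dosage_forms(text):
    """Return the set of normalized dosage forms mentioned in a monograph snippet."""
    return {norm for pattern, norm in DOSAGE_FORM_RULES if pattern.search(text)}

print(normalize_dosage_forms("Supplied as 250 mg tablets and as an oral solution."))
# -> e.g. {'Oral Tablet', 'Oral Solution'} (set order may vary)
```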

  6. A New Normal Form for Multidimensional Mode Conversion

    International Nuclear Information System (INIS)

    Tracy, E. R.; Richardson, A. S.; Kaufman, A. N.; Zobin, N.

    2007-01-01

    Linear conversion occurs when two wave types, with distinct polarization and dispersion characteristics, are locally resonant in a nonuniform plasma [1]. In recent work, we have shown how to incorporate a ray-based (WKB) approach to mode conversion in numerical algorithms [2,3]. The method uses the ray geometry in the conversion region to guide the reduction of the full NxN-system of wave equations to a 2x2 coupled pair which can be solved and matched to the incoming and outgoing WKB solutions. The algorithm in [2] assumes the ray geometry is hyperbolic and that, in ray phase space, there is an 'avoided crossing', which is the most common type of conversion. Here, we present a new formulation that can deal with more general types of conversion [4]. This formalism is based upon the fact (first proved in [5]) that it is always possible to put the 2x2 wave equation into a 'normal' form, such that the diagonal elements of the dispersion matrix Poisson-commute with the off-diagonals (at leading order). Therefore, if we use the diagonals (rather than the eigenvalues or the determinant) of the dispersion matrix as ray Hamiltonians, the off-diagonals will be conserved quantities. When cast into normal form, the 2x2 dispersion matrix has a very natural physical interpretation: the diagonals are the uncoupled ray hamiltonians and the off-diagonals are the coupling. We discuss how to incorporate the normal form into ray tracing algorithms
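
    For context (a schematic restatement of the abstract's statement, not a formula quoted from it), the 2x2 reduction can be arranged so that the diagonal entries of the dispersion matrix serve as the uncoupled ray Hamiltonians and the off-diagonal entry is the coupling, with the diagonals Poisson-commuting with the off-diagonals at leading order:

$$
D(q,p) \simeq \begin{pmatrix} D_a(q,p) & \eta \\ \eta^{*} & D_b(q,p) \end{pmatrix},
\qquad \{D_a,\eta\} \approx 0, \quad \{D_b,\eta\} \approx 0 ,
$$

    so that, when D_a and D_b are used as ray Hamiltonians, the coupling η is (approximately) conserved along the uncoupled rays.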

  7. International conference "Galois Theory and Modular Forms"

    CERN Document Server

    Miyake, Katsuya; Nakamura, Hiroaki; Galois Theory and Modular Forms

    2004-01-01

    This volume is an outgrowth of the research project "The Inverse Galois Problem and its Application to Number Theory" which was carried out in three academic years from 1999 to 2001 with the support of the Grant-in-Aid for Scientific Research (B) (1) No. 11440013. In September, 2001, an international conference "Galois Theory and Modular Forms" was held at Tokyo Metropolitan University after some preparatory workshops and symposia in previous years. The title of this book came from that of the conference, and the authors were participants of those meetings. All of the articles here were critically refereed by experts. Some of these articles give well prepared surveys on branches of research areas, and many articles aim to bear the latest research results accompanied with carefully written expository introductions. When we started our research project, we picked up three areas to investigate under the key word "Galois groups"; namely, "generic polynomials" to be applied to number theory, "Galois co...

  8. Alternative Forms of Fit in Contingency Theory.

    Science.gov (United States)

    Drazin, Robert; Van de Ven, Andrew H.

    1985-01-01

    This paper examines the selection, interaction, and systems approaches to fit in structural contingency theory. The concepts of fit evaluated may be applied not only to structural contingency theory but to contingency theories in general. (MD)

  9. SYNTHESIS METHODS OF ALGEBRAIC NORMAL FORM OF MANY-VALUED LOGIC FUNCTIONS

    Directory of Open Access Journals (Sweden)

    A. V. Sokolov

    2016-01-01

    Full Text Available The rapid development of error-correcting coding, cryptography, and signal synthesis methods based on the principles of many-valued logic calls for a more detailed study of the forms in which functions of many-valued logic can be represented. In particular, the algebraic normal form of Boolean functions, also known as the Zhegalkin polynomial, is widely used because it captures many of the cryptographic properties of Boolean functions. In this article, we formalize the notion of algebraic normal form for many-valued logic functions. We develop a fast method for synthesizing the algebraic normal form of 3-functions and 5-functions that works similarly to the Reed-Muller transform for Boolean functions, on the basis of recurrently synthesized transform matrices. We propose a hypothesis that determines the rules for synthesizing these matrices for the transformation from the truth table to the coefficients of the algebraic normal form, and for the inverse transform, for any given number of variables of 3-functions or 5-functions. The article also introduces a definition of the algebraic degree of nonlinearity of many-valued logic functions and of S-boxes based on the principles of many-valued logic. The methods for synthesizing the algebraic normal form of 3-functions are then applied to the known construction of recurrent synthesis of S-boxes of length N = 3^k, and their algebraic degrees of nonlinearity are computed. The results could be the basis for further theoretical research and practical applications such as the development of new cryptographic primitives, error-correcting codes, data compression algorithms, signal structures, and block and stream encryption algorithms, all based on the promising principles of many-valued logic. In addition, the fast method for synthesizing the algebraic normal form of many-valued logic functions is the basis for its software and hardware implementation.
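
    As an illustrative companion (my own sketch, not the transform matrices from the article): over GF(3) the coefficients of the unique polynomial with per-variable degree at most 2 can be obtained by applying a fixed 3x3 matrix (the inverse of the single-variable Vandermonde matrix mod 3) along each variable axis of the truth table, in direct analogy with the Reed-Muller butterfly for Boolean functions:

```python
import numpy as np

# Inverse Vandermonde mod 3 for the monomial basis {1, x, x^2}:
# maps the value vector (f(0), f(1), f(2)) to coefficients (c0, c1, c2) mod 3.
M = np.array([[1, 0, 0],
              [0, 2, 1],
              [2, 2, 2]])

def ternary_anf(truth):
    """Truth table of an n-variable GF(3) function, shape (3,)*n -> ANF coefficients."""
    c = np.asarray(truth) % 3
    for axis in range(c.ndim):
        c = np.moveaxis(c, axis, 0)
        c = np.tensordot(M, c, axes=(1, 0)) % 3   # apply M along this variable
        c = np.moveaxis(c, 0, axis)
    return c

# Example: f(x, y) = x*y mod 3; the only nonzero coefficient should sit at index (1, 1).
x, y = np.meshgrid(range(3), range(3), indexing="ij")
print(ternary_anf((x * y) % 3))
```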

  10. Nevanlinna theory, normal families, and algebraic differential equations

    CERN Document Server

    Steinmetz, Norbert

    2017-01-01

    This book offers a modern introduction to Nevanlinna theory and its intricate relation to the theory of normal families, algebraic functions, asymptotic series, and algebraic differential equations. Following a comprehensive treatment of Nevanlinna’s theory of value distribution, the author presents advances made since Hayman’s work on the value distribution of differential polynomials and illustrates how value- and pair-sharing problems are linked to algebraic curves and Briot–Bouquet differential equations. In addition to discussing classical applications of Nevanlinna theory, the book outlines state-of-the-art research, such as the effect of the Yosida and Zalcman–Pang method of re-scaling to algebraic differential equations, and presents the Painlevé–Yosida theorem, which relates Painlevé transcendents and solutions to selected 2D Hamiltonian systems to certain Yosida classes of meromorphic functions. Aimed at graduate students interested in recent developments in the field and researchers wor...

  11. Reconstruction of normal forms by learning informed observation geometries from data.

    Science.gov (United States)

    Yair, Or; Talmon, Ronen; Coifman, Ronald R; Kevrekidis, Ioannis G

    2017-09-19

    The discovery of physical laws consistent with empirical observations is at the heart of (applied) science and engineering. These laws typically take the form of nonlinear differential equations depending on parameters; dynamical systems theory provides, through the appropriate normal forms, an "intrinsic" prototypical characterization of the types of dynamical regimes accessible to a given model. Using an implementation of data-informed geometry learning, we directly reconstruct the relevant "normal forms": a quantitative mapping from empirical observations to prototypical realizations of the underlying dynamics. Interestingly, the state variables and the parameters of these realizations are inferred from the empirical observations; without prior knowledge or understanding, they parametrize the dynamics intrinsically without explicit reference to fundamental physical quantities.

  12. Destination memory and cognitive theory of mind in normal ageing.

    Science.gov (United States)

    El Haj, Mohamad; Raffard, Stéphane; Gély-Nargeot, Marie-Christine

    2016-01-01

    Destination memory is the ability to remember the destination to which a piece of information has been addressed (e.g., "Did I tell you about the promotion?"). This ability is found to be impaired in normal ageing. Our work aimed to link this deterioration to the decline in theory of mind. Forty younger adults (M age = 23.13 years, SD = 4.00) and 36 older adults (M age = 69.53 years, SD = 8.93) performed a destination memory task. They also performed the False-belief test addressing cognitive theory of mind and the Reading the mind in the eyes test addressing affective theory of mind. Results showed significant deterioration in destination memory, cognitive theory of mind and affective theory of mind in the older adults. The older adults' performance on destination memory was significantly correlated with and predicted by their performance on cognitive theory of mind. Difficulties in the ability to interpret and predict others' mental states are related to destination memory decline in older adults.

  13. Theory of Single Point Incremental Forming

    DEFF Research Database (Denmark)

    Martins, P.A.F.; Bay, Niels; Skjødt, Martin

    2008-01-01

    This paper presents a closed-form theoretical analysis modelling the fundamentals of single point incremental forming and explaining the experimental and numerical results available in the literature for the past couple of years. The model is based on membrane analysis with bi-directional in-plane contact friction and is focused on the extreme modes of deformation that are likely to be found in single point incremental forming processes. The overall investigation is supported by experimental work performed by the authors and data retrieved from the literature.

  14. Problems in the theory of modular forms

    CERN Document Server

    Murty, M Ram; Graves, Hester

    2016-01-01

    This book introduces the reader to the fascinating world of modular forms through a problem-solving approach. As such, besides researchers, the book can be used by undergraduate and graduate students for self-instruction. The topics covered include q-series, the modular group, the upper half-plane, modular forms of level one and higher level, the Ramanujan τ-function, the Petersson inner product, Hecke operators, Dirichlet series attached to modular forms and further special topics. It can be viewed as a gentle introduction for a deeper study of the subject. Thus, it is ideal for non-experts seeking an entry into the field.

  15. Development of a theory of implementation and integration: Normalization Process Theory

    Directory of Open Access Journals (Sweden)

    May Carl R

    2009-05-01

    Full Text Available Abstract Background Theories are important tools in the social and natural sciences. The methods by which they are derived are rarely described and discussed. Normalization Process Theory explains how new technologies, ways of acting, and ways of working become routinely embedded in everyday practice, and has applications in the study of implementation processes. This paper describes the process by which it was built. Methods Between 1998 and 2008, we developed a theory. We derived a set of empirical generalizations from analysis of data collected in qualitative studies of healthcare work and organization. We developed an applied theoretical model through analysis of empirical generalizations. Finally, we built a formal theory through a process of extension and implication analysis of the applied theoretical model. Results Each phase of theory development showed that the constructs of the theory did not conflict with each other, had explanatory power, and possessed sufficient robustness for formal testing. As the theory developed, its scope expanded from a set of observed regularities in data with procedural explanations, to an applied theoretical model, to a formal middle-range theory. Conclusion Normalization Process Theory has been developed through procedures that were properly sceptical and critical, and which were open to review at each stage of development. The theory has been shown to merit formal testing.

  16. Differential and integral forms in supergauge theories and supergravity

    International Nuclear Information System (INIS)

    Zupnik, B.M.; Pak, D.G.

    1989-01-01

    D = 3, 4, N = 1 supergauge theories and D = 3, N = 1 supergravity are considered in the superfield formalism by using differential and integral forms. A special map of the space of differential forms into the space of integral forms is proposed. By means of this map we find the superfield Chern-Simons terms in D = 3, N = 1 Yang-Mills theory and supergravity. The integral forms corresponding to superfield invariants of D = 4, N = 1 supergauge theory have also been constructed. (Author)

  17. Normalization Of Thermal-Radiation Form-Factor Matrix

    Science.gov (United States)

    Tsuyuki, Glenn T.

    1994-01-01

    Report describes algorithm that adjusts form-factor matrix in TRASYS computer program, which calculates intraspacecraft radiative interchange among various surfaces and environmental heat loading from sources such as sun.
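
    The report documents the TRASYS adjustment algorithm itself; as a much-simplified sketch of what normalizing a form-factor matrix means, one can at least enforce the enclosure (summation) rule that every row of view factors sums to one. The routine and the numbers below are illustrative assumptions only and do not reproduce the TRASYS procedure, which also handles reciprocity between surfaces.

      import numpy as np

      def normalize_form_factors(F):
          """Scale each row of a form-factor matrix so that sum_j F[i, j] = 1."""
          F = np.asarray(F, dtype=float)
          return F / F.sum(axis=1, keepdims=True)

      # Slightly non-closed view factors for a three-surface enclosure (made-up values).
      F = np.array([[0.00, 0.48, 0.50],
                    [0.24, 0.00, 0.74],
                    [0.25, 0.74, 0.00]])
      print(normalize_form_factors(F).sum(axis=1))   # -> [1. 1. 1.]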

  18. Cold rolling precision forming of shaft parts theory and technologies

    CERN Document Server

    Song, Jianli; Li, Yongtang

    2017-01-01

    This book presents in detail the theory, processes and equipment involved in cold rolling precision forming technologies, focusing on spline and thread shaft parts. The main topics discussed include the status quo of research on cold rolling precision forming technologies; the design and calculation of process parameters; the numerical simulation of cold rolling forming processes; and the equipment used in cold rolling forming. The mechanism of cold rolling forming is extremely complex, and research on the processes, theory and mechanical analysis of spline cold rolling forming has remained very limited to date. In practice, the forming processes and production methods used are mainly chosen on the basis of individual experience. As such, there is a marked lack of both systematic, theory-based guidelines, and of specialized books covering theoretical analysis, numerical simulation, experiments and equipment used in spline cold rolling forming processes – all key points that are included in this book and ill...

  19. Mean fields and self consistent normal ordering of lattice spin and gauge field theories

    International Nuclear Information System (INIS)

    Ruehl, W.

    1986-01-01

    Classical Heisenberg spin models on lattices possess mean field theories that are well defined real field theories on finite lattices. These mean field theories can be self consistently normal ordered. This leads to a considerable improvement over standard mean field theory. This concept is carried over to lattice gauge theories. We construct first an appropriate real mean field theory. The equations determining the Gaussian kernel necessary for self-consistent normal ordering of this mean field theory are derived. (orig.)

  20. Self-consistent theory of normal-to-superconducting transition

    International Nuclear Information System (INIS)

    Radzihovsky, L.; Chicago Univ., IL

    1995-01-01

    I study the normal-to-superconducting (NS) transition within the Ginzburg-Landau (GL) model, taking into account the fluctuations in the m-component complex order parameter ψ_α and the vector potential A in arbitrary dimension d, for any m. I find that the transition is of second order and that the previous conclusion of the fluctuation-driven first-order transition is a possible artifact of the breakdown of the ε-expansion and the inaccuracy of the 1/m-expansion for the physical values ε = 1, m = 1. I compute the anomalous η(d, m) exponent at the NS transition, and find η(3, 1) ∼ -0.38. In the m → ∞ limit, η(d, m) becomes exact and agrees with the 1/m-expansion. Near d = 4 the theory is also in good agreement with the perturbative ε-expansion results for m > 183 and provides a sensible interpolation formula for arbitrary d and m. (orig.)

  1. A Mathematical Framework for Critical Transitions: Normal Forms, Variance and Applications

    Science.gov (United States)

    Kuehn, Christian

    2013-06-01

    Critical transitions occur in a wide variety of applications including mathematical biology, climate change, human physiology and economics. Therefore it is highly desirable to find early-warning signs. We show that it is possible to classify critical transitions by using bifurcation theory and normal forms in the singular limit. Based on this elementary classification, we analyze stochastic fluctuations and calculate scaling laws of the variance of stochastic sample paths near critical transitions for fast-subsystem bifurcations up to codimension two. The theory is applied to several models: the Stommel-Cessi box model for the thermohaline circulation from geoscience, an epidemic-spreading model on an adaptive network, an activator-inhibitor switch from systems biology, a predator-prey system from ecology and to the Euler buckling problem from classical mechanics. For the Stommel-Cessi model we compare different detrending techniques to calculate early-warning signs. In the epidemics model we show that link densities could be better variables for prediction than population densities. The activator-inhibitor switch demonstrates effects in three time-scale systems and points out that excitable cells and molecular units have information for subthreshold prediction. In the predator-prey model explosive population growth near a codimension-two bifurcation is investigated and we show that early-warnings from normal forms can be misleading in this context. In the biomechanical model we demonstrate that early-warning signs for buckling depend crucially on the control strategy near the instability which illustrates the effect of multiplicative noise.
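
    As a minimal numerical illustration of the variance-based early-warning sign discussed above (not of the Stommel-Cessi or any other specific model), the Python sketch below integrates the stochastic fold (saddle-node) normal form with a slowly drifting parameter; the variance of fluctuations about the stable branch grows as the bifurcation is approached. All parameter values are arbitrary.

      import numpy as np

      rng = np.random.default_rng(0)

      # Fold normal form with slow parameter drift toward the bifurcation at mu = 0:
      #   dx = (mu(t) - x**2) dt + sigma dW
      dt, sigma, T = 1e-3, 0.02, 80.0
      n = int(T / dt)
      mu = np.linspace(1.0, 0.05, n)
      x = np.empty(n)
      x[0] = np.sqrt(mu[0])                    # start on the stable branch x = sqrt(mu)
      for k in range(n - 1):
          x[k + 1] = x[k] + (mu[k] - x[k] ** 2) * dt \
                     + sigma * np.sqrt(dt) * rng.standard_normal()

      # Early-warning sign: variance of fluctuations about the drifting equilibrium
      # increases as the fold is approached (critical slowing down).
      resid = x - np.sqrt(mu)
      window = n // 10
      print("variance early:", resid[:window].var())
      print("variance late: ", resid[-window:].var())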

  2. Calculation of TC in a normal-superconductor bilayer using the microscopic-based Usadel theory

    International Nuclear Information System (INIS)

    Martinis, John M.; Hilton, G.C.; Irwin, K.D.; Wollman, D.A.

    2000-01-01

    The Usadel equations give a theory of superconductivity, valid in the diffusive limit, that is a generalization of the microscopic equations of the BCS theory. Because the theory is expressed in a tractable and physical form, even experimentalists can analytically and numerically calculate detailed properties of superconductors in physically relevant geometries. Here, we describe the Usadel equations and review their solution in the case of predicting the transition temperature T_C of a thin normal-superconductor bilayer. We also extend this calculation for thicker bilayers to show the dependence on the resistivity of the films. These results, which show a dependence on both the interface resistance and heat capacity of the films, provide important guidance on fabricating bilayers with reproducible transition temperatures

  3. Conformal Field Theory, Automorphic Forms and Related Topics

    CERN Document Server

    Weissauer, Rainer; CFT 2011

    2014-01-01

    This book, part of the series Contributions in Mathematical and Computational Sciences, reviews recent developments in the theory of vertex operator algebras (VOAs) and their applications to mathematics and physics.   The mathematical theory of VOAs originated from the famous monstrous moonshine conjectures of J.H. Conway and S.P. Norton, which predicted a deep relationship between the characters of the largest simple finite sporadic group, the Monster, and the theory of modular forms inspired by the observations of J. McKay and J. Thompson.   The contributions are based on lectures delivered at the 2011 conference on Conformal Field Theory, Automorphic Forms and Related Topics, organized by the editors as part of a special program offered at Heidelberg University that summer under the sponsorship of the MAThematics Center Heidelberg (MATCH).

  4. Diagonalization and Jordan Normal Form--Motivation through "Maple"[R]

    Science.gov (United States)

    Glaister, P.

    2009-01-01

    Following an introduction to the diagonalization of matrices, one of the more difficult topics for students to grasp in linear algebra is the concept of Jordan normal form. In this note, we show how the important notions of diagonalization and Jordan normal form can be introduced and developed through the use of the computer algebra package…
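
    The note itself works with Maple; as an analogous illustration in an open-source setting (an assumption of this sketch, not part of the article), SymPy computes the Jordan normal form of a defective matrix directly.

      from sympy import Matrix

      # A has the double eigenvalue 3 but only one independent eigenvector,
      # so it is not diagonalizable and its Jordan form is a single 2x2 block.
      A = Matrix([[4, 1],
                  [-1, 2]])
      P, J = A.jordan_form()       # returns P, J with A = P*J*P**-1
      print(J)                     # Matrix([[3, 1], [0, 3]])
      print(P * J * P.inv())       # reproduces A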

  5. On the relationship between LTL normal forms and Büchi automata

    DEFF Research Database (Denmark)

    Li, Jianwen; Pu, Geguang; Zhang, Lijun

    2013-01-01

    In this paper, we revisit the problem of translating LTL formulas to Büchi automata. We first translate the given LTL formula into a special disjunctive normal form (DNF). The formula will be part of the state, and its DNF specifies the atomic properties that should hold immediately

  6. Normal forms of invariant vector fields under a finite group action

    International Nuclear Information System (INIS)

    Sanchez Bringas, F.

    1992-07-01

    Let Γ be a finite subgroup of GL(n,C). This subgroup acts on the space of germs of holomorphic vector fields vanishing at the origin in C^n. We prove a theorem of invariant conjugation to a normal form and linearization for the subspace of invariant elements and we give a description of these normal forms in dimension n=2. (author)

  7. The Distribution of the Product Explains Normal Theory Mediation Confidence Interval Estimation.

    Science.gov (United States)

    Kisbu-Sakarya, Yasemin; MacKinnon, David P; Miočević, Milica

    2014-05-01

    The distribution of the product has several useful applications. One of these applications is its use to form confidence intervals for the indirect effect as the product of 2 regression coefficients. The purpose of this article is to investigate how the moments of the distribution of the product explain normal theory mediation confidence interval coverage and imbalance. Values of the critical ratio for each random variable are used to demonstrate how the moments of the distribution of the product change across values of the critical ratio observed in research studies. Results of the simulation study showed that as skewness in absolute value increases, coverage decreases. And as skewness in absolute value and kurtosis increase, imbalance increases. The difference between testing the significance of the indirect effect using the normal theory versus the asymmetric distribution of the product is further illustrated with a real data example. This article is the first study to show the direct link between the distribution of the product and indirect effect confidence intervals and clarifies the results of previous simulation studies by showing why normal theory confidence intervals for indirect effects are often less accurate than those obtained from the asymmetric distribution of the product or from resampling methods.
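
    A quick way to see the asymmetry that drives these coverage results is to compare the symmetric normal-theory (Sobel) interval with percentiles of the simulated distribution of the product. The coefficient estimates and standard errors below are hypothetical values chosen only for illustration.

      import numpy as np

      rng = np.random.default_rng(1)

      a, se_a = 0.40, 0.15          # path X -> M (hypothetical estimate and SE)
      b, se_b = 0.30, 0.12          # path M -> Y (hypothetical estimate and SE)

      # Normal-theory (Sobel) interval: symmetric around a*b by construction.
      se_ab = np.sqrt(a**2 * se_b**2 + b**2 * se_a**2)
      sobel = (a * b - 1.96 * se_ab, a * b + 1.96 * se_ab)

      # Monte Carlo approximation to the distribution of the product: asymmetric,
      # reflecting the skewness and kurtosis discussed in the abstract.
      draws = rng.normal(a, se_a, 100_000) * rng.normal(b, se_b, 100_000)
      product_ci = tuple(np.percentile(draws, [2.5, 97.5]))

      print("normal-theory CI:       ", sobel)
      print("distribution-of-product:", product_ci)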

  8. Normal forms for Poisson maps and symplectic groupoids around Poisson transversals.

    Science.gov (United States)

    Frejlich, Pedro; Mărcuț, Ioan

    2018-01-01

    Poisson transversals are submanifolds in a Poisson manifold which intersect all symplectic leaves transversally and symplectically. In this communication, we prove a normal form theorem for Poisson maps around Poisson transversals. A Poisson map pulls a Poisson transversal back to a Poisson transversal, and our first main result states that simultaneous normal forms exist around such transversals, for which the Poisson map becomes transversally linear, and intertwines the normal form data of the transversals. Our second result concerns symplectic integrations. We prove that a neighborhood of a Poisson transversal is integrable exactly when the Poisson transversal itself is integrable, and in that case we prove a normal form theorem for the symplectic groupoid around its restriction to the Poisson transversal, which puts all structure maps in normal form. We conclude by illustrating our results with examples arising from Lie algebras.

  9. Hadronic Form Factors in Asymptotically Free Field Theories

    Science.gov (United States)

    Gross, D. J.; Treiman, S. B.

    1974-01-01

    The breakdown of Bjorken scaling in asymptotically free gauge theories of the strong interactions is explored for its implications on the large q² behavior of nucleon form factors. Duality arguments of Bloom and Gilman suggest a connection between the form factors and the threshold properties of the deep inelastic structure functions. The latter are addressed directly in an analysis of asymptotically free theories; and through the duality connection we are then led to statements about the form factors. For very large q² the form factors are predicted to fall faster than any inverse power of q². For the more modest range of q² reached in existing experiments the agreement with data is fairly good, though this may well be fortuitous. Extrapolations beyond this range are presented.

  10. International Conference on Automorphic Forms and Number Theory

    CERN Document Server

    Al-Baali, Mehiddin; Ibukiyama, Tomoyoshi; Rupp, Florian

    2014-01-01

    This edited volume presents a collection of carefully refereed articles covering the latest advances in Automorphic Forms and Number Theory, that were primarily developed from presentations given at the 2012 “International Conference on Automorphic Forms and Number Theory,” held in Muscat, Sultanate of Oman. The present volume includes original research as well as some surveys and outlines of research altogether providing a contemporary snapshot on the latest activities in the field and covering the topics of: Borcherds products Congruences and Codes Jacobi forms Siegel and Hermitian modular forms Special values of L-series Recently, the Sultanate of Oman became a member of the International Mathematical Society. In view of this development, the conference provided the platform for scientific exchange and collaboration between scientists of different countries from all over the world. In particular, an opportunity was established for a close exchange between scientists and students of Germany, Oman, and J...

  11. Hyperon decay form factors in chiral perturbation theory

    International Nuclear Information System (INIS)

    Lacour, Andre; Kubis, Bastian; Meissner, Ulf-G.

    2007-01-01

    We present a complete calculation of the SU(3)-breaking corrections to the hyperon vector form factors up to O(p^4) in covariant baryon chiral perturbation theory. Partial higher-order contributions are obtained, and we discuss chiral extrapolations of the vector form factor at zero momentum transfer. In addition we derive low-energy theorems for the subleading moments in hyperon decays, the weak Dirac radii and the weak anomalous magnetic moments, up to O(p^4)

  12. Free Abelian 2-form gauge theory: BRST approach

    International Nuclear Information System (INIS)

    Malik, R.P.

    2008-01-01

    We discuss various symmetry properties of the Lagrangian density of a four- (3+1)-dimensional (4D) free Abelian 2-form gauge theory within the framework of Becchi-Rouet-Stora-Tyutin (BRST) formalism. The present free Abelian gauge theory is endowed with a Curci-Ferrari type condition, which happens to be a key signature of the 4D non-Abelian 1-form gauge theory. In fact, it is due to the above condition that the nilpotent BRST and anti-BRST symmetries of our present theory are found to be absolutely anticommuting in nature. For the present 2-form theory, we discuss the BRST, anti-BRST, ghost and discrete symmetry properties of the Lagrangian densities and derive the corresponding conserved charges. The algebraic structure, obeyed by the above conserved charges, is deduced and the constraint analysis is performed with the help of physicality criteria, where the conserved and nilpotent (anti-)BRST charges play completely independent roles. These physicality conditions lead to the derivation of the above Curci-Ferrari type restriction, within the framework of the BRST formalism, from the constraint analysis. (orig.)

  13. The use of exterior forms in Einstein's gravitation theory

    International Nuclear Information System (INIS)

    Thirring, W.; Wallner, R.

    1978-01-01

    Cartan's calculus is used to reformulate the general variational principle and conservation laws in terms of exterior forms. In applying this method to Einstein's gravitation theory, we do not only benefit from the great economy of Cartan's formalism but also gain a deeper understanding of fundamental results already known. So the existence of superpotential-forms may be deduced from d∘d ≡ 0, and as a consequence the vanishing of total energy and momentum in a closed universe is affirmed in a more general way. Simple expressions for the sundry superpotentials are obtained quite naturally. As a byproduct, Einstein's equations are rewritten in a form where the coderivative of a 2-form (the superpotential-form) is a current, and therefore resembles the inhomogeneous Maxwell equations. In passing from the Lagrangian to the Hamiltonian 4-form, the ADM formalism is immediately entered without lengthy calculations
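
    Schematically, and in generic notation rather than the paper's, the mechanism is the one familiar from Maxwell theory: if the dual current 3-form is exact, with a superpotential 2-form U, then its conservation is an identity rather than an additional condition,

      {}^{*}J = \mathrm{d}U
      \quad\Longrightarrow\quad
      \mathrm{d}\,{}^{*}J = \mathrm{d}\mathrm{d}U \equiv 0 .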

  14. Late forming dark matter in theories of neutrino dark energy

    International Nuclear Information System (INIS)

    Das, Subinoy; Weiner, Neal

    2011-01-01

    We study the possibility of late forming dark matter, where a scalar field, previously trapped in a metastable state by thermal or finite density effects, goes through a phase transition near the era of matter-radiation equality and begins to oscillate about its true minimum. Such a theory is motivated generally if the dark energy is of a similar form, but has not yet made the transition to dark matter, and, in particular, arises automatically in recently considered theories of neutrino dark energy. If such a field comprises the present dark matter, the matter power spectrum typically shows a sharp break at small, presently nonlinear scales, below which power is highly suppressed and which previously contained acoustic oscillations. If, instead, such a field forms a subdominant component of the total dark matter, such acoustic oscillations may imprint themselves in the linear regime.

  15. Application of normal form methods to the analysis of resonances in particle accelerators

    International Nuclear Information System (INIS)

    Davies, W.G.

    1992-01-01

    The transformation to normal form in a Lie-algebraic framework provides a very powerful method for identifying and analysing non-linear behaviour and resonances in particle accelerators. The basic ideas are presented and illustrated. (author). 4 refs

  16. On some hypersurfaces with time like normal bundle in pseudo Riemannian space forms

    International Nuclear Information System (INIS)

    Kashani, S.M.B.

    1995-12-01

    In this work we classify immersed hypersurfaces with constant sectional curvature in pseudo Riemannian space forms if the normal bundle is time like and the mean curvature is constant. (author). 9 refs

  17. Planar undulator motion excited by a fixed traveling wave. Quasiperiodic averaging normal forms and the FEL pendulum

    Energy Technology Data Exchange (ETDEWEB)

    Ellison, James A.; Heinemann, Klaus [New Mexico Univ., Albuquerque, NM (United States). Dept. of Mathematics and Statistics; Vogt, Mathias [Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany); Gooden, Matthew [North Carolina State Univ., Raleigh, NC (United States). Dept. of Physics

    2013-03-15

    We present a mathematical analysis of planar motion of energetic electrons moving through a planar dipole undulator, excited by a fixed planar polarized plane wave Maxwell field in the X-Ray FEL regime. Our starting point is the 6D Lorentz system, which allows planar motions, and we examine this dynamical system as the wave length λ of the traveling wave varies. By scalings and transformations the 6D system is reduced, without approximation, to a 2D system in a form for a rigorous asymptotic analysis using the Method of Averaging (MoA), a long time perturbation theory. The two dependent variables are a scaled energy deviation and a generalization of the so-called ponderomotive phase. As λ varies the system passes through resonant and nonresonant (NR) zones and we develop NR and near-to-resonant (NtoR) MoA normal form approximations. The NtoR normal forms contain a parameter which measures the distance from a resonance. For a special initial condition, for the planar motion and on resonance, the NtoR normal form reduces to the well known FEL pendulum system. We then state and prove NR and NtoR first-order averaging theorems which give explicit error bounds for the normal form approximations. We prove the theorems in great detail, giving the interested reader a tutorial on mathematically rigorous perturbation theory in a context where the proofs are easily understood. The proofs are novel in that they do not use a near identity transformation and they use a system of differential inequalities. The NR case is an example of quasiperiodic averaging where the small divisor problem enters in the simplest possible way. To our knowledge the planar problem has not been analyzed with the generality we aspire to here nor has the standard FEL pendulum system been derived with associated error bounds as we do here. We briefly discuss the low gain theory in light of our NtoR normal form. Our mathematical treatment of the noncollective FEL beam dynamics problem in

  18. Planar undulator motion excited by a fixed traveling wave. Quasiperiodic averaging normal forms and the FEL pendulum

    International Nuclear Information System (INIS)

    Ellison, James A.; Heinemann, Klaus; Gooden, Matthew

    2013-03-01

    We present a mathematical analysis of planar motion of energetic electrons moving through a planar dipole undulator, excited by a fixed planar polarized plane wave Maxwell field in the X-Ray FEL regime. Our starting point is the 6D Lorentz system, which allows planar motions, and we examine this dynamical system as the wave length λ of the traveling wave varies. By scalings and transformations the 6D system is reduced, without approximation, to a 2D system in a form for a rigorous asymptotic analysis using the Method of Averaging (MoA), a long time perturbation theory. The two dependent variables are a scaled energy deviation and a generalization of the so-called ponderomotive phase. As λ varies the system passes through resonant and nonresonant (NR) zones and we develop NR and near-to-resonant (NtoR) MoA normal form approximations. The NtoR normal forms contain a parameter which measures the distance from a resonance. For a special initial condition, for the planar motion and on resonance, the NtoR normal form reduces to the well known FEL pendulum system. We then state and prove NR and NtoR first-order averaging theorems which give explicit error bounds for the normal form approximations. We prove the theorems in great detail, giving the interested reader a tutorial on mathematically rigorous perturbation theory in a context where the proofs are easily understood. The proofs are novel in that they do not use a near identity transformation and they use a system of differential inequalities. The NR case is an example of quasiperiodic averaging where the small divisor problem enters in the simplest possible way. To our knowledge the planar problem has not been analyzed with the generality we aspire to here nor has the standard FEL pendulum system been derived with associated error bounds as we do here. We briefly discuss the low gain theory in light of our NtoR normal form. Our mathematical treatment of the noncollective FEL beam dynamics problem in the

  19. Intentionality forms the matrix of healing: a theory.

    Science.gov (United States)

    Zahourek, Rothlyn P

    2004-01-01

    The understanding of intentionality in a healing context has been incomplete and confusing. Attempts have been made to describe it as a concrete mental force in healing while healing has been accepted as a nonlocal phenomenon. This paper reviews several definitions and theoretical frameworks of intentionality. It proposes a new integrative theory of intentionality, Intentionality: the Matrix of Healing. The theory proposes definitions, forms, and phases of intentionality, a process of development and mediators that sculpt intentionality in healing. The theory has implications for conceptualizing intentionality and provides a framework for continued exploration of the nature of intentionality in healing for scholars as well as clinicians. This study was done as a Doctoral dissertation at New York University, School of Education, Division of Nursing.

  20. Prospect theory: A parametric analysis of functional forms in Brazil

    Directory of Open Access Journals (Sweden)

    Robert Eugene Lobel

    2017-10-01

    Full Text Available This study aims to analyze risk preferences in Brazil based on prospect theory by estimating the risk-aversion parameter of expected utility theory (EUT) for a select sample, as well as the value- and probability-function parameters under various functional forms, including a newly proposed value function, the modified log. This is the first such study in Brazil, and the parameter results are slightly different from studies in other countries, indicating that subjects are more risk averse and exhibit a smaller loss aversion. Probability distortion is the only common factor. As expected, the study finds that behavioral models are superior to EUT, and models based on prospect theory, the TK and Prelec weighting functions, and the value power function show superior performance to others. Finally, the modified log function proposed in the study fits the data well, and can thus be used for future studies in Brazil.

  1. Lorentz violating p-form gauge theories in superspace

    Energy Technology Data Exchange (ETDEWEB)

    Upadhyay, Sudhaker [Indian Institute of Technology Kharagpur, Centre for Theoretical Studies, Kharagpur (India); Shah, Mushtaq B.; Ganai, Prince A. [National Institute of Technology, Department of Physics, Srinagar, Kashmir (India)

    2017-03-15

    Very special relativity (VSR) keeps the main features of special relativity but breaks rotational invariance due to an intrinsic preferred direction. We study the VSR-modified extended BRST and anti-BRST symmetry of the Batalin-Vilkovisky (BV) actions corresponding to the p = 1, 2, 3-form gauge theories. Within the VSR framework, we discuss the extended BRST invariant and extended BRST and anti-BRST invariant superspace formulations for these BV actions. Here we observe that the VSR-modified extended BRST invariant BV actions corresponding to the p = 1, 2, 3-form gauge theories can be written in a manifestly covariant manner in a superspace with one Grassmann coordinate. Moreover, two Grassmann coordinates are required to describe the VSR-modified extended BRST and extended anti-BRST invariant BV actions in a superspace. These results are consistent with the Lorentz-invariant (special relativity) formulation. (orig.)

  2. Normal Forms for Retarded Functional Differential Equations and Applications to Bogdanov-Takens Singularity

    Science.gov (United States)

    Faria, T.; Magalhaes, L. T.

    The paper addresses, for retarded functional differential equations (FDEs), the computation of normal forms associated with the flow on a finite-dimensional invariant manifold tangent to invariant spaces for the infinitesimal generator of the linearized equation at a singularity. A phase space appropriate to the computation of these normal forms is introduced, and adequate nonresonance conditions for the computation of the normal forms are derived. As an application, the general situation of Bogdanov-Takens singularity and its versal unfolding for scalar retarded FDEs with nondegeneracy at second order is considered, both in the general case and in the case of differential-delay equations of the form ẋ(t) = f(x(t), x(t-1)).
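
    For orientation, the planar Bogdanov-Takens normal form with its versal unfolding, to which the flow on the two-dimensional center manifold is reduced, reads in one common convention (e.g. that of Guckenheimer and Holmes; the notation here is generic, not taken from the paper):

      \dot{x} = y, \qquad
      \dot{y} = \beta_1 + \beta_2\, y + x^2 + s\, x y, \qquad s = \pm 1,

    where β_1, β_2 are the unfolding parameters and the nondegeneracy at second order guarantees that the quadratic coefficients do not vanish.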

  3. Quantifying Normal Craniofacial Form and Baseline Craniofacial Asymmetry in the Pediatric Population.

    Science.gov (United States)

    Cho, Min-Jeong; Hallac, Rami R; Ramesh, Jananie; Seaward, James R; Hermann, Nuno V; Darvann, Tron A; Lipira, Angelo; Kane, Alex A

    2018-03-01

    Restoring craniofacial symmetry is an important objective in the treatment of many craniofacial conditions. Normal form has been measured using anthropometry, cephalometry, and photography, yet all of these modalities have drawbacks. In this study, the authors define normal pediatric craniofacial form and craniofacial asymmetry using stereophotogrammetric images, which capture a densely sampled set of points on the form. After institutional review board approval, normal, healthy children (n = 533) with no known craniofacial abnormalities were recruited at well-child visits to undergo full head stereophotogrammetric imaging. The children's ages ranged from 0 to 18 years. A symmetric three-dimensional template was registered and scaled to each individual scan using 25 manually placed landmarks. The template was deformed to each subject's three-dimensional scan using a thin-plate spline algorithm and closest point matching. Age-based normal facial models were derived. Mean facial asymmetry and statistical characteristics of the population were calculated. The mean head asymmetry across all pediatric subjects was 1.5 ± 0.5 mm (range, 0.46 to 4.78 mm), and the mean facial asymmetry was 1.2 ± 0.6 mm (range, 0.4 to 5.4 mm). There were no significant differences in the mean head or facial asymmetry with age, sex, or race. Understanding the "normal" form and baseline distribution of asymmetry is an important anthropomorphic foundation. The authors present a method to quantify normal craniofacial form and baseline asymmetry in a large pediatric sample. The authors found that the normal pediatric craniofacial form is asymmetric, and does not change in magnitude with age, sex, or race.

  4. Center manifolds, normal forms and bifurcations of vector fields with application to coupling between periodic and steady motions

    Science.gov (United States)

    Holmes, Philip J.

    1981-06-01

    We study the instabilities known to aeronautical engineers as flutter and divergence. Mathematically, these states correspond to bifurcations to limit cycles and multiple equilibrium points in a differential equation. Making use of the center manifold and normal form theorems, we concentrate on the situation in which flutter and divergence become coupled, and show that there are essentially two ways in which this is likely to occur. In the first case the system can be reduced to an essential model which takes the form of a single degree of freedom nonlinear oscillator. This system, which may be analyzed by conventional phase-plane techniques, captures all the qualitative features of the full system. We discuss the reduction and show how the nonlinear terms may be simplified and put into normal form. Invariant manifold theory and the normal form theorem play a major role in this work and this paper serves as an introduction to their application in mechanics. Repeating the approach in the second case, we show that the essential model is now three dimensional and that far more complex behavior is possible, including nonperiodic and ‘chaotic’ motions. Throughout, we take a two degree of freedom system as an example, but the general methods are applicable to multi- and even infinite degree of freedom problems.

  5. Queer theory and education to approach not normalizing

    Directory of Open Access Journals (Sweden)

    Wendel Souza Santos

    2017-12-01

    Full Text Available Queer analysis, commonly related to gender studies, is a recent conceptual approach. This article aims mainly to bring this perspective to bear on a critical analysis of the educational field. The big challenge in education is thus to rethink what it means to educate, how to educate, and whom to educate. In a non-normalizing perspective, educating would be a dialogical activity in which experiences that were until now unviable, unrecognized or, more commonly, violated begin to be incorporated into the school routine, changing the hierarchy between who teaches and who is educated and seeking to establish more symmetry between them, so as to move from education to a relational and transformative learning for both.

  6. Revisiting the pion's scalar form factor in chiral perturbation theory

    CERN Document Server

    Juttner, Andreas

    2012-01-01

    The quark-connected and the quark-disconnected Wick contractions contributing to the pion's scalar form factor are computed in the two and in the three flavour chiral effective theory at next-to-leading order. While the quark-disconnected contribution to the form factor itself turns out to be power-counting suppressed its contribution to the scalar radius is of the same order of magnitude as the one of the quark-connected contribution. This result underlines that neglecting quark-disconnected contributions in simulations of lattice QCD can cause significant systematic effects. The technique used to derive these predictions can be applied to a large class of observables relevant for QCD-phenomenology.

  7. Transgression forms and extensions of Chern-Simons gauge theories

    International Nuclear Information System (INIS)

    Mora, Pablo; Olea, Rodrigo; Troncoso, Ricardo; Zanelli, Jorge

    2006-01-01

    A gauge invariant action principle, based on the idea of transgression forms, is proposed. The action extends the Chern-Simons form by the addition of a boundary term that makes the action gauge invariant (and not just quasi-invariant). Interpreting the spacetime manifold as cobordant to another one, the duplication of gauge fields in spacetime is avoided. The advantages of this approach are particularly noticeable for the gravitation theory described by a Chern-Simons lagrangian for the AdS group, in which case the action is regularized and finite for black hole geometries in diverse situations. Black hole thermodynamics is correctly reproduced using either a background field approach or a background-independent setting, even in cases with asymptotically nontrivial topologies. It is shown that the energy found from the thermodynamic analysis agrees with the surface integral obtained by direct application of Noether's theorem

  8. Scientific Theories and Naive Theories as Forms of Mental Representation: Psychologism Revived

    Science.gov (United States)

    Brewer, William F.

    This paper analyzes recent work in psychology on the nature of the representation of complex forms of knowledge with the goal of understanding how theories are represented. The analysis suggests that, as a psychological form of representation, theories are mental structures that include theoretical entities (usually nonobservable), relationships among the theoretical entities, and relationships of the theoretical entities to the phenomena of some domain. A theory explains the phenomena in its domain by providing a conceptual framework for the phenomena that leads to a feeling of understanding in the reader/hearer. The explanatory conceptual framework goes beyond the original phenomena, integrates diverse aspects of the world, and shows how the original phenomena follow from the framework. This analysis is used to argue that mental models are the subclass of theories that use causal/mechanical explanatory frameworks. In addition, an argument is made for a new psychologism in the philosophy of science, in which the mental representation of scientific theories must be taken into account.

  9. Overcoming the Problem of Embedding Change in Educational Organizations: A Perspective from Normalization Process Theory

    Science.gov (United States)

    Wood, Phil

    2017-01-01

    In this article, I begin by outlining some of the barriers which constrain sustainable organizational change in schools and universities. I then go on to introduce a theory which has already started to help explain complex change and innovation processes in health and care contexts, Normalization Process Theory. Finally, I consider what this…

  10. The Forms of Value: Problems of Convertibility in Field Theory

    Directory of Open Access Journals (Sweden)

    Göran Bolin

    2012-01-01

    Full Text Available Media production in late capitalism is often measured in terms of economic value. If value is defined as the worth of a thing, a standard or measure, being the result of social praxis and negotiation between producers and consumers in various combinations, it follows that this worth can be of kinds other than the merely economic. This is, for example, the reasoning behind field theory (Bourdieu), where the generation of field-specific capital (value) is deeply dependent on the belief shared by the competing agents within the field. The full extent of the consequences of such a theory of convertibility between fields of cultural production, centred on different forms of value, is, however, yet to be explored. This is the task of this article. It especially focuses on how value is constructed differently depending on the relations of the valuing subject to the production process, something that becomes highly relevant in digital media environments, where users are increasingly drawn into the production process.

  11. Biomechanical Analysis of Normal Brain Development during the First Year of Life Using Finite Strain Theory

    OpenAIRE

    Kim, Jeong Chul; Wang, Li; Shen, Dinggang; Lin, Weili

    2016-01-01

    The first year of life is the most critical time period for structural and functional development of the human brain. Combining longitudinal MR imaging and finite strain theory, this study aimed to provide new insights into normal brain development through a biomechanical framework. Thirty-three normal infants were longitudinally imaged using MRI from 2 weeks to 1 year of age. Voxel-wise Jacobian determinant was estimated to elucidate volumetric changes while Lagrange strains (both normal and...
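
    In the finite strain framework referred to above, the deformation gradient F obtained from the longitudinal registration yields the two quantities tracked in the study (generic continuum-mechanics definitions, not specific to this paper's implementation):

      J = \det \mathbf{F}, \qquad
      \mathbf{E} = \tfrac{1}{2}\left(\mathbf{F}^{\mathsf T}\mathbf{F} - \mathbf{I}\right),

    where J > 1 indicates local volumetric expansion and the Lagrange strain tensor E captures directional (normal and shear) deformation.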

  12. Closed-form confidence intervals for functions of the normal mean and standard deviation.

    Science.gov (United States)

    Donner, Allan; Zou, G Y

    2012-08-01

    Confidence interval methods for a normal mean and standard deviation are well known and simple to apply. However, the same cannot be said for important functions of these parameters. These functions include the normal distribution percentiles, the Bland-Altman limits of agreement, the coefficient of variation and Cohen's effect size. We present a simple approach to this problem by using variance estimates recovered from confidence limits computed for the mean and standard deviation separately. All resulting confidence intervals have closed forms. Simulation results demonstrate that this approach performs very well for limits of agreement, coefficients of variation and their differences.
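
    A minimal sketch of the idea for one of the listed functions, the normal percentile μ + z_p σ: confidence limits are first computed separately for the mean (t-based) and for z_p σ (chi-square based), and the variance estimates recovered from those limits are combined by the MOVER construction into a closed-form interval. The numerical inputs are hypothetical; this illustrates the approach described in the abstract rather than reproducing the authors' formulas in full.

      import numpy as np
      from scipy import stats

      def mover_ci_percentile(xbar, s, n, p=0.95, alpha=0.05):
          """Closed-form CI for the normal percentile xbar + z_p * s (MOVER)."""
          z_p = stats.norm.ppf(p)
          # Separate limits for the mean (t-based) ...
          t = stats.t.ppf(1 - alpha / 2, n - 1)
          l1, u1 = xbar - t * s / np.sqrt(n), xbar + t * s / np.sqrt(n)
          # ... and for z_p * sigma (chi-square based).
          l2 = z_p * s * np.sqrt((n - 1) / stats.chi2.ppf(1 - alpha / 2, n - 1))
          u2 = z_p * s * np.sqrt((n - 1) / stats.chi2.ppf(alpha / 2, n - 1))
          est1, est2 = xbar, z_p * s
          lower = est1 + est2 - np.sqrt((est1 - l1) ** 2 + (est2 - l2) ** 2)
          upper = est1 + est2 + np.sqrt((u1 - est1) ** 2 + (u2 - est2) ** 2)
          return lower, upper

      print(mover_ci_percentile(xbar=100.0, s=15.0, n=30))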

  13. On the construction of the Kolmogorov normal form for the Trojan asteroids

    CERN Document Server

    Gabern, F; Locatelli, U

    2004-01-01

    In this paper we focus on the stability of the Trojan asteroids for the planar Restricted Three-Body Problem (RTBP), by extending the usual techniques for the neighbourhood of an elliptic point to derive results in a larger vicinity. Our approach is based on the numerical determination of the frequencies of the asteroid and the effective computation of the Kolmogorov normal form for the corresponding torus. This procedure has been applied to the first 34 Trojan asteroids of the IAU Asteroid Catalog, and it has worked successfully for 23 of them. The construction of this normal form allows for computer-assisted proofs of stability. To show it, we have implemented a proof of existence of families of invariant tori close to a given asteroid, for a high order expansion of the Hamiltonian. This proof has been successfully applied to three Trojan asteroids.

  14. Generating All Permutations by Context-Free Grammars in Chomsky Normal Form

    NARCIS (Netherlands)

    Asveld, P.R.J.; Spoto, F.; Scollo, Giuseppe; Nijholt, Antinus

    2003-01-01

    Let $L_n$ be the finite language of all $n!$ strings that are permutations of $n$ different symbols ($n\\geq 1$). We consider context-free grammars $G_n$ in Chomsky normal form that generate $L_n$. In particular we study a few families $\\{G_n\\}_{n\\geq 1}$, satisfying $L(G_n)=L_n$ for $n\\geq 1$, with
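
    As a small concrete illustration (not one of the specific grammar families studied in the paper), the 3! = 6 permutations of {a1, a2, a3} are generated by the following grammar, every production of which has the Chomsky normal form shape X → Y Z or X → a:

      S   → A1 B23 | A2 B13 | A3 B12
      B23 → A2 A3 | A3 A2
      B13 → A1 A3 | A3 A1
      B12 → A1 A2 | A2 A1
      Ai  → ai              (i = 1, 2, 3)

    How compactly such grammars can be chosen as n grows is the kind of question the families {G_n} in this series of papers address.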

  15. Generating all permutations by context-free grammars in Chomsky normal form

    NARCIS (Netherlands)

    Asveld, P.R.J.

    2006-01-01

    Let $L_n$ be the finite language of all $n!$ strings that are permutations of $n$ different symbols ($n\\geq1$). We consider context-free grammars $G_n$ in Chomsky normal form that generate $L_n$. In particular we study a few families $\\{G_n\\}_{n\\geq1}$, satisfying $L(G_n)=L_n$ for $n\\geq1$, with

  16. Generating All Permutations by Context-Free Grammars in Chomsky Normal Form

    NARCIS (Netherlands)

    Asveld, P.R.J.

    2004-01-01

    Let $L_n$ be the finite language of all $n!$ strings that are permutations of $n$ different symbols ($n\\geq 1$). We consider context-free grammars $G_n$ in Chomsky normal form that generate $L_n$. In particular we study a few families $\\{G_n\\}_{n\\geq1}$, satisfying $L(G_n)=L_n$ for $n\\geq 1$, with

  17. THE METHOD OF CONSTRUCTING A BOOLEAN FORMULA OF A POLYGON IN THE DISJUNCTIVE NORMAL FORM

    Directory of Open Access Journals (Sweden)

    A. A. Butov

    2014-01-01

    Full Text Available The paper focuses on finalizing the method of finding a polygon Boolean formula in disjunctive normal form, described in the previous article [1]. The improved method eliminates the drawback associated with the existence of a class of problems for which the solution is only approximate. The proposed method always allows an exact solution to be found. The method can be used, in particular, in systems for the computer-aided design of integrated-circuit topology.

  18. Irreducible integrable theories form tensor products of conformal models

    International Nuclear Information System (INIS)

    Mathur, S.D.; Warner, N.P.

    1991-01-01

    By using Toda field theories we show that there are perturbations of direct products of conformal theories that lead to irreducible integrable field theories. The same affine Toda theory can be truncated to different quantum integrable models for different choices of the charge at infinity and the coupling. The classification of integrable models that can be obtained in this fashion follows the classification of symmetric spaces of type G/H with rank H = rank G. (orig.)

  19. Perturbations and quasi-normal modes of black holes in Einstein-Aether theory

    International Nuclear Information System (INIS)

    Konoplya, R.A.; Zhidenko, A.

    2007-01-01

    We develop a new method for calculation of quasi-normal modes of black holes, when the effective potential, which governs black hole perturbations, is known only numerically in some region near the black hole. This method can be applied to perturbations of a wide class of numerical black hole solutions. We apply it to the black holes in the Einstein-Aether theory, a theory where general relativity is coupled to a unit time-like vector field, in order to allow for local Lorentz symmetry violation. We found that in the non-reduced Einstein-Aether theory, the real oscillation frequency and damping rate of quasi-normal modes are larger than those of Schwarzschild black holes in the Einstein theory

  20. Microscopic theory of the current-voltage relationship across a normal-superconducting interface

    International Nuclear Information System (INIS)

    Kraehenbuehl, Y.; Watts-Tobin, R.J.

    1979-01-01

    Measurements by Pippard et al. have shown the existence of an extra resistance due to the penetration of an electrical potential into a superconductor. Previous theories of this effect are unable to explain the full temperature dependence of the extra resistance because they use oversimplified models of the normal--superconducting interface. We show that the microscopic theory for dirty superconductors leads to a good agreement with experiment over the whole temperature range

  1. A structure-preserving approach to normal form analysis of power systems; Una propuesta de preservacion de estructura al analisis de su forma normal en sistemas de potencia

    Energy Technology Data Exchange (ETDEWEB)

    Martinez Carrillo, Irma

    2008-01-15

    Power system dynamic behavior is inherently nonlinear and is driven by different processes at different time scales. The size and complexity of these mechanisms have stimulated the search for methods that reduce the original dimension but retain a certain degree of accuracy. In this dissertation, a novel nonlinear dynamical analysis method for the analysis of large amplitude oscillations that embraces ideas from normal form theory and singular perturbation techniques is proposed. This approach allows the full potential of the normal form method to be reached, and is suitably general for application to a wide variety of nonlinear systems. Drawing on the formal theory of dynamical systems, a structure-preserving model of the system is developed that preserves network and load characteristics. By exploiting the separation of fast and slow time scales of the model, an efficient approach, based on singular perturbation techniques, is then derived for constructing a nonlinear power system representation that accurately preserves network structure. The method requires no reduction of the constraint equations and therefore gives information about the effect of network and load characteristics on system behavior. Analytical expressions are then developed that provide approximate solutions to system performance near a singularity and techniques for interpreting these solutions in terms of modal functions are given. New insights into the nature of nonlinear oscillations are also offered and criteria for characterizing network effects on nonlinear system behavior are proposed. Theoretical insight into the behavior of dynamic coupling of differential-algebraic equations and the origin of nonlinearity is given, and implications for the analysis, design, and placement of power system controllers in complex nonlinear systems are discussed. The extent of applicability of the proposed procedure is demonstrated by analyzing nonlinear behavior in two realistic test power systems

  2. High molecular gas fractions in normal massive star-forming galaxies in the young Universe.

    Science.gov (United States)

    Tacconi, L J; Genzel, R; Neri, R; Cox, P; Cooper, M C; Shapiro, K; Bolatto, A; Bouché, N; Bournaud, F; Burkert, A; Combes, F; Comerford, J; Davis, M; Schreiber, N M Förster; Garcia-Burillo, S; Gracia-Carpio, J; Lutz, D; Naab, T; Omont, A; Shapley, A; Sternberg, A; Weiner, B

    2010-02-11

    Stars form from cold molecular interstellar gas. As this is relatively rare in the local Universe, galaxies like the Milky Way form only a few new stars per year. Typical massive galaxies in the distant Universe formed stars an order of magnitude more rapidly. Unless star formation was significantly more efficient, this difference suggests that young galaxies were much more molecular-gas rich. Molecular gas observations in the distant Universe have so far largely been restricted to very luminous, rare objects, including mergers and quasars, and accordingly we do not yet have a clear idea about the gas content of more normal (albeit massive) galaxies. Here we report the results of a survey of molecular gas in samples of typical massive-star-forming galaxies at mean redshifts of about 1.2 and 2.3, when the Universe was respectively 40% and 24% of its current age. Our measurements reveal that distant star forming galaxies were indeed gas rich, and that the star formation efficiency is not strongly dependent on cosmic epoch. The average fraction of cold gas relative to total galaxy baryonic mass at z = 2.3 and z = 1.2 is respectively about 44% and 34%, three to ten times higher than in today's massive spiral galaxies. The slow decrease between z approximately 2 and z approximately 1 probably requires a mechanism of semi-continuous replenishment of fresh gas to the young galaxies.

  3. Generating All Circular Shifts by Context-Free Grammars in Greibach Normal Form

    NARCIS (Netherlands)

    Asveld, Peter R.J.

    2007-01-01

    For each alphabet Σn = {a1,a2,…,an}, linearly ordered by a1 < a2 < ⋯ < an, let Cn be the language of circular or cyclic shifts over Σn, i.e., Cn = {a1a2 ⋯ an-1an, a2a3 ⋯ ana1,…,ana1 ⋯ an-2an-1}. We study a few families of context-free grammars Gn (n ≥1) in Greibach normal form such that Gn generates

  4. Evaluating accounting information systems that support multiple GAAP reporting using Normalized Systems Theory

    NARCIS (Netherlands)

    Vanhoof, E.; Huysmans, P.; Aerts, Walter; Verelst, J.; Aveiro, D.; Tribolet, J.; Gouveia, D.

    2014-01-01

    This paper uses a mixed methods approach of design science and case study research to evaluate structures of Accounting Information Systems (AIS) that report in multiple Generally Accepted Accounting Principles (GAAP), using Normalized Systems Theory (NST). To comply with regulation, many companies

  5. Normal Patterns of Deja Experience in a Healthy, Blind Male: Challenging Optical Pathway Delay Theory

    Science.gov (United States)

    O'Connor, Akira R.; Moulin, Christopher J. A.

    2006-01-01

    We report the case of a 25-year-old healthy, blind male, MT, who experiences normal patterns of deja vu. The optical pathway delay theory of deja vu formation assumes that neuronal input from the optical pathways is necessary for the formation of the experience. Surprisingly, although the sensation of deja vu is known to be experienced by blind…

  6. Normal form of particle motion under the influence of an ac dipole

    Directory of Open Access Journals (Sweden)

    R. Tomás

    2002-05-01

    Full Text Available ac dipoles in accelerators are used to excite coherent betatron oscillations at a drive frequency close to the tune. These beam oscillations may last arbitrarily long and, in principle, there is no significant emittance growth if the ac dipole is adiabatically turned on and off. Therefore the ac dipole seems to be an adequate tool for nonlinear diagnostics provided the particle motion is well described in the presence of the ac dipole and nonlinearities. Normal forms and Lie algebra are powerful tools to study the nonlinear content of an accelerator lattice. In this article a way to obtain the normal form of the Hamiltonian of an accelerator with an ac dipole is described. The particle motion to first order in the nonlinearities is derived using Lie algebra techniques. The dependence of the Hamiltonian terms on the longitudinal coordinate is studied showing that they vary differently depending on the ac dipole parameters. The relation is given between the lines of the Fourier spectrum of the turn-by-turn motion and the Hamiltonian terms.
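
    As a toy illustration of the driven turn-by-turn motion described above (a purely linear one-turn map with made-up tunes and kick strength, so none of the nonlinear normal form content of the article is reproduced), one can iterate a rotation plus an oscillating dipole kick and inspect the spectral lines.

      import numpy as np

      Q, Qd, kick, turns = 0.31, 0.32, 1e-3, 4096   # natural tune, drive tune (illustrative)
      c, s = np.cos(2 * np.pi * Q), np.sin(2 * np.pi * Q)
      x, px = 0.0, 0.0
      record = np.empty(turns)
      for n in range(turns):
          px += kick * np.cos(2 * np.pi * Qd * n)     # ac-dipole kick
          x, px = c * x + s * px, -s * x + c * px     # one-turn linear rotation
          record[n] = x

      spectrum = np.abs(np.fft.rfft(record))
      tunes = np.fft.rfftfreq(turns)
      # Lines appear at the drive tune Qd and, because the drive is switched on
      # abruptly here rather than adiabatically, also at the natural tune Q.
      print("strongest line near tune", tunes[spectrum.argmax()])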

  7. Principal Typings in a Restricted Intersection Type System for Beta Normal Forms with De Bruijn Indices

    Directory of Open Access Journals (Sweden)

    Daniel Ventura

    2010-01-01

    Full Text Available The lambda-calculus with de Bruijn indices assembles each alpha-class of lambda-terms in a unique term, using indices instead of variable names. Intersection types provide finitary type polymorphism and can characterise normalisable lambda-terms through the property that a term is normalisable if and only if it is typeable. To be closer to computations and to simplify the formalisation of the atomic operations involved in beta-contractions, several calculi of explicit substitution were developed mostly with de Bruijn indices. Versions of explicit substitutions calculi without types and with simple type systems are well investigated in contrast to versions with more elaborate type systems such as intersection types. In previous work, we introduced a de Bruijn version of the lambda-calculus with an intersection type system and proved that it preserves subject reduction, a basic property of type systems. In this paper a version with de Bruijn indices of an intersection type system originally introduced to characterise principal typings for beta-normal forms is presented. We present the characterisation in this new system and the corresponding versions for the type inference and the reconstruction of normal forms from principal typings algorithms. We briefly discuss the failure of the subject reduction property and some possible solutions for it.

  8. Nucleon form factors in dispersively improved chiral effective field theory. II. Electromagnetic form factors

    Science.gov (United States)

    Alarcón, J. M.; Weiss, C.

    2018-05-01

    We study the nucleon electromagnetic form factors (EM FFs) using a recently developed method combining chiral effective field theory (χEFT) and dispersion analysis. The spectral functions on the two-pion cut at t > 4 M_π^2 are constructed using the elastic unitarity relation and an N/D representation. χEFT is used to calculate the real functions J_±1(t) = f_±1(t)/F_π(t) (ratios of the complex ππ → N N̄ partial-wave amplitudes and the timelike pion FF), which are free of ππ rescattering. Rescattering effects are included through the empirical timelike pion FF |F_π(t)|^2. The method allows us to compute the isovector EM spectral functions up to t ≈ 1 GeV^2 with controlled accuracy (leading order, next-to-leading order, and partial next-to-next-to-leading order). With the spectral functions we calculate the isovector nucleon EM FFs and their derivatives at t = 0 (EM radii, moments) using subtracted dispersion relations. We predict the values of higher FF derivatives, which are not affected by higher-order chiral corrections and are obtained almost parameter-free in our approach, and explain their collective behavior. We estimate the individual proton and neutron FFs by adding an empirical parametrization of the isoscalar sector. Excellent agreement with the present low-Q^2 FF data is achieved up to ≈0.5 GeV^2 for G_E, and up to ≈0.2 GeV^2 for G_M. Our results can be used to guide the analysis of low-Q^2 elastic scattering data and the extraction of the proton charge radius.

  9. Optimization of accelerator parameters using normal form methods on high-order transfer maps

    Energy Technology Data Exchange (ETDEWEB)

    Snopok, Pavel [Michigan State Univ., East Lansing, MI (United States)

    2007-05-01

    Methods of analysis of the dynamics of ensembles of charged particles in collider rings are developed. The following problems are posed and solved using normal form transformations and other methods of perturbative nonlinear dynamics: (1) Optimization of the Tevatron dynamics: (a) Skew quadrupole correction of the dynamics of particles in the Tevatron in the presence of the systematic skew quadrupole errors in dipoles; (b) Calculation of the nonlinear tune shift with amplitude based on the results of measurements and the linear lattice information; (2) Optimization of the Muon Collider storage ring: (a) Computation and optimization of the dynamic aperture of the Muon Collider 50 x 50 GeV storage ring using higher order correctors; (b) 750 x 750 GeV Muon Collider storage ring lattice design matching the Tevatron footprint. The normal form coordinates have a very important advantage over the particle optical coordinates: if the transformation can be carried out successfully (general restrictions for that are not much stronger than the typical restrictions imposed on the behavior of the particles in the accelerator), then the motion in the new coordinates has a very clean representation, allowing one to extract more information about the dynamics of particles, and they are very convenient for the purposes of visualization. All the problem formulations include the derivation of the objective functions, which are later used in the optimization process using various optimization algorithms. Algorithms used to solve the problems are specific to collider rings, and applicable to similar problems arising on other machines of the same type. The details of the long-term behavior of the systems are studied to ensure their stability for the desired number of turns. The algorithm of the normal form transformation is of great value for such problems as it gives much extra information about the disturbing factors. In addition to the fact that the dynamics of particles is represented

  10. Bioactive form of resveratrol in glioblastoma cells and its safety for normal brain cells

    Directory of Open Access Journals (Sweden)

    Xiao-Hong Shu

    2013-05-01

    Full Text Available ABSTRACT Background: Resveratrol, a plant polyphenol existing in grapes and many other natural foods, possesses a wide range of biological activities including cancer prevention. It has been recognized that resveratrol is intracellularly biotransformed to different metabolites, but no direct evidence has been available to ascertain its bioactive form because of the difficulty of maintaining resveratrol unmetabolized in vivo or in vitro. It would therefore be worthwhile to elucidate the potential therapeutic implications of resveratrol metabolism using reliable resveratrol-sensitive cancer cells. Objective: To identify the real biological form of trans-resveratrol and to evaluate the safety of the effective anticancer dose of resveratrol for normal brain cells. Methods: The samples were prepared from the conditioned media and cell lysates of human glioblastoma U251 cells, and were purified by solid phase extraction (SPE). The samples were subjected to high performance liquid chromatography (HPLC) and liquid chromatography/tandem mass spectrometry (LC/MS) analysis. According to the metabolite(s), trans-resveratrol was biotransformed in vitro by the method described elsewhere, and the resulting solution was used to treat U251 cells. Meanwhile, the responses of U251 and primarily cultured rat normal brain cells (glial cells and neurons) to 100 μM trans-resveratrol were evaluated by multiple experimental methods. Results: The results revealed that resveratrol monosulfate was the major metabolite in U251 cells. About half of the resveratrol monosulfate fraction was prepared in vitro, and this mixture of trans-resveratrol and resveratrol monosulfate showed little inhibitory effect on U251 cells. It was also found that rat primary brain cells (PBCs) not only resist 100 μM but also tolerate as high as 200 μM resveratrol treatment. Conclusions: Our study thus demonstrated that trans-resveratrol was the bioactive form in glioblastoma cells and, therefore, the biotransforming

  11. Differential forms on singular varieties De Rham and Hodge theory simplified

    CERN Document Server

    Ancona, Vincenzo

    2005-01-01

    Differential Forms on Singular Varieties: De Rham and Hodge Theory Simplified uses complexes of differential forms to give a complete treatment of the Deligne theory of mixed Hodge structures on the cohomology of singular spaces. This book features an approach that employs recursive arguments on dimension and does not introduce spaces of higher dimension than the initial space. It simplifies the theory through easily identifiable and well-defined weight filtrations. It also avoids discussion of cohomological descent theory to maintain accessibility. Topics include classical Hodge theory, differential forms on complex spaces, and mixed Hodge structures on noncompact spaces.

  12. Is quantum theory a form of statistical mechanics?

    Science.gov (United States)

    Adler, S. L.

    2007-05-01

    We give a review of the basic themes of my recent book: Adler S L 2004 Quantum Theory as an Emergent Phenomenon (Cambridge: Cambridge University Press). We first give motivations for considering the possibility that quantum mechanics is not exact, but is instead an accurate asymptotic approximation to a deeper level theory. For this deeper level, we propose a non-commutative generalization of classical mechanics, that we call "trace dynamics", and we give a brief survey of how it works, considering for simplicity only the bosonic case. We then discuss the statistical mechanics of trace dynamics and give our argument that with suitable approximations, the Ward identities for trace dynamics imply that ensemble averages in the canonical ensemble correspond to Wightman functions in quantum field theory. Thus, quantum theory emerges as the statistical thermodynamics of trace dynamics. Finally, we argue that Brownian motion corrections to this thermodynamics lead to stochastic corrections to the Schrödinger equation, of the type that have been much studied in the "continuous spontaneous localization" model of objective state vector reduction. In appendices to the talk, we give details of the existence of a conserved operator in trace dynamics that encodes the structure of the canonical algebra, of the derivation of the Ward identities, and of the proof that the stochastically-modified Schrödinger equation leads to state vector reduction with Born rule probabilities.

  13. Normal form analysis of linear beam dynamics in a coupled storage ring

    International Nuclear Information System (INIS)

    Wolski, Andrzej; Woodley, Mark D.

    2004-01-01

    The techniques of normal form analysis, well known in the literature, can be used to provide a straightforward characterization of linear betatron dynamics in a coupled lattice. Here, we consider both the beam distribution and the betatron oscillations in a storage ring. We find that the beta functions for uncoupled motion generalize in a simple way to the coupled case. Defined in the way that we propose, the beta functions remain well behaved (positive and finite) under all circumstances, and have essentially the same physical significance for the beam size and betatron oscillation amplitude as in the uncoupled case. Application of this analysis to the online modeling of the PEP-II rings is also discussed

  14. Theory of normal and superconducting properties of fullerene-based solids

    International Nuclear Information System (INIS)

    Cohen, M.L.

    1992-10-01

    Recent experiments on the normal-state and superconducting properties of fullerene-based solids are used to constrain the proposed theories of the electronic nature of these materials. In general, models of superconductivity based on electron pairing induced by phonons are consistent with electronic band theory. These experiments also yield estimates of the parameters characterizing these type II superconductors. It is argued that, at this point, a "standard model" of phonons interacting with itinerant electrons may be a good first approximation for explaining the properties of the metallic fullerenes

  15. Theory of novel normal and superconducting states in doped oxide high-Tc superconductors

    International Nuclear Information System (INIS)

    Dzhumanov, S.

    2001-10-01

    A consistent and complete theory of the novel normal and superconducting (SC) states of doped high-T_c superconductors (HTSC) is developed by combining the continuum model of carrier self-trapping, the tight-binding model and the novel Fermi-Bose-liquid (FBL) model. The ground-state energy of carriers in lightly doped HTSC is calculated within the continuum model and adiabatic approximation using the variational method. The destruction of the long-range antiferromagnetic (AF) order at low doping x ≥ x_c1 ≅ 0.015, the formation of the in-gap states or bands and novel (bi)polaronic insulating phases at x_c2 ≅ 0.06-0.08, and the new metal-insulator transition at x ≅ x_c2 in HTSC are studied within the continuum model of impurity (defect) centers and large (bi)polarons by using the appropriate tight-binding approximations. It is found that the three-dimensional (3d) large (bi)polarons are formed at ε_∞/ε_0 ≤ 0.1 and become itinerant when the (bi)polaronic insulator-to-(bi)polaronic metal transitions occur at x ≥ x_c2. We show that the novel pseudogapped metallic and SC states in HTSC are formed at x_c2 ≤ x ≤ x_p ≅ 0.20-0.24. We demonstrate that the large polaronic and small BCS-like pairing pseudogaps opening in the excitation spectrum of underdoped (x_c2 < x < x_BCS = 0.125), optimally doped (x_BCS < x < x_o ≅ 0.20) and overdoped (x > x_o) HTSC above T_c are unrelated to superconductivity and they are responsible for the observed anomalous optical, transport, magnetic and other properties of these HTSC. We develop the original two-stage FBL model of novel superconductivity describing the combined novel BCS-like pairing scenario of fermions and true superfluid (SF) condensation scenario of composite bosons (i.e. bipolarons and cooperons) in any Fermi-systems, where the SF condensate gap Δ_B and the BCS-like pairing pseudogap Δ_F have different origins. The pair and single particle condensations of attracting 3d and two-dimensional (2d) composite bosons are responsible for

  16. Matrix theory from generalized inverses to Jordan form

    CERN Document Server

    Piziak, Robert

    2007-01-01

    Each chapter ends with a list of references for further reading. Undoubtedly, these will be useful for anyone who wishes to pursue the topics deeper. … the book has many MATLAB examples and problems presented at appropriate places. … the book will become a widely used classroom text for a second course on linear algebra. It can be used profitably by graduate and advanced level undergraduate students. It can also serve as an intermediate course for more advanced texts in matrix theory. This is a lucidly written book by two authors who have made many contributions to linear and multilinear algebra.-K.C. Sivakumar, IMAGE, No. 47, Fall 2011Always mathematically constructive, this book helps readers delve into elementary linear algebra ideas at a deeper level and prepare for further study in matrix theory and abstract algebra.-L'enseignement Mathématique, January-June 2007, Vol. 53, No. 1-2.

  17. Theory of the low-voltage impedance of superconductor-- p insulator--normal metal tunnel junctions

    International Nuclear Information System (INIS)

    Lemberger, T.R.

    1984-01-01

    A theory for the low-voltage impedance of a superconductor-- p insulator--normal metal tunnel junction is developed that includes the effects of charge imbalance and of quasiparticle fluctuations. A novel, inelastic, charge-imbalance relaxation process is identified that is associated with the junction itself. This new process leads to the surprising result that the charge-imbalance component of the dc resistance of a junction becomes independent of the electron-phonon scattering rate as the insulator resistance decreases

  18. Long-wave theory for a new convective instability with exponential growth normal to the wall.

    Science.gov (United States)

    Healey, J J

    2005-05-15

    A linear stability theory is presented for the boundary-layer flow produced by an infinite disc rotating at constant angular velocity in otherwise undisturbed fluid. The theory is developed in the limit of long waves and when the effects of viscosity on the waves can be neglected. This is the parameter regime recently identified by the author in a numerical stability investigation where a curious new type of instability was found in which disturbances propagate and grow exponentially in the direction normal to the disc, (i.e. the growth takes place in a region of zero mean shear). The theory describes the mechanisms controlling the instability, the role and location of critical points, and presents a saddle-point analysis describing the large-time evolution of a wave packet in frames of reference moving normal to the disc. The theory also shows that the previously obtained numerical solutions for numerically large wavelengths do indeed lie in the asymptotic long-wave regime, and so the behaviour and mechanisms described here may apply to a number of cross-flow instability problems.

  19. Closed 1-forms in topology and geometric group theory

    Energy Technology Data Exchange (ETDEWEB)

    Farber, Michael; Schuetz, Dirk [University of Durham, Durham (United Kingdom); Geoghegan, Ross [State University of New York, New York (United States)

    2010-01-01

    In this article we describe relations of the topology of closed 1-forms to the group-theoretic invariants of Bieri-Neumann-Strebel-Renz. Starting with a survey, we extend these Sigma invariants to finite CW-complexes and show that many properties of the group-theoretic version have analogous statements. In particular, we show the relation between Sigma invariants and finiteness properties of certain infinite covering spaces. We also discuss applications of these invariants to the Lusternik-Schnirelmann category of a closed 1-form and to the existence of a non-singular closed 1-form in a given cohomology class on a high-dimensional closed manifold. Bibliography: 32 titles.

  20. Child in a Form: The Definition of Normality and Production of Expertise in Teacher Statement Forms--The Case of Northern Finland, 1951-1990

    Science.gov (United States)

    Koskela, Anne; Vehkalahti, Kaisa

    2017-01-01

    This article shows the importance of paying attention to the role of professional devices, such as standardised forms, as producers of normality and deviance in the history of education. Our case study focused on the standardised forms used by teachers during child guidance clinic referrals and transfers to special education in northern Finland,…

  1. Metacognition and Reading: Comparing Three Forms of Metacognition in Normally Developing Readers and Readers with Dyslexia.

    Science.gov (United States)

    Furnes, Bjarte; Norman, Elisabeth

    2015-08-01

    Metacognition refers to 'cognition about cognition' and includes metacognitive knowledge, strategies and experiences (Efklides, 2008; Flavell, 1979). Research on reading has shown that better readers demonstrate more metacognitive knowledge than poor readers (Baker & Beall, 2009), and that reading ability improves through strategy instruction (Gersten, Fuchs, Williams, & Baker, 2001). The current study is the first to specifically compare the three forms of metacognition in dyslexic (N = 22) versus normally developing readers (N = 22). Participants read two factual texts, with learning outcome measured by a memory task. Metacognitive knowledge and skills were assessed by self-report. Metacognitive experiences were measured by predictions of performance and judgments of learning. Individuals with dyslexia showed insight into their reading problems, but less general knowledge of how to approach text reading. They more often reported lack of available reading strategies, but groups did not differ in the use of deep and surface strategies. Learning outcome and mean ratings of predictions of performance and judgments of learning were lower in dyslexic readers, but not the accuracy with which metacognitive experiences predicted learning. Overall, the results indicate that dyslexic reading and spelling problems are not generally associated with lower levels of metacognitive knowledge, metacognitive strategies or sensitivity to metacognitive experiences in reading situations.

  2. Estimating Non-Normal Latent Trait Distributions within Item Response Theory Using True and Estimated Item Parameters

    Science.gov (United States)

    Sass, D. A.; Schmitt, T. A.; Walker, C. M.

    2008-01-01

    Item response theory (IRT) procedures have been used extensively to study normal latent trait distributions and have been shown to perform well; however, less is known concerning the performance of IRT with non-normal latent trait distributions. This study investigated the degree of latent trait estimation error under normal and non-normal…

  3. [Quantitative analysis method based on fractal theory for medical imaging of normal brain development in infants].

    Science.gov (United States)

    Li, Heheng; Luo, Liangping; Huang, Li

    2011-02-01

    The present paper aims to study the fractal spectrum of cerebral computerized tomography in 158 normal infants of different age groups, based on calculations from chaos theory. The distribution range in the neonatal period was 1.88-1.90 (mean = 1.8913 +/- 0.0064); it reached a stable condition at the level of 1.89-1.90 during 1-12 months of age (mean = 1.8927 +/- 0.0045); the normal range for 1-2 year old infants was 1.86-1.90 (mean = 1.8863 +/- 0.0085); it kept the invariance of the quantitative value within 1.88-1.91 (mean = 1.8958 +/- 0.0083) during 2-3 years of age. ANOVA indicated there was no significant difference between boys and girls (F = 0.243, P > 0.05), but the difference between age groups was significant (F = 8.947, P development.
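
    As a rough sketch of the kind of computation behind such fractal estimates (the article's exact algorithm is not reproduced here), a box-counting dimension for a binarized 2-D image can be written as follows; the synthetic test image and the box sizes are assumptions made purely for illustration.

      import numpy as np

      def box_counting_dimension(binary_img, box_sizes):
          """Estimate the box-counting (fractal) dimension of a 2-D binary image."""
          counts = []
          for s in box_sizes:
              # Trim the image so it tiles exactly into s-by-s boxes
              h = (binary_img.shape[0] // s) * s
              w = (binary_img.shape[1] // s) * s
              boxes = binary_img[:h, :w].reshape(h // s, s, w // s, s).any(axis=(1, 3))
              counts.append(boxes.sum())              # boxes containing foreground pixels
          # Dimension estimate = slope of log(count) versus log(1/box size)
          slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
          return slope

      # Synthetic test image: a filled disc, whose box-counting dimension is close to 2
      y, x = np.mgrid[0:256, 0:256]
      disc = (x - 128) ** 2 + (y - 128) ** 2 < 100 ** 2
      print(round(box_counting_dimension(disc, box_sizes=[2, 4, 8, 16, 32]), 3))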

  4. Assessment the Plasticity of Cortical Brain Theory through Visual Memory in Deaf and Normal Students

    Directory of Open Access Journals (Sweden)

    Ali Ghanaee-Chamanabad

    2012-10-01

    Full Text Available Background: The main aim of this research was to assess the differences in visual memory between deaf and normal students according to the plasticity of the cortical brain. Materials and Methods: This is an ex post facto study. The Benton visual test was administered in two different ways to 46 primary school students (22 deaf and 24 normal students). Student's t-test was used to analyse the data. Results: Visual memory in deaf students was significantly higher than in comparable normal (not deaf) students. While the visual memory performance of deaf girls was higher than that of normal girls in both ways of administration, deaf boys performed better in only one of the two administrations of the Benton visual memory test. Conclusion: Brain plasticity shows that the adult brain is dynamic and that changes occur in it. This plasticity is not limited to somatosensory systems. Therefore, according to the theory of cortical brain plasticity, deaf students, because of their hearing deficit, have increased visual inputs, which developed procedural visual memory.

  5. Biomechanical Analysis of Normal Brain Development during the First Year of Life Using Finite Strain Theory.

    Science.gov (United States)

    Kim, Jeong Chul; Wang, Li; Shen, Dinggang; Lin, Weili

    2016-12-02

    The first year of life is the most critical time period for structural and functional development of the human brain. Combining longitudinal MR imaging and finite strain theory, this study aimed to provide new insights into normal brain development through a biomechanical framework. Thirty-three normal infants were longitudinally imaged using MRI from 2 weeks to 1 year of age. Voxel-wise Jacobian determinant was estimated to elucidate volumetric changes while Lagrange strains (both normal and shear strains) were measured to reveal directional growth information every 3 months during the first year of life. Directional normal strain maps revealed that, during the first 6 months, the growth pattern of gray matter is anisotropic and spatially inhomogeneous with higher left-right stretch around the temporal lobe and interhemispheric fissure, anterior-posterior stretch in the frontal and occipital lobes, and superior-inferior stretch in right inferior occipital and right inferior temporal gyri. In contrast, anterior lateral ventricles and insula showed an isotropic stretch pattern. Volumetric and directional growth rates were linearly decreased with age for most of the cortical regions. Our results revealed anisotropic and inhomogeneous brain growth patterns of the human brain during the first year of life using longitudinal MRI and a biomechanical framework.
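
    A minimal sketch (an assumption for illustration, not the authors' processing pipeline) of the two quantities named above: the voxel-wise Jacobian determinant and the Green-Lagrange strain tensor, computed from a displacement field u(x) that maps the earlier to the later time point.

      import numpy as np

      def growth_measures(u, spacing=(1.0, 1.0, 1.0)):
          """Jacobian determinant and Green-Lagrange strain from a 3-D displacement field.

          u: array of shape (3, nx, ny, nz) holding the displacement components.
          Returns (jac_det, E) with E of shape (3, 3, nx, ny, nz).
          """
          # Deformation gradient F = I + grad(u), via central differences
          grad_u = np.stack([np.stack(np.gradient(u[i], *spacing), axis=0)
                             for i in range(3)], axis=0)          # grad_u[i, j] = du_i/dx_j
          identity = np.eye(3).reshape(3, 3, 1, 1, 1)
          F = identity + grad_u

          # Volumetric change: det(F) > 1 means local expansion (growth)
          jac_det = np.linalg.det(np.moveaxis(F, (0, 1), (-2, -1)))

          # Green-Lagrange strain E = (F^T F - I) / 2; diagonal entries are the normal
          # strains, off-diagonal entries the shear strains
          FtF = np.einsum('ki...,kj...->ij...', F, F)
          E = 0.5 * (FtF - identity)
          return jac_det, E

      # Illustrative uniform 10% stretch along x on a small grid
      u = np.zeros((3, 8, 8, 8))
      u[0] = 0.1 * np.arange(8).reshape(8, 1, 1)
      jac, E = growth_measures(u)
      print(jac.mean(), E[0, 0].mean())    # about 1.1 and 0.105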

  6. Relativity, Symmetry, and the Structure of Quantum Theory, Volume 2; Point form relativistic quantum mechanics

    Science.gov (United States)

    Klink, William H.; Schweiger, Wolfgang

    2018-03-01

    This book covers relativistic quantum theory from the point of view of a particle theory, based on the irreducible representations of the Poincaré group, the group that expresses the symmetry of Einstein relativity. There are several ways of formulating such a theory; this book develops what is called relativistic point form quantum mechanics, which, unlike quantum field theory, deals with a fixed number of particles in a relativistically invariant way. A chapter is devoted to applications of point form quantum mechanics to nuclear physics.

  7. Efficient Computation of Transition State Resonances and Reaction Rates from a Quantum Normal Form

    NARCIS (Netherlands)

    Schubert, Roman; Waalkens, Holger; Wiggins, Stephen

    2006-01-01

    A quantum version of a recent formulation of transition state theory in phase space is presented. The theory developed provides an algorithm to compute quantum reaction rates and the associated Gamov-Siegert resonances with very high accuracy. The algorithm is especially efficient for

  8. Bernstein's theory of pedagogic discourse as a theoretical framework for educators studying student radiographers' interpretation of normality vs. abnormality

    International Nuclear Information System (INIS)

    Winter, Peter D.; Linehan, Mark J.

    2014-01-01

    Purpose: To acknowledge the tacit rules underpinning academic practice of undergraduate radiographers in determining normality vs. abnormality when appraising skeletal images. Methodology: Twelve students were interviewed (individually) using in-depth semi-structured questions. Interviews were mediated through a PowerPoint presentation containing two digital X-ray images. Each image was based on a level of expertise; the elementary (Case 1) and the complicated (Case 2). The questions were based on regular ‘frames’ created from observing tutor–student contact in class, and then validated through a group interview. Bernstein's theory of pedagogic discourse was then utilised as a data analysis instrument to determine how third year diagnostic radiography students interpreted X-ray images, in relation to the ‘recognition’ and ‘realisation’ rules of the Educational Theoretical Framework. Conclusion: Bernstein's framework has made it possible to specify, in detail, how issues and difficulties are formed at the level of the acquirer during interpretation. The recognition rules enabled students to meaningfully recognise what trauma characteristics can be associated with the image and the demands of a detailed scrutiny so as to enact a competent interpretation. Realisation rules, made it possible for students to establish their own systematic approach and realise legitimate meanings of normality and abnormality. Whereas obvious or visible trauma generated realisation rules (represented via homogenous terminology), latent trauma authorised students to deviate from legitimate meanings. The latter rule, in this context, has directed attention to the student issue of visioning abnormality when images are normal

  9. Overweight but unseen: a review of the underestimation of weight status and a visual normalization theory.

    Science.gov (United States)

    Robinson, E

    2017-10-01

    Although overweight and obesity are widespread across most of the developed world, a considerable body of research has now accumulated, which suggests that adiposity often goes undetected. A substantial proportion of individuals with overweight or obesity do not identify they are overweight, and large numbers of parents of children with overweight or obesity fail to identify their child as being overweight. Lay people and medical practitioners are also now poor at identifying overweight and obesity in others. A visual normalization theory of the under-detection of overweight and obesity is proposed. This theory is based on the notion that weight status is judged relative to visual body size norms. Because larger body sizes are now common, this has caused a recalibration to the range of body sizes that are perceived as being 'normal' and increased the visual threshold for what constitutes 'overweight'. Evidence is reviewed that indicates this process has played a significant role in the under-detection of overweight and obesity. The public health relevance of the under-detection of overweight and obesity is also discussed.

  10. Nilpotent algebras of the generalized differential forms and the geometry of superfield theories

    International Nuclear Information System (INIS)

    Zupnik, B.M.

    1991-01-01

    We consider a new algebraic approach in the geometry of supergauge theories and supergravity. An introduction of nilpotent algebras simplifies significantly the analysis of D = 3, 4, N = 1 supergravity constraints. Different terms in the invariant action functionals of SG- and SYM-theories are constructed as the integrals of corresponding generalized differential forms. (orig.)

  11. Second Person Singular Address Forms in Caleno Spanish: Applying a Theory of Language Regard

    Science.gov (United States)

    Newall, Gregory M.

    2012-01-01

    Language regard is defined as the opinions and norms that speakers have about language. In this dissertation, a theory of language regard is applied to variation in second-person singular address forms in Cali Colombian Spanish ("tuteo", "voseo", and "ustedeo"). This theory claims that language production and…

  12. Mandibulary dental arch form differences between level four polynomial method and pentamorphic pattern for normal occlusion sample

    Directory of Open Access Journals (Sweden)

    Y. Yuliana

    2011-07-01

    Full Text Available The aim of an orthodontic treatment is to achieve aesthetics, dental health and healthy surrounding tissues, a functional occlusal relationship, and stability. The success of an orthodontic treatment is influenced by many factors, such as the diagnosis and the treatment plan. In order to make a diagnosis and a treatment plan, medical records, clinical examination, radiographic examination, extra-oral and intra-oral photos, as well as study model analysis are needed. The purpose of this study was to evaluate the differences in dental arch form between the level four polynomial method and the pentamorphic arch form and to determine which one is best suited for a normal occlusion sample. This analytic comparative study was conducted at the Faculty of Dentistry, Universitas Padjadjaran, on 13 models by comparing the dental arch form obtained using the level four polynomial method, based on mathematical calculations, with the pentamorphic arch pattern, using mandibular normal occlusion as a control. The results obtained were tested using Student's t-test. The results indicate a significant difference both for the level four polynomial method and for the pentamorphic arch form when compared with the mandibular normal occlusion dental arch form. The level four polynomial fits better compared to the pentamorphic arch form.
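
    A small sketch of the polynomial side of such a comparison (the coordinates below are invented for illustration, not data from the study): a fourth-degree ("level four") polynomial is fitted to measured arch landmarks, and the residual error quantifies how well the curve matches the arch.

      import numpy as np

      # Hypothetical (x, y) coordinates of mandibular tooth landmarks, in mm
      x = np.array([-25.0, -20.0, -13.0, -6.0, 0.0, 6.0, 13.0, 20.0, 25.0])
      y = np.array([ 30.0,  20.0,  10.0,  3.0, 0.0, 3.0, 10.0, 20.0, 30.0])

      # Level four polynomial: y = c4*x^4 + c3*x^3 + c2*x^2 + c1*x + c0
      coeffs = np.polyfit(x, y, deg=4)
      y_fit = np.polyval(coeffs, x)

      # Goodness of fit of the polynomial arch form to the measured landmarks
      rmse = np.sqrt(np.mean((y - y_fit) ** 2))
      print("coefficients:", np.round(coeffs, 5))
      print("RMSE (mm):", round(rmse, 3))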

  13. The relationship of theory of mind and executive functions in normal, deaf and cochlear-implanted children

    Directory of Open Access Journals (Sweden)

    Farideh Nazarzadeh

    2014-08-01

    Full Text Available Background and Aim: Theory of mind refers to the ability to understand that others have mental states that can be different from one's own mental states or from the facts. This study aimed to investigate the relationship between theory of mind and executive functions in normal-hearing, deaf, and cochlear-implanted children. Methods: The study population consisted of normal-hearing, deaf and cochlear-implanted girl students in Mashhad city, Iran. Using random sampling, 30 children (10 normal-hearing, 10 deaf and 10 cochlear-implanted) in the age group of 8-12 years were selected. Theory of mind was measured with the 38-item theory of mind scale, and executive function was assessed with the Coolidge neuropsychological and personality test. Research data were analyzed using the Spearman correlation coefficient, analysis of variance and Kruskal-Wallis tests. Results: There was a significant difference between the groups in theory of mind and in the executive function subscales organization, planning/decision-making, and inhibition. Between the normal-hearing and deaf groups (p=0.01), as well as between the cochlear-implanted and deaf groups (p=0.01), there was a significant difference in the planning/decision-making subscale. There was no significant relationship between theory of mind and executive functions in general, or between theory of mind and the executive function subscales, in any of the three groups. Conclusion: Based on our findings, cochlear-implanted and deaf children have lower performance in theory of mind and executive function compared with normal-hearing children.

  14. Effect of care management program structure on implementation: a normalization process theory analysis.

    Science.gov (United States)

    Holtrop, Jodi Summers; Potworowski, Georges; Fitzpatrick, Laurie; Kowalk, Amy; Green, Lee A

    2016-08-15

    Care management in primary care can be effective in helping patients with chronic disease improve their health status; however, primary care practices are often challenged with implementation. Further, there are different ways to structure care management that may make implementation more or less successful. Normalization process theory (NPT) provides a means of understanding how a new complex intervention can become routine (normalized) in practice. In this study, we used NPT to understand how care management structure affected how well care management became routine in practice. Data collection involved semi-structured interviews and observations conducted at 25 practices in five physician organizations in Michigan, USA. Practices were selected to reflect variation in physician organizations, type of care management program, and degree of normalization. Data were transcribed, qualitatively coded and analyzed, initially using an editing approach and then a template approach with NPT as a guiding framework. Seventy interviews and 25 observations were completed. Two key structures for care management organization emerged: practice-based care management, where the care managers were embedded in the practice as part of the practice team; and centralized care management, where the care managers worked independently of the practice work flow and were located outside the practice. There were differences in normalization of care management across practices. Practice-based care management was generally better normalized as compared to centralized care management. Differences in normalization were well explained by the NPT, and in particular the collective action construct. When care managers had multiple and flexible opportunities for communication (interactional workability), had the requisite knowledge, skills, and personal characteristics (skill set workability), and the organizational support and resources (contextual integration), a trusting professional relationship

  15. Application of Power Geometry and Normal Form Methods to the Study of Nonlinear ODEs

    Science.gov (United States)

    Edneral, Victor

    2018-02-01

    This paper describes power transformations of degenerate autonomous polynomial systems of ordinary differential equations which reduce such systems to a non-degenerate form. An example of constructing exact first integrals of motion of a planar degenerate system in closed form is given.

  16. Application of Power Geometry and Normal Form Methods to the Study of Nonlinear ODEs

    Directory of Open Access Journals (Sweden)

    Edneral Victor

    2018-01-01

    Full Text Available This paper describes power transformations of degenerate autonomous polynomial systems of ordinary differential equations which reduce such systems to a non-degenerate form. An example of constructing exact first integrals of motion of a planar degenerate system in closed form is given.

  17. Cognitive Factors in the Choice of Syntactic Form by Aphasic and Normal Speakers of English and Japanese: The Speaker's Impulse.

    Science.gov (United States)

    Menn, Lise; And Others

    This study examined the role of empathy in the choice of syntactic form and the degree of independence of pragmatic and syntactic abilities in a range of aphasic patients. Study 1 involved 9 English-speaking and 9 Japanese-speaking aphasic subjects with 10 English-speaking and 4 Japanese normal controls. Study 2 involved 14 English- and 6…

  18. A simple global representation for second-order normal forms of Hamiltonian systems relative to periodic flows

    International Nuclear Information System (INIS)

    Avendaño-Camacho, M; Vallejo, J A; Vorobjev, Yu

    2013-01-01

    We study the determination of the second-order normal form for perturbed Hamiltonians relative to the periodic flow of the unperturbed Hamiltonian H_0. The formalism presented here is global, and can be easily implemented in any computer algebra system. We illustrate it by means of two examples: the Hénon–Heiles and the elastic pendulum Hamiltonians.
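
    As a small symbolic illustration of the first-order step only (a sketch, not the authors' global formalism): for H = H_0 + εH_1 with H_0 a 1:1 resonant harmonic oscillator, the first-order normal form is the average of H_1 along the periodic flow of H_0. For the cubic Hénon–Heiles perturbation this average vanishes, which is precisely why the second-order term carries the first nontrivial correction. The symbols a1, b1, a2, b2 below are the initial conditions of the unperturbed flow.

      import sympy as sp

      t, a1, b1, a2, b2 = sp.symbols('t a1 b1 a2 b2', real=True)

      # Flow of H_0 = (px**2 + py**2 + x**2 + y**2)/2: each coordinate rotates with period 2*pi
      x = a1 * sp.cos(t) + b1 * sp.sin(t)
      y = a2 * sp.cos(t) + b2 * sp.sin(t)

      # Cubic Henon-Heiles perturbation H_1 = x**2*y - y**3/3
      H1 = x**2 * y - y**3 / 3

      # First-order normal form: time average of H_1 over one period of the unperturbed flow
      H1_avg = sp.integrate(H1, (t, 0, 2 * sp.pi)) / (2 * sp.pi)
      print(sp.simplify(H1_avg))   # prints 0, so the second-order term is the leading correction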

  19. Algorithms for finding Chomsky and Greibach normal forms for a fuzzy context-free grammar using an algebraic approach

    Energy Technology Data Exchange (ETDEWEB)

    Lee, E.T.

    1983-01-01

    Algorithms for the construction of the Chomsky and Greibach normal forms for a fuzzy context-free grammar using the algebraic approach are presented and illustrated by examples. The results obtained in this paper may have useful applications in fuzzy languages, pattern recognition, information storage and retrieval, artificial intelligence, database and pictorial information systems. 16 references.
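
    As a hedged sketch of one core step of the classical construction (for an ordinary, non-fuzzy context-free grammar, and assuming ε- and unit-productions have already been removed): long right-hand sides are binarized and terminals inside long rules are lifted to fresh nonterminals, which is the heart of the Chomsky normal form algorithm the abstract refers to. The grammar encoding and names below are illustrative only.

      from itertools import count

      def to_cnf_core(rules, terminals):
          """Binarize a CFG and lift terminals out of long rules (one step of CNF).

          rules: dict mapping a nonterminal to a list of right-hand sides (tuples of symbols).
          Assumes epsilon- and unit-productions were removed beforehand.
          """
          fresh = count()
          new_rules = {}
          term_var = {}                      # terminal -> nonterminal deriving it

          def var_for(terminal):
              if terminal not in term_var:
                  v = f"T_{terminal}"
                  term_var[terminal] = v
                  new_rules.setdefault(v, []).append((terminal,))
              return term_var[terminal]

          for head, bodies in rules.items():
              for body in bodies:
                  # Replace terminals inside rules of length >= 2 by dedicated nonterminals
                  if len(body) >= 2:
                      body = tuple(var_for(s) if s in terminals else s for s in body)
                  # Binarize: A -> X1 X2 ... Xn becomes a chain of two-symbol rules
                  while len(body) > 2:
                      aux = f"N{next(fresh)}"
                      new_rules.setdefault(aux, []).append(body[-2:])
                      body = body[:-2] + (aux,)
                  new_rules.setdefault(head, []).append(body)
          return new_rules

      # Toy grammar S -> a S b | a b (a, b terminals), already epsilon- and unit-free
      g = {"S": [("a", "S", "b"), ("a", "b")]}
      print(to_cnf_core(g, terminals={"a", "b"}))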

  20. Semi-analytical quasi-normal mode theory for the local density of states in coupled photonic crystal cavity-waveguide structures

    DEFF Research Database (Denmark)

    de Lasson, Jakob Rosenkrantz; Kristensen, Philip Trøst; Mørk, Jesper

    2015-01-01

    We present and validate a semi-analytical quasi-normal mode (QNM) theory for the local density of states (LDOS) in coupled photonic crystal (PhC) cavity-waveguide structures. By means of an expansion of the Green's function on one or a few QNMs, a closed-form expression for the LDOS is obtained, and for two types of two-dimensional PhCs, with one and two cavities side-coupled to an extended waveguide, the theory is validated against numerically exact computations. For the single cavity, a slightly asymmetric spectrum is found, which the QNM theory reproduces, and for two cavities a non-trivial spectrum with a peak and a dip is found, which is reproduced only when including both the two relevant QNMs in the theory. In both cases, we find relative errors below 1% in the bandwidth of interest.

  1. Theory of superconductivity. II. Excited Cooper pairs. Why does sodium remain normal down to 0 K?

    International Nuclear Information System (INIS)

    Fujita, S.

    1992-01-01

    Based on a generalized BCS Hamiltonian in which the interaction strengths (V_11, V_22, V_12) among and between electron (1) and hole (2) Cooper pairs are differentiated, the thermodynamic properties of a type-I superconductor below the critical temperature T_c are investigated. An expression for the ground-state energy, W - W_0, relative to the unperturbed Bloch system is obtained. The usual BCS formulas are obtained in the limits: (all) V_jl = V_0, N_1(0) = N_2(0). Any excitations generated through the BCS interaction Hamiltonian containing V_jl must involve Cooper pairs of antiparallel spins and nearly opposite momenta. The nonzero-momentum or excited Cooper pairs below T_c are shown to have an excitation energy band minimum lower than the quasi-electrons, which were regarded as the elementary excitations in the original BCS theory. The energy gap ε_g(T), defined relative to excited and zero-momentum Cooper pairs (when V_jl > 0), decreases from ε_g(0) to 0 as the temperature T is raised from 0 to T_c. If electrons only are available, as in a monovalent metal like sodium (V_12 = 0), the energy constant Δ_1 is finite but the energy gap vanishes identically for all T. In agreement with the BCS theory, the present theory predicts that a pure nonmagnetic metal in any dimensions should have a Cooper-pair ground state whose energy is lower than that of the Bloch ground state. Additionally it predicts that a monovalent metal should remain normal down to 0 K, and that there should be no strictly one-dimensional superconductor

  2. Shear Stress-Normal Stress (Pressure) Ratio Decides Forming Callus in Patients with Diabetic Neuropathy

    Science.gov (United States)

    Noguchi, Hiroshi; Takehara, Kimie; Ohashi, Yumiko; Suzuki, Ryo; Yamauchi, Toshimasa; Kadowaki, Takashi; Sanada, Hiromi

    2016-01-01

    Aim. Callus is a risk factor leading to severe diabetic foot ulcer; thus, prevention of callus formation is important. However, the normal stress (pressure) and shear stress associated with callus have not been clarified. Additionally, as a new variable, the shear stress-normal stress (pressure) ratio (SPR) was examined. The purpose was to clarify the external forces associated with callus formation in patients with diabetic neuropathy. Methods. The external forces at the 1st, 2nd, and 5th metatarsal heads (MTH), as callus predilection regions, were measured. The SPR was calculated by dividing shear stress by normal stress (pressure), specifically as peak values (SPR-p) and time integral values (SPR-i). The optimal cut-off point was determined. Results. The callus formation regions of the 1st and 2nd MTH had higher SPR-i than the non-callus formation regions. The cut-off value for the 1st MTH was 0.60 and for the 2nd MTH was 0.50. For the 5th MTH, variables pertaining to the external forces could not be determined to be indicators of callus formation because of low accuracy. Conclusions. The callus formation cut-off values of the 1st and 2nd MTH were clarified. In the future, it will be necessary to confirm the effect of using appropriate footwear and gait training on lowering SPR-i. PMID:28050567

  3. Shear Stress-Normal Stress (Pressure Ratio Decides Forming Callus in Patients with Diabetic Neuropathy

    Directory of Open Access Journals (Sweden)

    Ayumi Amemiya

    2016-01-01

    Full Text Available Aim. Callus is a risk factor leading to severe diabetic foot ulcer; thus, prevention of callus formation is important. However, the normal stress (pressure) and shear stress associated with callus have not been clarified. Additionally, as a new variable, the shear stress-normal stress (pressure) ratio (SPR) was examined. The purpose was to clarify the external forces associated with callus formation in patients with diabetic neuropathy. Methods. The external forces at the 1st, 2nd, and 5th metatarsal heads (MTH), as callus predilection regions, were measured. The SPR was calculated by dividing shear stress by normal stress (pressure), specifically as peak values (SPR-p) and time integral values (SPR-i). The optimal cut-off point was determined. Results. The callus formation regions of the 1st and 2nd MTH had higher SPR-i than the non-callus formation regions. The cut-off value for the 1st MTH was 0.60 and for the 2nd MTH was 0.50. For the 5th MTH, variables pertaining to the external forces could not be determined to be indicators of callus formation because of low accuracy. Conclusions. The callus formation cut-off values of the 1st and 2nd MTH were clarified. In the future, it will be necessary to confirm the effect of using appropriate footwear and gait training on lowering SPR-i.
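
    A minimal sketch (with made-up numbers, not the study's measurements) of the two quantities defined above: the peak ratio SPR-p and the time-integral ratio SPR-i, computed from shear stress and plantar pressure sampled over one stance phase.

      import numpy as np

      def spr_metrics(shear, pressure, dt):
          """Peak and time-integral shear/pressure ratios (SPR-p, SPR-i) for one stance phase."""
          spr_p = shear.max() / pressure.max()               # ratio of the peak values
          # Time integrals approximated by the rectangle rule; dt cancels in the ratio
          spr_i = (shear.sum() * dt) / (pressure.sum() * dt)
          return spr_p, spr_i

      # Hypothetical stance-phase signals sampled at 100 Hz (illustrative values only)
      t = np.linspace(0.0, 0.6, 61)
      pressure = 200.0 * np.sin(np.pi * t / 0.6)         # kPa, normal stress under the MTH
      shear = 90.0 * np.sin(np.pi * t / 0.6) ** 2        # kPa, shear stress under the MTH

      spr_p, spr_i = spr_metrics(shear, pressure, dt=0.01)
      print(f"SPR-p = {spr_p:.2f}, SPR-i = {spr_i:.2f}")
      # The study's reported callus cut-offs were 0.60 (1st MTH) and 0.50 (2nd MTH)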

  4. Evaluating Complex Interventions and Health Technologies Using Normalization Process Theory: Development of a Simplified Approach and Web-Enabled Toolkit

    LENUS (Irish Health Repository)

    May, Carl R

    2011-09-30

    Abstract Background: Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users manual. Methods: Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'. Results: On-line data collection was effective: over a four week period 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web-enabled toolkit produced 13 responses, from 327 visits to http://www.normalizationprocess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result we were able to translate the core constructs into a simplified set of statements that could be utilized by non-experts. Conclusion: Normalization Process Theory has been developed through

  5. Evaluating complex interventions and health technologies using normalization process theory: development of a simplified approach and web-enabled toolkit

    Directory of Open Access Journals (Sweden)

    Murray Elizabeth

    2011-09-01

    Full Text Available Abstract Background: Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users manual. Methods: Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'. Results: On-line data collection was effective: over a four week period 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web-enabled toolkit produced 13 responses, from 327 visits to http://www.normalizationprocess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result we were able to translate the core constructs into a simplified set of statements that could be utilized by non-experts. Conclusion: Normalization Process Theory has been developed through

  6. Three-form periods on Calabi-Yau fourfolds: toric hypersurfaces and F-theory applications

    Energy Technology Data Exchange (ETDEWEB)

    Greiner, Sebastian; Grimm, Thomas W. [Institute for Theoretical Physics and Center for Extreme Matter and Emergent Phenomena, Utrecht University,Leuvenlaan 4, 3584 CE Utrecht (Netherlands); Max-Planck-Institut für Physik, Föhringer Ring 6, 80805 Munich (Germany)

    2017-05-30

    The study of the geometry of Calabi-Yau fourfolds is relevant for compactifications of string theory, M-theory, and F-theory to various dimensions. This work introduces the mathematical machinery to derive the complete moduli dependence of the periods of non-trivial three-forms for fourfolds realized as hypersurfaces in toric ambient spaces. It sets the stage to determine Picard-Fuchs-type differential equations and integral expressions for these forms. The key tool is the observation that non-trivial three-forms on fourfold hypersurfaces in toric ambient spaces always stem from divisors that are built out of trees of toric surfaces fibered over Riemann surfaces. The three-form periods are then non-trivially related to the one-form periods of these Riemann surfaces. In general, the three-form periods are known to vary holomorphically over the complex structure moduli space and play an important role in the effective actions arising in fourfold compactifications. We discuss two explicit example fourfolds for F-theory compactifications in which the three-form periods determine axion decay constants.

  7. Hipertekstualnost kao oblik digitalizacije teorije / Hypertextuality as a Form of Digitalization of Theory

    Directory of Open Access Journals (Sweden)

    Predrag Rodić

    2012-12-01

    Full Text Available The introduction of the notion of hypertextuality (as the key concept of the new media) into the fields of visual arts, film, fashion, photography, postdramatic theatre, dance and music has resulted in the formation of constructivist or soft art theory. The basic method of art theory thus labelled is reading, and the (hyper)text is the fundamental category founded on the concept of the sign and on the discursively encoded order of signs in textual form.

  8. Number theory and modular forms papers in memory of Robert A Rankin

    CERN Document Server

    Ono, Ken

    2003-01-01

    Robert A. Rankin, one of the world's foremost authorities on modular forms and a founding editor of The Ramanujan Journal, died on January 27, 2001, at the age of 85. Rankin had broad interests and contributed fundamental papers in a wide variety of areas within number theory, geometry, analysis, and algebra. To commemorate Rankin's life and work, the editors have collected together 25 papers by several eminent mathematicians reflecting Rankin's extensive range of interests within number theory. Many of these papers reflect Rankin's primary focus in modular forms. It is the editors' fervent hope that mathematicians will be stimulated by these papers and gain a greater appreciation for Rankin's contributions to mathematics. This volume would be an inspiration to students and researchers in the areas of number theory and modular forms.

  9. Generation of Strategies for Environmental Deception in Two-Player Normal-Form Games

    Science.gov (United States)

    2015-06-18

    found in the literature is presented by Kohlberg and Mertens [23]. A stable equilibrium by their definition is an equilibrium in an extensive-form...the equilibrium in this state provides them with an increased payoff. While interesting, Kohlberg and Mertens’ definition of equilibrium...stability used by Kohlberg and Mertens. Arsham’s work focuses on determining the amount by which a mixed-strategy Nash equilibrium’s payoff values can

  10. Examination of the neighborhood activation theory in normal and hearing-impaired listeners.

    Science.gov (United States)

    Dirks, D D; Takayanagi, S; Moshfegh, A; Noffsinger, P D; Fausti, S A

    2001-02-01

    well as to an elderly group of listeners with sensorineural hearing loss in the speech-shaped noise (Experiment 3). The results of three experiments verified predictions of NAM in both normal hearing and hearing-impaired listeners. In each experiment, words from low density neighborhoods were recognized more accurately than those from high density neighborhoods. The presence of high frequency neighbors (average neighborhood frequency) produced poorer recognition performance than comparable conditions with low frequency neighbors. Word frequency was found to have a highly significant effect on word recognition. Lexical conditions with high word frequencies produced higher performance scores than conditions with low frequency words. The results supported the basic tenets of NAM theory and identified both neighborhood structural properties and word frequency as significant lexical factors affecting word recognition when listening in noise and "in quiet." The results of the third experiment permit extension of NAM theory to individuals with sensorineural hearing loss. Future development of speech recognition tests should allow for the effects of higher level cognitive (lexical) factors on lower level phonemic processing.
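
    A small sketch of the lexical statistics that the neighborhood activation model builds on (a toy orthographic lexicon is assumed here; the studies above use phonemic transcriptions and corpus word frequencies): a word's neighbors are the lexicon entries differing from it by a single substitution, insertion, or deletion, and neighborhood frequency is the mean frequency of those neighbors.

      def is_neighbor(w1, w2):
          """True if w1 and w2 differ by one substitution, insertion, or deletion."""
          if w1 == w2:
              return False
          if len(w1) == len(w2):                        # single substitution
              return sum(a != b for a, b in zip(w1, w2)) == 1
          if abs(len(w1) - len(w2)) == 1:               # single insertion or deletion
              shorter, longer = sorted((w1, w2), key=len)
              return any(longer[:i] + longer[i + 1:] == shorter for i in range(len(longer)))
          return False

      def neighborhood_stats(word, lexicon):
          """Neighborhood density and mean neighbor frequency for a target word."""
          neighbors = [w for w in lexicon if is_neighbor(word, w)]
          density = len(neighbors)
          mean_freq = sum(lexicon[w] for w in neighbors) / density if density else 0.0
          return density, mean_freq

      # Toy lexicon: word -> frequency per million (illustrative values)
      lexicon = {"cat": 40.0, "bat": 12.0, "cut": 25.0, "cast": 8.0, "at": 900.0, "dog": 60.0}
      print(neighborhood_stats("cat", lexicon))   # density 4 (bat, cut, cast, at), mean 236.25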

  11. A Novel Higher-Order Shear and Normal Deformable Plate Theory for the Static, Free Vibration and Buckling Analysis of Functionally Graded Plates

    Directory of Open Access Journals (Sweden)

    Shi-Chao Yi

    2017-01-01

    Full Text Available A closed-form solution of a special higher-order shear and normal deformable plate theory is presented for the static response, natural frequencies, and buckling behaviour of simply supported functionally graded material (FGM) plates. The distinguishing feature of the new plate theory is its uniqueness: each individual FGM plate has specific characteristics, such as its material properties and length-to-thickness ratio, and these attributes determine a set of orthogonal polynomials, which in turn form a plate theory exclusive to that plate. Thus, the novel plate theory has two merits: one is orthogonality, whereby the majority of the coefficients of the equations derived from Hamilton's principle are zero; the other is flexibility, whereby the order of the plate theory can be set arbitrarily. Numerical examples with different plate shapes are presented and the results are compared with reference solutions available in the literature. Several aspects of the model involving relevant parameters, such as the length-to-thickness and stiffness ratios, under static and dynamic conditions are analyzed in detail. As a consequence, the applicability and effectiveness of the present method for accurately computing the deflection, stresses, natural frequencies, and buckling response of various FGM plates are demonstrated.

  12. Process evaluation of discharge planning implementation in healthcare using normalization process theory.

    Science.gov (United States)

    Nordmark, Sofi; Zingmark, Karin; Lindberg, Inger

    2016-04-27

    Discharge planning is a care process that aims to secure the transfer of care for the patient at the transition from home to hospital and back home. Information exchange and collaboration between care providers are essential, but deficits are common. A wide range of initiatives to improve the discharge planning process have been developed and implemented over the past three decades. However, there are still high rates of reported medical errors and adverse events related to failures in discharge planning. Using theoretical frameworks such as Normalization Process Theory (NPT) can support evaluations of complex interventions and processes in healthcare. The aim of this study was to explore the embedding and integration of the discharge planning process (DPP) from the perspective of registered nurses, district nurses and homecare organizers. The study design was explorative, using the NPT as a framework to explore the embedding and integration of the DPP. Data consisted of written documentation from workshops with staff, registered adverse events and system failures, a web-based survey, and individual interviews with staff. Using the NPT as a framework to explore the embedding and integration of discharge planning after 10 years in use showed that the staff had reached a consensus of opinion on what the process was (coherence) and how they evaluated the process (reflexive monitoring). However, they had not reached a consensus of opinion on who performed the process (cognitive participation) and how it was performed (collective action). This could be interpreted as meaning that the process had not become normalized in daily practice. The results show the necessity of observing the implementation of old practices, to better understand the needs of new ones, before developing and implementing new practices or supportive tools within healthcare, in order to reach the aim of development and to accomplish sustainable implementation. The NPT offers a generalizable framework for analysis, which can explain and shape the

  13. Theories, Methods and Numerical Technology of Sheet Metal Cold and Hot Forming Analysis, Simulation and Engineering Applications

    CERN Document Server

    Hu, Ping; Liu, Li-zhong; Zhu, Yi-guo

    2013-01-01

    Over the last 15 years, the application of innovative steel concepts in the automotive industry has increased steadily. Numerical simulation technology for the hot forming of high-strength steel allows engineers to modify the formability of hot-forming steels and to optimize die design schemes. Theories, Methods and Numerical Technology of Sheet Metal Cold and Hot Forming focuses on hot and cold forming theories, numerical methods, and the related simulation and experimental techniques for high-strength steel forming and die design in the automobile industry. Theories, Methods and Numerical Technology of Sheet Metal Cold and Hot Forming introduces the general theories of cold forming, then expands upon advanced hot forming theories and simulation methods, including: • the forming process, • constitutive equations, • hot boundary constraint treatment, and • hot forming equipment and experiments. Various calculation methods of cold and hot forming, based on the authors' experience in commercial CAE software f...

  14. Primes of the form x²+ny²: Fermat, class field theory, and complex multiplication

    CERN Document Server

    Cox, David A

    2014-01-01

    An exciting approach to the history and mathematics of number theory. ". . . the author's style is totally lucid and very easy to read . . . the result is indeed a wonderful story." -Mathematical Reviews. Written in a unique and accessible style for readers of varied mathematical backgrounds, the Second Edition of Primes of the Form p = x² + ny² details the history behind how Pierre de Fermat's work ultimately gave birth to quadratic reciprocity and the genus theory of quadratic forms. The book also illustrates how results of Euler and Gauss can be fully understood only in the context of class fi

  15. Study on electric parameters of wild and cultivated cotton forms being in normal state and irradiated

    International Nuclear Information System (INIS)

    Nazirov, N.N.; Kamalov, N.; Norbaev, N.

    1978-01-01

    The effect of radiation on the electric conductivity of tissues under alternating current, on electrical capacitance, and on cell impedance has been studied. Gamma irradiation of seedlings results in definite changes in the electric parameters of cells (electric conductivity, electric capacitance, impedance). Especially strong changes are revealed during gamma irradiation of the radiosensitive wild form of cotton plants. The deviation of the cell electric parameters from the norm depends on the disruption of the evolutionarily established ion heterogeneity and of the state of the cell colloid system, which results in changes in cell structure and metabolism.

  16. First-order systems of linear partial differential equations: normal forms, canonical systems, transform methods

    Directory of Open Access Journals (Sweden)

    Heinz Toparkus

    2014-04-01

    Full Text Available In this paper we consider first-order systems with constant coefficients for two real-valued functions of two real variables. This is both a problem in itself and an alternative view of the classical linear partial differential equations of second order with constant coefficients. The classification of the systems is done using elementary methods of linear algebra. Each type has its special canonical form in the associated characteristic coordinate system. Initial value problems can then be formulated in appropriate basic domains, and solutions to these problems can be sought by means of transform methods.
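
    The elementary linear-algebra classification mentioned in the abstract can be sketched as follows (one common convention, assumed for illustration; the paper's own canonical forms are not reproduced here): for a constant-coefficient system u_y = A u_x with a real 2x2 matrix A, the type is read off from the eigenvalues of A.

```python
# Illustrative sketch (one common convention, not code from the paper): classify a
# constant-coefficient system u_y = A u_x for two unknown functions by the eigenvalues
# of the 2x2 matrix A (diagonalizability is not checked in this simple sketch).
import numpy as np

def classify(A, tol=1e-12):
    eig = np.linalg.eigvals(np.asarray(A, dtype=float))
    if abs(eig[0].imag) > tol:                 # complex pair -> elliptic (Laplace-like)
        return "elliptic"
    if abs(eig[0].real - eig[1].real) > tol:   # real, distinct -> hyperbolic (wave-like)
        return "hyperbolic"
    return "parabolic"                         # repeated real eigenvalue -> parabolic

if __name__ == "__main__":
    cauchy_riemann = [[0.0, -1.0], [1.0, 0.0]]   # u_y = -v_x, v_y = u_x
    wave_like      = [[0.0,  1.0], [1.0, 0.0]]
    print(classify(cauchy_riemann))  # elliptic
    print(classify(wave_like))       # hyperbolic
```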

  17. A Numerical Theory for Impedance Eduction in Three-Dimensional Normal Incidence Tubes

    Science.gov (United States)

    Watson, Willie R.; Jones, Michael G.

    2016-01-01

    A method for educing the locally reacting acoustic impedance of a test sample mounted in a 3-D normal incidence impedance tube is presented and validated. The unique feature of the method is that the excitation frequency (or duct geometry) may be such that high-order duct modes may exist. The method educes the impedance iteratively, by minimizing an objective function consisting of the difference between the measured and numerically computed acoustic pressure at preselected measurement points in the duct. The method is validated on planar and high-order mode sources with data synthesized from exact mode theory. These data are then subjected to random jitter to simulate the effects of measurement uncertainties on the educed impedance spectrum. The primary conclusions of the study are 1) without random jitter, the educed impedance is in excellent agreement with that of known impedance samples, and 2) random jitter comparable to that found in a typical experiment has minimal impact on the accuracy of the educed impedance.
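
    The eduction idea, minimizing the mismatch between measured and modeled pressures, can be sketched generically; the one-dimensional duct model below is a hypothetical placeholder and not the numerical propagation code used in the paper.

```python
# Generic sketch of impedance eduction (the duct model is a hypothetical placeholder,
# not the paper's 3-D numerical model): educe a complex impedance by minimizing the
# mismatch between "measured" and modeled acoustic pressures at preselected points.
import numpy as np
from scipy.optimize import minimize

def model_pressure(zeta, x):
    """Hypothetical stand-in for the numerical duct propagation model."""
    k = 2.0 * np.pi                                   # nondimensional wavenumber (assumed)
    return np.exp(-1j * k * x) + (1.0 - zeta) / (1.0 + zeta) * np.exp(1j * k * x)

def objective(params, x, p_meas):
    zeta = params[0] + 1j * params[1]                 # normalized resistance + j*reactance
    return np.sum(np.abs(model_pressure(zeta, x) - p_meas) ** 2)

if __name__ == "__main__":
    x = np.linspace(0.0, 0.5, 8)                      # measurement points along the duct
    true_zeta = 1.2 + 0.7j
    rng = np.random.default_rng(0)
    p_meas = model_pressure(true_zeta, x) + 0.01 * rng.standard_normal(x.size)  # jittered data
    res = minimize(objective, x0=[1.0, 0.0], args=(x, p_meas), method="Nelder-Mead")
    print("educed impedance:", res.x[0] + 1j * res.x[1])
```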

  18. Factorization of heavy-to-light form factors in soft-collinear effective theory

    CERN Document Server

    Beneke, Martin; Feldmann, Th.

    2004-01-01

    Heavy-to-light transition form factors at large recoil energy of the light meson have been conjectured to obey a factorization formula, where the set of form factors is reduced to a smaller number of universal form factors up to hard-scattering corrections. In this paper we extend our previous investigation of heavy-to-light currents in soft-collinear effective theory to final states with invariant mass Lambda^2 as is appropriate to exclusive B meson decays. The effective theory contains soft modes and two collinear modes with virtualities of order m_b*Lambda (`hard-collinear') and Lambda^2. Integrating out the hard-collinear modes results in the hard spectator-scattering contributions to exclusive B decays. We discuss the representation of heavy-to-light currents in the effective theory after integrating out the hard-collinear scale, and show that the previously conjectured factorization formula is valid to all orders in perturbation theory. The naive factorization of matrix elements in the effective theory ...

  19. Entropy generation and momentum transfer in the superconductor-normal and normal-superconductor phase transformations and the consistency of the conventional theory of superconductivity

    Science.gov (United States)

    Hirsch, J. E.

    2018-05-01

    Since the discovery of the Meissner effect, the superconductor to normal (S-N) phase transition in the presence of a magnetic field is understood to be a first-order phase transformation that is reversible under ideal conditions and obeys the laws of thermodynamics. The reverse (N-S) transition is the Meissner effect. This implies in particular that the kinetic energy of the supercurrent is not dissipated as Joule heat in the process where the superconductor becomes normal and the supercurrent stops. In this paper, we analyze the entropy generation and the momentum transfer between the supercurrent and the body in the S-N transition and the N-S transition as described by the conventional theory of superconductivity. We find that it is not possible to explain the transition in a way that is consistent with the laws of thermodynamics unless the momentum transfer between the supercurrent and the body occurs with zero entropy generation, for which the conventional theory of superconductivity provides no mechanism. Instead, we point out that the alternative theory of hole superconductivity does not encounter such difficulties.

  20. Geometric representation of the generator of duality in massless and massive p-form field theories

    International Nuclear Information System (INIS)

    Contreras, Ernesto; Martinez, Yisely; Leal, Lorenzo

    2010-01-01

    We study the invariance under duality transformations in massless and massive p-form field theories and obtain the Noether generators of the infinitesimal transformations that correspond to this symmetry. These generators can be realized in geometrical representations that generalize the loop representation of the Maxwell field, allowing for a geometrical interpretation which is studied.

  1. Towards a Robuster Interpretive Parsing: learning from overt forms in Optimality Theory

    NARCIS (Netherlands)

    Biró, T.

    2013-01-01

    The input data to grammar learning algorithms often consist of overt forms that do not contain full structural descriptions. This lack of information may contribute to the failure of learning. Past work on Optimality Theory introduced Robust Interpretive Parsing (RIP) as a partial solution to this

  2. A resource-based theory of market structure and organizational form

    NARCIS (Netherlands)

    van Witteloostuijn, A.; Boone, C.A.J.J.

    We argue that combining the insights from both the industrial organization and organizational ecology perspectives is likely to produce value added. We develop a resource-based theory of market structure, where resources pertain to the environmental assets (together forming the resource space)

  3. Theory of Inclusive Scattering of Polarized Electrons by Polarized $^{3}$He and the Neutron Form Factors

    OpenAIRE

    Atti, C. Ciofi degli; Pace, E.; Salmé, G.

    1993-01-01

    The theory of inclusive lepton scattering of polarized leptons by polarized J = 1/2 hadrons is presented and the origin of different expressions for the polarized nuclear response function appearing in the literature is explained. The sensitivity of the longitudinal asymmetry upon the neutron form factors is investigated.

  4. Simple Theory for the Dynamics of Mean-Field-Like Models of Glass-Forming Fluids

    Science.gov (United States)

    Szamel, Grzegorz

    2017-10-01

    We propose a simple theory for the dynamics of model glass-forming fluids, which should be solvable using a mean-field-like approach. The theory is based on transparent physical assumptions, which can be tested in computer simulations. The theory predicts an ergodicity-breaking transition that is identical to the so-called dynamic transition predicted within the replica approach. Thus, it can provide the missing dynamic component of the random first order transition framework. In the large-dimensional limit the theory reproduces the result of a recent exact calculation of Maimbourg et al. [Phys. Rev. Lett. 116, 015902 (2016), 10.1103/PhysRevLett.116.015902]. Our approach provides an alternative, physically motivated derivation of this result.

  5. A qualitative systematic review of studies using the normalization process theory to research implementation processes.

    Science.gov (United States)

    McEvoy, Rachel; Ballini, Luciana; Maltoni, Susanna; O'Donnell, Catherine A; Mair, Frances S; Macfarlane, Anne

    2014-01-02

    There is a well-recognized need for greater use of theory to address research translational gaps. Normalization Process Theory (NPT) provides a set of sociological tools to understand and explain the social processes through which new or modified practices of thinking, enacting, and organizing work are implemented, embedded, and integrated in healthcare and other organizational settings. This review of NPT offers readers the opportunity to observe how, and in what areas, a particular theoretical approach to implementation is being used. In this article we review the literature on NPT in order to understand what interventions NPT is being used to analyze, how NPT is being operationalized, and the reported benefits, if any, of using NPT. Using a framework analysis approach, we conducted a qualitative systematic review of peer-reviewed literature using NPT. We searched 12 electronic databases and all citations linked to six key NPT development papers. Grey literature/unpublished studies were not sought. Limitations of English language, healthcare setting and year of publication 2006 to June 2012 were set. Twenty-nine articles met the inclusion criteria; in the main, NPT is being applied to qualitatively analyze a diverse range of complex interventions, many beyond its original field of e-health and telehealth. The NPT constructs have high stability across settings and, notwithstanding challenges in applying NPT in terms of managing overlaps between constructs, there is evidence that it is a beneficial heuristic device to explain and guide implementation processes. NPT offers a generalizable framework that can be applied across contexts with opportunities for incremental knowledge gain over time and an explicit framework for analysis, which can explain and potentially shape implementation processes. This is the first review of NPT in use and it generates an impetus for further and extended use of NPT. We recommend that in future NPT research, authors should explicate

  6. On the use and computation of the Jordan canonical form in system theory

    Science.gov (United States)

    Sridhar, B.; Jordan, D.

    1974-01-01

    This paper investigates various aspects of the application of the Jordan canonical form of a matrix in system theory and develops a computational approach to determining the Jordan form for a given matrix. Applications include pole placement, controllability and observability studies, serving as an intermediate step in yielding other canonical forms, and theorem proving. The computational method developed in this paper is both simple and efficient. The method is based on the definition of a generalized eigenvector and a natural extension of Gauss elimination techniques. Examples are included for demonstration purposes.
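
    For comparison, the Jordan form and the accompanying transformation matrix can be obtained symbolically with a library routine; the short example below uses sympy rather than the Gauss-elimination-based algorithm developed in the paper.

```python
# Small example (library routine, not the paper's algorithm): compute the Jordan
# canonical form J and transformation P with A = P * J * P^-1, using exact arithmetic
# so a defective (repeated) eigenvalue is handled reliably.
import sympy as sp

A = sp.Matrix([[ 5,  4,  2,  1],
               [ 0,  1, -1, -1],
               [-1, -1,  3,  0],
               [ 1,  1, -1,  2]])   # eigenvalues 1, 2, 4, 4 with a 2x2 block for 4

P, J = A.jordan_form()               # sympy returns (P, J) with A = P*J*P**-1
sp.pprint(J)
assert (P * J * P.inv() - A) == sp.zeros(4, 4)
```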

  7. Geometric Methods in the Algebraic Theory of Quadratic Forms : Summer School

    CERN Document Server

    2004-01-01

    The geometric approach to the algebraic theory of quadratic forms is the study of projective quadrics over arbitrary fields. Function fields of quadrics have been central to the proofs of fundamental results since the renewal of the theory by Pfister in the 1960's. Recently, more refined geometric tools have been brought to bear on this topic, such as Chow groups and motives, and have produced remarkable advances on a number of outstanding problems. Several aspects of these new methods are addressed in this volume, which includes - an introduction to motives of quadrics by Alexander Vishik, with various applications, notably to the splitting patterns of quadratic forms under base field extensions; - papers by Oleg Izhboldin and Nikita Karpenko on Chow groups of quadrics and their stable birational equivalence, with application to the construction of fields which carry anisotropic quadratic forms of dimension 9, but none of higher dimension; - a contribution in French by Bruno Kahn which lays out a general fra...

  8. Form factor of relativistic two-particle system and covariant hamiltonian formulation of quantum field theory

    International Nuclear Information System (INIS)

    Skachkov, N.; Solovtsov, I.

    1979-01-01

    Based on the Hamiltonian formulation of quantum field theory proposed by Kadyshevsky, a three-dimensional relativistic approach is developed for describing the form factors of composite systems. The main features of the diagram technique appearing in the covariant Hamiltonian formulation of field theory are discussed. The three-dimensional relativistic equation for the vertex function is derived and its connection with that for the quasipotential wave function is found. Expressions are obtained for the form factor of the system in terms of equal-time two-particle wave functions, both in the momentum and in the relativistic configurational representations. An explicit expression for the form factor is found for the case of two-particle interaction through the Coulomb potential

  9. Post-UV colony-forming ability of normal fibroblast strains and of the xeroderma pigmentosum group G strain

    International Nuclear Information System (INIS)

    Barrett, S.F.; Tarone, R.E.; Moshell, A.N.; Ganges, M.B.; Robbins, J.H.

    1981-01-01

    In xeroderma pigmentosum, an inherited disorder of defective DNA repair, post-uv colony-forming ability of fibroblasts from patients in complementation groups A through F correlates with the patients' neurological status. The first xeroderma pigmentosum patient assigned to the recently discovered group G had the neurological abnormalities of XP. Researchers have determined the post-uv colony-forming ability of cultured fibroblasts from this patient and from 5 more control donors. Log-phase fibroblasts were irradiated with 254 nm uv light from a germicidal lamp, trypsinized, and replated at known densities. After 2 to 4 weeks' incubation the cells were fixed, stained and scored for colony formation. The strains' post-uv colony-forming ability curves were obtained by plotting the log of the percent remaining post-uv colony-forming ability as a function of the uv dose. The post-uv colony-forming ability of 2 of the 5 new normal strains was in the previously defined control donor zone, but that of the other 3 extended down to the level of the most resistant xeroderma pigmentosum strain. The post-uv colony-forming ability curve of the group G fibroblasts was not significantly different from the curves of the group D fibroblast strains from patients with clinical histories similar to that of the group G patient

  10. Bonding in Mercury Molecules Described by the Normalized Elimination of the Small Component and Coupled Cluster Theory

    NARCIS (Netherlands)

    Cremer, Dieter; Kraka, Elfi; Filatov, Michael

    2008-01-01

    Bond dissociation energies (BDEs) of neutral HgX and cationic HgX(+) molecules range from less than a kcal mol(-1) to as much as 60 kcal mol(-1). Using NESC/CCSD(T) [normalized elimination of the small component and coupled-cluster theory with all single and double excitations and a perturbative

  11. The method of normal forms for singularly perturbed systems of Fredholm integro-differential equations with rapidly varying kernels

    Energy Technology Data Exchange (ETDEWEB)

    Bobodzhanov, A A; Safonov, V F [National Research University " Moscow Power Engineering Institute" , Moscow (Russian Federation)

    2013-07-31

    The paper deals with extending the Lomov regularization method to classes of singularly perturbed Fredholm-type integro-differential systems, which have not so far been studied. In these the limiting operator is discretely noninvertible. Such systems are commonly known as problems with unstable spectrum. Separating out the essential singularities in the solutions to these problems presents great difficulties. The principal one is to give an adequate description of the singularities induced by 'instability points' of the spectrum. A methodology for separating singularities by using normal forms is developed. It is applied to the above type of systems and is substantiated in these systems. Bibliography: 10 titles.

  12. The three-loop form factor in N=4 super Yang-Mills theory

    Energy Technology Data Exchange (ETDEWEB)

    Gehrmann, Thomas [Universitaet Zuerich (Switzerland); Henn, Johannes [IAS Princeton (United States); Huber, Tobias [Universitaet Siegen (Germany)

    2012-07-01

    We present the calculation of the Sudakov form factor in N=4 super Yang-Mills theory to the three-loop order. At leading colour, the latter is expressed in terms of planar and non-planar loop integrals. We show that it is possible to choose a representation in which each loop integral has uniform transcendentality in the Riemann ζ-function. We comment on the expected exponentiation of the infrared divergences and the values of the three-loop cusp and collinear anomalous dimensions in dimensional regularisation. We also compare the form factor in N=4 super Yang-Mills to the leading transcendentality pieces of the quark and gluon form factor in QCD. Finally, we investigate the ultraviolet properties of the form factor in D>4 dimensions.

  13. Evaluation of Forming Limit by the 3 Dimensional Local Bifurcation Theory

    International Nuclear Information System (INIS)

    Nishimura, Ryuichi; Nakazawa, Yoshiaki; Ito, Koichi; Uemura, Gen; Mori, Naomichi

    2007-01-01

    A theoretical method for predicting and evaluating sheet metal formability is developed on the basis of the three-dimensional local bifurcation theory previously proposed by the authors. The forming limit diagram, represented on the plane defined by the ratio of stress component to work-hardening rate, is completely independent of the plastic strain history. The upper and lower limits of sheet formability are indicated by the 3D critical line and the Stoeren-Rice critical line on this plane, respectively. In order to verify this behaviour of the proposed forming limit diagram, experimental research is also conducted. From the standpoint of mechanical instability theory, a new concept called the instability factor is introduced. It represents the degree to which the current stress accelerates the development of the local bifurcation mode toward fracture. The instability factor provides a way to evaluate a forming allowance, which is useful for properly identifying a forming limit and for optimizing the forming conditions. The proposed criterion provides not only the moment at which necking initiates but also the local bifurcation mode vector and the direction of the necking line

  14. Calculation of the hyperfine interaction using an effective-operator form of many-body theory

    International Nuclear Information System (INIS)

    Garpman, S.; Lindgren, I.; Lindgren, J.; Morrison, J.

    1975-01-01

    The effective-operator form of many-body theory is reviewed and applied to the calculation of the hyperfine structure. Numerical results are given for the 2p, 3p, and 4p excited states of Li and the 3p state of Na. This is the first complete calculation of the hyperfine structure using an effective-operator form of perturbation theory. As in the Brueckner-Goldstone form of many-body theory, the various terms in the perturbation expansion are represented by Feynman diagrams which correspond to basic physical processes. The angular part of the perturbation diagrams is evaluated by taking advantage of the formal analogy between the Feynman diagrams and the angular-momentum diagrams introduced by Jucys et al. The radial part of the diagrams is calculated by solving one- and two-particle equations for the particular linear combinations of excited states that contribute to the Feynman diagrams. In this way all second- and third-order effects are accurately evaluated without explicitly constructing the excited orbitals. For the 2p state of Li our results are in agreement with the calculations of Nesbet and of Hameed and Foley. However, our quadrupole calculation disagrees with the work of Das and co-workers. The many-body results for Li and Na are compared with semiempirical methods for evaluating the quadrupole moment from the hyperfine interaction, and a new quadrupole moment of 23Na is given

  15. Imagine-Self Perspective-Taking and Rational Self-Interested Behavior in a Simple Experimental Normal-Form Game

    Directory of Open Access Journals (Sweden)

    Adam Karbowski

    2017-09-01

    Full Text Available The purpose of this study is to explore the link between imagine-self perspective-taking and rational self-interested behavior in experimental normal-form games. Drawing on the concept of sympathy developed by Adam Smith and further literature on perspective-taking in games, we hypothesize that introduction of imagine-self perspective-taking by decision-makers promotes rational self-interested behavior in a simple experimental normal-form game. In our study, we examined behavior of 404 undergraduate students in the two-person game, in which the participant can suffer a monetary loss only if she plays her Nash equilibrium strategy and the opponent plays her dominated strategy. Results suggest that the threat of suffering monetary losses effectively discourages the participants from choosing Nash equilibrium strategy. In general, players may take into account that opponents choose dominated strategies due to specific not self-interested motivations or errors. However, adopting imagine-self perspective by the participants leads to more Nash equilibrium choices, perhaps by alleviating participants’ attributions of susceptibility to errors or non-self-interested motivation to the opponents.

  16. Imagine-Self Perspective-Taking and Rational Self-Interested Behavior in a Simple Experimental Normal-Form Game.

    Science.gov (United States)

    Karbowski, Adam; Ramsza, Michał

    2017-01-01

    The purpose of this study is to explore the link between imagine-self perspective-taking and rational self-interested behavior in experimental normal-form games. Drawing on the concept of sympathy developed by Adam Smith and further literature on perspective-taking in games, we hypothesize that introduction of imagine-self perspective-taking by decision-makers promotes rational self-interested behavior in a simple experimental normal-form game. In our study, we examined behavior of 404 undergraduate students in the two-person game, in which the participant can suffer a monetary loss only if she plays her Nash equilibrium strategy and the opponent plays her dominated strategy. Results suggest that the threat of suffering monetary losses effectively discourages the participants from choosing Nash equilibrium strategy. In general, players may take into account that opponents choose dominated strategies due to specific not self-interested motivations or errors. However, adopting imagine-self perspective by the participants leads to more Nash equilibrium choices, perhaps by alleviating participants' attributions of susceptibility to errors or non-self-interested motivation to the opponents.
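
    The payoff structure described above can be reproduced with a small hypothetical game (the numbers are invented for illustration and are not taken from the experiment): the row player's Nash strategy exposes her to a loss only when the column player chooses a strictly dominated strategy.

```python
# Hypothetical payoff matrices mirroring the structure described in the abstract
# (numbers invented for illustration): the row player risks a loss (-2) only when she
# plays her Nash-equilibrium strategy and the column player plays a dominated strategy.
import numpy as np

# rows: row player's strategies ["nash", "safe"]; columns: ["undominated", "dominated"]
row_payoff = np.array([[3, -2],
                       [2,  0]])
col_payoff = np.array([[3,  1],
                       [3,  2]])     # "dominated" is strictly worse for the column player

def pure_nash(row_payoff, col_payoff):
    """Enumerate pure-strategy Nash equilibria by mutual best-response checks."""
    equilibria = []
    for i in range(row_payoff.shape[0]):
        for j in range(row_payoff.shape[1]):
            row_best = row_payoff[i, j] >= row_payoff[:, j].max()
            col_best = col_payoff[i, j] >= col_payoff[i, :].max()
            if row_best and col_best:
                equilibria.append((i, j))
    return equilibria

print(pure_nash(row_payoff, col_payoff))   # [(0, 0)], i.e. ("nash", "undominated")
```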

  17. Linking the Value Assessment of Oil and Gas Firms to Ambidexterity Theory Using a Mixture of Normal Distributions

    Directory of Open Access Journals (Sweden)

    Casault Sébastien

    2016-05-01

    Full Text Available Oil and gas exploration and production firms have return profiles that are not easily explained by current financial theory: the variation in their market returns is non-Gaussian. In this paper, the nature of and underlying reason for these significant deviations from expected behavior are considered. Understanding these differences in financial market behavior is important for a wide range of reasons, including: assessing investments, investor relations, decisions to raise capital, and assessment of firm and management performance. We show that using a "thicker tailed" mixture of two normal distributions offers a significantly more accurate model than the traditional Gaussian approach for describing the behavior of the value of oil and gas firms. This mixture of normal distributions is also more effective in bridging the gap between management theory and practice, without the need to introduce complex time-sensitive GARCH and/or jump diffusion dynamics. The mixture distribution is consistent with ambidexterity theory, which suggests that firms operate in two distinct states driven by the primary focus of the firm: an exploration state with high uncertainty and an exploitation (or production) state with lower uncertainty. The findings have direct implications for improving the accuracy of real option pricing techniques and futures analysis of risk management. Traditional options pricing models assume that commercial returns from these assets are described by a normal random walk. However, a normal random walk model discounts the possibility of large changes to the marketplace from events such as the discovery of important reserves or the introduction of new technology. The mixture distribution proves to be well suited to describing the unusually large risks and opportunities associated with oil and gas production and exploration. A significance testing study of 554 oil and gas exploration and production firms empirically supports using a mixture
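
    The two-state, thicker-tailed description corresponds to fitting a two-component Gaussian mixture to the return series; a minimal sketch on synthetic data (not the study's sample of firms) is shown below.

```python
# Illustrative sketch (synthetic data, not the study's dataset): fit a two-component
# Gaussian mixture to daily returns, mirroring the exploration/exploitation states
# with high and low uncertainty described in the abstract.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)
# synthetic returns: a low-volatility "exploitation" state and a high-volatility "exploration" state
returns = np.concatenate([rng.normal(0.0005, 0.01, 800),
                          rng.normal(0.0,    0.05, 200)]).reshape(-1, 1)

gm = GaussianMixture(n_components=2, random_state=0).fit(returns)
for w, mu, var in zip(gm.weights_, gm.means_.ravel(), gm.covariances_.ravel()):
    print(f"weight={w:.2f}  mean={mu:+.4f}  std={np.sqrt(var):.3f}")
```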

  18. Modal analysis of inter-area oscillations using the theory of normal modes

    Energy Technology Data Exchange (ETDEWEB)

    Betancourt, R.J. [School of Electromechanical Engineering, University of Colima, Manzanillo, Col. 28860 (Mexico); Barocio, E. [CUCEI, University of Guadalajara, Guadalajara, Jal. 44480 (Mexico); Messina, A.R. [Graduate Program in Electrical Engineering, Cinvestav, Guadalajara, Jal. 45015 (Mexico); Martinez, I. [State Autonomous University of Mexico, Toluca, Edo. Mex. 50110 (Mexico)

    2009-04-15

    Based on the notion of normal modes in mechanical systems, a method is proposed for the analysis and characterization of oscillatory processes in power systems. The method is based on the property of invariance of modal subspaces and can be used to represent complex power system modal behavior by a set of decoupled, two-degree-of-freedom nonlinear oscillator equations. Using techniques from nonlinear mechanics, a new approach is outlined, for determining the normal modes (NMs) of motion of a general n-degree-of-freedom nonlinear system. Equations relating the normal modes and the physical velocities and displacements are developed from the linearized system model and numerical issues associated with the application of the technique are discussed. In addition to qualitative insight, this method can be utilized in the study of nonlinear behavior and bifurcation analyses. The application of these procedures is illustrated on a planning model of the Mexican interconnected system using a quadratic nonlinear model. Specifically, the use of normal mode analysis as a basis for identifying modal parameters, including natural frequencies and damping ratios of general, linear systems with n degrees of freedom is discussed. Comparisons to conventional linear analysis techniques demonstrate the ability of the proposed technique to extract the different oscillation modes embedded in the oscillation. (author)
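
    The linear-analysis baseline referred to above, extracting modal frequencies and damping ratios from the eigenvalues of a linearized state matrix, can be sketched as follows; the state matrix is hypothetical, not the planning model of the Mexican interconnected system.

```python
# Minimal sketch (hypothetical linearized model, not the Mexican interconnected system):
# extract natural frequencies and damping ratios from the eigenvalues of a state matrix,
# as is done when identifying inter-area oscillation modes by linear analysis.
import numpy as np

# states: [x1, x1_dot, x2, x2_dot] for two weakly coupled, lightly damped oscillators
A = np.array([[ 0.0,  1.0,  0.0,  0.0],
              [-2.0, -0.1,  1.0,  0.0],
              [ 0.0,  0.0,  0.0,  1.0],
              [ 1.0,  0.0, -5.0, -0.2]])

for lam in np.linalg.eigvals(A):
    if lam.imag > 0:                        # keep one eigenvalue of each complex pair
        freq_hz = lam.imag / (2.0 * np.pi)  # oscillation frequency
        zeta = -lam.real / abs(lam)         # damping ratio
        print(f"mode: {freq_hz:.3f} Hz, damping ratio {zeta:.3f}")
```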

  19. Theory of charge transport in diffusive normal metal conventional superconductor point contacts

    NARCIS (Netherlands)

    Tanaka, Y.; Golubov, Alexandre Avraamovitch; Kashiwaya, S.

    2003-01-01

    Tunneling conductance in diffusive normal (DN) metal/insulator/s-wave superconductor junctions is calculated for various situations by changing the magnitudes of the resistance and Thouless energy in DN and the transparency of the insulating barrier. The generalized boundary condition introduced by

  20. Theory of thermal and charge transport in diffusive normal metal / superconductor junctions

    NARCIS (Netherlands)

    Yokoyama, T.; Tanaka, Y.; Golubov, Alexandre Avraamovitch; Asano, Y.

    2005-01-01

    Thermal and charge transport in diffusive normal metal (DN)/insulator/s-, d-, and p-wave superconductor junctions are studied based on the Usadel equation with the Nazarov's generalized boundary condition. We derive a general expression of the thermal conductance in unconventional superconducting

  1. A mixture theory model of fluid and solute transport in the microvasculature of normal and malignant tissues. I. Theory.

    Science.gov (United States)

    Schuff, M M; Gore, J P; Nauman, E A

    2013-05-01

    In order to better understand the mechanisms governing transport of drugs, nanoparticle-based treatments, and therapeutic biomolecules, and the role of the various physiological parameters, a number of mathematical models have previously been proposed. The limitations of the existing transport models indicate the need for a comprehensive model that includes transport in the vessel lumen, the vessel wall, and the interstitial space and considers the effects of the solute concentration on fluid flow. In this study, a general model to describe the transient distribution of fluid and multiple solutes at the microvascular level was developed using mixture theory. The model captures the experimentally observed dependence of the hydraulic permeability coefficient of the capillary wall on the concentration of solutes present in the capillary wall and the surrounding tissue. Additionally, the model demonstrates that transport phenomena across the capillary wall and in the interstitium are related to the solute concentration as well as the hydrostatic pressure. The model is used in a companion paper to examine fluid and solute transport for the simplified case of an axisymmetric geometry with no solid deformation or interconversion of mass.

  2. Mechanical properties of novel forms of graphyne under strain: A density functional theory study

    Science.gov (United States)

    Majidi, Roya

    2017-06-01

    The mechanical properties of two forms of graphyne sheets, named α-graphyne and α2-graphyne, under uniaxial and biaxial strains were studied. In-plane stiffness, bulk modulus, and shear modulus were calculated based on density functional theory. The in-plane stiffness, bulk modulus, and shear modulus of α2-graphyne were found to be larger than those of α-graphyne. The maximum uniaxial and biaxial strains supported before failure were determined. α-Graphyne entered the plastic region at a higher magnitude of tension than α2-graphyne. The mechanical properties of the α-graphyne family reveal that these forms of graphyne are suitable materials for nanomechanical applications.

  3. Investigation of reliability, validity and normality of the Persian version of the California Critical Thinking Skills Test, Form B (CCTST)

    Directory of Open Access Journals (Sweden)

    Khallli H

    2003-04-01

    Full Text Available Background: To evaluate the effectiveness of present educational programs in terms of students' achievement of problem solving, decision making and critical thinking skills, reliable, valid and standard instruments are needed. Purposes: To investigate the reliability, validity and norms of the CCTST Form B. The California Critical Thinking Skills Test contains 34 multiple-choice questions, each with one correct answer, in the five critical thinking (CT) cognitive skills domains. Methods: The translated CCTST Form B was given to 405 BSN nursing students of nursing faculties located in Tehran (Tehran, Iran and Shahid Beheshti Universities) that were selected through random sampling. In order to determine the face and content validity, the test was translated and edited by Persian and English language professors and researchers; it was also confirmed by the judgments of a panel of medical education experts and psychology professors. CCTST reliability was determined from internal consistency using KR-20. The construct validity of the test was investigated with factor analysis, internal consistency and group differences. Results: The test coefficient for reliability was 0.62. Factor analysis indicated that the CCTST is formed from 5 factors (elements), namely: Analysis, Evaluation, Inference, Inductive and Deductive Reasoning. The internal consistency method shows that all subscales have high and positive correlations with the total test score. The group difference method between nursing and philosophy students (n=50) indicated that there is a meaningful difference between nursing and philosophy students' scores (t=-4.95, p=0.0001). Percentile norms also show that the 50th percentile corresponds to a raw score of 11, and the 95th and 5th percentiles correspond to raw scores of 17 and 6, respectively. Conclusions: The results revealed that the test is sufficiently reliable as a research tool, and all subscales measure a single construct (critical thinking) and are able to distinguish the
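
    The KR-20 internal-consistency coefficient reported above can be computed directly from dichotomously scored item responses; the sketch below uses simulated responses, not the study's data.

```python
# Sketch of the KR-20 internal-consistency coefficient mentioned in the abstract,
# applied to made-up dichotomous (0/1) item responses; not the study's data.
import numpy as np

def kr20(scores):
    """scores: (n_examinees, n_items) array of 0/1 responses."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    p = scores.mean(axis=0)                       # proportion correct per item
    q = 1.0 - p
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of total test scores
    return (k / (k - 1.0)) * (1.0 - np.sum(p * q) / total_var)

rng = np.random.default_rng(1)
ability = rng.normal(size=(200, 1))
difficulty = rng.normal(size=(1, 34))             # 34 items, as in CCTST Form B
responses = (ability - difficulty + rng.normal(size=(200, 34))) > 0
print(f"KR-20 = {kr20(responses):.2f}")
```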

  4. Normal loads program for aerodynamic lifting surface theory. [evaluation of spanwise and chordwise loading distributions

    Science.gov (United States)

    Medan, R. T.; Ray, K. S.

    1974-01-01

    A description of and users manual are presented for a U.S.A. FORTRAN 4 computer program which evaluates spanwise and chordwise loading distributions, lift coefficient, pitching moment coefficient, and other stability derivatives for thin wings in linearized, steady, subsonic flow. The program is based on a kernel function method lifting surface theory and is applicable to a large class of planforms including asymmetrical ones and ones with mixed straight and curved edges.

  5. Microscopic Theory of Coupled Slow Activated Dynamics in Glass-Forming Binary Mixtures.

    Science.gov (United States)

    Zhang, Rui; Schweizer, Kenneth S

    2018-04-05

    The Elastically Collective Nonlinear Langevin Equation theory for one-component viscous liquids and suspensions is generalized to treat coupled slow activated relaxation and diffusion in glass-forming binary sphere mixtures of any composition, size ratio, and interparticle interactions. A trajectory-level dynamical coupling parameter concept is introduced to construct two coupled dynamic free energy functions for the smaller penetrant and larger matrix particle. A two-step dynamical picture is proposed where the first-step process involves matrix-facilitated penetrant hopping quantified in a self-consistent manner based on a temporal coincidence condition. After penetrants dynamically equilibrate, the effectively one-component matrix particle dynamics is controlled by a new dynamic free energy (second-step process). Depending on the time scales associated with the first- and second-step processes, as well as the extent of matrix-correlated facilitation, distinct physical scenarios are predicted. The theory is implemented for purely hard-core interactions, and addresses the glass transition based on variable kinetic criteria, penetrant-matrix coupled activated relaxation, self-diffusion of both species, dynamic fragility, and shear elasticity. Testable predictions are made. Motivated by the analytic ultralocal limit idea derived for pure hard sphere fluids, we identify structure-thermodynamics-dynamics relationships. As a case study for molecule-polymer thermal mixtures, the chemically matched fully miscible polystyrene-toluene system is quantitatively studied based on a predictive mapping scheme. The resulting no-adjustable-parameter results for toluene diffusivity and the mixture glass transition temperature are in good agreement with experiment. The theory provides a foundation to treat diverse dynamical problems in glass-forming mixtures, including suspensions of colloids and nanoparticles, polymer-molecule liquids, and polymer nanocomposites.

  6. An approach to normal forms of Kuramoto model with distributed delays and the effect of minimal delay

    Energy Technology Data Exchange (ETDEWEB)

    Niu, Ben, E-mail: niubenhit@163.com [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China); Guo, Yuxiao [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China); Jiang, Weihua [Department of Mathematics, Harbin Institute of Technology, Harbin 150001 (China)

    2015-09-25

    Heterogeneous delays with a positive lower bound (gap) are taken into consideration in the Kuramoto model. On the Ott–Antonsen manifold, the dynamical transition from incoherence to coherence is mediated by a Hopf bifurcation. We establish a perturbation technique on the complex domain, by which universal normal forms and the stability and criticality of the Hopf bifurcation are obtained. Theoretically, a hysteresis loop is found near the subcritically bifurcated coherent state. For a Gamma-distributed delay with fixed mean and variance, we find that a large gap decreases the Hopf bifurcation value, induces supercritical bifurcations, avoids the hysteresis loop and significantly increases the number of coexisting coherent states. The effect of the gap is finally interpreted from the viewpoint of the excess kurtosis of the Gamma distribution. - Highlights: • Heterogeneously delay-coupled Kuramoto model with minimal delay is considered. • Perturbation technique on the complex domain is established for bifurcation analysis. • Hysteresis phenomenon is investigated in a theoretical way. • The effect of excess kurtosis of distributed delays is discussed.
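
    The incoherence-to-coherence transition itself can be illustrated by directly simulating a delay-coupled Kuramoto model and monitoring its order parameter; the sketch below uses a single fixed delay in place of the Gamma-distributed delay and is not the Ott-Antonsen reduction analyzed in the paper.

```python
# Simple Euler sketch (a single fixed delay stands in for the distributed delay with a gap;
# this is not the Ott-Antonsen reduction used in the paper): delay-coupled Kuramoto
# oscillators and the order parameter r(t) that signals coherence.
import numpy as np

rng = np.random.default_rng(0)
N, K, tau, dt, steps = 200, 1.5, 0.5, 0.01, 20000
omega = rng.normal(0.0, 0.5, N)                     # natural frequencies
lag = int(round(tau / dt))

theta = np.zeros((steps + lag, N))
theta[:lag + 1] = rng.uniform(0, 2 * np.pi, N)      # constant history as initial condition

for t in range(lag, steps + lag - 1):
    z_delayed = np.exp(1j * theta[t - lag]).mean()  # mean field evaluated at time t - tau
    coupling = K * np.abs(z_delayed) * np.sin(np.angle(z_delayed) - theta[t])
    theta[t + 1] = theta[t] + dt * (omega + coupling)

r = np.abs(np.exp(1j * theta[lag:]).mean(axis=1))   # order parameter over time
print(f"initial r = {r[0]:.2f}, final r = {r[-1]:.2f}")
```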

  7. Theory of mind and emotion-recognition functioning in autistic spectrum disorders and in psychiatric control and normal children.

    Science.gov (United States)

    Buitelaar, J K; van der Wees, M; Swaab-Barneveld, H; van der Gaag, R J

    1999-01-01

    The hypothesis was tested that weak theory of mind (ToM) and/or emotion recognition (ER) abilities are specific to subjects with autism. Differences in ToM and ER performance were examined between autistic (n = 20), pervasive developmental disorder-not otherwise specified (PDD-NOS) (n = 20), psychiatric control (n = 20), and normal children (n = 20). The clinical groups were matched person-to-person on age and verbal IQ. We used tasks for the matching and the context recognition of emotional expressions, and a set of first- and second-order ToM tasks. Autistic and PDD-NOS children could not be significantly differentiated from each other, nor could they be differentiated from the psychiatric controls with a diagnosis of ADHD (n = 9). The psychiatric controls with conduct disorder or dysthymia performed about as well as normal children. The variance in second-order ToM performance contributed most to differences between diagnostic groups.

  8. Communication: Density functional theory model for multi-reference systems based on the exact-exchange hole normalization.

    Science.gov (United States)

    Laqua, Henryk; Kussmann, Jörg; Ochsenfeld, Christian

    2018-03-28

    The correct description of multi-reference electronic ground states within Kohn-Sham density functional theory (DFT) requires an ensemble-state representation, employing fractionally occupied orbitals. However, the use of fractional orbital occupation leads to non-normalized exact-exchange holes, resulting in large fractional-spin errors for conventional approximative density functionals. In this communication, we present a simple approach to directly include the exact-exchange-hole normalization into DFT. Compared to conventional functionals, our model strongly improves the description for multi-reference systems, while preserving the accuracy in the single-reference case. We analyze the performance of our proposed method at the example of spin-averaged atoms and spin-restricted bond dissociation energy surfaces.

  9. Communication: Density functional theory model for multi-reference systems based on the exact-exchange hole normalization

    Science.gov (United States)

    Laqua, Henryk; Kussmann, Jörg; Ochsenfeld, Christian

    2018-03-01

    The correct description of multi-reference electronic ground states within Kohn-Sham density functional theory (DFT) requires an ensemble-state representation, employing fractionally occupied orbitals. However, the use of fractional orbital occupation leads to non-normalized exact-exchange holes, resulting in large fractional-spin errors for conventional approximative density functionals. In this communication, we present a simple approach to directly include the exact-exchange-hole normalization into DFT. Compared to conventional functionals, our model strongly improves the description for multi-reference systems, while preserving the accuracy in the single-reference case. We analyze the performance of our proposed method at the example of spin-averaged atoms and spin-restricted bond dissociation energy surfaces.

  10. Note: Determination of torsional spring constant of atomic force microscopy cantilevers: Combining normal spring constant and classical beam theory

    DEFF Research Database (Denmark)

    Álvarez-Asencio, R.; Thormann, Esben; Rutland, M.W.

    2013-01-01

    A technique has been developed for the calculation of torsional spring constants for AFM cantilevers based on the combination of the normal spring constant and plate/beam theory. It is easy to apply and allows the determination of torsional constants for stiff cantilevers where the thermal power spectrum is difficult to obtain due to the high resonance frequency and low signal/noise ratio. The applicability is shown to be general, and this simple approach can thus be used to obtain torsional constants for any beam-shaped cantilever. © 2013 AIP Publishing LLC.
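
    One commonly quoted elastic-beam relation of this kind, for a rectangular cantilever of length L and Poisson's ratio ν, gives the torsional spring constant as k_φ = 2 k_N L² / (3(1+ν)); the sketch below uses that relation as an assumption rather than reproducing the formula from the note.

```python
# Sketch of the kind of relation the note exploits. The formula below is a standard
# elastic-beam result for a rectangular cantilever (with G = E / (2*(1+nu))) used here
# as an assumption; it is not reproduced from the paper.
def torsional_constant(k_normal, length, poisson_ratio=0.27):
    """Torsional spring constant [N*m/rad] from the normal spring constant [N/m] and length [m]."""
    return 2.0 * k_normal * length**2 / (3.0 * (1.0 + poisson_ratio))

# Example with hypothetical values: a stiff rectangular cantilever, k_N = 40 N/m, L = 125 um
k_phi = torsional_constant(40.0, 125e-6)
print(f"k_phi = {k_phi:.3e} N*m/rad")
```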

  11. The work of art as a manifestation of form: Formalistic materialism in the theory of art

    Directory of Open Access Journals (Sweden)

    Popović Radovan

    2015-01-01

    Full Text Available The basic purpose of this paper is to point out and theoretically illuminate the key moments of the transformation and re-constitution of art theory during the first half of the XIX century, taking as its point of departure the role and influence of Organicism as a philosophico-reflexive basis, gnoseological postulate and methodological pattern. Sedlmayr's attempt to found a comparative morphology of architectural styles, taking into account three constitutive building elements as analytical tools for the study of concrete historical manifestations of style: material structure (building materials), technique and intended usability, receives particular attention in the following analysis. Regarding Viollet-le-Duc, the stress is on the interpretation of his conception of architectural form as the application of logically founded general laws of statics and construction, with special emphasis on his thesis of the logical and purposive-rational correspondence between the outer form and the functional structure of the interior. Finally, an attempt is made to clarify the revival of Viollet-le-Duc's explanatory pattern within the dynamic and subjectivistic conceptual transformations of form in Fiedler's, von Hildebrand's and Focillon's theoretical conceptions.

  12. On electromagnetic forming processes in finitely strained solids: Theory and examples

    Science.gov (United States)

    Thomas, J. D.; Triantafyllidis, N.

    2009-08-01

    The process of electromagnetic forming (EMF) is a high velocity manufacturing technique that uses electromagnetic (Lorentz) body forces to shape sheet metal parts. EMF holds several advantages over conventional forming techniques: speed, repeatability, one-sided tooling, and most importantly considerable ductility increase in several metals. Current modeling techniques for EMF processes are not based on coupled variational principles to simultaneously account for electromagnetic and mechanical effects. Typically, separate solutions to the electromagnetic (Maxwell) and motion (Newton) equations are combined in staggered or lock-step methods, sequentially solving the mechanical and electromagnetic problems. The present work addresses these issues by introducing a fully coupled Lagrangian (reference configuration) least-action variational principle, involving magnetic flux and electric potentials and the displacement field as independent variables. The corresponding Euler-Lagrange equations are Maxwell's and Newton's equations in the reference configuration, which are shown to coincide with their current configuration counterparts obtained independently by a direct approach. The general theory is subsequently simplified for EMF processes by considering the eddy current approximation. Next, an application is presented for axisymmetric EMF problems. It is shown that the proposed variational principle forms the basis of a variational integration numerical scheme that provides an efficient staggered solution algorithm. As an illustration a number of such processes are simulated, inspired by recent experiments of freely expanding uncoated and polyurea-coated aluminum tubes.

  13. Theory of emission spectra from metal films irradiated by low energy electrons near normal incidence

    International Nuclear Information System (INIS)

    Kretschmann, E.; Callcott, T.A.; Arakawa, E.T.

    1980-01-01

    The emission spectrum produced by low energy electrons incident on a rough metal surface has been calculated for a roughness auto-correlation function containing a prominent peak at a high wave vector. For low energy electrons near normal incidence, the high wavevector peak dominates the roughness coupled surface plasmon radiation (RCSPR) process. The calculation yields estimates of the ratio of RCSPR to transition radiation, the dependence of emission intensity on electron energy and the shape and position of the RCSPR peak. The most interesting result is that the high-wavevector roughness can split the RCSPR radiation into peaks lying above and below the asymptotic surface plasma frequency. The results are compared with data from Ag in the following paper. (orig.)

  14. Pion-nucleon form factor in the Chew-Low theory

    International Nuclear Information System (INIS)

    Ernst, D.J.; Johnson, M.B.

    1978-01-01

    We find a solution to the static Chew-Low theory of pion-nucleon scattering, avoiding the "one-meson approximation." Our basic equation is crossing symmetric and may be solved for phase shifts δ(p) by standard numerical techniques, upon specifying a form factor v(p) and a set of inelasticities. With v(p) = exp(-p²/30) we reproduce experimental δ(p) for p_L ≤ 1.2 GeV/c in the (3,3) state; in the (1,3) and (3,1) states the δ(p) compare well on the average, but in the (1,1) state the δ(p) have opposite signs. We show the importance of crossing symmetry and the coupling to inelastic channels, and we discuss the possibility of determining v(p) directly from elastic scattering by an inverse scattering formula

  15. Ideal flow theory for the double - shearing model as a basis for metal forming design

    Science.gov (United States)

    Alexandrov, S.; Trung, N. T.

    2018-02-01

    In the case of Tresca solids (i.e. solids obeying the Tresca yield criterion and its associated flow rule), ideal flows have been defined elsewhere as solenoidal smooth deformations in which an eigenvector field associated everywhere with the greatest principal stress (and strain rate) is fixed in the material. Under such conditions all material elements undergo paths of minimum plastic work, a condition which is often advantageous for metal forming processes. Therefore, the ideal flow theory is used as the basis of a procedure for the preliminary design of such processes. The present paper extends the theory of stationary planar ideal flow to pressure-dependent materials obeying the double-shearing model and the double slip and rotation model. It is shown that the original problem of plasticity reduces to a purely geometric problem. The corresponding system of equations is hyperbolic. The characteristic relations are integrated in elementary functions. In regions where one family of characteristics is straight, the mapping between the principal lines and Cartesian coordinates is determined by linear ordinary differential equations. An illustrative example is provided.

  16. The two-fermion relativistic wave equations of Constraint Theory in the Pauli-Schroedinger form

    International Nuclear Information System (INIS)

    Mourad, J.; Sazdjian, H.

    1994-01-01

    The two-fermion relativistic wave equations of Constraint Theory are reduced, after expressing the components of the 4x4 matrix wave function in terms of one of the 2x2 components, to a single equation of the Pauli-Schroedinger type, valid for all sectors of quantum numbers. The potentials that are present belong to the general classes of scalar, pseudoscalar and vector interactions and are calculable in perturbation theory from Feynman diagrams. In the limit when one of the masses becomes infinite, the equation reduces to the two-component form of the one-particle Dirac equation with external static potentials. The Hamiltonian, to order 1/c 2 , reproduces most of the known theoretical results obtained by other methods. The gauge invariance of the wave equation is checked, to that order, in the case of QED. The role of the c.m. energy dependence of the relativistic interquark confining potential is emphasized and the structure of the Hamiltonian, to order 1/c 2 , corresponding to confining scalar potentials, is displayed. (authors). 32 refs., 2 figs

  17. Research approach for forming a new typology of spatial planning theory

    Directory of Open Access Journals (Sweden)

    Bulajić Vladan

    2011-01-01

    Full Text Available This paper suggests a research approach for the classification of theoretical contributions in the scientific domain of spatial planning. A typology is a multidimensional classification; in effect, it is a framework for understanding the subject area, theory and practice, ideas and methodologies. A comprehensive approach is needed to organize the complex and diverse domain of spatial planning theory, which has been shaped by different schools of thought and by the influences of related scientific disciplines. It is suggested that the research approach should become a bridge between two cultures, in other words a synthesis of qualitative and quantitative methods of typology construction. Through the analysis of existing, quantitatively derived typologies, the chosen concepts will be improved and completed by means of computerized statistical analysis of appropriate bibliometric data. Moreover, the procedure will also be applied in the opposite direction, connecting empirical types with their conceptual counterparts. The main aim of this approach is to achieve a comprehensive classification scheme, which will become part of a platform for integrating interdisciplinary approaches in the spatial planning domain. This research concept belongs to a wider effort that aims, through scientific innovation and imagination, to solve the problems and challenges that spatial planning faces. The forming of the new typology is the first step in that direction.

  18. Development and validation of an item response theory-based Social Responsiveness Scale short form.

    Science.gov (United States)

    Sturm, Alexandra; Kuhfeld, Megan; Kasari, Connie; McCracken, James T

    2017-09-01

    Research and practice in autism spectrum disorder (ASD) rely on quantitative measures, such as the Social Responsiveness Scale (SRS), for characterization and diagnosis. Like many ASD diagnostic measures, SRS scores are influenced by factors unrelated to ASD core features. This study further interrogates the psychometric properties of the SRS using item response theory (IRT), and demonstrates a strategy to create a psychometrically sound short form by applying IRT results. Social Responsiveness Scale analyses were conducted on a large sample (N = 21,426) of youth from four ASD databases. Items were subjected to item factor analyses and evaluation of item bias by gender, age, expressive language level, behavior problems, and nonverbal IQ. Item selection based on item psychometric properties, DIF analyses, and substantive validity produced a reduced item SRS short form that was unidimensional in structure, highly reliable (α = .96), and free of gender, age, expressive language, behavior problems, and nonverbal IQ influence. The short form also showed strong relationships with established measures of autism symptom severity (ADOS, ADI-R, Vineland). Degree of association between all measures varied as a function of expressive language. Results identified specific SRS items that are more vulnerable to non-ASD-related traits. The resultant 16-item SRS short form may possess superior psychometric properties compared to the original scale and emerge as a more precise measure of ASD core symptom severity, facilitating research and practice. Future research using IRT is needed to further refine existing measures of autism symptomatology. © 2017 Association for Child and Adolescent Mental Health.
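
    The internal-consistency figure quoted for the short form (α = .96) is Cronbach's alpha over the retained items; the sketch below computes it for simulated Likert-type responses (illustrative data, not SRS scores).

```python
# Sketch of the internal-consistency statistic reported for the short form (Cronbach's
# alpha); the simulated Likert-type responses below are illustrative, not SRS data.
import numpy as np

def cronbach_alpha(item_scores):
    """item_scores: (n_respondents, n_items) matrix of item scores."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]
    item_var = item_scores.var(axis=0, ddof=1).sum()
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_var / total_var)

rng = np.random.default_rng(3)
trait = rng.normal(size=(500, 1))                                   # latent severity
items = np.clip(np.rint(2 * trait + rng.normal(scale=0.8, size=(500, 16))) + 2, 0, 3)  # 16 items, 0-3 scale
print(f"alpha = {cronbach_alpha(items):.2f}")
```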

  19. Theory of the normal modes of vibrations in the lanthanide type crystals

    Science.gov (United States)

    Acevedo, Roberto; Soto-Bubert, Andres

    2008-11-01

    For lanthanide-type crystals, a vast and rich, though incomplete, amount of experimental data has been accumulated from linear and non-linear optics during the last decades. The main goal of the current research work is to report a new methodology and strategy for a more representative approach to the normal modes of vibration of a complex N-body system. For illustrative purposes, the chloride lanthanide-type crystals Cs2NaLnCl6 have been chosen, and we develop new convergence tests as well as a criterion to deal with the details of the F-matrix (potential energy matrix). A novel and useful concept of natural potential energy distributions (NPED) is introduced and examined throughout the course of this work. The diagonal and non-diagonal contributions to these NPED values are evaluated explicitly for a series of these crystals. Our model is based upon a total of seventy-two internal coordinates and ninety-eight internal Hooke-type force constants. An optimization procedure is applied with reference to the series of chloride lanthanide crystals, and it is shown that the strategy and model adopted are sound from both chemical and physical viewpoints. We argue that the current model is able to accommodate a number of interactions and to provide very useful physical insight. The limitations and advantages of the current model and the most likely sources for improvement are discussed in detail.

  20. Bicervical normal uterus with normal vagina | Okeke | Annals of ...

    African Journals Online (AJOL)

    To the best of our knowledge, only a few cases of bicervical normal uterus with normal vagina exist in the literature; one of the cases had an anterior-posterior disposition. This form of uterine abnormality is not explicable by the existing classical theory of Müllerian anomalies and suggests that a complex interplay of events ...

  1. Validity and Reliability of Published Comprehensive Theory of Mind Tests for Normal Preschool Children: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Seyyede Zohreh Ziatabar Ahmadi

    2015-12-01

    Full Text Available Objective: Theory of mind (ToM) or mindreading is an aspect of social cognition that evaluates mental states and beliefs of oneself and others. Validity and reliability are very important criteria when evaluating standard tests; without them, these tests are not usable. The aim of this study was to systematically review the validity and reliability of published English comprehensive ToM tests developed for normal preschool children. Method: We searched MEDLINE (PubMed interface), Web of Science, ScienceDirect, PsycINFO, and evidence-based medicine (The Cochrane Library) databases from 1990 to June 2015. The search strategy was the Latin transcription of ‘Theory of Mind’ AND test AND children. We also manually studied the reference lists of all finally included articles and searched their references. Inclusion criteria were as follows: valid and reliable diagnostic ToM tests published from 1990 to June 2015 for normal preschool children. Exclusion criteria were as follows: studies that only used ToM tests and single tasks (false belief tasks) for ToM assessment and/or had no description of the structure, validity or reliability of their tests. Methodological quality of the selected articles was assessed using the Critical Appraisal Skills Programme (CASP). Result: In the primary search, we found 1237 articles across all databases. After removing duplicates and applying all inclusion and exclusion criteria, we selected 11 tests for this systematic review. Conclusion: There were few valid, reliable and comprehensive ToM tests for normal preschool children. However, we had limitations concerning the included articles: the identified ToM tests differed in populations, tasks, mode of presentation, scoring, mode of response, times and other variables, and they had various validities and reliabilities. Therefore, it is recommended that researchers and clinicians select ToM tests according to their psychometric properties.

  2. Conference Elliptic Curves, Modular Forms and Iwasawa Theory : in honour of John H. Coates' 70th birthday

    CERN Document Server

    Zerbes, Sarah

    2016-01-01

    Celebrating one of the leading figures in contemporary number theory – John H. Coates – on the occasion of his 70th birthday, this collection of contributions covers a range of topics in number theory, concentrating on the arithmetic of elliptic curves, modular forms, and Galois representations. Several of the contributions in this volume were presented at the conference Elliptic Curves, Modular Forms and Iwasawa Theory, held in honour of the 70th birthday of John Coates in Cambridge, March 25-27, 2015. The main unifying theme is Iwasawa theory, a field that John Coates himself has done much to create. This collection is indispensable reading for researchers in Iwasawa theory, and is interesting and valuable for those in many related fields.

  3. Coronary heart disease patients transitioning to a normal life: perspectives and stages identified through a grounded theory approach.

    Science.gov (United States)

    Najafi Ghezeljeh, Tahereh; Yadavar Nikravesh, Mansoureh; Emami, Azita

    2014-02-01

    To explore how Iranian patients with coronary heart disease experience their lives. Coronary heart disease is a leading cause of death in Iran and worldwide. Understanding qualitatively how patients experience the acute and postacute stages of this chronic condition is essential knowledge for minimising the negative consequences of coronary heart disease. Qualitative study using grounded theory for the data analysis. Data for this study were collected through individual qualitative interviews with 24 patients with coronary heart disease, conducted between January 2009 and January 2011. Patients with angina pectoris were selected for participation through purposive sampling, and sample size was determined by data saturation. Data analysis began with initial coding and continued with focused coding. Categories were determined, and the core category was subsequently developed and finalised. The main categories of the transition from acute phase to a modified or 'new normal' life were: (1) Loss of normal life. Experiencing emotions and consequences of illness; (2) Coming to terms. Using coping strategies; (3) Recreating normal life. Healthcare providers must correctly recognise the stages of transition patients navigate while coping with coronary heart disease to support and educate them appropriately throughout these stages. Patients with coronary heart disease lose their normal lives and must work towards recreating a revised life using coping strategies that enable them to come to terms with their situations. By understanding Iranian patients' experiences, healthcare providers and especially nurses can use the information to support and educate patients with coronary heart disease on how to more effectively deal with their illness and its consequences. © 2013 John Wiley & Sons Ltd.

  4. Promoting health workers' ownership of infection prevention and control: using Normalization Process Theory as an interpretive framework.

    Science.gov (United States)

    Gould, D J; Hale, R; Waters, E; Allen, D

    2016-12-01

    All health workers should take responsibility for infection prevention and control (IPC). Recent reduction in key reported healthcare-associated infections in the UK is impressive, but the determinants of success are unknown. It is imperative to understand how IPC strategies operate as new challenges arise and threats of antimicrobial resistance increase. The authors undertook a retrospective, independent evaluation of an action plan to enhance IPC and 'ownership' (individual accountability) for IPC introduced throughout a healthcare organization. Twenty purposively selected informants were interviewed. Data were analysed inductively. Normalization Process Theory (NPT) was applied to interpret the findings and explain how the action plan was operating. Six themes emerged through inductive analysis. Theme 1: 'Ability to make sense of ownership' provided evidence of the first element of NPT (coherence). Regardless of occupational group or seniority, informants understood the importance of IPC ownership and described what it entailed. They identified three prerequisites: 'Always being vigilant' (Theme 2), 'Importance of access to information' (Theme 3) and 'Being able to learn together in a no-blame culture' (Theme 4). Data relating to each theme provided evidence of the other elements of NPT that are required to embed change: planning implementation (cognitive participation), undertaking the work necessary to achieve change (collective action), and reflection on what else is needed to promote change as part of continuous quality improvement (reflexive monitoring). Informants identified barriers (e.g. workload) and facilitators (clear lines of communication and expectations for IPC). Eighteen months after implementing the action plan incorporating IPC ownership, there was evidence of continuous service improvement and significant reduction in infection rates. Applying a theory that identifies factors that promote/inhibit routine incorporation ('normalization') of IPC

  5. Holographic Dark Energy in Brans-Dicke Theory with Logarithmic Form of Scalar Field

    Science.gov (United States)

    Singh, C. P.; Kumar, Pankaj

    2017-10-01

    In this paper, an interacting holographic dark energy model with the Hubble horizon as infrared cut-off is considered in the framework of Brans-Dicke theory. We assume a logarithmic form of the Brans-Dicke scalar field, ϕ = ϕ_0 ln(α + β a), where a is the scale factor and α and β are arbitrary constants, to interpret the physical phenomena of the Universe. The equation of state parameter w_h and the deceleration parameter q are obtained to discuss the dynamics of the evolution of the Universe. We present a unified model of holographic dark energy which explains the early-time acceleration (inflation), medieval-time deceleration and late-time acceleration. It is also observed that w_h may cross the phantom divide line in the late-time evolution. We also discuss the cosmic coincidence problem. We obtain a time-varying density ratio of holographic dark energy to dark matter which is a constant of order one (r ~ O(1)) during early and late time evolution, and may evolve sufficiently slowly at the present time. Thus, the model successfully resolves the cosmic coincidence problem.
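
    For readers less familiar with the notation, the assumed field ansatz and the standard definition of the deceleration parameter used to distinguish accelerating from decelerating epochs can be written as follows (standard cosmology conventions; an illustration, not an excerpt from the paper):

    ```latex
    \[
    \phi(a) = \phi_0 \ln(\alpha + \beta a), \qquad
    \dot{\phi} = \frac{\phi_0\,\beta\,\dot{a}}{\alpha + \beta a}, \qquad
    q \equiv -\frac{a\,\ddot{a}}{\dot{a}^{2}},
    \]
    ```

    with q < 0 during accelerated expansion and q > 0 during deceleration.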

  6. Augmented superfield approach to gauge-invariant massive 2-form theory

    International Nuclear Information System (INIS)

    Kumar, R.; Krishna, S.

    2017-01-01

    We discuss the complete sets of the off-shell nilpotent (i.e. s^2_{(a)b} = 0) and absolutely anticommuting (i.e. s_b s_{ab} + s_{ab} s_b = 0) Becchi-Rouet-Stora-Tyutin (BRST) (s_b) and anti-BRST (s_{ab}) symmetries for the (3 + 1)-dimensional (4D) gauge-invariant massive 2-form theory within the framework of an augmented superfield approach to the BRST formalism. In this formalism, we obtain the coupled (but equivalent) Lagrangian densities which respect both BRST and anti-BRST symmetries on the constrained hypersurface defined by the Curci-Ferrari type conditions. The absolute anticommutativity property of the (anti-)BRST transformations (and corresponding generators) is ensured by the existence of the Curci-Ferrari type conditions which emerge very naturally in this formalism. Furthermore, the gauge-invariant restriction plays a decisive role in deriving the proper (anti-)BRST transformations for the Stueckelberg-like vector field. (orig.)

  7. Augmented superfield approach to gauge-invariant massive 2-form theory

    Science.gov (United States)

    Kumar, R.; Krishna, S.

    2017-06-01

    We discuss the complete sets of the off-shell nilpotent (i.e. s^2_{(a)b} = 0) and absolutely anticommuting (i.e. s_b s_{ab} + s_{ab} s_b = 0) Becchi-Rouet-Stora-Tyutin (BRST) (s_b) and anti-BRST (s_{ab}) symmetries for the (3+1)-dimensional (4D) gauge-invariant massive 2-form theory within the framework of an augmented superfield approach to the BRST formalism. In this formalism, we obtain the coupled (but equivalent) Lagrangian densities which respect both BRST and anti-BRST symmetries on the constrained hypersurface defined by the Curci-Ferrari type conditions. The absolute anticommutativity property of the (anti-) BRST transformations (and corresponding generators) is ensured by the existence of the Curci-Ferrari type conditions which emerge very naturally in this formalism. Furthermore, the gauge-invariant restriction plays a decisive role in deriving the proper (anti-) BRST transformations for the Stückelberg-like vector field.

  8. The Role of Sexual Disorder in Forming Divorce Process: a Grounded Theory Study

    Directory of Open Access Journals (Sweden)

    H. Enayat

    2016-03-01

    Full Text Available Background & Aim: The consequences of the rising divorce rate in Iranian society, which affect individuals, families and society as a whole, form the background of the present study. The main purpose of the present study was to present a paradigm model of the role of sexual disorder in forming the divorce process among men in Iran. Method: The present study was conducted using a qualitative method based on the grounded theory approach in Gachsaran, Iran, in 2014. The participants of the study were 15 divorced men who were selected through purposeful sampling. Data were gathered using in-depth interviews and were analyzed with the coding paradigm. Results: According to the coding paradigm, men's sexual dysfunction as a causal condition; physical disease, mental stress, and age difference between couples as contextual conditions; and a culture of drug abuse for satisfaction of sexual relations and infidelity as intervening conditions caused disorder in their sexual relationships. These men and their wives applied various strategies, such as drug abuse, disconnecting sexual relations with each other, and latent violence, in order to counteract this phenomenon. Conclusion: The narratives of the participants of the present study revealed that disorder in their sexual relations led to other social problems, such as drug abuse, domestic violence, and infidelity in their families. Moreover, these problems led to further disorder in their sexual relationships with their wives, which eventually ended in emotional, sexual and legal divorce.

  9. Augmented superfield approach to gauge-invariant massive 2-form theory

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, R. [University of Delhi, Department of Physics and Astrophysics, New Delhi (India); Krishna, S. [Indian Institute of Science Education and Research Mohali, Manauli, Punjab (India)

    2017-06-15

    We discuss the complete sets of the off-shell nilpotent (i.e. s^2_{(a)b} = 0) and absolutely anticommuting (i.e. s_b s_{ab} + s_{ab} s_b = 0) Becchi-Rouet-Stora-Tyutin (BRST) (s_b) and anti-BRST (s_{ab}) symmetries for the (3 + 1)-dimensional (4D) gauge-invariant massive 2-form theory within the framework of an augmented superfield approach to the BRST formalism. In this formalism, we obtain the coupled (but equivalent) Lagrangian densities which respect both BRST and anti-BRST symmetries on the constrained hypersurface defined by the Curci-Ferrari type conditions. The absolute anticommutativity property of the (anti-)BRST transformations (and corresponding generators) is ensured by the existence of the Curci-Ferrari type conditions which emerge very naturally in this formalism. Furthermore, the gauge-invariant restriction plays a decisive role in deriving the proper (anti-)BRST transformations for the Stueckelberg-like vector field. (orig.)

  10. Performances on a cognitive theory of mind task: specific decline or general cognitive deficits? Evidence from normal aging.

    Science.gov (United States)

    Fliss, Rafika; Lemerre, Marion; Mollard, Audrey

    2016-06-01

    Compromised theory of mind (ToM) can be explained either by a failure to implement specific representational capacities (mental state representations) or by more general executive selection demands. In older adult populations, evidence of affected executive functioning and cognitive ToM in normal aging has been reported. However, the links between these two functions remain unclear. In the present paper, we address these shortcomings by using a specific ToM task and classical executive tasks. Using an original cognitive ToM task, we studied the effect of age on ToM performance in relation to the progressive executive decline. Ninety-six elderly participants were recruited. They were asked to perform a cognitive ToM task and 5 executive tests (the Stroop test and the Hayling Sentence Completion Test to assess inhibitory processes, the Trail Making Test and Verbal Fluency for shifting assessment, and backward span to estimate working memory capacity). The results show changes in cognitive ToM performance according to executive demands. Correlational analyses indicate a significant relationship between ToM performance and the selected executive measures. Regression analyses identify vocabulary level and age as the best predictors of ToM performance. The results are consistent with the hypothesis that ToM deficits are related to age-related domain-general decline rather than to a breakdown in a specialized representational system. The implications of these findings for the nature of social cognition tests in normal aging are also discussed.

  11. Systems of differential forms, including Kuranishi's theory of total prolongations. Technical report No. 2

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, H.H.

    1956-01-01

    The theory of differential forms and their integral manifolds was created by E. Cartan in order to study the differential equations which occur in Lie groups and differential geometry. We present here a modern outline of this theory using refinements and notations taken from recent papers of Y. Matsushima and M. Kuranishi. Section III is devoted to an exposition of a paper of Kuranishi on total prolongations which was presented at the 1956 Summer Institute in Differential Geometry at the University of Washington.

  12. Analysis of deposit of physiological and psychological theories of forming motive skills on development of theory of teaching to the physical drills

    Directory of Open Access Journals (Sweden)

    Khudolii O.N.

    2010-06-01

    Full Text Available The influence of different theories on the construction of the process of teaching motor actions to young gymnasts is determined. The results of a complete factorial experiment are presented. They allowed principal guidelines to be formulated for the construction of the process of teaching physical exercises to young gymnasts aged 7-13 years. The construction of the teaching process is influenced most of all by the theory of functional systems (43%, p<0.001), the theory of construction of movements (41%, p<0.001), and the theory of managed knowledge acquisition and the formation of actions and concepts (2.6%, p<0.05). The positive effect of teaching depends on the sequential solution of the teaching tasks and the rational application of methods.

  13. "A powerful, opinion-forming public? Rethinking the Habermasian public sphere in a perspective of feminist theory and citizenship"

    DEFF Research Database (Denmark)

    Fiig, Christina

    2011-01-01

    The article’s main argument is that a public sphere forms a constructive arena for citizenship practice if by citizenship we understand four components: rights, responsibilities, participation and identity, as formulated by Gerard Delanty. The Habermasian (re)working of the concept remains an essential contribution to theories of democracy and of political participation. With this in mind, the author’s ambition is to address and to rework a specific type of public: an opinion-forming public within a framework of feminist political theory. The article is informed by the assumption that an opinion

  14. Can Lorentz-breaking fermionic condensates form in large N strongly-coupled Lattice Gauge Theories?

    OpenAIRE

    Tomboulis, E. T.

    2010-01-01

    The possibility of Lorentz symmetry breaking (LSB) has attracted considerable attention in recent years for a variety of reasons, including the attractive prospect of the graviton as a Goldstone boson. Though a number of effective field theory analyses of such phenomena have recently been given it remains an open question whether they can take place in an underlying UV complete theory. Here we consider the question of LSB in large N lattice gauge theories in the strong coupling limit. We appl...

  15. Clarifying Normalization

    Science.gov (United States)

    Carpenter, Donald A.

    2008-01-01

    Confusion exists among database textbooks as to the goal of normalization as well as to which normal form a designer should aspire. This article discusses such discrepancies with the intention of simplifying normalization for both teacher and student. This author's industry and classroom experiences indicate such simplification yields quicker…
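
    As a concrete reminder of what normalization accomplishes, the hypothetical sketch below (not drawn from the article) removes a transitive dependency (zip code determines city) from a flat record set, the kind of step taken when moving toward third normal form.

    ```python
    # A denormalized table: "city" depends on "zip", not on the key "order_id".
    orders = [
        {"order_id": 1, "customer": "Ana", "zip": "10001", "city": "New York"},
        {"order_id": 2, "customer": "Ben", "zip": "10001", "city": "New York"},
    ]

    # Third-normal-form style decomposition: factor the zip -> city dependency
    # into its own relation so each fact is stored exactly once.
    zip_city = {row["zip"]: row["city"] for row in orders}
    orders_3nf = [{k: v for k, v in row.items() if k != "city"} for row in orders]

    print(orders_3nf)  # [{'order_id': 1, 'customer': 'Ana', 'zip': '10001'}, ...]
    print(zip_city)    # {'10001': 'New York'}
    ```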

  16. Implementing nutrition guidelines for older people in residential care homes: a qualitative study using Normalization Process Theory

    Directory of Open Access Journals (Sweden)

    Bamford Claire

    2012-10-01

    Full Text Available Abstract Background Optimizing the dietary intake of older people can prevent nutritional deficiencies and diet-related diseases, thereby improving quality of life. However, there is evidence that the nutritional intake of older people living in care homes is suboptimal, with high levels of saturated fat, salt, and added sugars. The UK Food Standards Agency therefore developed nutrient- and food-based guidance for residential care homes. The acceptability of these guidelines and their feasibility in practice is unknown. This study used Normalization Process Theory (NPT) to understand the barriers and facilitators to implementing the guidelines and inform future implementation. Methods We conducted a process evaluation in five care homes in the north of England using qualitative methods (observation and interviews) to explore the views of managers, care staff, catering staff, and domestic staff. Data were analyzed thematically and discussed in data workshops; emerging themes were then mapped to the constructs of NPT. Results Many staff perceived the guidelines as unnecessarily restrictive and irrelevant to older people. In terms of NPT, the guidelines simply did not make sense (coherence), and as a result, relatively few staff invested in the guidelines (cognitive participation). Even where staff supported the guidelines, implementation was hampered by a lack of nutritional knowledge and institutional support (collective action). Finally, the absence of observable benefits to clients confirmed the negative preconceptions of many staff, with limited evidence of reappraisal following implementation (reflexive monitoring). Conclusions The successful implementation of the nutrition guidelines requires that the fundamental issues relating to their perceived value and fit with other priorities and goals be addressed. Specialist support is needed to equip staff with the technical knowledge and skills required for menu analysis and development and to

  17. Birkhoff normalization

    NARCIS (Netherlands)

    Broer, H.; Hoveijn, I.; Lunter, G.; Vegter, G.

    2003-01-01

    The Birkhoff normal form procedure is a widely used tool for approximating a Hamiltonian system by a simpler one. This chapter starts out with an introduction to Hamiltonian mechanics, followed by an explanation of the Birkhoff normal form procedure. Finally we discuss several algorithms for
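
    A minimal statement of the procedure referred to here, in standard notation (assumed, not quoted from the chapter): expand the Hamiltonian around an elliptic equilibrium and remove non-resonant terms order by order with canonical transformations,

    ```latex
    \[
    H = H_2 + H_3 + H_4 + \cdots , \qquad
    H_2 = \sum_{j=1}^{n} \frac{\omega_j}{2}\left(q_j^{2} + p_j^{2}\right), \qquad
    H \circ \Phi = H_2 + Z_3 + Z_4 + \cdots + Z_r + R_{r+1},
    \quad \{Z_k, H_2\} = 0 ,
    \]
    ```

    where each normalized term Z_k Poisson-commutes with H_2, so that in the non-resonant case the truncated normal form depends only on the action variables.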

  18. The electric dipole form factor of the nucleon in chiral perturbation theory to sub-leading order

    NARCIS (Netherlands)

    Mereghetti, E.; de Vries, J.; Hockings, W. H.; Maekawa, C. M.; van Kolck, U.

    2011-01-01

    The electric dipole form factor (EDFF) of the nucleon stemming from the QCD θ̄ term and from the quark color-electric dipole moments is calculated in chiral perturbation theory to sub-leading order. This is the lowest order in which the isoscalar EDFF receives a calculable,

  19. BIOCHEMICAL EFFECTS IN NORMAL AND STONE FORMING RATS TREATED WITH THE RIPE KERNEL JUICE OF PLANTAIN (MUSA PARADISIACA)

    Science.gov (United States)

    Devi, V. Kalpana; Baskar, R.; Varalakshmi, P.

    1993-01-01

    The effect of Musa paradisiaca stem kernel juice was investigated in experimental urolithiatic rats. Stone forming rats exhibited a significant elevation in the activities of two oxalate synthesizing enzymes - Glycollic acid oxidase and Lactate dehydrogenase. Deposition and excretion of stone forming constituents in kidney and urine were also increased in these rats. The enzyme activities and the level of crystalline components were lowered with the extract treatment. The extract also reduced the activities of urinary alkaline phosphatase, lactate dehydrogenase, γ-glutamyl transferase, inorganic pyrophosphatase and β-glucuronidase in calculogenic rats. No appreciable changes were noticed with leucine amino peptidase activity in treated rats. PMID:22556626

  20. Contextual learning theory: Concrete form and a software prototype to improve early education.

    NARCIS (Netherlands)

    Mooij, Ton

    2016-01-01

    In 'contextual learning theory' three types of contextual conditions (differentiation of learning procedures and materials, integrated ICT support, and improvement of development and learning progress) are related to four aspects of the learning process (diagnostic, instructional, managerial, and

  1. Quasiclassical Theory of Spin Imbalance in a Normal Metal-Superconductor Heterostructure with a Spin-Active Interface

    International Nuclear Information System (INIS)

    Shevtsov, O; Löfwander, T

    2014-01-01

    Non-equilibrium phenomena in superconductors have attracted much attention since the first experiments on charge imbalance in the early 1970s. Nowadays a new promising line of research lies at the intersection between superconductivity and spintronics. Here we develop a quasiclassical theory of a single junction between a normal metal and a superconductor with a spin-active interface at finite bias voltages. Due to the spin-mixing and spin-filtering effects of the interface, a non-equilibrium magnetization (or spin imbalance) is induced on the superconducting side of the junction, which relaxes to zero in the bulk. A peculiar feature of the system is the presence of interface-induced Andreev bound states, which influence the magnitude and the decay length of the spin imbalance. Recent experiments on spin and charge density separation in superconducting wires required an external magnetic field for observing a spin signal via non-local measurements. Here, we propose an alternative way to observe spin imbalance without applying a magnetic field.

  2. ω→π0γ* and ϕ→π0γ* transition form factors in dispersion theory

    Science.gov (United States)

    Schneider, Sebastian P.; Kubis, Bastian; Niecknig, Franz

    2012-09-01

    We calculate the ω→π0γ* and ϕ→π0γ* electromagnetic transition form factors based on dispersion theory, relying solely on a previous dispersive analysis of the corresponding three-pion decays and the pion vector form factor. We compare our findings to recent measurements of the ω→π0μ+μ- decay spectrum by the NA60 collaboration, and strongly encourage experimental investigation of the Okubo-Zweig-Iizuka forbidden ϕ→π0ℓ+ℓ- decays in order to understand the strong deviations from vector-meson dominance found in these transition form factors.
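
    As background for the dispersive approach mentioned above, a generic unsubtracted dispersion relation for a form factor f(s) reads as follows (illustrative only; the actual representation in the paper, including possible subtractions and the input for the imaginary part, may differ):

    ```latex
    \[
    f(s) = \frac{1}{\pi}\int_{4m_\pi^{2}}^{\infty}
           \frac{\operatorname{Im} f(s')}{s' - s - i\epsilon}\,\mathrm{d}s' ,
    \]
    ```

    so that the form factor in the decay region is reconstructed from its discontinuity along the unitarity cut.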

  3. Form factors and the dilatation operator in N= 4 super Yang-Mills theory and its deformations

    International Nuclear Information System (INIS)

    Wilhelm, Matthias Oliver

    2016-01-01

    In the first part of this thesis, we study form factors of general gauge-invariant local composite operators in N=4 super Yang-Mills theory at various loop orders and for various numbers of external legs. We show how to use on-shell methods for their calculation and in particular extract the dilatation operator from the result. We also investigate the properties of the corresponding remainder functions. Moreover, we extend on-shell diagrams, a Grassmannian integral formulation and an integrability-based construction via R-operators to form factors, focussing on the chiral part of the stress-tensor supermultiplet as an example. In the second part, we study the β- and the γ_i-deformation, which were respectively shown to be the most general supersymmetric and non-supersymmetric field-theory deformations of N=4 super Yang-Mills theory that are integrable at the level of the asymptotic Bethe ansatz. For these theories, a new kind of finite-size effect occurs, which we call prewrapping and which emerges from double-trace structures that are required in the deformed Lagrangians. While the β-deformation is conformal when the double-trace couplings are at their non-trivial IR fixed points, the γ_i-deformation has running double-trace couplings without fixed points, which break conformal invariance even in the planar theory. Nevertheless, the γ_i-deformation allows for highly non-trivial field-theoretic tests of integrability at arbitrarily high loop orders.

  4. B→D*lν and B→Dlν form factors in staggered chiral perturbation theory

    International Nuclear Information System (INIS)

    Laiho, Jack; Water, Ruth S. van de

    2006-01-01

    We calculate the B→D and B→D* form factors at zero recoil in staggered chiral perturbation theory. We consider heavy-light mesons in which only the light (u, d, or s) quark is staggered; current lattice simulations generally use a highly improved action such as the Fermilab or nonrelativistic QCD action for the heavy (b or c) quark. We work to lowest nontrivial order in the heavy-quark expansion and to one-loop order in the chiral expansion. We present results for a partially quenched theory with three sea quarks in which there are no mass degeneracies (the ''1+1+1'' theory) and for a partially quenched theory in which the u and d sea quark masses are equal (the ''2+1'' theory). We also present results for full (2+1) QCD, along with a numerical estimate of the size of staggered discretization errors. Finally, we calculate the finite volume corrections to the form factors and estimate their numerical size in current lattice simulations

  5. (Anti-)chiral superfield approach to interacting Abelian 1-form gauge theories: Nilpotent and absolutely anticommuting charges

    Science.gov (United States)

    Chauhan, B.; Kumar, S.; Malik, R. P.

    2018-02-01

    We derive the off-shell nilpotent (fermionic) (anti-)BRST symmetry transformations by exploiting the (anti-)chiral superfield approach (ACSA) to Becchi-Rouet-Stora-Tyutin (BRST) formalism for the interacting Abelian 1-form gauge theories where there is a coupling between the U(1) Abelian 1-form gauge field and Dirac as well as complex scalar fields. We exploit the (anti-)BRST invariant restrictions on the (anti-)chiral superfields to derive the fermionic symmetries of our present D-dimensional Abelian 1-form gauge theories. The novel observation of our present investigation is the derivation of the absolute anticommutativity of the nilpotent (anti-)BRST charges despite the fact that our ordinary D-dimensional theories are generalized onto the (D,1)-dimensional (anti-) chiral super-submanifolds (of the general (D,2)-dimensional supermanifold) where only the (anti-)chiral super expansions of the (anti-)chiral superfields have been taken into account. We also discuss the nilpotency of the (anti-)BRST charges and (anti-)BRST invariance of the Lagrangian densities of our present theories within the framework of ACSA to BRST formalism.

  6. Three-index symmetric matter representations of SU(2) in F-theory from non-Tate form Weierstrass models

    Energy Technology Data Exchange (ETDEWEB)

    Klevers, Denis [Theoretical Physics Department, CERN,CH-1211 Geneva 23 (Switzerland); Taylor, Washington [Center for Theoretical Physics, Department of Physics, Massachusetts Institute of Technology,77 Massachusetts Avenue Cambridge, MA 02139 (United States)

    2016-06-29

    We give an explicit construction of a class of F-theory models with matter in the three-index symmetric (4) representation of SU(2). This matter is realized at codimension two loci in the F-theory base where the divisor carrying the gauge group is singular; the associated Weierstrass model does not have the form associated with a generic SU(2) Tate model. For 6D theories, the matter is localized at a triple point singularity of arithmetic genus g=3 in the curve supporting the SU(2) group. This is the first explicit realization of matter in F-theory in a representation corresponding to a genus contribution greater than one. The construction is realized by “unHiggsing” a model with a U(1) gauge factor under which there is matter with charge q=3. The resulting SU(2) models can be further unHiggsed to realize non-Abelian G_2×SU(2) models with more conventional matter content or SU(2)^3 models with trifundamental matter. The U(1) models used as the basis for this construction do not seem to have a Weierstrass realization in the general form found by Morrison-Park, suggesting that a generalization of that form may be needed to incorporate models with arbitrary matter representations and gauge groups localized on singular divisors.

  7. Theory and practice as cultural forms and the research design on The open school program in the Danish school reform

    DEFF Research Database (Denmark)

    Knudsen, Lars Emmerik Damgaard; Haastrup, Lisbeth

    2015-01-01

    and not necessarily bridgeable but always embedded in the cultural setting or topos that surrounds them. This gave us inspiration to view the theory and practice theme in a pedagogical perspective where the knowledge forms and variable relations are not evaluated as optimal or coherent but as different ways ...

  8. The Effect of Normal Force on Tribocorrosion Behaviour of Ti-10Zr Alloy and Porous TiO2-ZrO2 Thin Film Electrochemical Formed

    Science.gov (United States)

    Dănăilă, E.; Benea, L.

    2017-06-01

    The tribocorrosion behaviour of Ti-10Zr alloy and of a porous TiO2-ZrO2 thin film electrochemically formed on Ti-10Zr alloy was evaluated in Fusayama-Mayer artificial saliva solution. Tribocorrosion experiments were performed using a unidirectional pin-on-disc experimental set-up, mechanically and electrochemically instrumented, under various loading conditions. The effect of the applied normal force on the tribocorrosion performance of the tested materials was determined. Open circuit potential (OCP) measurements performed before, during and after the sliding tests were used to assess the tribocorrosion degradation. The applied normal force was found to greatly affect the potential during the tribocorrosion experiments: an increase in the normal force induced a decrease in potential, accelerating the depassivation of the studied materials. The results show a decrease in the friction coefficient with gradually increasing normal load. The porous TiO2-ZrO2 thin film electrochemically formed on the Ti-10Zr alloy was shown to lead to an improvement in tribocorrosion resistance compared to the non-anodized Ti-10Zr alloy intended for biomedical applications.

  9. Asymptotic behavior of the elastic form factor in two-dimensional scalar field theory of the bag model

    International Nuclear Information System (INIS)

    Krapchev, V.

    1976-01-01

    In the framework of the two-dimensional scalar quantum theory of the bag model of Chodos et al a definition of the physical field and a general scheme for constructing a physical state are given. Some of the difficulties associated with such an approach are exposed. Expressions for the physical current and the elastic form factor are given. The calculation of the latter is restricted at first to the approximation in which the mapping from a bag of changing shape to a fixed domain is realized only by a term which is a diagonal, bilinear function of the creation and annihilation operators. This is done for the case of a one-mode and an infinite-mode bag theory. By computing the form factor in an exact one-mode bag model it is shown that the logarithmic falloff of the asymptotic term is the same as the one in the approximation. On the basis of this a form for the asymptotic behavior of the form factor is suggested which may be correct for the general two-dimensional scalar bag theory

  10. Analysis of the nonlinear dynamic behavior of power systems using normal forms of superior order; Analisis del comportamiento dinamico no lineal de sistemas de potencia usando formas normales de orden superior

    Energy Technology Data Exchange (ETDEWEB)

    Marinez Carrillo, Irma

    2003-08-01

    This thesis investigates the application of perturbation methods from nonlinear dynamic systems theory to the study of small-signal stability of electric power systems. The work centres on two fundamental aspects of the nonlinear dynamic behaviour of the system: the characterization and quantification of the degree of nonlinear interaction between the fundamental oscillation modes of the system, and the study of the modes with the greatest influence on the system response to small disturbances. With these objectives, a general mathematical model, based on the power-series expansion of the nonlinear power system model and the theory of normal forms of vector fields, is proposed for the study of the dynamic behaviour of the power system. The proposed tool generalizes existing methods in the literature by including higher-order effects in the dynamic model of the power system. Starting from this representation, a methodology is proposed to obtain closed-form analytical solutions, and the extension of existing methods to identify and quantify the degree of interaction among the fundamental oscillation modes of the system is investigated. The developed tool allows, from closed-form analytical expressions, the development of analytical measures to evaluate the degree of stress in the system, the interaction between the fundamental oscillation modes, and the determination of stability boundaries. The conceptual development of the method proposed in this thesis offers, moreover, great flexibility to incorporate detailed models of the power system and to evaluate diverse measures of nonlinear modal interaction. Finally, results are presented from the application of the proposed method to the study of the nonlinear dynamic behaviour of a single-machine infinite-bus system considering different levels of modelling detail.
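
    The second-order normal form construction that underlies this kind of analysis can be sketched as follows (standard notation in the power-system normal form literature; assumed here rather than copied from the thesis). After expanding the system about a stable equilibrium and transforming to modal (Jordan) coordinates y, the quadratic terms are removed by a near-identity change of variables,

    ```latex
    \[
    \dot{y}_j = \lambda_j y_j + \sum_{k,l} C^{j}_{kl}\, y_k y_l + \cdots , \qquad
    y_j = z_j + \sum_{k,l} h^{j}_{2kl}\, z_k z_l , \qquad
    h^{j}_{2kl} = \frac{C^{j}_{kl}}{\lambda_k + \lambda_l - \lambda_j} ,
    \]
    ```

    valid away from second-order resonances λ_k + λ_l ≈ λ_j; large coefficients h^j_2kl are then read as indicators of strong nonlinear interaction between modes k, l and mode j.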

  11. A Hard X-Ray Study of the Normal Star-Forming Galaxy M83 with NuSTAR

    DEFF Research Database (Denmark)

    Yukita, M.; Hornschemeier, A. E.; Lehmer, B. D.

    2016-01-01

    We present the results from sensitive, multi-epoch NuSTAR observations of the late-type star-forming galaxy M83 (d = 4.6 Mpc). This is the first investigation to spatially resolve the hard (E > 10 keV) X-ray emission of this galaxy. The nuclear region and ∼20 off-nuclear point sources, including a previously discovered ultraluminous X-ray source, are detected in our NuSTAR observations. The X-ray hardnesses and luminosities of the majority of the point sources are consistent with hard X-ray sources resolved in the starburst galaxy NGC 253. We infer that the hard X-ray emission is most

  12. The Four Elementary Forms of Sociality: Framework for a Unified Theory of Social Relations.

    Science.gov (United States)

    Fiske, Alan Page

    1992-01-01

    A theory is presented that postulates that people in all cultures use four relational models to generate most kinds of social interaction, evaluation, and affect. Ethnographic and field studies (n=19) have supported cultural variations on communal sharing; authority ranking; equality matching; and market pricing. (SLD)

  13. Towards a spatial theory of organizations : Creating new organizational forms to improve business performance

    NARCIS (Netherlands)

    Tissen, R.J.; Lekanne Deprez, F.R.E.

    2008-01-01

    Research in the field of management and organizational theory generally indicates the absence of space in organizations. Space has largely been a neglected phenomenon, left implicit to practice as something ‘limiting’ without actually ‘existing’. The aim of this research paper is to explore and

  14. Storytelling Leadership: A Semiotics Theories Qualitative Inquiry into the Components Forming an Oral Story

    Science.gov (United States)

    Cater, Earl F.

    2015-01-01

    Using semiotics theories as a guide, the qualitative examination of storytelling literature and current storytelling practitioners provides research support for a list of storytelling components. Analysis of story building components discovered from literature in comparison to the results from research questionnaire responses by current…

  15. Gluing operation and form factors of local operators in N = 4 Super Yang-Mills theory

    Science.gov (United States)

    Bolshov, A. E.

    2018-04-01

    The gluing operation is an effective way to get form factors of both local and non-local operators starting from different representations of on-shell scattering amplitudes. In this paper it is shown how it works on the example of form factors of operators from stress-tensor operator supermultiplet in Grassmannian and spinor helicity representations.

  16. Theory of the Andreev reflection and the density of states in proximity contact normal-superconducting infinite double-layer

    International Nuclear Information System (INIS)

    Nagato, Yasushi; Nagai, Katsuhiko

    1993-01-01

    Proximity contact N-S double-layer with infinite layer widths is studied in the clean limit. The finite reflection at the interface is taken into account. Starting from a recent theory of finite width double-layer by Ashida et al., the authors obtain explicit expressions for the quasi-classical Green's function which already satisfy the boundary condition and include no exploding terms at infinities. The self-consistent pair potentials are obtained numerically with sufficient accuracy. The Andreev reflection at the N-S interface is discussed on the basis of the self-consistent pair potential. It is shown that there exists a resonance state in a potential valley formed between the depressed pair potential and the partially reflecting interface, which leads to a peak of the Andreev reflection coefficient with the height unity slightly below the bulk superconductor energy gap. They also find general relationship between the Andreev reflection coefficient and the local density of states of the superconductor just at the interface

  17. Exact form factors for the scaling Z_N-Ising and the affine A_{N-1}-Toda quantum field theories

    International Nuclear Information System (INIS)

    Babujian, H.; Karowski, M.

    2003-01-01

    Previous results on form factors for the scaling Ising and the sinh-Gordon models are extended to general Z_N-Ising and affine A_{N-1}-Toda quantum field theories. In particular, results for the order and disorder parameters and the para-Fermi fields σ_Q(x), μ_Q̃(x) and ψ_Q(x) are presented for the Z_N-model. For the A_{N-1}-Toda model, form factors for exponentials of the Toda fields are proposed. The quantum field equation of motion is proved and the mass and wave function renormalization are calculated exactly.

  18. Pion form factor in QCD sum rules, local duality approach, and O(α_s²) fractional analytic perturbation theory

    International Nuclear Information System (INIS)

    Bakulev, Alexander P.

    2010-01-01

    Using the results on the electromagnetic pion form factor (FF) obtained in the O(α_s) QCD sum rules with non-local condensates [A.P. Bakulev, A.V. Pimikov, and N.G. Stefanis, Phys. Rev. D79 (2009) 093010], we determine the effective continuum threshold for the local duality approach. We then apply it to construct the O(α_s²) estimate of the pion FF in the framework of fractional analytic perturbation theory.

  19. Einstein in matrix form exact derivation of the theory of special and general relativity without tensors

    CERN Document Server

    Ludyk, Günter

    2013-01-01

    This book is an introduction to the theories of Special and General Relativity. The target audience are physicists, engineers and applied scientists who are looking for an understandable introduction to the topic - without too much new mathematics. The fundamental equations of Einstein's theory of Special and General Relativity are derived using matrix calculus, without the help of tensors. This feature makes the book special and a valuable tool for scientists and engineers with no experience in the field of tensor calculus. In part I the foundations of Special Relativity are developed, part II describes the structure and principle of General Relativity. Part III explains the Schwarzschild solution of spherical body gravity and examines the "Black Hole" phenomenon. Any necessary mathematical tools are provided in a user-friendly way, either directly in the text or in the appendices.
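
    As a flavour of the matrix formulation the book advertises (a standard special-relativity example, not an excerpt from the text), a boost with velocity v along the x-axis acts on the coordinate column vector as

    ```latex
    \[
    \begin{pmatrix} c\,t' \\ x' \end{pmatrix}
    =
    \begin{pmatrix} \gamma & -\gamma\beta \\ -\gamma\beta & \gamma \end{pmatrix}
    \begin{pmatrix} c\,t \\ x \end{pmatrix},
    \qquad
    \beta = \frac{v}{c}, \qquad \gamma = \frac{1}{\sqrt{1-\beta^{2}}} .
    \]
    ```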

  20. Forming the Modern Labour Market Economics: On the Role of Institutionalist Theories

    Directory of Open Access Journals (Sweden)

    Dagmar Brožová

    2016-12-01

    Full Text Available The growing role of institutions and their influence on labour market outcomes, i.e., wage rates and labour allocation, has been among the most significant characteristic features of labour markets in recent decades. Labour market economics built its paradigm on the principles of marginalism, which brought suitable instruments for analysing the individual decisions of market agents capable of achieving effective solutions. Smith's "invisible hand" has gradually been limited by institutional interventions – by governments, corporations and trade unions with government legislation, corporate personnel policies and collective bargaining. The expanding regulatory interventions in the labour market and the effort to explain this reality lead inevitably to the fact that modern labour market economics incorporates more and more institutional theories. The contribution outlines the gradual invasion of neoinstitutional topics and theories into the neoclassical labour market paradigm and analyses the differences between the neoclassical and institutional interpretations of labour market functioning. It presents the recent discussion of the consequences for labour market economic theory and formulates a conclusion about the modified labour market economic paradigm.

  1. Einstein in matrix form. Exact derivation of the theory of special and general relativity without tensors

    Energy Technology Data Exchange (ETDEWEB)

    Ludyk, Guenter [Bremen Univ. (Germany). Physics and Electrical Engineering]

    2013-11-01

    Derives the fundamental equations of Einstein's theory of special and general relativity using matrix calculus, without the help of tensors. Provides necessary mathematical tools in a user-friendly way, either directly in the text or in the appendices. Appendices contain an introduction to classical dynamics as a refresher of known fundamental physics. Rehearses vector and matrix calculus, differential geometry, and some special solutions of general relativity in the appendices. This book is an introduction to the theories of Special and General Relativity. The target audience are physicists, engineers and applied scientists who are looking for an understandable introduction to the topic - without too much new mathematics. The fundamental equations of Einstein's theory of Special and General Relativity are derived using matrix calculus, without the help of tensors. This feature makes the book special and a valuable tool for scientists and engineers with no experience in the field of tensor calculus. In part I the foundations of Special Relativity are developed, part II describes the structure and principle of General Relativity. Part III explains the Schwarzschild solution of spherical body gravity and examines the "Black Hole" phenomenon. Any necessary mathematical tools are provided in a user-friendly way, either directly in the text or in the appendices.

  2. Einstein in matrix form. Exact derivation of the theory of special and general relativity without tensors

    International Nuclear Information System (INIS)

    Ludyk, Guenter

    2013-01-01

    Derives the fundamental equations of Einstein's theory of special and general relativity using matrix calculus, without the help of tensors. Provides necessary mathematical tools in a user-friendly way, either directly in the text or in the appendices. Appendices contain an introduction to classical dynamics as a refresher of known fundamental physics. Rehearses vector and matrix calculus, differential geometry, and some special solutions of general relativity in the appendices. This book is an introduction to the theories of Special and General Relativity. The target audience are physicists, engineers and applied scientists who are looking for an understandable introduction to the topic - without too much new mathematics. The fundamental equations of Einstein's theory of Special and General Relativity are derived using matrix calculus, without the help of tensors. This feature makes the book special and a valuable tool for scientists and engineers with no experience in the field of tensor calculus. In part I the foundations of Special Relativity are developed, part II describes the structure and principle of General Relativity. Part III explains the Schwarzschild solution of spherical body gravity and examines the "Black Hole" phenomenon. Any necessary mathematical tools are provided in a user-friendly way, either directly in the text or in the appendices.

  3. Developing Item Response Theory-Based Short Forms to Measure the Social Impact of Burn Injuries.

    Science.gov (United States)

    Marino, Molly E; Dore, Emily C; Ni, Pengsheng; Ryan, Colleen M; Schneider, Jeffrey C; Acton, Amy; Jette, Alan M; Kazis, Lewis E

    2018-03-01

    Objective: To develop self-reported short forms for the Life Impact Burn Recovery Evaluation (LIBRE) Profile. Design: Short forms based on the item parameters of discrimination and average difficulty. Setting: A support network for burn survivors, peer support networks, social media, and mailings. Participants: Burn survivors (N=601) older than 18 years. Interventions: Not applicable. Main Outcome Measure: The LIBRE Profile. Results: Ten-item short forms were developed to cover the 6 LIBRE Profile scales: Relationships with Family & Friends, Social Interactions, Social Activities, Work & Employment, Romantic Relationships, and Sexual Relationships. Ceiling effects were ≤15% for all scales; floor effects were … The item bank, computerized adaptive test, and short forms are all scored along the same metric, and therefore scores are comparable regardless of the mode of administration. Copyright © 2017 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
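
    The item parameters named above (discrimination and average difficulty) drive short-form selection through the item information function of the two-parameter logistic model. The Python sketch below is purely illustrative, with invented parameter values, and is not code or data from the study.

    ```python
    import numpy as np

    def item_information(theta: np.ndarray, a: float, b: float) -> np.ndarray:
        """Fisher information of a 2PL item (discrimination a, difficulty b)."""
        p = 1.0 / (1.0 + np.exp(-a * (theta - b)))  # probability of endorsing the item
        return a**2 * p * (1.0 - p)

    theta_grid = np.linspace(-3.0, 3.0, 121)        # latent-trait range
    items = {"item_1": (1.8, -0.5), "item_2": (0.7, 0.0), "item_3": (2.1, 1.0)}

    # Rank items by the information they contribute across the trait range;
    # a short form keeps the top-ranked items from each scale.
    ranked = sorted(items,
                    key=lambda name: item_information(theta_grid, *items[name]).sum(),
                    reverse=True)
    print(ranked)
    ```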

  4. The integral form of D = 3 Chern-Simons theories probing C^n/Γ singularities

    Energy Technology Data Exchange (ETDEWEB)

    Fre, P. [Dipartimento di Fisica, Universita di Torino (Italy); INFN - Sezione di Torino (Italy); Arnold-Regge Center, Torino (Italy); National Research Nuclear University MEPhI, (Moscow Engineering Physics Institute), Moscow (Russian Federation); Grassi, P.A. [INFN - Sezione di Torino (Italy); Arnold-Regge Center, Torino (Italy); DISIT, Universita del Piemonte Orientale, Alessandria (Italy); Center for Gravitational Physics, Yukawa Institute for Theoretical Physics, Kyoto University (Japan)

    2017-10-15

    We consider D=3 supersymmetric Chern-Simons gauge theories both from the point of view of their formal structure and of their applications to the AdS_4/CFT_3 correspondence. From the structural viewpoint, we use the new formalism of integral forms in superspace that utilizes the rheonomic Lagrangians and the Picture Changing Operators, as an algorithmic tool providing the connection between different approaches to supersymmetric theories. We provide here the generalization to an arbitrary Kaehler manifold with arbitrary gauge group and arbitrary superpotential of the rheonomic lagrangian of D=3 matter coupled gauge theories constructed years ago. From the point of view of the AdS_4/CFT_3 correspondence and more generally of M2-branes, we emphasize the role of the Kaehler quotient data in determining the field content and the interactions of the Chern-Simons gauge theory when the transverse space to the brane is a non-compact Kaehler quotient K_4 of some flat variety with respect to a suitable group. The crepant resolutions of C^n/Γ singularities fall in this category. In the present paper we anticipate the general scheme of how the geometrical data are to be utilized in the construction of the D=3 Chern-Simons theory supposedly dual to the corresponding M2-brane solution. (copyright 2017 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  5. A Closed-Form Solution to Tensor Voting: Theory and Applications

    OpenAIRE

    Wu, Tai-Pang; Yeung, Sai-Kit; Jia, Jiaya; Tang, Chi-Keung; Medioni, Gerard

    2016-01-01

    We prove a closed-form solution to tensor voting (CFTV): given a point set in any dimensions, our closed-form solution provides an exact, continuous and efficient algorithm for computing a structure-aware tensor that simultaneously achieves salient structure detection and outlier attenuation. Using CFTV, we prove the convergence of tensor voting on a Markov random field (MRF), thus termed as MRFTV, where the structure-aware tensor at each input site reaches a stationary state upon convergence...

  6. Comments on "A closed-form solution to Tensor voting: theory and applications"

    OpenAIRE

    Maggiori, Emmanuel; Lotito, Pablo Andres; Manterola, Hugo Luis; del Fresno, Mariana

    2017-01-01

    We comment on a paper that describes a closed-form formulation to Tensor Voting, a technique to perceptually group clouds of points, usually applied to infer features in images. The authors proved an analytic solution to the technique, a highly relevant contribution considering that the original formulation required numerical integration, a time-consuming task. Their work constitutes the first closed-form expression for the Tensor Voting framework. In this work we first observe that the propo...

  7. Two-Point Incremental Forming with Partial Die: Theory and Experimentation

    Science.gov (United States)

    Silva, M. B.; Martins, P. A. F.

    2013-04-01

    This paper proposes a new level of understanding of two-point incremental forming (TPIF) with partial die by means of a combined theoretical and experimental investigation. The theoretical developments include an innovative extension of the analytical model for rotational symmetric single point incremental forming (SPIF), originally developed by the authors, to address the influence of the major operating parameters of TPIF and to successfully explain the differences in formability between SPIF and TPIF. The experimental work comprised the mechanical characterization of the material and the determination of its formability limits at necking and fracture by means of circle grid analysis and benchmark incremental sheet forming tests. Results show the adequacy of the proposed analytical model to handle the deformation mechanics of SPIF and TPIF with partial die and demonstrate that neck formation is suppressed in TPIF, so that traditional forming limit curves are inapplicable to describe failure and must be replaced by fracture forming limits derived from ductile damage mechanics. The overall geometric accuracy of sheet metal parts produced by TPIF with partial die is found to be better than that of parts fabricated by SPIF due to smaller elastic recovery upon unloading.
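
    For orientation, analyses of incremental sheet forming are often benchmarked against the classical sine law for wall thinning, written below in standard notation (an assumption for illustration; the membrane analysis extended in this paper is more detailed):

    ```latex
    \[
    t_f = t_0 \sin\left(90^{\circ} - \psi\right) = t_0 \cos\psi ,
    \]
    ```

    where t_0 is the initial sheet thickness, t_f the wall thickness after forming, and ψ the wall angle measured from the initial sheet plane.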

  8. Implementing monitoring technologies in care homes for people with dementia: A qualitative exploration using Normalization Process Theory.

    Science.gov (United States)

    Hall, Alex; Wilson, Christine Brown; Stanmore, Emma; Todd, Chris

    2017-07-01

    Ageing societies and a rising prevalence of dementia are associated with increasing demand for care home places. Monitoring technologies (e.g. bed-monitoring systems; wearable location-tracking devices) are appealing to care homes as they may enhance safety, increase resident freedom, and reduce staff burden. However, there are ethical concerns about the use of such technologies, and it is unclear how they might be implemented to deliver their full range of potential benefits. This study explored facilitators and barriers to the implementation of monitoring technologies in care homes. Embedded multiple-case study with qualitative methods. Three dementia-specialist care homes in North-West England. Purposive sample of 24 staff (including registered nurses, clinical specialists, senior managers and care workers), 9 relatives and 9 residents. 36 semi-structured interviews with staff, relatives and residents; 175h of observation; resident care record review. Data collection informed by Normalization Process Theory, which seeks to account for how novel interventions become routine practice. Data analysed using Framework Analysis. Findings are presented under three main themes: 1. Reasons for using technologies: The primary reason for using monitoring technologies was to enhance safety. This often seemed to override consideration of other potential benefits (e.g. increased resident freedom) or ethical concerns (e.g. resident privacy); 2. Ways in which technologies were implemented: Some staff, relatives and residents were not involved in discussions and decision-making, which seemed to limit understandings of the potential benefits and challenges from the technologies. Involvement of residents appeared particularly challenging. Staff highlighted the importance of training, but staff training appeared mainly informal which did not seem sufficient to ensure that staff fully understood the technologies; 3. Use of technologies in practice: Technologies generated frequent

  9. Afriphone Literature as a Prototypical Form of African Literature: Insights from Prototype Theory

    Science.gov (United States)

    Bodomo, Adams

    2016-01-01

    What is the most prototypical form of African literature? Shouldn't we be using African languages to produce African literary texts, shouldn't we produce more Afriphone African literature compared to Europhone African literature or Afro-Europhone literature? This issue underlies the reality that the vast majority of African writers presumably…

  10. The pathophysiology of the aqueduct stroke volume in normal pressure hydrocephalus: can co-morbidity with other forms of dementia be excluded?

    Energy Technology Data Exchange (ETDEWEB)

    Bateman, Grant A. [John Hunter Hospital, Department of Medical Imaging, Newcastle (Australia); Levi, Christopher R.; Wang, Yang; Lovett, Elizabeth C. [Hunter Medical Research Institute, Clinical Neurosciences Program, Newcastle (Australia); Schofield, Peter [James Fletcher Hospital, Neuropsychiatry Unit, Newcastle (Australia)

    2005-10-01

    Variable results are obtained from the treatment of normal pressure hydrocephalus (NPH) by shunt insertion. There is a high correlation between NPH and the pathology of Alzheimer's disease (AD) on brain biopsy. There is an overlap between AD and vascular dementia (VaD), suggesting that a correlation exists between NPH and other forms of dementia. This study seeks to (1) understand the physiological factors behind, and (2) define the ability of, the aqueduct stroke volume to exclude dementia co-morbidity. Twenty-four patients from a dementia clinic were classified as having either early AD or VaD on the basis of clinical features, Hachinski score and neuropsychological testing. They were compared with 16 subjects with classical clinical findings of NPH and 12 age-matched non-cognitively impaired subjects. MRI flow quantification was used to measure aqueduct stroke volume and arterial pulse volume. An arterio-cerebral compliance ratio was calculated from the two volumes in each patient. The aqueduct stroke volume was elevated in all three forms of dementia, with no significant difference noted between the groups. The arterial pulse volume was elevated by 24% in VaD and reduced by 35% in NPH, compared to normal (P=0.05 and P=0.002, respectively), and was normal in AD. There was a spectrum of relative compliance with normal compliance in VaD and reduced compliance in AD and NPH. The aqueduct stroke volume depends on the arterial pulse volume and the relative compliance between the arterial tree and brain. The aqueduct stroke volume cannot exclude significant co-morbidity in NPH. (orig.)

  11. The pathophysiology of the aqueduct stroke volume in normal pressure hydrocephalus: can co-morbidity with other forms of dementia be excluded?

    International Nuclear Information System (INIS)

    Bateman, Grant A.; Levi, Christopher R.; Wang, Yang; Lovett, Elizabeth C.; Schofield, Peter

    2005-01-01

    Variable results are obtained from the treatment of normal pressure hydrocephalus (NPH) by shunt insertion. There is a high correlation between NPH and the pathology of Alzheimer's disease (AD) on brain biopsy. There is an overlap between AD and vascular dementia (VaD), suggesting that a correlation exists between NPH and other forms of dementia. This study seeks to (1) understand the physiological factors behind, and (2) define the ability of, the aqueduct stroke volume to exclude dementia co-morbidity. Twenty-four patients from a dementia clinic were classified as having either early AD or VaD on the basis of clinical features, Hachinski score and neuropsychological testing. They were compared with 16 subjects with classical clinical findings of NPH and 12 age-matched non-cognitively impaired subjects. MRI flow quantification was used to measure aqueduct stroke volume and arterial pulse volume. An arterio-cerebral compliance ratio was calculated from the two volumes in each patient. The aqueduct stroke volume was elevated in all three forms of dementia, with no significant difference noted between the groups. The arterial pulse volume was elevated by 24% in VaD and reduced by 35% in NPH, compared to normal (P=0.05 and P=0.002, respectively), and was normal in AD. There was a spectrum of relative compliance with normal compliance in VaD and reduced compliance in AD and NPH. The aqueduct stroke volume depends on the arterial pulse volume and the relative compliance between the arterial tree and brain. The aqueduct stroke volume cannot exclude significant co-morbidity in NPH. (orig.)

  12. On-the-Job Ethics – Proximity Morality Forming in Medical School: A grounded theory analysis using survey data

    Directory of Open Access Journals (Sweden)

    Hans O. Thulesius, MD, Ph.D.

    2009-03-01

    Full Text Available On-the-job-ethics exist in all businesses and can also be called proximity morality forming. In this paper we propose that medical students take a proximity morality stance towards ethics education at medical school. This means that they want to form physician morality “on the job” instead of being taught ethics like any other subject. On-the-job-ethics for medical students involves learning ethics that is used when practicing ethics. Learning ethics includes comprehensive ethics courses in which quality lectures provide ethics grammar useful for the ethics practicing in attitude exercises and vignette reflections in tutored group discussions. On-the-job-ethics develops professional identity, handles diversity of religious and existential worldviews, trains students described as ethically naive, processes difficult clinical experiences, and desists negative role modeling from physicians in clinical or teaching situations. This grounded theory analysis was made from a questionnaire survey on attitudes to ethics education with 409 Swedish medical students participating. We analyzed over 8000 words of open-ended responses and multiple-choice questions using classic grounded theory procedures, but also compared questionnaire data using statistics such as multiple regression models. The paper gives an example of how grounded theory can be used with a limited amount of survey data.

  13. Theory of the l-state population of Rydberg states formed in ion-solid collisions

    International Nuclear Information System (INIS)

    Kemmler, J.; Burgdoerfer, J.; Reinhold, C.O.

    1991-01-01

    The experimentally observed high-l-state population of ions excited in ion-solid interactions differs sharply from l-state populations produced in ion-atom collisions. We have studied the population dynamics of electronic excitation and transport within the framework of a classical transport theory for O2+ (2 MeV/u) ions traversing C foils. The resulting delayed-photon-emission intensities are found to be in very good agreement with experiment. Initial phase-space conditions have been obtained from both classical-trajectory Monte Carlo calculations and random initial distributions. We find evidence that the very-high-l-state populations produced in ion-solid collisions are the result of a diffusion to high-l states under the influence of multiple scattering in the bulk of the solid.

  14. Three-Index Symmetric Matter Representations of SU(2) in F-Theory from Non-Tate Form Weierstrass Models

    CERN Document Server

    Klevers, Denis

    2016-01-01

    We give an explicit construction of a class of F-theory models with matter in the three-index symmetric (4) representation of SU(2). This matter is realized at codimension two loci in the F-theory base where the divisor carrying the gauge group is singular; the associated Weierstrass model does not have the form associated with a generic SU(2) Tate model. For 6D theories, the matter is localized at a triple point singularity of arithmetic genus g=3 in the curve supporting the SU(2) group. This is the first explicit realization of matter in F-theory in a representation corresponding to a genus contribution greater than one. The construction is realized by "unHiggsing" a model with a U(1) gauge factor under which there is matter with charge q=3. The resulting SU(2) models can be further unHiggsed to realize non-Abelian G_2xSU(2) models with more conventional matter content or SU(2)^3 models with trifundamental matter. The U(1) models used as the basis for this construction do not seem to have a Weierstrass real...

  15. Comments on "A Closed-Form Solution to Tensor Voting: Theory and Applications".

    Science.gov (United States)

    Maggiori, Emmanuel; Lotito, Pablo; Manterola, Hugo Luis; del Fresno, Mariana

    2014-12-01

    We comment on a paper that describes a closed-form formulation to Tensor Voting, a technique to perceptually group clouds of points, usually applied to infer features in images. The authors proved an analytic solution to the technique, a highly relevant contribution considering that the original formulation required numerical integration, a time-consuming task. Their work constitutes the first closed-form expression for the Tensor Voting framework. In this work we first observe that the proposed formulation leads to unexpected results which do not satisfy the constraints for a Tensor Voting output, hence they cannot be interpreted. Given that the closed-form expression is said to be an analytic equivalent solution, unexpected outputs should not be encountered unless there are flaws in the proof. We analyzed the underlying math to find which were the causes of these unexpected results. In this commentary we show that their proposal does not in fact provide a proper analytic solution to Tensor Voting and we indicate the flaws in the proof.

  16. Evaluation of various mass-transport theory and empiricism used in the interpretation of leaching data for cementitious waste forms

    International Nuclear Information System (INIS)

    Spence, R. D.; Godbee, H. W.; Tallent, O. K.; Nestor, C. W.; McDaniel, E. W.

    1991-01-01

    Despite the demonstrated importance of diffusion control in leaching, other mechanisms have been observed to play a role. Thus, leaching from porous solid bodies is not simple diffusion. However, only the theory of simple diffusion has been developed well enough for extrapolation. This diffusion theory, used in data analysis by ANSI/ANS-16.1 and the NEWBOX program, can help in trying to extrapolate and predict the performance of solidified waste forms over decades and centuries, but the limitations and increased uncertainty of such applications must be understood. Treating leaching as a semi-infinite medium problem, as done in the Cote model, results in simpler equations but limits application to early leaching behavior (when less than 20% of a given component has been leached)
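
    For readers who want to reproduce this kind of extrapolation, the sketch below is a minimal illustration (in Python, with made-up parameter values; it is not the NEWBOX program or the Cote model) of the semi-infinite-medium diffusion estimate referred to above, CFL(t) = 2 (S/V) sqrt(De t / pi), together with the roughly 20% cut-off beyond which that approximation becomes questionable.

        import math

        def cumulative_fraction_leached(De, S_over_V, t_seconds):
            """Semi-infinite-medium diffusion estimate of the cumulative
            fraction leached: CFL = 2*(S/V)*sqrt(De*t/pi).
            De        -- effective diffusion coefficient, cm^2/s
            S_over_V  -- surface-to-volume ratio of the waste form, 1/cm
            t_seconds -- elapsed leaching time, s
            """
            return 2.0 * S_over_V * math.sqrt(De * t_seconds / math.pi)

        # Hypothetical values for illustration only: De = 1e-9 cm^2/s, S/V = 1.2 1/cm.
        for t_days in (7, 30, 90, 365):
            cfl = cumulative_fraction_leached(1e-9, 1.2, t_days * 86400.0)
            note = "" if cfl < 0.20 else "  (beyond ~20%: semi-infinite assumption questionable)"
            print(f"t = {t_days:4d} d   CFL = {cfl:.3f}{note}")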

  17. Interpretation of leaching data for cementitious waste forms using analytical solutions based on mass transport theory and empiricism

    International Nuclear Information System (INIS)

    Spence, R.D.; Godbee, H.W.; Tallent, O.K.; McDaniel, E.W.; Nestor, C.W.

    1991-01-01

    Despite the demonstrated importance of diffusion control in leaching, other mechanisms have been observed to play a role and leaching from porous solid bodies is not simple diffusion. Only simple diffusion theory has been developed well enough for extrapolation, as yet. The well-developed diffusion theory, used in data analysis by ANSI/ANS-16.1 and the NEWBOX program, can help in trying to extrapolate and predict the performance of solidified waste forms over decades and centuries, but the limitations and increased uncertainty must be understood in so doing. Treating leaching as a semi-infinite medium problem, as done in the Cote model, results in simpler equations, but limits application to early leaching behavior when less than 20% of a given component has been leached. 18 refs., 2 tabs

  18. Quantum theory with an energy operator defined as a quartic form of the momentum

    Energy Technology Data Exchange (ETDEWEB)

    Bezák, Viktor, E-mail: bezak@fmph.uniba.sk

    2016-09-15

    Quantum theory of the non-harmonic oscillator defined by the energy operator proposed by Yurke and Buks (2006) is presented. Although these authors considered a specific problem related to a model of transmission lines in a Kerr medium, our ambition is not to discuss the physical substantiation of their model. Instead, we consider the problem from an abstract, logically deductive, viewpoint. Using the Yurke–Buks energy operator, we focus attention on the imaginary-time propagator. We derive it as a functional of the Mehler kernel and, alternatively, as an exact series involving Hermite polynomials. For a statistical ensemble of identical oscillators defined by the Yurke–Buks energy operator, we calculate the partition function, average energy, free energy and entropy. Using the diagonal element of the canonical density matrix of this ensemble in the coordinate representation, we define a probability density, which appears to be a deformed Gaussian distribution. A peculiarity of this probability density is that it may reveal, when plotted as a function of the position variable, a shape with two peaks located symmetrically with respect to the central point.
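
    Since the thermodynamic quantities quoted above follow from the spectrum alone, a hedged sketch may help: the Python fragment below assumes a set of eigenvalues E_n is already available (the spectrum used here is invented for illustration, not the actual Yurke–Buks spectrum) and evaluates the partition function, average energy, free energy and entropy at a given inverse temperature.

        import numpy as np

        def thermodynamics(energies, beta):
            """Canonical-ensemble quantities from a discrete spectrum (k_B = 1).
            energies -- eigenvalues E_n, in the same energy units as 1/beta
            beta     -- inverse temperature
            Returns (Z, U, F, S)."""
            E = np.asarray(energies, dtype=float)
            E0 = E.min()                              # shift for numerical stability
            w = np.exp(-beta * (E - E0))
            Z = w.sum() * np.exp(-beta * E0)          # partition function
            U = (E * w).sum() / w.sum()               # average energy
            F = -np.log(Z) / beta                     # free energy
            S = beta * (U - F)                        # entropy
            return Z, U, F, S

        # Invented anharmonic-looking spectrum, purely for illustration.
        n = np.arange(200)
        E_n = 0.5 * (n + 0.5) + 0.01 * (n + 0.5) ** 2
        print(thermodynamics(E_n, beta=1.0))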

  19. A closed-form solution to tensor voting: theory and applications.

    Science.gov (United States)

    Wu, Tai-Pang; Yeung, Sai-Kit; Jia, Jiaya; Tang, Chi-Keung; Medioni, Gérard

    2012-08-01

    We prove a closed-form solution to tensor voting (CFTV): Given a point set in any dimensions, our closed-form solution provides an exact, continuous, and efficient algorithm for computing a structure-aware tensor that simultaneously achieves salient structure detection and outlier attenuation. Using CFTV, we prove the convergence of tensor voting on a Markov random field (MRF), thus termed as MRFTV, where the structure-aware tensor at each input site reaches a stationary state upon convergence in structure propagation. We then embed structure-aware tensor into expectation maximization (EM) for optimizing a single linear structure to achieve efficient and robust parameter estimation. Specifically, our EMTV algorithm optimizes both the tensor and fitting parameters and does not require random sampling consensus typically used in existing robust statistical techniques. We performed quantitative evaluation on its accuracy and robustness, showing that EMTV performs better than the original TV and other state-of-the-art techniques in fundamental matrix estimation for multiview stereo matching. The extensions of CFTV and EMTV for extracting multiple and nonlinear structures are underway.

  20. Computationally simple, analytic, closed form solution of the Coulomb self-interaction problem in Kohn Sham density functional theory

    International Nuclear Information System (INIS)

    Gonis, Antonios; Daene, Markus W.; Nicholson, Don M.; Stocks, George Malcolm

    2012-01-01

    We have developed and tested in terms of atomic calculations an exact, analytic and computationally simple procedure for determining the functional derivative of the exchange energy with respect to the density in the implementation of the Kohn Sham formulation of density functional theory (KS-DFT), providing an analytic, closed-form solution of the self-interaction problem in KS-DFT. We demonstrate the efficacy of our method through ground-state calculations of the exchange potential and energy for atomic He and Be atoms, and comparisons with experiment and the results obtained within the optimized effective potential (OEP) method.

  1. A comparative study of deficit pattern in theory of mind and emotion regulation methods in evaluating patients with bipolar disorder and normal individuals

    OpenAIRE

    Ali Fakhari; Khalegh Minashiri; Abolfazl Fallahi; Mohammad Taher Panah

    2013-01-01

    BACKGROUND: This study compared patterns of deficit in "theory of mind" and "emotion regulation" in patients with bipolar disorder and normal individuals. METHODS: In this causal-comparative study, subjects were 20 patients with bipolar disorder and 20 normal individuals. Patients were selected via convenience sampling method among hospitalized patients at Razi hospital of Tabriz, Iran. The data was collected through two scales: Reading the Mind in the Eyes Test and Emotion Regulation Questionnai...

  2. Form factors and charge radii in a quantum chromodynamics-inspired potential model using variationally improved perturbation theory

    International Nuclear Information System (INIS)

    Hazarika, Bhaskar Jyoti; Choudhury, D.K.

    2015-01-01

    We use variationally improved perturbation theory (VIPT) for calculating the elastic form factors and charge radii of D, D_s, B, B_s and B_c mesons in a quantum chromodynamics (QCD)-inspired potential model. For that, we use a linear-cum-Coulombic potential and opt for the Coulombic part first as parent and then the linear part as parent. The results show that charge radii and form factors are quite small for the Coulombic parent compared to the linear parent. Also, the analysis leads to lower as well as upper bounds on the four-momentum transfer Q^2, hinting at a workable range of Q^2 within this approach, which may be useful in future experimental analyses. Comparison of both the options shows that the linear parent is the better option. (author)

  3. The Electric Dipole Form Factor of the Nucleon in Chiral Perturbation Theory to Sub-leading Order

    CERN Document Server

    Mereghetti, E; Hockings, W H; Maekawa, C M; van Kolck, U

    2011-01-01

    The electric dipole form factor (EDFF) of the nucleon stemming from the QCD theta term and from the quark color-electric dipole moments is calculated in chiral perturbation theory to sub-leading order. This is the lowest order in which the isoscalar EDFF receives a calculable, non-analytic contribution from the pion cloud. In the case of the theta term, the expected lower bound on the deuteron electric dipole moment is |d_d| > 1.4 × 10^(-4) θ e fm. The momentum dependence of the isovector EDFF is proportional to a non-derivative time-reversal-violating pion-nucleon coupling, and the scale for momentum variation (appearing, in particular, in the radius of the form factor) is the pion mass.

  4. Teachers' Professional Ethics Cultivation and the Ideological and Political Theory Courses for Normal University Students

    Institute of Scientific and Technical Information of China (English)

    冉静; 王京强; 冯晋

    2015-01-01

    Among the diversified educational functions of the ideological and political theory courses, moral education is one of the most prominent. In their attributes, goals and process, these courses are closely related to the cultivation of normal university students' professional ethics as teachers, and they play a positive and effective role in promoting and facilitating that cultivation. This article explains the significance of cultivating the professional ethics of normal university students as future teachers and focuses on the role the ideological and political theory courses play in this cultivation.

  5. The Collinearity Free and Bias Reduced Regression Estimation Project: The Theory of Normalization Ridge Regression. Report No. 2.

    Science.gov (United States)

    Bulcock, J. W.; And Others

    Multicollinearity refers to the presence of highly intercorrelated independent variables in structural equation models, that is, models estimated by using techniques such as least squares regression and maximum likelihood. There is a problem of multicollinearity in both the natural and social sciences where theory formulation and estimation is in…

  6. Linking the Value Assessment of Oil and Gas Firms to Ambidexterity Theory Using a Mixture of Normal Distributions

    NARCIS (Netherlands)

    Casault, Sébastien; Groen, Arend J.; Linton, Jonathan D.; Linton, Jonathan

    2015-01-01

    Oil and gas exploration and production firms have return profiles that are not easily explained by current financial theory – the variation in their market returns is non-Gaussian. In this paper, the nature and underlying reason for these significant deviations from expected behavior are considered.

  7. Theory of charge transport in diffusive normal metal/conventional superconductor point contacts in the presence of magnetic impurity

    NARCIS (Netherlands)

    Yokoyama, T.; Tanaka, Y.; Golubov, Alexandre Avraamovitch; Inoue, J.; Asano, Y.

    2006-01-01

    Charge transport in the diffusive normal metal/insulator/s-wave superconductor junctions is studied in the presence of the magnetic impurity for various situations, where we have used the Usadel equation with Nazarov's generalized boundary condition. It is revealed that the magnetic impurity

  8. [Comparative research into the process of forming the theory of constitution in ancient western medicine and that of four trigrams constitution in Korean medicine and contents of two theories of constitution].

    Science.gov (United States)

    Park, Joo-Hong

    2009-06-01

    After conducting comparative research into the process of forming the Theory of Constitution in Ancient Western Medicine and that of Four Trigrams Constitution(Sasang Constitution) in Korean Medicine and contents of two Theories of Constitution in terms of medical history, both theories were found to be formed by an interaction between philosophy and medicine, followed by a combination of the two, on a philosophical basis. The Theory of Constitution in Ancient Western Medicine began with the Theory of Four Elements presented by Empedocles, followed by the Theory of Four Humors presented by Hippocrates and the Theory of Four Temperaments by Galenos, forming and developing the Theory of Constitution. After the Middle Ages, there was no significant advance in the Theory of Constitution by modern times ; however, it developed into the theory of constitution type of Kretschmer and others after the 19th century and into the scientific theory of constitution based on genetics presented by Garrod and others early in the 20th century. The Theory of Four Trigrams Constitution began with the Theory of Constitution in Huangdi Neijing, followed by developments and influences of existing medicine called beginning, restoration, and revival periods and DongeuisoosebowonSaSangChoBonGwon based on the original philosophy of Four Trigrams presented by Lee Je-ma, which is found in GyeokChiGo, DongMuYuGo and so on, ultimately forming and developing into the Theory of Four Trigrams Constitution in Dongeuisoosebowon. Recently, a lot of research is being conducted into making it objective in order to achieve reproducibility in diagnosis and so forth of Four Trigrams Constitution.

  9. Phenomenological theory of the normal and superconductive states of Cu-O and Bi-O metals

    International Nuclear Information System (INIS)

    Varma, C.M.

    1991-01-01

    The universal normal state anomalies in the CuO metals follow from a marginal Fermi liquid hypothesis: there exists a contribution to the polarizability over most of momentum space proportional to ω/T for ω/T much less than 1 and constant thereafter up to a cutoff ω_c. Using the same excitation spectrum, the properties of the superconductive state were calculated. The right order of T_c can be obtained, as well as the zero-temperature gap 2Δ(0)/T_c and the nuclear relaxation rate near T_c. The possible microscopic physics leading to the marginal Fermi liquid hypothesis is discussed.
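
    For orientation, the hypothesis quoted above is usually written schematically (standard marginal-Fermi-liquid form, with the overall constant and any weak momentum dependence suppressed) as

    \[
    \operatorname{Im}\chi(\mathbf{q},\omega)\;\propto\;
    \begin{cases}
    -\,\omega/T, & |\omega|\ll T,\\
    -\,\operatorname{sign}(\omega), & T\ll|\omega|<\omega_c .
    \end{cases}
    \]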

  10. The Swift/UVOT catalogue of NGC 4321 star-forming sources: a case against density wave theory

    Science.gov (United States)

    Ferreras, Ignacio; Cropper, Mark; Kawata, Daisuke; Page, Mat; Hoversten, Erik A.

    2012-08-01

    We study the star-forming regions in the spiral galaxy NGC 4321 (M100). We take advantage of the spatial resolution (2.5 arcsec full width at half-maximum) of the Swift/Ultraviolet/Optical Telescope camera and the availability of three ultraviolet (UV) passbands in the region 1600 spiral arms. The Hα luminosities of the sources have a strong decreasing radial trend, suggesting more massive star-forming regions in the central part of the galaxy. When segregated with respect to near-UV (NUV)-optical colour, blue sources have a significant excess of flux in the IR at 8 μm, revealing the contribution from polycyclic aromatic hydrocarbons, although the overall reddening of these sources stays below E(B - V) = 0.2 mag. The distribution of distances to the spiral arms is compared for subsamples selected according to Hα luminosity, NUV-optical colour or ages derived from a population synthesis model. An offset would be expected between these subsamples as a function of radius if the pattern speed of the spiral arm were constant - as predicted by classic density wave theory. No significant offsets are found, favouring instead a mechanism where the pattern speed has a radial dependence.

  11. Usability of a theory of visual attention (TVA) for parameter-based measurement of attention I: evidence from normal subjects

    DEFF Research Database (Denmark)

    Finke, Kathrin; Bublak, Peter; Krummenacher, Joseph

    2005-01-01

    The present study investigated the usability of whole and partial report of briefly displayed letter arrays as a diagnostic tool for the assessment of attentional functions. The tool is based on Bundesen's (1990, 1998, 2002; Bundesen et al., 2005) theory of visual attention (TVA), which assumes four separable attentional components: processing speed, working memory storage capacity, spatial distribution of attention, and top-down control. A number of studies (Duncan et al., 1999; Habekost & Bundesen, 2003; Peers et al., 2005) have already demonstrated the clinical relevance of these parameters. The present study was designed to examine whether (a) a shortened procedure bears sufficient accuracy and reliability, (b) whether the procedures reveal attentional constructs with clinical relevance, and (c) whether the mathematically independent parameters are also empirically independent...
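
    For readers unfamiliar with TVA, its core rate equation (quoted here in its standard published form, not taken from this particular study) states that an object x is encoded as a member of perceptual category i at the rate

    \[
    v(x,i)\;=\;\eta(x,i)\,\beta_i\,\frac{w_x}{\sum_{z\in S} w_z},
    \qquad
    w_x\;=\;\sum_{j\in R}\eta(x,j)\,\pi_j ,
    \]

    where η(x,i) is the strength of the sensory evidence that x belongs to i, β_i a decision bias, π_j a pertinence weight, S the set of objects in the display and R the set of categories; the whole- and partial-report parameters discussed above are estimated by fitting this machinery to report accuracy.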

  12. A unified bond theory, probabilistic meso-scale modeling, and experimental validation of deformed steel rebar in normal strength concrete

    Science.gov (United States)

    Wu, Chenglin

    Bond between deformed rebar and concrete is affected by rebar deformation pattern, concrete properties, concrete confinement, and rebar-concrete interfacial properties. Two distinct groups of bond models were traditionally developed based on the dominant effects of concrete splitting and near-interface shear-off failures. Their accuracy highly depended upon the test data sets selected in analysis and calibration. In this study, a unified bond model is proposed and developed based on an analogy to the indentation problem around the rib front of deformed rebar. This mechanics-based model can take into account the combined effect of concrete splitting and interface shear-off failures, resulting in average bond strengths for all practical scenarios. To understand the fracture process associated with bond failure, a probabilistic meso-scale model of concrete is proposed and its sensitivity to interface and confinement strengths is investigated. Both the mechanical and finite element models are validated with the available test data sets and are superior to existing models in the prediction of average bond strength across the practical range of rib spacing-to-height ratios of deformed rebar. The unified model can accurately predict the transition of failure modes from concrete splitting to rebar pullout and the effect of rebar surface characteristics as the rib spacing-to-height ratio increases. Based on the unified theory, a global bond model is proposed and developed by introducing bond-slip laws, and validated with testing of concrete beams with spliced reinforcement, achieving a load capacity prediction error of less than 26%. The optimal rebar parameters and concrete cover in structural designs can be derived from this study.

  13. FORMING PROSPECTIVE PRIMARY SCHOOL TEACHERS’ NATIONAL SELF-IDENTIFICATION IN THE COURSE “FOLK DANCE THEORY AND METHODOLOGY”

    Directory of Open Access Journals (Sweden)

    Volodymyr Kotov

    2017-04-01

    Full Text Available The article highlights an urgent problem of contemporary art pedagogy: introducing future professionals, during their training, to the choreographic traditions of different nations. Attention to this problem is prompted by a number of socio-political events in Ukraine, the mainstreaming of national and international education, the integration of Ukrainian education with the European educational space, and the intensive development of domestic students' intercultural communication with young people from different countries, which is the basis for updating national art education. Prospective choreographers, who are trained at pedagogical universities to lead children's dance groups, should be actively involved in creating their own productions of folk dances of various genres. This promotes the formation of choreographers' professional competence and pedagogical skills. A staging of the Ukrainian dance “Opryshky” is proposed: a joint creative work of the teacher and Bachelor's-level students of SHEE “Donbass State Pedagogical University”. The staging materials contain schematic drawings of the dance figures and are recommended for use in forming choreographers' professional skills in the course “Folk Dance Theory and Methodology”. The author notes that folklore material requires a cautious, respectful attitude; modern folk stage dances should therefore combine the traditional choreographic manner with new interpretations. The author believes that mastering the choreographic culture of different nations improves intercultural youth communication, involves future professionals in the traditions of different nations, and forms the professional skills of leaders of children's dance groups. The author concludes that dance always reflects the consciousness of a nation; future choreographers should be aware of the characteristic features of dances of different world nations so that on the basis of

  14. Forms of constitution making and theories of democracy

    Directory of Open Access Journals (Sweden)

    Andrew Arato

    1997-01-01

    Full Text Available On the basis of a typology of forms of constitution making, and with emphasis on the constitutionalization process, a set of basic principles for linking constitution making with the demands of democracy is put forward. The analysis deals both with the reconstruction of historical cases and with the critical examination of relevant theories. Special attention is given to the claims concerning the exemplary character of the American model. The ongoing processes of constitutional change in Eastern Europe are also discussed.

  15. Theory of mind and wisdom: The development of different forms of perspective-taking in late adulthood.

    Science.gov (United States)

    Rakoczy, Hannes; Wandt, Raphaela; Thomas, Stefanie; Nowak, Jana; Kunzmann, Ute

    2018-02-01

    How does perspective-taking develop over the lifespan? This question has been investigated in two separate research traditions, dealing with theory of mind (ToM) and wisdom, respectively. Operating in almost complete isolation from each other, and using rather different conceptual approaches, these two traditions have produced seemingly contradictory results: While perspective-taking has been consistently found to decline in old age in ToM research, studies on wisdom have mostly found that perspective-taking remains constant or sometimes even increases in later adulthood. This study sought to integrate these two lines of research and clarify the seemingly contradictory patterns of findings by systematically testing for both forms of perspective-taking and their potential cognitive foundations. The results revealed (1) the dissociation in developmental patterns between ToM perspective-taking (declining with age) and wisdom-related perspective-taking (no decline with age) also held - documented here for the first time - in one and the same sample of younger versus older adults; (2) this dissociation was of limited generality: It did not (or only partly) hold once the material of the two types of tasks was more closely matched; and (3) the divergent developmental patterns of ToM perspective-taking versus wisdom-related perspective-taking could be accounted for to some degree by the fact that only ToM perspective-taking was related to developmental changes in fluid intelligence. © 2017 The British Psychological Society.

  16. A critical analysis of the implementation of service user involvement in primary care research and health service development using normalization process theory.

    Science.gov (United States)

    Tierney, Edel; McEvoy, Rachel; O'Reilly-de Brún, Mary; de Brún, Tomas; Okonkwo, Ekaterina; Rooney, Michelle; Dowrick, Chris; Rogers, Anne; MacFarlane, Anne

    2016-06-01

    There have been recent important advances in conceptualizing and operationalizing involvement in health research and health-care service development. However, problems persist in the field that impact on the scope for meaningful involvement to become a routine - normalized - way of working in primary care. In this review, we focus on current practice to critically interrogate factors known to be relevant for normalization - definition, enrolment, enactment and appraisal. Ours was a multidisciplinary, interagency team, with community representation. We searched EBSCO host for papers from 2007 to 2011 and engaged in an iterative, reflexive approach to sampling, appraising and analysing the literature following the principles of a critical interpretive synthesis approach and using Normalization Process Theory. Twenty-six papers were chosen from 289 papers, as a purposeful sample of work that is reported as service user involvement in the field. Few papers provided a clear working definition of service user involvement. The dominant identified rationale for enrolling service users in primary care projects was linked with policy imperatives for co-governance and emancipatory ideals. The majority of methodologies employed were standard health services research methods that do not qualify as research with service users. This indicates a lack of congruence between the stated aims and methods. Most studies only reported positive outcomes, raising questions about the balance or completeness of the published appraisals. To improve normalization of meaningful involvement in primary care, it is necessary to encourage explicit reporting of definitions, methodological innovation to enhance co-governance and dissemination of research processes and findings. © 2014 The Authors Health Expectations Published by John Wiley & Sons Ltd.

  17. Understanding the challenges to implementing case management for people with dementia in primary care in England: a qualitative study using Normalization Process Theory.

    Science.gov (United States)

    Bamford, Claire; Poole, Marie; Brittain, Katie; Chew-Graham, Carolyn; Fox, Chris; Iliffe, Steve; Manthorpe, Jill; Robinson, Louise

    2014-11-08

    Case management has been suggested as a way of improving the quality and cost-effectiveness of support for people with dementia. In this study we adapted and implemented a successful United States' model of case management in primary care in England. The results are reported elsewhere, but a key finding was that little case management took place. This paper reports the findings of the process evaluation which used Normalization Process Theory to understand the barriers to implementation. Ethnographic methods were used to explore the views and experiences of case management. Interviews with 49 stakeholders (patients, carers, case managers, health and social care professionals) were supplemented with observation of case managers during meetings and initial assessments with patients. Transcripts and field notes were analysed initially using the constant comparative approach and emerging themes were then mapped onto the framework of Normalization Process Theory. The primary focus during implementation was on the case managers as isolated individuals, with little attention being paid to the social or organizational context within which they worked. Barriers relating to each of the four main constructs of Normalization Process Theory were identified, with a lack of clarity over the scope and boundaries of the intervention (coherence); variable investment in the intervention (cognitive participation); a lack of resources, skills and training to deliver case management (collective action); and limited reflection and feedback on the case manager role (reflexive monitoring). Despite the intuitive appeal of case management to all stakeholders, there were multiple barriers to implementation in primary care in England including: difficulties in embedding case managers within existing well-established community networks; the challenges of protecting time for case management; and case managers' inability to identify, and act on, emerging patient and carer needs (an essential, but

  18. From Holonomy of the Ising Model Form Factors to n-Fold Integrals and the Theory of Elliptic Curves

    Directory of Open Access Journals (Sweden)

    Salah Boukraa

    2007-10-01

    Full Text Available We recall the form factors $f^{(j)}_{N,N}$ corresponding to the $\lambda$-extension $C(N,N;\lambda)$ of the two-point diagonal correlation function of the Ising model on the square lattice and their associated linear differential equations which exhibit both a "Russian-doll" nesting, and a decomposition of the linear differential operators as a direct sum of operators (equivalent to symmetric powers of the differential operator of the complete elliptic integral $E$). The scaling limit of these differential operators breaks the direct sum structure but not the "Russian doll" structure, the "scaled" linear differential operators being no longer Fuchsian. We then introduce some multiple integrals of the Ising class expected to have the same singularities as the singularities of the $n$-particle contributions $\chi^{(n)}$ to the susceptibility of the square lattice Ising model. We find the Fuchsian linear differential equations satisfied by these multiple integrals for $n = 1, 2, 3, 4$ and, only modulo a prime, for $n = 5$ and 6, thus providing a large set of (possibly new) singularities of the $\chi^{(n)}$. We get the location of these singularities by solving the Landau conditions. We discuss the mathematical, as well as physical, interpretation of these new singularities. Among the singularities found, we underline the fact that the quadratic polynomial condition $1 + 3w + 4w^2 = 0$, that occurs in the linear differential equation of $\chi^{(3)}$, actually corresponds to the occurrence of complex multiplication for elliptic curves. The interpretation of complex multiplication for elliptic curves as complex fixed points of generators of the exact renormalization group is sketched. The other singularities occurring in our multiple integrals are not related to complex multiplication situations, suggesting a geometric interpretation in terms of more general (motivic) mathematical structures beyond the theory of elliptic curves. The scaling limit of the (lattice

  19. Simulated glass-forming polymer melts: dynamic scattering functions, chain length effects, and mode-coupling theory analysis.

    Science.gov (United States)

    Frey, S; Weysser, F; Meyer, H; Farago, J; Fuchs, M; Baschnagel, J

    2015-02-01

    We present molecular-dynamics simulations for a fully flexible model of polymer melts with different chain length N ranging from short oligomers (N = 4) to values near the entanglement length (N = 64). For these systems we explore the structural relaxation of the supercooled melt near the critical temperature T_c of mode-coupling theory (MCT). Coherent and incoherent scattering functions are analyzed in terms of the idealized MCT. For temperatures T > T_c we provide evidence for the space-time factorization property of the β relaxation and for the time-temperature superposition principle (TTSP) of the α relaxation, and we also discuss deviations from these predictions for T ≈ T_c. For T larger than the smallest temperature where the TTSP holds we perform a quantitative analysis of the dynamics with the asymptotic MCT predictions for the late β regime. Within MCT a key quantity, in addition to T_c, is the exponent parameter λ. For the fully flexible polymer models studied we find that λ is independent of N and has a value (λ = 0.735) typical of simple glass-forming liquids. On the other hand, the critical temperature increases with chain length toward an asymptotic value T_c(∞). This increase can be described by T_c(∞) − T_c(N) ∼ 1/N and may be interpreted in terms of the N dependence of the monomer density ρ, if we assume that the MCT glass transition is ruled by a soft-sphere-like constant coupling parameter Γ_c = ρ_c T_c^(-1/4), where ρ_c is the monomer density at T_c. In addition, we also estimate T_c from a Hansen-Verlet-like criterion and MCT calculations based on structural input from the simulation. For our polymer model both the Hansen-Verlet criterion and the MCT calculations suggest T_c to decrease with increasing chain length, in contrast to the direct analysis of the simulation data.
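
    The 1/N behaviour reported above is easy to check with a small fit; the sketch below (Python with SciPy, using invented illustrative data rather than the simulation results of the paper) fits T_c(N) = T_c(∞) − A/N.

        import numpy as np
        from scipy.optimize import curve_fit

        def tc_model(N, tc_inf, A):
            """Chain-length dependence of the MCT critical temperature:
            T_c(N) = T_c(inf) - A/N."""
            return tc_inf - A / N

        # Invented (N, T_c) pairs, for illustration only.
        N_vals = np.array([4.0, 8.0, 16.0, 32.0, 64.0])
        Tc_vals = np.array([0.385, 0.398, 0.405, 0.408, 0.410])

        params, _ = curve_fit(tc_model, N_vals, Tc_vals, p0=(0.41, 0.1))
        tc_inf, A = params
        print(f"T_c(inf) = {tc_inf:.3f},  A = {A:.3f}")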

  20. Effect of ions on sulfuric acid-water binary particle formation: 2. Experimental data and comparison with QC-normalized classical nucleation theory

    CERN Document Server

    Duplissy, J.; Franchin, A.; Tsagkogeorgas, G.; Kangasluoma, J.; Wimmer, D.; Vuollekoski, H.; Schobesberger, S.; Lehtipalo, K.; Flagan, R. C.; Brus, D.; Donahue, N. M.; Vehkamäki, H.; Almeida, J.; Amorim, A.; Barmet, P.; Bianchi, F.; Breitenlechner, M.; Dunne, E. M.; Guida, R.; Henschel, H.; Junninen, H.; Kirkby, J.; Kürten, A.; Kupc, A.; Määttänen, A.; Makhmutov, V.; Mathot, S.; Nieminen, T.; Onnela, A.; Praplan, A. P.; Riccobono, F.; Rondo, L.; Steiner, G.; Tome, A.; Walther, H.; Baltensperger, U.; Carslaw, K. S.; Dommen, J.; Hansel, A.; Petäjä, T.; Sipilä, M.; Stratmann, F.; Vrtala, A.; Wagner, P. E.; Worsnop, D. R.; Curtius, J.; Kulmala, M.

    2015-09-04

    We report comprehensive, demonstrably contaminant‐free measurements of binary particle formation rates by sulfuric acid and water for neutral and ion‐induced pathways conducted in the European Organization for Nuclear Research Cosmics Leaving Outdoor Droplets chamber. The recently developed Atmospheric Pressure interface‐time of flight‐mass spectrometer was used to detect contaminants in charged clusters and to identify runs free of any contaminants. Four parameters were varied to cover ambient conditions: sulfuric acid concentration (10^5 to 10^9 mol cm^-3), relative humidity (11% to 58%), temperature (207 K to 299 K), and total ion concentration (0 to 6800 ions cm^-3). Formation rates were directly measured with novel instruments at sizes close to the critical cluster size (mobility size of 1.3 nm to 3.2 nm). We compare our results with predictions from Classical Nucleation Theory normalized by Quantum Chemical calculation (QC‐normalized CNT), which is described in a companion pape...
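
    For context, the classical nucleation theory underlying the QC-normalized approach writes the formation rate schematically as (textbook single-component form, quoted only for orientation; the binary sulfuric acid-water treatment and the quantum-chemical normalization used in the paper modify both the prefactor and the barrier)

    \[
    J \;=\; K\,\exp\!\left(-\frac{\Delta G^{*}}{k_{B}T}\right),
    \qquad
    \Delta G^{*} \;=\; \frac{16\pi\,\sigma^{3}v^{2}}{3\,\bigl(k_{B}T\ln S\bigr)^{2}},
    \]

    with σ the surface tension, v the molecular volume and S the saturation ratio of the condensing vapour.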

  1. 'It Opened My Eyes'-examining the impact of a multifaceted chlamydia testing intervention on general practitioners using Normalization Process Theory.

    Science.gov (United States)

    Yeung, Anna; Hocking, Jane; Guy, Rebecca; Fairley, Christopher K; Smith, Kirsty; Vaisey, Alaina; Donovan, Basil; Imrie, John; Gunn, Jane; Temple-Smith, Meredith

    2018-03-28

    Chlamydia is the most common notifiable sexually transmissible infection in Australia. Left untreated, it can develop into pelvic inflammatory disease and infertility. The majority of notifications come from general practice and it is ideally situated to test young Australians. The Australian Chlamydia Control Effectiveness Pilot (ACCEPt) was a multifaceted intervention that aimed to reduce chlamydia prevalence by increasing testing in 16- to 29-year-olds attending general practice. GPs were interviewed to describe the effectiveness of the ACCEPt intervention in integrating chlamydia testing into routine practice using Normalization Process Theory (NPT). GPs were purposively selected based on age, gender, geographic location and size of practice at baseline and midpoint. Interview data were analysed regarding the intervention components and results were interpreted using NPT. A total of 44 GPs at baseline and 24 at midpoint were interviewed. Most GPs reported offering a test based on age at midpoint versus offering a test based on symptoms or patient request at baseline. Quarterly feedback was the most significant ACCEPt component for facilitating a chlamydia test. The ACCEPt intervention has been able to moderately normalize chlamydia testing among GPs, although the components had varying levels of effectiveness. NPT can demonstrate the effective implementation of an intervention in general practice and has been valuable in understanding which components are essential and which components can be improved upon.

  2. Balancing the Roles of Explicit Instruction of Text Form Language and Schema Theory in Student Non-Fiction Writing: Problems and Possibilities

    Science.gov (United States)

    Broer van Arragon, Kathleen

    2003-01-01

    The focus of this study will be on the intersection of the following domains: Second Language Acquisition research on cohesion and coherence, discourse acquisition of young children, the effect of text form-focused instruction on student non-fiction writing and the impact of schema theory on student decision-making during the writing process.

  3. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Reading Assessments: Grade 1. Technical Report #1216

    Science.gov (United States)

    Anderson, Daniel; Park, Jasmine, Bitnara; Lai, Cheng-Fei; Alonzo, Julie; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest. Due…

  4. Non normal and non quadratic anisotropic plasticity coupled with ductile damage in sheet metal forming: Application to the hydro bulging test

    International Nuclear Information System (INIS)

    Badreddine, Houssem; Saanouni, Khemaies; Dogui, Abdelwaheb

    2007-01-01

    In this work an improved material model is proposed that shows good agreement with experimental data for both hardening curves and plastic strain ratios in uniaxial and equibiaxial proportional loading paths for steel sheet metal up to the final fracture. This model is based on a non associative and non normal flow rule using two different orthotropic equivalent stresses in both the yield criterion and plastic potential functions. For the plastic potential the classical Hill 1948 quadratic equivalent stress is considered, while for the yield criterion the Karafillis and Boyce 1993 non quadratic equivalent stress is used, taking into account the non linear mixed (kinematic and isotropic) hardening. Applications are made to hydro bulging tests using both circular and elliptical dies. The results obtained with different particular cases of the model, such as the normal quadratic and the non normal non quadratic cases, are compared and discussed with respect to the experimental results.
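
    As a minimal sketch of the Hill 1948 quadratic equivalent stress used here for the plastic potential (Python; the anisotropy coefficients below are illustrative placeholders rather than the calibrated values of the paper, and the Karafillis and Boyce 1993 yield stress is not reproduced):

        import math

        def hill48_equivalent_stress(sig, F, G, H, L, M, N):
            """Hill (1948) quadratic equivalent stress.
            sig  -- stress components (s11, s22, s33, s23, s31, s12)
            F..N -- anisotropy coefficients (F=G=H=0.5, L=M=N=1.5 recovers von Mises).
            """
            s11, s22, s33, s23, s31, s12 = sig
            return math.sqrt(
                F * (s22 - s33) ** 2
                + G * (s33 - s11) ** 2
                + H * (s11 - s22) ** 2
                + 2.0 * L * s23 ** 2
                + 2.0 * M * s31 ** 2
                + 2.0 * N * s12 ** 2
            )

        # Sanity check with von Mises coefficients: uniaxial tension returns s11 itself.
        print(hill48_equivalent_stress((200.0, 0.0, 0.0, 0.0, 0.0, 0.0),
                                       F=0.5, G=0.5, H=0.5, L=1.5, M=1.5, N=1.5))

    With F = G = H = 0.5 and L = M = N = 1.5 the expression reduces to the von Mises stress, which the final print statement uses as a quick check (a 200 MPa uniaxial stress returns 200).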

  5. Modular forms

    NARCIS (Netherlands)

    Edixhoven, B.; van der Geer, G.; Moonen, B.; Edixhoven, B.; van der Geer, G.; Moonen, B.

    2008-01-01

    Modular forms are functions with an enormous amount of symmetry that play a central role in number theory, connecting it with analysis and geometry. They have played a prominent role in mathematics since the 19th century and their study continues to flourish today. Modular forms formed the

  6. Development of the Parent Form of the Preschool Children's Communication Skills Scale and Comparison of the Communication Skills of Children with Normal Development and with Autism Spectrum Disorder

    Science.gov (United States)

    Aydin, Aydan

    2016-01-01

    This study aims at developing an assessment scale for identifying preschool children's communication skills, at distinguishing children with communication deficiencies and at comparing the communication skills of children with normal development (ND) and those with autism spectrum disorder (ASD). Participants were 427 children of up to 6 years of…

  7. Low-energy analysis of the nucleon electromagnetic form factors

    Energy Technology Data Exchange (ETDEWEB)

    Kubis, Bastian. E-mail: b.kubis@fz-juelich.de; Meissner, Ulf-G. E-mail: Ulf-G.Meissner@fz-juelich.de

    2001-01-01

    We analyze the electromagnetic form factors of the nucleon to fourth order in relativistic baryon chiral perturbation theory. We employ the recently proposed infrared regularization scheme and show that the convergence of the chiral expansion is improved as compared to the heavy-fermion approach. We also discuss the inclusion of vector mesons and obtain an accurate description of all four nucleon form factors for momentum transfer squared up to Q^2 ≈ 0.4 GeV^2.
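
    For reference, the radius extracted in such analyses is defined through the conventional low-Q^2 expansion of a form factor (a standard definition, not a result specific to this paper):

    \[
    F(Q^{2}) \;=\; F(0)\left(1-\frac{\langle r^{2}\rangle}{6}\,Q^{2}+\mathcal{O}(Q^{4})\right),
    \qquad
    \langle r^{2}\rangle \;=\; -\,\frac{6}{F(0)}\,\left.\frac{\mathrm{d}F}{\mathrm{d}Q^{2}}\right|_{Q^{2}=0}.
    \]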

  8. Evaluation of the Psychometric Properties of the Asian Adolescent Depression Scale and Construction of a Short Form: An Item Response Theory Analysis.

    Science.gov (United States)

    Lo, Barbara Chuen Yee; Zhao, Yue; Kwok, Alice Wai Yee; Chan, Wai; Chan, Calais Kin Yuen

    2017-07-01

    The present study applied item response theory to examine the psychometric properties of the Asian Adolescent Depression Scale and to construct a short form among 1,084 teenagers recruited from secondary schools in Hong Kong. Findings suggested that some items of the full form reflected higher levels of severity and were more discriminating than others, and the Asian Adolescent Depression Scale was useful in measuring a broad range of depressive severity in community youths. Differential item functioning emerged in several items where females reported higher depressive severity than males. In the short form construction, preliminary validation suggested that, relative to the 20-item full form, our derived short form offered significantly greater diagnostic performance and stronger discriminatory ability in differentiating depressed and nondepressed groups, and simultaneously maintained adequate measurement precision with a reduced response burden in assessing depression in the Asian adolescents. Cultural variance in depressive symptomatology and clinical implications are discussed.
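
    As a hedged illustration of the item-response-theory machinery that underlies this kind of short-form construction (the exact model used in the paper is not specified here; a two-parameter logistic item is assumed for simplicity), the Python sketch below computes an item's response probability and its Fisher information, the quantity typically used to decide which items to retain.

        import math

        def p_2pl(theta, a, b):
            """Two-parameter logistic item response function: probability of
            endorsing the item at latent severity theta, with discrimination a
            and difficulty/severity b."""
            return 1.0 / (1.0 + math.exp(-a * (theta - b)))

        def item_information(theta, a, b):
            """Fisher information of a 2PL item: I(theta) = a^2 * P * (1 - P)."""
            p = p_2pl(theta, a, b)
            return a * a * p * (1.0 - p)

        # Two hypothetical items: a discriminating severe item vs. a weaker mild item.
        for label, a, b in (("severe, discriminating", 2.0, 1.5), ("mild, weak", 0.8, -0.5)):
            info = [item_information(t, a, b) for t in (-1.0, 0.0, 1.0, 2.0)]
            print(label, ["%.2f" % i for i in info])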

  9. Molecular conformational analysis, vibrational spectra and normal coordinate analysis of trans-1,2-bis(3,5-dimethoxy phenyl)-ethene based on density functional theory calculations.

    Science.gov (United States)

    Joseph, Lynnette; Sajan, D; Chaitanya, K; Isac, Jayakumary

    2014-03-25

    The conformational behavior and structural stability of trans-1,2-bis(3,5-dimethoxy phenyl)-ethene (TDBE) were investigated by using density functional theory (DFT) method with the B3LYP/6-311++G(d,p) basis set combination. The vibrational wavenumbers of TDBE were computed at DFT level and complete vibrational assignments were made on the basis of normal coordinate analysis calculations (NCA). The DFT force field transformed to natural internal coordinates was corrected by a well-established set of scale factors that were found to be transferable to the title compound. The infrared and Raman spectra were also predicted from the calculated intensities. The observed Fourier transform infrared (FTIR) and Fourier transform (FT) Raman vibrational wavenumbers were analyzed and compared with the theoretically predicted vibrational spectra. Comparison of the simulated spectra with the experimental spectra provides important information about the ability of the computational method to describe the vibrational modes. Information about the size, shape, charge density distribution and site of chemical reactivity of the molecules has been obtained by mapping electron density isosurface with electrostatic potential surfaces (ESP). Copyright © 2013 Elsevier B.V. All rights reserved.

  10. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Reading Assessments: Grade 5. Technical Report #1220

    Science.gov (United States)

    Lai, Cheng-Fei; Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…

  11. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Reading Assessments: Grade 2. Technical Report #1217

    Science.gov (United States)

    Anderson, Daniel; Lai, Cheng-Fei; Park, Bitnara Jasmine; Alonzo, Julie; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest. Due to…

  12. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Passage Reading Fluency Assessments: Grade 4. Technical Report #1219

    Science.gov (United States)

    Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Lai, Cheng-Fei; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…

  13. Atomic motions in the αβ-region of glass-forming polymers: molecular versus mode coupling theory approach

    International Nuclear Information System (INIS)

    Colmenero, Juan; Narros, Arturo; Alvarez, Fernando; Arbe, Arantxa; Moreno, Angel J

    2007-01-01

    We present fully atomistic molecular dynamics simulation results on a main-chain polymer, 1,4-polybutadiene, in the merging region of the α- and β-relaxations. A real-space analysis reveals the occurrence of localized motions ('β-like') in addition to the diffusive structural relaxation. A molecular approach provides a direct connection between the local conformational changes reflected in the atomic motions and the secondary relaxations in this polymer. Such local processes occur just in the time window where the β-process of the mode coupling theory is expected. We show that the application of this theory is still possible and yields an unusually large value of the exponent parameter. This result might originate from the competition between two mechanisms for dynamic arrest: intermolecular packing and intramolecular barriers for local conformational changes ('β-like')

  14. Effect of psychological intervention in the form of relaxation and guided imagery on cellular immune function in normal healthy subjects. An overview

    DEFF Research Database (Denmark)

    Zachariae, R; Kristensen, J S; Hokland, P

    1991-01-01

    The present study measured the effects of relaxation and guided imagery on cellular immune function. During a period of 10 days 10 healthy subjects were given one 1-hour relaxation procedure and one combined relaxation and guided imagery procedure, instructing the subjects to imagine their immune...... on the immune defense and could form the basis of further studies on psychological intervention and immunological status.

  15. Matrix forming characteristics of inner and outer human meniscus cells on 3D collagen scaffolds under normal and low oxygen tensions.

    Science.gov (United States)

    Croutze, Roger; Jomha, Nadr; Uludag, Hasan; Adesida, Adetola

    2013-12-13

    Limited intrinsic healing potential of the meniscus and a strong correlation between meniscal injury and osteoarthritis have prompted investigation of surgical repair options, including the implantation of functional bioengineered constructs. Cell-based constructs appear promising, however the generation of meniscal constructs is complicated by the presence of diverse cell populations within this heterogeneous tissue and gaps in the information concerning their response to manipulation of oxygen tension during cell culture. Four human lateral menisci were harvested from patients undergoing total knee replacement. Inner and outer meniscal fibrochondrocytes (MFCs) were expanded to passage 3 in growth medium supplemented with basic fibroblast growth factor (FGF-2), then embedded in porous collagen type I scaffolds and chondrogenically stimulated with transforming growth factor β3 (TGF-β3) under 21% (normal or normoxic) or 3% (hypoxic) oxygen tension for 21 days. Following scaffold culture, constructs were analyzed biochemically for glycosaminoglycan production, histologically for deposition of extracellular matrix (ECM), as well as at the molecular level for expression of characteristic mRNA transcripts. Constructs cultured under normal oxygen tension expressed higher levels of collagen type II (p = 0.05) and aggrecan than constructs cultured under hypoxic oxygen tension. There was no significant difference in expression of these genes between scaffolds seeded with MFCs isolated from inner or outer regions of the tissue following 21 days of chondrogenic stimulation (p > 0.05). Cells isolated from inner and outer regions of the human meniscus demonstrated equivalent differentiation potential toward chondrogenic phenotype and ECM production. Oxygen tension played a key role in modulating the redifferentiation of meniscal fibrochondrocytes on a 3D collagen scaffold in vitro.

  16. The Direct and Indirect Impact of Pharmaceutical Industry in Economic Expansion and Job Creation: Evidence from Bootstrapping and Normal Theory Methods

    Directory of Open Access Journals (Sweden)

    Rizwan Raheem Ahmed

    2018-05-01

    Full Text Available The objective of this research article is to examine the role of Pakistan’s pharmaceutical industry in creating job opportunities, with the ultimate aim of eradicating poverty and expanding economic activity. This research is quantitative in nature, and the data were gathered directly through closed-ended questionnaires from 300 respondents. Besides the predictors, four mediating variables that contribute indirectly to job creation opportunities have also been taken into consideration. Bootstrapping and normal theory methods have been employed in order to examine the impact of the predictors and mediating variables. The results of this research confirmed that the pharmaceutical industry plays a vital role in job creation in Pakistan. It is further concluded that the pharmaceutical industry has a direct and significant impact on job creation by providing indigenous and direct job opportunities in sales, marketing, and other supporting departments for both skilled and unskilled workers. The pharmaceutical industry also provides indirect job opportunities through other industries that are closely linked with it, such as pharmaceutical distributors, dealers, retailers, wholesalers, the hotel industry, and the event management industry. It is also determined that the pharmaceutical industry acts as a knowledge- and skills-imparting institution: skills-based training and organizational learning are major mediating variables that transform unskilled people into human assets, which further improves future job prospects. The pharmaceutical industry is one of the biggest industries in Pakistan, providing plentiful new job opportunities with consistent growth, and mediating variables such as motivation and interpersonal influence also play an active role in new job creation.
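
    The "bootstrapping and normal theory methods" referred to above are standard ways of testing an indirect (mediated) effect a*b. The sketch below, on entirely synthetic data, estimates the indirect effect by regression and builds a percentile bootstrap confidence interval; the normal-theory alternative would instead use the Sobel standard error sqrt(b^2*se_a^2 + a^2*se_b^2), which is omitted here for brevity. All variable names and effect sizes are invented.

        # Synthetic illustration of a percentile-bootstrap test of an indirect effect
        # a*b (X -> M -> Y). Nothing here comes from the study's data.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 300
        x = rng.normal(size=n)
        m = 0.5 * x + rng.normal(size=n)                   # mediator model: M = a*X + e
        y = 0.4 * m + 0.2 * x + rng.normal(size=n)         # outcome model: Y = b*M + c'*X + e

        def indirect(idx):
            xs, ms, ys = x[idx], m[idx], y[idx]
            a = np.polyfit(xs, ms, 1)[0]                   # slope of M on X
            b = np.linalg.lstsq(np.column_stack([ms, xs, np.ones(len(idx))]),
                                ys, rcond=None)[0][0]      # slope of Y on M, controlling for X
            return a * b

        point = indirect(np.arange(n))
        boot = np.array([indirect(rng.integers(0, n, n)) for _ in range(2000)])
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"indirect effect = {point:.3f}, 95% bootstrap CI = [{lo:.3f}, {hi:.3f}]")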

  17. Reconstructing Normality

    DEFF Research Database (Denmark)

    Gildberg, Frederik Alkier; Bradley, Stephen K.; Fristed, Peter Billeskov

    2012-01-01

    Forensic psychiatry is an area of priority for the Danish Government. As the field expands, this calls for increased knowledge about mental health nursing practice, as this is part of the forensic psychiatry treatment offered. However, only sparse research exists in this area. The aim of this study...... was to investigate the characteristics of forensic mental health nursing staff interaction with forensic mental health inpatients and to explore how staff give meaning to these interactions. The project included 32 forensic mental health staff members, with over 307 hours of participant observations, 48 informal....... The intention is to establish a trusting relationship to form behaviour and perceptual-corrective care, which is characterized by staff's endeavours to change, halt, or support the patient's behaviour or perception in relation to staff's perception of normality. The intention is to support and teach the patient...

  18. Group contribution methodology based on the statistical associating fluid theory for heteronuclear molecules formed from Mie segments.

    Science.gov (United States)

    Papaioannou, Vasileios; Lafitte, Thomas; Avendaño, Carlos; Adjiman, Claire S; Jackson, George; Müller, Erich A; Galindo, Amparo

    2014-02-07

    A generalization of the recent version of the statistical associating fluid theory for variable range Mie potentials [Lafitte et al., J. Chem. Phys. 139, 154504 (2013)] is formulated within the framework of a group contribution approach (SAFT-γ Mie). Molecules are represented as comprising distinct functional (chemical) groups based on a fused heteronuclear molecular model, where the interactions between segments are described with the Mie (generalized Lennard-Jonesium) potential of variable attractive and repulsive range. A key feature of the new theory is the accurate description of the monomeric group-group interactions by application of a high-temperature perturbation expansion up to third order. The capabilities of the SAFT-γ Mie approach are exemplified by studying the thermodynamic properties of two chemical families, the n-alkanes and the n-alkyl esters, by developing parameters for the methyl, methylene, and carboxylate functional groups (CH3, CH2, and COO). The approach is shown to describe accurately the fluid-phase behavior of the compounds considered with absolute average deviations of 1.20% and 0.42% for the vapor pressure and saturated liquid density, respectively, which represents a clear improvement over other existing SAFT-based group contribution approaches. The use of Mie potentials to describe the group-group interaction is shown to allow accurate simultaneous descriptions of the fluid-phase behavior and second-order thermodynamic derivative properties of the pure fluids based on a single set of group parameters. Furthermore, the application of the perturbation expansion to third order for the description of the reference monomeric fluid improves the predictions of the theory for the fluid-phase behavior of pure components in the near-critical region. The predictive capabilities of the approach stem from its formulation within a group-contribution formalism: predictions of the fluid-phase behavior and thermodynamic derivative properties of
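
    The segment-segment interaction underlying SAFT-γ Mie is the Mie (λr, λa) potential with its standard prefactor C = [λr/(λr - λa)]*(λr/λa)^(λa/(λr - λa)). The sketch below simply evaluates that potential; the parameter values are illustrative only and are not the fitted CH3/CH2/COO group parameters reported in the paper.

        # Mie (generalized Lennard-Jones) pair potential with the standard prefactor.
        # Parameter values are illustrative, not fitted SAFT-gamma Mie group values.
        import numpy as np

        def mie(r, epsilon, sigma, lam_r=12.0, lam_a=6.0):
            c = (lam_r / (lam_r - lam_a)) * (lam_r / lam_a) ** (lam_a / (lam_r - lam_a))
            return c * epsilon * ((sigma / r) ** lam_r - (sigma / r) ** lam_a)

        r = np.linspace(3.5, 10.0, 6)                       # separations in angstrom
        print(mie(r, epsilon=250.0, sigma=3.7, lam_r=15.0, lam_a=6.0))   # u(r)/k_B in K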

  19. From Theory to Experiment: Hadron Electromagnetic Form Factors in Space-like and Time-like Regions

    International Nuclear Information System (INIS)

    Tomasi-Gustafsson, E.; Gakh, G.I.; Rekalo, A.P.

    2007-01-01

    Hadron electromagnetic form factors contain information on the intrinsic structure of the hadrons. The pioneering work developed at the Kharkov Physical-Technical Institute in the 1960s on the relation between the polarized cross section and the proton form factors triggered a number of experiments. Such experiments could be performed only recently due to the progress in accelerator and polarimetry techniques. The principle of these measurements is recalled, and the surprising and very precise results obtained for the proton are presented. The current status of nucleon electromagnetic form factors is reviewed, with special attention to the basic work done at the Kharkov Physical-Technical Institute. This paper is devoted to the memory of Prof. M.P. Rekalo.

  20. Discretely Conservative Finite-Difference Formulations for Nonlinear Conservation Laws in Split Form: Theory and Boundary Conditions

    Science.gov (United States)

    Fisher, Travis C.; Carpenter, Mark H.; Nordstroem, Jan; Yamaleev, Nail K.; Swanson, R. Charles

    2011-01-01

    Simulations of nonlinear conservation laws that admit discontinuous solutions are typically restricted to discretizations of equations that are explicitly written in divergence form. This restriction is, however, unnecessary. Herein, linear combinations of divergence and product rule forms that have been discretized using diagonal-norm skew-symmetric summation-by-parts (SBP) operators, are shown to satisfy the sufficient conditions of the Lax-Wendroff theorem and thus are appropriate for simulations of discontinuous physical phenomena. Furthermore, special treatments are not required at the points that are near physical boundaries (i.e., discrete conservation is achieved throughout the entire computational domain, including the boundaries). Examples are presented of a fourth-order, SBP finite-difference operator with second-order boundary closures. Sixth- and eighth-order constructions are derived, and included in E. Narrow-stencil difference operators for linear viscous terms are also derived; these guarantee the conservative form of the combined operator.
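
    The diagonal-norm SBP structure referred to above means the difference operator can be written D = H^-1 Q with H diagonal and positive and Q + Q^T = B = diag(-1, 0, ..., 0, 1), the discrete analogue of integration by parts. The sketch below verifies this for the classical second-order diagonal-norm operator; it is only an illustration of the SBP property, not the fourth- to eighth-order operators constructed in the report.

        # Classical second-order diagonal-norm SBP first-derivative operator on a
        # uniform grid, and a check of the SBP property Q + Q^T = B with Q = H D.
        import numpy as np

        n, h = 11, 0.1
        H = h * np.diag([0.5] + [1.0] * (n - 2) + [0.5])   # diagonal norm (quadrature weights)
        D = np.zeros((n, n))
        D[0, :2] = [-1.0, 1.0]                             # one-sided boundary closures
        D[-1, -2:] = [-1.0, 1.0]
        for i in range(1, n - 1):
            D[i, i - 1], D[i, i + 1] = -0.5, 0.5           # central interior stencil
        D /= h

        Q = H @ D
        B = np.zeros((n, n)); B[0, 0], B[-1, -1] = -1.0, 1.0
        assert np.allclose(Q + Q.T, B)                     # discrete integration by parts
        print("SBP property holds:", np.allclose(Q + Q.T, B))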

  1. Transformative Learning and the Form That Transforms: Towards a Psychosocial Theory of Recognition Using Auto/ Biographical Narrative Research

    Science.gov (United States)

    West, Linden

    2014-01-01

    In this article, I interrogate the changing forms that may be fundamental to transformative learning and how these are best chronicled and understood. Drawing on auto/biographical narrative research, I challenge the continuing primacy of a kind of overly disembodied, decontextualized cognition as the basis of transformation. Notions of epistemic…

  2. Hartree-Fock theory for the equilibrium shape of light nuclei; Theorie Hartree-Fock de la forme d'equilibre des noyaux legers

    Energy Technology Data Exchange (ETDEWEB)

    Ripka, G [Commissariat a l' Energie Atomique, 91 - Saclay (France). Centre d' Etudes Nucleaires

    1968-09-01

    Most of the content of this thesis is published in English in Advances In Nuclear Physics, Vol. 1 (Editors: Baranger and Vogt - Plenum Press). The Hartree-Fock equations are derived. The expansions of the orbits and the possible symmetries of the Hartree-Fock field are discussed. Wavefunctions of even-even N = Z nuclei are given for 12 ≤ A ≤ 40. The role of the monopole, quadrupole and exchange components of the force is discussed. The multiplicity of the solutions and the effect of the spin-orbit interaction are discussed. Exact angular momentum projection is used to generate rotational bands. The validity of the adiabatic rotational model in light nuclei is discussed. Hartree-Fock calculations are extended to include major-shell mixing in order to obtain quadrupole deformations without the use of an effective charge. The incompressibility of nuclei is discussed, and the compatibility between the Hartree-Fock solutions, the Mottelson model of quadrupole deformations and the SU3 states of J.P. Elliott and M. Moshinsky is established. (author)

  4. Forming, changing, and acting on attitude toward affirmative action programs in employment: a theory-driven approach.

    Science.gov (United States)

    Bell, M P; Harrison, D A; McLaughlin, M E

    2000-10-01

    A model of attitude toward affirmative action programs (AAPs) was applied in 4 studies involving 1,622 participants. In Study 1, attributes people tacitly associate with AAPs were identified by open-ended elicitation. Using those attributes, an instrument was developed and administered in Studies 2, 3, and 4. In those studies, a multiplicative composite of beliefs and evaluations about the AAP attributes predicted AAP attitude, consistent with M. Fishbein and I. Ajzen's (1975) theory of reasoned action. Demographic effects on AAP attitude were partially mediated by this composite. In Studies 3 and 4, an experimental manipulation of AAP information was successful in changing AAP attitude, but in a way that polarized existing demographic differences. Study 4 also showed that AAP attitude and subjective norm jointly and uniquely predicted intentions to perform AAP-related behaviors. Intentions predicted the actual behavior of mailing postcards to political representatives reflecting participants' support for AAPs.
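
    The Fishbein-Ajzen composite mentioned above is simply the sum, over the elicited attributes, of belief strength times evaluation. The toy numbers below are invented for illustration; they are not the AAP attributes or ratings obtained in Study 1.

        # Fishbein-Ajzen expectancy-value composite: attitude ~ sum_i belief_i * evaluation_i.
        # The attribute numbers below are invented, not the elicited AAP attributes.
        import numpy as np

        beliefs = np.array([6, 2, 5, 4])          # belief strength (1..7) that the AAP has attribute i
        evaluations = np.array([2, -3, 1, -1])    # evaluation (-3..+3) of attribute i
        print("belief-evaluation composite =", int(np.dot(beliefs, evaluations)))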

  5. Exploring drivers and challenges in implementation of health promotion in community mental health services: a qualitative multi-site case study using Normalization Process Theory.

    Science.gov (United States)

    Burau, Viola; Carstensen, Kathrine; Fredens, Mia; Kousgaard, Marius Brostrøm

    2018-01-24

    There is an increased interest in improving the physical health of people with mental illness. Little is known about implementing health promotion interventions in adult mental health organisations where many users also have physical health problems. The literature suggests that contextual factors are important for implementation in community settings. This study focused on the change process and analysed the implementation of a structural health promotion intervention in community mental health organisations in different contexts in Denmark. The study was based on a qualitative multiple-case design and included two municipal and two regional provider organisations. Data were various written sources and 13 semi-structured interviews with 22 key managers and frontline staff. The analysis was organised around the four main constructs of Normalization Process Theory: Coherence, Cognitive Participation, Collective Action, and Reflexive Monitoring. Coherence: Most respondents found the intervention to be meaningful in that the intervention fitted well into existing goals, practices and treatment approaches. Cognitive Participation: Management engagement varied across providers and low engagement impeded implementation. Engaging all staff was a general problem although some of the initial resistance was apparently overcome. Collective Action: Daily enactment depended on staff being attentive and flexible enough to manage the complex needs and varying capacities of users. Reflexive Monitoring: During implementation, staff evaluations of the progress and impact of the intervention were mostly informal and ad hoc and staff used these to make on-going adjustments to activities. Overall, characteristics of context common to all providers (work force and user groups) seemed to be more important for implementation than differences in the external political-administrative context. In terms of research, future studies should adopt a more bottom-up, grounded description of context

  6. Normal growth, altered growth? Study of the relationship between harris lines and bone form within a post-medieval plague cemetery (Dendermonde, Belgium, 16th Century).

    Science.gov (United States)

    Boucherie, Alexandra; Castex, Dominique; Polet, Caroline; Kacki, Sacha

    2017-01-01

    Harris lines (HLs) are defined as transverse, mineralized lines associated with temporary growth arrest. In paleopathology, HLs are used to reconstruct health status of past populations. However, their etiology is still obscure. The aim of this article is to test the reliability of HLs as an arrested growth marker by investigating their incidence on human metrical parameters. The study was performed on 69 individuals (28 adults, 41 subadults) from the Dendermonde plague cemetery (Belgium, 16th century). HLs were rated on distal femora and both ends of tibiae. Overall prevalence and age-at-formation of each detected lines were calculated. ANOVA analyses were conducted within subadult and adult samples to test if the presence of HLs did impact size and shape parameters of the individuals. At Dendermonde, 52% of the individuals had at least one HL. The age-at-formation was estimated between 5 and 9 years old for the subadults and between 10 and 14 years old for the adults. ANOVA analyses showed that the presence of HLs did not affect the size of the individuals. However, significant differences in shape parameters were highlighted by HL presence. Subadults with HLs displayed slighter shape parameters than the subadults without, whereas the adults with HLs had larger measurements than the adults without. The results suggest that HLs can have a certain impact on shape parameters. The underlying causes can be various, especially for the early formed HLs. However, HLs deposited around puberty are more likely to be physiological lines reflecting hormonal secretions. Am. J. Hum. Biol. 29:e22885, 2017. © 2016 Wiley Periodicals, Inc. © 2016 Wiley Periodicals, Inc.

  7. THE ISSUE OF FORMING FUTURE MUSIC TEACHERS’ PROFESSIONAL COMPETENCE BY COMPUTER TECHNOLOGY TOOLS IN THE THEORY OF NATIONAL ART

    Directory of Open Access Journals (Sweden)

    Lyudmila Gavrilova

    2017-04-01

    Full Text Available The article deals with theoretical aspects of forming future music teachers’ professional competence by computer technology tools. The concept of professional competence has become a major criterion of preparing students for professional activities. The issue of the article is relevant as the competence approach has become a basis of implementing computer technologies into future music teachers’ training. The authors give a detailed analysis of implementing computer technologies into musical education. The special attention is paid to using a computer in musical education and making electronic pedagogical resources. The aim of the article is to outline the directions of national art research in the process of implementing computer tools that is one of the most efficient ways of updating process of future music teachers’ training. The article reveals theoretical aspects of forming future music teachers’ professional competence by computer technology tools. The authors point out that implementing musical and computer technologies into music art practice is realized in some directions: using a computer as a new musical instrument in composers, sound engineers, and arrangers’ activities; using a computer for studying the quality of music sound, analysing sounds and music compositions, spectral analysis of acoustic characteristics of singers’ voice; studying ancient music manuscripts due to digital technology; developing hardware and software for music education. A distinct direction of research is the pedagogical aspect of using a computer in music education (music and the use of special software for recording and editing music, the use of multimedia to enhance visibility in education, development of e-learning resources, etc.. The authors conclude that implementing computer technologies into future music teachers’ training makes this process more efficient. In the authors’ opinion the widespread introduction of distance learning

  8. Modeling of contact theories for the manipulation of biological micro/nanoparticles in the form of circular crowned rollers based on the atomic force microscope

    International Nuclear Information System (INIS)

    Korayem, M. H.; Khaksar, H.; Taheri, M.

    2013-01-01

    This article has dealt with the development and modeling of various contact theories for biological nanoparticles shaped as cylinders and circular crowned rollers for application in the manipulation of different biological micro/nanoparticles based on Atomic Force Microscope. First, the effective contact forces were simulated, and their impact on contact mechanics simulation was investigated. In the next step, the Hertz contact model was simulated and compared for gold and DNA nanoparticles with the three types of spherical, cylindrical, and circular crowned roller type contact geometries. Then by reducing the length of the cylindrical section in the circular crowned roller geometry, the geometry of the body was made to approach that of a sphere, and the results were compared for DNA nanoparticles. To anticipatory validate the developed theories, the results of the cylindrical and the circular crowned roller contacts were compared with the results of the existing spherical contact simulations. Following the development of these contact models for the manipulation of various biological micro/nanoparticles, the cylindrical and the circular crowned roller type contact theories were modeled based on the theories of Lundberg, Dowson, Nikpur, Heoprich, and Hertz for the manipulation of biological micro/nanoparticles. Then, for a more accurate validation, the results obtained from the simulations were compared with those obtained by the finite element method and with the experimental results available in previous articles. The previous research works on the simulation of nanomanipulation have mainly investigated the contact theories used in the manipulation of spherical micro/nanoparticles. However since in real biomanipulation situations, biological micro/nanoparticles of more complex shapes need to be displaced in biological environments, this article therefore has modeled and compared, for the first time, different contact theories for use in the biomanipulation of
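
    For the spherical baseline against which the cylindrical and circular-crowned-roller geometries are compared, the Hertz model gives closed-form relations between load, contact radius and indentation. The sketch below covers only the sphere-on-flat case, and the material and load values are illustrative rather than the gold/DNA parameters used in the article.

        # Hertzian sphere-on-flat contact: contact radius and indentation for a given
        # normal load. Values are illustrative, not those of the article.
        import numpy as np

        def hertz_sphere(F, R, E1, nu1, E2, nu2):
            E_star = 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)   # effective modulus
            a = (3.0 * F * R / (4.0 * E_star)) ** (1.0 / 3.0)        # contact radius
            delta = a**2 / R                                         # indentation depth
            return a, delta

        # e.g. a 50 nm tip pressing on a soft particle with a 2 nN load (SI units)
        a, delta = hertz_sphere(F=2e-9, R=50e-9, E1=170e9, nu1=0.27, E2=1e9, nu2=0.3)
        print(f"contact radius = {a * 1e9:.2f} nm, indentation = {delta * 1e9:.3f} nm")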

  9. Normalization in Lie algebras via mould calculus and applications

    Science.gov (United States)

    Paul, Thierry; Sauzin, David

    2017-11-01

    We establish Écalle's mould calculus in an abstract Lie-theoretic setting and use it to solve a normalization problem, which covers several formal normal form problems in the theory of dynamical systems. The mould formalism allows us to reduce the Lie-theoretic problem to a mould equation, the solutions of which are remarkably explicit and can be fully described by means of a gauge transformation group. The dynamical applications include the construction of Poincaré-Dulac formal normal forms for a vector field around an equilibrium point, a formal infinite-order multiphase averaging procedure for vector fields with fast angular variables (Hamiltonian or not), or the construction of Birkhoff normal forms both in classical and quantum situations. As a by-product we obtain, in the case of harmonic oscillators, the convergence of the quantum Birkhoff form to the classical one, without any Diophantine hypothesis on the frequencies of the unperturbed Hamiltonians.
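
    A one-line instance of the kind of formal normalization that the mould formalism organizes: for a scalar field x' = λx + a*x^2, the non-resonant quadratic term is removed by a near-identity change of variables. The sympy check below is a toy verification of that classical Poincaré-Dulac step, not an implementation of the mould-calculus machinery itself.

        # Toy Poincare-Dulac step: for x' = lam*x + a*x**2 with lam != 0 (non-resonant),
        # the change x = y + (a/lam)*y**2 removes the quadratic term. Symbolic check only;
        # this is not the mould-calculus construction of the paper.
        import sympy as sp

        lam, a, y = sp.symbols("lambda a y")
        x = y + (a / lam) * y**2                                  # near-identity change of variables
        y_dot = sp.series((lam * x + a * x**2) / sp.diff(x, y), y, 0, 3).removeO()
        print(sp.simplify(sp.expand(y_dot)))                      # -> lambda*y (quadratic term gone)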

  10. An explanation of forms of planetary orbits and estimation of the angular shift of Mercury's perihelion using the statistical theory of gravitating spheroidal bodies

    Science.gov (United States)

    Krot, A. M.

    2013-09-01

    This work develops a statistical theory of gravitating spheroidal bodies to calculate the orbits of planets and explore forms of planetary orbits with regard to the Alfvén oscillating force [1] in the Solar system and other exoplanetary systems. The statistical theory of formation of gravitating spheroidal bodies has been proposed in [2]-[5]. Starting the conception for forming a spheroidal body inside a gas-dust protoplanetary nebula, this theory solves the problem of gravitational condensation of a gas-dust protoplanetary cloud with a view to planetary formation in its own gravitational field [3] as well as derives a new law of the Solar system planetary distances which generalizes the wellknown laws [2], [3]. This work also explains an origin of the Alfvén oscillating force modifying forms of planetary orbits within the framework of the statistical theory of gravitating spheroidal bodies [5]. Due to the Alfvén oscillating force moving solid bodies in a distant zone of a rotating spheroidal body have elliptic trajectories. It means that orbits for the enough remote planets from the Sun in Solar system are described by ellipses with focus in the origin of coordinates and with small eccentricities. The nearby planet to Sun named Mercury has more complex trajectory. Namely, in case of Mercury the angular displacement of a Newtonian ellipse is observed during its one rotation on an orbit, i.e. a regular (century) shift of the perihelion of Mercury' orbit occurs. According to the statistical theory of gravitating spheroidal bodies [2]-[5] under the usage of laws of celestial mechanics in conformity to cosmogonic bodies (especially, to stars) it is necessary to take into account an extended substance called a stellar corona. In this connection the stellar corona can be described by means of model of rotating and gravitating spheroidal body [5]. Moreover, the parameter of gravitational compression α of a spheroidal body (describing the Sun, in particular) has been

  11. Harmonic Maass forms and mock modular forms

    CERN Document Server

    Bringmann, Kathrin; Ono, Ken

    2017-01-01

    Modular forms and Jacobi forms play a central role in many areas of mathematics. Over the last 10-15 years, this theory has been extended to certain non-holomorphic functions, the so-called "harmonic Maass forms". The first glimpses of this theory appeared in Ramanujan's enigmatic last letter to G. H. Hardy written from his deathbed. Ramanujan discovered functions he called "mock theta functions" which over eighty years later were recognized as pieces of harmonic Maass forms. This book contains the essential features of the theory of harmonic Maass forms and mock modular forms, together with a wide variety of applications to algebraic number theory, combinatorics, elliptic curves, mathematical physics, quantum modular forms, and representation theory.

  12. Asymptotic form factor of non-Abelian gauge theories, planar diagrammatics and complex poles as resonances in the analytic s-matrix

    International Nuclear Information System (INIS)

    Knight, D.W.

    1976-01-01

    Reasons are given for studying the form factor and a method for constructing all believed-to-be leading form factor diagrams in a certain class of non-Abelian gauge theories (NAGT's) in typical kinematic limits. The possibility that the form factor "exponentiates" in NAGT's (as it does in QED) is discussed. A method is given for constructing all 1CI planar diagrams (that is, all 1PI diagrams except those which separate upon cutting at a vertex) directly from one's head--that is, without the need to refer to tables, et cetera. It is noted that the material is believed to be essentially completely original; that is, the technique for constructing all 1CI planar diagrams in an iterative fashion is completely new. Of course, one can construct them in an essentially random fashion, but this technique is slow and extremely error prone compared with the iterative technique given. The idea of associating an elastic resonance with a complex pole in the analytic scattering amplitude, T(E), is discussed. Calculations of the pole position and the residue of the Δ33 resonance are given, along with an analysis of experimentally induced error in the pole position.

  13. Modelling of tension stiffening for normal and high strength concrete

    DEFF Research Database (Denmark)

    Christiansen, Morten Bo; Nielsen, Mogens Peter

    1998-01-01

    form the model is extended to apply to biaxial stress fields as well. To determine the biaxial stress field, the theorem of minimum complementary elastic energy is used. The theory has been compared with tests on rods, disks, and beams of both normal and high strength concrete, and very good results...

  14. Quasiparticle Green's function theory of the Josephson effect in chiral p-wave superconductor/diffusive normal metal/chiral p-wave superconductor junctions

    NARCIS (Netherlands)

    Sawa, Y.; Yokoyama, T.; Tanaka, Y.; Golubov, Alexandre Avraamovitch

    2007-01-01

    We study the Josephson effect in chiral p-wave superconductor/diffusive normal metal (DN)/chiral p-wave superconductor (CP/DN/CP) junctions using quasiclassical Green's function formalism with proper boundary conditions. The px+ipy-wave symmetry of superconducting order parameter is chosen which is

  15. Theory of Josephson effect in Sr2RuO4/diffusive normal metal/Sr2RuO4 junctions

    NARCIS (Netherlands)

    Sawa, Y.; Yokoyama, T.; Tanaka, Y.; Golubov, Alexandre Avraamovitch

    2007-01-01

    We derive a generalized Nazarov’s boundary condition for diffusive normal metal (DN)/chiral p-wave superconductor (CP) interface including the macroscopic phase of the superconductor. The Josephson effect is studied in CP/DN/CP junctions solving the Usadel equations under the above boundary

  16. Corrosion mechanisms for metal alloy waste forms: experiment and theory Level 4 Milestone M4FT-14LA0804024 Fuel Cycle Research & Development

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xiang-Yang [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Taylor, Christopher D. [The Ohio State Univ., Columbus, OH (United States). Fontana Corrosion Center; Kim, Eunja [Univ. of Nevada, Las Vegas, NV (United States); Goff, George Scott [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kolman, David Gary [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-07-31

    This document meets Level 4 Milestone: Corrosion mechanisms for metal alloy waste forms - experiment and theory. A multiphysics model is introduced that will provide the framework for the quantitative prediction of corrosion rates of metallic waste forms incorporating the fission product Tc. The model requires a knowledge of the properties of not only the metallic waste form, but also the passive oxide films that will be generated on the waste form, and the chemistry of the metal/oxide and oxide/environment interfaces. In collaboration with experimental work, the focus of this work is on obtaining these properties from fundamental atomistic models. Herein we describe the overall multiphysics model, which is based on MacDonald's point-defect model for passivity. We then present the results of detailed electronic-structure calculations for the determination of the compatibility and properties of Tc when incorporated into intermetallic oxide phases. This work is relevant to the formation of multi-component oxides on metal surfaces that will incorporate Tc and provide a kinetic barrier to corrosion (i.e., the release of Tc to the environment). Atomistic models that build upon the electronic structure calculations are then described, using the modified embedded atom method to simulate metallic dissolution, and Buckingham potentials to perform classical molecular dynamics and statics simulations of the technetium (and, later, iron-technetium) oxide phases. Electrochemical methods were then applied to provide some benchmark information on the corrosion and electrochemical properties of technetium metal. The results indicate that published information on Tc passivity is not complete and that further investigation is warranted.

  17. Complete Normal Ordering 1: Foundations

    CERN Document Server

    Ellis, John; Skliros, Dimitri P.

    2016-01-01

    We introduce a new prescription for quantising scalar field theories perturbatively around a true minimum of the full quantum effective action, which is to `complete normal order' the bare action of interest. When the true vacuum of the theory is located at zero field value, the key property of this prescription is the automatic cancellation, to any finite order in perturbation theory, of all tadpole and, more generally, all `cephalopod' Feynman diagrams. The latter are connected diagrams that can be disconnected into two pieces by cutting one internal vertex, with either one or both pieces free from external lines. In addition, this procedure of `complete normal ordering' (which is an extension of the standard field theory definition of normal ordering) reduces by a substantial factor the number of Feynman diagrams to be calculated at any given loop order. We illustrate explicitly the complete normal ordering procedure and the cancellation of cephalopod diagrams in scalar field theories with non-derivative i...

  18. Crossover from normal (N) Ohmic subdivision to superconducting (S) equipartition of current in parallel conductors at the N-S transition: Theory

    OpenAIRE

    Kumar, N.

    2007-01-01

    The recently observed (1) equipartition of current in parallel at and below the Normal-Superconducting (N-S) transition can be understood in terms of a Landau-Ginzburg order-parameter phenomenology. This complements the explanation proposed earlier (1) based on the flux-flow resistance providing a nonlinear negative current feedback towards equipartition when the transition is approached from above. The present treatment also unifies the usual textbook inductive subdivision expected much belo...

  19. Chandra-SDSS Normal and Star-Forming Galaxies. I. X-Ray Source Properties of Galaxies Detected by the Chandra X-Ray Observatory in SDSS DR2

    Science.gov (United States)

    Hornschemeier, A. E.; Heckman, T. M.; Ptak, A. F.; Tremonti, C. A.; Colbert, E. J. M.

    2005-01-01

    We have cross-correlated X-ray catalogs derived from archival Chandra X-Ray Observatory ACIS observations with a Sloan Digital Sky Survey Data Release 2 (DR2) galaxy catalog to form a sample of 42 serendipitously X-ray-detected galaxies over the redshift interval 0.03 < z < 0.25, bridging the gap between X-ray studies of nearby normal galaxies and those in the deepest X-ray surveys. Our chief purpose is to compare optical spectroscopic diagnostics of activity (both star formation and accretion) with X-ray properties of galaxies. Our work supports a normalization value of the X-ray-star formation rate correlation consistent with the lower values published in the literature. The difference is in the allocation of X-ray emission to high-mass X-ray binaries relative to other components, such as hot gas, low-mass X-ray binaries, and/or active galactic nuclei (AGNs). We are able to quantify a few pitfalls in the use of lower resolution, lower signal-to-noise ratio optical spectroscopy to identify X-ray sources (as has necessarily been employed for many X-ray surveys). Notably, we find a few AGNs that likely would have been misidentified as non-AGN sources in higher redshift studies. However, we do not find any X-ray-hard, highly X-ray-luminous galaxies lacking optical spectroscopic diagnostics of AGN activity. Such sources are members of the "X-ray-bright, optically normal galaxy" (XBONG) class of AGNs.

  20. Normalized modes at selected points without normalization

    Science.gov (United States)

    Kausel, Eduardo

    2018-04-01

    As every textbook on linear algebra demonstrates, the eigenvectors for the general eigenvalue problem | K - λM | = 0 involving two real, symmetric, positive definite matrices K, M satisfy some well-defined orthogonality conditions. Equally well-known is the fact that those eigenvectors can be normalized so that their modal mass μ = ϕ^T M ϕ is unity: it suffices to divide each unscaled mode by the square root of the modal mass. Thus, the normalization is the result of an explicit calculation applied to the modes after they were obtained by some means. However, we show herein that the normalized modes are not merely convenient forms of scaling, but that they are actually intrinsic properties of the pair of matrices K, M, that is, the matrices already "know" about normalization even before the modes have been obtained. This means that we can obtain individual components of the normalized modes directly from the eigenvalue problem, and without needing to obtain either all of the modes or, for that matter, any one complete mode. These results are achieved by means of the residue theorem of operational calculus, a finding that is rather remarkable inasmuch as the residues themselves do not make use of any orthogonality conditions or normalization in the first place. It appears that this obscure property connecting the general eigenvalue problem of modal analysis with the residue theorem of operational calculus may have been overlooked up until now, but it has in turn interesting theoretical implications.
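
    For contrast with the residue-theorem route described above, the conventional computation of mass-normalized modes is sketched below for an illustrative 3-degree-of-freedom K, M pair (not taken from the paper); scipy's generalized eigensolver already returns modes with unit modal mass.

        # Conventional mass-normalization of modes for (K - lam*M) phi = 0.
        # The 3-DOF matrices are illustrative, not taken from the paper.
        import numpy as np
        from scipy.linalg import eigh

        K = np.array([[ 2.0, -1.0,  0.0],
                      [-1.0,  2.0, -1.0],
                      [ 0.0, -1.0,  1.0]])
        M = np.diag([1.0, 1.0, 0.5])

        lam, Phi = eigh(K, M)                            # eigh returns M-orthonormal eigenvectors
        print(np.allclose(Phi.T @ M @ Phi, np.eye(3)))   # modal masses are already unity
        # explicit normalization of an unscaled mode phi: phi / np.sqrt(phi @ M @ phi)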

  1. Relativistic Normal Coupled-Cluster Theory for Accurate Determination of Electric Dipole Moments of Atoms: First Application to the ^{199}Hg Atom.

    Science.gov (United States)

    Sahoo, B K; Das, B P

    2018-05-18

    Recent relativistic coupled-cluster (RCC) calculations of electric dipole moments (EDMs) of diamagnetic atoms due to parity and time-reversal violating (P,T-odd) interactions, which are essential ingredients for probing new physics beyond the standard model of particle interactions, differ substantially from the previous theoretical results. It is therefore necessary to perform an independent test of the validity of these results. In view of this, the normal coupled-cluster method has been extended to the relativistic regime [relativistic normal coupled-cluster (RNCC) method] to calculate the EDMs of atoms by simultaneously incorporating the electrostatic and P,T-odd interactions in order to overcome the shortcomings of the ordinary RCC method. This new relativistic method has been applied to ^{199}Hg, which currently has a lower EDM limit than that of any other system. The results of our RNCC and self-consistent RCC calculations of the EDM of this atom are found to be close. The discrepancies between these two results on the one hand and those of previous calculations on the other are elucidated. Furthermore, the electric dipole polarizability of this atom, which has computational similarities with the EDM, is evaluated and it is in very good agreement with its measured value.

  3. Existence of a soluble form of CD50 (intercellular adhesion molecule-3) produced upon human lymphocyte activation. Present in normal human serum and levels are increased in the serum of systemic lupus erythematosus patients.

    Science.gov (United States)

    Pino-Otín, M R; Viñas, O; de la Fuente, M A; Juan, M; Font, J; Torradeflot, M; Pallarés, L; Lozano, F; Alberola-Ila, J; Martorell, J

    1995-03-15

    CD50 (ICAM-3) is a leukocyte differentiation Ag expressed almost exclusively on hemopoietic cells, with a key role in the first steps of immune response. To develop a specific sandwich ELISA to detect a soluble CD50 form (sCD50), two different mAbs (140-11 and 101-1D2) recognizing non-overlapping epitopes were used. sCD50 was detected in the supernatant of stimulated PBMCs, with the highest levels after CD3 triggering. Simultaneously, the CD50 surface expression diminished during the first 24 h. sCD50 isolated from culture supernatant and analyzed by immunoblotting showed an apparent m.w. of 95 kDa, slightly smaller than the membrane form. These data, together with Northern blot kinetics analysis, suggest that sCD50 is cleaved from cell membrane. Furthermore, we detect sCD50 in normal human sera and higher levels in sera of systemic lupus erythematosus (SLE) patients, especially in those in active phase. The sCD50 levels showed a positive correlation with sCD27 levels (r = 0.4213; p = 0.0026). Detection of sCD50, both after in vitro CD3 triggering of PBMCs and increased in SLE sera, suggests that sCD50 could be used as a marker of lymphocyte stimulation.

  4. Theory of exotic superconductivity and normal states of heavy electron and high temperature superconductivity materials. Progress report, February 15, 1994--February 14, 1995

    International Nuclear Information System (INIS)

    Cox, D.L.

    1995-01-01

    This is a progress report for the DOE project covering the period 2/15/94 to 2/14/95. The PI had a fruitful sabbatical during this period, and had some important new results, particularly in the area of new phenomenology for heavy fermion superconductivity. Significant new research accomplishments are in the area of odd-in-time-reversal pairing states/staggered superconductivity, the two-channel Kondo lattice, and a general model for Ce impurities which admits one-, two-, and three-channel Kondo effects. Papers submitted touch on these areas: staggered superconductivity - a new phenomenology for UPt 3 ; theory of the two-channel Kondo lattice in infinite dimensions; general model of a Ce 3+ impurity. Other work was done in the areas: Knight shift in heavy fermion alloys and compounds; symmetry analysis of singular pairing correlations for the two-channel Kondo impurity model

  5. Normal co-ordinate analysis, molecular structural, non-linear optical, second order perturbation studies of Tizanidine by density functional theory.

    Science.gov (United States)

    Sheela, N R; Muthu, S; Sampathkrishnan, S; Al-Saadi, Abdulaziz A

    2015-03-15

    The spectroscopic techniques and semi-empirical molecular calculations have been utilized to analyze the drug Tizanidine (5CDIBTA). The solid phase Fourier Transform Infrared (FTIR) and Fourier Transform Raman (FTR) spectral analysis of 5CDIBTA is carried out along with density functional theory (DFT) calculations (B3LYP) with the 6-311++G(d,p) basis set. Detailed interpretation of the vibrational spectra of the compound has been made on the basis of the calculated potential energy distribution (PED). The individual atomic charges by NPA using B3LYP method is studied. A study on the Mulliken atomic charges, frontier molecular orbitals (HOMO-LUMO), molecular electrostatic potential (MEP) and thermodynamic properties were performed. The electric dipole moment (μ) and the first hyperpolarizability (α) values of the investigated molecule were also computed. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. ESTUDIO ESTADÍSTICO DEL NÚMERO DE REGLAS RESULTANTES AL TRANSFORMAR UNA GRAMÁTICA LIBRE DE CONTEXTO A LA FORMA NORMAL DE CHOMSKY / STATISTICAL STUDY OF THE NUMBER OF RESULTING RULES WHEN TRANSFORMING A CONTEXT-FREE GRAMMAR TO CHOMSKY NORMAL FORM

    Directory of Open Access Journals (Sweden)

    Fredy Ángel Miguel Amaya Robayo

    2010-08-01

    Full Text Available It is well known that any context-free grammar can be transformed to Chomsky normal form so that the languages generated by the two grammars are equivalent. A grammar in Chomsky Normal Form (CNF) has some advantages: its derivation trees are binary, its rules are simpler, and so on. It is therefore always desirable to work with a grammar in CNF in applications that require it. There is an algorithm that transforms a context-free grammar into a CNF grammar; however, the number of rules generated by the transformation depends on the number of rules in the initial grammar as well as on other characteristics. In this work we analyze, from an experimental and statistical point of view, the relationship between the number of initial rules and the number of resulting rules after the transformation. This makes it possible to plan the amount of computational resources needed when dealing with grammars of some complexity.
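
    Most of the growth in rule count during the conversion comes from the binarization (BIN) step, which splits every right-hand side longer than two symbols. The sketch below applies only that step to a toy grammar invented for illustration; a complete CNF conversion would also need the TERM, DEL and UNIT steps.

        # BIN step of a CNF conversion: every rule A -> X1 ... Xk with k > 2 is split
        # into k - 1 binary rules via fresh nonterminals. Toy grammar, illustration only.
        def bin_step(rules):
            out, fresh = [], 0
            for head, body in rules:
                while len(body) > 2:
                    fresh += 1
                    new_nt = f"_X{fresh}"
                    out.append((head, [body[0], new_nt]))
                    head, body = new_nt, body[1:]
                out.append((head, body))
            return out

        grammar = [("S", ["NP", "VP", "PP", "."]), ("NP", ["Det", "N"]), ("VP", ["V", "NP"])]
        binarized = bin_step(grammar)
        print(f"{len(grammar)} rules before BIN, {len(binarized)} after")
        for head, body in binarized:
            print(head, "->", " ".join(body))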

  7. Dynamical response of the Galileo Galilei on the ground rotor to test the equivalence principle: Theory, simulation, and experiment. I. The normal modes

    International Nuclear Information System (INIS)

    Comandi, G.L.; Chiofalo, M.L.; Toncelli, R.; Bramanti, D.; Polacco, E.; Nobili, A.M.

    2006-01-01

    Recent theoretical work suggests that violation of the equivalence principle might be revealed in a measurement of the fractional differential acceleration η between two test bodies of different compositions, falling in the gravitational field of a source mass, if the measurement is made to the level of η ≅ 10^-13 or better. This being within the reach of ground based experiments gives them a new impetus. However, while slowly rotating torsion balances in ground laboratories are close to reaching this level, only an experiment performed in a low orbit around the Earth is likely to provide a much better accuracy. We report on the progress made with the 'Galileo Galilei on the ground' (GGG) experiment, which aims to compete with torsion balances using an instrument design also capable of being converted into a much higher sensitivity space test. In the present and following articles (Part I and Part II), we demonstrate that the dynamical response of the GGG differential accelerometer set into supercritical rotation, in particular its normal modes (Part I) and rejection of common mode effects (Part II), can be predicted by means of a simple but effective model that embodies all the relevant physics. Analytical solutions are obtained under special limits, which provide the theoretical understanding. A simulation environment is set up, obtaining a quantitative agreement with the available experimental data on the frequencies of the normal modes and on the whirling behavior. This is a needed and reliable tool for controlling and separating perturbative effects from the expected signal, as well as for planning the optimization of the apparatus.

  8. Teaching Form as Form

    DEFF Research Database (Denmark)

    Keiding, Tina Bering

    2012-01-01

    understanding of form per se, or, to use an expression from this text, of form as form. This challenge can be reduced to one question: how can design teaching support students in achieving not only the ability to recognize and describe different form-related concepts in existing design (i.e. analytical...

  9. The Colourful Space of Pure Form. The Problem of Colour in Theory and Practice in the Work of Stanisław Ignacy Witkiewicz

    Directory of Open Access Journals (Sweden)

    Anna Żakiewicz

    2010-01-01

    Full Text Available Colour was a very important problem in the whole work of Stanisław Ignacy Witkiewicz. Not only was a whole chapter of his basic theoretical work Nowe formy w malarstwie (New Forms in Painting, published 1919) devoted to it, but the artist also examined his theory in practice, executing many paintings presenting his ideas (e.g. Self-portrait, 1913, Portrait of Feliks Lewiński, 1917, Portrait of Maria Witkiewiczowa, 1918, and many others). A basis for that was a harmony of complementary colours: green and red, blue and orange, violet and yellow. Many theorists before Witkacy were interested in this: Wolfgang Goethe, Philip Otto Runge, Wassili Kandinsky. The artist did not mention them in his deliberations, referring only to a psychologist, Hermann Ebbinghaus (Grundzüge der Psychologie, 1897), and to other artists: Paul Gauguin and Pablo Picasso. Witkacy even copied one of Gauguin’s paintings, Te Arii Vahine, and described it in his writings as the best example of Pure Form. Witkacy enthusiastically described Picasso’s paintings in his first novel 622 upadki Bunga (622 Downfalls of Bungo, written ca. 1910, first published 1972) and mentioned them many times in his theoretical works. The artist also appreciated Georges Seurat’s paintings and his thoughts on colours. Witkacy himself created a detailed system presenting the best colour schemes and described many compositions corresponding to it. In 1938 the artist wrote an article O istocie malarstwa (On the Essence of Painting), which stood in opposition to his earlier views. At the end of his life the artist decided that imagination mattered more than theoretical rules.

  10. FORMING PROFESSIONAL SKILLS OF THE PROSPECTIVE HEADS OF CHILDREN'S DANCE GROUPS DURING THE CHOREOGRAPHIC ACTIVITIES IN THE COURSE "FOLK DANCE THEORY AND METHODOLOGY"

    Directory of Open Access Journals (Sweden)

    Volodymyr Kotov

    2016-11-01

    Full Text Available The article highlights the urgent problem of contemporary art pedagogy – involvement to training future professional choreographic traditions of different nations. Addressing to this problem is caused by a number of socio-political events in Ukraine, mainstreaming of national and international education, integration of Ukrainian education with the European educational space, intensive development of domestic students’ intercultural communication with young people from different countries, which is the basis for updating national art education. Prospective choreographers, who are being training at pedagogical universities to manage children's dance groups, should actively be involved into creating their own productions of folk dance various genres. It promotes the formation of choreographers’ professional competence and pedagogical skills. The development of Georgian "Lezginka" is proposed – a joint creative work of the teacher and students who get higher education degree in SHEE “Donbass State Pedagogical University” (Bachelor's Degree. Development of the dance contains schematic drawings of dance figures, it is recommended for use in forming choreographers’ professional skills while studying the course "Folk Dance Theory and Methodology". The author admits that folklore material requires a cautious, respectful attitude. Therefore, modern folk stage dances are integrally to combine traditional choreographic manner with its new interpretations. The author believes the actual capture of different nations’ choreographic culture improves intercultural youth communication; involves future professionals into the traditions of different nations; form professional skills of managers of children’s dance groups. The author concluded that a dance always reflects consciousness of different nations; future choreographers should be aware of characteristic features of dances of different world nations so that on the basis of traditional

  11. Combination of classical test theory (CTT) and item response theory (IRT) analysis to study the psychometric properties of the French version of the Quality of Life Enjoyment and Satisfaction Questionnaire-Short Form (Q-LES-Q-SF).

    Science.gov (United States)

    Bourion-Bédès, Stéphanie; Schwan, Raymund; Epstein, Jonathan; Laprevote, Vincent; Bédès, Alex; Bonnet, Jean-Louis; Baumann, Cédric

    2015-02-01

    The study aimed to examine the construct validity and reliability of the Quality of Life Enjoyment and Satisfaction Questionnaire-Short Form (Q-LES-Q-SF) according to both classical test and item response theories. The psychometric properties of the French version of this instrument were investigated in a cross-sectional, multicenter study. A total of 124 outpatients with a substance dependence diagnosis participated in the study. Psychometric evaluation included descriptive analysis, internal consistency, test-retest reliability, and validity. The dimensionality of the instrument was explored in a complementary manner using a combination of a classical test approach, confirmatory factor analysis (CFA), and an item response theory analysis, the Person Separation Index (PSI). The results revealed that the Q-LES-Q-SF was easy to administer and its acceptability was good. The internal consistency and the test-retest reliability were 0.9 and 0.88, respectively. All items were significantly correlated with the total score and with the SF-12 used in the study. The fit of the one-factor CFA model was good, and for the unidimensional construct the PSI was found to be 0.902. The French version of the Q-LES-Q-SF yielded valid and reliable clinical assessments of quality of life for future research and clinical practice involving French substance abusers. In response to recent questioning regarding the unidimensionality or bidimensionality of the instrument, and in line with the underlying unidimensional theoretical construct used for its development, this study supports the Q-LES-Q-SF as a one-dimensional questionnaire in French QoL studies.
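
    A minimal sketch of the classical-test-theory side of such an analysis: the snippet below computes Cronbach's alpha from a respondents-by-items score matrix. The data are randomly generated placeholders (not the Q-LES-Q-SF sample) and the function name is an assumption of this sketch.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Hypothetical example: 124 respondents answering 14 five-point items.
rng = np.random.default_rng(0)
demo_scores = rng.integers(1, 6, size=(124, 14))
print(f"alpha = {cronbach_alpha(demo_scores):.2f}")
```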

  12. String theory or field theory?

    International Nuclear Information System (INIS)

    Marshakov, A.V.

    2002-01-01

    The status of string theory is reviewed, and major recent developments - especially those in going beyond perturbation theory in the string theory and quantum field theory frameworks - are analyzed. This analysis helps better understand the role and place of string theory in the modern picture of the physical world. Even though quantum field theory describes a wide range of experimental phenomena, it is emphasized that there are some insurmountable problems inherent in it - notably the impossibility to formulate the quantum theory of gravity on its basis - which prevent it from being a fundamental physical theory of the world of microscopic distances. It is this task, the creation of such a theory, which string theory, currently far from completion, is expected to solve. In spite of its somewhat vague current form, string theory has already led to a number of serious results and greatly contributed to progress in the understanding of quantum field theory. It is these developments which are our concern in this review [ru

  13. Form birefringence analysis in a grating by means of modal theory; Analisis de la birrefringencia de forma en una rejilla mediante la teoria modal

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Ponce, G.; Solano, Cristina [Centro de Investigaciones en Optica A.C, Guanajuato (Mexico)

    2001-02-01

    The propagation of a plane electromagnetic wave through a grating is analyzed using modal theory. The eigenvalue problem is solved as a function of the ratio of the illumination wavelength {lambda} to the grating period (d). When this ratio is much greater than one (the quasistatic limit), the grating responds like a uniaxial film, and the ordinary and extraordinary effective refractive indices of the birefringent element can be obtained from an approximation to the characteristic equation. This effect is called form birefringence and can be used to design retardation plates and other birefringent elements.
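
    In the quasistatic limit described above, zeroth-order effective-medium theory gives standard closed-form expressions for the two effective indices of a binary grating. The sketch below evaluates them for a hypothetical fill factor and pair of material indices; these values are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def form_birefringence(n1: float, n2: float, f: float) -> tuple[float, float]:
    """Zeroth-order effective indices of a binary grating with period << wavelength.

    n1, n2 : refractive indices of the two materials in one period
    f      : fill factor of material 1 (between 0 and 1)
    Returns (n_ordinary, n_extraordinary), i.e. the indices seen by light
    polarized parallel and perpendicular to the grating grooves.
    """
    n_o = np.sqrt(f * n1**2 + (1 - f) * n2**2)
    n_e = 1.0 / np.sqrt(f / n1**2 + (1 - f) / n2**2)
    return float(n_o), float(n_e)

# Hypothetical photoresist/air grating with a 50% fill factor.
n_o, n_e = form_birefringence(1.64, 1.00, 0.5)
print(f"n_o = {n_o:.3f}, n_e = {n_e:.3f}, form birefringence = {n_o - n_e:.3f}")
```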

  14. Item Response Theory Modeling and Categorical Regression Analyses of the Five-Factor Model Rating Form: A Study on Italian Community-Dwelling Adolescent Participants and Adult Participants.

    Science.gov (United States)

    Fossati, Andrea; Widiger, Thomas A; Borroni, Serena; Maffei, Cesare; Somma, Antonella

    2017-06-01

    To extend the evidence on the reliability and construct validity of the Five-Factor Model Rating Form (FFMRF) in its self-report version, two independent samples of Italian participants, which were composed of 510 adolescent high school students and 457 community-dwelling adults, respectively, were administered the FFMRF in its Italian translation. Adolescent participants were also administered the Italian translation of the Borderline Personality Features Scale for Children-11 (BPFSC-11), whereas adult participants were administered the Italian translation of the Triarchic Psychopathy Measure (TriPM). Cronbach α values were consistent with previous findings; in both samples, average interitem r values indicated acceptable internal consistency for all FFMRF scales. A multidimensional graded item response theory model indicated that the majority of FFMRF items had adequate discrimination parameters; information indices supported the reliability of the FFMRF scales. Both categorical (i.e., item-level) and scale-level regression analyses suggested that the FFMRF scores may predict a nonnegligible amount of variance in the BPFSC-11 total score in adolescent participants, and in the TriPM scale scores in adult participants.

  15. Development of a short form Social Interaction Anxiety (SIAS) and Social Phobia Scale (SPS) using nonparametric item response theory: the SIAS-6 and the SPS-6.

    Science.gov (United States)

    Peters, Lorna; Sunderland, Matthew; Andrews, Gavin; Rapee, Ronald M; Mattick, Richard P

    2012-03-01

    Shortened forms of the Social Interaction Anxiety Scale (SIAS) and the Social Phobia Scale (SPS) were developed using nonparametric item response theory methods. Using data from socially phobic participants enrolled in 5 treatment trials (N = 456), 2 six-item scales (the SIAS-6 and the SPS-6) were developed. The validity of the scores on the SIAS-6 and the SPS-6 was then tested using traditional methods for their convergent validity in an independent clinical sample and a student sample, as well as for their sensitivity to change and diagnostic sensitivity in the clinical sample. Scores on the SIAS-6 and the SPS-6 correlated with scores on measures of related constructs as strongly as scores on the original SIAS and SPS did, discriminated well between those with and without a diagnosis of social phobia (providing cutoffs for diagnosis), and were as sensitive to treatment-related change as the SIAS and SPS. Together, the SIAS-6 and the SPS-6 appear to be an efficient method of measuring symptoms of social phobia and provide a brief screening tool.

  16. Forms and genesis of species abundance distributions

    Directory of Open Access Journals (Sweden)

    Evans O. Ochiaga

    2015-12-01

    Full Text Available Species abundance distribution (SAD is one of the most important metrics in community ecology. SAD curves take a hollow or hyperbolic shape in a histogram plot with many rare species and only a few common species. In general, the shape of SAD is largely log-normally distributed, although the mechanism behind this particular SAD shape still remains elusive. Here, we aim to review four major parametric forms of SAD and three contending mechanisms that could potentially explain this highly skewed form of SAD. The parametric forms reviewed here include log series, negative binomial, lognormal and geometric distributions. The mechanisms reviewed here include the maximum entropy theory of ecology, neutral theory and the theory of proportionate effect.
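
    As a small illustration of the log-normal form mentioned above, the sketch below draws a hypothetical community from a lognormal abundance distribution and tabulates the species into Preston-style doubling (octave) classes; on this logarithmic scale the counts look roughly bell-shaped, while the same data plotted on an arithmetic scale give the familiar hollow curve with many rare species. All parameter values are arbitrary choices, not figures from the review.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical community: 300 species with lognormally distributed abundances.
abundances = rng.lognormal(mean=2.0, sigma=1.5, size=300)
abundances = np.maximum(1, np.round(abundances)).astype(int)

# Preston-style octaves: class k holds species with abundance in [2^k, 2^(k+1)).
octaves = np.floor(np.log2(abundances)).astype(int)
for k in range(octaves.max() + 1):
    count = int((octaves == k).sum())
    print(f"abundance {2**k:>5d}-{2**(k + 1) - 1:>5d}: {count:3d} species")
```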

  17. Malware Normalization

    OpenAIRE

    Christodorescu, Mihai; Kinder, Johannes; Jha, Somesh; Katzenbeisser, Stefan; Veith, Helmut

    2005-01-01

    Malware is code designed for a malicious purpose, such as obtaining root privilege on a host. A malware detector identifies malware and thus prevents it from adversely affecting a host. In order to evade detection by malware detectors, malware writers use various obfuscation techniques to transform their malware. There is strong evidence that commercial malware detectors are susceptible to these evasion tactics. In this paper, we describe the design and implementation of a malware normalizer ...

  18. Complete normal ordering 1: Foundations

    Directory of Open Access Journals (Sweden)

    John Ellis

    2016-08-01

    Full Text Available We introduce a new prescription for quantising scalar field theories (in generic spacetime dimension and background) perturbatively around a true minimum of the full quantum effective action, which is to ‘complete normal order’ the bare action of interest. When the true vacuum of the theory is located at zero field value, the key property of this prescription is the automatic cancellation, to any finite order in perturbation theory, of all tadpole and, more generally, all ‘cephalopod’ Feynman diagrams. The latter are connected diagrams that can be disconnected into two pieces by cutting one internal vertex, with either one or both pieces free from external lines. In addition, this procedure of ‘complete normal ordering’ (which is an extension of the standard field theory definition of normal ordering) reduces by a substantial factor the number of Feynman diagrams to be calculated at any given loop order. We illustrate explicitly the complete normal ordering procedure and the cancellation of cephalopod diagrams in scalar field theories with non-derivative interactions, and by using a point splitting ‘trick’ we extend this result to theories with derivative interactions, such as those appearing as non-linear σ-models in the world-sheet formulation of string theory. We focus here on theories with trivial vacua, generalising the discussion to non-trivial vacua in a follow-up paper.
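
    For orientation only, the standard field-theory normal ordering that this prescription extends amounts to Wick subtractions involving the propagator at coincident points, Δ(0); for a free scalar field the first few cases are (these are the textbook relations, not the paper's 'complete' prescription):

$$
\begin{aligned}
{:}\phi^{2}(x){:} &= \phi^{2}(x) - \Delta(0),\\
{:}\phi^{3}(x){:} &= \phi^{3}(x) - 3\,\Delta(0)\,\phi(x),\\
{:}\phi^{4}(x){:} &= \phi^{4}(x) - 6\,\Delta(0)\,\phi^{2}(x) + 3\,\Delta(0)^{2}.
\end{aligned}
$$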

  19. A mixture theory model of fluid and solute transport in the microvasculature of normal and malignant tissues. II: Factor sensitivity analysis, calibration, and validation.

    Science.gov (United States)

    Schuff, M M; Gore, J P; Nauman, E A

    2013-12-01

    The treatment of cancerous tumors is dependent upon the delivery of therapeutics through the blood by means of the microcirculation. Differences in the vasculature of normal and malignant tissues have been recognized, but it is not fully understood how these differences affect transport and the applicability of existing mathematical models has been questioned at the microscale due to the complex rheology of blood and fluid exchange with the tissue. In addition to determining an appropriate set of governing equations it is necessary to specify appropriate model parameters based on physiological data. To this end, a two stage sensitivity analysis is described which makes it possible to determine the set of parameters most important to the model's calibration. In the first stage, the fluid flow equations are examined and a sensitivity analysis is used to evaluate the importance of 11 different model parameters. Of these, only four substantially influence the intravascular axial flow providing a tractable set that could be calibrated using red blood cell velocity data from the literature. The second stage also utilizes a sensitivity analysis to evaluate the importance of 14 model parameters on extravascular flux. Of these, six exhibit high sensitivity and are integrated into the model calibration using a response surface methodology and experimental intra- and extravascular accumulation data from the literature (Dreher et al. in J Natl Cancer Inst 98(5):335-344, 2006). The model exhibits good agreement with the experimental results for both the mean extravascular concentration and the penetration depth as a function of time for inert dextran over a wide range of molecular weights.

  20. Hamiltonian formulation of theory with higher order derivatives

    International Nuclear Information System (INIS)

    Gitman, D.M.; Lyakhovich, S.L.; Tyutin, I.V.

    1983-01-01

    A method of ''hamiltonization'' of a special theory with higher order derivatives is described. In the nonspecial case the result coincides with the known Ostrogradsky formulation. It is shown that in the nonspecial theory the Lagrange equations of motion are reduced to the normal form.
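
    For context, a brief sketch of the standard Ostrogradsky construction referred to above, for a nondegenerate Lagrangian L(q, q̇, q̈); the notation is generic textbook notation, not that of the paper. One introduces

$$
Q_1 = q,\qquad Q_2 = \dot q,\qquad
P_1 = \frac{\partial L}{\partial \dot q} - \frac{d}{dt}\frac{\partial L}{\partial \ddot q},\qquad
P_2 = \frac{\partial L}{\partial \ddot q},
$$

    and, writing q̈ = a(Q₁, Q₂, P₂) for the solution of the last relation, the Hamiltonian

$$
H = P_1\,Q_2 + P_2\,a(Q_1, Q_2, P_2) - L\bigl(Q_1, Q_2, a(Q_1, Q_2, P_2)\bigr),
$$

    whose Hamilton equations reproduce the (generically fourth-order) Lagrange equation of motion; the linearity of H in P₁ is the origin of the well-known Ostrogradsky instability.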

  1. A Quantum Version of Wigner's Transition State Theory

    NARCIS (Netherlands)

    Schubert, R.; Waalkens, H.; Wiggins, S.

    A quantum version of a recent realization of Wigner's transition state theory in phase space is presented. The theory developed builds on a quantum normal form which locally decouples the quantum dynamics near the transition state to any desired order in ħ. This leads to an explicit

  2. A Quantum Version of Wigner’s Transition State Theory

    NARCIS (Netherlands)

    Schubert, R.; Waalkens, H.; Wiggins, S.

    2009-01-01

    A quantum version of a recent realization of Wigner’s transition state theory in phase space is presented. The theory developed builds on a quantum normal form which locally decouples the quantum dynamics near the transition state to any desired order in ħ. This leads to an explicit algorithm to

  3. Normal accidents

    International Nuclear Information System (INIS)

    Perrow, C.

    1989-01-01

    The author has chosen numerous concrete examples to illustrate the hazardousness inherent in high-risk technologies. Starting with the TMI reactor accident in 1979, he shows that it is not only the nuclear energy sector that bears the risk of 'normal accidents', but also quite a number of other technologies and industrial sectors, or research fields. The author refers to the petrochemical industry, shipping, air traffic, large dams, mining activities, and genetic engineering, showing that due to the complexity of the systems and their manifold, rapidly interacting processes, accidents happen that cannot be thoroughly calculated, and hence are unavoidable. (orig./HP) [de

  4. Geriatric Anxiety Scale: item response theory analysis, differential item functioning, and creation of a ten-item short form (GAS-10).

    Science.gov (United States)

    Mueller, Anne E; Segal, Daniel L; Gavett, Brandon; Marty, Meghan A; Yochim, Brian; June, Andrea; Coolidge, Frederick L

    2015-07-01

    The Geriatric Anxiety Scale (GAS; Segal et al. (Segal, D. L., June, A., Payne, M., Coolidge, F. L. and Yochim, B. (2010). Journal of Anxiety Disorders, 24, 709-714. doi:10.1016/j.janxdis.2010.05.002) is a self-report measure of anxiety that was designed to address unique issues associated with anxiety assessment in older adults. This study is the first to use item response theory (IRT) to examine the psychometric properties of a measure of anxiety in older adults. A large sample of older adults (n = 581; mean age = 72.32 years, SD = 7.64 years, range = 60 to 96 years; 64% women; 88% European American) completed the GAS. IRT properties were examined. The presence of differential item functioning (DIF) or measurement bias by age and sex was assessed, and a ten-item short form of the GAS (called the GAS-10) was created. All GAS items had discrimination parameters of 1.07 or greater. Items from the somatic subscale tended to have lower discrimination parameters than items on the cognitive or affective subscales. Two items were flagged for DIF, but the impact of the DIF was negligible. Women scored significantly higher than men on the GAS and its subscales. Participants in the young-old group (60 to 79 years old) scored significantly higher on the cognitive subscale than participants in the old-old group (80 years old and older). Results from the IRT analyses indicated that the GAS and GAS-10 have strong psychometric properties among older adults. We conclude by discussing implications and future research directions.
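
    As a hedged illustration of what such discrimination parameters mean in practice, the snippet below evaluates the item information function of a two-parameter logistic (2PL) item, I(θ) = a²·P(θ)·(1 − P(θ)). The GAS analysis itself used a graded (polytomous) model, so this is a simplification, and the parameter values are illustrative rather than estimates from the GAS data.

```python
import numpy as np

def item_information_2pl(theta: np.ndarray, a: float, b: float) -> np.ndarray:
    """Fisher information of a 2PL item: I(theta) = a^2 * P * (1 - P)."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return a**2 * p * (1.0 - p)

theta = np.linspace(-4.0, 4.0, 81)
# Compare the lowest discrimination reported (a = 1.07) with a sharper item.
for a in (1.07, 2.0):
    info = item_information_2pl(theta, a, b=0.0)
    print(f"a = {a:4.2f}: peak information = {info.max():.2f} "
          f"at theta = {theta[info.argmax()]:+.1f}")
```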

  5. A Complete, Co-Inductive Syntactic Theory of Sequential Control and State

    DEFF Research Database (Denmark)

    Støvring, Kristian; Lassen, Soren Bo

    2007-01-01

    equivalences between recursive imperative higher-order programs. The theory is modular in the sense that eager normal form bisimilarity for each of the calculi extended with continuations and/or mutable references is a fully abstract extension of eager normal form bisimilarity for its sub-calculi.  For each...

  6. Pursuing Normality

    DEFF Research Database (Denmark)

    Madsen, Louise Sofia; Handberg, Charlotte

    2018-01-01

    BACKGROUND: The present study explored the reflections on cancer survivorship care of lymphoma survivors in active treatment. Lymphoma survivors have survivorship care needs, yet their participation in cancer survivorship care programs is still reported as low. OBJECTIVE: The aim of this study was to understand the reflections on cancer survivorship care of lymphoma survivors to aid the future planning of cancer survivorship care and overcome barriers to participation. METHODS: Data were generated in a hematological ward during 4 months of ethnographic fieldwork, including participant observation and 46... ...implying an influence on whether to participate in cancer survivorship care programs. Because of "pursuing normality," 8 of 9 participants opted out of cancer survivorship care programming due to prospects of "being cured" and perceptions of cancer survivorship care as "a continuation of the disease..."

  7. Differential forms of supermanifolds

    International Nuclear Information System (INIS)

    Beresin, P.A.

    1979-01-01

    The theory of differential and pseudo-differential forms on supermanifolds is constructed. The definitions and notation for superanalogues of the Pontryagin and Chern characteristic classes are given. The theory considered is purely local. The scheme suggested here generalizes the so-called Weil homomorphism for superspace, which lies at the basis of the theory of Chern and Pontryagin characteristic classes. The theory can be extended to global supermanifolds

  8. Association between level of personality organization as assessed with theory-driven profiles of the Dutch Short Form of the MMPI and outcome of inpatient treatment for personality disorder

    NARCIS (Netherlands)

    Scholte, W.R.; Eurelings-Bontekoe, E.H.M.; Tiemens, B.G.; Verheul, R.; Meerman, A.; Hutschemaekers, G.J.M.

    2014-01-01

    The association between level of personality organization as assessed by theory-driven profile interpretation of the MMPI (Hathaway & McKinley, 1943) Dutch Short Form and treatment outcome was investigated in a naturalistic follow-up study among 121 psychotherapy inpatients who had been treated for

  9. Differential treatment response of subtypes of patients with borderline personality organization, as assessed with theory-driven profiles of the Dutch short form of the MMPI: a naturalistic follow-up study

    NARCIS (Netherlands)

    Eurelings-Bontekoe, E.H.M.; Peen, J.; Noteboom, A.; Alkema, M.; Dekker, J.J.M.

    2012-01-01

    We investigated the validity of different subtypes of borderline personality organization (BPO) as assessed by theory-driven profiles of the Minnesota Multiphasic Personality Inventory (MMPI; Hathaway & McKinley, 1943) Dutch Short Form (DSFM; Eurelings-Bontekoe, Onnink, Williams, & Snellen, 2008) in

  10. From chaos to unification: U theory vs. M theory

    International Nuclear Information System (INIS)

    Ye, Fred Y.

    2009-01-01

    A unified physical theory called U theory, that is different from M theory, is defined and characterized. U theory, which includes spinor and twistor theory, loop quantum gravity, causal dynamical triangulations, E-infinity unification theory, and Clifford-Finslerian unifications, is based on physical tradition and experimental foundations. In contrast, M theory pays more attention to mathematical forms. While M theory is characterized by supersymmetry string theory, U theory is characterized by non-supersymmetry unified field theory.

  11. Normalization of satellite imagery

    Science.gov (United States)

    Kim, Hongsuk H.; Elman, Gregory C.

    1990-01-01

    Sets of Thematic Mapper (TM) imagery taken over the Washington, DC metropolitan area during the months of November, March and May were converted into a form of ground reflectance imagery. This conversion was accomplished by adjusting the incident sunlight and view angles and by applying a pixel-by-pixel correction for atmospheric effects. Seasonal color changes of the area can be better observed when such normalization is applied to space imagery taken in time series. In normalized imagery, the grey scale depicts variations in surface reflectance and tonal signature of multi-band color imagery can be directly interpreted for quantitative information of the target.
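
    As a hedged sketch of the kind of pixel-wise correction described, the snippet below converts an at-sensor radiance to top-of-atmosphere reflectance by normalizing for solar irradiance and illumination geometry; the calibration numbers are placeholders rather than values from the study, and the additional atmospheric-effects correction is omitted.

```python
import numpy as np

def toa_reflectance(radiance, esun, sun_elev_deg, earth_sun_dist=1.0):
    """Top-of-atmosphere reflectance from at-sensor spectral radiance.

    radiance       : at-sensor radiance, W / (m^2 sr um)
    esun           : mean solar exoatmospheric irradiance for the band
    sun_elev_deg   : solar elevation angle in degrees
    earth_sun_dist : Earth-Sun distance in astronomical units
    """
    theta_s = np.deg2rad(90.0 - sun_elev_deg)  # solar zenith angle
    return (np.pi * radiance * earth_sun_dist**2) / (esun * np.cos(theta_s))

# Placeholder values for a single red-band pixel.
print(f"reflectance = {toa_reflectance(45.0, esun=1554.0, sun_elev_deg=35.0):.3f}")
```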

  12. Controlled cross-over study in normal subjects of naloxone-preceding-lactate infusions; respiratory and subjective responses: relationship to endogenous opioid system, suffocation false alarm theory and childhood parental loss.

    Science.gov (United States)

    Preter, M; Lee, S H; Petkova, E; Vannucci, M; Kim, S; Klein, D F

    2011-02-01

    The expanded suffocation false alarm theory (SFA) hypothesizes that dysfunction in endogenous opioidergic regulation increases sensitivity to CO2, separation distress and panic attacks. In panic disorder (PD) patients, both spontaneous clinical panics and lactate-induced panics markedly increase tidal volume (TV), whereas normals have a lesser effect, possibly due to their intact endogenous opioid system. We hypothesized that impairing the opioidergic system by naloxone could make normal controls parallel PD patients' response when lactate challenged. Whether actual separations and losses during childhood (childhood parental loss, CPL) affected naloxone-induced respiratory contrasts was explored. Subjective panic-like symptoms were analyzed although pilot work indicated that the subjective aspect of anxious panic was not well modeled by this specific protocol. Randomized cross-over sequences of intravenous naloxone (2 mg/kg) followed by lactate (10 mg/kg), or saline followed by lactate, were given to 25 volunteers. Respiratory physiology was objectively recorded by the LifeShirt. Subjective symptomatology was also recorded. Impairment of the endogenous opioid system by naloxone accentuates TV and symptomatic response to lactate. This interaction is substantially lessened by CPL. Opioidergic dysregulation may underlie respiratory pathophysiology and suffocation sensitivity in PD. Comparing specific anti-panic medications with ineffective anti-panic agents (e.g. propranolol) can test the specificity of the naloxone+lactate model. A screen for putative anti-panic agents and a new pharmacotherapeutic approach are suggested. Heuristically, the experimental unveiling of the endogenous opioid system impairing effects of CPL and separation in normal adults opens a new experimental, investigatory area.

  13. Controlled cross-over study in normal subjects of naloxone-preceding-lactate infusions; respiratory and subjective responses: relationship to endogenous opioid system, suffocation false alarm theory and childhood parental loss

    Science.gov (United States)

    Preter, M.; Lee, S. H.; Petkova, E.; Vannucci, M.; Kim, S.; Klein, D. F.

    2015-01-01

    Background The expanded suffocation false alarm theory (SFA) hypothesizes that dysfunction in endogenous opioidergic regulation increases sensitivity to CO2, separation distress and panic attacks. In panic disorder (PD) patients, both spontaneous clinical panics and lactate-induced panics markedly increase tidal volume (TV), whereas normals have a lesser effect, possibly due to their intact endogenous opioid system. We hypothesized that impairing the opioidergic system by naloxone could make normal controls parallel PD patients' response when lactate challenged. Whether actual separations and losses during childhood (childhood parental loss, CPL) affected naloxone-induced respiratory contrasts was explored. Subjective panic-like symptoms were analyzed although pilot work indicated that the subjective aspect of anxious panic was not well modeled by this specific protocol. Method Randomized cross-over sequences of intravenous naloxone (2 mg/kg) followed by lactate (10 mg/kg), or saline followed by lactate, were given to 25 volunteers. Respiratory physiology was objectively recorded by the LifeShirt. Subjective symptomatology was also recorded. Results Impairment of the endogenous opioid system by naloxone accentuates TV and symptomatic response to lactate. This interaction is substantially lessened by CPL. Conclusions Opioidergic dysregulation may underlie respiratory pathophysiology and suffocation sensitivity in PD. Comparing specific anti-panic medications with ineffective anti-panic agents (e.g. propranolol) can test the specificity of the naloxone + lactate model. A screen for putative anti-panic agents and a new pharmacotherapeutic approach are suggested. Heuristically, the experimental unveiling of the endogenous opioid system impairing effects of CPL and separation in normal adults opens a new experimental, investigatory area. PMID:20444308

  14. Simple Closed-Form Expression for Penning Reaction Rate Coefficients for Cold Molecular Collisions by Non-Hermitian Time-Independent Adiabatic Scattering Theory.

    Science.gov (United States)

    Pawlak, Mariusz; Ben-Asher, Anael; Moiseyev, Nimrod

    2018-01-09

    We present a simple expression and its derivation for reaction rate coefficients for cold anisotropic collision experiments based on adiabatic variational theory and time-independent non-Hermitian scattering theory. We demonstrate that only the eigenenergies of the resulting one-dimensional Schrödinger equation for different complex adiabats are required. The expression is applied to calculate the Penning ionization rate coefficients of an excited metastable helium atom with molecular hydrogen in an energy range spanning from hundreds of kelvins down to the millikelvin regime. Except for trivial quantities like the masses of the nuclei and the bond length of the diatomic molecule participating in the collision, one needs as input data only the complex potential energy surface (CPES). In calculations, we used recently obtained ab initio CPES by D. Bhattacharya et al. ( J. Chem. Theory Comput. 2017 , 13 , 1682 - 1690 ) without fitting parameters. The results show good accord with current measurements ( Nat. Phys. 2017 , 13 , 35 - 38 ).

  15. String theory or field theory?

    International Nuclear Information System (INIS)

    Marshakov, Andrei V

    2002-01-01

    The status of string theory is reviewed, and major recent developments - especially those in going beyond perturbation theory in the string theory and quantum field theory frameworks - are analyzed. This analysis helps better understand the role and place of string theory in the modern picture of the physical world. Even though quantum field theory describes a wide range of experimental phenomena, it is emphasized that there are some insurmountable problems inherent in it - notably the impossibility to formulate the quantum theory of gravity on its basis - which prevent it from being a fundamental physical theory of the world of microscopic distances. It is this task, the creation of such a theory, which string theory, currently far from completion, is expected to solve. In spite of its somewhat vague current form, string theory has already led to a number of serious results and greatly contributed to progress in the understanding of quantum field theory. It is these developments which are our concern in this review. (reviews of topical problems)

  16. Superstring theory

    International Nuclear Information System (INIS)

    Schwarz, J.H.

    1985-01-01

    Dual string theories, initially developed as phenomenological models of hadrons, now appear more promising as candidates for a unified theory of fundamental interactions. Type I superstring theory (SST I) is a ten-dimensional theory of interacting open and closed strings, with one supersymmetry, that is free from ghosts and tachyons. It requires that an SO(n) or Sp(2n) gauge group be used. A light-cone-gauge string action with space-time supersymmetry automatically incorporates the superstring restrictions and leads to the discovery of type II superstring theory (SST II). SST II is an interacting theory of closed strings only, with two D=10 supersymmetries, that is also free from ghosts and tachyons. By taking six of the spatial dimensions to form a compact space, it becomes possible to reconcile the models with our four-dimensional perception of spacetime and to define low-energy limits in which SST I reduces to N=4, D=4 super Yang-Mills theory and SST II reduces to N=8, D=4 supergravity theory. The superstring theories can be described by a light-cone-gauge action principle based on fields that are functionals of string coordinates. With this formalism any physical quantity should be calculable. There is some evidence that, unlike any conventional field theory, the superstring theories provide perturbatively renormalizable (SST I) or finite (SST II) unifications of gravity with other interactions

  17. Smooth quantile normalization.

    Science.gov (United States)

    Hicks, Stephanie C; Okrah, Kwame; Paulson, Joseph N; Quackenbush, John; Irizarry, Rafael A; Bravo, Héctor Corrada

    2018-04-01

    Between-sample normalization is a critical step in genomic data analysis to remove systematic bias and unwanted technical variation in high-throughput data. Global normalization methods are based on the assumption that observed variability in global properties is due to technical reasons and are unrelated to the biology of interest. For example, some methods correct for differences in sequencing read counts by scaling features to have similar median values across samples, but these fail to reduce other forms of unwanted technical variation. Methods such as quantile normalization transform the statistical distributions across samples to be the same and assume global differences in the distribution are induced by only technical variation. However, it remains unclear how to proceed with normalization if these assumptions are violated, for example, if there are global differences in the statistical distributions between biological conditions or groups, and external information, such as negative or control features, is not available. Here, we introduce a generalization of quantile normalization, referred to as smooth quantile normalization (qsmooth), which is based on the assumption that the statistical distribution of each sample should be the same (or have the same distributional shape) within biological groups or conditions, but allowing that they may differ between groups. We illustrate the advantages of our method on several high-throughput datasets with global differences in distributions corresponding to different biological conditions. We also perform a Monte Carlo simulation study to illustrate the bias-variance tradeoff and root mean squared error of qsmooth compared to other global normalization methods. A software implementation is available from https://github.com/stephaniehicks/qsmooth.
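
    qsmooth itself is distributed as an R/Bioconductor package; as a hedged illustration of the plain quantile normalization that it generalizes, here is a minimal NumPy sketch (the function name and toy data are assumptions of this sketch, not part of the paper).

```python
import numpy as np

def quantile_normalize(x: np.ndarray) -> np.ndarray:
    """Classical quantile normalization of a (features x samples) matrix.

    Every sample (column) is mapped onto the same reference distribution,
    taken as the mean of the sorted columns.
    """
    order = np.argsort(x, axis=0)                 # per-sample sort order
    ranks = np.argsort(order, axis=0)             # rank of each entry in its column
    reference = np.sort(x, axis=0).mean(axis=1)   # mean quantile across samples
    return reference[ranks]

# Hypothetical expression matrix: 1000 features, 6 samples with different scales.
rng = np.random.default_rng(1)
expr = rng.lognormal(3.0, 1.0, size=(1000, 6)) * rng.uniform(0.5, 2.0, size=6)
normalized = quantile_normalize(expr)
print(np.round(normalized.mean(axis=0), 2))  # column means are now identical
```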

  18. The challenge of transferring an implementation strategy from academia to the field: a process evaluation of local quality improvement collaboratives in Dutch primary care using the normalization process theory.

    Science.gov (United States)

    Trietsch, Jasper; van Steenkiste, Ben; Hobma, Sjoerd; Frericks, Arnoud; Grol, Richard; Metsemakers, Job; van der Weijden, Trudy

    2014-12-01

    A quality improvement strategy consisting of comparative feedback and peer review embedded in available local quality improvement collaboratives proved to be effective in changing the test-ordering behaviour of general practitioners. However, implementing this strategy was problematic. We aimed for large-scale implementation of an adapted strategy covering both test ordering and prescribing performance. Because we failed to achieve large-scale implementation, the aim of this study was to describe and analyse the challenges of the transferring process. In a qualitative study 19 regional health officers, pharmacists, laboratory specialists and general practitioners were interviewed within 6 months after the transfer period. The interviews were audiotaped, transcribed and independently coded by two of the authors. The codes were matched to the dimensions of the normalization process theory. The general idea of the strategy was widely supported, but generating the feedback was more complex than expected and the need for external support after transfer of the strategy remained high because participants did not assume responsibility for the work and the distribution of resources that came with it. Evidence on effectiveness, a national infrastructure for these collaboratives and a general positive attitude were not sufficient for normalization. Thinking about managing large databases, responsibility for tasks and distribution of resources should start as early as possible when planning complex quality improvement strategies. Merely exploring the barriers and facilitators experienced in a preceding trial is not sufficient. Although multifaceted implementation strategies to change professional behaviour are attractive, their inherent complexity is also a pitfall for large-scale implementation. © 2014 John Wiley & Sons, Ltd.

  19. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Word and Passage Reading Fluency Assessments: Grade 3. Technical Report #1218

    Science.gov (United States)

    Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Lai, Cheng-Fei; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…

  20. Against Logical Form

    Directory of Open Access Journals (Sweden)

    P N Johnson-Laird

    2010-10-01

    Full Text Available An old view in logic going back to Aristotle is that an inference is valid in virtue of its logical form. Many psychologists have adopted the same point of view about human reasoning: the first step is to recover the logical form of an inference, and the second step is to apply rules of inference that match these forms in order to prove that the conclusion follows from the premises. The present paper argues against this idea. The logical form of an inference transcends the grammatical forms of the sentences used to express it, because logical form also depends on context. Context is not readily expressed in additional premises. And the recovery of logical form leads ineluctably to the need for infinitely many axioms to capture the logical properties of relations. An alternative theory is that reasoning depends on mental models, and this theory obviates the need to recover logical form.

  1. Interpolation theory

    CERN Document Server

    Lunardi, Alessandra

    2018-01-01

    This book is the third edition of the 1999 lecture notes of the courses on interpolation theory that the author delivered at the Scuola Normale in 1998 and 1999. In the mathematical literature there are many good books on the subject, but none of them is very elementary, and in many cases the basic principles are hidden below great generality. In this book the principles of interpolation theory are illustrated aiming at simplification rather than at generality. The abstract theory is reduced as far as possible, and many examples and applications are given, especially to operator theory and to regularity in partial differential equations. Moreover the treatment is self-contained, the only prerequisite being the knowledge of basic functional analysis.

  2. Problems of Theory and Practice of the Imposing of Administrative Punishment in the Form of a Warning for Violation of the Customs Legislation of the Customs Union of the Eurasian Economic Union

    Directory of Open Access Journals (Sweden)

    Margarita N. Kobzar-Frolova

    2017-12-01

    Full Text Available In the mechanism of legal regulation of administrative responsibility for violations of the customs legislation of the Customs Union of the Eurasian Economic Union there are legal gaps, in particular in the theory and practice of imposing administrative punishment in the form of a warning, that must be eliminated. The methodological basis of the work is the dialectical method and the system of general scientific and specific scientific methods built upon it. To ensure the reliability of the study, the logical method, the method of scientific analysis, the method of synthesis, the analogy method, methods of generalizing and describing the obtained data, and other methods of investigation were used. For scientific discussion, the system of features of administrative punishment in the form of a warning is presented and its purpose is shown. The author's definition of the concept of “administrative punishment in the form of a warning” is proposed.

  3. Applying normalization process theory to understand implementation of a family violence screening and care model in maternal and child health nursing practice: a mixed method process evaluation of a randomised controlled trial.

    Science.gov (United States)

    Hooker, Leesa; Small, Rhonda; Humphreys, Cathy; Hegarty, Kelsey; Taft, Angela

    2015-03-28

    In Victoria, Australia, Maternal and Child Health (MCH) services deliver primary health care to families with children 0-6 years, focusing on health promotion, parenting support and early intervention. Family violence (FV) has been identified as a major public health concern, with increased prevalence in the child-bearing years. Victorian Government policy recommends routine FV screening of all women attending MCH services. Using Normalization Process Theory (NPT), we aimed to understand the barriers and facilitators of implementing an enhanced screening model into MCH nurse clinical practice. NPT informed the process evaluation of a pragmatic, cluster randomised controlled trial in eight MCH nurse teams in metropolitan Melbourne, Victoria, Australia. Using mixed methods (surveys and interviews), we explored the views of MCH nurses, MCH nurse team leaders, FV liaison workers and FV managers on implementation of the model. Quantitative data were analysed by comparing proportionate group differences and change within trial arm over time between interim and impact nurse surveys. Qualitative data were inductively coded, thematically analysed and mapped to NPT constructs (coherence, cognitive participation, collective action and reflexive monitoring) to enhance our understanding of the outcome evaluation. MCH nurse participation rates for interim and impact surveys were 79% (127/160) and 71% (114/160), respectively. Twenty-three key stakeholder interviews were completed. FV screening work was meaningful and valued by participants; however, the implementation coincided with a significant (government directed) change in clinical practice which impacted on full engagement with the model (coherence and cognitive participation). The use of MCH nurse-designed FV screening/management tools in focussed women's health consultations and links with FV services enhanced the participants' work (collective action). Monitoring of FV work (reflexive monitoring) was limited. The use of

  4. Discrete Curvature Theories and Applications

    KAUST Repository

    Sun, Xiang

    2016-08-25

    Discrete Differential Geometry (DDG) concerns discrete counterparts of notions and methods in differential geometry. This thesis deals with a core subject in DDG, discrete curvature theories on various types of polyhedral surfaces that are practically important for free-form architecture, sunlight-redirecting shading systems, and face recognition. Modeled as polyhedral surfaces, the shapes of free-form structures may have to satisfy different geometric or physical constraints. We study a combination of geometry and physics: the discrete surfaces that can stand on their own, as well as having proper shapes for the manufacture. These proper shapes, known as circular and conical meshes, are closely related to discrete principal curvatures. We study curvature theories that make such surfaces possible. Shading systems of freeform building skins are new types of energy-saving structures that can re-direct the sunlight. From these systems, discrete line congruences across polyhedral surfaces can be abstracted. We develop a new curvature theory for polyhedral surfaces equipped with normal congruences, a particular type of congruences defined by linear interpolation of vertex normals. The main results are a discussion of various definitions of normality, a detailed study of the geometry of such congruences, and a concept of curvatures and shape operators associated with the faces of a triangle mesh. These curvatures are compatible with both normal congruences and the Steiner formula. In addition to architecture, we consider the role of discrete curvatures in face recognition. We use geometric measure theory to introduce the notion of asymptotic cones associated with a singular subspace of a Riemannian manifold, which is an extension of the classical notion of asymptotic directions. We get a simple expression of these cones for polyhedral surfaces, as well as convergence and approximation theorems. We use the asymptotic cones as facial descriptors and demonstrate the

  5. Mapping Theory

    DEFF Research Database (Denmark)

    Smith, Shelley

    This paper came about within the context of a 13-month research project, Focus Area 1 - Method and Theory, at the Center for Public Space Research at the Royal Academy of the Arts School of Architecture in Copenhagen, Denmark. This project has been funded by RealDania. The goals of the research...... project, Focus Area 1 - Method and Theory, which forms the framework for this working paper, are: * To provide a basis from which to discuss the concept of public space in a contemporary architectural and urban context - specifically relating to theory and method * To broaden the discussion of the concept...

  6. A Teoria Queer e a Sociologia: o desafio de uma analítica da normalização Queer Theory and Sociology: the challenging analysis of normalization

    Directory of Open Access Journals (Sweden)

    Richard Miskolci

    2009-06-01

    Full Text Available Originating in American Cultural Studies, Queer Theory has gained notoriety as a critical counterpoint to sociological studies of sexual minorities and to the identity politics of social movements. Based on a creative application of post-structuralist philosophy to understanding how sexuality structures the contemporary social order, its affinities and tensions with the social sciences, and with Sociology in particular, have been debated for more than a decade. This article joins that debate, analyzes the similarities and distinctions between the two, and finally presents an overview of the current dialogue, which points to a possible convergence in the queer project of creating an analytics of normalization.

  7. Crystal Structure, Vibrational Spectroscopy and ab Initio Density Functional Theory Calculations on the Ionic Liquid forming 1,1,3,3-Tetramethylguanidinium bis{(trifluoromethyl)sulfonyl}amide

    DEFF Research Database (Denmark)

    Berg, Rolf W.; Riisager, Anders; Nguyen van Buu, Olivier

    2009-01-01

    The salt 1,1,3,3-tetramethylguanidinium bis{(trifluoromethyl)sulfonyl}amide, [((CH3)(2)N)(2)C=NH2](+)[N(SO2-CF3)(2)](-) or [tmgH][NTf2], easily forms an ionic liquid with high SO2 absorbing capacity. The crystal structure of the salt was determined at 120(2) K by X-ray diffraction. The structure...

  8. Assessment of density functional theory for bonds formed between rare gases and open-shell atoms: a computational study of small molecules containing He, Ar, Kr and Xe

    International Nuclear Information System (INIS)

    Bertolus, Marjorie; Major, Mohamed; Brenner, Valerie

    2012-01-01

    The validity of the description of the DFT approximations currently implemented in plane wave DFT codes (LDA, GGA, meta-GGA, hybrid, GGA + empirical dispersion correction) for interactions between rare gases and open-shell atoms which form materials is poorly known. We have performed a first assessment of the accuracy of these functionals for the description of the bonds formed by helium, argon, krypton and xenon with various open-shell atoms. This evaluation has been done on model molecular systems for which precise experimental data are available and reference post-Hartree-Fock calculations (CCSD(T) using large basis sets) are feasible. The results show that when the rare gas atom shares density with the neighbouring atoms, the GGA functionals yield good geometries and qualitatively correct binding energies, even if these are quite significantly overestimated. The use of hybrid functionals enables us to obtain good geometries and satisfactory binding energies. For compounds in which the rare gas atom forms weak dispersive-like bonding, the accuracy yielded by the various functionals is not as good. No functional gives satisfactory binding energies for all the compounds investigated. Several GGA and hybrid functionals yield correct geometries, even if some isomers are not obtained. One GGA functional (PBE) yields qualitatively correct results for the compounds of the three rare gases and several hybrid functionals give satisfactory energies for He compounds. The addition of an empirical dispersive correction improves the results on association compounds, but several isomers are not found. (authors)

  9. Assessment of density functional theory for bonds formed between rare gases and open-shell atoms: a computational study of small molecules containing He, Ar, Kr and Xe.

    Science.gov (United States)

    Bertolus, Marjorie; Major, Mohamed; Brenner, Valérie

    2012-01-14

    The validity of the description of the DFT approximations currently implemented in plane wave DFT codes (LDA, GGA, meta-GGA, hybrid, GGA + empirical dispersion correction) for interactions between rare gases and open-shell atoms which form materials is poorly known. We have performed a first assessment of the accuracy of these functionals for the description of the bonds formed by helium, argon, krypton and xenon with various open-shell atoms. This evaluation has been done on model molecular systems for which precise experimental data are available and reference post-Hartree-Fock calculations (CCSD(T) using large basis sets) are feasible. The results show that when the rare gas atom shares density with the neighbouring atoms, the GGA functionals yield good geometries and qualitatively correct binding energies, even if these are quite significantly overestimated. The use of hybrid functionals enables us to obtain good geometries and satisfactory binding energies. For compounds in which the rare gas atom forms weak dispersive-like bonding, the accuracy yielded by the various functionals is not as good. No functional gives satisfactory binding energies for all the compounds investigated. Several GGA and hybrid functionals yield correct geometries, even if some isomers are not obtained. One GGA functional (PBE) yields qualitatively correct results for the compounds of the three rare gases and several hybrid functionals give satisfactory energies for He compounds. The addition of an empirical dispersive correction improves the results on association compounds, but several isomers are not found.

  10. A new integrability theory for certain nonlinear physical problems

    International Nuclear Information System (INIS)

    Berger, M.S.

    1993-01-01

    A new, mathematically sound integrability theory for certain nonlinear problems defined by ordinary or partial differential equations is presented. The new theory works in an arbitrary finite number of space dimensions. Moreover, if a system is integrable in the new sense described here, it has a remarkable stability property that distinguishes it from any previously known integrability ideas. The new theory proceeds by establishing a ''global normal form'' for the problem at hand. This normal form holds subject to canonical coordinate transformations, extending such classical ideas by using new nonlinear methods of infinite dimensional functional analysis. The global normal form in question is related to the mathematical theory of singularities of mappings of H. Whitney and R. Thom, extended globally and from finite to infinite dimensions. Thus bifurcation phenomena are naturally included in the new integrability theory. Typical examples include the classically nonintegrable Riccati equation, certain non-Euclidean mean field theories, certain parabolic reaction diffusion equations and the hyperbolic nonlinear telegrapher's equation. (Author)

  11. Normal Pressure Hydrocephalus (NPH)

    Science.gov (United States)

    Normal pressure hydrocephalus (NPH) is a brain disorder that occurs when excess cerebrospinal fluid ...

  12. The theory of nilpotent groups

    CERN Document Server

    Clement, Anthony E; Zyman, Marcos

    2017-01-01

    This monograph presents both classical and recent results in the theory of nilpotent groups and provides a self-contained, comprehensive reference on the topic.  While the theorems and proofs included can be found throughout the existing literature, this is the first book to collect them in a single volume.  Details omitted from the original sources, along with additional computations and explanations, have been added to foster a stronger understanding of the theory of nilpotent groups and the techniques commonly used to study them.  Topics discussed include collection processes, normal forms and embeddings, isolators, extraction of roots, P-localization, dimension subgroups and Lie algebras, decision problems, and nilpotent groups of automorphisms.  Requiring only a strong undergraduate or beginning graduate background in algebra, graduate students and researchers in mathematics will find The Theory of Nilpotent Groups to be a valuable resource.

  13. Normal growth spurt and final height despite low levels of all forms of circulating insulin-like growth factor-I in a patient with acid-labile subunit deficiency

    DEFF Research Database (Denmark)

    Domené, Horacio M; Martínez, Alicia S; Frystyk, Jan

    2007-01-01

    BACKGROUND: In a recently described patient with acid-labile subunit (ALS) deficiency, the inability to form ternary complexes resulted in a marked reduction in circulating total insulin-like growth factor (IGF)-I, whereas skeletal growth was only marginally affected. To further study the role of...

  14. Normal gravity field in relativistic geodesy

    Science.gov (United States)

    Kopeikin, Sergei; Vlasov, Igor; Han, Wen-Biao

    2018-02-01

    Modern geodesy is subject to a dramatic change from the Newtonian paradigm to Einstein's theory of general relativity. This is motivated by the ongoing advance in development of quantum sensors for applications in geodesy including quantum gravimeters and gradiometers, atomic clocks and fiber optics for making ultra-precise measurements of the geoid and multipolar structure of the Earth's gravitational field. At the same time, very long baseline interferometry, satellite laser ranging, and global navigation satellite systems have achieved an unprecedented level of accuracy in measuring 3-d coordinates of the reference points of the International Terrestrial Reference Frame and the world height system. The main geodetic reference standard to which gravimetric measurements of the Earth's gravitational field are referred is a normal gravity field represented in the Newtonian gravity by the field of a uniformly rotating, homogeneous Maclaurin ellipsoid whose mass and quadrupole moment are equal to the total mass and (tide-free) quadrupole moment of Earth's gravitational field. The present paper extends the concept of the normal gravity field from the Newtonian theory to the realm of general relativity. We focus our attention on the calculation of the post-Newtonian approximation of the normal field that is sufficient for current and near-future practical applications. We show that in general relativity the level surface of a homogeneous and uniformly rotating fluid is no longer described by the Maclaurin ellipsoid in the most general case but represents an axisymmetric spheroid of the fourth order with respect to the geodetic Cartesian coordinates. At the same time, admitting a post-Newtonian inhomogeneity of the mass density in the form of concentric elliptical shells allows one to preserve the level surface of the fluid as an exact ellipsoid of rotation. We parametrize the mass density distribution and the level surface with two parameters which are
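
    For reference, the classical (Newtonian) normal gravity on the surface of the level ellipsoid that this work generalizes is usually quoted in the closed Somigliana form; the notation below is standard geodetic usage and is not taken from the paper itself:

$$
\gamma(\varphi) \;=\; \gamma_e\,\frac{1 + k\,\sin^{2}\varphi}{\sqrt{1 - e^{2}\sin^{2}\varphi}},
\qquad
k \;=\; \frac{b\,\gamma_p - a\,\gamma_e}{a\,\gamma_e},
$$

    where φ is the geodetic latitude, a and b the semi-major and semi-minor axes of the ellipsoid, e its first eccentricity, and γ_e, γ_p the normal gravity values at the equator and the poles.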

  15. Gravitation and quadratic forms

    International Nuclear Information System (INIS)

    Ananth, Sudarshan; Brink, Lars; Majumdar, Sucheta; Mali, Mahendra; Shah, Nabha

    2017-01-01

    The light-cone Hamiltonians describing both pure (N=0) Yang-Mills and N=4 super Yang-Mills may be expressed as quadratic forms. Here, we show that this feature extends to theories of gravity. We demonstrate how the Hamiltonians of both pure gravity and N=8 supergravity, in four dimensions, may be written as quadratic forms. We examine the effect of residual reparametrizations on the Hamiltonian and the resulting quadratic form.

  16. Gravitation and quadratic forms

    Energy Technology Data Exchange (ETDEWEB)

    Ananth, Sudarshan [Indian Institute of Science Education and Research,Pune 411008 (India); Brink, Lars [Department of Physics, Chalmers University of Technology,S-41296 Göteborg (Sweden); Institute of Advanced Studies and Department of Physics & Applied Physics,Nanyang Technological University,Singapore 637371 (Singapore); Majumdar, Sucheta [Indian Institute of Science Education and Research,Pune 411008 (India); Mali, Mahendra [School of Physics, Indian Institute of Science Education and Research,Thiruvananthapuram, Trivandrum 695016 (India); Shah, Nabha [Indian Institute of Science Education and Research,Pune 411008 (India)

    2017-03-31

    The light-cone Hamiltonians describing both pure (N=0) Yang-Mills and N=4 super Yang-Mills may be expressed as quadratic forms. Here, we show that this feature extends to theories of gravity. We demonstrate how the Hamiltonians of both pure gravity and N=8 supergravity, in four dimensions, may be written as quadratic forms. We examine the effect of residual reparametrizations on the Hamiltonian and the resulting quadratic form.

  17. Exploring the role of cognitive and structural forms of social capital in HIV/AIDS trends in the Kagera region of Tanzania - a grounded theory study.

    Science.gov (United States)

    Frumence, Gasto; Eriksson, Malin; Nystrom, Lennarth; Killewo, Japhet; Emmelin, Maria

    2011-04-01

    The article presents a synthesis of data from three village case studies focusing on how structural and cognitive social capital may have influenced the progression of the HIV epidemic in the Kagera region of Tanzania. Grounded theory was used to develop a theoretical model describing the possible links between structural and cognitive social capital and the impact on sexual health behaviours. Focus group discussions and key informant interviews were carried out to represent the range of experiences of existing social capital. Both structural and cognitive social capital were active avenues for community members to come together, empower each other, and develop norms, values, trust and reciprocal relations. This empowerment created an enabling environment in which members could adopt protective behaviours against HIV infection. On the one hand, we observed that involvement in formal and informal organisations resulted in a reduction of numbers of sexual partners, led people to demand abstinence from sexual relations until marriage, caused fewer opportunities for casual sex, and gave individuals the agency to demand the use of condoms. On the other hand, strict membership rules and regulations excluded some members, particularly excessive alcohol drinkers and debtors, from becoming members of the social groups, which increased their vulnerability in terms of exposure to HIV. Social gatherings (especially those organised during the night) were also found to increase youths' risk of HIV infection through instances of unsafe sex. We conclude that even though social capital may at times have negative effects on individuals' HIV-prevention efforts, this study provides initial evidence that social capital is largely protective through empowering vulnerable groups such as women and the poor to protect against HIV infection and by promoting protective sexual behaviours.

  18. Three forms of relativity

    International Nuclear Information System (INIS)

    Strel'tsov, V.N.

    1992-01-01

    The physical sense of three forms of relativity is discussed. The first, the instant form, reflects in fact the traditional approach based on the concept of instant distance. The normal form corresponds to the radar formulation, which is based on light (retarded) distances. The front form is, in the special case, characterized by 'observable' variables, and the well-known k-coefficient method is its natural expression. 16 refs

  19. Normalization: A Preprocessing Stage

    OpenAIRE

    Patro, S. Gopal Krishna; Sahu, Kishore Kumar

    2015-01-01

    Normalization is a pre-processing stage for many types of problem statement. It plays an especially important role in fields such as soft computing and cloud computing, where data are scaled down or scaled up to a common range before being used in further stages. Several normalization techniques exist, notably Min-Max normalization, Z-score normalization and decimal-scaling normalization. By referring to these normalization techniques we are ...
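
    The three techniques named in the abstract have simple closed forms. The following sketch is a minimal NumPy illustration of each; the function and variable names are illustrative choices, not taken from the paper.

        import numpy as np

        def min_max(x, new_min=0.0, new_max=1.0):
            """Min-Max normalization: rescale x linearly onto [new_min, new_max]."""
            x = np.asarray(x, dtype=float)
            return (x - x.min()) / (x.max() - x.min()) * (new_max - new_min) + new_min

        def z_score(x):
            """Z-score normalization: shift to zero mean, scale to unit standard deviation."""
            x = np.asarray(x, dtype=float)
            return (x - x.mean()) / x.std()

        def decimal_scaling(x):
            """Decimal scaling: divide by 10**j, with j the smallest integer
            such that max(|x|) / 10**j < 1."""
            x = np.asarray(x, dtype=float)
            j = int(np.floor(np.log10(np.abs(x).max()))) + 1
            return x / 10.0**j

        data = [200, 300, 400, 600, 1000]
        print(min_max(data))
        print(z_score(data))
        print(decimal_scaling(data))

    All three map raw values onto a comparable range; which one is appropriate depends on whether the downstream method assumes bounded inputs (Min-Max), standardized inputs (Z-score) or simply reduced magnitudes (decimal scaling).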

  20. Covariant Noncommutative Field Theory

    Energy Technology Data Exchange (ETDEWEB)

    Estrada-Jimenez, S [Licenciaturas en Fisica y en Matematicas, Facultad de Ingenieria, Universidad Autonoma de Chiapas Calle 4a Ote. Nte. 1428, Tuxtla Gutierrez, Chiapas (Mexico); Garcia-Compean, H [Departamento de Fisica, Centro de Investigacion y de Estudios Avanzados del IPN P.O. Box 14-740, 07000 Mexico D.F., Mexico and Centro de Investigacion y de Estudios Avanzados del IPN, Unidad Monterrey Via del Conocimiento 201, Parque de Investigacion e Innovacion Tecnologica (PIIT) Autopista nueva al Aeropuerto km 9.5, Lote 1, Manzana 29, cp. 66600 Apodaca Nuevo Leon (Mexico); Obregon, O [Instituto de Fisica de la Universidad de Guanajuato P.O. Box E-143, 37150 Leon Gto. (Mexico); Ramirez, C [Facultad de Ciencias Fisico Matematicas, Universidad Autonoma de Puebla, P.O. Box 1364, 72000 Puebla (Mexico)

    2008-07-02

    The covariant approach to noncommutative field and gauge theories is revisited. In the process the formalism is applied to field theories invariant under diffeomorphisms. Local differentiable forms are defined in this context. The lagrangian and hamiltonian formalism is consistently introduced.

  1. Covariant Noncommutative Field Theory

    International Nuclear Information System (INIS)

    Estrada-Jimenez, S.; Garcia-Compean, H.; Obregon, O.; Ramirez, C.

    2008-01-01

    The covariant approach to noncommutative field and gauge theories is revisited. In the process the formalism is applied to field theories invariant under diffeomorphisms. Local differentiable forms are defined in this context. The lagrangian and hamiltonian formalism is consistently introduced

  2. Challenging the Ideology of Normal in Schools

    Science.gov (United States)

    Annamma, Subini A.; Boelé, Amy L.; Moore, Brooke A.; Klingner, Janette

    2013-01-01

    In this article, we build on Brantlinger's work to critique the binary of normal and abnormal applied in US schools, which creates inequities in education. Operating from a critical perspective, we draw from Critical Race Theory, Disability Studies in Education, and Cultural/Historical Activity Theory to build a conceptual framework for…

  3. Special theory of relativity

    CERN Document Server

    Kilmister, Clive William

    1970-01-01

    Special Theory of Relativity provides a discussion of the special theory of relativity. Special relativity is not, like other scientific theories, a statement about the matter that forms the physical world, but has the form of a condition that the explicit physical theories must satisfy. It is thus a form of description, playing to some extent the role of the grammar of physics, prescribing which combinations of theoretical statements are admissible as descriptions of the physical world. Thus, to describe it, one needs also to describe those specific theories and to say how much they are limit

  4. Manufacturing technology for practical Josephson voltage normals

    International Nuclear Information System (INIS)

    Kohlmann, Johannes; Kieler, Oliver

    2016-01-01

    In this contribution we present the manufacturing technology for the fabrication of integrated superconducting Josephson series circuits for voltage normals. First we summarize some foundations of Josephson voltage normals and sketch the concept and setup of the circuits, before we describe the manufacturing technology for modern practical Josephson voltage normals.
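
    The metrological principle behind such series circuits is the AC Josephson effect, quoted here as standard background rather than as content of the contribution: a junction driven at microwave frequency f develops quantized voltage steps

        \[
          V_{n} \;=\; \frac{n\,f}{K_{J}}, \qquad K_{J}=\frac{2e}{h}\approx 483\,597.8\ \text{GHz/V},
        \]

    so a series array of N junctions biased on step n delivers V = N n f / K_J, which is why large integrated series circuits are needed to reach practical voltage levels.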

  5. What genre theory does

    DEFF Research Database (Denmark)

    Andersen, Jack

    2015-01-01

    Purpose To provide a small overview of genre theory and its associated concepts and to show how genre theory has had its antecedents in certain parts of the social sciences and not in the humanities. Findings The chapter argues that the explanatory force of genre theory may be explained with its...... emphasis on everyday genres, de facto genres. Originality/value By providing an overview of genre theory, the chapter demonstrates the wealth and richness of forms of explanations in genre theory....

  6. Denotational Aspects of Untyped Normalization by Evaluation

    DEFF Research Database (Denmark)

    Filinski, Andrzej; Rohde, Henning Korsholm

    2005-01-01

    of soundness (the output term, if any, is in normal form and β-equivalent to the input term); identification (β-equivalent terms are mapped to the same result); and completeness (the function is defined for all terms that do have normal forms). We also show how the semantic construction enables a simple yet...... formal correctness proof for the normalization algorithm, expressed as a functional program in an ML-like, call-by-value language. Finally, we generalize the construction to produce an infinitary variant of normal forms, namely Böhm trees. We show that the three-part characterization of correctness...

  7. The construction of normal expectations

    DEFF Research Database (Denmark)

    Quitzau, Maj-Britt; Røpke, Inge

    2008-01-01

    The gradual upward changes of standards in normal everyday life have significant environmental implications, and it is therefore important to study how these changes come about. The intention of the article is to analyze the social construction of normal expectations through a case study. The case... concerns the present boom in bathroom renovations in Denmark, which offers an excellent opportunity to study the interplay between a wide variety of consumption drivers and social changes pointing toward long-term changes of normal expectations regarding bathroom standards. The study is problem-oriented... and transdisciplinary and draws on a wide range of sociological, anthropological, and economic theories. The empirical basis comprises a combination of statistics, a review of magazine and media coverage, visits to exhibitions, and qualitative interviews. A variety of consumption drivers are identified. Among...

  8. Strange mesons and kaon-to-pion transition form factors from holography

    International Nuclear Information System (INIS)

    Abidin, Zainul; Carlson, Carl E.

    2009-01-01

    We present a calculation of the K_{l3} transition form factors using the AdS/QCD correspondence. We also solidify and extend our ability to calculate quantities in the flavor-broken versions of AdS/QCD. The normalization of the form factors is a crucial ingredient for extracting |V_{us}| from data, and the results obtained here agree well with results from chiral perturbation theory and lattice gauge theory. The slopes and curvature of the form factors agree well with the data, and with what results are available from other methods of calculation.
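
    For reference, the normalization, slope and curvature discussed here are conventionally defined through the low-momentum expansion of the vector form factor; the expression below is standard K_{l3} phenomenology, included as a reminder rather than as a result of the paper:

        \[
          f_{+}(t) \;=\; f_{+}(0)\left[\,1+\lambda_{+}'\,\frac{t}{m_{\pi}^{2}}
          +\frac{1}{2}\,\lambda_{+}''\Big(\frac{t}{m_{\pi}^{2}}\Big)^{2}+\dots\right],
          \qquad t=(p_{K}-p_{\pi})^{2},
        \]

    with the normalization f_{+}(0) entering the extraction of |V_{us}| through the measured $K\to\pi\ell\nu$ decay rate, which is proportional to $|V_{us}|^{2} f_{+}(0)^{2}$.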

  9. O pensamento criativo de Paul Klee: arte e música na constituição da Teoria da Forma The creative thinking of Paul Klee: art and music in the formation of the Theory of Form

    Directory of Open Access Journals (Sweden)

    Rosana Costa Ramalho de Castro

    2010-01-01

    Study of the Theory of Form conceived in the early twentieth century by the artist Paul Klee and published in the book The Creative Thinking (KLEE, 1920). Paul Klee's Theory of Form is a demonstration of an artistic thought that adopts previously established formal prerequisites which result in the practice of artistic representation. Klee identified the formal relationships between music and the visual arts, drawing connections between the melodic line and the line in a drawing; rhythm and the sequences of modules and sub-modules; the pulses of the measures and the divisions of a painting; and metre in music and the modulation of shape and colour in the visual arts. Klee also presented his experiments with overlapping colours and textures to represent polyphony visually. Paul Klee's Theory of Form is an example of a study that presupposes formal models for artistic and design elaboration.

  10. Local homotopy theory

    CERN Document Server

    Jardine, John F

    2015-01-01

    This monograph on the homotopy theory of topologized diagrams of spaces and spectra gives an expert account of a subject at the foundation of motivic homotopy theory and the theory of topological modular forms in stable homotopy theory. Beginning with an introduction to the homotopy theory of simplicial sets and topos theory, the book covers core topics such as the unstable homotopy theory of simplicial presheaves and sheaves, localized theories, cocycles, descent theory, non-abelian cohomology, stacks, and local stable homotopy theory. A detailed treatment of the formalism of the subject is interwoven with explanations of the motivation, development, and nuances of ideas and results. The coherence of the abstract theory is elucidated through the use of widely applicable tools, such as Barr's theorem on Boolean localization, model structures on the category of simplicial presheaves on a site, and cocycle categories. A wealth of concrete examples convey the vitality and importance of the subject in topology, n...

  11. Ascorbate/menadione-induced oxidative stress kills cancer cells that express normal or mutated forms of the oncogenic protein Bcr-Abl. An in vitro and in vivo mechanistic study.

    Science.gov (United States)

    Beck, Raphaël; Pedrosa, Rozangela Curi; Dejeans, Nicolas; Glorieux, Christophe; Levêque, Philippe; Gallez, Bernard; Taper, Henryk; Eeckhoudt, Stéphane; Knoops, Laurent; Calderon, Pedro Buc; Verrax, Julien

    2011-10-01

    Numerous studies suggest that generation of oxidative stress could be useful in cancer treatment. In this study, we evaluated, in vitro and in vivo, the antitumor potential of oxidative stress induced by ascorbate/menadione (asc/men). This combination of a reducing agent (ascorbate) and a redox active quinone (menadione) generates redox cycling leading to formation of reactive oxygen species (ROS). Asc/men was tested in several cell types including K562 cells (a stable human-derived leukemia cell line), freshly isolated leukocytes from patients with chronic myeloid leukemia, BaF3 cells (a murine pro-B cell line) transfected with Bcr-Abl and peripheral blood leukocytes derived from healthy donors. Although these latter cells were resistant to asc/men, survival of all the other cell lines was markedly reduced, including the BaF3 cells expressing either wild-type or mutated Bcr-Abl. In a standard in vivo model of subcutaneous tumor transplantation, asc/men provoked a significant delay in the proliferation of K562 and BaF3 cells expressing the T315I mutated form of Bcr-Abl. No effect of asc/men was observed when these latter cells were injected into blood of mice most probably because of the high antioxidant potential of red blood cells, as shown by in vitro experiments. We postulate that cancer cells are more sensitive to asc/men than healthy cells because of their lack of antioxidant enzymes, mainly catalase. The mechanism underlying this cytotoxicity involves the oxidative cleavage of Hsp90 with a subsequent loss of its chaperone function thus leading to degradation of wild-type and mutated Bcr-Abl protein.

  12. Toward a Unified Consciousness Theory

    Science.gov (United States)

    Johnson, Richard H.

    1977-01-01

    The beginning of a holistic theory that can treat paranormal phenomena as normal human development is presented. Implications for counseling, counselor education, and counselor supervision are discussed. (Author)

  13. (EOI) Form

    International Development Research Centre (IDRC) Digital Library (Canada)

    Dorine Odongo

    COLLABORATING TECHNICAL AGENCIES: EXPRESSION OF INTEREST FORM. • Please read the information provided about the initiative and the eligibility requirements in the Prospectus before completing this application form. • Ensure all the sections of the form are accurately completed and saved in PDF format.

  14. Foundations of a Theory of Social Forms

    NARCIS (Netherlands)

    L. Polos (Laszlo); M.T. Hannan; G.R. Carroll

    2000-01-01

    textabstractIn the early transition era in Russia entry barriers for commercial banks were about absent. It resulted in the mushrooming of hundreds of small, poorly-endowed and inexperienced banks. In this paper we address the question whether the claimed benefits of low entry barriers - competition

  15. Normalized cDNA libraries

    Science.gov (United States)

    Soares, Marcelo B.; Efstratiadis, Argiris

    1997-01-01

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to moderate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library.

  16. Random Generators and Normal Numbers

    OpenAIRE

    Bailey, David H.; Crandall, Richard E.

    2002-01-01

    Pursuant to the authors' previous chaotic-dynamical model for random digits of fundamental constants, we investigate a complementary, statistical picture in which pseudorandom number generators (PRNGs) are central. Some rigorous results are achieved: We establish b-normality for constants of the form $\sum_i 1/(b^{m_i} c^{n_i})$ for certain sequences $(m_i), (n_i)$ of integers. This work unifies and extends previously known classes of explicit normals. We prove that for coprime $b,c>1$ the...
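
    A quick empirical illustration of b-normality for constants of this type is to expand a partial sum in base b and tally the digit frequencies. The sketch below uses exact rational arithmetic and an illustrative choice of exponent sequences (m_i = c^i, n_i = i, a Stoneham-type constant); this particular choice is an assumption made for demonstration and is not necessarily the exact family treated in the paper.

        from fractions import Fraction
        from collections import Counter

        def base_b_digits(b=2, c=3, terms=6, ndigits=1000):
            # Partial sum of sum_{i>=1} 1/(b**(c**i) * c**i), i.e. the family
            # sum_i 1/(b**m_i * c**n_i) with the illustrative choice m_i = c**i, n_i = i.
            x = sum(Fraction(1, b**(c**i) * c**i) for i in range(1, terms + 1))
            digits = []
            for _ in range(ndigits):
                x *= b
                d = int(x)      # the integer part is the next base-b digit
                digits.append(d)
                x -= d
            return digits

        print(Counter(base_b_digits()))
        # Roughly equal counts of 0s and 1s are consistent with (though of course
        # do not prove) 2-normality of the underlying constant.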

  17. Field theory and strings

    International Nuclear Information System (INIS)

    Bonara, L.; Cotta-Ramusino, P.; Rinaldi, M.

    1987-01-01

    It is well known that type I and heterotic superstring theories have a zero-mass spectrum which corresponds to the field content of N=1 supergravity theory coupled to supersymmetric Yang-Mills theory in 10-D. The authors study the field theory 'per se', in the hope that simple consistency requirements will determine the theory completely once one knows the field content inherited from string theory. The simplest consistency requirements are: N=1 supersymmetry; and absence of chiral anomalies. This is what the authors discuss in this paper, leaving undetermined the question of the range of validity of the resulting field theory. A model of N=1 supergravity (SUGRA) coupled to supersymmetric Yang-Mills (SYM) theory was already known in the form given by Chapline and Manton. The coupling of SUGRA to SYM was determined by the definition of the 'field strength' 3-form H in this paper

  18. Normal foot and ankle

    International Nuclear Information System (INIS)

    Weissman, S.D.

    1989-01-01

    The foot may be thought of as a bag of bones tied tightly together and functioning as a unit. The bones are expected to maintain their alignment without causing symptoms for the patient. The author discusses a normal radiograph. The bones must have normal shape and normal alignment. The density of the soft tissues should be normal and there should be no fractures, tumors, or foreign bodies

  19. Quantum potential theory

    CERN Document Server

    Schürmann, Michael

    2008-01-01

    This volume contains the revised and completed notes of lectures given at the school "Quantum Potential Theory: Structure and Applications to Physics," held at the Alfried-Krupp-Wissenschaftskolleg in Greifswald from February 26 to March 10, 2007. Quantum potential theory studies noncommutative (or quantum) analogs of classical potential theory. These lectures provide an introduction to this theory, concentrating on probabilistic potential theory and its quantum analogs, i.e. quantum Markov processes and semigroups, quantum random walks, Dirichlet forms on C* and von Neumann algebras, and boundary theory. Applications to quantum physics, in particular the filtering problem in quantum optics, are also presented.

  20. Splittings of free groups, normal forms and partitions of ends

    Indian Academy of Sciences (India)

    geodesic laminations and show that this space is compact. Many of the ... determined by the partition of ends of $\tilde M$ associated to the spheres. In §4, we recall ... As is well-known, we can associate to a graph a topological space. Geometrically ...

  1. Nonpolynomial vector fields under the Lotka-Volterra normal form

    Science.gov (United States)

    Hernández-Bermejo, Benito; Fairén, Víctor

    1995-02-01

    We carry out the generalization of the Lotka-Volterra embedding to flows not explicitly recognizable under the generalized Lotka-Volterra format. The procedure introduces appropriate auxiliary variables, and it is shown how, to a great extent, the final Lotka-Volterra system is independent of their specific definition. Conservation of the topological equivalence during the process is also demonstrated.
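
    The kind of embedding referred to here can be sketched with the generic quasi-monomial construction; the formulas below convey the general idea under that assumption and are not claimed to reproduce the paper's specific procedure. A system in quasi-polynomial form,

        \[
          \dot{x}_{i} \;=\; x_{i}\Big(\lambda_{i}+\sum_{j=1}^{m} A_{ij}\prod_{k=1}^{n} x_{k}^{B_{jk}}\Big),
          \qquad i=1,\dots,n,
        \]

    is mapped, through the auxiliary variables $y_{j}=\prod_{k} x_{k}^{B_{jk}}$, onto the Lotka-Volterra normal form

        \[
          \dot{y}_{j} \;=\; y_{j}\Big((B\lambda)_{j}+\sum_{l=1}^{m}(BA)_{jl}\,y_{l}\Big),
          \qquad j=1,\dots,m,
        \]

    so the original flow is recovered from a standard Lotka-Volterra system in the enlarged set of variables, largely independently of how the auxiliary variables are chosen.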

  2. Normal forms for characteristic functions on n-ary relations

    NARCIS (Netherlands)

    D.J.N. van Eijck (Jan)

    2004-01-01

    textabstractFunctions of type (n) are characteristic functions on n-ary relations. Keenan established their importance for natural language semantics, by showing that natural language has many examples of irreducible type (n) functions, i.e., functions of type (n) that cannot be represented as

  3. Imaging the corpus callosum, septum pellucidum and fornix in children: normal anatomy and variations of normality

    International Nuclear Information System (INIS)

    Griffiths, Paul D.; Batty, Ruth; Connolly, Dan J.A.; Reeves, Michael J.

    2009-01-01

    The midline structures of the supra-tentorial brain are important landmarks for judging if the brain has formed correctly. In this article, we consider the normal appearances of the corpus callosum, septum pellucidum and fornix as shown on MR imaging in normal and near-normal states. (orig.)

  4. Building theory through design

    DEFF Research Database (Denmark)

    Markussen, Thomas

    2017-01-01

    This chapter deals with a fundamental matter of concern in research through design: how can design work lead to the building of new theory? Controversy exists about the balance between theory and design work in research through design. While some researchers see theory production as the scientific...... hallmark of this type of research, others argue for design work being the primary achievement, with theory serving the auxiliary function of inspiring new designs. This paper demonstrates how design work and theory can be appreciated as two equally important outcomes of research through design. To set...... the scene, it starts out by briefly examining ideas on this issue presented in existing research literature. Hereafter, it introduces three basic forms in which design work can lead to theory, referred to as extending theories, scaffolding theories and blending theories. Finally, it is discussed how...

  5. Transport through hybrid superconducting/normal nanostructures

    Energy Technology Data Exchange (ETDEWEB)

    Futterer, David

    2013-01-29

    We mainly investigate transport through interacting quantum dots proximized by superconductors. For this purpose we extend an existing theory to describe transport through proximized quantum dots coupled to normal and superconducting leads. It allows us to study the influence of a strong Coulomb interaction on Andreev currents and Josephson currents. This is a particularly interesting topic because it combines two competing properties: in superconductors Cooper pairs are formed by two electrons which experience an attractive interaction while two electrons located on a quantum dot repel each other due to the Coulomb interaction. It seems at first glance that transport processes involving Cooper pairs should be suppressed because of the two competing interactions. However, it is possible to proximize the dot in nonequilibrium situations. At first, we study a setup composed of a quantum dot coupled to one normal, one ferromagnetic, and one superconducting lead in the limit of an infinitely-large superconducting gap. Within this limit the coupling between dot and superconductor is described exactly by the presented theory. It leads to the formation of Andreev-bound states (ABS) and an additional bias scheme opens in which a pure spin current, i.e. a spin current with a vanishing associated charge current, can be generated. In a second work, starting from the infinite-gap limit, we perform a systematic expansion of the superconducting gap around infinity and investigate Andreev currents and Josephson currents. This allows us to estimate the validity of infinite-gap calculations for real systems in which the superconducting gap is usually a rather small quantity. We find indications that a finite gap renormalizes the ABS and propose a resummation approach to explore the finite-gap ABS. Despite the renormalization effects the modifications of transport by finite gaps are rather small. This result lets us conclude that the infinite-gap calculation is a valuable tool to
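
    In the infinite-gap limit invoked here, the superconducting lead is commonly integrated out, leaving an effective pairing term on the dot. The expression below is the widely used form of that effective Hamiltonian, quoted as background under the assumption that the thesis employs an equivalent construction; epsilon is the dot level, U the Coulomb interaction and Gamma_S the tunnel coupling to the superconductor:

        \[
          H_{\text{eff}} \;=\; \sum_{\sigma}\varepsilon\, d^{\dagger}_{\sigma}d_{\sigma}
          \;+\; U\, n_{\uparrow}n_{\downarrow}
          \;-\; \frac{\Gamma_{S}}{2}\left(d^{\dagger}_{\uparrow}d^{\dagger}_{\downarrow}
          + d_{\downarrow}d_{\uparrow}\right),
        \]

    whose eigenstates coherently mix the empty and doubly occupied dot states and thereby give rise to the Andreev bound states discussed in the abstract.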

  6. Transport through hybrid superconducting/normal nanostructures

    International Nuclear Information System (INIS)

    Futterer, David

    2013-01-01

    We mainly investigate transport through interacting quantum dots proximized by superconductors. For this purpose we extend an existing theory to describe transport through proximized quantum dots coupled to normal and superconducting leads. It allows us to study the influence of a strong Coulomb interaction on Andreev currents and Josephson currents. This is a particularly interesting topic because it combines two competing properties: in superconductors Cooper pairs are formed by two electrons which experience an attractive interaction while two electrons located on a quantum dot repel each other due to the Coulomb interaction. It seems at first glance that transport processes involving Cooper pairs should be suppressed because of the two competing interactions. However, it is possible to proximize the dot in nonequilibrium situations. At first, we study a setup composed of a quantum dot coupled to one normal, one ferromagnetic, and one superconducting lead in the limit of an infinitely-large superconducting gap. Within this limit the coupling between dot and superconductor is described exactly by the presented theory. It leads to the formation of Andreev-bound states (ABS) and an additional bias scheme opens in which a pure spin current, i.e. a spin current with a vanishing associated charge current, can be generated. In a second work, starting from the infinite-gap limit, we perform a systematic expansion of the superconducting gap around infinity and investigate Andreev currents and Josephson currents. This allows us to estimate the validity of infinite-gap calculations for real systems in which the superconducting gap is usually a rather small quantity. We find indications that a finite gap renormalizes the ABS and propose a resummation approach to explore the finite-gap ABS. Despite the renormalization effects the modifications of transport by finite gaps are rather small. This result lets us conclude that the infinite-gap calculation is a valuable tool to

  7. Rationality, Theory Acceptance and Decision Theory

    Directory of Open Access Journals (Sweden)

    J. Nicolas Kaufmann

    1998-06-01

    Following Kuhn's main thesis according to which theory revision and acceptance is always paradigm relative, I propose to outline some possible consequences of such a view. First, asking in what sense Bayesian decision theory could serve as the appropriate (normative) theory of rationality examined from the point of view of the epistemology of theory acceptance, I argue that Bayesianism leads to a narrow conception of theory acceptance. Second, regarding the different types of theory revision, i.e. expansion, contraction, replacement and residual shifts, I extract from Kuhn's view a series of indications showing that theory replacement cannot be rationalized within the framework of Bayesian decision theory, not even within a more sophisticated version of that model. Third, and finally, I will point to the need for a more comprehensive model of rationality than the Bayesian expected utility maximization model, the need for a model which could better deal with the different aspects of theory replacement. I will show that Kuhn's distinction between normal and revolutionary science gives us several hints for a more adequate theory of rationality in science. I will also show that Kuhn is not in a position to fully articulate his main ideas and that he will be confronted with a serious problem concerning the collective choice of a paradigm.

  8. Calculation of the Nucleon Axial Form Factor Using Staggered Lattice QCD

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Aaron S. [Fermilab; Hill, Richard J. [Perimeter Inst. Theor. Phys.; Kronfeld, Andreas S. [Fermilab; Li, Ruizi [Indiana U.; Simone, James N. [Fermilab

    2016-10-14

    The nucleon axial form factor is a dominant contribution to errors in neutrino oscillation studies. Lattice QCD calculations can help control theory errors by providing first-principles information on nucleon form factors. In these proceedings, we present preliminary results on a blinded calculation of $g_A$ and the axial form factor using HISQ staggered baryons with 2+1+1 flavors of sea quarks. Calculations are done using physical light quark masses and are absolutely normalized. We discuss fitting form factor data with the model-independent $z$ expansion parametrization.
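
    The model-independent z expansion mentioned here maps the physical momentum-transfer region onto the unit disk and expands the form factor as a power series. The standard definitions are quoted below as background (with $t_{\text{cut}} = 9 m_{\pi}^{2}$, the three-pion threshold relevant for the axial channel, and $t_0$ a free reference point):

        \[
          z(t)=\frac{\sqrt{t_{\text{cut}}-t}-\sqrt{t_{\text{cut}}-t_{0}}}
                    {\sqrt{t_{\text{cut}}-t}+\sqrt{t_{\text{cut}}-t_{0}}},
          \qquad
          F_{A}(t)=\sum_{k=0}^{k_{\max}} a_{k}\,z(t)^{k},
          \qquad t=q^{2}\le 0,
        \]

    with $g_A = F_A(0)$ and the axial radius obtained from the slope of the series at t = 0.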

  9. Geophysical Field Theory

    International Nuclear Information System (INIS)

    Eloranta, E.

    2003-11-01

    The geophysical field theory includes the basic principles of electromagnetism, continuum mechanics, and potential theory upon which the computational modelling of geophysical phenomena is based. Vector analysis is the main mathematical tool in the field analyses. Electrostatics, stationary electric currents, magnetostatics, and electrodynamics form a central part of electromagnetism in geophysical field theory. Potential theory concerns especially gravity, but also electrostatics and magnetostatics. Solid-state mechanics and fluid mechanics are central parts of continuum mechanics. The theories of elastic waves and rock mechanics also belong to geophysical solid-state mechanics. The theories of geohydrology and mass transport form one central field theory in geophysical fluid mechanics. Heat transfer is also included in continuum mechanics. (orig.)

  10. Biocultural Theory

    DEFF Research Database (Denmark)

    Carroll, Joseph; Clasen, Mathias; Jonsson, Emelie

    2017-01-01

    Biocultural theory is an integrative research program designed to investigate the causal interactions between biological adaptations and cultural constructions. From the biocultural perspective, cultural processes are rooted in the biological necessities of the human life cycle: specifically human forms of birth, growth, survival, mating, parenting, and sociality. Conversely, from the biocultural perspective, human biological processes are constrained, organized, and developed by culture, which includes technology, culturally specific socioeconomic and political structures, religious ... of research as contributions to a coherent, collective research program. This article argues that a mature biocultural paradigm needs to be informed by at least 7 major research clusters: (a) gene-culture coevolution; (b) human life history theory; (c) evolutionary social psychology; (d) anthropological ...

  11. Comparison of Attachment theory and Cognitive-Motivational Structure theory.

    Science.gov (United States)

    Malerstein, A J

    2005-01-01

    Attachment theory and Cognitive-Motivational Structure (CMS) theory are similar in most respects. They differ primarily in their proposal of when, during development, one's sense of the self and of the outside world is formed. I propose that the theories supplement each other after about age seven years--when Attachment theory's predictions of social function become unreliable, CMS theory comes into play.

  12. Anomalous normal mode oscillations in semiconductor microcavities

    Energy Technology Data Exchange (ETDEWEB)

    Wang, H. [Univ. of Oregon, Eugene, OR (United States). Dept. of Physics; Hou, H.Q.; Hammons, B.E. [Sandia National Labs., Albuquerque, NM (United States)

    1997-04-01

    Semiconductor microcavities as a composite exciton-cavity system can be characterized by two normal modes. Under impulsive excitation by a short laser pulse, the optical polarizations associated with the two normal modes have a π phase difference. The total induced optical polarization is then expected to exhibit a sin²(Ωt)-like oscillation, where 2Ω is the normal-mode splitting, reflecting a coherent energy exchange between the exciton and the cavity. In this paper the authors present experimental studies of normal-mode oscillations using three-pulse transient four-wave mixing (FWM). The results reveal, surprisingly, that when the cavity is tuned far below the exciton resonance, the normal-mode oscillation in the polarization is cos²(Ωt)-like, in contrast to what is expected from the simple normal-mode model. This anomalous normal-mode oscillation reflects the important role of virtual excitation of electronic states in semiconductor microcavities.
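
    The expected sin² behaviour follows from the elementary picture of two coupled modes; for coupling g and exciton-cavity detuning delta, the fraction of energy transferred back and forth oscillates as (a standard coupled-oscillator result, included for orientation only):

        \[
          P_{\text{exch}}(t)=\frac{4g^{2}}{4g^{2}+\delta^{2}}\,
          \sin^{2}\!\Big(\tfrac{1}{2}\sqrt{4g^{2}+\delta^{2}}\;t\Big),
          \qquad 2\Omega=\sqrt{4g^{2}+\delta^{2}},
        \]

    which reduces to a full-contrast sin²(Ωt) oscillation at zero detuning; the cos²-like signal reported in the abstract is precisely what departs from this simple picture.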

  13. Normalization of Deviation: Quotation Error in Human Factors.

    Science.gov (United States)

    Lock, Jordan; Bearman, Chris

    2018-05-01

    Objective The objective of this paper is to examine quotation error in human factors. Background Science progresses through building on the work of previous research. This requires accurate quotation. Quotation error has a number of adverse consequences: loss of credibility, loss of confidence in the journal, and a flawed basis for academic debate and scientific progress. Quotation error has been observed in a number of domains, including marine biology and medicine, but there has been little or no previous study of this form of error in human factors, a domain that specializes in the causes and management of error. Methods A study was conducted examining the quotation accuracy of 187 extracts from 118 published articles that cited a control article (Vaughan's 1996 book: The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA). Results Of the extracts studied, 12.8% (n = 24) were classed as inaccurate and 87.2% (n = 163) as accurate. A second dimension, agreement, was also examined, with 96.3% (n = 180) agreeing with the control article and only 3.7% (n = 7) disagreeing. The categories of accuracy and agreement form a two-by-two matrix. Conclusion Rather than simply blaming individuals for quotation error, systemic factors should also be considered. Vaughan's theory, normalization of deviance, is one systemic theory that can account for quotation error. Application Quotation error is occurring in human factors and should receive more attention. According to Vaughan's theory, the normal everyday systems that promote scholarship may also allow mistakes, mishaps, and quotation error to occur.

  14. Drug Use Normalization: A Systematic and Critical Mixed-Methods Review.

    Science.gov (United States)

    Sznitman, Sharon R; Taubman, Danielle S

    2016-09-01

    Drug use normalization, which is a process whereby drug use becomes less stigmatized and more accepted as normative behavior, provides a conceptual framework for understanding contemporary drug issues and changes in drug use trends. Through a mixed-methods systematic review of the normalization literature, this article seeks to (a) critically examine how the normalization framework has been applied in empirical research and (b) make recommendations for future research in this area. Twenty quantitative, 26 qualitative, and 4 mixed-methods studies were identified through five electronic databases and reference lists of published studies. Studies were assessed for relevance, study characteristics, quality, and aspects of normalization examined. None of the studies applied the most rigorous research design (experiments) or examined all of the originally proposed normalization dimensions. The most commonly assessed dimension of drug use normalization was "experimentation." In addition to the original dimensions, the review identified the following new normalization dimensions in the literature: (a) breakdown of demographic boundaries and other risk factors in relation to drug use; (b) de-normalization; (c) drug use as a means to achieve normal goals; and (d) two broad forms of micro-politics associated with managing the stigma of illicit drug use: assimilative and transformational normalization. Further development in normalization theory and methodology promises to provide researchers with a novel framework for improving our understanding of drug use in contemporary society. Specifically, quasi-experimental designs that are currently being made feasible by swift changes in cannabis policy provide researchers with new and improved opportunities to examine normalization processes.

  15. Baby Poop: What's Normal?

    Science.gov (United States)

    ... I'm breast-feeding my newborn and her bowel movements are yellow and mushy. Is this normal for baby poop? Answers from Jay L. Hoecker, M.D. Yellow, mushy bowel movements are perfectly normal for breast-fed babies. Still, ...

  16. Visual Memories Bypass Normalization.

    Science.gov (United States)

    Bloem, Ilona M; Watanabe, Yurika L; Kibbe, Melissa M; Ling, Sam

    2018-05-01

    How distinct are visual memory representations from visual perception? Although evidence suggests that briefly remembered stimuli are represented within early visual cortices, the degree to which these memory traces resemble true visual representations remains something of a mystery. Here, we tested whether both visual memory and perception succumb to a seemingly ubiquitous neural computation: normalization. Observers were asked to remember the contrast of visual stimuli, which were pitted against each other to promote normalization either in perception or in visual memory. Our results revealed robust normalization between visual representations in perception, yet no signature of normalization occurring between working memory stores-neither between representations in memory nor between memory representations and visual inputs. These results provide unique insight into the nature of visual memory representations, illustrating that visual memory representations follow a different set of computational rules, bypassing normalization, a canonical visual computation.
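
    For readers unfamiliar with the canonical computation referred to here, divisive normalization is usually written in the Carandini-Heeger form; the symbols below follow that common convention and are not taken from this particular paper:

        \[
          R_{i} \;=\; \frac{D_{i}^{\,n}}{\sigma^{n}+\sum_{j} D_{j}^{\,n}},
        \]

    so each unit's drive D_i is divided by a pooled measure of activity across the population (with semisaturation constant sigma and exponent n); this is the computation that the working-memory representations are reported to bypass.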

  17. Making nuclear 'normal'

    International Nuclear Information System (INIS)

    Haehlen, Peter; Elmiger, Bruno

    2000-01-01

    The mechanics of the Swiss NPPs' 'come and see' programme 1995-1999 were illustrated in our contributions to all PIME workshops since 1996. Now, after four annual 'waves', all the country has been covered by the NPPs' invitation to dialogue. This makes PIME 2000 the right time to shed some light on one particular objective of this initiative: making nuclear 'normal'. The principal aim of the 'come and see' programme, namely to give the Swiss NPPs 'a voice of their own' by the end of the nuclear moratorium 1990-2000, has clearly been attained and was commented on during earlier PIMEs. It is, however, equally important that Swiss nuclear energy not only made progress in terms of public 'presence', but also in terms of being perceived as a normal part of industry, as a normal branch of the economy. The message that Swiss nuclear energy is nothing but a normal business involving normal people, was stressed by several components of the multi-prong campaign: - The speakers in the TV ads were real - 'normal' - visitors' guides and not actors; - The testimonials in the print ads were all real NPP visitors - 'normal' people - and not models; - The mailings inviting a very large number of associations to 'come and see' activated a typical channel of 'normal' Swiss social life; - Spending money on ads (a new activity for Swiss NPPs) appears to have resulted in being perceived by the media as a normal branch of the economy. Today we feel that the 'normality' message has well been received by the media. In the controversy dealing with antinuclear arguments brought forward by environmental organisations journalists nowadays as a rule give nuclear energy a voice - a normal right to be heard. As in a 'normal' controversy, the media again actively ask themselves questions about specific antinuclear claims, much more than before 1990 when the moratorium started. The result is that in many cases such arguments are discarded by journalists, because they are, e.g., found to be

  18. I can never be normal

    DEFF Research Database (Denmark)

    Andreassen, Rikke; Ahmed Andresen, Uzma

    2014-01-01

    This article focuses on the doing and undoing of race in daily life practices in Denmark. It takes the form of a dialogue between two women, a heterosexual Muslim woman of color and a lesbian white woman, who discuss and analyze how their daily life, e.g. interactions with their children’s schools and daycare institutions, shape their racial and gendered experiences. Drawing upon black feminist theory, postcolonial theory, critical race and whiteness studies, the two women illustrate inclusions and exclusions in their society based on gender, race, class and sexuality – and especially pinpoint how ... left behind – prevents contemporary people from addressing existing patterns of racial discrimination, inclusion and exclusion in their daily lives, as well as from connecting their contemporary struggles to historical struggles and inequalities. Furthermore, they illustrate how food, class and race ...

  19. Emotion processes in normal and abnormal development and preventive intervention.

    Science.gov (United States)

    Izard, Carroll E; Fine, Sarah; Mostow, Allison; Trentacosta, Christopher; Campbell, Jan

    2002-01-01

    We present an analysis of the role of emotions in normal and abnormal development and preventive intervention. The conceptual framework stems from three tenets of differential emotions theory (DET). These principles concern the constructs of emotion utilization; intersystem connections among modular emotion systems, cognition, and action; and the organizational and motivational functions of discrete emotions. Particular emotions and patterns of emotions function differentially in different periods of development and in influencing the cognition and behavior associated with different forms of psychopathology. Established prevention programs have not emphasized the concept of emotion as motivation. It is even more critical that they have generally neglected the idea of modulating emotions, not simply to achieve self-regulation, but also to utilize their inherently adaptive functions as a means of facilitating the development of social competence and preventing psychopathology. The paper includes a brief description of a theory-based prevention program and suggestions for complementary targeted interventions to address specific externalizing and internalizing problems. In the final section, we describe ways in which emotion-centered preventions can provide excellent opportunities for research on the development of normal and abnormal behavior.

  20. Advances in metal forming expert system for metal forming

    CERN Document Server

    Hingole, Rahulkumar Shivajirao

    2015-01-01

    This comprehensive book offers a clear account of the theory and applications of advanced metal forming. It provides a detailed discussion of specific forming processes, such as deep drawing, rolling, bending extrusion and stamping. The author highlights recent developments of metal forming technologies and explains sound, new and powerful expert system techniques for solving advanced engineering problems in metal forming. In addition, the basics of expert systems, their importance and applications to metal forming processes, computer-aided analysis of metalworking processes, formability analysis, mathematical modeling and case studies of individual processes are presented.

  1. Landau theory and giant room-temperature barocaloric effect in MF3 metal trifluorides

    Energy Technology Data Exchange (ETDEWEB)

    Corrales-Salazar, A. [Univ. of Costa Rica, San Jose (Costa Rica); Brierley, R. T. [Yale Univ., New Haven, CT (United States); Littlewood, P. B. [Univ. of Chicago, IL (United States); Argonne National Lab. (ANL), Argonne, IL (United States); Guzmán-Verri, G. G. [Univ. of Costa Rica, San Jose (Costa Rica); Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-10-17

    The structural phase transitions of MF3 (M = Al, Cr, V, Fe, Ti, Sc) metal trifluorides are studied within a simple Landau theory consisting of tilts of rigid MF6 octahedra associated with soft antiferrodistortive optic modes that are coupled to long-wavelength strain-generating acoustic phonons. We calculate the temperature and pressure dependence of several quantities such as the spontaneous distortions, volume expansion, and shear strains, as well as T-P phase diagrams. By contrasting our model with experiments we quantify the deviations from mean-field behavior and find that the tilt fluctuations of the MF6 octahedra increase with metal cation size. We apply our model to predict giant barocaloric effects in Sc-substituted TiF3 of up to about 15 J K^{-1} kg^{-1} for modest hydrostatic compressions of 0.2 GPa. The effect extends over a wide temperature range of over 140 K (including room temperature) due to a large predicted rate, dT_c/dP = 723 K GPa^{-1}, which exceeds those of typical barocaloric materials. Our results suggest that open lattice frameworks such as the trifluorides are an attractive platform to search for giant barocaloric effects.
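
    A minimal version of the kind of Landau model described here couples the soft octahedral-tilt order parameter phi to a strain epsilon and to the applied pressure P; the free energy and the Maxwell relation below are generic textbook forms, given as a sketch of the approach rather than as the paper's exact expansion:

        \[
          F(\phi,\varepsilon;T,P)=\tfrac{1}{2}a(T-T_{0})\,\phi^{2}+\tfrac{1}{4}b\,\phi^{4}+\tfrac{1}{6}c\,\phi^{6}
          +\lambda\,\phi^{2}\varepsilon+\tfrac{1}{2}C\,\varepsilon^{2}+P\,\varepsilon,
          \qquad
          \Delta S(T,0\!\to\!P)=-\int_{0}^{P}\Big(\frac{\partial V}{\partial T}\Big)_{P'}dP',
        \]

    where minimizing F over phi and epsilon at each (T, P) yields the transition line T_c(P), and the Maxwell relation converts the pressure dependence of the thermal expansion into the barocaloric entropy change quoted in the abstract.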

  2. Theory of particle interactions

    International Nuclear Information System (INIS)

    Belokurov, V.V.; Shirkov, D.V.

    1986-01-01

    The development and modern state of the theory of elementary particle interactions are described. The main aim of the paper is to give a picture of the development of quantum field theory in a form easily accessible to physicists not working in this field of science. Besides an outline of the chronological development of the main ideas, descriptions are given of renormalization and the renormalization group, gauge theories, models of electroweak interactions and quantum chromodynamics, together with the latest investigations related to the unification of all interactions and to supersymmetry.

  3. Wigner's dynamical transition state theory in phase space : classical and quantum

    NARCIS (Netherlands)

    Waalkens, Holger; Schubert, Roman; Wiggins, Stephen

    We develop Wigner's approach to a dynamical transition state theory in phase space in both the classical and quantum mechanical settings. The key to our development is the construction of a normal form for describing the dynamics in the neighbourhood of a specific type of saddle point that governs
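
    The normal form alluded to here is, in the classical setting, the standard one in the neighbourhood of an index-one saddle: after a symplectic change of coordinates the Hamiltonian depends only on the hyperbolic integral I = p_1 q_1 and the elliptic actions J_i = (p_i^2 + q_i^2)/2. The expression below is this generic construction, not the paper's specific notation:

        \[
          H_{\mathrm{NF}} \;=\; \lambda\, p_{1}q_{1}
          \;+\; \sum_{i=2}^{n}\omega_{i}\,\frac{p_{i}^{2}+q_{i}^{2}}{2}
          \;+\; \text{higher-order terms in } \big(p_{1}q_{1},\,J_{2},\dots,J_{n}\big),
        \]

    and the dividing surface, the transition state and the reaction pathways are then defined directly in these normal-form coordinates.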

  4. Normal Pressure Hydrocephalus

    Science.gov (United States)

    ... improves the chance of a good recovery. Without treatment, symptoms may worsen and cause death. What research is being done? The NINDS conducts and supports research on neurological disorders, including normal pressure hydrocephalus. Research on disorders such ...

  5. Normality in Analytical Psychology

    Science.gov (United States)

    Myers, Steve

    2013-01-01

    Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity. PMID:25379262

  6. Normal pressure hydrocephalus

    Science.gov (United States)

    Hydrocephalus - occult; Hydrocephalus - idiopathic; Hydrocephalus - adult; Hydrocephalus - communicating; Dementia - hydrocephalus; NPH ...

  7. Normal Functioning Family

    Science.gov (United States)

    Is there any way ...

  8. Normal growth and development

    Science.gov (United States)

    A child's growth and development can be divided into four periods: ... (source page: //medlineplus.gov/ency/article/002456.htm)

  9. Sequences, groups, and number theory

    CERN Document Server

    Rigo, Michel

    2018-01-01

    This collaborative book presents recent trends on the study of sequences, including combinatorics on words and symbolic dynamics, and new interdisciplinary links to group theory and number theory. Other chapters branch out from those areas into subfields of theoretical computer science, such as complexity theory and theory of automata. The book is built around four general themes: number theory and sequences, word combinatorics, normal numbers, and group theory. Those topics are rounded out by investigations into automatic and regular sequences, tilings and theory of computation, discrete dynamical systems, ergodic theory, numeration systems, automaton semigroups, and amenable groups.  This volume is intended for use by graduate students or research mathematicians, as well as computer scientists who are working in automata theory and formal language theory. With its organization around unified themes, it would also be appropriate as a supplemental text for graduate level courses.

  10. Notes on Game Theory

    National Research Council Canada - National Science Library

    Washburn, Alan

    2000-01-01

    .... Nothing could be further from the truth. In fact, the authors hoped that their theory might form the basis of decision making in all situations where multiple decision makers can affect an outcome, a large class of situations that includes...

  11. Fuzzy logic of Aristotelian forms

    Energy Technology Data Exchange (ETDEWEB)

    Perlovsky, L.I. [Nichols Research Corp., Lexington, MA (United States)

    1996-12-31

    Model-based approaches to pattern recognition and machine vision have been proposed to overcome the exorbitant training requirements of earlier computational paradigms. However, uncertainties in data were found to lead to a combinatorial explosion of the computational complexity. This issue is related here to the roles of a priori knowledge vs. adaptive learning. What is the a priori knowledge representation that supports learning? I introduce Modeling Field Theory (MFT), a model-based neural network whose adaptive learning is based on a priori models. These models combine deterministic, fuzzy, and statistical aspects to account for a priori knowledge, its fuzzy nature, and data uncertainties. In the process of learning, a priori fuzzy concepts converge to crisp or probabilistic concepts. The MFT is a convergent dynamical system of only linear computational complexity. Fuzzy logic turns out to be essential for reducing the combinatorial complexity to a linear one. I will discuss the relationship of the new computational paradigm to two theories due to Aristotle: the theory of Forms and logic. While the theory of Forms argued that the mind cannot be based on ready-made a priori concepts, Aristotelian logic operated with just such concepts. I discuss an interpretation of MFT suggesting that its fuzzy logic, combining apriority and adaptivity, implements the Aristotelian theory of Forms (theory of mind). Thus, 2300 years after Aristotle, a logic is developed suitable for his theory of mind.

  12. Theory in practice

    CERN Multimedia

    2003-01-01

    With the start of next year, CERN's Theory Division and Experimental Physics Division will merge to form the new Department of Physics. The Bulletin looks back at an era, takes a closer look at what the Theory Division is and what makes it so special.

  13. Instantaneous stochastic perturbation theory

    International Nuclear Information System (INIS)

    Lüscher, Martin

    2015-01-01

    A form of stochastic perturbation theory is described, where the representative stochastic fields are generated instantaneously rather than through a Markov process. The correctness of the procedure is established to all orders of the expansion and for a wide class of field theories that includes all common formulations of lattice QCD.

  14. Normal modes and continuous spectra

    International Nuclear Information System (INIS)

    Balmforth, N.J.; Morrison, P.J.

    1994-12-01

    The authors consider stability problems arising in fluids, plasmas and stellar systems that contain singularities resulting from wave-mean flow or wave-particle resonances. Such resonances lead to singularities in the differential equations determining the normal modes at the so-called critical points or layers. The locations of the singularities are determined by the eigenvalue of the problem, and as a result, the spectrum of eigenvalues forms a continuum. They outline a method to construct the singular eigenfunctions comprising the continuum for a variety of problems
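
    A textbook instance of the singular eigenfunctions referred to here is the Van Kampen mode of the linearized Vlasov (or analogous shear-flow) problem: for a frequency omega inside the continuum the eigenfunction has the distributional form (standard background, offered only to illustrate the construction, with g(v) denoting the relevant equilibrium gradient):

        \[
          f_{\omega}(v) \;=\; \mathrm{P}\,\frac{g(v)}{\omega-v} \;+\; \lambda(\omega)\,\delta(\omega-v),
        \]

    where P denotes the Cauchy principal value and the coefficient lambda(omega) is fixed by a normalization or dispersion condition; every real omega in the range of the resonant velocities contributes one such mode, which is why the eigenvalues form a continuum.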

  15. Waltz's Theory of Theory

    DEFF Research Database (Denmark)

    Wæver, Ole

    2009-01-01

    -empiricism and anti-positivism of his position. Followers and critics alike have treated Waltzian neorealism as if it was at bottom a formal proposition about cause-effect relations. The extreme case of Waltz being so victorious in the discipline, and yet being consistently mis-interpreted on the question of theory......, shows the power of a dominant philosophy of science in US IR, and thus the challenge facing any ambitious theorising. The article suggests a possible movement of fronts away from the ‘fourth debate' between rationalism and reflectivism towards one of theory against empiricism. To help this new agenda...

  16. Computational invariant theory

    CERN Document Server

    Derksen, Harm

    2015-01-01

    This book is about the computational aspects of invariant theory. Of central interest is the question how the invariant ring of a given group action can be calculated. Algorithms for this purpose form the main pillars around which the book is built. There are two introductory chapters, one on Gröbner basis methods and one on the basic concepts of invariant theory, which prepare the ground for the algorithms. Then algorithms for computing invariants of finite and reductive groups are discussed. Particular emphasis lies on interrelations between structural properties of invariant rings and computational methods. Finally, the book contains a chapter on applications of invariant theory, covering fields as disparate as graph theory, coding theory, dynamical systems, and computer vision. The book is intended for postgraduate students as well as researchers in geometry, computer algebra, and, of course, invariant theory. The text is enriched with numerous explicit examples which illustrate the theory and should be ...

  17. Politics, Security, Theory

    DEFF Research Database (Denmark)

    Wæver, Ole

    2011-01-01

    This article outlines three ways of analysing the ‘politics of securitization’, emphasizing an often-overlooked form of politics practised through theory design. The structure and nature of a theory can have systematic political implications. Analysis of this ‘politics of securitization’ is distinct from both the study of political practices of securitization and explorations of competing concepts of politics among security theories. It means tracking what kinds of analysis the theory can produce and whether such analysis systematically impacts real-life political struggles. Securitization theory is found to ‘act politically’ through three structural features that systematically shape the political effects of using the theory. The article further discusses – on the basis of the preceding articles in the special issue – three emerging debates around securitization theory: ethics ...

  18. Cubical version of combinatorial differential forms

    DEFF Research Database (Denmark)

    Kock, Anders

    2010-01-01

    The theory of combinatorial differential forms is usually presented in simplicial terms. We present here a cubical version; it depends on the possibility of forming affine combinations of mutual neighbour points in a manifold, in the context of synthetic differential geometry.

  19. Entire cyclic cohomology and modular theory

    International Nuclear Information System (INIS)

    Stoytchev, O.Ts.

    1992-04-01

    We display a close relationship between C* and W*-dynamical systems with KMS states on them and entire cyclic cohomology theory. We construct a character form which assigns to each such system (A, α, R) an even entire cyclic cocycle of the subalgebra A of differentiable (with respect to the given automorphism group) elements of A. We argue that the most interesting case is the von Neumann algebra one, where the automorphism group is determined uniquely by the faithful normal state on the algebra (the modular group) and where the character may provide important information about the algebra. (author). 11 refs

  20. Analytic theory of the gyrotron

    International Nuclear Information System (INIS)

    Lentini, P.J.

    1989-06-01

    An analytic theory is derived for a gyrotron operating in the linear gain regime. The gyrotron is a coherent source of microwave and millimeter wave radiation based on an electron beam emitting at cyclotron resonance Ω in a strong, uniform magnetic field. Relativistic equations of motion and first order perturbation theory are used. Results are obtained in both laboratory and normalized variables. An expression for cavity threshold gain is derived in the linear regime. An analytic expression for the electron phase angle in momentum space shows that the effect of the RF field is to form bunches: the exit phase equals the unperturbed transit phase plus a correction term which varies as the sine of the input phase angle. The expression for the phase angle is plotted, and bunching effects in and out of phase (0 and -π) with respect to the RF field are evident for detunings leading to gain and absorption, respectively. For exact resonance (field frequency ω = Ω), a bunch also forms at a phase of -π/2. This beam yields the same energy exchange with the RF field as an unbunched (nonrelativistic) beam. 6 refs., 10 figs
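
    A schematic rendering of the bunching relation described above; the symbols Δ (unperturbed transit phase) and q (small bunching amplitude proportional to the RF field) are placeholders, not the paper's notation:

        \theta_{\mathrm{out}}(\theta_{\mathrm{in}}) \;\approx\; \theta_{\mathrm{in}} + \Delta + q\,\sin\theta_{\mathrm{in}}

    Bunches forming in or out of phase with the RF field then correspond to gain or absorption, with the bunch at -π/2 appearing for exact resonance ω = Ω.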

  1. Proposed experimental test of an alternative electrodynamic theory of superconductors

    Energy Technology Data Exchange (ETDEWEB)

    Hirsch, J.E., E-mail: jhirsch@ucsd.edu

    2015-01-15

    Highlights: • A new experimental test of electric screening in superconductors is proposed. • The electric screening length is predicted to be much larger than in normal metals. • The reason this was not seen in earlier experiments is explained. • This is not predicted by the conventional BCS theory of superconductivity. - Abstract: An alternative form of London’s electrodynamic theory of superconductors predicts that the electrostatic screening length is the same as the magnetic penetration depth. We argue that experiments performed to date do not rule out this alternative formulation and propose an experiment to test it. Experimental evidence in its favor would have fundamental implications for the understanding of superconductivity.

  2. The energy spectrum of electromagnetic normal modes in dissipative media: modes between two metal half spaces

    International Nuclear Information System (INIS)

    Sernelius, Bo E

    2008-01-01

    The energy spectrum of electromagnetic normal modes plays a central role in the theory of the van der Waals and Casimir interaction. Here we study the modes in connection with the van der Waals interaction between two metal half spaces. Neglecting dissipation leads to distinct normal modes with real-valued frequencies. Including dissipation seems to have the effect that these distinct modes move away from the real axis into the complex frequency plane. The summation of the zero-point energies of these modes renders a complex-valued result. Using contour integration, based on the generalized argument principle, gives a different, real-valued result. We resolve this contradiction and show that the spectrum of true normal modes forms a continuum with real frequencies.
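
    A schematic form of the zero-point summation via the generalized argument principle referred to above; here f(ω) stands for the mode-condition function of the two-half-space geometry, whose explicit form is not reproduced from the record:

        E_0 \;=\; \sum_k \tfrac{1}{2}\hbar\omega_k \;=\; \frac{1}{2\pi i}\oint_C \tfrac{1}{2}\hbar\omega\,\frac{d}{d\omega}\ln f(\omega)\,d\omega

    where the contour C encloses the zeros of f, i.e. the normal-mode frequencies.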

  3. Monitoring the normal body

    DEFF Research Database (Denmark)

    Nissen, Nina Konstantin; Holm, Lotte; Baarts, Charlotte

    2015-01-01

    … provides us with knowledge about how to prevent future overweight or obesity. This paper investigates body size ideals and monitoring practices among normal-weight and moderately overweight people. Methods: The study is based on in-depth interviews combined with observations. 24 participants were recruited by strategic sampling based on self-reported BMI 18.5-29.9 kg/m2 and socio-demographic factors. Inductive analysis was conducted. Results: Normal-weight and moderately overweight people have clear ideals for their body size. Despite being normal weight or close to this, they construct a variety of practices for monitoring their bodies based on different kinds of calculations of weight and body size, observations of body shape, and measurements of bodily firmness. Biometric measurements are familiar to them as are health authorities' recommendations. Despite not belonging to an extreme BMI category …

  4. Compressed normalized block difference for object tracking

    Science.gov (United States)

    Gao, Yun; Zhang, Dengzhuo; Cai, Donglan; Zhou, Hao; Lan, Ge

    2018-04-01

    Feature extraction is very important for robust and real-time tracking. Compressive sensing has provided technical support for real-time feature extraction. However, all existing compressive trackers were based on the compressed Haar-like feature, and how to compress other excellent high-dimensional features is worth researching. In this paper, a novel compressed normalized block difference (CNBD) feature is proposed. To resist noise effectively in a high-dimensional normalized pixel difference (NPD) feature, the normalized block difference feature extends the two pixels in the original NPD formula to two blocks. A CNBD feature is obtained by compressing a normalized block difference feature based on compressive sensing theory, with a sparse random Gaussian matrix as the measurement matrix. Comparative experiments with 7 trackers on 20 challenging sequences showed that the tracker based on the CNBD feature performs better than the other trackers, especially the FCT tracker based on the compressed Haar-like feature, in terms of AUC, SR and Precision.
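
    A minimal sketch of the two ingredients named in the abstract, a block-based normalized difference and a compressive projection; the block geometry, the (a - b)/(a + b) form, the matrix density and all sizes are illustrative assumptions, not the authors' implementation:

        import numpy as np

        def sparse_gaussian_matrix(m, n, density=0.1, seed=0):
            # Sparse random Gaussian measurement matrix for the compressive
            # projection (the density value is an illustrative choice).
            rng = np.random.default_rng(seed)
            mask = rng.random((m, n)) < density
            return np.where(mask, rng.standard_normal((m, n)), 0.0)

        def block_mean(img, y, x, h, w):
            # Mean intensity of a rectangular block.
            return img[y:y + h, x:x + w].mean()

        def normalized_block_difference(img, block_a, block_b):
            # NPD-style feature with blocks instead of single pixels: (a-b)/(a+b).
            a = block_mean(img, *block_a)
            b = block_mean(img, *block_b)
            return 0.0 if a + b == 0 else (a - b) / (a + b)

        # Build a hypothetical high-dimensional NBD vector and compress it.
        rng = np.random.default_rng(1)
        img = rng.integers(0, 256, (64, 64)).astype(float)
        blocks = [((rng.integers(0, 56), rng.integers(0, 56), 8, 8),
                   (rng.integers(0, 56), rng.integers(0, 56), 8, 8)) for _ in range(500)]
        nbd = np.array([normalized_block_difference(img, a, b) for a, b in blocks])
        phi = sparse_gaussian_matrix(50, nbd.size)   # project 500-D -> 50-D
        cnbd = phi @ nbd
        print(cnbd.shape)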

  5. A shell approach for fibrous reinforcement forming simulations

    Science.gov (United States)

    Liang, B.; Colmars, J.; Boisse, P.

    2018-05-01

    Because of the slippage between fibers, the basic assumptions of classical plate and shell theories are not satisfied by fiber reinforcements during forming. Nevertheless, simulations of reinforcement forming use shell finite elements when wrinkle development is important. A shell formulation is proposed for the forming simulations of continuous fiber reinforcements. The large tensile stiffness leads to quasi-inextensibility in the fiber directions. The fiber bending stiffness determines the curvature of the reinforcement. The calculation of the tensile and bending virtual works is based on the precise geometry of the single fiber. Simulations and experiments are compared for different reinforcements. It is shown that the proposed fibrous shell approach correctly simulates not only the deflections but also the rotations of the through-thickness material normals.

  6. Between Theory and Practice

    DEFF Research Database (Denmark)

    Dindler, Christian; Dalsgaard, Peter

    2014-01-01

    We present the notion of ‘bridging concepts’ as a particular form of intermediary knowledge in HCI research, residing between theory and practice. We argue that bridging concepts address the challenge of facilitating exchange between theory and practice in HCI, and we compare it to other … These constituents specify how bridging concepts, as a form of knowledge, are accountable to both theory and practice. We present an analysis of the concept of ‘peepholes’ as an example of a bridging concept aimed at spurring user curiosity and engagement.

  7. Normal modified stable processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, N.

    2002-01-01

    This paper discusses two classes of distributions, and stochastic processes derived from them: modified stable (MS) laws and normal modified stable (NMS) laws. This extends corresponding results for the generalised inverse Gaussian (GIG) and generalised hyperbolic (GH) or normal generalised inverse Gaussian (NGIG) laws. The wider framework thus established provides, in particular, for added flexibility in the modelling of the dynamics of financial time series, of importance especially as regards OU based stochastic volatility models for equities. In the special case of the tempered stable OU process …

  8. The normal holonomy group

    International Nuclear Information System (INIS)

    Olmos, C.

    1990-05-01

    The restricted holonomy group of a Riemannian manifold is a compact Lie group and its representation on the tangent space is a product of irreducible representations and a trivial one. Each one of the non-trivial factors is either an orthogonal representation of a connected compact Lie group which acts transitively on the unit sphere or it is the isotropy representation of a single Riemannian symmetric space of rank ≥ 2. We prove that all these properties are also true for the representation on the normal space of the restricted normal holonomy group of any submanifold of a space of constant curvature. 4 refs

  9. Conference L-Functions and Automorphic Forms

    CERN Document Server

    Kohnen, Winfried; LAF

    2017-01-01

    This book presents a collection of carefully refereed research articles and lecture notes stemming from the Conference "Automorphic Forms and L-Functions", held at the University of Heidelberg in 2016.  The theory of automorphic forms and their associated L-functions is one of the central research areas in modern number theory, linking number theory, arithmetic geometry, representation theory, and complex analysis in many profound ways.  The 19 papers cover a wide range of topics within the scope of the conference, including automorphic L-functions and their special values, p-adic modular forms, Eisenstein series, Borcherds products, automorphic periods, and many more.

  10. New directions in Dirichlet forms

    CERN Document Server

    Jost, Jürgen; Mosco, Umberto; Rockner, Michael; Sturm, Karl-Theodor

    1998-01-01

    The theory of Dirichlet forms brings together methods and insights from the calculus of variations, stochastic analysis, partial differential and difference equations, potential theory, Riemannian geometry and more. This book features contributions by leading experts and provides up-to-date, authoritative accounts on exciting developments in the field and on new research perspectives. Topics covered include the following: stochastic analysis on configuration spaces, specifically a mathematically rigorous approach to the stochastic dynamics of Gibbs measures and infinite interacting particle systems; subelliptic PDE, homogenization, and fractals; geometric aspects of Dirichlet forms on metric spaces and function theory on such spaces; generalized harmonic maps as nonlinear analogues of Dirichlet forms, with an emphasis on non-locally compact situations; and a stochastic approach based on Brownian motion to harmonic maps and their regularity. Various new connections between the topics are featured, and it is de...

  11. Chemical forms of radioiodine

    International Nuclear Information System (INIS)

    Tachikawa, Enzo

    1979-01-01

    Release of radioiodine built up during reactor operations presents a potential problem from the standpoint of environmental safety. Among the chemical forms of radioiodine, organic iodides pose, depending upon the circumstances, the most serious problem because of the difficulty of trapping them and because of their stability compared to other chemical forms. Furthermore, pellet-cladding interaction (PCI) fuel failures in LWR fuel rods are believed to be stress corrosion cracks caused by an embrittling fission product species, radioiodine. To deal with these problems, knowledge is required on the chemical behavior of radioiodine in and out of fuels, as well as on its release behavior from fuels. Here a brief review is given of these aspects, aiming at clearing up the questions that remain unanswered. The data seem to indicate that radioiodine exists in a combined form in fuels. Upon heating slightly irradiated fuels, the iodine atoms are released in a chemical form associated with uranium atoms. Experiments, however, are needed with specimens of higher burnup, where interactions of radioiodine with metallic fission products could be favored. The dominant release mechanism of radioiodine under normal operating temperatures will be diffusion to grain boundaries leading to open surfaces. Radiation-induced internal traps, however, alter the rate of diffusion significantly. The carbon sources of the organic iodides formed under various conditions and their formation mechanisms have also been considered. (author)

  12. Modular forms a classical approach

    CERN Document Server

    Cohen, Henri

    2017-01-01

    The theory of modular forms is a fundamental tool used in many areas of mathematics and physics. It is also a very concrete and "fun" subject in itself and abounds with an amazing number of surprising identities. This comprehensive textbook, which includes numerous exercises, aims to give a complete picture of the classical aspects of the subject, with an emphasis on explicit formulas. After a number of motivating examples such as elliptic functions and theta functions, the modular group, its subgroups, and general aspects of holomorphic and nonholomorphic modular forms are explained, with an emphasis on explicit examples. The heart of the book is the classical theory developed by Hecke and continued up to the Atkin-Lehner-Li theory of newforms and including the theory of Eisenstein series, Rankin-Selberg theory, and a more general theory of theta series including the Weil representation. The final chapter explores in some detail more general types of modular forms such as half-integral weight, Hilbert, Jacob...

  13. Normal and Abnormal Behavior in Early Childhood

    OpenAIRE

    Spinner, Miriam R.

    1981-01-01

    Evaluation of normal and abnormal behavior in the period to three years of age involves many variables. Parental attitudes, determined by many factors such as previous childrearing experience, the bonding process, parental psychological status and parental temperament, often influence the labeling of behavior as normal or abnormal. This article describes the forms of crying, sleep and wakefulness, and affective responses from infancy to three years of age.

  14. Advancing Normal Birth: Organizations, Goals, and Research

    OpenAIRE

    Hotelling, Barbara A.; Humenick, Sharron S.

    2005-01-01

    In this column, the support for advancing normal birth is summarized, based on a comparison of the goals of Healthy People 2010, Lamaze International, the Coalition for Improving Maternity Services, and the midwifery model of care. Research abstracts are presented to provide evidence that the midwifery model of care safely and economically advances normal birth. Rates of intervention experienced, as reported in the Listening to Mothers survey, are compared to the forms of care recommended by ...

  15. Design Theory Projectability

    DEFF Research Database (Denmark)

    Baskerville, Richard; Pries-Heje, Jan

    2014-01-01

    Technological knowledge has been characterized as having a scope that is specific to a particular problem. However, the information systems community is exploring forms of design science research that provide a promising avenue to technological knowledge with broader scope: design theories. Because design science research is materially prescriptive, it requires a different perspective in developing the breadth of applications of design theories. In this paper we propose different concepts that embody forms of general technological knowledge. The concept of projectability, developed originally as a means of distinguishing realized generalizations from unrealized generalizations, helps explain how design theories, being prescriptive, possess a different form of applicability. The concept of entrenchment describes the use of a theory in many projections. Together these concepts provide a means …

  16. Normality in Analytical Psychology

    Directory of Open Access Journals (Sweden)

    Steve Myers

    2013-11-01

    Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity.

  17. Medically-enhanced normality

    DEFF Research Database (Denmark)

    Møldrup, Claus; Traulsen, Janine Morgall; Almarsdóttir, Anna Birna

    2003-01-01

    Objective: To consider public perspectives on the use of medicines for non-medical purposes, a usage called medically-enhanced normality (MEN). Method: Examples from the literature were combined with empirical data derived from two Danish research projects: a Delphi internet study and a Telebus...

  18. The Normal Fetal Pancreas.

    Science.gov (United States)

    Kivilevitch, Zvi; Achiron, Reuven; Perlman, Sharon; Gilboa, Yinon

    2017-10-01

    The aim of the study was to assess the sonographic feasibility of measuring the fetal pancreas and its normal development throughout pregnancy. We conducted a cross-sectional prospective study between 19 and 36 weeks' gestation. The study included singleton pregnancies with normal pregnancy follow-up. The pancreas circumference was measured. The first 90 cases were tested to assess feasibility. Two hundred ninety-seven fetuses of nondiabetic mothers were recruited during a 3-year period. The overall satisfactory visualization rate was 61.6%. The intraobserver and interobserver variability had high intraclass correlation coefficients of 0.964 and 0.967, respectively. A cubic polynomial regression best described the correlation of pancreas circumference with gestational age (r = 0.744; P …). Pancreas circumference percentiles for each week of gestation were calculated. During the study period, we detected 2 cases with overgrowth syndrome and 1 case with an annular pancreas. In this study, we assessed the feasibility of sonography for measuring the fetal pancreas and established a normal reference range for the fetal pancreas circumference throughout pregnancy. This database can be helpful when investigating fetomaternal disorders that can involve its normal development. © 2017 by the American Institute of Ultrasound in Medicine.
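
    A sketch of the reference-range construction described above (cubic polynomial regression plus week-by-week percentiles); the data below are synthetic placeholders and the constant-residual-spread simplification is ours, not the study's method:

        import numpy as np

        # Hypothetical data: gestational age (weeks) and pancreas circumference (mm).
        ga = np.random.default_rng(0).uniform(19, 36, 297)
        circ = 2.2 * ga + 0.001 * ga**3 + np.random.default_rng(1).normal(0, 4, ga.size)

        # Cubic polynomial regression of circumference on gestational age.
        fit = np.poly1d(np.polyfit(ga, circ, deg=3))

        # Week-by-week reference percentiles from the residual spread
        # (simplified: assumes roughly constant residual SD across gestation).
        resid_sd = np.std(circ - fit(ga))
        for week in range(19, 37):
            mu = fit(week)
            p5, p95 = mu - 1.645 * resid_sd, mu + 1.645 * resid_sd
            print(f"{week} wk: median~{mu:.1f} mm, 5th~{p5:.1f}, 95th~{p95:.1f}")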

  19. Moral Exemplars in Theory and Practice

    Science.gov (United States)

    Zagzebski, Linda

    2013-01-01

    In this article I outline an original form of ethical theory that I call exemplarist virtue theory. The theory is intended to serve the philosophical purposes of a comprehensive moral theory, but it is also intended to serve the practical purpose of moral education by structuring the theory around a motivating emotion--the emotion of admiration.…

  20. Normal radiographic findings. 4. act. ed.

    International Nuclear Information System (INIS)

    Moeller, T.B.

    2003-01-01

    This book can serve the reader in three ways: First, it presents normal findings for all radiographic techniques, including contrast-medium (KM) studies. Important data which are criteria of normal findings are indicated directly in the pictures and are also explained in full text and in summary form. Secondly, it teaches the systematics of interpreting a picture - how to look at it, what structures to regard in what order, and what to look for in particular. Checklists are presented in each case. Thirdly, findings are formulated in accordance with the image analysis procedure. All criteria of normal findings are defined in these formulations, which make them an important didactic element. (orig.)

  1. Normal radiographic findings. 4. act. ed.; Roentgennormalbefunde

    Energy Technology Data Exchange (ETDEWEB)

    Moeller, T.B. [Gemeinschaftspraxis fuer Radiologie und Nuklearmedizin, Dillingen (Germany)

    2003-07-01

    This book can serve the reader in three ways: First, it presents normal findings for all radiographic techniques, including contrast-medium (KM) studies. Important data which are criteria of normal findings are indicated directly in the pictures and are also explained in full text and in summary form. Secondly, it teaches the systematics of interpreting a picture - how to look at it, what structures to regard in what order, and what to look for in particular. Checklists are presented in each case. Thirdly, findings are formulated in accordance with the image analysis procedure. All criteria of normal findings are defined in these formulations, which make them an important didactic element. (orig.)

  2. Contributor Form

    Directory of Open Access Journals (Sweden)

    Chief Editor

    2014-09-01

    … to produce preprints or reprints and translate into languages other than English for sale or free distribution; and (4) the right to republish the work in a collection of articles in any other mechanical or electronic format. We give the rights to the corresponding author to make necessary changes as per the request of the journal, do the rest of the correspondence on our behalf, and he/she will act as the guarantor for the manuscript on our behalf. All persons who have made substantial contributions to the work reported in the manuscript, but who are not contributors, are named in the Acknowledgment and have given me/us their written permission to be named. If I/we do not include an Acknowledgment, that means I/we have not received substantial contributions from non-contributors and no contributor has been omitted. [Signature table: S No | Authors' Names | Contribution (IJCME guidelines: (1) substantial contributions to conception and design, acquisition of data, or analysis and interpretation of data; (2) drafting the article or revising it critically for important intellectual content; and (3) final approval of the version to be published; authors should meet conditions (1), (2), and (3)) | Signature | Date] Note: All the authors are required to sign independently in this form in the sequence given above. In case an author has left the institution/country and his/her whereabouts are not known, the senior author may sign on his/her behalf, taking the responsibility. No addition/deletion or any change in the sequence of the authorship will be permissible at a later stage without valid reasons and permission of the Editor. If the authorship is contested at any stage, the article will be either returned or will not be processed for publication till the issue is solved. Maximum up to 4 authors for short communication and up to 6 authors for original article.

  3. Contributors Form

    Directory of Open Access Journals (Sweden)

    Chief Editor

    2016-06-01

    … to produce preprints or reprints and translate into languages other than English for sale or free distribution; and (4) the right to republish the work in a collection of articles in any other mechanical or electronic format. We give the rights to the corresponding author to make necessary changes as per the request of the journal, do the rest of the correspondence on our behalf, and he/she will act as the guarantor for the manuscript on our behalf. All persons who have made substantial contributions to the work reported in the manuscript, but who are not contributors, are named in the Acknowledgment and have given me/us their written permission to be named. If I/we do not include an Acknowledgment, that means I/we have not received substantial contributions from non-contributors and no contributor has been omitted. [Signature table: S No | Authors' Names | Contribution (IJCME guidelines: (1) substantial contributions to conception and design, acquisition of data, or analysis and interpretation of data; (2) drafting the article or revising it critically for important intellectual content; and (3) final approval of the version to be published; authors should meet conditions (1), (2), and (3)) | Signature | Date] Note: All the authors are required to sign independently in this form in the sequence given above. In case an author has left the institution/country and his/her whereabouts are not known, the senior author may sign on his/her behalf, taking the responsibility. No addition/deletion or any change in the sequence of the authorship will be permissible at a later stage without valid reasons and permission of the Editor. If the authorship is contested at any stage, the article will be either returned or will not be processed for publication till the issue is solved. Maximum up to 4 authors for short communication and up to 6 authors for original article.

  4. Stationary scattering theory

    International Nuclear Information System (INIS)

    Combes, J.M.

    1980-10-01

    A complementary approach to the time-dependent scattering theory for one-body Schroedinger operators is presented. The stationary theory is concerned with objects of quantum theory like scattering waves and amplitudes. In the more recent abstract stationary theory, some generalized form of the Lippmann-Schwinger equation plays the basic role. Solving this equation leads to a linear map between generalized eigenfunctions of the perturbed and unperturbed operators. This map is the section at fixed energy of the wave operator from the time-dependent theory. Although the radiation condition does not appear explicitly in this formulation, it can be shown to hold a posteriori in a variety of situations, thus restoring the link with physical theories.
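
    For concreteness, the textbook form of the equation referred to above (the abstract stationary theory works with a generalized version of it):

        |\psi_k^{\pm}\rangle \;=\; |\varphi_k\rangle \;+\; G_0(E_k \pm i0)\,V\,|\psi_k^{\pm}\rangle, \qquad G_0(z) = (z - H_0)^{-1}

    Solving it maps the unperturbed generalized eigenfunctions φ_k to the perturbed ones ψ_k^±, i.e. it gives the fixed-energy section of the wave operators from the time-dependent theory.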

  5. Confectionery-based dose forms.

    Science.gov (United States)

    Tangso, Kristian J; Ho, Quy Phuong; Boyd, Ben J

    2015-01-01

    Conventional dosage forms such as tablets, capsules and syrups are prescribed in the normal course of practice. However, concerns about patient preferences and market demands have given rise to the exploration of novel unconventional dosage forms. Among these, confectionery-based dose forms have strong potential to overcome compliance problems. This report will review the availability of these unconventional dose forms used in treating the oral cavity and for systemic drug delivery, with a focus on medicated chewing gums, medicated lollipops, and oral bioadhesive devices. The aim is to stimulate increased interest in the opportunities for innovative new products that are available to formulators in this field, particularly for atypical patient populations.

  6. Ring Theory

    CERN Document Server

    Jara, Pascual; Torrecillas, Blas

    1988-01-01

    The papers in this proceedings volume are selected research papers in different areas of ring theory, including graded rings, differential operator rings, K-theory of noetherian rings, torsion theory, regular rings, cohomology of algebras, local cohomology of noncommutative rings. The book will be important for mathematicians active in research in ring theory.

  7. Game theory

    DEFF Research Database (Denmark)

    Hendricks, Vincent F.

    Game Theory is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in game theory. We hear their views on game theory, its aim, scope, use, the future direction of game theory and how their work fits in these respects....

  8. Evaluating Transfer Entropy for Normal and y-Order Normal Distributions

    Czech Academy of Sciences Publication Activity Database

    Hlaváčková-Schindler, Kateřina; Toulias, T. L.; Kitsos, C. P.

    2016-01-01

    Roč. 17, č. 5 (2016), s. 1-20 ISSN 2231-0851 Institutional support: RVO:67985556 Keywords : Transfer entropy * time series * Kullback-Leibler divergence * causality * generalized normal distribution Subject RIV: BC - Control Systems Theory http://library.utia.cas.cz/separaty/2016/AS/hlavackova-schindler-0461261.pdf
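
    The record lists the Kullback-Leibler divergence among its keywords; as a small illustration, here is its closed form for two univariate normal distributions (the plain Gaussian special case only, not the paper's γ-order generalized normal formulas):

        import math

        def kl_normal(mu1, sd1, mu2, sd2):
            # KL( N(mu1, sd1^2) || N(mu2, sd2^2) ), closed form for univariate normals.
            return math.log(sd2 / sd1) + (sd1**2 + (mu1 - mu2)**2) / (2 * sd2**2) - 0.5

        print(kl_normal(0.0, 1.0, 0.0, 2.0))   # ~0.318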

  9. Idiopathic Normal Pressure Hydrocephalus

    Directory of Open Access Journals (Sweden)

    Basant R. Nassar BS

    2016-04-01

    Idiopathic normal pressure hydrocephalus (iNPH) is a potentially reversible neurodegenerative disease commonly characterized by a triad of dementia, gait disturbance, and urinary disturbance. Advancements in diagnosis and treatment have aided in properly identifying and improving symptoms in patients. However, a large proportion of iNPH patients remain either undiagnosed or misdiagnosed. Using the PubMed search engine with the keywords “normal pressure hydrocephalus,” “diagnosis,” “shunt treatment,” “biomarkers,” “gait disturbances,” “cognitive function,” “neuropsychology,” “imaging,” and “pathogenesis,” articles were obtained for this review. The majority of the articles were retrieved from the past 10 years. The purpose of this review article is to aid general practitioners in further understanding current findings on the pathogenesis, diagnosis, and treatment of iNPH.

  10. Normal Weight Dyslipidemia

    DEFF Research Database (Denmark)

    Ipsen, David Hojland; Tveden-Nyborg, Pernille; Lykkesfeldt, Jens

    2016-01-01

    Objective: The liver coordinates lipid metabolism and may play a vital role in the development of dyslipidemia, even in the absence of obesity. Normal weight dyslipidemia (NWD) and patients with nonalcoholic fatty liver disease (NAFLD) who do not have obesity constitute a unique subset of individuals characterized by dyslipidemia and metabolic deterioration. This review examined the available literature on the role of the liver in dyslipidemia and the metabolic characteristics of patients with NAFLD who do not have obesity. Methods: PubMed was searched using the following keywords: nonobese, dyslipidemia, NAFLD, NWD, liver, and metabolically obese/unhealthy normal weight. Additionally, article bibliographies were screened, and relevant citations were retrieved. Studies were excluded if they had not measured relevant biomarkers of dyslipidemia. Results: NWD and NAFLD without obesity share a similar …

  11. Introduction to gauge field theory

    International Nuclear Information System (INIS)

    Bailin, D.; Love, A.

    1986-01-01

    This book provides a postgraduate level introduction to gauge field theory entirely from a path integral standpoint without any reliance on the more traditional method of canonical quantisation. The ideas are developed by quantising the self-interacting scalar field theory, and are then used to deal with all the gauge field theories relevant to particle physics, quantum electrodynamics, quantum chromodynamics, electroweak theory, grand unified theories, and field theories at non-zero temperature. The use of these theories to make precise experimental predictions requires the development of the renormalised theories. The book assumes a knowledge of relativistic quantum mechanics, but not of quantum field theory. The topics covered form a foundation for a knowledge of modern relativistic quantum field theory, providing a comprehensive coverage with emphasis on the details of actual calculations rather than the phenomenology of the applications

  12. Zero cosmological constant from normalized general relativity

    International Nuclear Information System (INIS)

    Davidson, Aharon; Rubin, Shimon

    2009-01-01

    Normalizing the Einstein-Hilbert action by the volume functional makes the theory invariant under constant shifts in the Lagrangian. The associated field equations then resemble unimodular gravity whose otherwise arbitrary cosmological constant is now determined as a Machian universal average. We prove that an empty space-time is necessarily Ricci tensor flat, and demonstrate the vanishing of the cosmological constant within the scalar field paradigm. The cosmological analysis, carried out at the mini-superspace level, reveals a vanishing cosmological constant for a universe which cannot be closed as long as gravity is attractive. Finally, we give an example of a normalized theory of gravity which does give rise to a non-zero cosmological constant.
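
    A schematic form of the normalization described above; units, signs and the 1/16πG factor are suppressed, and the matter Lagrangian L_m and metric conventions are ours, not the paper's:

        \tilde S[g,\phi] \;=\; \frac{\int d^4x\,\sqrt{-g}\,\bigl(R+\mathcal{L}_m\bigr)}{\int d^4x\,\sqrt{-g}}

    Shifting L_m → L_m + const changes S̃ only by that constant, so the field equations are unaffected; as the abstract states, the resulting equations resemble unimodular gravity with the would-be cosmological constant fixed as a space-time average.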

  13. Ethics and "normal birth".

    Science.gov (United States)

    Lyerly, Anne Drapkin

    2012-12-01

    The concept of "normal birth" has been promoted as ideal by several international organizations, although debate about its meaning is ongoing. In this article, I examine the concept of normalcy to explore its ethical implications and raise a trio of concerns. First, in its emphasis on nonuse of technology as a goal, the concept of normalcy may marginalize women for whom medical intervention is necessary or beneficial. Second, in its emphasis on birth as a socially meaningful event, the mantra of normalcy may unintentionally avert attention to meaning in medically complicated births. Third, the emphasis on birth as a normal and healthy event may be a contributor to the long-standing tolerance for the dearth of evidence guiding the treatment of illness during pregnancy and the failure to responsibly and productively engage pregnant women in health research. Given these concerns, it is worth debating not just what "normal birth" means, but whether the term as an ideal earns its keep. © 2012, Copyright the Authors Journal compilation © 2012, Wiley Periodicals, Inc.

  14. Gauge field theory

    International Nuclear Information System (INIS)

    Aref'eva, I.Ya.; Slavnov, A.A.

    1981-01-01

    This lecture is devoted to the discussion of gauge field theory, which makes it possible to describe all the interactions of elementary particles from a single point of view. The authors use electrodynamics and Einstein's theory of gravity to search for a renormalization group fixing the form of the Lagrangian. It is shown that gauge invariance, supplemented by the requirement of the minimum number of arbitrary elements in the Lagrangian, fixes unambiguously the form of the electromagnetic interaction. The generalization of this construction to more complicated charge spaces results in the Yang-Mills theory. The interaction form in this theory is fixed by the relativity principle in the charge space. A quantization scheme for the Yang-Mills fields through the explicit separation of true dynamical variables is suggested. A convenient relativistically invariant diagram technique for the calculation of the generating functional for the Green functions is described. The generalized Ward identities are obtained and a procedure for the elimination of ultraviolet and infrared divergences is carried out. Within the framework of QCD (quantum chromodynamics) the phenomenon of asymptotic freedom, the most successful prediction of the gauge theory of strong interactions, is described. Methods of working with QCD outside the framework of perturbation theory in the coupling constant are described. QCD is presented as a single theory possessing both asymptotic freedom and quark confinement [ru]

  15. String theory

    International Nuclear Information System (INIS)

    Chan Hongmo.

    1987-10-01

    The paper traces the development of the String Theory, and was presented at Professor Sir Rudolf Peierls' 80th Birthday Symposium. The String theory is discussed with respect to the interaction of strings, the inclusion of both gauge theory and gravitation, inconsistencies in the theory, and the role of space-time. The physical principles underlying string theory are also outlined. (U.K.)

  16. Normal mode analysis for linear resistive magnetohydrodynamics

    International Nuclear Information System (INIS)

    Kerner, W.; Lerbinger, K.; Gruber, R.; Tsunematsu, T.

    1984-10-01

    The compressible, resistive MHD equations are linearized around an equilibrium with cylindrical symmetry and solved numerically as a complex eigenvalue problem. This normal mode code allows one to solve for very small resistivity, η ∝ 10⁻¹⁰. The scaling of growth rates and layer width agrees very well with analytical theory. In particular, both the influence of current and pressure on the instabilities is studied in detail, and the effect of resistivity on the ideally unstable internal kink is analyzed. (orig.)
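
    After discretization, the normal-mode ansatz ξ ~ exp(λt) turns the linearized equations into a complex generalized eigenvalue problem A ξ = λ B ξ; a generic sketch of that step is below (the matrices are random stand-ins, not the discretized resistive-MHD operators of the paper):

        import numpy as np
        from scipy.linalg import eig

        n = 50
        rng = np.random.default_rng(0)
        A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
        B = np.eye(n) + 0.01 * rng.standard_normal((n, n))

        # Complex generalized eigenvalue problem A xi = lambda B xi.
        eigvals, eigvecs = eig(A, B)
        growth_rates = eigvals.real        # Re(lambda) > 0 signals instability
        print(growth_rates.max())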

  17. 10 CFR 71.71 - Normal conditions of transport.

    Science.gov (United States)

    2010-01-01

    § 71.71 Normal conditions of transport (10 CFR Part 71 — Package, Special Form, and LSA-III Tests). (a) Evaluation. Evaluation of each package design under normal conditions of transport must include a determination of the effect on …

  18. Developing Visualization Support System for Teaching/Learning Database Normalization

    Science.gov (United States)

    Folorunso, Olusegun; Akinwale, AdioTaofeek

    2010-01-01

    Purpose: In tertiary institutions, some students find it hard to learn database design theory, in particular, database normalization. The purpose of this paper is to develop a visualization tool to give students an interactive hands-on experience in the database normalization process. Design/methodology/approach: The model-view-controller architecture…
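
    A minimal sketch of the kind of check such a tool can visualize: attribute closures under functional dependencies and the resulting BCNF violations. The schema and dependencies below are invented for illustration; the paper's tool itself is not reproduced here:

        def closure(attrs, fds):
            # Attribute closure of `attrs` under functional dependencies `fds`,
            # where each FD is a (lhs_set, rhs_set) pair.
            result = set(attrs)
            changed = True
            while changed:
                changed = False
                for lhs, rhs in fds:
                    if lhs <= result and not rhs <= result:
                        result |= rhs
                        changed = True
            return result

        def bcnf_violations(relation, fds):
            # Non-trivial FDs X -> Y whose left side is not a superkey violate BCNF.
            return [(lhs, rhs) for lhs, rhs in fds
                    if not rhs <= lhs and closure(lhs, fds) != set(relation)]

        # Toy schema: R(student, course, instructor), each instructor teaches one course.
        R = {"student", "course", "instructor"}
        fds = [(frozenset({"student", "course"}), frozenset({"instructor"})),
               (frozenset({"instructor"}), frozenset({"course"}))]
        print(bcnf_violations(R, fds))   # instructor -> course is flagged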

  19. Unipacs: A Language Arts Curriculum Theory, Abstractions, Statements in Context, and Language Change; And Instructional Packets: Symbol-Referent, Denotation and Connotation, Appropriateness, Dialect, Occasion, and Form and Media.

    Science.gov (United States)

    Madison Public Schools, WI.

    Based on the belief that the most appropriate focus of a language arts curriculum is the process and content of communication, these several unipacs (instructional packets) explore some essential elements of communication which should be incorporated into a curricular theory: (1) abstraction, which is the assertion that words may be classified as…

  20. Data structures theory and practice

    CERN Document Server

    Berztiss, A T

    1971-01-01

    Computer Science and Applied Mathematics: Data Structures: Theory and Practice focuses on the processes, methodologies, principles, and approaches involved in data structures, including algorithms, decision trees, Boolean functions, lattices, and matrices. The book first offers information on set theory, functions, and relations, and graph theory. Discussions focus on linear formulas of digraphs, isomorphism of digraphs, basic definitions in the theory of digraphs, Boolean functions and forms, lattices, indexed sets, algebra of sets, and ordered pairs and related concepts. The text then examines

  1. A simplified four-unknown shear and normal deformation theory

    Indian Academy of Sciences (India)

    The in-plane longitudinal stress σ̄₁ versus the side-to-thickness ratio a/h of a … in which E_i are Young's moduli in the material principal directions, ν_ij are Poisson's ratios, … [cylin]drical shells integrated with piezoelectric fiber reinforced composite …

  2. Smooth functors vs. differential forms

    NARCIS (Netherlands)

    Schreiber, U.; Waldorf, K.

    2011-01-01

    We establish a relation between smooth 2-functors defined on the path 2-groupoid of a smooth manifold and differential forms on this manifold. This relation can be understood as a part of a dictionary between fundamental notions from category theory and differential geometry. We show that smooth

  3. Temporal form in interaction design

    DEFF Research Database (Denmark)

    Vallgårda, Anna; Winther, Morten Trøstrup; Mørch, Nina

    2015-01-01

    … temporal forms by letting a series of expert designers reflect upon them. We borrow a framework from Boorstin’s film theory in which he distinguishes between the voyeuristic, the vicarious, and the visceral experience. We show how to use rhythms, complexity, gentle or forceful behavior, etc., to create…

  4. Masturbation, sexuality, and adaptation: normalization in adolescence.

    Science.gov (United States)

    Shapiro, Theodore

    2008-03-01

    During adolescence the central masturbation fantasy that is formulated during childhood takes its final form and paradoxically must now be directed outward for appropriate object finding and pair matching in the service of procreative aims. This is a step in adaptation that requires a further developmental landmark that I have called normalization. The path toward airing these private fantasies is facilitated by chumship relationships as a step toward further exposure to the social surround. Hartmann's structuring application of adaptation within psychoanalysis is used as a framework for understanding the process that simultaneously serves intrapsychic and social demands and permits goals that follow evolutionary principles. Variations in the normalization process from masturbatory isolation to a variety of forms of sexual socialization are examined in sociological data concerning current adolescent sexual behavior and in case examples that indicate some routes to normalized experience and practice.

  5. Entanglement entropy of non-unitary integrable quantum field theory

    Directory of Open Access Journals (Sweden)

    Davide Bianchini

    2015-07-01

    In this paper we study the simplest massive 1+1 dimensional integrable quantum field theory which can be described as a perturbation of a non-unitary minimal conformal field theory: the Lee–Yang model. We are particularly interested in the features of the bi-partite entanglement entropy for this model and in building blocks thereof, namely twist field form factors. Non-unitarity selects out a new type of twist field as the operator whose two-point function (appropriately normalized) yields the entanglement entropy. We compute this two-point function both from a form factor expansion and by means of perturbed conformal field theory. We find good agreement with CFT predictions put forward in a recent work involving the present authors. In particular, our results are consistent with a scaling of the entanglement entropy given by (c_eff/3) log ℓ, where c_eff is the effective central charge of the theory (a positive number related to the central charge) and ℓ is the size of the region. Furthermore, the form factor expansion of twist fields allows us to explore the large region limit of the entanglement entropy and find the next-to-leading order correction to saturation. We find that this correction is very different from its counterpart in unitary models. Whereas in the latter case it had a form depending only on few parameters of the model (the particle spectrum), it appears to be much more model-dependent for non-unitary models.
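
    The scaling quoted above, written out with a short-distance cutoff ε (our notation) and the usual definition of the effective central charge:

        S(\ell) \;\simeq\; \frac{c_{\mathrm{eff}}}{3}\,\log\frac{\ell}{\epsilon}, \qquad c_{\mathrm{eff}} = c - 24\,h_{\min}

    where h_min is the lowest conformal weight in the theory; for the Lee–Yang model c = -22/5 and h_min = -1/5, giving c_eff = 2/5.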

  6. Quantization and non-holomorphic modular forms

    CERN Document Server

    Unterberger, André

    2000-01-01

    This is a new approach to the theory of non-holomorphic modular forms, based on ideas from quantization theory or pseudodifferential analysis. Extending the Rankin-Selberg method so as to apply it to the calculation of the Roelcke-Selberg decomposition of the product of two Eisenstein series, one lets Maass cusp-forms appear as residues of simple, Eisenstein-like, series. Other results, based on quantization theory, include a reinterpretation of the Lax-Phillips scattering theory for the automorphic wave equation, in terms of distributions on R2 automorphic with respect to the linear action of SL(2,Z).

  7. Variational principles for locally variational forms

    International Nuclear Information System (INIS)

    Brajercik, J.; Krupka, D.

    2005-01-01

    We present the theory of higher order local variational principles in fibered manifolds, in which the fundamental global concept is a locally variational dynamical form. Any two Lepage forms, defining a local variational principle for this form, differ on intersection of their domains, by a variationally trivial form. In this sense, but in a different geometric setting, the local variational principles satisfy analogous properties as the variational functionals of the Chern-Simons type. The resulting theory of extremals and symmetries extends the first order theories of the Lagrange-Souriau form, presented by Grigore and Popp, and closed equivalents of the first order Euler-Lagrange forms of Hakova and Krupkova. Conceptually, our approach differs from Prieto, who uses the Poincare-Cartan forms, which do not have higher order global analogues

  8. Amorphous gauge glass theory

    International Nuclear Information System (INIS)

    Nielsen, H.B.; Bennett, D.L.

    1987-08-01

    Assuming that a lattice gauge theory describes a fundamental attribute of Nature, it should be pointed out that such a theory in the form of a gauge glass is a weaker assumption than a regular lattice model inasmuch as it is not constrained by the imposition of translational invariance; translational invariance is, however, recovered approximately in the long wavelength or continuum limit. (orig./WL)

  9. Salt dependence of compression normal forces of quenched polyelectrolyte brushes

    Science.gov (United States)

    Hernandez-Zapata, Ernesto; Tamashiro, Mario N.; Pincus, Philip A.

    2001-03-01

    We obtained mean-field expressions for the compression normal forces between two identical opposing quenched polyelectrolyte brushes in the presence of monovalent salt. The brush elasticity is modeled using the entropy of ideal Gaussian chains, while the entropy of the microions and the electrostatic contribution to the grand potential is obtained by solving the non-linear Poisson-Boltzmann equation for the system in contact with a salt reservoir. For the polyelectrolyte brush we considered both a uniformly charged slab as well as an inhomogeneous charge profile obtained using a self-consistent field theory. Using the Derjaguin approximation, we related the planar-geometry results to the realistic two-crossed-cylinders experimental set up. Theoretical predictions are compared to experimental measurements (Marc Balastre's abstract, APS March 2001 Meeting) of the salt dependence of the compression normal forces between two quenched polyelectrolyte brushes formed by the adsorption of diblock copolymers poly(tert-butyl styrene)-sodium poly(styrene sulfonate) [PtBs/NaPSS] onto an octadecyltriethoxysilane (OTE) hydrophobically modified mica, as well as onto bare mica.
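
    A small sketch of the Derjaguin step mentioned above, which converts a plate-plate interaction energy per unit area W(D) into the force between two crossed cylinders; the screened-repulsion form of W and all numbers are placeholders, not the paper's mean-field result:

        import numpy as np

        def derjaguin_force(D, W, R1, R2):
            # Force between two crossed cylinders of radii R1, R2 at separation D,
            # from the plate-plate interaction free energy per unit area W(D):
            # F(D) ~ 2*pi*sqrt(R1*R2) * W(D)  (Derjaguin approximation).
            return 2.0 * np.pi * np.sqrt(R1 * R2) * W(D)

        # Toy plate-plate energy density: exponentially screened repulsion with a
        # hypothetical amplitude W0 (J/m^2) and decay length lam (m).
        W0, lam = 1e-3, 10e-9
        W = lambda D: W0 * np.exp(-D / lam)
        print(derjaguin_force(5e-9, W, 1e-2, 1e-2))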

  10. Cold-formed steel design

    CERN Document Server

    Yu, Wei-Wen

    2010-01-01

    The definitive text in the field, thoroughly updated and expanded Hailed by professionals around the world as the definitive text on the subject, Cold-Formed Steel Design is an indispensable resource for all who design for and work with cold-formed steel. No other book provides such exhaustive coverage of both the theory and practice of cold-formed steel construction. Updated and expanded to reflect all the important developments that have occurred in the field over the past decade, this Fourth Edition of the classic text provides you with more of the detailed, up-to-the-minute techni

  11. Strength of Gamma Rhythm Depends on Normalization

    Science.gov (United States)

    Ray, Supratim; Ni, Amy M.; Maunsell, John H. R.

    2013-01-01

    Neuronal assemblies often exhibit stimulus-induced rhythmic activity in the gamma range (30–80 Hz), whose magnitude depends on the attentional load. This has led to the suggestion that gamma rhythms form dynamic communication channels across cortical areas processing the features of behaviorally relevant stimuli. Recently, attention has been linked to a normalization mechanism, in which the response of a neuron is suppressed (normalized) by the overall activity of a large pool of neighboring neurons. In this model, attention increases the excitatory drive received by the neuron, which in turn also increases the strength of normalization, thereby changing the balance of excitation and inhibition. Recent studies have shown that gamma power also depends on such excitatory–inhibitory interactions. Could modulation in gamma power during an attention task be a reflection of the changes in the underlying excitation–inhibition interactions? By manipulating the normalization strength independent of attentional load in macaque monkeys, we show that gamma power increases with increasing normalization, even when the attentional load is fixed. Further, manipulations of attention that increase normalization increase gamma power, even when they decrease the firing rate. Thus, gamma rhythms could be a reflection of changes in the relative strengths of excitation and normalization rather than playing a functional role in communication or control. PMID:23393427
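
    A generic textbook form of the normalization model invoked above; the specific parameter values and the way attention enters are illustrative assumptions, not the study's fitted model:

        import numpy as np

        def normalized_response(drive, pool_drive, sigma=1.0, attention_gain=1.0):
            # Divisive normalization: the neuron's excitatory drive is divided by
            # a pooled suppressive drive plus a semi-saturation constant sigma.
            # Attention is modeled as a multiplicative gain on the attended drive.
            e = attention_gain * drive
            return e / (sigma + attention_gain * pool_drive)

        # Increasing the gain raises both excitation and normalization, so the
        # firing rate can change little even as the excitation/suppression
        # balance (and, on this view, gamma power) shifts.
        for g in (1.0, 2.0, 4.0):
            print(g, normalized_response(drive=10.0, pool_drive=20.0,
                                         sigma=5.0, attention_gain=g))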

  12. Supergravity theories

    International Nuclear Information System (INIS)

    Uehara, S.

    1985-01-01

    Of all supergravity theories, the maximal, i.e., N = 8 in 4-dimension or N = 1 in 11-dimension, theory should perform the unification since it owns the highest degree of symmetry. As to the N = 1 in d = 11 theory, it has been investigated how to compactify to the d = 4 theories. From the phenomenological point of view, local SUSY GUTs, i.e., N = 1 SUSY GUTs with soft breaking terms, have been studied from various angles. The structures of extended supergravity theories are less understood than those of N = 1 supergravity theories, and matter couplings in N = 2 extended supergravity theories are under investigation. The harmonic superspace was recently proposed which may be useful to investigate the quantum effects of extended supersymmetry and supergravity theories. As to the so-called Kaluza-Klein supergravity, there is another possibility. (Mori, K.)

  13. Topos theory

    CERN Document Server

    Johnstone, PT

    2014-01-01

    Focusing on topos theory's integration of geometric and logical ideas into the foundations of mathematics and theoretical computer science, this volume explores internal category theory, topologies and sheaves, geometric morphisms, other subjects. 1977 edition.

  14. Assessment of Theories for Free Vibration Analysis of Homogeneous and Multilayered Plates

    Directory of Open Access Journals (Sweden)

    Erasmo Carrera

    2004-01-01

    This paper assesses classical and advanced theories for the free vibrational response of homogeneous and multilayered simply supported plates. Closed form solutions are given for thick and thin geometries. Single layer and multilayered plates made of metallic, composite and piezo-electric materials are considered. Classical theories based on Kirchhoff and Reissner-Mindlin assumptions are compared with refined theories obtained by enhancing the order of the expansion of the displacement fields in the thickness direction z. The effect of the Zig-Zag form of the displacement distribution in z, as well as of the Interlaminar Continuity of transverse shear and normal stresses at the layer interfaces, was evaluated. A number of conclusions have been drawn. These conclusions could be used as a reference in order to choose the most valuable theories for a given problem.
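
    A schematic form of the thickness expansions being compared (the exact assumed fields of each theory are not reproduced from the source):

        u_i(x,y,z) \;=\; \sum_{k=0}^{N} z^{k}\, u_i^{(k)}(x,y), \qquad i = 1,2,3

    Low orders with the usual constraints recover Kirchhoff- and Reissner-Mindlin-type kinematics, while larger N gives the refined higher-order theories; Zig-Zag theories add a layerwise piecewise-linear function of z to this expansion, and Interlaminar Continuity additionally enforces continuity of transverse shear and normal stresses at the layer interfaces.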

  15. Gauge field theories

    International Nuclear Information System (INIS)

    Pokorski, S.

    1987-01-01

    Quantum field theory forms the present theoretical framework for the understanding of the fundamental interactions of particle physics. This book examines gauge theories and their symmetries with an emphasis on their physical and technical aspects. The author discusses field-theoretical techniques and encourages the reader to perform many of the calculations presented. This book includes a brief introduction to perturbation theory, the renormalization programme, and the use of the renormalization group equation. Several topics of current research interest are covered, including chiral symmetry and its breaking, anomalies, and low energy effective lagrangians and some basics of supersymmetry

  16. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  17. Gauge theories

    International Nuclear Information System (INIS)

    Lee, B.W.

    1976-01-01

    Some introductory remarks to Yang-Mills fields are given and the problem of the Coulomb gauge is considered. The perturbation expansion for quantized gauge theories is discussed and a survey of renormalization schemes is made. The role of Ward-Takahashi identities in gauge theories is discussed. The author then discusses the renormalization of pure gauge theories and theories with spontaneously broken symmetry. (B.R.H.)

  18. Densified waste form and method for forming

    Science.gov (United States)

    Garino, Terry J.; Nenoff, Tina M.; Sava Gallis, Dorina Florentina

    2015-08-25

    Materials and methods of making densified waste forms for temperature sensitive waste material, such as nuclear waste, formed with low temperature processing using metallic powder that forms the matrix that encapsulates the temperature sensitive waste material. The densified waste form includes a temperature sensitive waste material in a physically densified matrix, the matrix is a compacted metallic powder. The method for forming the densified waste form includes mixing a metallic powder and a temperature sensitive waste material to form a waste form precursor. The waste form precursor is compacted with sufficient pressure to densify the waste precursor and encapsulate the temperature sensitive waste material in a physically densified matrix.

  19. 29 CFR 1904.29 - Forms.

    Science.gov (United States)

    2010-07-01

    (a) Basic requirement. You must use OSHA 300, 300-A, and 301 forms, or equivalent forms, for recordable injuries and illnesses. The OSHA 300 form is called the Log of Work-Related Injuries and Illnesses, the 300-… … OSHA 300 Log. Instead, enter “privacy case” in the space normally used for the employee's name. This…

  20. Duffin-Kemmer formulation of gauge theories

    International Nuclear Information System (INIS)

    Okubo, S.; Tosa, Y.

    1979-01-01

    Gauge theories, including the Yang-Mills theory as well as Einstein's general relativity, are reformulated in first-order differential forms. In this generalized Duffin-Kemmer formalism, gauge theories take very simple forms with only cubic interactions. Moreover, every local gauge transformation, e.g., that of Yang and Mills or Einstein, etc., has an essentially similar form. Other examples comprise a gauge theory akin to the Sugawara theory of currents and the nonlinear realization of chiral symmetry. The octonion algebra is found possibly relevant to the discussion of the Yang-Mills theory