WorldWideScience

Sample records for normal form analysis

  1. Analysis of a renormalization group method and normal form theory for perturbed ordinary differential equations

    Science.gov (United States)

    DeVille, R. E. Lee; Harkin, Anthony; Holzer, Matt; Josić, Krešimir; Kaper, Tasso J.

    2008-06-01

    For singular perturbation problems, the renormalization group (RG) method of Chen, Goldenfeld, and Oono [Phys. Rev. E 49 (1994) 4502-4511] has been shown to be an effective general approach for deriving reduced or amplitude equations that govern the long time dynamics of the system. It has been applied to a variety of problems traditionally analyzed using disparate methods, including the method of multiple scales, boundary layer theory, the WKBJ method, the Poincaré-Lindstedt method, the method of averaging, and others. In this article, we show how the RG method may be used to generate normal forms for large classes of ordinary differential equations. First, we apply the RG method to systems with autonomous perturbations, and we show that the reduced or amplitude equations generated by the RG method are equivalent to the classical Poincaré-Birkhoff normal forms for these systems up to and including terms of O(ɛ²), where ɛ is the perturbation parameter. This analysis establishes our approach and generalizes to higher order. Second, we apply the RG method to systems with nonautonomous perturbations, and we show that the reduced or amplitude equations so generated constitute time-asymptotic normal forms, which are based on KBM averages. Moreover, for both classes of problems, we show that the main coordinate changes are equivalent, up to translations between the spaces in which they are defined. In this manner, our results show that the RG method offers a new approach for deriving normal forms for nonautonomous systems, and it offers advantages since one can typically more readily identify resonant terms from naive perturbation expansions than from the nonautonomous vector fields themselves. Finally, we establish how well the solution to the RG equations approximates the solution of the original equations on time scales of O(1/ɛ).
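The resonance bookkeeping this abstract refers to can be illustrated numerically. For the weakly nonlinear oscillator x'' + x = -ɛx³, the naive expansion produces an O(ɛ) forcing term cos³(t); the sketch below (an illustration of the general idea, not code from the paper) computes its Fourier components and shows that only the first harmonic is resonant with the natural frequency, which is exactly the term a normal form or RG amplitude equation retains.

```python
import math

def cos_fourier_coeff(f, k, n=4096):
    """Cosine Fourier coefficient a_k = (1/pi) * integral_0^{2pi} f(t) cos(kt) dt,
    computed with the (spectrally accurate) rectangle rule on a periodic function."""
    h = 2.0 * math.pi / n
    return sum(f(i * h) * math.cos(k * i * h) for i in range(n)) * h / math.pi

# O(eps) forcing of x'' + x = -eps x^3 when x_0 = cos(t): the term cos(t)^3
forcing = lambda t: math.cos(t) ** 3

a1 = cos_fourier_coeff(forcing, 1)  # resonant: drives the oscillator at its natural frequency
a3 = cos_fourier_coeff(forcing, 3)  # non-resonant: removable by a near-identity transform
```

Since cos³(t) = (3/4)cos t + (1/4)cos 3t, secular growth comes only from the a1 component; the amplitude equation keeps that term and discards the rest.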

  2. Normal form analysis of linear beam dynamics in a coupled storage ring

    International Nuclear Information System (INIS)

    Wolski, Andrzej; Woodley, Mark D.

    2004-01-01

    The techniques of normal form analysis, well known in the literature, can be used to provide a straightforward characterization of linear betatron dynamics in a coupled lattice. Here, we consider both the beam distribution and the betatron oscillations in a storage ring. We find that the beta functions for uncoupled motion generalize in a simple way to the coupled case. Defined in the way that we propose, the beta functions remain well behaved (positive and finite) under all circumstances, and have essentially the same physical significance for the beam size and betatron oscillation amplitude as in the uncoupled case. Application of this analysis to the online modeling of the PEP-II rings is also discussed
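In the uncoupled one-degree-of-freedom case that this analysis generalizes, the beta function can be read directly off a stable one-turn matrix via the standard Courant-Snyder parameterization. A minimal self-contained sketch (textbook accelerator-physics formulas, not code from the paper):

```python
import math

def twiss_from_matrix(M):
    """Recover (beta, alpha, mu) from a stable 2x2 one-turn matrix written in
    Courant-Snyder form: M = [[cos mu + alpha sin mu, beta sin mu],
                              [-gamma sin mu, cos mu - alpha sin mu]]."""
    cos_mu = 0.5 * (M[0][0] + M[1][1])
    assert abs(cos_mu) < 1.0, "matrix is not stable"
    # pick the branch of mu that makes beta = M01/sin(mu) positive
    sin_mu = math.copysign(math.sqrt(1.0 - cos_mu ** 2), M[0][1])
    beta = M[0][1] / sin_mu
    alpha = (M[0][0] - M[1][1]) / (2.0 * sin_mu)
    return beta, alpha, math.atan2(sin_mu, cos_mu)

def matrix_from_twiss(beta, alpha, mu):
    c, s = math.cos(mu), math.sin(mu)
    return [[c + alpha * s, beta * s],
            [-(1.0 + alpha ** 2) / beta * s, c - alpha * s]]

# round trip: the recovered beta is positive and finite, as the abstract emphasizes
M = matrix_from_twiss(beta=12.5, alpha=-0.8, mu=0.31)
beta, alpha, mu = twiss_from_matrix(M)
```

The contribution of the paper is precisely that this well-behaved definition of beta extends in a simple way to fully coupled motion.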

  3. Application of normal form methods to the analysis of resonances in particle accelerators

    International Nuclear Information System (INIS)

    Davies, W.G.

    1992-01-01

    The transformation to normal form in a Lie-algebraic framework provides a very powerful method for identifying and analysing non-linear behaviour and resonances in particle accelerators. The basic ideas are presented and illustrated. (author). 4 refs

  4. Normal forms in Poisson geometry

    NARCIS (Netherlands)

    Marcut, I.T.

    2013-01-01

    The structure of Poisson manifolds is highly nontrivial even locally. The first important result in this direction is Conn's linearization theorem around fixed points. One of the main results of this thesis (Theorem 2) is a normal form theorem in Poisson geometry, which is the Poisson-geometric

  5. A Recursive Approach to Compute Normal Forms

    Science.gov (United States)

    Hsu, L.; Min, L. J.; Favretto, L.

    2001-06-01

    Normal forms are instrumental in the analysis of dynamical systems described by ordinary differential equations, particularly when singularities close to a bifurcation are to be characterized. However, the computation of a normal form up to an arbitrary order is numerically hard. This paper focuses on the computer programming of some recursive formulas developed earlier to compute higher order normal forms. A computer program to reduce the system to its normal form on a center manifold is developed using the Maple symbolic language. However, it should be stressed that the program relies essentially on recursive numerical computations, while symbolic calculations are used only for minor tasks. Some strategies are proposed to save computation time. Examples are presented to illustrate the application of the program to obtain high order normalization or to handle systems with large dimension.
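The recursive structure described here can be miniaturized to a scalar ODE x' = λx + Σ aₖxᵏ: each non-resonant term aₖxᵏ is removed by the near-identity change x = y + cyᵏ with c = aₖ/((k-1)λ), and the substitution reshuffles all higher-order coefficients, which is why the computation is naturally recursive. A plain-Python sketch of this order-by-order loop (an illustration of the general idea, not the authors' Maple program):

```python
def pmul(p, q, N):
    """Multiply polynomials given as {degree: coeff} dicts, truncated at degree N."""
    r = {}
    for i, a in p.items():
        for j, b in q.items():
            if i + j <= N:
                r[i + j] = r.get(i + j, 0.0) + a * b
    return r

def pcomp(p, q, N):
    """Compose p(q(y)) truncated at degree N; q must have no constant term."""
    r, qpow = {}, {0: 1.0}
    for k in range(N + 1):
        if k in p:
            for d, c in qpow.items():
                r[d] = r.get(d, 0.0) + p[k] * c
        qpow = pmul(qpow, q, N)
    return r

def normal_form(f, N):
    """Recursively remove non-resonant terms from x' = sum_k f[k] x^k (f[1] = lambda != 0)."""
    lam = f[1]
    f = dict(f)
    for k in range(2, N + 1):
        a = f.get(k, 0.0)
        if abs(a) < 1e-14 or abs((k - 1) * lam) < 1e-14:
            continue  # absent or resonant term: keep it in the normal form
        c = a / ((k - 1) * lam)
        # transform x = y + c y^k; the new field is g(y) = f(x(y)) / (dx/dy)
        fx = pcomp(f, {1: 1.0, k: c}, N)
        # series inverse of dx/dy = 1 + k c y^(k-1):  1/(1+u) = 1 - u + u^2 - ...
        inv, term, u = {0: 1.0}, {0: 1.0}, {k - 1: -k * c}
        for _ in range(N // (k - 1) + 1):
            term = pmul(term, u, N)
            for d, cc in term.items():
                inv[d] = inv.get(d, 0.0) + cc
        f = pmul(fx, inv, N)
    return {d: c for d, c in f.items() if abs(c) > 1e-12}

# For x' = 2x + x^2 + x^3 every nonlinear term is non-resonant: the normal form is y' = 2y.
nf = normal_form({1: 2.0, 2: 1.0, 3: 1.0}, 3)
```

The 2D center-manifold computations in the paper follow the same pattern, with the homological equation solved degree by degree.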

  6. Nonlinear dynamics exploration through normal forms

    CERN Document Server

    Kahn, Peter B

    2014-01-01

    Geared toward advanced undergraduates and graduate students, this exposition covers the method of normal forms and its application to ordinary differential equations through perturbation analysis. In addition to its emphasis on the freedom inherent in the normal form expansion, the text features numerous examples of equations of the kind encountered in many areas of science and engineering. The treatment begins with an introduction to the basic concepts underlying the normal forms. Coverage then shifts to an investigation of systems with one degree of freedom that model oscillations

  7. TRASYS form factor matrix normalization

    Science.gov (United States)

    Tsuyuki, Glenn T.

    1992-01-01

    A method has been developed for adjusting a TRASYS enclosure form factor matrix to unity. This approach is not limited to closed geometries, and in fact, it is primarily intended for use with open geometries. The purpose of this approach is to prevent overly optimistic form factors to space. In this method, nodal form factor sums are calculated within 0.05 of unity using TRASYS, although deviations as large as 0.10 may be acceptable, and then, a process is employed to distribute the difference amongst the nodes. A specific example has been analyzed with this method, and a comparison was performed with a standard approach for calculating radiation conductors. In this comparison, hot and cold case temperatures were determined. Exterior nodes exhibited temperature differences as large as 7 C and 3 C for the hot and cold cases, respectively, when compared with the standard approach, while interior nodes demonstrated temperature differences from 0 C to 5 C. These results indicate that temperature predictions can be artificially biased if the form factor computation error is lumped into the individual form factors to space.
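The correction step can be sketched as a row renormalization: each node's form factors should sum to unity, and a small closure error is spread over the row. Proportional rescaling is one simple distribution scheme (an assumption for illustration; the memo's exact redistribution rule may differ):

```python
def normalize_form_factors(F, tol=0.10):
    """Scale each row of an enclosure form-factor matrix so it sums exactly to 1,
    distributing the closure error proportionally over the row's entries.
    Rows deviating from unity by more than tol are rejected as unreliable."""
    out = []
    for row in F:
        s = sum(row)
        if abs(1.0 - s) > tol:
            raise ValueError("row sum %.3f deviates more than %.2f from unity" % (s, tol))
        out.append([f / s for f in row])
    return out

# two rows whose raw sums (0.98 and 1.05) are within the acceptable band
Fn = normalize_form_factors([[0.3, 0.4, 0.28], [0.2, 0.5, 0.35]])
```

Scaling a row rather than dumping the whole deficit into the space node avoids exactly the bias the report describes.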

  8. A structure-preserving approach to normal form analysis of power systems; Una propuesta de preservacion de estructura al analisis de su forma normal en sistemas de potencia

    Energy Technology Data Exchange (ETDEWEB)

    Martinez Carrillo, Irma

    2008-01-15

    Power system dynamic behavior is inherently nonlinear and is driven by different processes at different time scales. The size and complexity of these mechanisms have stimulated the search for methods that reduce the original dimension but retain a certain degree of accuracy. In this dissertation, a novel nonlinear dynamical analysis method for the analysis of large amplitude oscillations that embraces ideas from normal form theory and singular perturbation techniques is proposed. This approach allows the full potential of the normal form method to be reached, and is suitably general for application to a wide variety of nonlinear systems. Drawing on the formal theory of dynamical systems, a structure-preserving model of the system is developed that preserves network and load characteristics. By exploiting the separation of fast and slow time scales of the model, an efficient approach based on singular perturbation techniques is then derived for constructing a nonlinear power system representation that accurately preserves network structure. The method requires no reduction of the constraint equations and therefore gives information about the effect of network and load characteristics on system behavior. Analytical expressions are then developed that provide approximate solutions to system performance near a singularity, and techniques for interpreting these solutions in terms of modal functions are given. New insights into the nature of nonlinear oscillations are also offered and criteria for characterizing network effects on nonlinear system behavior are proposed. Theoretical insight into the behavior of dynamic coupling of differential-algebraic equations and the origin of nonlinearity is given, and implications for the design and placement of power system controllers in complex nonlinear systems are discussed. The extent of applicability of the proposed procedure is demonstrated by analyzing nonlinear behavior in two realistic test power systems

  9. Normal form theory and spectral sequences

    OpenAIRE

    Sanders, Jan A.

    2003-01-01

    The concept of unique normal form is formulated in terms of a spectral sequence. As an illustration of this technique some results of Baider and Churchill concerning the normal form of the anharmonic oscillator are reproduced. The aim of this paper is to show that spectral sequences give us a natural framework in which to formulate normal form theory. © 2003 Elsevier Science (USA). All rights reserved.

  10. Normal forms of Hopf-zero singularity

    International Nuclear Information System (INIS)

    Gazor, Majid; Mokhtari, Fahimeh

    2015-01-01

    The Lie algebra generated by Hopf-zero classical normal forms is decomposed into two versal Lie subalgebras. Some dynamical properties for each subalgebra are described; one is the set of all volume-preserving conservative systems while the other is the maximal Lie algebra of nonconservative systems. This introduces a unique conservative–nonconservative decomposition for the normal form systems. There exists a Lie subalgebra that is Lie-isomorphic to a large family of vector fields with Bogdanov–Takens singularity. This gives rise to the conclusion that the local dynamics of formal Hopf-zero singularities is well understood through the study of Bogdanov–Takens singularities. Despite this, the normal form computations of Bogdanov–Takens and Hopf-zero singularities are independent. Thus, by assuming a quadratic nonzero condition, complete results on the simplest Hopf-zero normal forms are obtained in terms of the conservative–nonconservative decomposition. Some practical formulas are derived and the results implemented using Maple. The method has been applied to the Rössler and Kuramoto–Sivashinsky equations to demonstrate the applicability of our results. (paper)

  11. Normal forms of Hopf-zero singularity

    Science.gov (United States)

    Gazor, Majid; Mokhtari, Fahimeh

    2015-01-01

    The Lie algebra generated by Hopf-zero classical normal forms is decomposed into two versal Lie subalgebras. Some dynamical properties for each subalgebra are described; one is the set of all volume-preserving conservative systems while the other is the maximal Lie algebra of nonconservative systems. This introduces a unique conservative-nonconservative decomposition for the normal form systems. There exists a Lie subalgebra that is Lie-isomorphic to a large family of vector fields with Bogdanov-Takens singularity. This gives rise to the conclusion that the local dynamics of formal Hopf-zero singularities is well understood through the study of Bogdanov-Takens singularities. Despite this, the normal form computations of Bogdanov-Takens and Hopf-zero singularities are independent. Thus, by assuming a quadratic nonzero condition, complete results on the simplest Hopf-zero normal forms are obtained in terms of the conservative-nonconservative decomposition. Some practical formulas are derived and the results implemented using Maple. The method has been applied to the Rössler and Kuramoto-Sivashinsky equations to demonstrate the applicability of our results.

  12. Normal equivariant forms of vector fields

    International Nuclear Information System (INIS)

    Sanchez Bringas, F.

    1992-07-01

    We prove a linearization theorem of Siegel type and a normal form theorem of Poincaré-Dulac type for germs of holomorphic vector fields at the origin of C², Γ-equivariant, where Γ is a finite subgroup of GL(2,C). (author). 5 refs

  13. Normal form for mirror machine Hamiltonians

    International Nuclear Information System (INIS)

    Dragt, A.J.; Finn, J.M.

    1979-01-01

    A systematic algorithm is developed for performing canonical transformations on Hamiltonians which govern particle motion in magnetic mirror machines. These transformations are performed in such a way that the new Hamiltonian has a particularly simple normal form. From this form it is possible to compute analytic expressions for gyro and bounce frequencies. In addition, it is possible to obtain arbitrarily high order terms in the adiabatic magnetic moment expansion. The algorithm makes use of Lie series, is an extension of Birkhoff's normal form method, and has been explicitly implemented by a digital computer programmed to perform the required algebraic manipulations. Application is made to particle motion in a magnetic dipole field and to a simple mirror system. Bounce frequencies and locations of periodic orbits are obtained and compared with numerical computations. Both mirror systems are shown to be insoluble, i.e., trajectories are not confined to analytic hypersurfaces, there is no analytic third integral of motion, and the adiabatic magnetic moment expansion is divergent. It is expected also that the normal form procedure will prove useful in the study of island structure and separatrices associated with periodic orbits, and should facilitate studies of breakdown of adiabaticity and the onset of 'stochastic' behavior

  14. AFP Algorithm and a Canonical Normal Form for Horn Formulas

    OpenAIRE

    Majdoddin, Ruhollah

    2014-01-01

    The AFP algorithm is a learning algorithm for Horn formulas. We show that performing more than one refinement after each negative counterexample does not improve the complexity of the AFP algorithm. Moreover, a canonical normal form for Horn formulas is presented, and it is proved that the output formula of the AFP algorithm is in this normal form.
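For readers unfamiliar with the setting: a Horn formula is a conjunction of clauses with at most one positive literal, and its semantics is captured by forward chaining. A minimal sketch of that closure computation (background illustration only; this is not the AFP algorithm itself):

```python
def horn_closure(clauses, facts):
    """Forward chaining for definite Horn clauses.
    clauses: iterable of (frozenset(body_atoms), head_atom) pairs.
    Returns the closure of `facts` under the clauses."""
    closed = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in clauses:
            if head not in closed and body <= closed:
                closed.add(head)
                changed = True
    return closed

clauses = [(frozenset({'a'}), 'b'), (frozenset({'b', 'c'}), 'd')]
closure = horn_closure(clauses, {'a', 'c'})
```

Equivalence of Horn formulas, and hence canonicity of a normal form, reduces to agreement of such closures on all fact sets.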

  15. An Algorithm for Higher Order Hopf Normal Forms

    Directory of Open Access Journals (Sweden)

    A.Y.T. Leung

    1995-01-01

    Normal form theory is important for studying the qualitative behavior of nonlinear oscillators. In some cases, higher order normal forms are required to understand the dynamic behavior near an equilibrium or a periodic orbit. However, the computation of high-order normal forms is usually quite complicated. This article provides an explicit formula for the normalization of nonlinear differential equations. The higher order normal form is given explicitly. Illustrative examples include a cubic system, a quadratic system and a Duffing–Van der Pol system. We use exact arithmetic and find that the undamped Duffing equation can be represented by an exact polynomial differential amplitude equation in a finite number of terms.

  16. Normal form and synchronization of strict-feedback chaotic systems

    International Nuclear Information System (INIS)

    Wang, Feng; Chen, Shihua; Yu Minghai; Wang Changping

    2004-01-01

    This study concerns the normal form and synchronization of strict-feedback chaotic systems. We prove that any strict-feedback chaotic system can be rendered into a normal form with an invertible transform, and then a design procedure to synchronize the normal form of a non-autonomous strict-feedback chaotic system is presented. This approach needs only a scalar driving signal to realize synchronization, no matter how many dimensions the chaotic system contains. Furthermore, the Rössler chaotic system is taken as a concrete example to illustrate the procedure of designing without transforming a strict-feedback chaotic system into its normal form. Numerical simulations are also provided to show the effectiveness and feasibility of the developed methods

  17. Normal form of linear systems depending on parameters

    International Nuclear Information System (INIS)

    Nguyen Huynh Phan.

    1995-12-01

    In this paper we completely resolve the problem of finding normal forms of linear systems depending on parameters under the feedback action, which we have previously studied in the special case of controllable linear systems. (author). 24 refs

  18. Volume-preserving normal forms of Hopf-zero singularity

    International Nuclear Information System (INIS)

    Gazor, Majid; Mokhtari, Fahimeh

    2013-01-01

    A practical method is described for computing the unique generator of the algebra of first integrals associated with a large class of Hopf-zero singularities. The set of all volume-preserving classical normal forms of this singularity is introduced via a Lie algebra description. This is a maximal vector space of classical normal forms with first integral; this is why our approach works. Systems with a nonzero condition on their quadratic parts are considered. The algebra of all first integrals for any such system has a unique (modulo scalar multiplication) generator. The infinite level volume-preserving parametric normal forms of any nondegenerate perturbation within the Lie algebra of any such system is computed, where it can have rich dynamics. The associated unique generator of the algebra of first integrals is derived. The symmetry group of the infinite level normal forms is also discussed. Some necessary formulas are derived and applied to appropriately modified Rössler and generalized Kuramoto–Sivashinsky equations to demonstrate the applicability of our theoretical results. An approach (introduced by Iooss and Lombardi) is applied to find an optimal truncation for the first level normal forms of these examples with exponentially small remainders. The numerically suggested radius of convergence (for the first integral) associated with a hypernormalization step is discussed for the truncated first level normal forms of the examples. This is achieved by an efficient implementation of the results using Maple. (paper)

  19. Volume-preserving normal forms of Hopf-zero singularity

    Science.gov (United States)

    Gazor, Majid; Mokhtari, Fahimeh

    2013-10-01

    A practical method is described for computing the unique generator of the algebra of first integrals associated with a large class of Hopf-zero singularities. The set of all volume-preserving classical normal forms of this singularity is introduced via a Lie algebra description. This is a maximal vector space of classical normal forms with first integral; this is why our approach works. Systems with a nonzero condition on their quadratic parts are considered. The algebra of all first integrals for any such system has a unique (modulo scalar multiplication) generator. The infinite level volume-preserving parametric normal forms of any nondegenerate perturbation within the Lie algebra of any such system is computed, where it can have rich dynamics. The associated unique generator of the algebra of first integrals is derived. The symmetry group of the infinite level normal forms is also discussed. Some necessary formulas are derived and applied to appropriately modified Rössler and generalized Kuramoto-Sivashinsky equations to demonstrate the applicability of our theoretical results. An approach (introduced by Iooss and Lombardi) is applied to find an optimal truncation for the first level normal forms of these examples with exponentially small remainders. The numerically suggested radius of convergence (for the first integral) associated with a hypernormalization step is discussed for the truncated first level normal forms of the examples. This is achieved by an efficient implementation of the results using Maple.

  20. Analysis of the nonlinear dynamic behavior of power systems using normal forms of superior order; Analisis del comportamiento dinamico no lineal de sistemas de potencia usando formas normales de orden superior

    Energy Technology Data Exchange (ETDEWEB)

    Martinez Carrillo, Irma

    2003-08-01

    This thesis investigates the application of perturbation methods from nonlinear dynamical systems theory to the study of small-signal stability of electric power systems. The work centers on two fundamental aspects of the nonlinear dynamic behavior of the system: the characterization and quantification of the degree of nonlinear interaction between the fundamental oscillation modes of the system, and the study of the modes with the greatest influence on the system response to small disturbances. With these objectives, a general mathematical model, based on a power series expansion of the nonlinear power system model and the theory of normal forms of vector fields, is proposed for the study of the dynamic behavior of the power system. The proposed tool generalizes existing methods in the literature to account for higher-order effects in the dynamic model of the power system. Starting from this representation, a methodology is proposed to obtain approximate closed-form solutions, and the extension of existing methods to identify and quantify the degree of interaction among the fundamental oscillation modes of the system is investigated. The developed tool allows, from these closed-form expressions, the development of analytical measures to evaluate the degree of stress in the system, the interaction between the fundamental oscillation modes, and the determination of stability boundaries. The conceptual development of the method proposed in this thesis offers, on the other hand, great flexibility to incorporate detailed models of the power system and to evaluate diverse measures of nonlinear modal interaction. Finally, results are presented from the application of the proposed analysis method to the study of nonlinear dynamic behavior in a machine-infinite bus system considering different levels of modeling detail

  1. Utilizing Nested Normal Form to Design Redundancy Free JSON Schemas

    Directory of Open Access Journals (Sweden)

    Wai Yin Mok

    2016-12-01

    JSON (JavaScript Object Notation) is a lightweight data-interchange format for the Internet. JSON is built on two structures: (1) a collection of name/value pairs and (2) an ordered list of values (http://www.json.org/). Because of this simple approach, JSON is easy to use and it has the potential to be the data interchange format of choice for the Internet. Similar to XML, JSON schemas allow nested structures to model hierarchical data. As data interchange over the Internet increases exponentially due to cloud computing or otherwise, redundancy-free JSON data are an attractive form of communication because they improve the quality of data communication through eliminating update anomalies. Nested Normal Form, a normal form for hierarchical data, is a precise characterization of redundancy. A nested table, or a hierarchical schema, is in Nested Normal Form if and only if it is free of redundancy caused by multivalued and functional dependencies. Using Nested Normal Form as a guide, this paper introduces a JSON schema design methodology that begins with UML use case diagrams, communication diagrams and class diagrams that model a system under study. Based on the use cases' execution frequencies and the data passed between involved parties in the communication diagrams, the proposed methodology selects classes from the class diagrams to be the roots of JSON scheme trees and repeatedly adds classes from the class diagram to the scheme trees as long as the schemas satisfy Nested Normal Form. This process continues until all of the classes in the class diagram have been added to some JSON scheme trees.
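The redundancy being eliminated is easy to see in miniature. If dept → manager is a functional dependency, a flat list of (dept, manager, employee) rows stores each manager once per employee, while nesting employees under their department stores it once. A small sketch with hypothetical data and field names (assumed for illustration, not taken from the paper):

```python
# Flat rows repeat each department's manager once per employee: redundancy
# caused by the functional dependency dept -> manager.
rows = [
    {"dept": "sales", "manager": "Ana", "employee": "Bo"},
    {"dept": "sales", "manager": "Ana", "employee": "Cy"},
    {"dept": "hr",    "manager": "Di",  "employee": "Ed"},
]

def nest(rows):
    """Group the flat rows into one JSON-style object per department; the
    manager is stored once and employees become a nested list."""
    depts = {}
    for r in rows:
        d = depts.setdefault(r["dept"],
                             {"dept": r["dept"], "manager": r["manager"], "employees": []})
        d["employees"].append(r["employee"])
    return list(depts.values())

nested = nest(rows)
```

A scheme tree built this way stores each fact implied by the dependency exactly once, which is the intuition behind Nested Normal Form.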

  2. Normal Forms for Fuzzy Logics: A Proof-Theoretic Approach

    Czech Academy of Sciences Publication Activity Database

    Cintula, Petr; Metcalfe, G.

    2007-01-01

    Roč. 46, č. 5-6 (2007), s. 347-363 ISSN 1432-0665 R&D Projects: GA MŠk(CZ) 1M0545 Institutional research plan: CEZ:AV0Z10300504 Keywords : fuzzy logic * normal form * proof theory * hypersequents Subject RIV: BA - General Mathematics Impact factor: 0.620, year: 2007

  3. A New One-Pass Transformation into Monadic Normal Form

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2003-01-01

    We present a translation from the call-by-value λ-calculus to monadic normal forms that includes short-cut boolean evaluation. The translation is higher-order, operates in one pass, duplicates no code, generates no chains of thunks, and is properly tail recursive. It makes a crucial use of symbolic...

  4. Automatic identification and normalization of dosage forms in drug monographs

    Science.gov (United States)

    2012-01-01

    Background Each day, millions of health consumers seek drug-related information on the Web. Despite some efforts in linking related resources, drug information is largely scattered across a wide variety of websites of differing quality and credibility. Methods As a step toward providing users with integrated access to multiple trustworthy drug resources, we aim to develop a method capable of identifying a drug's dosage form in addition to recognizing its name. We developed rules and patterns for identifying dosage forms from different sections of full-text drug monographs, and subsequently normalized them to standardized RxNorm dosage forms. Results Our method represents a significant improvement over a baseline lookup approach, achieving overall macro-averaged Precision of 80%, Recall of 98%, and F-Measure of 85%. Conclusions We successfully developed an automatic approach for drug dosage form identification, which is critical for building links between different drug-related resources. PMID:22336431
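The rule-and-pattern approach can be sketched with a toy dictionary of surface forms mapped to standardized names. The entries and target names below are illustrative assumptions (the paper normalizes to RxNorm dose forms with far richer rules):

```python
import re

# Illustrative lookup only: surface forms -> a standardized dosage-form name.
NORMALIZE = {
    "tab": "Oral Tablet", "tabs": "Oral Tablet", "tablet": "Oral Tablet",
    "tablets": "Oral Tablet", "capsule": "Oral Capsule", "capsules": "Oral Capsule",
    "oral solution": "Oral Solution",
}
# longest-first alternation so "tablets" wins over "tab"
PATTERN = re.compile(r"\b(" + "|".join(sorted(NORMALIZE, key=len, reverse=True)) + r")\b",
                     re.IGNORECASE)

def dosage_forms(text):
    """Return the set of normalized dosage forms mentioned in a monograph snippet."""
    return {NORMALIZE[m.group(1).lower()] for m in PATTERN.finditer(text)}

found = dosage_forms("Supplied as 10 mg tablets and as an oral solution.")
```

A pure lookup like this is essentially the paper's baseline; their improvement comes from section-aware rules applied on top of the normalization table.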

  5. Fast Bitwise Implementation of the Algebraic Normal Form Transform

    OpenAIRE

    Bakoev, Valentin

    2017-01-01

    The representation of Boolean functions by their algebraic normal forms (ANFs) is very important for cryptography, coding theory and other scientific areas. The ANFs are used in computing the algebraic degree of S-boxes, some other cryptographic criteria and parameters of error-correcting codes. Their applications require these criteria and parameters to be computed by fast algorithms. Hence the corresponding ANFs should also be obtained by fast algorithms. Here we continue o...
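The transform in question is the binary Möbius (ANF) transform, whose fast version is a butterfly over the truth table, analogous in shape to an FFT. A straightforward Python rendition of the classic algorithm (the paper's optimized implementation packs the table into machine words; this sketch keeps it readable):

```python
def anf(truth_table):
    """Binary Moebius transform: truth table -> ANF coefficient vector.
    Bit i of a position index corresponds to variable x_i; a 1 at position m
    in the result means the monomial prod_{i in m} x_i appears in the ANF."""
    t = list(truth_table)
    n = len(t).bit_length() - 1
    assert len(t) == 1 << n, "truth table length must be a power of two"
    for i in range(n):                 # one butterfly pass per variable
        step = 1 << i
        for x in range(len(t)):
            if x & step:
                t[x] ^= t[x ^ step]    # XOR-accumulate the co-factor
    return t

# f(x0, x1) = x0 AND x1 has truth table [0, 0, 0, 1] -> single ANF monomial x0*x1
coeffs = anf([0, 0, 0, 1])
```

The transform is its own inverse over GF(2), so applying `anf` twice recovers the truth table, a handy sanity check.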

  6. A New Normal Form for Multidimensional Mode Conversion

    International Nuclear Information System (INIS)

    Tracy, E. R.; Richardson, A. S.; Kaufman, A. N.; Zobin, N.

    2007-01-01

    Linear conversion occurs when two wave types, with distinct polarization and dispersion characteristics, are locally resonant in a nonuniform plasma [1]. In recent work, we have shown how to incorporate a ray-based (WKB) approach to mode conversion in numerical algorithms [2,3]. The method uses the ray geometry in the conversion region to guide the reduction of the full NxN system of wave equations to a 2x2 coupled pair which can be solved and matched to the incoming and outgoing WKB solutions. The algorithm in [2] assumes the ray geometry is hyperbolic and that, in ray phase space, there is an 'avoided crossing', which is the most common type of conversion. Here, we present a new formulation that can deal with more general types of conversion [4]. This formalism is based upon the fact (first proved in [5]) that it is always possible to put the 2x2 wave equation into a 'normal' form, such that the diagonal elements of the dispersion matrix Poisson-commute with the off-diagonals (at leading order). Therefore, if we use the diagonals (rather than the eigenvalues or the determinant) of the dispersion matrix as ray Hamiltonians, the off-diagonals will be conserved quantities. When cast into normal form, the 2x2 dispersion matrix has a very natural physical interpretation: the diagonals are the uncoupled ray Hamiltonians and the off-diagonals are the coupling. We discuss how to incorporate the normal form into ray tracing algorithms

  7. Normalization Of Thermal-Radiation Form-Factor Matrix

    Science.gov (United States)

    Tsuyuki, Glenn T.

    1994-01-01

    Report describes algorithm that adjusts form-factor matrix in TRASYS computer program, which calculates intraspacecraft radiative interchange among various surfaces and environmental heat loading from sources such as sun.

  8. Diagonalization and Jordan Normal Form--Motivation through "Maple"[R]

    Science.gov (United States)

    Glaister, P.

    2009-01-01

    Following an introduction to the diagonalization of matrices, one of the more difficult topics for students to grasp in linear algebra is the concept of Jordan normal form. In this note, we show how the important notions of diagonalization and Jordan normal form can be introduced and developed through the use of the computer algebra package…
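The pedagogical point carries over from Maple to any environment, even plain code. For a 2x2 matrix with a repeated eigenvalue and a one-dimensional eigenspace, the Jordan basis is an eigenvector v together with a generalized eigenvector w satisfying (A - λI)w = v. A hand-rolled sketch (illustrative only; the article itself works in Maple):

```python
def jordan_2x2_defective(A, lam):
    """For a 2x2 matrix A assumed to have double eigenvalue lam and only one
    eigenvector (the defective case), build P = [v | w] with (A - lam I) w = v,
    so that P^-1 A P is the Jordan block [[lam, 1], [0, lam]]."""
    B = [[A[0][0] - lam, A[0][1]], [A[1][0], A[1][1] - lam]]  # A - lam*I, rank 1
    # pick a generalized eigenvector w outside ker(B); v = B w is then an eigenvector
    for w in ([1.0, 0.0], [0.0, 1.0]):
        v = [B[0][0] * w[0] + B[0][1] * w[1], B[1][0] * w[0] + B[1][1] * w[1]]
        if abs(v[0]) + abs(v[1]) > 1e-12:
            break
    P = [[v[0], w[0]], [v[1], w[1]]]
    det = P[0][0] * P[1][1] - P[0][1] * P[1][0]
    Pinv = [[P[1][1] / det, -P[0][1] / det], [-P[1][0] / det, P[0][0] / det]]
    AP = [[sum(A[i][k] * P[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
    J = [[sum(Pinv[i][k] * AP[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
    return J, P

# A = [[3, 1], [-1, 1]] has double eigenvalue 2 but only one eigenvector
J, P = jordan_2x2_defective([[3.0, 1.0], [-1.0, 1.0]], lam=2.0)
```

Since B² = 0 here, v = Bw lies in ker(B) and is automatically independent of w, which is why the construction cannot fail in the defective case.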

  9. On the relationship between LTL normal forms and Büchi automata

    DEFF Research Database (Denmark)

    Li, Jianwen; Pu, Geguang; Zhang, Lijun

    2013-01-01

    In this paper, we revisit the problem of translating LTL formulas to Büchi automata. We first translate the given LTL formula into a special disjunctive normal form (DNF). The formula will be part of the state, and its DNF specifies the atomic properties that should hold immediately...
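The "now vs. next" split that such a DNF makes explicit can be sketched with the standard expansion law a U b ≡ b ∨ (a ∧ X(a U b)). The toy expander below (an illustration of the idea, not the authors' construction) rewrites a formula into disjuncts, each a set of atoms that must hold immediately plus an obligation deferred past the next step:

```python
def expand(f):
    """Rewrite an LTL formula into a list of disjuncts (now, next):
    `now` is a frozenset of atoms to hold immediately, `next` the obligation
    after one step (or None). Formulas: atoms are strings; tuples are
    ('or', a, b), ('and', a, b), ('U', a, b), ('X', a)."""
    if isinstance(f, str):
        return [(frozenset([f]), None)]
    op = f[0]
    if op == 'X':
        return [(frozenset(), f[1])]
    if op == 'or':
        return expand(f[1]) + expand(f[2])
    if op == 'and':
        out = []
        for n1, x1 in expand(f[1]):
            for n2, x2 in expand(f[2]):
                nxt = x1 if x2 is None else (x2 if x1 is None else ('and', x1, x2))
                out.append((n1 | n2, nxt))
        return out
    if op == 'U':  # a U b  ==  b  or  (a and X(a U b))
        return expand(f[2]) + expand(('and', f[1], ('X', f)))
    raise ValueError("unknown operator: %r" % (op,))

disjuncts = expand(('U', 'a', 'b'))
```

Each disjunct corresponds to one transition out of the automaton state carrying the formula: satisfy the `now` atoms, move to the `next` obligation.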

  10. Normal forms of invariant vector fields under a finite group action

    International Nuclear Information System (INIS)

    Sanchez Bringas, F.

    1992-07-01

    Let Γ be a finite subgroup of GL(n,C). This subgroup acts on the space of germs of holomorphic vector fields vanishing at the origin in Cⁿ. We prove a theorem of invariant conjugation to a normal form and linearization for the subspace of invariant elements and we give a description of these normal forms in dimension n=2. (author)

  11. Normal forms for Poisson maps and symplectic groupoids around Poisson transversals.

    Science.gov (United States)

    Frejlich, Pedro; Mărcuț, Ioan

    2018-01-01

    Poisson transversals are submanifolds in a Poisson manifold which intersect all symplectic leaves transversally and symplectically. In this communication, we prove a normal form theorem for Poisson maps around Poisson transversals. A Poisson map pulls a Poisson transversal back to a Poisson transversal, and our first main result states that simultaneous normal forms exist around such transversals, for which the Poisson map becomes transversally linear, and intertwines the normal form data of the transversals. Our second result concerns symplectic integrations. We prove that a neighborhood of a Poisson transversal is integrable exactly when the Poisson transversal itself is integrable, and in that case we prove a normal form theorem for the symplectic groupoid around its restriction to the Poisson transversal, which puts all structure maps in normal form. We conclude by illustrating our results with examples arising from Lie algebras.

  12. On some hypersurfaces with time like normal bundle in pseudo Riemannian space forms

    International Nuclear Information System (INIS)

    Kashani, S.M.B.

    1995-12-01

    In this work we classify immersed hypersurfaces with constant sectional curvature in pseudo Riemannian space forms if the normal bundle is timelike and the mean curvature is constant. (author). 9 refs

  13. Theory and praxis of map analysis in CHEF part 1: Linear normal form

    Energy Technology Data Exchange (ETDEWEB)

    Michelotti, Leo; /Fermilab

    2008-10-01

    This memo begins a series which, put together, could comprise the 'CHEF Documentation Project' if there were such a thing. The first--and perhaps only--three will telegraphically describe theory, algorithms, implementation and usage of the normal form map analysis procedures encoded in CHEF's collection of libraries. [1] This one will begin the sequence by explaining the linear manipulations that connect the Jacobian matrix of a symplectic mapping to its normal form. It is a 'Reader's Digest' version of material I wrote in Intermediate Classical Dynamics (ICD) [2] and randomly scattered across technical memos, seminar viewgraphs, and lecture notes for the past quarter century. Much of its content is old, well known, and in some places borders on the trivial. Nevertheless, completeness requires its inclusion. The primary objective is the 'fundamental theorem' on normalization written on page 8. I plan to describe the nonlinear procedures in a subsequent memo and devote a third to laying out algorithms and lines of code, connecting them with equations written in the first two. Originally this was to be done in one short paper, but I jettisoned that approach after its first section exceeded a dozen pages. The organization of this document is as follows. A brief description of notation is followed by a section containing a general treatment of the linear problem. After the 'fundamental theorem' is proved, two further subsections discuss the generation of equilibrium distributions and the issue of 'phase'. The final major section reviews parameterizations--that is, lattice functions--in two and four dimensions with a passing glance at the six-dimensional version. Appearances to the contrary, for the most part I have tried to restrict consideration to matters needed to understand the code in CHEF's libraries.
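
The linear step this memo describes, connecting the Jacobian matrix of a symplectic map to a pure rotation in normal coordinates, can be illustrated for one degree of freedom, where the Courant-Snyder parameterization writes the one-turn matrix as M = I cos μ + J sin μ and the tune follows from the trace. The sketch below is ours, not CHEF's API; parameter values are made up for illustration:

```python
import numpy as np

# One-turn matrix in Courant-Snyder form: M = I*cos(mu) + J*sin(mu),
# with J = [[alpha, beta], [-gamma, -alpha]] and beta*gamma - alpha^2 = 1,
# so that J*J = -I and det M = cos^2(mu) + sin^2(mu) = 1.
alpha, beta = 0.5, 2.0
gamma = (1.0 + alpha**2) / beta
mu = 2.0 * np.pi * 0.31           # phase advance for tune nu = 0.31

I2 = np.eye(2)
J = np.array([[alpha, beta], [-gamma, -alpha]])
M = I2 * np.cos(mu) + J * np.sin(mu)

# Recover the tune from the trace: trace(M) = 2*cos(mu) since trace(J) = 0.
nu = np.arccos(0.5 * np.trace(M)) / (2.0 * np.pi)
print(np.linalg.det(M), nu)       # det = 1 (symplectic), nu = 0.31
```

A nonlinear normal form analysis starts from exactly this linear normalization before attacking the higher-order terms (the subject of the second memo).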

  14. Theory and praxis of map analysis in CHEF part 2: Nonlinear normal form

    International Nuclear Information System (INIS)

    Michelotti, Leo

    2009-01-01

    This is the second of three memos describing how normal form map analysis is implemented in CHEF. The first (1) explained the manipulations required to assure that initial, linear transformations preserved Poincare invariants, thereby confirming correct normalization of action-angle coordinates. In this one, the transformation will be extended to nonlinear terms. The third, describing how the algorithms were implemented within the software of CHEF's libraries, most likely will never be written. The first section, Section 2, quickly lays out preliminary concepts and relationships. In Section 3, we shall review the perturbation theory - an iterative sequence of transformations that converts a nonlinear mapping into its normal form - and examine the equation which moves calculations from one step to the next. Following that is a section titled 'Interpretation', which identifies connections between the normalized mappings and idealized, integrable, fictitious Hamiltonian models. A final section contains closing comments, some of which may - but probably will not - preview work to be done later. My reasons for writing this memo and its predecessor have already been expressed. (1) To them can be added this: 'black box code' encourages users to proceed with little or no understanding of what it does or how it operates. So far, CHEF has avoided this trap admirably by failing to attract potential users. However, we reached a watershed last year: even I now have difficulty following the software through its maze of operations. Extensions to CHEF's physics functionalities, software upgrades, and even simple maintenance are becoming more difficult than they should. I hope these memos will mark parts of the maze for easier navigation in the future. 
Despite appearances to the contrary, I tried to include no (or very little) more than the minimum needed to understand what CHEF's nonlinear analysis modules do. As with the first memo, material has been lifted - and modified - from

  15. Theory and praxis of map analysis in CHEF part 2: Nonlinear normal form

    Energy Technology Data Exchange (ETDEWEB)

    Michelotti, Leo; /FERMILAB

    2009-04-01

    This is the second of three memos describing how normal form map analysis is implemented in CHEF. The first [1] explained the manipulations required to assure that initial, linear transformations preserved Poincare invariants, thereby confirming correct normalization of action-angle coordinates. In this one, the transformation will be extended to nonlinear terms. The third, describing how the algorithms were implemented within the software of CHEF's libraries, most likely will never be written. The first section, Section 2, quickly lays out preliminary concepts and relationships. In Section 3, we shall review the perturbation theory - an iterative sequence of transformations that converts a nonlinear mapping into its normal form - and examine the equation which moves calculations from one step to the next. Following that is a section titled 'Interpretation', which identifies connections between the normalized mappings and idealized, integrable, fictitious Hamiltonian models. A final section contains closing comments, some of which may - but probably will not - preview work to be done later. My reasons for writing this memo and its predecessor have already been expressed. [1] To them can be added this: 'black box code' encourages users to proceed with little or no understanding of what it does or how it operates. So far, CHEF has avoided this trap admirably by failing to attract potential users. However, we reached a watershed last year: even I now have difficulty following the software through its maze of operations. Extensions to CHEF's physics functionalities, software upgrades, and even simple maintenance are becoming more difficult than they should. I hope these memos will mark parts of the maze for easier navigation in the future. Despite appearances to the contrary, I tried to include no (or very little) more than the minimum needed to understand what CHEF's nonlinear analysis modules do. As with the first memo, material

  16. Normal Forms for Retarded Functional Differential Equations and Applications to Bogdanov-Takens Singularity

    Science.gov (United States)

    Faria, T.; Magalhaes, L. T.

    The paper addresses, for retarded functional differential equations (FDEs), the computation of normal forms associated with the flow on a finite-dimensional invariant manifold tangent to invariant spaces for the infinitesimal generator of the linearized equation at a singularity. A phase space appropriate to the computation of these normal forms is introduced, and adequate nonresonance conditions for the computation of the normal forms are derived. As an application, the general situation of Bogdanov-Takens singularity and its versal unfolding for scalar retarded FDEs with nondegeneracy at second order is considered, both in the general case and in the case of differential-delay equations of the form ẋ(t) = ƒ(x(t), x(t-1)).
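
For orientation, the finite-dimensional Bogdanov-Takens normal form with its versal unfolding, to which such computations reduce on the invariant manifold, can be written in one common convention (symbols here are generic, not necessarily those of the paper) as

```latex
\dot{x} = y, \qquad
\dot{y} = \beta_1 + \beta_2\, y + x^{2} + s\, x y, \qquad s = \pm 1,
```

where $\beta_1, \beta_2$ are the unfolding parameters and the sign $s$ is fixed by the second-order nondegeneracy conditions.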

  17. Mandibulary dental arch form differences between level four polynomial method and pentamorphic pattern for normal occlusion sample

    Directory of Open Access Journals (Sweden)

    Y. Yuliana

    2011-07-01

    The aim of an orthodontic treatment is to achieve aesthetics, health of the teeth and surrounding tissues, a functional occlusal relationship, and stability. The success of an orthodontic treatment is influenced by many factors, such as the diagnosis and the treatment plan. In order to make a diagnosis and a treatment plan, the medical record, clinical examination, radiographic examination, extraoral and intraoral photos, as well as study model analysis are needed. The purpose of this study was to evaluate the differences in dental arch form between the level four polynomial and the pentamorphic arch form, and to determine which one best suits a normal occlusion sample. This analytic comparative study was conducted at the Faculty of Dentistry, Universitas Padjadjaran, on 13 models by comparing the dental arch form obtained using the level four polynomial method, based on mathematical calculations, with the pentamorphic arch pattern, using mandibular normal occlusion as a control. The results obtained were tested using Student's t-test. The results indicate a significant difference from the mandibular normal occlusion dental arch form for both the level four polynomial method and the pentamorphic arch form. The level four polynomial fits better than the pentamorphic arch form.

  18. Optimization of accelerator parameters using normal form methods on high-order transfer maps

    Energy Technology Data Exchange (ETDEWEB)

    Snopok, Pavel [Michigan State Univ., East Lansing, MI (United States)

    2007-05-01

    Methods of analysis of the dynamics of ensembles of charged particles in collider rings are developed. The following problems are posed and solved using normal form transformations and other methods of perturbative nonlinear dynamics: (1) Optimization of the Tevatron dynamics: (a) Skew quadrupole correction of the dynamics of particles in the Tevatron in the presence of the systematic skew quadrupole errors in dipoles; (b) Calculation of the nonlinear tune shift with amplitude based on the results of measurements and the linear lattice information; (2) Optimization of the Muon Collider storage ring: (a) Computation and optimization of the dynamic aperture of the Muon Collider 50 x 50 GeV storage ring using higher order correctors; (b) 750 x 750 GeV Muon Collider storage ring lattice design matching the Tevatron footprint. The normal form coordinates have a very important advantage over the particle optical coordinates: if the transformation can be carried out successfully (general restrictions for that are not much stronger than the typical restrictions imposed on the behavior of the particles in the accelerator), then the motion in the new coordinates has a very clean representation, allowing one to extract more information about the dynamics of particles, and they are very convenient for the purposes of visualization. All the problem formulations include the derivation of the objective functions, which are later used in the optimization process using various optimization algorithms. Algorithms used to solve the problems are specific to collider rings, and applicable to similar problems arising on other machines of the same type. The details of the long-term behavior of the systems are studied to ensure their stability for the desired number of turns. The algorithm of the normal form transformation is of great value for such problems as it gives much extra information about the disturbing factors.
In addition to the fact that the dynamics of particles is represented

  19. Bioactive form of resveratrol in glioblastoma cells and its safety for normal brain cells

    Directory of Open Access Journals (Sweden)

    Xiao-Hong Shu

    2013-05-01

    Background: Resveratrol, a plant polyphenol found in grapes and many other natural foods, possesses a wide range of biological activities including cancer prevention. It has been recognized that resveratrol is intracellularly biotransformed to different metabolites, but no direct evidence has been available to ascertain its bioactive form because of the difficulty of keeping resveratrol unmetabolized in vivo or in vitro. It would therefore be worthwhile to elucidate the potential therapeutic implications of resveratrol metabolism using reliable resveratrol-sensitive cancer cells. Objective: To identify the real biological form of trans-resveratrol and to evaluate the safety of the effective anticancer dose of resveratrol for normal brain cells. Methods: The samples were prepared from the conditioned media and cell lysates of human glioblastoma U251 cells, and were purified by solid phase extraction (SPE). The samples were subjected to high performance liquid chromatography (HPLC) and liquid chromatography/tandem mass spectrometry (LC/MS) analysis. According to the metabolite(s), trans-resveratrol was biotransformed in vitro by the method described elsewhere, and the resulting solution was used to treat U251 cells. Meanwhile, the responses of U251 cells and primarily cultured rat normal brain cells (glial cells and neurons) to 100 μM trans-resveratrol were evaluated by multiple experimental methods. Results: The results revealed that resveratrol monosulfate was the major metabolite in U251 cells. A mixture containing about half resveratrol monosulfate was prepared in vitro, and this trans-resveratrol and resveratrol monosulfate mixture showed little inhibitory effect on U251 cells. It was also found that rat primary brain cells (PBCs) not only resisted 100 μM but also tolerated resveratrol treatment as high as 200 μM. Conclusions: Our study thus demonstrated that trans-resveratrol was the bioactive form in glioblastoma cells and, therefore, the biotransforming

  20. Quantifying Normal Craniofacial Form and Baseline Craniofacial Asymmetry in the Pediatric Population.

    Science.gov (United States)

    Cho, Min-Jeong; Hallac, Rami R; Ramesh, Jananie; Seaward, James R; Hermann, Nuno V; Darvann, Tron A; Lipira, Angelo; Kane, Alex A

    2018-03-01

    Restoring craniofacial symmetry is an important objective in the treatment of many craniofacial conditions. Normal form has been measured using anthropometry, cephalometry, and photography, yet all of these modalities have drawbacks. In this study, the authors define normal pediatric craniofacial form and craniofacial asymmetry using stereophotogrammetric images, which capture a densely sampled set of points on the form. After institutional review board approval, normal, healthy children (n = 533) with no known craniofacial abnormalities were recruited at well-child visits to undergo full head stereophotogrammetric imaging. The children's ages ranged from 0 to 18 years. A symmetric three-dimensional template was registered and scaled to each individual scan using 25 manually placed landmarks. The template was deformed to each subject's three-dimensional scan using a thin-plate spline algorithm and closest point matching. Age-based normal facial models were derived. Mean facial asymmetry and statistical characteristics of the population were calculated. The mean head asymmetry across all pediatric subjects was 1.5 ± 0.5 mm (range, 0.46 to 4.78 mm), and the mean facial asymmetry was 1.2 ± 0.6 mm (range, 0.4 to 5.4 mm). There were no significant differences in the mean head or facial asymmetry with age, sex, or race. Understanding the "normal" form and baseline distribution of asymmetry is an important anthropomorphic foundation. The authors present a method to quantify normal craniofacial form and baseline asymmetry in a large pediatric sample. The authors found that the normal pediatric craniofacial form is asymmetric, and does not change in magnitude with age, sex, or race.
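
A point-based asymmetry score of the kind reported above can be sketched in a few lines: mirror the landmark cloud across the midsagittal plane and take the mean closest-point distance. This is a deliberate simplification of the dense template-registration method the authors use; the landmark values below are made up:

```python
import numpy as np

# Toy asymmetry score: mirror landmarks across the midsagittal (x = 0) plane
# and compute the mean closest-point distance in mm.  A simplification of the
# dense, thin-plate-spline-registered method in the paper.
rng = np.random.default_rng(0)
landmarks = rng.normal(size=(25, 3))               # (x, y, z), hypothetical data

mirrored = landmarks * np.array([-1.0, 1.0, 1.0])  # reflect x -> -x

# Pairwise distances, then mean nearest-neighbor distance.
d = np.linalg.norm(landmarks[:, None, :] - mirrored[None, :, :], axis=-1)
asymmetry = d.min(axis=1).mean()
print(f"mean asymmetry: {asymmetry:.2f}")
```

A perfectly mirror-symmetric landmark set scores zero under this metric, which is the sense in which the paper's nonzero population values quantify baseline asymmetry.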

  1. A normal form approach to the theory of nonlinear betatronic motion

    International Nuclear Information System (INIS)

    Bazzani, A.; Todesco, E.; Turchetti, G.; Servizi, G.

    1994-01-01

    The betatronic motion of a particle in a circular accelerator is analysed using the transfer map description of the magnetic lattice. In the linear case the transfer matrix approach is shown to be equivalent to the Courant-Snyder theory: in the normal-coordinates representation, the transfer matrix is a pure rotation. When the nonlinear effects due to the multipolar components of the magnetic field are taken into account, a similar procedure is used: a nonlinear change of coordinates provides a normal form representation of the map, which exhibits explicit symmetry properties depending on the absence or presence of resonance relations among the linear tunes. The use of normal forms is illustrated in the simplest but significant model of a cell with a sextupolar nonlinearity, which is described by the quadratic Henon map. After recalling the basic theoretical results in Hamiltonian dynamics, we show how the normal forms describe the different topological structures of phase space such as KAM tori, chains of islands and chaotic regions; a critical comparison with the usual perturbation theory for Hamilton equations is given. The normal form theory is applied to compute the tune shift and deformation of the orbits for the lattices of the SPS and LHC accelerators, and scaling laws are obtained. Finally, the procedure for correcting the multipolar errors of the LHC, based on the analytic minimization of the tune shift computed via the normal forms, is described and the results for a model of the LHC are presented. This application, relevant for the lattice design, focuses on the advantages of normal forms with respect to tracking when parametric dependences have to be explored. (orig.)
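
The quadratic Henon map mentioned above can be written as a linear rotation composed with a sextupole-like kick. The following sketch uses one common convention (the paper's exact normalization may differ) and checks the area-preservation property that underlies the KAM-tori picture:

```python
import numpy as np

# Quadratic Henon map: sextupole-like kick p -> p + x^2, followed by a
# linear rotation with phase advance omega (one common convention).
def henon(x, p, omega):
    px = p + x**2                        # nonlinear kick
    c, s = np.cos(omega), np.sin(omega)
    return c * x + s * px, -s * x + c * px

omega = 2.0 * np.pi * 0.21               # linear tune 0.21 (illustrative)
x, p = 0.05, 0.0                         # small-amplitude initial condition
orbit = [(x, p)]
for _ in range(1000):
    x, p = henon(x, p, omega)
    orbit.append((x, p))

# The map is area preserving: its Jacobian determinant is identically 1,
# as required for the KAM tori and island chains discussed in the abstract.
c, s = np.cos(omega), np.sin(omega)
jac_det = (c + 2 * s * x) * c - s * (-s + 2 * c * x)
print(max(abs(u) for u, _ in orbit), jac_det)
```

Small-amplitude orbits stay bounded (inside the stable region), while larger amplitudes eventually escape; it is exactly this structure that the normal form describes analytically.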

  2. SYNTHESIS METHODS OF ALGEBRAIC NORMAL FORM OF MANY-VALUED LOGIC FUNCTIONS

    Directory of Open Access Journals (Sweden)

    A. V. Sokolov

    2016-01-01

    The rapid development of methods of error-correcting coding, cryptography, and signal synthesis theory based on the principles of many-valued logic determines the need for a more detailed study of the forms of representation of many-valued logic functions. In particular, the algebraic normal form of Boolean functions, also known as the Zhegalkin polynomial, which describes many of the cryptographic properties of Boolean functions well, is widely used. In this article, we formalize the notion of the algebraic normal form for many-valued logic functions. We develop a fast method for the synthesis of the algebraic normal form of 3-functions and 5-functions that works similarly to the Reed-Muller transform for Boolean functions: on the basis of recurrently synthesized transform matrices. We propose a hypothesis that determines the rules for the synthesis of these matrices for the transformation from the truth table to the coefficients of the algebraic normal form, and for the inverse transform, for any given number of variables of 3-functions or 5-functions. The article also introduces the definition of the algebraic degree of nonlinearity of many-valued logic functions and of S-boxes based on the principles of many-valued logic. The method of synthesis of the algebraic normal form of 3-functions is then applied to the known construction of recurrent synthesis of S-boxes of length N = 3^k, whereby their algebraic degrees of nonlinearity are computed. The results could be the basis for further theoretical research and practical applications such as the development of new cryptographic primitives, error-correcting codes, data compression algorithms, signal structures, and block and stream encryption algorithms, all based on the promising principles of many-valued logic. In addition, the fast method of synthesis of the algebraic normal form of many-valued logic functions is the basis for their software and hardware implementation.
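
The Boolean base case that the article generalizes, computing the Zhegalkin polynomial (algebraic normal form) from a truth table, is the fast binary Mobius/Reed-Muller transform. A minimal sketch:

```python
# Zhegalkin polynomial (ANF) of a Boolean function via the fast Mobius
# (binary Reed-Muller) transform: an in-place XOR butterfly, O(n * 2^n).
def anf(truth_table):
    """Truth table of length 2**n (0/1 entries) -> ANF coefficients.

    Coefficient at index m is the coefficient of the monomial whose
    variables correspond to the set bits of m.
    """
    a = list(truth_table)
    n = len(a)
    step = 1
    while step < n:
        for i in range(0, n, 2 * step):
            for j in range(i, i + step):
                a[j + step] ^= a[j]      # XOR butterfly
        step *= 2
    return a

# f(x1, x2) = x1 XOR x2, truth table over inputs 00, 01, 10, 11:
print(anf([0, 1, 1, 0]))   # -> [0, 1, 1, 0], i.e. f = x2 + x1 (no x1*x2 term)
```

For 3-functions and 5-functions the article replaces XOR with arithmetic modulo 3 or 5 and uses recurrently synthesized transform matrices, but the butterfly structure is analogous.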

  3. Principal Component Analysis for Normal-Distribution-Valued Symbolic Data.

    Science.gov (United States)

    Wang, Huiwen; Chen, Meiling; Shi, Xiaojun; Li, Nan

    2016-02-01

    This paper puts forward a new approach to principal component analysis (PCA) for normal-distribution-valued symbolic data, which has vast potential for applications in economics and management. We derive a full set of numerical characteristics and the variance-covariance structure for such data, which forms the foundation of our analytical PCA approach. Our approach is able to use all of the variance information in the original data, unlike the prevailing representative-type approaches in the literature, which use only centers, vertices, etc. The paper also provides an accurate approach to constructing the observations in a PC space based on the linear additivity property of the normal distribution. The effectiveness of the proposed method is illustrated by simulated numerical experiments. Finally, our method is applied to explain the puzzle of the risk-return tradeoff in China's stock market.
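
The numeric backbone that the paper extends, classical PCA via the eigendecomposition of the sample covariance matrix, can be sketched as follows (this does not implement the authors' variance-covariance structure for distribution-valued data; the data here are synthetic):

```python
import numpy as np

# Classical PCA: center the data, eigendecompose the sample covariance,
# and project observations into PC space.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3)) * np.array([3.0, 1.0, 0.2])  # synthetic data

Xc = X - X.mean(axis=0)                 # center
C = Xc.T @ Xc / (len(X) - 1)            # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)    # eigh returns ascending order
order = eigvals.argsort()[::-1]         # sort descending by variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

scores = Xc @ eigvecs                   # observations in PC space
print(eigvals)                          # variances along the PCs, descending
```

By construction, the sample variance of each score column equals the corresponding eigenvalue; the paper's contribution is to carry this machinery over to observations that are themselves normal distributions rather than points.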

  4. Reconstruction of normal forms by learning informed observation geometries from data.

    Science.gov (United States)

    Yair, Or; Talmon, Ronen; Coifman, Ronald R; Kevrekidis, Ioannis G

    2017-09-19

    The discovery of physical laws consistent with empirical observations is at the heart of (applied) science and engineering. These laws typically take the form of nonlinear differential equations depending on parameters; dynamical systems theory provides, through the appropriate normal forms, an "intrinsic" prototypical characterization of the types of dynamical regimes accessible to a given model. Using an implementation of data-informed geometry learning, we directly reconstruct the relevant "normal forms": a quantitative mapping from empirical observations to prototypical realizations of the underlying dynamics. Interestingly, the state variables and the parameters of these realizations are inferred from the empirical observations; without prior knowledge or understanding, they parametrize the dynamics intrinsically without explicit reference to fundamental physical quantities.

  5. Closed-form confidence intervals for functions of the normal mean and standard deviation.

    Science.gov (United States)

    Donner, Allan; Zou, G Y

    2012-08-01

    Confidence interval methods for a normal mean and standard deviation are well known and simple to apply. However, the same cannot be said for important functions of these parameters. These functions include the normal distribution percentiles, the Bland-Altman limits of agreement, the coefficient of variation and Cohen's effect size. We present a simple approach to this problem by using variance estimates recovered from confidence limits computed for the mean and standard deviation separately. All resulting confidence intervals have closed forms. Simulation results demonstrate that this approach performs very well for limits of agreement, coefficients of variation and their differences.
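
The recovered-variance idea can be sketched for one of the listed functions, a normal percentile θ = μ + z_p σ: compute separate closed-form limits for the mean (t-based) and the standard deviation (chi-square-based), then combine them. Function and variable names below are ours, and this is an illustrative sketch of the general approach rather than the paper's exact formulas:

```python
import numpy as np
from scipy import stats

def percentile_ci(x, p=0.975, alpha=0.05):
    """Sketch of a closed-form CI for theta = mu + z_p * sigma."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    m, s = x.mean(), x.std(ddof=1)
    zp = stats.norm.ppf(p)

    # Separate closed-form CIs: t-based for the mean, chi-square for the SD.
    t = stats.t.ppf(1 - alpha / 2, n - 1)
    lm, um = m - t * s / np.sqrt(n), m + t * s / np.sqrt(n)
    ls = s * np.sqrt((n - 1) / stats.chi2.ppf(1 - alpha / 2, n - 1))
    us = s * np.sqrt((n - 1) / stats.chi2.ppf(alpha / 2, n - 1))

    # Combine the recovered variance estimates (MOVER-style).
    theta = m + zp * s
    L = theta - np.sqrt((m - lm) ** 2 + (zp * (s - ls)) ** 2)
    U = theta + np.sqrt((um - m) ** 2 + (zp * (us - s)) ** 2)
    return L, theta, U

rng = np.random.default_rng(2)
print(percentile_ci(rng.normal(10.0, 2.0, size=50)))
```

The same recovery-and-combine step yields closed forms for the Bland-Altman limits of agreement and the coefficient of variation, which is the paper's point.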

  6. Limiting Normal Operator in Quasiconvex Analysis

    Czech Academy of Sciences Publication Activity Database

    Aussel, D.; Pištěk, Miroslav

    2015-01-01

    Roč. 23, č. 4 (2015), s. 669-685 ISSN 1877-0533 R&D Projects: GA ČR GA15-00735S Institutional support: RVO:67985556 Keywords : Quasiconvex function * Sublevel set * Normal operator Subject RIV: BA - General Mathematics Impact factor: 0.973, year: 2015 http://library.utia.cas.cz/separaty/2015/MTR/pistek-0453552.pdf

  7. Normal mode analysis for linear resistive magnetohydrodynamics

    International Nuclear Information System (INIS)

    Kerner, W.; Lerbinger, K.; Gruber, R.; Tsunematsu, T.

    1984-10-01

    The compressible, resistive MHD equations are linearized around an equilibrium with cylindrical symmetry and solved numerically as a complex eigenvalue problem. This normal-mode code makes it possible to solve for very small resistivities, η ∝ 10^{-10}. The scaling of growth rates and layer widths agrees very well with analytical theory. In particular, the influence of both current and pressure on the instabilities is studied in detail, and the effect of resistivity on the ideally unstable internal kink is analyzed. (orig.)
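
The generic numerical step of such a normal-mode code is a (generally complex) eigenvalue problem A x = λ B x for the discretized linearized operator, with Re λ > 0 signalling an instability. The matrices below are toy stand-ins, not the resistive MHD operator:

```python
import numpy as np
from scipy.linalg import eig

# Generic normal-mode step: solve the generalized eigenvalue problem
# A x = lambda B x; eigenvalues with positive real part are growing modes.
# The 3x3 matrices here are illustrative only.
A = np.array([[0.0,  1.0,  0.0],
              [-2.0, -0.1, 0.5],
              [0.0,  0.3, -0.05]])
B = np.eye(3)

lams, vecs = eig(A, B)
growth = lams.real.max()                 # largest growth rate
print(sorted(lams, key=lambda z: -z.real), growth)
```

In the actual code, resolving the thin resistive layers at η ~ 10^{-10} is the hard part; the eigenvalue solve itself has exactly this shape.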

  8. On the construction of the Kolmogorov normal form for the Trojan asteroids

    CERN Document Server

    Gabern, F; Locatelli, U

    2004-01-01

    In this paper we focus on the stability of the Trojan asteroids for the planar Restricted Three-Body Problem (RTBP), by extending the usual techniques for the neighbourhood of an elliptic point to derive results in a larger vicinity. Our approach is based on the numerical determination of the frequencies of the asteroid and the effective computation of the Kolmogorov normal form for the corresponding torus. This procedure has been applied to the first 34 Trojan asteroids of the IAU Asteroid Catalog, and it has worked successfully for 23 of them. The construction of this normal form allows for computer-assisted proofs of stability. To show it, we have implemented a proof of existence of families of invariant tori close to a given asteroid, for a high order expansion of the Hamiltonian. This proof has been successfully applied to three Trojan asteroids.

  9. Generating All Permutations by Context-Free Grammars in Chomsky Normal Form

    NARCIS (Netherlands)

    Asveld, P.R.J.; Spoto, F.; Scollo, Giuseppe; Nijholt, Antinus

    2003-01-01

    Let $L_n$ be the finite language of all $n!$ strings that are permutations of $n$ different symbols ($n\geq 1$). We consider context-free grammars $G_n$ in Chomsky normal form that generate $L_n$. In particular we study a few families $\{G_n\}_{n\geq 1}$, satisfying $L(G_n)=L_n$ for $n\geq 1$, with

  10. Generating all permutations by context-free grammars in Chomsky normal form

    NARCIS (Netherlands)

    Asveld, P.R.J.

    2006-01-01

    Let $L_n$ be the finite language of all $n!$ strings that are permutations of $n$ different symbols ($n\geq 1$). We consider context-free grammars $G_n$ in Chomsky normal form that generate $L_n$. In particular we study a few families $\{G_n\}_{n\geq 1}$, satisfying $L(G_n)=L_n$ for $n\geq 1$, with

  11. Generating All Permutations by Context-Free Grammars in Chomsky Normal Form

    NARCIS (Netherlands)

    Asveld, P.R.J.

    2004-01-01

    Let $L_n$ be the finite language of all $n!$ strings that are permutations of $n$ different symbols ($n\geq 1$). We consider context-free grammars $G_n$ in Chomsky normal form that generate $L_n$. In particular we study a few families $\{G_n\}_{n\geq 1}$, satisfying $L(G_n)=L_n$ for $n\geq 1$, with
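
In Chomsky normal form every rule is either binary (X -> Y Z) or terminal (X -> a), so the strings of a given length derivable from a symbol can be enumerated by splitting the length over the two children. The toy grammar below generates $L_2 = \{ab, ba\}$; the families $G_n$ studied in these papers are larger but have the same rule shape:

```python
# Enumerate the length-n strings generated by a CNF grammar.
# Toy grammar for L_2 (permutations of two symbols); rule names are ours.
RULES = {
    "S": [("A", "B"), ("B", "A")],   # binary rules X -> Y Z
    "A": ["a"],                      # terminal rules X -> a
    "B": ["b"],
}

def generate(symbol, n):
    """All strings of length n derivable from `symbol` under RULES."""
    out = set()
    for rhs in RULES.get(symbol, []):
        if isinstance(rhs, str):                  # terminal rule
            if n == 1:
                out.add(rhs)
        else:                                     # binary rule X -> Y Z
            y, z = rhs
            for k in range(1, n):                 # split the length
                for left in generate(y, k):
                    for right in generate(z, n - k):
                        out.add(left + right)
    return out

print(sorted(generate("S", 2)))   # -> ['ab', 'ba']
```

The interesting question in the papers is how small such grammars can be as $n$ grows, since $|L_n| = n!$ forces the grammar size up.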

  12. THE METHOD OF CONSTRUCTING A BOOLEAN FORMULA OF A POLYGON IN THE DISJUNCTIVE NORMAL FORM

    Directory of Open Access Journals (Sweden)

    A. A. Butov

    2014-01-01

    The paper focuses on finalizing the method of finding a Boolean formula of a polygon in disjunctive normal form, described in the previous article [1]. The improved method eliminates the drawback associated with the existence of a class of problems for which the solution is only approximate. The proposed method always allows one to find an exact solution. The method can be used, in particular, in systems for computer-aided design of integrated circuit topology.

  13. Planar undulator motion excited by a fixed traveling wave. Quasiperiodic averaging normal forms and the FEL pendulum

    Energy Technology Data Exchange (ETDEWEB)

    Ellison, James A.; Heinemann, Klaus [New Mexico Univ., Albuquerque, NM (United States). Dept. of Mathematics and Statistics; Vogt, Mathias [Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany); Gooden, Matthew [North Carolina State Univ., Raleigh, NC (United States). Dept. of Physics

    2013-03-15

    We present a mathematical analysis of planar motion of energetic electrons moving through a planar dipole undulator, excited by a fixed planar polarized plane wave Maxwell field in the X-Ray FEL regime. Our starting point is the 6D Lorentz system, which allows planar motions, and we examine this dynamical system as the wavelength λ of the traveling wave varies. By scalings and transformations the 6D system is reduced, without approximation, to a 2D system in a form for a rigorous asymptotic analysis using the Method of Averaging (MoA), a long time perturbation theory. The two dependent variables are a scaled energy deviation and a generalization of the so-called ponderomotive phase. As λ varies the system passes through resonant and nonresonant (NR) zones and we develop NR and near-to-resonant (NtoR) MoA normal form approximations. The NtoR normal forms contain a parameter which measures the distance from a resonance. For a special initial condition, for the planar motion and on resonance, the NtoR normal form reduces to the well known FEL pendulum system. We then state and prove NR and NtoR first-order averaging theorems which give explicit error bounds for the normal form approximations. We prove the theorems in great detail, giving the interested reader a tutorial on mathematically rigorous perturbation theory in a context where the proofs are easily understood. The proofs are novel in that they do not use a near identity transformation and they use a system of differential inequalities. The NR case is an example of quasiperiodic averaging where the small divisor problem enters in the simplest possible way. To our knowledge the planar problem has not been analyzed with the generality we aspire to here nor has the standard FEL pendulum system been derived with associated error bounds as we do here. We briefly discuss the low gain theory in light of our NtoR normal form.
Our mathematical treatment of the noncollective FEL beam dynamics problem in

  14. Planar undulator motion excited by a fixed traveling wave. Quasiperiodic averaging normal forms and the FEL pendulum

    International Nuclear Information System (INIS)

    Ellison, James A.; Heinemann, Klaus; Gooden, Matthew

    2013-03-01

    We present a mathematical analysis of planar motion of energetic electrons moving through a planar dipole undulator, excited by a fixed planar polarized plane wave Maxwell field in the X-Ray FEL regime. Our starting point is the 6D Lorentz system, which allows planar motions, and we examine this dynamical system as the wavelength λ of the traveling wave varies. By scalings and transformations the 6D system is reduced, without approximation, to a 2D system in a form for a rigorous asymptotic analysis using the Method of Averaging (MoA), a long time perturbation theory. The two dependent variables are a scaled energy deviation and a generalization of the so-called ponderomotive phase. As λ varies the system passes through resonant and nonresonant (NR) zones and we develop NR and near-to-resonant (NtoR) MoA normal form approximations. The NtoR normal forms contain a parameter which measures the distance from a resonance. For a special initial condition, for the planar motion and on resonance, the NtoR normal form reduces to the well known FEL pendulum system. We then state and prove NR and NtoR first-order averaging theorems which give explicit error bounds for the normal form approximations. We prove the theorems in great detail, giving the interested reader a tutorial on mathematically rigorous perturbation theory in a context where the proofs are easily understood. The proofs are novel in that they do not use a near identity transformation and they use a system of differential inequalities. The NR case is an example of quasiperiodic averaging where the small divisor problem enters in the simplest possible way. To our knowledge the planar problem has not been analyzed with the generality we aspire to here nor has the standard FEL pendulum system been derived with associated error bounds as we do here. We briefly discuss the low gain theory in light of our NtoR normal form. Our mathematical treatment of the noncollective FEL beam dynamics problem in the
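
For orientation, the pendulum system to which the on-resonance NtoR normal form reduces can be written, in one common normalization (symbols here are ours, not necessarily those of the memo; ψ is the ponderomotive phase and η the scaled energy deviation), as

```latex
\frac{d\psi}{d\tau} = \eta , \qquad
\frac{d\eta}{d\tau} = -\,\Omega^{2}\sin\psi ,
```

i.e. the standard pendulum equation $\ddot{\psi} + \Omega^{2}\sin\psi = 0$, with $\Omega$ playing the role of the synchrotron-like oscillation frequency near resonance.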

  15. Morphosyntactic Neural Analysis for Generalized Lexical Normalization

    Science.gov (United States)

    Leeman-Munk, Samuel Paul

    2016-01-01

    The phenomenal growth of social media, web forums, and online reviews has spurred a growing interest in automated analysis of user-generated text. At the same time, a proliferation of voice recordings and efforts to archive culture heritage documents are fueling demand for effective automatic speech recognition (ASR) and optical character…

  16. About normal distribution on SO(3) group in texture analysis

    Science.gov (United States)

    Savyolova, T. I.; Filatov, S. V.

    2017-12-01

    This article studies and compares different normal distributions (NDs) on the SO(3) group, which are used in texture analysis. Those NDs are: the Fisher normal distribution (FND), the Bunge normal distribution (BND), the central normal distribution (CND) and the wrapped normal distribution (WND). All of the previously mentioned NDs are central functions on the SO(3) group. The CND is a subcase of normal CLT-motivated distributions on SO(3) (CLT here is Parthasarathy's central limit theorem). The WND is motivated by the CLT in R^3 and mapped to the SO(3) group. A Monte Carlo method for modeling normally distributed values was studied for both the CND and the WND. All of the NDs mentioned above are used for modeling different components of the crystallite orientation distribution function in texture analysis.
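A minimal Monte Carlo sketch of one such construction, assuming a CLT-style normal concentrated near the identity: draw a Gaussian tangent vector in R^3 and push it to SO(3) via the Rodrigues (axis-angle) map. The paper's exact CND/WND constructions may differ; `sigma` and the function names are assumptions.

```python
import math
import random

def rodrigues(v):
    """Axis-angle vector in R^3 -> 3x3 rotation matrix (Rodrigues formula)."""
    theta = math.sqrt(sum(comp * comp for comp in v))
    if theta < 1e-12:
        return [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
    kx, ky, kz = (comp / theta for comp in v)
    c, s, C = math.cos(theta), math.sin(theta), 1.0 - math.cos(theta)
    return [
        [c + kx * kx * C,      kx * ky * C - kz * s, kx * kz * C + ky * s],
        [ky * kx * C + kz * s, c + ky * ky * C,      ky * kz * C - kx * s],
        [kz * kx * C - ky * s, kz * ky * C + kx * s, c + kz * kz * C],
    ]

def sample_rotation(sigma, rng=random):
    # Draw a tangent vector ~ N(0, sigma^2 I) in R^3 and map it to SO(3);
    # for small sigma this gives a normal-like law concentrated near the identity.
    v = [rng.gauss(0.0, sigma) for _ in range(3)]
    return rodrigues(v)
```

Each sample is a proper rotation (orthogonal with determinant +1), so the output can be fed directly into orientation-distribution-function estimators.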

  17. High molecular gas fractions in normal massive star-forming galaxies in the young Universe.

    Science.gov (United States)

    Tacconi, L J; Genzel, R; Neri, R; Cox, P; Cooper, M C; Shapiro, K; Bolatto, A; Bouché, N; Bournaud, F; Burkert, A; Combes, F; Comerford, J; Davis, M; Schreiber, N M Förster; Garcia-Burillo, S; Gracia-Carpio, J; Lutz, D; Naab, T; Omont, A; Shapley, A; Sternberg, A; Weiner, B

    2010-02-11

    Stars form from cold molecular interstellar gas. As this is relatively rare in the local Universe, galaxies like the Milky Way form only a few new stars per year. Typical massive galaxies in the distant Universe formed stars an order of magnitude more rapidly. Unless star formation was significantly more efficient, this difference suggests that young galaxies were much more molecular-gas rich. Molecular gas observations in the distant Universe have so far largely been restricted to very luminous, rare objects, including mergers and quasars, and accordingly we do not yet have a clear idea about the gas content of more normal (albeit massive) galaxies. Here we report the results of a survey of molecular gas in samples of typical massive-star-forming galaxies at mean redshifts of about 1.2 and 2.3, when the Universe was respectively 40% and 24% of its current age. Our measurements reveal that distant star forming galaxies were indeed gas rich, and that the star formation efficiency is not strongly dependent on cosmic epoch. The average fraction of cold gas relative to total galaxy baryonic mass at z = 2.3 and z = 1.2 is respectively about 44% and 34%, three to ten times higher than in today's massive spiral galaxies. The slow decrease between z approximately 2 and z approximately 1 probably requires a mechanism of semi-continuous replenishment of fresh gas to the young galaxies.

  18. Generating All Circular Shifts by Context-Free Grammars in Greibach Normal Form

    NARCIS (Netherlands)

    Asveld, Peter R.J.

    2007-01-01

    For each alphabet Σn = {a1,a2,…,an}, linearly ordered by a1 < a2 < ⋯ < an, let Cn be the language of circular or cyclic shifts over Σn, i.e., Cn = {a1a2 ⋯ an-1an, a2a3 ⋯ ana1,…,ana1 ⋯ an-2an-1}. We study a few families of context-free grammars Gn (n ≥1) in Greibach normal form such that Gn generates
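Independently of the grammars Gn, the language Cn itself is trivial to enumerate, which is handy for checking any proposed grammar against the intended language (a small helper; the example word is illustrative):

```python
def circular_shifts(word):
    """The language C_n of all cyclic shifts of the word a1 a2 ... an."""
    return [word[i:] + word[:i] for i in range(len(word))]
```

For a word over n distinct letters this yields exactly n strings, matching the definition of Cn above.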

  19. Normal form of particle motion under the influence of an ac dipole

    Directory of Open Access Journals (Sweden)

    R. Tomás

    2002-05-01

    ac dipoles in accelerators are used to excite coherent betatron oscillations at a drive frequency close to the tune. These beam oscillations may last arbitrarily long and, in principle, there is no significant emittance growth if the ac dipole is adiabatically turned on and off. Therefore the ac dipole seems to be an adequate tool for nonlinear diagnostics provided the particle motion is well described in the presence of the ac dipole and nonlinearities. Normal forms and Lie algebra are powerful tools to study the nonlinear content of an accelerator lattice. In this article a way to obtain the normal form of the Hamiltonian of an accelerator with an ac dipole is described. The particle motion to first order in the nonlinearities is derived using Lie algebra techniques. The dependence of the Hamiltonian terms on the longitudinal coordinate is studied showing that they vary differently depending on the ac dipole parameters. The relation is given between the lines of the Fourier spectrum of the turn-by-turn motion and the Hamiltonian terms.

  20. Principal Typings in a Restricted Intersection Type System for Beta Normal Forms with De Bruijn Indices

    Directory of Open Access Journals (Sweden)

    Daniel Ventura

    2010-01-01

    The lambda-calculus with de Bruijn indices assembles each alpha-class of lambda-terms in a unique term, using indices instead of variable names. Intersection types provide finitary type polymorphism and can characterise normalisable lambda-terms through the property that a term is normalisable if and only if it is typeable. To be closer to computations and to simplify the formalisation of the atomic operations involved in beta-contractions, several calculi of explicit substitution were developed mostly with de Bruijn indices. Versions of explicit substitutions calculi without types and with simple type systems are well investigated in contrast to versions with more elaborate type systems such as intersection types. In previous work, we introduced a de Bruijn version of the lambda-calculus with an intersection type system and proved that it preserves subject reduction, a basic property of type systems. In this paper a version with de Bruijn indices of an intersection type system originally introduced to characterise principal typings for beta-normal forms is presented. We present the characterisation in this new system and the corresponding versions for the type inference and the reconstruction of normal forms from principal typings algorithms. We briefly discuss the failure of the subject reduction property and some possible solutions for it.

  1. Log-Normality and Multifractal Analysis of Flame Surface Statistics

    Science.gov (United States)

    Saha, Abhishek; Chaudhuri, Swetaprovo; Law, Chung K.

    2013-11-01

    The turbulent flame surface is typically highly wrinkled and folded at a multitude of scales controlled by various flame properties. It is useful if the information contained in this complex geometry can be projected onto a simpler regular geometry for the use of spectral, wavelet or multifractal analyses. Here we investigate local flame surface statistics of a turbulent flame expanding under constant pressure. First, the statistics of the local length ratio are experimentally obtained from high-speed Mie scattering images. For a spherically expanding flame, the length ratio on the measurement plane, at predefined equiangular sectors, is defined as the ratio of the actual flame length to the length of a circular arc of radius equal to the average radius of the flame. Assuming an isotropic distribution of such flame segments, we convolute suitable forms of the length-ratio probability distribution functions (pdfs) to arrive at the corresponding area-ratio pdfs. Both pdfs are found to be near log-normally distributed and show self-similar behavior with increasing radius. The near log-normality and rather intermittent behavior of the flame length ratio suggest similarity with dissipation-rate quantities, which motivates multifractal analysis. Currently at Indian Institute of Science, India.
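A quick way to probe near log-normality, sketched here on synthetic stand-in data (the experimental length-ratio pdfs above are only approximately log-normal; the parameters 0.1 and 0.3 are assumptions): if the data are log-normal, their logarithms are Gaussian, so the sample skewness of the logs should be near zero.

```python
import math
import random

rng = random.Random(0)
# Stand-in for measured length ratios: exactly log-normal synthetic samples.
ratios = [math.exp(rng.gauss(0.1, 0.3)) for _ in range(20000)]

logs = [math.log(r) for r in ratios]
n = len(logs)
mean = sum(logs) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in logs) / n)
# Third standardized moment of log(data); ~0 for log-normal data.
skewness = sum(((x - mean) / sd) ** 3 for x in logs) / n
```

On real flame data, a small but nonzero skewness of the logs would quantify the departure from exact log-normality that motivates the multifractal treatment.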

  2. Analysis of the normal optical, Michel and molecular potentials on ...

    Indian Academy of Sciences (India)

    journal of physics, June 2016, pp. 1275–1286. Analysis of the normal optical, Michel and molecular potentials: the levels are obtained for the three optical potentials to estimate the quality of the fits. The experimental angular distribution data for the 40Ca(6Li, d)44Ti reaction are analysed using the normal optical, Michel and molecular potentials within the framework.

  3. A Mathematical Framework for Critical Transitions: Normal Forms, Variance and Applications

    Science.gov (United States)

    Kuehn, Christian

    2013-06-01

    Critical transitions occur in a wide variety of applications including mathematical biology, climate change, human physiology and economics. Therefore it is highly desirable to find early-warning signs. We show that it is possible to classify critical transitions by using bifurcation theory and normal forms in the singular limit. Based on this elementary classification, we analyze stochastic fluctuations and calculate scaling laws of the variance of stochastic sample paths near critical transitions for fast-subsystem bifurcations up to codimension two. The theory is applied to several models: the Stommel-Cessi box model for the thermohaline circulation from geoscience, an epidemic-spreading model on an adaptive network, an activator-inhibitor switch from systems biology, a predator-prey system from ecology and to the Euler buckling problem from classical mechanics. For the Stommel-Cessi model we compare different detrending techniques to calculate early-warning signs. In the epidemics model we show that link densities could be better variables for prediction than population densities. The activator-inhibitor switch demonstrates effects in three time-scale systems and points out that excitable cells and molecular units have information for subthreshold prediction. In the predator-prey model explosive population growth near a codimension-two bifurcation is investigated and we show that early-warnings from normal forms can be misleading in this context. In the biomechanical model we demonstrate that early-warning signs for buckling depend crucially on the control strategy near the instability which illustrates the effect of multiplicative noise.
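The variance scaling used above as an early-warning sign can be checked numerically for the simplest case, the fold (saddle-node) normal form dx = (λ + x²)dt + σ dW: linearizing about the stable branch x* = -√(-λ) gives an Ornstein–Uhlenbeck process whose stationary variance σ²/(4√(-λ)) blows up as λ → 0⁻. The sketch below (all parameter values are assumptions) simulates the linearization at two values of λ:

```python
import math
import random

def stationary_variance(lmbda, sigma=0.05, dt=0.01, steps=200000, seed=1):
    """Sample variance of the OU linearization of the fold normal form
    about its stable branch x* = -sqrt(-lmbda); requires lmbda < 0."""
    rng = random.Random(seed)
    rate = 2.0 * math.sqrt(-lmbda)  # linear decay rate at x*
    x, acc, acc2 = 0.0, 0.0, 0.0
    for _ in range(steps):
        # Euler-Maruyama step for dx = -rate * x dt + sigma dW
        x += -rate * x * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        acc += x
        acc2 += x * x
    mean = acc / steps
    return acc2 / steps - mean * mean

var_far = stationary_variance(-1.0)    # far from the transition
var_near = stationary_variance(-0.04)  # close to the fold at lmbda = 0
# Theory: Var = sigma^2 / (2 * rate), so var_near / var_far should be about 5.
```

The rising variance of the fluctuations as λ approaches 0 is exactly the kind of scaling-law early-warning sign the abstract describes for codimension-one fast-subsystem bifurcations.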

  4. Processability analysis of candidate waste forms

    International Nuclear Information System (INIS)

    Gould, T.H. Jr.; Dunson, J.B. Jr.; Eisenberg, A.M.; Haight, H.G. Jr.; Mello, V.E.; Schuyler, R.L. III.

    1982-01-01

    A quantitative merit evaluation, or processability analysis, was performed to assess the relative difficulty of remote processing of Savannah River Plant high-level wastes for seven alternative waste form candidates. The reference borosilicate glass process was rated as the simplest, followed by FUETAP concrete, glass marbles in a lead matrix, high-silica glass, crystalline ceramics (SYNROC-D and tailored ceramics), and coated ceramic particles. Cost estimates for the borosilicate glass, high-silica glass, and ceramic waste form processing facilities are also reported

  5. Child in a Form: The Definition of Normality and Production of Expertise in Teacher Statement Forms--The Case of Northern Finland, 1951-1990

    Science.gov (United States)

    Koskela, Anne; Vehkalahti, Kaisa

    2017-01-01

    This article shows the importance of paying attention to the role of professional devices, such as standardised forms, as producers of normality and deviance in the history of education. Our case study focused on the standardised forms used by teachers during child guidance clinic referrals and transfers to special education in northern Finland,…

  6. Normal mode analysis and applications in biological physics.

    Science.gov (United States)

    Dykeman, Eric C; Sankey, Otto F

    2010-10-27

    Normal mode analysis has become a popular and often used theoretical tool in the study of functional motions in enzymes, viruses, and large protein assemblies. The use of normal modes in the study of these motions is often extremely fruitful since many of the functional motions of large proteins can be described using just a few normal modes which are intimately related to the overall structure of the protein. In this review, we present a broad overview of several popular methods used in the study of normal modes in biological physics including continuum elastic theory, the elastic network model, and a new all-atom method, recently developed, which is capable of computing a subset of the low frequency vibrational modes exactly. After a review of the various methods, we present several examples of applications of normal modes in the study of functional motions, with an emphasis on viral capsids.
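A minimal sketch of the elastic network model mentioned above, in its anisotropic (ANM) form: Hookean springs between sites give a 3N×3N Hessian whose low-frequency eigenvectors approximate the functional modes. Here random coordinates and an infinite cutoff stand in for a real structure (both are assumptions for illustration); the six zero eigenvalues are the rigid-body translations and rotations.

```python
import numpy as np

def anm_hessian(coords, gamma=1.0, cutoff=np.inf):
    """Anisotropic network model Hessian: identical Hookean springs
    between all site pairs within the cutoff distance."""
    n = len(coords)
    H = np.zeros((3 * n, 3 * n))
    for i in range(n):
        for j in range(i + 1, n):
            d = coords[j] - coords[i]
            r2 = float(d @ d)
            if math := (r2 ** 0.5) > cutoff:
                continue
            block = -gamma * np.outer(d, d) / r2
            H[3*i:3*i+3, 3*j:3*j+3] += block   # off-diagonal coupling
            H[3*j:3*j+3, 3*i:3*i+3] += block
            H[3*i:3*i+3, 3*i:3*i+3] -= block   # diagonal self terms
            H[3*j:3*j+3, 3*j:3*j+3] -= block
    return H

rng = np.random.default_rng(1)
coords = rng.normal(size=(8, 3))        # stand-in "structure"
w = np.linalg.eigvalsh(anm_hessian(coords))
# Sorted ascending: the first six eigenvalues vanish (translations + rotations);
# the seventh is the lowest genuine vibrational mode.
```

For a real protein one would use the C-alpha coordinates and a finite cutoff (typically 10 to 15 angstroms) instead of the random points used here.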

  7. Analysis of Key Factors Driving Japan’s Military Normalization

    Science.gov (United States)

    2017-09-01

    “…no change to our policy of not giving in to terrorism.” Though the prime minister was democratically supported, Koizumi’s leadership style took… analysis of the key driving factors of Japan’s normalization. The areas of prime ministerial leadership, regional security threats, alliance issues, and…

  8. Metacognition and Reading: Comparing Three Forms of Metacognition in Normally Developing Readers and Readers with Dyslexia.

    Science.gov (United States)

    Furnes, Bjarte; Norman, Elisabeth

    2015-08-01

    Metacognition refers to 'cognition about cognition' and includes metacognitive knowledge, strategies and experiences (Efklides, 2008; Flavell, 1979). Research on reading has shown that better readers demonstrate more metacognitive knowledge than poor readers (Baker & Beall, 2009), and that reading ability improves through strategy instruction (Gersten, Fuchs, Williams, & Baker, 2001). The current study is the first to specifically compare the three forms of metacognition in dyslexic (N = 22) versus normally developing readers (N = 22). Participants read two factual texts, with learning outcome measured by a memory task. Metacognitive knowledge and skills were assessed by self-report. Metacognitive experiences were measured by predictions of performance and judgments of learning. Individuals with dyslexia showed insight into their reading problems, but less general knowledge of how to approach text reading. They more often reported lack of available reading strategies, but groups did not differ in the use of deep and surface strategies. Learning outcome and mean ratings of predictions of performance and judgments of learning were lower in dyslexic readers, but not the accuracy with which metacognitive experiences predicted learning. Overall, the results indicate that dyslexic reading and spelling problems are not generally associated with lower levels of metacognitive knowledge, metacognitive strategies or sensitivity to metacognitive experiences in reading situations. 2015 The Authors. Dyslexia Published by John Wiley & Sons Ltd.

  9. ANALYSIS OF FORMING TREAD WHEEL SETS

    Directory of Open Access Journals (Sweden)

    Igor IVANOV

    2017-09-01

    This paper shows the results of a theoretical study of profile high-speed grinding (PHSG) for forming the treads of wheel sets during repair, instead of turning and mold-milling. Significant disadvantages of those methods are their limited ability to adapt to the tool and to the inhomogeneous structure of the wheel material. This leads to understated treatment regimens and difficulties in recovering wheel sets with thermal and mechanical defects. This study carried out modeling and analysis of the emerging cutting forces. The proposed algorithms describe the random occurrence of the components of the cutting forces when restoring the profile of wheel sets with an inhomogeneous material structure. To identify the statistical features of the randomly generated structures, the fractal dimension and the method of random additions were used. The multifractal spectrum formed is decomposed into monofractals by wavelet transform. The proposed method creates the preconditions for controlling the parameters of the treatment process.

  10. Investigation of reliability, validity and normality Persian version of the California Critical Thinking Skills Test; Form B (CCTST

    Directory of Open Access Journals (Sweden)

    Khallli H

    2003-04-01

    Background: To evaluate the effectiveness of present educational programs in terms of students' achieving problem solving, decision making and critical thinking skills, reliable, valid and standard instruments are needed. Purposes: To investigate the reliability, validity and norms of CCTST Form B. The California Critical Thinking Skills Test contains 34 multiple-choice questions, each with one correct answer, in the five critical thinking (CT) cognitive skill domains. Methods: The translated CCTST Form B was given to 405 BSN nursing students of nursing faculties located in Tehran (Tehran, Iran and Shahid Beheshti Universities) that were selected through random sampling. In order to determine face and content validity, the test was translated and edited by Persian and English language professors and researchers. It was also confirmed by the judgments of a panel of medical education experts and psychology professors. CCTST reliability was determined from internal consistency using KR-20. The construct validity of the test was investigated with factor analysis, internal consistency and group differences. Results: The reliability coefficient of the test was 0.62. Factor analysis indicated that the CCTST is formed from five factors (elements), namely: Analysis, Evaluation, Inference, Inductive and Deductive Reasoning. The internal consistency method showed that all subscales have high and positive correlations with the total test score. The group difference method, comparing nursing and philosophy students (n=50), indicated a meaningful difference between nursing and philosophy students' scores (t=-4.95, p=0.0001). The percentile norms also show that the fiftieth percentile corresponds to a raw score of 11, and the 95th and 5th percentiles correspond to raw scores of 17 and 6, respectively.
Conclusions: The results revealed that the test is sufficiently reliable as a research tool, and all subscales measure a single construct (critical thinking) and are able to distinguish the
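The KR-20 internal-consistency coefficient used in the study above has a short closed form; a sketch follows (assumes dichotomous 0/1 item scores and nonzero total-score variance; the biased population variance is used):

```python
def kr20(responses):
    """Kuder-Richardson formula 20 reliability for dichotomous items.
    responses: one list of 0/1 item scores per examinee."""
    n_items = len(responses[0])
    n = len(responses)
    # Item difficulties p_i and total-score variance.
    p = [sum(r[i] for r in responses) / n for i in range(n_items)]
    totals = [sum(r) for r in responses]
    mean_t = sum(totals) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n
    pq = sum(pi * (1.0 - pi) for pi in p)
    return (n_items / (n_items - 1)) * (1.0 - pq / var_t)
```

Perfectly consistent items give a coefficient of 1; uncorrelated items drive it toward 0, which is the sense in which the reported 0.62 indicates moderate reliability.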

  11. Explorations in Statistics: The Analysis of Ratios and Normalized Data

    Science.gov (United States)

    Curran-Everett, Douglas

    2013-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This ninth installment of "Explorations in Statistics" explores the analysis of ratios and normalized--or standardized--data. As researchers, we compute a ratio--a numerator divided by a denominator--to compute a…

  12. Analysis of promoter activity in transgenic plants by normalizing ...

    Indian Academy of Sciences (India)

    Analysis of promoter activity in transgenic plants by normalizing expression with a reference gene: anomalies due to the influence of the test promoter on the reference promoter. Simran Bhullar Suma Chakravarthy Deepak Pental Pradeep Kumar Burma. Articles Volume 34 Issue 6 December 2009 pp 953-962 ...

  13. Analysis of forming limit in tube hydroforming

    International Nuclear Information System (INIS)

    Kim, Chan Il; Yang, Seung Hang; Kim, Young Suk

    2013-01-01

    The automotive industry has shown increasing interest in tube hydroforming. Although many automobile structural parts are produced from cylindrical tubes, failures frequently occur during tube hydroforming under improper forming conditions. These problems include wrinkling, buckling, folding back, and bursting. We perform analytical studies to determine forming limits in tube hydroforming and demonstrate how these forming limits are influenced by the loading path. Theoretical results for the forming limits of wrinkling and bursting are compared with experimental results for an aluminum tube.

  14. NOLB: Nonlinear Rigid Block Normal Mode Analysis Method

    OpenAIRE

    Hoffmann , Alexandre; Grudinin , Sergei

    2017-01-01

    We present a new conceptually simple and computationally efficient method for nonlinear normal mode analysis called NOLB. It relies on the rotations-translations of blocks (RTB) theoretical basis developed by Y.-H. Sanejouand and colleagues. We demonstrate how to physically interpret the eigenvalues computed in the RTB basis in terms of angular and linear velocities applied to the rigid blocks and how to construct a nonlinear extrapolation of motion out of these veloci...

  15. Application of Power Geometry and Normal Form Methods to the Study of Nonlinear ODEs

    Science.gov (United States)

    Edneral, Victor

    2018-02-01

    This paper describes power transformations of degenerate autonomous polynomial systems of ordinary differential equations which reduce such systems to a non-degenerate form. An example of creating exact first integrals of motion of a planar degenerate system in closed form is given.

  16. Application of Power Geometry and Normal Form Methods to the Study of Nonlinear ODEs

    Directory of Open Access Journals (Sweden)

    Edneral Victor

    2018-01-01

    This paper describes power transformations of degenerate autonomous polynomial systems of ordinary differential equations which reduce such systems to a non-degenerate form. An example of creating exact first integrals of motion of a planar degenerate system in closed form is given.

  17. Stochastic analysis in discrete and continuous settings with normal martingales

    CERN Document Server

    Privault, Nicolas

    2009-01-01

    This volume gives a unified presentation of stochastic analysis for continuous and discontinuous stochastic processes, in both discrete and continuous time. It is mostly self-contained and accessible to graduate students and researchers having already received a basic training in probability. The simultaneous treatment of continuous and jump processes is done in the framework of normal martingales; that includes the Brownian motion and compensated Poisson processes as specific cases. In particular, the basic tools of stochastic analysis (chaos representation, gradient, divergence, integration by parts) are presented in this general setting. Applications are given to functional and deviation inequalities and mathematical finance.

  18. Cognitive Factors in the Choice of Syntactic Form by Aphasic and Normal Speakers of English and Japanese: The Speaker's Impulse.

    Science.gov (United States)

    Menn, Lise; And Others

    This study examined the role of empathy in the choice of syntactic form and the degree of independence of pragmatic and syntactic abilities in a range of aphasic patients. Study 1 involved 9 English-speaking and 9 Japanese-speaking aphasic subjects with 10 English-speaking and 4 Japanese normal controls. Study 2 involved 14 English- and 6…

  19. A simple global representation for second-order normal forms of Hamiltonian systems relative to periodic flows

    International Nuclear Information System (INIS)

    Avendaño-Camacho, M; Vallejo, J A; Vorobjev, Yu

    2013-01-01

    We study the determination of the second-order normal form for perturbed Hamiltonians relative to the periodic flow of the unperturbed Hamiltonian H_0. The formalism presented here is global, and can be easily implemented in any computer algebra system. We illustrate it by means of two examples: the Hénon–Heiles and the elastic pendulum Hamiltonians.

  20. Algorithms for finding Chomsky and Greibach normal forms for a fuzzy context-free grammar using an algebraic approach

    Energy Technology Data Exchange (ETDEWEB)

    Lee, E.T.

    1983-01-01

    Algorithms for the construction of the Chomsky and Greibach normal forms for a fuzzy context-free grammar using the algebraic approach are presented and illustrated by examples. The results obtained in this paper may have useful applications in fuzzy languages, pattern recognition, information storage and retrieval, artificial intelligence, database and pictorial information systems. 16 references.
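For orientation, here is what the classical (crisp, non-fuzzy) Chomsky-normal-form construction looks like for the TERM and BIN steps only; the fuzzy algebraic version in the paper additionally carries membership grades through the construction. The sketch assumes uppercase strings are nonterminals, lowercase strings are terminals, and the grammar has no epsilon or unit productions; all names are illustrative.

```python
def to_cnf(grammar):
    """Partial Chomsky-normal-form conversion: TERM and BIN steps only.
    grammar: dict mapping a nonterminal to a list of right-hand-side tuples."""
    new, term_map, counter = {}, {}, [0]

    def term_nt(a):
        # TERM: a fresh nonterminal T_a with the single production T_a -> a.
        if a not in term_map:
            term_map[a] = "T_" + a
            new["T_" + a] = [(a,)]
        return term_map[a]

    def fresh():
        counter[0] += 1
        return "X%d" % counter[0]

    for lhs, rhss in grammar.items():
        new.setdefault(lhs, [])
        for rhs in rhss:
            if len(rhs) == 1:        # single terminal: already in CNF shape
                new[lhs].append(rhs)
                continue
            # TERM: replace terminals inside long right-hand sides.
            syms = [s if s.isupper() else term_nt(s) for s in rhs]
            # BIN: break right-hand sides longer than two, right to left.
            while len(syms) > 2:
                x = fresh()
                new[x] = [tuple(syms[-2:])]
                syms = syms[:-2] + [x]
            new[lhs].append(tuple(syms))
    return new
```

Running it on the toy grammar S -> aSb | ab yields only productions of the two CNF shapes: a single terminal, or exactly two nonterminals.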

  1. Non normal modal analysis of oscillations in boiling water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Suarez-Antola, Roberto, E-mail: roberto.suarez@miem.gub.uy [Ministerio de Industria, Energia y Mineria (MIEM), Montevideo (Uruguay); Flores-Godoy, Jose-Job, E-mail: job.flores@ibero.mx [Universidad Iberoamericana (UIA), Mexico, DF (Mexico). Dept. de Fisica Y Matematicas

    2013-07-01

    The first objective of the present work is to construct a simple reduced-order model for BWR stability analysis, combining a two-node nodal model of the thermal hydraulics with a two-mode modal model of the neutronics. Two coupled non-linear integro-differential equations are obtained, in terms of one global (in phase) and one local (out of phase) power amplitude, with direct and cross feedback reactivities given as functions of thermal hydraulic core variables (void fractions and temperatures). The second objective is to apply the effective lifetime approximation to further simplify the nonlinear equations. Linear approximations for the equations of the amplitudes of the global and regional modes are derived. The linearized equation for the amplitude of the global mode corresponds to a decoupled and damped harmonic oscillator. An analytical closed-form formula for the damping coefficient, as a function of the parameter space of the BWR, is obtained. The coefficient changes its sign (with the corresponding modification in the decay ratio) when a stability boundary is crossed. This produces a supercritical Hopf bifurcation, with the steady-state power of the reactor as the bifurcation parameter. However, the linearized equation for the amplitude of the regional mode always corresponds to an over-damped harmonic oscillator that is always coupled with the amplitude of the global mode, for every set of possible values of the core parameters (including the steady-state power of the reactor) in the framework of the present mathematical model. The equation for the above-mentioned over-damped linear oscillator is closely connected with a non-normal operator. Due to this connection, there could be a significant transient growth of some solutions of the linear equation. This behavior allows a significant shrinking of the basin of attraction of the equilibrium state. 
The third objective is to apply the above approach to partially study the stability of the regional mode and
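The transient growth attributed above to a non-normal operator can be demonstrated on a 2×2 caricature (the matrix below is illustrative, not the BWR model): both eigenvalues are negative, so the origin is asymptotically stable, yet the solution norm overshoots by an order of magnitude before decaying.

```python
import numpy as np

# Stable but non-normal: A @ A.T != A.T @ A, eigenvalues -1 and -2.
A = np.array([[-1.0, 50.0],
              [0.0, -2.0]])

def state_norm(t, x0):
    # e^{At} x0 via eigendecomposition (A has distinct real eigenvalues).
    w, V = np.linalg.eig(A)
    x_t = (V @ np.diag(np.exp(w * t)) @ np.linalg.inv(V)) @ x0
    return float(np.linalg.norm(x_t))

x0 = np.array([0.0, 1.0])
peak = state_norm(0.5, x0)   # transient growth: far above ||x0|| = 1
late = state_norm(10.0, x0)  # eventual exponential decay
```

In a nonlinear setting, such transient amplification can push trajectories out of the basin of attraction even though the linearization is stable, which is the shrinking-basin mechanism the abstract describes.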

  2. Shear Stress-Normal Stress (Pressure) Ratio Decides Forming Callus in Patients with Diabetic Neuropathy

    Science.gov (United States)

    Noguchi, Hiroshi; Takehara, Kimie; Ohashi, Yumiko; Suzuki, Ryo; Yamauchi, Toshimasa; Kadowaki, Takashi; Sanada, Hiromi

    2016-01-01

    Aim. Callus is a risk factor leading to severe diabetic foot ulcer; thus, prevention of callus formation is important. However, the normal stress (pressure) and shear stress associated with callus have not been clarified. Additionally, as a new variable, a shear stress-normal stress (pressure) ratio (SPR) was examined. The purpose was to clarify the external force associated with callus formation in patients with diabetic neuropathy. Methods. The external force at the 1st, 2nd, and 5th metatarsal heads (MTH), as callus predilection regions, was measured. The SPR was calculated by dividing shear stress by normal stress (pressure); concretely, peak values (SPR-p) and time integral values (SPR-i). The optimal cut-off point was determined. Results. The callus formation regions of the 1st and 2nd MTH had higher SPR-i than the noncallus formation regions. The cut-off value of the 1st MTH was 0.60 and that of the 2nd MTH was 0.50. For the 5th MTH, variables pertaining to the external forces could not be determined to be indicators of callus formation because of low accuracy. Conclusions. The callus formation cut-off values of the 1st and 2nd MTH were clarified. In the future, it will be necessary to confirm the effect of using appropriate footwear and gait training on lowering SPR-i. PMID:28050567
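One plausible reading of the SPR indices (a hedged sketch; the paper's exact definitions may differ, and all names and numbers here are assumptions): SPR-p as the ratio of peak values over the stance phase and SPR-i as the ratio of time integrals, with the reported 0.60 cut-off for the 1st MTH.

```python
def spr_indices(shear, pressure, dt):
    """Peak (SPR-p) and time-integral (SPR-i) shear/pressure ratios from
    sampled time series of shear stress and normal stress (pressure)."""
    spr_p = max(shear) / max(pressure)
    spr_i = (sum(shear) * dt) / (sum(pressure) * dt)
    return spr_p, spr_i

def callus_risk_first_mth(spr_i, cutoff=0.60):
    # 0.60 is the 1st-MTH cut-off reported in the abstract.
    return spr_i >= cutoff
```

With synthetic samples the two indices can disagree, which is why the study evaluates them separately per metatarsal head.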

  3. Shear Stress-Normal Stress (Pressure Ratio Decides Forming Callus in Patients with Diabetic Neuropathy

    Directory of Open Access Journals (Sweden)

    Ayumi Amemiya

    2016-01-01

    Aim. Callus is a risk factor leading to severe diabetic foot ulcer; thus, prevention of callus formation is important. However, the normal stress (pressure) and shear stress associated with callus have not been clarified. Additionally, as a new variable, a shear stress-normal stress (pressure) ratio (SPR) was examined. The purpose was to clarify the external force associated with callus formation in patients with diabetic neuropathy. Methods. The external force at the 1st, 2nd, and 5th metatarsal heads (MTH), as callus predilection regions, was measured. The SPR was calculated by dividing shear stress by normal stress (pressure); concretely, peak values (SPR-p) and time integral values (SPR-i). The optimal cut-off point was determined. Results. The callus formation regions of the 1st and 2nd MTH had higher SPR-i than the noncallus formation regions. The cut-off value of the 1st MTH was 0.60 and that of the 2nd MTH was 0.50. For the 5th MTH, variables pertaining to the external forces could not be determined to be indicators of callus formation because of low accuracy. Conclusions. The callus formation cut-off values of the 1st and 2nd MTH were clarified. In the future, it will be necessary to confirm the effect of using appropriate footwear and gait training on lowering SPR-i.

  4. An approach to normal forms of Kuramoto model with distributed delays and the effect of minimal delay

    Energy Technology Data Exchange (ETDEWEB)

    Niu, Ben, E-mail: niubenhit@163.com [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China); Guo, Yuxiao [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China); Jiang, Weihua [Department of Mathematics, Harbin Institute of Technology, Harbin 150001 (China)

    2015-09-25

    Heterogeneous delays with a positive lower bound (gap) are taken into consideration in the Kuramoto model. On the Ott–Antonsen manifold, the dynamical transition from incoherence to coherence is mediated by Hopf bifurcation. We establish a perturbation technique on the complex domain, by which universal normal forms, stability and criticality of the Hopf bifurcation are obtained. Theoretically, a hysteresis loop is found near the subcritically bifurcated coherent state. For Gamma-distributed delays with fixed mean and variance, we find that a large gap decreases the Hopf bifurcation value, induces supercritical bifurcations, avoids the hysteresis loop and significantly increases the number of coexisting coherent states. The effect of the gap is finally interpreted from the viewpoint of the excess kurtosis of the Gamma distribution. - Highlights: • Heterogeneously delay-coupled Kuramoto model with minimal delay is considered. • Perturbation technique on the complex domain is established for bifurcation analysis. • Hysteresis phenomenon is investigated in a theoretical way. • The effect of excess kurtosis of distributed delays is discussed.
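Setting the delays aside, the underlying incoherence-to-coherence transition is easy to reproduce with the mean-field form of the plain (undelayed) Kuramoto model; this is a hedged sketch, and N, K, the Gaussian frequency distribution and the Euler time step are all assumptions. The order parameter r stays near zero for weak coupling and locks well above the critical coupling.

```python
import cmath
import math
import random

def kuramoto_r(K, N=200, dt=0.05, steps=400, seed=0):
    """Final order parameter r of N mean-field-coupled Kuramoto oscillators."""
    rng = random.Random(seed)
    omega = [rng.gauss(0.0, 1.0) for _ in range(N)]               # natural frequencies
    theta = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(N)]   # random initial phases
    for _ in range(steps):
        z = sum(cmath.exp(1j * t) for t in theta) / N             # r * e^{i psi}
        r, psi = abs(z), cmath.phase(z)
        # Mean-field form: theta_i' = omega_i + K r sin(psi - theta_i)
        theta = [t + dt * (w + K * r * math.sin(psi - t)) for t, w in zip(theta, omega)]
    return abs(sum(cmath.exp(1j * t) for t in theta) / N)
```

Comparing a weak and a strong coupling shows the transition; the paper's contribution is how distributed delays with a gap reshape this bifurcation.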

  5. Explorations in statistics: the analysis of ratios and normalized data.

    Science.gov (United States)

    Curran-Everett, Douglas

    2013-09-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This ninth installment of Explorations in Statistics explores the analysis of ratios and normalized-or standardized-data. As researchers, we compute a ratio-a numerator divided by a denominator-to compute a proportion for some biological response or to derive some standardized variable. In each situation, we want to control for differences in the denominator when the thing we really care about is the numerator. But there is peril lurking in a ratio: only if the relationship between numerator and denominator is a straight line through the origin will the ratio be meaningful. If not, the ratio will misrepresent the true relationship between numerator and denominator. In contrast, regression techniques-these include analysis of covariance-are versatile: they can accommodate an analysis of the relationship between numerator and denominator when a ratio is useless.
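The peril described above is easy to see numerically: if y = a + b·x with a ≠ 0, the line does not pass through the origin, so the per-observation ratio y/x drifts with x even though the relationship is identical for every observation, while ordinary regression recovers the constant slope b (synthetic numbers, for illustration only):

```python
# Line with nonzero intercept: y = 2 + 0.5 x, so y/x is NOT constant.
xs = [1.0, 2.0, 4.0, 8.0]
ys = [2.0 + 0.5 * x for x in xs]

ratios = [y / x for x, y in zip(xs, ys)]   # 2.5, 1.5, 1.0, 0.75: drifts with x

# Ordinary least-squares slope recovers the true constant relationship.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
```

This is the article's point: the ratio is meaningful only when the numerator-denominator relationship is a straight line through the origin; otherwise regression (or analysis of covariance) is the safer tool.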

  6. Financial Analysis of Hastily-Formed Networks

    Science.gov (United States)

    2006-09-01

    high-profile ... Rune Elvik, “Cost-benefit analysis of ambulance and rescue helicopters in Norway: reflections on ...” ... Systems Acquisition and Program Management.

  7. Center manifolds, normal forms and bifurcations of vector fields with application to coupling between periodic and steady motions

    Science.gov (United States)

    Holmes, Philip J.

    1981-06-01

    We study the instabilities known to aeronautical engineers as flutter and divergence. Mathematically, these states correspond to bifurcations to limit cycles and multiple equilibrium points in a differential equation. Making use of the center manifold and normal form theorems, we concentrate on the situation in which flutter and divergence become coupled, and show that there are essentially two ways in which this is likely to occur. In the first case the system can be reduced to an essential model which takes the form of a single degree of freedom nonlinear oscillator. This system, which may be analyzed by conventional phase-plane techniques, captures all the qualitative features of the full system. We discuss the reduction and show how the nonlinear terms may be simplified and put into normal form. Invariant manifold theory and the normal form theorem play a major role in this work and this paper serves as an introduction to their application in mechanics. Repeating the approach in the second case, we show that the essential model is now three dimensional and that far more complex behavior is possible, including nonperiodic and ‘chaotic’ motions. Throughout, we take a two degree of freedom system as an example, but the general methods are applicable to multi- and even infinite degree of freedom problems.

  8. Generation of Strategies for Environmental Deception in Two-Player Normal-Form Games

    Science.gov (United States)

    2015-06-18

    found in the literature is presented by Kohlberg and Mertens [23]. A stable equilibrium by their definition is an equilibrium in an extensive-form...the equilibrium in this state provides them with an increased payoff. While interesting, Kohlberg and Mertens’ definition of equilibrium...stability used by Kohlberg and Mertens. Arsham’s work focuses on determining the amount by which a mixed-strategy Nash equilibrium’s payoff values can

  9. Normality Analysis for RFI Detection in Microwave Radiometry

    Directory of Open Access Journals (Sweden)

    Adriano Camps

    2009-12-01

    Full Text Available Radio-frequency interference (RFI) present in microwave radiometry measurements leads to erroneous radiometric results. Sources of RFI include spurious signals and harmonics from lower frequency bands, spread-spectrum signals overlapping the “protected” band of operation, or out-of-band emissions not properly rejected by the pre-detection filters due to their finite rejection. The presence of RFI in the radiometric signal modifies the detected power and therefore the estimated antenna temperature from which the geophysical parameters will be retrieved. In recent years, techniques to detect the presence of RFI in radiometric measurements have been developed. They include time- and/or frequency-domain analyses, or time- and/or frequency-domain statistical analyses of the received signal which, in the absence of RFI, must be a zero-mean Gaussian process. Statistical analyses performed to date include the calculation of the kurtosis and the Shapiro-Wilk normality test of the received signal. Nevertheless, the statistical analysis of the received signal could be more extensive, as reported in the Statistics literature. The objective of this work is to study the performance of a number of normality tests encountered in the Statistics literature when applied to the detection of RFI in the radiometric signal, which is Gaussian by nature. A description of the normality tests and the RFI detection results for different kinds of RFI are presented, with a view to determining an omnibus test that can deal with the blind spots of the currently used methods.
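    As a rough sketch of the statistical approach described above, one can apply the kurtosis and Shapiro-Wilk tests to a simulated radiometric signal; the sinusoidal interference model and its amplitude are illustrative assumptions, not from the paper:

    ```python
    import numpy as np
    from scipy import stats

    # A clean radiometric signal is zero-mean Gaussian noise, so a normality
    # test should reject only the RFI-contaminated case.
    rng = np.random.default_rng(42)
    n = 2000
    clean = rng.normal(0.0, 1.0, n)                      # RFI-free noise
    t = np.arange(n)
    rfi = clean + 2.0 * np.sin(2 * np.pi * 0.01 * t)     # sinusoidal RFI

    # Excess kurtosis (0 for a Gaussian) and the Shapiro-Wilk test.
    print(round(stats.kurtosis(clean), 2), round(stats.kurtosis(rfi), 2))
    _, p_clean = stats.shapiro(clean)
    _, p_rfi = stats.shapiro(rfi)
    print(p_clean, p_rfi)
    ```

    The sinusoid flattens the distribution (negative excess kurtosis), which both the kurtosis statistic and the Shapiro-Wilk p-value pick up.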

  10. Chiral analysis of baryon form factors

    Energy Technology Data Exchange (ETDEWEB)

    Gail, T.A.

    2007-11-08

    This work presents an extensive theoretical investigation of the structure of the nucleon within the standard model of elementary particle physics. In particular, the long-range contributions to a number of form factors parametrizing the interactions of the nucleon with an electromagnetic probe are calculated. The theoretical framework for these calculations is chiral perturbation theory, the exact low-energy limit of Quantum Chromodynamics, which describes such long-range contributions in terms of a pion cloud. In this theory, a nonrelativistic leading-one-loop-order calculation of the form factors parametrizing the vector transition of a nucleon to its lowest lying resonance, the Δ, a covariant calculation of the isovector and isoscalar vector form factors of the nucleon at next-to-leading one-loop order, and a covariant calculation of the isoscalar and isovector generalized vector form factors of the nucleon at leading one-loop order are performed. In order to perform consistent loop calculations in the covariant formulation of chiral perturbation theory, an appropriate renormalization scheme is defined in this work. All theoretical predictions are compared to phenomenology and to results from lattice QCD simulations. These comparisons allow for a determination of the low-energy constants of the theory. Furthermore, the possibility of chiral extrapolation, i.e. the extrapolation of lattice data from simulations at large pion masses down to the small physical pion mass, is studied in detail. Statistical as well as systematic uncertainties are estimated for all results throughout this work. (orig.)

  11. Quantitative analysis of normal thallium-201 tomographic studies

    International Nuclear Information System (INIS)

    Eisner, R.L.; Gober, A.; Cerqueira, M.

    1985-01-01

    To determine the normal distribution of Tl-201 uptake post exercise (EX) and at redistribution (RD), and normal washout, Tl-201 rotational tomographic studies were performed in 40 subjects: 16 angiographic normals and 24 normal volunteers (12 from Emory and 12 from Yale). Oblique-angle short axis slices were subjected to maximal count circumferential profile analysis. Data were displayed as a ''bullseye'' functional map with the apex at the center and the base at the periphery. The bullseye was not uniform in all regions because of the variable effects of attenuation and resolution at different view angles. In all studies, the septum:lateral wall ratio was 1.0 in males and approximately 1.0 in females. This occurred predominantly because of anterior defects due to breast soft tissue attenuation. EX and RD bullseyes were similar. Using a bi-exponential model for Tl kinetics, 4-hour normalized washout ranged from 49% to 54% in each group and showed minimal variation between walls throughout the bullseye. Thus, there are well defined variations in Tl-201 uptake in the normal myocardium which must be taken into consideration when analyzing patient data. Because of these defects and the lack of adequate methods for attenuation correction, quantitative analysis of Tl-201 studies must include direct comparison with gender-matched normal data sets.

  12. Normal co-ordinate analysis of 1, 8-dibromooctane

    Science.gov (United States)

    Singh, Devinder; Jaggi, Neena; Singh, Nafa

    2010-02-01

    The organic compound 1,8-dibromooctane (1,8-DBO) exists in the liquid phase at ambient temperatures and has versatile synthetic applications. In its liquid phase 1,8-DBO is expected to exist in four most probable conformations, with all its carbon atoms in the same plane, having symmetries C2h, Ci, C2 and C1. In the present study a detailed vibrational analysis, in terms of the assignment of the Fourier transform infrared (FT-IR) and Raman bands of this molecule, has been carried out using normal co-ordinate calculations. A systematic set of symmetry co-ordinates has been constructed for this molecule and normal co-ordinate analysis is carried out using the computer program MOLVIB. The force field transferred from previously studied shorter-chain bromoalkanes is refined so as to fit the calculated infrared and Raman frequencies to the observed ones. The potential energy distribution (PED) has also been calculated for each mode of vibration of the molecule for the assumed conformations.

  13. Study on electric parameters of wild and cultivated cotton forms being in normal state and irradiated

    International Nuclear Information System (INIS)

    Nazirov, N.N.; Kamalov, N.; Norbaev, N.

    1978-01-01

    The effect of radiation on the electrical conductivity of tissues under alternating current, the electrical capacitance and the cell impedance has been studied. Gamma irradiation of seedlings results in definite changes in the electrical parameters of cells (electrical conductivity, electrical capacitance, impedance). It is shown that especially strong changes are revealed by gamma irradiation of the radiosensitive wild form of cotton plants. The deviation of the cells' electrical parameters from normal reflects disruption of the evolutionarily established ionic heterogeneity and of the state of the cell colloid system, which results in changes in cell structure and metabolism.

  14. First-order systems of linear partial differential equations: normal forms, canonical systems, transform methods

    Directory of Open Access Journals (Sweden)

    Heinz Toparkus

    2014-04-01

    Full Text Available In this paper we consider first-order systems with constant coefficients for two real-valued functions of two real variables. This is both a problem in its own right and an alternative view of the classical linear partial differential equations of second order with constant coefficients. The classification of the systems is done using elementary methods of linear algebra. Each type has its special canonical form in the associated characteristic coordinate system. Initial value problems can then be formulated on appropriate basic domains, and solutions can be sought by means of transform methods.

  15. Annual rainfall statistics for stations in the Top End of Australia: normal and log-normal distribution analysis

    International Nuclear Information System (INIS)

    Vardavas, I.M.

    1992-01-01

    A simple procedure is presented for the statistical analysis of measurement data where the primary concern is the determination of the value corresponding to a specified average exceedance probability. The analysis employs the normal and log-normal frequency distributions together with a χ²-test and an error analysis. The error analysis introduces the concept of a counting error criterion, or ζ-test, to test whether the data are sufficient to make the χ²-test reliable. The procedure is applied to the analysis of annual rainfall data recorded at stations in the tropical Top End of Australia where the Ranger uranium deposit is situated. 9 refs., 12 tabs., 9 figs
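    A minimal sketch of the log-normal part of such an analysis, with hypothetical rainfall figures (the actual station data are in the paper): fit the distribution to the log-transformed totals, then invert for the value with a specified average exceedance probability:

    ```python
    import numpy as np
    from scipy.stats import norm

    # Hypothetical annual rainfall totals (mm), assumed log-normally distributed.
    rng = np.random.default_rng(1)
    rainfall = rng.lognormal(mean=7.2, sigma=0.25, size=80)

    # Fit the log-normal by the moments of log(rainfall), then invert for the
    # value exceeded on average once in 100 years (exceedance probability 0.01).
    logs = np.log(rainfall)
    mu, sigma = logs.mean(), logs.std(ddof=1)
    x_100 = np.exp(mu + sigma * norm.ppf(0.99))
    print(round(x_100, 1))
    ```

    The same construction with the untransformed data gives the normal-distribution estimate, and a χ²-test on binned counts would decide which of the two fits is acceptable.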

  16. Comparative analysis of JKR Sarawak form of contract and Malaysia Standard form of building contract (PWD203A)

    Science.gov (United States)

    Yunus, A. I. A.; Muhammad, W. M. N. W.; Saaid, M. N. F.

    2018-04-01

    A standard form of contract is normally used in the Malaysian construction industry to establish the legal relationship between contracting parties. Most Malaysian federal government construction projects use PWD203A, a standard form of contract for use where Bills of Quantities form part of the contract, issued by the Public Works Department (PWD/JKR). In Sarawak, the largest state in Malaysia, the state government has issued its own standard form, the JKR Sarawak Form of Contract 2006. Although both forms are widely used in the construction industry, they remain poorly understood. The aim of this paper is to identify the significant provisions of both forms of contract. Document analysis was adopted for an in-depth review of both forms. It was found that the two forms have differences and similarities in several provisions, specifically those concerning definitions and general matters; execution of the works; payments, completion and final account; and delay, dispute resolution and determination.

  17. Single cell analysis of normal and leukemic hematopoiesis.

    Science.gov (United States)

    Povinelli, Benjamin J; Rodriguez-Meira, Alba; Mead, Adam J

    2018-02-01

    The hematopoietic system is well established as a paradigm for the study of cellular hierarchies, their disruption in disease and therapeutic use in regenerative medicine. Traditional approaches to study hematopoiesis involve purification of cell populations based on a small number of surface markers. However, such population-based analysis obscures underlying heterogeneity contained within any phenotypically defined cell population. This heterogeneity can only be resolved through single cell analysis. Recent advances in single cell techniques allow analysis of the genome, transcriptome, epigenome and proteome in single cells at an unprecedented scale. The application of these new single cell methods to investigate the hematopoietic system has led to paradigm shifts in our understanding of cellular heterogeneity in hematopoiesis and how this is disrupted in disease. In this review, we summarize how single cell techniques have been applied to the analysis of hematopoietic stem/progenitor cells in normal and malignant hematopoiesis, with a particular focus on recent advances in single-cell genomics, including how these might be utilized for clinical application. Copyright © 2017. Published by Elsevier Ltd.

  18. Analysis of KNU1 loss of normal feedwater

    International Nuclear Information System (INIS)

    Kim, Hho-Jung; Chung, Bub-Dong; Lee, Young-Jin; Kim, Jin-Soo

    1986-01-01

    Simulation of the system thermal-hydraulic parameters was carried out following the KNU1 (Korea Nuclear Unit-1) loss of normal feedwater transient sequence that occurred on November 14, 1984. Results were compared with the plant transient data, and good agreement was obtained. Some deviations were found in parameters such as the steam flowrate and the RCS (Reactor Coolant System) average temperature around the time of reactor trip. This is to be expected, since the thermal-hydraulic parameters undergo rapid transitions due to the large reduction of the reactor thermal power in a short period of time and, thereby, the plant data involve transient uncertainties. The analysis was performed using RELAP5/MOD1/NSC, developed through modifications of the interphase drag and wall heat transfer modeling routines of RELAP5/MOD1/CY018. (author)

  19. [Raman, FTIR spectra and normal mode analysis of acetanilide].

    Science.gov (United States)

    Liang, Hui-Qin; Tao, Ya-Ping; Han, Li-Gang; Han, Yun-Xia; Mo, Yu-Jun

    2012-10-01

    The Raman and FTIR spectra of acetanilide (ACN) were measured experimentally in the regions of 3 500-50 and 3 500-600 cm(-1), respectively. The equilibrium geometry and vibration frequencies of ACN were calculated with the density functional theory (DFT) method (B3LYP/6-311G(d, p)). The results showed that the calculated molecular structure parameters are in good agreement with previous reports and better than those calculated with 6-31G(d), and that the calculated frequencies agree well with the experimental ones. The potential energy distribution of each frequency was worked out by normal mode analysis, and on this basis a detailed and accurate vibration frequency assignment of ACN was obtained.
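    A toy version of a normal-mode calculation (far simpler than the MOLVIB/DFT analysis described above) diagonalizes the mass-weighted Hessian of a one-dimensional three-mass spring chain:

    ```python
    import numpy as np

    # Longitudinal modes of a linear triatomic "molecule": masses m-M-m joined
    # by two springs of constant k. Values are illustrative.
    m, M, k = 1.0, 2.0, 1.0
    masses = np.array([m, M, m])

    # Hessian of the potential V = k/2 * [(x2-x1)^2 + (x3-x2)^2]
    H = k * np.array([[ 1, -1,  0],
                      [-1,  2, -1],
                      [ 0, -1,  1]], dtype=float)

    # Eigenvalues of the mass-weighted Hessian are the squared angular
    # frequencies: 0 (translation), k/m, and k/m * (1 + 2m/M).
    Dm = np.diag(1.0 / np.sqrt(masses))
    omega2 = np.linalg.eigvalsh(Dm @ H @ Dm)
    print(np.round(omega2, 6))
    ```

    The eigenvectors of the same matrix are the normal co-ordinates, and projecting the potential onto them is the one-dimensional analogue of a potential energy distribution.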

  20. Post-UV colony-forming ability of normal fibroblast strains and of the xeroderma pigmentosum group G strain

    International Nuclear Information System (INIS)

    Barrett, S.F.; Tarone, R.E.; Moshell, A.N.; Ganges, M.B.; Robbins, J.H.

    1981-01-01

    In xeroderma pigmentosum, an inherited disorder of defective DNA repair, the post-UV colony-forming ability of fibroblasts from patients in complementation groups A through F correlates with the patients' neurological status. The first xeroderma pigmentosum patient assigned to the recently discovered group G had the neurological abnormalities of XP. Researchers have determined the post-UV colony-forming ability of cultured fibroblasts from this patient and from 5 more control donors. Log-phase fibroblasts were irradiated with 254 nm UV light from a germicidal lamp, trypsinized, and replated at known densities. After 2 to 4 weeks' incubation the cells were fixed, stained and scored for colony formation. The strains' post-UV colony-forming ability curves were obtained by plotting the log of the percent remaining post-UV colony-forming ability as a function of the UV dose. The post-UV colony-forming ability of 2 of the 5 new normal strains was in the previously defined control donor zone, but that of the other 3 extended down to the level of the most resistant xeroderma pigmentosum strain. The post-UV colony-forming ability curve of the group G fibroblasts was not significantly different from the curves of the group D fibroblast strains from patients with clinical histories similar to that of the group G patient.

  1. The method of normal forms for singularly perturbed systems of Fredholm integro-differential equations with rapidly varying kernels

    Energy Technology Data Exchange (ETDEWEB)

    Bobodzhanov, A A; Safonov, V F [National Research University " Moscow Power Engineering Institute" , Moscow (Russian Federation)

    2013-07-31

    The paper deals with extending the Lomov regularization method to classes of singularly perturbed Fredholm-type integro-differential systems which have not so far been studied, namely those in which the limiting operator is discretely noninvertible. Such systems are commonly known as problems with unstable spectrum. Separating out the essential singularities in the solutions to these problems presents great difficulties, the principal one being to give an adequate description of the singularities induced by 'instability points' of the spectrum. A methodology for separating singularities by using normal forms is developed, applied to the above type of systems, and substantiated for them. Bibliography: 10 titles.

  2. Analysis of WWER-440 fuel performance under normal operating conditions

    Energy Technology Data Exchange (ETDEWEB)

    Gündüz, Ö; Köse, S; Akbas, T [Atomenerjisi Komisyonu, Ankara (Turkey); Colak, Ü [Ankara Nuclear Research and Training Center (Turkey)

    1994-12-31

    The FRAPCON-2 code, originally developed for LWR fuel behaviour simulation, is used to analyse WWER-440 fuel rod behaviour under normal operational conditions. The code is capable of utilizing different models for mechanical analysis and gas release calculations. Heat transfer calculations are accomplished through a collocation technique by the method of weighted residuals. Temperature- and burnup-dependent material properties are evaluated using the MATPRO package. As the material properties of Zr-1%Nb, used as cladding in WWER-440s, are not provided in the code, Zircaloy-4 is used as a substitute for Zr-1%Nb. The MacDonald-Weisman model is used for the gas release calculation, and the FRACAS-1 and FRACAS-2 models are used in the mechanical calculations. It is assumed that the reactor was operated for 920 days (three consecutive cycles), the burnup being 42000 MWd/tU. Results of the fuel rod behaviour analysis are given for three axial nodes: bottom, central and top. The variations of the following characteristic fuel rod parameters are studied through the prescribed power history: unmoved gap thickness, gap heat transfer coefficient, fuel axial elongation, cladding axial elongation, fuel centerline temperature and ZrO2 thickness at the cladding surface. The value of each parameter is calculated as a function of effective power days for the three nodes using the FRACAS-1 and FRACAS-2 models for comparison. The results show that calculations with the deformable pellet approximation of the FRACAS-2 model provide better information on the behaviour of a typical fuel rod. Calculations indicate that fuel rod failure is not observed during the operation, and all fuel rod parameters investigated are found to be within the safety limits. It is concluded, however, that for a better assessment of reactor safety these calculations should be extended to transient conditions such as LOCA. 1 tab., 10 figs., 4 refs.

  3. Imagine-Self Perspective-Taking and Rational Self-Interested Behavior in a Simple Experimental Normal-Form Game

    Directory of Open Access Journals (Sweden)

    Adam Karbowski

    2017-09-01

    Full Text Available The purpose of this study is to explore the link between imagine-self perspective-taking and rational self-interested behavior in experimental normal-form games. Drawing on the concept of sympathy developed by Adam Smith and further literature on perspective-taking in games, we hypothesize that the introduction of imagine-self perspective-taking by decision-makers promotes rational self-interested behavior in a simple experimental normal-form game. In our study, we examined the behavior of 404 undergraduate students in a two-person game, in which the participant can suffer a monetary loss only if she plays her Nash equilibrium strategy and the opponent plays her dominated strategy. Results suggest that the threat of suffering monetary losses effectively discourages the participants from choosing the Nash equilibrium strategy. In general, players may take into account that opponents choose dominated strategies due to specific not self-interested motivations or errors. However, adopting the imagine-self perspective leads participants to more Nash equilibrium choices, perhaps by alleviating their attributions of susceptibility to errors or non-self-interested motivations to their opponents.

  4. Imagine-Self Perspective-Taking and Rational Self-Interested Behavior in a Simple Experimental Normal-Form Game.

    Science.gov (United States)

    Karbowski, Adam; Ramsza, Michał

    2017-01-01

    The purpose of this study is to explore the link between imagine-self perspective-taking and rational self-interested behavior in experimental normal-form games. Drawing on the concept of sympathy developed by Adam Smith and further literature on perspective-taking in games, we hypothesize that the introduction of imagine-self perspective-taking by decision-makers promotes rational self-interested behavior in a simple experimental normal-form game. In our study, we examined the behavior of 404 undergraduate students in a two-person game, in which the participant can suffer a monetary loss only if she plays her Nash equilibrium strategy and the opponent plays her dominated strategy. Results suggest that the threat of suffering monetary losses effectively discourages the participants from choosing the Nash equilibrium strategy. In general, players may take into account that opponents choose dominated strategies due to specific not self-interested motivations or errors. However, adopting the imagine-self perspective leads participants to more Nash equilibrium choices, perhaps by alleviating their attributions of susceptibility to errors or non-self-interested motivations to their opponents.
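    The game structure described, in which a loss arises only when one player's Nash strategy meets the other's dominated strategy, can be sketched with hypothetical payoffs (the experiment's actual payoff matrix is not reproduced here):

    ```python
    import numpy as np

    # Hypothetical 2x2 normal-form game. Rows (participant): N = Nash strategy,
    # S = safe strategy. Columns (opponent): E = equilibrium, D = dominated.
    # Payoff pairs are (row player, column player), split into two matrices.
    row_pay = np.array([[ 5, -2],     # N vs E wins 5, N vs D LOSES 2
                        [ 3,  3]])    # S always yields 3
    col_pay = np.array([[ 5,  1],
                        [ 4,  2]])

    # Column strategy D is strictly dominated if E beats it for every row.
    dominated = bool(np.all(col_pay[:, 0] > col_pay[:, 1]))

    # Pure-strategy Nash equilibria: cells that are mutual best responses.
    nash = [(i, j) for i in range(2) for j in range(2)
            if row_pay[i, j] == row_pay[:, j].max()
            and col_pay[i, j] == col_pay[i, :].max()]
    print(dominated, nash)
    ```

    The unique pure equilibrium is (N, E), yet the participant risks the only negative payoff in the game, -2, precisely by playing N against an opponent who mistakenly plays the dominated column.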

  5. Comparative analysis of the essential oils from normal and hairy ...

    African Journals Online (AJOL)

    The essential oils were extracted with steam distillation from normal and hairy roots of Panax japonicus C.A. Meyer. The constituents of essential oils were analyzed by gas chromatography mass spectrometry (GC-MS). The results showed that 40 and 46 kinds of compounds were identified from the essential oils of normal ...

  6. Molecular analysis of quality protein (QPM) and normal maize ...

    African Journals Online (AJOL)

    Jane

    2011-10-24

    Oct 24, 2011 ... genetic variation among QPM and normal maize varieties is important for an efficient selection and .... Polymerase chain reaction (PCR) amplifications of RAPD and ISSR ..... Genetic characterization of Malawian cowpea.

  7. Analysis of AHWR downcomer piping supported on elastoplastic dampers and subjected to normal and earthquake loadings

    International Nuclear Information System (INIS)

    Dubey, P.N.; Reddy, G.R.; Vaze, K.K.; Ghosh, A.K.

    2010-05-01

    Three layouts have been considered for the AHWR downcomer for codal qualification in order to ensure its structural integrity under normal and occasional loads. In addition to codal qualification, a good piping layout should have fewer bends and weld joints: fewer bends reduce the pressure drop in natural circulation, and fewer weld joints reduce the total time of in-service inspection, which in turn reduces the radiation dose to workers. The conventional seismic design approach for piping, based on snubbers, entails high cost, maintenance and possible locking, the last causing unduly high thermal stress during normal operation. New seismic supports in the form of the Elasto-Plastic Damper (EPD) are well suited to nuclear piping because of their simple design, low cost, passive nature and ease of installation. In this report the characteristics of the EPD obtained from theory, finite element analysis and tests are presented and compared. The analysis method and code qualification of the AHWR downcomer piping, considering loadings due to normal operation and occasional loads such as earthquake, are discussed in detail. The report also explains the concepts of single-support and multi-support response spectrum analysis methods. The results obtained using both types of supports, i.e. conventional and EPD, are compared, and the use of EPD supports in the AHWR downcomer pipe is recommended. (author)

  8. Mathematical analysis of the normal anatomy of the aging fovea.

    Science.gov (United States)

    Nesmith, Brooke; Gupta, Akash; Strange, Taylor; Schaal, Yuval; Schaal, Shlomit

    2014-08-28

    To mathematically analyze anatomical changes that occur in the normal fovea during aging. A total of 2912 spectral-domain optical coherence tomography (SD-OCT) normal foveal scans were analyzed. Subjects were healthy individuals, aged 13 to 97 years, with visual acuity ≥20/40 and without evidence of foveal pathology. Using the automated symbolic regression software Eureqa (version 0.98), foveal thickness maps of 390 eyes were analyzed using several measurements: parafoveal retinal thickness at 50 μm consecutive intervals, parafoveal maximum retinal thickness at the two points lateral to the central foveal depression, distance between the two points of maximum retinal thickness, maximal foveal slope at two intervals lateral to the central foveal depression, and central length of the foveal depression. A unique mathematical equation representing the mathematical analog of foveal anatomy was derived for every decade between 10 and 100 years. The mathematical regression function for the normal fovea followed a first-order sine curve of level 10 complexity for the second decade of life. The mathematical regression function became more complex with normal aging, up to level 43 complexity (0.085 fit; P < 0.05). Young foveas had higher symmetry (0.92 ± 0.10) along the midline, whereas aged foveas had significantly less symmetry (0.76 ± 0.27, P < 0.01) along the midline and steeper maximal slopes (29 ± 32°, P < 0.01). Normal foveal anatomical configuration changes with age: normal aged foveas are less symmetric along the midline, with steeper slopes. Differentiating between normal aging and pathologic changes using SD-OCT scans may allow early diagnosis, follow-up, and better management of the aging population. Copyright 2014 The Association for Research in Vision and Ophthalmology, Inc.
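    The first-order sine description of a young fovea can be illustrated on synthetic data (the parameters below are invented, not the paper's): fit a sine curve to a symmetric thickness profile and score its midline symmetry:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # First-order sine model of a foveal thickness profile (illustrative).
    def sine(x, a, b, c, d):
        return a * np.sin(b * x + c) + d

    x = np.linspace(-1.0, 1.0, 101)            # mm from the foveal center
    true = (-60.0, 2.0, np.pi / 2, 250.0)      # central dip, symmetric profile
    y = sine(x, *true)                         # synthetic thickness (um)

    # Recover the parameters from the profile.
    popt, _ = curve_fit(sine, x, y, p0=(-50.0, 1.5, 1.6, 240.0))
    print(np.round(popt, 3))

    # Midline symmetry: correlate the left half with the mirrored right half.
    sym = np.corrcoef(y[:50], y[:-51:-1])[0, 1]
    print(round(sym, 3))
    ```

    On this noiseless symmetric profile the symmetry score is 1.0; adding age-like asymmetry to one side would lower it, mimicking the paper's young-versus-aged comparison.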

  9. Characteristic analysis on UAV-MIMO channel based on normalized correlation matrix.

    Science.gov (United States)

    Gao, Xi jun; Chen, Zi li; Hu, Yong Jiang

    2014-01-01

    Based on the three-dimensional GBSBCM (geometrically based double bounce cylinder model) MIMO channel model for unmanned aerial vehicles (UAVs), a simple form of the UAV space-time-frequency channel correlation function, which includes the LOS, SPE, and DIF components, is presented. By means of channel matrix decomposition and coefficient normalization, the analytic formula of the UAV-MIMO normalized correlation matrix is deduced. This formula can be used directly to analyze the condition number of the UAV-MIMO channel matrix, the channel capacity, and other characteristic parameters. The simulation results show that this channel correlation matrix can comprehensively describe the changes in UAV-MIMO channel characteristics under different parameter settings. This analysis method provides a theoretical basis for improving the transmission performance of the UAV-MIMO channel, and MIMO technology shows practical value in the field of UAV communication.
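    The quantities the formula feeds into, the condition number and the channel capacity, can be sketched for a hypothetical 2x2 normalized correlation matrix; the standard equal-power capacity expression is used here, with the correlation matrix standing in for the channel Gram matrix:

    ```python
    import numpy as np

    # Hypothetical 2x2 normalized correlation matrix (not from the paper).
    R = np.array([[1.0, 0.6],
                  [0.6, 1.0]])

    # Condition number: eigenvalues are 1 +/- 0.6, so (1+0.6)/(1-0.6) = 4.
    cond = np.linalg.cond(R)
    print(round(cond, 3))

    # Equal-power capacity at a fixed linear SNR for nt = 2 transmit antennas:
    # C = log2 det(I + (SNR/nt) * R)  [bits/s/Hz]
    snr, nt = 10.0, 2
    cap = np.log2(np.linalg.det(np.eye(2) + (snr / nt) * R))
    print(round(cap, 3))
    ```

    Increasing the off-diagonal correlation raises the condition number and lowers the capacity, which is the trade-off such a correlation matrix makes visible.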

  10. Cy5 total protein normalization in Western blot analysis.

    Science.gov (United States)

    Hagner-McWhirter, Åsa; Laurin, Ylva; Larsson, Anita; Bjerneld, Erik J; Rönn, Ola

    2015-10-01

    Western blotting is a widely used method for analyzing specific target proteins in complex protein samples. Housekeeping proteins are often used for normalization to correct for uneven sample loads, but these require careful validation since expression levels may vary with cell type and treatment. We present a new, more reliable method for normalization using Cy5-prelabeled total protein as a loading control. We used a prelabeling protocol based on Cy5 N-hydroxysuccinimide ester labeling that produces a linear signal response. We obtained a low coefficient of variation (CV) of 7% between the ratio of extracellular signal-regulated kinase (ERK1/2) target to Cy5 total protein control signals over the whole loading range from 2.5 to 20.0 μg of Chinese hamster ovary cell lysate protein. Corresponding experiments using actin or tubulin as controls for normalization resulted in CVs of 13 and 18%, respectively. Glyceraldehyde-3-phosphate dehydrogenase did not produce a proportional signal and was not suitable for normalization in these cells. A comparison of ERK1/2 signals from labeled and unlabeled samples showed that Cy5 prelabeling did not affect antibody binding. By using total protein normalization we analyzed PP2A and Smad2/3 levels with high confidence. Copyright © 2015 Elsevier Inc. All rights reserved.
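    Total-protein normalization as described reduces to dividing each lane's target signal by its Cy5 total-protein signal and checking the spread of the ratios; the intensities below are illustrative, not the paper's measurements:

    ```python
    import numpy as np

    # Hypothetical band intensities across four lanes with uneven loading.
    target = np.array([1.00, 1.90, 4.10, 7.80])   # ERK1/2 target signal
    cy5    = np.array([0.50, 1.00, 2.00, 4.00])   # Cy5 total-protein signal

    # Normalize the target to the loading control, then compute the CV (%).
    ratio = target / cy5
    cv = 100.0 * ratio.std(ddof=1) / ratio.mean()
    print(np.round(ratio, 3), round(cv, 1))
    ```

    A low CV of the normalized ratios across a loading series is exactly the linearity criterion the paper uses to prefer Cy5 total protein over housekeeping proteins.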

  11. Effect analysis of core barrel openings under CEFR normal condition

    International Nuclear Information System (INIS)

    Zhang Yabo; Yang Hongyi

    2008-01-01

    Openings in the bottom of the core barrel are an important part of the decay heat removal system of the China Experimental Fast Reactor (CEFR), designed to discharge decay heat from the reactor under accident conditions. This paper analyses the effect of the opening design on the normal operating condition using the CFD code CFX. The results indicate that the decay heat can be discharged safely and that, at the same time, the effect of the core barrel openings on normal operation is acceptable. (authors)

  12. CASTOR: Normal-mode analysis of resistive MHD plasmas

    NARCIS (Netherlands)

    Kerner, W.; Goedbloed, J. P.; Huysmans, G. T. A.; Poedts, S.; Schwarz, E.

    1998-01-01

    The CASTOR (complex Alfven spectrum of toroidal plasmas) code computes the entire spectrum of normal modes in resistive MHD for general tokamak configurations. The applied Galerkin method, in conjunction with a Fourier finite-element discretisation, leads to a large-scale generalized eigenvalue problem.

  13. Analysis of assistance procedures to normal birth in primiparous

    Directory of Open Access Journals (Sweden)

    Joe Luiz Vieira Garcia Novo

    2016-04-01

    Introduction: Current medical technologies in birth care have increased maternal and fetal benefits, yet numerous unnecessary procedures persist. The purpose of normal childbirth care is to have healthy women and newborns, using a minimum of safe interventions. Objective: To analyze the assistance to normal delivery in a secondary-care maternity hospital. Methodology: A total of 100 primiparous mothers who had vaginal deliveries were included. The care practices used were categorized (1) according to the WHO classification for assistance to normal childbirth (effective, harmful, used with caution, and used inappropriately) and (2) by calculating the Bologna Index from its parameters: presence of a birth partner, partograph use, no stimulation of labor, delivery in a non-supine position, and mother-newborn skin-to-skin contact. Results: Birth partners (85%), correctly filled partographs (62%), mother-newborn skin-to-skin contact (36%), use of oxytocin (87%), use of parenteral nutrition during labor (86%) and at delivery (74%), episiotomy (94%), and uterine fundal pressure in the expulsion stage (58%). The overall average Bologna Index of the mothers analyzed was 1.95. Conclusions: Some effective procedures recommended by WHO were followed (presence of a birth partner), some effective and mandatory practices were not complied with (completely filled partograph), potentially harmful or ineffective procedures were used (oxytocin in labor/post-partum), and inadequate procedures were performed (uterine fundal pressure during the expulsion stage, use of forceps, and episiotomy). The maternity hospital's care model did not offer excellence in natural birth procedures to its primiparous mothers (BI=1.95).
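    Because the Bologna Index awards one point per recommended practice present at a birth, the cohort mean equals the sum of the item prevalences. A small sketch using the percentages reported in the abstract (the non-supine-delivery rate is not reported and is assumed to be roughly zero here, purely for illustration) approximately reproduces the stated mean of 1.95:

```python
# One point per practice present; cohort mean = sum of item prevalences.
# Four prevalences come from the abstract; non-supine delivery is an
# assumed placeholder value.
prevalence = {
    "birth_partner": 0.85,
    "partograph_complete": 0.62,
    "no_labor_stimulation": 0.13,   # 87% received oxytocin
    "non_supine_position": 0.00,    # not reported; assumed for illustration
    "skin_to_skin": 0.36,
}
mean_bologna_index = sum(prevalence.values())  # close to the reported 1.95
```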

  14. Solitary-wave families of the Ostrovsky equation: An approach via reversible systems theory and normal forms

    International Nuclear Information System (INIS)

    Roy Choudhury, S.

    2007-01-01

    The Ostrovsky equation is an important canonical model for the unidirectional propagation of weakly nonlinear long surface and internal waves in a rotating, inviscid and incompressible fluid. Limited functional analytic results exist for the occurrence of one family of solitary-wave solutions of this equation, as well as their approach to the well-known solitons of the famous Korteweg-de Vries equation in the limit as the rotation becomes vanishingly small. Since solitary-wave solutions often play a central role in the long-time evolution of an initial disturbance, we consider such solutions here (via the normal form approach) within the framework of reversible systems theory. Besides confirming the existence of the known family of solitary waves and its reduction to the KdV limit, we find a second family of multihumped (or N-pulse) solutions, as well as a continuum of delocalized solitary waves (or homoclinics to small-amplitude periodic orbits). On isolated curves in the relevant parameter region, the delocalized waves reduce to genuine embedded solitons. The second and third families of solutions occur in regions of parameter space distinct from the known solitary-wave solutions and are thus entirely new. Directions for future work are also mentioned

  15. Alternative pseudodifferential analysis with an application to modular forms

    CERN Document Server

    Unterberger, André

    2008-01-01

    This volume introduces an entirely new pseudodifferential analysis on the line, the opposition of which to the usual (Weyl-type) analysis can be said to reflect that, in representation theory, between the representations from the discrete and from the (full, non-unitary) series, or that between modular forms of the holomorphic and substitute for the usual Moyal-type brackets. This pseudodifferential analysis relies on the one-dimensional case of the recently introduced anaplectic representation and analysis, a competitor of the metaplectic representation and usual analysis. Besides researchers and graduate students interested in pseudodifferential analysis and in modular forms, the book may also appeal to analysts and physicists, for its concepts making possible the transformation of creation-annihilation operators into automorphisms, simultaneously changing the usual scalar product into an indefinite but still non-degenerate one.

  16. DSA analysis of the normal and variant hepatic arterial anatomy

    International Nuclear Information System (INIS)

    Lv Penghua; Wang Jie; Shi Haibing; Feng Yaoliang; Chen Huizhu; Chen Yuqin

    2005-01-01

    Objective: To observe and analyze the normal and variant hepatic arterial anatomy by DSA. Methods: One thousand two hundred patients who underwent routine celiac and/or selective hepatic arteriography from November 1994 to March 2003 were retrospectively analyzed; some of them also underwent superior mesenteric, left gastric, or inferior phrenic arteriography. Results: 873 (72.8%) patients had standard hepatic arterial anatomy. 156 (13.0%) patients had variant left hepatic arteries (LHAs), 120 (10.0%) had variant right hepatic arteries (RHAs), and 21 (1.8%) had a variant anatomy involving both the LHA and RHA. The common hepatic artery (CHA) of 1170 (97.5%) patients originated from the celiac artery. In 92.0% of patients, the proper hepatic artery (PHA) was the direct extension of the CHA. The RHA was mainly (89.8%) derived from the PHA. There was some variation of the middle hepatic artery (MHA), with more than 62.2% arising from the LHA. The LHA was derived from the PHA (44.6%), the RHA (30.2%), or other arteries (25.2%). Conclusions: Knowledge of the normal and variant anatomy of the hepatic vasculature by DSA may be very helpful for interventional therapy and hepatic surgery. (authors)

  17. Discriminant analysis of normal and malignant breast tissue based upon INAA investigation of elemental concentration

    International Nuclear Information System (INIS)

    Kwanhoong Ng; Senghuat Ong; Bradley, D.A.; Laimeng Looi

    1997-01-01

    Discriminant analysis of six trace element concentrations measured by instrumental neutron activation analysis (INAA) in 26 paired samples of malignant and histologically normal human breast tissue shows the technique to be a potentially valuable clinical tool for malignant-normal classification. Nonparametric discriminant analysis is performed on the data obtained; linear and quadratic discriminant analyses are also carried out for comparison. For this data set, a formal analysis shows that the elements which may be useful in distinguishing between malignant and normal tissues are Ca, Rb and Br, providing correct classification for 24 out of 26 normal samples and 22 out of 26 malignant samples. (Author)
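    A hedged sketch of the linear discriminant step on synthetic data. The element concentrations, group means and spreads below are invented stand-ins for the study's measurements, and this is plain equal-prior LDA rather than the paper's nonparametric variant:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in: concentrations of three elements (e.g. Ca, Rb, Br)
# in 26 "normal" and 26 "malignant" samples; values are illustrative.
normal = rng.normal([1.0, 0.5, 0.2], 0.1, size=(26, 3))
malignant = rng.normal([1.4, 0.8, 0.5], 0.1, size=(26, 3))

def lda_predict(x, classes):
    """Linear discriminant with equal priors and a pooled covariance."""
    pooled = sum(np.cov(c, rowvar=False) for c in classes) / len(classes)
    inv = np.linalg.inv(pooled)
    scores = [m @ inv @ x - 0.5 * m @ inv @ m
              for m in (c.mean(axis=0) for c in classes)]
    return int(np.argmax(scores))

samples = np.vstack([normal, malignant])
labels = [lda_predict(x, (normal, malignant)) for x in samples]
correct = sum(int(lab == (0 if i < 26 else 1)) for i, lab in enumerate(labels))
```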

  18. Machine learning methods for clinical forms analysis in mental health.

    Science.gov (United States)

    Strauss, John; Peguero, Arturo Martinez; Hirst, Graeme

    2013-01-01

    In preparation for a clinical information system implementation, the Centre for Addiction and Mental Health (CAMH) Clinical Information Transformation project completed multiple preparation steps. An automated process was desired to supplement the onerous task of manual analysis of clinical forms. We used natural language processing (NLP) and machine learning (ML) methods on a series of 266 separate clinical forms. For the investigation, documents were represented by feature vectors. We used four ML algorithms for our examination of the forms: cluster analysis, k-nearest neighbours (kNN), decision trees and support vector machines (SVM). Parameters for each algorithm were optimized. SVM had the best performance, with a precision of 64.6%. Though we did not find any method sufficiently accurate for practical use, to our knowledge this approach to forms analysis has not been used previously in mental health.
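    A minimal illustration of the document-classification idea: represent each form as a bag-of-words vector and assign the label of the most cosine-similar known form (a 1-nearest-neighbour rule). The forms, labels and query below are invented; the study's actual feature engineering and SVM tuning are not reproduced here.

```python
from collections import Counter
import math

# Toy stand-in for clinical-form classification; text and categories
# are invented examples.
forms = [
    ("intake assessment mood sleep", "assessment"),
    ("intake assessment appetite mood", "assessment"),
    ("consent signature witness date", "consent"),
    ("consent signature guardian date", "consent"),
]

def vectorize(text):
    return Counter(text.split())          # term-count feature vector

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

def nearest_label(text):
    vec = vectorize(text)
    best = max(forms, key=lambda f: cosine(vec, vectorize(f[0])))
    return best[1]

label = nearest_label("intake mood sleep notes")
```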

  19. Analysis of a Unilateral Contact Problem with Normal Compliance

    Directory of Open Access Journals (Sweden)

    Touzaline Arezki

    2014-06-01

    The paper deals with the study of a quasistatic unilateral contact problem between a nonlinear elastic body and a foundation. The contact is modelled with a normal compliance condition associated with a unilateral constraint and Coulomb's friction law. The adhesion between the contact surfaces is taken into account and is modelled with a surface variable, the bonding field, whose evolution is described by a first-order differential equation. We establish a variational formulation of the mechanical problem and prove an existence and uniqueness result in the case where the coefficient of friction is bounded by a certain constant. The proof is based on arguments of time-dependent variational inequalities, differential equations and a fixed-point theorem.

  20. Stability analysis of rough surfaces in adhesive normal contact

    Science.gov (United States)

    Rey, Valentine; Bleyer, Jeremy

    2018-03-01

    This paper deals with adhesive frictionless normal contact between an elastic flat solid and a stiff solid with a rough surface. After computing the equilibrium solution of the energy minimization principle subject to the contact constraints, we study the stability of this equilibrium solution. The stability analysis requires solving an eigenvalue problem with inequality constraints. To achieve this goal, we propose a proximal algorithm which qualifies the solution as stable or unstable and gives the instability modes. The method has a low computational cost, since no linear system inversion is required, and is also suitable for parallel implementation. Illustrations are given for Hertzian contact and for rough contact.

  1. Herpes Simplex Encephalitis Presenting with Normal CSF Analysis

    International Nuclear Information System (INIS)

    Ahmed, R.; Kiani, I. G.; Shah, F.; Rehman, R. N.; Haq, M. E.

    2013-01-01

    A 28-year-old female presented with headache, fever, altered sensorium and right-sided weakness for one week. She was febrile and drowsy, with right-sided hemiplegia and papilledema. Tuberculous or bacterial meningitis, tuberculoma and abscess were at the top of the diagnosis list, followed by Herpes simplex meningo-encephalitis (HSE). MRI showed abnormal signal intensity in the left temporal lobe without significant post-contrast enhancement or midline shift. CSF examination was normal; Gram stain and Ziehl-Neelsen stain showed no micro-organisms or acid-fast bacilli. CSF PCR for MTB was negative. Herpes simplex virus type 1 DNA was detected in the CSF by PCR. Acyclovir was started and the patient was discharged after full recovery. A high index of suspicion is required for HSE diagnosis in Pakistan, where other infections predominantly affect the brain and HSE may be overlooked as a potential diagnosis. (author)

  2. Analysis of normal tongue by dynamic enhanced MRI

    International Nuclear Information System (INIS)

    Ariyoshi, Yasunori; Shimahara, Masashi

    2003-01-01

    We qualitatively evaluated dynamic enhanced MR images of the normal tongue in 26 patients without oral malignancy, inflammatory disease or systemic disease. The selected slices were not affected by apparent artifacts, including motion and susceptibility artifacts, and the tongue was delineated as symmetrical on coronal images obtained using a T1-weighted spin echo pulse sequence (repetition time/echo time (TR/TE) = 200/20). Slices at the incisor and molar levels were evaluated. Structures that could be identified on each pre-contrast image could also be identified on the post-contrast dynamic enhanced images. However, identification of the intrinsic tongue musculature was impossible on the images, which were composed of symmetrical, relatively high signal areas surrounded by a low signal area; both areas were gradually but distinctly enhanced. The sublingual space was easily identified at the molar level, as it was rapidly enhanced and symmetrically delineated on each image; however, it was difficult to identify at the incisor level. The lingual septum could also be identified in almost all images at the molar level and showed no enhancement, whereas the mucosal surface of the dorsum of the tongue was rapidly enhanced and identified on each image. (author)

  3. Disposal criticality analysis methodology for fissile waste forms

    International Nuclear Information System (INIS)

    Davis, J.W.; Gottlieb, P.

    1998-03-01

    A general methodology has been developed to evaluate the criticality potential of the wide range of waste forms planned for geologic disposal. The range of waste forms includes commercial spent fuel, high level waste, DOE spent fuel (including highly enriched), MOX using weapons grade plutonium, and immobilized plutonium. The disposal of these waste forms will be in a container with sufficiently thick corrosion resistant barriers to prevent water penetration for up to 10,000 years. The criticality control for DOE spent fuel is primarily provided by neutron absorber material incorporated into the basket holding the individual assemblies. For the immobilized plutonium, the neutron absorber material is incorporated into the waste form itself. The disposal criticality analysis methodology includes the analysis of geochemical and physical processes that can breach the waste package and affect the waste forms within. The basic purpose of the methodology is to guide the criticality control features of the waste package design, and to demonstrate that the final design meets the criticality control licensing requirements. The methodology can also be extended to the analysis of criticality consequences (primarily increased radionuclide inventory), which will support the total performance assessment for the repository

  4. Sampling and analysis strategies to support waste form qualification

    International Nuclear Information System (INIS)

    Westsik, J.H. Jr.; Pulsipher, B.A.; Eggett, D.L.; Kuhn, W.L.

    1989-04-01

    As part of the waste acceptance process, waste form producers will be required to (1) demonstrate that their glass waste form will meet minimum specifications, (2) show that the process can be controlled to consistently produce an acceptable waste form, and (3) provide documentation that the waste form produced meets specifications. Key to the success of these endeavors is adequate sampling and chemical and radiochemical analyses of the waste streams from the waste tanks through the process to the final glass product. This paper suggests sampling and analysis strategies for meeting specific statistical objectives of (1) detection of compositions outside specification limits, (2) prediction of final glass product composition, and (3) estimation of composition in process vessels for both reporting and guiding succeeding process steps. 2 refs., 1 fig., 3 tabs

  5. Formability models for warm sheet metal forming analysis

    Science.gov (United States)

    Jiang, Sen

    Several closed-form models for the prediction of strain-space sheet metal formability as a function of temperature and strain rate are proposed. The proposed models require only failure strain information from the uniaxial tension test at an elevated temperature and failure strain information from the traditionally defined strain-space forming limit diagram at room temperature, thereby offering a full forming limit description without expensive experimental studies for multiple modes of deformation at elevated temperature. The power-law, Voce, and Johnson-Cook hardening models are considered along with the Hill '48 and Logan-Hosford yield criteria. Acceptable correlations between theory and experiment are reported for all the models under a plane strain condition. Among all the proposed models, the model combining Johnson-Cook hardening and Logan-Hosford yield behavior (the LHJC model) was shown to correlate best with experiment. The sensitivity of the model with respect to various forming parameters is discussed. This work is significant to those aiming to incorporate closed-form formability models directly into numerical simulation programs for the design and analysis of products manufactured through the warm sheet metal forming process. An improvement based upon Swift's diffuse necking theory is suggested in order to enhance the reliability of the model for biaxial stretch conditions. Theory relating to this improvement is provided in Appendix B.

  6. Surface analysis: its uses and abuses in waste form evaluation

    International Nuclear Information System (INIS)

    McVay, G.L.; Pederson, L.R.

    1981-01-01

    Surface and near-surface analytical techniques are significant aids in understanding waste form-aqueous solution interactions. They can be beneficially employed to evaluate reaction layers on waste forms, to assess surface treatments prior to and after leaching, and to identify interactions with waste forms. Surface analyses are best used in conjunction with other types of analyses, such as solution analyses, in order to obtain a better overall understanding of reaction processes. In spite of all the benefits to be gained by using surface analyses, misinterpretations can result if care is not taken to properly obtain and analyze the data. In particular, the density variations through a reaction layer must be accounted for in both sputtering and data analysis techniques

  7. Micro analysis of fringe field formed inside LDA measuring volume

    International Nuclear Information System (INIS)

    Ghosh, Abhijit; Nirala, A K

    2016-01-01

    In the present study we propose a technique for micro analysis of fringe field formed inside laser Doppler anemometry (LDA) measuring volume. Detailed knowledge of the fringe field obtained by this technique allows beam quality, alignment and fringe uniformity to be evaluated with greater precision and may be helpful for selection of an appropriate optical element for LDA system operation. A complete characterization of fringes formed at the measurement volume using conventional, as well as holographic optical elements, is presented. Results indicate the qualitative, as well as quantitative, improvement of fringes formed at the measurement volume by holographic optical elements. Hence, use of holographic optical elements in LDA systems may be advantageous for improving accuracy in the measurement. (paper)

  8. Comparative Study of Various Normal Mode Analysis Techniques Based on Partial Hessians

    OpenAIRE

    GHYSELS, AN; VAN SPEYBROECK, VERONIQUE; PAUWELS, EWALD; CATAK, SARON; BROOKS, BERNARD R.; VAN NECK, DIMITRI; WAROQUIER, MICHEL

    2010-01-01

    Standard normal mode analysis becomes problematic for complex molecular systems, as a result of both the high computational cost and the excessive amount of information when the full Hessian matrix is used. Several partial Hessian methods have been proposed in the literature, yielding approximate normal modes. These methods aim at reducing the computational load and/or calculating only the relevant normal modes of interest in a specific application. Each method has its own (dis)advantages and...
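    The core computation behind any full- or partial-Hessian normal-mode method is diagonalization of the mass-weighted Hessian. A toy sketch for a one-dimensional diatomic (spring constant and masses invented) illustrates the step; a partial-Hessian scheme applies the same diagonalization to a sub-block of a large Hessian instead of the full matrix.

```python
import numpy as np

# Toy 1-D diatomic: potential V = 0.5 * k * (x2 - x1)^2, so the Hessian
# d2V/dx_i dx_j is the 2x2 matrix below. Numbers are illustrative.
k = 1.0
m1, m2 = 1.0, 2.0
hessian = np.array([[k, -k],
                    [-k, k]])

# Mass-weight the Hessian: H_mw = M^(-1/2) H M^(-1/2), then diagonalize.
minv = np.diag([1 / np.sqrt(m1), 1 / np.sqrt(m2)])
mw_hessian = minv @ hessian @ minv
eigvals, modes = np.linalg.eigh(mw_hessian)

# omega = sqrt(lambda); the zero eigenvalue is the free translation mode.
frequencies = np.sqrt(np.clip(eigvals, 0.0, None))
```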

  9. Optimal consistency in microRNA expression analysis using reference-gene-based normalization.

    Science.gov (United States)

    Wang, Xi; Gardiner, Erin J; Cairns, Murray J

    2015-05-01

    Normalization of high-throughput molecular expression profiles secures differential expression analysis between samples of different phenotypes or biological conditions, and facilitates comparison between experimental batches. While the same general principles apply to microRNA (miRNA) normalization, there is mounting evidence that global shifts in their expression patterns occur in specific circumstances, which pose a challenge for normalizing miRNA expression data. As an alternative to global normalization, which has the propensity to flatten large trends, normalization against constitutively expressed reference genes presents an advantage through their relative independence. Here we investigated the performance of reference-gene-based (RGB) normalization for differential miRNA expression analysis of microarray expression data, and compared the results with other normalization methods, including: quantile, variance stabilization, robust spline, simple scaling, rank invariant, and Loess regression. The comparative analyses were executed using miRNA expression in tissue samples derived from subjects with schizophrenia and non-psychiatric controls. We proposed a consistency criterion for evaluating methods by examining the overlapping of differentially expressed miRNAs detected using different partitions of the whole data. Based on this criterion, we found that RGB normalization generally outperformed global normalization methods. Thus we recommend the application of RGB normalization for miRNA expression data sets, and believe that this will yield a more consistent and useful readout of differentially expressed miRNAs, particularly in biological conditions characterized by large shifts in miRNA expression.
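    A hedged sketch of the reference-gene-based idea on synthetic data: each sample's signals are divided by the geometric mean of a set of designated reference genes before log transformation. The expression matrix, reference-gene rows and distribution parameters below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
# Invented expression matrix: 100 miRNAs (rows) x 6 samples (columns).
expr = rng.lognormal(mean=2.0, sigma=0.5, size=(100, 6))
ref_idx = [0, 1, 2]  # rows assumed to be stably expressed reference genes

# Per-sample scaling factor: geometric mean of the reference-gene signals.
ref_factor = np.exp(np.log(expr[ref_idx]).mean(axis=0))

# Divide every miRNA by its sample's factor, then log2-transform; the
# reference genes are centred at zero in every sample by construction.
normalized = np.log2(expr / ref_factor)
```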

  10. Dispersion-theoretical analysis of the nucleon electromagnetic form factors

    Energy Technology Data Exchange (ETDEWEB)

    Belushkin, M.

    2007-09-29

    The structure of the proton and the neutron is of fundamental importance for the study of the strong interaction dynamics over a wide range of momentum transfers. The nucleon form factors encode information on the internal structure of the nucleon as probed by the electromagnetic interaction, and, to a certain extent, reflect the charge and magnetisation distributions within the proton and the neutron. In this thesis we report on our investigation of the electromagnetic form factors of the proton and the neutron with dispersion relation techniques, including known experimental input on the ππ, K anti K and the ρπ continua and perturbative QCD constraints. We include new experimental data on the pion form factor and the nucleon form factors in our simultaneous analysis of all four form factors in both the space- and the timelike regions for all momentum transfers, and perform Monte Carlo sampling in order to obtain theoretical uncertainty bands. Finally, we discuss the implications of our results on the pion cloud of the nucleon, the nucleon radii and the Okubo-Zweig-Iizuka rule, and present our results of a model-independent approach to estimating two-photon effects in elastic electron-proton scattering. (orig.)

  11. Dispersion-theoretical analysis of the nucleon electromagnetic form factors

    International Nuclear Information System (INIS)

    Belushkin, M.

    2007-01-01

    The structure of the proton and the neutron is of fundamental importance for the study of the strong interaction dynamics over a wide range of momentum transfers. The nucleon form factors encode information on the internal structure of the nucleon as probed by the electromagnetic interaction, and, to a certain extent, reflect the charge and magnetisation distributions within the proton and the neutron. In this thesis we report on our investigation of the electromagnetic form factors of the proton and the neutron with dispersion relation techniques, including known experimental input on the ππ, K anti K and the ρπ continua and perturbative QCD constraints. We include new experimental data on the pion form factor and the nucleon form factors in our simultaneous analysis of all four form factors in both the space- and the timelike regions for all momentum transfers, and perform Monte Carlo sampling in order to obtain theoretical uncertainty bands. Finally, we discuss the implications of our results on the pion cloud of the nucleon, the nucleon radii and the Okubo-Zweig-Iizuka rule, and present our results of a model-independent approach to estimating two-photon effects in elastic electron-proton scattering. (orig.)

  12. Normality Tests for Statistical Analysis: A Guide for Non-Statisticians

    Science.gov (United States)

    Ghasemi, Asghar; Zahediasl, Saleh

    2012-01-01

    Statistical errors are common in scientific literature and about 50% of the published articles have at least one error. The assumption of normality needs to be checked for many statistical procedures, namely parametric tests, because their validity depends on it. The aim of this commentary is to overview checking for normality in statistical analysis using SPSS. PMID:23843808
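    As a sketch of what such a check involves outside SPSS, sample skewness and excess kurtosis can be compared against their approximate large-sample standard errors, sqrt(6/n) and sqrt(24/n); this mirrors the moment-based z-tests commonly discussed alongside Shapiro-Wilk, with the usual |z| < 1.96 cutoff. The data here are simulated, and the standard-error formulas are the simple asymptotic ones rather than SPSS's exact small-sample expressions.

```python
import math
import random

random.seed(0)
data = [random.gauss(0, 1) for _ in range(500)]  # simulated sample

n = len(data)
mean = sum(data) / n
m2 = sum((x - mean) ** 2 for x in data) / n   # central moments
m3 = sum((x - mean) ** 3 for x in data) / n
m4 = sum((x - mean) ** 4 for x in data) / n

# z-scores of skewness and excess kurtosis against their large-sample SEs.
skew_z = (m3 / m2 ** 1.5) / math.sqrt(6 / n)
kurt_z = (m4 / m2 ** 2 - 3) / math.sqrt(24 / n)
looks_normal = abs(skew_z) < 1.96 and abs(kurt_z) < 1.96
```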

  13. An Evaluation of Normal versus Lognormal Distribution in Data Description and Empirical Analysis

    Science.gov (United States)

    Diwakar, Rekha

    2017-01-01

    Many existing methods of statistical inference and analysis rely heavily on the assumption that the data are normally distributed. However, the normality assumption is not fulfilled when dealing with data which does not contain negative values or are otherwise skewed--a common occurrence in diverse disciplines such as finance, economics, political…

  14. Must analysis of meaning follow analysis of form? A time course analysis.

    Science.gov (United States)

    Feldman, Laurie B; Milin, Petar; Cho, Kit W; Moscoso Del Prado Martín, Fermín; O'Connor, Patrick A

    2015-01-01

    Many models of word recognition assume that processing proceeds sequentially from analysis of form to analysis of meaning. In the context of morphological processing, this implies that morphemes are processed as units of form prior to any influence of their meanings. Some interpret the apparent absence of differences in recognition latencies to targets (SNEAK) in form and semantically similar (sneaky-SNEAK) and in form similar and semantically dissimilar (sneaker-SNEAK) prime contexts at a stimulus onset asynchrony (SOA) of 48 ms as consistent with this claim. To determine the time course over which degree of semantic similarity between morphologically structured primes and their targets influences recognition in the forward masked priming variant of the lexical decision paradigm, we compared facilitation for the same targets after semantically similar and dissimilar primes across a range of SOAs (34-100 ms). The effect of shared semantics on recognition latency increased linearly with SOA when long SOAs were intermixed (Experiments 1A and 1B) and latencies were significantly faster after semantically similar than dissimilar primes at homogeneous SOAs of 48 ms (Experiment 2) and 34 ms (Experiment 3). Results limit the scope of form-then-semantics models of recognition and demonstrate that semantics influences even the very early stages of recognition. Finally, once general performance across trials has been accounted for, we fail to provide evidence for individual differences in morphological processing that can be linked to measures of reading proficiency.

  15. Must analysis of meaning follow analysis of form? A time course analysis

    Directory of Open Access Journals (Sweden)

    Laurie eFeldman

    2015-03-01

    Many models of word recognition assume that processing proceeds sequentially from analysis of form to analysis of meaning. In the context of morphological processing, this implies that morphemes are processed as units of form prior to any influence of their meanings. Some interpret the apparent absence of differences in recognition latencies to targets (SNEAK) in form and semantically similar (sneaky-SNEAK) and in form similar and semantically dissimilar (sneaker-SNEAK) prime contexts at an SOA of 48 ms as consistent with this claim. To determine the time course over which degree of semantic similarity between morphologically structured primes and their targets influences recognition in the forward masked priming variant of the lexical decision paradigm, we compared facilitation for the same targets after semantically similar and dissimilar primes across a range of SOAs. The effect of shared semantics on recognition latency increased linearly with SOA when long SOAs were intermixed (Exp. 1), and latencies were significantly faster after semantically similar than dissimilar primes at homogeneous SOAs of 48 ms (Exp. 2) and 34 ms (Exp. 3). Results limit the scope of form-then-semantics models of recognition and demonstrate that semantics influences even the very early stages of recognition. Finally, once general behavior across trials has been accounted for, we fail to provide evidence for individual differences in morphological processing that can be linked to measures of reading proficiency.

  16. Dispersive analysis of the scalar form factor of the nucleon

    Science.gov (United States)

    Hoferichter, M.; Ditsche, C.; Kubis, B.; Meißner, U.-G.

    2012-06-01

    Based on the recently proposed Roy-Steiner equations for pion-nucleon (πN) scattering [1], we derive a system of coupled integral equations for the ππ → N̄N and K̄K → N̄N S-waves. These equations take the form of a two-channel Muskhelishvili-Omnès problem, whose solution in the presence of a finite matching point is discussed. We use these results to update the dispersive analysis of the scalar form factor of the nucleon, fully including K̄K intermediate states. In particular, we determine the correction Δσ = σ(2M_π²) - σ_πN, which is needed for the extraction of the pion-nucleon σ term from πN scattering, as a function of pion-nucleon subthreshold parameters and the πN coupling constant.

  17. EMERALD-NORMAL, Routine Radiation Release and Dose for PWR Design Analysis and Operation Analysis

    International Nuclear Information System (INIS)

    Gillespie, S.G.; Brunot, W.K.

    1976-01-01

    1 - Description of problem or function: EMERALD-NORMAL is designed for the calculation of radiation releases and exposures resulting from normal operation of a large pressurized water reactor. The approach used is similar to an analog simulation of a real system. Each component or volume in the plant which contains a radioactive material is represented by a subroutine which keeps track of the production, transfer, decay, and absorption of radioactivity in that volume. During the course of the analysis, activity is transferred from subroutine to subroutine in the program as it would be transferred from place to place in the plant. Some of this activity is then released to the atmosphere and to the discharge canal. The rates of transfer, leakage, production, cleanup, decay, and release are read as input to the program. Subroutines are also included which calculate the off-site radiation exposures at various distances for individual isotopes and sums of isotopes. The program contains a library of physical data for the forty isotopes of most interest in licensing calculations, and other isotopes can be added or substituted. Because of the flexible nature of the simulation approach, the EMERALD-NORMAL program can be used for most calculations involving the production and release of radioactive material. These include design, operation, and licensing studies. 2 - Method of solution: Explicit solutions of first-order linear differential equations are included. In addition, a subroutine is provided which solves a set of simultaneous linear algebraic equations. 3 - Restrictions on the complexity of the problem: Many parameters and systems included in the program, particularly the radiation waste-treatment system, are unique to the PG&E Diablo Canyon PWR plant. Maxima of: 50 isotopes, 9 distances, 16 angular sectors, 1 operating period, 1 reactor power level
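    The per-volume bookkeeping described above amounts to a first-order balance dA/dt = P - (λ + k)A for each inventory (production P, decay constant λ, transfer rate k), which has an explicit solution that can be applied step by step. A hedged sketch with invented rates follows; it illustrates the kind of explicit first-order solution the record mentions, not the program's actual routines.

```python
import math

def step_activity(a0, production, decay, transfer, dt):
    """Explicit solution of dA/dt = P - (lam + k) * A over one time step."""
    lam_tot = decay + transfer
    a_inf = production / lam_tot              # steady-state inventory
    return a_inf + (a0 - a_inf) * math.exp(-lam_tot * dt)

# Illustrative rates: start empty, constant production, first-order losses.
activity = step_activity(a0=0.0, production=1.0, decay=0.1, transfer=0.4,
                         dt=10.0)
```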

  18. Clarifying Normalization

    Science.gov (United States)

    Carpenter, Donald A.

    2008-01-01

    Confusion exists among database textbooks as to the goal of normalization as well as to which normal form a designer should aspire. This article discusses such discrepancies with the intention of simplifying normalization for both teacher and student. This author's industry and classroom experiences indicate such simplification yields quicker…

  19. Birkhoff normalization

    NARCIS (Netherlands)

    Broer, H.; Hoveijn, I.; Lunter, G.; Vegter, G.

    2003-01-01

    The Birkhoff normal form procedure is a widely used tool for approximating a Hamiltonian system by a simpler one. This chapter starts out with an introduction to Hamiltonian mechanics, followed by an explanation of the Birkhoff normal form procedure. Finally we discuss several algorithms for

  20. Analysis of Fringe Field Formed Inside LDA Measurement Volume Using Compact Two Hololens Imaging Systems

    Science.gov (United States)

    Ghosh, Abhijit; Nirala, A. K.; Yadav, H. L.

    2018-03-01

    We have designed and fabricated four LDA optical setups consisting of four different aberration-compensated compact two-hololens imaging systems. We have experimentally investigated and realized a hololens recording geometry that is an interferogram of a converging spherical wavefront with a mutually coherent planar wavefront. The proposed real-time monitoring and actual fringe field analysis techniques allow complete characterization of the fringes formed at the measurement volume and permit evaluation of beam quality, alignment, and fringe uniformity with greater precision. After experimentally analyzing the fringes formed at the measurement volume by all four imaging systems, we found that the fringes obtained using the compact two-hololens imaging systems improve both qualitatively and quantitatively compared with those obtained using a conventional imaging system. Results indicate a qualitative improvement in the non-uniformity of fringe thickness and in micro intensity variations perpendicular to the fringes, and a quantitative improvement of 39.25% in the overall average normalized standard deviation of fringe width formed by the compact two-hololens imaging systems compared with that of the conventional imaging system.

  1. BIOCHEMICAL EFFECTS IN NORMAL AND STONE FORMING RATS TREATED WITH THE RIPE KERNEL JUICE OF PLANTAIN (MUSA PARADISIACA)

    Science.gov (United States)

    Devi, V. Kalpana; Baskar, R.; Varalakshmi, P.

    1993-01-01

    The effect of Musa paradisiaca stem kernel juice was investigated in experimental urolithiatic rats. Stone-forming rats exhibited a significant elevation in the activities of two oxalate-synthesizing enzymes - glycollic acid oxidase and lactate dehydrogenase. Deposition and excretion of stone-forming constituents in kidney and urine were also increased in these rats. The enzyme activities and the level of crystalline components were lowered with the extract treatment. The extract also reduced the activities of urinary alkaline phosphatase, lactate dehydrogenase, γ-glutamyl transferase, inorganic pyrophosphatase and β-glucuronidase in calculogenic rats. No appreciable changes were noticed in leucine aminopeptidase activity in treated rats. PMID:22556626

  2. A numerical analysis on forming limits during spiral and concentric single point incremental forming

    Science.gov (United States)

    Gipiela, M. L.; Amauri, V.; Nikhare, C.; Marcondes, P. V. P.

    2017-01-01

    Sheet metal forming is one of the major manufacturing industries, building numerous parts for the aerospace, automotive, and medical industries. Due to high demand in the vehicle industry on the one hand and environmental regulations requiring lower fuel consumption on the other, researchers are devising new methods to build these parts with energy-efficient sheet metal forming processes, instead of the conventional punch and die, in order to achieve lightweight parts. One of the most recognized manufacturing processes in this category is Single Point Incremental Forming (SPIF). SPIF is a die-less sheet metal forming process in which a single-point tool incrementally forces successive points of the sheet metal into the plastic deformation zone. In the present work, the finite element method (FEM) is applied to analyze the forming limits of a high-strength low-alloy steel formed by SPIF along spiral and concentric tool paths. SPIF numerical simulations were modeled with 24 and 29 mm cup depths, and the results were compared with Nakajima results obtained by experiment and FEM. It was found that the cup formed with the Nakajima tool failed at 24 mm, while cups formed by SPIF surpassed this limit at both depths with both tool-path profiles. It was also noticed that the strains achieved in the concentric profile are lower than those in the spiral profile.

  3. The Coopersmith Self-Esteem Inventory: analysis and partial validation of a modified adult form.

    Science.gov (United States)

    Myhill, J; Lorr, M

    1978-01-01

    Determined the factor structure of an adult form of the Coopersmith Self-Esteem Inventory (SEI), tested several hypotheses related to its content, and assessed the utility of the five derived scores for differentiating psychiatric outpatients from normals. The modified Self-Esteem Inventory and six other scales were completed by 200 local-government employees. A principal components analysis of correlations among 58 SEI items and two marker variables revealed five factors. The rotated dimensions were labelled (1) anxiety; (2) defensiveness; (3) negative social attitude; (4) rejection of self; and (5) inadequacy of self. Fifty psychiatric outpatients were compared with 100 normals with respect to the five derived factor scores. Tests of significance indicated that the two groups differed significantly on all measures except the defensiveness or lie scale factor. It is concluded that the Coopersmith Inventory is complex and measures several characteristics in addition to self-esteem.

  4. ANALYSIS OF DIFFERENT FORMS OF SHOTS IN FOOTBALL

    Directory of Open Access Journals (Sweden)

    Miroslav Radoman

    2010-09-01

    The area of technical-tactical activities is interesting and under-explored, especially because the game changes faster from year to year. The aim of this research is to analyze different forms of shots in football (as recorded in the official statistics) and their association with the outcomes of football matches at the 2006 World Cup in Germany (FIFA World Cup Germany 2006). The sample consists of observations of the 64 games played at the tournament. The sample characteristics are the elements of the game tracked in the official statistics promoted for all FIFA competitions held under its auspices, including the World Championships. Four of the five available elements were selected for analysis. Based on the analysis, it can be concluded that the winning team stands out significantly in all analyzed characteristics relative to teams that drew or were defeated, where the difference is smaller. All teams that achieved victory had offensive tactics or were technically dominant in relation to their opponents.

  5. An individual urinary proteome analysis in normal human beings to define the minimal sample number to represent the normal urinary proteome

    Directory of Open Access Journals (Sweden)

    Liu Xuejiao

    2012-11-01

    Abstract. Background: The urinary proteome has been widely used for biomarker discovery. A urinary proteome database from normal humans can provide a background for discovery proteomics and candidate proteins/peptides for targeted proteomics. Therefore, it is necessary to define the minimum number of individuals required for sampling to represent the normal urinary proteome. Methods: In this study, inter-individual and inter-gender variations of the urinary proteome were taken into consideration to achieve a representative database. An individual analysis was performed on overnight urine samples from 20 normal volunteers (10 males and 10 females) by 1D LC-MS/MS. To obtain a representative result for each sample, a replicate 1D LC-MS/MS analysis was performed. The minimal sample number was estimated by statistical analysis. Results: For qualitative analysis, fewer than 5% of new proteins/peptides were identified in the male/female normal groups on adding a new sample once the sample number exceeded nine. In addition, in the combined normal group, the percentage of newly identified proteins/peptides was less than 5% on adding a new sample when the sample number reached 10. Furthermore, a statistical analysis indicated that urinary proteomes from normal males and females showed different patterns. For quantitative analysis, the variation of protein abundance was defined by spectral counting and western blotting, and the minimal sample number for quantitative proteomic analysis was then identified. Conclusions: For qualitative analysis, when considering inter-individual and inter-gender variations, the minimum sample number is 10, with a balanced number of males and females, in order to obtain a representative normal human urinary proteome. For quantitative analysis, the minimal sample number is much greater than that for qualitative analysis and depends on the experimental methods used for quantification.
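The saturation criterion in the Results (fewer than 5% new identifications per added sample) can be sketched as a set-based cumulative calculation; the IDs below are illustrative stand-ins for the study's protein lists:

```python
def new_fraction_per_sample(samples):
    # For each added sample (a set of identified protein/peptide IDs),
    # return the fraction of IDs not seen in any earlier sample.
    seen = set()
    fractions = []
    for ids in samples:
        new = ids - seen
        fractions.append(len(new) / len(ids) if ids else 0.0)
        seen |= ids
    return fractions
```

The minimal sample number is then the point at which this fraction stays below the chosen threshold (5% in the study).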

  6. The Effect of Normal Force on Tribocorrosion Behaviour of Ti-10Zr Alloy and Porous TiO2-ZrO2 Thin Film Electrochemical Formed

    Science.gov (United States)

    Dănăilă, E.; Benea, L.

    2017-06-01

    The tribocorrosion behaviour of Ti-10Zr alloy and of a porous TiO2-ZrO2 thin film electrochemically formed on Ti-10Zr alloy was evaluated in Fusayama-Mayer artificial saliva solution. Tribocorrosion experiments were performed using a unidirectional pin-on-disc experimental set-up which was mechanically and electrochemically instrumented, under various loading conditions. The effect of the applied normal force on the tribocorrosion performance of the tested materials was determined. Open circuit potential (OCP) measurements performed before, during, and after the sliding tests were used to assess tribocorrosion degradation. The applied normal force was found to greatly affect the potential during tribocorrosion experiments: an increase in the normal force induced a decrease in potential, accelerating the depassivation of the studied materials. The results show a decrease in the friction coefficient as the normal load gradually increases. The porous TiO2-ZrO2 thin film electrochemically formed on the Ti-10Zr alloy was shown to improve tribocorrosion resistance compared with the non-anodized Ti-10Zr alloy intended for biomedical applications.

  7. A Hard X-Ray Study of the Normal Star-Forming Galaxy M83 with NuSTAR

    DEFF Research Database (Denmark)

    Yukita, M.; Hornschemeier, A. E.; Lehmer, B. D.

    2016-01-01

    We present the results from sensitive, multi-epoch NuSTAR observations of the late-type star-forming galaxy M83 (d = 4.6 Mpc). This is the first investigation to spatially resolve the hard (E > 10 keV) X-ray emission of this galaxy. The nuclear region and about 20 off-nuclear point sources, including a previously discovered ultraluminous X-ray source, are detected in our NuSTAR observations. The X-ray hardnesses and luminosities of the majority of the point sources are consistent with hard X-ray sources resolved in the starburst galaxy NGC 253. We infer that the hard X-ray emission is most

  8. Assessment of 1H NMR-based metabolomics analysis for normalization of urinary metals against creatinine.

    Science.gov (United States)

    Cassiède, Marc; Nair, Sindhu; Dueck, Meghan; Mino, James; McKay, Ryan; Mercier, Pascal; Quémerais, Bernadette; Lacy, Paige

    2017-01-01

    Proton nuclear magnetic resonance (¹H NMR, or NMR) spectroscopy and inductively coupled plasma-mass spectrometry (ICP-MS) are commonly used for metabolomics and metal analysis in urine samples. However, creatinine quantification by NMR for the purpose of normalization of urinary metals has not been validated. We assessed the validity of using NMR analysis for creatinine quantification in human urine samples in order to allow normalization of urinary metal concentrations. NMR and ICP-MS techniques were used to measure metabolite and metal concentrations in urine samples from 10 healthy subjects. For metabolite analysis, two magnetic field strengths (600 and 700 MHz) were utilized. In addition, creatinine concentrations were determined by using the Jaffe method. Creatinine levels were strongly correlated (R² = 0.99) between the NMR and Jaffe methods. The NMR spectra were deconvoluted with a target database containing 151 metabolites that are present in urine. A total of 50 metabolites showed good correlation (R² = 0.7-1.0) at 600 and 700 MHz. Metal concentrations determined after NMR-measured creatinine normalization were comparable to previous reports. NMR analysis provided robust urinary creatinine quantification and was sufficient for normalization of urinary metal concentrations. We found that NMR-measured creatinine-normalized urinary metal concentrations in our control subjects were similar to general population levels in Canada and the United Kingdom. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Graph analysis of cell clusters forming vascular networks

    Science.gov (United States)

    Alves, A. P.; Mesquita, O. N.; Gómez-Gardeñes, J.; Agero, U.

    2018-03-01

    This manuscript describes the experimental observation of vasculogenesis in chick embryos by means of network analysis. The formation of the vascular network was observed in the area opaca of embryos from 40 to 55 h of development. In the area opaca, endothelial cell clusters self-organize as a primitive and approximately regular network of capillaries. The process was observed by bright-field microscopy in control embryos and in embryos treated with Bevacizumab (Avastin), an antibody that inhibits the signalling of the vascular endothelial growth factor (VEGF). The sequence of images of vascular growth was thresholded and used to quantify the forming network in control and Avastin-treated embryos. This characterization is made by measuring vessel density, the number of cell clusters, and the largest cluster density. From the original images, the topology of the vascular network was extracted and characterized by means of the usual network metrics, such as the degree distribution, average clustering coefficient, average shortest path length, and assortativity, among others. This analysis allows monitoring of how the largest connected cluster of the vascular network evolves in time and provides quantitative evidence of the disruptive effects that Avastin has on the tree structure of vascular networks.

  10. Dynamic pathways to mediate reactions buried in thermal fluctuations. I. Time-dependent normal form theory for multidimensional Langevin equation.

    Science.gov (United States)

    Kawai, Shinnosuke; Komatsuzaki, Tamiki

    2009-12-14

    We present a novel theory which enables us to explore the mechanism of reaction selectivity and robust functions in complex systems persisting under thermal fluctuation. The theory constructs a nonlinear coordinate transformation so that the equation of motion for the new reaction coordinate is independent of the other nonreactive coordinates in the presence of thermal fluctuation. In this article we suppose that reacting systems subject to thermal noise are described by a multidimensional Langevin equation without a priori assumption for the form of potential. The reaction coordinate is composed not only of all the coordinates and velocities associated with the system (solute) but also of the random force exerted by the environment (solvent) with friction constants. The sign of the reaction coordinate at any instantaneous moment in the region of a saddle determines the fate of the reaction, i.e., whether the reaction will proceed through to the products or go back to the reactants. By assuming the statistical properties of the random force, one can know a priori a well-defined boundary of the reaction which separates the full position-velocity space in the saddle region into mainly reactive and mainly nonreactive regions even under thermal fluctuation. The analytical expression of the reaction coordinate provides the firm foundation on the mechanism of how and why reaction proceeds in thermal fluctuating environments.

  11. Analysis of EMG Signals in Aggressive and Normal Activities by Using Higher-Order Spectra

    Directory of Open Access Journals (Sweden)

    Necmettin Sezgin

    2012-01-01

    The analysis and classification of electromyography (EMG) signals are very important in order to detect some symptoms of diseases, for prosthetic arm/leg control, and so on. In this study, an EMG signal was analyzed using the bispectrum, which belongs to a family of higher-order spectra. An EMG signal is the electrical potential difference of muscle cells. The EMG signals used in the present study are aggressive or normal actions. The EMG dataset was obtained from the machine learning repository. First, the aggressive and normal EMG activities were analyzed using the bispectrum and the quadratic phase coupling of each EMG episode was determined. Next, the features of the analyzed EMG signals were fed into learning machines to separate the aggressive and normal actions. The best classification result was 99.75%, which is sufficient to significantly classify the aggressive and normal actions.

  12. Analysis of EMG Signals in Aggressive and Normal Activities by Using Higher-Order Spectra

    Science.gov (United States)

    Sezgin, Necmettin

    2012-01-01

    The analysis and classification of electromyography (EMG) signals are very important in order to detect some symptoms of diseases, prosthetic arm/leg control, and so on. In this study, an EMG signal was analyzed using bispectrum, which belongs to a family of higher-order spectra. An EMG signal is the electrical potential difference of muscle cells. The EMG signals used in the present study are aggressive or normal actions. The EMG dataset was obtained from the machine learning repository. First, the aggressive and normal EMG activities were analyzed using bispectrum and the quadratic phase coupling of each EMG episode was determined. Next, the features of the analyzed EMG signals were fed into learning machines to separate the aggressive and normal actions. The best classification result was 99.75%, which is sufficient to significantly classify the aggressive and normal actions. PMID:23193379
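The bispectrum and quadratic phase coupling mentioned above can be sketched with a direct FFT-based estimator. A minimal illustration; the study's exact estimator, windowing, and segment length are assumptions here:

```python
import numpy as np

def bispectrum(x, seg_len=256):
    # Direct (FFT-based) bispectrum estimate averaged over segments:
    #   B(f1, f2) = E[ X(f1) X(f2) conj(X(f1 + f2)) ].
    # Peaks indicate quadratic phase coupling between frequency pairs.
    n_seg = len(x) // seg_len
    half = seg_len // 2
    window = np.hanning(seg_len)
    B = np.zeros((half, half), dtype=complex)
    for i in range(n_seg):
        X = np.fft.fft(x[i * seg_len:(i + 1) * seg_len] * window)
        for f1 in range(half):
            for f2 in range(half - f1):  # keep f1 + f2 inside the half-spectrum
                B[f1, f2] += X[f1] * X[f2] * np.conj(X[f1 + f2])
    return np.abs(B) / max(n_seg, 1)
```

A signal containing components at f1, f2, and f1+f2 with locked phases produces a pronounced peak at (f1, f2), which is the feature fed to the classifiers.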

  13. Human evaluation in association to the mathematical analysis of arch forms: Two-dimensional study.

    Science.gov (United States)

    Zabidin, Nurwahidah; Mohamed, Alizae Marny; Zaharim, Azami; Marizan Nor, Murshida; Rosli, Tanti Irawati

    2018-03-01

    To evaluate the relationship between human evaluation of the dental-arch form and mathematical analysis, to quantify the arch form via two different methods, and to establish agreement with the fourth-order polynomial equation. This study included 64 sets of digitised maxillary and mandibular dental casts obtained from a sample of dental arches with normal occlusion. For the human evaluation, a convenience sample of orthodontic practitioners ranked photo images of the dental casts from the most tapered to the least tapered (square). In the mathematical analysis, dental arches were interpolated using the fourth-order polynomial equation with millimetric acetate paper and AutoCAD software. Finally, the relations between the human evaluation and the objective mathematical analyses were evaluated. Human evaluations were found to be generally in agreement, but only at the extremes of tapered and square arch forms; this indicated general human error and observer bias. The two methods used to plot the arch form were comparable. The use of the fourth-order polynomial equation may facilitate obtaining a smooth curve, which can produce a template for an individual arch that represents all potential tooth positions in the dental arch. Copyright © 2018 CEO. Published by Elsevier Masson SAS. All rights reserved.
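The fourth-order polynomial interpolation described above can be sketched as a least-squares fit; the landmark coordinates below are hypothetical stand-ins for digitised cast measurements:

```python
import numpy as np

# Hypothetical digitised landmark coordinates for one dental arch (mm):
# x runs across the arch, y is the depth toward the incisors.
x = np.array([-25.0, -18.0, -10.0, 0.0, 10.0, 18.0, 25.0])
y = np.array([  0.0,  12.0,  20.0, 24.0, 20.0, 12.0,  0.0])

coeffs = np.polyfit(x, y, 4)   # least-squares fourth-order polynomial fit
arch = np.poly1d(coeffs)       # smooth template curve for the arch

midline_depth = arch(0.0)      # evaluate the curve at any tooth position
```

The fitted polynomial gives a smooth template that can be evaluated at any tooth position along the arch, which is the property the study exploits.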

  14. Is Middle-Upper Arm Circumference "normally" distributed? Secondary data analysis of 852 nutrition surveys.

    Science.gov (United States)

    Frison, Severine; Checchi, Francesco; Kerac, Marko; Nicholas, Jennifer

    2016-01-01

    Wasting is a major public health issue throughout the developing world. Out of the 6.9 million estimated deaths among children under five annually, over 800,000 deaths (11.6%) are attributed to wasting. Wasting is quantified as low Weight-For-Height (WFH) and/or low Mid-Upper Arm Circumference (MUAC) (since 2005). Many statistical procedures are based on the assumption that the data used are normally distributed. Analyses have been conducted on the distribution of WFH, but there are no equivalent studies on the distribution of MUAC. This secondary data analysis assesses the normality of the MUAC distributions of 852 nutrition cross-sectional survey datasets of children from 6 to 59 months old and examines different approaches to normalise "non-normal" distributions. The distribution of MUAC showed no departure from a normal distribution in 319 (37.7%) distributions using the Shapiro-Wilk test. Out of the 533 surveys showing departure from a normal distribution, 183 (34.3%) were skewed (D'Agostino test) and 196 (36.8%) had a kurtosis different from that of the normal distribution (Anscombe-Glynn test). Testing for normality can be sensitive to data quality, design effect, and sample size. Out of the 533 surveys showing departure from a normal distribution, 294 (55.2%) showed high digit preference, 164 (30.8%) had a large design effect, and 204 (38.3%) a large sample size. Spline and LOESS smoothing techniques were explored and both worked well. After Spline smoothing, 56.7% of the MUAC distributions showing departure from normality were "normalised", and 59.7% after LOESS. Box-Cox power transformation had similar results on distributions showing departure from normality, with 57% of distributions approximating "normal" after transformation. Applying Box-Cox transformation after Spline or LOESS smoothing increased that proportion to 82.4% and 82.7%, respectively. This suggests that statistical approaches relying on the
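The normality testing and Box-Cox step described above can be sketched as follows; the simulated right-skewed sample is a stand-in for a real survey's MUAC data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical right-skewed MUAC sample (mm); a stand-in for one survey dataset.
muac = rng.lognormal(mean=np.log(140.0), sigma=0.3, size=500)

# 1. Shapiro-Wilk test for departure from a normal distribution.
w, p = stats.shapiro(muac)

# 2. Box-Cox power transformation, then re-test the transformed data.
transformed, lam = stats.boxcox(muac)
w2, p2 = stats.shapiro(transformed)
```

For lognormal-like data the estimated Box-Cox exponent sits near zero (a log transform), and the transformed sample moves much closer to normality.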

  15. The analysis of bottom forming process for hybrid heating device

    Science.gov (United States)

    Bałon, Paweł; Świątoniowski, Andrzej; Kiełbasa, Bartłomiej

    2017-10-01

    In this paper the authors present an unusual method for bottom forming applicable to various industrial purposes, including the manufacture of water heaters and pressure equipment. This method allows the bottom of a given piece of stainless steel to be fabricated into a pre-determined shape conforming to the DIN standard, which specifies the most advantageous dimensions of the bottom cross section in terms of working pressure loading. The authors validated the method numerically and experimentally, producing a tool designed to form bottoms of the specified geometry. Many problems are encountered during the design and production of such parts, especially excessive sheet wrinkling over a large area of the part. The experiment showed that designing such elements without experience and numerical analysis would result in the production of highly wrinkled parts. This defect would render the parts impossible to assemble with the cylindrical part. Many tool shops employ a method for drawing elements with a spherical surface which involves additional spinning, stamping, and grading operations, which greatly increases the cost of parts production. The authors present and compare two forming methods for spherical and parabolic objects, and experimentally confirm the validity of the sheet-reversing method with adequate pressure force. The applied method produces parts in one drawing operation, with a following operation based on laser or water cutting to obtain a round blank. This reduces tooling costs by requiring just one tool, which can be placed on any hydraulic press with a minimum force of 2,000 kN.

  16. Analysis of visual appearance of retinal nerve fibers in high resolution fundus images: a study on normal subjects.

    Science.gov (United States)

    Kolar, Radim; Tornow, Ralf P; Laemmer, Robert; Odstrcilik, Jan; Mayer, Markus A; Gazarek, Jiri; Jan, Jiri; Kubena, Tomas; Cernosek, Pavel

    2013-01-01

    The retinal ganglion axons are an important part of the visual system and can be directly observed with a fundus camera. The layer they form together inside the retina is the retinal nerve fiber layer (RNFL). This paper describes results of a texture RNFL analysis in color fundus photographs and compares these results with quantitative measurement of RNFL thickness obtained from optical coherence tomography on normal subjects. It is shown that the local mean value, standard deviation, and Shannon entropy extracted from the green and blue channels of fundus images are correlated with the corresponding RNFL thickness. The linear correlation coefficients achieved values of 0.694, 0.547, and 0.512 for the respective features, measured on 439 retinal positions in the peripapillary area from 23 eyes of 15 different normal subjects.
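The three local texture features named above (mean, standard deviation, Shannon entropy) can be sketched as a windowed computation on a channel array; the window size and histogram binning here are illustrative assumptions, not the paper's exact settings:

```python
import numpy as np

def local_features(channel, window=15):
    # Texture features in a square window around one pixel: local mean,
    # standard deviation and Shannon entropy of the intensity histogram.
    # `channel` is a 2-D array (e.g. the green or blue fundus channel).
    h = window // 2

    def at(row, col):
        patch = channel[row - h:row + h + 1, col - h:col + h + 1]
        hist, _ = np.histogram(patch, bins=256, range=(0, 256))
        p = hist / hist.sum()
        p = p[p > 0]
        entropy = -np.sum(p * np.log2(p))
        return patch.mean(), patch.std(), entropy

    return at
```

Each feature map is then correlated position-by-position against the OCT-derived RNFL thickness, which is how the reported coefficients arise.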

  17. Analysis of Visual Appearance of Retinal Nerve Fibers in High Resolution Fundus Images: A Study on Normal Subjects

    Directory of Open Access Journals (Sweden)

    Radim Kolar

    2013-01-01

    The retinal ganglion axons are an important part of the visual system and can be directly observed with a fundus camera. The layer they form together inside the retina is the retinal nerve fiber layer (RNFL). This paper describes results of a texture RNFL analysis in color fundus photographs and compares these results with quantitative measurement of RNFL thickness obtained from optical coherence tomography on normal subjects. It is shown that the local mean value, standard deviation, and Shannon entropy extracted from the green and blue channels of fundus images are correlated with the corresponding RNFL thickness. The linear correlation coefficients achieved values of 0.694, 0.547, and 0.512 for the respective features, measured on 439 retinal positions in the peripapillary area from 23 eyes of 15 different normal subjects.

  18. Immunohistochemical analysis of Sonic hedgehog signalling in normal human urinary tract development

    OpenAIRE

    Jenkins, Dagan; Winyard, Paul J D; Woolf, Adrian S.

    2007-01-01

    Studies of mouse mutants have demonstrated that Sonic hedgehog (SHH) signalling has a functional role in morphogenesis and differentiation at multiple sites within the forming urinary tract, and urinary tract malformations have been reported in humans with mutations that disrupt SHH signalling. However, there is only strikingly sparse and fragmentary information about the expression of SHH and associated signalling genes in normal human urinary tract development. We used immunohistochemistry ...

  19. A viscometric approach of pH effect on hydrodynamic properties of human serum albumin in the normal form.

    Science.gov (United States)

    Monkos, Karol

    2013-03-01

    The paper presents the results of viscosity determinations on aqueous solutions of human serum albumin (HSA) at the isoelectric point over a wide range of concentrations and at temperatures ranging from 5°C to 45°C. On the basis of a modified Arrhenius equation and Mooney's formula, some hydrodynamic parameters were obtained. They are compared with those previously obtained for HSA in solutions at neutral pH. The activation energy and entropy of viscous flow and the intrinsic viscosity reach a maximum value, and the effective specific volume, the self-crowding factor and the Huggins coefficient a minimum value, in solutions at the isoelectric point. Using the dimensionless parameter [η]c, the existence of three ranges of concentrations - dilute, semi-dilute and concentrated - was shown. By applying Lefebvre's relation for the relative viscosity in the semi-dilute regime, the Mark-Houwink-Kuhn-Sakurada (MHKS) exponent was established. The analysis of the results obtained from the three ranges of concentrations showed that both the conformation and the stiffness of HSA molecules in solutions at the isoelectric point and at neutral pH are the same.
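Mooney's formula mentioned above relates the relative viscosity to concentration through the intrinsic viscosity and the self-crowding factor. A sketch of the commonly cited form, with illustrative parameter values rather than the paper's fitted ones:

```python
import math

def mooney_relative_viscosity(c, intrinsic_viscosity, self_crowding):
    # Mooney's formula: eta_rel = exp([eta] c / (1 - k [eta] c)),
    # with [eta] the intrinsic viscosity and k the self-crowding factor.
    x = intrinsic_viscosity * c
    return math.exp(x / (1.0 - self_crowding * x))
```

At c = 0 the relative viscosity is exactly 1, and it grows steeply as the dimensionless product [η]c approaches the crowding limit 1/k.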

  20. Comparison of normalization methods for the analysis of metagenomic gene abundance data.

    Science.gov (United States)

    Pereira, Mariana Buongermino; Wallroth, Mikael; Jonsson, Viktor; Kristiansson, Erik

    2018-04-20

    In shotgun metagenomics, microbial communities are studied through direct sequencing of DNA without any prior cultivation. By comparing gene abundances estimated from the generated sequencing reads, functional differences between the communities can be identified. However, gene abundance data are affected by high levels of systematic variability, which can greatly reduce statistical power and introduce false positives. Normalization, the process whereby systematic variability is identified and removed, is therefore a vital part of the data analysis. A wide range of normalization methods for high-dimensional count data has been proposed, but their performance in the analysis of shotgun metagenomic data has not been evaluated. Here, we present a systematic evaluation of nine normalization methods for gene abundance data. The methods were evaluated through resampling of three comprehensive datasets, creating a realistic setting that preserved the unique characteristics of metagenomic data. Performance was measured in terms of the methods' ability to identify differentially abundant genes (DAGs), correctly calculate unbiased p-values, and control the false discovery rate (FDR). Our results showed that the choice of normalization method has a large impact on the end results. When the DAGs were asymmetrically present between the experimental conditions, many normalization methods had a reduced true positive rate (TPR) and a high false positive rate (FPR). The methods trimmed mean of M-values (TMM) and relative log expression (RLE) had the overall highest performance and are therefore recommended for the analysis of gene abundance data. For larger sample sizes, CSS also showed satisfactory performance. This study emphasizes the importance of selecting a suitable normalization method in the analysis of data from shotgun metagenomics. Our results also demonstrate that improper methods may result in unacceptably high levels of false positives, which in turn may lead
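The RLE method recommended above can be sketched as DESeq-style median-of-ratios size factors. A minimal illustration, not the implementation the study evaluated:

```python
import numpy as np

def rle_size_factors(counts):
    # Relative log expression (RLE, DESeq-style) size factors:
    # each sample's median ratio to the per-gene geometric mean,
    # computed over genes observed in every sample.
    # `counts` is a genes x samples matrix of raw counts.
    counts = np.asarray(counts, dtype=float)
    keep = (counts > 0).all(axis=1)
    logc = np.log(counts[keep])
    log_geomean = logc.mean(axis=1, keepdims=True)
    return np.exp(np.median(logc - log_geomean, axis=0))
```

Dividing each sample's counts by its size factor removes depth differences while remaining robust to a minority of differentially abundant genes.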

  1. Prevalence, Risk Factors, and Outcome of Myocardial Infarction with Angiographically Normal and Near-Normal Coronary Arteries: A Systematic Review and Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Samad Ghaffari

    2016-12-01

    Context: Coronary artery diseases are mostly detected using angiographic methods that demonstrate the status of the arteries. Nevertheless, myocardial infarction (MI) may occur in the presence of angiographically normal coronary arteries. Therefore, this study aimed to investigate the prevalence of MI with normal angiography and its possible etiologies in a systematic review. Evidence Acquisition: In this meta-analysis, the required data were collected from PubMed, Science Direct, Google Scholar, Scopus, Magiran, Scientific Information Database, and Medlib databases using the following keywords: "coronary angiography", "normal coronary arteries", "near-normal coronary arteries", "heart diseases", "coronary artery disease", "coronary disease", "cardiac troponin I", "myocardial infarction", "risk factor", "prevalence", "outcome", and their Persian equivalents. Comprehensive Meta-Analysis software, version 2, using a randomized model, was then employed to determine the prevalence of each complication and perform the meta-analysis. P values less than 0.05 were considered statistically significant. Results: In total, 20 studies including 139,957 patients were entered into the analysis. The patients' mean age was 47.62 ± 6.63 years, and 64.4% of the patients were male. The prevalence of MI with normal or near-normal coronary arteries was 3.5% (95% CI: 2.2%-5.7%). Additionally, smoking and a family history of cardiovascular diseases were the most important risk factors. The results showed no significant difference between MIs with normal angiography and those with 1- or 2-vessel involvement regarding the frequency of major adverse cardiac events (5.4% vs. 7.3%, P = 0.32). However, a significant difference was found between the patients with normal angiography and those with 3-vessel involvement in this regard (5.4% vs. 20.2%, P < 0.001). Conclusions: Although angiographic studies are required to assess the underlying

  2. Crater ejecta scaling laws: fundamental forms based on dimensional analysis

    International Nuclear Information System (INIS)

    Housen, K.R.; Schmidt, R.M.; Holsapple, K.A.

    1983-01-01

A model of crater ejecta is constructed using dimensional analysis and a recently developed theory of energy and momentum coupling in cratering events. General relations are derived that provide a rationale for scaling laboratory measurements of ejecta to larger events. Specific expressions are presented for ejection velocities and ejecta blanket profiles in two limiting regimes of crater formation: the so-called gravity and strength regimes. In the gravity regime, ejecta velocities at geometrically similar launch points within craters vary as the square root of the product of crater radius and gravity. This relation implies geometric similarity of ejecta blankets. That is, the thickness of an ejecta blanket as a function of distance from the crater center is the same for all sizes of craters if the thickness and range are expressed in terms of crater radii. In the strength regime, ejecta velocities are independent of crater size. Consequently, ejecta blankets are not geometrically similar in this regime. For points away from the crater rim the expressions for ejecta velocities and thickness take the form of power laws. The exponents in these power laws are functions of an exponent, α, that appears in crater radius scaling relations. Thus experimental studies of the dependence of crater radius on impact conditions determine scaling relations for ejecta. Predicted ejection velocities and ejecta-blanket profiles, based on measured values of α, are compared to existing measurements of velocities and debris profiles.
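
The gravity-regime relation quoted above (ejection velocity at geometrically similar launch points varies as the square root of crater radius times gravity) can be illustrated in a few lines; the prefactor `k` is a hypothetical dimensionless constant fixed by the launch position and material, not a value from the paper.

```python
import math

def ejecta_velocity(radius, gravity, k=1.0):
    """Gravity-regime scaling sketch: v proportional to sqrt(g * R).
    k is a hypothetical dimensionless constant for a given launch point."""
    return k * math.sqrt(gravity * radius)

# Geometric similarity: quadrupling the crater radius at fixed gravity
# doubles the ejection velocity at the same (scaled) launch point.
v1 = ejecta_velocity(radius=100.0, gravity=9.81)
v2 = ejecta_velocity(radius=400.0, gravity=9.81)
print(v2 / v1)
```

In the strength regime, by contrast, the velocity would carry no dependence on `radius` at all.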

  3. Genetic Analysis of Somatic Cell Score in Danish Holsteins Using a Liability-Normal Mixture Model

    DEFF Research Database (Denmark)

    Madsen, P; Shariati, M M; Ødegård, J

    2008-01-01

    Mixture models are appealing for identifying hidden structures affecting somatic cell score (SCS) data, such as unrecorded cases of subclinical mastitis. Thus, liability-normal mixture (LNM) models were used for genetic analysis of SCS data, with the aim of predicting breeding values for such cas...

  4. Normalization of RNA-seq data using factor analysis of control genes or samples

    Science.gov (United States)

    Risso, Davide; Ngai, John; Speed, Terence P.; Dudoit, Sandrine

    2015-01-01

    Normalization of RNA-seq data has proven essential to ensure accurate inference of expression levels. Here we show that usual normalization approaches mostly account for sequencing depth and fail to correct for library preparation and other more-complex unwanted effects. We evaluate the performance of the External RNA Control Consortium (ERCC) spike-in controls and investigate the possibility of using them directly for normalization. We show that the spike-ins are not reliable enough to be used in standard global-scaling or regression-based normalization procedures. We propose a normalization strategy, remove unwanted variation (RUV), that adjusts for nuisance technical effects by performing factor analysis on suitable sets of control genes (e.g., ERCC spike-ins) or samples (e.g., replicate libraries). Our approach leads to more-accurate estimates of expression fold-changes and tests of differential expression compared to state-of-the-art normalization methods. In particular, RUV promises to be valuable for large collaborative projects involving multiple labs, technicians, and/or platforms. PMID:25150836
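
The core idea of RUV, factor analysis on control genes whose expression is assumed constant, can be sketched as follows. This is a simplified stand-in for the published RUV method, not its implementation: the data, the single-factor choice `k=1`, and the plain SVD-plus-regression scheme are illustrative assumptions.

```python
import numpy as np

def ruv_like_adjust(log_expr, control_idx, k=1):
    """RUV-style adjustment (sketch): estimate k unwanted factors from the
    control genes by SVD, then regress them out of every gene.
    log_expr: samples x genes matrix of log-expression values."""
    controls = log_expr[:, control_idx]
    centered = controls - controls.mean(axis=0)
    # Left singular vectors span the unwanted variation across samples
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    w = u[:, :k]  # samples x k nuisance-factor matrix
    # Fit each gene on W and subtract the fitted nuisance component
    alpha, *_ = np.linalg.lstsq(w, log_expr - log_expr.mean(axis=0), rcond=None)
    return log_expr - w @ alpha

rng = np.random.default_rng(0)
batch = np.repeat([0.0, 1.0], 4)[:, None]           # hypothetical batch effect
expr = rng.normal(5.0, 0.1, size=(8, 20)) + batch   # 8 samples, 20 genes
adjusted = ruv_like_adjust(expr, control_idx=list(range(10)), k=1)
# Batch-driven separation between the sample groups shrinks after adjustment
print(abs(expr[:4].mean() - expr[4:].mean()),
      abs(adjusted[:4].mean() - adjusted[4:].mean()))
```

In the real method the controls would be ERCC spike-ins or empirically stable genes, and `k` is chosen by diagnostics rather than fixed in advance.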

  5. Vibrational spectra and normal co-ordinate analysis of 2-aminopyridine and 2-amino picoline.

    Science.gov (United States)

    Jose, Sujin P; Mohan, S

    2006-05-01

The Fourier transform infrared (FT-IR) and Raman (FT-R) spectra of 2-aminopyridine and 2-amino picoline were recorded and the observed frequencies were assigned to various modes of vibration in terms of fundamentals by assuming Cs point group symmetry. A normal co-ordinate analysis was also carried out for the proper assignment of the vibrational frequencies using a simple valence force field. A complete vibrational analysis is presented here for the molecules and the results are briefly discussed.

  6. Recent progresses in ion beam analysis of aerosol at tandetron laboratory of Beijing Normal University

    International Nuclear Information System (INIS)

    Wang Guangfu; Lu Yongfang; Zhu Guanghua

    2007-01-01

PIXE analysis of aerosol samples for measuring concentrations of elements with Z > 12 is one of the major applications at the GIC4117 Tandetron in Beijing Normal University. In order to measure H, C, N and O concentrations in aerosol samples, proton non-Rutherford backscattering spectrometry (PNBS) and proton elastic scattering analysis (PESA) were employed with two Au(Si) surface barrier detectors at angles of 160 degrees and 40 degrees in the PIXE chamber. (authors)

  7. The pathophysiology of the aqueduct stroke volume in normal pressure hydrocephalus: can co-morbidity with other forms of dementia be excluded?

    Energy Technology Data Exchange (ETDEWEB)

    Bateman, Grant A. [John Hunter Hospital, Department of Medical Imaging, Newcastle (Australia); Levi, Christopher R.; Wang, Yang; Lovett, Elizabeth C. [Hunter Medical Research Institute, Clinical Neurosciences Program, Newcastle (Australia); Schofield, Peter [James Fletcher Hospital, Neuropsychiatry Unit, Newcastle (Australia)

    2005-10-01

Variable results are obtained from the treatment of normal pressure hydrocephalus (NPH) by shunt insertion. There is a high correlation between NPH and the pathology of Alzheimer's disease (AD) on brain biopsy. There is an overlap between AD and vascular dementia (VaD), suggesting that a correlation exists between NPH and other forms of dementia. This study seeks to (1) understand the physiological factors behind, and (2) define the ability of, the aqueduct stroke volume to exclude dementia co-morbidity. Twenty-four patients from a dementia clinic were classified as having either early AD or VaD on the basis of clinical features, Hachinski score and neuropsychological testing. They were compared with 16 subjects with classical clinical findings of NPH and 12 age-matched non-cognitively impaired subjects. MRI flow quantification was used to measure aqueduct stroke volume and arterial pulse volume. An arterio-cerebral compliance ratio was calculated from the two volumes in each patient. The aqueduct stroke volume was elevated in all three forms of dementia, with no significant difference noted between the groups. The arterial pulse volume was elevated by 24% in VaD and reduced by 35% in NPH, compared to normal (P=0.05 and P=0.002, respectively), and was normal in AD. There was a spectrum of relative compliance with normal compliance in VaD and reduced compliance in AD and NPH. The aqueduct stroke volume depends on the arterial pulse volume and the relative compliance between the arterial tree and brain. The aqueduct stroke volume cannot exclude significant co-morbidity in NPH. (orig.)

  8. The pathophysiology of the aqueduct stroke volume in normal pressure hydrocephalus: can co-morbidity with other forms of dementia be excluded?

    International Nuclear Information System (INIS)

    Bateman, Grant A.; Levi, Christopher R.; Wang, Yang; Lovett, Elizabeth C.; Schofield, Peter

    2005-01-01

Variable results are obtained from the treatment of normal pressure hydrocephalus (NPH) by shunt insertion. There is a high correlation between NPH and the pathology of Alzheimer's disease (AD) on brain biopsy. There is an overlap between AD and vascular dementia (VaD), suggesting that a correlation exists between NPH and other forms of dementia. This study seeks to (1) understand the physiological factors behind, and (2) define the ability of, the aqueduct stroke volume to exclude dementia co-morbidity. Twenty-four patients from a dementia clinic were classified as having either early AD or VaD on the basis of clinical features, Hachinski score and neuropsychological testing. They were compared with 16 subjects with classical clinical findings of NPH and 12 age-matched non-cognitively impaired subjects. MRI flow quantification was used to measure aqueduct stroke volume and arterial pulse volume. An arterio-cerebral compliance ratio was calculated from the two volumes in each patient. The aqueduct stroke volume was elevated in all three forms of dementia, with no significant difference noted between the groups. The arterial pulse volume was elevated by 24% in VaD and reduced by 35% in NPH, compared to normal (P=0.05 and P=0.002, respectively), and was normal in AD. There was a spectrum of relative compliance with normal compliance in VaD and reduced compliance in AD and NPH. The aqueduct stroke volume depends on the arterial pulse volume and the relative compliance between the arterial tree and brain. The aqueduct stroke volume cannot exclude significant co-morbidity in NPH. (orig.)

  9. Modal analysis of inter-area oscillations using the theory of normal modes

    Energy Technology Data Exchange (ETDEWEB)

    Betancourt, R.J. [School of Electromechanical Engineering, University of Colima, Manzanillo, Col. 28860 (Mexico); Barocio, E. [CUCEI, University of Guadalajara, Guadalajara, Jal. 44480 (Mexico); Messina, A.R. [Graduate Program in Electrical Engineering, Cinvestav, Guadalajara, Jal. 45015 (Mexico); Martinez, I. [State Autonomous University of Mexico, Toluca, Edo. Mex. 50110 (Mexico)

    2009-04-15

    Based on the notion of normal modes in mechanical systems, a method is proposed for the analysis and characterization of oscillatory processes in power systems. The method is based on the property of invariance of modal subspaces and can be used to represent complex power system modal behavior by a set of decoupled, two-degree-of-freedom nonlinear oscillator equations. Using techniques from nonlinear mechanics, a new approach is outlined, for determining the normal modes (NMs) of motion of a general n-degree-of-freedom nonlinear system. Equations relating the normal modes and the physical velocities and displacements are developed from the linearized system model and numerical issues associated with the application of the technique are discussed. In addition to qualitative insight, this method can be utilized in the study of nonlinear behavior and bifurcation analyses. The application of these procedures is illustrated on a planning model of the Mexican interconnected system using a quadratic nonlinear model. Specifically, the use of normal mode analysis as a basis for identifying modal parameters, including natural frequencies and damping ratios of general, linear systems with n degrees of freedom is discussed. Comparisons to conventional linear analysis techniques demonstrate the ability of the proposed technique to extract the different oscillation modes embedded in the oscillation. (author)

  10. The Impact of Normalization Methods on RNA-Seq Data Analysis

    Science.gov (United States)

    Zyprych-Walczak, J.; Szabelska, A.; Handschuh, L.; Górczak, K.; Klamecka, K.; Figlerowicz, M.; Siatkowski, I.

    2015-01-01

High-throughput sequencing technologies, such as the Illumina HiSeq, are powerful new tools for investigating a wide range of biological and medical problems. Massive and complex data sets produced by the sequencers create a need for the development of statistical and computational methods that can tackle the analysis and management of the data. Data normalization is one of the most crucial steps of data processing, and this process must be carefully considered, as it has a profound effect on the results of the analysis. In this work, we focus on a comprehensive comparison of five normalization methods related to sequencing depth, widely used for transcriptome sequencing (RNA-seq) data, and their impact on the results of gene expression analysis. Based on this study, we suggest a universal workflow that can be applied for the selection of the optimal normalization procedure for any particular data set. The described workflow includes calculation of the bias and variance values for the control genes, sensitivity and specificity of the methods, and classification errors as well as generation of the diagnostic plots. Combining the above information facilitates the selection of the most appropriate normalization method for the studied data sets and determines which methods can be used interchangeably. PMID:26176014
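
One representative sequencing-depth normalization of the kind compared in such studies is the median-of-ratios size factor (the DESeq approach); the sketch below is illustrative, with a toy count matrix rather than the data sets evaluated in the record.

```python
import numpy as np

def median_of_ratios(counts):
    """DESeq-style size factors (sketch): each sample's factor is the median
    ratio of its counts to the per-gene geometric mean across samples.
    counts: genes x samples matrix of (positive) raw read counts."""
    log_counts = np.log(counts)
    log_geo_mean = log_counts.mean(axis=1, keepdims=True)  # per-gene reference
    ratios = log_counts - log_geo_mean
    return np.exp(np.median(ratios, axis=0))  # one factor per sample

# Toy example: sample 2 was sequenced twice as deeply as sample 1
counts = np.array([[10.0, 20.0],
                   [100.0, 200.0],
                   [30.0, 60.0]])
factors = median_of_ratios(counts)
normalized = counts / factors
print(factors)
```

Dividing each column by its size factor removes the depth difference, which is exactly the component these five methods target; library-preparation effects would remain.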

  11. Analysis of a Dynamic Viscoelastic Contact Problem with Normal Compliance, Normal Damped Response, and Nonmonotone Slip Rate Dependent Friction

    Directory of Open Access Journals (Sweden)

    Mikaël Barboteu

    2016-01-01

Full Text Available We consider a mathematical model which describes the dynamic evolution of a viscoelastic body in frictional contact with an obstacle. The contact is modelled with a combination of a normal compliance and a normal damped response law associated with a slip rate-dependent version of Coulomb’s law of dry friction. We derive a variational formulation, and an existence and uniqueness result for the weak solution of the problem is presented. Next, we introduce a fully discrete approximation of the variational problem based on a finite element method and on an implicit time integration scheme. We study this fully discrete approximation scheme and bound the errors of the approximate solutions. Under regularity assumptions imposed on the exact solution, optimal order error estimates are derived for the fully discrete solution. Finally, after recalling the solution of the frictional contact problem, some numerical simulations are provided in order to illustrate both the behavior of the solution related to the frictional contact conditions and the theoretical error estimate result.

  12. The Lecture as a Transmedial Pedagogical Form: A Historical Analysis

    Science.gov (United States)

    Friesen, Norm

    2011-01-01

    The lecture has been much maligned as a pedagogical form, yet it persists and even flourishes today in the form of the podcast, the TED talk, and the "smart" lecture hall. This article examines the lecture as a pedagogical genre, as "a site where differences between media are negotiated" (Franzel) as these media coevolve. This examination shows…

  13. Impact of PET/CT image reconstruction methods and liver uptake normalization strategies on quantitative image analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kuhnert, Georg; Sterzer, Sergej; Kahraman, Deniz; Dietlein, Markus; Drzezga, Alexander; Kobe, Carsten [University Hospital of Cologne, Department of Nuclear Medicine, Cologne (Germany); Boellaard, Ronald [VU University Medical Centre, Department of Radiology and Nuclear Medicine, Amsterdam (Netherlands); Scheffler, Matthias; Wolf, Juergen [University Hospital of Cologne, Lung Cancer Group Cologne, Department I of Internal Medicine, Center for Integrated Oncology Cologne Bonn, Cologne (Germany)

    2016-02-15

In oncological imaging using PET/CT, the standardized uptake value has become the most common parameter used to measure tracer accumulation. The aim of this analysis was to evaluate ultra high definition (UHD) and ordered subset expectation maximization (OSEM) PET/CT reconstructions for their potential impact on quantification. We analyzed 40 PET/CT scans of lung cancer patients who had undergone PET/CT. Standardized uptake values corrected for body weight (SUV) and lean body mass (SUL) were determined in the single hottest lesion in the lung and normalized to the liver for UHD and OSEM reconstruction. Quantitative uptake values and their normalized ratios for the two reconstruction settings were compared using the Wilcoxon test. The distribution of quantitative uptake values and their ratios in relation to the reconstruction method used was demonstrated in the form of frequency distribution curves, box plots and scatter plots. The agreement between OSEM and UHD reconstructions was assessed through Bland-Altman analysis. A significant difference between OSEM and UHD reconstruction was observed for all SUV and SUL data tested (p < 0.0005 in all cases). The mean values of the ratios after OSEM and UHD reconstruction showed equally significant differences (p < 0.0005 in all cases). Bland-Altman analysis showed that the SUV and SUL values and their normalized ratios were, on average, up to 60 % higher after UHD reconstruction as compared to OSEM reconstruction. OSEM and UHD reconstruction produced a significant difference for SUV and SUL, which remained constantly high after normalization to the liver, indicating that standardization of reconstruction and the use of comparable SUV measurements are crucial when using PET/CT. (orig.)

  14. Datum Feature Extraction and Deformation Analysis Method Based on Normal Vector of Point Cloud

    Science.gov (United States)

    Sun, W.; Wang, J.; Jin, F.; Liang, Z.; Yang, Y.

    2018-04-01

To address the lack of an applicable analysis method for applying three-dimensional laser scanning technology to deformation monitoring, an efficient method for extracting datum features and analysing deformation based on the normal vectors of a point cloud is proposed. Firstly, a kd-tree is used to establish the topological relation. Datum points are detected by tracking the point cloud normal vector, which is determined from the normal vector of the local plane. Then, cubic B-spline curve fitting is performed on the datum points. Finally, the datum elevation and the inclination angle of the radial point are calculated according to the fitted curve, and the deformation information is then analyzed. The proposed approach was verified on a real large-scale tank data set captured with a terrestrial laser scanner in a chemical plant. The results show that the method can obtain the entire information of the monitored object quickly and comprehensively, and accurately reflect the deformation of the datum feature.
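
The local-plane normal that drives the datum-point detection is conventionally estimated by principal component analysis of a point's neighborhood: the normal is the eigenvector of the neighborhood covariance with the smallest eigenvalue. A minimal sketch (synthetic points standing in for a scanner neighborhood):

```python
import numpy as np

def plane_normal(points):
    """Estimate the unit normal of a local neighborhood (sketch): the normal
    of the best-fit plane is the eigenvector of the covariance matrix
    associated with the smallest eigenvalue."""
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    return eigvecs[:, 0]                    # smallest-eigenvalue direction

# Noisy samples from the plane z = 0 should yield a normal close to (0, 0, 1)
rng = np.random.default_rng(1)
pts = np.column_stack([rng.uniform(-1, 1, 50),
                       rng.uniform(-1, 1, 50),
                       rng.normal(0.0, 1e-3, 50)])
n = plane_normal(pts)
print(n)
```

In the full pipeline, the kd-tree supplies each point's nearest neighbors, and these per-point normals are then tracked to isolate the datum feature.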

  15. Data analysis of x-ray fluorescence holography by subtracting normal component from inverse hologram

    International Nuclear Information System (INIS)

    Happo, Naohisa; Hayashi, Kouichi; Hosokawa, Shinya

    2010-01-01

X-ray fluorescence holography (XFH) is a powerful technique for determining three-dimensional local atomic arrangements around a specific fluorescing element. However, the raw experimental hologram is predominantly a mixed hologram, i.e., a mixture of holograms generated in both normal and inverse modes, which produces unreliable atomic images. In this paper, we propose a practical method for subtracting the normal component from inverse XFH data by means of a Fourier transform, applied to the calculated hologram of a model ZnTe cluster. Many spots originating from the normal components could be properly removed using a mask function, and clear atomic images were reconstructed at the appropriate positions of the model cluster. This method was successfully applied to the analysis of experimental ZnTe single crystal XFH data. (author)

  16. Precision analysis in billet preparation for micro bulk metal forming

    DEFF Research Database (Denmark)

    Mahshid, Rasoul; Hansen, Hans N.

    2015-01-01

The purpose of this research is to fabricate billets for an automated transfer press for micro forming. High performance transfer presses are well-known in conventional metal forming and distinguished by their automation and mass production. The press used in this research is a vertical mechanical...... press. When using a vertical mechanical press, the material is fed as billets into the forming zone. Therefore, a large number of highly uniform billets are required to run mass production in such a setup. A shearing technique was used for manufacturing the billets. The efficiency of the shearing tool......

  17. Elemental analysis of the frontal lobe of 'normal' brain tissue and that affected by Alzheimer's disease

    International Nuclear Information System (INIS)

    Stedman, J.D.; Spyrou, N.M.

    1997-01-01

'Normal' brain tissue and brain tissue affected by Alzheimer's disease were taken from the frontal lobe of both hemispheres and their elemental compositions in terms of major, minor and trace elements compared. Brain samples were obtained from the MRC Alzheimer's Disease Brain Bank, London. 25 samples were taken from 18 individuals (5 males and 13 females) of mean age 79.9 ± 7.3 years with pathologically confirmed Alzheimer's disease and 26 samples from 15 individuals (8 males and 7 females) of mean age 71.8 ± 13.0 years with no pathological signs of Alzheimer's disease ('normals'). The elemental concentrations of the samples were determined by the techniques of Rutherford backscattering (RBS) analysis, particle induced X-ray emission (PIXE) analysis and instrumental neutron activation analysis (INAA). Na, Mg, Al, Cl, K, Sc, Fe, Zn, Se, Br, Rb and Cs were detected by INAA, and significant differences between concentrations in normal and Alzheimer tissue were found for the elements Na, Cl, K, Se, Br and Rb. P, S, Cl, K, Ca, Fe, Zn and Cd were detected by PIXE analysis, and significant differences were found for the elements P, S, Cl, K and Ca. (author)

  18. Comparative study of various normal mode analysis techniques based on partial Hessians.

    Science.gov (United States)

    Ghysels, An; Van Speybroeck, Veronique; Pauwels, Ewald; Catak, Saron; Brooks, Bernard R; Van Neck, Dimitri; Waroquier, Michel

    2010-04-15

    Standard normal mode analysis becomes problematic for complex molecular systems, as a result of both the high computational cost and the excessive amount of information when the full Hessian matrix is used. Several partial Hessian methods have been proposed in the literature, yielding approximate normal modes. These methods aim at reducing the computational load and/or calculating only the relevant normal modes of interest in a specific application. Each method has its own (dis)advantages and application field but guidelines for the most suitable choice are lacking. We have investigated several partial Hessian methods, including the Partial Hessian Vibrational Analysis (PHVA), the Mobile Block Hessian (MBH), and the Vibrational Subsystem Analysis (VSA). In this article, we focus on the benefits and drawbacks of these methods, in terms of the reproduction of localized modes, collective modes, and the performance in partially optimized structures. We find that the PHVA is suitable for describing localized modes, that the MBH not only reproduces localized and global modes but also serves as an analysis tool of the spectrum, and that the VSA is mostly useful for the reproduction of the low frequency spectrum. These guidelines are illustrated with the reproduction of the localized amine-stretch, the spectrum of quinine and a bis-cinchona derivative, and the low frequency modes of the LAO binding protein. 2009 Wiley Periodicals, Inc.
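
The standard full-Hessian analysis that PHVA, MBH, and VSA approximate reduces to an eigenvalue problem on the mass-weighted Hessian, whose eigenvalues are squared angular frequencies. A minimal 1-D diatomic sketch (toy spring constant and masses, not data from the article):

```python
import numpy as np

def normal_modes(hessian, masses):
    """Full-Hessian normal mode analysis (sketch): eigenvalues of the
    mass-weighted Hessian M^{-1/2} H M^{-1/2} are squared angular
    frequencies. Partial-Hessian schemes approximate H by blocking or
    reducing it before this step."""
    # Repeat each mass once per Cartesian coordinate (1 here, 3 in 3-D)
    m = np.repeat(masses, hessian.shape[0] // len(masses))
    inv_sqrt_m = 1.0 / np.sqrt(m)
    mw_hessian = hessian * np.outer(inv_sqrt_m, inv_sqrt_m)
    eigvals, eigvecs = np.linalg.eigh(mw_hessian)
    return eigvals, eigvecs

# 1-D diatomic with spring constant k: one translation (zero eigenvalue)
# and one stretch mode with omega^2 = k * (1/m1 + 1/m2)
k, m1, m2 = 4.0, 1.0, 2.0
hessian = k * np.array([[1.0, -1.0], [-1.0, 1.0]])
eigvals, _ = normal_modes(hessian, [m1, m2])
print(eigvals)
```

The cost the article targets comes from diagonalizing this matrix at full size (3N × 3N for N atoms); the partial-Hessian methods shrink or restructure it before the `eigh` step.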

  19. Analysis of normalized-characteristic curves and determination of the granulometric state of dissolved uranium dioxides

    International Nuclear Information System (INIS)

    Melichar, F.; Neumann, L.

    1977-01-01

    Methods are presented for the analysis of normalized-characteristic curves, which make it possible to determine the granulometric composition of a dissolved polydispersion - the cumulative mass distribution of particles - as a function of the relative particle size. If the size of the largest particle in the dissolved polydispersion is known, these methods allow the determination of the dependence of cumulative mass ratios of particles on their absolute sizes. In the inverse method of the geometrical model for determining the granulometric composition of a dissolved polydispersion, the polydispersion is represented by a finite number of monodispersions. An accurate analysis of normalized-characteristic equations leads to the Akselrud dissolution model. As against the other two methods, the latter allows the determination of the granulometric composition for an arbitrary number of particle sizes. The method of the granulometric atlas is a method for estimating the granulometric composition of a dissolved polydispersion and is based on comparison of a normalized-characteristic curve for an unknown granulometric composition with an atlas of normalized-characteristic curves for selected granulometric spectra of polydispersions. (author)

  20. Dispersive analysis of the pion transition form factor

    Science.gov (United States)

    Hoferichter, M.; Kubis, B.; Leupold, S.; Niecknig, F.; Schneider, S. P.

    2014-11-01

We analyze the pion transition form factor using dispersion theory. We calculate the singly-virtual form factor in the time-like region based on data for the e⁺e⁻ → 3π cross section, generalizing previous studies on ω, φ → 3π decays and γπ → ππ scattering, and verify our result by comparing to e⁺e⁻ → π⁰γ data. We perform the analytic continuation to the space-like region, predicting the poorly-constrained space-like transition form factor below 1 GeV, and extract the slope of the form factor at vanishing momentum transfer, aπ = (30.7 ± 0.6) × 10⁻³. We derive the dispersive formalism necessary for the extension of these results to the doubly-virtual case, as required for the pion-pole contribution to hadronic light-by-light scattering in the anomalous magnetic moment of the muon.

  1. Low-energy analysis of the nucleon electromagnetic form factors

    International Nuclear Information System (INIS)

Kubis, Bastian; Meissner, Ulf-G.

    2001-01-01

We analyze the electromagnetic form factors of the nucleon to fourth order in relativistic baryon chiral perturbation theory. We employ the recently proposed infrared regularization scheme and show that the convergence of the chiral expansion is improved as compared to the heavy-fermion approach. We also discuss the inclusion of vector mesons and obtain an accurate description of all four nucleon form factors for momentum transfer squared up to Q² ≅ 0.4 GeV².

  2. Super-delta: a new differential gene expression analysis procedure with robust data normalization.

    Science.gov (United States)

    Liu, Yuhang; Zhang, Jinfeng; Qiu, Xing

    2017-12-21

Normalization is an important data preparation step in gene expression analyses, designed to remove various systematic noise. Sample variance is greatly reduced after normalization, hence the power of subsequent statistical analyses is likely to increase. On the other hand, variance reduction is made possible by borrowing information across all genes, including differentially expressed genes (DEGs) and outliers, which will inevitably introduce some bias. This bias typically inflates type I error and can reduce statistical power in certain situations. In this study we propose a new differential expression analysis pipeline, dubbed super-delta, that consists of a multivariate extension of the global normalization and a modified t-test. A robust procedure is designed to minimize the bias introduced by DEGs in the normalization step. The modified t-test is derived based on asymptotic theory for hypothesis testing that suitably pairs with the proposed robust normalization. We first compared super-delta with four commonly used normalization methods: global, median-IQR, quantile, and cyclic loess normalization in simulation studies. Super-delta was shown to have better statistical power with tighter control of the type I error rate than its competitors. In many cases, the performance of super-delta is close to that of an oracle test in which datasets without technical noise were used. We then applied all methods to a collection of gene expression datasets on breast cancer patients who received neoadjuvant chemotherapy. While there is a substantial overlap of the DEGs identified by all of them, super-delta was able to identify comparatively more DEGs than its competitors. Downstream gene set enrichment analysis confirmed that all these methods selected largely consistent pathways. Detailed investigations of the relatively small differences showed that pathways identified by super-delta have better connections to breast cancer than those identified by other methods.
As a new pipeline, super

  3. Analysis of form deviation in non-isothermal glass molding

    Science.gov (United States)

    Kreilkamp, H.; Grunwald, T.; Dambon, O.; Klocke, F.

    2018-02-01

Especially in the market of sensors, LED lighting and medical technologies, there is a growing demand for precise yet low-cost glass optics. This demand poses a major challenge for glass manufacturers, who are confronted with the trend towards ever-higher levels of precision combined with immense pressure on market prices. Since current manufacturing technologies, especially grinding and polishing as well as Precision Glass Molding (PGM), are not able to achieve the desired production costs, glass manufacturers are looking for alternative technologies. Non-isothermal Glass Molding (NGM) has been shown to have great potential for low-cost mass manufacturing of complex glass optics. However, the biggest drawback of this technology at the moment is the limited accuracy of the manufactured glass optics. This research addresses the specific challenges of non-isothermal glass molding with respect to form deviation of molded glass optics. Based on empirical models, the influencing factors on form deviation, in particular form accuracy, waviness and surface roughness, will be discussed. A comparison with traditional isothermal glass molding processes (PGM) will point out the specific challenges of non-isothermal process conditions. Furthermore, the underlying physical principle leading to the formation of form deviations will be analyzed in detail with the help of numerical simulation. In this way, this research contributes to a better understanding of form deviations in non-isothermal glass molding and is an important step towards new applications demanding precise yet low-cost glass optics.

  4. EMG normalization method based on grade 3 of manual muscle testing: Within- and between-day reliability of normalization tasks and application to gait analysis.

    Science.gov (United States)

    Tabard-Fougère, Anne; Rose-Dulcina, Kevin; Pittet, Vincent; Dayer, Romain; Vuillerme, Nicolas; Armand, Stéphane

    2018-02-01

Electromyography (EMG) is an important parameter in Clinical Gait Analysis (CGA), and is generally interpreted with timing of activation. EMG amplitude comparisons between individuals, muscles or days need normalization. There is no consensus on existing methods. The gold standard, maximum voluntary isometric contraction (MVIC), is not adapted to pathological populations because patients are often unable to perform an MVIC. The normalization method inspired by the isometric grade 3 of manual muscle testing (isoMMT3), which is the ability of a muscle to maintain a position against gravity, could be an interesting alternative. The aim of this study was to evaluate the within- and between-day reliability of the isoMMT3 EMG normalizing method during gait compared with the conventional MVIC method. EMG of the lower limb muscles (gluteus medius, rectus femoris, tibialis anterior, semitendinosus) was recorded bilaterally in nine healthy participants (five males, aged 29.7 ± 6.2 years, BMI 22.7 ± 3.3 kg m⁻²), giving a total of 18 independent legs. Three repeated measurements of the isoMMT3 and MVIC exercises were performed with an EMG recording. The EMG amplitude of the muscles during gait was normalized by these two methods. This protocol was repeated one week later. Within- and between-day reliability of the normalization tasks was similar for the isoMMT3 and MVIC methods. Within- and between-day reliability of gait EMG normalized by isoMMT3 was higher than with MVIC normalization. These results indicate that EMG normalization using isoMMT3 is a reliable method with no special equipment needed and will support CGA interpretation. The next step will be to evaluate this method in pathological populations. Copyright © 2017 Elsevier B.V. All rights reserved.
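
Both normalization schemes compared in this record reduce to the same arithmetic: express the gait EMG envelope as a percentage of the amplitude recorded during a reference task (an MVIC trial, or an isoMMT3-style hold against gravity). A minimal sketch with synthetic envelopes; the signal values are hypothetical, not study data.

```python
import numpy as np

def normalize_emg(gait_envelope, reference_envelope):
    """Amplitude normalization (sketch): gait EMG as a percentage of the
    mean amplitude of a reference task (MVIC or an isoMMT3-style hold)."""
    reference_level = reference_envelope.mean()
    return 100.0 * gait_envelope / reference_level

rng = np.random.default_rng(2)
gait = np.abs(rng.normal(0.2, 0.05, 1000))       # hypothetical gait envelope, mV
reference = np.abs(rng.normal(0.5, 0.05, 1000))  # hypothetical reference hold, mV
normalized = normalize_emg(gait, reference)
print(normalized.mean())  # roughly 40 (% of the reference level)
```

The study's question is then about the denominator: how repeatable `reference_level` is within and between days for each choice of reference task.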

  5. Difference-theoretical Analysis of Aesthetic Media and Forms

    Directory of Open Access Journals (Sweden)

    Franz Kasper Krönig

    2018-04-01

Full Text Available The general medium/form-difference-theory as proposed by Fritz Heider (cf. Heider, 1959) has been seized on by sociological systems theory as an epistemological and heuristic basis of such generality that it can be applied to virtually all conceivable fields of research. One could arguably speak of a new paradigm that overcomes traditional differences such as subject/object and cause/effect. This approach has been applied to all types of art, and to various research questions in the fields of aesthetics and art theory. This paper proposes a differentiation and categorisation of aesthetic media and forms in order to lay the groundwork for art criticism on a third way between subjective appreciation and objective reasoning. Musical examples demonstrate the applicability of the medium/form-difference-theoretical approach for the aesthetics of music and music criticism.

  6. Dispersive analysis of the pion transition form factor

    Energy Technology Data Exchange (ETDEWEB)

    Hoferichter, M. [Technische Universitaet Darmstadt, Institut fuer Kernphysik, Darmstadt (Germany); GSI Helmholtzzentrum fuer Schwerionenforschung GmbH, ExtreMe Matter Institute EMMI, Darmstadt (Germany); University of Bern, Albert Einstein Center for Fundamental Physics, Institute for Theoretical Physics, Bern (Switzerland); Kubis, B.; Niecknig, F.; Schneider, S.P. [Universitaet Bonn, Helmholtz-Institut fuer Strahlen- und Kernphysik (Theorie) and Bethe Center for Theoretical Physics, Bonn (Germany); Leupold, S. [Uppsala Universitet, Institutionen foer fysik och astronomi, Box 516, Uppsala (Sweden)

    2014-11-15

We analyze the pion transition form factor using dispersion theory. We calculate the singly-virtual form factor in the time-like region based on data for the e{sup +}e{sup -} → 3π cross section, generalizing previous studies on ω, φ → 3π decays and γπ → ππ scattering, and verify our result by comparing to e{sup +}e{sup -} → π{sup 0}γ data. We perform the analytic continuation to the space-like region, predicting the poorly constrained space-like transition form factor below 1 GeV, and extract the slope of the form factor at vanishing momentum transfer a{sub π} = (30.7 ± 0.6) x 10{sup -3}. We derive the dispersive formalism necessary for the extension of these results to the doubly-virtual case, as required for the pion-pole contribution to hadronic light-by-light scattering in the anomalous magnetic moment of the muon. (orig.)

  7. Neutron activation analysis of baths forming conversion layer on aluminium

    International Nuclear Information System (INIS)

    Szilagyi, Istvan; Maleczki, Emil; Bodizs, Denes

    1988-01-01

Chromate layers were formed on the surface of aluminium using yellow and green chromating solutions. For the determination of the aluminium content, the neutron activation method was used. Nuclear effects disturbing the determination were eliminated by a double irradiation technique. (author) 8 refs.; 4 figs

  8. Restoring normal eating behaviour in adolescents with anorexia nervosa: A video analysis of nursing interventions.

    Science.gov (United States)

    Beukers, Laura; Berends, Tamara; de Man-van Ginkel, Janneke M; van Elburg, Annemarie A; van Meijel, Berno

    2015-12-01

An important part of inpatient treatment for adolescents with anorexia nervosa is to restore normal eating behaviour. Health-care professionals play a significant role in this process, but little is known about their interventions during patients' meals. The purpose of the present study was to describe nursing interventions aimed at restoring normal eating behaviour in patients with anorexia nervosa. The main research question was: 'Which interventions aimed at restoring normal eating behaviour do health-care professionals in a specialist eating disorder centre use during mealtimes for adolescents diagnosed with anorexia nervosa?' The present study was a qualitative, descriptive study that used video recordings made during mealtimes. Thematic data analysis was applied. Four categories of interventions emerged from the data: (i) monitoring and instructing; (ii) encouraging and motivating; (iii) supporting and understanding; and (iv) educating. The data revealed a directive attitude aimed at promoting behavioural change, but always in combination with empathy and understanding. In the first stage of clinical treatment, health-care professionals focus primarily on changing patients' eating behaviour. However, they also address the psychosocial needs that become visible in patients as they struggle to restore normal eating behaviour. The findings of the present study can be used to assist health-care professionals, and improve multidisciplinary guidelines and health-care professionals' training programmes. © 2015 Australian College of Mental Health Nurses Inc.

  9. A phylogenetic analysis of normal modes evolution in enzymes and its relationship to enzyme function.

    Science.gov (United States)

    Lai, Jason; Jin, Jing; Kubelka, Jan; Liberles, David A

    2012-09-21

    Since the dynamic nature of protein structures is essential for enzymatic function, it is expected that functional evolution can be inferred from the changes in protein dynamics. However, dynamics can also diverge neutrally with sequence substitution between enzymes without changes of function. In this study, a phylogenetic approach is implemented to explore the relationship between enzyme dynamics and function through evolutionary history. Protein dynamics are described by normal mode analysis based on a simplified harmonic potential force field applied to the reduced C(α) representation of the protein structure while enzymatic function is described by Enzyme Commission numbers. Similarity of the binding pocket dynamics at each branch of the protein family's phylogeny was analyzed in two ways: (1) explicitly by quantifying the normal mode overlap calculated for the reconstructed ancestral proteins at each end and (2) implicitly using a diffusion model to obtain the reconstructed lineage-specific changes in the normal modes. Both explicit and implicit ancestral reconstruction identified generally faster rates of change in dynamics compared with the expected change from neutral evolution at the branches of potential functional divergences for the α-amylase, D-isomer-specific 2-hydroxyacid dehydrogenase, and copper-containing amine oxidase protein families. Normal mode analysis added additional information over just comparing the RMSD of static structures. However, the branch-specific changes were not statistically significant compared to background function-independent neutral rates of change of dynamic properties and blind application of the analysis would not enable prediction of changes in enzyme specificity. Copyright © 2012 Elsevier Ltd. All rights reserved.
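
The normal mode overlap used in the explicit reconstruction (1) is conventionally the normalized inner product of two mode displacement vectors: a value near 1 indicates conserved dynamics along a branch, a value near 0 indicates divergent motions. A small illustrative sketch (not the authors' code):

```python
import math

def mode_overlap(mode_a, mode_b):
    """Normalized inner product |a.b| / (|a||b|) between two normal modes.

    Each mode is a flat list of Cartesian displacement components
    (3N values for N C-alpha atoms); 1.0 means identical mode shapes,
    0.0 means orthogonal motions.
    """
    dot = sum(a * b for a, b in zip(mode_a, mode_b))
    norm_a = math.sqrt(sum(a * a for a in mode_a))
    norm_b = math.sqrt(sum(b * b for b in mode_b))
    return abs(dot) / (norm_a * norm_b)

# parallel displacement patterns overlap fully; perpendicular ones not at all
same = mode_overlap([1.0, 0.0, 2.0], [2.0, 0.0, 4.0])  # close to 1.0
perp = mode_overlap([1.0, 0.0], [0.0, 1.0])            # 0.0
```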

  10. Analysis of Voltage Forming Methods for Multiphase Inverters

    Directory of Open Access Journals (Sweden)

    Tadas Lipinskis

    2013-05-01

Full Text Available The article discusses advantages of the multiphase AC induction motor over motors with three or fewer phases. It presents possible stator winding configurations for a multiphase induction motor. Various control strategies for faults in the phases feeding the motor were reviewed. The authors propose a method for quality evaluation of the voltage forming algorithm in the inverter. Simulation of a six-phase voltage source inverter, in which voltage is formed using a simple SPWM control algorithm, was performed in Matlab Simulink. Simulation results were evaluated using the proposed method. The inverter's power stage was powered by a 400 V DC source. The spectrum of output currents was analysed, and the magnitude of the main frequency component was at least 12 times greater than the next largest component. The value of the rectified inverter voltage was 373 V. Article in Lithuanian.

  11. Biopower, Normalization, and HPV: A Foucauldian Analysis of the HPV Vaccine Controversy.

    Science.gov (United States)

    Engels, Kimberly S

    2016-09-01

This article utilizes the Foucauldian concepts of biopower and normalization to give an analysis of the debate surrounding the controversial administration of the HPV vaccine to adolescents. My intention is not to solve the problem, but rather to utilize a Foucauldian framework to bring various facets of the issue to light, specifically the way the vaccine contributes to strategies of power in reference to how young adults develop within relationships of power. To begin, the article provides an overview of the Foucauldian concepts of biopower and normalization, including how these two strategies of power were present in the administration of the smallpox vaccine in the 19th century. Next, information about HPV and the history of the current controversy in the United States is presented. Lastly, the article presents an analysis of the strategies of biopower and normalization present in the debate on HPV, including an emphasis on how the vaccination is similar to, and different from, 19th century smallpox vaccination. It also explores the way that mechanisms of disease control affect and are affected by individual subjects, in this case, adolescents.

  12. Fast Edge Detection and Segmentation of Terrestrial Laser Scans Through Normal Variation Analysis

    Science.gov (United States)

    Che, E.; Olsen, M. J.

    2017-09-01

    Terrestrial Laser Scanning (TLS) utilizes light detection and ranging (lidar) to effectively and efficiently acquire point cloud data for a wide variety of applications. Segmentation is a common procedure of post-processing to group the point cloud into a number of clusters to simplify the data for the sequential modelling and analysis needed for most applications. This paper presents a novel method to rapidly segment TLS data based on edge detection and region growing. First, by computing the projected incidence angles and performing the normal variation analysis, the silhouette edges and intersection edges are separated from the smooth surfaces. Then a modified region growing algorithm groups the points lying on the same smooth surface. The proposed method efficiently exploits the gridded scan pattern utilized during acquisition of TLS data from most sensors and takes advantage of parallel programming to process approximately 1 million points per second. Moreover, the proposed segmentation does not require estimation of the normal at each point, which limits the errors in normal estimation propagating to segmentation. Both an indoor and outdoor scene are used for an experiment to demonstrate and discuss the effectiveness and robustness of the proposed segmentation method.
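
The region-growing step described above can be pictured as a flood fill over the gridded scan in which a neighbouring point joins the current segment only if its normal deviates from its neighbour's by less than an angular threshold. A toy sketch under that assumption (the threshold and 4-neighbourhood below are illustrative choices, not the paper's parameters):

```python
import math
from collections import deque

def segment_grid(normals, angle_thresh_deg=10.0):
    """Toy region growing over a gridded scan: neighbouring grid cells whose
    unit normals differ by less than angle_thresh_deg join one segment."""
    rows, cols = len(normals), len(normals[0])
    cos_thresh = math.cos(math.radians(angle_thresh_deg))
    labels = [[None] * cols for _ in range(rows)]
    next_label = 0
    for r0 in range(rows):
        for c0 in range(cols):
            if labels[r0][c0] is not None:
                continue
            labels[r0][c0] = next_label
            queue = deque([(r0, c0)])
            while queue:
                r, c = queue.popleft()
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rn, cn = r + dr, c + dc
                    if 0 <= rn < rows and 0 <= cn < cols and labels[rn][cn] is None:
                        dot = sum(a * b for a, b in zip(normals[r][c], normals[rn][cn]))
                        if dot >= cos_thresh:  # smooth surface continues
                            labels[rn][cn] = next_label
                            queue.append((rn, cn))
            next_label += 1
    return labels

# two planar patches meeting at a 90-degree edge yield two segments
up, side = (0.0, 0.0, 1.0), (1.0, 0.0, 0.0)
grid = [[up, up, side, side],
        [up, up, side, side]]
labels = segment_grid(grid)
```

The real method avoids per-point normal estimation by working with projected incidence angles, but the grouping logic is of this flood-fill form.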

  13. Evaluation of Different Normalization and Analysis Procedures for Illumina Gene Expression Microarray Data Involving Small Changes

    Science.gov (United States)

    Johnstone, Daniel M.; Riveros, Carlos; Heidari, Moones; Graham, Ross M.; Trinder, Debbie; Berretta, Regina; Olynyk, John K.; Scott, Rodney J.; Moscato, Pablo; Milward, Elizabeth A.

    2013-01-01

    While Illumina microarrays can be used successfully for detecting small gene expression changes due to their high degree of technical replicability, there is little information on how different normalization and differential expression analysis strategies affect outcomes. To evaluate this, we assessed concordance across gene lists generated by applying different combinations of normalization strategy and analytical approach to two Illumina datasets with modest expression changes. In addition to using traditional statistical approaches, we also tested an approach based on combinatorial optimization. We found that the choice of both normalization strategy and analytical approach considerably affected outcomes, in some cases leading to substantial differences in gene lists and subsequent pathway analysis results. Our findings suggest that important biological phenomena may be overlooked when there is a routine practice of using only one approach to investigate all microarray datasets. Analytical artefacts of this kind are likely to be especially relevant for datasets involving small fold changes, where inherent technical variation—if not adequately minimized by effective normalization—may overshadow true biological variation. This report provides some basic guidelines for optimizing outcomes when working with Illumina datasets involving small expression changes. PMID:27605185
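
Concordance between the gene lists produced by two normalization/analysis pipelines can be summarized, for example, by the Jaccard index; the gene symbols below are placeholders, not results from the study's datasets:

```python
def jaccard(list_a, list_b):
    """Overlap between two differentially-expressed gene lists."""
    a, b = set(list_a), set(list_b)
    return len(a & b) / len(a | b)

# hypothetical lists from a quantile-normalized vs. a non-normalized pipeline
quantile_genes = {"HFE", "TFR2", "SLC40A1", "HAMP"}
none_genes = {"HFE", "TFR2", "BMP6"}
score = jaccard(quantile_genes, none_genes)  # 2 shared of 5 total -> 0.4
```

A low score between pipelines run on the same dataset is the kind of discordance the study reports for small fold changes.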

  14. Normal levels of total body sodium and chlorine by neutron activation analysis

    International Nuclear Information System (INIS)

    Kennedy, N.S.J.; Eastell, R.; Smith, M.A.; Tothill, P.

    1983-01-01

    In vivo neutron activation analysis was used to measure total body sodium and chlorine in 18 male and 18 female normal adults. Corrections for body size were developed. Normalisation factors were derived which enable the prediction of the normal levels of sodium and chlorine in a subject. The coefficient of variation of normalised sodium was 5.9% in men and 6.9% in women, and of normalised chlorine 9.3% in men and 5.5% in women. In the range examined (40-70 years) no significant age dependence was observed for either element. Total body sodium was correlated with total body chlorine and total body calcium. Sodium excess, defined as the amount of body sodium in excess of that associated with chlorine, also correlated well with total body calcium. In females there was a mean annual loss of sodium excess of 1.2% after the menopause, similar to the loss of calcium. (author)
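
The "sodium excess" quantity defined above can be sketched as total body sodium minus the sodium associated with chlorine; the pairing factor below is an illustrative assumption, not the regression coefficient from the study:

```python
def sodium_excess(total_na_mmol, total_cl_mmol, na_per_cl=1.0):
    """Body sodium beyond the amount associated with chlorine.

    na_per_cl is a placeholder pairing factor (mmol Na per mmol Cl); the
    study would derive it from the Na-Cl relationship in normal subjects.
    """
    return total_na_mmol - na_per_cl * total_cl_mmol

excess = sodium_excess(4000.0, 2500.0)  # 1500.0 mmol with these toy inputs
```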

  15. Dynamic analysis of elastic rubber-tired car wheel braking under variable normal load

    Science.gov (United States)

    Fedotov, A. I.; Zedgenizov, V. G.; Ovchinnikova, N. I.

    2017-10-01

The purpose of the paper is to analyze the dynamics of wheel braking under normal load variations. The paper uses a mathematical simulation method in which the calculation model of the object, as a mechanical system, is associated with a dynamically equivalent block diagram of an automatic control system. Transfer functions were used to analyze the structural and technical characteristics of the object as well as force disturbances. It was shown that the analysis of the dynamic characteristics of a wheel subjected to external force disturbances has to take into account the amplitude- and phase-frequency characteristics. Normal load variations affect car wheel braking under disturbances: the closer the slip is to the critical point, the stronger the effect. In the super-critical region, load variations cause rapid wheel locking.

  16. Normal Mode Analysis to a Poroelastic Half-Space Problem under Generalized Thermoelasticity

    Directory of Open Access Journals (Sweden)

    Chunbao Xiong

Full Text Available Abstract The thermo-hydro-mechanical problems associated with a poroelastic half-space soil medium with variable properties under generalized thermoelasticity theory were investigated in this study. Remaining faithful to Biot’s theory of dynamic poroelasticity, we idealized the foundation material as a uniform, fully saturated, poroelastic half-space medium. We first subjected this medium to time harmonic loads consisting of normal or thermal loads, then investigated the differences between the coupled thermo-hydro-mechanical dynamic models and the thermo-elastic dynamic models. We used normal mode analysis to solve the resulting non-dimensional coupled equations, then investigated the effects on the non-dimensional vertical displacement, excess pore water pressure, vertical stress, and temperature distribution in the poroelastic half-space medium and represented them graphically.

  17. Quantitative-genetic analysis of wing form and bilateral asymmetry ...

    Indian Academy of Sciences (India)

    Unknown

lines; Procrustes analysis; wing shape; wing size. ... Models of stochastic gene expression predict that intrinsic noise ... Quantitative parameters of wing size and shape asymmetries ... the residuals of a regression on centroid size produced.

  18. Subject description form of crime prevention (morphological analysis)

    OpenAIRE

    Валерій Федорович Оболенцев

    2016-01-01

The activity of national crime prevention is a systemic object; therefore, it should be improved on the basis of systems analysis techniques. A systematic approach was practised in the works of N. F. Kuznetsova, A. I. Dolgova, D. O. Li, V. M. Dryomin, O. Y. Manokha, and O. G. Frolova. Crime models were developed by C. Y. Vitsin, Y. D. Bluvshteyn, and N. V. Smetanina. We previously disclosed the basic principles of a systems analysis of crime prevention and its genetic and prognostic aspec...

  19. Usual normalization strategies for gene expression studies impair the detection and analysis of circadian patterns.

    Science.gov (United States)

    Figueredo, Diego de Siqueira; Barbosa, Mayara Rodrigues; Coimbra, Daniel Gomes; Dos Santos, José Luiz Araújo; Costa, Ellyda Fernanda Lopes; Koike, Bruna Del Vechio; Alexandre Moreira, Magna Suzana; de Andrade, Tiago Gomes

    2018-03-01

Recent studies have shown that transcriptomes from different tissues present circadian oscillations. Therefore, the endogenous variation of total RNA should be considered as a potential bias in circadian studies of gene expression. However, normalization strategies generally include the equalization of total RNA concentration between samples prior to cDNA synthesis. Moreover, endogenous housekeeping genes (HKGs) frequently used for data normalization may exhibit circadian variation and distort experimental results if not detected or considered. In this study, we controlled experimental conditions from the amount of initial brain tissue samples through extraction steps, cDNA synthesis, and quantitative real time PCR (qPCR) to demonstrate a circadian oscillation of total RNA concentration. We also identified that the normalization of the RNA's yield affected the rhythmic profiles of different genes, including Per1-2 and Bmal1. Five widely used HKGs (Actb, Eif2a, Gapdh, Hprt1, and B2m) also presented rhythmic variations not detected by the geNorm algorithm. In addition, the analysis of exogenous microRNAs (Cel-miR-54 and Cel-miR-39) spiked during RNA extraction suggests that the yield was affected by total RNA concentration, which may impact circadian studies of small RNAs. The results indicate that the approach of tissue normalization without total RNA equalization prior to cDNA synthesis can avoid bias from endogenous broad variations in transcript levels. Also, the circadian analysis of 2^(-Ct) data (Ct, cycle threshold), without HKGs, may be an alternative for chronobiological studies under controlled experimental conditions.
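
The contrast drawn above, between HKG-normalized and HKG-free quantification, comes down to the exponent applied to the (assumed twofold-per-cycle) qPCR amplification. A minimal sketch with illustrative Ct values:

```python
def rq_delta_ct(ct_target, ct_hkg):
    """Relative quantity normalized to a housekeeping gene: 2^-(dCt)."""
    return 2.0 ** -(ct_target - ct_hkg)

def rq_raw(ct_target):
    """HKG-free alternative discussed above: 2^-Ct."""
    return 2.0 ** -ct_target

# A rhythmic HKG (Ct drifting from 20 to 19 across time points) creates an
# apparent rhythm in a flat target gene (constant Ct 25):
early = rq_delta_ct(25.0, 20.0)  # 2^-5 = 0.03125
late = rq_delta_ct(25.0, 19.0)   # 2^-6 = 0.015625, a spurious 2-fold drop
flat = rq_raw(25.0)              # unchanged across time points
```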

  20. Analysis of image plane's Illumination in Image-forming System

    International Nuclear Information System (INIS)

    Duan Lihua; Zeng Yan'an; Zhang Nanyangsheng; Wang Zhiguo; Yin Shiliang

    2011-01-01

In the detection of optical radiation, the detection accuracy is affected to a large extent by the optical power distribution on the detector's surface. In addition, in an image-forming system, the quality of the image is largely determined by the uniformity of the image's illumination distribution. However, in practical optical systems, affected by factors such as the field of view, stray light and off-axis effects, the distribution of the image's illumination tends to be non-uniform, so it is necessary to discuss the image plane's illumination in image-forming systems. In order to analyze the characteristics of the image-forming system over the full range, formulas to calculate the illumination of the image plane have been derived on the basis of photometry. Moreover, the relationship between the horizontal offset of the light source and the illumination of the image has been discussed in detail. After that, the influence of key factors such as the aperture angle, off-axis distance and horizontal offset on the illumination of the image has been brought forward. Through numerical simulation, theoretical curves for these key factors have been given. The results of the numerical simulation show that enlarging the diameter of the exit pupil increases the illumination of the image. The angle of view plays a negative role in the illumination distribution of the image; that is, the uniformity of the illumination distribution can be enhanced by compressing the angle of view. Lastly, it is shown that a telecentric optical design is an effective way to improve the uniformity of the illumination distribution.

  1. FEM analysis of hollow hub forming in rolling extrusion process

    Directory of Open Access Journals (Sweden)

    J. Bartnicki

    2014-10-01

Full Text Available This paper presents the results of numerical calculations of the rolling extrusion process of a hollow hub. As flanges must be manufactured at both sides of the product, a rear bumper was implemented in the analyzed rolling extrusion process as an additional tool limiting axial metal flow. Numerical calculations of the hub forming process were based on the finite element method, using the Deform3D and Simufact software under a three-dimensional state of strain. The satisfactory results obtained show that further experimental research is feasible, with the application of a modernized aggregate for the rolling extrusion process PO-2.

  2. Metallurgical Analysis of Cracks Formed on Coal Fired Boiler Tube

    Science.gov (United States)

    Kishor, Rajat; Kyada, Tushal; Goyal, Rajesh K.; Kathayat, T. S.

    2015-02-01

    Metallurgical failure analysis was carried out for cracks observed on the outer surface of a boiler tube made of ASME SA 210 GR A1 grade steel. The cracks on the surface of the tube were observed after 6 months from the installation in service. A careful visual inspection, chemical analysis, hardness measurement, detailed microstructural analysis using optical and scanning electron microscopy coupled with energy dispersive X-ray spectroscopy were carried out to ascertain the cause for failure. Visual inspection of the failed tube revealed the presence of oxide scales and ash deposits on the surface of the tube exposed to fire. Many cracks extending longitudinally were observed on the surface of the tube. Bulging of the tube was also observed. The results of chemical analysis, hardness values and optical micrographs did not exhibit any abnormality at the region of failure. However, detailed SEM with EDS analysis confirmed the presence of various oxide scales. These scales initiated corrosion at both the inner and outer surfaces of the tube. In addition, excessive hoop stress also developed at the region of failure. It is concluded that the failure of the boiler tube took place owing to the combined effect of the corrosion caused by the oxide scales as well as the excessive hoop stress.

  3. A FORM ANALYSIS OF JAPANESE PEDESTRIAN DECKS AND EUROPEAN PLAZAS

    Directory of Open Access Journals (Sweden)

    ANDO Naomi

    2015-06-01

Full Text Available This study compares Japanese pedestrian decks and European plazas as public pedestrian spaces. The characteristics of both types of spaces will be clarified through a schematic analysis. The connections of these spaces with their surroundings will also be analyzed. Further, the spatial image of these spaces is discussed. Pedestrian spaces in Romania will be discussed as well.

  4. Single-cell analysis of ploidy and centrosomes underscores the peculiarity of normal hepatocytes.

    Directory of Open Access Journals (Sweden)

    Francesca Faggioli

    Full Text Available Polyploidization is the most well recognized feature of the liver. Yet, a quantitative and behavioral analysis of centrosomes and DNA content in normal hepatocytes has been limited by the technical challenges of methods available. By using a novel approach employing FISH for chromosomes 18, X and Y we provide, for the first time, a detailed analysis of DNA copies during physiological development in the liver at single cell level. We demonstrate that aneuploidy and unbalanced DNA content in binucleated hepatocytes are common features in normal adult liver. Despite the common belief that hepatocytes contain 1, 2 or no more than 4 centrosomes, our double staining for centrosome associated proteins reveals extranumerary centrosomes in a high percentage of cells as early as 15 days of age. We show that in murine liver the period between 15 days and 1.5 months marks the transition from a prevalence of mononucleated cells to up to 75% of binucleated cells. Our data demonstrate that this timing correlates with a switch in centrosomes number. At 15 days the expected 1 or 2 centrosomes converge with several hepatocytes that contain 3 centrosomes; at 1.5 months the percentage of cells with 3 centrosomes decreases concomitantly with the increase of cells with more than 4 centrosomes. Our analysis shows that the extranumerary centrosomes emerge in concomitance with the process of binucleation and polyploidization and maintain α-tubulin nucleation activity. Finally, by integrating interphase FISH and immunofluorescent approaches, we detected an imbalance between centrosome number and DNA content in liver cells that deviates from the equilibrium expected in normal cells. We speculate that these unique features are relevant to the peculiar biological function of liver cells which are continuously challenged by stress, a condition that could predispose to genomic instability.

  5. Analysis of radionuclide dispersion at normal condition for AEC 1000 MW reactor power

    International Nuclear Information System (INIS)

    Sri Kuntjoro

    2010-01-01

An analysis of radionuclide dispersion at normal operation for the Atomic Energy Agency (AEC) 3,568 MWth (1,000 MWe) power reactor has been performed. The analysis was done for two units separated by a distance of 500 m at an angle of 90° to each other. The initial step of the analysis was to determine the reactor source term using the ORIGEN2 and EMERALD NORMAL computer codes. ORIGEN2 was applied to determine the radionuclide inventory in the reactor; the EMERALD NORMAL code was then used to calculate the source term reaching the reactor stack. The dose received by the population was analyzed using the PC-CREAM computer code. Calculations were done for one and for two nuclear power plants (PLTN) located at the candidate site. The results show that the highest radionuclide release for one PLTN is at a distance of 1 km in the 9th angular zone (19.25°), and for two PLTN at 1 km in the 10th zone (21.75°). Radionuclides reach the population through two pathways: foodstuffs and inhalation. The foodstuff pathway is dominated by I-131, with the largest contribution from milk products at 53.40% for both one and two PLTN. For the inhalation pathway, the highest contribution comes from Kr-85m, at about 53.80%. The highest total dose to the population occurs at a distance of 1 km and is received by infants: 4.10 µSv and 11.26 µSv for one and two PLTN, respectively. These values are very small compared to the maximum permissible dose to the population set by the regulatory body, equal to 1 mSv. (author)

  6. Serial analysis of gene expression (SAGE) in normal human trabecular meshwork.

    Science.gov (United States)

    Liu, Yutao; Munro, Drew; Layfield, David; Dellinger, Andrew; Walter, Jeffrey; Peterson, Katherine; Rickman, Catherine Bowes; Allingham, R Rand; Hauser, Michael A

    2011-04-08

    To identify the genes expressed in normal human trabecular meshwork tissue, a tissue critical to the pathogenesis of glaucoma. Total RNA was extracted from human trabecular meshwork (HTM) harvested from 3 different donors. Extracted RNA was used to synthesize individual SAGE (serial analysis of gene expression) libraries using the I-SAGE Long kit from Invitrogen. Libraries were analyzed using SAGE 2000 software to extract the 17 base pair sequence tags. The extracted sequence tags were mapped to the genome using SAGE Genie map. A total of 298,834 SAGE tags were identified from all HTM libraries (96,842, 88,126, and 113,866 tags, respectively). Collectively, there were 107,325 unique tags. There were 10,329 unique tags with a minimum of 2 counts from a single library. These tags were mapped to known unique Unigene clusters. Approximately 29% of the tags (orphan tags) did not map to a known Unigene cluster. Thirteen percent of the tags mapped to at least 2 Unigene clusters. Sequence tags from many glaucoma-related genes, including myocilin, optineurin, and WD repeat domain 36, were identified. This is the first time SAGE analysis has been used to characterize the gene expression profile in normal HTM. SAGE analysis provides an unbiased sampling of gene expression of the target tissue. These data will provide new and valuable information to improve understanding of the biology of human aqueous outflow.
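
Tag extraction in SAGE is essentially fixed-length string processing: each tag is the short sequence immediately downstream of an anchoring-enzyme site (CATG for NlaIII; 17 bp in long SAGE, as here). A sketch under those assumptions, with toy reads:

```python
from collections import Counter

def extract_tags(reads, anchor="CATG", tag_len=17):
    """Count the tag_len bases following every occurrence of the anchor."""
    tags = Counter()
    for read in reads:
        start = read.find(anchor)
        while start != -1:
            tag = read[start + len(anchor): start + len(anchor) + tag_len]
            if len(tag) == tag_len:  # skip truncated tags at the read end
                tags[tag] += 1
            start = read.find(anchor, start + 1)
    return tags

reads = ["AAACATGTTTTTTTTTTTTTTTTTGGG",
         "CCCATGTTTTTTTTTTTTTTTTTAA"]
counts = extract_tags(reads)  # both toy reads carry the same 17-bp tag
```

Tags meeting a minimum count (2 in the study) would then be mapped to Unigene clusters.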

  7. 32 CFR 989.12 - AF Form 813, Request for Environmental Impact Analysis.

    Science.gov (United States)

    2010-07-01

... 32 National Defense 6 2010-07-01 false AF Form 813, Request for Environmental Impact... FORCE ENVIRONMENTAL PROTECTION ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.12 AF Form 813, Request for Environmental Impact Analysis. The Air Force uses AF Form 813 to document the need for...

  8. Prospect theory: A parametric analysis of functional forms in Brazil

    Directory of Open Access Journals (Sweden)

    Robert Eugene Lobel

    2017-10-01

Full Text Available This study aims to analyze risk preferences in Brazil based on prospect theory by estimating the risk aversion parameter of expected utility theory (EUT) for a select sample, in addition to the value and probability function parameters, assuming various functional forms, and a newly proposed value function, the modified log. This is the first such study in Brazil, and the parameter results are slightly different from studies in other countries, indicating that subjects are more risk averse and exhibit a smaller loss aversion. Probability distortion is the only common factor. As expected, the study finds that behavioral models are superior to EUT, and models based on prospect theory, the TK and Prelec weighting functions, and the power value function show superior performance to others. Finally, the modified log function proposed in the study fits the data well, and can thus be used for future studies in Brazil.
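
For reference, the functional forms being compared typically look like the following; the parameter values shown are the classic Tversky-Kahneman median estimates, not the Brazilian-sample estimates from this study:

```python
import math

def value_tk(x, alpha=0.88, beta=0.88, lam=2.25):
    """Power value function: x^alpha for gains, -lam*(-x)^beta for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def weight_tk(p, gamma=0.61):
    """Tversky-Kahneman probability weighting function."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def weight_prelec(p, beta=0.74):
    """Prelec probability weighting function."""
    return math.exp(-((-math.log(p)) ** beta))

# the characteristic probability distortion: small probabilities are
# overweighted and large ones underweighted
overweighted = weight_tk(0.05) > 0.05   # True
underweighted = weight_tk(0.95) < 0.95  # True
```

The "modified log" value function proposed in the study would replace `value_tk`, while the weighting functions stay as above.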

  9. GC-MS-Based Endometabolome Analysis Differentiates Prostate Cancer from Normal Prostate Cells

    Directory of Open Access Journals (Sweden)

    Ana Rita Lima

    2018-03-01

Full Text Available Prostate cancer (PCa) is an important health problem worldwide. Diagnosis and management of PCa is very complex because the detection of serum prostate specific antigen (PSA) has several drawbacks. Metabolomics brings promise for cancer biomarker discovery and for better understanding PCa biochemistry. In this study, a gas chromatography–mass spectrometry (GC-MS) based metabolomic profiling of PCa cell lines was performed. The cell lines include 22RV1 and LNCaP from PCa with androgen receptor (AR) expression, DU145 and PC3 (which lack AR expression), and one normal prostate cell line (PNT2). Regarding the metastatic potential, PC3 is from an adenocarcinoma grade IV with high metastatic potential, DU145 has a moderate metastatic potential, and LNCaP has a low metastatic potential. Using multivariate analysis, alterations in levels of several intracellular metabolites were detected, disclosing the capability of the endometabolome to discriminate all PCa cell lines from the normal prostate cell line. Discriminant metabolites included amino acids, fatty acids, steroids, and sugars. Six stood out for the separation of all the studied PCa cell lines from the normal prostate cell line: ethanolamine, lactic acid, β-alanine, L-valine, L-leucine, and L-tyrosine.

  10. Research and analysis of contradictions and strategies of Hainan tourism development under the new normal background

    Directory of Open Access Journals (Sweden)

    Xie Xiangxiang

    2016-01-01

    Full Text Available At present, China’s economic and social development has entered a new normal state, and the construction of the Hainan international tourism island has entered its middle and late period. Hainan tourism has made brilliant achievements, but the contradictions of development have become increasingly serious. Based on an analysis of the current situation and practical problems of Hainan tourism, this paper points out five principal problems of Hainan tourism development: industrial status, supply-demand relationship, cultural preservation, subject-object relationship, and government-business relationship. It also proposes strategic suggestions on the great tourism, large interconnection, great culture, big Hainan and big special zone.

  11. Self-Normalized Photoacoustic Technique for the Quantitative Analysis of Paper Pigments

    Science.gov (United States)

    Balderas-López, J. A.; Gómez y Gómez, Y. M.; Bautista-Ramírez, M. E.; Pescador-Rojas, J. A.; Martínez-Pérez, L.; Lomelí-Mejía, P. A.

    2018-03-01

    A self-normalized photoacoustic technique was applied for quantitative analysis of pigments embedded in solids. Paper samples (filter paper, Whatman No. 1) dyed with the pigment Direct Fast Turquoise Blue GL were used for this study. This pigment is a blue dye commonly used in industry to dye paper and other fabrics. The optical absorption coefficient at a wavelength of 660 nm was measured for this pigment at various concentrations in the paper substrate. It was shown that the Beer-Lambert model for light absorption applies well to pigments in solid substrates, and that optical absorption coefficients as large as 220 cm^{-1} can be measured with this photoacoustic technique.
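The Beer-Lambert check described in this abstract amounts to verifying that the measured absorption coefficient grows linearly with pigment concentration. A minimal sketch of that verification, with invented calibration numbers (the concentrations and coefficients below are illustrative, not the paper's data):

```python
import numpy as np

# Hypothetical calibration data: pigment concentration (wt%) vs. measured
# optical absorption coefficient beta (cm^-1) at 660 nm. Illustrative only.
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
beta = np.array([14.0, 27.5, 56.0, 110.0, 221.0])

# Beer-Lambert predicts beta grows linearly with concentration:
# beta = eps * c, with eps an effective absorptivity of the pigment in paper.
eps, intercept = np.polyfit(conc, beta, 1)

# Goodness of the linear model
pred = eps * conc + intercept
r2 = 1 - np.sum((beta - pred) ** 2) / np.sum((beta - beta.mean()) ** 2)
print(round(eps, 1), round(r2, 3))
```

A near-unity r² on real calibration data is what supports the claim that the Beer-Lambert model "applies well" in the solid substrate.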

  12. An acoustic analysis of laughter produced by congenitally deaf and normally hearing college students

    Science.gov (United States)

    Makagon, Maja M.; Funayama, E. Sumie; Owren, Michael J.

    2008-01-01

    Relatively few empirical data are available concerning the role of auditory experience in nonverbal human vocal behavior, such as laughter production. This study compared the acoustic properties of laughter in 19 congenitally, bilaterally, and profoundly deaf college students and in 23 normally hearing control participants. Analyses focused on degree of voicing, mouth position, air-flow direction, temporal features, relative amplitude, fundamental frequency, and formant frequencies. Results showed that laughter produced by the deaf participants was fundamentally similar to that produced by the normally hearing individuals, which in turn was consistent with previously reported findings. Finding comparable acoustic properties in the sounds produced by deaf and hearing vocalizers confirms the presumption that laughter is importantly grounded in human biology, and that auditory experience with this vocalization is not necessary for it to emerge in species-typical form. Some differences were found between the laughter of deaf and hearing groups; the most important being that the deaf participants produced lower-amplitude and longer-duration laughs. These discrepancies are likely due to a combination of the physiological and social factors that routinely affect profoundly deaf individuals, including low overall rates of vocal fold use and pressure from the hearing world to suppress spontaneous vocalizations. PMID:18646991

  13. Determination of the main solid-state form of albendazole in bulk drug, employing Raman spectroscopy coupled to multivariate analysis.

    Science.gov (United States)

    Calvo, Natalia L; Arias, Juan M; Altabef, Aída Ben; Maggio, Rubén M; Kaufman, Teodoro S

    2016-09-10

    Albendazole (ALB) is a broad-spectrum anthelmintic which exhibits two solid-state forms (Forms I and II). Form I is the metastable crystal at room temperature, while Form II is the stable one. Because the drug has poor aqueous solubility and Form II is less soluble than Form I, it is desirable to have a method to assess the solid-state form of the drug employed for manufacturing purposes. Therefore, a Partial Least Squares (PLS) model was developed for the determination of Form I of ALB in its mixtures with Form II. For model development, both solid-state forms of ALB were prepared and characterized by microscopic (optical, with normal and polarized light), thermal (DSC) and spectroscopic (ATR-FTIR, Raman) techniques. Mixtures of the solids in different ratios were prepared by weighing and mechanical mixing of the components. Their Raman spectra were acquired and subjected to peak smoothing, normalization, standard normal variate correction and de-trending before performing the PLS calculations. The optimal spectral region (1396-1280 cm^{-1}) and number of latent variables (LV = 3) were obtained employing a moving window of variable size strategy. The method was internally validated by means of the leave-one-out procedure, providing satisfactory statistics (r^2 = 0.9729 and RMSD = 5.6%) and figures of merit (LOD = 9.4% and MDDC = 1.4). Furthermore, the method's performance was also evaluated by analysis of two validation sets. Validation set I was used for assessment of linearity and range, and Validation set II to demonstrate accuracy and precision (recovery = 101.4% and RSD = 2.8%). Additionally, a third set of spiked commercial samples was evaluated, exhibiting excellent recoveries (94.2 ± 6.4%). The results suggest that the combination of Raman spectroscopy with multivariate analysis could be applied to the assessment of the main crystal form and its quantitation in samples of ALB bulk drug in the routine quality control laboratory. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Pseudodifferential Analysis, Automorphic Distributions in the Plane and Modular Forms

    CERN Document Server

    Unterberger, Andre

    2011-01-01

    Pseudodifferential analysis, introduced in this book in a way adapted to the needs of number theorists, relates automorphic function theory in the hyperbolic half-plane Π to automorphic distribution theory in the plane. Spectral-theoretic questions are discussed in one or the other environment: in the latter one, the problem of decomposing automorphic functions in Π according to the spectral decomposition of the modular Laplacian gives way to the simpler one of decomposing automorphic distributions in R^2 into homogeneous components. The Poincare summation process, which consists in building au

  15. Hanford Site Composite Analysis Technical Approach Description: Waste Form Release.

    Energy Technology Data Exchange (ETDEWEB)

    Hardie, S. [CH2M HILL Plateau Remediation Company, Richland, WA (United States); Paris, B. [CH2M HILL Plateau Remediation Company, Richland, WA (United States); Apted, M. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)

    2017-09-14

    The U.S. Department of Energy (DOE), in DOE O 435.1 Chg. 1, Radioactive Waste Management, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land-use decisions that help ensure disposal facility authorization will not result in long-term compliance problems, or to determine management alternatives, corrective actions, or assessment needs if potential problems are identified.

  16. Subject description form of crime prevention (morphological analysis

    Directory of Open Access Journals (Sweden)

    Валерій Федорович Оболенцев

    2016-06-01

    Full Text Available The national crime prevention effort is a systemic object, and it should therefore be improved using the techniques of systems analysis. The systematic approach was pursued in the works of N. F. Kuznetsova, A. I. Dolgova, D. O. Li, V. M. Dryomin, O. Y. Manokha, and O. G. Frolova; crime models were developed by C. Y. Vitsin, Y. D. Bluvshteyn, and N. V. Smetanina. We previously set out the basic principles of a systems analysis of crime prevention: its genetic and prognostic aspects, classification features, systemic factors in the latency of criminogenic factors, the object of protective activity, and the scope of protected public relations. In order to investigate the systemic properties of the crime prevention system in Ukraine, the objective of this study is its morphological analysis. The elements of the specialized crime prevention system are the prosecution, the Interior Ministry, the Security Service, the Military Service of the Armed Forces of Ukraine, the National Anti-Corruption Bureau of Ukraine, the state border protection agencies, the agencies for revenues and fees, enforcement and penal institutions, remand centres, state financial control, fisheries protection, and the state forest protection service. The depth of the analysis was set at the level of the functions of individual law enforcement agencies. The internal communication of the crime prevention system consists of information links between its elements (the transfer of information on crimes and criminals in current activity). The external relations of the system are the interactions of its elements with the environment (law enforcement and society). The crime prevention system implements the following coordination links: 1) departmental coordination meetings of law enforcement agencies; 2) inter-agency coordination meetings of law enforcement agencies (the Prosecutor General of Ukraine, the State Border Service of Ukraine, and others); 3) mutual exchange of information; 4) orders of the prosecution and the SBU to other agencies.

  17. Normalized modes at selected points without normalization

    Science.gov (United States)

    Kausel, Eduardo

    2018-04-01

    As every textbook on linear algebra demonstrates, the eigenvectors for the general eigenvalue problem |K - λM| = 0 involving two real, symmetric, positive definite matrices K, M satisfy some well-defined orthogonality conditions. Equally well known is the fact that those eigenvectors can be normalized so that their modal mass μ = ϕ^T Mϕ is unity: it suffices to divide each unscaled mode by the square root of the modal mass. Thus, the normalization is the result of an explicit calculation applied to the modes after they were obtained by some means. However, we show herein that the normalized modes are not merely convenient forms of scaling, but are actually intrinsic properties of the pair of matrices K, M; that is, the matrices already "know" about normalization even before the modes have been obtained. This means that we can obtain individual components of the normalized modes directly from the eigenvalue problem, without needing to obtain either all of the modes or, for that matter, any one complete mode. These results are achieved by means of the residue theorem of operational calculus, a finding that is rather remarkable inasmuch as the residues themselves do not make use of any orthogonality conditions or normalization in the first place. It appears that this obscure property connecting the general eigenvalue problem of modal analysis with the residue theorem of operational calculus may have been overlooked up until now, but it has in turn interesting theoretical implications.
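The claim can be checked numerically on a small example: with mass-normalized modes, the resolvent expands as (K - λM)^{-1} = Σ_j ϕ_j ϕ_j^T / (λ_j - λ), so (λ_j - λ)(K - λM)^{-1} tends to the outer product ϕ_j ϕ_j^T as λ → λ_j, and individual entries of a normalized mode are recoverable from residues without any explicit normalization. A sketch with an arbitrary 2×2 pair K, M (not taken from the paper):

```python
import numpy as np
from scipy.linalg import eigh, inv

# Small symmetric positive definite test pair (illustrative values)
K = np.array([[4.0, -1.0], [-1.0, 3.0]])
M = np.array([[2.0, 0.3], [0.3, 1.0]])

# eigh with a mass matrix returns eigenvectors already M-normalized:
# phi.T @ M @ phi = I
lam, phi = eigh(K, M)
assert np.allclose(phi.T @ M @ phi, np.eye(2), atol=1e-10)

# Residue route: near an eigenvalue lam_j,
# (lam_j - lam) * (K - lam*M)^{-1} -> phi_j phi_j^T as lam -> lam_j,
# so entries of the normalized mode follow (up to a common sign)
# without ever normalizing a computed mode.
j = 0
lam_near = lam[j] - 1e-7
R = (lam[j] - lam_near) * inv(K - lam_near * M)
outer = np.outer(phi[:, j], phi[:, j])
print(np.allclose(R, outer, atol=1e-4))
```

The numerical limit here stands in for the exact residue computation of the paper; the outer product is sign-invariant, which matches the arbitrariness of an eigenvector's sign.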

  18. High-resolution SNP array analysis of patients with developmental disorder and normal array CGH results

    Directory of Open Access Journals (Sweden)

    Siggberg Linda

    2012-09-01

    Full Text Available Background: Diagnostic analysis of patients with developmental disorders has improved over recent years, largely due to the use of microarray technology. Array methods that facilitate copy number analysis have enabled the diagnosis of up to 20% more patients with previously normal karyotyping results. A substantial number of patients remain undiagnosed, however. Methods and Results: Using the Genome-Wide Human SNP array 6.0, we analyzed 35 patients with a developmental disorder of unknown cause and normal array comparative genomic hybridization (array CGH) results, in order to characterize previously undefined genomic aberrations. We detected no seemingly pathogenic copy number aberrations. Most of the vast amount of data produced by the array was polymorphic and non-informative. Filtering of this data, based on copy number variant (CNV) population frequencies as well as phenotypically relevant genes, enabled pinpointing regions of allelic homozygosity that included candidate genes correlating to the phenotypic features in four patients, but results could not be confirmed. Conclusions: In this study, the use of an ultra-high-resolution SNP array did not contribute to the further diagnosis of patients with developmental disorders of unknown cause. The statistical power of these results is limited by the small size of the patient cohort, and interpretation of these negative results can only be applied to the patients studied here. We present the results of our study and the recurrence of clustered allelic homozygosity present in this material, as detected by the SNP 6.0 array.

  19. Normal-Gamma-Bernoulli Peak Detection for Analysis of Comprehensive Two-Dimensional Gas Chromatography Mass Spectrometry Data.

    Science.gov (United States)

    Kim, Seongho; Jang, Hyejeong; Koo, Imhoi; Lee, Joohyoung; Zhang, Xiang

    2017-01-01

    Compared to other analytical platforms, comprehensive two-dimensional gas chromatography coupled with mass spectrometry (GC×GC-MS) has much increased separation power for analysis of complex samples and thus is increasingly used in metabolomics for biomarker discovery. However, accurate peak detection remains a bottleneck for wide applications of GC×GC-MS. Therefore, the normal-exponential-Bernoulli (NEB) model is generalized by gamma distribution and a new peak detection algorithm using the normal-gamma-Bernoulli (NGB) model is developed. Unlike the NEB model, the NGB model has no closed-form analytical solution, hampering its practical use in peak detection. To circumvent this difficulty, three numerical approaches, which are fast Fourier transform (FFT), the first-order and the second-order delta methods (D1 and D2), are introduced. The applications to simulated data and two real GC×GC-MS data sets show that the NGB-D1 method performs the best in terms of both computational expense and peak detection performance.
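One of the numerical routes named in this abstract, the FFT approach, rests on the fact that the density of a normal-plus-gamma sum is the convolution of the two component densities, which can be evaluated on a grid with the FFT. A rough sketch with arbitrary parameters (not the paper's algorithm; grid-alignment details are glossed over, and only the normalization of the result is checked):

```python
import numpy as np
from scipy import stats

# The NGB model writes an observed intensity as z = normal noise + gamma
# signal. The sum has no closed-form density, so one numerical route is to
# convolve the two densities on a common grid via the FFT.
x = np.linspace(-5, 40, 4096)
dx = x[1] - x[0]
f_norm = stats.norm.pdf(x, loc=0, scale=1)       # baseline noise (illustrative)
f_gamma = stats.gamma.pdf(x, a=3.0, scale=2.0)   # signal component (illustrative)

# Discrete (circular) convolution via FFT approximates the density of the sum;
# the grid is wide enough that wrap-around mass is negligible.
f_sum = np.fft.irfft(np.fft.rfft(f_norm) * np.fft.rfft(f_gamma)) * dx
f_sum = np.maximum(f_sum, 0)

# The convolved density should still integrate to ~1
print(abs(f_sum.sum() * dx - 1.0) < 0.01)
```

In a full implementation the convolved density would be evaluated inside a likelihood, which is where the delta-method alternatives (D1, D2) trade accuracy for speed.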

  20. Identification of Optimal Reference Genes for Normalization of qPCR Analysis during Pepper Fruit Development

    Directory of Open Access Journals (Sweden)

    Yuan Cheng

    2017-06-01

    Full Text Available Due to its high sensitivity and reproducibility, quantitative real-time PCR (qPCR) is practiced as a useful research tool for targeted gene expression analysis. For qPCR operations, normalization with suitable reference genes (RGs) is a crucial step that ultimately determines the reliability of the obtained results. Although pepper is considered an ideal model plant for the study of non-climacteric fruit development, at present no specific RGs have been developed or validated for qPCR analyses of pepper fruit. Therefore, this study aimed to identify stably expressed genes for their potential use as RGs in pepper fruit studies. Initially, a total of 35 putative RGs were selected by mining the pepper transcriptome data sets derived from the PGP (Pepper Genome Platform) and PGD (Pepper Genome Database). Their expression stabilities were further measured by qPCR in a set of pepper (Capsicum annuum L. var. 007e) fruit samples representing four different fruit developmental stages (IM: immature; MG: mature green; B: break; MR: mature red). Then, based on the qPCR results, three different statistical algorithms, namely geNorm, NormFinder, and boxplot analysis, were chosen to evaluate the expression stabilities of these putative RGs. Ten genes proved to be qualified as RGs during pepper fruit development, namely CaREV05 (CA00g79660), CaREV08 (CA06g02180), CaREV09 (CA06g05650), CaREV16 (Capana12g002666), CaREV21 (Capana10g001439), CaREV23 (Capana05g000680), CaREV26 (Capana01g002973), CaREV27 (Capana11g000123), CaREV31 (Capana04g002411), and CaREV33 (Capana08g001826). Further analysis based on geNorm suggested that the application of the two most stably expressed genes (CaREV05 and CaREV08) would provide optimal transcript normalization in the qPCR experiments. Therefore, a new and comprehensive strategy for the identification of optimal RGs was developed, allowing effective normalization of qPCR data during pepper fruit development.
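geNorm, one of the three algorithms used in this abstract, ranks each candidate reference gene by its average pairwise variation M: the mean, over all other candidates, of the standard deviation of the log-ratio of the two genes across samples; a lower M means more stable expression. A toy sketch of that score with a fabricated expression matrix (not the pepper data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Fabricated expression matrix (samples x genes): two tightly co-varying
# "stable" candidates and one that fluctuates strongly across samples.
stable = 2 ** rng.normal(10, 0.05, (8, 1))
stable2 = stable * 2 ** rng.normal(0, 0.05, (8, 1))
noisy = 2 ** rng.normal(10, 1.0, (8, 1))
expr = np.hstack([stable, stable2, noisy])

def genorm_m(expr):
    """geNorm stability M per gene: mean std of pairwise log2 ratios."""
    logr = np.log2(expr)
    n = expr.shape[1]
    M = np.empty(n)
    for j in range(n):
        others = [k for k in range(n) if k != j]
        M[j] = np.mean([np.std(logr[:, j] - logr[:, k]) for k in others])
    return M

M = genorm_m(expr)
print(np.argmax(M))  # index of the least stable (highest-M) candidate
```

Iteratively dropping the highest-M gene and recomputing is how geNorm arrives at a best pair such as the CaREV05/CaREV08 combination reported above.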

  1. Quantitative analysis of spinal curvature in 3D: application to CT images of normal spine

    Energy Technology Data Exchange (ETDEWEB)

    Vrtovec, Tomaz; Likar, Bostjan; Pernus, Franjo [University of Ljubljana, Faculty of Electrical Engineering, Trzaska 25, SI-1000 Ljubljana (Slovenia)

    2008-04-07

    The purpose of this study is to present a framework for quantitative analysis of spinal curvature in 3D. In order to study the properties of such complex 3D structures, we propose two descriptors that capture the characteristics of spinal curvature in 3D. The descriptors are the geometric curvature (GC) and curvature angle (CA), which are independent of the orientation and size of spine anatomy. We demonstrate the two descriptors that characterize the spinal curvature in 3D on 30 computed tomography (CT) images of normal spine and on a scoliotic spine. The descriptors are determined from 3D vertebral body lines, which are obtained by two different methods. The first method is based on the least-squares technique that approximates the manually identified vertebra centroids, while the second method searches for vertebra centroids in an automated optimization scheme, based on computer-assisted image analysis. Polynomial functions of the fourth and fifth degree were used for the description of normal and scoliotic spinal curvature in 3D, respectively. The mean distance to vertebra centroids was 1.1 mm (±0.6 mm) for the first and 2.1 mm (±1.4 mm) for the second method. The distributions of GC and CA values were obtained along the 30 images of normal spine at each vertebral level and show that maximal thoracic kyphosis (TK), thoracolumbar junction (TJ) and maximal lumbar lordosis (LL) on average occur at T3/T4, T12/L1 and L4/L5, respectively. The main advantage of GC and CA is that the measurements are independent of the orientation and size of the spine, thus allowing objective intra- and inter-subject comparisons. The positions of maximal TK, TJ and maximal LL can be easily identified by observing the GC and CA distributions at different vertebral levels. The obtained courses of the GC and CA for the scoliotic spine were compared to the distributions of GC and CA for the normal spines. The significant difference in values indicates that the descriptors of GC and
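The geometric curvature descriptor described above follows from the fitted polynomial vertebral body line by the standard curvature formula for a parametric curve, κ = |r′ × r″| / |r′|³. A sketch on a synthetic curve parameterized by the spine axis (the centroid data and polynomial degree here are illustrative stand-ins):

```python
import numpy as np

P = np.polynomial.polynomial

# Synthetic "vertebral body line": x(z), y(z) fitted by 4th-degree
# polynomials of the spine axis z, as in the abstract's normal-spine case.
z = np.linspace(0.0, 1.0, 200)
px = P.polyfit(z, 0.10 * np.sin(2 * np.pi * z), 4)  # invented lateral deviation
py = P.polyfit(z, 0.05 * np.cos(2 * np.pi * z), 4)  # invented sagittal deviation

# First and second derivatives of r(z) = (x(z), y(z), z)
r1 = np.stack([P.polyval(z, P.polyder(px)),
               P.polyval(z, P.polyder(py)),
               np.ones_like(z)])
r2 = np.stack([P.polyval(z, P.polyder(px, 2)),
               P.polyval(z, P.polyder(py, 2)),
               np.zeros_like(z)])

# Geometric curvature: kappa = |r' x r''| / |r'|^3, evaluated along the line
kappa = (np.linalg.norm(np.cross(r1.T, r2.T), axis=1)
         / np.linalg.norm(r1.T, axis=1) ** 3)
print(kappa.shape)
```

Because κ is a differential-geometric quantity, it is unchanged by rotation and (after normalizing arc length) by scaling of the spine, which is the orientation- and size-independence the abstract emphasizes.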

  2. Quantitative analysis of spinal curvature in 3D: application to CT images of normal spine

    International Nuclear Information System (INIS)

    Vrtovec, Tomaz; Likar, Bostjan; Pernus, Franjo

    2008-01-01

    The purpose of this study is to present a framework for quantitative analysis of spinal curvature in 3D. In order to study the properties of such complex 3D structures, we propose two descriptors that capture the characteristics of spinal curvature in 3D. The descriptors are the geometric curvature (GC) and curvature angle (CA), which are independent of the orientation and size of spine anatomy. We demonstrate the two descriptors that characterize the spinal curvature in 3D on 30 computed tomography (CT) images of normal spine and on a scoliotic spine. The descriptors are determined from 3D vertebral body lines, which are obtained by two different methods. The first method is based on the least-squares technique that approximates the manually identified vertebra centroids, while the second method searches for vertebra centroids in an automated optimization scheme, based on computer-assisted image analysis. Polynomial functions of the fourth and fifth degree were used for the description of normal and scoliotic spinal curvature in 3D, respectively. The mean distance to vertebra centroids was 1.1 mm (±0.6 mm) for the first and 2.1 mm (±1.4 mm) for the second method. The distributions of GC and CA values were obtained along the 30 images of normal spine at each vertebral level and show that maximal thoracic kyphosis (TK), thoracolumbar junction (TJ) and maximal lumbar lordosis (LL) on average occur at T3/T4, T12/L1 and L4/L5, respectively. The main advantage of GC and CA is that the measurements are independent of the orientation and size of the spine, thus allowing objective intra- and inter-subject comparisons. The positions of maximal TK, TJ and maximal LL can be easily identified by observing the GC and CA distributions at different vertebral levels. The obtained courses of the GC and CA for the scoliotic spine were compared to the distributions of GC and CA for the normal spines. The significant difference in values indicates that the descriptors of GC and CA

  3. A branching process model for the analysis of abortive colony size distributions in carbon ion-irradiated normal human fibroblasts

    International Nuclear Information System (INIS)

    Sakashita, Tetsuya; Kobayashi, Yasuhiko; Hamada, Nobuyuki; Kawaguchi, Isao; Hara, Takamitsu; Saito, Kimiaki

    2014-01-01

    A single cell can form a colony, and ionizing irradiation has long been known to reduce such a cellular clonogenic potential. Analysis of abortive colonies unable to continue to grow should provide important information on the reproductive cell death (RCD) following irradiation. Our previous analysis with a branching process model showed that the RCD in normal human fibroblasts can persist over 16 generations following irradiation with low linear energy transfer (LET) γ-rays. Here we further set out to evaluate the RCD persistency in abortive colonies arising from normal human fibroblasts exposed to high-LET carbon ions (18.3 MeV/u, 108 keV/μm). We found that the abortive colony size distribution determined by biological experiments follows a linear relationship on the log–log plot, and that a Monte Carlo simulation using the RCD probability estimated from such a linear relationship well reproduces the experimentally determined surviving fraction and the relative biological effectiveness (RBE). We identified the short-term phase and long-term phase for the persistent RCD following carbon-ion irradiation, which were similar to those previously identified following γ-irradiation. Taken together, our results suggest that subsequent secondary or tertiary colony formation would be invaluable for understanding the long-lasting RCD. Altogether, our framework for analysis with a branching process model and a colony formation assay is applicable to the determination of cellular responses to low- and high-LET radiation, and suggests that the long-lasting RCD is a pivotal determinant of the surviving fraction and the RBE. (author)
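The branching-process view in this abstract can be mimicked with a toy Galton-Watson simulation: each cell either dies with a reproductive-cell-death probability or divides, generation by generation, and colonies that never reach a size threshold count as abortive. The probability, threshold, and colony count below are arbitrary illustration values, not the paper's estimates:

```python
import random

# Galton-Watson-style sketch of reproductive cell death (RCD): each cell
# dies with probability p_rcd or divides into two, for a fixed number of
# generations. Colonies below a size threshold are scored as "abortive".
def grow_colony(p_rcd, generations=16, rng=random.Random(42)):
    cells = 1
    for _ in range(generations):
        cells = sum(0 if rng.random() < p_rcd else 2 for _ in range(cells))
        if cells == 0:
            break  # extinct (abortive) lineage
    return cells

sizes = [grow_colony(0.3) for _ in range(500)]
surviving = sum(s >= 50 for s in sizes) / len(sizes)
print(0.0 < surviving < 1.0)
```

In the paper's framework the RCD probability is estimated from the measured abortive colony size distribution and can vary by generation; plugging it into a simulation like this is what lets the surviving fraction and RBE be reproduced.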

  4. Analysis of serum and cerebrospinal fluid in clinically normal adult miniature donkeys.

    Science.gov (United States)

    Mozaffari, A A; Samadieh, H

    2013-09-01

    To establish reference intervals for serum and cerebrospinal fluid (CSF) parameters in clinically healthy adult miniature donkeys. Experiments were conducted on 10 female and 10 male clinically normal adult miniature donkeys, randomly selected from five herds. Lumbosacral CSF collection was performed with the sedated donkey in the standing position. Cell analysis was performed immediately after the samples were collected. Blood samples were obtained from the jugular vein immediately after CSF sample collection. Sodium, potassium, glucose, urea nitrogen, total protein, calcium, chloride, phosphorous and magnesium concentrations were measured in CSF and serum samples. A paired t-test was used to compare mean values between female and male donkeys. The CSF was uniformly clear, colourless and free from flocculent material, with a specific gravity of 1.002. The range of total nucleated cell counts was 2-4 cells/μL. The differential white cell count comprised only small lymphocytes. No erythrocytes or polymorphonuclear cells were observed on cytological examination. Reference values were obtained for biochemical analysis of serum and CSF. Gender had no effect on any variables measured in serum or CSF (p>0.05). CSF analysis can provide important information in addition to that gained by clinical examination. CSF analysis has not previously been performed in miniature donkeys; this is the first report on the subject. In the present study, reference intervals for total nucleated cell count, total protein, glucose, urea nitrogen, sodium, potassium, chloride, calcium, phosphorous and magnesium concentrations of serum and CSF were determined for male and female miniature donkeys.

  5. Strain-based finite elements for the analysis of cylinders with holes and normally intersecting cylinders

    International Nuclear Information System (INIS)

    Sabir, A.B.

    1983-01-01

    A finite element solution is given to the problems of stress distribution for cylindrical shells with circular and elliptical holes, and also for normally intersecting thin elastic cylindrical shells. Quadrilateral and triangular curved finite elements are used in the analysis. The elements are of a new class, based on simple independent generalised strain functions insofar as this is allowed by the compatibility equations. The elements also satisfy exactly the requirements of strain-free rigid-body displacements and use only the external 'geometrical' nodal degrees of freedom, to avoid the difficulties associated with unnecessary internal degrees of freedom. We first develop strain-based quadrilateral and triangular elements and apply them to the solution of the problem of stress concentrations in the neighbourhood of small and large circular and elliptical holes when the cylinders are subjected to a uniform axial tension. These results are compared with analytical solutions based on shallow shell approximations and show that the use of these strain-based elements obviates the need for an inordinately large number of elements. Normally intersecting cylinders are common configurations in structural components for nuclear reactor systems, and design information for such configurations is generally lacking. The opportunity is taken in the present paper to provide a finite element solution to this problem. A method of substructuring is introduced to enable the solution of the large non-banded set of simultaneous equations encountered. (orig./HP)

  6. Genome-wide expression analysis comparing hypertrophic changes in normal and dysferlinopathy mice

    Directory of Open Access Journals (Sweden)

    Yun-Sil Lee

    2015-12-01

    Full Text Available Because myostatin normally limits skeletal muscle growth, there are extensive efforts to develop myostatin inhibitors for clinical use. One potential concern is that in muscle degenerative diseases, inducing hypertrophy may increase stress on dystrophic fibers. Our study shows that blocking this pathway in dysferlin-deficient mice results in early improvement in histopathology but ultimately accelerates muscle degeneration. Hence, the benefits of this approach should be weighed against these potential detrimental effects. Here, we present detailed experimental methods and analysis for the gene expression profiling described in our recently published study in Human Molecular Genetics (Lee et al., 2015). Our data sets have been deposited in the Gene Expression Omnibus (GEO) database (GSE62945) and are available at http://www.ncbi.nlm.nih.gov/geo/query/acc.cgi?acc=GSE62945. Our data provide a resource for exploring molecular mechanisms that are related to hypertrophy-induced, accelerated muscular degeneration in dysferlinopathy.

  7. Analysis of epothilone B-induced cell death in normal ovarian cells.

    Science.gov (United States)

    Rogalska, Aneta; Gajek, Arkadiusz; Marczak, Agnieszka

    2013-12-01

    We have investigated the mode of cell death induced by a new microtubule-stabilizing agent, epothilone B (EpoB, patupilone), and a clinically used medicine, paclitaxel (PTX), in normal ovarian cells. Using fluorescence microscopy, polyacrylamide gel electrophoresis followed by Western blot analysis, as well as spectrofluorimetric and colorimetric detection, we demonstrate that, compared to EpoB, PTX induced marked time-dependent morphological and biochemical changes typical of apoptosis. Induction of apoptosis followed an early increase in p53 levels. Apoptosis reached its maximum at 24-48 h. At the same time, there was a significant increase in caspase-9 and -3 activity and PARP fragmentation, which suggests that an intrinsic pathway was involved. PTX increased apoptosis in MM14 cells more than EpoB did, and also induced more of the necrosis responsible for inflammation (1.4-fold more than EpoB). © 2013 International Federation for Cell Biology.

  8. Morphometric analysis of three normal facial types in mixed dentition using posteroanterior cephalometric radiographs: preliminary results

    Directory of Open Access Journals (Sweden)

    Renato Bigliazzi

    2017-08-01

    Full Text Available The aim of the present investigation was to evaluate the craniofacial features of subjects with normal occlusion and different vertical patterns in the mixed dentition, using morphometric analysis (Thin-Plate Spline analysis, TPS) applied to posteroanterior (PA) films. The sample comprised 39 individuals (18 females and 21 males), all in mixed dentition, aged from 8.4 to 10 years, with satisfactory occlusion and balanced profile and with no history of orthodontic or facial orthopedic treatment. The sample was divided into three groups (mesofacial, brachyfacial and dolichofacial) according to the facial types proposed by Ricketts (1989). The average craniofacial configurations of each study group were obtained by orthogonal Procrustes superimposition, thereby eliminating size differences and allowing only shape differences between groups to be analyzed by viewing the TPS deformation grid. Significant differences were found among the three facial types, but were more remarkable between mesofacials and dolichofacials than between mesofacials and brachyfacials. TPS morphometric analysis proved efficient for accurate visualization of transverse and vertical differences among facial types even before the pubertal growth spurt. These differences cannot be easily detected by traditional posteroanterior cephalometry.
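The Procrustes superimposition step described in this abstract, which removes position, size, and orientation differences so that only shape remains, is available directly in SciPy. A toy check that a scaled, shifted copy of a landmark configuration superimposes exactly (the landmark coordinates are invented):

```python
import numpy as np
from scipy.spatial import procrustes

# Toy 2D landmark sets: shape_b is a scaled, translated copy of shape_a,
# so the two differ in size and position but not in shape.
shape_a = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.5, 1.5]])
shape_b = 3.0 * shape_a + np.array([10.0, -2.0])

# procrustes standardizes both configurations (translation, scale, rotation)
# and reports the residual shape difference as "disparity".
mtx1, mtx2, disparity = procrustes(shape_a, shape_b)
print(disparity < 1e-12)
```

A disparity near zero confirms the two configurations are the same shape; on real cephalometric landmarks, the nonzero residual is what the TPS deformation grid then visualizes.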

  9. Normal conditions of transport thermal analysis and testing of a Type B drum package

    International Nuclear Information System (INIS)

    Jerrell, J.W.; Alstine, M.N. van; Gromada, R.J.

    1995-01-01

    Increasing the content limits of radioactive material packagings can save money and increase transportation safety by decreasing the total number of shipments required to transport large quantities of material. The contents of drum packages can be limited by unacceptable containment vessel pressures and temperatures due to the thermal properties of the insulation. The purpose of this work is to understand and predict the effects of insulation properties on containment system performance. The Type B shipping container used in the study is a double-containment fiberboard drum package. The package is primarily used to transport uranium and plutonium metals and oxides. A normal conditions of transport (NCT) thermal test was performed to benchmark an NCT analysis of the package. A 21 W heater was placed in an instrumented package to simulate the maximum source decay heat. The package reached thermal equilibrium 120 hours after the heater was turned on. Testing took place indoors to minimize ambient temperature fluctuations. The thermal analysis of the package used fiberboard properties reported in the literature and resulted in temperatures significantly greater than those measured during the test. Details of the NCT test are described, and transient temperatures at key thermocouple locations within the package are presented. Analytical results using nominal fiberboard properties are presented, along with explanations of the results and of the attempt to benchmark the analysis. The discovery that fiberboard has an anisotropic thermal conductivity, and its effect on thermal performance, is also discussed.

  10. Normal Mode Analysis in Zeolites: Toward an Efficient Calculation of Adsorption Entropies.

    Science.gov (United States)

    De Moor, Bart A; Ghysels, An; Reyniers, Marie-Françoise; Van Speybroeck, Veronique; Waroquier, Michel; Marin, Guy B

    2011-04-12

    An efficient procedure for normal-mode analysis of extended systems, such as zeolites, is developed and illustrated for the physisorption and chemisorption of n-octane and isobutene in H-ZSM-22 and H-FAU using periodic DFT calculations employing the Vienna Ab Initio Simulation Package. Physisorption and chemisorption entropies resulting from partial Hessian vibrational analysis (PHVA) differ by at most 10 J mol⁻¹ K⁻¹ from those resulting from full Hessian vibrational analysis, even for PHVA schemes in which only a very limited number of atoms are considered free. To acquire a well-conditioned Hessian, much tighter optimization criteria than commonly used for electronic energy calculations in zeolites are required, i.e., at least an energy cutoff of 400 eV, a maximum force of 0.02 eV/Å, and a self-consistent field loop convergence criterion of 10⁻⁸ eV. For loosely bonded complexes the mobile adsorbate method is applied, in which frequency contributions originating from translational or rotational motions of the adsorbate are removed from the total partition function and replaced by free translational and/or rotational contributions. The frequencies corresponding to these translational and rotational modes can be selected unambiguously based on a mobile block Hessian-PHVA calculation, allowing the prediction of physisorption entropies within an accuracy of 10-15 J mol⁻¹ K⁻¹ as compared to experimental values. The approach presented in this study is useful for studies on other extended catalytic systems.

  11. Normalization to specific gravity prior to analysis improves information recovery from high resolution mass spectrometry metabolomic profiles of human urine.

    Science.gov (United States)

    Edmands, William M B; Ferrari, Pietro; Scalbert, Augustin

    2014-11-04

    Extraction of meaningful biological information from urinary metabolomic profiles obtained by liquid chromatography coupled to mass spectrometry (MS) necessitates the control of unwanted sources of variability associated with large differences in urine sample concentrations. Different methods of normalization, either before analysis (preacquisition normalization) through dilution of urine samples to the lowest specific gravity measured by refractometry, or after analysis (postacquisition normalization) to urine volume, specific gravity, and median fold change, are compared for their capacity to recover lead metabolites for potential future use as dietary biomarkers. Twenty-four urine samples of 19 subjects from the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort were selected based on their high and low/non-consumption of six polyphenol-rich foods as assessed with a 24 h dietary recall. MS features selected on the basis of minimum discriminant selection criteria were related to each dietary item by means of orthogonal partial least-squares discriminant analysis models. Normalization methods ranked in the following decreasing order when comparing the number of total discriminant MS features recovered with that obtained in the absence of normalization: preacquisition normalization to specific gravity (4.2-fold), postacquisition normalization to specific gravity (2.3-fold), postacquisition median fold change normalization (1.8-fold), postacquisition normalization to urinary volume (0.79-fold). A preventative preacquisition normalization based on urine specific gravity was found to be superior to all curative postacquisition normalization methods tested for discovery of MS features discriminant of dietary intake in these urinary metabolomic datasets.
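    The two specific-gravity normalizations can be sketched as follows. The scaling convention (SG − 1 taken as proportional to total solute concentration) is a common assumption in urine metabolomics, and the SG values are illustrative, not taken from the paper.

```python
import numpy as np

def sg_dilution_factor(sg_sample, sg_lowest):
    """Pre-acquisition: fold dilution that brings a urine sample down to the
    lowest specific gravity in the batch. Assumes SG - 1 is proportional to
    total solute concentration (a common convention, not the paper's text)."""
    return (sg_sample - 1.0) / (sg_lowest - 1.0)

def sg_normalize(intensities, sg_sample, sg_reference):
    """Post-acquisition: rescale feature intensities to a reference SG."""
    return np.asarray(intensities, float) * (sg_reference - 1.0) / (sg_sample - 1.0)

factor = sg_dilution_factor(1.030, 1.010)            # dilute this sample 3-fold
corrected = sg_normalize([100.0, 40.0], 1.020, 1.010)  # halve a 2x-concentrated sample
```

    Pre-acquisition dilution standardizes what the instrument actually sees, which is why the paper finds it recovers more discriminant features than curative rescaling after the run.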

  12. All-atom normal-mode analysis reveals an RNA-induced allostery in a bacteriophage coat protein.

    Science.gov (United States)

    Dykeman, Eric C; Twarock, Reidun

    2010-03-01

    Assembly of the T=3 bacteriophage MS2 is initiated by the binding of a 19-nucleotide RNA stem-loop from within the phage genome to a symmetric coat protein dimer. This binding event effects a folding of the FG loop in one of the protein subunits of the dimer and results in the formation of an asymmetric dimer. Since both the symmetric and asymmetric forms of the dimer are needed for the assembly of the protein container, this allosteric switch plays an important role in the life cycle of the phage. We provide here details of an all-atom normal-mode analysis of this allosteric effect. The results suggest that asymmetric contacts between the A-duplex RNA phosphodiester backbone of the stem-loop and the EF loop in one coat protein subunit result in an increased dynamic behavior of its FG loop. The four lowest-frequency modes, which encompass motions predominantly on the FG loops, account for over 90% of the increased dynamic behavior due to a localization of the vibrational pattern on a single FG loop. Finally, we show that an analysis of the allosteric effect using an elastic network model fails to predict this localization effect, highlighting the importance of using an all-atom full force field method for this problem.

  13. Combined analysis of magnetic and gravity anomalies using normalized source strength (NSS)

    Science.gov (United States)

    Li, L.; Wu, Y.

    2017-12-01

    Gravity and magnetic fields are potential fields, so their inversion is inherently non-unique. Combined analysis of magnetic and gravity anomalies based on Poisson's relation is used to identify homologous gravity and magnetic anomalies and to reduce this ambiguity. The traditional combined analysis uses linear regression of the reduction-to-pole (RTP) magnetic anomaly against the first-order vertical derivative of the gravity anomaly, and provides a quantitative or semi-quantitative interpretation by calculating the correlation coefficient, slope, and intercept. In this calculation, owing to the effect of remanent magnetization, the RTP anomaly still contains the effect of oblique magnetization; in that case, homologous gravity and magnetic anomalies appear uncorrelated in the linear regression. The normalized source strength (NSS), which can be computed from the magnetic tensor matrix, is insensitive to remanence. Here we present a new combined analysis using the NSS. Based on Poisson's relation, the gravity tensor matrix can be transformed into a pseudomagnetic tensor matrix magnetized along the geomagnetic field direction under the homologous-source condition. The NSS of the pseudomagnetic tensor matrix and of the original magnetic tensor matrix are calculated and linear regression analysis is carried out. The resulting correlation coefficient, slope, and intercept indicate the homology level, Poisson's ratio, and the distribution of remanence, respectively. We test the approach on a synthetic model under complex magnetization; the results show that it can still distinguish a common source under strong remanence and establish Poisson's ratio. Finally, the approach is applied to field data from China, and the results demonstrate that it is feasible.
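    The regression step of the combined analysis (correlation coefficient, slope, intercept) can be illustrated on synthetic values. The proportionality constant standing in for Poisson's ratio and the noise level are hypothetical; real NSS grids would come from the tensor transforms described above.

```python
import numpy as np
from scipy.stats import linregress

# Synthetic homologous case: the NSS derived from the pseudomagnetic tensor
# (via gravity and Poisson's relation) is proportional to the NSS of the
# measured magnetic tensor, plus noise. Values are illustrative only.
rng = np.random.default_rng(42)
nss_magnetic = rng.lognormal(mean=0.0, sigma=0.5, size=500)   # NSS is non-negative
poisson_ratio = 2.5                                            # hypothetical ratio
nss_pseudomagnetic = poisson_ratio * nss_magnetic + rng.normal(0.0, 0.05, size=500)

fit = linregress(nss_magnetic, nss_pseudomagnetic)
# fit.rvalue    -> homology level (near 1 for a common source)
# fit.slope     -> estimate of Poisson's ratio
# fit.intercept -> offset (near 0 in this noise-only example)
```

    For truly homologous sources the correlation stays high even under strong remanence, because the NSS discards the remanence-induced obliquity that breaks the traditional RTP regression.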

  14. Quantitative Analysis of Differential Proteome Expression in Bladder Cancer vs. Normal Bladder Cells Using SILAC Method.

    Directory of Open Access Journals (Sweden)

    Ganglong Yang

    The best way to increase patient survival rate is to identify patients who are likely to progress to muscle-invasive or metastatic disease upfront and treat them more aggressively. The human cell lines HCV29 (normal bladder epithelia), KK47 (low-grade nonmuscle-invasive bladder cancer, NMIBC), and YTS1 (metastatic bladder cancer) have been widely used in studies of molecular mechanisms and cell signaling during bladder cancer (BC) progression. However, little attention has been paid to global quantitative proteome analysis of these three cell lines. We labeled HCV29, KK47, and YTS1 cells by the SILAC method using three stable isotopes each of arginine and lysine. Labeled proteins were analyzed by 2D ultrahigh-resolution liquid chromatography coupled to LTQ Orbitrap mass spectrometry. Among 3721 unique proteins identified and annotated in KK47 and YTS1 cells, 36 were significantly upregulated and 74 were significantly downregulated with >95% confidence. Differential expression of these proteins was confirmed by western blotting, quantitative RT-PCR, and cell staining with specific antibodies. Gene ontology (GO) term and pathway analysis indicated that the differentially regulated proteins were involved in DNA replication and molecular transport, cell growth and proliferation, cellular movement, immune cell trafficking, and cell death and survival. These proteins and the advanced proteome techniques described here will be useful for further elucidation of molecular mechanisms in BC and other types of cancer.

  15. Detecting and correcting for publication bias in meta-analysis - A truncated normal distribution approach.

    Science.gov (United States)

    Zhu, Qiaohao; Carriere, K C

    2016-01-01

    Publication bias can significantly limit the validity of meta-analysis when trying to draw conclusions about a research question from independent studies. Most research on detecting and correcting for publication bias in meta-analysis focuses mainly on funnel-plot-based methodologies or selection models. In this paper, we formulate publication bias as a truncated distribution problem and propose new parametric solutions. We develop methodologies for estimating the underlying overall effect size and the severity of publication bias. We distinguish two major situations in which publication bias may be induced: (1) by small effect size or (2) by large p-value. We consider both fixed- and random-effects models, and derive estimators for the overall mean and the truncation proportion; these estimators are obtained using maximum likelihood estimation and the method of moments under the fixed- and random-effects models, respectively. We carried out extensive simulation studies to evaluate the performance of our methodology and to compare it with the non-parametric trim-and-fill method based on the funnel plot. We find that our methods based on the truncated normal distribution perform consistently well, both in detecting and in correcting publication bias under various situations.
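    A minimal sketch of the truncation idea under the small-effect-size scenario: if only effects above a cutoff get published, the naive mean is biased upward, but fitting a normal truncated below at the cutoff by maximum likelihood recovers the underlying mean. The cutoff, effect values, and estimator details below are illustrative assumptions, not the authors' exact fixed-effects formulation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def fit_truncated_normal(x, cutoff):
    """MLE of (mu, sigma) for a normal distribution truncated below at `cutoff`,
    i.e., only observations with x > cutoff are ever observed ('published')."""
    x = np.asarray(x, float)

    def negloglik(params):
        mu, log_sigma = params
        sigma = np.exp(log_sigma)          # parameterize to keep sigma > 0
        # log density of N(mu, sigma) conditioned on X > cutoff
        ll = norm.logpdf(x, mu, sigma) - norm.logsf(cutoff, mu, sigma)
        return -ll.sum()

    res = minimize(negloglik, x0=[x.mean(), np.log(x.std())], method="Nelder-Mead")
    return res.x[0], np.exp(res.x[1])

# Simulated literature: true overall effect 0.3, but only effects > 0.2 appear
rng = np.random.default_rng(0)
effects = rng.normal(0.3, 0.5, 20000)
published = effects[effects > 0.2]
naive_mean = published.mean()                        # biased upward by truncation
mu_hat, sigma_hat = fit_truncated_normal(published, cutoff=0.2)
```

    The truncation proportion can then be estimated from the fitted parameters as the normal tail mass below the cutoff, which is the quantity the paper uses to gauge the severity of publication bias.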

  16. Reliability analysis of meteorological data registered during nuclear power plant normal operation

    International Nuclear Information System (INIS)

    Amado, V.; Ulke, A.; Marino, B.; Thomas, L.

    2011-01-01

    The atmosphere is the environment in which gaseous radioactive discharges from nuclear power plants are transported. It is therefore essential to have reliable meteorological information to characterize dispersion and to feed models for evaluating radiological environmental impact, both during normal operation of the plant and during accidental releases. In this way it is possible to determine the effects on the environment and on humans. The basic data needed to represent the local weather adequately include air temperature, wind speed and direction, rainfall, humidity, and pressure. On the other hand, specific data consistent with the model used are required to determine the turbulence, for instance, radiation, cloud cover, and vertical temperature gradient. It is important that the recorded data be representative of the local meteorology. This requires, first, properly placed instruments that are kept in operation and maintained on a regular basis. Second, but equally important, a thorough analysis of data reliability must be performed prior to storage and/or processing. In this paper we present the main criteria to consider when choosing the location of a meteorological tower in the area of a nuclear power plant and propose a methodology for assessing the reliability of the recorded data. The methodology was developed from the analysis of meteorological data registered at nuclear power plants in Argentina. (authors)

  17. Normal left ventricular emptying in coronary artery disease at rest: analysis by radiographic and equilibrium radionuclide ventriculography

    International Nuclear Information System (INIS)

    Denenberg, B.S.; Makler, P.T.; Bove, A.A.; Spann, J.F.

    1981-01-01

    The volume ejected early in systole has been proposed as an indicator of abnormal left ventricular function that is present at rest in patients with coronary artery disease who have a normal ejection fraction and normal wall motion. The volume ejected in systole was examined by calculating the percent change in ventricular volume using both computer-assisted analysis of biplane radiographic ventriculograms at 60 frames/s and equilibrium gated radionuclide ventriculograms. Ventricular emptying was examined with radiographic ventriculography in 33 normal subjects and 23 patients with coronary artery disease and a normal ejection fraction. Eight normal subjects and six patients with coronary artery disease had both radiographic ventriculography and equilibrium gated radionuclide ventriculography. In all patients, there was excellent correlation between the radiographic and radionuclide ventricular emptying curves (r = 0.971). There was no difference in the ventricular emptying curves of normal subjects and patients with coronary artery disease whether volumes were measured by radiographic or equilibrium gated radionuclide ventriculography. It is concluded that the resting ventricular emptying curves are identical in normal subjects and patients with coronary artery disease who have a normal ejection fraction and normal wall motion.

  18. Biomechanical Analysis of Normal Brain Development during the First Year of Life Using Finite Strain Theory

    OpenAIRE

    Kim, Jeong Chul; Wang, Li; Shen, Dinggang; Lin, Weili

    2016-01-01

    The first year of life is the most critical time period for structural and functional development of the human brain. Combining longitudinal MR imaging and finite strain theory, this study aimed to provide new insights into normal brain development through a biomechanical framework. Thirty-three normal infants were longitudinally imaged using MRI from 2 weeks to 1 year of age. Voxel-wise Jacobian determinant was estimated to elucidate volumetric changes while Lagrange strains (both normal and...

  19. Goodness-of-Fit Tests for Generalized Normal Distribution for Use in Hydrological Frequency Analysis

    Science.gov (United States)

    Das, Samiran

    2018-04-01

    The use of the three-parameter generalized normal (GNO) as a hydrological frequency distribution is well recognized, but its application is limited due to the unavailability of popular goodness-of-fit (GOF) test statistics. This study develops popular empirical distribution function (EDF)-based test statistics to investigate the goodness-of-fit of the GNO distribution. The focus is on the case most relevant to the hydrologist, namely, that in which the parameter values are unknown and estimated from a sample using the method of L-moments. The widely used EDF tests such as Kolmogorov-Smirnov, Cramér-von Mises, and Anderson-Darling (AD) are considered in this study. A modified version of AD, namely, the Modified Anderson-Darling (MAD) test, is also considered, and its performance is assessed against the other EDF tests using a power study that incorporates six specific Wakeby distributions (WA-1, WA-2, WA-3, WA-4, WA-5, and WA-6) as the alternative distributions. The critical values of the proposed test statistics are approximated using Monte Carlo techniques and are summarized in chart and regression-equation form to show their dependence on shape parameter and sample size. The power study suggests that the AD and a variant of the MAD (MAD-L) are the most powerful tests. Finally, the study performs case studies involving annual maximum flow data of selected gauged sites from Irish and US catchments to show the application of the derived critical values, and recommends further assessments to be carried out on flow data sets of rivers with various hydrological regimes.
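    The Monte Carlo approximation of critical values can be shown in miniature. Here a plain normal with moment estimates stands in for the GNO with L-moment estimates (an illustrative simplification): because the parameters are fitted to the same sample being tested, the KS statistic is systematically smaller, so the Monte Carlo critical value falls below the classical fully-specified one.

```python
import numpy as np
from scipy import stats

def mc_ks_critical_value(n, alpha=0.05, n_sim=5000, seed=1):
    """Monte Carlo critical value of the KS statistic for normality when the
    mean and standard deviation are estimated from the tested sample itself
    (the Lilliefors setting)."""
    rng = np.random.default_rng(seed)
    d = np.empty(n_sim)
    for i in range(n_sim):
        x = rng.standard_normal(n)
        # test against the normal with parameters estimated from x itself
        d[i] = stats.kstest(x, "norm", args=(x.mean(), x.std(ddof=1))).statistic
    return np.quantile(d, 1.0 - alpha)

n = 50
estimated_cv = mc_ks_critical_value(n)
known_cv = 1.358 / np.sqrt(n)   # asymptotic 5% value for fully specified parameters
```

    Using `known_cv` with estimated parameters would make the test far too conservative, which is why the study tabulates simulated critical values as functions of shape parameter and sample size.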

  20. Stress analysis of biomass fuel molding machine piston type stamping forming cone

    Directory of Open Access Journals (Sweden)

    Wu Gaofeng

    2015-01-01

    This paper takes a piston-type biomass straw molding machine as the object of analysis and systematically analyzes the stresses in the forming cone during the molding process. To improve the performance of the forming sleeve, a ceramic material was used in place of wear-resistant cast iron. The structure of the forming sleeve was analyzed with the mechanical module of Pro/ENGINEER. The results indicate that the scheme is feasible, and a sensitivity analysis identified a suitable angle for the sleeve.

  1. Use of SAMC for Bayesian analysis of statistical models with intractable normalizing constants

    KAUST Repository

    Jin, Ick Hoon

    2014-03-01

    Statistical inference for the models with intractable normalizing constants has attracted much attention. During the past two decades, various approximation- or simulation-based methods have been proposed for the problem, such as the Monte Carlo maximum likelihood method and the auxiliary variable Markov chain Monte Carlo methods. The Bayesian stochastic approximation Monte Carlo algorithm specifically addresses this problem: It works by sampling from a sequence of approximate distributions with their average converging to the target posterior distribution, where the approximate distributions can be achieved using the stochastic approximation Monte Carlo algorithm. A strong law of large numbers is established for the Bayesian stochastic approximation Monte Carlo estimator under mild conditions. Compared to the Monte Carlo maximum likelihood method, the Bayesian stochastic approximation Monte Carlo algorithm is more robust to the initial guess of model parameters. Compared to the auxiliary variable MCMC methods, the Bayesian stochastic approximation Monte Carlo algorithm avoids the requirement for perfect samples, and thus can be applied to many models for which perfect sampling is not available or very expensive. The Bayesian stochastic approximation Monte Carlo algorithm also provides a general framework for approximate Bayesian analysis. © 2012 Elsevier B.V. All rights reserved.

  2. Saturation analysis studies of corticosteroid levels in normal Greek subjects and in subjects with haemolytic diseases

    International Nuclear Information System (INIS)

    Vyzantiadis, A.

    1975-07-01

    Between 1970 and 1974 a saturation analysis for cortisol in plasma and free cortisol in urine, and a radioimmunoassay method for aldosterone in plasma and urine, were developed. In order to permit a comparative evaluation it was necessary to study corticosteroids, their diurnal rhythm, and the probable effect of a siesta on this rhythm both in normal subjects and in patients suffering from hemic diseases, in particular from sickle-cell anemia. Saturation assay for cortisol, using serum from pregnant women as the source of transcortin, and radioimmunoassay for aldosterone were the basic methods used. Serum cortisol was estimated twice a day (8-9 a.m. and 5-6 p.m.). Cortisol and aldosterone were also estimated in serum and in urine before and after adrenal stimulation with ACTH. No significant influence of a siesta on the diurnal rhythm of cortisol was observed, nor did serum cortisol levels or the diurnal rhythm appear affected in congenital hemolytic anemias following adrenal stimulation. The report lists experimental results briefly and refers to a paper in which these are published in more detail.

  3. Accretion onto magnetized neutron stars: Normal mode analysis of the interchange instability at the magnetopause

    International Nuclear Information System (INIS)

    Arons, J.; Lea, S.M.

    1976-01-01

    We describe the results of a linearized hydromagnetic stability analysis of the magnetopause of an accreting neutron star. The magnetosphere is assumed to be slowly rotating, and the plasma just outside of the magnetopause is assumed to be weakly magnetized. The plasma layer is assumed to be bounded above by a shock wave, and to be thin compared with the radius of the magnetosphere. Under these circumstances, the growing modes are shown to be localized in the direction parallel to the zero-order magnetic field. The structure of the modes is still similar to the flute mode, however. The growth rate at each magnetic latitude λ is given by γ² = g_n k α_eff(λ) tanh[k z_s(λ)], where g_n is the magnitude of the gravitational acceleration normal to the surface, k ≈ |m|/[R(λ) cos λ], |m| is the azimuthal mode number, R(λ) is the radius of the magnetosphere, z_s is the height of the shock above the magnetopause, and α_eff(λ) < 1 is the effective Atwood number, which embodies the stabilizing effects of favorable curvature and magnetic tension. We calculate α_eff(λ), and also discuss the stabilizing effects of viscosity and of aligned flow parallel to the magnetopause.

  4. Instantaneous normal mode analysis for intermolecular and intramolecular vibrations of water from atomic point of view.

    Science.gov (United States)

    Chen, Yu-Chun; Tang, Ping-Han; Wu, Ten-Ming

    2013-11-28

    By exploiting the instantaneous normal mode (INM) analysis for models of flexible molecules, we investigate intermolecular and intramolecular vibrations of water from the atomic point of view. With two flexible SPC/E models, our investigations cover three aspects of their INM spectra, which are separated into the unstable, intermolecular, bending, and stretching bands. First, the O- and H-atom contributions in the four INM bands are calculated and their stable INM spectra are compared with the power spectra of the atomic velocity autocorrelation functions. The unstable and intermolecular bands of the flexible models are also compared with those of the SPC/E model of rigid molecules. Second, we formulate the inverse participation ratio (IPR) of the INMs for the O atom, the H atom, and the molecule, respectively. With the IPRs, the numbers of each of the three species participating in the INMs are estimated, so that the localization characters of the INMs in each band can be studied. Further, from the ratio of the IPR of the H atom to that of the O atom, we estimate the number of OH bonds per molecule involved in the INMs. Third, by classifying simulated molecules into subensembles according to the geometry of their local environments or their H-bond configurations, we examine the local-structure effects on the bending and stretching INM bands. All of our results are verified to be insensitive to the definition of the H-bond. Our conclusions about the intermolecular and intramolecular vibrations in water are given.
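    The IPR referred to above can be sketched in a few lines. One common convention is used here (the reciprocal of the sum of fourth powers of the normalized eigenvector components; some authors define the inverse), so the exact normalization is an assumption rather than the paper's formula.

```python
import numpy as np

def inverse_participation_ratio(mode):
    """IPR of a normalized mode eigenvector: ~1 when the vibration is localized
    on a single degree of freedom, ~N when spread evenly over N degrees of freedom."""
    c = np.asarray(mode, float)
    c = c / np.linalg.norm(c)          # normalize the eigenvector
    return 1.0 / np.sum(c ** 4)

localized = [1.0, 0.0, 0.0, 0.0]       # one site carries all the motion
delocalized = [1.0, 1.0, 1.0, 1.0]     # motion shared equally by four sites

ipr_loc = inverse_participation_ratio(localized)     # -> 1.0
ipr_del = inverse_participation_ratio(delocalized)   # -> 4.0
```

    Computed per atom type, as in the paper, the ratio of H-atom to O-atom IPRs indicates how many OH bonds per molecule take part in a given mode.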

  5. Automated impedance-manometry analysis detects esophageal motor dysfunction in patients who have non-obstructive dysphagia with normal manometry

    NARCIS (Netherlands)

    Nguyen, N. Q.; Holloway, R. H.; Smout, A. J.; Omari, T. I.

    2013-01-01

    Background: Automated integrated analysis of impedance and pressure signals has been reported to identify patients at risk of developing dysphagia post fundoplication. This study aimed to investigate this analysis in the evaluation of patients with non-obstructive dysphagia (NOD) and normal manometry.

  6. Performance analysis of MIMO wireless optical communication system with Q-ary PPM over correlated log-normal fading channel

    Science.gov (United States)

    Wang, Huiqin; Wang, Xue; Lynette, Kibe; Cao, Minghua

    2018-06-01

    The performance of multiple-input multiple-output wireless optical communication systems that adopt Q-ary pulse position modulation over a spatially correlated log-normal fading channel is analyzed in terms of the uncoded bit error rate and ergodic channel capacity. The analysis is based on Wilkinson's method, which approximates the distribution of a sum of correlated log-normal random variables by a single log-normal random variable. The analytical and simulation results corroborate that an increase in the correlation coefficients among sub-channels leads to system performance degradation. Moreover, receiver diversity performs better in resisting the channel fading caused by spatial correlation.
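    Wilkinson's method as described, matching the first two moments of the sum of correlated log-normals to a single log-normal, can be sketched as follows; the channel parameters are illustrative values, not the paper's.

```python
import numpy as np

def wilkinson_lognormal(mu, cov):
    """Wilkinson's moment matching: approximate Y = sum_i exp(X_i), with
    X ~ N(mu, cov), by a single log-normal exp(Z), Z ~ N(mu_z, sigma_z^2),
    by matching E[Y] and E[Y^2] exactly."""
    mu = np.asarray(mu, float)
    cov = np.asarray(cov, float)
    s = np.diag(cov)
    m1 = np.sum(np.exp(mu + s / 2.0))                 # E[Y]
    m2 = 0.0                                          # E[Y^2] = sum_ij E[exp(X_i + X_j)]
    for i in range(len(mu)):
        for j in range(len(mu)):
            m2 += np.exp(mu[i] + mu[j] + (s[i] + s[j]) / 2.0 + cov[i, j])
    sigma_z2 = np.log(m2 / m1**2)
    mu_z = np.log(m1) - sigma_z2 / 2.0
    return mu_z, sigma_z2

# Two sub-channels with correlation 0.6 (illustrative, not the paper's setup)
mu = np.array([0.0, 0.0])
cov = 0.25 * np.array([[1.0, 0.6], [0.6, 1.0]])
mu_z, sigma_z2 = wilkinson_lognormal(mu, cov)
```

    A larger off-diagonal covariance inflates E[Y²], hence σ_z², which is the mechanism behind the performance degradation with increasing sub-channel correlation noted above.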

  7. Existence of a soluble form of CD50 (intercellular adhesion molecule-3) produced upon human lymphocyte activation. Present in normal human serum and levels are increased in the serum of systemic lupus erythematosus patients.

    Science.gov (United States)

    Pino-Otín, M R; Viñas, O; de la Fuente, M A; Juan, M; Font, J; Torradeflot, M; Pallarés, L; Lozano, F; Alberola-Ila, J; Martorell, J

    1995-03-15

    CD50 (ICAM-3) is a leukocyte differentiation Ag expressed almost exclusively on hemopoietic cells, with a key role in the first steps of immune response. To develop a specific sandwich ELISA to detect a soluble CD50 form (sCD50), two different mAbs (140-11 and 101-1D2) recognizing non-overlapping epitopes were used. sCD50 was detected in the supernatant of stimulated PBMCs, with the highest levels after CD3 triggering. Simultaneously, the CD50 surface expression diminished during the first 24 h. sCD50 isolated from culture supernatant and analyzed by immunoblotting showed an apparent m.w. of 95 kDa, slightly smaller than the membrane form. These data, together with Northern blot kinetics analysis, suggest that sCD50 is cleaved from cell membrane. Furthermore, we detect sCD50 in normal human sera and higher levels in sera of systemic lupus erythematosus (SLE) patients, especially in those in active phase. The sCD50 levels showed a positive correlation with sCD27 levels (r = 0.4213; p = 0.0026). Detection of sCD50, both after in vitro CD3 triggering of PBMCs and increased in SLE sera, suggests that sCD50 could be used as a marker of lymphocyte stimulation.

  8. Determination of Elements in Normal and Leukemic Human Whole Blood by Neutron Activation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Brune, D; Frykberg, B; Samsahl, K; Wester, P O

    1961-11-15

    By means of gamma-spectrometry the following elements were simultaneously determined in normal and leukemic human whole blood: Cu, Mn, Zn, Sr, Na, P, Ca, Rb, Cd, Sb, Au, Cs and Fe. Chemical separations were performed according to a group separation method using ion-exchange technique. No significant difference between the concentrations of the elements in normal and leukemic blood was observed.

  9. Determination of Elements in Normal and Leukemic Human Whole Blood by Neutron Activation Analysis

    International Nuclear Information System (INIS)

    Brune, D.; Frykberg, B.; Samsahl, K.; Wester, P.O.

    1961-11-01

    By means of gamma-spectrometry the following elements were simultaneously determined in normal and leukemic human whole blood: Cu, Mn, Zn, Sr, Na, P, Ca, Rb, Cd, Sb, Au, Cs and Fe. Chemical separations were performed according to a group separation method using ion-exchange technique. No significant difference between the concentrations of the elements in normal and leukemic blood was observed.

  10. Metabolomic analysis of urine samples by UHPLC-QTOF-MS: Impact of normalization strategies.

    Science.gov (United States)

    Gagnebin, Yoric; Tonoli, David; Lescuyer, Pierre; Ponte, Belen; de Seigneux, Sophie; Martin, Pierre-Yves; Schappler, Julie; Boccard, Julien; Rudaz, Serge

    2017-02-22

    Among the various biological matrices used in metabolomics, urine is a biofluid of major interest because of its non-invasive collection and its availability in large quantities. However, significant sources of variability in urine metabolomics based on UHPLC-MS are related to the analytical drift and variation of the sample concentration, thus requiring normalization. A sequential normalization strategy was developed to remove these detrimental effects, including: (i) pre-acquisition sample normalization by individual dilution factors to narrow the concentration range and to standardize the analytical conditions, (ii) post-acquisition data normalization by quality control-based robust LOESS signal correction (QC-RLSC) to correct for potential analytical drift, and (iii) post-acquisition data normalization by MS total useful signal (MSTUS) or probabilistic quotient normalization (PQN) to prevent the impact of concentration variability. This generic strategy was performed with urine samples from healthy individuals and was further implemented in the context of a clinical study to detect alterations in urine metabolomic profiles due to kidney failure. In the case of kidney failure, the relation between creatinine/osmolality and the sample concentration is modified, and relying only on these measurements for normalization could be highly detrimental. The sequential normalization strategy was demonstrated to significantly improve patient stratification by decreasing the unwanted variability and thus enhancing data quality. Copyright © 2016 Elsevier B.V. All rights reserved.
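    The PQN step of the post-acquisition strategy can be sketched as below. Using the batch median spectrum as the reference is one common choice, assumed here rather than taken from the paper.

```python
import numpy as np

def pqn(intensities, reference=None):
    """Probabilistic quotient normalization: estimate each sample's dilution as
    the median of its feature-wise quotients against a reference spectrum,
    then divide the sample by that dilution."""
    X = np.asarray(intensities, float)
    if reference is None:
        reference = np.median(X, axis=0)   # batch median spectrum as reference
    quotients = X / reference
    dilution = np.median(quotients, axis=1)
    return X / dilution[:, None], dilution

# Three "samples" x four features: the third is the first at twice the concentration
X = np.array([[10.0, 20.0, 30.0, 40.0],
              [12.0, 18.0, 33.0, 41.0],
              [20.0, 40.0, 60.0, 80.0]])
X_norm, dilution = pqn(X)
```

    Because PQN uses a median over all features, a few genuinely altered metabolites (e.g., from kidney failure) do not distort the dilution estimate the way creatinine or osmolality can.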

  11. Use of colour for hand-filled form analysis and recognition

    OpenAIRE

    Sherkat, N; Allen, T; Wong, WS

    2005-01-01

    Colour information in form analysis is currently under-utilised. As technology has advanced and computing costs have reduced, the processing of forms in colour has now become practicable. This paper describes a novel colour-based approach to the extraction of filled data from colour form images. Images are first quantised to reduce the colour complexity, and data is extracted by examining the colour characteristics of the images. The improved performance of the proposed method has been verified.

  12. Muscle protein analysis. II. Two-dimensional electrophoresis of normal and diseased human skeletal muscle

    Energy Technology Data Exchange (ETDEWEB)

    Giometti, C.S. (Argonne National Lab., IL); Barany, M.; Danon, M.J.; Anderson, N.G.

    1980-07-01

    High-resolution two-dimensional electrophoresis was used to analyze the major proteins of normal and pathological human-muscle samples. The normal human-muscle pattern contains four myosin light chains: three that co-migrate with the myosin light chains from rabbit fast muscle (extensor digitorum longus), and one that co-migrates with the light chain 2 from rabbit slow muscle (soleus). Of seven Duchenne muscular dystrophy samples, four yielded patterns with decreased amounts of actin and myosin relative to normal muscle, while three samples gave patterns comparable to that for normal muscle. Six samples from patients with myotonic dystrophy also gave normal patterns. In nemaline rod myopathy, in contrast, the pattern was deficient in two of the fast-type myosin light chains.

  13. Functional analysis of the cross-section form and X-ray density of human ulnae

    International Nuclear Information System (INIS)

    Hilgen, B.

    1981-01-01

    On 20 ulnae the cross-sectional form and the distribution of X-ray density were investigated at five different cross-section heights. The analysis of the cross-sectional forms was carried out using plane contraction figures; the X-ray density was established by means of the equidensity line method. (orig.)

  14. Trace analysis of auxiliary feedwater capacity for Maanshan PWR loss-of-normal-feedwater transient

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Che-Hao; Shih, Chunkuan [National Tsing Hua Univ., Taiwan (China). Inst. of Nuclear Engineering and Science; Wang, Jong-Rong; Lin, Hao-Tzu [Atomic Energy Council, Taiwan (China). Inst. of Nuclear Energy Research

    2013-07-01

    Maanshan nuclear power plant is a Westinghouse PWR operated by Taiwan Power Company (Taipower, TPC). Some years ago, TPC carried out many assessments in order to uprate the power of Maanshan NPP, including NSSS (Nuclear Steam Supply System) parameter calculations, uncertainty acceptance, pressure vessel integrity, auxiliary system reliability, and transient analyses. Since the Fukushima Daiichi accident, it has become necessary to consider transients with multiple failures. Based on that analysis, we further study the auxiliary feedwater capability for the Loss-of-Normal-Feedwater (LONF) transient. LONF is the limiting non-turbine-trip initiated event for ATWS (Anticipated Transient Without Scram); it reduces the capability of the secondary system to remove the heat generated in the reactor core. If the turbine fails to trip immediately, the secondary water inventory decreases significantly before actuation of the auxiliary feedwater (AFW) system. Heat removal from the primary side decreases, which raises primary coolant temperature and pressure; the pressurizer water level rises in turn. Heat removal through the relief valves and the auxiliary feedwater is not sufficient to fully cope with the heat generated on the primary side, so the pressurizer eventually fills with water and the RCS pressure may rise above the relief-valve set point for water discharge. RCS pressure depends on steam generator inventory, primary coolant temperature, negative reactivity feedback, core power, and other factors, and may reach its peak after core power reduction. According to the ASME Code Level C service limit criteria, the Reactor Coolant System (RCS) pressure must remain below 22.06 MPa. The USNRC is developing an advanced thermal-hydraulic code named TRACE for nuclear power plant safety analysis; TRACE is based on TRAC, integrating RELAP5 and other programs. SNAP

  15. Trace analysis of auxiliary feedwater capacity for Maanshan PWR loss-of-normal-feedwater transient

    International Nuclear Information System (INIS)

    Chen, Che-Hao; Shih, Chunkuan; Wang, Jong-Rong; Lin, Hao-Tzu

    2013-01-01

    Maanshan nuclear power plant is a Westinghouse PWR operated by Taiwan Power Company (Taipower, TPC). Some years ago, TPC carried out many assessments in order to uprate the power of Maanshan NPP, including NSSS (Nuclear Steam Supply System) parameter calculations, uncertainty acceptance, pressure vessel integrity, auxiliary system reliability, and transient analyses. Since the Fukushima Daiichi accident, it has become necessary to consider transients with multiple failures. Based on that analysis, we further study the auxiliary feedwater capability for the Loss-of-Normal-Feedwater (LONF) transient. LONF is the limiting non-turbine-trip initiated event for ATWS (Anticipated Transient Without Scram); it reduces the capability of the secondary system to remove the heat generated in the reactor core. If the turbine fails to trip immediately, the secondary water inventory decreases significantly before actuation of the auxiliary feedwater (AFW) system. Heat removal from the primary side decreases, which raises primary coolant temperature and pressure; the pressurizer water level rises in turn. Heat removal through the relief valves and the auxiliary feedwater is not sufficient to fully cope with the heat generated on the primary side, so the pressurizer eventually fills with water and the RCS pressure may rise above the relief-valve set point for water discharge. RCS pressure depends on steam generator inventory, primary coolant temperature, negative reactivity feedback, core power, and other factors, and may reach its peak after core power reduction. According to the ASME Code Level C service limit criteria, the Reactor Coolant System (RCS) pressure must remain below 22.06 MPa. The USNRC is developing an advanced thermal-hydraulic code named TRACE for nuclear power plant safety analysis; TRACE is based on TRAC, integrating RELAP5 and other programs. SNAP

  16. Comparing the normalization methods for the differential analysis of Illumina high-throughput RNA-Seq data.

    Science.gov (United States)

    Li, Peipei; Piao, Yongjun; Shon, Ho Sun; Ryu, Keun Ho

    2015-10-28

    Recently, rapid improvements in technology and decreasing sequencing costs have made RNA-Seq a widely used technique for quantifying gene expression levels. Various normalization approaches have been proposed, owing to the importance of normalization in the analysis of RNA-Seq data. A comparison of recently proposed normalization methods is needed to generate suitable guidelines for selecting the most appropriate approach for future experiments. In this paper, we compared eight non-abundance normalization methods (RC, UQ, Med, TMM, DESeq, Q, RPKM, and ERPKM) and two abundance-estimation normalization methods (RSEM and Sailfish). The experiments were based on real Illumina high-throughput RNA-Seq data of 35- and 76-nucleotide reads produced in the MAQC project, as well as simulated reads. Reads were mapped to the human genome obtained from the UCSC Genome Browser Database. For precise evaluation, we investigated the Spearman correlation between the normalization results from RNA-Seq and MAQC qRT-PCR values for 996 genes. Based on this work, we showed that, of the eight non-abundance-estimation normalization methods, RC, UQ, Med, TMM, DESeq, and Q gave similar normalization results for all data sets. For RNA-Seq with 35-nucleotide reads, RPKM showed the highest correlation, but for 76-nucleotide reads it showed the lowest correlation of the compared methods; ERPKM did not improve on RPKM. Between the two abundance-estimation normalization methods, for 35-nucleotide reads Sailfish gave higher correlation than RSEM, and both were better than not using abundance estimation. However, for 76-nucleotide reads, RSEM's results were similar to those without abundance estimation, and much better than Sailfish's. Furthermore, we found that adding a poly-A tail increased alignment numbers, but did not improve normalization results. 
Spearman correlation analysis revealed that RC, UQ
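    Two of the simpler normalizations compared in this record can be sketched in a few lines. This is an illustrative toy example with made-up counts and transcript lengths, not the study's pipeline.

```python
# Illustrative sketch of two of the compared methods, RPKM and
# upper-quartile (UQ) normalization, on a toy genes-by-samples count
# matrix (all numbers made up; not the study's pipeline).
import numpy as np

counts = np.array([[100,  200],    # gene A
                   [ 50,  100],    # gene B
                   [850, 1700]])   # gene C
lengths = np.array([1000, 500, 2000])  # transcript lengths in bp

# RPKM: reads per kilobase of transcript per million mapped reads
rpkm = counts * 1e9 / (counts.sum(axis=0, keepdims=True) * lengths[:, None])

# UQ: scale each sample by its 75th percentile of nonzero counts,
# then rescale so the per-sample factors average out
uq = np.array([np.percentile(c[c > 0], 75) for c in counts.T])
uq_norm = counts / uq * uq.mean()
```

    Here the second sample is an exact doubling of the first, so both methods map the two samples onto identical normalized values.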

  17. Non normal and non quadratic anisotropic plasticity coupled with ductile damage in sheet metal forming: Application to the hydro bulging test

    International Nuclear Information System (INIS)

    Badreddine, Houssem; Saanouni, Khemaies; Dogui, Abdelwaheb

    2007-01-01

    In this work, an improved material model is proposed that shows good agreement with experimental data for both hardening curves and plastic strain ratios in uniaxial and equibiaxial proportional loading paths for steel sheet metal until final fracture. The model is based on a non-associative, non-normal flow rule using two different orthotropic equivalent stresses in the yield criterion and the plastic potential. For the plastic potential, the classical Hill 1948 quadratic equivalent stress is used, while for the yield criterion the Karafillis and Boyce 1993 non-quadratic equivalent stress is used, taking into account non-linear mixed (kinematic and isotropic) hardening. Applications are made to hydro-bulging tests using both circular and elliptical dies. The results obtained with different particular cases of the model, such as the normal quadratic and the non-normal non-quadratic cases, are compared and discussed with respect to the experimental results.

  18. Quantitative scintigraphy analysis of the normal lacrimal drainage using 99mTc radioisotope

    International Nuclear Information System (INIS)

    Bittar, Marcos Daniel Ramos.

    1994-01-01

    The dynamics of the lacrimal drainage system was studied in 37 normal subjects (16 males and 21 females) by means of the radioisotope 99mTc (technetium as pertechnetate in normal saline solution), a gamma camera and a computer. After instillation of 10 μl of normal saline solution containing 99mTc into the conjunctival sac, a scintigram was taken at the beginning and at the end of the 10-minute measuring time, and images were recorded every 5 seconds. Two areas were studied: the conjunctival and the lacrimal sac. 39 refs., 17 figs., 1 tab

  19. Lifetime analysis of the ITER first wall under steady-state and off-normal loads

    International Nuclear Information System (INIS)

    Mitteau, R; Sugihara, M; Raffray, R; Carpentier-Chouchana, S; Merola, M; Pitts, R A; Labidi, H; Stangeby, P

    2011-01-01

    The lifetime of the beryllium armor of the ITER first wall is evaluated for normal and off-normal operation. For the individual events considered, the lifetime spans between 930 and 35×10⁶ discharges. The discrepancy between low and high estimates is caused by uncertainties about the behavior of the melt layer during off-normal events, variable plasma operation parameters and variability of the sputtering yields. These large uncertainties in beryllium armor loss estimates are a good example of the experimental nature of the ITER project and will not be truly resolved until ITER begins burning plasma operation.

  20. Normal mode analysis as a method to derive protein dynamics information from the Protein Data Bank.

    Science.gov (United States)

    Wako, Hiroshi; Endo, Shigeru

    2017-12-01

    Normal mode analysis (NMA) can facilitate quick and systematic investigation of protein dynamics using data from the Protein Data Bank (PDB). We developed an elastic network model-based NMA program using dihedral angles as independent variables. Compared to NMA programs that use Cartesian coordinates as independent variables, key attributes of the proposed program are as follows: (1) chain connectivity related to the folding pattern of a polypeptide chain is naturally embedded in the model; (2) the full-atom system is acceptable, and owing to a considerably smaller number of independent variables, the PDB data can be used without further manipulation; (3) the number of variables can easily be reduced by fixing some of the rotatable dihedral angles; (4) the PDB data for any molecule besides proteins can be considered without coarse-graining; and (5) individual motions of constituent subunits and ligand molecules can be easily decomposed into external and internal motions to examine their mutual and intrinsic motions. Its performance is illustrated with the example of a DNA-binding allosteric protein, a catabolite activator protein. In particular, the focus is on the conformational change upon cAMP and DNA binding, and on the communication between their binding sites, which are remotely located from each other. In this illustration, NMA creates a vivid picture of the protein dynamics at various levels of structure, i.e., atoms, residues, secondary structures, domains, subunits, and the complete system, including DNA and cAMP. Comparative studies of the specific protein in different states, e.g., apo- and holo-conformations, and free and complexed configurations, provide useful information for studying structurally and functionally important aspects of the protein.
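    As a complement, the core of an elastic-network NMA can be sketched in Cartesian coordinates. This is a hypothetical toy, not the authors' dihedral-angle program; the positions, cutoff, and spring constant are made up.

```python
# Toy anisotropic elastic-network NMA in Cartesian coordinates
# (illustrative only; the record's program works in dihedral-angle
# space, which is not reproduced here).
import numpy as np

coords = np.array([[0.0, 0.0, 0.0],
                   [3.8, 0.0, 0.0],
                   [5.5, 3.0, 0.0],
                   [4.0, 6.2, 1.0]])  # 4 pseudo-atoms, made-up positions
n, cutoff, gamma = len(coords), 8.0, 1.0

# Assemble the 3N x 3N Hessian from pairwise spring contacts
H = np.zeros((3 * n, 3 * n))
for i in range(n):
    for j in range(i + 1, n):
        d = coords[j] - coords[i]
        r2 = d @ d
        if r2 > cutoff ** 2:
            continue
        block = -gamma * np.outer(d, d) / r2
        H[3*i:3*i+3, 3*j:3*j+3] = block
        H[3*j:3*j+3, 3*i:3*i+3] = block
        H[3*i:3*i+3, 3*i:3*i+3] -= block
        H[3*j:3*j+3, 3*j:3*j+3] -= block

evals = np.linalg.eigvalsh(H)
```

    The six near-zero eigenvalues correspond to rigid-body translations and rotations; the remaining 3N−6 eigenvalues give the internal normal modes.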

  1. Normal mode analysis of macromolecular systems with the mobile block Hessian method

    International Nuclear Information System (INIS)

    Ghysels, An; Van Speybroeck, Veronique; Van Neck, Dimitri; Waroquier, Michel; Brooks, Bernard R.

    2015-01-01

    Until recently, normal mode analysis (NMA) was limited to small proteins, not only because the required energy minimization is computationally exhausting, but also because NMA requires the expensive diagonalization of a 3N_a × 3N_a matrix, with N_a the number of atoms. A series of simplified models has been proposed, in particular the Rotation-Translation Blocks (RTB) method by Tama et al. for the simulation of proteins. It uses the concept that a peptide chain or protein can be seen as a sequence of rigid components, i.e. the peptide units; a peptide chain is thus divided into rigid blocks with six degrees of freedom each. Recently we developed the Mobile Block Hessian (MBH) method, which has features similar to RTB. The main difference is that MBH was developed to deal with partially optimized systems: the position/orientation of each block is optimized while the internal geometry is kept fixed at a plausible, but not necessarily optimized, geometry. This reduces the computational cost of the energy minimization. Applying standard NMA to a partially optimized structure, however, results in spurious imaginary frequencies and unwanted coordinate dependence. The MBH avoids these unphysical effects by taking into account energy-gradient corrections; moreover, the number of variables is reduced, which facilitates the diagonalization of the Hessian. In the original implementation of MBH, atoms could only be part of one rigid block. The MBH is now extended to the case where atoms can be part of two or more blocks. Two basic linkages can be realized: (1) blocks connected by one link atom, or (2) by two link atoms, the latter referred to as a hinge-type connection. In this work we present the MBH concept and illustrate its performance with the crambin protein as an example.

  2. SEMG analysis of astronaut upper arm during isotonic muscle actions with normal standing posture

    Science.gov (United States)

    Qianxiang, Zhou; Chao, Ma; Xiaohui, Zheng

    Introduction: Research on isotonic muscle actions using surface electromyography (sEMG) is becoming a popular topic in astronaut life-support training and rehabilitation. Researchers have paid particular attention to sEMG signal processing to reduce the noise introduced during monitoring, and to fatigue estimation for isotonic muscle actions at different force levels using parameters obtained from the sEMG signal, such as conduction velocity (CV), median frequency (MDF), and mean frequency (MNF). Increasingly, muscle fatigue during isotonic actions is studied with sEMG analysis together with the subjective Borg scale. In this paper, the relationship between an sEMG-based fatigue variable and the Borg scale during isotonic muscle actions of the upper arm at different contraction levels is investigated. Methods: 13 young male subjects (23.4±2.45 years, 64.7±5.43 kg, 171.7±5.41 cm) in normal standing posture performed isotonic actions of the upper arm at different force levels (10% MVC, 30% MVC and 50% MVC), where MVC denotes the maximal voluntary contraction, which was obtained first in the experiment. sEMG was recorded during the experiments, and the Borg scale was recorded for each contraction level. Using the one-third-octave-band method, a fatigue variable p based on sEMG was defined as p = Σ_i g(f_i)·F(f_i), where F(f_i) is the signal content in the band centred at f_i and g(f_i) is a frequency weighting factor, g(f_i) = 0.42 + 0.5 cos(π f_i/f_0) + 0.08 cos(2π f_i/f_0) for 0 ≤ f_i ≤ f_0. From these equations p can be computed, and the relationship between p and the Borg scale investigated. Results: In the research, three kinds of fitted curves between the variable p and the Borg
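    The fatigue variable described in this abstract can be sketched as follows. The band centres, band values, and f_0 here are illustrative assumptions, and taking F(f_i) as the band power is our reading of the abstract.

```python
# Sketch of the abstract's fatigue variable p = sum_i g(f_i) * F(f_i),
# with F(f_i) read as the sEMG power in the i-th one-third-octave band
# and g the frequency weighting given in the abstract. Band centres,
# powers, and f0 are illustrative assumptions.
import numpy as np

def fatigue_variable(band_centers, band_powers, f0):
    g = 0.42 + 0.5 * np.cos(np.pi * band_centers / f0) \
             + 0.08 * np.cos(2 * np.pi * band_centers / f0)
    return float(np.sum(g * band_powers))

# One-third-octave band centres (Hz) in a typical sEMG range
centers = np.array([20.0, 25.0, 31.5, 40.0, 50.0, 63.0, 80.0, 100.0])
powers = np.array([0.5, 0.8, 1.2, 1.5, 1.3, 0.9, 0.6, 0.3])  # made up
p = fatigue_variable(centers, powers, f0=125.0)
```

    Note that the weighting g vanishes at f_i = f_0 and emphasises the low-frequency end of the spectrum, where fatigue-related spectral compression accumulates.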

  3. The Importance of Form in Skinner's Analysis of Verbal Behavior and a Further Step

    Science.gov (United States)

    Vargas, E. A.

    2013-01-01

    A series of quotes from B. F. Skinner illustrates the importance of form in his analysis of verbal behavior. In that analysis, form plays an important part in contingency control. Form and function complement each other. Function, the array of variables that control a verbal utterance, dictates the meaning of a specified form; form, as stipulated…

  4. Control-group feature normalization for multivariate pattern analysis of structural MRI data using the support vector machine.

    Science.gov (United States)

    Linn, Kristin A; Gaonkar, Bilwaj; Satterthwaite, Theodore D; Doshi, Jimit; Davatzikos, Christos; Shinohara, Russell T

    2016-05-15

    Normalization of feature vector values is a common practice in machine learning. Generally, each feature value is standardized to the unit hypercube or by normalizing to zero mean and unit variance. Classification decisions based on support vector machines (SVMs) or by other methods are sensitive to the specific normalization used on the features. In the context of multivariate pattern analysis using neuroimaging data, standardization effectively up- and down-weights features based on their individual variability. Since the standard approach uses the entire data set to guide the normalization, it utilizes the total variability of these features. This total variation is inevitably dependent on the amount of marginal separation between groups. Thus, such a normalization may attenuate the separability of the data in high dimensional space. In this work we propose an alternate approach that uses an estimate of the control-group standard deviation to normalize features before training. We study our proposed approach in the context of group classification using structural MRI data. We show that control-based normalization leads to better reproducibility of estimated multivariate disease patterns and improves the classifier performance in many cases.
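    The proposed control-based normalization reduces to a z-score whose mean and standard deviation are estimated from controls only. A minimal sketch with simulated data (variable names are ours, not the authors'):

```python
# Minimal sketch of control-based normalization: scale each feature by
# the standard deviation estimated from controls only, instead of from
# the pooled sample (simulated data; variable names are ours).
import numpy as np

def control_normalize(X, is_control):
    """X: subjects x features; is_control: boolean mask over rows."""
    mu = X[is_control].mean(axis=0)
    sd = X[is_control].std(axis=0, ddof=1)
    return (X - mu) / sd

rng = np.random.default_rng(0)
controls = rng.normal(0.0, 1.0, size=(20, 5))
patients = rng.normal(1.0, 2.0, size=(20, 5))  # group shift inflates pooled SD
X = np.vstack([controls, patients])
mask = np.r_[np.ones(20, bool), np.zeros(20, bool)]
Z = control_normalize(X, mask)
```

    Controls end up with approximately zero mean and unit variance per feature, while the patient-vs-control separation is not attenuated by the between-group component of the pooled variance.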

  5. Use of SAMC for Bayesian analysis of statistical models with intractable normalizing constants

    KAUST Repository

    Jin, Ick Hoon; Liang, Faming

    2014-01-01

    Statistical inference for the models with intractable normalizing constants has attracted much attention. During the past two decades, various approximation- or simulation-based methods have been proposed for the problem, such as the Monte Carlo

  6. A New Modified Histogram Matching Normalization for Time Series Microarray Analysis.

    Science.gov (United States)

    Astola, Laura; Molenaar, Jaap

    2014-07-01

    Microarray data is often utilized in inferring regulatory networks. Quantile normalization (QN) is a popular method to reduce array-to-array variation. We show that in the context of time series measurements QN may not be the best choice for this task, especially not if the inference is based on a continuous-time ODE model. We propose an alternative normalization method that is better suited for network inference from time series data.
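    For reference, the quantile-normalization baseline that the paper argues is ill-suited to time-series inference can be sketched as follows (toy matrix; a simple version that ignores tie handling):

```python
# Sketch of standard quantile normalization (QN): force every array
# (column) to share one reference distribution. Toy data, our code;
# ties are broken by sort order rather than averaged.
import numpy as np

def quantile_normalize(X):
    """X: genes x arrays."""
    order = np.argsort(X, axis=0)           # per-array ranks
    ref = np.sort(X, axis=0).mean(axis=1)   # mean value at each rank
    Xn = np.empty_like(X, dtype=float)
    for j in range(X.shape[1]):
        Xn[order[:, j], j] = ref
    return Xn

X = np.array([[5.0, 4.0, 3.0],
              [2.0, 1.0, 4.0],
              [3.0, 4.0, 6.0],
              [4.0, 2.0, 8.0]])
Xn = quantile_normalize(X)
```

    After QN every column contains exactly the same set of sorted values, which is precisely the property that can distort the temporal trajectories an ODE-based inference method relies on.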

  7. A New Modified Histogram Matching Normalization for Time Series Microarray Analysis

    Directory of Open Access Journals (Sweden)

    Laura Astola

    2014-07-01

    Microarray data is often utilized in inferring regulatory networks. Quantile normalization (QN) is a popular method to reduce array-to-array variation. We show that in the context of time series measurements QN may not be the best choice for this task, especially not if the inference is based on a continuous-time ODE model. We propose an alternative normalization method that is better suited for network inference from time series data.

  8. A statistical analysis of count normalization methods used in positron-emission tomography

    International Nuclear Information System (INIS)

    Holmes, T.J.; Ficke, D.C.; Snyder, D.L.

    1984-01-01

    As part of the Positron-Emission Tomography (PET) reconstruction process, annihilation counts are normalized for photon absorption, detector efficiency and detector-pair duty-cycle. Several normalization methods of time-of-flight and conventional systems are analyzed mathematically for count bias and variance. The results of the study have some implications on hardware and software complexity and on image noise and distortion
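    The normalization analysed in this record can be caricatured as division of Poisson counts by per-pair correction factors, which leaves the rate estimate unbiased but inflates its variance. The factor names and values below are ours, for illustration only.

```python
# Toy version of PET count normalization: raw coincidence counts are
# divided by per-detector-pair correction factors, giving an unbiased
# but higher-variance estimate (factor names/values are made up).
import numpy as np

rng = np.random.default_rng(1)
true_rate = 200.0                       # true annihilation rate per frame
atten, eff_pair, duty = 0.35, 0.8, 0.9  # absorption, efficiency, duty-cycle
factor = atten * eff_pair * duty

raw = rng.poisson(true_rate * factor, size=10000)
normalized = raw / factor

est_mean = normalized.mean()  # ~ true_rate (unbiased)
est_var = normalized.var()    # ~ true_rate / factor (inflated by 1/factor)
```

    The scaled-Poisson variance true_rate/factor is what makes the choice of normalization scheme matter for image noise, as the abstract's bias-and-variance analysis examines.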

  9. Dynamic analysis to establish normal shock and vibration of radioactive material shipping packages

    International Nuclear Information System (INIS)

    Fields, S.R.

    1980-01-01

    A computer model, CARDS (Cask-Railcar Dynamic Simulator) was developed to provide input data for a broad range of radioactive material package-tiedown structural assessments. CARDS simulates the dynamic behavior of shipping packages and their transporters during normal transport conditions. The model will be used to identify parameters which significantly affect the normal shock and vibration environments which, in turn, provide the basis for determining the forces transmitted to the packages

  10. Analysis of the original causes of placental oxidative stress in normal pregnancy and pre-eclampsia: a hypothesis.

    Science.gov (United States)

    Yang, Xiang; Guo, Lili; Li, Huaifang; Chen, Xinliang; Tong, Xiaowen

    2012-07-01

    Pre-eclampsia (PE) and eclampsia remain enigmatic despite intensive research. Growing evidence suggests that placental oxidative stress (OS) is involved in the etiopathogenesis of pre-eclampsia. Reduced perfusion as a result of abnormal placentation was proposed to be responsible for placental OS in PE. However, placental OS was also observed in normal pregnancy. The exact differences and correlation of placental OS in PE and normal pregnancy remain elusive. In this review, we attempted to link both normal pregnancy and PE on the causes of placental OS and proposed a hypothesis that placental OS in normal pregnancy, plus the exploration of other placental and/or maternal factors, could provide a novel explanation of that in PE. We concluded that pregnancy, placental abnormality and preexisting maternal constitutional conditions are three principle factors that could contribute to placental OS in PE. The specific causes in each clinical case could be heterogeneous, which requires individual analysis.

  11. Development of the Parent Form of the Preschool Children's Communication Skills Scale and Comparison of the Communication Skills of Children with Normal Development and with Autism Spectrum Disorder

    Science.gov (United States)

    Aydin, Aydan

    2016-01-01

    This study aims at developing an assessment scale for identifying preschool children's communication skills, at distinguishing children with communication deficiencies and at comparing the communication skills of children with normal development (ND) and those with autism spectrum disorder (ASD). Participants were 427 children of up to 6 years of…

  12. Anticlockwise swirl of mesenteric vessels: A normal CT appearance, retrospective analysis of 200 pediatric patients

    Energy Technology Data Exchange (ETDEWEB)

    Sodhi, Kushaljit S., E-mail: sodhiks@gmail.com [Department of Radiodiagnosis and Imaging, Post Graduate Institute of Medical Education and Research, Sector-12, Chandigarh 160012 (India); Bhatia, Anmol, E-mail: anmol_bhatia26@yahoo.co.in [Department of Radiodiagnosis and Imaging, Post Graduate Institute of Medical Education and Research, Sector-12, Chandigarh 160012 (India); Saxena, Akshay K., E-mail: fatakshay@yahoo.com [Department of Radiodiagnosis and Imaging, Post Graduate Institute of Medical Education and Research, Sector-12, Chandigarh 160012 (India); Rao, Katragadda L.N., E-mail: klnrao@hotmail.com [Department of Pediatric Surgery, Post Graduate Institute of Medical Education and Research, Sector-12, Chandigarh 160012 (India); Menon, Prema, E-mail: menonprema@hotmail.com [Department of Pediatric Surgery, Post Graduate Institute of Medical Education and Research, Sector-12, Chandigarh 160012 (India); Khandelwal, Niranjan, E-mail: khandelwaln@hotmail.com [Department of Radiodiagnosis and Imaging, Post Graduate Institute of Medical Education and Research, Sector-12, Chandigarh 160012 (India)

    2014-04-15

    Objective: The counterclockwise rotation of the SMV on SMA is a normal and non-specific finding, which results in an incomplete swirl formation on CT scans. However, it has a potential to be misinterpreted as ‘midgut volvulus’ resulting in serious clinical implications. The study was done to determine the frequency and degree of counterclockwise rotation of the SMV on SMA on CT in normal otherwise asymptomatic pediatric patients undergoing CT scan. Methods: In this IRB approved study, we retrospectively analyzed abdominal CT scan examinations of 200 consecutive pediatric patients (age range of 11 days to 18 years), which were performed for different clinical indications over a period of 10 months. They were evaluated for the absence or presence and degree of counterclockwise rotation of the mesenteric vessels. Results: Of the 200 patients, 128 (64%) patients showed no clockwise or anticlockwise rotation of mesenteric vessels. Counterclockwise rotation of SMV on SMA was seen in 72 (36%) patients. Further, the degree of rotation of vessels was also calculated, based on the criteria proposed by the authors. Conclusions: The counterclockwise rotation of SMV on SMA gives an appearance of mesenteric whirlpool in otherwise normal mesenteric vessels and can be misinterpreted as midgut volvulus. It is a normal CT appearance and is due to a variation in branching pattern of mesenteric vessels. Awareness of this normal branching pattern of mesenteric vessels is important to avoid an inadvertent laparotomy.

  13. Anticlockwise swirl of mesenteric vessels: A normal CT appearance, retrospective analysis of 200 pediatric patients

    International Nuclear Information System (INIS)

    Sodhi, Kushaljit S.; Bhatia, Anmol; Saxena, Akshay K.; Rao, Katragadda L.N.; Menon, Prema; Khandelwal, Niranjan

    2014-01-01

    Objective: The counterclockwise rotation of the SMV on SMA is a normal and non-specific finding, which results in an incomplete swirl formation on CT scans. However, it has a potential to be misinterpreted as ‘midgut volvulus’ resulting in serious clinical implications. The study was done to determine the frequency and degree of counterclockwise rotation of the SMV on SMA on CT in normal otherwise asymptomatic pediatric patients undergoing CT scan. Methods: In this IRB approved study, we retrospectively analyzed abdominal CT scan examinations of 200 consecutive pediatric patients (age range of 11 days to 18 years), which were performed for different clinical indications over a period of 10 months. They were evaluated for the absence or presence and degree of counterclockwise rotation of the mesenteric vessels. Results: Of the 200 patients, 128 (64%) patients showed no clockwise or anticlockwise rotation of mesenteric vessels. Counterclockwise rotation of SMV on SMA was seen in 72 (36%) patients. Further, the degree of rotation of vessels was also calculated, based on the criteria proposed by the authors. Conclusions: The counterclockwise rotation of SMV on SMA gives an appearance of mesenteric whirlpool in otherwise normal mesenteric vessels and can be misinterpreted as midgut volvulus. It is a normal CT appearance and is due to a variation in branching pattern of mesenteric vessels. Awareness of this normal branching pattern of mesenteric vessels is important to avoid an inadvertent laparotomy

  14. The Job Dimensions Underlying the Job Elements of the Position Analysis Questionnaire (PAQ) (Form B).

    Science.gov (United States)

    The study was concerned with the identification of the job dimensions underlying the job elements of the Position Analysis Questionnaire (PAQ), Form B. The PAQ is a structured job analysis instrument consisting of 187 worker-oriented job elements divided into six a priori major divisions. The statistical procedure of principal components analysis was used to identify the job dimensions of the PAQ. Forty-five job dimensions were

  15. Total body calcium by neutron activation analysis in normals and osteoporotic populations: a discriminator of significant bone mass loss

    International Nuclear Information System (INIS)

    Ott, S.M.; Murano, R.; Lewellen, T.K.; Nelp, W.B.; Chesnut, C.M.

    1983-01-01

    Measurements of total body calcium by neutron activation (TBC) in 94 normal individuals and 86 osteoporotic patients are reported. The ability of TBC to discriminate normal from osteoporotic females was evaluated with decision analysis. Bone mineral content (BMC) by single-photon absorptiometry was also measured. TBC was higher in males (range 826 to 1363 g vs 537 to 1054 in females) and correlated with height in all normals. In females over age 55 there was a negative correlation with age. Thus, for normals an algorithm was derived to allow comparison between measured TBC and that predicted by sex, age, and height (TBCp). In the 28 normal females over age 55, the TBC was 764 ± 115 g vs. 616 ± 90 in the osteoporotics. In 63 of the osteoporotic females an estimated height, derived from tibial length, was used to predict TBC. In normals the TBC/TBCp ratio was 1.00 ± 0.12, whereas in osteoporotic females it was 0.80 ± 0.12. A receiver operating characteristic curve showed better discrimination of osteoporosis with TBC/TBCp than with wrist BMC. By using Bayes' theorem, with a 25% prevalence of osteoporosis (an estimate for postmenopausal women), the post-test probability of disease was 90% when the TBC/TBCp ratio was less than 0.84. The authors conclude that a low TBC/TBCp ratio is very helpful in determining osteoporosis
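    The Bayes' theorem step behind the 90% figure can be made explicit. The sensitivity and specificity below are illustrative values chosen to be consistent with the reported prior (25%) and posterior (90%); they are not numbers from the paper.

```python
# Bayes' theorem behind the reported post-test probability: with a 25%
# prior prevalence, a positive test (TBC/TBCp < 0.84) yields a ~90%
# posterior. Sensitivity/specificity are illustrative assumptions
# consistent with that figure, not values from the paper.
def posttest_probability(prevalence, sensitivity, specificity):
    p_pos_diseased = sensitivity * prevalence          # true positives
    p_pos_healthy = (1.0 - specificity) * (1.0 - prevalence)  # false positives
    return p_pos_diseased / (p_pos_diseased + p_pos_healthy)

post = posttest_probability(prevalence=0.25, sensitivity=0.81, specificity=0.97)
```

    A positive result thus raises the probability of osteoporosis from the 25% prior to 90%, which is the decision-analysis content of the abstract's claim.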

  16. Volume-controlled histographic analysis of pulmonary parenchyma in normal and diffuse parenchymal lung disease: a pilot study

    International Nuclear Information System (INIS)

    Park, Hyo Yong; Lee, Jongmin; Kim, Jong Seob; Won, Chyl Ho; Kang, Duk Sik; Kim, Myoung Nam

    2000-01-01

    To evaluate the clinical usefulness of a home-made histographic analysis system using a lung volume controller. Our study involved ten healthy volunteers, ten emphysema patients, and two idiopathic pulmonary fibrosis (IPF) patients. Using a home-made lung volume controller, images were obtained in the upper, middle, and lower lung zones at 70%, 50%, and 20% of vital capacity. Electron beam tomography was used with the following scanning parameters: single slice mode, 10-mm slice thickness, 0.4-second scan time, and 35-cm field of view. Using a home-made semi-automated program, pulmonary parenchyma was isolated and a histogram then obtained. Seven histographic parameters were derived from the histogram: mean density (MD), density at maximal frequency (DMF), maximal ascending gradient (MAG), maximal ascending gradient density (MAGD), maximal descending gradient (MDG), maximal descending gradient density (MDGD), and full width at half maximum (FWHM). We compared normal controls with abnormal groups, including the emphysema and IPF patients, at the same respiration levels. A normal histographic zone with ±1 standard deviation was obtained. Histographic curves of normal controls shifted toward the high-density level, and the width of the normal zone increased, as the level of inspiration decreased. In the ten normal controls, MD, DMF, MAG, MAGD, MDG, MDGD, and FWHM readings at the 70% inspiration level were lower than those at 20% (p < 0.05). At the same level of inspiration, histograms of emphysema patients were located at a lower density area than those of normal controls. As inspiration decreased, histograms of emphysema patients showed a diminished shift compared with those of normal controls. At the 50% and 20% inspiration levels, the MD, DMF, and MAGD readings of emphysema patients were significantly lower than those of normal controls (p < 0.05). Compared with those of normal controls, histograms of the two IPF patients obtained at the three inspiration levels were
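    Three of the histographic parameters named in this abstract (MD, DMF, FWHM) can be derived directly from a density histogram. The bins and frequency curve below are made up for illustration; they are not the study's data.

```python
# Sketch of deriving histographic parameters (MD, DMF, FWHM) from a
# lung-density histogram. Bin edges and the frequency curve are made
# up for illustration; not the study's data.
import numpy as np

hu = np.arange(-1000, -400, 10)                 # CT density bins (HU)
freq = np.exp(-0.5 * ((hu + 850) / 60.0) ** 2)  # toy frequency curve

md = float(np.sum(hu * freq) / np.sum(freq))    # mean density (MD)
dmf = float(hu[np.argmax(freq)])                # density at maximal frequency

half = freq.max() / 2.0
above = hu[freq >= half]
fwhm = float(above.max() - above.min())         # full width at half maximum
```

    The ascending/descending gradient parameters (MAG, MAGD, MDG, MDGD) would follow analogously from the finite differences of `freq`, e.g. via `np.diff`.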

  17. Volume-controlled histographic analysis of pulmonary parenchyma in normal and diffuse parenchymal lung disease: a pilot study

    Energy Technology Data Exchange (ETDEWEB)

    Park, Hyo Yong; Lee, Jongmin; Kim, Jong Seob; Won, Chyl Ho; Kang, Duk Sik [School of Medicine, Kyungpook National University, Taegu (Korea, Republic of)]; Kim, Myoung Nam [The University of Iowa (United States)]

    2000-06-01

    To evaluate the clinical usefulness of a home-made histographic analysis system using a lung volume controller. Our study involved ten healthy volunteers, ten emphysema patients, and two idiopathic pulmonary fibrosis (IPF) patients. Using a home-made lung volume controller, images were obtained in the upper, middle, and lower lung zones at 70%, 50%, and 20% of vital capacity. Electron beam tomography was used and scanning parameters were single slice mode, 10-mm slice thickness, 0.4-second scan time, and 35-cm field of view. Using a home-made semi-automated program, pulmonary parenchyma was isolated and a histogram then obtained. Seven histographic parameters, namely mean density (MD), density at maximal frequency (DMF), maximal ascending gradient (MAG), maximal ascending gradient density (MAGD), maximal descending gradient (MDG), maximal descending gradient density (MDGD), and full width at half maximum (FWHM), were derived from the histogram. We compared normal controls with abnormal groups including emphysema and IPF patients at the same respiration levels. A normal histographic zone with ± 1 standard deviation was obtained. Histographic curves of normal controls shifted toward the high density level, and the width of the normal zone increased as the level of inspiration decreased. In ten normal controls, MD, DMF, MAG, MAGD, MDG, MDGD, and FWHM readings at a 70% inspiration level were lower than those at 20% (p less than 0.05). At the same level of inspiration, histograms of emphysema patients were located at a lower density area than those of normal controls. As inspiration status decreased, histograms of emphysema patients showed diminished shift compared with those of normal controls. At 50% and 20% inspiration levels, the MD, DMF, and MAGD readings of emphysema patients were significantly lower than those of normal controls (p less than 0.05). Compared with those of normal controls, histograms of the two IPF patients obtained at three inspiration levels were
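
A minimal numpy sketch (not the authors' software; the data, bin settings, and HU range below are invented for illustration) of how three of the named histographic parameters could be derived from a density histogram of lung pixels:

```python
import numpy as np

def histogram_parameters(hu_values, bins=256, hu_range=(-1000, 0)):
    """Derive simple histographic parameters from lung-pixel HU values:
    mean density (MD), density at maximal frequency (DMF), and full width
    at half maximum (FWHM) of the frequency curve."""
    counts, edges = np.histogram(hu_values, bins=bins, range=hu_range)
    centers = (edges[:-1] + edges[1:]) / 2

    md = float(np.mean(hu_values))             # mean density
    dmf = float(centers[np.argmax(counts)])    # density at maximal frequency

    # FWHM: width of the density region where frequency >= half the peak
    half = counts.max() / 2
    above = np.where(counts >= half)[0]
    fwhm = float(centers[above[-1]] - centers[above[0]])
    return md, dmf, fwhm

# Synthetic "emphysema-like" parenchyma clustered near -900 HU
rng = np.random.default_rng(0)
md, dmf, fwhm = histogram_parameters(rng.normal(-900, 30, 10_000))
```

The gradient parameters (MAG, MAGD, MDG, MDGD) would follow the same pattern, taking `np.diff(counts)` on either side of the peak.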

  18. An analysis of longitudinal data with nonignorable dropout using the truncated multivariate normal distribution

    NARCIS (Netherlands)

    Jolani, Shahab

    2014-01-01

    For a multivariate normal vector in which some elements, but not necessarily all, are truncated, we derive the moment generating function and obtain expressions for the first two moments involving the multivariate hazard gradient. To show one of many applications of these moments, we then extend the
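
In the univariate case, the hazard-gradient expressions reduce to the classical inverse-Mills-ratio formulas. A sketch (standard results, not the paper's multivariate derivation), checked against Monte Carlo:

```python
import math
import numpy as np

def truncated_normal_moments(mu, sigma, lower):
    """First two moments of X ~ N(mu, sigma^2) truncated to X > lower,
    via the inverse Mills ratio (the univariate hazard)."""
    alpha = (lower - mu) / sigma
    phi = math.exp(-0.5 * alpha**2) / math.sqrt(2 * math.pi)
    surv = 0.5 * math.erfc(alpha / math.sqrt(2))   # P(Z > alpha)
    lam = phi / surv                               # hazard / inverse Mills ratio
    mean = mu + sigma * lam
    var = sigma**2 * (1 + alpha * lam - lam**2)
    return mean, var

mean, var = truncated_normal_moments(0.0, 1.0, 0.0)  # half-normal case

# Monte Carlo sanity check by rejection sampling
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 500_000)
x = x[x > 0.0]
```

For truncation at zero the closed forms are E[X | X > 0] = sqrt(2/pi) and Var[X | X > 0] = 1 - 2/pi.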

  19. Stewart analysis of apparently normal acid-base state in the critically ill

    NARCIS (Netherlands)

    Moviat, M.; Boogaard, M. van den; Intven, F.; Voort, P. van der; Hoeven, H. van der; Pickkers, P.

    2013-01-01

    PURPOSE: This study aimed to describe Stewart parameters in critically ill patients with an apparently normal acid-base state and to determine the incidence of mixed metabolic acid-base disorders in these patients. MATERIALS AND METHODS: We conducted a prospective, observational multicenter study of

  20. Algebraic method for analysis of nonlinear systems with a normal matrix

    International Nuclear Information System (INIS)

    Konyaev, Yu.A.; Salimova, A.F.

    2014-01-01

    A promising method has been proposed for analyzing a class of quasilinear nonautonomous systems of differential equations whose matrix can be represented as a sum of nonlinear normal matrices, which makes it possible to analyze stability without using the Lyapunov functions [ru

  1. Restoring normal eating behaviour in adolescents with anorexia nervosa: A video analysis of nursing interventions

    NARCIS (Netherlands)

    Beukers, L.; Berends, T.; van Ginkel, J.; van Elburg, A.A.; van Meijel, B.

    2015-01-01

    An important part of inpatient treatment for adolescents with anorexia nervosa is to restore normal eating behaviour. Health-care professionals play a significant role in this process, but little is known about their interventions during patients' meals. The purpose of the present study was to

  2. Restoring normal eating behaviour in adolescents with anorexia nervosa : A video analysis of nursing interventions

    NARCIS (Netherlands)

    Beukers, Laura; Berends, Tamara; de Man-van Ginkel, Janneke M; van Elburg, Annemarie A; van Meijel, Berno

    2015-01-01

    An important part of inpatient treatment for adolescents with anorexia nervosa is to restore normal eating behaviour. Health-care professionals play a significant role in this process, but little is known about their interventions during patients' meals. The purpose of the present study was to

  3. Morphometric connectivity analysis to distinguish normal, mild cognitive impaired, and Alzheimer subjects based on brain MRI

    DEFF Research Database (Denmark)

    Erleben, Lene Lillemark; Sørensen, Lauge; Mysling, Peter

    2013-01-01

    This work investigates a novel way of looking at the regions in the brain and their relationship as possible markers to classify normal control (NC), mild cognitive impaired (MCI), and Alzheimer Disease (AD) subjects. MRI scans from a subset of 101 subjects from the ADNI study at baseline were used...

  4. A new modified histogram matching normalization for time series microarray analysis

    NARCIS (Netherlands)

    Astola, L.J.; Molenaar, J.

    2014-01-01

    Microarray data is often utilized in inferring regulatory networks. Quantile normalization (QN) is a popular method to reduce array-to-array variation. We show that in the context of time series measurements QN may not be the best choice for this task, especially not if the inference is based on

  5. Immunohistochemical analysis of oxidative stress and DNA repair proteins in normal mammary and breast cancer tissues

    International Nuclear Information System (INIS)

    Curtis, Carol D; Thorngren, Daniel L; Nardulli, Ann M

    2010-01-01

    During the course of normal cellular metabolism, oxygen is consumed and reactive oxygen species (ROS) are produced. If not effectively dissipated, ROS can accumulate and damage resident proteins, lipids, and DNA. Enzymes involved in redox regulation and DNA repair dissipate ROS and repair the resulting damage in order to preserve a functional cellular environment. Because increased ROS accumulation and/or unrepaired DNA damage can lead to initiation and progression of cancer and we had identified a number of oxidative stress and DNA repair proteins that influence estrogen responsiveness of MCF-7 breast cancer cells, it seemed possible that these proteins might be differentially expressed in normal mammary tissue, benign hyperplasia (BH), ductal carcinoma in situ (DCIS) and invasive breast cancer (IBC). Immunohistochemistry was used to examine the expression of a number of oxidative stress proteins, DNA repair proteins, and damage markers in 60 human mammary tissues which were classified as BH, DCIS or IBC. The relative mean intensity was determined for each tissue section and ANOVA was used to detect statistical differences in the relative expression of BH, DCIS and IBC compared to normal mammary tissue. We found that a number of these proteins were overexpressed and that the cellular localization was altered in human breast cancer tissue. Our studies suggest that oxidative stress and DNA repair proteins not only protect normal cells from the damaging effects of ROS, but may also promote survival of mammary tumor cells
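
The group comparison described above (ANOVA on relative mean staining intensity across tissue classes) can be sketched with scipy; the intensity values below are invented, not the study's data:

```python
import numpy as np
from scipy.stats import f_oneway

# Hypothetical relative mean staining intensities (arbitrary units);
# group labels follow the abstract, the numbers are illustrative only.
normal = [1.0, 1.1, 0.9, 1.2, 1.0]
bh     = [1.3, 1.2, 1.4, 1.1, 1.3]   # benign hyperplasia
dcis   = [1.8, 1.7, 1.9, 2.0, 1.6]   # ductal carcinoma in situ
ibc    = [2.2, 2.1, 2.4, 2.0, 2.3]   # invasive breast cancer

# One-way ANOVA across the four tissue classes
F, p = f_oneway(normal, bh, dcis, ibc)
```

A small p-value indicates that at least one group mean differs; post hoc pairwise tests would then localize which classes differ from normal tissue.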

  6. Biomechanical Analysis of Normal Brain Development during the First Year of Life Using Finite Strain Theory.

    Science.gov (United States)

    Kim, Jeong Chul; Wang, Li; Shen, Dinggang; Lin, Weili

    2016-12-02

    The first year of life is the most critical time period for structural and functional development of the human brain. Combining longitudinal MR imaging and finite strain theory, this study aimed to provide new insights into normal brain development through a biomechanical framework. Thirty-three normal infants were longitudinally imaged using MRI from 2 weeks to 1 year of age. Voxel-wise Jacobian determinant was estimated to elucidate volumetric changes while Lagrange strains (both normal and shear strains) were measured to reveal directional growth information every 3 months during the first year of life. Directional normal strain maps revealed that, during the first 6 months, the growth pattern of gray matter is anisotropic and spatially inhomogeneous with higher left-right stretch around the temporal lobe and interhemispheric fissure, anterior-posterior stretch in the frontal and occipital lobes, and superior-inferior stretch in right inferior occipital and right inferior temporal gyri. In contrast, anterior lateral ventricles and insula showed an isotropic stretch pattern. Volumetric and directional growth rates were linearly decreased with age for most of the cortical regions. Our results revealed anisotropic and inhomogeneous brain growth patterns of the human brain during the first year of life using longitudinal MRI and a biomechanical framework.
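
The two central quantities of that finite-strain framework, the Jacobian determinant (volumetric change) and the Lagrange strain tensor (directional normal and shear strains), follow directly from a deformation gradient F. A small numpy sketch with an invented pure left-right stretch:

```python
import numpy as np

def growth_measures(F):
    """Local growth measures from a 3x3 deformation gradient F:
    the Jacobian determinant J (volume change) and the Green-Lagrange
    strain tensor E, whose diagonal holds the normal strains and
    off-diagonal entries the shear strains."""
    J = np.linalg.det(F)
    E = 0.5 * (F.T @ F - np.eye(3))
    return J, E

# Pure anisotropic growth: 10% stretch along x, none along y or z
F = np.diag([1.10, 1.00, 1.00])
J, E = growth_measures(F)
```

Here J = 1.10 (a 10% local volume increase) and E has a single nonzero normal strain, E[0,0] = (1.1^2 - 1)/2 = 0.105, mirroring how the study reads direction-specific growth out of voxel-wise deformation fields.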

  7. Smooth quantile normalization.

    Science.gov (United States)

    Hicks, Stephanie C; Okrah, Kwame; Paulson, Joseph N; Quackenbush, John; Irizarry, Rafael A; Bravo, Héctor Corrada

    2018-04-01

    Between-sample normalization is a critical step in genomic data analysis to remove systematic bias and unwanted technical variation in high-throughput data. Global normalization methods are based on the assumption that observed variability in global properties is due to technical reasons and is unrelated to the biology of interest. For example, some methods correct for differences in sequencing read counts by scaling features to have similar median values across samples, but these fail to reduce other forms of unwanted technical variation. Methods such as quantile normalization transform the statistical distributions across samples to be the same and assume global differences in the distribution are induced by only technical variation. However, it remains unclear how to proceed with normalization if these assumptions are violated, for example, if there are global differences in the statistical distributions between biological conditions or groups, and external information, such as negative or control features, is not available. Here, we introduce a generalization of quantile normalization, referred to as smooth quantile normalization (qsmooth), which is based on the assumption that the statistical distribution of each sample should be the same (or have the same distributional shape) within biological groups or conditions, but allowing that they may differ between groups. We illustrate the advantages of our method on several high-throughput datasets with global differences in distributions corresponding to different biological conditions. We also perform a Monte Carlo simulation study to illustrate the bias-variance tradeoff and root mean squared error of qsmooth compared to other global normalization methods. A software implementation is available from https://github.com/stephaniehicks/qsmooth.
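
The classical quantile normalization that qsmooth generalizes can be written in a few lines of numpy: every sample (column) is forced onto a common reference distribution, namely the mean of the sorted columns. This is a generic sketch of the standard procedure, not the qsmooth implementation, and it ignores tie handling for brevity:

```python
import numpy as np

def quantile_normalize(X):
    """Classical quantile normalization. Rows are features, columns are
    samples. Each column is replaced, rank for rank, by the mean of the
    sorted columns, so all samples end up with an identical distribution."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)  # per-column ranks
    mean_sorted = np.sort(X, axis=0).mean(axis=1)      # reference distribution
    return mean_sorted[ranks]

X = np.array([[5.0, 4.0, 3.0],
              [2.0, 1.0, 4.0],
              [3.0, 4.0, 6.0],
              [4.0, 2.0, 8.0]])
Xn = quantile_normalize(X)
```

After normalization every column contains exactly the same set of values; qsmooth relaxes this by shrinking toward a within-group reference instead of one global reference.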

  8. An analysis of normalization methods for Drosophila RNAi genomic screens and development of a robust validation scheme

    Science.gov (United States)

    Wiles, Amy M.; Ravi, Dashnamoorthy; Bhavani, Selvaraj; Bishop, Alexander J.R.

    2010-01-01

    Genome-wide RNAi screening is a powerful, yet relatively immature technology that allows investigation into the role of individual genes in a process of choice. Most RNAi screens identify a large number of genes with a continuous gradient in the assessed phenotype. Screeners must then decide whether to examine just those genes with the most robust phenotype or to examine the full gradient of genes that cause an effect and how to identify the candidate genes to be validated. We have used RNAi in Drosophila cells to examine viability in a 384-well plate format and compare two screens, untreated control and treatment. We compare multiple normalization methods, which take advantage of different features within the data, including quantile normalization, background subtraction, scaling, cellHTS2, and interquartile range measurement. Considering the false-positive potential that arises from RNAi technology, a robust validation method was designed for the purpose of gene selection for future investigations. In a retrospective analysis, we describe the use of validation data to evaluate each normalization method. While no normalization method worked ideally, we found that a combination of two methods, background subtraction followed by quantile normalization and cellHTS2, at different thresholds, captures the most dependable and diverse candidate genes. Thresholds are suggested depending on whether a few candidate genes are desired or a more extensive systems level analysis is sought. In summary, our normalization approaches and experimental design to perform validation experiments are likely to apply to those high-throughput screening systems attempting to identify genes for systems level analysis. PMID:18753689
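
Two of the normalization steps named above, background subtraction followed by per-plate scaling, can be sketched as follows; the well counts and background estimate are hypothetical, and this is a schematic of the general technique rather than the paper's pipeline:

```python
import numpy as np

def normalize_plate(plate, background):
    """Normalize raw viability readings from one 384-well plate:
    (1) subtract a per-plate background estimate, then
    (2) scale so the plate's median corrected signal equals 1.0."""
    corrected = plate - background   # background subtraction
    scale = np.median(corrected)     # per-plate scaling factor
    return corrected / scale

# Hypothetical plate: true signal around 1000 plus a 200-unit background
rng = np.random.default_rng(2)
raw = rng.normal(1000.0, 50.0, size=384) + 200.0
norm = normalize_plate(raw, background=200.0)
```

Scaling by the median rather than the mean keeps a handful of strong hits from shifting the whole plate, which matters when most wells are expected to behave like controls.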

  9. A comparison of various "housekeeping" probes for northern analysis of normal and osteoarthritic articular cartilage RNA.

    Science.gov (United States)

    Matyas, J R; Huang, D; Adams, M E

    1999-01-01

    Several approaches are commonly used to normalize variations in RNA loading on Northern blots, including: ethidium bromide (EthBr) fluorescence of 18S or 28S rRNA or autoradiograms of radioactive probes hybridized with constitutively expressed RNAs such as elongation factor-1alpha (ELF), glyceraldehyde-3-phosphate dehydrogenase (G3PDH), actin, 18S or 28S rRNA, or others. However, in osteoarthritis (OA) the amount of total RNA changes significantly and none of these RNAs has been clearly demonstrated to be expressed at a constant level, so it is unclear if any of these approaches can be used reliably for normalizing RNA extracted from osteoarthritic cartilage. Total RNA was extracted from normal and osteoarthritic cartilage and assessed by EthBr fluorescence. RNA was then transferred to a nylon membrane hybridized with radioactive probes for ELF, G3PDH, Max, actin, and an oligo-dT probe. The autoradiographic signal across the six lanes of a gel was quantified by scanning densitometry. When compared on the basis of total RNA, the coefficient of variation was lowest for 28S ethidium bromide fluorescence and oligo-dT (approximately 7%), followed by 18S ethidium bromide fluorescence and G3PDH (approximately 13%). When these values were normalized to DNA concentration, the coefficient of variation exceeded 50% for all signals. Total RNA and the signals for 18S, 28S rRNA, and oligo-dT all correlated highly. These data indicate that osteoarthritic chondrocytes express similar ratios of mRNA to rRNA and mRNA to total RNA as do normal chondrocytes. Of all the "housekeeping" probes, G3PDH correlated best with the measurements of RNA. All of these "housekeeping" probes are expressed at greater levels by osteoarthritic chondrocytes when compared with normal chondrocytes. Thus, while G3PDH is satisfactory for evaluating the amount of RNA loaded, its level of expression is not the same in normal and osteoarthritic chondrocytes.
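
The statistic used above to rank the housekeeping probes, the coefficient of variation across lanes, is simple to compute; the six-lane densitometry readings below are invented for illustration:

```python
import numpy as np

def coefficient_of_variation(signals):
    """CV (in percent) of densitometry signals across lanes:
    sample standard deviation divided by the mean."""
    signals = np.asarray(signals, dtype=float)
    return 100.0 * signals.std(ddof=1) / signals.mean()

# Hypothetical six-lane readings for two probes (arbitrary units)
oligo_dt = [98, 102, 105, 95, 101, 99]   # low lane-to-lane variation
g3pdh    = [80, 110, 95, 120, 70, 105]   # higher variation

cv_oligo = coefficient_of_variation(oligo_dt)
cv_g3pdh = coefficient_of_variation(g3pdh)
```

A lower CV for a probe, as reported for 28S fluorescence and oligo-dT (~7%), means its signal tracks the loaded RNA amount more consistently across lanes.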

  10. Quantitative electroencephalogram (QEEG Spectrum Analysis of Patients with Schizoaffective Disorder Compared to Normal Subjects.

    Directory of Open Access Journals (Sweden)

    Mahdi Moeini

    2014-12-01

    The aim of this study was to achieve a better understanding of schizoaffective disorder. Therefore, we obtained electroencephalogram (EEG) signals from patients with schizoaffective disorder and analyzed them in comparison to normal subjects. Forty patients with schizoaffective disorder and 40 normal subjects were selected randomly and their electroencephalogram signals were recorded based on the 10-20 international system by 23 electrodes in open- and closed-eye conditions while they were sitting on a chair comfortably. After preprocessing for noise removal and artifact reduction, we took 60-second segments from each recorded signal. Then, the absolute and relative powers of these segments were evaluated in all channels and in 4 frequency bands (i.e., delta, theta, alpha and beta waves). Finally, data were analyzed by independent t-test using SPSS software. A significant decrease in relative power in the alpha band, a significant decrease in power spectra in the alpha band and a significant increase in power spectra in the beta band were found in patients compared to normal subjects (P < 0.05). The predominant wave in the centro-parietal region was the beta wave in patients, but it was the alpha band in normal subjects (P = 0.048). Also, the predominant wave of the occipital region in patients was the delta wave, while it was the alpha wave in normal subjects (P = 0.038). Considering the findings, particularly based on the significant decrease of the alpha waves in schizoaffective patients, it can be concluded that schizoaffective disorder can be seen in the schizophrenia spectrum.
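
The band-power computation described above can be sketched with scipy's Welch estimator (a common choice; the abstract does not specify the estimator, and the synthetic alpha-dominated segment below is invented):

```python
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def relative_band_powers(signal, fs):
    """Relative power per classical EEG band from a 1-D signal,
    using Welch's PSD and integration over each band."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    broad = (freqs >= 0.5) & (freqs <= 30)
    total = np.trapz(psd[broad], freqs[broad])
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs <= hi)
        powers[name] = np.trapz(psd[mask], freqs[mask]) / total
    return powers

# 60-second synthetic segment dominated by a 10 Hz (alpha) rhythm
fs = 250
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(3)
eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)
rel = relative_band_powers(eeg, fs)
```

For this alpha-dominated test signal, `rel["alpha"]` should be by far the largest share, which is exactly the quantity the study compares between patients and controls.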

  11. Static and Vibrational Analysis of Partially Composite Beams Using the Weak-Form Quadrature Element Method

    Directory of Open Access Journals (Sweden)

    Zhiqiang Shen

    2012-01-01

    Deformation of partially composite beams under distributed loading and free vibrations of partially composite beams under various boundary conditions are examined in this paper. The weak-form quadrature element method, which is characterized by direct evaluation of the integrals involved in the variational description of a problem, is used. One quadrature element is normally sufficient for a partially composite beam regardless of the magnitude of the shear connection stiffness. The number of integration points in a quadrature element is adjustable in accordance with convergence requirements. Results are compared with those of various finite element formulations. It is shown that the weak-form quadrature element solution for partially composite beams is free of slip locking, and high computational accuracy is achieved with a smaller number of degrees of freedom. Besides, it is found that longitudinal inertia of motion cannot simply be neglected in assessing the dynamic behavior of partially composite beams.

  12. Effect of care management program structure on implementation: a normalization process theory analysis.

    Science.gov (United States)

    Holtrop, Jodi Summers; Potworowski, Georges; Fitzpatrick, Laurie; Kowalk, Amy; Green, Lee A

    2016-08-15

    Care management in primary care can be effective in helping patients with chronic disease improve their health status; however, primary care practices often find implementation challenging. Further, there are different ways to structure care management that may make implementation more or less successful. Normalization process theory (NPT) provides a means of understanding how a new complex intervention can become routine (normalized) in practice. In this study, we used NPT to understand how care management structure affected how well care management became routine in practice. Data collection involved semi-structured interviews and observations conducted at 25 practices in five physician organizations in Michigan, USA. Practices were selected to reflect variation in physician organizations, type of care management program, and degree of normalization. Data were transcribed, qualitatively coded and analyzed, initially using an editing approach and then a template approach with NPT as a guiding framework. Seventy interviews and 25 observations were completed. Two key structures for care management organization emerged: practice-based care management, where the care managers were embedded in the practice as part of the practice team; and centralized care management, where the care managers worked independently of the practice work flow and were located outside the practice. There were differences in normalization of care management across practices. Practice-based care management was generally better normalized as compared to centralized care management. Differences in normalization were well explained by the NPT, and in particular the collective action construct. When care managers had multiple and flexible opportunities for communication (interactional workability), had the requisite knowledge, skills, and personal characteristics (skill set workability), and the organizational support and resources (contextual integration), a trusting professional relationship

  13. NOMAD-Ref: visualization, deformation and refinement of macromolecular structures based on all-atom normal mode analysis.

    Science.gov (United States)

    Lindahl, Erik; Azuara, Cyril; Koehl, Patrice; Delarue, Marc

    2006-07-01

    Normal mode analysis (NMA) is an efficient way to study collective motions in biomolecules that bypasses the computational costs and many limitations associated with full dynamics simulations. The NOMAD-Ref web server presented here provides tools for online calculation of the normal modes of large molecules (up to 100,000 atoms) maintaining a full all-atom representation of their structures, as well as access to a number of programs that utilize these collective motions for deformation and refinement of biomolecular structures. Applications include the generation of sets of decoys with correct stereochemistry but arbitrary large amplitude movements, the quantification of the overlap between alternative conformations of a molecule, refinement of structures against experimental data, such as X-ray diffraction structure factors or Cryo-EM maps and optimization of docked complexes by modeling receptor/ligand flexibility through normal mode motions. The server can be accessed at the URL http://lorentz.immstr.pasteur.fr/nomad-ref.php.
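
At its core, the NMA that servers like NOMAD-Ref perform is an eigen-decomposition of a Hessian built from harmonic springs between atoms. A toy stand-in (a free 1-D chain of unit masses, not an all-atom model) shows the structure of the calculation:

```python
import numpy as np

def chain_normal_modes(n, k=1.0):
    """Normal modes of a free 1-D chain of n unit masses joined by
    springs of stiffness k. Returns eigenvalues (squared frequencies,
    ascending) and eigenvectors (mode shapes) of the spring Hessian."""
    H = np.zeros((n, n))
    for i in range(n - 1):            # assemble the pairwise spring Hessian
        H[i, i] += k
        H[i + 1, i + 1] += k
        H[i, i + 1] -= k
        H[i + 1, i] -= k
    evals, evecs = np.linalg.eigh(H)  # symmetric eigen-decomposition
    return evals, evecs

evals, evecs = chain_normal_modes(5)
```

The zero eigenvalue corresponds to rigid-body translation (a uniform mode shape); in 3-D all-atom NMA there are six such zero modes, and the low-frequency nonzero modes are the collective motions used for deformation and refinement.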

  14. Differences in trace element concentrations between Alzheimer and 'normal' human brain tissue using instrumental neutron activation analysis (INAA)

    International Nuclear Information System (INIS)

    Panayi, A.E.; Spyrou, N.M.

    2001-01-01

    Brain samples obtained from the Netherlands Brain Bank were taken from the superior frontal gyrus, superior parietal gyrus and medial temporal gyrus of 'normal' and Alzheimer's disease subjects in order to determine elemental concentrations and compare elemental composition. Brain samples from the cortex were taken from 18 subjects, eight 'normals' (6 males and 2 females) and eleven with Alzheimer's disease (1 male and 10 females), and the concentrations of the following elements, Na, K, Fe, Zn, Se, Br, Rb, Ag, Cs, Ba, and Eu, were determined by instrumental neutron activation analysis (INAA). The element which showed the greatest difference was Br, which was found to be significantly elevated in the cortex of Alzheimer's disease brains as compared to the 'normals' (p < 0.001). (author)

  15. Numerical analysis of the effects induced by normal faults and dip angles on rock bursts

    Science.gov (United States)

    Jiang, Lishuai; Wang, Pu; Zhang, Peipeng; Zheng, Pengqiang; Xu, Bin

    2017-10-01

    The study of mining effects under the influences of a normal fault and its dip angle is significant for the prediction and prevention of rock bursts. Based on the geological conditions of panel 2301N in a coalmine, the evolution laws of the strata behaviors of the working face affected by a fault and the instability of the fault induced by mining operations with the working face of the footwall and hanging wall advancing towards a normal fault are studied using UDEC numerical simulation. The mechanism that induces rock burst is revealed, and the influence characteristics of the fault dip angle are analyzed. The results of the numerical simulation are verified by conducting a case study regarding the microseismic events. The results of this study serve as a reference for the prediction of rock bursts and their classification into hazardous areas under similar conditions.

  16. [Quantitative analysis method based on fractal theory for medical imaging of normal brain development in infants].

    Science.gov (United States)

    Li, Heheng; Luo, Liangping; Huang, Li

    2011-02-01

    The present paper aims to study the fractal spectrum of cerebral computerized tomography in 158 normal infants of different age groups, based on calculations from chaos theory. The distribution range in the neonatal period was 1.88-1.90 (mean = 1.8913 +/- 0.0064); it reached a stable condition at the level of 1.89-1.90 during 1-12 months of age (mean = 1.8927 +/- 0.0045); the normal range of 1-2 year-old infants was 1.86-1.90 (mean = 1.8863 +/- 0.0085); the quantitative value remained invariant within 1.88-1.91 (mean = 1.8958 +/- 0.0083) during 2-3 years of age. ANOVA indicated there was no significant difference between boys and girls (F = 0.243, P > 0.05), but the difference between age groups was significant (F = 8.947, P development.
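
Fractal dimensions like those reported above are commonly estimated by box counting; the abstract does not state the authors' exact estimator, so the following is a generic sketch, verified on a filled square whose dimension should come out close to 2:

```python
import numpy as np

def box_counting_dimension(image, sizes=(2, 4, 8, 16, 32)):
    """Box-counting estimate of the fractal dimension of a binary image:
    count occupied boxes at several box sizes, then fit the slope of
    log(count) versus log(1/size)."""
    counts = []
    for s in sizes:
        n = 0
        for i in range(0, image.shape[0], s):
            for j in range(0, image.shape[1], s):
                if image[i:i + s, j:j + s].any():
                    n += 1
        counts.append(n)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Sanity check: a filled 128x128 square has dimension 2
filled = np.ones((128, 128), dtype=bool)
dim = box_counting_dimension(filled)
```

Applied to a segmented brain-tissue mask, the same slope yields the kind of dimension values (~1.89) tabulated per age group above.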

  17. Analysis of adaptability of radioactive liquid effluent discharge under normal condition of inland nuclear power plant

    International Nuclear Information System (INIS)

    Xu Yueping; Zhang Bing; Chen Yang; Zhu Lingqing; Tao Yunliang; Shangguan Zhihong

    2011-01-01

    The discharge of radioactive liquid effluent from inland nuclear power plant under normal operation is an important part to be considered in environmental impact assessment. Requirements of newly revised and upcoming standards GB 6249 and GB 14587 are introduced in this paper. Through an example of an inland NPP siting in the preliminary feasibility study phase, the adaptability to the relevant regulations in the site selection is analyzed. Also, the concerned problems in the design of AP1000 units are addressed. (authors)

  18. Reactor internals design/analysis for normal, upset, and faulted conditions

    International Nuclear Information System (INIS)

    Burke, F.R.

    1977-06-01

    The analytical procedures used by Babcock and Wilcox to demonstrate the structural integrity of the 205-FA reactor internals are described. Analytical results are presented and compared to ASME Code allowable limits for Normal, Upset, and Faulted conditions. The particular faulted condition considered is a simultaneous loss-of-coolant accident and safe shutdown earthquake. The operating basis earthquake is addressed as an Upset condition

  19. Hyaluronic Acid in Normal and Neoplastic Colorectal Tissue: Electrospray Ionization Mass Spectrometric and Fluorimetric Analysis

    Directory of Open Access Journals (Sweden)

    Ana Paula Cleto Marolla

    2016-01-01

    Conclusions: The expression of HA was found to be slightly lower in tumor tissue than in colorectal non-neoplastic mucosa, although this difference was not statistically significant. This finding probably influenced the lower expression of HA in tumor tissue than in colorectal non-neoplastic mucosa. Compared to normal tissues, HA levels are significantly increased in the tumor tissues unless they exhibit lymph node metastasis. Otherwise, the expression of HA in tumor tissue did not correlate with the other clinicopathological parameters.

  20. Longitudinal genetic analysis of brain volumes in normal elderly male twins

    OpenAIRE

    Lessov-Schlaggar, Christina N.; Hardin, Jill; DeCarli, Charles; Krasnow, Ruth E.; Reed, Terry; Wolf, Philip A.; Swan, Gary E.; Carmelli, Dorit

    2010-01-01

    This study investigated the role of genetic and environmental influences on individual differences in brain volumes measured at two time points in normal elderly males from the National Heart, Lung, and Blood Institute Twin Study. The MRI scans were conducted four years apart on 33 monozygotic and 33 dizygotic male twin pairs, aged 68 to 77 years when first scanned. Volumetric measures of total brain and total cerebrospinal fluid were significantly heritable at baseline (over 70%). For both v...

  1. Three-dimensional finite analysis of acetabular contact pressure and contact area during normal walking.

    Science.gov (United States)

    Wang, Guangye; Huang, Wenjun; Song, Qi; Liang, Jinfeng

    2017-11-01

    This study aims to analyze the contact areas and pressure distributions between the femoral head and mortar during normal walking using a three-dimensional finite element model (3D-FEM). Computed tomography (CT) scanning technology and a computer image processing system were used to establish the 3D-FEM. The acetabular mortar model was used to simulate the pressures during 32 consecutive normal walking phases and the contact areas at different phases were calculated. The distribution of the pressure peak values during the 32 consecutive normal walking phases was bimodal, reaching its peak (4.2 MPa) at the initial phase, where the contact area was significantly higher than that at the stepping phase. The sites that always kept contact were concentrated on the acetabular top and leaned inwards, while the anterior and posterior acetabular horns had no pressure concentration. The pressure distributions of acetabular cartilage at different phases were significantly different: the zone of increased pressure at the support phase was distributed at the acetabular top area, while that at the stepping phase was distributed in the inside of the acetabular cartilage. The zones of increased contact pressure and the distributions of acetabular contact areas are of important significance for clinical research, and could indicate the inductive factors of acetabular osteoarthritis. Copyright © 2016. Published by Elsevier Taiwan.

  2. [Nutritional analysis of dietary patterns in students of primary education with normal nutritional status].

    Science.gov (United States)

    Durá-Gúrpide, Beatriz; Durá-Travé, Teodoro

    2014-06-01

    To perform a nutritional assessment of the dietary model in a group of primary school students (9-12 years) with normal nutritional status. Recording of food consumption on two consecutive school days in a sample of 353 primary school students (188 boys and 165 girls) with normal nutritional status. The intakes of energy, macronutrients, minerals, and vitamins were calculated and compared with the recommended intakes. The mean daily caloric intake was 2,066.9 kcal. Grains (33%), dairy products (19%) and meats (17%) represented 70% of the total caloric intake. Proteins contributed 20.3% of the caloric intake, sugars 48.8%, lipids 30.9%, and saturated fats 12.6%. Cholesterol intake was excessive and 2/3 of the caloric intake was of animal origin. The mean intakes of calcium, iodine and vitamins A, D and E were lower than the recommended dietary intakes. The dietary model of the primary school students with normal nutritional status deviates from the Mediterranean prototype, with an excessive intake of meats, limited intake of grains and dairy products, and deficient intake of vegetables, fruits, legumes, and fish. This leads to an increased intake of protein and animal fats to the detriment of complex carbohydrates, and a deficient intake of calcium, iodine, and vitamins A, D and E. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
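
The macronutrient percentages above follow from the standard Atwater factors (4 kcal/g for protein and carbohydrate, 9 kcal/g for fat). The gram figures below are hypothetical, chosen to roughly reproduce the reported split for a ~2,067 kcal intake (treating the reported "sugars" share as total carbohydrate):

```python
# Energy contributed by each macronutrient via Atwater factors.
KCAL_PER_G = {"protein": 4, "carbohydrate": 4, "fat": 9}
intake_g = {"protein": 105, "carbohydrate": 252, "fat": 71}  # hypothetical

kcal = {n: g * KCAL_PER_G[n] for n, g in intake_g.items()}
total = sum(kcal.values())
percent = {n: 100 * v / total for n, v in kcal.items()}
```

This reproduces roughly the 20.3% / 48.8% / 30.9% split reported in the abstract, illustrating how the percentages are derived from the food records.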

  3. Elemental concentration analysis in PCa, BPH and normal prostate tissues using SR-TXRF

    International Nuclear Information System (INIS)

    Leitao, Roberta G.; Anjos, Marcelino J.; Canellas, Catarine G.L.; Lopes, Ricardo T.

    2009-01-01

    Prostate cancer (PCa) is one of the main causes of illness and death all over the world. In Brazil, prostate cancer currently represents the second most prevalent malignant neoplasia in men, representing 21% of all cancer cases. Benign Prostate Hyperplasia (BPH) is an illness prevailing in men above the age of 50, reaching close to 90% after the age of 80. The prostate presents a high zinc concentration, about 10-fold higher than any other body tissue. In this work, samples of human prostate tissue with cancer (PCa), BPH and normal tissue were analyzed utilizing total reflection X-ray fluorescence spectroscopy with synchrotron radiation (SR-TXRF) to investigate the differences in the elemental concentrations in these tissues. SR-TXRF analyses were performed at the X-ray fluorescence beamline at the Brazilian National Synchrotron Light Laboratory (LNLS), in Campinas, Sao Paulo. It was possible to determine the concentrations of the following elements: P, S, K, Ca, Fe, Cu, Zn, Br and Rb. Using the Mann-Whitney U test, it was observed that almost all elements presented concentrations with significant differences (α = 0.05) between the groups studied. The elements and groups were: S, K, Ca, Fe, Zn, Br and Rb (PCa X Normal); S, Fe, Zn and Br (PCa X BPH); K, Ca, Fe, Zn, Br and Rb (BPH X Normal). (author)
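
The group comparison above uses the Mann-Whitney U test, which scipy provides directly. The zinc concentrations below are invented (chosen only to echo the well-known drop of prostatic zinc in cancer), not the paper's measurements:

```python
from scipy.stats import mannwhitneyu

# Hypothetical Zn concentrations (arbitrary units) in normal vs PCa tissue
normal_zn = [950, 1020, 880, 1100, 990, 1050, 930, 1010]
pca_zn    = [210, 310, 180, 260, 240, 330, 200, 280]

# Two-sided rank test, appropriate for small, non-normal samples
stat, p = mannwhitneyu(normal_zn, pca_zn, alternative="two-sided")
significant = p < 0.05
```

Because the two invented samples do not overlap at all, U takes its maximum value (n1*n2 = 64) and the exact two-sided p-value is far below 0.05, the α used in the abstract.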

  4. Quantitative electroencephalogram (QEEG) Spectrum Analysis of Patients with Schizoaffective Disorder Compared to Normal Subjects.

    Science.gov (United States)

    Moeini, Mahdi; Khaleghi, Ali; Amiri, Nasrin; Niknam, Zahra

    2014-10-01

    The aim of this study was to achieve a better understanding of schizoaffective disorder. Therefore, we obtained electroencephalogram (EEG) signals from patients with schizoaffective disorder and analyzed them in comparison to normal subjects. Forty patients with schizoaffective disorder and 40 normal subjects were selected randomly, and their electroencephalogram signals were recorded with 23 electrodes according to the international 10-20 system, in eyes-open and eyes-closed conditions, while the subjects sat comfortably in a chair. After preprocessing for noise removal and artifact reduction, we took 60-second segments from each recorded signal. Then, the absolute and relative powers of these segments were evaluated in all channels and in 4 frequency bands (i.e., delta, theta, alpha and beta waves). Finally, data were analyzed by independent t-test using SPSS software. A significant decrease in relative power in the alpha band, a significant decrease in power spectra in the alpha band and a significant increase in power spectra in the beta band were found in patients compared to normal subjects (P < 0.05). Given these EEG abnormalities in schizoaffective patients, it can be concluded that schizoaffective disorder falls within the schizophrenia spectrum.

  5. Effect of psychological intervention in the form of relaxation and guided imagery on cellular immune function in normal healthy subjects. An overview

    DEFF Research Database (Denmark)

    Zachariae, R; Kristensen, J S; Hokland, P

    1991-01-01

    The present study measured the effects of relaxation and guided imagery on cellular immune function. During a period of 10 days 10 healthy subjects were given one 1-hour relaxation procedure and one combined relaxation and guided imagery procedure, instructing the subjects to imagine their immune...... on the immune defense and could form the basis of further studies on psychological intervention and immunological status.

  6. Matrix forming characteristics of inner and outer human meniscus cells on 3D collagen scaffolds under normal and low oxygen tensions.

    Science.gov (United States)

    Croutze, Roger; Jomha, Nadr; Uludag, Hasan; Adesida, Adetola

    2013-12-13

    Limited intrinsic healing potential of the meniscus and a strong correlation between meniscal injury and osteoarthritis have prompted investigation of surgical repair options, including the implantation of functional bioengineered constructs. Cell-based constructs appear promising; however, the generation of meniscal constructs is complicated by the presence of diverse cell populations within this heterogeneous tissue and by gaps in the information concerning their response to manipulation of oxygen tension during cell culture. Four human lateral menisci were harvested from patients undergoing total knee replacement. Inner and outer meniscal fibrochondrocytes (MFCs) were expanded to passage 3 in growth medium supplemented with basic fibroblast growth factor (FGF-2), then embedded in porous collagen type I scaffolds and chondrogenically stimulated with transforming growth factor β3 (TGF-β3) under 21% (normal or normoxic) or 3% (hypoxic) oxygen tension for 21 days. Following scaffold culture, constructs were analyzed biochemically for glycosaminoglycan production, histologically for deposition of extracellular matrix (ECM), as well as at the molecular level for expression of characteristic mRNA transcripts. Constructs cultured under normal oxygen tension expressed higher levels of collagen type II (p = 0.05) and aggrecan (p < 0.05) than those cultured under hypoxic oxygen tension. There was no significant difference in expression of these genes between scaffolds seeded with MFCs isolated from inner or outer regions of the tissue following 21 days of chondrogenic stimulation (p > 0.05). Cells isolated from the inner and outer regions of the human meniscus demonstrated equivalent differentiation potential toward the chondrogenic phenotype and ECM production. Oxygen tension played a key role in modulating the redifferentiation of meniscal fibrochondrocytes on a 3D collagen scaffold in vitro.

  7. Reconstructing Normality

    DEFF Research Database (Denmark)

    Gildberg, Frederik Alkier; Bradley, Stephen K.; Fristed, Peter Billeskov

    2012-01-01

    Forensic psychiatry is an area of priority for the Danish Government. As the field expands, this calls for increased knowledge about mental health nursing practice, as this is part of the forensic psychiatry treatment offered. However, only sparse research exists in this area. The aim of this study...... was to investigate the characteristics of forensic mental health nursing staff interaction with forensic mental health inpatients and to explore how staff give meaning to these interactions. The project included 32 forensic mental health staff members, with over 307 hours of participant observations, 48 informal....... The intention is to establish a trusting relationship to form behaviour and perceptual-corrective care, which is characterized by staff's endeavours to change, halt, or support the patient's behaviour or perception in relation to staff's perception of normality. The intention is to support and teach the patient...

  8. Tag Questions across Irish English and British English: A Corpus Analysis of Form and Function

    Science.gov (United States)

    Barron, Anne; Pandarova, Irina; Muderack, Karoline

    2015-01-01

    The present study, situated in the area of variational pragmatics, contrasts tag question (TQ) use in Ireland and Great Britain using spoken data from the Irish and British components of the International Corpus of English (ICE). Analysis is on the formal and functional level and also investigates form-functional relationships. Findings reveal…

  9. Forms of Fighting: A Micro-Social Analysis of Bullying and In-School Violence

    Science.gov (United States)

    Malette, Nicole

    2017-01-01

    Current empirical research on youth bullying rarely asks students to describe their violent encounters. This practice conflates incidents of aggression that may actually have different forms and features. In this article I provide the results of a qualitative analysis of retrospective interviews with high school youth about their experiences of…

  10. Vitality Forms Processing in the Insula during Action Observation: A Multivoxel Pattern Analysis.

    Science.gov (United States)

    Di Cesare, Giuseppe; Valente, Giancarlo; Di Dio, Cinzia; Ruffaldi, Emanuele; Bergamasco, Massimo; Goebel, Rainer; Rizzolatti, Giacomo

    2016-01-01

    Observing the style of an action done by others allows the observer to understand the cognitive state of the agent. This information has been termed by Stern "vitality forms". Previous experiments showed that the dorso-central insula is selectively active both during vitality form observation and execution. In the present study, we presented participants with videos showing hand actions performed with different velocities and asked them to judge either their vitality form (gentle, neutral, rude) or their velocity (slow, medium, fast). The aim of the present study was to assess, using multi-voxel pattern analysis, whether the vitality forms and velocities of observed goal-directed actions are differentially processed in the insula, and more specifically whether action velocity is encoded per se or is an element that triggers the neural populations of the insula encoding the vitality form. The results showed that, consistently across subjects, the dorso-central sector of the insula contained voxels selectively tuned to vitality forms, while voxels tuned to velocity were rare. These results indicate that the dorso-central insula, which previous data showed to be involved in vitality form processing, contains voxels specific for action style processing.

  11. Analysis of approaches to classification of forms of non-standard employment

    Directory of Open Access Journals (Sweden)

    N. V. Dorokhova

    2017-01-01

    Full Text Available Non-standard forms of employment are currently becoming more widespread, yet there is no clear approach to defining and delimiting non-standard employment. The article analyses diverse interpretations of the concept, on the basis of which the author concludes that non-standard employment is a complex and contradictory economic category. Different approaches to classifying forms of non-standard employment are examined. The main forms of non-standard employment include the flexible working year, flexible working week, flexible working hours, remote work, on-call work, shift forwarding, agency employment, self-employment, negotiator, underemployment, over-employment, employment on the basis of fixed-term contracts, employment based on contracts of a civil-legal nature, one-time employment, casual employment, temporary employment, secondary employment and part-time employment. The author's approach to classifying non-standard forms of employment is based on identifying the impact of atypical employment on the development of human potential. For the purpose of classifying non-standard employment forms from the standpoint of their impact on human development, the following classification criteria are proposed: working conditions, wages and social guarantees, the possibility of workers' participation in management, personal development, and employment stability. Depending on the value of each of these criteria, a given form of non-standard employment can be classified as progressive or regressive. Such a classification of non-standard forms of employment should form the basis of state employment policy.

  12. Quantitative Analysis of Torso FDG-PET Scans by Using Anatomical Standardization of Normal Cases from Thorough Physical Examinations.

    Directory of Open Access Journals (Sweden)

    Takeshi Hara

    Full Text Available Understanding the standardized uptake value (SUV) of 2-deoxy-2-[18F]fluoro-d-glucose positron emission tomography (FDG-PET) depends on the background accumulation of glucose, because the SUV often varies with the status of the patient. The purpose of this study was to develop a new method for quantitative analysis of the SUV of FDG-PET scan images. The method included an anatomical standardization and a statistical comparison with normal cases using the Z-score, as is often done in the SPM or 3D-SSP approaches for brain function analysis. Our scheme consisted of two parts: the construction of a normal model and the determination of SUV scores as a Z-score index measuring the abnormality of an FDG-PET scan image. To construct the normal torso model, all of the normal images were registered into one shape, which indicated the normal range of the SUV at each voxel. The image deformation process consisted of a whole-body rigid registration of the shoulder-to-bladder region, a liver registration, and a non-linear registration of the body surface using the thin-plate spline technique. In order to validate the usefulness of our method, we segmented suspicious regions on FDG-PET images manually and obtained the Z-scores of the regions based on the corresponding voxels, which store the means and standard deviations from the normal model. We collected 243 normal cases (143 males and 100 females) to construct the normal model. We also extracted 432 abnormal spots from 63 abnormal cases (73 cancer lesions) to validate the Z-scores. The Z-scores of 417 out of 432 abnormal spots were higher than 2.0, which statistically indicated the severity of the spots. In conclusion, the Z-scores obtained by our computerized scheme with anatomical standardization of the torso region would be useful for visualization and detection of subtle lesions on FDG-PET scan images even when the SUV does not clearly show an abnormality.
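The Z-score step described above can be sketched in a few lines: after anatomical standardization, each voxel's SUV is compared against the voxel-wise mean and standard deviation learned from the normal cases. This is a minimal sketch with toy 3x3 "images" standing in for registered FDG-PET volumes; none of it is the authors' pipeline.

```python
# Minimal sketch of voxel-wise Z-scoring against a normal model
# (toy data, not the study's registered FDG-PET volumes).
import statistics

normal_scans = [
    [[1.0, 1.1, 0.9], [1.2, 1.0, 1.1], [0.9, 1.0, 1.0]],
    [[1.1, 0.9, 1.0], [1.0, 1.1, 0.9], [1.0, 1.2, 1.1]],
    [[0.9, 1.0, 1.1], [1.1, 0.9, 1.0], [1.1, 1.0, 0.9]],
]

def voxel_stats(scans):
    """Per-voxel mean and standard deviation over the normal model."""
    rows, cols = len(scans[0]), len(scans[0][0])
    mean = [[0.0] * cols for _ in range(rows)]
    sd = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            vals = [s[r][c] for s in scans]
            mean[r][c] = statistics.mean(vals)
            sd[r][c] = statistics.stdev(vals)
    return mean, sd

def z_score(suv, mean, sd, r, c):
    """How many SDs this SUV lies above the normal-model mean at (r, c)."""
    return (suv - mean[r][c]) / sd[r][c]

mean, sd = voxel_stats(normal_scans)
z = z_score(3.0, mean, sd, 1, 1)  # a suspicious SUV of 3.0 at voxel (1, 1)
print(z > 2.0)  # True: flagged as abnormal, matching the study's 2.0 threshold
```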

  13. Estimation of normal hydration in dialysis patients using whole body and calf bioimpedance analysis.

    Science.gov (United States)

    Zhu, Fansan; Kotanko, Peter; Handelman, Garry J; Raimann, Jochen G; Liu, Li; Carter, Mary; Kuhlmann, Martin K; Seibert, Eric; Leonard, Edward F; Levin, Nathan W

    2011-07-01

    Prescription of an appropriate dialysis target weight (dry weight) requires accurate evaluation of the degree of hydration. The aim of this study was to investigate whether a state of normal hydration (DWcBIS) as defined by calf bioimpedance spectroscopy (cBIS) and conventional whole-body bioimpedance spectroscopy (wBIS) could be characterized in hemodialysis (HD) patients and normal subjects (NS). wBIS and cBIS were performed in 62 NS (33 m/29 f) and 30 HD patients (16 m/14 f) pre- and post-dialysis to measure extracellular resistance and fluid volume (ECV) by the whole-body and calf bioimpedance methods. Normalized calf resistivity (ρN,5) was defined as resistivity at 5 kHz divided by the body mass index. The ratio of wECV to total body water (wECV/TBW) was calculated. Measurements were made at baseline (BL) and at DWcBIS following the progressive reduction of post-HD weight over successive dialysis treatments until the curve of calf extracellular resistance flattened (stabilization) and ρN,5 was in the range of NS. Blood pressures were measured pre- and post-HD treatment. ρN,5 differed significantly between males and females in NS. In patients, ρN,5 notably increased with the progressive decrease in body weight, and systolic blood pressure significantly decreased pre- and post-HD between BL and DWcBIS, respectively. Although wECV/TBW decreased between BL and DWcBIS, the percentage of change in wECV/TBW was significantly smaller than that in ρN,5 (-5.21 ± 3.2% versus 28 ± 27%, p < 0.05), suggesting that ρN,5 is the more sensitive marker of normal hydration between BL and DWcBIS.
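The normalized-resistivity index defined above (calf resistivity at 5 kHz divided by body mass index) can be sketched directly. The cylinder-segment geometry (ρ = R·A/L) and all numeric values below are illustrative assumptions, not measurements or formulas taken from the study beyond the stated ρN,5 = ρ(5 kHz)/BMI definition.

```python
# Hypothetical sketch of normalized calf resistivity: rho_N,5 = rho(5 kHz) / BMI.
# The calf segment is modelled as a cylinder (an assumption), so rho = R * A / L.
import math

def resistivity_5khz(resistance_ohm, calf_circumference_cm, segment_length_cm):
    """Resistivity (ohm*cm) of a cylindrical calf segment: rho = R * A / L."""
    radius_cm = calf_circumference_cm / (2 * math.pi)
    area_cm2 = math.pi * radius_cm ** 2
    return resistance_ohm * area_cm2 / segment_length_cm

def normalized_calf_resistivity(resistance_ohm, circumference_cm,
                                length_cm, weight_kg, height_m):
    """rho_N,5: resistivity at 5 kHz divided by body mass index."""
    bmi = weight_kg / height_m ** 2
    return resistivity_5khz(resistance_ohm, circumference_cm, length_cm) / bmi

# Invented example numbers: 40-ohm calf segment, 35 cm circumference,
# 10 cm electrode spacing, 70 kg subject, 1.75 m tall.
rho_n5 = normalized_calf_resistivity(40.0, 35.0, 10.0, 70.0, 1.75)
print(round(rho_n5, 2))
```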

  14. Screw-Home Movement of the Tibiofemoral Joint during Normal Gait: Three-Dimensional Analysis.

    Science.gov (United States)

    Kim, Ha Yong; Kim, Kap Jung; Yang, Dae Suk; Jeung, Sang Wook; Choi, Han Gyeol; Choy, Won Sik

    2015-09-01

    The purpose of this study was to evaluate the screw-home movement at the tibiofemoral joint during normal gait by utilizing the 3-dimensional motion capture technique. Fifteen young males and fifteen young females (total 60 knee joints) who had no history of musculoskeletal disease or a particular gait problem were included in this study. Two more markers were attached to the subject in addition to the Helen-Hayes marker set. Thus, two virtual planes, the femoral coronal plane (Pf) and the tibial coronal plane (Pt), were created by Skeletal Builder software. This study measured the 3-dimensional knee joint movement in the sagittal, coronal, and transverse planes of these two virtual planes (Pf and Pt) during normal gait. With respect to kinematics and kinetics, both males and females showed normal adult gait patterns, and the mean difference in the temporal gait parameters was not statistically significant (p > 0.05). In the transverse plane, the screw-home movement occurred as expected during the pre-swing phase and the late-swing phase at an angle of about 17°. However, the tibia rotated externally with respect to the femur, rather than internally, while the knee joint started to flex during the loading response (paradoxical screw-home movement), and the angle was 6°. Paradoxical screw-home movement may be an important mechanism that provides stability to the knee joint during the remaining stance phase. Obtaining the kinematic values of the knee joint during gait can be useful in diagnosing and treating pathological knee joints.

  15. Normal-mode-based analysis of electron plasma waves with second-order Hermitian formalism

    Science.gov (United States)

    Ramos, J. J.; White, R. L.

    2018-03-01

    The classic problem of the dynamic evolution and Landau damping of linear Langmuir electron waves in a collisionless plasma with Maxwellian background is cast as a second-order, self-adjoint problem with a continuum spectrum of real and positive squared frequencies. The corresponding complete basis of singular normal modes is obtained, along with their orthogonality relation. This yields easily the general expression of the time-reversal-invariant solution for any initial-value problem. Examples are given for specific initial conditions that illustrate different behaviors of the Landau-damped macroscopic moments of the perturbations.

  16. Twist–radial normal mode analysis in double-stranded DNA chains

    International Nuclear Information System (INIS)

    Torrellas, Germán; Maciá, Enrique

    2012-01-01

    We study the normal modes of a duplex DNA chain at low temperatures. We consider the coupling between the hydrogen-bond radial oscillations and the twisting motion of each base pair within the Peyrard–Bishop–Dauxois model. The coupling is mediated by the stacking interaction between adjacent base pairs along the helix. We explicitly consider different mass values for different nucleotides, extending previous works. We disclose several resonance conditions of interest, determined by the fine-tuning of certain model parameters. The role of these dynamical effects on the DNA chain charge transport properties is discussed.

  17. Comparative analysis of guide mode of government - oriented industry guidance funds under china’s new normal of economic growth

    Science.gov (United States)

    Sun, Chunling; Cheng, Xuemei

    2017-11-01

    The government-oriented industry guidance funds solve the problems of financing difficulty and high innovation cost against the background of China's new normal. Through collation and comparative analysis of the relevant policies and regulations of the provinces and cities, the guidance funds are divided into three modes. The three modes are then compared and their applicability analysed, in order to guide fund construction in the provinces and cities.

  18. Towards the normalization of cybercrime victimization : A routine activities analysis of cybercrime in europe

    NARCIS (Netherlands)

    Junger, Marianne; Montoya, L.; Hartel, Pieter H.; Heydari, Maliheh

    2017-01-01

    This study investigates the relationships between users' routine activities and socio-economic characteristics and three forms of cybercrime victimization of 1) online shopping fraud, 2) online banking fraud and 3) cyber-attacks (i.e. DDoS attacks). Data from the Eurobarometer, containing a sample

  19. Comparative analysis of gene expression in normal and cancer human prostate cell lines

    Directory of Open Access Journals (Sweden)

    E. E. Rosenberg

    2014-04-01

    Full Text Available Prostate cancer is one of the main causes of mortality in men with malignant tumors. An urgent problem is the search for biomarkers of prostate cancer that would allow distinguishing between aggressive metastatic and latent tumors. The aim of this work was to search for differentially expressed genes in normal epithelial cells PNT2 and the prostate cancer cell lines LNCaP, DU145 and PC3, derived from tumors with different aggressiveness and metastatic ability. Such genes might be used to create a panel of prognostic markers for aggressiveness and metastasis. Relative expression of 65 cancer-related genes was determined by quantitative polymerase chain reaction (Q-PCR). Expression of 29 genes was changed in LNCaP cells, 20 genes in DU145 and 16 genes in PC3 cell lines, compared with the normal line PNT2. The obtained data make it possible to conclude that an epithelial-mesenchymal transition took place, involving the loss of epithelial markers, reduced cell adhesion and increased migration. We also found a few genes differentially expressed among the 3 prostate cancer cell lines. We found that genes involved in cell adhesion (CDH1), invasiveness and metastasis (IL8, CXCL2) and cell cycle control (P16, CCNE1) underwent the most changes. These genes might be used for diagnosis and prognosis of invasive metastatic prostate tumors.

  20. Mechanical stress analysis for a fuel rod under normal operating conditions

    International Nuclear Information System (INIS)

    Pino, Eddy S.; Giovedi, Claudia; Serra, Andre da Silva; Abe, Alfredo Y.

    2013-01-01

    Nuclear reactor fuel elements consist mainly of nuclear fuel encapsulated by a cladding material, subject to high fluxes of energetic neutrons, high operating temperatures, system pressures, thermal gradients and heat fluxes, and required to be chemically compatible with the reactor coolant. The design of a nuclear reactor requires, among a set of activities, the evaluation of the structural integrity of the fuel rod, given the different loads acting on it and the specific properties (dimensions and mechanical and thermal properties) of the cladding material and coolant, including the thermal and pressure gradients produced inside the rod due to fuel burnup. In this work, the structural mechanical stresses of a fuel rod using stainless steel as the cladding material and UO2 with a low degree of enrichment as the fuel pellet, in a PWR (pressurized water reactor) under normal operating conditions, were evaluated. Tangential, radial and axial stresses on the internal and external cladding surfaces at orientations of 0 deg, 90 deg and 180 deg were considered. The obtained values were compared with the stress limits for the studied material. From the obtained results, it was possible to conclude that, under the expected normal reactor operating conditions, the integrity of the fuel rod can be maintained. (author)

  1. Kinetic analysis of zinc metabolism and its regulation in normal humans

    International Nuclear Information System (INIS)

    Wastney, M.E.; Aamodt, R.L.; Rumble, W.F.; Henkin, R.I.

    1986-01-01

    Zinc metabolism was studied in 32 normal volunteers after oral or intravenous administration of 65Zn. Data were collected from the blood, urine, feces, whole body, and over the liver and thigh regions for 9 mo while the subjects consumed their regular diets (containing 10 mg Zn ion/day) and for an additional 9 mo while the subjects received an exogenous oral supplement of 100 mg Zn ion/day. Data from each subject were fitted by a compartmental model for zinc metabolism that was developed previously for patients with taste and smell dysfunction. These data from normal subjects were used to determine the absorption, distribution, and excretion of zinc and the mass of zinc in erythrocytes, liver, thigh, and whole body. By use of additional data obtained from the present study, the model was refined further such that a large compartment, which was previously determined to contain 90% of the body zinc, was subdivided into two compartments to represent zinc in muscle and bone. When oral zinc intake was increased 11-fold, three new sites of regulation of zinc metabolism were identified in addition to the two sites previously defined in patients with taste and smell dysfunction (absorption of zinc from the gut and excretion of zinc in the urine). The three new sites are exchange of zinc with erythrocytes, release of zinc by muscle, and secretion of zinc into the gut. Regulation at these five sites appears to maintain some tissue concentrations of zinc when dietary zinc increases.

  2. A Network Flow-based Analysis of Cognitive Reserve in Normal Ageing and Alzheimer's Disease.

    Science.gov (United States)

    Wook Yoo, Sang; Han, Cheol E; Shin, Joseph S; Won Seo, Sang; Na, Duk L; Kaiser, Marcus; Jeong, Yong; Seong, Joon-Kyung

    2015-05-20

    Cognitive reserve is the ability to sustain cognitive function even with a certain amount of brain damage. Here we investigate the neural compensation mechanism of cognitive reserve from the perspective of structural brain connectivity. Our goal was to show that normal people with high education levels (i.e., cognitive reserve) maintain abundant pathways connecting any two brain regions, providing better compensation or resilience after brain damage. Accordingly, patients with high education levels show more deterioration in structural brain connectivity than those with low education levels before symptoms of Alzheimer's disease (AD) become apparent. To test this hypothesis, we use network flow, which measures the number of alternative paths between two brain regions in the brain network. The experimental results show that for normal aging, education strengthens network reliability, as measured through flow values, in a subnetwork centered at the supramarginal gyrus. For AD, a subnetwork centered at the left middle frontal gyrus shows a negative correlation between flow and education, which implies more collapse in structural brain connectivity for highly educated patients. We conclude that cognitive reserve may come from the ability of network reorganization to secure the information flow within the brain network, thereby making it more resistant to disease progression.
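The flow idea above can be made concrete: with unit edge capacities, the maximum flow between two regions equals the number of edge-disjoint alternative paths connecting them. The sketch below uses a small Edmonds-Karp max-flow on an invented toy graph; the region labels and topology are illustrative assumptions, not the authors' data or pipeline.

```python
# Illustrative Edmonds-Karp max-flow: with unit capacities, the flow value
# counts edge-disjoint "alternative pathways" between two regions.
from collections import deque

def max_flow(capacity, source, sink):
    """Repeatedly augment along shortest residual paths (Edmonds-Karp)."""
    flow = 0
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    while True:
        # BFS for an augmenting path in the residual graph.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual.get(u, {}).items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        # Recover the path, find its bottleneck, update residual capacities.
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v].setdefault(u, 0)
            residual[v][u] += bottleneck
        flow += bottleneck

# Toy structural network: two disjoint routes from the supramarginal gyrus
# ("SMG") to the middle frontal gyrus ("MFG") via invented hubs A and B.
graph = {
    "SMG": {"A": 1, "B": 1},
    "A": {"MFG": 1},
    "B": {"MFG": 1},
    "MFG": {},
}
print(max_flow(graph, "SMG", "MFG"))  # 2: two alternative pathways survive
```

Removing either hub would drop the flow to 1, which is the sense in which higher flow indicates more compensation capacity after damage.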

  3. Migration velocity analysis using a transversely isotropic medium with tilt normal to the reflector dip

    KAUST Repository

    Alkhalifah, T.

    2010-06-13

    A transversely isotropic model in which the tilt is constrained to be normal to the dip (DTI model) allows for simplifications in the imaging and velocity model building efforts as compared to a general TTI model. Though this model, in some cases, cannot be represented physically, as in the case of conflicting dips, it handles all dips under the assumption of a symmetry axis normal to the dip, and provides a process in which areas that meet this feature are handled properly. We use efficient downward continuation algorithms that utilize the reflection features of such a model. Phase-shift migration can be easily extended to approximately handle lateral inhomogeneity, because, unlike the general TTI case, the DTI model reduces to VTI for zero dip. We also equip these continuation algorithms with tools that expose inaccuracies in the velocity. We test this model on synthetic data of a general TTI nature and show its resilience even in coping with complex models like the recently released anisotropic BP model.

  4. MATHEMATICAL ANALYSIS OF DENTAL ARCH OF CHILDREN IN NORMAL OCCLUSION: A LITERATURE REVIEW

    Directory of Open Access Journals (Sweden)

    M. Abu-Hussein DDS, MScD, MSc, DPD

    2012-03-01

    Full Text Available AIM. This paper is an attempt to compare and analyze the various mathematical models for defining the dental arch curvature of children in normal occlusion, based upon a review of the available literature. Background. While various studies have touched upon ways to cure or prevent dental diseases and upon surgical techniques for tooth reconstitution to correct anomalies during childhood, a substantial literature also exists that attempts to mathematically define the dental arch of children in normal occlusion. This paper reviews these dental studies and compares them analytically. Method. The paper compares the different mathematical approaches, highlights the basic assumptions behind each model, underscores the relevancy and applicability of each, and also lists the applicable mathematical formulae. Results. Each model has been found applicable to specific research conditions, as a universal mathematical model for describing the human dental arch still eludes satisfactory definition. The models necessarily need to include the features of the dental arch, such as shape, spacing between teeth, and symmetry or asymmetry, but they also need substantial improvement. Conclusions. While the paper shows that the existing models are inadequate for properly defining the human dental arch, it also acknowledges that future research based on modern imaging techniques and computer-aided simulation could well succeed in deriving an all-inclusive definition for the human dental curve that has so far eluded the experts.

  5. Validation of Tuba1a as Appropriate Internal Control for Normalization of Gene Expression Analysis during Mouse Lung Development

    Directory of Open Access Journals (Sweden)

    Aditi Mehta

    2015-02-01

    Full Text Available The expression ratio between the analysed gene and an internal control gene is the most widely used normalization method for quantitative RT-PCR (qRT-PCR) expression analysis. The ideal reference gene for a specific experiment is the one whose expression is not affected by the different experimental conditions tested. In this study, we validate the applicability of five commonly used reference genes during different stages of mouse lung development. The stability of expression of the five reference genes (Tuba1a, Actb, Gapdh, Rn18S and Hist4h4) was calculated within five experimental groups using the statistical algorithm of the geNorm software. Overall, Tuba1a showed the least variability in expression among the different stages of lung development, while Hist4h4 and Rn18S showed the maximum variability in their expression. Expression analysis of two lung-specific markers, surfactant protein C (SftpC) and Clara cell-specific 10 kDa protein (Scgb1a1), normalized to each of the five reference genes tested here, confirmed our results and showed that an incorrect reference gene choice can lead to artefacts. Moreover, a combination of two internal controls for normalization of expression analysis during lung development will increase the accuracy and reliability of results.
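The geNorm-style stability ranking referred to above rests on a simple measure: for each candidate gene, M is the average standard deviation of the log2 expression ratios against every other candidate, and a lower M means more stable expression. The sketch below implements that measure on invented expression values; it is not the geNorm software or the study's data.

```python
# Sketch of the geNorm stability measure M (invented toy data):
# M(gene) = mean over other genes of the SD of log2 pairwise expression ratios.
import math
import statistics

def genorm_m(expression):
    """expression: {gene: [value per sample]} -> {gene: M stability value}."""
    genes = list(expression)
    m = {}
    for g in genes:
        sds = []
        for other in genes:
            if other == g:
                continue
            ratios = [math.log2(a / b)
                      for a, b in zip(expression[g], expression[other])]
            sds.append(statistics.stdev(ratios))
        m[g] = statistics.mean(sds)  # lower M = more stable reference gene
    return m

# Hypothetical expression values across four developmental stages:
data = {
    "Tuba1a": [10.0, 10.5, 9.8, 10.2],    # nearly constant across stages
    "Gapdh":  [20.0, 22.0, 19.0, 21.0],
    "Rn18S":  [50.0, 90.0, 40.0, 120.0],  # highly variable across stages
}
m = genorm_m(data)
least_stable = max(m, key=m.get)
print(least_stable)  # Rn18S, the largest M in this toy set
```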

  6. Value of normalization analysis of thyroid scans on 131I treatment planning for Graves' disease

    International Nuclear Information System (INIS)

    Jin Zhonghui; Mao Yuan; Chen Man; Zhang Yanyan

    2012-01-01

    Objective: To explore the value of normalization analysis of thyroid scans in 131I treatment planning for Graves' disease. Methods: Patients with hyperthyroidism treated with 131I were retrospectively analyzed. Sixty cases with thyroid glands of less than 35 g and without thyroid nodules were enrolled. Raw data of the thyroid scans were re-processed using software for normalization and magnification. Correlations between the total dose or the number of treatments and other factors, such as the area ratio of the bilateral lobes, bilateral radioactive counts, bilateral gray scales and multifocal uptake patterns, were analyzed using stepwise regression analysis. Correlations between normal thyroid function, hypothyroidism and the above-mentioned factors were analyzed using multiple linear regression analysis. Results: Fifty percent (30/60) of cases were cured after a single-dose treatment, and the remaining 50% required multiple treatments. In addition to thyroid mass and radioactive iodine uptake, the total dose correlated with gender (F=4.23, P=0.050), the area ratio of the bilateral lobes (F=6.20, P=0.020) and multifocal uptake pattern (F=5.12, P=0.033). The number of treatments correlated with the ratio of bilateral gray scales (F=8.89, P=0.006) and multifocal uptake pattern (F=4.98, P=0.034). According to outcomes, patients were divided into a normal thyroid function group and a hypothyroidism group. The 131I dose correlated with the area ratio of the bilateral lobes (F=10.42, P=0.018) and the ratio of bilateral gray scales in the normal thyroid function group (F=10.66, P=0.017); whereas in the hypothyroidism group, the clinical outcome correlated with thyroid mass (F=7.65, P=0.013) and multifocal uptake pattern (F=8.01, P=0.011). Conclusions: Computer-aided normalization analysis is useful for 131I dose calculation in the treatment of hyperthyroidism. For patients with significantly unbalanced bilateral radiotracer distribution, increasing the dose and the number of treatments is suggested.

  7. Fourier phase analysis on equilibrium gated radionuclide ventriculography: range of phase spread and cut-off limits in normal individuals

    International Nuclear Information System (INIS)

    Ramaiah, Vijayaraghavan L.; Harish, B.; Sunil, H.V.; Selvakumar, Job; Ravi Kishore, A.G.; Nair, Gopinathan

    2011-01-01

    To define the range of phase spread on equilibrium gated radionuclide ventriculography (ERNV) in normal individuals and derive the cut-off limits for the parameters used to detect cardiac dyssynchrony. ERNV was carried out in 30 individuals (age 53±23 years, 25 males and 5 females) who had no history of cardiovascular disease. They all had normal left ventricular ejection fraction (LVEF 55-70%) as determined by echocardiography, were in sinus rhythm, with normal QRS duration (≤120 msec) and normal coronary angiography. First harmonic phase analysis was performed on scintigraphic data acquired in best septal view. Left and right ventricular standard deviation (LVSD and RVSD, respectively) and interventricular mechanical delay (IVMD), the absolute difference of mean phase angles of right and left ventricle, were computed and expressed in milliseconds. Mean + 3 standard deviations (SD) was used to derive the cut-off limits. Average LVEF and duration of cardiac cycle in the study group were 62.5%±5.44% and 868.9±114.5 msec, respectively. The observations of LVSD, RVSD and right and left ventricular mean phase angles were shown to be normally distributed by the Shapiro-Wilk test. Cut-off limits for LVSD, RVSD and IVMD were calculated to be 80 msec, 85 msec and 75 msec, respectively. Fourier phase analysis on ERNV is an effective tool for the evaluation of synchronicity of cardiac contraction. The cut-off limits of parameters of dyssynchrony can be used to separate heart failure patients with cardiac dyssynchrony from those without. ERNV can be used to select patients for cardiac resynchronization therapy. (author)
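    The cut-off derivation described above (verify normality with the Shapiro-Wilk test, then take mean + 3 SD of the normal group) can be sketched as follows; the LVSD values below are synthetic placeholders, not data from the study.

```python
# Sketch of a mean + 3*SD cut-off derivation, assuming hypothetical LVSD data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lvsd_ms = rng.normal(loc=35.0, scale=15.0, size=30)  # hypothetical LVSD values (msec)

w_stat, p_value = stats.shapiro(lvsd_ms)             # Shapiro-Wilk normality check
cutoff = lvsd_ms.mean() + 3 * lvsd_ms.std(ddof=1)    # mean + 3 SD upper limit
print(f"Shapiro-Wilk p = {p_value:.2f}; LVSD cut-off = {cutoff:.0f} msec")
```

A mean + 3 SD limit is only meaningful if the normality check passes, which is why the abstract reports the Shapiro-Wilk result first.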

  8. Qualitative Resting Coronary Pressure Wave Form Analysis to Predict Fractional Flow Reserve.

    Science.gov (United States)

    Matsumura, Mitsuaki; Maehara, Akiko; Johnson, Nils P; Fearon, William F; De Bruyne, Bernard; Oldroyd, Keith G; Pijls, Nico H J; Jenkins, Paul; Ali, Ziad A; Mintz, Gary S; Stone, Gregg W; Jeremias, Allen

    2018-03-27

    To evaluate the predictability of resting distal coronary pressure wave forms for fractional flow reserve (FFR). Resting coronary wave forms were qualitatively evaluated for the presence of (i) dicrotic notch; (ii) diastolic dipping; and (iii) ventricularization. In a development cohort (n=88) a scoring system was developed that was then applied to a validation cohort (n=428) using a multivariable linear regression model to predict FFR and receiver operating characteristics (ROC) to predict FFR ≤0.8. In the development cohort, all 3 qualitative parameters were independent predictors of FFR. However, in a multivariable linear regression model in the validation cohort, qualitative wave form analysis did not further improve the ability of resting distal coronary to aortic pressure ratio (Pd/Pa) (p=0.80) or instantaneous wave-free ratio (iFR) (p=0.26) to predict FFR. Using ROC, the area under the curve of resting Pd/Pa (0.86 versus 0.86, P=0.08) and iFR (0.86 versus 0.86, P=0.26) did not improve by adding qualitative analysis. Qualitative coronary wave form analysis showed moderate classification agreement in predicting FFR but did not add substantially to the resting pressure gradients Pd/Pa and iFR; however, when discrepancies between quantitative and qualitative analyses are observed, artifact or pressure drift should be considered.
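    The ROC step described above (how well a resting index discriminates FFR ≤ 0.80) can be sketched with a rank-based AUC; the data are synthetic and the `auc` helper is illustrative, not the authors' code.

```python
# Sketch of AUC computation for predicting FFR <= 0.80 from resting Pd/Pa,
# using the Mann-Whitney U identity. All values are simulated.
import numpy as np

def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney U statistic."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

rng = np.random.default_rng(1)
ffr = rng.uniform(0.6, 1.0, 400)          # synthetic FFR values
positive = (ffr <= 0.80).astype(int)      # hemodynamically significant lesions
pd_pa = ffr + rng.normal(0, 0.03, 400)    # resting Pd/Pa tracking FFR with noise
print(f"AUC of resting Pd/Pa for FFR<=0.80: {auc(positive, -pd_pa):.2f}")
```

Because a lower Pd/Pa indicates a more significant lesion, the score is negated so that higher scores map to the positive class.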

  9. Estimation of normal hydration in dialysis patients using whole body and calf bioimpedance analysis

    International Nuclear Information System (INIS)

    Zhu, Fansan; Kotanko, Peter; Handelman, Garry J; Raimann, Jochen G; Liu, Li; Carter, Mary; Kuhlmann, Martin K; Seibert, Eric; Levin, Nathan W; Leonard, Edward F

    2011-01-01

    Prescription of an appropriate dialysis target weight (dry weight) requires accurate evaluation of the degree of hydration. The aim of this study was to investigate whether a state of normal hydration (DWcBIS) as defined by calf bioimpedance spectroscopy (cBIS) and conventional whole-body bioimpedance spectroscopy (wBIS) could be characterized in hemodialysis (HD) patients and normal subjects (NS). wBIS and cBIS were performed in 62 NS (33 m/29 f) and 30 HD patients (16 m/14 f) pre- and post-dialysis to measure extracellular resistance and fluid volume (ECV) by the whole-body and calf bioimpedance methods. Normalized calf resistivity (ρN,5) was defined as resistivity at 5 kHz divided by the body mass index. The ratio of wECV to total body water (wECV/TBW) was calculated. Measurements were made at baseline (BL) and at DWcBIS following the progressive reduction of post-HD weight over successive dialysis treatments until the curve of calf extracellular resistance flattened (stabilization) and ρN,5 was in the range of NS. Blood pressures were measured pre- and post-HD treatment. ρN,5 differed significantly between males and females in NS. In patients, ρN,5 notably increased with progressive decrease in body weight, and systolic blood pressure significantly decreased pre- and post-HD between BL and DWcBIS, respectively. Although wECV/TBW decreased between BL and DWcBIS, the percentage change in wECV/TBW was significantly smaller than that in ρN,5 (−5.21 ± 3.2% versus 28 ± 27%, p < 0.001). This establishes ρN,5 as a new comparator allowing a clinician to incrementally monitor removal of extracellular fluid from patients over the course of dialysis treatments. The conventional whole-body technique using wECV/TBW was less sensitive than ρN,5 for measuring differences in body hydration between BL and DWcBIS.

  10. Vitality Forms Processing in the Insula during Action Observation: A Multivoxel Pattern Analysis

    Science.gov (United States)

    Di Cesare, Giuseppe; Valente, Giancarlo; Di Dio, Cinzia; Ruffaldi, Emanuele; Bergamasco, Massimo; Goebel, Rainer; Rizzolatti, Giacomo

    2016-01-01

    Observing the style of an action performed by others allows the observer to understand the cognitive state of the agent. This information is what Stern termed "vitality forms". Previous experiments showed that the dorso-central insula is selectively active during both vitality form observation and execution. In the present study, we presented participants with videos showing hand actions performed at different velocities and asked them to judge either their vitality form (gentle, neutral, rude) or their velocity (slow, medium, fast). The aim of the present study was to assess, using multi-voxel pattern analysis, whether the vitality forms and velocities of observed goal-directed actions are differentially processed in the insula, and more specifically whether action velocity is encoded per se or is an element that triggers neural populations of the insula encoding the vitality form. The results showed that, consistently across subjects, the dorso-central sector of the insula contained voxels selectively tuned to vitality forms, while voxels tuned to velocity were rare. These results indicate that the dorso-central insula, which previous data showed to be involved in vitality form processing, contains voxels specific for the processing of action style. PMID:27375461

  11. Ductile failure analysis of high strength steel in hot forming based on micromechanical damage model

    Directory of Open Access Journals (Sweden)

    Ying Liang

    2016-01-01

    The damage evolution of high-strength steel at elevated temperature is investigated using the Gurson-Tvergaard-Needleman (GTN) model. A hybrid method integrating thermal tensile tests and numerical techniques is employed to identify the damage parameters. The analysis shows that the damage parameters differ at different temperatures, following the variation of the tested material's microstructure. Furthermore, the calibrated damage parameters are applied to simulate a bulging forming process at elevated temperature. The experimental results confirm the applicability of the GTN damage model for analyzing sheet formability in hot forming.

  12. Q resolution calculation of small angle neutron scattering spectrometer and analysis of form factor

    International Nuclear Information System (INIS)

    Chen Liang; Peng Mei; Wang Yan; Sun Liangwei; Chen Bo

    2011-01-01

    Methods for calculating the Q resolution function of a Small Angle Neutron Scattering (SANS) spectrometer, and the corresponding standard deviation of Q, were introduced. The contributions to the standard deviation of Q were analysed in terms of the spectrometer geometry and the spread of neutron wavelength, and the one-dimensional Gaussian Q resolution function was examined. The form factor curve of an ideal solid sphere was convolved with the resolution functions of two different instrument configurations, yielding two distinct smeared form factor curves. Using the Q resolution function in this way allows SANS data to be analysed more accurately. (authors)
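    The smearing step described above can be sketched as a discrete convolution of the ideal sphere form factor with a Gaussian resolution kernel; the radius and σ_Q values below are illustrative, not instrument parameters from the record.

```python
# Sketch of Gaussian Q-resolution smearing of an ideal-sphere form factor.
import numpy as np

def sphere_form_factor(q, radius):
    """Form factor P(Q) of an ideal solid sphere."""
    qr = q * radius
    return (3 * (np.sin(qr) - qr * np.cos(qr)) / qr**3) ** 2

def smear(q_grid, intensity, sigma_q):
    """Convolve I(Q) with a Gaussian resolution function (constant sigma_Q)."""
    smeared = np.empty_like(intensity)
    for i, q0 in enumerate(q_grid):
        w = np.exp(-0.5 * ((q_grid - q0) / sigma_q) ** 2)
        smeared[i] = np.sum(w * intensity) / np.sum(w)
    return smeared

q = np.linspace(0.005, 0.3, 500)            # Q in 1/Angstrom
ideal = sphere_form_factor(q, radius=60.0)  # 60 Angstrom sphere
broad = smear(q, ideal, sigma_q=0.005)      # larger sigma_q -> minima wash out
```

The characteristic effect, visible by comparing `ideal` and `broad`, is that the sharp minima of the sphere form factor are filled in; fitting the smeared rather than the ideal curve is what makes the analysis more accurate.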

  13. Multistep Cylindrical Structure Analysis at Normal Incidence Based on Water-Substrate Broadband Metamaterial Absorbers

    Science.gov (United States)

    Fang, Chonghua

    2018-01-01

    A new multistep cylindrical structure based on water-substrate broadband metamaterial absorbers is designed to reduce the radar cross-section (RCS) of a rod-shaped object. The proposed configuration consists of two distinct parts: a four-step cylindrical metal structure, and a new water-substrate broadband metamaterial absorber. The designed structure reduces the radar cross section by more than 10 dB from 4.58 to 18.42 GHz, which corresponds to 86.5% of the 4-20 GHz range (C-band up to 20 GHz). The measured results show reasonably good agreement with the simulated ones, which verifies the effectiveness of the proposed design.

  14. On the sensitivity of protein data bank normal mode analysis: an application to GH10 xylanases

    Science.gov (United States)

    Tirion, Monique M.

    2015-12-01

    Protein data bank entries obtain distinct, reproducible flexibility characteristics determined by normal mode analyses of their three dimensional coordinate files. We study the effectiveness and sensitivity of this technique by analyzing the results on one class of glycosidases: family 10 xylanases. A conserved tryptophan that appears to affect access to the active site can be in one of two conformations according to x-ray crystallographic electron density data. The two alternate orientations of this active site tryptophan lead to distinct flexibility spectra, with one orientation thwarting the oscillations seen in the other. The particular orientation of this sidechain furthermore affects the appearance of the motility of a distant, C terminal region we term the mallet. The mallet region is known to separate members of this family of enzymes into two classes.

  15. On the sensitivity of protein data bank normal mode analysis: an application to GH10 xylanases

    International Nuclear Information System (INIS)

    Tirion, Monique M

    2015-01-01

    Protein data bank entries obtain distinct, reproducible flexibility characteristics determined by normal mode analyses of their three dimensional coordinate files. We study the effectiveness and sensitivity of this technique by analyzing the results on one class of glycosidases: family 10 xylanases. A conserved tryptophan that appears to affect access to the active site can be in one of two conformations according to x-ray crystallographic electron density data. The two alternate orientations of this active site tryptophan lead to distinct flexibility spectra, with one orientation thwarting the oscillations seen in the other. The particular orientation of this sidechain furthermore affects the appearance of the motility of a distant, C terminal region we term the mallet. The mallet region is known to separate members of this family of enzymes into two classes. (paper)

  16. Data analysis strategies for the characterization of normal: superconductor point contacts by barrier strength parameter

    Science.gov (United States)

    Smith, Charles W.; Reinertson, Randal C.; Dolan, P. J., Jr.

    1993-05-01

    The theoretical description by Blonder, Tinkham, and Klapwijk [Phys. Rev. B 25, 4515 (1982)] of the I-V curves of normal: superconductor point contacts encompasses a broad range of experimental behavior, from the tunnel junction case, on the one hand, to the clean metallic microconstriction limit on the other. The theory characterizes point contacts in terms of a single parameter, the barrier strength. The differential conductance of a point contact, at zero bias, as a function of temperature, offers a direct experimental method by which the barrier strength parameter can be evaluated. In view of the full range of phenomena incorporated by this theory, we suggest several different strategies for the evaluation of the barrier strength parameter from data in the low and intermediate barrier strength regimes and for measurements in the low temperature (near T=0 K) and high temperature (near T=Tc) limits.

  17. Analysis of the interaction of a weak normal shock wave with a turbulent boundary layer

    Science.gov (United States)

    Melnik, R. E.; Grossman, B.

    1974-01-01

    The method of matched asymptotic expansions is used to analyze the interaction of a normal shock wave with an unseparated turbulent boundary layer on a flat surface at transonic speeds. The theory leads to a three-layer description of the interaction in the double limit of Reynolds number approaching infinity and Mach number approaching unity. The interaction involves an outer, inviscid rotational layer, a constant shear-stress wall layer, and a blending region between them. The pressure distribution is obtained from a numerical solution of the outer-layer equations by a mixed-flow relaxation procedure. An analytic solution for the skin friction is determined from the inner-layer equations. The significance of the mathematical model is discussed with reference to existing experimental data.

  18. Cell renewal of glomerular cell types in normal rats. An autoradiographic analysis

    International Nuclear Information System (INIS)

    Pabst, R.; Sterzel, R.B.

    1983-01-01

    Normal adult Sprague-Dawley rats received either a single or repetitive injection of the DNA precursor 3H-thymidine (3H-TdR). For autoradiography, semi-thin sections were prepared 2 hr to 14 days after labeling. The majority of labeled cells noted in glomerular tufts were endothelial cells. Mesangial cells had a lower production rate. Podocytes revealed no evidence of proliferation. Bowman's capsule cells showed a higher labeling index than tuft cells at all times. Neither the urinary nor the vascular pole was found to be a proliferative zone for Bowman's capsule cells. The flash and repetitive labeling experiments demonstrated a constant rate of cell renewal of about 1% per day, resulting in a long life span for endothelial and mesangial cells as well as Bowman's capsule cells. These data provide a basis for cell kinetic studies in models of glomerular diseases.

  19. Analysis of normal zone propagation and hot spot temperature on ITER CS insert coil

    International Nuclear Information System (INIS)

    Suwa, Tomone; Ozeki, Hidemasa; Nabara, Yoshihiro; Saito, Toru; Kawano, Katsumi; Takahashi, Yoshikazu; Isono, Takaaki; Nunoya, Yoshihiko

    2016-01-01

    The Central Solenoid (CS) insert coil consists of a 42-m-long CS conductor, the specifications of which are the same as those of the ITER CS. In order to investigate normal zone propagation and hot spot temperature, a quench test was carried out on the CS insert under the End-of-Burn condition at 12.5 T and 45.1 kA after 16,000 cycles. External heat was applied near the center of the CS insert using an inductive heater, and a quench was induced. The current of 45.1 kA was dumped 9.5 s (7 s) after voltage generation (quench detection, QD). The normal zone propagation length reached 23.4 m, and the maximum propagation velocity was 3.1 m/s just before dumping. Considering the temperature distribution calculated by GANDALF, the hot spot temperature was expected to reach 227 K. As a result, it was found that the hot spot temperature exceeded the ITER design criterion of 150 K. However, heating the CS insert to 227 K did not influence conductor performance, because the current sharing temperature was maintained after the quench test. Therefore, quench detection has a margin of approximately 9.5 s (7 s) after voltage generation (QD) in view of the conductor performance under the conditions applied in this quench test. If the hot spot temperature is to be kept below 150 K, the current should be dumped within 7.5 s (5 s) of voltage generation (QD). These results are very useful for designing the quench protection of the ITER CS. (author)

  20. Blood pressure and heart rate variability analysis of orthostatic challenge in normal human pregnancies.

    Science.gov (United States)

    Heiskanen, Nonna; Saarelainen, Heli; Valtonen, Pirjo; Lyyra-Laitinen, Tiina; Laitinen, Tomi; Vanninen, Esko; Heinonen, Seppo

    2008-11-01

    The aim of the present study was to evaluate pregnancy-related changes in autonomic regulatory functions in healthy subjects. We studied cardiovascular autonomic responses to head-up tilt (HUT) in 28 pregnant women during the third trimester of pregnancy and 3 months after parturition. The maternal ECG and non-invasive beat-to-beat blood pressure were recorded in the horizontal position (left-lateral position) and during HUT in the upright position. Stroke volume was assessed from the blood pressure signal by using the arterial pulse contour method. Heart rate variability (HRV) was analysed in the frequency domain, and baroreflex sensitivity by the cross-spectral and the sequence methods. In the horizontal position, all frequency components of HRV were lower during pregnancy than 3 months after parturition, whereas pregnancy had no influence on normalized low frequency and high frequency powers. During pregnancy haemodynamics was well balanced, with only minor changes in response to postural change, while haemodynamic responses to HUT were more remarkable after parturition. In pregnant women HRV and especially its very low frequency component increased in response to HUT, whereas at 3 months after parturition the direction of these changes was opposite. Parasympathetic deactivation towards term is likely to contribute to increased heart rate and cardiac output at rest, whereas restored sympathetic modulation with modest responses may contribute to stable peripheral resistance and sufficient placental blood supply under stimulated conditions. It is important to understand cardiovascular autonomic nervous system function and haemodynamic control in normal pregnancy before being able to judge whether they are dysregulated in complicated pregnancies.
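    The frequency-domain HRV computation described above (band powers and their normalized units) can be sketched from an evenly resampled RR-interval series; the signal, sampling rate, and oscillation amplitudes below are synthetic assumptions, not study data.

```python
# Sketch of frequency-domain HRV analysis: Welch PSD of a synthetic RR
# tachogram, then LF/HF band powers and normalized units.
import numpy as np
from scipy.signal import welch

fs = 4.0                               # evenly resampled tachogram rate (Hz)
t = np.arange(0, 300, 1 / fs)          # a 5-minute recording
# Synthetic RR series: an LF oscillation (0.1 Hz) plus an HF, respiratory one (0.25 Hz)
rr = 0.85 + 0.03 * np.sin(2 * np.pi * 0.1 * t) + 0.02 * np.sin(2 * np.pi * 0.25 * t)

f, psd = welch(rr - rr.mean(), fs=fs, nperseg=256)

def band_power(f, psd, lo, hi):
    mask = (f >= lo) & (f < hi)
    return np.sum(psd[mask]) * (f[1] - f[0])

lf = band_power(f, psd, 0.04, 0.15)    # low-frequency power
hf = band_power(f, psd, 0.15, 0.40)    # high-frequency power
lf_nu = 100 * lf / (lf + hf)           # normalized units, as reported above
hf_nu = 100 * hf / (lf + hf)
print(f"LF {lf_nu:.0f} n.u., HF {hf_nu:.0f} n.u.")
```

Normalized units express each band as a fraction of combined LF+HF power, which is why they can remain unchanged even when all absolute band powers fall, as reported during pregnancy.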

  1. Analysis of the variability of human normal urine by 2D-GE reveals a "public" and a "private" proteome.

    Science.gov (United States)

    Molina, Laurence; Salvetat, Nicolas; Ameur, Randa Ben; Peres, Sabine; Sommerer, Nicolas; Jarraya, Fayçal; Ayadi, Hammadi; Molina, Franck; Granier, Claude

    2011-12-10

    The characterization of the normal urinary proteome is steadily progressing and represents a major interest in the assessment of clinical urinary biomarkers. To estimate quantitatively the variability of the normal urinary proteome, urines of 20 healthy people were collected. We first evaluated the impact of the sample conservation temperature on urine proteome integrity. Keeping the urine sample at RT or at +4°C until storage at -80°C seems the best way for long-term storage of samples for 2D-GE analysis. The quantitative variability of the normal urinary proteome was estimated on the 20 urines mapped by 2D-GE. The occurrence of the 910 identified spots was analysed throughout the gels and represented in a virtual 2D gel. Sixteen percent of the spots were found to occur in all samples and 23% occurred in at least 90% of urines. About 13% of the protein spots were present only in 10% or less of the samples, thus representing the most variable part of the normal urinary proteome. Twenty proteins corresponding to a fraction of the fully conserved spots were identified by mass spectrometry. In conclusion, a "public" urinary proteome, common to healthy individuals, seems to coexist with a "private" urinary proteome, which is more specific to each individual. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. Normal and abnormal electrical activation of the heart. Imaging patterns obtained by phase analysis of equilibrium cardiac studies

    International Nuclear Information System (INIS)

    Pavel, D.; Byrom, E.; Swiryn, S.; Meyer-Pavel, C.; Rosen, K.

    1981-01-01

    By using a temporal Fourier analysis of gated equilibrium cardiac studies, phase images were obtained. These functional images were analysed qualitatively and quantitatively to determine whether specific patterns can be found for normal versus abnormal electrical activation of the heart. The study included eight subjects with normal cardiac function and 24 patients with abnormal electrical activation: eight with left bundle branch block (LBBB), two with right bundle branch block (RBBB), six with Wolff-Parkinson-White syndrome (WPW), one with junctional rhythm, one with spontaneous sustained ventricular tachycardia (VT) (all with normal wall motion), two with chronic transvenous pacemakers, and four with induced sustained VT (all with regional wall motion abnormalities). The results show that the two ventricles have the same mean phase (within ±9°) in normals, but significantly different mean phases in all patients with bundle branch blocks. Of the six WPW patients, three had a distinctive abnormal pattern. The patient with junctional rhythm, those with transvenous pacemakers, and those with VT all had abnormal patterns on the phase image. The phase image is capable of showing differences between normal electrical activation and a variety of electrical abnormalities. Within the latter category, distinct patterns can be associated with each type of abnormality. (author)

  3. Isogeometric analysis of free-form Timoshenko curved beams including the nonlinear effects of large deformations

    Science.gov (United States)

    Hosseini, Seyed Farhad; Hashemian, Ali; Moetakef-Imani, Behnam; Hadidimoud, Saied

    2018-03-01

    In the present paper, the isogeometric analysis (IGA) of free-form planar curved beams is formulated based on the nonlinear Timoshenko beam theory to investigate the large deformation of beams with variable curvature. Based on the isoparametric concept, the shape functions of the field variables (displacement and rotation) in a finite element analysis are taken to be the same non-uniform rational basis spline (NURBS) basis functions that define the geometry. The validity of the presented formulation is tested in five case studies covering a wide range of engineering curved structures, from straight and constant-curvature to variable-curvature beams. The nonlinear deformation results obtained by the presented method are compared to well-established benchmark examples and to the results of linear and nonlinear finite element analyses. As the nonlinear load-deflection behavior of Timoshenko beams is the main topic of this article, the results strongly demonstrate the applicability of the IGA method to the large deformation analysis of free-form curved beams. Finally, it is worth noting that, until very recently, the large deformation analysis of free-form Timoshenko curved beams had not been considered in IGA by researchers.
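    The isoparametric idea above, that one set of NURBS basis functions describes both geometry and field variables, rests on the B-spline basis; a minimal sketch of its evaluation via the Cox-de Boor recursion follows, with an arbitrary quadratic knot vector and weights chosen purely for illustration.

```python
# Sketch of NURBS basis evaluation (Cox-de Boor recursion plus rational weighting).
import numpy as np

def bspline_basis(i, p, u, knots):
    """Cox-de Boor recursion: value of the B-spline basis N_{i,p} at u."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] > knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) \
               * bspline_basis(i, p - 1, u, knots)
    if knots[i + p + 1] > knots[i + 1]:
        right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) \
                * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_basis(u, p, knots, weights):
    """Rational (NURBS) basis: weighted B-spline basis, normalized."""
    N = np.array([bspline_basis(i, p, u, knots) for i in range(len(weights))])
    return N * weights / np.sum(N * weights)

knots = [0, 0, 0, 0.5, 1, 1, 1]                       # quadratic, 4 control points
R = nurbs_basis(0.25, 2, knots, np.array([1.0, 0.9, 0.9, 1.0]))
print(R.round(3), R.sum())                            # partition of unity: sum is 1
```

In an IGA beam element, the same `R` values would weight both the control-point coordinates (geometry) and the nodal displacements and rotations (field variables).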

  4. Histomorphometric analysis of nuclear and cellular volumetric alterations in oral lichen planus, lichenoid lesions and normal oral mucosa using image analysis software.

    Science.gov (United States)

    Venkatesiah, Sowmya S; Kale, Alka D; Hallikeremath, Seema R; Kotrashetti, Vijayalakshmi S

    2013-01-01

    Lichen planus is a chronic inflammatory mucocutaneous disease that clinically and histologically resembles lichenoid lesions, although the latter have a different etiology. Though criteria have been suggested for differentiating oral lichen planus from lichenoid lesions, confusion still prevails. The aim was to study the cellular and nuclear volumetric features in the epithelium of normal mucosa, lichen planus, and lichenoid lesions to determine variations, if any. A retrospective study was done on 25 histologically diagnosed cases each of oral lichen planus, oral lichenoid lesions, and normal oral mucosa. Cellular and nuclear morphometric measurements were assessed on hematoxylin and eosin sections using image analysis software. Statistical analysis was performed using the analysis of variance (ANOVA) test and Tukey's post-hoc test. The basal cells of oral lichen planus showed a significant increase in the mean nuclear and cellular areas, and in nuclear volume; there was a significant decrease in the nuclear-cytoplasmic ratio as compared to normal mucosa. The suprabasal cells showed a significant increase in nuclear and cellular areas, nuclear diameter, and nuclear and cellular volumes as compared to normal mucosa. The basal cells of oral lichenoid lesions showed a significant difference in the mean cellular area and the mean nuclear-cytoplasmic ratio as compared to normal mucosa, whereas the suprabasal cells differed significantly from normal mucosa in the mean nuclear area and the nuclear and cellular volumes. Morphometry can differentiate lesions of oral lichen planus and oral lichenoid lesions from normal oral mucosa. Thus, morphometry may serve to discriminate between normal mucosa and premalignant lichen planus and lichenoid lesions. These lesions might have a high risk for malignant transformation and may behave in a similar manner with respect to malignant transformation.

  5. Weak Form Efficiency of the Chittagong Stock Exchange: An Empirical Analysis (2006-2016)

    Directory of Open Access Journals (Sweden)

    Shahadat Hussain

    2017-01-01

    We study the random walk behavior of the Chittagong Stock Exchange (CSE) using daily returns of three indices for the period 2006 to 2016, employing both a non-parametric test (run test) and parametric tests [autocorrelation coefficient test, Ljung-Box (LB) statistics]. The skewness and kurtosis of the daily return series indicate a non-normal, positively skewed and leptokurtic distribution. The results of the run test, autocorrelation, and Ljung-Box (LB) statistics provide evidence against random walk behavior in the Chittagong Stock Exchange. Overall, our results suggest that the Chittagong Stock Exchange does not exhibit weak-form efficiency. Hence, active investors have an opportunity to generate superior returns.
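    The two random-walk checks described above can be sketched on a synthetic daily-return series: a runs test about the median and the Ljung-Box Q statistic over the first m autocorrelations. The returns below are simulated heavy-tailed i.i.d. draws, not CSE data.

```python
# Sketch of a runs test and Ljung-Box test for random walk behavior,
# applied to simulated i.i.d. daily returns.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
returns = rng.standard_t(df=5, size=1000) * 0.01   # heavy-tailed i.i.d. returns

# Runs test about the median
signs = returns > np.median(returns)
runs = 1 + np.sum(signs[1:] != signs[:-1])
n1, n2 = signs.sum(), (~signs).sum()
mu = 2 * n1 * n2 / (n1 + n2) + 1                   # expected number of runs
var = (mu - 1) * (mu - 2) / (n1 + n2 - 1)
z = (runs - mu) / np.sqrt(var)
p_runs = 2 * stats.norm.sf(abs(z))

# Ljung-Box Q over the first m lags
n, m = len(returns), 10
x = returns - returns.mean()
acf = np.array([np.sum(x[k:] * x[:-k]) for k in range(1, m + 1)]) / np.sum(x * x)
Q = n * (n + 2) * np.sum(acf**2 / (n - np.arange(1, m + 1)))
p_lb = stats.chi2.sf(Q, df=m)
print(f"runs test p = {p_runs:.2f}, Ljung-Box p = {p_lb:.2f}")
```

On genuinely i.i.d. returns both p-values are typically large; the abstract's finding against weak-form efficiency corresponds to small p-values on the real CSE series.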

  6. [Global analysis of the readability of the informed consent forms used in public hospitals of Spain].

    Science.gov (United States)

    Mariscal-Crespo, M I; Coronado-Vázquez, M V; Ramirez-Durán, M V

    To analyse the readability of the informed consent forms (ICF) used in public hospitals throughout Spain, with the aim of checking their function of providing comprehensible information to people making any health decision, no matter where they are in Spain. A descriptive study was performed on a total of 11,339 ICF received from all over Spanish territory, of which 1617 ICF were collected from 4 Health Portal web pages and the rest (9722) were received through email and/or telephone contact from March 2012 to February 2013. A total of 372 ICF were selected using simple random sampling. The Inflesz scale and the Flesch-Szigriszt index were used to analyse the readability. The results showed that 62.4% of the ICF were rated as "a little difficult", 23.4% as "normal", and 13.4% as "very difficult". The highest mean readability scores using the Flesch index were in Andalusia, with a mean of 56.99 (95% CI; 55.42-58.57), and Valencia, with a mean of 51.93 (95% CI; 48.4-55.52). The lowest means were in Galicia, with a mean of 40.77 (95% CI; 9.83-71.71), and Melilla, with a mean of 41.82 (95% CI; 35.5-48.14). The readability of Spanish informed consent forms must be improved, since their scores on readability tools did not reach the "normal" range. Furthermore, there was very wide variability among Spanish ICF, which shows a lack of equity in information access among Spanish citizens. Copyright © 2017 SECA. Publicado por Elsevier España, S.L.U. All rights reserved.
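    The scoring step described above can be sketched as follows. The Flesch-Szigriszt index for Spanish is commonly given as IFSZ = 206.835 − 62.3·(syllables/words) − (words/sentences), graded on the Inflesz scale; both the formula's constants and the grade cut-points below are quoted from memory, and the syllable counter is a crude vowel-group heuristic, not the full Inflesz algorithm.

```python
# Sketch of Flesch-Szigriszt scoring with an approximate Inflesz grading.
import re

def count_syllables(word):
    """Crude Spanish syllable estimate: count vowel groups."""
    return max(1, len(re.findall(r"[aeiouáéíóúü]+", word.lower())))

def flesch_szigriszt(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-zÁÉÍÓÚÜÑáéíóúüñ]+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 62.3 * syllables / len(words) - len(words) / sentences

def inflesz_grade(score):
    if score < 40: return "very difficult"
    if score < 55: return "a little difficult"
    if score < 65: return "normal"
    if score < 80: return "quite easy"
    return "very easy"

texto = "El paciente firma el consentimiento. El medico explica los riesgos."
score = flesch_szigriszt(texto)
print(f"{score:.1f} -> {inflesz_grade(score)}")
```

Longer words (more syllables per word) and longer sentences both push the score down, which is why dense legal-medical ICF prose tends to land in the "difficult" bands.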

  7. Finite element analysis of composite beam-to-column connection with cold-formed steel section

    Science.gov (United States)

    Firdaus, Muhammad; Saggaff, Anis; Tahir, Mahmood Md

    2017-11-01

    Cold-formed steel (CFS) sections are well known for their light weight and high structural performance, which makes them very popular in building construction. Conventionally, they are used as purlins and side rails in the building envelopes of industrial buildings. Recent research on cold-formed steel has shown that their use has expanded into composite construction. This paper presents the modelling of a proposed composite beam-to-column connection in which cold-formed lipped steel sections are positioned back-to-back to act as the beam. Reinforcement bars are used to provide the composite action, anchored to the column and partly embedded in the slab. The finite element and numerical analysis results show good agreement, and indicate that the proposed composite connection contributes a significant increase in moment capacity.

  8. UTILIZATION OF STEREOLOGY FOR QUANTITATIVE ANALYSIS OF PLASTIC DEFORMATION OF FORMING PIECES

    Directory of Open Access Journals (Sweden)

    Maroš Martinkovič

    2012-01-01

    Mechanical working determines the final properties of formed pieces, which are affected by the conditions of the production technology. Stereology enables detailed analysis of three-dimensionally plastically deformed material structures produced by different forming technologies, e.g. forging, extruding, upsetting, metal spinning, drawing, etc. The microstructure of cold-drawn wires was analyzed. Grain boundary orientation was measured on parallel sections of wire with different degrees of deformation, and direct axis plastic deformation was evaluated in the bulk-formed part. The strain in the probes was obtained by using stereology to measure the degree of grain boundary orientation on their sections, which was then converted to deformation using a model relating the degree of grain boundary orientation to deformation.

  9. Microbiological and bioinformatics analysis of primary Sjogren's syndrome patients with normal salivation

    Directory of Open Access Journals (Sweden)

    Huma Siddiqui

    2016-10-01

    Background: Reduced salivation is considered a major clinical feature of most, but not all, cases of primary Sjögren's syndrome (pSS). Reduced saliva flow may lead to changes in the salivary microbiota. These changes have mainly been studied with culture, which typically recovers only 65% of the bacteria present. Objective: To use high throughput sequencing, covering both cultivated and not-yet-cultivated bacteria, to assess the bacterial microbiota of whole saliva in pSS patients with normal salivation. Methods: Bacteria of whole unstimulated saliva from nine pSS patients with normal salivation flow and from nine healthy controls were examined by high throughput sequencing of the hypervariable region V1V2 of 16S rRNA using the 454 GS Junior system. Raw sequence reads were subjected to a species-level, reference-based taxonomy assignment pipeline specially designed for studying the human oral microbial community. Each of the sequence reads was BLASTN-searched against a database consisting of reference sequences representing 1,156 oral and 12,013 non-oral species. Unassigned reads were then screened for high-quality non-chimeras and subjected to de novo species-level operational taxonomy unit (OTU) calling for potential novel species. Downstream analyses, including alpha and beta diversities, were performed using the Quantitative Insights into Microbial Ecology (QIIME) pipeline. To reveal significant differences between the microbiota of control saliva and Sjögren's saliva, the statistical method Metastats (www.metastats.cbcb.umd.edu) was used. Results: Saliva of pSS patients with normal salivation had a significantly higher frequency of Firmicutes compared with controls (p=0.004). Two other major phyla, Synergistetes and Spirochaetes, were significantly depleted in pSS (p=0.001 for both). In addition, we saw a nearly 17% decrease in the number of genera in pSS (25 vs. 30). While Prevotella was almost equally abundant in both

  10. An automated form of video image analysis applied to classification of movement disorders.

    Science.gov (United States)

    Chang, R; Guan, L; Burne, J A

    Video image analysis is able to provide quantitative data on postural and movement abnormalities and thus has an important application in neurological diagnosis and management. The conventional techniques require patients to be videotaped while wearing markers in a highly structured laboratory environment. This restricts the utility of video in routine clinical practice. We have begun development of intelligent software that aims to provide a more flexible system able to quantify human posture and movement directly from whole-body images without markers and in an unstructured environment. The steps involved are to extract complete human profiles from video frames, to fit skeletal frameworks to the profiles, and to derive joint angles and swing distances. By this means a given posture is reduced to a set of basic parameters that can provide input to a neural network classifier. To test the system's performance we videotaped patients with dopa-responsive Parkinsonism and age-matched normals during several gait cycles, to yield 61 patient and 49 normal postures. These postures were reduced to their basic parameters and fed to the neural network classifier in various combinations. The optimal parameter sets (consisting of both swing distances and joint angles) yielded successful classification of normals and patients with an accuracy above 90%. This result demonstrated the feasibility of the approach. The technique has the potential to guide clinicians on the relative sensitivity of specific postural/gait features in diagnosis. Future studies will aim to improve the robustness of the system in providing accurate parameter estimates from subjects wearing a range of clothing, and to further improve discrimination by incorporating more stages of the gait cycle into the analysis.
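
    The classification step described above (basic posture parameters fed to a classifier) can be sketched with a minimal stand-in model; the feature values, the 61/49 class split, and the logistic classifier below are illustrative assumptions, not the paper's actual network or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the posture parameters: each row holds
# joint angles and swing distances for one extracted posture.
n_feat = 6
normals = rng.normal(0.0, 1.0, size=(49, n_feat))
patients = rng.normal(0.8, 1.0, size=(61, n_feat))  # shifted feature means
X = np.vstack([normals, patients])
y = np.array([0] * 49 + [1] * 61)  # 0 = normal, 1 = patient

# A minimal logistic classifier trained by gradient descent, standing in
# for the paper's (unspecified) neural network classifier.
w = np.zeros(n_feat)
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * float(np.mean(p - y))

accuracy = float(np.mean(((X @ w + b) > 0).astype(int) == y))
```

    With clearly separated feature distributions, even this linear stand-in classifies most postures correctly; the paper's neural network additionally captures nonlinear combinations of the parameters.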

  11. Normal growth, altered growth? Study of the relationship between harris lines and bone form within a post-medieval plague cemetery (Dendermonde, Belgium, 16th Century).

    Science.gov (United States)

    Boucherie, Alexandra; Castex, Dominique; Polet, Caroline; Kacki, Sacha

    2017-01-01

    Harris lines (HLs) are defined as transverse, mineralized lines associated with temporary growth arrest. In paleopathology, HLs are used to reconstruct the health status of past populations. However, their etiology is still obscure. The aim of this article is to test the reliability of HLs as an arrested-growth marker by investigating their incidence on human metrical parameters. The study was performed on 69 individuals (28 adults, 41 subadults) from the Dendermonde plague cemetery (Belgium, 16th century). HLs were rated on distal femora and both ends of tibiae. Overall prevalence and the age-at-formation of each detected line were calculated. ANOVA analyses were conducted within subadult and adult samples to test whether the presence of HLs affected size and shape parameters of the individuals. At Dendermonde, 52% of the individuals had at least one HL. The age-at-formation was estimated between 5 and 9 years old for the subadults and between 10 and 14 years old for the adults. ANOVA analyses showed that the presence of HLs did not affect the size of the individuals. However, significant differences in shape parameters were highlighted by HL presence. Subadults with HLs displayed slighter shape parameters than the subadults without, whereas the adults with HLs had larger measurements than the adults without. The results suggest that HLs can have a certain impact on shape parameters. The underlying causes can be various, especially for the early formed HLs. However, HLs deposited around puberty are more likely to be physiological lines reflecting hormonal secretions. Am. J. Hum. Biol. 29:e22885, 2017. © 2016 Wiley Periodicals, Inc.

  12. Proximal and distal alignment of normal canine femurs: A morphometric analysis.

    Science.gov (United States)

    Kara, Mehmet Erkut; Sevil-Kilimci, Figen; Dilek, Ömer Gürkan; Onar, Vedat

    2018-05-01

    Many researchers are interested in femoral conformation because most orthopaedic problems of the long bones occur in the femur and its joints. The neck-shaft (NSA) and anteversion (AVA) angles are good predictors for understanding the orientation of the proximal end of the femur. The varus (aLDFA) and procurvatum (CDFA) angles have also been used to understand the orientation of the distal end of the femur. The purposes of this study were to investigate the relationship between the proximal and distal angles of the femur and to compare the distal femoral angles in male and female dogs in order to investigate sexual dimorphism. The measurements of normal CDFAs, which have not been previously reported, may also provide a database of canine distal femoral morphology. A total of 75 cleaned healthy femora from dogs of various breeds or mixed breeds were used. Three-dimensional images were reconstructed from computed tomographic images. The AVA, NSA, aLDFA and CDFA were measured on the 3D images. The correlation coefficients were calculated among the measured angles. The distal femoral angles were also compared between male and female femora. The 95% confidence intervals of the AVA and the NSA were calculated to be 24.22°-29.50° and 144.97°-147.50°, respectively. The 95% confidence intervals of the aLDFA and the CDFA for all studied dogs were 92.62°-94.08° and 89.09°-91.94°, respectively. The NSA showed no correlation with either the aLDFA or CDFA. There was a weak inverse correlation between the AVA and CDFA and a weak positive correlation between the AVA and aLDFA. The differences in the aLDFA and CDFA measurements between male and female dogs were not significant. In conclusion, femoral version, regardless of the plane, might have little influence on distal femoral morphology in normal dogs. Besides this, there is no evidence of sexual dimorphism in the varus and procurvatum angles of the canine distal femur. The data from this study may be used in
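
    The 95% confidence intervals reported above are standard t-based intervals for a mean. A minimal sketch with a hypothetical angle sample; the critical value 1.993 corresponds to df = 74, i.e. a sample of 75 femora as in the study, and the data below are invented for illustration:

```python
import math
import random
from statistics import mean, stdev

def ci95(values, t_crit):
    """Two-sided 95% confidence interval for the mean:
    mean +/- t_crit * s / sqrt(n)."""
    half = t_crit * stdev(values) / math.sqrt(len(values))
    return mean(values) - half, mean(values) + half

random.seed(1)
# Hypothetical anteversion angles (degrees) for 75 femora
ava = [random.gauss(26.9, 5.0) for _ in range(75)]
lo, hi = ci95(ava, t_crit=1.993)  # t critical value for df = 74
```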

  13. Problems with using the normal distribution--and ways to improve quality and efficiency of data analysis.

    Directory of Open Access Journals (Sweden)

    Eckhard Limpert

    Full Text Available BACKGROUND: The gaussian or normal distribution is the most established model to characterize quantitative variation of original data. Accordingly, data are summarized using the arithmetic mean and the standard deviation, by mean ± SD, or with the standard error of the mean, mean ± SEM. This, together with corresponding bars in graphical displays, has become the standard to characterize variation. METHODOLOGY/PRINCIPAL FINDINGS: Here we question the adequacy of this characterization, and of the model. The published literature provides numerous examples for which such descriptions appear inappropriate because, based on the "95% range check", their distributions are obviously skewed. In these cases, the symmetric characterization is a poor description and may trigger wrong conclusions. To solve the problem, it is enlightening to regard causes of variation. Multiplicative causes are in general far more important than additive ones, and benefit from a multiplicative (or log-normal) approach. Fortunately, quite similar to the normal, the log-normal distribution can now be handled easily and characterized at the level of the original data with the help of a new sign, x/ ("times-divide"), and corresponding notation. Analogous to mean ± SD, it connects the multiplicative (or geometric) mean, mean*, and the multiplicative standard deviation, s*, in the form mean* x/ s*, which is advantageous and recommended. CONCLUSIONS/SIGNIFICANCE: The corresponding shift from the symmetric to the asymmetric view will substantially increase both recognition of data distributions and interpretation quality. It will allow for savings in sample size that can be considerable. Moreover, this is in line with ethical responsibility. Adequate models will improve concepts and theories, and provide deeper insight into science and life.
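
    The mean* x/ s* summary advocated above is straightforward to compute: take the mean and SD of the log-transformed data and exponentiate both. A minimal sketch with an invented skewed sample:

```python
import math

def multiplicative_summary(data):
    """Geometric mean (mean*) and multiplicative SD (s*) of positive data.

    The 68% range of a log-normal sample is then [mean*/s*, mean* x s*],
    i.e. 'times-divide' rather than 'plus-minus'."""
    logs = [math.log(x) for x in data]
    n = len(logs)
    mu = sum(logs) / n
    # sample variance of the logged data
    var = sum((v - mu) ** 2 for v in logs) / (n - 1)
    return math.exp(mu), math.exp(math.sqrt(var))

# Example: a skewed, roughly log-normal sample (hypothetical values)
sample = [1.2, 2.5, 3.1, 4.8, 9.7, 20.3]
gmean, sstar = multiplicative_summary(sample)
# 68% range analogous to mean +/- SD:
low, high = gmean / sstar, gmean * sstar
```

    Note that s* is dimensionless and always greater than 1, so the range mean*/s* to mean* x s* is asymmetric on the original scale, matching the skew of the data.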

  14. Structural evaluation and analysis under normal conditions for spent fuel concrete storage cask

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Taechul; Baeg, Changyeal; Yoon, Sitae [Korea Radioactive waste Management Agency, Daejeon (Korea, Republic of); Jung, Insoo [Korea Nuclear Engineering and Service Co., Daejeon (Korea, Republic of)

    2014-05-15

    The purpose of this paper is to verify the stability of the structural elements that influence the safety of a concrete storage cask. The evaluation results were reviewed against every design criterion provided by 10 CFR 72 and NUREG-1536. The basic design information is partially explained in Section 2 (Description of spent fuel storage system), and the maintainability and assumptions included in the analysis were confirmed through detailed explanations of the acceptance standards, analysis model, and analysis method. ABAQUS 6.10, a widely used finite element analysis program, was used in the structural analysis. The storage cask shall maintain sub-criticality, shielding, structural integrity, thermal capability, and confinement in accordance with the requirements specified in US 10 CFR 72. The safety of the storage cask was analyzed and confirmed to meet the requirements of US 10 CFR 72. This paper summarizes the structural stability evaluation results of a concrete storage cask with respect to the design criteria. The evaluation results show that the maximum stress was below the allowable stress under every condition, and the concrete storage cask satisfied the design criteria.

  15. Analysis of knee movement with low-field MR equipment. A normal volunteer study

    International Nuclear Information System (INIS)

    Ando, Yoko; Fukatsu, Hiroshi; Ishigaki, Takeo; Aoki, Ikuo; Yamada, Takashi.

    1994-01-01

    This study was performed to establish a normal standard by analyzing knee movement in detail. An open low-field unit was used for 23 healthy knee joints. With a three-dimensional Fourier transformation (3DFT) gradient echo sequence, 50 sagittal slices of 4.5 mm thickness were obtained at four flexion angles: 0, 30, 60, and 90 degrees (lateral position). Although the tension ratio of the anterior and posterior cruciate ligaments (ACL, PCL) increased during knee flexion, the change in the tension ratio was significantly different between the ACL and PCL. The femur-ACL angle and femur-PCL angle changed in parallel with the knee flexion angle, but the tibia-ACL angle and tibia-PCL angle changed in a more complex manner. The lateral and medial condyles rolled and slid during knee flexion, and the medial side moved more than the lateral side, consistent with rotation of the lower leg. The difference in backward movement distance on the tibia between the two condyles was significantly larger in females than in males. This might explain the predominance of knee osteoarthritis in women. Although the lateral position is not completely physiological, we could obtain initial kinematic data for up to 90 degrees of knee flexion using open-type MRI, which is impossible with high- and middle-field machines. (author)

  16. Three dimensional analysis of histone methylation patterns in normal and tumor cell nuclei

    Directory of Open Access Journals (Sweden)

    M Cremer

    2009-06-01

    Full Text Available Histone modifications represent an important epigenetic mechanism for the organization of higher order chromatin structure and gene regulation. Methylation of position-specific lysine residues in the histone H3 and H4 amino termini has been linked with the formation of constitutive and facultative heterochromatin, as well as with specifically repressed single gene loci. Using antibodies directed against dimethylated lysine 9 of histone H3 and several other lysine methylation sites, we visualized the nuclear distribution pattern of chromatin flagged by these methylated lysines in 3D-preserved nuclei of normal and malignant cell types. Optical confocal serial sections were used for a quantitative evaluation. We demonstrate distinct differences in these histone methylation patterns among nuclei of different cell types after exit from the cell cycle. Changes in pattern formation were also observed during the cell cycle. Our data suggest an important role of methylated histones in the reestablishment of higher order chromatin arrangements during telophase/early G1. Cell type specific histone methylation patterns are possibly causally involved in the formation of cell type specific heterochromatin compartments, composed of (peri)centromeric regions and chromosomal subregions from neighboring chromosome territories, which contain silent genes.

  17. Analysis of postural sway in patients with normal pressure hydrocephalus: effects of shunt implantation

    Directory of Open Access Journals (Sweden)

    Czerwosz L

    2009-12-01

    Full Text Available Poor postural balance is one of the major risk factors for falling in normal pressure hydrocephalus (NPH). Postural instability in the clinic is commonly assessed with force platform posturography. In this study we focused on the identification of changes in sway characteristics while standing quietly in patients with NPH before and after shunt implantation. Postural sway area and sway radius were analyzed in a group of 9 patients and 46 controls of both genders. Subjects' spontaneous sway was recorded while standing quietly on a force platform for 30-60 s, with eyes open and then closed. Both analyzed sway descriptors identified between-group differences and also an effect of shunt implantation in the NPH group. Sway radius and sway area in patients exhibited very high values compared with those in the control group. Importantly, the effect of eyesight in patients was not observed before shunt implantation and reappeared after the surgical treatment. The study documents that static force platform posturography may be a reliable measure of postural control improvement due to shunt surgery.
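
    Sway radius and sway area are standard posturographic descriptors; the record does not state which exact formulas were used, so the sketch below adopts common definitions (mean distance from the centroid, and the 95% confidence-ellipse area) on simulated centre-of-pressure data:

```python
import numpy as np

def sway_descriptors(cop_xy):
    """Sway radius and 95% confidence-ellipse sway area from
    centre-of-pressure samples (N x 2 array, metres).

    radius: mean distance of samples from their centroid.
    area:   pi * chi2_2(0.95) * sqrt(det(covariance)), a common
            95% confidence-ellipse definition."""
    centred = cop_xy - cop_xy.mean(axis=0)
    radius = float(np.mean(np.linalg.norm(centred, axis=1)))
    cov = np.cov(centred.T)
    area = float(np.pi * 5.991 * np.sqrt(np.linalg.det(cov)))
    return radius, area

# Simulated quiet-standing COP trace (hypothetical, ~5 mm sway amplitude)
rng = np.random.default_rng(2)
cop = rng.normal(0.0, 0.005, size=(3000, 2))
radius, area = sway_descriptors(cop)
```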

  18. An analysis of lymphographic signs for differentiating cancerous, lymphomatous, and normal lymph nodes

    International Nuclear Information System (INIS)

    Tatsuzaki, Hideo; Nakajima, Teiichi; Okumura, Toshiyuki; Akisada, Masayoshi

    1987-01-01

    Twenty-four lymphographic signs, obtained from a total of 204 cancerous (C), lymphomatous (L), and normal (N) lymph nodes, were analyzed based on the final diagnosis. Univariate analyses with correlation coefficients and multivariate regression analyses were employed to differentiate C, L, and N lymph nodes. Based on univariate analyses, the lymphographic signs for C nodes were: enlarged node, irregular or deficient capsules or marginal sinuses, block of lymph vessels, extravasation, and defect in lymph-vascular and nodal phase (combination defect). The signs for L nodes were: enlarged node, elliptic shape on 2 projections, irregular or deficient capsules or marginal sinuses, and extravasation. Using multivariate analyses, the following signs were necessary for differentiating individual lymph nodes: (a) combination defect, specific pattern, granularity, nodal shape and stasis or preservation of lymph vessels for differentiating C from N nodes; (b) deficiencies of capsules, nodal shape, specific pattern, dislocation of lymph vessels, and nodal contrast for differentiating L from N nodes; and (c) character of defect, specific pattern, deformity, soft tissue shadow and nodal shape for differentiating C from L nodes. Were lymphography used to visualize internal structure and the lymphatic channels, it would further increase the ability to diagnose cancer and malignant lymphomas. (Namekawa, K.)

  19. Identification of surface species by vibrational normal mode analysis. A DFT study

    Science.gov (United States)

    Zhao, Zhi-Jian; Genest, Alexander; Rösch, Notker

    2017-10-01

    Infrared spectroscopy is an important experimental tool for identifying molecular species adsorbed on a metal surface and can be used in situ. Vibrational modes in such IR spectra of surface species are often assigned and identified by comparison with vibrational spectra of related (molecular) compounds of known structure, e.g., an organometallic cluster analogue. To check the validity of this strategy, we carried out a computational study comparing the normal modes of three C2Hx species (x = 3, 4) in two types of systems: as adsorbates on the Pt(111) surface and as ligands in an organometallic cluster compound. The results of our DFT calculations reproduce the experimentally observed frequencies with deviations of at most 50 cm-1. However, the frequencies of the C2Hx species in both types of systems have to be interpreted with due caution if the coordination mode is unknown. The comparative identification strategy works satisfactorily when the coordination mode of the molecular species (ethylidyne) is similar on the surface and in the metal cluster. However, large shifts are encountered when the molecular species (vinyl) exhibits different coordination modes on the two types of substrates.

  20. Endogenous protein "barcode" for data validation and normalization in quantitative MS analysis.

    Science.gov (United States)

    Lee, Wooram; Lazar, Iulia M

    2014-07-01

    Quantitative proteomic experiments with mass spectrometry detection are typically conducted using stable isotope labeling and label-free quantitation approaches. Proteins with housekeeping functions and stable expression levels, such as actin, tubulin, and glyceraldehyde-3-phosphate dehydrogenase, are frequently used as endogenous controls. Recent studies have shown that the expression level of such common housekeeping proteins is, in fact, dependent on various factors such as cell type, cell cycle, or disease status, and can change in response to a biochemical stimulus. The interference of such phenomena can therefore substantially compromise their use for data validation, alter the interpretation of results, and lead to erroneous conclusions. In this work, we advance the concept of a protein "barcode" for data normalization and validation in quantitative proteomic experiments. The barcode comprises a novel set of proteins that was generated from cell cycle experiments performed with MCF7, an estrogen receptor positive breast cancer cell line, and MCF10A, a nontumorigenic immortalized breast cell line. The protein set was selected from a list of ~3700 proteins identified in different cellular subfractions and cell cycle stages of MCF7/MCF10A cells, based on the stability of spectral count data generated with an LTQ ion trap mass spectrometer. A total of 11 proteins qualified as endogenous standards for the nuclear barcode and 62 for the cytoplasmic barcode. The validation of the protein sets was performed with a complementary SKBR3/Her2+ cell line.

  1. Differential diagnosis of normal pressure hydrocephalus by MRI mean diffusivity histogram analysis.

    Science.gov (United States)

    Ivkovic, M; Liu, B; Ahmed, F; Moore, D; Huang, C; Raj, A; Kovanlikaya, I; Heier, L; Relkin, N

    2013-01-01

    Accurate diagnosis of normal pressure hydrocephalus is challenging because the clinical symptoms and radiographic appearance of NPH often overlap those of other conditions, including age-related neurodegenerative disorders such as Alzheimer and Parkinson diseases. We hypothesized that radiologic differences between NPH and AD/PD can be characterized by a robust and objective MR imaging DTI technique that does not require intersubject image registration or operator-defined regions of interest, thus avoiding many pitfalls common in DTI methods. We collected 3T DTI data from 15 patients with probable NPH and 25 controls with AD, PD, or dementia with Lewy bodies. We developed a parametric model for the shape of intracranial mean diffusivity histograms that separates brain and ventricular components from a third component composed mostly of partial volume voxels. To accurately fit the shape of the third component, we constructed a parametric function named the generalized Voss-Dyke function. We then examined the use of the fitting parameters for the differential diagnosis of NPH from AD, PD, and DLB. Using parameters for the MD histogram shape, we distinguished clinically probable NPH from the 3 other disorders with 86% sensitivity and 96% specificity. The technique yielded 86% sensitivity and 88% specificity when differentiating NPH from AD only. An adequate parametric model for the shape of intracranial MD histograms can distinguish NPH from AD, PD, or DLB with high sensitivity and specificity.

  2. Stability analysis and finite element simulations of superplastic forming in the presence of hydrostatic pressure

    Science.gov (United States)

    Nazzal, M. A.

    2018-04-01

    It is established that some superplastic materials undergo significant cavitation during deformation. In this work, a stability analysis for the superplastic copper-based alloy Coronze-638 at 550 °C, based on Hart's definition of stable plastic deformation, and finite element simulations of the balanced biaxial loading case are carried out to study the effects of hydrostatic pressure on cavitation evolution during superplastic forming. The finite element results show that imposing hydrostatic pressure yields a reduction in cavitation growth.

  3. Ductile failure analysis of high strength steel in hot forming based on micromechanical damage model

    OpenAIRE

    Ying Liang; Liu Wenquan; Wang Dantong; Hu Ping

    2016-01-01

    The damage evolution of high strength steel at elevated temperature is investigated using the Gurson-Tvergaard-Needleman (GTN) model. A hybrid method integrating a thermal tensile test and numerical techniques is employed to identify the damage parameters. The analysis results show that the damage parameters differ with temperature, as the microstructure of the tested material varies. Furthermore, the calibrated damage parameters are implemented to simulate a bulging forming at el...

  4. Finite element analysis of ion transport in solid state nuclear waste form materials

    Science.gov (United States)

    Rabbi, F.; Brinkman, K.; Amoroso, J.; Reifsnider, K.

    2017-09-01

    Release of nuclear species from spent fuel ceramic waste form storage depends on the individual constituent properties as well as their internal morphology, heterogeneity and boundary conditions. Predicting the release rate is essential for designing a ceramic waste form that can effectively store the spent fuel without contaminating the surrounding environment over long periods of time. To predict the release rate, in the present work a conformal finite element model is developed based on the Nernst-Planck equation. The equation describes charged species transport through different media by convection, diffusion, or migration, where the transport can be driven by chemical/electrical potentials or velocity fields. The model calculates species flux in the waste form with a different diffusion coefficient for each species in each constituent phase. In the work reported, a 2D approach is taken to investigate the contributions of different basic parameters in a waste form design, i.e., volume fraction, phase dispersion, phase surface area variation, phase diffusion coefficient, boundary concentration, etc. The analytical approach, with preliminary results, is discussed. The method is postulated to be a foundation for conformal-analysis-based design of heterogeneous waste form materials.
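
    The Nernst-Planck equation referred to above can be written in standard notation (the symbols below are the usual ones, not taken verbatim from the record):

```latex
\[
\mathbf{J}_i = -D_i \nabla c_i \;-\; \frac{z_i F}{R T}\, D_i c_i \nabla \phi \;+\; c_i \mathbf{v},
\qquad
\frac{\partial c_i}{\partial t} = -\nabla \cdot \mathbf{J}_i ,
\]
```

    where for species i, c_i is the concentration, D_i the diffusion coefficient, z_i the charge number, φ the electric potential, v the velocity field, F Faraday's constant, R the gas constant, and T the temperature. The three flux terms correspond, in order, to diffusion, migration, and convection, matching the transport mechanisms listed in the abstract.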

  5. EBSD analysis of plastic deformation of copper foils by flexible pad laser shock forming

    Energy Technology Data Exchange (ETDEWEB)

    Nagarajan, Balasubramanian; Castagne, Sylvie [Nanyang Technological University, SIMTech-NTU Joint Laboratory (Precision Machining), Singapore (Singapore); Nanyang Technological University, School of Mechanical and Aerospace Engineering, Singapore (Singapore); Wang, Zhongke; Zheng, H.Y. [Nanyang Technological University, SIMTech-NTU Joint Laboratory (Precision Machining), Singapore (Singapore); Singapore Institute of Manufacturing Technology, Machining Technology Group, Singapore (Singapore)

    2015-11-15

    Flexible pad laser shock forming (FPLSF) is a new mold-free microforming process that induces high-strain-rate plastic deformation in thin metallic foils using laser-induced shock pressure and a hyperelastic flexible pad. This paper studies the plastic deformation behavior of copper foils formed through FPLSF by investigating surface hardness and microstructure. The microstructure of the foil surface before and after FPLSF is analyzed by electron backscatter diffraction technique using grain size distribution and grain boundary misorientation angle as analysis parameters. The surface hardness of the craters experienced a significant improvement after FPLSF; the top crater surface being harder than the bottom surface. The microstructure of the copper foil surface after FPLSF was found to be dominated by grain elongation, along with minor occurrences of subgrain formation, grain refinement, and high dislocation density regions. The results indicate that the prominent plastic deformation mechanism in FPLSF is strain hardening behavior rather than the typical adiabatic softening effect known to be occurring at high-strain-rates for processes such as electromagnetic forming, explosive forming, and laser shock forming. This significant difference in FPLSF is attributed to the concurrent reduction in plastic strain, strain rate, and the inertia effects, resulting from the FPLSF process configuration. Correspondingly, different deformation behaviors are experienced at top and bottom surfaces of the deformation craters, inducing the change in surface hardness and microstructure profiles. (orig.)

  6. Probabilistic analysis in normal operation of distribution system with distributed generation

    DEFF Research Database (Denmark)

    Villafafila-Robles, R.; Sumper, A.; Bak-Jensen, B.

    2011-01-01

    Nowadays, the incorporation of high levels of small-scale non-dispatchable distributed generation is leading to the transition from the traditional 'vertical' power system structure to a 'horizontally-operated' power system, where the distribution networks contain both stochastic generation... and load. This fact increases the number of stochastic inputs, and dependence structures between them need to be considered. Deterministic analysis is not enough to cope with these issues and a new approach is needed. Probabilistic analysis provides a better approach. Moreover, as distribution systems...

  7. Enunciation and argumentation: proposals for the analysis of personal forms as argumentative resources in orality

    Directory of Open Access Journals (Sweden)

    Federico Testoni

    2017-02-01

    Full Text Available As part of an investigation into strategies and resources for the construction of social representations (Raiter, 2016) in orality, we propose to articulate an analysis of linguistic forms with the argumentative dimension of utterances. The aim of this paper is to review the methodology proposed by Lavandera (1984) for the analysis of the tension between the personal and the impersonal, interpreting the use of this dimension as an argumentative resource (Ducrot, 1984). From our interdiscursive perspective, we seek to show how the characterization of enunciative roles can facilitate analysis of the positions of enunciation from which the meaning of statements is oriented (Guimarães, 1998). To this end, we analyse three fragments of an interview by applying the study of the distribution of pronominal forms (Lavandera, 1984) and of argumentative chains (Ducrot, 1984). We believe that analysis as described, from the perspective of discourse analysis, will contribute to future work on strategies for building social representations.

  8. U-Form vs. M-Form: How to Understand Decision Autonomy Under Healthcare Decentralization? Comment on "Decentralisation of Health Services in Fiji: A Decision Space Analysis".

    Science.gov (United States)

    Bustamante, Arturo Vargas

    2016-06-07

    For more than three decades healthcare decentralization has been promoted in developing countries as a way of improving the financing and delivery of public healthcare. Decision autonomy under healthcare decentralization would determine the role and scope of responsibility of local authorities. Jalal Mohammed, Nicola North, and Toni Ashton analyze decision autonomy within decentralized services in Fiji. They conclude that the narrow decision space allowed to local entities might have limited the benefits of decentralization on users and providers. To discuss the costs and benefits of healthcare decentralization this paper uses the U-form and M-form typology to further illustrate the role of decision autonomy under healthcare decentralization. This paper argues that when evaluating healthcare decentralization, it is important to determine whether the benefits from decentralization are greater than its costs. The U-form and M-form framework is proposed as a useful typology to evaluate different types of institutional arrangements under healthcare decentralization. Under this model, the more decentralized organizational form (M-form) is superior if the benefits from flexibility exceed the costs of duplication and the more centralized organizational form (U-form) is superior if the savings from economies of scale outweigh the costly decision-making process from the center to the regions. Budgetary and financial autonomy and effective mechanisms to maintain local governments accountable for their spending behavior are key decision autonomy variables that could sway the cost-benefit analysis of healthcare decentralization. © 2016 by Kerman University of Medical Sciences.

  9. Data analysis with small samples and non-normal data nonparametrics and other strategies

    CERN Document Server

    Siebert, Carl F

    2017-01-01

    Written in everyday language for non-statisticians, this book provides all the information needed to successfully conduct nonparametric analyses. This ideal reference provides step-by-step instructions to lead the reader through each analysis, screenshots of the software and output, and case scenarios to illustrate all the analytic techniques.

  10. Spectral analysis of blood perfusion in the free latissimus dorsi myocutaneous flap and in normal skin

    International Nuclear Information System (INIS)

    Liu Xudong; Zeng Bingfang; Fan Cunyi; Jiang Peizhu; Hu Xiao

    2006-01-01

    To identify properties of the oscillatory components of cutaneous blood flow in successful free flaps, a wavelet transform was applied to laser Doppler flowmetry (LDF) signals measured simultaneously on the surface of the free latissimus dorsi myocutaneous flap and on the adjacent intact skin of the healthy limb in 18 patients. The frequency interval from 0.0095 to 1.6 Hz was examined and divided into five subintervals (I: 0.0095-0.021 Hz; II: 0.021-0.052 Hz; III: 0.052-0.145 Hz; IV: 0.145-0.6 Hz and V: 0.6-1.6 Hz) corresponding to endothelial metabolic, neurogenic, myogenic, respiratory and cardiac origins. The average amplitude and total power in the frequency range 0.0095-1.6 Hz, as well as within subintervals I, II, IV and V, were significantly lower for signals measured on the free flap than for those obtained on the healthy limb; in interval III, however, they were significantly higher. The normalized spectral amplitude and power in the free flap were significantly lower in only two intervals, I and II, yet in interval III they were significantly higher; no statistically significant difference was observed in intervals IV and V. The distinctive finding of this study, aside from the decrease in endothelial metabolic processes and sympathetic control, was the significant increase of myogenic activity in the free flap. It is hoped that this work will contribute towards knowledge of blood circulation in free flaps and make monitoring by LDF more reliable.
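
    The band decomposition above can be approximated by integrating spectral power over each of the five subintervals. The sketch below uses an FFT periodogram rather than the study's wavelet transform (which has better low-frequency resolution), and the sampling rate and test signal are invented for illustration:

```python
import numpy as np

# The five physiological bands (Hz) used in the study
BANDS = {"endothelial": (0.0095, 0.021), "neurogenic": (0.021, 0.052),
         "myogenic": (0.052, 0.145), "respiratory": (0.145, 0.6),
         "cardiac": (0.6, 1.6)}

def band_powers(signal, fs):
    """Total spectral power of an LDF signal within each band,
    from a one-sided FFT periodogram."""
    sig = signal - np.mean(signal)
    psd = np.abs(np.fft.rfft(sig)) ** 2 / (fs * len(sig))
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
    df = freqs[1] - freqs[0]
    return {name: float(psd[(freqs >= lo) & (freqs < hi)].sum() * df)
            for name, (lo, hi) in BANDS.items()}

fs = 16.0                        # Hz, hypothetical sampling rate
t = np.arange(0, 600, 1.0 / fs)  # a 10 min record resolves 0.0095 Hz
# Test signal: a "cardiac" 1.1 Hz tone plus a "neurogenic" 0.03 Hz tone
sig = np.sin(2 * np.pi * 1.1 * t) + 0.5 * np.sin(2 * np.pi * 0.03 * t)
powers = band_powers(sig, fs)
```

    Note the record length constraint: resolving the lowest band edge (0.0095 Hz) requires a recording of at least a few hundred seconds, which is why such studies use long LDF traces.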

  11. Individual differences in normal body temperature: longitudinal big data analysis of patient records.

    Science.gov (United States)

    Obermeyer, Ziad; Samra, Jasmeet K; Mullainathan, Sendhil

    2017-12-13

    To estimate individual level body temperature and to correlate it with other measures of physiology and health. Observational cohort study. Outpatient clinics of a large academic hospital, 2009-14. 35 488 patients who neither received a diagnosis for infections nor were prescribed antibiotics, in whom temperature was expected to be within normal limits. Baseline temperatures at individual level, estimated using random effects regression and controlling for ambient conditions at the time of measurement, body site, and time factors. Baseline temperatures were correlated with demographics, medical comorbidities, vital signs, and subsequent one year mortality. In a diverse cohort of 35 488 patients (mean age 52.9 years, 64% women, 41% non-white race) with 243 506 temperature measurements, mean temperature was 36.6°C (95% range 35.7-37.3°C, 99% range 35.3-37.7°C). Several demographic factors were linked to individual level temperature, with older people the coolest (-0.021°C for every decade). Unexplained temperature variation was a significant predictor of subsequent mortality: controlling for all measured factors, an increase of 0.149°C (1 SD of individual temperature in the data) was linked to 8.4% higher one year mortality (P=0.014). Individuals' baseline temperatures showed meaningful variation that was not due solely to measurement error or environmental factors. Baseline temperatures correlated with demographics, comorbid conditions, and physiology, but these factors explained only a small part of individual temperature variation. Unexplained variation in baseline temperature, however, strongly predicted mortality. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
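    The random effects baseline described above can be sketched, in a deliberately stripped-down form, as an empirical-Bayes shrinkage of each patient's mean toward the cohort mean. The variance components below are assumptions for illustration, not the study's estimates, and all covariates (ambient conditions, body site, time factors) are omitted.

```python
# Minimal random-intercept sketch: a patient's estimated baseline shrinks
# toward the population mean, more strongly when the patient has few
# measurements. Grand mean 36.6°C is from the abstract; the variance
# components are assumed values, not the study's.
def baseline_temperature(measurements, grand_mean=36.6,
                         var_between=0.15 ** 2, var_within=0.4 ** 2):
    n = len(measurements)
    patient_mean = sum(measurements) / n
    # Shrinkage weight: fraction of the patient's deviation that is kept.
    weight = (n * var_between) / (n * var_between + var_within)
    return grand_mean + weight * (patient_mean - grand_mean)

few = baseline_temperature([37.0, 37.2])        # 2 readings
many = baseline_temperature([37.0, 37.2] * 10)  # 20 readings, same mean
```

    With two readings the estimate stays close to 36.6°C; with twenty it moves most of the way to the patient's observed mean of 37.1°C, which is the behaviour that lets a random effects model separate stable individual baselines from measurement noise.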

  12. An analysis of science conceptual knowledge in journals of students with disabilities and normally achieving students

    Science.gov (United States)

    Grigg, Gail S.

    Science education reforms of the last two decades have focused on raising the bar for ALL students, including students with mild to moderate disabilities. Formative assessment can be used to assess these students' progress in inquiring, understanding scientific concepts, reasoning scientifically, making decisions, and communicating effectively in science. The purpose of this study is to examine the use of science journals as a formative assessment in a guided inquiry unit of study for students with learning disabilities. Two normally achieving (NA) students and five students with learning disabilities (SLD) participated in a study of mammals that utilized journals to record the development of student knowledge through the course of study. Students were interviewed after the lessons were complete using the same prompts required in the journals. Themes were developed from the students' writings and their verbal discourse using Grounded Theory. Journals and verbal discourse were rated following the themes of Knowledge Telling (KT) and Knowledge Transformation (KTR). Concept maps were developed for the pre- and post-test lessons (written and verbal discourses) by the raters in an attempt to further explain the knowledge that the students conveyed. The results of this study suggest that SLD are able to demonstrate knowledge about mammals better through verbal discourse than written discourse. While the NA students wrote more and used more technical discourse than did their SLD peers, the conceptual understanding of the topic by the SLD was no less inclusive than that of their NA peers when assessed verbally. The journals demonstrated limited conceptual growth for the SLD. Further, while lexical density is important to the development of knowledge in science, this study suggests that "conceptual density" may be another important indicator to examine.

  13. Individual differences in normal body temperature: longitudinal big data analysis of patient records

    Science.gov (United States)

    Samra, Jasmeet K; Mullainathan, Sendhil

    2017-01-01

    Abstract Objective To estimate individual level body temperature and to correlate it with other measures of physiology and health. Design Observational cohort study. Setting Outpatient clinics of a large academic hospital, 2009-14. Participants 35 488 patients who neither received a diagnosis for infections nor were prescribed antibiotics, in whom temperature was expected to be within normal limits. Main outcome measures Baseline temperatures at individual level, estimated using random effects regression and controlling for ambient conditions at the time of measurement, body site, and time factors. Baseline temperatures were correlated with demographics, medical comorbidities, vital signs, and subsequent one year mortality. Results In a diverse cohort of 35 488 patients (mean age 52.9 years, 64% women, 41% non-white race) with 243 506 temperature measurements, mean temperature was 36.6°C (95% range 35.7-37.3°C, 99% range 35.3-37.7°C). Several demographic factors were linked to individual level temperature, with older people the coolest (-0.021°C for every decade). Comorbidities were linked to lower temperature (eg, hypothyroidism: -0.013°C, P=0.01) or higher temperature (eg, cancer: 0.020°C), as were physiological measures (eg, body mass index: 0.002°C per kg/m2), but together these factors explained only a small part of individual temperature variation. Despite this, unexplained temperature variation was a significant predictor of subsequent mortality: controlling for all measured factors, an increase of 0.149°C (1 SD of individual temperature in the data) was linked to 8.4% higher one year mortality (P=0.014). Conclusions Individuals' baseline temperatures showed meaningful variation that was not due solely to measurement error or environmental factors. Baseline temperatures correlated with demographics, comorbid conditions, and physiology, but these factors explained only a small part of individual temperature variation. Unexplained variation in baseline temperature, however, strongly predicted mortality. PMID:29237616

  14. Meta-Analysis: Effects of Probiotic Supplementation on Lipid Profiles in Normal to Mildly Hypercholesterolemic Individuals.

    Directory of Open Access Journals (Sweden)

    Mikiko Shimizu

    Full Text Available Recent experimental and clinical studies have suggested that probiotic supplementation has beneficial effects on serum lipid profiles. However, there are conflicting results on the efficacy of probiotic preparations in reducing serum cholesterol. To evaluate the effects of probiotics on human serum lipid levels, we conducted a meta-analysis of interventional studies. Eligible reports were obtained by searches of electronic databases. We included randomized, controlled clinical trials comparing probiotic supplementation with placebo or no treatment (control). Statistical analysis was performed with Review Manager 5.3.3. Subanalyses were also performed. Eleven of 33 randomized clinical trials retrieved were eligible for inclusion in the meta-analysis. No participant had received any cholesterol-lowering agent. Probiotic interventions (including fermented milk products and probiotics) produced changes in total cholesterol (TC) (mean difference -0.17 mmol/L, 95% CI: -0.27 to -0.07 mmol/L) and low-density lipoprotein cholesterol (LDL-C) (mean difference -0.22 mmol/L, 95% CI: -0.30 to -0.13 mmol/L). High-density lipoprotein cholesterol and triglyceride levels did not differ significantly between probiotic and control groups. In subanalysis, long-term (>4 week) probiotic intervention was statistically more effective in decreasing TC and LDL-C than short-term (≤4 week) intervention. The decreases in TC and LDL-C levels with probiotic intervention were greater in mildly hypercholesterolemic than in normocholesterolemic individuals. Both fermented milk product and probiotic preparations decreased TC and LDL-C levels. Gaio and the Lactobacillus acidophilus strain reduced TC and LDL-C levels to a greater extent than other bacterial strains. In conclusion, this meta-analysis showed that probiotic supplementation could be useful in the primary prevention of hypercholesterolemia and may lead to reductions in risk factors for cardiovascular disease.
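    Pooled mean differences of the kind quoted above come from inverse-variance weighting. A minimal fixed-effect sketch follows; the per-study values are hypothetical, and Review Manager would normally also fit a random effects model and heterogeneity statistics.

```python
import math

def pool_fixed_effect(mean_diffs, std_errors):
    """Inverse-variance pooled mean difference with a 95% CI."""
    weights = [1.0 / se ** 2 for se in std_errors]  # precision weights
    pooled = sum(w * d for w, d in zip(weights, mean_diffs)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical total-cholesterol mean differences (mmol/L) from three trials.
md, ci = pool_fixed_effect([-0.25, -0.10, -0.18], [0.08, 0.06, 0.10])
```

    The pooled estimate always lands inside the range of the study estimates, weighted toward the most precise trial, and the pooled standard error is smaller than any single study's.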

  15. Comparative genetic analysis of trichome-less and normal pod genotypes of Mucuna pruriens (Fabaceae).

    Science.gov (United States)

    Dhawan, S S; Rai, G K; Darokar, M P; Lal, R K; Misra, H O; Khanuja, S P S

    2011-09-15

    Velvet bean (Mucuna pruriens) seeds contain the catecholic amino acid L-DOPA (L-3,4-dihydroxyphenylalanine), which is a neurotransmitter precursor used for the treatment of Parkinson's disease and mental disorders. The great demand for L-DOPA is largely met by the pharmaceutical industry through extraction of the compound from wild populations of this plant; commercial exploitation of this compound is hampered by its limited availability. The trichomes present on the pods can cause severe itching, blisters and dermatitis, discouraging cultivation. We screened genetic stocks of velvet bean for the trichome-less trait, along with high seed yield and L-DOPA content. The highest yielding trichome-less elite strain was selected and identified on the basis of a PCR-based DNA fingerprinting method (RAPD), using deca-nucleotide primers. A genetic similarity index matrix was obtained through multivariate analysis using Nei and Li's coefficient. The similarity coefficients were used to generate a tree for cluster analysis using the UPGMA method. Analysis of amplification spectra of 408 bands obtained with 56 primers allowed us to distinguish a trichome-less elite strain of M. pruriens.
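    Nei and Li's similarity coefficient for two RAPD band profiles is 2·n_ab/(n_a + n_b), where n_ab is the number of shared bands and n_a, n_b the band counts of each profile. A small sketch with hypothetical presence/absence vectors:

```python
def nei_li(bands_a, bands_b):
    """Nei & Li (Dice) similarity for 0/1 band-presence vectors."""
    shared = sum(a and b for a, b in zip(bands_a, bands_b))
    return 2 * shared / (sum(bands_a) + sum(bands_b))

# Hypothetical 8-band RAPD profiles for two strains: 3 shared bands,
# 5 and 4 bands in total, giving 2*3/(5+4) = 2/3.
s = nei_li([1, 1, 0, 1, 0, 1, 1, 0], [1, 0, 0, 1, 1, 1, 0, 0])
```

    The full analysis would compute this coefficient for every pair of strains and feed 1 − S as a distance into UPGMA clustering.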

  16. Analysis Study of Stevioside and Rebaudioside A from Stevia rebaudiana Bertoni by Normal Phase SPE and RP-HPLC

    Science.gov (United States)

    Martono, Y.; Rohman, A.; Riyanto, S.; Martono, S.

    2018-04-01

    A Solid Phase Extraction (SPE) method using silica as the sorbent for analysis of stevioside and rebaudioside A in Stevia rebaudiana Bertoni leaf has not previously been reported. The aim of this study is to develop an SPE method using silica as sorbent for Reverse Phase-High Performance Liquid Chromatography (RP-HPLC) analysis of stevioside and rebaudioside A in S. rebaudiana leaf. The results of this study indicate the following optimal conditions for normal phase SPE (silica): the cartridge is conditioned with 3.0 mL of hexane; the sample loading volume is 0.1 mL; the cartridge is eluted with 1.0 mL acetonitrile:water (80:20, v/v) to separate both analytes; and the cartridge is washed with 0.3 mL each of chloroform and water. The developed SPE sample preparation method meets accuracy and precision requirements and can be used for the analysis of stevioside and rebaudioside A by RP-HPLC.

  17. New applications to computerized tomography: analysis of solid dosage forms produced by pharmaceutical industry

    International Nuclear Information System (INIS)

    Oliveira Junior, Jose Martins de; Martins, Antonio Cesar Germano

    2009-01-01

    Full text: In recent years, computerized tomography (CT) has been used as a new probe to study solid dosage forms (tablets) produced by the pharmaceutical industry. This new approach to studying the tablet, powder, and granulation properties of interest to the pharmaceutical industry is very suitable: first, because CT can generate information that the technologies traditionally used in this kind of analysis cannot, such as the density distribution of internal structures, tablet dimensions, pore size distribution, particle shape, and evidence for distinguishing official from unofficial (counterfeit) copies of solid dosage forms; second, because CT is a nondestructive technique, allowing the same tablets or granules to be used in other analyses. In this work we discuss how CT can be used to acquire and reconstruct the internal microstructure of tablets and granules. CT is based on the attenuation of X-rays passing through matter; attenuation depends on the density and atomic number of the material being scanned. In this work, a micro-CT X-ray scanner (manufactured by the group of Applied Nuclear Physics at University of Sorocaba) was used to obtain three-dimensional images of the tablets and granules for nondestructive analysis. These images showed a non-uniform density distribution of material inside some tablets, the morphology of some of the granules analyzed, the integrity of a liquid-filled soft-gelatin capsule, and the distribution of the different constituents in an osmotic controlled-release dosage form. The present work shows that it is possible to use X-ray microtomography to obtain useful qualitative and quantitative information on the structure of pharmaceutical dosage forms. (author)
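    The attenuation that CT reconstruction rests on follows the Beer-Lambert law, I = I0·exp(−Σ μ_i x_i) along each ray. A one-ray sketch (the attenuation coefficients and layer thicknesses below are illustrative assumptions, not calibrated tablet data):

```python
import math

def transmitted_intensity(i0, layers):
    """Beer-Lambert attenuation through layers of (mu per cm, thickness in cm)."""
    return i0 * math.exp(-sum(mu * x for mu, x in layers))

# One ray through a hypothetical tablet: dense coating, porous core, coating.
layers = [(0.5, 0.05), (0.2, 0.6), (0.5, 0.05)]  # exponent sums to 0.17
i = transmitted_intensity(1000.0, layers)
```

    A tomograph measures such line integrals of μ from many angles and inverts them to recover the μ map, which is why internal density differences in a tablet show up as contrast in the reconstructed slices.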

  18. Frequency analysis of CSF flow on cine-MRI in normal pressure hydrocephalus

    International Nuclear Information System (INIS)

    Miyati, Tosiaki; Kasuga, Toshio; Imai, Hiroshi; Fujita, Hiroshi; Mase, Mitsuhito; Itikawa, Katuhiro

    2001-01-01

    To clarify the flow dynamics of intracranial cerebrospinal fluid (CSF) in normal pressure hydrocephalus (NPH), frequency analyses of CSF flow measured with ECG-gated phase contrast cine magnetic resonance imaging (MRI) were performed. The amplitude and phase of the CSF flow spectra in the aqueduct were determined in patients with NPH after a subarachnoid hemorrhage (SAH-NPH group, n=26), an idiopathic NPH (I-NPH group, n=4), an asymptomatic ventricular dilation or a brain atrophy (VD group, n=21), and in healthy volunteers (control group, n=25). The changes of CSF flow spectra were also analyzed 5 and 15 minutes after an intravenous injection of acetazolamide. Moreover, a phase transfer function (PTF) calculated from the spectra of the driving vascular pulsation and CSF flow in the aqueduct was assessed in patients with SAH-NPH and control groups before and after acetazolamide injection. These values were compared with the pressure volume response (PVR). The amplitude of the 1st-3rd harmonics in the SAH-NPH or I-NPH group was significantly larger than in the control or VD group because of a decrease in compliance (increase in PVR). The phase of the 1st harmonic in the SAH-NPH group was significantly different from that in the control or VD group, but no difference was found between the control and VD groups. The amplitude of the 0-3rd harmonics increased, and the phase of the 1st harmonic changed in all groups after an acetazolamide injection. An evaluation of the time course of the direct current component of CSF flow provided further information about the compensatory faculty of the cerebrospinal cavity. A PTF of the 1st harmonic in the SAH-NPH group was significantly larger than in the control group, and a positive correlation was noted between PTF of the 1st harmonic and PVR.
In conclusion, frequency analyses of CSF flow measured by cine-MRI make it possible to obtain noninvasively a more detailed picture of the pathophysiology of NPH and of changes in intracranial
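    The harmonic amplitudes and phases analyzed above can be extracted by projecting one gated flow cycle onto sinusoids at multiples of the cardiac frequency. A minimal sketch on a synthetic cycle (not patient data; real gated MRI data would first be interpolated to a uniform phase grid):

```python
import math

def harmonics(samples, n_harmonics=3):
    """(amplitude, phase) of the first harmonics of one gated cycle."""
    n = len(samples)
    result = []
    for k in range(1, n_harmonics + 1):
        re = sum(s * math.cos(2 * math.pi * k * t / n)
                 for t, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * t / n)
                 for t, s in enumerate(samples))
        # Phase is referenced to cos(k*theta + phase).
        result.append((2 * math.hypot(re, im) / n, math.atan2(-im, re)))
    return result

# Synthetic CSF flow over one cardiac cycle (64 gated frames): a strong
# 1st harmonic with a 0.8 rad phase lag plus a weak 2nd harmonic.
n = 64
cycle = [3.0 * math.cos(2 * math.pi * t / n - 0.8)
         + 0.5 * math.cos(4 * math.pi * t / n) for t in range(n)]
h = harmonics(cycle)
```

    The phase transfer function the abstract mentions would then be built by comparing these phases against the corresponding harmonics of the driving vascular pulsation.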

  19. Frequency analysis of CSF flow on cine-MRI in normal pressure hydrocephalus

    Energy Technology Data Exchange (ETDEWEB)

    Miyati, Tosiaki; Kasuga, Toshio; Imai, Hiroshi [Kanazawa Univ. (Japan). School of Medicine; Fujita, Hiroshi; Mase, Mitsuhito; Itikawa, Katuhiro

    2001-09-01

    To clarify the flow dynamics of intracranial cerebrospinal fluid (CSF) in normal pressure hydrocephalus (NPH), frequency analyses of CSF flow measured with ECG-gated phase contrast cine magnetic resonance imaging (MRI) were performed. The amplitude and phase of the CSF flow spectra in the aqueduct were determined in patients with NPH after a subarachnoid hemorrhage (SAH-NPH group, n=26), an idiopathic NPH (I-NPH group, n=4), an asymptomatic ventricular dilation or a brain atrophy (VD group, n=21), and in healthy volunteers (control group, n=25). The changes of CSF flow spectra were also analyzed 5 and 15 minutes after an intravenous injection of acetazolamide. Moreover, a phase transfer function (PTF) calculated from the spectra of the driving vascular pulsation and CSF flow in the aqueduct was assessed in patients with SAH-NPH and control groups before and after acetazolamide injection. These values were compared with the pressure volume response (PVR). The amplitude of the 1st-3rd harmonics in the SAH-NPH or I-NPH group was significantly larger than in the control or VD group because of a decrease in compliance (increase in PVR). The phase of the 1st harmonic in the SAH-NPH group was significantly different from that in the control or VD group, but no difference was found between the control and VD groups. The amplitude of the 0-3rd harmonics increased, and the phase of the 1st harmonic changed in all groups after an acetazolamide injection. An evaluation of the time course of the direct current component of CSF flow provided further information about the compensatory faculty of the cerebrospinal cavity. A PTF of the 1st harmonic in the SAH-NPH group was significantly larger than in the control group, and a positive correlation was noted between PTF of the 1st harmonic and PVR.
In conclusion, frequency analyses of CSF flow measured by cine-MRI make it possible to obtain noninvasively a more detailed picture of the pathophysiology of NPH and of changes in intracranial

  20. Automated impedance-manometry analysis detects esophageal motor dysfunction in patients who have non-obstructive dysphagia with normal manometry.

    Science.gov (United States)

    Nguyen, N Q; Holloway, R H; Smout, A J; Omari, T I

    2013-03-01

    Automated integrated analysis of impedance and pressure signals has been reported to identify patients at risk of developing dysphagia post fundoplication. This study aimed to investigate this analysis in the evaluation of patients with non-obstructive dysphagia (NOD) and normal manometry (NOD/NM). Combined impedance-manometry was performed in 42 patients (27F : 15M; 56.2 ± 5.1 years) and compared with that of 24 healthy subjects (8F : 16M; 48.2 ± 2.9 years). Both liquid and viscous boluses were tested. MATLAB-based algorithms defined the median intrabolus pressure (IBP), IBP slope, peak pressure (PP), and timing of bolus flow relative to peak pressure (TNadImp-PP). An index of pressure and flow (PFI) in the distal esophagus was derived from these variables. Diagnoses based on conventional manometric assessment: diffuse spasm (n = 5), non-specific motor disorders (n = 19), and normal (n = 11). Patients with achalasia (n = 7) were excluded from automated impedance-manometry (AIM) analysis. Only 2/11 (18%) patients with NOD/NM had evidence of flow abnormality on conventional impedance analysis. Several variables derived by integrated impedance-pressure analysis were significantly different in patients as compared with healthy: higher PNadImp (P < 0.01), IBP (P < 0.01) and IBP slope (P < 0.05), and shorter TNadImp_PP (P = 0.01). The PFI of NOD/NM patients was significantly higher than that in healthy (liquid: 6.7 vs 1.2, P = 0.02; viscous: 27.1 vs 5.7, P < 0.001) and 9/11 NOD/NM patients had abnormal PFI. Overall, the addition of AIM analysis provided diagnoses and/or a plausible explanation in 95% (40/42) of patients who presented with NOD. Compared with conventional pressure-impedance assessment, integrated analysis is more sensitive in detecting subtle abnormalities in esophageal function in patients with NOD and normal manometry. © 2012 Blackwell Publishing Ltd.

  1. Voxel-based analysis of Tc-99 m ECD brain perfusion SPECT in patients with normal pressure hydrocephalus

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Bora [Department of Neurology, College of Medicine, Catholic University of Korea, Seoul 137-701 (Korea, Republic of); Yang, Dong-Won [Department of Neurology, College of Medicine, Catholic University of Korea, Seoul 137-701 (Korea, Republic of)], E-mail: neuroman@catholic.ac.kr; Shim, Yong-Soo; Chung, Sung-Woo [Department of Neurology, College of Medicine, Catholic University of Korea, Seoul 137-701 (Korea, Republic of); Ahn, Kook-Jin; O, Joo-Hyun; Kim, Sung-Hoon; Sohn, Hyung-Sun; Chung, Soo-Kyo [Department of Radiology, College of Medicine, Catholic University of Korea, Seoul 137-701 (Korea, Republic of); Chung, Yong-An [Department of Radiology, College of Medicine, Catholic University of Korea, Seoul 137-701 (Korea, Republic of); East-West Research Institute of Translational Medicine (EWTM), Incheon St. Mary's Hospital, Incheon 403-720 (Korea, Republic of)], E-mail: nm@catholic.ac.kr

    2009-07-15

    Idiopathic normal pressure hydrocephalus (iNPH) is a reversible dementia characterized by gait disturbance, incontinence and dementia. This study investigates the neuropsychological characteristics and changes of regional cerebral blood flow (rCBF) in patients with iNPH. Ten patients who met the criteria of probable iNPH and 13 normal control subjects were evaluated. General cognitive function and detailed neuropsychological functions were measured by the K-MMSE and a comprehensive neuropsychological battery. Tc-99m-ethyl cysteinate dimer (Tc-99m-ECD) single photon emission computed tomography (SPECT) was performed to measure rCBF, and statistical parametric mapping (SPM) and the statistical probabilistic brain anatomic map (SPAM) were applied for objective analysis of the SPECT data. On the neuropsychological examination, all the patients showed abnormality in memory, psychomotor speed and frontal executive function. SPM analysis of SPECT images revealed that rCBF in the bilateral thalami, right prefrontal area, bilateral anterior and posterior cingulate gyri, right caudate nucleus, and left parahippocampal gyrus was significantly decreased in patients with iNPH compared to normal controls (uncorrected P<0.005). In SPAM analysis, rCBF reduction was observed in the bilateral prefrontal areas, anterior and posterior cingulate gyri, and caudate nuclei. We found that rCBF changes occurred predominantly in prefrontal and subcortical areas, that the changes were associated with the frontal subcortical circuit, and that the affected frontal subcortical circuit may contribute to the cognitive decline seen in iNPH patients. The reduction of rCBF and clinical cognitive impairment are closely connected in patients with iNPH.

  2. Voxel-based analysis of Tc-99 m ECD brain perfusion SPECT in patients with normal pressure hydrocephalus

    International Nuclear Information System (INIS)

    Yoon, Bora; Yang, Dong-Won; Shim, Yong-Soo; Chung, Sung-Woo; Ahn, Kook-Jin; O, Joo-Hyun; Kim, Sung-Hoon; Sohn, Hyung-Sun; Chung, Soo-Kyo; Chung, Yong-An

    2009-01-01

    Idiopathic normal pressure hydrocephalus (iNPH) is a reversible dementia characterized by gait disturbance, incontinence and dementia. This study investigates the neuropsychological characteristics and changes of regional cerebral blood flow (rCBF) in patients with iNPH. Ten patients who met the criteria of probable iNPH and 13 normal control subjects were evaluated. General cognitive function and detailed neuropsychological functions were measured by the K-MMSE and a comprehensive neuropsychological battery. Tc-99m-ethyl cysteinate dimer (Tc-99m-ECD) single photon emission computed tomography (SPECT) was performed to measure rCBF, and statistical parametric mapping (SPM) and the statistical probabilistic brain anatomic map (SPAM) were applied for objective analysis of the SPECT data. On the neuropsychological examination, all the patients showed abnormality in memory, psychomotor speed and frontal executive function. SPM analysis of SPECT images revealed that rCBF in the bilateral thalami, right prefrontal area, bilateral anterior and posterior cingulate gyri, right caudate nucleus, and left parahippocampal gyrus was significantly decreased in patients with iNPH compared to normal controls (uncorrected P<0.005). In SPAM analysis, rCBF reduction was observed in the bilateral prefrontal areas, anterior and posterior cingulate gyri, and caudate nuclei. We found that rCBF changes occurred predominantly in prefrontal and subcortical areas, that the changes were associated with the frontal subcortical circuit, and that the affected frontal subcortical circuit may contribute to the cognitive decline seen in iNPH patients. The reduction of rCBF and clinical cognitive impairment are closely connected in patients with iNPH.

  3. An elemental abundance analysis of the superficially normal A star Vega

    International Nuclear Information System (INIS)

    Adelman, S.J.; Gulliver, A.F.

    1990-01-01

    An elemental abundance analysis of Vega has been performed using high-signal-to-noise 2.4 A/mm Reticon observations of the region 4313-4809 A. Vega is found to be a metal-poor star with a mean underabundance of 0.60 dex. The He/H ratio of 0.03 as derived from He I 4472 A suggests that the superficial helium convection zone has disappeared and that radiative diffusion is producing the photospheric abundance anomalies. 45 refs

  4. Volumetric analysis of the normal infant brain and in intrauterine growth retardation

    DEFF Research Database (Denmark)

    Toft, P B; Leth, H; Ring, P B

    1995-01-01

    and the volumes were determined by encircling each structure of interest on every slice. Segmentation into grey matter, white matter and CSF was done by semi-automatic discriminant analysis. Growth charts for the cerebrum, cerebellum, corpora striata, thalami, ventricles, and grey and white matter are provided...... for infants with appropriate birth weight. The striatal (P = 0.02) and thalamic (P matter to white matter (G/W-ratio) increased (P = 0.01). In the neonatal patients, brain volumes were independently associated...... growth retardation reduces grey matter volume more than white matter....

  5. Morphological analysis of the cervical spinal canal, dural tube and spinal cord in normal individuals using CT myelography

    International Nuclear Information System (INIS)

    Inoue, H.; Ohmori, K.; Takatsu, T.; Teramoto, T.; Ishida, Y.; Suzuki, K.

    1996-01-01

    To verify the conventional concept of "developmental stenosis of the cervical spinal canal", we performed a morphological analysis of the relations of the cervical spinal canal, dural tube and spinal cord in normal individuals. The sagittal diameter, area and circularity of the three structures, and the dispersion of each parameter, were examined on axial sections of CT myelograms of 36 normal subjects. The spinal canal was narrowest at C4, followed by C5, while the spinal cord was largest at C4/5. The area and circularity of the cervical spinal cord were not significantly correlated with any parameter of the spinal canal nor with the sagittal diameter and area of the dural tube at any level examined, and the spinal cord showed less individual variation than the bony canal. Compression of the spinal cord might be expected whenever the sagittal diameter of the spinal canal is below the lower limit of normal, that is, about 12 mm on plain radiographs. Thus, we concluded that the concept of "developmental stenosis of the cervical spinal canal" was reasonable and acceptable. (orig.). With 2 figs., 3 tabs
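    The circularity measured above is conventionally defined as 4πA/P², which equals 1 for a perfect circle and falls as a cross-section flattens. A quick polygon-based sketch (the abstract does not give its exact formula, so this standard definition is an assumption):

```python
import math

def circularity(points):
    """4*pi*A / P**2 for a closed polygon given as (x, y) vertices."""
    n = len(points)
    area = 0.0
    perim = 0.0
    for i in range(n):
        x0, y0 = points[i]
        x1, y1 = points[(i + 1) % n]
        area += x0 * y1 - x1 * y0                 # shoelace formula
        perim += math.hypot(x1 - x0, y1 - y0)
    area = abs(area) / 2.0
    return 4 * math.pi * area / perim ** 2

# A near-circle (360-gon) scores ~1; a square scores pi/4 ~ 0.785.
circle = [(math.cos(2 * math.pi * k / 360), math.sin(2 * math.pi * k / 360))
          for k in range(360)]
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
```

    Applied to segmented CT myelogram contours, a lower circularity of the cord cross-section indicates the flattening that accompanies compression.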

  6. Methods and data for HTGR fuel performance and radionuclide release modeling during normal operation and accidents for safety analysis

    International Nuclear Information System (INIS)

    Verfondern, K.; Martin, R.C.; Moormann, R.

    1993-01-01

    The previous status report released in 1987 on reference data and calculation models for fission product transport in High-Temperature, Gas-Cooled Reactor (HTGR) safety analyses has been updated to reflect the current state of knowledge in the German HTGR program. The content of the status report has been expanded to include information from other national programs in HTGRs to provide comparative information on methods of analysis and the underlying database for fuel performance and fission product transport. The release and transport of fission products during normal operating conditions and during the accident scenarios of core heatup, water and air ingress, and depressurization are discussed. (orig.) [de

  7. Single-Phase Full-Wave Rectifier as an Effective Example to Teach Normalization, Conduction Modes, and Circuit Analysis Methods

    Directory of Open Access Journals (Sweden)

    Predrag Pejovic

    2013-12-01

    Full Text Available Application of a single phase rectifier as an example in teaching circuit modeling, normalization, operating modes of nonlinear circuits, and circuit analysis methods is proposed. The rectifier, supplied from a voltage source through an inductive impedance, is analyzed in the discontinuous as well as the continuous conduction mode. A completely analytical solution for the continuous conduction mode is derived. Appropriate numerical methods are proposed to obtain the circuit waveforms in both operating modes and to compute the performance parameters. Source code of the program that performs this computation is provided.
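    For the normalization the abstract refers to, the resistive-load full-wave rectifier is the textbook starting point: with all voltages normalized to the amplitude Vm, the average output is 2/π and the RMS is 1/√2. A numeric check by simple summation (this is not the paper's inductive-impedance analysis, which requires the conduction-mode treatment):

```python
import math

# Normalized full-wave rectified sine: v(theta) = |sin(theta)|, Vm = 1.
# Integrate over one half period [0, pi]; average -> 2/pi, RMS -> 1/sqrt(2).
N = 100_000
dt = math.pi / N
avg = sum(abs(math.sin(i * dt)) for i in range(N)) * dt / math.pi
rms = math.sqrt(sum(math.sin(i * dt) ** 2 for i in range(N)) * dt / math.pi)
```

    Working in normalized units like this is exactly what lets one derive conduction-mode boundaries once, independent of the actual source amplitude and load values.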

  8. Structural analysis of surface film on alloy 600 formed under environment of PWR primary water

    Energy Technology Data Exchange (ETDEWEB)

    Terachi, Takumi; Totsuka, Nobuo; Yamada, Takuyo; Nakagawa, Tomokazu [Inst. of Nuclear Safety System Inc., Mihama, Fukui (Japan); Deguchi, Hiroshi [Kansai Electric Power Co., Inc., Osaka (Japan); Horiuchi, Masaki; Oshitani, Masato [Kanden Kako Co., Ltd., Osaka (Japan)

    2002-09-01

    It has been shown by one of the present authors and others that PWSCC of alloy 600 is related to the dissolved hydrogen concentration (DH) in water and to the oxide film structure. However, the mechanism of PWSCC is not yet clear. Therefore, in order to investigate the relationship between them, structural analysis of the oxide film formed under the environment of PWR primary water was carried out using X-ray diffraction, scanning electron microscopy and transmission electron microscopy. In particular, to perform accurate analysis, synchrotron orbital radiation at SPring-8 was used for thin-film X-ray diffraction measurements. The observations are as follows: 1. the oxide film is mainly composed of NiO under the condition without hydrogen; 2. in the environment of DH 2.75 ppm, the oxide film forms thin spinel structures; 3. on the other hand, needle-like oxides are formed at DH 1 ppm. For this reason, around 1 ppm DH there appears to be a boundary between the formation of stable NiO and spinel oxide, and this agrees with the peak range of PWSCC susceptibility with respect to hydrogen. This suggests that the NiO/spinel oxide boundary affects the SCC susceptibility. (author)

  9. Neutron activation analysis of alternative waste forms at the Savannah River Laboratory

    International Nuclear Information System (INIS)

    Johns, R.A.

    1981-01-01

    A remotely controlled system for neutron activation of candidate high-level waste (HLW) isolation forms was built by the Savannah River Laboratory at a Savannah River Plant reactor. With this system, samples can be irradiated for up to 24 hours and transferred through pneumatic tubing to a shielded repository until their activity is low enough for them to be handled in a radiobench. The principal use of the system is to support the Alternative Waste Forms Leach Testing (AWFLT) Program, in which the comparative leachability of the various waste forms will be determined. The experimental method used in this work is based on neutron activation analysis techniques. Neutron irradiation of the solid waste form containing simulated HLW sludge activates elements in the sample. After suitable leaching of the solid matrix in standard solutions, the leachate and solid are assayed for gamma-emitting nuclides. From these measurements, the fraction of a specific element leached can be determined.

  10. Confirmatory factor analysis reveals a latent cognitive structure common to bipolar disorder, schizophrenia, and normal controls.

    Science.gov (United States)

    Schretlen, David J; Peña, Javier; Aretouli, Eleni; Orue, Izaskun; Cascella, Nicola G; Pearlson, Godfrey D; Ojeda, Natalia

    2013-06-01

    We sought to determine whether a single hypothesized latent factor structure would characterize cognitive functioning in three distinct groups. We assessed 576 adults (340 community controls, 126 adults with bipolar disorder, and 110 adults with schizophrenia) using 15 measures derived from nine cognitive tests. Confirmatory factor analysis (CFA) was conducted to examine the fit of a hypothesized six-factor model. The hypothesized factors included attention, psychomotor speed, verbal memory, visual memory, ideational fluency, and executive functioning. The six-factor model provided an excellent fit for all three groups [e.g., for the schizophrenia group, root mean square error of approximation (RMSEA) = 0.06 and comparative fit index (CFI) = 0.98]. Alternate models that combined fluency with processing speed or verbal and visual memory reduced the goodness of fit. Multi-group CFA results supported factor invariance across the three groups. Confirmatory factor analysis supported a single six-factor structure of cognitive functioning among patients with schizophrenia or bipolar disorder and community controls. While the three groups clearly differ in level of performance, they share a common underlying architecture of information processing abilities. These cognitive factors could provide useful targets for clinical trials of treatments that aim to enhance information processing in persons with neurological and neuropsychiatric disorders. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  11. Normalization in Unsupervised Segmentation Parameter Optimization: A Solution Based on Local Regression Trend Analysis

    Directory of Open Access Journals (Sweden)

    Stefanos Georganos

    2018-02-01

    Full Text Available In object-based image analysis (OBIA, the appropriate parametrization of segmentation algorithms is crucial for obtaining satisfactory image classification results. One way this can be done is by unsupervised segmentation parameter optimization (USPO. A popular USPO method does this through the optimization of a “global score” (GS, which minimizes intrasegment heterogeneity and maximizes intersegment heterogeneity. However, the calculated GS values are sensitive to the minimum and maximum ranges of the candidate segmentations. Previous research proposed the use of fixed minimum/maximum threshold values for the intrasegment/intersegment heterogeneity measures to deal with this sensitivity to user-defined ranges, but the performance of this approach has not been investigated in detail. In the context of a very-high-resolution remote sensing urban application, we show the limitations of the fixed threshold approach, both theoretically and in application, and instead propose a novel solution to identify the range of candidate segmentations using local regression trend analysis. We found that the proposed approach yielded significant improvements over the use of fixed minimum/maximum values, is less subjective than user-defined threshold values and, thus, can be of merit for a fully automated procedure and big data applications.
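
    Why the candidate range matters can be seen in a minimal numeric sketch (hypothetical heterogeneity values and a simplified, equally weighted form of the GS; not the authors' implementation):

```python
import numpy as np

def global_score(intra, inter):
    """Simplified global score: min-max normalize each criterion over the
    candidate set, then sum so that lower intrasegment heterogeneity and
    higher intersegment heterogeneity both raise the score."""
    intra_n = (np.max(intra) - intra) / (np.max(intra) - np.min(intra))
    inter_n = (inter - np.min(inter)) / (np.max(inter) - np.min(inter))
    return intra_n + inter_n

# Heterogeneity measures for five candidate segmentations (illustrative).
intra = np.array([0.8, 0.6, 0.5, 0.4, 0.9])
inter = np.array([0.2, 0.4, 0.5, 0.7, 0.1])

full = global_score(intra, inter)
# Dropping one candidate changes the min/max range, and with it every
# normalized score -- the sensitivity discussed in the abstract.
subset = global_score(intra[:4], inter[:4])
print(full[:4])
print(subset)
```

    The best candidate stays the same here, but the scores of all others shift, which is why fixed thresholds or a data-driven range (as the trend-analysis approach proposes) are needed for comparability.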

  12. CATHENA Analysis Of Candu Advanced Passive Moderator Concept In Normal Operation Condition

    International Nuclear Information System (INIS)

    Alfa, Sudjatmi K

    2001-01-01

    In the CANDU - advanced passive moderator (APM) concept, the positive void reactivity is eliminated by reducing the density of the moderator. The simple model of the CANDU APM concept consists of the calandria, a heat exchanger, a pump and a stabilizing tank, along with connecting piping. The calandria is divided into two parts: one part simulates the downflow area, while the other simulates the upflow area. To demonstrate the thermalhydraulic behavior of the APM concept, the Canadian Algorithm for Thermalhydraulic Network Analysis (CATHENA) code is used. Simulations for pressure boundary conditions of 300, 330 and 360 kPa and for water coolant mass flow rate boundary conditions of 2000 and 3000 kg/s have been studied. Preliminary results show that there is boiling in the core, with vapor condensing in the heat exchanger. It is important to note that the solution had not reached steady state when the boiling occurred

  13. Forming Attitudes That Predict Future Behavior: A Meta-Analysis of the Attitude–Behavior Relation

    Science.gov (United States)

    Glasman, Laura R.; Albarracín, Dolores

    2016-01-01

    A meta-analysis (k of conditions = 128; N = 4,598) examined the influence of factors present at the time an attitude is formed on the degree to which this attitude guides future behavior. The findings indicated that attitudes correlated with a future behavior more strongly when they were easy to recall (accessible) and stable over time. Because of increased accessibility, attitudes more strongly predicted future behavior when participants had direct experience with the attitude object and reported their attitudes frequently. Because of the resulting attitude stability, the attitude–behavior association was strongest when attitudes were confident, when participants formed their attitude on the basis of behavior-relevant information, and when they received or were induced to think about one- rather than two-sided information about the attitude object. PMID:16910754

  14. Building form and environmental performance: archetypes, analysis and an arid climate

    Energy Technology Data Exchange (ETDEWEB)

    Ratti, C.; Raydan, D.; Steemers, K. [Department of Architecture, The Martin Centre for Architectural and Urban Studies, University of Cambridge, Cambridge (United Kingdom)

    2002-07-01

    Leslie Martin and others at Cambridge University addressed the question 'What building forms make the best use of land?' in a number of influential papers published in the late 1960s. They selected six simplified urban arrays based on archetypal building forms. They then analysed and compared the archetypes in terms of built potential and daylighting criteria, eventually reaching the conclusion that courtyards perform best. Their results, which inspired a generation of designers, are briefly reviewed here and reassessed in environmental terms using innovative computer analysis techniques. Furthermore, the implications of their question, which to date has not addressed the link with climate, are explored using a case study in a hot-arid region. (author)

  15. Knowledge discovery in ophthalmology: analysis of wet form of age-related macular degeneration treatment outcomes

    Science.gov (United States)

    Ulińska, Magdalena; Tataj, Emanuel; Mulawka, Jan J.; Szaflik, Jerzy

    2009-06-01

    Age-related Macular Degeneration (AMD) is, according to epidemiological data, a leading cause of social blindness among elderly people in developed countries. There are two forms of AMD: dry and wet. The former has a good prognosis with a low risk of serious visual deterioration, while the latter usually leads to rapid and severe visual impairment. The aim of our investigations is to analyse the results of so-called real-life treatment of wet AMD. We analysed the outcomes of our patients treated with intravitreal injections of the anti-VEGF drugs Lucentis (61 patients) and Avastin (78 patients), examining changes in visual acuity (functional effect) and central retinal thickness (anatomic effect). Both drugs proved effective in the treatment of the wet form of AMD; however, results were more satisfactory in patients with better baseline visual acuity. In our approach we used the R environment - an integrated suite of software facilities for data analysis and graphics.

  16. Analysis of vegetation recovery surrounding a restored wetland using the normalized difference infrared index (NDII) and normalized difference vegetation index (NDVI)

    Science.gov (United States)

    Wilson, Natalie R.; Norman, Laura

    2018-01-01

    Watershed restoration efforts seek to rejuvenate vegetation, biological diversity, and land productivity at Cienega San Bernardino, an important wetland in southeastern Arizona and northern Sonora, Mexico. Rock detention and earthen berm structures were built on the Cienega San Bernardino over the course of four decades, beginning in 1984 and continuing to the present. Previous research findings show that restoration supports and even increases vegetation health despite ongoing drought conditions in this arid watershed. However, the extent of restoration impacts is still unknown despite qualitative observations of improvement in surrounding vegetation amount and vigor. We analyzed spatial and temporal trends in vegetation greenness and soil moisture by applying the normalized difference vegetation index (NDVI) and normalized difference infrared index (NDII) to one dry summer season Landsat path/row from 1984 to 2016. The study area was divided into zones and spectral data for each zone was analyzed and compared with the precipitation record using statistical measures including linear regression, the Mann–Kendall test, and linear correlation. NDVI and NDII performed differently due to the presence of continued grazing and the effects of grazing on canopy cover; NDVI was better able to track changes in vegetation in areas without grazing while NDII was better at tracking changes in areas with continued grazing. Restoration impacts display higher greenness and vegetation water content levels, greater increases in greenness and water content through time, and a decoupling of vegetation greenness and water content from spring precipitation when compared to control sites in nearby tributary and upland areas. Our results confirm the potential of erosion control structures to affect areas up to 5 km downstream of restoration sites over time and to affect 1 km upstream of the sites.
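
    Both indices are standard normalized band ratios, NDVI = (NIR − Red)/(NIR + Red) and NDII = (NIR − SWIR)/(NIR + SWIR); a minimal sketch with illustrative reflectance values (not data from the study) is:

```python
import numpy as np

def ndvi(nir, red):
    # NDVI = (NIR - Red) / (NIR + Red); tracks canopy greenness.
    return (nir - red) / (nir + red)

def ndii(nir, swir):
    # NDII = (NIR - SWIR) / (NIR + SWIR); SWIR absorption by leaf water
    # makes this index sensitive to vegetation water content.
    return (nir - swir) / (nir + swir)

# Illustrative surface reflectances for two pixels (hypothetical).
nir = np.array([0.45, 0.30])
red = np.array([0.08, 0.15])
swir = np.array([0.20, 0.28])

print(ndvi(nir, red))   # higher value for the greener pixel
print(ndii(nir, swir))  # higher value for the wetter pixel
```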

  17. Analysis of local warm forming of high strength steel using near infrared ray energy

    Energy Technology Data Exchange (ETDEWEB)

    Yang, W. H., E-mail: whyang21@hyundai.com [Hyundai Motor Company, 700 Yeompo-ro, Buk-Gu, Ulsan, 683-791 (Korea, Republic of); Lee, K., E-mail: klee@deform.co.kr [Solution Lab, 502, 102, Dunsan-daero 117 beon-gil, Seo-Gu, Daejeon, 302-834 (Korea, Republic of); Lee, E. H., E-mail: mtgs2@kaist.ac.kr, E-mail: dyyang@kaist.ac.kr; Yang, D. Y., E-mail: mtgs2@kaist.ac.kr, E-mail: dyyang@kaist.ac.kr [KAIST, Science Town, 291 Daehak-ro, Yuseong-Gu, Daejeon 305-701 (Korea, Republic of)

    2013-12-16

    The automotive industry has been pressed to satisfy more rigorous fuel efficiency requirements to promote energy conservation, safety features and cost containment. To satisfy this need, high strength steel has been developed and used for many different vehicle parts. The use of high strength steels, however, requires careful analysis and creativity in order to accommodate its relatively high springback behavior. An innovative method, called local warm forming with near infrared ray, has been developed to help promote the use of high strength steels in sheet metal forming. For this method, local regions of the work piece are heated using infrared ray energy, thereby promoting the reduction of springback behavior. In this research, a V-bend test is conducted with DP980. After springback, the bend angles for specimens without local heating are compared to those with local heating. Numerical analysis has been performed using the commercial program, DEFORM-2D. This analysis is carried out with the purpose of understanding how changes to the local stress distribution will affect the springback during the unloading process. The results between experimental and computational approaches are evaluated to assure the accuracy of the simulation. Subsequent numerical simulation studies are performed to explore best practices with respect to thermal boundary conditions, timing, and applicability to the production environment.

  18. Analysis of iodine chemical form noted from severe fuel damage experiments

    International Nuclear Information System (INIS)

    Cronenberg, A.W.; Osetek, D.J.

    1986-01-01

    Data from the TMI-2 accident have shown that only small amounts of iodine (I) escaped the plant. The postulated reason for such limited release is the formation of CsI (a salt) within fuel, which remains stable in a reducing high-temperature steam-H2 environment. Upon cooldown, CsI would dissolve in water condensate to form an ionic solution. However, recent data from fuel destruction experiments indicate different iodine release behavior that is tied to fuel burnup and oxidation conditions, as well as to fission product concentration levels in the steam/H2 effluent. Analysis of the data indicates that at low-burnup conditions, atomic I release from fuel is favored. Likewise, at low fission product concentrations, HI is the favored chemical form in the steam/H2 environment, not CsI. Results of thermochemical equilibrium and chemical kinetics analyses support the data trends noted from the PBF-SFD tests. An a priori assumption of CsI for risk analysis of all accident sequences may therefore be inappropriate

  19. Analysis of local warm forming of high strength steel using near infrared ray energy

    International Nuclear Information System (INIS)

    Yang, W. H.; Lee, K.; Lee, E. H.; Yang, D. Y.

    2013-01-01

    The automotive industry has been pressed to satisfy more rigorous fuel efficiency requirements to promote energy conservation, safety features and cost containment. To satisfy this need, high strength steel has been developed and used for many different vehicle parts. The use of high strength steels, however, requires careful analysis and creativity in order to accommodate its relatively high springback behavior. An innovative method, called local warm forming with near infrared ray, has been developed to help promote the use of high strength steels in sheet metal forming. For this method, local regions of the work piece are heated using infrared ray energy, thereby promoting the reduction of springback behavior. In this research, a V-bend test is conducted with DP980. After springback, the bend angles for specimens without local heating are compared to those with local heating. Numerical analysis has been performed using the commercial program, DEFORM-2D. This analysis is carried out with the purpose of understanding how changes to the local stress distribution will affect the springback during the unloading process. The results between experimental and computational approaches are evaluated to assure the accuracy of the simulation. Subsequent numerical simulation studies are performed to explore best practices with respect to thermal boundary conditions, timing, and applicability to the production environment

  20. Comparative analysis of the surface exposed proteome of two canine osteosarcoma cell lines and normal canine osteoblasts.

    Science.gov (United States)

    Milovancev, Milan; Hilgart-Martiszus, Ian; McNamara, Michael J; Goodall, Cheri P; Seguin, Bernard; Bracha, Shay; Wickramasekara, Samanthi I

    2013-06-13

    Osteosarcoma (OSA) is the most common primary bone tumor of dogs and carries a poor prognosis despite aggressive treatment. An improved understanding of the biology of OSA is critically needed to allow for development of novel diagnostic, prognostic, and therapeutic tools. The surface-exposed proteome (SEP) of a cancerous cell includes a multifarious array of proteins critical to cellular processes such as proliferation, migration, adhesion, and inter-cellular communication. The specific aim of this study was to define a SEP profile of two validated canine OSA cell lines and a normal canine osteoblast cell line utilizing a biotinylation/streptavidin system to selectively label, purify, and identify surface-exposed proteins by mass spectrometry (MS) analysis. Additionally, we sought to validate a subset of our MS-based observations via quantitative real-time PCR, Western blot and semi-quantitative immunocytochemistry. Our hypothesis was that MS would detect differences in the SEP composition between the OSA and the normal osteoblast cells. Shotgun MS identified 133 putative surface proteins when the output from all samples was combined, with good consistency between biological replicates. Eleven of the MS-detected proteins underwent analysis of gene expression by PCR, all of which were actively transcribed, but varied in expression level. Western blot of whole cell lysates from all three cell lines was effective for Thrombospondin-1, CYR61 and CD44, and indicated that all three proteins were present in each cell line. Semi-quantitative immunofluorescence indicated that CD44 was expressed at much higher levels on the surface of the OSA than the normal osteoblast cell lines. The results of the present study identified numerous differences, and similarities, in the SEP of canine OSA cell lines and normal canine osteoblasts.
The PCR, Western blot, and immunocytochemistry results, for the subset of proteins evaluated, were generally supportive of the mass spectrometry data.

  1. Characteristics of Cerebral Blood Flow in Vascular Dementia using SPM Analysis Compared to Normal Control and Alzheimer's Dementia

    International Nuclear Information System (INIS)

    Kang, Do Young; Park, Kyung Won; Kim, Jae Woo

    2003-01-01

    The cerebral perfusion pattern of vascular dementia (VD) is not well established, and overlap of cerebral perfusion patterns between VD and Alzheimer's dementia (AD) has been reported. The aim of this study was to assess the specific patterns of SPECT findings in VD compared with normal control subjects, and to disclose differences in cerebral blood flow between subjects with VD and AD using statistical parametric mapping analysis. Thirty-two VD patients (mean age 67.8±6.4 years, mean CDR 0.98±0.27), 51 AD patients (mean age 71.4±7.2 years, CDR 1.16±0.47), matched for age and severity of dementia, and 30 normal control subjects (mean age 60.1±7.7 years) participated in this study. The Tc-99m HMPAO brain perfusion SPECT data were analyzed by SPM99. The SPECT data of the patients with VD were compared to those of the control subjects and then compared to the patients with AD. SPM analysis of the SPECT images showed significant perfusion deficits in both frontal (both cingulate gyri, both inferior frontal gyri, B no.47, right frontal rectal gyrus, left frontal subcallosal gyrus, B no.25), both temporal (right insula, B no.13, left superior temporal gyrus, left parahippocampal gyrus, B no.35), occipital (occipital lingual gyrus), right corpus callosum and right cerebellar tonsil regions in subjects with VD compared with normal control subjects (uncorrected p<0.01). Comparison of the two dementia groups (uncorrected p<0.01) revealed significant hypoperfusion in both parietal postcentral gyri, right inferior frontal gyrus (B no.47), left insula, right thalamus (ventral lateral nucleus), right claustrum and right occipital cuneus regions in the VD group compared with AD. There were no typical confined regional hypoperfusion areas but scattered multiple perfusion deficits in VD compared with AD. These findings may help clarify the pathophysiological mechanisms of VD and disclose differences in cerebral blood flow between subjects with VD and AD

  2. Cone-beam CT analysis of patients with obstructive sleep apnea compared to normal controls

    Energy Technology Data Exchange (ETDEWEB)

    Buchanan, Allison; Kalathingal, Sajitha; De Rossi, Scott [Dept. of Oral Health and Diagnostic Sciences, Dental College of Georgia, Augusta University, Augusta (United States); Cohen, Ruben [Park Avenue Oral and Facial Surgery, New York (United States); Loony, Stephen [Dept. of Biostatistics and Epidemiology, Augusta University Medical College of Georgia, Augusta (United States)

    2016-03-15

    To evaluate the upper airway dimensions of obstructive sleep apnea (OSA) and control subjects using a cone-beam computed tomography (CBCT) unit commonly applied in clinical practice in order to assess airway dimensions in the same fashion as that routinely employed in a clinical setting. This was a retrospective analysis utilizing existing CBCT scans to evaluate the dimensions of the upper airway in OSA and control subjects. The CBCT data of sixteen OSA and sixteen control subjects were compared. The average area, average volume, total volume, and total length of the upper airway were computed. Width and anterior-posterior (AP) measurements were obtained on the smallest axial slice. OSA subjects had a significantly smaller average airway area, average airway volume, total airway volume, and mean airway width. OSA subjects had a significantly larger airway length measurement. The mean A-P distance was not significantly different between groups. OSA subjects have a smaller upper airway compared to controls with the exception of airway length. The lack of a significant difference in the mean A-P distance may indicate that patient position during imaging (upright vs. supine) can affect this measurement. Comparison of this study with a future prospective study design will allow for validation of these results.

  3. Cone-beam CT analysis of patients with obstructive sleep apnea compared to normal controls

    International Nuclear Information System (INIS)

    Buchanan, Allison; Kalathingal, Sajitha; De Rossi, Scott; Cohen, Ruben; Loony, Stephen

    2016-01-01

    To evaluate the upper airway dimensions of obstructive sleep apnea (OSA) and control subjects using a cone-beam computed tomography (CBCT) unit commonly applied in clinical practice in order to assess airway dimensions in the same fashion as that routinely employed in a clinical setting. This was a retrospective analysis utilizing existing CBCT scans to evaluate the dimensions of the upper airway in OSA and control subjects. The CBCT data of sixteen OSA and sixteen control subjects were compared. The average area, average volume, total volume, and total length of the upper airway were computed. Width and anterior-posterior (AP) measurements were obtained on the smallest axial slice. OSA subjects had a significantly smaller average airway area, average airway volume, total airway volume, and mean airway width. OSA subjects had a significantly larger airway length measurement. The mean A-P distance was not significantly different between groups. OSA subjects have a smaller upper airway compared to controls with the exception of airway length. The lack of a significant difference in the mean A-P distance may indicate that patient position during imaging (upright vs. supine) can affect this measurement. Comparison of this study with a future prospective study design will allow for validation of these results

  4. Exogenous reference gene normalization for real-time reverse transcription-polymerase chain reaction analysis under dynamic endogenous transcription.

    Science.gov (United States)

    Johnston, Stephen; Gallaher, Zachary; Czaja, Krzysztof

    2012-05-15

    Quantitative real-time reverse transcription-polymerase chain reaction (qPCR) is widely used to investigate transcriptional changes following experimental manipulations to the nervous system. Despite the widespread utilization of qPCR, the interpretation of results is marred by the lack of a suitable reference gene due to the dynamic nature of endogenous transcription. To address this inherent deficiency, we investigated the use of an exogenous spike-in mRNA, luciferase, as an internal reference gene for the 2^(-ΔΔCt) normalization method. To induce dynamic transcription, we systemically administered capsaicin, a neurotoxin selective for C-type sensory neurons expressing the TRPV-1 receptor, to adult male Sprague-Dawley rats. We later isolated nodose ganglia for qPCR analysis with the reference being either exogenous luciferase mRNA or the commonly used endogenous reference β-III tubulin. The exogenous luciferase mRNA reference clearly demonstrated the dynamic expression of the endogenous reference. Furthermore, variability of the endogenous reference would lead to misinterpretation of other genes of interest. In conclusion, traditional reference genes are often unstable under physiologically normal conditions, and certainly unstable following damage to the nervous system. The use of an exogenous spike-in reference provides a consistent and easily implemented alternative for the analysis of qPCR data.
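
    The 2^(-ΔΔCt) method referred to above reduces to simple arithmetic on Ct values; a minimal sketch using a spiked-in reference (hypothetical Ct values, not the study's data) is:

```python
def fold_change_ddct(ct_target_treated, ct_ref_treated,
                     ct_target_control, ct_ref_control):
    """2^(-ddCt) relative quantification: normalize the target gene's Ct
    to the reference (here, a spiked-in luciferase mRNA), then compare
    the treated sample against the control sample."""
    dct_treated = ct_target_treated - ct_ref_treated
    dct_control = ct_target_control - ct_ref_control
    ddct = dct_treated - dct_control
    return 2 ** (-ddct)

# Illustrative Ct values (hypothetical): a target gene in
# capsaicin-treated vs. vehicle-treated tissue, each normalized
# to the luciferase spike-in reference.
fc = fold_change_ddct(ct_target_treated=26.0, ct_ref_treated=20.0,
                      ct_target_control=24.0, ct_ref_control=20.0)
print(fc)  # 0.25: a four-fold down-regulation
```

    Because the spike-in is added at a fixed amount per sample, its Ct stays constant even when endogenous transcription shifts, which is what makes it a usable denominator here.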

  5. Investigation of olfactory function in normal volunteers by Tc-99m ECD Brain SPECT: Analysis using statistical parametric mapping

    International Nuclear Information System (INIS)

    Chung, Y.A.; Kim, S.H.; Park, Y.H.; Lee, S.Y.; Sohn, H.S.; Chung, S.K.

    2002-01-01

    The purpose of this study was to investigate olfactory function according to Tc-99m ECD uptake patterns in brain perfusion SPET of normal volunteers by means of statistical parametric mapping (SPM) analysis. The study population comprised 8 healthy volunteers (M:F = 6:2; age range 22-54 years, mean 34 years). We performed baseline brain perfusion SPET using 555 MBq of Tc-99m ECD in a silent dark room. Two hours later, we obtained brain perfusion SPET using 1110 MBq of Tc-99m ECD after odor stimulation with a 3% butanol solution under the same conditions. All SPET images were spatially transformed to standard space, smoothed and globally normalized. The differences between the baseline and odor-identification SPET images were statistically analyzed using SPM-99 software. The difference between the two sets of brain perfusion SPET was considered significant at a threshold of uncorrected p values less than 0.01. SPM analysis revealed significant hyper-perfusion in both cingulate gyri, the right middle temporal gyrus, right superior and inferior frontal gyri, right lingual gyrus and right fusiform gyrus on odor-identification SPET. This study shows that brain perfusion SPET can securely support other diagnostic techniques in the evaluation of olfactory function

  6. Analysis of fluid lubrication mechanisms in metal forming at mesoscopic scale

    DEFF Research Database (Denmark)

    Dubar, L.; Hubert, C.; Christiansen, Peter

    2012-01-01

    The lubricant entrapment and escape phenomena in metal forming are studied experimentally as well as numerically. Experiments are carried out in strip reduction of aluminium sheet applying a transparent die to study the fluid flow between mesoscopic cavities. The numerical analysis involves two computation steps. The first one is a fully coupled fluid-structure Finite Element computation, where pockets in the surface are plastically deformed, leading to the pressurization of the entrapped fluid. The second step computes the fluid exchange between cavities through the plateaus of asperity contacts...

  7. Computer simulation analysis of normal and abnormal development of the mammalian diaphragm

    Directory of Open Access Journals (Sweden)

    Bodenstein Lawrence

    2006-02-01

    recreate a CDH defect using a combination of experimental data and testable hypotheses gives impetus to simulation modeling as an adjunct to experimental analysis of diaphragm morphogenesis.

  8. Computer simulation analysis of normal and abnormal development of the mammalian diaphragm

    Science.gov (United States)

    Fisher, Jason C; Bodenstein, Lawrence

    2006-01-01

    using a combination of experimental data and testable hypotheses gives impetus to simulation modeling as an adjunct to experimental analysis of diaphragm morphogenesis. PMID:16483386

  9. A new algorithm for the discrimination of actinic keratosis from normal skin and squamous cell carcinoma based on in vivo analysis of optical properties by high-definition optical coherence tomography

    DEFF Research Database (Denmark)

    Boone, M A L M; Suppa, M; Marneffe, A

    2016-01-01

    properties for discrimination of AK from SCC and from normal sun-exposed skin and to subdifferentiate AKs. METHODS: The technique of semi-log plot has been implemented on HD-OCT signals. This permitted the in vivo measurement of OCT signals coming from the skin entrance up to the superficial reticular dermis... involvement, non-Bowenoid AK with follicular involvement, Bowenoid AK, hypertrophic and lichenoid forms of AK and squamous cell carcinoma. CONCLUSION: HD-OCT seems to enable the combination of in vivo morphological analysis of cellular and 3D microarchitectural structures with in vivo analysis of optical... properties of tissue scatterers in AK/SCC lesions and normal sun-exposed skin. In vivo HD-OCT analysis of optical properties permits AK discrimination from SCC and AK subdifferentiation with higher accuracy than in vivo HD-OCT analysis of morphology alone...

  10. ANALYSIS OF THE FLOW OF GOODS IN NEW FORMS OF MULTICHANNEL SALES

    Directory of Open Access Journals (Sweden)

    Roman Domański

    2016-12-01

    Full Text Available New distribution channels have been growing dynamically in recent years as a result of the ever-present Internet, which offers a number of new retail forms that enable communication between individual market participants. Recent growth in trade has been identified chiefly with the dynamic development of e-commerce sales. The purpose of the article is to define the characteristic features of each new distribution channel and the guidelines referring to the economics of the flow of goods in a logistics system. The conclusions are based on an analysis of the literature and observed business practices. Today, further growth of commercial exchange is linked to the introduction of new forms of multichannel, cross-channel and omnichannel sales. New distribution channels have not been precisely defined to date, and current undertakings that employ multichannel sales are largely pioneering pilot projects. The further functioning of new distribution channels will depend on economic calculations; in these terms, analysing the effectiveness of individual new forms of distribution channels will be of key significance. The term "effectiveness of a distribution channel" is linked to the lot size of the goods flowing through it. Classic methods of determining lot size assume stable conditions in the environment in which a distribution channel operates. Today, however, the market situation is unstable and subject to continuous changes which occur very quickly.
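
    One classic lot-sizing method of the kind alluded to above is the economic order quantity (EOQ); a minimal sketch (illustrative figures, not from the article) makes the stable-demand assumption it embeds explicit:

```python
from math import sqrt

def eoq(demand_rate, order_cost, holding_cost):
    """Economic order quantity: the classic lot size minimizing the sum
    of ordering and holding costs, EOQ = sqrt(2*D*S/H). Assumes a stable,
    known demand rate -- exactly the stability assumption the article
    argues no longer holds for new multichannel distribution channels."""
    return sqrt(2 * demand_rate * order_cost / holding_cost)

# Illustrative figures (hypothetical): 10,000 units/year demand,
# 50 currency units per order placed, 2 per unit-year of holding.
print(eoq(10_000, 50.0, 2.0))  # ~707.1 units per lot
```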

  11. Methods of acicular ferrite forming in the weld bead metal (Brief analysis

    Directory of Open Access Journals (Sweden)

    Володимир Олександрович Лебедєв

    2016-11-01

    Full Text Available A brief analysis of methods for forming acicular ferrite, the most desirable structural component in weld metal, is presented. The term «acicular ferrite» refers to a structure that forms during austenite decomposition, in the range between the pearlite and martensite transformations. Acicular ferrite is a packet structure consisting of laths of bainitic ferrite, with no cementite particles inside the laths. The chemical elements that most effectively promote the formation of acicular ferrite are considered, as well as their combined effect. It is shown, in particular, that the most effective chemical element in terms of the ratio of impact toughness to cost is manganese. In addition, the results of multipass surfacing of a low-alloy steel with pulsed and constant wire electrode feed are considered. According to these results, acicular ferrite forms in both cases. However, with pulsed electrode wire feed, high mechanical properties of the surfacing layer were obtained from the first passes, the shape of the acicular ferrite crystallites was improved, and the volume fractions of polygonal and lamellar ferrite were reduced. It is suggested that acicular ferrite in the surfacing layer may also be obtained by superimposing low-frequency mechanical oscillation on the welding torch or the weld pool instead of the periodic thermal effect of pulsed electrode wire feed

  12. Modular forms

    NARCIS (Netherlands)

    Edixhoven, B.; van der Geer, G.; Moonen, B.; Edixhoven, B.; van der Geer, G.; Moonen, B.

    2008-01-01

    Modular forms are functions with an enormous amount of symmetry that play a central role in number theory, connecting it with analysis and geometry. They have played a prominent role in mathematics since the 19th century and their study continues to flourish today. Modular forms formed the

  13. Gas chromatographic analysis of reactive carbonyl compounds formed from lipids upon UV-irradiation

    International Nuclear Information System (INIS)

    Dennis, K.J.; Shibamoto, T.

    1990-01-01

    Peroxidation of lipids produces carbonyl compounds; some of these, e.g., malonaldehyde and 4-hydroxynonenal, are genotoxic because of their reactivity with biological nucleophiles. Analysis of the reactive carbonyl compounds is often difficult. The methylhydrazine method developed for malonaldehyde analysis was applied to simultaneously measure the products formed from linoleic acid, linolenic acid, arachidonic acid, and squalene upon ultraviolet irradiation (UV irradiation). The photoreaction products (saturated monocarbonyls, alpha,beta-unsaturated carbonyls, and beta-dicarbonyls) were derivatized with methylhydrazine to give hydrazones, pyrazolines, and pyrazoles, respectively. The derivatives were analyzed by gas chromatography and gas chromatography-mass spectrometry. Lipid peroxidation products identified included formaldehyde, acetaldehyde, acrolein, malonaldehyde, n-hexanal, and 4-hydroxy-2-nonenal. Malonaldehyde levels formed upon 4 hr of irradiation were 0.06 micrograms/mg from squalene, 2.4 micrograms/mg from linolenic acid, and 5.7 micrograms/mg from arachidonic acid. Significant levels of acrolein (2.5 micrograms/mg) and 4-hydroxy-2-nonenal (0.17 micrograms/mg) were also produced from arachidonic acid upon 4 hr of irradiation.

  14. COMPARATIVE ANALYSIS OF PLASTIC ATTRIBUTES OF DIFFERENT CARP KOI FORMS (CYPRINUS CARPIO KOI

    Directory of Open Access Journals (Sweden)

    O. O. Lysak

    2014-08-01

    variances of morphometric indexes of fish from the different groups were tested with Student's criterion. Differences were considered valid if Student's criterion exceeded 2.68 at the significance level α = 0.01, which is accepted as ample for the majority of biological objects. To compare the different carp koi (Cyprinus carpio koi) forms, the researchers used Mayr's coefficient of difference (CD): the less the curves of two compared populations overlap, the greater the difference between the means M1 and M2 divided by the sum of the standard deviations σ1 and σ2. The coefficient of directive deviation (Kdv) characterizes the percentage deviation of two compared indexes of a particular attribute. To determine the deviation Kdv between the indexes, the difference between an attribute A and the same attribute of the sample index E is divided by the index of attribute E and multiplied by 100%. The estimation criterion is the same as in the variability index: a low level of deviation is Kdv ≤ 5%, middle Kdv = 10-30%, and large Kdv > 30-50%. In conducting the morphological analysis of groups I-IV of the different carp koi forms, the authors examined 30 plastic attributes. The plastic attributes were grouped according to indexes determined as percentages of the zoological (standard) length of the body (trunk length, the biggest and smallest heights, antidorsal and postdorsal distances, length of tail-stem, antipectoral, antiventral, antianal, pectroventral and ventroanal distances, and length of the head) and as percentages of the length of the head (length of the fish snout, eye diameter, sight unseen distance, height and width of forehead, lengths of the lower and upper jaw, height of the head at the nape and through the middle of the eye); the measurements of all the fins were taken separately, since for koi they are the criterion for estimation of form quality, and these were also determined as percentages of body length (the biggest height and length of
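
    The two comparison statistics described in this record reduce to simple formulas: Mayr's coefficient of difference, CD = |M1 - M2| / (σ1 + σ2), and the percentage deviation, Kdv = |A - E| / E × 100%. A minimal sketch, not the authors' code, with invented example values:

    ```python
    # Illustrative sketch of the comparison statistics described above.
    # The formulas follow the record; all numbers are invented.

    def mayr_cd(m1, m2, s1, s2):
        """Mayr's coefficient of difference: |M1 - M2| / (s1 + s2).
        The less two distributions overlap, the larger CD becomes."""
        return abs(m1 - m2) / (s1 + s2)

    def kdv(a, e):
        """Percentage deviation of attribute A from the reference attribute E."""
        return abs(a - e) / e * 100.0

    # Hypothetical morphometric means and standard deviations for two koi groups
    cd = mayr_cd(33.0, 28.5, 1.4, 1.3)   # 4.5 / 2.7, i.e. about 1.67
    dev = kdv(33.0, 28.5)                # about 15.8%: a "middle" deviation (10-30%)
    ```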

  15. The study of forms of bonding marshmallow moisture with different composition by method of thermal analysis

    Directory of Open Access Journals (Sweden)

    G. O. Magomedov

    2017-01-01

    Full Text Available Marshmallow is a sugar confectionery product with high sugar content and energy value because of its significant content of carbohydrates, in particular granulated sugar. The main drawback of marshmallow is the rapid process of its drying out during storage due to the crystallization of sucrose and the gradual removal of moisture from the product. A method for obtaining marshmallow without sugar on the basis of high-conversion glucose syrup has been developed. In this work, experimental studies were carried out to determine the content and ratio of free and bound forms of moisture in marshmallow made with sugar and with high-conversion glucose syrup, by differential scanning calorimetry (DSC) and thermogravimetry (TG). To study the patterns of thermal effects on the properties of the marshmallow samples, the non-isothermal analysis method and the STA 449 F3 Jupiter synchronous thermal analysis instrument (TG-DTA/DSC) were used. Under thermal exposure, sugars and other organic compounds in the samples decompose, and the sample weight decreases due to evaporation of moisture. The dehydration process in the control sample of marshmallow made with sugar occurs in a narrower temperature range than in the sample made with high-conversion glucose syrup, which indicates a greater degree of moisture bonding in the developed sample. A quantitative evaluation of the forms of moisture bonding in the samples was carried out using the experimental curves obtained by the TG method. From the temperature curves, the endothermic effects corresponding to the release of moisture with different forms and energies of bonding were determined. Substituting syrup for sugar in the marshmallow formula reduces the share of free moisture and improves the keeping quality of the product without signs of staling.

  16. CFD Analysis of Random Turbulent Flow Load in Steam Generator of APR1400 Under Normal Operation Condition

    International Nuclear Information System (INIS)

    Lim, Sang Gyu; You, Sung Chang; Kim, Han Gon

    2011-01-01

    Regulatory Guide 1.20, Revision 3, of the Nuclear Regulatory Commission (NRC) modifies the guidance for vibration assessments of reactor internals and steam generator internals. The new guidance requires applicants to provide a preliminary analysis and evaluation of the design and performance of a facility, including the safety margins during normal operation and the transient conditions anticipated during the life of the facility. In particular, Revision 3 requires rigorous assessment of adverse flow effects in the steam dryer caused by flow-excited acoustic and structural resonances, such as the abnormalities seen in power-uprated BWR cases. For two nearly identical nuclear power plants, the steam system of one BWR plant experienced failure of the steam dryers and main steam system components when steam flow was increased by 16 percent for an extended power uprate (EPU). The mechanisms of those failures revealed that a small adverse flow change from the prototype condition induced severe flow-excited acoustic and structural resonances, leading to structural failures. In light of this history, therefore, potential adverse flow effects should be evaluated rigorously for steam generator internals in both BWRs and pressurized water reactors (PWRs). The Advanced Power Reactor 1400 (APR1400), an evolutionary light water reactor, increased the power by 7.7 percent over the design of its 'valid prototype', System 80+. Thus, reliable evaluation of potential adverse flow effects on the steam generator of the APR1400 is necessary according to the regulatory guide. This paper presents part of the computational fluid dynamics (CFD) analysis results for evaluation of adverse flow effects on the steam generator internals of the APR1400, including a series of sensitivity analyses to enhance the reliability of the CFD analysis and an estimation of the effect of flow loads on the internals of the steam generator under normal operating conditions.

  17. Three-dimensional analysis of tarsal bone response to axial loading in patients with hallux valgus and normal feet.

    Science.gov (United States)

    Watanabe, Kota; Ikeda, Yasutoshi; Suzuki, Daisuke; Teramoto, Atsushi; Kobayashi, Takuma; Suzuki, Tomoyuki; Yamashita, Toshihiko

    2017-02-01

    Patients with hallux valgus present a variety of symptoms that may be related to the type of deformity. Weightbearing affects the deformities, and the evaluation of the load response of tarsal bones has been mainly performed using two-dimensional plane radiography. The purpose of this study was to investigate and compare structural changes in the medial foot arch between patients with hallux valgus and normal controls using a computer image analysis technique and weightbearing computed tomography data. Eleven patients with hallux valgus and eleven normal controls were included. Computed tomograms were obtained with and without simulated weightbearing using a compression device. Computed tomography data were transferred into a personal computer, and a three-dimensional bone model was created using image analysis software. The load responses of each tarsal bone in the medial foot arch were measured three-dimensionally and statistically compared between the two groups. Displacement of each tarsal bone under two weightbearing conditions was visually observed by creating three-dimensional bone models. At the first metatarsophalangeal joint, the proximal phalanges of the hallux valgus group showed significantly different displacements in multiple directions. Moreover, opposite responses to axial loading were also observed in both translation and rotation between the two groups. Weightbearing caused deterioration of the hallux valgus deformity three-dimensionally at the first metatarsophalangeal joint. Information from the computer image analysis was useful for understanding details of the pathology of foot disorders related to the deformities or instability and may contribute to the development of effective conservative and surgical treatments. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Diagnosis of asbestosis by a time expanded wave form analysis, auscultation and high resolution computed tomography: a comparative study.

    Science.gov (United States)

    al Jarad, N; Strickland, B; Bothamley, G; Lock, S; Logan-Sinclair, R; Rudd, R M

    1993-01-01

    BACKGROUND--Crackles are a prominent clinical feature of asbestosis and may be an early sign of the condition. Auscultation, however, is subjective and interexaminer disagreement is a problem. Computerised lung sound analysis can visualise, store, and analyse lung sounds, and disagreement on the presence of crackles is minimal. High resolution computed tomography (HRCT) is superior to chest radiography in detecting early signs of asbestosis. The aim of this study was to compare clinical auscultation, time expanded wave form analysis (TEW), chest radiography, and HRCT in detecting signs of asbestosis in asbestos workers. METHODS--Fifty three asbestos workers (51 men and two women) were investigated. Chest radiographs and HRCT scans were assessed by two independent readers for detection of interstitial opacities. HRCT was performed in the supine position with additional sections at the bases in the prone position. Auscultation for persistent fine inspiratory crackles was performed by two independent examiners unacquainted with the diagnosis. TEW analysis was obtained from a 33 second recording of lung sounds over the lung bases. TEW and auscultation were also performed in a control group of 13 subjects who had a normal chest radiograph. There were 10 current smokers and three previous smokers. In asbestos workers the extent of pulmonary opacities on the chest radiograph was scored according to the International Labour Office (ILO) scale. Patients were divided into two groups: 21 patients in whom the chest radiograph was scored > 1/0 (group 1) and 32 patients in whom the chest radiograph was scored ≤ 1/0 (group 2). In group 2, crackles were detected by auscultation in seven (22%) patients and by TEW in 14 (44%). HRCT detected definite interstitial opacities in 11 (34%) and gravity dependent subpleural lines in two (6%) patients. All but two patients with evidence of interstitial disease or gravity dependent subpleural lines on HRCT had crackles detected by TEW.
In patients with an ILO score of > 1/0, auscultation and TEW revealed mid to late

  19. TXRF, PIXE and EDXRF: a first step towards normalization of x-ray spectrometry for chemical analysis

    International Nuclear Information System (INIS)

    Barreiros, M.A.; Costa, M.M.; Palha, M.; Pinheiro, T.; Araujo, M.F.; Silva, R.C. da

    2000-01-01

    Nowadays, many research studies rely on analytical measurements to which x-ray spectrometry techniques are particularly well suited. Because they are multi-elemental, versatile, fast and can reach low detection limits, they are often applied to environmental and biomedical studies. Besides, they are able to provide reliable and accurate results. As also occurs with other analytical procedures, quality assurance for non-routine analysis in R and D laboratories is much less well established than for routine analytical work. Thus, it is of foremost importance to set the major quality parameters that significantly affect the quality of results. In addition, there is a need to apply the new concepts of traceability and uncertainty to these techniques. This work reports on the reliability of TXRF, PIXE and EDXRF measurements by evaluating the uncertainty and traceability of elemental analysis. The set of experimental and statistical procedures used to ensure the quality of results relative to the proposed objectives, as well as the methods applied to estimate the uncertainties, will be presented and discussed. The sample preparation procedures, analytical calibration and spectral evaluation will be the parameters examined in this inter-laboratory study. The sample preparation procedures comprise pressure acid digestion and pelletizing for solid samples. The internal quality control for analytical calibration and spectral analysis is performed using certified reference materials and standard solutions. The goal of this work is to present a first step towards the normalization of x-ray spectrometry techniques for chemical analysis. (author)

  20. A hierarchical cluster analysis of normal-tension glaucoma using spectral-domain optical coherence tomography parameters.

    Science.gov (United States)

    Bae, Hyoung Won; Ji, Yongwoo; Lee, Hye Sun; Lee, Naeun; Hong, Samin; Seong, Gong Je; Sung, Kyung Rim; Kim, Chan Yun

    2015-01-01

    Normal-tension glaucoma (NTG) is a heterogeneous disease, and there is still controversy about subclassifications of this disorder. On the basis of spectral-domain optical coherence tomography (SD-OCT), we subdivided NTG with hierarchical cluster analysis using optic nerve head (ONH) parameters and retinal nerve fiber layer (RNFL) thicknesses. A total of 200 eyes of 200 NTG patients between March 2011 and June 2012 underwent SD-OCT scans to measure ONH parameters and RNFL thicknesses. We classified NTG into homogeneous subgroups based on these variables using a hierarchical cluster analysis, and compared clusters to evaluate diverse NTG characteristics. Three clusters were found after hierarchical cluster analysis. Cluster 1 (62 eyes) had the thickest RNFL and widest rim area, and showed early glaucoma features. Cluster 2 (60 eyes) was characterized by the largest cup/disc ratio and cup volume, and showed advanced glaucomatous damage. Cluster 3 (78 eyes) had small disc areas in SD-OCT and was comprised of patients with significantly younger age, longer axial length, and greater myopia than the other 2 groups. A hierarchical cluster analysis of SD-OCT scans divided NTG patients into 3 groups based upon ONH parameters and RNFL thicknesses. It is anticipated that the small disc area group, comprised of younger and more myopic patients, may show unique features unlike the other 2 groups.
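
    The grouping step described in this record can be sketched with standard tooling. The following is a hypothetical illustration with random stand-in data and an assumed four-feature set, not the study's actual pipeline:

    ```python
    # Hypothetical sketch: Ward hierarchical clustering of eyes on
    # standardized ONH/RNFL features, cut into three clusters.
    # The data here are random stand-ins, not SD-OCT measurements.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.stats import zscore

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 4))   # 200 eyes x 4 features (rim area, C/D ratio, ...)

    Z = linkage(zscore(X, axis=0), method="ward")    # agglomerative tree
    labels = fcluster(Z, t=3, criterion="maxclust")  # cut into (at most) 3 clusters
    sizes = np.bincount(labels)[1:]                  # eyes per cluster
    ```

    Standardizing the features first (z-scores) keeps any one parameter's units from dominating the Ward distances.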

  1. Tumor and normal tissue motion in the thorax during respiration: Analysis of volumetric and positional variations using 4D CT

    International Nuclear Information System (INIS)

    Weiss, Elisabeth; Wijesooriya, Krishni; Dill, S. Vaughn; Keall, Paul J.

    2007-01-01

    Purpose: To investigate temporospatial variations of tumor and normal tissue during respiration in lung cancer patients. Methods and Materials: In 14 patients, gross tumor volume (GTV) and normal tissue structures were manually contoured on four-dimensional computed tomography (4D-CT) scans. Structures were evaluated for volume changes, centroid (center of mass) motion, and phase dependence of variations relative to inspiration. Only volumetrically complete structures were used for analysis (lung in 2, heart in 8, all other structures in >10 patients). Results: During respiration, the magnitude of contoured volumes varied by up to 62.5% for GTVs, 25.5% for lungs, and 12.6% for hearts. The range of maximum three-dimensional centroid movement for individual patients was 1.3-24.0 mm for GTV, 2.4-7.9 mm for heart, 5.2-12.0 mm for lungs, 0.3-5.5 mm for skin markers, 2.9-10.0 mm for trachea, and 6.6-21.7 mm for diaphragm. During respiration, the centroid positions of normal structures varied relative to the centroid position of the respective GTV by 1.5-8.1 mm for heart, 2.9-9.3 mm for lungs, 1.2-9.2 mm for skin markers, 0.9-7.1 mm for trachea, and 2.7-16.4 mm for diaphragm. Conclusion: Using 4D-CT, volumetric changes and positional alterations, as well as changes in the position of contoured structures relative to the GTV, were observed, with large variations between individual patients. Although the interpretation of 4D-CT data has considerable uncertainty because of 4D-CT artifacts, observer variations, and the limited acquisition time, the findings might have a significant impact on treatment planning.
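
    The centroid (center of mass) motion reported in this record is, computationally, a simple statistic of the contoured voxels per respiratory phase. A toy sketch with an invented mask and voxel spacing, not the study's software:

    ```python
    # Illustrative sketch: centroid of a contoured structure from a binary
    # voxel mask, and its 3D excursion between two respiratory phases.
    # Mask contents and voxel spacing are invented for the example.
    import numpy as np

    def centroid_mm(mask, spacing):
        """Center of mass of a binary mask, converted to millimetres."""
        idx = np.argwhere(mask)                 # (N, 3) voxel indices
        return idx.mean(axis=0) * np.asarray(spacing)

    spacing = (2.5, 1.0, 1.0)                   # slice, row, col spacing in mm
    phase_a = np.zeros((10, 10, 10), bool)
    phase_b = np.zeros((10, 10, 10), bool)
    phase_a[4:6, 4:6, 4:6] = True               # structure at inspiration
    phase_b[6:8, 4:6, 4:6] = True               # shifted 2 slices at expiration

    motion = np.linalg.norm(centroid_mm(phase_b, spacing) -
                            centroid_mm(phase_a, spacing))
    print(round(motion, 1))                     # -> 5.0 (2 slices x 2.5 mm)
    ```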

  2. Analysis of Factors Associated With the Tear Film Lipid Layer Thickness in Normal Eyes and Patients With Dry Eye Syndrome.

    Science.gov (United States)

    Jung, Ji Won; Park, Si Yoon; Kim, Jin Sun; Kim, Eung Kweon; Seo, Kyoung Yul; Kim, Tae-Im

    2016-08-01

    To determine the effects of clinical variables, including age, sex, history of refractive or cataract surgery, contact lens use, and ocular surface and meibomian gland parameters on the lipid layer thickness (LLT) in normal subjects and patients with dry eye syndrome (DES). A total of 64 normal subjects and 326 patients with DES were enrolled, and they underwent measurements of LLT with a LipiView interferometer and tear meniscus height using optical coherence tomography, tear film break-up time (TBUT) determination, ocular surface staining, Schirmer's test, examination of the lid margins and meibomian glands, and assessment using the Ocular Surface Disease Index (OSDI). In normal subjects, the median (range) LLT was 67 (33-100) nm, and age was the only factor that was significantly associated with LLT (β = 0.678, P = 0.028). In patients with DES, the median (range) LLT was 84 (20-100) nm, and 79.0% of the participants fulfilled the diagnostic criteria for meibomian gland dysfunction (MGD). In a multivariate analysis, increased age and female sex were significantly related to increased LLT (β = 0.282, P = 0.005 and β = 11.493, P < 0.001), and hypersecretory MGD and lid margin inflammation were independently associated with increased LLT (β = 11.299, P = 0.001 and β = 12.747, P = 0.001). Lipid layer thickness measurements using a new interferometer are significantly affected by demographic factors such as age, sex, ocular surgical history, and MGD type. Therefore, all of these factors must be considered in the diagnosis of ocular surface diseases.
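
    Coefficients such as β = 11.493 for female sex come from a multivariate linear model of LLT on the clinical variables. A hypothetical sketch with simulated data, where the predictors, effect sizes and noise level are all invented:

    ```python
    # Illustrative sketch of a multivariate linear model of lipid layer
    # thickness (LLT) on age and sex; the data are simulated, not the study's.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 326
    age = rng.uniform(20, 80, n)                     # years
    female = rng.integers(0, 2, n).astype(float)     # 0 = male, 1 = female
    llt = 40 + 0.3 * age + 11.5 * female + rng.normal(0, 8, n)  # nm, invented model

    X = np.column_stack([np.ones(n), age, female])   # intercept, age, sex
    beta, *_ = np.linalg.lstsq(X, llt, rcond=None)   # ordinary least squares
    print(beta.round(1))                             # [intercept, age, female] estimates
    ```

    With enough observations, the recovered coefficients approach the simulated effects, which is the sense in which the study's β values quantify each factor's independent association with LLT.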

  3. Model-based analysis and control of a network of basal ganglia spiking neurons in the normal and Parkinsonian states

    Science.gov (United States)

    Liu, Jianbo; Khalil, Hassan K.; Oweiss, Karim G.

    2011-08-01

    Controlling the spatiotemporal firing pattern of an intricately connected network of neurons through microstimulation is highly desirable in many applications. We investigated in this paper the feasibility of using a model-based approach to the analysis and control of a basal ganglia (BG) network model of Hodgkin-Huxley (HH) spiking neurons through microstimulation. Detailed analysis of this network model suggests that it can reproduce the experimentally observed characteristics of BG neurons under a normal and a pathological Parkinsonian state. A simplified neuronal firing rate model, identified from the detailed HH network model, is shown to capture the essential network dynamics. Mathematical analysis of the simplified model reveals the presence of a systematic relationship between the network's structure and its dynamic response to spatiotemporally patterned microstimulation. We show that both the network synaptic organization and the local mechanism of microstimulation can impose tight constraints on the possible spatiotemporal firing patterns that can be generated by the microstimulated network, which may hinder the effectiveness of microstimulation to achieve a desired objective under certain conditions. Finally, we demonstrate that the feedback control design aided by the mathematical analysis of the simplified model is indeed effective in driving the BG network in the normal and Parkinsonian states to follow a prescribed spatiotemporal firing pattern. We further show that the rhythmic/oscillatory patterns that characterize a dopamine-depleted BG network can be suppressed as a direct consequence of controlling the spatiotemporal pattern of a subpopulation of the output Globus Pallidus internalis (GPi) neurons in the network. This work may provide plausible explanations for the mechanisms underlying the therapeutic effects of deep brain stimulation (DBS) in Parkinson's disease and pave the way towards a model-based, network level analysis and closed

  4. Emotional Intelligence and Nurse Recruitment: Rasch and confirmatory factor analysis of the trait emotional intelligence questionnaire short form.

    Science.gov (United States)

    Snowden, Austyn; Watson, Roger; Stenhouse, Rosie; Hale, Claire

    2015-12-01

    To examine the construct validity of the Trait Emotional Intelligence Questionnaire Short form. Emotional intelligence involves the identification and regulation of our own emotions and the emotions of others. It is therefore a potentially useful construct in the investigation of recruitment and retention in nursing and many questionnaires have been constructed to measure it. Secondary analysis of existing dataset of responses to Trait Emotional Intelligence Questionnaire Short form using concurrent application of Rasch analysis and confirmatory factor analysis. First year undergraduate nursing and computing students completed Trait Emotional Intelligence Questionnaire-Short Form in September 2013. Responses were analysed by synthesising results of Rasch analysis and confirmatory factor analysis. Participants (N = 938) completed Trait Emotional Intelligence Questionnaire Short form. Rasch analysis showed the majority of the Trait Emotional Intelligence Questionnaire-Short Form items made a unique contribution to the latent trait of emotional intelligence. Five items did not fit the model and differential item functioning (gender) accounted for this misfit. Confirmatory factor analysis revealed a four-factor structure consisting of: self-confidence, empathy, uncertainty and social connection. All five misfitting items from the Rasch analysis belonged to the 'social connection' factor. The concurrent use of Rasch and factor analysis allowed for novel interpretation of Trait Emotional Intelligence Questionnaire Short form. Much of the response variation in Trait Emotional Intelligence Questionnaire Short form can be accounted for by the social connection factor. Implications for practice are discussed. © 2015 John Wiley & Sons Ltd.

  5. Model-free methods of analyzing domain motions in proteins from simulation : A comparison of normal mode analysis and molecular dynamics simulation of lysozyme

    NARCIS (Netherlands)

    Hayward, S.; Kitao, A.; Berendsen, H.J.C.

    Model-free methods are introduced to determine quantities pertaining to protein domain motions from normal mode analyses and molecular dynamics simulations. For the normal mode analysis, the methods are based on the assumption that in low frequency modes, domain motions can be well approximated by

  6. Sequence analysis of annually normalized citation counts: an empirical analysis based on the characteristic scores and scales (CSS) method.

    Science.gov (United States)

    Bornmann, Lutz; Ye, Adam Y; Ye, Fred Y

    2017-01-01

    In bibliometrics, only a few publications have focused on the citation histories of publications, where the citations for each citing year are assessed. In this study, therefore, annual categories of field- and time-normalized citation scores (based on the characteristic scores and scales method: 0 = poorly cited, 1 = fairly cited, 2 = remarkably cited, and 3 = outstandingly cited) are used to study the citation histories of papers. As our dataset, we used all articles published in 2000 and their annual citation scores until 2015. We generated annual sequences of citation scores (e.g., [Formula: see text]) and compared the sequences of annual citation scores of six broader fields (natural sciences, engineering and technology, medical and health sciences, agricultural sciences, social sciences, and humanities). In agreement with previous studies, our results demonstrate that sequences with poorly cited (0) and fairly cited (1) elements dominate the publication set; sequences with remarkably cited (2) and outstandingly cited (3) periods are rare. The highest percentages of constantly poorly cited papers can be found in the social sciences; the lowest percentages are in the agricultural sciences and humanities. The largest group of papers with remarkably cited (2) and/or outstandingly cited (3) periods shows an increasing impact over the citing years with the following order of sequences: [Formula: see text] (6.01%), which is followed by [Formula: see text] (1.62%). Only 0.11% of the papers (n = 909) are constantly on the outstandingly cited level.
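
    Under the characteristic scores and scales method, the class boundaries for one citing year are obtained by repeatedly truncating the citation distribution at its mean. A minimal sketch of this assignment, an assumed simplification rather than the authors' implementation:

    ```python
    # Sketch of CSS class assignment for one citing year (assumed
    # simplification): boundaries are the means of successively
    # truncated citation distributions; classes run 0..3.
    import numpy as np

    def css_classes(citations):
        """Assign CSS classes 0..3 to citation counts for one year.
        Assumes the three boundaries come out strictly increasing."""
        c = np.asarray(citations, dtype=float)
        m1 = c.mean()              # boundary: poorly / fairly cited
        m2 = c[c >= m1].mean()     # boundary: fairly / remarkably cited
        m3 = c[c >= m2].mean()     # boundary: remarkably / outstandingly cited
        return np.digitize(c, [m1, m2, m3])

    counts = [0, 1, 2, 3, 5, 8, 12, 20, 45, 60]
    print(css_classes(counts).tolist())  # -> [0, 0, 0, 0, 0, 0, 0, 1, 2, 3]
    ```

    Repeating this per citing year yields the annual score sequences compared in the study.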

  7. Characterization of exosomes derived from ovarian cancer cells and normal ovarian epithelial cells by nanoparticle tracking analysis.

    Science.gov (United States)

    Zhang, Wei; Peng, Peng; Kuang, Yun; Yang, Jiaxin; Cao, Dongyan; You, Yan; Shen, Keng

    2016-03-01

    Cellular exosomes are involved in many disease processes and have the potential to be used for diagnosis and treatment. In this study, we compared the characteristics of exosomes derived from human ovarian epithelial cells (HOSEPiC) and three epithelial ovarian cancer cell lines (OVCAR3, IGROV1, and ES-2) to investigate the differences between exosomes originating from normal and malignant cells. Two established colloid-chemical methodologies, electron microscopy (EM) and dynamic light scattering (DLS), and a relatively new method, nanoparticle tracking analysis (NTA), were used to measure the size and size distribution of exosomes. The concentration and epithelial cellular adhesion molecule (EpCAM) expression of exosomes were measured by NTA. Quantum dots were conjugated with anti-EpCAM to label exosomes, and the labeled exosomes were detected by NTA in fluorescent mode. The normal-cell-derived exosomes were significantly larger than those derived from malignant cells, and exosomes were successfully labeled using anti-EpCAM-conjugated quantum dots. Exosomes from different cell lines may vary in size, and exosomes might be considered as potential diagnosis biomarkers. NTA can be considered a useful, efficient, and objective method for the study of different exosomes and their unique properties in ovarian cancer.

  8. Investigation of olfactory function in normal volunteers and patients with anosmia : analysis of brain perfusion SPECTs using statistical parametric mapping

    International Nuclear Information System (INIS)

    Chung, Y. A.; Kim, S. H.; Sohn, H. S.; Chung, S. K.

    2002-01-01

    The purpose of this study was to investigate olfactory function with Tc-99m ECD brain perfusion SPECT using statistical parametric mapping (SPM) analysis in normal volunteers and patients with anosmia. The study population comprised 8 matched healthy volunteers and 16 matched patients with anosmia. We obtained baseline and post-stimulation (3% butanol) brain perfusion SPECTs in a silent dark room. We analyzed all the SPECTs using SPM. The differences between the two sets of brain perfusion SPECTs were compared with a t-test. Voxels with a p-value of less than 0.01 were considered to be significantly different. We demonstrated increased perfusion in both cingulate gyri, the right middle temporal gyrus, the right superior and inferior frontal gyri, the right lingual gyrus and the right fusiform gyrus on post-stimulation brain SPECT in normal volunteers, and demonstrated decreased perfusion in both cingulate gyri, the right middle temporal gyrus, the right rectal gyrus and both superior and inferior frontal gyri in 10 of the patients with anosmia. No significant hypoperfusion area was observed in the other 6 patients with anosmia. Baseline and post-stimulation brain perfusion SPECTs can be helpful in the evaluation of olfactory function and useful in the diagnosis of anosmia.
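
    The voxelwise comparison described in this record can be caricatured as a paired t-test at every voxel with a p < 0.01 threshold; SPM itself additionally performs spatial normalization, smoothing and correction steps. A schematic sketch with random stand-in data:

    ```python
    # Schematic sketch of a voxelwise paired comparison (not SPM itself).
    # Image dimensions and intensity values are invented stand-ins.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    baseline = rng.normal(50, 5, size=(8, 64, 64, 32))     # 8 subjects' scans
    stimulated = baseline + rng.normal(0, 5, size=baseline.shape)

    # Paired t-test at every voxel across subjects; threshold at p < 0.01
    t, p = stats.ttest_rel(stimulated, baseline, axis=0)
    significant = p < 0.01
    print(significant.shape, round(float(significant.mean()), 3))
    ```

    With no true effect, roughly 1% of voxels pass the threshold by chance, which is why SPM applies multiple-comparison corrections on real data.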

  10. Comparative analysis of morphological and topometric parameters of lumbar spine in normal state and in degenerative-dystrophic changes

    Directory of Open Access Journals (Sweden)

    Anisimova Е.А.

    2015-12-01

    Full Text Available Objective: to carry out a comparative analysis and identify patterns of topographic variation of the lumbar spine in the normal state and with degenerative-dystrophic changes. Material and methods. CT and MRT scans of men and women of the first (M: 22-35 years; W: 21-35 years) and second (M: 36-60 years; W: 36-55 years) periods of mature age with no signs of trauma, scoliosis or systemic diseases of the spine (n=140), and CT and MRT scans of patients with revealed degenerative changes of degree II-III in the lumbar spine (n=120). On the images, the digital PACS system was used to measure the height of the vertebral bodies, the height of the intervertebral discs, and the vertical diameter, horizontal diameter and area of the intervertebral foramina. Results. The height of the lumbar vertebral bodies normally increased from 27.90±0.38 mm at the level of LI to 29.93±0.33 mm at LIII, and then decreased to 24.35±0.27 mm at the level of LV; in osteochondrosis it is statistically significantly lower at all levels, on average by 20%. The height of the intervertebral disc in osteochondrosis is lower at all levels, on average by 25%, varying from 5.27±0.19 to 6.13±0.17 mm, while the normal disc height varies from 6.88±0.30 to 9.36±0.28 mm. The area of the intervertebral foramina normally ranges from 103.29±5.78 to 127.99±5.92 mm2; with osteochondrosis the foraminal area is reduced, to a greater extent through a decrease of the vertical diameter than of the horizontal one. Conclusion. Characteristic topographic variability of the studied parameters has been determined. The maximum values of the parameters are observed at the top of the lumbar lordosis; at the thoracolumbar and lumbosacral junctions the sizes are reduced. In osteochondrosis the intervertebral disc height and the height of the lumbar vertebral bodies are reduced; the area of the intervertebral foramina is also reduced, to a greater extent through a decrease of the vertical diameter than of the horizontal one.

  11. Analysis of loss of normal feedwater transient using RELAP5/MOD1/NSC; KNU1 plant simulation

    International Nuclear Information System (INIS)

    Kim, Hho Jung; Chung, Bub Dong; Lee, Young Jin; Kim, Jin Soo

    1986-01-01

    Simulation of the system thermal-hydraulic parameters was carried out following the KNU1 (Korea Nuclear Unit-1) loss of normal feedwater transient that occurred on November 14, 1984. The results were compared with the plant transient data, and good agreement was obtained. Some deviations were found in parameters such as the steam flowrate and the RCS (Reactor Coolant System) average temperature around the time of reactor trip. This is to be expected, since the thermal-hydraulic parameters undergo rapid transitions due to the large reduction of reactor thermal power in a short period of time, so the plant data involve transient uncertainties. The analysis was performed using RELAP5/MOD1/NSC, developed through modifications of the interphase drag and wall heat transfer modeling routines of RELAP5/MOD1/CY018. (Author)

  12. Powerlessness, Normalization, and Resistance: A Foucauldian Discourse Analysis of Women's Narratives On Obstetric Fistula in Eastern Sudan.

    Science.gov (United States)

    Hamed, Sarah; Ahlberg, Beth-Maina; Trenholm, Jill

    2017-10-01

    Eastern Sudan has a high prevalence of female circumcision and child marriage, both of which constitute risk factors for developing obstetric fistula. Few studies have examined the relation of gender roles to obstetric fistula in Sudan. To explore the power relations that may put women at increased risk of developing obstetric fistula, we conducted nine interviews with women living with obstetric fistula in Kassala in eastern Sudan. Using a Foucauldian discourse analysis, we identified three discourses: powerlessness, normalization, and covert resistance. The power relations between the women and other members of society revealed their internalization of social norms as absolute truth, and influenced their status and decision-making power with regard to circumcision, early marriage, and other transformative decisions, as well as the women's general behavior. The women showed subtle resistance to these norms and to the harassment they encountered because of their fistula. These findings suggest that a more in-depth contextual assessment could benefit future maternal health interventions.

  13. Normal myelination of the child brain on MRI - a meta-analysis; Die normale Myelinisierung des kindlichen Gehirns in der MRT - eine Metaanalyse

    Energy Technology Data Exchange (ETDEWEB)

    Staudt, M.; Grodd, W. [Tuebingen Univ. (Germany). Abt. fuer Neuroradiologie; Kraegeloh-Mann, I. [Tuebingen Univ. (Germany). Abt. Entwicklungsneurologie und Neuropaediatrie

    2000-10-01

    Purpose: To establish age limits for the assessment of normal myelination of the brain on T{sub 1}-weighted (T{sub 1}w) and T{sub 2}-weighted (T{sub 2}w) images. Method: Comparison of previous publications (Barkovich et al. 1988, Grodd 1993, Hayakawa et al. 1990, Hittmair et al. 1994, Martin et al. 1988/1990/1991, Nakagawa et al. 1998, Staudt et al. 1993/1994, Stricker et al. 1990). Results: Despite technical and methodological differences, these studies principally agreed on the timing of myelination for most regions of the brain. Thus, a common timetable could be established: At 1 month, myelin is visible on both T{sub 1}w and T{sub 2}w in the medulla oblongata, tegmentum pontis, cerebellar peduncles and vermis, quadrigeminal plate, decussation of superior cerebellar peduncles, thalamus, posterior limb of internal capsule, optic radiation, corona radiata. Thereafter, the myelin-typical signal in the different regions of the brain should be present at the following ages (M=months): Anterior limb of internal capsule (2 M: T{sub 1}w; 7 M: T{sub 2}w), splenium of corpus callosum (4 M: T{sub 1}w; 6 M: T{sub 2}w), genu of corpus callosum (6 M: T{sub 1}w; 8 M: T{sub 2}w), centrum semiovale (2 M: T{sub 1}w; 7 M: T{sub 2}w). Branching of myelin into the gyri of the telencephalon (=arborization) appears at the latest at: occipital lobe (5 M: T{sub 1}w; 12 M: T{sub 2}w) and frontal lobe (7 M: T{sub 1}w; 14 M: T{sub 2}w). Conclusion: These extracted age limits can be used for a more reliable assessment of myelination than the timetables from a single study. (orig.)
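
    The pooled timetable above lends itself to a simple lookup table. A minimal sketch (the dictionary and function names are ours; the ages are the months listed in the abstract):

```python
# months by which the myelin-typical signal should be present, from the
# pooled timetable above (arborization = branching into telencephalic gyri)
MYELINATION_MONTHS = {
    "anterior limb of internal capsule": {"T1w": 2, "T2w": 7},
    "splenium of corpus callosum": {"T1w": 4, "T2w": 6},
    "genu of corpus callosum": {"T1w": 6, "T2w": 8},
    "centrum semiovale": {"T1w": 2, "T2w": 7},
    "occipital lobe arborization": {"T1w": 5, "T2w": 12},
    "frontal lobe arborization": {"T1w": 7, "T2w": 14},
}

def myelination_overdue(region, weighting, age_months):
    """True if the region's myelin-typical signal is overdue at this age."""
    return age_months > MYELINATION_MONTHS[region][weighting]

print(myelination_overdue("genu of corpus callosum", "T2w", 9))    # True
print(myelination_overdue("frontal lobe arborization", "T2w", 12)) # False
```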

  14. Analysis of the Factors Affecting the Interval between Blood Donations Using Log-Normal Hazard Model with Gamma Correlated Frailties.

    Science.gov (United States)

    Tavakol, Najmeh; Kheiri, Soleiman; Sedehi, Morteza

    2016-01-01

    The time to the next blood donation plays a major role in a first-time donor becoming a regular one. The aim of this study was to determine the factors affecting the interval between blood donations. In a longitudinal study in 2008, 864 first-time donors at the Shahrekord Blood Transfusion Center, in the capital of Chaharmahal and Bakhtiari Province, Iran, were selected by systematic sampling and followed up for five years. Among these, a subset of 424 donors who had at least two successful blood donations was chosen for this study, and the time intervals between their donations were measured as the response variable. Sex, body weight, age, marital status, education, residence and job were recorded as independent variables. Data analysis was based on a log-normal hazard model with gamma correlated frailty. In this model, each frailty is the sum of two independent components, each assumed to follow a gamma distribution. The analysis was done via a Bayesian approach using a Markov chain Monte Carlo algorithm in OpenBUGS. Convergence was checked via the Gelman-Rubin criterion using the BOA package in R. Age, job and education had a significant effect on the chance of donating blood (P < 0.05): the chance was higher for older donors, clerical workers, laborers, the self-employed, students and educated donors, and correspondingly the time intervals between their donations were shorter. Given the significant effect of these variables in the log-normal correlated frailty model, it is necessary to plan educational and cultural programs to encourage people with longer inter-donation intervals to donate more frequently.
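
    A minimal, illustrative simulation of the model's idea (this is not the authors' OpenBUGS implementation): inter-donation times are log-normal, and a gamma-distributed frailty together with a covariate effect acts on the time scale. All parameter values and variable names are assumptions for illustration:

```python
import math
import random
import statistics

random.seed(0)

def simulate_intervals(n_donors, beta_age, mu=6.0, sigma=0.5, frailty_shape=2.0):
    """Simulate per-donor inter-donation intervals (days): log-normal times
    with an age covariate and a mean-1 gamma frailty on the time scale."""
    samples = []
    for _ in range(n_donors):
        age_std = random.gauss(0.0, 1.0)  # standardized age covariate
        # gamma frailty with mean 1; higher frailty -> shorter intervals
        frailty = random.gammavariate(frailty_shape, 1.0 / frailty_shape)
        log_t = (mu + beta_age * age_std - math.log(frailty)
                 + sigma * random.gauss(0.0, 1.0))
        samples.append((age_std, math.exp(log_t)))
    return samples

# a negative age coefficient means older donors return sooner
data = simulate_intervals(5000, beta_age=-0.3)
older = [t for a, t in data if a > 0.5]
younger = [t for a, t in data if a < -0.5]
print(statistics.median(older) < statistics.median(younger))  # True
```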

  15. Gene expression analysis of skin grafts and cultured keratinocytes using synthetic RNA normalization reveals insights into differentiation and growth control.

    Science.gov (United States)

    Katayama, Shintaro; Skoog, Tiina; Jouhilahti, Eeva-Mari; Siitonen, H Annika; Nuutila, Kristo; Tervaniemi, Mari H; Vuola, Jyrki; Johnsson, Anna; Lönnerberg, Peter; Linnarsson, Sten; Elomaa, Outi; Kankuri, Esko; Kere, Juha

    2015-06-25

    Keratinocytes (KCs) are the most frequent cells in the epidermis, and they are often isolated and cultured in vitro to study the molecular biology of the skin. Cultured primary cells and various immortalized cells have frequently been used as skin models, but their comparability to intact skin has been questioned. Moreover, when analyzing KC transcriptomes, fluctuation of polyA+ RNA content during the KC lifecycle has been omitted. We performed STRT RNA sequencing on 10 ng samples of total RNA from three different sample types: i) epidermal tissue (split-thickness skin grafts), ii) cultured primary KCs, and iii) the HaCaT cell line. We observed significant variation in cellular polyA+ RNA content between tissue and cell culture samples of KCs. The use of synthetic RNAs and SAMstrt in normalization enabled comparison of gene expression levels in the highly heterogeneous samples and facilitated discovery of differences between the tissue samples and cultured cells. The transcriptome analysis sensitively revealed genes involved in KC differentiation in skin grafts and genes related to cell cycle regulation in cultured KCs, and emphasized the fluctuation of transcription factors and non-coding RNAs associated with sample type. The epidermal keratinocytes from tissue and cell culture samples showed highly different polyA+ RNA contents. Normalization based on SAMstrt and synthetic RNA allowed the comparison between tissue and cell culture samples and thus proved to be a valuable tool for RNA-seq analysis with a translational approach. Transcriptomics revealed clear differences both between tissue and cell culture samples and between primary KCs and immortalized HaCaT cells.
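
    The spike-in normalization idea can be sketched as follows: each sample's gene counts are scaled by its total synthetic spike-in count, so samples with different polyA+ RNA content become comparable. This is a sketch of the general principle, not the SAMstrt implementation; the gene names and counts are hypothetical:

```python
def spikein_normalize(gene_counts, spike_counts, scale=1.0e4):
    """Scale one sample's gene counts by its total synthetic spike-in count,
    so samples with different polyA+ RNA content become comparable."""
    total_spike = sum(spike_counts)
    return {gene: scale * count / total_spike for gene, count in gene_counts.items()}

# hypothetical samples: same absolute expression of GENE_A, different depth;
# spike-in scaling recovers comparable values
tissue = spikein_normalize({"GENE_A": 50}, spike_counts=[80, 120])
culture = spikein_normalize({"GENE_A": 100}, spike_counts=[150, 250])
print(tissue["GENE_A"], culture["GENE_A"])  # 2500.0 2500.0
```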

  16. A Protocol for the Comprehensive Flow Cytometric Analysis of Immune Cells in Normal and Inflamed Murine Non-Lymphoid Tissues

    Science.gov (United States)

    Yu, Yen-Rei A.; O’Koren, Emily G.; Hotten, Danielle F.; Kan, Matthew J.; Kopin, David; Nelson, Erik R.; Que, Loretta; Gunn, Michael D.

    2016-01-01

    Flow cytometry is used extensively to examine immune cells in non-lymphoid tissues. However, a method of flow cytometric analysis that is both comprehensive and widely applicable has not been described. We developed a protocol for the flow cytometric analysis of non-lymphoid tissues, including methods of tissue preparation, a 10-fluorochrome panel for cell staining, and a standardized gating strategy, that allows the simultaneous identification and quantification of all major immune cell types in a variety of normal and inflamed non-lymphoid tissues. We demonstrate that our basic protocol minimizes cell loss, reliably distinguishes macrophages from dendritic cells (DC), and identifies all major granulocytic and mononuclear phagocytic cell types. This protocol is able to accurately quantify 11 distinct immune cell types, including T cells, B cells, NK cells, neutrophils, eosinophils, inflammatory monocytes, resident monocytes, alveolar macrophages, resident/interstitial macrophages, CD11b- DC, and CD11b+ DC, in normal lung, heart, liver, kidney, intestine, skin, eyes, and mammary gland. We also characterized the expression patterns of several commonly used myeloid and macrophage markers. This basic protocol can be expanded to identify additional cell types such as mast cells, basophils, and plasmacytoid DC, or to perform detailed phenotyping of specific cell types. In examining models of primary and metastatic mammary tumors, this protocol allowed the identification of several distinct tumor-associated macrophage phenotypes, the appearance of which was highly specific to individual tumor cell lines. This protocol provides a valuable tool to examine immune cell repertoires and follow immune responses in a wide variety of tissues and experimental conditions. PMID:26938654

  17. Network analysis reveals that bacteria and fungi form modules that correlate independently with soil parameters.

    Science.gov (United States)

    de Menezes, Alexandre B; Prendergast-Miller, Miranda T; Richardson, Alan E; Toscas, Peter; Farrell, Mark; Macdonald, Lynne M; Baker, Geoff; Wark, Tim; Thrall, Peter H

    2015-08-01

    Network and multivariate statistical analyses were performed to determine interactions between bacterial and fungal community terminal restriction fragment length polymorphisms, as well as soil properties, in paired woodland and pasture sites. Canonical correspondence analysis (CCA) revealed that shifts in woodland community composition correlated with soil dissolved organic carbon, while changes in pasture community composition correlated with moisture, nitrogen and phosphorus. Weighted correlation network analysis detected two distinct microbial modules per land use. Bacterial and fungal ribotypes did not group separately; rather, all modules comprised both bacterial and fungal ribotypes. The woodland modules had a similar fungal : bacterial ribotype ratio, while in the pasture one module was fungal dominated. There was no correspondence between pasture and woodland modules in their ribotype composition. The modules had different relationships to soil variables, and these contrasts were not detected without the use of network analysis. This study demonstrated that fungi and bacteria, components of the soil microbial communities usually treated as separate functional groups as in a CCA approach, were co-correlated and formed distinct associations in these adjacent habitats. Understanding these distinct modular associations may shed more light on their niche space in the soil environment, and allow a more realistic description of soil microbial ecology and function. © 2014 Society for Applied Microbiology and John Wiley & Sons Ltd.
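
    A toy sketch of the module idea (not the weighted correlation network software typically used for such analyses): ribotypes whose abundance profiles are strongly correlated across sites are grouped into modules as connected components of a thresholded correlation network. The profile values below are invented for illustration:

```python
import math
from itertools import combinations

def pearson(x, y):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def modules(profiles, threshold=0.8):
    """Group ribotypes whose profiles satisfy |r| >= threshold into
    modules, computed as connected components via union-find."""
    names = list(profiles)
    parent = {n: n for n in names}
    def find(n):
        while parent[n] != n:
            n = parent[n]
        return n
    for a, b in combinations(names, 2):
        if abs(pearson(profiles[a], profiles[b])) >= threshold:
            parent[find(a)] = find(b)
    groups = {}
    for n in names:
        groups.setdefault(find(n), set()).add(n)
    return list(groups.values())

# toy profiles across 5 sites: bacterial B1 tracks fungal F1 (one mixed
# module), while B2 varies independently (its own module)
profiles = {
    "B1": [1, 2, 3, 4, 5],
    "F1": [2, 4, 6, 8, 10],
    "B2": [5, 1, 4, 2, 3],
}
print(sorted(sorted(m) for m in modules(profiles)))  # [['B1', 'F1'], ['B2']]
```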

  18. Human papillomavirus infection in females with normal cervical cytology: Genotyping and phylogenetic analysis among women in Punjab, Pakistan.

    Science.gov (United States)

    Aziz, Hafsa; Iqbal, Huma; Mahmood, Humera; Fatima, Shazia; Faheem, Mohammad; Sattar, Areej Abdul; Tabassum, Sobia; Napper, Sanum; Batool, Syeda; Rasheed, Nuzhat

    2018-01-01

    Globally, cervical cancer is the fourth most common cancer in women and the seventh most common cancer overall, accounting for an estimated 300 000 annual deaths. Human papillomavirus (HPV) is the second most common cause of cervical cancer worldwide. HPV screening is not a common practice in Pakistan. The aim of this study was to determine the prevalence of HPV and of HPV types in women with a normal cervical cytology living in the upper and lower regions of Punjab, Pakistan, and to analyze the risk factors for HPV in this region. PCR analysis was performed for 1011 female patients with a normal cervical cytology from various districts of Punjab Province, Pakistan. Risk factors for the acquisition of HPV were studied. High-risk HPV types (HPV16 and HPV18) were detected using the Abbott RealTime High Risk HPV test. To determine the genotype, partial L1 region sequences of HPV-positive samples were sequenced using MY09/MY11 primers, and a phylogenetic tree was constructed using CLC software. The study found a 4.74% prevalence of HPV, with the most frequent type being the low-risk HPV6 (in 25% of infected individuals), followed by HPV55 (22.9%), HPV11 (20.8%), and the high-risk types HPV45 (12.5%), HPV33 (8.33%), HPV18 (6.25%), and HPV16 (4.16%). Phylogenetic analysis of all HPV types in this study showed 80-99% nucleotide identity with types of the same species. The sequences clustered with those from China, India, Mexico, Iran, Slovenia, and Germany, showing the diversity in origin of the various genotypes prevalent in Pakistan. In this population with a normal cervical cytology, the prevalence of high-risk HPV types was very low. The major prevalent HPV genotype in Punjab Province of Pakistan was the low-risk HPV type 6, followed by HPV type 55. Sequencing of the partial L1 region suggested that the region was highly conserved in all reported sequences. This study highlights the need to conduct robust epidemiological studies in the region.

  19. Ultrastructural Analysis of Leishmania infantum chagasi Promastigote Forms Treated In Vitro with Usnic Acid

    Directory of Open Access Journals (Sweden)

    João S. B. da Luz

    2015-01-01

    Full Text Available Leishmaniasis is considered by the World Health Organization to be one of the endemic infectious parasitic diseases of great relevance and a global public health problem. The pentavalent antimonials used to treat this disease have limitations, and new phytochemicals are emerging as alternatives to existing treatments owing to their low toxicity and cost. Usnic acid is uniquely found in lichens and is especially abundant in genera such as Alectoria, Cladonia, Evernia, Lecanora, Ramalina, and Usnea. Usnic acid has been shown to exhibit antiviral, antiprotozoal, antiproliferative, anti-inflammatory, and analgesic activity. The aim of this study was to evaluate the antileishmanial activity of usnic acid on Leishmania infantum chagasi promastigotes and the occurrence of drug-induced ultrastructural damage in the parasite. Usnic acid was effective against the promastigote forms (IC50 = 18.30 ± 2.00 µg/mL). Structural and ultrastructural aspects of the parasite were analyzed. Morphological alterations were observed: blebbing and shedding of the cell membrane, an increased number of cytoplasmic vacuoles, and cellular and mitochondrial swelling with loss of cell polarity. We conclude that usnic acid has antileishmanial activity against the promastigote forms of Leishmania infantum chagasi, and the structural and ultrastructural analysis reinforces its cytotoxicity. Further in vitro studies are warranted to evaluate this potential.

  20. PHYLOGENETIC ANALYSIS AND AUTECOLOGY OF SPORE-FORMING BACTERIA FROM HYPERSALINE ENVIRONMENTS.

    Science.gov (United States)

    Gladka, G V; Romanovskaya, V A; Tashyreva, H O; Tashyrev, O B

    2015-01-01

    Spore-forming bacteria of the genus Bacillus that are multi-resistant to extreme factors were isolated from hypersaline environments of the Crimea (Ukraine) and the Dead Sea (Israel). Phylogenetic analysis showed that the dominant extremophilic culturable species differ between the studied regions: in the Crimean environments they are B. mojavensis and B. simplex; in the Dead Sea ecosystem, B. subtilis subsp. spizizenii, B. subtilis subsp. subtilis, B. licheniformis and B. simplex. The isolates are simultaneously halotolerant and resistant to UV radiation. For strains isolated from the Dead Sea and Crimean environments, LD90 and LD99.99 were 100-170 J/m2 and 750-1500 J/m2, respectively. Spores showed higher UV resistance (LD99.99 of 2500 J/m2) than vegetative cells; however, spores made up only 0.02-0.007% of the whole cell population and should not significantly affect the UV LD99.99 value. Isolates from both environments were halotolerant in the range 0.1-10% NaCl, thermotolerant in the range 20-50 °C, and did not grow at 15 °C. The survival strategy of spore-forming bacteria from hypersaline environments under high UV radiation levels may rely on spore formation, which minimizes cell damage, as well as on efficient DNA repair systems that remove the damage.

  1. Analysis of white layers formed in hard turning of AISI 52100 steel

    International Nuclear Information System (INIS)

    Ramesh, A.; Melkote, S.N.; Allard, L.F.; Riester, L.; Watkins, T.R.

    2005-01-01

    The formation mechanisms and properties of white layers produced in machining of hardened steels are not clearly understood to date. In particular, detailed analysis of their structure and mechanical properties is lacking. This paper investigates the differences in structure and properties of white layers formed during machining of hardened AISI 52100 steel (62 HRC) at different cutting speeds. A combination of experimental techniques, including transmission electron microscopy (TEM), X-ray diffraction (XRD), and nano-indentation, is used to analyze the white layers. TEM results suggest that white layers produced at low-to-moderate cutting speeds are in large part due to grain refinement induced by severe plastic deformation, whereas white layer formation at high cutting speeds is mainly due to thermally driven phase transformation. The white layers at all speeds are found to consist of very fine (nano-scale) grains compared to the bulk material. XRD-based residual stress and retained austenite measurements, and hardness data, support these findings.

  2. Plastic forming simulation analysis of marine engine crankshaft single-throw

    Directory of Open Access Journals (Sweden)

    LIU Peipei

    2016-08-01

    Full Text Available The research object is a marine engine crankshaft single throw. A 3D model of the single-throw blank and die for the forging process is established in SolidWorks, then imported into the metal plastic forming CAE software DEFORM-3D to carry out the plastic forming simulation, to examine the relationship between internal flow stress and external deformation conditions during metal plastic deformation at different strain rates and temperatures, and to analyze the resulting data. The results show that, at a constant strain rate, a higher preset temperature yields a lower stress-strain curve. At a constant blank temperature, a higher strain rate increases the internal flow stress of the sample and worsens its resistance to fatigue. The results also provide a theoretical basis for further design optimization.

  3. Dynamic analysis of suspension cable based on vector form intrinsic finite element method

    Science.gov (United States)

    Qin, Jian; Qiao, Liang; Wan, Jiancheng; Jiang, Ming; Xia, Yongjun

    2017-10-01

    A vector finite element method is presented for the dynamic analysis of cable structures, based on the vector form intrinsic finite element (VFIFE) method and the mechanical properties of suspension cables. First, the suspension cable is discretized into elements by space points, and the mass and external forces of the cable are lumped at these points. The structural form of the cable is described by the positions of the space points at different times. The equations of motion for the space points are established according to Newton's second law. Then, the internal forces between the space points are derived from a flexible truss model. Finally, the equations of motion of the space points are solved by the central difference method with a reasonable time integration step. The tangential tension of the bearing rope in a test ropeway under moving concentrated loads is calculated and compared with experimental data; the computed tension is consistent with the measurements. The method has high computational precision and meets the requirements of engineering applications.
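
    The central difference scheme mentioned above can be sketched for a single degree of freedom. The undamped spring-mass example below is ours, chosen because its analytic solution x(t) = cos(2t) makes the accuracy easy to check; it is not the cable model itself:

```python
def central_difference(x0, v0, accel, dt, steps):
    """March x_{n+1} = 2 x_n - x_{n-1} + dt^2 a(x_n) from an initial
    position and velocity; accel is a function of position only."""
    xs = [x0]
    # fictitious step n = -1 from a Taylor expansion, to start the recursion
    x_prev = x0 - v0 * dt + 0.5 * accel(x0) * dt * dt
    for _ in range(steps):
        x = xs[-1]
        x_next = 2.0 * x - x_prev + dt * dt * accel(x)
        x_prev = x
        xs.append(x_next)
    return xs

# undamped spring-mass, a = -(k/m) x with k/m = 4, so x(t) = cos(2 t);
# integrate to t ~ pi/2 where the exact solution is -1
xs = central_difference(1.0, 0.0, lambda x: -4.0 * x, dt=0.001, steps=1571)
print(round(xs[-1], 3))  # -1.0
```

The explicit scheme is only conditionally stable, which is why the abstract stresses a "reasonable time integration step": here dt must stay well below 2/ω = 1.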

  4. Non-Conventional Applications of Computerized Tomography: Analysis of Solid Dosage Forms Produced by Pharmaceutical Industry

    International Nuclear Information System (INIS)

    Martins de Oliveira, Jose Jr.; Germano Martins, Antonio Cesar

    2010-01-01

    X-ray computed tomography (CT) refers to the cross-sectional imaging of an object by measuring the transmitted radiation in different directions. In this work, we describe a non-conventional application of computerized tomography: visualization and improved understanding of some internal structural features of solid dosage forms. A micro-CT X-ray scanner with a minimum resolution of 30 μm was used to characterize pharmaceutical tablets, granules, a controlled-release osmotic tablet and liquid-filled soft-gelatin capsules. The analysis presented in this work is essentially qualitative, but quantitative parameters, such as porosity, density distribution, and tablet dimensions, could also be obtained using the related CT techniques.
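
    One of the quantitative parameters mentioned, porosity, can be estimated from a CT volume by counting low-attenuation (void) voxels. A minimal sketch; the toy volume and threshold value are invented for illustration:

```python
def porosity(volume, air_threshold):
    """Fraction of voxels whose attenuation falls below the threshold,
    i.e. the void (pore) fraction of the scanned volume."""
    values = [v for plane in volume for row in plane for v in row]
    return sum(1 for v in values if v < air_threshold) / len(values)

# toy 2x2x2 attenuation volume: two low-attenuation (void) voxels of eight
volume = [[[900, 30], [850, 870]],
          [[20, 910], [880, 860]]]
print(porosity(volume, air_threshold=100))  # 0.25
```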

  5. Finite element analysis and optimization of process parameters during stamp forming of composite materials

    International Nuclear Information System (INIS)

    Venkatesan, S; Kalyanasundaram, S

    2010-01-01

    In the manufacture of parts for high performance structures using composite materials, the quality and robustness of the parts is of utmost importance. The quality of the produced parts depends largely on the process parameters and manufacturing methodologies. This study presents the use of a temperature-dependent orthotropic material model for a coupled structural-thermal analysis of the stamp forming process. The study investigated the effects of process parameters such as pre-heat temperature, blank holder force and process time on the formability of composite materials. Temperature was found to be the dominant factor governing the formability of the composite material, while higher blank holder forces were deemed important for achieving high part quality. Finally, an optimum set of parameters was used to compare the simulations with experimental results from an optical strain measurement system.

  6. Structural studies of formic acid using partial form-factor analysis

    International Nuclear Information System (INIS)

    Swan, G.; Dore, J.C.; Bellissent-Funel, M.C.

    1993-01-01

    Neutron diffraction measurements have been made of liquid formic acid using H/D isotopic substitution. Data are recorded for samples of DCOOD, HCOOD and a (H/D)COOD mixture (α_D = 0.36). A first-order difference method is used to determine the intra-molecular contribution through the introduction of a partial form-factor analysis technique incorporating a hydrogen-bond term. The method improves the sensitivity of the parameters defining the molecular geometry and avoids some of the ambiguities arising from terms involving spatial overlap of inter- and intra-molecular features. The possible application to other systems is briefly reviewed. (authors). 8 figs., 2 tabs., 8 refs.

  7. Constitutive modeling for analysis and design of aluminum sheet forming processes

    International Nuclear Information System (INIS)

    Barlat, F.; Chung, K.; Yoon, J-W.; Choi, S-H.

    2000-01-01

    Finite element modeling (FEM) technology is one of the most powerful tools used to design new products, i.e. appliances, automotive, rigid packaging and aerospace parts, etc., and processes. However, FEM users need data and models to characterize the materials used to fabricate the new products. In fact, they need more information than the traditional and standard yield strength, ultimate strength, elongation, etc. Constitutive models and their associated coefficients represent a new way to describe material properties, a way that can be used by FEM users. In order to help manufacturers use more aluminum alloy sheet in their products, appropriate material models are needed to analyze and design specifically for these materials. This work describes a methodology that provides phenomenological constitutive equations based on three main microstructure components of aluminum alloys: dislocation density, second-phase particles and crystallographic texture. Examples of constitutive equations and their applications to numerical sheet forming process analysis and design are provided in this work. (author)

  8. The research of moisture forms in the baking yeast by the thermogravimetric analysis method

    Directory of Open Access Journals (Sweden)

    S. V. Lavrov

    2016-01-01

    Full Text Available Thermogravimetry is one of the few absolute methods of analysis, which makes it one of the most accurate. In this research, thermogravimetric analysis of baking yeast (Saccharomyces cerevisiae) was carried out. It allowed the identification of temperature zones corresponding to moisture with different binding energies, the prediction of operating parameters for the dehumidification process, and the selection of the most effective dehydration method. The studies were conducted in the laboratory of the collective use center "Control and management of energy efficient projects" of the Voronezh State University of Engineering Technologies on the simultaneous thermal analysis device STA 449 F3 (NETZSCH, Germany). The device records the change in sample mass and the difference in heat flow between the crucible containing the sample and the crucible containing the reference. The analyzer's working principle is based on continuous recording of the dependence of the material's mass on time or temperature as it is heated under a selected temperature program in a specified gas atmosphere. The release or absorption of heat by the sample due to phase transitions or chemical reactions is recorded simultaneously. The study was performed under the following conditions: atmospheric pressure, a maximum temperature of 588 K, and a heating rate of 5 K/min. The experiments were performed in aluminum crucibles with a total sample weight of 12 mg. The NETZSCH Proteus software was used to process the obtained TG and DTG curves. Analysis of the data made it possible to identify the periods of water removal and solids transformation under thermal treatment of baking yeast, and to identify temperature zones corresponding to the release of moisture with different binding forms and energies.

  9. Uncertainty analysis of the radiological characteristics of radioactive waste using a method based on log-normal distributions

    International Nuclear Information System (INIS)

    Gigase, Yves

    2007-01-01

    Available in abstract form only. Full text of publication follows: The uncertainty on the characteristics of radioactive LILW waste packages is difficult to determine and often very large. This results from a lack of knowledge of the constitution of the waste package and of the composition of the radioactive sources inside. To calculate a quantitative estimate of the uncertainty on a characteristic of a waste package, one has to combine these various uncertainties. This paper discusses an approach to this problem, based on the use of the log-normal distribution, which is both elegant and easy to use. It can provide, for example, quantitative estimates of uncertainty intervals that 'make sense'. The purpose is to develop a pragmatic approach that can be integrated into existing characterization methods. In this paper we show how our method can be applied to the scaling factor method. We also explain how it can be used when estimating other, more complex characteristics, such as the total uncertainty of a collection of waste packages. This method could have applications in radioactive waste management, in particular in decision processes where the uncertainty on the amount of activity is considered important, such as probabilistic risk assessment or the definition of criteria for acceptance or categorization. (author)
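
    The convenience of the log-normal distribution here is that independent multiplicative factors combine analytically: medians multiply and squared log-standard-deviations add. A minimal sketch of this combination rule; the factor values are invented, not taken from the paper:

```python
import math

def combine_lognormal(factors):
    """Combine independent multiplicative log-normal factors, each given as
    (median, geometric standard deviation), into a single log-normal."""
    log_median = sum(math.log(median) for median, _ in factors)
    log_var = sum(math.log(gsd) ** 2 for _, gsd in factors)
    return math.exp(log_median), math.exp(math.sqrt(log_var))

# e.g. a scaling-factor-style estimate: measured key-nuclide activity
# (median 100, GSD 1.5) times a scaling factor (median 0.2, GSD 2.0)
median, gsd = combine_lognormal([(100.0, 1.5), (0.2, 2.0)])
print(round(median, 1), round(gsd, 2))  # 20.0 2.23
# an approximate 95% interval: median / gsd**1.96 to median * gsd**1.96
```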

  10. Chandra-SDSS Normal and Star-Forming Galaxies. I. X-Ray Source Properties of Galaxies Detected by the Chandra X-Ray Observatory in SDSS DR2

    Science.gov (United States)

    Hornschemeier, A. E.; Heckman, T. M.; Ptak, A. F.; Tremonti, C. A.; Colbert, E. J. M.

    2005-01-01

    We have cross-correlated X-ray catalogs derived from archival Chandra X-Ray Observatory ACIS observations with a Sloan Digital Sky Survey Data Release 2 (DR2) galaxy catalog to form a sample of 42 serendipitously X-ray-detected galaxies at redshifts from 0.03 upward, a sample intermediate in depth between nearby normal galaxies and those in the deepest X-ray surveys. Our chief purpose is to compare optical spectroscopic diagnostics of activity (both star formation and accretion) with the X-ray properties of galaxies. Our work supports a normalization value of the X-ray/star formation rate correlation consistent with the lower values published in the literature. The difference lies in the allocation of X-ray emission to high-mass X-ray binaries relative to other components, such as hot gas, low-mass X-ray binaries, and/or active galactic nuclei (AGNs). We are able to quantify a few pitfalls in the use of lower resolution, lower signal-to-noise ratio optical spectroscopy to identify X-ray sources (as has necessarily been employed for many X-ray surveys). Notably, we find a few AGNs that would likely have been misidentified as non-AGN sources in higher redshift studies. However, we do not find any X-ray-hard, highly X-ray-luminous galaxies lacking optical spectroscopic diagnostics of AGN activity. Such sources are members of the "X-ray-bright, optically normal galaxy" (XBONG) class of AGNs.

  11. Normal radiographic findings. 4. act. ed.

    International Nuclear Information System (INIS)

    Moeller, T.B.

    2003-01-01

    This book can serve the reader in three ways: First, it presents normal findings for all radiographic techniques, including contrast medium (KM) examinations. Important data that are criteria of normal findings are indicated directly in the images and are also explained in the text and in summary form. Second, it teaches the systematics of interpreting an image: how to look at it, which structures to examine in what order, and what to look for in particular. Checklists are presented in each case. Third, findings are formulated in accordance with the image analysis procedure. All criteria of normal findings are defined in these formulations, which makes them an important didactic element. (orig.)

  12. Normal radiographic findings. 4. act. ed.; Roentgennormalbefunde

    Energy Technology Data Exchange (ETDEWEB)

    Moeller, T.B. [Gemeinschaftspraxis fuer Radiologie und Nuklearmedizin, Dillingen (Germany)

    2003-07-01

    This book can serve the reader in three ways: First, it presents normal findings for all radiographic techniques, including contrast-medium (KM) studies. Important data which are criteria of normal findings are indicated directly in the pictures and are also explained in full text and in summary form. Secondly, it teaches the systematics of interpreting an image: how to look at it, which structures to examine in what order, and what to look for in particular. Checklists are presented for each case. Thirdly, findings are formulated in accordance with the image analysis procedure. All criteria of normal findings are defined in these formulations, which makes them an important didactic element. (orig.)

  13. Thermal Analysis of Braille Formed by Using Screen Printing and Inks with Thermo Powder

    Directory of Open Access Journals (Sweden)

    Svіtlana HAVENKO

    2015-03-01

    Full Text Available In order to improve the integration of blind people into society, suitable conditions should be provided for them. The expansion of Braille (BR) use could serve this purpose. Depending on the materials used, Braille can be formed or printed in different ways: embossing, screen printing, thermoforming, or digital printing. The aim of this research is to determine the effect of the thermal properties of screen printing inks and inks with thermo-powder on the qualitative parameters of Braille. Screen printing inks and inks with thermo-powder were chosen for the research. In the qualitative analysis of the Braille printouts, thermal stability was evaluated by analyzing thermograms obtained with a Q-1500 derivatograph. This paper presents the findings of thermogravimetric (TG), differential thermogravimetric (DTG) and differential thermal analysis (DTA) of printouts made on Plike paperboard using traditional screen printing inks and screen printing inks with thermo-powder. Based on these findings, the thermal stability of printouts printed with thermo-powder ink is higher than that of printouts printed with traditional screen printing inks. The appropriate drying temperature range for screen printing inks with thermo-powder is 98 ºC – 198 ºC, because in this range a better relief of the Braille dots is obtained. DOI: http://dx.doi.org/10.5755/j01.ms.21.1.5702

  14. Optimal methodologies for terahertz time-domain spectroscopic analysis of traditional pigments in powder form

    Science.gov (United States)

    Ha, Taewoo; Lee, Howon; Sim, Kyung Ik; Kim, Jonghyeon; Jo, Young Chan; Kim, Jae Hoon; Baek, Na Yeon; Kang, Dai-ill; Lee, Han Hyoung

    2017-05-01

    We have established optimal methods for terahertz time-domain spectroscopic analysis of highly absorbing pigments in powder form based on our investigation of representative traditional Chinese pigments, such as azurite [blue-based color pigment], Chinese vermilion [red-based color pigment], and arsenic yellow [yellow-based color pigment]. To accurately extract the optical constants in the terahertz region of 0.1 - 3 THz, we carried out transmission measurements in such a way that intense absorption peaks did not completely suppress the transmission level. This required preparation of pellet samples with optimized thicknesses and material densities. In some cases, mixing the pigments with polyethylene powder was required to minimize absorption due to certain peak features. The resulting distortion-free terahertz spectra of the investigated set of pigment species exhibited well-defined unique spectral fingerprints. Our study will be useful to future efforts to establish non-destructive analysis methods of traditional pigments, to construct their spectral databases, and to apply these tools to restoration of cultural heritage materials.

  15. Gliomas: Application of Cumulative Histogram Analysis of Normalized Cerebral Blood Volume on 3 T MRI to Tumor Grading

    Science.gov (United States)

    Kim, Hyungjin; Choi, Seung Hong; Kim, Ji-Hoon; Ryoo, Inseon; Kim, Soo Chin; Yeom, Jeong A.; Shin, Hwaseon; Jung, Seung Chai; Lee, A. Leum; Yun, Tae Jin; Park, Chul-Kee; Sohn, Chul-Ho; Park, Sung-Hye

    2013-01-01

    Background Glioma grading assumes significant importance in that low- and high-grade gliomas display different prognoses and are treated with dissimilar therapeutic strategies. The objective of our study was to retrospectively assess the usefulness of a cumulative normalized cerebral blood volume (nCBV) histogram for glioma grading based on 3 T MRI. Methods From February 2010 to April 2012, 63 patients with astrocytic tumors underwent 3 T MRI with dynamic susceptibility contrast perfusion-weighted imaging. Regions of interest containing the entire tumor volume were drawn on every section of the co-registered relative CBV (rCBV) maps and T2-weighted images. The percentile values from the cumulative nCBV histograms and the other histogram parameters were correlated with tumor grades. Cochran’s Q test and the McNemar test were used to compare the diagnostic accuracies of the histogram parameters after receiver operating characteristic curve analysis. Using the parameter offering the highest diagnostic accuracy, a validation process was performed with an independent test set of nine patients. Results The 99th percentile of the cumulative nCBV histogram (nCBV C99), the mean, and the peak height differed significantly between low- and high-grade gliomas. Conclusions Cumulative histogram analysis of nCBV using 3 T MRI can be a useful method for preoperative glioma grading. The nCBV C99 value is helpful in distinguishing high- from low-grade gliomas and grade IV from grade III gliomas. PMID:23704910
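The nCBV C99 metric above is read off a cumulative histogram of the voxel values inside the tumor region of interest. A minimal sketch of that computation, using synthetic voxel values rather than any of the study's data (function and variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical nCBV voxel values inside a tumor ROI (not the study's data)
ncbv = rng.lognormal(mean=0.5, sigma=0.4, size=5000)

def cumulative_percentile(values, pct, bins=256):
    """Read the pct-th percentile off a cumulative histogram."""
    counts, edges = np.histogram(values, bins=bins)
    cdf = np.cumsum(counts) / counts.sum()
    idx = np.searchsorted(cdf, pct / 100.0)   # first bin whose cdf reaches pct
    return edges[idx + 1]                     # upper edge of that bin

c99 = cumulative_percentile(ncbv, 99)
```

The value agrees with a direct percentile computation up to the histogram's bin width, which is why the number of bins matters when reporting histogram-derived percentiles.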

  16. Genetic and anatomical analysis of normal and abnormal flowers of date palm cultivar barhy derived from offshoot and tissue culture

    International Nuclear Information System (INIS)

    Shair, O.H.

    2016-01-01

    Random Amplified Polymorphic DNA (RAPD) analysis of 6 normal-flower-producing offshoot-derived trees and 6 abnormal, multiple-carpel-flower-producing tissue culture (TC)-derived trees of cultivar (cv.) Barhy was performed with the objective of checking for genetic variation, if any, at the DNA level. DNA samples were extracted from pollinated and un-pollinated flowers from both sets of plants. Amplified RAPD products were clearly detected with the 30 primers used in this experiment, but only 3 gave a few polymorphic bands, which indicates a low level of genetic variation between the offshoot- and TC-derived plants. Cluster analysis by the unweighted paired group method of arithmetic means (UPGMA) showed close genomic similarity among the 12 DNA samples, with Nei and Li's coefficients in the similarity matrix ranging from 0.486 to 0.904. The average similarity among the 12 DNA samples was more than 50%. Floral abnormalities in TC-derived plants were also studied microscopically. Abnormalities such as development of more than three carpels, abnormal ovule development, and deformities of the style and stigma were observed. The results show that the composition and the abnormalities of flowers in TC-derived plants of cultivar Barhy may be attributed to epigenetic changes that take place at different stages of tissue culture and not to major changes at the DNA level. (author)

  17. Analysis of medical institutions with various organizational forms of rehabilitation treatment and outpatient departments

    Directory of Open Access Journals (Sweden)

    Shapovalenko T.V.

    2013-12-01

    Full Text Available The study aimed to analyze the activity of two medical institutions with different organizational forms that render rehabilitation and recovery treatment services to the adult population. Material and Methods. Five years of practical experience in rendering medical care in recovery treatment and rehabilitation to the adult population was studied on the basis of medico-statistical processing of the reporting documentation of two institutions: the recovery medicine and rehabilitation center of the Ministry of Health of Russia, functioning on the basis of the "Medical and Rehabilitation Center", a large versatile medical center; and an interdistrict center of recovery treatment based in a city polyclinic of St. Petersburg. Results. The advisability of rendering this type of specialized medical care by medical institutions with different organizational forms was established. Conclusion. The interdistrict rehabilitation centers functioning as part of city polyclinics are undoubtedly necessary as the medical setting closest to patients' homes. However, functions such as diagnostics of the functional condition of the organism and objective assessment of patients' state of health using screening techniques; examination of organized collectives and population groups to identify risk groups; establishment of the extent of the influence of environmental factors on the state of health; and active supervision of persons with increased risk factors for disease, with correction of the revealed functional disorders, can be performed only in centers organized on the basis of modern versatile treatment-and-prophylactic establishments equipped with modern diagnostic devices, allowing traditional methods of drug therapy to be supplemented with new, effective techniques of treatment.

  18. Ultrastructural analysis of bone nodules formed in vitro by isolated fetal rat calvaria cells

    International Nuclear Information System (INIS)

    Bhargava, U.; Bar-Lev, M.; Bellows, C.G.; Aubin, J.E.

    1988-01-01

    When cells enzymatically digested from 21 d fetal rat calvaria are grown in ascorbic acid and Na beta-glycerophosphate, they form discrete three-dimensional nodular structures with the histological and immunohistochemical appearance of woven bone. The present investigation was undertaken to verify that bone-like features were identifiable at the ultrastructural level. The nodules formed on top of a fibroblast-like multilayer of cells. The upper surface of the nodules was lined by a continuous layer of cuboidal osteoblastic cells often seen to be joined by adherens junctions. Numerous microvilli, membrane protrusions, and coated pits could be seen on the upper surface of these cells, their cytoplasm contained prominent RER and Golgi membranes, and processes extended from their lower surfaces into a dense, highly organized collagenous matrix. Some osteocyte-like cells were completely embedded within this matrix; they also displayed RER and prominent processes which extended through the matrix and often made both adherens and gap junctional contacts with the processes of other cells. The fibroblastic cells not participating in nodule formation were surrounded by a less dense collagenous matrix and, in contrast to the matrix of the nodules, it did not mineralize. An unmineralized osteoid-like layer was seen directly below the cuboidal top layer of cells. A mineralization front was detectable below this in which small, discrete structures resembling matrix vesicles and feathery mineral crystals were evident and frequently associated with the collagen fibrils. More heavily mineralized areas were seen further into the nodule. Electron microprobe and electron and X-ray diffraction analysis confirmed the mineral to be hydroxyapatite.

  19. Hashimoto thyroiditis: Part 1, sonographic analysis of the nodular form of Hashimoto thyroiditis.

    Science.gov (United States)

    Anderson, Lauren; Middleton, William D; Teefey, Sharlene A; Reading, Carl C; Langer, Jill E; Desser, Terry; Szabunio, Margaret M; Hildebolt, Charles F; Mandel, Susan J; Cronan, John J

    2010-07-01

    The purpose of this article is to analyze the sonographic appearance of nodular Hashimoto thyroiditis. As part of an ongoing multiinstitutional study, patients who underwent ultrasound examination and fine-needle aspiration of one or more thyroid nodules were analyzed for multiple predetermined sonographic features. Patients completed a questionnaire, including information about thyroid function and thyroid medication. Patients (n = 61) with fine-needle aspiration cytologic results consistent with nodular Hashimoto thyroiditis (n = 64) were included in the study. The mean (+/- SD) diameter of nodular Hashimoto thyroiditis was 15 +/- 7.33 mm. Nodular Hashimoto thyroiditis occurred as a solitary nodule in 36% (23/64) of cases and in the setting of five or more nodules in 23% (15/64) of cases. Fifty-five percent (35/64) of the cases of nodular Hashimoto thyroiditis occurred within a sonographic background of diffuse Hashimoto thyroiditis, and 45% (29/64) of cases occurred within normal thyroid parenchyma. The sonographic appearance was extremely variable. It was most commonly solid (69% [42/61] of cases) and hypoechoic (47% [27/58] of cases). Twenty percent (13/64) of nodules had calcifications (seven with nonspecific bright reflectors, four with macrocalcifications, and three eggshell), and 5% (3/64) of nodules had colloid. Twenty-seven percent (17/64) of nodules had a hypoechoic halo. The margins were well defined in 60% (36/60) and ill defined in 40% (24/60) of nodules. On Doppler analysis, 35% (22/62) of nodules were hypervascular, 42% (26/62) were isovascular or hypovascular, and 23% (14/62) were avascular. The sonographic features and vascularity of nodular Hashimoto thyroiditis were extremely variable.

  20. SFAP: Scan-Tron Forms Analysis Package for the IBM-PC. User's Guide, Version 2.0.

    Science.gov (United States)

    Harnisch, Delwyn L.; And Others

    The Scan-Tron Forms Analysis Package (SFAP) is a collection of integrated programs that allow an IBM-PC (or compatible) to collect data from a Scan-Tron 1200 forms reader. In addition to the basic data acquisition capability, the SFAP has additional capabilities related to the viewing and formatting of incoming data. When used in combination with…

  1. ESTUDIO ESTADÍSTICO DEL NÚMERO DE REGLAS RESULTANTES AL TRANSFORMAR UNA GRAMÁTICA LIBRE DE CONTEXTO A LA FORMA NORMAL DE CHOMSKY STATISTICAL STUDY OF THE NUMBER OF RESULTING RULES WHEN TRANSFORMING A CONTEXT-FREE GRAMMAR TO CHOMSKY NORMAL FORM

    Directory of Open Access Journals (Sweden)

    Fredy Ángel Miguel Amaya Robayo

    2010-08-01

    Full Text Available Es un hecho conocido que toda gramática libre de contexto puede ser transformada a la forma normal de Chomsky de tal forma que los lenguajes generados por las dos gramáticas son equivalentes. Una gramática en forma normal de Chomsky (FNC) tiene algunas ventajas, por ejemplo sus árboles de derivación son binarios, la forma de sus reglas más simples etc. Por eso es siempre deseable poder trabajar con una gramática en FNC en las aplicaciones que lo requieran. Existe un algoritmo que permite transformar una gramática libre de contexto a una en FNC, sin embargo la cantidad de reglas generadas al hacer la transformación depende del número de reglas en la gramática inicial así como de otras características. En este trabajo se analiza desde el punto de vista experimental y estadístico, la relación existente entre el número de reglas iniciales y el número de reglas que resultan luego de transformar una Gramática Libre de Contexto a la FNC. Esto permite planificar la cantidad de recursos computacionales necesarios en caso de tratar con gramáticas de alguna complejidad. It is well known that any context-free grammar can be transformed to Chomsky normal form so that the languages generated by the two grammars are equivalent. A grammar in Chomsky normal form (CNF) has some advantages: its derivation trees are binary, its rules have a simpler form, and so on. It is therefore always desirable to work with a grammar in CNF in applications that require one. There is an algorithm that transforms a context-free grammar into a CNF grammar; however, the number of rules generated by the transformation depends on the number of rules in the initial grammar as well as on other characteristics. In this work we analyze, from an experimental and statistical point of view, the relationship between the number of initial rules and the number of rules resulting from the transformation. This makes it possible to plan the amount of computational resources needed when dealing with grammars of some complexity.
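The growth in rule count that the article studies empirically is driven largely by the binarization step of the CNF construction: every rule whose right-hand side has k >= 3 symbols is replaced by k - 1 binary rules via fresh nonterminals. A minimal sketch of just that step (the grammar representation and names are my own; fresh-name collisions and the unit/terminal steps are not handled):

```python
def binarize(rules):
    """Replace each rule (lhs, rhs) whose rhs has k >= 3 symbols
    with k - 1 binary rules, introducing fresh nonterminals X1, X2, ..."""
    out, counter = [], 0
    for lhs, rhs in rules:
        head, rest = lhs, list(rhs)
        while len(rest) > 2:
            counter += 1
            fresh = f"X{counter}"          # assumes X1, X2, ... are unused names
            out.append((head, (rest[0], fresh)))
            head, rest = fresh, rest[1:]
        out.append((head, tuple(rest)))
    return out

# A -> B C D E expands to exactly k - 1 = 3 binary rules
binary = binarize([("A", ("B", "C", "D", "E"))])
```

For right-hand sides of lengths k_1, ..., k_n this step alone yields sum over i of max(k_i - 1, 1) rules, which is one source of the rule-count relationship the article measures; eliminating unit productions and replacing terminals in long rules add further growth.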

  2. Teaching Form as Form

    DEFF Research Database (Denmark)

    Keiding, Tina Bering

    2012-01-01

    understanding of form per se, or, to use an expression from this text, of form as form. This challenge can be reduced to one question: how can design teaching support students in achieving not only the ability to recognize and describe different form-related concepts in existing design (i.e. analytical...

  3. Structural analysis of salt cavities formed by solution mining: I. Method of analysis and preliminary results for spherical cavities

    International Nuclear Information System (INIS)

    Fossum, A.F.

    1976-01-01

    The primary objective of this effort is an analysis of the structural stability of cavities formed by solution mining in salt domes. In particular, the effects of depth (i.e. initial state of in situ stress), shape, volume (i.e. physical dimensions of the cavity), and sequence of salt excavation/fluid evacuation on the timewise structural stability of a cavity are of interest. It is anticipated that an assessment can be made of the interrelation between depth, cavern size, and cavern shape or of the practical limits therewith. In general, the cavity shape is assumed to be axisymmetric and the salt is assumed to exhibit nonlinear creep behavior. The primary emphasis is placed on the methodology of the finite element analysis, and the results of preliminary calculations for a spherically shaped cavity. It is common practice for engineers to apply elasticity theory to the behavior of rock in order to obtain near field stresses and displacements around an underground excavation in an effort to assess structural stability. Rock masses, particularly at depth, may be subjected to a rather complex state of initial stress, and may be nonhomogeneous and anisotropic. If one also includes complex geometrical excavation shape, the use of analytical techniques as an analysis tool is practically impossible. Thus, it is almost a necessity that approximate solution techniques be employed. In this regard, the finite element method is ideal as it can handle complex geometries and nonlinear material behavior with relative ease. An unusual feature of the present study is the incorporation into the finite element code of a procedure for handling the gradual creation or excavation of an underground cavity. During the excavation sequence, the salt is permitted to exhibit nonlinear stress-strain-time dependence. The bulk of this report will be devoted to a description of the analysis procedures, together with a preliminary calculation for a spherically shaped cavity

  4. A computed tomography-based spatial normalization for the analysis of [18F] fluorodeoxyglucose positron emission tomography of the brain.

    Science.gov (United States)

    Cho, Hanna; Kim, Jin Su; Choi, Jae Yong; Ryu, Young Hoon; Lyoo, Chul Hyoung

    2014-01-01

    We developed a new computed tomography (CT)-based spatial normalization method and a CT template, and demonstrated their usefulness for spatial normalization of positron emission tomography (PET) images in [(18)F] fluorodeoxyglucose (FDG) PET studies of healthy controls. Seventy healthy controls underwent brain CT (120 kVp, 180 mAs, 3 mm slice thickness) and [(18)F] FDG PET scans using a PET/CT scanner. T1-weighted magnetic resonance (MR) images were acquired for all subjects. By averaging skull-stripped and spatially normalized MR and CT images, we created skull-stripped MR and CT templates for spatial normalization. The skull-stripped MR and CT images were spatially normalized to each structural template. PET images were spatially normalized by applying the spatial transformation parameters used to normalize the skull-stripped MR and CT images. A conventional perfusion PET template was used for PET-based spatial normalization. Regional standardized uptake values (SUV) measured by overlaying the template volume of interest (VOI) were compared with those measured with FreeSurfer-generated VOIs (FSVOI). All three spatial normalization methods underestimated regional SUV values by 0.3-20% compared with those measured using FSVOI; the CT-based method showed a slightly greater underestimation bias. Regional SUV values derived from all three spatial normalization methods were significantly correlated. CT-based spatial normalization may be an alternative method for structure-based spatial normalization of [(18)F] FDG PET when MR imaging is unavailable. Therefore, it is useful for PET/CT studies with various radiotracers whose uptake is expected to be limited to specific brain regions or highly variable within the study population.

  5. Sensitivity analysis of respiratory parameter uncertainties: impact of criterion function form and constraints.

    Science.gov (United States)

    Lutchen, K R

    1990-08-01

    A sensitivity analysis based on weighted least-squares regression is presented to evaluate alternative methods for fitting lumped-parameter models to respiratory impedance data. The goal is to maintain parameter accuracy simultaneously with practical experiment design. The analysis focuses on predicting parameter uncertainties using a linearized approximation for joint confidence regions. Applications involve four-element parallel and viscoelastic models for 0.125- to 4-Hz data and a six-element model with separate tissue and airway properties for input and transfer impedance data from 2 to 64 Hz. The criterion function form was evaluated by comparing parameter uncertainties when data are fit as magnitude and phase, dynamic resistance and compliance, or real and imaginary parts of input impedance. The proper choice of weighting can make all three criterion variables comparable. For the six-element model, parameter uncertainties were predicted when both input impedance and transfer impedance are acquired and fit simultaneously. A fit to both data sets from 4 to 64 Hz could reduce parameter estimate uncertainties considerably from those achievable by fitting either alone. For the four-element models, use of an independent, but noisy, measure of static compliance was assessed as a constraint on model parameters. This may allow acceptable parameter uncertainties for a minimum frequency of 0.275-0.375 Hz rather than 0.125 Hz, reducing data acquisition requirements from a 16-s to a 5.33- to 8-s breath-holding period. These results are approximations, and the impact of using the linearized approximation for the confidence regions is discussed.
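The linearized joint-confidence-region approximation the abstract relies on can be sketched for the simplest case: in a weighted least-squares fit, the parameter covariance is approximated by (J^T W J)^(-1), where J is the model Jacobian and W the inverse noise covariance. The toy model below is a straight line, not the four- or six-element respiratory models of the paper; all names and values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.125, 4.0, 40)        # "frequency" points, Hz (illustrative)
sigma = 0.05 * np.ones_like(x)         # per-point measurement noise SD
true_a, true_b = 2.0, 0.5
y = true_a + true_b * x + rng.normal(0.0, sigma)

# Weighted least squares for the linear model y = a + b*x
J = np.column_stack([np.ones_like(x), x])   # Jacobian (exact for a linear model)
W = np.diag(1.0 / sigma**2)                 # weights = inverse noise variance
cov = np.linalg.inv(J.T @ W @ J)            # linearized parameter covariance
params = cov @ (J.T @ W @ y)                # WLS estimates of (a, b)
se = np.sqrt(np.diag(cov))                  # predicted 1-sigma uncertainties
```

Restricting the fitted frequency range (e.g., dropping the points below 0.275 Hz) enlarges `cov` in exactly the way the abstract's breath-hold trade-off describes: a narrower-band design makes J^T W J more ill-conditioned and the predicted uncertainties larger.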

  6. Sporadic and genetic forms of paediatric somatotropinoma: a retrospective analysis of seven cases and a review of the literature

    Directory of Open Access Journals (Sweden)

    Nozières Cécile

    2011-10-01

    Full Text Available Abstract Background Somatotropinoma, a pituitary adenoma characterised by excessive production of growth hormone (GH), is extremely rare in childhood. A genetic defect is evident in some cases; known genetic causes include multiple endocrine neoplasia type 1 (MEN1), Carney complex, McCune-Albright syndrome and, more recently identified, the aryl hydrocarbon receptor-interacting protein (AIP). We describe seven children with somatotropinoma, with a special focus on the differences between genetic and sporadic forms. Methods Seven children who presented in our regional network between 1992 and 2008 were included in this retrospective analysis. First-line therapy was somatostatin (SMS) analogues or transsphenoidal surgery. Control was defined as insulin-like growth factor-1 (IGF-1) levels within the normal range for the patient's age at 6 months after therapy, associated with decreasing tumour volume. Results Patients were aged 5-17 years and the majority (n = 6) were male. Four patients had an identified genetic mutation (McCune-Albright syndrome: n = 1; MEN1: n = 1; AIP: n = 2); the remaining three cases were sporadic. Accelerated growth rate was reported as the first clinical sign in four patients. Five patients presented with macroadenoma; invasion was noted in four of them (sporadic: n = 1; genetic: n = 3). Six patients were treated with SMS analogues; normalisation of IGF-1 occurred in one patient, who had a sporadic intrasellar macroadenoma. Multiple types of therapy were necessary in all patients with an identified genetic mutation (4 types: n = 1; 3 types: n = 2; 2 types: n = 1), whereas two of the three patients with sporadic somatotropinoma required only one type of therapy. Conclusions This is the first series that analyzes the therapeutic response of somatotropinoma in paediatric patients with identified genetic defects. We found that, in children, genetic somatotropinomas are more invasive than sporadic somatotropinomas. Furthermore

  7. Partner cooperation with decode-and-forward: Closed-form outage analysis and comparison

    KAUST Repository

    Benjillali, Mustapha

    2013-01-01

    In this paper, we investigate the outage performance of "partner cooperation" based on opportunistic Decode-and-Forward with constrained partial selection and reactive relaying strategies in dual-hop cooperative Nakagami-m fading links. The source/destination, which is based on the unique knowledge of local channel state information, selects the best relay to increase the chances of cooperation in both uplink and downlink communications when the direct link is also available. After deriving new expressions for the cumulative distribution functions of the variables of interest, the outage probability of the system is obtained in closed-form. We also derive the ε-outage capacity in different particular cases, and the obtained results - when the channel model is reduced to a Rayleigh fading - either are new or correspond to those previously obtained in other works. Simulation results confirm the accuracy of our analysis for a large selection of system and fading parameters and provide a new insight into the design and optimization of cooperative configurations. © 2012 IEEE.
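For the Rayleigh special case the authors mention, one textbook closed form is easy to verify numerically: with no direct link, a dual-hop decode-and-forward link is in outage when min(γ1, γ2) falls below the threshold, and exponentially distributed per-hop SNRs give P_out = 1 − exp(−γ_th(1/γ̄1 + 1/γ̄2)). This is a generic special case for comparison, not the paper's constrained-partial-selection result; all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
g1bar, g2bar, g_th = 10.0, 6.0, 2.0   # average hop SNRs and threshold (linear)

# Closed form: min of two independent exponentials is itself exponential
p_closed = 1.0 - np.exp(-g_th * (1.0 / g1bar + 1.0 / g2bar))

# Monte Carlo check of the same outage event
n = 200_000
g1 = rng.exponential(g1bar, n)        # Rayleigh fading -> exponential SNR
g2 = rng.exponential(g2bar, n)
p_mc = np.mean(np.minimum(g1, g2) < g_th)
```

The two estimates agree to within Monte Carlo error, which is the kind of simulation check the abstract describes for its more general Nakagami-m expressions.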

  8. Use of Emanation Thermal Analysis in the characterization of nuclear waste forms and their alteration products

    International Nuclear Information System (INIS)

    Balek, V; Malek, Z.; Banba, T.; Mitamura, H.; Vance, E.R.

    1999-01-01

    Emanation Thermal Analysis (ETA) was used for the characterization of the thermal behavior of two nuclear waste glasses, a basalt volcanic glass, and perovskite ceramics before and after hydrolytic treatment. The release of radon, formed by the spontaneous α-decay of 228Th and 224Ra and incorporated into the samples by recoil to a maximum depth of 100 nm from the surface, was measured during heating of the samples from 20 to 1200 °C and subsequent cooling. Temperatures of the annealing of surface roughness, micro-cracks and other defects, produced by manufacture and/or by subsequent treatment of the glass and ceramic samples, were determined using ETA. Microstructural changes of the glass corrosion products accompanying their dehydration and thermal decomposition were characterized by changes in the radon release rate. The effect of hydrolytic alteration on the thermal behavior of the nuclear waste glass was revealed by ETA at an early corrosion stage. In the alteration product of the perovskite ceramics, the diffusion mobility of radon was assessed in the temperature range 1000-1200 °C. The thermal stability of radiation-induced defects in perovskite ceramic powder bombarded by He+ ions to doses of 10^14 and 10^16 ions/cm^2 was determined by means of ETA. (author)

  9. Automated image analysis of lateral lumbar X-rays by a form model

    International Nuclear Information System (INIS)

    Mahnken, A.H.; Kohnen, M.; Steinberg, S.; Wein, B.B.; Guenther, R.W.

    2001-01-01

    Development of software for fully automated image analysis of lateral lumbar spine X-rays. Material and method: Using the concept of active shape models, we developed software that produces a form model of the lumbar spine from lateral lumbar spine radiographs and runs an automated image segmentation. This model is able to detect lumbar vertebrae automatically after filtering of the digitized X-ray images. The model was trained with 20 lateral lumbar spine radiographs with no pathological findings before we evaluated the software with 30 further X-ray images, which were sorted by image quality ranging from one (best) to three (worst), with 10 images in each quality group. Results: Image recognition strongly depended on image quality. In group one, 52 and in group two, 51 out of 60 vertebral bodies (including the sacrum) were recognized, but in group three only 18 vertebral bodies were properly identified. Conclusion: Fully automated and reliable recognition of vertebral bodies from lateral spine radiographs using the concept of active shape models is possible. The precision of this technique is limited by the superposition of different structures. Further improvements are necessary; standardized image quality and enlargement of the training data set are therefore required. (orig.)

  10. Microbiological assay for the analysis of certain macrolides in pharmaceutical dosage forms.

    Science.gov (United States)

    Mahmoudi, A; Fourar, R E-A; Boukhechem, M S; Zarkout, S

    2015-08-01

    Clarithromycin (CLA) and roxithromycin (ROX) are macrolide antibiotics with an expanded spectrum of activity that are commercially available as tablets. A microbiological assay, applying the cylinder-plate method and using a strain of Micrococcus luteus ATCC 9341 as the test organism, was used and validated for the quantification of the two macrolide drugs CLA and ROX in pure and pharmaceutical formulations. The proposed method was validated for linearity, precision, accuracy and specificity. The linear dynamic range was 0.1 to 0.5 μg/mL for both compounds. A logarithmic calibration curve was obtained for each macrolide (r > 0.989), with statistically equal slopes varying from 3.275 to 4.038 and a percentage relative standard deviation in the range of 0.24-0.92%. Moreover, the method was applied successfully to the assay of the studied drugs in pharmaceutical tablet dosage forms. Recovery in standard-addition experiments on commercial products was 94.71-96.91% for clarithromycin and 93.94-98.12% for roxithromycin, with a precision (%RSD) of 1.32-2.11%. Accordingly, this microbiological assay can be used for routine quality control analysis of the titled drugs in tablet formulations. Copyright © 2015 Elsevier B.V. All rights reserved.
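The logarithmic calibration curve described above maps inhibition-zone size to the logarithm of concentration, and recovery is computed by inverting that line for a spiked sample. A sketch with invented zone readings (the slope range 3.275-4.038 comes from the abstract, but every number below is hypothetical):

```python
import numpy as np

# Hypothetical cylinder-plate readings: zone diameter vs standard concentration
conc = np.array([0.1, 0.2, 0.3, 0.4, 0.5])       # µg/mL standards
zone = np.array([14.2, 15.3, 16.0, 16.4, 16.8])  # mm (invented values)

# Logarithmic calibration: zone = slope * log10(conc) + intercept
slope, intercept = np.polyfit(np.log10(conc), zone, 1)

def estimate_conc(zone_mm):
    """Invert the calibration line to estimate an unknown's concentration."""
    return 10 ** ((zone_mm - intercept) / slope)

# Recovery for a spiked sample nominally at 0.3 µg/mL that reads 16.0 mm
recovery_pct = 100.0 * estimate_conc(16.0) / 0.3
```

A recovery near 100% with small %RSD across replicates is what qualifies the assay for routine quality control, as the abstract reports.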

  11. Pros and cons of rotating ground motion records to fault-normal/parallel directions for response history analysis of buildings

    Science.gov (United States)

    Kalkan, Erol; Kwong, Neal S.

    2014-01-01

    According to the regulatory building codes in the United States (e.g., 2010 California Building Code), at least two horizontal ground motion components are required for three-dimensional (3D) response history analysis (RHA) of building structures. For sites within 5 km of an active fault, these records should be rotated to fault-normal/fault-parallel (FN/FP) directions, and two RHAs should be performed separately (when FN and then FP are aligned with the transverse direction of the structural axes). It is assumed that this approach will lead to two sets of responses that envelope the range of possible responses over all nonredundant rotation angles. This assumption is examined here, for the first time, using a 3D computer model of a six-story reinforced-concrete instrumented building subjected to an ensemble of bidirectional near-fault ground motions. Peak values of engineering demand parameters (EDPs) were computed for rotation angles ranging from 0 through 180° to quantify the difference between peak values of EDPs over all rotation angles and those due to FN/FP direction rotated motions. It is demonstrated that rotating ground motions to FN/FP directions (1) does not always lead to the maximum responses over all angles, (2) does not always envelope the range of possible responses, and (3) does not provide maximum responses for all EDPs simultaneously even if it provides a maximum response for a specific EDP.
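The rotation at issue is a plane rotation of the two recorded horizontal components, and the study's finding is that the fault-normal angle need not maximize a demand parameter. The sketch below uses the peak value of a rotated synthetic component as a stand-in for an EDP (real RHA uses nonlinear structural responses), with a hypothetical fault-normal angle:

```python
import numpy as np

rng = np.random.default_rng(3)
ax = rng.normal(size=2000)   # synthetic horizontal component 1 (not a real record)
ay = rng.normal(size=2000)   # synthetic horizontal component 2

def peak_at_angle(theta_rad):
    """Peak absolute value of the horizontal component rotated by theta."""
    rotated = ax * np.cos(theta_rad) + ay * np.sin(theta_rad)
    return np.abs(rotated).max()

# Nonredundant rotation angles: 0 to 180 degrees, as in the study
angles = np.radians(np.arange(0.0, 180.0, 1.0))
peaks = np.array([peak_at_angle(t) for t in angles])

theta_fn = np.radians(30.0)  # hypothetical fault-normal direction
fn_peak = peak_at_angle(theta_fn)
```

With such signals, `fn_peak` is bounded by `peaks.max()` and is generally strictly below it, which mirrors the paper's point (1): the FN-rotated response does not automatically envelope the maximum over all rotation angles.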

  12. The Amounts of As, Au, Br, Cu, Fe, Mo, Se and Zn in Normal and Uraemic Human whole Blood. A. Comparison by Means of Neutron Activation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Brune, D; Samsahl, K [AB Atomenergi, Nykoeping (Sweden); Wester, P O [Dept. of Medicine, Karolinska Inst., Serafimerlasarettet, Stockholm (Sweden)

    1964-01-15

    Quantitative determination of the elements As, Au, Br, Cu, Fe, Mo, Se and Zn has been performed in normal and uraemic human whole blood by means of H₂SO₄-H₂O₂ digestion, distillation and ion exchange, combined with gamma-spectrometric analysis. The uraemic blood was found to contain about 10 times as much As and twice as much Mo as did the normal blood. As regards Fe, the uraemic blood contained slightly less than the normal blood. For the other elements there was no detectable difference.

  13. Malware Normalization

    OpenAIRE

    Christodorescu, Mihai; Kinder, Johannes; Jha, Somesh; Katzenbeisser, Stefan; Veith, Helmut

    2005-01-01

    Malware is code designed for a malicious purpose, such as obtaining root privilege on a host. A malware detector identifies malware and thus prevents it from adversely affecting a host. In order to evade detection by malware detectors, malware writers use various obfuscation techniques to transform their malware. There is strong evidence that commercial malware detectors are susceptible to these evasion tactics. In this paper, we describe the design and implementation of a malware normalizer ...

  14. Triple SILAC quantitative proteomic analysis reveals differential abundance of cell signaling proteins between normal and lung cancer-derived exosomes.

    Science.gov (United States)

    Clark, David J; Fondrie, William E; Yang, Austin; Mao, Li

    2016-02-05

    Exosomes are 30-100 nm sized membrane vesicles released by cells into the extracellular space that mediate intercellular communication via transfer of proteins and other biological molecules. To better understand the role of these microvesicles in lung carcinogenesis, we employed a Triple SILAC quantitative proteomic strategy to examine the differential protein abundance between exosomes derived from an immortalized normal bronchial epithelial cell line and two non-small cell lung cancer (NSCLC) cell lines harboring distinct activating mutations in the cell signaling molecules: Kirsten rat sarcoma viral oncogene homolog (KRAS) or epidermal growth factor receptor (EGFR). In total, we were able to quantify 721 exosomal proteins derived from the three cell lines. Proteins associated with signal transduction, including EGFR, GRB2 and SRC, were enriched in NSCLC exosomes, and could actively regulate cell proliferation in recipient cells. This study's investigation of the NSCLC exosomal proteome has identified enriched protein cargo that can contribute to lung cancer progression, which may have potential clinical implications in biomarker development for patients with NSCLC. The high mortality associated with lung cancer is a result of late-stage diagnosis of the disease. Current screening techniques used for early detection of lung cancer lack the specificity for accurate diagnosis. Exosomes are nano-sized extracellular vesicles, and the increased abundance of select protein cargo in exosomes derived from cancer cells may be used for diagnostic purposes. In this paper, we applied quantitative proteomic analysis to elucidate abundance differences in exosomal protein cargo between two NSCLC cell lines with distinctive oncogene mutations and an immortalized normal bronchial epithelial cell line. This study revealed proteins associated with cell adhesion, the extracellular matrix, and a variety of signaling molecules were enriched in NSCLC exosomes. 
The present data reveals

  15. Designing experiments for maximum information from cyclic oxidation tests and their statistical analysis using half Normal plots

    International Nuclear Information System (INIS)

    Coleman, S.Y.; Nicholls, J.R.

    2006-01-01

    Cyclic oxidation testing at elevated temperatures requires careful experimental design and the adoption of standard procedures to ensure reliable data. This is a major aim of the 'COTEST' research programme. Further, as such tests are both time consuming and costly in terms of human effort when measurements are taken over a large number of cycles, it is important to gain maximum information from a minimum number of tests (trials). This search for standardisation of cyclic oxidation conditions leads to a series of tests to determine the relative effects of cyclic parameters on the oxidation process. Following a review of the available literature, databases and the experience of partners to the COTEST project, the most influential parameters, namely upper dwell temperature (oxidation temperature) and time (hot time), lower dwell time (cold time) and environment, were investigated in partners' laboratories. It was decided to test upper dwell temperature at 3 levels, at and equidistant from a reference temperature; to test upper dwell time at a reference, a higher and a lower time; to test lower dwell time at a reference and a higher time; and to test wet and dry environments. Thus an experiment, consisting of nine trials, was designed according to statistical criteria. The results of the trials were analysed statistically to test the main linear and quadratic effects of upper dwell temperature and hot time, and the main effects of lower dwell time (cold time) and environment. The nine trials are a quarter fraction of the 36 possible combinations of parameter levels that could have been studied. The results have been analysed by half Normal plots, as there are only 2 degrees of freedom for the experimental error variance, which is rather low for a standard analysis of variance. Half Normal plots give a visual indication of which factors are statistically significant.
In this experiment each trial has 3 replications, and the data are analysed in terms of mean mass change, oxidation kinetics
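
    A half-normal plot of effect estimates like the one described can be sketched as follows. The effect values are invented for illustration; the quantiles use the standard (i − 0.5)/m plotting positions on the folded normal.

```python
import numpy as np
from statistics import NormalDist

# Hypothetical absolute effect estimates from a small factorial experiment
# (illustrative values only, not COTEST data).
effects = {"T_upper": 8.1, "t_hot": 5.2, "t_cold": 0.9,
           "env": 0.7, "T_upper^2": 0.4, "t_hot^2": 0.3}

# Order the factors by the magnitude of their estimated effects.
names = sorted(effects, key=lambda k: abs(effects[k]))
abs_eff = np.array([abs(effects[n]) for n in names])

# Half-normal quantiles for the ordered |effects|: plotting position
# P_i = (i - 0.5)/m mapped through the inverse CDF of the folded normal.
m = len(abs_eff)
probs = (np.arange(1, m + 1) - 0.5) / m
half_normal_q = np.array([NormalDist().inv_cdf(0.5 + p / 2.0) for p in probs])

# Points on a straight line through the origin behave like noise; points
# lying far above that line flag statistically significant factors.
for n, q, e in zip(names, half_normal_q, abs_eff):
    print(f"{n:10s} quantile={q:5.2f} |effect|={e:4.1f}")
```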

  16. Sheath liquid interface for the coupling of normal-phase liquid chromatography with electrospray mass spectrometry and its application to the analysis of neoflavonoids.

    Science.gov (United States)

    Charles, Laurence; Laure, Frédéric; Raharivelomanana, Phila; Bianchini, Jean-Pierre

    2005-01-01

    A novel interface that allows normal-phase liquid chromatography to be coupled with electrospray ionization (ESI) is reported. A make-up solution of 60 mM ammonium acetate in methanol, infused at a 5 microl min(-1) flow-rate at the tip of the electrospray probe, provides a sheath liquid which is poorly miscible with the chromatographic effluent, but promotes efficient ionization of the targeted analytes. Protonated molecules generated in the ESI source were subjected to tandem mass spectrometric experiments in a triple-quadrupole mass spectrometer. The main fragmentation reactions were characterized for each analyte and specific mass spectral transitions were used to acquire chromatographic data in the multiple reaction monitoring detection mode. Results obtained during optimization of the sheath liquid composition and flow-rate suggest that the electrospray process was mainly under the control of the make-up solution, and that it forms an external charged layer around a neutral chromatographic mobile phase core. This sheath liquid interface was implemented for the analysis of some neoflavonoid compounds and its performance was evaluated. Limits of detection were established for calophillolide, inophyllum B, inophyllum P and inophyllum C at 100, 25, 15 and 100 ng ml(-1), respectively.

  17. An approach to eliminate stepped features in multistage incremental sheet forming process: Experimental and FEA analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nirala, Harish Kumar; Jain, Prashant K.; Tandon, Puneet [PDPM Indian Institute of Information Technology, Design and Manufacturing Jabalpur Jabalpur-482005, Madhya Pradesh (India); Roy, J. J.; Samal, M. K. [Bhabha Atomic Research Centre, Mumbai (India)

    2017-02-15

    Incremental sheet forming (ISF) is a recently developed manufacturing technique. In ISF, forming is done by applying the deformation force through the motion of a numerically controlled (NC) single-point forming tool on a clamped sheet-metal blank. Single-point incremental sheet forming (SPISF) is also known as a die-less forming process because no die is required to fabricate a component. It is now widely accepted for rapid manufacturing of sheet-metal components. The formability of the SPISF process is improved by adding intermediate stages, which is known as the multi-stage SPISF (MSPISF) process. However, the intermediate stages of the MSPISF process generate stepped features during forming. This paper investigates the generation of stepped features with simulation and experimental results. An effective MSPISF strategy is proposed to eliminate these undesirable stepped features.

  18. The Job Dimensions Underlying the Job Elements of the Position Analysis Questionnaire (PAQ) (Form B). Report No. 4.

    Science.gov (United States)

    Marquardt, Lloyd D.; McCormick, Ernest J.

    This study was concerned with the identification of the job dimensions underlying the job elements of the Position Analysis Questionnaire (PAQ), Form B. The PAQ is a structured job analysis instrument consisting of 187 worker-oriented job elements which are divided into six a priori major divisions. The statistical procedure of principal components…

  19. Safety analysis report for packaging: the ORNL DOT specification 6M - special form package

    Energy Technology Data Exchange (ETDEWEB)

    Schaich, R.W.

    1982-07-01

    The ORNL DOT Specification 6M - Special Form Package was fabricated at the Oak Ridge National Laboratory (ORNL) for the transport of Type B solid non-fissile radioactive materials in special form. The package was evaluated on the basis of tests performed by the Dow Chemical Company, Rocky Flats Division, on the DOT-6M container and special form tests performed on a variety of stainless steel capsules at ORNL by Operations Division personnel. The results of these evaluations demonstrate that the package is in compliance with the applicable regulations for the transport of Type B quantities in special form of non-fissile radioactive materials.

  20. Safety analysis report for packaging: the ORNL DOT Specification 20WC-5 - special form packaging

    International Nuclear Information System (INIS)

    Schaich, R.W.

    1982-10-01

    The ORNL DOT Specification 20WC-5 - Special Form Package was fabricated for the transport of large quantities of solid nonfissile radioactive materials in special form. The package was evaluated on the basis of tests performed at Sandia National Laboratories, Albuquerque, New Mexico on an identical fire and impact shield and special form tests performed on a variety of stainless steel capsules at ORNL by Operations Division personnel. The results of these evaluations demonstrate that the package is in compliance with the applicable regulations for the transport of large quantities of nonfissile radioactive materials in special form

  1. Safety analysis report for packaging: the ORNL DOT specification 6M - special form package

    International Nuclear Information System (INIS)

    Schaich, R.W.

    1982-07-01

    The ORNL DOT Specification 6M - Special Form Package was fabricated at the Oak Ridge National Laboratory (ORNL) for the transport of Type B solid non-fissile radioactive materials in special form. The package was evaluated on the basis of tests performed by the Dow Chemical Company, Rocky Flats Division, on the DOT-6M container and special form tests performed on a variety of stainless steel capsules at ORNL by Operations Division personnel. The results of these evaluations demonstrate that the package is in compliance with the applicable regulations for the transport of Type B quantities in special form of non-fissile radioactive materials.

  2. Corpus luteum blood flow in normal and abnormal early pregnancy: evaluation and analysis with transvaginal color and pulsed doppler sonography

    International Nuclear Information System (INIS)

    Tang Xiaoyi; Lin Meifang; Zheng Meirong; Liang Xiaoxian; Liu Jianfeng

    2005-01-01

    Objective: To detect and assess corpus luteum blood flow in normal and abnormal early pregnancy. Methods: Using transvaginal color and pulsed Doppler sonography, we examined 215 pregnant women, including 150 normal intrauterine pregnancies, 25 abortions and 29 ectopic pregnancies, and recorded the corpus luteum blood flow features and blood flow indexes (Vmax, RI and PI). Results: 1) The corpus luteum was successfully identified in 148 of 150 normal early pregnancies, 25 of 26 threatened abortions, and 22 of 29 ectopic pregnancies. 2) The three groups shared the same color Doppler imaging feature: a circumferential rim around the entire corpus luteum. 3) Mean Vmax, RI and PI showed no statistical difference between normal and abnormal early pregnancy; mean Vmax was lower in ectopic pregnancy than in normal pregnancy (P<0.05), while PI and RI showed no characteristic difference in the ectopic pregnancy group compared with the normal pregnancy group. Conclusion: The corpus luteum can be precisely identified in most pregnancies using transvaginal color Doppler and manifests a characteristic rim on Doppler imaging. Vmax may help in differentiating ectopic pregnancy from normal early pregnancy. (authors)

  3. Natural Disasters under the Form of Severe Storms in Europe: the Cause-Effect Analysis

    Directory of Open Access Journals (Sweden)

    Virginia Câmpeanu

    2009-07-01

    Full Text Available For more than 100 years, from 1900 to 2008, there were almost 400 storm natural disasters in Europe, 40% of which occurred in the 1990s. International prognoses for world weather suggest a tendency toward increasing frequency and intensity of severe storms as the climate warms. In these circumstances, a natural question for a researcher in environmental economics is whether people can contribute to reducing the frequency and magnitude of severe storms that produce disastrous social and economic effects, by acting on their causes. In researching an answer to support public policies in the field, a cause-effect analysis applied to Europe might make a contribution to the literature, especially considering that the international literature on the factors influencing global warming contains certainties with regard to the natural factors of influence, but declared uncertainty or skepticism with regard to anthropogenic ones. Skepticism, and even tension, arose during the international negotiations in Copenhagen (December 2009) on the agreement for limiting global warming, with doubts raised about the methods used by the experts of the International Climate Experts Group (GIEC), and thus about the results obtained, which served as a basis for the negotiations. The criticism concerned the form and, at times, the content; what was contested during the negotiations was not the phenomenon of global warming but the methods of calculation. The methodology relies on qualitative (top-down) and quantitative (bottom-up correlation) cause-effect analysis of the storm disasters in Europe. Based on the instruments used, we propose a dynamic model, in three variants, associating the evolution of storm disasters in Europe with anthropogenic factors. Results: the cause-effect diagram (Ishikawa or fishbone diagram) and quantitative correlation of sub

  4. A task specific uncertainty analysis method for least-squares-based form characterization of ultra-precision freeform surfaces

    International Nuclear Information System (INIS)

    Ren, M J; Cheung, C F; Kong, L B

    2012-01-01

    In the measurement of ultra-precision freeform surfaces, least-squares-based form characterization methods are widely used to evaluate the form error of the measured surfaces. Although many methodologies have been proposed in recent years to improve the efficiency of the characterization process, relatively little research has been conducted on the analysis of the associated uncertainty in the characterization results which may result from the characterization methods being used. As a result, this paper presents a task specific uncertainty analysis method with application in the least-squares-based form characterization of ultra-precision freeform surfaces. That is, the associated uncertainty in the form characterization results is estimated when the measured data are extracted from a specific surface with a specific sampling strategy. Three factors are considered in this study: measurement error, surface form error and sample size. The task specific uncertainty analysis method has been evaluated through a series of experiments. The results show that the task specific uncertainty analysis method can effectively estimate the uncertainty of the form characterization results for a specific freeform surface measurement.
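
    A minimal Monte Carlo version of such a task-specific uncertainty estimate might look like the sketch below: it fits a least-squares plane (standing in for a freeform reference surface) and propagates an assumed measurement-noise level through the peak-to-valley form error. All surface, sampling and noise parameters are hypothetical, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sampled grid on a nominal surface; a tilted plane stands in for a freeform.
x, y = np.meshgrid(np.linspace(0, 1, 15), np.linspace(0, 1, 15))
x, y = x.ravel(), y.ravel()
true_z = 0.002 * x + 0.001 * y        # nominal tilt
form_dev = 0.0005 * np.sin(6 * x)     # genuine form deviation

sigma_meas = 0.0002                   # assumed measurement noise (same units)

def pv_form_error(z):
    """Least-squares plane fit, then peak-to-valley of the residuals."""
    A = np.column_stack([x, y, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    resid = z - A @ coef
    return resid.max() - resid.min()

# Monte Carlo the noise to get a task-specific uncertainty for this surface,
# this sampling strategy and this noise level.
trials = [pv_form_error(true_z + form_dev + sigma_meas * rng.standard_normal(x.size))
          for _ in range(500)]
pv_mean, pv_std = np.mean(trials), np.std(trials, ddof=1)
```

    Here `pv_std` plays the role of the task-specific uncertainty: it depends jointly on the measurement error, the surface's own form deviation and the sample size, as the paper's three factors suggest.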

  5. A Multilevel Cross-National Analysis of Direct and Indirect Forms of School Violence

    Science.gov (United States)

    Agnich, Laura E.; Miyazaki, Yasuo

    2013-01-01

    The detrimental effects of school violence on students' physical and emotional health are well studied, and research has shown that school violence affects students in every nation across the globe. However, few cross-national studies have compared direct, physical forms of school violence to indirect, emotional forms such as teasing. Using…

  6. Forms and Functions of Aggression in Adolescent Friendship Selection and Influence : A Longitudinal Social Network Analysis

    NARCIS (Netherlands)

    Sijtsema, Jelle J.; Ojanen, Tiina; Veenstra, Rene; Lindenberg, Siegwart; Hawley, Patricia H.; Little, Todd D.

    Aggressive children are known to have friends. However, less is known about the impact of aggression on friendship development and how this can differ for overt and relational (i.e., the forms) and instrumental and reactive (i.e., the functions) aggression. This longitudinal study utilized the forms

  7. Forms of Address in Post-Revolutionary Iranian Persian: A Sociolinguistic Analysis.

    Science.gov (United States)

    Keshavarz, Mohammad Hossein

    1988-01-01

    Provides a sociolinguistic account of the forms of address used in present-day Iranian Persian. The shift from power to solidarity as a result of the Islamic Revolution has resulted in a sociolinguistic simplification of address forms. (Author/CB)

  8. Normal accidents

    International Nuclear Information System (INIS)

    Perrow, C.

    1989-01-01

    The author has chosen numerous concrete examples to illustrate the hazardousness inherent in high-risk technologies. Starting with the TMI reactor accident in 1979, he shows that it is not only the nuclear energy sector that bears the risk of 'normal accidents', but also quite a number of other technologies and industrial sectors, or research fields. The author refers to the petrochemical industry, shipping, air traffic, large dams, mining activities, and genetic engineering, showing that due to the complexity of the systems and their manifold, rapidly interacting processes, accidents happen that cannot be thoroughly calculated, and hence are unavoidable. (orig./HP) [de

  9. What Are Normal Metal Ion Levels After Total Hip Arthroplasty? A Serologic Analysis of Four Bearing Surfaces.

    Science.gov (United States)

    Barlow, Brian T; Ortiz, Philippe A; Boles, John W; Lee, Yuo-Yu; Padgett, Douglas E; Westrich, Geoffrey H

    2017-05-01

    The recent experiences with adverse local tissue reactions have highlighted the need to establish normal serum levels of cobalt (Co), chromium (Cr), and titanium (Ti) after hip arthroplasty. Serum Co, Cr, and Ti levels were measured in 80 nonconsecutive patients with well-functioning unilateral total hip arthroplasty and compared among 4 bearing surfaces: ceramic-on-ceramic (CoC), ceramic-on-polyethylene (CoP), metal-on-polyethylene (MoP), and dual mobility (DM). The preoperative and most recent University of California, Los Angeles (UCLA) and Western Ontario and McMaster Universities Arthritis Index (WOMAC) scores were compared among the different bearing surfaces. No significant difference was found in serum Co and Cr levels between the 4 bearing surface groups (P = .0609 and P = .1577). Secondary analysis comparing metal and ceramic femoral heads demonstrated that the metal group (MoP, modular dual mobility [MDM; Stryker Orthopedics, Mahwah, NJ] with metal heads) had significantly higher serum Co levels than the ceramic group (CoC, CoP, MDM with ceramic heads) (1.05 ± 1.25 μg/L vs 0.59 ± 0.24 μg/L; P = .0411). The Spearman coefficient identified no correlation between metal ion levels and patient-reported outcome scores. No serum metal ion level differences were found among well-functioning total hip arthroplasties with modern bearing couples. Significantly higher serum Co levels were seen when comparing metal vs ceramic femoral heads in this study, which warrants further investigation. Metal ion levels did not correlate with patient-reported outcome measures. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Multiscale virtual particle based elastic network model (MVP-ENM) for normal mode analysis of large-sized biomolecules.

    Science.gov (United States)

    Xia, Kelin

    2017-12-20

    In this paper, a multiscale virtual particle based elastic network model (MVP-ENM) is proposed for the normal mode analysis of large-sized biomolecules. The multiscale virtual particle (MVP) model is proposed for the discretization of biomolecular density data. With this model, large-sized biomolecular structures can be coarse-grained into virtual particles such that a balance between model accuracy and computational cost can be achieved. An elastic network is constructed by assuming "connections" between virtual particles. The connection is described by a special harmonic potential function, which considers the influence from both the mass distributions and distance relations of the virtual particles. Two independent models, i.e., the multiscale virtual particle based Gaussian network model (MVP-GNM) and the multiscale virtual particle based anisotropic network model (MVP-ANM), are proposed. It has been found that in the Debye-Waller factor (B-factor) prediction, the results from our MVP-GNM with a high resolution are as good as the ones from GNM. Even with low resolutions, our MVP-GNM can still capture the global behavior of the B-factor very well, with mismatches predominantly from the regions with large B-factor values. Further, it has been demonstrated that the low-frequency eigenmodes from our MVP-ANM are highly consistent with the ones from ANM even with very low resolutions and a coarse grid. Finally, the great advantage of the MVP-ANM model for large-sized biomolecules has been demonstrated using two poliovirus structures. The paper ends with a conclusion.
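
    The elastic-network machinery underlying such models can be illustrated with a classical GNM on coarse-grained nodes; the MVP variants additionally weight connections by the mass distribution, which is omitted here. The coordinates and cutoff below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
# Coarse-grained node coordinates (e.g., one virtual particle per residue);
# random points stand in for a real structure.
coords = rng.uniform(0, 20, size=(60, 3))
cutoff = 10.0  # contact cutoff, same length units as the coordinates

# Kirchhoff (connectivity) matrix of a classical GNM: off-diagonal -1 for
# node pairs within the cutoff, diagonal = node degree.
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
K = -(d < cutoff).astype(float)
np.fill_diagonal(K, 0.0)
np.fill_diagonal(K, -K.sum(axis=1))

# Normal modes are the eigenvectors of K; the zero mode is rigid-body motion.
vals, vecs = np.linalg.eigh(K)

# Predicted B-factors are proportional to the diagonal of the pseudo-inverse.
Kinv = np.linalg.pinv(K)
b_pred = np.diag(Kinv)
```

    Comparing `b_pred` against experimental B-factors (up to a fitted scale factor) is exactly the kind of test the paper runs for MVP-GNM at varying resolutions.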

  11. Differences in bone mineral density between normal-weight children and children with overweight and obesity: a systematic review and meta-analysis.

    Science.gov (United States)

    van Leeuwen, J; Koes, B W; Paulis, W D; van Middelkoop, M

    2017-05-01

    This study examines the differences in bone mineral density between normal-weight children and children with overweight or obesity. A systematic review and meta-analysis of observational studies (published up to 22 June 2016) on the differences in bone mineral density between normal-weight children and overweight and obese children was performed. Results were pooled when possible and mean differences (MDs) were calculated between normal-weight and overweight and normal-weight and obese children for bone content and density measures at different body sites. Twenty-seven studies, with a total of 5,958 children, were included. There was moderate and high quality of evidence that overweight (MD 213 g; 95% confidence interval [CI] 166, 261) and obese children (MD 329 g; 95%CI [229, 430]) have a significantly higher whole body bone mineral content than normal-weight children. Similar results were found for whole body bone mineral density. Sensitivity analysis showed that the association was stronger in girls. Overweight and obese children have a significantly higher bone mineral density compared with normal-weight children. Because there was only one study included with a longitudinal design, the long-term impact of childhood overweight and obesity on bone health at adulthood is not clear. © 2017 World Obesity Federation.
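
    The pooled mean differences reported here come from standard inverse-variance meta-analysis, which can be sketched as follows. The per-study numbers are invented for illustration and are not the review's data.

```python
import numpy as np

# Hypothetical per-study mean differences in whole-body bone mineral content
# (grams, overweight minus normal-weight) with their standard errors;
# illustrative values only.
md = np.array([190.0, 225.0, 240.0, 205.0])
se = np.array([30.0, 25.0, 40.0, 35.0])

# Fixed-effect inverse-variance pooling: weight each study by 1/SE^2.
w = 1.0 / se**2
pooled_md = np.sum(w * md) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))

# 95% confidence interval for the pooled mean difference.
ci_low = pooled_md - 1.96 * pooled_se
ci_high = pooled_md + 1.96 * pooled_se
```

    A random-effects model (as typically used when heterogeneity is present) would additionally inflate the weights by a between-study variance estimate, but the pooling arithmetic is the same.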

  12. Microstructural analysis of hot press formed 22MnB5 steel

    Science.gov (United States)

    Aziz, Nuraini; Aqida, Syarifah Nur; Ismail, Izwan

    2017-10-01

    This paper presents a microstructural study of hot press formed 22MnB5 steel for enhanced mechanical properties. The hot press forming process consists of simultaneous forming and quenching of a heated blank. The 22MnB5 steel was processed using a 3³ full factorial design of experiment (DOE) with three factors, quenching time, water temperature and water flow rate, each at three levels. The factor levels were a quenching time of 5 - 11 s, water temperature of 5 - 27 °C and water flow rate of 20 - 40 L/min. The as-received and hot press formed steel was characterised for metallographic study and martensitic structure area percentage using a JEOL field emission scanning electron microscope (FESEM). From the experimental findings, the hot press formed 22MnB5 steel consisted of 50 to 84% martensitic structure area. A minimum quenching time of 8 s was required to obtain formed samples with a high percentage of martensite. These findings contribute to the initial design of processing parameters in hot press forming of 22MnB5 steel blanks for automotive components.

  13. Integrated Analysis of Flow, Form, and Function for River Management and Design Testing

    Science.gov (United States)

    Lane, B. A. A.; Pasternack, G. B.; Sandoval Solis, S.

    2017-12-01

    Rivers are highly complex, dynamic systems that support numerous ecosystem functions including transporting sediment, modulating biogeochemical processes, and regulating habitat availability for native species. The extent and timing of these functions is largely controlled by the interplay of hydrologic dynamics (i.e. flow) and the shape and composition of the river corridor (i.e. form). This study applies synthetic channel design to the evaluation of river flow-form-function linkages, with the aim of evaluating these interactions across a range of flows and forms to inform process-driven management efforts with limited data and financial requirements. In an application to California's Mediterranean-montane streams, the interacting roles of channel form, water year type, and hydrologic impairment were evaluated across a suite of ecosystem functions related to hydrogeomorphic processes, aquatic habitat, and riparian habitat. Channel form acted as the dominant control on hydrogeomorphic processes considered, while water year type controlled salmonid habitat functions. Streamflow alteration for hydropower increased redd dewatering risk and altered aquatic habitat availability and riparian recruitment dynamics. Study results highlight critical tradeoffs in ecosystem function performance and emphasize the significance of spatiotemporal diversity of flow and form at multiple scales for maintaining river ecosystem integrity. The approach is broadly applicable and extensible to other systems and ecosystem functions, where findings can be used to characterize complex controls on river ecosystems, assess impacts of proposed flow and form alterations, and inform river restoration strategies.

  14. [Current situations and problem analysis of influencing factors of traditional Chinese medicine tablets on forming quality].

    Science.gov (United States)

    Li, Yan-Nian; Wu, Zhen-Feng; Wan, Na; Li, Yuan-Hui; Li, Hui-Ting; Yang, Ming

    2018-04-01

    Tablet compression is an essential unit operation in the preparation of traditional Chinese medicine tablets, as well as a complicated process. It is therefore of great significance to comprehensively study the factors influencing the forming process. This paper reviews the evaluation methods for tablet forming quality and highlights the effects of material powder properties, excipients and preparation technology on the quality of traditional Chinese medicine tablets on the basis of the relevant literature. Furthermore, the common problems in the tablet forming process are analyzed to provide useful references for the development of the forming quality of traditional Chinese medicine tablets. Copyright© by the Chinese Pharmaceutical Association.

  15. Vibrational spectra, molecular structure, natural bond orbital, first order hyperpolarizability, thermodynamic analysis and normal coordinate analysis of Salicylaldehyde p-methylphenylthiosemicarbazone by density functional method

    Science.gov (United States)

    Porchelvi, E. Elamurugu; Muthu, S.

    2015-01-01

    The thiosemicarbazone compound salicylaldehyde p-methylphenylthiosemicarbazone (abbreviated as SMPTSC) was synthesized and characterized by FTIR, FT-Raman and UV. Density functional theory (DFT) calculations have been carried out for the title compound at the B3LYP/6-31++G(d,p) level. The molecular geometry and vibrational frequencies were calculated and compared with the experimental data. A detailed interpretation of the vibrational spectra has been carried out with the aid of normal coordinate analysis (NCA) following the scaled quantum mechanical force field methodology. The electronic dipole moment (μD) and first hyperpolarizability (βtot) values of the investigated molecule were computed using DFT (B3LYP) with the 6-311++G(d,p) basis set. The stability and charge delocalization of the molecule were studied by natural bond orbital (NBO) analysis. The aromaticities of the phenyl rings were studied using the standard harmonic oscillator model of aromaticity (HOMA) index. A Mulliken population analysis of atomic charges was also carried out. Molecular orbital contributions were studied using the density of states (DOS).

  16. Analysis of reactive aldehydes formed from the irradiated skin lipid, triolein

    International Nuclear Information System (INIS)

    Niyati-Shirkhodaee, F.; Shibamoto, Y.

    1992-01-01

    One of the major skin lipids, triolein, was irradiated with 300 nm UV light, under conditions approximating those at a skin surface exposed to sunlight, for different periods of time. Irradiated samples were analyzed for acrolein, formaldehyde, and acetaldehyde by gas chromatography. The acrolein formed was derivatized to the more stable 1-methyl-2-pyrazoline with N-methylhydrazine and analyzed with a nitrogen-phosphorus specific detector. The formaldehyde and acetaldehyde formed were reacted with cysteamine to give thiazolidine and 2-methylthiazolidine, respectively, and analyzed with a flame photometric sulfur-specific detector. The maximum amount of acrolein (1.05 nmol/mg triolein) was formed after 6 hr of irradiation. The maximum quantities of formaldehyde (6 nmol/mg triolein) and acetaldehyde (2.71 nmol/mg triolein) were formed after 12 hr of irradiation. Both formaldehyde and acrolein are known to cause skin irritation at levels of 1 ppm.

  17. Analysis and Compensation for Gear Accuracy with Setting Error in Form Grinding

    Directory of Open Access Journals (Sweden)

    Chenggang Fang

    2015-01-01

    In form grinding, gear setting error is the main factor influencing grinding accuracy; we propose an effective method to improve form grinding accuracy by correcting this error through control of the machine operations. A geometric model of form grinding was established and the gear setting errors were represented in homogeneous coordinates, from which a tooth mathematical model under gear setting error was obtained and simplified. Then, according to the gear standards ISO 1328-1:1997 and ANSI/AGMA 2015-1-A01:2002, the effect of varying the gear setting errors on tooth profile deviation, helix deviation, and cumulative pitch deviation was investigated for gear eccentricity error, gear inclination error, and gear resultant error, respectively. An error compensation method was proposed based on solving the sensitivity coefficient matrix of the setting error in a five-axis CNC form grinding machine; simulation and experimental results demonstrated that the method can effectively correct the gear setting error and further improve form grinding accuracy.
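    The sensitivity-matrix compensation idea in this record can be sketched numerically: a matrix S relates small machine-axis corrections dx to the resulting changes in the measured deviations, e ≈ S·dx, so the compensating move is the (least-squares) solution dx = −pinv(S)·e. The matrix and deviation values below are purely illustrative, not taken from the paper.

    ```python
    # Minimal sketch of sensitivity-matrix error compensation (illustrative
    # numbers, not the paper's data): solve for axis corrections that cancel
    # the measured gear deviations.
    import numpy as np

    S = np.array([[0.8, 0.1, 0.0],    # d(profile dev.) / d(axis corrections)
                  [0.2, 0.9, 0.1],    # d(helix dev.)   / d(axis corrections)
                  [0.0, 0.2, 1.1]])   # d(pitch dev.)   / d(axis corrections)
    e = np.array([12.0, -5.0, 3.0])   # measured deviations (micrometres)

    dx = -np.linalg.pinv(S) @ e       # least-squares corrective axis offsets
    residual = e + S @ dx             # deviations left after compensation
    print(np.abs(residual).max() < 1e-9)
    ```

    With more deviation measurements than correctable axes, the same pseudoinverse step gives the best achievable (least-squares) compensation rather than an exact cancellation.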

  18. Pursuing Normality

    DEFF Research Database (Denmark)

    Madsen, Louise Sofia; Handberg, Charlotte

    2018-01-01

    BACKGROUND: The present study explored the reflections on cancer survivorship care of lymphoma survivors in active treatment. Lymphoma survivors have survivorship care needs, yet their participation in cancer survivorship care programs is still reported as low. OBJECTIVE: The aim of this study was to understand the reflections on cancer survivorship care of lymphoma survivors to aid the future planning of cancer survivorship care and overcome barriers to participation. METHODS: Data were generated in a hematological ward during 4 months of ethnographic fieldwork, including participant observation and 46… …implying an influence on whether to participate in cancer survivorship care programs. Because of "pursuing normality," 8 of 9 participants opted out of cancer survivorship care programming due to prospects of "being cured" and perceptions of cancer survivorship care as "a continuation of the disease…

  19. ArrayMining: a modular web-application for microarray analysis combining ensemble and consensus methods with cross-study normalization

    Directory of Open Access Journals (Sweden)

    Krasnogor Natalio

    2009-10-01

    Background: Statistical analysis of DNA microarray data provides a valuable diagnostic tool for the investigation of genetic components of diseases. To take advantage of the multitude of available data sets and analysis methods, it is desirable to combine both different algorithms and data from different studies. Applying ensemble learning, consensus clustering and cross-study normalization methods for this purpose in an almost fully automated process and linking different analysis modules together under a single interface would simplify many microarray analysis tasks. Results: We present ArrayMining.net, a web-application for microarray analysis that provides easy access to a wide choice of feature selection, clustering, prediction, gene set analysis and cross-study normalization methods. In contrast to other microarray-related web-tools, multiple algorithms and data sets for an analysis task can be combined using ensemble feature selection, ensemble prediction, consensus clustering and cross-platform data integration. By interlinking different analysis tools in a modular fashion, new exploratory routes become available, e.g. ensemble sample classification using features obtained from a gene set analysis and data from multiple studies. The analysis is further simplified by automatic parameter selection mechanisms and linkage to web tools and databases for functional annotation and literature mining. Conclusion: ArrayMining.net is a free web-application for microarray analysis combining a broad choice of algorithms based on ensemble and consensus methods, using automatic parameter selection and integration with annotation databases.
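    The ensemble feature selection idea behind tools like the one above can be illustrated with rank aggregation: several base selectors each rank the genes, and the ensemble score is the mean rank (a Borda count). This is a generic sketch on synthetic data, not ArrayMining's actual implementation.

    ```python
    # Ensemble feature selection by rank aggregation (illustrative sketch):
    # two base selectors (t-statistic and point-biserial correlation) each
    # rank the genes; the ensemble keeps the best mean rank.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 6))          # 40 samples x 6 "genes"
    y = np.array([0] * 20 + [1] * 20)     # two-class labels
    X[y == 1, 0] += 2.0                   # make gene 0 strongly informative

    def tstat_ranks(X, y):
        a, b = X[y == 0], X[y == 1]
        t = np.abs(a.mean(0) - b.mean(0)) / np.sqrt(a.var(0)/len(a) + b.var(0)/len(b))
        return np.argsort(np.argsort(-t))  # rank 0 = most informative

    def corr_ranks(X, y):
        r = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
        return np.argsort(np.argsort(-r))

    mean_rank = (tstat_ranks(X, y) + corr_ranks(X, y)) / 2.0  # Borda aggregation
    best = int(np.argmin(mean_rank))
    print(best)  # gene 0 wins under both selectors
    ```

    Aggregating ranks rather than raw scores sidesteps the problem that different selectors produce scores on incomparable scales.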

  20. Identification, Expression Analysis, and Target Prediction of Flax Genotroph MicroRNAs Under Normal and Nutrient Stress Conditions

    Science.gov (United States)

    Melnikova, Nataliya V.; Dmitriev, Alexey A.; Belenikin, Maxim S.; Koroban, Nadezhda V.; Speranskaya, Anna S.; Krinitsina, Anastasia A.; Krasnov, George S.; Lakunina, Valentina A.; Snezhkina, Anastasiya V.; Sadritdinova, Asiya F.; Kishlyan, Natalya V.; Rozhmina, Tatiana A.; Klimina, Kseniya M.; Amosova, Alexandra V.; Zelenin, Alexander V.; Muravenko, Olga V.; Bolsheva, Nadezhda L.; Kudryavtseva, Anna V.

    2016-01-01

    Cultivated flax (Linum usitatissimum L.) is an important plant valuable for industry. Some flax lines can undergo heritable phenotypic and genotypic changes (LIS-1 insertion being the most common) in response to nutrient stress and are called plastic lines. Offspring of plastic lines that stably inherit the changes are called genotrophs. MicroRNAs (miRNAs) are involved in a crucial regulatory mechanism of gene expression. They have previously been assumed to take part in the nutrient stress response and may, therefore, participate in genotroph formation. In the present study, we performed high-throughput sequencing of small RNAs (sRNAs) extracted from flax plants grown under normal, phosphate-deficient, and nutrient-excess conditions to identify miRNAs and evaluate their expression. Our analysis revealed expression of 96 conserved miRNAs from 21 families in flax. Moreover, 475 novel potential miRNAs were identified for the first time, and their targets were predicted. However, none of the identified miRNAs were transcribed from LIS-1. Expression of seven miRNAs (miR168, miR169, miR395, miR398, miR399, miR408, and lus-miR-N1) with up- or down-regulation under nutrient stress (on the basis of high-throughput sequencing data) was evaluated on extended sampling using qPCR. A reference gene search identified the ETIF3H and ETIF3E genes as most suitable for this purpose. Down-regulation of the novel potential lus-miR-N1 and up-regulation of the conserved miR399 were revealed under phosphate-deficient conditions. In addition, a negative correlation was observed between the expression of lus-miR-N1 and its predicted target, the ubiquitin-activating enzyme E1 gene, and between miR399 and its predicted target, the ubiquitin-conjugating enzyme E2 gene. Thus, in our study, miRNAs expressed in flax plastic lines and genotrophs were identified, and their expression and that of their targets were evaluated using high-throughput sequencing and qPCR for the first time. These data provide new insights
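    The qPCR normalization step described above (target miRNA Ct normalized against reference genes such as ETIF3H and ETIF3E) is commonly computed with the 2^−ΔΔCt method. The sketch below uses hypothetical Ct values, not the study's measurements; averaging reference Ct values corresponds to a geometric mean in expression space.

    ```python
    # Minimal sketch of relative qPCR expression by the 2^-ddCt method,
    # normalizing the target Ct against the mean of the reference-gene Cts.
    # All Ct values here are hypothetical.
    def delta_delta_ct(ct_target, ct_refs, ct_target_ctrl, ct_refs_ctrl):
        """Fold change = 2^-(dCt_treated - dCt_control), dCt = Ct_t - mean(Ct_ref)."""
        d_treated = ct_target - sum(ct_refs) / len(ct_refs)
        d_control = ct_target_ctrl - sum(ct_refs_ctrl) / len(ct_refs_ctrl)
        return 2.0 ** -(d_treated - d_control)

    # e.g. a miRNA under phosphate deficiency vs. normal growth (made-up Cts):
    fold = delta_delta_ct(ct_target=24.0, ct_refs=[20.0, 21.0],
                          ct_target_ctrl=27.0, ct_refs_ctrl=[20.5, 21.5])
    print(round(fold, 2))  # -> 5.66, i.e. ~5.7-fold up-regulation
    ```

    A fold change above 1 indicates up-regulation relative to the control condition (as reported for miR399 under phosphate deficiency); below 1, down-regulation (as for lus-miR-N1).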