WorldWideScience

Sample records for normal forms standard

  1. Closed-form confidence intervals for functions of the normal mean and standard deviation.

    Science.gov (United States)

    Donner, Allan; Zou, G Y

    2012-08-01

    Confidence interval methods for a normal mean and standard deviation are well known and simple to apply. However, the same cannot be said for important functions of these parameters. These functions include the normal distribution percentiles, the Bland-Altman limits of agreement, the coefficient of variation and Cohen's effect size. We present a simple approach to this problem by using variance estimates recovered from confidence limits computed for the mean and standard deviation separately. All resulting confidence intervals have closed forms. Simulation results demonstrate that this approach performs very well for limits of agreement, coefficients of variation and their differences.
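
    As an illustration of the general idea described above (recovering variance estimates from confidence limits computed separately for the mean and standard deviation), the following sketch builds a closed-form interval for an upper limit of agreement mu + 1.96*sigma via a MOVER-style combination. The function name and the exact combination rule are illustrative assumptions, not the authors' published formulas.

```python
import numpy as np
from scipy import stats

def mover_loa_ci(x, z=1.96, alpha=0.05):
    """Approximate CI for the upper limit of agreement mu + z*sigma,
    combining separate CIs for the mean and SD (a MOVER-style sketch)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    m, s = x.mean(), x.std(ddof=1)
    # Exact CI for the mean
    t = stats.t.ppf(1 - alpha / 2, n - 1)
    l1, u1 = m - t * s / np.sqrt(n), m + t * s / np.sqrt(n)
    # Exact CI for the standard deviation (chi-square based)
    l2 = s * np.sqrt((n - 1) / stats.chi2.ppf(1 - alpha / 2, n - 1))
    u2 = s * np.sqrt((n - 1) / stats.chi2.ppf(alpha / 2, n - 1))
    # MOVER combination for theta = mu + z*sigma
    theta = m + z * s
    lower = theta - np.sqrt((m - l1) ** 2 + (z * s - z * l2) ** 2)
    upper = theta + np.sqrt((u1 - m) ** 2 + (z * u2 - z * s) ** 2)
    return theta, lower, upper
```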

  2. Normal form theory and spectral sequences

    OpenAIRE

    Sanders, Jan A.

    2003-01-01

    The concept of unique normal form is formulated in terms of a spectral sequence. As an illustration of this technique some results of Baider and Churchill concerning the normal form of the anharmonic oscillator are reproduced. The aim of this paper is to show that spectral sequences give us a natural framework in which to formulate normal form theory. © 2003 Elsevier Science (USA). All rights reserved.

  3. Nonlinear dynamics exploration through normal forms

    CERN Document Server

    Kahn, Peter B

    2014-01-01

    Geared toward advanced undergraduates and graduate students, this exposition covers the method of normal forms and its application to ordinary differential equations through perturbation analysis. In addition to its emphasis on the freedom inherent in the normal form expansion, the text features numerous examples of equations of the kind encountered in many areas of science and engineering. The treatment begins with an introduction to the basic concepts underlying the normal forms. Coverage then shifts to an investigation of systems with one degree of freedom that model oscillations

  4. A Recursive Approach to Compute Normal Forms

    Science.gov (United States)

    HSU, L.; MIN, L. J.; FAVRETTO, L.

    2001-06-01

    Normal forms are instrumental in the analysis of dynamical systems described by ordinary differential equations, particularly when singularities close to a bifurcation are to be characterized. However, the computation of a normal form up to an arbitrary order is numerically hard. This paper focuses on the computer programming of some recursive formulas developed earlier to compute higher order normal forms. A computer program to reduce the system to its normal form on a center manifold is developed using the Maple symbolic language. However, it should be stressed that the program relies essentially on recursive numerical computations, while symbolic calculations are used only for minor tasks. Some strategies are proposed to save computation time. Examples are presented to illustrate the application of the program to obtain high order normalization or to handle systems with large dimension.

  5. Normal forms of Hopf-zero singularity

    International Nuclear Information System (INIS)

    Gazor, Majid; Mokhtari, Fahimeh

    2015-01-01

    The Lie algebra generated by Hopf-zero classical normal forms is decomposed into two versal Lie subalgebras. Some dynamical properties for each subalgebra are described; one is the set of all volume-preserving conservative systems while the other is the maximal Lie algebra of nonconservative systems. This introduces a unique conservative–nonconservative decomposition for the normal form systems. There exists a Lie subalgebra that is Lie-isomorphic to a large family of vector fields with Bogdanov–Takens singularity. This leads to the conclusion that the local dynamics of formal Hopf-zero singularities is well understood through the study of Bogdanov–Takens singularities. Despite this, the normal form computations of Bogdanov–Takens and Hopf-zero singularities are independent. Thus, by assuming a quadratic nonzero condition, complete results on the simplest Hopf-zero normal forms are obtained in terms of the conservative–nonconservative decomposition. Some practical formulas are derived and the results implemented using Maple. The method has been applied to the Rössler and Kuramoto–Sivashinsky equations to demonstrate the applicability of our results. (paper)

  6. Normal forms of Hopf-zero singularity

    Science.gov (United States)

    Gazor, Majid; Mokhtari, Fahimeh

    2015-01-01

    The Lie algebra generated by Hopf-zero classical normal forms is decomposed into two versal Lie subalgebras. Some dynamical properties for each subalgebra are described; one is the set of all volume-preserving conservative systems while the other is the maximal Lie algebra of nonconservative systems. This introduces a unique conservative-nonconservative decomposition for the normal form systems. There exists a Lie subalgebra that is Lie-isomorphic to a large family of vector fields with Bogdanov-Takens singularity. This leads to the conclusion that the local dynamics of formal Hopf-zero singularities is well understood through the study of Bogdanov-Takens singularities. Despite this, the normal form computations of Bogdanov-Takens and Hopf-zero singularities are independent. Thus, by assuming a quadratic nonzero condition, complete results on the simplest Hopf-zero normal forms are obtained in terms of the conservative-nonconservative decomposition. Some practical formulas are derived and the results implemented using Maple. The method has been applied to the Rössler and Kuramoto-Sivashinsky equations to demonstrate the applicability of our results.

  7. An Algorithm for Higher Order Hopf Normal Forms

    Directory of Open Access Journals (Sweden)

    A.Y.T. Leung

    1995-01-01

    Normal form theory is important for studying the qualitative behavior of nonlinear oscillators. In some cases, higher order normal forms are required to understand the dynamic behavior near an equilibrium or a periodic orbit. However, the computation of high-order normal forms is usually quite complicated. This article provides an explicit formula for the normalization of nonlinear differential equations. The higher order normal form is given explicitly. Illustrative examples include a cubic system, a quadratic system and a Duffing–Van der Pol system. We use exact arithmetic and find that the undamped Duffing equation can be represented by an exact polynomial differential amplitude equation in a finite number of terms.

  8. Comparative analysis of JKR Sarawak form of contract and Malaysia Standard form of building contract (PWD203A)

    Science.gov (United States)

    Yunus, A. I. A.; Muhammad, W. M. N. W.; Saaid, M. N. F.

    2018-04-01

    Standard forms of contract are normally used in the Malaysian construction industry to establish legal relations between contracting parties. Generally, most Malaysian federal government construction projects use PWD203A, a standard form of contract for use where Bills of Quantities form part of the contract, issued by the Public Works Department (PWD/JKR). On the other hand, in Sarawak, the largest state in Malaysia, the state government has issued its own standard form of contract, namely the JKR Sarawak Form of Contract 2006. Even though both forms have been used widely in the construction industry, there is still a lack of understanding of both. The aim of this paper is to identify significant provisions in both forms of contract. Document analysis has been adopted to conduct an in-depth review of both forms. It is found that the two forms of contract have differences and similarities in several provisions, specifically in matters of definitions and general provisions; execution of the works; payments, completion and final account; and delay, dispute resolution and determination.

  9. Normal form for mirror machine Hamiltonians

    International Nuclear Information System (INIS)

    Dragt, A.J.; Finn, J.M.

    1979-01-01

    A systematic algorithm is developed for performing canonical transformations on Hamiltonians which govern particle motion in magnetic mirror machines. These transformations are performed in such a way that the new Hamiltonian has a particularly simple normal form. From this form it is possible to compute analytic expressions for gyro and bounce frequencies. In addition, it is possible to obtain arbitrarily high order terms in the adiabatic magnetic moment expansion. The algorithm makes use of Lie series, is an extension of Birkhoff's normal form method, and has been explicitly implemented by a digital computer programmed to perform the required algebraic manipulations. Application is made to particle motion in a magnetic dipole field and to a simple mirror system. Bounce frequencies and locations of periodic orbits are obtained and compared with numerical computations. Both mirror systems are shown to be insoluble, i.e., trajectories are not confined to analytic hypersurfaces, there is no analytic third integral of motion, and the adiabatic magnetic moment expansion is divergent. It is expected also that the normal form procedure will prove useful in the study of island structure and separatrices associated with periodic orbits, and should facilitate studies of breakdown of adiabaticity and the onset of 'stochastic' behavior.

  10. Volume-preserving normal forms of Hopf-zero singularity

    International Nuclear Information System (INIS)

    Gazor, Majid; Mokhtari, Fahimeh

    2013-01-01

    A practical method is described for computing the unique generator of the algebra of first integrals associated with a large class of Hopf-zero singularities. The set of all volume-preserving classical normal forms of this singularity is introduced via a Lie algebra description. This is a maximal vector space of classical normal forms with first integral, which is why our approach works. Systems with a nonzero condition on their quadratic parts are considered. The algebra of all first integrals for any such system has a unique (modulo scalar multiplication) generator. The infinite level volume-preserving parametric normal forms of any nondegenerate perturbation within the Lie algebra of any such system are computed; these can have rich dynamics. The associated unique generator of the algebra of first integrals is derived. The symmetry group of the infinite level normal forms is also discussed. Some necessary formulas are derived and applied to appropriately modified Rössler and generalized Kuramoto–Sivashinsky equations to demonstrate the applicability of our theoretical results. An approach (introduced by Iooss and Lombardi) is applied to find an optimal truncation for the first level normal forms of these examples with exponentially small remainders. The numerically suggested radius of convergence (for the first integral) associated with a hypernormalization step is discussed for the truncated first level normal forms of the examples. This is achieved by an efficient implementation of the results using Maple. (paper)

  11. Volume-preserving normal forms of Hopf-zero singularity

    Science.gov (United States)

    Gazor, Majid; Mokhtari, Fahimeh

    2013-10-01

    A practical method is described for computing the unique generator of the algebra of first integrals associated with a large class of Hopf-zero singularities. The set of all volume-preserving classical normal forms of this singularity is introduced via a Lie algebra description. This is a maximal vector space of classical normal forms with first integral, which is why our approach works. Systems with a nonzero condition on their quadratic parts are considered. The algebra of all first integrals for any such system has a unique (modulo scalar multiplication) generator. The infinite level volume-preserving parametric normal forms of any nondegenerate perturbation within the Lie algebra of any such system are computed; these can have rich dynamics. The associated unique generator of the algebra of first integrals is derived. The symmetry group of the infinite level normal forms is also discussed. Some necessary formulas are derived and applied to appropriately modified Rössler and generalized Kuramoto-Sivashinsky equations to demonstrate the applicability of our theoretical results. An approach (introduced by Iooss and Lombardi) is applied to find an optimal truncation for the first level normal forms of these examples with exponentially small remainders. The numerically suggested radius of convergence (for the first integral) associated with a hypernormalization step is discussed for the truncated first level normal forms of the examples. This is achieved by an efficient implementation of the results using Maple.

  12. TRASYS form factor matrix normalization

    Science.gov (United States)

    Tsuyuki, Glenn T.

    1992-01-01

    A method has been developed for adjusting a TRASYS enclosure form factor matrix to unity. This approach is not limited to closed geometries, and in fact, it is primarily intended for use with open geometries. The purpose of this approach is to prevent optimistic form factors to space. In this method, nodal form factor sums are calculated within 0.05 of unity using TRASYS, although deviations as large as 0.10 may be acceptable, and then a process is employed to distribute the difference amongst the nodes. A specific example has been analyzed with this method, and a comparison was performed with a standard approach for calculating radiation conductors. In this comparison, hot and cold case temperatures were determined. Exterior nodes exhibited temperature differences as large as 7 C and 3 C for the hot and cold cases, respectively, when compared with the standard approach, while interior nodes demonstrated temperature differences from 0 C to 5 C. These results indicate that temperature predictions can be artificially biased if the form factor computation error is lumped into the individual form factors to space.
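
    A minimal sketch of the kind of row renormalization the record describes: each node's form factor sum is checked against the stated tolerances and the residual is spread proportionally over the row rather than lumped into the form factor to space. This is a simplification under assumptions; TRASYS' actual redistribution scheme may weight nodes differently.

```python
import numpy as np

def normalize_form_factors(F, fail_tol=0.10):
    """Rescale each row of an enclosure form factor matrix to sum to unity.

    F includes a column for the 'space' node, so each row should already
    sum close to 1.0; per the record, deviations up to about 0.10 may
    still be acceptable, and anything beyond that is rejected here.
    """
    F = np.asarray(F, dtype=float)
    sums = F.sum(axis=1)
    if np.any(np.abs(sums - 1.0) > fail_tol):
        raise ValueError("a form factor row deviates from unity by more than 0.10")
    # Spread each row's residual proportionally over all entries instead
    # of lumping the error into the form factor to space.
    return F / sums[:, None]
```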

  13. Automatic identification and normalization of dosage forms in drug monographs

    Science.gov (United States)

    2012-01-01

    Background Each day, millions of health consumers seek drug-related information on the Web. Despite some efforts in linking related resources, drug information is largely scattered across a wide variety of websites of differing quality and credibility. Methods As a step toward providing users with integrated access to multiple trustworthy drug resources, we aim to develop a method capable of identifying a drug's dosage form information in addition to drug name recognition. We developed rules and patterns for identifying dosage forms from different sections of full-text drug monographs, and subsequently normalized them to standardized RxNorm dosage forms. Results Our method represents a significant improvement compared with a baseline lookup approach, achieving an overall macro-averaged Precision of 80%, Recall of 98%, and F-Measure of 85%. Conclusions We successfully developed an automatic approach for drug dosage form identification, which is critical for building links between different drug-related resources. PMID:22336431
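
    The rule-and-pattern approach the record describes can be pictured with a toy lookup of the sort below. The table is a tiny hypothetical subset of RxNorm dose-form names, and the single regular expression stands in for the paper's section-specific rules; neither is taken from the paper itself.

```python
import re

# Hypothetical subset of RxNorm dose forms (illustrative only).
RXNORM_FORMS = {
    "tablet": "Oral Tablet",
    "tab": "Oral Tablet",
    "capsule": "Oral Capsule",
    "cap": "Oral Capsule",
    "injection": "Injectable Solution",
    "cream": "Topical Cream",
}

# One pattern standing in for the paper's per-section rules.
PATTERN = re.compile(r"\b(" + "|".join(RXNORM_FORMS) + r")s?\b", re.IGNORECASE)

def identify_dosage_forms(monograph_text):
    """Return the set of normalized dosage forms mentioned in the text."""
    return {RXNORM_FORMS[m.group(1).lower()] for m in PATTERN.finditer(monograph_text)}

print(identify_dosage_forms("Take one tablet daily; capsules also available."))
# -> {'Oral Tablet', 'Oral Capsule'}  (set order may vary)
```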

  14. Normal form and synchronization of strict-feedback chaotic systems

    International Nuclear Information System (INIS)

    Wang, Feng; Chen, Shihua; Yu Minghai; Wang Changping

    2004-01-01

    This study concerns the normal form and synchronization of strict-feedback chaotic systems. We prove that any strict-feedback chaotic system can be rendered into a normal form with an invertible transform, and then a design procedure to synchronize the normal form of a non-autonomous strict-feedback chaotic system is presented. This approach needs only a scalar driving signal to realize synchronization, no matter how many dimensions the chaotic system contains. Furthermore, the Roessler chaotic system is taken as a concrete example to illustrate the procedure of designing without transforming a strict-feedback chaotic system into its normal form. Numerical simulations are also provided to show the effectiveness and feasibility of the developed methods.

  15. Normal forms in Poisson geometry

    NARCIS (Netherlands)

    Marcut, I.T.

    2013-01-01

    The structure of Poisson manifolds is highly nontrivial even locally. The first important result in this direction is Conn's linearization theorem around fixed points. One of the main results of this thesis (Theorem 2) is a normal form theorem in Poisson geometry, which is the Poisson-geometric

  16. Diagonalization and Jordan Normal Form--Motivation through "Maple" [R]

    Science.gov (United States)

    Glaister, P.

    2009-01-01

    Following an introduction to the diagonalization of matrices, one of the more difficult topics for students to grasp in linear algebra is the concept of Jordan normal form. In this note, we show how the important notions of diagonalization and Jordan normal form can be introduced and developed through the use of the computer algebra package…
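
    The same motivation carries over to other computer algebra systems. Here is a short SymPy sketch (Python rather than the Maple used in the note) contrasting a matrix with distinct eigenvalues, which diagonalizes, with a defective matrix that needs a nontrivial Jordan block.

```python
from sympy import Matrix

# Diagonalizable case: distinct eigenvalues 2 and 5.
A = Matrix([[4, 1],
            [2, 3]])
P, D = A.diagonalize()      # A = P * D * P**-1, D diagonal
print(D)                    # typically Matrix([[2, 0], [0, 5]])

# Defective case: eigenvalue 3 has algebraic multiplicity 2 but only
# one independent eigenvector, so a Jordan block appears.
B = Matrix([[3, 1, 0],
            [0, 3, 0],
            [0, 0, 2]])
Q, J = B.jordan_form()      # B = Q * J * Q**-1
print(J)                    # a 2x2 Jordan block for 3, plus a 1x1 block for 2
```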

  17. Normal equivariant forms of vector fields

    International Nuclear Information System (INIS)

    Sanchez Bringas, F.

    1992-07-01

    We prove a linearization theorem of Siegel type and a normal form theorem of Poincaré-Dulac type for germs of holomorphic vector fields at the origin of C^2 that are Γ-equivariant, where Γ is a finite subgroup of GL(2,C). (author). 5 refs

  18. Normal forms for Poisson maps and symplectic groupoids around Poisson transversals.

    Science.gov (United States)

    Frejlich, Pedro; Mărcuț, Ioan

    2018-01-01

    Poisson transversals are submanifolds in a Poisson manifold which intersect all symplectic leaves transversally and symplectically. In this communication, we prove a normal form theorem for Poisson maps around Poisson transversals. A Poisson map pulls a Poisson transversal back to a Poisson transversal, and our first main result states that simultaneous normal forms exist around such transversals, for which the Poisson map becomes transversally linear, and intertwines the normal form data of the transversals. Our second result concerns symplectic integrations. We prove that a neighborhood of a Poisson transversal is integrable exactly when the Poisson transversal itself is integrable, and in that case we prove a normal form theorem for the symplectic groupoid around its restriction to the Poisson transversal, which puts all structure maps in normal form. We conclude by illustrating our results with examples arising from Lie algebras.

  19. AFP Algorithm and a Canonical Normal Form for Horn Formulas

    OpenAIRE

    Majdoddin, Ruhollah

    2014-01-01

    AFP Algorithm is a learning algorithm for Horn formulas. We show that performing more than just one refinement after each negative counterexample does not improve the complexity of AFP Algorithm. Moreover, a canonical normal form for Horn formulas is presented, and it is proved that the output formula of AFP Algorithm is in this normal form.

  20. Utilizing Nested Normal Form to Design Redundancy Free JSON Schemas

    Directory of Open Access Journals (Sweden)

    Wai Yin Mok

    2016-12-01

    JSON (JavaScript Object Notation) is a lightweight data-interchange format for the Internet. JSON is built on two structures: (1) a collection of name/value pairs and (2) an ordered list of values (http://www.json.org/). Because of this simple approach, JSON is easy to use and it has the potential to be the data interchange format of choice for the Internet. Similar to XML, JSON schemas allow nested structures to model hierarchical data. As data interchange over the Internet increases exponentially due to cloud computing or otherwise, redundancy-free JSON data are an attractive form of communication because they improve the quality of data communication by eliminating update anomalies. Nested Normal Form, a normal form for hierarchical data, is a precise characterization of redundancy. A nested table, or a hierarchical schema, is in Nested Normal Form if and only if it is free of redundancy caused by multivalued and functional dependencies. Using Nested Normal Form as a guide, this paper introduces a JSON schema design methodology that begins with UML use case diagrams, communication diagrams and class diagrams that model a system under study. Based on the use cases' execution frequencies and the data passed between involved parties in the communication diagrams, the proposed methodology selects classes from the class diagrams to be the roots of JSON scheme trees and repeatedly adds classes from the class diagram to the scheme trees as long as the schemas satisfy Nested Normal Form. This process continues until all of the classes in the class diagram have been added to some JSON scheme trees.
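
    A small illustration of the redundancy the paper targets (the data and field names are made up): a flat encoding repeats the customer's name for every order, violating the spirit of the dependency customer_id -> name, while nesting orders under their customer, as a Nested-Normal-Form scheme tree would, records each fact once.

```python
import json

# Flat rows repeat the customer name for every order (redundancy from
# the functional dependency customer_id -> name).
flat = [
    {"customer_id": 1, "name": "Ada", "order_id": 101, "total": 30},
    {"customer_id": 1, "name": "Ada", "order_id": 102, "total": 15},
]

# Nesting orders under their customer stores each fact exactly once;
# the scheme tree rooted at the customer class mirrors the hierarchy.
nested = {
    "customer_id": 1,
    "name": "Ada",
    "orders": [
        {"order_id": 101, "total": 30},
        {"order_id": 102, "total": 15},
    ],
}

print(json.dumps(nested, indent=2))
```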

  1. A normal form approach to the theory of nonlinear betatronic motion

    International Nuclear Information System (INIS)

    Bazzani, A.; Todesco, E.; Turchetti, G.; Servizi, G.

    1994-01-01

    The betatronic motion of a particle in a circular accelerator is analysed using the transfer map description of the magnetic lattice. In the linear case the transfer matrix approach is shown to be equivalent to the Courant-Snyder theory: in the normal coordinates representation the transfer matrix is a pure rotation. When the nonlinear effects due to the multipolar components of the magnetic field are taken into account, a similar procedure is used: a nonlinear change of coordinates provides a normal form representation of the map, which exhibits explicit symmetry properties depending on the absence or presence of resonance relations among the linear tunes. The use of normal forms is illustrated in the simplest but significant model of a cell with a sextupolar nonlinearity, which is described by the quadratic Hénon map. After recalling the basic theoretical results in Hamiltonian dynamics, we show how the normal forms describe the different topological structures of phase space such as KAM tori, chains of islands and chaotic regions; a critical comparison with the usual perturbation theory for Hamilton equations is given. The normal form theory is applied to compute the tune shift and deformation of the orbits for the lattices of the SPS and LHC accelerators, and scaling laws are obtained. Finally, the correction procedure of the multipolar errors of the LHC, based on the analytic minimization of the tune shift computed via the normal forms, is described and the results for a model of the LHC are presented. This application, relevant for the lattice design, focuses on the advantages of normal forms with respect to tracking when parametric dependences have to be explored. (orig.)

  2. SYNTHESIS METHODS OF ALGEBRAIC NORMAL FORM OF MANY-VALUED LOGIC FUNCTIONS

    Directory of Open Access Journals (Sweden)

    A. V. Sokolov

    2016-01-01

    The rapid development of methods of error-correcting coding, cryptography, and signal synthesis theory based on the principles of many-valued logic determines the need for a more detailed study of the forms of representation of functions of many-valued logic. In particular, the algebraic normal form of Boolean functions, also known as the Zhegalkin polynomial, which describes many of the cryptographic properties of Boolean functions well, is widely used. In this article, we formalize the notion of the algebraic normal form for many-valued logic functions. We develop a fast method for the synthesis of the algebraic normal form of 3-functions and 5-functions that works similarly to the Reed-Muller transform for Boolean functions, on the basis of recurrently synthesized transform matrices. We propose a hypothesis that determines the rules for the synthesis of these matrices for the transformation from the truth table to the coefficients of the algebraic normal form, and for the inverse transform, for any given number of variables of 3-functions or 5-functions. The article also introduces the definition of the algebraic degree of nonlinearity of functions of many-valued logic and of S-boxes based on the principles of many-valued logic. Thus, the methods of synthesis of the algebraic normal form of 3-functions are applied to the known construction of recurrent synthesis of S-boxes of length N = 3^k, whereby their algebraic degrees of nonlinearity are computed. The results could be the basis for further theoretical research and practical applications such as the development of new cryptographic primitives, error-correcting codes, data compression algorithms, signal structures, and algorithms of block and stream encryption, all based on the promising principles of many-valued logic. In addition, the fast method of synthesis of the algebraic normal form of many-valued logic functions is the basis for their software and hardware implementation.

  3. Quantifying Normal Craniofacial Form and Baseline Craniofacial Asymmetry in the Pediatric Population.

    Science.gov (United States)

    Cho, Min-Jeong; Hallac, Rami R; Ramesh, Jananie; Seaward, James R; Hermann, Nuno V; Darvann, Tron A; Lipira, Angelo; Kane, Alex A

    2018-03-01

    Restoring craniofacial symmetry is an important objective in the treatment of many craniofacial conditions. Normal form has been measured using anthropometry, cephalometry, and photography, yet all of these modalities have drawbacks. In this study, the authors define normal pediatric craniofacial form and craniofacial asymmetry using stereophotogrammetric images, which capture a densely sampled set of points on the form. After institutional review board approval, normal, healthy children (n = 533) with no known craniofacial abnormalities were recruited at well-child visits to undergo full head stereophotogrammetric imaging. The children's ages ranged from 0 to 18 years. A symmetric three-dimensional template was registered and scaled to each individual scan using 25 manually placed landmarks. The template was deformed to each subject's three-dimensional scan using a thin-plate spline algorithm and closest point matching. Age-based normal facial models were derived. Mean facial asymmetry and statistical characteristics of the population were calculated. The mean head asymmetry across all pediatric subjects was 1.5 ± 0.5 mm (range, 0.46 to 4.78 mm), and the mean facial asymmetry was 1.2 ± 0.6 mm (range, 0.4 to 5.4 mm). There were no significant differences in the mean head or facial asymmetry with age, sex, or race. Understanding the "normal" form and baseline distribution of asymmetry is an important anthropomorphic foundation. The authors present a method to quantify normal craniofacial form and baseline asymmetry in a large pediatric sample. The authors found that the normal pediatric craniofacial form is asymmetric, and does not change in magnitude with age, sex, or race.

  4. Normal form of linear systems depending on parameters

    International Nuclear Information System (INIS)

    Nguyen Huynh Phan.

    1995-12-01

    In this paper we completely resolve the problem of finding normal forms of linear systems depending on parameters under the feedback action, which we have previously studied for the special case of controllable linear systems. (author). 24 refs

  5. The COBE normalization for standard cold dark matter

    Science.gov (United States)

    Bunn, Emory F.; Scott, Douglas; White, Martin

    1995-01-01

    The Cosmic Background Explorer Satellite (COBE) detection of microwave anisotropies provides the best way of fixing the amplitude of cosmological fluctuations on the largest scales. This normalization is usually given for an n = 1 spectrum, including only the anisotropy caused by the Sachs-Wolfe effect. This is certainly not a good approximation for a model containing any reasonable amount of baryonic matter. In fact, even tilted Sachs-Wolfe spectra are not a good fit to models like cold dark matter (CDM). Here, we normalize standard CDM (sCDM) to the two-year COBE data and quote the best amplitude in terms of the conventionally used measures of power. We also give normalizations for some specific variants of this standard model, and we indicate how the normalization depends on the assumed values of n, Ω_B and H_0. For sCDM we find ⟨Q⟩ = 19.9 ± 1.5 μK, corresponding to σ_8 = 1.34 ± 0.10, with the normalization at large scales being B = (8.16 ± 1.04) × 10^5 (Mpc/h)^4, and other numbers given in the table. The measured rms temperature fluctuation smoothed on 10° is a little low relative to this normalization. This is mainly due to the low quadrupole in the data: when the quadrupole is removed, the measured value of σ(10°) is quite consistent with the best-fitting ⟨Q⟩. The use of ⟨Q⟩ should be preferred over σ(10°), when its value can be determined for a particular theory, since it makes full use of the data.

  6. A New Normal Form for Multidimensional Mode Conversion

    International Nuclear Information System (INIS)

    Tracy, E. R.; Richardson, A. S.; Kaufman, A. N.; Zobin, N.

    2007-01-01

    Linear conversion occurs when two wave types, with distinct polarization and dispersion characteristics, are locally resonant in a nonuniform plasma [1]. In recent work, we have shown how to incorporate a ray-based (WKB) approach to mode conversion in numerical algorithms [2,3]. The method uses the ray geometry in the conversion region to guide the reduction of the full N×N system of wave equations to a 2×2 coupled pair which can be solved and matched to the incoming and outgoing WKB solutions. The algorithm in [2] assumes the ray geometry is hyperbolic and that, in ray phase space, there is an 'avoided crossing', which is the most common type of conversion. Here, we present a new formulation that can deal with more general types of conversion [4]. This formalism is based upon the fact (first proved in [5]) that it is always possible to put the 2×2 wave equation into a 'normal' form, such that the diagonal elements of the dispersion matrix Poisson-commute with the off-diagonals (at leading order). Therefore, if we use the diagonals (rather than the eigenvalues or the determinant) of the dispersion matrix as ray Hamiltonians, the off-diagonals will be conserved quantities. When cast into normal form, the 2×2 dispersion matrix has a very natural physical interpretation: the diagonals are the uncoupled ray Hamiltonians and the off-diagonals are the coupling. We discuss how to incorporate the normal form into ray tracing algorithms

  7. Normal forms of invariant vector fields under a finite group action

    International Nuclear Information System (INIS)

    Sanchez Bringas, F.

    1992-07-01

    Let Γ be a finite subgroup of GL(n,C). This subgroup acts on the space of germs of holomorphic vector fields vanishing at the origin in C^n. We prove a theorem of invariant conjugation to a normal form and linearization for the subspace of invariant elements, and we give a description of these normal forms in dimension n = 2. (author)

  8. Normalization method for metabolomics data using optimal selection of multiple internal standards

    Directory of Open Access Journals (Sweden)

    Yetukuri Laxman

    2007-03-01

    Background Success of metabolomics as the phenotyping platform largely depends on its ability to detect various sources of biological variability. Removal of platform-specific sources of variability such as systematic error is therefore one of the foremost priorities in data preprocessing. However, chemical diversity of molecular species included in typical metabolic profiling experiments leads to different responses to variations in experimental conditions, making normalization a very demanding task. Results With the aim of removing unwanted systematic variation, we present an approach that utilizes variability information from multiple internal standard compounds to find an optimal normalization factor for each individual molecular species detected by the metabolomics approach (NOMIS). We demonstrate the method on mouse liver lipidomic profiles using Ultra Performance Liquid Chromatography coupled to high resolution mass spectrometry, and compare its performance to two commonly utilized normalization methods: normalization by the l2 norm and by retention time region specific standard compound profiles. The NOMIS method proved superior in its ability to reduce the effect of systematic error across the full spectrum of metabolite peaks. We also demonstrate that the method can be used to select best combinations of standard compounds for normalization. Conclusion Depending on experiment design and biological matrix, the NOMIS method is applicable either as a one-step normalization method or as a two-step method where the normalization parameters, influenced by variabilities of internal standard compounds and their correlation to metabolites, are first calculated from a study conducted in repeatability conditions. The method can also be used in analytical development of metabolomics methods by helping to select best combinations of standard compounds for a particular biological matrix and analytical platform.
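
    The following is a deliberately simplified sketch of the idea of using multiple internal standards: each metabolite's log intensity is regressed on the centered log profiles of the standards across samples, and the fitted systematic component is removed. The published NOMIS model is more elaborate (for example in how it weights standards and estimates parameters), so treat this only as an illustration of the principle.

```python
import numpy as np

def nomis_like_normalize(X, S):
    """Very simplified NOMIS-style normalization (a sketch, not the
    published algorithm).

    X : (samples, metabolites) raw peak intensities
    S : (samples, standards)   internal-standard intensities
    """
    logX = np.log(X)
    Z = np.log(S) - np.log(S).mean(axis=0)      # centered standard profiles
    # Least-squares fit of each metabolite's variation on the standards:
    # solves Z @ B ~ (logX - mean) for B of shape (standards, metabolites).
    B, *_ = np.linalg.lstsq(Z, logX - logX.mean(axis=0), rcond=None)
    # Remove the systematic component and return to the raw scale.
    return np.exp(logX - Z @ B)
```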

  9. On the relationship between LTL normal forms and Büchi automata

    DEFF Research Database (Denmark)

    Li, Jianwen; Pu, Geguang; Zhang, Lijun

    2013-01-01

    In this paper, we revisit the problem of translating LTL formulas to Büchi automata. We first translate the given LTL formula into a special disjunctive normal form (DNF). The formula will be part of the state, and its DNF specifies the atomic properties that should hold immediately...

  10. 7 CFR 1755.30 - List of telecommunications standard contract forms.

    Science.gov (United States)

    2010-01-01

    7 Agriculture 11 (2010-01-01). RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE, TELECOMMUNICATIONS POLICIES ON SPECIFICATIONS, ACCEPTABLE MATERIALS, AND STANDARD CONTRACT FORMS. § 1755.30 List of telecommunications standard contract forms.

  11. NON-STANDARD FORMS OF EMPLOYMENT IN BUSINESS ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    A. E. Chekanov

    2013-01-01

    The article discusses the emergence and development of non-standard forms of employment and flexible working. The causes of their use are reflected in the results of research conducted in the workplace. Non-standard forms of employment are attractive today because they allow the circle of the available workforce to be expanded.

  12. Normal Forms for Retarded Functional Differential Equations and Applications to Bogdanov-Takens Singularity

    Science.gov (United States)

    Faria, T.; Magalhaes, L. T.

    The paper addresses, for retarded functional differential equations (FDEs), the computation of normal forms associated with the flow on a finite-dimensional invariant manifold tangent to invariant spaces for the infinitesimal generator of the linearized equation at a singularity. A phase space appropriate to the computation of these normal forms is introduced, and adequate nonresonance conditions for the computation of the normal forms are derived. As an application, the general situation of Bogdanov-Takens singularity and its versal unfolding for scalar retarded FDEs with nondegeneracy at second order is considered, both in the general case and in the case of differential-delay equations of the form ẋ(t) = f(x(t), x(t-1)).

  13. Reconstruction of normal forms by learning informed observation geometries from data.

    Science.gov (United States)

    Yair, Or; Talmon, Ronen; Coifman, Ronald R; Kevrekidis, Ioannis G

    2017-09-19

    The discovery of physical laws consistent with empirical observations is at the heart of (applied) science and engineering. These laws typically take the form of nonlinear differential equations depending on parameters; dynamical systems theory provides, through the appropriate normal forms, an "intrinsic" prototypical characterization of the types of dynamical regimes accessible to a given model. Using an implementation of data-informed geometry learning, we directly reconstruct the relevant "normal forms": a quantitative mapping from empirical observations to prototypical realizations of the underlying dynamics. Interestingly, the state variables and the parameters of these realizations are inferred from the empirical observations; without prior knowledge or understanding, they parametrize the dynamics intrinsically without explicit reference to fundamental physical quantities.

  14. Planar undulator motion excited by a fixed traveling wave. Quasiperiodic averaging normal forms and the FEL pendulum

    Energy Technology Data Exchange (ETDEWEB)

    Ellison, James A.; Heinemann, Klaus [New Mexico Univ., Albuquerque, NM (United States). Dept. of Mathematics and Statistics; Vogt, Mathias [Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany); Gooden, Matthew [North Carolina State Univ., Raleigh, NC (United States). Dept. of Physics

    2013-03-15

    We present a mathematical analysis of planar motion of energetic electrons moving through a planar dipole undulator, excited by a fixed planar polarized plane wave Maxwell field in the X-Ray FEL regime. Our starting point is the 6D Lorentz system, which allows planar motions, and we examine this dynamical system as the wave length λ of the traveling wave varies. By scalings and transformations the 6D system is reduced, without approximation, to a 2D system in a form for a rigorous asymptotic analysis using the Method of Averaging (MoA), a long time perturbation theory. The two dependent variables are a scaled energy deviation and a generalization of the so-called ponderomotive phase. As λ varies the system passes through resonant and nonresonant (NR) zones and we develop NR and near-to-resonant (NtoR) MoA normal form approximations. The NtoR normal forms contain a parameter which measures the distance from a resonance. For a special initial condition, for the planar motion and on resonance, the NtoR normal form reduces to the well known FEL pendulum system. We then state and prove NR and NtoR first-order averaging theorems which give explicit error bounds for the normal form approximations. We prove the theorems in great detail, giving the interested reader a tutorial on mathematically rigorous perturbation theory in a context where the proofs are easily understood. The proofs are novel in that they do not use a near identity transformation and they use a system of differential inequalities. The NR case is an example of quasiperiodic averaging where the small divisor problem enters in the simplest possible way. To our knowledge the planar problem has not been analyzed with the generality we aspire to here nor has the standard FEL pendulum system been derived with associated error bounds as we do here. We briefly discuss the low gain theory in light of our NtoR normal form. Our mathematical treatment of the noncollective FEL beam dynamics problem in

  15. Planar undulator motion excited by a fixed traveling wave. Quasiperiodic averaging normal forms and the FEL pendulum

    International Nuclear Information System (INIS)

    Ellison, James A.; Heinemann, Klaus; Gooden, Matthew

    2013-03-01

    We present a mathematical analysis of planar motion of energetic electrons moving through a planar dipole undulator, excited by a fixed planar polarized plane wave Maxwell field in the X-Ray FEL regime. Our starting point is the 6D Lorentz system, which allows planar motions, and we examine this dynamical system as the wave length λ of the traveling wave varies. By scalings and transformations the 6D system is reduced, without approximation, to a 2D system in a form for a rigorous asymptotic analysis using the Method of Averaging (MoA), a long time perturbation theory. The two dependent variables are a scaled energy deviation and a generalization of the so-called ponderomotive phase. As λ varies the system passes through resonant and nonresonant (NR) zones and we develop NR and near-to-resonant (NtoR) MoA normal form approximations. The NtoR normal forms contain a parameter which measures the distance from a resonance. For a special initial condition, for the planar motion and on resonance, the NtoR normal form reduces to the well known FEL pendulum system. We then state and prove NR and NtoR first-order averaging theorems which give explicit error bounds for the normal form approximations. We prove the theorems in great detail, giving the interested reader a tutorial on mathematically rigorous perturbation theory in a context where the proofs are easily understood. The proofs are novel in that they do not use a near identity transformation and they use a system of differential inequalities. The NR case is an example of quasiperiodic averaging where the small divisor problem enters in the simplest possible way. To our knowledge the planar problem has not been analyzed with the generality we aspire to here nor has the standard FEL pendulum system been derived with associated error bounds as we do here. We briefly discuss the low gain theory in light of our NtoR normal form. Our mathematical treatment of the noncollective FEL beam dynamics problem in the

  16. A standard form for generalized CP transformations

    International Nuclear Information System (INIS)

    Ecker, G.; Grimus, W.; Neufeld, H.

    1987-01-01

    The investigation of general CP transformations leads to transformations of the form U → W^T U W with unitary matrices U, W. It is shown that a basis for weak eigenstates can always be chosen such that W^T U W has a certain real standard form. (Author)

  17. Standardized uptake values of fluorine-18 fluorodeoxyglucose: the value of different normalization procedures

    International Nuclear Information System (INIS)

    Schomburg, A.; Bender, H.; Reichel, C.; Sommer, T.; Ruhlmann, J.; Kozak, B.; Biersack, H.J.

    1996-01-01

    While the evident advantages of absolute metabolic rate determinations cannot be equalled by static image analysis of fluorine-18 fluorodeoxyglucose positron emission tomographic (FDG PET) studies, various algorithms for the normalization of static FDG uptake values have been proposed. This study was performed to compare different normalization procedures in terms of dependency on individual patient characteristics. Standardized FDG uptake values (SUVs) were calculated for liver and lung tissue in 126 patients studied with whole-body FDG PET. Uptake values were normalized for total body weight, lean body mass and body surface area. Ranges, means, medians, standard deviations and variation coefficients of these SUV parameters were calculated and their interdependency with total body weight, lean body mass, body surface area, patient height and blood sugar levels was calculated by means of regression analysis. Standardized FDG uptake values normalized for body surface area were clearly superior to SUV parameters normalized for total body weight or lean body mass. Variation and correlation coefficients of body surface area-normalized uptake values were minimal when compared with SUV parameters derived from the other normalization procedures. Normalization for total body weight resulted in uptake values still dependent on body weight and blood sugar levels, while normalization for lean body mass did not eliminate the positive correlation with lean body mass and patient height. It is concluded that normalization of FDG uptake values for body surface area is less dependent on the individual patient characteristics than are FDG uptake values normalized for other parameters, and therefore appears to be preferable for FDG PET studies in oncology. (orig.)
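
    For concreteness, the three normalizations compared in the record can be written as below. The James lean-body-mass and DuBois body-surface-area formulas are common choices and are assumptions here; the abstract does not state which equations the authors used.

```python
def suv_variants(conc_kbq_ml, dose_mbq, weight_kg, height_cm, sex="m"):
    """SUVs normalized by body weight, lean body mass, and body surface area.

    conc_kbq_ml : decay-corrected tissue activity concentration (kBq/ml)
    dose_mbq    : injected FDG activity (MBq)
    The James LBM and DuBois BSA formulas below are assumed choices.
    """
    dose_kbq = dose_mbq * 1000.0
    # James formula for lean body mass (kg), height in cm
    if sex == "m":
        lbm_kg = 1.10 * weight_kg - 128.0 * (weight_kg / height_cm) ** 2
    else:
        lbm_kg = 1.07 * weight_kg - 148.0 * (weight_kg / height_cm) ** 2
    # DuBois body surface area (m^2)
    bsa_m2 = 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

    suv_bw  = conc_kbq_ml / (dose_kbq / (weight_kg * 1000.0))  # weight in g
    suv_lbm = conc_kbq_ml / (dose_kbq / (lbm_kg * 1000.0))
    suv_bsa = conc_kbq_ml / (dose_kbq / (bsa_m2 * 10000.0))    # BSA in cm^2
    return suv_bw, suv_lbm, suv_bsa
```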

  18. Use of newly developed standardized form for interpretation of high-resolution CT in screening for pneumoconiosis

    International Nuclear Information System (INIS)

    Julien, P.J.; Sider, L.; Silverman, J.M.; Dahlgren, J.; Harber, P.; Bunn, W.

    1991-01-01

    This paper reports that although the International Labour Office (ILO) standard for interpretation of the posteroanterior chest radiograph has been available for 10 years, there has been no attempt to standardize high-resolution CT (HRCT) readings for screening of pneumoconiosis. An integrated respirator surveillance program for 87 workers exposed to inorganic dust was conducted. This program consisted of a detailed occupational exposure history, physical symptoms and signs, spirometry, chest radiography, and HRCT. Two groups of workers with known exposure were studied with HRCT. Group 1 had normal spirometry results and chest radiographs, and group 2 had abnormalities at spirometry or on chest radiographs. The HRCT scans were read independently of the clinical findings and chest radiographs. The HRCT scans were interpreted by using an ILO-based standard form developed by the authors for this project. With the newly developed HRCT form, individual descriptive abnormality, localized severity, and overall rating systems have been developed and compared for inter- and intraobserver consistency

  19. Normal Forms for Fuzzy Logics: A Proof-Theoretic Approach

    Czech Academy of Sciences Publication Activity Database

    Cintula, Petr; Metcalfe, G.

    2007-01-01

    Roč. 46, č. 5-6 (2007), s. 347-363 ISSN 1432-0665 R&D Projects: GA MŠk(CZ) 1M0545 Institutional research plan: CEZ:AV0Z10300504 Keywords : fuzzy logic * normal form * proof theory * hypersequents Subject RIV: BA - General Mathematics Impact factor: 0.620, year: 2007

  20. A New One-Pass Transformation into Monadic Normal Form

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2003-01-01

    We present a translation from the call-by-value λ-calculus to monadic normal forms that includes short-cut boolean evaluation. The translation is higher-order, operates in one pass, duplicates no code, generates no chains of thunks, and is properly tail recursive. It makes a crucial use of symbolic...

  1. Cephalometric Standards of Pre-University Boys with Normal Occlusion in Hamadan

    Directory of Open Access Journals (Sweden)

    N. Farhadian

    2005-04-01

    The important basis of orthodontic treatment is correct diagnosis. One of the diagnostic tools is the lateral cephalogram. There are some differences in normal standards between different races. The present study was carried out with the aim of determining and assessing the cephalometric standards of boys aged 17 to 20 years in Hamadan. Among 1204 boys of pre-university centers, 27 were selected based on IOTN and normal occlusal standards. Lateral cephalograms were obtained in Natural Head Position. Twenty-two cephalometric variables (15 angles, 5 lines, 2 ratios) were determined and measured three times by an orthodontist. Student's t-test was used for analysis. The mean age of the cases was 18.2±1.4 years. The range of the reliability coefficient was 0.901 to 0.986. In comparison with similar studies, the following variables were statistically different at the p<0.05 level: Articular Angle = 146, Gonial Angle = 118, NPog-TH = 89, AB-TH = 4.6, L1-TH = 116, GoGn-TH = 20, Ant. Cranial Base = 76 mm. The length of the anterior cranial base in our study was significantly less than the Michigan standards, and there was a tendency toward a straighter profile in this evaluation. In comparison with the Cooke standards there was less protrusion of the mandibular incisors and more counter-clockwise rotation of the mandible. In comparison with a similar study on girls (with normal occlusion, aged 18.2±1.1 years), linear measurements were generally greater in boys. It is therefore important to consider ethnic and racial variations in the ideal treatment plan.

  2. Standardized waste form test methods

    International Nuclear Information System (INIS)

    Slate, S.C.

    1984-01-01

    The Materials Characterization Center (MCC) is developing standard tests to characterize nuclear waste forms. Development of the first thirteen tests was originally initiated to provide data to compare different high-level waste (HLW) forms and to characterize their basic performance. The current status of the first thirteen MCC tests and some sample test results are presented: the radiation stability tests (MCC-6 and 12) and the tensile-strength test (MCC-11) are approved; the static leach tests (MCC-1, 2, and 3) are being reviewed for full approval; the thermal stability (MCC-7) and microstructure evaluation (MCC-13) methods are being considered for the first time; and the flowing leach test methods (MCC-4 and 5), the gas generation methods (MCC-8 and 9), and the brittle fracture method (MCC-10) are indefinitely delayed. Sample static leach test data on the ARM-1 approved reference material are presented. Established tests and proposed new tests will be used to meet new testing needs. For waste form production, tests on stability and composition measurement are needed to provide data to ensure waste form quality. In transportation, data are needed to evaluate the effects of accidents on canisterized waste forms. The new MCC-15 accident test method and some data are presented. Compliance testing needs required by the recent draft repository waste acceptance specifications are described. These specifications will control waste form contents, processing, and performance

  3. Standardized waste form test methods

    International Nuclear Information System (INIS)

    Slate, S.C.

    1984-11-01

    The Materials Characterization Center (MCC) is developing standard tests to characterize nuclear waste forms. Development of the first thirteen tests was originally initiated to provide data to compare different high-level waste (HLW) forms and to characterize their basic performance. The current status of the first thirteen MCC tests and some sample test results is presented: The radiation stability tests (MCC-6 and 12) and the tensile-strength test (MCC-11) are approved; the static leach tests (MCC-1, 2, and 3) are being reviewed for full approval; the thermal stability (MCC-7) and microstructure evaluation (MCC-13) methods are being considered for the first time; and the flowing leach test methods (MCC-4 and 5), the gas generation methods (MCC-8 and 9), and the brittle fracture method (MCC-10) are indefinitely delayed. Sample static leach test data on the ARM-1 approved reference material are presented. Established tests and proposed new tests will be used to meet new testing needs. For waste form production, tests on stability and composition measurement are needed to provide data to ensure waste form quality. In transportation, data are needed to evaluate the effects of accidents on canisterized waste forms. The new MCC-15 accident test method and some data are presented. Compliance testing needs required by the recent draft repository waste acceptance specifications are described. These specifications will control waste form contents, processing, and performance. 2 references, 2 figures

  4. 48 CFR 53.301-1427 - Standard Form 1427, Inventory Schedule A-Construction Sheet (Metals in Mill Product Form).

    Science.gov (United States)

    2010-10-01

    48 Federal Acquisition Regulations System 2 (2010-10-01). Illustrations of Forms, 53.301-1427. Standard Form 1427, Inventory Schedule A - Construction Sheet (Metals in Mill Product Form).

  5. Normal standards for kidney length as measured with US in premature infants

    International Nuclear Information System (INIS)

    Schlesinger, A.E.; Hedlund, G.L.; Pierson, W.P.; Null, D.M.

    1986-01-01

    In order to develop normal standards for kidney length in premature infants, the authors measured kidney length by US imaging in 39 (to date) premature infants less than 72 hours old and without known renal disease. Kidney length was compared with four different parameters of body size, including gestational age, birth weight, birth length, and body surface area. Similar standards have been generated previously for normal renal length as measured by US imaging in full-term infants and older children. These standards have proven utility in cases of congenital and acquired disorders that abnormally increase or decrease renal size. Scatter plots of kidney length versus body weight and kidney length versus body surface area conformed well to a logarithmic distribution, with a high correlation coefficient and close-fitting 95% confidence limits (SEE = 2.05)

  6. 41 CFR 101-1.4901 - Standard forms. [Reserved

    Science.gov (United States)

    2010-07-01

    41 Public Contracts and Property Management 2 (2010-07-01). Federal Property Management Regulations System, FEDERAL PROPERTY MANAGEMENT REGULATIONS, GENERAL, 1-INTRODUCTION, 1.49 Illustrations of Forms. § 101-1.4901 Standard forms. [Reserved]

  7. 41 CFR 101-39.4901 - Obtaining standard and optional forms.

    Science.gov (United States)

    2010-07-01

    41 Public Contracts and Property Management 2 (2010-07-01). Federal Property Management Regulations, VEHICLES, 39-INTERAGENCY FLEET MANAGEMENT SYSTEMS, 39.49 Forms. § 101-39.4901 Obtaining standard and optional forms.

  8. Analysis of approaches to classification of forms of non-standard employment

    Directory of Open Access Journals (Sweden)

    N. V. Dorokhova

    2017-01-01

    Non-standard forms of employment and flexible working are becoming increasingly widespread, yet there is no clear approach to the definition and scope of non-standard employment. The article analyzes diverse interpretations of the concept, on the basis of which the author concludes that non-standard employment is a complex and contradictory economic category. Different approaches to the classification of forms of non-standard employment are examined. The main forms considered are the flexible working year, the flexible working week, flexible working hours, remote work, on-call work, rotational shift work, agency employment, self-employment, contracted work, underemployment, overemployment, employment on fixed-term contracts, employment based on contracts of a civil-law nature, one-time employment, casual employment, temporary employment, secondary employment and part-time employment. The author's approach to classifying non-standard forms of employment is based on identifying the impact of atypical employment on the development of human potential. For the purpose of classifying non-standard employment forms from the standpoint of their impact on human development, the following classification criteria are proposed: working conditions, wages and social guarantees, the possibility of workers' participation in management, personal development, and employment stability. Depending on the value each of these criteria takes, a given form of non-standard employment can be classed as progressive or regressive. A classification of non-standard forms of employment should form the basis of state employment policy.

  9. Fast Bitwise Implementation of the Algebraic Normal Form Transform

    OpenAIRE

    Bakoev, Valentin

    2017-01-01

    The representation of Boolean functions by their algebraic normal forms (ANFs) is very important for cryptography, coding theory and other scientific areas. The ANFs are used in computing the algebraic degree of S-boxes, some other cryptographic criteria and parameters of error-correcting codes. Their applications require these criteria and parameters to be computed by fast algorithms. Hence the corresponding ANFs should also be obtained by fast algorithms. Here we continue o...
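
    The core of a fast ANF computation is the butterfly (fast Möbius, or Reed-Muller) transform, shown here on a plain Python list. The paper's bitwise variant packs the truth table into machine words for speed, but the index pattern is the same; this sketch illustrates the algorithm family, not the paper's code.

```python
def anf_transform(truth_table):
    """Fast Mobius (Reed-Muller) transform: truth table -> ANF coefficients
    of a Boolean function of n variables, using O(n * 2^n) XORs."""
    f = list(truth_table)
    n = len(f).bit_length() - 1
    assert len(f) == 1 << n, "truth table length must be a power of two"
    step = 1
    while step < len(f):
        for block in range(0, len(f), 2 * step):
            for j in range(block, block + step):
                f[j + step] ^= f[j]   # in-place butterfly over GF(2)
        step <<= 1
    return f

# f(x1, x2) = x1 OR x2 has truth table [f(00), f(01), f(10), f(11)] = [0,1,1,1];
# its ANF is x2 + x1 + x1*x2, i.e. coefficients [0, 1, 1, 1].
print(anf_transform([0, 1, 1, 1]))   # -> [0, 1, 1, 1]
```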

  10. 46 CFR 308.409 - Standard form of War Risk Builder's Risk Insurance Policy, Form MA-283.

    Science.gov (United States)

    2010-10-01

    46 Shipping, MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION. § 308.409 Standard form of War Risk Builder's Risk Insurance Policy, Form MA-283. The standard form of War Risk Builder's Risk Insurance Policy, Form MA-283, may be obtained from the American War Risk Agency or MARAD.

  11. Standard forms of construction contracts in Romania

    Directory of Open Access Journals (Sweden)

    Cristian Bănică

    2013-12-01

    Full Text Available The construction industry in Romania is under pressure to modernize in order to cope with the new demands of development and convergence with the EU. Contractual procedures in construction have to become an integral part of this process of modernization. The article introduces the advantages of standard forms of contract and of professional contract administration in construction, and presents the current state of the art in the use of standard construction contracts in Romania. Some practical conclusions and recommendations are presented, considering the need for further contract studies.

  12. Improved overall delivery documentation following implementation of a standardized shoulder dystocia delivery form

    Science.gov (United States)

    Moragianni, Vasiliki A.; Hacker, Michele R.; Craparo, Frank J.

    2013-01-01

    Objective: Our objective was to evaluate whether using a standardized shoulder dystocia delivery form improved documentation. A standardized delivery form was added to our institution's obstetrical record in August 2003. Methods: A retrospective cohort study was conducted comparing 100 vaginal deliveries complicated by shoulder dystocia before, and 81 after, implementation of the standardized delivery form. The two groups were compared in terms of obstetric characteristics, neonatal outcomes and documentation components. Results: Charts that included the standardized delivery form were more likely to contain documentation of estimated fetal weight (82.7% vs. 39.0% without the form, P<…), duration of shoulder dystocia, and second stage duration. Conclusions: Inclusion of a standardized form in the delivery record improves the rate of documentation of both shoulder dystocia-specific and general delivery components. PMID:22017330

  13. Oblique projections and standard-form transformations for discrete inverse problems

    DEFF Research Database (Denmark)

    Hansen, Per Christian

    2013-01-01

    This tutorial paper considers a specific computational tool for the numerical solution of discrete inverse problems, known as the standard-form transformation, by which we can treat general Tikhonov regularization problems efficiently. In the tradition of B. N. Datta's expositions of numerical linear algebra, we use the close relationship between oblique projections, pseudoinverses, and matrix computations to derive a simple geometric motivation and algebraic formulation of the standard-form transformation.
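
    For intuition, here is a minimal numpy sketch of the transformation in the simplest setting, where the regularization matrix L is square and invertible; the oblique projections and pseudoinverses discussed in the paper are needed precisely to handle the general rectangular case, which this sketch does not cover.

```python
import numpy as np

def tikhonov_via_standard_form(A, b, L, lam):
    """Solve min ||A x - b||^2 + lam^2 ||L x||^2 with square invertible L.

    Substituting y = L x turns the general-form problem into standard
    form: min ||(A L^-1) y - b||^2 + lam^2 ||y||^2.
    """
    A_bar = A @ np.linalg.inv(L)                 # transformed operator
    n = A_bar.shape[1]
    # Standard-form solution via the regularized normal equations
    y = np.linalg.solve(A_bar.T @ A_bar + lam**2 * np.eye(n), A_bar.T @ b)
    return np.linalg.solve(L, y)                 # back-transform: x = L^-1 y
```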

  14. Effects of a prolonged standardized diet on normalizing the human metabolome

    OpenAIRE

    Winnike, Jason H; Busby, Marjorie G; Watkins, Paul B; O'Connell, Thomas M

    2009-01-01

    Background: Although the effects of acute dietary interventions on the human metabolome have been studied, the extent to which the metabolome can be normalized by extended dietary standardization has not yet been examined.

  15. 7 CFR 28.123 - Costs of practical forms of cotton standards.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Costs of practical forms of cotton standards. 28.123 Section 28.123 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE COMMODITY STANDARDS AND STANDARD...

  16. Standard heart and vessel size on plain films of normal children

    International Nuclear Information System (INIS)

    Stoever, B.

    1986-01-01

    Standards of heart size, i.e. heart diameters and heart volume, of normal children aged 4-15 years were obtained. In all cases requiring exact heart size determination, heart volume calculation is mandatory in children as well as in adults. Statistical work to date has provided precise calculation of heart volume from plain films in the upright position. Additional plain films in the prone position are unnecessary, because no evident orthostatic influence on heart volume in children can be found. Percentiles of normal heart volume related to body weight, which show the best correlation to the individual data, are given, as well as percentiles related to age. Furthermore, ratios of normal vessel size to the height of the 8th thoracic vertebral body, measured on the same plain film, are given. In addition, the ratio of upper to lower lung vessel size is calculated. These ratios are useful criteria in estimating normal vessel size and also in cases with increased pulmonary venous pressure. (orig.)

  17. On the construction of the Kolmogorov normal form for the Trojan asteroids

    CERN Document Server

    Gabern, F; Locatelli, U

    2004-01-01

    In this paper we focus on the stability of the Trojan asteroids for the planar Restricted Three-Body Problem (RTBP), by extending the usual techniques for the neighbourhood of an elliptic point to derive results in a larger vicinity. Our approach is based on the numerical determination of the frequencies of the asteroid and the effective computation of the Kolmogorov normal form for the corresponding torus. This procedure has been applied to the first 34 Trojan asteroids of the IAU Asteroid Catalog, and it has worked successfully for 23 of them. The construction of this normal form allows for computer-assisted proofs of stability. To show it, we have implemented a proof of existence of families of invariant tori close to a given asteroid, for a high order expansion of the Hamiltonian. This proof has been successfully applied to three Trojan asteroids.

  18. Best-Matched Internal Standard Normalization in Liquid Chromatography-Mass Spectrometry Metabolomics Applied to Environmental Samples.

    Science.gov (United States)

    Boysen, Angela K; Heal, Katherine R; Carlson, Laura T; Ingalls, Anitra E

    2018-01-16

    The goal of metabolomics is to measure the entire range of small organic molecules in biological samples. In liquid chromatography-mass spectrometry-based metabolomics, formidable analytical challenges remain in removing the nonbiological factors that affect chromatographic peak areas. These factors include sample matrix-induced ion suppression, chromatographic quality, and analytical drift. The combination of these factors is referred to as obscuring variation. Some metabolomics samples can exhibit intense obscuring variation due to matrix-induced ion suppression, rendering large amounts of data unreliable and difficult to interpret. Existing normalization techniques have limited applicability to these sample types. Here we present a data normalization method to minimize the effects of obscuring variation. We normalize peak areas using a batch-specific normalization process, which matches measured metabolites with isotope-labeled internal standards that behave similarly during the analysis. This method, called best-matched internal standard (B-MIS) normalization, can be applied to targeted or untargeted metabolomics data sets and yields relative concentrations. We evaluate and demonstrate the utility of B-MIS normalization using marine environmental samples and laboratory grown cultures of phytoplankton. In untargeted analyses, B-MIS normalization allowed for inclusion of mass features in downstream analyses that would have been considered unreliable without normalization due to obscuring variation. B-MIS normalization for targeted or untargeted metabolomics is freely available at https://github.com/IngallsLabUW/B-MIS-normalization .
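
    A schematic of the matching step, under the assumption (consistent with the abstract) that the best internal standard for a metabolite is the one whose normalization leaves the least residual variation in pooled reference samples; the array layout and the CV criterion below are illustrative, not the authors' exact implementation.

```python
import numpy as np

def bmis_normalize(peaks, internal_standards, qc_rows):
    """Best-matched internal standard (B-MIS) style normalization sketch.

    peaks              : (n_samples, n_metabolites) raw peak areas
    internal_standards : (n_samples, n_standards) labeled IS peak areas
    qc_rows            : row indices of pooled reference (QC) samples
    For each metabolite, pick the IS minimizing the coefficient of
    variation of the normalized QC areas, then return relative
    concentrations normalized by that best-matched IS.
    """
    out = np.empty(peaks.shape, dtype=float)
    for j in range(peaks.shape[1]):
        best_cv, best = np.inf, None
        for k in range(internal_standards.shape[1]):
            normalized = peaks[:, j] / internal_standards[:, k]
            qc = normalized[qc_rows]
            cv = qc.std() / qc.mean()     # obscuring variation remaining
            if cv < best_cv:
                best_cv, best = cv, normalized
        out[:, j] = best
    return out
```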

  19. Understanding Emotions from Standardized Facial Expressions in Autism and Normal Development

    Science.gov (United States)

    Castelli, Fulvia

    2005-01-01

    The study investigated the recognition of standardized facial expressions of emotion (anger, fear, disgust, happiness, sadness, surprise) at a perceptual level (experiment 1) and at a semantic level (experiments 2 and 3) in children with autism (N= 20) and normally developing children (N= 20). Results revealed that children with autism were as…

  20. Generating All Permutations by Context-Free Grammars in Chomsky Normal Form

    NARCIS (Netherlands)

    Asveld, P.R.J.; Spoto, F.; Scollo, Giuseppe; Nijholt, Antinus

    2003-01-01

    Let $L_n$ be the finite language of all $n!$ strings that are permutations of $n$ different symbols ($n\geq 1$). We consider context-free grammars $G_n$ in Chomsky normal form that generate $L_n$. In particular we study a few families $\{G_n\}_{n\geq 1}$, satisfying $L(G_n)=L_n$ for $n\geq 1$, with
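
    To make the setting concrete, here is a hypothetical CNF grammar for the smallest nontrivial case n = 2, checked with a generic CYK recognizer; the grammar families studied in these papers are considerably more economical for larger n.

```python
from itertools import permutations

# A CNF grammar G_2 for L_2 = {ab, ba}: every rule is A -> BC or A -> a.
rules_bin = {("A", "B"): {"S"}, ("B", "A"): {"S"}}
rules_term = {"a": {"A"}, "b": {"B"}}

def cyk(word):
    n = len(word)
    # table[i][l] = nonterminals deriving the substring word[i : i+l+1]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(word):
        table[i][0] = set(rules_term.get(ch, set()))
    for l in range(1, n):                  # span length minus one
        for i in range(n - l):
            for k in range(l):             # split point inside the span
                for B in table[i][k]:
                    for C in table[i + k + 1][l - k - 1]:
                        table[i][l] |= rules_bin.get((B, C), set())
    return "S" in table[0][n - 1]

assert all(cyk("".join(p)) for p in permutations("ab"))
assert not cyk("aa") and not cyk("bb")
```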

  1. Generating all permutations by context-free grammars in Chomsky normal form

    NARCIS (Netherlands)

    Asveld, P.R.J.

    2006-01-01

    Let $L_n$ be the finite language of all $n!$ strings that are permutations of $n$ different symbols ($n\geq1$). We consider context-free grammars $G_n$ in Chomsky normal form that generate $L_n$. In particular we study a few families $\{G_n\}_{n\geq1}$, satisfying $L(G_n)=L_n$ for $n\geq1$, with

  2. Generating All Permutations by Context-Free Grammars in Chomsky Normal Form

    NARCIS (Netherlands)

    Asveld, P.R.J.

    2004-01-01

    Let $L_n$ be the finite language of all $n!$ strings that are permutations of $n$ different symbols ($n\geq 1$). We consider context-free grammars $G_n$ in Chomsky normal form that generate $L_n$. In particular we study a few families $\{G_n\}_{n\geq1}$, satisfying $L(G_n)=L_n$ for $n\geq 1$, with

  3. On some hypersurfaces with time like normal bundle in pseudo Riemannian space forms

    International Nuclear Information System (INIS)

    Kashani, S.M.B.

    1995-12-01

    In this work we classify immersed hypersurfaces with constant sectional curvature in pseudo Riemannian space forms if the normal bundle is time like and the mean curvature is constant. (author). 9 refs

  4. 12 CFR 22.6 - Required use of standard flood hazard determination form.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 1 2010-01-01 2010-01-01 false Required use of standard flood hazard determination form. 22.6 Section 22.6 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY... the Act. The standard flood hazard determination form may be used in a printed, computerized, or...

  5. Mandibulary dental arch form differences between level four polynomial method and pentamorphic pattern for normal occlusion sample

    Directory of Open Access Journals (Sweden)

    Y. Yuliana

    2011-07-01

    Full Text Available The aim of orthodontic treatment is to achieve aesthetics, health of the teeth and surrounding tissues, a functional occlusal relationship, and stability. The success of an orthodontic treatment is influenced by many factors, such as the diagnosis and the treatment plan. In order to make a diagnosis and a treatment plan, the medical record, clinical examination, radiographic examination, extraoral and intraoral photographs, as well as study model analysis, are needed. The purpose of this study was to evaluate the differences in dental arch form between the level four polynomial method and the pentamorphic arch form, and to determine which one is best suited for a normal occlusion sample. This analytic comparative study was conducted at the Faculty of Dentistry, Universitas Padjadjaran, on 13 models, by comparing the dental arch form obtained using the level four polynomial method, based on mathematical calculations, with the pentamorphic arch pattern, using mandibular normal occlusion as a control. The results were tested using Student's t-test and indicate a significant difference between both the level four polynomial method and the pentamorphic arch form when compared with the mandibular normal occlusion dental arch form. The level four polynomial fits better compared with the pentamorphic arch form.

  6. Application of normal form methods to the analysis of resonances in particle accelerators

    International Nuclear Information System (INIS)

    Davies, W.G.

    1992-01-01

    The transformation to normal form in a Lie-algebraic framework provides a very powerful method for identifying and analysing non-linear behaviour and resonances in particle accelerators. The basic ideas are presented and illustrated. (author). 4 refs

  7. Development of standard testing methods for nuclear-waste forms

    International Nuclear Information System (INIS)

    Mendel, J.E.; Nelson, R.D.

    1981-11-01

    Standard test methods for waste package component development and design, safety analyses, and licensing are being developed for the Nuclear Waste Materials Handbook. This paper mainly describes the testing methods for obtaining waste form materials data

  8. Standard-Chinese Lexical Neighborhood Test in normal-hearing young children.

    Science.gov (United States)

    Liu, Chang; Liu, Sha; Zhang, Ning; Yang, Yilin; Kong, Ying; Zhang, Luo

    2011-06-01

    The purposes of the present study were to establish the Standard-Chinese version of Lexical Neighborhood Test (LNT) and to examine the lexical and age effects on spoken-word recognition in normal-hearing children. Six lists of monosyllabic and six lists of disyllabic words (20 words/list) were selected from the database of daily speech materials for normal-hearing (NH) children of ages 3-5 years. The lists were further divided into "easy" and "hard" halves according to the word frequency and neighborhood density in the database based on the theory of Neighborhood Activation Model (NAM). Ninety-six NH children (age ranged between 4.0 and 7.0 years) were divided into three different age groups of 1-year intervals. Speech-perception tests were conducted using the Standard-Chinese monosyllabic and disyllabic LNT. The inter-list performance was found to be equivalent and inter-rater reliability was high with 92.5-95% consistency. Results of word-recognition scores showed that the lexical effects were all significant. Children scored higher with disyllabic words than with monosyllabic words. "Easy" words scored higher than "hard" words. The word-recognition performance also increased with age in each lexical category. A multiple linear regression analysis showed that neighborhood density, age, and word frequency appeared to have increasingly more contributions to Chinese word recognition. The results of the present study indicated that performances of Chinese word recognition were influenced by word frequency, age, and neighborhood density, with word frequency playing a major role. These results were consistent with those in other languages, supporting the application of NAM in the Chinese language. The development of Standard-Chinese version of LNT and the establishment of a database of children of 4-6 years old can provide a reliable means for spoken-word recognition test in children with hearing impairment. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  9. Syntactic Dependencies and Verbal Inflection: Complementisers and Verbal Forms in Standard Arabic

    Directory of Open Access Journals (Sweden)

    Feras Saeed

    2015-12-01

    Full Text Available This paper investigates the syntactic dependency between complementisers and verbal forms in Standard Arabic and provides a new analysis of this dependency. The imperfective verb in this language surfaces with three different forms, where each form is indicated by a different suffixal marker attached to the end of the verb as (-u), (-a), or (-Ø). The occurrence of each suffixal marker on the verb corresponds to the co-occurrence of a particular type of Comp-element in the C/T domain. I argue that these morphological markers on the three verbal forms are the manifestation of an Agree relation between an interpretable unvalued finiteness feature [Fin] on C and an uninterpretable but valued instance of the same feature on v, assuming feature transfer and feature sharing between C/T and v (Pesetsky & Torrego 2007; Chomsky 2008). I also argue that the different verbal forms in Standard Arabic are dictated by the co-occurrence of three types of Comp-elements: (i) C-elements; (ii) T-elements which ultimately move to C; and (iii) imperative/negative elements. Keywords: feature transfer/sharing, verbal forms, complementisers, finiteness, syntactic dependency, Standard Arabic

  10. Normal form of particle motion under the influence of an ac dipole

    Directory of Open Access Journals (Sweden)

    R. Tomás

    2002-05-01

    Full Text Available ac dipoles in accelerators are used to excite coherent betatron oscillations at a drive frequency close to the tune. These beam oscillations may last arbitrarily long and, in principle, there is no significant emittance growth if the ac dipole is adiabatically turned on and off. Therefore the ac dipole seems to be an adequate tool for nonlinear diagnostics provided the particle motion is well described in the presence of the ac dipole and nonlinearities. Normal forms and Lie algebra are powerful tools to study the nonlinear content of an accelerator lattice. In this article a way to obtain the normal form of the Hamiltonian of an accelerator with an ac dipole is described. The particle motion to first order in the nonlinearities is derived using Lie algebra techniques. The dependence of the Hamiltonian terms on the longitudinal coordinate is studied showing that they vary differently depending on the ac dipole parameters. The relation is given between the lines of the Fourier spectrum of the turn-by-turn motion and the Hamiltonian terms.

  11. 41 CFR 102-194.5 - What is the Standard and Optional Forms Management Program?

    Science.gov (United States)

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What is the Standard and Optional Forms Management Program? 102-194.5 Section 102-194.5 Public Contracts and Property Management... PROGRAMS 194-STANDARD AND OPTIONAL FORMS MANAGEMENT PROGRAM § 102-194.5 What is the Standard and Optional...

  12. 76 FR 44965 - Notice of Revision of Standard Forms 39 and 39-A

    Science.gov (United States)

    2011-07-27

    ... OFFICE OF PERSONNEL MANAGEMENT Notice of Revision of Standard Forms 39 and 39-A AGENCY: U.S... Management (OPM) has revised Standard Form (SF) 39, Request For Referral Of Eligibles, and SF 39-A, Request... part 332. The SF 39 outlines instructions to be used by hiring officials to request a list of eligible...

  13. Analysis of a renormalization group method and normal form theory for perturbed ordinary differential equations

    Science.gov (United States)

    DeVille, R. E. Lee; Harkin, Anthony; Holzer, Matt; Josić, Krešimir; Kaper, Tasso J.

    2008-06-01

    For singular perturbation problems, the renormalization group (RG) method of Chen, Goldenfeld, and Oono [Phys. Rev. E. 49 (1994) 4502-4511] has been shown to be an effective general approach for deriving reduced or amplitude equations that govern the long time dynamics of the system. It has been applied to a variety of problems traditionally analyzed using disparate methods, including the method of multiple scales, boundary layer theory, the WKBJ method, the Poincaré-Lindstedt method, the method of averaging, and others. In this article, we show how the RG method may be used to generate normal forms for large classes of ordinary differential equations. First, we apply the RG method to systems with autonomous perturbations, and we show that the reduced or amplitude equations generated by the RG method are equivalent to the classical Poincaré-Birkhoff normal forms for these systems up to and including terms of O(ɛ²), where ɛ is the perturbation parameter. This analysis establishes our approach and generalizes to higher order. Second, we apply the RG method to systems with nonautonomous perturbations, and we show that the reduced or amplitude equations so generated constitute time-asymptotic normal forms, which are based on KBM averages. Moreover, for both classes of problems, we show that the main coordinate changes are equivalent, up to translations between the spaces in which they are defined. In this manner, our results show that the RG method offers a new approach for deriving normal forms for nonautonomous systems, and it offers advantages since one can typically more readily identify resonant terms from naive perturbation expansions than from the nonautonomous vector fields themselves. Finally, we establish how well the solution to the RG equations approximates the solution of the original equations on time scales of O(1/ɛ).

  14. Child in a Form: The Definition of Normality and Production of Expertise in Teacher Statement Forms--The Case of Northern Finland, 1951-1990

    Science.gov (United States)

    Koskela, Anne; Vehkalahti, Kaisa

    2017-01-01

    This article shows the importance of paying attention to the role of professional devices, such as standardised forms, as producers of normality and deviance in the history of education. Our case study focused on the standardised forms used by teachers during child guidance clinic referrals and transfers to special education in northern Finland,…

  15. Quantitative Analysis of Torso FDG-PET Scans by Using Anatomical Standardization of Normal Cases from Thorough Physical Examinations.

    Directory of Open Access Journals (Sweden)

    Takeshi Hara

    Full Text Available Understanding the standardized uptake value (SUV) of 2-deoxy-2-[18F]fluoro-d-glucose positron emission tomography (FDG-PET) depends on the background accumulation of glucose, because the SUV often varies with the status of the patient. The purpose of this study was to develop a new method for quantitative analysis of the SUV in FDG-PET scan images. The method included an anatomical standardization and a statistical comparison with normal cases using the Z-score, as is often done in the SPM or 3D-SSP approaches for brain function analysis. Our scheme consisted of two parts: the construction of a normal model and the determination of SUV scores as a Z-score index measuring the abnormality of an FDG-PET scan image. To construct the normal torso model, all of the normal images were registered into one shape, which indicated the normal range of the SUV at all voxels. The image deformation process consisted of a whole-body rigid registration of the shoulder-to-bladder region, a liver registration, and a non-linear registration of the body surface using the thin-plate spline technique. In order to validate the usefulness of our method, we segmented suspicious regions on FDG-PET images manually and obtained the Z-scores of the regions based on the corresponding voxels, which store the means and standard deviations from the normal model. We collected 243 normal cases (143 males and 100 females) to construct the normal model. We also extracted 432 abnormal spots from 63 abnormal cases (73 cancer lesions) to validate the Z-scores. The Z-scores of 417 out of 432 abnormal spots were higher than 2.0, which statistically indicated the severity of the spots. In conclusion, the Z-scores obtained by our computerized scheme with anatomical standardization of the torso region would be useful for the visualization and detection of subtle lesions on FDG-PET scan images, even when the SUV alone may not clearly show an abnormality.
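
    The abnormality index described here is a voxelwise z-score against the normal model; a minimal sketch (array names are assumed, and the volumes are taken to be already anatomically standardized onto the model):

```python
import numpy as np

def voxelwise_z_scores(suv, normal_mean, normal_std, threshold=2.0):
    """Z-score map of an anatomically standardized FDG-PET volume.

    suv         : 3-D SUV array, deformed onto the normal torso model
    normal_mean : voxelwise mean SUV of the normal cases
    normal_std  : voxelwise standard deviation of the normal cases
    Voxels with Z above the threshold (2.0 in the study) are flagged.
    """
    z = (suv - normal_mean) / np.maximum(normal_std, 1e-6)  # avoid /0
    return z, z > threshold
```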

  16. A Mathematical Framework for Critical Transitions: Normal Forms, Variance and Applications

    Science.gov (United States)

    Kuehn, Christian

    2013-06-01

    Critical transitions occur in a wide variety of applications including mathematical biology, climate change, human physiology and economics. Therefore it is highly desirable to find early-warning signs. We show that it is possible to classify critical transitions by using bifurcation theory and normal forms in the singular limit. Based on this elementary classification, we analyze stochastic fluctuations and calculate scaling laws of the variance of stochastic sample paths near critical transitions for fast-subsystem bifurcations up to codimension two. The theory is applied to several models: the Stommel-Cessi box model for the thermohaline circulation from geoscience, an epidemic-spreading model on an adaptive network, an activator-inhibitor switch from systems biology, a predator-prey system from ecology and to the Euler buckling problem from classical mechanics. For the Stommel-Cessi model we compare different detrending techniques to calculate early-warning signs. In the epidemics model we show that link densities could be better variables for prediction than population densities. The activator-inhibitor switch demonstrates effects in three time-scale systems and points out that excitable cells and molecular units have information for subthreshold prediction. In the predator-prey model explosive population growth near a codimension-two bifurcation is investigated and we show that early-warnings from normal forms can be misleading in this context. In the biomechanical model we demonstrate that early-warning signs for buckling depend crucially on the control strategy near the instability which illustrates the effect of multiplicative noise.
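
    As a minimal illustration of the variance-based early-warning sign, one can simulate the noisy saddle-node (fold) normal form and watch the stationary variance grow as the bifurcation at mu = 0 is approached; all parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy fold normal form:  dx = (-mu - x**2) dt + sigma dW,  mu < 0.
# The stable branch x* = sqrt(-mu) has linearization rate 2*sqrt(-mu),
# so the stationary variance ~ sigma**2 / (4*sqrt(-mu)) blows up as
# mu -> 0-: fluctuations grow before the transition happens.
dt, sigma, n_steps = 1e-3, 0.02, 100_000
for mu in (-1.0, -0.25, -0.05):
    x = np.sqrt(-mu)                  # start on the stable branch
    xs = np.empty(n_steps)
    for t in range(n_steps):
        x += (-mu - x**2) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        xs[t] = x
    print(f"mu = {mu:+.2f}   sample variance = {xs.var():.2e}")
```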

  17. Estimates and Standard Errors for Ratios of Normalizing Constants from Multiple Markov Chains via Regeneration.

    Science.gov (United States)

    Doss, Hani; Tan, Aixin

    2014-09-01

    In the classical biased sampling problem, we have k densities π_1(·), …, π_k(·), each known up to a normalizing constant, i.e. for l = 1, …, k, π_l(·) = ν_l(·)/m_l, where ν_l(·) is a known function and m_l is an unknown constant. For each l, we have an iid sample from π_l, and the problem is to estimate the ratios m_l/m_s for all l and all s. This problem arises frequently in several situations in both frequentist and Bayesian inference. An estimate of the ratios was developed and studied by Vardi and his co-workers over two decades ago, and there has been much subsequent work on this problem from many different perspectives. In spite of this, there are no rigorous results in the literature on how to estimate the standard error of the estimate. We present a class of estimates of the ratios of normalizing constants that are appropriate for the case where the samples from the π_l's are not necessarily iid sequences, but are Markov chains. We also develop an approach based on regenerative simulation for obtaining standard errors for the estimates of ratios of normalizing constants. These standard error estimates are valid for both the iid case and the Markov chain case.
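
    For a feel of the estimand, here is a toy importance-sampling version with k = 2 and iid draws; the paper's actual contribution, valid standard errors when the samples are regenerative Markov chains, is not reproduced here, and the densities are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# nu_1(x) = exp(-x**2/2)  with  m_1 = sqrt(2*pi)   (standard normal)
# nu_2(x) = exp(-|x|)     with  m_2 = 2            (Laplace)
# For X ~ pi_2,  E[nu_1(X)/nu_2(X)] = m_1/m_2, suggesting the simple
# moment estimator below.
x = rng.laplace(size=100_000)                  # iid draws from pi_2
ratio_hat = np.mean(np.exp(-x**2 / 2 + np.abs(x)))
print(ratio_hat, np.sqrt(2 * np.pi) / 2)       # both approx 1.2533
```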

  18. A Denotational Account of Untyped Normalization by Evaluation

    DEFF Research Database (Denmark)

    Filinski, Andrzej; Rohde, Henning Korsholm

    2004-01-01

    Abstract. We show that the standard normalization-by-evaluation construction for the simply-typed λβη-calculus has a natural counterpart for the untyped λβ-calculus, with the central type-indexed logical relation replaced by a “recursively defined” invariant relation, in the style of Pitts. In fact, the construction can be seen as generalizing a computational adequacy argument for an untyped, call-by-name language to normalization instead of evaluation. In the untyped setting, not all terms have normal forms, so the normalization function is necessarily partial. We establish its correctness in the senses

  19. Standard Test Method for Normal Spectral Emittance at Elevated Temperatures

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1972-01-01

    1.1 This test method describes a highly accurate technique for measuring the normal spectral emittance of electrically conducting materials or materials with electrically conducting substrates, in the temperature range from 600 to 1400 K, and at wavelengths from 1 to 35 μm. 1.2 The test method requires expensive equipment and rather elaborate precautions, but produces data that are accurate to within a few percent. It is suitable for research laboratories where the highest precision and accuracy are desired, but is not recommended for routine production or acceptance testing. However, because of its high accuracy this test method can be used as a referee method to be applied to production and acceptance testing in cases of dispute. 1.3 The values stated in SI units are to be regarded as the standard. The values in parentheses are for information only. 1.4 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this stan...

  20. Generating All Circular Shifts by Context-Free Grammars in Greibach Normal Form

    NARCIS (Netherlands)

    Asveld, Peter R.J.

    2007-01-01

    For each alphabet Σ_n = {a_1, a_2, …, a_n}, linearly ordered by a_1 < a_2 < ⋯ < a_n, let C_n be the language of circular or cyclic shifts over Σ_n, i.e., C_n = {a_1a_2 ⋯ a_{n-1}a_n, a_2a_3 ⋯ a_na_1, …, a_na_1 ⋯ a_{n-2}a_{n-1}}. We study a few families of context-free grammars G_n (n ≥ 1) in Greibach normal form such that G_n generates

  1. Normalize the response of EPID in pursuit of linear accelerator dosimetry standardization.

    Science.gov (United States)

    Cai, Bin; Goddu, S Murty; Yaddanapudi, Sridhar; Caruthers, Douglas; Wen, Jie; Noel, Camille; Mutic, Sasa; Sun, Baozhou

    2018-01-01

    Normalizing the response of the electronic portal imaging device (EPID) is the first step toward an EPID-based standardization of linear accelerator (linac) dosimetry quality assurance. In this study, we describe an approach to generating two-dimensional (2D) pixel sensitivity maps (PSM) for EPID response normalization, utilizing an alternative beam and dark-field (ABDF) image acquisition technique and large overlapping field irradiations. The automated image acquisition was performed by XML-controlled machine operation, and the PSM was generated with a recursive calculation algorithm for Varian linacs equipped with aS1000 and aS1200 imager panels. Cross-comparison of normalized beam profiles and 1.5%/1.5 mm 1D gamma analysis were adopted to quantify the improvement in beam profile matching before and after PSM correction. PSMs were derived for both photon (6, 10, 15 MV) and electron (6, 20 MeV) beams via the proposed method. The PSM-corrected images reproduced a horn-shaped profile for photon beams and relatively uniform profiles for electrons. For dosimetrically matched linacs equipped with aS1000 panels, PSM-corrected images showed increased 1D gamma passing rates for all energies, with an average 10.5% improvement for crossline and 37% for inline beam profiles. Similar improvements were observed in the phantom study, with a maximum improvement of 32% for 15 MV and 22% for 20 MeV. The PSM values showed no significant change for any energy over a 3-month period. In conclusion, the proposed approach corrects the EPID response for both aS1000 and aS1200 panels. This strategy makes it possible to standardize linac dosimetry QA and to benchmark linac performance utilizing the EPID as a common detector. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
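
    Conceptually, once the 2D pixel sensitivity map is known, correcting a frame is a pixelwise division; a schematic sketch (array names are assumed, and whether a dark-field subtraction precedes the division is our assumption, not a detail taken from the paper):

```python
import numpy as np

def apply_psm(raw_frame, dark_frame, psm):
    """Correct an EPID frame with a 2D pixel sensitivity map (sketch).

    raw_frame, dark_frame, psm: 2-D arrays of the panel's shape. The PSM
    holds each pixel's relative sensitivity (mean-normalized), so dividing
    it out restores the underlying fluence shape, e.g. the horn-shaped
    profile of flattened photon beams.
    """
    return (raw_frame - dark_frame) / psm
```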

  2. Bladder cancer mapping in Libya based on standardized morbidity ratio and log-normal model

    Science.gov (United States)

    Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley

    2017-05-01

    Disease mapping comprises a set of statistical techniques that produce maps of rates based on estimated mortality, morbidity, and prevalence. A traditional approach to measuring the relative risk of a disease is the Standardized Morbidity Ratio (SMR), the ratio of the observed to the expected number of cases in an area; it has the greatest uncertainty when the disease is rare or the geographical area is small. Bayesian models, or statistical smoothing based on the log-normal model, are therefore introduced, as they may solve the SMR problem. This study estimates the relative risk of bladder cancer incidence in Libya from 2006 to 2007 based on the SMR and the log-normal model, which were fitted to data using the WinBUGS software. The study starts with a brief review of these models, beginning with the SMR method and followed by the log-normal model, which is then applied to bladder cancer incidence in Libya. All results are compared using maps and tables. The study concludes that the log-normal model gives better relative risk estimates than the classical method. The log-normal model can overcome the SMR problem when no bladder cancer cases are observed in an area.
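
    A minimal sketch of the classical estimator (all names illustrative); its instability for small expected counts is exactly what the smoothed log-normal model addresses.

```python
import numpy as np

def smr(observed, population, reference_rate):
    """Standardized Morbidity Ratio per area: observed / expected counts.

    observed       : observed case counts per area
    population     : population (or person-years) per area
    reference_rate : overall incidence rate used to form expected counts
    SMR > 1 flags elevated relative risk, but the estimate is unstable
    when the expected count is small (rare disease or small area).
    """
    expected = np.asarray(population) * reference_rate
    return np.asarray(observed) / expected

print(smr([12, 3], [50_000, 4_000], 2e-4))  # expected counts 10 and 0.8
```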

  3. Center manifolds, normal forms and bifurcations of vector fields with application to coupling between periodic and steady motions

    Science.gov (United States)

    Holmes, Philip J.

    1981-06-01

    We study the instabilities known to aeronautical engineers as flutter and divergence. Mathematically, these states correspond to bifurcations to limit cycles and multiple equilibrium points in a differential equation. Making use of the center manifold and normal form theorems, we concentrate on the situation in which flutter and divergence become coupled, and show that there are essentially two ways in which this is likely to occur. In the first case the system can be reduced to an essential model which takes the form of a single degree of freedom nonlinear oscillator. This system, which may be analyzed by conventional phase-plane techniques, captures all the qualitative features of the full system. We discuss the reduction and show how the nonlinear terms may be simplified and put into normal form. Invariant manifold theory and the normal form theorem play a major role in this work and this paper serves as an introduction to their application in mechanics. Repeating the approach in the second case, we show that the essential model is now three dimensional and that far more complex behavior is possible, including nonperiodic and ‘chaotic’ motions. Throughout, we take a two degree of freedom system as an example, but the general methods are applicable to multi- and even infinite degree of freedom problems.

  4. Normalizing tweets with edit scripts and recurrent neural embeddings

    NARCIS (Netherlands)

    Chrupala, Grzegorz; Toutanova, Kristina; Wu, Hua

    2014-01-01

    Tweets often contain a large proportion of abbreviations, alternative spellings, novel words and other non-canonical language. These features are problematic for standard language analysis tools and it can be desirable to convert them to canonical form. We propose a novel text normalization model

  5. Effects of variable transformations on errors in FORM results

    International Nuclear Information System (INIS)

    Qin Quan; Lin Daojin; Mei Gang; Chen Hao

    2006-01-01

    On the basis of studies of the second partial derivatives of the variable transformation functions for nine different non-normal variables, the paper comprehensively discusses the effects of the transformation on FORM results. It shows that the signs and magnitudes of the errors in FORM results depend on the distributions of the basic variables, on whether the basic variables represent resistances or actions, and on the location of the design point in the standard normal space. The transformation of exponential or Gamma resistance variables can generate +24% errors in the FORM failure probability, and the transformation of Frechet action variables can generate -31% errors
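
    The transformation in question maps each non-normal variable X with CDF F to a standard normal U through u = Φ⁻¹(F(x)); a small scipy sketch for an exponential variable (the distribution parameters are illustrative):

```python
from scipy import stats

# The curvature (second partial derivatives) of this mapping is the
# source of the FORM errors analysed in the paper.
X = stats.expon(scale=2.0)          # e.g. an exponential resistance variable
x = 3.0
u = stats.norm.ppf(X.cdf(x))        # image of x in standard normal space
x_back = X.ppf(stats.norm.cdf(u))   # the inverse map recovers x
print(u, x_back)                    # u approx 0.76, x_back = 3.0
```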

  6. 33 CFR Appendix - List of FPC Standard Articles Forms Used in Permits and Licenses for Hydroelectric Projects

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false List of FPC Standard Articles Forms Used in Permits and Licenses for Hydroelectric Projects Navigation and Navigable Waters CORPS OF... Forms Used in Permits and Licenses for Hydroelectric Projects The following FPC standard articles Forms...

  7. 48 CFR 53.301-252 - Standard Form 252, Architect-Engineer Contract.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Standard Form 252, Architect-Engineer Contract. 53.301-252 Section 53.301-252 Federal Acquisition Regulations System FEDERAL..., Architect-Engineer Contract. ...

  8. Theory and praxis of map analysis in CHEF part 1: Linear normal form

    Energy Technology Data Exchange (ETDEWEB)

    Michelotti, Leo; /Fermilab

    2008-10-01

    This memo begins a series which, put together, could comprise the 'CHEF Documentation Project' if there were such a thing. The first--and perhaps only--three will telegraphically describe theory, algorithms, implementation and usage of the normal form map analysis procedures encoded in CHEF's collection of libraries. [1] This one will begin the sequence by explaining the linear manipulations that connect the Jacobian matrix of a symplectic mapping to its normal form. It is a 'Reader's Digest' version of material I wrote in Intermediate Classical Dynamics (ICD) [2] and randomly scattered across technical memos, seminar viewgraphs, and lecture notes for the past quarter century. Much of its content is old, well known, and in some places borders on the trivial. Nevertheless, completeness requires its inclusion. The primary objective is the 'fundamental theorem' on normalization written on page 8. I plan to describe the nonlinear procedures in a subsequent memo and devote a third to laying out algorithms and lines of code, connecting them with equations written in the first two. Originally this was to be done in one short paper, but I jettisoned that approach after its first section exceeded a dozen pages. The organization of this document is as follows. A brief description of notation is followed by a section containing a general treatment of the linear problem. After the 'fundamental theorem' is proved, two further subsections discuss the generation of equilibrium distributions and the issue of 'phase'. The final major section reviews parameterizations--that is, lattice functions--in two and four dimensions with a passing glance at the six-dimensional version. Appearances to the contrary, for the most part I have tried to restrict consideration to matters needed to understand the code in CHEF's libraries.
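
    To illustrate the linear step, here is a numpy sketch that normalizes a stable 2x2 one-turn matrix into a pure rotation, recovering the tune and the Courant-Snyder (lattice) functions; this is the textbook construction, not CHEF's actual code.

```python
import numpy as np

def linear_normal_form(M):
    """Normalize a stable 2x2 symplectic one-turn matrix: M = A R(mu) A^-1.

    Returns A (built from the Courant-Snyder functions beta, alpha) and
    the tune mu; assumes |trace(M)| < 2 (stability) and M[0, 1] != 0.
    """
    assert abs(np.linalg.det(M) - 1.0) < 1e-9        # symplectic in 2-D
    cos_mu = 0.5 * np.trace(M)
    sin_mu = np.sign(M[0, 1]) * np.sqrt(1.0 - cos_mu**2)
    beta = M[0, 1] / sin_mu
    alpha = (M[0, 0] - M[1, 1]) / (2.0 * sin_mu)
    A = np.array([[np.sqrt(beta), 0.0],
                  [-alpha / np.sqrt(beta), 1.0 / np.sqrt(beta)]])
    R = np.array([[cos_mu, sin_mu], [-sin_mu, cos_mu]])
    assert np.allclose(A @ R @ np.linalg.inv(A), M)  # the normal form holds
    return A, np.arctan2(sin_mu, cos_mu)

# Round trip on a matrix built from known lattice functions:
mu0, beta0, alpha0 = 0.7, 2.0, 0.3
c, s = np.cos(mu0), np.sin(mu0)
M = np.array([[c + alpha0 * s, beta0 * s],
              [-(1 + alpha0**2) / beta0 * s, c - alpha0 * s]])
A, mu = linear_normal_form(M)
print(mu)   # 0.7
```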

  9. Theory and praxis of map analysis in CHEF part 2: Nonlinear normal form

    International Nuclear Information System (INIS)

    Michelotti, Leo

    2009-01-01

    This is the second of three memos describing how normal form map analysis is implemented in CHEF. The first (1) explained the manipulations required to assure that initial, linear transformations preserved Poincare invariants, thereby confirming correct normalization of action-angle coordinates. In this one, the transformation will be extended to nonlinear terms. The third, describing how the algorithms were implemented within the software of CHEF's libraries, most likely will never be written. The first section, Section 2, quickly lays out preliminary concepts and relationships. In Section 3, we shall review the perturbation theory - an iterative sequence of transformations that converts a nonlinear mapping into its normal form - and examine the equation which moves calculations from one step to the next. Following that is a section titled 'Interpretation', which identifies connections between the normalized mappings and idealized, integrable, fictitious Hamiltonian models. A final section contains closing comments, some of which may - but probably will not - preview work to be done later. My reasons for writing this memo and its predecessor have already been expressed. (1) To them can be added this: 'black box code' encourages users to proceed with little or no understanding of what it does or how it operates. So far, CHEF has avoided this trap admirably by failing to attract potential users. However, we reached a watershed last year: even I now have difficulty following the software through its maze of operations. Extensions to CHEF's physics functionalities, software upgrades, and even simple maintenance are becoming more difficult than they should be. I hope these memos will mark parts of the maze for easier navigation in the future. Despite appearances to the contrary, I tried to include no (or very little) more than the minimum needed to understand what CHEF's nonlinear analysis modules do. As with the first memo, material has been lifted - and modified - from

  10. Theory and praxis of map analysis in CHEF part 2: Nonlinear normal form

    Energy Technology Data Exchange (ETDEWEB)

    Michelotti, Leo; /FERMILAB

    2009-04-01

    This is the second of three memos describing how normal form map analysis is implemented in CHEF. The first [1] explained the manipulations required to assure that initial, linear transformations preserved Poincare invariants, thereby confirming correct normalization of action-angle coordinates. In this one, the transformation will be extended to nonlinear terms. The third, describing how the algorithms were implemented within the software of CHEF's libraries, most likely will never be written. The first section, Section 2, quickly lays out preliminary concepts and relationships. In Section 3, we shall review the perturbation theory - an iterative sequence of transformations that converts a nonlinear mapping into its normal form - and examine the equation which moves calculations from one step to the next. Following that is a section titled 'Interpretation', which identifies connections between the normalized mappings and idealized, integrable, fictitious Hamiltonian models. A final section contains closing comments, some of which may - but probably will not - preview work to be done later. My reasons for writing this memo and its predecessor have already been expressed. [1] To them can be added this: 'black box code' encourages users to proceed with little or no understanding of what it does or how it operates. So far, CHEF has avoided this trap admirably by failing to attract potential users. However, we reached a watershed last year: even I now have difficulty following the software through its maze of operations. Extensions to CHEF's physics functionalities, software upgrades, and even simple maintenance are becoming more difficult than they should be. I hope these memos will mark parts of the maze for easier navigation in the future. Despite appearances to the contrary, I tried to include no (or very little) more than the minimum needed to understand what CHEF's nonlinear analysis modules do. As with the first memo, material

  11. Birkhoff normalization

    NARCIS (Netherlands)

    Broer, H.; Hoveijn, I.; Lunter, G.; Vegter, G.

    2003-01-01

    The Birkhoff normal form procedure is a widely used tool for approximating a Hamiltonian system by a simpler one. This chapter starts out with an introduction to Hamiltonian mechanics, followed by an explanation of the Birkhoff normal form procedure. Finally we discuss several algorithms for

  12. Optimal transformations leading to normal distributions of positron emission tomography standardized uptake values

    Science.gov (United States)

    Scarpelli, Matthew; Eickhoff, Jens; Cuna, Enrique; Perlman, Scott; Jeraj, Robert

    2018-02-01

    The statistical analysis of positron emission tomography (PET) standardized uptake value (SUV) measurements is challenging due to the skewed nature of SUV distributions. This limits the utilization of powerful parametric statistical models for analyzing SUV measurements. An ad hoc approach, which is frequently used in practice, is to blindly apply a log transformation, which may or may not result in normal SUV distributions. This study sought to identify optimal transformations leading to normally distributed PET SUVs extracted from tumors and to assess the effects of therapy on the optimal transformations. Methods. The optimal transformation for producing normal distributions of tumor SUVs was identified by iterating the Box-Cox transformation parameter (λ) and selecting the parameter that maximized the Shapiro-Wilk P-value. Optimal transformations were identified for tumor SUVmax distributions both pre and post treatment. This study included 57 patients who underwent 18F-fluorodeoxyglucose (18F-FDG) PET scans (publicly available dataset). In addition, to test the generality of our transformation methodology, we included analysis of 27 patients who underwent 18F-Fluorothymidine (18F-FLT) PET scans at our institution. Results. After applying the optimal Box-Cox transformations, neither the pre nor the post treatment 18F-FDG SUV distributions deviated significantly from normality (P > 0.10). Similar results were found for 18F-FLT PET SUV distributions (P > 0.10). For both 18F-FDG and 18F-FLT SUV distributions, the skewness and kurtosis increased from pre to post treatment, leading to a decrease in the optimal Box-Cox transformation parameter from pre to post treatment. For both 18F-FDG and 18F-FLT, there were distributions for which a log transformation was not optimal for providing normal SUV distributions. Conclusion. Optimization of the Box-Cox transformation offers a solution for identifying normal SUV transformations for when
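
    The selection rule lends itself to a few lines of scipy: scan λ, transform, and keep the λ with the largest Shapiro-Wilk P-value; the grid and the toy data below are illustrative.

```python
import numpy as np
from scipy import stats

def optimal_boxcox_lambda(suv, lambdas=np.linspace(-2, 2, 81)):
    """Pick the Box-Cox lambda maximizing the Shapiro-Wilk P-value.

    Mirrors the selection rule described in the abstract; the data must
    be positive, as SUVs are.
    """
    best_lam, best_p = None, -1.0
    for lam in lambdas:
        transformed = stats.boxcox(suv, lmbda=lam)
        p = stats.shapiro(transformed).pvalue
        if p > best_p:
            best_lam, best_p = lam, p
    return best_lam, best_p

# Skewed toy 'SUVmax' sample: a log transform (lambda near 0) should win.
suv = np.random.default_rng(2).lognormal(mean=1.0, sigma=0.5, size=60)
print(optimal_boxcox_lambda(suv))
```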

  13. Adherence to a Standardized Order Form for Gastric Cancer in a Referral Chemotherapy Teaching Hospital, Mashhad, Iran

    Directory of Open Access Journals (Sweden)

    Mitra Asgarian

    2017-09-01

    Full Text Available Background: Standardized forms for prescription and medication administration are one solution for reducing medication errors in the chemotherapy process. Gastric cancer is the most common cancer in Iran. In this study, we attempted to design and validate a standard printed chemotherapy form and to evaluate adherence by oncologists and nurses to this form. Methods: We performed this cross-sectional study in a teaching hospital in Mashhad, Iran, from August 2015 until January 2016. A clinical pharmacist designed the chemotherapy form, which included various demographic and clinical parameters as well as approved chemotherapy regimens for gastric cancer. Clinical oncologists working in this center validated the form. We included all eligible patients. A pharmacy student assessed adherence by the oncologists and nurses to this form and identified probable medication errors. Results are presented as mean ± standard deviation, or as number (percentage) for nominal variables. Data analysis was performed using the SPSS 16.0 statistical package. Results: We evaluated 54 patients and a total of 249 chemotherapy courses. In 146 (58.63%) chemotherapy sessions, the administered regimens lacked compatibility with the standard form. Approximately 66% of recorded errors occurred in the prescription phase and the remainder during the administration phase. The most common errors included improper dose (61%) and wrong infusion time (34%). We observed that 37 dose calculation errors occurred in 32 chemotherapy sessions. Conclusions: In general, adherence by oncologists and nurses to the developed form for chemotherapy treatment of gastric cancer was not acceptable. These findings indicate the necessity of a standardized order sheet to simplify the chemotherapy process for clinicians and to reduce prescription and administration errors.

  14. Molecular Form Differences Between Prostate-Specific Antigen (PSA) Standards Create Quantitative Discordances in PSA ELISA Measurements

    Science.gov (United States)

    McJimpsey, Erica L.

    2016-02-01

    The prostate-specific antigen (PSA) assays currently employed for the detection of prostate cancer (PCa) lack the specificity needed to differentiate PCa from benign prostatic hyperplasia and have high false positive rates. The PSA calibrants used to create calibration curves in these assays are typically purified from seminal plasma and contain many molecular forms (intact PSA and cleaved subforms). The purpose of this study was to determine if the composition of the PSA molecular forms found in these PSA standards contribute to the lack of PSA test reliability. To this end, seminal plasma purified PSA standards from different commercial sources were investigated by western blot (WB) and in multiple research grade PSA ELISAs. The WB results revealed that all of the PSA standards contained different mass concentrations of intact and cleaved molecular forms. Increased mass concentrations of intact PSA yielded higher immunoassay absorbance values, even between lots from the same manufacturer. Standardization of seminal plasma derived PSA calibrant molecular form mass concentrations and purification methods will assist in closing the gaps in PCa testing measurements that require the use of PSA values, such as the % free PSA and Prostate Health Index by increasing the accuracy of the calibration curves.

  15. Bioactive form of resveratrol in glioblastoma cells and its safety for normal brain cells

    Directory of Open Access Journals (Sweden)

    Xiao-Hong Shu

    2013-05-01

    Full Text Available ABSTRACT Background: Resveratrol, a plant polyphenol found in grapes and many other natural foods, possesses a wide range of biological activities including cancer prevention. It has been recognized that resveratrol is intracellularly biotransformed to different metabolites, but no direct evidence has been available to ascertain its bioactive form, because of the difficulty of keeping resveratrol unmetabolized in vivo or in vitro. It would therefore be worthwhile to elucidate the potential therapeutic implications of resveratrol metabolism using reliable resveratrol-sensitive cancer cells. Objective: To identify the real biological form of trans-resveratrol and to evaluate the safety of the effective anticancer dose of resveratrol for normal brain cells. Methods: The samples were prepared from the conditioned media and cell lysates of human glioblastoma U251 cells and were purified by solid phase extraction (SPE). The samples were subjected to high performance liquid chromatography (HPLC) and liquid chromatography/tandem mass spectrometry (LC/MS) analysis. According to the metabolite(s) found, trans-resveratrol was biotransformed in vitro by the method described elsewhere, and the resulting solution was used to treat U251 cells. Meanwhile, the responses of U251 cells and primarily cultured rat normal brain cells (glial cells and neurons) to 100 μM trans-resveratrol were evaluated by multiple experimental methods. Results: The results revealed that resveratrol monosulfate was the major metabolite in U251 cells. A mixture containing about half resveratrol monosulfate was prepared in vitro, and this trans-resveratrol and resveratrol monosulfate mixture showed little inhibitory effect on U251 cells. It was also found that rat primary brain cells (PBCs) not only resist 100 μM but also tolerate as high as 200 μM resveratrol treatment. Conclusions: Our study thus demonstrated that trans-resveratrol was the bioactive form in glioblastoma cells and, therefore, the biotransforming

  16. Normal form analysis of linear beam dynamics in a coupled storage ring

    International Nuclear Information System (INIS)

    Wolski, Andrzej; Woodley, Mark D.

    2004-01-01

    The techniques of normal form analysis, well known in the literature, can be used to provide a straightforward characterization of linear betatron dynamics in a coupled lattice. Here, we consider both the beam distribution and the betatron oscillations in a storage ring. We find that the beta functions for uncoupled motion generalize in a simple way to the coupled case. Defined in the way that we propose, the beta functions remain well behaved (positive and finite) under all circumstances, and have essentially the same physical significance for the beam size and betatron oscillation amplitude as in the uncoupled case. Application of this analysis to the online modeling of the PEP-II rings is also discussed

  17. Normalization of cortical thickness measurements across different T1 magnetic resonance imaging protocols by novel W-Score standardization.

    Science.gov (United States)

    Chung, Jinyong; Yoo, Kwangsun; Lee, Peter; Kim, Chan Mi; Roh, Jee Hoon; Park, Ji Eun; Kim, Sang Joon; Seo, Sang Won; Shin, Jeong-Hyeon; Seong, Joon-Kyung; Jeong, Yong

    2017-10-01

    The use of different 3D T1-weighted magnetic resonance (T1 MR) imaging protocols induces image incompatibility across multicenter studies, negating the many advantages of multicenter studies. A few methods have been developed to address this problem, but significant image incompatibility still remains. Thus, we developed a novel and convenient method to improve image compatibility. W-score standardization creates quality reference values by using a healthy group to obtain normalized disease values. We developed a protocol-specific w-score standardization to control the protocol effect, which is applied to each protocol separately. We used three data sets. In dataset 1, brain T1 MR images of normal controls (NC) and patients with Alzheimer's disease (AD) from two centers, acquired with different T1 MR protocols, were used (Protocol 1 and 2, n = 45/group). In dataset 2, data from six subjects, who underwent MRI with two different protocols (Protocol 1 and 2), were used with different repetition times, echo times, and slice thicknesses. In dataset 3, T1 MR images from a large number of healthy normal controls (Protocol 1: n = 148, Protocol 2: n = 343) were collected for w-score standardization. The protocol effect and disease effect on subjects' cortical thickness were analyzed before and after the application of protocol-specific w-score standardization. As expected, different protocols resulted in differing cortical thickness measurements in both NC and AD subjects. Different measurements were obtained for the same subject when imaged with different protocols. Multivariate pattern difference between measurements was observed between the protocols. Classification accuracy between two protocols was nearly 90%. After applying protocol-specific w-score standardization, the differences between the protocols substantially decreased. Most importantly, protocol-specific w-score standardization reduced both univariate and multivariate differences in the images while
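
    A schematic of the core computation, assuming (as is typical for w-scores) a linear covariate model fitted in the protocol-matched healthy controls; the covariates and model form actually used in the paper are not reproduced here.

```python
import numpy as np

def w_scores(thick_patients, covars_patients, thick_controls, covars_controls):
    """Protocol-specific w-score sketch: (observed - predicted) / control SD.

    thick_*  : (n,) cortical thickness values for one protocol
    covars_* : (n, p) covariates such as age and sex
    Fit the control model per protocol, then score new subjects against it.
    """
    Xc = np.column_stack([np.ones(len(thick_controls)), covars_controls])
    beta, *_ = np.linalg.lstsq(Xc, thick_controls, rcond=None)
    resid_sd = np.std(thick_controls - Xc @ beta, ddof=Xc.shape[1])
    Xp = np.column_stack([np.ones(len(thick_patients)), covars_patients])
    return (thick_patients - Xp @ beta) / resid_sd
```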

  18. The "Second Place" Problem: Assistive Technology in Sports and (Re) Constructing Normal.

    Science.gov (United States)

    Baker, D A

    2016-02-01

    Objections to the use of assistive technologies (such as prostheses) in elite sports are generally raised when the technology in question is perceived to afford the user a potentially "unfair advantage," when it is perceived as a threat to the purity of the sport, and/or when it is perceived as a precursor to a slippery slope toward undesirable changes in the sport. These objections rely on being able to quantify standards of "normal" within a sport so that changes attributed to the use of assistive technology can be judged as causing a significant deviation from some baseline standard. This holds athletes using assistive technologies accountable to standards that restrict their opportunities to achieve greatness, while athletes who do not use assistive technologies are able to push beyond the boundaries of these standards without moral scrutiny. This paper explores how constructions of fairness and "normality" impact athletes who use assistive technology to compete in a sporting venue traditionally populated with "able-bodied" competitors. It argues that the dynamic and obfuscated construction of "normal" standards in elite sports should move away from using body performance as the measuring stick of "normal," toward alternate forms of constructing norms such as defining, quantifying, and regulating the mechanical actions that constitute the critical components of a sport. Though framed within the context of elite sports, this paper can be interpreted more broadly to consider problems with defining "normal" bodies in a society in which technologies are constantly changing our abilities and expectations of what normal means.

  19. Optimization of accelerator parameters using normal form methods on high-order transfer maps

    Energy Technology Data Exchange (ETDEWEB)

    Snopok, Pavel [Michigan State Univ., East Lansing, MI (United States)

    2007-05-01

    Methods of analysis of the dynamics of ensembles of charged particles in collider rings are developed. The following problems are posed and solved using normal form transformations and other methods of perturbative nonlinear dynamics: (1) Optimization of the Tevatron dynamics: (a) Skew quadrupole correction of the dynamics of particles in the Tevatron in the presence of the systematic skew quadrupole errors in dipoles; (b) Calculation of the nonlinear tune shift with amplitude based on the results of measurements and the linear lattice information; (2) Optimization of the Muon Collider storage ring: (a) Computation and optimization of the dynamic aperture of the Muon Collider 50 x 50 GeV storage ring using higher order correctors; (b) 750 x 750 GeV Muon Collider storage ring lattice design matching the Tevatron footprint. The normal form coordinates have a very important advantage over the particle optical coordinates: if the transformation can be carried out successfully (general restrictions for that are not much stronger than the typical restrictions imposed on the behavior of the particles in the accelerator) then the motion in the new coordinates has a very clean representation allowing to extract more information about the dynamics of particles, and they are very convenient for the purposes of visualization. All the problem formulations include the derivation of the objective functions, which are later used in the optimization process using various optimization algorithms. Algorithms used to solve the problems are specific to collider rings, and applicable to similar problems arising on other machines of the same type. The details of the long-term behavior of the systems are studied to ensure the their stability for the desired number of turns. The algorithm of the normal form transformation is of great value for such problems as it gives much extra information about the disturbing factors. In addition to the fact that the dynamics of particles is represented

  20. Imagine-Self Perspective-Taking and Rational Self-Interested Behavior in a Simple Experimental Normal-Form Game

    Directory of Open Access Journals (Sweden)

    Adam Karbowski

    2017-09-01

    Full Text Available The purpose of this study is to explore the link between imagine-self perspective-taking and rational self-interested behavior in experimental normal-form games. Drawing on the concept of sympathy developed by Adam Smith and further literature on perspective-taking in games, we hypothesize that introduction of imagine-self perspective-taking by decision-makers promotes rational self-interested behavior in a simple experimental normal-form game. In our study, we examined behavior of 404 undergraduate students in the two-person game, in which the participant can suffer a monetary loss only if she plays her Nash equilibrium strategy and the opponent plays her dominated strategy. Results suggest that the threat of suffering monetary losses effectively discourages the participants from choosing Nash equilibrium strategy. In general, players may take into account that opponents choose dominated strategies due to specific not self-interested motivations or errors. However, adopting imagine-self perspective by the participants leads to more Nash equilibrium choices, perhaps by alleviating participants’ attributions of susceptibility to errors or non-self-interested motivation to the opponents.
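
    The game described above is easy to reproduce in miniature. The payoff matrices below are invented for illustration (the abstract does not publish them), but they have the stated structure: the column player's second strategy is strictly dominated, and the row player's Nash strategy loses money only against that dominated choice.

```python
import numpy as np

# Rows: row player's Nash strategy N, then a "safe" strategy S.
# Columns: column player's strategy N, then a strictly dominated strategy D.
row_payoff = np.array([[3, -2],   # N: wins vs a rational opponent, loses vs D
                       [1,  1]])  # S: a guaranteed small payoff
col_payoff = np.array([[3, 0],
                       [4, 1]])

# D is strictly dominated: it pays the column player less against every row action.
print("D strictly dominated:", bool(np.all(col_payoff[:, 1] < col_payoff[:, 0])))

# Against a fully rational opponent (who never plays D), N is the best response...
print("best response vs N:", "N" if row_payoff[0, 0] > row_payoff[1, 0] else "S")
# ...but a participant who expects errors may prefer S to avoid the -2 outcome.
```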

  1. Imagine-Self Perspective-Taking and Rational Self-Interested Behavior in a Simple Experimental Normal-Form Game.

    Science.gov (United States)

    Karbowski, Adam; Ramsza, Michał

    2017-01-01

    The purpose of this study is to explore the link between imagine-self perspective-taking and rational self-interested behavior in experimental normal-form games. Drawing on the concept of sympathy developed by Adam Smith and further literature on perspective-taking in games, we hypothesize that introduction of imagine-self perspective-taking by decision-makers promotes rational self-interested behavior in a simple experimental normal-form game. In our study, we examined behavior of 404 undergraduate students in the two-person game, in which the participant can suffer a monetary loss only if she plays her Nash equilibrium strategy and the opponent plays her dominated strategy. Results suggest that the threat of suffering monetary losses effectively discourages the participants from choosing Nash equilibrium strategy. In general, players may take into account that opponents choose dominated strategies due to specific not self-interested motivations or errors. However, adopting imagine-self perspective by the participants leads to more Nash equilibrium choices, perhaps by alleviating participants' attributions of susceptibility to errors or non-self-interested motivation to the opponents.

  2. Molecular Form Differences Between Prostate-Specific Antigen (PSA) Standards Create Quantitative Discordances in PSA ELISA Measurements

    Science.gov (United States)

    McJimpsey, Erica L.

    2016-01-01

    The prostate-specific antigen (PSA) assays currently employed for the detection of prostate cancer (PCa) lack the specificity needed to differentiate PCa from benign prostatic hyperplasia and have high false positive rates. The PSA calibrants used to create calibration curves in these assays are typically purified from seminal plasma and contain many molecular forms (intact PSA and cleaved subforms). The purpose of this study was to determine if the composition of the PSA molecular forms found in these PSA standards contributes to the lack of PSA test reliability. To this end, seminal plasma purified PSA standards from different commercial sources were investigated by western blot (WB) and in multiple research grade PSA ELISAs. The WB results revealed that all of the PSA standards contained different mass concentrations of intact and cleaved molecular forms. Increased mass concentrations of intact PSA yielded higher immunoassay absorbance values, even between lots from the same manufacturer. Standardization of seminal plasma derived PSA calibrant molecular form mass concentrations and purification methods will assist in closing the gaps in PCa testing measurements that require the use of PSA values, such as the % free PSA and Prostate Health Index, by increasing the accuracy of the calibration curves. PMID:26911983

  3. 41 CFR 304-6.5 - What guidelines must we follow when using the Standard Form (SF) 326?

    Science.gov (United States)

    2010-07-01

    41 CFR 304-6.5 (Public Contracts and Property Management; Payment Guidelines — Reports): guidelines that agencies must follow when using the Standard Form (SF) 326.

  4. A simple global representation for second-order normal forms of Hamiltonian systems relative to periodic flows

    International Nuclear Information System (INIS)

    Avendaño-Camacho, M; Vallejo, J A; Vorobjev, Yu

    2013-01-01

    We study the determination of the second-order normal form for perturbed Hamiltonians relative to the periodic flow of the unperturbed Hamiltonian H0. The formalism presented here is global, and can be easily implemented in any computer algebra system. We illustrate it by means of two examples: the Hénon–Heiles and the elastic pendulum Hamiltonians. (paper)
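
    The averaging step at the heart of such normal forms is simple to sketch numerically: the first-order term is the mean of the perturbation along the 2π-periodic flow of H0. A minimal sketch for one degree of freedom, with H0 = (q² + p²)/2 and an assumed quartic perturbation H1 = q⁴, whose closed-form average is 3/8·(q² + p²)²:

```python
import numpy as np

def averaged_H1(q, p, n=2000):
    """Average H1 = q**4 over the periodic flow of H0 = (q**2 + p**2)/2,
    which is a rotation in the (q, p) plane."""
    t = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
    qt = q * np.cos(t) + p * np.sin(t)  # q along the flow through (q, p)
    return np.mean(qt**4)

q, p = 1.3, -0.7
print("numerical average:", averaged_H1(q, p))
print("closed form      :", 0.375 * (q**2 + p**2) ** 2)
```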

  5. Clarifying Normalization

    Science.gov (United States)

    Carpenter, Donald A.

    2008-01-01

    Confusion exists among database textbooks as to the goal of normalization as well as to which normal form a designer should aspire. This article discusses such discrepancies with the intention of simplifying normalization for both teacher and student. This author's industry and classroom experiences indicate such simplification yields quicker…
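
    For readers outside the database world, the goal the article clarifies can be shown in a few lines. A hypothetical orders table repeats the customer name (a transitive dependency on the key), and decomposition toward third normal form moves that attribute into its own relation:

```python
# Unnormalized rows mix order facts with customer facts, inviting update anomalies.
orders = [
    {"order_id": 1, "customer_id": 7, "customer_name": "Ada",   "total": 30.0},
    {"order_id": 2, "customer_id": 7, "customer_name": "Ada",   "total": 12.5},
    {"order_id": 3, "customer_id": 9, "customer_name": "Brian", "total": 99.9},
]

# customer_name depends only on customer_id, so it gets its own relation (3NF).
customers = {row["customer_id"]: row["customer_name"] for row in orders}
orders_3nf = [{k: row[k] for k in ("order_id", "customer_id", "total")}
              for row in orders]

print(customers)   # {7: 'Ada', 9: 'Brian'}
print(orders_3nf)  # no repeated names left to fall out of sync
```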

  6. THE METHOD OF CONSTRUCTING A BOOLEAN FORMULA OF A POLYGON IN THE DISJUNCTIVE NORMAL FORM

    Directory of Open Access Journals (Sweden)

    A. A. Butov

    2014-01-01

    Full Text Available The paper focuses on finalizing the method of finding a polygon's Boolean formula in disjunctive normal form, described in the previous article [1]. The improved method eliminates the drawback associated with the existence of a class of problems for which the solution is only approximate. The proposed method always allows an exact solution to be found. The method can be used, in particular, in systems for the computer-aided design of integrated-circuit topology.
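
    The underlying idea, independent of the paper's specific construction, is that a polygon decomposed into convex pieces is a disjunction (OR) over pieces of a conjunction (AND) of half-plane literals. A minimal sketch with an L-shaped polygon built from two rectangles:

```python
def half_plane(a, b):
    """Literal: True where (x, y) lies on the left of the directed edge a -> b."""
    (ax, ay), (bx, by) = a, b
    return lambda x, y: (bx - ax) * (y - ay) - (by - ay) * (x - ax) >= 0

def convex_piece(vertices):
    """Conjunction of edge literals for a counter-clockwise convex polygon."""
    edges = zip(vertices, vertices[1:] + vertices[:1])
    literals = [half_plane(a, b) for a, b in edges]
    return lambda x, y: all(lit(x, y) for lit in literals)

# L-shaped polygon = OR of two convex rectangles (the DNF of the shape).
pieces = [convex_piece([(0, 0), (2, 0), (2, 1), (0, 1)]),
          convex_piece([(0, 1), (1, 1), (1, 2), (0, 2)])]
inside = lambda x, y: any(piece(x, y) for piece in pieces)

print(inside(0.5, 1.5), inside(1.5, 1.5))  # True False
```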

  7. High molecular gas fractions in normal massive star-forming galaxies in the young Universe.

    Science.gov (United States)

    Tacconi, L J; Genzel, R; Neri, R; Cox, P; Cooper, M C; Shapiro, K; Bolatto, A; Bouché, N; Bournaud, F; Burkert, A; Combes, F; Comerford, J; Davis, M; Schreiber, N M Förster; Garcia-Burillo, S; Gracia-Carpio, J; Lutz, D; Naab, T; Omont, A; Shapley, A; Sternberg, A; Weiner, B

    2010-02-11

    Stars form from cold molecular interstellar gas. As this is relatively rare in the local Universe, galaxies like the Milky Way form only a few new stars per year. Typical massive galaxies in the distant Universe formed stars an order of magnitude more rapidly. Unless star formation was significantly more efficient, this difference suggests that young galaxies were much more molecular-gas rich. Molecular gas observations in the distant Universe have so far largely been restricted to very luminous, rare objects, including mergers and quasars, and accordingly we do not yet have a clear idea about the gas content of more normal (albeit massive) galaxies. Here we report the results of a survey of molecular gas in samples of typical massive-star-forming galaxies at mean redshifts of about 1.2 and 2.3, when the Universe was respectively 40% and 24% of its current age. Our measurements reveal that distant star forming galaxies were indeed gas rich, and that the star formation efficiency is not strongly dependent on cosmic epoch. The average fraction of cold gas relative to total galaxy baryonic mass at z = 2.3 and z = 1.2 is respectively about 44% and 34%, three to ten times higher than in today's massive spiral galaxies. The slow decrease between z approximately 2 and z approximately 1 probably requires a mechanism of semi-continuous replenishment of fresh gas to the young galaxies.

  8. Multiple internal standard normalization for improving HS-SPME-GC-MS quantitation in virgin olive oil volatile organic compounds (VOO-VOCs) profile.

    Science.gov (United States)

    Fortini, Martina; Migliorini, Marzia; Cherubini, Chiara; Cecchi, Lorenzo; Calamai, Luca

    2017-04-01

    The commercial value of virgin olive oils (VOOs) strongly depends on their classification, which is also based on the aroma of the oils, usually evaluated by a panel test. Nowadays, a reliable analytical method is still needed to evaluate the volatile organic compounds (VOCs) and support the standard panel test method. To date, the use of HS-SPME sampling coupled to GC-MS is generally accepted for the analysis of VOCs in VOOs. However, VOO is a challenging matrix due to the simultaneous presence of: i) compounds at ppm and ppb concentrations; ii) molecules belonging to different chemical classes and iii) analytes with a wide range of molecular mass. Therefore, HS-SPME-GC-MS quantitation based on an external standard method, or on only a single internal standard (ISTD) for data normalization in an internal standard method, may be troublesome. In this work a multiple internal standard normalization is proposed to overcome these problems and improve the quantitation of VOO-VOCs. As many as 11 ISTDs were used for the quantitation of 71 VOCs. For each of them the most suitable ISTD was selected, and good linearity over a wide calibration range was obtained. For every compound except E-2-hexenal, the linear calibration range obtained without an ISTD, or with an unsuitable one, was narrower than that obtained with a suitable ISTD, confirming the usefulness of multiple internal standard normalization for the correct quantitation of the VOC profile in VOOs. The method was validated for 71 VOCs, and then applied to a series of lampante virgin olive oils and extra virgin olive oils. In light of our results, we propose the application of this analytical approach for routine quantitative analyses and to support sensorial analysis for the evaluation of positive and negative VOO attributes. Copyright © 2017 Elsevier B.V. All rights reserved.
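
    The arithmetic behind internal standard normalization is compact: calibrate on the analyte/ISTD area ratio rather than the raw area, so run-to-run extraction and injection variability cancels. A minimal sketch with invented peak areas (the paper pairs each of 71 VOCs with the most suitable of 11 ISTDs):

```python
import numpy as np

# Hypothetical calibration runs: analyte peak areas, matched-ISTD areas, levels.
analyte_area = np.array([12_000., 25_000., 49_000., 98_000.])
istd_area    = np.array([50_000., 51_000., 49_500., 50_500.])
conc_ppm     = np.array([0.5, 1.0, 2.0, 4.0])

ratio = analyte_area / istd_area          # normalized response
slope, intercept = np.polyfit(conc_ppm, ratio, 1)

unknown_ratio = 31_000. / 50_200.         # a sample run with the same ISTD spike
print(f"estimated concentration: {(unknown_ratio - intercept) / slope:.2f} ppm")
```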

  9. 41 CFR 101-26.4901-149 - Standard Form 149, U.S. Government National Credit Card.

    Science.gov (United States)

    2010-07-01

    41 CFR 101-26.4901-149 (Public Contracts and Property Management): Standard Form 149, U.S. Government National Credit Card. Note: the form illustrated in § 101-26.4901-149 is filed as…

  10. Towards reporting standards for neuropsychological study results: A proposal to minimize communication errors with standardized qualitative descriptors for normalized test scores.

    Science.gov (United States)

    Schoenberg, Mike R; Rum, Ruba S

    2017-11-01

    Rapid, clear and efficient communication of neuropsychological results is essential to benefit patient care. Errors in communication are a leading cause of medical errors; nevertheless, there remains a lack of consistency in how neuropsychological scores are communicated. A major limitation in the communication of neuropsychological results is the inconsistent use of qualitative descriptors for standardized test scores and the use of vague terminology. A PubMed search from 1 January 2007 to 1 August 2016 was conducted to identify guidelines or consensus statements for the description and reporting of qualitative terms to communicate neuropsychological test scores. The review found the use of confusing and overlapping terms to describe various ranges of percentile standardized test scores. In response, we propose a simplified set of qualitative descriptors for normalized test scores (Q-Simple) as a means to reduce errors in communicating test results. The Q-Simple qualitative terms are: 'very superior', 'superior', 'high average', 'average', 'low average', 'borderline' and 'abnormal/impaired'. A case example illustrates the proposed Q-Simple qualitative classification system to communicate neuropsychological results for neurosurgical planning. The Q-Simple qualitative descriptor system is aimed at improving and standardizing the communication of standardized neuropsychological test scores. Further research is needed to evaluate neuropsychological communication errors. Conveying the clinical implications of neuropsychological results in a manner that minimizes the risk of communication errors is a quintessential component of evidence-based practice. Copyright © 2017 Elsevier B.V. All rights reserved.
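
    A descriptor scheme like Q-Simple is, operationally, a lookup from score bands to labels. The cutoffs below are illustrative assumptions for a standard score scale (mean 100, SD 15); the paper itself defines the authoritative ranges:

```python
def q_simple(standard_score):
    """Map a standard score (mean 100, SD 15) to a Q-Simple style label.
    Band boundaries are assumed for illustration, not taken from the paper."""
    bands = [(130, "very superior"), (120, "superior"), (110, "high average"),
             (90, "average"), (80, "low average"), (70, "borderline")]
    for cutoff, label in bands:
        if standard_score >= cutoff:
            return label
    return "abnormal/impaired"

print(q_simple(104))  # average
print(q_simple(68))   # abnormal/impaired
```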

  11. Principal Typings in a Restricted Intersection Type System for Beta Normal Forms with De Bruijn Indices

    Directory of Open Access Journals (Sweden)

    Daniel Ventura

    2010-01-01

    Full Text Available The lambda-calculus with de Bruijn indices assembles each alpha-class of lambda-terms into a unique term, using indices instead of variable names. Intersection types provide finitary type polymorphism and can characterise normalisable lambda-terms through the property that a term is normalisable if and only if it is typeable. To be closer to computations and to simplify the formalisation of the atomic operations involved in beta-contractions, several calculi of explicit substitution were developed, mostly with de Bruijn indices. Versions of explicit substitution calculi without types and with simple type systems are well investigated, in contrast to versions with more elaborate type systems such as intersection types. In previous work, we introduced a de Bruijn version of the lambda-calculus with an intersection type system and proved that it preserves subject reduction, a basic property of type systems. In this paper a version with de Bruijn indices of an intersection type system originally introduced to characterise principal typings for beta-normal forms is presented. We present the characterisation in this new system and the corresponding versions of the algorithms for type inference and for the reconstruction of normal forms from principal typings. We briefly discuss the failure of the subject reduction property and some possible solutions for it.
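
    For readers unfamiliar with the notation: with de Bruijn indices a variable is just a number counting binders outward, and a term is in beta-normal form exactly when no abstraction sits in function position. A minimal sketch of both ideas:

```python
from dataclasses import dataclass

# Lambda-terms with de Bruijn indices: Var(0) refers to the nearest binder.
@dataclass
class Var: index: int
@dataclass
class Lam: body: object
@dataclass
class App: fun: object; arg: object

def is_beta_normal(term):
    """True iff the term contains no beta-redex, i.e. no App(Lam(...), _)."""
    if isinstance(term, Var):
        return True
    if isinstance(term, Lam):
        return is_beta_normal(term.body)
    return (not isinstance(term.fun, Lam)
            and is_beta_normal(term.fun) and is_beta_normal(term.arg))

identity_redex = App(Lam(Var(0)), Var(0))  # (\. 0) 0 -- a redex
print(is_beta_normal(identity_redex), is_beta_normal(Var(0)))  # False True
```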

  12. 41 CFR 102-194.30 - What role does my agency play in the Standard and Optional Forms Management Program?

    Science.gov (United States)

    2010-07-01

    41 CFR 102-194.30 (Public Contracts and Property Management): What role does my agency play in the Standard and Optional Forms Management Program? Your agency head…

  13. Informed consent for clinical trials: a comparative study of standard versus simplified forms.

    Science.gov (United States)

    Davis, T C; Holcombe, R F; Berkel, H J; Pramanik, S; Divers, S G

    1998-05-06

    A high level of reading skill and comprehension is necessary to understand and complete most consent forms that are required for participation in clinical research studies. This study was conducted to test the hypothesis that a simplified consent form would be less intimidating and more easily understood by individuals with low-to-marginal reading skills. During July 1996, 183 adults (53 patients with cancer or another medical condition and 130 apparently healthy participants) were tested for reading ability and then asked to read either the standard Southwestern Oncology Group (SWOG) consent form (16th grade level) or a simplified form (7th grade level) developed at Louisiana State University Medical Center-Shreveport (LSU). Participants were interviewed to assess their attitudes toward and comprehension of the form read. Then they were given the alternate consent form and asked which one they preferred and why. Overall, participants preferred the LSU form (62%; 95% confidence interval [CI] = 54.8%-69.2%) over the SWOG form (38%; 95% CI = 30.8%-45.2%) (P = .0033). Nearly all participants thought that the LSU form was easier to read (97%; 95% CI = 93.1%-99.9%) than the SWOG form (75%; 95% CI = 65.1%-85.7%). These findings support the use of simplified informed consent documents for the substantial proportion of Americans with low-to-marginal literacy skills.

  14. MRI of the normal appendix in children: data toward a new reference standard

    Energy Technology Data Exchange (ETDEWEB)

    Swenson, David W. [Alpert Medical School of Brown University and Rhode Island Hospital, Department of Diagnostic Imaging, Providence, RI (United States); Schooler, Gary R. [Duke University Medical Center, Department of Radiology, Durham, NC (United States); Stamoulis, Catherine; Lee, Edward Y. [Boston Children's Hospital and Harvard Medical School, Department of Radiology, Boston, MA (United States)

    2016-06-15

    Magnetic resonance imaging (MRI) might prove useful in the diagnostic evaluation of pediatric appendicitis in the effort to avoid exposing children to the ionizing radiation of CT, yet there is a paucity of literature describing the normal range of appearances of the pediatric appendix on MRI. To investigate MRI characteristics of the normal appendix to aid in establishing a reference standard in the pediatric population. We conducted a retrospective study of children and young adults (≤18 years of age) who underwent lumbar spine or pelvis MRI between Jan. 1, 2013, and Dec. 31, 2013, for indications unrelated to appendicitis. Two board-certified radiologists independently reviewed all patients' MRI examinations for appendix visualization, diameter, intraluminal content signal, and presence of periappendiceal inflammation or free fluid. We used the Cohen kappa statistic and Spearman correlation coefficient to assess reader agreement on qualitative and quantitative data, respectively. Three hundred forty-six patients met inclusion criteria. Both readers visualized the appendix in 192/346 (55.5%) patients (kappa = 0.88, P < 0.0001). Estimated median appendix diameter was 5 mm for reader 1 and 6 mm for reader 2 ([25th, 75th] quartiles = [5, 6] mm; range, 2-11 mm; r = 0.81, P < 0.0001). Appendix intraluminal signal characteristics were variable. Periappendiceal inflammation was present in 0/192 (0%) and free fluid in 6/192 (3.1%) MRI examinations (kappa = 1.0). The normal appendix was seen on MRI in approximately half of pediatric patients, with a median diameter of ≈5-6 mm, variable intraluminal signal characteristics, no adjacent inflammatory changes, and rare surrounding free fluid. (orig.)

  15. Algorithms for finding Chomsky and Greibach normal forms for a fuzzy context-free grammar using an algebraic approach

    Energy Technology Data Exchange (ETDEWEB)

    Lee, E.T.

    1983-01-01

    Algorithms for the construction of the Chomsky and Greibach normal forms for a fuzzy context-free grammar using the algebraic approach are presented and illustrated by examples. The results obtained in this paper may have useful applications in fuzzy languages, pattern recognition, information storage and retrieval, artificial intelligence, database and pictorial information systems. 16 references.
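
    As a flavor of what such algorithms do (for ordinary, non-fuzzy grammars), here is the binarization step of a Chomsky-normal-form conversion: any rule with more than two right-hand symbols is split into a chain of binary rules via fresh nonterminals. This is a generic sketch, not the paper's algebraic construction:

```python
def binarize(rules):
    """CNF 'BIN' step: A -> B C D E becomes A -> B _X1, _X1 -> C _X2, _X2 -> D E."""
    out, fresh = [], 0
    for head, rhs in rules:
        while len(rhs) > 2:
            fresh += 1
            new = f"_X{fresh}"
            out.append((head, [rhs[0], new]))  # peel off the first symbol
            head, rhs = new, rhs[1:]           # the fresh symbol derives the rest
        out.append((head, rhs))
    return out

grammar = [("S", ["A", "B", "C", "D"]), ("A", ["a"])]
for head, rhs in binarize(grammar):
    print(head, "->", " ".join(rhs))
```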

  16. Normalizing acronyms and abbreviations to aid patient understanding of clinical texts: ShARe/CLEF eHealth Challenge 2013, Task 2.

    Science.gov (United States)

    Mowery, Danielle L; South, Brett R; Christensen, Lee; Leng, Jianwei; Peltonen, Laura-Maria; Salanterä, Sanna; Suominen, Hanna; Martinez, David; Velupillai, Sumithra; Elhadad, Noémie; Savova, Guergana; Pradhan, Sameer; Chapman, Wendy W

    2016-07-01

    The ShARe/CLEF eHealth challenge lab aims to stimulate development of natural language processing and information retrieval technologies to aid patients in understanding their clinical reports. In clinical text, acronyms and abbreviations, also referenced as short forms, can be difficult for patients to understand. For one of three shared tasks in 2013 (Task 2), we generated a reference standard of clinical short forms normalized to the Unified Medical Language System. This reference standard can be used to improve patient understanding by linking to web sources with lay descriptions of annotated short forms or by substituting short forms with a more simplified, lay term. In this study, we evaluate 1) accuracy of participating systems' normalizing short forms compared to a majority sense baseline approach, 2) performance of participants' systems for short forms with variable majority sense distributions, and 3) report the accuracy of participating systems' normalizing shared normalized concepts between the test set and the Consumer Health Vocabulary, a vocabulary of lay medical terms. The best systems submitted by the five participating teams performed with accuracies ranging from 43 to 72 %. A majority sense baseline approach achieved the second best performance. The performance of participating systems for normalizing short forms with two or more senses with low ambiguity (majority sense greater than 80 %) ranged from 52 to 78 % accuracy, with two or more senses with moderate ambiguity (majority sense between 50 and 80 %) ranged from 23 to 57 % accuracy, and with two or more senses with high ambiguity (majority sense less than 50 %) ranged from 2 to 45 % accuracy. With respect to the ShARe test set, 69 % of short form annotations contained common concept unique identifiers with the Consumer Health Vocabulary. For these 2594 possible annotations, the performance of participating systems ranged from 50 to 75 % accuracy. Short form normalization continues…
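
    The majority sense baseline that proved so competitive is essentially a one-liner: map each short form to the concept it most often denotes in the training annotations. A toy sketch with invented clinical data:

```python
from collections import Counter

# Hypothetical training annotations: (short form, normalized concept).
train = [("RA", "rheumatoid arthritis"), ("RA", "right atrium"),
         ("RA", "rheumatoid arthritis"), ("PT", "physical therapy")]

majority = {sf: Counter(sense for f, sense in train if f == sf).most_common(1)[0][0]
            for sf in {f for f, _ in train}}

print(majority["RA"])  # rheumatoid arthritis -- the majority sense wins
```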

  17. Investigation of reliability, validity and normality Persian version of the California Critical Thinking Skills Test; Form B (CCTST

    Directory of Open Access Journals (Sweden)

    Khallli H

    2003-04-01

    Full Text Available Background: To evaluate the effectiveness of the present educational programs in terms of students' achieving problem solving, decision making and critical thinking skills, reliable, valid and standard instruments are needed. Purposes: To investigate the reliability, validity and norms of the CCTST Form B. The California Critical Thinking Skills Test contains 34 multiple-choice questions, each with a single correct answer, across the five Critical Thinking (CT) cognitive skills domains. Methods: The translated CCTST Form B was given to 405 BSN nursing students of nursing faculties located in Tehran (Tehran, Iran and Shahid Beheshti Universities) who were selected through random sampling. In order to determine the face and content validity, the test was translated and edited by Persian and English language professors and researchers. It was also confirmed by the judgments of a panel of medical education experts and psychology professors. CCTST reliability was determined from internal consistency using KR-20. The construct validity of the test was investigated with factor analysis, internal consistency and group differences. Results: The test reliability coefficient was 0.62. Factor analysis indicated that the CCTST is formed from five factors (elements), namely: Analysis, Evaluation, Inference, Inductive Reasoning and Deductive Reasoning. The internal consistency method showed that all subscales have high and positive correlations with the total test score. The group difference method between nursing and philosophy students (n=50) indicated that there is a meaningful difference between nursing and philosophy students' scores (t=-4.95, p=0.0001). Percentile norms also show that the 50th percentile corresponds to a raw score of 11, and the 95th and 5th percentiles correspond to raw scores of 17 and 6, respectively. Conclusions: The results revealed that the test is sufficiently reliable as a research tool, and all subscales measure a single construct (Critical Thinking) and are able to distinguish the…

  18. Testing of Software Routine to Determine Deviate and Cumulative Probability: ModStandardNormal Version 1.0

    International Nuclear Information System (INIS)

    A.H. Monib

    1999-01-01

    The purpose of this calculation is to document that the software routine ModStandardNormal Version 1.0, which is a Visual Fortran 5.0 module, provides correct results for a normal distribution up to five significant figures (three significant figures at the function tails) for a specified range of input parameters. The software routine may be used for quality affecting work. Two types of output are generated in ModStandardNormal: a deviate, x, given a cumulative probability, p, between 0 and 1; and a cumulative probability, p, given a deviate, x, between -8 and 8. This calculation supports Performance Assessment, under Technical Product Development Plan, TDP-EBS-MD-000006 (Attachment I, DIRS 3), and is written in accordance with the AP-3.12Q Calculations procedure (Attachment I, DIRS 4)
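
    The two operations the routine implements correspond to the inverse CDF and the CDF of the standard normal distribution. A quick cross-check of the kind such a verification might use, here via SciPy rather than the Fortran module:

```python
from scipy.stats import norm

p = 0.975
x = norm.ppf(p)           # deviate x for a cumulative probability p in (0, 1)
print(round(x, 5))        # 1.95996

# And the reverse direction: cumulative probability for a deviate in [-8, 8].
print(round(norm.cdf(x), 5))  # 0.975
```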

  19. Standard form contracts and a smart contract future

    Directory of Open Access Journals (Sweden)

    Kristin B. Cornelius

    2018-05-01

    Full Text Available With a budding market of widespread smart contract implementation on the horizon, there is much conversation about how to regulate this new technology. Discourse on standard form contracts (SFCs) and how they have been adopted in a digital environment is useful for predicting how smart contracts might be interpreted. This essay provides a critical review of the discourse surrounding digitised SFCs and applies it to issues in smart contract regulation. An exploration of the literature surrounding specific instances of SFCs finds that it lacks a close examination of the textual and documentary aspects of SFCs, which are particularly important in a digital environment, as a shift in medium prompts a different procedural process. Instead, common perspectives are either based on outdated notions of paper versions of these contracts or on ideologies of industry and business that do not sufficiently address the needs of consumers/users in the digital age. Most importantly, noting the failure of contract law to address the inequities of SFCs in this environment can help prevent them from being codified further with smart contracts.

  20. Robust Confidence Interval for a Ratio of Standard Deviations

    Science.gov (United States)

    Bonett, Douglas G.

    2006-01-01

    Comparing variability of test scores across alternate forms, test conditions, or subpopulations is a fundamental problem in psychometrics. A confidence interval for a ratio of standard deviations is proposed that performs as well as the classic method with normal distributions and performs dramatically better with nonnormal distributions. A simple…
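
    For contrast with the robust interval the paper proposes (whose formula is not reproduced in the abstract), the classic normal-theory interval it outperforms is easy to state: square the sample SD ratio and divide by F-distribution quantiles. A sketch of that classic method:

```python
import numpy as np
from scipy.stats import f

def sd_ratio_ci(s1, n1, s2, n2, alpha=0.05):
    """Classic normal-theory CI for sigma1/sigma2 (not Bonett's robust method):
    (s1^2/s2^2) divided by F quantiles, then square-rooted back to the SD scale."""
    r2 = (s1 / s2) ** 2
    lower = r2 / f.ppf(1 - alpha / 2, n1 - 1, n2 - 1)
    upper = r2 / f.ppf(alpha / 2, n1 - 1, n2 - 1)
    return np.sqrt(lower), np.sqrt(upper)

print(sd_ratio_ci(s1=12.0, n1=30, s2=10.0, n2=25))  # e.g. two alternate test forms
```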

  1. Normalization Of Thermal-Radiation Form-Factor Matrix

    Science.gov (United States)

    Tsuyuki, Glenn T.

    1994-01-01

    Report describes algorithm that adjusts form-factor matrix in TRASYS computer program, which calculates intraspacecraft radiative interchange among various surfaces and environmental heat loading from sources such as sun.
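
    The physical constraint behind such an adjustment is that, in a closed enclosure, each row of the form-factor matrix must sum to exactly 1 (all radiation leaving a surface lands somewhere). A minimal row-scaling sketch; TRASYS's actual algorithm may distribute the correction differently:

```python
import numpy as np

# Computed view factors, slightly off closure due to numerical error.
F = np.array([[0.00, 0.56, 0.42],
              [0.28, 0.00, 0.70],
              [0.21, 0.35, 0.42]])

F_adjusted = F / F.sum(axis=1, keepdims=True)  # enforce row sums of 1
print(F_adjusted.sum(axis=1))                  # [1. 1. 1.]
```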

  2. Composite Reliability and Standard Errors of Measurement for a Seven-Subtest Short Form of the Wechsler Adult Intelligence Scale-Revised.

    Science.gov (United States)

    Schretlen, David; And Others

    1994-01-01

    Composite reliability and standard errors of measurement were computed for prorated Verbal, Performance, and Full-Scale intelligence quotient (IQ) scores from a seven-subtest short form of the Wechsler Adult Intelligence Scale-Revised. Results with 1,880 adults (standardization sample) indicate that this form is as reliable as the complete test.…

  3. Post-UV colony-forming ability of normal fibroblast strains and of the xeroderma pigmentosum group G strain

    International Nuclear Information System (INIS)

    Barrett, S.F.; Tarone, R.E.; Moshell, A.N.; Ganges, M.B.; Robbins, J.H.

    1981-01-01

    In xeroderma pigmentosum, an inherited disorder of defective DNA repair, the post-UV colony-forming ability of fibroblasts from patients in complementation groups A through F correlates with the patients' neurological status. The first xeroderma pigmentosum patient assigned to the recently discovered group G had the neurological abnormalities of XP. Researchers have determined the post-UV colony-forming ability of cultured fibroblasts from this patient and from 5 more control donors. Log-phase fibroblasts were irradiated with 254 nm UV light from a germicidal lamp, trypsinized, and replated at known densities. After 2 to 4 weeks' incubation the cells were fixed, stained and scored for colony formation. The strains' post-UV colony-forming ability curves were obtained by plotting the log of the percent remaining post-UV colony-forming ability as a function of the UV dose. The post-UV colony-forming ability of 2 of the 5 new normal strains was in the previously defined control donor zone, but that of the other 3 extended down to the level of the most resistant xeroderma pigmentosum strain. The post-UV colony-forming ability curve of the group G fibroblasts was not significantly different from the curves of the group D fibroblast strains from patients with clinical histories similar to that of the group G patient

  4. Bicervical normal uterus with normal vagina | Okeke | Annals of ...

    African Journals Online (AJOL)

    To the best of our knowledge, only a few cases of bicervical normal uterus with normal vagina exist in the literature; one of the cases had an anterior-posterior disposition. This form of uterine abnormality is not explicable by the existing classical theory of mullerian anomalies and suggests that a complex interplay of events ...

  5. Towards continuous improvement of endoscopy standards: Validation of a colonoscopy assessment form.

    LENUS (Irish Health Repository)

    2012-02-01

    Aim: Assessment of procedural colonoscopy skills is an important and topical issue. The aim of this study was to develop and validate a competency-based colonoscopy assessment form that would be easy to use, suitable for the assessment of junior and senior endoscopists, and potentially a useful instrument for detecting differences in performance standards following different training interventions. Method: A standardised assessment form was developed incorporating a checklist with dichotomous yes/no responses and a global assessment section incorporating several different elements. This form was used prospectively to evaluate colonoscopy cases during the period of the study in several university teaching hospitals. Results were analysed using ANOVA with Bonferroni corrections for post-hoc analysis. Results: 81 procedures were assessed, performed by eight consultant and 19 trainee endoscopists. There were no serious errors. When divided into three groups based on previous experience (novice, intermediate and expert), the assessment form demonstrated statistically significant differences between all three groups (p<0.05). When separate elements were taken into account, the global assessment section was a better discriminator of skill level than the checklist. Conclusion: This form is a valid, easy-to-use assessment method. We intend to use it to assess the value of simulator training in trainee endoscopists. It also has the potential to be a useful training tool when feedback is given to the trainee.

  6. Comparison of spectrum normalization techniques for univariate ...

    Indian Academy of Sciences (India)

    Laser-induced breakdown spectroscopy; univariate study; normalization models; stainless steel; standard error of prediction. Abstract. The analytical performance of six different spectrum normalization techniques, namely internal normalization, normalization with total light, normalization with background, along with their ...

  7. The impact of a standardized consultation form for facial trauma on billing and evaluation and management levels.

    Science.gov (United States)

    Levesque, Andre Y; Tauber, David M; Lee, Johnson C; Rodriguez-Feliz, Jose R; Chao, Jerome D

    2014-02-01

    Facial trauma is among the most frequent consultations encountered by plastic surgeons. Unfortunately, the reimbursement from these consultations can be low, and qualified plastic surgeons may exclude facial trauma from their practice. An audit of our records found insufficient documentation to justify higher evaluation and management (EM) levels of service, resulting in lower reimbursement. Utilizing a standardized consultation form can improve documentation, resulting in higher billing and EM levels. A facial trauma consultation form was developed in conjunction with the billing department. Three plastic surgery residents completed 30 consultations without the aid of the consult form, followed by 30 consultations with the aid of the form. The EM levels and billing data for each consultation were obtained from the billing department for analysis. The 2 groups were compared using χ2 analysis and t tests to determine statistical significance. With the standardized consultation form, the mean EM level increased from 2.97 to 3.60 (P = 0.002). In addition, the mean billed amount increased from $391 to $501 per consult (P = 0.051), representing a 28% increase in billing. In our institution, the development and implementation of a facial trauma consultation form has resulted in more complete documentation and a subsequent increase in EM level and billed services.

  8. Asymptotic Normality of the Optimal Solution in Multiresponse Surface Mathematical Programming

    OpenAIRE

    Díaz-García, José A.; Caro-Lopera, Francisco J.

    2015-01-01

    An explicit form for the perturbation effect of the matrix of regression coefficients on the optimal solution in multiresponse surface methodology is obtained in this paper. Then, the sensitivity analysis of the optimal solution is studied, and the critical point characterisation of the convex program associated with the optimum of a multiresponse surface is also analysed. Finally, the asymptotic normality of the optimal solution is derived by standard methods.

  9. Reward value-based gain control: divisive normalization in parietal cortex.

    Science.gov (United States)

    Louie, Kenway; Grattan, Lauren E; Glimcher, Paul W

    2011-07-20

    The representation of value is a critical component of decision making. Rational choice theory assumes that options are assigned absolute values, independent of the value or existence of other alternatives. However, context-dependent choice behavior in both animals and humans violates this assumption, suggesting that biological decision processes rely on comparative evaluation. Here we show that neurons in the monkey lateral intraparietal cortex encode a relative form of saccadic value, explicitly dependent on the values of the other available alternatives. Analogous to extra-classical receptive field effects in visual cortex, this relative representation incorporates target values outside the response field and is observed in both stimulus-driven activity and baseline firing rates. This context-dependent modulation is precisely described by divisive normalization, indicating that this standard form of sensory gain control may be a general mechanism of cortical computation. Such normalization in decision circuits effectively implements an adaptive gain control for value coding and provides a possible mechanistic basis for behavioral context-dependent violations of rationality.
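
    The divisive normalization described here has a compact functional form: each option's represented value is divided by a semisaturation constant plus the summed value of all available options. A sketch with invented parameter values, showing how the same target is coded more weakly in a higher-value context:

```python
import numpy as np

def normalized_value(values, sigma=1.0, gain=50.0):
    """Relative value code: R_i = gain * V_i / (sigma + sum_j V_j).
    sigma and gain are illustrative, not fitted parameters from the paper."""
    values = np.asarray(values, dtype=float)
    return gain * values / (sigma + values.sum())

print(normalized_value([10, 0, 0]))    # lone valuable target: strong response
print(normalized_value([10, 10, 10]))  # same target among rich alternatives: weaker
```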

  10. Corticocortical feedback increases the spatial extent of normalization.

    Science.gov (United States)

    Nassi, Jonathan J; Gómez-Laberge, Camille; Kreiman, Gabriel; Born, Richard T

    2014-01-01

    Normalization has been proposed as a canonical computation operating across different brain regions, sensory modalities, and species. It provides a good phenomenological description of non-linear response properties in primary visual cortex (V1), including the contrast response function and surround suppression. Despite its widespread application throughout the visual system, the underlying neural mechanisms remain largely unknown. We recently observed that corticocortical feedback contributes to surround suppression in V1, raising the possibility that feedback acts through normalization. To test this idea, we characterized area summation and contrast response properties in V1 with and without feedback from V2 and V3 in alert macaques and applied a standard normalization model to the data. Area summation properties were well explained by a form of divisive normalization, which computes the ratio between a neuron's driving input and the spatially integrated activity of a "normalization pool." Feedback inactivation reduced surround suppression by shrinking the spatial extent of the normalization pool. This effect was independent of the gain modulation thought to mediate the influence of contrast on area summation, which remained intact during feedback inactivation. Contrast sensitivity within the receptive field center was also unaffected by feedback inactivation, providing further evidence that feedback participates in normalization independent of the circuit mechanisms involved in modulating contrast gain and saturation. These results suggest that corticocortical feedback contributes to surround suppression by increasing the visuotopic extent of normalization and, via this mechanism, feedback can play a critical role in contextual information processing.

  11. The exp-normal distribution is infinitely divisible

    OpenAIRE

    Pinelis, Iosif

    2018-01-01

    Let $Z$ be a standard normal random variable (r.v.). It is shown that the distribution of the r.v. $\ln|Z|$ is infinitely divisible; equivalently, the standard normal distribution considered as the distribution on the multiplicative group over $\mathbb{R}\setminus\{0\}$ is infinitely divisible.

  12. Denotational Aspects of Untyped Normalization by Evaluation

    DEFF Research Database (Denmark)

    Filinski, Andrzej; Rohde, Henning Korsholm

    2005-01-01

    …of soundness (the output term, if any, is in normal form and β-equivalent to the input term); identification (β-equivalent terms are mapped to the same result); and completeness (the function is defined for all terms that do have normal forms). We also show how the semantic construction enables a simple yet formal correctness proof for the normalization algorithm, expressed as a functional program in an ML-like, call-by-value language. Finally, we generalize the construction to produce an infinitary variant of normal forms, namely Böhm trees. We show that the three-part characterization of correctness…
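
    Normalization by evaluation is concise enough to sketch in full: evaluate syntax into a semantic domain where functions are host-language closures, then reify semantic values back into syntax, inventing fresh names at binders. This is a generic untyped NbE sketch in Python (the paper's development uses an ML-like language); like the function it models, it diverges on terms with no normal form:

```python
import itertools
from dataclasses import dataclass

@dataclass
class Var: name: str
@dataclass
class Lam: param: str; body: object
@dataclass
class App: fun: object; arg: object

fresh = (f"x{i}" for i in itertools.count())

def evaluate(term, env):
    """Interpret syntax in a domain where lambdas become Python closures."""
    if isinstance(term, Var):
        return env.get(term.name, term)  # free variables stay as neutral syntax
    if isinstance(term, Lam):
        return lambda v: evaluate(term.body, {**env, term.param: v})
    return apply(evaluate(term.fun, env), evaluate(term.arg, env))

def apply(fun, val):
    # Call real closures; otherwise build a stuck (neutral) application.
    return fun(val) if callable(fun) else App(fun, reify(val))

def reify(val):
    """Read a semantic value back into syntax, using fresh variable names."""
    if callable(val):
        name = next(fresh)
        return Lam(name, reify(apply(val, Var(name))))
    return val

def normalize(term):
    return reify(evaluate(term, {}))

# (\f. (\x. f x) y) normalizes under the binder to (\x0. x0 y):
print(normalize(Lam("f", App(Lam("x", App(Var("f"), Var("x"))), Var("y")))))
```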

  13. Corticocortical feedback increases the spatial extent of normalization

    Science.gov (United States)

    Nassi, Jonathan J.; Gómez-Laberge, Camille; Kreiman, Gabriel; Born, Richard T.

    2014-01-01

    Normalization has been proposed as a canonical computation operating across different brain regions, sensory modalities, and species. It provides a good phenomenological description of non-linear response properties in primary visual cortex (V1), including the contrast response function and surround suppression. Despite its widespread application throughout the visual system, the underlying neural mechanisms remain largely unknown. We recently observed that corticocortical feedback contributes to surround suppression in V1, raising the possibility that feedback acts through normalization. To test this idea, we characterized area summation and contrast response properties in V1 with and without feedback from V2 and V3 in alert macaques and applied a standard normalization model to the data. Area summation properties were well explained by a form of divisive normalization, which computes the ratio between a neuron's driving input and the spatially integrated activity of a “normalization pool.” Feedback inactivation reduced surround suppression by shrinking the spatial extent of the normalization pool. This effect was independent of the gain modulation thought to mediate the influence of contrast on area summation, which remained intact during feedback inactivation. Contrast sensitivity within the receptive field center was also unaffected by feedback inactivation, providing further evidence that feedback participates in normalization independent of the circuit mechanisms involved in modulating contrast gain and saturation. These results suggest that corticocortical feedback contributes to surround suppression by increasing the visuotopic extent of normalization and, via this mechanism, feedback can play a critical role in contextual information processing. PMID:24910596

  14. The pathophysiology of the aqueduct stroke volume in normal pressure hydrocephalus: can co-morbidity with other forms of dementia be excluded?

    International Nuclear Information System (INIS)

    Bateman, Grant A.; Levi, Christopher R.; Wang, Yang; Lovett, Elizabeth C.; Schofield, Peter

    2005-01-01

    Variable results are obtained from the treatment of normal pressure hydrocephalus (NPH) by shunt insertion. There is a high correlation between NPH and the pathology of Alzheimer's disease (AD) on brain biopsy. There is an overlap between AD and vascular dementia (VaD), suggesting that a correlation exists between NPH and other forms of dementia. This study seeks to (1) understand the physiological factors behind, and (2) define the ability of, the aqueduct stroke volume to exclude dementia co-morbidity. Twenty-four patients from a dementia clinic were classified as having either early AD or VaD on the basis of clinical features, Hachinski score and neuropsychological testing. They were compared with 16 subjects with classical clinical findings of NPH and 12 age-matched non-cognitively impaired subjects. MRI flow quantification was used to measure aqueduct stroke volume and arterial pulse volume. An arterio-cerebral compliance ratio was calculated from the two volumes in each patient. The aqueduct stroke volume was elevated in all three forms of dementia, with no significant difference noted between the groups. The arterial pulse volume was elevated by 24% in VaD and reduced by 35% in NPH, compared to normal (P=0.05 and P=0.002, respectively), and was normal in AD. There was a spectrum of relative compliance with normal compliance in VaD and reduced compliance in AD and NPH. The aqueduct stroke volume depends on the arterial pulse volume and the relative compliance between the arterial tree and brain. The aqueduct stroke volume cannot exclude significant co-morbidity in NPH. (orig.)

  15. The pathophysiology of the aqueduct stroke volume in normal pressure hydrocephalus: can co-morbidity with other forms of dementia be excluded?

    Energy Technology Data Exchange (ETDEWEB)

    Bateman, Grant A. [John Hunter Hospital, Department of Medical Imaging, Newcastle (Australia); Levi, Christopher R.; Wang, Yang; Lovett, Elizabeth C. [Hunter Medical Research Institute, Clinical Neurosciences Program, Newcastle (Australia); Schofield, Peter [James Fletcher Hospital, Neuropsychiatry Unit, Newcastle (Australia)

    2005-10-01

    Variable results are obtained from the treatment of normal pressure hydrocephalus (NPH) by shunt insertion. There is a high correlation between NPH and the pathology of Alzheimer's disease (AD) on brain biopsy. There is an overlap between AD and vascular dementia (VaD), suggesting that a correlation exists between NPH and other forms of dementia. This study seeks to (1) understand the physiological factors behind, and (2) define the ability of, the aqueduct stroke volume to exclude dementia co-morbidity. Twenty-four patients from a dementia clinic were classified as having either early AD or VaD on the basis of clinical features, Hachinski score and neuropsychological testing. They were compared with 16 subjects with classical clinical findings of NPH and 12 age-matched non-cognitively impaired subjects. MRI flow quantification was used to measure aqueduct stroke volume and arterial pulse volume. An arterio-cerebral compliance ratio was calculated from the two volumes in each patient. The aqueduct stroke volume was elevated in all three forms of dementia, with no significant difference noted between the groups. The arterial pulse volume was elevated by 24% in VaD and reduced by 35% in NPH, compared to normal (P=0.05 and P=0.002, respectively), and was normal in AD. There was a spectrum of relative compliance with normal compliance in VaD and reduced compliance in AD and NPH. The aqueduct stroke volume depends on the arterial pulse volume and the relative compliance between the arterial tree and brain. The aqueduct stroke volume cannot exclude significant co-morbidity in NPH. (orig.)

  16. Cognitive Factors in the Choice of Syntactic Form by Aphasic and Normal Speakers of English and Japanese: The Speaker's Impulse.

    Science.gov (United States)

    Menn, Lise; And Others

    This study examined the role of empathy in the choice of syntactic form and the degree of independence of pragmatic and syntactic abilities in a range of aphasic patients. Study 1 involved 9 English-speaking and 9 Japanese-speaking aphasic subjects with 10 English-speaking and 4 Japanese normal controls. Study 2 involved 14 English- and 6…

  17. Comparative and quantitative determination of total hemoglobin concentration in normal and psoriatic patients

    International Nuclear Information System (INIS)

    Mahesar, S.M.; Dahot, M.U.; Khuhawar, M.Y.; Mahesar, H.U.

    2004-01-01

    The cyanmethaemoglobin technique is now recommended as the standard method by the International Committee for Standardization in Hematology and the British Standards Institution (1966). The hemoglobin is treated with a reagent containing potassium ferricyanide, potassium cyanide and potassium dihydrogen phosphate. The ferricyanide forms methaemoglobin, which is converted to cyanmethaemoglobin by the cyanide. The average hemoglobin values determined from the blood samples of normal and psoriatic males (n=44) and females (n=35) were 15.0, 12.7, 13.6 and 11.2 g/100 ml, respectively. The decrease in hemoglobin concentration could be due to anemia resulting from the epidermal cell proliferation in the inflammatory state and the keratolytic disorder that take place in psoriasis. (author)

  18. The Effect of Normal Force on Tribocorrosion Behaviour of Ti-10Zr Alloy and Porous TiO2-ZrO2 Thin Film Electrochemical Formed

    Science.gov (United States)

    Dănăilă, E.; Benea, L.

    2017-06-01

    The tribocorrosion behaviour of Ti-10Zr alloy and of a porous TiO2-ZrO2 thin film electrochemically formed on Ti-10Zr alloy was evaluated in Fusayama-Mayer artificial saliva solution. Tribocorrosion experiments were performed using a unidirectional pin-on-disc experimental set-up which was mechanically and electrochemically instrumented, under various solicitation conditions. The effect of the applied normal force on the tribocorrosion performance of the tested materials was determined. Open circuit potential (OCP) measurements performed before, during and after sliding tests were applied in order to determine the tribocorrosion degradation. The applied normal force was found to greatly affect the potential during tribocorrosion experiments: an increase in the normal force induced a decrease in potential, accelerating the depassivation of the materials studied. The results show a decrease in friction coefficient with gradually increasing normal load. It was proved that the porous TiO2-ZrO2 thin film electrochemically formed on Ti-10Zr alloy leads to an improvement in tribocorrosion resistance compared to the non-anodized Ti-10Zr alloy intended for biomedical applications.

  19. Breast composition measurements using retrospective standard mammogram form (SMF)

    International Nuclear Information System (INIS)

    Highnam, R; Pan, X; Warren, R; Jeffreys, M; Smith, G Davey; Brady, M

    2006-01-01

    The standard mammogram form (SMF) representation of an x-ray mammogram is a standardized, quantitative representation of the breast from which the volume of non-fat tissue and breast density can be easily estimated, both of which are of significant interest in determining breast cancer risk. Previous theoretical analysis of SMF had suggested that a complete and substantial set of calibration data (such as mAs and kVp) would be needed to generate realistic breast composition measures, and yet there are many interesting trials that have retrospectively collected images with no calibration data. The main contribution of this paper is to revisit our previous theoretical analysis of SMF with respect to errors in the calibration data and to show how and why that theoretical analysis did not match the results from the practical implementations of SMF. In particular, we show how, by estimating breast thickness for every image, we are effectively compensating for any errors in the calibration data. To illustrate our findings, the current implementation of SMF (version 2.2β) was run over 4028 digitized film-screen mammograms taken from six sites over the years 1988-2002, with and without using the known calibration data. Results show that the SMF implementation running without any calibration data at all generates results which display a strong relationship with those generated when running with a complete set of calibration data and, most importantly, with an expert's visual assessment of breast composition using established techniques. SMF shows considerable promise in being of major use in large epidemiological studies related to breast cancer which require the automated analysis of large numbers of films from many years previously, where little or no calibration data is available

  20. Using open technical e-learning standards and service-orientation to support new forms of e-assessment

    NARCIS (Netherlands)

    Miao, Yongwu; Tattersall, Colin; Schoonenboom, Judith; Stevanov, Krassen; Aleksieva-Petrova, Adelina

    2007-01-01

    Miao, Y., Tattersall, C., Schoonenboom, J., Stevanov, K., & Aleksieva-Petrova, A. (2007). Using open technical e-learning standards and service-orientation to support new forms of e-assessment. In D. Griffiths, R. Koper & O. Liber (Eds.), Proceedings of the second TENCompetence Open Workshop on

  1. ESCA studies on leached glass forms

    International Nuclear Information System (INIS)

    Dawkins, B.G.

    1979-01-01

    Electron Spectroscopy for Chemical Analysis (ESCA) results for frit, obsidian, NBS standard, and Savannah River Laboratory (SRL) glass forms that had been subjected to cumulative water leaching of 36 hours show that [Na] exhibits the largest and fastest change of all the elements observed. Leaching of surface Na occurred within minutes. Surface Na depletion increased with leach time. Continuous x-ray irradiation and argon ion milling induced Na mobility, precluding semiquantitative ESCA analysis at normal operating temperatures. However, the sample stage has been equipped with a liquid nitrogen supply, and alkali mobility should be eliminated in future work

  2. MR guided spatial normalization of SPECT scans

    International Nuclear Information System (INIS)

    Crouch, B.; Barnden, L.R.; Kwiatek, R.

    2010-01-01

    Full text: In SPECT population studies where magnetic resonance (MR) scans are also available, the higher resolution of the MR scans allows for an improved spatial normalization of the SPECT scans. In this approach, the SPECT images are first coregistered to their corresponding MR images by a linear (affine) transformation which is calculated using SPM's mutual information maximization algorithm. Non-linear spatial normalization maps are then computed either directly from the MR scans using SPM's built-in spatial normalization algorithm, or from segmented T1 MR images using DARTEL, an advanced diffeomorphism-based spatial normalization algorithm. We compare these MR-based methods to standard SPECT-based spatial normalization for a population of 27 fibromyalgia patients and 25 healthy controls with spin-echo T1 scans. We identify significant perfusion deficits in prefrontal white matter in FM patients, with the DARTEL-based spatial normalization procedure yielding stronger statistics than the standard SPECT-based spatial normalization. (author)

  3. The construction of normal expectations

    DEFF Research Database (Denmark)

    Quitzau, Maj-Britt; Røpke, Inge

    2008-01-01

    The gradual upward changes of standards in normal everyday life have significant environmental implications, and it is therefore important to study how these changes come about. The intention of the article is to analyze the social construction of normal expectations through a case study. The case concerns the present boom in bathroom renovations in Denmark, which offers an excellent opportunity to study the interplay between a wide variety of consumption drivers and social changes pointing toward long-term changes of normal expectations regarding bathroom standards. The study is problem-oriented and transdisciplinary and draws on a wide range of sociological, anthropological, and economic theories. The empirical basis comprises a combination of statistics, a review of magazine and media coverage, visits to exhibitions, and qualitative interviews. A variety of consumption drivers are identified. Among…

  4. Imaging the corpus callosum, septum pellucidum and fornix in children: normal anatomy and variations of normality

    International Nuclear Information System (INIS)

    Griffiths, Paul D.; Batty, Ruth; Connolly, Dan J.A.; Reeves, Michael J.

    2009-01-01

    The midline structures of the supra-tentorial brain are important landmarks for judging if the brain has formed correctly. In this article, we consider the normal appearances of the corpus callosum, septum pellucidum and fornix as shown on MR imaging in normal and near-normal states. (orig.)

  5. Intercoder Reliability of Mapping Between Pharmaceutical Dose Forms in the German Medication Plan and EDQM Standard Terms.

    Science.gov (United States)

    Sass, Julian; Becker, Kim; Ludmann, Dominik; Pantazoglou, Elisabeth; Dewenter, Heike; Thun, Sylvia

    2018-01-01

    A nationally uniform medication plan has recently become part of German legislation. The specification for the German medication plan was developed in cooperation between various stakeholders of the healthcare system. Its goal is to enhance usability and interoperability while also providing patients and physicians with the necessary information they require for a safe and high-quality therapy. Within the research and development project named Medication Plan PLUS, the specification of the medication plan was tested and reviewed for semantic interoperability in particular. In this study, the list of pharmaceutical dose forms provided in the specification was mapped to the standard terms of the European Directorate for the Quality of Medicines & HealthCare (EDQM) by different coders. The level of agreement between coders was calculated using Cohen's kappa (κ). Results show that less than half of the dose forms could be coded with EDQM standard terms. In addition, kappa was found to be moderate, indicating rather unconvincing agreement among coders. In conclusion, there is still vast room for improvement in the utilization of standardized international vocabulary, and unused potential considering cross-border eHealth implementations in the future.
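
    Cohen's kappa, the agreement statistic used here, corrects raw agreement for the agreement expected by chance: κ = (p_o − p_e)/(1 − p_e). A toy computation with invented dose-form codings from two coders:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical mappings of five dose forms to standard terms by two coders.
coder1 = ["tablet", "capsule", "solution", "tablet", "ointment"]
coder2 = ["tablet", "capsule", "suspension", "tablet", "cream"]

print(round(cohen_kappa_score(coder1, coder2), 2))  # agreement beyond chance
```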

  6. Study on electric parameters of wild and cultivated cotton forms being in normal state and irradiated

    International Nuclear Information System (INIS)

    Nazirov, N.N.; Kamalov, N.; Norbaev, N.

    1978-01-01

    The effect of radiation on the electrical conductivity of tissues under alternating current, on the electrical capacity and on the cell impedance has been studied. Gamma irradiation of seedlings results in definite changes in the electrical factors of cells (electrical conductivity, electrical capacity, impedance). It is shown that especially strong changes are revealed during gamma irradiation of the radiosensitive wild form of cotton plants. The deviation of the cells' electrical factors from the norm depends on the disruption of the evolutionarily established ion heterogeneity and of the state of the cell colloid system, which results in changes in their structure and in the metabolism within them

  7. Normalized modes at selected points without normalization

    Science.gov (United States)

    Kausel, Eduardo

    2018-04-01

    As every textbook on linear algebra demonstrates, the eigenvectors for the general eigenvalue problem |K - λM| = 0 involving two real, symmetric, positive definite matrices K, M satisfy some well-defined orthogonality conditions. Equally well-known is the fact that those eigenvectors can be normalized so that their modal mass μ = ϕᵀMϕ is unity: it suffices to divide each unscaled mode by the square root of the modal mass. Thus, the normalization is the result of an explicit calculation applied to the modes after they were obtained by some means. However, we show herein that the normalized modes are not merely convenient forms of scaling, but that they are actually intrinsic properties of the pair of matrices K, M, that is, the matrices already "know" about normalization even before the modes have been obtained. This means that we can obtain individual components of the normalized modes directly from the eigenvalue problem, and without needing to obtain either all of the modes or, for that matter, any one complete mode. These results are achieved by means of the residue theorem of operational calculus, a finding that is rather remarkable inasmuch as the residues themselves do not make use of any orthogonality conditions or normalization in the first place. It appears that this obscure property connecting the general eigenvalue problem of modal analysis with the residue theorem of operational calculus may have been overlooked up until now, but which has in turn interesting theoretical implications.
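
    Mass normalization is readily checked numerically. scipy.linalg.eigh solves the generalized problem (K − λM)ϕ = 0 and returns eigenvectors that are already M-orthonormal, i.e. with unit modal masses (the 3-DOF matrices below are a toy example):

```python
import numpy as np
from scipy.linalg import eigh

# A toy 3-DOF stiffness/mass pair (symmetric, positive definite).
K = np.array([[2., -1., 0.],
              [-1., 2., -1.],
              [0., -1., 1.]])
M = np.diag([1., 1., 2.])

lam, Phi = eigh(K, M)  # columns of Phi satisfy Phi.T @ M @ Phi = I
print(np.round(Phi.T @ M @ Phi, 10))  # identity matrix: every modal mass is 1
```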

  8. Use of normalized total dose to represent the biological effect of fractionated radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Flickinger, J C; Kalend, A [Pittsburgh University School of Medicine (USA). Department of Radiation Oncology; Pittsburgh Cancer Institute (USA)]

    1990-03-01

    There are currently a number of radiobiological models to account for the effects of dose fractionation and time. Normalized total dose (NTD) is not another new model but a previously reported, clinically useful form in which to represent the biological effect, determined by any specific radiobiological dose-fractionation model, of a course of radiation, using a single set of standardized, easily understood terminology. The generalized form of NTD reviewed in this paper describes the effect of a course of radiotherapy administered with nonstandard fractionation as the total dose of radiation in Gy that, if administered with a given reference fractionation (such as 2 Gy per fraction, 5 fractions per week), would produce an equivalent biological effect (probability of complications or tumor control) as predicted by a given dose-fractionation formula. The use of normalized total dose with several different exponential and linear-quadratic dose-fraction formulas is presented. (author). 51 refs.; 1 fig.; 1 tab.
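
    As a concrete instance of the idea, the linear-quadratic version of the normalization (often called EQD2 when the reference is 2 Gy per fraction) fits in a few lines. This is a generic LQ isoeffect sketch, not the paper's specific formulation; the review covers several such formulas, and the example numbers are illustrative.

```python
def normalized_total_dose(total_dose, dose_per_fraction, alpha_beta,
                          ref_dose_per_fraction=2.0):
    """Linear-quadratic NTD: the dose at the reference fractionation that is
    isoeffective with the given nonstandard course."""
    return (total_dose * (dose_per_fraction + alpha_beta)
            / (ref_dose_per_fraction + alpha_beta))

# e.g. 55 Gy in 20 fractions (2.75 Gy/fraction), late-responding tissue
# with alpha/beta = 3 Gy:
print(normalized_total_dose(55.0, 2.75, 3.0))   # ~63.25 Gy at 2 Gy/fraction
```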

  10. An echocardiographic study of healthy Border Collies with normal reference ranges for the breed.

    Science.gov (United States)

    Jacobson, Jake H; Boon, June A; Bright, Janice M

    2013-06-01

    The objectives of this study were to obtain standard echocardiographic measurements from healthy Border Collies and to compare these measurements to those previously reported for a general population of dogs. Standard echocardiographic data were obtained from twenty apparently healthy Border Collie dogs. These data (n = 20) were compared to data obtained from a general population of healthy dogs (n = 69). Border Collies were deemed healthy based on normal history, physical examination, complete blood count, serum biochemical profile, electrocardiogram, and blood pressure, with no evidence of congenital or acquired heart disease on echocardiographic examination. Standard two dimensional, M-mode, and Doppler echocardiographic measurements were obtained and normal ranges determined. The data were compared to data previously obtained at our hospital from a general population of normal dogs. Two dimensional, M-mode, and Doppler reference ranges for healthy Border Collies are presented in tabular form. Comparison of the weight adjusted M-mode echocardiographic means from Border Collies to those from the general population of dogs showed Border Collies to have larger left ventricular systolic and diastolic dimensions, smaller interventricular septal thickness, and lower fractional shortening. There are differences in some echocardiographic parameters between healthy Border Collies and the general dog population, and the echocardiographic reference ranges provided in this study should be used as breed specific reference values for Border Collies.

  11. Disconnected forms of the standard group

    International Nuclear Information System (INIS)

    McInnes, B.

    1996-10-01

    Recent work in quantum gravity has led to a revival of interest in the concept of disconnected gauge groups. Here we explain how to classify all of the (non-trivial) groups which have the same Lie algebra as the "standard group", SU(3) x SU(2) x U(1), without requiring connectedness. The number of possibilities is surprisingly large. We also discuss the geometry of the "Kiskis effect", the ambiguity induced by non-trivial spacetime topology in such gauge theories. (author). 12 refs

  12. Normal gravity field in relativistic geodesy

    Science.gov (United States)

    Kopeikin, Sergei; Vlasov, Igor; Han, Wen-Biao

    2018-02-01

    Modern geodesy is subject to a dramatic change from the Newtonian paradigm to Einstein's theory of general relativity. This is motivated by the ongoing advances in the development of quantum sensors for applications in geodesy, including quantum gravimeters and gradiometers, atomic clocks and fiber optics for making ultra-precise measurements of the geoid and the multipolar structure of the Earth's gravitational field. At the same time, very long baseline interferometry, satellite laser ranging, and global navigation satellite systems have achieved an unprecedented level of accuracy in measuring the 3-d coordinates of the reference points of the International Terrestrial Reference Frame and the world height system. The main geodetic reference standard to which gravimetric measurements of Earth's gravitational field are referred is the normal gravity field, represented in Newtonian gravity by the field of a uniformly rotating, homogeneous Maclaurin ellipsoid whose mass and quadrupole moment are equal to the total mass and (tide-free) quadrupole moment of Earth's gravitational field. The present paper extends the concept of the normal gravity field from the Newtonian theory to the realm of general relativity. We focus our attention on the calculation of the post-Newtonian approximation of the normal field, which is sufficient for current and near-future practical applications. We show that in general relativity the level surface of a homogeneous and uniformly rotating fluid is no longer described by the Maclaurin ellipsoid in the most general case but represents an axisymmetric spheroid of the fourth order with respect to the geodetic Cartesian coordinates. At the same time, admitting a post-Newtonian inhomogeneity of the mass density in the form of concentric elliptical shells allows one to preserve the level surface of the fluid as an exact ellipsoid of rotation. We parametrize the mass density distribution and the level surface with two parameters which are

  13. Is this the right normalization? A diagnostic tool for ChIP-seq normalization.

    Science.gov (United States)

    Angelini, Claudia; Heller, Ruth; Volkinshtein, Rita; Yekutieli, Daniel

    2015-05-09

    ChIP-seq experiments are becoming a standard approach for genome-wide profiling of protein-DNA interactions, such as detecting transcription factor binding sites, histone modification marks and RNA polymerase II occupancy. However, when comparing a ChIP sample versus a control sample, such as Input DNA, normalization procedures have to be applied in order to remove experimental sources of bias. Despite the substantial impact that the choice of normalization method can have on the results of a ChIP-seq data analysis, its assessment is not fully explored in the literature. In particular, there are no diagnostic tools that show whether the applied normalization is indeed appropriate for the data being analyzed. In this work we propose a novel diagnostic tool to examine the appropriateness of the estimated normalization procedure. By plotting the empirical densities of log relative risks in bins of equal read count, along with the estimated normalization constant after logarithmic transformation, the researcher is able to assess the appropriateness of the estimated normalization constant. We use the diagnostic plot to evaluate the appropriateness of the estimates obtained by CisGenome, NCIS and CCAT on several real data examples. Moreover, we show the impact that the choice of the normalization constant can have on standard tools for peak calling such as MACS or SICER. Finally, we propose a novel procedure for controlling the FDR using sample swapping. This procedure makes use of the estimated normalization constant in order to gain power over the naive choice of constant (used in MACS and SICER), which is the ratio of the total number of reads in the ChIP and Input samples. Linear normalization approaches aim to estimate a scale factor, r, to adjust for different sequencing depths when comparing ChIP versus Input samples. The estimated scaling factor can easily be incorporated in many peak caller algorithms to improve the accuracy of the peak identification. The
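
    The diagnostic idea can be mimicked on synthetic counts: bin genomic windows by Input read count, plot the empirical density of log relative risks per bin, and overlay the naive scale factor (the ratio of total ChIP to total Input reads). This is a toy re-creation with simulated Poisson counts, not the authors' implementation.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical per-window read counts for a ChIP and an Input sample
# (background-only windows, for illustration).
rng = np.random.default_rng(1)
input_counts = rng.poisson(20, 50_000)
chip_counts = rng.poisson(12, 50_000)

# MACS/SICER-style naive scale factor: ratio of total read counts.
r_naive = chip_counts.sum() / input_counts.sum()

mask = input_counts > 0
log_rr = np.log2((chip_counts[mask] + 0.5) / (input_counts[mask] + 0.5))
for lo, hi in [(1, 10), (10, 20), (20, 40)]:   # bins of comparable read count
    sel = (input_counts[mask] >= lo) & (input_counts[mask] < hi)
    plt.hist(log_rr[sel], bins=60, density=True, histtype="step",
             label=f"input count in [{lo},{hi})")
plt.axvline(np.log2(r_naive), ls="--", label="log2 naive scale factor")
plt.xlabel("log2 relative risk (ChIP vs Input)")
plt.legend()
plt.show()
```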

  14. Evaluation of the standard normal variate method for Laser-Induced Breakdown Spectroscopy data treatment applied to the discrimination of painting layers

    Science.gov (United States)

    Syvilay, D.; Wilkie-Chancellier, N.; Trichereau, B.; Texier, A.; Martinez, L.; Serfaty, S.; Detalle, V.

    2015-12-01

    Nowadays, Laser-Induced Breakdown Spectroscopy (LIBS) is frequently used for in situ analyses to identify pigments in mural paintings. Nonetheless, in situ analyses require robust instrumentation able to cope with harsh experimental conditions. These may cause variations in fluence and thus in the LIBS signal, which degrade the spectra and hence the results. Usually, to overcome these experimental errors, the LIBS signal is processed. The signal processing methods most commonly used are baseline subtraction and normalization to a spectral line. However, the latter assumes that the chosen element is a constant component of the material, which may not be the case in paint layers organized in stratigraphic layers; for this reason, it is sometimes difficult to apply this normalization. In this study, another normalization is employed to remove these signal variations. Standard normal variate (SNV) is a normalization designed for such conditions. It is sometimes implemented in diffuse reflectance infrared Fourier transform spectroscopy and in Raman spectroscopy, but rarely in LIBS. The SNV transformation is not newly applied to LIBS data, but for the first time the effect of SNV on LIBS spectra is evaluated in detail (laser energy, shot by shot, quantification). The aim of this paper is the quick visualization of the different layers of a stratigraphic painting sample by simple data representations (3D or 2D) after SNV normalization. In this investigation, we show the potential power of the SNV transformation to overcome undesired LIBS signal variations, but also its limits of application. The method appears to be a promising way to normalize LIBS data, which may be of interest for in situ depth analyses.
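
    For reference, SNV is a per-spectrum operation: each spectrum is centered by its own mean and scaled by its own standard deviation, so no reference line is needed. A minimal sketch, assuming the spectra are stored row-wise in a NumPy array:

```python
import numpy as np

def snv(spectra):
    """Standard normal variate: each spectrum (row) is centered by its own
    mean and scaled by its own standard deviation, making spectra comparable
    despite shot-to-shot fluence variations."""
    spectra = np.asarray(spectra, dtype=float)
    mu = spectra.mean(axis=1, keepdims=True)
    sd = spectra.std(axis=1, ddof=1, keepdims=True)
    return (spectra - mu) / sd
```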

  15. Desnutrição neonatal e microbiota normal da cavidade oral em ratos / Neonatal malnutrition and normal microbiota of the oral cavity in rats

    Directory of Open Access Journals (Sweden)

    Solange Maria Magalhães da Silva Porto

    2007-12-01

    Full Text Available OBJECTIVE: To evaluate the influence of neonatal malnutrition on the pattern and growth of aerobic bacteria of the normal microbiota of the oral cavity in adult Wistar rats. METHODS: Material from the oral cavity was collected with swabs soaked in 40 µL of sterile saline solution. After collection, each swab was placed in a sterile tube containing 960 µL of brain heart infusion, and the samples were homogenized. From each 1,000 µL sample, 1 µL was taken with a calibrated loop and sowed in Petri dishes containing blood agar and Levine agar for the isolation and identification of Gram-positive and Gram-negative bacteria, respectively. The plates were incubated at 37°C for 48 hours, and the colony-forming units that grew were counted and their percentages calculated. For bacterioscopy, slides were stained by the Gram method. RESULTS: From the 5th to the 21st day of life, the body weights of the malnourished group (33.6 g vs. 42.8 g, standard deviation = 27.2 g) were lower (p

  16. Standardizing data exchange for clinical research protocols and case report forms: An assessment of the suitability of the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM).

    Science.gov (United States)

    Huser, Vojtech; Sastry, Chandan; Breymaier, Matthew; Idriss, Asma; Cimino, James J

    2015-10-01

    Efficient communication of a clinical study protocol and case report forms during all stages of a human clinical study is important for many stakeholders. An electronic and structured study representation format that can be used throughout the whole study life-span can improve such communication and potentially lower total study costs. The most relevant standard for representing clinical study data, applicable to unregulated as well as regulated studies, is the Operational Data Model (ODM), in development since 1999 by the Clinical Data Interchange Standards Consortium (CDISC). ODM's initial objective was the exchange of case report form data, but it is increasingly utilized in other contexts. An ODM extension called Study Design Model, introduced in 2011, provides additional protocol representation elements. Using a case study approach, we evaluated ODM's ability to capture all necessary protocol elements during a complete clinical study lifecycle in the Intramural Research Program of the National Institutes of Health. ODM offers the advantage of a single format for institutions that deal with hundreds or thousands of concurrent clinical studies and maintain a data warehouse for these studies. For each study stage, we present a list of gaps in the ODM standard and identify necessary vendor or institutional extensions that can compensate for such gaps. The current version of ODM (1.3.2) has only partial support for study protocol and study registration data, mainly because this is outside its original development goal. ODM provides comprehensive support for representation of case report forms (in both the design stage and with patient level data). Inclusion of requirements of observational, non-regulated or investigator-initiated studies (outside Food and Drug Administration (FDA) regulation) can further improve future revisions of the standard.
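
    To make the representation concrete, the sketch below assembles a deliberately simplified ODM-style metadata skeleton. The element names follow the public CDISC ODM 1.3 vocabulary (Study, MetaDataVersion, FormDef, ItemDef), but the OIDs and attribute values are illustrative placeholders and the result is not claimed to be schema-valid.

```python
import xml.etree.ElementTree as ET

# Schematic ODM-style skeleton: a study with one form definition and one item.
odm = ET.Element("ODM", FileOID="Study1.v1", FileType="Snapshot")
study = ET.SubElement(odm, "Study", OID="ST.001")
mdv = ET.SubElement(study, "MetaDataVersion", OID="MDV.1", Name="Protocol v1")
ET.SubElement(mdv, "FormDef", OID="F.DEMOG", Name="Demographics",
              Repeating="No")
ET.SubElement(mdv, "ItemDef", OID="I.AGE", Name="Age", DataType="integer")

print(ET.tostring(odm, encoding="unicode"))
```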

  17. Frequency of Verbal Forms and Language Standard

    Directory of Open Access Journals (Sweden)

    Timur I. Galeev

    2017-11-01

    Full Text Available The article describes a corpus experiment that extracts detailed information about the cognitive structure of the evolution of the language standard (norm). The study was conducted using the Google Books Corpus, which provides unprecedented opportunities for linguistic research. The purpose of the experiment was to identify patterns in the evolution of competing forms at the center of the verbal paradigm (3Sg and 3Pl) on the basis of data on the frequency of their use. The study was conducted on redundant verb forms with a/o root-vowel variation (обусловливать/обуславливать). The competition graphs for variant word forms clearly illustrate that the process of norm change consists of stages, each with numerical characteristics of the use of the two competing word forms. The chronological boundaries of a change in an inflectional model are established to within 10 years. The graphs obtained in the experiment make it possible to conclude that almost half of the verbs were not in fact variative, although they were previously considered to be. In discussing the empirical data, a conclusion is drawn about the morphemic structure of words in which the root vowel changes. With information about similar processes in other verb paradigms, researchers can predict possible changes of inflectional models in the future and, consequently, the fixing of a new norm in lexicographic, orthographic and orthoepic sources.
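
    The competition measure behind such graphs is simply the share of one variant among the two per time slice. A toy sketch with invented decade counts standing in for Google Books Corpus frequencies of the two root-vowel variants:

```python
# Hypothetical decade counts for two competing verb forms (the actual
# frequencies would come from the Google Books Corpus).
decades = [1900, 1920, 1940, 1960, 1980, 2000]
form_o = [310, 280, 240, 190, 120, 80]   # older normative variant (о-root)
form_a = [40, 70, 130, 210, 290, 360]    # newer competing variant (а-root)

# Share of the newer form per decade, and the decade where it overtakes.
share_a = [a / (a + o) for a, o in zip(form_a, form_o)]
crossover = next(d for d, s in zip(decades, share_a) if s > 0.5)
print(f"the newer form overtakes the older one around the {crossover}s")
```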

  18. The method of normal forms for singularly perturbed systems of Fredholm integro-differential equations with rapidly varying kernels

    Energy Technology Data Exchange (ETDEWEB)

    Bobodzhanov, A A; Safonov, V F [National Research University " Moscow Power Engineering Institute" , Moscow (Russian Federation)

    2013-07-31

    The paper deals with extending the Lomov regularization method to classes of singularly perturbed Fredholm-type integro-differential systems which have not so far been studied, namely those in which the limiting operator is discretely noninvertible. Such systems are commonly known as problems with unstable spectrum. Separating out the essential singularities in the solutions of these problems presents great difficulties, the principal one being to give an adequate description of the singularities induced by 'instability points' of the spectrum. A methodology for separating singularities by using normal forms is developed, applied to systems of the above type, and substantiated for them. Bibliography: 10 titles.

  19. Problems with using the normal distribution--and ways to improve quality and efficiency of data analysis.

    Directory of Open Access Journals (Sweden)

    Eckhard Limpert

    Full Text Available BACKGROUND: The Gaussian or normal distribution is the most established model to characterize quantitative variation of original data. Accordingly, data are summarized using the arithmetic mean and the standard deviation, by mean ± SD, or with the standard error of the mean, mean ± SEM. This, together with corresponding bars in graphical displays, has become the standard way to characterize variation. METHODOLOGY/PRINCIPAL FINDINGS: Here we question the adequacy of this characterization, and of the model. The published literature provides numerous examples for which such descriptions appear inappropriate because, based on the "95% range check", their distributions are obviously skewed. In these cases, the symmetric characterization is a poor description and may trigger wrong conclusions. To solve the problem, it is enlightening to regard causes of variation. Multiplicative causes are in general far more important than additive ones, and benefit from a multiplicative (or log-normal) approach. Fortunately, quite similarly to the normal, the log-normal distribution can now be handled easily and characterized at the level of the original data with the help of a new sign, x/ ("times-divide"), and corresponding notation. Analogous to mean ± SD, it connects the multiplicative (or geometric) mean mean* and the multiplicative standard deviation s* in the form mean* x/ s*, which is advantageous and recommended. CONCLUSIONS/SIGNIFICANCE: The corresponding shift from the symmetric to the asymmetric view will substantially increase both recognition of data distributions and interpretation quality. It will allow for savings in sample size that can be considerable. Moreover, this is in line with ethical responsibility. Adequate models will improve concepts and theories, and provide deeper insight into science and life.
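
    The multiplicative summary advocated here is easy to compute: take logs, summarize, and exponentiate back. A minimal sketch of mean* and s*, with the times/divide interval mean* x/ s*:

```python
import numpy as np

# Synthetic skewed data; any positive, multiplicatively generated variable works.
x = np.random.default_rng(2).lognormal(mean=1.0, sigma=0.5, size=1000)

log_x = np.log(x)
mean_star = np.exp(log_x.mean())      # multiplicative (geometric) mean, mean*
s_star = np.exp(log_x.std(ddof=1))    # multiplicative standard deviation, s*

# Roughly 68% of the data fall in the "times/divide" interval mean* x/ s*:
print(f"mean* x/ s*: [{mean_star / s_star:.2f}, {mean_star * s_star:.2f}]")
```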

  20. A structure-preserving approach to normal form analysis of power systems; Una propuesta de preservacion de estructura al analisis de su forma normal en sistemas de potencia

    Energy Technology Data Exchange (ETDEWEB)

    Martinez Carrillo, Irma

    2008-01-15

    Power system dynamic behavior is inherently nonlinear and is driven by different processes at different time scales. The size and complexity of these mechanisms have stimulated the search for methods that reduce the original dimension but retain a certain degree of accuracy. In this dissertation, a novel nonlinear dynamical analysis method for the analysis of large amplitude oscillations that embraces ideas from normal form theory and singular perturbation techniques is proposed. This approach allows the full potential of the normal form method to be reached, and is suitably general for application to a wide variety of nonlinear systems. Drawing on the formal theory of dynamical systems, a structure-preserving model of the system is developed that preserves network and load characteristics. By exploiting the separation of fast and slow time scales of the model, an efficient approach based on singular perturbation techniques is then derived for constructing a nonlinear power system representation that accurately preserves network structure. The method requires no reduction of the constraint equations and therefore gives information about the effect of network and load characteristics on system behavior. Analytical expressions are then developed that provide approximate solutions to system performance near a singularity, and techniques for interpreting these solutions in terms of modal functions are given. New insights into the nature of nonlinear oscillations are also offered, and criteria for characterizing network effects on nonlinear system behavior are proposed. Theoretical insight into the behavior of dynamic coupling of differential-algebraic equations and the origin of nonlinearity is given, and implications for the analysis, design and placement of power system controllers in complex nonlinear systems are discussed. The extent of applicability of the proposed procedure is demonstrated by analyzing nonlinear behavior in two realistic test power systems.

  1. Standard test method for static leaching of monolithic waste forms for disposal of radioactive waste

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This test method provides a measure of the chemical durability of a simulated or radioactive monolithic waste form, such as a glass, ceramic, cement (grout), or cermet, in a test solution at temperatures <100°C under low specimen surface-area-to-leachant-volume (S/V) ratio conditions. 1.2 This test method can be used to characterize the dissolution or leaching behaviors of various simulated or radioactive waste forms in various leachants under the specific conditions of the test based on analysis of the test solution. Data from this test are used to calculate normalized elemental mass loss values from specimens exposed to aqueous solutions at temperatures <100°C. 1.3 The test is conducted under static conditions in a constant solution volume and at a constant temperature. The reactivity of the test specimen is determined from the amounts of components released and accumulated in the solution over the test duration. A wide range of test conditions can be used to study material behavior, includin...
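
    The normalization referred to in 1.2 is conventionally the normalized elemental mass loss, NL(i) = c(i)·V / (f(i)·S), where c(i) is the element's solution concentration, V the leachant volume, f(i) the element's mass fraction in the waste form, and S the specimen surface area. The helper below is a generic sketch of that calculation; the example values are illustrative and not taken from the standard.

```python
def normalized_mass_loss(conc_g_per_L, volume_L, mass_fraction, surface_m2):
    """NL(i) = c(i) * V / (f(i) * S), in g/m^2: mass released per unit
    surface area, normalized by the element's abundance in the waste form."""
    return conc_g_per_L * volume_L / (mass_fraction * surface_m2)

# e.g. 1 mg/L of an element that makes up 10 wt% of the waste form,
# leached in 0.1 L against a 4e-4 m^2 specimen:
nl = normalized_mass_loss(1e-3, 0.1, 0.10, 4e-4)
print(nl, nl / 28.0)   # g/m^2, and g/(m^2*day) over a 28-day test
```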

  2. A Generalized Form of Context-Dependent Psychophysiological Interactions (gPPI): A Comparison to Standard Approaches

    Science.gov (United States)

    McLaren, Donald G.; Ries, Michele L.; Xu, Guofan; Johnson, Sterling C.

    2012-01-01

    Functional MRI (fMRI) allows one to study task-related regional responses and to perform task-dependent connectivity analysis using psychophysiological interaction (PPI) methods. The latter affords the additional opportunity to understand how brain regions interact in a task-dependent manner. The current implementation of PPI in Statistical Parametric Mapping (SPM8) is configured primarily to assess connectivity differences between two task conditions, when in practice fMRI tasks frequently employ more than two conditions. Here we evaluate how a generalized form of context-dependent PPI (gPPI; http://www.nitrc.org/projects/gppi), which is configured to automatically accommodate more than two task conditions in the same PPI model by spanning the entire experimental space, compares to the standard implementation in SPM8. These comparisons are made using both simulations and an empirical dataset. In the simulated dataset, we compare the interaction beta estimates to their expected values and model fit using the Akaike Information Criterion (AIC). We found that interaction beta estimates in gPPI were robust to different simulated data models, were not different from the expected beta value, and had better model fits than when using standard PPI (sPPI) methods. In the empirical dataset, we compare the model fit of the gPPI approach to sPPI. We found that the gPPI approach improved model fit compared to sPPI. There were several regions that became non-significant with gPPI. These regions all showed significantly better model fits with gPPI. Also, there were several regions where task-dependent connectivity was only detected using gPPI methods, also with improved model fit. Regions that were detected with all methods had more similar model fits. These results suggest that gPPI may have greater sensitivity and specificity than the standard implementation in SPM. This notion is tempered slightly as there is no gold standard; however, data simulations with a known outcome support our
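
    The core difference between sPPI and gPPI can be caricatured in a few lines: instead of a single interaction regressor built from a two-condition contrast, gPPI forms one interaction regressor per condition. The sketch below is a deliberately stripped-down illustration, not the SPM/gPPI toolbox itself; in particular it omits the deconvolution of the seed signal to the neural level and the HRF convolution that the real pipeline performs.

```python
import numpy as np

def gppi_interaction_regressors(seed_ts, condition_boxcars):
    """One PPI interaction regressor per task condition (the gPPI idea).

    seed_ts: (n_scans,) physiological timecourse from the seed region.
    condition_boxcars: dict of condition name -> (n_scans,) 0/1 boxcar.
    """
    return {name: boxcar * seed_ts
            for name, boxcar in condition_boxcars.items()}

# Toy usage with two conditions over 10 scans.
seed = np.random.default_rng(0).standard_normal(10)
boxcars = {"taskA": np.array([1, 1, 1, 0, 0, 0, 0, 0, 0, 0], float),
           "taskB": np.array([0, 0, 0, 0, 1, 1, 1, 0, 0, 0], float)}
regs = gppi_interaction_regressors(seed, boxcars)
```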

  3. Manufacturing technology for practical Josephson voltage standards

    International Nuclear Information System (INIS)

    Kohlmann, Johannes; Kieler, Oliver

    2016-01-01

    In this contribution we present the manufacturing technology for the fabrication of integrated superconducting Josephson series circuits for voltage standards. We first summarize some foundations of Josephson voltage standards and sketch the concept and setup of the circuits, before describing the manufacturing technology for modern practical Josephson voltage standards.

  4. Standard forms and entanglement engineering of multimode Gaussian states under local operations

    International Nuclear Information System (INIS)

    Serafini, Alessio; Adesso, Gerardo

    2007-01-01

    We investigate the action of local unitary operations on multimode (pure or mixed) Gaussian states and single out the minimal number of locally invariant parameters which completely characterize the covariance matrix of such states. For pure Gaussian states, central resources for continuous-variable quantum information, we investigate separately the parameter reduction due to the additional constraint of global purity, and the one following from the local-unitary freedom. Counting arguments and insights from the phase-space Schmidt decomposition, and in general from the framework of symplectic analysis, accompany our description of the standard form of pure n-mode Gaussian states. In particular, we clarify why only in pure states with n ≤ 3 modes all the direct correlations between position and momentum operators can be set to zero by local unitary operations. For any n, the emerging minimal set of parameters contains complete information about all forms of entanglement in the corresponding states. An efficient state engineering scheme (able to encode direct correlations between position and momentum operators as well) is proposed to produce entangled multimode Gaussian resources, its number of optical elements matching the minimal number of locally invariant degrees of freedom of general pure n-mode Gaussian states. Finally, we demonstrate that so-called 'block-diagonal' Gaussian states, without direct correlations between position and momentum, are systematically less entangled, on average, than arbitrary pure Gaussian states

  5. Dual radiofrequency drive quantum voltage standard with nanovolt resolution based on a closed-loop refrigeration cycle

    International Nuclear Information System (INIS)

    Georgakopoulos, D; Budovsky, I; Hagen, T; Sasaki, H; Yamamori, H

    2012-01-01

    We have developed a programmable Josephson voltage standard that can produce voltages up to 20 V with a resolution of better than 0.1 µV over the whole voltage range and better than 1 nV for voltages up to 10 mV. The standard has two superconductor–normal metal–superconductor junction arrays connected in series and driven by two radiofrequency oscillators. The cryogenic part of the standard is based on a cryocooler. The new standard agrees with the primary quantum voltage standard maintained at the National Measurement Institute, Australia, within 10 nV and forms the basis of an automated calibration system for digital multimeters and voltage references. (paper)

  6. Visual attention and flexible normalization pools

    Science.gov (United States)

    Schwartz, Odelia; Coen-Cagli, Ruben

    2013-01-01

    Attention to a spatial location or feature in a visual scene can modulate the responses of cortical neurons and affect perceptual biases in illusions. We add attention to a cortical model of spatial context based on a well-founded account of natural scene statistics. The cortical model amounts to a generalized form of divisive normalization, in which the surround is in the normalization pool of the center target only if they are considered statistically dependent. Here we propose that attention influences this computation by accentuating the neural unit activations at the attended location, and that the amount of attentional influence of the surround on the center thus depends on whether center and surround are deemed in the same normalization pool. The resulting form of the model extends a recent divisive normalization model of attention (Reynolds & Heeger, 2009). We simulate cortical surround orientation experiments with attention and show that the flexible model is suitable for capturing additional data and makes nontrivial testable predictions. PMID:23345413
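
    In caricature, the "flexible pool" means the surround divides the center's response only when the two are inferred to be statistically dependent, and attention scales unit activations before they enter the pool. A toy sketch under these assumptions, not the authors' full model:

```python
def flexible_normalized_response(center_drive, surround_drive, dependent,
                                 attn_center=1.0, attn_surround=1.0,
                                 sigma=1.0):
    """Toy divisive normalization with a flexible pool: the surround enters
    the normalization pool only when center and surround are deemed
    statistically dependent; attention multiplies unit activations before
    they enter the computation (cf. Reynolds & Heeger, 2009)."""
    c = attn_center * center_drive
    s = attn_surround * surround_drive
    pool = c + (s if dependent else 0.0)   # flexible normalization pool
    return c / (sigma + pool)

# Attending to the surround increases its suppressive effect on the center
# only when the two are treated as one pool:
print(flexible_normalized_response(1.0, 2.0, dependent=True, attn_surround=2.0))
print(flexible_normalized_response(1.0, 2.0, dependent=False, attn_surround=2.0))
```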

  7. International Electrotechnical Commission standards and French material control standards

    International Nuclear Information System (INIS)

    Furet, J.; Weill, J.

    1978-01-01

    The international standards developed within IEC Subcommittee 45A (Nuclear Reactor Instrumentation) and the national standards elaborated by the Commissariat a l'Energie Atomique (CEA) group for standardized control equipment are reported, together with the degree to which they are applied in the basic design, calls for bids and operation of nuclear power plants. (J.E. de C)

  8. Ultrasonographic features of normal lower ureters

    International Nuclear Information System (INIS)

    Kim, Young Soon; Bae, M. Y.; Park, K. J.; Jeon, H. S.; Lee, J. H.

    1990-01-01

    Although ultrasonographic evaluation of the normal ureters is difficult due to bowel gas, the lower segment of the normal ureters can be visualized using the urinary bladder as an acoustic window. The authors prospectively performed ultrasonography with the standard suprapubic technique and analyzed the ultrasonographic features of normal lower ureters in 79 cases (77%). The length of the visualized segment of the distal ureter ranged from 1.5 cm to 7.2 cm, and the visualized segment did not exceed 3.9 mm in maximum diameter. Knowledge of the sonographic features of the normal lower ureters can be helpful in the evaluation of pathologic or suspected pathologic conditions of the lower ureters

  9. Normalization in Lie algebras via mould calculus and applications

    Science.gov (United States)

    Paul, Thierry; Sauzin, David

    2017-11-01

    We establish Écalle's mould calculus in an abstract Lie-theoretic setting and use it to solve a normalization problem, which covers several formal normal form problems in the theory of dynamical systems. The mould formalism allows us to reduce the Lie-theoretic problem to a mould equation, the solutions of which are remarkably explicit and can be fully described by means of a gauge transformation group. The dynamical applications include the construction of Poincaré-Dulac formal normal forms for a vector field around an equilibrium point, a formal infinite-order multiphase averaging procedure for vector fields with fast angular variables (Hamiltonian or not), or the construction of Birkhoff normal forms both in classical and quantum situations. As a by-product we obtain, in the case of harmonic oscillators, the convergence of the quantum Birkhoff form to the classical one, without any Diophantine hypothesis on the frequencies of the unperturbed Hamiltonians.

  10. Strong normalization by type-directed partial evaluation and run-time code generation

    DEFF Research Database (Denmark)

    Balat, Vincent; Danvy, Olivier

    1998-01-01

    We investigate the synergy between type-directed partial evaluation and run-time code generation for the Caml dialect of ML. Type-directed partial evaluation maps simply typed, closed Caml values to a representation of their long βη-normal form. Caml uses a virtual machine and has the capability to load byte code at run time. Representing the long βη-normal forms as byte code gives us the ability to strongly normalize higher-order values (i.e., weak head normal forms in ML), to compile the resulting strong normal forms into byte code, and to load this byte code all in one go, at run time. We conclude this note with a preview of our current work on scaling up strong normalization by run-time code generation to the Caml module language.

  12. The effectiveness of Microsoft Project in assessing extension of time under PAM 2006 standard form of contract

    Science.gov (United States)

    Suhaida, S. K.; Wong, Z. D.

    2017-11-01

    Time equals money, and this applies with particular force in the construction industry. Most standard forms of contract provide contractual clauses to ascertain the time and money consequences of particular scenarios, Extension of Time (EOT) being one of them. Under qualifying circumstances and delays, the contractor is allowed to apply for an EOT in order to complete the works by a later completion date without Liquidated Damages (LD) being imposed on the claimant. However, both claimants and assessors encounter problems in assessing EOT. The aim of this research is to recommend the usage of Microsoft Project as a tool for assessing EOT under the standard form of contract PAM 2006. A quantitative method was applied to respondents consisting of architects and quantity surveyors (QS) in order to collect data on the challenges in assessing EOT claims and on the effectiveness of Microsoft Project as a tool. The findings of this research highlight that Microsoft Project can serve as a basis for performing EOT tasks, as the software can be used as a data bank to store information crucial for preparing and evaluating EOT claims.

  13. Production of sodalite waste forms by addition of glass

    International Nuclear Information System (INIS)

    Pereira, C.

    1995-01-01

    Spent nuclear fuel can be treated in a molten salt electrorefiner for conversion into metal and mineral waste forms for geologic disposal. Sodalite is one of the mineral waste forms under study. Fission products in the molten salt are ion-exchanged into zeolite A, which is converted to sodalite and consolidated. Sodalite can be formed directly from mixtures of salt and zeolite A at temperatures above 975 K; however, nepheline is usually produced as a secondary phase. Addition of small amounts of glass frit to the mixture reduced nepheline formation significantly. Loss of fission products was not observed for reaction below 1000 K. Hot-pressing of the sodalite powders yielded dense pellets (∼2.3 g/cm³) without any loss of fission product species. Normalized release rates were below 1 g/m²·day for pre-washed samples in 28-day leach tests based on standard MCC-1 tests but increased with the presence of free salt on the sodalite

  14. PowerForms

    DEFF Research Database (Denmark)

    Brabrand, Claus; Møller, Anders; Ricky, Mikkel

    2000-01-01

    All uses of HTML forms may benefit from validation of the specified input field values. Simple validation matches individual values against specified formats, while more advanced validation may involve interdependencies of form fields. There is currently no standard for specifying or implementing...

  15. Chern–Simons–Antoniadis–Savvidy forms and standard supergravity

    Energy Technology Data Exchange (ETDEWEB)

    Izaurieta, F., E-mail: fizaurie@udec.cl; Salgado, P., E-mail: pasalgad@udec.cl; Salgado, S., E-mail: sesalgado@udec.cl

    2017-04-10

    In the context of the so-called Chern–Simons–Antoniadis–Savvidy (ChSAS) forms, we use the methods of FDA decomposition in 1-forms to construct a four-dimensional ChSAS supergravity action for the Maxwell superalgebra. On the other hand, we use the Extended Cartan Homotopy Formula to find a method that allows the separation of the ChSAS action into bulk and boundary contributions and permits the splitting of the bulk Lagrangian into pieces that reflect the particular subspace structure of the gauge algebra.

  16. Review of clinically accessible methods to determine lean body mass for normalization of standardized uptake values

    International Nuclear Information System (INIS)

    DEVRIESE, Joke; POTTEL, Hans; BEELS, Laurence; MAES, Alex; VAN DE WIELE, Christophe; GHEYSENS, Olivier

    2016-01-01

    With the routine use of 2-deoxy-2-[18F]-fluoro-D-glucose (18F-FDG) positron emission tomography/computed tomography (PET/CT) scans, the metabolic activity of tumors can be quantitatively assessed through calculation of SUVs. One possible normalization parameter for the standardized uptake value (SUV) is lean body mass (LBM), which is generally calculated through predictive equations based on height and body weight. (Semi-)direct measurements of LBM could provide more accurate results in cancer populations than predictive equations based on healthy populations. In this context, four methods to determine LBM are reviewed: bioelectrical impedance analysis, dual-energy X-ray absorptiometry, CT, and magnetic resonance imaging. These methods were selected based on clinical accessibility and are compared in terms of methodology, precision and accuracy. By assessing each method's specific advantages and limitations, a well-considered choice of method can hopefully lead to more accurate SUV(LBM) values, hence more accurate quantitative assessment of 18F-FDG PET images.
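
    As a concrete example of the predictive-equation approach that (semi-)direct measurements would replace, the sketch below computes SUV normalized to LBM using the James equations, which are commonly used for SUL in PET. The constants are the usual published values, the assumption of 1 g/mL tissue density is implicit in the units, and the example numbers are illustrative.

```python
def lbm_james(weight_kg, height_cm, sex):
    """James predictive equations for lean body mass (kg)."""
    if sex == "M":
        return 1.10 * weight_kg - 128 * (weight_kg / height_cm) ** 2
    return 1.07 * weight_kg - 148 * (weight_kg / height_cm) ** 2

def sul(activity_bq_per_ml, injected_dose_bq, weight_kg, height_cm, sex):
    """SUV normalized to lean body mass: concentration / (dose / LBM)."""
    lbm_g = 1000 * lbm_james(weight_kg, height_cm, sex)
    return activity_bq_per_ml / (injected_dose_bq / lbm_g)

# e.g. 5 kBq/mL in a lesion after a 300 MBq injection, 80 kg / 175 cm male:
print(sul(5_000, 300e6, 80, 175, "M"))
```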

  17. Radioimmunoassay in the diagnosis of atypical form of thyrotoxicosis

    International Nuclear Information System (INIS)

    Livshits, G.Ya.

    1984-01-01

    Fifty-six patients with "unmotivated" disorders of the cardiac rhythm were examined. A combined radionuclide study was conducted, including a study of iodine-absorptive function by a standard technique, thyroid visualization, and determination of the thyroxine and triiodothyronine levels in blood serum by radioimmunoassay using standard diagnostic kits. Latent thyroid hyperfunction was revealed in 24 patients (42.8%). The study of iodine-absorptive function revealed pathological changes in only 8 patients, whereas radioimmunoassay revealed a significant elevation of the peripheral thyroid hormone level, as compared to the control group, in 24 patients. The conclusion is that patients with "unmotivated" disorders of the cardiac rhythm often suffer from latent thyrotoxicosis, which is the main etiological factor and trigger mechanism of the arrhythmias. In such cases the rhythm disorder is the only clinical symptom of thyrotoxicosis, which makes it possible to regard this form of the disease as monosymptomatic. Early detection of the cause of the cardiac rhythm disorder and the prescription of pathogenetic thyrostatic therapy returned the cardiac cycle rate to normal in all patients with sinus tachycardia and prevented relapses of the paroxysmal forms of rhythm disorder

  19. Parallel Computation on Multicore Processors Using Explicit Form of the Finite Element Method and C++ Standard Libraries

    Directory of Open Access Journals (Sweden)

    Rek Václav

    2016-11-01

    Full Text Available In this paper, we introduce modifications of existing sequential code, written in the C or C++ programming language, for the calculation of various kinds of structures using the explicit form of the Finite Element Method (Dynamic Relaxation Method, Explicit Dynamics) in the NEXX system. The NEXX system is the core of the engineering software NEXIS, Scia Engineer, RFEM and RENEX. It supports multithreaded execution, which can now be implemented at the level of the native C++ programming language using its standard libraries. Thanks to the high degree of abstraction that contemporary C++ provides, a library created in this way can be readily generalized to other uses of parallelism in computational mechanics.

  20. A compact fiber optics-based heterodyne combined normal and transverse displacement interferometer.

    Science.gov (United States)

    Zuanetti, Bryan; Wang, Tianxue; Prakash, Vikas

    2017-03-01

    While Photonic Doppler Velocimetry (PDV) has become a common diagnostic tool for the measurement of the normal component of particle motion in shock wave experiments, this technique has not yet been modified for the measurement of combined normal and transverse motion, as needed in oblique plate impact experiments. In this paper, we discuss the design and implementation of a compact fiber-optics-based heterodyne combined normal and transverse displacement interferometer. Like standard PDV, this diagnostic tool is assembled using commercially available telecommunications hardware and uses a 1550 nm wavelength 2 W fiber-coupled laser, an optical focuser, and single-mode fibers to transport light to and from the target. Two additional optical probes capture first-order beams diffracted from a reflective grating at the target free surface and deliver the beams past circulators and a coupler, where the signals are combined to form a beat frequency. The combined signal is then digitized and analyzed to determine the transverse component of the particle motion. The maximum normal velocity that can be measured by this system is limited by the equivalent transmission bandwidth (3.795 GHz) of the combined detector, amplifier, and digitizer and is estimated to be ∼2.9 km/s. Sample symmetric oblique plate-impact experiments are performed to demonstrate the capability of this diagnostic tool in the measurement of combined normal and transverse particle motion.
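
    The quoted velocity limit follows from the standard PDV relation between beat frequency and normal velocity, f = 2v/λ; with the stated bandwidth this gives

$$ v_{\max} \;=\; \frac{\lambda\, f_{\max}}{2} \;=\; \frac{(1550\ \mathrm{nm})\,(3.795\ \mathrm{GHz})}{2} \;\approx\; 2.94\ \mathrm{km/s}, $$

    consistent with the ∼2.9 km/s figure in the abstract.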

  1. Impact of PET/CT image reconstruction methods and liver uptake normalization strategies on quantitative image analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kuhnert, Georg; Sterzer, Sergej; Kahraman, Deniz; Dietlein, Markus; Drzezga, Alexander; Kobe, Carsten [University Hospital of Cologne, Department of Nuclear Medicine, Cologne (Germany); Boellaard, Ronald [VU University Medical Centre, Department of Radiology and Nuclear Medicine, Amsterdam (Netherlands); Scheffler, Matthias; Wolf, Juergen [University Hospital of Cologne, Lung Cancer Group Cologne, Department I of Internal Medicine, Center for Integrated Oncology Cologne Bonn, Cologne (Germany)

    2016-02-15

    In oncological imaging using PET/CT, the standardized uptake value has become the most common parameter used to measure tracer accumulation. The aim of this analysis was to evaluate ultra high definition (UHD) and ordered subset expectation maximization (OSEM) PET/CT reconstructions for their potential impact on quantification. We analyzed 40 PET/CT scans of lung cancer patients. Standardized uptake values corrected for body weight (SUV) and lean body mass (SUL) were determined in the single hottest lesion in the lung and normalized to the liver for UHD and OSEM reconstruction. Quantitative uptake values and their normalized ratios for the two reconstruction settings were compared using the Wilcoxon test. The distribution of quantitative uptake values and their ratios in relation to the reconstruction method used were demonstrated in the form of frequency distribution curves, box-plots and scatter plots. The agreement between OSEM and UHD reconstructions was assessed through Bland-Altman analysis. A significant difference was observed between OSEM and UHD reconstruction for all SUV and SUL data tested (p < 0.0005 in all cases). The mean values of the ratios after OSEM and UHD reconstruction showed equally significant differences (p < 0.0005 in all cases). Bland-Altman analysis showed that the SUV and SUL and their normalized values were, on average, up to 60 % higher after UHD reconstruction as compared to OSEM reconstruction. OSEM and UHD reconstruction yielded significantly different SUV and SUL values, and the difference remained consistently high after normalization to the liver, indicating that standardization of reconstruction and the use of comparable SUV measurements are crucial when using PET/CT. (orig.)

  2. Normalized cDNA libraries

    Science.gov (United States)

    Soares, Marcelo B.; Efstratiadis, Argiris

    1997-01-01

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to moderate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library.

  3. A compiler for variational forms

    OpenAIRE

    Kirby, Robert C.; Logg, Anders

    2011-01-01

    As a key step towards a complete automation of the finite element method, we present a new algorithm for automatic and efficient evaluation of multilinear variational forms. The algorithm has been implemented in the form of a compiler, the FEniCS Form Compiler FFC. We present benchmark results for a series of standard variational forms, including the incompressible Navier-Stokes equations and linear elasticity. The speedup compared to the standard quadrature-based approach is impressive; in s...

  4. Smooth quantile normalization.

    Science.gov (United States)

    Hicks, Stephanie C; Okrah, Kwame; Paulson, Joseph N; Quackenbush, John; Irizarry, Rafael A; Bravo, Héctor Corrada

    2018-04-01

    Between-sample normalization is a critical step in genomic data analysis to remove systematic bias and unwanted technical variation in high-throughput data. Global normalization methods are based on the assumption that observed variability in global properties is due to technical reasons and is unrelated to the biology of interest. For example, some methods correct for differences in sequencing read counts by scaling features to have similar median values across samples, but these fail to reduce other forms of unwanted technical variation. Methods such as quantile normalization transform the statistical distributions across samples to be the same and assume global differences in the distribution are induced by only technical variation. However, it remains unclear how to proceed with normalization if these assumptions are violated, for example, if there are global differences in the statistical distributions between biological conditions or groups, and external information, such as negative or control features, is not available. Here, we introduce a generalization of quantile normalization, referred to as smooth quantile normalization (qsmooth), which is based on the assumption that the statistical distribution of each sample should be the same (or have the same distributional shape) within biological groups or conditions, but allowing that they may differ between groups. We illustrate the advantages of our method on several high-throughput datasets with global differences in distributions corresponding to different biological conditions. We also perform a Monte Carlo simulation study to illustrate the bias-variance tradeoff and root mean squared error of qsmooth compared to other global normalization methods. A software implementation is available from https://github.com/stephaniehicks/qsmooth.
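
    For contrast with qsmooth, classic quantile normalization forces every sample to share one reference distribution, namely the mean of the order statistics across samples. A minimal sketch, ignoring tie handling:

```python
import numpy as np

def quantile_normalize(X):
    """Classic quantile normalization of a (features x samples) matrix:
    every column is mapped onto the mean of the columnwise order statistics,
    so all samples end up with identical distributions. Ties are broken
    arbitrarily in this naive version."""
    order = np.argsort(X, axis=0)
    ranks = np.argsort(order, axis=0)            # rank of each entry per column
    reference = np.sort(X, axis=0).mean(axis=1)  # mean quantile across samples
    return reference[ranks]

X = np.array([[5., 4., 3.],
              [2., 1., 4.],
              [3., 4., 6.],
              [4., 2., 8.]])
print(quantile_normalize(X))
```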

  5. 10 CFR 71.71 - Normal conditions of transport.

    Science.gov (United States)

    2010-01-01

    10 CFR Part 71 (Energy), Subpart F — Package, Special Form, and LSA-III Tests. § 71.71 Normal conditions of transport. (a) Evaluation. Evaluation of each package design under normal conditions of transport must include a determination of the effect on...

  6. Standard Test Method for Solar Transmittance (Terrestrial) of Sheet Materials Using Sunlight

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1986-01-01

    1.1 This test method covers the measurement of solar transmittance (terrestrial) of materials in sheet form by using a pyranometer, an enclosure, and the sun as the energy source. 1.2 This test method also allows measurement of solar transmittance at angles other than normal incidence. 1.3 This test method is applicable to sheet materials that are transparent, translucent, textured, or patterned. 1.4 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  7. Standard Errors for Matrix Correlations.

    Science.gov (United States)

    Ogasawara, Haruhiko

    1999-01-01

    Derives the asymptotic standard errors and intercorrelations for several matrix correlations assuming multivariate normality for manifest variables and derives the asymptotic standard errors of the matrix correlations for two factor-loading matrices. (SLD)

  8. Quantitative thallium-201 myocardial exercise scintigraphy in normal subjects and patients with normal coronary arteries

    International Nuclear Information System (INIS)

    Niemeyer, M.G.; St. Antonius Hospital Nieuwegein; Laarman, G.J.; Lelbach, S.; Cramer, M.J.; Ascoop, C.A.P.L.; Verzijlbergen, J.F.; Wall, E.E. van der; Zwinderman, A.H.; Pauwels, E.K.J.

    1990-01-01

    Quantitative thallium-201 myocardial exercise scintigraphy was tested in two patient populations representing alternative standards for cardiac normality: group I comprised 18 male uncatheterized patients with a low likelihood of coronary artery disease (CAD); group II contained 41 patients with normal coronary arteriograms. Group I patients were younger and achieved a higher rate-pressure product than group II patients; all had normal findings on physical examination and electrocardiography at rest and exercise. Group II comprised 21 females; 11 patients showed abnormal electrocardiography at rest, and five patients showed ischemic ST depression during exercise. Twelve patients had signs of minimal CAD. Twelve patients revealed abnormal visual and quantitative thallium findings; three of these patients had minimal CAD. Profiles of uptake and washout of thallium-201 were derived from both patient groups and compared with the normal limits developed by Maddahi et al. Furthermore, low-likelihood and angiographically normal patients may differ substantially, and both sets of normal patients should be considered when establishing criteria of abnormality in exercise thallium imaging. When commercial software containing normal limits for quantitative analysis of exercise thallium-201 imaging is used in clinical practice, it is mandatory to compare these with the normal limits of uptake and washout of thallium-201 derived from the less heterogeneous group of low-likelihood subjects, which should be used in selecting a normal population to define normality. (author). 37 refs.; 3 figs; 1 tab

  9. Enhancement of cemented waste forms by supercritical CO2 carbonation of standard portland cements

    International Nuclear Information System (INIS)

    Rubin, J.B.; Carey, J.; Taylor, C.M.V.

    1997-01-01

    We are conducting experiments on an innovative transformation concept, using a traditional immobilization technique, that may significantly reduce the volume of hazardous or radioactive waste requiring transport and long-term storage. The standard practice for the stabilization of radioactive salts and residues is to mix them with cements, which may include additives to enhance immobilization. Many of these wastes do not qualify for underground disposition, however, because they do not meet disposal requirements for free liquids, decay heat, head-space gas analysis, and/or leachability. The treatment method alters the bulk properties of a cemented waste form by greatly accelerating the natural cement-aging reactions, producing a chemically stable form having reduced free liquids, as well as reduced porosity, permeability and pH. These structural and chemical changes should allow for greater actinide loading, as well as the reduced mobility of the anions, cations, and radionuclides in aboveground and underground repositories. Simultaneously, the treatment process removes a majority of the hydrogenous material from the cement. The treatment method allows for on-line process monitoring of leachates and can be transported into the field. We will describe the general features of supercritical fluids, as well as the application of these fluids to the treatment of solid and semi-solid waste forms. Some of the issues concerning the economic feasibility of industrial scale-up will be addressed, with particular attention to the engineering requirements for the establishment of on-site processing facilities. Finally, the initial results of physical property measurements made on portland cements before and after supercritical fluid processing will be presented

  10. An approach to normal forms of Kuramoto model with distributed delays and the effect of minimal delay

    Energy Technology Data Exchange (ETDEWEB)

    Niu, Ben, E-mail: niubenhit@163.com [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China); Guo, Yuxiao [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China); Jiang, Weihua [Department of Mathematics, Harbin Institute of Technology, Harbin 150001 (China)

    2015-09-25

    Heterogeneous delays with a positive lower bound (gap) are taken into consideration in the Kuramoto model. On the Ott–Antonsen manifold, the dynamical transition from incoherence to coherence is mediated by Hopf bifurcation. We establish a perturbation technique on the complex domain, by which universal normal forms, stability and criticality of the Hopf bifurcation are obtained. Theoretically, a hysteresis loop is found near the subcritically bifurcated coherent state. With respect to Gamma-distributed delays with fixed mean and variance, we find that a large gap decreases the Hopf bifurcation value, induces supercritical bifurcations, avoids the hysteresis loop and significantly increases the number of coexisting coherent states. The effect of the gap is finally interpreted from the viewpoint of the excess kurtosis of the Gamma distribution. - Highlights: • Heterogeneously delay-coupled Kuramoto model with minimal delay is considered. • Perturbation technique on complex domain is established for bifurcation analysis. • Hysteresis phenomenon is investigated in a theoretical way. • The effect of excess kurtosis of distributed delays is discussed.
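
    For orientation, one common form of the Kuramoto model with distributed delays and a positive minimal delay (gap) is the following; the notation here is generic, not necessarily the paper's:

```latex
\dot{\theta}_i(t) \;=\; \omega_i \;+\; \frac{K}{N}\sum_{j=1}^{N}\int_{\tau_0}^{\infty}
  g(\tau)\,\sin\!\bigl(\theta_j(t-\tau)-\theta_i(t)\bigr)\,\mathrm{d}\tau,
  \qquad i = 1,\dots,N,
```

    where $g(\tau)$ is the delay distribution (e.g. a shifted Gamma density) and the gap $\tau_0 > 0$ shifts its support away from zero.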

  11. Complete Normal Ordering 1: Foundations

    CERN Document Server

    Ellis, John; Skliros, Dimitri P.

    2016-01-01

    We introduce a new prescription for quantising scalar field theories perturbatively around a true minimum of the full quantum effective action, which is to `complete normal order' the bare action of interest. When the true vacuum of the theory is located at zero field value, the key property of this prescription is the automatic cancellation, to any finite order in perturbation theory, of all tadpole and, more generally, all `cephalopod' Feynman diagrams. The latter are connected diagrams that can be disconnected into two pieces by cutting one internal vertex, with either one or both pieces free from external lines. In addition, this procedure of `complete normal ordering' (which is an extension of the standard field theory definition of normal ordering) reduces by a substantial factor the number of Feynman diagrams to be calculated at any given loop order. We illustrate explicitly the complete normal ordering procedure and the cancellation of cephalopod diagrams in scalar field theories with non-derivative i...

  12. Weak convergence and uniform normalization in infinitary rewriting

    DEFF Research Database (Denmark)

    Simonsen, Jakob Grue

    2010-01-01

    the starkly surprising result that for any orthogonal system with finitely many rules, the system is weakly normalizing under weak convergence iff it is strongly normalizing under weak convergence iff it is weakly normalizing under strong convergence iff it is strongly normalizing under strong convergence. As further corollaries, we derive a number of new results for weakly convergent rewriting: Systems with finitely many rules enjoy unique normal forms, and acyclic orthogonal systems are confluent. Our results suggest that it may be possible to recover some of the positive results for strongly...

  13. Normalized Excited Squeezed Vacuum State and Its Applications

    International Nuclear Information System (INIS)

    Meng Xiangguo; Wang Jisuo; Liang Baolong

    2007-01-01

    By using the intermediate coordinate-momentum representation in quantum optics and a generating function for the normalization of the excited squeezed vacuum state (ESVS), the normalized ESVS is obtained. We find that the normalization constants obtained via two new methods agree, and take a new form which differs from the result obtained by Zhang and Fan [Phys. Lett. A 165 (1992) 14]. By virtue of the normalization constant of the ESVS and the intermediate coordinate-momentum representation, the tomogram of the normalized ESVS and some useful formulae are derived.

  14. Three forms of relativity

    International Nuclear Information System (INIS)

    Strel'tsov, V.N.

    1992-01-01

    The physical sense of three forms of relativity is discussed. The first - the instant form - corresponds in fact to the traditional approach based on the concept of instant distance. The normal form corresponds to the radar formulation, which is based on light (retarded) distances. The front form in the special case is characterized by 'observable' variables, and the known k-coefficient method is its obvious expression. 16 refs

  15. Random Generators and Normal Numbers

    OpenAIRE

    Bailey, David H.; Crandall, Richard E.

    2002-01-01

    Pursuant to the authors' previous chaotic-dynamical model for random digits of fundamental constants, we investigate a complementary, statistical picture in which pseudorandom number generators (PRNGs) are central. Some rigorous results are achieved: We establish b-normality for constants of the form $\sum_i 1/(b^{m_i} c^{n_i})$ for certain sequences $(m_i), (n_i)$ of integers. This work unifies and extends previously known classes of explicit normals. We prove that for coprime $b,c>1$ the...

  16. 29 CFR 1904.29 - Forms.

    Science.gov (United States)

    2010-07-01

    ... OSHA 300 Log. Instead, enter “privacy case” in the space normally used for the employee's name. This...) Basic requirement. You must use OSHA 300, 300-A, and 301 forms, or equivalent forms, for recordable injuries and illnesses. The OSHA 300 form is called the Log of Work-Related Injuries and Illnesses, the 300...

  17. Normalization Methods and Selection Strategies for Reference Materials in Stable Isotope Analyses - Review

    International Nuclear Information System (INIS)

    Skrzypek, G.; Sadler, R.; Paul, D.; Forizs, I.

    2011-01-01

    A stable isotope analyst has to make a number of important decisions regarding how to best determine the 'true' stable isotope composition of analysed samples in reference to an international scale. It has to be decided which reference materials should be used, how many reference materials and how many repetitions of each standard are most appropriate for a desired level of precision, and what normalization procedure should be selected. In this paper we summarise what is known about the propagation of uncertainties associated with normalization procedures and with the reference materials used as anchors for the determination of 'true' values for δ13C and δ18O. Normalization methods: Several normalization methods transforming the 'raw' value obtained from mass spectrometers to one of the internationally recognized scales have been developed. However, as summarised by Paul et al., different normalization transforms alone may lead to inconsistencies between laboratories. The most common normalization procedures are: single-point anchoring (versus working gas and certified reference standard), modified single-point normalization, linear shift between the measured and the true isotopic composition of two certified reference standards, and two-point and multipoint linear normalization methods. The accuracy of these various normalization methods has been compared using analytical laboratory data by Paul et al., with single-point normalization and normalization versus tank calibration resulting in the largest normalization errors, which also exceed the analytical uncertainty recommended for δ13C. The normalization error depends greatly on the relative differences between the stable isotope composition of the reference material and the sample. On the other hand, normalization methods using two or more certified reference standards produce a smaller normalization error, if the reference materials are bracketing the whole range of
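
    For concreteness, the two-point linear ('stretch and shift') normalization mentioned above maps a raw measurement onto the international scale through two certified standards; the notation is generic rather than the review's:

```latex
\delta_{\mathrm{true}} \;=\; \delta_{1}^{\mathrm{true}}
  \;+\; \bigl(\delta_{\mathrm{raw}} - \delta_{1}^{\mathrm{raw}}\bigr)\,
  \frac{\delta_{2}^{\mathrm{true}} - \delta_{1}^{\mathrm{true}}}
       {\delta_{2}^{\mathrm{raw}} - \delta_{1}^{\mathrm{raw}}}
```

    where subscripts 1 and 2 denote the two reference standards, ideally bracketing the sample's composition.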

  18. Masturbation, sexuality, and adaptation: normalization in adolescence.

    Science.gov (United States)

    Shapiro, Theodore

    2008-03-01

    During adolescence the central masturbation fantasy that is formulated during childhood takes its final form and paradoxically must now be directed outward for appropriate object finding and pair matching in the service of procreative aims. This is a step in adaptation that requires a further developmental landmark that I have called normalization. The path toward airing these private fantasies is facilitated by chumship relationships as a step toward further exposure to the social surround. Hartmann's structuring application of adaptation within psychoanalysis is used as a framework for understanding the process that simultaneously serves intrapsychic and social demands and permits goals that follow evolutionary principles. Variations in the normalization process from masturbatory isolation to a variety of forms of sexual socialization are examined in sociological data concerning current adolescent sexual behavior and in case examples that indicate some routes to normalized experience and practice.

  19. Optimum parameters in a model for tumour control probability, including interpatient heterogeneity: evaluation of the log-normal distribution

    International Nuclear Information System (INIS)

    Keall, P J; Webb, S

    2007-01-01

    The heterogeneity of human tumour radiation response is well known. Researchers have used the normal distribution to describe interpatient tumour radiosensitivity. However, many natural phenomena show a log-normal distribution. Log-normal distributions are common when mean values are low, variances are large and values cannot be negative. These conditions apply to radiosensitivity. The aim of this work was to evaluate the log-normal distribution to predict clinical tumour control probability (TCP) data and to compare the results with the homogeneous (δ-function with single α-value) and normal distributions. The clinically derived TCP data for four tumour types (melanoma, breast, squamous cell carcinoma and nodes) were used to fit the TCP models. Three forms of interpatient tumour radiosensitivity were considered: the log-normal, normal and δ-function. The free parameters in the models were the radiosensitivity mean, standard deviation and clonogenic cell density. The evaluation metric was the deviance of the maximum likelihood estimation of the fit of the TCP calculated using the predicted parameters to the clinical data. We conclude that (1) the log-normal and normal distributions of interpatient tumour radiosensitivity heterogeneity more closely describe clinical TCP data than a single radiosensitivity value and (2) the log-normal distribution has some theoretical and practical advantages over the normal distribution. Further work is needed to test these models on higher quality clinical outcome datasets.
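
    As a hedged illustration of this model class (not the paper's code: the parameter values, the linear-in-α survival model, and the Poisson TCP form are assumptions for the sketch), population TCP can be computed by averaging the individual Poisson TCP over a sampled radiosensitivity distribution:

```python
import numpy as np

# Sketch: population TCP with interpatient heterogeneity in radiosensitivity
# alpha. All parameter values are illustrative, not those fitted in the paper.
def population_tcp(dose, clonogen_number=1e7, alpha_mean=0.3,
                   alpha_sd=0.1, dist="lognormal", n_samples=100_000):
    rng = np.random.default_rng(0)
    if dist == "lognormal":
        # Convert the desired mean/sd to the underlying normal parameters.
        sigma2 = np.log(1 + (alpha_sd / alpha_mean) ** 2)
        mu = np.log(alpha_mean) - sigma2 / 2
        alphas = rng.lognormal(mu, np.sqrt(sigma2), n_samples)
    elif dist == "normal":
        alphas = rng.normal(alpha_mean, alpha_sd, n_samples)
        alphas = np.clip(alphas, 0, None)  # radiosensitivity cannot be negative
    else:
        # Delta function: every patient shares a single alpha value.
        alphas = np.full(n_samples, alpha_mean)
    # Poisson TCP per sampled patient (quadratic beta term omitted for brevity),
    # then average over the population.
    tcp_i = np.exp(-clonogen_number * np.exp(-alphas * dose))
    return tcp_i.mean()

print(population_tcp(60.0, dist="lognormal"))
```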

  20. Phenotype of normal spirometry in an aging population.

    Science.gov (United States)

    Vaz Fragoso, Carlos A; McAvay, Gail; Van Ness, Peter H; Casaburi, Richard; Jensen, Robert L; MacIntyre, Neil; Gill, Thomas M; Yaggi, H Klar; Concato, John

    2015-10-01

    In aging populations, the commonly used Global Initiative for Chronic Obstructive Lung Disease (GOLD) may misclassify normal spirometry as respiratory impairment (airflow obstruction and restrictive pattern), including the presumption of respiratory disease (chronic obstructive pulmonary disease [COPD]). To evaluate the phenotype of normal spirometry as defined by a new approach from the Global Lung Initiative (GLI), overall and across GOLD spirometric categories. Using data from COPDGene (n = 10,131; ages 45-81; smoking history, ≥10 pack-years), we evaluated spirometry and multiple phenotypes, including dyspnea severity (Modified Medical Research Council grade 0-4), health-related quality of life (St. George's Respiratory Questionnaire total score), 6-minute-walk distance, bronchodilator reversibility (FEV1 % change), computed tomography-measured percentage of lung with emphysema (% emphysema) and gas trapping (% gas trapping), and small airway dimensions (square root of the wall area for a standardized airway with an internal perimeter of 10 mm). Among 5,100 participants with GLI-defined normal spirometry, GOLD identified respiratory impairment in 1,146 (22.5%), including a restrictive pattern in 464 (9.1%), mild COPD in 380 (7.5%), moderate COPD in 302 (5.9%), and severe COPD in none. Overall, the phenotype of GLI-defined normal spirometry included normal adjusted mean values for dyspnea grade (0.8), St. George's Respiratory Questionnaire (15.9), 6-minute-walk distance (1,424 ft [434 m]), bronchodilator reversibility (2.7%), % emphysema (0.9%), % gas trapping (10.7%), and square root of the wall area for a standardized airway with an internal perimeter of 10 mm (3.65 mm); corresponding 95% confidence intervals were similarly normal. These phenotypes remained normal for GLI-defined normal spirometry across GOLD spirometric categories. GLI-defined normal spirometry, even when classified as respiratory impairment by GOLD, included adjusted mean values in the

  1. The Dynamics of Standardization

    DEFF Research Database (Denmark)

    Brunsson, Nils; Rasche, Andreas; Seidl, David

    2012-01-01

    This paper suggests that when the phenomenon of standards and standardization is examined from the perspective of organization studies, three aspects stand out: the standardization of organizations, standardization by organizations and standardization as (a form of) organization. Following a comp...

  2. The normal range of condylar movement

    International Nuclear Information System (INIS)

    Choe, Han Up; Park, Tae Won

    1978-01-01

    The purpose of this study was to investigate the normal range of condylar movement of normal adults. The author has observed roentgenographic images of four serial positions of the condylar head taken by modified transcranial lateral oblique projection. The serial positions are centric occlusion, rest position, 1 inch open position and maximal open position. The results were as follows: 1. Inter-incisal distance was 46.85 mm in maximal open position. 2. The length between the deepest point of the glenoid fossa and the summit of the condylar head in rest position was greater than that in centric occlusion by 0.8 mm. 3. In 1 inch open position, the condylar head moved 12.64 mm forward from the standard line in the horizontal direction and 1.84 mm downwards in the vertical direction. 4. In maximal open position, the condylar head moved 19.06 mm forward from the standard line in the horizontal direction and 0.4 mm downwards in the vertical direction. 5. In centric occlusion, the width between the glenoid fossa and the margin of the condylar head was greater in the posterior portion than in the anterior portion by 0.4 mm. 6. Except for the estimated figures of the 1 inch open position, all of the estimated figures were greater in males than in females.

  3. Normalization of satellite imagery

    Science.gov (United States)

    Kim, Hongsuk H.; Elman, Gregory C.

    1990-01-01

    Sets of Thematic Mapper (TM) imagery taken over the Washington, DC metropolitan area during the months of November, March and May were converted into a form of ground reflectance imagery. This conversion was accomplished by adjusting the incident sunlight and view angles and by applying a pixel-by-pixel correction for atmospheric effects. Seasonal color changes of the area can be better observed when such normalization is applied to space imagery taken in time series. In normalized imagery, the grey scale depicts variations in surface reflectance, and the tonal signature of multi-band color imagery can be directly interpreted for quantitative information about the target.

  4. A strand specific high resolution normalization method for chip-sequencing data employing multiple experimental control measurements

    DEFF Research Database (Denmark)

    Enroth, Stefan; Andersson, Claes; Andersson, Robin

    2012-01-01

    High-throughput sequencing is becoming the standard tool for investigating protein-DNA interactions or epigenetic modifications. However, the data generated will always contain noise due to e.g. repetitive regions or non-specific antibody interactions. The noise will appear in the form of a background signal. ... the background is only used to adjust peak calling and not as a pre-processing step that aims at discerning the signal from the background noise. A normalization procedure that extracts the signal of interest would be of universal use when investigating genomic patterns.

  5. Anomalous normal mode oscillations in semiconductor microcavities

    Energy Technology Data Exchange (ETDEWEB)

    Wang, H. [Univ. of Oregon, Eugene, OR (United States). Dept. of Physics; Hou, H.Q.; Hammons, B.E. [Sandia National Labs., Albuquerque, NM (United States)

    1997-04-01

    Semiconductor microcavities as a composite exciton-cavity system can be characterized by two normal modes. Under an impulsive excitation by a short laser pulse, optical polarizations associated with the two normal modes have a π phase difference. The total induced optical polarization is then expected to exhibit a sin²(Ωt)-like oscillation, where 2Ω is the normal mode splitting, reflecting a coherent energy exchange between the exciton and cavity. In this paper the authors present experimental studies of normal mode oscillations using three-pulse transient four-wave mixing (FWM). The result reveals, surprisingly, that when the cavity is tuned far below the exciton resonance, the normal mode oscillation in the polarization is cos²(Ωt)-like, in contrast to what is expected from the simple normal mode model. This anomalous normal mode oscillation reflects the important role of virtual excitation of electronic states in semiconductor microcavities.

  6. Uncertainty evaluation in normalization of isotope delta measurement results against international reference materials.

    Science.gov (United States)

    Meija, Juris; Chartrand, Michelle M G

    2018-01-01

    Isotope delta measurements are normalized against international reference standards. Although multi-point normalization is becoming a standard practice, the existing uncertainty evaluation practices are either undocumented or incomplete. For multi-point normalization, we present errors-in-variables regression models for explicitly accounting for the measurement uncertainty of the international standards along with the uncertainty attributed to their assigned values. This manuscript presents a framework to account for the uncertainty that arises due to a small number of replicate measurements, and discusses multi-laboratory data reduction while accounting for the inevitable correlations between laboratories due to the use of identical reference materials for calibration. Both frequentist and Bayesian methods of uncertainty analysis are discussed.
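
    As an illustration of errors-in-variables normalization of the kind described (a minimal sketch: the reference values, uncertainties, and the use of scipy's orthogonal distance regression are assumptions, not the authors' implementation):

```python
import numpy as np
from scipy import odr

# Errors-in-variables fit of a normalization line through reference standards.
# x = raw instrument deltas, y = assigned deltas; all values are made up.
x = np.array([-29.85, -11.64, 0.45])   # raw measured values
sx = np.array([0.05, 0.04, 0.06])      # measurement uncertainties
y = np.array([-30.03, -11.78, 0.43])   # assigned reference values
sy = np.array([0.04, 0.03, 0.05])      # uncertainties of the assigned values

model = odr.Model(lambda beta, x: beta[0] * x + beta[1])  # straight line
data = odr.RealData(x, y, sx=sx, sy=sy)
fit = odr.ODR(data, model, beta0=[1.0, 0.0]).run()

slope, intercept = fit.beta
print(slope, intercept, fit.sd_beta)   # normalization line and its uncertainty
```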

  7. Self-consistent normal ordering of gauge field theories

    International Nuclear Information System (INIS)

    Ruehl, W.

    1987-01-01

    Mean-field theories with a real action of unconstrained fields can be self-consistently normal ordered. This leads to a considerable improvement over standard mean-field theory. This concept is applied to lattice gauge theories. First an appropriate real action mean-field theory is constructed. The equations determining the Gaussian kernel necessary for self-consistent normal ordering of this mean-field theory are derived. (author). 4 refs

  8. 78 FR 63036 - Transmission Planning Reliability Standards

    Science.gov (United States)

    2013-10-23

    ... Reliability Standards for the Bulk Power System, 130 FERC ¶ 61,200 (2010). \8\ Mandatory Reliability Standards... electric system operations across normal and contingency conditions. We also find that Reliability Standard... Reliability Standards for the Bulk Power System, 131 FERC ¶ 61,231 at P 21. Comments 24. NERC supports the...

  9. Biochemical response of normal albino rats to the addition of ...

    African Journals Online (AJOL)

    Experiments were conducted to determine the biochemical effect of Hibiscus cannabinus and Murraya koenigii extracts on normal albino rats using standard methods. Analyses carried out indicated that the aqueous leaf extracts of H. cannabinus and M. koenigii exhibited significant hypolipidaemic activity in normal rats.

  10. Variation in standards of research compensation and child assent practices: a comparison of 69 institutional review board-approved informed permission and assent forms for 3 multicenter pediatric clinical trials.

    Science.gov (United States)

    Kimberly, Michael B; Hoehn, K Sarah; Feudtner, Chris; Nelson, Robert M; Schreiner, Mark

    2006-05-01

    To systematically compare standards for compensation and child participant assent in informed permission, assent, and consent forms (IP-A-CFs) approved by 55 local institutional review boards (IRBs) reviewing 3 standardized multicenter research protocols. Sixty-nine principal investigators participating in any of 3 national, multicenter clinical trials submitted standardized research protocols for their trials to their local IRBs for approval. Copies of the subsequently IRB-approved IP-A-CFs were then forwarded to an academic clinical research organization. This collection of IRB-approved forms allowed for a quasi-experimental retrospective evaluation of the variation in informed permission, assent, and consent standards operationalized by the local IRBs. Standards for compensation and child participant assent varied substantially across 69 IRB-approved IP-A-CFs. Among the 48 IP-A-CFs offering compensation, monetary compensation was offered by 33 as reimbursement for travel, parking, or food expenses, whereas monetary or material compensation was offered by 22 for subject inconvenience and by 13 for subject time. Compensation ranged widely within and across studies (study 1, $180-1425; study 2, $0-500; and study 3, $0-100). Regarding child participant assent, among the 57 IP-A-CFs that included a form of assent documentation, 33 included a line for assent on the informed permission or consent form, whereas 35 included a separate form written in simplified language. Of the IP-A-CFs that stipulated the documentation of assent, 31 specified ≥1 age ranges for obtaining assent. Informed permission or consent forms were addressed either to parents or child participants. In response to identical clinical trial protocols, local IRBs generate IP-A-CFs that vary considerably regarding compensation and child participant assent.

  11. Analysis of visual appearance of retinal nerve fibers in high resolution fundus images: a study on normal subjects.

    Science.gov (United States)

    Kolar, Radim; Tornow, Ralf P; Laemmer, Robert; Odstrcilik, Jan; Mayer, Markus A; Gazarek, Jiri; Jan, Jiri; Kubena, Tomas; Cernosek, Pavel

    2013-01-01

    The retinal ganglion axons are an important part of the visual system, which can be directly observed by fundus camera. The layer they form together inside the retina is the retinal nerve fiber layer (RNFL). This paper describes results of a texture RNFL analysis in color fundus photographs and compares these results with quantitative measurement of RNFL thickness obtained from optical coherence tomography on normal subjects. It is shown that local mean value, standard deviation, and Shannon entropy extracted from the green and blue channel of fundus images are correlated with corresponding RNFL thickness. The linear correlation coefficients achieved values 0.694, 0.547, and 0.512 for respective features measured on 439 retinal positions in the peripapillary area from 23 eyes of 15 different normal subjects.
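
    A sketch of the three local texture features named above (the window size, 8-bit intensity range, and histogram binning are assumptions; this is not the authors' pipeline):

```python
import numpy as np
from scipy import ndimage

# Local mean, standard deviation and Shannon entropy of one colour channel,
# computed in a sliding window. Slow on full fundus images; for illustration.
def local_features(channel, size=15):
    channel = channel.astype(float)
    mean = ndimage.uniform_filter(channel, size)
    sq_mean = ndimage.uniform_filter(channel ** 2, size)
    std = np.sqrt(np.maximum(sq_mean - mean ** 2, 0))

    def window_entropy(values):
        # Shannon entropy of the intensity histogram inside one window.
        hist, _ = np.histogram(values, bins=32, range=(0, 255))
        p = hist / hist.sum()
        p = p[p > 0]
        return -(p * np.log2(p)).sum()

    entropy = ndimage.generic_filter(channel, window_entropy, size=size)
    return mean, std, entropy

# Example on a synthetic 8-bit "green channel".
channel = np.random.default_rng(0).integers(0, 256, (64, 64))
mean, std, ent = local_features(channel)
```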

  12. Normal radiographic findings. 4. act. ed.

    International Nuclear Information System (INIS)

    Moeller, T.B.

    2003-01-01

    This book can serve the reader in three ways: First, it presents normal findings for all radiographic techniques including contrast media (KM). Important data which are criteria of normal findings are indicated directly in the pictures and are also explained in full text and in summary form. Secondly, it teaches the systematics of interpreting a picture - how to look at it, what structures to regard in what order, and for what to look in particular. Checklists are presented in each case. Thirdly, findings are formulated in accordance with the image analysis procedure. All criteria of normal findings are defined in these formulations, which make them an important didactic element. (orig.)

  13. DCC DIFFUSE Standards Frameworks: A Standards Path through the Curation Lifecycle

    Directory of Open Access Journals (Sweden)

    Sarah Higgins

    2009-10-01

    Full Text Available DCC DIFFUSE Standards Frameworks aims to offer domain-specific advice on standards relevant to digital preservation and curation, to help curators identify which standards they should be using and where they can be appropriately implemented, to ensure authoritative digital material. The Project uses the DCC Curation Lifecycle Model and Web 2.0 technology to visually present standards frameworks for a number of disciplines. The Digital Curation Centre (DCC) is actively working with different relevant organisations to present searchable frameworks of standards for a number of domains. These include digital repositories, records management, the geo-information sector, archives and the museum sector. Other domains, such as e-science, will shortly be investigated.

  14. Deformation around basin scale normal faults

    International Nuclear Information System (INIS)

    Spahic, D.

    2010-01-01

    Faults in the earth's crust occur over a large range of scales, from microscale over mesoscopic to large basin-scale faults. Frequently, deformation associated with faulting is not limited to the fault plane alone, but rather combines with continuous near-field deformation in the wall rock, a phenomenon that is generally called fault drag. The correct interpretation and recognition of fault drag is fundamental for the reconstruction of the fault history and determination of fault kinematics, as well as for prediction in areas of limited exposure or beyond comprehensive seismic resolution. Based on fault analyses derived from 3D visualization of natural examples of fault drag, the importance of fault geometry for the deformation of marker horizons around faults is investigated. The complex 3D structural models presented here are based on a combination of geophysical datasets and geological fieldwork. On an outcrop-scale example of fault drag in the hanging wall of a normal fault, located at St. Margarethen, Burgenland, Austria, data from Ground Penetrating Radar (GPR) measurements, detailed mapping and terrestrial laser scanning were used to construct a high-resolution structural model of the fault plane, the deformed marker horizons and associated secondary faults. In order to obtain geometrical information about the largely unexposed master fault surface, a standard listric balancing dip domain technique was employed. The results indicate that for this normal fault a listric shape can be excluded, as the constructed fault has a geologically meaningless shape cutting upsection into the sedimentary strata. This kinematic modeling result is additionally supported by the observation of deformed horizons in the footwall of the structure. Alternatively, a planar fault model with reverse drag of markers in the hanging wall and footwall is proposed. A second part of this thesis investigates a large scale normal fault

  15. CT of Normal Developmental and Variant Anatomy of the Pediatric Skull: Distinguishing Trauma from Normality.

    Science.gov (United States)

    Idriz, Sanjin; Patel, Jaymin H; Ameli Renani, Seyed; Allan, Rosemary; Vlahos, Ioannis

    2015-01-01

    The use of computed tomography (CT) in clinical practice has been increasing rapidly, with the number of CT examinations performed in adults and children rising by 10% per year in England. Because the radiology community strives to reduce the radiation dose associated with pediatric examinations, external factors, including guidelines for pediatric head injury, are raising expectations for use of cranial CT in the pediatric population. Thus, radiologists are increasingly likely to encounter pediatric head CT examinations in daily practice. The variable appearance of cranial sutures at different ages can be confusing for inexperienced readers of radiologic images. The evolution of multidetector CT with thin-section acquisition increases the clarity of some of these sutures, which may be misinterpreted as fractures. Familiarity with the normal anatomy of the pediatric skull, how it changes with age, and normal variants can assist in translating the increased resolution of multidetector CT into more accurate detection of fractures and confident determination of normality, thereby reducing prolonged hospitalization of children with normal developmental structures that have been misinterpreted as fractures. More important, the potential morbidity and mortality related to false-negative interpretation of fractures as normal sutures may be avoided. The authors describe the normal anatomy of all standard pediatric sutures, common variants, and sutural mimics, thereby providing an accurate and safe framework for CT evaluation of skull trauma in pediatric patients. (©)RSNA, 2015.

  16. Normal and Abnormal Behavior in Early Childhood

    OpenAIRE

    Spinner, Miriam R.

    1981-01-01

    Evaluation of normal and abnormal behavior in the period to three years of age involves many variables. Parental attitudes, determined by many factors such as previous childrearing experience, the bonding process, parental psychological status and parental temperament, often influence the labeling of behavior as normal or abnormal. This article describes the forms of crying, sleep and wakefulness, and affective responses from infancy to three years of age.

  17. Formulae for the determination of the elements of the Eötvös matrix of the Earth's normal gravity field and a relation between normal and actual Gaussian curvature

    OpenAIRE

    Manoussakis, G.; Delikaraoglou, D.

    2011-01-01

    In this paper we form relations for the determination of the elements of the Eötvös matrix of the Earth's normal gravity field. In addition, a relation between the Gauss curvature of the normal equipotential surface and the Gauss curvature of the actual equipotential surface, both passing through the point P, is presented. For this purpose we use a global Cartesian system (X, Y, Z) and use the variables X and Y to form a local parameterization of a normal equipotential surface to describe its ...

  18. Behavioral finance: Finance with normal people

    Directory of Open Access Journals (Sweden)

    Meir Statman

    2014-06-01

    Behavioral finance substitutes normal people for the rational people in standard finance. It substitutes behavioral portfolio theory for mean-variance portfolio theory, and behavioral asset pricing model for the CAPM and other models where expected returns are determined only by risk. Behavioral finance also distinguishes rational markets from hard-to-beat markets in the discussion of efficient markets, a distinction that is often blurred in standard finance, and it examines why so many investors believe that it is easy to beat the market. Moreover, behavioral finance expands the domain of finance beyond portfolios, asset pricing, and market efficiency and is set to continue that expansion while adhering to the scientific rigor introduced by standard finance.

  19. Decommissioning standards

    International Nuclear Information System (INIS)

    Crofford, W.N.

    1980-01-01

    EPA has agreed to establish a series of environmental standards for the safe disposal of radioactive waste through participation in the Interagency Review Group on Nuclear Waste Management (IRG). One of the standards required under the IRG is the standard for decommissioning of radioactive contaminated sites, facilities, and materials. This standard is to be proposed by December 1980 and promulgated by December 1981. Several considerations are important in establishing these standards. This study includes discussions of some of these considerations and attempts to evaluate their relative importance. Items covered include: the form of the standards, timing for decommissioning, occupational radiation protection, costs and financial provisions. 4 refs

  20. The interblink interval in normal and dry eye subjects

    Directory of Open Access Journals (Sweden)

    Johnston PR

    2013-02-01

    Full Text Available Patrick R Johnston,1 John Rodriguez,1 Keith J Lane,1 George Ousler,1 Mark B Abelson1,2 1Ora, Inc, Andover, MA, USA; 2Schepens Eye Research Institute and Harvard Medical School, Boston, MA, USA Purpose: Our aim was to extend the concept of blink patterns from average interblink interval (IBI) to other aspects of the distribution of IBI. We hypothesized that this more comprehensive approach would better discriminate between normal and dry eye subjects. Methods: Blinks were captured over 10 minutes for ten normal and ten dry eye subjects while viewing a standardized televised documentary. Fifty-five blinks were analyzed for each of the 20 subjects. Means, standard deviations, and autocorrelation coefficients were calculated utilizing a single random effects model fit to all data points, and a diagnostic model was subsequently fit to predict the probability of a subject having dry eye based on these parameters. Results: Mean IBI was 5.97 seconds for normal versus 2.56 seconds for dry eye subjects (ratio: 2.33, P = 0.004). IBI variability was 1.56 times higher in normal subjects (P < 0.001), and the autocorrelation was 1.79 times higher in normal subjects (P = 0.044). With regard to the diagnostic power of these measures, mean IBI was the best dry eye versus normal classifier using receiver operating characteristics (0.85 area under curve (AUC)), followed by the standard deviation (0.75 AUC), and lastly, the autocorrelation (0.63 AUC). All three predictors combined had an AUC of 0.89. Based on this analysis, cutoffs of ≤3.05 seconds for median IBI, and ≤0.73 for the coefficient of variation were chosen to classify dry eye subjects. Conclusion: (1) IBI was significantly shorter for dry eye patients performing a visual task compared to normals; (2) there was a greater variability of interblink intervals in normal subjects; and (3) these parameters were useful as diagnostic predictors of dry eye disease. The results of this pilot study merit investigation of IBI
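
    A minimal sketch of the three IBI statistics used as predictors, computed from blink timestamps (the timestamps below are fabricated, and the paper's random-effects model is not reproduced):

```python
import numpy as np

# Fabricated blink timestamps over a 10-minute (600 s) recording.
blink_times = np.sort(np.random.default_rng(1).uniform(0, 600, 100))
ibi = np.diff(blink_times)             # interblink intervals, seconds

mean_ibi = ibi.mean()                  # average IBI
sd_ibi = ibi.std(ddof=1)               # IBI variability
r1 = np.corrcoef(ibi[:-1], ibi[1:])[0, 1]  # lag-1 autocorrelation
print(mean_ibi, sd_ibi, r1)
```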

  1. Metacognition and Reading: Comparing Three Forms of Metacognition in Normally Developing Readers and Readers with Dyslexia.

    Science.gov (United States)

    Furnes, Bjarte; Norman, Elisabeth

    2015-08-01

    Metacognition refers to 'cognition about cognition' and includes metacognitive knowledge, strategies and experiences (Efklides, 2008; Flavell, 1979). Research on reading has shown that better readers demonstrate more metacognitive knowledge than poor readers (Baker & Beall, 2009), and that reading ability improves through strategy instruction (Gersten, Fuchs, Williams, & Baker, 2001). The current study is the first to specifically compare the three forms of metacognition in dyslexic (N = 22) versus normally developing readers (N = 22). Participants read two factual texts, with learning outcome measured by a memory task. Metacognitive knowledge and skills were assessed by self-report. Metacognitive experiences were measured by predictions of performance and judgments of learning. Individuals with dyslexia showed insight into their reading problems, but less general knowledge of how to approach text reading. They more often reported lack of available reading strategies, but groups did not differ in the use of deep and surface strategies. Learning outcome and mean ratings of predictions of performance and judgments of learning were lower in dyslexic readers, but not the accuracy with which metacognitive experiences predicted learning. Overall, the results indicate that dyslexic reading and spelling problems are not generally associated with lower levels of metacognitive knowledge, metacognitive strategies or sensitivity to metacognitive experiences in reading situations. © 2015 The Authors. Dyslexia published by John Wiley & Sons Ltd.

  2. Solitary-wave families of the Ostrovsky equation: An approach via reversible systems theory and normal forms

    International Nuclear Information System (INIS)

    Roy Choudhury, S.

    2007-01-01

    The Ostrovsky equation is an important canonical model for the unidirectional propagation of weakly nonlinear long surface and internal waves in a rotating, inviscid and incompressible fluid. Limited functional analytic results exist for the occurrence of one family of solitary-wave solutions of this equation, as well as their approach to the well-known solitons of the famous Korteweg-de Vries equation in the limit as the rotation becomes vanishingly small. Since solitary-wave solutions often play a central role in the long-time evolution of an initial disturbance, we consider such solutions here (via the normal form approach) within the framework of reversible systems theory. Besides confirming the existence of the known family of solitary waves and its reduction to the KdV limit, we find a second family of multihumped (or N-pulse) solutions, as well as a continuum of delocalized solitary waves (or homoclinics to small-amplitude periodic orbits). On isolated curves in the relevant parameter region, the delocalized waves reduce to genuine embedded solitons. The second and third families of solutions occur in regions of parameter space distinct from the known solitary-wave solutions and are thus entirely new. Directions for future work are also mentioned

  3. Shear Stress-Normal Stress (Pressure) Ratio Decides Forming Callus in Patients with Diabetic Neuropathy

    Science.gov (United States)

    Noguchi, Hiroshi; Takehara, Kimie; Ohashi, Yumiko; Suzuki, Ryo; Yamauchi, Toshimasa; Kadowaki, Takashi; Sanada, Hiromi

    2016-01-01

    Aim. Callus is a risk factor, leading to severe diabetic foot ulcer; thus, prevention of callus formation is important. However, normal stress (pressure) and shear stress associated with callus have not been clarified. Additionally, as new variables, a shear stress-normal stress (pressure) ratio (SPR) was examined. The purpose was to clarify the external force associated with callus formation in patients with diabetic neuropathy. Methods. The external force of the 1st, 2nd, and 5th metatarsal head (MTH) as callus predilection regions was measured. The SPR was calculated by dividing shear stress by normal stress (pressure), concretely, peak values (SPR-p) and time integral values (SPR-i). The optimal cut-off point was determined. Results. Callus-forming regions of the 1st and 2nd MTH showed higher SPR-i than non-callus-forming regions. The cut-off value of the 1st MTH was 0.60 and the 2nd MTH was 0.50. For the 5th MTH, variables pertaining to the external forces could not be determined to be indicators of callus formation because of low accuracy. Conclusions. The callus formation cut-off values of the 1st and 2nd MTH were clarified. In the future, it will be necessary to confirm the effect of using appropriate footwear and gait training on lowering SPR-i. PMID:28050567
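
    A hedged sketch of the two SPR variables: the abstract does not fully specify whether SPR-p and SPR-i are ratios of aggregates or aggregates of the pointwise ratio, so the ratio-of-aggregates reading below is an assumption, as are the synthetic stress traces:

```python
import numpy as np

def integral(f, t):
    # Simple trapezoidal rule, to avoid depending on a specific numpy version.
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(t)))

# Synthetic shear-stress and pressure traces for one stance phase (made up).
t = np.linspace(0, 1, 500)                      # seconds
pressure = 200 * np.sin(np.pi * t) ** 2 + 1e-6  # normal stress (pressure), kPa
shear = 60 * np.sin(np.pi * t) ** 2             # shear stress, kPa

spr_p = shear.max() / pressure.max()            # assumed: ratio of peak values
spr_i = integral(shear, t) / integral(pressure, t)  # assumed: ratio of time integrals
print(spr_p, spr_i)
```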

  4. Shear Stress-Normal Stress (Pressure Ratio Decides Forming Callus in Patients with Diabetic Neuropathy

    Directory of Open Access Journals (Sweden)

    Ayumi Amemiya

    2016-01-01

    Full Text Available Aim. Callus is a risk factor, leading to severe diabetic foot ulcer; thus, prevention of callus formation is important. However, normal stress (pressure) and shear stress associated with callus have not been clarified. Additionally, as new variables, a shear stress-normal stress (pressure) ratio (SPR) was examined. The purpose was to clarify the external force associated with callus formation in patients with diabetic neuropathy. Methods. The external force of the 1st, 2nd, and 5th metatarsal head (MTH) as callus predilection regions was measured. The SPR was calculated by dividing shear stress by normal stress (pressure), concretely, peak values (SPR-p) and time integral values (SPR-i). The optimal cut-off point was determined. Results. Callus-forming regions of the 1st and 2nd MTH showed higher SPR-i than non-callus-forming regions. The cut-off value of the 1st MTH was 0.60 and the 2nd MTH was 0.50. For the 5th MTH, variables pertaining to the external forces could not be determined to be indicators of callus formation because of low accuracy. Conclusions. The callus formation cut-off values of the 1st and 2nd MTH were clarified. In the future, it will be necessary to confirm the effect of using appropriate footwear and gait training on lowering SPR-i.

  5. Control-group feature normalization for multivariate pattern analysis of structural MRI data using the support vector machine.

    Science.gov (United States)

    Linn, Kristin A; Gaonkar, Bilwaj; Satterthwaite, Theodore D; Doshi, Jimit; Davatzikos, Christos; Shinohara, Russell T

    2016-05-15

    Normalization of feature vector values is a common practice in machine learning. Generally, each feature value is standardized to the unit hypercube or by normalizing to zero mean and unit variance. Classification decisions based on support vector machines (SVMs) or by other methods are sensitive to the specific normalization used on the features. In the context of multivariate pattern analysis using neuroimaging data, standardization effectively up- and down-weights features based on their individual variability. Since the standard approach uses the entire data set to guide the normalization, it utilizes the total variability of these features. This total variation is inevitably dependent on the amount of marginal separation between groups. Thus, such a normalization may attenuate the separability of the data in high dimensional space. In this work we propose an alternate approach that uses an estimate of the control-group standard deviation to normalize features before training. We study our proposed approach in the context of group classification using structural MRI data. We show that control-based normalization leads to better reproducibility of estimated multivariate disease patterns and improves the classifier performance in many cases. Copyright © 2016 Elsevier Inc. All rights reserved.
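
    A minimal sketch of the proposed control-based normalization, assuming a generic subjects-by-features matrix and scikit-learn's linear SVM (data and names are placeholders, not the study's pipeline):

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 500))        # subjects x imaging features (synthetic)
y = rng.integers(0, 2, 100)            # 0 = control, 1 = patient

# Normalize every feature by the control group's mean and SD,
# rather than by the whole sample's statistics.
controls = X[y == 0]
mu = controls.mean(axis=0)
sigma = controls.std(axis=0, ddof=1)
X_norm = (X - mu) / sigma

clf = LinearSVC(C=1.0).fit(X_norm, y)  # train the SVM on control-normalized features
```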

  6. Quantitative proteome profiling of normal human circulating microparticles

    DEFF Research Database (Denmark)

    Østergaard, Ole; Nielsen, Christoffer T; Iversen, Line V

    2012-01-01

    Circulating microparticles (MPs) are produced as part of normal physiology. Their numbers, origin, and composition change in pathology. Despite this, the normal MP proteome has not yet been characterized with standardized high-resolution methods. We here quantitatively profile the normal MP proteome using nano-LC-MS/MS on an LTQ-Orbitrap with optimized sample collection, preparation, and analysis of 12 different normal samples. Analytical and procedural variation were estimated in triply processed samples analyzed in triplicate from two different donors. Label-free quantitation was validated by the correlation of cytoskeletal protein intensities with MP numbers obtained by flow cytometry. Finally, the validity of using pooled samples was evaluated using overlap protein identification numbers and multivariate data analysis. Using conservative parameters, 536 different unique proteins were quantitated...

  7. Percentile estimation using the normal and lognormal probability distribution

    International Nuclear Information System (INIS)

    Bement, T.R.

    1980-01-01

    Implicitly or explicitly percentile estimation is an important aspect of the analysis of aerial radiometric survey data. Standard deviation maps are produced for quadrangles which are surveyed as part of the National Uranium Resource Evaluation. These maps show where variables differ from their mean values by more than one, two or three standard deviations. Data may or may not be log-transformed prior to analysis. These maps have specific percentile interpretations only when proper distributional assumptions are met. Monte Carlo results are presented in this paper which show the consequences of estimating percentiles by: (1) assuming normality when the data are really from a lognormal distribution; and (2) assuming lognormality when the data are really from a normal distribution
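
    A small Monte Carlo sketch of the first question posed above: estimating an upper percentile under a wrongly assumed normal model when the data are actually lognormal (sample size and parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 0.0, 1.0
true_p = np.exp(mu + 1.96 * sigma)     # true 97.5th percentile of LogNormal(0, 1)

estimates = []
for _ in range(2000):
    x = rng.lognormal(mu, sigma, 50)
    # Normal-theory percentile estimate: mean + 1.96 * SD.
    estimates.append(x.mean() + 1.96 * x.std(ddof=1))

# The normal-theory estimator is biased when the data are lognormal.
print(true_p, np.mean(estimates))
```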

  8. Confectionery-based dose forms.

    Science.gov (United States)

    Tangso, Kristian J; Ho, Quy Phuong; Boyd, Ben J

    2015-01-01

    Conventional dosage forms such as tablets, capsules and syrups are prescribed in the normal course of practice. However, concerns about patient preferences and market demands have given rise to the exploration of novel unconventional dosage forms. Among these, confectionery-based dose forms have strong potential to overcome compliance problems. This report will review the availability of these unconventional dose forms used in treating the oral cavity and for systemic drug delivery, with a focus on medicated chewing gums, medicated lollipops, and oral bioadhesive devices. The aim is to stimulate increased interest in the opportunities for innovative new products that are available to formulators in this field, particularly for atypical patient populations.

  9. Analysis of Visual Appearance of Retinal Nerve Fibers in High Resolution Fundus Images: A Study on Normal Subjects

    Directory of Open Access Journals (Sweden)

    Radim Kolar

    2013-01-01

    Full Text Available The retinal ganglion axons are an important part of the visual system, which can be directly observed by fundus camera. The layer they form together inside the retina is the retinal nerve fiber layer (RNFL. This paper describes results of a texture RNFL analysis in color fundus photographs and compares these results with quantitative measurement of RNFL thickness obtained from optical coherence tomography on normal subjects. It is shown that local mean value, standard deviation, and Shannon entropy extracted from the green and blue channel of fundus images are correlated with corresponding RNFL thickness. The linear correlation coefficients achieved values 0.694, 0.547, and 0.512 for respective features measured on 439 retinal positions in the peripapillary area from 23 eyes of 15 different normal subjects.

  10. Calibration of Flick standards

    International Nuclear Information System (INIS)

    Thalmann, Ruedi; Spiller, Jürg; Küng, Alain; Jusko, Otto

    2012-01-01

    Flick standards or magnification standards are widely used for an efficient and functional calibration of the sensitivity of form measuring instruments. The results of a recent measurement comparison were partially unsatisfactory and revealed problems related to the calibration of these standards. In this paper the influence factors for the calibration of Flick standards using roundness measurement instruments are discussed in detail, in particular the bandwidth of the measurement chain, residual form errors of the device under test, profile distortions due to the diameter of the probing element, and questions related to the definition of the measurand. The different contributions are estimated using simulations and are experimentally verified. Also alternative methods to calibrate Flick standards are investigated. Finally the practical limitations of Flick standard calibration are shown, and the usability of Flick standards both to calibrate the sensitivity of roundness instruments and to check the filter function of such instruments is analysed. (paper)

  11. Radiographic normal range of condylar movement of mandible

    International Nuclear Information System (INIS)

    Choi, Byung Ihn; Lee, Jae Mun; Kim, Myung Jin

    1981-01-01

    It is the purpose of this article to determine various normal anatomic measurements of the temporomandibular joint and the normal range of condylar movement, using relatively simple X-ray equipment and radiographic technique in consideration of popular clinical application. The cases comprised 100 clinically normal adult males, and temporomandibular joint radiographs of 3 serial positions of the condylar head were taken by transcranial oblique lateral projection in each case. The serial positions are centric occlusion, 1 inch opening and maximal opening position. The results were as follows; 1. In centric occlusion, the length between the condylar head and glenoid fossa was 2.23 ± 0.58 mm in the anterior part, 3.55 ± 0.80 mm in the upper part and 2.76 ± 0.72 mm in the posterior part. 2. In centric occlusion, the angle (α) between the horizontal standard line (AB) and the anterior slope (BC) was 37.22 ± 3.87°. 3. In 1 inch opening position, the distance of the summit of the condylar head from the standard point of the articular eminence (B) was -0.64 ± 3.53 mm in the horizontal direction and -1.07 ± 1.00 mm in the vertical direction. 4. In maximal opening position, the distance of the summit of the condylar head from the standard point of the articular eminence (B) was 5.83 ± 3.05 mm in the horizontal direction and +0.29 ± 1.58 mm in the vertical direction. 5. In the positional relationship between the condylar head and the standard point of the articular eminence (B), the condyles were found to be at the eminences or anterior to them in 51% with 1 inch opening and 95% with maximal opening

  12. Proximity effect in normal-superconductor hybrids for quasiparticle traps

    Energy Technology Data Exchange (ETDEWEB)

    Hosseinkhani, Amin [Peter Grunberg Institute (PGI-2), Forschungszentrum Julich, D-52425 Julich (Germany); JARA-Institute for Quantum Information, RWTH Aachen University, D-52056 Aachen (Germany)

    2016-07-01

    Coherent transport of charges in the form of Cooper pairs is the main feature of Josephson junctions, which play a central role in superconducting qubits. However, the presence of quasiparticles in superconducting devices may lead to incoherent charge transfer and limit the coherence time of superconducting qubits. A way around this so-called "quasiparticle poisoning" might be using a normal-metal island to trap quasiparticles; this has motivated us to revisit the proximity effect in normal-superconductor hybrids. Using the semiclassical Usadel equations, we study the density of states (DoS) both within and away from the trap. We find that in the superconducting layer the DoS quickly approaches the BCS form; this indicates that normal-metal traps should be effective at localizing quasiparticles.

  13. Measurement of normal auditory ossicles by high-resolution CT with application of normal criteria to disease cases

    International Nuclear Information System (INIS)

    Hara, Jyoko

    1988-01-01

    The purposes of this study were to define criteria for the normal position of ossicles and to apply them in patients with rhinolaryngologically or pathologically confirmed diseases. Ossicles were measured on high-resolution CT images of 300 middle ears, including 241 normal ears and 59 diseased ears, in a total of 203 subjects. Angles A, B, and C to the baseline between the most lateral margins of bilateral internal auditory canals, and distance ratio b/a were defined as measurement items. Normal angles A, B, and C and distance ratio b/a ranged from 19 deg to 59 deg, 101 deg to 145 deg, 51 deg to 89 deg, and 0.49 to 0.51, respectively. Based on these criteria, all of these items were within the normal range in 30/34 (88.2 %) ears for otitis media and mastoiditis. One or more items showed far abnormal values (standard deviation; more than 3) in 5/7 (71.4 %) ears for cholesteatoma and 4/4 (100 %) ears for external ear anomaly. These normal measurements may aid in evaluating the position of auditory ossicles especially in the case of cholesteatoma and auditory ossicle abnormality. (Namekawa, K.)

  14. Measurement of normal auditory ossicles by high-resolution CT with application of normal criteria to disease cases

    Energy Technology Data Exchange (ETDEWEB)

    Hara, Jyoko

    1988-09-01

    The purposes of this study were to define criteria for the normal position of ossicles and to apply them in patients with rhinolaryngologically or pathologically confirmed diseases. Ossicles were measured on high-resolution CT images of 300 middle ears, including 241 normal ears and 59 diseased ears, in a total of 203 subjects. Angles A, B, and C to the baseline between the most lateral margins of bilateral internal auditory canals, and distance ratio b/a were defined as measurement items. Normal angles A, B, and C and distance ratio b/a ranged from 19 deg to 59 deg, 101 deg to 145 deg, 51 deg to 89 deg, and 0.49 to 0.51, respectively. Based on these criteria, all of these items were within the normal range in 30/34 (88.2 %) ears for otitis media and mastoiditis. One or more items showed far abnormal values (standard deviation; more than 3) in 5/7 (71.4 %) ears for cholesteatoma and 4/4 (100 %) ears for external ear anomaly. These normal measurements may aid in evaluating the position of auditory ossicles especially in the case of cholesteatoma and auditory ossicle abnormality. (Namekawa, K.).

  15. Normal range values for thromboelastography in healthy adult volunteers

    Directory of Open Access Journals (Sweden)

    S. Scarpelini

    2009-12-01

    Full Text Available Thromboelastography (TEG®) provides a functional evaluation of coagulation. It has characteristics of an ideal coagulation test for trauma, but is not frequently used, partially due to lack of both standardized techniques and normal values. We determined normal values for our population, compared them to those of the manufacturer, and evaluated the effect of gender, age, blood type, and ethnicity. The technique was standardized using citrated blood and kaolin, and was performed on a Haemoscope 5000 device. Volunteers were interviewed and excluded if pregnant, on anticoagulants or having a bleeding disorder. The TEG® parameters analyzed were R, K, α, MA, LY30, and coagulation index. All volunteers outside the manufacturer's normal range underwent extensive coagulation investigations. The 95% reference ranges for 118 healthy volunteers were R: 3.8-9.8 min, K: 0.7-3.4 min, α: 47.8-77.7 degrees, MA: 49.7-72.7 mm, LY30: -2.3-5.77%, coagulation index: -5.1-3.6. Most values were significantly different from those of the manufacturer, which would have diagnosed coagulopathy in 10 volunteers, for whom additional investigation revealed no disease (81% specificity). Healthy women were significantly more hypercoagulable than men. Aging was not associated with hypercoagulability, nor was East Asian ethnicity associated with hypocoagulability. In our population, the manufacturer's normal values for citrated blood-kaolin had a specificity of 81% and would incorrectly identify 8.5% of the healthy volunteers as coagulopathic. This study supports the manufacturer's recommendation that each institution should determine its own normal values before adopting TEG®, a procedure which may be impractical. Consideration should be given to a multi-institutional study to establish wide standard values for TEG®.
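
    For illustration, a 95% reference range of the kind reported above can be taken from the empirical 2.5th and 97.5th percentiles of healthy-volunteer measurements (the values below are simulated, not the study's data):

```python
import numpy as np

# Simulated R-time measurements for 118 healthy volunteers, in minutes.
rng = np.random.default_rng(0)
r_time = rng.normal(6.0, 1.5, 118)

low, high = np.percentile(r_time, [2.5, 97.5])  # empirical 95% reference range
print(f"95% reference range for R: {low:.1f}-{high:.1f} min")
```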

  16. Reading the New Standard ISA700

    Directory of Open Access Journals (Sweden)

    Daniel Botez

    2010-12-01

    Full Text Available The permanent review of professional standards is a requirement for professional bodies of accountants, resulting from broader processes of globalization and harmonization. A set of revised standards on financial audit engagements was published by IFAC in April 2009. International Standard on Auditing (ISA) 700, "Forming an opinion and reporting on financial statements", is one of them. This standard deals with the auditor's responsibility to form an opinion on the financial statements and determines the form and content of the auditor's report issued following an audit of financial statements. Even though it does not introduce major changes, the revised standard contains several provisions that emphasize the important role of the auditor's report and define the auditor's responsibility more specifically.

  17. Antimicrobial effects of herbal extracts on Streptococcus mutans and normal oral streptococci.

    Science.gov (United States)

    Lee, Sung-Hoon

    2013-08-01

    Streptococcus mutans is associated with dental caries. The cariogenic biofilm, in particular, has been studied extensively for its role in the formation of dental caries. Herbal extracts such as Cudrania tricuspidata, Sophora flavescens, Ginkgo biloba, and Betula schmidtii have been used as folk remedies for treating diseases. The purpose of this study was to evaluate and compare the antibacterial activity of herbal extracts against normal oral streptococci and against planktonic and biofilm forms of S. mutans. Streptococcus gordonii, Streptococcus oralis, Streptococcus salivarius, Streptococcus sanguinis, and S. mutans were cultivated in brain heart infusion (BHI) broth, and a susceptibility assay for the herbal extracts was performed according to the protocol of the Clinical and Laboratory Standards Institute. In addition, S. mutans biofilm was formed on a polystyrene 12-well plate and an 8-well chamber glass slip using BHI broth containing 2% sucrose and 1% mannose after conditioning the plate and the glass slip with unstimulated saliva. The biofilm was treated with the herbal extracts at various concentrations and inoculated on Mitis-Salivarius bacitracin agar plates for enumeration of viable S. mutans by counting colony-forming units. Planktonic S. mutans showed susceptibility to all of the extracts, and S. mutans biofilm exhibited the highest level of sensitivity to the extract of S. flavescens. The normal oral streptococci exhibited weak susceptibility in comparison to S. mutans; S. oralis, however, was resistant to all of the extracts. In conclusion, the extract of S. flavescens may be a potential candidate for the prevention and management of dental caries.

  18. Sketching Curves for Normal Distributions--Geometric Connections

    Science.gov (United States)

    Bosse, Michael J.

    2006-01-01

    Within statistics instruction, students are often requested to sketch the curve representing a normal distribution with a given mean and standard deviation. Unfortunately, these sketches are often notoriously imprecise. Poor sketches are usually the result of missing mathematical knowledge. This paper considers relationships which exist among…
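
    The geometric connection at the heart of such sketches is that a normal curve changes concavity exactly one standard deviation on either side of the mean. The short sketch below plots a normal density and marks those inflection points; the mean and standard deviation chosen are illustrative assumptions.

    import numpy as np
    import matplotlib.pyplot as plt

    mu, sigma = 50.0, 10.0
    x = np.linspace(mu - 4 * sigma, mu + 4 * sigma, 400)
    pdf = np.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

    plt.plot(x, pdf)
    for xi in (mu - sigma, mu + sigma):   # inflection points sit at mu +/- sigma
        plt.axvline(xi, linestyle="--", color="gray")
    plt.title("Normal curve: concavity changes at mu +/- sigma")
    plt.show()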

  19. Normal radiographic findings. 4. act. ed.; Roentgennormalbefunde

    Energy Technology Data Exchange (ETDEWEB)

    Moeller, T.B. [Gemeinschaftspraxis fuer Radiologie und Nuklearmedizin, Dillingen (Germany)

    2003-07-01

    This book can serve the reader in three ways: First, it presents normal findings for all radiographic techniques, including contrast-medium (KM) studies. Important data which serve as criteria for normal findings are indicated directly in the pictures and are also explained in full text and in summary form. Secondly, it teaches the systematics of interpreting a picture: how to look at it, what structures to examine in what order, and what to look for in particular. Checklists are presented in each case. Thirdly, findings are formulated in accordance with the image analysis procedure. All criteria of normal findings are defined in these formulations, which makes them an important didactic element. (orig.)

  20. Weston Standard battery

    CERN Multimedia

    This is a Weston AOIP standard battery with its calibration certificate (1956). Inside, the glassware forms an "H". It is named after the English-born American physicist Edward Weston. A standard is the materialization of a given quantity whose value is known with great accuracy.

  1. Advancing Normal Birth: Organizations, Goals, and Research

    OpenAIRE

    Hotelling, Barbara A.; Humenick, Sharron S.

    2005-01-01

    In this column, the support for advancing normal birth is summarized, based on a comparison of the goals of Healthy People 2010, Lamaze International, the Coalition for Improving Maternity Services, and the midwifery model of care. Research abstracts are presented to provide evidence that the midwifery model of care safely and economically advances normal birth. Rates of intervention experienced, as reported in the Listening to Mothers survey, are compared to the forms of care recommended by ...

  2. A study of prostate delineation referenced against a gold standard created from the visible human data

    International Nuclear Information System (INIS)

    Gao Zhanrong; Wilkins, David; Eapen, Libni; Morash, Christopher; Wassef, Youssef; Gerig, Lee

    2007-01-01

    Purpose: To measure inter- and intra-observer variation and systematic error in CT-based prostate delineation, where individual delineations are referenced against a gold standard produced from photographic anatomical images from the Visible Human Project (VHP). Materials and methods: The CT and anatomical images of the VHP male form the basic data set for this study. The gold standard was established based on 1 mm thick anatomical photographic images. These were registered against the 3 mm thick CT images that were used for target delineation. A total of 120 organ delineations were performed by six radiation oncologists. Results: The physician-delineated prostate volume was on average 30% larger than the 'true' prostate volume, but on average included only 84% of the gold standard volume. Our study found a systematic delineation error such that posterior portions of the prostate were always missed, while anteriorly some normal tissue was always defined as target. Conclusions: Our data suggest that radiation oncologists are more concerned with the unintentional inclusion of rectal tissue than with missing prostate volume. In contrast, they are likely to overextend the anterior boundary of the prostate to encompass normal tissue such as the bladder.

  3. European standards for composite construction

    NARCIS (Netherlands)

    Stark, J.W.B.

    2000-01-01

    The European Standards Organisation (CEN) has planned to develop a complete set of harmonized European building standards. This set includes standards for composite steel and concrete buildings and bridges. The Eurocodes, being the design standards, form part of this total system of European standards.

  4. A Denotational Account of Untyped Normalization by Evaluation

    DEFF Research Database (Denmark)

    Filinski, Andrzej; Rohde, Henning Korsholm

    2004-01-01

    We show that the standard normalization-by-evaluation construction for the simply-typed λβη-calculus has a natural counterpart for the untyped λβ-calculus, with the central type-indexed logical relation replaced by a recursively defined invariant relation, in the style of Pitts. In fact, the cons... proof for the normalization algorithm, expressed as a functional program in an ML-like call-by-value language. A version of this article with detailed proofs is available as a technical report [5].
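
    To make the idea concrete, here is a minimal sketch of untyped normalization by evaluation: syntax is evaluated into a semantic domain (host-language closures plus residual neutral terms) and then reified back into beta-normal syntax. The term representation and helper names are illustrative assumptions, not the paper's construction.

    from dataclasses import dataclass

    @dataclass
    class Var:
        name: str

    @dataclass
    class Lam:
        param: str
        body: object

    @dataclass
    class App:
        fun: object
        arg: object

    def evaluate(term, env):
        """Interpret syntax in the semantic domain; lambdas become closures."""
        if isinstance(term, Var):
            return env.get(term.name, term)            # free variables stay neutral
        if isinstance(term, Lam):
            return lambda v: evaluate(term.body, {**env, term.param: v})
        f, a = evaluate(term.fun, env), evaluate(term.arg, env)
        return f(a) if callable(f) else App(f, a)      # beta-reduce or keep neutral

    def reify(value, fresh=0):
        """Read a semantic value back into syntax, inventing fresh names."""
        if callable(value):
            x = Var(f"x{fresh}")
            return Lam(x.name, reify(value(x), fresh + 1))
        if isinstance(value, App):
            return App(reify(value.fun, fresh), reify(value.arg, fresh))
        return value                                   # a neutral variable

    def normalize(term):
        return reify(evaluate(term, {}))

    # (\f. \x. f x) (\y. y) normalizes to \x0. x0
    two_step = App(Lam("f", Lam("x", App(Var("f"), Var("x")))), Lam("y", Var("y")))
    print(normalize(two_step))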

  5. Fusion and normalization to enhance anomaly detection

    Science.gov (United States)

    Mayer, R.; Atkinson, G.; Antoniades, J.; Baumback, M.; Chester, D.; Edwards, J.; Goldstein, A.; Haas, D.; Henderson, S.; Liu, L.

    2009-05-01

    This study examines normalizing the imagery and the optimization metrics to enhance anomaly and change detection, respectively. The RX algorithm, the standard anomaly detector for hyperspectral imagery, more successfully extracts bright rather than dark man-made objects when applied to visible hyperspectral imagery. Normalizing the imagery prior to applying the anomaly detector can help detect some of the problematic dark objects, but can also miss some bright objects. This study therefore fuses the RX images computed from normalized and unnormalized imagery into a single decision surface. The technique was tested using imagery of commercial vehicles in an urban environment gathered by a hyperspectral visible/near-IR sensor mounted on an airborne platform. Combining detections first requires converting the detector output to a target probability. The observed anomaly detections were fitted with a linear combination of chi-square distributions, and these weights were used to help compute the target probability. Receiver operating characteristic (ROC) curves quantitatively assessed the target detection performance. The target detection performance is highly variable depending on the relative numbers of candidate bright and dark targets and false alarms, and was controlled in this study by using vegetation and street-line masks. The joint Boolean OR and AND operations also produce variable performance depending on the scene. The joint SUM operation provides a reasonable compromise between the OR and AND operations and has good target detection performance. In addition, transforms based on the normalized correlation coefficient and on least squares, related to canonical correlation analysis (CCA) and normalized image regression (NIR), were introduced; transforms based on CCA and NIR performed better than the standard approaches. For change detection, only RX applied to the unnormalized difference imagery provides adequate performance.
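
    The core detector is compact enough to sketch. Below is a minimal RX implementation (per-pixel Mahalanobis distance to the scene statistics) together with the kind of per-pixel normalization discussed above; the cube layout and the unit-norm normalization choice are assumptions for illustration.

    import numpy as np

    def rx_scores(cube):
        """cube: (rows, cols, bands) -> per-pixel Mahalanobis distance to the scene mean."""
        rows, cols, bands = cube.shape
        X = cube.reshape(-1, bands).astype(float)
        mu = X.mean(axis=0)
        cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))   # pseudo-inverse guards against singularity
        d = X - mu
        scores = np.einsum("ij,jk,ik->i", d, cov_inv, d)
        return scores.reshape(rows, cols)

    def normalize_pixels(cube, eps=1e-12):
        """Scale each pixel spectrum to unit L2 norm before detection."""
        norms = np.linalg.norm(cube, axis=2, keepdims=True)
        return cube / (norms + eps)

    # Fusion along the lines described above: convert each score map to a
    # target probability, then combine, e.g., with the SUM rule.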

  6. Establishing the proteome of normal human cerebrospinal fluid.

    Directory of Open Access Journals (Sweden)

    Steven E Schutzer

    2010-06-01

    Knowledge of the entire protein content, the proteome, of normal human cerebrospinal fluid (CSF) would enable insights into neurologic and psychiatric disorders. Until now, technologic hurdles and limited access to truly normal samples have hindered attaining this goal. We applied immunoaffinity separation and high-sensitivity, high-resolution liquid chromatography-mass spectrometry to examine CSF from healthy normal individuals. We identified 2630 proteins in CSF from normal subjects, of which 56% were CSF-specific, not found in the much larger set of 3654 proteins we have identified in plasma. We also examined CSF from groups of subjects previously examined by others as surrogates for normals, where neurologic symptoms warranted a lumbar puncture but clinical laboratory results were reported as normal. We found statistically significant differences between their CSF proteins and those of our non-neurological normals. We also examined CSF from 10 volunteer subjects who had lumbar punctures at least 4 weeks apart and found little variability in CSF proteins within an individual compared with the variability from subject to subject. Our results represent the most comprehensive characterization of true normal CSF to date. This normal CSF proteome establishes a comparative standard and a basis for investigations into a variety of diseases with neurological and psychiatric features.

  7. IMPLICATIONS OF STANDARDIZATION AND HARMONIZATION OF ACCOUNTING FOR ROMANIA

    OpenAIRE

    Mihaela Cristina Onica; Neculina Chebac

    2008-01-01

    Accounting normalization involves the making of rules and accounting standards, rules intended to act as a common denominator for practice so that financial information can be compared and approached consistently. The accounting normalization process is structured in two main areas: national accounting normalization and international accounting normalization. Though the concept of international accounting normalization has to be realized a differen...

  8. Determination of the main solid-state form of albendazole in bulk drug, employing Raman spectroscopy coupled to multivariate analysis.

    Science.gov (United States)

    Calvo, Natalia L; Arias, Juan M; Altabef, Aída Ben; Maggio, Rubén M; Kaufman, Teodoro S

    2016-09-10

    Albendazole (ALB) is a broad-spectrum anthelmintic which exhibits two solid-state forms (Forms I and II). Form I is the metastable crystal at room temperature, while Form II is the stable one. Because the drug has poor aqueous solubility and Form II is less soluble than Form I, it is desirable to have a method to assess the solid-state form of the drug employed for manufacturing purposes. Therefore, a Partial Least Squares (PLS) model was developed for the determination of Form I of ALB in its mixtures with Form II. For model development, both solid-state forms of ALB were prepared and characterized by microscopic (optical, with normal and polarized light), thermal (DSC), and spectroscopic (ATR-FTIR, Raman) techniques. Mixtures of the solids in different ratios were prepared by weighing and mechanical mixing of the components. Their Raman spectra were acquired and subjected to peak smoothing, normalization, standard normal variate correction, and de-trending before performing the PLS calculations. The optimal spectral region (1396-1280 cm⁻¹) and number of latent variables (LV=3) were obtained employing a moving-window-of-variable-size strategy. The method was internally validated by means of the leave-one-out procedure, providing satisfactory statistics (r(2)=0.9729 and RMSD=5.6%) and figures of merit (LOD=9.4% and MDDC=1.4). Furthermore, the method's performance was also evaluated by analysis of two validation sets. Validation set I was used for assessment of linearity and range, and validation set II to demonstrate accuracy and precision (recovery=101.4% and RSD=2.8%). Additionally, a third set of spiked commercial samples was evaluated, exhibiting excellent recoveries (94.2±6.4%). The results suggest that the combination of Raman spectroscopy with multivariate analysis could be applied to the assessment of the main crystal form and its quantitation in samples of ALB bulk drug in the routine quality control laboratory. Copyright © 2016 Elsevier B.V. All rights reserved.
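
    The chemometric pipeline (spectral preprocessing followed by PLS calibration) is straightforward to prototype. The sketch below applies standard normal variate correction and fits a 3-latent-variable PLS model; the synthetic spectra and band profile are illustrative assumptions, not the authors' data.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def snv(spectra):
        """Standard normal variate: center and scale each spectrum individually."""
        return (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

    rng = np.random.default_rng(0)
    band = np.sin(np.linspace(0, 6, 120)) ** 2        # synthetic 'Form I' band profile
    y = rng.uniform(0, 100, size=20)                  # % Form I in calibration mixtures
    X = np.outer(y, band) + rng.normal(scale=0.5, size=(20, 120))

    pls = PLSRegression(n_components=3)               # 3 latent variables, as in the study
    pls.fit(snv(X), y)
    print(pls.predict(snv(X))[:3].ravel())            # recovered % Form I (first 3 samples)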

  9. Quantification of endogenous metabolites by the postcolumn infused-internal standard method combined with matrix normalization factor in liquid chromatography-electrospray ionization tandem mass spectrometry.

    Science.gov (United States)

    Liao, Hsiao-Wei; Chen, Guan-Yuan; Wu, Ming-Shiang; Liao, Wei-Chih; Tsai, I-Lin; Kuo, Ching-Hua

    2015-01-02

    Quantification of endogenous metabolites has enabled the discovery of biomarkers for diagnosis and provided an understanding of disease etiology. The standard addition and stable isotope labeled-internal standard (SIL-IS) methods are currently the most widely used approaches to quantifying endogenous metabolites, but both have limitations for clinical measurement. In this study, we developed a new approach to endogenous metabolite quantification using the postcolumn infused-internal standard (PCI-IS) method combined with the matrix normalization factor (MNF) method. The MNF was used to correct the difference in matrix effects between standard solution and biofluids, and the PCI-IS additionally tailored the correction of the matrix effects for individual samples. Androstenedione and testosterone were selected as test articles to verify this new approach to quantifying metabolites in plasma. The repeatability (n=4 runs) and intermediate precision (n=3 days) in terms of the peak areas of androstenedione and testosterone were less than 11% relative standard deviation (RSD) at all tested concentrations. The accuracy test revealed recoveries between 95.72% and 113.46%. The concentrations of androstenedione and testosterone in fifty plasma samples obtained from healthy volunteers were quantified by the PCI-IS combined with the MNF method, and the quantification results were compared with the results of the SIL-IS method. The Pearson correlation test showed a correlation coefficient of 0.98 for both androstenedione and testosterone. We demonstrated that the PCI-IS method combined with the MNF method is an effective and accurate method for quantifying endogenous metabolites. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. 40 CFR 406.50 - Applicability; description of the normal rice milling subcategory.

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment, Section 406.50 (2010-07-01): Applicability; description of the normal rice milling subcategory. Environmental Protection Agency (continued), Effluent Guidelines and Standards, Grain Mills Point Source Category, Normal Rice...

  11. Linear regression and the normality assumption.

    Science.gov (United States)

    Schmidt, Amand F; Finan, Chris

    2017-12-16

    Researchers often perform arbitrary outcome transformations to fulfill the normality assumption of a linear regression model. This commentary explains and illustrates that in large data settings such transformations are often unnecessary and, worse, may bias model estimates. Linear regression assumptions are illustrated using simulated data and an empirical example on the relation between time since type 2 diabetes diagnosis and glycated hemoglobin levels. Simulation results were evaluated on coverage, i.e., the number of times the 95% confidence interval included the true slope coefficient. Although outcome transformations bias point estimates, violations of the normality assumption in linear regression analyses do not. The normality assumption is necessary to unbiasedly estimate standard errors, and hence confidence intervals and P-values. However, in large sample sizes (e.g., where the number of observations per variable is >10), violations of this normality assumption often do not noticeably impact results. In contrast, assumptions on the parametric model, the absence of extreme observations, homoscedasticity, and independence of the errors remain influential even in large sample size settings. Given that modern healthcare research typically includes thousands of subjects, focusing on the normality assumption is often unnecessary, does not guarantee valid results, and, worse, may bias estimates due to the practice of outcome transformations. Copyright © 2017 Elsevier Inc. All rights reserved.
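
    The coverage argument is easy to check by simulation. The sketch below fits ordinary least squares to data with deliberately skewed (non-normal) errors and counts how often the nominal 95% confidence interval covers the true slope; the sample size and error distribution are illustrative assumptions.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n, true_slope, reps, hits = 500, 0.5, 2000, 0

    for _ in range(reps):
        x = rng.normal(size=n)
        errors = rng.exponential(size=n) - 1.0     # skewed, mean-zero (non-normal) errors
        y = true_slope * x + errors
        fit = stats.linregress(x, y)
        half = 1.96 * fit.stderr                   # nominal 95% interval half-width
        hits += fit.slope - half <= true_slope <= fit.slope + half

    print(f"Empirical coverage: {hits / reps:.3f}")  # stays close to 0.95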

  12. Overburden Stress Normalization and Rod Length Corrections for the Standard Penetration Test (SPT)

    OpenAIRE

    Deger, Tonguc Tolga

    2014-01-01

    The Standard Penetration Test (SPT) has been a staple of geotechnical engineering practice for more than 70 years. Empirical correlations based on in situ SPT data provide an important basis for assessment of a broad range of engineering parameters, and for empirically based analysis and design methods spanning a significant number of areas of geotechnical practice. Despite this longstanding record of usage, the test itself is relatively poorly standardized with regard to the allowable variab...

  13. Standard Test Method for Normal Spectral Emittance at Elevated Temperatures of Nonconducting Specimens

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1971-01-01

    1.1 This test method describes an accurate technique for measuring the normal spectral emittance of electrically nonconducting materials in the temperature range from 1000 to 1800 K, and at wavelengths from 1 to 35 μm. It is particularly suitable for measuring the normal spectral emittance of materials such as ceramic oxides, which have relatively low thermal conductivity and are translucent to appreciable depths (several millimetres) below the surface, but which become essentially opaque at thicknesses of 10 mm or less. 1.2 This test method requires expensive equipment and rather elaborate precautions, but produces data that are accurate to within a few percent. It is particularly suitable for research laboratories, where the highest precision and accuracy are desired, and is not recommended for routine production or acceptance testing. Because of its high accuracy, this test method may be used as a reference method to be applied to production and acceptance testing in case of dispute. 1.3 This test metho...

  14. Weight, iodine content and iodine uptake of the thyroid gland of normal Japanese

    International Nuclear Information System (INIS)

    Yoshizawa, Yasuo; Kusama, Tomoko

    1976-01-01

    Various questions arise in the application of ICRP "Standard Man" values to Japanese. One of the questions is that the "Standard Man" values for the thyroid differ from normal Japanese values. A systematic survey of past reports was carried out with a view to finding normal Japanese values for the thyroid. The subjects of the search were the weight, iodine content, and iodine uptake rate (f_w) of the thyroid. These are important factors in the estimation of the radiation dose to the thyroid caused by internal contamination with radioiodine, and are foreseen to differ between Japanese and "Standard Man". The result of the study suggested that the weight of the thyroid of normal Japanese is about 19 g for adult males and about 17 g for adult females, that the iodine content is 12-22 mg, and that the iodine uptake rate (f_w) is about 0.2. (auth.)

  15. Evaluation of Normalization Methods to Pave the Way Towards Large-Scale LC-MS-Based Metabolomics Profiling Experiments

    Science.gov (United States)

    Valkenborg, Dirk; Baggerman, Geert; Vanaerschot, Manu; Witters, Erwin; Dujardin, Jean-Claude; Burzykowski, Tomasz; Berg, Maya

    2013-01-01

    Combining liquid chromatography-mass spectrometry (LC-MS)-based metabolomics experiments that were collected over a long period of time remains problematic due to systematic variability between LC-MS measurements. Until now, most normalization methods for LC-MS data have been model-driven, based on internal standards or intermediate quality control runs, where an external model is extrapolated to the dataset of interest. In the first part of this article, we evaluate several existing data-driven normalization approaches on LC-MS metabolomics experiments, which do not require the use of internal standards. According to variability measures, each normalization method performs relatively well, showing that the use of any normalization method will greatly improve data analysis originating from multiple experimental runs. In the second part, we apply cyclic-Loess normalization to a Leishmania sample. This normalization method allows the removal of systematic variability between two measurement blocks over time while maintaining the differential metabolites. In conclusion, normalization allows for pooling datasets from different measurement blocks over time and increases the statistical power of the analysis, hence paving the way to increase the scale of LC-MS metabolomics experiments. From our investigation, we recommend data-driven normalization methods over model-driven normalization methods if only a few internal standards were used. Moreover, data-driven normalization methods are the best option for normalizing datasets from untargeted LC-MS experiments. PMID:23808607
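
    A minimal data-driven normalization between two runs can be sketched with a loess fit on an MA-plot (log ratio versus mean log intensity), in the spirit of the cyclic-Loess method named above; the smoother settings and synthetic intensities are illustrative assumptions, and a full cyclic version would iterate this over all pairs of runs.

    import numpy as np
    from statsmodels.nonparametric.smoothers_lowess import lowess

    def loess_normalize_pair(run1, run2, frac=0.4):
        """Remove intensity-dependent bias between two log-scale runs (one MA-loess pass)."""
        m = run1 - run2                    # log fold-change per feature
        a = 0.5 * (run1 + run2)            # mean log intensity per feature
        trend = lowess(m, a, frac=frac, return_sorted=False)
        return run1 - trend / 2, run2 + trend / 2

    rng = np.random.default_rng(2)
    base = rng.normal(10, 2, size=300)                           # log2 feature intensities
    run1 = base + rng.normal(scale=0.1, size=300)
    run2 = base + 0.05 * base + rng.normal(scale=0.1, size=300)  # intensity-dependent bias
    n1, n2 = loess_normalize_pair(run1, run2)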

  16. STATISTICAL STUDY OF THE NUMBER OF RESULTING RULES WHEN TRANSFORMING A CONTEXT-FREE GRAMMAR TO CHOMSKY NORMAL FORM

    Directory of Open Access Journals (Sweden)

    Fredy Ángel Miguel Amaya Robayo

    2010-08-01

    It is well known that any context-free grammar can be transformed into Chomsky normal form so that the languages generated by the two grammars are equivalent. A grammar in Chomsky normal form (CNF) has some advantages: its derivation trees are binary, its rules have a simpler shape, and so on. It is therefore always desirable to be able to work with a grammar in CNF in applications that require one. An algorithm exists that transforms a context-free grammar into a CNF grammar; however, the number of rules generated by the transformation depends on the number of rules in the initial grammar as well as on other characteristics. In this work we analyze, from an experimental and statistical point of view, the relationship between the number of initial rules and the number of rules that result after transforming a context-free grammar to CNF. This makes it possible to plan the amount of computational resources needed when dealing with grammars of some complexity.
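
    The rule growth comes mostly from two mechanical steps of the conversion: lifting terminals out of long right-hand sides and binarizing bodies with more than two symbols. The sketch below implements just those steps and reports the resulting rule count; the grammar encoding and the lowercase-terminal convention are assumptions, and epsilon- and unit-rule elimination are omitted.

    def to_cnf_fragment(rules):
        """rules: dict nonterminal -> list of tuples of symbols.
        Terminals are lowercase strings by convention."""
        new_rules, term_map, counter = {}, {}, 0

        def fresh():
            nonlocal counter
            counter += 1
            return f"_X{counter}"

        for head, bodies in rules.items():
            for body in bodies:
                body = list(body)
                if len(body) > 1:                  # lift terminals out of long bodies
                    for i, sym in enumerate(body):
                        if sym.islower():
                            if sym not in term_map:
                                term_map[sym] = fresh()
                                new_rules.setdefault(term_map[sym], []).append((sym,))
                            body[i] = term_map[sym]
                while len(body) > 2:               # binarize long right-hand sides
                    nt = fresh()
                    new_rules.setdefault(nt, []).append(tuple(body[-2:]))
                    body[-2:] = [nt]
                new_rules.setdefault(head, []).append(tuple(body))
        return new_rules

    g = {"S": [("a", "S", "b"), ("a", "b")]}       # S -> aSb | ab
    cnf = to_cnf_fragment(g)
    print(sum(len(bodies) for bodies in cnf.values()))   # rule count after conversion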

  17. The morphological classification of normal and abnormal red blood cell using Self Organizing Map

    Science.gov (United States)

    Rahmat, R. F.; Wulandari, F. S.; Faza, S.; Muchtar, M. A.; Siregar, I.

    2018-02-01

    Blood is an essential component of living creatures in the vascular space. Possible diseases can be identified through blood tests, among other things from the form of the red blood cells. The normal or abnormal morphology of a patient's red blood cells is very helpful to doctors in detecting a disease. Advances in digital image processing technology make it possible to identify a patient's normal and abnormal blood cells. This research used the self-organizing map method to classify normal and abnormal forms of red blood cells in digital images. The self-organizing map neural network classified the normal and abnormal forms of red blood cells in the input images with a testing accuracy of 93.78%.
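
    For readers unfamiliar with the method, a self-organizing map is a small grid of weight vectors trained by pulling the best-matching unit and its neighbors toward each sample. Below is a minimal NumPy sketch of that training loop; the grid size, rates, and the omission of the cell-image feature extraction step are illustrative assumptions.

    import numpy as np

    def train_som(data, grid=(6, 6), epochs=200, lr0=0.5, sigma0=2.0, seed=0):
        rng = np.random.default_rng(seed)
        weights = rng.normal(size=(grid[0], grid[1], data.shape[1]))
        coords = np.stack(np.mgrid[0:grid[0], 0:grid[1]], axis=-1)  # node positions
        for t in range(epochs):
            lr = lr0 * np.exp(-t / epochs)          # decaying learning rate
            sigma = sigma0 * np.exp(-t / epochs)    # shrinking neighborhood
            for x in rng.permutation(data):
                # best-matching unit: node whose weights are closest to the sample
                bmu = np.unravel_index(
                    np.argmin(((weights - x) ** 2).sum(axis=2)), grid)
                # Gaussian neighborhood pull toward the sample
                dist2 = ((coords - np.array(bmu)) ** 2).sum(axis=2)
                h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]
                weights += lr * h * (x - weights)
        return weights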

  18. Physical Characteristics of Laboratory Tested Concrete as a Substituion of Gravel on Normal Concrete

    Science.gov (United States)

    Butar-butar, Ronald; Suhairiani; Wijaya, Kinanti; Sebayang, Nono

    2018-03-01

    Concrete technology is highly important in construction, for both structural and non-structural applications. The large amounts of concrete used raise the problem of solid waste in the form of leftover tested specimens in the laboratory. This waste is usually simply discarded and has no economic value. To address the problem, this experiment created a new material using recycled aggregate, with the aims of determining the strength characteristics of used concrete as a gravel substitute in normal concrete and of obtaining the composition of gravel and used concrete that achieves concrete strength according to the standard. Testing of concrete characteristics is one of the requirements before proportioning a concrete mixture. The tests used the SNI (Indonesian National Standard) method with varied ratios of used concrete to gravel: 15:85%, 25:75%, 35:65%, 50:50%, and 75:25%. The physical tests showed that the silt content of the mixture of gravel and used concrete was 0.03 above the limit of SNI 03-4142-1996, at 1.03%, so watering or soaking is needed before use. The water content results show an increase in water content as the proportion of used concrete increases. The specific gravity values for the variations 15:85% to 35:65% fulfilled the requirements of SNI 03-1969-1990; the other variations showed specific gravity values in the range of lightweight materials.

  19. ASSESSMENT OF SELECTED PROPERTIES OF NORMAL CONCRETES WITH THE GRINDED RUBBER FROM WORN OUT VEHICLE TYRES

    Directory of Open Access Journals (Sweden)

    Ewa Ołdakowska

    2015-07-01

    Rubber from worn tyres is often regarded as a useless material, burdensome for the environment, whose most popular recovery method until recently was storage (currently forbidden by law). The adoption and dissemination of new ecological standards, created not only by European and national legislation but also developing as a result of expanding ecological consciousness, forces the search for efficient methods of utilizing vehicle tyres. An exemplary solution to the problem of tyres withdrawn from service, presented in this article, is using them in ground form as a substitute for natural aggregate in the production of normal concrete. The article presents the results of tests of selected properties of the modified normal concrete, on the basis of which it was found that the rubber decreases compressive strength and concrete density, limits water absorbability, and does not significantly influence the physical and chemical phenomena accompanying the formation of the composite structure.

  20. Schema Design and Normalization Algorithm for XML Databases Model

    Directory of Open Access Journals (Sweden)

    Samir Abou El-Seoud

    2009-06-01

    In this paper we study the problem of schema design and normalization in the XML database model. We show that, like relational databases, XML documents may contain redundant information, and this redundancy may cause update anomalies. Furthermore, such problems are caused by certain functional dependencies among paths in the document. Based on our research, in which we presented functional dependencies and normal forms for XML Schema, we present a decomposition algorithm for converting any XML Schema into a normalized one that satisfies X-BCNF.

  1. Manufacturing technology for practical Josephson voltage standards; Fertigungstechnologie fuer praxistaugliche Josephson-Spannungsnormale

    Energy Technology Data Exchange (ETDEWEB)

    Kohlmann, Johannes; Kieler, Oliver [Physikalisch-Technische Bundesanstalt (PTB), Braunschweig (Germany). Arbeitsgruppe 2.43 ' ' Josephson-Schaltungen' '

    2016-09-15

    In this contribution we present the manufacturing technology for the fabrication of integrated superconducting Josephson series-array circuits for voltage standards. We first summarize some fundamentals of Josephson voltage standards and sketch the concept and layout of the circuits, before describing the manufacturing technology for modern practical Josephson voltage standards.

  2. FDG-PET of patients with suspected renal failure. Standardized uptake values in normal tissues

    International Nuclear Information System (INIS)

    Minamimoto, Ryogo; Takahashi, Nobukazu; Inoue, Tomio

    2007-01-01

    This study aims to clarify the effect of renal function on 2-[18F]fluoro-2-deoxy-D-glucose positron emission tomography (FDG-PET) imaging and to determine the clinical significance of renal function in this setting. We compared FDG distribution between normal volunteers and patients with suspected renal failure. Twenty healthy volunteers and 20 patients with suspected renal failure who underwent FDG-PET between November 2002 and May 2005 were selected for this study. We define "patients with suspected renal failure" as having a serum creatinine level in excess of 1.1 mg/dl. The serum creatinine level was examined within 2 weeks of the FDG-PET study. Regions of interest were placed over 15 regions for semi-quantitative analysis: the white matter, cortex, both upper lung fields, both middle lung fields, both lower lung fields, mediastinum, myocardium of the left ventricle, the left atrium as a cardiac blood pool, the central region of the right lobe of the liver, the left kidney, and both femoral muscles. The mean standardized uptake values (SUVs) of the brain cortex and white matter were higher in healthy volunteers than in renal patients. The mean SUVs of the mediastinum at the level of the aortic arch and of the left atrium as a cardiac blood pool were lower in healthy volunteers than in patients with suspected renal failure. These regions differed between healthy volunteers and patients with suspected renal failure (P<0.05). We found decreased brain accumulation and increased blood-pool accumulation of FDG in patients with high plasma creatinine. Although the difference is small, this phenomenon will not have a large effect on the assessment of FDG-PET imaging in patients with suspected renal failure. (author)

  3. 40 CFR 417.166 - Pretreatment standards for new sources.

    Science.gov (United States)

    2010-07-01

    Effluent Guidelines and Standards, Soap and Detergent Manufacturing Point Source Category, Manufacture of Liquid... The standard shall be: (1) For normal liquid detergent operations the following values pertain: Pollutant or...

  4. Successive Standardization of Rectangular Arrays

    Directory of Open Access Journals (Sweden)

    Richard A. Olshen

    2012-02-01

    In this note we illustrate and develop further, with mathematics and examples, the work on successive standardization (or normalization) studied earlier by the same authors in [1] and [2]. Thus, we deal with successive iterations applied to rectangular arrays of numbers, where to avoid technical difficulties an array has at least three rows and at least three columns. Without loss of generality, an iteration begins with operations on columns: first subtract the mean of each column, then divide by its standard deviation. The iteration continues with the same two operations done successively for rows. These four operations applied in sequence complete one iteration. One then iterates again, and again, and again... In [1] it was argued that if arrays are made up of real numbers, then the set for which convergence of these successive iterations fails has Lebesgue measure 0. The limiting array has row and column means 0 and row and column standard deviations 1. A basic result on convergence given in [1] is true, though the argument in [1] is faulty. The result is stated in the form of a theorem here, and the argument for the theorem is correct. Moreover, many graphics given in [1] suggest that, except for a set of entries of any array with Lebesgue measure 0, convergence is very rapid, eventually exponentially fast in the number of iterations. Because we learned this set of rules from Bradley Efron, we call it "Efron's algorithm". More importantly, the rapidity of convergence is illustrated by numerical examples.
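
    The iteration itself is only a few lines. Below is a minimal sketch of the column-then-row standardization loop applied to a random array; the stopping tolerance and array size are illustrative assumptions.

    import numpy as np

    def successive_standardization(a, max_iters=100, tol=1e-12):
        a = a.astype(float).copy()
        for i in range(max_iters):
            prev = a.copy()
            a = (a - a.mean(axis=0)) / a.std(axis=0)                                # columns first
            a = (a - a.mean(axis=1, keepdims=True)) / a.std(axis=1, keepdims=True)  # then rows
            if np.max(np.abs(a - prev)) < tol:                                      # converged
                break
        return a, i + 1

    rng = np.random.default_rng(3)
    arr, n_iter = successive_standardization(rng.normal(size=(5, 4)))
    print(n_iter, arr.mean(axis=0).round(12), arr.std(axis=1).round(12))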

  5. Sandstone-filled normal faults: A case study from central California

    Science.gov (United States)

    Palladino, Giuseppe; Alsop, G. Ian; Grippa, Antonio; Zvirtes, Gustavo; Phillip, Ruy Paulo; Hurst, Andrew

    2018-05-01

    Despite the potential of sandstone-filled normal faults to significantly influence fluid transmissivity within reservoirs and the shallow crust, they have to date been largely overlooked. Fluidized sand, forcefully intruded along normal fault zones, markedly enhances the transmissivity of faults and, in general, the connectivity between otherwise unconnected reservoirs. Here, we provide a detailed outcrop description and interpretation of sandstone-filled normal faults from different stratigraphic units in central California. Such faults commonly show limited fault throw, cm to dm wide apertures, poorly-developed fault zones and full or partial sand infill. Based on these features and inferences regarding their origin, we propose a general classification that defines two main types of sandstone-filled normal faults. Type 1 form as a consequence of the hydraulic failure of the host strata above a poorly-consolidated sandstone following a significant, rapid increase of pore fluid over-pressure. Type 2 sandstone-filled normal faults form as a result of regional tectonic deformation. These structures may play a significant role in the connectivity of siliciclastic reservoirs, and may therefore be crucial not just for investigation of basin evolution but also in hydrocarbon exploration.

  6. Relation between Protein Intrinsic Normal Mode Weights and Pre-Existing Conformer Populations.

    Science.gov (United States)

    Ozgur, Beytullah; Ozdemir, E Sila; Gursoy, Attila; Keskin, Ozlem

    2017-04-20

    Intrinsic fluctuations of a protein enable it to sample a large repertoire of conformers, including the open and closed forms. These distinct forms of the protein, called conformational substates, pre-exist together in equilibrium as an ensemble independent of its ligands. The role of the ligand might simply be to shift the equilibrium toward the form most appropriate for binding. Normal mode analysis has proved useful in identifying the directions of conformational changes between substates. In this study, we demonstrate that the ratios of the normalized weights of the few normal modes driving the protein between its substates can give insights into the ratios of the kinetic conversion rates of the substates, although a direct relation between the eigenvalues and the kinetic conversion rates or populations of each substate could not be observed. The correlation between the normalized mode weight ratios and the kinetic rate ratios is around 83% on a set of 11 non-enzyme proteins and around 59% on a set of 17 enzymes. The results suggest that mode motions carry intrinsic relations with the thermodynamics and kinetics of proteins.

  7. Complete normal ordering 1: Foundations

    Directory of Open Access Journals (Sweden)

    John Ellis

    2016-08-01

    We introduce a new prescription for quantising scalar field theories (in generic spacetime dimension and background) perturbatively around a true minimum of the full quantum effective action, which is to 'complete normal order' the bare action of interest. When the true vacuum of the theory is located at zero field value, the key property of this prescription is the automatic cancellation, to any finite order in perturbation theory, of all tadpole and, more generally, all 'cephalopod' Feynman diagrams. The latter are connected diagrams that can be disconnected into two pieces by cutting one internal vertex, with either one or both pieces free from external lines. In addition, this procedure of 'complete normal ordering' (which is an extension of the standard field theory definition of normal ordering) reduces by a substantial factor the number of Feynman diagrams to be calculated at any given loop order. We illustrate explicitly the complete normal ordering procedure and the cancellation of cephalopod diagrams in scalar field theories with non-derivative interactions, and by using a point-splitting 'trick' we extend this result to theories with derivative interactions, such as those appearing as non-linear σ-models in the world-sheet formulation of string theory. We focus here on theories with trivial vacua, generalising the discussion to non-trivial vacua in a follow-up paper.

  8. Selective attention in normal and impaired hearing.

    Science.gov (United States)

    Shinn-Cunningham, Barbara G; Best, Virginia

    2008-12-01

    A common complaint among listeners with hearing loss (HL) is that they have difficulty communicating in common social settings. This article reviews how normal-hearing listeners cope in such settings, especially how they focus attention on a source of interest. Results of experiments with normal-hearing listeners suggest that the ability to selectively attend depends on the ability to analyze the acoustic scene and to form perceptual auditory objects properly. Unfortunately, sound features important for auditory object formation may not be robustly encoded in the auditory periphery of HL listeners. In turn, impaired auditory object formation may interfere with the ability to filter out competing sound sources. Peripheral degradations are also likely to reduce the salience of higher-order auditory cues such as location, pitch, and timbre, which enable normal-hearing listeners to select a desired sound source out of a sound mixture. Degraded peripheral processing is also likely to increase the time required to form auditory objects and focus selective attention so that listeners with HL lose the ability to switch attention rapidly (a skill that is particularly important when trying to participate in a lively conversation). Finally, peripheral deficits may interfere with strategies that normal-hearing listeners employ in complex acoustic settings, including the use of memory to fill in bits of the conversation that are missed. Thus, peripheral hearing deficits are likely to cause a number of interrelated problems that challenge the ability of HL listeners to communicate in social settings requiring selective attention.

  9. Application of Power Geometry and Normal Form Methods to the Study of Nonlinear ODEs

    Science.gov (United States)

    Edneral, Victor

    2018-02-01

    This paper describes power transformations of degenerate autonomous polynomial systems of ordinary differential equations which reduce such systems to a non-degenerate form. An example of constructing exact first integrals of motion of a planar degenerate system in closed form is given.

  10. On The Extensive Form Of N-Person Cooperative Games | Udeh ...

    African Journals Online (AJOL)

    On The Extensive Form Of N-Person Cooperative Games. ... games. Keywords: Extensive form game, Normal form game, characteristic function, Coalition, Imputation, Player, Payoff, Strategy and Core

  11. Strength of Gamma Rhythm Depends on Normalization

    Science.gov (United States)

    Ray, Supratim; Ni, Amy M.; Maunsell, John H. R.

    2013-01-01

    Neuronal assemblies often exhibit stimulus-induced rhythmic activity in the gamma range (30–80 Hz), whose magnitude depends on the attentional load. This has led to the suggestion that gamma rhythms form dynamic communication channels across cortical areas processing the features of behaviorally relevant stimuli. Recently, attention has been linked to a normalization mechanism, in which the response of a neuron is suppressed (normalized) by the overall activity of a large pool of neighboring neurons. In this model, attention increases the excitatory drive received by the neuron, which in turn also increases the strength of normalization, thereby changing the balance of excitation and inhibition. Recent studies have shown that gamma power also depends on such excitatory–inhibitory interactions. Could modulation in gamma power during an attention task be a reflection of the changes in the underlying excitation–inhibition interactions? By manipulating the normalization strength independent of attentional load in macaque monkeys, we show that gamma power increases with increasing normalization, even when the attentional load is fixed. Further, manipulations of attention that increase normalization increase gamma power, even when they decrease the firing rate. Thus, gamma rhythms could be a reflection of changes in the relative strengths of excitation and normalization rather than playing a functional role in communication or control. PMID:23393427
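
    The normalization account referenced above can be made concrete with a toy computation, loosely in the style of divisive normalization models of attention: attention multiplies a neuron's drive, and that boosted drive also feeds the suppressive pool. All constants and the simplified pooling below are illustrative assumptions.

    import numpy as np

    def normalized_response(drive, pool_drives, attention=1.0, sigma=1.0):
        """Attention scales the excitatory drive; the boosted drive also
        strengthens the divisive normalization pool."""
        e = attention * drive
        return e / (sigma + e + pool_drives.sum())

    pool = np.array([2.0, 3.0, 1.5])       # drives of neighboring pool neurons
    for att in (1.0, 2.0):
        print(att, normalized_response(4.0, pool, attention=att))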

  12. Self-Esteem of Gifted, Normal, and Mild Mentally Handicapped Children.

    Science.gov (United States)

    Chiu, Lian-Hwang

    1990-01-01

    Administered Coopersmith Self-Esteem Inventory (SEI) Form B to elementary school students (N=450) identified as gifted, normal, and mild mentally handicapped (MiMH). Results indicated that both the gifted and normal children had significantly higher self-esteem than did the MiMH children, but there were no differences between gifted and normal…

  13. Evaluation of Postprandial Total Antioxidant Activity in Normal and Overweight Individuals

    Directory of Open Access Journals (Sweden)

    Fatma Arslan

    2016-09-01

    Aim: Postprandial changes acutely alter several mechanisms in the body. Many studies have shown changes in blood oxidative status after food intake and supplementation. The aim of the present study was to evaluate the effects of a standardized meal on serum total antioxidant activity (TAA) in normal-weight and overweight individuals. Material and Method: Fourteen normal-weight and twelve overweight individuals were given a standardized meal in the morning after an overnight fast. Serum TAA, glucose, total cholesterol, HDL cholesterol, LDL cholesterol, and triglyceride concentrations were measured at baseline and at the 3rd and 6th hour after the meal in both groups. Results: In both the normal-weight and overweight groups, the difference in TAA between baseline and the 3rd hour was significant. The TAA of the overweight group was also significantly lower than that of the normal-weight group at the 3rd hour. However, there was no significant correlation between lipid parameters and TAA levels. Discussion: The present study shows that postprandial oxidative damage occurs more prominently in overweight than in normal-weight individuals. Postprandial changes acutely induce oxidative stress and impair the natural antioxidant defense mechanism. Consuming foods containing antioxidants in order to avoid various diseases and complications is therefore useful, particularly in obese subjects.

  14. Application of Power Geometry and Normal Form Methods to the Study of Nonlinear ODEs

    Directory of Open Access Journals (Sweden)

    Edneral Victor

    2018-01-01

    This paper describes power transformations of degenerate autonomous polynomial systems of ordinary differential equations which reduce such systems to a non-degenerate form. An example of constructing exact first integrals of motion of a planar degenerate system in closed form is given.

  15. Design of Normal Concrete Mixtures Using Workability-Dispersion-Cohesion Method

    Directory of Open Access Journals (Sweden)

    Hisham Qasrawi

    2016-01-01

    The workability-dispersion-cohesion method is a newly proposed method for the design of normal concrete mixes. The method uses special coefficients called workability-dispersion and workability-cohesion factors. These coefficients relate workability to the mobility and stability of the concrete mix. The coefficients are obtained from special charts depending on mix requirements and aggregate properties. The method is practical because it covers various types of aggregates that may not be within standard specifications, different water-to-cement ratios, and various degrees of workability. Simple linear relationships were developed for the variables encountered in the mix design and were presented in graphical form. The method can be used in countries where the grading or fineness of the available materials differs from the common international specifications (such as ASTM or BS). Results were compared to the ACI and British methods of mix design. The method can be extended to cover all types of concrete.

  16. 40 CFR 406.30 - Applicability; description of the normal wheat flour milling subcategory.

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment, Section 406.30 (2010-07-01): Applicability; description of the normal wheat flour milling subcategory. Environmental Protection Agency (continued), Effluent Guidelines and Standards, Grain Mills Point Source Category, Normal...

  17. Method for construction of normalized cDNA libraries

    Science.gov (United States)

    Soares, Marcelo B.; Efstratiadis, Argiris

    1998-01-01

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to an appropriate C₀t; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library. This invention also provides normalized cDNA libraries generated by the above-described method and uses of the generated libraries.

  18. Normal central retinal function and structure preserved in retinitis pigmentosa.

    Science.gov (United States)

    Jacobson, Samuel G; Roman, Alejandro J; Aleman, Tomas S; Sumaroka, Alexander; Herrera, Waldo; Windsor, Elizabeth A M; Atkinson, Lori A; Schwartz, Sharon B; Steinberg, Janet D; Cideciyan, Artur V

    2010-02-01

    To determine whether normal function and structure, as recently found in forms of Usher syndrome, also occur in a population of patients with nonsyndromic retinitis pigmentosa (RP). Patients with simplex, multiplex, or autosomal recessive RP (n = 238; ages 9-82 years) were studied with static chromatic perimetry. A subset was evaluated with optical coherence tomography (OCT). Co-localized visual sensitivity and photoreceptor nuclear layer thickness were measured across the central retina to establish the relationship of function and structure. Comparisons were made to patients with Usher syndrome (n = 83, ages 10-69 years). Cross-sectional psychophysical data identified patients with RP who had normal rod- and cone-mediated function in the central retina. There were two other patterns with greater dysfunction, and longitudinal data confirmed that progression can occur from normal rod and cone function to cone-only central islands. The retinal extent of normal laminar architecture by OCT corresponded to the extent of normal visual function in patients with RP. Central retinal preservation of normal function and structure did not show a relationship with age or retained peripheral function. Usher syndrome results were like those in nonsyndromic RP. Regional disease variation is a well-known finding in RP. Unexpected was the observation that patients with presumed recessive RP can have regions with functionally and structurally normal retina. Such patients will require special consideration in future clinical trials of either focal or systemic treatment. Whether there is a common molecular mechanism shared by forms of RP with normal regions of retina warrants further study.

  19. Analysis of Fringe Field Formed Inside LDA Measurement Volume Using Compact Two Hololens Imaging Systems

    Science.gov (United States)

    Ghosh, Abhijit; Nirala, A. K.; Yadav, H. L.

    2018-03-01

    We have designed and fabricated four LDA optical setups consisting of four different aberration-compensated compact two-hololens imaging systems. We have experimentally investigated and realized a hololens recording geometry in which the hologram is the interferogram of a converging spherical wavefront with a mutually coherent planar wavefront. The proposed real-time monitoring and fringe-field analysis techniques allow complete characterization of the fringes formed at the measurement volume and permit evaluation of beam quality, alignment, and fringe uniformity with greater precision. After experimentally analyzing the fringes formed at the measurement volume by all four imaging systems, it is found that the fringes obtained using the compact two-hololens imaging systems improve both qualitatively and quantitatively compared with those obtained using the conventional imaging system. The results indicate a qualitative improvement in the uniformity of fringe thickness and in the micro-intensity variations perpendicular to the fringes, and a quantitative improvement of 39.25% in the overall average normalized standard deviation of the fringe width formed by the compact two-hololens imaging systems compared with that of the conventional imaging system.

  20. A hybrid electron and photon IMRT planning technique that lowers normal tissue integral patient dose using standard hardware.

    Science.gov (United States)

    Rosca, Florin

    2012-06-01

    To present a mixed electron and photon IMRT planning technique using electron beams with an energy range of 6-22 MeV and standard hardware that minimizes integral dose to patients for targets as deep as 7.5 cm. Ten brain cases, two lung cases, a thyroid case, an abdominal case, and a parotid case were planned using two planning techniques: photon-only IMRT (IMRT) versus a mixed-modality treatment (E+IMRT) that includes an en face electron beam and a photon IMRT portion that ensures uniform target coverage. The electron beam is delivered using a regular cutout placed in an electron cone. The electron energy was chosen to provide a good trade-off between minimizing integral dose and generating a uniform, deliverable plan. The authors chose electron energies that cover the deepest part of the PTV with the 65%-70% isodose line. The normal tissue integral dose, the dose to ring structures around the PTV, and the volumes of the 75%, 50%, and 25% isosurfaces were used to compare the dose distributions generated by the two planning techniques. The normal tissue integral dose was lowered by about 20% by the E+IMRT plans compared to the photon-only IMRT plans for most studied cases. With the exception of the lungs, the dose reduction associated with the E+IMRT plans was more pronounced farther away from the target. The average dose ratios delivered to the 0-2 cm and 2-4 cm ring structures for brain patients for the two planning techniques were 89.6% and 70.8%, respectively. The enhanced dose sparing away from the target for the brain patients can also be observed in the ratios of the 75%, 50%, and 25% isodose line volumes for the two techniques, which decrease from 85.5% to 72.6% and further to 65.1%, respectively. For lungs, the lateral electron beams used in the E+IMRT plans were perpendicular to the mostly anterior/posterior photon beams, generating much more conformal plans. The authors proved that even using the existing electron delivery hardware, a mixed electron/photon planning

  1. Normalized inverse characterization of sound absorbing rigid porous media.

    Science.gov (United States)

    Zieliński, Tomasz G

    2015-06-01

    This paper presents a methodology for the inverse characterization of sound absorbing rigid porous media, based on standard measurements of the surface acoustic impedance of a porous sample. The model parameters need to be normalized to have a robust identification procedure which fits the model-predicted impedance curves with the measured ones. Such a normalization provides a substitute set of dimensionless (normalized) parameters unambiguously related to the original model parameters. Moreover, two scaling frequencies are introduced, however, they are not additional parameters and for different, yet reasonable, assumptions of their values, the identification procedure should eventually lead to the same solution. The proposed identification technique uses measured and computed impedance curves for a porous sample not only in the standard configuration, that is, set to the rigid termination piston in an impedance tube, but also with air gaps of known thicknesses between the sample and the piston. Therefore, all necessary analytical formulas for sound propagation in double-layered media are provided. The methodology is illustrated by one numerical test and by two examples based on the experimental measurements of the acoustic impedance and absorption of porous ceramic samples of different thicknesses and a sample of polyurethane foam.

  2. 48 CFR 49.602-2 - Inventory forms.

    Science.gov (United States)

    2010-10-01

    Title 48, Federal Acquisition Regulations System, Section 49.602-2 (2010-10-01): Inventory forms. Termination of Contracts, Contract Termination Forms and Formats. Standard Form (SF) 1428, Inventory Disposal Schedule, and SF 1429, Inventory Disposal Schedule—Continuation Sheet, shall...

  3. Clinical and psychological features of normal-weight women with subthreshold anorexia nervosa: a pilot case-control observational study.

    Science.gov (United States)

    Tagliabue, Anna; Ferraris, Cinzia; Martinelli, Valentina; Pinelli, Giovanna; Repossi, Ilaria; Trentani, Claudia

    2012-01-01

    Weight preoccupations have been frequently reported in normal-weight subjects. Subthreshold anorexia nervosa (s-AN, all DSM-IV-TR criteria except amenorrhea or underweight) is a form of eating disorder not otherwise specified that has received scarce scientific attention. Under a case-control design, we compared the general characteristics, body composition, and psychopathological features of normal-weight patients with s-AN with those of BMI- and sex-matched controls. Participants in this pilot study included 9 normal-weight women who met the DSM-IV-TR criteria for s-AN and 18 BMI-matched normal-weight controls. The general characteristics of the study participants were collected by questionnaire. Body composition was measured by bioelectrical impedance. Behavioral and psychological measures included the standardized symptom checklist (SCL-90-R) and the eating disorder inventory (EDI-2). There were no differences in age, education, employment status, marital status, or history of previous slimming treatment between the two study groups. In addition, the anthropometric measures and body composition of s-AN patients and BMI-matched normal-weight controls were not significantly different. In the s-AN subgroup, we found a significant relationship between waist circumference and the SCL-90-R obsessivity-compulsivity scale (n=9, r=-0.69, p<0.05) in the study cohort. These pilot results suggest that psychopathological criteria (particularly those related to the obsessivity-compulsivity dimension) may be more useful than anthropometric measures for screening for s-AN in normal-weight women.

  4. Disjoint sum forms in reliability theory

    Directory of Open Access Journals (Sweden)

    B. Anrig

    2014-01-01

    The structure function f of a binary monotone system is assumed to be known and given in a disjunctive normal form, i.e. as the logical union of products of the indicator variables of the states of its subsystems. Based on this representation of f, an improved Abraham algorithm is proposed for generating the disjoint sum form of f. This form is the base for subsequent numerical reliability calculations. The approach is generalized to multivalued systems. Examples are discussed.
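
    The core disjointing step of Abraham-style algorithms can be sketched as follows; this is a plain (unimproved) version for binary monotone systems with products given as sets of component names, not the improved algorithm the article proposes.

```python
# Plain sketch of converting a monotone DNF into a disjoint sum of products.
# Each input product is a set of components that must work; each output term
# is a pair (working, failed) of component sets, and the terms are pairwise
# disjoint events, so their probabilities may simply be added.
def disjoint_sum(products):
    result, seen = [], []
    for prod in map(frozenset, products):
        terms = [(prod, frozenset())]
        for q in seen:
            next_terms = []
            for pos, neg in terms:
                if q & neg:                 # already disjoint from q's event
                    next_terms.append((pos, neg))
                    continue
                missing = sorted(q - pos)
                if not missing:             # term lies inside q: drop it
                    continue
                for i, x in enumerate(missing):
                    next_terms.append((pos | frozenset(missing[:i]),
                                       neg | frozenset([x])))
            terms = next_terms
        result.extend(terms)
        seen.append(prod)
    return result

# disjoint_sum([{'a', 'b'}, {'b', 'c'}])
# -> [({'a','b'}, {}), ({'b','c'}, {'a'})]; with independent components,
#    P(system) = P(a)P(b) + P(b)P(c)(1 - P(a)).
```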

  5. Status of conversion of DOE standards to non-Government standards

    Energy Technology Data Exchange (ETDEWEB)

    Moseley, H.L.

    1992-07-01

    One major goal of the DOE Technical Standards Program is to convert existing DOE standards into non-Government standards (NGS's) where possible. This means that a DOE standard may form the basis for a standards-writing committee to produce a standard in the same subject area using the non-Government standards consensus process. This report is a summary of the activities that have evolved to effect conversion of DOE standards to NGSs, and the status of current conversion activities. In some cases, all requirements in a DOE standard will not be incorporated into the published non-Government standard because these requirements may be considered too restrictive or too specific for broader application by private industry. If requirements in a DOE standard are not incorporated in a non-Government standard and the requirements are considered necessary for DOE program applications, the DOE standard will be revised and issued as a supplement to the non-Government standard. The DOE standard will contain only those necessary requirements not reflected by the non-Government standard. Therefore, while complete conversion of DOE standards may not always be realized, the Department's technical standards policy as stated in Order 1300.2A has been fully supported in attempting to make maximum use of the non-Government standard.

  7. Measurements of normal joint angles by goniometry in calves.

    Science.gov (United States)

    Sengöz Şirin, O; Timuçin Celik, M; Ozmen, A; Avki, S

    2014-01-01

    The aim of this study was to establish normal reference values of the forelimb and hindlimb joint angles in normal Holstein calves. Thirty clinically normal Holstein calves that were free of any detectable musculoskeletal abnormalities were included in the study. A standard transparent plastic goniometer was used to measure maximum flexion, maximum extension, and range of motion of the shoulder, elbow, carpal, hip, stifle, and tarsal joints. The goniometric measurements were done on awake calves positioned in lateral recumbency, and the values were measured and recorded by two independent investigators. The goniometric values obtained from awake calves in lateral recumbency were found to be highly consistent and accurate between investigators (p < 0.05). This study provides objective and useful information on the normal forelimb and hindlimb joint angles in Holstein calves. Further studies can be done to establish goniometric values in various diseases and compare them with these normal values.

  8. Results of radionuclide ventriculography in normal children and adolescents

    International Nuclear Information System (INIS)

    Reich, O.; Krejcir, M.; Ruth, C.

    1989-01-01

    In order to assess the range of normal values in radionuclide ventriculography, 53 normal children and adolescents were selected in retrospect. All were examined by radionuclide angiocardiography on account of clinical and echocardiographic suspicion of congenital heart disease with a left-to-right shunt; a significant shunt was, however, excluded. In all patients, after equilibration of the radiopharmaceutical, ventricular function was examined by radionuclide ventriculography. The usual volume, time and rate characteristics were evaluated. The normal range was defined as the mean ± 2 standard deviations, which is 47 to 72% for the ejection fraction of the left ventricle and 31 to 56% for the ejection fraction of the right ventricle. (author). 2 tabs., 18 refs
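
    The normal-range convention used here (mean ± 2 standard deviations) is straightforward to reproduce; a generic sketch:

```python
# Generic sketch of the mean +/- 2 SD reference-range convention used above.
import numpy as np

def normal_range(values):
    values = np.asarray(values, dtype=float)
    mean, sd = values.mean(), values.std(ddof=1)   # sample standard deviation
    return mean - 2.0 * sd, mean + 2.0 * sd
```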

  9. 7 CFR 51.315 - Fairly well formed.

    Science.gov (United States)

    2010-01-01

    ... Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE REGULATIONS AND STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF 1946... Standards for Grades of Apples Definitions § 51.315 Fairly well formed. “Fairly well formed” means that the...

  10. 40 CFR 417.165 - Standards of performance for new sources.

    Science.gov (United States)

    2010-07-01

    ...) EFFLUENT GUIDELINES AND STANDARDS SOAP AND DETERGENT MANUFACTURING POINT SOURCE CATEGORY Manufacture of Liquid Detergents Subcategory § 417.165 Standards of performance for new sources. The following standards...) For normal liquid detergent operations the following values pertain: Effluent characteristic Effluent...

  11. Standard test method for splitting tensile strength for brittle nuclear waste forms

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1989-01-01

    1.1 This test method is used to measure the static splitting tensile strength of cylindrical specimens of brittle nuclear waste forms. It provides splitting tensile-strength data that can be used to compare the strength of waste forms when tests are done on one size of specimen. 1.2 The test method is applicable to glass, ceramic, and concrete waste forms that are sufficiently homogeneous (Note 1) but not to coated-particle, metal-matrix, bituminous, or plastic waste forms, or concretes with large-scale heterogeneities. Cementitious waste forms with heterogeneities >1 to 2 mm and <5 mm can be tested using this procedure provided the specimen size is increased from the reference size of 12.7 mm diameter by 6 mm length, to 51 mm diameter by 100 mm length, as recommended in Test Method C 496 and Practice C 192. Note 1—Generally, the specimen structural or microstructural heterogeneities must be less than about one-tenth the diameter of the specimen. 1.3 This test method can be used as a quality control chec...
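
    For reference, splitting (Brazilian) tests of this kind evaluate the tensile strength from the failure load via the standard relation T = 2P/(πLD), as in Test Method C 496; a small sketch (symbols and SI units are assumptions here):

```python
# Standard splitting (Brazilian) test relation: T = 2P / (pi * L * D).
import math

def splitting_tensile_strength(load_n, length_m, diameter_m):
    return 2.0 * load_n / (math.pi * length_m * diameter_m)

# A 12.7 mm diameter x 6 mm long reference specimen failing at 500 N:
# splitting_tensile_strength(500.0, 0.006, 0.0127)  ->  about 4.2e6 Pa
```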

  12. Comparative Study of Various Normal Mode Analysis Techniques Based on Partial Hessians

    OpenAIRE

    GHYSELS, AN; VAN SPEYBROECK, VERONIQUE; PAUWELS, EWALD; CATAK, SARON; BROOKS, BERNARD R.; VAN NECK, DIMITRI; WAROQUIER, MICHEL

    2010-01-01

    Standard normal mode analysis becomes problematic for complex molecular systems, as a result of both the high computational cost and the excessive amount of information when the full Hessian matrix is used. Several partial Hessian methods have been proposed in the literature, yielding approximate normal modes. These methods aim at reducing the computational load and/or calculating only the relevant normal modes of interest in a specific application. Each method has its own (dis)advantages and...
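
    For comparison with the partial-Hessian variants discussed, standard (full-Hessian) normal-mode analysis amounts to diagonalizing the mass-weighted Hessian; a minimal sketch:

```python
# Minimal sketch of full normal-mode analysis: mass-weight the (3N x 3N)
# Cartesian Hessian and diagonalize; eigenvalues are squared angular
# frequencies, eigenvectors the normal modes. Partial-Hessian methods
# replace `hessian` by a block for the atoms of interest.
import numpy as np

def normal_modes(hessian, masses):
    m = np.repeat(np.asarray(masses, dtype=float), 3)   # one mass per x, y, z
    mw_hessian = hessian / np.sqrt(np.outer(m, m))
    omega2, modes = np.linalg.eigh(mw_hessian)
    return omega2, modes
```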

  13. Dosimetry standards for radiation processing

    International Nuclear Information System (INIS)

    Farrar, H. IV

    1999-01-01

    For irradiation treatments to be reproducible in the laboratory and then in the commercial environment, and for products to have certified absorbed doses, standardized dosimetry techniques are needed. This need is being satisfied by standards being developed by experts from around the world under the auspices of Subcommittee E10.01 of the American Society for Testing and Materials (ASTM). In the time period since it was formed in 1984, the subcommittee has grown to 150 members from 43 countries, representing a broad cross-section of industry, government and university interests. With cooperation from other international organizations, it has taken the combined part-time effort of all these people more than 13 years to complete 24 dosimetry standards. Four are specifically for food irradiation or agricultural applications, but the majority apply to all forms of gamma, x-ray, Bremsstrahlung and electron beam radiation processing, including dosimetry for sterilization of health care products and the radiation processing of fruits, vegetables, meats, spices, processed foods, plastics, inks, medical wastes and paper. An additional 6 standards are under development. Most of the standards provide exact procedures for using individual dosimetry systems or for characterizing various types of irradiation facilities, but one covers the selection and calibration of dosimetry systems, and another covers the treatment of uncertainties. Together, this set of standards covers essentially all aspects of dosimetry for radiation processing. The first 20 of these standards have been adopted in their present form by the International Organization of Standardization (ISO), and will be published by ISO in 1999. (author)

  14. A Denotational Account of Untyped Normalization by Evaluation

    DEFF Research Database (Denmark)

    Filinski, Andrzej; Rohde, Henning Korsholm

    2004-01-01

    We show that the standard normalization-by-evaluation construction for the simply-typed λβη-calculus has a natural counterpart for the untyped λβ-calculus, with the central type-indexed logical relation replaced by a recursively defined invariant relation, in the style of Pitts. In fact, the cons...
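
    The shape of the construction — evaluate syntax into a semantic domain, then reify values back into syntax — can be conveyed by a toy sketch; this illustrates the idea only, not the denotational machinery of the paper.

```python
# Toy untyped normalization by evaluation: terms are nested tuples, semantic
# functions are Python closures, and free variables become neutral terms.
import itertools

fresh = itertools.count()

def evaluate(term, env):
    tag = term[0]
    if tag == 'var':
        return env[term[1]]
    if tag == 'lam':                      # ('lam', x, body)
        return ('fun', lambda v: evaluate(term[2], {**env, term[1]: v}))
    f, a = evaluate(term[1], env), evaluate(term[2], env)   # ('app', f, a)
    return f[1](a) if f[0] == 'fun' else ('neu', ('app', f, a))

def reify(value):
    if value[0] == 'fun':                 # apply to a fresh neutral variable
        x = f'x{next(fresh)}'
        return ('lam', x, reify(value[1](('neu', ('var', x)))))
    neutral = value[1]
    if neutral[0] == 'var':
        return ('var', neutral[1])
    return ('app', reify(neutral[1]), reify(neutral[2]))

def normalize(term):
    return reify(evaluate(term, {}))

# normalize(('app', ('lam', 'x', ('var', 'x')), ('lam', 'y', ('var', 'y'))))
# -> ('lam', 'x0', ('var', 'x0'))   (the identity, alpha-renamed)
```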

  15. Investigation of normal organ development with fetal MRI

    International Nuclear Information System (INIS)

    Prayer, Daniela; Brugger, Peter C.

    2007-01-01

    The understanding of the presentation of normal organ development on fetal MRI forms the basis for recognition of pathological states. During the second and third trimesters, maturational processes include changes in size, shape and signal intensities of organs. Visualization of these developmental processes requires tailored MR protocols. Further prerequisites for recognition of normal maturational states are unequivocal intrauterine orientation with respect to left and right body halves, fetal proportions, and knowledge about the MR presentation of extrafetal/intrauterine organs. Emphasis is laid on the demonstration of normal MR appearance of organs that are frequently involved in malformation syndromes. In addition, examples of time-dependent contrast enhancement of intrauterine structures are given. (orig.)

  16. Investigation of normal organ development with fetal MRI

    Energy Technology Data Exchange (ETDEWEB)

    Prayer, Daniela [Medical University of Vienna, Department of Radiology, Vienna (Austria); Brugger, Peter C. [Medical University of Vienna, Center of Anatomy and Cell Biology, Integrative Morphology Group, Vienna (Austria)

    2007-10-15

    The understanding of the presentation of normal organ development on fetal MRI forms the basis for recognition of pathological states. During the second and third trimesters, maturational processes include changes in size, shape and signal intensities of organs. Visualization of these developmental processes requires tailored MR protocols. Further prerequisites for recognition of normal maturational states are unequivocal intrauterine orientation with respect to left and right body halves, fetal proportions, and knowledge about the MR presentation of extrafetal/intrauterine organs. Emphasis is laid on the demonstration of normal MR appearance of organs that are frequently involved in malformation syndromes. In addition, examples of time-dependent contrast enhancement of intrauterine structures are given. (orig.)

  17. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    International Nuclear Information System (INIS)

    Shao, Kan; Gift, Jeffrey S.; Setzer, R. Woodrow

    2013-01-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD) which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more
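
    The underlying distribution-choice question (though not the paper's hybrid or relative-deviation machinery) can be probed directly by fitting both candidate distributions to individual-animal responses and comparing log-likelihoods; a sketch:

```python
# Fit the same continuous responses under normal and log-normal assumptions
# and compare log-likelihoods (higher is a better fit, other things equal).
import numpy as np
from scipy import stats

def compare_assumptions(responses):
    responses = np.asarray(responses, dtype=float)
    mu, sd = stats.norm.fit(responses)
    ll_normal = stats.norm.logpdf(responses, mu, sd).sum()
    s, loc, scale = stats.lognorm.fit(responses, floc=0.0)
    ll_lognormal = stats.lognorm.logpdf(responses, s, loc, scale).sum()
    return {'normal': ll_normal, 'log-normal': ll_lognormal}
```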

  18. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    Energy Technology Data Exchange (ETDEWEB)

    Shao, Kan, E-mail: Shao.Kan@epa.gov [ORISE Postdoctoral Fellow, National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Gift, Jeffrey S. [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Setzer, R. Woodrow [National Center for Computational Toxicology, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States)

    2013-11-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD) which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more

  19. Transformation of an empirical distribution to normal distribution by the use of Johnson system of translation and symmetrical quantile method

    OpenAIRE

    Ludvík Friebel; Jana Friebelová

    2006-01-01

    This article deals with the approximation of an empirical distribution to the standard normal distribution using the Johnson transformation. This transformation enables us to approximate a wide spectrum of continuous distributions with a normal distribution. The estimation of the parameters of the transformation formulas is based on percentiles of the empirical distribution. Theoretical probability distribution functions are derived for the random variable obtained by the backward transformation of the standard normal ...
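
    A minimal sketch of a Johnson S_U transformation to normality (one member of the Johnson system); the parameters here are fitted by maximum likelihood rather than the percentile-based method the article uses:

```python
# Transform a skewed sample to approximate standard normality via Johnson S_U:
# if X ~ JohnsonSU(a, b, loc, scale), then a + b*asinh((X - loc)/scale) ~ N(0,1).
import numpy as np
from scipy import stats

x = stats.lognorm.rvs(0.8, size=1000, random_state=0)   # skewed test sample
a, b, loc, scale = stats.johnsonsu.fit(x)
z = a + b * np.arcsinh((x - loc) / scale)
print(stats.shapiro(z))   # check how close to normal the transformed data are
```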

  20. Primary hafnium metal sponge and other forms, approved standard 1973

    International Nuclear Information System (INIS)

    Anon.

    1975-01-01

    A specification is presented covering virgin hafnium metal commonly designated as sponge because of its porous, sponge-like texture; it may also be in other forms such as chunklets. The specification does not cover crystal bar.

  1. Late effects of normal tissues (lent) scoring system: the soma scale

    International Nuclear Information System (INIS)

    Mornex, F.; Pavy, J.J.; Denekamp, J.

    1997-01-01

    Radiation tolerance of normal tissues remains the limiting factor for delivering a tumoricidal dose. The late toxicity of normal tissues is the most critical element of an irradiation: somatic, functional and structural alterations occur during the actual treatment itself, but late effects manifest months to years after acute effects heal, and may progress with time. The optimal therapeutic ratio ultimately requires not only complete tumor clearance, but also minimal residual injury to surrounding vital normal tissues. The disparity between the intensity of acute and late effects, and the inability to predict the eventual manifestation of late normal tissue injury, has made radiation oncologists recognize the importance of careful patient follow-up. There is so far no uniform toxicity scoring system, and clinical studies cannot be compared in the absence of a 'common toxicity language'. This justifies the need to establish a precise evaluation system for the analysis of late effects of radiation on normal tissues. The SOMA/LENT scoring system results from an international collaboration: the European Organization for Research and Treatment of Cancer (EORTC) and the Radiation Therapy Oncology Group (RTOG) have created subcommittees with the aim of addressing the question of standardized toxicity criteria. This effort arose from the necessity to standardize and improve data recording, and then to describe and evaluate toxicity uniformly at regular time intervals. The currently proposed scale is not yet validated, and should be used cautiously. (authors)

  2. The anti-tumor efficacy of nanoparticulate form of ICD-85 versus free form

    Directory of Open Access Journals (Sweden)

    Zare Mirakabadi, A.

    2015-04-01

    Biodegradable polymeric nanoparticles (NPs) have been intensively studied as a possible way to enhance anti-tumor efficacy while reducing side effects. ICD-85, derived from the venom of two separate species of venomous animals, has been shown to exhibit anti-cancer activity. In this report, polymer-based sodium alginate nanoparticles of ICD-85 were used to enhance its therapeutic effects and reduce its side effects. The inhibitory effect was evaluated by MTT assay. The necrotic effect was assessed using the LDH assay. The induction of apoptosis was analyzed by a caspase-8 colorimetric assay kit. A cytotoxicity assay in HeLa cells demonstrated enhanced efficacy of ICD-85-loaded NPs compared to free ICD-85. The IC50 values obtained in HeLa cells after 48 h for free ICD-85 and ICD-85-loaded NPs were 26±2.9 μg ml-1 and 18±2.5 μg ml-1, respectively. While free ICD-85 exhibits mild cytotoxicity towards normal MRC-5 cells (IC50 > 60 μg ml-1), ICD-85-loaded NPs were found to have higher anti-proliferative activity on HeLa cells in vitro without any significant cytotoxic effect on normal MRC-5 cells. The apoptosis-induction mechanism of both forms of ICD-85 in HeLa cells was found to be through activation of caspase-8, approximately 2-fold greater for ICD-85-loaded NPs than for free ICD-85. Our work reveals that although ICD-85 in free form is relatively selective in inhibiting the growth of cancer cells via apoptosis as compared to normal cells, the nanoparticulate form further increases its selectivity towards cancer cells.

  3. Standardisation in standards

    International Nuclear Information System (INIS)

    McDonald, J. C.

    2012-01-01

    The following observations are offered by one who has served on national and international standards-writing committees and standards review committees. Service on working groups consists of either updating previous standards or developing new standards. The process of writing either type of document proceeds along similar lines. The first order of business is to recognise the need for developing or updating a standard and to identify the potential user community. It is also necessary to ensure that there is a required number of members willing to do the writing. A justification is required as to why a new standard should be developed, and this is written as a new work item proposal or a project initiation notification system form. This document must be filed officially and approved, and a search is then undertaken to ensure that the proposed new standard will not duplicate a standard that has already been published or is underway in another standards organisation. (author)

  4. ANS shielding standards for light-water reactors

    International Nuclear Information System (INIS)

    Trubey, D.K.

    1982-01-01

    The purpose of the American Nuclear Society Standards Subcommittee, ANS-6, Radiation Protection and Shielding, is to develop standards for radiation protection and shield design, to provide shielding information to other standards-writing groups, and to develop standard reference shielding data and test problems. A total of seven published ANS-6 standards are now current. Additional projects of the subcommittee, now composed of nine working groups, include: standard reference data for multigroup cross sections, gamma-ray absorption coefficients and buildup factors, additional benchmark problems for shielding and energy spectrum unfolding, power plant zoning design for normal and accident conditions, process radiation monitors, and design for postaccident radiological conditions

  5. Air Force standards for nickel hydrogen battery

    Science.gov (United States)

    Hwang, Warren; Milden, Martin

    1994-01-01

    The topics discussed are presented in viewgraph form and include Air Force nickel hydrogen standardization goals, philosophy, project outline, cell level standardization, battery level standardization, and schedule.

  6. About the principles of radiation level normalization

    International Nuclear Information System (INIS)

    Nosovskij, A.V.

    2000-01-01

    The paper highlights the impact of radiation level normalization principles on social and economic indicators. The newly introduced Radiation Safety Standards-97 are taken as an example. It is emphasized that a sound approach is necessary when defining radiation protection standards, taking into consideration the economic and social factors existing in Ukraine at the moment. Based on the concept of the natural radiation background and the available results of epidemiological surveys, dose limits are proposed for the radiation protection standards. The paper describes the dose limitation system recommended by the International Commission on Radiological Protection. It also highlights the negative impact of the linear no-threshold concept, and of the lack of specialist knowledge in the medical services and mass media, on decisions made to protect people who suffered from the Chernobyl accident

  7. Biowaiver monograph for immediate-release solid oral dosage forms: bisoprolol fumarate.

    Science.gov (United States)

    Charoo, Naseem A; Shamsher, Areeg A A; Lian, Lai Y; Abrahamsson, Bertil; Cristofoletti, Rodrigo; Groot, D W; Kopp, Sabine; Langguth, Peter; Polli, James; Shah, Vinod P; Dressman, Jennifer

    2014-02-01

    Literature data relevant to the decision to allow a waiver of in vivo bioequivalence (BE) testing for the approval of immediate-release (IR) solid oral dosage forms containing bisoprolol as the sole active pharmaceutical ingredient (API) are reviewed. Bisoprolol is classified as a Class I API according to the current Biopharmaceutics Classification System (BCS). In addition to the BCS class, its therapeutic index, pharmacokinetic properties, data related to the possibility of excipient interactions, and reported BE/bioavailability problems are taken into consideration. Qualitative compositions of IR tablet dosage forms of bisoprolol with a marketing authorization (MA) in ICH (International Conference on Harmonisation) countries are tabulated. It was inferred that these tablets had been demonstrated to be bioequivalent to the innovator product. No reports of failure to meet BE standards have been made in the open literature. On the basis of all these pieces of evidence, a biowaiver can currently be recommended for bisoprolol fumarate IR dosage forms if (1) the test product contains only excipients that are well known, and used in normal amounts, for example, those tabulated for products with MA in ICH countries and (2) both the test and comparator dosage form are very rapidly dissolving, or, rapidly dissolving with similarity of the dissolution profiles demonstrated at pH 1.2, 4.5, and 6.8. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.

  8. International Construction Measurement Standard

    OpenAIRE

    Mitchell, Charles

    2016-01-01

    The International Construction Measurement Standard Coalition (the Coalition) was formed on 17 June 2015 after meeting at the International Monetary Fund in Washington DC, USA. The Coalition, comprising the organisations listed below at the date of publication, aims to bring about consistency in construction cost reporting standards internationally. This is achieved by the creation and adoption of this ICMS, an agreed international standard for the structuring and presentation of cost reports...

  9. The Standard Model in noncommutative geometry: fundamental fermions as internal forms

    Science.gov (United States)

    Dąbrowski, Ludwik; D'Andrea, Francesco; Sitarz, Andrzej

    2018-05-01

    Given the algebra, Hilbert space H, grading and real structure of the finite spectral triple of the Standard Model, we classify all possible Dirac operators such that H is a self-Morita equivalence bimodule for the associated Clifford algebra.

  10. Data Normalization to Accelerate Training for Linear Neural Net to Predict Tropical Cyclone Tracks

    Directory of Open Access Journals (Sweden)

    Jian Jin

    2015-01-01

    When a pure linear neural network (PLNN) is used to predict tropical cyclone tracks (TCTs) in the South China Sea, whether the data is normalized or not greatly affects the training process. In this paper, the min-max method and the normal distribution method, instead of the standard normal distribution, are applied to TCT data before modeling. We propose experimental schemes in which, with the min-max method, the min-max value pair of each variable is mapped to (−1, 1) and (0, 1); with the normal distribution method, each variable's mean and standard deviation pair is set to (0, 1) and (100, 1). We present the following results: (1) data scaled to similar intervals have similar effects, no matter whether the min-max or the normal distribution method is used; (2) mapping data to around 0 gains much faster training speed than mapping them to intervals far away from 0 or using unnormalized raw data, although, judging from their training error curves, all of them can approach the same lower error level after a certain number of steps. This could be useful for deciding on a data normalization method when a PLNN is used individually.
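
    The two scalings compared in the paper reduce to a few lines each; a sketch with the target intervals and (mean, std) pairs from the experiments above:

```python
# Min-max scaling to a target interval, and mean/std scaling to a target
# (mean, std) pair -- the two normalizations compared in the paper.
import numpy as np

def min_max_scale(x, lo=-1.0, hi=1.0):
    x = np.asarray(x, dtype=float)
    return lo + (hi - lo) * (x - x.min()) / (x.max() - x.min())

def mean_std_scale(x, mean=0.0, std=1.0):
    x = np.asarray(x, dtype=float)
    return mean + std * (x - x.mean()) / x.std()

# e.g. min_max_scale(tct_data, 0.0, 1.0) or mean_std_scale(tct_data, 100.0, 1.0)
```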

  11. Lie algebra of conformal Killing–Yano forms

    International Nuclear Information System (INIS)

    Ertem, Ümit

    2016-01-01

    We provide a generalization of the Lie algebra of conformal Killing vector fields to conformal Killing–Yano forms. A new Lie bracket for conformal Killing–Yano forms that corresponds to slightly modified Schouten–Nijenhuis bracket of differential forms is proposed. We show that conformal Killing–Yano forms satisfy a graded Lie algebra in constant curvature manifolds. It is also proven that normal conformal Killing–Yano forms in Einstein manifolds also satisfy a graded Lie algebra. The constructed graded Lie algebras reduce to the graded Lie algebra of Killing–Yano forms and the Lie algebras of conformal Killing and Killing vector fields in special cases. (paper)

  12. A case study: forming an effective quality management system according to ISO 9000 standards

    OpenAIRE

    Zağyapan, Orhan

    1995-01-01

    Ankara : The Faculty of Management and the Graduate School of Business Administration of Bilkent Univ., 1995. Thesis (Master's) -- Bilkent University, 1995. Includes bibliographical references leaves 87-88 In today's world, companies which adopt themselves to certain internationally recognized standards are one step ahead of their competitors. ISO 9000 Quality System Standards captured the most attention among all. The aim of the standard is to provide an international bench...

  13. Using color histogram normalization for recovering chromatic illumination-changed images.

    Science.gov (United States)

    Pei, S C; Tseng, C L; Wu, C C

    2001-11-01

    We propose a novel image-recovery method using the covariance matrix of the red-green-blue (R-G-B) color histogram and tensor theories. The image-recovery method is called the color histogram normalization algorithm. It is known that the color histograms of an image taken under different illuminations are related by a general affine transformation of the R-G-B coordinates. We propose a simplified affine model for applications with illumination variation. This simplified affine model considers the effects of only three basic forms of distortion: translation, scaling, and rotation. According to this principle, we can estimate the affine transformation matrix necessary to recover images whose color distributions are varied as a result of illumination changes. We compare the normalized color histogram of the standard image with that of the tested image. By performing simple linear-algebra operations, we can estimate the matrix of the affine transformation between two images under different illuminations. To demonstrate the performance of the proposed algorithm, we divide the experiments into two parts: computer-simulated images and real images corresponding to illumination changes. Simulation results show that the proposed algorithm is effective for both types of images. We also explain the noise-sensitive skew-rotation estimation that exists in the general affine model and demonstrate that the proposed simplified affine model, without the use of skew rotation, is better than the general affine model for such applications.
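
    One standard way to realize the translation/scaling/rotation idea — not necessarily the exact estimator of the paper — is to match the means and covariance matrices of the two RGB pixel distributions:

```python
# Estimate an affine map (A, t) between two RGB pixel clouds by matching
# means and covariances: A Cov_src A^T = Cov_ref and t = mu_ref - A mu_src.
# This simplified version omits the skew rotation discussed in the paper.
import numpy as np
from scipy.linalg import sqrtm

def affine_between_histograms(pixels_src, pixels_ref):
    # pixels_*: (N, 3) arrays of RGB values
    mu_s, mu_r = pixels_src.mean(axis=0), pixels_ref.mean(axis=0)
    cov_s = np.cov(pixels_src, rowvar=False)
    cov_r = np.cov(pixels_ref, rowvar=False)
    A = np.real(sqrtm(cov_r)) @ np.linalg.inv(np.real(sqrtm(cov_s)))
    t = mu_r - A @ mu_s
    return A, t          # recovered pixel: A @ p + t
```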

  14. 48 CFR 1699.70 - Cost accounting standards.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Cost accounting standards... EMPLOYEES HEALTH BENEFITS ACQUISITION REGULATION CLAUSES AND FORMS COST ACCOUNTING STANDARDS Cost Accounting Standards 1699.70 Cost accounting standards. With respect to all experience-rated contracts currently...

  15. Heart failure: when form fails to follow function.

    Science.gov (United States)

    Katz, Arnold M; Rolett, Ellis L

    2016-02-01

    Cardiac performance is normally determined by architectural, cellular, and molecular structures that determine the heart's form, and by physiological and biochemical mechanisms that regulate the function of these structures. Impaired adaptation of form to function in failing hearts contributes to two syndromes initially called systolic heart failure (SHF) and diastolic heart failure (DHF). In SHF, characterized by high end-diastolic volume (EDV), the left ventricle (LV) cannot eject a normal stroke volume (SV); in DHF, with normal or low EDV, the LV cannot accept a normal venous return. These syndromes are now generally defined in terms of ejection fraction (EF): SHF became 'heart failure with reduced ejection fraction' (HFrEF) while DHF became 'heart failure with normal or preserved ejection fraction' (HFnEF or HFpEF). However, EF is a chimeric index because it is the ratio between SV--which measures function, and EDV--which measures form. In SHF the LV dilates when sarcomere addition in series increases cardiac myocyte length, whereas sarcomere addition in parallel can cause concentric hypertrophy in DHF by increasing myocyte thickness. Although dilatation in SHF allows the LV to accept a greater venous return, it increases the energy cost of ejection and initiates a vicious cycle that contributes to progressive dilatation. In contrast, concentric hypertrophy in DHF facilitates ejection but impairs filling and can cause heart muscle to deteriorate. Differences in the molecular signals that initiate dilatation and concentric hypertrophy can explain why many drugs that improve prognosis in SHF have little if any benefit in DHF. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.

  16. 7 CFR 1724.70 - Standard forms of contracts for borrowers.

    Science.gov (United States)

    2010-01-01

    ... required to use in the planning, design, and construction of their electric systems. Borrowers are not required to use these guidance contract forms in the absence of an agreement to do so. [63 FR 58284, Oct... construction, procurement, engineering services, and architectural services financed by a loan made or...

  17. On matrix superpotential and three-component normal modes

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, R. de Lima [Centro Brasileiro de Pesquisas Fisicas (CBPF), Rio de Janeiro, RJ (Brazil); Lima, A.F. de [Universidade Federal de Campina Grande (UFCG), PB (Brazil). Dept. de Fisica; Mello, E.R. Bezerra de; Bezerra, V.B. [Universidade Federal da Paraiba (UFPB), Joao Pessoa, PB (Brazil). Dept. de Fisica]. E-mails: rafael@df.ufcg.edu.br; aerlima@df.ufcg.edu.br; emello@fisica.ufpb.br; valdir@fisica.ufpb.br

    2007-07-01

    We consider supersymmetric quantum mechanics (SUSY QM) with three-component normal modes for the Bogomol'nyi-Prasad-Sommerfield (BPS) states. An explicit form of the SUSY QM matrix superpotential is presented and the corresponding three-component bosonic zero-mode eigenfunction is investigated. (author)

  18. Comparison of SUVs normalized by lean body mass determined by CT with those normalized by lean body mass estimated by predictive equations in normal tissues

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Woo Hyoung; Kim, Chang Guhn; Kim, Dae Weung [Wonkwang Univ. School of Medicine, Iksan (Korea, Republic of)

    2012-09-15

    Standardized uptake values (SUVs) normalized by lean body mass (LBM) determined by CT were compared with those normalized by LBM estimated using predictive equations (PEs) in normal liver, spleen, and aorta using 18F-FDG PET/CT. Fluorine-18 fluorodeoxyglucose (18F-FDG) positron emission tomography/computed tomography (PET/CT) was conducted on 453 patients. LBM determined by CT was defined in 3 ways (LBM-CT1 to LBM-CT3). Five PEs were used for comparison (LBM-PE1 to LBM-PE5). Tissue SUV normalized by LBM (SUL) was calculated using LBM from each method (SUL-CT1 to SUL-CT3, SUL-PE1 to SUL-PE5). Agreement between methods was assessed by Bland-Altman analysis. Percentage difference and percentage error were also calculated. For all liver SUL-CTs vs. liver SUL-PEs except liver SUL-PE3, the range of biases, SDs of percentage difference, and percentage errors were -0.17-0.24 SUL, 6.15-10.17%, and 25.07-38.91%, respectively. For liver SUL-CTs vs. liver SUL-PE3, the corresponding figures were 0.47-0.69 SUL, 10.90-11.25%, and 50.85-51.55%, respectively, showing the largest percentage errors and positive biases. Irrespective of the magnitudes of the biases, large percentage errors of 25.07-51.55% were observed between liver SUL-CT1-3 and liver SUL-PE1-5. The results of the spleen and aorta SUL-CT and SUL-PE comparisons were almost identical to those for the liver. The present study demonstrated substantial errors in individual SUL-PEs compared with SUL-CTs as a reference value. Normalization of SUV by LBM determined by CT rather than by PEs may be a useful approach to reduce errors in individual SUL-PEs.

  19. Comparison of SUVs normalized by lean body mass determined by CT with those normalized by lean body mass estimated by predictive equations in normal tissues

    International Nuclear Information System (INIS)

    Kim, Woo Hyoung; Kim, Chang Guhn; Kim, Dae Weung

    2012-01-01

    Standardized uptake values (SUVs) normalized by lean body mass (LBM) determined by CT were compared with those normalized by LBM estimated using predictive equations (PEs) in normal liver, spleen, and aorta using 18F-FDG PET/CT. Fluorine-18 fluorodeoxyglucose (18F-FDG) positron emission tomography/computed tomography (PET/CT) was conducted on 453 patients. LBM determined by CT was defined in 3 ways (LBM-CT1 to LBM-CT3). Five PEs were used for comparison (LBM-PE1 to LBM-PE5). Tissue SUV normalized by LBM (SUL) was calculated using LBM from each method (SUL-CT1 to SUL-CT3, SUL-PE1 to SUL-PE5). Agreement between methods was assessed by Bland-Altman analysis. Percentage difference and percentage error were also calculated. For all liver SUL-CTs vs. liver SUL-PEs except liver SUL-PE3, the range of biases, SDs of percentage difference, and percentage errors were -0.17-0.24 SUL, 6.15-10.17%, and 25.07-38.91%, respectively. For liver SUL-CTs vs. liver SUL-PE3, the corresponding figures were 0.47-0.69 SUL, 10.90-11.25%, and 50.85-51.55%, respectively, showing the largest percentage errors and positive biases. Irrespective of the magnitudes of the biases, large percentage errors of 25.07-51.55% were observed between liver SUL-CT1-3 and liver SUL-PE1-5. The results of the spleen and aorta SUL-CT and SUL-PE comparisons were almost identical to those for the liver. The present study demonstrated substantial errors in individual SUL-PEs compared with SUL-CTs as a reference value. Normalization of SUV by LBM determined by CT rather than by PEs may be a useful approach to reduce errors in individual SUL-PEs.

  20. Robust Approach to Verifying the Weak Form of the Efficient Market Hypothesis

    Science.gov (United States)

    Střelec, Luboš

    2011-09-01

    The weak form of the efficient markets hypothesis states that prices incorporate only past information about the asset. An implication of this form of the efficient markets hypothesis is that one cannot detect mispriced assets and consistently outperform the market through technical analysis of past prices. One possible formulation of the efficient market hypothesis used for weak-form tests is that share prices follow a random walk, meaning that returns are realizations of an IID sequence of random variables. Consequently, for verifying the weak form of the efficient market hypothesis, we can use distribution tests, among others, i.e. some tests of normality and/or some graphical methods. Many procedures for testing the normality of univariate samples have been proposed in the literature [7]. Today the most popular omnibus test of normality for general use is the Shapiro-Wilk test. The Jarque-Bera test is the most widely adopted omnibus test of normality in econometrics and related fields. In particular, the Jarque-Bera test (i.e. a test based on the classical measures of skewness and kurtosis) is frequently used when one is more concerned about heavy-tailed alternatives. As these measures are based on moments of the data, this test has a zero breakdown value [2]. In other words, a single outlier can make the test worthless. The reason so many classical procedures are nonrobust to outliers is that the parameters of the model are expressed in terms of moments, and their classical estimators are expressed in terms of sample moments, which are very sensitive to outliers. Another approach to robustness is to concentrate on the parameters of interest suggested by the problem under study. Consequently, novel robust procedures for testing normality are presented in this paper to overcome the shortcomings of classical normality tests in the field of financial data, which are typically characterized by the occurrence of remote data points and additional types of deviations from
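
    Both omnibus tests named above are available off the shelf, which also makes the breakdown problem easy to demonstrate; a sketch:

```python
# Shapiro-Wilk and Jarque-Bera on simulated returns, plus a one-outlier
# demonstration of the zero breakdown value of the moment-based test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=500)        # heavy-tailed 'returns'
print(stats.shapiro(returns))                   # Shapiro-Wilk W, p-value
print(stats.jarque_bera(returns))               # Jarque-Bera statistic, p-value

clean = rng.standard_normal(499)
print(stats.jarque_bera(np.append(clean, 50.0)))  # one outlier dominates
```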

  1. Normal tissue dose-effect models in biological dose optimisation

    International Nuclear Information System (INIS)

    Alber, M.

    2008-01-01

    Sophisticated radiotherapy techniques like intensity-modulated radiotherapy with photons and protons rely on numerical dose optimisation. The evaluation of normal tissue dose distributions that deviate significantly from the common clinical routine, and the mathematical expression of desirable properties of a dose distribution, are difficult. In essence, a dose evaluation model for normal tissues has to express the tissue-specific volume effect. A formalism of local dose-effect measures is presented, which can be applied to serial and parallel responding tissues as well as target volumes and physical dose penalties. These models allow a transparent description of the volume effect and efficient control over the optimum dose distribution. They can be linked to normal tissue complication probability models and the equivalent uniform dose concept. In clinical applications, they provide a means to standardize normal tissue doses in the face of inevitable anatomical differences between patients and a vastly increased freedom to shape the dose, without being overly limiting like sets of dose-volume constraints. (orig.)
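
    One common formalization consistent with the equivalent-uniform-dose concept mentioned above is the generalized EUD, EUD_a = (mean_i d_i^a)^(1/a); a sketch (the paper's own local dose-effect formalism is more general):

```python
# Generalized equivalent uniform dose: gEUD = (mean(d_i ** a)) ** (1/a).
# Large a mimics serially responding tissue (hot spots dominate); a = 1
# gives the mean dose, appropriate for parallel-like response.
import numpy as np

def geud(voxel_doses, a):
    d = np.asarray(voxel_doses, dtype=float)
    return float(np.mean(d ** a) ** (1.0 / a))

# geud(dose_array, a=10.0)   # serial-like organ at risk
```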

  2. of the stomach (ID 345), neutralisation of gastric acid (ID 345), contribution to normal formation of collagen and connective tissue (ID 287, 288, 333, 334, 335, 1405, 1652, 1718, 1719, 1945), maintenance of normal bone (ID 287, 335, 1652, 1718, 1945), maintenance of normal joints (ID 1405, 1652, 1945)

    DEFF Research Database (Denmark)

    Tetens, Inge

    claims in relation to silicon and protection against aluminium accumulation in the brain, cardiovascular health, forming a protective coat on the mucous membrane of the stomach, neutralisation of gastric acid, contribution to normal formation of collagen and connective tissue, maintenance of normal bone...

  3. US Pharmacopeial Convention safety evaluation of menaquinone-7, a form of vitamin K.

    Science.gov (United States)

    Marles, Robin J; Roe, Amy L; Oketch-Rabah, Hellen A

    2017-07-01

    Vitamin K plays important biological roles in maintaining normal blood coagulation, bone mineralization, soft tissue physiology, and neurological development. Menaquinone-7 is a form of vitamin K2 that occurs naturally in some animal-derived and fermented foods. It is also available as an ingredient of dietary supplements. Menaquinone-7 has greater bioavailability than other forms of vitamin K, which has led to increasing sales and use of menaquinone-7 supplements. This special article reviews the chemistry, nomenclature, dietary sources, intake levels, and pharmacokinetics of menaquinones, along with the nonclinical toxicity data available and the data on clinical outcomes related to safety (adverse events). In conclusion, the data reviewed indicate that menaquinone-7, when ingested as a dietary supplement, is not associated with any serious risk to health or with other public health concerns. On the basis of this conclusion, US Pharmacopeia monographs have been developed to establish quality standards for menaquinone-7 as a dietary ingredient and as a dietary supplement in various dosage forms. © The Author(s) 2017. Published by Oxford University Press on behalf of the International Life Sciences Institute. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  4. Reconstructing Normality

    DEFF Research Database (Denmark)

    Gildberg, Frederik Alkier; Bradley, Stephen K.; Fristed, Peter Billeskov

    2012-01-01

    Forensic psychiatry is an area of priority for the Danish Government. As the field expands, this calls for increased knowledge about mental health nursing practice, as this is part of the forensic psychiatry treatment offered. However, only sparse research exists in this area. The aim of this study was to investigate the characteristics of forensic mental health nursing staff interaction with forensic mental health inpatients and to explore how staff give meaning to these interactions. The project included 32 forensic mental health staff members, with over 307 hours of participant observations, 48 informal … The intention is to establish a trusting relationship to form behaviour and perceptual-corrective care, which is characterized by staff's endeavours to change, halt, or support the patient's behaviour or perception in relation to staff's perception of normality. The intention is to support and teach the patient...

  5. Arcuate ligament of the wrist: normal MR appearance and its relationship to palmar midcarpal instability: a cadaveric study

    International Nuclear Information System (INIS)

    Chang, Weiling; Peduto, Anthony J.; Aguiar, Rodrigo O.C.; Trudell, Debra J.; Resnick, Donald L.

    2007-01-01

    To describe the magnetic resonance (MR) imaging and gross anatomic appearance of the scaphocapitate (SC) ligament and triquetrohamocapitate (THC) ligament, which are the radial and ulnar limbs of the composite arcuate ligament, a critical volar midcarpal stabilizing ligament. T1 spin-echo and 3D gradient-echo MR imaging in the standard, coronal oblique, and axial oblique planes were performed both before and following midcarpal arthrography in seven cadaveric wrists. The seven specimens were then sectioned in selected planes to optimally visualize the SC and THC ligaments. These specimens were analyzed and correlated with their corresponding MR images. The SC and THC ligaments can be visualized in MR images as structures of low signal intensity that form an inverted "V" joining the proximal and distal carpal rows. The entire ligamentous complex is best visualized with coronal and axial oblique MR imaging but can also be seen in standard imaging planes. SC and THC ligaments together form the arcuate ligament of the wrist. Their function is crucial to the normal functioning of the wrist. Palmar midcarpal instability (PMCI) is a resulting condition when abnormalities of these ligaments occur. Dedicated MR imaging in the coronal and axial imaging planes can be performed in patients suspected of having PCMI. (orig.)

  6. Arcuate ligament of the wrist: normal MR appearance and its relationship to palmar midcarpal instability: a cadaveric study

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Weiling [Veterans Administration Medical Center, Department of Radiology, San Diego, CA (United States); Sharp-Grossmont Hospital, Department of Radiology, La Mesa, CA (United States); Peduto, Anthony J. [Veterans Administration Medical Center, Department of Radiology, San Diego, CA (United States); Westmead Hospital and Western Clinical School of Sydney University, Department of Radiology, Sydney (Australia); Aguiar, Rodrigo O.C. [Veterans Administration Medical Center, Department of Radiology, San Diego, CA (United States); Universidade Federal do Rio de Janeiro, Rio de Janerio (Brazil); Trudell, Debra J.; Resnick, Donald L. [Veterans Administration Medical Center, Department of Radiology, San Diego, CA (United States)

    2007-07-15

    To describe the magnetic resonance (MR) imaging and gross anatomic appearance of the scaphocapitate (SC) ligament and triquetrohamocapitate (THC) ligament, which are the radial and ulnar limbs of the composite arcuate ligament, a critical volar midcarpal stabilizing ligament. T1 spin-echo and 3D gradient-echo MR imaging in the standard, coronal oblique, and axial oblique planes were performed both before and following midcarpal arthrography in seven cadaveric wrists. The seven specimens were then sectioned in selected planes to optimally visualize the SC and THC ligaments. These specimens were analyzed and correlated with their corresponding MR images. The SC and THC ligaments can be visualized in MR images as structures of low signal intensity that form an inverted "V" joining the proximal and distal carpal rows. The entire ligamentous complex is best visualized with coronal and axial oblique MR imaging but can also be seen in standard imaging planes. SC and THC ligaments together form the arcuate ligament of the wrist. Their function is crucial to the normal functioning of the wrist. Palmar midcarpal instability (PMCI) is a resulting condition when abnormalities of these ligaments occur. Dedicated MR imaging in the coronal and axial imaging planes can be performed in patients suspected of having PCMI. (orig.)

  7. Chandra-SDSS Normal and Star-Forming Galaxies. I. X-Ray Source Properties of Galaxies Detected by the Chandra X-Ray Observatory in SDSS DR2

    Science.gov (United States)

    Hornschemeier, A. E.; Heckman, T. M.; Ptak, A. F.; Tremonti, C. A.; Colbert, E. J. M.

    2005-01-01

    We have cross-correlated X-ray catalogs derived from archival Chandra X-Ray Observatory ACIS observations with a Sloan Digital Sky Survey Data Release 2 (DR2) galaxy catalog to form a sample of 42 serendipitously X-ray-detected galaxies over the redshift interval 0.03normal galaxies and those in the deepest X-ray surveys. Our chief purpose is to compare optical spectroscopic diagnostics of activity (both star formation and accretion) with X-ray properties of galaxies. Our work supports a normalization value of the X-ray-star formation rate correlation consistent with the lower values published in the literature. The difference is in the allocation of X-ray emission to high-mass X-ray binaries relative to other components, such as hot gas, low-mass X-ray binaries, and/or active galactic nuclei (AGNs). We are able to quantify a few pitfalls in the use of lower resolution, lower signal-to-noise ratio optical spectroscopy to identify X-ray sources (as has necessarily been employed for many X-ray surveys). Notably, we find a few AGNs that likely would have been misidentified as non-AGN sources in higher redshift studies. However, we do not find any X-ray-hard, highly X-ray-luminous galaxies lacking optical spectroscopic diagnostics of AGN activity. Such sources are members of the "X-ray-bright, optically normal galaxy" (XBONG) class of AGNs.

  8. Application of normalized spectra in resolving a challenging Orphenadrine and Paracetamol binary mixture

    Science.gov (United States)

    Yehia, Ali M.; Abd El-Rahman, Mohamed K.

    2015-03-01

    Normalized spectra have great power in resolving the spectral overlap of the challenging Orphenadrine (ORP) and Paracetamol (PAR) binary mixture. Four smart techniques utilizing normalized spectra were used in this work, namely, amplitude modulation (AM), simultaneous area ratio subtraction (SARS), simultaneous derivative spectrophotometry (S1DD) and the ratio H-point standard addition method (RHPSAM). In AM, the peak amplitude at 221.6 nm of the division spectra was measured for the determination of both ORP and PAR, while in SARS, the concentration of ORP was determined using the area under the curve from 215 nm to 222 nm of the regenerated ORP zero-order absorption spectra, and in S1DD, the concentration of ORP was determined using the peak amplitude at 224 nm of the first derivative ratio spectra. The PAR concentration was determined directly at 288 nm in the division spectra obtained during the manipulation steps of the previous three methods. The last method, RHPSAM, is a dual-wavelength method in which two calibrations are plotted at 216 nm and 226 nm. The RH point is the intersection of the two calibration lines, and the ORP and PAR concentrations were directly determined from the coordinates of the RH point. The proposed methods were applied successfully to the determination of ORP and PAR in their dosage form.
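
    The shared manipulation behind these four techniques is the division (ratio) spectrum and, for S1DD, its first derivative; a generic sketch over a sampled wavelength grid:

```python
# Division (ratio) spectrum of a binary mixture by a divisor spectrum of one
# component, and its first derivative with respect to wavelength (S1DD-style).
import numpy as np

def division_spectrum(mixture_abs, divisor_abs):
    return np.asarray(mixture_abs) / np.asarray(divisor_abs)

def first_derivative(ratio, wavelengths_nm):
    return np.gradient(ratio, wavelengths_nm)
```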

  9. Identity Work at a Normal University in Shanghai

    Science.gov (United States)

    Cockain, Alex

    2016-01-01

    Based upon ethnographic research, this article explores undergraduate students' experiences at a normal university in Shanghai focusing on the types of identities and forms of sociality emerging therein. Although students' symptoms of disappointment seem to indicate the power of university experiences to extinguish purposeful action, this article…

  10. X-Linked Recessive Form of Nephrogenic Diabetes Insipidus in A 7-Year-Old Boy

    Directory of Open Access Journals (Sweden)

    Janchevska A.

    2014-12-01

    Nephrogenic diabetes insipidus (NDI) is caused by the inability of renal collecting duct cells to respond to arginine vasopressin (AVP)/antidiuretic hormone (ADH). We present the case of a 7-year-old boy with a history of excretion of large amounts of dilute urine and polydipsia since infancy. The boy had several vomiting episodes with mild dehydration during the first 3 years of life. There was no evidence of headaches, dizziness or visual problems. He now drinks between 2 and 3 L/day and has a 24-hour diuresis of 2 liters. He has a prepubertal appearance with appropriate weight [+0.85 standard deviation score (SDS)] and height (+0.15 SDS) for his age. His intelligence was also normal. The water deprivation test showed low urine osmolality after 8 hours of dehydration. After desmopressin administration, urine osmolality remained low. Serum osmolality was in the normal range for sex and age before and after desmopressin administration. This indicated a nephrogenic form of diabetes insipidus. Molecular analyses revealed a P286L [p.Pro(CCC)286Leu(CTC)] mutation in the AVPR2 gene, which was inherited from his mother. This patient is the first case with a genetically confirmed X-linked inherited form of NDI in the Republic of Macedonia. Molecular analysis confirmed the clinical diagnosis and enabled genetic counselling for this family.

  11. Correction of Bowtie-Filter Normalization and Crescent Artifacts for a Clinical CBCT System.

    Science.gov (United States)

    Zhang, Hong; Kong, Vic; Huang, Ke; Jin, Jian-Yue

    2017-02-01

    To present our experiences in understanding and minimizing bowtie-filter crescent artifacts and bowtie-filter normalization artifacts in a clinical cone beam computed tomography system. Bowtie-filter position and profile variations during gantry rotation were studied. Two previously proposed strategies (A and B) were applied to the clinical cone beam computed tomography system to correct bowtie-filter crescent artifacts. Physical calibration and analytical approaches were used to minimize norm phantom misalignment and to correct for bowtie-filter normalization artifacts. A combined procedure to reduce bowtie-filter crescent artifacts and bowtie-filter normalization artifacts was proposed, tested on a norm phantom, a CatPhan phantom, and a patient, and evaluated using the standard deviation of Hounsfield units along a sampling line. The bowtie-filter exhibited not only a translational shift but also an amplitude variation in its projection profile during gantry rotation. Strategy B was slightly better than strategy A in minimizing bowtie-filter crescent artifacts, possibly because it corrected the amplitude variation, suggesting that the amplitude variation plays a role in bowtie-filter crescent artifacts. The physical calibration largely reduced the misalignment-induced bowtie-filter normalization artifacts, and the analytical approach further reduced them. The combined procedure minimized both bowtie-filter crescent artifacts and bowtie-filter normalization artifacts, with Hounsfield unit standard deviations of 63.2, 45.0, 35.0, and 18.8 HU for correction of neither artifact, crescent artifacts only, normalization artifacts only, and both normalization and crescent artifacts, respectively. The combined procedure also demonstrated reduction of bowtie-filter crescent artifacts and bowtie-filter normalization artifacts in a CatPhan phantom and a patient. We have developed a step

  12. Heavy meson form factors from QCD

    International Nuclear Information System (INIS)

    Falk, A.F.; Georgi, H.; Grinstein, B.

    1990-01-01

    We calculate the leading QCD radiative corrections to the relations which follow from the decoupling of the heavy quark spin as the quark mass goes to infinity and from the symmetry between systems with different heavy quarks. One of the effects we calculate gives the leading q^2-dependence of the form factor of a heavy quark, which in turn dominates the q^2-dependence of the form factors of bound states of the heavy quark with light quarks. This, combined with the normalization of the form factor provided by symmetry, gives us a first-principles calculation of the heavy meson (or baryon) form factors in the limit of very large heavy quark mass. (orig.)

  13. Bernstein Algorithm for Vertical Normalization to 3NF Using Synthesis

    Directory of Open Access Journals (Sweden)

    Matija Varga

    2013-07-01

    Full Text Available This paper demonstrates the use of the Bernstein algorithm for vertical normalization to 3NF using synthesis. The aim of the paper is to provide an algorithm for database normalization and present a set of steps which minimize redundancy in order to increase database management efficiency, and to specify tests and algorithms for testing and proving reversibility (i.e., proving that the normalization did not cause loss of information). Using the steps of the Bernstein algorithm, the paper gives examples of vertical normalization to 3NF through synthesis and proposes a test and an algorithm to demonstrate decomposition reversibility. This paper also sets out to explain that the reasons for generating normal forms are to facilitate data search and eliminate data redundancy as well as deletion, insertion and update anomalies, and to explain how anomalies develop, using examples.
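
    The synthesis step at the heart of Bernstein's algorithm can be sketched compactly: given a minimal cover of functional dependencies, group the dependencies by their left-hand side and emit one 3NF relation per group. The sketch below is a simplification under that assumption; it omits computing the minimal cover and the final check that some relation contains a candidate key.

```python
from collections import defaultdict

def synthesize_3nf(fds):
    """Group minimal-cover FDs (lhs_set, rhs_attr) by left-hand side and
    emit one relation schema per group -- the core of Bernstein synthesis."""
    groups = defaultdict(set)
    for lhs, rhs in fds:
        groups[frozenset(lhs)].add(rhs)
    return [tuple(sorted(lhs | rhs)) for lhs, rhs in groups.items()]

# Example: R(A, B, C, D) with A -> B, A -> C, C -> D
fds = [({"A"}, "B"), ({"A"}, "C"), ({"C"}, "D")]
print(synthesize_3nf(fds))  # [('A', 'B', 'C'), ('C', 'D')]
```

    The decomposition is dependency-preserving by construction; the transitive dependency A -> C -> D, which would cause update anomalies in a single table, is split into its own relation.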

  14. Generation of Strategies for Environmental Deception in Two-Player Normal-Form Games

    Science.gov (United States)

    2015-06-18

    found in the literature is presented by Kohlberg and Mertens [23]. A stable equilibrium by their definition is an equilibrium in an extensive-form...the equilibrium in this state provides them with an increased payoff. While interesting, Kohlberg and Mertens' definition of equilibrium...stability used by Kohlberg and Mertens. Arsham's work focuses on determining the amount by which a mixed-strategy Nash equilibrium's payoff values can

  15. High-frequency ultrasound measurements of the normal ciliary body and iris.

    Science.gov (United States)

    Garcia, Julian P S; Spielberg, Leigh; Finger, Paul T

    2011-01-01

    To determine the normal ultrasonographic thickness of the iris and ciliary body. This prospective 35-MHz ultrasonographic study included 80 normal eyes of 40 healthy volunteers. The images were obtained at the 12-, 3-, 6-, and 9-o'clock radial meridians, measured at three locations along the radial length of the iris and at the thickest section of the ciliary body. A mixed model was used to estimate eye site-adjusted means and standard errors and to test the statistical significance of differences in the adjusted results. Parameters included mean thickness, standard deviation, and range. Mean thicknesses at the iris root, midway along the radial length of the iris, and at the juxtapupillary margin were 0.4 ± 0.1, 0.5 ± 0.1, and 0.6 ± 0.1 mm, respectively. Those of the ciliary body, ciliary processes, and ciliary body + ciliary processes were 0.7 ± 0.1, 0.6 ± 0.1, and 1.3 ± 0.2 mm, respectively. This study provides standard, normative thickness data for the iris and ciliary body in healthy adults using ultrasonographic imaging. Copyright 2011, SLACK Incorporated.

  16. Standard NIM instrumentation system

    International Nuclear Information System (INIS)

    1990-05-01

    NIM is a standard modular instrumentation system that is in wide use throughout the world. As the NIM system developed and accommodations were made to a dynamic instrumentation field and a rapidly advancing technology, additions, revisions and clarifications were made. These were incorporated into the standard in the form of addenda and errata. This standard is a revision of the NIM document, AEC Report TID-20893 (Rev. 4) dated July 1974. It includes all the addenda and errata items that were previously issued as well as numerous additional items to make the standard current with modern technology and manufacturing practice

  17. Normalized Mini-Mental State Examination for assessing cognitive change in population-based brain aging studies.

    Science.gov (United States)

    Philipps, Viviane; Amieva, Hélène; Andrieu, Sandrine; Dufouil, Carole; Berr, Claudine; Dartigues, Jean-François; Jacqmin-Gadda, Hélène; Proust-Lima, Cécile

    2014-01-01

    The Mini-Mental State Examination (MMSE) is widely used in population-based longitudinal studies to quantify cognitive change. However, its poor metrological properties, mainly ceiling/floor effects and varying sensitivity to change, have largely restricted its usefulness. We propose a normalizing transformation that corrects these properties and makes possible the use of standard statistical methods to analyze change in MMSE scores. The normalizing transformation, designed to correct the metrological properties of the MMSE as far as possible, was estimated and validated on two population-based studies (n = 4,889, 20-year follow-up) by cross-validation. The transformation was also validated on two external studies with heterogeneous samples mixing normal and pathological aging, and on samples including only demented subjects. The normalizing transformation provided correct inference, in contrast with models analyzing change in the crude MMSE, which most often lead to biased estimates of risk factors and incorrect conclusions. Cognitive change can be easily and properly assessed with the normalized MMSE using standard statistical methods such as linear (mixed) models. © 2014 S. Karger AG, Basel.
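
    As a generic illustration of the idea (not the authors' estimated transformation), a rank-based normal-scores transform maps a bounded, skewed score such as the MMSE onto a scale on which Gaussian mixed models are more defensible. The scores below are invented.

```python
import numpy as np
from scipy.stats import rankdata, norm

def normal_scores(scores):
    """Blom-type normal scores: rank the raw values (ties get average
    ranks), then map ranks to standard-normal quantiles."""
    ranks = rankdata(scores)
    return norm.ppf((ranks - 0.375) / (len(scores) + 0.25))

mmse = np.array([30, 29, 30, 27, 24, 30, 28, 18, 30, 26])  # ceiling-heavy
print(normal_scores(mmse))
```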

  18. Identification and quantification of amino acids from psoriatic and normal epidermis by high performance liquid chromatography

    International Nuclear Information System (INIS)

    Mahesar, S.M.; Khuhawar, M.Y.

    2010-01-01

    In this study, a modified fluorescence-based high performance liquid chromatography (HPLC) technique was adapted to separate the amino acids from hydrolyzed keratin samples. These samples were obtained from the epidermal layer of normal and psoriatic human subjects. The keratin extracts were quantified as gram percentage of the dried skin and the amino acid concentrations were measured in μg/g; the mean retention time (tR), slope value and coefficient of determination (r2) of each eluted amino acid were calculated. The coefficients of variation for amino acid standards ranged from 0.12% to 0.28%, and the mean, standard deviation and coefficient of variation of peak areas were calculated. From the normal hydrolyzed keratin protein fraction, 12 amino acids were determined and identified: aspartic acid, glutamic acid, asparagine, serine, glutamine, glycine, histidine, citrulline, arginine, β-alanine, tyrosine, and valine. These amino acids were also determined in the psoriatic samples, and the standard deviations (SD), standard errors of the mean (SEM) and coefficients of variation (CV%) of normal and psoriatic samples were calculated. Amino acids at higher concentration in normal than in psoriatic samples were glutamic acid (92.76 ± 16.83 vs 50.87 ± 9.88) and glutamine (198.05 ± 18.74 vs 19.74 ± 3.74), while amino acids at higher concentration in psoriatic than in normal samples were asparagine (81.06 ± 10.62 vs 29.98 ± 3.641), arginine (164.42 ± 35.11 vs 46.14 ± 46), tyrosine (214.38 ± 29.61 vs 59.64 ± 8.82), and valine (169.7 ± 19.35 vs 128.06 ± 15.14). It is concluded that the absolute concentrations of amino acids in psoriatic skin show a number of variations compared with normal skin samples. (author)

  19. [Markers of antimicrobial drug resistance in the most common bacteria of normal facultative anaerobic intestinal flora].

    Science.gov (United States)

    Plavsić, Teodora

    2011-01-01

    Bacteria of the normal intestinal flora are frequent carriers of markers of antimicrobial drug resistance. Resistance genes may be exchanged with other bacteria of the normal flora as well as with pathogenic bacteria. The increase in the number of markers of resistance is one of the major global health problems, as it induces the emergence of multi-resistant strains. The aim of this study was to confirm the presence of markers of resistance in bacteria of the normal facultative anaerobic intestinal flora in our region. The experiment included a hundred fecal specimens obtained from a hundred healthy donors. A hundred bacterial strains (the most numerous representatives of the normal facultative anaerobic intestinal flora) were isolated by standard bacteriological methods. The bacteria were cultivated on Endo agar and SS agar for 24 hours at 37 degrees C. After incubation, the selected characteristic colonies were submitted to biochemical analysis. Susceptibility to antimicrobial drugs was tested by the standard disc diffusion method, and the results were interpreted according to the 2010 standard of the Clinical and Laboratory Standards Institute. Markers of resistance were found in 42% of the isolated bacteria. Resistance was most common to ampicillin (42% of isolates), amoxicillin with clavulanic acid (14% of isolates), cephalexin (14%) and cotrimoxazole (8%). The finding of 12 multiresistant strains (12% of isolates) and of resistance to ciprofloxacin was significant. The frequency of resistance markers was statistically higher in Klebsiella pneumoniae than in Escherichia coli of the normal flora. The finding of a large number of markers of antimicrobial drug resistance among bacteria of the normal intestinal flora shows that it is necessary to begin systematic monitoring of their antimicrobial resistance, because it is an indicator of resistance in the population.

  20. General philosophy of safety standards

    International Nuclear Information System (INIS)

    Dunster, H.J.

    1987-01-01

    Safety standards should be related to the form and magnitude of the risk they aim to limit. Because of the lack of direct information at the exposure levels experienced, radiation protection standards have to be based on risk assumptions that, while plausible, are not proven. The pressure for standards has come as much from public perceptions and fears as from the reality of the risk. (author)

  1. 40 CFR 406.36 - Pretreatment standards for new sources.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 28 2010-07-01 2010-07-01 true Pretreatment standards for new sources. 406.36 Section 406.36 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS GRAIN MILLS POINT SOURCE CATEGORY Normal Wheat Flour Milling Subcategory § 406.36...

  2. Challenges in microarray class discovery: a comprehensive examination of normalization, gene selection and clustering

    Directory of Open Access Journals (Sweden)

    Landfors Mattias

    2010-10-01

    Full Text Available Abstract Background Cluster analysis, and in particular hierarchical clustering, is widely used to extract information from gene expression data. The aim is to discover new classes, or sub-classes, of either individuals or genes. Performing a cluster analysis commonly involves decisions on how to handle missing values, standardize the data and select genes. In addition, pre-processing, involving various types of filtration and normalization procedures, can have an effect on the ability to discover biologically relevant classes. Here we consider cluster analysis in a broad sense and perform a comprehensive evaluation that covers several aspects of cluster analyses, including normalization. Result We evaluated 2780 cluster analysis methods on seven publicly available 2-channel microarray data sets with common reference designs. Each cluster analysis method differed in data normalization (5 normalizations were considered), missing value imputation (2), standardization of data (2), gene selection (19) or clustering method (11). The cluster analyses are evaluated using known classes, such as cancer types, and the adjusted Rand index. The performances of the different analyses vary between the data sets and it is difficult to give general recommendations. However, normalization, gene selection and clustering method are all variables that have a significant impact on the performance. In particular, gene selection is important and it is generally necessary to include a relatively large number of genes in order to get good performance. Selecting genes with high standard deviation or using principal component analysis are shown to be the preferred gene selection methods. Hierarchical clustering using Ward's method, k-means clustering and Mclust are the clustering methods considered in this paper that achieve the highest adjusted Rand index. Normalization can have a significant positive impact on the ability to cluster individuals, and there are indications that
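
    The adjusted Rand index used as the evaluation metric here is straightforward to reproduce. The sketch below scores an invented clustering against invented known classes using scikit-learn, an assumed dependency not named by the authors.

```python
from sklearn.metrics import adjusted_rand_score

# Evaluate a clustering against known classes (e.g., cancer types).
# Labels are invented for illustration.
known_classes = [0, 0, 0, 1, 1, 1, 2, 2, 2]
clustering    = [0, 0, 1, 1, 1, 1, 2, 2, 2]

# Below 1.0 because one sample is misclustered; 0 is chance-level agreement.
print(adjusted_rand_score(known_classes, clustering))
```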

  4. Normalization of High Dimensional Genomics Data Where the Distribution of the Altered Variables Is Skewed

    Science.gov (United States)

    Landfors, Mattias; Philip, Philge; Rydén, Patrik; Stenberg, Per

    2011-01-01

    Genome-wide analysis of gene expression or protein binding patterns using different array or sequencing based technologies is now routinely performed to compare different populations, such as treatment and reference groups. It is often necessary to normalize the data obtained to remove technical variation introduced in the course of conducting experimental work, but standard normalization techniques are not capable of eliminating technical bias in cases where the distribution of the truly altered variables is skewed, i.e. when a large fraction of the variables are either positively or negatively affected by the treatment. However, several experiments are likely to generate such skewed distributions, including ChIP-chip experiments for the study of chromatin, gene expression experiments for the study of apoptosis, and SNP-studies of copy number variation in normal and tumour tissues. A preliminary study using spike-in array data established that the capacity of an experiment to identify altered variables and generate unbiased estimates of the fold change decreases as the fraction of altered variables and the skewness increases. We propose the following work-flow for analyzing high-dimensional experiments with regions of altered variables: (1) Pre-process raw data using one of the standard normalization techniques. (2) Investigate if the distribution of the altered variables is skewed. (3) If the distribution is not believed to be skewed, no additional normalization is needed. Otherwise, re-normalize the data using a novel HMM-assisted normalization procedure. (4) Perform downstream analysis. Here, ChIP-chip data and simulated data were used to evaluate the performance of the work-flow. It was found that skewed distributions can be detected by using the novel DSE-test (Detection of Skewed Experiments). Furthermore, applying the HMM-assisted normalization to experiments where the distribution of the truly altered variables is skewed results in considerably higher
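
    Step (2) of this work-flow can be sketched with a generic test for skewness. This is not the authors' DSE-test or HMM-assisted re-normalization, only an illustration on simulated log-ratios of how a skewed block of altered variables shows up after standard pre-processing.

```python
import numpy as np
from scipy.stats import skewtest

# Simulated pre-normalized log-ratios: most variables unaffected, a
# positively altered block skewing the distribution (values are invented).
rng = np.random.default_rng(1)
log_ratios = np.concatenate([
    rng.normal(0.0, 0.3, 9000),   # unaffected variables
    rng.normal(1.0, 0.5, 1000),   # positively altered block
])

stat, p = skewtest(log_ratios)
if p < 0.01:
    print(f"distribution skewed (statistic={stat:.1f}); "
          "standard normalization may be biased, re-normalize")
```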

  5. Anticipation of visual form independent of knowing where the form will occur/

    DEFF Research Database (Denmark)

    Bruhn, Pernille; Bundesen, Claus

    2012-01-01

    We investigated how selective preparation for specific forms is affected by concurrent preknowledge of location when upcoming visual stimuli are anticipated. In three experiments, participants performed a two-choice response time (RT) task in which they discriminated between standard upright … of our effects suggested that preknowledge of form and location, respectively, affected two functionally independent, serial stages of processing. We suggest that the two stages were, first, direction of attention to the stimulus location and, subsequently, discrimination between upright and rotated … stimuli. Presumably, preknowledge of location advanced the point in time at which attention was directed at the stimulus location, whereas preknowledge of form reduced the time subsequently taken for stimulus discrimination…

  6. Rethinking ADA signage standards for low-vision accessibility.

    Science.gov (United States)

    Arditi, Aries

    2017-05-01

    Americans With Disabilities Act (ADA) and International Code Council (ICC) standards for accessible buildings and facilities affect design and construction of all new and renovated buildings throughout the United States, and form the basis for compliance with the ADA. While these standards may result in acceptable accessibility for people who are fully blind, they fall far short of what they could and should accomplish for those with low vision. In this article I critique the standards, detailing their lack of evidence base and other shortcomings. I suggest that simply making existing requirements stricter (e.g., by mandating larger letter size or higher contrasts) will not ensure visual accessibility and therefore cannot act as a valid basis for compliance with the law. I propose two remedies. First, requirements for visual characteristics of signs intended to improve access for those with low vision should be expressed not in terms of physical features, such as character height and contrast, but rather in terms of the distance at which a sign can be read by someone with nominally normal (20/20) visual acuity under expected lighting conditions for the installed environment. This would give sign designers greater choice in design parameters but place on them the burden of ensuring legibility. Second, mounting of directional signs, which are critical for effective and efficient wayfinding, should be required to be in consistent and approachable locations so that those with reduced acuity may view them at close distance.

  7. Normal modes and continuous spectra

    International Nuclear Information System (INIS)

    Balmforth, N.J.; Morrison, P.J.

    1994-12-01

    The authors consider stability problems arising in fluids, plasmas and stellar systems that contain singularities resulting from wave-mean flow or wave-particle resonances. Such resonances lead to singularities in the differential equations determining the normal modes at the so-called critical points or layers. The locations of the singularities are determined by the eigenvalue of the problem, and as a result, the spectrum of eigenvalues forms a continuum. They outline a method to construct the singular eigenfunctions comprising the continuum for a variety of problems

  8. Simulation and Verification of Form Filling with Self-Compacting Concrete

    DEFF Research Database (Denmark)

    Thrane, Lars Nyholm

    2005-01-01

    This paper presents a form filling experiment and the corresponding 3D simulation. One side of the form is made of a transparent acrylic plate and, to improve visual observation of the flow behaviour, the first and second halves of the form are cast with normal grey and red-pigmented SCC, respectively.

  9. The metabolomics standards initiative (MSI)

    NARCIS (Netherlands)

    Fiehn, O.; Robertson, D.; Griffin, J.; Werf, M. van der; Nikolau, B.; Morrison, N.; Sumner, L.W.; Goodacre, R.; Hardy, N.W.; Taylor, C.; Fostel, J.; Kristal, B.; Kaddurah-Daouk, R.; Mendes, P.; Ommen, B. van; Lindon, J.C.; Sansone, S.-A.

    2007-01-01

    In 2005, the Metabolomics Standards Initiative was formed. An outline and general introduction are provided to inform about the history, structure, working plan and intentions of this initiative. Comments on any of the suggested minimal reporting standards are welcome to be sent to the open

  10. Reexamining the validity and reliability of the clinical version of the Iowa gambling task: Evidence from a normal subject group

    Directory of Open Access Journals (Sweden)

    Ching-Hung eLin

    2013-05-01

    Full Text Available Over the past decade, the Iowa gambling task (IGT) has been utilized to test various decision deficits induced by neurological damage or psychiatric disorders. The IGT has recently been standardized for identifying 13 different neuropsychological disorders. Neuropsychological patients choose bad decks frequently, whereas normal subjects prefer the good expected-value (EV) decks. However, the IGT has several validity and reliability problems. Some research groups have pointed out that the validity of the IGT is influenced by the personality and emotional state of subjects. Additionally, several other studies have proposed that the prominent deck B (PDB) phenomenon - that is, normal subjects preferring bad deck B - may be the most serious problem confronting IGT validity. Specifically, deck B offers a high frequency of gains but negative EV. In the standard IGT administration, choice behavior can be understood with reference to gain-loss frequency (GLF) rather than inferred future consequences (EV), the basic assumption of the IGT. Furthermore, using two different criteria (basic assumption vs. professional norm) yields significantly different classification results. Therefore, we recruited 72 normal subjects to test the validity and reliability of the IGT. Each subject performed three runs of the computer-based clinical IGT version. The PDB phenomenon was observed to a significant degree in the first and second stages of the clinical IGT version. Validity, reliability and the practice effect were unstable between any two given stages. The present form of the clinical IGT version has only one stage, so its use should be reconsidered for examining normal decision makers; results from patient groups must also be interpreted with great care. GLF could be the main factor to consider in establishing the construct validity and reliability of the clinical IGT version.

  11. CONVERGENCE OF INTERNATIONAL AUDIT STANDARDS AND AMERICAN AUDIT STANDARDS REGARDING SAMPLING

    Directory of Open Access Journals (Sweden)

    Chis Anca Oana

    2013-07-01

    Full Text Available Abstract: Sampling is widely used in market research, scientific analysis, opinion polls and, not least, in the financial statement audit. What actually is sampling, and how did it arise? Audit sampling involves the application of audit procedures to less than 100% of the items within an account balance or class of transactions. Nowadays the technique is indispensable, as economic entities operate with sophisticated computer systems and large amounts of data. Economic globalization and the complexity of capital markets have made possible not only the harmonization of international accounting standards with national ones, but also the convergence of international accounting and auditing standards with the American regulations. International Standard on Auditing 530 and Statement on Auditing Standards 39 are the two main international and American normative referentials on audit sampling. This article discusses the origin of audit sampling, giving a brief history of the method and different definitions from the literature. The two standards are studied using Jaccard indicators in terms of their degree of similarity and dissimilarity on different issues. The Jaccard coefficient measures the degree of convergence of the international auditing standard (ISA 530) and the U.S. auditing standard (SAS 39). Both standards address the sampling problem and present common points with regard to accepted sampling techniques, factors influencing the audit sample, treatment of identified misstatements and the circumstances in which sampling is appropriate. The study shows that both standards agree on the application of statistical and non-statistical sampling in auditing and that sampling is appropriate for tests of details and controls, the factors affecting audit sampling being audit risk, audit objectives and the population's characteristics.
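
    Since the comparison hinges on the Jaccard coefficient, a one-line computation makes the measure concrete. The issue sets below are invented stand-ins for the coded content of the two standards.

```python
# Jaccard similarity J(A, B) = |A ∩ B| / |A ∪ B| between the sets of
# issues covered by two standards; item lists are illustrative only.
isa_530 = {"sampling techniques", "audit risk",
           "misstatement treatment", "sample design"}
sas_39  = {"sampling techniques", "audit risk",
           "misstatement treatment", "tests of controls"}

jaccard = len(isa_530 & sas_39) / len(isa_530 | sas_39)
print(f"Jaccard coefficient: {jaccard:.2f}")  # 3/5 = 0.60
```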

  12. Narrative competence among hearing-impaired and normal-hearing children: analytical cross-sectional study

    Directory of Open Access Journals (Sweden)

    Alexandra Dezani Soares

    Full Text Available CONTEXT AND OBJECTIVE: Oral narrative is a means of language development assessment. However, standardized data for deaf patients are scarce. The aim here was to compare narrative competence between hearing-impaired and normal-hearing children. DESIGN AND SETTING: Analytical cross-sectional study at the Department of Speech-Language and Hearing Sciences, Universidade Federal de São Paulo. METHODS: Twenty-one moderately to profoundly bilaterally hearing-impaired children (cases) and 21 normal-hearing children without language abnormalities (controls), matched according to sex, age, schooling level and school type, were studied. A board showing pictures in a temporally logical sequence was presented to each child to elicit a narrative, and the child's performance relating to narrative structure and cohesion was measured. The frequencies of the variables, their associations (Mann-Whitney test) and their 95% confidence intervals were analyzed. RESULTS: The deaf subjects showed poorer performance regarding narrative structure, use of connectives, cohesion measurements and general punctuation (P < 0.05). There were no differences in the number of propositions elaborated or in referent specification between the two groups. The deaf children produced a higher proportion of orientation-related propositions (P = 0.001) and lower proportions of propositions relating to complicating actions (P = 0.015) and character reactions (P = 0.005). CONCLUSION: Hearing-impaired children have abnormalities in different aspects of language, involving form, content and use, in relation to their normal-hearing peers. Narrative competence was also associated with the children's ages and the school type.

  13. CT angiography of the renal arteries and veins: normal anatomy and variants.

    Science.gov (United States)

    Hazırolan, Tuncay; Öz, Meryem; Türkbey, Barış; Karaosmanoğlu, Ali Devrim; Oğuz, Berna Sayan; Canyiğit, Murat

    2011-03-01

    Conventional angiography has long been regarded as gold standard imaging modality for evaluation of the renal vasculature. Introduction of multidetector computed tomography (MDCT) angiography had a groundbreaking impact on evaluation of the renal vessels and is gradually replacing conventional angiography as standard imaging. Herein, we review and illustrate the normal and variant anatomy of renal vessels with special emphasis on imaging protocols and reconstruction techniques in MDCT.

  14. 27 CFR 4.71 - Standard wine containers.

    Science.gov (United States)

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Standard wine containers..., DEPARTMENT OF THE TREASURY LIQUORS LABELING AND ADVERTISING OF WINE Standards of Fill for Wine § 4.71 Standard wine containers. (a) A standard wine container shall be made, formed and filled to meet the...

  15. 7 CFR 28.126 - Loaning of forms and exhibits.

    Science.gov (United States)

    2010-01-01

    ... Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE COMMODITY STANDARDS AND STANDARD CONTAINER... Fees and Costs § 28.126 Loaning of forms and exhibits. In the discretion of the Director, limited...

  16. Towards an international address standard

    CSIR Research Space (South Africa)

    Coetzee, S

    2008-02-01

    Full Text Available in a better user experience. Standards compliance allows for the separation of concerns: HTML for content, Cascading Style Sheets (CSS) for presentation and JavaScript for dynamic behaviour. Standards compliant documents are also...) and cascading style sheets through CSS (CSS n.d.), whilst the JavaScript specification has been standardised by Ecma International (another standards organisation for information and communication systems), in the form of EcmaScript (Ecma...

  17. Density- and wavefunction-normalized Cartesian spherical harmonics for l ≤ 20.

    Science.gov (United States)

    Michael, J Robert; Volkov, Anatoliy

    2015-03-01

    The widely used pseudoatom formalism [Stewart (1976). Acta Cryst. A32, 565-574; Hansen & Coppens (1978). Acta Cryst. A34, 909-921] in experimental X-ray charge-density studies makes use of real spherical harmonics when describing the angular component of aspherical deformations of the atomic electron density in molecules and crystals. The analytical form of the density-normalized Cartesian spherical harmonic functions for up to l ≤ 7 and the corresponding normalization coefficients were reported previously by Paturle & Coppens [Acta Cryst. (1988), A44, 6-7]. It was shown that the analytical form for the normalization coefficients is available primarily for l ≤ 4 [Hansen & Coppens, 1978; Paturle & Coppens, 1988; Coppens (1992). International Tables for Crystallography, Vol. B, Reciprocal space, 1st ed., edited by U. Shmueli, ch. 1.2. Dordrecht: Kluwer Academic Publishers; Coppens (1997). X-ray Charge Densities and Chemical Bonding. New York: Oxford University Press]. Only in very special cases is it possible to derive an analytical representation of the normalization coefficients for 4 < l ≤ 7; for l > 4 the density normalization coefficients were calculated numerically to within seven significant figures. In this study we review the literature on the density-normalized spherical harmonics, clarify the existing notations, use the Paturle-Coppens (Paturle & Coppens, 1988) method in the Wolfram Mathematica software to derive the Cartesian spherical harmonics for l ≤ 20 and determine the density normalization coefficients to 35 significant figures, and computer-generate Fortran90 code. The article primarily targets researchers who work in the field of experimental X-ray electron density, but may be of some use to all who are interested in Cartesian spherical harmonics.
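
    As a numerical check on what density normalization means, the sketch below rescales a wavefunction-normalized real spherical harmonic so that the integral of its absolute value over the sphere equals 2, the convention commonly used for l > 0 in the pseudoatom literature. The helper names and the quadrature route are illustrative assumptions, not the paper's Mathematica/Fortran90 pipeline, and phase conventions may differ.

```python
import numpy as np
from scipy.special import sph_harm
from scipy.integrate import dblquad

def real_ylm(l, m, polar, azimuth):
    """Wavefunction-normalized real spherical harmonic y_lm."""
    if m > 0:
        return np.sqrt(2) * (-1) ** m * sph_harm(m, l, azimuth, polar).real
    if m < 0:
        return np.sqrt(2) * (-1) ** m * sph_harm(-m, l, azimuth, polar).imag
    return sph_harm(0, l, azimuth, polar).real

def density_norm_factor(l, m):
    """Factor rescaling y_lm so that the spherical integral of |y_lm| is 2."""
    integral, _ = dblquad(
        lambda az, pol: abs(real_ylm(l, m, pol, az)) * np.sin(pol),
        0.0, np.pi,         # polar angle range
        0.0, 2.0 * np.pi)   # azimuth range
    return 2.0 / integral

print(density_norm_factor(1, 0))  # wavefunction -> density rescaling, l=1, m=0
```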

  18. Reliability assessment based on small samples of normal distribution

    International Nuclear Information System (INIS)

    Ma Zhibo; Zhu Jianshi; Xu Naixin

    2003-01-01

    When the pertinent parameter involved in reliability definition complies with normal distribution, the conjugate prior of its distributing parameters (μ, h) is of normal-gamma distribution. With the help of maximum entropy and the moments-equivalence principles, the subjective information of the parameter and the sampling data of its independent variables are transformed to a Bayesian prior of (μ,h). The desired estimates are obtained from either the prior or the posterior which is formed by combining the prior and sampling data. Computing methods are described and examples are presented to give demonstrations
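
    The conjugate update named here is standard and easy to sketch: for a normal likelihood with unknown mean μ and precision h, a normal-gamma prior (μ0, κ0, α0, β0) combines with a small sample as below. The hyperparameter values are invented; in the paper they would come from the maximum entropy and moments-equivalence steps.

```python
import numpy as np

def normal_gamma_update(x, mu0, kappa0, alpha0, beta0):
    """Posterior hyperparameters of a normal-gamma prior on (mu, h)
    after observing sample x from N(mu, 1/h)."""
    n, xbar = len(x), np.mean(x)
    ss = np.sum((x - xbar) ** 2)
    kappa_n = kappa0 + n
    mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
    alpha_n = alpha0 + n / 2
    beta_n = beta0 + 0.5 * ss + kappa0 * n * (xbar - mu0) ** 2 / (2 * kappa_n)
    return mu_n, kappa_n, alpha_n, beta_n

x = np.array([9.8, 10.1, 10.4, 9.9, 10.2])  # small sample, invented
print(normal_gamma_update(x, mu0=10.0, kappa0=1.0, alpha0=2.0, beta0=1.0))
```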

  19. 40 CFR 417.156 - Pretreatment standards for new sources.

    Science.gov (United States)

    2010-07-01

    ... standard shall be: (1) For normal operation of spray drying towers above, the following values pertain.... Oil and grease Do. pH Do. (2) For air quality restricted operation of a spray drying tower, but only... GUIDELINES AND STANDARDS SOAP AND DETERGENT MANUFACTURING POINT SOURCE CATEGORY Manufacture of Spray Dried...

  20. Assembling of a beta-lipotropin (βLPH) radioimmunoassay. Plasma level standardization in normal individuals and patients with hypophyseal and adrenal diseases

    International Nuclear Information System (INIS)

    Castro, Margaret de.

    1988-01-01

    The present study investigates the extraction and radioimmunoassay (RIA) conditions of plasma βLPH. It was extracted by the activated silicic acid method, with a mean extraction efficiency of 31.6% and a mean intra-extraction variation coefficient of 8.1%. Radioiodination was performed by the chloramine-T method and 125I-βLPH was purified by gel chromatography on Sephadex G100. The estimated specific activity ranged from 100 to 192.8 μCi/μg, with a mean incorporation percentage of 66.6%. The titer of the first antibody was 1:50,000/100 μl. The assay was performed under non-equilibrium conditions, with a pre-incubation period of 24 hours and an incubation of 4 hours. Mean immunoreactivity (Bo/Total) was 21.1%, with a mean Blank/Total ratio of 2.3%. Sensitivity, expressed as the mean minimum detectable dose, was 40 pg/tube, equivalent to 56 pg/ml of plasma. Intra-assay variation coefficients were 6.5%, 3.8% and 6.8%, respectively, at B/Bo levels of 0.8, 0.6 and 0.4 of the standard curve. At B/Bo equal to 0.5, the inter-assay variation coefficient was 20.9%. Replicates of 14 plasma samples showed a correlation coefficient of r = 0.99 (p < 0.05). Parallelism was found between the standard curve and the curve obtained with different volumes of an extract with a high βLPH value. The method was controlled biologically by the presence of correlation between plasma βLPH levels and defined pathological states and clinical functional studies. Twenty-seven normal individuals, 10 patients with Cushing's disease due to a tumor of the hypophysis, 4 patients with Cushing's syndrome due to an adrenal tumor, 10 patients with Addison's disease, and 8 patients with hypopituitarism were studied. (author). 119 refs., 28 figs., 2 tabs

  1. 7 CFR 61.1 - Words in singular form.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Words in singular form. 61.1 Section 61.1 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards... Words in singular form. Words used in the regulations in this subpart in the singular form shall be...

  2. Chemical compatibility of DWPF canistered waste forms

    International Nuclear Information System (INIS)

    Harbour, J.R.

    1993-01-01

    The Waste Acceptance Preliminary Specifications (WAPS) require that the contents of the canistered waste form are compatible with one another and the stainless steel canister. The canistered waste form is a closed system comprised of a stainless steel vessel containing waste glass, air, and condensate. This system will experience a radiation field and an elevated temperature due to radionuclide decay. This report discusses possible chemical reactions, radiation interactions, and corrosive reactions within this system both under normal storage conditions and after exposure to temperatures up to the normal glass transition temperature, which for DWPF waste glass will be between 440 and 460 degrees C. Specific conclusions regarding reactions and corrosion are provided. This document is based on the assumption that the period of interim storage prior to packaging at the federal repository may be as long as 50 years

  3. Normal anatomical measurements in cervical computerized tomography

    International Nuclear Information System (INIS)

    Zaunbauer, W.; Daepp, S.; Haertel, M.

    1985-01-01

    Radiodiagnostically relevant normal values and variations for measurements of the cervical region, the arithmetical average and the standard deviation were determined from adequate computer tomograms on 60 healthy women and men, aged 20 to 83 years. The sagittal diameter of the prevertebral soft tissue and the lumina of the upper respiratory tract were evaluated at exactly defined levels between the hyoid bone and the incisura jugularis sterni. - The thickness of the aryepiglottic folds, the maximal sagittal and transverse diameters of the thyroid gland and the calibre of the great cervical vessels were defined. - To assess information about laryngeal function in computerized tomography, measurements of distances between the cervical spine and anatomical fixed points of the larynx and hypopharynx were made as well as of the degree of vocal cord movement during normal respiration and phonation. (orig.)

  4. The measurement of intrinsic cellular radiosensitivity in human tumours and normal tissues

    International Nuclear Information System (INIS)

    Lawton, P.A.

    1995-01-01

    Human tumour and normal cell radiosensitivity are thought to be important factors determining the response of tumour and normal tissues to radiotherapy, respectively. Clonogenic assays are the standard method for measuring radiosensitivity but they are of limited applicability for clinical use with fresh human tumours. The main aim of this work was to evaluate the Adhesive Tumour Cell Culture System (ATCCS), as a method for measuring the radiosensitivity of human tumours. A soft agar clonogenic assay, the modified Courtenay-Mills assay, was used as a standard to compare with the ATCCS. The demonstration that fibroblast contamination could occur with both assay methods led to the investigation of a new technique for removing unwanted fibroblasts from tumour cell suspensions and to the use of a multiwell assay for measuring fibroblast radiosensitivity. (author)

  6. Hemoglobin levels in normal Filipino pregnant women.

    Science.gov (United States)

    Kuizon, M D; Natera, M G; Ancheta, L P; Platon, T P; Reyes, G D; Macapinlac, M P

    1981-09-01

    The hemoglobin concentrations during pregnancy in Filipinos belonging to the upper income group, who were prescribed 105 mg elemental iron daily, and who had acceptable levels of transferrin saturation, were examined in an attempt to define normal levels. The hemoglobin concentrations for each trimester followed a Gaussian distribution. The hemoglobin values equal to the mean minus one standard deviation were 11.4 gm/dl for the first trimester and 10.4 gm/dl for the second and third trimesters. Using these values as the lower limits of normal, the prevalence of anemia during the last two trimesters in one group of pregnant women was found to be lower than that obtained when the WHO levels for normal were used. Groups of women with hemoglobin of 10.4 to 10.9 gm/dl (classified as anemic by WHO criteria but normal in the present study) and those with 11.0 gm/dl and above could not be distinguished on the basis of their serum ferritin levels or the degree of decrease in their hemoglobin concentration during pregnancy. Many subjects in both groups, however, had serum ferritin levels less than 12 ng/ml, which indicates poor iron stores. It might be desirable in future studies to determine the hemoglobin cut-off point that will delineate subjects who are both non-anemic and adequate in iron stores, using serum ferritin levels as the criterion for the latter.

  7. 7 CFR 46.1 - Words in singular form.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Words in singular form. 46.1 Section 46.1 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... Words in singular form. Words in this part in the singular form shall be deemed to import the plural...

  8. Cloning and characterization of the complementary DNA for the B chain of normal human serum C1q.

    Science.gov (United States)

    Reid, K B; Bentley, D R; Wood, K J

    1984-09-06

    Normal human C1q is a serum glycoprotein of 460 kDa containing 18 polypeptide chains (6A, 6B, 6C), each 226 amino acids long and each containing an N-terminal collagen-like domain and a C-terminal globular domain. Two unusual forms of C1q have been described: a genetically defective form, which has a molecular mass of approximately 160 kDa and is found in the sera of homozygotes for the defect, who show a marked susceptibility to immune complex related disease; and a fibroblast form, shown to be synthesized and secreted in vitro with a molecular mass of about 800 kDa and with chains approximately 16 kDa greater than those of normal C1q. A higher than normal molecular mass form of C1q has also been described in human colostrum, and a form of C1q has been claimed to represent one of the types of Fc receptor on guinea-pig macrophages. To initiate studies at the genomic level on these various forms of C1q, and to investigate the possible relation between the C1q genes and the procollagen genes, the complementary DNA corresponding to the B chain of normal C1q has been cloned and characterized.

  9. Proton MR spectroscopic features of liver cirrhosis : comparing with normal liver

    International Nuclear Information System (INIS)

    Cho, Soon Gu; Choi, Won; Kim, Young Soo; Kim, Mi Young; Jee, Keum Nahn; Lee, Kyung Hee; Suh, Chang Hae

    2000-01-01

    The purpose of this study was to determine the proton MR spectroscopic features of liver cirrhosis and how they differ from those of the normal human liver by comparing the two conditions. The investigation involved 30 in-vivo proton MR spectra obtained from 15 patients with liver cirrhosis, demonstrated on the basis of radiologic and clinical findings, and from 15 normal volunteers without a past or current history of liver disease. MR spectroscopy was performed on a 1.5 T GE Signa Horizon system (GE Medical Systems, Milwaukee, U.S.A.) with a body coil. STEAM (STimulated Echo-Acquisition Mode) with TR/TE of 3000/30 msec was used for signal acquisition; patients were in the prone position and respiration was not interrupted. Using the proton MR spectra of the cases in each group, peak changes in lipids (at 1.3 ppm), glutamate and glutamine (at 2.4-2.5 ppm), phosphomonoesters (at 3.0-3.1 ppm), and glycogen and glucose (at 3.4-3.9 ppm) were evaluated. The means and standard deviations of the ratios glutamate + glutamine/lipids, phosphomonoesters/lipids, and glycogen + glucose/lipids were calculated from the areas of their peaks, and the ratios of the various metabolites to lipid content were compared between the normal and cirrhosis groups. The main characteristic change in the proton MR spectra of liver cirrhosis compared with normal liver was a decreased relative intensity of the lipid peak. The means and standard deviations of the ratios glutamate + glutamine/lipids, phosphomonoesters/lipids, and glycogen + glucose/lipids for normal and cirrhotic liver were 0.0204 ± 0.0067 and 0.0693 ± 0.0371 (p < 0.05), 0.0146 ± 0.0090 and 0.0881 ± 0.0276 (p < 0.05), and 0.0403 ± 0.0267 and 0.2325 ± 0.1071 (p < 0.05), respectively. The other characteristic feature of the proton MR spectra of liver cirrhosis was the peak

  10. Connection between effective-range expansion and nuclear vertex constant or asymptotic normalization coefficient

    International Nuclear Information System (INIS)

    Yarmukhamedov, R.; Baye, D.

    2011-01-01

    Explicit relations between the effective-range expansion and the nuclear vertex constant or asymptotic normalization coefficient (ANC) for the virtual decay B→A+a are derived for arbitrary orbital angular momentum, together with the corresponding location condition for the (A+a) bound-state energy. They are valid both for the charged case and for the neutral case. Combining these relations with the standard effective-range function up to order six makes it possible to reduce the number of free effective-range parameters to two if an ANC value is known from experiment. Values for the scattering length, effective range, and form parameter are determined in this way for the 16O + p, α + t, and α + 3He collisions in partial waves where a bound state exists, using available ANCs deduced from experiments. The resulting effective-range expansions for these collisions are valid up to energies larger than 5 MeV.
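
    For orientation, the textbook effective-range function that such relations extend reads, in the neutral case (a sketch of the standard form; the charged case involves Coulomb-modified functions):

```latex
k^{2l+1}\cot\delta_l(k) \;=\; -\frac{1}{a_l} \;+\; \frac{r_l}{2}\,k^2 \;-\; P_l\,r_l^3\,k^4 \;+\; \mathcal{O}(k^6),
```

    where a_l, r_l and P_l are the scattering length, effective range and form (shape) parameter named in the abstract; the ANC C_l enters through the bound-state tail u_l(r) → C_l e^{-κr} at the pole k² = -κ² fixed by the binding energy.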

  11. P-nflation: generating cosmic Inflation with p-forms

    Energy Technology Data Exchange (ETDEWEB)

    Germani, Cristiano [LUTH, Observatoire de Paris, CNRS UMR 8102, Universite Paris Diderot, 5 Place Jules Janssen, 92195 Meudon Cedex (France); Kehagias, Alex, E-mail: cristiano.germani@obspm.fr, E-mail: kehagias@central.ntua.gr [Department of Physics, National Technical University of Athens, GR-15773, Zografou, Athens (Greece)

    2009-03-15

    We show that an inflationary background might be realized by using any p-form non-minimally coupled to gravity. Standard scalar field inflation corresponds to the 0-form case and vector inflation to the 1-form. Moreover, we show that the 2- and 3-form fields are dual to a new vector and scalar inflationary theories where the kinetic terms are non-minimally coupled to gravity.

  12. A clinical study to evaluate the correlation between maxillary central incisor tooth form and face form in an Indian population.

    Science.gov (United States)

    Koralakunte, Pavankumar R; Budihal, Dhanyakumar H

    2012-09-01

    A study was performed to examine the correlation between maxillary central incisor tooth form and face form in males and females in an Indian population. The selection of prosthetic teeth for edentulous patients is a primary issue in denture esthetics, especially in the case of maxillary central incisors, which are the most prominent teeth in the arch. Two hundred dental students of Indian origin comprising 79 males and 121 females aged 18-28 years studying at Bapuji Dental College and Hospital were randomly selected as the study subjects. A standardized photographic procedure was used to obtain images of the face and the maxillary central incisors. The outline forms of the face and the maxillary right central incisor tooth were determined using a standardized method. The outline forms obtained were used to classify both face form and tooth form on the basis of visual and William's methods. The means were considered after evaluation by five prosthodontists, and the results were tabulated. Statistical analysis was performed using the chi-squared test for association and Z-test for equality of proportions. A correlation greater than 50% was observed between tooth form and face form by the visual method, compared with one of 31.5% by William's method. There was no highly defined correlation between maxillary central incisor tooth form and face form among the male and female Indian subjects studied.

  13. Estimating structural equation models with non-normal variables by using transformations

    NARCIS (Netherlands)

    Montfort, van K.; Mooijaart, A.; Meijerink, F.

    2009-01-01

    We discuss structural equation models for non-normal variables. In this situation the maximum likelihood and the generalized least-squares estimates of the model parameters can give incorrect estimates of the standard errors and the associated goodness-of-fit chi-squared statistics. If the sample

  14. 40 CFR 471.73 - New source performance standards (NSPS).

    Science.gov (United States)

    2010-07-01

    ... GUIDELINES AND STANDARDS NONFERROUS METALS FORMING AND METAL POWDERS POINT SOURCE CATEGORY Uranium Forming... achieve the following new source performance standards (NSPS). The mass of pollutants in the uranium... mg/off-kg (pounds per million off-pounds) of uranium extruded Cadmium 0.007 0.003 Chromium 0.013 0...

  15. Explorations in Statistics: The Analysis of Ratios and Normalized Data

    Science.gov (United States)

    Curran-Everett, Douglas

    2013-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This ninth installment of "Explorations in Statistics" explores the analysis of ratios and normalized--or standardized--data. As researchers, we compute a ratio--a numerator divided by a denominator--to compute a…

  16. Drug Use Normalization: A Systematic and Critical Mixed-Methods Review.

    Science.gov (United States)

    Sznitman, Sharon R; Taubman, Danielle S

    2016-09-01

    Drug use normalization, which is a process whereby drug use becomes less stigmatized and more accepted as normative behavior, provides a conceptual framework for understanding contemporary drug issues and changes in drug use trends. Through a mixed-methods systematic review of the normalization literature, this article seeks to (a) critically examine how the normalization framework has been applied in empirical research and (b) make recommendations for future research in this area. Twenty quantitative, 26 qualitative, and 4 mixed-methods studies were identified through five electronic databases and reference lists of published studies. Studies were assessed for relevance, study characteristics, quality, and aspects of normalization examined. None of the studies applied the most rigorous research design (experiments) or examined all of the originally proposed normalization dimensions. The most commonly assessed dimension of drug use normalization was "experimentation." In addition to the original dimensions, the review identified the following new normalization dimensions in the literature: (a) breakdown of demographic boundaries and other risk factors in relation to drug use; (b) de-normalization; (c) drug use as a means to achieve normal goals; and (d) two broad forms of micro-politics associated with managing the stigma of illicit drug use: assimilative and transformational normalization. Further development in normalization theory and methodology promises to provide researchers with a novel framework for improving our understanding of drug use in contemporary society. Specifically, quasi-experimental designs that are currently being made feasible by swift changes in cannabis policy provide researchers with new and improved opportunities to examine normalization processes.

  17. Basic characterization of normal multifocal electroretinogram

    International Nuclear Information System (INIS)

    Fernandez Cherkasova, Lilia; Rojas Rondon, Irene; Castro Perez, Pedro Daniel; Lopez Felipe, Daniel; Santiesteban Freixas, Rosaralis; Mendoza Santiesteban, Carlos E

    2008-01-01

    A scientific literature review was made on the novel multifocal electroretinogram technique, the cell mechanisms involved, some of the factors that modify its results, and its form of presentation. The basic characteristics of this electrophysiological record, obtained from several regions of the retina of normal subjects, are important in order to create a small-scale comparative database for evaluating pathological eye tracings. All this will greatly help in early, less invasive electrodiagnosis of localized retinal lesions. (Author)

  18. STOCHASTIC PRICING MODEL FOR THE REAL ESTATE MARKET: FORMATION OF LOG-NORMAL GENERAL POPULATION

    Directory of Open Access Journals (Sweden)

    Oleg V. Rusakov

    2015-01-01

    Full Text Available We construct a stochastic model of real estate pricing. The pricing construction is based on a sequential comparison of supply prices. We prove that, under standard assumptions imposed upon the comparison coefficients, there exists a unique non-degenerate limit in distribution, and this limit follows the log-normal law. We verify the accordance of empirical price distributions with the theoretically obtained log-normal distribution against extensive statistical data on real estate prices from Saint-Petersburg (Russia). For establishing this accordance we apply the efficient and sensitive goodness-of-fit test of Kolmogorov-Smirnov. Based on "The Russian Federal Estimation Standard N2", we conclude that the most probable price, i.e. the mode of the distribution, is correctly and uniquely defined under the log-normal approximation. Since the mean value of a log-normal distribution exceeds the mode (the most probable value), it follows that prices valued by the mathematical expectation are systematically overstated.
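
    Both headline claims are easy to reproduce on simulated data: a Kolmogorov-Smirnov fit check, and the mode-versus-mean gap of the log-normal law (mode = exp(μ - σ²), mean = exp(μ + σ²/2)). All parameters below are invented; note also that estimating parameters before applying the KS test makes its p-value optimistic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
mu, sigma = np.log(100_000), 0.4          # log-scale parameters, invented
prices = rng.lognormal(mu, sigma, 5000)   # simulated supply prices

# Kolmogorov-Smirnov test against the fitted log-normal distribution.
shape, loc, scale = stats.lognorm.fit(prices, floc=0)
print(stats.kstest(prices, "lognorm", args=(shape, loc, scale)))

mode = np.exp(mu - sigma**2)       # most probable price
mean = np.exp(mu + sigma**2 / 2)   # expectation-based valuation
print(f"mode={mode:,.0f}  mean={mean:,.0f}  (mean/mode={mean/mode:.2f})")
```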

  19. Fast Eigensolver for Computing 3D Earth's Normal Modes

    Science.gov (United States)

    Shi, J.; De Hoop, M. V.; Li, R.; Xi, Y.; Saad, Y.

    2017-12-01

    We present a novel parallel computational approach to compute Earth's normal modes. We discretize the Earth via an unstructured tetrahedral mesh and apply the continuous Galerkin finite element method to the elasto-gravitational system. To resolve the eigenvalue pollution issue, following the analysis separating the seismic point spectrum, we explicitly utilize a representation of the displacement describing the oscillations of the non-seismic modes in the fluid outer core. Effectively, we separate out the essential spectrum, which is naturally related to the Brunt-Väisälä frequency. We introduce two Lanczos approaches, with polynomial and rational filtering, for solving this generalized eigenvalue problem in prescribed intervals. The polynomial filtering technique accesses the matrix pair only through matrix-vector products and is an ideal candidate for solving three-dimensional large-scale eigenvalue problems. The matrix-free scheme allows us to deal with fluid separation and self-gravitation in an efficient way, whereas the standard shift-and-invert method typically needs an explicit shifted matrix and its factorization. The rational filtering method converges much faster than the standard shift-and-invert procedure when computing all the eigenvalues inside an interval. Both Lanczos approaches solve for the interior eigenvalues extremely accurately compared with the standard eigensolver. In our computational experiments, we compare our results with the radial Earth model benchmark, and visualize the normal modes using vector plots to illustrate the properties of the displacements in different modes.
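
    The filtered Lanczos solvers described here are substantial codes, but the baseline they are compared against is easy to demonstrate. Below is a small sketch (matrices and shift are invented stand-ins, not the Earth model) of shift-and-invert Lanczos for a generalized eigenproblem K x = λ M x via SciPy's ARPACK wrapper:

```python
# A small illustration (invented matrices): baseline shift-and-invert Lanczos
# for a generalized eigenproblem K x = lambda M x. This route factorizes
# (K - sigma*M); the paper's polynomial filtering instead touches K and M
# only through matrix-vector products.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

n = 2000
K = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")  # stiffness-like
M = sp.identity(n, format="csc")                                         # mass-like

sigma = 0.5                        # target: eigenvalues nearest this shift
vals, vecs = eigsh(K, k=8, M=M, sigma=sigma, which="LM")
print(np.sort(vals))
```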

  20. The Influence of Normalization Weight in Population Pharmacokinetic Covariate Models.

    Science.gov (United States)

    Goulooze, Sebastiaan C; Völler, Swantje; Välitalo, Pyry A J; Calvier, Elisa A M; Aarons, Leon; Krekels, Elke H J; Knibbe, Catherijne A J

    2018-03-23

    In covariate (sub)models of population pharmacokinetic models, most covariates are normalized to the median value; however, for body weight, normalization to 70 kg or 1 kg is often applied. In this article, we illustrate the impact of normalization weight on the precision of population clearance (CLpop) parameter estimates. The influence of normalization weight (70 kg, 1 kg, or the median weight) on the precision of the CLpop estimate, expressed as relative standard error (RSE), was illustrated using data from a pharmacokinetic study in neonates with a median weight of 2.7 kg. In addition, a simulation study was performed to show the impact of normalization to 70 kg in pharmacokinetic studies with paediatric or obese patients. The RSE of the CLpop parameter estimate in the neonatal dataset was lowest with normalization to the median weight (8.1%), compared with normalization to 1 kg (10.5%) or 70 kg (48.8%). Typical clearance (CL) predictions were independent of the normalization weight used. Simulations showed that the increase in RSE of the CLpop estimate with 70 kg normalization was highest in studies with a narrow weight range and a geometric mean weight far from 70 kg. When a weight outside the observed range is used for normalization instead of the median weight, the RSE of the CLpop estimate will be inflated and should therefore not be used for model selection. Instead, established mathematical principles can be used to calculate the RSE of the typical CL (CLTV) at a relevant weight to evaluate the precision of CL predictions.
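
    A hedged sketch of the power covariate model at issue (all numbers invented): re-parameterizing the same allometric model to a different normalization weight rescales the CLpop estimate but leaves individual clearance predictions untouched, which is why precision, not predictions, is at stake.

```python
# Sketch of an allometric power covariate model: CL_i = CL_pop * (WT_i/WT_ref)^0.75.
# Changing the reference weight rescales CL_pop but not the predictions.
import numpy as np

wt = np.array([2.1, 2.7, 3.4])     # hypothetical neonatal weights, kg
cl_pop_median = 0.50               # hypothetical CL_pop (L/h) at the 2.7 kg median

def typical_cl(wt, cl_pop, wt_ref, exponent=0.75):
    """Typical clearance under an allometric power covariate model."""
    return cl_pop * (wt / wt_ref) ** exponent

# The same model re-parameterized to a 70 kg reference weight
cl_pop_70 = cl_pop_median * (70.0 / 2.7) ** 0.75

print(typical_cl(wt, cl_pop_median, wt_ref=2.7))   # predictions...
print(typical_cl(wt, cl_pop_70, wt_ref=70.0))      # ...are identical
```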

  1. Normalization constraint for variational bounds on fluid permeability

    International Nuclear Information System (INIS)

    Berryman, J.G.; Milton, G.W.

    1985-01-01

    A careful reexamination of the formulation of Prager's original variational principle for viscous flow through porous media has uncovered a subtle error in the normalization constraint on the trial functions. Although a certain surface integral of the true pressure field over the internal surface area always vanishes for isotropic materials, the corresponding surface integral for a given trial pressure field does not necessarily vanish, but it has nevertheless been neglected in previous normalizations. When this error is corrected, the form of the variational estimate is actually simpler than before, and the resulting bounds have been shown to improve when constant trial functions are used in either the two-point or the three-point bounds.

  2. Visualization of normal pleural sinuses with AMBER

    International Nuclear Information System (INIS)

    Aarts, N.J.; Kool, L.J.S.; Oestmann, J.W.

    1991-01-01

    This paper reports that ventral and dorsal pleural sinuses are frequently better appreciated with advanced modulated beam equalization radiography (AMBER) than with standard chest radiography. The visualization of the sinuses with both techniques was compared and their typical configuration studied. Four hundred patients without known chest disease were evaluated: two groups of 200 patients were studied with either AMBER or standard chest radiography. Visualization was evaluated by three radiologists using a four-point scale, and the shape of the sinus was traced if sufficiently visible. A significantly larger segment of the respective sinuses was seen with the AMBER technique, and the dorsal sinus was significantly easier to trace than the ventral. Various sinus configurations were noted. AMBER improves the visibility of the pleural sinuses. Knowledge of their normal configuration is a precondition for correctly diagnosing lesions hitherto frequently overlooked.

  3. Normal Values and Reproducibilitiy of Electric Current Perception Threshold in Sensory Fibers

    Directory of Open Access Journals (Sweden)

    Reza Salman-Roghani

    2006-04-01

    Full Text Available Objective: Routine electrodiagnosis (EMG-NCS) has some shortcomings in the evaluation of the peripheral nervous system, such as evaluation of the autonomic nervous system, pure sensory radiculopathies, and acute hyperesthetic stages of neuropathies. Quantitative sensory testing such as the current perception threshold (CPT) with electrical stimulation is suggested for the above-mentioned pathologies. Test results should be compared with normal values from a similar population, so this study was conducted to determine normal values and the reproducibility of CPT in the Iranian population. Materials & Methods: Fifty normal volunteers (32 men, 18 women) aged 20-40 years without exclusion criteria (such as neuromusculoskeletal disorders, diabetes mellitus, and alcoholism) were recruited by simple randomized selection, and the CPT test was conducted on the C8 (4th finger) and L5 (1st toe) dermatomes. To determine the test's reproducibility, 6 persons (4 men, 2 women) were examined 3 times a day, 2 days per week. Collected data were analyzed to determine means and standard deviations. Results: Normal values of the CPT test were defined as one standard deviation from the mean of our CPT data. These values are, for the C8 dermatome: 2000 Hz: 2.04 ± 0.47; 250 Hz: 0.75 ± 0.25; 5 Hz: 0.76 ± 0.3; and for the L5 dermatome: 2000 Hz: 2.83 ± 0.73; 250 Hz: 1.24 ± 0.45; 5 Hz: 0.76 ± 0.3. To determine the reproducibility and reliability of our results, Cronbach's alpha (available in SPSS software) was used, yielding 98.5% and 99% for the C8 and L5 dermatomes respectively. Conclusion: Our findings concern the C8 and L5 dermatomes and could be used as normal values for these dermatomes. Given their good correlation with international results, international references can be used as normal values, though each clinic's reproducibility should be assessed individually.

  4. Radiological Impacts Assessment during Normal Decommissioning Operation for EU-APR

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Do Hyun; Lee, Keun Sung [KHNP CRI, Daejeon (Korea, Republic of); Lee, ChongHui [KEPCO Engineering and Construction, Gimcheon (Korea, Republic of)

    2016-10-15

    In this paper, radiological impacts on human beings during normal execution of decommissioning operations are evaluated for the current standard design of EU-APR, which has been modified and improved from the original APR1400 design to comply with EUR. Decommissioning is the final phase in the life cycle of a nuclear installation, covering all activities from shutdown and removal of fissile material to environmental restoration of the site. According to article 5.4, specified in chapter 2.20 of the European Utility Requirements (EUR), all relevant radiological impacts on human beings should be considered in the environmental assessment of decommissioning, including external exposure from direct radiation of the plant and other radiation sources, and internal exposure due to inhalation and ingestion. Radiological impacts during normal circumstances of the decommissioning operation were evaluated for the current standard design of EU-APR based on the simple transport model and the practical generic methodology for assessing radiological impact provided by the IAEA. The results of the dose assessment fulfilled the dose limit for all scenarios.

  5. State Air Quality Standards.

    Science.gov (United States)

    Pollution Engineering, 1978

    1978-01-01

    This article presents in tabular form the air quality standards for sulfur dioxide, carbon monoxide, nitrogen dioxide, photochemicals, non-methane hydrocarbons and particulates for each of the 50 states and the District of Columbia. (CS)

  6. Normalization Methods and Selection Strategies for Reference Materials in Stable Isotope Analyses. Review

    Energy Technology Data Exchange (ETDEWEB)

    Skrzypek, G. [West Australian Biogeochemistry Centre, John de Laeter Centre of Mass Spectrometry, School of Plant Biology, University of Western Australia, Crawley (Australia); Sadler, R. [School of Agricultural and Resource Economics, University of Western Australia, Crawley (Australia); Paul, D. [Department of Civil Engineering (Geosciences), Indian Institute of Technology Kanpur, Kanpur (India); Forizs, I. [Institute for Geochemical Research, Hungarian Academy of Sciences, Budapest (Hungary)

    2013-07-15

    Stable isotope ratio mass spectrometers are highly precise, but not accurate, instruments. Therefore, results have to be normalized to one of the isotope scales (e.g., VSMOW, VPDB) based on well-calibrated reference materials. The selection of reference materials, the numbers of replicates, the δ-values of these reference materials, and the normalization technique have been identified as crucial in determining the uncertainty associated with the final results. The most common normalization techniques and reference materials have been tested using both Monte Carlo simulations and laboratory experiments to investigate aspects of error propagation during the normalization of isotope data. The range of observed differences justifies the need to employ the same sets of standards worldwide for each element and each stable isotope analytical technique. (author)
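
    The two-point linear normalization commonly tested in such studies is compact enough to sketch. The reference values below are illustrative stand-ins, not actual assigned values: raw δ measurements are mapped onto the reporting scale by a line through two reference materials.

```python
# Sketch of two-point scale normalization for delta values (numbers invented).
def two_point_normalize(delta_raw, ref1_raw, ref1_true, ref2_raw, ref2_true):
    """Linearly rescale a measured delta value using two reference materials."""
    slope = (ref2_true - ref1_true) / (ref2_raw - ref1_raw)
    return ref1_true + slope * (delta_raw - ref1_raw)

# e.g. hypothetical raw measurements of two water standards bracketing a sample
sample_vsmow = two_point_normalize(-12.3,
                                   ref1_raw=0.4,   ref1_true=0.0,
                                   ref2_raw=-54.8, ref2_true=-55.5)
print(f"{sample_vsmow:.2f} per mil vs VSMOW")
```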

  7. Indentation stiffness does not discriminate between normal and degraded articular cartilage.

    Science.gov (United States)

    Brown, Cameron P; Crawford, Ross W; Oloyede, Adekunle

    2007-08-01

    Relative indentation characteristics are commonly used to distinguish between normal healthy and degraded cartilage. The application of this parameter in surgical decision making, and an appreciation of articular cartilage biomechanics, prompted us to hypothesise that it is difficult to define a reference stiffness characterising normal articular cartilage. This hypothesis was tested by carrying out biomechanical indentation of articular cartilage samples characterised as visually normal or as degraded with respect to proteoglycan depletion and collagen disruption. Compressive loading was applied at known strain rates to visually normal, artificially degraded, and naturally osteoarthritic articular cartilage, and the trends of their stress-strain and stiffness characteristics were observed. While our results demonstrated a 25% depreciation in the stiffness of individual samples after proteoglycan depletion, they also showed that, when compared to the stiffness of normal samples, only 17% lie outside the range of the stress-strain behaviour of normal samples. We conclude that the extent of the variability in the properties of normal samples, and the degree of overlap (81%) of the biomechanical properties of normal and degraded matrices, demonstrate that indentation data cannot form an accurate basis for distinguishing normal from abnormal articular cartilage samples, with consequences for the application of this mechanical process in the clinical environment.

  8. Standard deviation and standard error of the mean.

    Science.gov (United States)

    Lee, Dong Kyu; In, Junyong; Lee, Sangseok

    2015-06-01

    In most clinical and experimental studies, the standard deviation (SD) and the estimated standard error of the mean (SEM) are used to present the characteristics of sample data and to explain statistical analysis results. However, some authors occasionally muddle the distinct usages of the SD and SEM in the medical literature. Because the processes of calculating the SD and SEM involve different statistical inferences, each has its own meaning. The SD is the dispersion of data in a normal distribution; in other words, the SD indicates how accurately the mean represents the sample data. The meaning of the SEM, however, includes statistical inference based on the sampling distribution: the SEM is the SD of the theoretical distribution of the sample means (the sampling distribution). While either the SD or the SEM can be applied to describe data and statistical results, one should be aware of the appropriate use of each. We aim to elucidate the distinctions between the SD and SEM and to provide proper usage guidelines for both, which summarize data and describe statistical results.
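
    A short numerical illustration of the distinction (simulated data): the SD describes the spread of the data, while SEM = SD / sqrt(n) describes the uncertainty of the sample mean.

```python
# SD vs SEM on simulated data.
import numpy as np

x = np.random.default_rng(0).normal(loc=120.0, scale=15.0, size=40)
sd = x.std(ddof=1)                # sample standard deviation
sem = sd / np.sqrt(len(x))        # standard error of the mean
print(f"mean = {x.mean():.1f}, SD = {sd:.1f}, SEM = {sem:.2f}")
# Quadrupling n roughly halves the SEM but leaves the SD essentially unchanged.
```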

  9. Normalization of RNA-seq data using factor analysis of control genes or samples

    Science.gov (United States)

    Risso, Davide; Ngai, John; Speed, Terence P.; Dudoit, Sandrine

    2015-01-01

    Normalization of RNA-seq data has proven essential to ensure accurate inference of expression levels. Here we show that usual normalization approaches mostly account for sequencing depth and fail to correct for library preparation and other more-complex unwanted effects. We evaluate the performance of the External RNA Control Consortium (ERCC) spike-in controls and investigate the possibility of using them directly for normalization. We show that the spike-ins are not reliable enough to be used in standard global-scaling or regression-based normalization procedures. We propose a normalization strategy, remove unwanted variation (RUV), that adjusts for nuisance technical effects by performing factor analysis on suitable sets of control genes (e.g., ERCC spike-ins) or samples (e.g., replicate libraries). Our approach leads to more-accurate estimates of expression fold-changes and tests of differential expression compared to state-of-the-art normalization methods. In particular, RUV promises to be valuable for large collaborative projects involving multiple labs, technicians, and/or platforms. PMID:25150836
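
    A rough sketch of the idea, not the published RUV implementation: estimate nuisance factors from genes presumed unaffected by the biology (e.g. spike-ins) via a truncated SVD, then regress them out of the full expression matrix. All data below are simulated and the number of factors k is chosen arbitrarily.

```python
# Illustrative factor-analysis-style removal of unwanted variation.
import numpy as np

rng = np.random.default_rng(0)
Y = rng.normal(size=(12, 1000))      # samples x genes, log-scale counts (simulated)
ctrl = np.arange(950, 1000)          # hypothetical indices of control genes

k = 2                                # number of unwanted factors (assumed)
centered = Y[:, ctrl] - Y[:, ctrl].mean(axis=0)
U, s, _ = np.linalg.svd(centered, full_matrices=False)
W = U[:, :k]                         # estimated nuisance factors per sample

alpha, *_ = np.linalg.lstsq(W, Y, rcond=None)  # per-gene nuisance loadings
Y_clean = Y - W @ alpha              # expression with unwanted variation removed
```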

  10. Standard model without Higgs particles

    International Nuclear Information System (INIS)

    Kovalenko, S.G.

    1992-10-01

    A modification of the standard model of electroweak interactions with a nonlocal Higgs sector is proposed. A proper form of nonlocality makes Higgs particles unobservable after the electroweak symmetry breaking. They appear only as virtual states because their propagator is an entire function. We discuss some specific consequences of this approach, comparing it with the conventional standard model. (author). 12 refs

  11. Sensitivity to ultraviolet radiation in a dominantly inherited form of xeroderma pigmentosum

    International Nuclear Information System (INIS)

    Imray, F.P.; Relf, W.; Ramsay, R.G.; Kidson, C.; Hockey, A.

    1986-01-01

    An Australian family is described in which a mild form of xeroderma pigmentosum (XP) is inherited as an autosomal dominant trait. Studies of lymphoblastoid cells and fibroblasts from affected persons demonstrated sensitivity to ultraviolet (UV) light, as judged by diminished clonogenicity and higher frequencies of UV-induced chromosome aberrations compared to normal controls. After UV irradiation of dominant XP cells, replicative DNA synthesis was depressed to a greater extent than normal, and the level of UV-induced DNA repair synthesis was lower than that in normal cells. The level of sister chromatid exchanges and the numbers of 6-thioguanine-resistant mutants induced by UV irradiation were equal to those found in normal controls. Although two subjects in the family had skin cancers, this dominant form of XP is apparently not associated with a high risk, or large numbers, of skin cancers in affected persons. (author)

  12. The classification of normal screening mammograms

    Science.gov (United States)

    Ang, Zoey Z. Y.; Rawashdeh, Mohammad A.; Heard, Robert; Brennan, Patrick C.; Lee, Warwick; Lewis, Sarah J.

    2016-03-01

    Rationale and objectives: To understand how breast screen readers classify the difficulty of normal screening mammograms using a common lexicon describing normal appearances. Cases were also assessed for their suitability for a single-reader strategy. Materials and Methods: 15 breast readers were asked to interpret a test set of 29 normal screening mammogram cases and to classify them by rating the difficulty of each case on a five-point Likert scale, identifying the salient features, and assessing suitability for single reading. Using the False Positive Fractions from a previous study, the 29 cases were classified into 10 'low', 10 'medium' and nine 'high' difficulty cases. Data were analyzed with descriptive statistics. Spearman's correlation was used to test the strength of association between the difficulty of the cases and the readers' recommendations for a single-reading strategy. Results: The ratings from readers in this study corresponded to the known difficulty level of the 'low' and 'high' difficulty cases. Uniform ductal pattern and density, symmetrical mammographic features, and the absence of micro-calcifications were the main reasons associated with 'low' difficulty cases. The 'high' difficulty cases were described as having 'dense breasts'. There was a statistically significant negative correlation between the difficulty of the cases and readers' recommendations for single reading (r = -0.475, P = 0.009). Conclusion: The findings demonstrated potential relationships between certain mammographic features and the difficulty for readers to classify mammograms as 'normal'. The standard Australian practice of double reading was deemed more suitable for most cases. There was a moderate inverse association between the difficulty of the cases and the recommendations for single reading.

  13. 75 FR 21666 - Canadian Standards Association; Application for Expansion of Recognition

    Science.gov (United States)

    2010-04-26

    ...] Canadian Standards Association; Application for Expansion of Recognition AGENCY: Occupational Safety and... the Canadian Standards Association for expansion of its recognition and presents the Agency's... Office's normal business hours, 8:15 a.m.-4:45 p.m., e.t. Instructions: All submissions must include the...

  14. Normalization of test and evaluation of biothreat detection systems: overcoming microbial air content fluctuations by using a standardized reagent bacterial mixture.

    Science.gov (United States)

    Berchebru, Laurent; Rameil, Pascal; Gaudin, Jean-Christophe; Gausson, Sabrina; Larigauderie, Guilhem; Pujol, Céline; Morel, Yannick; Ramisse, Vincent

    2014-10-01

    Test and evaluation of engineered biothreat agent detection systems ("biodetectors") is a challenging task for government agencies and industries involved in biosecurity and biodefense programs. In addition to user-friendly features, biodetectors need to perform both highly sensitive and specific detection, and must not produce excessive false alerts. In fact, the atmosphere displays a number of variables, such as airborne bacterial content, that can interfere with the detection process, thus impeding comparative tests carried out at different times or places. To overcome these fluctuations in bacterial air content, a standardized reagent bacterial mixture (SRBM), consisting of a collection of selected cultivable environmental species that are prevalent in temperate-climate bioaerosols, was designed to generate a stable, reproducible, and easy-to-use surrogate of a bioaerosol sample. The rationale, design, and production process are reported. The results showed that 8.59 log cfu (95% CI: 8.46-8.72) distributed into vials underwent a 0.95 log (95% CI: 0.65-1.26) viability decay after dehydration and subsequent reconstitution, thus advantageously mimicking a natural bioaerosol sample, which is typically composed of cultivable and uncultivable particles. Dehydrated SRBM was stable for more than 12 months at 4°C and allowed the reconstitution of a dead/live cell aqueous suspension that is stable for 96 h at +4°C, according to plate counts. Specific detection of a simulated biothreat agent (e.g. Bacillus atrophaeus) by immuno-magnetic or PCR assays did not display any significant loss of sensitivity, or false negative or positive results, in the presence of SRBM. This work provides guidance on testing and evaluating detection devices, and may contribute to the establishment of suitable standards and normalized procedures. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. VIERS Electronic Form Submission Service (EFSS)

    Data.gov (United States)

    Department of Veterans Affairs — The D2D EFSS (Inc 1 and 2) provides a common access point to standardize, centralize, and integrate the universal collection of Benefit Claim Forms and supporting...

  16. Stochastic simulations of normal aging and Werner's syndrome.

    KAUST Repository

    Qi, Qi

    2014-04-26

    Human cells typically consist of 23 pairs of chromosomes. Telomeres are repetitive sequences of DNA located at the ends of chromosomes. During cell replication, a number of basepairs are lost from the end of the chromosome, and this shortening restricts the number of divisions that a cell can complete before it becomes senescent, or non-replicative. In this paper, we use Monte Carlo simulations to form a stochastic model of telomere shortening to investigate how telomere shortening affects normal aging. Using this model, we study various hypotheses for the way in which shortening occurs by comparing their impact on aging at the chromosome and cell levels. We consider different types of length-dependent loss and replication probabilities to describe these processes. After analyzing a simple model for a population of independent chromosomes, we simulate a population of cells in which each cell has 46 chromosomes and the shortest telomere governs the replicative potential of the cell. We generalize these simulations to Werner's syndrome, a condition in which large sections of DNA are removed during cell division and which, amongst other conditions, results in rapid aging. Since the mechanisms governing the loss of additional basepairs are not known, we use our model to simulate a variety of possible forms for the rate at which additional telomeres are lost per replication, and several expressions for how the probability of cell division depends on telomere length. As well as the evolution of the mean telomere length, we consider the standard deviation and the shape of the distribution. We compare our results with a variety of data from the literature, covering both experimental data and previous models. We find good agreement for the evolution of telomere length when plotted against population doubling.
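
    A toy Monte Carlo in the spirit of the model (all rates and thresholds below are invented, not the paper's calibration): each cell carries 46 telomeres, every division shortens them, and the shortest telomere sets the probability of further division.

```python
# Toy stochastic telomere-shortening simulation.
import numpy as np

rng = np.random.default_rng(42)
N_CELLS, N_CHROM = 200, 46
L0, LOSS, L_CRIT = 10_000, 100, 2_000   # initial length, loss/division, limit (bp)

telomeres = np.full((N_CELLS, N_CHROM), float(L0))
for generation in range(80):
    shortest = telomeres.min(axis=1)
    # division probability falls linearly as the shortest telomere nears L_CRIT
    p_divide = np.clip((shortest - L_CRIT) / (L0 - L_CRIT), 0.0, 1.0)
    divides = rng.random(N_CELLS) < p_divide
    telomeres[divides] -= LOSS          # dividing cells shorten all telomeres

print(f"mean telomere length after 80 generations: {telomeres.mean():.0f} bp")
```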

  17. Adverse event reporting and developments in radiation biology after normal tissue injury: International Atomic Energy Agency consultation

    International Nuclear Information System (INIS)

    Chen Yuhchyau; Trotti, Andy; Coleman, C. Norman; Machtay, Mitchell; Mirimanoff, Rene O.; Hay, John; O'Brien, Peter C.; El-Gueddari, Brahim; Salvajoli, Joao V.; Jeremic, Branislav

    2006-01-01

    Purpose: Recent research has enhanced our understanding of radiation injury at the molecular-cellular and tissue levels; significant strides have occurred in the standardization of adverse event reporting in clinical trials. In response, the International Atomic Energy Agency, through its Division of Human Health and its section for Applied Radiation Biology and Radiotherapy, organized a consultation meeting in Atlanta (October 2, 2004) to discuss developments in radiobiology, normal tissue reactions, and adverse event reporting. Methods and Materials: Representatives from the cooperative groups African Radiation Oncology Group, Curriculo Radioterapeutica Ibero Latino Americana, European Organization for Research and Treatment of Cancer, National Cancer Institute of Canada Clinical Trials Group, Radiation Therapy Oncology Group, and Trans-Tasman Radiation Oncology Group held the meeting discussion. Results: Representatives of major radiotherapy groups/organizations and prominent leaders in radiotherapy discussed the current understanding of normal tissue radiobiologic effects, the design and implementation of future clinical and translational projects for normal tissue injury, and the standardization of adverse event reporting worldwide. Conclusions: The consensus was to adopt the NCI comprehensive adverse event reporting terminology and grading system (CTCAE v3.0) as the new standard for all cooperative group trials. Future plans include the implementation of coordinated research projects focusing on normal tissue biomarkers and data collection methods.

  18. ASSESSING RADIATION PRESSURE AS A FEEDBACK MECHANISM IN STAR-FORMING GALAXIES

    International Nuclear Information System (INIS)

    Andrews, Brett H.; Thompson, Todd A.

    2011-01-01

    Radiation pressure from the absorption and scattering of starlight by dust grains may be an important feedback mechanism in regulating star-forming galaxies. We compile data from the literature on star clusters, star-forming subregions, normal star-forming galaxies, and starbursts to assess the importance of radiation pressure on dust as a feedback mechanism, by comparing the luminosity and flux of these systems to their dust Eddington limit. This exercise motivates a novel interpretation of the Schmidt law, the L_IR-L'_CO correlation, and the L_IR-L'_HCN correlation. In particular, the linear L_IR-L'_HCN correlation is a natural prediction of radiation pressure regulated star formation. Overall, we find that the Eddington limit sets a hard upper bound to the luminosity of any star-forming region. Importantly, however, many normal star-forming galaxies have luminosities significantly below the Eddington limit. We explore several explanations for this discrepancy, especially the role of 'intermittency' in normal spirals: the tendency for only a small number of subregions within a galaxy to be actively forming stars at any moment because of the time dependence of the feedback process and the luminosity evolution of the stellar population. If radiation pressure regulates star formation in dense gas, then the gas depletion timescale is 6 Myr, in good agreement with observations of the densest starbursts. Finally, we highlight the importance of observational uncertainties, namely the dust-to-gas ratio and the CO-to-H2 and HCN-to-H2 conversion factors, that must be understood before a definitive assessment of radiation pressure as a feedback mechanism in star-forming galaxies.
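
    The dust Eddington limit used as the yardstick here follows the standard form L_Edd = 4πGMc/κ. A back-of-envelope sketch, where the gas mass and flux-mean dust opacity are assumed illustrative values, not figures from the paper:

```python
# Back-of-envelope dust Eddington luminosity, L_Edd = 4*pi*G*M*c/kappa.
import math

G = 6.674e-8        # gravitational constant, cm^3 g^-1 s^-2
C = 2.998e10        # speed of light, cm s^-1
M_SUN = 1.989e33    # solar mass, g
L_SUN = 3.828e33    # solar luminosity, erg s^-1

def l_edd_dust(m_gas_grams, kappa_cm2_per_g=5.0):
    """Dust Eddington luminosity in erg/s for a given gas mass and opacity."""
    return 4.0 * math.pi * G * m_gas_grams * C / kappa_cm2_per_g

m_gas = 1e10 * M_SUN                      # hypothetical starburst gas mass
print(f"L_Edd ~ {l_edd_dust(m_gas) / L_SUN:.2e} L_sun")
```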

  19. Effect of food service form on eating rate: meal served in a separated form might lower eating rate.

    Science.gov (United States)

    Suh, Hyung Joo; Jung, Eun Young

    2016-01-01

    In this study, we investigated the association between food form (mixed vs separated) and eating rate. The experiment used a within-subjects design (n=29, young healthy women of normal weight). Test meals (white rice and side dishes) with the same content and volume were served at lunch in either a mixed or a separated form. The form in which the food was served had significant effects on consumption volume and eating rate; subjects ate significantly more (p<0.05) when the test meal was served in a mixed form (285 g, 575 kcal) compared to a separated form (244 g, 492 kcal). Moreover, subjects also ate significantly faster (p<0.05) when the test meal was served in a mixed form (22.4 g/min) as compared to a separated form (16.2 g/min). Despite consuming more when the test meal was served in a mixed form than when served in a separated form, the subjects did not feel significantly fuller. In conclusion, we confirmed that meals served in a separated form might lower the eating rate, and moreover, slower eating might be associated with lower energy intake without compromising satiety.

  20. Normal paraspinal muscle electromyographic fatigue characteristics in patients with primary fibromyalgia.

    Science.gov (United States)

    Stokes, M J; Colter, C; Klestov, A; Cooper, R G

    1993-08-01

    Paraspinal muscle fatigue mechanisms were compared in 14 primary fibromyalgia patients and 14 age- and sex-matched normal subjects using a standardized 60-s isometric endurance test of the paraspinal muscles, during which surface integrated electromyographic (IEMG) activity was recorded. Fatigue-induced IEMG increases were similar for both groups during the initial 40 s (up to 112 +/- 20% and 111 +/- 6% of initial values in patients and normal subjects, respectively). Thereafter, IEMG fell significantly in patients. When patients were grouped by body mass index (BMI; range 19-25 in controls), those with a BMI > 26 (n = 9) showed greater IEMG declines after 40 s than either normal subjects or the fibromyalgia group as a whole. Paraspinal muscle fatigue mechanisms appear normal in primary fibromyalgia patients. Isometric force maintenance in overweight patients, despite IEMG declines, illustrates the action of intrinsic fatigue resistance mechanisms, which were presumably utilized to a greater extent in these patients to cope with the extra load.

  1. Commensuration and Legitimacy in Standards

    DEFF Research Database (Denmark)

    Hale, Lara

    This paper claims that commensuration is a form of valuation crucial for the legitimacy of standards. It is thus far poorly understood how standards are constructed in a legitimate manner, let alone the role of commensuration, the micro-process of converting qualities into measurable quantities for the purpose of comparison. The aim is to show how commensuration affects legitimacy at different phases of a standard's formation and diffusion. In order to do this, the lens is placed upon the relationship between the commensuration processes and input and output legitimacies. Research on the Active House standard indicates that commensuration shapes legitimacy in different stages, either technical for the standard's specifications or contextual for the standard's implementation. Based on these findings, the paper offers a model of the commensurative development undergone in order to develop the legitimacy of a standard.

  2. Evaluation of directional normalization methods for Landsat TM/ETM+ over primary Amazonian lowland forests

    Science.gov (United States)

    Van doninck, Jasper; Tuomisto, Hanna

    2017-06-01

    Biodiversity mapping in extensive tropical forest areas poses a major challenge for the interpretation of Landsat images, because floristically clearly distinct forest types may show little difference in reflectance. In such cases, the effects of the bidirectional reflectance distribution function (BRDF) can be sufficiently strong to cause erroneous image interpretation and classification. Since the opening of the Landsat archive in 2008, several BRDF normalization methods for Landsat have been developed. The simplest of these consist of an empirical view-angle normalization, whereas more complex approaches apply the semi-empirical Ross-Li BRDF model and the MODIS MCD43 series of products to normalize directional Landsat reflectance to standard view and solar angles. Here we quantify the effect of surface anisotropy on Landsat TM/ETM+ images over old-growth Amazonian forests and evaluate five angular normalization approaches. Even for the narrow swath of the Landsat sensors, we observed directional effects in all spectral bands. Those normalization methods that are based on removing the surface reflectance gradient as observed in each image were adequate for normalizing TM/ETM+ imagery to nadir viewing, but were less suitable for multitemporal analysis when the solar vector varied strongly among images. Approaches based on the MODIS BRDF model parameters successfully reduced directional effects in the visible bands, but removed only half of the systematic errors in the infrared bands. The best results were obtained when the semi-empirical BRDF model was calibrated using pairs of Landsat observations. This method produces a single set of BRDF parameters, which can then be used to operationally normalize Landsat TM/ETM+ imagery over Amazonian forests to nadir viewing and a standard solar configuration.
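
    A hedged sketch of the simplest class of methods evaluated, empirical view-angle normalization: fit the within-image reflectance gradient against view zenith angle and remove it, re-projecting every pixel to nadir viewing. The data below are synthetic, with a Landsat-like ±7.5° swath.

```python
# Empirical per-band view-angle normalization (synthetic data).
import numpy as np

def normalize_to_nadir(reflectance, view_zenith_deg):
    """Remove a linear cross-track reflectance gradient (nadir = 0 deg)."""
    slope, _intercept = np.polyfit(view_zenith_deg.ravel(),
                                   reflectance.ravel(), deg=1)
    return reflectance - slope * view_zenith_deg  # nadir-equivalent value

rng = np.random.default_rng(3)
vza = np.tile(np.linspace(-7.5, 7.5, 100), (100, 1))   # signed view zenith angle
band = 0.30 + 0.002 * vza + rng.normal(0.0, 0.01, (100, 100))
flat = normalize_to_nadir(band, vza)
print(band.std(), flat.std())                          # gradient largely removed
```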

  3. New method for computing ideal MHD normal modes in axisymmetric toroidal geometry

    International Nuclear Information System (INIS)

    Wysocki, F.; Grimm, R.C.

    1984-11-01

    Analytic elimination of the two magnetic-surface components of the displacement vector permits the normal-mode ideal MHD equations to be reduced to scalar form. A Galerkin procedure, similar to that used in the PEST codes, is implemented to determine the normal modes computationally. The method retains the efficient stability capabilities of the PEST 2 energy principle code, while allowing computation of the normal-mode frequencies and eigenfunctions, if desired. The procedure is illustrated by comparison with earlier versions of PEST and by application to tilting modes in spheromaks, and to stable discrete Alfven waves in tokamak geometry.

  4. An investigation on comprehensive evaluation and standard of image quality of high voltage chest radiograph

    International Nuclear Information System (INIS)

    Yan Shulin; Li Shuopeng; Zhao Bo; Niu Yantao

    1998-01-01

    Purpose: To establish a comprehensive evaluation method and standard for chest radiographs, based on clinical diagnostic demand, patient irradiation dose, and imaging technical parameters. Methods: (1) From 10 normal chest radiographs, the authors selected the evaluation areas on thoracic PA (posteroanterior) radiographs and set up a standard for diagnostic demand; (2) Using chest CT scans of 20 males and 20 females, the authors calculated the ratio of lung field to mediastinum; (3) Selecting 100 chest films taken at 125 kVp, the authors measured the standard density values of each evaluation area; (4) Body surface irradiation doses of 478 normal adults were measured. Results: (1) Based on diagnostic demand, the authors confirmed 7 evaluation areas and 4 physical evaluation factors, and evaluation standards were obtained; (2) Comprehensive evaluation methods were established; (3) The standard height, weight, and body surface irradiation dose of Chinese normal adults were investigated preliminarily. Conclusion: Based on the concept of comprehensive evaluation, an investigation of evaluation methods and standards for chest PA radiographs was carried out, which might serve as the foundation for a future nation-wide approach.

  5. Meissner effect in diffusive normal metal/d-wave superconductor junctions

    NARCIS (Netherlands)

    Yokoyama, Takehito; Tanaka, Yukio; Golubov, Alexandre Avraamovitch; Inoue, Jun-ichiro; Asano, Yasuhiro

    2005-01-01

    The Meissner effect in diffusive normal metal/insulator/d-wave superconductor junctions is studied theoretically in the framework of the Usadel equation under the generalized boundary condition. The effect of midgap Andreev resonant states (MARS) formed at the interface of the d-wave superconductor is taken into account.

  6. Correlated random sampling for multivariate normal and log-normal distributions

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.

    2012-01-01

    A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distribution can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include sampling of resonance parameters which are used for reactor calculations.
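
    A compact sketch of the kind of technique described (the covariance below is invented): draw correlated normal samples through a Cholesky factor of the covariance, then exponentiate the coordinates that are meant to be log-normally distributed.

```python
# Correlated normal / log-normal sampling via a Cholesky factor.
import numpy as np

rng = np.random.default_rng(7)
mean = np.array([0.0, 1.0])
cov = np.array([[1.0, 0.8],
                [0.8, 2.0]])            # target covariance on the normal scale

L = np.linalg.cholesky(cov)
z = rng.standard_normal((10_000, 2))
x = mean + z @ L.T                      # correlated multivariate normal samples

x[:, 1] = np.exp(x[:, 1])               # second coordinate becomes log-normal
print(np.corrcoef(x, rowvar=False))     # variables stay correlated, though the
                                        # coefficient changes under the transform
```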

  7. Evaluation of the Normal Fetal Kidney Length and Its Correlation with Gestational Age

    OpenAIRE

    Farrokh Seilanian Toosi; Hossein Rezaie-Delui

    2013-01-01

    A true estimation of gestational age (GA) plays an important role in quality maternity care and scheduling the labor date. This study aimed to evaluate the normal fetal kidney length (KL) and its correlation with GA. In a cross-sectional study, 92 pregnant women between the 8th and 10th week of gestation with normal singleton pregnancies underwent standard ultrasound fetal biometry and kidney length measurement. Univariate and multivariate linear regression analysis was used to create a predictive e...

  8. 76 FR 6514 - Reports, Forms, and Recordkeeping Requirements

    Science.gov (United States)

    2011-02-04

    ... Vehicle Theft Prevention Standard (49 CFR Part 543). OMB Control Number: 2127-0542. Form Number: None... theft prevention standard to provide for the identification of certain motor vehicles and their major replacement parts to impede motor vehicle theft. 49 U.S.C. 33106 provides for an exemption to this...

  9. Cystic fibrosis with normal sweat chloride concentration: case report

    Directory of Open Access Journals (Sweden)

    Silva Filho Luiz Vicente Ferreira da

    2003-01-01

    Full Text Available Cystic fibrosis is a genetic disease usually diagnosed by abnormal sweat testing. We report the case of an 18-year-old female with bronchiectasis, chronic P. aeruginosa infection, and normal sweat chloride concentrations, who experienced a rapid decrease in lung function and clinical deterioration despite treatment. Given the high suspicion of cystic fibrosis, broad genotyping testing was performed, showing a compound heterozygote with the deltaF508 and 3849+10kb C->T mutations, therefore confirming the cystic fibrosis diagnosis. Although the sweat chloride test remains the gold standard for the diagnosis of cystic fibrosis, alternative diagnostic tests such as genotyping and electrophysiologic measurements must be performed if there is suspicion of cystic fibrosis despite normal or borderline sweat chloride levels.

  10. MRI study of normal pituitary glands in stage of puberty

    International Nuclear Information System (INIS)

    Lin Guangwu; Zhang Tao; Yang Ning; Cai Feng; Shi Yifan; Deng Jieying; Zhang Luodong; Jiang Yayun

    2005-01-01

    Objective: To study the changes in shape, size, and signal intensity of normal pituitary glands in adolescents, and to correlate the size and shape of normal pituitary glands with age, height, and weight during puberty. Methods: MRI data of the pituitary glands of 155 normal adolescents aged 6.0 to 18.9 years were used. Using a high-field 1.5T MR scanner, the appearances of the pituitary glands of 152 normal adolescents were analyzed on T1WI in standard median sagittal and coronal planes. Results: Quantitative data on the size, shape, and signal intensity changes of normal pituitary glands were obtained for three age groups. Pituitary size was significantly correlated with age in both males (r=0.74, t=3.624, P=0.004) and females (r=0.94, t=9.562, P=0.000); however, it was not markedly correlated with height and weight (P>0.05). Conclusion: Obvious changes in the size and shape of the pituitary gland were found in healthy adolescents. The pituitary gland shows physiologic hypertrophy, with an increasingly convex upper border, as age increases during puberty. The spherical appearance of the pituitary gland is a normal developmental feature and should not warrant clinical investigation for the presence of an underlying micro-adenoma in teenage females. (authors)

  11. 46 CFR 308.545 - Facultative cargo policy, Form MA-316.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 8 2010-10-01 2010-10-01 false Facultative cargo policy, Form MA-316. 308.545 Section 308.545 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK... policy, Form MA-316. The standard form of War Risk Facultative Cargo Policy, Form MA-316, may be obtained...

  12. External and internal standards in the single-isotope derivative (radioenzymatic) measurement of plasma norepinephrine and epinephrine

    International Nuclear Information System (INIS)

    Shah, S.D.; Clutter, W.E.; Cryer, P.E.

    1985-01-01

    In plasma from normal humans (n = 9, 35 samples) and from patients with diabetes mellitus (n = 12, 24 samples), single-isotope derivative (radioenzymatic) plasma norepinephrine and epinephrine concentrations calculated from external standard curves constructed in a normal plasma pool were identical to those calculated from internal standards added to an aliquot of each plasma sample. In plasma from patients with end-stage renal failure receiving long-term dialysis (n = 34, 109 samples), competitive catechol-O-methyltransferase (COMT) inhibitory activity resulted in a systematic error when external standards in a normal plasma pool were used, as reported previously; values so calculated averaged 21% (+/- 12%, SD) lower than those calculated from internal standards. However, when external standard curves were constructed in plasma from a given patient with renal failure and used to calculate that patient's values, or in a renal-failure plasma pool and used to calculate all renal-failure values, norepinephrine and epinephrine concentrations were not significantly different from those calculated from internal standards. We conclude: (1) External standard curves constructed in plasma from a given patient with renal failure can be used to measure norepinephrine and epinephrine in plasma from that patient; further, external standards in a renal-failure plasma pool can be used for assays in patients with end-stage renal failure receiving long-term dialysis. (2) Major COMT inhibitory activity is not commonly present if samples from patients with renal failure are excluded. Thus, it would appear that external standard curves constructed in normal plasma can be used to measure norepinephrine and epinephrine precisely in samples from persons who do not have renal failure.

  13. 40 CFR 471.70 - Applicability; description of the uranium forming subcategory.

    Science.gov (United States)

    2010-07-01

    ... uranium forming subcategory. 471.70 Section 471.70 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS NONFERROUS METALS FORMING AND METAL POWDERS POINT SOURCE CATEGORY Uranium Forming Subcategory § 471.70 Applicability; description of the uranium forming...

  14. Theory of normal metals

    International Nuclear Information System (INIS)

    Mahan, G.D.

    1992-01-01

    The organizers requested that I give eight lectures on the theory of normal metals, "with an eye on superconductivity." My job was to cover the general properties of metals. The topics were selected according to what the students would need to know for the following lectures on superconductivity; my role was to prepare the groundwork for the later lectures. The problem is that there is not yet a widely accepted theory for the mechanism which pairs the electrons. Many mechanisms have been proposed, with those of phonons and spin fluctuations having the most followers, so I tried to discuss both topics. I also introduced the tight-binding model for metals, which forms the basis for most of the work on the cuprate superconductors.

  15. 20 CFR 703.2 - Forms.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Forms. 703.2 Section 703.2 Employees' Benefits EMPLOYMENT STANDARDS ADMINISTRATION, DEPARTMENT OF LABOR LONGSHOREMEN'S AND HARBOR WORKERS... OWCP district offices and on the Internet at http://www.dol.gov/esa/owcp/dlhwc/lsforms.htm. ...

  16. Kinetics of small lymphocytes in normal and nude mice after splenectomy

    DEFF Research Database (Denmark)

    Hougen, H P; Hansen, F; Jensen, E K

    1977-01-01

    Autoradiography and various quantitations on lymphoid tissues have been used to evaluate the kinetics of small lymphocytes in normal (+/nu or +/+) and congenitally athymic nude (nu/nu) NMRI mice 1 month after splenectomy or sham-splenectomy. The results indicate that splenectomy causes depressed thymic activity and diminished numbers of T lymphocytes in peripheral lymphoid tissues. The total number of cells in these tissues, as well as the blast cell activity, were within normal limits. Bone marrow lymphocyte numbers and kinetics, as well as blood lymphocyte levels, in splenectomized and sham-splenectomized normal animals were comparable. Blood lymphocyte numbers were at normal levels in splenectomized nude mice, in spite of reduced numbers of bone marrow and thoracic duct lymphocytes. It is suggested that increased numbers of newly-formed lymphocytes, found in lymph nodes and blood of splenectomized mice...

  17. Mutual-friction induced instability of normal-fluid vortex tubes in superfluid helium-4

    Science.gov (United States)

    Kivotides, Demosthenes

    2018-06-01

    It is shown that, as a result of its interactions with superfluid vorticity, a normal-fluid vortex tube in helium-4 becomes unstable and disintegrates. The superfluid vorticity acquires only a small polarization (a few percent of the normal-fluid tube strength), whilst expanding in a front-like manner in the intervortex space of the normal fluid, forming a dense, unstructured tangle in the process. The accompanying energy spectrum scalings offer a structural explanation of analogous scalings in fully developed finite-temperature superfluid turbulence. A macroscopic mutual-friction model incorporating these findings is proposed.

  18. Learning attention for historical text normalization by learning to pronounce

    DEFF Research Database (Denmark)

    Bollmann, Marcel; Bingel, Joachim; Søgaard, Anders

    2017-01-01

    Automated processing of historical texts often relies on pre-normalization to modern word forms. Training encoder-decoder architectures to solve such problems typically requires a lot of training data, which is not available for the named task. We address this problem by using several novel encoder...

  19. Revision of the occupational health examination form for radiation workers

    International Nuclear Information System (INIS)

    Liu Chang'an; Chen Erdong

    2005-01-01

    Objective: To revise the Occupational Health Examination Form for Radiation Workers, which serves as Annex 3 of the Management Regulations for Occupational Health Surveillance (Decree No. 23 of the Ministry of Health, P.R. China), so as to further improve and standardize occupational health management for radiation workers. Methods: Based on corresponding laws, standards, and general principles of occupational medicine. Results: The new version of the Form was established and passed auditing. Conclusion: The theoretical foundation, intention, and methods of the revision process are briefly introduced. Requirements and necessary recommendations for implementing the new Form are also described. (authors)

  20. 7 CFR 54.1015 - Official reports, forms, and certificates.

    Science.gov (United States)

    2010-01-01

    ... MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) REGULATIONS AND STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF 1946 AND THE EGG PRODUCTS INSPECTION ACT..., Processing, and Packaging of Livestock and Poultry Products § 54.1015 Official reports, forms, and...

  1. How far is the root apex of a unilateral impacted canine from the root apices' arch form?

    Science.gov (United States)

    Kim, Sung-Hun; Kim, You-Min; Oh, Sewoong; Kim, Seong-Sik; Park, Soo-Byung; Son, Woo-Sung; Kim, Yong-Il

    2017-02-01

    The purpose of this study was to determine the arch form of the root apices of normally erupting teeth and then to determine the differences in the location of the apex of impacted canines relative to normally erupting canines. In addition, we sought to determine whether the labiopalatal position of the impacted canines influences the position of the apices. The study included 21 patients with unerupted canines that subsequently had a normal eruption, 21 patients with palatally impacted canines, 27 patients with labially impacted canines, and 17 patients with midalveolus impacted canines. Images were obtained using cone beam computed tomography, and the x, y, and z coordinates of the root apices were determined using Ondemand3D software (Cybermed Co., Seoul, Korea). Two-dimensional coordinates were converted from the acquired three-dimensional coordinates via projection on a palatal plane, and the Procrustes method was used to process the converted two-dimensional coordinates and to draw the arch forms of the root apices. Finally, we measured the extent of root apex deviation from the arch forms of the root apices. Normally erupting canines, even with immature calcification, had positions aligned with a normal arch form. The root apices of the impacted canines were an average of 6.572 mm away from the root apices' arch form, whereas those of the contralateral nonimpacted canines were an average distance of 2.221 mm away, a statistically significant difference. The root apices of the palatally impacted canines tended to be distributed toward the first premolar root apices. Incompletely calcified, unerupted teeth with a subsequent normal eruption showed a normal arch form of the root apices. The root apices of impacted canines were farther from the arch forms than were those of the nonimpacted canines. Also, the root apices of impacted canines in the palatal area showed distributions different from those of the other impacted canine groups. Copyright © 2017 American

  2. 7 CFR 28.107 - Original cotton standards and reserve sets.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Original cotton standards and reserve sets. 28.107... Standards Act Practical Forms of Cotton Standards § 28.107 Original cotton standards and reserve sets. (a) The containers of the original Universal Standards and other official cotton standards of the United...

  3. Fish gelatin thin film standards for biological application of PIXE

    Science.gov (United States)

    Manuel, Jack E.; Rout, Bibhudutta; Szilasi, Szabolcs Z.; Bohara, Gyanendra; Deaton, James; Luyombya, Henry; Briski, Karen P.; Glass, Gary A.

    2014-08-01

    There exists a critical need to understand the flow and accumulation of metallic ions, both naturally occurring and those introduced to biological systems. In this paper the results of fabricating thin film elemental biological standards containing nearly any combination of trace elements in a protein matrix are presented. Because it is capable of high elemental sensitivity, particle induced X-ray emission spectrometry (PIXE) is an excellent candidate for in situ analysis of biological tissues. Additionally, the utilization of microbeam PIXE allows the determination of elemental concentrations in and around biological cells. However, obtaining elemental reference standards with the same matrix constituents as brain tissue is difficult. An excellent choice for simulating brain-like tissue is Norland® photoengraving glue which is derived from fish skin. Fish glue is water soluble, liquid at room temperature, and resistant to dilute acid. It can also be formed into a thin membrane which dries into a durable, self-supporting film. Elements of interest are introduced to the fish glue in precise volumetric additions of well quantified atomic absorption standard solutions. In this study GeoPIXE analysis package is used to quantify elements intrinsic to the fish glue as well as trace amounts of manganese added to the sample. Elastic (non-Rutherford) backscattered spectroscopy (EBS) and the 1.734 MeV proton-on-carbon 12C(p,p)12C resonance is used for a normalization scheme of the PIXE spectra to account for any discrepancies in X-ray production arising from thickness variation of the prepared standards. It is demonstrated that greater additions of the atomic absorption standard cause a viscosity reduction of the liquid fish glue resulting in thinner films but the film thickness can be monitored by using simultaneous PIXE and EBS proton data acquisition.

  4. Fish gelatin thin film standards for biological application of PIXE

    International Nuclear Information System (INIS)

    Manuel, Jack E.; Rout, Bibhudutta; Szilasi, Szabolcs Z.; Bohara, Gyanendra; Deaton, James; Luyombya, Henry; Briski, Karen P.; Glass, Gary A.

    2014-01-01

    There exists a critical need to understand the flow and accumulation of metallic ions, both naturally occurring and those introduced to biological systems. In this paper the results of fabricating thin film elemental biological standards containing nearly any combination of trace elements in a protein matrix are presented. Because it is capable of high elemental sensitivity, particle induced X-ray emission spectrometry (PIXE) is an excellent candidate for in situ analysis of biological tissues. Additionally, the utilization of microbeam PIXE allows the determination of elemental concentrations in and around biological cells. However, obtaining elemental reference standards with the same matrix constituents as brain tissue is difficult. An excellent choice for simulating brain-like tissue is Norland® photoengraving glue which is derived from fish skin. Fish glue is water soluble, liquid at room temperature, and resistant to dilute acid. It can also be formed into a thin membrane which dries into a durable, self-supporting film. Elements of interest are introduced to the fish glue in precise volumetric additions of well quantified atomic absorption standard solutions. In this study GeoPIXE analysis package is used to quantify elements intrinsic to the fish glue as well as trace amounts of manganese added to the sample. Elastic (non-Rutherford) backscattered spectroscopy (EBS) and the 1.734 MeV proton-on-carbon 12C(p,p)12C resonance is used for a normalization scheme of the PIXE spectra to account for any discrepancies in X-ray production arising from thickness variation of the prepared standards. It is demonstrated that greater additions of the atomic absorption standard cause a viscosity reduction of the liquid fish glue resulting in thinner films but the film thickness can be monitored by using simultaneous PIXE and EBS proton data acquisition.

  5. Fish gelatin thin film standards for biological application of PIXE

    Energy Technology Data Exchange (ETDEWEB)

    Manuel, Jack E., E-mail: jaelma@gmail.com [Ion Beam Modification and Analysis Laboratory, University of North Texas, Denton, TX 76203 (United States); Rout, Bibhudutta; Szilasi, Szabolcs Z.; Bohara, Gyanendra [Ion Beam Modification and Analysis Laboratory, University of North Texas, Denton, TX 76203 (United States); Deaton, James; Luyombya, Henry [Louisiana Accelerator Center, University of Louisiana at Lafayette, Lafayette, LA 70503 (United States); Briski, Karen P. [Department of Basic Pharmaceutical Sciences, University of Louisiana at Monroe, Monroe, LA 71209 (United States); Glass, Gary A. [Ion Beam Modification and Analysis Laboratory, University of North Texas, Denton, TX 76203 (United States)

    2014-08-01

    There exists a critical need to understand the flow and accumulation of metallic ions, both naturally occurring and those introduced to biological systems. In this paper the results of fabricating thin film elemental biological standards containing nearly any combination of trace elements in a protein matrix are presented. Because it is capable of high elemental sensitivity, particle induced X-ray emission spectrometry (PIXE) is an excellent candidate for in situ analysis of biological tissues. Additionally, the utilization of microbeam PIXE allows the determination of elemental concentrations in and around biological cells. However, obtaining elemental reference standards with the same matrix constituents as brain tissue is difficult. An excellent choice for simulating brain-like tissue is Norland® photoengraving glue which is derived from fish skin. Fish glue is water soluble, liquid at room temperature, and resistant to dilute acid. It can also be formed into a thin membrane which dries into a durable, self-supporting film. Elements of interest are introduced to the fish glue in precise volumetric additions of well quantified atomic absorption standard solutions. In this study GeoPIXE analysis package is used to quantify elements intrinsic to the fish glue as well as trace amounts of manganese added to the sample. Elastic (non-Rutherford) backscattered spectroscopy (EBS) and the 1.734 MeV proton-on-carbon 12C(p,p)12C resonance is used for a normalization scheme of the PIXE spectra to account for any discrepancies in X-ray production arising from thickness variation of the prepared standards. It is demonstrated that greater additions of the atomic absorption standard cause a viscosity reduction of the liquid fish glue resulting in thinner films but the film thickness can be monitored by using simultaneous PIXE and EBS proton data acquisition.

  6. Advection-diffusion model for normal grain growth and the stagnation of normal grain growth in thin films

    International Nuclear Information System (INIS)

    Lou, C.

    2002-01-01

    An advection-diffusion model has been set up to describe normal grain growth. In this model grains are divided into different groups according to their topological classes (number of sides of a grain). Topological transformations are modelled by advective and diffusive flows governed by advective and diffusive coefficients respectively, which are assumed to be proportional to topological classes. The ordinary differential equations governing self-similar time-independent grain size distribution can be derived analytically from continuity equations. It is proved that the time-independent distributions obtained by solving the ordinary differential equations have the same form as the time-dependent distributions obtained by solving the continuity equations. The advection-diffusion model is extended to describe the stagnation of normal grain growth in thin films. Grain boundary grooving prevents grain boundaries from moving, and the correlation between neighbouring grains accelerates the stagnation of normal grain growth. After introducing grain boundary grooving and the correlation between neighbouring grains into the model, the grain size distribution is close to a lognormal distribution, which is usually found in experiments. A vertex computer simulation of normal grain growth has also been carried out to make a cross comparison with the advection-diffusion model. The result from the simulation did not verify the assumption that the advective and diffusive coefficients are proportional to topological classes. Instead, we have observed that topological transformations usually occur on certain topological classes. This suggests that the advection-diffusion model can be improved by making a more realistic assumption on topological transformations. (author)
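
    For readers who want the governing equation in symbols, the following is a sketch of the continuity equation implied by the description above, with f(s,t) the number density of grains in topological class s, and the proportionality constants a and d assumed for illustration:

```latex
% Sketch of the model's continuity equation in topological-class space.
% f(s,t): number density of grains in topological class s at time t;
% advective and diffusive coefficients proportional to s (constants assumed).
\frac{\partial f(s,t)}{\partial t}
  + \frac{\partial}{\partial s}\left[ v(s)\, f(s,t) \right]
  = \frac{\partial^{2}}{\partial s^{2}}\left[ D(s)\, f(s,t) \right],
\qquad v(s) = a\, s, \quad D(s) = d\, s .
```

    Substituting a self-similar ansatz for f then reduces this partial differential equation to the ordinary differential equations for the time-independent grain size distribution mentioned above.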

  7. Chemical forms of radioiodine

    International Nuclear Information System (INIS)

    Tachikawa, Enzo

    1979-01-01

    Release of radioiodine built up during reactor operations presents a potential problem from the standpoint of environmental safety. Among the chemical forms of radioiodine, depending upon the circumstances, organic iodides pose the most serious problem because of the difficulty of trapping them and because of their stability compared to other chemical forms. Furthermore, pellet-cladding interaction (PCI) fuel failures in LWR fuel rods are believed to be stress corrosion cracks caused by an embrittling fission product species, radioiodine. To deal with these problems, knowledge is required of the chemical behavior of radioiodine in and out of fuels, as well as its release behavior from fuels. Here a brief review of these aspects is given, aiming at clearing up the questions that remain open. The data seem to indicate that radioiodine exists in a combined form in fuels. Upon heating slightly irradiated fuels, the iodine atoms are released in a chemical form associated with uranium atoms. Experiments, however, are needed with specimens of higher burnup, where the interactions of radioiodine with metallic fission products could be favored. The dominant release mechanism of radioiodine at normal operating temperatures will be diffusion to grain boundaries leading to open surfaces. Radiation-induced internal traps, however, alter the rate of diffusion significantly. The carbon sources of organic iodides formed under various conditions and their formation mechanisms are also considered. (author)

  8. Non-standard employment relationship and the gender dimension

    OpenAIRE

    Mihaela-Emilia Marica

    2015-01-01

    Besides the economic, political and social influences on the standard form of the individual employment contract, which have led to a more flexible regulatory framework in the field of labor relations, an important factor that has marked the evolving trend of atypical employment contracts is the number of women who have entered the labor market in recent decades. Because the most strongly feminized form of non-standard employment relationship is part-time work, this article captures the most important issues about the r...

  9. Binary trading relations and the limits of EDI standards

    DEFF Research Database (Denmark)

    Damsgaard, Jan; Truex, D.

    2000-01-01

    This paper provides a critical examination of electronic data interchange (EDI) standards and their application in different types of trading relationships. It argues that EDI standards are not directly comparable to more stable sets of technical standards in that they are dynamically tested and negotiated in use with each trading exchange. It takes the position that EDI standards are an emergent language form and must mean different things at the institutional and local levels. Using the lens of emergent linguistic analysis it shows how the institutional and local levels must always be distinct and yet can coexist. EDI standards can never represent the creation of an 'Esperanto of institutional communication'. Instead we believe that standards must be developed such that they support and accommodate general basic grammatical forms that can be customised to individual needs. The analysis...

  10. Normal-zone detectors for the MFTF-B coils. Revision 1

    International Nuclear Information System (INIS)

    Owen, E.W.; Shimer, D.W.

    1983-01-01

    In order to protect a set of inductively coupled superconducting magnets, it is necessary to locate and measure normal zone voltages that are small compared with the mutual and self-induced voltages. The method described in this report uses two sets of voltage measurements to locate and measure one or more normal zones in any number of coupled coils. One set of voltages is the outputs of bridges that balance out the self-induced voltages. The other set of voltages can be the voltages across the coils, although alternatives are possible. The two sets of equations form a single combined set of equations. Each normal zone location or combination of normal zones has a set of these combined equations associated with it. It is demonstrated that the normal zone can be located and the correct set chosen, allowing determination of the size of the normal zone. Only a few operations take place in a working detector: multiplication by a constant, addition, and simple decision-making. In many cases the detector for each coil, although weakly linked to the other detectors, can be considered to be independent. An example of the detector design is given for four coils with realistic parameters. The effect on accuracy of changes in the system parameters is discussed.
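
    The few operations the detector performs (multiplication by constants, addition, and a decision) can be sketched as a least-squares selection among candidate normal-zone locations. The matrices and measurements below are illustrative stand-ins, not MFTF-B parameters.

```python
import numpy as np

# Measurement vector: bridge outputs (self-induced voltages balanced out)
# stacked with coil voltages. Values are illustrative.
measurements = np.array([0.012, -0.003, 0.148, 0.021])

# Each candidate normal-zone location has its own set of combined
# equations, here encoded as a matrix mapping zone voltage -> measurements.
candidates = {
    "zone in coil 1": np.array([[1.0], [0.1], [0.9], [0.2]]),
    "zone in coil 2": np.array([[0.1], [1.0], [0.2], [0.9]]),
}

best = None
for name, A in candidates.items():
    v, *_ = np.linalg.lstsq(A, measurements, rcond=None)
    misfit = np.sum((A @ v - measurements) ** 2)   # residual of this hypothesis
    if best is None or misfit < best[2]:
        best = (name, float(v[0]), misfit)

# The location whose equations best explain the data wins; the fitted
# coefficient is the estimated normal-zone voltage.
print(f"{best[0]}: normal-zone voltage ~ {best[1]:.3f} V (misfit {best[2]:.2e})")
```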

  11. Custom v. Standardized Risk Models

    Directory of Open Access Journals (Sweden)

    Zura Kakushadze

    2015-05-01

    We discuss when and why custom multi-factor risk models are warranted and give source code for computing some risk factors. Pension/mutual funds do not require customization but standardization. However, using standardized risk models in quant trading with much shorter holding horizons is suboptimal: (1) longer horizon risk factors (value, growth, etc.) increase noise trades and trading costs; (2) arbitrary risk factors can neutralize alpha; (3) “standardized” industries are artificial and insufficiently granular; (4) normalization of style risk factors is lost for the trading universe; (5) diversifying risk models lowers P&L correlations, reduces turnover and market impact, and increases capacity. We discuss various aspects of custom risk model building.

  12. Forms and genesis of species abundance distributions

    Directory of Open Access Journals (Sweden)

    Evans O. Ochiaga

    2015-12-01

    Species abundance distribution (SAD) is one of the most important metrics in community ecology. SAD curves take a hollow or hyperbolic shape in a histogram plot with many rare species and only a few common species. In general, the shape of the SAD is largely log-normally distributed, although the mechanism behind this particular SAD shape still remains elusive. Here, we aim to review four major parametric forms of the SAD and three contending mechanisms that could potentially explain this highly skewed form of the SAD. The parametric forms reviewed here include the log series, negative binomial, lognormal and geometric distributions. The mechanisms reviewed here include the maximum entropy theory of ecology, neutral theory and the theory of proportionate effect.

  13. Two possible approaches to form sub-millisecond pulsars

    OpenAIRE

    Du, Yuanjie; Xu, R. X.; Qiao, G. J.; Han, J. L.

    2008-01-01

    Pulsars have been recognized as normal neutron stars or quark stars. Sub-millisecond pulsars, if detected, would play an essential and important role in distinguishing quark stars from neutron stars. A key question is how sub-millisecond pulsars could form. Both sub-Keplerian (for neutron and quark stars) and super-Keplerian cases (only for quark stars, which are bound additionally by strong interaction) have been discussed in this paper in order to investigate possible ways of forming sub-mi...

  14. Acoustic wave spread in superconducting-normal-superconducting sandwich

    International Nuclear Information System (INIS)

    Urushadze, G.I.

    2004-01-01

    The spread of acoustic waves perpendicular to the boundaries between superconducting and normal metals in a superconducting-normal-superconducting (SNS) sandwich has been considered. The sound-induced alternating current flow has been found by the Green function method, and the coefficient of acoustic wave transmission through the junction, γ = (S1 - S2)/S1 (where S1 and S2 are the average energy flows formed on the first and second boundaries), has been investigated as a function of the phase difference between the superconductors. It is shown that while the SNS sandwich is almost transparent for acoustic waves (γ ≈ 0) for ..., n = 0, 1, 2, ... (where τ0/τ is the ratio of the broadening of the quasiparticle energy levels in the impurity normal metal resulting from scattering of the carriers by impurities, 1/τ, to the spacing between energy levels, 1/τ0), γ = 2 (S2 = -S1), which corresponds to full reflection of the acoustic wave from the SNS sandwich. This result is valid in the limit of a pure normal metal, but in the mainly impure case there are two amplification and reflection regions for acoustic waves. The result obtained shows promise for the SNS sandwich as an ideal mirror for acoustic wave reflection.

  15. Flicker-defined form perimetry in glaucoma patients.

    Science.gov (United States)

    Horn, Folkert K; Kremers, Jan; Mardin, Christian Y; Jünemann, Anselm G; Adler, Werner; Tornow, Ralf P

    2015-03-01

    To assess the potential of flicker-defined form (FDF) perimetry to detect functional loss in patient groups with beginning glaucoma, and to evaluate the dynamic range of the FDF stimulus in individual patients and at individual test positions. FDF perimetry and standard automated perimetry (SAP) were performed at identical test locations (adapted G1 protocol) in 60 healthy subjects and 111 glaucoma patients. All patients showed glaucomatous optic disc appearance. Grouping within the glaucoma cohort was based on SAP-performance: 33 "preperimetric" open-angle glaucoma (OAG) patients, 28 "borderline" OAG (focal defects and SAP-mean defect (MD) <2 dB), 33 "early" OAG (SAP-MD < 5 dB), 17 "advanced" OAG. All participants were experienced in psychophysical and perimetric tests. Defect values and the areas under receiver operating characteristic curves (ROC) in patient groups were statistically compared. The values of FDF-MD in the preperimetric, borderline, and early OAG group were 2.7 ± 3.4 dB, 5.5 ± 2.6 dB, and 8.5 ± 3.4 dB respectively (all significantly above normal). The percentage of patients exceeding normal FDF-MD was 27.3 %, 60.7 %, and 87.9 % respectively. The age-adjusted FDF-mean defect (MD) of the G1X-protocol was not significantly correlated with refractive error, lens opacity, pupil size, or gender. Occurrence of ceiling effects (inability to detect targets at highest contrast) showed a high correlation with visual field losses (R = 0.72, p < 0.001). Local analysis indicates that SAP losses exceeding 5 dB could not be distinguished with the FDF technique. The FDF stimulus was able to detect beginning glaucoma damage. Patients with SAP-MD values exceeding 5 dB should be monitored with conventional perimetry because of its larger dynamic range.

  16. Apparently abnormal Wechsler Memory Scale index score patterns in the normal population.

    Science.gov (United States)

    Carrasco, Roman Marcus; Grups, Josefine; Evans, Brittney; Simco, Edward; Mittenberg, Wiley

    2015-01-01

    Interpretation of the Wechsler Memory Scale-Fourth Edition may involve examination of multiple memory index score contrasts and similar comparisons with Wechsler Adult Intelligence Scale-Fourth Edition ability indexes. Standardization sample data suggest that 15-point differences between any specific pair of index scores are relatively uncommon in normal individuals, but these base rates refer to a comparison between a single pair of indexes rather than multiple simultaneous comparisons among indexes. This study provides normative data for the occurrence of multiple index score differences calculated by using Monte Carlo simulations and validated against standardization data. Differences of 15 points between any two memory indexes or between memory and ability indexes occurred in 60% and 48% of the normative sample, respectively. Wechsler index score discrepancies are normally common and therefore not clinically meaningful when numerous such comparisons are made. Explicit prior interpretive hypotheses are necessary to reduce the number of index comparisons and associated false-positive conclusions. Monte Carlo simulation accurately predicts these false-positive rates.
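
    The Monte Carlo approach described above is easy to reproduce in outline. In the sketch below the indexes are given mean 100, SD 15 and a uniform inter-index correlation of 0.6; the correlation value is an assumption chosen for illustration, not a figure from the standardization sample.

```python
import numpy as np

rng = np.random.default_rng(0)
n_indexes, rho, n_sims = 5, 0.6, 100_000

# Equicorrelated covariance for standardized index scores (mean 100, SD 15).
cov = 15**2 * (rho * np.ones((n_indexes, n_indexes)) + (1 - rho) * np.eye(n_indexes))
scores = rng.multivariate_normal(np.full(n_indexes, 100.0), cov, size=n_sims)

# A 15-point difference exists between *some* pair of indexes exactly when
# the range (max - min) across indexes is at least 15.
pairwise_range = scores.max(axis=1) - scores.min(axis=1)
print(f"P(any |difference| >= 15) = {np.mean(pairwise_range >= 15):.2f}")
```

    With several simultaneous comparisons, this probability is far larger than the base rate for any single prespecified pair, which is the paper's central point.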

  17. Interaction between amylose and 1-butanol during 1-butanol-hydrochloric acid hydrolysis of normal rice starch.

    Science.gov (United States)

    Hu, Xiuting; Wei, Benxi; Zhang, Bao; Li, Hongyan; Xu, Xueming; Jin, Zhengyu; Tian, Yaoqi

    2013-10-01

    The aim of this study was to examine the interaction between amylose and 1-butanol during the 1-butanol-hydrochloric acid (1-butanol-HCl) hydrolysis of normal rice starch. An interaction model between amylose and 1-butanol was proposed using gas chromatography-mass spectrometry (GC-MS), ¹³C cross polarization and magic angle spinning NMR analysis (¹³C CP/MAS NMR), differential scanning calorimetry (DSC), and thermogravimetric analysis (TGA). GC-MS data showed that another form of 1-butanol existed in 1-butanol-HCl-hydrolyzed normal rice starch, in addition to free molecules adsorbed on the starch granules. The signal of 1-butanol-HCl-hydrolyzed starch at 100.1 ppm appeared in the ¹³C CP/MAS NMR spectrum, indicating that the amylose-1-butanol complex was formed. DSC and TGA data also demonstrated the formation of the complex, which significantly affected the thermal properties of normal rice starch. These findings revealed that the smaller amount of low-molecular-weight dextrin formed might be attributed to the resistance of this complex to acid during 1-butanol-HCl hydrolysis. Crown Copyright © 2013. Published by Elsevier B.V. All rights reserved.

  18. Environmental dosimetry for normal operations at SRP. Revision 1

    International Nuclear Information System (INIS)

    Marter, W.L.

    1984-01-01

    The radiological effect of environmental releases from SRP during normal operations has been assessed annually since 1972 with a dosimetry model developed by SRL in 1971 to 1972, as implemented in the MREM code for atmospheric releases and the RIVDOSE code for liquid releases. In 1978, SRL began using environmental models and dose commitment factors developed by the Nuclear Regulatory Commission (NRC) for all other environmental dose calculations. The NRC models are more flexible than the older SRL models, use more up-to-date methodologies, cover more exposure pathways, and permit more detailed analysis of the effects of normal operations. It is recommended that the NRC models, as implemented in the computer codes XOQDOQ and GASPAR for atmospheric releases and LADTAP for liquid releases, together with the NRC dose commitment factors, be used as the standard method at SRP for assessing offsite dose from normal operations in Health Protection Department annual environmental monitoring reports, and in National Environmental Policy Act documents and Safety Analysis Reports for SRP facilities. 23 references, 3 figures, 9 tables

  19. Normalization Approaches for Removing Systematic Biases Associated with Mass Spectrometry and Label-Free Proteomics

    Energy Technology Data Exchange (ETDEWEB)

    Callister, Stephen J.; Barry, Richard C.; Adkins, Joshua N.; Johnson, Ethan T.; Qian, Weijun; Webb-Robertson, Bobbie-Jo M.; Smith, Richard D.; Lipton, Mary S.

    2006-02-01

    Central tendency, linear regression, locally weighted regression, and quantile techniques were investigated for normalization of peptide abundance measurements obtained from high-throughput liquid chromatography-Fourier transform ion cyclotron resonance mass spectrometry (LC-FTICR MS). Arbitrary abundances of peptides were obtained from three sample sets, including a standard protein sample, two Deinococcus radiodurans samples taken from different growth phases, and two mouse striatum samples from control and methamphetamine-stressed mice (strain C57BL/6). The selected normalization techniques were evaluated in both the absence and presence of biological variability by estimating extraneous variability prior to and following normalization. Prior to normalization, replicate runs from each sample set were observed to be statistically different, while following normalization replicate runs were no longer statistically different. Although all techniques reduced systematic bias, assigned ranks among the techniques revealed significant trends. For most LC-FTICR MS analyses, linear regression normalization ranked either first or second among the four techniques, suggesting that this technique was more generally suitable for reducing systematic biases.
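
    As a concrete illustration of one of the four techniques, the sketch below applies linear regression normalization to synthetic log-abundance data, using the mean profile across runs as the reference; both the data and the choice of reference are illustrative rather than the paper's exact protocol.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic log2 peptide abundances: 200 peptides x 3 replicate runs,
# with run-specific systematic biases added to runs 2 and 3.
true = rng.normal(20.0, 2.0, size=(200, 1))
runs = true + rng.normal(0.0, 0.3, size=(200, 3))
runs[:, 1] += 0.8          # additive shift in run 2
runs[:, 2] *= 1.05         # multiplicative bias in run 3

reference = runs.mean(axis=1)          # reference profile across runs
normalized = np.empty_like(runs)
for j in range(runs.shape[1]):
    # Fit run_j = a * reference + b, then remove that systematic trend
    # while re-centering each peptide on the reference profile.
    a, b = np.polyfit(reference, runs[:, j], deg=1)
    normalized[:, j] = runs[:, j] - (a * reference + b) + reference

print("run means before:", runs.mean(axis=0).round(2))
print("run means after: ", normalized.mean(axis=0).round(2))
```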

  20. 46 CFR 308.517 - Open Cargo Policy, Form MA-300.

    Science.gov (United States)

    2010-10-01

    46 CFR 308.517 (War Risk Cargo Insurance, II - Open Policy War Risk Cargo Insurance): Open Cargo Policy, Form MA-300. The standard form of War Risk Open Cargo, Form MA-300, may be obtained from the American War Risk...

  1. Improving historical spelling normalization with bi-directional LSTMs and multi-task learning

    OpenAIRE

    Bollmann, Marcel; Søgaard, Anders

    2016-01-01

    Natural-language processing of historical documents is complicated by the abundance of variant spellings and lack of annotated data. A common approach is to normalize the spelling of historical words to modern forms. We explore the suitability of a deep neural network architecture for this task, particularly a deep bi-LSTM network applied on a character level. Our model compares well to previously established normalization algorithms when evaluated on a diverse set of texts from Early New Hig...
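
    A minimal character-level bi-LSTM of the kind described above can be sketched in a few lines of PyTorch. The vocabulary size, layer dimensions, and the simplifying assumption that historical and modern spellings have equal length are illustrative choices, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

class CharBiLSTMNormalizer(nn.Module):
    """Sketch: encode a historical spelling character by character and
    predict a modern character at each position (real systems relax the
    equal-length assumption with alignment or seq2seq decoding)."""
    def __init__(self, vocab_size, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, vocab_size)

    def forward(self, char_ids):              # (batch, seq_len)
        h, _ = self.bilstm(self.embed(char_ids))
        return self.out(h)                    # (batch, seq_len, vocab)

model = CharBiLSTMNormalizer(vocab_size=40)
dummy = torch.randint(0, 40, (8, 12))         # 8 words of 12 characters
print(model(dummy).shape)                     # torch.Size([8, 12, 40])
```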

  2. Hyperprolactinemia with normal serum prolactin: Its clinical significance

    Directory of Open Access Journals (Sweden)

    Manika Agarwal

    2010-01-01

    Amenorrhea and infertility with the added feature of galactorrhea make a provisional diagnosis of hyperprolactinemia. But then, normal serum prolactin with all the clinical features of hyperprolactinemia might call the diagnosis and further management into question. The answer lies in the heterogeneity of the peptide hormone: the immunoactive and the bioactive forms. This has been further illustrated with the help of a case which had been treated with cabergoline.

  3. Normalizing Landsat and ASTER Data Using MODIS Data Products for Forest Change Detection

    Science.gov (United States)

    Gao, Feng; Masek, Jeffrey G.; Wolfe, Robert E.; Tan, Bin

    2010-01-01

    Monitoring forest cover and its changes is a major application of optical remote sensing. In this paper, we present an approach to integrating Landsat, ASTER and MODIS data for forest change detection. Moderate resolution (10-100 m) images (e.g., Landsat and ASTER) acquired in different seasons and at different times are normalized to one "standard" date using MODIS data products as the reference. The normalized data are then used to compute a forest disturbance index for forest change detection. Compared to the results from the original data, the forest disturbance index from the normalized images is more consistent spatially and temporally. This work demonstrates an effective approach for mapping forest change over a large area from multiple moderate resolution sensors with various acquisition dates.
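
    One simple way to realize such a normalization is a per-band linear gain/bias adjustment that matches an image's statistics to the reference date; the sketch below uses synthetic reflectance arrays and moment matching as an illustrative stand-in for the paper's MODIS-based procedure.

```python
import numpy as np

def normalize_to_reference(band, reference):
    """Linearly map `band` so its mean/std match the reference-date image."""
    gain = reference.std() / band.std()
    bias = reference.mean() - gain * band.mean()
    return gain * band + bias

# Hypothetical red-band reflectances from two acquisition dates.
rng = np.random.default_rng(2)
landsat_summer = rng.normal(0.08, 0.02, 10_000).clip(0, 1)
modis_reference = rng.normal(0.11, 0.025, 10_000).clip(0, 1)

normalized = normalize_to_reference(landsat_summer, modis_reference)
print(normalized.mean().round(3), normalized.std().round(3))
```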

  4. The radiographic anatomy of the normal ovine digit, the metacarpophalangeal and metatarsophalangeal joints.

    Science.gov (United States)

    Duncan, Jennifer S; Singer, Ellen R; Devaney, Jane; Oultram, Joanne W H; Walby, Anna J; Lester, Bridie R; Williams, Helen J

    2013-03-01

    The aim of this project was to develop a detailed, accessible set of reference images of the normal radiographic anatomy of the ovine digit up to and including the metacarpophalangeal and metatarsophalangeal joints. The lower front and hind limbs of 5 Lleyn ewes were radiographed using portable radiography equipment, a digital image processor and standard projections. Twenty images illustrating the normal radiographic anatomy of the limb were selected, labelled and presented along with a detailed description and corresponding images of the bony skeleton. These images are intended to assist veterinary surgeons, veterinary students and veterinary researchers by enabling understanding of the normal anatomy of the ovine lower limb, and allowing comparison with the abnormal.

  5. Normal zone detectors for a large number of inductively coupled coils

    International Nuclear Information System (INIS)

    Owen, E.W.; Shimer, D.W.

    1983-01-01

    In order to protect a set of inductively coupled superconducting magnets, it is necessary to locate and measure normal zone voltages that are small compared with the mutual and self-induced voltages. The method described in this paper uses two sets of voltage measurements to locate and measure one or more normal zones in any number of coupled coils. One set of voltages is the outputs of bridges that balance out the self-induced voltages. The other set of voltages can be the voltages across the coils, although alternatives are possible. The two sets of equations form a single combined set of equations. Each normal zone location or combination of normal zones has a set of these combined equations associated with it. It is demonstrated that the normal zone can be located and the correct set chosen, allowing determination of the size of the normal zone. Only a few operations take place in a working detector: multiplication by a constant, addition, and simple decision-making. In many cases the detector for each coil, although weakly linked to the other detectors, can be considered to be independent.

  6. EMG normalization method based on grade 3 of manual muscle testing: Within- and between-day reliability of normalization tasks and application to gait analysis.

    Science.gov (United States)

    Tabard-Fougère, Anne; Rose-Dulcina, Kevin; Pittet, Vincent; Dayer, Romain; Vuillerme, Nicolas; Armand, Stéphane

    2018-02-01

    Electromyography (EMG) is an important parameter in Clinical Gait Analysis (CGA), and is generally interpreted with timing of activation. EMG amplitude comparisons between individuals, muscles or days need normalization. There is no consensus on existing methods. The gold standard, maximum voluntary isometric contraction (MVIC), is not adapted to pathological populations because patients are often unable to perform an MVIC. The normalization method inspired by the isometric grade 3 of manual muscle testing (isoMMT3), which is the ability of a muscle to maintain a position against gravity, could be an interesting alternative. The aim of this study was to evaluate the within- and between-day reliability of the isoMMT3 EMG normalizing method during gait compared with the conventional MVIC method. EMG of the lower limb muscles (gluteus medius, rectus femoris, tibialis anterior, semitendinosus) was recorded bilaterally in nine healthy participants (five males, aged 29.7 ± 6.2 years, BMI 22.7 ± 3.3 kg·m⁻²), giving a total of 18 independent legs. Three repeated measurements of the isoMMT3 and MVIC exercises were performed with an EMG recording. EMG amplitude of the muscles during gait was normalized by these two methods. This protocol was repeated one week later. Within- and between-day reliability of the normalization tasks was similar for the isoMMT3 and MVIC methods. Within- and between-day reliability of gait EMG normalized by isoMMT3 was higher than with MVIC normalization. These results indicate that EMG normalization using isoMMT3 is a reliable method with no special equipment needed and will support CGA interpretation. The next step will be to evaluate this method in pathological populations. Copyright © 2017 Elsevier B.V. All rights reserved.
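
    Whatever the reference task, amplitude normalization itself is a one-line operation: gait EMG is expressed as a percentage of the amplitude recorded during the normalization task. A minimal sketch with synthetic envelopes (all values illustrative):

```python
import numpy as np

def normalize_emg(gait_emg, task_emg):
    """Express gait EMG as a percentage of the amplitude recorded during
    the normalization task (e.g., isoMMT3 or MVIC)."""
    reference = np.mean(np.abs(task_emg))   # mean rectified task amplitude
    return 100.0 * np.abs(gait_emg) / reference

# Hypothetical rectified EMG envelopes (mV) for one muscle.
rng = np.random.default_rng(3)
gait = np.abs(rng.normal(0.12, 0.04, 1000))
isommt3 = np.abs(rng.normal(0.30, 0.05, 500))

print(normalize_emg(gait, isommt3).mean().round(1), "% of isoMMT3")
```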

  7. A novel waste form for disposal of spent-nuclear-fuel reprocessing waste: A vitrifiable cement

    International Nuclear Information System (INIS)

    Gougar, M.L.D.; Scheetz, B.E.; Siemer, D.D.

    1999-01-01

    A cement capable of being hot isostatically pressed into a glass ceramic has been proposed as the waste form for spent-nuclear-fuel reprocessing wastes at the Idaho National Engineering and Environmental Laboratory (INEEL). This intermediate cement, with a composition based on that of common glasses, has been designed and tested. The cement formulations included mixed INEEL wastes, blast furnace slag, reactive silica, and INEEL soil or vermiculite, which were activated with potassium or sodium hydroxide. Following autoclave processing, the cements were characterized. X-ray diffraction analysis revealed three notable crystalline phases: quartz, calcite, and fluorite. Results of compressive strength testing ranged from 1452 to 4163 psi, exceeding the US Nuclear Regulatory Commission (NRC)-suggested standard of >500 psi. From American National Standards Institute/American Nuclear Society 16.1-1986 leach testing, effective diffusivities were determined to be on the order of 10⁻¹¹ to 10⁻¹⁰ cm²/s for Cs and 10⁻¹² cm²/s for Sr, which are four orders of magnitude less than the diffusivities in some other radwaste materials. Average leach indices (LI) were 9.6 and 11.9 for Cs and Sr, respectively, meeting the NRC standard of LI > 6. The 28-day Materials Characterization Center-1 leach testing resulted in normalized elemental mass losses between 0.63 and 28 g/(m²·day) for Cs and between 0.34 and 0.70 g/(m²·day) for Sr; the Sr losses meet the industry-accepted standard, while the Cs losses indicate a process-sensitive parameter.

  8. Modelling of tension stiffening for normal and high strength concrete

    DEFF Research Database (Denmark)

    Christiansen, Morten Bo; Nielsen, Mogens Peter

    1998-01-01

    ...form the model is extended to apply to biaxial stress fields as well. To determine the biaxial stress field, the theorem of minimum complementary elastic energy is used. The theory has been compared with tests on rods, disks, and beams of both normal and high strength concrete, and very good results...

  9. The construction of solid waste form test and inspection facility

    International Nuclear Information System (INIS)

    Park, Hun Hwee; Lee, Kang Moo; Jung, In Ha; Kim, Sung Hwan; Yoo, Jeong Woo; Lee, Jong Youl; Bae, Sang Min

    1988-01-01

    The solid waste form test and inspection facility is a facility for testing and inspecting the characteristics of waste forms, such as homogeneity, mechanical structure, thermal behaviour, water resistance and leachability. Such characteristics of waste forms are required to meet certain conditions for long-term storage or for final disposal of wastes. The facility will be used to evaluate, by test and inspection, the safety of waste disposal. At present, efforts to find the most effective management of the radioactive wastes generated from power plants and radioisotope users are being made by those working in this field. The facility therefore becomes a more significant tool, since it guides the successful conversion of wastes into forms that lend credibility to the safety of waste disposal. In addition, the overall technical standards for the inspection of waste forms, such as the standardized equipment and processes in the facility, will be established in the beginning of the 1990s when the waste management project comes on stream. Some items of the project have been standardized for the purpose of localization. In the future, this facility will be utilized not only for the inspection of waste forms but also for the periodic decontamination of apparatus by remote operation techniques. (Author)

  10. Performance standard for dose Calibrator

    CERN Document Server

    Darmawati, S

    2002-01-01

    A dose calibrator is an instrument used in hospitals to determine the activity of radionuclides for nuclear medicine purposes. The International Electrotechnical Commission (IEC) has published the IEC 1303:1994 standard, which can be used as guidance for testing the performance of the instrument. This paper briefly describes the content of the document, as well as the assessment that had been carried out to test instrument accuracy in Indonesia through intercomparison measurements. It is suggested that hospitals engage a medical physicist to perform the test for their dose calibrators. The need for a performance standard in the form of an Indonesian Standard is also touched upon.

  11. Alpha-amidated peptides derived from pro-opiomelanocortin in normal human pituitary

    DEFF Research Database (Denmark)

    Fenger, M; Johnsen, A H

    1988-01-01

    Normal human pituitaries were extracted in boiling water and acetic acid, and the alpha-amidated peptide products of pro-opiomelanocortin (POMC), alpha-melanocyte-stimulating hormone (alpha MSH), gamma-melanocyte-stimulating hormone (gamma 1MSH), and amidated hinge peptide (HP-N), as well as ... [adrenocorticotropic hormone (ACTH)-(1-39), ACTH-(1-14) and alpha MSH immunoreactivity]. alpha MSH and ACTH-(1-14) were present only in non- or mono-acetylated forms. Only large forms of gamma 1MSH and gamma 2MSH were present, in partly glycosylated states. The hinge peptides were amidated to an extent two to three orders ... amidated POMC-related peptides are present in normal human pituitary. It also shows that cleavage in vivo takes place at all dibasic amino acids but one in the N-terminal POMC region; the exception is at POMC-(49-50), N-terminal of the gamma MSH sequence. The pattern of peptides produced suggests...

  12. The effects of normal aging on multiple aspects of financial decision-making.

    Directory of Open Access Journals (Sweden)

    Dorien F Bangma

    Financial decision-making (FDM) is crucial for independent living. Due to the cognitive decline that accompanies normal aging, older adults might have difficulties in some aspects of FDM. However, improved knowledge, personal experience and affective decision-making, which are also related to normal aging, may lead to stable or even improved age-related performance in some other aspects of FDM. Therefore, the present explorative study examines the effects of normal aging on multiple aspects of FDM. One hundred and eighty participants (range 18-87 years) were assessed with eight FDM tests and several standard neuropsychological tests. Age effects were evaluated using hierarchical multiple regression analyses. The validity of the prediction models was examined by internal validation (i.e., a bootstrap resampling procedure) as well as external validation on another, independent sample of participants (n = 124). Multiple regression and correlation analyses were applied to investigate the mediation effect of standard measures of cognition on the observed effects of age on FDM. On a relatively basic level of FDM (e.g., paying bills) or in the use of FDM styles, no significant effects of aging were found. However, more complex FDM, such as making decisions in accordance with specific rules, becomes more difficult with advancing age. Furthermore, an older age was found to be related to a decreased sensitivity to impulsive buying. These results were confirmed by the internal and external validation analyses. Mediation effects of numeracy and planning were found to explain parts of the association between one aspect of FDM (i.e., competence in decision rules) and age; however, these cognitive domains were not able to completely explain the relation between age and FDM. Normal aging has a negative influence on a complex aspect of FDM; however, other aspects appear to be unaffected by normal aging or even improve.

  13. Inflation and dark energy from three-forms

    International Nuclear Information System (INIS)

    Koivisto, Tomi S.; Nunes, Nelson J.

    2009-01-01

    Three-forms can give rise to viable cosmological scenarios of inflation and dark energy with potentially observable signatures distinct from standard single scalar field models. In this study, the background dynamics and linear perturbations of self-interacting three-form cosmology are investigated. The phase space of cosmological solutions possesses (super)-inflating attractors and saddle points, which can describe three-form driven inflation or dark energy. The quantum generation and the classical evolution of perturbations are considered. The scalar and tensor spectra from a three-form inflation and the impact from the presence of a three-form on matter perturbations are computed. Stability properties and equivalence of the model with alternative formulations are discussed.

  14. Gaps-in-Noise test: gap detection thresholds in 9-year-old normal-hearing children.

    Science.gov (United States)

    Marculino, Carolina Finetti; Rabelo, Camila Maia; Schochat, Eliane

    2011-12-01

    To establish the standard criteria for the Gaps-in-Noise (GIN) test in 9-year-old normal-hearing children; to obtain the mean gap detection thresholds; and to verify the influence of the variables gender and ear on the gap detection thresholds. Forty normal-hearing individuals, 20 male and 20 female, with ages ranging from 9 years to 9 years and 11 months, were evaluated. The procedures performed were: anamnesis, audiological evaluation, acoustic immittance measures (tympanometry and acoustic reflex), Dichotic Digits Test, and GIN test. The results obtained were statistically analyzed. The results revealed similar performance of right and left ears in the population studied. There was also no difference regarding the variable gender. In the subjects evaluated, the mean gap detection thresholds were 4.4 ms for the right ear, and 4.2 ms for the left ear. The values obtained for right and left ear, as well as their standard deviations, can be used as standard criteria for 9-year-old children, regardless of ear or gender.

  15. Feasibility of Computed Tomography-Guided Methods for Spatial Normalization of Dopamine Transporter Positron Emission Tomography Image.

    Science.gov (United States)

    Kim, Jin Su; Cho, Hanna; Choi, Jae Yong; Lee, Seung Ha; Ryu, Young Hoon; Lyoo, Chul Hyoung; Lee, Myung Sik

    2015-01-01

    Spatial normalization is a prerequisite step for analyzing positron emission tomography (PET) images, both by using a volume-of-interest (VOI) template and by voxel-based analysis. Magnetic resonance (MR) or ligand-specific PET templates are currently used for spatial normalization of PET images. We used computed tomography (CT) images acquired with a PET/CT scanner for the spatial normalization of [18F]-N-3-fluoropropyl-2-betacarboxymethoxy-3-beta-(4-iodophenyl) nortropane (FP-CIT) PET images and compared target-to-cerebellar standardized uptake value ratio (SUVR) values with those obtained from MR- or PET-guided spatial normalization methods in healthy controls and patients with Parkinson's disease (PD). We included 71 healthy controls and 56 patients with PD who underwent [18F]-FP-CIT PET scans with a PET/CT scanner and T1-weighted MR scans. Spatial normalization of MR images was done with a conventional spatial normalization tool (cvMR) and with the DARTEL toolbox (dtMR) in statistical parametric mapping software. The CT images were modified in two ways, skull-stripping (ssCT) and intensity transformation (itCT). We normalized PET images with cvMR-, dtMR-, ssCT-, itCT-, and PET-guided methods by using specific templates for each modality and measured striatal SUVR with a VOI template. The SUVR values measured with FreeSurfer-generated VOIs (FSVOI) overlaid on the original PET images were also used as a gold standard for comparison. The SUVR values derived from all four structure-guided spatial normalization methods were highly correlated with those measured with FSVOI (P < ...). Both CT-guided normalization methods provided reliable striatal SUVR values comparable to those obtained with MR-guided methods. CT-guided methods can be useful for analyzing dopamine transporter PET images when MR images are unavailable.

  16. Mean fields and self consistent normal ordering of lattice spin and gauge field theories

    International Nuclear Information System (INIS)

    Ruehl, W.

    1986-01-01

    Classical Heisenberg spin models on lattices possess mean field theories that are well defined real field theories on finite lattices. These mean field theories can be self consistently normal ordered. This leads to a considerable improvement over standard mean field theory. This concept is carried over to lattice gauge theories. We construct first an appropriate real mean field theory. The equations determining the Gaussian kernel necessary for self-consistent normal ordering of this mean field theory are derived. (orig.)

  17. Evaluation of the normal fetal kidney length and its correlation with gestational age.

    Science.gov (United States)

    Seilanian Toosi, Farrokh; Rezaie-Delui, Hossein

    2013-05-30

    A true estimation of gestational age (GA) plays an important role in quality maternity care and in scheduling the labor date. This study aimed to evaluate the normal fetal kidney length (KL) and its correlation with GA. In a cross-sectional study, 92 pregnant women between the 8th and 10th week of gestation with normal singleton pregnancies underwent standard ultrasound fetal biometry and kidney length measurement. Univariate and multivariate linear regression analysis was used to create a predictive equation estimating GA from KL and fetal biometry parameters. A significant correlation was found between GA and KL (r = 0.83, P < 0.002). The best GA predictor was obtained by combining head circumference, fetal biparietal diameter, femur length and KL, with a standard error (SE) of about 14.2 days. Our findings showed that KL measurements, in combination with other fetal biometric parameters, could predict the age of pregnancy with better precision.

  18. Perron–Frobenius theorem for nonnegative multilinear forms and extensions

    OpenAIRE

    Friedland, S.; Gaubert, S.; Han, L.

    2013-01-01

    We prove an analog of the Perron-Frobenius theorem for multilinear forms with nonnegative coefficients, and more generally, for polynomial maps with nonnegative coefficients. We determine the geometric convergence rate of the power algorithm to the unique normalized eigenvector.
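
    The power algorithm in question can be sketched for an order-3 nonnegative tensor. The iteration below follows the standard Ng-Qi-Zhou-style scheme for nonnegative tensor eigenpairs, applied to a randomly generated positive tensor; it is an illustration of the algorithm class, not the paper's construction.

```python
import numpy as np

def tensor_power_iteration(T, tol=1e-10, max_iter=500):
    """Power iteration for the Perron eigenpair of a nonnegative
    order-3 tensor: (T x^2)_i = sum_jk T[i,j,k] x_j x_k."""
    n = T.shape[0]
    x = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        y = np.einsum("ijk,j,k->i", T, x, x)
        x_new = np.sqrt(y)               # exponent 1/(m-1) with m = 3
        x_new /= x_new.sum()             # keep the iterate normalized
        if np.abs(x_new - x).max() < tol:
            break
        x = x_new
    # Rayleigh-like quotient recovers the eigenvalue at convergence.
    lam = np.einsum("ijk,i,j,k->", T, x, x, x) / np.sum(x**3)
    return lam, x

rng = np.random.default_rng(4)
T = rng.random((5, 5, 5))                # entrywise positive => primitive
lam, x = tensor_power_iteration(T)
print(round(float(lam), 6), x.round(4))
```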

  19. In vivo ¹H MR spectroscopy of human brain in six normal volunteers

    International Nuclear Information System (INIS)

    Choe, Bo Young; Suh, Tae Suk; Bahk, Yong Whee; Shinn, Kyung Sub

    1993-01-01

    In vivo ¹H MR spectroscopic studies were performed on the human brain in six normal volunteers. Some distinct proton metabolites, such as N-acetylaspartate (NAA), creatine/phosphocreatine (Cr), choline/phosphocholine (Cho), myo-inositol (Ins) and lipid (fat), were clearly identified in normal brain tissue. The signal intensity of the NAA resonance is the strongest. The standard ratios of metabolites from normal brain tissue in specific regions were obtained as references for further in vivo ¹H MR spectroscopic studies. Our initial results suggest that in vivo ¹H MR spectroscopy may provide more precise diagnosis on the basis of metabolic information on brain tissues. The unique ability of in vivo ¹H MR spectroscopy to offer noninvasive information about tissue biochemistry in patients will stimulate its impact on clinical research and disease diagnosis.

  20. Statistical properties of the normalized ice particle size distribution

    Science.gov (United States)

    Delanoë, Julien; Protat, Alain; Testud, Jacques; Bouniol, Dominique; Heymsfield, A. J.; Bansemer, A.; Brown, P. R. A.; Forbes, R. M.

    2005-05-01

    Testud et al. (2001) have recently developed a formalism, known as the "normalized particle size distribution (PSD)", which consists in scaling the diameter and concentration axes in such a way that the normalized PSDs are independent of water content and mean volume-weighted diameter. In this paper we investigate the statistical properties of the normalized PSD for the particular case of ice clouds, which are known to play a crucial role in the Earth's radiation balance. To do so, an extensive database of airborne in situ microphysical measurements has been constructed. A remarkable stability in shape of the normalized PSD is obtained. The impact of using a single analytical shape to represent all PSDs in the database is estimated through an error analysis on the instrumental (radar reflectivity and attenuation) and cloud (ice water content, effective radius, terminal fall velocity of ice crystals, visible extinction) properties. This resulted in a roughly unbiased estimate of the instrumental and cloud parameters, with small standard deviations ranging from 5 to 12%. This error is found to be roughly independent of the temperature range. This stability in shape and its single analytical approximation implies that two parameters are now sufficient to describe any normalized PSD in ice clouds: the intercept parameter N₀* and the mean volume-weighted diameter Dm. Statistical relationships (parameterizations) between N₀* and Dm have then been evaluated in order to further reduce the number of unknowns. It has been shown that a parameterization of N₀* and Dm by temperature could not be envisaged to retrieve the cloud parameters. Nevertheless, Dm-T and mean-maximum-dimension-T parameterizations have been derived and compared to the parameterization of Kristjánsson et al. (2000) currently used to characterize particle size in climate models. The new parameterization generally produces larger particle sizes at any temperature than that of Kristjánsson et al. (2000).
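
    In symbols, the normalization collapses every PSD onto a single shape function. The following sketch uses moment-based definitions common in the normalized-PSD literature; the exact constants are assumed here, since the abstract does not spell them out:

```latex
% Normalized-PSD scaling: every measured PSD collapses onto one shape
% function F once diameter is scaled by Dm and concentration by N0*.
% Moment-based definitions in the style of the Testud et al. formalism.
N(D) = N_0^{*}\, F\!\left( \frac{D}{D_m} \right),
\qquad
D_m = \frac{M_4}{M_3},
\qquad
N_0^{*} = \frac{4^{4}}{6}\,\frac{M_3^{5}}{M_4^{4}},
\qquad
M_n = \int_0^{\infty} D^{\,n}\, N(D)\, \mathrm{d}D .
```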

  1. Culture of normal human blood cells in a diffusion chamber system II. Lymphocyte and plasma cell kinetics

    International Nuclear Information System (INIS)

    Chikkappa, G.; Carsten, A.L.; Chanana, A.D.; Cronkite, E.P.

    1979-01-01

    Normal human blood leukocytes were cultured in Millipore diffusion chambers implanted into the peritoneal cavities of irradiated mice. The evaluation of survival and proliferation kinetics of cells in the lymphocytic series suggested that the lymphoid cells are formed by transition of small and/or large lymphocytes, and the lymphoblasts from the lymphoid cells. There was also evidence indicating that some of the cells in these two compartments are formed by proliferation. The evaluation of the plasmacytic series suggested that the plasma cells are formed from plasmacytoid lymphocytes by transition, and the latter from the transition of lymphocytes. In addition, a relatively small fraction of cells in these two compartments is formed by proliferation. Mature plasma cells do not proliferate, while immature plasma cells do. Estimation of the magnitude of plasma cell formation in the cultures at day 18 indicated that at least one plasma cell is formed for every 6 normal human blood lymphocytes introduced into the culture.

  2. A normalized model for the half-bridge series resonant converter

    Science.gov (United States)

    King, R.; Stuart, T. A.

    1981-01-01

    Closed-form steady-state equations are derived for the half-bridge series resonant converter with a rectified (dc) load. Normalized curves for various currents and voltages are then plotted as a function of the circuit parameters. Experimental results based on a 10-kHz converter are presented for comparison with the calculations.

  3. Mathematical Communication in State Standards before the Common Core

    Science.gov (United States)

    Kosko, Karl Wesley; Gao, Yang

    2017-01-01

    Mathematical communication has been an important feature of standards documents since National Council of Teachers of Mathematics' (NCTM) (1989) "Curriculum and Evaluation Standards." Such an emphasis has influenced content standards of states from then to present. This study examined how effective the prevalence of various forms of…

  4. A spectrum standardization approach for laser-induced breakdown spectroscopy measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wang Zhe, E-mail: zhewang@mail.tsinghua.edu.cn; Li Lizhi; West, Logan; Li Zheng, E-mail: lz-dte@tsinghua.edu.cn; Ni Weidou

    2012-02-15

    This paper follows and completes a previous presentation of a spectrum normalization method for laser-induced breakdown spectroscopy (LIBS) measurements by converting the experimentally recorded line intensity at varying operational conditions to the intensity that would be obtained under a 'standard state' condition, characterized by a standard plasma temperature, electron number density, and total number density of the species of interest. At first, for each laser shot and corresponding spectrum, the line intensities of the species of interest are converted to the intensity at a fixed plasma temperature and electron number density, but with varying total number density. Under this state, if the influence of changing plasma morphology is neglected, the sum of multiple spectral line intensities for the measured element is proportional to the total number density of that element. Therefore, the fluctuation of the total number density, or the variation of ablation mass, can be compensated for by applying this proportional relationship. The application of the method to Cu in 29 brass alloy samples showed an improvement over the commonly applied normalization method with regard to measurement precision and accuracy. The average relative standard deviation (RSD) value, average value of the error bar, R², root mean square error of prediction (RMSEP), and average value of the maximum relative error were 5.29%, 0.68%, 0.98, 2.72%, and 16.97%, respectively, while the corresponding values for normalization with the whole spectrum area were 8.61%, 1.37%, 0.95, 3.28%, and 29.19%, respectively. - Highlights: ► Intensity converted into an ideal standard plasma state for uncertainty reduction. ► Ablated mass fluctuations compensated by variation of the sum of multiple intensities. ► A spectrum standardization model established. ► Results in both uncertainty...
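
    The temperature-conversion step at the heart of the standardization can be sketched with a simple Boltzmann emission model, I ∝ (N g_k A_ki / U(T)) exp(-E_k / k_B T). In the sketch below the line constants and plasma temperatures are placeholders, and the partition function U(T) is treated as constant for brevity; none of these values come from the paper.

```python
import numpy as np

K_B = 8.617e-5  # Boltzmann constant, eV/K

def to_standard_state(intensity, e_upper_ev, t_measured, t_standard):
    """Convert a measured line intensity to the intensity expected at the
    standard plasma temperature (same emitter number density assumed,
    partition function treated as constant)."""
    boltzmann_measured = np.exp(-e_upper_ev / (K_B * t_measured))
    boltzmann_standard = np.exp(-e_upper_ev / (K_B * t_standard))
    return intensity * boltzmann_standard / boltzmann_measured

# Hypothetical Cu I line (upper-level energy 3.82 eV) measured at a
# shot-to-shot varying temperature, converted to a 10 000 K standard state.
for t_shot, i_shot in [(9500.0, 1.00e4), (10800.0, 1.35e4)]:
    print(round(to_standard_state(i_shot, 3.82, t_shot, 10_000.0)))
```

    Summing several such standardized lines of one element then gives a quantity proportional to its total number density, which is the handle used above to compensate for ablation-mass fluctuations.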

  5. Normalization: A Preprocessing Stage

    OpenAIRE

    Patro, S. Gopal Krishna; Sahu, Kishore Kumar

    2015-01-01

    As we know, normalization is a pre-processing stage for any type of problem statement. In particular, normalization plays an important role in fields such as soft computing and cloud computing for manipulating data, e.g., scaling the range of data down or up before it is used in a further stage. There are many normalization techniques, namely Min-Max normalization, Z-score normalization and Decimal scaling normalization. So by referring to these normalization techniques we are ...
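
    The three techniques named above admit near one-line implementations; a minimal sketch (the sample column of values is arbitrary):

```python
import numpy as np

def min_max(x, new_min=0.0, new_max=1.0):
    """Rescale to [new_min, new_max]."""
    return (x - x.min()) / (x.max() - x.min()) * (new_max - new_min) + new_min

def z_score(x):
    """Center to mean 0, scale to standard deviation 1."""
    return (x - x.mean()) / x.std()

def decimal_scaling(x):
    """Divide by 10^j, the smallest power of ten with all |values| < 1."""
    j = int(np.ceil(np.log10(np.abs(x).max())))
    return x / 10**j

data = np.array([120.0, 55.0, 870.0, 300.0, 15.0])
print(min_max(data), z_score(data).round(3), decimal_scaling(data), sep="\n")
```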

  6. Functional Differences Between Placental Micro- and Macrovascular Endothelial Colony-Forming Cells

    Science.gov (United States)

    Solomon, Ioana; O’Reilly, Megan; Ionescu, Lavinia; Alphonse, Rajesh S.; Rajabali, Saima; Zhong, Shumei; Vadivel, Arul; Shelley, W. Chris; Yoder, Mervin C.

    2016-01-01

    Alterations in the development of the placental vasculature can lead to pregnancy complications, such as preeclampsia. Currently, the cause of preeclampsia is unknown, and there are no specific prevention or treatment strategies. Further insight into the placental vasculature may aid in identifying causal factors. Endothelial colony-forming cells (ECFCs) are a subset of endothelial progenitor cells capable of self-renewal and de novo vessel formation in vitro. We hypothesized that ECFCs exist in the micro- and macrovasculature of the normal, term human placenta. Human placentas were collected from term pregnancies delivered by cesarean section (n = 16). Placental micro- and macrovasculature was collected from the maternal and fetal side of the placenta, respectively, and ECFCs were isolated and characterized. ECFCs were CD31+, CD105+, CD144+, CD146+, CD14−, and CD45−, took up 1,1′-dioctadecyl-3,3,3′,3′-tetramethyl-indocarbocyanine perchlorate-labeled acetylated low-density lipoprotein, and bound Ulex europaeus agglutinin 1. In vitro, macrovascular ECFCs had a greater potential to generate high-proliferative colonies and formed more complex capillary-like networks on Matrigel compared with microvascular ECFCs. In contrast, in vivo assessment demonstrated that microvascular ECFCs had a greater potential to form vessels. Macrovascular ECFCs were of fetal origin, whereas microvascular ECFCs were of maternal origin. ECFCs exist in the micro- and macrovasculature of the normal, term human placenta. Although macrovascular ECFCs demonstrated greater vessel and colony-forming potency in vitro, this did not translate in vivo, where microvascular ECFCs exhibited a greater vessel-forming ability. These important findings contribute to the current understanding of normal placental vascular development and may aid in identifying factors involved in preeclampsia and other pregnancy complications. Significance: This research confirms that resident endothelial colony-forming...

  7. Effects of Foveal Ablation on Emmetropization and Form-Deprivation Myopia

    Science.gov (United States)

    Smith, Earl L.; Ramamirtham, Ramkumar; Qiao-Grider, Ying; Hung, Li-Fang; Huang, Juan; Kee, Chea-su; Coats, David; Paysse, Evelyn

    2009-01-01

    Purpose Because of the prominence of central vision in primates, it has generally been assumed that signals from the fovea dominate refractive development. To test this assumption, the authors determined whether an intact fovea was essential for either normal emmetropization or the vision-induced myopic errors produced by form deprivation. Methods In 13 rhesus monkeys at 3 weeks of age, the fovea and most of the perifovea in one eye were ablated by laser photocoagulation. Five of these animals were subsequently allowed unrestricted vision. For the other eight monkeys with foveal ablations, a diffuser lens was secured in front of the treated eyes to produce form deprivation. Refractive development was assessed along the pupillary axis by retinoscopy, keratometry, and A-scan ultrasonography. Control data were obtained from 21 normal monkeys and three infants reared with plano lenses in front of both eyes. Results Foveal ablations had no apparent effect on emmetropization. Refractive errors for both eyes of the treated infants allowed unrestricted vision were within the control range throughout the observation period, and there were no systematic interocular differences in refractive error or axial length. In addition, foveal ablation did not prevent form deprivation myopia; six of the eight infants that experienced monocular form deprivation developed myopic axial anisometropias outside the control range. Conclusions Visual signals from the fovea are not essential for normal refractive development or the vision-induced alterations in ocular growth produced by form deprivation. Conversely, the peripheral retina, in isolation, can regulate emmetropizing responses and produce anomalous refractive errors in response to abnormal visual experience. These results indicate that peripheral vision should be considered when assessing the effects of visual experience on refractive development. PMID:17724167

  8. A4 see-saw models and form dominance

    International Nuclear Information System (INIS)

    Chen, M-C; King, Stephen F.

    2009-01-01

    We introduce the idea of Form Dominance in the (type I) see-saw mechanism, according to which a particular right-handed neutrino mass eigenstate is associated with a particular physical neutrino mass eigenstate, leading to a form diagonalizable effective neutrino mass matrix. Form Dominance, which allows an arbitrary neutrino mass spectrum, may be regarded as a generalization of Constrained Sequential Dominance, which only allows strongly hierarchical neutrino masses. We consider alternative implementations of the see-saw mechanism in minimal A4 see-saw models and show that such models satisfy Form Dominance, leading to neutrino mass sum rules which predict closely spaced neutrino masses with a normal or inverted neutrino mass ordering. To avoid the partial cancellations inherent in such models we propose Natural Form Dominance, in which a different flavon is associated with each physical neutrino mass eigenstate.

  9. 75 FR 28777 - Information Collection; Financial Information Security Request Form

    Science.gov (United States)

    2010-05-24

    ... Collection; Financial Information Security Request Form AGENCY: Forest Service, USDA. ACTION: Notice; Request... currently approved information collection; Financial Information Security Request Form. DATES: Comments must... Standard Time, Monday through Friday. SUPPLEMENTARY INFORMATION: Title: Financial Information Security...

  10. Helicobacter pylori infection in patients with dyspeptic symptoms having normal endoscopy

    International Nuclear Information System (INIS)

    Malik, M.F.; Hussain, T.; Khan, M.N.; Mirza, S.A.

    2010-01-01

    To find out the frequency of Helicobacter pylori infection in the local population presenting with dyspeptic symptoms but having normal upper gastrointestinal endoscopic findings. One hundred cases of dyspepsia with normal upper gastrointestinal endoscopy were taken as the study population. Although the gold standard for the presence or absence of Helicobacter pylori infection is culture, in this study the diagnostic method used was histopathology of the gastric antrum. The male-to-female ratio was 2:1. The majority of patients were 40 years of age or less, the mean age being 40.52 (SD ± 13.22). The chief symptoms were epigastric pain (46%) and upper abdominal discomfort (27%). Helicobacter pylori gastritis was found in 51% of cases. We conclude that Helicobacter pylori infection is quite common in dyspeptic patients with apparently normal endoscopic gastric mucosal findings. Eradication therapy should be instituted in positive cases to avoid long-term complications. (author)

  11. Proton MR spectroscopic features of the human liver: in-vivo application to the normal condition

    International Nuclear Information System (INIS)

    Cho, Soon Gu; Kim, Mi Young; Kim, Young Soo; Choi, Won; Shin, Seok Hwan; Ok, Chul Soo; Suh, Chang Hae

    1999-01-01

    To determine the feasibility of MR spectroscopy in the living human liver, and to evaluate the corresponding proton MR spectroscopic features. The proton MR spectroscopic findings were reviewed in fifteen normal volunteers with neither previous nor present liver disease. Twelve subjects were male and three were female; they were aged between 28 and 32 (mean, 30) years. MR spectroscopy involved the use of a 1.5 T GE Signa Horizon system with a body coil (GE Medical Systems, Milwaukee, U.S.A.). We used STEAM (Stimulated Echo-Acquisition Mode) with TR/TE of 3000/30 msec for signal acquisition, and the prone position without respiratory interruption. The mean and standard deviation of the ratios of glutamate+glutamine/lipids, phosphomonoesters/lipids, and glycogen+glucose/lipids were calculated from the areas of their peaks. The proton MR spectroscopic findings of normal human livers showed four distinctive peaks, i.e., lipids, glutamate and glutamine complex, phosphomonoesters, and glycogen and glucose complex. The mean and standard deviation of the ratios of glutamate+glutamine/lipids, phosphomonoesters/lipids, and glycogen+glucose/lipids were 0.02±0.01, 0.01±0.01, and 0.04±0.03, respectively. MR spectroscopy can be successfully applied in the living normal human liver. When applied to a liver in a pathologic condition, these findings can be used as a standard.

  12. From explicit to implicit normal mode initialization of a limited-area model

    Energy Technology Data Exchange (ETDEWEB)

    Bijlsma, S.J.

    2013-02-15

    In this note the implicit normal mode initialization of a limited-area model is discussed from a different point of view. To that end it is shown that the equations describing explicit normal mode initialization, applied to the shallow water equations in differentiated form on the sphere, can readily be derived in normal mode space if the model equations are separable, but can be transformed into the implicit equations in physical space only in the case of stationary Rossby modes. This is a consequence of the simple relations between the components of the different modes in that case. In addition, a simple eigenvalue problem is given for the frequencies of the gravity waves. (orig.)

  13. Ubiquitous isotopic anomalies in Ti from normal Allende inclusions

    International Nuclear Information System (INIS)

    Niemeyer, S.; Lugmair, G.W.

    1981-01-01

    A newly developed technique for high-precision isotopic analysis of titanium was applied to terrestrial rocks and coarse- and fine-grained Allende inclusions. Repeated analyses of three terrestrial rocks gave excellent agreement with a Ti metal standard (deviations usually less than 2 × 10⁻⁴). All seven Allende inclusions studied here were previously determined to contain isotopically normal Nd and/or Sm, indicating that none belongs to the small group of peculiar inclusions dubbed FUN inclusions. (orig./ME)

  14. On the Ergodic Capacity of Dual-Branch Correlated Log-Normal Fading Channels with Applications

    KAUST Repository

    Al-Quwaiee, Hessa

    2015-05-01

    Closed-form expressions for the ergodic capacity of independent or correlated diversity branches over log-normal fading channels are not available in the literature. Thus, it is of interest to investigate the behavior of this metric at high signal-to-noise ratio (SNR). In this work, we propose simple closed-form asymptotic expressions for the ergodic capacity of dual-branch correlated log-normal fading corresponding to selection combining and switch-and-stay combining. Furthermore, we capitalize on these new results to find the asymptotic ergodic capacity of a correlated dual-branch free-space optical communication system under the impact of pointing error, with both heterodyne and intensity modulation/direct detection. © 2015 IEEE.
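
    Although the paper derives analytic asymptotics, the high-SNR behavior of selection combining over correlated log-normal branches is easy to probe numerically. Below is a minimal Monte Carlo sketch, not the paper's derivation; the correlation, log-scale spread and mean SNR are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
rho = 0.5              # branch correlation (assumed)
sigma = 1.0            # log-scale spread of each branch (assumed)
mean_snr_db = 30.0     # average branch SNR (illustrative)

# Correlated standard normal pair via a Cholesky-style construction
z1 = rng.standard_normal(n)
z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)

snr = 10 ** (mean_snr_db / 10)
g1, g2 = snr * np.exp(sigma * z1), snr * np.exp(sigma * z2)  # log-normal SNRs

gamma_sc = np.maximum(g1, g2)            # selection combining output
cap = np.mean(np.log2(1 + gamma_sc))     # ergodic capacity, bit/s/Hz
cap_hi = np.mean(np.log2(gamma_sc))      # high-SNR approximation log2(1+x) ~ log2(x)
print(f"MC capacity: {cap:.3f}, high-SNR approximation: {cap_hi:.3f}")
```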

  15. Radiographic analysis of the temporomandibular joint by the standardized projection technique

    International Nuclear Information System (INIS)

    Choe, Han Up; Park, Tae Won

    1983-01-01

    The purpose of this study was to investigate the radiographic images of the condylar head in clinically normal subjects and in TMJ patients using a standardized projection technique. 45 subjects with no clinical evidence of TMJ problems and 96 patients with clinical evidence of TMJ problems were evaluated; patients with fracture, trauma or tumor in the TMJ area were excluded from this study. For the evaluation of radiographic images, the author observed the condylar head positions in the closed mouth and 2.54 cm open mouth positions taken by the standardized transcranial oblique lateral projection technique. The results were as follows: 1. In the closed mouth position, the crest of the condylar head took a relatively posterior position to the deepest point of the glenoid fossa in 8.9% of the normals and in 26.6% of TMJ patients. 2. In the 2.54 cm open mouth position, the condylar head took a relatively posterior position to the articular eminence in 2.2% of TMJ patients and 39.6% of the normals. 3. In the open mouth position, the horizontal distance from the deepest point of the glenoid fossa to the condylar head was 13.96 mm in the normals and 10.68 mm in TMJ patients. 4. The distance of true movement of the condylar head was 13.49 mm in the normals and 10.27 mm in TMJ patients. 5. The deviation of the mandible in TMJ patients was slightly greater than that of the normals.

  16. [Precautions of physical performance requirements and test methods during product standard drafting process of medical devices].

    Science.gov (United States)

    Song, Jin-Zi; Wan, Min; Xu, Hui; Yao, Xiu-Jun; Zhang, Bo; Wang, Jin-Hong

    2009-09-01

    This article discusses standardization and normalization of product standards for medical devices. It analyzes problems related to physical performance requirements and test methods arising during the product standard drafting process and makes corresponding suggestions.

  17. Effect of standards on new equipment design by new international standards and industry restraints

    Science.gov (United States)

    Endelman, Lincoln L.

    1991-01-01

    The use of international standards to further trade is one of the objectives of creating a standard. When form, fit and function are compatible, the free interchange of manufactured goods can proceed without hindrance. Unfortunately, by setting up standards that are peculiar to a particular country or district it is possible to exclude competition from a group of manufacturers. A major effort is now underway to develop international laser standards. In the May 1990 issue of Laser Focus World, Donald R. Johnson, the director of industrial technology services for the National Institute of Standards and Technology (NIST, formerly the National Bureau of Standards), is quoted as follows: "The common means of protectionism has been through certification for the market place." The article goes on to say that "Mr. Johnson expects this tradition to continue and that the new European Community (EC) will demand not just safety standards but performance standards as well. . . . the American laser industry must move very quickly on this issue or risk being left behind the European standards bandwagon." The article continues that "laser companies must get involved in the actual standards negotiating process if they are to have a say in future policy. A single set of standards would reduce the need to repeatedly recalibrate products for different national markets." As a member of ISO TC-72 SC9 I am

  18. Effect of Image Linearization on Normalized Compression Distance

    Science.gov (United States)

    Mortensen, Jonathan; Wu, Jia Jie; Furst, Jacob; Rogers, John; Raicu, Daniela

    Normalized Information Distance, based on Kolmogorov complexity, is an emerging metric for image similarity. It is approximated by the Normalized Compression Distance (NCD) which generates the relative distance between two strings by using standard compression algorithms to compare linear strings of information. This relative distance quantifies the degree of similarity between the two objects. NCD has been shown to measure similarity effectively on information which is already a string: genomic string comparisons have created accurate phylogeny trees and NCD has also been used to classify music. Currently, to find a similarity measure using NCD for images, the images must first be linearized into a string, and then compared. To understand how linearization of a 2D image affects the similarity measure, we perform four types of linearization on a subset of the Corel image database and compare each for a variety of image transformations. Our experiment shows that different linearization techniques produce statistically significant differences in NCD for identical spatial transformations.
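
    As a concrete illustration of the metric itself (not of the paper's linearization experiments), NCD can be approximated with any off-the-shelf compressor. A minimal sketch using zlib, with a toy 8x8 "image" linearized in row-major and column-major order:

```python
import zlib
import numpy as np

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance approximated with zlib:
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Two linearizations of the same toy image
img = np.arange(64, dtype=np.uint8).reshape(8, 8)
row_major = img.tobytes(order="C")
col_major = img.tobytes(order="F")

print(ncd(row_major, row_major))   # near 0: identical strings
print(ncd(row_major, col_major))   # larger: same image, different linearization
```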

  19. Dobinski-type relations and the log-normal distribution

    International Nuclear Information System (INIS)

    Blasiak, P; Penson, K A; Solomon, A I

    2003-01-01

    We consider sequences of generalized Bell numbers B(n), n = 1, 2, ..., which can be represented by Dobinski-type summation formulae, i.e. B(n) = (1/C) Σ_{k=0}^{∞} [P(k)]^n / D(k), with P(k) a polynomial, D(k) a function of k and C = const. They include the standard Bell numbers (P(k) = k, D(k) = k!, C = e), their generalizations B_{r,r}(n), r = 2, 3, ..., appearing in the normal ordering of powers of boson monomials (P(k) = (k+r)!/k!, D(k) = k!, C = e), variants of 'ordered' Bell numbers B_o^(p)(n) (P(k) = k, D(k) = ((p+1)/p)^k, C = 1 + p, p = 1, 2, ...), etc. We demonstrate that for α, β, γ, t positive integers (α, t ≠ 0), [B(αn² + βn + γ)]^t is the nth moment of a positive function on (0, ∞) which is a weighted infinite sum of log-normal distributions. (letter to the editor)
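
    For the standard case (P(k) = k, D(k) = k!, C = e), the Dobinski formula can be checked numerically by simple truncation of the sum; a minimal sketch:

```python
import math

def bell_dobinski(n: int, terms: int = 100) -> float:
    """Approximate the nth Bell number via the truncated Dobinski sum
    B(n) = (1/e) * sum_{k>=0} k**n / k!"""
    return sum(k**n / math.factorial(k) for k in range(terms)) / math.e

# First few Bell numbers: 1, 1, 2, 5, 15, 52, 203
print([round(bell_dobinski(n)) for n in range(7)])
```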

  20. Making nuclear 'normal'

    International Nuclear Information System (INIS)

    Haehlen, Peter; Elmiger, Bruno

    2000-01-01

    The mechanics of the Swiss NPPs' 'come and see' programme 1995-1999 were illustrated in our contributions to all PIME workshops since 1996. Now, after four annual 'waves', the whole country has been covered by the NPPs' invitation to dialogue. This makes PIME 2000 the right time to shed some light on one particular objective of this initiative: making nuclear 'normal'. The principal aim of the 'come and see' programme, namely to give the Swiss NPPs 'a voice of their own' by the end of the nuclear moratorium 1990-2000, has clearly been attained and was commented on during earlier PIMEs. It is, however, equally important that Swiss nuclear energy not only made progress in terms of public 'presence', but also in terms of being perceived as a normal part of industry, as a normal branch of the economy. The message that Swiss nuclear energy is nothing but a normal business involving normal people was stressed by several components of the multi-prong campaign: - The speakers in the TV ads were real - 'normal' - visitors' guides and not actors; - The testimonials in the print ads were all real NPP visitors - 'normal' people - and not models; - The mailings inviting a very large number of associations to 'come and see' activated a typical channel of 'normal' Swiss social life; - Spending money on ads (a new activity for Swiss NPPs) appears to have resulted in being perceived by the media as a normal branch of the economy. Today we feel that the 'normality' message has been well received by the media. In the controversy over antinuclear arguments brought forward by environmental organisations, journalists nowadays as a rule give nuclear energy a voice - a normal right to be heard. As in a 'normal' controversy, the media again actively ask themselves questions about specific antinuclear claims, much more than before 1990 when the moratorium started. The result is that in many cases such arguments are discarded by journalists, because they are, e.g., found to be

  1. National Green Building Standard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2012-07-01

    DOE's Building America Program is a research and development program to improve the energy performance of new and existing homes. The ultimate goal of the Building America Program is to achieve examples of cost-effective, energy efficient solutions for all U.S. climate zones. Periodic maintenance of an ANSI standard by review of the entire document and action to revise or reaffirm it on a schedule not to exceed five years is required by ANSI. In compliance, a consensus group has once again been formed and the National Green Building Standard is currently being reviewed to comply with the periodic maintenance requirement of an ANSI standard.

  2. MISTRAL V1.1.1: assessing doses from atmospheric releases in normal and off-normal conditions

    International Nuclear Information System (INIS)

    David Kerouanton; Patrick Devin; Malvina Rennesson

    2006-01-01

    intensity. When the meteorology is specified, the user characterizes the release. In off-normal situations, successive stages can be defined independently of the meteorological stages. For each one, the radionuclides are chosen, the activity released is given and the lung absorption type is specified in accordance with the chemical form. All these compositions can be recorded. In the case of annual releases, the release duration is automatically set to 1 year. When the release is fully specified, the receptors have to be characterized by giving their coordinates or their distance and angle from the release point. Doses can be calculated at various moments specified by the user. In the case of annual releases, the standard observation times correspond to one year, but the user has the possibility to calculate for any whole number of years up to 70 years. Finally, the user has to specify the output data needed: primary data such as concentrations in air or on soil (at the different moments specified, or integrated) or doses. For the public, five potential exposure pathways are available in the MISTRAL V1.1.1 code: from the plume, internal and external exposure; from the deposit, external exposure, internal exposure due to inhalation of resuspended radionuclides, and ingestion. Exposure from the plume occurs as soon as radionuclides are released into the atmosphere, whereas deposited activity generates long-term exposure. The dose impact is calculated at all distances and times stated by the user for different age classes. Ingestion is modelled using a dynamic food chain model, and the ingestion dose is calculated using annual food consumption. The user has the possibility to record his or her own data. In off-normal situations, doses from the deposit are integrated over a duration of 30 days or one year. If the routine release scheme is chosen, doses are integrated over the duration specified by the user as observation times. The time spent inside and outside by any individual can be specified. In addition, attenuation coefficient for
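
    To make the pathway logic concrete, here is a minimal sketch of a single inhalation-dose calculation of the kind such codes chain together. It is not MISTRAL's implementation, and every numeric value is an illustrative assumption.

```python
# Minimal sketch of one exposure pathway (inhalation from the plume).
# All values below are illustrative assumptions, not MISTRAL data.
time_integrated_conc = 2.5e3   # Bq*s/m^3, from the dispersion step (assumed)
breathing_rate = 2.57e-4       # m^3/s, adult reference value (assumed)
dose_coeff = 7.3e-9            # Sv per Bq inhaled, nuclide-specific (assumed)
occupancy = 0.8                # fraction of time spent at the receptor (assumed)

intake = time_integrated_conc * breathing_rate * occupancy   # Bq inhaled
dose = intake * dose_coeff                                   # Sv
print(f"Inhalation dose: {dose:.2e} Sv")
```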

  3. Development of Abbreviated Nine-Item Forms of the Raven's Standard Progressive Matrices Test

    Science.gov (United States)

    Bilker, Warren B.; Hansen, John A.; Brensinger, Colleen M.; Richard, Jan; Gur, Raquel E.; Gur, Ruben C.

    2012-01-01

    The Raven's Standard Progressive Matrices (RSPM) is a 60-item test for measuring abstract reasoning, considered a nonverbal estimate of fluid intelligence, and often included in clinical assessment batteries and research on patients with cognitive deficits. The goal was to develop and apply a predictive model approach to reduce the number of items…

  4. Quasi-normal frequencies: Semi-analytic results for highly damped modes

    International Nuclear Information System (INIS)

    Skakala, Jozef; Visser, Matt

    2011-01-01

    Black hole highly-damped quasi-normal frequencies (QNFs) are very often of the form ω_n = (offset) + i n (gap). We have investigated the genericity of this phenomenon for the Schwarzschild-de Sitter (SdS) black hole by considering a model potential that is piecewise Eckart (piecewise Pöschl-Teller), and developing an analytic 'quantization condition' for the highly-damped quasi-normal frequencies. We find that the ω_n = (offset) + i n (gap) behaviour is common but not universal, with the controlling feature being whether or not the ratio of the surface gravities is a rational number. We furthermore observe that the relation between rational ratios of surface gravities and periodicity of QNFs is very generic, and also occurs within different analytic approaches applied to various types of black hole spacetimes. These observations are of direct relevance to any physical situation where highly-damped quasi-normal modes are important.

  5. Geometrical nuclear diagnosis and total paths of cervical cell evolution from normality to cancer

    Directory of Open Access Journals (Sweden)

    Javier Oswaldo Rodríguez Velásquez

    2015-01-01

    Background: The diagnosis of cervical cytology has problems of inter-observer reproducibility. Methodologies based on fractal geometry have objectively differentiated normal, low-grade squamous intraepithelial lesion (L-SIL) and high-grade squamous intraepithelial lesion (H-SIL) states. Aims: The aim was to develop a mathematical-physical diagnosis and a theoretical generalization of the evolution paths of cervical cells from normal to carcinoma based on their occupation of the box-counting space. Subjects and Methods: Overlaying a grid of 8x8-pixel squares, the number of squares occupied by the nucleus surface and the cytoplasm of 5 normal cells, 5 ASCUS, 5 L-SIL and 5 H-SIL cells was evaluated, as well as the C/N ratio, establishing differences between states. Sensitivity, specificity, negative likelihood ratio, and Kappa coefficient against the gold standard were calculated. A generalization of all possible paths from normality to carcinoma was also developed. Results: The occupancy of the nuclear surface allows normal, L-SIL and H-SIL states to be differentiated, thus avoiding the indeterminacy of ASCUS cells. Compared to the gold standard, this method has a sensitivity and specificity of 100%, a negative likelihood ratio of 0, and a Kappa coefficient of 1. 62,900 possible routes of evolution between normal and H-SIL states were determined, based on the structural features of the cells. Conclusions: An objective and reproducible diagnostic methodology for the development of preneoplastic and neoplastic cervical cells was obtained for clinical application. Additionally, all possible paths of preneoplastic cellular alteration to carcinoma were developed, which facilitates the tracking of patients over time at the clinical level, warning of alterations that lead to malignancy, based on the spatial occupation measurements of the nucleus in fractal space regardless of causes or risk factors.
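
    The core measurement, counting grid squares occupied by a structure, is straightforward to sketch. Below is a minimal illustration with synthetic circular masks standing in for the nucleus and cytoplasm; the 8x8-pixel grid follows the abstract, everything else is an assumption.

```python
import numpy as np

def grid_occupancy(mask: np.ndarray, box: int = 8) -> int:
    """Count box x box pixel squares that overlap the object mask,
    as in a box-counting evaluation of nuclear/cytoplasmic surfaces."""
    h, w = mask.shape
    count = 0
    for i in range(0, h, box):
        for j in range(0, w, box):
            if mask[i:i + box, j:j + box].any():
                count += 1
    return count

# Toy binary masks for a "nucleus" and its "cytoplasm" (illustrative only)
yy, xx = np.mgrid[0:128, 0:128]
nucleus = (yy - 64) ** 2 + (xx - 64) ** 2 < 20 ** 2
cytoplasm = (yy - 64) ** 2 + (xx - 64) ** 2 < 50 ** 2

n_box, c_box = grid_occupancy(nucleus), grid_occupancy(cytoplasm)
print(n_box, c_box, c_box / n_box)   # occupancy counts and C/N ratio
```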

  6. On a direct algorithm for the generation of log-normal pseudo-random numbers

    CERN Document Server

    Chamayou, J M F

    1976-01-01

    The random variable (Π_{i=1}^{n} X_i / X_{i+n})^{1/√(2n)} is used to generate standard log-normal variables Λ(0, 1), where the X_i are independent uniform variables on (0, 1). (8 refs).
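
    A minimal sketch of this direct algorithm: each ln(X_i / X_{i+n}) is a Laplace variate with variance 2, so the scaled sum of n of them is approximately standard normal and its exponential approximately Λ(0, 1). The choice n = 12 below is an illustrative assumption.

```python
import numpy as np

def lognormal_direct(size: int, n: int = 12, rng=None) -> np.ndarray:
    """Generate approximately standard log-normal Lambda(0,1) variables as
    (prod_{i=1..n} X_i / X_{i+n}) ** (1/sqrt(2n)) with X_i ~ U(0, 1)."""
    rng = np.random.default_rng() if rng is None else rng
    x = rng.uniform(size=(size, 2 * n))
    log_ratio = np.log(x[:, :n]).sum(axis=1) - np.log(x[:, n:]).sum(axis=1)
    return np.exp(log_ratio / np.sqrt(2 * n))

samples = lognormal_direct(100_000, rng=np.random.default_rng(1))
print(np.log(samples).mean(), np.log(samples).std())  # ~0 and ~1
```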

  7. ProNormz--an integrated approach for human proteins and protein kinases normalization.

    Science.gov (United States)

    Subramani, Suresh; Raja, Kalpana; Natarajan, Jeyakumar

    2014-02-01

    The task of recognizing and normalizing protein name mentions in biomedical literature is challenging and important for text mining applications such as protein-protein interaction extraction, pathway reconstruction and many more. In this paper, we present ProNormz, an integrated approach for human protein (HP) tagging and normalization. In Homo sapiens, a great number of biological processes are regulated through post-translational phosphorylation by a large gene family called protein kinases. Recognition and normalization of human protein kinases (HPKs) is therefore considered important for extracting the underlying information on regulatory mechanisms from biomedical literature. ProNormz distinguishes HPKs from other HPs besides tagging and normalization. To our knowledge, ProNormz is the first normalization system available to distinguish HPKs from other HPs in addition to the gene normalization task. ProNormz incorporates a specialized synonyms dictionary for human proteins and protein kinases, a set of 15 string matching rules and a disambiguation module to achieve the normalization. Experimental results on the benchmark BioCreative II training and test datasets show that our integrated approach achieves fairly good performance and outperforms more sophisticated semantic similarity and disambiguation systems presented in the BioCreative II GN task. As a freely available web tool, ProNormz is useful to developers as an extensible gene normalization implementation, to researchers as a standard for comparing their innovative techniques, and to biologists for normalization and categorization of HP and HPK mentions in biomedical literature. URL: http://www.biominingbu.org/pronormz. Copyright © 2013 Elsevier Inc. All rights reserved.
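
    The dictionary-plus-rules idea is easy to sketch. The toy synonym table and the single folding rule below are illustrative assumptions, not ProNormz's actual dictionary or its 15 rules:

```python
import re

SYNONYMS = {  # toy entries, not the real ProNormz dictionary
    "map kinase 1": ("MAPK1", "kinase"),
    "erk2": ("MAPK1", "kinase"),
    "tumor protein p53": ("TP53", "protein"),
}

def fold(mention: str) -> str:
    """One illustrative string-matching rule: lowercase, strip
    punctuation and collapse whitespace before dictionary lookup."""
    return re.sub(r"\s+", " ", re.sub(r"[-_/,.]", " ", mention.lower())).strip()

def normalize(mention: str):
    """Return (identifier, class) for a mention, or None if unknown."""
    return SYNONYMS.get(fold(mention))

print(normalize("MAP-Kinase 1"))   # ('MAPK1', 'kinase')
print(normalize("ERK2"))           # ('MAPK1', 'kinase')
```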

  8. 76 FR 36864 - Special Conditions: Gulfstream Model GVI Airplane; Operation Without Normal Electric Power

    Science.gov (United States)

    2011-06-23

    ... Normal Electric Power AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Final special... Interface Branch, ANM-111, Transport Standards Staff, Transport Airplane Directorate, Aircraft Certification... Model GVI airplane will be an all-new, two- engine jet transport airplane. The maximum takeoff weight...

  9. Theoretical Discussion on Forms of Cultural Capital in Singapore

    Science.gov (United States)

    Tan, Cheng Yong

    2013-01-01

    This article is a theoretical discussion on five forms of cultural resources that constitute cultural capital for children in the meritocratic yet stratified society of Singapore. These five forms of cultural capital are namely "academic" tastes and leisure preferences, use of Standard English, access to and dispositions toward…

  10. Normal zone detectors for a large number of inductively coupled coils

    International Nuclear Information System (INIS)

    Owen, E.W.; Shimer, D.W.

    1983-01-01

    In order to protect a set of inductively coupled superconducting magnets, it is necessary to locate and measure normal zone voltages that are small compared with the mutual and self-induced voltages. The method described in this report uses two sets of voltage measurements to locate and measure one or more normal zones in any number of coupled coils. One set of voltages comprises the outputs of bridges that balance out the self-induced voltages. The other set of voltages can be the voltages across the coils, although alternatives are possible. The two sets of equations form a single combined set of equations. Each normal zone location or combination of normal zones has a set of these combined equations associated with it. It is demonstrated that the normal zone can be located and the correct set chosen, allowing determination of the size of the normal zone. Only a few operations take place in a working detector: multiplication by a constant, addition, and simple decision-making. In many cases the detector for each coil, although weakly linked to the other detectors, can be considered to be independent. An example of the detector design is given for four coils with realistic parameters. The effect of changes in the system parameters on accuracy is discussed.
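
    The "choose the correct equation set" step can be sketched as hypothesis testing with least squares: each candidate normal-zone location implies a linear model for the measured voltages, and the hypothesis with the smallest residual wins. The 4x4 sensitivity matrix and voltages below are fabricated for illustration, not the report's parameters.

```python
import numpy as np

A = np.array([[1.00, 0.20, 0.10, 0.05],   # made-up sensitivity of 4 bridge
              [0.20, 1.00, 0.20, 0.10],   # outputs to a zone voltage in
              [0.10, 0.20, 1.00, 0.20],   # each of 4 coupled coils
              [0.05, 0.10, 0.20, 1.00]])

true_u = np.array([0.0, 0.4, 0.0, 0.0])   # a 0.4 V zone in coil index 1
y = A @ true_u + 1e-3 * np.random.default_rng(0).standard_normal(4)

best = None
for coil in range(4):                     # single-zone hypotheses
    a = A[:, [coil]]
    u, res, *_ = np.linalg.lstsq(a, y, rcond=None)
    if best is None or res[0] < best[2]:
        best = (coil, u[0], res[0])

print(f"zone located in coil index {best[0]}, size {best[1]:.3f} V")
```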

  11. Evaluation of the Normal Fetal Kidney Length and Its Correlation with Gestational Age

    Directory of Open Access Journals (Sweden)

    Farrokh Seilanian Toosi

    2013-05-01

    A true estimation of gestational age (GA) plays an important role in quality maternity care and in scheduling the labor date. This study aimed to evaluate the normal fetal kidney length (KL) and its correlation with GA. A cross-sectional study of 92 pregnant women between the 8th and 10th week of gestation with normal singleton pregnancies was conducted; all underwent standard ultrasound fetal biometry and kidney length measurement. Univariate and multivariate linear regression analyses were used to create a predictive equation estimating GA from KL and fetobiometric parameters. A significant correlation was found between GA and KL (r = 0.83, P < 0.002). The best GA predictor was obtained by combining head circumference, fetal biparietal diameter, femur length and KL, with a standard error (SE) of about 14.2 days. Our findings showed that combining KL measurements with other fetal biometric parameters could predict the age of pregnancy with better precision.
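
    The univariate step amounts to an ordinary least-squares line GA = a + b·KL; a minimal sketch with fabricated data points, not the study's measurements:

```python
import numpy as np

kl = np.array([18.0, 21.5, 25.0, 28.2, 31.0, 34.5])   # kidney length, mm (made up)
ga = np.array([154, 168, 189, 203, 217, 238])          # gestational age, days (made up)

slope, intercept = np.polyfit(kl, ga, 1)               # OLS fit, degree 1
pred = intercept + slope * kl
se = np.sqrt(np.sum((ga - pred) ** 2) / (len(ga) - 2))  # residual standard error
print(f"GA ~ {intercept:.1f} + {slope:.2f} * KL, SE = {se:.1f} days")
```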

  12. Superwideband Bandwidth Extension Using Normalized MDCT Coefficients for Scalable Speech and Audio Coding

    Directory of Open Access Journals (Sweden)

    Young Han Lee

    2013-01-01

    A bandwidth extension (BWE) algorithm from wideband to superwideband (SWB) is proposed for a scalable speech/audio codec that uses modified discrete cosine transform (MDCT) coefficients as spectral parameters. In the proposed BWE algorithm the superwideband is first split into several subbands that are represented as gain parameters and normalized MDCT coefficients. We then estimate the normalized MDCT coefficients of the wideband to be fetched for the superwideband and quantize the fetch indices. After that, we quantize the gain parameters by using relative ratios between adjacent subbands. The proposed BWE algorithm is embedded into a standard superwideband codec, the SWB extension of G.729.1 Annex E, and its bitrate and quality are compared with those of the BWE algorithm already employed in the standard superwideband codec. The comparison shows that the proposed BWE algorithm reduces the bitrate by around 19% with better quality, compared to the BWE algorithm in the SWB extension of G.729.1 Annex E.
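
    The gain/normalized-coefficient split is the classic gain-shape decomposition. A minimal sketch (not the G.729.1 code) in which each subband's gain is the RMS of its MDCT coefficients, the shape is the unit-RMS residual, and adjacent gains are expressed as relative ratios; the band count and sizes are assumptions:

```python
import numpy as np

def split_subband(coeffs: np.ndarray):
    """Gain-shape split: RMS gain plus unit-RMS normalized coefficients."""
    gain = np.sqrt(np.mean(coeffs ** 2)) + 1e-12
    return gain, coeffs / gain

# Four toy subbands of 40 MDCT coefficients each (illustrative)
bands = [np.random.default_rng(i).standard_normal(40) for i in range(4)]
gains, shapes = zip(*(split_subband(b) for b in bands))

# Relative gain ratios between adjacent subbands (the quantity quantized)
ratios = [gains[i + 1] / gains[i] for i in range(len(gains) - 1)]
print([f"{r:.3f}" for r in ratios])
```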

  13. Non normal and non quadratic anisotropic plasticity coupled with ductile damage in sheet metal forming: Application to the hydro bulging test

    International Nuclear Information System (INIS)

    Badreddine, Houssem; Saanouni, Khemaies; Dogui, Abdelwaheb

    2007-01-01

    In this work an improved material model is proposed that shows good agreement with experimental data for both hardening curves and plastic strain ratios in uniaxial and equibiaxial proportional loading paths for sheet steel up to final fracture. This model is based on a non-associative, non-normal flow rule using two different orthotropic equivalent stresses in the yield criterion and the plastic potential functions. For the plastic potential the classical Hill 1948 quadratic equivalent stress is considered, while for the yield criterion the Karafillis and Boyce 1993 non-quadratic equivalent stress is used, taking into account non-linear mixed (kinematic and isotropic) hardening. Applications are made to hydro bulging tests using both circular and elliptical dies. The results obtained with different particular cases of the model, such as the normal quadratic and the non-normal non-quadratic cases, are compared and discussed with respect to the experimental results.
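
    For reference, the Hill 1948 quadratic equivalent stress used here as the plastic potential takes, in plane stress, the standard form σ̄² = (G+H)σ_xx² − 2Hσ_xxσ_yy + (H+F)σ_yy² + 2Nσ_xy². A minimal sketch with illustrative isotropic (von Mises) coefficients, not the paper's fitted anisotropy values:

```python
import numpy as np

def hill48_eq(sxx: float, syy: float, sxy: float,
              F=0.5, G=0.5, H=0.5, N=1.5) -> float:
    """Hill 1948 quadratic equivalent stress, plane stress form.
    F=G=H=0.5, N=1.5 reduce it to the isotropic von Mises stress."""
    s2 = (G + H) * sxx**2 - 2 * H * sxx * syy + (H + F) * syy**2 \
         + 2 * N * sxy**2
    return np.sqrt(s2)

print(hill48_eq(300.0, 0.0, 0.0))     # uniaxial: 300, since G + H = 1
print(hill48_eq(300.0, 300.0, 0.0))   # equibiaxial: 300 for von Mises
```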

  14. Calculation of the Nucleon Axial Form Factor Using Staggered Lattice QCD

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Aaron S. [Fermilab; Hill, Richard J. [Perimeter Inst. Theor. Phys.; Kronfeld, Andreas S. [Fermilab; Li, Ruizi [Indiana U.; Simone, James N. [Fermilab

    2016-10-14

    The nucleon axial form factor is a dominant contribution to errors in neutrino oscillation studies. Lattice QCD calculations can help control theory errors by providing first-principles information on nucleon form factors. In these proceedings, we present preliminary results on a blinded calculation of $g_A$ and the axial form factor using HISQ staggered baryons with 2+1+1 flavors of sea quarks. Calculations are done using physical light quark masses and are absolutely normalized. We discuss fitting form factor data with the model-independent $z$ expansion parametrization.
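
    The z expansion referred to here is the standard conformal-map parametrization: Q² is mapped onto a small variable z and the form factor is written as a truncated power series. A minimal sketch; the cut at 9m_π², the choice t₀ = 0, the sign convention and the coefficients are all illustrative assumptions, not the paper's (blinded) results:

```python
import numpy as np

TCUT = 9 * 0.1396**2   # three-pion cut (GeV^2), m_pi ~ 0.1396 GeV
T0 = 0.0               # expansion point; t0 = 0 chosen here for simplicity

def zmap(q2: float, tcut: float = TCUT, t0: float = T0) -> float:
    """Conformal map of Q^2 onto |z| < 1."""
    a, b = np.sqrt(tcut + q2), np.sqrt(tcut - t0)
    return (a - b) / (a + b)

def fa(q2: float, coeffs=(-1.27, 2.0, -1.0)) -> float:
    """Truncated series F_A(Q^2) = sum_k a_k z(Q^2)**k. With t0 = 0 we get
    z(0) = 0, so a_0 = F_A(0) = -g_A in this (assumed) sign convention."""
    z = zmap(q2)
    return sum(a * z**k for k, a in enumerate(coeffs))

print(fa(0.0), fa(1.0))   # F_A at Q^2 = 0 and at 1 GeV^2
```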

  15. Mitogen-stimulated phospholipid synthesis in normal and immune-deficient human B cells

    International Nuclear Information System (INIS)

    Chien, M.M.; Yokoyama, W.M.; Ashman, R.F.

    1986-01-01

    Eight patients with common variable panhypogammaglobulinemia were shown in the in vitro Ig biosynthesis assay to have defective B cell responses to pokeweed mitogen (PWM). Phospholipid synthesis was assessed in the B cell plus monocyte fraction (MB) and irradiated T cells (T*) of patients and paired normal controls. Cell populations were studied separately and in the four possible combinations (1:1), with and without PWM, to reveal the effect of cell interactions. At 16 to 20 hr the mean stimulation index (SI) ± standard error for MB cells alone was 1.01 ± 0.02 for eight patients and 0.99 ± 0.02 for the paired normals; the T* cell SI was 1.25 ± 0.04 for patients and 1.28 ± 0.05 for normals. Combinations of normal MB cells with normal T* cells showed a significantly higher SI when compared with combinations of normal MB cells with patient T* cells (p < 0.005). However, the combination of patient MB cells with patient T* cells and the combination of patient MB cells with normal T* cells were not significantly different in SI (0.05 < p < 0.1). Isolation of patient and normal B cells, T* cells, and monocytes after the choline pulse showed that patient B cells gave a higher SI with normal T* help than with patient T* help. Of greatest interest is the finding that patient B cells that were defective in PWM-stimulated Ig production nevertheless showed a phospholipid synthesis response to PWM in the normal range, suggesting that the maturation defect in these B cells occurs later than the phospholipid synthesis acceleration step, or on a different pathway.

  16. Cross-cultural adaptation of the US consumer form of the short Primary Care Assessment Tool (PCAT): the Korean consumer form of the short PCAT (KC PCAT) and the Korean standard form of the short PCAT (KS PCAT).

    Science.gov (United States)

    Jeon, Ki-Yeob

    2011-01-01

    It is well known that countries with well-structured primary care have better health outcomes, better health equity and reduced healthcare costs. This study aimed to culturally modify and validate the US consumer form of the short Primary Care Assessment Tool (PCAT) in primary care in the Republic of Korea (hereafter referred to as Korea). The Korean consumer form of the short PCAT (KC PCAT) was cross-culturally modified from the original version using a standardised transcultural adaptation method. A pre-test version of the KC PCAT was formulated by replacing four items and modifying a further four of the 37 items of the original consumer form of the short PCAT at face-value evaluation meetings. Pilot testing was done with a convenience sample of 15 responders at two different sites. Test-retest showed high reliability. To validate the KC PCAT, 606 clients participated in a survey carried out in Korea between February and May 2006. Internal consistency reliability, test-retest reliability and factor analysis were conducted in order to test validity. Psychometric testing was carried out on the 37 items of the KC PCAT to produce the KS PCAT, which has 30 items covering seven principal domains: first contact utilisation, first contact accessibility, ongoing accountable care (ongoing care and coordinated rapport care), integrated care (patient-centred care with integration between primary and specialty care or between different specialties), comprehensive care, community-oriented care and culturally-oriented care. Component factors of the verified KS PCAT explained 58.28% of the total variance in the total item scores of primary care. The verified KS PCAT is characterized by the seven classic domains of primary care with minor modifications. This may provide clues concerning differences in expectations for primary care between the Korean and US populations. The KS PCAT is a reliable and valid tool for the evaluation of the quality of

  17. A novel generalized normal distribution for human longevity and other negatively skewed data.

    Science.gov (United States)

    Robertson, Henry T; Allison, David B

    2012-01-01

    Negatively skewed data arise occasionally in statistical practice; perhaps the most familiar example is the distribution of human longevity. Although other generalizations of the normal distribution exist, we demonstrate a new alternative that apparently fits human longevity data better. We propose an alternative approach of a normal distribution whose scale parameter is conditioned on attained age. This approach is consistent with previous findings that longevity conditioned on survival to the modal age behaves like a normal distribution. We derive such a distribution and demonstrate its accuracy in modeling human longevity data from life tables. The new distribution is characterized by (1) an intuitively straightforward genesis; (2) closed forms for the pdf, cdf, mode, quantile, and hazard functions; and (3) accessibility to non-statisticians, based on its close relationship to the normal distribution.

  18. Evaluation of the Weibull and log normal distribution functions as survival models of Escherichia coli under isothermal and non isothermal conditions.

    Science.gov (United States)

    Aragao, Glaucia M F; Corradini, Maria G; Normand, Mark D; Peleg, Micha

    2007-11-01

    Published survival curves of Escherichia coli in two growth media, with and without the presence of salt, at various temperatures and in a Greek eggplant salad having various levels of essential oil, all had a characteristic downward concavity when plotted on semi-logarithmic coordinates. Some also exhibited what appeared to be a 'shoulder' of considerable length. Regardless of whether a shoulder was noticed, the survival pattern could be considered a manifestation of an underlying unimodal distribution of the cells' death times. Mathematically, the data could be described equally well by the Weibull and log-normal distribution functions, which had similar modes, means, standard deviations and coefficients of skewness. When plotted in their probability density function (PDF) form, the curves also appeared very similar visually. This enabled us to quantify and compare the effect of temperature or essential oil concentration on the organism's survival in terms of these temporal distributions' characteristics. Increased lethality was generally expressed in a shorter mean and mode, a smaller standard deviation and increased overall symmetry as judged by the distributions' degree of skewness. The 'shoulder', as expected, simply indicated that the distribution's standard deviation was much smaller than its mode. Rate models based on the two distribution functions could be used to predict non-isothermal survival patterns. They were derived on the assumption that the momentary inactivation rate is the isothermal rate, at the momentary temperature, at the time that corresponds to the momentary survival ratio. In this application, however, the Weibullian model with a fixed power was not only simpler and more convenient mathematically than the one based on the log-normal distribution, but it also provided more accurate estimates of the dynamic inactivation patterns.
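
    A minimal sketch of fitting the fixed-power Weibullian survival model, log10 S(t) = -b·t^n, to a downward-concave semi-logarithmic survival curve; the data points are fabricated, not the published curves:

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0, 2, 4, 6, 8, 10, 12])                        # time, min (made up)
log_s = np.array([0, -0.1, -0.35, -0.8, -1.4, -2.2, -3.2])   # log10 survival ratio

def weibullian(t, b, n):
    """Weibullian survival model in semi-logarithmic form."""
    return -b * t**n

(b, n), _ = curve_fit(weibullian, t, log_s, p0=(0.05, 1.5))
print(f"b = {b:.3f}, n = {n:.2f}  (n > 1 implies downward concavity)")
```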

  19. Controlling the magic and normal sizes of white CdSe quantum dots

    Science.gov (United States)

    Su, Yu-Sheng; Chung, Shu-Ru

    2017-08-01

    In this study, we have demonstrated a facile chemical route to prepare CdSe QDs with white light emission, and the performance of a white CdSe-based white light emitting diode (WLED) is also explored. Oleic acid (OA) is first used to form a Cd-OA complex, and hexadecylamine (HDA) and 1-octadecene (ODE) are used as surfactants. By varying the reaction time from 1 s to 60 min, CdSe QDs with white light emission can be obtained. The results show that when the reaction time is less than 10 min, the luminescence spectra comprise two distinct emission peaks and cover the entire visible range from 400 to 700 nm. The wide emission range combines two particle sizes of CdSe, magic and normal: the magic-size CdSe has both band-edge and surface-state emission, while the normal size possesses only band-edge emission. TEM characterization shows that the two sizes, with diameters of 1.5 nm (magic) and 2.7 nm (normal), are obtained when the reaction time is 4 min. We find that only the magic size of CdSe is produced when the reaction time is less than 3 min. In the range from 3 to 10 min, both sizes of CdSe QDs are formed, with quantum yields (QYs) from 20 to 60%. When the reaction time is prolonged to 60 min, only normal-size CdSe QDs are observed owing to Ostwald ripening, and the QY is 8%. Based on these results we conclude that the two emission peaks arise from the coexistence of magic- and normal-size CdSe, which together form the white-light QDs, and that the QY and emission wavelength of the CdSe QDs increase with reaction time. Samples reacted for 2 min (QY 30%), 4 min (QY 32%) and 60 min (QY 8%) were chosen to mix with a transparent acrylic-based UV-curable resin for WLED fabrication. The Commission Internationale de l'Eclairage (CIE) chromaticity, color rendering index (CRI), and luminous efficacy for the magic, mixed, and normal size CdSe are (0.49, 0.44), 81, 1.5 lm/W; (0.35, 0.30), 86, 1.9 lm/W; and (0.39, 0.25), 40, 0.3 lm/W, respectively.

  20. Metabolomic analysis of urine samples by UHPLC-QTOF-MS: Impact of normalization strategies.

    Science.gov (United States)

    Gagnebin, Yoric; Tonoli, David; Lescuyer, Pierre; Ponte, Belen; de Seigneux, Sophie; Martin, Pierre-Yves; Schappler, Julie; Boccard, Julien; Rudaz, Serge

    2017-02-22

    Among the various biological matrices used in metabolomics, urine is a biofluid of major interest because of its non-invasive collection and its availability in large quantities. However, significant sources of variability in urine metabolomics based on UHPLC-MS are related to the analytical drift and variation of the sample concentration, thus requiring normalization. A sequential normalization strategy was developed to remove these detrimental effects, including: (i) pre-acquisition sample normalization by individual dilution factors to narrow the concentration range and to standardize the analytical conditions, (ii) post-acquisition data normalization by quality control-based robust LOESS signal correction (QC-RLSC) to correct for potential analytical drift, and (iii) post-acquisition data normalization by MS total useful signal (MSTUS) or probabilistic quotient normalization (PQN) to prevent the impact of concentration variability. This generic strategy was performed with urine samples from healthy individuals and was further implemented in the context of a clinical study to detect alterations in urine metabolomic profiles due to kidney failure. In the case of kidney failure, the relation between creatinine/osmolality and the sample concentration is modified, and relying only on these measurements for normalization could be highly detrimental. The sequential normalization strategy was demonstrated to significantly improve patient stratification by decreasing the unwanted variability and thus enhancing data quality. Copyright © 2016 Elsevier B.V. All rights reserved.
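
    Of the post-acquisition steps, probabilistic quotient normalization (PQN) is simple to sketch: each sample is divided by the median of its feature-wise quotients against a reference spectrum. A minimal illustration with simulated dilution factors; using the median spectrum as the reference is a common convention, assumed here:

```python
import numpy as np

def pqn(X: np.ndarray) -> np.ndarray:
    """Probabilistic quotient normalization of a samples x features matrix:
    divide each sample by the median of its quotients to the reference."""
    reference = np.median(X, axis=0)            # median spectrum as reference
    quotients = X / (reference + 1e-12)         # feature-wise ratios
    dilution = np.median(quotients, axis=1)     # one factor per sample
    return X / dilution[:, None]

# Simulated data: same underlying profile scaled by per-sample dilutions
X = np.abs(np.random.default_rng(0).standard_normal((5, 100))) \
    * np.array([[0.5], [1.0], [1.5], [2.0], [3.0]])

print(np.median(X / pqn(X), axis=1))   # estimated per-sample dilution factors
```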