WorldWideScience

Sample records for normal forms standard

  1. Closed-form confidence intervals for functions of the normal mean and standard deviation.

    Science.gov (United States)

    Donner, Allan; Zou, G Y

    2012-08-01

    Confidence interval methods for a normal mean and standard deviation are well known and simple to apply. However, the same cannot be said for important functions of these parameters. These functions include the normal distribution percentiles, the Bland-Altman limits of agreement, the coefficient of variation and Cohen's effect size. We present a simple approach to this problem by using variance estimates recovered from confidence limits computed for the mean and standard deviation separately. All resulting confidence intervals have closed forms. Simulation results demonstrate that this approach performs very well for limits of agreement, coefficients of variation and their differences.
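    The recovery step described in the abstract (known in the literature as MOVER, the method of variance estimates recovery) can be sketched as follows. The function name and the numerical limits below are illustrative assumptions, not the authors' code; the separate limits for the mean and standard deviation would come from the usual t- and chi-square-based intervals.

```python
import math

def mover_limits(mu_hat, mu_l, mu_u, sd_hat, sd_l, sd_u, c=1.96):
    """Closed-form CI for theta = mu + c*sigma, recovered from separately
    computed confidence limits for the mean (mu_l, mu_u) and the standard
    deviation (sd_l, sd_u). Illustrative sketch of the MOVER combination."""
    theta = mu_hat + c * sd_hat
    # Variance estimates are recovered from the distance between each point
    # estimate and its own confidence limit, then combined.
    lower = theta - math.sqrt((mu_hat - mu_l) ** 2 + c ** 2 * (sd_hat - sd_l) ** 2)
    upper = theta + math.sqrt((mu_u - mu_hat) ** 2 + c ** 2 * (sd_u - sd_hat) ** 2)
    return lower, upper

# Invented example limits: mean CI (9.2, 10.8) around 10, SD CI (1.6, 2.7) around 2.
lo, hi = mover_limits(10.0, 9.2, 10.8, 2.0, 1.6, 2.7)
```

    With c = 1.96 this gives a closed-form interval for the upper Bland-Altman limit of agreement; no iteration or simulation is needed.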

  2. Normal forms in Poisson geometry

    NARCIS (Netherlands)

    Marcut, I.T.

    2013-01-01

    The structure of Poisson manifolds is highly nontrivial even locally. The first important result in this direction is Conn's linearization theorem around fixed points. One of the main results of this thesis (Theorem 2) is a normal form theorem in Poisson geometry, which is the Poisson-geometric

  3. TRASYS form factor matrix normalization

    Science.gov (United States)

    Tsuyuki, Glenn T.

    1992-01-01

    A method has been developed for adjusting a TRASYS enclosure form factor matrix to unity. This approach is not limited to closed geometries, and in fact, it is primarily intended for use with open geometries. The purpose of this approach is to prevent optimistic form factors to space. In this method, nodal form factor sums are calculated within 0.05 of unity using TRASYS, although deviations as large as 0.10 may be acceptable, and then, a process is employed to distribute the difference amongst the nodes. A specific example has been analyzed with this method, and a comparison was performed with a standard approach for calculating radiation conductors. In this comparison, hot and cold case temperatures were determined. Exterior nodes exhibited temperature differences as large as 7 C and 3 C for the hot and cold cases, respectively when compared with the standard approach, while interior nodes demonstrated temperature differences from 0 C to 5 C. These results indicate that temperature predictions can be artificially biased if the form factor computation error is lumped into the individual form factors to space.
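    A simple way to distribute a small closure error across a row of form factors is proportional rescaling. This sketch is an illustrative assumption, not the TRASYS algorithm from the report; it only mirrors the 0.05 sanity criterion mentioned above before adjusting each row sum to unity.

```python
def normalize_rows(ff, tol=0.05):
    """Scale each row of a form-factor matrix so it sums to 1, provided the
    raw sum is already within `tol` of unity (a sanity check mirroring the
    0.05 criterion above). Illustrative sketch, not the TRASYS method."""
    out = []
    for row in ff:
        s = sum(row)
        if abs(s - 1.0) > tol:
            raise ValueError(f"row sum {s:.3f} deviates by more than {tol}")
        # Distribute the deviation proportionally over the row's entries.
        out.append([f / s for f in row])
    return out

norm = normalize_rows([[0.50, 0.48], [0.30, 0.30, 0.37]])
```

    The point of a separate distribution step is precisely to avoid lumping the whole residual into the node's form factor to space, which would bias temperature predictions.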

  4. Normal form theory and spectral sequences

    OpenAIRE

    Sanders, Jan A.

    2003-01-01

    The concept of unique normal form is formulated in terms of a spectral sequence. As an illustration of this technique some results of Baider and Churchill concerning the normal form of the anharmonic oscillator are reproduced. The aim of this paper is to show that spectral sequences give us a natural framework in which to formulate normal form theory. © 2003 Elsevier Science (USA). All rights reserved.

  5. A Recursive Approach to Compute Normal Forms

    Science.gov (United States)

    HSU, L.; MIN, L. J.; FAVRETTO, L.

    2001-06-01

    Normal forms are instrumental in the analysis of dynamical systems described by ordinary differential equations, particularly when singularities close to a bifurcation are to be characterized. However, the computation of a normal form up to an arbitrary order is numerically hard. This paper focuses on the computer programming of some recursive formulas developed earlier to compute higher order normal forms. A computer program to reduce the system to its normal form on a center manifold is developed using the Maple symbolic language. However, it should be stressed that the program relies essentially on recursive numerical computations, while symbolic calculations are used only for minor tasks. Some strategies are proposed to save computation time. Examples are presented to illustrate the application of the program to obtain high order normalization or to handle systems with large dimension.

  6. Nonlinear dynamics exploration through normal forms

    CERN Document Server

    Kahn, Peter B

    2014-01-01

    Geared toward advanced undergraduates and graduate students, this exposition covers the method of normal forms and its application to ordinary differential equations through perturbation analysis. In addition to its emphasis on the freedom inherent in the normal form expansion, the text features numerous examples of equations, the kind of which are encountered in many areas of science and engineering. The treatment begins with an introduction to the basic concepts underlying the normal forms. Coverage then shifts to an investigation of systems with one degree of freedom that model oscillations

  7. Normal forms of Hopf-zero singularity

    International Nuclear Information System (INIS)

    Gazor, Majid; Mokhtari, Fahimeh

    2015-01-01

    The Lie algebra generated by Hopf-zero classical normal forms is decomposed into two versal Lie subalgebras. Some dynamical properties for each subalgebra are described; one is the set of all volume-preserving conservative systems while the other is the maximal Lie algebra of nonconservative systems. This introduces a unique conservative–nonconservative decomposition for the normal form systems. There exists a Lie-subalgebra that is Lie-isomorphic to a large family of vector fields with Bogdanov–Takens singularity. This gives rise to the conclusion that the local dynamics of formal Hopf-zero singularities is well-understood by the study of Bogdanov–Takens singularities. Despite this, the normal form computations of Bogdanov–Takens and Hopf-zero singularities are independent. Thus, by assuming a quadratic nonzero condition, complete results on the simplest Hopf-zero normal forms are obtained in terms of the conservative–nonconservative decomposition. Some practical formulas are derived and the results implemented using Maple. The method has been applied to the Rössler and Kuramoto–Sivashinsky equations to demonstrate the applicability of our results. (paper)

  8. Normal forms of Hopf-zero singularity

    Science.gov (United States)

    Gazor, Majid; Mokhtari, Fahimeh

    2015-01-01

    The Lie algebra generated by Hopf-zero classical normal forms is decomposed into two versal Lie subalgebras. Some dynamical properties for each subalgebra are described; one is the set of all volume-preserving conservative systems while the other is the maximal Lie algebra of nonconservative systems. This introduces a unique conservative-nonconservative decomposition for the normal form systems. There exists a Lie-subalgebra that is Lie-isomorphic to a large family of vector fields with Bogdanov-Takens singularity. This gives rise to the conclusion that the local dynamics of formal Hopf-zero singularities is well-understood by the study of Bogdanov-Takens singularities. Despite this, the normal form computations of Bogdanov-Takens and Hopf-zero singularities are independent. Thus, by assuming a quadratic nonzero condition, complete results on the simplest Hopf-zero normal forms are obtained in terms of the conservative-nonconservative decomposition. Some practical formulas are derived and the results implemented using Maple. The method has been applied to the Rössler and Kuramoto-Sivashinsky equations to demonstrate the applicability of our results.

  9. Normal equivariant forms of vector fields

    International Nuclear Information System (INIS)

    Sanchez Bringas, F.

    1992-07-01

    We prove a linearization theorem of Siegel type and a normal form theorem of Poincare-Dulac type for germs of holomorphic vector fields at the origin of C^2 that are Γ-equivariant, where Γ is a finite subgroup of GL(2,C). (author). 5 refs

  10. Normal form for mirror machine Hamiltonians

    International Nuclear Information System (INIS)

    Dragt, A.J.; Finn, J.M.

    1979-01-01

    A systematic algorithm is developed for performing canonical transformations on Hamiltonians which govern particle motion in magnetic mirror machines. These transformations are performed in such a way that the new Hamiltonian has a particularly simple normal form. From this form it is possible to compute analytic expressions for gyro and bounce frequencies. In addition, it is possible to obtain arbitrarily high order terms in the adiabatic magnetic moment expansion. The algorithm makes use of Lie series, is an extension of Birkhoff's normal form method, and has been explicitly implemented by a digital computer programmed to perform the required algebraic manipulations. Application is made to particle motion in a magnetic dipole field and to a simple mirror system. Bounce frequencies and locations of periodic orbits are obtained and compared with numerical computations. Both mirror systems are shown to be insoluble, i.e., trajectories are not confined to analytic hypersurfaces, there is no analytic third integral of motion, and the adiabatic magnetic moment expansion is divergent. It is expected also that the normal form procedure will prove useful in the study of island structure and separatrices associated with periodic orbits, and should facilitate studies of breakdown of adiabaticity and the onset of ''stochastic'' behavior

  11. Standardized waste form test methods

    International Nuclear Information System (INIS)

    Slate, S.C.

    1984-11-01

    The Materials Characterization Center (MCC) is developing standard tests to characterize nuclear waste forms. Development of the first thirteen tests was originally initiated to provide data to compare different high-level waste (HLW) forms and to characterize their basic performance. The current status of the first thirteen MCC tests and some sample test results are presented: the radiation stability tests (MCC-6 and 12) and the tensile-strength test (MCC-11) are approved; the static leach tests (MCC-1, 2, and 3) are being reviewed for full approval; the thermal stability (MCC-7) and microstructure evaluation (MCC-13) methods are being considered for the first time; and the flowing leach test methods (MCC-4 and 5), the gas generation methods (MCC-8 and 9), and the brittle fracture method (MCC-10) are indefinitely delayed. Sample static leach test data on the ARM-1 approved reference material are presented. Established tests and proposed new tests will be used to meet new testing needs. For waste form production, tests on stability and composition measurement are needed to provide data to ensure waste form quality. In transportation, data are needed to evaluate the effects of accidents on canisterized waste forms. The new MCC-15 accident test method and some data are presented. Compliance testing needs required by the recent draft repository waste acceptance specifications are described. These specifications will control waste form contents, processing, and performance. 2 references, 2 figures

  12. Standardized waste form test methods

    International Nuclear Information System (INIS)

    Slate, S.C.

    1984-01-01

    The Materials Characterization Center (MCC) is developing standard tests to characterize nuclear waste forms. Development of the first thirteen tests was originally initiated to provide data to compare different high-level waste (HLW) forms and to characterize their basic performance. The current status of the first thirteen MCC tests and some sample test results are presented: the radiation stability tests (MCC-6 and 12) and the tensile-strength test (MCC-11) are approved; the static leach tests (MCC-1, 2, and 3) are being reviewed for full approval; the thermal stability (MCC-7) and microstructure evaluation (MCC-13) methods are being considered for the first time; and the flowing leach test methods (MCC-4 and 5), the gas generation methods (MCC-8 and 9), and the brittle fracture method (MCC-10) are indefinitely delayed. Sample static leach test data on the ARM-1 approved reference material are presented. Established tests and proposed new tests will be used to meet new testing needs. For waste form production, tests on stability and composition measurement are needed to provide data to ensure waste form quality. In transportation, data are needed to evaluate the effects of accidents on canisterized waste forms. The new MCC-15 accident test method and some data are presented. Compliance testing needs required by the recent draft repository waste acceptance specifications are described. These specifications will control waste form contents, processing, and performance

  13. Automatic identification and normalization of dosage forms in drug monographs

    Science.gov (United States)

    2012-01-01

    Background Each day, millions of health consumers seek drug-related information on the Web. Despite some efforts in linking related resources, drug information is largely scattered across a wide variety of websites of differing quality and credibility. Methods As a step toward providing users with integrated access to multiple trustworthy drug resources, we aim to develop a method capable of identifying a drug's dosage form information in addition to drug name recognition. We developed rules and patterns for identifying dosage forms from different sections of full-text drug monographs, and subsequently normalized them to standardized RxNorm dosage forms. Results Our method represents a significant improvement compared with a baseline lookup approach, achieving an overall macro-averaged Precision of 80%, Recall of 98%, and F-Measure of 85%. Conclusions We successfully developed an automatic approach for drug dosage form identification, which is critical for building links between different drug-related resources. PMID:22336431
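    The macro-averaged figures quoted above are computed per dosage-form class and then averaged over the classes. A minimal sketch (the per-class counts are invented for illustration, not taken from the paper):

```python
def prf(tp, fp, fn):
    """Precision, recall and F-measure from one class's counts."""
    p = tp / (tp + fp) if tp + fp else 0.0
    r = tp / (tp + fn) if tp + fn else 0.0
    f = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f

def macro_average(per_class):
    """per_class: list of (tp, fp, fn) tuples, one per dosage-form class.
    Macro-averaging scores each class first, then takes the plain mean."""
    scores = [prf(*c) for c in per_class]
    n = len(scores)
    return tuple(sum(s[i] for s in scores) / n for i in range(3))

# Invented counts for three hypothetical classes (tablet, capsule, solution).
macro_p, macro_r, macro_f = macro_average([(9, 1, 0), (8, 2, 1), (7, 3, 0)])
```

    Note that the macro F-measure is the mean of the per-class F-measures, not the F-measure of the mean precision and recall, which is why the quoted 85% need not equal 2PR/(P+R) of the quoted 80% and 98%.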

  14. A standard form for generalized CP transformations

    International Nuclear Information System (INIS)

    Ecker, G.; Grimus, W.; Neufeld, H.

    1987-01-01

    The investigation of general CP transformations leads to transformations of the form U → W T UW with unitary matrices U, W. It is shown that a basis for weak eigenstates can always be chosen such that W T UW has a certain real standard form. (Author)

  15. AFP Algorithm and a Canonical Normal Form for Horn Formulas

    OpenAIRE

    Majdoddin, Ruhollah

    2014-01-01

    AFP Algorithm is a learning algorithm for Horn formulas. We show that the complexity of the AFP Algorithm is not improved if, after each negative counterexample, more than just one refinement is performed. Moreover, a canonical normal form for Horn formulas is presented, and it is proved that the output formula of the AFP Algorithm is in this normal form.

  16. An Algorithm for Higher Order Hopf Normal Forms

    Directory of Open Access Journals (Sweden)

    A.Y.T. Leung

    1995-01-01

    Normal form theory is important for studying the qualitative behavior of nonlinear oscillators. In some cases, higher order normal forms are required to understand the dynamic behavior near an equilibrium or a periodic orbit. However, the computation of high-order normal forms is usually quite complicated. This article provides an explicit formula for the normalization of nonlinear differential equations. The higher order normal form is given explicitly. Illustrative examples include a cubic system, a quadratic system and a Duffing–Van der Pol system. We use exact arithmetic and find that the undamped Duffing equation can be represented by an exact polynomial differential amplitude equation in a finite number of terms.

  17. Normal form and synchronization of strict-feedback chaotic systems

    International Nuclear Information System (INIS)

    Wang, Feng; Chen, Shihua; Yu Minghai; Wang Changping

    2004-01-01

    This study concerns the normal form and synchronization of strict-feedback chaotic systems. We prove that any strict-feedback chaotic system can be rendered into a normal form with an invertible transform, and then a design procedure to synchronize the normal form of a non-autonomous strict-feedback chaotic system is presented. This approach needs only a scalar driving signal to realize synchronization, no matter how many dimensions the chaotic system contains. Furthermore, the Roessler chaotic system is taken as a concrete example to illustrate the procedure of designing without transforming a strict-feedback chaotic system into its normal form. Numerical simulations are also provided to show the effectiveness and feasibility of the developed methods.

  18. Normal form of linear systems depending on parameters

    International Nuclear Information System (INIS)

    Nguyen Huynh Phan.

    1995-12-01

    In this paper we resolve completely the problem to find normal forms of linear systems depending on parameters for the feedback action that we have studied for the special case of controllable linear systems. (author). 24 refs

  19. Volume-preserving normal forms of Hopf-zero singularity

    International Nuclear Information System (INIS)

    Gazor, Majid; Mokhtari, Fahimeh

    2013-01-01

    A practical method is described for computing the unique generator of the algebra of first integrals associated with a large class of Hopf-zero singularity. The set of all volume-preserving classical normal forms of this singularity is introduced via a Lie algebra description. This is a maximal vector space of classical normal forms with first integral; this is whence our approach works. Systems with a nonzero condition on their quadratic parts are considered. The algebra of all first integrals for any such system has a unique (modulo scalar multiplication) generator. The infinite level volume-preserving parametric normal forms of any nondegenerate perturbation within the Lie algebra of any such system are computed, where it can have rich dynamics. The associated unique generator of the algebra of first integrals is derived. The symmetry group of the infinite level normal forms is also discussed. Some necessary formulas are derived and applied to appropriately modified Rössler and generalized Kuramoto–Sivashinsky equations to demonstrate the applicability of our theoretical results. An approach (introduced by Iooss and Lombardi) is applied to find an optimal truncation for the first level normal forms of these examples with exponentially small remainders. The numerically suggested radius of convergence (for the first integral) associated with a hypernormalization step is discussed for the truncated first level normal forms of the examples. This is achieved by an efficient implementation of the results using Maple. (paper)

  20. Volume-preserving normal forms of Hopf-zero singularity

    Science.gov (United States)

    Gazor, Majid; Mokhtari, Fahimeh

    2013-10-01

    A practical method is described for computing the unique generator of the algebra of first integrals associated with a large class of Hopf-zero singularity. The set of all volume-preserving classical normal forms of this singularity is introduced via a Lie algebra description. This is a maximal vector space of classical normal forms with first integral; this is whence our approach works. Systems with a nonzero condition on their quadratic parts are considered. The algebra of all first integrals for any such system has a unique (modulo scalar multiplication) generator. The infinite level volume-preserving parametric normal forms of any nondegenerate perturbation within the Lie algebra of any such system are computed, where it can have rich dynamics. The associated unique generator of the algebra of first integrals is derived. The symmetry group of the infinite level normal forms is also discussed. Some necessary formulas are derived and applied to appropriately modified Rössler and generalized Kuramoto-Sivashinsky equations to demonstrate the applicability of our theoretical results. An approach (introduced by Iooss and Lombardi) is applied to find an optimal truncation for the first level normal forms of these examples with exponentially small remainders. The numerically suggested radius of convergence (for the first integral) associated with a hypernormalization step is discussed for the truncated first level normal forms of the examples. This is achieved by an efficient implementation of the results using Maple.

  1. Standard forms of construction contracts in Romania

    Directory of Open Access Journals (Sweden)

    Cristian Bănică

    2013-12-01

    Construction industry in Romania is under pressure to modernize in order to cope with the new demands of development and convergence with the EU. Contractual procedures in construction have to become an integral part of this process of modernization. The article introduces the advantages of standard forms of contract and professional contract administration in construction and presents the current state of the art in the use of standard construction contracts in Romania. Some practical conclusions and recommendations are presented considering the need for further contract studies.

  2. Utilizing Nested Normal Form to Design Redundancy Free JSON Schemas

    Directory of Open Access Journals (Sweden)

    Wai Yin Mok

    2016-12-01

    JSON (JavaScript Object Notation) is a lightweight data-interchange format for the Internet. JSON is built on two structures: (1) a collection of name/value pairs and (2) an ordered list of values (http://www.json.org/). Because of this simple approach, JSON is easy to use and it has the potential to be the data interchange format of choice for the Internet. Similar to XML, JSON schemas allow nested structures to model hierarchical data. As data interchange over the Internet increases exponentially due to cloud computing or otherwise, redundancy free JSON data are an attractive form of communication because they improve the quality of data communication through eliminating update anomaly. Nested Normal Form, a normal form for hierarchical data, is a precise characterization of redundancy. A nested table, or a hierarchical schema, is in Nested Normal Form if and only if it is free of redundancy caused by multivalued and functional dependencies. Using Nested Normal Form as a guide, this paper introduces a JSON schema design methodology that begins with UML use case diagrams, communication diagrams and class diagrams that model a system under study. Based on the use cases’ execution frequencies and the data passed between involved parties in the communication diagrams, the proposed methodology selects classes from the class diagrams to be the roots of JSON scheme trees and repeatedly adds classes from the class diagram to the scheme trees as long as the schemas satisfy Nested Normal Form. This process continues until all of the classes in the class diagram have been added to some JSON scheme trees.
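    As an illustrative sketch of the redundancy the paper targets (the entity names are invented, not taken from the paper), a nested JSON document is redundancy-free when a fact determined by a parent key is stored once on the parent rather than repeated in every child record:

```python
import json

# Redundant flat form: the customer's city is repeated on every order,
# so updating the city creates an update anomaly.
flat = [
    {"customer": "Ada", "city": "London", "order": 101},
    {"customer": "Ada", "city": "London", "order": 102},
]

# Nested form: the functional dependency customer -> city is stored once,
# with the dependent orders nested beneath it.
nested = {"customer": "Ada", "city": "London", "orders": [101, 102]}

doc = json.dumps(nested)
```

    A schema tree satisfying Nested Normal Form yields documents shaped like `nested`, where each multivalued or functional dependency appears exactly once.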

  3. Normal Forms for Fuzzy Logics: A Proof-Theoretic Approach

    Czech Academy of Sciences Publication Activity Database

    Cintula, Petr; Metcalfe, G.

    2007-01-01

    Roč. 46, č. 5-6 (2007), s. 347-363 ISSN 1432-0665 R&D Projects: GA MŠk(CZ) 1M0545 Institutional research plan: CEZ:AV0Z10300504 Keywords : fuzzy logic * normal form * proof theory * hypersequents Subject RIV: BA - General Mathematics Impact factor: 0.620, year: 2007

  4. A New One-Pass Transformation into Monadic Normal Form

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2003-01-01

    We present a translation from the call-by-value λ-calculus to monadic normal forms that includes short-cut boolean evaluation. The translation is higher-order, operates in one pass, duplicates no code, generates no chains of thunks, and is properly tail recursive. It makes a crucial use of symbolic...
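    The idea of monadic normal form, independent of Danvy's one-pass algorithm, is that every intermediate computation is named by a let binding so that evaluation order becomes explicit. A minimal multi-pass sketch over a toy arithmetic AST (the tuple representation and names are illustrative assumptions):

```python
import itertools

counter = itertools.count()

def to_mnf(expr, bindings):
    """Flatten a nested arithmetic AST (op, left, right) tuples into
    monadic normal form: each intermediate application is let-bound to a
    fresh name appended to `bindings`, and the expression reduces to an atom.
    Unlike the paper's translation, this sketch is not one-pass."""
    if isinstance(expr, (int, str)):
        return expr  # atoms are already trivial
    op, left, right = expr
    l = to_mnf(left, bindings)
    r = to_mnf(right, bindings)
    name = f"t{next(counter)}"
    bindings.append((name, op, l, r))
    return name

bindings = []
result = to_mnf(("+", ("*", "x", 2), ("*", "y", 3)), bindings)
# bindings now lists each operation exactly once, in evaluation order.
```

    The translation duplicates no code and produces no nested computations, which is the property the one-pass, properly tail-recursive version above preserves while also handling short-cut boolean evaluation.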

  5. Fast Bitwise Implementation of the Algebraic Normal Form Transform

    OpenAIRE

    Bakoev, Valentin

    2017-01-01

    The representation of Boolean functions by their algebraic normal forms (ANFs) is very important for cryptography, coding theory and other scientific areas. The ANFs are used in computing the algebraic degree of S-boxes, some other cryptographic criteria and parameters of error-correcting codes. Their applications require these criteria and parameters to be computed by fast algorithms. Hence the corresponding ANFs should also be obtained by fast algorithms. Here we continue o...
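    The ANF transform itself (the binary Moebius transform) has a standard in-place butterfly structure. The sketch below operates on a plain Python list of truth-table bits and illustrates the classical algorithm, not the bitwise-packed implementation the paper develops:

```python
def anf_transform(tt):
    """Binary Moebius transform: truth table -> ANF coefficient vector.
    `tt` has length 2**n; entry i is f evaluated at the bit pattern of i.
    Output entry i is the coefficient of the monomial given by i's bits."""
    a = list(tt)
    n = len(a)
    step = 1
    while step < n:
        for block in range(0, n, 2 * step):
            for j in range(block, block + step):
                a[j + step] ^= a[j]  # XOR the lower half onto the upper half
        step *= 2
    return a

# x0 AND x1 has truth table [0, 0, 0, 1]; its ANF is the monomial x0*x1.
coeffs = anf_transform([0, 0, 0, 1])
```

    The transform is an involution (applying it twice returns the truth table), and the butterfly loop is what a bitwise implementation packs into machine words for speed.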

  6. A New Normal Form for Multidimensional Mode Conversion

    International Nuclear Information System (INIS)

    Tracy, E. R.; Richardson, A. S.; Kaufman, A. N.; Zobin, N.

    2007-01-01

    Linear conversion occurs when two wave types, with distinct polarization and dispersion characteristics, are locally resonant in a nonuniform plasma [1]. In recent work, we have shown how to incorporate a ray-based (WKB) approach to mode conversion in numerical algorithms [2,3]. The method uses the ray geometry in the conversion region to guide the reduction of the full NxN-system of wave equations to a 2x2 coupled pair which can be solved and matched to the incoming and outgoing WKB solutions. The algorithm in [2] assumes the ray geometry is hyperbolic and that, in ray phase space, there is an 'avoided crossing', which is the most common type of conversion. Here, we present a new formulation that can deal with more general types of conversion [4]. This formalism is based upon the fact (first proved in [5]) that it is always possible to put the 2x2 wave equation into a 'normal' form, such that the diagonal elements of the dispersion matrix Poisson-commute with the off-diagonals (at leading order). Therefore, if we use the diagonals (rather than the eigenvalues or the determinant) of the dispersion matrix as ray Hamiltonians, the off-diagonals will be conserved quantities. When cast into normal form, the 2x2 dispersion matrix has a very natural physical interpretation: the diagonals are the uncoupled ray Hamiltonians and the off-diagonals are the coupling. We discuss how to incorporate the normal form into ray tracing algorithms

  7. The COBE normalization for standard cold dark matter

    Science.gov (United States)

    Bunn, Emory F.; Scott, Douglas; White, Martin

    1995-01-01

    The Cosmic Background Explorer Satellite (COBE) detection of microwave anisotropies provides the best way of fixing the amplitude of cosmological fluctuations on the largest scales. This normalization is usually given for an n = 1 spectrum, including only the anisotropy caused by the Sachs-Wolfe effect. This is certainly not a good approximation for a model containing any reasonable amount of baryonic matter. In fact, even tilted Sachs-Wolfe spectra are not a good fit to models like cold dark matter (CDM). Here, we normalize standard CDM (sCDM) to the two-year COBE data and quote the best amplitude in terms of the conventionally used measures of power. We also give normalizations for some specific variants of this standard model, and we indicate how the normalization depends on the assumed values of n, Omega_B and H_0. For sCDM we find the mean value of Q = 19.9 +/- 1.5 micro-K, corresponding to sigma_8 = 1.34 +/- 0.10, with the normalization at large scales being B = (8.16 +/- 1.04) x 10^5 (Mpc/h)^4, and other numbers given in the table. The measured rms temperature fluctuation smoothed on 10 deg is a little low relative to this normalization. This is mainly due to the low quadrupole in the data: when the quadrupole is removed, the measured value of sigma(10 deg) is quite consistent with the best-fitting mean value of Q. The use of the mean value of Q should be preferred over sigma(10 deg), when its value can be determined for a particular theory, since it makes full use of the data.

  8. Frequency of Verbal Forms and Language Standard

    Directory of Open Access Journals (Sweden)

    Timur I. Galeev

    2017-11-01

    The article describes a modern experiment that makes it possible to extract complex information about the cognitive structure of the evolution of the Language Standard (Norm). The study was conducted using the Google Books Corpus, which provides unprecedented opportunities for linguistic studies. The purpose of the experiment was to identify the patterns by which competing forms evolve within the center of the verbal paradigm (3Sg and 3Pl) on the basis of data concerning the frequency of their use. The study was conducted on the material of excess verb forms with a/o vowel variability in the root (обусловливать/обуславливать). The graphs of variable word-form competition clearly illustrate that the process of norm change consists of stages, each of which has numerical characteristics of the use of the two competing word forms. The chronological framework for a change of inflectional model is established with an accuracy of up to 10 years. The graphs obtained as the result of the experiment make it possible to conclude that almost half of the verbs were not variative, although they were previously considered to be. During the discussion of the obtained empirical data, a conclusion is made about the morphemic structure of a word in which a root vowel changes. Possessing information about similar processes in other verb paradigms, researchers are able to predict a possible change of inflectional models in the future and, as a consequence, the fixing of a new norm in lexicographical, orthographic and orthoepic sources.

  9. Disconnected forms of the standard group

    International Nuclear Information System (INIS)

    McInnes, B.

    1996-10-01

    Recent work in quantum gravity has led to a revival of interest in the concept of disconnected gauge groups. Here we explain how to classify all of the (non-trivial) groups which have the same Lie algebra as the ''standard group'', SU(3) x SU(2) x U(1), without requiring connectedness. The number of possibilities is surprisingly large. We also discuss the geometry of the ''Kiskis effect'', the ambiguity induced by non-trivial spacetime topology in such gauge theories. (author). 12 refs

  10. NON-STANDARD FORMS OF EMPLOYMENT IN BUSINESS ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    A. E. Chekanov

    2013-01-01

    The article discusses the emergence and development of non-standard forms of employment and flexible working. The causes of their use are reflected in the results of research conducted in the workplace. Non-standard forms of employment are also attractive today because they allow the circle of the workforce to be expanded.

  11. Standard Test Method for Normal Spectral Emittance at Elevated Temperatures

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1972-01-01

    1.1 This test method describes a highly accurate technique for measuring the normal spectral emittance of electrically conducting materials or materials with electrically conducting substrates, in the temperature range from 600 to 1400 K, and at wavelengths from 1 to 35 μm. 1.2 The test method requires expensive equipment and rather elaborate precautions, but produces data that are accurate to within a few percent. It is suitable for research laboratories where the highest precision and accuracy are desired, but is not recommended for routine production or acceptance testing. However, because of its high accuracy this test method can be used as a referee method to be applied to production and acceptance testing in cases of dispute. 1.3 The values stated in SI units are to be regarded as the standard. The values in parentheses are for information only. 1.4 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this stan...

  12. Normalization Of Thermal-Radiation Form-Factor Matrix

    Science.gov (United States)

    Tsuyuki, Glenn T.

    1994-01-01

    Report describes algorithm that adjusts form-factor matrix in TRASYS computer program, which calculates intraspacecraft radiative interchange among various surfaces and environmental heat loading from sources such as sun.
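
    The redistribution step described in this record can be sketched in a few lines. This is a minimal illustration, assuming each matrix row is a plain Python list and using simple proportional per-row scaling; the report describes the actual TRASYS algorithm only qualitatively, so the function name and the tolerance handling below are our own.

```python
def normalize_form_factors(F, tol=0.05):
    """Scale each row of an enclosure form-factor matrix so it sums
    to exactly 1, distributing the residual proportionally across the
    row's entries. Rows whose sums deviate from unity by more than
    `tol` are left untouched and flagged, since a large residual
    signals a modeling error rather than numerical noise."""
    out, flags = [], []
    for row in F:
        s = sum(row)
        ok = abs(s - 1.0) <= tol
        flags.append(ok)
        out.append([v / s for v in row] if ok else list(row))
    return out, flags

# Two nodes of an open enclosure whose computed sums miss unity slightly.
F = [[0.52, 0.51],    # sums to 1.03
     [0.40, 0.58]]    # sums to 0.98
G, ok = normalize_form_factors(F)
```

    After scaling, every accepted row sums to unity, which prevents optimistic form factors to space for open geometries.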

  13. Development of standard testing methods for nuclear-waste forms

    International Nuclear Information System (INIS)

    Mendel, J.E.; Nelson, R.D.

    1981-11-01

    Standard test methods for waste package component development and design, safety analyses, and licensing are being developed for the Nuclear Waste Materials Handbook. This paper mainly describes the testing methods for obtaining waste form materials data

  14. Comparative analysis of JKR Sarawak form of contract and Malaysia Standard form of building contract (PWD203A)

    Science.gov (United States)

    Yunus, A. I. A.; Muhammad, W. M. N. W.; Saaid, M. N. F.

    2018-04-01

    Standard forms of contract are normally used in the Malaysian construction industry to establish legal relations between contracting parties. Generally, most Malaysian federal government construction projects use PWD203A, a standard form of contract for use where Bills of Quantities form part of the contract, issued by the Public Works Department (PWD/JKR). On the other hand, in Sarawak, the largest state in Malaysia, the state government has issued its own standard form of contract, namely the JKR Sarawak Form of Contract 2006. Even though both forms have been used widely in the construction industry, there is still a lack of understanding of both forms. The aim of this paper is to identify the significant provisions of both forms of contract. Document analysis has been adopted in conducting an in-depth review of both forms. It is found that both forms of contract have differences and similarities in several provisions, specifically in matters of definitions and general provisions; execution of the works; payments, completion and final account; and delay, dispute resolution and determination.

  15. 41 CFR 101-1.4901 - Standard forms. [Reserved

    Science.gov (United States)

    2010-07-01

    ... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Standard forms. [Reserved] 101-1.4901 Section 101-1.4901 Public Contracts and Property Management Federal Property Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS GENERAL 1-INTRODUCTION 1.49-Illustrations of Forms...

  16. Diagonalization and Jordan Normal Form--Motivation through "Maple"[R]

    Science.gov (United States)

    Glaister, P.

    2009-01-01

    Following an introduction to the diagonalization of matrices, one of the more difficult topics for students to grasp in linear algebra is the concept of Jordan normal form. In this note, we show how the important notions of diagonalization and Jordan normal form can be introduced and developed through the use of the computer algebra package…
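
    The Jordan-form computation that the article motivates through Maple can also be sketched by hand for a 2x2 defective matrix, using only plain Python; the example matrix below is ours, not one from the article.

```python
# Jordan chain for a 2x2 defective matrix, by hand (no libraries).
# M has the double eigenvalue 2 but only one eigenvector, so its
# Jordan normal form is a single 2x2 block [[2, 1], [0, 2]].
M = [[3, 1], [-1, 1]]
lam = 2                          # double root of det(M - x I) = (x - 2)^2

def matvec(A, v):
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

N = [[M[0][0] - lam, M[0][1]],
     [M[1][0], M[1][1] - lam]]   # nilpotent part: N squared is zero

v = [1, 0]                       # generalized eigenvector: N v != 0
w = matvec(N, v)                 # true eigenvector: N w = 0
P = [[w[0], v[0]],
     [w[1], v[1]]]               # columns (w, v) satisfy M P = P J
J = [[lam, 1], [0, lam]]         # the Jordan block
```

    The defining relation M P = P J can be checked by direct multiplication, which is exactly the verification step a computer algebra package automates.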

  17. On the relationship between LTL normal forms and Büchi automata

    DEFF Research Database (Denmark)

    Li, Jianwen; Pu, Geguang; Zhang, Lijun

    2013-01-01

    In this paper, we revisit the problem of translating LTL formulas to Büchi automata. We first translate the given LTL formula into a special disjuctive-normal form (DNF). The formula will be part of the state, and its DNF normal form specifies the atomic properties that should hold immediately...

  18. Normal forms of invariant vector fields under a finite group action

    International Nuclear Information System (INIS)

    Sanchez Bringas, F.

    1992-07-01

    Let Γ be a finite subgroup of GL(n,C). This subgroup acts on the space of germs of holomorphic vector fields vanishing at the origin in C^n. We prove a theorem of invariant conjugation to a normal form and linearization for the subspace of invariant elements and we give a description of these normal forms in dimension n=2. (author)

  19. Normal forms for Poisson maps and symplectic groupoids around Poisson transversals.

    Science.gov (United States)

    Frejlich, Pedro; Mărcuț, Ioan

    2018-01-01

    Poisson transversals are submanifolds in a Poisson manifold which intersect all symplectic leaves transversally and symplectically. In this communication, we prove a normal form theorem for Poisson maps around Poisson transversals. A Poisson map pulls a Poisson transversal back to a Poisson transversal, and our first main result states that simultaneous normal forms exist around such transversals, for which the Poisson map becomes transversally linear, and intertwines the normal form data of the transversals. Our second result concerns symplectic integrations. We prove that a neighborhood of a Poisson transversal is integrable exactly when the Poisson transversal itself is integrable, and in that case we prove a normal form theorem for the symplectic groupoid around its restriction to the Poisson transversal, which puts all structure maps in normal form. We conclude by illustrating our results with examples arising from Lie algebras.

  20. Application of normal form methods to the analysis of resonances in particle accelerators

    International Nuclear Information System (INIS)

    Davies, W.G.

    1992-01-01

    The transformation to normal form in a Lie-algebraic framework provides a very powerful method for identifying and analysing non-linear behaviour and resonances in particle accelerators. The basic ideas are presented and illustrated. (author). 4 refs

  1. On some hypersurfaces with time like normal bundle in pseudo Riemannian space forms

    International Nuclear Information System (INIS)

    Kashani, S.M.B.

    1995-12-01

    In this work we classify immersed hypersurfaces with constant sectional curvature in pseudo Riemannian space forms if the normal bundle is time like and the mean curvature is constant. (author). 9 refs

  2. Normal Forms for Retarded Functional Differential Equations and Applications to Bogdanov-Takens Singularity

    Science.gov (United States)

    Faria, T.; Magalhaes, L. T.

    The paper addresses, for retarded functional differential equations (FDEs), the computation of normal forms associated with the flow on a finite-dimensional invariant manifold tangent to invariant spaces for the infinitesimal generator of the linearized equation at a singularity. A phase space appropriate to the computation of these normal forms is introduced, and adequate nonresonance conditions for the computation of the normal forms are derived. As an application, the general situation of Bogdanov-Takens singularity and its versal unfolding for scalar retarded FDEs with nondegeneracy at second order is considered, both in the general case and in the case of differential-delay equations of the form ẋ(t) = f(x(t), x(t-1)).

  3. Quantifying Normal Craniofacial Form and Baseline Craniofacial Asymmetry in the Pediatric Population.

    Science.gov (United States)

    Cho, Min-Jeong; Hallac, Rami R; Ramesh, Jananie; Seaward, James R; Hermann, Nuno V; Darvann, Tron A; Lipira, Angelo; Kane, Alex A

    2018-03-01

    Restoring craniofacial symmetry is an important objective in the treatment of many craniofacial conditions. Normal form has been measured using anthropometry, cephalometry, and photography, yet all of these modalities have drawbacks. In this study, the authors define normal pediatric craniofacial form and craniofacial asymmetry using stereophotogrammetric images, which capture a densely sampled set of points on the form. After institutional review board approval, normal, healthy children (n = 533) with no known craniofacial abnormalities were recruited at well-child visits to undergo full head stereophotogrammetric imaging. The children's ages ranged from 0 to 18 years. A symmetric three-dimensional template was registered and scaled to each individual scan using 25 manually placed landmarks. The template was deformed to each subject's three-dimensional scan using a thin-plate spline algorithm and closest point matching. Age-based normal facial models were derived. Mean facial asymmetry and statistical characteristics of the population were calculated. The mean head asymmetry across all pediatric subjects was 1.5 ± 0.5 mm (range, 0.46 to 4.78 mm), and the mean facial asymmetry was 1.2 ± 0.6 mm (range, 0.4 to 5.4 mm). There were no significant differences in the mean head or facial asymmetry with age, sex, or race. Understanding the "normal" form and baseline distribution of asymmetry is an important anthropomorphic foundation. The authors present a method to quantify normal craniofacial form and baseline asymmetry in a large pediatric sample. The authors found that the normal pediatric craniofacial form is asymmetric, and does not change in magnitude with age, sex, or race.
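
    The asymmetry values reported above come from densely registered surface scans; as a drastically simplified sketch of the underlying distance measure, one can mirror paired landmarks across the midsagittal plane x = 0 and average the residual distances. The pairing scheme below is our simplification, not the authors' thin-plate-spline template registration.

```python
import math

def mean_asymmetry(pairs):
    """Mean distance (in mm) between each left-side landmark and the
    mirror image, across the plane x = 0, of its right-side partner.
    `pairs` is a list of ((lx, ly, lz), (rx, ry, rz)) tuples given in
    a head-centered coordinate frame."""
    total = 0.0
    for (lx, ly, lz), (rx, ry, rz) in pairs:
        total += math.dist((lx, ly, lz), (-rx, ry, rz))
    return total / len(pairs)

# A perfectly symmetric pair contributes 0; a 1 mm offset contributes 1.
pairs = [((30.0, 10.0, 5.0), (-30.0, 10.0, 5.0)),
         ((20.0, -4.0, 2.0), (-20.0, -3.0, 2.0))]
asym = mean_asymmetry(pairs)
```

    A population mean of this quantity, computed over dense correspondences rather than a few landmarks, is what yields figures like the 1.2 ± 0.6 mm facial asymmetry reported above.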

  4. A normal form approach to the theory of nonlinear betatronic motion

    International Nuclear Information System (INIS)

    Bazzani, A.; Todesco, E.; Turchetti, G.; Servizi, G.

    1994-01-01

    The betatronic motion of a particle in a circular accelerator is analysed using the transfer map description of the magnetic lattice. In the linear case the transfer matrix approach is shown to be equivalent to the Courant-Snyder theory: In the normal coordinates' representation the transfer matrix is a pure rotation. When the nonlinear effects due to the multipolar components of the magnetic field are taken into account, a similar procedure is used: a nonlinear change of coordinates provides a normal form representation of the map, which exhibits explicit symmetry properties depending on the absence or presence of resonance relations among the linear tunes. The use of normal forms is illustrated in the simplest but significant model of a cell with a sextupolar nonlinearity which is described by the quadratic Henon map. After recalling the basic theoretical results in Hamiltonian dynamics, we show how the normal forms describe the different topological structures of phase space such as KAM tori, chains of islands and chaotic regions; a critical comparison with the usual perturbation theory for Hamilton equations is given. The normal form theory is applied to compute the tune shift and deformation of the orbits for the lattices of the SPS and LHC accelerators, and scaling laws are obtained. Finally, the correction procedure of the multipolar errors of the LHC, based on the analytic minimization of the tune shift computed via the normal forms, is described and the results for a model of the LHC are presented. This application, relevant for the lattice design, focuses on the advantages of normal forms with respect to tracking when parametric dependences have to be explored. (orig.)
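
    The quadratic Henon map mentioned above, a rotation by the linear tune composed with a sextupole-like kick, can be sketched directly; the parameterization below is one common convention, not necessarily the authors'.

```python
import math

def henon_map(x, p, omega):
    """One turn of the area-preserving quadratic Henon map: the
    sextupole-like kick p -> p - x**2 followed by a rotation through
    the tune angle omega (a standard one-cell accelerator model)."""
    c, s = math.cos(omega), math.sin(omega)
    kp = p - x * x                      # nonlinear kick
    return c * x + s * kp, -s * x + c * kp

def orbit(x0, p0, omega, n):
    """Track a particle for n turns, returning the phase-space points."""
    x, p = x0, p0
    pts = [(x, p)]
    for _ in range(n):
        x, p = henon_map(x, p, omega)
        pts.append((x, p))
    return pts
```

    Since the map is a shear of unit Jacobian followed by a rotation, it preserves phase-space area exactly, the symplectic property that normal-form analysis relies on.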

  5. SYNTHESIS METHODS OF ALGEBRAIC NORMAL FORM OF MANY-VALUED LOGIC FUNCTIONS

    Directory of Open Access Journals (Sweden)

    A. V. Sokolov

    2016-01-01

    Full Text Available The rapid development of methods of error-correcting coding, cryptography, and signal synthesis theory based on the principles of many-valued logic determines the need for a more detailed study of the forms of representation of functions of many-valued logic. In particular, the algebraic normal form of Boolean functions, also known as the Zhegalkin polynomial, is widely used, as it describes many of the cryptographic properties of Boolean functions well. In this article, we formalize the notion of the algebraic normal form for many-valued logic functions. We develop a fast method for the synthesis of the algebraic normal form of 3-functions and 5-functions that works similarly to the Reed-Muller transform for Boolean functions, on the basis of recurrently synthesized transform matrices. We propose a hypothesis that determines the rules for the synthesis of these matrices for the transformation from the truth table to the coefficients of the algebraic normal form, and for the inverse transform, for any given number of variables of 3-functions or 5-functions. The article also introduces a definition of the algebraic degree of nonlinearity of the functions of many-valued logic and of the S-box based on the principles of many-valued logic. Thus, the methods of synthesis of the algebraic normal form of 3-functions are applied to the known construction of recurrent synthesis of S-boxes of length N = 3k, whereby their algebraic degrees of nonlinearity are computed. The results could be the basis for further theoretical research and practical applications such as the development of new cryptographic primitives, error-correcting codes, algorithms of data compression, signal structures, and algorithms of block and stream encryption, all based on the promising principles of many-valued logic. In addition, the fast method of synthesis of the algebraic normal form of many-valued logic functions is the basis for their software and hardware implementation.
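
    For the two-valued (Boolean) case that the article generalizes, the coefficients of the algebraic normal form are obtained from the truth table by the XOR butterfly, i.e. the binary Reed-Muller transform that the authors extend to 3-functions and 5-functions. A minimal sketch:

```python
def anf_coefficients(tt):
    """Zhegalkin/ANF coefficients of a Boolean function from its truth
    table (length 2**n, where the index bits are the variable values),
    via the in-place butterfly Reed-Muller transform over GF(2).
    Coefficient a[m] is 1 iff the monomial with variable set m appears."""
    a = list(tt)
    n = len(a)
    step = 1
    while step < n:
        for i in range(0, n, 2 * step):
            for j in range(i, i + step):
                a[j + step] ^= a[j]      # XOR butterfly
        step *= 2
    return a

# f(x1, x2) = x1 AND x2 has ANF x1*x2: only the top coefficient is 1.
and_coeffs = anf_coefficients([0, 0, 0, 1])
# f(x1, x2) = x1 XOR x2 has ANF x1 + x2: the two linear coefficients.
xor_coeffs = anf_coefficients([0, 1, 1, 0])
```

    The transform is an involution over GF(2), so applying it twice recovers the truth table, which is the property the recurrent transform matrices in the article generalize.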

  6. 48 CFR 53.301-1427 - Standard Form 1427, Inventory Schedule A-Construction Sheet (Metals in Mill Product Form).

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Standard Form 1427, Inventory Schedule A-Construction Sheet (Metals in Mill Product Form). 53.301-1427 Section 53.301-1427... Illustrations of Forms 53.301-1427 Standard Form 1427, Inventory Schedule A—Construction Sheet (Metals in Mill...

  7. Effects of a prolonged standardized diet on normalizing the human metabolome123

    OpenAIRE

    Winnike, Jason H; Busby, Marjorie G; Watkins, Paul B; O'Connell, Thomas M

    2009-01-01

    Background: Although the effects of acute dietary interventions on the human metabolome have been studied, the extent to which the metabolome can be normalized by extended dietary standardization has not yet been examined.

  8. Breast composition measurements using retrospective standard mammogram form (SMF)

    International Nuclear Information System (INIS)

    Highnam, R; Pan, X; Warren, R; Jeffreys, M; Smith, G Davey; Brady, M

    2006-01-01

    The standard mammogram form (SMF) representation of an x-ray mammogram is a standardized, quantitative representation of the breast from which the volume of non-fat tissue and breast density can be easily estimated, both of which are of significant interest in determining breast cancer risk. Previous theoretical analysis of SMF had suggested that a complete and substantial set of calibration data (such as mAs and kVp) would be needed to generate realistic breast composition measures, and yet there are many interesting trials that have retrospectively collected images with no calibration data. The main contribution of this paper is to revisit our previous theoretical analysis of SMF with respect to errors in the calibration data and to show how and why that theoretical analysis did not match the results from the practical implementations of SMF. In particular, we show how, by estimating breast thickness for every image, we are effectively compensating for any errors in the calibration data. To illustrate our findings, the current implementation of SMF (version 2.2β) was run over 4028 digitized film-screen mammograms taken from six sites over the years 1988-2002, with and without using the known calibration data. Results show that the SMF implementation running without any calibration data at all generates results which display a strong relationship with those obtained when running with a complete set of calibration data and, most importantly, with an expert's visual assessment of breast composition using established techniques. SMF shows considerable promise in being of major use in large epidemiological studies related to breast cancer which require the automated analysis of large numbers of films from many years previously, where little or no calibration data is available.

  9. Standard form contracts and a smart contract future

    Directory of Open Access Journals (Sweden)

    Kristin B. Cornelius

    2018-05-01

    Full Text Available With a budding market of widespread smart contract implementation on the horizon, there is much conversation about how to regulate this new technology. Discourse on standard form contracts (SFCs and how they have been adopted in a digital environment is useful toward predicting how smart contracts might be interpreted. This essay provides a critical review of the discourse surrounding digitised SFCs and applies it to issues in smart contract regulation. An exploration of the literature surrounding specific instances of SFCs finds that it lacks a close examination of the textual and documentary aspects of SFCs, which are particularly important in a digital environment as a shift in medium prompts a different procedural process. Instead, common perspectives are either based on outdated notions of paper versions of these contracts or on ideologies of industry and business that do not sufficiently address the needs of consumers/users in the digital age. Most importantly, noting the failure of contract law to address the inequities of SFCs in this environment can help prevent them from being codified further with smart contracts.

  10. Reconstruction of normal forms by learning informed observation geometries from data.

    Science.gov (United States)

    Yair, Or; Talmon, Ronen; Coifman, Ronald R; Kevrekidis, Ioannis G

    2017-09-19

    The discovery of physical laws consistent with empirical observations is at the heart of (applied) science and engineering. These laws typically take the form of nonlinear differential equations depending on parameters; dynamical systems theory provides, through the appropriate normal forms, an "intrinsic" prototypical characterization of the types of dynamical regimes accessible to a given model. Using an implementation of data-informed geometry learning, we directly reconstruct the relevant "normal forms": a quantitative mapping from empirical observations to prototypical realizations of the underlying dynamics. Interestingly, the state variables and the parameters of these realizations are inferred from the empirical observations; without prior knowledge or understanding, they parametrize the dynamics intrinsically without explicit reference to fundamental physical quantities.

  11. Standardized uptake values of fluorine-18 fluorodeoxyglucose: the value of different normalization procedures

    International Nuclear Information System (INIS)

    Schomburg, A.; Bender, H.; Reichel, C.; Sommer, T.; Ruhlmann, J.; Kozak, B.; Biersack, H.J.

    1996-01-01

    While the evident advantages of absolute metabolic rate determinations cannot be equalled by static image analysis of fluorine-18 fluorodeoxyglucose positron emission tomographic (FDG PET) studies, various algorithms for the normalization of static FDG uptake values have been proposed. This study was performed to compare different normalization procedures in terms of dependency on individual patient characteristics. Standardized FDG uptake values (SUVs) were calculated for liver and lung tissue in 126 patients studied with whole-body FDG PET. Uptake values were normalized for total body weight, lean body mass and body surface area. Ranges, means, medians, standard deviations and variation coefficients of these SUV parameters were calculated and their interdependency with total body weight, lean body mass, body surface area, patient height and blood sugar levels was calculated by means of regression analysis. Standardized FDG uptake values normalized for body surface area were clearly superior to SUV parameters normalized for total body weight or lean body mass. Variation and correlation coefficients of body surface area-normalized uptake values were minimal when compared with SUV parameters derived from the other normalization procedures. Normalization for total body weight resulted in uptake values still dependent on body weight and blood sugar levels, while normalization for lean body mass did not eliminate the positive correlation with lean body mass and patient height. It is concluded that normalization of FDG uptake values for body surface area is less dependent on the individual patient characteristics than are FDG uptake values normalized for other parameters, and therefore appears to be preferable for FDG PET studies in oncology. (orig.)
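
    All of the compared normalizations follow the same pattern, tissue concentration divided by injected dose per unit of the chosen normalization factor. A sketch under one common set of unit conventions, using the Du Bois body-surface-area estimate (the paper does not state which BSA formula was used, so this choice is an assumption):

```python
def suv_bw(conc_kbq_ml, dose_mbq, weight_kg):
    """SUV normalized for total body weight, assuming 1 g of tissue
    occupies about 1 mL: concentration / (dose / body mass)."""
    dose_kbq = dose_mbq * 1000.0
    return conc_kbq_ml / (dose_kbq / (weight_kg * 1000.0))

def dubois_bsa_m2(weight_kg, height_cm):
    """Du Bois & Du Bois body-surface-area estimate in m^2 (one common
    choice; other BSA formulas exist)."""
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

def suv_bsa(conc_kbq_ml, dose_mbq, weight_kg, height_cm):
    """SUV normalized for body surface area. Its units differ from the
    dimensionless body-weight SUV; only relative comparisons between
    patients matter for the analysis described above."""
    dose_kbq = dose_mbq * 1000.0
    return conc_kbq_ml * dubois_bsa_m2(weight_kg, height_cm) / dose_kbq
```

    For a typical 70 kg, 170 cm patient injected with 370 MBq, the Du Bois estimate gives a BSA of roughly 1.8 m², which is the quantity the study found to decouple SUVs best from individual patient characteristics.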

  12. 46 CFR 308.409 - Standard form of War Risk Builder's Risk Insurance Policy, Form MA-283.

    Science.gov (United States)

    2010-10-01

    ... Policy, Form MA-283. 308.409 Section 308.409 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF... of War Risk Builder's Risk Insurance Policy, Form MA-283. The standard form of War Risk Builder's Risk Insurance Policy, Form MA-283 may be obtained from the American War Risk Agency or MARAD. ...

  13. On the construction of the Kolmogorov normal form for the Trojan asteroids

    CERN Document Server

    Gabern, F; Locatelli, U

    2004-01-01

    In this paper we focus on the stability of the Trojan asteroids for the planar Restricted Three-Body Problem (RTBP), by extending the usual techniques for the neighbourhood of an elliptic point to derive results in a larger vicinity. Our approach is based on the numerical determination of the frequencies of the asteroid and the effective computation of the Kolmogorov normal form for the corresponding torus. This procedure has been applied to the first 34 Trojan asteroids of the IAU Asteroid Catalog, and it has worked successfully for 23 of them. The construction of this normal form allows for computer-assisted proofs of stability. To show it, we have implemented a proof of existence of families of invariant tori close to a given asteroid, for a high order expansion of the Hamiltonian. This proof has been successfully applied to three Trojan asteroids.

  14. Generating All Permutations by Context-Free Grammars in Chomsky Normal Form

    NARCIS (Netherlands)

    Asveld, P.R.J.; Spoto, F.; Scollo, Giuseppe; Nijholt, Antinus

    2003-01-01

    Let $L_n$ be the finite language of all $n!$ strings that are permutations of $n$ different symbols ($n\\geq 1$). We consider context-free grammars $G_n$ in Chomsky normal form that generate $L_n$. In particular we study a few families $\\{G_n\\}_{n\\geq 1}$, satisfying $L(G_n)=L_n$ for $n\\geq 1$, with
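
    The Chomsky-normal-form requirement on the grammars $G_n$ (every production is A → BC with B, C nonterminal, or A → a with a terminal) is easy to check mechanically; the tiny grammar for $L_2$ below is our own sketch, not one of the families studied in the paper.

```python
def is_cnf(grammar, nonterminals):
    """Check that every production is A -> B C (two nonterminals) or
    A -> a (one terminal), the Chomsky-normal-form shape (ignoring the
    optional S -> epsilon rule). `grammar` maps each nonterminal to a
    list of right-hand sides, each a tuple of symbols."""
    for lhs, rhss in grammar.items():
        for rhs in rhss:
            if len(rhs) == 1 and rhs[0] not in nonterminals:
                continue                      # A -> a
            if len(rhs) == 2 and all(s in nonterminals for s in rhs):
                continue                      # A -> B C
            return False
    return True

# A CNF grammar for L_2 = {ab, ba}, the permutations of two symbols:
# S -> A X | B Y,  A -> a,  B -> b,  X -> b,  Y -> a.
G2 = {"S": [("A", "X"), ("B", "Y")],
      "A": [("a",)], "B": [("b",)],
      "X": [("b",)], "Y": [("a",)]}
```

    The interesting question in the papers is not this syntactic check but how few nonterminals and productions such a CNF grammar for $L_n$ can have as $n$ grows.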

  15. Generating all permutations by context-free grammars in Chomsky normal form

    NARCIS (Netherlands)

    Asveld, P.R.J.

    2006-01-01

    Let $L_n$ be the finite language of all $n!$ strings that are permutations of $n$ different symbols ($n\\geq1$). We consider context-free grammars $G_n$ in Chomsky normal form that generate $L_n$. In particular we study a few families $\\{G_n\\}_{n\\geq1}$, satisfying $L(G_n)=L_n$ for $n\\geq1$, with

  16. Generating All Permutations by Context-Free Grammars in Chomsky Normal Form

    NARCIS (Netherlands)

    Asveld, P.R.J.

    2004-01-01

    Let $L_n$ be the finite language of all $n!$ strings that are permutations of $n$ different symbols ($n\\geq 1$). We consider context-free grammars $G_n$ in Chomsky normal form that generate $L_n$. In particular we study a few families $\\{G_n\\}_{n\\geq1}$, satisfying $L(G_n)=L_n$ for $n\\geq 1$, with

  17. THE METHOD OF CONSTRUCTING A BOOLEAN FORMULA OF A POLYGON IN THE DISJUNCTIVE NORMAL FORM

    Directory of Open Access Journals (Sweden)

    A. A. Butov

    2014-01-01

    Full Text Available The paper focuses on finalizing the method of finding a polygon Boolean formula in disjunctive normal form, described in the previous article [1]. The improved method eliminates the drawback associated with the existence of a class of problems for which the solution is only approximate. The proposed method always makes it possible to find an exact solution. The method can be used, in particular, in systems for the computer-aided design of integrated circuit topology.

  18. Analysis of a renormalization group method and normal form theory for perturbed ordinary differential equations

    Science.gov (United States)

    DeVille, R. E. Lee; Harkin, Anthony; Holzer, Matt; Josić, Krešimir; Kaper, Tasso J.

    2008-06-01

    For singular perturbation problems, the renormalization group (RG) method of Chen, Goldenfeld, and Oono [Phys. Rev. E. 49 (1994) 4502-4511] has been shown to be an effective general approach for deriving reduced or amplitude equations that govern the long time dynamics of the system. It has been applied to a variety of problems traditionally analyzed using disparate methods, including the method of multiple scales, boundary layer theory, the WKBJ method, the Poincaré-Lindstedt method, the method of averaging, and others. In this article, we show how the RG method may be used to generate normal forms for large classes of ordinary differential equations. First, we apply the RG method to systems with autonomous perturbations, and we show that the reduced or amplitude equations generated by the RG method are equivalent to the classical Poincaré-Birkhoff normal forms for these systems up to and including terms of O(ɛ2), where ɛ is the perturbation parameter. This analysis establishes our approach and generalizes to higher order. Second, we apply the RG method to systems with nonautonomous perturbations, and we show that the reduced or amplitude equations so generated constitute time-asymptotic normal forms, which are based on KBM averages. Moreover, for both classes of problems, we show that the main coordinate changes are equivalent, up to translations between the spaces in which they are defined. In this manner, our results show that the RG method offers a new approach for deriving normal forms for nonautonomous systems, and it offers advantages since one can typically more readily identify resonant terms from naive perturbation expansions than from the nonautonomous vector fields themselves. Finally, we establish how well the solution to the RG equations approximates the solution of the original equations on time scales of O(1/ɛ).

  19. Normalization method for metabolomics data using optimal selection of multiple internal standards

    Directory of Open Access Journals (Sweden)

    Yetukuri Laxman

    2007-03-01

    Full Text Available Background: Success of metabolomics as the phenotyping platform largely depends on its ability to detect various sources of biological variability. Removal of platform-specific sources of variability such as systematic error is therefore one of the foremost priorities in data preprocessing. However, chemical diversity of molecular species included in typical metabolic profiling experiments leads to different responses to variations in experimental conditions, making normalization a very demanding task. Results: With the aim to remove unwanted systematic variation, we present an approach that utilizes variability information from multiple internal standard compounds to find the optimal normalization factor for each individual molecular species detected by the metabolomics approach (NOMIS). We demonstrate the method on mouse liver lipidomic profiles using Ultra Performance Liquid Chromatography coupled to high resolution mass spectrometry, and compare its performance to two commonly utilized normalization methods: normalization by l2 norm and by retention time region specific standard compound profiles. The NOMIS method proved superior in its ability to reduce the effect of systematic error across the full spectrum of metabolite peaks. We also demonstrate that the method can be used to select best combinations of standard compounds for normalization. Conclusion: Depending on experiment design and biological matrix, the NOMIS method is applicable either as a one-step normalization method or as a two-step method where the normalization parameters, influenced by variabilities of internal standard compounds and their correlation to metabolites, are first calculated from a study conducted in repeatability conditions. The method can also be used in analytical development of metabolomics methods by helping to select best combinations of standard compounds for a particular biological matrix and analytical platform.
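
    A drastically simplified, single-standard variant of the idea can be sketched as follows: for each metabolite profile, pick the internal standard it correlates with most strongly across samples and divide by it. NOMIS itself fits an optimal combination of multiple standards; this one-standard version, with our own function names, is only an illustration.

```python
def pearson(u, v):
    """Pearson correlation of two equal-length sample profiles."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    return cov / (su * sv)

def normalize_by_best_standard(metabolite, standards):
    """Divide a metabolite profile (one intensity per sample) by the
    internal-standard profile it correlates with most strongly, on the
    premise that shared systematic variation cancels in the ratio."""
    best = max(standards, key=lambda s: pearson(metabolite, s))
    return [m / s for m, s in zip(metabolite, best)]

# A metabolite tracking the first standard's systematic drift exactly:
metab = [2.0, 4.0, 6.0, 8.0]
stds = [[1.0, 2.0, 3.0, 4.0],     # correlated drift
        [4.0, 3.0, 2.0, 1.0]]     # anti-correlated standard
normalized = normalize_by_best_standard(metab, stds)
```

    In this toy case the ratio is constant across samples, i.e. the systematic variation is removed completely; NOMIS generalizes this by weighting several standards per metabolite.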

  20. Planar undulator motion excited by a fixed traveling wave. Quasiperiodic averaging normal forms and the FEL pendulum

    Energy Technology Data Exchange (ETDEWEB)

    Ellison, James A.; Heinemann, Klaus [New Mexico Univ., Albuquerque, NM (United States). Dept. of Mathematics and Statistics; Vogt, Mathias [Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany); Gooden, Matthew [North Carolina State Univ., Raleigh, NC (United States). Dept. of Physics

    2013-03-15

    We present a mathematical analysis of planar motion of energetic electrons moving through a planar dipole undulator, excited by a fixed planar polarized plane wave Maxwell field in the X-Ray FEL regime. Our starting point is the 6D Lorentz system, which allows planar motions, and we examine this dynamical system as the wavelength λ of the traveling wave varies. By scalings and transformations the 6D system is reduced, without approximation, to a 2D system in a form for a rigorous asymptotic analysis using the Method of Averaging (MoA), a long time perturbation theory. The two dependent variables are a scaled energy deviation and a generalization of the so-called ponderomotive phase. As λ varies the system passes through resonant and nonresonant (NR) zones and we develop NR and near-to-resonant (NtoR) MoA normal form approximations. The NtoR normal forms contain a parameter which measures the distance from a resonance. For a special initial condition, for the planar motion and on resonance, the NtoR normal form reduces to the well known FEL pendulum system. We then state and prove NR and NtoR first-order averaging theorems which give explicit error bounds for the normal form approximations. We prove the theorems in great detail, giving the interested reader a tutorial on mathematically rigorous perturbation theory in a context where the proofs are easily understood. The proofs are novel in that they do not use a near identity transformation and they use a system of differential inequalities. The NR case is an example of quasiperiodic averaging where the small divisor problem enters in the simplest possible way. To our knowledge the planar problem has not been analyzed with the generality we aspire to here nor has the standard FEL pendulum system been derived with associated error bounds as we do here. We briefly discuss the low gain theory in light of our NtoR normal form.
Our mathematical treatment of the noncollective FEL beam dynamics problem in

  1. Planar undulator motion excited by a fixed traveling wave. Quasiperiodic averaging normal forms and the FEL pendulum

    International Nuclear Information System (INIS)

    Ellison, James A.; Heinemann, Klaus; Gooden, Matthew

    2013-03-01

    We present a mathematical analysis of planar motion of energetic electrons moving through a planar dipole undulator, excited by a fixed planar polarized plane wave Maxwell field in the X-Ray FEL regime. Our starting point is the 6D Lorentz system, which allows planar motions, and we examine this dynamical system as the wavelength λ of the traveling wave varies. By scalings and transformations the 6D system is reduced, without approximation, to a 2D system in a form for a rigorous asymptotic analysis using the Method of Averaging (MoA), a long time perturbation theory. The two dependent variables are a scaled energy deviation and a generalization of the so-called ponderomotive phase. As λ varies the system passes through resonant and nonresonant (NR) zones and we develop NR and near-to-resonant (NtoR) MoA normal form approximations. The NtoR normal forms contain a parameter which measures the distance from a resonance. For a special initial condition, for the planar motion and on resonance, the NtoR normal form reduces to the well known FEL pendulum system. We then state and prove NR and NtoR first-order averaging theorems which give explicit error bounds for the normal form approximations. We prove the theorems in great detail, giving the interested reader a tutorial on mathematically rigorous perturbation theory in a context where the proofs are easily understood. The proofs are novel in that they do not use a near identity transformation and they use a system of differential inequalities. The NR case is an example of quasiperiodic averaging where the small divisor problem enters in the simplest possible way. To our knowledge the planar problem has not been analyzed with the generality we aspire to here nor has the standard FEL pendulum system been derived with associated error bounds as we do here. We briefly discuss the low gain theory in light of our NtoR normal form. Our mathematical treatment of the noncollective FEL beam dynamics problem in the
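
    The pendulum system that the on-resonance NtoR normal form reduces to can be integrated symplectically in a few lines; the scaling and the parameter name k below are our own conventions, not the authors' notation.

```python
import math

def fel_pendulum_orbit(theta0, p0, k, dt, steps):
    """Leapfrog (kick-drift-kick) integration of the pendulum system
        theta' = p,   p' = -k * sin(theta),
    the form to which the on-resonance normal form reduces (k stands
    in here for the scaled wave amplitude). Leapfrog is symplectic, so
    the energy error stays bounded over long integrations."""
    theta, p = theta0, p0
    p -= 0.5 * dt * k * math.sin(theta)     # initial half kick
    for _ in range(steps):
        theta += dt * p                      # drift
        p -= dt * k * math.sin(theta)        # full kick
    p += 0.5 * dt * k * math.sin(theta)     # convert last kick to a half
    return theta, p

def energy(theta, p, k):
    """Conserved pendulum energy: kinetic plus potential term."""
    return 0.5 * p * p - k * math.cos(theta)
```

    For initial conditions inside the separatrix the phase librates, which in the FEL picture corresponds to electrons trapped in the ponderomotive bucket.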

  2. Primary hafnium metal sponge and other forms, approved standard 1973

    International Nuclear Information System (INIS)

    Anon.

    1975-01-01

    A specification is presented covering virgin hafnium metal commonly designated as sponge because of its porous, sponge-like texture; it may also be in other forms such as chunklets. The specification does not cover crystal bar

  3. 7 CFR 1755.30 - List of telecommunications standard contract forms.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false List of telecommunications standard contract forms... UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE TELECOMMUNICATIONS POLICIES ON SPECIFICATIONS, ACCEPTABLE MATERIALS, AND STANDARD CONTRACT FORMS § 1755.30 List of telecommunications standard contract forms. (a...

  4. Understanding Emotions from Standardized Facial Expressions in Autism and Normal Development

    Science.gov (United States)

    Castelli, Fulvia

    2005-01-01

    The study investigated the recognition of standardized facial expressions of emotion (anger, fear, disgust, happiness, sadness, surprise) at a perceptual level (experiment 1) and at a semantic level (experiments 2 and 3) in children with autism (N= 20) and normally developing children (N= 20). Results revealed that children with autism were as…

  5. Chern–Simons–Antoniadis–Savvidy forms and standard supergravity

    Energy Technology Data Exchange (ETDEWEB)

    Izaurieta, F., E-mail: fizaurie@udec.cl; Salgado, P., E-mail: pasalgad@udec.cl; Salgado, S., E-mail: sesalgado@udec.cl

    2017-04-10

    In the context of the so-called Chern–Simons–Antoniadis–Savvidy (ChSAS) forms, we use the methods for FDA decomposition in 1-forms to construct a four-dimensional ChSAS supergravity action for the Maxwell superalgebra. On the other hand, we use the Extended Cartan Homotopy Formula to find a method that allows the separation of the ChSAS action into bulk and boundary contributions and permits the splitting of the bulk Lagrangian into pieces that reflect the particular subspace structure of the gauge algebra.

  6. High molecular gas fractions in normal massive star-forming galaxies in the young Universe.

    Science.gov (United States)

    Tacconi, L J; Genzel, R; Neri, R; Cox, P; Cooper, M C; Shapiro, K; Bolatto, A; Bouché, N; Bournaud, F; Burkert, A; Combes, F; Comerford, J; Davis, M; Schreiber, N M Förster; Garcia-Burillo, S; Gracia-Carpio, J; Lutz, D; Naab, T; Omont, A; Shapley, A; Sternberg, A; Weiner, B

    2010-02-11

    Stars form from cold molecular interstellar gas. As this is relatively rare in the local Universe, galaxies like the Milky Way form only a few new stars per year. Typical massive galaxies in the distant Universe formed stars an order of magnitude more rapidly. Unless star formation was significantly more efficient, this difference suggests that young galaxies were much more molecular-gas rich. Molecular gas observations in the distant Universe have so far largely been restricted to very luminous, rare objects, including mergers and quasars, and accordingly we do not yet have a clear idea about the gas content of more normal (albeit massive) galaxies. Here we report the results of a survey of molecular gas in samples of typical massive star-forming galaxies at mean redshifts of about 1.2 and 2.3, when the Universe was respectively 40% and 24% of its current age. Our measurements reveal that distant star-forming galaxies were indeed gas rich, and that the star formation efficiency is not strongly dependent on cosmic epoch. The average fraction of cold gas relative to total galaxy baryonic mass at z = 2.3 and z = 1.2 is respectively about 44% and 34%, three to ten times higher than in today's massive spiral galaxies. The slow decrease between z approximately 2 and z approximately 1 probably requires a mechanism of semi-continuous replenishment of fresh gas to the young galaxies.

  7. Generating All Circular Shifts by Context-Free Grammars in Greibach Normal Form

    NARCIS (Netherlands)

    Asveld, Peter R.J.

    2007-01-01

    For each alphabet Σn = {a1,a2,…,an}, linearly ordered by a1 < a2 < ⋯ < an, let Cn be the language of circular or cyclic shifts over Σn, i.e., Cn = {a1a2 ⋯ an-1an, a2a3 ⋯ ana1,…,ana1 ⋯ an-2an-1}. We study a few families of context-free grammars Gn (n ≥1) in Greibach normal form such that Gn generates
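    The language Cn above is easy to enumerate directly, which is handy for checking any candidate grammar Gn against small n. A minimal Python sketch (the function name is mine, not from the paper):

```python
def circular_shifts(symbols):
    """Return all cyclic shifts of the word formed by concatenating `symbols`."""
    n = len(symbols)
    return ["".join(symbols[i:] + symbols[:i]) for i in range(n)]

# For the alphabet {a, b, c}, C3 contains exactly n = 3 words:
print(circular_shifts(["a", "b", "c"]))  # ['abc', 'bca', 'cab']
```

    Since the n symbols are distinct, the n rotations are distinct, so |Cn| = n.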

  8. Normal standards for kidney length as measured with US in premature infants

    International Nuclear Information System (INIS)

    Schlesinger, A.E.; Hedlund, G.L.; Pierson, W.P.; Null, D.M.

    1986-01-01

    In order to develop normal standards for kidney length in premature infants, the authors measured kidney length by US imaging in 39 (to date) premature infants less than 72 hours old and without known renal disease. Kidney length was compared with four different parameters of body size, including gestational age, birth weight, birth length, and body surface area. Similar standards have been generated previously for normal renal length as measured by US imaging in full-term infants and older children. These standards have proven utility in cases of congenital and acquired disorders that abnormally increase or decrease renal size. Scatter plots of kidney length versus body weight and kidney length versus body surface area conformed well to a logarithmic distribution, with a high correlation coefficient and close-fitting 95% confidence limits (SEE = 2.05)

  9. Normal form of particle motion under the influence of an ac dipole

    Directory of Open Access Journals (Sweden)

    R. Tomás

    2002-05-01

    AC dipoles in accelerators are used to excite coherent betatron oscillations at a drive frequency close to the tune. These beam oscillations may last arbitrarily long and, in principle, there is no significant emittance growth if the ac dipole is adiabatically turned on and off. Therefore the ac dipole seems to be an adequate tool for nonlinear diagnostics, provided the particle motion is well described in the presence of the ac dipole and nonlinearities. Normal forms and Lie algebra are powerful tools to study the nonlinear content of an accelerator lattice. In this article a way to obtain the normal form of the Hamiltonian of an accelerator with an ac dipole is described. The particle motion to first order in the nonlinearities is derived using Lie algebra techniques. The dependence of the Hamiltonian terms on the longitudinal coordinate is studied, showing that they vary differently depending on the ac dipole parameters. The relation is given between the lines of the Fourier spectrum of the turn-by-turn motion and the Hamiltonian terms.

  10. Principal Typings in a Restricted Intersection Type System for Beta Normal Forms with De Bruijn Indices

    Directory of Open Access Journals (Sweden)

    Daniel Ventura

    2010-01-01

    The lambda-calculus with de Bruijn indices assembles each alpha-class of lambda-terms in a unique term, using indices instead of variable names. Intersection types provide finitary type polymorphism and can characterise normalisable lambda-terms through the property that a term is normalisable if and only if it is typeable. To be closer to computations and to simplify the formalisation of the atomic operations involved in beta-contractions, several calculi of explicit substitution were developed, mostly with de Bruijn indices. Versions of explicit substitution calculi without types and with simple type systems are well investigated, in contrast to versions with more elaborate type systems such as intersection types. In previous work, we introduced a de Bruijn version of the lambda-calculus with an intersection type system and proved that it preserves subject reduction, a basic property of type systems. In this paper a version with de Bruijn indices of an intersection type system originally introduced to characterise principal typings for beta-normal forms is presented. We present the characterisation in this new system and the corresponding versions for the type inference and the reconstruction of normal forms from principal typings algorithms. We briefly discuss the failure of the subject reduction property and some possible solutions for it.

  11. Theory and praxis of map analysis in CHEF part 1: Linear normal form

    Energy Technology Data Exchange (ETDEWEB)

    Michelotti, Leo; /Fermilab

    2008-10-01

    This memo begins a series which, put together, could comprise the 'CHEF Documentation Project' if there were such a thing. The first--and perhaps only--three will telegraphically describe theory, algorithms, implementation and usage of the normal form map analysis procedures encoded in CHEF's collection of libraries. [1] This one will begin the sequence by explaining the linear manipulations that connect the Jacobian matrix of a symplectic mapping to its normal form. It is a 'Reader's Digest' version of material I wrote in Intermediate Classical Dynamics (ICD) [2] and randomly scattered across technical memos, seminar viewgraphs, and lecture notes for the past quarter century. Much of its content is old, well known, and in some places borders on the trivial. Nevertheless, completeness requires its inclusion. The primary objective is the 'fundamental theorem' on normalization written on page 8. I plan to describe the nonlinear procedures in a subsequent memo and devote a third to laying out algorithms and lines of code, connecting them with equations written in the first two. Originally this was to be done in one short paper, but I jettisoned that approach after its first section exceeded a dozen pages. The organization of this document is as follows. A brief description of notation is followed by a section containing a general treatment of the linear problem. After the 'fundamental theorem' is proved, two further subsections discuss the generation of equilibrium distributions and the issue of 'phase'. The final major section reviews parameterizations--that is, lattice functions--in two and four dimensions with a passing glance at the six-dimensional version. Appearances to the contrary, for the most part I have tried to restrict consideration to matters needed to understand the code in CHEF's libraries.
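    The linear normalization discussed in this memo rests on a familiar fact: for one degree of freedom, a stable symplectic one-turn matrix is conjugate to a pure rotation whose angle is the betatron phase advance, so the tune follows from the trace. A hedged illustration (this is not CHEF code; the function name is mine):

```python
import math

def tune_from_matrix(m11, m12, m21, m22):
    """Tune (phase advance in turns) of a stable 2x2 symplectic one-turn map.

    For a symplectic matrix (det M = 1) with |Tr M| < 2, the eigenvalues are
    exp(+/- 2*pi*i*nu) and cos(2*pi*nu) = Tr M / 2.
    """
    trace = m11 + m22
    if abs(trace) >= 2.0:
        raise ValueError("map is not stable")
    mu = math.acos(trace / 2.0)
    if m12 < 0:  # choose the branch so that sin(mu) has the sign of m12
        mu = 2.0 * math.pi - mu
    return mu / (2.0 * math.pi)

# A pure rotation through a quarter turn has tune 0.25:
c, s = math.cos(math.pi / 2), math.sin(math.pi / 2)
print(tune_from_matrix(c, s, -s, c))  # 0.25
```

    The normal form machinery generalizes exactly this step to coupled and higher-dimensional maps.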

  12. 41 CFR 102-194.5 - What is the Standard and Optional Forms Management Program?

    Science.gov (United States)

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What is the Standard and Optional Forms Management Program? 102-194.5 Section 102-194.5 Public Contracts and Property Management... PROGRAMS 194-STANDARD AND OPTIONAL FORMS MANAGEMENT PROGRAM § 102-194.5 What is the Standard and Optional...

  13. Study on electric parameters of wild and cultivated cotton forms being in normal state and irradiated

    International Nuclear Information System (INIS)

    Nazirov, N.N.; Kamalov, N.; Norbaev, N.

    1978-01-01

    The effect of radiation on the electrical conductivity of tissues under alternating current, on electrical capacitance, and on cell impedance has been studied. Gamma irradiation of seedlings results in definite changes in the electrical factors of cells (electrical conductivity, electrical capacitance, impedance). It is shown that especially strong changes are revealed during gamma irradiation of the radiosensitive wild form of cotton plants. The deviation of the cells' electrical factors from the standard depends on the disruption of the evolutionarily established ion heterogeneity and of the state of the cell colloid system, which results in changes in their structure and metabolism.

  14. Theory and praxis of map analysis in CHEF part 2: Nonlinear normal form

    International Nuclear Information System (INIS)

    Michelotti, Leo

    2009-01-01

    This is the second of three memos describing how normal form map analysis is implemented in CHEF. The first (1) explained the manipulations required to assure that initial, linear transformations preserved Poincare invariants, thereby confirming correct normalization of action-angle coordinates. In this one, the transformation will be extended to nonlinear terms. The third, describing how the algorithms were implemented within the software of CHEF's libraries, most likely will never be written. The first section, Section 2, quickly lays out preliminary concepts and relationships. In Section 3, we shall review the perturbation theory - an iterative sequence of transformations that converts a nonlinear mapping into its normal form - and examine the equation which moves calculations from one step to the next. Following that is a section titled 'Interpretation', which identifies connections between the normalized mappings and idealized, integrable, fictitious Hamiltonian models. A final section contains closing comments, some of which may - but probably will not - preview work to be done later. My reasons for writing this memo and its predecessor have already been expressed. (1) To them can be added this: 'black box code' encourages users to proceed with little or no understanding of what it does or how it operates. So far, CHEF has avoided this trap admirably by failing to attract potential users. However, we reached a watershed last year: even I now have difficulty following the software through its maze of operations. Extensions to CHEF's physics functionalities, software upgrades, and even simple maintenance are becoming more difficult than they should. I hope these memos will mark parts of the maze for easier navigation in the future. 
Despite appearances to the contrary, I tried to include no (or very little) more than the minimum needed to understand what CHEF's nonlinear analysis modules do. As with the first memo, material has been lifted - and modified - from

  16. Optimization of accelerator parameters using normal form methods on high-order transfer maps

    Energy Technology Data Exchange (ETDEWEB)

    Snopok, Pavel [Michigan State Univ., East Lansing, MI (United States)

    2007-05-01

    Methods of analysis of the dynamics of ensembles of charged particles in collider rings are developed. The following problems are posed and solved using normal form transformations and other methods of perturbative nonlinear dynamics: (1) Optimization of the Tevatron dynamics: (a) Skew quadrupole correction of the dynamics of particles in the Tevatron in the presence of the systematic skew quadrupole errors in dipoles; (b) Calculation of the nonlinear tune shift with amplitude based on the results of measurements and the linear lattice information; (2) Optimization of the Muon Collider storage ring: (a) Computation and optimization of the dynamic aperture of the Muon Collider 50 x 50 GeV storage ring using higher order correctors; (b) 750 x 750 GeV Muon Collider storage ring lattice design matching the Tevatron footprint. The normal form coordinates have a very important advantage over the particle optical coordinates: if the transformation can be carried out successfully (general restrictions for that are not much stronger than the typical restrictions imposed on the behavior of the particles in the accelerator) then the motion in the new coordinates has a very clean representation allowing one to extract more information about the dynamics of particles, and they are very convenient for the purposes of visualization. All the problem formulations include the derivation of the objective functions, which are later used in the optimization process using various optimization algorithms. Algorithms used to solve the problems are specific to collider rings, and applicable to similar problems arising on other machines of the same type. The details of the long-term behavior of the systems are studied to ensure their stability for the desired number of turns. The algorithm of the normal form transformation is of great value for such problems as it gives much extra information about the disturbing factors.
In addition to the fact that the dynamics of particles is represented

  17. Bioactive form of resveratrol in glioblastoma cells and its safety for normal brain cells

    Directory of Open Access Journals (Sweden)

    Xiao-Hong Shu

    2013-05-01

    Background: Resveratrol, a plant polyphenol found in grapes and many other natural foods, possesses a wide range of biological activities including cancer prevention. It has been recognized that resveratrol is intracellularly biotransformed to different metabolites, but no direct evidence has been available to ascertain its bioactive form because of the difficulty of keeping resveratrol unmetabolized in vivo or in vitro. It would therefore be worthwhile to elucidate the potential therapeutic implications of resveratrol metabolism using reliable resveratrol-sensitive cancer cells. Objective: To identify the real biological form of trans-resveratrol and to evaluate the safety of the effective anticancer dose of resveratrol for normal brain cells. Methods: Samples were prepared from the conditioned media and cell lysates of human glioblastoma U251 cells, and were purified by solid phase extraction (SPE). The samples were subjected to high performance liquid chromatography (HPLC) and liquid chromatography/tandem mass spectrometry (LC/MS) analysis. According to the metabolite(s), trans-resveratrol was biotransformed in vitro by the method described elsewhere, and the resulting solution was used to treat U251 cells. Meanwhile, the responses of U251 cells and primarily cultured rat normal brain cells (glial cells and neurons) to 100 μM trans-resveratrol were evaluated by multiple experimental methods. Results: The results revealed that resveratrol monosulfate was the major metabolite in U251 cells. A mixture of about half resveratrol monosulfate was prepared in vitro, and this trans-resveratrol and resveratrol monosulfate mixture showed little inhibitory effect on U251 cells. It was also found that rat primary brain cells (PBCs) not only resist 100 μM but also tolerate as high as 200 μM resveratrol treatment. Conclusions: Our study thus demonstrated that trans-resveratrol was the bioactive form in glioblastoma cells and, therefore, the biotransforming

  18. Estimates and Standard Errors for Ratios of Normalizing Constants from Multiple Markov Chains via Regeneration.

    Science.gov (United States)

    Doss, Hani; Tan, Aixin

    2014-09-01

    In the classical biased sampling problem, we have k densities π_1(·), …, π_k(·), each known up to a normalizing constant, i.e. for l = 1, …, k, π_l(·) = ν_l(·)/m_l, where ν_l(·) is a known function and m_l is an unknown constant. For each l, we have an iid sample from π_l, and the problem is to estimate the ratios m_l/m_s for all l and all s. This problem arises frequently in several situations in both frequentist and Bayesian inference. An estimate of the ratios was developed and studied by Vardi and his co-workers over two decades ago, and there has been much subsequent work on this problem from many different perspectives. In spite of this, there are no rigorous results in the literature on how to estimate the standard error of the estimate. We present a class of estimates of the ratios of normalizing constants that are appropriate for the case where the samples from the π_l's are not necessarily iid sequences, but are Markov chains. We also develop an approach based on regenerative simulation for obtaining standard errors for the estimates of ratios of normalizing constants. These standard error estimates are valid for both the iid case and the Markov chain case.
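    The identity underlying such ratio estimates is easy to see in the iid case: since E under π_2 of ν_1(X)/ν_2(X) equals m_1/m_2, a plain Monte Carlo average estimates the ratio. A sketch under that iid assumption (names and densities are illustrative; the paper's Markov chain and regeneration machinery is not shown):

```python
import math
import random

def ratio_of_normalizers(nu1, nu2, sample2, n=100_000, seed=0):
    """Estimate m1/m2, where pi_l = nu_l/m_l, from draws X ~ pi_2.

    E_{pi_2}[nu1(X)/nu2(X)] = integral of nu1(x)/m2 dx = m1/m2.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = sample2(rng)
        total += nu1(x) / nu2(x)
    return total / n

# Unnormalized Gaussians nu_l(x) = exp(-x^2 / (2 s_l^2)) have m_l = s_l*sqrt(2*pi),
# so the true ratio m1/m2 is s1/s2 = 0.5 here.
s1, s2 = 1.0, 2.0
nu1 = lambda x: math.exp(-x * x / (2 * s1 * s1))
nu2 = lambda x: math.exp(-x * x / (2 * s2 * s2))
est = ratio_of_normalizers(nu1, nu2, lambda rng: rng.gauss(0.0, s2))
print(est)  # close to 0.5
```

    The paper's contribution concerns exactly the part this sketch ignores: valid standard errors when the draws come from Markov chains rather than iid sampling.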

  19. Normal form analysis of linear beam dynamics in a coupled storage ring

    International Nuclear Information System (INIS)

    Wolski, Andrzej; Woodley, Mark D.

    2004-01-01

    The techniques of normal form analysis, well known in the literature, can be used to provide a straightforward characterization of linear betatron dynamics in a coupled lattice. Here, we consider both the beam distribution and the betatron oscillations in a storage ring. We find that the beta functions for uncoupled motion generalize in a simple way to the coupled case. Defined in the way that we propose, the beta functions remain well behaved (positive and finite) under all circumstances, and have essentially the same physical significance for the beam size and betatron oscillation amplitude as in the uncoupled case. Application of this analysis to the online modeling of the PEP-II rings is also discussed

  20. 7 CFR 28.123 - Costs of practical forms of cotton standards.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Costs of practical forms of cotton standards. 28.123 Section 28.123 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE COMMODITY STANDARDS AND STANDARD...

  1. Cephalometric Standards of Pre-University Boys with Normal Occlusion in Hamadan

    Directory of Open Access Journals (Sweden)

    N. Farhadian

    2005-04-01

    The important basis of orthodontic treatment is correct diagnosis. One of the diagnostic tools is the lateral cephalogram. There are some differences in normal standards between different races. The present study was carried out with the aim of determining and assessing the cephalometric standards of boys aged 17 to 20 years in Hamadan. Among 1204 boys of pre-university centers, 27 persons were selected based on IOTN and normal occlusal standards. Lateral cephalograms were obtained in Natural Head Position. 22 cephalometric variables (15 angles, 5 lines, 2 ratios) were determined and measured three times by an orthodontist. Student's t-test was used for analysis. Mean age of the cases was 18.2±1.4 years. The range of the reliability coefficient was 0.901 to 0.986. In comparison with similar studies the following variables were statistically different at the p<0.05 level: Articular Angle = 146, Gonial Angle = 118, NPog-TH = 89, AB-TH = 4.6, L1-TH = 116, GoGn-TH = 20, Ant. Cranial Base = 76 mm. The length of the anterior cranial base in our study was significantly less than the Michigan standards, and there was a tendency toward a straighter profile in this evaluation. In comparison with the Cooke standards there was less protrusion of the mandibular incisors and more counter-clockwise rotation of the mandible. In comparison with a similar study on girls (with normal occlusion, aged 18.2±1.1 years), linear measurements were generally greater in boys. Therefore it is important to consider ethnic and racial variations in the ideal treatment plan.

  2. A Mathematical Framework for Critical Transitions: Normal Forms, Variance and Applications

    Science.gov (United States)

    Kuehn, Christian

    2013-06-01

    Critical transitions occur in a wide variety of applications including mathematical biology, climate change, human physiology and economics. Therefore it is highly desirable to find early-warning signs. We show that it is possible to classify critical transitions by using bifurcation theory and normal forms in the singular limit. Based on this elementary classification, we analyze stochastic fluctuations and calculate scaling laws of the variance of stochastic sample paths near critical transitions for fast-subsystem bifurcations up to codimension two. The theory is applied to several models: the Stommel-Cessi box model for the thermohaline circulation from geoscience, an epidemic-spreading model on an adaptive network, an activator-inhibitor switch from systems biology, a predator-prey system from ecology and to the Euler buckling problem from classical mechanics. For the Stommel-Cessi model we compare different detrending techniques to calculate early-warning signs. In the epidemics model we show that link densities could be better variables for prediction than population densities. The activator-inhibitor switch demonstrates effects in three time-scale systems and points out that excitable cells and molecular units have information for subthreshold prediction. In the predator-prey model explosive population growth near a codimension-two bifurcation is investigated and we show that early-warnings from normal forms can be misleading in this context. In the biomechanical model we demonstrate that early-warning signs for buckling depend crucially on the control strategy near the instability which illustrates the effect of multiplicative noise.
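    The variance-based early-warning sign discussed above can be demonstrated on the simplest caricature: linearizing the fast subsystem near a fold gives an Ornstein-Uhlenbeck process dx = -λx dt + σ dW whose stationary variance σ²/(2λ) grows as the decay rate λ shrinks toward the bifurcation. A hedged sketch (parameters illustrative, not taken from the paper's models):

```python
import random

def ou_variance(lam, sigma=0.1, dt=0.01, steps=200_000, seed=1):
    """Sample variance of an Euler-Maruyama path of dx = -lam*x dt + sigma dW.

    The stationary variance is sigma^2 / (2*lam); it diverges as lam -> 0,
    i.e. as the critical transition is approached.
    """
    rng = random.Random(seed)
    x, xs = 0.0, []
    sqdt = dt ** 0.5
    for _ in range(steps):
        x += -lam * x * dt + sigma * sqdt * rng.gauss(0.0, 1.0)
        xs.append(x)
    mean = sum(xs) / len(xs)
    return sum((v - mean) ** 2 for v in xs) / len(xs)

far = ou_variance(lam=1.0)    # well before the transition
near = ou_variance(lam=0.1)   # close to the transition
print(near > far)  # the variance rises as lam shrinks
```

    As the abstract notes, in full models (multiple time scales, codimension-two points, multiplicative noise) this simple monotone picture can fail, which is why the normal form analysis matters.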

  3. Bladder cancer mapping in Libya based on standardized morbidity ratio and log-normal model

    Science.gov (United States)

    Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley

    2017-05-01

    Disease mapping comprises a set of statistical techniques that produce maps of rates based on estimated mortality, morbidity, and prevalence. A traditional approach to measuring the relative risk of a disease is the Standardized Morbidity Ratio (SMR), the ratio of the observed to the expected number of counts in an area; it has the greatest uncertainty if the disease is rare or if the geographical area is small. Therefore Bayesian models, or statistical smoothing based on the log-normal model, are introduced, which may overcome the problems of the SMR. This study estimates the relative risk for bladder cancer incidence in Libya from 2006 to 2007 based on the SMR and the log-normal model, which were fitted to data using WinBUGS software. The study starts with a brief review of these models, beginning with the SMR method and followed by the log-normal model, which is then applied to bladder cancer incidence in Libya. All results are compared using maps and tables. The study concludes that the log-normal model gives better relative risk estimates than the classical method, and that it can overcome the SMR problem when there is no observed bladder cancer in an area.
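    The SMR itself is just a ratio of observed to expected counts. A minimal sketch (district numbers are hypothetical) that also shows why small expected counts make the classical estimate unstable:

```python
def smr(observed, expected):
    """Standardized Morbidity Ratio: observed count / expected count for an area."""
    if expected <= 0:
        raise ValueError("expected count must be positive")
    return observed / expected

# Two hypothetical districts with the same point estimate of relative risk:
print(smr(120, 80.0))  # 1.5, based on large counts -> relatively stable
print(smr(3, 2.0))     # 1.5, based on tiny counts -> highly uncertain
```

    Smoothing approaches such as the log-normal model borrow strength across areas precisely to stabilize the small-count case, including areas with zero observed cases, where the raw SMR is degenerate.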

  4. Improved overall delivery documentation following implementation of a standardized shoulder dystocia delivery form

    Science.gov (United States)

    Moragianni, Vasiliki A.; Hacker, Michele R.; Craparo, Frank J.

    2013-01-01

    Objective Our objective was to evaluate whether using a standardized shoulder dystocia delivery form improved documentation. A standardized delivery form was added to our institution’s obstetrical record in August 2003. Methods A retrospective cohort study was conducted comparing 100 vaginal deliveries complicated by shoulder dystocia before, and 81 after implementation of the standardized delivery form. The two groups were compared in terms of obstetric characteristics, neonatal outcomes and documentation components. Results Charts that included the standardized delivery form were more likely to contain documentation of estimated fetal weight (82.7% vs. 39.0% without the form, Pdystocia, and second stage duration. Conclusions Inclusion of a standardized form in the delivery record improves the rate of documentation of both shoulder dystocia-specific and general delivery components. PMID:22017330

  5. Child in a Form: The Definition of Normality and Production of Expertise in Teacher Statement Forms--The Case of Northern Finland, 1951-1990

    Science.gov (United States)

    Koskela, Anne; Vehkalahti, Kaisa

    2017-01-01

    This article shows the importance of paying attention to the role of professional devices, such as standardised forms, as producers of normality and deviance in the history of education. Our case study focused on the standardised forms used by teachers during child guidance clinic referrals and transfers to special education in northern Finland,…

  6. Standard heart and vessel size on plain films of normal children

    International Nuclear Information System (INIS)

    Stoever, B.

    1986-01-01

    Standards of heart size, i.e. heart diameters and heart volume, of normal children aged 4-15 years were obtained. In all cases requiring exact heart size determination, heart volume calculation is mandatory in children as well as in adults. Statistical work to date has provided precise calculation of heart volume from plain films in the upright position. Additional plain films in the prone position are unnecessary because no evident orthostatic influence on heart volume in children can be found. Percentiles of normal heart volume related to body weight, representing the best correlation to the individual data, are given, as well as percentiles related to age. Furthermore, ratios of normal vessel size to the height of the 8th thoracic vertebral body, measured on the same plain film, are given. In addition, the ratio of upper to lower lung vessel size is calculated. These ratios are useful criteria in estimating normal vessel size and also in cases with increased pulmonary venous pressure. (orig.) [de

  7. 12 CFR 22.6 - Required use of standard flood hazard determination form.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 1 2010-01-01 2010-01-01 false Required use of standard flood hazard determination form. 22.6 Section 22.6 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY... the Act. The standard flood hazard determination form may be used in a printed, computerized, or...

  8. 41 CFR 101-39.4901 - Obtaining standard and optional forms.

    Science.gov (United States)

    2010-07-01

    ... VEHICLES 39-INTERAGENCY FLEET MANAGEMENT SYSTEMS 39.49-Forms § 101-39.4901 Obtaining standard and optional... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Obtaining standard and optional forms. 101-39.4901 Section 101-39.4901 Public Contracts and Property Management Federal Property...

  9. 76 FR 44965 - Notice of Revision of Standard Forms 39 and 39-A

    Science.gov (United States)

    2011-07-27

    ... OFFICE OF PERSONNEL MANAGEMENT Notice of Revision of Standard Forms 39 and 39-A AGENCY: U.S... Management (OPM) has revised Standard Form (SF) 39, Request For Referral Of Eligibles, and SF 39-A, Request... part 332. The SF 39 outlines instructions to be used by hiring officials to request a list of eligible...

  10. Metacognition and Reading: Comparing Three Forms of Metacognition in Normally Developing Readers and Readers with Dyslexia.

    Science.gov (United States)

    Furnes, Bjarte; Norman, Elisabeth

    2015-08-01

    Metacognition refers to 'cognition about cognition' and includes metacognitive knowledge, strategies and experiences (Efklides, 2008; Flavell, 1979). Research on reading has shown that better readers demonstrate more metacognitive knowledge than poor readers (Baker & Beall, 2009), and that reading ability improves through strategy instruction (Gersten, Fuchs, Williams, & Baker, 2001). The current study is the first to specifically compare the three forms of metacognition in dyslexic (N = 22) versus normally developing readers (N = 22). Participants read two factual texts, with learning outcome measured by a memory task. Metacognitive knowledge and skills were assessed by self-report. Metacognitive experiences were measured by predictions of performance and judgments of learning. Individuals with dyslexia showed insight into their reading problems, but less general knowledge of how to approach text reading. They more often reported lack of available reading strategies, but groups did not differ in the use of deep and surface strategies. Learning outcome and mean ratings of predictions of performance and judgments of learning were lower in dyslexic readers, but not the accuracy with which metacognitive experiences predicted learning. Overall, the results indicate that dyslexic reading and spelling problems are not generally associated with lower levels of metacognitive knowledge, metacognitive strategies or sensitivity to metacognitive experiences in reading situations. © 2015 The Authors. Dyslexia published by John Wiley & Sons Ltd.

  11. Use of newly developed standardized form for interpretation of high-resolution CT in screening for pneumoconiosis

    International Nuclear Information System (INIS)

    Julien, P.J.; Sider, L.; Silverman, J.M.; Dahlgren, J.; Harber, P.; Bunn, W.

    1991-01-01

    This paper reports that although the International Labour Office (ILO) standard for interpretation of the posteroanterior chest radiograph has been available for 10 years, there has been no attempt to standardize high-resolution CT (HRCT) readings for screening of pneumoconiosis. An integrated respiratory surveillance program for 87 workers exposed to inorganic dust was conducted. This program consisted of a detailed occupational exposure history, physical symptoms and signs, spirometry, chest radiography, and HRCT. Two groups of workers with known exposure were studied with HRCT. Group 1 had normal spirometry results and chest radiographs, and group 2 had abnormalities at spirometry or on chest radiographs. The HRCT scans were read independently of the clinical findings and chest radiographs. The HRCT scans were interpreted by using an ILO-based standard form developed by the authors for this project. With the newly developed HRCT form, individual descriptive, abnormality-localized severity, and overall rating systems have been developed and compared for inter- and intraobserver consistency

  12. Optimal transformations leading to normal distributions of positron emission tomography standardized uptake values

    Science.gov (United States)

    Scarpelli, Matthew; Eickhoff, Jens; Cuna, Enrique; Perlman, Scott; Jeraj, Robert

    2018-02-01

    The statistical analysis of positron emission tomography (PET) standardized uptake value (SUV) measurements is challenging due to the skewed nature of SUV distributions. This limits utilization of powerful parametric statistical models for analyzing SUV measurements. An ad hoc approach, which is frequently used in practice, is to blindly use a log transformation, which may or may not result in normal SUV distributions. This study sought to identify optimal transformations leading to normally distributed PET SUVs extracted from tumors and assess the effects of therapy on the optimal transformations. Methods. The optimal transformation for producing normal distributions of tumor SUVs was identified by iterating the Box-Cox transformation parameter (λ) and selecting the parameter that maximized the Shapiro-Wilk P-value. Optimal transformations were identified for tumor SUVmax distributions at both pre and post treatment. This study included 57 patients that underwent 18F-fluorodeoxyglucose (18F-FDG) PET scans (publicly available dataset). In addition, to test the generality of our transformation methodology, we included analysis of 27 patients that underwent 18F-fluorothymidine (18F-FLT) PET scans at our institution. Results. After applying the optimal Box-Cox transformations, neither the pre nor the post treatment 18F-FDG SUV distributions deviated significantly from normality (P > 0.10). Similar results were found for 18F-FLT PET SUV distributions (P > 0.10). For both 18F-FDG and 18F-FLT SUV distributions, the skewness and kurtosis increased from pre to post treatment, leading to a decrease in the optimal Box-Cox transformation parameter from pre to post treatment. There were types of distributions encountered for both 18F-FDG and 18F-FLT where a log transformation was not optimal for providing normal SUV distributions. Conclusion. Optimization of the Box-Cox transformation offers a solution for identifying normal SUV transformations for when
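The λ-search described in the abstract can be sketched as a simple grid search over the Box-Cox parameter, scoring each candidate by the Shapiro-Wilk P-value. This is an illustrative sketch with synthetic data, not the authors' code; the function name, grid range, and sample are assumptions.

```python
import numpy as np
from scipy import stats

def optimal_boxcox(values, lambdas=np.linspace(-2, 2, 401)):
    """Grid-search the Box-Cox parameter lambda that maximizes the
    Shapiro-Wilk P-value of the transformed sample."""
    best_lam, best_p = None, -1.0
    for lam in lambdas:
        transformed = stats.boxcox(values, lmbda=lam)  # requires positive data
        p = stats.shapiro(transformed).pvalue
        if p > best_p:
            best_lam, best_p = lam, p
    return best_lam, best_p

# Synthetic right-skewed "SUVmax" sample (log-normal, as PET SUVs often are)
rng = np.random.default_rng(0)
suvs = rng.lognormal(mean=1.5, sigma=0.6, size=57)
lam, p = optimal_boxcox(suvs)
```

For a log-normal sample the optimal λ should sit near 0, where the Box-Cox transform reduces to the log transform; for other shapes the search can land elsewhere, which is the abstract's point about the log transform not always being optimal.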

  13. Normalize the response of EPID in pursuit of linear accelerator dosimetry standardization.

    Science.gov (United States)

    Cai, Bin; Goddu, S Murty; Yaddanapudi, Sridhar; Caruthers, Douglas; Wen, Jie; Noel, Camille; Mutic, Sasa; Sun, Baozhou

    2018-01-01

    Normalizing the response of the electronic portal imaging device (EPID) is the first step toward an EPID-based standardization of linear accelerator (linac) dosimetry quality assurance. In this study, we describe an approach to generate two-dimensional (2D) pixel sensitivity maps (PSM) for EPID response normalization utilizing an alternative beam and dark-field (ABDF) image acquisition technique and large overlapping field irradiations. The automated image acquisition was performed by XML-controlled machine operation, and the PSM was generated based on a recursive calculation algorithm for Varian linacs equipped with aS1000 and aS1200 imager panels. Cross-comparisons of normalized beam profiles and 1.5%/1.5 mm 1D Gamma analysis were adopted to quantify the improvement of beam profile matching before and after PSM corrections. PSMs were derived for both photon (6, 10, 15 MV) and electron (6, 20 MeV) beams via the proposed method. The PSM-corrected images reproduced a horn-shaped profile for photon beams and relatively uniform profiles for electrons. For dosimetrically matched linacs equipped with aS1000 panels, PSM-corrected images showed increased 1D-Gamma passing rates for all energies, with an average 10.5% improvement for crossline and 37% for inline beam profiles. Similar improvements were observed in the phantom study, with a maximum improvement of 32% for 15 MV and 22% for 20 MeV. The PSM values showed no significant change for all energies over a 3-month period. In conclusion, the proposed approach corrects EPID response for both aS1000 and aS1200 panels. This strategy enables the possibility to standardize linac dosimetry QA and to benchmark linac performance utilizing EPID as the common detector. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  14. Standard-Chinese Lexical Neighborhood Test in normal-hearing young children.

    Science.gov (United States)

    Liu, Chang; Liu, Sha; Zhang, Ning; Yang, Yilin; Kong, Ying; Zhang, Luo

    2011-06-01

    The purposes of the present study were to establish the Standard-Chinese version of the Lexical Neighborhood Test (LNT) and to examine the lexical and age effects on spoken-word recognition in normal-hearing children. Six lists of monosyllabic and six lists of disyllabic words (20 words/list) were selected from the database of daily speech materials for normal-hearing (NH) children of ages 3-5 years. The lists were further divided into "easy" and "hard" halves according to the word frequency and neighborhood density in the database, based on the theory of the Neighborhood Activation Model (NAM). Ninety-six NH children (ages ranging between 4.0 and 7.0 years) were divided into three different age groups at 1-year intervals. Speech-perception tests were conducted using the Standard-Chinese monosyllabic and disyllabic LNT. The inter-list performance was found to be equivalent, and inter-rater reliability was high with 92.5-95% consistency. Results of word-recognition scores showed that the lexical effects were all significant. Children scored higher with disyllabic words than with monosyllabic words. "Easy" words scored higher than "hard" words. The word-recognition performance also increased with age in each lexical category. A multiple linear regression analysis showed that neighborhood density, age, and word frequency appeared to have increasingly more contributions to Chinese word recognition. The results of the present study indicated that performance in Chinese word recognition was influenced by word frequency, age, and neighborhood density, with word frequency playing a major role. These results were consistent with those in other languages, supporting the application of NAM in the Chinese language. The development of the Standard-Chinese version of the LNT and the establishment of a database of children of 4-6 years old can provide a reliable means for spoken-word recognition testing in children with hearing impairment. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  15. Investigation of reliability, validity and normality Persian version of the California Critical Thinking Skills Test; Form B (CCTST

    Directory of Open Access Journals (Sweden)

    Khallli H

    2003-04-01

    Full Text Available Background: To evaluate the effectiveness of the present educational programs in terms of students' achievement of problem solving, decision making and critical thinking skills, reliable, valid and standard instruments are needed. Purposes: To investigate the reliability, validity and norms of the CCTST Form B. The California Critical Thinking Skills Test contains 34 multiple-choice questions, each with one correct answer, in the five Critical Thinking (CT) cognitive skills domains. Methods: The translated CCTST Form B was given to 405 BSN nursing students of nursing faculties located in Tehran (Tehran, Iran and Shahid Beheshti Universities) that were selected through random sampling. In order to determine the face and content validity, the test was translated and edited by Persian and English language professors and researchers. It was also confirmed by the judgments of a panel of medical education experts and psychology professors. CCTST reliability was determined with internal consistency and use of KR-20. The construct validity of the test was investigated with factor analysis, internal consistency and group difference. Results: The test reliability coefficient was 0.62. Factor analysis indicated that the CCTST is formed from five factors (elements), namely: Analysis, Evaluation, Inference, Inductive and Deductive Reasoning. The internal consistency method showed that all subscales have high and positive correlations with the total test score. The group difference method between nursing and philosophy students (n=50) indicated that there is a meaningful difference between nursing and philosophy students' scores (t=-4.95, p=0.0001). Score percentile norms also show that the 50th percentile corresponds to a raw score of 11, and the 95th and 5th percentiles correspond to raw scores of 17 and 6, respectively.
Conclusions: The results revealed that the test is sufficiently reliable as a research tool, all subscales measure a single construct (Critical Thinking), and it is able to distinguish the

  16. Analysis of approaches to classification of forms of non-standard employment

    Directory of Open Access Journals (Sweden)

    N. V. Dorokhova

    2017-01-01

    Full Text Available Non-standard forms of employment are currently becoming more widespread, yet there is no clear approach to the definition and content of non-standard employment. The article analyses diverse interpretations of the concept, on the basis of which the author concludes that precarious employment is a complex and contradictory economic category. It examines different approaches to the classification of forms of precarious employment. The main forms considered are: flexible working year, flexible working week, flexible working hours, remote work, employees on call, shift forwarding, agency employment, self-employment, negotiator work, underemployment, over-employment, employment on the basis of fixed-term contracts, employment based on contracts of a civil-legal nature, one-time employment, casual employment, temporary employment, secondary employment and part-time employment. The author's approach to the classification of non-standard forms of employment is based on identifying the impact of atypical employment on the development of human potential. For the purpose of classifying non-standard employment forms from the standpoint of their impact on human development, the following classification criteria are proposed: working conditions, wages and social guarantees, possibility of workers' participation in management, personal development, and employment stability. Depending on the value of each of these criteria, a given form of non-standard employment can be classified as progressive or regressive. Such a classification of non-standard forms of employment should be the basis of the state policy of employment management.

  17. MRI of the normal appendix in children: data toward a new reference standard

    Energy Technology Data Exchange (ETDEWEB)

    Swenson, David W. [Alpert Medical School of Brown University and Rhode Island Hospital, Department of Diagnostic Imaging, Providence, RI (United States); Schooler, Gary R. [Duke University Medical Center, Department of Radiology, Durham, NC (United States); Stamoulis, Catherine; Lee, Edward Y. [Boston Children's Hospital and Harvard Medical School, Department of Radiology, Boston, MA (United States)

    2016-06-15

    Magnetic resonance imaging (MRI) might prove useful in the diagnostic evaluation of pediatric appendicitis in the effort to avoid exposing children to the ionizing radiation of CT, yet there is a paucity of literature describing the normal range of appearances of the pediatric appendix on MRI. To investigate MRI characteristics of the normal appendix to aid in establishing a reference standard in the pediatric population. We conducted a retrospective study of children and young adults (≤18 years of age) who underwent lumbar spine or pelvis MRI between Jan. 1, 2013, and Dec. 31, 2013, for indications unrelated to appendicitis. Two board-certified radiologists independently reviewed all patients' MRI examinations for appendix visualization, diameter, intraluminal content signal, and presence of periappendiceal inflammation or free fluid. We used the Cohen kappa statistic and Spearman correlation coefficient to assess reader agreement on qualitative and quantitative data, respectively. Three hundred forty-six patients met inclusion criteria. Both readers visualized the appendix in 192/346 (55.5%) patients (kappa = 0.88, P < 0.0001). Estimated median appendix diameter was 5 mm for reader 1 and 6 mm for reader 2 ([25th, 75th] quartiles = [5, 6] mm; range, 2-11 mm; r = 0.81, P < 0.0001). Appendix intraluminal signal characteristics were variable. Periappendiceal inflammation was present in 0/192 (0%) and free fluid in 6/192 (3.1%) MRI examinations (kappa = 1.0). The normal appendix was seen on MRI in approximately half of pediatric patients, with a mean diameter of approximately 5-6 mm, variable intraluminal signal characteristics, no adjacent inflammatory changes, and rare surrounding free fluid. (orig.)

  18. Oblique projections and standard-form transformations for discrete inverse problems

    DEFF Research Database (Denmark)

    Hansen, Per Christian

    2013-01-01

    This tutorial paper considers a specific computational tool for the numerical solution of discrete inverse problems, known as the standard-form transformation, by which we can treat general Tikhonov regularization problems efficiently. In the tradition of B. N. Datta's expositions of numerical linear algebra, we use the close relationship between oblique projections, pseudoinverses, and matrix computations to derive a simple geometric motivation and algebraic formulation of the standard-form transformation.
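For the special case of an invertible regularization matrix L, the standard-form transformation is just the substitution x̄ = Lx, turning the general penalty ‖Lx‖² into the standard one ‖x̄‖². The paper's machinery of oblique projections and pseudoinverses handles the harder rectangular or rank-deficient case; the numerical check below (random illustrative data, names assumed) covers only the invertible case.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(20, 10))   # forward operator
b = rng.normal(size=20)         # data vector (illustrative)
L = np.triu(rng.normal(size=(10, 10))) + 5.0 * np.eye(10)  # invertible regularization matrix
lam = 0.7

# General-form Tikhonov: minimize ||A x - b||^2 + lam^2 ||L x||^2
x_gen = np.linalg.solve(A.T @ A + lam**2 * (L.T @ L), A.T @ b)

# Standard-form transformation: with x_bar = L x, the problem becomes
# minimize ||A_bar x_bar - b||^2 + lam^2 ||x_bar||^2 with A_bar = A L^{-1}
Linv = np.linalg.inv(L)
A_bar = A @ Linv
x_bar = np.linalg.solve(A_bar.T @ A_bar + lam**2 * np.eye(10), A_bar.T @ b)
x_std = Linv @ x_bar   # map the standard-form solution back
```

Both routes solve the same normal equations, so `x_gen` and `x_std` agree to rounding error; in practice one would use a factorization of L rather than an explicit inverse.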

  19. Mandibulary dental arch form differences between level four polynomial method and pentamorphic pattern for normal occlusion sample

    Directory of Open Access Journals (Sweden)

    Y. Yuliana

    2011-07-01

    Full Text Available The aim of an orthodontic treatment is to achieve aesthetics, health of the teeth and surrounding tissues, a functional occlusal relationship, and stability. The success of an orthodontic treatment is influenced by many factors, such as diagnosis and treatment plan. In order to make a diagnosis and a treatment plan, the medical record, clinical examination, radiographic examination, extraoral and intraoral photos, as well as study model analysis are needed. The purpose of this study was to evaluate the differences in dental arch form between the level four polynomial method and the pentamorphic arch form and to determine which one is best suited for a normal occlusion sample. This analytic comparative study was conducted at the Faculty of Dentistry Universitas Padjadjaran on 13 models by comparing the dental arch form obtained using the level four polynomial method based on mathematical calculations, the pentamorphic arch pattern, and mandibular normal occlusion as a control. The results obtained were tested using Student's t-test. The results indicate a significant difference both for the level four polynomial method and for the pentamorphic arch form when compared with the mandibular normal occlusion dental arch form. The level four polynomial fits better, compared to the pentamorphic arch form.
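A "level four polynomial" arch form is a fourth-degree polynomial fitted by least squares to digitized landmark coordinates. The sketch below illustrates the idea; the coordinates are hypothetical, not the study's data.

```python
import numpy as np

# Hypothetical (x, y) landmarks digitized from a mandibular cast, in mm:
# x runs across the arch, y is arch depth at each buccal cusp / incisal point.
x = np.array([-25.0, -18.0, -10.0, 0.0, 10.0, 18.0, 25.0])
y = np.array([0.0, 14.0, 24.0, 28.0, 24.0, 14.0, 0.0])

coeffs = np.polyfit(x, y, 4)   # least-squares fit of a degree-4 ("level four") polynomial
arch = np.poly1d(coeffs)       # evaluate the fitted arch form at any x
```

For a symmetric landmark set like this one, the odd-power coefficients of the fit vanish, so the fitted curve is symmetric about the midline, as an idealized arch form should be.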

  20. Application of Power Geometry and Normal Form Methods to the Study of Nonlinear ODEs

    Science.gov (United States)

    Edneral, Victor

    2018-02-01

    This paper describes power transformations of degenerate autonomous polynomial systems of ordinary differential equations which reduce such systems to a non-degenerate form. An example of constructing exact first integrals of motion of a planar degenerate system in closed form is given.

  1. Application of Power Geometry and Normal Form Methods to the Study of Nonlinear ODEs

    Directory of Open Access Journals (Sweden)

    Edneral Victor

    2018-01-01

    Full Text Available This paper describes power transformations of degenerate autonomous polynomial systems of ordinary differential equations which reduce such systems to a non-degenerate form. An example of constructing exact first integrals of motion of a planar degenerate system in closed form is given.

  2. Best-Matched Internal Standard Normalization in Liquid Chromatography-Mass Spectrometry Metabolomics Applied to Environmental Samples.

    Science.gov (United States)

    Boysen, Angela K; Heal, Katherine R; Carlson, Laura T; Ingalls, Anitra E

    2018-01-16

    The goal of metabolomics is to measure the entire range of small organic molecules in biological samples. In liquid chromatography-mass spectrometry-based metabolomics, formidable analytical challenges remain in removing the nonbiological factors that affect chromatographic peak areas. These factors include sample matrix-induced ion suppression, chromatographic quality, and analytical drift. The combination of these factors is referred to as obscuring variation. Some metabolomics samples can exhibit intense obscuring variation due to matrix-induced ion suppression, rendering large amounts of data unreliable and difficult to interpret. Existing normalization techniques have limited applicability to these sample types. Here we present a data normalization method to minimize the effects of obscuring variation. We normalize peak areas using a batch-specific normalization process, which matches measured metabolites with isotope-labeled internal standards that behave similarly during the analysis. This method, called best-matched internal standard (B-MIS) normalization, can be applied to targeted or untargeted metabolomics data sets and yields relative concentrations. We evaluate and demonstrate the utility of B-MIS normalization using marine environmental samples and laboratory grown cultures of phytoplankton. In untargeted analyses, B-MIS normalization allowed for inclusion of mass features in downstream analyses that would have been considered unreliable without normalization due to obscuring variation. B-MIS normalization for targeted or untargeted metabolomics is freely available at https://github.com/IngallsLabUW/B-MIS-normalization.
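The matching step can be sketched as follows: for each metabolite, choose the internal standard whose normalization minimizes variability across pooled quality-control injections, then report areas normalized by that standard. This is a simplified sketch of the idea, not the published implementation; the function name, data layout, and RSD criterion are assumptions, and the data are toy values.

```python
import numpy as np

def bmis_normalize(metabolite_areas, is_areas, pooled_idx):
    """For each metabolite, pick the internal standard (IS) whose normalization
    minimizes the relative standard deviation (RSD) across pooled injections,
    and return (chosen IS, normalized peak areas)."""
    result = {}
    for met, areas in metabolite_areas.items():
        areas = np.asarray(areas, float)
        best_is, best_rsd = None, np.inf
        for is_name, is_a in is_areas.items():
            normalized = areas / np.asarray(is_a, float)
            pooled = normalized[pooled_idx]          # pooled QC injections only
            rsd = pooled.std() / pooled.mean()       # lower RSD = better-behaved
            if rsd < best_rsd:
                best_is, best_rsd = is_name, rsd
        result[met] = (best_is, areas / np.asarray(is_areas[best_is], float))
    return result

# Toy data: injections 0-2 are pooled QC runs; "metX" drifts exactly like IS1
mets = {"metX": [10.0, 20.0, 40.0, 15.0, 30.0]}
internal_standards = {"IS1": [1.0, 2.0, 4.0, 1.5, 3.0],
                      "IS2": [1.0, 1.0, 1.0, 1.0, 1.0]}
matched = bmis_normalize(mets, internal_standards, pooled_idx=[0, 1, 2])
```

Here dividing by IS1 removes the shared drift, so the QC runs become constant after normalization and IS1 is selected as the best match.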

  3. Cognitive Factors in the Choice of Syntactic Form by Aphasic and Normal Speakers of English and Japanese: The Speaker's Impulse.

    Science.gov (United States)

    Menn, Lise; And Others

    This study examined the role of empathy in the choice of syntactic form and the degree of independence of pragmatic and syntactic abilities in a range of aphasic patients. Study 1 involved 9 English-speaking and 9 Japanese-speaking aphasic subjects with 10 English-speaking and 4 Japanese normal controls. Study 2 involved 14 English- and 6…

  4. A simple global representation for second-order normal forms of Hamiltonian systems relative to periodic flows

    International Nuclear Information System (INIS)

    Avendaño-Camacho, M; Vallejo, J A; Vorobjev, Yu

    2013-01-01

    We study the determination of the second-order normal form for perturbed Hamiltonians relative to the periodic flow of the unperturbed Hamiltonian H_0. The formalism presented here is global, and can be easily implemented in any computer algebra system. We illustrate it by means of two examples: the Hénon-Heiles and the elastic pendulum Hamiltonians. (paper)

  5. Algorithms for finding Chomsky and Greibach normal forms for a fuzzy context-free grammar using an algebraic approach

    Energy Technology Data Exchange (ETDEWEB)

    Lee, E.T.

    1983-01-01

    Algorithms for the construction of the Chomsky and Greibach normal forms for a fuzzy context-free grammar using the algebraic approach are presented and illustrated by examples. The results obtained in this paper may have useful applications in fuzzy languages, pattern recognition, information storage and retrieval, artificial intelligence, database and pictorial information systems. 16 references.
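The core of a Chomsky normal form construction can be sketched in a few lines: lift terminals out of long right-hand sides and binarize rules with more than two symbols. This sketch (names and grammar encoding assumed, not taken from the cited paper) deliberately omits the classical preprocessing steps of removing epsilon- and unit-productions.

```python
def to_cnf(grammar):
    """Partial Chomsky normal form conversion: terminal lifting + binarization.
    Assumes epsilon- and unit-productions were already removed. Grammar is a
    dict mapping a nonterminal to a list of right-hand-side tuples; terminals
    are lowercase strings, nonterminals contain an uppercase letter."""
    new, term_map, counter = {}, {}, [0]

    def fresh():                       # fresh nonterminal for binarization
        counter[0] += 1
        return f"X{counter[0]}"

    def lift(t):                       # introduce T_a -> a for terminal a
        if t not in term_map:
            term_map[t] = f"T_{t}"
            new.setdefault(term_map[t], []).append((t,))
        return term_map[t]

    for lhs, rhss in grammar.items():
        for rhs in rhss:
            if len(rhs) == 1:          # A -> a already in CNF
                new.setdefault(lhs, []).append(rhs)
                continue
            syms = [lift(s) if s.islower() else s for s in rhs]
            while len(syms) > 2:       # A -> B1 B2 ... Bk becomes a chain of pairs
                nt = fresh()
                new.setdefault(nt, []).append((syms[-2], syms[-1]))
                syms = syms[:-2] + [nt]
            new.setdefault(lhs, []).append(tuple(syms))
    return new

# a^n b^n grammar: S -> a S b | a b
cnf = to_cnf({"S": [("a", "S", "b"), ("a", "b")]})
```

After conversion every rule has the CNF shape A → BC or A → a, which is what parsing algorithms such as CYK require.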

  6. Shear Stress-Normal Stress (Pressure) Ratio Decides Forming Callus in Patients with Diabetic Neuropathy

    Science.gov (United States)

    Noguchi, Hiroshi; Takehara, Kimie; Ohashi, Yumiko; Suzuki, Ryo; Yamauchi, Toshimasa; Kadowaki, Takashi; Sanada, Hiromi

    2016-01-01

    Aim. Callus is a risk factor, leading to severe diabetic foot ulcer; thus, prevention of callus formation is important. However, normal stress (pressure) and shear stress associated with callus have not been clarified. Additionally, as new valuables, a shear stress-normal stress (pressure) ratio (SPR) was examined. The purpose was to clarify the external force associated with callus formation in patients with diabetic neuropathy. Methods. The external force of the 1st, 2nd, and 5th metatarsal head (MTH) as callus predilection regions was measured. The SPR was calculated by dividing shear stress by normal stress (pressure), concretely, peak values (SPR-p) and time integral values (SPR-i). The optimal cut-off point was determined. Results. Callus formation region of the 1st and 2nd MTH had high SPR-i rather than noncallus formation region. The cut-off value of the 1st MTH was 0.60 and the 2nd MTH was 0.50. For the 5th MTH, variables pertaining to the external forces could not be determined to be indicators of callus formation because of low accuracy. Conclusions. The callus formation cut-off values of the 1st and 2nd MTH were clarified. In the future, it will be necessary to confirm the effect of using appropriate footwear and gait training on lowering SPR-i. PMID:28050567
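The SPR computation can be sketched numerically. The abstract is terse about whether peaks and time integrals are taken before or after dividing; the sketch below assumes SPR-p divides the peak values and SPR-i divides the time-integral values, and the stance-phase traces are hypothetical.

```python
import numpy as np

def _trapezoid(y, dx):
    """Trapezoidal time integral of a sampled trace."""
    y = np.asarray(y, float)
    return (y[:-1] + y[1:]).sum() * dx / 2.0

def spr(shear, pressure, dt):
    """Shear stress-normal stress (pressure) ratio:
    SPR-p from peak values, SPR-i from time-integral values (assumed reading)."""
    shear = np.asarray(shear, float)
    pressure = np.asarray(pressure, float)
    spr_p = shear.max() / pressure.max()
    spr_i = _trapezoid(shear, dt) / _trapezoid(pressure, dt)
    return spr_p, spr_i

# Hypothetical stance-phase traces under the 1st metatarsal head (kPa, 0.6 s stance)
t = np.linspace(0.0, 0.6, 61)
pressure = 200.0 * np.sin(np.pi * t / 0.6) ** 2
shear = 130.0 * np.sin(np.pi * t / 0.6) ** 2
spr_p, spr_i = spr(shear, pressure, t[1] - t[0])
```

With these proportional traces both ratios come out at 0.65, which would fall above the reported 1st-MTH cut-off of 0.60 and thus on the callus-forming side.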

  7. Shear Stress-Normal Stress (Pressure) Ratio Decides Forming Callus in Patients with Diabetic Neuropathy

    Directory of Open Access Journals (Sweden)

    Ayumi Amemiya

    2016-01-01

    Full Text Available Aim. Callus is a risk factor, leading to severe diabetic foot ulcer; thus, prevention of callus formation is important. However, normal stress (pressure) and shear stress associated with callus have not been clarified. Additionally, as a new variable, a shear stress-normal stress (pressure) ratio (SPR) was examined. The purpose was to clarify the external force associated with callus formation in patients with diabetic neuropathy. Methods. The external force of the 1st, 2nd, and 5th metatarsal head (MTH) as callus predilection regions was measured. The SPR was calculated by dividing shear stress by normal stress (pressure), concretely, peak values (SPR-p) and time integral values (SPR-i). The optimal cut-off point was determined. Results. The callus formation region of the 1st and 2nd MTH had higher SPR-i than the noncallus formation region. The cut-off value of the 1st MTH was 0.60 and that of the 2nd MTH was 0.50. For the 5th MTH, variables pertaining to the external forces could not be determined to be indicators of callus formation because of low accuracy. Conclusions. The callus formation cut-off values of the 1st and 2nd MTH were clarified. In the future, it will be necessary to confirm the effect of using appropriate footwear and gait training on lowering SPR-i.

  8. Review of clinically accessible methods to determine lean body mass for normalization of standardized uptake values

    International Nuclear Information System (INIS)

    DEVRIESE, Joke; POTTEL, Hans; BEELS, Laurence; MAES, Alex; VAN DE WIELE, Christophe; GHEYSENS, Olivier

    2016-01-01

    With the routine use of 2-deoxy-2-[18F]-fluoro-D-glucose (18F-FDG) positron emission tomography/computed tomography (PET/CT) scans, metabolic activity of tumors can be quantitatively assessed through calculation of SUVs. One possible normalization parameter for the standardized uptake value (SUV) is lean body mass (LBM), which is generally calculated through predictive equations based on height and body weight. (Semi-)direct measurements of LBM could provide more accurate results in cancer populations than predictive equations based on healthy populations. In this context, four methods to determine LBM are reviewed: bioelectrical impedance analysis, dual-energy X-ray absorptiometry, CT, and magnetic resonance imaging. These methods were selected based on clinical accessibility and are compared in terms of methodology, precision and accuracy. By assessing each method's specific advantages and limitations, a well-considered choice of method can hopefully lead to more accurate SUVLBM values, hence more accurate quantitative assessment of 18F-FDG PET images.
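The normalization itself is simple arithmetic: SUV divides the tissue activity concentration by injected dose per unit of normalization mass. The sketch below uses the Janmahasatian predictive equations as an example of the height/weight formulas the review contrasts with (semi-)direct measurements; the function names and the illustrative numbers are assumptions.

```python
def lbm_janmahasatian(weight_kg, height_m, sex):
    """Predictive lean body mass (Janmahasatian et al. equations),
    one widely used height/weight-based alternative to direct measurement."""
    bmi = weight_kg / height_m ** 2
    if sex == "male":
        return 9270.0 * weight_kg / (6680.0 + 216.0 * bmi)
    return 9270.0 * weight_kg / (8780.0 + 244.0 * bmi)

def suv(conc_kbq_ml, dose_mbq, norm_mass_kg):
    """SUV = tissue concentration / (injected dose / normalization mass),
    assuming 1 g/mL tissue density. Pass LBM as the mass to get SUV_LBM."""
    return conc_kbq_ml / (dose_mbq * 1000.0 / (norm_mass_kg * 1000.0))

# Illustrative patient: 80 kg, 1.80 m, male, 370 MBq injected, 5 kBq/mL in tumor
lbm = lbm_janmahasatian(80.0, 1.80, "male")
suv_lbm = suv(5.0, 370.0, lbm)     # LBM-normalized SUV
suv_bw = suv(5.0, 370.0, 80.0)     # body-weight SUV for comparison
```

Since LBM is smaller than body weight, SUV_LBM comes out lower than the body-weight SUV; the review's point is that the accuracy of SUV_LBM hinges on how LBM itself is obtained.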

  9. Center manifolds, normal forms and bifurcations of vector fields with application to coupling between periodic and steady motions

    Science.gov (United States)

    Holmes, Philip J.

    1981-06-01

    We study the instabilities known to aeronautical engineers as flutter and divergence. Mathematically, these states correspond to bifurcations to limit cycles and multiple equilibrium points in a differential equation. Making use of the center manifold and normal form theorems, we concentrate on the situation in which flutter and divergence become coupled, and show that there are essentially two ways in which this is likely to occur. In the first case the system can be reduced to an essential model which takes the form of a single degree of freedom nonlinear oscillator. This system, which may be analyzed by conventional phase-plane techniques, captures all the qualitative features of the full system. We discuss the reduction and show how the nonlinear terms may be simplified and put into normal form. Invariant manifold theory and the normal form theorem play a major role in this work and this paper serves as an introduction to their application in mechanics. Repeating the approach in the second case, we show that the essential model is now three dimensional and that far more complex behavior is possible, including nonperiodic and ‘chaotic’ motions. Throughout, we take a two degree of freedom system as an example, but the general methods are applicable to multi- and even infinite degree of freedom problems.

  10. Generation of Strategies for Environmental Deception in Two-Player Normal-Form Games

    Science.gov (United States)

    2015-06-18

    found in the literature is presented by Kohlberg and Mertens [23]. A stable equilibrium by their definition is an equilibrium in an extensive-form...the equilibrium in this state provides them with an increased payoff. While interesting, Kohlberg and Mertens' definition of equilibrium...stability used by Kohlberg and Mertens. Arsham's work focuses on determining the amount by which a mixed-strategy Nash equilibrium's payoff values can

  11. Syntactic Dependencies and Verbal Inflection: Complementisers and Verbal Forms in Standard Arabic

    Directory of Open Access Journals (Sweden)

    Feras Saeed

    2015-12-01

    Full Text Available This paper investigates the syntactic dependency between complementisers and verbal forms in Standard Arabic and provides a new analysis of this dependency. The imperfective verb in this language surfaces with three different forms, where each form is indicated by a different suffixal marker attached to the end of the verb as (-u), (-a), or (-Ø). The occurrence of each suffixal marker on the verb corresponds to the co-occurrence of a particular type of Comp-element in the C/T domain. I argue that these morphological markers on the three verbal forms are the manifestation of an Agree relation between an interpretable unvalued finiteness feature [Fin] on C and an uninterpretable but valued instance of the same feature on v, assuming feature transfer and feature sharing between C/T and v (Pesetsky & Torrego 2007; Chomsky 2008). I also argue that the different verbal forms in Standard Arabic are dictated by the co-occurrence of three types of Comp-elements: (i) C-elements; (ii) T-elements which ultimately move to C; and (iii) imperative/negative elements. Keywords: feature transfer/sharing, verbal forms, complementisers, finiteness, syntactic dependency, Standard Arabic

  12. FDG-PET of patients with suspected renal failure. Standardized uptake values in normal tissues

    International Nuclear Information System (INIS)

    Minamimoto, Ryogo; Takahashi, Nobukazu; Inoue, Tomio

    2007-01-01

    This study aims to clarify the effect of renal function on 2-[18F]fluoro-2-deoxy-D-glucose positron emission tomography (FDG-PET) imaging and determine the clinical significance of renal function in this setting. We compared FDG distribution between normal volunteers and patients with suspected renal failure. Twenty healthy volunteers and 20 patients with suspected renal failure who underwent FDG-PET between November 2002 and May 2005 were selected for this study. We define "patients with suspected renal failure" as having a blood serum creatinine level in excess of 1.1 mg/dl. The serum creatinine level was examined once within 2 weeks of the FDG-PET study. Regions of interest were placed over 15 regions for semi-quantitative analysis: the white matter, cortex, both upper lung fields, both middle lung fields, both lower lung fields, mediastinum, myocardium of the left ventricle, the left atrium as a cardiac blood pool, central region of the right lobe of the liver, left kidney, and both femoris muscles. The mean standardized uptake values (SUVs) of brain cortex and white matter were higher in healthy volunteers than in renal patients. The mean SUVs of the mediastinum at the level of the aortic arch and of the left atrium as a cardiac blood pool were lower in healthy volunteers than in patients with suspected renal failure. These regions differed between healthy volunteers and patients with suspected renal failure (P<0.05). We found decreasing brain accumulation and increasing blood-pool accumulation of FDG in patients with high plasma creatinine. Although the difference is small, this phenomenon will not have a huge effect on the assessment of FDG-PET imaging in patients with suspected renal failure. (author)

  13. First-order systems of linear partial differential equations: normal forms, canonical systems, transform methods

    Directory of Open Access Journals (Sweden)

    Heinz Toparkus

    2014-04-01

    Full Text Available In this paper we consider first-order systems with constant coefficients for two real-valued functions of two real variables. This is both a problem in itself and an alternative view of the classical linear partial differential equations of second order with constant coefficients. The classification of the systems is done using elementary methods of linear algebra. Each type has its own canonical form in the associated characteristic coordinate system. Initial-value problems can then be formulated on appropriate basic domains, and solutions to these problems can be sought by means of transform methods.
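
    The linear-algebra classification the abstract alludes to can be sketched as follows. This is a generic illustration under the usual convention for a 2x2 constant-coefficient system u_t + A u_x = 0 (real distinct eigenvalues of A give a hyperbolic system, complex eigenvalues an elliptic one), not code from the paper:

    ```python
    # Classify a 2x2 first-order system u_t + A u_x = 0 by the eigenvalues
    # of A, read off from the discriminant of its characteristic polynomial.
    # Conventional textbook classification; not the paper's own algorithm.

    def classify(a11: float, a12: float, a21: float, a22: float) -> str:
        tr = a11 + a22
        det = a11 * a22 - a12 * a21
        disc = tr * tr - 4.0 * det  # discriminant of lambda^2 - tr*lambda + det
        if disc > 0:
            return "hyperbolic"      # real, distinct eigenvalues
        if disc < 0:
            return "elliptic"        # complex-conjugate eigenvalues
        return "parabolic/degenerate"  # repeated eigenvalue

    print(classify(0, 1, 1, 0))   # wave-type system -> hyperbolic
    print(classify(0, -1, 1, 0))  # Cauchy-Riemann-type system -> elliptic
    ```

    The two examples correspond to the second-order wave and Laplace equations rewritten as first-order systems.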

  14. Informed consent for clinical trials: a comparative study of standard versus simplified forms.

    Science.gov (United States)

    Davis, T C; Holcombe, R F; Berkel, H J; Pramanik, S; Divers, S G

    1998-05-06

    A high level of reading skill and comprehension is necessary to understand and complete most consent forms that are required for participation in clinical research studies. This study was conducted to test the hypothesis that a simplified consent form would be less intimidating and more easily understood by individuals with low-to-marginal reading skills. During July 1996, 183 adults (53 patients with cancer or another medical condition and 130 apparently healthy participants) were tested for reading ability and then asked to read either the standard Southwestern Oncology Group (SWOG) consent form (16th grade level) or a simplified form (7th grade level) developed at Louisiana State University Medical Center-Shreveport (LSU). Participants were interviewed to assess their attitudes toward and comprehension of the form read. Then they were given the alternate consent form and asked which one they preferred and why. Overall, participants preferred the LSU form (62%; 95% confidence interval [CI] = 54.8%-69.2%) over the SWOG form (38%; 95% CI = 30.8%-45.2%) (P = .0033). Nearly all participants thought that the LSU form was easier to read (97%; 95% CI = 93.1%-99.9%) than the SWOG form (75%; 95% CI = 65.1%-85.7%). These results support the use of simplified informed consent documents for the substantial proportion of Americans with low-to-marginal literacy skills.

  15. 48 CFR 53.301-252 - Standard Form 252, Architect-Engineer Contract.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Standard Form 252, Architect-Engineer Contract. 53.301-252 Section 53.301-252 Federal Acquisition Regulations System FEDERAL..., Architect-Engineer Contract. EC01MY91.035 EC01MY91.036 ...

  16. A structure-preserving approach to normal form analysis of power systems; Una propuesta de preservacion de estructura al analisis de su forma normal en sistemas de potencia

    Energy Technology Data Exchange (ETDEWEB)

    Martinez Carrillo, Irma

    2008-01-15

    Power system dynamic behavior is inherently nonlinear and is driven by different processes at different time scales. The size and complexity of these mechanisms has stimulated the search for methods that reduce the original dimension but retain a certain degree of accuracy. In this dissertation, a novel nonlinear dynamical analysis method for the analysis of large amplitude oscillations that embraces ideas from normal form theory and singular perturbation techniques is proposed. This approach allows the full potential of the normal form method to be reached, and is suitably general for application to a wide variety of nonlinear systems. Drawing on the formal theory of dynamical systems, a structure-preserving model of the system is developed that preserves network and load characteristics. By exploiting the separation of fast and slow time scales of the model, an efficient approach based on singular perturbation techniques is then derived for constructing a nonlinear power system representation that accurately preserves network structure. The method requires no reduction of the constraint equations and therefore gives information about the effect of network and load characteristics on system behavior. Analytical expressions are then developed that provide approximate solutions to system performance near a singularity, and techniques for interpreting these solutions in terms of modal functions are given. New insights into the nature of nonlinear oscillations are also offered and criteria for characterizing network effects on nonlinear system behavior are proposed. Theoretical insight into the behavior of dynamic coupling of differential-algebraic equations and the origin of nonlinearity is given, and implications for the design and placement of power system controllers in complex nonlinear systems are discussed. The extent of applicability of the proposed procedure is demonstrated by analyzing nonlinear behavior in two realistic test power systems.

  17. Post-UV colony-forming ability of normal fibroblast strains and of the xeroderma pigmentosum group G strain

    International Nuclear Information System (INIS)

    Barrett, S.F.; Tarone, R.E.; Moshell, A.N.; Ganges, M.B.; Robbins, J.H.

    1981-01-01

    In xeroderma pigmentosum, an inherited disorder of defective DNA repair, post-uv colony-forming ability of fibroblasts from patients in complementation groups A through F correlates with the patients' neurological status. The first xeroderma pigmentosum patient assigned to the recently discovered group G had the neurological abnormalities of XP. Researchers have determined the post-uv colony-forming ability of cultured fibroblasts from this patient and from 5 more control donors. Log-phase fibroblasts were irradiated with 254 nm uv light from a germicidal lamp, trypsinized, and replated at known densities. After 2 to 4 weeks' incubation the cells were fixed, stained and scored for colony formation. The strains' post-uv colony-forming ability curves were obtained by plotting the log of the percent remaining post-uv colony-forming ability as a function of the uv dose. The post-uv colony-forming ability of 2 of the 5 new normal strains was in the previously defined control donor zone, but that of the other 3 extended down to the level of the most resistant xeroderma pigmentosum strain. The post-uv colony-forming ability curve of the group G fibroblasts was not significantly different from the curves of the group D fibroblast strains from patients with clinical histories similar to that of the group G patient.

  18. The method of normal forms for singularly perturbed systems of Fredholm integro-differential equations with rapidly varying kernels

    Energy Technology Data Exchange (ETDEWEB)

    Bobodzhanov, A A; Safonov, V F [National Research University " Moscow Power Engineering Institute" , Moscow (Russian Federation)

    2013-07-31

    The paper deals with extending the Lomov regularization method to classes of singularly perturbed Fredholm-type integro-differential systems which have not so far been studied. In these systems the limiting operator is discretely noninvertible. Such systems are commonly known as problems with unstable spectrum. Separating out the essential singularities in the solutions to these problems presents great difficulties. The principal one is to give an adequate description of the singularities induced by 'instability points' of the spectrum. A methodology for separating singularities by using normal forms is developed. It is applied to systems of the above type and substantiated for them. Bibliography: 10 titles.

  19. Overburden Stress Normalization and Rod Length Corrections for the Standard Penetration Test (SPT)

    OpenAIRE

    Deger, Tonguc Tolga

    2014-01-01

    The Standard Penetration Test (SPT) has been a staple of geotechnical engineering practice for more than 70 years. Empirical correlations based on in situ SPT data provide an important basis for assessment of a broad range of engineering parameters, and for empirically based analysis and design methods spanning a significant number of areas of geotechnical practice. Despite this longstanding record of usage, the test itself is relatively poorly standardized with regard to the allowable variab...

  20. 33 CFR Appendix - List of FPC Standard Articles Forms Used in Permits and Licenses for Hydroelectric Projects

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false List of FPC Standard Articles Forms Used in Permits and Licenses for Hydroelectric Projects Navigation and Navigable Waters CORPS OF... Forms Used in Permits and Licenses for Hydroelectric Projects The following FPC standard articles Forms...

  1. Quantitative Analysis of Torso FDG-PET Scans by Using Anatomical Standardization of Normal Cases from Thorough Physical Examinations.

    Directory of Open Access Journals (Sweden)

    Takeshi Hara

    Full Text Available Understanding of the standardized uptake value (SUV) of 2-deoxy-2-[18F]fluoro-D-glucose positron emission tomography (FDG-PET) depends on the background accumulation of glucose, because the SUV often varies with the status of the patient. The purpose of this study was to develop a new method for quantitative analysis of the SUV of FDG-PET scan images. The method included an anatomical standardization and a statistical comparison with normal cases using the Z-score, as is often done in the SPM or 3D-SSP approach for brain function analysis. Our scheme consisted of two approaches: the construction of a normal model, and the determination of SUV scores as a Z-score index measuring the abnormality of an FDG-PET scan image. To construct the normal torso model, all of the normal images were registered into one shape, which indicated the normal range of the SUV at all voxels. The image deformation process consisted of a whole-body rigid registration of the shoulder-to-bladder region, a liver registration, and a non-linear registration of the body surface using the thin-plate spline technique. In order to validate the usefulness of our method, we segmented suspicious regions on FDG-PET images manually and obtained the Z-scores of the regions based on the corresponding voxels, which store the mean and standard deviation from the normal model. We collected 243 normal cases (143 males and 100 females) to construct the normal model. We also extracted 432 abnormal spots from 63 abnormal cases (73 cancer lesions) to validate the Z-scores. The Z-scores of 417 out of 432 abnormal spots were higher than 2.0, which statistically indicated the severity of the spots. In conclusion, the Z-scores obtained by our computerized scheme with anatomical standardization of the torso region would be useful for visualization and detection of subtle lesions on FDG-PET scan images even when the SUV may not clearly show an abnormality.
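
    The Z-score index described above can be sketched in a few lines. The data here are synthetic and the array shapes invented for illustration; the actual method operates on anatomically standardized 3D volumes:

    ```python
    import numpy as np

    # Sketch of the record's Z-score idea: a "normal model" stores the
    # voxel-wise mean and standard deviation of SUV over registered normal
    # scans; a new scan's abnormality map is its voxel-wise Z-score.
    # Synthetic 1D data stand in for registered 3D volumes.

    rng = np.random.default_rng(0)
    normals = rng.normal(loc=1.0, scale=0.2, size=(243, 64))  # 243 normal scans, 64 voxels

    model_mean = normals.mean(axis=0)
    model_sd = normals.std(axis=0, ddof=1)

    scan = np.full(64, 1.0)
    scan[10] = 2.5                       # a lesion-like hot spot
    z = (scan - model_mean) / model_sd   # voxel-wise Z-score map

    print(z[10] > 2.0)  # True: the hot spot exceeds the paper's 2.0 threshold
    ```

    The 2.0 cutoff mirrors the threshold the authors used to flag 417 of 432 abnormal spots.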

  2. Imagine-Self Perspective-Taking and Rational Self-Interested Behavior in a Simple Experimental Normal-Form Game

    Directory of Open Access Journals (Sweden)

    Adam Karbowski

    2017-09-01

    Full Text Available The purpose of this study is to explore the link between imagine-self perspective-taking and rational self-interested behavior in experimental normal-form games. Drawing on the concept of sympathy developed by Adam Smith and further literature on perspective-taking in games, we hypothesize that introduction of imagine-self perspective-taking by decision-makers promotes rational self-interested behavior in a simple experimental normal-form game. In our study, we examined behavior of 404 undergraduate students in the two-person game, in which the participant can suffer a monetary loss only if she plays her Nash equilibrium strategy and the opponent plays her dominated strategy. Results suggest that the threat of suffering monetary losses effectively discourages the participants from choosing Nash equilibrium strategy. In general, players may take into account that opponents choose dominated strategies due to specific not self-interested motivations or errors. However, adopting imagine-self perspective by the participants leads to more Nash equilibrium choices, perhaps by alleviating participants’ attributions of susceptibility to errors or non-self-interested motivation to the opponents.

  3. Imagine-Self Perspective-Taking and Rational Self-Interested Behavior in a Simple Experimental Normal-Form Game.

    Science.gov (United States)

    Karbowski, Adam; Ramsza, Michał

    2017-01-01

    The purpose of this study is to explore the link between imagine-self perspective-taking and rational self-interested behavior in experimental normal-form games. Drawing on the concept of sympathy developed by Adam Smith and further literature on perspective-taking in games, we hypothesize that introduction of imagine-self perspective-taking by decision-makers promotes rational self-interested behavior in a simple experimental normal-form game. In our study, we examined behavior of 404 undergraduate students in the two-person game, in which the participant can suffer a monetary loss only if she plays her Nash equilibrium strategy and the opponent plays her dominated strategy. Results suggest that the threat of suffering monetary losses effectively discourages the participants from choosing Nash equilibrium strategy. In general, players may take into account that opponents choose dominated strategies due to specific not self-interested motivations or errors. However, adopting imagine-self perspective by the participants leads to more Nash equilibrium choices, perhaps by alleviating participants' attributions of susceptibility to errors or non-self-interested motivation to the opponents.
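
    The structure of the game described in both records above can be illustrated with a hypothetical payoff bimatrix; the study's actual payoffs are not reproduced here, so the numbers below are invented to satisfy the stated property (the Nash strategy risks a loss only against the opponent's dominated strategy):

    ```python
    # Illustrative 2x2 bimatrix (NOT the study's actual game).
    # Row actions: "N" (Nash strategy) or "S" (safe strategy).
    # Column actions: "C" (equilibrium strategy) or "D" (strictly dominated).

    row_payoff = {  # (row action, column action) -> row player's payoff
        ("N", "C"): 5, ("N", "D"): -2,
        ("S", "C"): 3, ("S", "D"): 1,
    }
    col_payoff = {
        ("N", "C"): 5, ("N", "D"): 2,
        ("S", "C"): 4, ("S", "D"): 1,
    }

    # D is strictly dominated by C for the column player:
    dominated = all(col_payoff[(r, "C")] > col_payoff[(r, "D")] for r in ("N", "S"))
    print(dominated)  # True

    # Against a rational (C-playing) opponent, N is the row best response...
    print(row_payoff[("N", "C")] > row_payoff[("S", "C")])  # True
    # ...but N exposes the row player to a monetary loss if the opponent errs:
    print(row_payoff[("N", "D")] < 0)  # True
    ```

    This is exactly the tension the experiment probes: choosing N is rational only insofar as one trusts the opponent not to play D, which imagine-self perspective-taking apparently strengthens.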

  4. Towards continuous improvement of endoscopy standards: Validation of a colonoscopy assessment form.

    LENUS (Irish Health Repository)

    2012-02-01

    Aim: Assessment of procedural colonoscopy skills is important and topical. The aim of this study was to develop and validate a competency-based colonoscopy assessment form that would be easy to use, suitable for the assessment of junior and senior endoscopists, and potentially a useful instrument for detecting differences in performance standards following different training interventions. Method: A standardised assessment form was developed incorporating a checklist with dichotomous yes/no responses and a global assessment section incorporating several different elements. This form was used prospectively to evaluate colonoscopy cases during the period of the study in several university teaching hospitals. Results were analysed using ANOVA with Bonferroni corrections for post-hoc analysis. Results: 81 procedures were assessed, performed by eight consultant and 19 trainee endoscopists. There were no serious errors. When divided into three groups based on previous experience (novice, intermediate and expert), the assessment form demonstrated statistically significant differences between all three groups (p<0.05). When separate elements were taken into account, the global assessment section was a better discriminator of skill level than the checklist. Conclusion: This form is a valid, easy-to-use assessment method. We intend to use it to assess the value of simulator training in trainee endoscopists. It also has the potential to be a useful training tool when feedback is given to the trainee.

  5. Standard Test Method for Normal Spectral Emittance at Elevated Temperatures of Nonconducting Specimens

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1971-01-01

    1.1 This test method describes an accurate technique for measuring the normal spectral emittance of electrically nonconducting materials in the temperature range from 1000 to 1800 K, and at wavelengths from 1 to 35 μm. It is particularly suitable for measuring the normal spectral emittance of materials such as ceramic oxides, which have relatively low thermal conductivity and are translucent to appreciable depths (several millimetres) below the surface, but which become essentially opaque at thicknesses of 10 mm or less. 1.2 This test method requires expensive equipment and rather elaborate precautions, but produces data that are accurate to within a few percent. It is particularly suitable for research laboratories, where the highest precision and accuracy are desired, and is not recommended for routine production or acceptance testing. Because of its high accuracy, this test method may be used as a reference method to be applied to production and acceptance testing in case of dispute. 1.3 This test metho...

  6. 41 CFR 102-194.30 - What role does my agency play in the Standard and Optional Forms Management Program?

    Science.gov (United States)

    2010-07-01

    ... What role does my agency play in the Standard and Optional Forms Management Program? Your agency head... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What role does my agency play in the Standard and Optional Forms Management Program? 102-194.30 Section 102-194.30 Public...

  7. 41 CFR 304-6.5 - What guidelines must we follow when using the Standard Form (SF) 326?

    Science.gov (United States)

    2010-07-01

    ... REQUIREMENTS 6-PAYMENT GUIDELINES Reports § 304-6.5 What guidelines must we follow when using the Standard Form... 41 Public Contracts and Property Management 4 2010-07-01 2010-07-01 false What guidelines must we follow when using the Standard Form (SF) 326? 304-6.5 Section 304-6.5 Public Contracts and Property...

  8. Normalization of cortical thickness measurements across different T1 magnetic resonance imaging protocols by novel W-Score standardization.

    Science.gov (United States)

    Chung, Jinyong; Yoo, Kwangsun; Lee, Peter; Kim, Chan Mi; Roh, Jee Hoon; Park, Ji Eun; Kim, Sang Joon; Seo, Sang Won; Shin, Jeong-Hyeon; Seong, Joon-Kyung; Jeong, Yong

    2017-10-01

    The use of different 3D T1-weighted magnetic resonance (T1 MR) imaging protocols induces image incompatibility across multicenter studies, negating the many advantages of multicenter studies. A few methods have been developed to address this problem, but significant image incompatibility still remains. Thus, we developed a novel and convenient method to improve image compatibility. W-score standardization creates quality reference values by using a healthy group to obtain normalized disease values. We developed a protocol-specific w-score standardization to control the protocol effect, which is applied to each protocol separately. We used three data sets. In dataset 1, brain T1 MR images of normal controls (NC) and patients with Alzheimer's disease (AD) from two centers, acquired with different T1 MR protocols, were used (Protocol 1 and 2, n = 45/group). In dataset 2, data from six subjects, who underwent MRI with two different protocols (Protocol 1 and 2), were used with different repetition times, echo times, and slice thicknesses. In dataset 3, T1 MR images from a large number of healthy normal controls (Protocol 1: n = 148, Protocol 2: n = 343) were collected for w-score standardization. The protocol effect and disease effect on subjects' cortical thickness were analyzed before and after the application of protocol-specific w-score standardization. As expected, different protocols resulted in differing cortical thickness measurements in both NC and AD subjects. Different measurements were obtained for the same subject when imaged with different protocols. Multivariate pattern difference between measurements was observed between the protocols. Classification accuracy between two protocols was nearly 90%. After applying protocol-specific w-score standardization, the differences between the protocols substantially decreased. Most importantly, protocol-specific w-score standardization reduced both univariate and multivariate differences in the images while
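
    A minimal sketch of protocol-specific w-score standardization, assuming a simple age-based reference model and entirely synthetic data (the study's actual covariates and fitting details are not reproduced here):

    ```python
    import numpy as np

    # Protocol-specific w-score sketch: for EACH protocol, fit a reference
    # model on that protocol's healthy controls, then express any subject's
    # cortical thickness as w = (observed - predicted) / sd(residuals).
    # Age as the sole covariate and all numbers are illustrative assumptions.

    rng = np.random.default_rng(1)

    def fit_reference(age, thickness):
        slope, intercept = np.polyfit(age, thickness, 1)
        resid = thickness - (slope * age + intercept)
        return slope, intercept, resid.std(ddof=1)

    def w_score(model, age, thickness):
        slope, intercept, sd = model
        return (thickness - (slope * age + intercept)) / sd

    # Separate healthy-control samples per protocol; the different offsets
    # (2.5 vs 2.7 mm) mimic a protocol effect on measured thickness.
    age1 = rng.uniform(20, 80, 148)
    m1 = fit_reference(age1, 2.5 - 0.005 * age1 + rng.normal(0, 0.1, 148))
    age2 = rng.uniform(20, 80, 343)
    m2 = fit_reference(age2, 2.7 - 0.005 * age2 + rng.normal(0, 0.1, 343))

    # The same "true" subject measured under each protocol receives
    # comparable (near-zero) w-scores once each protocol uses its own model.
    print(w_score(m1, 60.0, 2.5 - 0.005 * 60.0))
    print(w_score(m2, 60.0, 2.7 - 0.005 * 60.0))
    ```

    Fitting per protocol is what absorbs the protocol offset; a single pooled reference model would leave it in the scores.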

  9. Standard test method for static leaching of monolithic waste forms for disposal of radioactive waste

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This test method provides a measure of the chemical durability of a simulated or radioactive monolithic waste form, such as a glass, ceramic, cement (grout), or cermet, in a test solution at temperatures <100°C under low specimen surface- area-to-leachant volume (S/V) ratio conditions. 1.2 This test method can be used to characterize the dissolution or leaching behaviors of various simulated or radioactive waste forms in various leachants under the specific conditions of the test based on analysis of the test solution. Data from this test are used to calculate normalized elemental mass loss values from specimens exposed to aqueous solutions at temperatures <100°C. 1.3 The test is conducted under static conditions in a constant solution volume and at a constant temperature. The reactivity of the test specimen is determined from the amounts of components released and accumulated in the solution over the test duration. A wide range of test conditions can be used to study material behavior, includin...
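
    The normalized elemental mass loss mentioned in 1.2 is conventionally computed as NL_i = (c_i · V)/(f_i · S). A hedged sketch with illustrative numbers (not values from the standard):

    ```python
    # Normalized elemental mass loss for a static leach test:
    #     NL_i = (c_i * V) / (f_i * S)    [g/m^2]
    # c_i : element concentration in the leachate (g/m^3)
    # V   : leachant volume (m^3)
    # f_i : mass fraction of element i in the waste form (dimensionless)
    # S   : specimen surface area (m^2)
    # Conventional formula; the example values are invented.

    def normalized_mass_loss(c_i: float, volume: float,
                             f_i: float, surface_area: float) -> float:
        return (c_i * volume) / (f_i * surface_area)

    # Example: 10 g/m^3 of Si in 5e-5 m^3 of leachant, Si mass fraction 0.25,
    # specimen area 4e-4 m^2.
    print(normalized_mass_loss(10.0, 5e-5, 0.25, 4e-4))  # 5.0 g/m^2
    ```

    Dividing by the element's mass fraction f_i is what makes releases of different elements from the same specimen directly comparable.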

  10. Towards reporting standards for neuropsychological study results: A proposal to minimize communication errors with standardized qualitative descriptors for normalized test scores.

    Science.gov (United States)

    Schoenberg, Mike R; Rum, Ruba S

    2017-11-01

    Rapid, clear and efficient communication of neuropsychological results is essential to benefit patient care. Errors in communication are a leading cause of medical errors; nevertheless, there remains a lack of consistency in how neuropsychological scores are communicated. A major limitation in the communication of neuropsychological results is the inconsistent use of qualitative descriptors for standardized test scores and the use of vague terminology. A PubMed search from 1 Jan 2007 to 1 Aug 2016 was conducted to identify guidelines or consensus statements for the description and reporting of qualitative terms used to communicate neuropsychological test scores. The review found the use of confusing and overlapping terms to describe various ranges of percentile standardized test scores. In response, we propose a simplified set of qualitative descriptors for normalized test scores (Q-Simple) as a means to reduce errors in communicating test results. The Q-Simple qualitative terms are: 'very superior', 'superior', 'high average', 'average', 'low average', 'borderline' and 'abnormal/impaired'. A case example illustrates the proposed Q-Simple qualitative classification system to communicate neuropsychological results for neurosurgical planning. The Q-Simple qualitative descriptor system is aimed as a means to improve and standardize communication of standardized neuropsychological test scores. Further research is needed to evaluate neuropsychological communication errors. Conveying the clinical implications of neuropsychological results in a manner that minimizes risk for communication errors is a quintessential component of evidence-based practice. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Testing of Software Routine to Determine Deviate and Cumulative Probability: ModStandardNormal Version 1.0

    International Nuclear Information System (INIS)

    A.H. Monib

    1999-01-01

    The purpose of this calculation is to document that the software routine ModStandardNormal Version 1.0, which is a Visual Fortran 5.0 module, provides correct results for a normal distribution up to five significant figures (three significant figures at the function tails) for a specified range of input parameters. The software routine may be used for quality affecting work. Two types of output are generated in ModStandardNormal: a deviate, x, given a cumulative probability, p, between 0 and 1; and a cumulative probability, p, given a deviate, x, between -8 and 8. This calculation supports Performance Assessment, under Technical Product Development Plan, TDP-EBS-MD-000006 (Attachment I, DIRS 3) and is written in accordance with the AP-3.12Q Calculations procedure (Attachment I, DIRS 4).
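
    The two operations the routine performs (cumulative probability for a given deviate, and the inverse) can be reproduced with Python's standard library for comparison. This is an independent illustration of the same mathematics, not the Fortran module itself:

    ```python
    from statistics import NormalDist

    # Standard normal distribution: the two operations the record describes.
    std_normal = NormalDist(mu=0.0, sigma=1.0)

    p = std_normal.cdf(1.0)        # cumulative probability p for deviate x = 1
    x = std_normal.inv_cdf(0.975)  # deviate x for cumulative probability p = 0.975

    print(round(p, 5))  # 0.84134
    print(round(x, 4))  # 1.96
    ```

    Checking a handful of such reference points against tabulated values is essentially the verification strategy the calculation documents.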

  12. Standard forms and entanglement engineering of multimode Gaussian states under local operations

    International Nuclear Information System (INIS)

    Serafini, Alessio; Adesso, Gerardo

    2007-01-01

    We investigate the action of local unitary operations on multimode (pure or mixed) Gaussian states and single out the minimal number of locally invariant parameters which completely characterize the covariance matrix of such states. For pure Gaussian states, central resources for continuous-variable quantum information, we investigate separately the parameter reduction due to the additional constraint of global purity, and the one following from the local-unitary freedom. Counting arguments and insights from the phase-space Schmidt decomposition, and in general from the framework of symplectic analysis, accompany our description of the standard form of pure n-mode Gaussian states. In particular, we clarify why only in pure states with n ≤ 3 modes can all the direct correlations between position and momentum operators be set to zero by local unitary operations. For any n, the emerging minimal set of parameters contains complete information about all forms of entanglement in the corresponding states. An efficient state engineering scheme (able to encode direct correlations between position and momentum operators as well) is proposed to produce entangled multimode Gaussian resources, its number of optical elements matching the minimal number of locally invariant degrees of freedom of general pure n-mode Gaussian states. Finally, we demonstrate that so-called 'block-diagonal' Gaussian states, without direct correlations between position and momentum, are systematically less entangled, on average, than arbitrary pure Gaussian states.

  13. Enhancement of cemented waste forms by supercritical CO2 carbonation of standard portland cements

    International Nuclear Information System (INIS)

    Rubin, J.B.; Carey, J.; Taylor, C.M.V.

    1997-01-01

    We are conducting experiments on an innovative transformation concept, using a traditional immobilization technique, that may significantly reduce the volume of hazardous or radioactive waste requiring transport and long-term storage. The standard practice for the stabilization of radioactive salts and residues is to mix them with cements, which may include additives to enhance immobilization. Many of these wastes do not qualify for underground disposition, however, because they do not meet disposal requirements for free liquids, decay heat, head-space gas analysis, and/or leachability. The treatment method alters the bulk properties of a cemented waste form by greatly accelerating the natural cement-aging reactions, producing a chemically stable form having reduced free liquids, as well as reduced porosity, permeability and pH. These structural and chemical changes should allow for greater actinide loading, as well as the reduced mobility of the anions, cations, and radionuclides in aboveground and underground repositories. Simultaneously, the treatment process removes a majority of the hydrogenous material from the cement. The treatment method allows for on-line process monitoring of leachates and can be transported into the field. We will describe the general features of supercritical fluids, as well as the application of these fluids to the treatment of solid and semi-solid waste forms. Some of the issues concerning the economic feasibility of industrial scale-up will be addressed, with particular attention to the engineering requirements for the establishment of on-site processing facilities. Finally, the initial results of physical property measurements made on portland cements before and after supercritical fluid processing will be presented.

  14. Solitary-wave families of the Ostrovsky equation: An approach via reversible systems theory and normal forms

    International Nuclear Information System (INIS)

    Roy Choudhury, S.

    2007-01-01

    The Ostrovsky equation is an important canonical model for the unidirectional propagation of weakly nonlinear long surface and internal waves in a rotating, inviscid and incompressible fluid. Limited functional analytic results exist for the occurrence of one family of solitary-wave solutions of this equation, as well as their approach to the well-known solitons of the famous Korteweg-de Vries equation in the limit as the rotation becomes vanishingly small. Since solitary-wave solutions often play a central role in the long-time evolution of an initial disturbance, we consider such solutions here (via the normal form approach) within the framework of reversible systems theory. Besides confirming the existence of the known family of solitary waves and its reduction to the KdV limit, we find a second family of multihumped (or N-pulse) solutions, as well as a continuum of delocalized solitary waves (or homoclinics to small-amplitude periodic orbits). On isolated curves in the relevant parameter region, the delocalized waves reduce to genuine embedded solitons. The second and third families of solutions occur in regions of parameter space distinct from the known solitary-wave solutions and are thus entirely new. Directions for future work are also mentioned.

  15. An approach to normal forms of Kuramoto model with distributed delays and the effect of minimal delay

    Energy Technology Data Exchange (ETDEWEB)

    Niu, Ben, E-mail: niubenhit@163.com [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China); Guo, Yuxiao [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China); Jiang, Weihua [Department of Mathematics, Harbin Institute of Technology, Harbin 150001 (China)

    2015-09-25

    Heterogeneous delays with positive lower bound (gap) are taken into consideration in the Kuramoto model. On the Ott–Antonsen manifold, the dynamical transition from incoherence to coherence is mediated by Hopf bifurcation. We establish a perturbation technique on the complex domain, by which universal normal forms, stability and criticality of the Hopf bifurcation are obtained. Theoretically, a hysteresis loop is found near the subcritically bifurcated coherent state. With respect to Gamma-distributed delay with fixed mean and variance, we find that a large gap decreases the Hopf bifurcation value, induces supercritical bifurcations, avoids the hysteresis loop and significantly increases the number of coexisting coherent states. The effect of the gap is finally interpreted from the viewpoint of excess kurtosis of the Gamma distribution. - Highlights: • Heterogeneously delay-coupled Kuramoto model with minimal delay is considered. • Perturbation technique on complex domain is established for bifurcation analysis. • Hysteresis phenomenon is investigated in a theoretical way. • The effect of excess kurtosis of distributed delays is discussed.

  16. 41 CFR 101-26.4901-149 - Standard Form 149, U.S. Government National Credit Card.

    Science.gov (United States)

    2010-07-01

    ... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Standard Form 149, U.S. Government National Credit Card. 101-26.4901-149 Section 101-26.4901-149 Public Contracts and Property... 149, U.S. Government National Credit Card. Note: The form illustrated in § 101-26.4901-149 is filed as...

  17. A case study: forming an effective quality management system according to ISO 9000 standards

    OpenAIRE

    Zağyapan, Orhan

    1995-01-01

    Ankara : The Faculty of Management and the Graduate School of Business Administration of Bilkent Univ., 1995. Thesis (Master's) -- Bilkent University, 1995. Includes bibliographical references (leaves 87-88). In today's world, companies which adapt themselves to certain internationally recognized standards are one step ahead of their competitors. ISO 9000 Quality System Standards have captured the most attention of all. The aim of the standard is to provide an international bench...

  18. Adherence to a Standardized Order Form for Gastric Cancer in a Referral Chemotherapy Teaching Hospital, Mashhad, Iran

    Directory of Open Access Journals (Sweden)

    Mitra Asgarian

    2017-09-01

    Full Text Available Background: Standardized forms for prescription and medication administration are one solution for reducing medication errors in the chemotherapy process. Gastric cancer is the most common cancer in Iran. In this study, we attempted to design and validate a standard printed chemotherapy form and evaluate adherence by oncologists and nurses to this form. Methods: We performed this cross-sectional study in a teaching hospital in Mashhad, Iran from August 2015 until January 2016. A clinical pharmacist designed the chemotherapy form, which included various demographic and clinical parameters and approved chemotherapy regimens for gastric cancer. Clinical oncologists who worked in this center validated the form. We included all eligible patients. A pharmacy student identified adherence by the oncologists and nurses to this form and probable medication errors. Results are mean ± standard deviation, or number (percentage) for nominal variables. Data analysis was performed using the SPSS 16.0 statistical package. Results: We evaluated 54 patients and a total of 249 chemotherapy courses. In 146 (58.63%) chemotherapy sessions, the administered regimens lacked compatibility with the standard form. Approximately 66% of recorded errors occurred in the prescription phase and the remainder during the administration phase. The most common errors were improper dose (61%) and wrong infusion time (34%). We observed that 37 dose calculation errors occurred in 32 chemotherapy sessions. Conclusions: In general, adherence by oncologists and nurses to the developed form for chemotherapy treatment of gastric cancer was not acceptable. These findings indicate the necessity of a standardized order sheet to simplify the chemotherapy process for clinicians and reduce prescription and administration errors.

  19. The Standard Model in noncommutative geometry: fundamental fermions as internal forms

    Science.gov (United States)

    Dąbrowski, Ludwik; D'Andrea, Francesco; Sitarz, Andrzej

    2018-05-01

    Given the algebra, Hilbert space H, grading and real structure of the finite spectral triple of the Standard Model, we classify all possible Dirac operators such that H is a self-Morita equivalence bimodule for the associated Clifford algebra.

  20. A hybrid electron and photon IMRT planning technique that lowers normal tissue integral patient dose using standard hardware.

    Science.gov (United States)

    Rosca, Florin

    2012-06-01

    To present a mixed electron and photon IMRT planning technique, using electron beams with an energy range of 6-22 MeV and standard hardware, that minimizes the integral dose to patients for targets as deep as 7.5 cm. Ten brain cases, two lung cases, a thyroid, an abdominal, and a parotid case were planned using two planning techniques: a photon-only IMRT plan (IMRT) versus a mixed-modality treatment (E+IMRT) that includes an en face electron beam and a photon IMRT portion that ensures uniform target coverage. The electron beam is delivered using a regular cutout placed in an electron cone. The electron energy was chosen to provide a good trade-off between minimizing integral dose and generating a uniform, deliverable plan. The authors chose electron energies that cover the deepest part of the PTV with the 65%-70% isodose line. The normal tissue integral dose, the dose to ring structures around the PTV, and the volumes of the 75%, 50%, and 25% isosurfaces were used to compare the dose distributions generated by the two planning techniques. The normal tissue integral dose was lowered by about 20% by the E+IMRT plans compared with the photon-only IMRT plans for most studied cases. With the exception of the lungs, the dose reduction associated with the E+IMRT plans was more pronounced farther away from the target. The average dose ratios delivered to the 0-2 cm and the 2-4 cm ring structures for brain patients for the two planning techniques were 89.6% and 70.8%, respectively. The enhanced dose sparing away from the target for the brain patients can also be observed in the ratio of the 75%, 50%, and 25% isodose line volumes for the two techniques, which decreases from 85.5% to 72.6% and further to 65.1%, respectively. For the lungs, the lateral electron beams used in the E+IMRT plans were perpendicular to the mostly anterior/posterior photon beams, generating much more conformal plans. The authors proved that even using the existing electron delivery hardware, a mixed electron/photon planning

  1. Clarifying Normalization

    Science.gov (United States)

    Carpenter, Donald A.

    2008-01-01

    Confusion exists among database textbooks as to the goal of normalization as well as to which normal form a designer should aspire. This article discusses such discrepancies with the intention of simplifying normalization for both teacher and student. This author's industry and classroom experiences indicate such simplification yields quicker…

  2. Molecular Form Differences Between Prostate-Specific Antigen (PSA) Standards Create Quantitative Discordances in PSA ELISA Measurements

    Science.gov (United States)

    McJimpsey, Erica L.

    2016-01-01

    The prostate-specific antigen (PSA) assays currently employed for the detection of prostate cancer (PCa) lack the specificity needed to differentiate PCa from benign prostatic hyperplasia and have high false positive rates. The PSA calibrants used to create calibration curves in these assays are typically purified from seminal plasma and contain many molecular forms (intact PSA and cleaved subforms). The purpose of this study was to determine if the composition of the PSA molecular forms found in these PSA standards contributes to the lack of PSA test reliability. To this end, seminal plasma purified PSA standards from different commercial sources were investigated by western blot (WB) and in multiple research grade PSA ELISAs. The WB results revealed that all of the PSA standards contained different mass concentrations of intact and cleaved molecular forms. Increased mass concentrations of intact PSA yielded higher immunoassay absorbance values, even between lots from the same manufacturer. Standardization of the molecular form mass concentrations and purification methods of seminal plasma derived PSA calibrants will assist in closing the gaps in PCa testing measurements that require the use of PSA values (such as % free PSA and the Prostate Health Index) by increasing the accuracy of the calibration curves. PMID:26911983
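The calibration-curve step at the heart of such immunoassays can be sketched as a simple least-squares fit; the concentrations and absorbance readings below are invented for illustration, not measured values:

```python
import numpy as np

# Hypothetical ELISA calibration: absorbance readings at known PSA
# mass concentrations (ng/mL).  All values are invented.
conc = np.array([0.0, 1.0, 2.5, 5.0, 10.0, 20.0])
absorbance = np.array([0.05, 0.15, 0.33, 0.61, 1.18, 2.30])

# Least-squares fit of a linear calibration curve  A = m*c + b
m, b = np.polyfit(conc, absorbance, 1)

def concentration_from_absorbance(a):
    """Invert the calibration curve to estimate a sample's concentration."""
    return (a - b) / m

sample_abs = 0.75
print(f"estimated PSA: {concentration_from_absorbance(sample_abs):.2f} ng/mL")
```

A calibrant lot with a higher fraction of intact PSA gives a higher absorbance per unit mass, i.e. a steeper slope m, which is how the molecular-form differences described above propagate into discordant results between assays and lots.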

  4. Using open technical e-learning standards and service-orientation to support new forms of e-assessment

    NARCIS (Netherlands)

    Miao, Yongwu; Tattersall, Colin; Schoonenboom, Judith; Stevanov, Krassen; Aleksieva-Petrova, Adelina

    2007-01-01

    Miao, Y., Tattersall, C., Schoonenboom, J., Stevanov, K., & Aleksieva-Petrova, A. (2007). Using open technical e-learning standards and service-orientation to support new forms of e-assessment. In D. Griffiths, R. Koper & O. Liber (Eds.), Proceedings of the second TENCompetence Open Workshop on

  5. Development of Abbreviated Nine-Item Forms of the Raven's Standard Progressive Matrices Test

    Science.gov (United States)

    Bilker, Warren B.; Hansen, John A.; Brensinger, Colleen M.; Richard, Jan; Gur, Raquel E.; Gur, Ruben C.

    2012-01-01

    The Raven's Standard Progressive Matrices (RSPM) is a 60-item test for measuring abstract reasoning, considered a nonverbal estimate of fluid intelligence, and often included in clinical assessment batteries and research on patients with cognitive deficits. The goal was to develop and apply a predictive model approach to reduce the number of items…

  6. 76 FR 50117 - Commission Rules and Forms Related to the FASB's Accounting Standards Codification

    Science.gov (United States)

    2011-08-12

    .... generally accepted accounting principles (``U.S. GAAP''). Statement No. 168 became effective for financial... Codification'' is a registered trademark of the Financial Accounting Foundation. DATES: Effective Date: August... accounting principles established by a standard-setting body that meets specified criteria. On April 25, 2003...

  7. Birkhoff normalization

    NARCIS (Netherlands)

    Broer, H.; Hoveijn, I.; Lunter, G.; Vegter, G.

    2003-01-01

    The Birkhoff normal form procedure is a widely used tool for approximating a Hamiltonian system by a simpler one. This chapter starts with an introduction to Hamiltonian mechanics, followed by an explanation of the Birkhoff normal form procedure. Finally, we discuss several algorithms for
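For context, the classical statement being referenced (in generic notation, not necessarily the chapter's) is that near a nonresonant elliptic equilibrium a symplectic change of coordinates brings the Hamiltonian, up to fourth order, to a function of the action variables alone:

```latex
H = \sum_{j} \omega_j I_j
  \;+\; \tfrac{1}{2}\sum_{j,k} a_{jk}\, I_j I_k
  \;+\; O\!\left(\|(q,p)\|^{6}\right),
\qquad
I_j = \tfrac{1}{2}\left(q_j^2 + p_j^2\right).
```

The truncated normal form is integrable (the actions $I_j$ are conserved), which is what makes it a useful simpler approximation of the original system.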

  8. Standard test method for splitting tensile strength for brittle nuclear waste forms

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1989-01-01

    1.1 This test method is used to measure the static splitting tensile strength of cylindrical specimens of brittle nuclear waste forms. It provides splitting tensile-strength data that can be used to compare the strength of waste forms when tests are done on one size of specimen. 1.2 The test method is applicable to glass, ceramic, and concrete waste forms that are sufficiently homogeneous (Note 1) but not to coated-particle, metal-matrix, bituminous, or plastic waste forms, or concretes with large-scale heterogeneities. Cementitious waste forms with heterogeneities >1 to 2 mm and 5 mm can be tested using this procedure provided the specimen size is increased from the reference size of 12.7 mm diameter by 6 mm length, to 51 mm diameter by 100 mm length, as recommended in Test Method C 496 and Practice C 192. Note 1—Generally, the specimen structural or microstructural heterogeneities must be less than about one-tenth the diameter of the specimen. 1.3 This test method can be used as a quality control chec...

  9. BIOCHEMICAL EFFECTS IN NORMAL AND STONE FORMING RATS TREATED WITH THE RIPE KERNEL JUICE OF PLANTAIN (MUSA PARADISIACA)

    Science.gov (United States)

    Devi, V. Kalpana; Baskar, R.; Varalakshmi, P.

    1993-01-01

    The effect of Musa paradisiaca stem kernel juice was investigated in experimental urolithiatic rats. Stone-forming rats exhibited a significant elevation in the activities of two oxalate-synthesizing enzymes, glycollic acid oxidase and lactate dehydrogenase. Deposition and excretion of stone-forming constituents in kidney and urine were also increased in these rats. The enzyme activities and the level of crystalline components were lowered with the extract treatment. The extract also reduced the activities of urinary alkaline phosphatase, lactate dehydrogenase, γ-glutamyl transferase, inorganic pyrophosphatase and β-glucuronidase in calculogenic rats. No appreciable changes were noticed in leucine aminopeptidase activity in treated rats. PMID:22556626

  10. 7 CFR 1724.70 - Standard forms of contracts for borrowers.

    Science.gov (United States)

    2010-01-01

    ... required to use in the planning, design, and construction of their electric systems. Borrowers are not required to use these guidance contract forms in the absence of an agreement to do so. [63 FR 58284, Oct... construction, procurement, engineering services, and architectural services financed by a loan made or...

  11. The impact of a standardized consultation form for facial trauma on billing and evaluation and management levels.

    Science.gov (United States)

    Levesque, Andre Y; Tauber, David M; Lee, Johnson C; Rodriguez-Feliz, Jose R; Chao, Jerome D

    2014-02-01

    Facial trauma is among the most frequent consultations encountered by plastic surgeons. Unfortunately, reimbursement from these consultations can be low, and qualified plastic surgeons may exclude facial trauma from their practice. An audit of our records found insufficient documentation to justify higher evaluation and management (EM) levels of service, resulting in lower reimbursement. Utilizing a standardized consultation form can improve documentation, resulting in higher billing and EM levels. A facial trauma consultation form was developed in conjunction with the billing department. Three plastic surgery residents completed 30 consultations without the aid of the consult form, followed by 30 consultations with the aid of the form. The EM levels and billing data for each consultation were obtained from the billing department for analysis. The two groups were compared using χ² analysis and t-tests to determine statistical significance. Using our standardized consultation form, the mean EM level increased from 2.97 to 3.60 (P = 0.002). In addition, the mean billed amount increased from $391 to $501 per consult (P = 0.051), representing a 28% increase in billing. In our institution, the development and implementation of a facial trauma consultation form has resulted in more complete documentation and a subsequent increase in EM levels and billed services.

  12. Standardizing data exchange for clinical research protocols and case report forms: An assessment of the suitability of the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM).

    Science.gov (United States)

    Huser, Vojtech; Sastry, Chandan; Breymaier, Matthew; Idriss, Asma; Cimino, James J

    2015-10-01

    Efficient communication of a clinical study protocol and case report forms during all stages of a human clinical study is important for many stakeholders. An electronic, structured study representation format that can be used throughout the whole study life-span can improve such communication and potentially lower total study costs. The most relevant standard for representing clinical study data, applicable to unregulated as well as regulated studies, is the Operational Data Model (ODM), in development since 1999 by the Clinical Data Interchange Standards Consortium (CDISC). ODM's initial objective was the exchange of case report form data, but it is increasingly utilized in other contexts. An ODM extension called the Study Design Model, introduced in 2011, provides additional protocol representation elements. Using a case study approach, we evaluated ODM's ability to capture all necessary protocol elements during a complete clinical study lifecycle in the Intramural Research Program of the National Institutes of Health. ODM offers the advantage of a single format for institutions that deal with hundreds or thousands of concurrent clinical studies and maintain a data warehouse for these studies. For each study stage, we present a list of gaps in the ODM standard and identify necessary vendor or institutional extensions that can compensate for such gaps. The current version of ODM (1.3.2) has only partial support for study protocol and study registration data, mainly because these lie outside its original development goal. ODM provides comprehensive support for the representation of case report forms (both in the design stage and with patient-level data). Including the requirements of observational, non-regulated or investigator-initiated studies (outside Food and Drug Administration (FDA) regulation) could further improve future revisions of the standard. Published by Elsevier Inc.
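To give a flavour of what an ODM-based representation looks like, the sketch below assembles a minimal, illustrative metadata fragment. The element and attribute names follow ODM 1.3 conventions (FormDef, ItemGroupDef, ItemDef, OID attributes), but the OIDs and content are invented and the result is not claimed to be schema-valid:

```python
import xml.etree.ElementTree as ET

# Simplified, illustrative ODM-like metadata fragment (not schema-valid;
# namespaces and many required elements are omitted for brevity).
odm = ET.Element("ODM", FileOID="Study001.Metadata", ODMVersion="1.3.2")
study = ET.SubElement(odm, "Study", OID="Study001")
mdv = ET.SubElement(study, "MetaDataVersion", OID="MDV.1", Name="Draft 1")

# A case report form referencing one item group with one item.
form = ET.SubElement(mdv, "FormDef", OID="F.VITALS", Name="Vital Signs", Repeating="No")
ET.SubElement(form, "ItemGroupRef", ItemGroupOID="IG.VITALS", Mandatory="Yes")
ig = ET.SubElement(mdv, "ItemGroupDef", OID="IG.VITALS", Name="Vitals", Repeating="No")
ET.SubElement(ig, "ItemRef", ItemOID="IT.SYSBP", Mandatory="Yes")
ET.SubElement(mdv, "ItemDef", OID="IT.SYSBP", Name="Systolic BP", DataType="integer")

xml_text = ET.tostring(odm, encoding="unicode")
print(xml_text)
```

The OID-based cross-references (ItemGroupRef pointing at ItemGroupDef, ItemRef at ItemDef) are what let one metadata definition be reused across forms and studies, which is the design property the paper leans on when proposing ODM as a single whole-lifecycle format.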

  13. Zirconium sponge and other forms of virgin metal for nuclear applications - approved standard 1973

    International Nuclear Information System (INIS)

    Anon.

    1975-01-01

    This specification covers virgin zirconium metal commonly designated as sponge because of its porous, sponge-like texture, but it may also take other forms such as chunklets. One grade is described which is designated as reactor grade R-1, suitable for use in nuclear applications. The most important characteristic of the reactor grade is its low nuclear cross section as achieved by removal of hafnium and careful quality control in manufacturing procedures to prevent contamination with other high cross section materials

  14. The Effect of Normal Force on the Tribocorrosion Behaviour of Ti-10Zr Alloy and a Porous TiO2-ZrO2 Thin Film Formed Electrochemically

    Science.gov (United States)

    Dănăilă, E.; Benea, L.

    2017-06-01

    The tribocorrosion behaviour of Ti-10Zr alloy and of a porous TiO2-ZrO2 thin film electrochemically formed on Ti-10Zr alloy was evaluated in Fusayama-Mayer artificial saliva solution. Tribocorrosion experiments were performed using a unidirectional pin-on-disc experimental set-up, mechanically and electrochemically instrumented, under various solicitation conditions. The effect of the applied normal force on the tribocorrosion performance of the tested materials was determined. Open circuit potential (OCP) measurements performed before, during and after the sliding tests were used to determine the tribocorrosion degradation. The applied normal force was found to greatly affect the potential during tribocorrosion experiments: an increase in the normal force induces a decrease in potential, accelerating the depassivation of the materials studied. The results show a decrease in friction coefficient with gradually increasing normal load. It was proved that the porous TiO2-ZrO2 thin film electrochemically formed on Ti-10Zr alloy leads to an improvement in tribocorrosion resistance compared with the non-anodized Ti-10Zr alloy intended for biomedical applications.

  15. A Hard X-Ray Study of the Normal Star-Forming Galaxy M83 with NuSTAR

    DEFF Research Database (Denmark)

    Yukita, M.; Hornschemeier, A. E.; Lehmer, B. D.

    2016-01-01

    We present the results from sensitive, multi-epoch NuSTAR observations of the late-type star-forming galaxy M83 (d = 4.6 Mpc). This is the first investigation to spatially resolve the hard (E > 10 keV) X-ray emission of this galaxy. The nuclear region and ∼20 off-nuclear point sources, including a previously discovered ultraluminous X-ray source, are detected in our NuSTAR observations. The X-ray hardnesses and luminosities of the majority of the point sources are consistent with hard X-ray sources resolved in the starburst galaxy NGC 253. We infer that the hard X-ray emission is most...

  16. Dynamic pathways to mediate reactions buried in thermal fluctuations. I. Time-dependent normal form theory for multidimensional Langevin equation.

    Science.gov (United States)

    Kawai, Shinnosuke; Komatsuzaki, Tamiki

    2009-12-14

    We present a novel theory which enables us to explore the mechanism of reaction selectivity and robust functions in complex systems persisting under thermal fluctuation. The theory constructs a nonlinear coordinate transformation so that the equation of motion for the new reaction coordinate is independent of the other nonreactive coordinates in the presence of thermal fluctuation. In this article we suppose that reacting systems subject to thermal noise are described by a multidimensional Langevin equation, without an a priori assumption about the form of the potential. The reaction coordinate is composed not only of all the coordinates and velocities associated with the system (solute) but also of the random force exerted by the environment (solvent) with friction constants. The sign of the reaction coordinate at any instantaneous moment in the region of a saddle determines the fate of the reaction, i.e., whether the reaction will proceed through to the products or go back to the reactants. By assuming the statistical properties of the random force, one can know a priori a well-defined boundary of the reaction which separates the full position-velocity space in the saddle region into mainly reactive and mainly nonreactive regions even under thermal fluctuation. The analytical expression of the reaction coordinate provides a firm foundation for understanding how and why reactions proceed in thermally fluctuating environments.
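Schematically, the class of systems treated is a multidimensional Langevin equation of the form (generic notation, not necessarily the paper's):

```latex
\ddot{q}_i = -\frac{\partial V}{\partial q_i}(\mathbf{q})
\;-\; \sum_j \gamma_{ij}\,\dot{q}_j \;+\; \xi_i(t),
\qquad
\left\langle \xi_i(t)\,\xi_j(t') \right\rangle
  = 2\,\gamma_{ij}\,k_B T\,\delta(t - t'),
```

where $V$ is the (unspecified) potential, $\gamma_{ij}$ are friction constants, and $\xi_i$ is the random force satisfying the fluctuation-dissipation relation. Near a saddle of $V$, the transformation described above builds a reaction coordinate out of the $q_i$, $\dot{q}_i$ and the realization of $\xi$, whose sign at a given instant determines whether the trajectory proceeds to products or returns to reactants.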

  17. Intercoder Reliability of Mapping Between Pharmaceutical Dose Forms in the German Medication Plan and EDQM Standard Terms.

    Science.gov (United States)

    Sass, Julian; Becker, Kim; Ludmann, Dominik; Pantazoglou, Elisabeth; Dewenter, Heike; Thun, Sylvia

    2018-01-01

    A nationally uniform medication plan has recently become part of German legislation. The specification for the German medication plan was developed in cooperation between various stakeholders of the healthcare system. Its goal is to enhance usability and interoperability while also providing patients and physicians with the information they require for a safe and high-quality therapy. Within the research and development project named Medication Plan PLUS, the specification of the medication plan was tested and reviewed, in particular for semantic interoperability. In this study, the list of pharmaceutical dose forms provided in the specification was mapped to the standard terms of the European Directorate for the Quality of Medicines & HealthCare (EDQM) by different coders. The level of agreement between coders was calculated using Cohen's Kappa (κ). Results show that less than half of the dose forms could be coded with EDQM standard terms. In addition, Kappa was found to be moderate, indicating rather unconvincing agreement among coders. In conclusion, there is still vast room for improvement in the utilization of standardized international vocabulary, and unused potential for cross-border eHealth implementations in the future.
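Cohen's κ, used here to quantify inter-coder agreement, corrects the observed agreement for the agreement expected by chance. A minimal sketch with made-up dose-form codings (not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical codes:
    kappa = (p_observed - p_expected) / (1 - p_expected)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    # Chance agreement: product of each rater's marginal frequencies.
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Two hypothetical coders mapping dose forms to standard terms
a = ["tablet", "capsule", "solution", "tablet", "cream", "tablet"]
b = ["tablet", "capsule", "suspension", "tablet", "cream", "capsule"]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```

By the common Landis-Koch convention, κ between 0.41 and 0.60 is read as "moderate" agreement, matching the paper's characterization of its result.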

  18. Radiotherapy. Non-standard fractionated regimens improving cancer treatment. Part II. Response of normal tissues to fractionated irradiation

    International Nuclear Information System (INIS)

    Villar, A.; Hernandez, M.; Pera, J.; Cambray, M.; Villa, S.; Arnaiz, M.D.

    1988-01-01

    The phenomena participating in the response of tissues to fractionated irradiation are analyzed with special emphasis on the most relevant points influencing the design of non-standard fractionated regimens. (Author)

  19. The pathophysiology of the aqueduct stroke volume in normal pressure hydrocephalus: can co-morbidity with other forms of dementia be excluded?

    Energy Technology Data Exchange (ETDEWEB)

    Bateman, Grant A. [John Hunter Hospital, Department of Medical Imaging, Newcastle (Australia); Levi, Christopher R.; Wang, Yang; Lovett, Elizabeth C. [Hunter Medical Research Institute, Clinical Neurosciences Program, Newcastle (Australia); Schofield, Peter [James Fletcher Hospital, Neuropsychiatry Unit, Newcastle (Australia)

    2005-10-01

    Variable results are obtained from the treatment of normal pressure hydrocephalus (NPH) by shunt insertion. There is a high correlation between NPH and the pathology of Alzheimer's disease (AD) on brain biopsy. There is an overlap between AD and vascular dementia (VaD), suggesting that a correlation exists between NPH and other forms of dementia. This study seeks to (1) understand the physiological factors behind, and (2) define the ability of, the aqueduct stroke volume to exclude dementia co-morbidity. Twenty-four patients from a dementia clinic were classified as having either early AD or VaD on the basis of clinical features, Hachinski score and neuropsychological testing. They were compared with 16 subjects with classical clinical findings of NPH and 12 age-matched, non-cognitively impaired subjects. MRI flow quantification was used to measure aqueduct stroke volume and arterial pulse volume. An arterio-cerebral compliance ratio was calculated from the two volumes in each patient. The aqueduct stroke volume was elevated in all three forms of dementia, with no significant difference noted between the groups. The arterial pulse volume was elevated by 24% in VaD and reduced by 35% in NPH, compared to normal (P=0.05 and P=0.002, respectively), and was normal in AD. There was a spectrum of relative compliance, with normal compliance in VaD and reduced compliance in AD and NPH. The aqueduct stroke volume depends on the arterial pulse volume and the relative compliance between the arterial tree and the brain. The aqueduct stroke volume cannot exclude significant co-morbidity in NPH. (orig.)

  1. Parallel Computation on Multicore Processors Using Explicit Form of the Finite Element Method and C++ Standard Libraries

    Directory of Open Access Journals (Sweden)

    Rek Václav

    2016-11-01

    Full Text Available In this paper, modifications of existing sequential code written in the C or C++ programming language for the calculation of various kinds of structures using the explicit form of the Finite Element Method (the Dynamic Relaxation Method, Explicit Dynamics) in the NEXX system are introduced. The NEXX system is the core of the engineering software NEXIS, Scia Engineer, RFEM and RENEX. It supports multithreaded execution, which can now be implemented at the level of the native C++ programming language using standard libraries. Thanks to the high degree of abstraction that the contemporary C++ programming language provides, a library created in this way can be generalized for other uses of parallelism in computational mechanics.

  2. Multiple internal standard normalization for improving HS-SPME-GC-MS quantitation in virgin olive oil volatile organic compounds (VOO-VOCs) profile.

    Science.gov (United States)

    Fortini, Martina; Migliorini, Marzia; Cherubini, Chiara; Cecchi, Lorenzo; Calamai, Luca

    2017-04-01

    The commercial value of virgin olive oils (VOOs) strongly depends on their classification, which is based in part on the aroma of the oils, usually evaluated by a panel test. A reliable analytical method is still needed to evaluate the volatile organic compounds (VOCs) and support the standard panel test method. To date, the use of HS-SPME sampling coupled to GC-MS is generally accepted for the analysis of VOCs in VOOs. However, VOO is a challenging matrix due to the simultaneous presence of: i) compounds at ppm and ppb concentrations; ii) molecules belonging to different chemical classes; and iii) analytes with a wide range of molecular mass. Therefore, HS-SPME-GC-MS quantitation based on the external standard method, or on only a single internal standard (ISTD) for data normalization in an internal standard method, may be troublesome. In this work, multiple internal standard normalization is proposed to overcome these problems and improve quantitation of the VOO-VOC profile. As many as 11 ISTDs were used for quantitation of 71 VOCs. For each VOC, the most suitable ISTD was selected, and good linearity was obtained over a wide calibration range. For all compounds except E-2-hexenal, the linear calibration range obtained without an ISTD, or with an unsuitable one, was narrower than that obtained with a suitable ISTD, confirming the usefulness of multiple internal standard normalization for correct quantitation of the VOC profile in VOOs. The method was validated for 71 VOCs and then applied to a series of lampante virgin olive oils and extra virgin olive oils. In light of our results, we propose the application of this analytical approach for routine quantitative analyses and to support sensorial analysis in the evaluation of positive and negative VOO attributes. Copyright © 2017 Elsevier B.V. All rights reserved.
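The multiple-ISTD idea is that each analyte is normalized against the internal standard that behaves most like it, rather than against one shared ISTD. A minimal sketch of the bookkeeping (all compound pairings, peak areas and calibration slopes below are invented for illustration):

```python
# Each analyte is quantified against its own assigned internal standard;
# the normalized response is area_analyte / area_istd.  All names and
# numbers are invented for illustration.
istd_areas = {"d8-toluene": 5.0e5, "d10-hexanal": 3.2e5}
analyte_to_istd = {
    "hexanal": "d10-hexanal",
    "E-2-hexenal": "d10-hexanal",
    "limonene": "d8-toluene",
}
peak_areas = {"hexanal": 1.6e5, "E-2-hexenal": 4.8e4, "limonene": 2.5e5}

# Hypothetical calibration slopes: normalized response per mg/kg,
# obtained beforehand from spiked calibration samples.
slopes = {"hexanal": 0.10, "E-2-hexenal": 0.08, "limonene": 0.25}

results = {}
for analyte, area in peak_areas.items():
    istd = analyte_to_istd[analyte]
    norm_resp = area / istd_areas[istd]          # ISTD-normalized response
    results[analyte] = norm_resp / slopes[analyte]

for analyte, conc in results.items():
    print(f"{analyte}: {conc:.2f} mg/kg")
```

Because the ISTD experiences the same extraction and headspace partitioning as its matched analyte, dividing by its area cancels much of the matrix and sampling variability, which is what widens the usable linear range compared with a single shared ISTD.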

  3. The effectiveness of Microsoft Project in assessing extension of time under PAM 2006 standard form of contract

    Science.gov (United States)

    Suhaida, S. K.; Wong, Z. D.

    2017-11-01

    Time equals money, and this applies especially in the construction industry, where time is critical. Most standard forms of contract provide contractual clauses to ascertain time and money in the relevant scenarios; Extension of Time (EOT) is one of them. Under certain circumstances and delays, the contractor is allowed to apply for an EOT in order to complete the works by a later completion date without Liquidated Damages (LD) being imposed on the claimant. However, both claimants and assessors encounter problems in assessing EOT. The aim of this research is to recommend the use of Microsoft Project as a tool for assessing EOT in association with the standard form of contract PAM 2006. A quantitative method was applied to respondents consisting of architects and quantity surveyors (QS) in order to collect data on the challenges in assessing EOT claims and on the effectiveness of Microsoft Project as a tool. The findings of this research highlight that Microsoft Project can serve as a basis for performing EOT tasks, as the software can be used as a data bank to store handy information crucial for preparing and evaluating EOT claims.

  4. Composite Reliability and Standard Errors of Measurement for a Seven-Subtest Short Form of the Wechsler Adult Intelligence Scale-Revised.

    Science.gov (United States)

    Schretlen, David; And Others

    1994-01-01

    Composite reliability and standard errors of measurement were computed for prorated Verbal, Performance, and Full-Scale intelligence quotient (IQ) scores from a seven-subtest short form of the Wechsler Adult Intelligence Scale-Revised. Results with 1,880 adults (standardization sample) indicate that this form is as reliable as the complete test.…
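
    The composite reliability and standard error of measurement (SEM) figures the abstract refers to follow textbook psychometric formulas; the sketch below implements them with invented subtest numbers (the actual WAIS-R values are not reproduced here).

```python
import math

def composite_reliability(subtest_sds, subtest_reliabilities, composite_sd):
    """Classical formula for the reliability of a composite score:
    r_cc = 1 - sum(sd_i**2 * (1 - r_ii)) / sd_c**2."""
    error_var = sum(sd ** 2 * (1 - r)
                    for sd, r in zip(subtest_sds, subtest_reliabilities))
    return 1.0 - error_var / composite_sd ** 2

def sem(sd, reliability):
    """Standard error of measurement: SEM = SD * sqrt(1 - reliability)."""
    return sd * math.sqrt(1.0 - reliability)

# Invented numbers for illustration (not the published WAIS-R values):
r_composite = composite_reliability([3.0, 3.2, 2.8], [0.85, 0.90, 0.80], 8.0)
sem_iq = sem(15.0, 0.95)  # IQ metric, SD = 15
```

    The SEM is what turns a reliability coefficient into a confidence band around an observed IQ score.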

  5. Guidelines for certification of International Normalized Ratio (INR) for vitamin K antagonists monitoring according to the EN ISO 22870 standards.

    Science.gov (United States)

    Brionne-François, Marie; Bauters, Anne; Mouton, Christine; Voisin, Sophie; Flaujac, Claire; Le Querrec, Agnès; Lasne, Dominique

    2018-06-01

    Point of care testing (POCT) must comply with regulatory requirements according to standard EN ISO 22870, which identifies biologists as responsible for POCT. INR for vitamin K antagonist (VKA) monitoring is a test frequently performed in haemostasis laboratories. Bedside INR is useful in the emergency room, in particular in cases of VKA overdose, but also for specific patient populations such as paediatric or geriatric patients. INR POCT devices are widely used at home by patients for self-testing, but their use in hospital by clinical staff for bedside measurement is growing, with devices that now comply with the standard for POCT accreditation for hospital use. The majority of point of care devices for INR monitoring have shown good precision and accuracy, with results similar to those obtained in the laboratory. With the aim of helping the multidisciplinary groups for POCT supervision, the medical departments and the biologists to comply with the standard, we present the guidelines of the GFHT (Groupe français d'étude sur l'hémostase et la thrombose, subcommittee "CEC et biologie délocalisée") for the certification of POCT INR. These guidelines are based on the SFBC guidelines for the certification of POCT and on an analysis of the literature to ascertain the justification of clinical need and assess the analytical performance of the main analysers used in France, as well as on a survey conducted among biologists.

  6. Evaluation of the standard normal variate method for Laser-Induced Breakdown Spectroscopy data treatment applied to the discrimination of painting layers

    Science.gov (United States)

    Syvilay, D.; Wilkie-Chancellier, N.; Trichereau, B.; Texier, A.; Martinez, L.; Serfaty, S.; Detalle, V.

    2015-12-01

    Nowadays, Laser-Induced Breakdown Spectroscopy (LIBS) is frequently used for in situ analyses to identify pigments in mural paintings. Nonetheless, in situ analyses require robust instrumentation able to cope with harsh experimental conditions. These conditions may cause variations in fluence, and thus in the LIBS signal, degrading the spectra and hence the results. Usually, the LIBS signal is processed to overcome these experimental errors. The most commonly used signal processing methods are baseline subtraction and normalization to a spectral line. However, the latter assumes that the chosen element is a constant component of the material, which may not be the case in paints organized in stratigraphic layers. For this reason, it is sometimes difficult to apply this normalization. In this study, another normalization is used to remove these signal variations. Standard normal variate (SNV) is a normalization designed for such conditions. It is sometimes implemented in Diffuse Reflectance Infrared Fourier Transform Spectroscopy and in Raman spectroscopy, but rarely in LIBS. The SNV transformation is not newly applied to LIBS data, but for the first time the effect of SNV on LIBS spectra is evaluated in detail (laser energy, shot-by-shot behaviour, quantification). The aim of this paper is the quick visualization of the different layers of a stratigraphic painting sample through simple data representations (3D or 2D) after SNV normalization. In this investigation, we show the potential of the SNV transformation to overcome undesired LIBS signal variations, but also its limits of application. The method appears to be a promising way to normalize LIBS data, which may be of interest for in situ depth analyses.
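
    The SNV transformation evaluated in this paper is simply a per-spectrum centring and scaling; a minimal numpy sketch (with synthetic data, not real LIBS spectra):

```python
import numpy as np

def snv(spectrum):
    """Standard normal variate: centre each spectrum on its own mean and
    scale by its own standard deviation, removing multiplicative intensity
    variations such as shot-to-shot laser-energy fluctuations."""
    x = np.asarray(spectrum, dtype=float)
    return (x - x.mean()) / x.std()

# Two synthetic "spectra" differing only by a gain factor collapse onto
# the same curve after SNV:
a = np.array([1.0, 3.0, 2.0, 5.0])
b = 2.7 * a
assert np.allclose(snv(a), snv(b))
```

    Because every spectrum supplies its own reference, no element needs to be assumed constant across the stratigraphic layers, which is the advantage over single-line normalization.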

  7. A Generalized Form of Context-Dependent Psychophysiological Interactions (gPPI): A Comparison to Standard Approaches

    Science.gov (United States)

    McLaren, Donald G.; Ries, Michele L.; Xu, Guofan; Johnson, Sterling C.

    2012-01-01

    Functional MRI (fMRI) allows one to study task-related regional responses and, through psychophysiological interaction (PPI) methods, task-dependent connectivity. The latter affords the additional opportunity to understand how brain regions interact in a task-dependent manner. The current implementation of PPI in Statistical Parametric Mapping (SPM8) is configured primarily to assess connectivity differences between two task conditions, when in practice fMRI tasks frequently employ more than two conditions. Here we evaluate how a generalized form of context-dependent PPI (gPPI; http://www.nitrc.org/projects/gppi), which is configured to automatically accommodate more than two task conditions in the same PPI model by spanning the entire experimental space, compares to the standard implementation in SPM8. These comparisons are made using both simulations and an empirical dataset. In the simulated dataset, we compare the interaction beta estimates to their expected values and assess model fit using the Akaike Information Criterion (AIC). We found that interaction beta estimates in gPPI were robust to different simulated data models, were not different from the expected beta value, and had better model fits than when using standard PPI (sPPI) methods. In the empirical dataset, we compare the model fit of the gPPI approach to sPPI. We found that the gPPI approach improved model fit compared to sPPI. There were several regions that became non-significant with gPPI; these regions all showed significantly better model fits with gPPI. Also, there were several regions where task-dependent connectivity was only detected using gPPI methods, again with improved model fit. Regions that were detected with all methods had more similar model fits. These results suggest that gPPI may have greater sensitivity and specificity than the standard implementation in SPM. This notion is tempered slightly as there is no gold standard; however, data simulations with a known outcome support our

  8. Determining the normal range for IGF-I, IGFBP-3, and ALS: new reference data based on current internal standards.

    Science.gov (United States)

    Ertl, Diana-Alexandra; Gleiss, Andreas; Sagmeister, Susanne; Haeusler, Gabriele

    2014-09-01

    The measurement of insulin-like growth factor I (IGF-I) and insulin-like growth factor-binding protein 3 (IGFBP-3) often serves as first-line testing in children with growth disorders. The role of the acid-labile subunit (ALS) as a screening parameter for homozygous or heterozygous mutations of the ALS gene still has to be determined. IGF-I, IGFBP-3, and ALS were measured in 252 samples from children and adolescents. Reference curves were fitted using generalized additive model for location, scale and shape (GAMLSS) models, and SD-scores were calculated. Bootstrap analysis was used to quantify the uncertainty of the estimated percentiles. Bland-Altman plots were used to investigate the discrepancy between our newly estimated standard deviation scores (SDS) and SDS calculated on the basis of previous reference data. We present reference data for enzyme-linked immunosorbent assay (ELISA) measurements, based on the recommended internal standards for IGF-I, IGFBP-3, and ALS, suitable for the calculation of SD-scores. The Bland-Altman plot shows rough agreement between the previous SDS calculation and our new one only for SDS around 1; for SDS at -2, an average difference of 0.83 SD was observed. Our IGF-I reference values for the interval of interest in diagnosing growth hormone deficiency (GHD) (prepubertal age) are solid, as confirmed by bootstrap analysis. The difference in SD scores calculated from previously provided data highlights the importance of using laboratory- and method-specific reference data.
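
    SD-scores from GAMLSS-type reference curves are commonly computed with the LMS formula; the sketch below shows that calculation with made-up L, M, S values (the published IGF-I reference parameters are not reproduced here).

```python
import math

def lms_sds(value, L, M, S):
    """SD-score from LMS-type reference parameters, as used with
    GAMLSS-fitted centile curves: z = ((value/M)**L - 1) / (L*S) for
    L != 0, and z = ln(value/M) / S in the limit L -> 0."""
    if abs(L) > 1e-12:
        return ((value / M) ** L - 1.0) / (L * S)
    return math.log(value / M) / S

# Illustrative parameters only -- not the study's actual reference values:
z = lms_sds(180.0, L=0.5, M=200.0, S=0.25)
```

    A measured value below the age-specific median M yields a negative SDS, which is exactly the region (around -2) where the abstract reports the largest disagreement with older reference data.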

  9. Quantification of endogenous metabolites by the postcolumn infused-internal standard method combined with matrix normalization factor in liquid chromatography-electrospray ionization tandem mass spectrometry.

    Science.gov (United States)

    Liao, Hsiao-Wei; Chen, Guan-Yuan; Wu, Ming-Shiang; Liao, Wei-Chih; Tsai, I-Lin; Kuo, Ching-Hua

    2015-01-02

    Quantification of endogenous metabolites has enabled the discovery of biomarkers for diagnosis and provided an understanding of disease etiology. The standard addition and stable isotope labeled-internal standard (SIL-IS) methods are currently the most widely used approaches to quantifying endogenous metabolites, but both have some limitations for clinical measurement. In this study, we developed a new approach to endogenous metabolite quantification using the postcolumn infused-internal standard (PCI-IS) method combined with the matrix normalization factor (MNF) method. The MNF was used to correct the difference in matrix effects (MEs) between standard solution and biofluids, and the PCI-IS additionally tailored the correction of the MEs for individual samples. Androstenedione and testosterone were selected as test articles to verify this new approach to quantifying metabolites in plasma. The repeatability (n = 4 runs) and intermediate precision (n = 3 days) in terms of the peak area of androstenedione and testosterone at all tested concentrations were all less than 11% relative standard deviation (RSD). The accuracy test revealed that the recoveries were between 95.72% and 113.46%. The concentrations of androstenedione and testosterone in fifty plasma samples obtained from healthy volunteers were quantified by the PCI-IS combined with the MNF method, and the quantification results were compared with the results of the SIL-IS method. The Pearson correlation test showed that the correlation coefficient was 0.98 for both androstenedione and testosterone. We demonstrated that the PCI-IS combined with the MNF method is an effective and accurate method for quantifying endogenous metabolites. Copyright © 2014 Elsevier B.V. All rights reserved.
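
    A minimal sketch of how a matrix normalization factor could be applied, assuming the MNF is the ratio of an analyte's response in the biological matrix to its response in neat standard solution at the same concentration. All numbers are illustrative, and the actual PCI-IS correction in the paper is more elaborate than this.

```python
# Hypothetical matrix-effect correction via a matrix normalization factor.
# Peak areas and the raw concentration below are invented for illustration.

def matrix_normalization_factor(response_in_matrix, response_in_standard):
    """MNF: analyte response in matrix divided by its response in neat
    standard solution at the same concentration (<1 means suppression)."""
    return response_in_matrix / response_in_standard

def corrected_concentration(raw_concentration, mnf):
    """Divide out the average matrix effect estimated by the MNF."""
    return raw_concentration / mnf

mnf = matrix_normalization_factor(8.2e5, 1.0e6)   # illustrative peak areas
conc = corrected_concentration(12.4, mnf)          # illustrative ng/mL
```

    The PCI-IS signal then refines this average correction sample by sample, which is the part this sketch deliberately omits.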

  10. Cross-cultural adaptation of the US consumer form of the short Primary Care Assessment Tool (PCAT): the Korean consumer form of the short PCAT (KC PCAT) and the Korean standard form of the short PCAT (KS PCAT).

    Science.gov (United States)

    Jeon, Ki-Yeob

    2011-01-01

    It is well known that countries with well-structured primary care have better health outcomes, better health equity and reduced healthcare costs. This study aimed to culturally modify and validate the US consumer form of the short Primary Care Assessment Tool (PCAT) in primary care in the Republic of Korea (hereafter referred to as Korea). The Korean consumer form of the short PCAT (KC PCAT) was cross-culturally modified from the original version using a standardised transcultural adaptation method. A pre-test version of the KC PCAT was formulated by replacement of four items and modification of a further four items from the 37 items of the original consumer form of the short PCAT at face value evaluation meetings. Pilot testing was done with a convenience sample of 15 responders at two different sites. Test-retest showed high reliability. To validate the KC PCAT, 606 clients participated in a survey carried out in Korea between February and May 2006. Internal consistency reliability, test-retest reliability and factor analysis were conducted in order to test validity. Psychometric testing was carried out on 37 items of the KC PCAT to make the KS PCAT which has 30 items and has seven principal domains: first contact utilisation, first contact accessibility, ongoing accountable care (ongoing care and coordinated rapport care), integrated care (patient-centred care with integration between primary and specialty care or between different specialties), comprehensive care, community-oriented care and culturally-oriented care. Component factors of the verified KS PCAT explained 58.28% of the total variance in the total item scores of primary care. The verified KS PCAT has been characterised by the seven classic domains of primary care with minor modifications. This may provide clues concerning differences in expectations for primary care in the Korean population as compared with that of the US. 
The KS PCAT is a reliable and valid tool for the evaluation of the quality of

  11. Analysis of the nonlinear dynamic behavior of power systems using normal forms of superior order; Analisis del comportamiento dinamico no lineal de sistemas de potencia usando formas normales de orden superior

    Energy Technology Data Exchange (ETDEWEB)

    Marinez Carrillo, Irma

    2003-08-01

    This thesis investigates the application of perturbation methods of analysis from nonlinear dynamic systems theory to the study of small-signal stability of electric power systems. The work centres on two fundamental aspects of interest in the study of the nonlinear dynamic behavior of the system: the characterization and quantification of the degree of nonlinear interaction between the fundamental modes of oscillation of the system, and the study of the modes with the greatest influence on the system response to small disturbances. With these objectives, a general mathematical model, based on a power-series expansion of the nonlinear model of the power system and the theory of normal forms of vector fields, is proposed for the study of the dynamic behavior of the power system. The proposed tool generalizes the methods existing in the literature to account for higher-order effects in the dynamic model of the power system. Starting from this representation, a methodology is proposed to obtain closed-form analytical solutions, and the extension of existing methods to identify and quantify the degree of interaction among the fundamental modes of oscillation is investigated. The developed tool allows, from closed-form analytical expressions, the derivation of analytical measures to evaluate the degree of stress in the system, the interaction between the fundamental modes of oscillation, and the determination of stability boundaries. The conceptual development of the method proposed in this thesis offers, moreover, great flexibility to incorporate detailed models of the power system and to evaluate diverse measures of nonlinear modal interaction. 
    Finally, results are presented from the application of the proposed analysis method to the study of nonlinear dynamic behavior in a single machine-infinite bus system, considering different degrees of modeling detail
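
    For reference, the standard second-order normal-form construction that this line of work builds on (generic textbook form, not the thesis's higher-order extension) can be written as:

```latex
% Modal expansion of the power system about the stable equilibrium point:
\dot{y}_j = \lambda_j y_j + \sum_{k,l} C^{j}_{kl}\, y_k y_l + \cdots
% Near-identity normal-form transformation and its second-order coefficients:
y_j = z_j + \sum_{k,l} h^{j}_{2,kl}\, z_k z_l ,
\qquad
h^{j}_{2,kl} = \frac{C^{j}_{kl}}{\lambda_k + \lambda_l - \lambda_j}
% Absent second-order resonances \lambda_k + \lambda_l = \lambda_j, the
% transformed dynamics reduce to \dot{z}_j = \lambda_j z_j + O(\|z\|^3).
```

    Large coefficients $h^{j}_{2,kl}$ (near-resonant denominators) signal strong interaction between modes $k$, $l$ and mode $j$, which is the kind of quantity such analyses use to measure nonlinear modal interaction.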

  12. 16 CFR 1031.6 - Extent and form of Commission involvement in the development of voluntary standards.

    Science.gov (United States)

    2010-01-01

    ..., engineering support, and information and education programs) and administrative assistance (e.g., travel costs... SAFETY COMMISSION GENERAL COMMISSION PARTICIPATION AND COMMISSION EMPLOYEE INVOLVEMENT IN VOLUNTARY... goals and objectives with regard to voluntary standards and improved consumer product safety; responding...

  13. Normalization of test and evaluation of biothreat detection systems: overcoming microbial air content fluctuations by using a standardized reagent bacterial mixture.

    Science.gov (United States)

    Berchebru, Laurent; Rameil, Pascal; Gaudin, Jean-Christophe; Gausson, Sabrina; Larigauderie, Guilhem; Pujol, Céline; Morel, Yannick; Ramisse, Vincent

    2014-10-01

    Test and evaluation of engineered biothreat agent detection systems ("biodetectors") is a challenging task for government agencies and industries involved in biosecurity and biodefense programs. In addition to user-friendly features, biodetectors need to perform both highly sensitive and specific detection, and must not produce excessive false alerts. In fact, the atmosphere displays a number of variables, such as airborne bacterial content, that can interfere with the detection process, thus impeding comparative tests when carried out at different times or places. To overcome these fluctuations in bacterial air content, a standardized reagent bacterial mixture (SRBM), consisting of a collection of selected cultivable environmental species that are prevalent in temperate-climate bioaerosols, was designed to generate a stable, reproducible, and easy-to-use surrogate of a bioaerosol sample. The rationale, design, and production process are reported. The results showed that 8.59 log cfu (95% CI: 8.46-8.72) distributed into vials underwent a 0.95 log (95% CI: 0.65-1.26) viability decay after dehydration and subsequent reconstitution, thus advantageously mimicking a natural bioaerosol sample, which is typically composed of cultivable and uncultivable particles. Dehydrated SRBM was stable for more than 12 months at 4°C and allowed the reconstitution of a dead/live cell aqueous suspension that is stable for 96 h at +4°C, according to plate counts. Specific detection of a simulated biothreat agent (e.g. Bacillus atrophaeus) by immuno-magnetic or PCR assays did not display any significant loss of sensitivity, or false negative or positive results, in the presence of SRBM. This work provides guidance on testing and evaluating detection devices, and may contribute to the establishment of suitable standards and normalized procedures. Copyright © 2014 Elsevier B.V. All rights reserved.
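
    The reported 0.95-log viability decay is simply a difference of log10 plate counts; a small sketch with counts invented to reproduce a loss of similar magnitude:

```python
import math

def log10_decay(cfu_before, cfu_after):
    """Viability decay in log10 units between two plate counts."""
    return math.log10(cfu_before) - math.log10(cfu_after)

# Counts invented to give roughly a one-log loss, comparable in magnitude
# to the 0.95-log decay reported for the SRBM (not the study's raw data):
decay = log10_decay(3.9e8, 4.4e7)
```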

  14. Non normal and non quadratic anisotropic plasticity coupled with ductile damage in sheet metal forming: Application to the hydro bulging test

    International Nuclear Information System (INIS)

    Badreddine, Houssem; Saanouni, Khemaies; Dogui, Abdelwaheb

    2007-01-01

    In this work an improved material model is proposed that shows good agreement with experimental data for both hardening curves and plastic strain ratios in uniaxial and equibiaxial proportional loading paths for sheet steel, up to final fracture. The model is based on a non-associative, non-normal flow rule using two different orthotropic equivalent stresses in the yield criterion and the plastic potential functions. For the plastic potential, the classical Hill 1948 quadratic equivalent stress is considered, while for the yield criterion the Karafillis and Boyce 1993 non-quadratic equivalent stress is used, taking into account nonlinear mixed (kinematic and isotropic) hardening. Applications are made to hydro bulging tests using both circular and elliptical dies. The results obtained with different particular cases of the model, such as the normal quadratic and the non-normal non-quadratic cases, are compared and discussed with respect to the experimental results

  15. Patient Recall of Informed Consent at 4 Weeks After Total Hip Replacement With Standardized Versus Procedure-Specific Consent Forms.

    Science.gov (United States)

    Pomeroy, Eoghan; Shaarani, Shahril; Kenyon, Robert; Cashman, James

    2017-08-25

    Informed consent plays a pivotal role in the operative process, and surgeons have an ethical and legal obligation to provide patients with information to allow for shared decision-making. Unfortunately, patient recall after the consent process is frequently poor. This study aims to evaluate the effect of procedure-specific consent forms on patients' recall 4 weeks after total hip replacement (THR). This is a prospective study using a posttest-only control group design. Sixty adult patients undergoing THR were allocated to be consented using either the generic or the surgery-specific consent form. Four weeks after surgery, a phone interview was conducted to assess patients' recall of the risks of surgical complications. Patient demographic characteristics and educational attainment were similar in both groups. There was a statistically significant increase in the mean number of risks recalled in the study group, at 1.43, compared with 0.67 in the control group (P = 0.0131). Consent is a complex process, and obtaining informed consent is far from straightforward. A statistically significant improvement in patients' recall with the use of procedure-specific consent forms was identified, and on this basis we would advocate their use. However, overall patient recall in both groups was poor. We believe that improving the quality of informed consent may require the sum of small gains, and the use of procedure-specific consent forms may aid in this regard.

  16. The MMPI-2-Restructured Form and the Standard MMPI-2 Clinical Scales in Relation to DSM-IV

    NARCIS (Netherlands)

    Heijden, P.T. van der; Egger, J.I.M.; Rossi, G.M.P.; Grundel, G.; Derksen, J.J.L.

    2013-01-01

    In a Dutch sample of psychiatric outpatients (N = 94), we linked the Minnesota Multiphasic Personality Inventory–2 (MMPI-2; Butcher et al., 2001) Clinical scales and MMPI-2-Restructured Form (MMPI-2-RF; Ben-Porath & Tellegen, 2008) Higher-Order (H-O) scales, Restructured Clinical (RC) scales and

  17. Development of the Parent Form of the Preschool Children's Communication Skills Scale and Comparison of the Communication Skills of Children with Normal Development and with Autism Spectrum Disorder

    Science.gov (United States)

    Aydin, Aydan

    2016-01-01

    This study aims at developing an assessment scale for identifying preschool children's communication skills, at distinguishing children with communication deficiencies and at comparing the communication skills of children with normal development (ND) and those with autism spectrum disorder (ASD). Participants were 427 children of up to 6 years of…

  18. Assembling of (βLPH) beta-lypothrophine radioimmunoassay. Plasma levels standardization in normal individuals and patients with hypophysis and adrenals diseases

    International Nuclear Information System (INIS)

    Castro, Margaret de.

    1988-01-01

    The present study investigates the extraction and radioimmunoassay (RIA) conditions for plasma βLPH. βLPH was extracted by the activated silicic acid method, with a mean extraction efficiency of 31.6% and a mean intra-extraction variation coefficient of 8.1%. Radioiodination was performed by the chloramine-T method, and 125I-βLPH was purified by gel chromatography on Sephadex G100. Estimated specific activity ranged from 100 to 192.8 μCi/μg, with a mean incorporation percentage of 66.6%. The titer of the first antibody was 1:50,000/100 μl. The assay was performed under non-equilibrium conditions, with a pre-incubation period of 24 hours and incubation of 4 hours. Mean immunoreactivity (Bo/Total) was 21.1%, with a mean Blank/Total ratio of 2.3%. Sensitivity, expressed as the mean minimum detectable dose, was 40 pg/tube, equivalent to 56 pg/ml plasma. Intra-assay variation coefficients were 6.5%, 3.8% and 6.8%, respectively, at B/Bo levels of 0.8, 0.6 and 0.4 of the standard curve. At B/Bo equal to 0.5, the intra-assay variation coefficient was 20.9%. Replicates of 14 plasma samples showed a correlation coefficient of r = 0.99 (p < 0.05). Parallelism was found between the standard curve and the curve obtained with different volumes of an extract with a high βLPH value. The method was validated biologically by the correlation between plasma βLPH levels and defined pathological states, and by clinical functional studies. Twenty-seven normal individuals, 10 patients with Cushing's disease due to a tumor of the hypophysis, 4 patients with Cushing's syndrome due to an adrenal tumor, 10 patients with Addison's disease, and 8 patients with hypopituitarism were studied. (author). 119 refs., 28 figs., 2 tabs

  19. Long-term leach testing of solidified radioactive waste forms (International Standard Publication ISO 6961:1982)

    International Nuclear Information System (INIS)

    Stefanik, J.

    2001-01-01

    Processes have been developed for the immobilization of radionuclides by solidification of radioactive wastes. The resulting solidification products are characterized by strong resistance to leaching, aimed at low release rates of the radionuclides to the environment. To measure this resistance to leaching of the solidified materials (glass, glass-ceramics, bitumen, cement, concrete, plastics), a long-term leach test is presented. The long-term leach test is aimed at: a) the comparison of different kinds or compositions of solidified waste forms; b) the intercomparison of leach test results from different laboratories on one product; c) the intercomparison of leach test results on products from different processes

  20. Comparison of spectrum normalization techniques for univariate ...

    Indian Academy of Sciences (India)

    Laser-induced breakdown spectroscopy; univariate study; normalization models; stainless steel; standard error of prediction. Abstract. Analytical performance of six different spectrum normalization techniques, namely internal normalization, normalization with total light, normalization with background along with their ...

  1. Effect of psychological intervention in the form of relaxation and guided imagery on cellular immune function in normal healthy subjects. An overview

    DEFF Research Database (Denmark)

    Zachariae, R; Kristensen, J S; Hokland, P

    1991-01-01

    The present study measured the effects of relaxation and guided imagery on cellular immune function. During a period of 10 days 10 healthy subjects were given one 1-hour relaxation procedure and one combined relaxation and guided imagery procedure, instructing the subjects to imagine their immune...... on the immune defense and could form the basis of further studies on psychological intervention and immunological status.

  2. Matrix forming characteristics of inner and outer human meniscus cells on 3D collagen scaffolds under normal and low oxygen tensions.

    Science.gov (United States)

    Croutze, Roger; Jomha, Nadr; Uludag, Hasan; Adesida, Adetola

    2013-12-13

    Limited intrinsic healing potential of the meniscus and a strong correlation between meniscal injury and osteoarthritis have prompted investigation of surgical repair options, including the implantation of functional bioengineered constructs. Cell-based constructs appear promising, however the generation of meniscal constructs is complicated by the presence of diverse cell populations within this heterogeneous tissue and gaps in the information concerning their response to manipulation of oxygen tension during cell culture. Four human lateral menisci were harvested from patients undergoing total knee replacement. Inner and outer meniscal fibrochondrocytes (MFCs) were expanded to passage 3 in growth medium supplemented with basic fibroblast growth factor (FGF-2), then embedded in porous collagen type I scaffolds and chondrogenically stimulated with transforming growth factor β3 (TGF-β3) under 21% (normal or normoxic) or 3% (hypoxic) oxygen tension for 21 days. Following scaffold culture, constructs were analyzed biochemically for glycosaminoglycan production, histologically for deposition of extracellular matrix (ECM), as well as at the molecular level for expression of characteristic mRNA transcripts. Constructs cultured under normal oxygen tension expressed higher levels of collagen type II (p = 0.05), aggrecan (p oxygen tension. There was no significant difference in expression of these genes between scaffolds seeded with MFCs isolated from inner or outer regions of the tissue following 21 days chondrogenic stimulation (p > 0.05). Cells isolated from inner and outer regions of the human meniscus demonstrated equivalent differentiation potential toward chondrogenic phenotype and ECM production. Oxygen tension played a key role in modulating the redifferentiation of meniscal fibrochondrocytes on a 3D collagen scaffold in vitro.

  3. Reconstructing Normality

    DEFF Research Database (Denmark)

    Gildberg, Frederik Alkier; Bradley, Stephen K.; Fristed, Peter Billeskov

    2012-01-01

    Forensic psychiatry is an area of priority for the Danish Government. As the field expands, this calls for increased knowledge about mental health nursing practice, as this is part of the forensic psychiatry treatment offered. However, only sparse research exists in this area. The aim of this study...... was to investigate the characteristics of forensic mental health nursing staff interaction with forensic mental health inpatients and to explore how staff give meaning to these interactions. The project included 32 forensic mental health staff members, with over 307 hours of participant observations, 48 informal....... The intention is to establish a trusting relationship to form behaviour and perceptual-corrective care, which is characterized by staff's endeavours to change, halt, or support the patient's behaviour or perception in relation to staff's perception of normality. The intention is to support and teach the patient...

  4. Normal growth, altered growth? Study of the relationship between harris lines and bone form within a post-medieval plague cemetery (Dendermonde, Belgium, 16th Century).

    Science.gov (United States)

    Boucherie, Alexandra; Castex, Dominique; Polet, Caroline; Kacki, Sacha

    2017-01-01

    Harris lines (HLs) are defined as transverse, mineralized lines associated with temporary growth arrest. In paleopathology, HLs are used to reconstruct the health status of past populations. However, their etiology is still obscure. The aim of this article is to test the reliability of HLs as a marker of arrested growth by investigating their incidence on human metrical parameters. The study was performed on 69 individuals (28 adults, 41 subadults) from the Dendermonde plague cemetery (Belgium, 16th century). HLs were rated on distal femora and both ends of tibiae. The overall prevalence and age-at-formation of each detected line were calculated. ANOVA analyses were conducted within the subadult and adult samples to test whether the presence of HLs affected the size and shape parameters of the individuals. At Dendermonde, 52% of the individuals had at least one HL. The age-at-formation was estimated at between 5 and 9 years for the subadults and between 10 and 14 years for the adults. ANOVA analyses showed that the presence of HLs did not affect the size of the individuals. However, significant differences in shape parameters were highlighted by HL presence. Subadults with HLs displayed slighter shape parameters than subadults without, whereas adults with HLs had larger measurements than adults without. The results suggest that HLs can have a certain impact on shape parameters. The underlying causes can be various, especially for the early-formed HLs. However, HLs deposited around puberty are more likely to be physiological lines reflecting hormonal secretions. Am. J. Hum. Biol. 29:e22885, 2017. © 2016 Wiley Periodicals, Inc.

  5. ASTM Committee C28: International Standards for Properties and Performance of Advanced Ceramics-Three Decades of High-Quality, Technically-Rigorous Normalization

    Science.gov (United States)

    Jenkins, Michael G.; Salem, Jonathan A.

    2016-01-01

    Physical and mechanical properties and performance of advanced ceramics and glasses are difficult to measure correctly without the proper techniques. For over three decades, ASTM Committee C28 on Advanced Ceramics has developed high-quality, technically-rigorous, full-consensus standards (e.g., test methods, practices, guides, terminology) to measure properties and performance of monolithic and composite ceramics that may be applied to glasses in some cases. These standards contain testing particulars for many mechanical, physical, and thermal properties and performance characteristics of these materials. As a result, these standards are used to generate accurate, reliable, repeatable and complete data. Within Committee C28, users, producers, researchers, designers, academicians, etc. have written, continually updated, and validated through round-robin test programs, 50 standards since the Committee's founding in 1986. This paper provides a detailed retrospective of the 30 years of ASTM Committee C28, including a graphical pictogram listing of C28 standards, along with examples of the tangible benefits of standards for advanced ceramics to demonstrate their practical applications.

  6. ASTM Committee C28: International Standards for Properties and Performance of Advanced Ceramics, Three Decades of High-quality, Technically-rigorous Normalization

    Science.gov (United States)

    Jenkins, Michael G.; Salem, Jonathan A.

    2016-01-01

    Physical and mechanical properties and performance of advanced ceramics and glasses are difficult to measure correctly without the proper techniques. For over three decades, ASTM Committee C28 on Advanced Ceramics has developed high-quality, rigorous, full-consensus standards (e.g., test methods, practices, guides, terminology) to measure the properties and performance of monolithic and composite ceramics, which may be applied to glasses in some cases. These standards contain testing particulars for many mechanical, physical and thermal properties and performance measures of these materials. As a result, these standards provide accurate, reliable, repeatable and complete data. Within Committee C28, users, producers, researchers, designers, academicians, etc. have written, continually updated, and validated through round-robin test programs nearly 50 standards since the Committee's founding in 1986. This paper provides a retrospective review of the 30 years of ASTM Committee C28, including a graphical pictogram listing of C28 standards along with examples of the tangible benefits of advanced ceramics standards to demonstrate their practical applications.

  7. Normalization of time-series satellite reflectance data to a standard sun-target-sensor geometry using a semi-empirical model

    Science.gov (United States)

    Zhao, Yongguang; Li, Chuanrong; Ma, Lingling; Tang, Lingli; Wang, Ning; Zhou, Chuncheng; Qian, Yonggang

    2017-10-01

    Time series of satellite reflectance data have been widely used to characterize environmental phenomena, describe trends in vegetation dynamics, and study climate change. However, sensors with wide spatial coverage and high observation frequency are usually designed with a large field of view (FOV), which causes variations in the sun-target-sensor geometry within time-series reflectance data. In this study, on the basis of the semi-empirical kernel-driven BRDF model, a new semi-empirical model was proposed to normalize the sun-target-sensor geometry of remote sensing images. To evaluate the proposed model, bidirectional reflectances under different canopy growth conditions simulated by the Discrete Anisotropic Radiative Transfer (DART) model were used. The semi-empirical model was first fitted using all simulated bidirectional reflectances. Experimental results showed a good fit between the bidirectional reflectance estimated by the proposed model and the simulated values. Then, MODIS time-series reflectance data were normalized to a common sun-target-sensor geometry by the proposed model. The experimental results showed that the proposed model yielded good fits between the observed and estimated values. The noise-like fluctuations in the time-series reflectance data were also reduced by the sun-target-sensor normalization process.
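The general shape of a kernel-driven BRDF normalization can be sketched with ordinary linear least squares. This is a generic illustration, not the authors' model: the kernel values, coefficients, noise level, and "standard geometry" kernel values below are all synthetic assumptions, and in practice the kernels would be computed from the actual sun and view angles.

```python
import numpy as np

# Hedged sketch: fit R = f_iso + f_vol * K_vol + f_geo * K_geo by linear
# least squares, then predict reflectance at one fixed "standard"
# sun-target-sensor geometry. All numbers are synthetic.

def fit_brdf(k_vol, k_geo, reflectance):
    """Return (f_iso, f_vol, f_geo) fitted to observed reflectances."""
    A = np.column_stack([np.ones_like(k_vol), k_vol, k_geo])
    coeffs, *_ = np.linalg.lstsq(A, reflectance, rcond=None)
    return coeffs

def normalize_to_geometry(coeffs, k_vol_std, k_geo_std):
    """Evaluate the fitted model at the standard geometry's kernel values."""
    f_iso, f_vol, f_geo = coeffs
    return f_iso + f_vol * k_vol_std + f_geo * k_geo_std

# Synthetic observations (kernel values would come from view/sun angles)
rng = np.random.default_rng(0)
k_vol = rng.uniform(-0.2, 0.4, 30)
k_geo = rng.uniform(-1.5, 0.0, 30)
true_reflectance = 0.25 + 0.10 * k_vol + 0.05 * k_geo
obs = true_reflectance + rng.normal(0.0, 1e-3, 30)

coeffs = fit_brdf(k_vol, k_geo, obs)
# Hypothetical standard geometry: K_vol = 0.0, K_geo = -0.7
r_std = normalize_to_geometry(coeffs, 0.0, -0.7)
```

Applying the second step to every date in a time series places all observations on the same geometry, which is what suppresses the angular (noise-like) component of the fluctuations.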

  8. Graphs and Networks for Years 7 to 10: Reasons for and Ways of Using Digital Technologies to teach Algebra and the Standard Normal Curve

    Science.gov (United States)

    Padula, Janice

    2014-01-01

    If educators want to interest students in mathematics (and science), they must engage them in the lower forms of high school or even earlier (Fisher, 2012). So, teachers should always consider a topic's ability to interest students in the early years of instruction in high school and its topicality. Networks have come into prominence recently with…

  9. Socio-cultural adaptation and standardization of Dubois' five words testing in a population of normal subjects in Mali, West Africa.

    Science.gov (United States)

    Guinto, Cheick O; Coulibaly, Toumany; Koné, Zeinab; Coulibaly, Souleymane; Maiga, Boubacar; Dembélé, Kekouta; Cissé, Lassana; Konaté, Mamadou; Coulibaly, Thomas; Sissoko, Adama S; Karambé, Mamadou; Burnett, Barrington; Landouré, Guida; Traoré, Moussa

    2016-06-01

    Dubois' five words testing (5WT) is a verbal memory test that depends on many parameters. The aim of this study was to adapt Dubois' 5WT to Malian socio-cultural conditions in order to (i) determine the performance of normal subjects on the 5WT and (ii) provide reference scores for the 5WT. A sample of 276 normal subjects aged ≥ 50 years (154 males and 122 females; 144 literates and 132 illiterates) was enrolled from February 2008 to January 2009. Subjects with a history of symptoms likely to modify cognitive functions and those found disabled under Lawton's four simplified item test were excluded. The learning score in illiterates was 1.51 on Dubois' 5WT and 4.90 on the modified 5WT. The mean value of the modified 5WT total score was 9.71. The majority (90.22%) of the subjects scored the maximum (10). The modified 5WT score decreased with both age (p […]) […] culture and the socio-educative level in French. Its adaptation to the socio-cultural context could prove useful and efficient in countries with a low literacy rate and a diverse cultural background.

  10. Simulated Prism Therapy in Virtual Reality produces larger after-effects than standard prism exposure in normal healthy subjects - Implications for Neglect Therapy

    DEFF Research Database (Denmark)

    Wilms, Inge Linda

    2018-01-01

    BACKGROUND: Virtual reality is an important area of exploration within computer-based cognitive rehabilitation of visual neglect. Virtual reality will allow for closer monitoring of patient behaviour during prism adaptation therapy and perhaps change the way we induce prismatic after-effects. OBJECTIVE: This study compares the effect of two different prism simulation conditions in virtual reality to a standard exposure to prism goggles after one session of Prism Adaptation Therapy in healthy subjects. METHOD: 20 healthy subjects were subjected to one session of prism adaptation therapy under […] training for rehabilitation of hemispatial attentional deficits such as visual neglect.

  11. Do different standard plate counting (IDF/ISO or AOAC) methods interfere in the conversion of individual bacteria counts to colony forming units in raw milk?

    Science.gov (United States)

    Cassoli, L D; Lima, W J F; Esguerra, J C; Da Silva, J; Machado, P F; Mourão, G B

    2016-10-01

    This study aimed to establish the correlation between the individual bacterial count (IBC) obtained by flow cytometry and the number of colony forming units (CFU) determined by standard plate count (SPC) in raw milk using two different reference methodologies: that of the International Dairy Federation (IDF) - International Organization for Standardization (ISO) 4833, incubation for 72 h at 30°C, and that of the Association of Official Analytical Chemists (AOAC), incubation for 48 h at 35°C. For this, 100 bovine milk samples (80 ml) from different farms were collected in sterile bottles, kept refrigerated at 4°C and delivered to the laboratory. In the laboratory, each sample was divided into two vials of 40 ml each. Half of the vials were forwarded for SPC analysis, and the other half were analysed using the BactoScan FC equipment. The flow cytometry and SPC analyses were performed at the same time (maximum deviation of ±1 h). To transform the data from IBC ml(-1) to CFU ml(-1) (IDF or AOAC methodology), a standard linear regression equation was used, as recommended by IDF/ISO-196. The difference between the reference methodologies affects the equation that transforms IBC into CFU and therefore the accuracy of the results. The results estimated by the equation using the ISO 4833 methodology were on average 0.18 log units higher than those estimated by the equation using the AOAC methodology. After the comparison of the methodologies, it was concluded that the reference methodologies have an impact on the conversion of the results from IBC to CFU. Depending on the methodology adopted by each laboratory or country, there may not be equivalence in the results. Hence, laboratories specialized in milk-quality analysis that have changed their methodology, passing from the MAPA (AOAC) methodology to the IDF standard, need to develop new conversion equations to make their […]
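The conversion step can be illustrated with a toy linear regression in log10 units, in the spirit of the IDF/ISO 196 recommendation. All numbers below are illustrative, not the study's data; the 0.18-log-unit offset between methodologies is built into the synthetic data purely to mirror the reported difference.

```python
import numpy as np

# Hedged sketch: fit log10(CFU) = a + b * log10(IBC), one equation per
# reference methodology, then compare the two conversions at one IBC.
# All paired data are illustrative.

def fit_conversion(log_ibc, log_cfu):
    """Return (intercept a, slope b) for log10(CFU) = a + b*log10(IBC)."""
    b, a = np.polyfit(log_ibc, log_cfu, 1)
    return a, b

def ibc_to_cfu(ibc, a, b):
    """Convert an IBC/ml reading to an estimated CFU/ml."""
    return 10 ** (a + b * np.log10(ibc))

# Illustrative paired data in log10 units
log_ibc = np.array([4.0, 4.5, 5.0, 5.5, 6.0])
log_cfu_iso = np.array([3.7, 4.2, 4.7, 5.2, 5.7])  # ISO 4833 counts
log_cfu_aoac = log_cfu_iso - 0.18                  # AOAC ~0.18 log lower

a_iso, b_iso = fit_conversion(log_ibc, log_cfu_iso)
a_aoac, b_aoac = fit_conversion(log_ibc, log_cfu_aoac)

# Same IBC reading, two methodology-specific estimates
diff = (np.log10(ibc_to_cfu(1e5, a_iso, b_iso))
        - np.log10(ibc_to_cfu(1e5, a_aoac, b_aoac)))
```

The point of the sketch is that the same instrument reading yields two different CFU estimates depending on which reference methodology the regression was calibrated against, which is exactly why a laboratory switching methodologies needs a new conversion equation.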

  12. Chandra-SDSS Normal and Star-Forming Galaxies. I. X-Ray Source Properties of Galaxies Detected by the Chandra X-Ray Observatory in SDSS DR2

    Science.gov (United States)

    Hornschemeier, A. E.; Heckman, T. M.; Ptak, A. F.; Tremonti, C. A.; Colbert, E. J. M.

    2005-01-01

    We have cross-correlated X-ray catalogs derived from archival Chandra X-Ray Observatory ACIS observations with a Sloan Digital Sky Survey Data Release 2 (DR2) galaxy catalog to form a sample of 42 serendipitously X-ray-detected galaxies over the redshift interval 0.03 […] normal galaxies and those in the deepest X-ray surveys. Our chief purpose is to compare optical spectroscopic diagnostics of activity (both star formation and accretion) with X-ray properties of galaxies. Our work supports a normalization value of the X-ray-star formation rate correlation consistent with the lower values published in the literature. The difference is in the allocation of X-ray emission to high-mass X-ray binaries relative to other components, such as hot gas, low-mass X-ray binaries, and/or active galactic nuclei (AGNs). We are able to quantify a few pitfalls in the use of lower resolution, lower signal-to-noise ratio optical spectroscopy to identify X-ray sources (as has necessarily been employed for many X-ray surveys). Notably, we find a few AGNs that likely would have been misidentified as non-AGN sources in higher redshift studies. However, we do not find any X-ray-hard, highly X-ray-luminous galaxies lacking optical spectroscopic diagnostics of AGN activity. Such sources are members of the "X-ray-bright, optically normal galaxy" (XBONG) class of AGNs.

  13. A Denotational Account of Untyped Normalization by Evaluation

    DEFF Research Database (Denmark)

    Filinski, Andrzej; Rohde, Henning Korsholm

    2004-01-01

    We show that the standard normalization-by-evaluation construction for the simply-typed λβη-calculus has a natural counterpart for the untyped λβ-calculus, with the central type-indexed logical relation replaced by a "recursively defined" invariant relation, in the style of Pitts. In fact, the construction can be seen as generalizing a computational adequacy argument for an untyped, call-by-name language to normalization instead of evaluation. In the untyped setting, not all terms have normal forms, so the normalization function is necessarily partial. We establish its correctness in the senses […]
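The operational flavour of untyped normalization by evaluation can be sketched in a few lines. This is a generic illustration of NbE, not the paper's denotational construction: the de Bruijn term encoding and helper names are my own, semantic values are host-language closures, and `normalize` simply diverges on terms without normal forms, mirroring the partiality discussed above.

```python
# Hedged sketch of untyped NbE. Terms are tuples with de Bruijn indices:
#   ('var', i), ('lam', body), ('app', fun, arg).
# Semantic values: ('fun', closure) for functions, ('nvar', level) and
# ('napp', neutral, value) for stuck (neutral) terms.

def evaluate(term, env):
    tag = term[0]
    if tag == 'var':
        return env[term[1]]                     # look up the bound value
    if tag == 'lam':
        # represent the binder by a host-language closure
        return ('fun', lambda v: evaluate(term[1], (v,) + env))
    # application: beta-reduce semantically, or build a neutral term
    f, a = evaluate(term[1], env), evaluate(term[2], env)
    if f[0] == 'fun':
        return f[1](a)
    return ('napp', f, a)

def reify(value, depth):
    """Read a semantic value back into a beta-normal term."""
    if value[0] == 'fun':
        # apply to a fresh neutral variable at the current binding depth
        body = reify(value[1](('nvar', depth)), depth + 1)
        return ('lam', body)
    if value[0] == 'nvar':
        return ('var', depth - value[1] - 1)    # de Bruijn level -> index
    return ('app', reify(value[1], depth), reify(value[2], depth))

def normalize(term):
    # partial: loops forever on terms with no beta-normal form
    return reify(evaluate(term, ()), 0)

# (\x. x) (\y. y) normalizes to \y. y
I = ('lam', ('var', 0))
result = normalize(('app', I, I))
```

Feeding `normalize` the term for `(\x. x x) (\x. x x)` would recurse without bound, which is the operational face of the partiality that the paper handles denotationally.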

  14. Normalized modes at selected points without normalization

    Science.gov (United States)

    Kausel, Eduardo

    2018-04-01

    As every textbook on linear algebra demonstrates, the eigenvectors for the general eigenvalue problem |K - λM| = 0 involving two real, symmetric, positive definite matrices K, M satisfy some well-defined orthogonality conditions. Equally well known is the fact that those eigenvectors can be normalized so that their modal mass μ = ϕᵀMϕ is unity: it suffices to divide each unscaled mode by the square root of the modal mass. Thus, the normalization is the result of an explicit calculation applied to the modes after they were obtained by some means. However, we show herein that the normalized modes are not merely convenient forms of scaling, but are actually intrinsic properties of the pair of matrices K, M; that is, the matrices already "know" about normalization even before the modes have been obtained. This means that we can obtain individual components of the normalized modes directly from the eigenvalue problem, without needing to obtain either all of the modes or, for that matter, any one complete mode. These results are achieved by means of the residue theorem of operational calculus, a finding that is rather remarkable inasmuch as the residues themselves do not make use of any orthogonality conditions or normalization in the first place. It appears that this obscure property connecting the general eigenvalue problem of modal analysis with the residue theorem of operational calculus may have been overlooked until now, but it has in turn interesting theoretical implications.
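The underlying identity can be checked numerically. The sketch below is not the paper's operational-calculus derivation; it only verifies the standard modal expansion (K - sM)⁻¹ = Σⱼ ϕⱼϕⱼᵀ/(λⱼ - s) for mass-normalized modes ϕⱼ, so that the residue of the resolvent at λⱼ is the outer product ϕⱼϕⱼᵀ, i.e. the pair (K, M) already "knows" the normalization. The 2×2 matrices are arbitrary examples.

```python
import numpy as np

# Hedged sketch: solve K x = lambda M x via Cholesky reduction, obtain
# mass-normalized modes (phi^T M phi = 1), then check that
# (lambda_1 - s) * (K - s M)^(-1)  ->  phi_1 phi_1^T  as s -> lambda_1.

K = np.array([[4.0, -1.0],
              [-1.0, 3.0]])
M = np.array([[2.0, 0.0],
              [0.0, 1.0]])

# Reduce the generalized problem to a standard symmetric one:
# with M = L L^T and y = L^T x, solve (L^-1 K L^-T) y = lambda y.
L = np.linalg.cholesky(M)
Linv = np.linalg.inv(L)
lam, q = np.linalg.eigh(Linv @ K @ Linv.T)
phi = Linv.T @ q                 # columns come out mass-normalized

modal_mass = phi[:, 0] @ M @ phi[:, 0]   # equals 1 by construction

# Residue check at the first eigenvalue
s = lam[0] - 1e-6
residue = (lam[0] - s) * np.linalg.inv(K - s * M)
outer = np.outer(phi[:, 0], phi[:, 0])
err = np.max(np.abs(residue - outer))
```

Note that the residue side of the check never normalizes anything explicitly; the unit modal mass emerges from the resolvent itself, which is the point of the abstract.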

  15. The Dynamics of Standardization

    DEFF Research Database (Denmark)

    Brunsson, Nils; Rasche, Andreas; Seidl, David

    2012-01-01

    This paper suggests that when the phenomenon of standards and standardization is examined from the perspective of organization studies, three aspects stand out: the standardization of organizations, standardization by organizations and standardization as (a form of) organization. Following a comp...

  16. Existence of a soluble form of CD50 (intercellular adhesion molecule-3) produced upon human lymphocyte activation. Present in normal human serum and levels are increased in the serum of systemic lupus erythematosus patients.

    Science.gov (United States)

    Pino-Otín, M R; Viñas, O; de la Fuente, M A; Juan, M; Font, J; Torradeflot, M; Pallarés, L; Lozano, F; Alberola-Ila, J; Martorell, J

    1995-03-15

    CD50 (ICAM-3) is a leukocyte differentiation Ag expressed almost exclusively on hemopoietic cells, with a key role in the first steps of immune response. To develop a specific sandwich ELISA to detect a soluble CD50 form (sCD50), two different mAbs (140-11 and 101-1D2) recognizing non-overlapping epitopes were used. sCD50 was detected in the supernatant of stimulated PBMCs, with the highest levels after CD3 triggering. Simultaneously, the CD50 surface expression diminished during the first 24 h. sCD50 isolated from culture supernatant and analyzed by immunoblotting showed an apparent m.w. of 95 kDa, slightly smaller than the membrane form. These data, together with Northern blot kinetics analysis, suggest that sCD50 is cleaved from cell membrane. Furthermore, we detect sCD50 in normal human sera and higher levels in sera of systemic lupus erythematosus (SLE) patients, especially in those in active phase. The sCD50 levels showed a positive correlation with sCD27 levels (r = 0.4213; p = 0.0026). Detection of sCD50, both after in vitro CD3 triggering of PBMCs and increased in SLE sera, suggests that sCD50 could be used as a marker of lymphocyte stimulation.

  17. STATISTICAL STUDY OF THE NUMBER OF RESULTING RULES WHEN TRANSFORMING A CONTEXT-FREE GRAMMAR TO CHOMSKY NORMAL FORM

    Directory of Open Access Journals (Sweden)

    Fredy Ángel Miguel Amaya Robayo

    2010-08-01

    It is well known that any context-free grammar can be transformed into Chomsky normal form (CNF) so that the languages generated by the two grammars are equivalent. A grammar in CNF has some advantages: its derivation trees are binary, its rules are simpler, and so on. It is therefore desirable to work with a grammar in CNF in applications that require one. An algorithm exists that transforms a context-free grammar into CNF; however, the number of rules generated by the transformation depends on the number of rules in the initial grammar as well as on other characteristics. In this work we analyze, from an experimental and statistical point of view, the relationship between the number of initial rules and the number of rules that result from transforming a context-free grammar to CNF. This makes it possible to plan the amount of computational resources needed when dealing with grammars of some complexity.
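The rule growth studied above can be made concrete with a stripped-down transformer. This is a simplified sketch, not the full algorithm: it assumes the grammar has no ε- or unit productions and applies only the TERM step (replace terminals in long bodies) and the BIN step (binarize long bodies); the tuple encoding and helper names are my own.

```python
# Hedged sketch: partial CFG -> CNF transformation (TERM + BIN only).
# Rules are (head, body) pairs; lowercase symbols are terminals,
# uppercase symbols are nonterminals.

def to_cnf(rules):
    term_map, out, fresh = {}, [], [0]

    def wrap(sym):
        # TERM: a terminal inside a long body gets its own nonterminal
        if sym.islower():
            if sym not in term_map:
                term_map[sym] = f"T_{sym}"
                out.append((term_map[sym], (sym,)))
            return term_map[sym]
        return sym

    for head, body in rules:
        if len(body) == 1:               # A -> a is already CNF
            out.append((head, body))
            continue
        body = [wrap(s) for s in body]
        # BIN: break bodies longer than 2 with fresh nonterminals
        while len(body) > 2:
            fresh[0] += 1
            new = f"X{fresh[0]}"
            out.append((head, (body[0], new)))
            head, body = new, body[1:]
        out.append((head, tuple(body)))
    return out

# S -> a S b | c : 2 rules in, 5 rules out
g = [("S", ("a", "S", "b")), ("S", ("c",))]
cnf = to_cnf(g)
```

Even this partial transformation turns 2 rules into 5; the full construction (which also eliminates ε- and unit productions) grows the rule count further, which is why the statistical relationship between input and output sizes is worth measuring.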

  18. Teaching Form as Form

    DEFF Research Database (Denmark)

    Keiding, Tina Bering

    2012-01-01

    understanding of form per se, or, to use an expression from this text, of form as form. This challenge can be reduced to one question: how can design teaching support students in achieving not only the ability to recognize and describe different form-related concepts in existing design (i.e. analytical...

  19. Development and inter-rater reliability of a standardized verbal instruction manual for the Chinese Geriatric Depression Scale-short form.

    Science.gov (United States)

    Wong, M T P; Ho, T P; Ho, M Y; Yu, C S; Wong, Y H; Lee, S Y

    2002-05-01

    The Geriatric Depression Scale (GDS) is a common screening tool for elderly depression in Hong Kong. This study aimed at (1) developing a standardized manual for the verbal administration and scoring of the GDS-SF, and (2) comparing the inter-rater reliability between the standardized and non-standardized verbal administration of GDS-SF. Two studies were reported. In Study 1, the process of developing the manual was described. In Study 2, we compared the inter-rater reliabilities of GDS-SF scores using the standardized verbal instructions and the traditional non-standardized administration. Results of Study 2 indicated that the standardized procedure in verbal administration and scoring improved the inter-rater reliabilities of GDS-SF. Copyright 2002 John Wiley & Sons, Ltd.

  20. Decommissioning standards

    International Nuclear Information System (INIS)

    Crofford, W.N.

    1980-01-01

    EPA has agreed to establish a series of environmental standards for the safe disposal of radioactive waste through participation in the Interagency Review Group on Nuclear Waste Management (IRG). One of the standards required under the IRG is the standard for decommissioning of radioactive contaminated sites, facilities, and materials. This standard is to be proposed by December 1980 and promulgated by December 1981. Several considerations are important in establishing these standards. This study includes discussions of some of these considerations and attempts to evaluate their relative importance. Items covered include: the form of the standards, timing for decommissioning, occupational radiation protection, costs and financial provisions. 4 refs

  1. Malware Normalization

    OpenAIRE

    Christodorescu, Mihai; Kinder, Johannes; Jha, Somesh; Katzenbeisser, Stefan; Veith, Helmut

    2005-01-01

    Malware is code designed for a malicious purpose, such as obtaining root privilege on a host. A malware detector identifies malware and thus prevents it from adversely affecting a host. In order to evade detection by malware detectors, malware writers use various obfuscation techniques to transform their malware. There is strong evidence that commercial malware detectors are susceptible to these evasion tactics. In this paper, we describe the design and implementation of a malware normalizer ...

  2. Normalizing tweets with edit scripts and recurrent neural embeddings

    NARCIS (Netherlands)

    Chrupala, Grzegorz; Toutanova, Kristina; Wu, Hua

    2014-01-01

    Tweets often contain a large proportion of abbreviations, alternative spellings, novel words and other non-canonical language. These features are problematic for standard language analysis tools and it can be desirable to convert them to canonical form. We propose a novel text normalization model

  3. Normal accidents

    International Nuclear Information System (INIS)

    Perrow, C.

    1989-01-01

    The author has chosen numerous concrete examples to illustrate the hazardousness inherent in high-risk technologies. Starting with the TMI reactor accident in 1979, he shows that it is not only the nuclear energy sector that bears the risk of 'normal accidents', but also quite a number of other technologies and industrial sectors, or research fields. The author refers to the petrochemical industry, shipping, air traffic, large dams, mining activities, and genetic engineering, showing that due to the complexity of the systems and their manifold, rapidly interacting processes, accidents happen that cannot be thoroughly calculated, and hence are unavoidable. (orig./HP) [de]

  4. Pursuing Normality

    DEFF Research Database (Denmark)

    Madsen, Louise Sofia; Handberg, Charlotte

    2018-01-01

    BACKGROUND: The present study explored the reflections on cancer survivorship care of lymphoma survivors in active treatment. Lymphoma survivors have survivorship care needs, yet their participation in cancer survivorship care programs is still reported as low. OBJECTIVE: The aim of this study was to understand the reflections on cancer survivorship care of lymphoma survivors to aid the future planning of cancer survivorship care and overcome barriers to participation. METHODS: Data were generated in a hematological ward during 4 months of ethnographic fieldwork, including participant observation and 46 […] implying an influence on whether to participate in cancer survivorship care programs. Because of "pursuing normality," 8 of 9 participants opted out of cancer survivorship care programming due to prospects of "being cured" and perceptions of cancer survivorship care as "a continuation of the disease […]"

  5. Accurate quantification of sphingosine-1-phosphate in normal and Fabry disease plasma, cells and tissues by LC-MS/MS with (13)C-encoded natural S1P as internal standard.

    Science.gov (United States)

    Mirzaian, Mina; Wisse, Patrick; Ferraz, Maria J; Marques, André R A; Gabriel, Tanit L; van Roomen, Cindy P A A; Ottenhoff, Roelof; van Eijk, Marco; Codée, Jeroen D C; van der Marel, Gijsbert A; Overkleeft, Herman S; Aerts, Johannes M

    2016-08-01

    We developed a mass spectrometric procedure to quantify sphingosine-1-phosphate (S1P) in biological materials. The use of newly synthesized (13)C5 C18-S1P and commercial C17-S1P as internal standards rendered very similar results with respect to linearity, limit of detection and limit of quantitation. Caution is warranted with determination of plasma S1P levels. Earlier it was reported that S1P is elevated in plasma of Fabry disease patients. We investigated this with the improved quantification. No clear conclusion could be drawn for patient plasma samples given the lack of uniformity of blood collection and plasma preparation. To still obtain insight, plasma and tissues were identically collected from α-galactosidase A-deficient Fabry mice and matched control animals. No significant difference was observed in plasma S1P levels. A significant 2.3-fold increase was observed in the kidney of Fabry mice, but not in liver and heart. Comparative analysis of S1P in cultured fibroblasts from normal subjects and classically affected Fabry disease males revealed no significant difference. In conclusion, accurate quantification of S1P in biological materials is feasible by mass spectrometry using the internal standards (13)C5 C18-S1P or C17-S1P. Significant local increases of S1P in the kidney might occur in Fabry disease, as suggested by the mouse model. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Variation in standards of research compensation and child assent practices: a comparison of 69 institutional review board-approved informed permission and assent forms for 3 multicenter pediatric clinical trials.

    Science.gov (United States)

    Kimberly, Michael B; Hoehn, K Sarah; Feudtner, Chris; Nelson, Robert M; Schreiner, Mark

    2006-05-01

    To systematically compare standards for compensation and child participant assent in informed permission, assent, and consent forms (IP-A-CFs) approved by 55 local institutional review boards (IRBs) reviewing 3 standardized multicenter research protocols. Sixty-nine principal investigators participating in any of 3 national, multicenter clinical trials submitted standardized research protocols for their trials to their local IRBs for approval. Copies of the subsequently IRB-approved IP-A-CFs were then forwarded to an academic clinical research organization. This collection of IRB-approved forms allowed for a quasi-experimental retrospective evaluation of the variation in informed permission, assent, and consent standards operationalized by the local IRBs. Standards for compensation and child participant assent varied substantially across the 69 IRB-approved IP-A-CFs. Among the 48 IP-A-CFs offering compensation, monetary compensation was offered by 33 as reimbursement for travel, parking, or food expenses, whereas monetary or material compensation was offered by 22 for subject inconvenience and by 13 for subject time. Compensation ranged widely within and across studies (study 1, $180-1425; study 2, $0-500; and study 3, $0-100). Regarding child participant assent, among the 57 IP-A-CFs that included a form of assent documentation, 33 included a line for assent on the informed permission or consent form, whereas 35 included a separate form written in simplified language. Of the IP-A-CFs that stipulated the documentation of assent, 31 specified one or more age ranges for obtaining assent. Informed permission or consent forms were addressed either to parents or to child participants. In response to identical clinical trial protocols, local IRBs generate IP-A-CFs that vary considerably regarding compensation and child participant assent.

  7. Normalization of satellite imagery

    Science.gov (United States)

    Kim, Hongsuk H.; Elman, Gregory C.

    1990-01-01

    Sets of Thematic Mapper (TM) imagery taken over the Washington, DC metropolitan area during the months of November, March and May were converted into a form of ground reflectance imagery. This conversion was accomplished by adjusting the incident sunlight and view angles and by applying a pixel-by-pixel correction for atmospheric effects. Seasonal color changes of the area can be better observed when such normalization is applied to space imagery taken in time series. In normalized imagery, the grey scale depicts variations in surface reflectance and tonal signature of multi-band color imagery can be directly interpreted for quantitative information of the target.

  8. A Mercury programme (Autocode programme 5675) for transforming data on the angular distribution of elastically scattered neutrons to one standard form

    International Nuclear Information System (INIS)

    King, D.C.

    1964-04-01

    Data on the angular distribution of elastically scattered neutrons are reported in a variety of different forms. The Mercury Autocode programme 5675 transforms the data into a tabular representation of the form (cos θ, p(cos θ)), where p(cos θ) is the normalised probability distribution and θ is the scattering angle in the centre-of-mass frame of reference. Output on cards punched in the format of the U.K.A.E.A. nuclear data library is optional. (author)
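The target representation can be sketched in a modern re-statement (not the original Mercury autocode): whatever form the input takes, the final table rescales the tabulated weights so that the trapezoidal integral of p(cos θ) over cos θ ∈ [-1, 1] is unity. The table values below are illustrative.

```python
import numpy as np

# Hedged sketch: normalise a tabulated angular distribution so that the
# trapezoidal integral of p(cos theta) d(cos theta) equals 1.

def normalise_table(cos_theta, weights):
    cos_theta = np.asarray(cos_theta, dtype=float)
    weights = np.asarray(weights, dtype=float)
    # trapezoidal integral of the tabulated weights over cos(theta)
    integral = 0.5 * np.sum((weights[1:] + weights[:-1]) * np.diff(cos_theta))
    return weights / integral

# Illustrative forward-peaked distribution tabulated at 5 angles
mu = np.linspace(-1.0, 1.0, 5)
p = normalise_table(mu, [1.0, 2.0, 4.0, 2.0, 1.0])

# integral of the normalised table, same trapezoidal rule
total = 0.5 * np.sum((p[1:] + p[:-1]) * np.diff(mu))
```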

  9. Ascorbate/menadione-induced oxidative stress kills cancer cells that express normal or mutated forms of the oncogenic protein Bcr-Abl. An in vitro and in vivo mechanistic study.

    Science.gov (United States)

    Beck, Raphaël; Pedrosa, Rozangela Curi; Dejeans, Nicolas; Glorieux, Christophe; Levêque, Philippe; Gallez, Bernard; Taper, Henryk; Eeckhoudt, Stéphane; Knoops, Laurent; Calderon, Pedro Buc; Verrax, Julien

    2011-10-01

    Numerous studies suggest that generation of oxidative stress could be useful in cancer treatment. In this study, we evaluated, in vitro and in vivo, the antitumor potential of oxidative stress induced by ascorbate/menadione (asc/men). This combination of a reducing agent (ascorbate) and a redox active quinone (menadione) generates redox cycling leading to formation of reactive oxygen species (ROS). Asc/men was tested in several cell types including K562 cells (a stable human-derived leukemia cell line), freshly isolated leukocytes from patients with chronic myeloid leukemia, BaF3 cells (a murine pro-B cell line) transfected with Bcr-Abl and peripheral blood leukocytes derived from healthy donors. Although these latter cells were resistant to asc/men, survival of all the other cell lines was markedly reduced, including the BaF3 cells expressing either wild-type or mutated Bcr-Abl. In a standard in vivo model of subcutaneous tumor transplantation, asc/men provoked a significant delay in the proliferation of K562 and BaF3 cells expressing the T315I mutated form of Bcr-Abl. No effect of asc/men was observed when these latter cells were injected into blood of mice most probably because of the high antioxidant potential of red blood cells, as shown by in vitro experiments. We postulate that cancer cells are more sensitive to asc/men than healthy cells because of their lack of antioxidant enzymes, mainly catalase. The mechanism underlying this cytotoxicity involves the oxidative cleavage of Hsp90 with a subsequent loss of its chaperone function thus leading to degradation of wild-type and mutated Bcr-Abl protein.

  10. Bicervical normal uterus with normal vagina | Okeke | Annals of ...

    African Journals Online (AJOL)

    To the best of our knowledge, only a few cases of bicervical normal uterus with normal vagina exist in the literature; one of the cases had an anterior-posterior disposition. This form of uterine abnormality is not explicable by the existing classical theory of mullerian anomalies and suggests that a complex interplay of events …

  11. Smooth quantile normalization.

    Science.gov (United States)

    Hicks, Stephanie C; Okrah, Kwame; Paulson, Joseph N; Quackenbush, John; Irizarry, Rafael A; Bravo, Héctor Corrada

    2018-04-01

    Between-sample normalization is a critical step in genomic data analysis to remove systematic bias and unwanted technical variation in high-throughput data. Global normalization methods are based on the assumption that observed variability in global properties is due to technical reasons and are unrelated to the biology of interest. For example, some methods correct for differences in sequencing read counts by scaling features to have similar median values across samples, but these fail to reduce other forms of unwanted technical variation. Methods such as quantile normalization transform the statistical distributions across samples to be the same and assume global differences in the distribution are induced by only technical variation. However, it remains unclear how to proceed with normalization if these assumptions are violated, for example, if there are global differences in the statistical distributions between biological conditions or groups, and external information, such as negative or control features, is not available. Here, we introduce a generalization of quantile normalization, referred to as smooth quantile normalization (qsmooth), which is based on the assumption that the statistical distribution of each sample should be the same (or have the same distributional shape) within biological groups or conditions, but allowing that they may differ between groups. We illustrate the advantages of our method on several high-throughput datasets with global differences in distributions corresponding to different biological conditions. We also perform a Monte Carlo simulation study to illustrate the bias-variance tradeoff and root mean squared error of qsmooth compared to other global normalization methods. A software implementation is available from https://github.com/stephaniehicks/qsmooth.
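The idea behind qsmooth can be sketched with a simplified estimator. This is not the authors' method: qsmooth computes a data-driven weight per quantile, whereas the sketch below uses one fixed weight `w` to shrink each sample's sorted values between the overall mean quantile curve and its biological group's mean quantile curve (`w=1` recovers plain quantile normalization, `w=0` is purely within-group). The data and group labels are synthetic.

```python
import numpy as np

# Hedged sketch of the qsmooth idea with a fixed weight w.
# X: features x samples; groups: per-sample biological condition labels.

def smooth_quantile_normalize(X, groups, w=0.5):
    order = np.argsort(X, axis=0)          # rank of each value per sample
    Xsort = np.sort(X, axis=0)
    overall = Xsort.mean(axis=1)           # overall mean quantile curve
    out = np.empty_like(X, dtype=float)
    for g in np.unique(groups):
        cols = np.where(groups == g)[0]
        group_ref = Xsort[:, cols].mean(axis=1)   # group mean quantiles
        target = w * overall + (1 - w) * group_ref
        for c in cols:
            # assign the i-th target quantile to the i-th smallest value
            out[order[:, c], c] = target
    return out

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 6))
X[:, 3:] += 2.0                     # group B has a global shift
groups = np.array(["A", "A", "A", "B", "B", "B"])

Xq_group = smooth_quantile_normalize(X, groups, w=0.0)  # within-group QN
Xq_all = smooth_quantile_normalize(X, groups, w=1.0)    # ordinary QN
```

With `w=0` the group B shift survives normalization (treated as biology); with `w=1` it is removed (treated as a technical artifact) — the choice of weight is exactly the assumption the abstract discusses.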

  12. Screen-Printed Electrode Modified by Bismuth/Fe3O4 Nanoparticle/Ionic Liquid Composite Using Internal Standard Normalization for Accurate Determination of Cd(II) in Soil

    Directory of Open Access Journals (Sweden)

    Hui Wang

    2017-12-01

    Full Text Available The quality and safety of agricultural products are threatened by heavy metal ions in soil, which can be absorbed by crops and then accumulated in the human body through the food chain. In this paper, we report a low-cost and easy-to-use screen-printed electrode (SPE) for cadmium ion (Cd(II)) detection based on differential pulse voltammetry (DPV), decorated with ionic liquid (IL) and magnetite nanoparticles (Fe3O4) and coated with a deposited bismuth film (Bi). The characteristics of the Bi/Fe3O4/ILSPE were investigated using scanning electron microscopy, cyclic voltammetry, impedance spectroscopy, and linear sweep voltammetry. We found that the sensitivity of the SPE improved dramatically after functionalization with Bi/Fe3O4/IL. Under optimized conditions, the concentration of Cd(II) is linear with the current response in a range from 0.5 to 40 µg/L, with a detection limit of 0.05 µg/L (S/N = 3). Additionally, internal standard normalization (ISN) was used to process the response signals of the Bi/Fe3O4/ILSPE and to establish a new linear equation. For detecting three different Cd(II) concentrations, the root-mean-square error using ISN (0.25) is lower than that of the linear method (0.36). Finally, the proposed electrode was applied to trace Cd(II) in soil samples, with recoveries in the range from 91.77 to 107.83%.

  13. Normatização do eletrorretinograma por reversão alternada de padrões em voluntários normais Standardization of pattern electroretinograms by alternate reversion in normal volunteers

    Directory of Open Access Journals (Sweden)

    Andréa Mara Simões Torigoe

    2003-08-01

    Full Text Available PURPOSE: To standardize the pattern-reversal electroretinogram in ophthalmologically normal individuals without associated neurological diseases, determining the normal range stratified by sex, age group and stimulus used. METHODS: The standardization followed the model proposed by the International Organization of Electroretinography and was specific for the evoked potentials laboratory of the Department of Neurology, Faculty of Medical Sciences, State University of Campinas. Two types of stimuli were used: the so-called stimulus 16, which provides a visual angle of 60 minutes of arc, and stimulus 32, which provides a visual angle of 30 minutes. RESULTS: In all patients a positive wave, internationally defined as P50, and a negative wave, called N95, were obtained without artifacts. Normal ranges were observed containing the means of the latencies, amplitudes and durations of the internationally accepted positive and negative curves. The P50 and N95 waves showed significant differences in amplitude, latency and duration when compared across age groups, with wave amplitude decreasing and the total latency of the electroretinogram increasing with age. Tables with 95% prediction intervals with respect to age were constructed for the amplitude, latency and duration of the P50 and N95 curves. CONCLUSIONS: The standardization of the pattern-reversal electroretinogram provides reproducibility of results and the possibility of comparative studies.

  14. Identification of different shapes, colors and sizes of standard oral dosage forms in diabetes type 2 patients-A pilot study.

    Science.gov (United States)

    Stegemann, Sven; Riedl, Regina; Sourij, Harald

    2017-01-30

    The clear identification of drug products by patients is essential for safe and effective medication management. In order to understand the impact of shape, size and color on medication identification, a study was performed in subjects with type 2 diabetes mellitus (T2D). Ten model drugs differentiated by shape, size and color were evaluated using a mixed method of medication schedule preparation by the participants followed by a semi-structured interview. Detection times were fastest for the large round tablet shape and the bi-chromatic forms. Larger sizes were easier to identify than smaller sizes, except for the bi-chromatic forms. Shape was the major source of errors, followed by size and color. The results from this study suggest that color as a single dimension is perceived more effectively by subjects with T2D than shape and size, which require a more demanding processing of three dimensions and depend on perspective. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Effects of variable transformations on errors in FORM results

    International Nuclear Information System (INIS)

    Qin Quan; Lin Daojin; Mei Gang; Chen Hao

    2006-01-01

    On the basis of studies of the second partial derivatives of the variable transformation functions for nine different non-normal variables, the paper comprehensively discusses the effects of the transformation on FORM results and shows that the signs and magnitudes of the errors in FORM results depend on the distributions of the basic variables, on whether the basic variables represent resistances or actions, and on the design point locations in the standard normal space. The transformations of exponential or Gamma resistance variables can generate +24% errors in the FORM failure probability, and the transformation of Frechet action variables can generate -31% errors
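    The variable transformation in question is the isoprobabilistic map u = Φ⁻¹(F(x)) that FORM uses to carry each non-normal basic variable into standard normal space. A minimal sketch (the exponential distribution and the sample value are illustrative choices, not the paper's cases):

```python
import math
from statistics import NormalDist

def to_standard_normal(x, cdf):
    """Isoprobabilistic transform u = Phi^-1(F(x)): map a non-normal basic
    variable into standard normal space while preserving probability content."""
    return NormalDist().inv_cdf(cdf(x))

# Example: an exponential basic variable with mean 1 (illustrative)
expon_cdf = lambda x: 1.0 - math.exp(-x)
u = to_standard_normal(1.0, expon_cdf)
# By construction, Phi(u) equals F(x) for the original variable.
```

    The nonlinearity of this map (its second derivatives) is exactly what distorts a linearized limit state around the design point, producing the distribution-dependent errors the abstract quantifies.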

  16. Standard test method for accelerated leach test for diffusive releases from solidified waste and a computer program to model diffusive, fractional leaching from cylindrical waste forms

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This test method provides procedures for measuring the leach rates of elements from a solidified matrix material, determining if the releases are controlled by mass diffusion, computing values of diffusion constants based on models, and verifying projected long-term diffusive releases. This test method is applicable to any material that does not degrade or deform during the test. 1.1.1 If mass diffusion is the dominant step in the leaching mechanism, then the results of this test can be used to calculate diffusion coefficients using mathematical diffusion models. A computer program developed for that purpose is available as a companion to this test method (Note 1). 1.1.2 It should be verified that leaching is controlled by diffusion by a means other than analysis of the leach test solution data. Analysis of concentration profiles of species of interest near the surface of the solid waste form after the test is recommended for this purpose. 1.1.3 Potential effects of partitioning on the test results can...

  17. Normal Pressure Hydrocephalus (NPH)

    Science.gov (United States)

    Normal pressure hydrocephalus is a brain disorder ... Normal pressure hydrocephalus occurs when excess cerebrospinal fluid ...

  18. Corticocortical feedback increases the spatial extent of normalization.

    Science.gov (United States)

    Nassi, Jonathan J; Gómez-Laberge, Camille; Kreiman, Gabriel; Born, Richard T

    2014-01-01

    Normalization has been proposed as a canonical computation operating across different brain regions, sensory modalities, and species. It provides a good phenomenological description of non-linear response properties in primary visual cortex (V1), including the contrast response function and surround suppression. Despite its widespread application throughout the visual system, the underlying neural mechanisms remain largely unknown. We recently observed that corticocortical feedback contributes to surround suppression in V1, raising the possibility that feedback acts through normalization. To test this idea, we characterized area summation and contrast response properties in V1 with and without feedback from V2 and V3 in alert macaques and applied a standard normalization model to the data. Area summation properties were well explained by a form of divisive normalization, which computes the ratio between a neuron's driving input and the spatially integrated activity of a "normalization pool." Feedback inactivation reduced surround suppression by shrinking the spatial extent of the normalization pool. This effect was independent of the gain modulation thought to mediate the influence of contrast on area summation, which remained intact during feedback inactivation. Contrast sensitivity within the receptive field center was also unaffected by feedback inactivation, providing further evidence that feedback participates in normalization independent of the circuit mechanisms involved in modulating contrast gain and saturation. These results suggest that corticocortical feedback contributes to surround suppression by increasing the visuotopic extent of normalization and, via this mechanism, feedback can play a critical role in contextual information processing.
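    The divisive-normalization form referred to here computes the ratio of a neuron's driving input to the pooled activity of its normalization pool, and the paper's finding is that feedback inactivation shrinks the pool's spatial extent. A minimal sketch of this standard formulation (parameter names and values are generic, not the model fit in the paper):

```python
def divisive_normalization(drive, pool, sigma=1.0, n=2.0):
    """Standard divisive normalization:
    response = drive^n / (sigma^n + sum of pool activities^n).
    A wider spatial pool yields stronger surround suppression for stimuli
    extending beyond the receptive field center."""
    return drive ** n / (sigma ** n + sum(a ** n for a in pool))

# Same driving input; a spatially larger pool suppresses the response more.
small_pool_response = divisive_normalization(2.0, pool=[1.0, 1.0])
large_pool_response = divisive_normalization(2.0, pool=[1.0, 1.0, 1.0, 1.0])
```

    In this picture, inactivating feedback corresponds to dropping the most distant contributors to `pool`, reducing suppression without changing the semisaturation constant `sigma` that governs contrast gain.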

  20. Normal growth spurt and final height despite low levels of all forms of circulating insulin-like growth factor-I in a patient with acid-labile subunit deficiency

    DEFF Research Database (Denmark)

    Domené, Horacio M; Martínez, Alicia S; Frystyk, Jan

    2007-01-01

    BACKGROUND: In a recently described patient with acid-labile subunit (ALS) deficiency, the inability to form ternary complexes resulted in a marked reduction in circulating total insulin-like growth factor (IGF)-I, whereas skeletal growth was only marginally affected. To further study the role of...

  1. Cephalometric-radiographic study, in lateral norm, considering the established standards of white Brazilian teenagers who presented normal occlusions and mal-occlusions of Class I and Class II, 1st Division and the ones from Ricketts' analysis

    International Nuclear Information System (INIS)

    Bismarck, V.E.

    1986-01-01

    In the present work, our purpose was to make a cephalometric-radiographic study comparing white Brazilian teenagers who presented normal occlusion with those who presented malocclusions of Class I and Class II, according to Ricketts' analysis (1960). (author) [pt

  2. Three forms of relativity

    International Nuclear Information System (INIS)

    Strel'tsov, V.N.

    1992-01-01

    The physical sense of three forms of relativity is discussed. The first, the instant form, represents in fact the traditional approach based on the concept of instant distance. The normal form corresponds to the radar formulation, which is based on light, or retarded, distances. The front form in the special case is characterized by 'observable' variables, and the known method of the k-coefficient is its obvious expression. 16 refs

  3. Normalization: A Preprocessing Stage

    OpenAIRE

    Patro, S. Gopal Krishna; Sahu, Kishore Kumar

    2015-01-01

    Normalization is a pre-processing stage for any type of problem statement. In particular, normalization plays an important role in fields such as soft computing and cloud computing for manipulating data, scaling the range of data down or up before it is used in a further stage. There are several normalization techniques, namely Min-Max normalization, Z-score normalization and Decimal scaling normalization. So by referring to these normalization techniques we are ...
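    The three techniques named in the abstract can be sketched in a few lines (a minimal illustration, not the paper's code; the population standard deviation is assumed for the Z-score):

```python
import math

def min_max(xs, new_min=0.0, new_max=1.0):
    """Min-Max normalization: linearly rescale values into [new_min, new_max]."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) * (new_max - new_min) + new_min for x in xs]

def z_score(xs):
    """Z-score normalization: shift to mean 0 and scale to unit (population)
    standard deviation."""
    mean = sum(xs) / len(xs)
    std = math.sqrt(sum((x - mean) ** 2 for x in xs) / len(xs))
    return [(x - mean) / std for x in xs]

def decimal_scaling(xs):
    """Decimal scaling: divide by 10^j, the smallest power of ten that brings
    every |x| below 1."""
    j = math.floor(math.log10(max(abs(x) for x in xs))) + 1
    return [x / 10 ** j for x in xs]
```

    For example, `min_max([10.0, 20.0, 30.0])` gives `[0.0, 0.5, 1.0]`, while decimal scaling of the same data divides by 100, giving values of at most 0.3 in magnitude.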

  4. The construction of normal expectations

    DEFF Research Database (Denmark)

    Quitzau, Maj-Britt; Røpke, Inge

    2008-01-01

    The gradual upward changes of standards in normal everyday life have significant environmental implications, and it is therefore important to study how these changes come about. The intention of the article is to analyze the social construction of normal expectations through a case study. The case...... concerns the present boom in bathroom renovations in Denmark, which offers an excellent opportunity to study the interplay between a wide variety of consumption drivers and social changes pointing toward long-term changes of normal expectations regarding bathroom standards. The study is problemoriented...... and transdisciplinary and draws on a wide range of sociological, anthropological, and economic theories. The empirical basis comprises a combination of statistics, a review of magazine and media coverage, visits to exhibitions, and qualitative interviews. A variety of consumption drivers are identified. Among...

  5. Manufacturing technology for practical Josephson voltage normals

    International Nuclear Information System (INIS)

    Kohlmann, Johannes; Kieler, Oliver

    2016-01-01

    In this contribution we present the manufacturing technology for the fabrication of integrated superconducting Josephson series circuits for voltage normals. First we summarize some foundations of Josephson voltage normals and sketch the concept and the setup of the circuits, before we describe the manufacturing technology for modern practical Josephson voltage normals.

  6. Complete Normal Ordering 1: Foundations

    CERN Document Server

    Ellis, John; Skliros, Dimitri P.

    2016-01-01

    We introduce a new prescription for quantising scalar field theories perturbatively around a true minimum of the full quantum effective action, which is to `complete normal order' the bare action of interest. When the true vacuum of the theory is located at zero field value, the key property of this prescription is the automatic cancellation, to any finite order in perturbation theory, of all tadpole and, more generally, all `cephalopod' Feynman diagrams. The latter are connected diagrams that can be disconnected into two pieces by cutting one internal vertex, with either one or both pieces free from external lines. In addition, this procedure of `complete normal ordering' (which is an extension of the standard field theory definition of normal ordering) reduces by a substantial factor the number of Feynman diagrams to be calculated at any given loop order. We illustrate explicitly the complete normal ordering procedure and the cancellation of cephalopod diagrams in scalar field theories with non-derivative i...

  7. Denotational Aspects of Untyped Normalization by Evaluation

    DEFF Research Database (Denmark)

    Filinski, Andrzej; Rohde, Henning Korsholm

    2005-01-01

    of soundness (the output term, if any, is in normal form and β-equivalent to the input term); identification (β-equivalent terms are mapped to the same result); and completeness (the function is defined for all terms that do have normal forms). We also show how the semantic construction enables a simple yet...... formal correctness proof for the normalization algorithm, expressed as a functional program in an ML-like, call-by-value language. Finally, we generalize the construction to produce an infinitary variant of normal forms, namely Böhm trees. We show that the three-part characterization of correctness

  8. Theory of normal metals

    International Nuclear Information System (INIS)

    Mahan, G.D.

    1992-01-01

    The organizers requested that I give eight lectures on the theory of normal metals, ''with an eye on superconductivity.'' My job was to cover the general properties of metals. The topics were selected according to what the students would need to know for the following lectures on superconductivity. My role was to prepare the groundwork for the later lectures. The problem is that there is not yet a widely accepted theory for the mechanism which pairs the electrons. Many mechanisms have been proposed, with those of phonons and spin fluctuations having the most followers. So I tried to discuss both topics. I also introduced the tight-binding model for metals, which forms the basis for most of the work on the cuprate superconductors

  9. PowerForms

    DEFF Research Database (Denmark)

    Brabrand, Claus; Møller, Anders; Ricky, Mikkel

    2000-01-01

    All uses of HTML forms may benefit from validation of the specified input field values. Simple validation matches individual values against specified formats, while more advanced validation may involve interdependencies of form fields. There is currently no standard for specifying or implementing...

  10. The "Second Place" Problem: Assistive Technology in Sports and (Re) Constructing Normal.

    Science.gov (United States)

    Baker, D A

    2016-02-01

    Objections to the use of assistive technologies (such as prostheses) in elite sports are generally raised when the technology in question is perceived to afford the user a potentially "unfair advantage," when it is perceived as a threat to the purity of the sport, and/or when it is perceived as a precursor to a slippery slope toward undesirable changes in the sport. These objections rely on being able to quantify standards of "normal" within a sport so that changes attributed to the use of assistive technology can be judged as causing a significant deviation from some baseline standard. This holds athletes using assistive technologies accountable to standards that restrict their opportunities to achieve greatness, while athletes who do not use assistive technologies are able to push beyond the boundaries of these standards without moral scrutiny. This paper explores how constructions of fairness and "normality" impact athletes who use assistive technology to compete in a sporting venue traditionally populated with "able-bodied" competitors. It argues that the dynamic and obfuscated construction of "normal" standards in elite sports should move away from using body performance as the measuring stick of "normal," toward alternate forms of constructing norms such as defining, quantifying, and regulating the mechanical actions that constitute the critical components of a sport. Though framed within the context of elite sports, this paper can be interpreted more broadly to consider problems with defining "normal" bodies in a society in which technologies are constantly changing our abilities and expectations of what normal means.

  11. MR guided spatial normalization of SPECT scans

    International Nuclear Information System (INIS)

    Crouch, B.; Barnden, L.R.; Kwiatek, R.

    2010-01-01

    Full text: In SPECT population studies where magnetic resonance (MR) scans are also available, the higher resolution of the MR scans allows for an improved spatial normalization of the SPECT scans. In this approach, the SPECT images are first coregistered to their corresponding MR images by a linear (affine) transformation, which is calculated using SPM's mutual information maximization algorithm. Non-linear spatial normalization maps are then computed either directly from the MR scans using SPM's built-in spatial normalization algorithm, or from segmented T1 MR images using DARTEL, an advanced diffeomorphism-based spatial normalization algorithm. We compare these MR-based methods to standard SPECT-based spatial normalization for a population of 27 fibromyalgia patients and 25 healthy controls with spin echo T1 scans. We identify significant perfusion deficits in prefrontal white matter in FM patients, with the DARTEL-based spatial normalization procedure yielding stronger statistics than the standard SPECT-based spatial normalization. (author)

  12. (EOI) Form

    International Development Research Centre (IDRC) Digital Library (Canada)

    Dorine Odongo

    COLLABORATING TECHNICAL AGENCIES: EXPRESSION OF INTEREST FORM. • Please read the information provided about the initiative and the eligibility requirements in the Prospectus before completing this application form. • Ensure all the sections of the form are accurately completed and saved in PDF format.

  13. Modular forms

    NARCIS (Netherlands)

    Edixhoven, B.; van der Geer, G.; Moonen, B.; Edixhoven, B.; van der Geer, G.; Moonen, B.

    2008-01-01

    Modular forms are functions with an enormous amount of symmetry that play a central role in number theory, connecting it with analysis and geometry. They have played a prominent role in mathematics since the 19th century and their study continues to flourish today. Modular forms formed the

  14. Accurate quantification of sphingosine-1-phosphate in normal and Fabry disease plasma, cells and tissues by LC-MS/MS with (13)C-encoded natural S1P as internal standard

    NARCIS (Netherlands)

    Mirzaian, Mina; Wisse, Patrick; Ferraz, Maria J.; Marques, André R. A.; Gabriel, Tanit L.; van Roomen, Cindy P. A. A.; Ottenhoff, Roelof; van Eijk, Marco; Codée, Jeroen D. C.; van der Marel, Gijsbert A.; Overkleeft, Herman S.; Aerts, Johannes M.

    2016-01-01

    We developed a mass spectrometric procedure to quantify sphingosine-1-phosphate (S1P) in biological materials. The use of newly synthesized (13)C5 C18-S1P and commercial C17-S1P as internal standards rendered very similar results with respect to linearity, limit of detection and limit of

  15. Robust Confidence Interval for a Ratio of Standard Deviations

    Science.gov (United States)

    Bonett, Douglas G.

    2006-01-01

    Comparing variability of test scores across alternate forms, test conditions, or subpopulations is a fundamental problem in psychometrics. A confidence interval for a ratio of standard deviations is proposed that performs as well as the classic method with normal distributions and performs dramatically better with nonnormal distributions. A simple…

  16. Normalized cDNA libraries

    Science.gov (United States)

    Soares, Marcelo B.; Efstratiadis, Argiris

    1997-01-01

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to moderate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library.

  17. Random Generators and Normal Numbers

    OpenAIRE

    Bailey, David H.; Crandall, Richard E.

    2002-01-01

    Pursuant to the authors' previous chaotic-dynamical model for random digits of fundamental constants, we investigate a complementary, statistical picture in which pseudorandom number generators (PRNGs) are central. Some rigorous results are achieved: We establish b-normality for constants of the form $\sum_i 1/(b^{m_i} c^{n_i})$ for certain sequences $(m_i), (n_i)$ of integers. This work unifies and extends previously known classes of explicit normals. We prove that for coprime $b,c>1$ the...

  18. Normal gravity field in relativistic geodesy

    Science.gov (United States)

    Kopeikin, Sergei; Vlasov, Igor; Han, Wen-Biao

    2018-02-01

    Modern geodesy is subject to a dramatic change from the Newtonian paradigm to Einstein's theory of general relativity. This is motivated by the ongoing advance in development of quantum sensors for applications in geodesy including quantum gravimeters and gradiometers, atomic clocks and fiber optics for making ultra-precise measurements of the geoid and multipolar structure of the Earth's gravitational field. At the same time, very long baseline interferometry, satellite laser ranging, and global navigation satellite systems have achieved an unprecedented level of accuracy in measuring 3-d coordinates of the reference points of the International Terrestrial Reference Frame and the world height system. The main geodetic reference standard to which gravimetric measurements of the Earth's gravitational field are referred is a normal gravity field represented in the Newtonian gravity by the field of a uniformly rotating, homogeneous Maclaurin ellipsoid whose mass and quadrupole moment are equal to the total mass and (tide-free) quadrupole moment of the Earth's gravitational field. The present paper extends the concept of the normal gravity field from the Newtonian theory to the realm of general relativity. We focus our attention on the calculation of the post-Newtonian approximation of the normal field that is sufficient for current and near-future practical applications. We show that in general relativity the level surface of homogeneous and uniformly rotating fluid is no longer described by the Maclaurin ellipsoid in the most general case but represents an axisymmetric spheroid of the fourth order with respect to the geodetic Cartesian coordinates. At the same time, admitting a post-Newtonian inhomogeneity of the mass density in the form of concentric elliptical shells allows one to preserve the level surface of the fluid as an exact ellipsoid of rotation. We parametrize the mass density distribution and the level surface with two parameters which are

  19. Standard Errors for Matrix Correlations.

    Science.gov (United States)

    Ogasawara, Haruhiko

    1999-01-01

    Derives the asymptotic standard errors and intercorrelations for several matrix correlations assuming multivariate normality for manifest variables and derives the asymptotic standard errors of the matrix correlations for two factor-loading matrices. (SLD)

  20. ESCA studies on leached glass forms

    International Nuclear Information System (INIS)

    Dawkins, B.G.

    1979-01-01

    Electron Spectroscopy for Chemical Analysis (ESCA) results for frit, obsidian, NBS standard, and Savannah River Laboratory (SRL) glass forms that have been subjected to cumulative water leachings of 36 hours show that [Na] exhibits the largest and fastest change of all the elements observed. Leaching of surface Na occurred within minutes. Surface Na depletion increased with leach time. Continuous x-ray irradiation and argon ion milling induced Na mobility, precluding semiquantitative ESCA analysis at normal operating temperatures. However, the sample stage has been equipped with a liquid nitrogen supply and alkali mobility should be eliminated in future work

  1. A compiler for variational forms

    OpenAIRE

    Kirby, Robert C.; Logg, Anders

    2011-01-01

    As a key step towards a complete automation of the finite element method, we present a new algorithm for automatic and efficient evaluation of multilinear variational forms. The algorithm has been implemented in the form of a compiler, the FEniCS Form Compiler FFC. We present benchmark results for a series of standard variational forms, including the incompressible Navier-Stokes equations and linear elasticity. The speedup compared to the standard quadrature-based approach is impressive; in s...

  2. Normal foot and ankle

    International Nuclear Information System (INIS)

    Weissman, S.D.

    1989-01-01

    The foot may be thought of as a bag of bones tied tightly together and functioning as a unit. The bones are expected to maintain their alignment without causing symptomatology to the patient. The author discusses a normal radiograph. The bones must have normal shape and normal alignment. The density of the soft tissues should be normal and there should be no fractures, tumors, or foreign bodies

  3. Air Force standards for nickel hydrogen battery

    Science.gov (United States)

    Hwang, Warren; Milden, Martin

    1994-01-01

    The topics discussed are presented in viewgraph form and include Air Force nickel hydrogen standardization goals, philosophy, project outline, cell level standardization, battery level standardization, and schedule.

  4. Splittings of free groups, normal forms and partitions of ends

    Indian Academy of Sciences (India)

    geodesic laminations and show that this space is compact. Many of the ... determined by the partition of ends of ˜M associated to the spheres. In §4, we recall ... As is well-known we can associate to a graph a topological space. Geometrically ...

  5. Nonpolynomial vector fields under the Lotka-Volterra normal form

    Science.gov (United States)

    Hernández-Bermejo, Benito; Fairén, Víctor

    1995-02-01

    We carry out the generalization of the Lotka-Volterra embedding to flows not explicitly recognizable under the generalized Lotka-Volterra format. The procedure introduces appropriate auxiliary variables, and it is shown how, to a great extent, the final Lotka-Volterra system is independent of their specific definition. Conservation of the topological equivalence during the process is also demonstrated.

  6. Normal forms for characteristic functions on n-ary relations

    NARCIS (Netherlands)

    D.J.N. van Eijck (Jan)

    2004-01-01

    Functions of type (n) are characteristic functions on n-ary relations. Keenan established their importance for natural language semantics, by showing that natural language has many examples of irreducible type (n) functions, i.e., functions of type (n) that cannot be represented as

  7. Imaging the corpus callosum, septum pellucidum and fornix in children: normal anatomy and variations of normality

    International Nuclear Information System (INIS)

    Griffiths, Paul D.; Batty, Ruth; Connolly, Dan J.A.; Reeves, Michael J.

    2009-01-01

    The midline structures of the supra-tentorial brain are important landmarks for judging if the brain has formed correctly. In this article, we consider the normal appearances of the corpus callosum, septum pellucidum and fornix as shown on MR imaging in normal and near-normal states. (orig.)

  8. Deformation around basin scale normal faults

    International Nuclear Information System (INIS)

    Spahic, D.

    2010-01-01

    Faults in the earth crust occur within large range of scales from microscale over mesoscopic to large basin scale faults. Frequently deformation associated with faulting is not only limited to the fault plane alone, but rather forms a combination with continuous near field deformation in the wall rock, a phenomenon that is generally called fault drag. The correct interpretation and recognition of fault drag is fundamental for the reconstruction of the fault history and determination of fault kinematics, as well as prediction in areas of limited exposure or beyond comprehensive seismic resolution. Based on fault analyses derived from 3D visualization of natural examples of fault drag, the importance of fault geometry for the deformation of marker horizons around faults is investigated. The complex 3D structural models presented here are based on a combination of geophysical datasets and geological fieldwork. On an outcrop scale example of fault drag in the hanging wall of a normal fault, located at St. Margarethen, Burgenland, Austria, data from Ground Penetrating Radar (GPR) measurements, detailed mapping and terrestrial laser scanning were used to construct a high-resolution structural model of the fault plane, the deformed marker horizons and associated secondary faults. In order to obtain geometrical information about the largely unexposed master fault surface, a standard listric balancing dip domain technique was employed. The results indicate that for this normal fault a listric shape can be excluded, as the constructed fault has a geologically meaningless shape cutting upsection into the sedimentary strata. This kinematic modeling result is additionally supported by the observation of deformed horizons in the footwall of the structure. Alternatively, a planar fault model with reverse drag of markers in the hanging wall and footwall is proposed. A second part of this thesis investigates a large scale normal fault

  9. Ultrasonographic features of normal lower ureters

    International Nuclear Information System (INIS)

    Kim, Young Soon; Bae, M. Y.; Park, K. J.; Jeon, H. S.; Lee, J. H.

    1990-01-01

    Although ultrasonographic evaluation of the normal ureters is difficult due to bowel gas, the lower segment of the normal ureters can be visualized using the urinary bladder as an acoustic window. The authors prospectively performed ultrasonography with the standard suprapubic technique and analyzed the ultrasonographic features of normal lower ureters in 79 cases (77%). The length of the visualized segment of the distal ureter ranged from 1.5 cm to 7.2 cm, and the visualized segment did not exceed 3.9 mm in maximum diameter. Knowledge of the sonographic features of the normal lower ureters can be helpful in the evaluation of pathologic or suspected pathologic conditions of the lower ureters

  10. Anomalous normal mode oscillations in semiconductor microcavities

    Energy Technology Data Exchange (ETDEWEB)

    Wang, H. [Univ. of Oregon, Eugene, OR (United States). Dept. of Physics; Hou, H.Q.; Hammons, B.E. [Sandia National Labs., Albuquerque, NM (United States)

    1997-04-01

    Semiconductor microcavities as a composite exciton-cavity system can be characterized by two normal modes. Under an impulsive excitation by a short laser pulse, optical polarizations associated with the two normal modes have a π phase difference. The total induced optical polarization is then expected to exhibit a sin²(Ωt)-like oscillation, where 2Ω is the normal mode splitting, reflecting a coherent energy exchange between the exciton and cavity. In this paper the authors present experimental studies of normal mode oscillations using three-pulse transient four wave mixing (FWM). The result reveals, surprisingly, that when the cavity is tuned far below the exciton resonance, the normal mode oscillation in the polarization is cos²(Ωt)-like, in contrast to what is expected from the simple normal mode model. This anomalous normal mode oscillation reflects the important role of virtual excitation of electronic states in semiconductor microcavities.
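    The phase relation described in this abstract can be sketched numerically. The following is a minimal illustration, not the authors' microscopic model; the half-splitting Ω and the time axis are arbitrary assumptions chosen only to show that the expected sin²(Ωt) signal and the anomalous cos²(Ωt) signal are exactly out of phase and sum to one at every delay.

    ```python
    import numpy as np

    # Illustrative sketch (not the paper's model): two coupled modes with
    # splitting 2*Omega exchange energy coherently over the pulse delay.
    omega = 2 * np.pi * 0.5          # hypothetical half-splitting (rad/ps)
    t = np.linspace(0.0, 4.0, 401)   # delay time (ps)

    p_expected = np.sin(omega * t) ** 2   # simple normal-mode prediction
    p_anomalous = np.cos(omega * t) ** 2  # far-detuned cavity observation

    # The two oscillations are exactly out of phase: their sum is 1 everywhere.
    print(np.allclose(p_expected + p_anomalous, 1.0))
    ```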

  11. Complete normal ordering 1: Foundations

    Directory of Open Access Journals (Sweden)

    John Ellis

    2016-08-01

    We introduce a new prescription for quantising scalar field theories (in generic spacetime dimension and background) perturbatively around a true minimum of the full quantum effective action, which is to ‘complete normal order’ the bare action of interest. When the true vacuum of the theory is located at zero field value, the key property of this prescription is the automatic cancellation, to any finite order in perturbation theory, of all tadpole and, more generally, all ‘cephalopod’ Feynman diagrams. The latter are connected diagrams that can be disconnected into two pieces by cutting one internal vertex, with either one or both pieces free from external lines. In addition, this procedure of ‘complete normal ordering’ (which is an extension of the standard field theory definition of normal ordering) reduces by a substantial factor the number of Feynman diagrams to be calculated at any given loop order. We illustrate explicitly the complete normal ordering procedure and the cancellation of cephalopod diagrams in scalar field theories with non-derivative interactions, and by using a point splitting ‘trick’ we extend this result to theories with derivative interactions, such as those appearing as non-linear σ-models in the world-sheet formulation of string theory. We focus here on theories with trivial vacua, generalising the discussion to non-trivial vacua in a follow-up paper.

  12. Baby Poop: What's Normal?

    Science.gov (United States)

    ... I'm breast-feeding my newborn and her bowel movements are yellow and mushy. Is this normal for baby poop? Answers from Jay L. Hoecker, M.D. Yellow, mushy bowel movements are perfectly normal for breast-fed babies. Still, ...

  13. Calibration of Flick standards

    International Nuclear Information System (INIS)

    Thalmann, Ruedi; Spiller, Jürg; Küng, Alain; Jusko, Otto

    2012-01-01

    Flick standards or magnification standards are widely used for an efficient and functional calibration of the sensitivity of form measuring instruments. The results of a recent measurement comparison were partially unsatisfactory and revealed problems related to the calibration of these standards. In this paper the influence factors for the calibration of Flick standards using roundness measurement instruments are discussed in detail, in particular the bandwidth of the measurement chain, residual form errors of the device under test, profile distortions due to the diameter of the probing element and questions related to the definition of the measurand. The different contributions are estimated using simulations and are experimentally verified. Alternative methods to calibrate Flick standards are also investigated. Finally the practical limitations of Flick standard calibration are shown and the usability of Flick standards both to calibrate the sensitivity of roundness instruments and to check the filter function of such instruments is analysed. (paper)

  14. Standardisation in standards

    International Nuclear Information System (INIS)

    McDonald, J. C.

    2012-01-01

    The following observations are offered by one who has served on national and international standards-writing committees and standards review committees. Service on working groups consists of either updating previous standards or developing new standards. The process of writing either type of document proceeds along similar lines. The first order of business is to recognise the need for developing or updating a standard and to identify the potential user community. It is also necessary to ensure that there is a required number of members willing to do the writing. A justification is required as to why a new standard should be developed, and this is written as a new work item proposal or a project initiation notification system form. This document must be filed officially and approved, and a search is then undertaken to ensure that the proposed new standard will not duplicate a standard that has already been published or is underway in another standards organisation. (author)

  15. Visual Memories Bypass Normalization.

    Science.gov (United States)

    Bloem, Ilona M; Watanabe, Yurika L; Kibbe, Melissa M; Ling, Sam

    2018-05-01

    How distinct are visual memory representations from visual perception? Although evidence suggests that briefly remembered stimuli are represented within early visual cortices, the degree to which these memory traces resemble true visual representations remains something of a mystery. Here, we tested whether both visual memory and perception succumb to a seemingly ubiquitous neural computation: normalization. Observers were asked to remember the contrast of visual stimuli, which were pitted against each other to promote normalization either in perception or in visual memory. Our results revealed robust normalization between visual representations in perception, yet no signature of normalization occurring between working memory stores-neither between representations in memory nor between memory representations and visual inputs. These results provide unique insight into the nature of visual memory representations, illustrating that visual memory representations follow a different set of computational rules, bypassing normalization, a canonical visual computation.

  16. Making nuclear 'normal'

    International Nuclear Information System (INIS)

    Haehlen, Peter; Elmiger, Bruno

    2000-01-01

    The mechanics of the Swiss NPPs' 'come and see' programme 1995-1999 were illustrated in our contributions to all PIME workshops since 1996. Now, after four annual 'waves', all the country has been covered by the NPPs' invitation to dialogue. This makes PIME 2000 the right time to shed some light on one particular objective of this initiative: making nuclear 'normal'. The principal aim of the 'come and see' programme, namely to give the Swiss NPPs 'a voice of their own' by the end of the nuclear moratorium 1990-2000, has clearly been attained and was commented on during earlier PIMEs. It is, however, equally important that Swiss nuclear energy not only made progress in terms of public 'presence', but also in terms of being perceived as a normal part of industry, as a normal branch of the economy. The message that Swiss nuclear energy is nothing but a normal business involving normal people, was stressed by several components of the multi-prong campaign: - The speakers in the TV ads were real - 'normal' - visitors' guides and not actors; - The testimonials in the print ads were all real NPP visitors - 'normal' people - and not models; - The mailings inviting a very large number of associations to 'come and see' activated a typical channel of 'normal' Swiss social life; - Spending money on ads (a new activity for Swiss NPPs) appears to have resulted in being perceived by the media as a normal branch of the economy. Today we feel that the 'normality' message has well been received by the media. In the controversy dealing with antinuclear arguments brought forward by environmental organisations journalists nowadays as a rule give nuclear energy a voice - a normal right to be heard. As in a 'normal' controversy, the media again actively ask themselves questions about specific antinuclear claims, much more than before 1990 when the moratorium started. The result is that in many cases such arguments are discarded by journalists, because they are, e.g., found to be

  17. Standardization of the radioimmunoassay of 17β-estradiol (E2) plasmatic and his application to the study of secretion of E2 in normal women, during the menstrual cycle and after the infusion of the gonadotropin (LH/FSH/RH) release factor

    International Nuclear Information System (INIS)

    Kiyan, Takeko Shimizu

    1979-01-01

    A radioimmunoassay method for measurement of plasma E2 was standardized utilizing a highly specific antiserum against E2 [6-(O-carboxymethyl)-oxime]-BSA without the need of previous chromatographic purification. The anti-E2 serum was highly specific, showing high affinity, with affinity constants K1 = 1.62x10^12 M^-1 and K2 = 2.94x10^11 M^-1, calculated by Scatchard plot. The standard curve sensitivity was 2 picograms. The method was specific and accurate, showing an intra-assay precision with a mean C.V. of 2.9%, with the inter-assay evaluation showing a mean C.V. of 5.0%. This method was employed to evaluate E2 secretion during the menstrual cycle in 6 normal females, as indicated below: days -14 to -10 (early follicular phase): 64.68 pg/ml ± 12.14; days -9 to -1 (late follicular phase): 122.39 pg/ml ± 33.54; peak day: 281.28 pg/ml ± 66.59; days +1 to +7 (early luteal phase): 127.47 pg/ml ± 24.88; days +8 to +14 (late luteal phase): 87.57 pg/ml ± 37.56. The effect of the acute and prolonged infusion of LH/FSH-RH (synthetic hypothalamic LH and FSH releasing hormone) was evaluated in the follicular and luteal phase in some of the normal females. (author)
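    The affinity constants quoted in this abstract come from a Scatchard plot, in which bound/free hormone is plotted against bound hormone and, for a single binding site, the points lie on a line of slope -K. A hedged one-site sketch with synthetic data follows; the K and Bmax values are illustrative assumptions of the right order of magnitude, not the thesis measurements.

    ```python
    import numpy as np

    # One-site Scatchard analysis on synthetic data (hypothetical values).
    K_true = 1.6e12      # M^-1, order of magnitude quoted in the abstract
    Bmax = 1.0e-10       # M, hypothetical total binding capacity
    bound = np.linspace(0.1, 0.9, 9) * Bmax
    # Mass-action law bound = K * free * (Bmax - bound), rearranged for free:
    free = bound / (K_true * (Bmax - bound))

    # Scatchard plot: bound/free vs bound is linear with slope -K.
    ratio = bound / free
    slope, intercept = np.polyfit(bound, ratio, 1)
    K_est = -slope       # recovered affinity constant
    ```

    With noise-free data the fitted slope recovers K exactly; with real assay data the fit (and the detection of a second, lower-affinity site, as in the two constants above) is done on the measured bound/free ratios.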

  18. State Air Quality Standards.

    Science.gov (United States)

    Pollution Engineering, 1978

    1978-01-01

    This article presents in tabular form the air quality standards for sulfur dioxide, carbon monoxide, nitrogen dioxide, photochemicals, non-methane hydrocarbons and particulates for each of the 50 states and the District of Columbia. (CS)

  19. Automorphic Forms

    DEFF Research Database (Denmark)

    von Essen, Flemming Brændgaard

    The Taylor coefficients of weight k Eisenstein series wrt. SL2(Z) are related to values of L-functions for Hecke characters in the point k. We show some congruences for Taylor coefficients of Eisenstein series of weight 4 and 6 and use them to establish congruences for values of L-functions for Hecke characters in the points 4 and 6. It is well known, that all zeros of the Eisenstein series Ek wrt. SL2(Z) in the standard fundamental domain have modulus 1. We show that this is also true for #n Ek, where # is a certain differential operator. We then proceed to study logarithms of multiplier...

  20. Normal Pressure Hydrocephalus

    Science.gov (United States)

    ... improves the chance of a good recovery. Without treatment, symptoms may worsen and cause death. What research is being done? The NINDS conducts and supports research on neurological disorders, including normal pressure hydrocephalus. Research on disorders such ...

  1. Normality in Analytical Psychology

    Science.gov (United States)

    Myers, Steve

    2013-01-01

    Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity. PMID:25379262

  2. Normal pressure hydrocephalus

    Science.gov (United States)

    Hydrocephalus - occult; Hydrocephalus - idiopathic; Hydrocephalus - adult; Hydrocephalus - communicating; Dementia - hydrocephalus; NPH ... Ferri FF. Normal pressure hydrocephalus. In: Ferri FF, ed. ... Elsevier; 2016:chap 648. Rosenberg GA. Brain edema and disorders ...

  3. Normal Functioning Family

    Science.gov (United States)

    ... Normal Functioning Family. Is there any way ...

  4. Normal growth and development

    Science.gov (United States)

    ... page: //medlineplus.gov/ency/article/002456.htm Normal growth and development A child's growth and development can be divided into four periods: ...

  5. Normal modes and continuous spectra

    International Nuclear Information System (INIS)

    Balmforth, N.J.; Morrison, P.J.

    1994-12-01

    The authors consider stability problems arising in fluids, plasmas and stellar systems that contain singularities resulting from wave-mean flow or wave-particle resonances. Such resonances lead to singularities in the differential equations determining the normal modes at the so-called critical points or layers. The locations of the singularities are determined by the eigenvalue of the problem, and as a result, the spectrum of eigenvalues forms a continuum. They outline a method to construct the singular eigenfunctions comprising the continuum for a variety of problems

  6. Asymptotic Normality of the Optimal Solution in Multiresponse Surface Mathematical Programming

    OpenAIRE

    Díaz-García, José A.; Caro-Lopera, Francisco J.

    2015-01-01

    An explicit form for the perturbation effect on the matrix of regression coefficients on the optimal solution in multiresponse surface methodology is obtained in this paper. Then, the sensitivity analysis of the optimal solution is studied and the critical point characterisation of the convex program, associated with the optimum of a multiresponse surface, is also analysed. Finally, the asymptotic normality of the optimal solution is derived by the standard methods.

  7. Weston Standard battery

    CERN Multimedia

    This is a Weston AOIP standard battery with its calibration certificate (1956). Inside, the glassware forms an "H". Its name comes from the British physicist Edward Weston. A standard is the materialization of a given quantity whose value is known with great accuracy.

  8. Optimum parameters in a model for tumour control probability, including interpatient heterogeneity: evaluation of the log-normal distribution

    International Nuclear Information System (INIS)

    Keall, P J; Webb, S

    2007-01-01

    The heterogeneity of human tumour radiation response is well known. Researchers have used the normal distribution to describe interpatient tumour radiosensitivity. However, many natural phenomena show a log-normal distribution. Log-normal distributions are common when mean values are low, variances are large and values cannot be negative. These conditions apply to radiosensitivity. The aim of this work was to evaluate the log-normal distribution to predict clinical tumour control probability (TCP) data and to compare the results with the homogeneous (δ-function with single α-value) and normal distributions. The clinically derived TCP data for four tumour types (melanoma, breast, squamous cell carcinoma and nodes) were used to fit the TCP models. Three forms of interpatient tumour radiosensitivity were considered: the log-normal, normal and δ-function. The free parameters in the models were the radiosensitivity mean, standard deviation and clonogenic cell density. The evaluation metric was the deviance of the maximum likelihood estimation of the fit of the TCP calculated using the predicted parameters to the clinical data. We conclude that (1) the log-normal and normal distributions of interpatient tumour radiosensitivity heterogeneity more closely describe clinical TCP data than a single radiosensitivity value and (2) the log-normal distribution has some theoretical and practical advantages over the normal distribution. Further work is needed to test these models on higher quality clinical outcome datasets
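    The model structure described here can be sketched as a Poisson TCP for each patient, averaged over a distribution of radiosensitivity α. The parameter values below are illustrative assumptions, not the fitted values from the paper; the sketch only shows the qualitative effect the abstract describes, namely that interpatient heterogeneity flattens the population dose-response curve relative to the δ-function (single α) case.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def tcp_poisson(alpha, dose, n_clonogens):
        """Single-patient Poisson TCP: exp(-N * exp(-alpha * D))."""
        return np.exp(-n_clonogens * np.exp(-alpha * dose))

    def tcp_population(dose, median_alpha, sigma, n_clonogens, n_patients=100_000):
        """Population TCP: average the single-patient TCP over a log-normal
        distribution of alpha (median median_alpha, log-scale spread sigma)."""
        alpha = rng.lognormal(mean=np.log(median_alpha), sigma=sigma,
                              size=n_patients)
        return tcp_poisson(alpha, dose, n_clonogens).mean()

    d = 60.0  # Gy, illustrative dose
    homogeneous = tcp_poisson(0.3, d, 1e7)                       # delta-function case
    heterogeneous = tcp_population(d, median_alpha=0.3, sigma=0.4,
                                   n_clonogens=1e7)              # log-normal case
    ```

    At the distribution's median α the heterogeneous population TCP falls below the homogeneous prediction, because patients in the low-α tail contribute near-zero control probability that the high-α tail cannot fully offset.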

  9. International Construction Measurement Standard

    OpenAIRE

    Mitchell, Charles

    2016-01-01

    The International Construction Measurement Standard Coalition (the Coalition) was formed on 17 June 2015 after meeting at the International Monetary Fund in Washington DC, USA. The Coalition, comprising the organisations listed below at the date of publication, aims to bring about consistency in construction cost reporting standards internationally. This is achieved by the creation and adoption of this ICMS, an agreed international standard for the structuring and presentation of cost reports...

  10. Standardization of pattern electroretinograms by alternate reversion in normal volunteers

    OpenAIRE

    Torigoe, Andréa Mara Simões; Quagliato, Elizabeth M. A. B.; Torigoe, Marcelo; Carvalho, Keila Mirian Monteiro de

    2003-01-01

    OBJECTIVE: To standardize the alternating-reversal pattern electroretinogram in ophthalmologically normal individuals with no associated neurological disease, determining the normal range stratified by sex, age group and stimulus used. METHODS: The standardization followed the model proposed by the International Organization of Electroretinography, and the normative values were specific to the evoked potentials laboratory of the Department of Neurology of the Faculdade de Ciências Médicas - U...

  11. Successive Standardization of Rectangular Arrays

    Directory of Open Access Journals (Sweden)

    Richard A. Olshen

    2012-02-01

    In this note we illustrate and develop further, with mathematics and examples, the work on successive standardization (or normalization) studied earlier by the same authors in [1] and [2]. Thus, we deal with successive iterations applied to rectangular arrays of numbers, where to avoid technical difficulties an array has at least three rows and at least three columns. Without loss of generality, an iteration begins with operations on columns: first subtract the mean of each column; then divide by its standard deviation. The iteration continues with the same two operations done successively for rows. These four operations applied in sequence complete one iteration. One then iterates again, and again, and again, ... In [1] it was argued that if arrays are made up of real numbers, then the set for which convergence of these successive iterations fails has Lebesgue measure 0. The limiting array has row and column means 0 and row and column standard deviations 1. A basic result on convergence given in [1] is true, though the argument in [1] is faulty. The result is stated in the form of a theorem here, and the argument for the theorem is correct. Moreover, many graphics given in [1] suggest that, except for a set of entries of any array with Lebesgue measure 0, convergence is very rapid, eventually exponentially fast in the number of iterations. Because we learned this set of rules from Bradley Efron, we call it “Efron’s algorithm”. More importantly, the rapidity of convergence is illustrated by numerical examples.

  12. European standards for composite construction

    NARCIS (Netherlands)

    Stark, J.W.B.

    2000-01-01

    The European Standards Organisation (CEN) has planned to develop a complete set of harmonized European building standards. This set includes standards for composite steel and concrete buildings and bridges. The Eurocodes, being the design standards, form part of this total system of European

  13. Monitoring the normal body

    DEFF Research Database (Denmark)

    Nissen, Nina Konstantin; Holm, Lotte; Baarts, Charlotte

    2015-01-01

    ... provides us with knowledge about how to prevent future overweight or obesity. This paper investigates body size ideals and monitoring practices among normal-weight and moderately overweight people. Methods: The study is based on in-depth interviews combined with observations. 24 participants were recruited by strategic sampling based on self-reported BMI 18.5-29.9 kg/m2 and socio-demographic factors. Inductive analysis was conducted. Results: Normal-weight and moderately overweight people have clear ideals for their body size. Despite being normal weight or close to this, they construct a variety of practices for monitoring their bodies based on different kinds of calculations of weight and body size, observations of body shape, and measurements of bodily firmness. Biometric measurements are familiar to them as are health authorities' recommendations. Despite not belonging to an extreme BMI category ...

  14. The exp-normal distribution is infinitely divisible

    OpenAIRE

    Pinelis, Iosif

    2018-01-01

    Let $Z$ be a standard normal random variable (r.v.). It is shown that the distribution of the r.v. $\ln|Z|$ is infinitely divisible; equivalently, the standard normal distribution considered as the distribution on the multiplicative group over $\mathbb{R}\setminus\{0\}$ is infinitely divisible.

  15. ['Gold standard', not 'golden standard'

    NARCIS (Netherlands)

    Claassen, J.A.H.R.

    2005-01-01

    In medical literature, both 'gold standard' and 'golden standard' are employed to describe a reference test used for comparison with a novel method. The term 'gold standard' in its current sense in medical research was coined by Rudd in 1979, in reference to the monetary gold standard. In the same

  16. Normal modified stable processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, N.

    2002-01-01

    This paper discusses two classes of distributions, and stochastic processes derived from them: modified stable (MS) laws and normal modified stable (NMS) laws. This extends corresponding results for the generalised inverse Gaussian (GIG) and generalised hyperbolic (GH) or normal generalised inverse Gaussian (NGIG) laws. The wider framework thus established provides, in particular, for added flexibility in the modelling of the dynamics of financial time series, of importance especially as regards OU based stochastic volatility models for equities. In the special case of the tempered stable OU process ...

  17. The normal holonomy group

    International Nuclear Information System (INIS)

    Olmos, C.

    1990-05-01

    The restricted holonomy group of a Riemannian manifold is a compact Lie group and its representation on the tangent space is a product of irreducible representations and a trivial one. Each one of the non-trivial factors is either an orthogonal representation of a connected compact Lie group which acts transitively on the unit sphere or it is the isotropy representation of a single Riemannian symmetric space of rank ≥ 2. We prove that all these properties are also true for the representation on the normal space of the restricted normal holonomy group of any submanifold of a space of constant curvature. 4 refs

  18. Chemical forms of radioiodine

    International Nuclear Information System (INIS)

    Tachikawa, Enzo

    1979-01-01

    Release of radioiodine built up during reactor operations presents a potential problem from the standpoint of environmental safety. Among the chemical forms of radioiodine, organic iodides pose, depending upon the circumstances, the most serious problem because of the difficulty of trapping them and because of their stability compared to other chemical forms. Furthermore, pellet-cladding interaction (PCI) fuel failures in LWR fuel rods are believed to be stress corrosion cracks caused by an embrittling fission product species, radioiodine. To deal with these problems, knowledge is required of the chemical behaviour of radioiodine in and out of fuels, as well as of its release behaviour from fuels. Here a brief review is given of these aspects, aiming at clearing up the questions still remaining. The data seem to indicate that radioiodine exists in a combined form in fuels. Upon heating slightly irradiated fuels, the iodine atoms are released in a chemical form associated with uranium atoms. Experiments, however, are needed with specimens of higher burnup, where interactions of radioiodine with metallic fission products could be favoured. The dominant release mechanism of radioiodine at normal operating temperatures will be diffusion to grain boundaries leading to open surfaces. Radiation-induced internal traps, however, alter the rate of diffusion significantly. The carbon sources of organic iodides formed under various conditions and the mechanisms of their formation have also been considered. (author)

  19. Normal and Abnormal Behavior in Early Childhood

    OpenAIRE

    Spinner, Miriam R.

    1981-01-01

    Evaluation of normal and abnormal behavior in the period to three years of age involves many variables. Parental attitudes, determined by many factors such as previous childrearing experience, the bonding process, parental psychological status and parental temperament, often influence the labeling of behavior as normal or abnormal. This article describes the forms of crying, sleep and wakefulness, and affective responses from infancy to three years of age.

  20. Advancing Normal Birth: Organizations, Goals, and Research

    OpenAIRE

    Hotelling, Barbara A.; Humenick, Sharron S.

    2005-01-01

    In this column, the support for advancing normal birth is summarized, based on a comparison of the goals of Healthy People 2010, Lamaze International, the Coalition for Improving Maternity Services, and the midwifery model of care. Research abstracts are presented to provide evidence that the midwifery model of care safely and economically advances normal birth. Rates of intervention experienced, as reported in the Listening to Mothers survey, are compared to the forms of care recommended by ...

  1. Normality in Analytical Psychology

    Directory of Open Access Journals (Sweden)

    Steve Myers

    2013-11-01

    Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity.

  2. Medically-enhanced normality

    DEFF Research Database (Denmark)

    Møldrup, Claus; Traulsen, Janine Morgall; Almarsdóttir, Anna Birna

    2003-01-01

    Objective: To consider public perspectives on the use of medicines for non-medical purposes, a usage called medically-enhanced normality (MEN). Method: Examples from the literature were combined with empirical data derived from two Danish research projects: a Delphi internet study and a Telebus...

  3. The Normal Fetal Pancreas.

    Science.gov (United States)

    Kivilevitch, Zvi; Achiron, Reuven; Perlman, Sharon; Gilboa, Yinon

    2017-10-01

    The aim of the study was to assess the sonographic feasibility of measuring the fetal pancreas and its normal development throughout pregnancy. We conducted a cross-sectional prospective study between 19 and 36 weeks' gestation. The study included singleton pregnancies with normal pregnancy follow-up. The pancreas circumference was measured. The first 90 cases were tested to assess feasibility. Two hundred ninety-seven fetuses of nondiabetic mothers were recruited during a 3-year period. The overall satisfactory visualization rate was 61.6%. The intraobserver and interobserver variability had high intraclass correlation coefficients of 0.964 and 0.967, respectively. A cubic polynomial regression best described the correlation of pancreas circumference with gestational age (r = 0.744). The pancreas circumference percentiles for each week of gestation were calculated. During the study period, we detected 2 cases with overgrowth syndrome and 1 case with an annular pancreas. In this study, we assessed the feasibility of sonography for measuring the fetal pancreas and established a normal reference range for the fetal pancreas circumference throughout pregnancy. This database can be helpful when investigating fetomaternal disorders that can involve its normal development. © 2017 by the American Institute of Ultrasound in Medicine.

  4. Standard NIM instrumentation system

    International Nuclear Information System (INIS)

    1990-05-01

    NIM is a standard modular instrumentation system that is in wide use throughout the world. As the NIM system developed and accommodations were made to a dynamic instrumentation field and a rapidly advancing technology, additions, revisions and clarifications were made. These were incorporated into the standard in the form of addenda and errata. This standard is a revision of the NIM document, AEC Report TID-20893 (Rev. 4) dated July 1974. It includes all the addenda and errata items that were previously issued, as well as numerous additional items to make the standard current with modern technology and manufacturing practice.

  5. Conceptual Foundations of Improving the Living Standards of Territorial Community Based on the Introduction and Development of New Forms of Innovation and Investment Cooperation Between Regional Authorities and the Local Community

    Directory of Open Access Journals (Sweden)

    Popadynets, V.I.

    2015-03-01

    The main provisions have been formulated for the elaboration, implementation and development of regional target programs based on new methodological and financial resources and on the sale of the most significant products of regional companies and organizations, aimed at a sustainable improvement of the local community's living standards.

  6. Normal radiographic findings. 4. act. ed.

    International Nuclear Information System (INIS)

    Moeller, T.B.

    2003-01-01

    This book can serve the reader in three ways: First, it presents normal findings for all radiographic techniques, including contrast-medium (KM) studies. Important data which are criteria of normal findings are indicated directly in the pictures and are also explained in full text and in summary form. Secondly, it teaches the systematics of interpreting a picture: how to look at it, what structures to examine in what order, and what to look for in particular. Checklists are presented in each case. Thirdly, findings are formulated in accordance with the image analysis procedure. All criteria of normal findings are defined in these formulations, which makes them an important didactic element. (orig.)

  7. Normal radiographic findings. 4. act. ed.; Roentgennormalbefunde

    Energy Technology Data Exchange (ETDEWEB)

    Moeller, T.B. [Gemeinschaftspraxis fuer Radiologie und Nuklearmedizin, Dillingen (Germany)

    2003-07-01

    This book can serve the reader in three ways: First, it presents normal findings for all radiographic techniques, including contrast-medium (KM) studies. Important data which are criteria of normal findings are indicated directly in the pictures and are also explained in full text and in summary form. Secondly, it teaches the systematics of interpreting a picture: how to look at it, what structures to examine in what order, and what to look for in particular. Checklists are presented in each case. Thirdly, findings are formulated in accordance with the image analysis procedure. All criteria of normal findings are defined in these formulations, which makes them an important didactic element. (orig.)

  8. Accounting standards

    NARCIS (Netherlands)

    Stellinga, B.; Mügge, D.

    2014-01-01

    The European and global regulation of accounting standards have witnessed remarkable changes over the past twenty years. In the early 1990s, EU accounting practices were fragmented along national lines and US accounting standards were the de facto global standards. Since 2005, all EU listed

  9. Standardization Documents

    Science.gov (United States)

    2011-08-01

    Specifications and Standards; Guide Specifications; CIDs; and NGSs. Federal Specifications; Commercial... national or international standardization document developed by a private sector association, organization, or technical society that plans... Maintain lessons learned. Examples: guidance for application of a technology; lists of options. Defense Handbook.

  10. Standard practice for prediction of the long-term behavior of materials, including waste forms, used in engineered barrier systems (EBS) for geological disposal of high-level radioactive waste

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2007-01-01

    1.1 This practice describes test methods and data analyses used to develop models for the prediction of the long-term behavior of materials, such as engineered barrier system (EBS) materials and waste forms, used in the geologic disposal of spent nuclear fuel (SNF) and other high-level nuclear waste in a geologic repository. The alteration behavior of waste form and EBS materials is important because it affects the retention of radionuclides by the disposal system. The waste form and EBS materials provide a barrier to release either directly (as in the case of waste forms in which the radionuclides are initially immobilized), or indirectly (as in the case of containment materials that restrict the ingress of groundwater or the egress of radionuclides that are released as the waste forms and EBS materials degrade). 1.1.1 Steps involved in making such predictions include problem definition, testing, modeling, and model confirmation. 1.1.2 The predictions are based on models derived from theoretical considerat...

  11. Contributor Form

    Directory of Open Access Journals (Sweden)

    Chief Editor

    2014-09-01

    to produce preprints or reprints and translate into languages other than English for sale or free distribution; and (4) the right to republish the work in a collection of articles in any other mechanical or electronic format. We give the rights to the corresponding author to make necessary changes as per the request of the journal, to do the rest of the correspondence on our behalf, and he/she will act as the guarantor for the manuscript on our behalf. All persons who have made substantial contributions to the work reported in the manuscript, but who are not contributors, are named in the Acknowledgment and have given me/us their written permission to be named. If I/we do not include an Acknowledgment, that means I/we have not received substantial contributions from non-contributors and no contributor has been omitted. S No | Authors' Names | Contribution (ICMJE guidelines: (1) substantial contributions to conception and design, acquisition of data, or analysis and interpretation of data; (2) drafting the article or revising it critically for important intellectual content; and (3) final approval of the version to be published; authors should meet conditions 1, 2, and 3) | Signature | Date. Note: All the authors are required to sign independently in this form in the sequence given above. In case an author has left the institution/country and his/her whereabouts are not known, the senior author may sign on his/her behalf, taking the responsibility. No addition, deletion, or any change in the sequence of the authorship will be permissible at a later stage without valid reasons and permission of the Editor. If the authorship is contested at any stage, the article will either be returned or will not be processed for publication until the issue is resolved. Maximum of up to 4 authors for a short communication and up to 6 authors for an original article.

  12. Contributors Form

    Directory of Open Access Journals (Sweden)

    Chief Editor

    2016-06-01

    to produce preprints or reprints and translate into languages other than English for sale or free distribution; and (4) the right to republish the work in a collection of articles in any other mechanical or electronic format. We give the rights to the corresponding author to make necessary changes as per the request of the journal, to do the rest of the correspondence on our behalf, and he/she will act as the guarantor for the manuscript on our behalf. All persons who have made substantial contributions to the work reported in the manuscript, but who are not contributors, are named in the Acknowledgment and have given me/us their written permission to be named. If I/we do not include an Acknowledgment, that means I/we have not received substantial contributions from non-contributors and no contributor has been omitted. S No | Authors' Names | Contribution (ICMJE guidelines: (1) substantial contributions to conception and design, acquisition of data, or analysis and interpretation of data; (2) drafting the article or revising it critically for important intellectual content; and (3) final approval of the version to be published; authors should meet conditions 1, 2, and 3) | Signature | Date. Note: All the authors are required to sign independently in this form in the sequence given above. In case an author has left the institution/country and his/her whereabouts are not known, the senior author may sign on his/her behalf, taking the responsibility. No addition, deletion, or any change in the sequence of the authorship will be permissible at a later stage without valid reasons and permission of the Editor. If the authorship is contested at any stage, the article will either be returned or will not be processed for publication until the issue is resolved. Maximum of up to 4 authors for a short communication and up to 6 authors for an original article.

  13. Confectionery-based dose forms.

    Science.gov (United States)

    Tangso, Kristian J; Ho, Quy Phuong; Boyd, Ben J

    2015-01-01

    Conventional dosage forms such as tablets, capsules and syrups are prescribed in the normal course of practice. However, concerns about patient preferences and market demands have given rise to the exploration of novel unconventional dosage forms. Among these, confectionery-based dose forms have strong potential to overcome compliance problems. This report will review the availability of these unconventional dose forms used in treating the oral cavity and for systemic drug delivery, with a focus on medicated chewing gums, medicated lollipops, and oral bioadhesive devices. The aim is to stimulate increased interest in the opportunities for innovative new products that are available to formulators in this field, particularly for atypical patient populations.

  14. Idiopathic Normal Pressure Hydrocephalus

    Directory of Open Access Journals (Sweden)

    Basant R. Nassar BS

    2016-04-01

    Idiopathic normal pressure hydrocephalus (iNPH) is a potentially reversible neurodegenerative disease commonly characterized by a triad of dementia, gait disturbance, and urinary disturbance. Advancements in diagnosis and treatment have aided in properly identifying patients and improving their symptoms. However, a large proportion of iNPH patients remain either undiagnosed or misdiagnosed. Articles for this review were obtained by searching PubMed with the keywords “normal pressure hydrocephalus,” “diagnosis,” “shunt treatment,” “biomarkers,” “gait disturbances,” “cognitive function,” “neuropsychology,” “imaging,” and “pathogenesis.” The majority of the articles were retrieved from the past 10 years. The purpose of this review article is to aid general practitioners in further understanding current findings on the pathogenesis, diagnosis, and treatment of iNPH.

  15. Normal Weight Dyslipidemia

    DEFF Research Database (Denmark)

    Ipsen, David Hojland; Tveden-Nyborg, Pernille; Lykkesfeldt, Jens

    2016-01-01

    Objective: The liver coordinates lipid metabolism and may play a vital role in the development of dyslipidemia, even in the absence of obesity. Normal weight dyslipidemia (NWD) and patients with nonalcoholic fatty liver disease (NAFLD) who do not have obesity constitute a unique subset... of individuals characterized by dyslipidemia and metabolic deterioration. This review examined the available literature on the role of the liver in dyslipidemia and the metabolic characteristics of patients with NAFLD who do not have obesity. Methods: PubMed was searched using the following keywords: nonobese..., dyslipidemia, NAFLD, NWD, liver, and metabolically obese/unhealthy normal weight. Additionally, article bibliographies were screened, and relevant citations were retrieved. Studies were excluded if they had not measured relevant biomarkers of dyslipidemia. Results: NWD and NAFLD without obesity share a similar...

  16. Ethics and "normal birth".

    Science.gov (United States)

    Lyerly, Anne Drapkin

    2012-12-01

    The concept of "normal birth" has been promoted as ideal by several international organizations, although debate about its meaning is ongoing. In this article, I examine the concept of normalcy to explore its ethical implications and raise a trio of concerns. First, in its emphasis on nonuse of technology as a goal, the concept of normalcy may marginalize women for whom medical intervention is necessary or beneficial. Second, in its emphasis on birth as a socially meaningful event, the mantra of normalcy may unintentionally divert attention from meaning in medically complicated births. Third, the emphasis on birth as a normal and healthy event may be a contributor to the long-standing tolerance for the dearth of evidence guiding the treatment of illness during pregnancy and the failure to responsibly and productively engage pregnant women in health research. Given these concerns, it is worth debating not just what "normal birth" means, but whether the term as an ideal earns its keep. © 2012, Copyright the Authors Journal compilation © 2012, Wiley Periodicals, Inc.

  17. 10 CFR 71.71 - Normal conditions of transport.

    Science.gov (United States)

    2010-01-01

    Title 10 (Energy), § 71.71 (Package, Special Form, and LSA-III Tests): Normal conditions of transport. (a) Evaluation. Evaluation of each package design under normal conditions of transport must include a determination of the effect on...

  18. Quantitative thallium-201 myocardial exercise scintigraphy in normal subjects and patients with normal coronary arteries

    International Nuclear Information System (INIS)

    Niemeyer, M.G.; St. Antonius Hospital Nieuwegein; Laarman, G.J.; Lelbach, S.; Cramer, M.J.; Ascoop, C.A.P.L.; Verzijlbergen, J.F.; Wall, E.E. van der; Zwinderman, A.H.; Pauwels, E.K.J.

    1990-01-01

    Quantitative thallium-201 myocardial exercise scintigraphy was tested in two patient populations representing alternative standards for cardiac normality: group I comprised 18 male uncatheterized patients with a low likelihood of coronary artery disease (CAD); group II contained 41 patients with normal coronary arteriograms. Group I patients were younger and achieved a higher rate-pressure product than group II patients; all had normal findings on physical examination and electrocardiography at rest and exercise. Group II included 21 females; 11 patients showed abnormal electrocardiography at rest, and five patients showed ischemic ST depression during exercise. Twelve patients had signs of minimal CAD. Twelve patients revealed abnormal visual and quantitative thallium findings; three of these patients had minimal CAD. Profiles of uptake and washout of thallium-201 were derived from both patient groups and compared with the normal limits developed by Maddahi et al. Furthermore, low-likelihood and angiographically normal patients may differ substantially, and both sets of normal patients should be considered when establishing criteria of abnormality in exercise thallium imaging. When commercial software containing normal limits for quantitative analysis of exercise thallium-201 imaging is used in clinical practice, it is mandatory to compare these with the normal limits of uptake and washout of thallium-201 derived from the less heterogeneous group of low-likelihood subjects, which should be used when selecting a normal population to define normality. (author). 37 refs.; 3 figs; 1 tab

  19. 40 CFR 417.166 - Pretreatment standards for new sources.

    Science.gov (United States)

    2010-07-01

    Effluent Guidelines and Standards: Soap and Detergent Manufacturing Point Source Category, Manufacture of Liquid... standard shall be: (1) For normal liquid detergent operations the following values pertain: Pollutant or...

  20. International Electrotechnical Commission standards and French material control standards

    International Nuclear Information System (INIS)

    Furet, J.; Weill, J.

    1978-01-01

    This paper reports on the international standards developed within IEC Subcommittee 45A (Nuclear Reactor Instrumentation) and the national standards elaborated by the Commissariat a l'Energie Atomique (CEA) group for normalized control equipment, and on the degree to which these are applied in the basic design, calls for bids, and operation of nuclear power plants. (J.E. de C)

  1. Comparative waste forms study

    International Nuclear Information System (INIS)

    Wald, J.W.; Lokken, R.O.; Shade, J.W.; Rusin, J.M.

    1980-12-01

    A number of alternative process and waste form options exist for the immobilization of nuclear wastes. Although data exist on the characterization of these alternative waste forms, a straightforward comparison of product properties is difficult due to the lack of standardized testing procedures. The characterization study described in this report involved the application of the same volatility, mechanical strength and leach tests to ten alternative waste forms to assess product durability. Bulk property, phase analysis and microstructural examination of the simulated products, whose waste loading varied from 5% to 100%, were also conducted. The specific waste forms investigated were as follows: Cold Pressed and Sintered PW-9 Calcine; Hot Pressed PW-9 Calcine; Hot Isostatic Pressed PW-9 Calcine; Cold Pressed and Sintered SPC-5B Supercalcine; Hot Isostatic Pressed SPC-5B Supercalcine; Sintered PW-9 and 50% Glass Frit; Glass 76-68; Celsian Glass Ceramic; Type II Portland Cement and 10% PW-9 Calcine; and Type II Portland Cement and 10% SPC-5B Supercalcine. Bulk property data were used to calculate and compare the relative quantities of waste form volume produced at a spent fuel processing rate of 5 metric tons uranium/day. This quantity ranged from 3173 L/day (5280 kg/day) for 10% SPC-5B supercalcine in cement to 83 L/day (294 kg/day) for 100% calcine. Mechanical strength, volatility, and leach resistance tests provide data related to waste form durability. Glass, glass-ceramic and supercalcine forms ranked high in durability, whereas the 100% PW-9 calcine ranked low. All other materials ranked between these two groupings.

  2. Communications standards

    CERN Document Server

    Stokes, A V

    1986-01-01

    Communications Standards deals with the standardization of computer communication networks. This book examines the types of local area networks (LANs) that have been developed and looks at some of the relevant protocols in more detail. The work of Project 802 is briefly discussed, along with a protocol which has developed from one of the LAN standards and is now a de facto standard in one particular area, namely the Manufacturing Automation Protocol (MAP). Factors that affect the usage of networks, such as network management and security, are also considered. This book is divided into three se

  3. Masturbation, sexuality, and adaptation: normalization in adolescence.

    Science.gov (United States)

    Shapiro, Theodore

    2008-03-01

    During adolescence the central masturbation fantasy that is formulated during childhood takes its final form and paradoxically must now be directed outward for appropriate object finding and pair matching in the service of procreative aims. This is a step in adaptation that requires a further developmental landmark that I have called normalization. The path toward airing these private fantasies is facilitated by chumship relationships as a step toward further exposure to the social surround. Hartmann's structuring application of adaptation within psychoanalysis is used as a framework for understanding the process that simultaneously serves intrapsychic and social demands and permits goals that follow evolutionary principles. Variations in the normalization process from masturbatory isolation to a variety of forms of sexual socialization are examined in sociological data concerning current adolescent sexual behavior and in case examples that indicate some routes to normalized experience and practice.

  4. Training Standardization

    International Nuclear Information System (INIS)

    Agnihotri, Newal

    2003-01-01

    The article describes the benefits of, and the required process and recommendations for, implementing the standardization of training in the nuclear power industry in the United States and abroad. Current Information and Communication Technologies (ICT) enable training standardization in the nuclear power industry. The delivery of training through the Internet, Intranet and video over IP will facilitate this standardization and bring multiple benefits to the nuclear power industry worldwide. As the number of available qualified and experienced professionals decreases because of retirements and fewer nuclear engineering institutions, standardized training will help increase the number of available professionals in the industry. Technology will make it possible to use the experience of retired professionals who may be interested in working part-time from a remote location. Well-planned standardized training will prevent a fragmented approach among utilities, and it will save the industry considerable resources in the long run. It will also ensure cost-effective and safe nuclear power plant operation.

  5. Strength of Gamma Rhythm Depends on Normalization

    Science.gov (United States)

    Ray, Supratim; Ni, Amy M.; Maunsell, John H. R.

    2013-01-01

    Neuronal assemblies often exhibit stimulus-induced rhythmic activity in the gamma range (30–80 Hz), whose magnitude depends on the attentional load. This has led to the suggestion that gamma rhythms form dynamic communication channels across cortical areas processing the features of behaviorally relevant stimuli. Recently, attention has been linked to a normalization mechanism, in which the response of a neuron is suppressed (normalized) by the overall activity of a large pool of neighboring neurons. In this model, attention increases the excitatory drive received by the neuron, which in turn also increases the strength of normalization, thereby changing the balance of excitation and inhibition. Recent studies have shown that gamma power also depends on such excitatory–inhibitory interactions. Could modulation in gamma power during an attention task be a reflection of the changes in the underlying excitation–inhibition interactions? By manipulating the normalization strength independent of attentional load in macaque monkeys, we show that gamma power increases with increasing normalization, even when the attentional load is fixed. Further, manipulations of attention that increase normalization increase gamma power, even when they decrease the firing rate. Thus, gamma rhythms could be a reflection of changes in the relative strengths of excitation and normalization rather than playing a functional role in communication or control. PMID:23393427
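
    The attention-normalization interaction described above can be illustrated with a toy divisive-normalization model (in the spirit of Reynolds and Heeger's normalization model of attention). The function, parameter names, and numbers below are illustrative assumptions, not the study's implementation:

```python
import numpy as np

# Minimal divisive-normalization sketch: a neuron's excitatory drive is
# divided by the pooled activity of neighboring neurons plus a constant.
# Attention scales the drive, and hence also the normalization pool.
def normalized_response(drive, pool_activity, sigma=1.0, attention_gain=1.0):
    e = attention_gain * drive                 # attended excitatory drive
    pool = attention_gain * pool_activity      # normalization pool grows too
    return e / (sigma + pool.sum())

drive = 10.0
pool = np.array([2.0, 3.0, 5.0])
weak = normalized_response(drive, pool, attention_gain=1.0)
strong = normalized_response(drive, pool, attention_gain=2.0)
# Because the gain raises excitation and normalization together, the
# response grows sublinearly: doubling the gain does not double the output.
print(weak, strong)
```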

  6. Use of normalized total dose to represent the biological effect of fractionated radiotherapy

    International Nuclear Information System (INIS)

    Flickinger, J.C.; Kalend, A.

    1990-01-01

    There are currently a number of radiobiological models to account for the effects of dose fractionation and time. Normalized total dose (NTD) is not another new model but is a previously reported, clinically useful form in which to represent the biological effect, determined by any specific radiobiological dose-fractionation model, of a course of radiation using a single set of standardized, easily understood terminology. The generalized form of NTD reviewed in this paper describes the effect of a course of radiotherapy administered with nonstandard fractionation as the total dose of radiation in Gy that could be administered with a given reference fractionation such as 2 Gy per fraction, 5 fractions per week that would produce an equivalent biological effect (probability of complications or tumor control) as predicted by a given dose-fractionation formula. The use of normalized total dose with several different exponential and linear-quadratic dose-fraction formulas is presented. (author). 51 refs.; 1 fig.; 1 tab
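
    The NTD idea above can be made concrete with the standard linear-quadratic (LQ) equivalence formula, often written as EQD2 when the reference fraction size is 2 Gy. This is one of the several dose-fractionation models the record mentions, not the authors' unique prescription, and the alpha/beta ratio below is an assumed illustrative value:

```python
# Normalized total dose (NTD) via the standard linear-quadratic (LQ)
# equivalence formula. alpha_beta = 3 Gy is a typical illustrative value
# for late-responding tissue, assumed here for the example.
def ntd(total_dose, dose_per_fraction, ref_dose_per_fraction=2.0, alpha_beta=3.0):
    """Dose in reference-size fractions giving the same LQ biological effect (Gy)."""
    return total_dose * (dose_per_fraction + alpha_beta) / (
        ref_dose_per_fraction + alpha_beta
    )

# 10 fractions of 3 Gy (30 Gy total) is LQ-equivalent to 36 Gy in 2 Gy fractions:
print(ntd(30.0, 3.0))  # -> 36.0
```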

  7. Use of normalized total dose to represent the biological effect of fractionated radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Flickinger, J C; Kalend, A [Pittsburgh University School of Medicine (USA). Department of Radiation Oncology Pittsburg Cancer Institute (USA)

    1990-03-01

    There are currently a number of radiobiological models to account for the effects of dose fractionation and time. Normalized total dose (NTD) is not another new model but is a previously reported, clinically useful form in which to represent the biological effect, determined by any specific radiobiological dose-fractionation model, of a course of radiation using a single set of standardized, easily understood terminology. The generalized form of NTD reviewed in this paper describes the effect of a course of radiotherapy administered with nonstandard fractionation as the total dose of radiation in Gy that could be administered with a given reference fractionation such as 2 Gy per fraction, 5 fractions per week that would produce an equivalent biological effect (probability of complications or tumor control) as predicted by a given dose-fractionation formula. The use of normalized total dose with several different exponential and linear-quadratic dose-fraction formulas is presented. (author). 51 refs.; 1 fig.; 1 tab.

  8. 78 FR 63036 - Transmission Planning Reliability Standards

    Science.gov (United States)

    2013-10-23

    ... Reliability Standards for the Bulk Power System, 130 FERC ¶ 61,200 (2010). Mandatory Reliability Standards... electric system operations across normal and contingency conditions. We also find that Reliability Standard... Reliability Standards for the Bulk Power System, 131 FERC ¶ 61,231 at P 21. Comments. 24. NERC supports the...

  9. Effluent standards

    Energy Technology Data Exchange (ETDEWEB)

    Geisler, G C [Pennsylvania State University (United States)

    1974-07-01

    At the conference there was considerable interest in research reactor standards, and effluent standards in particular. On the program, this is demonstrated by the panel discussion on effluents, the paper on argon-41 measurements by Sims, and the summary paper by Ringle et al. on the activities of the ANS research reactor standards committee (ANS-15). As a result, a meeting was organized to discuss the proposed ANS standard on research reactor effluents (15.9). It was held on Tuesday evening and was attended by the members of the ANS-15 committee who were present at the conference, participants in the panel discussion on the subject, and others interested. Out of this meeting came a number of excellent suggestions for changes which will increase the utility of the standard, and a strong recommendation that the effluent standard (15.9) be combined with the effluent monitoring standard. It is expected that these suggestions and recommendations will be incorporated and a revised draft issued for comment early this summer. (author)

  10. Nuclear standards

    International Nuclear Information System (INIS)

    Fichtner, N.; Becker, K.; Bashir, M.

    1981-01-01

    This compilation of all nuclear standards available to the authors by mid-1980 represents the third, carefully revised edition of a catalogue first published in 1975 as EUR 5362. In this third edition several changes have been made. The title has been condensed. The information has again been carefully updated, covering all changes regarding status, withdrawal of old standards, new projects, amendments, revisions, splitting of standards into several parts, combination of several standards into one, etc., as available to the authors by mid-1980. The speed with which information travels varies, and gathering it required in many cases rather tedious and cumbersome inquiries. The classification scheme has also been revised with the goal of better adjustment to changing situations and priorities. Whenever it turned out to be difficult to attribute a standard to a single subject category, multiple listings in all relevant categories have been made. As in previous editions, within the subcategories the standards are arranged by organization (in Category 2.1 by country) alphabetically and in ascending numerical order. The compilation covers all relevant areas of power reactors, the fuel cycle, radiation protection, etc., from the basic laws and governmental regulations, regulatory guides, etc., all the way to voluntary industrial standards and codes of practice. (orig./HP)

  11. Densified waste form and method for forming

    Science.gov (United States)

    Garino, Terry J.; Nenoff, Tina M.; Sava Gallis, Dorina Florentina

    2015-08-25

    Materials and methods of making densified waste forms for temperature sensitive waste material, such as nuclear waste, formed with low temperature processing using metallic powder that forms the matrix that encapsulates the temperature sensitive waste material. The densified waste form includes a temperature sensitive waste material in a physically densified matrix, the matrix is a compacted metallic powder. The method for forming the densified waste form includes mixing a metallic powder and a temperature sensitive waste material to form a waste form precursor. The waste form precursor is compacted with sufficient pressure to densify the waste precursor and encapsulate the temperature sensitive waste material in a physically densified matrix.

  12. 29 CFR 1904.29 - Forms.

    Science.gov (United States)

    2010-07-01

    ... OSHA 300 Log. Instead, enter “privacy case” in the space normally used for the employee's name. This...) Basic requirement. You must use OSHA 300, 300-A, and 301 forms, or equivalent forms, for recordable injuries and illnesses. The OSHA 300 form is called the Log of Work-Related Injuries and Illnesses, the 300...

  13. MATE standardization

    Science.gov (United States)

    Farmer, R. E.

    1982-11-01

    The MATE (Modular Automatic Test Equipment) program was developed to combat the proliferation of unique, expensive ATE within the Air Force. MATE incorporates a standard management approach and a standard architecture designed to implement a cradle-to-grave approach to the acquisition of ATE and to significantly reduce the life cycle cost of weapons systems support. These standards are detailed in the MATE Guides. The MATE Guides assist both the Air Force and Industry in implementing the MATE concept, and provide the necessary tools and guidance required for successful acquisition of ATE. The guides also provide the necessary specifications for industry to build MATE-qualifiable equipment. The MATE architecture provides standards for all key interfaces of an ATE system. The MATE approach to the acquisition and management of ATE has been jointly endorsed by the commanders of Air Force Systems Command and Air Force Logistics Command as the way of doing business in the future.

  14. Standardization of Sign Languages

    Science.gov (United States)

    Adam, Robert

    2015-01-01

    Over the years attempts have been made to standardize sign languages. This form of language planning has been tackled by a variety of agents, most notably teachers of Deaf students, social workers, government agencies, and occasionally groups of Deaf people themselves. Their efforts have most often involved the development of sign language books…

  15. IMPLICATIONS OF STANDARDIZATION AND HARMONIZATION OF ACCOUNTING FOR ROMANIA

    OpenAIRE

    Mihaela Cristina Onica; Neculina Chebac

    2008-01-01

    Accounting normalization involves the making of rules and accounting standards, rules that provide a common denominator for action and implementation so that the content of financial statements and accounting information can be compared. The accounting normalization process is structured in two main areas: national accounting normalization and international accounting normalization. Though the concept of international accounting normalization has to be realized a differen...

  16. Normalization of emotion control scale

    Directory of Open Access Journals (Sweden)

    Hojatoolah Tahmasebian

    2014-09-01

    Full Text Available Background: Emotion control skill teaches individuals how to identify their emotions and how to express and control them in various situations. The aim of this study was to normalize and measure the internal and external validity and reliability of the emotion control test. Methods: This standardization study was carried out on a statistical population including all pupils, students, teachers, nurses and university professors in Kermanshah in 2012, using Williams' emotion control scale. The subjects included 1,500 people (810 females and 690 males) who were selected by stratified random sampling. Williams' (1997) emotion control scale was used to collect the required data. The Emotional Control Scale is a tool for measuring the degree of control people have over their emotions. The scale has four subscales: anger, depressed mood, anxiety and positive affect. The collected data were analyzed by SPSS software using correlation and Cronbach's alpha tests. Results: The internal consistency of the questionnaire, reported by Cronbach's alpha, was acceptable for the emotional control scale, and the correlations between the subscales of the test and between the items of the questionnaire were significant at the 0.01 level. Conclusion: The validity of the emotion control scale among pupils, students, teachers, nurses and university professors in Iran falls within an acceptable range, and the test items were correlated with each other, making the scale appropriate for measuring emotion control.
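The internal-consistency statistic used above, Cronbach's alpha, is straightforward to compute from an item-score matrix. A minimal sketch (the response data below are hypothetical, not from the study):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_subjects, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of subjects' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 6 subjects x 4 items (e.g., scores on the anger,
# depressed mood, anxiety, and positive affect subscales).
scores = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
    [5, 4, 5, 4],
])
alpha = cronbach_alpha(scores)
```

Alpha approaches 1 when the items covary strongly relative to their individual variances; values above roughly 0.7 are conventionally read as acceptable internal consistency.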

  17. Comparative and quantitative determination of total hemoglobin concentration in normal and psoriatic patients

    International Nuclear Information System (INIS)

    Mahesar, S.M.; Dahot, M.U.; Khuhawar, M.Y.; Mahesar, H.U.

    2004-01-01

    The cyanmethaemoglobin technique is now recommended as the standard method by the International Committee for Standardization in Hematology and the British Standards Institution (1966). The hemoglobin is treated with a reagent containing potassium ferricyanide, potassium cyanide and potassium dihydrogen phosphate. The ferricyanide forms methaemoglobin, which is converted to cyanmethaemoglobin by the cyanide. The average values of hemoglobin determined from the blood samples of normal and psoriatic males (n=44) and females (n=35) were 15.0, 12.7, 13.6 and 11.2 g/100 ml, respectively. The decrease in hemoglobin concentration could be due to anemia arising during cell proliferation of the epidermis in the inflammatory state and the keratolytic disorder that takes place in psoriasis. (author)

  18. Frequency standards

    CERN Document Server

    Riehle, Fritz

    2006-01-01

    Of all measurement units, frequency is the one that may be determined with the highest degree of accuracy. It equally allows precise measurements of other physical and technical quantities, whenever they can be measured in terms of frequency. This volume covers the central methods and techniques relevant for frequency standards developed in physics, electronics, quantum electronics, and statistics. After a review of the basic principles, the book looks at the realisation of commonly used components. It then continues with the description and characterisation of important frequency standards

  19. Biochemical response of normal albino rats to the addition of ...

    African Journals Online (AJOL)

    Experiments were conducted to determine the biochemical effect of Hibiscus cannabinus and Murraya koenigii extracts on normal albino rats using standard methods. Analyses carried out indicated that the aqueous leaf extract of H. cannabinus and M. koenigii exhibited significant hypolipideamic activity in normal rats.

  20. Short proofs of strong normalization

    OpenAIRE

    Wojdyga, Aleksander

    2008-01-01

    This paper presents simple, syntactic strong normalization proofs for the simply-typed lambda-calculus and the polymorphic lambda-calculus (System F) with the full set of logical connectives and all the permutative reductions. The normalization proofs use translations of terms and types into systems for which the strong normalization property is already known.

  1. Weak convergence and uniform normalization in infinitary rewriting

    DEFF Research Database (Denmark)

    Simonsen, Jakob Grue

    2010-01-01

    the starkly surprising result that for any orthogonal system with finitely many rules, the system is weakly normalizing under weak convergence iff it is strongly normalizing under weak convergence iff it is weakly normalizing under strong convergence iff it is strongly normalizing under strong...... convergence. As further corollaries, we derive a number of new results for weakly convergent rewriting: Systems with finitely many rules enjoy unique normal forms, and acyclic orthogonal systems are confluent. Our results suggest that it may be possible to recover some of the positive results for strongly...

  2. Normalized Excited Squeezed Vacuum State and Its Applications

    International Nuclear Information System (INIS)

    Meng Xiangguo; Wang Jisuo; Liang Baolong

    2007-01-01

    By using the intermediate coordinate-momentum representation in quantum optics and a generating function for the normalization of the excited squeezed vacuum state (ESVS), the normalized ESVS is obtained. We find that the normalization constants obtained via two new methods agree, and take a new form that differs from the result obtained by Zhang and Fan [Phys. Lett. A 165 (1992) 14]. By virtue of the normalization constant of the ESVS and the intermediate coordinate-momentum representation, the tomogram of the normalized ESVS and some useful formulae are derived.

  3. An echocardiographic study of healthy Border Collies with normal reference ranges for the breed.

    Science.gov (United States)

    Jacobson, Jake H; Boon, June A; Bright, Janice M

    2013-06-01

    The objectives of this study were to obtain standard echocardiographic measurements from healthy Border Collies and to compare these measurements to those previously reported for a general population of dogs. Standard echocardiographic data were obtained from twenty apparently healthy Border Collie dogs. These data (n = 20) were compared to data obtained from a general population of healthy dogs (n = 69). Border Collies were deemed healthy based on normal history, physical examination, complete blood count, serum biochemical profile, electrocardiogram, and blood pressure, with no evidence of congenital or acquired heart disease on echocardiographic examination. Standard two dimensional, M-mode, and Doppler echocardiographic measurements were obtained and normal ranges determined. The data were compared to data previously obtained at our hospital from a general population of normal dogs. Two dimensional, M-mode, and Doppler reference ranges for healthy Border Collies are presented in tabular form. Comparison of the weight adjusted M-mode echocardiographic means from Border Collies to those from the general population of dogs showed Border Collies to have larger left ventricular systolic and diastolic dimensions, smaller interventricular septal thickness, and lower fractional shortening. There are differences in some echocardiographic parameters between healthy Border Collies and the general dog population, and the echocardiographic reference ranges provided in this study should be used as breed specific reference values for Border Collies. Copyright © 2013 Elsevier B.V. All rights reserved.

  4. Relevant Standards

    Indian Academy of Sciences (India)

    .86: Ethernet over LAPS. Standard in China and India. G.7041: Generic Framing Procedure (GFP). Supports Ethernet as well as other data formats (e.g., Fibre Channel); Protocol of ... IEEE 802.3x for flow control of incoming Ethernet data ...

  5. Achieving Standardization

    DEFF Research Database (Denmark)

    Henningsson, Stefan

    2014-01-01

    International e-Customs is going through a standardization process. Driven by the need to increase control in the trade process to address security challenges stemming from threats of terrorists, diseases, and counterfeit products, and to lower the administrative burdens on traders to stay...

  6. Achieving Standardization

    DEFF Research Database (Denmark)

    Henningsson, Stefan

    2016-01-01

    International e-Customs is going through a standardization process. Driven by the need to increase control in the trade process to address security challenges stemming from threats of terrorists, diseases, and counterfeit products, and to lower the administrative burdens on traders to stay...

  7. Standard Fortran

    International Nuclear Information System (INIS)

    Marshall, N.H.

    1981-01-01

    Because of its vast software investment in Fortran programs, the nuclear community has an inherent interest in the evolution of Fortran. This paper reviews the impact of the new Fortran 77 standard and discusses the projected changes which can be expected in the future

  8. Linear regression and the normality assumption.

    Science.gov (United States)

    Schmidt, Amand F; Finan, Chris

    2017-12-16

    Researchers often perform arbitrary outcome transformations to fulfill the normality assumption of a linear regression model. This commentary explains and illustrates that in large data settings, such transformations are often unnecessary and, worse, may bias model estimates. Linear regression assumptions are illustrated using simulated data and an empirical example on the relation between time since type 2 diabetes diagnosis and glycated hemoglobin levels. Simulation results were evaluated on coverage; i.e., the number of times the 95% confidence interval included the true slope coefficient. Although outcome transformations bias point estimates, violations of the normality assumption in linear regression analyses do not. The normality assumption is necessary to unbiasedly estimate standard errors, and hence confidence intervals and P-values. However, in large sample sizes (e.g., where the number of observations per variable is >10), violations of this normality assumption often do not noticeably impact results. In contrast, assumptions on the parametric model, absence of extreme observations, homoscedasticity, and independence of the errors remain influential even in large sample size settings. Given that modern healthcare research typically includes thousands of subjects, focusing on the normality assumption is often unnecessary, does not guarantee valid results, and, worse, may bias estimates due to the practice of outcome transformations. Copyright © 2017 Elsevier Inc. All rights reserved.
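The coverage criterion described above can be reproduced in a few lines. A sketch (simulation parameters are illustrative, not the paper's): fit ordinary least squares with deliberately skewed, non-normal errors and count how often the nominal 95% interval captures the true slope.

```python
import numpy as np

rng = np.random.default_rng(0)

def coverage(n_obs, n_sims=2000, true_slope=2.0):
    """Fraction of simulations whose 95% CI covers the true slope,
    with deliberately non-normal (centered exponential) errors."""
    hits = 0
    for _ in range(n_sims):
        x = rng.normal(size=n_obs)
        e = rng.exponential(1.0, size=n_obs) - 1.0  # skewed errors
        y = true_slope * x + e
        # OLS slope and its standard error (model with intercept).
        xc = x - x.mean()
        b = (xc @ y) / (xc @ xc)
        resid = y - y.mean() - b * xc
        se = np.sqrt(resid @ resid / (n_obs - 2) / (xc @ xc))
        if abs(b - true_slope) <= 1.96 * se:
            hits += 1
    return hits / n_sims

cov = coverage(n_obs=500)  # large sample: coverage stays close to 0.95
```

Despite the violated normality assumption, the empirical coverage of the interval remains near its nominal level in this large-sample setting, which is the commentary's central point.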

  9. The normal range of condylar movement

    International Nuclear Information System (INIS)

    Choe, Han Up; Park, Tae Won

    1978-01-01

    The purpose of this study was to investigate the normal range of condylar movement in normal adults. The author has observed roentgenographic images of four serial positions of the condylar head taken by modified transcranial lateral oblique projection. The serial positions are centric occlusion, rest position, 1 inch open position and maximal open position. The results were obtained as follows: 1. Inter-incisal distance was 46.85 mm in maximal open position. 2. The length between the deepest point of the glenoid fossa and the summit of the condylar head in rest position was wider than that in centric occlusion by 0.8 mm. 3. In 1 inch open position, the condylar head moved forward from the standard line by 12.64 mm in the horizontal direction and downwards from the standard line by 1.84 mm in the vertical direction. 4. In maximal open position, the condylar head moved forward from the standard line by 19.06 mm in the horizontal direction and downwards from the standard line by 0.4 mm in the vertical direction. 5. In centric occlusion, the width between the glenoid fossa and the margin of the condylar head was greater in the posterior portion than in the anterior portion by 0.4 mm. 6. Except for the estimated figures of the 1 inch open position, all of the estimated figures were greater in males than in females.

  10. About the principles of radiation level normalization

    International Nuclear Information System (INIS)

    Nosovskij, A.V.

    2000-01-01

    The paper highlights the impact of radiation level normalization principles upon social and economic indicators, taking the newly introduced radiation safety standards-97 as an example. It is emphasized that a sound approach is necessary when defining radiation protection standards, taking into consideration the economic and social factors existing in Ukraine at the moment. Based on the concept of the natural radiation background and available results of epidemiological surveys, dose limits are proposed for the radiation protection standards. The paper describes the dose limitation system recommended by the International Commission on Radiological Protection. It also highlights the negative impact of the linear no-threshold concept and of the lack of specialized knowledge in the medical service and mass media on decisions made to protect people who suffered from the Chernobyl accident

  11. Behavioral finance: Finance with normal people

    Directory of Open Access Journals (Sweden)

    Meir Statman

    2014-06-01

    Behavioral finance substitutes normal people for the rational people in standard finance. It substitutes behavioral portfolio theory for mean-variance portfolio theory, and behavioral asset pricing model for the CAPM and other models where expected returns are determined only by risk. Behavioral finance also distinguishes rational markets from hard-to-beat markets in the discussion of efficient markets, a distinction that is often blurred in standard finance, and it examines why so many investors believe that it is easy to beat the market. Moreover, behavioral finance expands the domain of finance beyond portfolios, asset pricing, and market efficiency and is set to continue that expansion while adhering to the scientific rigor introduced by standard finance.

  12. Visual attention and flexible normalization pools

    Science.gov (United States)

    Schwartz, Odelia; Coen-Cagli, Ruben

    2013-01-01

    Attention to a spatial location or feature in a visual scene can modulate the responses of cortical neurons and affect perceptual biases in illusions. We add attention to a cortical model of spatial context based on a well-founded account of natural scene statistics. The cortical model amounts to a generalized form of divisive normalization, in which the surround is in the normalization pool of the center target only if they are considered statistically dependent. Here we propose that attention influences this computation by accentuating the neural unit activations at the attended location, and that the amount of attentional influence of the surround on the center thus depends on whether center and surround are deemed in the same normalization pool. The resulting model extends a recent divisive normalization model of attention (Reynolds & Heeger, 2009). We simulate cortical surround orientation experiments with attention and show that the flexible model is suitable for capturing additional data and makes nontrivial testable predictions. PMID:23345413
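The core computation the abstract generalizes, divisive normalization with an attentional gain on the center and an optional surround contribution to the pool, can be sketched as follows (function and parameter names are illustrative, not taken from the paper):

```python
import numpy as np

def normalized_response(center_drive, surround_drives, attn_gain=1.0,
                        surround_in_pool=True, sigma=1.0):
    """Divisive normalization of a center unit.

    Attention (attn_gain > 1) accentuates the center drive; the surround
    contributes to the normalization pool only when center and surround
    are treated as statistically dependent (surround_in_pool=True).
    """
    drive = attn_gain * center_drive
    pool = drive ** 2                       # self term is always in the pool
    if surround_in_pool:
        pool += np.sum(np.square(surround_drives))
    return drive ** 2 / (sigma ** 2 + pool)

# Same stimulus, two pool assignments:
r_indep = normalized_response(2.0, [3.0, 3.0], surround_in_pool=False)
r_dep = normalized_response(2.0, [3.0, 3.0], surround_in_pool=True)
# Surround suppression: the response is weaker when the surround
# joins the normalization pool.
```

Raising `attn_gain` increases the center response even when the surround is in the pool, which is the sense in which attention "accentuates" activations at the attended location in this family of models.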

  13. Selective attention in normal and impaired hearing.

    Science.gov (United States)

    Shinn-Cunningham, Barbara G; Best, Virginia

    2008-12-01

    A common complaint among listeners with hearing loss (HL) is that they have difficulty communicating in common social settings. This article reviews how normal-hearing listeners cope in such settings, especially how they focus attention on a source of interest. Results of experiments with normal-hearing listeners suggest that the ability to selectively attend depends on the ability to analyze the acoustic scene and to form perceptual auditory objects properly. Unfortunately, sound features important for auditory object formation may not be robustly encoded in the auditory periphery of HL listeners. In turn, impaired auditory object formation may interfere with the ability to filter out competing sound sources. Peripheral degradations are also likely to reduce the salience of higher-order auditory cues such as location, pitch, and timbre, which enable normal-hearing listeners to select a desired sound source out of a sound mixture. Degraded peripheral processing is also likely to increase the time required to form auditory objects and focus selective attention so that listeners with HL lose the ability to switch attention rapidly (a skill that is particularly important when trying to participate in a lively conversation). Finally, peripheral deficits may interfere with strategies that normal-hearing listeners employ in complex acoustic settings, including the use of memory to fill in bits of the conversation that are missed. Thus, peripheral hearing deficits are likely to cause a number of interrelated problems that challenge the ability of HL listeners to communicate in social settings requiring selective attention.

  14. Basic characterization of normal multifocal electroretinogram

    International Nuclear Information System (INIS)

    Fernandez Cherkasova, Lilia; Rojas Rondon, Irene; Castro Perez, Pedro Daniel; Lopez Felipe, Daniel; Santiesteban Freixas, Rosaralis; Mendoza Santiesteban, Carlos E

    2008-01-01

    A scientific literature review was made of the novel multifocal electroretinogram technique, the cell mechanisms involved, and some of the factors modifying its results, together with their form of presentation. The basic characteristics of this electrophysiological record, obtained from several regions of the retina of normal subjects, are important in order to create a small-scale comparative database for evaluating pathological eye tracings. All this will greatly help in early, less invasive electrodiagnosis of localized retinal lesions. (Author)

  15. Neutron scattering by normal liquids

    Energy Technology Data Exchange (ETDEWEB)

    Gennes, P.G. de [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1961-07-01

    Neutron data on motions in normal liquids well below the critical point are reviewed and classified according to the order of magnitude of the momentum transfer ℏq and energy transfer ℏω. For large momentum transfers a perfect-gas model is valid. For smaller q and incoherent scattering, the major effects are related to the existence of two characteristic times: the period of oscillation of an atom in its cell, and the average lifetime of the atom in a definite cell. Various interpolation schemes covering both time scales are discussed. For coherent scattering and intermediate q, the energy spread is expected to show a minimum whenever q corresponds to a diffraction peak. For very small q the standard macroscopic description of density fluctuations is applicable. The limits of the various q and ω domains and the validity of the various approximations are discussed by a method of moments. The possibility of observing discrete transitions due to internal degrees of freedom in polyatomic molecules, in spite of the 'Doppler width' caused by translational motions, is also examined. (author)

  16. Fusion and normalization to enhance anomaly detection

    Science.gov (United States)

    Mayer, R.; Atkinson, G.; Antoniades, J.; Baumback, M.; Chester, D.; Edwards, J.; Goldstein, A.; Haas, D.; Henderson, S.; Liu, L.

    2009-05-01

    This study examines normalizing the imagery and the optimization metrics to enhance anomaly and change detection, respectively. The RX algorithm, the standard anomaly detector for hyperspectral imagery, more successfully extracts bright rather than dark man-made objects when applied to visible hyperspectral imagery. However, normalizing the imagery prior to applying the anomaly detector can help detect some of the problematic dark objects, but can also miss some bright objects. This study jointly fuses images of RX applied to normalized and unnormalized imagery and has a single decision surface. The technique was tested using imagery of commercial vehicles in an urban environment gathered by a hyperspectral visible/near-IR sensor mounted on an airborne platform. Combining detections first requires converting the detector output to a target probability. The observed anomaly detections were fitted with a linear combination of chi-square distributions, and these weights were used to help compute the target probability. Receiver Operating Characteristic (ROC) curves quantitatively assessed the target detection performance. The target detection performance is highly variable depending on the relative number of candidate bright and dark targets and false alarms, and was controlled in this study by using vegetation and street line masks. The joint Boolean OR and AND operations also generate variable performance depending on the scene. The joint SUM operation provides a reasonable compromise between the OR and AND operations and has good target detection performance. In addition, new transforms based on the normalized correlation coefficient and least squares are related to canonical correlation analysis (CCA) and normalized image regression (NIR). Transforms based on CCA and NIR performed better than the standard approaches. In change detection, only RX applied to the unnormalized difference imagery provides adequate performance.
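The RX detector referenced above is the global Mahalanobis-distance anomaly score. A compact sketch on a synthetic cube (the data and the regularization constant are illustrative):

```python
import numpy as np

def rx_scores(cube):
    """RX anomaly scores (Mahalanobis distance to the global background)
    for a hyperspectral cube of shape (rows, cols, bands)."""
    r, c, b = cube.shape
    X = cube.reshape(-1, b)
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(b))  # regularize for stability
    D = X - mu
    # Per-pixel quadratic form (x - mu)^T C^{-1} (x - mu).
    scores = np.einsum('ij,jk,ik->i', D, cov_inv, D)
    return scores.reshape(r, c)

rng = np.random.default_rng(1)
cube = rng.normal(size=(32, 32, 8))   # synthetic background
cube[5, 5] += 6.0                     # implant a bright anomaly in all bands
scores = rx_scores(cube)
```

Because the score is a distance from the background mean, a dark object whose spectrum sits below the mean can score lower than a bright one of equal contrast once sensor gain and offsets are involved, which motivates the normalization-then-fusion scheme the study evaluates.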

  17. A compact fiber optics-based heterodyne combined normal and transverse displacement interferometer.

    Science.gov (United States)

    Zuanetti, Bryan; Wang, Tianxue; Prakash, Vikas

    2017-03-01

    While Photonic Doppler Velocimetry (PDV) has become a common diagnostic tool for the measurement of normal component of particle motion in shock wave experiments, this technique has not yet been modified for the measurement of combined normal and transverse motion, as needed in oblique plate impact experiments. In this paper, we discuss the design and implementation of a compact fiber-optics-based heterodyne combined normal and transverse displacement interferometer. Like the standard PDV, this diagnostic tool is assembled using commercially available telecommunications hardware and uses a 1550 nm wavelength 2 W fiber-coupled laser, an optical focuser, and single mode fibers to transport light to and from the target. Two additional optical probes capture first-order beams diffracted from a reflective grating at the target free-surface and deliver the beams past circulators and a coupler where the signal is combined to form a beat frequency. The combined signal is then digitized and analyzed to determine the transverse component of the particle motion. The maximum normal velocity that can be measured by this system is limited by the equivalent transmission bandwidth (3.795 GHz) of the combined detector, amplifier, and digitizer and is estimated to be ∼2.9 km/s. Sample symmetric oblique plate-impact experiments are performed to demonstrate the capability of this diagnostic tool in the measurement of the combined normal and transverse displacement particle motion.
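The ∼2.9 km/s figure follows from the heterodyne Doppler relation: a surface moving at normal velocity v produces a beat frequency f_beat = 2v/λ, so a recordable bandwidth B caps the measurable velocity at v_max = λB/2. A quick check of the arithmetic:

```python
# Maximum measurable normal velocity of a heterodyne PDV-type system:
# the Doppler beat frequency is f_beat = 2 * v / wavelength, so the
# bandwidth limit B caps the velocity at v_max = wavelength * B / 2.
wavelength = 1550e-9   # m, laser wavelength quoted in the record
bandwidth = 3.795e9    # Hz, equivalent transmission bandwidth quoted
v_max = wavelength * bandwidth / 2   # m/s; about 2.9 km/s
```

This reproduces the abstract's stated limit of roughly 2.9 km/s for the normal component.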

  18. Normalization in Lie algebras via mould calculus and applications

    Science.gov (United States)

    Paul, Thierry; Sauzin, David

    2017-11-01

    We establish Écalle's mould calculus in an abstract Lie-theoretic setting and use it to solve a normalization problem, which covers several formal normal form problems in the theory of dynamical systems. The mould formalism allows us to reduce the Lie-theoretic problem to a mould equation, the solutions of which are remarkably explicit and can be fully described by means of a gauge transformation group. The dynamical applications include the construction of Poincaré-Dulac formal normal forms for a vector field around an equilibrium point, a formal infinite-order multiphase averaging procedure for vector fields with fast angular variables (Hamiltonian or not), or the construction of Birkhoff normal forms both in classical and quantum situations. As a by-product we obtain, in the case of harmonic oscillators, the convergence of the quantum Birkhoff form to the classical one, without any Diophantine hypothesis on the frequencies of the unperturbed Hamiltonians.

  19. Harmonic Maass forms and mock modular forms

    CERN Document Server

    Bringmann, Kathrin; Ono, Ken

    2017-01-01

    Modular forms and Jacobi forms play a central role in many areas of mathematics. Over the last 10-15 years, this theory has been extended to certain non-holomorphic functions, the so-called "harmonic Maass forms". The first glimpses of this theory appeared in Ramanujan's enigmatic last letter to G. H. Hardy written from his deathbed. Ramanujan discovered functions he called "mock theta functions" which over eighty years later were recognized as pieces of harmonic Maass forms. This book contains the essential features of the theory of harmonic Maass forms and mock modular forms, together with a wide variety of applications to algebraic number theory, combinatorics, elliptic curves, mathematical physics, quantum modular forms, and representation theory.

  20. How delusion is formed?

    Science.gov (United States)

    Park, Jong Suk; Kang, Ung Gu

    2016-02-01

    Traditionally, delusions have been considered to be the products of misinterpretation and irrationality. However, some theorists have argued that delusions are normal or rational cognitive responses to abnormal experiences. That is, when a recently experienced peculiar event is more plausibly explained by an extraordinary hypothesis, confidence in the veracity of this extraordinary explanation is reinforced. As the number of such experiences, driven by the primary disease process in the perceptual domain, increases, this confidence builds and solidifies, forming a delusion. We tried to understand the formation of delusions using a simulation based on Bayesian inference. We found that (1) even if a delusional explanation is only marginally more plausible than a non-delusional one, the repetition of the same experience results in a firm belief in the delusion. (2) The same process explains the systematization of delusions. (3) If the perceived plausibility of the explanation is not consistent but varies over time, the development of a delusion is delayed. Additionally, this model may explain why delusions are not corrected by persuasion or rational explanation. This Bayesian inference perspective can be considered a way to understand delusions in terms of rational human heuristics. However, such experiences of "rationality" can lead to irrational conclusions, depending on the characteristics of the subject. Copyright © 2015 Elsevier Ltd. All rights reserved.
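The first simulation finding, that an only marginally more plausible explanation becomes near-certain under repetition, can be illustrated with iterated Bayes' rule (the prior and likelihood values are illustrative, not the paper's):

```python
def repeated_update(prior, p_e_given_d, p_e_given_nd, n):
    """Posterior probability of a (delusional) hypothesis after n
    repetitions of the same experience, by iterated Bayes' rule."""
    p = prior
    for _ in range(n):
        num = p_e_given_d * p
        p = num / (num + p_e_given_nd * (1 - p))
    return p

# The delusional explanation is only marginally more plausible per
# event (0.55 vs 0.45), yet repetition drives the posterior upward.
p10 = repeated_update(prior=0.01, p_e_given_d=0.55, p_e_given_nd=0.45, n=10)
p50 = repeated_update(prior=0.01, p_e_given_d=0.55, p_e_given_nd=0.45, n=50)
```

After 10 repetitions the hypothesis is still improbable, but after 50 it dominates: each experience multiplies the odds by the same likelihood ratio, so confidence compounds geometrically even from a small per-event edge.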

  1. 32 CFR 2001.80 - Prescribed standard forms.

    Science.gov (United States)

    2010-07-01

    .... The national stock number of the SF 702 is 7540-01-213-7900. (6) SF 703, TOP SECRET Cover Sheet: The SF 703 serves as a shield to protect Top Secret classified information from inadvertent disclosure and to alert observers that Top Secret information is attached to it. If an agency determines, as part...

  2. 41 CFR 101-26.302 - Standard and optional forms.

    Science.gov (United States)

    2010-07-01

    ... been approved by GSA (KMPS) to be stocked and distributed by the promulgating agency or to be... purposes of economy, existing stocks are depleted prior to issuance of revisions unless the promulgating...

  3. On standard forms for transport equations and fluxes: Part 2

    International Nuclear Information System (INIS)

    Ross, D.W.

    1990-03-01

    Quasilinear expressions for anomalous particle and energy fluxes arising from electrostatic plasma turbulence in a tokamak are reviewed yet again. Further clarifications are made, and the position taken in a previous report is modified. There, the total energy flux, Q_j, and the conductive heat flux, q_j, were correctly defined, and the anomalous Q_j was correctly calculated. It was shown that the anomalous energy transport can be correctly described by ∇·Q*_j, where Q*_j = (3/5)Q_j, with all remaining source terms such as ⟨p_j∇·V_j⟩ cancelling. Here, a revised discussion is given of the identification of the anomalous conductive flux, q_j, in which the distinction between Q_j and Q*_j is reconsidered. It is shown that there is more than one consistent way to define q_j. Transport calculations involving only theoretical electrostatic turbulent fluxes are unaffected by these distinctions, since Q_j or Q*_j, rather than q_j, is the quantity naturally calculated in the theory. However, an ambiguity remains in experimental transport analysis if the measured particle flux Γ_j = n_j·V_j is to be used in the energy equation. This is because we cannot be sure how properly to treat the source terms p_j∇·V_j or ⟨p_j∇·V_j⟩. 17 refs

  4. 78 FR 65489 - Standard Claims and Appeals Forms

    Science.gov (United States)

    2013-10-31

    ....regulations.gov . FOR FURTHER INFORMATION CONTACT: Stephanie Caucutt Li, Chief, Regulations Staff (211D... rule enunciated in the main text of paragraphs (a) and (b) applies in certain scenarios. A...

  5. Group normalization for genomic data.

    Science.gov (United States)

    Ghandi, Mahmoud; Beer, Michael A

    2012-01-01

    Data normalization is a crucial preliminary step in analyzing genomic datasets. The goal of normalization is to remove global variation to make readings across different experiments comparable. In addition, most genomic loci have non-uniform sensitivity to any given assay because of variation in local sequence properties. In microarray experiments, this non-uniform sensitivity is due to different DNA hybridization and cross-hybridization efficiencies, known as the probe effect. In this paper we introduce a new scheme, called Group Normalization (GN), to remove both global and local biases in one integrated step, whereby we determine the normalized probe signal by finding a set of reference probes with similar responses. Compared to conventional normalization methods such as Quantile normalization and physically motivated probe effect models, our proposed method is general in the sense that it does not require the assumption that the underlying signal distribution be identical for the treatment and control, and is flexible enough to correct for nonlinear and higher order probe effects. The Group Normalization algorithm is computationally efficient and easy to implement. We also describe a variant of the Group Normalization algorithm, called Cross Normalization, which efficiently amplifies biologically relevant differences between any two genomic datasets.
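As a rough illustration of the reference-probe idea (a simplified sketch, not the authors' exact GN algorithm): normalize each probe's treatment signal against the probes whose control responses are most similar to its own, which cancels per-probe sensitivity while preserving the true treatment effect.

```python
import numpy as np

def group_normalize(treatment, control, n_ref=50):
    """Sketch of reference-set normalization: scale each probe's treatment
    signal by the median control signal of the n_ref probes whose control
    responses are most similar to its own (here: nearest in rank)."""
    order = np.argsort(control)
    ranks = np.empty_like(order)
    ranks[order] = np.arange(len(control))
    normalized = np.empty(len(treatment), dtype=float)
    for i in range(len(treatment)):
        lo = max(0, ranks[i] - n_ref // 2)
        hi = min(len(control), lo + n_ref)
        ref = order[lo:hi]                       # reference probe set
        normalized[i] = treatment[i] / np.median(control[ref])
    return normalized

rng = np.random.default_rng(2)
probe_effect = rng.lognormal(sigma=0.5, size=1000)  # per-probe sensitivity
control = probe_effect * 1.0
treatment = probe_effect * 2.0                      # true 2x signal everywhere
norm = group_normalize(treatment, control)
# The probe effect cancels, leaving the true ~2x fold change.
```

Because the reference set is chosen by similarity of response rather than by a parametric probe model, no assumption is needed that treatment and control share the same signal distribution, which is the flexibility the abstract emphasizes.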

  6. Group normalization for genomic data.

    Directory of Open Access Journals (Sweden)

    Mahmoud Ghandi

    Full Text Available Data normalization is a crucial preliminary step in analyzing genomic datasets. The goal of normalization is to remove global variation to make readings across different experiments comparable. In addition, most genomic loci have non-uniform sensitivity to any given assay because of variation in local sequence properties. In microarray experiments, this non-uniform sensitivity is due to different DNA hybridization and cross-hybridization efficiencies, known as the probe effect. In this paper we introduce a new scheme, called Group Normalization (GN), to remove both global and local biases in one integrated step, whereby we determine the normalized probe signal by finding a set of reference probes with similar responses. Compared to conventional normalization methods such as Quantile normalization and physically motivated probe effect models, our proposed method is general in the sense that it does not require the assumption that the underlying signal distribution be identical for the treatment and control, and is flexible enough to correct for nonlinear and higher order probe effects. The Group Normalization algorithm is computationally efficient and easy to implement. We also describe a variant of the Group Normalization algorithm, called Cross Normalization, which efficiently amplifies biologically relevant differences between any two genomic datasets.

  7. The metabolomics standards initiative (MSI)

    NARCIS (Netherlands)

    Fiehn, O.; Robertson, D.; Griffin, J.; Werf, M. van der; Nikolau, B.; Morrison, N.; Sumner, L.W.; Goodacre, R.; Hardy, N.W.; Taylor, C.; Fostel, J.; Kristal, B.; Kaddurah-Daouk, R.; Mendes, P.; Ommen, B. van; Lindon, J.C.; Sansone, S.-A.

    2007-01-01

    In 2005, the Metabolomics Standards Initiative was formed. An outline and general introduction are provided to describe the history, structure, working plan and intentions of this initiative. Comments on any of the suggested minimal reporting standards are welcome and may be sent to the open

  8. Standard model without Higgs particles

    International Nuclear Information System (INIS)

    Kovalenko, S.G.

    1992-10-01

    A modification of the standard model of electroweak interactions with a nonlocal Higgs sector is proposed. A proper form of nonlocality makes the Higgs particles unobservable after electroweak symmetry breaking. They appear only as virtual states because their propagator is an entire function. We discuss some specific consequences of this approach, comparing it with the conventional standard model. (author). 12 refs

  9. Towards an international address standard

    CSIR Research Space (South Africa)

    Coetzee, S

    2008-02-01

    Full Text Available in a better user experience. Standards compliance allows for the separation of concerns: HTML for content, Cascading Style Sheets (CSS) for presentation and JavaScript for dynamic behaviour. Standards compliant documents are also...) and cascading style sheets through CSS (CSS n.d.), whilst the JavaScript specification has been standardised by Ecma International (another standards organisation for information and communication systems), in the form of EcmaScript (Ecma...

  10. General philosophy of safety standards

    International Nuclear Information System (INIS)

    Dunster, H.J.

    1987-01-01

    Safety standards should be related to the form and magnitude of the risk they aim to limit. Because of the lack of direct information at the exposure levels experienced, radiation protection standards have to be based on risk assumptions that, while plausible, are not proven. The pressure for standards has come as much from public perceptions and fears as from the reality of the risk. (author)

  11. On good ETOL forms

    DEFF Research Database (Denmark)

    Skyum, Sven

    1978-01-01

    This paper continues the study of ETOL forms and good EOL forms done by Maurer, Salomaa and Wood. It is proven that binary very complete ETOL forms exist, good synchronized ETOL forms exist and that no propagating or synchronized ETOL form can be very complete.

  12. Hemoglobin levels in normal Filipino pregnant women.

    Science.gov (United States)

    Kuizon, M D; Natera, M G; Ancheta, L P; Platon, T P; Reyes, G D; Macapinlac, M P

    1981-09-01

    The hemoglobin concentrations during pregnancy in Filipinos belonging to the upper income group, who were prescribed 105 mg elemental iron daily, and who had acceptable levels of transferrin saturation, were examined in an attempt to define normal levels. The hemoglobin concentrations for each trimester followed a Gaussian distribution. The hemoglobin values equal to the mean minus one standard deviation were 11.4 gm/dl for the first trimester and 10.4 gm/dl for the second and third trimesters. Using these values as the lower limits of normal, in one group of pregnant women the prevalence of anemia during the last two trimesters was found to be lower than that obtained when WHO levels for normal were used. Groups of women with hemoglobin of 10.4 to 10.9 gm/dl (classified anemic by WHO criteria but normal in the present study) and those with 11.0 gm/dl and above could not be distinguished on the basis of their serum ferritin levels nor on the degree of decrease in their hemoglobin concentration during pregnancy. Many subjects in both groups, however, had serum ferritin levels less than 12 ng/ml, which indicates poor iron stores. It might be desirable in future studies to determine the hemoglobin cut-off point that will delineate subjects who are both non-anemic and adequate in iron stores, using serum ferritin levels as the criterion for the latter.
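    The cutoff rule used in this record (lower limit of normal = mean minus one standard deviation of the trimester sample) is simple to reproduce; the haemoglobin values below are made up purely for illustration:

```python
from statistics import mean, stdev

def lower_limit_of_normal(hb):
    """Lower limit of normal as mean minus one standard deviation,
    the rule applied per trimester in the record above."""
    return mean(hb) - stdev(hb)

# hypothetical first-trimester haemoglobin values (g/dl)
first_trimester = [12.9, 12.1, 13.4, 11.8, 12.6, 13.0, 12.3, 12.8]
cutoff = lower_limit_of_normal(first_trimester)
```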

  13. ASSESSMENT OF SELECTED PROPERTIES OF NORMAL CONCRETES WITH THE GRINDED RUBBER FROM WORN OUT VEHICLE TYRES

    Directory of Open Access Journals (Sweden)

    Ewa Ołdakowska

    2015-07-01

    Full Text Available Rubber from worn tyres is regarded as a useless material, burdensome for the environment, whose most popular recovery method until recently was storage (currently forbidden by law). The adoption and dissemination of new ecological standards, created not only by European and national legislation but also developing as a result of expanding ecological consciousness, forces the necessity of seeking efficient methods of utilizing end-of-life vehicle tyres. The solution presented in the article for the problem of tyres withdrawn from operation is to use them, in ground form, as a substitute for natural aggregate in the production of normal concrete. The article presents the results of tests of selected properties of the modified normal concrete, on the basis of which it has been found that the rubber decreases compression strength and concrete weight, limits water absorbability, and does not significantly influence the physical and chemical phenomena accompanying the formation of the composite structure.

  14. CT of Normal Developmental and Variant Anatomy of the Pediatric Skull: Distinguishing Trauma from Normality.

    Science.gov (United States)

    Idriz, Sanjin; Patel, Jaymin H; Ameli Renani, Seyed; Allan, Rosemary; Vlahos, Ioannis

    2015-01-01

    The use of computed tomography (CT) in clinical practice has been increasing rapidly, with the number of CT examinations performed in adults and children rising by 10% per year in England. Although the radiology community strives to reduce the radiation dose associated with pediatric examinations, external factors, including guidelines for pediatric head injury, are raising expectations for use of cranial CT in the pediatric population. Thus, radiologists are increasingly likely to encounter pediatric head CT examinations in daily practice. The variable appearance of cranial sutures at different ages can be confusing for inexperienced readers of radiologic images. The evolution of multidetector CT with thin-section acquisition increases the clarity of some of these sutures, which may be misinterpreted as fractures. Familiarity with the normal anatomy of the pediatric skull, how it changes with age, and normal variants can assist in translating the increased resolution of multidetector CT into more accurate detection of fractures and confident determination of normality, thereby reducing prolonged hospitalization of children with normal developmental structures that have been misinterpreted as fractures. More important, the potential morbidity and mortality related to false-negative interpretation of fractures as normal sutures may be avoided. The authors describe the normal anatomy of all standard pediatric sutures, common variants, and sutural mimics, thereby providing an accurate and safe framework for CT evaluation of skull trauma in pediatric patients. ©RSNA, 2015.

  15. Normal anatomical measurements in cervical computerized tomography

    International Nuclear Information System (INIS)

    Zaunbauer, W.; Daepp, S.; Haertel, M.

    1985-01-01

    Radiodiagnostically relevant normal values and variations for measurements of the cervical region (arithmetic mean and standard deviation) were determined from adequate computed tomograms of 60 healthy women and men aged 20 to 83 years. The sagittal diameter of the prevertebral soft tissue and the lumina of the upper respiratory tract were evaluated at exactly defined levels between the hyoid bone and the incisura jugularis sterni. The thickness of the aryepiglottic folds, the maximal sagittal and transverse diameters of the thyroid gland and the calibre of the great cervical vessels were defined. To assess laryngeal function in computed tomography, measurements were made of distances between the cervical spine and fixed anatomical points of the larynx and hypopharynx, as well as of the degree of vocal cord movement during normal respiration and phonation. (orig.) [de

  16. Is this the right normalization? A diagnostic tool for ChIP-seq normalization.

    Science.gov (United States)

    Angelini, Claudia; Heller, Ruth; Volkinshtein, Rita; Yekutieli, Daniel

    2015-05-09

    ChIP-seq experiments are becoming a standard approach for genome-wide profiling of protein-DNA interactions, such as detecting transcription factor binding sites, histone modification marks and RNA Polymerase II occupancy. However, when comparing a ChIP sample with a control sample, such as Input DNA, normalization procedures have to be applied in order to remove experimental sources of bias. Despite the substantial impact that the choice of normalization method can have on the results of a ChIP-seq data analysis, its assessment is not fully explored in the literature. In particular, there are no diagnostic tools that show whether the applied normalization is indeed appropriate for the data being analyzed. In this work we propose a novel diagnostic tool to examine the appropriateness of the estimated normalization procedure. By plotting the empirical densities of log relative risks in bins of equal read count, along with the estimated normalization constant after logarithmic transformation, the researcher is able to assess the appropriateness of the estimated normalization constant. We use the diagnostic plot to evaluate the appropriateness of the estimates obtained by CisGenome, NCIS and CCAT on several real data examples. Moreover, we show the impact that the choice of normalization constant can have on standard peak-calling tools such as MACS or SICER. Finally, we propose a novel procedure for controlling the FDR using sample swapping. This procedure makes use of the estimated normalization constant in order to gain power over the naive choice of constant (used in MACS and SICER), which is the ratio of the total number of reads in the ChIP and Input samples. Linear normalization approaches aim to estimate a scale factor, r, to adjust for different sequencing depths when comparing ChIP versus Input samples. The estimated scaling factor can easily be incorporated in many peak caller algorithms to improve the accuracy of peak identification.
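    The naive scaling constant discussed in the abstract, and the binned log relative risks underlying the diagnostic plot, can be sketched as follows. This is a simplified illustration, not the authors' code; the equal-size binning by coverage and the positive-count filter are assumptions:

```python
from math import log

def naive_scale_factor(chip_counts, input_counts):
    """Naive linear normalization constant: ratio of total ChIP reads
    to total Input reads (the default choice noted in the abstract)."""
    return sum(chip_counts) / sum(input_counts)

def binned_log_ratios(chip_counts, input_counts, n_bins=4):
    """Group genomic bins by total read count and return the mean
    log relative risk log(chip/input) within each group -- the raw
    material for the diagnostic plot described above."""
    pairs = [(c, i) for c, i in zip(chip_counts, input_counts) if c > 0 and i > 0]
    pairs.sort(key=lambda p: p[0] + p[1])  # order bins by coverage
    size = max(1, len(pairs) // n_bins)
    means = []
    for b in range(0, len(pairs), size):
        chunk = pairs[b:b + size]
        means.append(sum(log(c / i) for c, i in chunk) / len(chunk))
    return means
```

    Comparing the per-group means against the log of the estimated constant is the judgment the diagnostic plot supports: a well-chosen constant should sit near the log-ratio density of background bins.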

  17. Normal matter storage of antiprotons

    International Nuclear Information System (INIS)

    Campbell, L.J.

    1987-01-01

    Various simple issues connected with the possible storage of p̄ in relative proximity to normal matter are discussed. Although equilibrium storage looks to be impossible, condensed matter systems are sufficiently rich and controllable that nonequilibrium storage is well worth pursuing. Experiments to elucidate the p̄ interactions with normal matter are suggested. 32 refs

  18. The Normal Distribution

    Indian Academy of Sciences (India)

    An optimal way of choosing sample size in an opinion poll is indicated using the normal distribution. Introduction. In this article, the ubiquitous normal distribution is introduced as a convenient approximation for computing binomial probabilities for large values of n. Stirling's formula and the DeMoivre-Laplace theorem ...

  19. Standard Model

    CERN Multimedia

    Dominguez, Daniel

    2015-01-01

    All matter around us is made of elementary particles, the building blocks of matter. These particles occur in two basic types called quarks and leptons. Each group consists of six particles, which are related in pairs, or “generations”. The lightest and most stable particles make up the first generation, whereas the heavier and less stable particles belong to the second and third generations. All stable matter in the universe is made from particles that belong to the first generation; any heavier particles quickly decay to the next most stable level. The six quarks are paired in the three generations – the “up quark” and the “down quark” form the first generation, followed by the “charm quark” and “strange quark”, then the “top quark” and “bottom (or beauty) quark”. Quarks also come in three different “colours” and only mix in such ways as to form colourless objects. The six leptons are similarly arranged in three generations – the “electron” and the “electron neutrin...

  20. Normalizing acronyms and abbreviations to aid patient understanding of clinical texts: ShARe/CLEF eHealth Challenge 2013, Task 2.

    Science.gov (United States)

    Mowery, Danielle L; South, Brett R; Christensen, Lee; Leng, Jianwei; Peltonen, Laura-Maria; Salanterä, Sanna; Suominen, Hanna; Martinez, David; Velupillai, Sumithra; Elhadad, Noémie; Savova, Guergana; Pradhan, Sameer; Chapman, Wendy W

    2016-07-01

    The ShARe/CLEF eHealth challenge lab aims to stimulate development of natural language processing and information retrieval technologies to aid patients in understanding their clinical reports. In clinical text, acronyms and abbreviations, also referred to as short forms, can be difficult for patients to understand. For one of three shared tasks in 2013 (Task 2), we generated a reference standard of clinical short forms normalized to the Unified Medical Language System. This reference standard can be used to improve patient understanding by linking to web sources with lay descriptions of annotated short forms or by substituting short forms with a more simplified lay term. In this study, we evaluate 1) the accuracy of participating systems in normalizing short forms compared to a majority-sense baseline approach, 2) the performance of participants' systems on short forms with variable majority-sense distributions, and 3) the accuracy of participating systems in normalizing the concepts shared between the test set and the Consumer Health Vocabulary, a vocabulary of lay medical terms. The best systems submitted by the five participating teams performed with accuracies ranging from 43 to 72%. A majority-sense baseline approach achieved the second-best performance. For short forms with two or more senses, the performance of participating systems ranged from 52 to 78% accuracy with low ambiguity (majority sense greater than 80%), from 23 to 57% accuracy with moderate ambiguity (majority sense between 50 and 80%), and from 2 to 45% accuracy with high ambiguity (majority sense less than 50%). With respect to the ShARe test set, 69% of short-form annotations shared common concept unique identifiers with the Consumer Health Vocabulary. For these 2594 possible annotations, the performance of participating systems ranged from 50 to 75% accuracy. Short form normalization continues

  1. Disjoint sum forms in reliability theory

    Directory of Open Access Journals (Sweden)

    B. Anrig

    2014-01-01

    Full Text Available The structure function f of a binary monotone system is assumed to be known and given in a disjunctive normal form, i.e. as the logical union of products of the indicator variables of the states of its subsystems. Based on this representation of f, an improved Abraham algorithm is proposed for generating the disjoint sum form of f. This form is the basis for subsequent numerical reliability calculations. The approach is generalized to multivalued systems. Examples are discussed.
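    For a small system, the point of a disjoint sum form can be illustrated by comparison with brute-force state enumeration. The sketch below computes the exact reliability of a system given as a union of product terms; it is not the Abraham algorithm itself, only the quantity a disjoint sum form evaluates directly:

```python
from itertools import product

def reliability(minpaths, p):
    """Exact reliability of a binary monotone system whose structure
    function is the union of the given product terms (min-paths),
    computed by brute-force enumeration of all component states.
    A disjoint sum form yields the same number directly as a plain
    sum of product probabilities, without enumeration."""
    n = len(p)
    total = 0.0
    for state in product((0, 1), repeat=n):
        # system works if some product term has all its components up
        if any(all(state[i] for i in path) for path in minpaths):
            prob = 1.0
            for i, s in enumerate(state):
                prob *= p[i] if s else 1.0 - p[i]
            total += prob
    return total

# 2-out-of-3 system: it works if components {0,1}, {0,2} or {1,2} all work
paths = [(0, 1), (0, 2), (1, 2)]
```

    Enumeration costs 2^n evaluations, which is why algorithms that produce a disjoint sum form matter for larger systems.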

  2. Normal distal pulmonary vein anatomy

    Directory of Open Access Journals (Sweden)

    Wiesława Klimek-Piotrowska

    2016-01-01

    Full Text Available Background. It is well known that the pulmonary veins (PVs, especially their myocardial sleeves, play a critical role in the initiation and maintenance of atrial fibrillation. Understanding the PV anatomy is crucial for the safety and efficacy of all procedures performed on PVs. The aim of this study was to present normal distal PV anatomy and to create a juxtaposition of all PV ostium variants. Methods. A total of 130 randomly selected autopsied adult human hearts (Caucasian) were examined. The number of PV ostia was evaluated and their diameter was measured. The ostium-to-last-tributary distance and macroscopic presence of myocardial sleeves were also evaluated. Results. Five hundred forty-one PV ostia were identified. Four classical PV ostia patterns (two left and two right PVs) were observed in 70.8% of all cases. The most common variant was the classical pattern with additional middle right PV (19.2%), followed by the common ostium for the left superior and the inferior PVs (4.44%). Mean diameters of PV ostia (for the classical pattern) were: left superior = 13.8 ± 2.9 mm; left inferior = 13.3 ± 3.4 mm; right superior = 14.3 ± 2.9 mm; right inferior = 13.7 ± 3.3 mm. When present, the additional middle right PV ostium had the smallest PV ostium diameter in the heart (8.2 ± 4.1 mm). The mean ostium-to-last-tributary (closest to the atrium) distances were: left superior = 15.1 ± 4.6 mm; left inferior = 13.5 ± 4.0 mm; right superior = 11.8 ± 4.0 mm; right inferior = 11.0 ± 3.7 mm. There were no statistically significant differences between sexes in ostia diameters and ostium-to-last-tributary distances. Conclusion. Only 71% of the cases have four standard pulmonary veins. The middle right pulmonary vein is present in almost 20% of patients. The presented data can provide useful information for clinicians during interventional procedures or radiologic examinations of PVs.

  3. Electroweak form factors

    International Nuclear Information System (INIS)

    Singh, S.K.

    2002-01-01

    The present status of electroweak nucleon form factors and the N-Δ transition form factors is reviewed. In particular, the determination of the dipole mass M_A in the axial vector form factor is discussed.

  4. Visualization of normal pleural sinuses with AMBER

    International Nuclear Information System (INIS)

    Aarts, N.J.; Kool, L.J.S.; Oestmann, J.W.

    1991-01-01

    This paper reports that ventral and dorsal pleural sinuses are frequently better appreciated with advanced modulated beam equalization radiography (AMBER) than with standard chest radiography. The visualization of the sinuses with both techniques was compared and their typical configuration studied. Four hundred patients without known chest disease were evaluated. Two groups of 200 patients were studied with either AMBER or standard chest radiography. Visualization was evaluated by three radiologists using a four-point scale. The shape of the sinus was traced if sufficiently visible. A significantly larger segment of the respective sinuses was seen with the AMBER technique. The dorsal sinus was significantly easier to trace than the ventral. Various sinus configurations were noted. AMBER improves the visibility of the pleural sinuses. Knowledge of their normal configuration is the precondition for correctly diagnosing lesions hitherto frequently overlooked

  5. VIERS Electronic Form Submission Service (EFSS)

    Data.gov (United States)

    Department of Veterans Affairs — The D2D EFSS (Inc 1 and 2) provides a common access point to standardize, centralize, and integrate the universal collection of Benefit Claim Forms and supporting...

  6. The normal and pathological language

    OpenAIRE

    Espejo, Luis D.

    2014-01-01

    The extraordinary development that normal and pathological psychology has achieved in recent decades, thanks to the dual method of objective observation and oral survey, has enabled the inquiring spirit of the neuro-psychiatrist to penetrate the intimate mechanism of the nervous system, whose supreme manifestation is thought. It is normal psychology that explains the complicated interplay of perceptions: their methods of transmission, their centers of projection, their transformations and their synthesis to construct ...

  7. 48 CFR 1913.505-2 - Board order forms in lieu of Optional and Standard Forms.

    Science.gov (United States)

    2010-10-01

    ... BROADCASTING BOARD OF GOVERNORS CONTRACTING METHODS AND CONTRACT TYPES SMALL PURCHASES AND OTHER SIMPLIFIED... services as script writers, translators, narrators, etc. [50 FR 13205, Apr. 3, 1985] ...

  8. 78 FR 69086 - Federal Acquisition Regulation; Submission for OMB Review; Preaward Survey Forms (Standard Forms...

    Science.gov (United States)

    2013-11-18

    .../or business confidential information provided. FOR FURTHER INFORMATION CONTACT: Ms. Cecelia L. Davis... of the information collection would violate the fundamental purposes of the Paperwork Reduction Act... statistics) that were over the simplified acquisition threshold and that did not use FAR part 12 commercial...

  9. Clinical and psychological features of normal-weight women with subthreshold anorexia nervosa: a pilot case-control observational study.

    Science.gov (United States)

    Tagliabue, Anna; Ferraris, Cinzia; Martinelli, Valentina; Pinelli, Giovanna; Repossi, Ilaria; Trentani, Claudia

    2012-01-01

    Weight preoccupations have been frequently reported in normal-weight subjects. Subthreshold anorexia nervosa (s-AN; all DSM-IV-TR criteria except amenorrhea or underweight) is a form of eating disorder not otherwise specified that has received scarce scientific attention. Under a case-control design we compared the general characteristics, body composition, and psychopathological features of normal-weight patients with s-AN with those of BMI- and sex-matched controls. Participants in this pilot study included 9 normal-weight women who met the DSM-IV-TR criteria for s-AN and 18 BMI-matched normal-weight controls. The general characteristics of the study participants were collected by questionnaire. Body composition was measured by bioelectrical impedance. Behavioral and psychological measures included the standardized symptom checklist (SCL-90-R) and the eating disorder inventory (EDI-2). There were no differences in age, education, employment status, marital status, and history of previous slimming treatment in the two study groups. In addition, anthropometric measures and body composition of s-AN patients and BMI-matched normal-weight controls were not significantly different. In the s-AN subgroup, we found a significant relationship between waist circumference and the SCL-90-R obsessivity-compulsivity scale (n = 9, r = -0.69, p < 0.05) in the study cohort. These pilot results suggest that psychopathological criteria (particularly related to the obsessivity-compulsivity dimension) may be more useful than anthropometric measures for screening of s-AN in normal-weight women.

  10. Is normal science good science?

    Directory of Open Access Journals (Sweden)

    Adrianna Kępińska

    2015-09-01

    Full Text Available “Normal science” is a concept introduced by Thomas Kuhn in The Structure of Scientific Revolutions (1962. In Kuhn’s view, normal science means “puzzle solving”, solving problems within the paradigm—framework most successful in solving current major scientific problems—rather than producing major novelties. This paper examines Kuhnian and Popperian accounts of normal science and their criticisms to assess if normal science is good. The advantage of normal science according to Kuhn was “psychological”: subjective satisfaction from successful “puzzle solving”. Popper argues for an “intellectual” science, one that consistently refutes conjectures (hypotheses and offers new ideas rather than focus on personal advantages. His account is criticized as too impersonal and idealistic. Feyerabend’s perspective seems more balanced; he argues for a community that would introduce new ideas, defend old ones, and enable scientists to develop in line with their subjective preferences. The paper concludes that normal science has no one clear-cut set of criteria encompassing its meaning and enabling clear assessment.

  11. nth roots of normal contractions

    International Nuclear Information System (INIS)

    Duggal, B.P.

    1992-07-01

    Given a complex separable Hilbert space H and a contraction A on H such that A^n, for some integer n ≥ 2, is normal, it is shown that if the defect operator D_A = (1 - A*A)^{1/2} is of the Hilbert-Schmidt class, then A is similar to a normal contraction, either A or A^2 is normal, and if A^2 is normal (but A is not) then there is a normal contraction N and a positive definite contraction P of trace class such that ‖A - N‖_1 = (1/2)‖P + P‖_1, where ‖·‖_1 denotes the trace norm. If T is a compact contraction whose characteristic function admits a scalar factor, if T = A^n for some integer n ≥ 2 and contraction A with simple eigenvalues, and if both T and A satisfy a ''reductive property'', then A is a compact normal contraction. (author). 16 refs

  12. Adaptive municipal electronic forms

    NARCIS (Netherlands)

    Kuiper, Pieternel; van Dijk, Elisabeth M.A.G.; Bondarouk, Tatiana; Ruel, Hubertus Johannes Maria; Guiderdoni-Jourdain, Karine; Oiry, Ewan

    Adaptation of electronic forms (e-forms) seems to be a step forward in reducing the burden on people who fill in forms. Municipalities increasingly offer e-forms online that can be used by citizens to request a municipal product or service or by municipal employees to place a request on behalf of a

  13. Exchange rate arrangements: From extreme to "normal"

    Directory of Open Access Journals (Sweden)

    Beker Emilija

    2006-01-01

    Full Text Available The paper studies the theoretical and empirical dispersion of exchange rate arrangements (rigid, intermediate and flexible regimes) in the context of the extreme arrangements of a currency board, dollarization and monetary union, the moderate characteristics of intermediate arrangements (adjustable pegs, crawling pegs and target zones), and the imperative process of "normalization" in the form of a managed or clean floating system. It is established that de iure and de facto classifications generate "fear of floating" and "fear of pegging". The "impossible trinity" under the conditions of capital liberalization and globalization creates a bipolar view, or the hypothesis of vanishing intermediate exchange rate regimes.

  14. Self-consistent normal ordering of gauge field theories

    International Nuclear Information System (INIS)

    Ruehl, W.

    1987-01-01

    Mean-field theories with a real action of unconstrained fields can be self-consistently normal ordered. This leads to a considerable improvement over standard mean-field theory. This concept is applied to lattice gauge theories. First an appropriate real action mean-field theory is constructed. The equations determining the Gaussian kernel necessary for self-consistent normal ordering of this mean-field theory are derived. (author). 4 refs

  15. [Quantification of acetabular coverage in normal adult].

    Science.gov (United States)

    Lin, R M; Yang, C Y; Yu, C Y; Yang, C R; Chang, G L; Chou, Y L

    1991-03-01

    Quantification of acetabular coverage is important and can be expressed by superimposition of cartilage tracings on the maximum cross-sectional area of the femoral head. We have developed a practical AutoLISP program in PC AutoCAD to quantify acetabular coverage through numerical expression of computed tomography images. Thirty adults (60 hips) with normal center-edge angle and acetabular index on plain radiographs were randomly selected for serial scans. The slices were prepared with fixed coordinates in continuous sections of 5 mm thickness. The contours of the cartilage of each section were digitized into a PC and processed by AutoCAD programs to quantify and characterize the acetabular coverage of normal and dysplastic adult hips. We found that a total coverage ratio greater than 80%, an anterior coverage ratio greater than 75% and a posterior coverage ratio greater than 80% can be categorized as normal. The polar edge distance is a good indicator for the evaluation of preoperative and postoperative coverage conditions. For standardization and evaluation of acetabular coverage, the most suitable parameters are the total coverage ratio, anterior coverage ratio, posterior coverage ratio and polar edge distance. However, the medial and lateral coverage ratios are indispensable in cases of dysplastic hip because variations between them are so great that acetabuloplasty may be impossible. This program can also be used to classify precisely the type of dysplastic hip.

  16. Commensuration and Legitimacy in Standards

    DEFF Research Database (Denmark)

    Hale, Lara

    This paper claims that commensuration is a form of valuation crucial for the legitimacy of standards. It is thus far poorly understood how standards are constructed in a legitimate manner, let alone the role of commensuration, the micro-process of converting qualities into measurable quantities for the purpose of comparison. The aim is to show how commensuration affects legitimacy at different phases of a standard's formation and diffusion. In order to do this, the lens is placed upon the relationship between commensuration processes and input and output legitimacies. Research on the Active House … legitimacy in different stages, either technical for the standard's specifications or contextual for the standard's implementation. Based on these findings, the paper offers a model of the commensurative development undergone in order to develop the legitimacy of a standard.

  17. Precaval retropancreatic space: Normal anatomy

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yeon Hee; Kim, Ki Whang; Kim, Myung Jin; Yoo, Hyung Sik; Lee, Jong Tae [Yonsei University College of Medicine, Seoul (Korea, Republic of)

    1992-07-15

    The authors defined the precaval retropancreatic space as the space between the pancreatic head with the portal vein and the IVC, and analyzed the CT findings of this space to determine its normal structures and size. We retrospectively evaluated 100 normal abdominal CT scans to identify the normal anatomic structures of the precaval retropancreatic space. We also measured the distances between these structures and calculated the minimum, maximum and mean values. At the level of the splenoportal confluence, the normal structures between the portal vein and the IVC were, in order of frequency, vessels (21%), lymph nodes (19%), and the caudate lobe of the liver (2%). The maximum AP diameter of the portocaval lymph nodes was 4 mm. The common bile duct (CBD) was seen in 44% of cases; its diameter was 3 mm on average and 11 mm at maximum. The CBD was located extrapancreatically (75%) and lateral (60.6%) to the pancreatic head. At the level of the IVC and left renal vein, the maximum distance between the CBD and the IVC was 5 mm, and the only structure between the posterior pancreatic surface and the IVC was fat tissue. Knowledge of these normal structures and measurements will be helpful in differentiating a pancreatic mass from a retropancreatic mass such as lymphadenopathy.

  18. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
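
    The computation underlying such a plot can be sketched as follows. This pairs the ordered observations with theoretical normal quantiles at Blom plotting positions and scores linearity with the probability-plot correlation coefficient, a simpler objective summary than the paper's simultaneous 1-α intervals; all function names are ours.

```python
import math
import random
from statistics import NormalDist

def probability_plot(sample):
    """Pair each ordered observation with its theoretical normal
    quantile, using Blom plotting positions (i - 0.375) / (n + 0.25)."""
    n = len(sample)
    nd = NormalDist()
    qs = [nd.inv_cdf((i - 0.375) / (n + 0.25)) for i in range(1, n + 1)]
    return qs, sorted(sample)

def correlation(xs, ys):
    """Pearson correlation of the plotted points; near 1 for normal data."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

random.seed(0)
data = [random.gauss(10.0, 2.0) for _ in range(200)]
qs, xs = probability_plot(data)
r = correlation(qs, xs)  # close to 1 if the sample is normal
```

    A low correlation here plays the same role as points escaping the intervals: an objective signal of non-normality.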

  19. Establishing the proteome of normal human cerebrospinal fluid.

    Directory of Open Access Journals (Sweden)

    Steven E Schutzer

    2010-06-01

    Full Text Available Knowledge of the entire protein content, the proteome, of normal human cerebrospinal fluid (CSF) would enable insights into neurologic and psychiatric disorders. Until now, technologic hurdles and access to true normal samples hindered attaining this goal. We applied immunoaffinity separation and high-sensitivity, high-resolution liquid chromatography-mass spectrometry to examine CSF from healthy normal individuals. 2630 proteins were identified in CSF from normal subjects, of which 56% were CSF-specific, not found in the much larger set of 3654 proteins we have identified in plasma. We also examined CSF from groups of subjects previously examined by others as surrogates for normals, where neurologic symptoms warranted a lumbar puncture but clinical laboratory results were reported as normal. We found statistically significant differences between their CSF proteins and our non-neurological normals. We also examined CSF from 10 volunteer subjects who had lumbar punctures at least 4 weeks apart and found little variability in CSF proteins within an individual as compared to subject-to-subject variation. Our results represent the most comprehensive characterization of true normal CSF to date. This normal CSF proteome establishes a comparative standard and a basis for investigations into a variety of diseases with neurological and psychiatric features.

  20. The classification of normal screening mammograms

    Science.gov (United States)

    Ang, Zoey Z. Y.; Rawashdeh, Mohammad A.; Heard, Robert; Brennan, Patrick C.; Lee, Warwick; Lewis, Sarah J.

    2016-03-01

    Rationale and objectives: To understand how breast screen readers classify the difficulty of normal screening mammograms using a common lexicon describing normal appearances. Cases were also assessed on their suitability for a single-reader strategy. Materials and methods: 15 breast readers were asked to interpret a test set of 29 normal screening mammogram cases and classify them by rating the difficulty of the case on a five-point Likert scale, identifying the salient features and assessing their suitability for single reading. Using the false positive fractions from a previous study, the 29 cases were classified into 10 'low', 10 'medium' and nine 'high' difficulty cases. Data were analyzed with descriptive statistics. Spearman's correlation was used to test the strength of association between the difficulty of the cases and the readers' recommendation for a single-reading strategy. Results: The ratings from readers in this study corresponded to the known difficulty level of the 'low' and 'high' difficulty cases. Uniform ductal pattern and density, symmetrical mammographic features and the absence of micro-calcifications were the main reasons associated with 'low' difficulty cases. The 'high' difficulty cases were described as having 'dense breasts'. There was a statistically significant negative correlation between the difficulty of the cases and readers' recommendation for single reading (r = -0.475, P = 0.009). Conclusion: The findings demonstrated potential relationships between certain mammographic features and the difficulty for readers to classify mammograms as 'normal'. The standard Australian practice of double reading was deemed more suitable for most cases. There was a moderate inverse association between the difficulty of the cases and the recommendations for single reading.

  1. 3j Symbols: To Normalize or Not to Normalize?

    Science.gov (United States)

    van Veenendaal, Michel

    2011-01-01

    The systematic use of alternative normalization constants for 3j symbols can lead to a more natural expression of quantities, such as vector products and spherical tensor operators. The redefined coupling constants directly equate tensor products to the inner and outer products without any additional square roots. The approach is extended to…

  2. Deconstructing Interocular Suppression: Attention and Divisive Normalization.

    Directory of Open Access Journals (Sweden)

    Hsin-Hung Li

    2015-10-01

    Full Text Available In interocular suppression, a suprathreshold monocular target can be rendered invisible by a salient competitor stimulus presented in the other eye. Despite decades of research on interocular suppression and related phenomena (e.g., binocular rivalry, flash suppression, continuous flash suppression), the neural processing underlying interocular suppression is still unknown. We developed and tested a computational model of interocular suppression. The model included two processes that contributed to the strength of interocular suppression: divisive normalization and attentional modulation. According to the model, the salient competitor induced a stimulus-driven attentional modulation selective for the location and orientation of the competitor, thereby increasing the gain of neural responses to the competitor and reducing the gain of neural responses to the target. Additional suppression was induced by divisive normalization in the model, similar to other forms of visual masking. To test the model, we conducted psychophysics experiments in which both the size and the eye-of-origin of the competitor were manipulated. For small and medium competitors, behavioral performance was consonant with a change in the response gain of neurons that responded to the target. But large competitors induced a contrast-gain change, even when the competitor was split between the two eyes. The model correctly predicted these results and outperformed an alternative model in which the attentional modulation was eye specific. We conclude that both stimulus-driven attention (selective for location and feature) and divisive normalization contribute to interocular suppression.

  3. Deconstructing Interocular Suppression: Attention and Divisive Normalization.

    Science.gov (United States)

    Li, Hsin-Hung; Carrasco, Marisa; Heeger, David J

    2015-10-01

    In interocular suppression, a suprathreshold monocular target can be rendered invisible by a salient competitor stimulus presented in the other eye. Despite decades of research on interocular suppression and related phenomena (e.g., binocular rivalry, flash suppression, continuous flash suppression), the neural processing underlying interocular suppression is still unknown. We developed and tested a computational model of interocular suppression. The model included two processes that contributed to the strength of interocular suppression: divisive normalization and attentional modulation. According to the model, the salient competitor induced a stimulus-driven attentional modulation selective for the location and orientation of the competitor, thereby increasing the gain of neural responses to the competitor and reducing the gain of neural responses to the target. Additional suppression was induced by divisive normalization in the model, similar to other forms of visual masking. To test the model, we conducted psychophysics experiments in which both the size and the eye-of-origin of the competitor were manipulated. For small and medium competitors, behavioral performance was consonant with a change in the response gain of neurons that responded to the target. But large competitors induced a contrast-gain change, even when the competitor was split between the two eyes. The model correctly predicted these results and outperformed an alternative model in which the attentional modulation was eye specific. We conclude that both stimulus-driven attention (selective for location and feature) and divisive normalization contribute to interocular suppression.
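
    The divisive-normalization component of such a model can be sketched generically. This is a textbook Heeger-style form with illustrative parameter names, not the authors' fitted model: each unit's driven input is raised to a power and divided by a semisaturation constant plus the summed powered drives of a normalization pool.

```python
def divisive_normalization(drives, sigma=1.0, n=2.0):
    """Generic divisive normalization: response_i = d_i^n / (sigma^n + sum_j d_j^n).
    `drives` are the feedforward inputs of all units in the pool."""
    powered = [d ** n for d in drives]
    pool = sigma ** n + sum(powered)
    return [p / pool for p in powered]

# A salient competitor (drive 4.0) suppresses the response to a fixed
# target (drive 1.0) relative to the target presented alone.
alone = divisive_normalization([1.0])[0]
with_competitor = divisive_normalization([1.0, 4.0])[0]
```

    The suppression arises purely from the competitor's contribution to the shared denominator, which is the intuition behind treating interocular suppression as a form of masking.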

  4. Mast cell distribution in normal adult skin.

    Science.gov (United States)

    Janssens, A S; Heide, R; den Hollander, J C; Mulder, P G M; Tank, B; Oranje, A P

    2005-03-01

    To investigate mast cell distribution in normal adult skin to provide a reference range for comparison with mastocytosis. Mast cells (MCs) were counted in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders in adults. There was an uneven distribution of MCs in different body sites using the anti-tryptase monoclonal antibody technique. Numbers of MCs on the trunk, upper arm, and upper leg were similar, but were significantly different from those found on the lower leg and forearm. Two distinct groups were formed--proximal and distal. There were 77.0 MCs/mm2 at proximal body sites and 108.2 MCs/mm2 at distal sites. Adjusted for the adjacent diagnosis and age, this difference was consistent. The numbers of MCs in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders were not different from those in the control group. Differences in the numbers of MCs between the distal and the proximal body sites must be considered when MCs are counted for a reliable diagnosis of mastocytosis. A pilot study in patients with mastocytosis underlined the variation in the numbers of MCs in mastocytosis and normal skin, but showed a considerable overlap. The observed numbers of MCs in adults cannot be extrapolated to children. MC numbers varied significantly between proximal and distal body sites and these differences must be considered when MCs are counted for a reliable diagnosis of mastocytosis. There was a considerable overlap between the numbers of MCs in mastocytosis and normal skin.

  5. CT and MRI normal findings

    International Nuclear Information System (INIS)

    Moeller, T.B.; Reif, E.

    1998-01-01

    This book gives answers to questions frequently heard especially from trainees and doctors not specialising in the field of radiology: Is that a normal finding? How do I decide? What are the objective criteria? The information presented is three-fold. The normal findings of the usual CT and MRI examinations are shown with high-quality pictures serving as a reference, with inscribed important additional information on measures, angles and other criteria describing the normal conditions. These criteria are further explained and evaluated in accompanying texts which also teach the systematic approach for individual picture analysis, and include a check list of major aspects, as a didactic guide for learning. The book is primarily intended for students, radiographers, radiology trainees and doctors from other medical fields, but radiology specialists will also find useful details of help in special cases. (orig./CB) [de

  6. Marrow transfusions into normal recipients

    International Nuclear Information System (INIS)

    Brecher, G.

    1983-01-01

    During the past several years we have explored the transfusion of bone marrow into normal, nonirradiated mice. While transfused marrow proliferates readily in irradiated animals, only minimal proliferation takes place in nonirradiated recipients. It has generally been assumed that this was due to the lack of available proliferative sites in recipients with normal marrow. Last year we were able to report that the transfusion of 200 million bone marrow cells (about 2/3 of the total complement of marrow cells of a normal mouse) resulted in 20% to 25% of the recipient's marrow being replaced by donor marrow. Thus we can now study the behavior of animals that harbor both transfused (donor) and endogenous (recipient) marrow cells, although none of the tissues of either donor or recipient have been irradiated. With these animals we hope to investigate the nature of the peculiar phenomenon of serial exhaustion of marrow, also referred to as the limited self-replicability of stem cells.

  7. Against Logical Form

    Directory of Open Access Journals (Sweden)

    P N Johnson-Laird

    2010-10-01

    Full Text Available An old view in logic going back to Aristotle is that an inference is valid in virtue of its logical form. Many psychologists have adopted the same point of view about human reasoning: the first step is to recover the logical form of an inference, and the second step is to apply rules of inference that match these forms in order to prove that the conclusion follows from the premises. The present paper argues against this idea. The logical form of an inference transcends the grammatical forms of the sentences used to express it, because logical form also depends on context. Context is not readily expressed in additional premises. And the recovery of logical form leads ineluctably to the need for infinitely many axioms to capture the logical properties of relations. An alternative theory is that reasoning depends on mental models, and this theory obviates the need to recover logical form.

  8. Forms of Arthritis

    Science.gov (United States)

    Osteoarthritis (OA) — the form of arthritis typically occurring during middle or old age ...

  9. Forms Management Policy

    Science.gov (United States)

    To establish EPA’s Forms Management Program; to describe the requisite roles, responsibilities, and procedures necessary for the successful management of EPA forms; and to more clearly fulfill EPA’s obligations in this regard.

  10. Radioimmunoassay in the diagnosis of atypical form of thyrotoxicosis

    Energy Technology Data Exchange (ETDEWEB)

    Livshits, G.Ya.

    1984-11-01

    Fifty-six patients with ''unmotivated'' disorder of the cardiac rhythm were examined. A combined radionuclide study was conducted, including a study of iodoabsorptive function with a standard technique, thyroid visualization, and determination of the thyroxin and triiodothyronine levels in the blood serum by radioimmunoassay using standard diagnostic kits. Latent thyroid hyperfunction was revealed in 24 patients (42.8%). The study of iodoabsorptive function revealed pathological changes in only 8 patients, whereas radioimmunoassay revealed a significant elevation of the peripheral thyroid hormone level, as compared to that of the control group, in 24 patients. The conclusion is that patients with ''unmotivated'' disorder of the cardiac rhythm often suffer from latent thyrotoxicosis, which is the main etiological factor and trigger mechanism of arrhythmias. In such cases the arrhythmias are the only clinical symptom of thyrotoxicosis, which makes it possible to regard this form of the disease as monosymptomatic. The early detection of the cause of the cardiac rhythm disorder and the prescription of pathogenetic thyrostatic therapy resulted in the return of the cardiac cycle rate to normal in all patients with sinus tachycardia and prevented relapses of the paroxysmal forms of rhythm disorder.

  11. Radioimmunoassay in the diagnosis of atypical form of thyrotoxicosis

    International Nuclear Information System (INIS)

    Livshits, G.Ya.

    1984-01-01

    Fifty-six patients with ''unmotivated'' disorder of the cardiac rhythm were examined. A combined radionuclide study was conducted, including a study of iodoabsorptive function with a standard technique, thyroid visualization, and determination of the thyroxin and triiodothyronine levels in the blood serum by radioimmunoassay using standard diagnostic kits. Latent thyroid hyperfunction was revealed in 24 patients (42.8%). The study of iodoabsorptive function revealed pathological changes in only 8 patients, whereas radioimmunoassay revealed a significant elevation of the peripheral thyroid hormone level, as compared to that of the control group, in 24 patients. The conclusion is that patients with ''unmotivated'' disorder of the cardiac rhythm often suffer from latent thyrotoxicosis, which is the main etiological factor and trigger mechanism of arrhythmias. In such cases the arrhythmias are the only clinical symptom of thyrotoxicosis, which makes it possible to regard this form of the disease as monosymptomatic. The early detection of the cause of the cardiac rhythm disorder and the prescription of pathogenetic thyrostatic therapy resulted in the return of the cardiac cycle rate to normal in all patients with sinus tachycardia and prevented relapses of the paroxysmal forms of rhythm disorder.

  12. FORMS OF YOUTH TRAVEL

    OpenAIRE

    Moisã Claudia Olimpia; Moisã Claudia Olimpia

    2011-01-01

    Taking into account the range of motivations that young people have when practicing tourism, it can be said that youth travel takes highly diverse forms. These forms include educational tourism, volunteer programs and “work and travel”, cultural exchanges, sports tourism and adventure travel. In this article, we identified and analyzed in detail the main forms of youth travel, both internationally and in Romania. We also illustrated for each form of tourism the specific tourism products targeting you...

  13. Transformation of an empirical distribution to normal distribution by the use of Johnson system of translation and symmetrical quantile method

    OpenAIRE

    Ludvík Friebel; Jana Friebelová

    2006-01-01

    This article deals with the approximation of an empirical distribution to the standard normal distribution using the Johnson transformation. This transformation enables us to approximate a wide spectrum of continuous distributions with a normal distribution. The estimation of the parameters of the transformation formulas is based on percentiles of the empirical distribution. Theoretical probability distribution functions are derived for a random variable obtained on the basis of the backward transformation of a standard normal ...
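
    One member of the Johnson system, the S_U family, illustrates the forward and backward transformations the abstract refers to. The parameter values below are arbitrary; fitting them from empirical percentiles is the (omitted) substantive step, and the function names are ours.

```python
import math

def su_forward(x, gamma, delta, xi, lam):
    """Johnson S_U transformation: z = gamma + delta * asinh((x - xi) / lam)
    maps x to an (approximately) standard normal z."""
    return gamma + delta * math.asinh((x - xi) / lam)

def su_backward(z, gamma, delta, xi, lam):
    """Backward transformation: recover x from a standard normal z."""
    return xi + lam * math.sinh((z - gamma) / delta)

params = (0.3, 1.2, 5.0, 2.0)    # arbitrary illustrative parameters
x = 7.5
z = su_forward(x, *params)
x_back = su_backward(z, *params)  # round-trips to the original x
```

    Because the transformation is strictly monotone, the backward map is exact, which is what makes the derivation of the back-transformed distribution possible.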

  14. Unified form language

    DEFF Research Database (Denmark)

    Alnæs, Martin S.; Logg, Anders; Ølgaard, Kristian Breum

    2014-01-01

    We present the Unified Form Language (UFL), which is a domain-specific language for representing weak formulations of partial differential equations with a view to numerical approximation. Features of UFL include support for variational forms and functionals, automatic differentiation of forms and expres...... libraries to generate concrete low-level implementations. Some application examples are presented and libraries that support UFL are highlighted....

  15. Method for forming ammonia

    Science.gov (United States)

    Kong, Peter C.; Pink, Robert J.; Zuck, Larry D.

    2008-08-19

    A method for forming ammonia is disclosed and which includes the steps of forming a plasma; providing a source of metal particles, and supplying the metal particles to the plasma to form metal nitride particles; and providing a substance, and reacting the metal nitride particles with the substance to produce ammonia, and an oxide byproduct.

  16. Mesonic Form Factors

    Energy Technology Data Exchange (ETDEWEB)

    Frederic D. R. Bonnet; Robert G. Edwards; George T. Fleming; Randal Lewis; David Richards

    2003-07-22

    We have started a program to compute the electromagnetic form factors of mesons. We discuss the techniques used to compute the pion form factor and present preliminary results computed with domain wall valence fermions on MILC asqtad lattices, as well as Wilson fermions on quenched lattices. These methods can easily be extended to rho-to-gamma-pi transition form factors.

  17. Mixed normal inference on multicointegration

    NARCIS (Netherlands)

    Boswijk, H.P.

    2009-01-01

    Asymptotic likelihood analysis of cointegration in I(2) models, see Johansen (1997, 2006), Boswijk (2000) and Paruolo (2000), has shown that inference on most parameters is mixed normal, implying hypothesis test statistics with an asymptotic χ² null distribution. The asymptotic distribution of the

  18. Is My Child's Appetite Normal?

    Science.gov (United States)

    Is My Child’s Appetite Normal? Cayla, who is 4 years old, did not finish her lunch. But she is ready to play. Her ... snack for later. That is okay! Your child’s appetite changes. Children do not grow as fast in ...

  19. Transforming Normal Programs by Replacement

    NARCIS (Netherlands)

    Bossi, Annalisa; Pettorossi, A.; Cocco, Nicoletta; Etalle, Sandro

    1992-01-01

    The replacement transformation operation, already defined in [28], is studied wrt normal programs. We give applicability conditions able to ensure the correctness of the operation wrt Fitting's and Kunen's semantics. We show how replacement can mimic other transformation operations such as thinning,

  20. Semigroups of data normalization functions

    NARCIS (Netherlands)

    Warrens, Matthijs J.

    2016-01-01

    Variable centering and scaling are functions that are typically used in data normalization. Various properties of centering and scaling functions are presented. It is shown that if we use two centering functions (or scaling functions) successively, the result depends on the order in which the
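
    The order-dependence of successive normalization functions is easy to demonstrate with two common choices (function names ours): centering to zero mean and scaling by the maximum absolute value do not commute.

```python
def center(xs):
    """Subtract the mean from every value (a centering function)."""
    m = sum(xs) / len(xs)
    return [x - m for x in xs]

def scale(xs):
    """Divide every value by the maximum absolute value (a scaling function)."""
    mx = max(abs(x) for x in xs)
    return [x / mx for x in xs]

v = [1.0, 2.0, 7.0]
a = scale(center(v))   # center first, then scale
b = center(scale(v))   # scale first, then center: a different result
```

    Both compositions produce zero-mean data, but only centering-then-scaling guarantees a maximum absolute value of 1, so the two orders yield different normalized vectors.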

  1. Normalizing Catastrophe: Sustainability and Scientism

    Science.gov (United States)

    Bonnett, Michael

    2013-01-01

    Making an adequate response to our deteriorating environmental situation is a matter of ever increasing urgency. It is argued that a central obstacle to achieving this is the way that scientism has become normalized in our thinking about environmental issues. This is taken to reflect an underlying "metaphysics of mastery" that vitiates proper…

  2. Neutron RBE for normal tissues

    International Nuclear Information System (INIS)

    Field, S.B.; Hornsey, S.

    1979-01-01

    RBE for various normal tissues is considered as a function of neutron dose per fraction. Results from a variety of centres are reviewed. It is shown that RBE is dependent on neutron energy and is tissue dependent, but is not specially high for the more critical tissues or for damage occurring late after irradiation. (author)

  3. Normal and abnormal growth plate

    International Nuclear Information System (INIS)

    Kumar, R.; Madewell, J.E.; Swischuk, L.E.

    1987-01-01

    Skeletal growth is a dynamic process. A knowledge of the structure and function of the normal growth plate is essential in order to understand the pathophysiology of abnormal skeletal growth in various diseases. In this well-illustrated article, the authors provide a radiographic classification of abnormal growth plates and discuss mechanisms that lead to growth plate abnormalities

  4. Stochastic simulations of normal aging and Werner's syndrome.

    KAUST Repository

    Qi, Qi

    2014-04-26

    Human cells typically consist of 23 pairs of chromosomes. Telomeres are repetitive sequences of DNA located at the ends of chromosomes. During cell replication, a number of basepairs are lost from the end of the chromosome and this shortening restricts the number of divisions that a cell can complete before it becomes senescent, or non-replicative. In this paper, we use Monte Carlo simulations to form a stochastic model of telomere shortening to investigate how telomere shortening affects normal aging. Using this model, we study various hypotheses for the way in which shortening occurs by comparing their impact on aging at the chromosome and cell levels. We consider different types of length-dependent loss and replication probabilities to describe these processes. After analyzing a simple model for a population of independent chromosomes, we simulate a population of cells in which each cell has 46 chromosomes and the shortest telomere governs the replicative potential of the cell. We generalize these simulations to Werner's syndrome, a condition in which large sections of DNA are removed during cell division and, amongst other conditions, results in rapid aging. Since the mechanisms governing the loss of additional basepairs are not known, we use our model to simulate a variety of possible forms for the rate at which additional telomeres are lost per replication and several expressions for how the probability of cell division depends on telomere length. As well as the evolution of the mean telomere length, we consider the standard deviation and the shape of the distribution. We compare our results with a variety of data from the literature, covering both experimental data and previous models. We find good agreement for the evolution of telomere length when plotted against population doubling.
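
    A stripped-down version of such a Monte Carlo model, with a fixed uniform loss per division and a hard senescence threshold (both simplifications ours, far simpler than the paper's length-dependent loss and replication rules), might look like:

```python
import random

def simulate(n_cells=500, start=10000, loss=(50, 150),
             threshold=2000, divisions=50, seed=1):
    """Monte Carlo telomere shortening: each division removes a uniform
    random number of basepairs; a telomere at or below `threshold` is
    senescent and stops shortening."""
    rng = random.Random(seed)
    lengths = [start] * n_cells
    for _ in range(divisions):
        lengths = [L - rng.randint(*loss) if L > threshold else L
                   for L in lengths]
    return lengths

final = simulate()
mean_len = sum(final) / len(final)  # mean telomere length after 50 divisions
```

    Extensions in the spirit of the paper would make the per-division loss and the replication probability functions of the current length, and track 46 telomeres per cell with the shortest one governing senescence.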

  5. Bernstein Algorithm for Vertical Normalization to 3NF Using Synthesis

    Directory of Open Access Journals (Sweden)

    Matija Varga

    2013-07-01

    Full Text Available This paper demonstrates the use of the Bernstein algorithm for vertical normalization to 3NF using synthesis. The aim of the paper is to provide an algorithm for database normalization and present a set of steps which minimize redundancy in order to increase database management efficiency, and to specify tests and algorithms for testing and proving reversibility (i.e., proving that the normalization did not cause loss of information). Using the Bernstein algorithm steps, the paper gives examples of vertical normalization to 3NF through synthesis and proposes a test and an algorithm to demonstrate decomposition reversibility. This paper also explains that the reasons for generating normal forms are to facilitate data search, eliminate data redundancy as well as delete, insert and update anomalies, and it shows, using examples, how such anomalies develop.
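
    The grouping step at the heart of synthesis can be sketched as follows. For brevity this assumes the input functional dependencies already form a minimal (canonical) cover, and it omits the key-preservation and scheme-merging steps of the full Bernstein algorithm; the function name is ours.

```python
from collections import defaultdict

def synthesize_3nf(fds):
    """Group functional dependencies by left-hand side and emit one
    relation scheme per group: scheme = LHS plus all of its RHS attributes.
    `fds` is a list of (lhs_attribute_set, rhs_attribute) pairs."""
    groups = defaultdict(set)
    for lhs, rhs in fds:
        groups[frozenset(lhs)].add(rhs)
    return {frozenset(lhs | rhss) for lhs, rhss in groups.items()}

# A -> B, A -> C, B -> D  yields the schemes {A, B, C} and {B, D}
schemes = synthesize_3nf([({"A"}, "B"), ({"A"}, "C"), ({"B"}, "D")])
```

    Because every emitted scheme embeds a dependency together with its determinant, each scheme is in 3NF, and the decomposition is dependency-preserving by construction.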

  6. Schema Design and Normalization Algorithm for XML Databases Model

    Directory of Open Access Journals (Sweden)

    Samir Abou El-Seoud

    2009-06-01

    Full Text Available In this paper we study the problem of schema design and normalization in the XML database model. We show that, like relational databases, XML documents may contain redundant information, and this redundancy may cause update anomalies. Furthermore, such problems are caused by certain functional dependencies among paths in the document. Based on our previous work, in which we presented functional dependencies and normal forms for XML Schema, we present a decomposition algorithm for converting any XML Schema into a normalized one that satisfies X-BCNF.

  7. National Green Building Standard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2012-07-01

    DOE's Building America Program is a research and development program to improve the energy performance of new and existing homes. The ultimate goal of the Building America Program is to achieve examples of cost-effective, energy efficient solutions for all U.S. climate zones. Periodic maintenance of an ANSI standard by review of the entire document and action to revise or reaffirm it on a schedule not to exceed five years is required by ANSI. In compliance, a consensus group has once again been formed and the National Green Building Standard is currently being reviewed to comply with the periodic maintenance requirement of an ANSI standard.

  8. Performance standard for dose Calibrator

    CERN Document Server

    Darmawati, S

    2002-01-01

    A dose calibrator is an instrument used in hospitals to determine the activity of radionuclides for nuclear medicine purposes. The International Electrotechnical Commission (IEC) has published the IEC 1303:1994 standard, which can be used as guidance to test the performance of the instrument. This paper briefly describes the content of the document and explains the assessment that had been carried out in Indonesia to test instrument accuracy through intercomparison measurements. It is suggested that hospitals engage a medical physicist to perform the test for their dose calibrators. The need for a performance standard in the form of an Indonesian Standard is also touched upon.

  9. Problems with using the normal distribution--and ways to improve quality and efficiency of data analysis.

    Directory of Open Access Journals (Sweden)

    Eckhard Limpert

    Full Text Available BACKGROUND: The gaussian or normal distribution is the most established model to characterize quantitative variation of original data. Accordingly, data are summarized using the arithmetic mean and the standard deviation, by mean ± SD, or with the standard error of the mean, mean ± SEM. This, together with corresponding bars in graphical displays has become the standard to characterize variation. METHODOLOGY/PRINCIPAL FINDINGS: Here we question the adequacy of this characterization, and of the model. The published literature provides numerous examples for which such descriptions appear inappropriate because, based on the "95% range check", their distributions are obviously skewed. In these cases, the symmetric characterization is a poor description and may trigger wrong conclusions. To solve the problem, it is enlightening to regard causes of variation. Multiplicative causes are by far more important than additive ones, in general, and benefit from a multiplicative (or log- normal approach. Fortunately, quite similar to the normal, the log-normal distribution can now be handled easily and characterized at the level of the original data with the help of both, a new sign, x/, times-divide, and notation. Analogous to mean ± SD, it connects the multiplicative (or geometric mean mean * and the multiplicative standard deviation s* in the form mean * x/s*, that is advantageous and recommended. CONCLUSIONS/SIGNIFICANCE: The corresponding shift from the symmetric to the asymmetric view will substantially increase both, recognition of data distributions, and interpretation quality. It will allow for savings in sample size that can be considerable. Moreover, this is in line with ethical responsibility. Adequate models will improve concepts and theories, and provide deeper insight into science and life.

  10. A strand specific high resolution normalization method for chip-sequencing data employing multiple experimental control measurements

    DEFF Research Database (Denmark)

    Enroth, Stefan; Andersson, Claes; Andersson, Robin

    2012-01-01

    High-throughput sequencing is becoming the standard tool for investigating protein-DNA interactions or epigenetic modifications. However, the data generated will always contain noise due to e.g. repetitive regions or non-specific antibody interactions. The noise will appear in the form of a backg......, the background is only used to adjust peak calling and not as a pre-processing step that aims at discerning the signal from the background noise. A normalization procedure that extracts the signal of interest would be of universal use when investigating genomic patterns....
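
    As a baseline illustration of normalizing a signal track against an experimental control (a generic depth-scaled log-ratio per genomic bin, not the strand-specific method the abstract describes; function name and pseudocount are ours):

```python
import math

def log2_ratio_track(treatment, control, pseudocount=1.0):
    """Per-bin log2(treatment / control) after scaling the control track
    to the treatment track's total read depth; the pseudocount avoids
    division by zero in empty bins."""
    scale = sum(treatment) / sum(control)
    return [math.log2((t + pseudocount) / (scale * c + pseudocount))
            for t, c in zip(treatment, control)]

# Identical tracks normalize to zero everywhere.
flat = log2_ratio_track([10, 20, 30], [10, 20, 30])
```

    Bins where the treatment exceeds the depth-matched control come out positive, which is the simplest way background noise from input or IgG controls is discounted before peak calling.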

  11. US Topo Product Standard

    Science.gov (United States)

    Cooley, Michael J.; Davis, Larry R.; Fishburn, Kristin A.; Lestinsky, Helmut; Moore, Laurence R.

    2011-01-01

    This document defines a U.S. Geological Survey (USGS) digital topographic map. This map series, named “US Topo,” is modeled on what is referred to as the standard USGS 7.5-minute (1:24,000-scale) topographic map series that was created during the period from 1947 to approximately 1992. The US Topo map product has the same extent, scale, and general layout as the older standard topographic maps. However, unlike the previous maps, US Topo maps are published using Adobe Systems Inc. Portable Document Format (PDF) with a geospatial extension that is called Georeferenced PDF (GeoPDF), patented by TerraGo Technologies. In addition, the US Topo map products incorporate an orthorectified image along with data that was included in the standard 7.5-minute topographic maps. US Topo maps are intended to serve conventional map users by providing Geographic Information System (GIS) information in symbolized form in the customary topographic map layout. The maps are not intended for GIS analysis applications.

  12. A Denotational Account of Untyped Normalization by Evaluation

    DEFF Research Database (Denmark)

    Filinski, Andrzej; Rohde, Henning Korsholm

    2004-01-01

    We show that the standard normalization-by-evaluation construction for the simply-typed λβη-calculus has a natural counterpart for the untyped λβ-calculus, with the central type-indexed logical relation replaced by a recursively defined invariant relation, in the style of Pitts. In fact, the cons...

  13. Explorations in Statistics: The Analysis of Ratios and Normalized Data

    Science.gov (United States)

    Curran-Everett, Douglas

    2013-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This ninth installment of "Explorations in Statistics" explores the analysis of ratios and normalized--or standardized--data. As researchers, we compute a ratio--a numerator divided by a denominator--to compute a…

  14. Sketching Curves for Normal Distributions--Geometric Connections

    Science.gov (United States)

    Bosse, Michael J.

    2006-01-01

    Within statistics instruction, students are often requested to sketch the curve representing a normal distribution with a given mean and standard deviation. Unfortunately, these sketches are often notoriously imprecise. Poor sketches are usually the result of missing mathematical knowledge. This paper considers relationships which exist among…

  15. Surface tension of normal and heavy water

    International Nuclear Information System (INIS)

    Straub, J.; Rosner, N.; Grigull, V.

    1980-01-01

    A Skeleton Table and a simple interpolation equation for the surface tension of light water were developed by Working Group III of the International Association for the Properties of Steam and are recommended as an International Standard. The Skeleton Table is based on all known measurements of the surface tension, with individual data weighted according to the accuracy of the measurements. The form of the interpolation equation is based on a physical concept: it represents an extension of the van der Waals equation, where the exponent conforms to the 'Scaling Laws'. In addition, for application purposes, simple relations for the Laplace coefficient and for the density difference between the liquid and gaseous phases of light water are given. The same form of interpolation equation for the surface tension can be used for heavy water, for which the coefficients are also given; however, this equation is based on only a single set of data. (orig.) [de
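
For reference, the interpolation equation has the van der Waals-like form σ = B·τ^μ·(1 + b·τ) with τ = 1 − T/Tc. The sketch below uses the coefficients published by IAPWS for ordinary (light) water; treat the numerical values as an assumption to be verified against the current IAPWS release, and note that the heavy-water coefficients differ:

```python
def surface_tension_water(T):
    """Surface tension of ordinary water in mN/m for temperature T in kelvin,
    using the IAPWS-style form sigma = B * tau**mu * (1 + b*tau),
    tau = 1 - T/Tc. Coefficients are the published light-water values
    (B = 235.8 mN/m, mu = 1.256, b = -0.625, Tc = 647.096 K)."""
    Tc = 647.096                 # critical temperature, K
    B, mu, b = 235.8, 1.256, -0.625
    tau = 1.0 - T / Tc
    return B * tau**mu * (1.0 + b * tau)
```

The exponent μ ≈ 1.256 is the scaling-law value mentioned in the abstract, and σ vanishes at the critical point by construction (τ = 0).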

  16. Normal dispersion femtosecond fiber optical parametric oscillator.

    Science.gov (United States)

    Nguyen, T N; Kieu, K; Maslov, A V; Miyawaki, M; Peyghambarian, N

    2013-09-15

    We propose and demonstrate a synchronously pumped fiber optical parametric oscillator (FOPO) operating in the normal dispersion regime. The FOPO generates chirped pulses at the output, allowing significant pulse energy scaling potential without pulse breaking. The output average power of the FOPO at 1600 nm was ∼60 mW (corresponding to 1.45 nJ pulse energy and ∼55% slope power conversion efficiency). The output pulses directly from the FOPO were highly chirped (∼3 ps duration), and they could be compressed outside of the cavity to 180 fs by using a standard optical fiber compressor. Detailed numerical simulation was also performed to understand the pulse evolution dynamics around the laser cavity. We believe that the proposed design concept is useful for scaling up the pulse energy in the FOPO using different pumping wavelengths.

  17. Forms of Life, Forms of Reality

    Directory of Open Access Journals (Sweden)

    Piergiorgio Donatelli

    2015-10-01

    Full Text Available The article explores aspects of the notion of forms of life in the Wittgensteinian tradition especially following Iris Murdoch’s lead. On the one hand, the notion signals the hardness and inexhaustible character of reality, as the background needed in order to make sense of our lives in various ways. On the other, the hardness of reality is the object of a moral work of apprehension and deepening to the point at which its distinctive character dissolves into the family of connections we have gained for ourselves. The two movements of thought are connected and necessary.

  18. Production of sodalite waste forms by addition of glass

    International Nuclear Information System (INIS)

    Pereira, C.

    1995-01-01

    Spent nuclear fuel can be treated in a molten salt electrorefiner for conversion into metal and mineral waste forms for geologic disposal. Sodalite is one of the mineral waste forms under study. Fission products in the molten salt are ion-exchanged into zeolite A, which is converted to sodalite and consolidated. Sodalite can be formed directly from mixtures of salt and zeolite A at temperatures above 975 K; however, nepheline is usually produced as a secondary phase. Addition of small amounts of glass frit to the mixture reduced nepheline formation significantly. Loss of fission products was not observed for reaction below 1000 K. Hot-pressing of the sodalite powders yielded dense pellets (∼2.3 g/cm³) without any loss of fission product species. Normalized release rates were below 1 g/m²·day for pre-washed samples in 28-day leach tests based on standard MCC-1 tests but increased with the presence of free salt on the sodalite.

  19. Micro metal forming

    CERN Document Server

    2013-01-01

    Micro Metal Forming, i.e., forming of parts and features with dimensions below 1 mm, is a young area of research in the wide field of metal forming technologies, expanding the limits for applying metal forming towards micro technology. The essential challenges arise from the reduced geometrical size and the increased lot size. In order to enable potential users to apply micro metal forming in production, information is given on the following topics: tribological behavior (friction between tool and work piece as well as tool wear); mechanical behavior (strength and formability of the work piece material, durability of the work pieces); size effects (basic description of effects occurring due to the fact that the quantitative relation between different features changes with decreasing size); process windows and limits for forming processes; tool making methods; numerical modeling of processes and process chains; and quality assurance and metrology. All topics are discussed with respect to the questions relevant to micro...

  20. Superconducting versus normal conducting cavities

    CERN Document Server

    Podlech, Holger

    2013-01-01

    One of the most important issues of high-power hadron linacs is the choice of technology with respect to superconducting or room-temperature operation. The favour for a specific technology depends on several parameters such as the beam energy, beam current, beam power and duty factor. This contribution gives an overview of the comparison between superconducting and normal conducting cavities. This includes basic radiofrequency (RF) parameters, design criteria, limitations, required RF and plug power as well as case studies.

  1. Normal Movement Selectivity in Autism

    OpenAIRE

    Dinstein, Ilan; Thomas, Cibu; Humphreys, Kate; Minshew, Nancy; Behrmann, Marlene; Heeger, David J.

    2010-01-01

    It has been proposed that individuals with autism have difficulties understanding the goals and intentions of others because of a fundamental dysfunction in the mirror neuron system. Here, however, we show that individuals with autism exhibited not only normal fMRI responses in mirror system areas during observation and execution of hand movements, but also exhibited typical movement-selective adaptation (repetition suppression) when observing or executing the same movement repeatedly. Moveme...

  2. Normal range of gastric emptying in children

    International Nuclear Information System (INIS)

    Thomas, P.; Collins, C.; Francis, L.; Henry, R.; O'Loughlin, E.; John Hunter Children's Hospital, Newcastle, NSW

    1999-01-01

    Full text: As part of a larger study looking at gastric emptying times in cystic fibrosis, we assessed the normal range of gastric emptying in a control group of children. Thirteen children (8 girls, 5 boys) aged 4-15 years (mean 10) were studied. Excluded were children with a history of relevant gastrointestinal medical or surgical disease, egg allergy or medication affecting gastric emptying. Imaging was performed at 08.00 h after an overnight fast. The test meal was consumed in under 15 min and comprised one 50 g egg, 80 g commercial pancake mix, 10 ml of polyunsaturated oil, 40 ml of water and 30 g of jam. The meal was labelled with 99mTc-macroaggregates of albumin. Water (150 ml) was also consumed with the test meal. One-minute 128 × 128 images were acquired in the anterior and posterior projections every 5 min for 30 min, then every 15 min until 90 min, with a final image at 120 min. Subjects remained supine for the first 60 min, after which they were allowed to walk around. A time-activity curve was generated using the geometric mean of anterior and posterior activity. The half emptying time ranged from 55 to 107 min (mean 79; ± 2 standard deviations, 43-115). Lag time (time for 5% to leave the stomach) ranged from 2 to 26 min (mean 10). The percentage emptied at 60 min ranged from 47 to 73% (mean 63%). There was no correlation of half emptying time with age. The normal reference range for a test meal of pancakes has been established for 13 normal children.
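
The geometric-mean processing described above (combining anterior and posterior counts to correct for tissue attenuation at depth, then reading off the half-emptying time) can be sketched in a few lines. The helper names and the linear interpolation are our assumptions, not the study's exact software:

```python
import math

def geometric_mean_curve(anterior, posterior):
    """Combine paired anterior and posterior counts into a
    depth-corrected time-activity curve via the geometric mean."""
    return [math.sqrt(a * p) for a, p in zip(anterior, posterior)]

def half_emptying_time(times, counts):
    """Linearly interpolate the time at which retained activity
    first falls to 50% of its initial value; None if never reached."""
    half = 0.5 * counts[0]
    for (t0, c0), (t1, c1) in zip(zip(times, counts),
                                  zip(times[1:], counts[1:])):
        if c0 >= half >= c1:
            return t0 + (c0 - half) * (t1 - t0) / (c0 - c1)
    return None
```

A lag time could be read off the same curve by interpolating for the time at which 5% of the initial activity has left the stomach.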

  3. Gravitation and quadratic forms

    International Nuclear Information System (INIS)

    Ananth, Sudarshan; Brink, Lars; Majumdar, Sucheta; Mali, Mahendra; Shah, Nabha

    2017-01-01

    The light-cone Hamiltonians describing both pure (N=0) Yang-Mills and N=4 super Yang-Mills may be expressed as quadratic forms. Here, we show that this feature extends to theories of gravity. We demonstrate how the Hamiltonians of both pure gravity and N=8 supergravity, in four dimensions, may be written as quadratic forms. We examine the effect of residual reparametrizations on the Hamiltonian and the resulting quadratic form.

  4. Neutron electromagnetic form factors

    International Nuclear Information System (INIS)

    Finn, J.M.; Madey, R.; Eden, T.; Markowitz, P.; Rutt, P.M.; Beard, K.; Anderson, B.D.; Baldwin, A.R.; Keane, D.; Manley, D.M.; Watson, J.W.; Zhang, W.M.; Kowalski, S.; Bertozzi, W.; Dodson, G.; Farkhondeh, M.; Dow, K.; Korsch, W.; Tieger, D.; Turchinetz, W.; Weinstein, L.; Gross, F.; Mougey, J.; Ulmer, P.; Whitney, R.; Reichelt, T.; Chang, C.C.; Kelly, J.J.; Payerle, T.; Cameron, J.; Ni, B.; Spraker, M.; Barkhuff, D.; Lourie, R.; Verst, S.V.; Hyde-Wright, C.; Jiang, W.-D.; Flanders, B.; Pella, P.; Arenhoevel, H.

    1992-01-01

    Nucleon form factors provide fundamental input for nuclear structure and quark models. Current knowledge of neutron form factors, particularly the electric form factor of the neutron, is insufficient to meet these needs. Developments of high-duty-factor accelerators and polarization-transfer techniques permit new experiments that promise results with small sensitivities to nuclear models. We review the current status of the field, our own work at the MIT/Bates linear accelerator, and future experimental efforts

  5. Gravitation and quadratic forms

    Energy Technology Data Exchange (ETDEWEB)

    Ananth, Sudarshan [Indian Institute of Science Education and Research,Pune 411008 (India); Brink, Lars [Department of Physics, Chalmers University of Technology,S-41296 Göteborg (Sweden); Institute of Advanced Studies and Department of Physics & Applied Physics,Nanyang Technological University,Singapore 637371 (Singapore); Majumdar, Sucheta [Indian Institute of Science Education and Research,Pune 411008 (India); Mali, Mahendra [School of Physics, Indian Institute of Science Education and Research,Thiruvananthapuram, Trivandrum 695016 (India); Shah, Nabha [Indian Institute of Science Education and Research,Pune 411008 (India)

    2017-03-31

    The light-cone Hamiltonians describing both pure (N=0) Yang-Mills and N=4 super Yang-Mills may be expressed as quadratic forms. Here, we show that this feature extends to theories of gravity. We demonstrate how the Hamiltonians of both pure gravity and N=8 supergravity, in four dimensions, may be written as quadratic forms. We examine the effect of residual reparametrizations on the Hamiltonian and the resulting quadratic form.

  6. Organizational forms and knowledge absorption

    Directory of Open Access Journals (Sweden)

    Radovanović Nikola

    2016-01-01

    Full Text Available Managing the entire body of knowledge in an organization is a challenging task. At the organizational level, there can be enormous quantities of unknown, poorly valued or inefficiently applied knowledge. This is normally accompanied by an underdeveloped potential or inability of organizations to absorb knowledge from external sources. Facilitation of an efficient internal flow of knowledge within the established communication network may positively affect organizational capacity to absorb, i.e. identify, share and subsequently apply, knowledge to commercial ends. Based on the evidence that the adoption of different organizational forms affects knowledge flows within an organization, this research analyzed the relationship between common organizational forms and the absorptive capacity of organizations. In this paper, we test the hypothesis that the organizational structure affects knowledge absorption and exploitation in the organization. The methodology included quantitative and qualitative research methods based on a questionnaire, while the data were statistically analyzed and the hypothesis tested with the use of cross-tabulation and chi-square tests. The findings suggest that the type of organizational form affects knowledge absorption capacity and that having a less formalized and more flexible structure in an organization increases its opportunities for absorbing and exploiting potentially valuable knowledge.

  7. Dosimetry standards for radiation processing

    International Nuclear Information System (INIS)

    Farrar, H. IV

    1999-01-01

    For irradiation treatments to be reproducible in the laboratory and then in the commercial environment, and for products to have certified absorbed doses, standardized dosimetry techniques are needed. This need is being satisfied by standards being developed by experts from around the world under the auspices of Subcommittee E10.01 of the American Society for Testing and Materials (ASTM). In the time period since it was formed in 1984, the subcommittee has grown to 150 members from 43 countries, representing a broad cross-section of industry, government and university interests. With cooperation from other international organizations, it has taken the combined part-time effort of all these people more than 13 years to complete 24 dosimetry standards. Four are specifically for food irradiation or agricultural applications, but the majority apply to all forms of gamma, x-ray, Bremsstrahlung and electron beam radiation processing, including dosimetry for sterilization of health care products and the radiation processing of fruits, vegetables, meats, spices, processed foods, plastics, inks, medical wastes and paper. An additional 6 standards are under development. Most of the standards provide exact procedures for using individual dosimetry systems or for characterizing various types of irradiation facilities, but one covers the selection and calibration of dosimetry systems, and another covers the treatment of uncertainties. Together, this set of standards covers essentially all aspects of dosimetry for radiation processing. The first 20 of these standards have been adopted in their present form by the International Organization for Standardization (ISO), and will be published by ISO in 1999. (author)

  8. Analysis of Fringe Field Formed Inside LDA Measurement Volume Using Compact Two Hololens Imaging Systems

    Science.gov (United States)

    Ghosh, Abhijit; Nirala, A. K.; Yadav, H. L.

    2018-03-01

    We have designed and fabricated four LDA optical setups consisting of four different aberration-compensated compact two-hololens imaging systems. We have experimentally investigated and realized a hololens recording geometry which is the interferogram of a converging spherical wavefront with a mutually coherent planar wavefront. The proposed real-time monitoring and actual fringe-field analysis techniques allow complete characterization of the fringes formed at the measurement volume and permit evaluation of beam quality, alignment and fringe uniformity with greater precision. After experimentally analyzing the fringes formed at the measurement volume by all four imaging systems, it is found that the fringes obtained using the compact two-hololens imaging systems are improved both qualitatively and quantitatively compared with those obtained using the conventional imaging system. Results indicate qualitative improvement of the non-uniformity in fringe thickness and of the micro intensity variations perpendicular to the fringes, and a quantitative improvement of 39.25% in the overall average normalized standard deviation of the fringe widths formed by the compact two-hololens imaging systems compared with that of the conventional imaging system.

  9. Lithium control during normal operation

    International Nuclear Information System (INIS)

    Suryanarayan, S.; Jain, D.

    2010-01-01

    Periodic increases in lithium (Li) concentrations in the primary heat transport (PHT) system during normal operation are a generic problem at CANDU® stations. Lithiated mixed bed ion exchange resins are used at stations for pH control in the PHT system. Typically tight chemistry controls including Li concentrations are maintained in the PHT water. The reason for the Li increases during normal operation at CANDU stations such as Pickering was not fully understood. In order to address this issue a two pronged approach was employed. Firstly, PNGS-A data and information from other available sources was reviewed in an effort to identify possible factors that may contribute to the observed Li variations. Secondly, experimental studies were carried out to assess the importance of these factors in order to establish reasons for Li increases during normal operation. Based on the results of these studies, plausible mechanisms/reasons for Li increases have been identified and recommendations made for proactive control of Li concentrations in the PHT system. (author)

  10. Normalization of Gravitational Acceleration Models

    Science.gov (United States)

    Eckman, Randy A.; Brown, Aaron J.; Adamo, Daniel R.

    2011-01-01

    Unlike the uniform density spherical shell approximations of Newton, the consequence of spaceflight in the real universe is that gravitational fields are sensitive to the nonsphericity of their generating central bodies. The gravitational potential of a nonspherical central body is typically resolved using spherical harmonic approximations. However, attempting to directly calculate the spherical harmonic approximations results in at least two singularities which must be removed in order to generalize the method and solve for any possible orbit, including polar orbits. Three unique algorithms have been developed to eliminate these singularities, by Samuel Pines [1], Bill Lear [2], and Robert Gottlieb [3]. This paper documents the methodical normalization of two of the three known formulations for singularity-free gravitational acceleration (namely, the Lear [2] and Gottlieb [3] algorithms) and formulates a general method for defining normalization parameters used to generate normalized Legendre polynomials and associated Legendre functions (ALFs) for any algorithm. A treatment of the conventional formulation of the gravitational potential and acceleration is also provided, in addition to a brief overview of the philosophical differences between the three known singularity-free algorithms.
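
As an illustration of the kind of normalization parameter involved, the widely used "full" normalization factor for associated Legendre functions in geopotential work is N_lm = sqrt((2 − δ_0m)(2l + 1)(l − m)!/(l + m)!). This is a standard convention, offered here as an assumption; it is not necessarily the exact parameterization used inside the Pines, Lear, or Gottlieb formulations:

```python
from math import factorial, sqrt

def alf_normalization(l, m):
    """Full normalization factor N_lm applied to the associated Legendre
    function P_lm so that normalized spherical-harmonic gravity
    coefficients stay of comparable magnitude across degree and order:
    N_lm = sqrt((2 - delta_0m) * (2l + 1) * (l - m)! / (l + m)!)."""
    delta = 1 if m == 0 else 0
    return sqrt((2 - delta) * (2 * l + 1)
                * factorial(l - m) / factorial(l + m))
```

Dividing published normalized coefficients by N_lm (or multiplying the ALFs by it) keeps recursions numerically stable at the high degrees used in modern gravity models.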

  11. "Ser diferente é normal?"/"Being different: is it normal?"

    Directory of Open Access Journals (Sweden)

    Viviane Veras

    2007-01-01

    Full Text Available A pergunta título deste trabalho retoma o slogan “Ser diferente é normal”, que é parte da campanha criada para uma organização não-governamental que atende portadores de Síndrome de Down. O objetivo é a inclusão social da pessoa com deficiência e o primeiro passo foi propor a inclusão de um grupo de diferentes no grupo dito normal. No vídeo de lançamento da campanha, o diferente, identificado como normal, é mostrado por meio de exemplos – um negro com cabelo black-power, um skin-head, um corpo tatuado, um corpo feminino halterofílico, uma família hippie, uma garota com síndrome de Down. A visão da adolescente dançando reduz, de certo modo, o efeito imaginário que vai além da síndrome, uma vez que apenas o corpo com seus olhinhos puxados se destacam, e não se interrogam questões cognitivas. Minha proposta é refletir sobre o estatuto paradoxal do exemplo, tal como é trabalhado nesse vídeo: se, por definição, um exemplo mostra de fato seu pertencimento a uma classe, pode-se concluir que é exatamente por ser exemplar que ele se encontra fora dela, no exato momento em que a exibe e define. The question in the title of this paper refers to the slogan "ser diferente é normal" ("It´s normal to be different", which is part of a campaign created for a NGO that supports people with Down syndrome. The objective of the campaign is to promote the social inclusion of individuals with Down syndrome, and the first step was to propose the inclusion of a group of "differents" in the so-called normal group. The film launching the campaign shows the different identified as normal by means of examples: a black man exhibiting blackpower haircut, a skin-head, a tattooed body, an over-athletic female body, a hippie family and a girl with Down syndrome. The vision of the dancing teenager lessens the imaginary effect that surpasses the syndrome, since only her body and her little oriental eyes stand out and no cognitive issues are

  12. Electronic Capitalization Asset Form -

    Data.gov (United States)

    Department of Transportation — National Automated Capitalization Authorization Form used by ATO Engineering Services, Logistics, Accounting for the purpose of identifying and capturing FAA project...

  13. Forming of superplastic ceramics

    Energy Technology Data Exchange (ETDEWEB)

    Lesuer, D.R.; Wadsworth, J.; Nieh, T.G.

    1994-05-01

    Superplasticity in ceramics has now advanced to the stage that technologically viable superplastic deformation processing can be performed. In this paper, examples of superplastic forming and diffusion bonding of ceramic components are given. Recent work in biaxial gas-pressure forming of several ceramics is provided. These include yttria-stabilized, tetragonal zirconia (YTZP), a 20% alumina/YTZP composite, and silicon. In addition, the concurrent superplastic forming and diffusion bonding of a hybrid ceramic-metal structure are presented. These forming processes offer technological advantages of greater dimensional control and increased variety and complexity of shapes than is possible with conventional ceramic shaping technology.

  14. Cooperative Station History Forms

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Various forms, photographs and correspondence documenting the history of Cooperative station instrumentation, location changes, inspections, and...

  15. 75 FR 28777 - Information Collection; Financial Information Security Request Form

    Science.gov (United States)

    2010-05-24

    ... Collection; Financial Information Security Request Form AGENCY: Forest Service, USDA. ACTION: Notice; Request... currently approved information collection; Financial Information Security Request Form. DATES: Comments must... Standard Time, Monday through Friday. SUPPLEMENTARY INFORMATION: Title: Financial Information Security...

  16. Manufacturing technology for practical Josephson voltage standards; Fertigungstechnologie fuer praxistaugliche Josephson-Spannungsnormale

    Energy Technology Data Exchange (ETDEWEB)

    Kohlmann, Johannes; Kieler, Oliver [Physikalisch-Technische Bundesanstalt (PTB), Braunschweig (Germany). Arbeitsgruppe 2.43 ' ' Josephson-Schaltungen' '

    2016-09-15

    In this contribution we present the manufacturing technology for the fabrication of integrated superconducting Josephson series circuits for voltage standards. First we summarize some foundations of Josephson voltage standards and sketch the concept and layout of the circuits, before we describe the manufacturing technology for modern practical Josephson voltage standards.

  17. Understanding a Normal Distribution of Data.

    Science.gov (United States)

    Maltenfort, Mitchell G

    2015-12-01

    Assuming data follow a normal distribution is essential for many common statistical tests. However, what are normal data and when can we assume that a data set follows this distribution? What can be done to analyze non-normal data?

  18. Quantiles for Finite Mixtures of Normal Distributions

    Science.gov (United States)

    Rahman, Mezbahur; Rahman, Rumanur; Pearson, Larry M.

    2006-01-01

    Quantiles for finite mixtures of normal distributions are computed. The difference between a linear combination of independent normal random variables and a linear combination of independent normal densities is emphasized. (Contains 3 tables and 1 figure.)
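A mixture CDF is a weighted sum of component CDFs and is strictly increasing, so mixture quantiles have no closed form but are easy to obtain numerically. A sketch by bisection (our implementation, not necessarily the method used in the paper):

```python
import math

def norm_cdf(x, mu=0.0, sigma=1.0):
    """Standard normal CDF shifted and scaled to N(mu, sigma^2)."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def mixture_quantile(p, weights, mus, sigmas, tol=1e-10):
    """p-quantile of a finite normal mixture, found by bisection on the
    mixture CDF (weights are assumed to be nonnegative and sum to 1)."""
    lo = min(m - 10 * s for m, s in zip(mus, sigmas))
    hi = max(m + 10 * s for m, s in zip(mus, sigmas))
    cdf = lambda x: sum(w * norm_cdf(x, m, s)
                        for w, m, s in zip(weights, mus, sigmas))
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

This also illustrates the distinction the abstract emphasizes: the quantile of the mixture *distribution* must be computed from the mixed CDF, and is not a weighted combination of the component quantiles.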

  19. Quantitative proteome profiling of normal human circulating microparticles

    DEFF Research Database (Denmark)

    Østergaard, Ole; Nielsen, Christoffer T; Iversen, Line V

    2012-01-01

    Circulating microparticles (MPs) are produced as part of normal physiology. Their numbers, origin, and composition change in pathology. Despite this, the normal MP proteome has not yet been characterized with standardized high-resolution methods. We here quantitatively profile the normal MP proteome using nano-LC-MS/MS on an LTQ-Orbitrap with optimized sample collection, preparation, and analysis of 12 different normal samples. Analytical and procedural variation were estimated in triply processed samples analyzed in triplicate from two different donors. Label-free quantitation was validated by the correlation of cytoskeletal protein intensities with MP numbers obtained by flow cytometry. Finally, the validity of using pooled samples was evaluated using overlap protein identification numbers and multivariate data analysis. Using conservative parameters, 536 different unique proteins were quantitated...

  20. Giro form reading machine

    Science.gov (United States)

    Minh Ha, Thien; Niggeler, Dieter; Bunke, Horst; Clarinval, Jose

    1995-08-01

    Although giro forms are used by many people in daily life for money remittance in Switzerland, the processing of these forms at banks and post offices is only partly automated. We describe an ongoing project for building an automatic system that is able to recognize various items printed or written on a giro form. The system comprises three main components, namely, an automatic form feeder, a camera system, and a computer. These components are connected in such a way that the system is able to process a bunch of forms without any human interactions. We present two real applications of our system in the field of payment services, which require the reading of both machine printed and handwritten information that may appear on a giro form. One particular feature of giro forms is their flexible layout, i.e., information items are located differently from one form to another, thus requiring an additional analysis step to localize them before recognition. A commercial optical character recognition software package is used for recognition of machine-printed information, whereas handwritten information is read by our own algorithms, the details of which are presented. The system is implemented by using a client/server architecture providing a high degree of flexibility to change. Preliminary results are reported supporting our claim that the system is usable in practice.

  1. Mastering HTML5 forms

    CERN Document Server

    Gupta, Gaurav

    2013-01-01

    This tutorial will show you how to create stylish forms that are not only visually appealing but also interactive and customized, in order to gather valuable user input and information. Enhance your skills in building responsive and dynamic web forms using HTML5, CSS3, and related technologies. All you need is a basic understanding of HTML and PHP.

  2. Soil Forming Factors

    Science.gov (United States)


  3. Method for forming materials

    Science.gov (United States)

    Tolle, Charles R [Idaho Falls, ID; Clark, Denis E [Idaho Falls, ID; Smartt, Herschel B [Idaho Falls, ID; Miller, Karen S [Idaho Falls, ID

    2009-10-06

    A material-forming tool and a method for forming a material are described including a shank portion; a shoulder portion that releasably engages the shank portion; a pin that releasably engages the shoulder portion, wherein the pin defines a passageway; and a source of a material coupled in material flowing relation relative to the pin and wherein the material-forming tool is utilized in methodology that includes providing a first material; providing a second material, and placing the second material into contact with the first material; and locally plastically deforming the first material with the material-forming tool so as to mix the first material and second material together to form a resulting material having characteristics different from the respective first and second materials.

  4. Reward value-based gain control: divisive normalization in parietal cortex.

    Science.gov (United States)

    Louie, Kenway; Grattan, Lauren E; Glimcher, Paul W

    2011-07-20

    The representation of value is a critical component of decision making. Rational choice theory assumes that options are assigned absolute values, independent of the value or existence of other alternatives. However, context-dependent choice behavior in both animals and humans violates this assumption, suggesting that biological decision processes rely on comparative evaluation. Here we show that neurons in the monkey lateral intraparietal cortex encode a relative form of saccadic value, explicitly dependent on the values of the other available alternatives. Analogous to extra-classical receptive field effects in visual cortex, this relative representation incorporates target values outside the response field and is observed in both stimulus-driven activity and baseline firing rates. This context-dependent modulation is precisely described by divisive normalization, indicating that this standard form of sensory gain control may be a general mechanism of cortical computation. Such normalization in decision circuits effectively implements an adaptive gain control for value coding and provides a possible mechanistic basis for behavioral context-dependent violations of rationality.
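
The divisive-normalization form referred to above is commonly written R_i = v_i / (σ + β·Σ_j v_j): each option's response is divided by a pool over all available option values plus a semisaturation constant. A toy sketch (parameter names are ours; the model fitted in the paper may include additional gains and exponents):

```python
def divisive_normalization(values, sigma=1.0, beta=1.0):
    """Relative value coding: divide each option's value by a common
    normalization pool, R_i = v_i / (sigma + beta * sum_j v_j)."""
    pool = sigma + beta * sum(values)
    return [v / pool for v in values]
```

The context dependence is immediate: adding a third alternative to a two-option set enlarges the pool and suppresses the responses to the original two options, even though their absolute values are unchanged.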

  5. Antimicrobial effects of herbal extracts on Streptococcus mutans and normal oral streptococci.

    Science.gov (United States)

    Lee, Sung-Hoon

    2013-08-01

    Streptococcus mutans is associated with dental caries; cariogenic biofilms, in particular, have been studied extensively for their role in the formation of dental caries. Herbal extracts such as Cudrania tricuspidata, Sophora flavescens, Ginkgo biloba, and Betula schmidtii have been used as folk remedies for treating disease. The purpose of this study was to evaluate and compare the antibacterial activity of herbal extracts against normal oral streptococci and against planktonic and biofilm forms of S. mutans. Streptococcus gordonii, Streptococcus oralis, Streptococcus salivarius, Streptococcus sanguinis, and S. mutans were cultivated in brain heart infusion (BHI) broth, and a susceptibility assay for the herbal extracts was performed according to the protocol of the Clinical and Laboratory Standards Institute. In addition, S. mutans biofilm was formed on a polystyrene 12-well plate and an 8-well chamber glass slip using BHI broth containing 2% sucrose and 1% mannose, after conditioning the plate and the glass slip with unstimulated saliva. The biofilm was treated with the herbal extracts at various concentrations and inoculated on Mitis-Salivarius bacitracin agar plates for enumeration of viable S. mutans by counting colony-forming units. Planktonic S. mutans was susceptible to all of the extracts, and S. mutans biofilm exhibited the highest sensitivity to the extract of S. flavescens. The normal oral streptococci exhibited weak susceptibility in comparison with S. mutans; S. oralis, however, was resistant to all of the extracts. In conclusion, the extract of S. flavescens may be a potential candidate for the prevention and management of dental caries.
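    The colony-counting step can be sketched as the standard back-calculation from plated colonies to viable counts; the colony count, dilution factor, and plated volume below are hypothetical, not taken from the study.

    ```python
    def cfu_per_ml(colonies, dilution, volume_ml):
        """Back-calculate viable counts from a plate count:
        CFU/mL = colonies / (dilution factor * volume plated).
        colonies: colonies counted on the plate
        dilution: dilution factor of the plated sample (e.g. 1e-4)
        volume_ml: volume spread on the plate in mL"""
        return colonies / (dilution * volume_ml)

    # Hypothetical example: 87 colonies from 0.1 mL of a 10^-4 dilution.
    count = cfu_per_ml(colonies=87, dilution=1e-4, volume_ml=0.1)
    print(f"{count:.3g} CFU/mL")  # 8.7e+06 CFU/mL
    ```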

  6. Strong normalization by type-directed partial evaluation and run-time code generation

    DEFF Research Database (Denmark)

    Balat, Vincent; Danvy, Olivier

    1998-01-01

    We investigate the synergy between type-directed partial evaluation and run-time code generation for the Caml dialect of ML. Type-directed partial evaluation maps simply typed, closed Caml values to a representation of their long βη-normal form. Caml uses a virtual machine and has the capability...... to load byte code at run time. Representing the long βη-normal forms as byte code gives us the ability to strongly normalize higher-order values (i.e., weak head normal forms in ML), to compile the resulting strong normal forms into byte code, and to load this byte code all in one go, at run time. We...... conclude this note with a preview of our current work on scaling up strong normalization by run-time code generation to the Caml module language....

  7. Strong Normalization by Type-Directed Partial Evaluation and Run-Time Code Generation

    DEFF Research Database (Denmark)

    Balat, Vincent; Danvy, Olivier

    1997-01-01

    We investigate the synergy between type-directed partial evaluation and run-time code generation for the Caml dialect of ML. Type-directed partial evaluation maps simply typed, closed Caml values to a representation of their long βη-normal form. Caml uses a virtual machine and has the capability...... to load byte code at run time. Representing the long βη-normal forms as byte code gives us the ability to strongly normalize higher-order values (i.e., weak head normal forms in ML), to compile the resulting strong normal forms into byte code, and to load this byte code all in one go, at run time. We...... conclude this note with a preview of our current work on scaling up strong normalization by run-time code generation to the Caml module language....

  8. 48 CFR 49.602-2 - Inventory forms.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System, TERMINATION OF CONTRACTS, Contract Termination Forms and Formats, 49.602-2 Inventory forms. Standard Form (SF) 1428, Inventory Disposal Schedule, and SF 1429, Inventory Disposal Schedule—Continuation Sheet, shall...

  9. Cystic form of rheumatoid arthritis

    Energy Technology Data Exchange (ETDEWEB)

    Dijkstra, P.F.; Gubler, F.M.; Maas, A.

    1988-10-01

    A nonerosive form of rheumatoid arthritis (R.A.) was found in 62 of 660 patients with R.A. These 62 patients exhibit slowly progressive cystic changes in roughly the same joints in which erosions usually develop in classic R.A. The E.S.R. is often low, half of the patients remained seronegative, and the group comprises 35 males and 27 females. A smaller group of 15 of these patients could be followed, over a period of at least 6 years, from a stage in which the radiographs were normal to a stage of extensive cystic changes. An attempt is made to delineate this group within the rheumatoid arthritis disease entity.

  10. The Influence of Normalization Weight in Population Pharmacokinetic Covariate Models.

    Science.gov (United States)

    Goulooze, Sebastiaan C; Völler, Swantje; Välitalo, Pyry A J; Calvier, Elisa A M; Aarons, Leon; Krekels, Elke H J; Knibbe, Catherijne A J

    2018-03-23

    In covariate (sub)models of population pharmacokinetic models, most covariates are normalized to the median value; however, for body weight, normalization to 70 kg or 1 kg is often applied. In this article, we illustrate the impact of normalization weight on the precision of population clearance (CLpop) parameter estimates. The influence of normalization weight (70 kg, 1 kg, or median weight) on the precision of the CLpop estimate, expressed as relative standard error (RSE), was illustrated using data from a pharmacokinetic study in neonates with a median weight of 2.7 kg. In addition, a simulation study was performed to show the impact of normalization to 70 kg in pharmacokinetic studies with paediatric or obese patients. The RSE of the CLpop parameter estimate in the neonatal dataset was lowest with normalization to median weight (8.1%), compared with normalization to 1 kg (10.5%) or 70 kg (48.8%). Typical clearance (CL) predictions were independent of the normalization weight used. Simulations showed that the increase in RSE of the CLpop estimate with 70 kg normalization was highest in studies with a narrow weight range and a geometric mean weight away from 70 kg. When, instead of normalizing with median weight, a weight outside the observed range is used, the RSE of the CLpop estimate will be inflated, and should therefore not be used for model selection. Instead, established mathematical principles can be used to calculate the RSE of the typical CL (CLTV) at a relevant weight to evaluate the precision of CL predictions.
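    A minimal sketch of why typical clearance predictions are independent of the normalization weight, assuming a standard allometric power model with an illustrative exponent of 0.75 and a hypothetical clearance value (neither is an estimate from the study above):

    ```python
    # Allometric covariate model: CL = CL_pop * (WT / WT_norm) ** 0.75.
    # The exponent and the clearance at the 2.7 kg median are illustrative.
    def typical_cl(cl_pop, wt, wt_norm, exponent=0.75):
        return cl_pop * (wt / wt_norm) ** exponent

    cl_at_median = 0.5   # hypothetical CL (L/h) at the 2.7 kg median weight
    median_wt = 2.7

    # Re-expressing CL_pop for other normalization weights leaves every
    # typical prediction unchanged; only the parameter (and its RSE) moves.
    cl_pop_1kg = cl_at_median / (median_wt / 1.0) ** 0.75
    cl_pop_70kg = cl_at_median / (median_wt / 70.0) ** 0.75

    for wt in (1.5, 2.7, 4.0):
        a = typical_cl(cl_at_median, wt, median_wt)
        b = typical_cl(cl_pop_1kg, wt, 1.0)
        c = typical_cl(cl_pop_70kg, wt, 70.0)
        assert abs(a - b) < 1e-12 and abs(a - c) < 1e-12
    ```

    The precision penalty in the abstract comes from estimating CLpop at 70 kg, far outside the 2.7 kg neonatal weight range, not from any change in the predictions themselves.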

  11. Analysis of visual appearance of retinal nerve fibers in high resolution fundus images: a study on normal subjects.

    Science.gov (United States)

    Kolar, Radim; Tornow, Ralf P; Laemmer, Robert; Odstrcilik, Jan; Mayer, Markus A; Gazarek, Jiri; Jan, Jiri; Kubena, Tomas; Cernosek, Pavel

    2013-01-01

    The retinal ganglion axons are an important part of the visual system and can be directly observed by fundus camera. The layer they form together inside the retina is the retinal nerve fiber layer (RNFL). This paper describes results of a texture analysis of the RNFL in color fundus photographs and compares these results with quantitative measurements of RNFL thickness obtained from optical coherence tomography on normal subjects. It is shown that the local mean value, standard deviation, and Shannon entropy extracted from the green and blue channels of fundus images are correlated with the corresponding RNFL thickness. The linear correlation coefficients reached 0.694, 0.547, and 0.512 for the respective features, measured at 439 retinal positions in the peripapillary area in 23 eyes of 15 normal subjects.
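    A minimal sketch of the three texture features named above, computed for a single grey-level patch of one colour channel; the patch size and histogram bin count are assumptions, not the paper's exact settings.

    ```python
    import numpy as np

    def patch_features(patch, bins=32):
        """Local mean, standard deviation and Shannon entropy of one
        grey-level patch (e.g. the green or blue channel around a retinal
        position); the bin count is an illustrative choice."""
        patch = np.asarray(patch, dtype=float)
        hist, _ = np.histogram(patch, bins=bins)
        p = hist[hist > 0] / hist.sum()
        entropy = -(p * np.log2(p)).sum()
        return patch.mean(), patch.std(), entropy

    rng = np.random.default_rng(0)
    smooth = np.full((15, 15), 100.0)              # uniform patch: no spread
    textured = rng.normal(100.0, 20.0, (15, 15))   # high local variation
    assert patch_features(smooth)[1] < patch_features(textured)[1]
    assert patch_features(smooth)[2] < patch_features(textured)[2]
    ```

    In the study, maps of such features were then correlated position-by-position against OCT-measured RNFL thickness.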

  12. Analysis of Visual Appearance of Retinal Nerve Fibers in High Resolution Fundus Images: A Study on Normal Subjects

    Directory of Open Access Journals (Sweden)

    Radim Kolar

    2013-01-01

    The retinal ganglion axons are an important part of the visual system, which can be directly observed by fundus camera. The layer they form together inside the retina is the retinal nerve fiber layer (RNFL). This paper describes results of a texture RNFL analysis in color fundus photographs and compares these results with quantitative measurement of RNFL thickness obtained from optical coherence tomography on normal subjects. It is shown that local mean value, standard deviation, and Shannon entropy extracted from the green and blue channel of fundus images are correlated with corresponding RNFL thickness. The linear correlation coefficients achieved values 0.694, 0.547, and 0.512 for respective features measured on 439 retinal positions in the peripapillary area from 23 eyes of 15 different normal subjects.

  13. Dual radiofrequency drive quantum voltage standard with nanovolt resolution based on a closed-loop refrigeration cycle

    International Nuclear Information System (INIS)

    Georgakopoulos, D; Budovsky, I; Hagen, T; Sasaki, H; Yamamori, H

    2012-01-01

    We have developed a programmable Josephson voltage standard that can produce voltages up to 20 V with a resolution of better than 0.1 µV over the whole voltage range and better than 1 nV for voltages up to 10 mV. The standard has two superconductor–normal metal–superconductor junction arrays connected in series and driven by two radiofrequency oscillators. The cryogenic part of the standard is based on a cryocooler. The new standard agrees with the primary quantum voltage standard maintained at the National Measurement Institute, Australia, within 10 nV and forms the basis of an automated calibration system for digital multimeters and voltage references. (paper)
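    The voltage of a programmable Josephson array follows V = n·f/K_J, with n the number of junctions biased on the first Shapiro step and f the drive frequency. The sketch below uses the conventional Josephson constant K_J-90; the junction count and frequency are illustrative, not the parameters of the NMIA system.

    ```python
    # Conventional Josephson constant (1990 value, exact by convention).
    KJ_90 = 483_597.9e9  # Hz/V

    def array_voltage(n_junctions, frequency_hz, kj=KJ_90):
        """Output voltage of a programmable Josephson array on the
        first constant-voltage step: V = n * f / K_J."""
        return n_junctions * frequency_hz / kj

    # Illustrative numbers only: ~2.65e5 junctions at ~18 GHz give ~10 V.
    v = array_voltage(n_junctions=265_116, frequency_hz=18.25e9)
    print(f"{v:.6f} V")
    ```

    Because V depends only on a junction count and a frequency, the output is quantized and traceable to frequency standards, which is what makes nanovolt-level agreement between two such systems meaningful.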

  14. Phenotype of normal spirometry in an aging population.

    Science.gov (United States)

    Vaz Fragoso, Carlos A; McAvay, Gail; Van Ness, Peter H; Casaburi, Richard; Jensen, Robert L; MacIntyre, Neil; Gill, Thomas M; Yaggi, H Klar; Concato, John

    2015-10-01

    In aging populations, the commonly used Global Initiative for Chronic Obstructive Lung Disease (GOLD) may misclassify normal spirometry as respiratory impairment (airflow obstruction and restrictive pattern), including the presumption of respiratory disease (chronic obstructive pulmonary disease [COPD]). To evaluate the phenotype of normal spirometry as defined by a new approach from the Global Lung Initiative (GLI), overall and across GOLD spirometric categories. Using data from COPDGene (n = 10,131; ages 45-81; smoking history, ≥10 pack-years), we evaluated spirometry and multiple phenotypes, including dyspnea severity (Modified Medical Research Council grade 0-4), health-related quality of life (St. George's Respiratory Questionnaire total score), 6-minute-walk distance, bronchodilator reversibility (FEV1 % change), computed tomography-measured percentage of lung with emphysema (% emphysema) and gas trapping (% gas trapping), and small airway dimensions (square root of the wall area for a standardized airway with an internal perimeter of 10 mm). Among 5,100 participants with GLI-defined normal spirometry, GOLD identified respiratory impairment in 1,146 (22.5%), including a restrictive pattern in 464 (9.1%), mild COPD in 380 (7.5%), moderate COPD in 302 (5.9%), and severe COPD in none. Overall, the phenotype of GLI-defined normal spirometry included normal adjusted mean values for dyspnea grade (0.8), St. George's Respiratory Questionnaire (15.9), 6-minute-walk distance (1,424 ft [434 m]), bronchodilator reversibility (2.7%), % emphysema (0.9%), % gas trapping (10.7%), and square root of the wall area for a standardized airway with an internal perimeter of 10 mm (3.65 mm); corresponding 95% confidence intervals were similarly normal. These phenotypes remained normal for GLI-defined normal spirometry across GOLD spirometric categories. GLI-defined normal spirometry, even when classified as respiratory impairment by GOLD, included adjusted mean values in the

  15. A locally adaptive normal distribution

    DEFF Research Database (Denmark)

    Arvanitidis, Georgios; Hansen, Lars Kai; Hauberg, Søren

    2016-01-01

    The multivariate normal density is a monotonic function of the distance to the mean, and its ellipsoidal shape is due to the underlying Euclidean metric. We suggest to replace this metric with a locally adaptive, smoothly changing (Riemannian) metric that favors regions of high local density... entropy distribution under the given metric. The underlying metric is, however, non-parametric. We develop a maximum likelihood algorithm to infer the distribution parameters that relies on a combination of gradient descent and Monte Carlo integration. We further extend the LAND to mixture models...

  16. Normal pediatric postmortem CT appearances

    Energy Technology Data Exchange (ETDEWEB)

    Klein, Willemijn M.; Bosboom, Dennis G.H.; Koopmanschap, Desiree H.J.L.M. [Radboud University Medical Center, Department of Radiology and Nuclear Medicine, Nijmegen (Netherlands); Nievelstein, Rutger A.J. [University Medical Center Utrecht, Department of Radiology, Utrecht (Netherlands); Nikkels, Peter G.J. [University Medical Center Utrecht, Department of Pathology, Utrecht (Netherlands); Rijn, Rick R. van [Academic Medical Center, Department of Radiology, Amsterdam (Netherlands)

    2015-04-01

    Postmortem radiology is a rapidly developing specialty that is increasingly used as an adjunct to or substitute for conventional autopsy. The goal is to find patterns of disease and possibly the cause of death. Postmortem CT images bring to light processes of decomposition most radiologists are unfamiliar with. These postmortem changes, such as the formation of gas and edema, should not be mistaken for pathological processes that occur in living persons. In this review we discuss the normal postmortem thoraco-abdominal changes and how these appear on CT images, as well as how to differentiate these findings from those of pathological processes. (orig.)

  17. Multispectral histogram normalization contrast enhancement

    Science.gov (United States)

    Soha, J. M.; Schwartz, A. A.

    1979-01-01

    A multispectral histogram normalization or decorrelation enhancement which achieves effective color composites by removing interband correlation is described. The enhancement procedure employs either linear or nonlinear transformations to equalize principal component variances. An additional rotation to any set of orthogonal coordinates is thus possible, while full histogram utilization is maintained by avoiding the reintroduction of correlation. For the three-dimensional case, the enhancement procedure may be implemented with a lookup table. An application of the enhancement to Landsat multispectral scanning imagery is presented.
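    A minimal sketch of a decorrelation enhancement along the lines described: rotate the bands to principal components, equalize the component variances, and rotate back so interband correlation is removed. Here this is done via symmetric whitening with the inverse matrix square root of the band covariance; the lookup-table implementation mentioned in the abstract is not shown.

    ```python
    import numpy as np

    def decorrelation_stretch(pixels):
        """Decorrelation sketch for an (n_pixels, n_bands) array:
        diagonalize the band covariance, scale each principal axis to
        unit variance, and rotate back to the original band axes."""
        x = np.asarray(pixels, dtype=float)
        mean = x.mean(axis=0)
        cov = np.cov(x - mean, rowvar=False)
        eigval, eigvec = np.linalg.eigh(cov)
        # Symmetric whitening transform: C^{-1/2}
        transform = eigvec @ np.diag(1.0 / np.sqrt(eigval)) @ eigvec.T
        return (x - mean) @ transform

    # Synthetic 3-band image with deliberate interband correlation.
    bands = np.random.default_rng(1).normal(size=(1000, 3))
    bands[:, 1] += 0.9 * bands[:, 0]
    out = decorrelation_stretch(bands)
    # Output bands are uncorrelated with equal (unit) variances.
    assert np.allclose(np.cov(out, rowvar=False), np.eye(3), atol=1e-8)
    ```

    A further rotation by any orthogonal matrix would preserve this whitened state, which is the freedom the abstract notes for choosing display coordinates.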

  18. Normal movement selectivity in autism.

    Science.gov (United States)

    Dinstein, Ilan; Thomas, Cibu; Humphreys, Kate; Minshew, Nancy; Behrmann, Marlene; Heeger, David J

    2010-05-13

    It has been proposed that individuals with autism have difficulties understanding the goals and intentions of others because of a fundamental dysfunction in the mirror neuron system. Here, however, we show that individuals with autism exhibited not only normal fMRI responses in mirror system areas during observation and execution of hand movements but also exhibited typical movement-selective adaptation (repetition suppression) when observing or executing the same movement repeatedly. Movement selectivity is a defining characteristic of neurons involved in movement perception, including mirror neurons, and, as such, these findings argue against a mirror system dysfunction in autism. Copyright 2010 Elsevier Inc. All rights reserved.

  19. On The Extensive Form Of N-Person Cooperative Games | Udeh ...

    African Journals Online (AJOL)

    On The Extensive Form Of N-Person Cooperative Games. ... games. Keywords: Extensive form game, Normal form game, characteristic function, Coalition, Imputation, Player, Payoff, Strategy and Core

  20. Standards for Standardized Logistic Regression Coefficients

    Science.gov (United States)

    Menard, Scott

    2011-01-01

    Standardized coefficients in logistic regression analysis have the same utility as standardized coefficients in linear regression analysis. Although there has been no consensus on the best way to construct standardized logistic regression coefficients, there is now sufficient evidence to suggest a single best approach to the construction of a…

  1. Instant standard concept for data standards development

    NARCIS (Netherlands)

    Folmer, Erwin Johan Albert; Kulcsor, Istvan Zsolt; Roes, Jasper

    2013-01-01

    This paper presents the current results of ongoing research on a new data standards development concept. The concept is called Instant Standard, referring to the pressure that is generated by shrinking the length of the standardization process. Based on this concept it is estimated that the

  2. Proximity effect in normal-superconductor hybrids for quasiparticle traps

    Energy Technology Data Exchange (ETDEWEB)

    Hosseinkhani, Amin [Peter Grunberg Institute (PGI-2), Forschungszentrum Julich, D-52425 Julich (Germany); JARA-Institute for Quantum Information, RWTH Aachen University, D-52056 Aachen (Germany)

    2016-07-01

    Coherent transport of charges in the form of Cooper pairs is the main feature of Josephson junctions, which play a central role in superconducting qubits. However, the presence of quasiparticles in superconducting devices may lead to incoherent charge transfer and limit the coherence time of superconducting qubits. A way around this so-called "quasiparticle poisoning" might be to use a normal-metal island to trap quasiparticles; this has motivated us to revisit the proximity effect in normal-superconductor hybrids. Using the semiclassical Usadel equations, we study the density of states (DoS) both within and away from the trap. We find that in the superconducting layer the DoS quickly approaches the BCS form; this indicates that normal-metal traps should be effective at localizing quasiparticles.

  3. Modelling of tension stiffening for normal and high strength concrete

    DEFF Research Database (Denmark)

    Christiansen, Morten Bo; Nielsen, Mogens Peter

    1998-01-01

    form the model is extended to apply to biaxial stress fields as well. To determine the biaxial stress field, the theorem of minimum complementary elastic energy is used. The theory has been compared with tests on rods, disks, and beams of both normal and high strength concrete, and very good results...

  4. Learning attention for historical text normalization by learning to pronounce

    DEFF Research Database (Denmark)

    Bollmann, Marcel; Bingel, Joachim; Søgaard, Anders

    2017-01-01

    Automated processing of historical texts often relies on pre-normalization to modern word forms. Training encoder-decoder architectures to solve such problems typically requires a lot of training data, which is not available for the named task. We address this problem by using several novel encoder...

  5. Identity Work at a Normal University in Shanghai

    Science.gov (United States)

    Cockain, Alex

    2016-01-01

    Based upon ethnographic research, this article explores undergraduate students' experiences at a normal university in Shanghai focusing on the types of identities and forms of sociality emerging therein. Although students' symptoms of disappointment seem to indicate the power of university experiences to extinguish purposeful action, this article…

  6. On matrix superpotential and three-component normal modes

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, R. de Lima [Centro Brasileiro de Pesquisas Fisicas (CBPF), Rio de Janeiro, RJ (Brazil); Lima, A.F. de [Universidade Federal de Campina Grande (UFCG), PB (Brazil). Dept. de Fisica; Mello, E.R. Bezerra de; Bezerra, V.B. [Universidade Federal da Paraiba (UFPB), Joao Pessoa, PB (Brazil). Dept. de Fisica]. E-mails: rafael@df.ufcg.edu.br; aerlima@df.ufcg.edu.br; emello@fisica.ufpb.br; valdir@fisica.ufpb.br

    2007-07-01

    We consider the supersymmetric quantum mechanics (SUSY QM) with three-component normal modes for the Bogomol'nyi-Prasad-Sommerfield (BPS) states. An explicit form of the SUSY QM matrix superpotential is presented and the corresponding three-component bosonic zero-mode eigenfunction is investigated. (author)

  7. Alexander's disease in a neurologically normal child: a case report

    International Nuclear Information System (INIS)

    Guthrie, Scott O.; Knowles, Paul; Marshall, Robert; Burton, Edward M.

    2003-01-01

    We report the clinical and MRI findings of symmetric hyperintensity involving the deep and subcortical white matter of the frontal lobes in a neurologically normal child with macrocephaly. In this patient, a serum test for mutations in glial fibrillary acidic protein, used to diagnose Alexander's disease (AD), was positive. This case indicates an extraordinarily mild or early form of juvenile-onset AD. (orig.)

  8. Update on normal tension glaucoma

    Directory of Open Access Journals (Sweden)

    Jyotiranjan Mallick

    2016-01-01

    Normal tension glaucoma (NTG) is labelled when typical glaucomatous disc changes, visual field defects and open anterior chamber angles are associated with intraocular pressure (IOP) constantly below 21 mmHg. Chronic low vascular perfusion, Raynaud's phenomenon, migraine, nocturnal systemic hypotension and over-treated systemic hypertension are the main causes of normal tension glaucoma. Goldmann applanation tonometry, gonioscopy, slit lamp biomicroscopy, optical coherence tomography and visual field analysis are the main tools of investigation for the diagnosis of NTG. Management follows the same principles of treatment for other chronic glaucomas: to reduce IOP by a substantial amount, sufficient to prevent disabling visual loss. Treatment is generally aimed to lower IOP by 30% from pre-existing levels to 12-14 mmHg. Betaxolol, brimonidine, prostaglandin analogues, trabeculectomy (in refractory cases), systemic calcium channel blockers (such as nifedipine) and 24-hour monitoring of blood pressure are considered in the management of NTG. The present review summarises risk factors, causes, pathogenesis, diagnosis and management of NTG.

  9. Normal variation of hepatic artery

    International Nuclear Information System (INIS)

    Kim, Inn; Nam, Myung Hyun; Rhim, Hyun Chul; Koh, Byung Hee; Seo, Heung Suk; Kim, Soon Yong

    1987-01-01

    This study is an analysis of the blood supply of the liver in 125 patients who underwent hepatic arteriography and abdominal aortography from January 1984 to December 1986 at the Department of Radiology of Hanyang University Hospital. A. Variations in extrahepatic arteries: 1. The normal extrahepatic artery pattern occurred in 106 of 125 cases (84.8%): right hepatic and left hepatic arteries arising from the hepatic artery proper, and the hepatic artery proper arising from the common hepatic artery. 2. The most common variation of the extrahepatic arteries was a replaced right hepatic artery arising from the superior mesenteric artery: 6 of 125 cases (4.8%). B. Variations in intrahepatic arteries: 1. The normal intrahepatic artery pattern occurred in 83 of 125 cases (66.4%): right hepatic and left hepatic arteries arising from the hepatic artery proper, and the middle hepatic artery arising from the lower portion of the umbilical point of the left hepatic artery. 2. The most common variation of the intrahepatic arteries involved the middle hepatic artery. 3. Among the variations of the middle hepatic artery, right, middle and left hepatic arteries arising from the same location on the hepatic artery proper was the most common type: 17 of 125 cases (13.6%)

  10. Radiation control standards and procedures

    Energy Technology Data Exchange (ETDEWEB)

    1956-12-14

    This manual contains the "Radiation Control Standards" and "Radiation Control Procedures" at Hanford Operations, which have been established to provide the necessary control of radiation exposures within the Irradiation Processing Department. Provision is also made for including, in the form of "Bulletins", other radiological information of general interest to IPD personnel. The purpose of the standards is to establish firm radiological limits within which the Irradiation Processing Department will operate, and to outline our radiation control program in sufficient detail to insure uniform and consistent application throughout all IPD facilities. Radiation Control Procedures are intended to prescribe the best method of accomplishing an objective within the limitations of the Radiation Control Standards. A procedure may be changed at any time provided the suggested change is generally agreeable to the management involved, and is consistent with department policies and the Radiation Control Standards.

  11. [Precautions of physical performance requirements and test methods during product standard drafting process of medical devices].

    Science.gov (United States)

    Song, Jin-Zi; Wan, Min; Xu, Hui; Yao, Xiu-Jun; Zhang, Bo; Wang, Jin-Hong

    2009-09-01

    The major idea of this article is to discuss standardization and normalization of product standards for medical devices. We analyze problems related to physical performance requirements and test methods that arise during the product standard drafting process and make corresponding suggestions.

  12. Package materials, waste form

    International Nuclear Information System (INIS)

    Anon.

    1980-01-01

    The schedules for waste package development for the various host rocks were presented. The waste form subtask activities were reviewed, with the papers focusing on high-level waste, transuranic waste, and spent fuel. The following ten papers were presented: (1) Waste Package Development Approach; (2) Borosilicate Glass as a Matrix for Savannah River Plant Waste; (3) Development of Alternative High-Level Waste Forms; (4) Overview of the Transuranic Waste Management Program; (5) Assessment of the Impacts of Spent Fuel Disassembly - Alternatives on the Nuclear Waste Isolation System; (6) Reactions of Spent Fuel and Reprocessing Waste Forms with Water in the Presence of Basalt; (7) Spent Fuel Stabilizer Screening Studies; (8) Chemical Interactions of Shale Rock, Prototype Waste Forms, and Prototype Canister Metals in a Simulated Wet Repository Environment; (9) Impact of Fission Gas and Volatiles on Spent Fuel During Geologic Disposal; and (10) Spent Fuel Assembly Decay Heat Measurement and Analysis

  13. Getting in-formed

    DEFF Research Database (Denmark)

    Hansbøl, Mikala

    ...gives form to what we study through our descriptions. The paper takes its point of departure in empirical examples from a postdoc project on a so-called 'serious game' - Mingoville. The project follows circulations and establishments of Mingoville 'on a global marketplace'. The paper discusses how we as researchers assemble....../perform the phenomena we research. Actor-network theorist Bruno Latour (2005) points out that every description is also a form of explanation: a form of explanation that puts things into a script and thereby also gives things form. The paper discusses two approaches to doing serious games and thereby creating knowledge about...... engagements with these phenomena in serious games research: experimental and ethnographic....

  14. NOAA Form 370 Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data set contains information from submitted NOAA Form 370s, also known as the Fisheries Certificate of Origin, for imported shipments of frozen and/or processed...

  15. MAPS Appraisal Report Form

    CERN Multimedia

    HR Department

    2005-01-01

    As announced in Weekly Bulletin 48/2004, from now onwards, the paper MAPS appraisal report form has been replaced by an electronic form, which is available via EDH (on the EDH desktop under Other Tasks / HR & Training). No changes have been made to the contents of the form. Practical information will be available on the web page http://cern.ch/ais/projs/forms/maps/info.htm, and information meetings will be held on the following dates: 18 January 2005: MAIN AUDITORIUM (500-1-001) from 14:00 to 15:30. 20 January 2005: AB AUDITORIUM II (864-1-D02) from 14:00 to 15:30. 24 January 2005: AT AUDITORIUM (30-7-018) from 10:00 to 11:30. Human Resources Department Tel. 73566

  16. VMS forms Output Tables

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — These output tables contain parsed and format validated data from the various VMS forms that are sent from any given vessel, while at sea, from the VMS devices on...

  17. Science on form

    International Nuclear Information System (INIS)

    Ishizaka, Shozo; Kato, Yoshihiro; Takaki, Ryuji; Toriwaki, Jun-ichiro

    1987-01-01

    The purpose of the Symposium was to discuss interdisciplinary scientific aspects of form. 'Form' depends on material and on change, but it is the form that appears evident at once and endures. Form is drawn from every field as a medium of information. One part of the work covers the description of non-periodic phenomena, morphogenesis and evolution; irreducibly stubborn facts, such as diseases or social problems, or whatever else has resisted analysis, are challenged to be systematized as a whole by computer simulation. The other part covers the finding of laws that determine how systems behave; attention is paid to pattern recognition, image processing and pattern formation. The Symposium proceeded with no parallel sessions, and participants from various fields engaged in lively discussion in an interdisciplinary atmosphere. (Auth.)

  18. Access Customized Forms

    OpenAIRE

    Cosma Emil; Jeflea Victor

    2010-01-01

    By using Word, Excel or PowerPoint, one can automate routine operations with the VBA language (Visual Basic for Applications). This language is also used in Access, allowing access to data stored in tables or queries; thus, Access and VBA resources can be used together. Access is designed for programming forms and reports (among other things), so none of the VBA editor's specific forms is found there.

  19. Design of Normal Concrete Mixtures Using Workability-Dispersion-Cohesion Method

    Directory of Open Access Journals (Sweden)

    Hisham Qasrawi

    2016-01-01

    The workability-dispersion-cohesion method is a newly proposed method for the design of normal concrete mixes. The method uses special coefficients called workability-dispersion and workability-cohesion factors. These coefficients relate workability to the mobility and stability of the concrete mix and are obtained from special charts depending on mix requirements and aggregate properties. The method is practical because it covers various types of aggregates that may not be within standard specifications, different water-to-cement ratios, and various degrees of workability. Simple linear relationships were developed for the variables encountered in the mix design and were presented in graphical form. The method can be used in countries where the grading or fineness of the available materials is different from the common international specifications (such as ASTM or BS). Results were compared to the ACI and British methods of mix design. The method can be extended to cover all types of concrete.

  20. Connection between effective-range expansion and nuclear vertex constant or asymptotic normalization coefficient

    International Nuclear Information System (INIS)

    Yarmukhamedov, R.; Baye, D.

    2011-01-01

    Explicit relations between the effective-range expansion and the nuclear vertex constant or asymptotic normalization coefficient (ANC) for the virtual decay B→A+a are derived for an arbitrary orbital momentum, together with the corresponding location condition for the (A+a) bound-state energy. They are valid both for the charged case and for the neutral case. Combining these relations with the standard effective-range function up to order six makes it possible to reduce the number of free effective-range parameters to two if an ANC value is known from experiment. Values for the scattering length, effective range, and form parameter are determined in this way for the ¹⁶O+p, α+t, and α+³He collisions in partial waves where a bound state exists, by using available ANCs deduced from experiments. The resulting effective-range expansions for these collisions are valid up to energies larger than 5 MeV.
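    The simplest instance of the connection described in this record can be sketched in the textbook special case of a neutral-particle s-wave with the expansion truncated at the effective range (the paper treats arbitrary orbital momentum and the charged case; here a is the scattering length, r the effective range, κ the bound-state wave number, and C the ANC):

```latex
% s-wave effective-range expansion (neutral case)
k\cot\delta_0(k) = -\frac{1}{a} + \frac{r}{2}\,k^2 + \mathcal{O}(k^4)

% bound-state location condition: continue to the pole at k = i\kappa
-\frac{1}{a} - \frac{r}{2}\,\kappa^2 = -\kappa

% the residue at the pole yields the asymptotic normalization coefficient
C^2 = \frac{2\kappa}{1 - r\kappa}
```

    Given an experimental C, the last relation ties r to κ, which is how a known ANC removes one free effective-range parameter.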

  1. Impact of PET/CT image reconstruction methods and liver uptake normalization strategies on quantitative image analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kuhnert, Georg; Sterzer, Sergej; Kahraman, Deniz; Dietlein, Markus; Drzezga, Alexander; Kobe, Carsten [University Hospital of Cologne, Department of Nuclear Medicine, Cologne (Germany); Boellaard, Ronald [VU University Medical Centre, Department of Radiology and Nuclear Medicine, Amsterdam (Netherlands); Scheffler, Matthias; Wolf, Juergen [University Hospital of Cologne, Lung Cancer Group Cologne, Department I of Internal Medicine, Center for Integrated Oncology Cologne Bonn, Cologne (Germany)

    2016-02-15

    In oncological imaging using PET/CT, the standardized uptake value has become the most common parameter used to measure tracer accumulation. The aim of this analysis was to evaluate ultra high definition (UHD) and ordered subset expectation maximization (OSEM) PET/CT reconstructions for their potential impact on quantification. We analyzed 40 PET/CT scans of lung cancer patients. Standardized uptake values corrected for body weight (SUV) and lean body mass (SUL) were determined in the single hottest lesion in the lung and normalized to the liver for UHD and OSEM reconstruction. Quantitative uptake values and their normalized ratios for the two reconstruction settings were compared using the Wilcoxon test. The distribution of quantitative uptake values and their ratios in relation to the reconstruction method used was demonstrated in the form of frequency distribution curves, box plots and scatter plots. The agreement between OSEM and UHD reconstructions was assessed through Bland-Altman analysis. A significant difference between OSEM and UHD reconstruction was observed for all SUV and SUL data tested (p < 0.0005 in all cases). The mean values of the ratios after OSEM and UHD reconstruction showed equally significant differences (p < 0.0005 in all cases). Bland-Altman analysis showed that the SUV and SUL values and their normalized ratios were, on average, up to 60 % higher after UHD reconstruction than after OSEM reconstruction. OSEM and UHD reconstruction yielded significantly different SUV and SUL values, and the difference remained consistently high after normalization to the liver, indicating that standardization of reconstruction and the use of comparable SUV measurements are crucial when using PET/CT. (orig.)
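    The liver-normalization and Bland-Altman steps described above can be sketched as follows. This is a minimal illustration with made-up SUVmax and liver-uptake numbers, not the study's data; the variable names are our own:

```python
import numpy as np

# Hypothetical SUVmax of the hottest lung lesion under the two reconstructions
suv_osem = np.array([4.2, 6.8, 3.1, 9.5, 5.0])
suv_uhd  = np.array([5.9, 9.6, 4.4, 13.8, 7.1])
# Hypothetical liver reference uptake for each scan
liver_osem = np.array([2.1, 2.4, 1.9, 2.3, 2.0])
liver_uhd  = np.array([2.2, 2.4, 2.0, 2.3, 2.1])

# Liver-normalized ratios (tumour-to-liver), one per reconstruction setting
ratio_osem = suv_osem / liver_osem
ratio_uhd  = suv_uhd / liver_uhd

# Bland-Altman statistics on the normalized ratios:
# mean difference (bias) and 95% limits of agreement
diff = ratio_uhd - ratio_osem
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))
print(f"bias = {bias:.2f}, limits of agreement = ({loa[0]:.2f}, {loa[1]:.2f})")
```

    A positive bias here would mirror the study's finding that UHD reconstruction yields systematically higher uptake values even after normalization to the liver.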

  2. Polyphony and verb forms

    Directory of Open Access Journals (Sweden)

    Jelena Rajić

    2012-12-01

    Full Text Available This paper examines some special uses of indicative and subjunctive verb forms in Spanish, which contemporary linguistics explains using the notions of polyphony, evidentials, echoic representation, quotatives, etc. These terms, even though they refer to different characteristics and belong to different theoretical frameworks, share one common feature: they all refer to diverse linguistic forms (discourse markers, linguistic negation, quotatives, echoic utterances, etc. characterized by the presence and interaction of different voices or points of view in one discourse sequence. In this study we are interested in a description of quotative or polyphonic meanings expressed by specific verb forms and tenses, the imperfect and the conditional, and also by indicative forms in subordinate substantive clauses with a negative main verb and by subjunctive forms in subordinate concessive clauses. Our research focuses on the analysis of the linguistic conditions that make possible the evidential use of the conditional, the imperfect and the echoic (metarepresentative interpretation of indicative and subjunctive forms in the above-mentioned contexts. The examples we discuss show that evidential and echoic interpretations are inferential meanings derived from the extralinguistic situation and the knowledge that speakers have of the world.

  3. Forms of global governance

    Directory of Open Access Journals (Sweden)

    Maxim V. Kharkevich

    2014-01-01

    Full Text Available Global governance as a concept defines the meaning of contemporary world politics both as a discipline and as reality. An interdependent and globalized world requires governance, yet a global government has not been formed. The theoretical possibility of global governance without global government is proved and justified. The purpose of this article is to identify analytically the possible forms of global governance. Three such forms are identified: hierarchical, market and network. In a hierarchy, governance rests on the asymmetry of power between the parties. Market governance operates through the anonymous pricing mechanism. A network, in contrast to the market, is characterized by closer value links between the actors, but unlike in a hierarchical relationship, actors are free to leave the network. Global governance thus takes three forms and is implemented by different actors. It is impossible to single out the most efficient form of global governance: efficiency depends on the match between a form and an object of governance. It should be noted that meta-governance is likely to remain a monopoly of institutionally strong states in global governance.

  4. APS beamline standard components handbook

    International Nuclear Information System (INIS)

    Kuzay, T.M.

    1992-01-01

    It is clear that most Advanced Photon Source (APS) Collaborative Access Team (CAT) members would like to concentrate on designing specialized equipment related to their scientific programs rather than on routine or standard beamline components. Thus, an effort is in progress at the APS to identify standard and modular components of APS beamlines. Identifying standard components is a nontrivial task because these components should support diverse beamline objectives. To assist with this effort, the APS has obtained advice and help from a Beamline Standardization and Modularization Committee consisting of experts in beamline design, construction, and operation. The staff of the Experimental Facilities Division identified various components thought to be standard items for beamlines, regardless of the specific scientific objective of a particular beamline. A generic beamline layout formed the basis for this identification. This layout is based on a double-crystal monochromator as the first optical element, with the possibility of other elements to follow. Pre-engineering designs were then made of the identified standard components. The Beamline Standardization and Modularization Committee has reviewed these designs and provided very useful input regarding the specifications of these components. We realize that there will be other configurations that may require special or modified components. This Handbook in its current version (1.1) contains descriptions, specifications, and pre-engineering design drawings of these standard components. In the future, the APS plans to add engineering drawings of identified standard beamline components. Use of standard components should result in major cost reductions for CATs in the areas of beamline design and construction.

  5. Normal central retinal function and structure preserved in retinitis pigmentosa.

    Science.gov (United States)

    Jacobson, Samuel G; Roman, Alejandro J; Aleman, Tomas S; Sumaroka, Alexander; Herrera, Waldo; Windsor, Elizabeth A M; Atkinson, Lori A; Schwartz, Sharon B; Steinberg, Janet D; Cideciyan, Artur V

    2010-02-01

    To determine whether normal function and structure, as recently found in forms of Usher syndrome, also occur in a population of patients with nonsyndromic retinitis pigmentosa (RP). Patients with simplex, multiplex, or autosomal recessive RP (n = 238; ages 9-82 years) were studied with static chromatic perimetry. A subset was evaluated with optical coherence tomography (OCT). Co-localized visual sensitivity and photoreceptor nuclear layer thickness were measured across the central retina to establish the relationship of function and structure. Comparisons were made to patients with Usher syndrome (n = 83, ages 10-69 years). Cross-sectional psychophysical data identified patients with RP who had normal rod- and cone-mediated function in the central retina. There were two other patterns with greater dysfunction, and longitudinal data confirmed that progression can occur from normal rod and cone function to cone-only central islands. The retinal extent of normal laminar architecture by OCT corresponded to the extent of normal visual function in patients with RP. Central retinal preservation of normal function and structure did not show a relationship with age or retained peripheral function. Usher syndrome results were like those in nonsyndromic RP. Regional disease variation is a well-known finding in RP. Unexpected was the observation that patients with presumed recessive RP can have regions with functionally and structurally normal retina. Such patients will require special consideration in future clinical trials of either focal or systemic treatment. Whether there is a common molecular mechanism shared by forms of RP with normal regions of retina warrants further study.

  6. Is My Penis Normal? (For Teens)

    Science.gov (United States)

    KidsHealth / For Teens / Is My Penis Normal? (en español: ¿Es normal mi pene?) ... any guy who's ever worried about whether his penis is a normal size. There's a fairly wide ...

  7. Normal vibrations in gallium arsenide

    International Nuclear Information System (INIS)

    Dolling, G.; Waugh, J.L.T.

    1964-01-01

    The triple axis crystal spectrometer at Chalk River has been used to observe coherent slow neutron scattering from a single crystal of pure gallium arsenide at 296 °K. The frequencies of normal modes of vibration propagating in the [ζ00], [ζζζ], and [0ζζ] crystal directions have been determined with a precision of between 1 and 2·5 per cent. A limited number of normal modes have also been studied at 95 and 184 °K. Considerable difficulty was experienced in obtaining well resolved neutron peaks corresponding to the two non-degenerate optic modes for very small wave-vector, particularly at 296 °K. However, from a comparison of results obtained under various experimental conditions at several different points in reciprocal space, frequencies (units 10¹² c/s) for these modes (at 296 °K) have been assigned: T 8·02 ± 0·08 and L 8·55 ± 0·2. Other specific normal modes, with their measured frequencies, are (a) (1,0,0): TO 7·56 ± 0·08, TA 2·36 ± 0·015, LO 7·22 ± 0·15, LA 6·80 ± 0·06; (b) (0·5, 0·5, 0·5): TO 7·84 ± 0·12, TA 1·86 ± 0·02, LO 7·15 ± 0·07, LA 6·26 ± 0·10; (c) (0, 0·65, 0·65): optic 8·08 ± 0·13, 7·54 ± 0·12 and 6·57 ± 0·11, acoustic 5·58 ± 0·08, 3·42 ± 0·06 and 2·36 ± 0·04. These results are generally slightly lower than the corresponding frequencies for germanium. An analysis in terms of various modifications of the dipole approximation model has been carried out. A feature of this analysis is that the charge on the gallium atom appears to be very small, about +0·04 e. The frequency distribution function has been derived from one of the force models. (author)

  9. Malaysian NDT standards

    International Nuclear Information System (INIS)

    Khazali Mohd Zin

    2001-01-01

    In order to become a developed country, Malaysia needs to develop her own national standards. It has been projected that by the year 2020 Malaysia will require about 8,000 standards (Department of Standards Malaysia). Currently, more than 2,000 Malaysian Standards have been gazetted by the government, which is considerably too few to meet the year-2020 target. NDT standards have been identified by the standards working group as one of the areas in which to promote our national standards. In this paper the author describes the steps taken to establish Malaysia's very own NDT standards. The project starts with the establishment of radiographic standards. (Author)

  10. Status of conversion of DOE standards to non-Government standards

    Energy Technology Data Exchange (ETDEWEB)

    Moseley, H.L.

    1992-07-01

    One major goal of the DOE Technical Standards Program is to convert existing DOE standards into non-Government standards (NGS's) where possible. This means that a DOE standard may form the basis for a standards-writing committee to produce a standard in the same subject area using the non-Government standards consensus process. This report is a summary of the activities that have evolved to effect conversion of DOE standards to NGSs, and the status of current conversion activities. In some cases, all requirements in a DOE standard will not be incorporated into the published non-Government standard because these requirements may be considered too restrictive or too specific for broader application by private industry. If requirements in a DOE standard are not incorporated in a non-Government standard and the requirements are considered necessary for DOE program applications, the DOE standard will be revised and issued as a supplement to the non-Government standard. The DOE standard will contain only those necessary requirements not reflected by the non-Government standard. Therefore, while complete conversion of DOE standards may not always be realized, the Department's technical standards policy as stated in Order 1300.2A has been fully supported in attempting to make maximum use of the non-Government standard.

  12. A Denotational Account of Untyped Normalization by Evaluation

    DEFF Research Database (Denmark)

    Filinski, Andrzej; Rohde, Henning Korsholm

    2004-01-01

    We show that the standard normalization-by-evaluation construction for the simply-typed λβη-calculus has a natural counterpart for the untyped λβ-calculus, with the central type-indexed logical relation replaced by a recursively defined invariant relation, in the style of Pitts. In fact, the cons...... proof for the normalization algorithm, expressed as a functional program in an ML-like call-by-value language. A version of this article with detailed proofs is available as a technical report [5].
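    The untyped normalization-by-evaluation idea in this record can be sketched in a few lines: interpret terms into a semantic domain (here, Python closures for functions and counter-indexed neutral terms for stuck applications), then read the value back into syntax with fresh variable names. This is an illustrative sketch in our own encoding, not the paper's ML formulation:

```python
# Untyped normalization by evaluation.
# Terms: ('var', x), ('lam', x, body), ('app', f, a)

def normalize(term):
    """Return the beta-normal form of a closed term (if one exists)."""

    def evaluate(t, env):
        if t[0] == 'var':
            return env[t[1]]
        if t[0] == 'lam':
            _, x, body = t
            return ('fun', lambda v: evaluate(body, {**env, x: v}))
        _, f, a = t                       # application
        fv, av = evaluate(f, env), evaluate(a, env)
        if fv[0] == 'fun':                # beta-reduce semantically
            return fv[1](av)
        g = fv[1]                         # stuck: build a bigger neutral
        return ('neu', lambda n: ('app', g(n), reify(av, n)))

    def reify(v, n):
        if v[0] == 'fun':                 # invent a fresh variable x{n}
            x = f'x{n}'
            var = ('neu', lambda m: ('var', x))
            return ('lam', x, reify(v[1](var), n + 1))
        return v[1](n)                    # neutral: already syntax-producing

    return reify(evaluate(term, {}), 0)

# (λy.y)(λz.z) normalizes to the identity, printed with fresh names
print(normalize(('app', ('lam', 'y', ('var', 'y')),
                 ('lam', 'z', ('var', 'z')))))   # ('lam', 'x0', ('var', 'x0'))
```

    Divergent terms simply fail to terminate, which matches the partiality the paper handles via its recursively defined invariant relation.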

  13. Striving for the unknown normal

    DEFF Research Database (Denmark)

    Nielsen, Mikka

    During the last decade, more and more people have received prescriptions for ADHD drug treatment, and simultaneously the legitimacy of the ADHD diagnosis has been heavily debated among both professionals and laymen. Based on an anthropological fieldwork among adults with ADHD, I illustrate how...... the ADHD diagnosis both answers and produces existential questions on what counts as normal behaviour and emotions. The diagnosis helps the diagnosed to identify, accept and handle problems by offering concrete explanations and solutions to diffuse experienced problems. But the diagnostic process...... is not only a clarifying procedure with a straight plan for treatment and direct effects. It is also a messy affair. In a process of experimenting with drugs and attempting to determine how or whether the medication eliminates the correct symptoms the diagnosed is put in an introspective, self...

  14. IIH with normal CSF pressures?

    Directory of Open Access Journals (Sweden)

    Soh Youn Suh

    2013-01-01

    Full Text Available Idiopathic intracranial hypertension (IIH) is a condition of raised intracranial pressure (ICP) in the absence of space occupying lesions. ICP is usually measured by lumbar puncture, and a cerebrospinal fluid (CSF) pressure above 250 mmH2O is one of the diagnostic criteria of IIH. Recently, we encountered two patients who complained of headaches and exhibited disc swelling without an increased ICP. We prescribed acetazolamide and followed both patients frequently because of the definite disc swelling with IIH-related symptoms. Symptoms and signs resolved in both patients after they started taking acetazolamide. It is generally known that an elevated ICP, as measured by lumbar puncture, is the most important diagnostic sign of IIH. However, these cases caution that, even when CSF pressure is within the normal range, suspicion should be raised when a patient has papilledema with related symptoms, since untreated papilledema may cause progressive and irreversible visual loss.

  15. Transport through hybrid superconducting/normal nanostructures

    Energy Technology Data Exchange (ETDEWEB)

    Futterer, David

    2013-01-29

    We mainly investigate transport through interacting quantum dots proximized by superconductors. For this purpose we extend an existing theory to describe transport through proximized quantum dots coupled to normal and superconducting leads. It allows us to study the influence of a strong Coulomb interaction on Andreev currents and Josephson currents. This is a particularly interesting topic because it combines two competing properties: in superconductors Cooper pairs are formed by two electrons which experience an attractive interaction while two electrons located on a quantum dot repel each other due to the Coulomb interaction. It seems at first glance that transport processes involving Cooper pairs should be suppressed because of the two competing interactions. However, it is possible to proximize the dot in nonequilibrium situations. At first, we study a setup composed of a quantum dot coupled to one normal, one ferromagnetic, and one superconducting lead in the limit of an infinitely-large superconducting gap. Within this limit the coupling between dot and superconductor is described exactly by the presented theory. It leads to the formation of Andreev-bound states (ABS) and an additional bias scheme opens in which a pure spin current, i.e. a spin current with a vanishing associated charge current, can be generated. In a second work, starting from the infinite-gap limit, we perform a systematic expansion of the superconducting gap around infinity and investigate Andreev currents and Josephson currents. This allows us to estimate the validity of infinite-gap calculations for real systems in which the superconducting gap is usually a rather small quantity. We find indications that a finite gap renormalizes the ABS and propose a resummation approach to explore the finite-gap ABS. Despite the renormalization effects the modifications of transport by finite gaps are rather small. This result lets us conclude that the infinite-gap calculation is a valuable tool to

  17. CT in normal pressure hydrocephalus

    International Nuclear Information System (INIS)

    Fujita, Katsuzo; Nogaki, Hidekazu; Noda, Masaya; Kusunoki, Tadaki; Tamaki, Norihiko

    1981-01-01

    CT scans were obtained in 33 patients (ages 31 to 73 years) with the diagnosis of normal pressure hydrocephalus. In each case, the diagnosis was made on the basis of the symptoms and the CT and cisternographic findings. Underlying diseases of normal pressure hydrocephalus were ruptured aneurysms (21 cases), arteriovenous malformations (2 cases), head trauma (1 case), cerebrovascular accident (1 case) and idiopathic (8 cases). Sixteen of 33 patients showed marked improvement; five, moderate or minimal improvement; and twelve, no change. The results were compared with CT findings and clinical response to shunting. CT findings were classified into five types, based on the degree of periventricular hypodensity (P.V.H.), the extent of brain damage by underlying diseases, and the degree of cortical atrophy. In 17 cases of type (I), CT showed the presence of P.V.H. with or without minimal frontal lobe damage and no cortical atrophy; good surgical improvement was achieved by shunting in all cases of this type. In 4 cases of type (II), CT showed the presence of P.V.H. and severe brain damage without cortical atrophy; fair clinical improvement was achieved by shunting in 2 cases (50%). In one case of type (III), CT showed the absence of P.V.H. without brain damage or cortical atrophy; no clinical improvement was obtained by shunting in this type. In 9 cases of type (IV), with mild cortical atrophy, fair clinical improvement was achieved in two cases (22%) and no improvement in 7 cases. In 2 cases of type (V), with moderate or marked cortical atrophy, no clinical improvement was obtained by shunting. In conclusion, it appeared from the present study that there was a good correlation between the result of shunting and the CT type, and that clinical response to a shunting operation might be predicted by classification of CT findings. (author)

  18. Perron–Frobenius theorem for nonnegative multilinear forms and extensions

    OpenAIRE

    Friedland, S.; Gaubert, S.; Han, L.

    2013-01-01

    We prove an analog of the Perron–Frobenius theorem for multilinear forms with nonnegative coefficients, and more generally, for polynomial maps with nonnegative coefficients. We determine the geometric convergence rate of the power algorithm to the unique normalized eigenvector.
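    The power algorithm mentioned in this record can be sketched for an order-3 nonnegative tensor: repeatedly apply the multilinear map x ↦ T(·, x, x) and renormalize. This is a minimal illustration under the assumption of strictly positive entries (where a unique normalized fixed point exists), not the paper's general algorithm or its convergence-rate analysis; the function name and tolerances are our own:

```python
import numpy as np

def tensor_power(T, iters=200, tol=1e-10):
    """Power algorithm for the dominant nonnegative eigenvector of an
    order-3 nonnegative tensor T: iterate x <- T(., x, x), renormalize."""
    n = T.shape[0]
    x = np.full(n, 1.0 / n)               # positive starting vector
    for _ in range(iters):
        y = np.einsum('ijk,j,k->i', T, x, x)   # apply the multilinear map
        y /= y.sum()                       # renormalize (l1 norm)
        if np.abs(y - x).max() < tol:      # converged to the fixed point
            return y
        x = y
    return x

rng = np.random.default_rng(0)
T = rng.random((4, 4, 4))                  # strictly positive entries
x = tensor_power(T)
print(x)                                   # normalized positive eigenvector
```

    The geometric convergence rate established in the paper corresponds to how fast the per-iteration change shrinks in such runs.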

  19. Pudendal somatosensory evoked potentials in normal women

    Directory of Open Access Journals (Sweden)

    Geraldo A. Cavalcanti

    2007-12-01

    Full Text Available OBJECTIVE: Somatosensory evoked potential (SSEP) is an electrophysiological test used to evaluate sensory innervation in peripheral and central neuropathies. Pudendal SSEP has been studied in dysfunctions related to the lower urinary tract and pelvic floor. Although some authors have already described technical details pertaining to the method, standardization and the influence of physiological variables on normative values have not yet been established, especially for women. The aim of the study was to describe normal values of the pudendal SSEP and to compare technical details with those described by other authors. MATERIALS AND METHODS: The clitoral sensory threshold and pudendal SSEP latency were determined in 38 normal volunteers. The results obtained from stimulation performed on each side of the clitoris were compared with age, body mass index (BMI) and number of pregnancies. RESULTS: The values of the clitoral sensory threshold and P1 latency with left clitoral stimulation were, respectively, 3.64 ± 1.01 mA and 37.68 ± 2.60 ms. Results obtained with right clitoral stimulation were 3.84 ± 1.53 mA and 37.42 ± 3.12 ms, respectively. There were no correlations between the clitoral sensory threshold or P1 latency and the age, BMI or height of the volunteers. A significant difference was found in P1 latency between nulliparous women and volunteers who had previously undergone cesarean section. CONCLUSIONS: SSEP latency represents an accessible and reproducible method to investigate the afferent pathways from the genitourinary tract. These results could be used as normative values in studies involving genitourinary neuropathies in order to better clarify voiding and sexual dysfunctions in females.

  20. [Adult form of Pompe disease].

    Science.gov (United States)

    Ziółkowska-Graca, Bozena; Kania, Aleksander; Zwolińska, Grazyna; Nizankowska-Mogilnicka, Ewa

    2008-01-01

    Pompe disease (glycogen-storage disease type II) is an autosomal recessive disorder caused by a deficiency of lysosomal acid alpha-glucosidase (GAA), leading to the accumulation of glycogen in the lysosomes, primarily in muscle cells. In the adult form of the disease, proximal muscle weakness is noted and muscle volume is decreased. The infantile form is usually fatal. In the adult form of the disease the prognosis is relatively good. Muscle weakness may, however, interfere with normal daily activities, and respiratory insufficiency may be associated with obstructive sleep apnea. Death usually results from respiratory failure. Effective specific treatment is not available. Enzyme replacement therapy with recombinant human GAA (rh-GAA) still remains a research area. We report the case of a 24-year-old student admitted to the Department of Pulmonary Diseases because of severe respiratory insufficiency. Clinical symptoms such as dyspnea, muscular weakness and increased daytime sleepiness had been progressing for 2 years. Clinical examination and increased blood levels of CK suggested muscle pathology. Histopathological analysis of a muscle biopsy, performed under the electron microscope, confirmed the presence of vacuoles containing glycogen. Specific enzymatic activity of alpha-glucosidase was analyzed, confirming Pompe disease. The only effective method to treat the respiratory insufficiency was bi-level positive pressure ventilation. Respiratory rehabilitation was instituted and is still continued by the patient at home. A high-protein, low-sugar diet was proposed for the patient. Because of polyglobulia, low molecular weight heparin was prescribed. The patient is eligible for experimental replacement therapy with rh-GAA.