Normal forms in Poisson geometry
Mărcuț, I.T.
2013-01-01
The structure of Poisson manifolds is highly nontrivial even locally. The first important result in this direction is Conn's linearization theorem around fixed points. One of the main results of this thesis (Theorem 2) is a normal form theorem in Poisson geometry, which is the Poisson-geometric
A q deformation of Gell-Mann-Okubo mass formula
Bagchi, B.; Biswas, S.N.
1996-01-01
We explore the possibility of deforming the Gell-Mann-Okubo (GMO) mass formula within the framework of a quantized enveloping algebra. A small value of the deformation parameter is found to provide a good fit to the observed mass spectra of the π, K and η mesons. (author). 13 refs
The reason behind the Gell-Mann-Okubo mass formula
Souza, Mario Everaldo de
1994-01-01
The Gell-Mann-Okubo mass formula has been widely used as a phenomenological tool in particle physics but the underlying basis for it has not been known. This paper reveals its basis and generalizes the formula to SU(n) (n = 3,4,5,6). (author)
TRASYS form factor matrix normalization
Tsuyuki, Glenn T.
1992-01-01
A method has been developed for adjusting a TRASYS enclosure form factor matrix to unity. This approach is not limited to closed geometries; in fact, it is primarily intended for use with open geometries. The purpose of this approach is to prevent optimistic form factors to space. In this method, nodal form factor sums are calculated within 0.05 of unity using TRASYS (although deviations as large as 0.10 may be acceptable), and then a process is employed to distribute the difference amongst the nodes. A specific example has been analyzed with this method, and a comparison was performed with a standard approach for calculating radiation conductors. In this comparison, hot and cold case temperatures were determined. Exterior nodes exhibited temperature differences as large as 7 C and 3 C for the hot and cold cases, respectively, when compared with the standard approach, while interior nodes demonstrated temperature differences from 0 C to 5 C. These results indicate that temperature predictions can be artificially biased if the form factor computation error is lumped into the individual form factors to space.
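The adjustment described above (bringing each nodal form factor sum to unity by spreading the closure error over the row's entries) can be sketched in a few lines. The proportional-distribution rule and the function name below are illustrative assumptions, not the actual TRASYS procedure:

```python
def normalize_row(factors, tol=0.05):
    """Scale one node's form factors so the row sums to exactly 1.0.

    The residual (1 - sum) is distributed in proportion to each factor's
    magnitude; rows whose sum already deviates by more than `tol` are
    rejected rather than silently repaired.
    """
    s = sum(factors)
    if abs(1.0 - s) > tol:
        raise ValueError(f"row sum {s:.3f} deviates by more than {tol}")
    return [f + (1.0 - s) * (f / s) for f in factors]

row = [0.40, 0.35, 0.22]               # sums to 0.97, within the 0.05 band
adjusted = normalize_row(row)
print(round(sum(adjusted), 12))        # 1.0
```

Distributing the residual proportionally (rather than lumping it all into the factor to space) is precisely what avoids the biased exterior-node temperatures reported above.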
Normal form theory and spectral sequences
Sanders, Jan A.
2003-01-01
The concept of unique normal form is formulated in terms of a spectral sequence. As an illustration of this technique some results of Baider and Churchill concerning the normal form of the anharmonic oscillator are reproduced. The aim of this paper is to show that spectral sequences give us a natural framework in which to formulate normal form theory. © 2003 Elsevier Science (USA). All rights reserved.
A Recursive Approach to Compute Normal Forms
HSU, L.; MIN, L. J.; FAVRETTO, L.
2001-06-01
Normal forms are instrumental in the analysis of dynamical systems described by ordinary differential equations, particularly when singularities close to a bifurcation are to be characterized. However, the computation of a normal form up to an arbitrary order is numerically hard. This paper focuses on the computer programming of some recursive formulas developed earlier to compute higher order normal forms. A computer program to reduce the system to its normal form on a center manifold is developed using the Maple symbolic language. However, it should be stressed that the program relies essentially on recursive numerical computations, while symbolic calculations are used only for minor tasks. Some strategies are proposed to save computation time. Examples are presented to illustrate the application of the program to obtain high order normalization or to handle systems with large dimension.
Nonlinear dynamics exploration through normal forms
Kahn, Peter B
2014-01-01
Geared toward advanced undergraduates and graduate students, this exposition covers the method of normal forms and its application to ordinary differential equations through perturbation analysis. In addition to its emphasis on the freedom inherent in the normal form expansion, the text features numerous examples of equations, the kind of which are encountered in many areas of science and engineering. The treatment begins with an introduction to the basic concepts underlying the normal forms. Coverage then shifts to an investigation of systems with one degree of freedom that model oscillations
Normal forms of Hopf-zero singularity
Gazor, Majid; Mokhtari, Fahimeh
2015-01-01
The Lie algebra generated by Hopf-zero classical normal forms is decomposed into two versal Lie subalgebras. Some dynamical properties for each subalgebra are described; one is the set of all volume-preserving conservative systems while the other is the maximal Lie algebra of nonconservative systems. This introduces a unique conservative–nonconservative decomposition for the normal form systems. There exists a Lie subalgebra that is Lie-isomorphic to a large family of vector fields with Bogdanov–Takens singularity. This leads to the conclusion that the local dynamics of formal Hopf-zero singularities are well understood through the study of Bogdanov–Takens singularities. Despite this, the normal form computations of Bogdanov–Takens and Hopf-zero singularities are independent. Thus, by assuming a quadratic nonzero condition, complete results on the simplest Hopf-zero normal forms are obtained in terms of the conservative–nonconservative decomposition. Some practical formulas are derived and the results implemented using Maple. The method has been applied to the Rössler and Kuramoto–Sivashinsky equations to demonstrate the applicability of our results. (paper)
Normal equivariant forms of vector fields
Sanchez Bringas, F.
1992-07-01
We prove a linearization theorem of Siegel type and a normal form theorem of Poincaré-Dulac type for germs of holomorphic vector fields at the origin of C^2 that are Γ-equivariant, where Γ is a finite subgroup of GL(2,C). (author). 5 refs
Normal form for mirror machine Hamiltonians
Dragt, A.J.; Finn, J.M.
1979-01-01
A systematic algorithm is developed for performing canonical transformations on Hamiltonians which govern particle motion in magnetic mirror machines. These transformations are performed in such a way that the new Hamiltonian has a particularly simple normal form. From this form it is possible to compute analytic expressions for gyro and bounce frequencies. In addition, it is possible to obtain arbitrarily high order terms in the adiabatic magnetic moment expansion. The algorithm makes use of Lie series, is an extension of Birkhoff's normal form method, and has been explicitly implemented by a digital computer programmed to perform the required algebraic manipulations. Application is made to particle motion in a magnetic dipole field and to a simple mirror system. Bounce frequencies and locations of periodic orbits are obtained and compared with numerical computations. Both mirror systems are shown to be insoluble, i.e., trajectories are not confined to analytic hypersurfaces, there is no analytic third integral of motion, and the adiabatic magnetic moment expansion is divergent. It is expected also that the normal form procedure will prove useful in the study of island structure and separatrices associated with periodic orbits, and should facilitate studies of breakdown of adiabaticity and the onset of "stochastic" behavior.
The Okubo-Zweig-Iizuka rule and unitarity
Ruuskanen, V.; Toernqvist, N.A.
1977-01-01
Using an explicitly unitary framework, the Okubo-Zweig-Iizuka (OZI) rule is discussed, in particular how this rule is compatible with unitarity. For the phi-ω case, unitarity effects (phi → K anti-K → ω) contribute a (nearly) imaginary mixing of 0.6 ± 0.1% and thus cannot account for the whole mixing of 6-10%. For the f'-f mixing, unitarity effects give a much larger value (>6.8%). In order to understand the small experimental f' → 2π coupling, the process f' → f → 2π must be cancelled by another process, e.g. f' → f^(2) → 2π, where f^(2) is a heavy f-like meson. For the psions above the first important charm threshold, unitarity effects are likely to be crucial. At very high energies cancellations can suppress the unitarity effects. But in a transient energy interval (in particular between the D* anti-D* and the D* anti-D** thresholds) these cancellations cannot work everywhere because mass differences are important. Therefore (if charm annihilation amplitudes near threshold are not negligibly small), it is expected that in this interval unitarity effects should be the dominant mechanism for the breaking of the OZI rule. Results from a conventional mass matrix mixing analysis are also given in the appendix. (author)
The Okubo-Zweig-Iizuka rule and dual unitarization
Ninomiya, Kansuke; Toyoda, Fumihiko.
1978-01-01
The Okubo-Zweig-Iizuka (OZI) rule carries an important role in the constituent rearrangement and interest in the rule has been increased by the discovery of the J/psi particle. An example of phenomenological analysis related to the OZI rule is introduced. The decays of psi particles can be explained consistently by an existence of the disconnected type of a constituent rearrangement diagram (CRD) with comparable strength to that of the connected one. A survey of an outline of the dual unitarization scheme and discussion of the OZI rule with the scheme and other related theory are presented. The method of dual unitarization starts with the unitarity relation. In the estimation of multiparticle production amplitude in the unitarity sum, dominant nondiffractive production component is taken as a first approximation and the amplitude is described by the multiperipheral model of Reggeon exchange. Here, a case of meson-meson scattering in flavor of SU(N) scheme is considered. A survey of the mechanism of the planar bootstrap and Pomeron generation in the dual unitarization is made. The nonplanar CR diagrams give breaking of various features at the planar level. The reaction with the disconnected CRD could occur in the nonperipheral region through the hard collision of constituent and appears at large momentum transfers in comparable order with a connected one. Baryon and baryoniums with the scheme of the dual unitarization are examined. (Kato, T.)
Test of the Okubo-Zweig-Iizuka rule in phi production
Etkin, A.; Foley, K.J.; Goldman, J.H.; Love, W.A.; Morris, T.W.; Ozaki, S.; Platner, E.D.; Saulys, A.C.; Wheeler, C.D.; Willen, E.H.; Lindenbaum, S.J.; Kramer, M.A.; Mallik, U.
1978-01-01
We have measured the reaction π- p → K+ K- K+ K- n at 22.6 GeV/c and detect strong phi signals in the K+ K- effective-mass plots. We do not observe the expected Okubo-Zweig-Iizuka-rule suppression of the phi phi n final state and conclude that the rule is working poorly in the observed production processes.
AFP Algorithm and a Canonical Normal Form for Horn Formulas
Majdoddin, Ruhollah
2014-01-01
AFP Algorithm is a learning algorithm for Horn formulas. We show that performing more than just one refinement after each negative counterexample does not improve the complexity of the AFP Algorithm. Moreover, a canonical normal form for Horn formulas is presented, and it is proved that the output formula of the AFP Algorithm is in this normal form.
An Algorithm for Higher Order Hopf Normal Forms
A.Y.T. Leung
1995-01-01
Normal form theory is important for studying the qualitative behavior of nonlinear oscillators. In some cases, higher order normal forms are required to understand the dynamic behavior near an equilibrium or a periodic orbit. However, the computation of high-order normal forms is usually quite complicated. This article provides an explicit formula for the normalization of nonlinear differential equations. The higher order normal form is given explicitly. Illustrative examples include a cubic system, a quadratic system and a Duffing–Van der Pol system. We use exact arithmetic and find that the undamped Duffing equation can be represented by an exact polynomial differential amplitude equation in a finite number of terms.
Normal form and synchronization of strict-feedback chaotic systems
Wang, Feng; Chen, Shihua; Yu Minghai; Wang Changping
2004-01-01
This study concerns the normal form and synchronization of strict-feedback chaotic systems. We prove that any strict-feedback chaotic system can be rendered into a normal form with an invertible transform, and then a design procedure to synchronize the normal form of a non-autonomous strict-feedback chaotic system is presented. This approach needs only a scalar driving signal to realize synchronization, no matter how many dimensions the chaotic system contains. Furthermore, the Roessler chaotic system is taken as a concrete example to illustrate the procedure of design without transforming a strict-feedback chaotic system into its normal form. Numerical simulations are also provided to show the effectiveness and feasibility of the developed methods.
Normal form of linear systems depending on parameters
Nguyen Huynh Phan.
1995-12-01
In this paper we resolve completely the problem to find normal forms of linear systems depending on parameters for the feedback action that we have studied for the special case of controllable linear systems. (author). 24 refs
Volume-preserving normal forms of Hopf-zero singularity
Gazor, Majid; Mokhtari, Fahimeh
2013-01-01
A practical method is described for computing the unique generator of the algebra of first integrals associated with a large class of Hopf-zero singularity. The set of all volume-preserving classical normal forms of this singularity is introduced via a Lie algebra description. This is a maximal vector space of classical normal forms with first integral; this is why our approach works. Systems with a nonzero condition on their quadratic parts are considered. The algebra of all first integrals for any such system has a unique (modulo scalar multiplication) generator. The infinite level volume-preserving parametric normal forms of any nondegenerate perturbation within the Lie algebra of any such system is computed, where it can have rich dynamics. The associated unique generator of the algebra of first integrals is derived. The symmetry group of the infinite level normal forms is also discussed. Some necessary formulas are derived and applied to appropriately modified Rössler and generalized Kuramoto–Sivashinsky equations to demonstrate the applicability of our theoretical results. An approach (introduced by Iooss and Lombardi) is applied to find an optimal truncation for the first level normal forms of these examples with exponentially small remainders. The numerically suggested radius of convergence (for the first integral) associated with a hypernormalization step is discussed for the truncated first level normal forms of the examples. This is achieved by an efficient implementation of the results using Maple. (paper)
Phi photoproduction near threshold with Okubo-Zweig-Iizuka evading phi NN interactions
William, R A
1998-01-01
Existing intermediate and high energy phi-photoproduction data are consistent with purely diffractive production (i.e., Pomeron exchange). However, near threshold (1.574 GeV K+ K- decay angular distribution. We stress the importance of measurements with linearly polarized photons near the phi threshold to separate natural and unnatural parity exchange mechanisms. Approved and planned phi photoproduction and electroproduction experiments at Jefferson Lab will help establish the relative dynamical contributions near threshold and clarify outstanding theoretical issues related to apparent Okubo-Zweig-Iizuka violations.
Utilizing Nested Normal Form to Design Redundancy Free JSON Schemas
Wai Yin Mok
2016-12-01
JSON (JavaScript Object Notation) is a lightweight data-interchange format for the Internet. JSON is built on two structures: (1) a collection of name/value pairs and (2) an ordered list of values (http://www.json.org/). Because of this simple approach, JSON is easy to use and it has the potential to be the data interchange format of choice for the Internet. Similar to XML, JSON schemas allow nested structures to model hierarchical data. As data interchange over the Internet increases exponentially due to cloud computing or otherwise, redundancy free JSON data are an attractive form of communication because they improve the quality of data communication through eliminating update anomalies. Nested Normal Form, a normal form for hierarchical data, is a precise characterization of redundancy. A nested table, or a hierarchical schema, is in Nested Normal Form if and only if it is free of redundancy caused by multivalued and functional dependencies. Using Nested Normal Form as a guide, this paper introduces a JSON schema design methodology that begins with UML use case diagrams, communication diagrams and class diagrams that model a system under study. Based on the use cases' execution frequencies and the data passed between involved parties in the communication diagrams, the proposed methodology selects classes from the class diagrams to be the roots of JSON scheme trees and repeatedly adds classes from the class diagram to the scheme trees as long as the schemas satisfy Nested Normal Form. This process continues until all of the classes in the class diagram have been added to some JSON scheme trees.
Normal Forms for Fuzzy Logics: A Proof-Theoretic Approach
Cintula, Petr; Metcalfe, G.
2007-01-01
Vol. 46, No. 5-6 (2007), pp. 347-363. ISSN 1432-0665. R&D project: GA MŠk(CZ) 1M0545. Institutional research plan: CEZ:AV0Z10300504. Keywords: fuzzy logic; normal form; proof theory; hypersequents. Subject: general mathematics. Impact factor: 0.620 (2007)
A New One-Pass Transformation into Monadic Normal Form
Danvy, Olivier
2003-01-01
We present a translation from the call-by-value λ-calculus to monadic normal forms that includes short-cut boolean evaluation. The translation is higher-order, operates in one pass, duplicates no code, generates no chains of thunks, and is properly tail recursive. It makes a crucial use of symbolic...
Automatic identification and normalization of dosage forms in drug monographs
2012-01-01
Background: Each day, millions of health consumers seek drug-related information on the Web. Despite some efforts in linking related resources, drug information is largely scattered across a wide variety of websites of different quality and credibility. Methods: As a step toward providing users with integrated access to multiple trustworthy drug resources, we aim to develop a method capable of identifying a drug's dosage form information in addition to drug name recognition. We developed rules and patterns for identifying dosage forms from different sections of full-text drug monographs, and subsequently normalized them to standardized RxNorm dosage forms. Results: Our method represents a significant improvement compared with a baseline lookup approach, achieving overall macro-averaged Precision of 80%, Recall of 98%, and F-Measure of 85%. Conclusions: We successfully developed an automatic approach for drug dosage form identification, which is critical for building links between different drug-related resources. PMID:22336431
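The rule-and-pattern pipeline the abstract describes (match a dosage-form phrase, then map it to a standardized name) can be sketched as follows. The synonym map and regular expression here are illustrative assumptions, not the paper's rule set; real RxNorm dosage-form names and monograph patterns are far richer:

```python
import re

# Illustrative synonym map -- hypothetical, not taken from the paper.
RXNORM_MAP = {
    "tablet": "Oral Tablet", "tab": "Oral Tablet",
    "capsule": "Oral Capsule", "cap": "Oral Capsule",
    "solution": "Oral Solution",
}
PATTERN = re.compile(r"\b(tablets?|tabs?|capsules?|caps?|solutions?)\b", re.I)

def normalize_dosage_form(text):
    """Return a standardized dosage-form name, or None if none is found."""
    match = PATTERN.search(text)
    if match is None:
        return None
    key = match.group(1).lower().rstrip("s")   # crude singularization
    return RXNORM_MAP.get(key)

print(normalize_dosage_form("Supplied as 50 mg tablets for oral use"))
# Oral Tablet
```

A dictionary lookup alone is the "baseline" the paper outperforms; the gain comes from section-aware patterns, which this sketch omits.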
Fast Bitwise Implementation of the Algebraic Normal Form Transform
Bakoev, Valentin
2017-01-01
The representation of Boolean functions by their algebraic normal forms (ANFs) is very important for cryptography, coding theory and other scientific areas. The ANFs are used in computing the algebraic degree of S-boxes, some other cryptographic criteria and parameters of error-correcting codes. Their applications require these criteria and parameters to be computed by fast algorithms. Hence the corresponding ANFs should also be obtained by fast algorithms. Here we continue o...
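The ANF (Zhegalkin) coefficients of a Boolean function can be obtained from its truth table with the classic butterfly (Möbius) transform. The sketch below is a plain-Python illustration of that standard algorithm, not the paper's optimized bitwise implementation:

```python
def anf_transform(truth_table):
    """Möbius (butterfly) transform: truth table -> ANF coefficients.

    `truth_table` lists f(x) for x = 0 .. 2**n - 1; the result lists the
    coefficients of the Zhegalkin polynomial in the same index order.
    Runs in O(n * 2**n) XOR operations.
    """
    c = list(truth_table)
    step = 1
    while step < len(c):
        for block in range(0, len(c), 2 * step):
            for j in range(block + step, block + 2 * step):
                c[j] ^= c[j - step]
        step *= 2
    return c

# f(x1, x2) = x1 XOR x2 -> coefficients [0, 1, 1, 0]: no constant term,
# both linear terms present, no quadratic term.
print(anf_transform([0, 1, 1, 0]))   # [0, 1, 1, 0]
print(anf_transform([0, 0, 0, 1]))   # [0, 0, 0, 1]  (f = x1*x2)
```

Fast bitwise versions pack the truth table into machine words and apply the same butterfly with shifts and masks, which is what makes the criteria computations the abstract mentions practical for large n.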
A New Normal Form for Multidimensional Mode Conversion
Tracy, E. R.; Richardson, A. S.; Kaufman, A. N.; Zobin, N.
2007-01-01
Linear conversion occurs when two wave types, with distinct polarization and dispersion characteristics, are locally resonant in a nonuniform plasma [1]. In recent work, we have shown how to incorporate a ray-based (WKB) approach to mode conversion in numerical algorithms [2,3]. The method uses the ray geometry in the conversion region to guide the reduction of the full NxN-system of wave equations to a 2x2 coupled pair which can be solved and matched to the incoming and outgoing WKB solutions. The algorithm in [2] assumes the ray geometry is hyperbolic and that, in ray phase space, there is an "avoided crossing", which is the most common type of conversion. Here, we present a new formulation that can deal with more general types of conversion [4]. This formalism is based upon the fact (first proved in [5]) that it is always possible to put the 2x2 wave equation into a "normal" form, such that the diagonal elements of the dispersion matrix Poisson-commute with the off-diagonals (at leading order). Therefore, if we use the diagonals (rather than the eigenvalues or the determinant) of the dispersion matrix as ray Hamiltonians, the off-diagonals will be conserved quantities. When cast into normal form, the 2x2 dispersion matrix has a very natural physical interpretation: the diagonals are the uncoupled ray hamiltonians and the off-diagonals are the coupling. We discuss how to incorporate the normal form into ray tracing algorithms.
Normalization Of Thermal-Radiation Form-Factor Matrix
Tsuyuki, Glenn T.
1994-01-01
Report describes algorithm that adjusts form-factor matrix in TRASYS computer program, which calculates intraspacecraft radiative interchange among various surfaces and environmental heat loading from sources such as sun.
Nandi, S.
1977-08-01
We propose simple scaling laws for the Okubo-Zweig-Iizuka violating decays and the inclusive productions of hidden flavour vector mesons. These laws are in good agreement with the available data on phi, PSI and PSI'. Assuming the recently observed bumps at approximately 9.44 GeV (UPSILON) and approximately 10.17 GeV (UPSILON') to be due to some new hidden flavour vector mesons (such as t anti t and/or b anti b), these scaling laws are used to estimate the direct hadronic decay widths and the inclusive yields of UPSILON and UPSILON'. (orig.)
Diagonalization and Jordan Normal Form--Motivation through "Maple"(R)
Glaister, P.
2009-01-01
Following an introduction to the diagonalization of matrices, one of the more difficult topics for students to grasp in linear algebra is the concept of Jordan normal form. In this note, we show how the important notions of diagonalization and Jordan normal form can be introduced and developed through the use of the computer algebra package…
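The distinction the note motivates can already be seen in the 2x2 case, sketched here in Python rather than Maple: a repeated eigenvalue with a deficient eigenspace forces a Jordan block with an off-diagonal 1. The helper below is a hedged illustration limited to real eigenvalues, not a general Jordan-form algorithm:

```python
import math

def jordan_2x2(a, b, c, d):
    """Jordan normal form of [[a, b], [c, d]] with real eigenvalues.

    Distinct eigenvalues give a diagonal matrix; a repeated eigenvalue
    gives either lam*I (if the matrix is already scalar) or a single
    2x2 Jordan block with a 1 above the diagonal.
    """
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det           # discriminant of the char. poly.
    if disc > 0:                       # two distinct real eigenvalues
        r = math.sqrt(disc)
        return [[(tr + r) / 2, 0], [0, (tr - r) / 2]]
    if disc == 0:                      # repeated eigenvalue lam = tr/2
        lam = tr / 2
        if b == 0 and c == 0:          # scalar matrix: already diagonal
            return [[lam, 0], [0, lam]]
        return [[lam, 1], [0, lam]]    # defective: one Jordan block
    raise ValueError("complex eigenvalues are outside this sketch")

print(jordan_2x2(3, 1, 0, 3))   # [[3.0, 1], [0, 3.0]] -- not diagonalizable
print(jordan_2x2(2, 0, 1, 5))   # [[5.0, 0], [0, 2.0]]
```

The first example is the classic shear matrix: its only eigenvector direction is (1, 0), so no change of basis can diagonalize it, which is exactly the conceptual hurdle the note addresses.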
On the relationship between LTL normal forms and Büchi automata
Li, Jianwen; Pu, Geguang; Zhang, Lijun
2013-01-01
In this paper, we revisit the problem of translating LTL formulas to Büchi automata. We first translate the given LTL formula into a special disjunctive normal form (DNF). The formula will be part of the state, and its DNF normal form specifies the atomic properties that should hold immediately...
Normal forms of invariant vector fields under a finite group action
Sanchez Bringas, F.
1992-07-01
Let Γ be a finite subgroup of GL(n,C). This subgroup acts on the space of germs of holomorphic vector fields vanishing at the origin in C^n. We prove a theorem of invariant conjugation to a normal form and linearization for the subspace of invariant elements, and we give a description of these normal forms in dimension n=2. (author)
Normal forms for Poisson maps and symplectic groupoids around Poisson transversals.
Frejlich, Pedro; Mărcuț, Ioan
2018-01-01
Poisson transversals are submanifolds in a Poisson manifold which intersect all symplectic leaves transversally and symplectically. In this communication, we prove a normal form theorem for Poisson maps around Poisson transversals. A Poisson map pulls a Poisson transversal back to a Poisson transversal, and our first main result states that simultaneous normal forms exist around such transversals, for which the Poisson map becomes transversally linear, and intertwines the normal form data of the transversals. Our second result concerns symplectic integrations. We prove that a neighborhood of a Poisson transversal is integrable exactly when the Poisson transversal itself is integrable, and in that case we prove a normal form theorem for the symplectic groupoid around its restriction to the Poisson transversal, which puts all structure maps in normal form. We conclude by illustrating our results with examples arising from Lie algebras.
Application of normal form methods to the analysis of resonances in particle accelerators
Davies, W.G.
1992-01-01
The transformation to normal form in a Lie-algebraic framework provides a very powerful method for identifying and analysing non-linear behaviour and resonances in particle accelerators. The basic ideas are presented and illustrated. (author). 4 refs
On some hypersurfaces with time like normal bundle in pseudo Riemannian space forms
Kashani, S.M.B.
1995-12-01
In this work we classify immersed hypersurfaces with constant sectional curvature in pseudo-Riemannian space forms if the normal bundle is timelike and the mean curvature is constant. (author). 9 refs
Faria, T.; Magalhaes, L. T.
The paper addresses, for retarded functional differential equations (FDEs), the computation of normal forms associated with the flow on a finite-dimensional invariant manifold tangent to invariant spaces for the infinitesimal generator of the linearized equation at a singularity. A phase space appropriate to the computation of these normal forms is introduced, and adequate nonresonance conditions for the computation of the normal forms are derived. As an application, the general situation of Bogdanov-Takens singularity and its versal unfolding for scalar retarded FDEs with nondegeneracy at second order is considered, both in the general case and in the case of differential-delay equations of the form ẋ(t) = f(x(t), x(t-1)).
Cho, Min-Jeong; Hallac, Rami R; Ramesh, Jananie; Seaward, James R; Hermann, Nuno V; Darvann, Tron A; Lipira, Angelo; Kane, Alex A
2018-03-01
Restoring craniofacial symmetry is an important objective in the treatment of many craniofacial conditions. Normal form has been measured using anthropometry, cephalometry, and photography, yet all of these modalities have drawbacks. In this study, the authors define normal pediatric craniofacial form and craniofacial asymmetry using stereophotogrammetric images, which capture a densely sampled set of points on the form. After institutional review board approval, normal, healthy children (n = 533) with no known craniofacial abnormalities were recruited at well-child visits to undergo full head stereophotogrammetric imaging. The children's ages ranged from 0 to 18 years. A symmetric three-dimensional template was registered and scaled to each individual scan using 25 manually placed landmarks. The template was deformed to each subject's three-dimensional scan using a thin-plate spline algorithm and closest point matching. Age-based normal facial models were derived. Mean facial asymmetry and statistical characteristics of the population were calculated. The mean head asymmetry across all pediatric subjects was 1.5 ± 0.5 mm (range, 0.46 to 4.78 mm), and the mean facial asymmetry was 1.2 ± 0.6 mm (range, 0.4 to 5.4 mm). There were no significant differences in the mean head or facial asymmetry with age, sex, or race. Understanding the "normal" form and baseline distribution of asymmetry is an important anthropomorphic foundation. The authors present a method to quantify normal craniofacial form and baseline asymmetry in a large pediatric sample. The authors found that the normal pediatric craniofacial form is asymmetric, and does not change in magnitude with age, sex, or race.
A normal form approach to the theory of nonlinear betatronic motion
Bazzani, A.; Todesco, E.; Turchetti, G.; Servizi, G.
1994-01-01
The betatronic motion of a particle in a circular accelerator is analysed using the transfer map description of the magnetic lattice. In the linear case the transfer matrix approach is shown to be equivalent to the Courant-Snyder theory: In the normal coordinates' representation the transfer matrix is a pure rotation. When the nonlinear effects due to the multipolar components of the magnetic field are taken into account, a similar procedure is used: a nonlinear change of coordinates provides a normal form representation of the map, which exhibits explicit symmetry properties depending on the absence or presence of resonance relations among the linear tunes. The use of normal forms is illustrated in the simplest but significant model of a cell with a sextupolar nonlinearity which is described by the quadratic Henon map. After recalling the basic theoretical results in Hamiltonian dynamics, we show how the normal forms describe the different topological structures of phase space such as KAM tori, chains of islands and chaotic regions; a critical comparison with the usual perturbation theory for Hamilton equations is given. The normal form theory is applied to compute the tune shift and deformation of the orbits for the lattices of the SPS and LHC accelerators, and scaling laws are obtained. Finally, the correction procedure of the multipolar errors of the LHC, based on the analytic minimization of the tune shift computed via the normal forms, is described and the results for a model of the LHC are presented. This application, relevant for the lattice design, focuses on the advantages of normal forms with respect to tracking when parametric dependences have to be explored. (orig.)
SYNTHESIS METHODS OF ALGEBRAIC NORMAL FORM OF MANY-VALUED LOGIC FUNCTIONS
A. V. Sokolov
2016-01-01
The rapid development of methods of error-correcting coding, cryptography, and signal synthesis theory based on the principles of many-valued logic determines the need for a more detailed study of the forms of representation of functions of many-valued logic. In particular, the algebraic normal form of Boolean functions, also known as the Zhegalkin polynomial, which captures many cryptographic properties of Boolean functions, is widely used. In this article, we formalize the notion of algebraic normal form for many-valued logic functions. We develop a fast method of synthesis of the algebraic normal form of 3-functions and 5-functions that works similarly to the Reed-Muller transform for Boolean functions: on the basis of recurrently synthesized transform matrices. We propose a hypothesis which determines the rules of synthesis of these matrices for the transformation from the truth table to the coefficients of the algebraic normal form, and the inverse transform, for any given number of variables of 3-functions or 5-functions. The article also introduces the definition of the algebraic degree of nonlinearity of functions of many-valued logic and of S-boxes based on the principles of many-valued logic. The methods of synthesis of the algebraic normal form of 3-functions are then applied to the known recurrent construction of S-boxes of length N = 3^k, and their algebraic degrees of nonlinearity are computed. The results could be the basis for further theoretical research and practical applications such as the development of new cryptographic primitives, error-correcting codes, data compression algorithms, signal structures, and algorithms of block and stream encryption, all based on the promising principles of many-valued logic. In addition, the fast method of synthesis of the algebraic normal form of many-valued logic functions is the basis for their software and hardware implementation.
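For the binary case that the abstract generalizes, the algebraic normal form (Zhegalkin polynomial) is obtained from the truth table by the Reed-Muller (binary Möbius) butterfly transform. A minimal sketch of that Boolean baseline; the function name is ours:

```python
# Algebraic normal form (Zhegalkin polynomial) of a Boolean function
# via the Reed-Muller transform: an in-place XOR butterfly over GF(2).
def anf_coefficients(truth_table):
    """Map a truth table (length 2^n, entries 0/1) to ANF coefficients.

    The coefficient at index m belongs to the monomial formed by the
    variables selected by the set bits of m; the transform is an
    involution, so applying it twice returns the truth table.
    """
    c = list(truth_table)
    n = len(c)
    assert n and n & (n - 1) == 0, "length must be a power of two"
    step = 1
    while step < n:
        for block in range(0, n, 2 * step):
            for i in range(block, block + step):
                c[i + step] ^= c[i]   # XOR butterfly, i.e. addition mod 2
        step *= 2
    return c

# XOR of two variables: f = x1 + x2 (mod 2), no constant or x1*x2 term
print(anf_coefficients([0, 1, 1, 0]))  # -> [0, 1, 1, 0]
```

The 3- and 5-valued methods of the article replace the XOR butterfly by recurrently synthesized transform matrices over GF(3) and GF(5).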
Reconstruction of normal forms by learning informed observation geometries from data.
Yair, Or; Talmon, Ronen; Coifman, Ronald R; Kevrekidis, Ioannis G
2017-09-19
The discovery of physical laws consistent with empirical observations is at the heart of (applied) science and engineering. These laws typically take the form of nonlinear differential equations depending on parameters; dynamical systems theory provides, through the appropriate normal forms, an "intrinsic" prototypical characterization of the types of dynamical regimes accessible to a given model. Using an implementation of data-informed geometry learning, we directly reconstruct the relevant "normal forms": a quantitative mapping from empirical observations to prototypical realizations of the underlying dynamics. Interestingly, the state variables and the parameters of these realizations are inferred from the empirical observations; without prior knowledge or understanding, they parametrize the dynamics intrinsically without explicit reference to fundamental physical quantities.
Closed-form confidence intervals for functions of the normal mean and standard deviation.
Donner, Allan; Zou, G Y
2012-08-01
Confidence interval methods for a normal mean and standard deviation are well known and simple to apply. However, the same cannot be said for important functions of these parameters. These functions include the normal distribution percentiles, the Bland-Altman limits of agreement, the coefficient of variation and Cohen's effect size. We present a simple approach to this problem by using variance estimates recovered from confidence limits computed for the mean and standard deviation separately. All resulting confidence intervals have closed forms. Simulation results demonstrate that this approach performs very well for limits of agreement, coefficients of variation and their differences.
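The recovered-variance idea in the abstract can be sketched for a function of the form mu + c*sigma (c = 1.645 gives the 95th normal percentile; c = 1.96 the upper Bland-Altman limit of agreement). This is our illustration of the general approach, not the authors' code, and the function names are ours:

```python
# Hedged sketch: closed-form CI for theta = mu + c*sigma (c >= 0) built
# from the separate t-based CI for mu and chi-square-based CI for sigma.
import math
from scipy.stats import chi2, t

def mover_ci(xbar, s, n, c, alpha=0.05):
    """Closed-form confidence interval for mu + c*sigma, c >= 0."""
    tq = t.ppf(1 - alpha / 2, n - 1)
    l_mu = xbar - tq * s / math.sqrt(n)
    u_mu = xbar + tq * s / math.sqrt(n)
    l_sd = s * math.sqrt((n - 1) / chi2.ppf(1 - alpha / 2, n - 1))
    u_sd = s * math.sqrt((n - 1) / chi2.ppf(alpha / 2, n - 1))
    theta = xbar + c * s
    # recover a variance estimate from the distance of each limit to its
    # point estimate, then combine as for a sum of independent parts
    lower = theta - math.sqrt((xbar - l_mu) ** 2 + c ** 2 * (s - l_sd) ** 2)
    upper = theta + math.sqrt((u_mu - xbar) ** 2 + c ** 2 * (u_sd - s) ** 2)
    return lower, upper

# 95% CI for the 95th percentile, sample with xbar = 100, s = 15, n = 25
print(mover_ci(100.0, 15.0, 25, 1.645))
```

Everything is in closed form; no resampling or iteration is needed, which is the point of the paper.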
On the construction of the Kolmogorov normal form for the Trojan asteroids
Gabern, F; Locatelli, U
2004-01-01
In this paper we focus on the stability of the Trojan asteroids for the planar Restricted Three-Body Problem (RTBP), by extending the usual techniques for the neighbourhood of an elliptic point to derive results in a larger vicinity. Our approach is based on the numerical determination of the frequencies of the asteroid and the effective computation of the Kolmogorov normal form for the corresponding torus. This procedure has been applied to the first 34 Trojan asteroids of the IAU Asteroid Catalog, and it has worked successfully for 23 of them. The construction of this normal form allows for computer-assisted proofs of stability. To show it, we have implemented a proof of existence of families of invariant tori close to a given asteroid, for a high order expansion of the Hamiltonian. This proof has been successfully applied to three Trojan asteroids.
Generating All Permutations by Context-Free Grammars in Chomsky Normal Form
Asveld, P.R.J.; Spoto, F.; Scollo, Giuseppe; Nijholt, Antinus
2003-01-01
Let $L_n$ be the finite language of all $n!$ strings that are permutations of $n$ different symbols ($n\geq 1$). We consider context-free grammars $G_n$ in Chomsky normal form that generate $L_n$. In particular we study a few families $\{G_n\}_{n\geq 1}$, satisfying $L(G_n)=L_n$ for $n\geq 1$, with
Generating all permutations by context-free grammars in Chomsky normal form
Asveld, P.R.J.
2006-01-01
Let $L_n$ be the finite language of all $n!$ strings that are permutations of $n$ different symbols ($n\geq1$). We consider context-free grammars $G_n$ in Chomsky normal form that generate $L_n$. In particular we study a few families $\{G_n\}_{n\geq1}$, satisfying $L(G_n)=L_n$ for $n\geq1$, with
Generating All Permutations by Context-Free Grammars in Chomsky Normal Form
Asveld, P.R.J.
2004-01-01
Let $L_n$ be the finite language of all $n!$ strings that are permutations of $n$ different symbols ($n\geq 1$). We consider context-free grammars $G_n$ in Chomsky normal form that generate $L_n$. In particular we study a few families $\{G_n\}_{n\geq1}$, satisfying $L(G_n)=L_n$ for $n\geq 1$, with
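The grammar families $\{G_n\}$ are left to the papers; as a hypothetical illustration of what "a CNF grammar generating $L_n$" means, here is one grammar for $L_2 = \{ab, ba\}$ together with a generic CYK recognizer (in Chomsky normal form every rule is A -> BC or A -> a, which is exactly what CYK needs):

```python
# Illustration (our construction, not Asveld's): a CNF grammar for
# L_2 = {ab, ba} and a generic CYK membership test.
def cyk(word, rules, start="S"):
    """CYK recognizer; rules maps each nonterminal to its bodies."""
    n = len(word)
    if n == 0:
        return False
    # table[i][l] = nonterminals deriving the substring word[i : i+l+1]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(word):
        table[i][0] = {a for a, bodies in rules.items() if (ch,) in bodies}
    for l in range(1, n):            # substring length minus one
        for i in range(n - l):
            for k in range(l):       # split point inside the substring
                for a, bodies in rules.items():
                    for body in bodies:
                        if len(body) == 2 and body[0] in table[i][k] \
                                and body[1] in table[i + k + 1][l - k - 1]:
                            table[i][l].add(a)
    return start in table[0][n - 1]

G2 = {"S": [("A", "B"), ("B", "A")], "A": [("a",)], "B": [("b",)]}
assert cyk("ab", G2) and cyk("ba", G2)
assert not cyk("aa", G2) and not cyk("abb", G2)
```

The interest of the papers lies in how the size of such grammars must grow with $n$, which the toy $n=2$ case does not show.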
THE METHOD OF CONSTRUCTING A BOOLEAN FORMULA OF A POLYGON IN THE DISJUNCTIVE NORMAL FORM
A. A. Butov
2014-01-01
The paper focuses on finalizing the method of finding a polygon Boolean formula in disjunctive normal form, described in the previous article [1]. The improved method eliminates the drawback associated with the existence of a class of problems for which the solution was only approximate: the proposed method always makes it possible to find an exact solution. The method can be used, in particular, in systems for computer-aided design of integrated circuit topology.
DeVille, R. E. Lee; Harkin, Anthony; Holzer, Matt; Josić, Krešimir; Kaper, Tasso J.
2008-06-01
For singular perturbation problems, the renormalization group (RG) method of Chen, Goldenfeld, and Oono [Phys. Rev. E 49 (1994) 4502-4511] has been shown to be an effective general approach for deriving reduced or amplitude equations that govern the long time dynamics of the system. It has been applied to a variety of problems traditionally analyzed using disparate methods, including the method of multiple scales, boundary layer theory, the WKBJ method, the Poincaré-Lindstedt method, the method of averaging, and others. In this article, we show how the RG method may be used to generate normal forms for large classes of ordinary differential equations. First, we apply the RG method to systems with autonomous perturbations, and we show that the reduced or amplitude equations generated by the RG method are equivalent to the classical Poincaré-Birkhoff normal forms for these systems up to and including terms of O(ɛ²), where ɛ is the perturbation parameter. This analysis establishes our approach and generalizes to higher order. Second, we apply the RG method to systems with nonautonomous perturbations, and we show that the reduced or amplitude equations so generated constitute time-asymptotic normal forms, which are based on KBM averages. Moreover, for both classes of problems, we show that the main coordinate changes are equivalent, up to translations between the spaces in which they are defined. In this manner, our results show that the RG method offers a new approach for deriving normal forms for nonautonomous systems, and it offers advantages since one can typically more readily identify resonant terms from naive perturbation expansions than from the nonautonomous vector fields themselves. Finally, we establish how well the solution to the RG equations approximates the solution of the original equations on time scales of O(1/ɛ).
High molecular gas fractions in normal massive star-forming galaxies in the young Universe.
Tacconi, L J; Genzel, R; Neri, R; Cox, P; Cooper, M C; Shapiro, K; Bolatto, A; Bouché, N; Bournaud, F; Burkert, A; Combes, F; Comerford, J; Davis, M; Schreiber, N M Förster; Garcia-Burillo, S; Gracia-Carpio, J; Lutz, D; Naab, T; Omont, A; Shapley, A; Sternberg, A; Weiner, B
2010-02-11
Stars form from cold molecular interstellar gas. As this is relatively rare in the local Universe, galaxies like the Milky Way form only a few new stars per year. Typical massive galaxies in the distant Universe formed stars an order of magnitude more rapidly. Unless star formation was significantly more efficient, this difference suggests that young galaxies were much more molecular-gas rich. Molecular gas observations in the distant Universe have so far largely been restricted to very luminous, rare objects, including mergers and quasars, and accordingly we do not yet have a clear idea about the gas content of more normal (albeit massive) galaxies. Here we report the results of a survey of molecular gas in samples of typical massive-star-forming galaxies at mean redshifts of about 1.2 and 2.3, when the Universe was respectively 40% and 24% of its current age. Our measurements reveal that distant star-forming galaxies were indeed gas rich, and that the star formation efficiency is not strongly dependent on cosmic epoch. The average fraction of cold gas relative to total galaxy baryonic mass at z = 2.3 and z = 1.2 is respectively about 44% and 34%, three to ten times higher than in today's massive spiral galaxies. The slow decrease between z ≈ 2 and z ≈ 1 probably requires a mechanism of semi-continuous replenishment of fresh gas to the young galaxies.
Generating All Circular Shifts by Context-Free Grammars in Greibach Normal Form
Asveld, Peter R.J.
2007-01-01
For each alphabet Σn = {a1,a2,…,an}, linearly ordered by a1 < a2 < ⋯ < an, let Cn be the language of circular or cyclic shifts over Σn, i.e., Cn = {a1a2 ⋯ an-1an, a2a3 ⋯ ana1,…,ana1 ⋯ an-2an-1}. We study a few families of context-free grammars Gn (n ≥1) in Greibach normal form such that Gn generates
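The Greibach-normal-form grammars are the subject of the paper; here we only build the language Cn itself to make the definition concrete (the generator below is our illustration, not the paper's construction):

```python
# The language of circular (cyclic) shifts from the abstract.
def circular_shifts(n):
    """C_n over the alphabet a1 < a2 < ... < an, as tuples of symbols."""
    word = [f"a{i}" for i in range(1, n + 1)]
    # the k-th element rotates the base word left by k positions
    return [tuple(word[k:] + word[:k]) for k in range(n)]

# C_3 = {a1a2a3, a2a3a1, a3a1a2}
print(circular_shifts(3))
```

Note |Cn| = n, in contrast with the |Ln| = n! permutation languages of the related papers above; the grammar-size questions are correspondingly different.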
Normal form of particle motion under the influence of an ac dipole
R. Tomás
2002-05-01
AC dipoles in accelerators are used to excite coherent betatron oscillations at a drive frequency close to the tune. These beam oscillations may last arbitrarily long and, in principle, there is no significant emittance growth if the ac dipole is adiabatically turned on and off. Therefore the ac dipole seems to be an adequate tool for nonlinear diagnostics provided the particle motion is well described in the presence of the ac dipole and nonlinearities. Normal forms and Lie algebra are powerful tools to study the nonlinear content of an accelerator lattice. In this article a way to obtain the normal form of the Hamiltonian of an accelerator with an ac dipole is described. The particle motion to first order in the nonlinearities is derived using Lie algebra techniques. The dependence of the Hamiltonian terms on the longitudinal coordinate is studied showing that they vary differently depending on the ac dipole parameters. The relation is given between the lines of the Fourier spectrum of the turn-by-turn motion and the Hamiltonian terms.
Daniel Ventura
2010-01-01
The lambda-calculus with de Bruijn indices assembles each alpha-class of lambda-terms in a unique term, using indices instead of variable names. Intersection types provide finitary type polymorphism and can characterise normalisable lambda-terms through the property that a term is normalisable if and only if it is typeable. To be closer to computations and to simplify the formalisation of the atomic operations involved in beta-contractions, several calculi of explicit substitution were developed mostly with de Bruijn indices. Versions of explicit substitutions calculi without types and with simple type systems are well investigated in contrast to versions with more elaborate type systems such as intersection types. In previous work, we introduced a de Bruijn version of the lambda-calculus with an intersection type system and proved that it preserves subject reduction, a basic property of type systems. In this paper a version with de Bruijn indices of an intersection type system originally introduced to characterise principal typings for beta-normal forms is presented. We present the characterisation in this new system and the corresponding versions for the type inference and the reconstruction of normal forms from principal typings algorithms. We briefly discuss the failure of the subject reduction property and some possible solutions for it.
Theory and praxis of map analysis in CHEF part 1: Linear normal form
Michelotti, Leo; /Fermilab
2008-10-01
This memo begins a series which, put together, could comprise the 'CHEF Documentation Project' if there were such a thing. The first--and perhaps only--three will telegraphically describe theory, algorithms, implementation and usage of the normal form map analysis procedures encoded in CHEF's collection of libraries. [1] This one will begin the sequence by explaining the linear manipulations that connect the Jacobian matrix of a symplectic mapping to its normal form. It is a 'Reader's Digest' version of material I wrote in Intermediate Classical Dynamics (ICD) [2] and randomly scattered across technical memos, seminar viewgraphs, and lecture notes for the past quarter century. Much of its content is old, well known, and in some places borders on the trivial. Nevertheless, completeness requires its inclusion. The primary objective is the 'fundamental theorem' on normalization written on page 8. I plan to describe the nonlinear procedures in a subsequent memo and devote a third to laying out algorithms and lines of code, connecting them with equations written in the first two. Originally this was to be done in one short paper, but I jettisoned that approach after its first section exceeded a dozen pages. The organization of this document is as follows. A brief description of notation is followed by a section containing a general treatment of the linear problem. After the 'fundamental theorem' is proved, two further subsections discuss the generation of equilibrium distributions and the issue of 'phase'. The final major section reviews parameterizations--that is, lattice functions--in two and four dimensions with a passing glance at the six-dimensional version. Appearances to the contrary, for the most part I have tried to restrict consideration to matters needed to understand the code in CHEF's libraries.
Theory and praxis of map analysis in CHEF part 2: Nonlinear normal form
Michelotti, Leo
2009-01-01
This is the second of three memos describing how normal form map analysis is implemented in CHEF. The first (1) explained the manipulations required to assure that initial, linear transformations preserved Poincare invariants, thereby confirming correct normalization of action-angle coordinates. In this one, the transformation will be extended to nonlinear terms. The third, describing how the algorithms were implemented within the software of CHEF's libraries, most likely will never be written. The first section, Section 2, quickly lays out preliminary concepts and relationships. In Section 3, we shall review the perturbation theory - an iterative sequence of transformations that converts a nonlinear mapping into its normal form - and examine the equation which moves calculations from one step to the next. Following that is a section titled 'Interpretation', which identifies connections between the normalized mappings and idealized, integrable, fictitious Hamiltonian models. A final section contains closing comments, some of which may - but probably will not - preview work to be done later. My reasons for writing this memo and its predecessor have already been expressed. (1) To them can be added this: 'black box code' encourages users to proceed with little or no understanding of what it does or how it operates. So far, CHEF has avoided this trap admirably by failing to attract potential users. However, we reached a watershed last year: even I now have difficulty following the software through its maze of operations. Extensions to CHEF's physics functionalities, software upgrades, and even simple maintenance are becoming more difficult than they should. I hope these memos will mark parts of the maze for easier navigation in the future. Despite appearances to the contrary, I tried to include no (or very little) more than the minimum needed to understand what CHEF's nonlinear analysis modules do. As with the first memo, material has been lifted - and modified - from
Theory and praxis of map analysis in CHEF part 2: Nonlinear normal form
Michelotti, Leo; /FERMILAB
2009-04-01
This is the second of three memos describing how normal form map analysis is implemented in CHEF. The first [1] explained the manipulations required to assure that initial, linear transformations preserved Poincare invariants, thereby confirming correct normalization of action-angle coordinates. In this one, the transformation will be extended to nonlinear terms. The third, describing how the algorithms were implemented within the software of CHEF's libraries, most likely will never be written. The first section, Section 2, quickly lays out preliminary concepts and relationships. In Section 3, we shall review the perturbation theory - an iterative sequence of transformations that converts a nonlinear mapping into its normal form - and examine the equation which moves calculations from one step to the next. Following that is a section titled 'Interpretation', which identifies connections between the normalized mappings and idealized, integrable, fictitious Hamiltonian models. A final section contains closing comments, some of which may - but probably will not - preview work to be done later. My reasons for writing this memo and its predecessor have already been expressed. [1] To them can be added this: 'black box code' encourages users to proceed with little or no understanding of what it does or how it operates. So far, CHEF has avoided this trap admirably by failing to attract potential users. However, we reached a watershed last year: even I now have difficulty following the software through its maze of operations. Extensions to CHEF's physics functionalities, software upgrades, and even simple maintenance are becoming more difficult than they should. I hope these memos will mark parts of the maze for easier navigation in the future. Despite appearances to the contrary, I tried to include no (or very little) more than the minimum needed to understand what CHEF's nonlinear analysis modules do. As with the first memo, material
Optimization of accelerator parameters using normal form methods on high-order transfer maps
Snopok, Pavel [Michigan State Univ., East Lansing, MI (United States)
2007-05-01
Methods of analysis of the dynamics of ensembles of charged particles in collider rings are developed. The following problems are posed and solved using normal form transformations and other methods of perturbative nonlinear dynamics: (1) Optimization of the Tevatron dynamics: (a) Skew quadrupole correction of the dynamics of particles in the Tevatron in the presence of the systematic skew quadrupole errors in dipoles; (b) Calculation of the nonlinear tune shift with amplitude based on the results of measurements and the linear lattice information; (2) Optimization of the Muon Collider storage ring: (a) Computation and optimization of the dynamic aperture of the Muon Collider 50 x 50 GeV storage ring using higher order correctors; (b) 750 x 750 GeV Muon Collider storage ring lattice design matching the Tevatron footprint. The normal form coordinates have a very important advantage over the particle optical coordinates: if the transformation can be carried out successfully (general restrictions for that are not much stronger than the typical restrictions imposed on the behavior of the particles in the accelerator) then the motion in the new coordinates has a very clean representation allowing one to extract more information about the dynamics of particles, and they are very convenient for the purposes of visualization. All the problem formulations include the derivation of the objective functions, which are later used in the optimization process using various optimization algorithms. Algorithms used to solve the problems are specific to collider rings, and applicable to similar problems arising on other machines of the same type. The details of the long-term behavior of the systems are studied to ensure their stability for the desired number of turns. The algorithm of the normal form transformation is of great value for such problems as it gives much extra information about the disturbing factors. In addition to the fact that the dynamics of particles is represented
Bioactive form of resveratrol in glioblastoma cells and its safety for normal brain cells
Xiao-Hong Shu
2013-05-01
Background: Resveratrol, a plant polyphenol existing in grapes and many other natural foods, possesses a wide range of biological activities including cancer prevention. It has been recognized that resveratrol is intracellularly biotransformed to different metabolites, but no direct evidence has been available to ascertain its bioactive form because of the difficulty of keeping resveratrol unmetabolized in vivo or in vitro. It would therefore be worthwhile to elucidate the potential therapeutic implications of resveratrol metabolism using reliable resveratrol-sensitive cancer cells. Objective: To identify the real biological form of trans-resveratrol and to evaluate the safety of the effective anticancer dose of resveratrol for normal brain cells. Methods: Samples were prepared from the conditioned media and cell lysates of human glioblastoma U251 cells and were purified by solid phase extraction (SPE). The samples were subjected to high performance liquid chromatography (HPLC) and liquid chromatography/tandem mass spectrometry (LC/MS) analysis. According to the metabolite(s), trans-resveratrol was biotransformed in vitro by the method described elsewhere, and the resulting solution was used to treat U251 cells. Meanwhile, the responses of U251 cells and primarily cultured rat normal brain cells (glial cells and neurons) to 100 μM trans-resveratrol were evaluated by multiple experimental methods. Results: The results revealed that resveratrol monosulfate was the major metabolite in U251 cells. About half of the resveratrol was prepared as resveratrol monosulfate in vitro, and this trans-resveratrol and resveratrol monosulfate mixture showed little inhibitory effect on U251 cells. It was also found that rat primary brain cells (PBCs) not only resist 100 μM but also tolerate as high as 200 μM resveratrol treatment. Conclusions: Our study thus demonstrated that trans-resveratrol was the bioactive form in glioblastoma cells and, therefore, the biotransforming
Normal form analysis of linear beam dynamics in a coupled storage ring
Wolski, Andrzej; Woodley, Mark D.
2004-01-01
The techniques of normal form analysis, well known in the literature, can be used to provide a straightforward characterization of linear betatron dynamics in a coupled lattice. Here, we consider both the beam distribution and the betatron oscillations in a storage ring. We find that the beta functions for uncoupled motion generalize in a simple way to the coupled case. Defined in the way that we propose, the beta functions remain well behaved (positive and finite) under all circumstances, and have essentially the same physical significance for the beam size and betatron oscillation amplitude as in the uncoupled case. Application of this analysis to the online modeling of the PEP-II rings is also discussed
A Mathematical Framework for Critical Transitions: Normal Forms, Variance and Applications
Kuehn, Christian
2013-06-01
Critical transitions occur in a wide variety of applications including mathematical biology, climate change, human physiology and economics. Therefore it is highly desirable to find early-warning signs. We show that it is possible to classify critical transitions by using bifurcation theory and normal forms in the singular limit. Based on this elementary classification, we analyze stochastic fluctuations and calculate scaling laws of the variance of stochastic sample paths near critical transitions for fast-subsystem bifurcations up to codimension two. The theory is applied to several models: the Stommel-Cessi box model for the thermohaline circulation from geoscience, an epidemic-spreading model on an adaptive network, an activator-inhibitor switch from systems biology, a predator-prey system from ecology and to the Euler buckling problem from classical mechanics. For the Stommel-Cessi model we compare different detrending techniques to calculate early-warning signs. In the epidemics model we show that link densities could be better variables for prediction than population densities. The activator-inhibitor switch demonstrates effects in three time-scale systems and points out that excitable cells and molecular units have information for subthreshold prediction. In the predator-prey model explosive population growth near a codimension-two bifurcation is investigated and we show that early-warnings from normal forms can be misleading in this context. In the biomechanical model we demonstrate that early-warning signs for buckling depend crucially on the control strategy near the instability which illustrates the effect of multiplicative noise.
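The variance-based early-warning sign from the abstract can be seen in the simplest fast-subsystem bifurcation, the fold. A toy illustration (ours, not one of the paper's models): for dx = (λ − x²)dt + σdW, the stable branch x = √λ has linear decay rate 2√λ, so the stationary variance σ²/(4√λ) grows as λ decreases toward the fold at λ = 0.

```python
# Variance growth near a fold bifurcation, via Euler-Maruyama runs of
# dx = (lam - x**2) dt + sigma dW started on the stable branch.
import random
import statistics

def stationary_sample(lam, runs=300, t_end=30.0, dt=0.01, sigma=0.05):
    """One endpoint per independent run; t_end covers many relaxation
    times (rate 2*sqrt(lam)), so endpoints are close to stationary."""
    rng = random.Random(0)           # fixed seed for reproducibility
    out = []
    for _ in range(runs):
        x = lam ** 0.5               # start on the stable equilibrium
        for _ in range(int(t_end / dt)):
            x += (lam - x * x) * dt + sigma * dt ** 0.5 * rng.gauss(0, 1)
        out.append(x)
    return out

far = statistics.pvariance(stationary_sample(1.0))    # far from the fold
near = statistics.pvariance(stationary_sample(0.05))  # close to the fold
print(near > 2 * far)   # fluctuations grow as the transition nears
```

The theory predicts a variance ratio of roughly √(1/0.05) ≈ 4.5 between the two parameter values, which the simulation reproduces to within sampling error; the paper's scaling laws make this quantitative for bifurcations up to codimension two.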
Koskela, Anne; Vehkalahti, Kaisa
2017-01-01
This article shows the importance of paying attention to the role of professional devices, such as standardised forms, as producers of normality and deviance in the history of education. Our case study focused on the standardised forms used by teachers during child guidance clinic referrals and transfers to special education in northern Finland,…
Furnes, Bjarte; Norman, Elisabeth
2015-08-01
Metacognition refers to 'cognition about cognition' and includes metacognitive knowledge, strategies and experiences (Efklides, 2008; Flavell, 1979). Research on reading has shown that better readers demonstrate more metacognitive knowledge than poor readers (Baker & Beall, 2009), and that reading ability improves through strategy instruction (Gersten, Fuchs, Williams, & Baker, 2001). The current study is the first to specifically compare the three forms of metacognition in dyslexic (N = 22) versus normally developing readers (N = 22). Participants read two factual texts, with learning outcome measured by a memory task. Metacognitive knowledge and skills were assessed by self-report. Metacognitive experiences were measured by predictions of performance and judgments of learning. Individuals with dyslexia showed insight into their reading problems, but less general knowledge of how to approach text reading. They more often reported lack of available reading strategies, but groups did not differ in the use of deep and surface strategies. Learning outcome and mean ratings of predictions of performance and judgments of learning were lower in dyslexic readers, but not the accuracy with which metacognitive experiences predicted learning. Overall, the results indicate that dyslexic reading and spelling problems are not generally associated with lower levels of metacognitive knowledge, metacognitive strategies or sensitivity to metacognitive experiences in reading situations. 2015 The Authors. Dyslexia Published by John Wiley & Sons Ltd.
Y. Yuliana
2011-07-01
The aim of orthodontic treatment is to achieve aesthetics, health of the teeth and surrounding tissues, a functional occlusal relationship, and stability. The success of an orthodontic treatment is influenced by many factors, such as the diagnosis and the treatment plan. In order to make a diagnosis and a treatment plan, the medical record, clinical examination, radiographic examination, extraoral and intraoral photographs, as well as study model analysis are needed. The purpose of this study was to evaluate the differences in dental arch form between the level four polynomial and the pentamorphic arch form and to determine which one best fits a normal occlusion sample. This comparative analytic study was conducted at the Faculty of Dentistry, Universitas Padjadjaran, on 13 models by comparing the dental arch form obtained using the level four polynomial method, based on mathematical calculations, and the pentamorphic arch pattern, with mandibular normal occlusion as a control. The results were tested using Student's t-test. They indicate a significant difference from the mandibular normal occlusion arch form for both the level four polynomial and the pentamorphic arch form; the level four polynomial fits better than the pentamorphic arch form.
Application of Power Geometry and Normal Form Methods to the Study of Nonlinear ODEs
Edneral, Victor
2018-02-01
This paper describes power transformations of degenerate autonomous polynomial systems of ordinary differential equations which reduce such systems to a non-degenerate form. An example of constructing exact first integrals of motion of a planar degenerate system in closed form is given.
Application of Power Geometry and Normal Form Methods to the Study of Nonlinear ODEs
Edneral Victor
2018-01-01
This paper describes power transformations of degenerate autonomous polynomial systems of ordinary differential equations which reduce such systems to a non-degenerate form. An example of constructing exact first integrals of motion of a planar degenerate system in closed form is given.
Menn, Lise; And Others
This study examined the role of empathy in the choice of syntactic form and the degree of independence of pragmatic and syntactic abilities in a range of aphasic patients. Study 1 involved 9 English-speaking and 9 Japanese-speaking aphasic subjects with 10 English-speaking and 4 Japanese normal controls. Study 2 involved 14 English- and 6…
Avendaño-Camacho, M; Vallejo, J A; Vorobjev, Yu
2013-01-01
We study the determination of the second-order normal form for perturbed Hamiltonians relative to the periodic flow of the unperturbed Hamiltonian H₀. The formalism presented here is global, and can be easily implemented in any computer algebra system. We illustrate it by means of two examples: the Hénon–Heiles and the elastic pendulum Hamiltonians. (paper)
Lee, E.T.
1983-01-01
Algorithms for the construction of the Chomsky and Greibach normal forms for a fuzzy context-free grammar using the algebraic approach are presented and illustrated by examples. The results obtained in this paper may have useful applications in fuzzy languages, pattern recognition, information storage and retrieval, artificial intelligence, database and pictorial information systems. 16 references.
Lipkin, H.J.; Zou, B.
1996-01-01
Previous calculations by Geiger and Isgur showed that systematic cancellations among hadronic loops occur for uū ↔ ss̄ mixing in all the low-lying nonets except 0⁺⁺. They suggested this is due to ³P₀ dominance of the effective quark-antiquark (qq̄) pair creation operator. Here we give a general argument that there should be a large mixing for the 0⁺⁺ nonet from hadronic loops no matter what kind of model is assumed for qq̄ creation. By the same argument we show that the Okubo-Zweig-Iizuka (OZI) rule should be best obeyed in the 1⁻⁻, 2⁺⁺, and 3⁻⁻ nonets, but not very well obeyed in the 0⁻⁺, 1⁺⁻, and 1⁺⁺ nonets. All these model-independent expectations are compatible with calculations by Geiger and Isgur using the ³P₀ model as well as with experimental data. A similar argument also suggests a large hadronic loop contribution to the p̄p → φφ reaction. copyright 1996 The American Physical Society
Noguchi, Hiroshi; Takehara, Kimie; Ohashi, Yumiko; Suzuki, Ryo; Yamauchi, Toshimasa; Kadowaki, Takashi; Sanada, Hiromi
2016-01-01
Aim. Callus is a risk factor leading to severe diabetic foot ulcers; thus, prevention of callus formation is important. However, normal stress (pressure) and shear stress associated with callus have not been clarified. Additionally, as a new variable, a shear stress-normal stress (pressure) ratio (SPR) was examined. The purpose was to clarify the external force associated with callus formation in patients with diabetic neuropathy. Methods. The external force of the 1st, 2nd, and 5th metatarsal heads (MTH), as callus predilection regions, was measured. The SPR was calculated by dividing shear stress by normal stress (pressure); concretely, peak values (SPR-p) and time integral values (SPR-i). The optimal cut-off point was determined. Results. The callus formation regions of the 1st and 2nd MTH showed higher SPR-i than the noncallus formation regions. The cut-off value of the 1st MTH was 0.60 and that of the 2nd MTH was 0.50. For the 5th MTH, variables pertaining to the external forces could not be determined to be indicators of callus formation because of low accuracy. Conclusions. The callus formation cut-off values of the 1st and 2nd MTH were clarified. In the future, it will be necessary to confirm the effect of using appropriate footwear and gait training on lowering SPR-i. PMID:28050567
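The SPR described above is a simple ratio; a minimal sketch of the screening logic, using the cut-off values quoted in the abstract (0.60 for the 1st MTH, 0.50 for the 2nd MTH) and hypothetical function and region names, might look like:

```python
def spr(shear_stress, pressure):
    """Shear stress-normal stress (pressure) ratio as defined in the abstract."""
    return shear_stress / pressure

# Cut-off values for the time-integral ratio SPR-i reported in the study.
CUTOFF = {"MTH1": 0.60, "MTH2": 0.50}

def at_risk(region, shear_integral, pressure_integral):
    """Flag a region as callus-prone when its SPR-i meets the cut-off."""
    return spr(shear_integral, pressure_integral) >= CUTOFF[region]

flag = at_risk("MTH1", 6.5, 10.0)  # SPR-i = 0.65 exceeds the 0.60 cut-off
```

The numeric inputs here are illustrative; the study measured the integrals from instrumented gait data.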
Ayumi Amemiya
2016-01-01
Full Text Available Aim. Callus is a risk factor leading to severe diabetic foot ulcers; thus, prevention of callus formation is important. However, normal stress (pressure) and shear stress associated with callus have not been clarified. Additionally, as a new variable, a shear stress-normal stress (pressure) ratio (SPR) was examined. The purpose was to clarify the external force associated with callus formation in patients with diabetic neuropathy. Methods. The external force of the 1st, 2nd, and 5th metatarsal heads (MTH), as callus predilection regions, was measured. The SPR was calculated by dividing shear stress by normal stress (pressure); concretely, peak values (SPR-p) and time integral values (SPR-i). The optimal cut-off point was determined. Results. The callus formation regions of the 1st and 2nd MTH showed higher SPR-i than the noncallus formation regions. The cut-off value of the 1st MTH was 0.60 and that of the 2nd MTH was 0.50. For the 5th MTH, variables pertaining to the external forces could not be determined to be indicators of callus formation because of low accuracy. Conclusions. The callus formation cut-off values of the 1st and 2nd MTH were clarified. In the future, it will be necessary to confirm the effect of using appropriate footwear and gait training on lowering SPR-i.
Holmes, Philip J.
1981-06-01
We study the instabilities known to aeronautical engineers as flutter and divergence. Mathematically, these states correspond to bifurcations to limit cycles and multiple equilibrium points in a differential equation. Making use of the center manifold and normal form theorems, we concentrate on the situation in which flutter and divergence become coupled, and show that there are essentially two ways in which this is likely to occur. In the first case the system can be reduced to an essential model which takes the form of a single degree of freedom nonlinear oscillator. This system, which may be analyzed by conventional phase-plane techniques, captures all the qualitative features of the full system. We discuss the reduction and show how the nonlinear terms may be simplified and put into normal form. Invariant manifold theory and the normal form theorem play a major role in this work and this paper serves as an introduction to their application in mechanics. Repeating the approach in the second case, we show that the essential model is now three dimensional and that far more complex behavior is possible, including nonperiodic and ‘chaotic’ motions. Throughout, we take a two degree of freedom system as an example, but the general methods are applicable to multi- and even infinite degree of freedom problems.
Generation of Strategies for Environmental Deception in Two-Player Normal-Form Games
2015-06-18
found in the literature is presented by Kohlberg and Mertens [23]. A stable equilibrium by their definition is an equilibrium in an extensive-form...the equilibrium in this state provides them with an increased payoff. While interesting, Kohlberg and Mertens' definition of equilibrium...stability used by Kohlberg and Mertens. Arsham's work focuses on determining the amount by which a mixed-strategy Nash equilibrium's payoff values can
Nazirov, N.N.; Kamalov, N.; Norbaev, N.
1978-01-01
The radiation effect on the electrical conductivity of tissues under alternating current, on electrical capacity and on cell impedance has been studied. Gamma irradiation of seedlings results in definite changes in the electrical factors of cells (electrical conductivity, electrical capacity, impedance). Especially strong changes were revealed during gamma irradiation of the radiosensitive wild form of cotton plants. The deviation of the cells' electrical factors from the norm reflects disruption of the evolutionarily established ion heterogeneity and of the state of the cell colloid system, which results in changes in cell structure and metabolism
Heinz Toparkus
2014-04-01
Full Text Available In this paper we consider first-order systems with constant coefficients for two real-valued functions of two real variables. This is both a problem in itself and an alternative view of the classical linear partial differential equations of second order with constant coefficients. The classification of the systems is done using elementary methods of linear algebra. Each type presents its special canonical form in the associated characteristic coordinate system. Initial value problems can then be formulated in appropriate basic domains, and solutions of these problems can be attempted by means of transform methods.
Ellison, James A.; Heinemann, Klaus [New Mexico Univ., Albuquerque, NM (United States). Dept. of Mathematics and Statistics; Vogt, Mathias [Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany); Gooden, Matthew [North Carolina State Univ., Raleigh, NC (United States). Dept. of Physics
2013-03-15
We present a mathematical analysis of planar motion of energetic electrons moving through a planar dipole undulator, excited by a fixed planar polarized plane wave Maxwell field in the X-Ray FEL regime. Our starting point is the 6D Lorentz system, which allows planar motions, and we examine this dynamical system as the wave length λ of the traveling wave varies. By scalings and transformations the 6D system is reduced, without approximation, to a 2D system in a form for a rigorous asymptotic analysis using the Method of Averaging (MoA), a long time perturbation theory. The two dependent variables are a scaled energy deviation and a generalization of the so-called ponderomotive phase. As λ varies the system passes through resonant and nonresonant (NR) zones and we develop NR and near-to-resonant (NtoR) MoA normal form approximations. The NtoR normal forms contain a parameter which measures the distance from a resonance. For a special initial condition, for the planar motion and on resonance, the NtoR normal form reduces to the well known FEL pendulum system. We then state and prove NR and NtoR first-order averaging theorems which give explicit error bounds for the normal form approximations. We prove the theorems in great detail, giving the interested reader a tutorial on mathematically rigorous perturbation theory in a context where the proofs are easily understood. The proofs are novel in that they do not use a near identity transformation and they use a system of differential inequalities. The NR case is an example of quasiperiodic averaging where the small divisor problem enters in the simplest possible way. To our knowledge the planar problem has not been analyzed with the generality we aspire to here nor has the standard FEL pendulum system been derived with associated error bounds as we do here. We briefly discuss the low gain theory in light of our NtoR normal form. Our mathematical treatment of the noncollective FEL beam dynamics problem in
Ellison, James A.; Heinemann, Klaus; Gooden, Matthew
2013-03-01
We present a mathematical analysis of planar motion of energetic electrons moving through a planar dipole undulator, excited by a fixed planar polarized plane wave Maxwell field in the X-Ray FEL regime. Our starting point is the 6D Lorentz system, which allows planar motions, and we examine this dynamical system as the wave length λ of the traveling wave varies. By scalings and transformations the 6D system is reduced, without approximation, to a 2D system in a form for a rigorous asymptotic analysis using the Method of Averaging (MoA), a long time perturbation theory. The two dependent variables are a scaled energy deviation and a generalization of the so-called ponderomotive phase. As λ varies the system passes through resonant and nonresonant (NR) zones and we develop NR and near-to-resonant (NtoR) MoA normal form approximations. The NtoR normal forms contain a parameter which measures the distance from a resonance. For a special initial condition, for the planar motion and on resonance, the NtoR normal form reduces to the well known FEL pendulum system. We then state and prove NR and NtoR first-order averaging theorems which give explicit error bounds for the normal form approximations. We prove the theorems in great detail, giving the interested reader a tutorial on mathematically rigorous perturbation theory in a context where the proofs are easily understood. The proofs are novel in that they do not use a near identity transformation and they use a system of differential inequalities. The NR case is an example of quasiperiodic averaging where the small divisor problem enters in the simplest possible way. To our knowledge the planar problem has not been analyzed with the generality we aspire to here nor has the standard FEL pendulum system been derived with associated error bounds as we do here. We briefly discuss the low gain theory in light of our NtoR normal form. Our mathematical treatment of the noncollective FEL beam dynamics problem in the
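The pendulum system mentioned above can be made concrete numerically. The sketch below integrates the standard pendulum equations dφ/dτ = η, dη/dτ = -K sin φ (the form the FEL pendulum takes on resonance) with classical RK4 and checks energy conservation; it is a generic illustration, not the paper's averaging analysis, and the constant K is an assumed coupling parameter:

```python
import math

def rk4_step(phi, eta, dt, K):
    """One classical Runge-Kutta 4 step for the pendulum system."""
    def f(p, e):
        return e, -K * math.sin(p)
    k1p, k1e = f(phi, eta)
    k2p, k2e = f(phi + 0.5 * dt * k1p, eta + 0.5 * dt * k1e)
    k3p, k3e = f(phi + 0.5 * dt * k2p, eta + 0.5 * dt * k2e)
    k4p, k4e = f(phi + dt * k3p, eta + dt * k3e)
    phi += dt * (k1p + 2 * k2p + 2 * k3p + k4p) / 6
    eta += dt * (k1e + 2 * k2e + 2 * k3e + k4e) / 6
    return phi, eta

def energy(phi, eta, K):
    """Conserved pendulum energy: kinetic plus potential term."""
    return 0.5 * eta * eta - K * math.cos(phi)

K, dt = 1.0, 0.01
phi, eta = 0.5, 0.0          # small-amplitude librating orbit
E0 = energy(phi, eta, K)
for _ in range(1000):        # integrate to tau = 10
    phi, eta = rk4_step(phi, eta, dt, K)
drift = abs(energy(phi, eta, K) - E0)
```

The small energy drift confirms the integrator tracks the conserved quantity; the paper's contribution is the rigorous error bound relating this pendulum to the full 6D Lorentz dynamics, which this sketch does not reproduce.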
Martinez Carrillo, Irma
2008-01-15
Power system dynamic behavior is inherently nonlinear and is driven by different processes at different time scales. The size and complexity of these mechanisms has stimulated the search for methods that reduce the original dimension but retain a certain degree of accuracy. In this dissertation, a novel nonlinear dynamical analysis method for the analysis of large amplitude oscillations that embraces ideas from normal form theory and singular perturbation techniques is proposed. This approach allows the full potential of the normal form method to be reached, and is suitably general for application to a wide variety of nonlinear systems. Drawing on the formal theory of dynamical systems, a structure-preserving model of the system is developed that preserves network and load characteristics. By exploiting the separation of fast and slow time scales of the model, an efficient approach based on singular perturbation techniques is then derived for constructing a nonlinear power system representation that accurately preserves network structure. The method requires no reduction of the constraint equations and therefore gives information about the effect of network and load characteristics on system behavior. Analytical expressions are then developed that provide approximate solutions to system performance near a singularity, and techniques for interpreting these solutions in terms of modal functions are given. New insights into the nature of nonlinear oscillations are also offered and criteria for characterizing network effects on nonlinear system behavior are proposed. Theoretical insight into the behavior of dynamic coupling of differential-algebraic equations and the origin of nonlinearity is given, and implications for the design and placement of power system controllers in complex nonlinear systems are discussed. The extent of applicability of the proposed procedure is demonstrated by analyzing nonlinear behavior in two realistic test power systems
Barrett, S.F.; Tarone, R.E.; Moshell, A.N.; Ganges, M.B.; Robbins, J.H.
1981-01-01
In xeroderma pigmentosum, an inherited disorder of defective DNA repair, post-uv colony-forming ability of fibroblasts from patients in complementation groups A through F correlates with the patients' neurological status. The first xeroderma pigmentosum patient assigned to the recently discovered group G had the neurological abnormalities of XP. Researchers have determined the post-uv colony-forming ability of cultured fibroblasts from this patient and from 5 more control donors. Log-phase fibroblasts were irradiated with 254 nm uv light from a germicidal lamp, trypsinized, and replated at known densities. After 2 to 4 weeks' incubation the cells were fixed, stained and scored for colony formation. The strains' post-uv colony-forming ability curves were obtained by plotting the log of the percent remaining post-uv colony-forming ability as a function of the uv dose. The post-uv colony-forming ability of 2 of the 5 new normal strains was in the previously defined control donor zone, but that of the other 3 extended down to the level of the most resistant xeroderma pigmentosum strain. The post-uv colony-forming ability curve of the group G fibroblasts was not significantly different from the curves of the group D fibroblast strains from patients with clinical histories similar to that of the group G patient
Bobodzhanov, A A; Safonov, V F [National Research University "Moscow Power Engineering Institute", Moscow (Russian Federation)]
2013-07-31
The paper deals with extending the Lomov regularization method to classes of singularly perturbed Fredholm-type integro-differential systems, which have not so far been studied. In these the limiting operator is discretely noninvertible. Such systems are commonly known as problems with unstable spectrum. Separating out the essential singularities in the solutions to these problems presents great difficulties. The principal one is to give an adequate description of the singularities induced by 'instability points' of the spectrum. A methodology for separating singularities by using normal forms is developed. It is applied to the above type of systems and is substantiated in these systems. Bibliography: 10 titles.
Klein, A.; Marshalek, E.R.
1988-01-01
In recent years, the method for unitarizing nonunitary Dyson boson realizations of shell-model algebras has been both generalized and substantially simplified through the introduction of overtly group-theoretical methods. In this paper, these methods are applied to the boson-odd-particle realization of the algebra SO(2ν+1) for ν single-particle levels, adapted to the group chain SO(2ν+1) ⊃ SO(2ν) ⊃ U(ν), which Marshalek first derived by brute-force summation of a Taylor expansion and Okubo later obtained by a largely algebraic technique. (orig.)
Adam Karbowski
2017-09-01
Full Text Available The purpose of this study is to explore the link between imagine-self perspective-taking and rational self-interested behavior in experimental normal-form games. Drawing on the concept of sympathy developed by Adam Smith and further literature on perspective-taking in games, we hypothesize that introduction of imagine-self perspective-taking by decision-makers promotes rational self-interested behavior in a simple experimental normal-form game. In our study, we examined behavior of 404 undergraduate students in the two-person game, in which the participant can suffer a monetary loss only if she plays her Nash equilibrium strategy and the opponent plays her dominated strategy. Results suggest that the threat of suffering monetary losses effectively discourages the participants from choosing Nash equilibrium strategy. In general, players may take into account that opponents choose dominated strategies due to specific not self-interested motivations or errors. However, adopting imagine-self perspective by the participants leads to more Nash equilibrium choices, perhaps by alleviating participants’ attributions of susceptibility to errors or non-self-interested motivation to the opponents.
Karbowski, Adam; Ramsza, Michał
2017-01-01
The purpose of this study is to explore the link between imagine-self perspective-taking and rational self-interested behavior in experimental normal-form games. Drawing on the concept of sympathy developed by Adam Smith and further literature on perspective-taking in games, we hypothesize that introduction of imagine-self perspective-taking by decision-makers promotes rational self-interested behavior in a simple experimental normal-form game. In our study, we examined behavior of 404 undergraduate students in the two-person game, in which the participant can suffer a monetary loss only if she plays her Nash equilibrium strategy and the opponent plays her dominated strategy. Results suggest that the threat of suffering monetary losses effectively discourages the participants from choosing Nash equilibrium strategy. In general, players may take into account that opponents choose dominated strategies due to specific not self-interested motivations or errors. However, adopting imagine-self perspective by the participants leads to more Nash equilibrium choices, perhaps by alleviating participants' attributions of susceptibility to errors or non-self-interested motivation to the opponents.
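The game structure described above (the participant can lose money only by playing her Nash equilibrium strategy against the opponent's dominated strategy) can be sketched with a small bimatrix. The payoff numbers below are illustrative assumptions with that structure, not the values used in the experiment:

```python
import itertools

# Hypothetical 2x2 bimatrix: the row player loses money (-1) only when she
# plays her equilibrium strategy "N" while the column player plays the
# strictly dominated strategy "D".
ROW = {("N", "E"): 3, ("N", "D"): -1, ("S", "E"): 2, ("S", "D"): 0}
COL = {("N", "E"): 3, ("N", "D"): 1, ("S", "E"): 2, ("S", "D"): 1}
ROWS, COLS = ("N", "S"), ("E", "D")

def pure_nash():
    """Enumerate pure-strategy Nash equilibria of the bimatrix game."""
    eqs = []
    for r, c in itertools.product(ROWS, COLS):
        row_ok = all(ROW[(r, c)] >= ROW[(r2, c)] for r2 in ROWS)
        col_ok = all(COL[(r, c)] >= COL[(r, c2)] for c2 in COLS)
        if row_ok and col_ok:
            eqs.append((r, c))
    return eqs

def strictly_dominated_col(c):
    """True if some other column strategy strictly dominates c."""
    return any(all(COL[(r, c2)] > COL[(r, c)] for r in ROWS)
               for c2 in COLS if c2 != c)

eqs = pure_nash()
```

In this toy matrix ("N", "E") is the unique pure equilibrium, yet a row player who fears the opponent might err into "D" has an incentive to retreat to the safe strategy "S", which is exactly the behavioral tension the study examines.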
Khallli H
2003-04-01
Full Text Available Background: To evaluate the effectiveness of the present educational programs in terms of students' achieving problem solving, decision making and critical thinking skills, reliable, valid and standard instruments are needed. Purposes: To investigate the reliability, validity and norms of CCTST Form B. The California Critical Thinking Skills Test contains 34 multiple-choice questions, each with a correct answer, in the five critical thinking (CT) cognitive skills domains. Methods: The translated CCTST Form B was given to 405 BSN nursing students of nursing faculties located in Tehran (Tehran, Iran and Shahid Beheshti Universities) who were selected through random sampling. In order to determine the face and content validity, the test was translated and edited by Persian and English language professors and researchers. It was also confirmed by the judgments of a panel of medical education experts and psychology professors. CCTST reliability was determined with internal consistency and use of KR-20. The construct validity of the test was investigated with factor analysis, internal consistency and group difference. Results: The test coefficient for reliability was 0.62. Factor analysis indicated that the CCTST is formed from 5 factors (elements), namely: Analysis, Evaluation, Inference, Inductive and Deductive Reasoning. The internal consistency method showed that all subscales have a high and positive correlation with the total test score. The group difference method between nursing and philosophy students (n=50) indicated that there is a meaningful difference between nursing and philosophy students' scores (t=-4.95, p=0.0001). The score percentile norms also show that the fiftieth percentile corresponds to a raw score of 11, and the 95th and 5th percentiles correspond to raw scores of 17 and 6, respectively. Conclusions: The results revealed that the test questions are sufficiently reliable as a research tool, and all subscales measure a single construct (critical thinking) and are able to distinguish the
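The KR-20 coefficient used above for reliability has a standard closed form: KR-20 = (k/(k-1))·(1 - Σp·q / σ²), where k is the number of items, p and q are the per-item proportions of correct and incorrect answers, and σ² is the variance of total scores. A minimal sketch with made-up 0/1 response data:

```python
def kr20(responses):
    """Kuder-Richardson Formula 20 for dichotomous (0/1) item responses.

    responses: list of per-subject lists, one 0/1 entry per item.
    """
    n_items = len(responses[0])
    n_subj = len(responses)
    # Proportion of subjects answering each item correctly, and its complement.
    p = [sum(r[i] for r in responses) / n_subj for i in range(n_items)]
    q = [1 - pi for pi in p]
    # Population variance of total scores.
    totals = [sum(r) for r in responses]
    mean_t = sum(totals) / n_subj
    var_t = sum((t - mean_t) ** 2 for t in totals) / n_subj
    return (n_items / (n_items - 1)) * (
        1 - sum(pi * qi for pi, qi in zip(p, q)) / var_t)

data = [[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 0, 0]]  # toy data, 4 subjects x 3 items
r = kr20(data)  # -> 0.75 for this toy data
```

The study's reported value of 0.62 would come from applying the same formula to the 405 students' 34-item response matrix.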
Roy Choudhury, S.
2007-01-01
The Ostrovsky equation is an important canonical model for the unidirectional propagation of weakly nonlinear long surface and internal waves in a rotating, inviscid and incompressible fluid. Limited functional analytic results exist for the occurrence of one family of solitary-wave solutions of this equation, as well as their approach to the well-known solitons of the famous Korteweg-de Vries equation in the limit as the rotation becomes vanishingly small. Since solitary-wave solutions often play a central role in the long-time evolution of an initial disturbance, we consider such solutions here (via the normal form approach) within the framework of reversible systems theory. Besides confirming the existence of the known family of solitary waves and its reduction to the KdV limit, we find a second family of multihumped (or N-pulse) solutions, as well as a continuum of delocalized solitary waves (or homoclinics to small-amplitude periodic orbits). On isolated curves in the relevant parameter region, the delocalized waves reduce to genuine embedded solitons. The second and third families of solutions occur in regions of parameter space distinct from the known solitary-wave solutions and are thus entirely new. Directions for future work are also mentioned
Niu, Ben, E-mail: niubenhit@163.com [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China); Guo, Yuxiao [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China); Jiang, Weihua [Department of Mathematics, Harbin Institute of Technology, Harbin 150001 (China)
2015-09-25
Heterogeneous delays with a positive lower bound (gap) are taken into consideration in the Kuramoto model. On the Ott–Antonsen manifold, the dynamical transition from incoherence to coherence is mediated by Hopf bifurcation. We establish a perturbation technique on the complex domain, by which universal normal forms, stability and criticality of the Hopf bifurcation are obtained. Theoretically, a hysteresis loop is found near the subcritically bifurcated coherent state. With respect to Gamma-distributed delay with fixed mean and variance, we find that a large gap decreases the Hopf bifurcation value, induces supercritical bifurcations, avoids the hysteresis loop and significantly increases the number of coexisting coherent states. The effect of the gap is finally interpreted from the viewpoint of excess kurtosis of the Gamma distribution. - Highlights: • Heterogeneously delay-coupled Kuramoto model with minimal delay is considered. • Perturbation technique on complex domain is established for bifurcation analysis. • Hysteresis phenomenon is investigated in a theoretical way. • The effect of excess kurtosis of distributed delays is discussed.
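The kurtosis interpretation can be made concrete under one simple assumption: model the delay as τ = gap + X with X Gamma-distributed, holding the overall mean m and variance v fixed. Then the Gamma part has shape k = (m - gap)²/v, and the excess kurtosis 6/k (unchanged by the shift) grows as the gap increases. The parameter values below are illustrative, not taken from the paper:

```python
def gamma_gap_params(mean, var, gap):
    """Shape and scale of the Gamma part of a shifted (gap + Gamma) delay
    with fixed overall mean and variance (an illustrative assumption)."""
    shape = (mean - gap) ** 2 / var
    scale = var / (mean - gap)
    return shape, scale

def excess_kurtosis(shape):
    """Excess kurtosis of a Gamma distribution; a shift leaves it unchanged."""
    return 6.0 / shape

k0, _ = gamma_gap_params(2.0, 0.5, 0.0)  # no gap
k1, _ = gamma_gap_params(2.0, 0.5, 1.0)  # gap = 1
```

With the mean and variance pinned, a larger gap forces a smaller shape parameter and hence a heavier-tailed (larger excess kurtosis) delay distribution, which is the direction of the effect discussed in the abstract.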
Carpenter, Donald A.
2008-01-01
Confusion exists among database textbooks as to the goal of normalization as well as to which normal form a designer should aspire. This article discusses such discrepancies with the intention of simplifying normalization for both teacher and student. This author's industry and classroom experiences indicate such simplification yields quicker…
Broer, H.; Hoveijn, I.; Lunter, G.; Vegter, G.
2003-01-01
The Birkhoff normal form procedure is a widely used tool for approximating a Hamiltonian system by a simpler one. This chapter starts out with an introduction to Hamiltonian mechanics, followed by an explanation of the Birkhoff normal form procedure. Finally we discuss several algorithms for
Devi, V. Kalpana; Baskar, R.; Varalakshmi, P.
1993-01-01
The effect of Musa paradisiaca stem kernel juice was investigated in experimental urolithiatic rats. Stone-forming rats exhibited a significant elevation in the activities of two oxalate-synthesizing enzymes, glycollic acid oxidase and lactate dehydrogenase. Deposition and excretion of stone-forming constituents in kidney and urine were also increased in these rats. The enzyme activities and the level of crystalline components were lowered with the extract treatment. The extract also reduced the activities of urinary alkaline phosphatase, lactate dehydrogenase, γ-glutamyl transferase, inorganic pyrophosphatase and β-glucuronidase in calculogenic rats. No appreciable changes were noticed with leucine aminopeptidase activity in treated rats. PMID:22556626
Dănăilă, E.; Benea, L.
2017-06-01
The tribocorrosion behaviour of Ti-10Zr alloy and of a porous TiO2-ZrO2 thin film electrochemically formed on Ti-10Zr alloy was evaluated in Fusayama-Mayer artificial saliva solution. Tribocorrosion experiments were performed using a unidirectional pin-on-disc experimental set-up which was mechanically and electrochemically instrumented, under various loading conditions. The effect of applied normal force on the tribocorrosion performance of the tested materials was determined. Open circuit potential (OCP) measurements performed before, during and after sliding tests were applied in order to determine the tribocorrosion degradation. The applied normal force was found to greatly affect the potential during tribocorrosion experiments, an increase in the normal force inducing a decrease in potential and accelerating the depassivation of the materials studied. The results show a decrease in friction coefficient with gradually increasing normal load. It was proved that the porous TiO2-ZrO2 thin film electrochemically formed on Ti-10Zr alloy leads to an improvement in tribocorrosion resistance compared to non-anodized Ti-10Zr alloy intended for biomedical applications.
A Hard X-Ray Study of the Normal Star-Forming Galaxy M83 with NuSTAR
Yukita, M.; Hornschemeier, A. E.; Lehmer, B. D.
2016-01-01
We present the results from sensitive, multi-epoch NuSTAR observations of the late-type star-forming galaxy M83 (d = 4.6 Mpc). This is the first investigation to spatially resolve the hard (E > 10 keV) X-ray emission of this galaxy. The nuclear region and ~20 off-nuclear point sources, including a previously discovered ultraluminous X-ray source, are detected in our NuSTAR observations. The X-ray hardnesses and luminosities of the majority of the point sources are consistent with hard X-ray sources resolved in the starburst galaxy NGC 253. We infer that the hard X-ray emission is most…
Kawai, Shinnosuke; Komatsuzaki, Tamiki
2009-12-14
We present a novel theory which enables us to explore the mechanism of reaction selectivity and robust functions in complex systems persisting under thermal fluctuation. The theory constructs a nonlinear coordinate transformation so that the equation of motion for the new reaction coordinate is independent of the other nonreactive coordinates in the presence of thermal fluctuation. In this article we suppose that reacting systems subject to thermal noise are described by a multidimensional Langevin equation without a priori assumption for the form of potential. The reaction coordinate is composed not only of all the coordinates and velocities associated with the system (solute) but also of the random force exerted by the environment (solvent) with friction constants. The sign of the reaction coordinate at any instantaneous moment in the region of a saddle determines the fate of the reaction, i.e., whether the reaction will proceed through to the products or go back to the reactants. By assuming the statistical properties of the random force, one can know a priori a well-defined boundary of the reaction which separates the full position-velocity space in the saddle region into mainly reactive and mainly nonreactive regions even under thermal fluctuation. The analytical expression of the reaction coordinate provides the firm foundation on the mechanism of how and why reaction proceeds in thermal fluctuating environments.
ω→π⁰γ* and ϕ→π⁰γ* transition form factors in dispersion theory
Schneider, Sebastian P.; Kubis, Bastian; Niecknig, Franz
2012-09-01
We calculate the ω→π⁰γ* and ϕ→π⁰γ* electromagnetic transition form factors based on dispersion theory, relying solely on a previous dispersive analysis of the corresponding three-pion decays and the pion vector form factor. We compare our findings to recent measurements of the ω→π⁰μ⁺μ⁻ decay spectrum by the NA60 collaboration, and strongly encourage experimental investigation of the Okubo-Zweig-Iizuka-forbidden ϕ→π⁰ℓ⁺ℓ⁻ decays in order to understand the strong deviations from vector-meson dominance found in these transition form factors.
Bateman, Grant A. [John Hunter Hospital, Department of Medical Imaging, Newcastle (Australia); Levi, Christopher R.; Wang, Yang; Lovett, Elizabeth C. [Hunter Medical Research Institute, Clinical Neurosciences Program, Newcastle (Australia); Schofield, Peter [James Fletcher Hospital, Neuropsychiatry Unit, Newcastle (Australia)
2005-10-01
Variable results are obtained from the treatment of normal pressure hydrocephalus (NPH) by shunt insertion. There is a high correlation between NPH and the pathology of Alzheimer's disease (AD) on brain biopsy. There is an overlap between AD and vascular dementia (VaD), suggesting that a correlation exists between NPH and other forms of dementia. This study seeks to (1) understand the physiological factors behind, and (2) define the ability of, the aqueduct stroke volume to exclude dementia co-morbidity. Twenty-four patients from a dementia clinic were classified as having either early AD or VaD on the basis of clinical features, Hachinski score and neuropsychological testing. They were compared with 16 subjects with classical clinical findings of NPH and 12 age-matched non-cognitively impaired subjects. MRI flow quantification was used to measure aqueduct stroke volume and arterial pulse volume. An arterio-cerebral compliance ratio was calculated from the two volumes in each patient. The aqueduct stroke volume was elevated in all three forms of dementia, with no significant difference noted between the groups. The arterial pulse volume was elevated by 24% in VaD and reduced by 35% in NPH, compared to normal (P=0.05 and P=0.002, respectively), and was normal in AD. There was a spectrum of relative compliance with normal compliance in VaD and reduced compliance in AD and NPH. The aqueduct stroke volume depends on the arterial pulse volume and the relative compliance between the arterial tree and brain. The aqueduct stroke volume cannot exclude significant co-morbidity in NPH. (orig.)
Bateman, Grant A.; Levi, Christopher R.; Wang, Yang; Lovett, Elizabeth C.; Schofield, Peter
2005-01-01
Variable results are obtained from the treatment of normal pressure hydrocephalus (NPH) by shunt insertion. There is a high correlation between NPH and the pathology of Alzheimer's disease (AD) on brain biopsy. There is an overlap between AD and vascular dementia (VaD), suggesting that a correlation exists between NPH and other forms of dementia. This study seeks to (1) understand the physiological factors behind, and (2) define the ability of, the aqueduct stroke volume to exclude dementia co-morbidity. Twenty-four patients from a dementia clinic were classified as having either early AD or VaD on the basis of clinical features, Hachinski score and neuropsychological testing. They were compared with 16 subjects with classical clinical findings of NPH and 12 age-matched non-cognitively impaired subjects. MRI flow quantification was used to measure aqueduct stroke volume and arterial pulse volume. An arterio-cerebral compliance ratio was calculated from the two volumes in each patient. The aqueduct stroke volume was elevated in all three forms of dementia, with no significant difference noted between the groups. The arterial pulse volume was elevated by 24% in VaD and reduced by 35% in NPH, compared to normal (P=0.05 and P=0.002, respectively), and was normal in AD. There was a spectrum of relative compliance with normal compliance in VaD and reduced compliance in AD and NPH. The aqueduct stroke volume depends on the arterial pulse volume and the relative compliance between the arterial tree and brain. The aqueduct stroke volume cannot exclude significant co-morbidity in NPH. (orig.)
Marinez Carrillo, Irma
2003-08-01
This thesis investigates the application of parameter perturbation methods from nonlinear dynamical systems theory to the study of small-signal stability of electric power systems. The work centers on determining two fundamental aspects of interest in the study of the nonlinear dynamic behavior of the system: the characterization and quantification of the degree of nonlinear interaction between the fundamental oscillation modes of the system, and the study of the modes with greatest influence on the response of the system to small disturbances. With these objectives, a general mathematical model, based on the power-series expansion of the nonlinear model of the power system and the theory of normal forms of vector fields, is proposed for the study of the dynamic behavior of the power system. The proposed tool generalizes the existing methods in the literature to consider higher-order effects in the dynamic model of the power system. Starting from this representation, a methodology is proposed to obtain closed-form analytical solutions, and the extension of existing methods is investigated to identify and quantify the degree of interaction among the fundamental oscillation modes of the system. The developed tool allows, from closed-form analytical expressions, the development of analytical measures to evaluate the degree of stress in the system, the interaction between the fundamental oscillation modes, and the determination of stability boundaries. The conceptual development of the method proposed in this thesis offers, on the other hand, great flexibility to incorporate detailed models of the power system and to evaluate diverse measures of nonlinear modal interaction. Finally, results are presented from the application of the proposed analysis method to the study of nonlinear dynamic behavior in a machine-infinite-bus system considering different degrees of modeling detail
Badreddine, Houssem; Saanouni, Khemaies; Dogui, Abdelwaheb
2007-01-01
In this work, an improved material model is proposed that shows good agreement with experimental data for both hardening curves and plastic strain ratios in uniaxial and equibiaxial proportional loading paths for steel sheet metal up to final fracture. The model is based on a non-associative, non-normal flow rule using two different orthotropic equivalent stresses in the yield criterion and the plastic potential functions. For the plastic potential, the classical quadratic Hill (1948) equivalent stress is considered, while the yield criterion uses the non-quadratic Karafillis and Boyce (1993) equivalent stress, taking into account nonlinear mixed (kinematic and isotropic) hardening. Applications are made to hydro-bulging tests using both circular and elliptical dies. The results obtained with different particular cases of the model, such as the normal quadratic and the non-normal non-quadratic cases, are compared and discussed with respect to the experimental results
Aydin, Aydan
2016-01-01
This study aims at developing an assessment scale for identifying preschool children's communication skills, at distinguishing children with communication deficiencies and at comparing the communication skills of children with normal development (ND) and those with autism spectrum disorder (ASD). Participants were 427 children of up to 6 years of…
Zachariae, R; Kristensen, J S; Hokland, P
1991-01-01
The present study measured the effects of relaxation and guided imagery on cellular immune function. During a period of 10 days 10 healthy subjects were given one 1-hour relaxation procedure and one combined relaxation and guided imagery procedure, instructing the subjects to imagine their immune...... on the immune defense and could form the basis of further studies on psychological intervention and immunological status.
Croutze, Roger; Jomha, Nadr; Uludag, Hasan; Adesida, Adetola
2013-12-13
Limited intrinsic healing potential of the meniscus and a strong correlation between meniscal injury and osteoarthritis have prompted investigation of surgical repair options, including the implantation of functional bioengineered constructs. Cell-based constructs appear promising; however, the generation of meniscal constructs is complicated by the presence of diverse cell populations within this heterogeneous tissue and by gaps in the information concerning their response to manipulation of oxygen tension during cell culture. Four human lateral menisci were harvested from patients undergoing total knee replacement. Inner and outer meniscal fibrochondrocytes (MFCs) were expanded to passage 3 in growth medium supplemented with basic fibroblast growth factor (FGF-2), then embedded in porous collagen type I scaffolds and chondrogenically stimulated with transforming growth factor β3 (TGF-β3) under 21% (normal or normoxic) or 3% (hypoxic) oxygen tension for 21 days. Following scaffold culture, constructs were analyzed biochemically for glycosaminoglycan production, histologically for deposition of extracellular matrix (ECM), as well as at the molecular level for expression of characteristic mRNA transcripts. Constructs cultured under normal oxygen tension expressed higher levels of collagen type II (p = 0.05) and aggrecan (p < 0.05) than constructs cultured under hypoxic oxygen tension. There was no significant difference in expression of these genes between scaffolds seeded with MFCs isolated from inner or outer regions of the tissue following 21 days of chondrogenic stimulation (p > 0.05). Cells isolated from inner and outer regions of the human meniscus demonstrated equivalent differentiation potential toward the chondrogenic phenotype and ECM production. Oxygen tension played a key role in modulating the redifferentiation of meniscal fibrochondrocytes on a 3D collagen scaffold in vitro.
Gildberg, Frederik Alkier; Bradley, Stephen K.; Fristed, Peter Billeskov
2012-01-01
Forensic psychiatry is an area of priority for the Danish Government. As the field expands, this calls for increased knowledge about mental health nursing practice, as this is part of the forensic psychiatry treatment offered. However, only sparse research exists in this area. The aim of this study...... was to investigate the characteristics of forensic mental health nursing staff interaction with forensic mental health inpatients and to explore how staff give meaning to these interactions. The project included 32 forensic mental health staff members, with over 307 hours of participant observations, 48 informal....... The intention is to establish a trusting relationship to form behaviour and perceptual-corrective care, which is characterized by staff's endeavours to change, halt, or support the patient's behaviour or perception in relation to staff's perception of normality. The intention is to support and teach the patient...
Boucherie, Alexandra; Castex, Dominique; Polet, Caroline; Kacki, Sacha
2017-01-01
Harris lines (HLs) are defined as transverse, mineralized lines associated with temporary growth arrest. In paleopathology, HLs are used to reconstruct the health status of past populations. However, their etiology is still obscure. The aim of this article is to test the reliability of HLs as a marker of arrested growth by investigating their incidence on human metrical parameters. The study was performed on 69 individuals (28 adults, 41 subadults) from the Dendermonde plague cemetery (Belgium, 16th century). HLs were rated on distal femora and both ends of tibiae. Overall prevalence and age-at-formation of each detected line were calculated. ANOVA analyses were conducted within the subadult and adult samples to test whether the presence of HLs affected the size and shape parameters of the individuals. At Dendermonde, 52% of the individuals had at least one HL. The age-at-formation was estimated between 5 and 9 years old for the subadults and between 10 and 14 years old for the adults. ANOVA analyses showed that the presence of HLs did not affect the size of the individuals. However, significant differences in shape parameters were highlighted by HL presence. Subadults with HLs displayed slighter shape parameters than the subadults without, whereas the adults with HLs had larger measurements than the adults without. The results suggest that HLs can have a certain impact on shape parameters. The underlying causes can be various, especially for the early-formed HLs. However, HLs deposited around puberty are more likely to be physiological lines reflecting hormonal secretions. Am. J. Hum. Biol. 29:e22885, 2017. © 2016 Wiley Periodicals, Inc.
Dispersion-theoretical analysis of the nucleon electromagnetic form factors
Belushkin, M.
2007-09-29
The structure of the proton and the neutron is of fundamental importance for the study of the strong interaction dynamics over a wide range of momentum transfers. The nucleon form factors encode information on the internal structure of the nucleon as probed by the electromagnetic interaction, and, to a certain extent, reflect the charge and magnetisation distributions within the proton and the neutron. In this thesis we report on our investigation of the electromagnetic form factors of the proton and the neutron with dispersion relation techniques, including known experimental input on the ππ, K anti-K and the ρπ continua and perturbative QCD constraints. We include new experimental data on the pion form factor and the nucleon form factors in our simultaneous analysis of all four form factors in both the space- and the timelike regions for all momentum transfers, and perform Monte Carlo sampling in order to obtain theoretical uncertainty bands. Finally, we discuss the implications of our results on the pion cloud of the nucleon, the nucleon radii and the Okubo-Zweig-Iizuka rule, and present our results of a model-independent approach to estimating two-photon effects in elastic electron-proton scattering. (orig.)
Hornschemeier, A. E.; Heckman, T. M.; Ptak, A. F.; Tremonti, C. A.; Colbert, E. J. M.
2005-01-01
We have cross-correlated X-ray catalogs derived from archival Chandra X-Ray Observatory ACIS observations with a Sloan Digital Sky Survey Data Release 2 (DR2) galaxy catalog to form a sample of 42 serendipitously X-ray-detected galaxies over the redshift interval 0.03
Normalized modes at selected points without normalization
Kausel, Eduardo
2018-04-01
As every textbook on linear algebra demonstrates, the eigenvectors for the general eigenvalue problem |K - λM| = 0 involving two real, symmetric, positive definite matrices K, M satisfy some well-defined orthogonality conditions. Equally well known is the fact that those eigenvectors can be normalized so that their modal mass μ = φ^T Mφ is unity: it suffices to divide each unscaled mode by the square root of its modal mass. Thus, the normalization is the result of an explicit calculation applied to the modes after they were obtained by some means. However, we show herein that the normalized modes are not merely convenient forms of scaling, but are actually intrinsic properties of the pair of matrices K, M; that is, the matrices already "know" about normalization even before the modes have been obtained. This means that we can obtain individual components of the normalized modes directly from the eigenvalue problem, without needing to obtain either all of the modes or, for that matter, any one complete mode. These results are achieved by means of the residue theorem of operational calculus, a finding that is rather remarkable inasmuch as the residues themselves do not make use of any orthogonality conditions or normalization in the first place. It appears that this obscure property connecting the general eigenvalue problem of modal analysis with the residue theorem of operational calculus has been overlooked up until now, but it has interesting theoretical implications.
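The explicit normalization step the abstract contrasts against (divide each unscaled mode by the square root of its modal mass μ = φ^T Mφ) is easy to make concrete. A minimal pure-Python sketch; the function names and example matrices are illustrative choices, not from the paper:

```python
import math

def modal_mass(phi, M):
    """Modal mass mu = phi^T M phi for a mode phi and mass matrix M."""
    n = len(phi)
    return sum(phi[i] * M[i][j] * phi[j] for i in range(n) for j in range(n))

def mass_normalize(phi, M):
    """Scale an unscaled mode so that its modal mass becomes unity."""
    scale = math.sqrt(modal_mass(phi, M))
    return [x / scale for x in phi]
```

The paper's point is that these unit-modal-mass components can instead be read off directly from residues of the resolvent of the pair (K, M), without first computing any complete mode.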
Pino-Otín, M R; Viñas, O; de la Fuente, M A; Juan, M; Font, J; Torradeflot, M; Pallarés, L; Lozano, F; Alberola-Ila, J; Martorell, J
1995-03-15
CD50 (ICAM-3) is a leukocyte differentiation Ag expressed almost exclusively on hemopoietic cells, with a key role in the first steps of immune response. To develop a specific sandwich ELISA to detect a soluble CD50 form (sCD50), two different mAbs (140-11 and 101-1D2) recognizing non-overlapping epitopes were used. sCD50 was detected in the supernatant of stimulated PBMCs, with the highest levels after CD3 triggering. Simultaneously, the CD50 surface expression diminished during the first 24 h. sCD50 isolated from culture supernatant and analyzed by immunoblotting showed an apparent m.w. of 95 kDa, slightly smaller than the membrane form. These data, together with Northern blot kinetics analysis, suggest that sCD50 is cleaved from cell membrane. Furthermore, we detect sCD50 in normal human sera and higher levels in sera of systemic lupus erythematosus (SLE) patients, especially in those in active phase. The sCD50 levels showed a positive correlation with sCD27 levels (r = 0.4213; p = 0.0026). Detection of sCD50, both after in vitro CD3 triggering of PBMCs and increased in SLE sera, suggests that sCD50 could be used as a marker of lymphocyte stimulation.
Fredy Ángel Miguel Amaya Robayo
2010-08-01
It is well known that any context-free grammar can be transformed into Chomsky normal form (CNF) in such a way that the languages generated by the two grammars are equivalent. A grammar in CNF has some advantages: its derivation trees are binary, its rules have a simpler form, and so on. It is therefore always desirable to work with a grammar in CNF in applications that require it. There is an algorithm that transforms a context-free grammar into one in CNF; however, the number of rules generated by the transformation depends on the number of rules in the initial grammar, as well as on other characteristics. This work analyzes, from an experimental and statistical point of view, the relationship between the number of initial rules and the number of rules that result from transforming a context-free grammar into CNF. This makes it possible to plan the amount of computational resources needed when dealing with grammars of some complexity.
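Much of the rule-count growth that this study measures comes from the binarization step of the CNF transformation. A minimal sketch of that step in Python; the grammar encoding and the fresh-nonterminal naming scheme (X0, X1, ...) are illustrative choices, not taken from the paper, and the other CNF steps (epsilon and unit rule elimination, terminal promotion) are omitted:

```python
def binarize(rules):
    """Replace every rule whose right-hand side has more than two
    symbols with a chain of binary rules, introducing fresh
    nonterminals. Rules are (lhs, rhs) pairs with rhs a symbol list."""
    out = []
    fresh = 0
    for lhs, rhs in rules:
        while len(rhs) > 2:
            new = f"X{fresh}"
            fresh += 1
            out.append((lhs, [rhs[0], new]))  # peel off the first symbol
            lhs, rhs = new, rhs[1:]
        out.append((lhs, rhs))
    return out
```

A right-hand side of k >= 3 symbols becomes k - 1 binary rules, which is one source of the dependence on the initial grammar studied above.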
Keiding, Tina Bering
2012-01-01
understanding of form per se, or, to use an expression from this text, of form as form. This challenge can be reduced to one question: how can design teaching support students in achieving not only the ability to recognize and describe different form-related concepts in existing design (i.e. analytical...
Christodorescu, Mihai; Kinder, Johannes; Jha, Somesh; Katzenbeisser, Stefan; Veith, Helmut
2005-01-01
Malware is code designed for a malicious purpose, such as obtaining root privilege on a host. A malware detector identifies malware and thus prevents it from adversely affecting a host. In order to evade detection by malware detectors, malware writers use various obfuscation techniques to transform their malware. There is strong evidence that commercial malware detectors are susceptible to these evasion tactics. In this paper, we describe the design and implementation of a malware normalizer ...
Perrow, C.
1989-01-01
The author has chosen numerous concrete examples to illustrate the hazardousness inherent in high-risk technologies. Starting with the TMI reactor accident in 1979, he shows that it is not only the nuclear energy sector that bears the risk of 'normal accidents', but also quite a number of other technologies, industrial sectors, and research fields. The author refers to the petrochemical industry, shipping, air traffic, large dams, mining activities, and genetic engineering, showing that due to the complexity of the systems and their manifold, rapidly interacting processes, accidents happen that cannot be thoroughly calculated, and hence are unavoidable. (orig./HP) [de]
Madsen, Louise Sofia; Handberg, Charlotte
2018-01-01
BACKGROUND: The present study explored the reflections on cancer survivorship care of lymphoma survivors in active treatment. Lymphoma survivors have survivorship care needs, yet their participation in cancer survivorship care programs is still reported as low. OBJECTIVE: The aim of this study...... was to understand the reflections on cancer survivorship care of lymphoma survivors to aid the future planning of cancer survivorship care and overcome barriers to participation. METHODS: Data were generated in a hematological ward during 4 months of ethnographic fieldwork, including participant observation and 46...... implying an influence on whether to participate in cancer survivorship care programs. Because of "pursuing normality," 8 of 9 participants opted out of cancer survivorship care programming due to prospects of "being cured" and perceptions of cancer survivorship care as "a continuation of the disease......
Normalization of satellite imagery
Kim, Hongsuk H.; Elman, Gregory C.
1990-01-01
Sets of Thematic Mapper (TM) imagery taken over the Washington, DC metropolitan area during the months of November, March and May were converted into a form of ground reflectance imagery. This conversion was accomplished by adjusting the incident sunlight and view angles and by applying a pixel-by-pixel correction for atmospheric effects. Seasonal color changes of the area can be better observed when such normalization is applied to space imagery taken in time series. In normalized imagery, the grey scale depicts variations in surface reflectance and tonal signature of multi-band color imagery can be directly interpreted for quantitative information of the target.
Bicervical normal uterus with normal vagina | Okeke | Annals of ...
To the best of our knowledge, only a few cases of bicervical normal uterus with normal vagina exist in the literature; one of the cases had an anterior-posterior disposition. This form of uterine abnormality is not explicable by the existing classical theory of müllerian anomalies and suggests that a complex interplay of events ...
Smooth quantile normalization.
Hicks, Stephanie C; Okrah, Kwame; Paulson, Joseph N; Quackenbush, John; Irizarry, Rafael A; Bravo, Héctor Corrada
2018-04-01
Between-sample normalization is a critical step in genomic data analysis to remove systematic bias and unwanted technical variation in high-throughput data. Global normalization methods are based on the assumption that observed variability in global properties is due to technical reasons and are unrelated to the biology of interest. For example, some methods correct for differences in sequencing read counts by scaling features to have similar median values across samples, but these fail to reduce other forms of unwanted technical variation. Methods such as quantile normalization transform the statistical distributions across samples to be the same and assume global differences in the distribution are induced by only technical variation. However, it remains unclear how to proceed with normalization if these assumptions are violated, for example, if there are global differences in the statistical distributions between biological conditions or groups, and external information, such as negative or control features, is not available. Here, we introduce a generalization of quantile normalization, referred to as smooth quantile normalization (qsmooth), which is based on the assumption that the statistical distribution of each sample should be the same (or have the same distributional shape) within biological groups or conditions, but allowing that they may differ between groups. We illustrate the advantages of our method on several high-throughput datasets with global differences in distributions corresponding to different biological conditions. We also perform a Monte Carlo simulation study to illustrate the bias-variance tradeoff and root mean squared error of qsmooth compared to other global normalization methods. A software implementation is available from https://github.com/stephaniehicks/qsmooth.
Normal Pressure Hydrocephalus (NPH)
Normal pressure hydrocephalus is a brain disorder ... Normal pressure hydrocephalus occurs when excess cerebrospinal fluid ...
Extended vector meson dominance model for the baryon octet electromagnetic form factors
Williams, R.A.; Puckett-Truman, C.
1996-01-01
An unresolved issue in the present understanding of nucleon structure is the effect of hidden strangeness on electromagnetic observables such as G_E^n(q^2). Previously, we have shown that G_E^n(q^2) is sensitive to small φNN couplings. A complementary approach for understanding effects due to strangeness content and the Okubo-Zweig-Iizuka (OZI) rule is to investigate the electromagnetic structure of hyperons. We apply Sakurai's universality limit of the SU(3)_F symmetry relations and a prescription based on the OZI rule to calculate the electromagnetic form factors of the baryon octet states (p, n, Λ, Σ+, Σ0, Σ-, Ξ0, Ξ-) within the framework of an extended vector meson dominance model. To provide additional motivation for experimental investigation, we discuss the possibility of extracting the ratio G_M^Λ(q^2)/G_M^{ΣΛ}(q^2) from the Λ/Σ polarization ratio in kaon electroproduction experiments. copyright 1996 The American Physical Society
Domené, Horacio M; Martínez, Alicia S; Frystyk, Jan
2007-01-01
BACKGROUND: In a recently described patient with acid-labile subunit (ALS) deficiency, the inability to form ternary complexes resulted in a marked reduction in circulating total insulin-like growth factor (IGF)-I, whereas skeletal growth was only marginally affected. To further study the role of...
Strel'tsov, V.N.
1992-01-01
The physical sense of three forms of relativity is discussed. The first, the instant form, corresponds in fact to the traditional approach based on the concept of instant distance. The normal form corresponds to the radar formulation, which is based on light, or retarded, distances. The front form in the special case is characterized by 'observable' variables, and the well-known k-coefficient method is its obvious expression. 16 refs
Normalization: A Preprocessing Stage
Patro, S. Gopal Krishna; Sahu, Kishore Kumar
2015-01-01
As we know, normalization is a pre-processing stage for any type of problem statement. Normalization plays an especially important role in fields such as soft computing and cloud computing for manipulating data, for example scaling the range of the data down or up before it is used in a further stage. There are many normalization techniques, namely min-max normalization, z-score normalization, and decimal-scaling normalization. By referring to these normalization techniques we are ...
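The three techniques named in the abstract are standard textbook formulas; a minimal pure-Python sketch of each (generic illustrations, not code from the paper):

```python
import math

def min_max(xs, new_min=0.0, new_max=1.0):
    """Rescale values linearly into [new_min, new_max]."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) * (new_max - new_min) + new_min for x in xs]

def z_score(xs):
    """Center to mean 0 and scale to unit (population) standard deviation."""
    n = len(xs)
    mean = sum(xs) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)
    return [(x - mean) / std for x in xs]

def decimal_scaling(xs):
    """Divide by the smallest power of 10 that maps all values into (-1, 1)."""
    j = math.ceil(math.log10(max(abs(x) for x in xs) + 1))
    return [x / 10 ** j for x in xs]
```

Each maps the same data onto a different target range, which is the "scale down or scale up" role the abstract refers to.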
Manufacturing technology for practical Josephson voltage standards
Kohlmann, Johannes; Kieler, Oliver
2016-01-01
In this contribution we present the manufacturing technology for the fabrication of integrated superconducting Josephson series circuits for voltage standards. First we summarize some foundations of Josephson voltage standards and sketch the concept and layout of the circuits, before we describe the manufacturing technology for modern practical Josephson voltage standards.
Denotational Aspects of Untyped Normalization by Evaluation
Filinski, Andrzej; Rohde, Henning Korsholm
2005-01-01
of soundness (the output term, if any, is in normal form and ß-equivalent to the input term); identification (ß-equivalent terms are mapped to the same result); and completeness (the function is defined for all terms that do have normal forms). We also show how the semantic construction enables a simple yet...... formal correctness proof for the normalization algorithm, expressed as a functional program in an ML-like, call-by-value language. Finally, we generalize the construction to produce an infinitary variant of normal forms, namely Böhm trees. We show that the three-part characterization of correctness...
Mahan, G.D.
1992-01-01
The organizers requested that I give eight lectures on the theory of normal metals, ''with an eye on superconductivity.'' My job was to cover the general properties of metals. The topics were selected according to what the students would need to know for the subsequent lectures on superconductivity; my role was to prepare the groundwork for those later lectures. The problem is that there is not yet a widely accepted theory for the mechanism that pairs the electrons. Many mechanisms have been proposed, with phonons and spin fluctuations having the most followers, so I tried to discuss both topics. I also introduced the tight-binding model for metals, which forms the basis for most of the work on the cuprate superconductors
Beck, Raphaël; Pedrosa, Rozangela Curi; Dejeans, Nicolas; Glorieux, Christophe; Levêque, Philippe; Gallez, Bernard; Taper, Henryk; Eeckhoudt, Stéphane; Knoops, Laurent; Calderon, Pedro Buc; Verrax, Julien
2011-10-01
Numerous studies suggest that generation of oxidative stress could be useful in cancer treatment. In this study, we evaluated, in vitro and in vivo, the antitumor potential of oxidative stress induced by ascorbate/menadione (asc/men). This combination of a reducing agent (ascorbate) and a redox active quinone (menadione) generates redox cycling leading to formation of reactive oxygen species (ROS). Asc/men was tested in several cell types including K562 cells (a stable human-derived leukemia cell line), freshly isolated leukocytes from patients with chronic myeloid leukemia, BaF3 cells (a murine pro-B cell line) transfected with Bcr-Abl and peripheral blood leukocytes derived from healthy donors. Although these latter cells were resistant to asc/men, survival of all the other cell lines was markedly reduced, including the BaF3 cells expressing either wild-type or mutated Bcr-Abl. In a standard in vivo model of subcutaneous tumor transplantation, asc/men provoked a significant delay in the proliferation of K562 and BaF3 cells expressing the T315I mutated form of Bcr-Abl. No effect of asc/men was observed when these latter cells were injected into blood of mice most probably because of the high antioxidant potential of red blood cells, as shown by in vitro experiments. We postulate that cancer cells are more sensitive to asc/men than healthy cells because of their lack of antioxidant enzymes, mainly catalase. The mechanism underlying this cytotoxicity involves the oxidative cleavage of Hsp90 with a subsequent loss of its chaperone function thus leading to degradation of wild-type and mutated Bcr-Abl protein.
Dorine Odongo
COLLABORATING TECHNICAL AGENCIES: EXPRESSION OF INTEREST FORM. • Please read the information provided about the initiative and the eligibility requirements in the Prospectus before completing this application form. • Ensure all the sections of the form are accurately completed and saved in PDF format.
Edixhoven, B.; van der Geer, G.; Moonen, B.; Edixhoven, B.; van der Geer, G.; Moonen, B.
2008-01-01
Modular forms are functions with an enormous amount of symmetry that play a central role in number theory, connecting it with analysis and geometry. They have played a prominent role in mathematics since the 19th century and their study continues to flourish today. Modular forms formed the
Soares, Marcelo B.; Efstratiadis, Argiris
1997-01-01
This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to moderate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library.
Random Generators and Normal Numbers
Bailey, David H.; Crandall, Richard E.
2002-01-01
Pursuant to the authors' previous chaotic-dynamical model for random digits of fundamental constants, we investigate a complementary, statistical picture in which pseudorandom number generators (PRNGs) are central. Some rigorous results are achieved: We establish b-normality for constants of the form $\sum_i 1/(b^{m_i} c^{n_i})$ for certain sequences $(m_i), (n_i)$ of integers. This work unifies and extends previously known classes of explicit normals. We prove that for coprime $b,c>1$ the...
Weissman, S.D.
1989-01-01
The foot may be thought of as a bag of bones tied tightly together and functioning as a unit. The bones are expected to maintain their alignment without causing symptomatology for the patient. The author discusses the normal radiograph: the bones must have normal shape and normal alignment, the density of the soft tissues should be normal, and there should be no fractures, tumors, or foreign bodies
Splittings of free groups, normal forms and partitions of ends
geodesic laminations and show that this space is compact. Many of the ... determined by the partition of ends of ˜M associated to the spheres. In §4, we recall ... As is well-known we can associate to a graph a topological space. Geometrically ...
Nonpolynomial vector fields under the Lotka-Volterra normal form
Hernández-Bermejo, Benito; Fairén, Víctor
1995-02-01
We carry out the generalization of the Lotka-Volterra embedding to flows not explicitly recognizable under the generalized Lotka-Volterra format. The procedure introduces appropriate auxiliary variables, and it is shown how, to a great extent, the final Lotka-Volterra system is independent of their specific definition. Conservation of the topological equivalence during the process is also demonstrated.
Normal forms for characteristic functions on n-ary relations
D.J.N. van Eijck (Jan)
2004-01-01
textabstractFunctions of type (n) are characteristic functions on n-ary relations. Keenan established their importance for natural language semantics, by showing that natural language has many examples of irreducible type (n) functions, i.e., functions of type (n) that cannot be represented as
Griffiths, Paul D.; Batty, Ruth; Connolly, Dan J.A.; Reeves, Michael J.
2009-01-01
The midline structures of the supra-tentorial brain are important landmarks for judging if the brain has formed correctly. In this article, we consider the normal appearances of the corpus callosum, septum pellucidum and fornix as shown on MR imaging in normal and near-normal states. (orig.)
Anomalous normal mode oscillations in semiconductor microcavities
Wang, H. [Univ. of Oregon, Eugene, OR (United States). Dept. of Physics; Hou, H.Q.; Hammons, B.E. [Sandia National Labs., Albuquerque, NM (United States)
1997-04-01
Semiconductor microcavities as a composite exciton-cavity system can be characterized by two normal modes. Under impulsive excitation by a short laser pulse, the optical polarizations associated with the two normal modes have a π phase difference. The total induced optical polarization is then expected to exhibit a sin²(Ωt)-like oscillation, where 2Ω is the normal-mode splitting, reflecting a coherent energy exchange between the exciton and cavity. In this paper the authors present experimental studies of normal mode oscillations using three-pulse transient four-wave mixing (FWM). The result reveals, surprisingly, that when the cavity is tuned far below the exciton resonance, the normal mode oscillation in the polarization is cos²(Ωt)-like, in contrast to what is expected from the simple normal mode model. This anomalous normal mode oscillation reflects the important role of virtual excitation of electronic states in semiconductor microcavities.
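The sin²-oscillation expectation in the abstract follows directly from superposing two equal-amplitude modes whose polarizations differ by a π phase. A minimal numerical check of that identity (an illustration of the stated model, not the authors' analysis):

```python
import cmath
import math

def total_polarization_sq(t, w1, w2):
    """|P(t)|^2 for two equal-amplitude normal modes whose induced
    polarizations differ by a pi phase: P(t) = exp(i*w1*t) - exp(i*w2*t)."""
    p = cmath.exp(1j * w1 * t) - cmath.exp(1j * w2 * t)
    return abs(p) ** 2

# With normal-mode splitting 2*Omega = w1 - w2, expanding |P|^2 gives
# 2 - 2*cos((w1 - w2)*t) = 4*sin(Omega*t)**2, the sin^2 beating above.
```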
Visual Memories Bypass Normalization.
Bloem, Ilona M; Watanabe, Yurika L; Kibbe, Melissa M; Ling, Sam
2018-05-01
How distinct are visual memory representations from visual perception? Although evidence suggests that briefly remembered stimuli are represented within early visual cortices, the degree to which these memory traces resemble true visual representations remains something of a mystery. Here, we tested whether both visual memory and perception succumb to a seemingly ubiquitous neural computation: normalization. Observers were asked to remember the contrast of visual stimuli, which were pitted against each other to promote normalization either in perception or in visual memory. Our results revealed robust normalization between visual representations in perception, yet no signature of normalization occurring between working memory stores-neither between representations in memory nor between memory representations and visual inputs. These results provide unique insight into the nature of visual memory representations, illustrating that visual memory representations follow a different set of computational rules, bypassing normalization, a canonical visual computation.
Haehlen, Peter; Elmiger, Bruno
2000-01-01
The mechanics of the Swiss NPPs' 'come and see' programme 1995-1999 were illustrated in our contributions to all PIME workshops since 1996. Now, after four annual 'waves', all the country has been covered by the NPPs' invitation to dialogue. This makes PIME 2000 the right time to shed some light on one particular objective of this initiative: making nuclear 'normal'. The principal aim of the 'come and see' programme, namely to give the Swiss NPPs 'a voice of their own' by the end of the nuclear moratorium 1990-2000, has clearly been attained and was commented on during earlier PIMEs. It is, however, equally important that Swiss nuclear energy not only made progress in terms of public 'presence', but also in terms of being perceived as a normal part of industry, as a normal branch of the economy. The message that Swiss nuclear energy is nothing but a normal business involving normal people, was stressed by several components of the multi-prong campaign: - The speakers in the TV ads were real - 'normal' - visitors' guides and not actors; - The testimonials in the print ads were all real NPP visitors - 'normal' people - and not models; - The mailings inviting a very large number of associations to 'come and see' activated a typical channel of 'normal' Swiss social life; - Spending money on ads (a new activity for Swiss NPPs) appears to have resulted in being perceived by the media as a normal branch of the economy. Today we feel that the 'normality' message has well been received by the media. In the controversy dealing with antinuclear arguments brought forward by environmental organisations journalists nowadays as a rule give nuclear energy a voice - a normal right to be heard. As in a 'normal' controversy, the media again actively ask themselves questions about specific antinuclear claims, much more than before 1990 when the moratorium started. The result is that in many cases such arguments are discarded by journalists, because they are, e.g., found to be
Normality in Analytical Psychology
Myers, Steve
2013-01-01
Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity. PMID:25379262
Normal modes and continuous spectra
Balmforth, N.J.; Morrison, P.J.
1994-12-01
The authors consider stability problems arising in fluids, plasmas and stellar systems that contain singularities resulting from wave-mean flow or wave-particle resonances. Such resonances lead to singularities in the differential equations determining the normal modes at the so-called critical points or layers. The locations of the singularities are determined by the eigenvalue of the problem, and as a result, the spectrum of eigenvalues forms a continuum. They outline a method to construct the singular eigenfunctions comprising the continuum for a variety of problems
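A standard example of the critical-layer singularity described here (a generic illustration; the paper treats a broader class of problems) is Rayleigh's equation for perturbations of an inviscid shear flow $U(y)$:

```latex
\bigl(U(y) - c\bigr)\left(\varphi'' - k^{2}\varphi\right) - U''(y)\,\varphi = 0 .
```

The equation is singular at any critical point $y_c$ where $U(y_c) = c$; since the eigenvalue $c$ fixes the location of the singularity, every real $c$ in the range of $U$ admits a singular eigenfunction, and these values form the continuous spectrum.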
Nissen, Nina Konstantin; Holm, Lotte; Baarts, Charlotte
2015-01-01
This paper investigates body size ideals and monitoring practices among normal-weight and moderately overweight people; despite not belonging to an extreme BMI category, this group provides us with knowledge about how to prevent future overweight or obesity. Methods: The study is based on in-depth interviews combined with observations. 24 participants were recruited by strategic sampling based on self-reported BMI 18.5-29.9 kg/m2 and socio-demographic factors. Inductive analysis was conducted. Results: Normal-weight and moderately overweight people have clear ideals for their body size. Despite being normal weight or close to this, they construct a variety of practices for monitoring their bodies based on different kinds of calculations of weight and body size, observations of body shape, and measurements of bodily firmness. Biometric measurements are familiar to them, as are health authorities' recommendations.
Normal modified stable processes
Barndorff-Nielsen, Ole Eiler; Shephard, N.
2002-01-01
This paper discusses two classes of distributions, and stochastic processes derived from them: modified stable (MS) laws and normal modified stable (NMS) laws. This extends corresponding results for the generalised inverse Gaussian (GIG) and generalised hyperbolic (GH) or normal generalised inverse Gaussian (NGIG) laws. The wider framework thus established provides, in particular, for added flexibility in the modelling of the dynamics of financial time series, of importance especially as regards OU based stochastic volatility models for equities. In the special case of the tempered stable OU process...
Olmos, C.
1990-05-01
The restricted holonomy group of a Riemannian manifold is a compact Lie group and its representation on the tangent space is a product of irreducible representations and a trivial one. Each of the non-trivial factors is either an orthogonal representation of a connected compact Lie group which acts transitively on the unit sphere, or it is the isotropy representation of a single Riemannian symmetric space of rank ≥ 2. We prove that all these properties are also true for the representation on the normal space of the restricted normal holonomy group of any submanifold of a space of constant curvature. 4 refs
Tachikawa, Enzo
1979-01-01
Release of radioiodine built up during reactor operation presents a potential problem from the standpoint of environmental safety. Among the chemical forms of radioiodine, organic iodides can, depending upon the circumstances, pose the most serious problem because of the difficulty of trapping them and their stability compared to other chemical forms. Furthermore, pellet-cladding interaction (PCI) fuel failures in LWR fuel rods are believed to be stress corrosion cracks caused by an embrittling fission product species, radioiodine. To deal with these problems, knowledge is required of the chemical behavior of radioiodine in and out of fuels, as well as its release behavior from fuels. Here a brief review is given of these aspects, aiming at clearing up the questions still remaining. The data seem to indicate that radioiodine exists in a combined form in fuels. Upon heating slightly irradiated fuels, the iodine atoms are released in a chemical form associated with uranium atoms. Experiments, however, are needed with specimens of higher burnup, where interactions of radioiodine with metallic fission products could be favored. The dominant release mechanism of radioiodine at normal operating temperatures will be diffusion to grain boundaries leading to open surfaces. Radiation-induced internal traps, however, alter the rate of diffusion significantly. The carbon sources of organic iodides formed under various conditions and their formation mechanisms have also been considered. (author)
Normal and Abnormal Behavior in Early Childhood
Spinner, Miriam R.
1981-01-01
Evaluation of normal and abnormal behavior in the period to three years of age involves many variables. Parental attitudes, determined by many factors such as previous childrearing experience, the bonding process, parental psychological status and parental temperament, often influence the labeling of behavior as normal or abnormal. This article describes the forms of crying, sleep and wakefulness, and affective responses from infancy to three years of age.
Advancing Normal Birth: Organizations, Goals, and Research
Hotelling, Barbara A.; Humenick, Sharron S.
2005-01-01
In this column, the support for advancing normal birth is summarized, based on a comparison of the goals of Healthy People 2010, Lamaze International, the Coalition for Improving Maternity Services, and the midwifery model of care. Research abstracts are presented to provide evidence that the midwifery model of care safely and economically advances normal birth. Rates of intervention experienced, as reported in the Listening to Mothers survey, are compared to the forms of care recommended by ...
Møldrup, Claus; Traulsen, Janine Morgall; Almarsdóttir, Anna Birna
2003-01-01
Objective: To consider public perspectives on the use of medicines for non-medical purposes, a usage called medically-enhanced normality (MEN). Method: Examples from the literature were combined with empirical data derived from two Danish research projects: a Delphi internet study and a Telebus...
Kivilevitch, Zvi; Achiron, Reuven; Perlman, Sharon; Gilboa, Yinon
2017-10-01
The aim of the study was to assess the sonographic feasibility of measuring the fetal pancreas and its normal development throughout pregnancy. We conducted a cross-sectional prospective study between 19 and 36 weeks' gestation. The study included singleton pregnancies with normal pregnancy follow-up. The pancreas circumference was measured. The first 90 cases were tested to assess feasibility. Two hundred ninety-seven fetuses of nondiabetic mothers were recruited during a 3-year period. The overall satisfactory visualization rate was 61.6%. The intraobserver and interobserver variability had high intraclass correlation coefficients of 0.964 and 0.967, respectively. A cubic polynomial regression best described the correlation of pancreas circumference with gestational age (r = 0.744). Pancreas circumference percentiles for each week of gestation were calculated. During the study period, we detected 2 cases with overgrowth syndrome and 1 case with an annular pancreas. In this study, we assessed the feasibility of sonography for measuring the fetal pancreas and established a normal reference range for the fetal pancreas circumference throughout pregnancy. This database can be helpful when investigating fetomaternal disorders that can affect its normal development. © 2017 by the American Institute of Ultrasound in Medicine.
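The cubic-regression step can be sketched as follows; the data here are synthetic, and the trend coefficients and noise level are invented for illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
ga = rng.uniform(19.0, 36.0, size=297)                 # gestational age, weeks
trend = 0.01 * ga**3 - 0.5 * ga**2 + 10.0 * ga - 40.0  # arbitrary cubic trend
circ = trend + rng.normal(0.0, 2.0, size=297)          # circumference + noise

coeffs = np.polyfit(ga, circ, deg=3)                   # cubic polynomial regression
fitted = np.polyval(coeffs, ga)
r = np.corrcoef(fitted, circ)[0, 1]                    # goodness of fit
# Week-by-week percentiles could then be read off the fitted curve
# together with the spread of the residuals.
```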
Normal radiographic findings. 4. act. ed.
Moeller, T.B.
2003-01-01
This book can serve the reader in three ways: First, it presents normal findings for all radiographic techniques including KM. Important data which are criteria of normal findings are indicated directly in the pictures and are also explained in full text and in summary form. Secondly, it teaches the systematics of interpreting a picture - how to look at it, what structures to regard in what order, and for what to look in particular. Checklists are presented in each case. Thirdly, findings are formulated in accordance with the image analysis procedure. All criteria of normal findings are defined in these formulations, which make them an important didactic element. (orig.)
Normal radiographic findings. 4. act. ed.; Roentgennormalbefunde
Moeller, T.B. [Gemeinschaftspraxis fuer Radiologie und Nuklearmedizin, Dillingen (Germany)
2003-07-01
This book can serve the reader in three ways: First, it presents normal findings for all radiographic techniques including KM. Important data which are criteria of normal findings are indicated directly in the pictures and are also explained in full text and in summary form. Secondly, it teaches the systematics of interpreting a picture - how to look at it, what structures to regard in what order, and for what to look in particular. Checklists are presented in each case. Thirdly, findings are formulated in accordance with the image analysis procedure. All criteria of normal findings are defined in these formulations, which make them an important didactic element. (orig.)
Chief Editor
2014-09-01
to produce preprints or reprints and translate into languages other than English for sale or free distribution; and (4) the right to republish the work in a collection of articles in any other mechanical or electronic format. We give the corresponding author the right to make necessary changes as per the request of the journal, to handle the rest of the correspondence on our behalf, and to act as the guarantor for the manuscript on our behalf. All persons who have made substantial contributions to the work reported in the manuscript, but who are not contributors, are named in the Acknowledgment and have given me/us their written permission to be named. If I/we do not include an Acknowledgment, that means I/we have not received substantial contributions from non-contributors and no contributor has been omitted.
S No | Authors' Names | Contribution (ICMJE Guidelines: (1) substantial contributions to conception and design, acquisition of data, or analysis and interpretation of data; (2) drafting the article or revising it critically for important intellectual content; and (3) final approval of the version to be published. Authors should meet conditions 1, 2, and 3.) | Signature | Date
Note: All the authors are required to sign independently in this form in the sequence given above. In case an author has left the institution/country and his/her whereabouts are not known, the senior author may sign on his/her behalf, taking the responsibility. No addition, deletion, or change in the sequence of the authorship will be permissible at a later stage without valid reasons and the permission of the Editor. If the authorship is contested at any stage, the article will be either returned or not processed for publication until the issue is resolved. Maximum of up to 4 authors for a short communication and up to 6 authors for an original article.
Confectionery-based dose forms.
Tangso, Kristian J; Ho, Quy Phuong; Boyd, Ben J
2015-01-01
Conventional dosage forms such as tablets, capsules and syrups are prescribed in the normal course of practice. However, concerns about patient preferences and market demands have given rise to the exploration of novel unconventional dosage forms. Among these, confectionery-based dose forms have strong potential to overcome compliance problems. This report will review the availability of these unconventional dose forms used in treating the oral cavity and for systemic drug delivery, with a focus on medicated chewing gums, medicated lollipops, and oral bioadhesive devices. The aim is to stimulate increased interest in the opportunities for innovative new products that are available to formulators in this field, particularly for atypical patient populations.
Idiopathic Normal Pressure Hydrocephalus
Basant R. Nassar BS
2016-04-01
Idiopathic normal pressure hydrocephalus (iNPH) is a potentially reversible neurodegenerative disease commonly characterized by a triad of dementia, gait disturbance, and urinary disturbance. Advancements in diagnosis and treatment have aided in properly identifying and improving symptoms in patients. However, a large proportion of iNPH patients remain either undiagnosed or misdiagnosed. Articles for this review were obtained using the PubMed search engine with the keywords "normal pressure hydrocephalus," "diagnosis," "shunt treatment," "biomarkers," "gait disturbances," "cognitive function," "neuropsychology," "imaging," and "pathogenesis." The majority of the articles were from the past 10 years. The purpose of this review article is to aid general practitioners in further understanding current findings on the pathogenesis, diagnosis, and treatment of iNPH.
Ipsen, David Hojland; Tveden-Nyborg, Pernille; Lykkesfeldt, Jens
2016-01-01
Objective: The liver coordinates lipid metabolism and may play a vital role in the development of dyslipidemia, even in the absence of obesity. Normal weight dyslipidemia (NWD) and patients with nonalcoholic fatty liver disease (NAFLD) who do not have obesity constitute a unique subset of individuals characterized by dyslipidemia and metabolic deterioration. This review examined the available literature on the role of the liver in dyslipidemia and the metabolic characteristics of patients with NAFLD who do not have obesity. Methods: PubMed was searched using the following keywords: nonobese, dyslipidemia, NAFLD, NWD, liver, and metabolically obese/unhealthy normal weight. Additionally, article bibliographies were screened, and relevant citations were retrieved. Studies were excluded if they had not measured relevant biomarkers of dyslipidemia. Results: NWD and NAFLD without obesity share a similar...
Lyerly, Anne Drapkin
2012-12-01
The concept of "normal birth" has been promoted as ideal by several international organizations, although debate about its meaning is ongoing. In this article, I examine the concept of normalcy to explore its ethical implications and raise a trio of concerns. First, in its emphasis on nonuse of technology as a goal, the concept of normalcy may marginalize women for whom medical intervention is necessary or beneficial. Second, in its emphasis on birth as a socially meaningful event, the mantra of normalcy may unintentionally divert attention from meaning in medically complicated births. Third, the emphasis on birth as a normal and healthy event may be a contributor to the long-standing tolerance for the dearth of evidence guiding the treatment of illness during pregnancy and the failure to responsibly and productively engage pregnant women in health research. Given these concerns, it is worth debating not just what "normal birth" means, but whether the term as an ideal earns its keep. © 2012, Copyright the Authors Journal compilation © 2012, Wiley Periodicals, Inc.
10 CFR 71.71 - Normal conditions of transport.
2010-01-01
10 CFR Part 71 (Package, Special Form, and LSA-III Tests), § 71.71 Normal conditions of transport. (a) Evaluation. Evaluation of each package design under normal conditions of transport must include a determination of the effect on...
Masturbation, sexuality, and adaptation: normalization in adolescence.
Shapiro, Theodore
2008-03-01
During adolescence the central masturbation fantasy that is formulated during childhood takes its final form and paradoxically must now be directed outward for appropriate object finding and pair matching in the service of procreative aims. This is a step in adaptation that requires a further developmental landmark that I have called normalization. The path toward airing these private fantasies is facilitated by chumship relationships as a step toward further exposure to the social surround. Hartmann's structuring application of adaptation within psychoanalysis is used as a framework for understanding the process that simultaneously serves intrapsychic and social demands and permits goals that follow evolutionary principles. Variations in the normalization process from masturbatory isolation to a variety of forms of sexual socialization are examined in sociological data concerning current adolescent sexual behavior and in case examples that indicate some routes to normalized experience and practice.
Strength of Gamma Rhythm Depends on Normalization
Ray, Supratim; Ni, Amy M.; Maunsell, John H. R.
2013-01-01
Neuronal assemblies often exhibit stimulus-induced rhythmic activity in the gamma range (30–80 Hz), whose magnitude depends on the attentional load. This has led to the suggestion that gamma rhythms form dynamic communication channels across cortical areas processing the features of behaviorally relevant stimuli. Recently, attention has been linked to a normalization mechanism, in which the response of a neuron is suppressed (normalized) by the overall activity of a large pool of neighboring neurons. In this model, attention increases the excitatory drive received by the neuron, which in turn also increases the strength of normalization, thereby changing the balance of excitation and inhibition. Recent studies have shown that gamma power also depends on such excitatory–inhibitory interactions. Could modulation in gamma power during an attention task be a reflection of the changes in the underlying excitation–inhibition interactions? By manipulating the normalization strength independent of attentional load in macaque monkeys, we show that gamma power increases with increasing normalization, even when the attentional load is fixed. Further, manipulations of attention that increase normalization increase gamma power, even when they decrease the firing rate. Thus, gamma rhythms could be a reflection of changes in the relative strengths of excitation and normalization rather than playing a functional role in communication or control. PMID:23393427
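The normalization mechanism referred to here has a standard divisive form, sketched below in the spirit of divisive-normalization models; all parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def normalized_response(drive, pool, sigma=1.0):
    """Divisive normalization: a unit's excitatory drive is divided by the
    summed activity of its normalization pool plus a semisaturation constant."""
    return drive / (sigma + pool.sum())

drive = np.array([5.0, 2.0, 1.0])        # excitatory drives (arbitrary units)
weak_pool = np.array([1.0, 1.0])         # low surrounding activity
strong_pool = np.array([4.0, 4.0])       # high surrounding activity

r_weak = normalized_response(drive, weak_pool)
r_strong = normalized_response(drive, strong_pool)
# Increasing the pool's activity suppresses (normalizes) every response.
assert np.all(r_strong < r_weak)
```

In the study's terms, manipulating the pool strength while holding attentional load fixed corresponds to varying `pool` while keeping `drive` constant.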
Densified waste form and method for forming
Garino, Terry J.; Nenoff, Tina M.; Sava Gallis, Dorina Florentina
2015-08-25
Materials and methods for making densified waste forms for temperature-sensitive waste material, such as nuclear waste, using low-temperature processing with a metallic powder that forms the matrix encapsulating the temperature-sensitive waste material. The densified waste form comprises a temperature-sensitive waste material in a physically densified matrix, the matrix being a compacted metallic powder. The method for forming the densified waste form includes mixing a metallic powder and a temperature-sensitive waste material to form a waste-form precursor, which is then compacted with sufficient pressure to densify it and encapsulate the temperature-sensitive waste material in a physically densified matrix.
2010-07-01
... OSHA 300 Log. Instead, enter "privacy case" in the space normally used for the employee's name. ... (a) Basic requirement. You must use OSHA 300, 300-A, and 301 forms, or equivalent forms, for recordable injuries and illnesses. The OSHA 300 form is called the Log of Work-Related Injuries and Illnesses, the 300...
Short proofs of strong normalization
Wojdyga, Aleksander
2008-01-01
This paper presents simple, syntactic strong normalization proofs for the simply-typed lambda-calculus and the polymorphic lambda-calculus (System F) with the full set of logical connectives and all the permutative reductions. The normalization proofs use translations of terms and types into systems for which the strong normalization property is already known.
Weak convergence and uniform normalization in infinitary rewriting
Simonsen, Jakob Grue
2010-01-01
We prove the starkly surprising result that, for any orthogonal system with finitely many rules, the system is weakly normalizing under weak convergence iff it is strongly normalizing under weak convergence iff it is weakly normalizing under strong convergence iff it is strongly normalizing under strong convergence. As further corollaries, we derive a number of new results for weakly convergent rewriting: systems with finitely many rules enjoy unique normal forms, and acyclic orthogonal systems are confluent. Our results suggest that it may be possible to recover some of the positive results for strongly...
Normalized Excited Squeezed Vacuum State and Its Applications
Meng Xiangguo; Wang Jisuo; Liang Baolong
2007-01-01
By using the intermediate coordinate-momentum representation in quantum optics and a generating function for the normalization of the excited squeezed vacuum state (ESVS), the normalized ESVS is obtained. We find that the normalization constants obtained via the two new methods agree, and take a new form different from the result obtained by Zhang and Fan [Phys. Lett. A 165 (1992) 14]. By virtue of the normalization constant of the ESVS and the intermediate coordinate-momentum representation, the tomogram of the normalized ESVS and some useful formulae are derived.
Visual attention and flexible normalization pools
Schwartz, Odelia; Coen-Cagli, Ruben
2013-01-01
Attention to a spatial location or feature in a visual scene can modulate the responses of cortical neurons and affect perceptual biases in illusions. We add attention to a cortical model of spatial context based on a well-founded account of natural scene statistics. The cortical model amounts to a generalized form of divisive normalization, in which the surround is in the normalization pool of the center target only if they are considered statistically dependent. Here we propose that attention influences this computation by accentuating the neural unit activations at the attended location, and that the amount of attentional influence of the surround on the center thus depends on whether center and surround are deemed in the same normalization pool. The resulting form of model extends a recent divisive normalization model of attention (Reynolds & Heeger, 2009). We simulate cortical surround orientation experiments with attention and show that the flexible model is suitable for capturing additional data and makes nontrivial testable predictions. PMID:23345413
Selective attention in normal and impaired hearing.
Shinn-Cunningham, Barbara G; Best, Virginia
2008-12-01
A common complaint among listeners with hearing loss (HL) is that they have difficulty communicating in common social settings. This article reviews how normal-hearing listeners cope in such settings, especially how they focus attention on a source of interest. Results of experiments with normal-hearing listeners suggest that the ability to selectively attend depends on the ability to analyze the acoustic scene and to form perceptual auditory objects properly. Unfortunately, sound features important for auditory object formation may not be robustly encoded in the auditory periphery of HL listeners. In turn, impaired auditory object formation may interfere with the ability to filter out competing sound sources. Peripheral degradations are also likely to reduce the salience of higher-order auditory cues such as location, pitch, and timbre, which enable normal-hearing listeners to select a desired sound source out of a sound mixture. Degraded peripheral processing is also likely to increase the time required to form auditory objects and focus selective attention so that listeners with HL lose the ability to switch attention rapidly (a skill that is particularly important when trying to participate in a lively conversation). Finally, peripheral deficits may interfere with strategies that normal-hearing listeners employ in complex acoustic settings, including the use of memory to fill in bits of the conversation that are missed. Thus, peripheral hearing deficits are likely to cause a number of interrelated problems that challenge the ability of HL listeners to communicate in social settings requiring selective attention.
Basic characterization of normal multifocal electroretinogram
Fernandez Cherkasova, Lilia; Rojas Rondon, Irene; Castro Perez, Pedro Daniel; Lopez Felipe, Daniel; Santiesteban Freixas, Rosaralis; Mendoza Santiesteban, Carlos E
2008-01-01
A scientific literature review was made on the novel multifocal electroretinogram technique, the cell mechanisms involved, and some of the factors modifying its results, together with the form of presentation. The basic characteristics of this electrophysiological record, obtained from several regions of the retina of normal subjects, are important for creating a small-scale comparative database against which pathological eye tracings can be evaluated. All this will greatly help in early, less invasive electrodiagnosis of localized retinal lesions. (Author)
Normalization in Lie algebras via mould calculus and applications
Paul, Thierry; Sauzin, David
2017-11-01
We establish Écalle's mould calculus in an abstract Lie-theoretic setting and use it to solve a normalization problem, which covers several formal normal form problems in the theory of dynamical systems. The mould formalism allows us to reduce the Lie-theoretic problem to a mould equation, the solutions of which are remarkably explicit and can be fully described by means of a gauge transformation group. The dynamical applications include the construction of Poincaré-Dulac formal normal forms for a vector field around an equilibrium point, a formal infinite-order multiphase averaging procedure for vector fields with fast angular variables (Hamiltonian or not), or the construction of Birkhoff normal forms both in classical and quantum situations. As a by-product we obtain, in the case of harmonic oscillators, the convergence of the quantum Birkhoff form to the classical one, without any Diophantine hypothesis on the frequencies of the unperturbed Hamiltonians.
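For reference, the Poincaré-Dulac normal form that appears among the applications can be stated in its standard textbook form (our notation, independent of the mould formalism used in the paper):

```latex
% Vector field with diagonal linear part at an equilibrium:
\dot{x} = Ax + f(x), \qquad
A = \operatorname{diag}(\lambda_1,\dots,\lambda_n), \qquad f(x) = O(|x|^2).
% A formal change of coordinates x = y + h(y) yields the normal form
\dot{y} = Ay + g(y),
% where g contains only the resonant monomials y^m e_j (|m| >= 2) with
\langle m, \lambda\rangle = m_1\lambda_1 + \cdots + m_n\lambda_n = \lambda_j .
```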
Harmonic Maass forms and mock modular forms
Bringmann, Kathrin; Ono, Ken
2017-01-01
Modular forms and Jacobi forms play a central role in many areas of mathematics. Over the last 10-15 years, this theory has been extended to certain non-holomorphic functions, the so-called "harmonic Maass forms". The first glimpses of this theory appeared in Ramanujan's enigmatic last letter to G. H. Hardy written from his deathbed. Ramanujan discovered functions he called "mock theta functions" which over eighty years later were recognized as pieces of harmonic Maass forms. This book contains the essential features of the theory of harmonic Maass forms and mock modular forms, together with a wide variety of applications to algebraic number theory, combinatorics, elliptic curves, mathematical physics, quantum modular forms, and representation theory.
Park, Jong Suk; Kang, Ung Gu
2016-02-01
Traditionally, delusions have been considered to be the products of misinterpretation and irrationality. However, some theorists have argued that delusions are normal or rational cognitive responses to abnormal experiences. That is, when a recently experienced peculiar event is more plausibly explained by an extraordinary hypothesis, confidence in the veracity of this extraordinary explanation is reinforced. As the number of such experiences, driven by the primary disease process in the perceptual domain, increases, this confidence builds and solidifies, forming a delusion. We tried to understand the formation of delusions using a simulation based on Bayesian inference. We found that (1) even if a delusional explanation is only marginally more plausible than a non-delusional one, the repetition of the same experience results in a firm belief in the delusion. (2) The same process explains the systematization of delusions. (3) If the perceived plausibility of the explanation is not consistent but varies over time, the development of a delusion is delayed. Additionally, this model may explain why delusions are not corrected by persuasion or rational explanation. This Bayesian inference perspective can be considered a way to understand delusions in terms of rational human heuristics. However, such experiences of "rationality" can lead to irrational conclusions, depending on the characteristics of the subject. Copyright © 2015 Elsevier Ltd. All rights reserved.
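The core of the simulation described above is iterated Bayesian updating: the posterior after one experience becomes the prior for the next. The sketch below is our reconstruction from the abstract, with invented probabilities; it illustrates the paper's finding (1), that a marginally more plausible delusional explanation hardens into near-certainty under repetition.

```python
def delusion_posterior(prior, p_exp_given_delusion, p_exp_given_normal, n):
    """Iterated Bayes' rule: the posterior probability of the delusional
    hypothesis after n repetitions of the same anomalous experience."""
    p = prior
    for _ in range(n):
        evidence = p * p_exp_given_delusion + (1 - p) * p_exp_given_normal
        p = p * p_exp_given_delusion / evidence   # Bayes' rule
    return p

# A delusional explanation only marginally more plausible (0.55 vs 0.45)
# still approaches certainty after repeated identical experiences.
after_1 = delusion_posterior(0.5, 0.55, 0.45, 1)
after_30 = delusion_posterior(0.5, 0.55, 0.45, 30)
```

Since each repetition multiplies the posterior odds by the same likelihood ratio, confidence grows geometrically in n, which is why persuasion against a single instance has little effect on the accumulated belief.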
Group normalization for genomic data.
Ghandi, Mahmoud; Beer, Michael A
2012-01-01
Data normalization is a crucial preliminary step in analyzing genomic datasets. The goal of normalization is to remove global variation to make readings across different experiments comparable. In addition, most genomic loci have non-uniform sensitivity to any given assay because of variation in local sequence properties. In microarray experiments, this non-uniform sensitivity is due to different DNA hybridization and cross-hybridization efficiencies, known as the probe effect. In this paper we introduce a new scheme, called Group Normalization (GN), to remove both global and local biases in one integrated step, whereby we determine the normalized probe signal by finding a set of reference probes with similar responses. Compared to conventional normalization methods such as Quantile normalization and physically motivated probe effect models, our proposed method is general in the sense that it does not require the assumption that the underlying signal distribution be identical for the treatment and control, and is flexible enough to correct for nonlinear and higher order probe effects. The Group Normalization algorithm is computationally efficient and easy to implement. We also describe a variant of the Group Normalization algorithm, called Cross Normalization, which efficiently amplifies biologically relevant differences between any two genomic datasets.
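A minimal sketch of the reference-probe idea follows. This is our reconstruction from the abstract, not the published algorithm: the distance measure, the `k` parameter, and the use of a median are assumptions. Each probe is scaled by the typical signal of probes with similar response profiles, absorbing local probe effects.

```python
import numpy as np

def group_normalize(signal, reference_profiles, k=10):
    """Scale each probe by the median signal of the k probes whose response
    profiles (across a panel of reference experiments) are most similar.
    signal: (n_probes,); reference_profiles: (n_probes, n_experiments)."""
    n = len(signal)
    out = np.empty(n)
    for i in range(n):
        # distance between probe i's reference profile and all other profiles
        dist = np.linalg.norm(reference_profiles - reference_profiles[i], axis=1)
        group = np.argsort(dist)[1:k + 1]   # k most similar probes, excluding i
        out[i] = signal[i] / np.median(signal[group])
    return out
```

Unlike quantile normalization, nothing here assumes that treatment and control share an identical signal distribution; each probe is judged only against its own reference group.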
Skyum, Sven
1978-01-01
This paper continues the study of ETOL forms and good EOL forms done by Maurer, Salomaa and Wood. It is proven that binary very complete ETOL forms exist, good synchronized ETOL forms exist and that no propagating or synchronized ETOL form can be very complete.
Comparison of spectrum normalization techniques for univariate ...
Laser-induced breakdown spectroscopy; univariate study; normalization models; stainless steel; standard error of prediction. Abstract. Analytical performance of six different spectrum normalization techniques, namely internal normalization, normalization with total light, normalization with background along with their ...
Normal matter storage of antiprotons
Campbell, L.J.
1987-01-01
Various simple issues connected with the possible storage of anti p in relative proximity to normal matter are discussed. Although equilibrium storage looks to be impossible, condensed matter systems are sufficiently rich and controllable that nonequilibrium storage is well worth pursuing. Experiments to elucidate the anti p interactions with normal matter are suggested. 32 refs
An optimal way of choosing sample size in an opinion poll is indicated using the normal distribution. Introduction. In this article, the ubiquitous normal distribution is introduced as a convenient approximation for computing binomial probabilities for large values of n. Stirling's formula and the DeMoivre-Laplace theorem ...
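The sample-size calculation that the article builds toward can be sketched with the normal approximation. This is a standard textbook computation, not code from the article; the worst-case choice p = 1/2 maximizes the variance p(1-p).

```python
from math import ceil
from statistics import NormalDist

def poll_sample_size(margin, confidence=0.95, p=0.5):
    """Sample size from the DeMoivre-Laplace normal approximation to the
    binomial: choose n with z*sqrt(p(1-p)/n) <= margin, i.e.
    n >= z^2 p(1-p) / margin^2; p = 1/2 is the worst case."""
    z = NormalDist().inv_cdf(0.5 + confidence / 2)
    return ceil(z * z * p * (1 - p) / margin ** 2)

n_3pt = poll_sample_size(0.03)   # 3-percentage-point margin, 95% confidence
```

This reproduces the familiar poll figures: roughly 1,100 respondents for a 3-point margin and under 400 for a 5-point margin.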
Disjoint sum forms in reliability theory
B. Anrig
2014-01-01
The structure function f of a binary monotone system is assumed to be known and given in a disjunctive normal form, i.e. as the logical union of products of the indicator variables of the states of its subsystems. Based on this representation of f, an improved Abraham algorithm is proposed for generating the disjoint sum form of f. This form is the base for subsequent numerical reliability calculations. The approach is generalized to multivalued systems. Examples are discussed.
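The classical Abraham expansion behind such disjoint sum forms can be sketched as follows. This is our illustration of the basic (unimproved) disjointing step, not the paper's algorithm: each product is intersected with the negation of the earlier products, one missing component at a time, so that the resulting terms are mutually exclusive and their probabilities simply add.

```python
from math import prod

def disjoint_sum(products):
    """Abraham-style disjointing (a sketch): turn the products of a monotone
    DNF (each a set of components that must work) into disjoint terms
    (work, fail) whose probabilities add up to the union's probability."""
    result = []
    for i, p in enumerate(products):
        pending = [(frozenset(p), frozenset())]
        for q in products[:i]:
            expanded = []
            for work, fail in pending:
                if set(q) & set(fail):            # already disjoint from q
                    expanded.append((work, fail))
                    continue
                missing = sorted(set(q) - set(work))
                if not missing:                   # term implies q: fully absorbed
                    continue
                for j, c in enumerate(missing):   # expand (term AND NOT q)
                    expanded.append((work | set(missing[:j]), fail | {c}))
            pending = expanded
        result.extend(pending)
    return result

# Tiny system with min-paths {1,2}, {3,4}, {1,4} and component reliabilities r
terms = disjoint_sum([{1, 2}, {3, 4}, {1, 4}])
r = {1: 0.9, 2: 0.8, 3: 0.7, 4: 0.6}
reliability = sum(prod(r[c] for c in w) * prod(1 - r[c] for c in f)
                  for w, f in terms)
```

Because the terms are disjoint, the final reliability is a plain sum over terms, which is exactly why this form is the base for numerical reliability calculations.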
A Denotational Account of Untyped Normalization by Evaluation
Filinski, Andrzej; Rohde, Henning Korsholm
2004-01-01
We show that the standard normalization-by-evaluation construction for the simply-typed λβη-calculus has a natural counterpart for the untyped λβ-calculus, with the central type-indexed logical relation replaced by a "recursively defined" invariant relation, in the style of Pitts. In fact, the construction can be seen as generalizing a computational adequacy argument for an untyped, call-by-name language to normalization instead of evaluation. In the untyped setting, not all terms have normal forms, so the normalization function is necessarily partial. We establish its correctness in the senses ...
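The shape of untyped normalization by evaluation can be conveyed by a toy rendition: interpret terms into a semantic domain (host-language closures, plus "neutral" stuck terms), then read values back into syntax. This is our own Python sketch with an invented term encoding, not the paper's denotational construction; as the abstract notes, the function is partial and will loop on terms without a normal form (e.g. Omega).

```python
def evaluate(term, env):
    """Interpret an untyped lambda term into the semantic domain: Python
    closures for functions, tuples for neutral (stuck) terms.
    Terms: ('var', x) | ('lam', x, body) | ('app', f, a)."""
    tag = term[0]
    if tag == 'var':
        return env[term[1]]
    if tag == 'lam':
        return lambda v: evaluate(term[2], {**env, term[1]: v})
    f, a = evaluate(term[1], env), evaluate(term[2], env)
    return f(a) if callable(f) else ('app', f, a)

def reify(value, fresh=0):
    """Read a semantic value back into beta-normal syntax, inventing fresh
    variable names to go under binders."""
    if callable(value):
        x = f'x{fresh}'
        return ('lam', x, reify(value(('var', x)), fresh + 1))
    if value[0] == 'app':
        return ('app', reify(value[1], fresh), reify(value[2], fresh))
    return value   # ('var', x)

def normalize(term):
    """Partial: diverges on terms with no beta-normal form."""
    return reify(evaluate(term, {}))
```

Note that reduction happens under binders: applying `reify` to a closure feeds it a fresh neutral variable, so redexes inside lambda bodies are also contracted.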
Singh, S.K.
2002-01-01
The present status of electroweak nucleon form factors and the N-Δ transition form factors is reviewed. In particular, the determination of the dipole mass M_A in the axial vector form factor is discussed
Complete Normal Ordering 1: Foundations
Ellis, John; Skliros, Dimitri P.
2016-01-01
We introduce a new prescription for quantising scalar field theories perturbatively around a true minimum of the full quantum effective action, which is to `complete normal order' the bare action of interest. When the true vacuum of the theory is located at zero field value, the key property of this prescription is the automatic cancellation, to any finite order in perturbation theory, of all tadpole and, more generally, all `cephalopod' Feynman diagrams. The latter are connected diagrams that can be disconnected into two pieces by cutting one internal vertex, with either one or both pieces free from external lines. In addition, this procedure of `complete normal ordering' (which is an extension of the standard field theory definition of normal ordering) reduces by a substantial factor the number of Feynman diagrams to be calculated at any given loop order. We illustrate explicitly the complete normal ordering procedure and the cancellation of cephalopod diagrams in scalar field theories with non-derivative i...
The normal and pathological language
Espejo, Luis D.
2014-01-01
The extraordinary development that normal and pathological psychology has achieved in recent decades, thanks to the dual method of objective observation and oral survey, has enabled the researching spirit of the neuro-psychiatrist to penetrate the intimate mechanism of the nervous system, whose supreme manifestation is thought. It is normal psychology that explains the complicated interplay of perceptions: their methods of transmission, their centers of projection, their transformations and their synthesis to construct ...
Is normal science good science?
Adrianna Kępińska
2015-09-01
“Normal science” is a concept introduced by Thomas Kuhn in The Structure of Scientific Revolutions (1962). In Kuhn's view, normal science means “puzzle solving”: solving problems within the paradigm (the framework most successful in solving current major scientific problems) rather than producing major novelties. This paper examines Kuhnian and Popperian accounts of normal science and their criticisms to assess whether normal science is good. The advantage of normal science according to Kuhn was “psychological”: subjective satisfaction from successful “puzzle solving”. Popper argues for an “intellectual” science, one that consistently refutes conjectures (hypotheses) and offers new ideas rather than focusing on personal advantages. His account is criticized as too impersonal and idealistic. Feyerabend's perspective seems more balanced; he argues for a community that would introduce new ideas, defend old ones, and enable scientists to develop in line with their subjective preferences. The paper concludes that normal science has no one clear-cut set of criteria encompassing its meaning and enabling clear assessment.
nth roots of normal contractions
Duggal, B.P.
1992-07-01
Given a complex separable Hilbert space H and a contraction A on H such that A^n, for some integer n ≥ 2, is normal, it is shown that if the defect operator D_A = (1 - A*A)^(1/2) is of the Hilbert-Schmidt class, then A is similar to a normal contraction; either A or A^2 is normal; and if A^2 is normal (but A is not), then there is a normal contraction N and a positive definite contraction P of trace class such that ||A - N||_1 = (1/2)||P + P||_1 (where ||·||_1 denotes the trace norm). If T is a compact contraction such that its characteristic function admits a scalar factor, if T = A^n for some integer n ≥ 2 and contraction A with simple eigenvalues, and if both T and A satisfy a ''reductive property'', then A is a compact normal contraction. (author). 16 refs
Adaptive municipal electronic forms
Kuiper, Pieternel; van Dijk, Elisabeth M.A.G.; Bondarouk, Tatiana; Ruel, Hubertus Johannes Maria; Guiderdoni-Jourdain, Karine; Oiry, Ewan
Adaptation of electronic forms (e-forms) seems to be a step forward to reduce the burden for people who fill in forms. Municipalities more and more offer e-forms online that can be used by citizens to request a municipal product or service or by municipal employees to place a request on behalf of a
Exchange rate arrangements: From extreme to "normal"
Beker Emilija
2006-01-01
The paper studies the theoretical and empirical dispersion of exchange rate arrangements - rigid, intermediate and flexible regimes - in the context of the extreme arrangements of a currency board, dollarization and monetary union, the moderate characteristics of intermediate arrangements (adjustable pegs, crawling pegs and target zones), and the imperative "normalization" process in the form of a managed or clean floating system. It is established that de iure and de facto classifications generate "fear of floating" and "fear of pegging". The "impossible trinity" under the conditions of capital liberalization and globalization creates a bipolar view, or the hypothesis of vanishing intermediate exchange rate regimes.
Normal gravity field in relativistic geodesy
Kopeikin, Sergei; Vlasov, Igor; Han, Wen-Biao
2018-02-01
Modern geodesy is subject to a dramatic change from the Newtonian paradigm to Einstein's theory of general relativity. This is motivated by the ongoing advance in development of quantum sensors for applications in geodesy, including quantum gravimeters and gradiometers, atomic clocks and fiber optics for making ultra-precise measurements of the geoid and multipolar structure of the Earth's gravitational field. At the same time, very long baseline interferometry, satellite laser ranging, and global navigation satellite systems have achieved an unprecedented level of accuracy in measuring 3-d coordinates of the reference points of the International Terrestrial Reference Frame and the world height system. The main geodetic reference standard to which gravimetric measurements of Earth's gravitational field are referred is a normal gravity field, represented in Newtonian gravity by the field of a uniformly rotating, homogeneous Maclaurin ellipsoid whose mass and quadrupole moment are equal to the total mass and (tide-free) quadrupole moment of Earth's gravitational field. The present paper extends the concept of the normal gravity field from the Newtonian theory to the realm of general relativity. We focus our attention on the calculation of the post-Newtonian approximation of the normal field that is sufficient for current and near-future practical applications. We show that in general relativity the level surface of a homogeneous and uniformly rotating fluid is no longer described by the Maclaurin ellipsoid in the most general case but represents an axisymmetric spheroid of the fourth order with respect to the geodetic Cartesian coordinates. At the same time, admitting a post-Newtonian inhomogeneity of the mass density in the form of concentric elliptical shells allows one to preserve the level surface of the fluid as an exact ellipsoid of rotation. We parametrize the mass density distribution and the level surface with two parameters which are
Analysis tools for precision studies of hadronic three-body decays and transition form factors
Schneider, Sebastian Philipp
2013-01-01
present a calculation of the ππ P-wave inelasticity from ωπ intermediate states. Finally, we extend the framework and discuss the ω/φ→π 0 γ * transition form factor. For that we use the previously determined ω/φ→3π partial-wave amplitude and the well-known pion vector form factor as input. Our findings are compared to recent measurements of ω→π 0 μ + μ - by the NA60 collaboration. We also suggest that a precise measurement of the Okubo-Zweig-Iizuka-forbidden φ→π 0 l + l - decay may help to understand the strong deviations found between recent theoretical determinations and transition form factor data.
Precaval retropancreatic space: Normal anatomy
Lee, Yeon Hee; Kim, Ki Whang; Kim, Myung Jin; Yoo, Hyung Sik; Lee, Jong Tae [Yonsei University College of Medicine, Seoul (Korea, Republic of)
1992-07-15
The authors defined the precaval retropancreatic space as the space between the pancreatic head with the portal vein and the IVC, and analyzed the CT findings of this space to identify its normal structures and size. We retrospectively evaluated 100 normal abdominal CT scans to find the normal anatomic structures of the precaval retropancreatic space. We also measured the distances between these structures and calculated the minimum, maximum and mean values. At the splenoportal confluence level, the normal structures between the portal vein and IVC were vessels (21%), lymph nodes (19%), and the caudate lobe of the liver (2%), in order of frequency. The maximum AP diameter of the portocaval lymph node was 4 mm. The common bile duct (CBD) was seen in 44%, with a mean diameter of 3 mm and a maximum of 11 mm. The CBD was located extrapancreatic (75%) and lateral (60.6%) to the pancreatic head. At the IVC-left renal vein level, the maximum distance between the CBD and IVC was 5 mm, and the only structure between the posterior pancreatic surface and the IVC was fat tissue. Knowledge of these normal structures and measurements will be helpful in differentiating pancreatic masses from retropancreatic masses such as lymphadenopathy.
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
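The idea of simultaneous intervals for all plotted points can be illustrated by Monte Carlo calibration. The sketch below is our illustration, not the paper's construction: it widens a common pointwise level, starting from the conservative Bonferroni value, until the joint coverage of all n order statistics would drop below 1 - alpha.

```python
import numpy as np

def simultaneous_bands(n, alpha=0.05, sims=20000, seed=0):
    """Monte Carlo sketch: intervals for the n order statistics of a standard
    normal sample such that all n sorted points fall inside simultaneously
    with probability about 1 - alpha."""
    rng = np.random.default_rng(seed)
    order_stats = np.sort(rng.standard_normal((sims, n)), axis=1)
    best = None
    for gamma in np.linspace(alpha / n, alpha, 40):   # widen pointwise level
        lo = np.quantile(order_stats, gamma / 2, axis=0)
        hi = np.quantile(order_stats, 1 - gamma / 2, axis=0)
        inside = (order_stats >= lo) & (order_stats <= hi)
        coverage = np.mean(np.all(inside, axis=1))    # joint, not pointwise
        if best is not None and coverage < 1 - alpha:
            break
        best = (lo, hi)
    return best

lo, hi = simultaneous_bands(20)   # bands for a sample of size n = 20
```

A sorted sample is then plotted against its normal scores and declared consistent with normality exactly when every point lies inside its band.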
3j Symbols: To Normalize or Not to Normalize?
van Veenendaal, Michel
2011-01-01
The systematic use of alternative normalization constants for 3j symbols can lead to a more natural expression of quantities, such as vector products and spherical tensor operators. The redefined coupling constants directly equate tensor products to the inner and outer products without any additional square roots. The approach is extended to…
Deconstructing Interocular Suppression: Attention and Divisive Normalization.
Li, Hsin-Hung; Carrasco, Marisa; Heeger, David J
2015-10-01
In interocular suppression, a suprathreshold monocular target can be rendered invisible by a salient competitor stimulus presented in the other eye. Despite decades of research on interocular suppression and related phenomena (e.g., binocular rivalry, flash suppression, continuous flash suppression), the neural processing underlying interocular suppression is still unknown. We developed and tested a computational model of interocular suppression. The model included two processes that contributed to the strength of interocular suppression: divisive normalization and attentional modulation. According to the model, the salient competitor induced a stimulus-driven attentional modulation selective for the location and orientation of the competitor, thereby increasing the gain of neural responses to the competitor and reducing the gain of neural responses to the target. Additional suppression was induced by divisive normalization in the model, similar to other forms of visual masking. To test the model, we conducted psychophysics experiments in which both the size and the eye-of-origin of the competitor were manipulated. For small and medium competitors, behavioral performance was consonant with a change in the response gain of neurons that responded to the target. But large competitors induced a contrast-gain change, even when the competitor was split between the two eyes. The model correctly predicted these results and outperformed an alternative model in which the attentional modulation was eye specific. We conclude that both stimulus-driven attention (selective for location and feature) and divisive normalization contribute to interocular suppression.
Mast cell distribution in normal adult skin.
Janssens, A S; Heide, R; den Hollander, J C; Mulder, P G M; Tank, B; Oranje, A P
2005-03-01
To investigate mast cell distribution in normal adult skin to provide a reference range for comparison with mastocytosis. Mast cells (MCs) were counted in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders in adults. There was an uneven distribution of MCs in different body sites using the anti-tryptase monoclonal antibody technique. Numbers of MCs on the trunk, upper arm, and upper leg were similar, but were significantly different from those found on the lower leg and forearm. Two distinct groups were formed--proximal and distal. There were 77.0 MCs/mm2 at proximal body sites and 108.2 MCs/mm2 at distal sites. Adjusted for the adjacent diagnosis and age, this difference was consistent. The numbers of MCs in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders were not different from those in the control group. Differences in the numbers of MCs between the distal and the proximal body sites must be considered when MCs are counted for a reliable diagnosis of mastocytosis. A pilot study in patients with mastocytosis underlined the variation in the numbers of MCs in mastocytosis and normal skin, but showed a considerable overlap. The observed numbers of MCs in adults cannot be extrapolated to children. MC numbers varied significantly between proximal and distal body sites and these differences must be considered when MCs are counted for a reliable diagnosis of mastocytosis. There was a considerable overlap between the numbers of MCs in mastocytosis and normal skin.
Moeller, T.B.; Reif, E.
1998-01-01
This book gives answers to questions frequently heard especially from trainees and doctors not specialising in the field of radiology: Is that a normal finding? How do I decide? What are the objective criteria? The information presented is three-fold. The normal findings of the usual CT and MRI examinations are shown with high-quality pictures serving as a reference, with inscribed important additional information on measures, angles and other criteria describing the normal conditions. These criteria are further explained and evaluated in accompanying texts which also teach the systematic approach for individual picture analysis, and include a check list of major aspects, as a didactic guide for learning. The book is primarily intended for students, radiographers, radiology trainees and doctors from other medical fields, but radiology specialists will also find useful details of help in special cases. (orig./CB)
Marrow transfusions into normal recipients
Brecher, G.
1983-01-01
During the past several years we have explored the transfusion of bone marrow into normal nonirradiated mice. While transfused marrow proliferates readily in irradiated animals, only minimal proliferation takes place in nonirradiated recipients. It has generally been assumed that this was due to the lack of available proliferative sites in recipients with normal marrow. Last year we were able to report that the transfusion of 200 million bone marrow cells (about 2/3 of the total complement of marrow cells of a normal mouse) resulted in 20% to 25% of the recipient's marrow being replaced by donor marrow. Thus we can now study the behavior of animals that carry both transfused (donor) and endogenous (recipient) marrow cells, although none of the tissues of either donor or recipient have been irradiated. With these animals we hope to investigate the nature of the peculiar phenomenon of serial exhaustion of marrow, also referred to as the limited self-replicability of stem cells
The construction of normal expectations
Quitzau, Maj-Britt; Røpke, Inge
2008-01-01
The gradual upward changes of standards in normal everyday life have significant environmental implications, and it is therefore important to study how these changes come about. The intention of the article is to analyze the social construction of normal expectations through a case study. The case concerns the present boom in bathroom renovations in Denmark, which offers an excellent opportunity to study the interplay between a wide variety of consumption drivers and social changes pointing toward long-term changes of normal expectations regarding bathroom standards. The study is problem-oriented and transdisciplinary and draws on a wide range of sociological, anthropological, and economic theories. The empirical basis comprises a combination of statistics, a review of magazine and media coverage, visits to exhibitions, and qualitative interviews. A variety of consumption drivers are identified. Among ...
P N Johnson-Laird
2010-10-01
An old view in logic going back to Aristotle is that an inference is valid in virtue of its logical form. Many psychologists have adopted the same point of view about human reasoning: the first step is to recover the logical form of an inference, and the second step is to apply rules of inference that match these forms in order to prove that the conclusion follows from the premises. The present paper argues against this idea. The logical form of an inference transcends the grammatical forms of the sentences used to express it, because logical form also depends on context. Context is not readily expressed in additional premises. And the recovery of logical form leads ineluctably to the need for infinitely many axioms to capture the logical properties of relations. An alternative theory is that reasoning depends on mental models, and this theory obviates the need to recover logical form.
Forms of Arthritis. Past Issues / Fall 2006. Osteoarthritis (OA) - the form of arthritis typically occurring during middle or old age, this ...
To establish EPA’s Forms Management Program; to describe the requisite roles, responsibilities, and procedures necessary for the successful management of EPA forms; and to more clearly fulfill EPA’s obligations in this regard.
Moisã Claudia Olimpia; Moisã Claudia Olimpia
2011-01-01
Taking into account the range of motivations that young people have when practicing tourism, it can be said that youth travel takes highly diverse forms. These forms include educational tourism, volunteer programs and “work and travel”, cultural exchanges, sports tourism and adventure travel. In this article, we identified and analyzed in detail the main forms of youth travel, both internationally and in Romania. We also illustrated for each form of tourism the specific tourism products targeting you...
Alnæs, Martin S.; Logg, Anders; Ølgaard, Kristian Breum
2014-01-01
We present the Unified Form Language (UFL), which is a domain-specific language for representing weak formulations of partial differential equations with a view to numerical approximation. Features of UFL include support for variational forms and functionals, automatic differentiation of forms and expressions ... libraries to generate concrete low-level implementations. Some application examples are presented and libraries that support UFL are highlighted.
Kong, Peter C.; Pink, Robert J.; Zuck, Larry D.
2008-08-19
A method for forming ammonia is disclosed and which includes the steps of forming a plasma; providing a source of metal particles, and supplying the metal particles to the plasma to form metal nitride particles; and providing a substance, and reacting the metal nitride particles with the substance to produce ammonia, and an oxide byproduct.
Frederic D. R. Bonnet; Robert G. Edwards; George T. Fleming; Randal Lewis; David Richards
2003-07-22
We have started a program to compute the electromagnetic form factors of mesons. We discuss the techniques used to compute the pion form factor and present preliminary results computed with domain wall valence fermions on MILC asqtad lattices, as well as Wilson fermions on quenched lattices. These methods can easily be extended to rho-to-gamma-pi transition form factors.
Complete normal ordering 1: Foundations
John Ellis
2016-08-01
We introduce a new prescription for quantising scalar field theories (in generic spacetime dimension and background) perturbatively around a true minimum of the full quantum effective action, which is to ‘complete normal order’ the bare action of interest. When the true vacuum of the theory is located at zero field value, the key property of this prescription is the automatic cancellation, to any finite order in perturbation theory, of all tadpole and, more generally, all ‘cephalopod’ Feynman diagrams. The latter are connected diagrams that can be disconnected into two pieces by cutting one internal vertex, with either one or both pieces free from external lines. In addition, this procedure of ‘complete normal ordering’ (which is an extension of the standard field theory definition of normal ordering) reduces by a substantial factor the number of Feynman diagrams to be calculated at any given loop order. We illustrate explicitly the complete normal ordering procedure and the cancellation of cephalopod diagrams in scalar field theories with non-derivative interactions, and by using a point splitting ‘trick’ we extend this result to theories with derivative interactions, such as those appearing as non-linear σ-models in the world-sheet formulation of string theory. We focus here on theories with trivial vacua, generalising the discussion to non-trivial vacua in a follow-up paper.
Mixed normal inference on multicointegration
Boswijk, H.P.
2009-01-01
Asymptotic likelihood analysis of cointegration in I(2) models, see Johansen (1997, 2006), Boswijk (2000) and Paruolo (2000), has shown that inference on most parameters is mixed normal, implying hypothesis test statistics with an asymptotic χ² null distribution. The asymptotic distribution of the
Is My Child's Appetite Normal?
Is My Child’s Appetite Normal? Cayla, who is 4 years old, did not finish her lunch. But she is ready to play. Her ... snack for later. That is okay! Your child’s appetite changes. Children do not grow as fast in ...
Transforming Normal Programs by Replacement
Bossi, Annalisa; Pettorossi, A.; Cocco, Nicoletta; Etalle, Sandro
1992-01-01
The replacement transformation operation, already defined in [28], is studied with respect to normal programs. We give applicability conditions able to ensure the correctness of the operation with respect to Fitting's and Kunen's semantics. We show how replacement can mimic other transformation operations such as thinning,
Semigroups of data normalization functions
Warrens, Matthijs J.
2016-01-01
Variable centering and scaling are functions that are typically used in data normalization. Various properties of centering and scaling functions are presented. It is shown that if we use two centering functions (or scaling functions) successively, the result depends on the order in which the
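The order-dependence discussed in this abstract is easy to demonstrate. Below is a minimal sketch; the particular centering and scaling functions chosen here are illustrative assumptions, not taken from the paper:

```python
# Illustration (assumed example): applying a centering function and a
# scaling function in different orders generally yields different data.

def center(xs):
    """Subtract the mean (a centering function)."""
    m = sum(xs) / len(xs)
    return [x - m for x in xs]

def scale(xs):
    """Divide by the root mean square (a scaling function)."""
    rms = (sum(x * x for x in xs) / len(xs)) ** 0.5
    return [x / rms for x in xs]

data = [2.0, 4.0, 9.0]
a = scale(center(data))   # center first, then scale
b = center(scale(data))   # scale first, then center
print(a == b)             # the two orders disagree in general
```

Composing two centering functions (or two scaling functions) raises the same question of order, which is what the semigroup framing in the abstract addresses.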
Normalizing Catastrophe: Sustainability and Scientism
Bonnett, Michael
2013-01-01
Making an adequate response to our deteriorating environmental situation is a matter of ever increasing urgency. It is argued that a central obstacle to achieving this is the way that scientism has become normalized in our thinking about environmental issues. This is taken to reflect on an underlying "metaphysics of mastery" that vitiates proper…
Neutron RBE for normal tissues
Field, S.B.; Hornsey, S.
1979-01-01
RBE for various normal tissues is considered as a function of neutron dose per fraction. Results from a variety of centres are reviewed. It is shown that RBE is dependent on neutron energy and is tissue dependent, but is not specially high for the more critical tissues or for damage occurring late after irradiation. (author)
Normal and abnormal growth plate
Kumar, R.; Madewell, J.E.; Swischuk, L.E.
1987-01-01
Skeletal growth is a dynamic process. A knowledge of the structure and function of the normal growth plate is essential in order to understand the pathophysiology of abnormal skeletal growth in various diseases. In this well-illustrated article, the authors provide a radiographic classification of abnormal growth plates and discuss mechanisms that lead to growth plate abnormalities
Bernstein Algorithm for Vertical Normalization to 3NF Using Synthesis
Matija Varga
2013-07-01
This paper demonstrates the use of the Bernstein algorithm for vertical normalization to 3NF using synthesis. The aim of the paper is to provide an algorithm for database normalization and present a set of steps which minimize redundancy in order to increase database management efficiency, and to specify tests and algorithms for testing and proving reversibility (i.e., proving that the normalization did not cause loss of information). Using the Bernstein algorithm steps, the paper gives examples of vertical normalization to 3NF through synthesis and proposes a test and an algorithm to demonstrate decomposition reversibility. This paper also sets out to explain that the reasons for generating normal forms are to facilitate data search and to eliminate data redundancy as well as delete, insert and update anomalies, and to explain how anomalies develop, using examples.
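To make the synthesis idea concrete, here is a deliberately simplified sketch: assuming the functional dependencies already form a minimal cover, FDs with the same left-hand side are grouped and one 3NF relation is emitted per group. The encoding and the omitted minimal-cover and key-preservation steps are simplifications for illustration, not the full Bernstein algorithm:

```python
# Simplified synthesis step (assumed example): group FDs by determinant
# and emit one relation schema per group. A real implementation must
# first compute a minimal cover and add a relation containing a key of
# the original schema if none of the groups contains one.

def synthesize_3nf(fds):
    """fds: list of (lhs, rhs) pairs, lhs a frozenset of attributes."""
    groups = {}
    for lhs, rhs in fds:
        groups.setdefault(lhs, set()).add(rhs)
    # each relation = determinant attributes plus their dependents
    return [frozenset(lhs) | deps for lhs, deps in groups.items()]

fds = [(frozenset({"student", "course"}), "grade"),
       (frozenset({"course"}), "teacher")]
for rel in synthesize_3nf(fds):
    print(sorted(rel))
```

On this input the synthesis yields one relation for the grade dependency and one for the teacher dependency, removing the redundancy that storing the teacher with every grade row would cause.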
Schema Design and Normalization Algorithm for XML Databases Model
Samir Abou El-Seoud
2009-06-01
In this paper we study the problem of schema design and normalization in the XML database model. We show that, like relational databases, XML documents may contain redundant information, and this redundancy may cause update anomalies. Furthermore, such problems are caused by certain functional dependencies among paths in the document. Building on our previous work, in which we presented functional dependencies and normal forms for XML Schema, we present a decomposition algorithm for converting any XML Schema into a normalized one that satisfies X-BCNF.
Forms of Life, Forms of Reality
Piergiorgio Donatelli
2015-10-01
The article explores aspects of the notion of forms of life in the Wittgensteinian tradition, especially following Iris Murdoch’s lead. On the one hand, the notion signals the hardness and inexhaustible character of reality, as the background needed in order to make sense of our lives in various ways. On the other, the hardness of reality is the object of a moral work of apprehension and deepening to the point at which its distinctive character dissolves into the family of connections we have gained for ourselves. The two movements of thought are connected and necessary.
2013-01-01
Micro Metal Forming, i.e. forming of parts and features with dimensions below 1 mm, is a young area of research in the wide field of metal forming technologies, expanding the limits for applying metal forming toward micro technology. The essential challenges arise from the reduced geometrical size and the increased lot size. In order to enable potential users to apply micro metal forming in production, information is given on the following topics: tribological behavior (friction between tool and work piece as well as tool wear); mechanical behavior (strength and formability of the work piece material, durability of the work pieces); size effects (a basic description of effects that occur because the quantitative relation between different features changes with decreasing size); process windows and limits for forming processes; tool making methods; numerical modeling of processes and process chains; and quality assurance and metrology. All topics are discussed with respect to the questions relevant to micro...
Superconducting versus normal conducting cavities
Podlech, Holger
2013-01-01
One of the most important issues of high-power hadron linacs is the choice of technology with respect to superconducting or room-temperature operation. The preference for a specific technology depends on several parameters such as the beam energy, beam current, beam power and duty factor. This contribution gives an overview of the comparison between superconducting and normal conducting cavities. This includes basic radiofrequency (RF) parameters, design criteria, limitations, required RF and plug power as well as case studies.
Normal Movement Selectivity in Autism
Dinstein, Ilan; Thomas, Cibu; Humphreys, Kate; Minshew, Nancy; Behrmann, Marlene; Heeger, David J.
2010-01-01
It has been proposed that individuals with autism have difficulties understanding the goals and intentions of others because of a fundamental dysfunction in the mirror neuron system. Here, however, we show that individuals with autism exhibited not only normal fMRI responses in mirror system areas during observation and execution of hand movements, but also exhibited typical movement-selective adaptation (repetition suppression) when observing or executing the same movement repeatedly. Moveme...
Gravitation and quadratic forms
Ananth, Sudarshan; Brink, Lars; Majumdar, Sucheta; Mali, Mahendra; Shah, Nabha
2017-01-01
The light-cone Hamiltonians describing both pure (N=0) Yang-Mills and N=4 super Yang-Mills may be expressed as quadratic forms. Here, we show that this feature extends to theories of gravity. We demonstrate how the Hamiltonians of both pure gravity and N=8 supergravity, in four dimensions, may be written as quadratic forms. We examine the effect of residual reparametrizations on the Hamiltonian and the resulting quadratic form.
Neutron electromagnetic form factors
Finn, J.M.; Madey, R.; Eden, T.; Markowitz, P.; Rutt, P.M.; Beard, K.; Anderson, B.D.; Baldwin, A.R.; Keane, D.; Manley, D.M.; Watson, J.W.; Zhang, W.M.; Kowalski, S.; Bertozzi, W.; Dodson, G.; Farkhondeh, M.; Dow, K.; Korsch, W.; Tieger, D.; Turchinetz, W.; Weinstein, L.; Gross, F.; Mougey, J.; Ulmer, P.; Whitney, R.; Reichelt, T.; Chang, C.C.; Kelly, J.J.; Payerle, T.; Cameron, J.; Ni, B.; Spraker, M.; Barkhuff, D.; Lourie, R.; Verst, S.V.; Hyde-Wright, C.; Jiang, W.-D.; Flanders, B.; Pella, P.; Arenhoevel, H.
1992-01-01
Nucleon form factors provide fundamental input for nuclear structure and quark models. Current knowledge of neutron form factors, particularly the electric form factor of the neutron, is insufficient to meet these needs. Developments of high-duty-factor accelerators and polarization-transfer techniques permit new experiments that promise results with small sensitivities to nuclear models. We review the current status of the field, our own work at the MIT/Bates linear accelerator, and future experimental efforts
Organizational forms and knowledge absorption
Radovanović Nikola
2016-01-01
Managing the entire portion of knowledge in an organization is a challenging task. At the organizational level, there can be enormous quantities of unknown, poorly valued or inefficiently applied knowledge. This is normally accompanied by an underdeveloped potential or inability of organizations to absorb knowledge from external sources. Facilitation of the efficient internal flow of knowledge within the established communication network may positively affect organizational capacity to absorb, i.e. identify, share and subsequently apply, knowledge to commercial ends. Based on the evidence that the adoption of different organizational forms affects knowledge flows within an organization, this research analyzed the relationship between common organizational forms and the absorptive capacity of organizations. In this paper, we test the hypothesis that organizational structure affects knowledge absorption and exploitation in the organization. The methodology included quantitative and qualitative research methods based on a questionnaire, while the data were statistically analyzed and the hypothesis tested with the use of cross-tabulation and chi-square tests. The findings suggest that the type of organizational form affects knowledge absorption capacity and that having a less formalized and more flexible structure in an organization increases the opportunities for absorbing and exploiting potentially valuable knowledge.
Lithium control during normal operation
Suryanarayan, S.; Jain, D.
2010-01-01
Periodic increases in lithium (Li) concentrations in the primary heat transport (PHT) system during normal operation are a generic problem at CANDU® stations. Lithiated mixed bed ion exchange resins are used at stations for pH control in the PHT system. Typically tight chemistry controls including Li concentrations are maintained in the PHT water. The reason for the Li increases during normal operation at CANDU stations such as Pickering was not fully understood. In order to address this issue a two pronged approach was employed. Firstly, PNGS-A data and information from other available sources was reviewed in an effort to identify possible factors that may contribute to the observed Li variations. Secondly, experimental studies were carried out to assess the importance of these factors in order to establish reasons for Li increases during normal operation. Based on the results of these studies, plausible mechanisms/reasons for Li increases have been identified and recommendations made for proactive control of Li concentrations in the PHT system. (author)
Normalization of Gravitational Acceleration Models
Eckman, Randy A.; Brown, Aaron J.; Adamo, Daniel R.
2011-01-01
Unlike the uniform density spherical shell approximations of Newton, the consequence of spaceflight in the real universe is that gravitational fields are sensitive to the nonsphericity of their generating central bodies. The gravitational potential of a nonspherical central body is typically resolved using spherical harmonic approximations. However, attempting to directly calculate the spherical harmonic approximations results in at least two singularities, which must be removed in order to generalize the method and solve for any possible orbit, including polar orbits. Three unique algorithms have been developed to eliminate these singularities, by Samuel Pines [1], Bill Lear [2], and Robert Gottlieb [3]. This paper documents the methodical normalization of two of the three known formulations for singularity-free gravitational acceleration (namely, the Lear [2] and Gottlieb [3] algorithms) and formulates a general method for defining normalization parameters used to generate normalized Legendre polynomials and ALFs for any algorithm. A treatment of the conventional formulation of the gravitational potential and acceleration is also provided, in addition to a brief overview of the philosophical differences between the three known singularity-free algorithms.
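As an illustration of the kind of normalization parameter involved, the fully normalized associated Legendre convention common in geopotential work multiplies each P(n, m) by the factor sketched below. Conventions differ between formulations, so this particular factor is a representative assumption rather than the paper's definition:

```python
# Full normalization factor for associated Legendre functions, as
# commonly used in geopotential models (an illustrative convention;
# individual algorithms may normalize differently).

from math import factorial, sqrt

def norm_factor(n, m):
    """N(n, m) such that Pbar(n, m) = N(n, m) * P(n, m)."""
    delta = 1 if m == 0 else 0
    return sqrt((2 - delta) * (2 * n + 1)
                * factorial(n - m) / factorial(n + m))
```

Normalizing keeps the magnitudes of high-degree terms numerically tame, which is the practical motivation for the normalization treatment in the paper.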
"Ser diferente é normal?"/"Being different: is it normal?"
Viviane Veras
2007-01-01
The question in the title of this paper refers to the slogan “Ser diferente é normal” (“It's normal to be different”), which is part of a campaign created for an NGO that supports people with Down syndrome. The objective of the campaign is to promote the social inclusion of individuals with Down syndrome, and the first step was to propose the inclusion of a group of “differents” in the so-called normal group. The film launching the campaign shows the different identified as normal by means of examples: a black man with a black-power haircut, a skinhead, a tattooed body, an over-athletic female body, a hippie family and a girl with Down syndrome. The vision of the dancing teenager lessens the imaginary effect that surpasses the syndrome, since only her body and her eyes stand out and no cognitive questions are raised. My purpose is to reflect on the paradoxical status of the example as it operates in this film: if, by definition, an example in fact shows its membership in a class, one may conclude that it is precisely because it is exemplary that it stands outside that class, at the very moment in which it exhibits and defines it.
Electronic Capitalization Asset Form
Department of Transportation — National Automated Capitalization Authorization Form used by ATO Engineering Services, Logistics, Accounting for the purpose of identifying and capturing FAA project...
Forming of superplastic ceramics
Lesuer, D.R.; Wadsworth, J.; Nieh, T.G.
1994-05-01
Superplasticity in ceramics has now advanced to the stage that technologically viable superplastic deformation processing can be performed. In this paper, examples of superplastic forming and diffusion bonding of ceramic components are given. Recent work in biaxial gas-pressure forming of several ceramics is provided. These include yttria-stabilized, tetragonal zirconia (YTZP), a 20% alumina/YTZP composite, and silicon. In addition, the concurrent superplastic forming and diffusion bonding of a hybrid ceramic-metal structure are presented. These forming processes offer technological advantages of greater dimensional control and increased variety and complexity of shapes than is possible with conventional ceramic shaping technology.
Cooperative Station History Forms
National Oceanic and Atmospheric Administration, Department of Commerce — Various forms, photographs and correspondence documenting the history of Cooperative station instrumentation, location changes, inspections, and...
Deformation around basin scale normal faults
Spahic, D.
2010-01-01
Faults in the earth crust occur within large range of scales from microscale over mesoscopic to large basin scale faults. Frequently deformation associated with faulting is not only limited to the fault plane alone, but rather forms a combination with continuous near field deformation in the wall rock, a phenomenon that is generally called fault drag. The correct interpretation and recognition of fault drag is fundamental for the reconstruction of the fault history and determination of fault kinematics, as well as prediction in areas of limited exposure or beyond comprehensive seismic resolution. Based on fault analyses derived from 3D visualization of natural examples of fault drag, the importance of fault geometry for the deformation of marker horizons around faults is investigated. The complex 3D structural models presented here are based on a combination of geophysical datasets and geological fieldwork. On an outcrop scale example of fault drag in the hanging wall of a normal fault, located at St. Margarethen, Burgenland, Austria, data from Ground Penetrating Radar (GPR) measurements, detailed mapping and terrestrial laser scanning were used to construct a high-resolution structural model of the fault plane, the deformed marker horizons and associated secondary faults. In order to obtain geometrical information about the largely unexposed master fault surface, a standard listric balancing dip domain technique was employed. The results indicate that for this normal fault a listric shape can be excluded, as the constructed fault has a geologically meaningless shape cutting upsection into the sedimentary strata. This kinematic modeling result is additionally supported by the observation of deformed horizons in the footwall of the structure. Alternatively, a planar fault model with reverse drag of markers in the hanging wall and footwall is proposed. Deformation around basin scale normal faults. A second part of this thesis investigates a large scale normal fault
Corticocortical feedback increases the spatial extent of normalization.
Nassi, Jonathan J; Gómez-Laberge, Camille; Kreiman, Gabriel; Born, Richard T
2014-01-01
Normalization has been proposed as a canonical computation operating across different brain regions, sensory modalities, and species. It provides a good phenomenological description of non-linear response properties in primary visual cortex (V1), including the contrast response function and surround suppression. Despite its widespread application throughout the visual system, the underlying neural mechanisms remain largely unknown. We recently observed that corticocortical feedback contributes to surround suppression in V1, raising the possibility that feedback acts through normalization. To test this idea, we characterized area summation and contrast response properties in V1 with and without feedback from V2 and V3 in alert macaques and applied a standard normalization model to the data. Area summation properties were well explained by a form of divisive normalization, which computes the ratio between a neuron's driving input and the spatially integrated activity of a "normalization pool." Feedback inactivation reduced surround suppression by shrinking the spatial extent of the normalization pool. This effect was independent of the gain modulation thought to mediate the influence of contrast on area summation, which remained intact during feedback inactivation. Contrast sensitivity within the receptive field center was also unaffected by feedback inactivation, providing further evidence that feedback participates in normalization independent of the circuit mechanisms involved in modulating contrast gain and saturation. These results suggest that corticocortical feedback contributes to surround suppression by increasing the visuotopic extent of normalization and, via this mechanism, feedback can play a critical role in contextual information processing.
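The ratio computation described in this abstract can be sketched in a few lines. The 1-D layout, the semisaturation constant, and the pool radius below are illustrative assumptions, not the paper's fitted model; the pool radius stands in for the "spatial extent of the normalization pool" that feedback inactivation was found to shrink:

```python
# Minimal sketch (assumed parameterization) of divisive normalization:
# each unit's response is its driving input divided by a constant plus
# the summed activity of a local "normalization pool".

def divisive_normalization(drive, sigma=1.0, pool_radius=1):
    """Normalize each unit by activity summed over a local pool."""
    n = len(drive)
    out = []
    for i in range(n):
        lo, hi = max(0, i - pool_radius), min(n, i + pool_radius + 1)
        pool = sum(drive[lo:hi])          # spatially integrated activity
        out.append(drive[i] / (sigma + pool))
    return out

responses = divisive_normalization([1.0, 4.0, 2.0])
```

Increasing `pool_radius` suppresses responses to large stimuli more strongly, which is the sense in which a wider pool produces stronger surround suppression.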
Kohlmann, Johannes; Kieler, Oliver [Physikalisch-Technische Bundesanstalt (PTB), Braunschweig (Germany). Arbeitsgruppe 2.43 "Josephson-Schaltungen"]
2016-09-15
In this contribution we present the manufacturing technology for the fabrication of integrated superconducting Josephson series circuits for voltage standards. First we summarize some foundations of Josephson voltage standards and sketch the concept and the setup of the circuits, before we describe the manufacturing technology for modern practical Josephson voltage standards.
Understanding a Normal Distribution of Data.
Maltenfort, Mitchell G
2015-12-01
Assuming data follow a normal distribution is essential for many common statistical tests. However, what are normal data and when can we assume that a data set follows this distribution? What can be done to analyze non-normal data?
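One quick, informal check in the spirit of this question, using only the standard library (this is an illustrative heuristic, not a substitute for a formal normality test such as Shapiro-Wilk): compare the fraction of points within one and two sample standard deviations of the mean against the roughly 68% and 95% expected under a normal distribution.

```python
# Heuristic normality check (assumed example): empirical coverage of
# mean +/- k standard deviations vs. the normal 68%/95% rule of thumb.

import statistics

def coverage(xs, k):
    """Fraction of points within k sample std. deviations of the mean."""
    mu, sd = statistics.mean(xs), statistics.stdev(xs)
    return sum(abs(x - mu) <= k * sd for x in xs) / len(xs)

data = [4.8, 5.1, 4.9, 5.0, 5.2, 5.0, 4.7, 5.3, 5.0, 4.9]
print(coverage(data, 1), coverage(data, 2))
```

Large deviations from the 68%/95% pattern suggest skew or heavy tails, i.e. data for which normal-theory tests should not be assumed valid.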
Quantiles for Finite Mixtures of Normal Distributions
Rahman, Mezbahur; Rahman, Rumanur; Pearson, Larry M.
2006-01-01
Quantiles for finite mixtures of normal distributions are computed. The difference between a linear combination of independent normal random variables and a linear combination of independent normal densities is emphasized. (Contains 3 tables and 1 figure.)
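The distinction the abstract draws can be made concrete: a 50/50 mixture of normal densities is not the distribution of a 50/50 linear combination of the corresponding random variables. A minimal sketch of computing a mixture quantile by numerically inverting the mixture CDF follows; the component parameters are illustrative assumptions:

```python
# Quantile of a finite normal mixture (assumed example): the mixture
# CDF is a convex combination of component CDFs, and its quantile is
# found by bisection, since no closed form exists in general.

from statistics import NormalDist

def mixture_quantile(weights, comps, p, lo=-100.0, hi=100.0):
    """Invert F(x) = sum_i w_i * F_i(x) by bisection."""
    def cdf(x):
        return sum(w * c.cdf(x) for w, c in zip(weights, comps))
    for _ in range(200):
        mid = (lo + hi) / 2
        if cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

comps = [NormalDist(0, 1), NormalDist(5, 2)]
median = mixture_quantile([0.5, 0.5], comps, 0.5)
```

By contrast, the random variable 0.5·X + 0.5·Y with X ~ N(0, 1) and Y ~ N(5, 2) independent is itself normal, so its quantiles have a closed form; the bimodal mixture's do not.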
Minh Ha, Thien; Niggeler, Dieter; Bunke, Horst; Clarinval, Jose
1995-08-01
Although giro forms are used by many people in daily life for money remittance in Switzerland, the processing of these forms at banks and post offices is only partly automated. We describe an ongoing project for building an automatic system that is able to recognize various items printed or written on a giro form. The system comprises three main components, namely, an automatic form feeder, a camera system, and a computer. These components are connected in such a way that the system is able to process a batch of forms without any human interaction. We present two real applications of our system in the field of payment services, which require the reading of both machine-printed and handwritten information that may appear on a giro form. One particular feature of giro forms is their flexible layout, i.e., information items are located differently from one form to another, thus requiring an additional analysis step to localize them before recognition. A commercial optical character recognition software package is used for recognition of machine-printed information, whereas handwritten information is read by our own algorithms, the details of which are presented. The system is implemented using a client/server architecture providing a high degree of flexibility to change. Preliminary results are reported supporting our claim that the system is usable in practice.
Brabrand, Claus; Møller, Anders; Ricky, Mikkel
2000-01-01
All uses of HTML forms may benefit from validation of the specified input field values. Simple validation matches individual values against specified formats, while more advanced validation may involve interdependencies of form fields. There is currently no standard for specifying or implementing...
Gupta, Gaurav
2013-01-01
This tutorial will show you how to create stylish forms that are not only visually appealing but also interactive and customized, in order to gather valuable user inputs and information. Enhance your skills in building responsive and dynamic web forms using HTML5, CSS3, and related technologies. All you need is a basic understanding of HTML and PHP.
Tolle, Charles R [Idaho Falls, ID; Clark, Denis E [Idaho Falls, ID; Smartt, Herschel B [Idaho Falls, ID; Miller, Karen S [Idaho Falls, ID
2009-10-06
A material-forming tool and a method for forming a material are described, including a shank portion; a shoulder portion that releasably engages the shank portion; a pin that releasably engages the shoulder portion, wherein the pin defines a passageway; and a source of a material coupled in material-flowing relation relative to the pin; and wherein the material-forming tool is utilized in methodology that includes providing a first material; providing a second material, and placing the second material into contact with the first material; and locally plastically deforming the first material with the material-forming tool so as to mix the first material and second material together to form a resulting material having characteristics different from the respective first and second materials.
Strong normalization by type-directed partial evaluation and run-time code generation
Balat, Vincent; Danvy, Olivier
1998-01-01
We investigate the synergy between type-directed partial evaluation and run-time code generation for the Caml dialect of ML. Type-directed partial evaluation maps simply typed, closed Caml values to a representation of their long βη-normal form. Caml uses a virtual machine and has the capability to load byte code at run time. Representing the long βη-normal forms as byte code gives us the ability to strongly normalize higher-order values (i.e., weak head normal forms in ML), to compile the resulting strong normal forms into byte code, and to load this byte code all in one go, at run time. We conclude this note with a preview of our current work on scaling up strong normalization by run-time code generation to the Caml module language.
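The core idea, evaluating a term into a semantic domain and then reifying the result back into syntax, can be sketched in miniature for the untyped λ-calculus. The term encoding and function names below are illustrative, not the paper's Caml implementation:

```python
# Toy normalization by evaluation (assumed encoding):
# terms are ("var", name) | ("lam", name, body) | ("app", f, a).

import itertools

fresh = (f"x{i}" for i in itertools.count())

def evaluate(term, env):
    """Map a term to a semantic value: a Python closure or a neutral term."""
    tag = term[0]
    if tag == "var":
        return env[term[1]]
    if tag == "lam":
        return lambda v, t=term: evaluate(t[2], {**env, t[1]: v})
    f, a = evaluate(term[1], env), evaluate(term[2], env)
    return f(a) if callable(f) else ("app", f, a)

def reify(value):
    """Read a semantic value back into a syntactic normal form."""
    if callable(value):                      # eta-expand functions
        x = next(fresh)
        return ("lam", x, reify(value(("var", x))))
    if isinstance(value, tuple) and value[0] == "app":
        return ("app", reify(value[1]), reify(value[2]))
    return value                             # a neutral variable

def normalize(term):
    return reify(evaluate(term, {}))

# (\f.\x. f x) applied to (\y. y) normalizes to \x. x (up to renaming)
ident = ("lam", "y", ("var", "y"))
t = ("app",
     ("lam", "f", ("lam", "x", ("app", ("var", "f"), ("var", "x")))),
     ident)
```

In the paper this reification step emits byte code instead of syntax trees, which is what lets the Caml virtual machine load the normal form directly at run time.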
Cystic form of rheumatoid arthritis
Dijkstra, P.F.; Gubler, F.M.; Maas, A.
1988-10-01
A nonerosive form of rheumatoid arthritis (R.A.) was found in 62 of 660 patients with R.A. These 62 patients exhibit slowly progressive cystic changes in roughly the same joints in which erosions usually develop in classic R.A. The E.S.R. is often low, half of the patients remained seronegative, and the group comprises 35 males and 27 females. A smaller group of 15 of these patients could be followed, over a period of at least 6 years, from a stage in which the radiographs were normal to a stage of extensive cystic changes. An attempt is made to delineate this group within the rheumatoid arthritis disease entity.
A locally adaptive normal distribution
Arvanitidis, Georgios; Hansen, Lars Kai; Hauberg, Søren
2016-01-01
The multivariate normal density is a monotonic function of the distance to the mean, and its ellipsoidal shape is due to the underlying Euclidean metric. We suggest replacing this metric with a locally adaptive, smoothly changing (Riemannian) metric that favors regions of high local density; the resulting locally adaptive normal distribution (LAND) is then the maximum entropy distribution under the given metric. The underlying metric is, however, non-parametric. We develop a maximum likelihood algorithm to infer the distribution parameters that relies on a combination of gradient descent and Monte Carlo integration. We further extend the LAND to mixture models...
Normal pediatric postmortem CT appearances
Klein, Willemijn M.; Bosboom, Dennis G.H.; Koopmanschap, Desiree H.J.L.M. [Radboud University Medical Center, Department of Radiology and Nuclear Medicine, Nijmegen (Netherlands); Nievelstein, Rutger A.J. [University Medical Center Utrecht, Department of Radiology, Utrecht (Netherlands); Nikkels, Peter G.J. [University Medical Center Utrecht, Department of Pathology, Utrecht (Netherlands); Rijn, Rick R. van [Academic Medical Center, Department of Radiology, Amsterdam (Netherlands)
2015-04-01
Postmortem radiology is a rapidly developing specialty that is increasingly used as an adjunct to or substitute for conventional autopsy. The goal is to find patterns of disease and possibly the cause of death. Postmortem CT images bring to light processes of decomposition most radiologists are unfamiliar with. These postmortem changes, such as the formation of gas and edema, should not be mistaken for pathological processes that occur in living persons. In this review we discuss the normal postmortem thoraco-abdominal changes and how these appear on CT images, as well as how to differentiate these findings from those of pathological processes. (orig.)
Multispectral histogram normalization contrast enhancement
Soha, J. M.; Schwartz, A. A.
1979-01-01
A multispectral histogram normalization or decorrelation enhancement which achieves effective color composites by removing interband correlation is described. The enhancement procedure employs either linear or nonlinear transformations to equalize principal component variances. An additional rotation to any set of orthogonal coordinates is thus possible, while full histogram utilization is maintained by avoiding the reintroduction of correlation. For the three-dimensional case, the enhancement procedure may be implemented with a lookup table. An application of the enhancement to Landsat multispectral scanning imagery is presented.
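The variance-equalization step described above can be sketched with a principal-component transform: rotate to principal components, scale each to unit variance, and rotate back so no interband correlation is reintroduced. A minimal sketch in NumPy, assuming image bands flattened to an (n_pixels, n_bands) array; the function name and shapes are illustrative, not from the paper:

```python
import numpy as np

def decorrelation_stretch(bands):
    """Remove interband correlation by equalizing principal-component
    variances, then rotate back to the original band coordinates."""
    mean = bands.mean(axis=0)
    centered = bands - mean
    # Band covariance matrix; columns of eigvecs are the principal
    # components, eigvals their variances.
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Rotate into principal components, scale each to unit variance,
    # rotate back: the symmetric map E diag(1/sqrt(lambda)) E^T.
    transform = eigvecs @ np.diag(1.0 / np.sqrt(eigvals)) @ eigvecs.T
    return centered @ transform + mean
```

For a three-band color composite this is a fixed 3x3 linear map per pixel, which is consistent with the abstract's remark that the enhancement may be implemented with a lookup table.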
Normal movement selectivity in autism.
Dinstein, Ilan; Thomas, Cibu; Humphreys, Kate; Minshew, Nancy; Behrmann, Marlene; Heeger, David J
2010-05-13
It has been proposed that individuals with autism have difficulties understanding the goals and intentions of others because of a fundamental dysfunction in the mirror neuron system. Here, however, we show that individuals with autism exhibited not only normal fMRI responses in mirror system areas during observation and execution of hand movements but also exhibited typical movement-selective adaptation (repetition suppression) when observing or executing the same movement repeatedly. Movement selectivity is a defining characteristic of neurons involved in movement perception, including mirror neurons, and, as such, these findings argue against a mirror system dysfunction in autism. Copyright 2010 Elsevier Inc. All rights reserved.
On The Extensive Form Of N-Person Cooperative Games
Udeh ...
Keywords: Extensive form game, Normal form game, characteristic function, Coalition, Imputation, Player, Payoff, Strategy and Core
Proximity effect in normal-superconductor hybrids for quasiparticle traps
Hosseinkhani, Amin [Peter Grunberg Institute (PGI-2), Forschungszentrum Julich, D-52425 Julich (Germany); JARA-Institute for Quantum Information, RWTH Aachen University, D-52056 Aachen (Germany)
2016-07-01
Coherent transport of charges in the form of Cooper pairs is the main feature of Josephson junctions, which play a central role in superconducting qubits. However, the presence of quasiparticles in superconducting devices may lead to incoherent charge transfer and limit the coherence time of superconducting qubits. A way around this so-called 'quasiparticle poisoning' might be to use a normal-metal island to trap quasiparticles; this has motivated us to revisit the proximity effect in normal-superconductor hybrids. Using the semiclassical Usadel equations, we study the density of states (DoS) both within and away from the trap. We find that in the superconducting layer the DoS quickly approaches the BCS form; this indicates that normal-metal traps should be effective at localizing quasiparticles.
Modelling of tension stiffening for normal and high strength concrete
Christiansen, Morten Bo; Nielsen, Mogens Peter
1998-01-01
form the model is extended to apply to biaxial stress fields as well. To determine the biaxial stress field, the theorem of minimum complementary elastic energy is used. The theory has been compared with tests on rods, disks, and beams of both normal and high strength concrete, and very good results...
Normalizing tweets with edit scripts and recurrent neural embeddings
Chrupala, Grzegorz; Toutanova, Kristina; Wu, Hua
2014-01-01
Tweets often contain a large proportion of abbreviations, alternative spellings, novel words and other non-canonical language. These features are problematic for standard language analysis tools and it can be desirable to convert them to canonical form. We propose a novel text normalization model
Learning attention for historical text normalization by learning to pronounce
Bollmann, Marcel; Bingel, Joachim; Søgaard, Anders
2017-01-01
Automated processing of historical texts often relies on pre-normalization to modern word forms. Training encoder-decoder architectures to solve such problems typically requires a lot of training data, which is not available for the named task. We address this problem by using several novel encoder...
Identity Work at a Normal University in Shanghai
Cockain, Alex
2016-01-01
Based upon ethnographic research, this article explores undergraduate students' experiences at a normal university in Shanghai focusing on the types of identities and forms of sociality emerging therein. Although students' symptoms of disappointment seem to indicate the power of university experiences to extinguish purposeful action, this article…
On matrix superpotential and three-component normal modes
Rodrigues, R. de Lima [Centro Brasileiro de Pesquisas Fisicas (CBPF), Rio de Janeiro, RJ (Brazil); Lima, A.F. de [Universidade Federal de Campina Grande (UFCG), PB (Brazil). Dept. de Fisica; Mello, E.R. Bezerra de; Bezerra, V.B. [Universidade Federal da Paraiba (UFPB), Joao Pessoa, PB (Brazil). Dept. de Fisica]. E-mails: rafael@df.ufcg.edu.br; aerlima@df.ufcg.edu.br; emello@fisica.ufpb.br; valdir@fisica.ufpb.br
2007-07-01
We consider the supersymmetric quantum mechanics (SUSY QM) with three-component normal modes for the Bogomol'nyi-Prasad-Sommerfield (BPS) states. An explicit form of the SUSY QM matrix superpotential is presented and the corresponding three-component bosonic zero-mode eigenfunction is investigated. (author)
Alexander's disease in a neurologically normal child: a case report
Guthrie, Scott O.; Knowles, Paul; Marshall, Robert; Burton, Edward M.
2003-01-01
We report the clinical and MRI findings of symmetric hyperintensity involving the deep and subcortical white matter of the frontal lobes in a neurologically normal child with macrocephaly. In this patient, a serum test for mutations in glial fibrillary acidic protein, used to diagnose Alexander's disease (AD), was positive. This case indicates an extraordinarily mild or early form of juvenile-onset AD. (orig.)
Update on normal tension glaucoma
Jyotiranjan Mallick
2016-01-01
Normal tension glaucoma (NTG) is diagnosed when typical glaucomatous disc changes, visual field defects and open anterior chamber angles are associated with an intraocular pressure (IOP) consistently below 21 mmHg. Chronic low vascular perfusion, Raynaud's phenomenon, migraine, nocturnal systemic hypotension and over-treated systemic hypertension are the main causes of normal tension glaucoma. Goldmann applanation tonometry, gonioscopy, slit lamp biomicroscopy, optical coherence tomography and visual field analysis are the main tools of investigation for the diagnosis of NTG. Management follows the same principles as the treatment of other chronic glaucomas: to reduce IOP by a substantial amount, sufficient to prevent disabling visual loss. Treatment is generally aimed at lowering IOP by 30% from pre-existing levels, to 12-14 mmHg. Betaxolol, brimonidine, prostaglandin analogues, trabeculectomy (in refractory cases), systemic calcium channel blockers (such as nifedipine) and 24-hour monitoring of blood pressure are considered in the management of NTG. The present review summarises risk factors, causes, pathogenesis, diagnosis and management of NTG.
Normal variation of hepatic artery
Kim, Inn; Nam, Myung Hyun; Rhim, Hyun Chul; Koh, Byung Hee; Seo, Heung Suk; Kim, Soon Yong
1987-01-01
This study was an analysis of the blood supply of the liver in 125 patients who underwent hepatic arteriography and abdominal aortography from Jan. 1984 to Dec. 1986 at the Department of Radiology of Hanyang University Hospital. A. Variations in extrahepatic arteries: 1. The normal extrahepatic artery pattern occurred in 106 of 125 cases (84.8%): right and left hepatic arteries arising from the hepatic artery proper, and the hepatic artery proper arising from the common hepatic artery. 2. The most common variation of the extrahepatic arteries was a replaced right hepatic artery from the superior mesenteric artery: 6 of 125 cases (4.8%). B. Variations in intrahepatic arteries: 1. The normal intrahepatic artery pattern occurred in 83 of 125 cases (66.4%): right and left hepatic arteries arising from the hepatic artery proper, and the middle hepatic artery arising from the lower portion of the umbilical point of the left hepatic artery. 2. The most common variation of the intrahepatic arteries involved the middle hepatic artery. 3. Among the variations of the middle hepatic artery, right, middle and left hepatic arteries arising from the same location on the hepatic artery proper was the most common type: 17 of 125 cases (13.6%).
Anon.
1980-01-01
The schedules for waste package development for the various host rocks were presented. The waste form subtask activities were reviewed, with the papers focusing on high-level waste, transuranic waste, and spent fuel. The following ten papers were presented: (1) Waste Package Development Approach; (2) Borosilicate Glass as a Matrix for Savannah River Plant Waste; (3) Development of Alternative High-Level Waste Forms; (4) Overview of the Transuranic Waste Management Program; (5) Assessment of the Impacts of Spent Fuel Disassembly - Alternatives on the Nuclear Waste Isolation System; (6) Reactions of Spent Fuel and Reprocessing Waste Forms with Water in the Presence of Basalt; (7) Spent Fuel Stabilizer Screening Studies; (8) Chemical Interactions of Shale Rock, Prototype Waste Forms, and Prototype Canister Metals in a Simulated Wet Repository Environment; (9) Impact of Fission Gas and Volatiles on Spent Fuel During Geologic Disposal; and (10) Spent Fuel Assembly Decay Heat Measurement and Analysis
Hansbøl, Mikala
…puts what we investigate into form through our descriptions. The paper takes its point of departure in empirical examples from a postdoc project about a so-called 'serious game', Mingoville. The project follows circulations and establishments of Mingoville 'on a global marketplace'. The paper discusses how we as researchers assemble/perform the phenomena we research. Actor-network theorist Bruno Latour (2005) points out that every description is also a form of explanation: a form of explanation that puts things into a script and thereby also gives things form. The paper discusses two approaches to doing serious games, and thereby to creating knowledge about engagements with these phenomena, in serious games research: experimental and ethnographic.
National Oceanic and Atmospheric Administration, Department of Commerce — The data set contains information from submitted NOAA Form 370s, also known as the Fisheries Certificate of Origin, for imported shipments of frozen and/or processed...
HR Department
2005-01-01
As announced in Weekly Bulletin 48/2004, from now onwards the paper MAPS appraisal report form has been replaced by an electronic form, which is available via EDH (on the EDH desktop under Other Tasks / HR & Training). No changes have been made to the contents of the form. Practical information will be available on the web page http://cern.ch/ais/projs/forms/maps/info.htm, and information meetings will be held on the following dates: 18 January 2005: MAIN AUDITORIUM (500-1-001), from 14:00 to 15:30; 20 January 2005: AB AUDITORIUM II (864-1-D02), from 14:00 to 15:30; 24 January 2005: AT AUDITORIUM (30-7-018), from 10:00 to 11:30. Human Resources Department, Tel. 73566
National Oceanic and Atmospheric Administration, Department of Commerce — These output tables contain parsed and format validated data from the various VMS forms that are sent from any given vessel, while at sea, from the VMS devices on...
Ishizaka, Shozo; Kato, Yoshihiro; Takaki, Ryuji; Toriwaki, Jun-ichiro
1987-01-01
The purpose of the Symposium was to discuss interdisciplinary scientific aspects of form. 'Form' depends on the material and on change, but it is form that appears evident at once and endures, and form is drawn from every field as a medium of information. One part of the work covers the description of non-periodic phenomena, morphogenesis and evolution; irreducible, stubborn facts such as diseases or social problems, and whatever else has resisted analysis, are challenged to be systematized by computer simulation. The other part covers the finding of laws that determine how systems behave, with attention paid to pattern recognition, image processing and pattern formation. The Symposium proceeded with no parallel sessions, and participants from various fields held lively discussions in an interdisciplinary atmosphere. (Auth.)
Cosma Emil; Jeflea Victor
2010-01-01
By using Word, Excel or PowerPoint, one can automate routine operations with the VBA language (Visual Basic for Applications). This language is also used in Access, allowing access to data stored in tables or queries; thus, Access and VBA resources can be used together. Access is designed for programming forms and reports (among other things), so none of the VBA editor's specific forms will be found there.
Jelena Rajić
2012-12-01
This paper examines some special uses of indicative and subjunctive verb forms in Spanish, which contemporary linguistics explains using the notions of polyphony, evidentials, echoic representation, quotatives, etc. These terms, even though they refer to different characteristics and belong to different theoretical frameworks, share one common feature: they all refer to diverse linguistic forms (discourse markers, linguistic negation, quotatives, echoic utterances, etc.) characterized by the presence and interaction of different voices or points of view in one discourse sequence. In this study we are interested in a description of quotative or polyphonic meanings expressed by specific verb forms and tenses, the imperfect and the conditional, and also by indicative forms in subordinate substantive clauses with a negative main verb and by subjunctive forms in subordinate concessive clauses. Our research focuses on the analysis of the linguistic conditions that make possible the evidential use of the conditional and the imperfect, and the echoic (metarepresentative) interpretation of indicative and subjunctive forms in the above-mentioned contexts. The examples we discuss show that evidential and echoic interpretations are inferential meanings derived from the extralinguistic situation and the knowledge that speakers have of the world.
Maxim V. Kharkevich
2014-01-01
Global governance as a concept defines the meaning of contemporary world politics, both as a discipline and as reality. An interdependent and globalized world requires governance, yet a global government has not been formed. The theoretical possibility of global governance without global government is proved and justified. The purpose of this article is to identify analytically the possible forms of global governance. Three such forms are identified: hierarchical, market and network. In a hierarchy, governance rests on the asymmetry of power between the parties. Market governance operates through an anonymous pricing mechanism. A network, in contrast to the market, is characterized by closer value links between the actors, but unlike in a hierarchical relationship the actors are free to leave the network. Global governance thus takes three forms and is implemented by different actors. No single form of global governance can be declared the most efficient: efficiency depends on the match between a form and an object of governance. It should be noted that meta-governance is likely to remain a monopoly of institutionally strong states in global governance.
Normal central retinal function and structure preserved in retinitis pigmentosa.
Jacobson, Samuel G; Roman, Alejandro J; Aleman, Tomas S; Sumaroka, Alexander; Herrera, Waldo; Windsor, Elizabeth A M; Atkinson, Lori A; Schwartz, Sharon B; Steinberg, Janet D; Cideciyan, Artur V
2010-02-01
To determine whether normal function and structure, as recently found in forms of Usher syndrome, also occur in a population of patients with nonsyndromic retinitis pigmentosa (RP). Patients with simplex, multiplex, or autosomal recessive RP (n = 238; ages 9-82 years) were studied with static chromatic perimetry. A subset was evaluated with optical coherence tomography (OCT). Co-localized visual sensitivity and photoreceptor nuclear layer thickness were measured across the central retina to establish the relationship of function and structure. Comparisons were made to patients with Usher syndrome (n = 83, ages 10-69 years). Cross-sectional psychophysical data identified patients with RP who had normal rod- and cone-mediated function in the central retina. There were two other patterns with greater dysfunction, and longitudinal data confirmed that progression can occur from normal rod and cone function to cone-only central islands. The retinal extent of normal laminar architecture by OCT corresponded to the extent of normal visual function in patients with RP. Central retinal preservation of normal function and structure did not show a relationship with age or retained peripheral function. Usher syndrome results were like those in nonsyndromic RP. Regional disease variation is a well-known finding in RP. Unexpected was the observation that patients with presumed recessive RP can have regions with functionally and structurally normal retina. Such patients will require special consideration in future clinical trials of either focal or systemic treatment. Whether there is a common molecular mechanism shared by forms of RP with normal regions of retina warrants further study.
Is My Penis Normal? (For Teens)
KidsHealth / For Teens (also available in Spanish: ¿Es normal mi pene?). Reassurance for any guy who's ever worried about whether his penis is a normal size: there's a fairly wide ...
Normal vibrations in gallium arsenide
Dolling, G.; Waugh, J.L.T.
1964-01-01
The triple axis crystal spectrometer at Chalk River has been used to observe coherent slow neutron scattering from a single crystal of pure gallium arsenide at 296 °K. The frequencies of normal modes of vibration propagating in the [ζ00], [ζζζ], and [0ζζ] crystal directions have been determined with a precision of between 1 and 2·5 per cent. A limited number of normal modes have also been studied at 95 and 184 °K. Considerable difficulty was experienced in obtaining well resolved neutron peaks corresponding to the two non-degenerate optic modes for very small wave-vector, particularly at 296 °K. However, from a comparison of results obtained under various experimental conditions at several different points in reciprocal space, frequencies (units 10¹² c/s) for these modes (at 296 °K) have been assigned: T 8·02 ± 0·08 and L 8·55 ± 0·2. Other specific normal modes, with their measured frequencies, are (a) (1, 0, 0): TO 7·56 ± 0·08, TA 2·36 ± 0·015, LO 7·22 ± 0·15, LA 6·80 ± 0·06; (b) (0·5, 0·5, 0·5): TO 7·84 ± 0·12, TA 1·86 ± 0·02, LO 7·15 ± 0·07, LA 6·26 ± 0·10; (c) (0, 0·65, 0·65): optic 8·08 ± 0·13, 7·54 ± 0·12 and 6·57 ± 0·11, acoustic 5·58 ± 0·08, 3·42 ± 0·06 and 2·36 ± 0·04. These results are generally slightly lower than the corresponding frequencies for germanium. An analysis in terms of various modifications of the dipole approximation model has been carried out. A feature of this analysis is that the charge on the gallium atom appears to be very small, about +0·04 e. The frequency distribution function has been derived from one of the force models. (author)
Striving for the unknown normal
Nielsen, Mikka
During the last decade, more and more people have received prescriptions for ADHD drug treatment, and simultaneously the legitimacy of the ADHD diagnosis has been heavily debated among both professionals and laymen. Based on anthropological fieldwork among adults with ADHD, I illustrate how the ADHD diagnosis both answers and produces existential questions about what counts as normal behaviour and emotions. The diagnosis helps the diagnosed to identify, accept and handle problems by offering concrete explanations and solutions to diffusely experienced problems. But the diagnostic process is not only a clarifying procedure with a straight plan for treatment and direct effects. It is also a messy affair. In a process of experimenting with drugs and attempting to determine how or whether the medication eliminates the correct symptoms, the diagnosed is put in an introspective, self...
IIH with normal CSF pressures?
Soh Youn Suh
2013-01-01
Idiopathic intracranial hypertension (IIH) is a condition of raised intracranial pressure (ICP) in the absence of space-occupying lesions. ICP is usually measured by lumbar puncture, and a cerebrospinal fluid (CSF) pressure above 250 mm H₂O is one of the diagnostic criteria of IIH. Recently, we have encountered two patients who complained of headaches and exhibited disc swelling without an increased ICP. We prescribed acetazolamide and followed both patients frequently because of the definite disc swelling with IIH-related symptoms. Symptoms and signs resolved in both patients after they started taking acetazolamide. It is generally known that an elevated ICP, as measured by lumbar puncture, is the most important diagnostic sign of IIH. However, these cases caution that, even when CSF pressure is within the normal range, suspicion should be raised when a patient has papilledema with related symptoms, since untreated papilledema may cause progressive and irreversible visual loss.
Transport through hybrid superconducting/normal nanostructures
Futterer, David
2013-01-29
We mainly investigate transport through interacting quantum dots proximized by superconductors. For this purpose we extend an existing theory to describe transport through proximized quantum dots coupled to normal and superconducting leads. It allows us to study the influence of a strong Coulomb interaction on Andreev currents and Josephson currents. This is a particularly interesting topic because it combines two competing properties: in superconductors Cooper pairs are formed by two electrons which experience an attractive interaction while two electrons located on a quantum dot repel each other due to the Coulomb interaction. It seems at first glance that transport processes involving Cooper pairs should be suppressed because of the two competing interactions. However, it is possible to proximize the dot in nonequilibrium situations. At first, we study a setup composed of a quantum dot coupled to one normal, one ferromagnetic, and one superconducting lead in the limit of an infinitely-large superconducting gap. Within this limit the coupling between dot and superconductor is described exactly by the presented theory. It leads to the formation of Andreev-bound states (ABS) and an additional bias scheme opens in which a pure spin current, i.e. a spin current with a vanishing associated charge current, can be generated. In a second work, starting from the infinite-gap limit, we perform a systematic expansion of the superconducting gap around infinity and investigate Andreev currents and Josephson currents. This allows us to estimate the validity of infinite-gap calculations for real systems in which the superconducting gap is usually a rather small quantity. We find indications that a finite gap renormalizes the ABS and propose a resummation approach to explore the finite-gap ABS. Despite the renormalization effects the modifications of transport by finite gaps are rather small. This result lets us conclude that the infinite-gap calculation is a valuable tool to
Wald, J.W.; Lokken, R.O.; Shade, J.W.; Rusin, J.M.
1980-12-01
A number of alternative process and waste form options exist for the immobilization of nuclear wastes. Although data exist on the characterization of these alternative waste forms, a straightforward comparison of product properties is difficult due to the lack of standardized testing procedures. The characterization study described in this report applied the same volatility, mechanical strength and leach tests to ten alternative waste forms to assess product durability. Bulk property, phase analysis and microstructural examination of the simulated products, whose waste loadings varied from 5% to 100%, was also conducted. The specific waste forms investigated were as follows: Cold Pressed and Sintered PW-9 Calcine; Hot Pressed PW-9 Calcine; Hot Isostatic Pressed PW-9 Calcine; Cold Pressed and Sintered SPC-5B Supercalcine; Hot Isostatic Pressed SPC-5B Supercalcine; Sintered PW-9 and 50% Glass Frit; Glass 76-68; Celsian Glass Ceramic; Type II Portland Cement and 10% PW-9 Calcine; and Type II Portland Cement and 10% SPC-5B Supercalcine. Bulk property data were used to calculate and compare the relative waste form volumes produced at a spent fuel processing rate of 5 metric tons uranium/day. This quantity ranged from 3173 L/day (5280 kg/day) for 10% SPC-5B supercalcine in cement to 83 L/day (294 kg/day) for 100% calcine. The mechanical strength, volatility, and leach resistance tests provide data related to waste form durability. Glass, glass-ceramic and supercalcine ranked high in waste form durability, whereas the 100% PW-9 calcine ranked low. All other materials ranked between these two groupings.
CT in normal pressure hydrocephalus
Fujita, Katsuzo; Nogaki, Hidekazu; Noda, Masaya; Kusunoki, Tadaki; Tamaki, Norihiko
1981-01-01
CT scans were obtained on 33 patients (ages 31 to 73 years) with the diagnosis of normal pressure hydrocephalus. In each case, the diagnosis was made on the basis of the symptoms, CT, and cisternographic findings. The underlying diseases of normal pressure hydrocephalus were ruptured aneurysms (21 cases), arteriovenous malformations (2 cases), head trauma (1 case), cerebrovascular accidents (1 case), and idiopathic causes (8 cases). Sixteen of the 33 patients showed marked improvement; five, moderate or minimal improvement; and twelve, no change. The results were compared with CT findings and clinical response to shunting. CT findings were classified into five types, based on the degree of periventricular hypodensity (P.V.H.), the extent of brain damage by underlying diseases, and the degree of cortical atrophy. In 17 cases of type (I), CT showed the presence of P.V.H. with or without minimal frontal lobe damage and no cortical atrophy. Good surgical improvement was achieved in all cases of type (I) by shunting. In 4 cases of type (II), CT showed the presence of P.V.H. and severe brain damage without cortical atrophy. Fair clinical improvement was achieved in 2 cases (50%) by shunting. In one case of type (III), CT showed the absence of P.V.H. without brain damage or cortical atrophy. No clinical improvement was obtained by shunting in this type. In 9 cases of type (IV), with mild cortical atrophy, fair clinical improvement was achieved in two cases (22%) and no improvement in 7 cases. In 2 cases of type (V), with moderate or marked cortical atrophy, no clinical improvement was obtained by shunting. In conclusion, the present study showed a good correlation between the result of shunting and the CT type, and clinical response to a shunting operation might be predicted by classification of CT findings. (author)
Perron–Frobenius theorem for nonnegative multilinear forms and extensions
Friedland, S.; Gaubert, S.; Han, L.
2013-01-01
We prove an analog of Perron-Frobenius theorem for multilinear forms with nonnegative coefficients, and more generally, for polynomial maps with nonnegative coefficients. We determine the geometric convergence rate of the power algorithm to the unique normalized eigenvector.
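The power algorithm referred to above can be sketched for a small nonnegative multilinear form. The tensor below is an illustrative example, not taken from the paper; for a strictly positive tensor the iteration converges to the unique normalized Perron eigenvector:

```python
# Sketch of the power algorithm for a nonnegative multilinear form
# (illustrative 2x2x2 tensor; not the paper's example).

def apply_form(T, x):
    """F(x)_i = sum_{j,k} T[i][j][k] * x[j] * x[k]."""
    n = len(x)
    return [sum(T[i][j][k] * x[j] * x[k] for j in range(n) for k in range(n))
            for i in range(n)]

def power_iteration(T, iters=200):
    n = len(T)
    x = [1.0 / n] * n                     # positive start vector
    for _ in range(iters):
        y = apply_form(T, x)
        s = sum(y)                        # l1 normalization keeps x in the simplex
        x = [v / s for v in y]
    lam = sum(apply_form(T, x)) / sum(x)  # at the fixed point, F(x) = lam * x
    return lam, x

# A strictly positive (hence Perron-regular) tensor:
T = [[[2.0, 1.0], [1.0, 1.0]],
     [[1.0, 1.0], [1.0, 2.0]]]
lam, x = power_iteration(T)
```

At convergence the pair satisfies the eigenvalue equation F(x) = lam * x with x normalized in the l1 sense, which is the normalized eigenvector the theorem concerns.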
[Adult form of Pompe disease].
Ziółkowska-Graca, Bozena; Kania, Aleksander; Zwolińska, Grazyna; Nizankowska-Mogilnicka, Ewa
2008-01-01
Pompe disease (glycogen-storage disease type II) is an autosomal recessive disorder caused by a deficiency of lysosomal acid alpha-glucosidase (GAA), leading to the accumulation of glycogen in the lysosomes, primarily in muscle cells. In the adult form of the disease, proximal muscle weakness is noted and muscle volume is decreased. The infantile form is usually fatal. In the adult form of the disease the prognosis is relatively good. Muscle weakness may, however, interfere with normal daily activities, and respiratory insufficiency may be associated with obstructive sleep apnea. Death usually results from respiratory failure. Effective specific treatment is not available. Enzyme replacement therapy with recombinant human GAA (rh-GAA) still remains a research area. We report the case of a 24-year-old student admitted to the Department of Pulmonary Diseases because of severe respiratory insufficiency. Clinical symptoms such as dyspnea, muscular weakness and increased daytime sleepiness had been progressing for 2 years. Clinical examination and increased blood levels of CK suggested muscle pathology. Histopathological analysis of a muscle biopsy, performed under an electron microscope, confirmed the presence of vacuoles containing glycogen. The specific enzymatic activity of alpha-glucosidase was analyzed, confirming Pompe disease. The only effective method to treat the respiratory insufficiency was bi-level positive pressure ventilation. Respiratory rehabilitation was instituted and is still continued by the patient at home. A high-protein, low-sugar diet was proposed for the patient. Because of polyglobulia, low molecular weight heparin was prescribed. The patient is eligible for experimental replacement therapy with rh-GAA.
Generating All Circular Shifts by Context-Free Grammars in Chomsky Normal Form
Asveld, P.R.J.
2005-01-01
Let $\{a_1,a_2,\ldots,a_n\}$ be an alphabet of $n$ symbols and let $C_n$ be the language of circular shifts of the word $a_1a_2\cdots a_n$; so $C_n = \{a_1a_2\cdots a_{n-1}a_n, a_2a_3\cdots a_na_1, \ldots, a_na_1\cdots a_{n-2}a_{n-1}\}$. We discuss a few families of context-free grammars $G_n$
Generating All Circular Shifts by Context-Free Grammars in Chomsky Normal Form
Asveld, P.R.J.
2006-01-01
Let $\{a_1,a_2,\ldots,a_n\}$ be an alphabet of $n$ symbols and let $C_n$ be the language of circular shifts of the word $a_1a_2\cdots a_n$; so $C_n = \{a_1a_2\cdots a_{n-1}a_n, a_2a_3\cdots a_na_1, \ldots, a_na_1\cdots a_{n-2}a_{n-1}\}$. We discuss a few families of context-free grammars $G_n$
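The target language $C_n$ itself is easy to enumerate directly, which gives a reference set against which the output of any candidate grammar $G_n$ could be checked. A minimal sketch (the alphabet string is an illustrative stand-in for $a_1a_2\cdots a_n$):

```python
# Enumerate C_n, the set of circular shifts of a word of n distinct symbols.
def circular_shifts(word):
    """Return the set of all rotations of `word`; |result| = len(word) when
    all symbols are distinct."""
    return {word[i:] + word[:i] for i in range(len(word))}

C4 = circular_shifts("abcd")   # stands in for a_1 a_2 a_3 a_4
```

Since $|C_n| = n$ for distinct symbols, a brute-force comparison of a grammar's generated words against this set is feasible for the small $n$ one would test by hand.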
Efficient Computation of Transition State Resonances and Reaction Rates from a Quantum Normal Form
Schubert, Roman; Waalkens, Holger; Wiggins, Stephen
2006-01-01
A quantum version of a recent formulation of transition state theory in phase space is presented. The theory developed provides an algorithm to compute quantum reaction rates and the associated Gamow-Siegert resonances with very high accuracy. The algorithm is especially efficient for
A normal form for hypergraph-based module extraction for SROIQ
Nortje, R
2012-12-01
Modularization is an important part of the design and maintenance of large-scale ontologies. Syntactic locality modules, with their desirable model-theoretic properties, play an ever increasing role in the design of algorithms...
Boldyreff, B; Meggio, F; Pinna, L A
1993-01-01
Twenty-one mutants of the noncatalytic beta-subunit of human casein kinase-2 have been created, expressed in Escherichia coli, and purified to homogeneity. They are either modified at the autophosphorylation site (mutants beta delta 1-4 and beta A 5,6) or bear variable deletions in their C...
Normal forms of dispersive scalar Poisson brackets with two independent variables
Carlet, Guido; Casati, Matteo; Shadrin, Sergey
2018-03-01
We classify the dispersive Poisson brackets with one dependent variable and two independent variables, with leading order of hydrodynamic type, up to Miura transformations. We show that, in contrast to the case of a single independent variable for which a well-known triviality result exists, the Miura equivalence classes are parametrised by an infinite number of constants, which we call numerical invariants of the brackets. We obtain explicit formulas for the first few numerical invariants.
Institutional Inertia and Institutional Change in an Expanding Normal-Form Game
Torsten Heinrich
2013-08-01
We investigate aspects of institutional change in an evolutionary game-theoretic framework, focusing in particular on problems of coordination in groups when new solutions to a problem become available. In an evolutionary game with an underlying dilemma structure, we let a number of new strategies become gradually available to the agents. The dilemma structure of the situation is not changed by these. Older strategies offer a lower payoff than newly available ones. The problem that agents have to solve to realize improved results is, therefore, to coordinate on newly available strategies. Strategies are taken to represent institutions; the coordination on a new strategy by agents hence represents a change in the institutional framework of a group. The simulations we run show a stable pattern regarding such institutional changes. A number of institutions are found to coexist, with the specific number depending on the relation of payoffs achievable through the coordination of different strategies. Usually, the strategies leading to the highest possible payoff are not among these. This can be taken to reflect the heterogeneity of rules in larger groups, with different subgroups showing different behavior patterns.
N. Ye. Trekina
2014-11-01
A case is presented of the delayed diagnosis of bradysystole against a background of permanent atrial fibrillation (Frederick's syndrome), which led to syncope in the patient and to the later implantation of a pacemaker.
Fox, Robert V.; Zhang, Fengyan; Rodriguez, Rene G.; Pak, Joshua J.; Sun, Chivin
2016-06-21
Single source precursors or pre-copolymers of single source precursors are subjected to microwave radiation to form particles of a I-III-VI.sub.2 material. Such particles may be formed in a wurtzite phase and may be converted to a chalcopyrite phase by, for example, exposure to heat. The particles in the wurtzite phase may have a substantially hexagonal shape that enables stacking into ordered layers. The particles in the wurtzite phase may be mixed with particles in the chalcopyrite phase (i.e., chalcopyrite nanoparticles) that may fill voids within the ordered layers of the particles in the wurtzite phase, thus producing films with good coverage. In some embodiments, the methods are used to form layers of semiconductor materials comprising a I-III-VI.sub.2 material. Devices such as, for example, thin-film solar cells may be fabricated using such methods.
Biffis, Andrea; Dvorakova, Gita; Falcimaigne-Cordin, Aude
2012-01-01
The current state of the art in the development of methodologies for the preparation of MIPs in predetermined physical forms is critically reviewed, with particular attention paid to the forms most widely employed in practical applications, such as spherical beads in the micro- to nanometer range, microgels, monoliths, and membranes. Although applications of the various MIP physical forms are mentioned, the focus of the paper is mainly on the description of the various preparative methods. The aim is to provide the reader with an overview of the latest achievements in the field, as well as with a means for critically evaluating the various proposed methodologies towards an envisaged application. The review covers the literature up to early 2010, with special emphasis on the developments of the last 10 years.
Plumpton, C
1968-01-01
Sixth Form Pure Mathematics, Volume 1, Second Edition, is the first of a series of volumes on Pure Mathematics and Theoretical Mechanics for Sixth Form students whose aim is entrance into British and Commonwealth Universities or Technical Colleges. A knowledge of Pure Mathematics up to G.C.E. O-level is assumed and the subject is developed by a concentric treatment in which each new topic is used to illustrate ideas already treated. The major topics of Algebra, Calculus, Coordinate Geometry, and Trigonometry are developed together. This volume covers most of the Pure Mathematics required for t
Neilson, R.M. Jr.; Colombo, P.
1982-01-01
Contemporary solidification agents are being investigated relative to their applications to major fuel cycle and non-fuel cycle low-level waste (LLW) streams. Work is being conducted to determine the range of conditions under which these solidification agents can be applied to specific LLW streams. These studies are directed primarily towards defining operating parameters for both improved solidification of problem wastes and solidification of new LLW streams generated from advanced volume reduction technologies. Work is being conducted to measure relevant waste form properties. These data will be compiled and evaluated to demonstrate compliance with waste form performance and shallow land burial acceptance criteria and transportation requirements
Cohen, Adam B.
2009-01-01
Psychologists interested in culture have focused primarily on East-West differences in individualism-collectivism, or independent-interdependent self-construal. As important as this dimension is, there are many other forms of culture with many dimensions of cultural variability. Selecting from among the many understudied cultures in psychology,…
Inequalities for Differential Forms
Agarwal, Ravi P
2009-01-01
Presents a series of local and global estimates and inequalities for differential forms, in particular those that satisfy the A-harmonic equations. This work focuses on the Hardy-Littlewood, Poincaré, Caccioppoli, imbedding, and reverse Hölder inequalities. It is intended for researchers, instructors, and graduate students.
Disconnected electromagnetic form factors
Wilcox, Walter
2001-01-01
Preliminary results of a calculation of disconnected nucleon electromagnetic form factors on the lattice are presented. The implementation of the numerical subtraction scheme is outlined. A comparison of results for electric and magnetic disconnected form factors on two lattice sizes with those of the Kentucky group is presented. Unlike previous results, the results found in this calculation are consistent with zero in these sectors.
2002-12-01
This first documentary form, edited by the national association of local commissions of information about nuclear activities (ANCLI), briefly presents the radioactivity phenomenon, the ionising radiations, the characteristics of radiation sources (activity, half-life, energy), and dosimetry (absorbed, equivalent, and effective doses). (J.S.)
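The dosimetric quantities listed above chain together as weighted sums: equivalent dose weights the absorbed dose by radiation type, and effective dose weights equivalent doses by tissue. A hedged sketch, where the weighting factors are an illustrative subset in the spirit of ICRP tables, not a complete regulatory reference:

```python
# Illustrative sketch of the dosimetric chain:
#   equivalent dose H_T = sum_R w_R * D_{T,R}
#   effective dose  E   = sum_T w_T * H_T
# The weighting factors below are an illustrative subset only.

RADIATION_WEIGHT = {"photon": 1.0, "alpha": 20.0}   # w_R (illustrative subset)
TISSUE_WEIGHT = {"lung": 0.12, "liver": 0.04}       # w_T (illustrative subset)

def equivalent_dose(absorbed_gy_by_radiation):
    """H_T in sieverts, from absorbed doses (Gy) keyed by radiation type."""
    return sum(RADIATION_WEIGHT[r] * d for r, d in absorbed_gy_by_radiation.items())

def effective_dose(equivalent_sv_by_tissue):
    """E in sieverts, from equivalent doses (Sv) keyed by tissue."""
    return sum(TISSUE_WEIGHT[t] * h for t, h in equivalent_sv_by_tissue.items())

h_lung = equivalent_dose({"photon": 0.001, "alpha": 0.0005})  # hypothetical exposure
e = effective_dose({"lung": h_lung, "liver": 0.002})
```

The heavy weighting of alpha radiation illustrates why equal absorbed doses (in Gy) can correspond to very different equivalent doses (in Sv).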
Bay, Niels
2000-01-01
Lubrication is essential in most metal forming processes. The lubricant film has two basic functions, [1]: i. to separate the work piece and tool surfaces and ii. to cool the workpiece and the tool. Separation of the two surfaces implies lower friction facilitating deformation and lowering the tool...
Jaeger, Thomas Arvid
2012-01-01
a common professional language like in mathematics, colour and music. The result is a weaker professionalism in the aesthetic competences compared to the professionalism and competences in other areas. A research project [1] on contrasts or opposites in form investigated the phenomenon in the fields...
Knudsen, Morten
2011-01-01
and kept out of sight in the decision processes by looking at a specific case study involving the construction of a model intended to control, and render transparent, the quality of health services in Denmark. This paper outlines the forms of inattentiveness which make communication blind to information...
Clausen, Lars
2013-01-01
, both political and military, war between the two forms, the post-napoleonic, Fichtean notion of nationality (1807-8) and the historical notion of imperium. "Nationality" entered the political semantics with such a force and shook the existing political order of empires to the ground because of its...
Discover new cooperation forms
Anon.
2000-01-01
In spite of good forecasts concerning supply and demand, the gas market is full of uncertainties because of competition and industrial reorganization. Producers and operators are trying to define new forms of cooperation that protect their assets while allowing them to take advantage of market opportunities with a shared risk. (A.L.B.)
Personal Information Request Form
PC Forms Inc. 834-4048
To apply for information under the Privacy Act, complete this form or a written request mentioning the Act. Describe the information being sought and provide any relevant details necessary to help the International Development Research Centre (IDRC) find it. If you require assistance, refer to Info Source (Sources of ...
Heavy meson form factors from QCD
Falk, A.F.; Georgi, H.; Grinstein, B.
1990-01-01
We calculate the leading QCD radiative corrections to the relations which follow from the decoupling of the heavy quark spin as the quark mass goes to infinity and from the symmetry between systems with different heavy quarks. One of the effects we calculate gives the leading q^2-dependence of the form factor of a heavy quark, which in turn dominates the q^2-dependence of the form factors of bound states of the heavy quark with light quarks. This, combined with the normalization of the form factor provided by symmetry, gives us a first-principles calculation of the heavy meson (or baryon) form factors in the limit of very large heavy quark mass. (orig.)
Sonographic findings of normal newborn spinal cord
Park, Chan Sup; Kim, Dong Gyu
1988-01-01
The authors performed spinal cord ultrasonography on 21 healthy newborn infants in Gyeongsang National University Hospital. The normal spinal cord showed low echogenicity, similar to that of cerebrospinal fluid, and was demarcated by intense reflections from its dorsal and ventral surfaces. The central canal was routinely seen as a thin linear reflection in the center of the cord. The nerve roots making up the cauda equina formed a poorly defined collection of intense linear echoes extending from the conus. On real-time images, the normal spinal cord exhibited rather slow and rhythmical anteroposterior movement within the subarachnoid fluid. A distinct and rapid vascular pulsation of the spinal cord was usually recognizable. The approximate level of the vertebral bodies was determined as follows: the most ventrally located vertebral body was taken to be L5, and S1 was seen slightly posterior to L5, directed inferoposteriorly. According to the above criteria, the terminal portion of the spinal cord was seen around the L2 body at 5 MHz, and the pointed termination of the conus medullaris was clearly seen at the L2-3 junction and in the upper body of L3 at 7.5 MHz. It is therefore better to examine at 5 MHz for spatial orientation and then at 7.5 MHz for a more accurate examination. High-resolution, real-time ultrasonography was a safe, rapid screening technique for evaluation of the spinal cord in infants. Additional applications of spinal sonography may be possible in the evaluation of neonatal syringohydromyelia and meningocele, as well as intraspinal cyst localization for possible percutaneous puncture under ultrasound guidance
Chemical compatibility of DWPF canistered waste forms
Harbour, J.R.
1993-01-01
The Waste Acceptance Preliminary Specifications (WAPS) require that the contents of the canistered waste form are compatible with one another and the stainless steel canister. The canistered waste form is a closed system comprised of a stainless steel vessel containing waste glass, air, and condensate. This system will experience a radiation field and an elevated temperature due to radionuclide decay. This report discusses possible chemical reactions, radiation interactions, and corrosive reactions within this system both under normal storage conditions and after exposure to temperatures up to the normal glass transition temperature, which for DWPF waste glass will be between 440 and 460 degrees C. Specific conclusions regarding reactions and corrosion are provided. This document is based on the assumption that the period of interim storage prior to packaging at the federal repository may be as long as 50 years
Normalization of emotion control scale
Hojatoolah Tahmasebian
2014-09-01
Background: Emotion control skill teaches individuals how to identify their emotions and how to express and control them in various situations. The aim of this study was to normalize and measure the internal and external validity and reliability of an emotion control test. Methods: This standardization study was carried out on a statistical population including all pupils, students, teachers, nurses and university professors in Kermanshah in 2012, using Williams' emotion control scale. The subjects included 1,500 people (810 females and 690 males) who were selected by stratified random sampling. Williams' (1997) emotion control scale was used to collect the required data. The Emotional Control Scale is a tool for measuring the degree of control people have over their emotions. This scale has four subscales: anger, depressed mood, anxiety and positive affect. The collected data were analyzed by SPSS software using correlation and Cronbach's alpha tests. Results: The internal consistency of the questionnaire, reported by Cronbach's alpha, indicated an acceptable internal consistency for the emotional control scale, and the correlation between the subscales of the test and between the items of the questionnaire was significant at the 0.01 confidence level. Conclusion: The validity of the emotion control scale among pupils, students, teachers, nurses and university professors in Iran has an acceptable range, and the test items were correlated with each other, thereby making them appropriate for measuring emotion control.
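Cronbach's alpha, the internal-consistency statistic used in the study, can be computed directly from per-item scores as k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal stdlib sketch on made-up data, not the study's:

```python
# Cronbach's alpha from raw item scores (made-up data for illustration).

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

def cronbach_alpha(items):
    """items: one inner list of respondent scores per questionnaire item."""
    k = len(items)
    total_scores = [sum(col) for col in zip(*items)]      # per-respondent totals
    item_var_sum = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var_sum / variance(total_scores))

items = [[2, 4, 3, 5], [3, 4, 2, 5], [2, 5, 3, 4]]  # 3 items x 4 respondents (toy)
alpha = cronbach_alpha(items)
```

Values near 1 indicate that the items vary together, i.e. the "acceptable internal consistency" the study reports.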
Digital Pupillometry in Normal Subjects
Rickmann, Annekatrin; Waizel, Maria; Kazerounian, Sara; Szurman, Peter; Wilhelm, Helmut; Boden, Karl T.
2017-01-01
The aim of this study was to evaluate the pupil size of normal subjects at different illumination levels with a novel pupillometer. The pupil size of healthy study participants was measured with an infrared-video PupilX pupillometer (MEye Tech GmbH, Alsdorf, Germany) at five different illumination levels (0, 0.5, 4, 32, and 250 lux). Measurements were performed by the same investigator. Ninety images were executed during a measurement period of 3 seconds. The absolute linear camera resolution was approximately 20 pixels per mm. This cross-sectional study analysed 490 eyes of 245 subjects (mean age: 51.9 ± 18.3 years, range: 6–87 years). On average, pupil diameter decreased with increasing light intensities for both eyes, with a mean pupil diameter of 5.39 ± 1.04 mm at 0 lux, 5.20 ± 1.00 mm at 0.5 lux, 4.70 ± 0.97 mm at 4 lux, 3.74 ± 0.78 mm at 32 lux, and 2.84 ± 0.50 mm at 250 lux illumination. Furthermore, it was found that anisocoria increased by 0.03 mm per life decade for all illumination levels (R2 = 0.43). Anisocoria was higher under scotopic and mesopic conditions. This study provides additional information to the current knowledge concerning age- and light-related pupil size and anisocoria as a baseline for future patient studies.
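The reported age trend (anisocoria increasing by 0.03 mm per life decade) is the kind of result an ordinary least-squares fit produces. A sketch on made-up, exactly linear toy data, not the study's measurements:

```python
# Ordinary least-squares slope/intercept, applied to toy (age, anisocoria) data.

def ols_slope_intercept(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx                      # slope per unit of x
    return b, my - b * mx

ages = [10, 30, 50, 70]                # years (toy data)
aniso = [0.05, 0.11, 0.17, 0.23]       # mm; constructed at 0.03 mm per decade
slope, intercept = ols_slope_intercept(ages, aniso)
per_decade = slope * 10                # convert per-year slope to per-decade
```

On real data the fit would of course be noisy; the study's R2 of 0.43 indicates that age explains under half of the variance in anisocoria.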
Verdaguer, E.
1983-01-01
The short-wavelength normal modes of self-gravitating rotating polytropic discs in the Bardeen approximation are studied. The discs' oscillations can be seen in terms of two types of modes: the p-modes, whose driving forces are pressure forces, and the r-modes, driven by Coriolis forces. As a consequence of differential rotation, coupling between the two takes place and some mixed modes appear; their properties can be studied under the assumption of weak coupling, and it is seen that they avoid the crossing of the p- and r-modes. The short-wavelength analysis provides a basis for the classification of the modes, which can be made by using the properties of their phase diagrams. The classification is applied to the large-wavelength modes of differentially rotating discs with strong coupling and to a uniformly rotating sequence with no coupling, which have been calculated in previous papers. Many of the physical properties and qualitative features of these modes are revealed by the analysis. (author)
Reliability assessment based on small samples of normal distribution
Ma Zhibo; Zhu Jianshi; Xu Naixin
2003-01-01
When the pertinent parameter involved in reliability definition complies with normal distribution, the conjugate prior of its distributing parameters (μ, h) is of normal-gamma distribution. With the help of maximum entropy and the moments-equivalence principles, the subjective information of the parameter and the sampling data of its independent variables are transformed to a Bayesian prior of (μ,h). The desired estimates are obtained from either the prior or the posterior which is formed by combining the prior and sampling data. Computing methods are described and examples are presented to give demonstrations
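The conjugate normal-gamma update described above has a closed form: given data x_1..x_n, the prior (mu0, k0, a0, b0) maps to a posterior with kn = k0 + n, mun a precision-weighted mean, an = a0 + n/2, and bn absorbing both within-sample scatter and the prior-data mean discrepancy. A hedged sketch with illustrative hyperparameters and data (the values are assumptions, not from the paper):

```python
# Conjugate normal-gamma posterior update for normal data.
# Prior: mu | h ~ N(mu0, 1/(k0*h)), h ~ Gamma(a0, b0). Values are illustrative.

def normal_gamma_update(data, mu0, k0, a0, b0):
    n = len(data)
    xbar = sum(data) / n
    ss = sum((x - xbar) ** 2 for x in data)            # within-sample scatter
    kn = k0 + n
    mun = (k0 * mu0 + n * xbar) / kn                   # precision-weighted mean
    an = a0 + n / 2
    bn = b0 + 0.5 * ss + k0 * n * (xbar - mu0) ** 2 / (2 * kn)
    return mun, kn, an, bn

# Toy sample around the prior mean:
mun, kn, an, bn = normal_gamma_update([9.8, 10.1, 10.3],
                                      mu0=10.0, k0=1.0, a0=2.0, b0=1.0)
```

The posterior mean of mu is mun, and point estimates of the precision h follow from the Gamma(an, bn) marginal, which is how the desired reliability estimates are read off the posterior.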
Andreassen, Rikke; Ahmed Andresen, Uzma
2014-01-01
This article focuses on the doing and undoing of race in daily life practices in Denmark. It takes the form of a dialogue between two women, a heterosexual Muslim woman of color and a lesbian white woman, who discuss and analyze how their daily life, e.g. interactions with their children's schools and daycare institutions, shape their racial and gendered experiences. Drawing upon black feminist theory, postcolonial theory, critical race and whiteness studies, the two women illustrate inclusions and exclusions in their society based on gender, race, class and sexuality – and especially pinpoint how ... left behind – prevents contemporary people from addressing existing patterns of racial discrimination, inclusion and exclusion in their daily lives, as well as from connecting their contemporary struggles to historical struggles and inequalities. Furthermore, they illustrate how food, class and race...
Defining the "normal" postejaculate urinalysis.
Mehta, Akanksha; Jarow, Jonathan P; Maples, Pat; Sigman, Mark
2012-01-01
Although sperm have been shown to be present in the postejaculate urinalysis (PEU) of both fertile and infertile men, the number of sperm present in the PEU of the general population has never been well defined. The objective of this study was to describe the semen and PEU findings in both the general and infertile population, in order to develop a better appreciation for "normal." Infertile men (n = 77) and control subjects (n = 71) were prospectively recruited. Exclusion criteria included azoospermia and medications known to affect ejaculation. All men underwent a history, physical examination, semen analysis, and PEU. The urine was split into 2 containers: PEU1, the initial voided urine, and PEU2, the remaining voided urine. Parametric statistical methods were applied for data analysis to compare sperm concentrations in each sample of semen and urine between the 2 groups of men. Controls had higher average semen volume (3.3 ± 1.6 vs 2.0 ± 1.4 mL, P < ...) and sperm concentrations (112 million vs 56.2 million, P = .011) compared with infertile men. The presence of sperm in urine was common in both groups, but more prevalent among infertile men (98.7% vs 88.7%, P = .012), in whom it comprised a greater proportion of the total sperm count (46% vs 24%, P = .022). The majority of sperm present in PEU were seen in PEU1 of both controls (69%) and infertile men (88%). An association was noted between severe oligospermia (sperm counts < ...) and sperm counts in PEU (< ...); although infertile men had more sperm in the urine compared with controls, there is a large degree of overlap between the 2 populations, making it difficult to identify a specific threshold to define a positive test. Interpretation of a PEU should be directed by whether the number of sperm in the urine could affect subsequent management.
Turbocharging Normalization in Highland Conditions
I. V. Filippov
2017-01-01
Compressors of various types, including turbochargers producing compressed air, are used to support many production processes. The actual performance values of turbochargers used in highlands differ significantly from the certified values, and the parameters of the compressed air do not always guarantee smooth and efficient operation for consumers. The paper presents research results for turbochargers of the 4CI 425MX4 type, "CENTAC" series, manufactured by the Ingersoll-Rand Company. The research was conducted under industrial highland conditions in a difficult climatic environment. Turbochargers running in highland conditions have hardly been investigated: the combination of low atmospheric pressure with high intake-air temperature causes abnormal operating conditions. Only N. M. Barannikov has published theoretical studies of such operating conditions; practical research is not available at all. To normalize turbocharger operation, mechanical pressurization in the suction pipe was adopted. As a result of theoretical research, a TurboMAX MAX500 blower was chosen as the supercharger. The next stage of the theoretical research was to construct the characteristics of the 4CI 425MX4 turbocharger with a mechanical supercharger in the suction pipe. The boost minimizes the time during which additional compressors are used when the parameters of the intake air change, and it ensures smooth and efficient operation for consumers. To verify the results of the theoretical studies, namely the technique for recalculating the turbocharger characteristics under real suction conditions, experimental studies were carried out. The average error between the experimental and theoretical data is 2.9783%, which confirms the validity of the technique used for reducing the turbocharger characteristics to real suction conditions.
Lie algebra of conformal Killing–Yano forms
Ertem, Ümit
2016-01-01
We provide a generalization of the Lie algebra of conformal Killing vector fields to conformal Killing–Yano forms. A new Lie bracket for conformal Killing–Yano forms that corresponds to slightly modified Schouten–Nijenhuis bracket of differential forms is proposed. We show that conformal Killing–Yano forms satisfy a graded Lie algebra in constant curvature manifolds. It is also proven that normal conformal Killing–Yano forms in Einstein manifolds also satisfy a graded Lie algebra. The constructed graded Lie algebras reduce to the graded Lie algebra of Killing–Yano forms and the Lie algebras of conformal Killing and Killing vector fields in special cases. (paper)
Desplanques, B.
1987-01-01
Electromagnetic form factors are, in a first approximation, sensitive to the spatial distribution of nucleons and to their currents. In a second approximation, more subtle effects come into play, whose role increases with momentum transfer and which essentially pertain to the short-range description of nuclei. They involve, of course, nucleons that approach each other while keeping their free-state identity, but also nucleons that mutually polarize one another. This last effect includes radial and orbital excitations of the nucleon, modification of the nucleon's mesonic cloud, and excitation of nucleon-antinucleon pairs. In this paper, these contributions are discussed in an attempt to identify the elements important for a good description of form factors. Current questions are also discussed. Light nuclei are mainly considered. [fr]
Neilson, R.M. Jr.; Colombo, P.
1982-01-01
In this program, contemporary solidification agents are being investigated relative to their applications to major fuel cycle and non-fuel cycle low-level waste (LLW) streams. Work is being conducted to determine the range of conditions under which these solidification agents can be applied to specific LLW streams. These studies are directed primarily towards defining operating parameters for both improved solidification of problem wastes and solidification of new LLW streams generated from advanced volume reduction technologies. Work is being conducted to measure relevant waste form properties. These data will be compiled and evaluated to demonstrate compliance with waste form performance and shallow land burial acceptance criteria and transportation requirements (both as they exist and as they are modified with time). 6 tables
Fujimoto, J.; Ishikawa, T.; Kato, K.; Kaneko, T.; Nakazawa, N.; Shimizu, Y.; Vermaseren, J.; Yasui, Y.
2006-01-01
A new version of GRACE/1-loop, a system for the automatic calculation of 1-loop Feynman diagrams, has been developed after the replacement of its symbolic manipulation engine REDUCE by FORM. This enables efficient memory management, which is essential for handling a tremendous number of Feynman diagrams, so that the performance of GRACE/1-loop is significantly improved, as demonstrated in the text
Differential forms of supermanifolds
Beresin, P.A.
1979-01-01
The theory of differential and pseudo-differential forms on supermanifolds is constructed. Definitions and notation for super-analogues of the Pontryagin and Chern characteristic classes are given. The theory considered is purely local. The scheme suggested here generalizes the so-called Weil homomorphism for superspace, which lies at the basis of the theory of Chern and Pontryagin characteristic classes. The theory can be extended to global supermanifolds
Vaginal Discharge: What's Normal, What's Not
KidsHealth / For Teens. What Is Vaginal Discharge? Vaginal discharge is fluid that comes from ...
Should Japan Become a Normal Country
Yildiz, Ahmet
2005-01-01
This thesis evaluates Japanese geopolitical change in the post-Cold War era. It does so by analyzing Japan's history, its foreign policy since 1945, its reasons for becoming a normal country, and the impact of its normalization...
Peter Hacker
2015-10-01
Full Text Available The phrase 'Lebensform' (form of life) had a long and varied history prior to Wittgenstein's use of it on a mere three occasions in the Philosophical Investigations. It is not a pivotal concept in Wittgenstein's philosophy. But it is a minor signpost of a major reorientation of philosophy, philosophy of language and logic, and philosophy of mathematics that Wittgenstein instigated. For Wittgenstein sought to replace the conception of a language as a meaning calculus (Frege, Russell, the Tractatus) by an anthropological or ethnological conception. A language is not a class of sentences that can be formed from a set of axioms (definitions), formation and transformation rules, the meanings of which are given by their truth-conditions, but an open-ended series of interlocking language-games constituting a form of life or way of living (a culture). Wittgenstein's uses of 'Lebensform' and its cognates, both in the Investigations and in his Nachlass, are severally analysed, and various exegetical misinterpretations are clarified.
Computed tomography (CT) findings of the normal pancreas
Cho, Chi Ja; Kim, Byung Tae; Lee, Jeung Suk
1983-01-01
Conventional radiologic examination of the pancreas is too often unsatisfactory. It is well known that whole-body CT is very useful in identifying retroperitoneal pathology. The authors intended to present normal pancreatic morphology and data to prepare a basis for the interpretation of abnormalities. The results were as follows: 1. There were 36 male and 24 female patients, and their ages ranged from 7 to 78 years. 2. 1) The organs adjacent to the pancreas were the stomach, inferior vena cava, duodenum, caudate lobe of the liver, left kidney, left adrenal gland, superior mesenteric vessels, and spleen. 2) In 19 patients the pancreatic tail lay at the level of the left kidney in the transverse plane; relative to the left kidney it was ventral in 13 (68%), ventromedial in 2 (19%), and ventrolateral in 4 (21%). In the other 41 patients it was cranial to the upper pole of the left kidney: ventral in 25 (61%), ventromedial in 1 (2%), and ventrolateral in 15 (37%). 3) The pancreatic tail was cranial to the pancreatic body: 3 cm cranial in 2 (4%), 2-3 cm in 5 (8%), 1-2 cm in 6 (10%), and less than 1 cm in 11 (18%); in the others it was caudal, in 3 (5%). 4) The pancreatic tail was cranial to the level of the splenic hilum in 36 (60%) and 0-2 cm caudal in 24 (40%). 3. The pancreatic shape was a uniformly tapering form in 37 (62%) and a lobulated form in 23 (38%). 4. Pancreatic orientation was horizontal in 13 (22%), vertical in 56 (76%), and S-shaped in 1 (2%). 5. The pancreatic margin was smooth in 22 (37%) and lobulated in 38 (63%). 6. In most patients the pancreas was uniform in density. 7. Pancreatic size, expressed as a measurement ratio, was 0.5 ± 0.1 for the head in 48 (80%), 0.4 ± 0.1 for the body in 49 (88%), and 0.5 ± 0.1 for the tail in 47 (78%)
A note on totally normal spaces
Zougdani, H.K.
1990-10-01
In this note we give a necessary and sufficient condition on a topological space X for the product space X x Y to be totally normal for any (non-discrete) metric space Y, and we show that a totally normal p-space need not be perfectly normal in general, which makes Theorem 2 doubtful. (author). 6 refs
Manoussakis, G.; Delikaraoglou, D.
2011-01-01
In this paper we derive relations for determining the elements of the Eötvös matrix of the Earth's normal gravity field. In addition, a relation between the Gauss curvature of the normal equipotential surface and the Gauss curvature of the actual equipotential surface, both passing through the point P, is presented. For this purpose we use a global Cartesian system (X, Y, Z) and use the variables X and Y to form a local parameterization of a normal equipotential surface to describe its ...
Neutron scattering by normal liquids
Gennes, P.G. de [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires
1961-07-01
Neutron data on motions in normal liquids well below the critical point are reviewed and classified according to the order of magnitude of the momentum transfers ℏq and energy transfers ℏω. For large momentum transfers a perfect-gas model is valid. For smaller q and incoherent scattering, the major effects are related to the existence of two characteristic times: the period of oscillation of an atom in its cell, and the average lifetime of the atom in a definite cell. Various interpolation schemes covering both time scales are discussed. For coherent scattering and intermediate q, the energy spread is expected to show a minimum whenever q corresponds to a diffraction peak. For very small q the standard macroscopic description of density fluctuations is applicable. The limits of the various q and ω domains and the validity of the various approximations are discussed by a method of moments. The possibility of observing discrete transitions due to internal degrees of freedom in polyatomic molecules, in spite of the 'Doppler width' caused by translational motions, is also examined. (author)
Normal distal pulmonary vein anatomy
Wiesława Klimek-Piotrowska
2016-01-01
Full Text Available Background. It is well known that the pulmonary veins (PVs), especially their myocardial sleeves, play a critical role in the initiation and maintenance of atrial fibrillation. Understanding the PV anatomy is crucial for the safety and efficacy of all procedures performed on PVs. The aim of this study was to present normal distal PV anatomy and to create a juxtaposition of all PV ostium variants. Methods. A total of 130 randomly selected autopsied adult human hearts (Caucasian) were examined. The number of PV ostia was evaluated and their diameter was measured. The ostium-to-last-tributary distance and the macroscopic presence of myocardial sleeves were also evaluated. Results. Five hundred forty-one PV ostia were identified. Four classical PV ostia patterns (two left and two right PVs) were observed in 70.8% of all cases. The most common variant was the classical pattern with an additional middle right PV (19.2%), followed by a common ostium for the left superior and inferior PVs (4.44%). Mean diameters of PV ostia (for the classical pattern) were: left superior = 13.8 ± 2.9 mm; left inferior = 13.3 ± 3.4 mm; right superior = 14.3 ± 2.9 mm; right inferior = 13.7 ± 3.3 mm. When present, the additional middle right PV ostium had the smallest PV ostium diameter in the heart (8.2 ± 4.1 mm). The mean ostium-to-last-tributary (closest to the atrium) distances were: left superior = 15.1 ± 4.6 mm; left inferior = 13.5 ± 4.0 mm; right superior = 11.8 ± 4.0 mm; right inferior = 11.0 ± 3.7 mm. There were no statistically significant differences between sexes in ostia diameters and ostium-to-last-tributary distances. Conclusion. Only 71% of the cases have four standard pulmonary veins. The middle right pulmonary vein is present in almost 20% of patients. The presented data can provide useful information for clinicians during interventional procedures or radiologic examinations of PVs.
Method for construction of normalized cDNA libraries
Soares, Marcelo B.; Efstratiadis, Argiris
1998-01-01
This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to appropriate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library. This invention also provides normalized cDNA libraries generated by the above-described method and uses of the generated libraries.
Investigation of normal organ development with fetal MRI
Prayer, Daniela [Medical University of Vienna, Department of Radiology, Vienna (Austria)]; Brugger, Peter C. [Medical University of Vienna, Center of Anatomy and Cell Biology, Integrative Morphology Group, Vienna (Austria)]
2007-01-01
The understanding of the presentation of normal organ development on fetal MRI forms the basis for recognition of pathological states. During the second and third trimesters, maturational processes include changes in size, shape and signal intensities of organs. Visualization of these developmental processes requires tailored MR protocols. Further prerequisites for recognition of normal maturational states are unequivocal intrauterine orientation with respect to left and right body halves, fetal proportions, and knowledge about the MR presentation of extrafetal/intrauterine organs. Emphasis is laid on the demonstration of normal MR appearance of organs that are frequently involved in malformation syndromes. In addition, examples of time-dependent contrast enhancement of intrauterine structures are given. (orig.)
Twenty years have passed since Aalborg and North Jutland in 1997 got their own unique Master of Science in Engineering programme in the form of the Architecture & Design programme at Aalborg University. The whole course of events is now documented in an anniversary book, which tells how the programme was originally established; how the teaching ... a size of over 500 students, most of them based daily in the large CREATE building on the waterfront in Aalborg.
MR guided spatial normalization of SPECT scans
Crouch, B.; Barnden, L.R.; Kwiatek, R.
2010-01-01
Full text: In SPECT population studies where magnetic resonance (MR) scans are also available, the higher resolution of the MR scans allows for an improved spatial normalization of the SPECT scans. In this approach, the SPECT images are first coregistered to their corresponding MR images by a linear (affine) transformation which is calculated using SPM's mutual information maximization algorithm. Non-linear spatial normalization maps are then computed either directly from the MR scans using SPM's built-in spatial normalization algorithm, or from segmented T1 MR images using DARTEL, an advanced diffeomorphism-based spatial normalization algorithm. We compare these MR-based methods to standard SPECT-based spatial normalization for a population of 27 fibromyalgia patients and 25 healthy controls with spin-echo T1 scans. We identify significant perfusion deficits in prefrontal white matter in FM patients, with the DARTEL-based spatial normalization procedure yielding stronger statistics than the standard SPECT-based spatial normalization. (author)
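The affine coregistration step described above maximizes mutual information between the SPECT and MR images. As a minimal sketch of that criterion (a toy histogram-based estimator, not SPM's code; array names and bin count are illustrative), mutual information can be computed from the joint intensity histogram of two images:

```python
import numpy as np

def mutual_information(a: np.ndarray, b: np.ndarray, bins: int = 32) -> float:
    """Histogram-based mutual information between two equally shaped images."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()                  # joint probability
    px = pxy.sum(axis=1, keepdims=True)        # marginal of a
    py = pxy.sum(axis=0, keepdims=True)        # marginal of b
    nz = pxy > 0                               # skip empty bins, avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
img = rng.random((64, 64))
mi_related = mutual_information(img, img ** 2)            # intensity-remapped copy
mi_unrelated = mutual_information(img, rng.random((64, 64)))
```

A registration routine would vary the affine parameters of one image and keep the pose that maximizes this quantity; an intensity-remapped copy of an image scores far higher than an unrelated image, which is why the criterion works across modalities such as SPECT and MR.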
Rhodes, Mark A.
2008-10-21
A bipolar pulse forming transmission line module for linear induction accelerators having first, second, third, fourth, and fifth planar conductors which form an interleaved stack with dielectric layers between the conductors. Each conductor has a first end, and a second end adjacent an acceleration axis. The first and second planar conductors are connected to each other at the second ends, the fourth and fifth planar conductors are connected to each other at the second ends, and the first and fifth planar conductors are connected to each other at the first ends via a shorting plate adjacent the first ends. The third planar conductor is electrically connectable to a high voltage source, and an internal switch functions to short a high voltage from the first end of the third planar conductor to the first end of the fourth planar conductor to produce a bipolar pulse at the acceleration axis with a zero net time integral. Improved access to the switch is enabled by an aperture through the shorting plate and the proximity of the aperture to the switch.
Effects of variable transformations on errors in FORM results
Qin Quan; Lin Daojin; Mei Gang; Chen Hao
2006-01-01
On the basis of studies of the second partial derivatives of the variable transformation functions for nine different non-normal variables, the paper comprehensively discusses the effects of the transformation on FORM results. It shows that the signs and magnitudes of the errors in FORM results depend on the distributions of the basic variables, on whether the basic variables represent resistances or actions, and on the design point locations in the standard normal space. The transformations of exponential or Gamma resistance variables can generate +24% errors in the FORM failure probability, and the transformation of Frechet action variables can generate -31% errors
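The transformation the abstract refers to maps each non-normal basic variable to an equivalent standard normal variable by matching cumulative distribution functions, u = Φ⁻¹(F_X(x)). A minimal sketch for an exponential variable (the rate value is illustrative, not taken from the paper):

```python
from math import exp, log
from statistics import NormalDist

phi = NormalDist()   # standard normal distribution
lam = 2.0            # rate of the exponential variable (illustrative value)

def to_standard_normal(x: float) -> float:
    """Map an Exponential(lam) value x to standard normal space: u = Phi^-1(F(x))."""
    return phi.inv_cdf(1.0 - exp(-lam * x))

def to_physical(u: float) -> float:
    """Inverse map: a standard normal value u back to the exponential variable."""
    return -log(1.0 - phi.cdf(u)) / lam

u = to_standard_normal(0.7)
x_back = to_physical(u)   # round-trips to the original value
```

The errors discussed in the abstract arise because this mapping is nonlinear, so the linearization FORM performs at the design point is only approximate, with sign and size depending on the distribution and on where the design point falls.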
Hyperprolactinemia with normal serum prolactin: Its clinical significance
Manika Agarwal
2010-01-01
Full Text Available Amenorrhea and infertility with an added feature of galactorrhea makes a provisional diagnosis of hyperprolactinemia. But again, normal serum prolactin with all clinical features of hyperprolactinemia might question the diagnosis and further management. The answer lies in the heterogeneity of the peptide hormone - the immunoactive and the bioactive forms. This has been further illustrated with the help of a case which had been treated with cabergoline.
Forms and genesis of species abundance distributions
Evans O. Ochiaga
2015-12-01
Full Text Available Species abundance distribution (SAD is one of the most important metrics in community ecology. SAD curves take a hollow or hyperbolic shape in a histogram plot with many rare species and only a few common species. In general, the shape of SAD is largely log-normally distributed, although the mechanism behind this particular SAD shape still remains elusive. Here, we aim to review four major parametric forms of SAD and three contending mechanisms that could potentially explain this highly skewed form of SAD. The parametric forms reviewed here include log series, negative binomial, lognormal and geometric distributions. The mechanisms reviewed here include the maximum entropy theory of ecology, neutral theory and the theory of proportionate effect.
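The hollow-curve shape described above can be illustrated with a quick simulation (an illustrative sketch, not from the review; the log-normal parameters are assumptions): drawing species abundances from a log-normal yields many rare species and only a few common ones.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated community: species abundances drawn from a log-normal SAD.
n_species = 1000
abundances = np.ceil(rng.lognormal(mean=2.0, sigma=1.5, size=n_species)).astype(int)

median = np.median(abundances)
n_rare = int((abundances <= median).sum())         # species at or below median abundance
n_common = int((abundances >= 10 * median).sum())  # species far above the median

# Rank-abundance view: sorting descending shows the steep "hollow curve".
ranked = np.sort(abundances)[::-1]
```

Plotting a histogram of log2(abundances) ("octaves") would show the roughly bell-shaped Preston plot, while the raw abundance histogram is hollow, with rare species greatly outnumbering common ones.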
ELEMENTAL FORMS OF HOSPITALITY
Maximiliano Emanuel Korstanje
2010-11-01
Full Text Available Modern studies emphasize the need to research hospitality as a relevant aspect of the tourism and hospitality fields. However, these approaches are inextricably tied to the tourism industry and do not take seriously the anthropological and sociological roots of hospitality. In fact, the hotel seems to be only a partial sphere of hospitality. In this context, the present paper explores hospitality as rooted in the political and economic Indo-European principle of free transit, which is associated with a much broader origin. Starting from the premise that, etymologically, hostel and hospital share similar origins, we follow the contributions of J. Derrida to determine the elements that have formed hospitality up to date.
Ryong Ji, C.; Pang, A.; Szczepaniak, A. [North Carolina State Univ., Raleigh, NC (United States)
1994-04-01
It is pointed out that the correct criterion to define the legal PQCD contribution to the exclusive processes in the lightcone perturbative expansion should be based on the large off-shellness of the lightcone energy in the intermediate states. In the lightcone perturbative QCD calculation of the pion form factor, the authors find that the legal PQCD contribution defined by the lightcone energy cut saturates in the smaller Q{sup 2} region compared to that defined by the gluon four-momentum square cut. This is due to the contribution by the highly off-energy-shell gluons in the end point regions of the phase space, indicating that the gluon four-momentum-square cut may have cut too much to define the legal PQCD.
Nucleon Electromagnetic Form Factors
Marc Vanderhaeghen; Charles Perdrisat; Vina Punjabi
2007-10-01
There has been much activity in the measurement of the elastic electromagnetic proton and neutron form factors in the last decade, and the quality of the data has greatly improved by performing double polarization experiments, in comparison with previous unpolarized data. Here we review the experimental data base in view of the new results for the proton, and neutron, obtained at JLab, MAMI, and MIT-Bates. The rapid evolution of phenomenological models triggered by these high-precision experiments will be discussed, including the recent progress in the determination of the valence quark generalized parton distributions of the nucleon, as well as the steady rate of improvements made in the lattice QCD calculations.
McHugh, K. M.; Key, J. F.
The United States Council for Automotive Research (USCAR) has formed a partnership with the Idaho National Engineering Laboratory (INEL) to develop a process for the rapid production of low-cost tooling based on spray forming technology developed at the INEL. Phase 1 of the program will involve bench-scale system development, materials characterization, and process optimization. In Phase 2, prototype systems will be designed, constructed, evaluated, and optimized. Process control and other issues that influence commercialization will be addressed during this phase of the project. Technology transfer to USCAR, or a tooling vendor selected by USCAR, will be accomplished during Phase 3. The approach INEL is using to produce tooling, such as plastic injection molds and stamping dies, combines rapid solidification processing and net-shape materials processing into a single step. A bulk liquid metal is pressure-fed into a de Laval spray nozzle transporting a high velocity, high temperature inert gas. The gas jet disintegrates the metal into fine droplets and deposits them onto a tool pattern made from materials such as plastic, wax, clay, ceramics, and metals. The approach is compatible with solid freeform fabrication techniques such as stereolithography, selective laser sintering, and laminated object manufacturing. Heat is extracted rapidly, in-flight, by convection as the spray jet entrains cool inert gas to produce undercooled and semi-solid droplets. At the pattern, the droplets weld together while replicating the shape and surface features of the pattern. Tool formation is rapid; deposition rates in excess of 1 ton/h have been demonstrated for bench-scale nozzles.
Kohler, Susanna
2016-07-01
What causes the large-scale spiral structures found in some protoplanetary disks? Most models assume they're created by newly-forming planets, but a new study suggests that planets might have nothing to do with it. Perturbations from Planets? In some transition disks (protoplanetary disks with gaps in their inner regions) we've directly imaged large-scale spiral arms. Many theories currently attribute the formation of these structures to young planets: either the direct perturbations of a planet embedded in the disk cause the spirals, or they're indirectly caused by the orbit of a planetary body outside of the arms. Another example of spiral arms detected in a protoplanetary disk, MWC 758. [NASA/ESA/ESO/M. Benisty et al.] But what if you could get spirals without any planets? A team of scientists led by Matías Montesinos (University of Chile) has recently published a study in which they examine what happens to a shadowed protoplanetary disk. Casting Shadows with Warps In the team's setup, they envision a protoplanetary disk that is warped: the inner region is slightly tilted relative to the outer region. As the central star casts light out over its protoplanetary disk, this disk warping would cause some regions of the disk to be shaded in a way that isn't axially symmetric, with potentially interesting implications. Montesinos and collaborators ran 2D hydrodynamics simulations to determine what happens to the motion of particles within the disk when they pass in and out of the shadowed regions. Since the shadowed regions are significantly colder than the illuminated disk, the pressure in these regions is much lower. Particles are therefore accelerated and decelerated as they pass through these regions, and the lack of axial symmetry causes spiral density waves to form in the disk as a result. Initial profile for the stellar heating rate per unit area for one of the authors' simulations. The regions shadowed as a result of the disk warp subtend 0.5 radians each (shown on the left
Self-Esteem of Gifted, Normal, and Mild Mentally Handicapped Children.
Chiu, Lian-Hwang
1990-01-01
Administered Coopersmith Self-Esteem Inventory (SEI) Form B to elementary school students (N=450) identified as gifted, normal, and mild mentally handicapped (MiMH). Results indicated that both the gifted and normal children had significantly higher self-esteem than did the MiMH children, but there were no differences between gifted and normal…
Correlated random sampling for multivariate normal and log-normal distributions
Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.
2012-01-01
A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distribution can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include sampling of resonance parameters which are used for reactor calculations.
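A minimal sketch of such correlated sampling (illustrative values, not the authors' implementation): correlated normal draws are produced through a Cholesky factor of the covariance matrix, and a log-normal component is obtained by exponentiating the corresponding normal.

```python
import numpy as np

rng = np.random.default_rng(0)

mean = np.array([1.0, 2.0])             # means of the underlying normals
cov = np.array([[1.0, 0.8],
                [0.8, 1.0]])            # covariance of the underlying normals
is_lognormal = np.array([False, True])  # second variable is log-normally distributed

L = np.linalg.cholesky(cov)             # cov = L @ L.T
z = rng.standard_normal((100_000, 2))   # independent standard normal draws
samples = mean + z @ L.T                # correlated normal samples
samples[:, is_lognormal] = np.exp(samples[:, is_lognormal])

# The requested correlation is recovered on the underlying normal scale.
r = np.corrcoef(samples[:, 0], np.log(samples[:, 1]))[0, 1]
```

Because a log-normal is by definition the exponential of a normal, any mix of normally and log-normally distributed correlated variables can be sampled this way, with correlations specified on the underlying normal scale.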
Tsiapas, Elias; Soumelidou, Despina; Tsiapas, Christos
2017-04-01
When the Earth was formed, it was in a state of burning heat. As time went by, the temperature on the planet's surface fell due to radiation and heat transfer, and various components (crusts) began taking solid form at the Earth's poles. The formation of crusts took place at the Earth's poles because the stirring of burning and fluid masses on the surface of the Earth was significantly weaker there than at the equator. Due to centrifugal force and the Coriolis effect, these solid masses headed towards the equator; those originating from the North Pole followed a south-western course, while those originating from the South Pole followed a north-western course, and there they rotated from west to east at a lower speed than the underlying burning and liquid earth, because of their lower initial linear velocity, their solid state, and inertia. Because inertia is proportional to mass, the initially larger solid body swept up all new solid ones, incorporating them into its western side. The density of the new solid masses was higher, because the components on the surface would freeze and solidify first, before the underlying thicker components. As a result, the western side of the initial islet of solid rocks submerged, while the east side elevated. As a result of the above, this initial islet began to spin in reverse, and after taking on the shape of a sphere, it formed the "heart" of the Moon. The Moon-sphere, rolling on the equator, would sink the solid rocks that continued to descend from the Earth's poles. The sinking rocks partially melted because of the higher temperatures in the greater depths that the Moon descended to, while part of the rocks' mass bonded with the Moon and also served as a heat-insulating material, preventing the descended side of the sphere from melting. Combined with the Earth's liquid mass that covered its emerging eastern surface, new sphere-shaped shells were created, with increased density and very powerful structural cohesion. During the
Perry, Randall S.; Kolb, Vera M.; Lynne, Bridget Y.; Sephton, Mark A.; Mcloughlin, Nicola; Engel, Michael H.; Olendzenski, Lorraine; Brasier, Martin; Staley, James T., Jr.
2005-09-01
Desert varnish is a black, manganese-rich rock coating that is widespread on Earth. The mechanism underlying its formation, however, has remained unresolved. We present here new data and an associated model for how desert varnish forms, which substantively challenges previously accepted models. We tested both inorganic processes (e.g. clays and oxides cementing coatings) and microbial methods of formation. Techniques used in this preliminary study include SEM-EDAX with backscatter, HRTEM of focused ion beam prepared (FIB) wafers and several other methods including XRPD, Raman spectroscopy, XPS and Tof-SIMS. The only hypothesis capable of explaining a high water content, the presence of organic compounds, an amorphous silica phase (opal-A) and lesser quantities of clays than previously reported, is a mechanism involving the mobilization and redistribution of silica. The discovery of silica in desert varnish suggests labile organics are preserved by interaction with condensing silicic acid. Organisms are not needed for desert varnish formation but Bacteria, Archaea, Eukarya, and other organic compounds are passively incorporated and preserved as organominerals. The rock coatings thus provide useful records of past environments on Earth and possibly other planets. Additionally this model also helps to explain the origin of key varnish and rock glaze features, including their hardness, the nature of the "glue" that binds heterogeneous components together, its layered botryoidal morphology, and its slow rate of formation.
2005-01-01
An image H(x, y) for displaying a target image G(x, y) is displayed on a liquid-crystal display panel, and illumination light from an illumination light source is made to pass therethrough to form an image on a PALSLM. Read light hv is radiated onto the PALSLM, and a phase-modulated light image alpha...... (x, y) read out of the PALSLM is subjected to a Fourier transform by a lens. A phase contrast filter gives a predetermined phase shift to only the zero-order light component of the Fourier light image alpha f(x, y). The phase-shifted light image is subjected to an inverse Fourier transform by a lens...... to project an output image O(x, y) onto an output plane. A light image O'(x, y) branched by a beam sampler is picked up by a pickup device, and an evaluation value calculating unit evaluates conformity between the image O(x, y) and the image G(x, y). A control unit performs feedback control of optical......
Ultrasonographic features of normal lower ureters
Kim, Young Soon; Bae, M. Y.; Park, K. J.; Jeon, H. S.; Lee, J. H.
1990-01-01
Although ultrasonographic evaluation of the normal ureters is difficult due to bowel gas, the lower segment of the normal ureters can be visualized using the urinary bladder as an acoustic window. The authors prospectively performed ultrasonography with the standard suprapubic technique and analyzed the ultrasonographic features of normal lower ureters in 79 cases (77%). The length of the visualized segment of the distal ureter ranged from 1.5 cm to 7.2 cm, and the visualized segment did not exceed 3.9 mm in maximum diameter. Knowledge of the sonographic features of the normal lower ureters can be helpful in the evaluation of pathologic or suspected pathologic conditions of the lower ureters
The "normal" elongation of river basins
Castelltort, Sebastien
2013-04-01
The spacing between major transverse rivers at the front of Earth's linear mountain belts consistently scales with about half of the mountain half-width [1], despite strong differences in climate and rock uplift rates. Like other empirical measures describing drainage network geometry, this result seems to indicate that the form of river basins, among other properties of landscapes, is invariant. Paradoxically, in many current landscape evolution models, the patterns of drainage network organization, as seen for example in drainage density and channel spacing, seem to depend on both climate [2-4] and tectonics [5]. Hovius' observation [1] is one of several unexplained "laws" in geomorphology that still sheds mystery on how water, and rivers in particular, shape the Earth's landscapes. This narrow range of drainage network shapes found in the Earth's orogens is classically regarded as an optimal catchment geometry that embodies a "most probable state" in the uplift-erosion system of a linear mountain belt. River basins currently having an aspect away from this geometry are usually considered unstable and expected to re-equilibrate over geological time-scales. Here I show that the Length/Width~2 aspect ratio of drainage basins in linear mountain belts is the natural expectation of sampling a uniform or normal distribution of basin shapes, and bears no information on the geomorphic processes responsible for landscape development. This finding also applies to Hack's [6] law of river basin areas and lengths, a close parent of Hovius' law. [1] Hovius, N. Basin Res. 8, 29-44 (1996) [2] Simpson, G. & Schlunegger, F. J. Geophys. Res. 108, 2300 (2003) [3] Tucker, G. & Bras, R. Water Resour. Res. 34, 2751-2764 (1998) [4] Tucker, G. & Slingerland, R. Water Resour. Res. 33, 2031-2047 (1997) [5] Tucker, G. E. & Whipple, K. X. J. Geophys. Res. 107, 1-1 (2002) [6] Hack, J. US Geol. Surv. Prof. Pap. 294-B (1957)
Vogel, H. [Asklepios Klinik St. Georg, Roentgenabteilung, Lohmuehlenstrasse 5, 20099 Hamburg (Germany)], E-mail: Hermann.vogel@ak-stgeorg.lbk-hh.de; Bartelt, D. [Asklepios Klinik St. Georg, Roentgenabteilung, Lohmuehlenstrasse 5, 20099 Hamburg (Germany)
2007-08-15
Purpose: Under war conditions, the weapons employed can be identified on radiographs obtained in X-ray diagnostics. The analysis of such X-ray films shows that additional information about the conditions of transport and treatment can be obtained; it shall be shown that there are X-ray findings which are typical of and characteristic of certain forms of warfare. Material and method: The radiograms have been collected over thirty years; they come from hospitals where war casualties had been treated, and from personal collections. Results: The material is selective, because in war X-ray diagnostics is limited and the interests of the opposing parties influence access to the material; furthermore, the possibilities to publish or to communicate facts and thoughts differ. Citizens of the USA, GB, France, or Israel will have easier access to journals than those of Vietnam, Chad, and Zimbabwe. Under war conditions, poor countries, like North Vietnam, may develop their own concepts of medical care. There are X-ray findings which are typical or even characteristic of air warfare, guerrilla warfare, gas war, desert warfare, conventional warfare, annihilation warfare, and city guerrilla warfare/civil war. The examples demonstrate that the weapons and the conditions of transport and treatment can be recognized from X-ray findings. The radiogram can be read like a document. Conclusion: In war, there are differences between treatment and imaging diagnostics in countries which control the airspace and in those which do not. Medical care of the poor, i.e. in countries (in general those opposing the western nations), will hardly be published, and poverty has no advocate.
Spinal cord normalization in multiple sclerosis.
Oh, Jiwon; Seigo, Michaela; Saidha, Shiv; Sotirchos, Elias; Zackowski, Kathy; Chen, Min; Prince, Jerry; Diener-West, Marie; Calabresi, Peter A; Reich, Daniel S
2014-01-01
Spinal cord (SC) pathology is common in multiple sclerosis (MS), and measures of SC-atrophy are increasingly utilized. Normalization reduces biological variation of structural measurements unrelated to disease, but optimal parameters for SC volume (SCV)-normalization remain unclear. Using a variety of normalization factors and clinical measures, we assessed the effect of SCV normalization on detecting group differences and clarifying clinical-radiological correlations in MS. 3T cervical SC-MRI was performed in 133 MS cases and 11 healthy controls (HC). Clinical assessment included expanded disability status scale (EDSS), MS functional composite (MSFC), quantitative hip-flexion strength ("strength"), and vibration sensation threshold ("vibration"). SCV between C3 and C4 was measured and normalized individually by subject height, SC-length, and intracranial volume (ICV). There were group differences in raw-SCV and after normalization by height and length (MS vs. HC; progressive vs. relapsing MS-subtypes, P normalization by length (EDSS:r = -.43; MSFC:r = .33; strength:r = .38; vibration:r = -.40), and height (EDSS:r = -.26; MSFC:r = .28; strength:r = .22; vibration:r = -.29), but diminished with normalization by ICV (EDSS:r = -.23; MSFC:r = -.10; strength:r = .23; vibration:r = -.35). In relapsing MS, normalization by length allowed statistical detection of correlations that were not apparent with raw-SCV. SCV-normalization by length improves the ability to detect group differences, strengthens clinical-radiological correlations, and is particularly relevant in settings of subtle disease-related SC-atrophy in MS. SCV-normalization by length may enhance the clinical utility of measures of SC-atrophy. Copyright © 2014 by the American Society of Neuroimaging.
An atlas of normal skeletal scintigraphy
Flanagan, J.J.; Maisey, M.N.
1985-01-01
This atlas was compiled to provide the neophyte as well as the experienced radiologist and the nuclear medicine physician with a reference on normal skeletal scintigraphy as an aid in distinguishing normal variations in skeletal uptake from abnormal findings. Each skeletal scintigraph is labeled, and utilizing an identical scale, a relevant skeletal photograph and radiograph are placed adjacent to the scintigraph
On normal modes in classical Hamiltonian systems
van Groesen, Embrecht W.C.
1983-01-01
Normal modes of Hamiltonian systems that are even and of classical type are characterized as the critical points of a normalized kinetic energy functional on level sets of the potential energy functional. With the aid of this constrained variational formulation, the existence of at least one family
Computerized three-dimensional normal atlas
Mano, Isamu; Suto, Yasuzo; Suzuki, Masataka; Iio, Masahiro.
1990-01-01
This paper presents our ongoing project in which normal human anatomy and its quantitative data are systematically arranged in a computer. The final product, the Computerized Three-Dimensional Normal Atlas, will be able to supply tomographic images in any direction, 3-D images, and coded information on organs, e.g., anatomical names, CT numbers, and T1 and T2 values. (author)
Pseudo--Normals for Signed Distance Computation
Aanæs, Henrik; Bærentzen, Jakob Andreas
2003-01-01
the relation of a point to a mesh. At the vertices and edges of a triangle mesh, the surface is not C^1 continuous. Hence, the normal is undefined at these loci. Thürmer and Wüthrich proposed the "angle weighted pseudo-normal" as a way to deal with this problem. In this paper, we...
The lambda sigma calculus and strong normalization
Schack-Nielsen, Anders; Schürmann, Carsten
Explicit substitution calculi can be classified into several distinct categories depending on whether they are confluent, meta-confluent, strong normalization preserving, strongly normalizing, simulating, fully compositional, and/or local. In this paper we present a variant of the λσ-calculus, ...
ESCA studies on leached glass forms
Dawkins, B.G.
1979-01-01
Electron Spectroscopy for Chemical Analysis (ESCA) results for frit, obsidian, NBS standard, and Savannah River Laboratory (SRL) glass forms that have been subjected to cumulative water leachings of 36 hours show that [Na] exhibits the largest and fastest change of all the elements observed. Leaching of surface Na occurred within minutes. Surface Na depletion increased with leach time. Continuous x-ray irradiation and argon ion milling induced Na mobility, precluding semiquantitative ESCA analysis at normal operating temperatures. However, the sample stage has been equipped with a liquid nitrogen supply and alkali mobility should be eliminated in future work
Normal zone soliton in large composite superconductors
Kupferman, R.; Mints, R.G.; Ben-Jacob, E.
1992-01-01
The study of normal zones of finite size (normal domains) in superconductors has continuously been a subject of interest in the field of applied superconductivity. It was shown that in homogeneous superconductors normal domains are always unstable, so that if a normal domain nucleates, it will either expand or shrink. While testing the stability of large cryostable composite superconductors, a new phenomenon was found: the existence of stable propagating normal solitons. The formation of these propagating domains was shown to be a result of the high Joule power generated in the superconductor during the relatively long process of current redistribution between the superconductor and the stabilizer. Theoretical studies were performed to investigate the propagation of normal domains in large composite superconductors in the cryostable regime. Huang and Eyssa performed numerical calculations simulating the diffusion of heat and current redistribution in the conductor, and showed the existence of stable propagating normal domains. They compared the velocity of normal domain propagation with the experimental data, obtaining reasonable agreement. Dresner presented an analytical method to solve this problem if the time dependence of the Joule power is given. He performed explicit calculations of normal domain velocity assuming that the Joule power decays exponentially during the process of current redistribution. In this paper, the authors propose a system of two one-dimensional diffusion equations describing the dynamics of the temperature and current density distributions along the conductor. Numerical simulations of the equations reconfirm the existence of propagating domains in the cryostable regime, while an analytical investigation supplies an explicit formula for the velocity of the normal domain.
Acoustic wave spread in superconducting-normal-superconducting sandwich
Urushadze, G.I.
2004-01-01
The spread of acoustic waves perpendicular to the boundaries between superconducting and normal metals in a superconducting-normal-superconducting (SNS) sandwich has been considered. The sound-induced alternating current flow has been found by the Green function method, and the coefficient of acoustic wave transmission through the junction, γ = (S1 - S2)/S1 (where S1 and S2 are the average energy flows formed on the first and second boundaries), has been investigated as a function of the phase difference between the superconductors. It is shown that while the SNS sandwich is almost transparent for acoustic waves (γ … τ0/τ), n = 0, 1, 2, … (where τ0/τ is the ratio of the broadening of the quasiparticle energy levels 1/τ in the impure normal metal, caused by scattering of the carriers by impurities, to the spacing between energy levels 1/τ0), γ = 2 (S2 = -S1), which corresponds to full reflection of the acoustic wave from the SNS sandwich. This result is valid in the limit of a pure normal metal, but in the main impurity case there are two amplification and reflection regions for acoustic waves. The result obtained shows promise for the SNS sandwich as an ideal mirror for acoustic wave reflection.
Human renin biosynthesis and secretion in normal and ischemic kidneys
Pratt, R.E.; Carleton, J.E.; Richie, J.P.; Heusser, C.; Dzau, V.J.
1987-01-01
The pathway of renin biosynthesis and secretion in normal and ischemic human kidneys has been investigated by pulse-labeling experiments. The results indicate that in normal human kidney, preprorenin is rapidly processed to 47-kDa prorenin. Microradiosequencing showed that this molecule is generated by cleavage between Gly-23 and Leu-24, yielding a 43-amino-acid proregion. Analysis of prorenin secreted by the kidney tissue yielded an identical sequence, indicating that prorenin is secreted without any further proteolysis. An examination of the kinetics of processing and secretion suggested that a majority of the newly synthesized prorenin is quickly secreted, while only a small fraction is processed intracellularly to mature renin. The differences in secretion kinetics between prorenin and mature renin, and the selective inhibition of prorenin secretion by monensin, suggest that they are secreted independently via two pathways: a constitutive pathway, probably from the Golgi or protogranules, that rapidly releases prorenin, and a regulated pathway that secretes mature renin from the mature granules. A comparison of the kinetics of processing between normal and ischemic tissues suggests that renal ischemia leads to an overall increase in the rate of processing of prorenin to mature renin. In addition, prolonged biosynthetic labeling of renin in the ischemic kidney yielded two smaller-molecular-weight immunoreactive forms suggestive of renin fragments that may be degradative products. These fragments were not detected in normal kidney tissue labeled for similar lengths of time.
Perturbative QCD and electromagnetic form factors
Carlson, C.E.; Gross, F.
1987-01-01
We calculate nucleon magnetic form factors using perturbative QCD for several distribution amplitudes, including a general one given in terms of Appell polynomials. We find that the magnitude and sign of both nucleon magnetic form factors can be explained within perturbative QCD. The observed normalization of G_Mp requires that the distribution amplitude be broader than its superhigh-momentum-transfer limit, and the G_Mn/G_Mp data may require the distribution amplitude to be asymmetric, in accord with distribution amplitudes derived from QCD sum rules. Some speculation as to how an asymmetric distribution amplitude can come about is offered. Finally, we show that the soft contributions corresponding to the particular distribution amplitudes we use need not be bigger than the data. 16 refs., 6 figs.
ArcForm - A multimodal notation
Allsopp, Benjamin Brink
ArcForm (AF) is a visual notation based on a new graph-like network structure. It supports a unique approach to labeling arcs and nodes that allows diverse and grammatically normal English (or other natural language) sentences to be embedded in the network (Allsopp, 2013). In doing this, AF combines … the familiarity and expressiveness of written natural language with the visuospatial intuition of navigating geographical maps. Thus AF simultaneously exploits visual, textual, linguistic and spatial modalities. In static representations AF seems to have various benefits. We believe that AF's multiple modalities … support better overview, aid memory and facilitate forming new insights. At the same time, AF's closeness to natural language allows it to remain cross-domain and multipurpose. However, AF is not limited to static representations and is designed to be supported digitally. Here we expect additional benefits …
Annotating Logical Forms for EHR Questions.
Roberts, Kirk; Demner-Fushman, Dina
2016-05-01
This paper discusses the creation of a semantically annotated corpus of questions about patient data in electronic health records (EHRs). The goal is to provide the training data necessary for semantic parsers to automatically convert EHR questions into a structured query. A layered annotation strategy is used which mirrors a typical natural language processing (NLP) pipeline. First, questions are syntactically analyzed to identify multi-part questions. Second, medical concepts are recognized and normalized to a clinical ontology. Finally, logical forms are created using a lambda calculus representation. We use a corpus of 446 questions asking for patient-specific information. From these, 468 specific questions are found containing 259 unique medical concepts and requiring 53 unique predicates to represent the logical forms. We further present detailed characteristics of the corpus, including inter-annotator agreement results, and describe the challenges automatic NLP systems will face on this task.
The morphological classification of normal and abnormal red blood cell using Self Organizing Map
Rahmat, R. F.; Wulandari, F. S.; Faza, S.; Muchtar, M. A.; Siregar, I.
2018-02-01
Blood is an essential component of living creatures, contained in the vascular space. Possible diseases can be identified through a blood test, for example by examining the form of the red blood cells. The normal or abnormal morphology of a patient's red blood cells is very helpful to doctors in detecting disease, and advances in digital image processing technology can be used to identify normal and abnormal blood cells automatically. This research used the self-organizing map method to classify normal and abnormal forms of red blood cells in digital images. The self-organizing map neural network classified the normal and abnormal forms of red blood cells in the input images with a testing accuracy of 93.78%.
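The paper's preprocessing and network details are not given in the abstract, but the core idea of classifying cell-shape features with a self-organizing map can be sketched. Below is a minimal 1-D SOM in Python/NumPy trained on hypothetical two-dimensional shape descriptors; the feature names, cluster values, and all parameters are illustrative assumptions, not the paper's:

```python
import numpy as np

def train_som(data, n_nodes=10, n_iter=200, seed=0):
    """Train a 1-D self-organizing map: for every sample, find the
    best-matching unit (BMU) and pull it and its grid neighbours toward
    the sample, with a learning rate and neighbourhood radius that
    shrink over time."""
    rng = np.random.default_rng(seed)
    weights = rng.random((n_nodes, data.shape[1]))
    for t in range(n_iter):
        lr = 0.5 * (1.0 - t / n_iter)                        # decaying learning rate
        radius = max(0.5, (n_nodes / 2) * (1.0 - t / n_iter))  # shrinking neighbourhood
        for x in data[rng.permutation(len(data))]:
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            grid_dist = np.abs(np.arange(n_nodes) - bmu)     # distance on the 1-D grid
            h = np.exp(-grid_dist ** 2 / (2.0 * radius ** 2))  # neighbourhood function
            weights += lr * h[:, None] * (x - weights)
    return weights

# Hypothetical 2-D shape descriptors (e.g. circularity, pallor ratio) for
# normal (round) and abnormal (irregular) cells -- illustrative values only.
rng = np.random.default_rng(1)
normal_cells = rng.normal([0.9, 0.3], 0.03, size=(40, 2))
abnormal_cells = rng.normal([0.4, 0.8], 0.03, size=(40, 2))
weights = train_som(np.vstack([normal_cells, abnormal_cells]))

# After training, the two cell types are won by different units.
bmu_normal = int(np.argmin(np.linalg.norm(weights - normal_cells.mean(0), axis=1)))
bmu_abnormal = int(np.argmin(np.linalg.norm(weights - abnormal_cells.mean(0), axis=1)))
```

Because the two classes activate best-matching units in different parts of the trained map, each unit can be labelled with the class of the training samples it wins, which is the usual way an unsupervised SOM is turned into a classifier.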
MR imaging of the ankle: Normal variants
Noto, A.M.; Cheung, Y.; Rosenberg, Z.S.; Norman, A.; Leeds, N.E.
1987-01-01
Thirty asymptomatic ankles were studied with high-resolution surface-coil MR imaging and reviewed for identification of normal structures. The MR appearance of the deltoid and posterior talo-fibular ligaments, the peroneus brevis and longus tendons, and the posterior aspect of the tibial-talar joint demonstrated several normal variants not previously described, which should not be misinterpreted as pathologic processes. The specific findings included (1) cortical irregularity of the posterior tibial-talar joint in 27 of 30 cases, which should not be mistaken for osteonecrosis; (2) a normal posterior talo-fibular ligament with irregular and frayed inhomogeneity, a normal variant, in seven of ten cases; and (3) fluid in the shared peroneal tendon sheath, which may be confused with a longitudinal tendon tear, in three of 30 cases. Ankle imaging with MR is still a relatively new procedure, and further investigation is needed to better define normal anatomy as well as normal variants. The authors describe several structures that normally present with variable MR imaging appearances. This is clinically significant for maintaining high sensitivity and specificity in MR imaging interpretation.
Normalization constraint for variational bounds on fluid permeability
Berryman, J.G.; Milton, G.W.
1985-01-01
A careful reexamination of the formulation of Prager's original variational principle for viscous flow through porous media has uncovered a subtle error in the normalization constraint on the trial functions. Although a certain surface integral of the true pressure field over the internal surface area always vanishes for isotropic materials, the corresponding surface integral for a given trial pressure field does not necessarily vanish but has nevertheless been previously neglected in the normalization. When this error is corrected, the form of the variational estimate is actually simpler than before and furthermore the resulting bounds have been shown to improve when the constant trial functions are used in either the two-point or three-point bounds
Principal Component Analysis for Normal-Distribution-Valued Symbolic Data.
Wang, Huiwen; Chen, Meiling; Shi, Xiaojun; Li, Nan
2016-02-01
This paper puts forward a new approach to principal component analysis (PCA) for normal-distribution-valued symbolic data, which has a vast potential of applications in the economic and management fields. We derive a full set of numerical characteristics and the variance-covariance structure for such data, which forms the foundation for our analytical PCA approach. Our approach is able to use all of the variance information in the original data, in contrast to the prevailing representative-type approaches in the literature, which use only centers, vertices, etc. The paper also provides an accurate approach to constructing the observations in a PC space based on the linear additivity property of the normal distribution. The effectiveness of the proposed method is illustrated by simulated numerical experiments. Finally, our method is applied to explain the puzzle of the risk-return tradeoff in China's stock market.
Detection of a normal zone in the MFTF magnets
Owen, E.W.
1979-01-01
A method is described for the electrical detection of a normal zone in inductively coupled superconducting coils. Measurements are made with two kinds of bridges: mutual-inductance bridges and self-inductance bridges. The bridge outputs are combined with other measured voltages to form a detector that can be realized with either analog circuits or a computer algorithm. The detection of a normal zone in a pair of coupled coils, each with taps, is discussed in detail. It is also shown that the method applies to a pair of coils when one has no taps, and to a pair in which one coil is superconducting and the other is not. The method is extended, in principle, to any number of coils. A description is given of a technique for balancing the bridges near the operating currents of the coils.
Defecography: A study of normal volunteers
Shorvon, P.; Stevenson, G.W.; McHugh, S.; Somers, P.
1987-01-01
This study of young volunteers was set up in an effort to establish true normal measurements for defecography with minimum selection bias. The results describe the mean (and the range) for the following: anorectal angle; anorectal junction position at rest; excursion on lift, strain, and evacuation; anal canal length and degree of closure; and the frequency and degree of features such as rectocele and intussusception which have previously been called abnormalities. The results indicate that there is a very wide range of normal appearances. Knowledge of these normal variations is important to avoid overreporting and unnecessary surgery
Normal-dispersion microresonator Kerr frequency combs
Xue Xiaoxiao
2016-06-01
Optical microresonator-based Kerr frequency comb generation has developed into a hot research area in the past decade. Microresonator combs are promising for portable applications due to their potential for chip-level integration and low power consumption. According to the group velocity dispersion of the microresonator employed, research in this field may be classified into two categories: the anomalous dispersion regime and the normal dispersion regime. In this paper, we discuss the physics of Kerr comb generation in the normal dispersion regime and review recent experimental advances. The potential advantages and future directions of normal dispersion combs are also discussed.
Advances in metal forming expert system for metal forming
Hingole, Rahulkumar Shivajirao
2015-01-01
This comprehensive book offers a clear account of the theory and applications of advanced metal forming. It provides a detailed discussion of specific forming processes, such as deep drawing, rolling, bending extrusion and stamping. The author highlights recent developments of metal forming technologies and explains sound, new and powerful expert system techniques for solving advanced engineering problems in metal forming. In addition, the basics of expert systems, their importance and applications to metal forming processes, computer-aided analysis of metalworking processes, formability analysis, mathematical modeling and case studies of individual processes are presented.
The total plasmatic estriol on normal gestation
Thiesen, A.L.
1980-01-01
The total plasmatic estriol in normal pregnant women was determined by a radioimmunological method using estriol labelled with 125I. The results obtained were similar to those of methods using 19C and 3H. (author)
Terre Haute and the Normal School Fire
Ferreira, Allen
1974-01-01
This paper examines the short history of the Terre Haute Normal School before its tragic burning on April 9, 1888 and relates that story to the course of events immediately following the fire. (Author)
Looking at Your Newborn: What's Normal
... features that may make a normal newborn look strange are temporary. After all, babies develop while immersed ... sleepy during the first day or two of life. Many new parents become concerned about their newborn's ...
Mental Health: What's Normal, What's Not?
Healthy Lifestyle Adult health Understanding what's considered normal mental health can be tricky. See how feelings, thoughts and behaviors determine mental health and how to recognize if you or a ...
Compressed normalized block difference for object tracking
Gao, Yun; Zhang, Dengzhuo; Cai, Donglan; Zhou, Hao; Lan, Ge
2018-04-01
Feature extraction is very important for robust, real-time tracking, and compressive sensing provides technical support for real-time feature extraction. However, existing compressive trackers have all been based on the compressed Haar-like feature, and how to compress other, more expressive high-dimensional features is worth researching. In this paper, a novel compressed normalized block difference (CNBD) feature is proposed. To resist noise effectively in the high-dimensional normalized pixel difference (NPD) feature, the normalized block difference feature extends the two pixels in the original NPD formula to two blocks. A CNBD feature is obtained by compressing a normalized block difference feature based on compressive sensing theory, with a sparse random Gaussian matrix as the measurement matrix. Comparative experiments with 7 trackers on 20 challenging sequences showed that the tracker based on the CNBD feature performs better than the other trackers, especially the FCT tracker based on the compressed Haar-like feature, in terms of AUC, SR and precision.
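The recipe described above, a high-dimensional vector of normalized block differences compressed by a sparse random Gaussian measurement matrix, can be sketched in a few lines of NumPy. This is a hedged illustration of the general construction, not the paper's exact CNBD implementation; the block size, pair sampling, and matrix density are assumptions:

```python
import numpy as np

def block_mean(img, y, x, s):
    """Mean intensity of the s-by-s block whose top-left corner is (y, x)."""
    return img[y:y + s, x:x + s].mean()

def nbd_features(img, pairs, s=4):
    """Normalized block difference for each pair of block positions:
    (m1 - m2) / (m1 + m2), the block analogue of the NPD pixel feature."""
    feats = []
    for (y1, x1), (y2, x2) in pairs:
        m1, m2 = block_mean(img, y1, x1, s), block_mean(img, y2, x2, s)
        feats.append(0.0 if m1 + m2 == 0 else (m1 - m2) / (m1 + m2))
    return np.array(feats)

def sparse_gaussian_matrix(m, n, density=0.1, seed=0):
    """Sparse random Gaussian measurement matrix: Gaussian entries kept
    with the given density, zero elsewhere."""
    rng = np.random.default_rng(seed)
    return rng.normal(size=(m, n)) * (rng.random((m, n)) < density)

# Toy image patch and randomly chosen block pairs (illustrative only).
rng = np.random.default_rng(2)
img = rng.random((32, 32))
pairs = [((rng.integers(0, 28), rng.integers(0, 28)),
          (rng.integers(0, 28), rng.integers(0, 28))) for _ in range(500)]
x = nbd_features(img, pairs)           # high-dimensional NBD vector
phi = sparse_gaussian_matrix(50, 500)  # compress 500 -> 50 dimensions
y = phi @ x                            # compressed feature used for tracking
```

The bounded range of the normalized difference, between -1 and 1 for non-negative intensities, is what gives this family of features its robustness to illumination scaling.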
Forced Normalization: Antagonism Between Epilepsy and Psychosis.
Kawakami, Yasuhiko; Itoh, Yasuhiko
2017-05-01
The antagonism between epilepsy and psychosis has been discussed for a long time. Landolt coined the term "forced normalization" in the 1950s to describe psychotic episodes associated with the remission of seizures and disappearance of epileptiform activity on electroencephalograms in individuals with epilepsy. Since then, neurologists and psychiatrists have been intrigued by this phenomenon. However, although collaborative clinical studies and basic experimental researches have been performed, the mechanism of forced normalization remains unknown. In this review article, we present a historical overview of the concept of forced normalization, and discuss potential pathogenic mechanisms and clinical diagnosis. We also discuss the role of dopamine, which appears to be a key factor in the mechanism of forced normalization. Copyright © 2017 Elsevier Inc. All rights reserved.
Efficient CEPSTRAL Normalization for Robust Speech Recognition
Liu, Fu-Hua; Stern, Richard M; Huang, Xuedong; Acero, Alejandro
1993-01-01
.... We compare the performance of these algorithms with the very simple RASTA and cepstral mean normalization procedures, describing the performance of these algorithms in the context of the 1992 DARPA...
Electromagnetic Hadronic Form-Factors
Edwards, Robert G.
2005-01-01
We present a calculation of the nucleon electromagnetic form-factors as well as the pion and rho to pion transition form-factors in a hybrid calculation with domain wall valence quarks and improved staggered (Asqtad) sea quarks
Right thoracic curvature in the normal spine
Masuda Keigo
2011-01-01
Abstract. Background: Trunk asymmetry and vertebral rotation, at times observed in the normal spine, resemble the characteristics of adolescent idiopathic scoliosis (AIS). Right thoracic curvature has also been reported in the normal spine. If the features of right thoracic curvature in the normal spine are determined to be the same as those observed in AIS, these findings might provide a basis for elucidating the etiology of this condition. For this reason, we investigated right thoracic curvature in the normal spine. Methods: For normal spinal measurements, 1,200 patients who underwent posteroanterior chest radiographs were evaluated. These consisted of 400 children (ages 4-9), 400 adolescents (ages 10-19) and 400 adults (ages 20-29), with each group comprising both genders. The exclusion criteria were obvious chest and spinal diseases. As side curvature is minimal in normal spines and the range over which curvature should be measured is difficult to ascertain, the typical curvature range in scoliosis patients was determined first, and the Cobb angle in normal spines was then measured over the same range as the scoliosis curve, from T5 to T12. Right thoracic curvature was given a positive value. The curve pattern was organized into three groups in each cohort: neutral (-1 degree to +1 degree), right (> +1 degree), and left (< -1 degree). Results: In the child group, 120 curved left, 125 were neutral and 155 curved right. In the adolescent group, 70 curved left, 114 were neutral and 216 curved right. In the adult group, 46 curved left, 102 were neutral and 252 curved right. The curvature pattern shifts to the right side in the adolescent group (p < ...). Conclusions: Based on standing chest radiographic measurements, a right thoracic curvature was observed in normal spines after adolescence.
Distinguishing hyperhidrosis and normal physiological sweat production
Thorlacius, Linnea; Gyldenløve, Mette; Zachariae, Claus
2015-01-01
… of this study was to establish reference intervals for normal physiological axillary and palmar sweat production. METHODS: Gravimetric testing was performed in 75 healthy control subjects. Subsequently, these results were compared with findings in a cohort of patients with hyperhidrosis and with the results … 100 mg/5 min. CONCLUSIONS: A sweat production rate of 100 mg/5 min as measured by gravimetric testing may be a reasonable cut-off value for distinguishing axillary and palmar hyperhidrosis from normal physiological sweat production.
Normalization based K means Clustering Algorithm
Virmani, Deepali; Taneja, Shweta; Malhotra, Geetika
2015-01-01
K-means is an effective clustering technique used to separate similar data into groups based on initial centroids of clusters. In this paper, a normalization-based K-means clustering algorithm (N-K-means) is proposed. The proposed algorithm applies normalization to the available data prior to clustering and calculates the initial centroids based on weights. Experimental results demonstrate the improvement of the proposed N-K-means clustering algorithm over the existing...
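As a rough illustration of why normalization before k-means matters, the following NumPy sketch min-max-normalizes the data and then runs standard Lloyd iterations for two clusters. The paper's weight-based centroid initialization is not specified in the abstract, so a simple deterministic farthest-point initialization stands in for it; the data are synthetic:

```python
import numpy as np

def min_max_normalize(X):
    """Scale every feature to [0, 1] so that no single feature dominates
    the Euclidean distances used by k-means."""
    mins, maxs = X.min(axis=0), X.max(axis=0)
    span = np.where(maxs > mins, maxs - mins, 1.0)  # guard constant columns
    return (X - mins) / span

def kmeans_two(X, n_iter=100):
    """Lloyd's algorithm for k = 2 with a deterministic initialization:
    the first point and the point farthest from it (a stand-in for the
    paper's weight-based initial centroids)."""
    c0 = X[0]
    c1 = X[np.argmax(np.linalg.norm(X - c0, axis=1))]
    centroids = np.stack([c0, c1])
    for _ in range(n_iter):
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)                  # assignment step
        new = np.array([X[labels == j].mean(axis=0)    # update step
                        if np.any(labels == j) else centroids[j]
                        for j in range(2)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# Two well-separated groups whose second feature has a much larger scale;
# without normalization that feature would dominate the clustering.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0.0, 0.0], [1.0, 100.0], size=(50, 2)),
               rng.normal([10.0, 1000.0], [1.0, 100.0], size=(50, 2))])
labels, _ = kmeans_two(min_max_normalize(X))
```

Without the normalization step, the large-scale second feature would contribute almost all of the Euclidean distance and the first feature would be effectively ignored.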
Sampling from the normal and exponential distributions
Chaplin, K.R.; Wills, C.A.
1982-01-01
Methods for generating random numbers from the normal and exponential distributions are described. These involve dividing each function into subregions, and for each of these developing a method of sampling usually based on an acceptance rejection technique. When sampling from the normal or exponential distribution, each subregion provides the required random value with probability equal to the ratio of its area to the total area. Procedures written in FORTRAN for the CYBER 175/CDC 6600 system are provided to implement the two algorithms
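The FORTRAN procedures themselves are not reproduced here, but the two ingredients the abstract describes can be illustrated in Python: inverse-transform sampling for the exponential distribution, and an acceptance-rejection scheme for the normal using an Exp(1) envelope for the half-normal. This is a textbook method, not necessarily the authors' exact subregion decomposition:

```python
import math
import random

def sample_exponential(lam=1.0):
    # Inverse-transform sampling: if U ~ Uniform(0, 1), then
    # -ln(1 - U) / lam follows an exponential distribution with rate lam.
    return -math.log(1.0 - random.random()) / lam

def sample_normal():
    # Acceptance-rejection sampling of the half-normal with an Exp(1)
    # envelope; a random sign then symmetrizes the result.
    while True:
        x = sample_exponential(1.0)
        # Accept with probability exp(-(x - 1)^2 / 2), the ratio of the
        # half-normal density to its scaled exponential envelope.
        if random.random() <= math.exp(-(x - 1.0) ** 2 / 2.0):
            return x if random.random() < 0.5 else -x

random.seed(42)
samples = [sample_normal() for _ in range(50000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

Because the acceptance probability is exactly the density ratio, accepted samples follow the half-normal distribution, and the random sign yields a standard normal; the sample mean and variance above should be close to 0 and 1 respectively.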
A compiler for variational forms
Kirby, Robert C.; Logg, Anders
2011-01-01
As a key step towards a complete automation of the finite element method, we present a new algorithm for automatic and efficient evaluation of multilinear variational forms. The algorithm has been implemented in the form of a compiler, the FEniCS Form Compiler FFC. We present benchmark results for a series of standard variational forms, including the incompressible Navier-Stokes equations and linear elasticity. The speedup compared to the standard quadrature-based approach is impressive; in s...
Kuiper, P.M.; van Dijk, Elisabeth M.A.G.; Boerma, A.K.; Weibelzahl, S.; Cristea, A.
2006-01-01
Adaptation of electronic forms seems to be a step forward in reducing the burden on people who fill in forms. Municipalities increasingly offer e-forms online that can be used to request a municipal product or service. To create adaptive e-forms that satisfy the needs of end-users, involvement of
Synthetic ossicular replacements: Normal and abnormal CT appearance
Swartz, J.D.; Zwillenberg, S.; Berger, A.S.; Granoff, D.W.; Popky, G.L.
1986-01-01
Numerous synthetic ossicular replacements are currently in use. The TORP conducts sound from the newly formed tympanic membrane to the oval window; the PORP is used when the stapes superstructure is maintained, being interposed between the tympanic membrane and the stapes capitulum. In 12 patients the surgical results of ossicular replacement procedures were good, which gave the authors the opportunity to study the normal CT appearance. In an additional 10 patients, CT was performed before surgical revision. Using CT, the authors were able to diagnose subluxation and fibrous-tissue fixation. In two patients the CT appearance was unremarkable, but at surgery lateralization of the graft was found, with a nonfunctioning interface.
Diffraction enhanced imaging of normal and arthritic mice feet
Crittell, Suzanne; Cheung, K.C.; Hall, Chris; Ibison, Mark; Nolan, Paul; Page, Robert; Scraggs, David; Wilkinson, Steve
2007-01-01
The aim of this experiment was to produce X-ray images of mice feet using the diffraction-enhanced imaging (DEI) system at the UK Synchrotron Radiation Source (SRS) at Daresbury. There were two broad types of mice feet samples studied: normal and arthritic. The two types of samples were imaged using several views and compared in order to determine whether it would be possible to detect the early morphological changes linked with this form of arthritis. We found that the DEI images produced were indeed of sufficient quality to show the presence of some osteoarthritic changes
The pulse-driven AC Josephson voltage standard
Kieler, Oliver
2016-01-01
In this contribution, quantum-accurate AC voltage sources are presented that enable the generation of arbitrary waveforms with the highest spectral purity over a wide bandwidth, from DC up to the MHz range. The core of these Josephson voltage standards is a series circuit of many thousands of Josephson junctions which, when irradiated with high-frequency radiation (microwaves), generate highly precise voltage values. In the current-voltage characteristic, steps of constant voltage, so-called Shapiro steps, occur. Illustratively, these steps can be described by the transfer of a certain number of flux quanta through the Josephson junctions.
Calcitonin serum levels in normal and in pathological conditions
Ziliotto, D.; Luisetto, G.; Zanatta, G.P.; Cataldi, F.; Zangari, M.; Gangemi, M.; Melanotte, P.L.; Caira, S.
1985-01-01
Radioimmunoassay of calcitonin (CT) gives variable results because of differences in the sensitivity and specificity of antibody preparations and because of the known immunoheterogeneity of circulating CT. These difficulties in interpreting the data have hindered our understanding of normal and abnormal CT physiology. The authors separated the biologically active CT monomer (CTm) from the higher-molecular-weight, biologically inactive forms before RIA. This makes it possible to re-evaluate the behaviour of CT under physiological conditions and to study its changes in diseases in which bone and mineral metabolism are in some way compromised. (Auth.)
Radiosensitivity of normal human epidermal cells in culture
Dover, R.; Potten, C.S.
1983-01-01
Using an in vitro culture system, the authors have derived β-radiation survival curves over a dose range of 0-8 Gy for the clonogenic cells of normal human epidermis. The culture system used allows the epidermal cells to stratify and form a multi-layered sheet of keratinizing cells. The cultures appear to be a very good model for epidermis in vivo. The survival curves show a population which is apparently more sensitive than murine epidermis in vivo. It remains unclear whether this is an intrinsic difference between the species or is a consequence of the in vitro cultivation of the human cells. (author)
R. Walter Heinrichs
2017-01-01
This study assessed whether cortical thickness across the brain and regionally in terms of the default mode, salience, and central executive networks differentiates schizophrenia patients and healthy controls with normal range or below-normal range cognitive performance. Cognitive normality was defined using the MATRICS Consensus Cognitive Battery (MCCB) composite score (T = 50 ± 10) and structural magnetic resonance imaging was used to generate cortical thickness data. Whole brain analysis revealed that cognitively normal range controls (n = 39) had greater cortical thickness than both cognitively normal (n = 17) and below-normal range (n = 49) patients. Cognitively normal controls also demonstrated greater thickness than patients in regions associated with the default mode and salience, but not central executive, networks. No differences on any thickness measure were found between cognitively normal range and below-normal range controls (n = 24) or between cognitively normal and below-normal range patients. In addition, structural covariance between network regions was high and similar across subgroups. Positive and negative symptom severity did not correlate with thickness values. Cortical thinning across the brain and regionally in relation to the default and salience networks may index shared aspects of the psychotic psychopathology that defines schizophrenia with no relation to cognitive impairment.
2014-01-01
M.A. (African Studies) The study deals with forms of address in isiZulu. It therefore considers the various aspects of speech that play a role when addressing a person, the factors affecting forms of address in isiZulu, and the effect of languages such as English, Afrikaans and other African languages on forms of address in isiZulu. Research conducted on forms of address in isiZulu in parts of Soweto found that forms of address are determined by different factors i...
Parser Adaptation for Social Media by Integrating Normalization
van der Goot, Rob; van Noord, Gerardus
This work explores normalization for parser adaptation. Traditionally, normalization is used as a separate pre-processing step. We show that integrating the normalization model into the parsing algorithm is beneficial. This way, multiple normalization candidates can be leveraged, which improves
Linear regression and the normality assumption.
Schmidt, Amand F; Finan, Chris
2017-12-16
Researchers often perform arbitrary outcome transformations to fulfill the normality assumption of a linear regression model. This commentary explains and illustrates that in large data settings such transformations are often unnecessary and, worse, may bias model estimates. Linear regression assumptions are illustrated using simulated data and an empirical example on the relation between time since type 2 diabetes diagnosis and glycated hemoglobin levels. Simulation results were evaluated on coverage, i.e., the number of times the 95% confidence interval included the true slope coefficient. Although outcome transformations bias point estimates, violations of the normality assumption in linear regression analyses do not. The normality assumption is necessary to unbiasedly estimate standard errors, and hence confidence intervals and P-values. However, in large sample sizes (e.g., where the number of observations per variable is >10), violations of this normality assumption often do not noticeably impact results. In contrast, assumptions on the parametric model, absence of extreme observations, homoscedasticity, and independence of the errors remain influential even in large sample size settings. Given that modern healthcare research typically includes thousands of subjects, focusing on the normality assumption is often unnecessary, does not guarantee valid results, and, worse, may bias estimates due to the practice of outcome transformations. Copyright © 2017 Elsevier Inc. All rights reserved.
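The abstract's central claim can be illustrated with a small simulation (a minimal sketch, not the authors' code; the data-generating model, sample sizes, and constants below are invented for the example): with heavily skewed, non-normal errors, the OLS slope estimate remains essentially unbiased.

```python
# Sketch: OLS slope stays unbiased under skewed (non-normal) errors.
# All numbers here are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
n, true_slope, reps = 2000, 0.5, 200
slopes_raw = []
for _ in range(reps):
    x = rng.uniform(0, 10, n)
    # exponential errors, shifted to zero mean: strongly skewed
    eps = rng.exponential(2.0, n) - 2.0
    y = 1.0 + true_slope * x + eps
    # closed-form simple-regression slope: cov(x, y) / var(x)
    slopes_raw.append(np.cov(x, y, bias=True)[0, 1] / np.var(x))

mean_slope = np.mean(slopes_raw)  # close to the true slope of 0.5
```

Averaged over repetitions, the estimated slope sits on top of the true value despite the normality violation, in line with the commentary's point that such violations affect standard errors rather than point estimates.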
Niemeyer, M.G.; St. Antonius Hospital Nieuwegein; Laarman, G.J.; Lelbach, S.; Cramer, M.J.; Ascoop, C.A.P.L.; Verzijlbergen, J.F.; Wall, E.E. van der; Zwinderman, A.H.; Pauwels, E.K.J.
1990-01-01
Quantitative thallium-201 myocardial exercise scintigraphy was tested in two patient populations representing alternative standards for cardiac normality: group I comprised 18 male uncatheterized patients with a low likelihood of coronary artery disease (CAD); group II contained 41 patients with normal coronary arteriograms. Group I patients were younger and achieved a higher rate-pressure product than group II patients; all had normal findings on physical examination and electrocardiography at rest and exercise. Group II comprised 21 females; 11 patients showed abnormal electrocardiography at rest, and five patients showed ischemic ST depression during exercise. Twelve patients had signs of minimal CAD. Twelve patients revealed abnormal visual and quantitative thallium findings; three of these patients had minimal CAD. Profiles of uptake and washout of thallium-201 were derived from both patient groups and compared with the normal limits developed by Maddahi et al. Low-likelihood and angiographically normal patients may thus differ substantially, and both sets of normal patients should be considered when establishing criteria of abnormality in exercise thallium imaging. When commercial software containing normal limits for quantitative analysis of exercise thallium-201 imaging is used in clinical practice, it is mandatory to compare these with normal limits of uptake and washout derived from the less heterogeneous group of low-likelihood subjects, which should be preferred when selecting a normal population to define normality. (author). 37 refs.; 3 figs; 1 tab
A rare case of a familial form of nonsyndromic trigonocephaly ...
The psychomotor development of patients is usually normal and the majority of cases are mild. Most cases are sporadic but familial forms with apparently autosomal dominant transmission have been reported (7-8%). However, the concordance rate of isolated trigonocephaly in monozygotic twins is 43%, suggesting that ...
Strong Bayesian evidence for the normal neutrino hierarchy
Simpson, Fergus; Jimenez, Raul; Verde, Licia [ICCUB, University of Barcelona (UB-IEEC), Marti i Franques 1, Barcelona, 08028 (Spain); Pena-Garay, Carlos, E-mail: fergus2@gmail.com, E-mail: raul.jimenez@icc.ub.edu, E-mail: penagaray@gmail.com, E-mail: liciaverde@icc.ub.edu [I2SysBio, CSIC-UVEG, P.O. 22085, Valencia, 46071 (Spain)
2017-06-01
The configuration of the three neutrino masses can take two forms, known as the normal and inverted hierarchies. We compute the Bayesian evidence associated with these two hierarchies. Previous studies found a mild preference for the normal hierarchy, driven by the asymmetric manner in which cosmological data have confined the available parameter space. Here we identify the presence of a second asymmetry, which is imposed by data from neutrino oscillations. By combining constraints on the squared-mass splittings [1] with the limit on the sum of neutrino masses of Σm_ν < 0.13 eV [2], and using a minimally informative prior on the masses, we infer odds of 42:1 in favour of the normal hierarchy, which is classified as 'strong' on the Jeffreys scale. We explore how these odds may evolve in light of higher-precision cosmological data, and discuss the implications of this finding with regard to the nature of neutrinos. Finally, the individual masses are inferred to be m_1 = 3.80 (+26.2, −3.73) meV; m_2 = 8.8 (+18, −1.2) meV; m_3 = 50.4 (+5.8, −1.2) meV (95% credible intervals).
de Carvalho, Paulo Victor Rodrigues; Gomes, José Orlando; Huber, Gilbert Jacob; Vidal, Mario Cesar
2009-05-01
A fundamental challenge in improving the safety of complex systems is to understand how accidents emerge in normal working situations, with equipment functioning normally in normally structured organizations. We present a field study of the en-route mid-air collision, in the clear afternoon Amazon sky, between a commercial carrier and an executive jet in which 154 people lost their lives, a study that illustrates one response to this challenge. Our focus was on how and why the several safety barriers of a well-structured air traffic system melted down, enabling this tragedy without any catastrophic component failure and in a situation where everything was functioning normally. We identify strong consistencies and feedbacks in the system's day-to-day functioning that made monitoring and awareness difficult, and the cognitive strategies that operators have developed to deal with overall system behavior. These findings emphasize the active problem-solving behavior needed in air traffic control work, and highlight how the day-to-day functioning of the system can jeopardize such behavior. An immediate consequence is that safety managers and engineers should review their traditional safety approach and accident models based on equipment failure probability, linear combinations of failures, rules and procedures, and human errors, to deal with complex patterns of coincidence possibilities, unexpected links, resonance among system functions and activities, and system cognition.
Computed tomography of the normal sternum
Goodman, L.R.; Teplick, S.K.; Kay, H.
1983-01-01
The normal CT anatomy of the sternum was studied in 35 patients. In addition to the normal appearance of the sternum, normal variants that may mimic disease were often noted. In the manubrium, part of the posterior cortical margin was unsharp and irregular in 34 of 35 patients. Part of the anterior cortical margin was indistinct in 20 of the 35 patients. Angulation of the CT gantry to a position more nearly perpendicular to the manubrium improved the definition of the cortical margins. The body of the sternum was ovoid to rectangular and usually had sharp cortical margins. Sections through the manubriosternal joint and xiphoid often demonstrated irregular mottled calcifications and indistinct margins, again simulating bony lesions. The rib insertions, sternoclavicular joints, and adjacent soft-tissue appearance also were evaluated
Asymptotic normalization coefficients and astrophysical factors
Mukhamedzhanov, A.M.; Azhari, A.; Clark, H.L.; Gagliardi, C.A.; Lui, Y.-W.; Sattarov, A.; Trache, L.; Tribble, R.E.; Burjan, V.; Kroha, V.; Carstoiu, F.
2000-01-01
The S factor for the direct capture reaction ⁷Be(p,γ)⁸B can be found at astrophysical energies from the asymptotic normalization coefficients (ANCs), which provide the normalization of the tails of the overlap functions for ⁸B → ⁷Be + p. Peripheral transfer reactions offer a technique to determine these ANCs. Using this technique, the ¹⁰B(⁷Be,⁸B)⁹Be and ¹⁴N(⁷Be,⁸B)¹³C reactions have been used to measure the asymptotic normalization coefficient for ⁷Be(p,γ)⁸B. These results provide an indirect determination of S₁₇(0). Analysis of the existing ⁹Be(p,γ)¹⁰B experimental data within the framework of the R-matrix method demonstrates that experimentally measured ANCs can provide a reasonable determination of direct radiative capture rates. (author)
Congenital anomalies and normal skeletal variants
Guebert, G.M.; Yochum, T.R.; Rowe, L.J.
1987-01-01
Congenital anomalies and normal skeletal variants are a common occurrence in clinical practice. In this chapter a large number of skeletal anomalies of the spine and pelvis are reviewed. Some of the more common skeletal anomalies of the extremities are also presented. The second section of this chapter deals with normal skeletal variants. Some of these variants may simulate certain disease processes. In some instances there are no clear-cut distinctions between skeletal variants and anomalies; therefore, there may be some overlap of material. The congenital anomalies are presented initially with accompanying text, photos, and references, beginning with the skull and proceeding caudally through the spine to then include the pelvis and extremities. The normal skeletal variants section is presented in an anatomical atlas format without text or references
X-ray emission from normal galaxies
Speybroeck, L. van; Bechtold, J.
1981-01-01
A summary of results obtained with the Einstein Observatory is presented. There are two general categories of normal galaxy investigation being pursued - detailed studies of nearby galaxies where individual sources can be detected and possibly correlated with galactic morphology, and shorter observations of many more distant objects to determine the total luminosity distribution of normal galaxies. The principal examples of the first type are the CFA study of M31 and the Columbia study of the Large Magellanic Cloud. The Columbia normal galaxy survey is the principal example of the second type, although there also are smaller CFA programs concentrating on early galaxies and peculiar galaxies, and MIT has observed some members of the local group. (Auth.)
Quantum arrival times and operator normalization
Hegerfeldt, Gerhard C.; Seidel, Dirk; Gonzalo Muga, J.
2003-01-01
A recent approach to arrival times used the fluorescence of an atom entering a laser-illuminated region. The resulting arrival-time distribution was close to the axiomatic distribution of Kijowski, but not exactly equal to it, neither in limiting cases nor after compensation of reflection losses by normalization on the level of expectation values. In this paper we employ a normalization on the level of operators, recently proposed in a slightly different context. We show that in this case the axiomatic arrival-time distribution of Kijowski is recovered as a limiting case. In addition, it is shown that Allcock's complex potential model is also a limit of the physically motivated fluorescence approach and is connected to Kijowski's distribution through operator normalization
Helicon normal modes in Proto-MPEX
Piotrowicz, P. A.; Caneses, J. F.; Green, D. L.; Goulding, R. H.; Lau, C.; Caughman, J. B. O.; Rapp, J.; Ruzic, D. N.
2018-05-01
The Proto-MPEX helicon source has been operating in a high-electron-density 'helicon mode'. Establishing plasma densities and magnetic field strengths under the antenna that allow the formation of normal modes of the fast wave is believed to be responsible for the 'helicon mode'. A 2D finite-element full-wave model of the helicon antenna on Proto-MPEX is used to identify the fast-wave normal modes responsible for the steady-state electron density profile produced by the source. We also show through the simulation that, in the regions of operation in which core power deposition is maximum, the slow wave does not deposit significant power except directly under the antenna. In a simulation where a normal mode is not excited, significant edge power is deposited in the mirror region.
Normalization as a canonical neural computation
Carandini, Matteo; Heeger, David J.
2012-01-01
There is increasing evidence that the brain relies on a set of canonical neural computations, repeating them across brain regions and modalities to apply similar operations to different problems. A promising candidate for such a computation is normalization, in which the responses of neurons are divided by a common factor that typically includes the summed activity of a pool of neurons. Normalization was developed to explain responses in the primary visual cortex and is now thought to operate throughout the visual system, and in many other sensory modalities and brain regions. Normalization may underlie operations such as the representation of odours, the modulatory effects of visual attention, the encoding of value and the integration of multisensory information. Its presence in such a diversity of neural systems in multiple species, from invertebrates to mammals, suggests that it serves as a canonical neural computation. PMID:22108672
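The canonical divisive-normalization computation described above can be written in a few lines (a minimal sketch; the exponent n, semi-saturation constant sigma, and input values are illustrative assumptions, not parameters from the review): each response is divided by a factor that includes the summed activity of a pool of neurons.

```python
# Sketch of divisive normalization: R_i = d_i^n / (sigma^n + sum_j d_j^n).
# sigma and n are illustrative constants, not values from the paper.
import numpy as np

def divisive_normalization(drive, sigma=1.0, n=2.0):
    """Divide each neuron's driven response by pooled population activity."""
    d = np.asarray(drive, dtype=float) ** n
    return d / (sigma ** n + d.sum())

resp = divisive_normalization([1.0, 2.0, 4.0])
# stronger pooled activity suppresses every individual response
```

Because the denominator pools over all inputs, scaling the whole input up leaves the relative responses compressed toward a fixed range, which is the gain-control behavior the review attributes to this computation.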
Undercuts by Laser Shock Forming
Wielage, Hanna; Vollertsen, Frank
2011-01-01
In laser shock forming, TEA-CO₂-laser-induced shock waves are used to form metal foils such as aluminum or copper. The process utilizes a plasma shock wave initiated on the target surface, which leads to forming of the foil. A challenge in forming technologies is the manufacturing of undercuts; such special forms are not feasible with conventional forming methods. In this article it is shown that undercuts in the micro range can be produced by laser shock deep drawing. Different drawing die diameters, drawing die depths, and aluminum in thicknesses of 20 and 50 μm were investigated. It is shown that smaller die diameters facilitate undercuts compared to bigger die diameters. This phenomenon can be explained by Barlow's formula. Furthermore, it is shown which maximum undercut depth can be reached at different die diameters. To this end, cross-sections of the different parameter combinations are displayed.
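Barlow's formula, invoked above, relates the pressure p a thin-walled tube of diameter d and wall thickness t can sustain to the material stress s: p = 2st/d. A quick numerical sketch (the stress value and die diameters below are invented for illustration, not taken from the paper) shows why a smaller die diameter sustains a proportionally higher forming pressure:

```python
# Barlow's formula: p = 2 * s * t / d.
# Numbers are illustrative only (not the paper's experimental values).
def barlow_pressure(stress_mpa, thickness_mm, diameter_mm):
    """Sustainable pressure [MPa] for stress [MPa], thickness and diameter [mm]."""
    return 2.0 * stress_mpa * thickness_mm / diameter_mm

# a 20 um aluminium foil over two hypothetical die diameters
p_small = barlow_pressure(90.0, 0.020, 0.5)   # 0.5 mm die
p_large = barlow_pressure(90.0, 0.020, 2.0)   # 2.0 mm die
# the smaller die sustains 4x the pressure of the larger one
```

The inverse dependence on diameter is the qualitative point: at fixed foil thickness and material, shrinking the die diameter raises the pressure the deformed foil can carry, consistent with the observation that smaller dies facilitate undercuts.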
Forming processes and mechanics of sheet metal forming
Burchitz, I.A.
2004-01-01
The report deals with the numerical analysis of forming processes. Forming processes are a large group of manufacturing processes used to obtain various product shapes by means of plastic deformation. The report is organized as follows. An overview of the deformation processes and the
Normal stress Sestamibi study: why reinject?
Unger, S.A.; Hughes, T.
2000-01-01
Full text: Myocardial perfusion imaging (MPI) is widely used for risk stratification of patients with known or suspected coronary artery disease. A normal MPI study predicts a low annual cardiac event rate. It has been proposed to perform stress-only imaging with 99Tcm-Sestamibi (MIBI), omitting the rest study when the post-stress study is interpreted as normal. The safety of this approach has not been validated, with all published reports utilising both rest and stress images to interpret a study as 'normal'. Between 1/1/98 and 30/8/98, 489 patients were referred to our department for stress MPI. Of these, 237 were interpreted as normal on the basis of their post-stress study and did not undergo a rest study. Twelve-month clinical follow-up was available in 184 (78%) of these patients, representing the study group (82 males, 102 females; mean age 61 ± 12 years). Of these, 156 were referred for assessment of chest pain, three for dyspnoea, six for abnormal ECGs, and 19 for pre-operative evaluation. At one year of follow-up there were no myocardial infarcts, no admissions for unstable angina, and no cardiac deaths. Three patients died of non-cardiac causes. Seven patients underwent coronary angiography: five were normal, one had a single 50% stenosis, and one had an 80% vein graft stenosis which subsequently underwent angioplasty. In conclusion, a normal stress MIBI image predicts an excellent prognosis and negates the need for a rest reinjection study, thus reducing patient camera time and radiation exposure, improving departmental throughput, and minimising public health expenditure. Copyright (2000) The Australian and New Zealand Society of Nuclear Medicine Inc
Normal modes of weak colloidal gels
Varga, Zsigmond; Swan, James W.
2018-01-01
The normal modes and relaxation rates of weak colloidal gels are investigated in calculations using different models of the hydrodynamic interactions between suspended particles. The relaxation spectrum is computed for freely draining, Rotne-Prager-Yamakawa, and accelerated Stokesian dynamics approximations of the hydrodynamic mobility in a normal mode analysis of a harmonic network representing several colloidal gels. We find that the density of states and spatial structure of the normal modes are fundamentally altered by long-ranged hydrodynamic coupling among the particles. Short-ranged coupling due to hydrodynamic lubrication affects only the relaxation rates of short-wavelength modes. Hydrodynamic models accounting for long-ranged coupling exhibit a microscopic relaxation rate for each normal mode, λ, that scales as l⁻², where l is the spatial correlation length of the normal mode. For the freely draining approximation, which neglects long-ranged coupling, the microscopic relaxation rate scales as l⁻ᵞ, where γ varies between three and two with increasing particle volume fraction. A simple phenomenological model of the internal elastic response to normal mode fluctuations is developed, which shows that long-ranged hydrodynamic interactions play a central role in the viscoelasticity of the gel network. Dynamic simulations of hard spheres that gel in response to short-ranged depletion attractions are used to test the applicability of the density of states predictions. For particle concentrations up to 30% by volume, the power law decay of the relaxation modulus in simulations accounting for long-ranged hydrodynamic interactions agrees with predictions generated by the density of states of the corresponding harmonic networks as well as experimental measurements. For higher volume fractions, excluded volume interactions dominate the stress response, and the prediction from the harmonic network density of states fails. Analogous to the Zimm model in polymer
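The normal mode analysis of a harmonic network mentioned above can be sketched in its simplest, freely-draining-style form (the five-bead chain and unit spring constants are invented for illustration; the paper's gel networks and hydrodynamic mobility models are far richer): the modes are the eigenvectors of the network's stiffness matrix, and the eigenvalues set the relaxation rates.

```python
# Sketch: normal modes of a tiny harmonic bead-spring network.
# A 1D chain of 5 beads with unit springs; freely draining, unit drag.
import numpy as np

n = 5
K = np.zeros((n, n))          # stiffness (graph Laplacian) matrix
for i in range(n - 1):
    K[i, i] += 1.0
    K[i + 1, i + 1] += 1.0
    K[i, i + 1] -= 1.0
    K[i + 1, i] -= 1.0

rates, modes = np.linalg.eigh(K)   # eigenvalues ~ relaxation rates
# rates[0] ~ 0 is the rigid-body translation; longer-wavelength
# modes have smaller eigenvalues and hence relax more slowly
```

The eigenvalue spectrum of this matrix is the "density of states" referred to in the abstract; replacing the identity mobility with a long-ranged hydrodynamic mobility changes the operator being diagonalized and hence the scaling of the rates with mode wavelength.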
Metabolomics data normalization with EigenMS.
Yuliya V Karpievitch
Liquid chromatography mass spectrometry (LC-MS) has become one of the analytical platforms of choice for metabolomics studies. However, LC-MS metabolomics data can suffer from the effects of various systematic biases. These include batch effects, day-to-day variations in instrument performance, signal intensity loss due to time-dependent effects of LC column performance, accumulation of contaminants in the MS ion source, and MS sensitivity, among others. In this study we aimed to test a singular value decomposition-based method, called EigenMS, for normalization of metabolomics data. We analyzed a clinical human dataset where LC-MS serum metabolomics data and physiological measurements were collected from thirty-nine healthy subjects and forty with type 2 diabetes, and applied EigenMS to detect and correct for any systematic bias. EigenMS works in several stages. First, EigenMS preserves the treatment group differences in the metabolomics data by estimating treatment effects with an ANOVA model (multiple fixed effects can be estimated). Singular value decomposition of the residuals matrix is then used to determine bias trends in the data. The number of bias trends is then estimated via a permutation test, and the effects of the bias trends are eliminated. EigenMS removed bias of unknown complexity from the LC-MS metabolomics data, allowing for increased sensitivity in differential analysis. Moreover, normalized samples correlated better with both other normalized samples and corresponding physiological data, such as blood glucose level, glycated haemoglobin, exercise central augmentation pressure normalized to a heart rate of 75, and total cholesterol. We were able to report 2578 discriminatory metabolite peaks in the normalized data (p < 0.05), as compared to only 1840 metabolite signals in the raw data. Our results support the use of singular value decomposition-based normalization for metabolomics data.
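The ANOVA-then-SVD stages described above can be caricatured in a few lines of numpy (a simplified sketch, not the EigenMS implementation: one fixed effect, a single bias trend assumed known rather than estimated by permutation test, and entirely synthetic data):

```python
# Sketch of SVD-based bias removal: subtract group means (the "ANOVA"
# step, preserving the treatment effect), SVD the residuals, zero the
# dominant bias trend, then add the means back. Synthetic data only.
import numpy as np

rng = np.random.default_rng(1)
n_feat, n_samp = 50, 20
groups = np.tile([0, 1], 10)            # alternating treatment labels
data = rng.normal(0.0, 1.0, (n_feat, n_samp))
data[:, groups == 1] += 1.0             # treatment effect to preserve
bias = np.outer(rng.normal(0.0, 1.0, n_feat), np.linspace(-1, 1, n_samp))
noisy = data + 3.0 * bias               # strong rank-1 systematic trend

# 1) estimate group means and remove them from the data
means = np.stack([noisy[:, groups == g].mean(axis=1) for g in (0, 1)], axis=1)
resid = noisy - means[:, groups]
# 2) SVD of the residuals; eliminate the top (bias) component
U, s, Vt = np.linalg.svd(resid, full_matrices=False)
s[0] = 0.0                              # one bias trend assumed here
cleaned = (U * s) @ Vt + means[:, groups]
```

On this synthetic example the cleaned matrix is substantially closer to the bias-free data than the noisy input, which is the mechanism by which the method gains sensitivity in differential analysis; EigenMS additionally chooses the number of trends to remove via a permutation test.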
Mathematical models of tumour and normal tissue response
Jones, B.; Dale, R.G.; Charing Cross Group of Hospitals, London
1999-01-01
The historical application of mathematics in the natural sciences and in radiotherapy is compared. The various forms of mathematical models and their limitations are discussed. The Linear Quadratic (LQ) model can be modified to include (i) radiobiological parameter changes that occur during fractionated radiotherapy, (ii) situations such as focal forms of radiotherapy, (iii) normal tissue responses, and (iv) to allow for the process of optimization. The inclusion of a variable cell loss factor in the LQ model repopulation term produces a more flexible clonogenic doubling time, which can simulate the phenomenon of 'accelerated repopulation'. Differential calculus can be applied to the LQ model after elimination of the fraction number integers. The optimum dose per fraction (maximum cell kill relative to a given normal tissue fractionation sensitivity) is then estimated from the clonogen doubling times and the radiosensitivity parameters (or α/β ratios). Economic treatment optimization is described. Tumour volume studies during or following teletherapy are used to optimize brachytherapy. The radiation responses of both individual tumours and tumour populations (by random sampling 'Monte-Carlo' techniques from statistical ranges of radiobiological and physical parameters) can be estimated. Computerized preclinical trials can be used to guide choice of dose fractionation scheduling in clinical trials. The potential impact of gene and other biological therapies on the results of radical radiotherapy are testable. New and experimentally testable hypotheses are generated from limited clinical data by exploratory modelling exercises. (orig.)
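The LQ relations referred to above can be made concrete with a small numerical sketch (the parameter values α = 0.3 Gy⁻¹, β = 0.03 Gy⁻², and an α/β ratio of 10 Gy are illustrative, textbook-style assumptions, not values from this paper):

```python
# Sketch of the basic LQ quantities: surviving fraction after n
# fractions of dose d, S = exp(-n*(a*d + b*d^2)), and biologically
# effective dose, BED = n*d*(1 + d/(a/b)). Parameters illustrative.
import math

def survival(n, d, a=0.3, b=0.03):
    """Surviving fraction; a in 1/Gy, b in 1/Gy^2."""
    return math.exp(-n * (a * d + b * d * d))

def bed(n, d, ab_ratio=10.0):
    """Biologically effective dose in Gy for alpha/beta = ab_ratio."""
    return n * d * (1.0 + d / ab_ratio)

s = survival(30, 2.0)   # a conventional 30 x 2 Gy schedule
e = bed(30, 2.0)        # 60 Gy * (1 + 2/10) = 72 Gy BED
```

Comparing BED values for the tumour (high α/β) and a late-reacting normal tissue (low α/β) across candidate schedules is the elementary version of the dose-per-fraction optimization the abstract describes.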
Computing Instantaneous Frequency by normalizing Hilbert Transform
Huang, Norden E.
2005-05-31
This invention presents the Normalized Amplitude Hilbert Transform (NAHT) and the Normalized Hilbert Transform (NHT), both of which are new methods for computing instantaneous frequency. The method is designed specifically to circumvent the limitations set by the Bedrosian and Nuttall theorems, and to provide a sharp local measure of error when the quadrature and the Hilbert transform do not agree. The motivation for this method is that straightforward application of the Hilbert transform, followed by taking the derivative of the phase angle as the instantaneous frequency (IF), leads to a mistake commonly made to this date. In order to make the Hilbert transform method work, the data have to obey certain restrictions.
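The normalization idea can be sketched with plain numpy (this is a rough illustration, not the patented NAHT/NHT algorithm: the envelope estimate, the test signal, and all constants are simplified assumptions). Dividing the signal by an empirical envelope through the maxima of its absolute value yields a near-unit-amplitude carrier whose Hilbert-transform phase gives a cleaner instantaneous frequency:

```python
# Sketch: normalize an AM signal to a unit-amplitude carrier, then
# take the analytic-signal phase derivative as instantaneous frequency.
import numpy as np

def analytic(x):
    """Analytic signal via FFT (numpy-only stand-in for a Hilbert transform)."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    h[1:N // 2] = 2.0
    h[N // 2] = 1.0            # N assumed even
    return np.fft.ifft(X * h)

t = np.linspace(0.0, 1.0, 4096, endpoint=False)
# 50 Hz carrier with slow amplitude modulation (illustrative signal)
x = (1.0 + 0.5 * np.sin(2 * np.pi * 2 * t)) * np.cos(2 * np.pi * 50 * t)

carrier = x.copy()
for _ in range(3):             # a few normalization passes
    a = np.abs(carrier)
    pk = np.where((a[1:-1] > a[:-2]) & (a[1:-1] > a[2:]))[0] + 1
    env = np.interp(t, t[pk], a[pk])      # envelope through |x| maxima
    carrier = carrier / np.maximum(env, 1e-9)

phase = np.unwrap(np.angle(analytic(carrier)))
inst_freq = np.gradient(phase, t) / (2 * np.pi)  # ~50 Hz in the interior
```

The patented method uses spline envelopes and an explicit local error measure between the quadrature and the Hilbert transform; the sketch only shows why removing the amplitude modulation first sidesteps the Bedrosian-type restrictions on the direct approach.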
Normal anatomy of lung perfusion SPECT scintigraphy
Moskowitz, G.W.; Levy, L.M.
1987-01-01
Ten patients studied for possible pulmonary embolic disease had normal lung perfusion planar and SPECT scintigraphy. A computer program was developed to superimpose the CT scans on corresponding SPECT images. Superimposition of CT scans on corresponding SPECT transaxial cross-sectional images, when available, provides the needed definition and relationships of adjacent organs. SPECT transaxial sections provide clear anatomic definition of perfusion defects without foreground and background lung tissue superimposed. The location, shape, and size of the perfusion defects can be readily assessed by SPECT. An algorithm was developed for the differentiation of abnormal pulmonary perfusion patterns from normal structures and variants
Anatomy, normal variants, and basic biomechanics
Berquist, T.H.; Johnson, K.A.
1989-01-01
This paper reports on the anatomy and basic functions of the foot and ankle important to physicians involved in imaging procedures, clinical medicine, and surgery. New radiographic techniques especially magnetic resonance imaging, provide more diagnostic information owing to improved tissue contrast and the ability to obtain multiple image planes (axial, sagittal, coronal, oblique). Therefore, a thorough knowledge of skeletal and soft tissue anatomy is even more essential. Normal variants must also be understood in order to distinguish normal from pathologic changes in the foot and ankle. A basic understanding of biomechanics is also essential for selecting the proper diagnostic techniques
Dlk1 in normal and abnormal hematopoiesis
Sakajiri, S; O'kelly, J; Yin, D
2005-01-01
normals. Also, Dlk1 mRNA was elevated in mononuclear, low density bone marrow cells from 11/38 MDS patients, 5/11 AML M6 and 2/4 AML M7 samples. Furthermore, 5/6 erythroleukemia and 2/2 megakaryocytic leukemia cell lines highly expressed Dlk1 mRNA. Levels of Dlk1 mRNA markedly increased during...... (particularly M6, M7), and it appears to be associated with normal development of megakaryocytes and B cells....
Statistical Theory of Normal Grain Growth Revisited
Gadomski, A.; Luczka, J.
2002-01-01
In this paper, we discuss three physically relevant problems concerning the normal grain growth process. These are: infinite vs finite size of the system under study (a step towards more realistic modeling); conditions of fine-grained structure formation, with possible applications to thin films and biomembranes, and interesting relations to the superplasticity of materials; and the approach to log-normality, a ubiquitous natural phenomenon frequently reported in the literature. It turns out that all three important points mentioned can be included in a Mulheran-Harding type behavior of evolving grain-containing systems that we have studied previously. (author)
Radiogenomics: predicting clinical normal tissue radiosensitivity
Alsner, Jan
2006-01-01
Studies on the genetic basis of normal tissue radiosensitivity, or 'radiogenomics', aim at predicting clinical radiosensitivity and optimizing treatment from individual genetic profiles. Several studies have now reported links between variations in certain genes related to the biological response...... to radiation injury and risk of normal tissue morbidity in cancer patients treated with radiotherapy. However, after these initial association studies including few genes, we are still far from being able to predict clinical radiosensitivity on an individual level. Recent data from our own studies on risk...
Tectonics: The meaning of form
Christiansen, Karl; Brandt, Per Aage
Tectonics – The meaning of form deals with one of the core topics of architecture: the relationship between form and content. In the world of architecture, form is not only made from brick, glass and wood. Form means something. When a material is processed with sufficient technical skill and insi...... perspectives. You can read the chapters in any order you like – from the beginning, end or the middle. There is no correct order. The project is methodologically inductive: the more essays you read, the broader your knowledge of tectonics get....
Amorphous drugs and dosage forms
Grohganz, Holger; Löbmann, K.; Priemel, P.
2013-01-01
The transformation to an amorphous form is one of the most promising approaches to address the low solubility of drug compounds, the latter being an increasing challenge in the development of new drug candidates. However, amorphous forms are high-energy solids and tend to recrystallize. New formulation principles are needed to ensure the stability of amorphous drug forms. The formation of solid dispersions is still the most investigated approach, but additional approaches are desirable to overcome the shortcomings of solid dispersions. Spatial separation by either coating or the use of micro-containers... before single molecules are available for the formation of crystal nuclei, thus stabilizing the amorphous form...
INFORMATIONAL EFFECT OF A FORM
Kovalenko V.F.
2016-02-01
The study was conducted by the method of light scattering of laser emission. The influence of the form field, the mutual influence of mental informational and form torsional fields, as well as the subsequent exposure of water samples in the form field after the cessation of informational influence on water structure, were examined. Paper forms of a pyramid, a cylinder, and a prism were used. The experimental findings show that the mechanism of mutual influence on water structure of the form and informational torsional fields depended on the initial conditions of the spin restructuring process: the configuration of a form, the type of the form field (internal or external), and the initial water structure. The influence of the form field on the informational aftereffect was determined, the character of which was defined by the ratio of intensities of the torsional form field and an informational soliton. The phenomenon of anomalously large amplification of the informational aftereffect in the internal field of a pyramid, demonstrating the attributes of positive feedback between the informational soliton and the torsional field of water structure, and selection of generated cluster sizes, was discovered.
Improving the forming capability of laser dynamic forming by using rubber as a forming medium
Shen, Zongbao; Liu, Huixia; Wang, Xiao; Wang, Cuntang
2016-04-01
Laser dynamic forming (LDF) is a novel high-velocity forming technique which employs a laser-generated shock wave to load the sample. The forming velocity induced by the high-energy laser pulse may exceed the critical forming velocity, resulting in premature fracture. To avoid this premature fracture, rubber is introduced in LDF as a forming medium to prolong the loading duration. Laser-induced shock wave energy is transferred to the sample in different forming stages, so the forming velocity can be kept below the critical forming velocity even when the initial laser energy is high enough to cause fracture. Bulge forming experiments with and without rubber were performed to study the effect of rubber on loading duration. The experimental results show that the shock wave energy attenuates during propagation through the rubber layer, so the rubber can prevent premature fracture. The plastic deformation can therefore continue, and the forming capability of LDF is improved. Due to the severe plastic deformation under rubber compression, adiabatic shear bands (ASB) occur in LDF with rubber. The material softening in ASB leads to irregular fracture, which differs from the premature fracture pattern (regular fracture) in LDF without rubber. To better understand this deformation behavior, the Johnson-Cook model is used to simulate the dynamic response and the evolution of ASB in a copper sample. The simulation results also indicate that the rubber can prolong the loading duration.
The anti-tumor efficacy of nanoparticulate form of ICD-85 versus free form
Zare Mirakabadi, A.
2015-04-01
Biodegradable polymeric nanoparticles (NPs) have been intensively studied as a possible way to enhance anti-tumor efficacy while reducing side effects. ICD-85, derived from the venom of two separate species of venomous animals, has been shown to exhibit anti-cancer activity. In this report, polymer-based sodium alginate nanoparticles of ICD-85 were used to enhance its therapeutic effects and reduce its side effects. The inhibitory effect was evaluated by MTT assay. The necrotic effect was assessed using LDH assay. The induction of apoptosis was analyzed by a caspase-8 colorimetric assay kit. Cytotoxicity assay in HeLa cells demonstrated enhanced efficacy of ICD-85-loaded NPs compared to free ICD-85. The IC50 values obtained in HeLa cells after 48 h for free ICD-85 and ICD-85-loaded NPs were 26±2.9 μg/ml and 18±2.5 μg/ml, respectively. While free ICD-85 exhibits mild cytotoxicity towards normal MRC-5 cells (IC50 > 60 μg/ml), ICD-85-loaded NPs were found to have higher anti-proliferative efficacy against HeLa cells in vitro without any significant cytotoxic effect on normal MRC-5 cells. The apoptosis-induction mechanism of both forms of ICD-85 in HeLa cells was found to be through activation of caspase-8, approximately two-fold greater for ICD-85-loaded NPs than for free ICD-85. Our work reveals that although ICD-85 in free form is relatively selective in inhibiting the growth of cancer cells via apoptosis compared to normal cells, the nanoparticulate form increases its selectivity towards cancer cells.
Ketone body metabolism in normal and diabetic human skeletal muscle
Nosadini, R.; Avogaro, A.; Sacca, L.
1985-01-01
Although the liver is considered the major source of ketone bodies (KB) in humans, these compounds may also be formed by nonhepatic tissues. To study this aspect further, 3-[14C]hydroxybutyrate (BOH) or [3-14C]acetoacetate (AcAc) was constantly infused after a priming dose, and contemporaneous arterial and venous samples were taken at splanchnic, heart, kidney, and leg sites in eight normal subjects (N) undergoing diagnostic catheterization, and at the forearm site in five normal and six ketotic diabetic (D) subjects. After 70 min of infusion, tracer and tracee levels of AcAc and BOH reached a steady state in the artery and vein in both normal and diabetic subjects. The venous-arterial (V-A) difference at the forearm for cold KB was negligible in both normal and diabetic subjects, whereas for labeled KB it was approximately 10-fold higher in diabetic subjects (V-A AcAc, -31 +/- 7 and -270 +/- 34 dpm/ml in N and D, respectively; V-A BOH, -38 +/- 6 and -344 +/- 126 dpm/ml in N and D, respectively). The authors assumed that the V-A difference in tracer concentration was consistent with dilution of the tracer by newly synthesized tracee inside the muscle and calculated that the forearm muscle produces KB at a rate of 16.2 +/- 3.3 μmol/min in D and 0.9 +/- 0.9 μmol/min in N. These findings can be accounted for by the hypothesis that the disappearance flux of KB from the circulation was replaced by an equivalent flux of KB entering the vein at the muscle in D but not in N. Moreover, in N, KB were not only produced but also utilized by the splanchnic area (39 +/- 9 μmol/min).
Normal Isocurvature Surfaces and Special Isocurvature Circles (SIC)
Manoussakis, Gerassimos; Delikaraoglou, Demitris
2010-05-01
An isocurvature surface of a gravity field is a surface on which the value of the plumblines' curvature is constant. Here we study the isocurvature surfaces of the Earth's normal gravity field. The normal gravity field is a symmetric gravity field, therefore the isocurvature surfaces are surfaces of revolution. But even in this case the necessary relations for their study are not simple at all. Therefore, to study an isocurvature surface we make special assumptions to form a vector equation which holds only for a small coordinate patch of the isocurvature surface. Yet from the definition of the isocurvature surface and the properties of the normal gravity field it is possible to express very interesting global geometrical properties of these surfaces without invoking surface differential calculus. The gradient of the plumblines' curvature function is perpendicular to an isocurvature surface. If P is a point of an isocurvature surface and Φ is the angle of the gradient of the plumblines' curvature with the equatorial plane, then this gradient points in the direction along which the curvature of the plumbline decreases or increases the most, and is therefore related to the strength of the normal gravity field. We show that this direction is constant along a line of curvature of the isocurvature surface and that this line is an isocurvature circle. In addition, we show that on each isocurvature surface there is at least one isocurvature circle along which the direction of maximum variation of the plumblines' curvature function is parallel to the equatorial plane of the ellipsoid of revolution. This circle is defined as a Special Isocurvature Circle (SIC). Finally, we prove that all these SICs lie on a special surface of revolution, the so-called SIC surface. That is to say, a SIC is not an isolated curve in three-dimensional space.
CT and MRI normal findings; CT- und MRT-Normalbefunde
Moeller, T.B.; Reif, E. [Caritas-Krankenhaus, Dillingen (Germany)
1998-07-01
This book gives answers to questions frequently heard especially from trainees and doctors not specialising in radiology: Is that a normal finding? How do I decide? What are the objective criteria? The information presented is three-fold. The normal findings of the usual CT and MRI examinations are shown with high-quality pictures serving as a reference, with important additional information on measures, angles and other criteria describing the normal conditions inscribed directly in the images. These criteria are further explained and evaluated in accompanying texts, which also teach the systematic approach for individual picture analysis and include a checklist of major aspects as a didactic guide for learning. The book is primarily intended for students, radiographers, radiology trainees and doctors from other medical fields, but radiology specialists will also find useful details of help in special cases. (orig./CB) [German abstract, translated] Normal findings are the most common findings of all. So no problem? On the contrary: radiologists in training and doctors from other specialties in particular keep facing the decisive questions: Is this normal? How can I recognize that? How can I make it objective? This book accomplishes three things. (1) It shows classic normal findings of the standard CT and MRI examinations in high image quality as a reference, with important data (measures, angles and other criteria of the normal) drawn directly into the images, summarized, explained and evaluated again in the text. (2) It teaches the systematics of image reading: how do I look at an image, which structures do I examine in which order, and what do I need to pay particular attention to? All of this is given as a clear checklist for each image. (3) It provides a model report formulation which in turn follows the scheme of the image analysis, defines all criteria of the normal and thereby also constitutes an important didactic element.
Potential clinical impact of normal-tissue intrinsic radiosensitivity testing
Bentzen, Soeren M.
1997-01-01
A critical appraisal is given of the possible benefit from a reliable pre-treatment knowledge of individual normal-tissue sensitivity to radiotherapy. The considerations are in part, but not exclusively, based on the recent experience with in vitro colony-forming assays of the surviving fraction at 2 Gy, the SF2. Three strategies are reviewed: (1) to screen for rare cases with extreme radiosensitivity, so-called over-reactors, and treat these with reduced total dose; (2) to identify the sensitive tail of the distribution of 'normal' radiosensitivities, refer these patients to other treatment, and escalate the dose to the remaining patients; or (3) to individualize dose prescriptions based on individual radiosensitivity, i.e. treating to isoeffect rather than to a specific dose-fractionation schedule. It is shown that these strategies will have a small, if any, impact on routine radiotherapy. Screening for over-reactors is hampered by their low prevalence among otherwise unselected patients, which leads to a low positive predictive value of in vitro radiosensitivity assays. It is argued that this problem may persist even if the noise on current assays could be reduced to (the unrealistic value of) zero, simply because of the large biological variation in SF2. Removing the sensitive tail of the patient population will only have a minor effect on the dose that could be delivered to the remaining patients, because of the sigmoid shape of empirical dose-response relationships. Finally, individualizing dose prescriptions based exclusively on information from a normal-tissue radiosensitivity assay leads to a nearly symmetrical distribution of dose changes that would produce a very small gain, or even a loss, of tumor control probability if implemented in the clinic. From a theoretical point of view, other strategies could be devised, and some of these are considered in this review. Right now the most promising clinical use of in vitro radiosensitivity
Improving historical spelling normalization with bi-directional LSTMs and multi-task learning
Bollmann, Marcel; Søgaard, Anders
2016-01-01
Natural-language processing of historical documents is complicated by the abundance of variant spellings and lack of annotated data. A common approach is to normalize the spelling of historical words to modern forms. We explore the suitability of a deep neural network architecture for this task, particularly a deep bi-LSTM network applied on a character level. Our model compares well to previously established normalization algorithms when evaluated on a diverse set of texts from Early New Hig...
Wickramasinghe, S N; Spearing, R L; Hill, G R
1998-12-01
Two non-anaemic subjects, a father and daughter, with a new form of congenital dyserythropoiesis are reported. The features of their disorder are: (1) an abnormal blood film with basophilic stippling of red cells and oval macrocytes, (2) various dysplastic changes in the erythroblasts, including internuclear chromatin bridges, (3) ultrastructurally-normal erythroblast heterochromatin, (4) normal serum thymidine kinase activity, and (5) a probable autosomal dominant inheritance. The last three features distinguish this disorder from CDA type I.
Indentation stiffness does not discriminate between normal and degraded articular cartilage.
Brown, Cameron P; Crawford, Ross W; Oloyede, Adekunle
2007-08-01
Relative indentation characteristics are commonly used for distinguishing between normal healthy and degraded cartilage. The application of this parameter in surgical decision making and an appreciation of articular cartilage biomechanics has prompted us to hypothesise that it is difficult to define a reference stiffness to characterise normal articular cartilage. This hypothesis is tested for validity by carrying out biomechanical indentation of articular cartilage samples characterised as visually normal and degraded relative to proteoglycan depletion and collagen disruption. Compressive loading was applied at known strain rates to visually normal, artificially degraded and naturally osteoarthritic articular cartilage, and the trends of their stress-strain and stiffness characteristics were observed. While our results demonstrated a 25% reduction in the stiffness of individual samples after proteoglycan depletion, they also showed that, when compared to the stiffness of normal samples, only 17% lie outside the range of the stress-strain behaviour of normal samples. We conclude that the extent of the variability in the properties of normal samples and the degree of overlap (81%) of the biomechanical properties of normal and degraded matrices demonstrate that indentation data cannot form an accurate basis for distinguishing normal from abnormal articular cartilage samples, with consequences for the application of this mechanical process in the clinical environment.
Automatic Radiometric Normalization of Multitemporal Satellite Imagery
Canty, Morton J.; Nielsen, Allan Aasbjerg; Schmidt, Michael
2004-01-01
with normalization using orthogonal regression. The procedure is applied to Landsat TM images over Nevada, Landsat ETM+ images over Morocco, and SPOT HRV images over Kenya. Results from this new automatic, combined MAD/orthogonal regression method, based on statistical analysis of test pixels not used in the actual...
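The orthogonal-regression step mentioned above can be sketched in a few lines. This is a minimal illustration under stated assumptions: the data are synthetic, the variable names are invented for the example, and it shows only the total-least-squares line fit used to map one band onto the radiometric scale of another, not the MAD test-pixel selection itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic "no-change" pixels: the target band is a gain/offset-shifted
# version of the reference band, with noise in BOTH variables
# (the reason orthogonal rather than ordinary regression is used)
ref = rng.uniform(20, 200, 500)
tgt = 1.3 * ref + 15 + rng.normal(0, 2, 500)
ref_obs = ref + rng.normal(0, 2, 500)

def orthogonal_fit(x, y):
    # total-least-squares line: principal axis of the centered point cloud,
    # obtained from the first right singular vector of the data matrix
    X = np.column_stack([x - x.mean(), y - y.mean()])
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    a, b = vt[0]                       # direction of largest variance
    slope = b / a
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

slope, intercept = orthogonal_fit(tgt, ref_obs)
normalized = slope * tgt + intercept   # target band mapped to reference scale
print(round(slope, 3), round(intercept, 2))
```

Because noise is present in both bands, an ordinary least-squares fit would bias the gain low; the symmetric treatment here recovers the inverse of the simulated gain (about 1/1.3).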
Post-Normal science in practice
Dankel, Dorothy J.; Vaage, Nora S.; van der Sluijs, Jeroen P.
This special issue contains a selection of papers presented during the 2014 Bergen meeting, complemented with short perspectives by young PNS-inspired scholars, presented at a mini-symposium "Post-normal times? New thinking about science and policy advice" held on 21 October 2016 in celebration of
Physical Development: What's Normal? What's Not?
The timing and speed of a child's physical development can vary a lot, because it is ...
Achondroplasia in sibs of normal parents.
Philip, N; Auger, M; Mattei, J F; Giraud, F
1988-01-01
A new case of recurrent achondroplasia in sibs of normal parents is reported. Two sisters and a half-sister were affected. Various mechanisms can be postulated to account for unexpected recurrence of achondroplasia in the same sibship. Germinal mosaicism and unstable premutation are discussed here. PMID: 3236371
Endoscopic third ventriculostomy in idiopathic normal pressure ...
Mohammed Ahmed Eshra
2013-12-22
Dec 22, 2013 ... system of the brain causing ventricular enlargement. This is followed by gradual ...sion, not to decrease the pressure (which is already normal) [8–15] ... So ETV must be performed in patients with clinical evolution of not more ...
Principal normal indicatrices of closed space curves
Røgen, Peter
1999-01-01
A theorem due to J. Weiner, which is also proven by B. Solomon, implies that a principal normal indicatrix of a closed space curve with nonvanishing curvature has integrated geodesic curvature zero and contains no subarc with integrated geodesic curvature pi. We prove that the inverse problem alw...
Normal tension glaucoma and Alzheimer disease
Bach-Holm, Daniella; Kessing, Svend Vedel; Mogensen, Ulla Brasch
2012-01-01
Purpose: To investigate whether normal tension glaucoma (NTG) is associated with increased risk of developing dementia/Alzheimer disease (AD). Methods: A total of 69 patients with NTG were identified in the case note files in the Glaucoma Clinic, University Hospital of Copenhagen (Rigshospitalet...
Normal stresses in semiflexible polymer hydrogels
Vahabi, M.; Vos, Bart E.; de Cagny, Henri C. G.; Bonn, Daniel; Koenderink, Gijsje H.; MacKintosh, F. C.
2018-03-01
Biopolymer gels such as fibrin and collagen networks are known to develop tensile axial stress when subject to torsion. This negative normal stress is opposite to the classical Poynting effect observed for most elastic solids including synthetic polymer gels, where torsion provokes a positive normal stress. As shown recently, this anomalous behavior in fibrin gels depends on the open, porous network structure of biopolymer gels, which facilitates interstitial fluid flow during shear and can be described by a phenomenological two-fluid model with viscous coupling between network and solvent. Here we extend this model and develop a microscopic model for the individual diagonal components of the stress tensor that determine the axial response of semiflexible polymer hydrogels. This microscopic model predicts that the magnitude of these stress components depends inversely on the characteristic strain for the onset of nonlinear shear stress, which we confirm experimentally by shear rheometry on fibrin gels. Moreover, our model predicts a transient behavior of the normal stress, which is in excellent agreement with the full time-dependent normal stress we measure.
Comparative ultrasound measurement of normal thyroid gland ...
2011-08-31
Aug 31, 2011 ... the normal thyroid gland has a homogeneous, increased medium-level echo texture. The childhood thyroid gland dimension correlates linearly with age and body surface, unlike in adults [14]. Triiodothyronine (T3) and thyroxine (T4) are thyroid hormones which function to control the basal metabolic rate (BMR).
Mast cell distribution in normal adult skin
A.S. Janssens (Artiena Soe); R. Heide (Rogier); J.C. den Hollander (Jan); P.G.M. Mulder (P. G M); B. Tank (Bhupendra); A.P. Oranje (Arnold)
2005-01-01
AIMS: To investigate mast cell distribution in normal adult skin to provide a reference range for comparison with mastocytosis. METHODS: Mast cells (MCs) were counted in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders in adults.
Normal sleep and its neurophysiological regulation
Hofman, W.F.; Talamini, L.M.; Watson, R.R.
2015-01-01
Normal sleep consists of two states: NREM (light and deep sleep) and REM, alternating in a cyclical pattern. The sleep/wake rhythm is regulated by two processes: the sleep propensity, building up during wake, and the circadian rhythm, imposed by the suprachiasmatic nucleus. The arousal pathways in
Named entity normalization in user generated content
Jijkoun, V.; Khalid, M.A.; Marx, M.; de Rijke, M.
2008-01-01
Named entity recognition is important for semantically oriented retrieval tasks, such as question answering, entity retrieval, biomedical retrieval, trend detection, and event and entity tracking. In many of these tasks it is important to be able to accurately normalize the recognized entities,
Morphological evaluation of normal human corneal epithelium
Ehlers, Niels; Heegaard, Steffen; Hjortdal, Jesper
2010-01-01
of corneas from 100 consecutively selected paraffin-embedded eyes were stained with hematoxylin-eosin and Periodic Acid-Schiff (PAS). All specimens were evaluated by light microscopy. The eyes were enucleated from patients with choroidal melanoma. Corneas were considered to be normal. RESULTS: Ninety of 100...
Dissociative Functions in the Normal Mourning Process.
Kauffman, Jeffrey
1994-01-01
Sees dissociative functions in mourning process as occurring in conjunction with integrative trends. Considers initial shock reaction in mourning as model of normal dissociation in mourning process. Dissociation is understood to be related to traumatic significance of death in human consciousness. Discerns four psychological categories of…
Hemoglobin levels in normal Filipino pregnant women.
Kuizon, M D; Natera, M G; Ancheta, L P; Platon, T P; Reyes, G D; Macapinlac, M P
1981-09-01
The hemoglobin concentrations during pregnancy in Filipinos belonging to the upper income group, who were prescribed 105 mg elemental iron daily and who had acceptable levels of transferrin saturation, were examined in an attempt to define normal levels. The hemoglobin concentrations for each trimester followed a Gaussian distribution. The hemoglobin values equal to the mean minus one standard deviation were 11.4 gm/dl for the first trimester and 10.4 gm/dl for the second and third trimesters. Using these values as the lower limits of normal, the prevalence of anemia during the last two trimesters in one group of pregnant women was found to be lower than that obtained when WHO levels for normal were used. Groups of women with hemoglobin of 10.4 to 10.9 gm/dl (classified as anemic by WHO criteria but normal in the present study) and those with 11.0 gm/dl and above could not be distinguished on the basis of their serum ferritin levels or the degree of decrease in their hemoglobin concentration during pregnancy. Many subjects in both groups, however, had serum ferritin levels less than 12 ng/ml, which indicates poor iron stores. It might be desirable in future studies to determine the hemoglobin cut-off point that will delineate subjects who are both non-anemic and adequate in iron stores, using serum ferritin levels as the criterion for the latter.
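The cutoff logic in the abstract above (lower limit of normal taken as the mean minus one standard deviation of a Gaussian-distributed sample) can be sketched as follows. The values are simulated for illustration only; they are not the study's data.

```python
import numpy as np

rng = np.random.default_rng(42)
# illustrative first-trimester hemoglobin values in g/dl (simulated,
# centered so that mean - 1 SD lands near the study's 11.4 g/dl cutoff)
hb = rng.normal(loc=12.4, scale=1.0, size=1000)

# lower limit of normal = mean minus one standard deviation
lower_limit = hb.mean() - hb.std(ddof=1)
anemic_fraction = (hb < lower_limit).mean()
print(round(lower_limit, 1), round(anemic_fraction, 3))
```

For a Gaussian sample this cutoff classifies roughly the bottom 16% of values as below normal, which is why it yields a lower anemia prevalence than a fixed WHO threshold set higher in the distribution.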
Normalized compression distance of multisets with applications
Cohen, A.R.; Vitányi, P.M.B.
Pairwise normalized compression distance (NCD) is a parameter-free, feature-free, alignment-free similarity metric based on compression. We propose an NCD of multisets that is also a metric. Previously, attempts to obtain such an NCD failed. For classification purposes it is superior to the pairwise
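The pairwise NCD that the multiset version above generalizes can be sketched with a standard compressor. This is a minimal illustration: `zlib` stands in for the ideal compressor of the theory, and the example strings are arbitrary.

```python
import zlib

def C(b: bytes) -> int:
    # compressed length via zlib, a practical stand-in for the
    # (uncomputable) Kolmogorov complexity used in the theory
    return len(zlib.compress(b, 9))

def ncd(x: bytes, y: bytes) -> float:
    # NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))
    cx, cy, cxy = C(x), C(y), C(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"the quick brown fox jumps over the lazy dog " * 10
b_ = b"lorem ipsum dolor sit amet consectetur adipiscing " * 10
print(ncd(a, a))   # near 0: a compressor exploits the shared content
print(ncd(a, b_))  # larger: unrelated strings share little structure
```

With a real compressor the distance is only approximately normalized to [0, 1], which is one reason constructing a multiset variant that remains a metric is nontrivial.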
Limiting Normal Operator in Quasiconvex Analysis
Aussel, D.; Pištěk, Miroslav
2015-01-01
Vol. 23, No. 4 (2015), pp. 669-685. ISSN 1877-0533. R&D Projects: GA ČR GA15-00735S. Institutional support: RVO:67985556. Keywords: quasiconvex function; sublevel set; normal operator. Subject RIV: BA - General Mathematics. Impact factor: 0.973, year: 2015. http://library.utia.cas.cz/separaty/2015/MTR/pistek-0453552.pdf
The normal bacterial flora prevents GI disease
The normal bacterial flora prevents GI disease. Inhibits pathogenic enteric bacteria: decreases luminal pH; secretes bactericidal proteins; colonization resistance; blocks epithelial binding (induces MUC2). Improves epithelial and mucosal barrier integrity. Produces ...
Role of the normal gut microbiota.
Jandhyala, Sai Manasa; Talukdar, Rupjyoti; Subramanyam, Chivkula; Vuyyuru, Harish; Sasikala, Mitnala; Nageshwar Reddy, D
2015-08-07
The relation between the gut microbiota and human health is being increasingly recognised. It is now well established that a healthy gut flora is largely responsible for the overall health of the host. The normal human gut microbiota comprises two major phyla, namely Bacteroidetes and Firmicutes. Though the gut microbiota in an infant appears haphazard, it starts resembling the adult flora by the age of 3 years. Nevertheless, there exist temporal and spatial variations in the microbial distribution from the esophagus to the rectum all along the individual's life span. Developments in genome sequencing technologies and bioinformatics have now enabled scientists to study these microorganisms, their function, and microbe-host interactions in an elaborate manner both in health and disease. The normal gut microbiota imparts specific functions in host nutrient metabolism, xenobiotic and drug metabolism, maintenance of structural integrity of the gut mucosal barrier, immunomodulation, and protection against pathogens. Several factors play a role in shaping the normal gut microbiota. They include (1) the mode of delivery (vaginal or caesarean); (2) diet during infancy (breast milk or formula feeds) and adulthood (vegan-based or meat-based); and (3) use of antibiotics or antibiotic-like molecules that are derived from the environment or the gut commensal community. A major concern of antibiotic use is the long-term alteration of the normal healthy gut microbiota and horizontal transfer of resistance genes that could result in a reservoir of organisms with a multidrug-resistant gene pool.
Perturbations of normally solvable nonlinear operators, I
William O. Ray
1985-01-01
Let X and Y be Banach spaces and let ℱ and 𝒢 be Gateaux differentiable mappings from X to Y. In this note we study when the operator ℱ + 𝒢 is surjective for sufficiently small perturbations 𝒢 of a surjective operator ℱ. The methods extend previous results in the area of normal solvability for nonlinear operators.
Sample normalization methods in quantitative metabolomics.
Wu, Yiman; Li, Liang
2016-01-22
To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis, including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each of them can influence the quantitative results significantly and thus should be performed with great care. In particular, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on the quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature, and comment on their applicability in real-world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing it. We show that several methods are now available and that sample normalization should be performed in quantitative metabolomics whenever the analyzed samples have significant variations in total sample amounts.
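One widely reported sample-normalization method of the kind surveyed above is probabilistic quotient normalization: integral (sum) normalization followed by a median-quotient dilution correction. The sketch below uses synthetic data; the function name and matrix layout are illustrative, not taken from any specific tool.

```python
import numpy as np

def pqn_normalize(X, reference=None):
    """Probabilistic quotient normalization sketch.
    X: samples x features intensity matrix."""
    # step 1: integral (sum) normalization of each sample
    X = X / X.sum(axis=1, keepdims=True)
    # step 2: reference spectrum = feature-wise median across samples
    if reference is None:
        reference = np.median(X, axis=0)
    # step 3: per-sample dilution factor = median of feature-wise quotients
    quotients = X / reference
    dilution = np.median(quotients, axis=1, keepdims=True)
    return X / dilution

rng = np.random.default_rng(1)
base = rng.uniform(1, 10, (1, 20))            # one "true" metabolite profile
dilutions = np.array([[1.0], [2.0], [0.5]])   # three dilutions of that sample
X = base * dilutions * rng.normal(1, 0.01, (3, 20))
Xn = pqn_normalize(X)
# after normalization the three diluted copies of the same profile agree
print(np.allclose(Xn[0], Xn[1], rtol=0.05))
```

The median quotient makes the dilution estimate robust: a few metabolites that genuinely change between samples do not drag the scaling factor the way a total-sum estimate would.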
Refixation saccades with normal gain values
Korsager, Leise Elisabeth Hviid; Faber, Christian Emil; Schmidt, Jesper Hvass
2017-01-01
-ocular reflex. However, this partial deficit is in conflict with the current way of interpreting vHIT results in which the vestibular function is classified as either normal or pathological based only on the gain value. Refixation saccades, which are evident signs of vestibulopathy, are not considered...
Hypervascular liver lesions in radiologically normal liver
Amico, Enio Campos; Alves, Jose Roberto; Souza, Dyego Leandro Bezerra de; Salviano, Fellipe Alexandre Macena; Joao, Samir Assi; Liguori, Adriano de Araujo Lima, E-mail: ecamic@uol.com.br [Hospital Universitario Onofre Lopes (HUOL/UFRN), Natal, RN (Brazil). Clinica Gastrocentro e Ambulatorios de Cirurgia do Aparelho Digestivo e de Cirurgia Hepatobiliopancreatica
2017-09-01
Background: Hypervascular liver lesions represent a diagnostic challenge. Aim: To identify risk factors for cancer in patients with non-hemangiomatous hypervascular hepatic lesions in radiologically normal liver. Method: This prospective study included patients with hypervascular liver lesions in radiologically normal liver. The diagnosis was made by biopsy or was presumed on the basis of radiologic stability over a follow-up period of one year. Patients with cirrhosis or with typical imaging characteristics of haemangioma were excluded. Results: Eighty-eight patients were included. The average age was 42.4 years. The lesions were single and between 2-5 cm in size in most cases. Liver biopsy was performed in approximately one third of cases. The lesions were benign or most likely benign in 81.8%, while cancer was diagnosed in 12.5% of cases. Univariate analysis showed that age >45 years (p<0.001), personal history of cancer (p=0.020), presence of >3 nodules (p=0.003) and elevated alkaline phosphatase (p=0.013) were significant risk factors for cancer. Conclusion: It is safe to observe hypervascular liver lesions in normal liver in patients up to 45 years of age with normal alanine aminotransferase, up to three nodules and no personal history of cancer. Lesion biopsies are safe in patients with atypical lesions and define the treatment to be established for most of these patients. (author)
KERNEL MAD ALGORITHM FOR RELATIVE RADIOMETRIC NORMALIZATION
Y. Bai
2016-06-01
The multivariate alteration detection (MAD) algorithm is commonly used in relative radiometric normalization. This algorithm is based on linear canonical correlation analysis (CCA), which can analyze only linear relationships among bands. Therefore, we first introduce a new version of MAD in this study based on the established method known as kernel canonical correlation analysis (KCCA). The proposed method effectively extracts the non-linear and complex relationships among variables. We then conduct relative radiometric normalization experiments with both the linear CCA and the KCCA versions of the MAD algorithm, using Landsat-8 data of Beijing, China, and Gaofen-1 (GF-1) data from South China. Finally, we analyze the difference between the two methods. Results show that KCCA-based MAD can be satisfactorily applied to relative radiometric normalization; the algorithm describes the nonlinear relationship between multi-temporal images well. This work is the first attempt to apply a KCCA-based MAD algorithm to relative radiometric normalization.
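As a rough illustration of the linear baseline that the kernel method extends, the MAD variates of two co-registered images can be computed with ordinary CCA via an SVD of the whitened cross-covariance. This is a minimal NumPy sketch (no kernelization; all function names are ours, not the authors' code):

```python
import numpy as np

def mad_variates(X, Y):
    """Linear-CCA MAD: canonical variates of two co-registered images are
    paired by correlation; their differences (the MAD variates) isolate
    change. X, Y: (n_pixels, n_bands) arrays."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    n = len(Xc)
    Sxx, Syy, Sxy = Xc.T @ Xc / n, Yc.T @ Yc / n, Xc.T @ Yc / n
    Lx, Ly = np.linalg.cholesky(Sxx), np.linalg.cholesky(Syy)
    # SVD of the whitened cross-covariance gives the canonical directions
    K = np.linalg.solve(Lx, Sxy) @ np.linalg.inv(Ly).T
    U, rho, Vt = np.linalg.svd(K)
    A = np.linalg.inv(Lx).T @ U        # projection for image X
    B = np.linalg.inv(Ly).T @ Vt.T     # projection for image Y
    return Xc @ A - Yc @ B, rho        # MAD variates, canonical correlations
```

MAD variates with low variance correspond to no-change pixels, which are then used to fit the per-band normalization.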
Robust glint detection through homography normalization
Hansen, Dan Witzner; Roholm, Lars; García Ferreiros, Iván
2014-01-01
A novel normalization principle for robust glint detection is presented. The method is based on geometric properties of corneal reflections and allows for simple and effective detection of glints even in the presence of several spurious and identically appearing reflections. The method is tested...
Power curve report - with turbulence intensity normalization
Gómez Arranz, Paula; Wagner, Rozenn; Vesth, Allan
First, additional shear and turbulence intensity filters are applied to the measured data. Secondly, the method for normalization to a given reference turbulence intensity level (as described in Annex M of the draft of IEC 61400-12-1 Ed. 2 [3]) is applied. The measurements have been performed using DTU...
Accounting for the Benefits of Database Normalization
Wang, Ting J.; Du, Hui; Lehmann, Constance M.
2010-01-01
This paper proposes a teaching approach to reinforce accounting students' understanding of the concept of database normalization. Unlike the conceptual approach taken in most AIS textbooks, this approach involves calculations and reconciliations with which accounting students are familiar, because these methods are frequently used in…
Superconvergent sum rules for the normal reflectivity
Furuya, K.; Zimerman, A.H.; Villani, A.
1976-05-01
Families of superconvergent relations for the normal reflectivity function are written. Sum rules connecting the difference of phases of the reflectivities of two materials are also considered. Finally, superconvergence relations and sum rules for the magneto-reflectivity in the Faraday and Voigt regimes are studied.
Effects of pions on normal tissues
Tokita, N.
1981-01-01
Verification of the uniform biological effectiveness of pion beams of various dimensions produced at LAMPF has been made using cultured mammalian cells and mouse jejunum. Normal tissue radiobiology studies at LAMPF are reviewed with regard to biological beam characterization for the therapy program and the current status of acute and late effect studies on rodents
Challenging the Ideology of Normal in Schools
Annamma, Subini A.; Boelé, Amy L.; Moore, Brooke A.; Klingner, Janette
2013-01-01
In this article, we build on Brantlinger's work to critique the binary of normal and abnormal applied in US schools that create inequities in education. Operating from a critical perspective, we draw from Critical Race Theory, Disability Studies in Education, and Cultural/Historical Activity Theory to build a conceptual framework for…
NETWORK CODING BY BEAM FORMING
2013-01-01
Network coding by beam forming in networks, for example in single frequency networks, can help increase spectral efficiency. When network coding by beam forming is combined with user cooperation, further spectral efficiency gains may be achieved. According to certain embodiments, a method... cooperating with the plurality of user equipment to decode the received data...
Automated Test-Form Generation
van der Linden, Wim J.; Diao, Qi
2011-01-01
In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…
Electromagnetic form factors of hadrons
Zidell, V.S.
1976-01-01
A vector-meson-dominance model of the electromagnetic form factors of hadrons is developed, based on the use of unstable-particle propagators. Least-squares fits are made to the proton, neutron, pion and kaon form factor data in both the space-like and time-like regions. A good fit to the low-energy nucleon form factor data is obtained using only ρ, ω and φ dominance, and leads to a determination of the vector-meson resonance parameters in good agreement with experiment. The nucleon-vector-meson coupling constants obey simple sum rules, indicating that there is no hard-core contribution to the form factors within theoretical uncertainties. The prediction for the electromagnetic radius of the proton is in reasonable agreement with recent experiments. The pion and kaon charge form factors, as deduced from the nucleon form factors assuming vector-meson universality, are compared to the data. The pion form factor agrees with the data in both the space-like and time-like regions. The pion charge radius is in agreement with the recent Dubna result, but the isovector P-wave pion-pion phase shift calculated from the theory disagrees with experiment. A possible contribution to the form factors from a heavy ρ meson is also evaluated.
[Galenic forms for oral medication].
El Semman, Ousseid; Certain, Agnès; Bouziane, Faouzia; Arnaud, Philippe
2012-10-01
Galenic science is interested in the art and the way of formulating an active principle with an excipient in order for it to be administered to the patient. The pharmaceutical forms envisage different administration routes, including by mouth. Nurses need to handle and sometimes modify the pharmaceutical form of a drug to make it easier for the patient to take. This requires vigilance.
Glymphatic MRI in idiopathic normal pressure hydrocephalus.
Ringstad, Geir; Vatnehol, Svein Are Sirirud; Eide, Per Kristian
2017-10-01
The glymphatic system has in previous studies been shown to be fundamental to the clearance of waste metabolites from the brain interstitial space, and is proposed to be instrumental in normal ageing and in brain pathology such as Alzheimer's disease and brain trauma. Assessment of glymphatic function using magnetic resonance imaging with intrathecal contrast agent as a cerebrospinal fluid tracer has so far been limited to rodents. We aimed to image cerebrospinal fluid flow characteristics and glymphatic function in humans, and applied the methodology in a prospective study of 15 idiopathic normal pressure hydrocephalus patients (mean age 71.3 ± 8.1 years, three female and 12 male) and eight reference subjects (mean age 41.1 ± 13.0 years, six female and two male) with suspected cerebrospinal fluid leakage (seven) and intracranial cyst (one). The imaging protocol included T1-weighted magnetic resonance imaging with equal sequence parameters before and at multiple time points through 24 h after intrathecal injection of the contrast agent gadobutrol at the lumbar level. All study subjects were kept in the supine position between examinations during the first day. Gadobutrol enhancement was measured at all imaging time points from regions of interest placed at predefined locations in brain parenchyma, the subarachnoid and intraventricular space, and inside the sagittal sinus. Parameters demonstrating gadobutrol enhancement and clearance in different locations were compared between idiopathic normal pressure hydrocephalus and reference subjects. A characteristic flow pattern in idiopathic normal pressure hydrocephalus was ventricular reflux of gadobutrol from the subarachnoid space followed by transependymal gadobutrol migration. At the brain surfaces, gadobutrol propagated antegradely along large leptomeningeal arteries in all study subjects, and preceded glymphatic enhancement in adjacent brain tissue, indicating a pivotal role of intracranial pulsations for glymphatic function.
Normalization for triple-target microarray experiments
Magniette Frederic
2008-04-01
Background: Most microarray studies are made using labelling with one or two dyes, which allows the hybridization of one or two samples on the same slide. In such experiments, the most frequently used dyes are Cy3 and Cy5. Recent improvements in the technology (dye-labelling, scanners and image analysis) allow hybridization of up to four samples simultaneously. The two additional dyes are Alexa488 and Alexa494. The triple-target or four-target technology is very promising, since it allows more flexibility in the design of experiments, an increase in the statistical power when comparing gene expression induced by different conditions, and a scaled-down number of slides. However, few methods have been proposed for the statistical analysis of such data. Moreover, the lowess correction of the global dye effect is available only for two-color experiments, and even if its application can be derived, it does not allow simultaneous correction of the raw data. Results: We propose a two-step normalization procedure for triple-target experiments. First the dye bleeding is evaluated and corrected if necessary. Then the signal in each channel is normalized using a generalized lowess procedure to correct a global dye bias. The normalization procedure is validated using triple-self experiments and by comparing the results of triple-target and two-color experiments. Although the focus is on triple-target microarrays, the proposed method can be used to normalize p differently labelled targets co-hybridized on the same array, for any value of p greater than 2. Conclusion: The proposed normalization procedure is effective: the technical biases are reduced, the number of false positives is under control in the analysis of differentially expressed genes, and the triple-target experiments are more powerful than the corresponding two-color experiments. There is room for improving microarray experiments by simultaneously hybridizing more than two samples.
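For the two-colour case that the generalized procedure extends, the classic intensity-dependent dye-bias correction works on the M-A representation of the two channels. The sketch below is illustrative only: it substitutes a simple moving average for the true lowess fit, and all names are ours:

```python
import numpy as np

def ma_normalize(red, green, window=101):
    """Two-colour MA normalization: remove intensity-dependent dye bias by
    subtracting an estimate of the M-vs-A trend (moving average here as a
    stand-in for the lowess fit used in practice).
    red, green: positive intensity arrays of equal length."""
    M = np.log2(red) - np.log2(green)        # log-ratio (dye bias lives here)
    A = 0.5 * (np.log2(red) + np.log2(green))  # average log-intensity
    order = np.argsort(A)
    kernel = np.ones(window) / window
    pad = window // 2
    padded = np.pad(M[order], pad, mode='edge')
    trend = np.empty_like(M)
    # moving average over A-sorted M values approximates the lowess trend
    trend[order] = np.convolve(padded, kernel, mode='valid')
    return M - trend, A                      # normalized log-ratios, A values
```

The generalized p-dye procedure in the paper applies the same idea jointly across channels rather than pairwise.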
Fission cross-section normalization problems
Wagemans, C.; Ghent Rijksuniversiteit; Deruytter, A.J.
1983-01-01
The present measurements yield σf data in the neutron energy range from 20 meV to 30 keV, directly normalized in the thermal region. In the keV region these data are consistent with the absolute σf measurements of Szabo and Marquette. For the secondary normalization integral I₂, values have been obtained in agreement with those of Gwin et al. and Czirr et al., which were also directly normalized in the thermal region. For the I₁ integral, however, puzzlingly low values have been obtained. This was also the case for the average σf in neutron energy intervals containing strong resonances. Three additional measurements are planned to investigate these observations further: (i) maintaining the present approx. 2π geometry but using a ¹⁰B foil for the neutron flux detection; (ii) using a low detection geometry with a ¹⁰B as well as a ⁶Li flux monitor. Only after these measurements can definite conclusions on the I₁ and I₂ integrals be formulated and final average σf values be released. The present study also gives some evidence for a correlation between the integral I₂ and the neutron flux monitor used. The influence of a normalization via I₁ or I₂ on the final cross-section has been shown, and the magnitude of possible normalization errors is illustrated. Finally, since ²³⁵U is expected to be an ''easy'' nucleus (low α-activity, high σf values), there are some indications that the important discrepancies still present in ²³⁵U(n,f) cross-section measurements might partially be due to errors in the neutron flux determination.
Spatial normalization of array-CGH data
Brennetot Caroline
2006-05-01
Background: Array-based comparative genomic hybridization (array-CGH) is a recently developed technique for analyzing changes in DNA copy number. As in all microarray analyses, normalization is required to correct for experimental artifacts while preserving the true biological signal. We investigated various sources of systematic variation in array-CGH data and identified two distinct types of spatial effect of no biological relevance as the predominant experimental artifacts: continuous spatial gradients and local spatial bias. Local spatial bias affects a large proportion of arrays and has not previously been considered in array-CGH experiments. Results: We show that existing normalization techniques do not correct these spatial effects properly. We therefore developed an automatic method for the spatial normalization of array-CGH data. This method makes it possible to delineate and to eliminate and/or correct areas affected by spatial bias. It is based on the combination of a spatial segmentation algorithm called NEM (Neighborhood Expectation Maximization) and spatial trend estimation. We defined quality criteria for array-CGH data, demonstrating significant improvements in data quality with our method for three data sets coming from two different platforms (198, 175 and 26 BAC arrays). Conclusion: We have designed an automatic algorithm for the spatial normalization of BAC CGH-array data, preventing the misinterpretation of experimental artifacts as biologically relevant outliers in the genomic profile. This algorithm is implemented in the R package MANOR (Micro-Array NORmalization), which is described at http://bioinfo.curie.fr/projects/manor and available from the Bioconductor site http://www.bioconductor.org. It can also be tested on the CAPweb bioinformatics platform at http://bioinfo.curie.fr/CAPweb.
Fusion and normalization to enhance anomaly detection
Mayer, R.; Atkinson, G.; Antoniades, J.; Baumback, M.; Chester, D.; Edwards, J.; Goldstein, A.; Haas, D.; Henderson, S.; Liu, L.
2009-05-01
This study examines normalizing the imagery and the optimization metrics to enhance anomaly and change detection, respectively. The RX algorithm, the standard anomaly detector for hyperspectral imagery, more successfully extracts bright rather than dark man-made objects when applied to visible hyperspectral imagery. Normalizing the imagery prior to applying the anomaly detector can help detect some of the problematic dark objects, but can also miss some bright objects. This study jointly fuses images of RX applied to normalized and unnormalized imagery into a single decision surface. The technique was tested using imagery of commercial vehicles in an urban environment gathered by a hyperspectral visible/near-IR sensor mounted on an airborne platform. Combining detections first requires converting the detector output to a target probability. The observed anomaly detections were fitted with a linear combination of chi-square distributions, and these weights were used to help compute the target probability. Receiver Operating Characteristic (ROC) curves quantitatively assessed the target detection performance. The target detection performance is highly variable, depending on the relative numbers of candidate bright and dark targets and false alarms, and was controlled in this study by using vegetation and street-line masks. The joint Boolean OR and AND operations also give variable performance depending on the scene. The joint SUM operation provides a reasonable compromise between the OR and AND operations and has good target detection performance. In addition, transforms based on normalizing the correlation coefficient and least squares, related to canonical correlation analysis (CCA) and normalized image regression (NIR), were developed; transforms based on CCA and NIR performed better than the standard approaches. Only RX detection applied to the unnormalized difference imagery provides adequate change detection performance.
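A minimal version of the RX detector referred to above scores each pixel by the Mahalanobis distance of its spectrum from the global scene statistics; the study's fusion then combines such scores computed on normalized and unnormalized imagery (e.g. by the SUM rule). This NumPy sketch is ours, not the authors' code:

```python
import numpy as np

def rx_scores(cube):
    """Global RX anomaly detector: Mahalanobis distance of each pixel
    spectrum from the scene mean. cube: (rows, cols, bands) array."""
    h, w, b = cube.shape
    X = cube.reshape(-1, b).astype(float)
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    icov = np.linalg.inv(cov + 1e-6 * np.eye(b))  # regularize for stability
    d = X - mu
    scores = np.einsum('ij,jk,ik->i', d, icov, d)  # per-pixel Mahalanobis^2
    return scores.reshape(h, w)
```

A crude SUM fusion in the spirit of the abstract would be `rx_scores(cube) + rx_scores(cube_norm)`, where `cube_norm` is a radiometrically normalized version of the same scene.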
Simulation and Verification of Form Filling with Self-Compacting Concrete
Thrane, Lars Nyholm
2005-01-01
This paper presents a form filling experiment and the corresponding 3D simulation. One side of the form is made of a transparent acrylic plate and, to improve the visual observation of the flow behaviour, the first and second half of the form are cast with normal grey and red-pigmented SCC, respectively...
The cystic form of rheumatoid arthritis
Dijkstra, P.F.; Gubler, F.M.; Maas, A.
1988-01-01
A nonerosive form of rheumatoid arthritis (RA) was found in 62 of 660 patients with RA. These 62 patients exhibit slowly progressive cystic changes in roughly the same joints in which erosions usually develop in classic RA. The ESR is often low, half of the patients remained seronegative, and there are 35 males and 27 females in the group. A smaller group of 15 of these patients could be followed, over a period of at least 6 years, from a stage in which the radiographs were normal to a stage of extensive cystic changes. An attempt is made to delineate this group within the rheumatoid arthritis disease entity. (orig.)
Mutual-friction induced instability of normal-fluid vortex tubes in superfluid helium-4
Kivotides, Demosthenes
2018-06-01
It is shown that, as a result of its interactions with superfluid vorticity, a normal-fluid vortex tube in helium-4 becomes unstable and disintegrates. The superfluid vorticity acquires only a small polarization (a few percent of the normal-fluid tube strength), whilst expanding in a front-like manner in the intervortex space of the normal fluid, forming a dense, unstructured tangle in the process. The accompanying energy-spectra scalings offer a structural explanation of analogous scalings in fully developed finite-temperature superfluid turbulence. A macroscopic mutual-friction model incorporating these findings is proposed.
New method for computing ideal MHD normal modes in axisymmetric toroidal geometry
Wysocki, F.; Grimm, R.C.
1984-11-01
Analytic elimination of the two magnetic-surface components of the displacement vector permits the normal-mode ideal MHD equations to be reduced to a scalar form. A Galerkin procedure, similar to that used in the PEST codes, is implemented to determine the normal modes computationally. The method retains the efficient stability capabilities of the PEST 2 energy principle code, while allowing computation of the normal-mode frequencies and eigenfunctions, if desired. The procedure is illustrated by comparison with earlier versions of PEST and by application to tilting modes in spheromaks and to stable discrete Alfvén waves in tokamak geometry.
Lou, C.
2002-01-01
An advection-diffusion model has been set up to describe normal grain growth. In this model grains are divided into different groups according to their topological classes (number of sides of a grain). Topological transformations are modelled by advective and diffusive flows governed by advective and diffusive coefficients respectively, which are assumed to be proportional to topological classes. The ordinary differential equations governing self-similar time-independent grain size distribution can be derived analytically from continuity equations. It is proved that the time-independent distributions obtained by solving the ordinary differential equations have the same form as the time-dependent distributions obtained by solving the continuity equations. The advection-diffusion model is extended to describe the stagnation of normal grain growth in thin films. Grain boundary grooving prevents grain boundaries from moving, and the correlation between neighbouring grains accelerates the stagnation of normal grain growth. After introducing grain boundary grooving and the correlation between neighbouring grains into the model, the grain size distribution is close to a lognormal distribution, which is usually found in experiments. A vertex computer simulation of normal grain growth has also been carried out to make a cross comparison with the advection-diffusion model. The result from the simulation did not verify the assumption that the advective and diffusive coefficients are proportional to topological classes. Instead, we have observed that topological transformations usually occur on certain topological classes. This suggests that the advection-diffusion model can be improved by making a more realistic assumption on topological transformations. (author)
On the transition to the normal phase for superconductors surrounded by normal conductors
Fournais, Søren; Kachmar, Ayman
2009-01-01
For a cylindrical superconductor surrounded by a normal material, we discuss transition to the normal phase of stable, locally stable and critical configurations. Associated with those phase transitions, we define critical magnetic fields and we provide a sufficient condition for which those...
Zimmerman, Donald W.
2011-01-01
This study investigated how population parameters representing heterogeneity of variance, skewness, kurtosis, bimodality, and outlier-proneness, drawn from normal and eleven non-normal distributions, also characterized the ranks corresponding to independent samples of scores. When the parameters of population distributions from which samples were…
Drug Use Normalization: A Systematic and Critical Mixed-Methods Review.
Sznitman, Sharon R; Taubman, Danielle S
2016-09-01
Drug use normalization, which is a process whereby drug use becomes less stigmatized and more accepted as normative behavior, provides a conceptual framework for understanding contemporary drug issues and changes in drug use trends. Through a mixed-methods systematic review of the normalization literature, this article seeks to (a) critically examine how the normalization framework has been applied in empirical research and (b) make recommendations for future research in this area. Twenty quantitative, 26 qualitative, and 4 mixed-methods studies were identified through five electronic databases and reference lists of published studies. Studies were assessed for relevance, study characteristics, quality, and aspects of normalization examined. None of the studies applied the most rigorous research design (experiments) or examined all of the originally proposed normalization dimensions. The most commonly assessed dimension of drug use normalization was "experimentation." In addition to the original dimensions, the review identified the following new normalization dimensions in the literature: (a) breakdown of demographic boundaries and other risk factors in relation to drug use; (b) de-normalization; (c) drug use as a means to achieve normal goals; and (d) two broad forms of micro-politics associated with managing the stigma of illicit drug use: assimilative and transformational normalization. Further development in normalization theory and methodology promises to provide researchers with a novel framework for improving our understanding of drug use in contemporary society. Specifically, quasi-experimental designs that are currently being made feasible by swift changes in cannabis policy provide researchers with new and improved opportunities to examine normalization processes.
Deformation associated with continental normal faults
Resor, Phillip G.
Deformation associated with normal fault earthquakes and geologic structures provides insights into the seismic cycle as it unfolds over time scales from seconds to millions of years. Improved understanding of normal faulting will lead to more accurate seismic hazard assessments and prediction of associated structures. High-precision aftershock locations for the 1995 Kozani-Grevena earthquake (Mw 6.5), Greece, image a segmented master fault and antithetic faults. This three-dimensional fault geometry is typical of normal fault systems mapped from outcrop or interpreted from reflection seismic data, and illustrates the importance of incorporating three-dimensional fault geometry in mechanical models. Subsurface fault slip associated with the Kozani-Grevena and 1999 Hector Mine (Mw 7.1) earthquakes is modeled using a new method for slip inversion on three-dimensional fault surfaces. Incorporation of three-dimensional fault geometry improves the fit to the geodetic data while honoring aftershock distributions and surface ruptures. GPS surveying of deformed bedding surfaces associated with normal faulting in the western Grand Canyon reveals patterns of deformation that are similar to those observed by satellite radar interferometry (InSAR) for the Kozani-Grevena earthquake, with a prominent down-warp in the hanging wall and a lesser up-warp in the footwall. However, deformation associated with the Kozani-Grevena earthquake extends ~20 km from the fault surface trace, while the folds in the western Grand Canyon only extend 500 m into the footwall and 1500 m into the hanging wall. A comparison of mechanical and kinematic models illustrates the advantages of mechanical models in exploring normal faulting processes, including incorporation of both deformation and causative forces, and the opportunity to incorporate more complex fault geometry and constitutive properties. Elastic models with antithetic or synthetic faults or joints in association with a master...
Yu, Wei-Wen
2010-01-01
The definitive text in the field, thoroughly updated and expanded Hailed by professionals around the world as the definitive text on the subject, Cold-Formed Steel Design is an indispensable resource for all who design for and work with cold-formed steel. No other book provides such exhaustive coverage of both the theory and practice of cold-formed steel construction. Updated and expanded to reflect all the important developments that have occurred in the field over the past decade, this Fourth Edition of the classic text provides you with more of the detailed, up-to-the-minute techni
Differential forms theory and practice
Weintraub, Steven H
2014-01-01
Differential forms are utilized as a mathematical technique to help students, researchers, and engineers analyze and interpret problems where abstract spaces and structures are concerned, and when questions of shape, size, and relative positions are involved. Differential Forms has gained high recognition in the mathematical and scientific community as a powerful computational tool in solving research problems and simplifying very abstract problems through mathematical analysis on a computer. Differential Forms, 2nd Edition, is a solid resource for students and professionals needing a solid g
Waste forms for plutonium disposition
Johnson, S.G.; O'Holleran, T.P.; Frank, S.M.; Meyer, M.K.; Hanson, M.; Staples, B.A.; Knecht, D.A.; Kong, P.C.
1997-01-01
The field of plutonium disposition is varied and of much importance, since the Department of Energy has decided on the hybrid option for disposing of the weapons materials. This consists of either placing the Pu into mixed oxide fuel for reactors or placing the material into a stable waste form such as glass. The waste form used for Pu disposition should exhibit certain qualities: (1) provide for a suitable deterrent to guard against proliferation; (2) be of minimal volume, i.e., maximize the loading; and (3) be reasonably durable under repository-like conditions. This paper will discuss several Pu waste forms that display promising characteristics
A4 see-saw models and form dominance
Chen, M-C; King, Stephen F.
2009-01-01
We introduce the idea of Form Dominance in the (type I) see-saw mechanism, according to which a particular right-handed neutrino mass eigenstate is associated with a particular physical neutrino mass eigenstate, leading to a form-diagonalizable effective neutrino mass matrix. Form Dominance, which allows an arbitrary neutrino mass spectrum, may be regarded as a generalization of Constrained Sequential Dominance, which only allows strongly hierarchical neutrino masses. We consider alternative implementations of the see-saw mechanism in minimal A4 see-saw models and show that such models satisfy Form Dominance, leading to neutrino mass sum rules which predict closely spaced neutrino masses with a normal or inverted neutrino mass ordering. To avoid the partial cancellations inherent in such models we propose Natural Form Dominance, in which a different flavon is associated with each physical neutrino mass eigenstate.
The variability problem of normal human walking
Simonsen, Erik B; Alkjær, Tine
2012-01-01
Previous investigations have suggested considerable inter-individual variability in the time course pattern of net joint moments during normal human walking, although the limited sample sizes precluded statistical analyses. The purpose of the present study was to obtain joint moment patterns from a group of normal subjects and to test whether or not the expected differences would prove to be statistically significant. Fifteen healthy male subjects were recorded on video while they walked across two force platforms. Ten kinematic and kinetic parameters were selected and input to a statistical cluster analysis to determine whether or not the 15 subjects could be divided into different 'families' (clusters) of walking strategy. The net joint moments showed a variability corroborating earlier reports. The cluster analysis showed that the 15 subjects could be grouped into two clusters of 5 and 10 subjects.
Autobiographical Memory in Normal Ageing and Dementia
Harvey J. Sagar
1991-01-01
Autobiographical memories in young and elderly normal subjects are drawn mostly from the recent past, but elderly subjects relate a second peak of memories from early adulthood. Memory for remote past public events is relatively preserved in dementia, possibly reflecting the integrity of semantic relative to episodic memory. We examined the recall of specific, consistent autobiographical episodes in Alzheimer's disease (AD) in response to cue words. Patients and control subjects drew most memories from the most recent 20 years: episode age related to anterograde memory function but not to subject age or dementia. Subjects also related a secondary peak of memories from early adulthood; episode age related to subject age and severity of dementia. The results suggest that the preferential recall of memories from early adulthood is based on the salience of retrieval cues, altered by age and dementia, superimposed on a temporal gradient of semantic memory. Further, AD shows behavioural similarity to normal ageing.
A Proposed Arabic Handwritten Text Normalization Method
Tarik Abu-Ain
2014-11-01
Text normalization is an important technique in document image analysis and recognition. It consists of many preprocessing stages, including slope correction, text padding, skew correction, and straightening of the writing line. As such, text normalization plays an important role in many procedures such as text segmentation, feature extraction and character recognition. In the present article, a new method for baseline detection, straightening, and slant correction of Arabic handwritten texts is proposed. The method comprises a set of sequential steps: first, component segmentation is performed, followed by component thinning; then, the direction features of the skeletons are extracted and the candidate baseline regions are determined. After that, the correct baseline region is selected, and finally, the baselines of all components are aligned with the writing line. The experiments are conducted on the IFN/ENIT benchmark Arabic dataset. The results show that the proposed method has a promising and encouraging performance.
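As one simple stand-in for the baseline-detection step described here, a horizontal projection profile often suffices: the baseline of a text line tends to be the row with the densest ink. This is a generic sketch, not the authors' direction-feature method, and both function names are ours:

```python
import numpy as np

def detect_baseline(binary_img):
    """Estimate the writing-line (baseline) row of a binarized text-line
    image as the row with maximum ink density (horizontal projection)."""
    profile = binary_img.sum(axis=1)   # ink count per row
    return int(np.argmax(profile))

def straighten(binary_img, baseline_row):
    """Align the detected baseline with a fixed target row -- simplified
    here to a whole-image vertical shift (real methods shift components)."""
    target = binary_img.shape[0] // 2
    return np.roll(binary_img, target - baseline_row, axis=0)
```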
Zero cosmological constant from normalized general relativity
Davidson, Aharon; Rubin, Shimon
2009-01-01
Normalizing the Einstein-Hilbert action by the volume functional makes the theory invariant under constant shifts in the Lagrangian. The associated field equations then resemble unimodular gravity whose otherwise arbitrary cosmological constant is now determined as a Machian universal average. We prove that an empty space-time is necessarily Ricci tensor flat, and demonstrate the vanishing of the cosmological constant within the scalar field paradigm. The cosmological analysis, carried out at the mini-superspace level, reveals a vanishing cosmological constant for a universe which cannot be closed as long as gravity is attractive. Finally, we give an example of a normalized theory of gravity which does give rise to a non-zero cosmological constant.
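The normalization described in the abstract can be written schematically as follows (a sketch only; the sign conventions and measure are assumptions, not taken from the paper):

```latex
% Volume-normalized action: divide the usual action integral by the
% volume functional.
S_{\mathrm{N}}[g] \;=\;
  \frac{\displaystyle\int d^4x\,\sqrt{-g}\,\mathcal{L}}
       {\displaystyle\int d^4x\,\sqrt{-g}}
% Under a constant shift \mathcal{L} \to \mathcal{L} + c the numerator
% gains c times the volume, so S_N \to S_N + c: the action shifts by a
% constant and the field equations are unchanged, which is the shift
% invariance the abstract refers to.
```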
Suitable Image Intensity Normalization for Arterial Visualization
Yara Omran
2012-12-01
Full Text Available Ultrasonic imaging is a widely used non-invasive medical imaging procedure, since it is economical, comparatively safe, portable and adaptable. However, one of its main weaknesses is the poor quality of images, which makes enhancement of image quality an important issue for a more accurate diagnosis of disease, for transmission of the image over a telemedicine channel, and for many other image-processing tasks [1]. The purpose of this paper is to automatically enhance the image quality after the automatic detection of the artery wall. This step is essential before subsequent measurements of arterial parameters [9]. This was performed automatically by applying linear normalization, where results showed that normalization of ultrasound images is an important step in enhancing the image quality for later processing. In comparison with other methods, our method is automatic. The evaluation of image quality was done mathematically by comparing pixel intensities of images before and after enhancement, in addition to a visual evaluation.
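Linear normalization of the kind the abstract describes is a min-max contrast stretch. A minimal sketch (the function name and the 0-255 output range are illustrative assumptions, not the paper's implementation):

```python
# Linear (min-max) intensity normalization: map the observed intensity
# range of a low-contrast image onto the full display range.

def normalize_linear(pixels, lo=0, hi=255):
    """Map pixel intensities linearly onto [lo, hi]."""
    pmin, pmax = min(pixels), max(pixels)
    if pmax == pmin:                     # flat image: nothing to stretch
        return [lo for _ in pixels]
    scale = (hi - lo) / (pmax - pmin)
    return [round(lo + (p - pmin) * scale) for p in pixels]

# A low-contrast strip of intensities, stretched to full range:
strip = [100, 110, 120, 130, 140]
print(normalize_linear(strip))  # → [0, 64, 128, 191, 255]
```

In practice the stretch would be applied per image (or per region of interest, e.g. around the detected artery wall) over a 2-D array rather than a flat list.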
Normal anatomical measurements in cervical computerized tomography
Zaunbauer, W.; Daepp, S.; Haertel, M.
1985-01-01
Radiodiagnostically relevant normal values and variations for measurements of the cervical region, including the arithmetic mean and the standard deviation, were determined from adequate computed tomograms of 60 healthy women and men, aged 20 to 83 years. The sagittal diameter of the prevertebral soft tissue and the lumina of the upper respiratory tract were evaluated at exactly defined levels between the hyoid bone and the incisura jugularis sterni. The thickness of the aryepiglottic folds, the maximal sagittal and transverse diameters of the thyroid gland, and the calibre of the great cervical vessels were defined. To assess laryngeal function in computed tomography, distances between the cervical spine and fixed anatomical points of the larynx and hypopharynx were measured, as was the degree of vocal cord movement during normal respiration and phonation. (orig.)
Broad Ligament Haematoma Following Normal Vaginal Delivery.
Ibrar, Faiza; Awan, Azra Saeed; Fatima, Touseef; Tabassum, Hina
2017-01-01
A 37-year-old patient presented in emergency with a history of normal vaginal delivery followed by the development of abdominal distention, vomiting and constipation over the last 3 days. She was para 4 and had had a normal vaginal delivery by a traditional birth attendant at a peripheral hospital 3 days earlier. Imaging revealed a heterogeneous complex mass, ascites, pleural effusion, and air-fluid levels with dilated gut loops. Based upon pelvic examination by a senior gynaecologist in combination with ultrasound, a clinical diagnosis of broad ligament haematoma was made. However, vomiting and abdominal distention raised suspicion of intestinal obstruction. Due to worsening abdominal distention, exploratory laparotomy was carried out. It revealed pseudo-obstruction of the colon, and caecostomy was done. Timely intervention by a multidisciplinary approach saved the patient's life with minimal morbidity.
Sphalerons, deformed sphalerons and normal modes
Brihaye, Y.; Kunz, J.; Oldenburg Univ.
1992-01-01
Topological arguments suggest that the Weinberg-Salam model possesses unstable solutions, sphalerons, representing the top of energy barriers between inequivalent vacua of the gauge theory. In the limit of vanishing Weinberg angle, such unstable solutions are known: the sphaleron of Klinkhamer and Manton and, at large values of the Higgs mass, in addition the deformed sphalerons. Here a systematic study of the discrete normal modes about these sphalerons over the full range of Higgs masses is presented. The emergence of deformed sphalerons at critical values of the Higgs mass is seen to be related to the eigenvalue of a particular normal mode about the sphaleron crossing zero. 6 figs., 1 tab., 19 refs. (author)
Normalizing the causality between time series
Liang, X. San
2015-08-01
Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.
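The quantity being normalized is Liang's information flow between two series. A self-contained sketch of the un-normalized bivariate maximum-likelihood estimator, T_{2→1} = (C11·C12·C2,d1 − C12²·C1,d1) / (C11²·C22 − C11·C12²), where C is a sample covariance and d1 the differenced first series (the normalization itself, separating the stretching-rate and noise terms, is omitted; the toy AR system and function names are illustrative assumptions):

```python
# Un-normalized Liang-style information flow between two time series.
# This sketches the bivariate estimator only; the normalization step
# described in the abstract is not implemented here.
import random

def mean(xs):
    return sum(xs) / len(xs)

def cov(xs, ys):
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)

def info_flow(x1, x2, dt=1.0):
    """Estimate the rate of information flow from series x2 to series x1."""
    d1 = [(x1[i + 1] - x1[i]) / dt for i in range(len(x1) - 1)]
    a, b = x1[:-1], x2[:-1]
    c11, c22, c12 = cov(a, a), cov(b, b), cov(a, b)
    c1d1, c2d1 = cov(a, d1), cov(b, d1)
    return (c11 * c12 * c2d1 - c12 ** 2 * c1d1) / (c11 ** 2 * c22 - c11 * c12 ** 2)

# Toy one-way coupled system: x1 is driven by x2, but not vice versa.
random.seed(7)
x1, x2 = [0.0], [0.0]
for _ in range(2000):
    nxt1 = 0.5 * x1[-1] + 0.8 * x2[-1] + random.gauss(0, 0.5)
    nxt2 = 0.7 * x2[-1] + random.gauss(0, 1.0)
    x1.append(nxt1)
    x2.append(nxt2)

t21 = info_flow(x1, x2)   # flow x2 -> x1: expected clearly nonzero
t12 = info_flow(x2, x1)   # flow x1 -> x2: expected near zero
print(t21, t12)
```

The asymmetry between t21 and t12 is what identifies one-way causality of the IBM-to-GE kind reported in the abstract; normalization then makes such magnitudes comparable across systems.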
Normal modes of vibration in nickel
Birgeneau, R J [Yale Univ., New Haven, Connecticut (United States); Cordes, J [Cambridge Univ., Cambridge (United Kingdom); Dolling, G; Woods, A D B
1964-07-01
The frequency-wave-vector dispersion relation, ν(q), for the normal vibrations of a nickel single crystal at 296 K has been measured for the [ζ00], [ζζ0], [ζζζ], and [0ζ1] symmetric directions using inelastic neutron scattering. The results can be described in terms of the Born-von Karman theory of lattice dynamics with interactions out to fourth-nearest neighbors. The shapes of the dispersion curves are very similar to those of copper, the normal-mode frequencies in nickel being about 1.24 times the corresponding frequencies in copper. The fourth-neighbor model was used to calculate the frequency distribution function g(ν) and related thermodynamic properties. (author)
Myositis ossificans localisata, pseudomalignant form
Grunt, J.; Jankovich, E.; Vasovicova, M.
1994-01-01
The authors present computed tomography and angiographic findings of a rare pseudomalignant form of myositis ossificans. A correct diagnosis, achieved by comprehensive evaluation of the ascertained findings, including biopsy, enables proper treatment while avoiding overly radical therapy. 3 figs., 4 refs
Ogorodnichuk, V I; Voitsekhovich, R I
1972-01-01
Lead-based anodes can be produced by forming a layer of lead dioxide by chemical treatment in a solution of sulfuric acid and potassium permanganate at 80 to 100°. The solution is mixed by compressed air. (RWR)
Calculation of pion form factor
Vahedi, N.; Amirarjomand, S.
1975-09-01
The pion form factor is calculated using the structure function W_2, which incorporates kinematical constraints, threshold behaviour and scaling. The Bloom-Gilman sum rule is used, and only the two leading Regge trajectories are taken into account
Microdomain forming proteins in oncogenesis
I. B. Zborovskaya
2016-01-01
Full Text Available Lipid rafts are lateral assemblies of cholesterol, sphingomyelin, glycosphingolipids and specific proteins within the cell plasma membrane. These microdomains are involved in a number of important cellular processes including membrane rearrangement, protein internalization, signal transduction, and entry of viruses into the cell. Some lipid rafts are stabilized by special microdomain-forming proteins, such as caveolins, the SPFH domain-containing superfamily, tetraspanins and galectins, which maintain the integrity of rafts and regulate signal transduction via the formation of “signalosomes”. Involvement of different lipid rafts is necessary in many situations, such as binding of growth factors to their receptors, integrin regulation, cytoskeleton and extracellular matrix rearrangements, vesicular transport, etc. However, these classes of microdomain-forming proteins are still considered separately from each other. In this review we attempt a comprehensive analysis of microdomain-forming proteins in the regulation of cancer-associated processes.
The self-normalized Donsker theorem revisited
Parczewski, Peter
2016-01-01
We extend the Poincaré-Borel lemma to a weak approximation of a Brownian motion via simple functionals of uniform distributions on n-spheres in the Skorokhod space $D([0,1])$. This approach is used to simplify the proof of the self-normalized Donsker theorem in Csörgő et al. (2003). Some notes on spheres with respect to $\ell_p$-norms are given.
Proteoglycans in Leiomyoma and Normal Myometrium
Barker, Nichole M.; Carrino, David A.; Caplan, Arnold I.; Hurd, William W.; Liu, James H.; Tan, Huiqing; Mesiano, Sam
2015-01-01
Uterine leiomyomas are common benign pelvic tumors composed of modified smooth muscle cells and a large amount of extracellular matrix (ECM). The proteoglycan composition of the leiomyoma ECM is thought to affect the pathophysiology of the disease. To test this hypothesis, we examined the abundance (by immunoblotting) and expression (by quantitative real-time polymerase chain reaction) of the proteoglycans biglycan, decorin, and versican in leiomyoma and normal myometrium and determined whether expression is affected by steroid hormones and menstrual phase. Leiomyoma and normal myometrium were collected from women (n = 17) undergoing hysterectomy or myomectomy. In vitro studies were performed on immortalized leiomyoma (UtLM) and normal myometrial (hTERT-HM) cells with and without exposure to estradiol and progesterone. In leiomyoma tissue, abundance of decorin messenger RNA (mRNA) and protein were 2.6-fold and 1.4-fold lower, respectively, compared with normal myometrium. Abundance of versican mRNA was not different between matched samples, whereas versican protein was increased 1.8-fold in leiomyoma compared with myometrium. Decorin mRNA was 2.4-fold lower in secretory phase leiomyoma compared with proliferative phase tissue. In UtLM cells, progesterone decreased the abundance of decorin mRNA by 1.3-fold. Lower decorin expression in leiomyoma compared with myometrium may contribute to disease growth and progression. As decorin inhibits the activity of specific growth factors, its reduced level in the leiomyoma cell microenvironment may promote cell proliferation and ECM deposition. Our data suggest that decorin expression in leiomyoma is inhibited by progesterone, which may be a mechanism by which the ovarian steroids affect leiomyoma growth and disease progression. PMID:26423601
Normal mode analysis for linear resistive magnetohydrodynamics
Kerner, W.; Lerbinger, K.; Gruber, R.; Tsunematsu, T.
1984-10-01
The compressible, resistive MHD equations are linearized around an equilibrium with cylindrical symmetry and solved numerically as a complex eigenvalue problem. This normal-mode code allows one to solve for very small resistivity, η ∝ 10^-10. The scaling of growth rates and layer width agrees very well with analytical theory. In particular, the influence of both current and pressure on the instabilities is studied in detail, and the effect of resistivity on the ideally unstable internal kink is analyzed. (orig.)
On Normalized Compression Distance and Large Malware
Borbely, Rebecca Schuller
2015-01-01
Normalized Compression Distance (NCD) is a popular tool that uses compression algorithms to cluster and classify data in a wide range of applications. Existing discussions of NCD's theoretical merit rely on certain theoretical properties of compression algorithms. However, we demonstrate that many popular compression algorithms don't seem to satisfy these theoretical properties. We explore the relationship between some of these properties and file size, demonstrating that this theoretical pro...
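NCD with a real compressor is easy to sketch, and the sketch also illustrates the paper's concern: a compressor like zlib has a bounded (32 KB) window, so for large files C(xx) fails to approach C(x) and the idealized properties break down. A minimal version (helper names are illustrative assumptions):

```python
# Normalized Compression Distance with zlib:
#   NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
# where C(s) is the compressed length of s. Values near 0 mean "similar",
# values near 1 mean "unrelated". zlib's 32 KB window makes this degrade
# for inputs much larger than the window, echoing the paper's point.
import zlib

def C(data: bytes) -> int:
    """Compressed length of data at maximum compression level."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    cx, cy, cxy = C(x), C(y), C(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"the quick brown fox jumps over the lazy dog " * 20
b = b"lorem ipsum dolor sit amet, consectetur adipiscing " * 20
print(ncd(a, a))  # near 0: identical data compresses together well
print(ncd(a, b))  # near 1: little shared structure to exploit
```

Swapping zlib for a large-window compressor (e.g. one with a bigger dictionary) is the usual mitigation when clustering large inputs such as malware binaries.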
The J/$\\psi$ normal nuclear absorption
Alessandro, B; Arnaldi, R; Atayan, M; Beolè, S; Boldea, V; Bordalo, P; Borges, G; Castanier, C; Castor, J; Chaurand, B; Cheynis, B; Chiavassa, E; Cicalò, C; Comets, M P; Constantinescu, S; Cortese, P; De Falco, A; De Marco, N; Dellacasa, G; Devaux, A; Dita, S; Fargeix, J; Force, P; Gallio, M; Gerschel, C; Giubellino, P; Golubeva, M B; Grigorian, A A; Grigorian, S; Guber, F F; Guichard, A; kanyan, H; ldzik, M; Jouan, D; Karavicheva, T L; Kluberg, L; Kurepin, A B; Le Bornec, Y; Lourenço, C; Cormick, M M; Marzari-Chiesa, A; Masera, M; Masoni, A; Monteno, M; Musso, A; Petiau, P; Piccotti, A; Pizzi, J R; Prino, F; Puddu, G; Quintans, C; Ramello, L; Ramos, S; Riccati, L; Santos, H; Saturnini, P; Scomparin, E; Serci, S; Shahoyan, R; Sigaudo, M F; Sitta, M; Sonderegger, P; Tarrago, X; Topilskaya, N S; Usai, G L; Vercellin, E; Villatte, L; Willis, N; Wu T
2005-01-01
We present a new determination of the cross-section ratio (J/ψ)/DY expected for nucleus-nucleus reactions if the J/ψ were only normally absorbed by nuclear matter. This anticipated behaviour is based exclusively on proton-nucleus data and is compared, as a function of centrality, with updated S-U results from experiment NA38 and with the most recent Pb-Pb results from experiment NA50.
Research on Normal Human Plantar Pressure Test
Liu Xi Yang
2016-01-01
Full Text Available An FSR400 pressure sensor, an nRF905 wireless transceiver and an MSP40 single-chip microcomputer are used to design an insole pressure-collection system, with LabVIEW providing the human-machine interface for data acquisition. A quantity of normal human foot-pressure data was collected, and the pressure distribution across the five stages of the swing phase during walking was statistically analyzed. The grid closeness degree was used for plantar pressure distribution pattern recognition, and the algorithm was simulated; experimental results demonstrated that the method is feasible.
The classification of normal screening mammograms
Ang, Zoey Z. Y.; Rawashdeh, Mohammad A.; Heard, Robert; Brennan, Patrick C.; Lee, Warwick; Lewis, Sarah J.
2016-03-01
Rationale and objectives: To understand how breast screen readers classify the difficulty of normal screening mammograms using a common lexicon describing normal appearances. Cases were also assessed on their suitability for a single-reader strategy. Materials and Methods: 15 breast readers were asked to interpret a test set of 29 normal screening mammogram cases and classify them by rating the difficulty of the case on a five-point Likert scale, identifying the salient features and assessing their suitability for single reading. Using the False Positive Fractions from a previous study, the 29 cases were classified into 10 'low', 10 'medium' and nine 'high' difficulty categories. Data were analyzed with descriptive statistics. Spearman's correlation was used to test the strength of association between the difficulty of the cases and the readers' recommendation for a single-reading strategy. Results: The ratings from readers in this study corresponded to the known difficulty level of cases for the 'low' and 'high' difficulty cases. Uniform ductal pattern and density, symmetrical mammographic features and the absence of micro-calcifications were the main reasons associated with 'low' difficulty cases. The 'high' difficulty cases were described as having 'dense breasts'. There was a statistically significant negative correlation between the difficulty of the cases and readers' recommendation for single reading (r = -0.475, P = 0.009). Conclusion: The findings demonstrated potential relationships between certain mammographic features and the difficulty for readers to classify mammograms as 'normal'. The standard Australian practice of double reading was deemed more suitable for most cases. There was an inverse moderate association between the difficulty of the cases and the recommendations for single reading.
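The reported association is a Spearman rank correlation between case difficulty and single-reading recommendations. A self-contained sketch of that statistic, with average ranks for ties (the data below are illustrative, not the study's 29 cases):

```python
# Spearman's rank correlation: Pearson correlation computed on ranks,
# with tied values assigned their average rank.

def ranks(xs):
    order = sorted(range(len(xs)), key=xs.__getitem__)
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):                 # walk runs of tied values
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1             # average rank for the tie run
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(xs, ys):
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    m = (n + 1) / 2                       # mean rank is the same for both
    num = sum((a - m) * (b - m) for a, b in zip(rx, ry))
    den = (sum((a - m) ** 2 for a in rx) * sum((b - m) ** 2 for b in ry)) ** 0.5
    return num / den

difficulty = [1, 2, 2, 3, 4, 5, 5]        # higher = harder case
single_ok  = [5, 4, 4, 3, 3, 2, 1]        # higher = more suited to single reading
print(spearman(difficulty, single_ok))    # negative: harder cases are rated
                                          # less suitable for single reading
```

On the toy data the coefficient is strongly negative, mirroring in sign (though not magnitude) the study's r = -0.475.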